Initially, I created an EKS cluster from my local machine using eksctl, and I was able to connect to the cluster without any issues. After changing the access settings to use a private endpoint instead of a public one, I could no longer access the cluster from my local machine, which was expected.
I then used AWS Cloud9, deployed in the same VPC as the EKS cluster. I used the same IAM user to create the Cloud9 environment, and Cloud9 has access to the EKS cluster. I installed eksctl and kubectl in Cloud9 and copied the kubeconfig file from my local machine to Cloud9. Despite this, even after changing the access back to public, I still cannot access the cluster.
I suspect there might be an issue with the kubeconfig file. However, I haven’t been able to figure out the problem. Could someone please help me resolve this issue?
I got this error:
2 Answers
Use this command to update the kubeconfig file:
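The command itself was not included in the answer; the standard way to regenerate the kubeconfig for an EKS cluster is the AWS CLI's `eks update-kubeconfig` subcommand. The region and cluster name below are placeholders to replace with your own values:

```shell
# Regenerate the entry in ~/.kube/config for the cluster.
# Replace the placeholders with your actual region and cluster name.
aws eks update-kubeconfig --region <region> --name <cluster-name>

# Verify connectivity afterwards:
kubectl get nodes
```

Running this inside Cloud9 (rather than copying the kubeconfig from your local machine) ensures the file points at the current endpoint and uses the credentials available in the Cloud9 environment.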
As the EKS cluster is private, you can only connect to it from within the same VPC or from a network connected to that VPC.
From the error message, it is clear that you cannot reach the EKS cluster API server endpoint.
So if Cloud9 is in the same VPC, check the points below.
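The answer ends before listing the points; a sketch of the usual connectivity checks to run from the Cloud9 instance (the endpoint hostname is a placeholder — take it from the `server:` field of your kubeconfig):

```shell
# 1. Confirm the private endpoint's DNS name resolves from inside the VPC.
nslookup <cluster-endpoint-hostname>

# 2. Confirm the cluster security group allows the Cloud9 instance
#    to reach the API server on port 443.
curl -k --max-time 5 https://<cluster-endpoint-hostname>/version

# 3. Confirm the AWS identity Cloud9 is using is the one that has
#    access to the cluster (e.g. the cluster creator or an identity
#    mapped in the aws-auth ConfigMap).
aws sts get-caller-identity
```

If step 1 fails, check that the VPC has DNS resolution and DNS hostnames enabled; if step 2 times out, check the cluster security group's inbound rules; if step 3 shows an unexpected identity, update the credentials Cloud9 is using before regenerating the kubeconfig.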