
Initially, I created an EKS cluster from my local machine using eksctl, and I was able to connect to the cluster without any issues. After changing the access settings to use a private endpoint instead of a public one, I can no longer access the cluster from my local machine, which was expected.

I then set up AWS Cloud9 in the same VPC as the EKS cluster, using the same IAM user, so Cloud9 has access to the cluster. I installed eksctl and kubectl in Cloud9 and copied the kubeconfig file from my local machine to Cloud9. Despite this, even after changing the endpoint access back to public, I still cannot access the cluster.

I suspect there might be an issue with the kubeconfig file. However, I haven’t been able to figure out the problem. Could someone please help me resolve this issue?

I got this error:

[screenshot of the error message]

2 Answers


  1. Use this command to update the kubeconfig file:

    aws eks update-kubeconfig --region region-code --name my-cluster
    
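    For example, assuming a hypothetical cluster named my-cluster in us-east-1, the call would look like this, followed by a quick check that kubectl can now reach the API server:

    # Rewrite the kubeconfig entry for the cluster (name and region are placeholders)
    aws eks update-kubeconfig --region us-east-1 --name my-cluster

    # Verify that kubectl can reach the cluster API server
    kubectl get nodes
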
  2. As the EKS cluster is private, you can only connect to it from within the same VPC or from a network connected to that VPC.

    From the error message, it is clear that you are unable to reach the EKS cluster API server endpoint.

    So if Cloud9 is in the same VPC, check the following points (the commands after this list sketch how to check them from Cloud9):

    • Does the cluster security group allow inbound access from Cloud9 on port 443?
    • Is there a network ACL (NACL) blocking ingress or egress traffic?
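
    A minimal sketch of these checks using the AWS CLI, assuming a cluster named my-cluster in us-east-1; the security group and VPC IDs are placeholders for the values returned by the first command:

    # Show the endpoint access settings, the cluster security group, and the VPC
    aws eks describe-cluster --name my-cluster --region us-east-1 \
        --query "cluster.resourcesVpcConfig.{publicAccess:endpointPublicAccess,privateAccess:endpointPrivateAccess,clusterSecurityGroupId:clusterSecurityGroupId,vpcId:vpcId}"

    # Inspect the inbound rules of the cluster security group for port 443
    aws ec2 describe-security-groups --group-ids sg-0123456789abcdef0 \
        --query "SecurityGroups[].IpPermissions"

    # List the network ACLs of the VPC to check for blocked ingress/egress
    aws ec2 describe-network-acls --filters Name=vpc-id,Values=vpc-0123456789abcdef0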