How to add an embedded cluster to my local Kubernetes config?

Hello,

I have set up an embedded cluster and would like to add it to my local Kubernetes config.

I tried the following steps:

  • Save the output of kubectl config view --minify --raw from the embedded instance into my local ~/.kube/config
  • Replace the IP in the generated config with localhost
  • SSH-forward local port 6443 to port 6443 on the server
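
Concretely, the commands I ran look roughly like this (ec2-user and SERVER_IP are placeholders for my instance):

  # On the embedded-cluster node: dump a self-contained kubeconfig
  kubectl config view --minify --raw > embedded-kubeconfig.yaml

  # Copy it to my machine and use it as my local config
  scp ec2-user@SERVER_IP:embedded-kubeconfig.yaml ~/.kube/config

  # In ~/.kube/config, change the server line to point at the tunnel:
  #   server: https://localhost:6443

  # Forward local port 6443 to the API server port on the node
  ssh -N -L 6443:localhost:6443 ec2-user@SERVER_IP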

But I get the following error:

x509: certificate is valid for ip-172-31-47-87, kubernetes, kubernetes.default, kubernetes.default.svc, kubernetes.default.svc.cluster.local, not localhost

How do I fix this certificate issue? Or is there an easier way to connect to the cluster from my local machine?

@chbrosso interesting approach. Since the cert is only valid for those specific hostnames (and not for localhost), maybe you could try faking DNS for the node name:

  1. Adding an /etc/hosts entry that maps the node hostname ip-172-31-47-87 either to localhost (keep the port-forward step) or directly to the server IP (drop the port-forward step)
  2. Changing the server name in the local kubeconfig to ip-172-31-47-87 instead of localhost
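
A rough sketch, using the hostname and port from your error message (adjust SERVER_IP to your setup):

  # /etc/hosts on your local machine, if you keep the SSH port-forward:
  127.0.0.1    ip-172-31-47-87

  # ...or, if you can reach the server directly, map it to the server IP
  # instead and drop the port-forward:
  # SERVER_IP    ip-172-31-47-87

  # And in your local kubeconfig, use the node hostname, not localhost:
  #   server: https://ip-172-31-47-87:6443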

Aside – I’m assuming here that you’re changing the host IP in the local kubeconfig because you don’t have direct network access to the target server’s private IP.
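
If you want to sanity-check which names the cert actually covers, you can read its SANs through the tunnel (this assumes the /etc/hosts entry above is in place):

  # Print the Subject Alternative Names of the API server certificate
  openssl s_client -connect ip-172-31-47-87:6443 </dev/null 2>/dev/null \
    | openssl x509 -noout -text | grep -A1 'Subject Alternative Name'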


Thanks, it worked!
Indeed, I'm on EC2; I assume I'd need to set up some VPC configuration to get access to the private IP address, but that's not something I'm familiar with.
