Kubernetes secure multi-region cluster Issue

Finally got a secure cluster up and running on Kubernetes, but I keep seeing this error and can't find any documentation on it. We have 3 pods (a StatefulSet), all behind a public load balancer. The load balancer's IP is on the certificate used to secure/start the cluster.

root@cockroachdb-public.dss:26257/defaultdb> SHOW ZONE CONFIGURATIONS;
pq: get-all-names: different nodes 1 and 2 with the same address 'XXX.XXX.XXX.XXX:26257'

The web UI also says “21 under-replicated ranges,” which leads me to believe things are not replicating properly. Any help would be greatly appreciated!

Hey Jason,

Does your Admin UI show any dead nodes? Also, if you could send us a debug zip (you can generate one with the cockroach debug zip command), that would be useful as well.

You can email it to me here.

Did your nodes recently restart for any reason? It could be that nodes 1 and 2 restarted and one of them picked up an address the other was previously advertising, so the cluster still associates both node IDs with the same address.
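
In the meantime, one way to see which address each node is currently advertising is to query the gossip table from the same SQL shell. This is just a rough sketch; crdb_internal tables are unversioned internals, so the exact columns can vary by release:

```sql
-- List each node ID with the address it is gossiping to the cluster.
-- If two node IDs show the same address here, that matches the
-- "different nodes ... with the same address" error you're seeing.
SELECT node_id, address, is_live
  FROM crdb_internal.gossip_nodes;
```

If the pods are advertising the load balancer's address instead of their own per-pod addresses, that would also explain why the cluster can't tell the nodes apart.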