AWS ELB, Kubernetes, and Socket.IO HTTP 400 error

10/31/2019

I am trying to deploy a Socket.IO application on Kubernetes on AWS, behind a Classic ELB.

On my Node service I use sticky-session and the Redis Socket.IO adapter, as explained here: https://socket.io/docs/using-multiple-nodes/. It works great in a local environment, but once deployed on Kubernetes it fails about two times out of three with an HTTP 400 error. This happens even when I have only one pod, so I suspect the load balancer or Kubernetes is forwarding the instance IP instead of the client IP, which breaks the IP-based sticky balancing.
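For reference, here is a simplified sketch of my server setup (the port, Redis host, and request handler are placeholders, not my real code). Note that sticky-session routes by the connection's remote address, which is exactly what seems to be lost behind the ELB:

```js
// app.js -- simplified version of my server (port and Redis host are placeholders)
const http = require('http');
const sticky = require('sticky-session');
const socketio = require('socket.io');
const redisAdapter = require('socket.io-redis');

const server = http.createServer((req, res) => {
  res.end('ok');
});

// sticky-session forks workers and routes each connection by its
// remote IP, so all requests from one client hit the same worker.
// If the load balancer rewrites the source IP, this falls apart.
if (!sticky.listen(server, 3000)) {
  // Master process: just log when the shared port is bound
  server.once('listening', () => console.log('listening on 3000'));
} else {
  // Worker process: attach Socket.IO plus the Redis adapter so
  // broadcasts reach sockets connected to other workers/pods
  const io = socketio(server);
  io.adapter(redisAdapter({ host: 'redis-service', port: 6379 }));

  io.on('connection', (socket) => {
    socket.emit('hello', 'world');
  });
}
```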

I also tried an AWS NLB without success: the target group health check marks all instances as unhealthy, so the calls time out.
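For the NLB attempt, the Service looked roughly like this (names and annotations are illustrative, from memory). If I understand correctly, `externalTrafficPolicy: Local` is the setting that preserves the client IP, but it also makes the NLB health check pass only on nodes that actually run a pod, which might explain the unhealthy targets:

```yaml
# service.yaml -- roughly what I deployed for the NLB attempt (names are placeholders)
apiVersion: v1
kind: Service
metadata:
  name: socketio-service
  annotations:
    # Ask the AWS cloud provider for an NLB instead of a Classic ELB
    service.beta.kubernetes.io/aws-load-balancer-type: nlb
spec:
  type: LoadBalancer
  selector:
    app: socketio-app
  ports:
    - port: 80
      targetPort: 3000
  # Local avoids the extra SNAT hop, so pods see the real client IP,
  # but the health check then only succeeds on nodes hosting a pod;
  # every other node shows up as an unhealthy target
  externalTrafficPolicy: Local
```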

Am I the only one in this situation? Is there a solution, maybe with another library than Socket.IO that doesn't need sticky sessions, or maybe a fix on the Kubernetes side?
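One workaround I am considering (untested in my setup, the URL is a placeholder): forcing the WebSocket transport on the client. As far as I understand, it is the HTTP long-polling handshake that needs sticky sessions, while a single persistent WebSocket connection naturally stays on one pod:

```js
// client.js -- possible workaround: skip HTTP long-polling entirely
const io = require('socket.io-client');

const socket = io('https://my-elb.example.com', {
  // With only the websocket transport there is no multi-request
  // polling handshake, so sticky sessions are no longer required
  transports: ['websocket']
});

socket.on('connect', () => console.log('connected', socket.id));
socket.on('connect_error', (err) => console.error(err.message));
```

One caveat I am aware of: a Classic ELB only passes WebSocket traffic when the listener is TCP/SSL rather than HTTP/HTTPS, so this would also require changing the listener configuration.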

Thanks for the help

-- Lucas Zientek
amazon-web-services
kubernetes
node.js
socket.io
websocket

0 Answers