I have set up a MongoDB database in my k8s cluster with the following configuration:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: mongodb-deployment
  labels:
    app: mongodb
spec:
  replicas: 1
  selector:
    matchLabels:
      app: mongodb
  template:
    metadata:
      labels:
        app: mongodb
    spec:
      containers:
        - name: mongodb
          image: mongo
          ports:
            - containerPort: 27017
          env:
            - name: MONGO_INITDB_ROOT_USERNAME
              valueFrom:
                secretKeyRef:
                  name: db-secret
                  key: mongo-root-username
            - name: MONGO_INITDB_ROOT_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: db-secret
                  key: mongo-root-password
---
apiVersion: v1
kind: Service
metadata:
  name: mongodb-service
spec:
  selector:
    app: mongodb
  type: LoadBalancer
  ports:
    - protocol: TCP
      port: 27020
      targetPort: 27017
      nodePort: 30010
(The service type is LoadBalancer so that I can debug the database from outside the cluster.)
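For context, the credentials referenced above live in a Secret named db-secret, created roughly like this (the stringData values below are placeholders, not my real credentials):

    apiVersion: v1
    kind: Secret
    metadata:
      name: db-secret
    type: Opaque
    stringData:
      mongo-root-username: admin       # placeholder
      mongo-root-password: changeme123 # placeholder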
I have a Node.js app inside the k8s cluster (same namespace) which executes the following code:
const mongoose = require('mongoose');

// MONGODB_USERNAME and PASSWORD are injected as env vars (snippet below).
mongoose.connect(
  `mongodb://${process.env.MONGODB_USERNAME}:${process.env.PASSWORD}@mongodb-service:27020`,
  {
    useNewUrlParser: true,
    useUnifiedTopology: true,
    connectTimeoutMS: 1000,
  },
  (err) => {
    console.log(err);
  }
);
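For completeness, those two variables are injected through the app's own Deployment; a minimal sketch of the relevant container env section, assuming the app reuses the same db-secret (the variable names match the code above):

    env:
      - name: MONGODB_USERNAME
        valueFrom:
          secretKeyRef:
            name: db-secret
            key: mongo-root-username
      - name: PASSWORD
        valueFrom:
          secretKeyRef:
            name: db-secret
            key: mongo-root-password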
When I try authenticating with the code above, mongoose fails to connect and console.log(err)
prints an AuthenticationFailed MongoError. If, however, I remove the credentials from the connection string, mongoose connects successfully, which it shouldn't, since I set root credentials as environment variables in the deployment.
The weirdest part is that when I connect with MongoDB Compass from my machine, the behavior is reversed (or rather, it is the expected behavior): the database refuses the connection without credentials but accepts it with the credentials in the connection string.
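For reference, the Compass connection string looked roughly like this, using the external IP reported for the LoadBalancer service (USER, PASSWORD and the IP are placeholders):

    mongodb://USER:PASSWORD@<external-ip>:27020/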
Finally, it didn't have anything to do with Kubernetes: adding ?authSource=admin to the end of the connection string solved my problem. (The credential-less connection from inside the cluster presumably succeeded because MongoDB still accepts unauthenticated connections; it only rejects operations that require authorization.)
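For anyone hitting the same issue: the root user created through MONGO_INITDB_ROOT_USERNAME/MONGO_INITDB_ROOT_PASSWORD lives in the admin database, and without an explicit authSource the driver tries to authenticate against the default database instead. The working call is the same as above with the option appended:

    mongoose.connect(
      `mongodb://${process.env.MONGODB_USERNAME}:${process.env.PASSWORD}@mongodb-service:27020/?authSource=admin`,
      {
        useNewUrlParser: true,
        useUnifiedTopology: true,
        connectTimeoutMS: 1000,
      },
      (err) => {
        console.log(err); // now logs null on success
      }
    );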