Can I use a single Elasticsearch/Kibana for multiple k8s clusters?

12/5/2019

Do you know of any gotchas or requirements that would prevent using a single ES/Kibana instance as a target for Fluentd in multiple k8s clusters?

Our engineering team is rolling out a new Kubernetes model. I have a requirement to run multiple Kubernetes clusters, let's say 4-6. Even though the workload is split across multiple k8s clusters, I have no requirement to split the logging, and I believe it would be easier to find the logs for pods in all clusters in one centralized location. It would also mean less maintenance for Kibana/Elasticsearch.

Using EFK with Kubernetes, can I point Fluentd from multiple k8s clusters at a single Elasticsearch/Kibana? I don't think I'm the first one with this thought; however, I haven't been able to find any discussion of doing this. I've found lots of discussions of setting up EFK, but all of them only cover a single k8s cluster shipping to its own Elasticsearch/Kibana.
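For reference, here is roughly what I'm planning to run in each cluster's Fluentd DaemonSet (just a sketch; the Elasticsearch host and the CLUSTER_NAME variable are placeholders I'd set per cluster). Each record gets stamped with its source cluster so the logs stay distinguishable once they all land in the same Elasticsearch:

    # Stamp every record with the source cluster.
    # CLUSTER_NAME is an env var set per cluster in the DaemonSet spec.
    <filter kubernetes.**>
      @type record_transformer
      <record>
        cluster_name "#{ENV['CLUSTER_NAME']}"
      </record>
    </filter>

    # Ship everything to the one shared Elasticsearch.
    <match kubernetes.**>
      @type elasticsearch
      # placeholder hostname
      host elasticsearch.logging.example.com
      port 9200
      logstash_format true
      # one index family per cluster, e.g. k8s-prod-a-2019.12.05
      logstash_prefix "k8s-#{ENV['CLUSTER_NAME']}"
      <buffer>
        flush_interval 5s
      </buffer>
    </match>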

Has anyone else gone down the path of using a single ES/Kibana to serve logs from multiple Kubernetes clusters? We'll plunge ahead with testing it out, but I wanted to check whether anyone has already gone down this road.

-- Chad Ernst
efk
elasticsearch
kibana
kubernetes
logging

2 Answers

12/5/2019

I don't think you should create an Elasticsearch instance for each Kubernetes cluster; you can run one main Elasticsearch instance and index all the logs into it.

But even if you don't have an Elasticsearch instance for each Kubernetes cluster, I think you should have a DR plan. So instead of shipping the logs of all pods to Elasticsearch directly, consider shipping them to Kafka first and then fanning them out to two Elasticsearch clusters.
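A minimal sketch of the Fluentd side of that, assuming the fluent-plugin-kafka output plugin and placeholder broker addresses; the fan-out from Kafka into the two Elasticsearch clusters would then be handled by separate consumers (Logstash or Kafka Connect, for example):

    # Buffer pod logs into Kafka instead of writing to Elasticsearch directly.
    <match kubernetes.**>
      @type kafka2
      # placeholder broker addresses
      brokers kafka-1.example.com:9092,kafka-2.example.com:9092
      default_topic k8s-logs
      <format>
        @type json
      </format>
      <buffer topic>
        flush_interval 3s
      </buffer>
    </match>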

It also depends heavily on the use case: if the Kubernetes clusters run in different regions and you need the pod logs at low latency (<1s), then one Elasticsearch instance may not be the right answer.

-- ShemTov
Source: StackOverflow

12/5/2019

Based on [1], we can read:

Fluentd collects logs from pods running on cluster nodes, then routes them to a centralized Elasticsearch.

Then Elasticsearch ingests these logs from Fluentd and stores them in a central location. It also provides efficient full-text search.

Kibana is the UI; the user can visualize the collected logs and metrics and create custom dashboards based on queries.

There are several ways to solve your dilemma:

a) Create a centralized dashboard and use each cluster's Elasticsearch as a backend, so you can see all your clusters' logs in one place (see the cross-cluster search sketch after this list).

b) Create an Elasticsearch cluster and add each Elasticsearch into it. This is NOT the best option, since you will duplicate your data several times, you will need to manage each index's shards, and you will have to fight the split-brain problem, but it is great for data resiliency.

c) Use another solution, like an APM (New Relic, Instana, etc.), to fully centralize your logs in one place.
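For option a), a rough sketch using Elasticsearch cross-cluster search (the cluster names and seed addresses below are placeholders): register each cluster's Elasticsearch as a remote on the cluster that backs your central Kibana, then build index patterns that span them.

    PUT _cluster/settings
    {
      "persistent": {
        "cluster": {
          "remote": {
            "cluster_a": { "seeds": ["es-a.example.com:9300"] },
            "cluster_b": { "seeds": ["es-b.example.com:9300"] }
          }
        }
      }
    }

A Kibana index pattern such as cluster_*:logstash-* should then surface logs from all the registered clusters in one place.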

[1] https://techbeacon.com/enterprise-it/9-top-open-source-tools-monitoring-kubernetes

-- Armando Cuevas
Source: StackOverflow