How to integrate EKS with ELK

Hi team,

We would like to explore integrating EKS with ELK.

Currently, we get our logs through the path below.

Logs Flow - FileBeat/MetricBeat -> AWS Kafka -> Logstash -> ES -> Kibana
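
The Kafka -> Logstash -> ES leg of this flow is typically wired up with a small Logstash pipeline. A minimal sketch, assuming placeholder broker addresses, topic name, and Elasticsearch host (none of these values come from the thread):

    input {
      kafka {
        # brokers and topic are placeholders; match your MSK setup
        bootstrap_servers => "kafka1:9092,kafka2:9092"
        topics => ["eks-logs"]
        codec => "json"
      }
    }

    output {
      elasticsearch {
        # add credentials/TLS settings as your cluster requires
        hosts => ["https://elasticsearch:9200"]
        index => "eks-logs-%{+YYYY.MM.dd}"
      }
    }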

So, we would like to integrate EKS with ELK. We already have Docker pushing logs to ELK; in the same manner, we need to connect the log flow from EKS to AWS Kafka.

Could someone please guide us on how to integrate EKS with ELK? That would help us explore this further.

Thanks & Regards,
Yasar Arafaath A.

Perhaps look at Elastic Agent:

Fleet-managed

or standalone, since you probably want to send to Kafka.

You can still use Beats; there are docs on how to deploy it on Kubernetes.

Or, if you are shipping logs to S3 in another manner, you could probably forward with…

Hi Stephen,

I have read all of those documents.

I am a bit confused. Could you please tell me which document matches my requirements?

One of the application teams already has Docker pushing logs to ELK, but we don't have much knowledge about Docker. In the same manner, we need to connect the log flow from EKS to AWS Kafka.

My queries are below:

[1] How can we get the Docker logs into ELK?
[2] How can we send the Docker logs from EKS to Kafka?
[3] For EKS, do we need to install Kubernetes? If yes, where do we need to do that?

And finally:

[4] How can we integrate EKS with ELK?

Regards,
Yasar Arafaath A

I am confused; those two sentences seem to contradict each other.

So if you already have Docker logs pushing to Elastic, then just change the output to Kafka... What are you using to ship the logs?

If you follow this and just change the output to Kafka, that will get your logs from EKS to Kafka:

EKS -> Filebeat (log) -> Kafka

Here is the manifest:

https://raw.githubusercontent.com/elastic/beats/8.5/deploy/kubernetes/filebeat-kubernetes.yaml

Change this part of the manifest:

    output.elasticsearch:
      hosts: ['${ELASTICSEARCH_HOST:elasticsearch}:${ELASTICSEARCH_PORT:9200}']
      username: ${ELASTICSEARCH_USERNAME}
      password: ${ELASTICSEARCH_PASSWORD}

to Kafka Output.

Example

    output.kafka:
      # initial brokers for reading cluster metadata
      hosts: ["kafka1:9092", "kafka2:9092", "kafka3:9092"]

      # message topic selection + partitioning
      topic: '%{[fields.log_topic]}'
      partition.round_robin:
        reachable_only: false

      required_acks: 1
      compression: gzip
      max_message_bytes: 1000000
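
One note on the example above: `topic: '%{[fields.log_topic]}'` resolves the topic from a field on each event, so that field has to be set somewhere. A minimal sketch of setting it on the Filebeat input (the `container` input type matches the Kubernetes manifest; `eks-logs` is a placeholder topic name, not from the thread):

    filebeat.inputs:
      - type: container
        paths:
          - /var/log/containers/*.log
        fields:
          # referenced by the output as %{[fields.log_topic]}
          log_topic: eks-logs

Alternatively, a static `topic: "eks-logs"` in `output.kafka` avoids the per-event lookup entirely.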

Your flow is quite complex for just getting started... and if you do not have experience with the components it will be quite hard.

You are going to need to "Dig in and Test / Try / Fix" it is the only way...

When I see this

EKS Logs Flow -> FileBeat/MetricBeat -> AWS Kafka -> Logstash -> ES -> Kibana

I suggest this as a progression:

EKS Logs Flow -> FileBeat/MetricBeat -> ES -> Kibana

EKS Logs Flow -> FileBeat/MetricBeat ->  Logstash -> ES -> Kibana

EKS Logs Flow -> FileBeat/MetricBeat -> AWS Kafka -> Logstash -> ES -> Kibana
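
For the middle step (Filebeat -> Logstash), the Logstash side can be sketched with a Beats input; 5044 is the conventional Beats port, and the Elasticsearch host is a placeholder:

    input {
      beats {
        port => 5044
      }
    }

    output {
      elasticsearch {
        # add credentials/TLS settings as your cluster requires
        hosts => ["https://elasticsearch:9200"]
      }
    }

On the Filebeat side, `output.elasticsearch` would then be replaced with `output.logstash` pointing at `["logstash:5044"]`.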

EKS is Kubernetes... I am confused...

Good Luck! come back with specific questions on your journey.

Hi Stephen,

Thanks for the help.

Please find my query below.

Could you please guide us on how to install Filebeat for Docker / Docker images, so that we can get the first step right?

Please share some Filebeat config that would help us.

Thanks,
Yasar Arafaath

I don't know what that means. Do you want to install Filebeat on Kubernetes? If so, here are the directions.

You need to actually try, and then ask questions. I'm not going to guess and create step-by-step instructions for you; that's what the documentation is for.

You will see that the people who get good answers provide specific questions, then specific configurations, and then specific results, logs, errors, etc.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.