Question on shipping logs from multiple microservices to Elasticsearch

I am considering pushing the application logs from multiple microservices to a remote Logstash server using Filebeat. Will Filebeat acquire a lock on the Logstash server to write the file? What are the considerations and best practices for pushing logs from Filebeat to a Logstash server?

@Pradeep_Kumar3 Welcome to the community.

I am not sure what you mean by acquiring a lock. There is no file lock involved: Filebeat does not write files on the Logstash server, it streams events to Logstash over a TCP connection using the Beats protocol.

Many Filebeat instances can send logs to a single Logstash instance (or to several) concurrently.

Filebeat will use the Logstash output.

Logstash will use the Beats input.
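
On the Filebeat side, the matching piece is the `output.logstash` section of `filebeat.yml`. A minimal sketch (the hostnames are placeholders, and `loadbalance` only matters when you list more than one Logstash host):

```yaml
# filebeat.yml (sketch; replace hosts with your own Logstash endpoints)
output.logstash:
  # One or more Logstash servers running the beats input on port 5044
  hosts: ["logstash-a.example.com:5044", "logstash-b.example.com:5044"]
  # Distribute events across the listed hosts instead of always using the first
  loadbalance: true
```

Each microservice host runs its own Filebeat with this output; nothing is shared or locked on the Logstash side, and if Logstash falls behind it applies backpressure to each Filebeat connection independently.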

A basic Logstash pipeline would look like this if you are using Filebeat modules:

input {
  # Receive events from Filebeat over the Beats protocol
  beats {
    port => 5044
  }
}

output {
  # Filebeat modules set [@metadata][pipeline] when the data should be
  # processed by an Elasticsearch ingest pipeline; pass it through if present
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => "https://061ab24010a2482e9d64729fdb0fd93a.us-east-1.aws.found.io:9243"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      pipeline => "%{[@metadata][pipeline]}"
      user => "elastic"
      password => "secret"
    }
  } else {
    # Events without an ingest pipeline go straight to the daily index
    elasticsearch {
      hosts => "https://061ab24010a2482e9d64729fdb0fd93a.us-east-1.aws.found.io:9243"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      user => "elastic"
      password => "secret"
    }
  }
}
