Logstash not shipping data to Elasticsearch

How do I get Logstash to ship data to Elasticsearch? I'm not sure what to change in the logstash.yml file, or which section I should change for that matter. My pipelines.yml points to /etc/logstash/conf.d/syslog.conf... and that file points to my Elasticsearch.

Maybe I'm confused as to the purpose of the logstash.yml vs the syslog.conf.

Any clarification would be amazing because the documentation is impossible to follow...

Thanks,

Joe

logstash.yml - parameters related to how LS runs: batch size, log level (info/debug/...), X-Pack settings, etc.
syslog.conf - configuration related to data processing: input, filter, output.
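
For example, a minimal sketch of how the two files differ (the values and paths below are illustrative assumptions, not taken from your setup):

# /etc/logstash/logstash.yml - controls how the Logstash process itself runs
pipeline.batch.size: 125       # events pulled per batch
log.level: info                # raise to debug while troubleshooting
path.logs: /var/log/logstash

# /etc/logstash/conf.d/syslog.conf - controls what one data pipeline does
input  { ... }   # where events come from (e.g. syslog, beats)
filter { ... }   # how events are parsed/enriched
output { ... }   # where events go (e.g. elasticsearch, stdout)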

As always (and worth repeating): add a debug stdout to your output:

output {
    elasticsearch {
      hosts => ["http://server:9200"]
      index => "indexname"
    } 

    stdout {codec => rubydebug}
}
  1. Check whether any data is displayed on the command line. Start LS as a process, not as a service (see the example command after this list).
  2. Check whether your data is reaching LS:
    curl http://localhost:9600/_node/stats/pipelines?pretty
  3. If there is no data, make another syslog-nofiltering.conf:
input { ... same, copy from /etc/logstash/conf.d/syslog.conf ... }
filter {} # empty, no filtering
output { stdout {codec => rubydebug} }
  4. Use tcpdump to dump network data for your port (see the sketch after this list).
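
For step 1, a common way to run Logstash in the foreground against a single pipeline file (the install path below is the usual package location and is an assumption about your system):

sudo /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/syslog.conf

For step 4, a sketch of a tcpdump capture, assuming your syslog input listens on port 514 (substitute whatever port your input actually uses):

# show traffic arriving on the syslog port of the Logstash host
sudo tcpdump -i any -nn port 514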

logstash.yml is the overall configuration of Logstash.

conf files like syslog.conf are where you describe the input, logic, and output of the actual data processing: the "pipelines" that get executed in Logstash.

There is one logstash.yml; there can be many .conf pipeline files.
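
As a sketch, a pipelines.yml running two pipelines from two conf files might look like this (the second pipeline id and path are hypothetical, for illustration only):

# /etc/logstash/pipelines.yml
- pipeline.id: syslog
  path.config: "/etc/logstash/conf.d/syslog.conf"
- pipeline.id: beats    # hypothetical second pipeline
  path.config: "/etc/logstash/conf.d/beats.conf"

Each entry points Logstash at one pipeline definition; logstash.yml stays the same regardless of how many pipelines you run.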

