Currently I'm working on configuring an ELK stack.
Filebeat should send the JSON one-liners ONLY to Logstash for parsing.
Filebeat should send /var/log/secure, /var/log/yum.log and /var/log/audit/audit.log directly to Elasticsearch.
With my current configuration, both prospectors are sent to both outputs...
I thought pipelining would help here, but it didn't... Maybe my approach is wrong?
This kind of event routing is currently not possible with Filebeat. The pipeline setting is for configuring the ingest node pipeline in Elasticsearch; it has no effect with the Logstash output.
Either use two Filebeat instances or send all events to one output type only, for example as sketched below.
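A minimal two-instance setup could look roughly like this; the JSON log path and the hosts are placeholders, and each instance needs its own data directory so the registries don't clash:

```yaml
# filebeat-logstash.yml - first instance: JSON one-liners to Logstash only
filebeat.prospectors:
  - type: log
    paths:
      - /var/log/myapp/*.json      # hypothetical path for the JSON one-liners
output.logstash:
  hosts: ["localhost:5044"]
```

```yaml
# filebeat-es.yml - second instance: system logs straight to Elasticsearch
filebeat.prospectors:
  - type: log
    paths:
      - /var/log/secure
      - /var/log/yum.log
      - /var/log/audit/audit.log
output.elasticsearch:
  hosts: ["localhost:9200"]
```

Start each instance with its own config and data path, e.g. `filebeat -c filebeat-es.yml --path.data /var/lib/filebeat-es`.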
In 6.0 only one output can be enabled, but the pipeline can be configured per prospector. The pipeline setting is then forwarded to Logstash in the @metadata.pipeline field. This would allow you to do filtering in Logstash and still forward events to an Elasticsearch ingest node pipeline.
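Roughly, that 6.0 setup could look like the sketch below; the ingest pipeline name `system-logs`, the JSON log path and the hosts are just placeholders:

```yaml
# filebeat.yml - single Logstash output, pipeline set per prospector
filebeat.prospectors:
  - type: log
    paths:
      - /var/log/secure
      - /var/log/yum.log
      - /var/log/audit/audit.log
    pipeline: system-logs          # hypothetical ingest node pipeline in Elasticsearch
  - type: log
    paths:
      - /var/log/myapp/*.json      # hypothetical path for the JSON one-liners
output.logstash:
  hosts: ["localhost:5044"]
```

```
# Logstash: parse only events without a pipeline, and pass the
# pipeline name on to Elasticsearch for the others
input {
  beats {
    port => 5044
  }
}

filter {
  if ![@metadata][pipeline] {
    json {
      source => "message"          # parse the JSON one-liners here
    }
  }
}

output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => ["localhost:9200"]
      pipeline => "%{[@metadata][pipeline]}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
    }
  }
}
```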
The indentation in your config files seems to be pretty off. Is this a copy-and-paste error?