Hi friends, I use Logstash 2.4, Kafka 0.10.1 and an Elasticsearch 5.1.1 cluster.
I import three logs from Active Directory: Application logs, System logs and Security logs.
I created three Kafka topics (via ZooKeeper) for those logs:
_ topic_id => "ActiveDirectory-Application-Logs"
_ topic_id => "ActiveDirectory-System-Logs"
_ topic_id => "ActiveDirectory-Security-Logs"
To store the data, I use three indices in Elasticsearch:
_ index => "logstash-application"
_ index => "logstash-system"
_ index => "logstash-security"
When I start the pipeline, the data is stored in Elasticsearch. But there are some mistakes: some System or Application logs end up in my logstash-security index, and some Security logs end up in logstash-application.
I don't know how to force each kind of log into the right index.
Logstash has a single event pipeline where events from all inputs go to all outputs. Separating inputs and outputs into different files does not change this. If you want an event to reach only some outputs, you need to wrap the outputs in conditionals that select how events are routed.
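As a rough sketch (the `type` values, `zk_connect` address and `hosts` list below are assumptions you would adapt to your setup): tag the events in each kafka input, then route on that tag in the output section.

```
# Sketch only: one input per topic, each tagging its events with a
# distinct "type", shown here for the Security topic. Repeat for the
# Application and System topics with their own type values.
input {
  kafka {
    zk_connect => "localhost:2181"   # assumption: your ZooKeeper address
    topic_id   => "ActiveDirectory-Security-Logs"
    type       => "security"
  }
}

# Route each event to the index that matches its type.
output {
  if [type] == "security" {
    elasticsearch {
      hosts => ["localhost:9200"]    # assumption: your ES address
      index => "logstash-security"
    }
  } else if [type] == "system" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "logstash-system"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "logstash-application"
    }
  }
}
```

Note that if your Active Directory events already carry a field identifying the source log, you can condition on that field directly instead of setting `type` in the inputs.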
This exact question comes up here quite often. I'm sure you'll find useful information in the archives.
I don't believe the embedded option is supported in Logstash 2.0 and later, but it appears the error complains about the very beginning of the file. Check that you don't have a garbage character there. Otherwise, reduce the configuration until the error goes away, to narrow it down.