I am using Logstash 5.6.3 and am trying to figure out how to configure multiple filters and outputs for a single input. My setup is Filebeat forwarding to Logstash, and from Logstash I want to output to both Elasticsearch and a CSV file. The required fields differ slightly between the two outputs, and the formatting is also a bit different.
Could someone please give me some direction?
Appreciate your help.
Thanks PandKing for the quick answer. I guess my original question was not clear enough.
I was able to get to a somewhat similar point. I'm actually stuck on the filter and on the list of fields I want in each of the two outputs.
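The point I've reached is roughly a single input fanned out to two outputs. This is only a minimal sketch: the port, hosts, path, index name, and field list are placeholders, not my actual values.

```
input {
  beats {
    port => 5044            # placeholder: Filebeat connection port
  }
}

output {
  # every event goes to both outputs as-is
  elasticsearch {
    hosts => ["localhost:9200"]          # placeholder host
    index => "weblogs-%{+YYYY.MM.dd}"    # placeholder index name
  }
  csv {
    path   => "/tmp/weblogs.csv"         # placeholder path
    fields => ["@timestamp", "message"]  # placeholder field list
  }
}
```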
Below is sample http log.
50...* - - [25/Apr/2018:01:11:52 -0000] "GET https://myserver/my/path http/2" 200 1636728 200 1636728 0 0 335 579 468 571 0.524 0.450 DIRECT FIN FIN TCP_MISS "AppleCoreMedia/1.0.0.14W585a (Apple TV; U; CPU OS 10_2_1 like Mac OS X; en_us)" 14BF2CA8-9291-42E0-8A32-3FF6897ACBD9
I want to parse all the fields and send them to Elasticsearch as-is. I want to do a little extra when I send them to the file:
I want to convert the HTTP date to an epoch timestamp and resolve the IP to geolocation details.
What is happening now is that any mutation I do in the filter means all the new fields end up in the Elasticsearch index as well.
Can I be selective about which fields go to Elasticsearch and which fields go to the file?
No, but you can use a clone filter to split each event in two. Configure Logstash to send the original to ES and modify the cloned event as you please and send it to the file.
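A rough sketch of that clone-filter approach, assuming Logstash 5.x behavior where the clone filter sets the event's type field to the clone name. The source field names (clientip, timestamp), the date pattern, the CSV path, and the output field list are assumptions you would adapt to your grok results:

```
filter {
  clone {
    clones => ["file_copy"]   # emits a duplicate event with type "file_copy"
  }

  if [type] == "file_copy" {
    # extra processing applied only to the copy destined for the file
    geoip {
      source => "clientip"    # assumed field name from your grok pattern
    }
    date {
      match  => ["timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]  # assumed httpd date format
      target => "@timestamp"
    }
    ruby {
      code => "event.set('epoch', event.get('@timestamp').to_i)"
    }
  }
}

output {
  if [type] == "file_copy" {
    csv {
      path   => "/var/log/weblogs.csv"               # placeholder path
      fields => ["epoch", "clientip", "geoip"]       # placeholder field list
    }
  } else {
    # the untouched original goes to Elasticsearch
    elasticsearch {
      hosts => ["localhost:9200"]                    # placeholder host
    }
  }
}
```

The conditional on [type] is what keeps the mutations out of the Elasticsearch copy: only the clone passes through the geoip, date, and ruby filters, while the original event reaches the elasticsearch output unmodified.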