I've got a problem: I have only one input into Logstash via Filebeat, but two different log files, a Liferay log and an Apache access_log. I can see that Filebeat reads both files and sends them to Logstash.
  elasticsearch {
    hosts => "localhost:9200"
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
The problem is that I can't find the Apache access log in my Kibana. What's the problem? Do I have to configure two inputs for the two different file types, with different ports?
Not sure if you can have two completely separate filter configs; someone else will have to comment on that.
If it is possible to have several filter blocks in the config, then you are trying to grok the message field twice. You would probably want to add if statements, plus some fields you can use for that logic. You can add fields in Filebeat easily.
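A minimal sketch of that approach, assuming Filebeat's `fields` option (the paths and the `log_type` values are examples, not taken from the thread): tag each prospector in Filebeat, then branch on that field in the Logstash filter.

```
# filebeat.yml -- tag each prospector so Logstash can tell the sources apart
filebeat.prospectors:
  - paths:
      - /var/log/liferay/liferay.log   # example path
    fields:
      log_type: liferay
  - paths:
      - /var/log/apache2/access_log    # example path
    fields:
      log_type: apache_access
```

```
# logstash filter -- apply each grok only to the matching source
filter {
  if [fields][log_type] == "apache_access" {
    grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
  } else if [fields][log_type] == "liferay" {
    # your Liferay grok pattern goes here
  }
}
```

With this in place, neither grok runs against the wrong file, so you avoid spurious _grokparsefailure tags.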
I have all my filters in one file, inside one filter {}.
@Axel_Kruger @A_B Yes, you can distribute filters across multiple files. Logstash will concatenate the files, then start compiling. So, stupid as it might be, if you point a pipeline at a directory containing these 4 files, it will work as if all the configuration were in one file, because by the time it starts compiling, it effectively is one file.
That said, both grok filters will apply to both types of files, so you will always get _grokparsefailure tags. Also, the patterns are not anchored, so it will be expensive for them to fail, since they will backtrack a lot. The configuration would be a lot cheaper if the patterns were anchored to the beginning of the line with ^.
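For example, anchoring the stock COMBINEDAPACHELOG pattern (the ^ is the only addition; whether that pattern fits depends on your Apache log format) lets a non-matching line fail immediately instead of backtracking:

```
filter {
  grok {
    # ^ forces the match to start at the beginning of the line,
    # so a Liferay line is rejected cheaply instead of after backtracking
    match => { "message" => "^%{COMBINEDAPACHELOG}" }
  }
}
```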