I am deploying ELK for a POC. I have pushed logs into the Logstash shipper (I can see them arriving using tcpdump), and I can see them in Redis, but they never reach Elasticsearch and Kibana. This issue appeared after I applied a filter plugin in the Logstash indexer.
So it appears that the Logstash instance that reads from Redis and posts to ES stopped working when you added a filter? Does it start to work again if you remove the filter?
It did not work. I reinstalled Logstash, thereby removing the Logstash patterns folder, and used just one pattern with the grok filter; it started working again.
I want to use these Logstash patterns. How can I use them without editing the Gemfile? It appears to me that editing the Gemfile is what stopped the indexer from working.
I have applied multiple grok patterns in Logstash as follows:
filter {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:logsource} %{SYSLOGPROG}" }
  }
  grok {
    match => { "message" => "Accepted %{WORD:auth_method} for %{USER:username} from %{IP:src_ip} port %{INT:src_port} ssh2" }
  }
  grok {
    match => { "message" => "Invalid user %{USER:username} from %{IP:src_ip}" }
  }
  grok {
    match => { "message" => "Failed %{WORD:auth_method} for %{USER:username} from %{IP:src_ip} port %{INT:src_port} ssh2" }
  }
}
The logstash indexer has extracted the fields from the first two grok patterns but not from the last two. Could you help me on this?
How can I extract fields for different logs? What would be the appropriate way to do the filtering and indexing in the logstash for different types of logs?
> The logstash indexer has extracted the fields from the first two grok patterns but not from the last two. Could you help me on this?
Yes, if you show me an example message that's currently not processed correctly.
> How can I extract fields for different logs? What would be the appropriate way to do the filtering and indexing in the logstash for different types of logs?
Please show the raw output from a stdout { codec => rubydebug } output for a message that isn't processed correctly (and presumably has a _grokparsefailure tag).
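If you are not sure how to capture that, a temporary output section like this (a minimal sketch; remove it once the problem is diagnosed) dumps every event, including its tags, to Logstash's stdout:

```
output {
  # Temporary debugging output: prints each event as a Ruby hash,
  # so you can see all extracted fields and any _grokparsefailure tags.
  stdout { codec => rubydebug }
}
```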
You have multiple grok filters, but only some of them match any given input, so the remaining filters inevitably fail and tag the event with _grokparsefailure. Note that you can list multiple grok expressions in a single grok filter, which helps in this situation.
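As a sketch based on the patterns in your config: keep the syslog-prefix grok as-is, but merge the three mutually exclusive SSH expressions into one grok filter. With the default break_on_match behavior, grok stops at the first matching expression in the list, so an event is only tagged _grokparsefailure if none of them match:

```
filter {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:logsource} %{SYSLOGPROG}" }
  }
  grok {
    # One filter, several alternative expressions: grok tries them in
    # order and stops at the first match (break_on_match defaults to true).
    match => {
      "message" => [
        "Accepted %{WORD:auth_method} for %{USER:username} from %{IP:src_ip} port %{INT:src_port} ssh2",
        "Failed %{WORD:auth_method} for %{USER:username} from %{IP:src_ip} port %{INT:src_port} ssh2",
        "Invalid user %{USER:username} from %{IP:src_ip}"
      ]
    }
  }
}
```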