Not able to index logs coming from the shipper through Redis into ELK

Hi,

I am deploying ELK for a POC. I have pushed logs into the Logstash shipper (I can see them arriving using tcpdump), and I can see them in Redis, but they never show up in Elasticsearch or Kibana. This issue appeared after I applied a filter plugin in the Logstash indexer.

Could anybody help me with this?

So it appears that the Logstash instance that reads from Redis and posts to ES stopped working when you added a filter? Does it start to work again if you remove the filter?

Hi Magnus,

It did not work. I reinstalled Logstash, thereby removing the Logstash patterns folder, and used just one pattern with the grok filter. Then it started working.

Earlier I had tried to use the patterns (https://github.com/logstash-plugins/logstash-patterns-core/tree/master/patterns) by installing JRuby and running the Gemfile, but this did not work.

I want to use these Logstash patterns. How can I use them without editing the Gemfile? It appears that editing the Gemfile is what stopped the indexer from working.

I don't understand exactly what you want to do, but it sounds like you should either

  • upgrade the logstash-patterns-core plugin or
  • add the wanted pattern file(s) in e.g. /etc/logstash/patterns and point your grok filter to that directory.
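For the second option, a minimal sketch might look like the following (the directory path and the SYSLOGLINE pattern are just examples; point patterns_dir at wherever you keep your pattern files and reference whichever pattern you need):

```
filter {
  grok {
    # Load additional pattern definitions from this directory (example path)
    patterns_dir => ["/etc/logstash/patterns"]
    match => { "message" => "%{SYSLOGLINE}" }
  }
}
```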

Hey Magnus,

I have applied multiple grok patterns in the Logstash configuration as follows:

filter {
  grok {
    match => ["message", "%{SYSLOGTIMESTAMP:timestamp} (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:logsource} %{SYSLOGPROG}"]
  }
  grok {
    match => { "message" => "Accepted %{WORD:auth_method} for %{USER:username} from %{IP:src_ip} port %{INT:src_port} ssh2" }
  }
  grok {
    match => { "message" => "Invalid user %{USER:username} from %{IP:src_ip}" }
  }
  grok {
    match => { "message" => "Failed %{WORD:auth_method} for %{USER:username} from %{IP:src_ip} port %{INT:src_port} ssh2" }
  }
}

The logstash indexer has extracted the fields from the first two grok patterns but not from the last two. Could you help me on this?

How can I extract fields for different logs? What would be the appropriate way to do the filtering and indexing in the logstash for different types of logs?

The logstash indexer has extracted the fields from the first two grok patterns but not from the last two. Could you help me on this?

Yes, if you show me an example message that's currently not processed correctly.

How can I extract fields for different logs? What would be the appropriate way to do the filtering and indexing in the logstash for different types of logs?

Use conditionals on e.g. the type field. See https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html.
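For example, a hedged sketch of type-based conditionals (this assumes your shipper tags events with a type such as "sshd" or "syslog"; adjust the names to whatever types you actually set):

```
filter {
  if [type] == "sshd" {
    grok {
      match => { "message" => "Invalid user %{USER:username} from %{IP:src_ip}" }
    }
  } else if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST:logsource} %{SYSLOGPROG}" }
    }
  }
}
```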

Hi Magnus,

Thanks for the reply.

The sample logs that are not being processed correctly are of these three types:

  1. Invaild user username from 123.123.123.123
  2. Failed password for username from 123.123.123.123 port 1234 ssh2
  3. Failed password for invaild username from 123.123.123.123 port 1234 ssh2

Please show the raw output from a stdout { codec => rubydebug } output for a message that isn't processed correctly (and presumably has a _grokparsefailure tag).
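For reference, that means temporarily adding something like this to your indexer configuration so each event is printed to the console:

```
output {
  stdout { codec => rubydebug }
}
```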

Hi Magnus,

The output for the message "Invaild user username from 123.123.123.123":

{
       "message" => "Invaild user username from 123.123.123.123",
      "@version" => "1",
    "@timestamp" => "2016-09-15T06:11:37.151Z",
          "host" => "ip-10-0-0-10",
          "tags" => [
        [0] "_grokparsefailure"
    ]
}

"Invalid" is misspelled in the log message but not in your grok expression.

Hi Magnus,

It has worked, but the _grokparsefailure tag is still there. Will this cause any problem during parsing?

Invalid user username from 123.123.123.123

{
       "message" => "Invalid user username from 123.123.123.123",
      "@version" => "1",
    "@timestamp" => "2016-09-15T07:06:54.759Z",
          "host" => "ip-10-0-0-10",
          "tags" => [
        [0] "_grokparsefailure"
    ],
      "username" => "username",
        "src_ip" => "123.123.123.123"
}

You have multiple grok filters but only one of them matches any given input, so the remaining filters obviously fail and tag the event with _grokparsefailure. Note that you can list multiple grok expressions in a single grok filter, which helps in this situation.
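As a sketch, your four expressions could be merged into one grok filter like this; grok stops at the first matching expression by default (break_on_match), so the alternatives that don't match no longer add _grokparsefailure:

```
filter {
  grok {
    match => { "message" => [
      "%{SYSLOGTIMESTAMP:timestamp} (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:logsource} %{SYSLOGPROG}",
      "Accepted %{WORD:auth_method} for %{USER:username} from %{IP:src_ip} port %{INT:src_port} ssh2",
      "Invalid user %{USER:username} from %{IP:src_ip}",
      "Failed %{WORD:auth_method} for %{USER:username} from %{IP:src_ip} port %{INT:src_port} ssh2"
    ] }
  }
}
```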