My Grok filter is not working

Hi guys,

I am struggling with a basic grok filter and would appreciate your help.

Here is my file; I would like to parse each line into src_ip and attack_type:

"103.40.130.84": "bad reputation"
"103.40.132.18": "known attacker"
"103.40.138.100": "bad reputation"
"103.40.151.66": "known attacker"
"103.40.151.90": "anonymizer"
"103.40.160.194": "known attacker"
"103.40.162.173": "known attacker"
"103.40.197.163": "known attacker"
"103.40.226.70": "known attacker"

input {
  file {
    path => ["/opt/listbot/iprep.yaml"]
    type => "isnti"
  }
}
filter {
  if [type] == "isnti" {
    grok {
      match => [ "message", "%{IPV4:src_ip}\:%{SPACE}%{WORD:attack}" ]
    }
  }
}

"103.40.151.90": "anonymizer" This gets matched perfectly however "103.41.124.155": "known attacker" does not get match can someone please help?

OK, it seems I should have used DATA instead of WORD.
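For anyone curious why: in the stock grok pattern definitions, WORD expands to roughly `\b\w+\b` while DATA is the lazy `.*?`, so WORD cannot cross the space inside "known attacker". A minimal Python sketch of the difference (the regexes below are approximations of the grok expansions, not the exact ones):

```python
import re

# Sample line that WORD failed on
line = '"103.41.124.155": "known attacker"'

# Rough stand-in for the grok IPV4 pattern (assumption: good enough for this demo)
ipv4 = r'(?:\d{1,3}\.){3}\d{1,3}'

# WORD ~ \w+ stops at the first non-word character, so the space inside
# "known attacker" breaks the match once the closing quote must follow.
word_pat = re.compile(ipv4 + r'"\s*:\s*"(\w+)"')
print(word_pat.search(line))            # None: \w+ cannot cross the space

# DATA ~ .*? is lazy but happily matches spaces, so it captures the whole
# value up to the closing quote.
data_pat = re.compile('"(' + ipv4 + r')"\s*:\s*"(.*?)"')
m = data_pat.search(line)
print(m.group(2))                       # known attacker
```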

Now I am unable to run the pipeline. Can someone please look at my config?

input {
  file {
    path => ["/opt/listbot/iprep.yaml"]
    type => "isnti"
  }
}
filter {
  if [type] == "isnti" {
    grok {
      match => [ "message", "%{IPV4:src_ip}"\:%{SPACE}"%{DATA:attack}" ]
    }
  }
}

output {
  stdout {
    codec => rubydebug
  }
}

And here is the error:

[FATAL] 2017-11-21 06:08:14.500 [LogStash::Runner] runner - The given configuration is invalid. Reason: Expected one of #, {, ,, ] at line 12, column 41 (byte 159) after filter {
if [type] == "isnti" {

    grok {
            match => [ "message", "%{IPV4:src_ip}"

You have double quotes inside your double-quoted string. That's not going to work unless you escape them or switch to single quotes. Try

match => [ "message", '%{IPV4:src_ip}": "%{DATA:attack}"' ]

which I think is easier to read than

match => [ "message", "%{IPV4:src_ip}\": \"%{DATA:attack}\"" ]

Yep, that worked. Now, novice question: I am not sure why the data is not appearing in ES. The rubydebug output shows the data is being processed, but somehow it does not appear in ES.

My index pattern got created properly.

"@timestamp" => 2017-11-21T09:34:47.868Z,
     "geoip" => {
          "timezone" => "Asia/Hong_Kong",
                "ip" => "103.16.228.104",
          "latitude" => 22.25,
      "country_name" => "Hong Kong",
     "country_code2" => "HK",
    "continent_code" => "AS",
     "country_code3" => "HK",
          "location" => {
        "lon" => 114.1667,
        "lat" => 22.25
    },
         "longitude" => 114.1667
},
    "attack" => "bot, crawler",
        "ip" => "103.16.228.104",
  "@version" => "1",
      "host" => "broelk.isn.in",
   "message" => "\"103.16.228.104\": \"bot, crawler\"",
      "type" => "isnti"

input {
  stdin {
    type => "isnti"
  }
}
filter {
  if [type] == "isnti" {
    grok {
      match => [ "message", '"%{IPV4:ip}": "%{DATA:attack}"' ]
    }
    geoip { source => "ip" }
  }
}

output {
  if [type] == "isnti" {
    stdout {
      codec => rubydebug
    }
    elasticsearch {
      hosts => ["192.168.5.27:9200"]
      index => "logstash-isnti-%{+YYYY.MM.dd}"
      template => "/etc/logstash/logstash-template.json"
    }
  }
}

OK, got it and fixed the data. It was a time issue on my server; I fixed it with ntpdate :slight_smile:
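A plausible mechanism for the clock issue (an assumption, since the thread does not spell it out): the elasticsearch output names the index from the event timestamp via `logstash-isnti-%{+YYYY.MM.dd}`, so a skewed clock writes events into a different daily index and a timestamp range that Kibana's time filter may not cover. A small sketch of the index-naming side:

```python
from datetime import datetime, timedelta, timezone

# The config's index => "logstash-isnti-%{+YYYY.MM.dd}" derives the
# index name from the event's @timestamp.
def index_for(ts: datetime) -> str:
    return ts.strftime("logstash-isnti-%Y.%m.%d")

real_now = datetime(2017, 11, 21, 9, 34, tzinfo=timezone.utc)
skewed = real_now - timedelta(days=3)   # hypothetical: clock three days slow

print(index_for(real_now))              # logstash-isnti-2017.11.21
print(index_for(skewed))                # logstash-isnti-2017.11.18
```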

