Blason
(R)
November 20, 2017, 7:01pm
1
Hi guys,
I am struggling with a basic grok filter and am seeking your help, please.
Here is my file; I would like to parse it into geotype and attack_type fields:
"103.40.130.84": "bad reputation"
"103.40.132.18": "known attacker"
"103.40.138.100": "bad reputation"
"103.40.151.66": "known attacker"
"103.40.151.90": "anonymizer"
"103.40.160.194": "known attacker"
"103.40.162.173": "known attacker"
"103.40.197.163": "known attacker"
"103.40.226.70": "known attacker"
input {
  file {
    path => ["/opt/listbot/iprep.yaml"]
    type => "isnti"
  }
}
filter {
  if [type] == "isnti" {
    grok {
      match => [ "message", "%{IPV4:src_ip}\:%{SPACE}%{WORD:attack}" ]
    }
  }
}
"103.40.151.90": "anonymizer" This gets matched perfectly however "103.41.124.155": "known attacker" does not get match can someone please help?
Blason
(R)
November 20, 2017, 7:04pm
2
OK - Seems I should have used DATA instead of WORD
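The likely reason: in the stock grok pattern definitions that ship with Logstash, WORD stops at a word boundary while DATA matches lazily up to whatever follows it in the pattern, so a two-word value such as "known attacker" cannot be captured by WORD as a single field:
# From the standard grok-patterns file shipped with Logstash
WORD \b\w+\b
DATA .*?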
Blason
(R)
November 20, 2017, 7:16pm
3
Now I am unable to run the pipeline. Can someone please look at my config?
input {
  file {
    path => ["/opt/listbot/iprep.yaml"]
    type => "isnti"
  }
}
filter {
  if [type] == "isnti" {
    grok {
      match => [ "message", "%{IPV4:src_ip}"\:%{SPACE}"%{DATA:attack}" ]
    }
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
And here is the error -
[FATAL] 2017-11-21 06:08:14.500 [LogStash::Runner] runner - The given configuration is invalid. Reason: Expected one of #, {, ,, ] at line 12, column 41 (byte 159) after filter {
if [type] == "isnti" {
grok {
match => [ "message", "%{IPV4:src_ip}"
Badger
November 20, 2017, 7:47pm
4
You have double quotes in your double-quoted string. That's not going to work unless you escape them or change to single quotes. Try
match => [ "message", '%{IPV4:src_ip}": "%{DATA:attack}"' ]
which I think is easier to read than
match => [ "message", "%{IPV4:src_ip}\": \"%{DATA:attack}\"" ]
Blason
(R)
November 21, 2017, 4:08am
5
Yep, that worked. Now, novice that I am, I am not sure why the data is not appearing in ES. The rubydebug output shows the data is being processed, but somehow it is not appearing in ES.
My index pattern got created properly.
"@timestamp" => 2017-11-21T09:34:47.868Z,
"geoip" => {
"timezone" => "Asia/Hong_Kong",
"ip" => "103.16.228.104",
"latitude" => 22.25,
"country_name" => "Hong Kong",
"country_code2" => "HK",
"continent_code" => "AS",
"country_code3" => "HK",
"location" => {
"lon" => 114.1667,
"lat" => 22.25
},
"longitude" => 114.1667
},
"attack" => "bot, crawler",
"ip" => "103.16.228.104",
"@version" => "1",
"host" => "broelk.isn.in",
"message" => "\"103.16.228.104\": \"bot, crawler\"",
"type" => "isnti"
input {
  stdin {
    type => "isnti"
  }
}
filter {
  if [type] == "isnti" {
    grok {
      match => [ "message", '"%{IPV4:ip}": "%{DATA:attack}"' ]
    }
    geoip { source => "ip" }
  }
}
output {
  if [type] == "isnti" {
    stdout {
      codec => rubydebug
    }
    elasticsearch {
      hosts => ["192.168.5.27:9200"]
      index => "logstash-isnti-%{+YYYY.MM.dd}"
      template => "/etc/logstash/logstash-template.json"
    }
  }
}
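When indexed events seem to vanish like this, one quick check (assuming the same host and port as in the config above) is to ask Elasticsearch directly which daily indices exist and how many documents they hold; a clock that is off shows up here as an unexpected date in the index name:
# Lists all logstash-isnti-* indices with their document counts
curl '192.168.5.27:9200/_cat/indices/logstash-isnti-*?v'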
Blason
(R)
November 21, 2017, 4:24am
6
OK - got it and fixed it. It was a time issue on my server; I fixed it with ntpdate.
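For anyone hitting the same symptom: if the server clock is wrong, events get stamped and indexed under the wrong date and fall outside Kibana's default time range. A one-off sync looks like the line below (the NTP server is just an example; a running NTP daemon such as ntpd or chrony is the permanent fix):
# One-off clock synchronisation against a public NTP pool
sudo ntpdate pool.ntp.org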
system
(system)
Closed
December 19, 2017, 4:24am
7
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.