Grok help

{"@timestamp":"2018-03-31T04:00:00.237Z","@version":"1","message":"2018-03-31 03:59:22,12.106.8.702,ABC33&&000012345678,03/31/2018 03:59:59,23.47893,92.38397,0,6,ON,0,0,79.06,0,3395,588,2,0,40430,-71,12,17","tags":["_grokparsefailure"]}

Can anyone help me build a grok filter for this log? I just want this "message":"2018-03-31 03:59:22,12.106.8.702,ABC33&&000012345678,03/31/2018 03:59:59,23.47893,92.38397,0,6,ON,0,0,79.06,0,3395,588,2,0,40430,-71,12,17" part in Elasticsearch.

I made this grok pattern ("message":"%{DATA:data}",) but it throws an error when I try to run the config test.

I don't understand. What is the desired result? Show us an example.

This is the log: {"@timestamp":"2018-03-31T04:00:00.237Z","@version":"1","message":"2018-03-31 03:59:22,12.106.8.702,ABC33&&000012345678,03/31/2018 03:59:59,23.47893,92.38397,0,6,ON,0,0,79.06,0,3395,588,2,0,40430,-71,12,17","tags":["_grokparsefailure"]}

but I just want to take the data starting from "message":"2018-03-31 ************ till ,40430,-71,12,17".
I'm using this grok pattern ("message":"%{DATA:data}",) to filter, and
I'm getting this output in the grok debugger, which is what I need:

{
  "data": [
    [
      "message":"2018-03-31 03:59:22,12.106.8.702,ABC33&&000012345678,03/31/2018 03:59:59,23.47893,92.38397,0,6,ON,0,0,79.06,0,3395,588,2,0,40430,-71,12,17"
    ]
  ]
}

but when running the config test I'm getting this error:

ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path //usr/share/logstash/config/log4j2.properties. Using default config which logs to console
06:31:43.022 [LogStash::Runner] FATAL logstash.runner - The given configuration is invalid. Reason: Expected one of #, => at line 12, column 8 (byte 123) after filter {
grok {
match => {"message" => "("message":"%{DATA:data}",)"}
}

output {

Hey anush,

you could try to change your grok call from

grok { match => {"message" => "("message":"%{DATA:data}",)"} }

to

grok { match => {"message" => '("message":"%{DATA:data}",)'} }

using single quote chars; then Logstash does not get confused about where the quoted pattern starts and ends.
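In context, the whole filter block would look like this (a sketch; the surrounding input and output sections from your config are omitted):

```conf
filter {
  grok {
    # Single quotes around the pattern, so the inner double
    # quotes no longer terminate the config string.
    match => { "message" => '("message":"%{DATA:data}",)' }
  }
}
```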

Cheers,
Markus

Thanks a lot, it helped me out.


Don't use a grok filter to parse JSON. Use a json codec in your input plugin instead, or possibly a json filter.
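A minimal sketch of that approach with a json filter, assuming the whole JSON document arrives in the message field as in the sample event above:

```conf
filter {
  json {
    # Parse the JSON document held in the "message" field;
    # its keys become top-level fields on the event.
    source => "message"
  }
}
```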

but now I'm getting this:
s":["_grokparsefailure"]}
{
"@timestamp" => 2018-05-14T09:29:10.267Z,
"data" => "2018-03-31 03:59:22,12.106.8.702,ABC33&&000012345678,03/31/2018 03:59:59,23.47893,92.38397,0,6,ON,0,0,79.06,0,3395,588,2,0,40430,-71,12,17",
"@version" => "1",
"host" => "ls10",
"message" => "{"@timestamp":"2018-03-31T04:00:00.237Z","@version":"1","message":"2018-03-31 03:59:22,12.106.8.702,ABC33&&000012345678,03/31/2018 03:59:59,23.47893,92.38397,0,6,ON,0,0,79.06,0,3395,588,2,0,40430,-71,12,17","tags":["_grokparsefailure"]}"
}

I just want the data in message; I don't want extra parameters like "{"@timestamp":"2018-03-31T04:00:00.237Z","@version":"1",

What does your configuration look like? Always provide configuration and input together with the output you're getting.

I don't think the @timestamp field can be deleted, but for the rest a mutate filter can remove those undesired fields.
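A sketch of such a mutate filter, using the field names from the output shown above:

```conf
filter {
  mutate {
    # Drop the fields that are not wanted in the final event.
    remove_field => ["@version", "host", "message"]
  }
}
```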

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.