Grok custom pattern is not working

In Logstash, we are trying to parse a sample log using a custom grok pattern.
The sample log looks like:

2017.09.26 00:59:47:158 UTC | Info | Sip | UserProfileNonCall [Thread #231] | +7777777777 | callhalf-10000000000

We have tried to match the input logs with following logstash configuration:

input {
  file {
    path => "/var/log/samplelog.txt"
    start_position => "beginning"
    codec => multiline {
      pattern => "^(?<timestamp>[0-9]+.[0-9]+.[0-9]+)"
      negate => true
      what => "previous"
    }
  }
}
filter {
  grok {
    match => [ "message" , "(?<timestamp>[0-9]+.[0-9]+.[0-9]+)" ]
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
  stdout { codec => rubydebug }
}

We have tried with the multiline codec. The above configuration does not produce any error message in the logstash.log file, but we are not able to see the logs in Kibana.

We have tried different combinations of custom grok patterns to match the log, but none of them work. Can you please suggest a solution?

I recommend using http://grokconstructor.appspot.com/do/construction to help you build grok patterns (or to verify that they work).

From what I understand of your grok, you only try to match the timestamp and you don't capture the rest of the log (see the fuller pattern sketch at the end of this reply).
And I see maybe three errors in your grok pattern:

First: it should start with "\A", so => "\A(?<timestamp>[0-9]+.[0-9]+.[0-9]+)"
Second: the grok match syntax uses { } with =>, not [ ]. Try this: match => { "message" => "\A(?<timestamp>[0-9]+.[0-9]+.[0-9]+)" }
Third: try "next" instead of "previous" for the multiline what option (but it seems that your log isn't multiline anyway).
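
Putting the first two fixes together, a minimal sketch of the filter could look something like this (untested, and keeping your timestamp field name):

filter {
  grok {
    # \A anchors the pattern at the very start of the message,
    # and the match option takes a hash of "field" => "pattern"
    match => { "message" => "\A(?<timestamp>[0-9]+.[0-9]+.[0-9]+)" }
  }
}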

(and please use preformatted text when posting, so spaces and indentation are preserved ^^)
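
For reference, here is a rough, untested sketch of a pattern that also captures the rest of the sample line, built from the standard grok patterns; the field names level, component, module, caller and callid are just placeholders I made up:

\A(?<timestamp>%{YEAR}\.%{MONTHNUM}\.%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND}:%{INT} %{TZ}) \| %{WORD:level} \| %{WORD:component} \| %{DATA:module} \| %{DATA:caller} \| %{GREEDYDATA:callid}

Paste it together with your sample line into the grok constructor above to check that it really matches before putting it in your config.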

