Grok custom pattern is not working

(Aparna Thomas) #1

In Logstash, we are trying to parse a sample log using a custom grok pattern.
The sample log line looks like:

2017.09.26 00:59:47:158 UTC | Info | Sip | UserProfileNonCall [Thread #231] | +7777777777 | callhalf-10000000000

We have tried to match the input logs with the following Logstash configuration:

input {
  file {
    path => "/var/log/samplelog.txt"
    start_position => "beginning"
    codec => multiline {
      pattern => "^(?<timestamp>[0-9]+.[0-9]+.[0-9]+)"
      negate => true
      what => "previous"
    }
  }
}

filter {
  grok {
    match => [ "message" , "(?<timestamp>[0-9]+.[0-9]+.[0-9]+)" ]
  }
}

output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
  stdout { codec => rubydebug }
}
We have tried with the multiline codec. The above configuration is not producing any error message in the logstash.log file, but we are not able to see the logs in Kibana.

We have tried different combinations of the custom grok pattern to match the log, but it is not working. Could you please provide a solution for this?
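As a quick sanity check outside Logstash, the timestamp portion of the pattern can be exercised with a plain regex engine. This is just a sketch: grok uses Oniguruma's `(?<name>...)` named-capture syntax, while Python spells it `(?P<name>...)`; the field name `timestamp` is assumed here.

```python
import re

# Same character classes as the grok pattern; Python named-group syntax.
# Note the dots are unescaped, as in the original pattern, so they match
# ANY character -- escaping them as \. would be stricter and safer.
pattern = re.compile(r"(?P<timestamp>[0-9]+.[0-9]+.[0-9]+)")

sample = ("2017.09.26 00:59:47:158 UTC | Info | Sip | "
          "UserProfileNonCall [Thread #231] | +7777777777 | callhalf-10000000000")

m = pattern.match(sample)
print(m.group("timestamp"))  # → 2017.09.26
```

If this matches but Logstash still shows nothing in Kibana, the problem is more likely in the pipeline configuration than in the regex itself.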

(Rodolphe Redouté) #2

I recommend you use a grok debugger to help you build your patterns (or to verify that they work).

From what I understand of your grok, you are only trying to match the timestamp and you don't take care of the rest of the log.
And I see maybe three errors in your configuration.

First: it should start with "\A", so => "\A(?<timestamp>[0-9]+.[0-9]+.[0-9]+)"
Second: the syntax for grok's match option uses { }, not [ ]. Try this: match => { "message" => "\A(?<timestamp>[0-9]+.[0-9]+.[0-9]+)" }
Third: try "next" instead of "previous" (but it seems that your log isn't multiline anyway)
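Putting the first two fixes together, the filter section might look like the sketch below. The field name `timestamp` is assumed, and the dots are escaped here so they only match literal "." characters, which the original pattern did not do:

```
filter {
  grok {
    match => { "message" => "\A(?<timestamp>[0-9]+\.[0-9]+\.[0-9]+)" }
  }
}
```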

(and please use the preformatted text mode to show the spaces and indentation ^^)

(system) #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.