Issue with custom parsing in grok filter

Hi All,

We have a custom log file from which we need to extract some required fields. We have set up Logstash to parse the data and push it to Elasticsearch.

Sample log line:
{"level":"info","message":"API Request:stopInstance, user:user1@gmail.com, InstanceID:i-033daf56fhjb4102c, Message: User initiated EC2 STOP event","timestamp":"2019-03-13T15:54:09.313Z"}

We want to extract the following fields:

level: info
EventName: stopInstance
UserID: user1@gmail.com
InstanceId: i-033daf56fhjb4102c
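To make the target concrete, here is a minimal Python sketch of that extraction (the regexes here are illustrative assumptions based on the sample line, not Logstash grok patterns):

```python
import json
import re

# The sample log line: valid JSON, with comma-separated
# key/value pairs embedded in the "message" string.
line = ('{"level":"info","message":"API Request:stopInstance, '
        'user:user1@gmail.com, InstanceID:i-033daf56fhjb4102c, '
        'Message: User initiated EC2 STOP event",'
        '"timestamp":"2019-03-13T15:54:09.313Z"}')

doc = json.loads(line)   # parse the outer JSON envelope
msg = doc["message"]     # the inner key/value string

fields = {
    "level": doc["level"],
    "EventName": re.search(r"API Request:([^,]+)", msg).group(1),
    "UserID": re.search(r"user:([^,]+)", msg).group(1),
    "InstanceId": re.search(r"InstanceID:([^,]+)", msg).group(1),
}
print(fields)
```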

We tried custom regex patterns in the grok filter, but it didn't work out; we are getting a grok parse error. Below is our test Logstash config file.

input {
   stdin{}
}
filter {
    grok {
      match => [
        "message", '(?<EventName>(?<=API:).*?(?= ::))',
        "message", '(?<level>(?<=el\":\").*?(?=\",\"))',
        "message", '(?<UserID>(?<=user:).*?(?=, Instance))',
        "message", '(?<InstanceID>(?<=eID).*?(?=, Message))'
      ]
    }
}
output {
   file {
      path => "./test"
   }
}

Could someone help us with this, please?

I would suggest:

    json { source => "message" }
    kv { field_split => ", " value_split => ":" }

You can then rename the fields with a mutate filter.
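Putting that together, the whole filter block might look something like this (a sketch, not tested against your data; the field names produced by the kv split, such as "API Request", are assumptions based on the sample line, so check the actual names with a `stdout { codec => rubydebug }` output before renaming):

```
filter {
  # Parse the outer JSON envelope; the inner "message" key
  # replaces the original message field.
  json { source => "message" }

  # Split the remaining "key:value, key:value" string.
  # kv reads the message field by default.
  kv {
    field_split => ", "
    value_split => ":"
  }

  # Rename the kv output to the desired field names.
  mutate {
    rename => {
      "API Request" => "EventName"
      "user"        => "UserID"
      "InstanceID"  => "InstanceId"
    }
  }
}
```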

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.