Logstash plus Filebeat pipeline

Hi there,

I have an API log file with the following format:

2018.27.12 17:37:28.423 GET /api/v1/catalogManagement/productOffering 200 4
2018.27.12 17:37:28.242 GET /api/v1/addressManagement/address 200 1214

I would like to extract all the fields from the log file and parse them with grok, setting @timestamp from the first two fields (date and time) in each row.


What have you got so far? Have you gone through this introduction to Logstash?

The main issue I have is how to parse the dot-separated date together with the time, and use them to set the @timestamp field.

What have you got so far? What is not working?

I need to parse the date and time from the fields, and then use them as the Logstash timestamp for the event.

The format of the lines in the file is like this:
2018.27.12 17:37:28.423 GET /api/v1/catalogManagement/productOffering 200 4

Are you able to do it using grok ?

You can try patterns at the link below:
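A minimal sketch of a grok + date filter for this line format might look like the following. The field names (`http_method`, `uri_path`, `status_code`, `response_time`) are just illustrative choices, and note that `2018.27.12` appears to be year.day.month, hence the `yyyy.dd.MM` date pattern:

```conf
filter {
  grok {
    # capture the whole "2018.27.12 17:37:28.423" prefix into one field,
    # then the method, path, status code, and response time
    match => {
      "message" => "(?<log_ts>%{YEAR}\.%{MONTHDAY}\.%{MONTHNUM} %{TIME}) %{WORD:http_method} %{URIPATH:uri_path} %{NUMBER:status_code:int} %{NUMBER:response_time:int}"
    }
  }
  date {
    # parse year.day.month plus time with milliseconds into @timestamp
    match => ["log_ts", "yyyy.dd.MM HH:mm:ss.SSS"]
    target => "@timestamp"
    remove_field => ["log_ts"]
  }
}
```

You can paste a sample line and the pattern into a grok debugger to verify the captures before wiring it into the pipeline.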


I tried matching the date with this pattern:


Thanks so much for your input. I was able to make it work with a solution similar to the one you are proposing, and I was also able to set @timestamp using the date filter. The app logs were quite tricky because I had different date formats, but I have fixed them now. I was using that debug site; it's very useful.
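For anyone else hitting the "different date formats" issue: the date filter accepts several format strings in one match array and applies the first one that parses. The second and third formats below are hypothetical examples, not from this thread:

```conf
date {
  # list every format that appears in the logs; the first that parses wins
  match => [
    "log_ts",
    "yyyy.dd.MM HH:mm:ss.SSS",
    "yyyy-MM-dd HH:mm:ss.SSS",
    "ISO8601"
  ]
  target => "@timestamp"
}
```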

