Converting log timestamp field into a time filter field

Hello Experts,

I need help converting my log timestamp into a time filter field.

I have a log entry like the one below:

Mar 5 03:23:26 0-1 proxy-server: 05/Mar/2019/11/23/26 GET /v1/ACC_1/.trash-01f2f42f-8cc0-45d2-a4ea-b8b117b0b659/0361543942741126%253A27a29814-dd8f-43ee-b768-19af98bf1d07%253A108/1543943058.64696 HTTP/1.0 200 - python-requests/2.5.1%20CPython/2.7.5%20Linux/3.10.0-327.36.3.el7.x86_64 - - - - tx6b4c8afe9e2a4b8eb7073-005c7e5c2e - 0.0871 - - 1551785006.259215117 1551785006.346281052 0

I used the grok patterns below, but no luck.

  1. SYSLOGTIMESTAMP: this grok pattern works when I run it from the Grok Debugger tool, but in Elasticsearch it shows up as _dateparsefailure
  2. %{MONTHDAY}[/]%{MONTH}[/]%{YEAR}[/]%{HOUR}[/]%{MINUTE}[/]%{SECOND} %{DATA} %{DATA}: I also tried this manual approach, but I still get the same problem
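For reference, a minimal grok filter that captures the leading syslog timestamp into a logdate field might look like the sketch below; the field names host and rest are illustrative assumptions, not names from the original config:

filter {
  grok {
    # Capture the leading syslog timestamp (e.g. "Mar 5 03:23:26") into "logdate".
    # "host" and "rest" are placeholder names for the remaining message parts.
    match => { "message" => "%{SYSLOGTIMESTAMP:logdate} %{DATA:host} %{GREEDYDATA:rest}" }
  }
}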

Here is the date filter from my conf file:

date {
  match => ["logdate", "MMM d HH:mm:ss"]
  target => "logdate"
  timezone => "PST8PDT"
}

For the first one, the field is

Mar  4 15:59:53

with two spaces before the single-digit day, so you could use

match => ["logdate", "MMM dd HH:mm:ss", "MMM  d HH:mm:ss"]
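Putting that together with the filter from the question, the corrected date filter would look something like this sketch (target and timezone carried over from the original conf):

date {
  # Try the two-digit-day pattern first, then the
  # single-digit-day pattern with two spaces after the month.
  match => ["logdate", "MMM dd HH:mm:ss", "MMM  d HH:mm:ss"]
  target => "logdate"
  timezone => "PST8PDT"
}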

Awesome @Badger, that was quick; it worked perfectly. I really appreciate your help.


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.