Grok makes log go missing

Here is my example log
<134>May 24 17:15:52 asdsgag.com 1,yyyy/mm/dd 17:15:52,001801056715,TRAFFIC,end,1,yyyy/mm/dd 17:15:52,xxx.xxx.xxx.xxx,xxx.xxx.xxx.xxx,0.0.0.0,0.0.0.0,TEST,,,incomplete,local1,wifi,office,cd5.32,rt5.11,asdsgag,2021/05/24 17:15:52,278516,1,54685,9165,0,0,0x19,tcp,allow,66,66,0,1,2021/05/18 15:05:43,0,any,0,187895353,0x8000005240000000,0.0.0.0-0.255.255.255,0.0.0.0-0.255.255.255,0,1,0,aged-out,13,0,0,0,,FW,from-policy,,,0,,0,,N/A,0,0,0,0

<[0-9]+>%{MONTH} %{MONTHDAY} %{TIME} %{DATA:url},%{DATA:datetime},[0-9]+,%{DATA:type},%{GREEDYDATA:event}
<[0-9]+>%{MONTH} %{MONTHDAY} %{TIME} %{DATA:url},%{DATA:datetime},[0-9]+,(?<type>[A-Z]+),%{GREEDYDATA:event}
These two patterns make my log disappear: no error, no _grokparsefailure.

<[0-9]+>%{MONTH} %{MONTHDAY} %{TIME} %{DATA:url},%{DATA:datetime},[0-9]+,%{GREEDYDATA:event}
This one works properly.
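The only difference I can see between the failing patterns and this working one is the extra type capture, so maybe that field name collides with something downstream (type is treated specially in some Logstash/Elasticsearch setups). A variant worth testing captures it under another name, e.g. (untested sketch, log_type is just an example name):

<[0-9]+>%{MONTH} %{MONTHDAY} %{TIME} %{DATA:url},%{DATA:datetime},[0-9]+,(?<log_type>[A-Z]+),%{GREEDYDATA:event}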

I want to split my log to get more fields, though. I tried to check what happens and ended up here, where my log is gone, but I have no idea how to check or trace the error. There is no info in /var/log/logstash/logstash-plain.log.
I also tried adding stdout { codec => rubydebug } to the output, but I still don't get anything back in journalctl -fu logstash.
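For reference, the pattern sits in my filter block roughly like this (simplified sketch using the first pattern above; tag_on_failure is the default value, spelled out only to make failures easy to spot):

filter {
  grok {
    match => { "message" => "<[0-9]+>%{MONTH} %{MONTHDAY} %{TIME} %{DATA:url},%{DATA:datetime},[0-9]+,%{DATA:type},%{GREEDYDATA:event}" }
    tag_on_failure => ["_grokparsefailure"]
  }
}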

Any idea?

Hi Dawit,

I understand that the tail of your message string will end up in the event field, since you've used GREEDYDATA.
Can you please elaborate on what you mean by "log gone"?
Are you unable to see it in Discover?

"Log gone" in my case means it does not appear on Kibana's Discover page (my index pattern), not even a single field, and it does not show in journalctl -fu logstash either. I can't find my log anywhere, and I am confident it was sent and matched my criteria, because when I force a _grokparsefailure it appears again.

P.S. I used the Grok Debugger in Kibana's Dev Tools and the pattern works there.

Hey Dawit,

A couple of things which might help in understanding the issue.

  1. What is the source of the data and how is it shipped?
    If it's via a static file, make sure it's not a problem with the registry/sincedb file (see the sketch after this list).

  2. Have you tried querying the index directly?
    Many times data won't appear in Kibana Discover due to:

  • The time field of the index pattern is missing from the document, e.g. a missing @timestamp
  • A timezone difference, meaning your data is in the ES index but is not appearing in Discover because it was ingested with a future timestamp
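On the registry/sincedb point: if the source were a static file read by the Logstash file input, the sincedb file (Filebeat's equivalent is the registry) remembers read offsets, so an already-read file is silently skipped. A minimal sketch with example paths:

input {
  file {
    path => "/var/log/example/traffic.log"              # example path
    sincedb_path => "/var/lib/logstash/sincedb_traffic" # tracks read offsets
    start_position => "beginning"                       # ignored once the file is in sincedb
  }
}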

The best approach in such cases is to check _count on the index or to search for specific terms.

Since you say "I am confident it was sent and matched my criteria, because when I force a _grokparsefailure it appears again",
I believe you should be able to find it in the index by querying, possibly via Dev Tools.
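For example, in Kibana Dev Tools (replace the index name with yours):

GET your-index-name/_count

GET your-index-name/_search
{
  "query": {
    "match": { "url": "asdsgag.com" }
  }
}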

The only other thing I can think of is to check your Logstash process in debug mode.
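For example (assuming a package install; adjust the paths to your setup):

# one-off, running in the foreground:
/usr/share/logstash/bin/logstash --log.level=debug -f /etc/logstash/conf.d/your-pipeline.conf

# or persistently, in /etc/logstash/logstash.yml:
log.level: debug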

Hope it helps!

  1. The logs are sent from a network tool; it is a traffic log, sent to Logstash directly via a configured IP and port.
  2. Yes, I tried that: it returned null, and the doc count does not increase either. I'm quite sure the date/time is correct because, as I mentioned, when I make it fail it shows up with the correct timestamp.

I haven't turned on debug mode yet, but I used this config in my Logstash .conf:

output {
  elasticsearch {
    hosts => ["host"]
    index => "name"
  }
  stdout { codec => rubydebug }
}

I don't see anything from the rubydebug output either.
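Maybe my next step is to stop the service and run Logstash in the foreground with only the stdout output, to rule out the elasticsearch output entirely. Something like (untested sketch, file name is just an example):

output {
  stdout { codec => rubydebug }
}

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/test-pipeline.conf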

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.