I need help with grok patterns. I have a log file where several fields are sometimes missing from a line. Based on that, I came up with the grok pattern below, but it is not working: the Grok Constructor shows the lines as matched, but most fields come back blank.
For example, AlgoString in the first line and Client are showing blank. Can anyone help with this?
Log lines:
07:11:02.002015|INF|RAW(0x2590ed0)|=> Received OrderNew={ Transaction=574F5D76-4BD-3-65 AlgoString=TEST:0:0:50:50 ClOrdId=-1 Client=TEST1
10:30:20.790316|INF|RAW(0x2b77e400ab70)|=> Received OrderNew={ Transaction=574F8C2C-C0E63-7456-65 Price=2121 ClOrdId=1 Client=TEST2
09:11:36.682557|INF|RAW(0x307fbc0)|=> Received OrderNew={ Transaction=574F79B8-A6943-91F-65 Price=664 ClOrdId=32
First, there is a space before your Client= which is throwing off the match.
Next, you are using a lot of greedy .* expressions. I would replace these with .+ or, better yet, match the actual data. \S is nice because it matches anything that is not whitespace. .* has some weird effects and sometimes matches more than you expect.
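The difference is easy to see with a quick standalone regex experiment (a Python sketch, not Logstash itself; the field values are taken from your sample lines):

```python
import re

line = "Transaction=574F5D76 AlgoString=TEST:0:0:50:50 ClOrdId=-1 Client=TEST1"

# Greedy .* backtracks to the LAST place the rest of the pattern can match,
# so the second capture swallows the ClOrdId field as well.
greedy = re.match(r"Transaction=(.*) AlgoString=(.*) ", line)
print(greedy.group(2))  # "TEST:0:0:50:50 ClOrdId=-1"

# \S+ stops at the first whitespace, so each capture is exactly one token.
tight = re.match(r"Transaction=(\S+) AlgoString=(\S+)", line)
print(tight.group(2))   # "TEST:0:0:50:50"
```

Grok's %{DATA} and %{GREEDYDATA} are just named wrappers around .*? and .*, so the same backtracking behavior applies inside your pattern.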
Finally, maybe you don't want to match that payload with grok at all. At first glance it looks like JSON, which the json filter would handle, but it isn't valid JSON, so try the kv filter instead.
See if something like this helps; you should get the gist of what I was doing with it.
filter {
  # grab the message and create a new field holding the key=value payload
  grok {
    match => { "message" => "^%{TIME:logtime}\|%{DATA:severity}\|(|.+)\|.+Received %{DATA:TransType}=%{GREEDYDATA:app_data}" }
  }
  # split app_data into individual fields on the key=value pairs
  kv {
    source => "app_data"
  }
}
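To see why kv copes with missing fields where a rigid grok pattern does not, here is a rough Python sketch of the splitting the kv filter performs on app_data (an approximation for illustration, not the actual Logstash implementation):

```python
def kv_parse(app_data):
    """Split on whitespace, then on the first '=' of each token,
    mimicking the default behavior of Logstash's kv filter."""
    fields = {}
    for token in app_data.split():
        if "=" in token:
            key, value = token.split("=", 1)
            fields[key] = value
    return fields

# A line with all fields present: every key appears in the result.
full = kv_parse("{ Transaction=574F5D76-4BD-3-65 AlgoString=TEST:0:0:50:50 ClOrdId=-1 Client=TEST1")

# A line with fields missing: absent keys are simply not emitted,
# instead of forcing the whole pattern to fail or capture blanks.
partial = kv_parse("{ Transaction=574F79B8-A6943-91F-65 Price=664 ClOrdId=32")
```

Because each pair is parsed independently, a line missing AlgoString or Client still yields all the fields it does contain, which is exactly the problem the original grok-only approach ran into.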
Of course, if you do have valid JSON, you can replace kv with the json filter and the parsing is automagic.