I am using a custom grok pattern to parse my log file. No matter what I do, it always gives me the _grokparsefailure tag.
However, it works perfectly on https://grokdebug.herokuapp.com/
My custom pattern file is located at:
C:\Users\Username\projects\Logstash\bin\patterns
Filename: mylogpattern
LogLevel [I|V|D|M|W|E|A|F]
MODULE \b\w+\b|------
MESSAGEID (?:[+-]?(?:[0-9]+))|----
SUBMODULE (.*?:)
MESSAGE (.*)|(.*?:)|(.*\s*?:)
My Logstash config file looks like this:
input {
  beats {
    host => "192.168.56.1"
    port => 7088
    congestion_threshold => 200
  }
}
filter {
  if [type] == "MyLog" {
    grok {
      patterns_dir => ["C:\Users\Username\projects\Logstash\logstash\bin\patterns"]
      match => { "message" => "%{YEAR:Year}%{MONTHNUM:Month}%{MONTHDAY:Day} %{HOUR:Hour}%{MINUTE:Minute}%{SECOND:Second} %{LogLevel:LogVerbosity} %{MODULE:MODULENAME}%{SPACE}%{MESSAGEID:MESSAGEID} %{SUBMODULE:SUBMODULE} %{MESSAGE:MESSAGE}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
Sample log file:
20160509 115108 I GEN 0000 ConnectionHandler.reconnect: Reconnect to the DB was done.
20160509 115108 I GEN 0000 84.1.3.1000012 : Reconnect to the DB was done.
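To sanity-check the combined pattern against the first sample line, here is a rough, hand-expanded Python translation of the match line (an approximation I wrote for illustration: grok actually uses the Oniguruma engine, and the YEAR/MONTHNUM/etc. patterns are simplified to fixed-width digit runs here):

```python
import re

# Hand-expanded approximation of the grok match line, using the custom
# patterns from the pattern file (LogLevel, MODULE, MESSAGEID, SUBMODULE, MESSAGE).
LOG_PATTERN = re.compile(
    r"(?P<Year>\d{4})(?P<Month>\d{2})(?P<Day>\d{2}) "
    r"(?P<Hour>\d{2})(?P<Minute>\d{2})(?P<Second>\d{2}) "
    r"(?P<LogVerbosity>[I|V|D|M|W|E|A|F]) "
    r"(?P<MODULENAME>\b\w+\b|------)\s+"
    r"(?P<MESSAGEID>[+-]?[0-9]+|----) "
    r"(?P<SUBMODULE>.*?:) "
    r"(?P<MESSAGE>.*)"
)

line = "20160509 115108 I GEN 0000 ConnectionHandler.reconnect: Reconnect to the DB was done."
m = LOG_PATTERN.match(line)
print(m.group("SUBMODULE"))  # ConnectionHandler.reconnect:
print(m.group("MESSAGE"))    # Reconnect to the DB was done.

# A blank line does not match -- exactly the case that tags _grokparsefailure:
print(LOG_PATTERN.match(""))  # None
```

So the sample lines themselves do match; anything that deviates from this shape (including an empty line) will not.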
It works perfectly on the grok debugger, but somehow it is failing while parsing.
tags:beats_input_codec_plain_applied, _grokparsefailure
Could someone please help me out with this? What am I doing wrong?
Are you sure that log passes through your "if"?
You can test that by adding an add_tag setting in your grok filter.
MrLee
May 11, 2016, 12:10pm
How do you start your logstash?
Do you have some other config files in the same config directory?
I'm starting it using this command; the config file is in a different folder:
logstash agent -f C:\Users\shubhamd\projects\Logstash\logstash\conf.d\central.conf -l C:\Users\shubhamd\projects\Logstash\logstash\conf.d\lslog.log --verbose
When I added the add_tag statement, Kibana was showing the correct log for a few seconds;
then it was again displaying something like:
host:localhost.localdomain tags:beats_input_codec_plain_applied, _grokparsefailure _id:AVSfyfAcoQty0p9BNk2p _type:MyLog _index:filebeat-2016.05.11 _score: -
The Filebeat debug log has something like this:
2016/05/11 12:23:32.087717 output.go:87: DBG output worker: publish 1808 events
2016/05/11 12:23:32.087761 client.go:146: DBG Try to publish 1808 events to logstash with window size 1024
Is it due to the heavy log count?
I have already increased congestion_threshold => 200.
What can I do to fix it?
MrLee
May 11, 2016, 12:39pm
"It works perfectly on grok debugger, but somehow it is failing while parsing."
If so, some log line must have failed to match; any such line will get the "_grokparsefailure" tag.
Maybe you should check your log data overall, or try to handle abnormal lines in Logstash.
All logs follow essentially the same format. I have also added a GREEDYDATA pattern for the long messages, and it parses those successfully; I can still see some parsed logs in Kibana.
It's just that the log count is really high, and the log file also has continuous blank lines in between.
Does it try to parse those blank lines as well?
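(Grok does attempt every event it receives, including blank ones, and a blank line cannot match the pattern, so each one gets tagged _grokparsefailure. A minimal sketch for dropping them before they reach grok, assuming the event text is in the standard message field:)

```
filter {
  # Drop events whose message is empty or whitespace-only,
  # so they never reach the grok filter.
  if [message] =~ /^\s*$/ {
    drop { }
  }
}
```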
We don't see the new tag, so the problem must be your condition if [type] == "MyLog".
Can you try replacing if [type] == "MyLog" with if "MyLog" in [type]?
tags:beats_input_codec_plain_applied, Groked message:20160502 092820 I BL 0003 284.1.3.1000051 : : No new files waiting in AC for collection @version :1 @timestamp :May 11th 2016, 17:52:23.705 type:Gatherer beat.hostname:localhost.localdomain beat.name:localhost.localdomain source:/var/log/log/mylog.log offset:583,680 input_type:log count:1 fields: - host:localhost.localdomain Year:2016 Month:05 Day:02 Hour:09 Minute:28 Second:20 LogVerbosity:I MODULENAME:BL MESSAGEID:0003 SUBMODULE:284.1.3.1000051 : MESSAGE:ROAMING_VOICE_AC_COLLECTOR_1000051: logmessage_at:2016-05-11T12:22:23.705Z
This is the successfully parsed log output in Kibana; I can see it has the Groked tag.
But logs which came a few seconds later are still not getting parsed.
What to do?
MrLee
May 11, 2016, 12:57pm
@Clement_Ros
In my opinion, if the "if" condition is not satisfied, the event will not pass through the grok filter, and so will not raise a "_grokparsefailure" error.
It won't solve your problem, but that LogLevel pattern matches the "|" character, which I'm sure you didn't intend. It should be [IVDMWEAF] instead.
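(The difference is easy to demonstrate; here illustrated with Python's re, since character-class semantics are the same in grok's Oniguruma engine: inside [...] the "|" is a literal character, not alternation.)

```python
import re

# Inside a character class, "|" is a literal, so the original
# pattern also accepts "|" as a log level by mistake.
buggy = re.compile(r"[I|V|D|M|W|E|A|F]")
fixed = re.compile(r"[IVDMWEAF]")

print(bool(buggy.fullmatch("|")))  # True -- the unintended match
print(bool(fixed.fullmatch("|")))  # False
print(bool(fixed.fullmatch("I")))  # True
```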
Thanks Eric, I made the correction.
Still no luck with the _grokparsefailure error, though.
So I limited my log file to a count of around 800 events and it worked.
I will be increasing the count and doing load testing.
I am using Filebeat as the log shipper.
Just in case that is the issue: does anyone know what changes need to be made anywhere in BELK (Beats + ELK) in order to handle a high frequency of logs?
I appreciate any help you could provide.