amruth
(Amruth)
September 4, 2017, 9:38pm
1
Hi,
I am using auto_flush_interval in the multiline codec plugin. It works fine for all events, but when it reaches the end of the file it throws a _grokparsefailure error. When I remove auto_flush_interval there are no issues (but then I lose the last event). Can someone please solve this issue?
Thank you
amruth
(Amruth)
September 6, 2017, 6:14pm
2
Can someone please help me with the issue?
It would probably help if you showed your configuration and provided examples of what the data looks like.
amruth
(Amruth)
September 7, 2017, 7:38pm
4
Here is my config,
input {
  file {
    path => "/opt/file.log"
    start_position => "beginning"
    sincedb_path => "/var/lib/sincedb.log"
    type => "success"
    codec => multiline {
      pattern => "^entry"
      negate => true
      what => previous
      auto_flush_interval => 10
      max_lines => 100000
      max_bytes => "1000 MiB"
    }
  }
  file {
    path => "/opt/file.log"
    start_position => "beginning"
    sincedb_path => "/var/lib/sincedb_fail.log"
    type => "failure"
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => previous
      auto_flush_interval => 10
      max_lines => 100000
      max_bytes => "1000 MiB"
    }
  }
}
filter {
  if [type] == "success" {
    grok {
      match => { "message" => "Query =>\n%{GREEDYDATA:query}\nTotal query time: %{NUMBER:query_time:float}%{DATA}\[Query Timestamp\: %{NUMBER:Query_Timestamp}%{GREEDYDATA}]" }
    }
    mutate {
      gsub => ["Query_Timestamp", "\d\d\d$", ""]
      remove_field => ["message"]
      add_field => { "component" => "Queries" }
      add_field => { "app_name" => "XYZ" }
    }
    date {
      match => ["Query_Timestamp", "UNIX_MS"]
      target => "timestamp"
    }
  }
  if [type] == "failure" {
    if "ERROR" in [message] and "failure" in [message] {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}%{GREEDYDATA} ERROR \[\] %{DATA:error} reason:%{GREEDYDATA:reason}" }
      }
      mutate {
        #gsub => ["timestamp", ".{5}$", ""]
        gsub => ["reason", "\n", ""]
        gsub => ["reason", "\[\]", ""]
        strip => ["reason"]
        remove_field => ["message"]
        add_field => { "component" => "Queries" }
        add_field => { "app_name" => "XYZ" }
      }
      date {
        match => ["timestamp", "yyyy-MM-dd HH:mm:ss:SSSSSSSSS"]
        target => "timestamp"
      }
    }
    else {
      drop {}
    }
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
And here is the data,
2017-04-19 14:56:01:57716 [139897046292224 rid: ] [] Server: started client connection thread
2017-04-19 14:56:12:68588 [139897046292224 rid: ] [] [Query Timestamp: 1492628172620975]
2017-04-19 14:56:12:68591 [139897046292224 rid:a7213aba8ef6] [] [Query Timestamp: 1492628172620975]
2017-04-19 14:56:12:68591 [139897046292224 rid:a7213aba8ef6] [] [Query Timestamp: 1492628172620975]
entry=>
show tables
Total query time: 0.014951
2017-04-19 14:56:12:68591 [139897046292224 rid:a7213aba8ef6] [] [Query Timestamp: 1492628172620975] Server: query completed
2017-04-19 15:55:08:3604570 [139897046292224 rid: ] [] Server:
entry=>
show tables
Total query time: 0.014951
2017-04-19 14:56:12:68591 [139897046292224 rid:a7213aba8ef6] [] [Query Timestamp: 1492628172620975] Server: query completed
2017-04-19 15:55:08:3604570 [139897046292224 rid: ] []
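One way to keep auto-flushed fragments out of the output is to drop any event that fails the grok filters above (a sketch, not a tested fix; it relies on the default `_grokparsefailure` tag that the grok filter adds on a non-match):

```
filter {
  # Sketch: an event flushed by auto_flush_interval may be an incomplete
  # buffer that fails the grok patterns above; grok then tags it with
  # _grokparsefailure, and this conditional drops it.
  if "_grokparsefailure" in [tags] {
    drop { }
  }
}
```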
amruth
(Amruth)
September 12, 2017, 11:11am
5
Hi, can someone please help on this issue?
Why do you have 2 file inputs for the same file? What is the desired output and what are you actually seeing?
amruth
(Amruth)
September 19, 2017, 10:58am
7
One input is to capture successful transactions and the other to capture failed transactions (they have different patterns, so the file is read twice).
The desired output is that it should print the last event after 10 seconds (auto_flush_interval => 10). Instead it throws _grokparsefailure after reaching the last event. For example, if there are 10 events in the file (it is a log file which grows continuously), it prints 9 events normally but then throws _grokparsefailure continuously (it never stops).
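Since grok accepts an array of patterns and uses the first one that matches, a looser fallback pattern could let the auto-flushed final event through instead of failing (a sketch; the fallback pattern here is hypothetical and would need to be adjusted to match your actual last event):

```
grok {
  match => { "message" => [
    # primary pattern for complete multiline events
    "Query =>\n%{GREEDYDATA:query}\nTotal query time: %{NUMBER:query_time:float}",
    # hypothetical fallback for the flushed partial event
    "^entry=>\n%{GREEDYDATA:query}"
  ] }
}
```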
amruth
(Amruth)
September 29, 2017, 8:23pm
8
Hi, this is crucial now. Any help is highly appreciated. Can someone please help me solve this issue?
system
(system)
Closed
October 27, 2017, 8:24pm
9
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.