Auto_flush_interval in multiline codec plugin errors out

Hi,

I am using auto_flush_interval in the multiline codec plugin. It works fine for all events, but when it reaches the end of the file it throws a _grokparsefailure error. When I remove auto_flush_interval there are no issues (but then I lose the last event). Can someone please solve this issue?

Thank you

Can someone please help me with the issue?

It probably helps if you show your configuration and provide examples of what the data looks like.

Here is my config,

input {

	file {
		path => "/opt/file.log"
		start_position => "beginning"
		sincedb_path => "/var/lib/sincedb.log"
		type => "success"

		codec => multiline {
			pattern => "^entry"
			negate => true
			what => "previous"
			auto_flush_interval => 10
			max_lines => 100000
			max_bytes => "1000 MiB"
		}
	}

	file {
		path => "/opt/file.log"
		start_position => "beginning"
		sincedb_path => "/var/lib/sincedb_fail.log"
		type => "failure"

		codec => multiline {
			pattern => "^%{TIMESTAMP_ISO8601}"
			negate => true
			what => "previous"
			auto_flush_interval => 10
			max_lines => 100000
			max_bytes => "1000 MiB"
		}
	}
}

filter {

	if [type] == "success" {
		grok {
			match => {"message" => "Query =>\n%{GREEDYDATA:query}\nTotal query time: %{NUMBER:query_time:float}%{DATA}\[Query Timestamp\: %{NUMBER:Query_Timestamp}%{GREEDYDATA}]"}
		}
	
		mutate {
			gsub => ["Query_Timestamp","\d\d\d$",""]
			remove_field => ["message"]
			add_field =>{"component" => "Queries"}
			add_field =>{"app_name" => "XYZ"}
		}
	
		date {
			match => ["Query_Timestamp", "UNIX_MS"]
			target => "timestamp"
		}
	}
	
	if [type] == "failure" {
		if "ERROR" in [message] and "failure" in [message] {
			grok {
				match => {"message" => "%{TIMESTAMP_ISO8601:timestamp}%{GREEDYDATA} ERROR \[\] %{DATA:error} reason:%{GREEDYDATA:reason}"}
			}
		
			mutate {
				#gsub => ["timestamp",".{5}$",""]
				gsub => ["reason","\n", ""]
				gsub => ["reason", "\[\]", ""]
				strip => ["reason"]
				remove_field => ["message"]
				add_field =>{"component" => "Queries"}
				add_field =>{"app_name" => "XYZ"}
			}
			
			date {
				match => ["timestamp", "yyyy-MM-dd HH:mm:ss:SSSSSSSSS"]
				target => "timestamp"
			}
		}
		else {
			drop {}
		}
	}
}

output {

	stdout {
		codec => rubydebug
	}
}

And here is the data,

2017-04-19 14:56:01:57716 [139897046292224 rid:            ]  []  Server: started client connection thread

2017-04-19 14:56:12:68588 [139897046292224 rid:            ]  [] [Query Timestamp: 1492628172620975]  
2017-04-19 14:56:12:68591 [139897046292224 rid:a7213aba8ef6]  [] [Query Timestamp: 1492628172620975]  
2017-04-19 14:56:12:68591 [139897046292224 rid:a7213aba8ef6]  [] [Query Timestamp: 1492628172620975]  
entry=>
show tables
Total query time: 0.014951

2017-04-19 14:56:12:68591 [139897046292224 rid:a7213aba8ef6]  [] [Query Timestamp: 1492628172620975]  Server: query completed
2017-04-19 15:55:08:3604570 [139897046292224 rid:            ]  []  Server: 
entry=>
show tables
Total query time: 0.014951

2017-04-19 14:56:12:68591 [139897046292224 rid:a7213aba8ef6]  [] [Query Timestamp: 1492628172620975]  Server: query completed
2017-04-19 15:55:08:3604570 [139897046292224 rid:            ]  []

Hi, can someone please help with this issue?

Why do you have 2 file inputs for the same file? What is the desired output and what are you actually seeing?

One is to capture successful transactions and the other is to capture failed transactions (they have different patterns, so I read the file twice).

The desired output is that the last event should be printed after 10 seconds (auto_flush_interval => 10). Instead, it throws _grokparsefailure after reaching the last event. For example, if there are 10 events in the file (this is a log file which grows continuously), it prints 9 events normally but then throws _grokparsefailure continuously (it never stops).
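
If I read the behaviour correctly, when auto_flush_interval fires the codec flushes whatever is still sitting in its buffer, so the flushed event can be a partial one (for example the trailing timestamp line that has no "Total query time" in it), and the success grok then fails on it and tags it with _grokparsefailure. I have not tested this against your data, but a minimal sketch of a guard you could try is to only run the grok when the event actually looks like a completed query, and drop everything else:

filter {
	if [type] == "success" {
		# Only run the existing grok when the event contains a completed
		# query; otherwise drop the partial event emitted by the auto-flush.
		if "Total query time" in [message] {
			grok {
				match => {"message" => "Query =>\n%{GREEDYDATA:query}\nTotal query time: %{NUMBER:query_time:float}%{DATA}\[Query Timestamp\: %{NUMBER:Query_Timestamp}%{GREEDYDATA}]"}
			}
			# ... your existing mutate and date filters ...
		} else {
			drop {}
		}
	}
}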

Hi, this is crucial now. Any help is highly appreciated. Can someone please help me solve this issue?
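
Another option, if you would rather leave the grok as it is, is to check for the _grokparsefailure tag that grok adds when a match fails and drop those events before they reach the output. A rough, untested sketch, placed after your existing grok/mutate/date filters:

filter {
	# Drop anything grok could not parse (such as the partial event
	# produced by the auto-flush) instead of letting it reach the output.
	if "_grokparsefailure" in [tags] {
		drop {}
	}
}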

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.