How to merge two lines having different patterns in a log file?

SAMPLE LOG:

2017-07-14 17:47:11 +0000 [INFO] from application in application-akka.actor.default-dispatcher-9 - requestId: 8e5ccc9dbb4611cf9f363efbbe8e1c59,user: gaurav@sigmoid.com, organization: Vdopia, view: Auction, queryTz: UTC, queryStartTime: 1498262400000, queryEndTime: 1500051600000, dimensions: ["D001"], measures: ["M008","M002","M024"], filters: [], requestType: chartType

2017-07-14 17:47:11 +0000 [INFO] from application in application-akka.actor.default-dispatcher-15 - requestId: ad3265d3e99aa541e045ea726a669d65,user: gaurav@sigmoid.com, organization: Vdopia, view: Auction, queryTz: UTC, queryStartTime: 1498262400000, queryEndTime: 1500051600000, dimensions: ["D018"], measures: ["M002","M014","M019","M006","M012","M008","M024"], filters: [], requestType: chartType

2017-07-14 17:47:16 +0000 [INFO] from application in application-akka.actor.default-dispatcher-9 - requestId: 8e5ccc9dbb4611cf9f363efbbe8e1c59,elapsedTime: 5702 ms,status: Success,code: 200,message: Success

2017-07-14 17:47:17 +0000 [INFO] from application in application-akka.actor.default-dispatcher-15 - requestId: ad3265d3e99aa541e045ea726a669d65,elapsedTime: 5968 ms,status: Success,code: 200,message: Success

One line is for the query request and the other is for the query result. How can I merge the two lines by requestId?
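For example, after merging, the first requestId would ideally become a single event something like this (field names are just for illustration):

{
  "requestId": "8e5ccc9dbb4611cf9f363efbbe8e1c59",
  "user": "gaurav@sigmoid.com",
  "organization": "Vdopia",
  "view": "Auction",
  "elapsedTime": "5702 ms",
  "status": "Success",
  "code": 200
}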

My log format is log4j and I am facing the same problem. Please help us out.

Thanks

Have you looked at the aggregate filter?
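As a rough sketch of how it is usually wired up (field names such as id, view, email, and elapsed_time are taken from your grok patterns and are assumptions): the request line creates the map for its requestId, and the result line reads the stashed fields back and ends the task.

filter {
  # grok filters that extract [id], [view], [email], [elapsed_time] go here

  # Request line: it has a [view] field, so create the map and stash fields.
  if [view] {
    aggregate {
      task_id => "%{id}"
      code => "map['view'] = event.get('view'); map['email'] = event.get('email')"
      map_action => "create"
    }
  }

  # Result line: it has [elapsed_time], so copy the stashed fields onto it.
  if [elapsed_time] {
    aggregate {
      task_id => "%{id}"
      code => "event.set('view', map['view']); event.set('email', map['email'])"
      map_action => "update"
      end_of_task => true
      timeout => 120
    }
  }
}

Note that with map_action => "update" alone, nothing ever creates the map, so the code block never runs; one side of the pair has to use map_action => "create" (or "create_or_update").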

I am using the conf below to process my logs, but I am not getting the expected result. Can you please point out where I am going wrong?

input {
  file {
    path => "/home/amit/Desktop/application-log-2017-07-14.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  grok {
    match => {
      "message" => '(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND} %{ISO8601_TIMEZONE}) \[INFO\] from application in application-akka.actor.default-dispatcher-%{WORD:phaltu} - requestId: %{WORD:id},user: %{NOTSPACE:email}, organization: %{WORD:organization}, view: %{WORD:view}, queryTz: %{WORD:query_time_zone}, queryStartTime: %{NUMBER:start_time}, queryEndTime: %{NUMBER:end_time}'
    }
  }

  grok {
    match => {
      "message" => '(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND} %{ISO8601_TIMEZONE}) \[INFO\] from application in application-akka.actor.default-dispatcher-%{WORD:phaltu} - requestId: %{WORD:id},elapsedTime: %{INT:elapsed_time:int} ms,status: %{WORD:status},code: %{NUMBER:code},message: %{WORD:message_status}'
    }
  }

  if [view] == "Auction" {
    aggregate {
      task_id => "%{id}"
      code => "event.set('elapsed_time', map['elapsed_time'] = event.get('elapsed_time'))"
      map_action => "update"
    }
  }

  if [view] == "Bid" {
    aggregate {
      task_id => "%{id}"
      code => "event.set('status', map['status'] = event.get('status'))"
      map_action => "update"
    }
  }

  if [view] == "Default" {
    aggregate {
      task_id => "%{id}"
      code => "event.set('message_status', map['message_status'] = event.get('message_status'))"
      map_action => "update"
      end_of_task => true
      timeout => 120
    }
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "vdopia"
  }
  stdout { codec => rubydebug }
}

I have used the aggregation approach above, but it is not working at all; it is not even entering the if condition.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.