Not able to send multiple date fields in logstash


#1

My Log file-
MachineName|FeedName|Version|InputFileName|LoadType|StartTime|NumOfRecordsinCEMP|NumOfRecordsinStage|Status|Message|EndTime|EventSeverity|Action|time_taken_ms
EDTD-LYNXSQL01|Falcon|2017-04-13_180858_7480000|2017-04-13_180858_7482003_urn_uuid_67d977da-25c4-4e31-9f84-8a37562662e5.xml|Delta|2017-04-21 10:02:01.827|0|4|Success|File Loaded into DB|2017-04-21 10:37:49.000|Info|I|2147173
EDTD-LYNXSQL01|Falcon|2017-04-13_180916_0290000|2017-04-13_180916_0299310_urn_uuid_1d9a9239-fba8-497c-8023-6fe1eec70aaa.xml|Delta|2017-04-21 10:02:03.470|0|1|Success|File Loaded into DB|2017-04-21 10:37:49.000|Info|I|2145530

My logstash config file is-
filter {
  if [type] == "RDS_Logs" {

    if [message] =~ /^MachineName\|FeedName\|Version/ {
      drop { }
    }

    csv {
      separator => "|"
      # MachineName|FeedName|Version|InputFileName|LoadType|StartTime|NumOfRecordsinCEMP|NumOfRecordsinStage|Status|Message|EndTime|EventSeverity|Action|time_taken_ms
      columns => ["MachineName","FeedName","Version","InputFileName","LoadType","StartTime","NumOfRecordsinCEMP","NumOfRecordsinStage","Status","sp-message","EndTime","EventSeverity","Action","time_taken_ms"]
    }

    date {
      match => ["StartTime", "YYYY-MM-dd;HH:mm:ss.SSS", "ISO8601"]
      timezone => "Europe/London"
      target => "StartTime"
      locale => "en"
    }

    date {
      match => ["EndTime", "YYYY-MM-dd;HH:mm:ss.SSS", "ISO8601"]
      timezone => "Europe/London"
      target => "EndTime"
      locale => "en"
    }
  }

  mutate {
    gsub => [
      "StartTime", " ", ";",
      "EndTime", " ", ";"
    ]
  }
}

I am not getting any errors, and Logstash is able to parse the StartTime and EndTime fields, but both fields are mapped as "string" in Kibana. I need them to be mapped as "date".
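For reference, the raw timestamps can be checked outside Logstash. This small Python sketch (illustrative only, not part of the pipeline) parses the StartTime/EndTime values from the first log row with the strptime equivalent of the Joda pattern "YYYY-MM-dd HH:mm:ss.SSS", and confirms the elapsed time matches the time_taken_ms column:

```python
from datetime import datetime

# Raw StartTime/EndTime values from the first log row above.
raw_start = "2017-04-21 10:02:01.827"
raw_end = "2017-04-21 10:37:49.000"

# strptime equivalent of the Joda pattern "YYYY-MM-dd HH:mm:ss.SSS"
# (note the space between date and time in the raw data).
fmt = "%Y-%m-%d %H:%M:%S.%f"

start = datetime.strptime(raw_start, fmt)
end = datetime.strptime(raw_end, fmt)

# Elapsed time in milliseconds.
delta = end - start
elapsed_ms = delta.days * 86_400_000 + delta.seconds * 1000 + delta.microseconds // 1000
print(elapsed_ms)  # 2147173, matching the row's time_taken_ms column
```

Since the raw values use a space between date and time, a pattern with a space would match them directly; the semicolon pattern only matches after the gsub has rewritten the field.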


(Mark Walkom) #2

Can you please reformat your post and use code tags? It's a bit hard to read as is :slight_smile:


#3

My Log file-
MachineName|FeedName|Version|InputFileName|LoadType|StartTime|NumOfRecordsinCEMP|NumOfRecordsinStage|Status|Message|EndTime|EventSeverity|Action|time_taken_ms
EDTD-LYNXSQL01|Falcon|2017-04-13_180858_7480000|2017-04-13_180858_7482003_urn_uuid_67d977da-25c4-4e31-9f84-8a37562662e5.xml|Delta|2017-04-21 10:02:01.827|0|4|Success|File Loaded into DB|2017-04-21 10:37:49.000|Info|I|2147173
EDTD-LYNXSQL01|Falcon|2017-04-13_180916_0290000|2017-04-13_180916_0299310_urn_uuid_1d9a9239-fba8-497c-8023-6fe1eec70aaa.xml|Delta|2017-04-21 10:02:03.470|0|1|Success|File Loaded into DB|2017-04-21 10:37:49.000|Info|I|2145530

My Logstash config is-

filter {
  if [type] == "RDS_Logs" {

    if [message] =~ /^MachineName\|FeedName\|Version/ {
      drop { }
    }

    csv {
      separator => "|"
      columns => ["MachineName","FeedName","Version","InputFileName","LoadType","StartTime","NumOfRecordsinCEMP","NumOfRecordsinStage","Status","sp-message","EndTime","EventSeverity","Action","time_taken_ms"]
    }

    date {
      match => ["StartTime", "YYYY-MM-dd;HH:mm:ss.SSS", "ISO8601"]
      timezone => "Europe/London"
      target => "StartTime"
      locale => "en"
    }

    date {
      match => ["EndTime", "YYYY-MM-dd;HH:mm:ss.SSS", "ISO8601"]
      timezone => "Europe/London"
      target => "EndTime"
      locale => "en"
    }
  }

  mutate {
    gsub => [
      "StartTime", " ", ";",
      "EndTime", " ", ";"
    ]
  }
}

I am not getting any errors, and Logstash is able to parse the StartTime and EndTime fields, but both fields are mapped as "string" in Kibana. I need them to be mapped as "date".


(Magnus Bäck) #4

You can't change a field's mapping once it has been set (unless you reindex or otherwise recreate your index). Can you try reindexing, or creating a new index, and see if that corrects it?
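For example, an index template can force the two fields to be mapped as date when the new index is created. This is only a sketch: the template name, index pattern, and mapping type below are assumptions — adjust "rds_logs*" and "RDS_Logs" to match your setup, and note that the exact template syntax varies by Elasticsearch version:

```
PUT _template/rds_logs
{
  "template": "rds_logs*",
  "mappings": {
    "RDS_Logs": {
      "properties": {
        "StartTime": { "type": "date" },
        "EndTime":   { "type": "date" }
      }
    }
  }
}
```

With the template in place, deleting the old index (or writing to a fresh one) and re-running Logstash should give the new index date mappings, provided the date filters actually emit date values rather than the original strings.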


#5

Thanks Magnus. I will recreate the index and check.


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.