Importing Palo Alto logs, "failed to parse" error on time

I'm trying to import Palo Alto logs. There are two fields, ReceiveTime and GenerateTime, with timestamps as below:

       "ReceiveTime" => "2016/02/12 19:03:09",
      "GenerateTime" => "2016/02/12 19:03:09",

In logstash.conf I have a date plugin as follows, which appears to match fine:
date {
  #timezone => "America/New_York"
  timezone => "America/Chicago"
  #match => [ "GenerateTime", "YYYY/MM/dd HH:mm" ]
}

The problem is that ReceiveTime should only be HH:mm, but the date gets appended as well:

       "ReceiveTime" => "2016/02/12 19:03:09",

The error seems to occur when Logstash attempts to push the event to ES:

status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [ReceiveTime]", "caused_by"=>{"type"=>"illegal_argument_exception",
"reason"=>"Invalid format: "2016/02/12 19:03:03" is malformed at "/02/12 19:03:03""}}}}, :level=>:warn}

How do I resolve this? I am new to ES/LS.

Thank you.

Correction: the GenerateTime match is not commented out; it reads:

match => [ "GenerateTime", "YYYY/MM/dd HH:mm" ]

Hi,

I am facing the same issue.
The grok filter is parsing the logs properly, but Elasticsearch is throwing an exception.

Elasticsearch logs:

MapperParsingException[failed to parse [GenerateTime]]; nested: IllegalArgumentException[Invalid format: "2016/08/02 00:35:44" is malformed at "/08/02 00:35:44"];

Please help :slight_smile:

Your mapping in Elasticsearch for this field is incorrect.

So every time a new field occurs, we have to update the Elasticsearch mapping?

Yes. Unless you want this field to be a String.

If you want it to be a Date (so you are able to run date histograms on it, for example), then you need to provide the right mapping.

You did not tell us what your mapping/config/... is here, so it's hard to say more.
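
For a field like GenerateTime, "the right mapping" would be an explicit date type whose format matches the string Logstash sends. A minimal sketch (field name and format taken from the posts above, the rest assumed):

"GenerateTime" : {
  "type"   : "date",
  "format" : "yyyy/MM/dd HH:mm:ss"
}

With a format like that in place, Elasticsearch can parse the slash-separated timestamp instead of expecting its default ISO 8601-style form.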

Thanks David, I got it.
I updated the Logstash output configuration to point at an Elasticsearch mapping template:

30-elasticsearch-output.conf

output {
  if [type] == "paloalto_firewall" {
    elasticsearch {
      hosts              => ["localhost:9200"]
      template           => "/etc/logstash/elasticsearch-template.json"
      template_overwrite => true
    }
  }
}
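
One assumption worth calling out, since the post above relies on the plugin defaults: with no template_name set, the elasticsearch output registers the file under its default template name ("logstash"), and a template only applies to indices created after it is installed, so existing indices keep their old mappings. Making the name explicit is optional but documents the intent:

elasticsearch {
  hosts              => ["localhost:9200"]
  template           => "/etc/logstash/elasticsearch-template.json"
  template_name      => "logstash"   # name the template is registered under (plugin default)
  template_overwrite => true         # replace an already-registered template of the same name
}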

elasticsearch-template.json

{
  "template" : "logstash-*",
  "settings" : {
    "index.refresh_interval" : "5s"
  },
  "mappings" : {
    "_default_" : {
      "_all" : { "enabled" : true },
      "dynamic_templates" : [ {
        "message_field" : {
          "match" : "message",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true
          }
        }
      }, {
        "string_fields" : {
          "match" : "*",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true,
            "fields" : {
              "raw" : { "type" : "string", "index" : "not_analyzed", "ignore_above" : 256 }
            }
          }
        }
      } ],
      "properties" : {
        "@version" : { "type" : "string", "index" : "not_analyzed" },
        "geoip" : {
          "type" : "object",
          "dynamic" : true,
          "path" : "full",
          "properties" : {
            "location" : { "type" : "geo_point", "lat_lon" : true, "geohash" : true }
          }
        },
        "SourceGeo" : {
          "type" : "object",
          "dynamic" : true,
          "path" : "full",
          "properties" : {
            "location" : { "type" : "geo_point", "lat_lon" : true, "geohash" : true }
          }
        },
        "DestinationGeo" : {
          "type" : "object",
          "dynamic" : true,
          "path" : "full",
          "properties" : {
            "location" : { "type" : "geo_point", "lat_lon" : true, "geohash" : true }
          }
        }
      }
    }
  }
}
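
Note that this template still leaves GenerateTime and ReceiveTime to dynamic mapping. If they should be stored as dates in their original slash-separated form, entries along these lines could be added under "properties" (field names and format come from the posts above; treat this as a sketch rather than the poster's final template):

"GenerateTime" : { "type" : "date", "format" : "yyyy/MM/dd HH:mm:ss" },
"ReceiveTime"  : { "type" : "date", "format" : "yyyy/MM/dd HH:mm:ss" }

As with the template itself, this only takes effect for indices created after the change, so reindexing or waiting for the next daily index may be needed.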