@timestamp not updating with my log file's timestamp

Hi,

My log messages are JSON objects:
{"clientToken":"GMTALEOAPPLY","event":"uid_needs_to_map","initialReferral":"https://www.oscarplusmcmaster.ca/myAccount/internships-coop/engineering/postings.htm","ipAddress":"99.228.86.28","isp":{"autonomousSystemOrganization":"Rogers Cable Communications Inc.","autonomousSystemNumber":"812","isp":"Rogers Cable","organization":"Rogers Cable"},"location":{"continentCode":"NA","continentName":"North America","country":"Canada","countryIsocode":"CA","city":"Brampton","time_zone":"America/Toronto","latitude":"43.7328","longitude":"-79.7953","postalCode":"L6Z","registeredCountry":"Canada","registeredCountryIsocode":"CA","subdivisions":"Ontario","subdivisionsIsocode":"ON"},"phenomRefnum":"GENEA0085","timestamp":"1/14/17 9:49:14 AM"}

Each event includes a timestamp field: "timestamp":"1/14/17 9:49:14 AM"

I have configured this Logstash pipeline:

# The # character at the beginning of a line indicates a comment. Use
# comments to describe your configuration.
input {
  beats {
    port => "5043"
  }
}
# The filter part of this file is commented out to indicate that it is
# optional.
filter {
  kv {
    field_split => ","
    value_split => ":"
    include_brackets => false
    remove_field => [ "id%", "[%]", "%." ]
    remove_field => [ "*[{}]*" ]
  }
  date {
    match => [ "timestamp", "MM/dd/yy hh:mm:ss a", "ISO8601" ]
    target => "timestamp"
    add_field => { "debug" => "timestamp" }
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "pson"
  }
  stdout {}
}

But my timestamp is not being mapped to @timestamp.
Please help, it's a major blocker for me.

Did you try adding M/dd/yy hh:mm:ss a as a date pattern? If that doesn't help, please show an example event. Copy/paste from Kibana's JSON tab.
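
Something along these lines (a rough, untested sketch against your example event; the first pattern is the one suggested above, and you may also need a single-digit hour token such as h for values like 9:49:14):

filter {
  date {
    # Additional pattern with a single-digit month token, keeping the originals as fallbacks.
    match => [ "timestamp", "M/dd/yy hh:mm:ss a", "MM/dd/yy hh:mm:ss a", "ISO8601" ]
    # With no target set, the date filter writes the parsed value to @timestamp.
  }
}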

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.