Hi Folks,
Logstash noob here. I am trying to ingest some JSON data, but Logstash is not setting the @timestamp field from the timestamp in my log data. It keeps using the time at which the data was read into Logstash instead.
Here's the log data:
{ "event_ts": "2016-Jun-07 20:13:51", "property": "propname", "aftype": "unicast" }
Below is my conf file:
input {
  stdin {}
}
filter {
  date {
    match => [ "event_ts", "YYYY-MM-dd HH:mm:ss,SSS Z" ]
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
Output:
{"event_ts": "2016-Jun-07 20:13:51.987", "property": "numRoutes", "aftype": "ipv4-ucast"}
{
       "message" => "{\"event_ts\": \"2016-Jun-07 20:13:51.987\", \"property\": \"numRoutes\", \"aftype\": \"ipv4-ucast\"}",
      "@version" => "1",
    "@timestamp" => "2016-06-08T04:06:45.565Z",
          "host" => "skaliann-ucs-e1"
}
The @timestamp field is not the UTC equivalent of event_ts.
What am I doing wrong here? Please help.
As far as I can see (without having had time to test anything) there are a couple of issues. Firstly, you have not used a json codec or filter to parse the incoming JSON, which means the event_ts field has never been extracted and does not exist on the event. Secondly, I don't think the date pattern in your date filter matches what you are actually passing in: 1) your data has a three-letter month abbreviation, not a two-digit month, 2) there is a period instead of a comma before the milliseconds, and 3) you are not supplying a time zone.
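Something along these lines (untested) might get you further. Note that the timezone value below is just a guess on my part, so replace it with whatever zone your timestamps are actually written in:
input {
  stdin {
    codec => json
  }
}
filter {
  date {
    # Joda-style patterns matching e.g. "2016-Jun-07 20:13:51.987";
    # the second pattern covers timestamps without milliseconds.
    match    => [ "event_ts", "yyyy-MMM-dd HH:mm:ss.SSS", "yyyy-MMM-dd HH:mm:ss" ]
    timezone => "America/Los_Angeles"   # assumption: replace with your source's actual time zone
  }
}
output {
  stdout {
    codec => rubydebug
  }
}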
In my filter I use this:
date {
  match  => [ "myTimestamp", "yyyy-MM-dd-HH.mm.ss.SSSSSS" ]
  target => "@timestamp"
}
I think you need to set the target attribute.
Only if the target field is something other than @timestamp.
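For example (untested, and the field name here is made up), you would only need something like this if you wanted the parsed date written somewhere other than @timestamp:
date {
  match  => [ "event_ts", "yyyy-MMM-dd HH:mm:ss.SSS" ]
  target => "event_ts_parsed"   # hypothetical field; leaving target out writes to @timestamp, which is the default
}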
Hi,
I am also a newbie. I am trying to set up a filebeat -> logstash -> elasticsearch chain and I am having problems with @timestamp: it is not taken from the logfile, and the timestamp at which the message arrives in Logstash is used instead. I will describe it with the example below. I hope somebody can help me understand and correct this problem.
Logfile has this format:
{"application":"MyTestApp","source_host":"apphost01","message":"Hello_World","@timestamp":"2017-02-14T11:38:32.257Z"}
filebeat.yml:
filebeat.prospectors:
- input_type: log
  paths:
    - /tmp/json.log
output.logstash:
  hosts: ["localhost:5043"]
logstash pipeline conf:
input {
  beats {
    port => "5043"
    codec => json
    type => "log4j-json"
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
This is Logstash's output (notice the @timestamp is different from the one in the log entry):
{
    "source_host" => "apphost01",
     "@timestamp" => 2017-02-15T14:47:57.593Z,
    "application" => "MyTestApp",
         "offset" => 118,
       "@version" => "1",
     "input_type" => "log",
           "beat" => {
        "hostname" => "tpl450",
            "name" => "tpl450",
         "version" => "5.2.1"
    },
           "host" => "tpl450",
         "source" => "/tmp/json.log",
        "message" => "Hello_World",
           "type" => "log",
           "tags" => [
        [0] "beats_input_codec_json_applied"
    ]
}
@sagittarius, please start your own topic for your problem.