How to avoid the @timestamp field appearing in the output

Here is my configuration file:
input {
  tcp {
    port => 5000
    #type => syslog
  }
  udp {
    port => 5000
    #type => syslog
  }
}

filter {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp}\s+%{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    remove_field => ["@version", "host", "message", "@timestamp"]
    add_field => [ "received_at", "%{@timestamp}" ]
    #add_field => [ "received_from", "%{host}" ]
  }
  syslog_pri { }
  date {
    match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
  }
}

output {
  stdout { codec => rubydebug }
}

Here is my output:
{
    "syslog_timestamp" => "Dec 8 23:11:57",
    "syslog_hostname" => "reg-mx480-1.tlab.com",
    "syslog_program" => "dcd",
    "syslog_pid" => "79823",
    "syslog_message" => "unknown encaps_ohead; dev ams0, encaps 0, flags 0x1, addr-fam 2, ifdp_type=104, overhead=-1\r",
    "received_at" => "2015-12-15T22:57:56.856Z",
    "syslog_severity_code" => 5,
    "syslog_facility_code" => 1,
    "syslog_facility" => "user-level",
    "syslog_severity" => "notice",
    "@timestamp" => "2015-12-09T07:11:57.000Z"   <<< I need to remove this
}

Though I am explicitly removing the @timestamp field, it still appears in the output.

You're indeed removing @timestamp in your grok filter, but you're adding it right back with your date filter. I'd expect you to be more interested in deleting the syslog_timestamp field.
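For example, a minimal sketch of that approach, keeping the rest of your config as-is and letting the date filter own @timestamp:

  date {
    match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    # remove_field on a filter is only applied when the filter succeeds, so the
    # raw string is dropped once its value has been parsed into @timestamp.
    remove_field => [ "syslog_timestamp" ]
  }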

Nope. I need two timestamps:
1: syslog_timestamp => the timestamp from the syslog message itself
2: the current time at which the Logstash instance receives this message.

So in the above example:
"syslog_timestamp" => "Dec 8 23:11:57",
and
"received_at" => "2015-12-15T22:57:56.856Z",

I don't need this: "@timestamp" => "2015-12-09T07:11:57.000Z"

Then set the target option of the date filter to store the parsed date in the syslog_timestamp field instead of in @timestamp, which is the default.
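For example (a sketch; only the date block of your config changes):

  date {
    match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    # Store the parsed date in syslog_timestamp instead of the default @timestamp.
    target => "syslog_timestamp"
  }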

Thanks!

My log line is:
Dec 8 23:12:20 HOSTNAME mgd[41101]: UI_DBASE_LOGIN_EVENT: User 'SDNSS' entering configuration mode

And what I get after parsing is:
{
    "syslog_timestamp" => "Dec 8 23:12:20",
    "syslog_hostname" => "HOSTNAME",
    "syslog_program" => "mgd",
    "syslog_pid" => "41101",
    "syslog_message" => "UI_DBASE_LOGIN_EVENT: User 'SDNSS' entering configuration mode\r",
    "syslog_severity_code" => 5,
    "syslog_facility_code" => 1,
    "syslog_facility" => "user-level",
    "syslog_severity" => "notice",
    "@timestamp" => "2016-12-09T07:12:20.000Z"
}

How is the value of the @timestamp field getting populated here?
Clearly, the syslog message is dated 8th Dec at around 11 PM, yet I sent this message to Logstash on 21st Jan 2016 at around 5:30 PM.

So how is that @timestamp field getting populated and displayed?

Use the date filter to populate the @timestamp field with the actual timestamp of each event. If you don't, the field will contain the current time at which Logstash received the message.
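For example (a sketch of the default behavior; with no target option the date filter writes to @timestamp):

  date {
    # No target set, so the parsed value of syslog_timestamp replaces the
    # default @timestamp (which otherwise holds the time of receipt).
    match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
  }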

Thanks, mate. I think you didn't understand the problem here.

Here is my filter section:

filter {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp}\s+%{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    remove_field => ["@version", "host", "message", "@timestamp"]
  }
  date {
    target => "syslog_timestamp"
    match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
  }
}

As you suggested earlier, I have set the target of the date filter to syslog_timestamp.
So here are two problems:
1: With this as the input line for the above filter section:
Dec 8 23:12:36 HOSTNAME dcd[46711]: unknown encaps_ohead; dev ams0, encaps 0, flags 0x1, addr-fam 2, ifdp_type=104, overhead=-1

The output I get from Logstash is:
{
    "syslog_timestamp" => "2016-12-09T07:12:36.000Z",
    "syslog_hostname" => "HOSTNAME",
    "syslog_program" => "dcd",
    "syslog_pid" => "46711",
    "syslog_message" => "unknown encaps_ohead; dev ams0, encaps 0, flags 0x1, addr-fam 2, ifdp_type=104, overhead=-1\r"
}

Now the problem is: where is Logstash getting this date, 2016-12-09T07:12:36.000Z?
I have told the date filter to take the date from syslog_timestamp, which is roughly Dec 8, 2015 at 11 PM.

I don't understand why this is, or where it is getting this time from.
The current time on the machine where Logstash is running is: Thu Jan 21 19:44:24 PST 2016

2: I have a simple requirement here.
I will push syslog events from multiple network providers to my Logstash instance at random times, and I want to get ONLY two timestamps:

  • the one I see in the syslog event
  • the one at which the Logstash instance receives the message

I am not able to figure out how to get this simple requirement done. I am sorry, but the date filter and grok filter documentation does not say anything about timestamps and how to manipulate them.

I hope you understand what I am trying to say here.

Thanks,
Gaurav

Now the problem is: where is Logstash getting this date, 2016-12-09T07:12:36.000Z?
I have told the date filter to take the date from syslog_timestamp, which is roughly Dec 8, 2015 at 11 PM.

As explained in another thread, @timestamp is expected to be UTC. Same goes for all fields populated by the date filter.
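If it helps, here's a rough end-to-end sketch combining your two configs; the timezone value is an assumption, so substitute whatever zone your devices actually log in:

filter {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp}\s+%{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    # Copy the receive time into received_at (as in your first config),
    # then drop the fields you don't want, including @timestamp.
    add_field => [ "received_at", "%{@timestamp}" ]
    remove_field => ["@version", "host", "message", "@timestamp"]
  }
  date {
    match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    # Parse the device's timestamp into its own field rather than @timestamp.
    target => "syslog_timestamp"
    # The stored value is always UTC; timezone only tells the filter how to
    # interpret the source string. "America/Los_Angeles" is an assumption,
    # not something taken from your setup.
    timezone => "America/Los_Angeles"
  }
}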
