Parse logfile date into Kibana's timestamp

Hello, I have dates in my log file like this one: 2017-01-01 07:57:22. I want to extract only the month and the day (01-01) using a Logstash filter, then use it as Kibana's timestamp.

Here's what my Logstash configuration looks like:

input {
  tcp {
    port => 5000
    codec => multiline {
      pattern => "^(\s|{')"
      what => "previous"
    }
  }
}

filter {
  grok {
    match => [ "message", "'item_scraped_count': %{NUMBER:scraped:int}" ]
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp}" ]
  }

  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
    target => "timestamp"
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
  }
}

You haven't really described what the problem is, so it's hard to help. Have you configured the index pattern in Kibana to use timestamp as the timestamp field?

I edited the post, added the logstash config file.

Have you configured the index pattern in Kibana to use timestamp as the timestamp field?

How can I achieve this?

I edited the post, added the logstash config file.

Yes, but that's not what I asked for. I understand what you want, but you're not telling us what you currently get. Is the timestamp field not populated with the parsed timestamp? Does it have the wrong timezone? Is it correctly populated but Kibana ignores it?

How can I achieve this?

In the Kibana settings there's a dropdown for choosing the timestamp field for a particular index pattern. If you don't want to use the default @timestamp field (which Logstash is going to send anyway) you have to tell Kibana which field to use.

Why not stick with @timestamp until you're more comfortable with the stack?

Is the timestamp field not populated with the parsed timestamp? Does it have the wrong timezone? Is it correctly populated but Kibana ignores it?

Yes, precisely.
My extracted field exists with the correct values, but Kibana doesn't read it into its @timestamp field.

In the Kibana settings there's a dropdown for choosing the timestamp field for a particular index pattern. If you don't want to use the default @timestamp field (which Logstash is going to send anyway) you have to tell Kibana which field to use.

Ah, I got it, but it only shows the default @timestamp field, maybe because my extracted field is parsed as a string, not a date?

Why not stick with @timestamp until you're more comfortable with the stack?

The problem is that my extracted timestamp is different from the default one; the default one is the time when I indexed the log file, and I don't want that.

My extracted field exists with the correct values, but Kibana doesn't read it into its @timestamp field.

No, because you're storing the parsed timestamp in the timestamp field instead of in @timestamp. Remove the target option for your date filter so that you store it in @timestamp.
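For reference, a minimal sketch of what that looks like (keeping your existing match pattern):

date {
  # Without a "target" option the parsed date goes into @timestamp,
  # which is the field Kibana uses by default.
  match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
}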

Ah, I got it, but it only shows the default @timestamp field, maybe because my extracted field is parsed as a string, not a date?

Yes, that's probably it.

I removed the target option line, but Kibana still parses them as separate fields.

How is my extracted field parsed as a string? The date filter should return a date field, shouldn't it?

I removed the target option line, but Kibana still parses them as separate fields.

Please show your configuration and an example event from Kibana (produced by that configuration, of course).

My Logstash configuration:

filter {
  grok {
    match => [ "message", "'item_scraped_count': %{NUMBER:scraped:int}" ]
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp}" ]
  }

  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
  }
}

Kibana fields: [screenshot]

An example event: [screenshot; the event is tagged _dateparsefailure]

The _dateparsefailure tag indicates that the date filter failed. Your Logstash logs will tell you why, but even without looking I can see that your date pattern doesn't match the input: there are no milliseconds in your timestamp field, so you need to delete ",SSS" from your date pattern.
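In other words, the date filter would become something like this (a sketch assuming your timestamps always look like 2017-01-01 07:57:22):

date {
  # The input has no milliseconds, so the ",SSS" suffix must go.
  match => [ "timestamp", "yyyy-MM-dd HH:mm:ss" ]
}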

Nice catch! How can I tweak my filters to extract only the month and day, like 2017-01-01 07:57:22 >> 01-01?

I don't think timestamps without a year are supported. What would it even mean? How do you want to use such values?

Maybe if I am visualizing data within the same year.

That doesn't explain why you explicitly want to remove the year from the timestamp.

If all my events are within 2017, for example, I think it would be redundant and somewhat boring to keep mentioning the year in the visualizations, don't you agree?

That's a visualization problem. You should still store the full date.
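If you just want a month-day label for display, one option is to keep the full date in @timestamp and derive a separate string field from it. A sketch (the month_day field name is only an example, not something your config already has):

grok {
  # Pull the month and day out of the already-extracted timestamp field.
  match => [ "timestamp", "%{YEAR}-%{MONTHNUM:month}-%{MONTHDAY:day}" ]
}
mutate {
  # Build a display-only "01-01"-style field and drop the helpers.
  add_field => { "month_day" => "%{month}-%{day}" }
  remove_field => [ "month", "day" ]
}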

so you need to delete ",SSS" from your date pattern.

Still the same problem :smiley:

Still getting _dateparsefailure? If yes, look in the Logstash log like I said. If no, what do you get?
