Use "-" as delimiter in timestamp

I am using Filebeat to feed CSV files into Logstash. The timestamp in the logs I need to process is as below.

11/05/2016-00:00:00,80,78,78,89
11/05/2016-00:05:00,80,78,78,89
11/05/2016-00:10:00,80,78,78,89

I use a simple filter in Logstash:

filter {
  csv {
    separator => ","
  }
  date {
    match => [ "message", "MM/dd/YYYY-HH:mm:ss" ]
  }
}

I see the following error in the Logstash logs:

[2016-11-06T21:15:23,783][WARN ][logstash.filters.date ] Failed parsing date from field {:field=>"message", :value=>"11/05/2016-00:00:00,80,78,78,89,79,65,101,75,97,71,66,79,61,58,73,104,90,75,76,71,89,73,65,67,82,81,73,62,100,70,92,62,83,90,74,53,81,87,88,103,75,72,73,86", :exception=>"Invalid format: "11/05/2016-00:00:00,80,78,78,89,79,65,101,75,97,71,..." is malformed at ",80,78,78,89,79,65,101,75,97,71,..."", :config_parsers=>"MM/dd/YYYY-HH:mm:ss", :config_locale=>"default=en_US"}

Can anyone please help me with parsing the "-" in the timestamp?
Any help would be greatly appreciated.

br//

You should define column names in the CSV filter, then specify the actual timestamp column.

Pointing the date filter at message asks it to parse the whole line, not just the timestamp field.
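
For reference, a minimal sketch of that approach for the sample lines at the top of the thread might look like the following. The column names after the timestamp are placeholders (the real files clearly have more columns), and the match pattern keeps the literal "-" from the data, since characters like "/", ":" and "-" are taken literally in the date filter's patterns:

filter {
  csv {
    separator => ","
    columns => ["mytimestamp","value1","value2","value3","value4"]
  }
  date {
    match => [ "mytimestamp", "MM/dd/yyyy-HH:mm:ss" ]
  }
}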

Hi Mark,
Thank you so much for the prompt reply.

Yes, I have been trying that as well, with the filter below:
filter {
  csv {
    separator => ","
    columns => ["mytimestamp","nodename"]
  }
  date {
    match => [ "mytimestamp", "MM/dd/YYYY HH:mm:ss" ]
  }
}

Using this filter does not give any error, but I do not see any other logs in the Logstash log file either.

Also, the data does not subsequently reach the Kibana interface.
It is odd behaviour: logs only reach Kibana either when there is a parsing error or when I have no date filter at all.
With the date filter above, nothing is sent to the Kibana interface.

br//

Use a stdout { codec => rubydebug } output until you've verified that your input and filters work alright. Only then enable the elasticsearch output.
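
For example, an output section for this kind of debugging could look like the sketch below; the elasticsearch settings are placeholders (the host is an assumption) and stay commented out until the events printed by rubydebug look correct:

output {
  stdout { codec => rubydebug }
  # Re-enable once the parsed events look right:
  # elasticsearch {
  #   hosts => ["localhost:9200"]
  # }
}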

Hi Mark,

This is what my Logstash configuration looks like, but I still don't see any logs in /var/log/logstash/logstash-plain.log other than the entries for stopping and starting the Logstash service, which I do after any change.
input {
  beats {
    port => 5044
  }
}
filter {
  csv {
    separator => ","
    columns => ["mytimestamp","nodename"]
  }
  date {
    match => [ "mytimestamp", "MM/dd/YYYY HH:mm:ss" ]
    target => "@timestamp"
  }
}
output {
  stdout { codec => rubydebug }
}
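
One thing worth noting about the configuration above: events emitted by stdout { codec => rubydebug } go to the Logstash process's standard output, not to /var/log/logstash/logstash-plain.log, so they are easy to miss when Logstash runs as a service. A quick way to see them is to run Logstash in the foreground against the same pipeline; the paths below are assumptions for a typical package install:

# run in a terminal; adjust the paths to your installation
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/beats-csv.conf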
