Date field format in Logstash

Hi, I am new to ELK. I am trying to get my first example working, but Logstash doesn't seem to like my date format.

stock.conf

input {
  file {
    path => ["/Users/alialaie/Desktop/Examples/Data/Intro/stock.csv"]
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["Date","Open","High","Low","Close","Volume","Adj Close"]
  }

  date {
    match => ["Date", "dd-MM-yyyy HH:mm:ss"]
  }

  mutate { convert => ["High", "float"] }
  mutate { convert => ["Open", "float"] }
  mutate { convert => ["Low", "float"] }
  mutate { convert => ["Close", "float"] }
  mutate { convert => ["Volume", "float"] }
  mutate { convert => ["Adj Close", "float"] }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    action => "index"
    index => "stock"
  }
}

Here is a sample of stock.csv:

Date,Open,High,Low,Close,Volume,Adj Close^M
02-04-2015 12:01:01,125.029999,125.559998,124.190002,125.32,32220100,122.294596^M
01-04-2015 12:01:01,124.82,125.120003,123.099998,124.25,40621400,121.250427^M
31-03-2015 12:01:01,126.089996,126.489998,124.360001,124.43,42090600,121.426082^M
30-03-2015 12:01:01,124.050003,126.400002,124,126.370003,47099700,123.31925^M
27-03-2015 12:01:01,124.57,124.699997,122.910004,123.25,39546200,120.274569^M
26-03-2015 12:01:01,122.760002,124.879997,122.599998,124.239998,47572900,121.240667^M
25-03-2015 12:01:01,126.540001,126.82,123.379997,123.379997,51655200,120.401428^M
24-03-2015 12:01:01,127.230003,128.039993,126.559998,126.690002,32842300,123.631525^M
23-03-2015 12:01:01,127.120003,127.849998,126.519997,127.209999,37709700,124.138968^M

Any help would be appreciated.

The date filter works fine for me.

$ cat test.config 
input { stdin {} }
output { stdout { codec => rubydebug } }
filter {
  date {
    match => ["message", "dd-MM-yyyy HH:mm:ss"]
  }
}
$ echo '02-04-2015 12:01:01' | /opt/logstash/bin/logstash -f test.config
Settings: Default pipeline workers: 8
Logstash startup completed
{
       "message" => "02-04-2015 12:01:01",
      "@version" => "1",
    "@timestamp" => "2015-04-02T10:01:01.000Z",
          "host" => "lnxolofon"
}
Logstash shutdown completed

What do your events look like? Use a stdout { codec => rubydebug } output while debugging.
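For example, while debugging you could temporarily add a stdout output to your stock.conf (just a sketch; remove it once things work):

output {
  # Print each event to the console so you can inspect the parsed fields.
  stdout { codec => rubydebug }
}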

Hi Magus,
I did my homework. With test.config, the output looks like this:
Pipeline main started
{
       "message" => "02-04-2015 12:01:01",
      "@version" => "1",
    "@timestamp" => "2015-04-02T10:01:01.000Z",
          "host" => "Alis-MBP.home"
}
Pipeline main has been shutdown

The output of stdout { codec => rubydebug } on my stock data looks like this:
{
       "message" => "16-12-1980 12:01:01,25.375,25.375,25.25,25.25,26432000,0.378845\r",
      "@version" => "1",
    "@timestamp" => "1980-12-16T11:01:01.000Z",
          "path" => "/Users/alialaie/Desktop/Examples/Data/Intro/stock.csv",
          "host" => "Alis-MBP.home",
          "Date" => "16-12-1980 12:01:01",
          "Open" => 25.375,
          "High" => 25.375,
           "Low" => 25.25,
         "Close" => 25.25,
        "Volume" => 26432000.0,
     "Adj Close" => 0.378845
}
Date is being picked up by Elasticsearch as a string, not a date. Am I doing something wrong here?
Thanks

The @timestamp field is populated from the message field exactly as expected (assuming your timezone is UTC+1), so we're good there. But you're saying the field in Elasticsearch has the wrong mapping? Did you determine that with the get mapping API?
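For reference, something like this should show the full mapping (assuming the index is named stock, as in your config):

curl -XGET 'http://localhost:9200/stock/_mapping?pretty'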

This is what I just did:
curl -XGET 'http://localhost:9200/stock'
{"stock":{"aliases":{},"mappings":{"logs":{"properties":{"@timestamp":{"type":"date","format":"strict_date_optional_time||epoch_millis"},"

When I go to Kibana to find my index, the "Date" field is not being recognised as a time field; the only time field offered is @timestamp. Am I doing something wrong here?

Well, if you want the date filter to write to the Date field instead of @timestamp, you'll have to configure it accordingly (using the target option). Secondly, you'll want to explicitly map the Date field as the date type, preferably using an index template. A sketch of both is below.
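For example (a sketch only, not the exact config you'd ship; the template name is an assumption), the filter change would be:

date {
  match => ["Date", "dd-MM-yyyy HH:mm:ss"]
  # Write the parsed timestamp into Date instead of @timestamp.
  target => "Date"
}

and a minimal index template could be created like this (with target set as above, the field is written as an ISO8601 timestamp, so the default date format suffices):

curl -XPUT 'http://localhost:9200/_template/stock' -d '{
  "template": "stock",
  "mappings": {
    "logs": {
      "properties": {
        "Date": { "type": "date" }
      }
    }
  }
}'

Note that an existing field mapping can't be changed from string to date in place, so you'll need to delete and recreate the stock index (or reindex) before this takes effect.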

Thanks. Problem solved.