Logstash date to datetime format

Hi,

I'm using Logstash to pull data from MySQL and pass it to Elasticsearch.
Unfortunately, the date column is parsed into Zulu (UTC ISO8601) format by default. I need the data in a datetime format
like yyyy-mm-dd H:i:s, without the T and Z. Below is my Logstash config:

```
input {
  jdbc {
    jdbc_driver_library => "/etc/mysql/driver/mysql-connector-java-5.1.48/mysql-connector-java-5.1.48-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/db"
    jdbc_user => "root"
    jdbc_password => "secret"
    tracking_column => "id"
    use_column_value => true
    statement => "SELECT * FROM db.logs;"
    schedule => "* * * * * *"
  }
}

output {
  elasticsearch {
    document_id => "%{id}"
    document_type => "_doc"
    index => "logs"
    hosts => "http://localhost:9200"
    sniffing => true
  }

  stdout {
    codec => rubydebug
  }
}

filter {
  date {
    match => ["date", "yyyy-MM-dd HH:mm:ss"]
  }
}
```

In this case I received a "_dateparsefailure" tag.
I also tried mutate with gsub and convert, but without success (roughly what I tried is shown below). Please help.
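The mutate attempt was along these lines (a simplified sketch; it assumes the MySQL column ends up in the `date` field as a timestamp):

```
filter {
  mutate {
    # the jdbc input hands DATETIME columns over as timestamp objects,
    # so turn the value into a plain string first
    convert => { "date" => "string" }
    # then strip the ISO8601 decoration: the T separator, the fractional
    # seconds and the trailing Z
    gsub => [
      "date", "T", " ",
      "date", "[.][0-9]+Z$", ""
    ]
  }
}
```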

If date is already a LogStash::Timestamp, you can use the ruby filter with strftime to format it as any string you like.

Here is the syntax for setting up a new date_time field from the old value; search on these and you will find more examples.
Also, your date filter needs a target so the parsed value is written back to date:

date {
    match => ["date", "yyyy-MM-dd HH:mm:ss"]
    target => "date"
}

ruby {
    code => "
      event.set('date_time', event.get('date').time.localtime.strftime('%Y-%m-%d %H:%M:%S'))
    "
  }
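If you want to keep the value in UTC rather than converting it to the machine's local zone, the same idea works without the localtime call:

```
ruby {
  code => "
    # same as above, minus the local-time conversion
    event.set('date_time', event.get('date').time.strftime('%Y-%m-%d %H:%M:%S'))
  "
}
```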

You can use grok to split that into separate fields:

grok { match => { "date_time" => "^%{YEAR:year}-%{MONTHNUM2:month}-%{MONTHDAY:day}" } }

You will then have three new fields: year, month, and day.
You can join them if you want a string date rather than a date type:

mutate {
   add_field => { "testdate" => "%{year}-%{month}-%{day}" }
}
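If you do not need the helper fields after that, a follow-up mutate can drop them again:

```
mutate {
  # optional clean-up once testdate has been built
  remove_field => [ "year", "month", "day" ]
}
```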
