I have tried to load data from MySQL into the ELK stack. I have a date column with values like 2020-04-01 00:00:01 and 2020-06-25 13:36:24, but after ingesting the data the field is not mapped as a date — it still shows up as a string field. My configuration file is:
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/NBP"
    jdbc_user => ""
    jdbc_password => ""
    jdbc_driver_library => "/etc/logstash/mysql-connector-java-8.0.16.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT * FROM ******l"
    use_column_value => true
    tracking_column => "id"
    clean_run => true
    last_run_metadata_path => ".logstash_jdbc_last_run"
    jdbc_paging_enabled => true
    jdbc_page_size => "5000"
  }
}
output {
  elasticsearch {
    hosts => "********"
    index => "***"
    document_id => "%{id}"
  }
  stdout {
    codec => rubydebug
  }
}
You'll either have to configure the mapping of your index so the field is defined as a date data type in Elasticsearch that accepts this format, or use a date filter in Logstash to parse the value before indexing. (I'd suggest the latter.) Either way you'll have to reindex your data, because you can't change the mapping of a field in an existing index.
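For the date-filter route, a minimal sketch of what that filter block could look like — note the field name created_at is an assumption here, substitute the actual name of your date column:

filter {
  date {
    # Parse the MySQL DATETIME string, e.g. "2020-04-01 00:00:01"
    match => ["created_at", "yyyy-MM-dd HH:mm:ss"]
    # Write the parsed value back to the same field instead of @timestamp
    target => "created_at"
  }
}

Without target, the date filter writes the parsed timestamp to @timestamp by default; setting target keeps your original field but now as a proper date, so Elasticsearch will map it as a date type when the index is created fresh.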
Thanks for the reply. I created the date field with a datetime data type and it works.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.