I am porting data from MySQL to Elasticsearch using Logstash 5.2.
I need a field named record_time to become the @timestamp in Elasticsearch. I used the date filter, but it does not work, and there is no warning in the logs.
input {
  jdbc {
    jdbc_driver_library => "./mysql-connector-java-5.1.36.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://myserver:port/mundotrack?zeroDateTimeBehavior=convertToNull"
    jdbc_user => "user"
    jdbc_password => "pass"
    jdbc_page_size => 10
    jdbc_paging_enabled => true
    statement => "select * from mytable order by record_time limit 10"
  }
}

filter {
  date {
    match => ["record_time", "yyyy-MM-dd'T'HH:mm:ss.SSSZ", "ISO8601", "UNIX_MS", "UNIX", "yyyy-MM-dd HH:mm:ss"]
    timezone => "America/Toronto"
    locale => "en"
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["myhost"]
    index => "myindex"
    template_name => "mytemplate"
  }
}
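One thing worth checking (a debugging sketch I have added, not part of my original pipeline): the jdbc input converts MySQL DATETIME/TIMESTAMP columns into LogStash::Timestamp objects rather than strings, so the date filter may have no string to match against. A ruby filter can log the actual Ruby class of the field; the record_time_class field name here is just something I made up for inspection:

    filter {
      ruby {
        # Expose the runtime type of record_time as a new event field
        code => "event.set('record_time_class', event.get('record_time').class.to_s)"
      }
    }

If record_time_class comes back as LogStash::Timestamp instead of String, that would explain why none of the match patterns fire.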
This is the record_time field when I print it out using
stdout { codec => json }
.
.
The template is defined as

PUT /_template/mytemplate
{
  "template": "myindex*",
  "order": 2,
  "settings": {
    "number_of_shards": 2,
    "number_of_replicas": 1
  },
  "mappings": {
    "log": {
      "_all": { "enabled": false, "omit_norms": true },
      "dynamic_templates": [
        {
          "string_fields": {
            "match": "*",
            "match_mapping_type": "string",
            "mapping": {
              "type": "string",
              "index": "no"
            }
          }
        }
      ]
    }
  }
}
I am not new to this filter; it worked fine when I used Filebeat to forward data to Logstash. But when I pull data directly from MySQL, it has problems.
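Since the jdbc input may hand the filter chain a LogStash::Timestamp object rather than a string, one workaround I am considering (an untested sketch, not a confirmed fix) is to skip the date filter entirely and copy the value into @timestamp with a ruby filter, which accepts a timestamp object directly:

    filter {
      ruby {
        # Copy record_time into @timestamp when it is present;
        # assumes record_time is already a LogStash::Timestamp
        code => "event.set('@timestamp', event.get('record_time')) if event.get('record_time')"
      }
    }

Is there a cleaner way to make the date filter itself handle what the jdbc input produces?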