OK, I'll start from the beginning:
I have a Microsoft SQL Server table with a column of type datetime.
By default, when I run a Logstash import, Logstash maps the Microsoft SQL columns as text.
This is my Logstash configuration file:
input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://***.***.***.***;databaseName=myDatabaseName;"
    jdbc_user => "user"
    jdbc_password => "pass"
    jdbc_driver_library => "C:\Program Files (x86)\sqljdbc_6.0\enu\sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT DATE_VISA_DEMANDE FROM FILM WHERE DATE_VISA_DEMANDE IS NOT NULL"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "test_view"
    document_type => "film_infos"
  }
}
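One thing I considered was adding a date filter to parse the column before it reaches Elasticsearch. This is just a guess at what it would look like (I'm assuming the field arrives as `date_visa_demande` and that ISO8601 matches the format the JDBC input produces):

```conf
filter {
  date {
    # guess: parse the JDBC column into a real timestamp
    match  => ["date_visa_demande", "ISO8601"]
    # write it back to the same field instead of @timestamp
    target => "date_visa_demande"
  }
}
```

I'm not sure whether this filter alone would change the mapping, though.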
Logstash output in the console:
[[main]<jdbc] INFO logstash.inputs.jdbc - (0.109000s) SELECT
DATE_VISA_DEMANDE
FROM FILM WHERE DATE_VISA_DEMANDE IS NOT NULL
16:48:04.817 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://localhost:9200"]}}
16:48:04.818 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Running health check to see if an Elasticsearch connection is working {:url=>#<URI::HTTP:0x5bab2d11 URL:http://localhost:9200>, :healthcheck_path=>"/"}
16:48:06.145 [[main]-pipeline-manager] WARN logstash.outputs.elasticsearch - Restored connection to ES instance {:url=>#<URI::HTTP:0x5bab2d11 URL:http://localhost:9200>}
16:48:06.146 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Using mapping template from {:path=>nil}
16:48:06.867 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001,
"settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"_all"=>{"enabled"=>true, "norms"=>false},
"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}},
{"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}],
"properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true,
"properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
16:48:06.980 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - New Elasticsearch output
{:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["localhost:9200"]}
16:48:06.986 [[main]-pipeline-manager] INFO logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>1000}
16:48:06.988 [[main]-pipeline-manager] INFO logstash.pipeline - Pipeline main started
16:48:07.378 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
16:48:10.005 [LogStash::Runner] WARN logstash.agent - stopping pipeline {:id=>"main"}
This is the resulting mapping:
"test_view": {
  "mappings": {
    "film_infos": {
      "properties": {
        "@timestamp": {
          "type": "date"
        },
        "@version": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        },
        "date_visa_demande": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        }
      }
    }
  }
}
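Alternatively, would I need to create the index with an explicit mapping before running the import? Something like this, reusing the index and type names from my config (the request below is only my guess at what the mapping should be):

```json
PUT /test_view
{
  "mappings": {
    "film_infos": {
      "properties": {
        "date_visa_demande": { "type": "date" }
      }
    }
  }
}
```

If so, I'm unsure whether Logstash would respect this pre-existing mapping or overwrite it.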
Does anyone know how to get this column indexed as a date type in Elasticsearch?