Error while sending data from mysql to Logstash

Hi All,

Below is my config file for sending data from MySQL to Logstash:

```
input {
  jdbc {
    jdbc_driver_library => "C:/ELK/logstash-6.2.3/mysql-connector-java-5.1.47/mysql-connector-java-5.1.47-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/automation?useSSL=true&verifyServerCertificate=false&requireSSL=true"
    jdbc_user => ""
    jdbc_password => ""
    statement => "select * from tbl_hybris_waiting where ai_id > :sql_last_value"
    use_column_value => true
    tracking_column => "ai_id"
    tracking_column_type => "numeric"
    last_run_metadata_path => "C:\ELK\data\MetarunData.logstash_jdbc_last_run_hybriswaiting_database"
    #clean_run => true
    schedule => "* * * * * *"
  }
}

filter {
  mutate { convert => {"ai_id" => "integer"} }
  date {
    match => ["execution month", "yyyy-MM"]
    target => "execution month"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "hybriswaiting_database"
  }
}
```

And the table data looks like this:

```
ai_id date processcode action execution time region execution month
1 10/26/2018 15:39 waitForStockNotification Tue Oct 24 06:58:03 GMT 2017 AU 2017-10
2 10/26/2018 15:39 waitForStockNotification Mon Oct 23 02:38:18 GMT 2017 AU 2017-10
3 10/26/2018 15:39 waitForStockNotification Thu Oct 26 11:11:15 GMT 2017 AU2 2017-10
4 10/26/2018 15:39 waitForStockNotification Sun Oct 29 05:35:24 GMT 2017 AU2 2017-10
```

But I'm getting the below error while sending data to Elasticsearch:

```
Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"hybriswaiting_database", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x79d612f5], :response=>{"index"=>{"_index"=>"hybriswaiting_database", "_type"=>"doc", "_id"=>"PPROy2YBBHN1pAdlFGMy", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [execution month]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: """}}}}}
```

This part of the error message looks like there is a problem with the date format setting in your target index. Can you post the mapping you have in Elasticsearch for the "hybriswaiting_database" index? You should get it with something like:

```
curl -X GET "<your_es_address:your_port>/<your_index_name>/_mapping"
```

```
{
  "hybriswaiting_database": {
    "mappings": {
      "doc": {
        "properties": {
          "@timestamp": { "type": "date" },
          "@version": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
          "action": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
          "ai_id": { "type": "long" },
          "date": { "type": "date" },
          "execution month": { "type": "date" },
          "execution time": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
          "processcode": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
          "region": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } }
        }
      }
    }
  }
}
```

This is the mapping I'm getting.

When the "execution month" column is empty, Logstash is not indexing that particular row.

This means your index is expecting a "date" in the default format, which is "strict_date_optional_time||epoch_millis". You can therefore index date strings like 2015-01-01 or 2015-01-01T12:10:30Z, or plain Unix time in milliseconds like 1420070400001, but not something that is just a year and month. To index year-month values you need to specify a different "format" when the index mapping is created. In the case of Logstash this is sometimes done via an index template, but it can also be done manually by yourself. Another option would be to convert your yyyy-MM date to yyyy-MM-01 somewhere in your Logstash pipeline.
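That pipeline-side conversion could be sketched as a small filter. This is a hypothetical cleanup, assuming the field arrives as a plain yyyy-MM string and that rows with an empty "execution month" should simply have the field removed (which also avoids the `Invalid format: ""` error):

```
filter {
  # Hypothetical cleanup for the "execution month" field:
  # remove it when empty, otherwise append "-01" so the value
  # parses under the default strict_date_optional_time format.
  if [execution month] == "" {
    mutate { remove_field => ["execution month"] }
  } else {
    mutate { gsub => ["execution month", "^([0-9]{4}-[0-9]{2})$", '\1-01'] }
  }
}
```

Alternatively, if you create the index (or an index template) yourself, the field could be mapped with a format that also accepts year-month values, for example:

```
"execution month": { "type": "date", "format": "yyyy-MM||strict_date_optional_time||epoch_millis" }
```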

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.