Logstash has fetched the data but not shipped it into Elasticsearch

I am using Logstash to fetch data from a MySQL database on another host and ship it into Elasticsearch. After starting Logstash, I can see that it has already fetched the new rows I inserted into MySQL, but the new data never shows up in my Elasticsearch index.
Here is my jdbc.conf:
input {
  stdin {
  }
  jdbc {
    # mysql jdbc connection string to our backup database
    jdbc_connection_string => "jdbc:mysql://192.168.9.223:3306/FANTASY"
    # the user we wish to execute our statement as
    jdbc_user => "root"
    jdbc_password => "123456"
    # the path to our downloaded jdbc driver
    jdbc_driver_library => "E:\Elasticsearch\elasticsearch-6.2.4\logstash-6.2.4\logstash-core\lib\jars\mysql-connector-java-8.0.11.jar"
    # the name of the driver class for mysql
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
    record_last_run => true
    last_run_metadata_path => "E:\Elasticsearch\elasticsearch-6.2.4\logstash-6.2.4\logs.logstash_jdbc_last_run"
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"
    clean_run => false
    statement_filepath => "E:\Elasticsearch\elasticsearch-6.2.4\logstash-6.2.4\jdbc.sql"
    schedule => "*/10 * * * *"
    type => "jdbc"
  }
}

filter {
  json {
    source => "message"
    remove_field => ["message"]
  }
}

output {
  elasticsearch {
    hosts => ["192.168.9.58:9200"]
    #port => "9200"
    #protocol => "http"
    index => "lee-index"
    document_id => "%{id}"
  }
  stdout {
    codec => json_lines
  }
}
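
For reference, the jdbc.sql that statement_filepath points to would typically filter on the tracking column, so each scheduled run fetches only rows newer than the last one recorded. A minimal sketch, with a placeholder table name:

-- my_table is a placeholder; :sql_last_value is filled in by the
-- jdbc input from the value stored at last_run_metadata_path
SELECT * FROM my_table WHERE id > :sql_last_value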

I have not changed anything in pipelines.yml or logstash.yml.
How can I get the data into Elasticsearch? Any suggestions would be welcome.
I have checked the output on the command line:
[2018-06-01T16:28:50,028][WARN ][logstash.filters.json ] Parsed JSON object/hash requires a target configuration option {:source=>"message", :raw=>""}
[2018-06-01T16:28:50,045][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"%{id}", :_index=>"lee-index", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x2b149481], :response=>{"index"=>{"_index"=>"lee-index", "_type"=>"doc", "_id"=>"%{id}", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Rejecting mapping update to [lee-index] as the final mapping would have more than 1 type: [fulltext, doc]"}}}}
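
The first WARN is a separate issue: the json filter is being handed an empty message (the stdin input emits one for a blank line, and jdbc events carry no message field to parse at all). A guard along these lines would silence it:

filter {
  # only run the json filter when there is a non-empty message to parse;
  # jdbc events arrive as already-structured fields
  if [message] and [message] != "" {
    json {
      source => "message"
      remove_field => ["message"]
    }
  }
}

The second WARN is the one that blocks indexing.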

It seems there is a mapping error in Logstash. Here are my index settings:

PUT /lee-index

PUT /lee-index/fulltext/_mapping
{
    "properties": {
        "content": {
            "type": "text",
            "analyzer": "ik_max_word",
            "search_analyzer": "ik_max_word"
        }
    }
}

Why is there a fulltext type?

Have you looked in the Logstash log for clues? What if you turn up the log level?

It seems I have found the reason, but why are there both a doc type and a fulltext type?
[2018-06-01T16:30:01,623][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"5", :_index=>"lee-index", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x17f26092], :response=>{"index"=>{"_index"=>"lee-index", "_type"=>"doc", "_id"=>"5", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Rejecting mapping update to [lee-index] as the final mapping would have more than 1 type: [fulltext, doc]"}}}}

I have solved my problem. Thanks for your advice!

It seems I have found the reason, but why are there both a doc type and a fulltext type?

Hard to tell. There's nothing in your configuration that adds a "fulltext" type, so I don't know where it comes from.
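
One way to see which types the index actually holds is the standard mapping endpoint:

GET /lee-index/_mapping

Every type that has ever been written to the index will be listed in the response.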

After changing the index type to doc, the data is shipped correctly.
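
For anyone hitting the same error: in Elasticsearch 6.x an index can hold only one mapping type, and the Logstash elasticsearch output sends _type doc by default (visible in the log lines above). A sketch of one way to get there, assuming it is acceptable to delete and recreate the index (this loses any existing data), with the index name and analyzer taken from the thread (ik_max_word requires the IK analysis plugin):

DELETE /lee-index

PUT /lee-index
{
    "mappings": {
        "doc": {
            "properties": {
                "content": {
                    "type": "text",
                    "analyzer": "ik_max_word",
                    "search_analyzer": "ik_max_word"
                }
            }
        }
    }
}

Alternatively, the existing fulltext type could be kept by setting document_type => "fulltext" in the elasticsearch output block, though that option is deprecated in 6.x since mapping types are being removed.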
