When I run Logstash I get this "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [Document_Type] tried to parse field [Document_Type] as object, but found a concrete value"}}}}
It means that in your mapping you defined Document_Type as an object (and in the mapping it does look like an object), but you're sending it as something different (what you posted under that bold Document_Type is an array of numbers, not an object).
Can you please share a sample input document and the complete mapping for that index?
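For illustration, the conflict usually looks something like this (a hypothetical mapping and document just to show the shape of the problem; the "id" sub-field is made up, not taken from your index):

PUT testindex
{
  "mappings": {
    "properties": {
      "Document_Type": {
        "properties": {
          "id": { "type": "long" }
        }
      }
    }
  }
}

POST testindex/_doc
{
  "Document_Type": "1,2,3"
}

The second request fails with exactly that mapper_parsing_exception, because the mapping expects an object under Document_Type while the document provides a plain string.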
The above example works fine, but not when indexing with Logstash. This is my configuration:
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/test?useCursorFetch=true"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "mysql-connector-java-8.0.18.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # our query
    statement => "SELECT a.fileid, a.filename, group_concat(b.document_type) as Document_Type FROM file_table AS a LEFT JOIN file_doctype AS b ON a.fileid = b.fileid group by a.fileid order by a.fileid"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => ["testindex"]
  }
}
I am getting this error: "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [Document_Type] tried to parse field [Document_Type] as object, but found a concrete value"}}}}
I'm sorry, I've been kinda busy and didn't see the notification. Anyway, can you share the output of Logstash? Since you're sending to both Elasticsearch and standard output, the rubydebug output would show what the event actually looks like.
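In the meantime, here is a sketch of the kind of change that often resolves this, assuming the goal is for Document_Type to hold an array of plain values rather than an object: split the comma-separated string that group_concat() produces in a filter block before the output. Note that the existing object mapping on testindex can't be changed in place, so you would also need to delete and recreate the index (or rename the field) before the new values are accepted.

filter {
  # assumption: group_concat() returns a comma-separated string like "1,2,3"
  # split it into an array of strings before it reaches Elasticsearch
  mutate {
    split => { "Document_Type" => "," }
  }
}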