Hello,
I have defined my own index in Elasticsearch:
```
PUT donor
{
  "mappings": {
    "properties": {
      ...,
      "donations": { "type": "nested" }
    }
  }
}
```
Basically I have a donor user in MySQL, and each donor has many donations. I build a JSON array of donation objects in MySQL and want to feed all of this data to Elasticsearch through Logstash. MySQL builds the JSON array perfectly; however, when Logstash runs the query and sends the result to Elasticsearch, something weird happens and I get this:
{:status=>400, :action=>["index", {:_id=>"137347", :_index=>"donors", :routing=>nil}, { "donations"=>"[\"{\\\"donation_id\\\": 13378812, \\\"donation_type_id\\\": 4, \\\"donation_center_id\\\": 1, \\\"donation_status_id\\\": 1, \\\"donation_sponsor_id\\\": 334},{\\\"donation_id\\\": 33432523, \\\"donation_type_id\\\": 4, \\\"donation_center_id\\\": 2, \\\"donation_status_id\\\": 1, \\\"donation_sponsor_id\\\": 300}\"]"}], :response=>{"index"=>{"status"=>400, "error"=>{"type"=>"document_parsing_exception", "reason"=>"[1:36] object mapping for [donations] tried to parse field [donations] as object, but found a concrete value"}}}}
Why is it finding a concrete value? If I manually post my result from MySQL through Kibana, it works...
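For reference, this is roughly what I do in the Kibana Dev Tools console (the index name and document id here are just examples, taken from the error output above), and it indexes without complaint:

```
POST donors/_doc/137347
{
  "donations": [
    {"donation_id": 13334300, "donation_type_id": 2, "donation_center_id": 2, "donation_status_id": 3, "donation_sponsor_id": 393},
    {"donation_id": 33388011, "donation_type_id": 4, "donation_center_id": 3, "donation_status_id": 3, "donation_sponsor_id": 387}
  ]
}
```

Note that here `donations` is a real JSON array of objects, not a string.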
Sample result as returned by MySQL:
[{"donation_id": 13334300, "donation_type_id": 2, "donation_center_id": 2, "donation_status_id": 3, "donation_sponsor_id": 393}, {"donation_id": 33388011, "donation_type_id": 4, "donation_center_id": 3, "donation_status_id": 3, "donation_sponsor_id": 387}]
The Logstash config is pretty bare bones: no filters, no processing, nothing. A JDBC input for MySQL and an Elasticsearch output. That is all.
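Roughly like this (connection string, credentials, and the SQL statement are placeholders, not my real values):

```
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"   # placeholder
    jdbc_user => "user"                                            # placeholder
    jdbc_password => "password"                                    # placeholder
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    statement => "SELECT ..."                                      # placeholder query
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "donors"
    document_id => "%{id}"
  }
}
```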
TL;DR: I want Logstash to not do anything to my entity, just pass it through as is. Why is this so hard? I have been stuck on this issue for hours. Please help.