Hi, I'm new to Logstash and have found it super useful for loading data from different sources into an ES cluster! But I've run into one problem that I've been stuck on for a while now and would really appreciate some help.
I'm trying to construct and store a JSON object using data from the input, transformed in the filter, before sending it to my ES output. I was following this tutorial: How to keep Elasticsearch synced with a RDBMS using Logstash and JDBC | Elastic Blog.
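For context, here's roughly what my pipeline looks like. This is just a sketch based on the tutorial: the connection string, credentials, table, and index names below are placeholders, not my real values.

```
input {
  jdbc {
    # Placeholder connection settings, per the tutorial
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/es_db"
    jdbc_user => "<user>"
    jdbc_password => "<password>"
    # DoB is the extra column I added on top of the tutorial's schema
    statement => "SELECT id, name, DoB FROM my_table"
  }
}
output {
  elasticsearch {
    index => "my_index"
    document_id => "%{id}"
  }
}
```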
Here's the gist of the transformation I want to perform: I would like to have a JSON object constructed from the input. Assuming the above-mentioned tutorial is followed, and the RDBMS table has some additional columns such as DoB, the data I want to load into ES is as follows:
{
  "id": "123",
  "name": "John",
  "details": {
    "DoB": "01/01/2000"
  }
}
I have defined my index mapping as below:
{
  "mappings": {
    "properties": {
      "id": {
        "type": "keyword"
      },
      "name": {
        "type": "search_as_you_type"
      },
      "details": {
        "properties": {
          "DoB": {
            "type": "date"
          }
        }
      }
    }
  }
}
But when I try to run my own Logstash pipeline with this filter added:
mutate {
  replace => {
    "DoB" => { "DoB" => "%{DoB}" }
  }
}
I keep running into the following error:
"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [details] tried to parse field [details] as object, but found a concrete value"}}}}
Is this not a desirable way to index data in Elasticsearch? And if it is, how would one populate an object-type mapping in Elasticsearch using Logstash? It would be super helpful if anyone could help me understand how to go about this.