Correct Mapping into Elasticsearch for JSON

Hi Guys,

I am pretty new to Elasticsearch and am trying to get data from a Postgres database into Elasticsearch. I have currently been using the following link to get the data:

The issue I am facing is with the mapping of the incoming data. Since the mapping is created by default from the first element of the input, my JSON object from Postgres is being recognised as a string.
e.g.

"field": "{"max"=>28, "min"=>18, "custom"=>[{"id"=>"6017279726270", "name"=>"Product_Viewers_30"}], "ana1"=>[2], "an2"=>{"ana21"=>["GA"], "an3"=>["home"]}, "types"=>["elastic"]}"

This is a JSON object in Postgres. Can anyone let me know how to go about it?

Thanks!

Two things come to mind.

The JSON object you describe has escaped quotes. Therefore it is not a JSON object. Maybe you should unescape those quotes.

Does the JDBC importer recognize the JSON object as a JSON object, or is it just any other value?

Hi

Thanks for the reply!

I think it's just taking it as a string, so Elasticsearch assumes it to be a string as well. I was wondering whether we can define the mapping in a way that the string gets stripped and converted into a JSON object. My final goal is essentially to use fields such as max and min and plot the data via Kibana.

You could look into the object field type to use in your mapping.
https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-object-type.html
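For example, a minimal sketch of such a mapping created with the elasticsearch-py client (the index name `products` and the sub-fields are assumptions based on your sample; the exact request body also depends on your Elasticsearch version, older ones still need a mapping type name):

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Hypothetical mapping: "field" is declared as an object with explicit
# sub-fields, so max/min come in as numbers and can be plotted in Kibana.
mapping = {
    "mappings": {
        "properties": {
            "field": {
                "properties": {
                    "max": {"type": "integer"},
                    "min": {"type": "integer"},
                    "types": {"type": "keyword"},
                }
            }
        }
    }
}

es.indices.create(index="products", body=mapping)
```

Note that this only helps if the document actually arrives as nested JSON; if the importer sends the value as one big string, Elasticsearch will still index it as text.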

The JDBC importer cannot recognize the JSON automatically and decides to proceed with a string. To be valid ES document JSON, it must have enclosing { and }. You can tweak your SQL statement with string concatenation to achieve this.


Hi

Thanks for the reply!
For now I went ahead with a Python script. I'm not sure about the SQL statement approach, though.
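A minimal sketch of that kind of script could look like this (the table, column, index names, and connection details below are placeholders; it assumes the column is json/jsonb, which psycopg2 returns as a Python dict, and a reasonably recent elasticsearch-py, older versions also want a _type in each bulk action):

```python
import psycopg2
from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

# Connection details are placeholders.
pg = psycopg2.connect(host="localhost", dbname="mydb", user="me", password="secret")
es = Elasticsearch("http://localhost:9200")

def fetch_rows():
    with pg.cursor() as cur:
        # Table/column names are assumptions; "field" is the json/jsonb
        # column from the original post. psycopg2 deserialises json/jsonb
        # to a dict, so no string stripping is needed here.
        cur.execute("SELECT id, field FROM my_table")
        for row_id, field in cur:
            yield {
                "_index": "products",
                "_id": row_id,
                "_source": field,  # already a dict: {"max": 28, "min": 18, ...}
            }

# Bulk-index the rows; max/min arrive as numbers that Kibana can plot.
bulk(es, fetch_rows())
```

If the column comes back as a plain string instead of a dict (e.g. a text column), a json.loads(field) before indexing does the conversion.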