Convert column to JSON

Hi,
I am new to ELK.
I am trying to import a CSV file into Elasticsearch using Logstash. I have got all the fields into Elasticsearch, and the documents are of type doc.

I can see the data in both table and JSON formats.

My problem is that one of the fields contains multiple subfields, which I want to parse as JSON. For example:
Brand:Honda, car: "City", year:"2002", Engine:"1.5L",Dimensions:"{Length:"300cm", Breadth:"180cm", Height"150cm"}"

Dimensions is shown as a string and not as JSON. Is there any way to parse Dimensions as JSON using Logstash filters?
My current conf file looks like this:
input {
  file {
    path => "C:/program files/logstash-6.5.1/data/raw.csv"
    start_position => "beginning"
    sincedb_path => "nul"    # Windows null device, so no sincedb state is kept
    codec => "json"
  }
}

filter {
  csv {
    separator => ","
    columns => ["Brand","Car","year","Dimensions"]
  }
  json {
    source => "Dimensions"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "dermapenexperimentjson"
    manage_template => false
  }
  stdout { codec => rubydebug }
}

Now this config runs, but Dimensions is still shown as a string in Elasticsearch, not as JSON.

I get the following error when I try to introduce a json filter for Dimensions in the config file:

:error=>#<LogStash::Json::ParserError: incompatible json object type=java.lang.String , only hash map or arrays are supported
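
A note on that error, based on how the json filter behaves rather than on anything stated above: it rejects input that decodes to a plain string instead of an object or array. If the Dimensions field ends up holding a JSON-encoded string, i.e. the braces are still wrapped in an outer pair of quotes with the inner quotes escaped, parsing yields a java.lang.String and produces exactly this error. Both values below are illustrative, not taken from the real data:

"{\"Length\":\"300cm\",\"Breadth\":\"180cm\",\"Height\":\"150cm\"}"   <- a JSON string: rejected
{"Length":"300cm","Breadth":"180cm","Height":"150cm"}                 <- a JSON object: accepted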

Can you post a proper example of your JSON? The one you posted above is not valid JSON.
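
For reference, a valid JSON value for that field would need quoted keys and the colon that is missing after Height, something like this:

{"Length":"300cm","Breadth":"180cm","Height":"150cm"}

If the column really contains the unquoted-key variant from the example above, one possible workaround is to quote the keys with mutate/gsub before handing the field to the json filter. This is only a sketch, untested against the real data; the gsub pattern and the target name are my own choices:

filter {
  csv {
    separator => ","
    columns => ["Brand","Car","year","Dimensions"]
  }
  # illustrative: wrap bare keys such as Length, Breadth, Height in quotes
  mutate {
    gsub => ["Dimensions", "([A-Za-z]+)\s*:", '"\1":']
  }
  json {
    source => "Dimensions"
    target => "Dimensions"    # put the parsed object back into the same field
  }
}

Note that this still assumes every key is followed by a colon, which the Height entry in the sample is missing.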

Hope this helps you.
