Parse JSON data with Logstash

Hi all,
I'm a newbie to Logstash and I'm trying to parse a JSON data file like the following:

{
    "A001": {
        "X": 503744.7,
        "Y": 4726339.0,
        "Z": 458.84,
        "LON": -2.954286956913572,
        "LAT": 42.68952475979137,
        "dates": [
            "2015-01-01",
            "2015-01-02",
            "2015-01-03",
            "2015-01-04",
            "2015-01-05",
            "2015-01-06"            
        ],
        "values": [
            "56.9",
            "49.7",
            "48.1",
            "37.1",
            "34.4",
            "35.9"         
        ]
    },
    "A002": {
        "X": 607870.5,
        "Y": 4670754.0,
        "Z": 264.83,
        "LON": -1.69378623727067,
        "LAT": 42.18149989583031,
        "dates": [
            "2015-01-01",
            "2015-01-02",
            "2015-01-03",
            "2015-01-04"          
        ],
        "values": [
            "287",
            "231",
            "207",
            "191"
        ]
    },
    "A403": {
        "X": 868708.0,
        "Y": 4709148.0,
        "Z": 849.0,
        "LON": 1.483146867002623,
        "LAT": 42.44694604132231,
        "dates": [
            "2015-01-01",
            "2015-01-02",
            "2015-01-03",
            "2015-01-04",
            "2015-01-05",
            "2015-01-06",
            "2015-01-07",
            "2015-01-08",
            "2015-01-09"            
        ],
        "values": [
            "2.296",
            "7.033",
            "2.298",
            "2.275",
            "7.207",
            "5.456",
            "4.794",
            "4.24",
            "4.748"
        ]
    }
}
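(Side note for anyone reading along: a file like this is one pretty-printed JSON document spanning many lines, not one JSON object per line, which matters for line-oriented codecs such as json_lines. One workaround is to pre-convert it to NDJSON, i.e. one compact object per line. A minimal sketch in Python; the to_ndjson helper and the "id" field name are illustrative, not part of the original thread:)

```python
import json

def to_ndjson(pretty_json: str) -> str:
    """Flatten a pretty-printed JSON document keyed by station id
    into NDJSON: one compact JSON object per line, with the original
    key preserved in a hypothetical "id" field."""
    data = json.loads(pretty_json)
    lines = [
        json.dumps({"id": station_id, **fields})
        for station_id, fields in data.items()
    ]
    return "\n".join(lines)

sample = '{"A001": {"X": 503744.7, "values": ["56.9"]}}'
print(to_ndjson(sample))
```

Each emitted line is then an independent, complete JSON document that a line-by-line reader can parse on its own.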

and I'm using the following .conf file:

input {
        file {
                type => "json"
                start_position => "beginning"
                path => "/etc/logstash/json-data.json"
                sincedb_path => "/dev/null" #to force to parse from the beginning
                codec => "json_lines"
        }
}

filter {
        json {
                source => "message"
        }
}

output {

        elasticsearch {
                hosts => [ "https://ip:9200" ]
                user => "elastic"
                password => "mypassword"
                ssl_certificate_verification => false
                index => "demo-json"
                document_type => "json"
        }

        stdout { codec => rubydebug }
}
but I'm getting this error:

 JSON parse error, original data now in message field {:message=>"incompatible json object type=java.lang.String , only hash map or arrays are supported", 

Could you please tell me what I'm doing wrong?

thanks in advance

What is the full error message? Why are you using a json filter if you have already parsed the JSON using a json_lines codec?

Hi @Badger
this is the error I got in /var/log/logstash/logstash-plain.log

[WARN ][logstash.filters.json    ][main][798c76c6b7564c05f1f7e4bd3babb64bebdec2ed56d7e2d2a44c15d6f4372356] Error parsing json {:source=>"message", :raw=>"        \"Z\": 849.0,", :exception=>#<LogStash::Json::ParserError: Unexpected character (':' (code 58)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
 at [Source: (byte[])"        "Z": 849.0,"; line: 1, column: 13]>}
[2022-02-21T16:45:57,504][WARN ][logstash.filters.json    ][main][798c76c6b7564c05f1f7e4bd3babb64bebdec2ed56d7e2d2a44c15d6f4372356] Error parsing json {:source=>"message", :raw=>"        \"LON\": 1.483146867002623,", :exception=>#<LogStash::Json::ParserError: Unexpected character (':' (code 58)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
 at [Source: (byte[])"        "LON": 1.483146867002623,"; line: 1, column: 15]>}
[2022-02-21T16:45:57,504][WARN ][logstash.filters.json    ][main][798c76c6b7564c05f1f7e4bd3babb64bebdec2ed56d7e2d2a44c15d6f4372356] Error parsing json {:source=>"message", :raw=>"    }", :exception=>#<LogStash::Json::ParserError: Unexpected close marker '}': expected ']' (for root starting at [Source: (byte[])"    }"; line: 1, column: 0])
 at [Source: (byte[])"    }"; line: 1, column: 6]>}
[2022-02-21T16:45:57,505][WARN ][logstash.filters.json    ][main][798c76c6b7564c05f1f7e4bd3babb64bebdec2ed56d7e2d2a44c15d6f4372356] Error parsing json {:source=>"message", :raw=>"        \"LAT\": 42.44694604132231,", :exception=>#<LogStash::Json::ParserError: Unexpected character (':' (code 58)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
 at [Source: (byte[])"        "LAT": 42.44694604132231,"; line: 1, column: 15]>}
[2022-02-21T16:45:57,505][WARN ][logstash.filters.json    ][main][798c76c6b7564c05f1f7e4bd3babb64bebdec2ed56d7e2d2a44c15d6f4372356] Error parsing json {:source=>"message", :raw=>"            \"4.24\",", :exception=>#<LogStash::Json::ParserError: Unexpected character (',' (code 44)): expected a value
 at [Source: (byte[])"            "4.24","; line: 1, column: 20]>}

About the json filter, I probably misunderstood the documentation.

How should I arrange the conf file?

thanks

See this thread.
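(For readers who can't follow the link: the usual approach from that thread is to make the file input consume the whole multi-line file as a single event, using a multiline codec whose pattern never matches, and then parse the combined message with a json filter. A sketch, assuming the same file path as above; "^Spalanzani" is just an arbitrary pattern that will never match, so every line gets appended to the previous one until auto_flush_interval flushes the event:)

```
input {
        file {
                path => "/etc/logstash/json-data.json"
                start_position => "beginning"
                sincedb_path => "/dev/null"
                codec => multiline {
                        pattern => "^Spalanzani"
                        negate => true
                        what => "previous"
                        auto_flush_interval => 1
                        multiline_tag => ""
                }
        }
}

filter {
        json {
                source => "message"
        }
}
```

With this arrangement the json filter receives the complete document in one message, instead of the single-line fragments that caused the parse errors above.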

Dear @Badger
following your suggestion I'm now able to parse the JSON file correctly. May I ask why the order of the fields is shuffled with respect to the input file? For example, the field LON is at the bottom and the field Z is in the middle.

"@timestamp" => 2022-02-23T09:23:05.369615Z,
          "A403" => {
             "X" => 868708.0,
             "Y" => 4709148.0,
           "LAT" => 42.44694604132231,
        "values" => [
            [0] "2.296",
            [1] "7.033",
            [2] "2.298",
            [3] "2.275",
            [4] "7.207",
            [5] "5.456",
            [6] "4.794",
            [7] "4.24",
            [8] "4.748"
        ],
             "Z" => 849.0,
         "dates" => [
            [0] "2015-01-01",
            [1] "2015-01-02",
            [2] "2015-01-03",
            [3] "2015-01-04",
            [4] "2015-01-05",
            [5] "2015-01-06",
            [6] "2015-01-07",
            [7] "2015-01-08",
            [8] "2015-01-09"
        ],
           "LON" => 1.483146867002623
    },

thanks

Why do you care about the order of the fields on the event?

@Badger I was actually worried that something was still wrong, that's why I was asking :sweat_smile:. If the order does not matter for possible next steps, then please disregard my question.

I do not believe that order matters.
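(A quick way to convince yourself, as a sketch in Python: per the JSON specification, objects are unordered collections of name/value pairs, so the same pairs parsed in a different order yield equal objects.)

```python
import json

# Two JSON documents with identical pairs in different order
# parse to equal objects: key order carries no meaning.
a = json.loads('{"X": 868708.0, "Z": 849.0, "LON": 1.483146867002623}')
b = json.loads('{"LON": 1.483146867002623, "X": 868708.0, "Z": 849.0}')
print(a == b)  # → True
```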


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.