Logstash parsing json

I am trying to use Logstash to read the input file 1.log in JSON format and write the events to Elasticsearch. This is my log file:

{"key":"value00"}
{"key":"value01"}
{"key1":[{"key2":"value02"},{"key3":"value03"},{"key4":[{"key5":"value 04"}]}]}

and this is my configuration file:

input {
  file {
    type => "json"
    path => "/logstash/1.log"
  }
}
filter{
  json {
    source => "message"
    remove_field => ["message"]
  }
}
output {
    elasticsearch {
        hosts => ["192.168.1.6:9200"]
        user => "elastic"
        password => "something"
    }
}

The Logstash behaviour is completely random. Sometimes it works correctly, but sometimes it returns the following error for the same input structure:

Error parsing json {:source=>"message", :raw=>"4\"}]}]}", :exception=>#<LogStash::Json::ParserError: Unexpected character ('"' (code 34)): Expected space separating root-level values

Hi,

Which version of Logstash are you running?

The type field is deprecated; you should remove it from the input.

Also, can you provide a sample of your JSON-formatted data?

Hi,
Thanks for your response. I'm using Logstash version 7.11.0.
Also, removing the type field causes the following warnings, and the output in Elasticsearch looks like the figure.

[WARN ] 2021-02-23 13:32:08.679 [[main]>worker2] json - Error parsing json {:source=>"message", :raw=>"22\":\"value01\"}", :exception=>#<LogStash::Json::ParserError: Unexpected character ('"' (code 34)): Expected space separating root-level values
 at [Source: (byte[])"22":"value01"}"; line: 1, column: 4]>}
[WARN ] 2021-02-23 13:32:08.703 [[main]>worker1] elasticsearch - Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x433f6176>], :response=>{"index"=>{"_index"=>"logstash-2021.02.16-000001", "_type"=>"_doc", "_id"=>"4wdWzncBWvA1N8Dsw_u8", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [key1] of type [text] in document with id '4wdWzncBWvA1N8Dsw_u8'. Preview of field's value: '{key2=value02}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:64"}}}}}

If by JSON-formatted data you mean the correct output in Elasticsearch, it looks like this for something like the last JSON object in the input data:

{
  "_index": "logstash-2021.02.16-000001",
  "_type": "_doc",
  "_id": "Ccx7yHcBz9PWCXgpyOE8",
  "_version": 1,
  "_score": null,
  "_source": {
    "ali222": [
      {
        "ere1231za": "123123michel"
      },
      {
        "ahm123123ad": "ali123123"
      },
      {
        "reza": [
          {
            "ali": "asdas22d"
          }
        ]
      }
    ],
    "host": "blue",
    "type": "json",
    "path": "/home/ali/logstash/1.log",
    "@version": "1",
    "@timestamp": "2021-02-22T06:44:51.259Z"
  },
  "fields": {
    "@timestamp": [
      "2021-02-22T06:44:51.259Z"
    ]
  },
  "sort": [
    1613976291259
  ]
}

I really appreciate your help.

I meant the input data contained in the file 1.log. I'm afraid it's not valid JSON data; I was asking for a log line from this file.

As seen in the screenshot, it looks like the field "message" is not valid JSON.
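One way to verify this before blaming Logstash is to check that every line of the file parses on its own. A minimal sketch (the helper name is hypothetical, not part of any Logstash tooling):

```python
import json

def find_invalid_json_lines(path):
    """Return (line_number, error_message) pairs for lines that fail to parse as JSON."""
    bad = []
    with open(path, encoding="utf-8") as f:
        for lineno, line in enumerate(f, start=1):
            line = line.strip()
            if not line:
                continue  # skip blank lines
            try:
                json.loads(line)
            except json.JSONDecodeError as exc:
                bad.append((lineno, str(exc)))
    return bad
```

Running `find_invalid_json_lines("/logstash/1.log")` on the path from the config should print nothing for a healthy file; any reported line is one the json filter will also choke on.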

This is the entire contents of 1.log.

Oh, so here you have your error: the values shown above do not represent valid JSON.

Did you build it yourself?


Yes, I wrote a Python script to build this file, but as I said before, it behaves randomly; sometimes it works and sometimes it doesn't.

You're right: fixing the Python script so it produces valid JSON solved the problem. Thank you.
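For anyone hitting the same issue, a minimal sketch of what such a script can look like: serialize each object with json.dumps and write exactly one complete object per line, rather than assembling the JSON strings by hand (the objects below are the samples from this thread):

```python
import json

# Sample events from the thread; any JSON-serializable dicts work the same way.
events = [
    {"key": "value00"},
    {"key": "value01"},
    {"key1": [{"key2": "value02"}, {"key3": "value03"},
              {"key4": [{"key5": "value 04"}]}]},
]

# Write newline-delimited JSON (one complete object per line).
# json.dumps guarantees each line is valid JSON on its own, which is
# what Logstash's json filter expects from a line-oriented file input.
with open("1.log", "w", encoding="utf-8") as f:
    for event in events:
        f.write(json.dumps(event) + "\n")
```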

Glad to help