Logstash JSON parse error

Hi, how do I fix this parsing error in Logstash?

[ERROR] 2021-05-08 15:30:46.678 [[main]<file] json - JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Invalid FieldReference: 0x[]>, :data=>"{"method": "POST", "path": "/", "headers": {"host": "54.90.21.149", "connection": "keep-alive", "accept-encoding": "gzip, deflate", "accept": "/", "user-agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.129 Safari/537.36", "content-length": "20", "content-type": "application/x-www-form-urlencoded"}, "uuid": "9604dd46-0697-4ee8-8bc5-75748a87a918",
"peer": {"ip": "52.154.74.227", "port": 54584}, "status": 200, "post_data": {"0x": "androxgh0st"}, "cookies": {"sess_uuid": null}, "response_msg": {"version": "0.6.0", "response": {"message": {"detection": {"name": "unknown", "order": 0, "type": 1, "version": "0.6.0"}, "sess_uuid": "8202c9a0-1146-46c7-9459-4f1acb045f61"}}}, "timestamp": "2021-05-08T10:00:46.327611"}"}

Here is the configuration file that I'm using (tpotce/logstash.conf at master · telekom-security/tpotce · GitHub)

Logstash is trying to parse a JSON file containing the following data:

{"method": "POST", "path": "/", "headers": {"host": "4.90.21.149", "connection": "keep-alive", "accept-encoding": "gzip, deflate", "accept": "*/*", "user-agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.129 Safari/537.36", "content-length": "20", "content-type": "application/x-www-form-urlencoded"}, "uuid": "0a6d9708-5b53-4860-acef-26fb00dc60c4", "peer": {"ip": "19.213.44.208", "port": 55661}, "status": 200, "post_data": {"0x[]": "androxgh0st"}, "cookies": {"sess_uuid": null}, "response_msg": {"version": "0.6.0", "response": {"message": {"detection": {"name": "unknown", "order": 0, "type": 1, "version": "0.6.0"}, "sess_uuid": "ca5d5151-69a9-4f26-9511-7a15b97972be"}}}, "timestamp": "2021-05-10T05:44:48.617841"}
{"method": "GET", "path": "/.env", "headers": {"host": "4.90.21.149", "connection": "keep-alive", "accept-encoding": "gzip, deflate", "accept": "*/*", "user-agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.129 Safari/537.36"}, "uuid": "0a6d9708-5b53-4860-acef-26fb00dc60c4", "peer": {"ip": "19.213.44.208", "port": 63109}, "status": 200, "cookies": {"sess_uuid": null}, "response_msg": {"version": "0.6.0", "response": {"message": {"detection": {"name": "index", "order": 1, "type": 1, "version": "0.6.0"}, "sess_uuid": "ca5d5151-69a9-4f26-9511-7a15b97972be"}}}, "timestamp": "2021-05-10T05:44:50.330817"}

Thanks in advance.

This is a known issue. According to the last post in that thread, you can work around it by setting the target option (and then moving the fields back to the root with mutate).

Hi @Badger, can you please help me out a little on what I should do exactly?

input {
  file {
    path => ["/data/tanner/log/tanner_report.json"]
    codec => json
    type => "Tanner"
  }
}
filter {
  if [type] == "Tanner" {
    date {
      match => [ "timestamp", "ISO8601" ]
    }
    mutate {
      rename => {
        "[peer][ip]" => "src_ip"
        "[peer][port]" => "src_port"
      }
      add_field => {
        "dest_port" => "80"
      }
    }
  }
}
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    # With templates now being legacy and ILM in place we need to set the daily index with its template manually. Otherwise a new index might be created with different settings configured through Kibana.
    index => "logstash-%{+YYYY.MM.dd}"
    template => "/etc/logstash/tpot_es_template.json"
#    document_type => "doc"
  }
}
Do not use a codec. Use a json filter. Something like

json { source => "message" target => "someField" }
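
Putting that into the config above, a minimal sketch could look like the following (the target name parsed is just a placeholder, and the file path and the Tanner conditional are taken from your existing config):

input {
  file {
    path => ["/data/tanner/log/tanner_report.json"]
    # no json codec here; each line stays in the message field
    type => "Tanner"
  }
}
filter {
  if [type] == "Tanner" {
    # parse the JSON under a single target field, so keys such as "0x[]"
    # are never interpreted as field references at the event root
    json {
      source => "message"
      target => "parsed"
    }
  }
}

Note that with a target in place, the existing date and mutate filters would need to reference the fields under [parsed] (for example [parsed][peer][ip]), unless the fields are moved back to the root as described next.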

Then move the fields up to the root using something like this.
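
A sketch of that move, assuming the placeholder target name parsed from above. The ruby filter copies each top-level key to the event root and then drops the wrapper; nested objects such as post_data are set as whole hash values, so their inner keys (including "0x[]") are not parsed as field references:

filter {
  if [type] == "Tanner" {
    ruby {
      code => '
        parsed = event.get("parsed")
        if parsed.is_a?(Hash)
          # copy each top-level key to the event root
          parsed.each { |k, v| event.set(k, v) }
          # remove the now-redundant wrapper field
          event.remove("parsed")
        end
      '
    }
  }
}

After that, the date and mutate filters from your original config can keep referencing timestamp and [peer][ip] at the root as they do now.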
