JSON parse failure using Logstash

Hi All,

I'm trying to ingest a JSON file using Logstash and I'm getting a JSON parse failure message. Can someone guide me on where I'm making a mistake?

My JSON file

[
  {
    "Org": "ogr1",
    "date_last": "2020-01-14 02:02:06",
    "did": 3100,
    "emessage": "snmpInTotalReqVars : Value is high",
    "eseverity": 3,
    "esource": 4,
    "event_id": 242913,
    "ext_ticket_ref": "INC0601526",
    "hostname": "server1.yourdomain.com",
    "ip": "12.2.0.6",
    "notify_count": 3,
    "roa_id": 2,
    "user_ack": 26
  },
  {
    "Org": "ogr1",
    "date_last": "2020-02-24 03:21:04",
    "did": 3100,
    "emessage": "snmpInTotalReqVars : Value is high",
    "eseverity": 3,
    "esource": 4,
    "event_id": 476,
    "ext_ticket_ref": "INC0926941",
    "hostname": "server1.yourdomain.com",
    "ip": "12.2.0.6",
    "notify_count": 1,
    "roa_id": 2,
    "user_ack": 0
  },
  {
    "Org": "ogr2",
    "date_last": "2020-03-10 02:06:04",
    "did": 3100,
    "emessage": "snmpInTotalReqVars : Value is high",
    "eseverity": 3,
    "esource": 4,
    "event_id": 476,
    "ext_ticket_ref": "INC0926941",
    "hostname": "server2.yourdomain.com",
    "ip": "12.2.0.2",
    "notify_count": 1,
    "roa_id": 2,
    "user_ack": 0
  },
  {
    "Org": "ogr2",
    "date_last": "2020-03-11 02:06:05",
    "did": 3100,
    "emessage": "snmpInTotalReqVars : Value is high",
    "eseverity": 3,
    "esource": 4,
    "event_id": 476,
    "ext_ticket_ref": "INC0926941",
    "hostname": "server2.yourdomain.com",
    "ip": "12.2.0.2",
    "notify_count": 1,
    "roa_id": 2,
    "user_ack": 0
  }
]

My config file

input {
  file {
    path => "/opt/curl_output/data.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => "json"
  }
}
filter {
  json {
    source => "message"
    target => "parsed"
  }
}
output {
  elasticsearch {
    hosts => ["https://elastic.co:9243"]
    user => "*****"
    password => "*****"
    index => "perf-test"
  }
#  stdout { codec => rubydebug }
}

Below is one of the lines from the error message:

[2020-04-27T18:59:44,929][ERROR][logstash.codecs.json ][main] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: incompatible json object type=java.lang.String , only hash map or arrays are supported>, :data=>" \"did\": \"4333\","}

Not sure what mistake I'm making; the error message appears for every single line of my JSON.

Any ideas and suggestions would be a great help.

Thanks
Gautham

You need to use a multiline codec instead of a json codec. See here for an example.
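For example, a minimal sketch of that approach (the pattern string is arbitrary and just needs to never match a real line; the auto_flush_interval and max_lines values are illustrative):

input {
  file {
    path => "/opt/curl_output/data.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    # No line ever matches this pattern, so every line is appended to the
    # previous one and the whole file accumulates into a single event.
    codec => multiline {
      pattern => "^THIS_WILL_NEVER_MATCH"
      negate => true
      what => "previous"
      # Flush the accumulated event once no new lines arrive for 2 seconds.
      auto_flush_interval => 2
      # Raise the default line limit so a large file is not truncated.
      max_lines => 10000
    }
  }
}

With negate => true and what => "previous", every line that does not match the pattern is joined to the previous line, and the event is only emitted after auto_flush_interval seconds without new input.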

Thank you @Badger, it partially worked.

One thing that is not happening is the split: all the data is getting stored in the "message" field, and when I use a split filter it breaks down every word in my JSON.

 "path" => "/opt/curl_output/10events.json",
          "host" => "Logstash",
      "@version" => "1",
    "@timestamp" => 2020-04-28T06:28:24.114Z,
       "message" => "    \"event_id\": 476,"
}
{
          "path" => "/opt/curl_output/10events.json",
          "host" => "Logstash",
      "@version" => "1",
    "@timestamp" => 2020-04-28T06:28:24.114Z,
       "message" => "    \"ext_ticket_ref\": \"INC0926941\","
}
{
          "path" => "/opt/curl_output/10events.json",
          "host" => "Logstash",
      "@version" => "1",
    "@timestamp" => 2020-04-28T06:28:24.114Z,
       "message" => "    \"hostname\": \"server1.yourdomain.com\","
}
{
          "path" => "/opt/curl_output/10events.json",
          "host" => "DDP-PROD-Logstash",
      "@version" => "1",
    "@timestamp" => 2020-04-28T06:28:24.114Z,
       "message" => "    \"ip\": \"12.2.0.2\","

Thanks
Gautham

The split has to be after the json filter.
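Roughly like this, keeping the target => "parsed" field from your earlier json filter so the split filter has an array field to work on (the field name is just carried over from that config):

filter {
  json {
    source => "message"
    target => "parsed"
  }
  # The json filter puts the decoded array under [parsed];
  # split then emits one event per array element.
  split {
    field => "parsed"
  }
}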

It worked, @Badger, thank you very much.

Thanks
Gautham

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.