Parse single-line json with logstash

Hello, I'm looking for assistance with passing .json logs to Elasticsearch using Logstash.
The tricky part is that the .json file contains a single line of valid JSON and is being ignored by Logstash.

Example of .json log content:

{"playerName":"Medico","logSource":"Bprint","location":[12.505,29.147]}

Config file for Logstash:

input {
  file {
    path => "C:/logs/*.json"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}

filter {
  mutate {
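    # intended to replace the closing "]}" with "]}" followed by a literal newline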
    gsub => [ "message", "\]\}", "]}
    " ]
  }

  split {
    field => "message"
  }

  json {
    source => "message"
    remove_field => ["{message}"]
  }
  
  mutate {
    remove_field => ["message", "host", "@version", "type"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"] 
    manage_template => false
    index => "map"                   
  }
  stdout { codec => rubydebug }
}

As you can see, my approach was to treat the .json input as plain text, use gsub in a mutate filter to append a newline to the end of the raw string, and then parse it as JSON.

The reason for this approach is that if I manually modify the created .json log file by adding a newline (pressing the Enter key) and save it, Logstash parses the data and sends it to Elasticsearch as expected (no gsub mutation is required in that case).

Also, I was inspired by this topic

But the approach does not work. I've tried multiple other approaches (the multiline, json_lines and json codecs, for example) and different gsub variations with no success; a rough sketch of the json codec variant is included below for reference. As long as the .json file is a single line, Logstash won't process it. Looking for some support here. Thanks in advance!
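
The json codec variant, roughly (only the codec line differs from the config above; it fails the same way, since the file input never sees a newline):

input {
  file {
    path => "C:/logs/*.json"
    start_position => "beginning"
    sincedb_path => "NUL"
    codec => json
  }
}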

This should work. The file input only emits an event once it has read its delimiter (a newline by default), which is why your single line with no trailing newline is never picked up; using ]} as the delimiter turns the end of the JSON object into the line terminator, and the mutate below restores the stripped ]} before the json filter parses the message.

input {
  file {
   path => "C:/logs/*.json"
   start_position => "beginning"
   sincedb_path => "NUL"
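   # emit an event whenever "]}" is read, instead of waiting for a newline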
   delimiter => "]}"
  }
}

filter {

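  # the delimiter strips the trailing "]}" from the event, so put it back before parsing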
  mutate { update => { "message" => "%{message}]}" } }

  if [message] =~ /^\r\n/ {
    mutate { gsub => [ "message", "\r\n", "" ] }
  }
  
  json {
    source => "message"
  }

  #mutate{   remove_field => [ "log", "host", "event", "@version", "@timestamp" ] }

}

output {
  stdout {}
}

You can also use } as the delimiter if your JSON doesn't have nested objects.

Thanks a lot for the quick reply, Rios!
I made it work thanks to your suggestion. The config file now looks as follows:

input {
  file {
    path => "C:/logs/*"
    start_position => "beginning"
    sincedb_path => "NUL"
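    # read the log files as UTF-16LE encoded text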
    codec => plain {
      charset => "UTF-16LE"
    }
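    # split on every closing brace; safe here because the JSON has no nested objects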
    delimiter => "}"
  }
}

filter {
  mutate { update => { "message" => "%{message}}" } }

  if [message] =~ /^\r\n/ {
    mutate { gsub => [ "message", "\r\n", "}" ] }
  }

  split {
    field => "message"
  }

  json {
    source => "message"
    remove_field => ["{message}"]
  }
  
  mutate {
    remove_field => ["message", "host", "@version", "type"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"] 
    manage_template => false
    index => "map"                   
  }
  stdout { codec => rubydebug }
}

My json logs won't have nested objects, so I decided not to include ] in the delimiter.

Cheers!

You are welcome.

Long live the king and the Elastic team.