Hello, I'm looking for assistance with shipping .json logs to Elasticsearch using Logstash.
The tricky part is that each .json file contains a single line holding one valid JSON document, with no trailing newline, and the file is being ignored by Logstash.
Example of .json log content:
{"playerName":"Medico","logSource":"Bprint","location":[12.505,29.147]}
Config file for Logstash:
input {
  file {
    path => "C:/logs/*.json"
    start_position => "beginning"
    sincedb_path => "NUL"    # don't persist read positions (Windows)
  }
}
filter {
  mutate {
    # append a newline after the closing ]} of the JSON document
    # (a literal line break in the config, since escape sequences
    # are not interpreted by default)
    gsub => [ "message", "\]\}", "]}
" ]
  }
  split {
    # split message on the inserted newline into separate events
    field => "message"
  }
  json {
    source => "message"
    remove_field => ["message"]   # drop the raw string once parsed
  }
  mutate {
    remove_field => ["host", "@version", "type"]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    manage_template => false
    index => "map"
  }
  stdout { codec => rubydebug }
}
As you can see, my approach was to treat the .json input as plain text, use a gsub mutation to append a newline to the end of the raw string, and then parse the result as JSON.
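In case the line break inside the gsub replacement above looks like a typo: it is intentional, because Logstash does not interpret escape sequences in config strings by default. A minimal sketch of the escaped variant, assuming config.support_escapes: true is set in logstash.yml:

filter {
  mutate {
    # "\n" is only interpreted as a newline when config.support_escapes is true
    gsub => [ "message", "\]\}", "]}\n" ]
  }
}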
The reason for this approach is that if I manually modify the generated .json log file by adding a newline at the end (pressing the Enter key) and saving, Logstash parses the data and sends it to Elasticsearch as expected (no gsub mutation is required in that case).
Also, I was inspired by this topic.
But the approach does not work. I've tried multiple other approaches (the multiline, json_lines, and json codecs) and different gsub variations, all with no success; one of the codec attempts is sketched at the end of this post. As long as the .json file consists of a single line with no trailing newline, Logstash never picks it up. Looking for some support here. Thanks in advance!
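For reference, here is roughly what one of the codec attempts looked like; this is a minimal sketch only, reusing the path and index name from the config above:

input {
  file {
    path => "C:/logs/*.json"
    start_position => "beginning"
    sincedb_path => "NUL"
    codec => "json"   # parse each line of the file as one JSON document
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "map"
  }
  stdout { codec => rubydebug }
}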