We're trying to ingest nmon data recorded over a 24-hour timespan into Logstash. We first convert it to JSON using nmon2json (I understand that njmon is an option, but we are trying to use our existing nmon files for now), then move the file to the Logstash server.
Once we get to the point where the JSON file validates, we see messages like this:
[2022-07-13T13:44:54,986][ERROR][logstash.codecs.json ][nmon][nmon] JSON parse error, original data now in message field {:message=>"Unexpected close marker '}': expected ']' (for root starting at [Source: (String)"\t},"; line: 1, column: 0])\n at [Source: (String)"\t},"; line: 1, column: 3]", :exception=>LogStash::Json::ParserError, :data=>"\t},"}
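What that error suggests is that the json codec is handed one line at a time, so a pretty-printed JSON file produces fragments like "\t}," that can never parse on their own. One rough workaround (a sketch only; the function name and paths are mine, not from nmon2json) is to collapse each file to a single line before Logstash reads it:

```python
import json

def compact(src_path: str, dst_path: str) -> None:
    """Rewrite a pretty-printed JSON file as one single-line object."""
    with open(src_path) as f:
        doc = json.load(f)  # fails loudly if the file is not valid JSON
    with open(dst_path, "w") as f:
        f.write(json.dumps(doc, separators=(",", ":")) + "\n")
```

After that, every line in the output file is a complete JSON document, which is the shape the json codec expects.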
The conf file we have for Logstash is this (output section omitted):
input {
  file {
    codec => "json"
    path => ["/shared/NMON/*.json"]
    sincedb_path => "/dev/null"
    tags => ["nmon"]
    id => "nmon"
    mode => "read"
    start_position => "beginning"
    stat_interval => "2s"
    file_completed_action => "delete"
  }
}
filter {
  json {
    source => "message"
  }
}
I'm not sure whether I should be using both the codec and the filter. Another company we work with uses the configuration below, and it seems to work for them, but they also reprocess the generated JSON file first.
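On the codec-plus-filter question: with `codec => "json"` the event is already parsed on input, so a `json` filter on `message` is redundant; normally you pick one or the other. A sketch that sidesteps the per-line problem entirely, assuming nmon2json writes one pretty-printed object per file with the opening `{` at column 0, is to reassemble the file with a multiline codec and parse the result in the filter:

```
input {
  file {
    path => ["/shared/NMON/*.json"]
    sincedb_path => "/dev/null"
    tags => ["nmon"]
    mode => "read"
    start_position => "beginning"
    file_completed_action => "delete"
    codec => multiline {
      # Any line that does not start a new object is appended
      # to the previous event, rebuilding the full JSON document.
      pattern => "^\{"
      negate => true
      what => "previous"
      auto_flush_interval => 2
    }
  }
}

filter {
  json {
    source => "message"
  }
}
```

This is only a sketch against that assumed file layout; if nmon2json emits a different shape (e.g. an array, or several objects per file), the pattern would need adjusting.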
input {
  file {
    codec => "json"
    path => ["/shared*"]
    sincedb_path => "/dev/null"
    tags => ["nmon"]
    id => "nmon"
    mode => "read"
    start_position => "beginning"
    stat_interval => "2s"
    file_completed_action => "delete"
  }
}
filter {
  date {
    match => [ "timestamp", "HH:mm:ss'T'dd-MMM-yyyy" ]
  }
}
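As a sanity check on that date pattern (the sample value below is assumed, not taken from a real nmon file), the Joda-style `HH:mm:ss'T'dd-MMM-yyyy` corresponds to this strptime format:

```python
from datetime import datetime

# Assumed sample value; real ones come from the nmon2json "timestamp" field.
stamp = "13:44:54T13-Jul-2022"

# Python equivalent of the date filter's HH:mm:ss'T'dd-MMM-yyyy pattern
parsed = datetime.strptime(stamp, "%H:%M:%ST%d-%b-%Y")
print(parsed.isoformat())  # → 2022-07-13T13:44:54
```

So the filter only matches if the field really is time-first with a literal `T` before the day; a value in any other order would fail to parse.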
I've looked through past topics as well, and I don't really see a defined solution for nmon2json. Can anyone help?