I have an appliance that sends JSON formatted logs over a TCP port. The "input" section in my Logstash pipeline configuration file looks like:
input {
  tcp {
    host => "192.168.10.93"
    port => 5501
    codec => json
    id => "my_appliance"
  }
}
This has worked correctly since Logstash v6.1, providing the field mappings my "filter" section relies on. Now, after upgrading from 7.1.1 to 7.2.0, the JSON parsing has gone haywire. Sometimes a single event gets only one field mapped to a value (instead of the usual multiple fields). Most of the time, large blocks of the JSON stream end up in "message" with an associated "_jsonparsefailure" tag.
I downgraded my distribution back to 7.1.1 and everything works fine again.
I'm looking for a path forward. To debug, I'm dumping the TCP JSON stream to a file so I can replay it through a standalone Logstash v7.2.0 offline, using the following configuration:
input {
  tcp {
    host => "192.168.10.93"
    port => 5501
    codec => plain
    id => "my_appliance"
  }
}
filter {
}
output {
  file {
    path => "/var/tmp/myappliance_dbg.log"
    codec => plain
  }
}
I briefly switch to this configuration to capture data for debugging, then restart with my original one. This produces a non-delimited text file with JSON events that look reasonable:
2019-07-05T22:39:02.090Z 192.168.10.93 {"event":"run","_system_name":"localhost","_write_ts":"2019-07-05T22:37:57.519314Z","uid":"CN9472U34"}
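As an offline sanity check (separate from the pipeline itself), a small script like the following, assuming every captured line follows the `timestamp host json` layout shown above, can confirm that each payload is still valid JSON before I blame Logstash's parser. The function name and the three-way split are my own illustration, not anything Logstash-specific:

```python
import json

def parse_captured_line(line):
    # Each captured line is expected to look like:
    #   <ISO8601 timestamp> <host> <JSON payload>
    # Split off the first two space-separated fields and
    # parse the remainder as JSON.
    timestamp, host, payload = line.rstrip("\n").split(" ", 2)
    return timestamp, host, json.loads(payload)

sample = ('2019-07-05T22:39:02.090Z 192.168.10.93 '
          '{"event":"run","_system_name":"localhost",'
          '"_write_ts":"2019-07-05T22:37:57.519314Z","uid":"CN9472U34"}')
ts, host, doc = parse_captured_line(sample)
print(ts, host, doc["event"], doc["uid"])
```

Running this over the whole capture file would flag any line whose JSON payload is truncated or malformed.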
Now I want to read the saved JSON stream back using my original configuration, replacing the TCP input shown above with a file input:
input {
  file {
    path => "/var/tmp/myappliance_dbg.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
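For what it's worth, one replay variant I've been considering (an untested sketch; the `capture_ts`, `capture_host`, and `json_payload` field names are placeholders I made up) would strip the timestamp/host prefix that the plain output codec prepended, and only then parse the JSON:

```
input {
  file {
    path => "/var/tmp/myappliance_dbg.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  # Split off the "timestamp host" prefix added during capture.
  dissect {
    mapping => { "message" => "%{capture_ts} %{capture_host} %{json_payload}" }
  }
  # Parse the remaining JSON payload into event fields.
  json {
    source => "json_payload"
  }
}
```

I haven't verified this reproduces the fields my original `codec => json` input produced, so corrections are welcome.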
I get no output from this configuration whatsoever. The Logstash logs show the pipeline starting and running normally; there is simply no output.
Can anyone help, either with the problem upgrading from 7.1.1 to 7.2.0 or with my debugging procedure?