New json filter crashing Logstash

We are filtering sflow data with Logstash using the following filter:
  filter {
    json {
      source => "message"
      type => "json"
    }
  }
This works fine. However, it seems that syntax is deprecated.

But if we use the supported 1.5 json filter syntax:

  filter {
    if [type] == "sflow" {
      json {
        source => "message"
      }
    }
  }

Logstash crashes immediately, or after a few seconds, without any error in the logs.
Are we doing something wrong?

That looks totally fine to me. Please increase the logging verbosity with --verbose or --debug.
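
For reference, the invocation might look something like this (the paths and the agent subcommand are assumptions about a typical 1.5 install; adjust to yours):

  # --debug is very chatty, so send the log somewhere with disk space.
  bin/logstash agent -f /etc/logstash/conf.d/ --debug -l /var/log/logstash/logstash-debug.log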

--verbose gives nothing but "adding pattern" messages with the list of patterns used.
With --debug we can see all the flows in the log file, but no sflow log is recorded. All the received logs are registered in the log file, but nothing seems to be output to elasticsearch.

There is no error in the logs.
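
One low-risk way to check whether events make it through the filter at all is to temporarily add a stdout output next to the elasticsearch one. A minimal sketch (your real output block will differ):

  output {
    # Temporary debugging aid: print every event that reaches the
    # output stage. If events show up here but never in Elasticsearch,
    # the problem is in the elasticsearch output, not the json filter.
    stdout { codec => rubydebug }
  }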

So... the crash is gone?

Huuu, nope... it still crashes, but without any errors; that's the weird thing. The logs fill up with received and parsed logs and sflow, but nothing is actually output to elasticsearch.
Once this occurs, most of the time only kill -9 will stop Logstash.
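
Before resorting to kill -9, a JVM thread dump can show where the pipeline is stuck. A rough sketch, assuming a single Logstash JVM on the box and a JDK that provides jstack:

  # Grab the Logstash pid and dump its threads for inspection.
  jstack "$(pgrep -f logstash)" > /tmp/logstash-threads.txt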

I wish I could see something of interest in the log, but... :frowning:

What is the volume of logs you are trying to process?

We ran into similar issues with Logstash, and it ended up being a memory consumption problem. Take a look at the memory consumption of the machine and of the Logstash process; when it dies from running out of memory, there are no errors in the log files and it just hangs.
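
A quick way to verify this is to watch the resident memory of the Logstash JVM and, if it keeps climbing, raise the heap. A sketch, assuming a 1.x install whose startup script reads LS_HEAP_SIZE:

  # Watch the Logstash process's memory (assumes a single logstash JVM).
  top -p "$(pgrep -f logstash)"

  # Logstash 1.x passes LS_HEAP_SIZE to the JVM as -Xmx at startup.
  LS_HEAP_SIZE=4g bin/logstash agent -f /etc/logstash/conf.d/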

-- Asaf.

Thanks, I'll look into that, but this server has 176 GB of RAM, with 28 GB allocated to Logstash itself and 32 GB to Elasticsearch.
I don't think that's the issue.

Did you ever resolve this issue? I'm seeing the exact same thing, with no errors in the log. My filter:

  filter {
    if [types] == "platform_json" {
      json {
        source => "message"
      }
    }
  }