JSON filter swallowing events with no errors

Hello,

I'm doing some testing on ingesting JSON data and having trouble with the JSON filter. My pipeline configuration is below (the output block has been modified to remove credentials and IPs).

input {
  http {}
}

filter {
  json {
    source => "message"
    remove_field => ["message"]
  }
  mutate {
    remove_field => ["headers"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => ["test-index"]
    user => "user"
    password => "password"
  }
}

The issue I'm having is that if I comment out the JSON filter (so input, mutate, and output only) it works fine. I get events in Elastic and they look good, apart from being a giant block of text in the message field. With the JSON filter uncommented I get nothing: no errors in Logstash, no errors in Elasticsearch, and no events in Elastic. Pipeline Stack Monitoring in Kibana seems to show events going through the pipeline, but nothing ever reaches the index.
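For reference, a temporary stdout output alongside elasticsearch is one way to see exactly what leaves the filter stage; a minimal sketch, with the elasticsearch block unchanged from above:

output {
  # temporary debugging output: prints each event as it leaves the filter stage,
  # so "dropped inside Logstash" can be told apart from "rejected by Elasticsearch"
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "test-index"
    user => "user"
    password => "password"
  }
}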

If I comment out the JSON filter again and re-send the data, it works again.

edit: I also found out that if I specify a target in the JSON filter, it works.
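The working version looks roughly like this (the target name json_body is just an example, not from my original config):

filter {
  json {
    source => "message"
    # parsed keys land under [json_body] instead of at the top level,
    # so they cannot collide with existing top-level index mappings
    target => "json_body"
    remove_field => ["message"]
  }
  mutate {
    remove_field => ["headers"]
  }
}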

That suggests to me that you are getting mapping exceptions for some of the fields that the json filter parses. The elasticsearch output logs those. What does Logstash log?
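If nothing turns up in the logs, the dead letter queue is another way to catch events the elasticsearch output could not index; a minimal sketch of a pipeline that reads it back, assuming dead_letter_queue.enable: true is set in logstash.yml and the default data path (adjust to your install):

input {
  dead_letter_queue {
    # path to the DLQ directory created by the running Logstash instance
    # (this path is an assumption; match it to path.data on your system)
    path => "/usr/share/logstash/data/dead_letter_queue"
    pipeline_id => "main"
    commit_offsets => true
  }
}

output {
  # print each rejected event; the failure reason is carried in [@metadata]
  stdout {
    codec => rubydebug { metadata => true }
  }
}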

I watched both the Logstash log and the Elasticsearch log (or at least I thought I did), specifically looking for mapping errors, and didn't see any. I just removed the index template entirely (which only had mappings in it) and it worked, so apparently I missed something somewhere. Thank you!
