I'm pushing a few IIS logs to Logstash and some of the "message" fields do not get parsed. I've tested the log lines from the "message" field against the pattern in the default.json ingest pipeline and they parsed correctly, but Kibana is just showing the whole log line in a single "message" field. Strangely, other lines get parsed correctly and I can see all the fields broken down. Any thoughts on what may be happening?
Are you using the IIS module in filebeat? I am asking this because all the parsing is done in an ingest pipeline installed on Elasticsearch and since you are running Logstash in the middle you might not send the events to the ingest pipeline.
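For context, when Logstash does sit between Filebeat and Elasticsearch, the Logstash elasticsearch output has to forward events to the module's ingest pipeline explicitly. A sketch of that output section, assuming a local Elasticsearch (the host and index pattern are placeholders):

```
output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    # Filebeat sets @metadata.pipeline when a module is enabled;
    # without this option the events skip the ingest pipeline entirely.
    pipeline => "%{[@metadata][pipeline]}"
  }
}
```

Without the `pipeline` option, events land in Elasticsearch unparsed, which matches the symptom described.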
@pierhugues I'm using only Filebeat to directly send events to Elasticsearch with just the IIS module enabled. The lines it's processing are coming from the same IIS log file, all the IIS log fields are enabled. Everything is a default install.
Here's an example of a log line that fails to parse (edited to remove my server info). It parses correctly in the Grok test parser, yet it arrives with everything still in the "message" field:
Not sure if this makes a difference, but there are hundreds of similar lines differing only in the date/time and the last two fields. I would have thought that since every line has a different timestamp, they should all still be parsed correctly.
That's the weird part. I have identical lines where some are parsed and some aren't. Here are two screenshots of lines that are literally next to each other in Kibana, along with the Filebeat config. There are no errors in the Filebeat log, just two INFO lines.
The events from the manually defined prospector don't go through the ingest pipeline, so the fields are not extracted. When you say some events are parsed and some are not, the unparsed ones are in fact duplicates that take a different flow inside Filebeat.
Removing the following lines from your configuration should fix your problem.
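The lines in question would be a manually defined log prospector covering the same files as the IIS module, something like this (a sketch; the path is a placeholder for your actual IIS log directory):

```
filebeat.prospectors:
- input_type: log
  paths:
    - C:\inetpub\logs\LogFiles\W3SVC1\*.log
```

With the IIS module enabled, a prospector like this reads the same files a second time, and those duplicate events bypass the module's ingest pipeline, which is why they show up unparsed next to the parsed copies.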