We've been trying to create a pipeline for logs from a security tool using the recently released CEF module, but we've been getting a parsing error related to the log format.
The tool pulls its logs via an API call and then sends them over syslog to localhost. Filebeat with the CEF module runs on the same host, listening on the syslog port.
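For context, this is roughly what our module configuration looks like (the port is just what we chose locally, not necessarily your default):

```yaml
# modules.d/cef.yml — sketch of our local setup
- module: cef
  log:
    enabled: true
    var:
      syslog_host: localhost
      syslog_port: 9003
```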
> This is a module for receiving Common Event Format (CEF) data over Syslog. When messages are received over the syslog protocol, the syslog input will parse the header and set the timestamp value. Then the decode_cef processor is applied to parse the CEF-encoded data. The decoded data is written into a cef object field. Lastly, any Elastic Common Schema (ECS) fields that can be populated with the CEF data are populated.
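So a message on the wire combines a syslog header with a CEF-encoded payload, roughly like this (all values invented for illustration):

```
<134>Sep 19 08:26:10 myhost CEF:0|Vendor|Product|1.0|100|detected an attack|5|src=10.0.0.1 dst=10.0.0.2 spt=1232
```

The syslog input handles everything up to and including the hostname; decode_cef handles the CEF:0|…| part.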
But we're getting an error for the pipeline above.
The syslog input is failing to parse the syslog header. We're seeing this problem a lot because Filebeat's syslog input is too strict and only supports BSD-style RFC3164 messages.
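To illustrate the difference, here is a BSD-style header the syslog input can parse next to an RFC 5424 / ISO 8601 style header it cannot (both examples made up):

```
# Parses (RFC 3164, BSD-style timestamp):
<34>Oct 11 22:14:15 myhost app: message body

# Fails (RFC 5424, ISO 8601 timestamp):
<34>1 2019-10-11T22:14:15.003Z myhost app 1234 ID47 - message body
```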
In your case it might be related to the date format that your CEF exporter is using. Do you have a config option to change it?
As an alternative, you can modify the module to use the udp input instead of the syslog input, which does no parsing. See this message:
The file you need to change is module/cef/log/config/input.yml under /usr/share/filebeat/....
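For anyone following along, the change amounts to swapping the input type. A rough sketch of the modified input.yml — I'm reconstructing this from memory, so compare against the stock file that ships with your Filebeat version:

```yaml
# module/cef/log/config/input.yml — hedged sketch, not the stock file verbatim
type: udp
host: "{{.syslog_host}}:{{.syslog_port}}"
```

Because the udp input does no parsing, the raw syslog line (header included) lands in the message field as-is.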
Adjusting /usr/share/filebeat/module/cef/log/config/input.yml seems to fix the issue with the pipeline. Right now I can see the logs are being ingested, but they are not being parsed. The vendor log gets sent to a message field.
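That symptom (raw CEF sitting in message) usually means the decode_cef step is no longer being applied. If it was lost when the input was swapped, it can be re-added as a processor; a sketch, assuming the payload lands in message as described:

```yaml
processors:
  - decode_cef:
      field: message
```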
This is what I'm getting (in Kibana) after disabling the option to write logs to an output file in the vendor agent configuration file, not in the Filebeat configuration file. I'm still not sure why the enabled: false switch in the Filebeat configuration is not working.