I am currently using Elastic-agent to transmit log data to Elasticsearch through Logstash. I have two integrations configured: one for Fortigate and another for Custom UDP Logs.
However, with the Custom UDP Logs integration, using the logs-system.security-1.41.0 ingest pipeline, I have encountered an issue where the fields parsed from the logs are not being correctly mapped to Elasticsearch. This has resulted in the data not being parsed properly.
What could be causing this issue? Are there any configuration steps I might have missed? I hope someone with experience can provide some guidance or suggestions.
It is not exactly an issue: the ingest pipeline was not built to parse this data, so it has no processors for it.
The security ingest pipeline expects the Windows events to be collected by Elastic Agent itself; it is the agent that parses the events locally to create the winlog.* fields, so in this case almost all of the parsing is done on the Agent side.
You would need to collect the Windows events with an Elastic Agent, or at least Winlogbeat.
Unfortunately, with the way you are receiving them, you will need to build a custom ingest pipeline to parse those fields.
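As a rough illustration, a custom pipeline like this could start pulling fields out of the raw syslog message. This is only a sketch: the pipeline name and the grok pattern are placeholders, since the actual layout of your forwarded Windows events will determine the real pattern.

```
PUT _ingest/pipeline/forwarded-windows-security
{
  "description": "Sketch: parse syslog-forwarded Windows security events (pattern is illustrative)",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{SYSLOGTIMESTAMP:tmp.timestamp} %{HOSTNAME:host.name} %{GREEDYDATA:winlog.raw_message}"
        ],
        "ignore_failure": true
      }
    }
  ]
}
```

From there you would add more processors (dissect, kv, date, etc.) to break winlog.raw_message into the individual fields you need.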
I have set up index templates and custom ingest pipelines to parse these fields. However, the index template created for Custom UDP Logs is always 'logs-'. To successfully parse the fields, it should be the 'logs-windows-security-events-' that I established. Do you know where the problem might be? Any advice would be appreciated. Thank you.
I'm not sure; I do not use Elastic Agent for custom logs, as I prefer Logstash for its flexibility.
But looking at the configuration you shared, there is at least one thing that is wrong: the dataset name.
You are using windows-security-events as the dataset name, but you cannot; there is a message below the Dataset name field saying that you can't use the minus sign (-) in the dataset name.
Also, you are trying to change too many things at the same time, which makes it much harder to troubleshoot and understand what is not working.
You are trying to use custom parsing, a custom mapping, and a custom dataset name all at once.
Another thing: you enabled the Syslog Parser in the integration, but you are also trying to parse the raw message in your custom pipeline.
My suggestion is to take a few steps back and change things one at a time.
First, make sure you are receiving the logs, unparsed, in the default dataset for the UDP integration; it is udp.generic if I'm not wrong.
After that, change the dataset to something custom, like windows_security_alerts, and make sure the events are arriving in that dataset.
Then enable syslog parsing and check how the message is transformed and what you still need to parse.
After that you can create your parser in the custom ingest pipeline and then adjust the mapping.
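When you get to the parsing step, the simulate API is handy for checking each change before wiring the pipeline into the integration. The pipeline name and sample message below are placeholders for whatever you actually create and receive:

```
POST _ingest/pipeline/forwarded-windows-security/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "<13>Oct 11 22:14:15 winhost An account was successfully logged on. EventID=4624"
      }
    }
  ]
}
```

The response shows the document exactly as it would look after the pipeline runs, so you can confirm the grok pattern matches and the fields land where your mapping expects them.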