How to properly map fields from a CSV file in Elastic Custom Log Integration?

In Elastic, I have created a "Custom log integration", installed it on the agent, and everything is working as it should. I have a custom program that dumps status information into a CSV file. Example data:

"@timestamp","log.level","message"
"2024-03-11 09:45:10","error","Database backup has NOT completed successfully. The error is: xxxxx"
"2024-03-11 09:50:00","information","Database backup was completed successfully."

When I set up the custom log integration to read from this file, it reads the content as it should and the header is ignored, but all the data ends up in the message field. I would of course want @timestamp to go to @timestamp, log.level to go to log.level, and message to contain only the message text, not the other fields.
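To illustrate with the first CSV row above (a sketch of the JSON documents, not copied from Discover), right now I get something like:

{"message": "\"2024-03-11 09:45:10\",\"error\",\"Database backup has NOT completed successfully. The error is: xxxxx\""}

when what I want is:

{"@timestamp": "2024-03-11 09:45:10", "log.level": "error", "message": "Database backup has NOT completed successfully. The error is: xxxxx"}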

I go to "Edit Custom Logs integration" => "Change defaults" => "Advanced options" and then edit the mapping called "logs-generic@custom" in order to map the fields from the CSV file, but no matter how I specify the mapping, it is ignored. A sketch of the kind of mapping I tried is below.
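This is roughly what I tried (shown here as a Dev Tools request for clarity; in the UI I pasted the mappings part into the logs-generic@custom editor, and the field types and date format are my guesses):

PUT _component_template/logs-generic@custom
{
  "template": {
    "mappings": {
      "properties": {
        "@timestamp": { "type": "date", "format": "yyyy-MM-dd HH:mm:ss" },
        "log.level": { "type": "keyword" },
        "message": { "type": "text" }
      }
    }
  }
}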

So: do I have the right approach, and if so, how should I actually set up such a mapping?

Or, as an alternative: can I format the log file in another way that will make the magic happen all by itself?
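For example, my program could instead emit one JSON object per line (NDJSON) with the same field names, along these lines (a sketch; I don't know whether the integration would pick this apart automatically):

{"@timestamp": "2024-03-11 09:45:10", "log.level": "error", "message": "Database backup has NOT completed successfully. The error is: xxxxx"}
{"@timestamp": "2024-03-11 09:50:00", "log.level": "information", "message": "Database backup was completed successfully."}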

Thanks a lot for your help.
