Elastic-Agent - Custom Log Integration

Hi

I am trying to get a 'Custom Log' integration working with Elastic-Agent via the GUI. I have managed to get the custom log file processed, but I need to define mappings for it. The file is minimal at this stage, in the format <string>,<IP address>:

Test,1.2.3.4
Test2,5.6.7.8

I can see there is a 'Custom Configuration' section where I can put in some YAML. Would this be where I put the mapping information? Are there any examples or documentation I could work with?

At the moment I just have a message field containing the whole line, so I need to be able to split it out and define the IP field as type ip.

Thanks in advance

Phil

I assume all your data ends up in the logs-generic-default data stream at the moment?

It seems you are asking for 2 things:

  • Adding an ingest pipeline to process your data
  • Adding mappings for it

Currently my recommendation is to first change the target index. You can do this by setting a different dataset value. Let's assume you set foo; your data will then end up in logs-foo-default. Now what you need to do is create an Elasticsearch index template for logs-foo-default with your mappings, and add an ingest pipeline to the settings. In the best case you do this before shipping any data, so the new settings and mappings all apply directly. Otherwise you need to trigger a rollover on the data stream to pick up the new mappings and settings.
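To make this concrete, here is a minimal sketch of what that could look like. The pipeline name, the foo dataset, and the field names (threat.name, threat.ip) are placeholder assumptions; substitute your own. An ingest pipeline with a csv processor splits the message field on the comma:

PUT _ingest/pipeline/logs-foo
{
  "description": "Split <string>,<IP address> lines from the custom log",
  "processors": [
    {
      "csv": {
        "field": "message",
        "target_fields": ["threat.name", "threat.ip"]
      }
    }
  ]
}

An index template for logs-foo-default then carries the mappings and attaches the pipeline via index.default_pipeline. The priority just needs to be higher than the built-in logs template so this one wins:

PUT _index_template/logs-foo
{
  "index_patterns": ["logs-foo-*"],
  "data_stream": {},
  "priority": 200,
  "template": {
    "settings": {
      "index.default_pipeline": "logs-foo"
    },
    "mappings": {
      "properties": {
        "threat": {
          "properties": {
            "name": { "type": "keyword" },
            "ip": { "type": "ip" }
          }
        }
      }
    }
  }
}

And if data has already been shipped, the rollover mentioned above is:

POST logs-foo-default/_rollover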

Hope the above helps you to move forward here. We are working on ideas on how to make this simpler in the future.

That was very helpful and pointed me in the correct direction, thank you.

For anyone else, my process for this was:

From the Kibana home page, I used the 'Upload a file' option to import the file I was working with. This was a useful test: it created the ingest pipeline for me and allowed me to confirm my mappings.

I then created an ILM policy, as I wanted this data to expire after 12 hours (it's threat intel data, so it goes stale quickly). The policy rolls the data over and deletes it after 12 hours.
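For reference, a minimal sketch of that policy (the policy name and the 1h hot-phase rollover are my assumptions; min_age in the delete phase counts from rollover):

PUT _ilm/policy/threat-intel-12h
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": { "max_age": "1h" }
        }
      },
      "delete": {
        "min_age": "12h",
        "actions": {
          "delete": {}
        }
      }
    }
  }
}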

Then I created an index template from Index Management, referencing the ILM policy and the ingest pipeline, and it's now working.
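A quick way to verify the pipeline before (or after) shipping data is the simulate API, using one of the sample lines. The pipeline name here is the hypothetical one from the sketch above; use whatever name the file upload created for you:

POST _ingest/pipeline/logs-foo/_simulate
{
  "docs": [
    { "_source": { "message": "Test,1.2.3.4" } }
  ]
}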


Great to hear you got it working and thanks for sharing more details for others that want to do the same!
