Custom Logs Ingest Pipeline

@ruflin

This is more or less a duplicate of Fleet Custom Logs Ingest Pipeline, but I still find it hard to figure out which steps I have to follow. I would really appreciate better documentation or some useful directions.

I am also trying to ingest a log file using the custom log integration in Fleet. Is there a way to configure a custom ingest pipeline as well so I can parse the custom fields in the message field using grok patterns?


@joshdover You have recently been working on some ideas to improve this further. Do you by any chance have some links that can be shared?

@mostlyjason For awareness of the discussion.

We are planning some changes to how we set up the templates to make user customizations much easier to add (see kibana#121118); however, this won't fully solve the custom ingest pipeline case.

The best option I have right now is to edit the existing logs-log.log@custom component template and then roll over the data stream (a Dev Tools sketch follows the steps):

  1. Create a new ingest pipeline
  2. Edit the logs-log.log@custom component template to add the default_pipeline index setting to point to the newly created ingest pipeline
  3. Roll over any existing data streams that match logs-log.log-* using the Rollover API so the new settings take effect
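
A minimal sketch of those three steps in Dev Tools console syntax, assuming a data stream in the default namespace; the pipeline id logs-log.log-custom and the grok pattern are placeholders, not anything the integration creates for you:

```
# 1. Create a new ingest pipeline (id and grok pattern are illustrative)
PUT _ingest/pipeline/logs-log.log-custom
{
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{TIMESTAMP_ISO8601:tmp.timestamp} %{LOGLEVEL:log.level} %{GREEDYDATA:tmp.message}"]
      }
    }
  ]
}

# 2. Point the default_pipeline index setting at it via the @custom component template
PUT _component_template/logs-log.log@custom
{
  "template": {
    "settings": {
      "index.default_pipeline": "logs-log.log-custom"
    }
  }
}

# 3. Roll over the data stream so new backing indices pick up the setting
POST logs-log.log-default/_rollover
```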

Hey Josh,

I agree with @jsteenkamp. Some better docs on this would be nice.

E.g.: Custom logs | Elastic Docs

It should explain how to use an ingest pipeline via the "Custom configurations" in the integration settings, and whether it's possible to just use the same configuration as in Filebeat, e.g. can I just throw in output.elasticsearch.pipeline: 'somepipeline'?

It would be better to just add an optional field to the custom logs integration settings where it's possible to look up existing pipelines, or to tell users an ingest pipeline is required, etc.


@joshdover thank you for your suggestions.

Based on your input, I found this instruction online and, guess what, it works in just two simple steps.

  1. Create your custom pipeline
  2. Define your custom pipeline in the custom configurations (see the sketch below)
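
For anyone who finds this later: step 2 boils down to a single line of YAML pasted into the "Custom configurations" box of the integration settings. The pipeline name below is a placeholder for whatever you created in step 1:

```yaml
# Goes into "Custom configurations" in the custom logs integration settings.
# my-custom-pipeline is a placeholder for the pipeline created in step 1.
pipeline: my-custom-pipeline
```

(Note ruflin's caveat further down: this sets the pipeline on the input itself, which isn't the encouraged long-term approach.)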


Ty. I’ve been looking for this for a few days and tested a few different things. :joy:

This should really be added to the custom logs docs as well.

@ruflin and @joshdover is it possible to open a merge request on GitHub, or is there some other way to update the docs from the user side?

Looking forward to hearing back.


It could be contributed here: integrations/packages/log at master · elastic/integrations · GitHub

What you did works, but using the pipeline setting on the input itself is something we don't encourage. The feature is only there because we inherit it directly from Beats. Instead, the pipeline should be set in the settings on the data stream. But as discussed above, we don't provide everything needed here yet.

You know the stack pretty well, and I'm sure it would be simple for you to switch from a setting in the YAML to a setting on the data stream as soon as we have it. What worries me about documenting it is that many users will start using it and eventually get stuck. Maybe we can mention exactly this issue in the docs?


Hey Ruflin,

I've set it up via the component templates now; is this the best practice for it at the moment?

I've got this weird issue though; I assume the mappings aren't set up correctly?

Something looks off with the mappings. You should set keyword as the default type for dynamically mapped string fields. Have a look at some of the existing templates that are installed.
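
In case it helps, a sketch of the kind of dynamic template the installed logs templates use to map new string fields to keyword by default; merge it into whatever is already in your @custom template, since a PUT replaces the whole body:

```
# Keep any existing settings (e.g. index.default_pipeline) in the same body,
# because PUT _component_template overwrites the previous definition.
PUT _component_template/logs-log.log@custom
{
  "template": {
    "mappings": {
      "dynamic_templates": [
        {
          "strings_as_keyword": {
            "match_mapping_type": "string",
            "mapping": {
              "type": "keyword",
              "ignore_above": 1024
            }
          }
        }
      ]
    }
  }
}
```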
