Fleet Custom Logs integration add_fields processor

Hello team, we are trying to add fields to a Custom Logs integration via "Advanced settings". We have tried many formats, but none of them is working:

- processors:

fields.myField: value

   myField: value

If we just set:

tags: ["myvalue"]

That adds the tag to the documents, but we want a more complex structure.

EDIT: We are using Logstash as output instead of ES Cloud.


I think you have a couple of syntax errors... you have the - in the wrong place, and you need to name your target and fields, etc. See here for the proper syntax (this agent integration is based on Filebeat).

It should look like:

  - add_fields:
      target: project
      fields:
        name: myproject
        id: '574734885120952459'

This adds these fields to any event, resulting in a document like:

  "project": {
    "name": "myproject",
    "id": "574734885120952459"
  }
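If the advanced settings box expects full YAML rather than just the list items, the whole block would need the top-level processors: key. A minimal sketch, reusing the example values above:

  processors:
    - add_fields:
        target: project
        fields:
          name: myproject
          id: '574734885120952459'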

Thanks @stephenb, it looks like using the Logstash output instead of ES makes the difference.

We tried with the ES output and all the variants worked, then switched back to Logstash and had no luck.

Do you have any thoughts about this?

It works now. Thank you!

Good to hear you got it working!

What was the solution?... It is good to share with the community for the next person!

It worked locally, but now it is not working. After adding the custom config, log files just stop ingesting.

I am not sure I understand, locally vs. what... I tried mine on a Fleet-managed Agent and it worked fine... What exactly is your config? And did you look at the agent logs?

The syntax is very particular... are you sure there are no typos or formatting issues?

Locally vs. the client's cluster. I copied and pasted the same config that was working locally into the client's Fleet server. Once I paste the custom config that worked locally, logs stop ingesting; when I remove it, logs work again.

Logstash is not using these enrichment fields at all.

We see Fleet publish the right event to Logstash, and Logstash's stdout output shows the event too, but we don't see the data in Elasticsearch.
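For reference, a minimal Logstash pipeline for Agent traffic might look like this (a sketch only; the elastic_agent input and data_stream-aware elasticsearch output are the usual pattern for Fleet, and the host/port values are placeholders):

  input {
    elastic_agent {
      port => 5044
    }
  }

  output {
    elasticsearch {
      hosts => ["https://localhost:9200"]
      data_stream => true
    }
    stdout { codec => rubydebug }
  }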

You are most likely describing a mapping issue:

  1. The new fields do not match the existing mapping, and thus the documents cannot be ingested, or

  2. There is a strict mapping setting that does not allow new fields, and thus the documents cannot be indexed.
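Case 2 can be reproduced in Kibana Dev Tools, for example (index name and fields here are hypothetical):

  PUT /strict-demo
  {
    "mappings": {
      "dynamic": "strict",
      "properties": { "message": { "type": "text" } }
    }
  }

  POST /strict-demo/_doc
  {
    "message": "hello",
    "project": { "name": "myproject" }
  }

The second request is rejected with a strict_dynamic_mapping_exception, because project is not in the mapping.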

There should be a pretty descriptive error in the Logstash logs... find it and show me.

You can get the mapping of the index with:

  GET /my-index

That should show you as well.

I thought the same; the mappings are set as dynamic. Now I will try to remove the noise to find an error log from Logstash.


Also, is there an ingest pipeline on the Elasticsearch side?

Also, if you put

  stdout { codec => json_lines }

in the Logstash config, you can get a sample message and then just try to post it via curl or Kibana Dev Tools, and you should easily see the error.
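For example, with curl (host, credentials, and data stream name are placeholders, and sample-event.json is the line captured from stdout):

  curl -u elastic -X POST 'https://localhost:9200/my-data-stream/_doc' \
    -H 'Content-Type: application/json' \
    -d @sample-event.json

Any mapping or field-limit error then comes back directly in the response body.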

There can still be an issue, e.g. if someone already put in a field with the same name that was mapped as a keyword and now you are trying to put it in as an object... that is a pretty common, hard-to-find bug 🙂
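A minimal reproduction of that keyword-vs-object conflict in Dev Tools (index name is hypothetical):

  POST /conflict-demo/_doc
  { "project": "myproject" }

  POST /conflict-demo/_doc
  { "project": { "name": "myproject" } }

The first request dynamically maps project as a concrete field, so the second fails with a mapper_parsing_exception.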

Mystery solved. We were getting a fields-limit-exceeded error. Not sure why there are so many fields in there, but that's a different topic.
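For reference, the default is index.mapping.total_fields.limit: 1000. It can be raised (index name is a placeholder), though trimming unneeded fields is usually the better fix:

  PUT /my-index/_settings
  {
    "index.mapping.total_fields.limit": 2000
  }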

Thank you again!


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.