Think you've got a couple of syntax errors: the `-` is in the wrong place, and you need to name your target fields, etc. See here for the proper syntax (this agent integration is based on Filebeat).
should look like
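Something like this, a minimal sketch assuming you're adding fields with the `add_fields` processor (the target and field names here are just placeholders):

```
# Goes in the integration's "Processors" custom configuration box.
# Each processor is a YAML list item, so the "-" goes before the
# processor name, not before its settings.
- add_fields:
    target: project        # fields land under "project.*"
    fields:
      name: myproject      # hypothetical field/value
      env: production
```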
I am not sure I understand "locally vs what"... I tried mine on a Fleet-managed Agent and it worked fine. What exactly is your config, and did you look at the agent logs?
The syntax is very particular... are you sure there are no typos or formatting issues?
Locally vs the client's cluster. I copied and pasted the same config that was working locally into the client's Fleet server. Once I paste in the custom config that worked locally, logs stop ingesting; when I remove it, logs work again.
Logstash is not using these enrichment fields at all.
We see Fleet publish the right event to Logstash, and Logstash's stdout output shows the event too, but we don't see the data in Elasticsearch.
Also, is there an ingest pipeline on the Elasticsearch side?
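You can check from Kibana Dev Tools, e.g. (index name below is a placeholder):

```
# List ingest pipelines
GET _ingest/pipeline

# See whether the target index has a default pipeline set
GET my-index/_settings?include_defaults=true&filter_path=*.settings.index.default_pipeline
```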
Also, if you put

```
stdout { codec => json_lines }
```

in the Logstash output, you can grab a sample message and then just try to POST it via curl or Kibana Dev Tools, and you should easily see the error.
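For example, with one JSON event from the stdout output saved to a file (the index name is just a placeholder):

```
curl -s -H 'Content-Type: application/json' \
  -X POST 'http://localhost:9200/my-index/_doc' \
  -d @sample-event.json
```

If the document is rejected, the response body will contain the mapping or pipeline error.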
There can still be an issue if, say, someone already indexed a field with the same name as a keyword and now you are trying to index it as an object... that is a pretty common, hard-to-find bug.
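For example, if an earlier document mapped a field as a string, a later event that sends the same field as an object gets rejected (field and index names here are illustrative):

```
POST my-index/_doc
{ "project": "myproject" }

# "project" is now mapped as text/keyword, so this one fails
# with a mapper_parsing_exception:
POST my-index/_doc
{ "project": { "name": "myproject" } }
```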