Logstash instance installed on the elastic-agent host
We have already configured the elastic-agent to send a custom UDP log to our Elastic cluster, and everything worked.
Now we want to enrich the data with DNS lookup information, so we have installed a Logstash instance that filters the data coming from the agent.
What we don't understand is the flow.
It seems that with Logstash the data arrives in Elastic, but the ingest pipelines that parsed the events before no longer work.
Can someone tell me whether we have to recreate all the ingest pipelines in the Logstash pipeline section?
Below is my Logstash pipeline configuration:
input {
  elastic_agent {
    port => 5044
    ssl_enabled => true
    ssl_certificate_authorities => ["/etc/logstash/certificati/ca.crt"]
    ssl_certificate => "/etc/logstash/certificati/logstash.crt"
    ssl_key => "/etc/logstash/certificati/logstash.pkcs8.key"
    ssl_client_authentication => "required"
  }
}
output {
  elasticsearch {
    hosts => ["https://192.168.197.135:9200"]
    api_key => "xxxxxxxxxxx:xxxxxxxxxx"
    data_stream => true
    ssl => true
    cacert => "/etc/logstash/certificati/http_ca.crt"
  }
}
How can I tell Logstash to preserve the raw data?
As you can see, at the moment I don't add any data to the original event.
Your input is a little different from the one in the documentation; you can check the example here.
Basically, it is missing the setting enrich => none.
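Applied to the input above, that would look like this (a sketch; the certificate paths are taken from your configuration):

input {
  elastic_agent {
    port => 5044
    enrich => none
    ssl_enabled => true
    ssl_certificate_authorities => ["/etc/logstash/certificati/ca.crt"]
    ssl_certificate => "/etc/logstash/certificati/logstash.crt"
    ssl_key => "/etc/logstash/certificati/logstash.pkcs8.key"
    ssl_client_authentication => "required"
  }
}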
This depends on a lot of things, basically what the source data looks like and what you want to change, but the main recommendation is not to make any changes in Logstash; do the enrichment in a custom ingest pipeline.
It is basically a trial-and-error situation: you need to test things to see what you can change without breaking the ingest pipelines.
But as mentioned in the documentation:
Please be aware that the structure of the documents sent from Elastic Agent to Logstash must not be modified by the pipeline. We recommend that the pipeline doesn’t edit or remove the fields and their contents. Editing the structure of the documents coming from Elastic Agent can prevent the Elasticsearch ingest pipelines associated to the integrations in use to work correctly. We cannot guarantee that the Elasticsearch ingest pipelines associated to the integrations using Elastic Agent can work with missing or modified fields.
Thank you Leandro, I will try your suggestion as soon as possible.
Meanwhile, can I ask you for some more information?
Do you know how I can debug the data that Logstash sends to Elasticsearch?
I have already increased the log level on Logstash, and if I add multiple outputs to its configuration I can see the events written to a file.
Is there a way to see why the Elasticsearch ingest pipelines don't parse the events when Logstash is in the path?
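(For reference, one way to inspect the exact documents Logstash emits is a temporary stdout output with the rubydebug codec; enabling metadata also shows the @metadata fields that drive data stream routing. A sketch:)

output {
  stdout {
    codec => rubydebug { metadata => true }
  }
}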
My final goal is to do a DNS lookup for the hostnames of the local machines that I monitor with the elastic-agent.
I tried to follow this document.
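(For reference, a minimal Logstash dns filter sketch along those lines: it copies the hostname into a new field and resolves only the copy, so the original agent fields stay untouched. The field names here are assumptions, not taken from your data.)

filter {
  mutate {
    copy => { "[host][hostname]" => "[host][resolved_ip]" }
  }
  dns {
    resolve => ["[host][resolved_ip]"]
    action => "replace"
  }
}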