I have two Logstash instances. Instance A receives events from Elastic Agents; instance B takes those events in and outputs them further downstream.
However, the only way to move data between them is to transfer files from instance A to instance B.
I hoped that the output and input plugins could be configured with a codec that would handle this, but I have not been able to find a working configuration.
For example, the @metadata fields do not seem to be included in the json_lines output.
Basically, I would like the file output plugin on instance A to write everything about the log events to files, transfer those files to server B, and have Logstash on server B configured with an input plugin that reads the events and processes them further downstream.
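For reference, this is the shape of the configuration I have been trying; the paths and file names are just placeholders. On instance A:

    output {
      file {
        # one JSON document per line (json_lines is also the default codec for the file output)
        path => "/var/spool/logstash/out/events-%{+YYYY-MM-dd}.ndjson"
        codec => json_lines
      }
    }

And on instance B, after the files have been transferred:

    input {
      file {
        path => "/var/spool/logstash/in/*.ndjson"
        # the file input already splits lines, so the plain json codec decodes each line
        codec => json
        # read whole files once; by default a file is deleted after it is fully read
        mode => "read"
      }
    }

This moves the regular event fields across, but the @metadata fields are lost on the way.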
Simply copy the @metadata field to a regular field, for example metadata_backup, using a Logstash mutate copy filter before exporting to the file. When you import the file on the other Logstash instance, use another mutate copy filter to copy metadata_backup back to @metadata.
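A sketch of what that could look like; the field name metadata_backup is arbitrary. On instance A, before the file output:

    filter {
      # preserve @metadata in a regular field so it survives serialization
      mutate {
        copy => { "@metadata" => "metadata_backup" }
      }
    }

On instance B, after the file input:

    filter {
      # restore the original @metadata from the backup field
      mutate {
        copy => { "metadata_backup" => "@metadata" }
      }
      # drop the helper field so it is not forwarded downstream
      mutate {
        remove_field => ["metadata_backup"]
      }
    }

The remove_field sits in a second mutate block so it is guaranteed to run after the copy has happened.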