Transfer log events as files between Logstash instances

Hi,

I have two Logstash instances. Instance A receives events from Elastic Agents. Instance B takes those events as input and outputs them further downstream.

However, the only way I can move data between them is by transferring files from instance A to instance B.

I hoped that the output and input plugins could be configured with a codec that would handle this, but I have been unable to find a working configuration.

For example, it seems that the @metadata field is not included in the json_lines output.

Basically, I would like the file output plugin on instance A to write everything about the log events to files, transfer those files to server B, and have Logstash on server B read the events with an input plugin and process them further downstream.

Any suggestions?

Simply copy the @metadata field using a Logstash mutate copy filter: copy it to a field called metadata_backup and export to a file. Then, when you import the file on the other Logstash instance, use another mutate copy filter to copy from metadata_backup back to @metadata.

Thank you for the recommendation.

And do you suggest that I use the json_lines codec for both the output and the input plugin?

I'm not sure why you need to save to a file to transfer, so I'm not sure what the best approach might be, but json_lines seems like a good option to me.

Thank you @strawgate. On instance A I configured Logstash with

filter {
  mutate {
    copy => { "@metadata" => "metadata_backup" }
  }
}

And then used the file output plugin with the json_lines codec.
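
For reference, the output section on instance A looked roughly like this (the path is just an example from my setup, not anything required):

output {
  file {
    # Write one JSON document per line so instance B can read the events back
    path => "/var/log/logstash/export/events.json"
    codec => json_lines
  }
}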

On the receiving instance B I used the json codec on the file input plugin together with the following filter configuration

filter {
  mutate {
    copy => { "metadata_backup" => "@metadata" }
  }
}
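
For completeness, the file input on instance B looked roughly like this (again, the path is illustrative):

input {
  file {
    # "read" mode consumes the transferred files from the beginning
    # instead of tailing them for new lines
    path => "/var/log/logstash/import/*.json"
    codec => json
    mode => "read"
  }
}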

Works.

