Logstash as Filebeat passthrough

Is it possible to pass Filebeat logs straight through Logstash without Logstash doing any processing?

My setup is as follows:

Client using Filebeat => beats input on logstash-proxy => tcp output on logstash-proxy => tcp input on logstash => filter => tcp output on logstash => Elasticsearch

My problem is that the logstash-proxy is processing my Filebeat input and sending a bunch of extra junk to my primary Logstash to filter.
The goal of the logstash-proxy is to provide queueing for remote sites in case of Internet outages and the like, without impacting clients. The issue I have is that the proxy keeps adding JSON fields to its output and changing the event structure, which is breaking my pipelines.
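For context, the proxy pipeline is essentially just this (simplified sketch; the port numbers and downstream hostname below are placeholders, not my real values):

```
# logstash-proxy: current passthrough attempt (sketch)
input {
  beats {
    port => 5044
  }
}

output {
  tcp {
    host  => "primary-logstash.example.com"   # placeholder for my primary Logstash
    port  => 5045
    mode  => "client"
    codec => json_lines    # serializes the whole event, Beats metadata and all
  }
}
```

As far as I can tell, that JSON serialization carries along everything Filebeat and the beats input have already attached to the event (host, tags, @version, and so on), and the receiving tcp input adds its own fields on top, which is the extra junk my downstream filters are seeing.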

I would prefer not to have to update all of my pipelines to parse out the excess information that the logstash-proxy is adding.

I have found a large portion of the solution:
https://www.elastic.co/guide/en/logstash/current/ls-to-ls.html

I have some of it working, but not everything quite yet. My Lumberjack configuration is still missing some critical pieces, but I am working through the errors one at a time.
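For anyone following along, this is roughly where I have landed so far from that guide: a lumberjack output on the logstash-proxy pointing at a beats input on the primary Logstash. Hostnames, ports, and certificate paths below are placeholders, and I am not yet certain the json codecs are strictly needed; the main catch so far is that the lumberjack output requires ssl_certificate, so the link has to be set up with TLS certs.

```
# On the logstash-proxy: forward queued events over the lumberjack protocol
output {
  lumberjack {
    hosts           => ["primary-logstash.example.com"]      # placeholder hostname
    port            => 5045
    ssl_certificate => "/etc/logstash/certs/primary.crt"     # required by the plugin
    codec           => json                                  # keep the event as structured JSON
  }
}

# On the primary Logstash: a beats input speaks the same lumberjack protocol
input {
  beats {
    port            => 5045
    ssl             => true
    ssl_certificate => "/etc/logstash/certs/primary.crt"
    ssl_key         => "/etc/logstash/certs/primary.key"
    codec           => json
  }
}
```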
