Logstash input adds date stamp and host

I am using Logstash to receive data from a syslog server and push it into different Kafka topics. Logstash appears to add a date/time stamp and hostname at the start of the raw event. Is there a way to stop this happening? I am grabbing raw audit events and pushing them to different topics, but I need to capture just the raw event.

Reading further, it looks like this is metadata being added by Logstash.

Can you share your Logstash input and output?

By the way, you can remove fields by adding a mutate filter to your Logstash pipeline:

filter {
  mutate { remove_field => [ "field1", "field2", "field3", ... "fieldN" ] }
}
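For this thread's case, a sketch removing fields the input plugin typically adds (the field names here are assumptions; check what your events actually contain, e.g. with a stdout output and the rubydebug codec, before removing anything):

```
filter {
  mutate { remove_field => [ "host", "port" ] }
}
```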

The syslog event looks like this prior to being sent to Logstash: <13>Jan 11 11:36:20 server1.domain.net {BLAHBLAHBLAH....

Using the following input in Logstash:

input {
  port => 1001
  type => "syslog"
}
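For reference, a complete input block wraps these options in a protocol plugin; a sketch assuming syslog arrives over TCP (swap tcp for udp if your syslog server sends UDP):

```
input {
  tcp {
    port => 1001
    type => "syslog"
  }
}
```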

The output is this:

2022-01-11T11:35:01.785Z syslogserver.domain.net <13>Jan 11 11:36:20 server1.domain.net {BLAHBLAHBLAH....

Is it possible to stop Logstash adding the 2022-01-11T11:35:01.785Z syslogserver.domain.net prefix?

Are you using the Kafka output? Please share your full pipeline.

If you want to send the raw event to the Kafka topics, you will need to have the following codec in your output.

codec => plain { format => "%{message}" }
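In context, a sketch of a full Kafka output using that codec (the bootstrap_servers address and topic name are assumptions; replace them with your own):

```
output {
  kafka {
    bootstrap_servers => "localhost:9092"   # assumption: your Kafka broker
    topic_id => "raw-audit"                 # assumption: your topic name
    codec => plain { format => "%{message}" }
  }
}
```

The plain codec with format => "%{message}" writes only the contents of the message field, so the @timestamp and host metadata Logstash adds never reach the topic.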

That was it. Thank you so much

On a separate matter, what happens if you add_field a value into a field that already has a value? Does it append the data or overwrite?

P.S. Will start new thread if required.

It becomes an array.
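For example (a sketch; the field name and values are illustrative):

```
filter {
  mutate { add_field => { "type" => "audit" } }
}
```

If the event already has type set to "syslog", the field ends up as the array [ "syslog", "audit" ] rather than being overwritten.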
