Logstash sends the logs to Elasticsearch and to syslog-ng or rsyslog (multiple outputs), which keeps the original log

I want to configure the log flow as: Filebeat -> Logstash -> Elasticsearch and syslog-ng (or rsyslog).

I configured syslog-ng to receive the logs following the instructions at: Sending logs from Logstash to syslog-ng - Blog - syslog-ng Community. However, the received log contains the entire Logstash event content, which makes things difficult for me.

Example: The original log in /var/log/auth.log is as follows:
Jan 25 16:29:01 TOBY CRON[52171]: pam_unix(cron:session): session opened for user root by TOBYTEST

But when I check on syslog-ng, after reconfiguring, I get the following:
Jan 25 16:28:38 192.168.0.18 "ubuntu-srv","os":{"kernel":"5.4.0-94-generic","codename":"focal","name":"Ubuntu","family":"debian","type":"linux","version":"20.04.3 LTS (Focal Fossa)","platform":"ubuntu"},"ip":["192.168.0.20","fe80::20c:29ff:fe7e:e409"],"containerized":false,"name":"ubuntu-srv","id":"54c327f620594e9faab9b7b42ebb37c3","mac":["00:0c:29:7e:e4:09"],"architecture":"x86_64"} LOGSTASH[-]: 2022-01-25T09:28:37.731Z {hostname=ubuntu-srv, os={kernel=5.4.0-94-generic, codename=focal, name=Ubuntu, family=debian, type=linux, version=20.04.3 LTS (Focal Fossa), platform=ubuntu}, ip=[192.168.0.20, fe80::20c:29ff:fe7e:e409], containerized=false, name=ubuntu-srv, id=54c327f620594e9faab9b7b42ebb37c3, mac=[00:0c:29:7e:e4:09], architecture=x86_64} Jan 25 16:29:01 TOBY CRON[52171]: pam_unix(cron:session): session opened for user root by TOBYTEST

Where 192.168.0.20 is the IP server containing Filebeat, 192.168.0.18 is the server running logstash(ELK).

So how can syslog-ng (or rsyslog) receive only the original log message as sent by Filebeat?

Please help me. Thank you!

Welcome to our community! :smiley:

As mentioned on reddit, there's no way to do this. Logstash will always reorganise things.

What does your pipeline look like? How is the output sending data to syslog-ng?

If you have the original message in some field, you could try sending to syslog-ng using the udp or tcp outputs and specifying that field in the codec's format.

Something like this:

output {
    udp {
        host => "hostname"
        port => "port"
        codec => plain { format => "%{fieldWithOriginalMessage}" }
    }
}

My pipeline has used the Logstash defaults from the start; I haven't edited anything, since I'm a newbie too.
I changed it to:

udp {
    host => "hostname"
    port => "port"
    codec => plain {format => "%{fieldWithOriginalMessage}"}
}

It looks like yours, but I'm not able to send logs to syslog-ng at all.

After I changed my Logstash configuration to:

input {
    beats {
        port => 5044
    }
}

output {
    elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "%{[@metadata][beat]}-%{[@metadata][version]}"
    }

    syslog {
        host => "192.168.0.19"
        port => "514"
        protocol => "tcp"
        rfc => "rfc5424"
        codec => plain { format => "%{message}" }
    }
}

then the log was shortened. The original log:

"Jan 26 16:29:01 TOBY CRON[52171]: pam_unix(cron:session): session opened for user root by TUANVD5555"

arrived on syslog-ng as:

"Jan 28 03:23:30 192.168.0.18 {"hostname":"ubuntu-srv","os":{"kernel":"5.4.0-94-generic","codename":"focal","name":"Ubuntu","type":"linux","family":"debian","version":"20.04.3 LTS (Focal Fossa)","platform":"ubuntu"},"containerized":false,"ip":["192.168.0.20","fe80::20c:29ff:fe7e:e409"],"name":"ubuntu-srv","id":"54c327f620594e9faab9b7b42ebb37c3","mac":["00:0c:29:7e:e4:09"],"architecture":"x86_64"} LOGSTASH - - - Jan 26 16:29:01 TOBY CRON[52171]: pam_unix(cron:session): session opened for user root by TUANVD5555"

Can you guide me in more detail so that syslog-ng receives only the original log? Thank you very much!

Update:
after I fixed the Logstash output to:

input {
    beats {
        port => 5044
    }
}

output {
    elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    }

    tcp {
        host => "192.168.0.19"
        port => "514"
        codec => line { format => "%{metric_name} %{message}" }
    }

    stdout {}
}

Then the log I received on syslog-ng was exactly the same as on the server running Filebeat. Such as:

Jan 26 16:29:01 TOBY CRON[52171]: pam_unix(cron:session): session opened for user root by TUANVD5555

However, it does not contain a header to distinguish which device pushed the log. I need these headers to appear on syslog-ng. Can you help me, please?

What is the output you expect to receive with the tcp output sending to your syslog? Can you give an example?

Also, what is this header you are talking about?

Share your filebeat config as well.

You will probably need to build a custom field in Logstash to create the message in the format you want.

Hi @leandrojmp

Exactly. I want to collect logs from all Linux/Windows servers and use Wazuh-manager for analysis and monitoring. However, since the ELK system already exists, I am thinking of doing the following: Server (Filebeat) -> Logstash -> Syslog -> Wazuh-manager.

The log format entering Syslog should be preserved. That is, Logstash here should only forward the log it receives from Filebeat, without adding any other fields.

I also don't have much experience creating custom fields in Logstash; can you help me, please?

You need to provide examples of the original message you are collecting with Filebeat and of the message you expect to have in the tcp output. Please share examples of your messages before, and the expected output.

To create a field in logstash you use the add_field option of the mutate filter.
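For example, a minimal sketch (untested; the field name `[host][name]` is an assumption about what your Filebeat events contain, and `[@metadata][src]` is just an illustrative name) that copies the source hostname into a metadata field and prepends it as a simple header in the tcp output:

```
filter {
    mutate {
        # assumes Filebeat populates [host][name]; adjust to your events
        add_field => { "[@metadata][src]" => "%{[host][name]}" }
    }
}

output {
    tcp {
        host => "192.168.0.19"
        port => "514"
        # header (source host) followed by the untouched original message
        codec => line { format => "%{[@metadata][src]} %{message}" }
    }
}
```

Fields under `[@metadata]` are not indexed into Elasticsearch, so this header only appears in the tcp output.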

Also, if you want to use Wazuh, it is better to use the Wazuh agent to collect the logs directly on the source. (Wazuh is not supported here; if you have any questions about Wazuh, you need to ask in their forums.)

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.