Hello,
I don't know if I am on the right section but here it goes:
I have a syslog server, and I have already configured it to send its logs to the Linux box where I have ELK installed. I have the IP and everything, but I am not very knowledgeable about this, so I don't know how to check whether it is actually sending syslog to the Linux box, or where those logs would end up (/var/log, presumably?).
Could anyone help?
I have already configured FortiGate syslogs to be sent to the Linux box. Does anyone know how to pick up these syslogs and get them into Elasticsearch/Kibana? Do I use Logstash, or a Beat like Filebeat?
Whether you use Logstash or Beats depends on what you want to do, although the two tools do roughly the same thing, with a few differences.
For what you want to do, I think it will help to first understand what each one is for.
link: https://www.elastic.co/fr/products/beats
Beat 1 |
Beat 2 | ---> Logstash (optional) --> ES --> Kibana
Beat .. |
Beat n |
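If you go the Beats route, Filebeat has a syslog input that can listen on a UDP or TCP port directly and ship to Elasticsearch. A minimal sketch (the port 5514 and the Elasticsearch address are assumptions; adjust them to your setup):

```yaml
# filebeat.yml -- minimal sketch, not a complete config
filebeat.inputs:
  - type: syslog
    protocol.udp:
      # assumed listening port; match whatever port your device sends to
      host: "0.0.0.0:5514"

output.elasticsearch:
  # assumed address of your Elasticsearch node
  hosts: ["localhost:9200"]
```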
Thanks for the reply,
I know the port the syslogs are being sent to.
If I use Logstash, do I just change the configuration to listen on the port the syslogs are being sent to?
input {
  tcp {
    port => SYSLOG_PORT
    type => "syslog"
  }
  udp {
    port => SYSLOG_PORT
    type => "syslog"
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch { hosts => ["10.130.233.242:9200"] }
  stdout { codec => rubydebug }
}
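To sanity-check what that grok pattern extracts, here is a rough Python equivalent run against a classic RFC3164 line (the regex only approximates the grok patterns, and the sample line is made up):

```python
import re

# Simplified approximation of the grok pattern above:
# SYSLOGTIMESTAMP, SYSLOGHOST, program(?:[pid])?: message
SYSLOG_RE = re.compile(
    r"(?P<syslog_timestamp>\w{3} {1,2}\d{1,2} \d{2}:\d{2}:\d{2}) "
    r"(?P<syslog_hostname>\S+) "
    r"(?P<syslog_program>[^\s\[:]+)(?:\[(?P<syslog_pid>\d+)\])?: "
    r"(?P<syslog_message>.*)"
)

sample = "Feb  5 17:32:18 fw01 sshd[4210]: Accepted password for admin"
m = SYSLOG_RE.match(sample)
print(m.groupdict())
```

Running this shows the same field names (`syslog_timestamp`, `syslog_hostname`, `syslog_program`, `syslog_pid`, `syslog_message`) that the Logstash event will carry.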
Is that all, or do I need to do something else?
edit: forgot to add that the syslogs are coming from a FortiGate.
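One thing to watch for with FortiGate specifically: its log body is mostly key=value pairs (e.g. `devname=... devid=... logid=... msg="..."`), not the classic program[pid] format, so the RFC3164 grok above may not match those lines. A kv filter is often a better fit; a sketch, assuming the raw line lands in the `message` field (verify against your actual logs):

```
filter {
  kv {
    source      => "message"
    value_split => "="
    field_split => " "
  }
}
```

You would use this instead of (or after) the grok block; if a grok fails to match it tags the event with `_grokparsefailure`, which is an easy way to spot lines the pattern didn't handle.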