This is my first time with ELK. At my workplace I need to handle collection, standardization, analysis, correlation, and reporting of logs from a Palo Alto firewall and a Cisco switch. I am in a hurry; I have worked with a lot of management tools, but I can't normalize the logs.
I am not very familiar with configuring Cisco devices, but I would expect it to be similar to what we use for our Juniper syslogs. My assumption is based on this:
Add a syslog input to Logstash. Something like this
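A minimal sketch of such an input (the port number, the "Juniper" type name, and the grok pattern here are placeholders for illustration, not the exact config from my environment):

```
input {
  tcp {
    port  => 12345
    type  => "Juniper"
    codec => json
  }
}
filter {
  # example grok only; adapt or remove this for your own log format
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} %{HOSTNAME:host} %{GREEDYDATA:msg}" }
  }
}
```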
Only the port number is really required. Change Juniper to Cisco (or whatever you want to call it). The rest are optional configurations for my environment and log formats. That grok pattern will probably not work for you, so start without it.
Configure your Cisco devices to use the IP of your Logstash machine and the port for the syslog input.
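On the Cisco side, the IOS commands might look roughly like this (the IP address and port are placeholders; exact syntax varies by platform and IOS version):

```
! send syslog to the Logstash host over UDP (placeholder IP and port)
logging host 192.0.2.10 transport udp port 1514
! forward messages at severity informational and above
logging trap informational
```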
Then you need at least an output configuration. This is what I use:
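A minimal sketch of such an output, writing each event to the console for debugging:

```
output {
  stdout { codec => rubydebug }
}
```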
Hi @A_B, thanks a lot for your reply. Honestly, it's not very clear to me. I already use Graylog 2.5 for log collection; I would like to use ELK for normalization and log analysis. In my case I work with a Cisco switch and a Palo Alto firewall. I will keep searching.
With this you would have Logstash listening on TCP port 12345 and expecting JSON data. The output is sent to STDOUT, so your console. Send one test message to Logstash to make sure everything works.
Then you can add an elasticsearch output and start to add filters that will let you manipulate the logs.
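For example, an elasticsearch output might be sketched like this (the host URL and index name are placeholders, not values from this thread):

```
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "syslog-cisco-%{+YYYY.MM.dd}"
  }
}
```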
I am stuck, bro; it is so difficult for me to combine Graylog + ELK.
1/ ELK (Cisco switch logs):
The logging service is activated on the Cisco switch; it sends its logs to port 1514 over UDP.
/etc/logstash/conf.d/s-input.conf
input {
  udp {
    port => 1514
    type => "syslog-cisco"
  }
  tcp {
    port => 1514
    type => "syslog-cisco"
  }
}
Please, are there other configurations needed to get the switch logs?
input {
  udp {
    port  => 1514
    codec => json # add codec if you can set Cisco to output JSON, otherwise remove this line for now
  }
}
filter {}
output {
  stdout { codec => rubydebug }
}
If you start Logstash with that config, you should start seeing logs coming in on your console.
@A_B thank you for your efforts, I will follow your example. I want to know precisely which component is responsible for log normalization in ELK: Logstash or Elasticsearch?
That is sending JSON data, so I also have the json codec set for the Logstash input. The data should be received no matter what; it will just not be parsed correctly. Change LOGSTASH_IP to the IP or hostname you are using. The script expects one command line argument, like:
$ ./script_name.sh test
netcat needs to be installed for it to work as well...
Even before you try to send any data, you should make sure Logstash is listening on the correct IP and port. Here is an example using netcat. My Logstash is listening on UDP port 5515:
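A possible sketch of such a test script (the original was not preserved here; LOGSTASH_IP is a placeholder to replace, and 5515 is the UDP port mentioned above):

```shell
#!/bin/sh
# Send one JSON test message over UDP with netcat.
# $1 becomes the message text, e.g. ./script_name.sh test
# Replace LOGSTASH_IP with the IP or hostname of your Logstash machine.
printf '{"message": "%s"}\n' "$1" | nc -u -w1 LOGSTASH_IP 5515
```

If Logstash is listening and the json codec is set, the message should show up on the console via the stdout output.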