First, let me say that I am new to the ELK stack, but I already think it's a great solution.
I use ELK with syslog to collect and search logs from multiple client systems. What I have done so far:
syslog listens on port 514, processes every message (saving it to a specific file), and also forwards every message to localhost port 1514 (to Logstash).
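The post doesn't show the syslog side, but assuming rsyslog is the daemon in use, the save-and-forward setup described above could look like this (file path is hypothetical):

```
# /etc/rsyslog.conf (sketch, assuming rsyslog; adjust facility selectors and paths)
*.*    /var/log/all-messages.log    # save every message to a local file (hypothetical path)
*.*    @127.0.0.1:1514              # a single "@" forwards over UDP to Logstash on 1514
```

A double `@@` would forward over TCP instead; the single `@` matches the UDP input shown below.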
input {
  udp {
    host => "127.0.0.1"
    port => 1514
    type => "syslog"
  }
}
My question is: can I do something to group my logs per client IP? For example, I want to show only logs from client X (IP address A.B.C.D). How do I do that? Should I create a separate index for each client, or maybe use some filter?
I'm not sure how I want to group them. What I want is to show all messages from client X on one page, build visualizations for only one client, etc. Right now all messages from all clients are in one place, and to find a specific client I have to use the "search" option.
Can you please tell me how to set up this filter in Kibana? I tried to enter it in the "search" field, but it doesn't work. I thought filters were placed in Logstash, but maybe I'm wrong.
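For reference, one common approach (not spelled out in this thread) is to parse the syslog header in Logstash so the originating hostname becomes its own searchable field. A sketch, using the standard syslog grok pattern from the Logstash documentation:

```
filter {
  if [type] == "syslog" {
    grok {
      # Split the raw syslog line into timestamp, origin host, program, pid, and message
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    }
  }
}
```

With that in place, a Kibana search such as `syslog_hostname:"clientX"` would show only one client's messages, and the same field could drive per-client visualizations. Note that the `host` field added by the UDP input records the sender of the packet, which here would always be 127.0.0.1 because syslog relays locally; that is why parsing the message itself (or the per-host file path, as below) is needed.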
Mark - thank you for your help. I think I solved my problem: instead of redirecting my syslog to Logstash on port 1514, I changed my Logstash configuration to read the saved log files directly.
Now I can use the "path" value as a search criterion (for example /path/to/saved/logs/2/syslog.log), so that only logs from one host (host 2) are shown. One thing I don't understand yet is the pros and cons of this solution; I think I will open a new discussion about it.
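The replacement configuration isn't quoted in the thread, but given the `path` values described, it presumably switched to Logstash's file input reading each host's saved log. A hedged sketch (the glob path is an assumption based on the example above):

```
input {
  file {
    # One syslog.log per client directory, as written out by the local syslog daemon
    path => "/path/to/saved/logs/*/syslog.log"
    type => "syslog"
  }
}
```

The file input records the source file in the `path` field of each event, which is what makes the per-host search described above work.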