Good morning everyone,
I'm just getting started with Logstash and Elasticsearch, so this may be a silly question; if so, sorry for taking up your time.
We've set up a central server as the syslogd server, and the rest of the servers forward their system messages to it.
These log files are then forwarded from the central syslog server to our Logstash / Elasticsearch / Kibana installation.
Indices are created correctly and the information is forwarded as expected.
Now we need to move one step further and check the lines in the messages files in order to generate a file containing only those lines that match some Perl-style regular expressions, such as "nfs_statfs:\s+statfs\s+error" or "Inquiry\s+failed\s+on\s+FCP\s+device\s+with\s+device\s+id\s+0x\w{6}" (I've put a rough sketch of what I was thinking about below our current config).
This is our current configuration file:
cat logstash-syslogfullv3.conf
input {
  file {
    path => "/var/log/hpoodganglia0*/*"
    type => "syslog"
  }
}

filter {
  if [type] == "syslog" {
    # parse the standard syslog line into timestamp, host, program, pid and message
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => {
        "environment" => "DEV"
        "system" => "HPC_pRed_Cluster"
      }
    }
    syslog_pri { }
    # use the timestamp from the log line as the event timestamp
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch {
    host => "rbalhpc06"
    cluster => "robinhood"
  }
}
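
From reading about conditionals I was thinking of tagging the matching events in an extra filter block and then writing them out with a second file output, something along these lines. This is only a rough sketch I haven't tested; the "matched_error" tag name and the output path /var/log/logstash/matched_errors.log are just placeholders I made up:

# sketch only: tag events whose syslog_message matches one of the patterns
filter {
  if [syslog_message] =~ /nfs_statfs:\s+statfs\s+error/ or [syslog_message] =~ /Inquiry\s+failed\s+on\s+FCP\s+device\s+with\s+device\s+id\s+0x\w{6}/ {
    mutate { add_tag => [ "matched_error" ] }
  }
}

output {
  # write only the tagged lines to a separate file (path is a placeholder)
  if "matched_error" in [tags] {
    file {
      path => "/var/log/logstash/matched_errors.log"
    }
  }
}

Is something like this the right way to do it, or is there a better approach?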
Thanks in advance for your help and patience.
Kind regards