How to extract specific entries matching a pattern to a different file

Good morning everyone,

I'm just getting started with Logstash and Elasticsearch, so this may be a silly question; if so, sorry for taking up your time.

We've set up a central server as the syslogd server, and the rest of the servers forward their system messages to it.

These log files are then forwarded from this central syslog server to our Logstash, Elasticsearch, and Kibana installation.

Indices are created correctly and the information is forwarded properly.

Now we need to go one step further and check the lines in the messages files in order to generate a file containing only the lines that match certain Perl-style regular expressions, such as "nfs_statfs:\s+statfs\s+error" or "Inquiry\s+failed\s+on\s+FCP\s+device\s+with\s+device\s+id\s+0x\w{6}".

This is our current configuration file:

cat logstash-syslogfullv3.conf

input {
  file {
    path => "/var/log/hpoodganglia0*/*"
    type => "syslog"
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "environment", "DEV" ]
      add_field => [ "system", "HPC_pRed_Cluster" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch {
    host => "rbalhpc06"
    cluster => "robinhood"
  }
}

Thanks in advance for your help and patience

Kind regards

Add an extra output, wrapped in a conditional:

output {
  if [message] =~ /Inquiry\s+failed\s+on\s+FCP\s+device\s+with\s+device\s+id\s+0x\w{6}/ {
    file {
      path => "/path/to/log"
    }
  }
}
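
If you want both of the expressions from your post captured into the same file, the conditional can combine them with or. Something along these lines should work (an untested sketch, reusing the placeholder path from above):

output {
  if [message] =~ /nfs_statfs:\s+statfs\s+error/ or [message] =~ /Inquiry\s+failed\s+on\s+FCP\s+device\s+with\s+device\s+id\s+0x\w{6}/ {
    file {
      # replace with the path where you want the extracted lines written
      path => "/path/to/log"
    }
  }
}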

Good afternoon Markus,

That did the trick, thanks a lot for the fast help.

The file is now being created on the syslog server in the proper way. Is there a way to have this file processed as well, with an index created in Elasticsearch, so that we can check in Kibana the number of entries written to this file?

Regards

With the configuration you originally posted you're already sending all messages to Elasticsearch; you then additionally wanted some messages written to a separate file, and that's the question I answered.
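
If the end goal is just to count those matching entries in Kibana, one option (a sketch only, not tested against your setup) is to skip re-ingesting the file and instead tag the matching events in the filter section; the tag name below is made up, pick whatever you like:

filter {
  if [message] =~ /nfs_statfs:\s+statfs\s+error/ or [message] =~ /Inquiry\s+failed\s+on\s+FCP\s+device\s+with\s+device\s+id\s+0x\w{6}/ {
    mutate {
      # "storage_error" is just an example tag name
      add_tag => [ "storage_error" ]
    }
  }
}

In Kibana you can then search for tags:storage_error to see and count only those events, since they are already being indexed by your existing elasticsearch output.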