Parse path with file input in Logstash

I'm throwing some Suricata logs through Logstash and it works fine, but I'm wondering if there is a better way to separate and tag the log files...

> input {
>   file {
>     path => "/home/user/nfs/suricata/bronn/eve.json"
>     codec => "json"
>     start_position => "beginning"
>     sincedb_path => "/dev/null"
>     type => "suricata"
>   }
> }

As you can see, I'm using an NFS mount to store all of the eve.json files from different computers. In this case, "bronn" is the specific computer. Suricata writes its JSON output to eve.json.

Filter:

  if [path] =~ "(?<![\w\d])bronn(?![\w\d])" {
    mutate {
      add_field => { "monitor-hostname" => "bronn-monitor" }
    }
  }

I'm wondering if there is a better way to define a single path in the input section, such as "/home/user/nfs/suricata/*/*", and then parse the path for the computer name so I can use it later when adding a field.
The way it's working right now is fine, but in the filter section I would have to add a new conditional for each computer, which requires restarting Logstash and would get pretty large eventually.
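
Roughly, I'm picturing something like this (just a sketch; the wildcard path is the only change from the working config above, everything else is carried over):

> input {
>   file {
>     path => "/home/user/nfs/suricata/*/*"
>     codec => "json"
>     start_position => "beginning"
>     sincedb_path => "/dev/null"
>     type => "suricata"
>   }
> }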

Thank you


Was able to accomplish this with:

path => "/home/user/nfs/suricata/*/*"

filter {
  grok {
    match => { "path" => "%{GREEDYDATA}/%{GREEDYDATA:monitor-hostname}.json" }
  }
}
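
If you also want the "-monitor" suffix from the original conditional, one way to get it (a sketch that builds on the monitor-hostname field the grok above extracts, using a sprintf reference) is a mutate:

filter {
  mutate {
    # Append "-monitor" to whatever grok pulled out of the path,
    # e.g. "bronn" becomes "bronn-monitor".
    replace => { "monitor-hostname" => "%{monitor-hostname}-monitor" }
  }
}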

I strongly suggest you don't use two GREEDYDATA patterns like that. It's inefficient and could match incorrectly. I suggest this instead:

/(?<monitor-hostname>[^/]+)\.json$
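
Dropped into a grok filter it would look something like this (a sketch, keeping the monitor-hostname field name from above):

filter {
  grok {
    # Capture the last path segment before ".json" instead of
    # backtracking through two GREEDYDATA patterns.
    match => { "path" => "/(?<monitor-hostname>[^/]+)\.json$" }
  }
}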

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.