How to parse out logs based on a hostname

Hi, I am trying to parse out nginx logs based on a specific hostname, apply grep to those logs, and send the output to a separate file. I was thinking of using bash scripting for this. I would like to know how I can insert a bash script inside the grok expression.
Here is the sample log:
1.1.1.1 2.2.2.2 - - [17/Sep/2019:13:55:48 +0000] "GET /search&search_term=elastic+logstash+kibana+md&lat=0.000000000&lng=-0.0000000 HTTP/1.0" 302 328 "abc.com" "-" "" "-" - 0.054 0.056 . - 127.0.0.1:1234

And here is the sample Logstash configuration with the grok expression I use:
input { stdin { } }

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

I was thinking I could use an if statement after the match to try to send those logs to the other folder, or use another language if I need to. Thanks.

I do not fully understand what you mean, but you can change the output path via a field name:
https://www.elastic.co/guide/en/logstash/current/plugins-outputs-file.html#plugins-outputs-file-path

So if you want to separate the output into dedicated files for each host, you can use the file output.
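
For example, here is a minimal sketch that writes each event to a file named after its host, assuming a field like [hostname] has already been extracted by grok (the field name and the path are hypothetical placeholders):

output {
  file {
    # one file per host, e.g. /var/log/nginx/abc.com.log
    path => "/var/log/nginx/%{hostname}.log"
  }
}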
If you need more separation, you can use if statements in the output too,

like:

output {
  if [hosts] {
    # runs only when the event has a [hosts] field
    elasticsearch { hosts => ["localhost:9200"] }
  }
  if "somestring" in [field] {
    # runs only when [field] contains "somestring"
    stdout { codec => rubydebug }
  }
}
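
Applied to your sample log, a sketch that routes one hostname to its own file and everything else to Elasticsearch could look like this (again assuming grok has extracted a [hostname] field; the field name, the abc.com value, and the path are just placeholders):

output {
  if [hostname] == "abc.com" {
    # events for this specific host go to a dedicated file
    file { path => "/var/log/nginx/abc.com.log" }
  } else {
    # everything else keeps going to Elasticsearch
    elasticsearch { hosts => ["localhost:9200"] }
  }
}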

Hope this helps you.
