Log Parser for multiple fields and files

I need to extract data that occurs in multiple files, and multiple times within the same file. It is presented in the format below.
19-10-25  Name                 Succ  Fail  Reject  Thrput  Response time (ms)
                                                   (/s)    Avg  Min  Max
03:08:58  Total request        0     0     0       0       -    -    -
          Inactive session     0     0     0       0       -    -    -
          Reevaluated session  0     0     0       0       -    -    -
03:09:08  Total request        0     0     0       0       -    -    -
          Inactive session     0     0     0       0       -    -    -
          Reevaluated session  0     0     0       0       -    -    -

If you want different parsing per file but you're using a single Filebeat instance, you can add a custom field to each event in filebeat.yml (the Filebeat config), per input:

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/*.log

  # Here, you can define a custom field
  # Here, you can define a custom field
  fields:
    apache: true
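
Since the marker is defined per input, you'd typically add a second input for the files that need the other pattern. A minimal sketch, assuming a hypothetical path; anything without the apache field will fall through to the else branch in Logstash:

- type: log
  enabled: true

  # Hypothetical second set of files that need the other grok pattern.
  paths:
    - /var/log/other/*.log

  # No "apache" field here, so these events take the else branch
  # of the Logstash conditional below.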

Then, in your Logstash pipeline, you can branch on that field with a conditional. Note that custom fields defined this way arrive nested under [fields] (unless fields_under_root is set in Filebeat), which is why the condition reads [fields][apache]:

input {
	beats {
		port => <your port>
	}
}

filter {
	if [fields][apache] {
		grok {
			match => { "message" => <your pattern> }
		}
	} else {
		grok {
			match => { "message" => <your other pattern> }
		}
	}
}

output {
	stdout { codec => rubydebug }
}
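
For the sample format in the question, <your pattern> could start out like the sketch below. The field names (counter_name, succ, rt_avg, and so on) are placeholders of my own choosing, and the timestamp is made optional because the continuation lines (Inactive session, Reevaluated session) don't carry one. Treat this as a starting point to refine in the Grok Debugger, not a drop-in answer:

grok {
	# Hypothetical pattern for lines such as:
	#   03:08:58 Total request 0 0 0 0 - - -
	#   Inactive session 0 0 0 0 - - -
	# A literal "-" is accepted wherever no response time was recorded.
	match => {
		"message" => "(?:%{TIME:log_time}\s+)?%{DATA:counter_name}\s+%{INT:succ}\s+%{INT:fail}\s+%{INT:reject}\s+%{INT:thrput}\s+(?:%{NUMBER:rt_avg}|-)\s+(?:%{NUMBER:rt_min}|-)\s+(?:%{NUMBER:rt_max}|-)$"
	}
}

On the first sample line this would yield log_time => 03:08:58, counter_name => Total request, zeros for succ/fail/reject/thrput, and no rt_* fields, since the dashes match literally.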
