Is there any option to store logs to a file before parsing them with grok? Please suggest. Can I use logger.info() in a filter plugin?
There is no option AFAIK.
What is received will be in the message field, or in [event][original] with ECS v8:
"event" => {
"sequence" => 0,
"original" => "Some text"
}
What you can do is save to a file at the end, in the output section:
file { path => "/path/filename_%{+YYYY-MM-dd}.txt" }
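A fuller sketch of that output section, if you want the file to contain only the raw text rather than the whole JSON event (path and field reference are illustrative, and the field is `message` in non-ECS mode):

```
output {
  file {
    # one file per day; adjust the path to your environment
    path  => "/var/log/logstash/raw_%{+YYYY-MM-dd}.txt"
    # write just the original line, not the serialized event
    codec => line { format => "%{[event][original]}" }
  }
}
```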
or use
stdout { codec => rubydebug }
You could also use ruby code to save to a file. If you have issues with it, paste a sample here along with the fields you expect, and someone will help.
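For example, a ruby filter placed before grok could append the raw event to a file. This is only a sketch (the path is illustrative, and note this runs on every worker thread, so it is not a robust logging solution):

```
filter {
  ruby {
    # append the unparsed message to a file before grok runs
    code => 'File.open("/tmp/raw_events.log", "a") { |f| f.puts(event.get("message")) }'
  }
  grok {
    match => { "message" => "%{GREEDYDATA:msg_text}" }
  }
}
```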
Why not use the clone filter (Clone filter plugin | Logstash Reference [8.6] | Elastic) for those events, and then, in the output section, write only the cloned events to the file?
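Something along these lines (a sketch; the clone name `raw_copy` is made up, and depending on your ECS compatibility setting the clone may be marked via the `type` field or via `tags`, so check which your version uses):

```
filter {
  clone {
    clones => ["raw_copy"]
  }
  # only parse the original event, not the clone
  if [type] != "raw_copy" {
    grok {
      match => { "message" => "%{GREEDYDATA:msg_text}" }
    }
  }
}
output {
  if [type] == "raw_copy" {
    file { path => "/path/raw_%{+YYYY-MM-dd}.txt" }
  } else {
    stdout { codec => rubydebug }
  }
}
```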
Use pipeline-to-pipeline communication with a forked-path pattern to process events in two different ways. That can also be done using a clone filter and conditionals.
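A forked-path sketch in pipelines.yml, using the documented pipeline input/output plugins (pipeline ids, port, and paths are all illustrative):

```
# pipelines.yml — one intake pipeline fans out to two downstream pipelines
- pipeline.id: intake
  config.string: |
    input { beats { port => 5044 } }
    output {
      pipeline { send_to => ["archive", "parsed"] }
    }
- pipeline.id: archive
  config.string: |
    input { pipeline { address => "archive" } }
    # raw events, written before any parsing happens
    output { file { path => "/path/raw_%{+YYYY-MM-dd}.txt" } }
- pipeline.id: parsed
  config.string: |
    input { pipeline { address => "parsed" } }
    filter { grok { match => { "message" => "%{GREEDYDATA:msg_text}" } } }
    output { stdout { codec => rubydebug } }
```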