Parsing log

Hello,
I have set up the configuration to integrate Filebeat, Logstash, and Elasticsearch.
I need to create a filter that triggers every time an ERROR appears in the log. Any solution for this?
Where can I find documentation on this? I am just starting to use these tools.


You can take a look at the Logstash filter plugins documentation to start configuring Logstash filters.


Hi, thank you. This is my current configuration:

input {
  beats {
    port => "5044"
  }
}

# The filter part of this file is commented out to indicate that it is optional.
filter {
  grok {
    match => { "message" => "%{DATE:fecha} %{TIME:time} %{LOGLEVEL:logLevel} %{GREEDYDATA:data}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
I would like to know how to consume this information from Elasticsearch through a web service. Any recommendations?

I think you should add more information about your intended use case.

For example, you can use alerting (depending on your Elastic subscription) to search the ingested data in Elasticsearch and send information to a web service when errors are found.
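As a minimal sketch of the "search within the ingested data" idea: the snippet below builds an Elasticsearch query for documents whose logLevel is ERROR and posts it to the REST _search endpoint. The host, the logstash-* index pattern, and the logLevel field name are assumptions based on the configuration earlier in this thread, not something your cluster is guaranteed to have:

```python
import json
import urllib.request

def error_query(level="ERROR"):
    """Build an Elasticsearch query body matching log entries at the given level."""
    return {"query": {"match": {"logLevel": level}}}

def search_errors(host="http://localhost:9200", index="logstash-*"):
    """POST the query to the _search endpoint and return the raw hits.

    Assumes the default Logstash index pattern and an unsecured local cluster.
    """
    body = json.dumps(error_query()).encode("utf-8")
    req = urllib.request.Request(
        f"{host}/{index}/_search",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["hits"]["hits"]

if __name__ == "__main__":
    for hit in search_errors():
        print(hit["_source"].get("data"))
```

A small web service could wrap search_errors() and expose the results over HTTP, or a scheduled job could call it and notify another system when hits come back.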

I associate the idea of consumption more with queues and other services like Redis, Kafka, etc., but there are surely multiple solutions that fit your purpose.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.