Filter or remove some logs in Logstash

I have a Filebeat -> Logstash -> Elasticsearch setup and everything is working fine.
I have a filter condition in Logstash so that I can parse the logs.
My log file also contains some additional lines like:

  ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
  Server started +
  ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

I can see these lines in Elasticsearch, but I don't want them there.
Is there any way I can send only particular logs to Logstash?

Take a closer look at your filebeat.yml setup,

in particular the paths: section, where you need to direct the prospector in finer detail.

For example, if you want to pick up or exclude a particular log file type, you can use wildcards:

- type: log

  # Change to true to enable this prospector configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    - d:\logs\iamalog*.log

Thanks for the response @bloke, but my issue is different.
I have a single log file, and inside that log file there are some log lines like the ones I mentioned above. How do I remove those?

You might be able to use a drop_event processor in Filebeat or a drop filter in Logstash.
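
For the Logstash side, a minimal sketch could wrap the drop filter in a conditional on the message field. The regexes here are assumptions based on the sample lines shown above, so adjust them to match your actual log format:

  filter {
    # Drop the "+" banner lines and the "Server started" notice
    # (patterns are assumptions based on the sample in the question)
    if [message] =~ /^\++\s*$/ or [message] =~ /Server started/ {
      drop { }
    }
  }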

Use the exclude_lines feature in the prospector, within filebeat.yml:

  # Exclude lines. A list of regular expressions to match. It drops the lines that are
  # matching any regular expression from the list.
  exclude_lines: ['^DBG']

and test your patterns against your log lines with a regex tool like https://regex101.com/.
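
For the lines shown in the question, something along these lines might work (just a sketch; the exact patterns depend on how those banner lines really appear in the file):

  # Assumed patterns for the "+" banner lines and the "Server started" line
  exclude_lines: ['^\+{10,}', 'Server started']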

Hope it helps.
