Filebeat multiline stopped working

Hi,

I have been using Filebeat for some time with a multiline pattern. Everything was working fine until last week, when it stopped working.

I was using Filebeat 7.14.1, but I have now tried both 7.6.2 and 7.15. I've been trying to find the cause, tried several things, and started again from a very basic config. When I do not enable multiline, I read about 2k messages (not every message in my logs is multiline), but when I enable the multiline pattern, most of the messages get grouped together and I read only ~50 messages; when I view a message in Discover, multiple messages have been merged into one.

I'm not sure what happened; I even tried it on a fresh machine with a fresh installation. I've spent many hours on reconfiguration and troubleshooting but am unable to find the cause. Please help, I have a demo soon.

The weird thing is: why did it stop working at all?

My filebeat.yml is:

filebeat.inputs:

- type: log

  # Change to true to enable this input configuration.
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths: 
    - C:\logs\log1\**
  multiline.pattern: '^[0-9]{4}/[0-9]{2}/[0-9]{2}'
  multiline.negate: true
  multiline.match: after
# ============================== Filebeat modules ==============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

# ======================= Elasticsearch template setting =======================

setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false

# =================================== Kibana ===================================

# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:

  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify and additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  #host: "localhost:5601"

  # Kibana Space ID
  # ID of the Kibana Space into which the dashboards should be loaded. By default,
  # the Default Space will be used.
  #space.id:
  
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]

# ------------------------------ Logstash Output -------------------------------
#output.logstash:
  # The Logstash hosts
  #hosts: ["localhost:5044"]
   
  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"

# ================================= Processors =================================
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~

Thanks

I have tested the indentation as well with:

filebeat test config -c filebeat.yml

and it reports OK.
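For anyone debugging a similar issue, the multiline pattern itself can be checked outside Filebeat first. A minimal sketch in Python; the sample log lines are hypothetical, substitute real lines from your logs:

```python
import re

# The multiline.pattern from the config above: a line starting with YYYY/MM/DD
pattern = re.compile(r"^[0-9]{4}/[0-9]{2}/[0-9]{2}")

# Hypothetical log lines -- replace with real lines from C:\logs\log1\
lines = [
    "2021/09/20 10:15:01 ERROR something failed",  # should start a new event
    "  at com.example.Foo.bar(Foo.java:42)",       # should be a continuation
]

# With multiline.negate: true and multiline.match: after, lines that do NOT
# match the pattern are appended to the previous line that did match.
for line in lines:
    starts_event = bool(pattern.match(line))
    print(("NEW EVENT" if starts_event else "continuation") + ": " + line)
```

If a line you expect to start an event prints as "continuation" here, the pattern (not Filebeat) is the problem.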

Hi,

I found the cause just after posting this question.

There were two problems:

1. While troubleshooting, I somehow changed the multiline pattern on my new system, which explains why it was not working there even though everything else was fine.

2. On the system where it originally stopped working, the cause, as far as I can tell, is that I opened my .yml file in IntelliJ and it somehow changed the file format. The indentation is still 2 spaces, but there must be something else: whenever I use the filebeat.yml that was opened in IntelliJ it doesn't work, but when I use a new file (never opened in IntelliJ) with exactly the same content, I get what I expect.
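I can't say for certain what IntelliJ changed, but invisible characters are a common culprit when a YAML file "looks identical" yet behaves differently. A hedged diagnostic sketch (the function name and the example path are mine, not from Filebeat):

```python
def check_yaml_hygiene(path: str) -> list:
    """Report invisible problems an editor can introduce into a YAML file:
    a UTF-8 BOM, tab characters, CRLF line endings, non-breaking spaces."""
    problems = []
    with open(path, "rb") as f:
        raw = f.read()
    if raw.startswith(b"\xef\xbb\xbf"):
        problems.append("UTF-8 BOM at start of file")
    if b"\t" in raw:
        problems.append("tab character (YAML indentation must use spaces)")
    if b"\r\n" in raw:
        problems.append("CRLF line endings")
    if b"\xc2\xa0" in raw:
        problems.append("non-breaking space (looks like a normal space)")
    return problems

# Example (path is an assumption -- point it at your real config):
# print(check_yaml_hygiene(r"C:\Program Files\Filebeat\filebeat.yml"))
```

Comparing the IntelliJ-touched file and the fresh file with a check like this (or a hex diff) should show exactly what changed.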

It is very strange, and I wanted to share this to save others time. I tried everything I could think of, so I understand the frustration of anyone who gets stuck in a situation like mine.

Thanks