Logstash blocked when using Regex filter

Dear community,

When I add a specific filter to my pipeline, a few KB of data are processed and then everything blocks (no more output when debugging Logstash).

When I check my Beats logs, they say that the remote port is closed:

2022-03-08T14:01:27.848+0100	ERROR	[publisher_pipeline_output]	pipeline/output.go:154	Failed to connect to backoff(async(tcp://server1:5044)): dial tcp 10.X.X.X:5044: connectex: No connection could be made because the target machine actively refused it.

Here are the problematic filters:

filter {
    if [log][file][path] =~ /^E:.*Dirianance.*_Logs_hrep.*\.log$/ {
        mutate {
            add_field => {
                "classification" => "debug"
                "service" => "guiv"
                "es_index" => "guiv-%{+yyyy.MM.dd}"
            }
        }
    }
}

and:

filter {
    if [log][file][path] =~ /^C:.*bpubnet.*logs.*LogFiles.*ZXYS4.*\.log$/ {
        mutate {
            add_field => {
                "classification" => "security"
                "service" => "ffz"
                "es_index" => "ffz-%{+yyyy.MM.dd}"
            }
        }
    }
}

And here is the output:

output {
    elasticsearch {
        hosts => ["https://server1:9200"]
        ssl => true
        ssl_certificate_verification => true
        cacert => "/etc/logstash/certs/elasticsearch-ca.pem"
        user => "dt_logstash_writer"
        password => "XXXXXXXXXXXXXXXXX"
        ilm_enabled => "false"
        manage_template => false
        index => "%{es_index}"
        ecs_compatibility => "disabled"
    }
}

I first thought it was a queue problem and tried changing queue.max_bytes from 1gb to 64mb, but it did not fix the problem.
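For reference, this is roughly what I changed in logstash.yml (assuming the persisted queue is enabled, which is what queue.max_bytes applies to):

queue.type: persisted
queue.max_bytes: 64mb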

I've found a simple fix, which consists of setting these fields directly in the Beats config, but I want to understand why the regexes are blocking Logstash.
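Roughly, the workaround looks like this in filebeat.yml (the path and values here are only illustrative, not my real config); note that the date-stamped es_index still has to be built in Logstash, since Beats fields are static:

filebeat.inputs:
  - type: log
    paths:
      - 'E:\path\to\guiv\*.log'    # hypothetical path
    fields:
      classification: "debug"
      service: "guiv"
    fields_under_root: true        # place the fields at the event root, like the Logstash filters do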

Any idea why these regexes block Logstash when it tries to process inputs?

Thanks for your help.

Best regards,

Valentin Magnan

Actually, I think it is caused by my "es_index" field.

I'm pretty sure about it, since this filter also blocks Logstash:

filter {
    if [service] == "ffz" {
        mutate {
            add_field => {
                "es_index" => "ffz-%{+yyyy.MM.dd}"
            }
        }
    }
}

But that does not make sense to me, because es_index is a simple text field in my index pattern, and when I create the field it is also text.

The problem was that the target index did not exist in Elasticsearch.
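In hindsight, I think the chain of events was: when an event did not get es_index set (or the dated index it pointed at could not be created), the elasticsearch output kept retrying the failed bulk requests, the pipeline backed up, and eventually the Beats input stopped accepting connections, which is why Filebeat reported the port as refused. A small guard before the output would have made this visible much sooner (the fallback index name below is just an illustration):

filter {
    # Fallback so the output never receives the literal string "%{es_index}"
    # when none of the earlier conditionals matched.
    if ![es_index] {
        mutate {
            add_field => { "es_index" => "unmatched-%{+yyyy.MM.dd}" }
        }
    }
}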
