Grok filter with multiple regex patterns not working

I have designed the simple regex pattern below, and it was working fine for me.

input {
    beats {
        port => "5044"
        ssl => true
        ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
        ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
    }
}
 filter {
    grok {
        match => { "message" => "(?<timestamp>[0-9]{4}.[0-9]{2}.[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}.[0-9]{3})"}
    }
}

output {
  elasticsearch {
    hosts => ["192.168.200.42:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
}

However, when I tried it with two regex patterns, the conf file below failed to write data to Elasticsearch.

input {
    beats {
        port => "5044"
        ssl => true
        ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
        ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
    }
}
 filter {
    grok {
        match => { "message" => "(?<timestamp>[0-9]{4}.[0-9]{2}.[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}.[0-9]{3})"}
    }
    grok {
        match => { "message" => "(?<corid> [[0-9][a-z]]{8}-[[0-9][a-z]]{4}-[[0-9][a-z]]{4}-[[0-9][a-z]]{4}-[[0-9][a-z]]{12})"}
    }

}

output {
  elasticsearch {
    hosts => ["192.168.200.42:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
}

Since I am new to the ELK stack as well as to grok and its behaviour, I'd appreciate it if someone could help me with this.

Thanks in advance! :)

If Logstash has problems sending to ES, it'll tell you about it in the log.

While debugging inputs and filters, use a simple output like stdout { codec => rubydebug }. Once you've confirmed that things are behaving as expected, you can add additional complexity like an ES output.
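
For example, a minimal pipeline for testing the grok pattern in isolation could look something like this, with stdin standing in for the beats input (only a sketch for local testing, not the production config):

input {
    stdin { }
}
filter {
    grok {
        # same timestamp pattern as in the original config
        match => { "message" => "(?<timestamp>[0-9]{4}.[0-9]{2}.[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}.[0-9]{3})" }
    }
}
output {
    stdout { codec => rubydebug }
}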

Thanks @magnusbaeck! Please guide me on how to add stdout. I did something like the below, but it didn't work as expected:

output {
stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["192.168.200.42:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
}

You can just try this in the output section of your conf file:

output {
stdout { codec => rubydebug }
}
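
To actually see that output, run Logstash in the foreground against the config file; the paths below are only examples and depend on how Logstash was installed:

/opt/logstash/bin/logstash -f /etc/logstash/conf.d/your-config.conf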

Regards

Thanks @vchandrashekar! Can't I add both, the way I mentioned above?

With the rubydebug entry, you can verify whether you are seeing the expected output.

Once you are OK with the data, you can debug why it is not being sent to Elasticsearch.

If possible, please paste the error that you are seeing.
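
If the timestamp pattern matches, the rubydebug output should show the captured field next to the standard event fields, roughly like this (the values here are made up for illustration):

{
       "message" => "2018.05.09 14:10:31.256 some log line",
      "@version" => "1",
    "@timestamp" => "2018-05-09T08:40:31.256Z",
          "host" => "localhost",
     "timestamp" => "2018.05.09 14:10:31.256"
}

If it doesn't match, the event will carry a _grokparsefailure entry in its tags field instead.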

@vchandrashekar, logstash.log is the file that we need to check, right?

Yes.
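
You can also ask Logstash to validate the config without starting the pipeline; depending on your version the flag is --configtest (1.x/2.x) or --config.test_and_exit (5.x and later), e.g.:

/opt/logstash/bin/logstash -f /etc/logstash/conf.d/your-config.conf --configtest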

Hi all,

This is the complete Logstash config that I have right now.

input {
    beats {
        port => "5044"
        ssl => true
        ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
        ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
    }
}
 filter {
    grok {
        match => {"message" => "(?<timestamp>[0-9]{4}.[0-9]{2}.[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}.[0-9]{3})"}
        match => {"message" => "(?<corid> [[0-9][a-z]]{8}-[[0-9][a-z]]{4}-[[0-9][a-z]]{4}-[[0-9][a-z]]{4}-[[0-9][a-z]]{12})"}
    }
}

output {
  stdout { codec => rubydebug }
}

And below is the tail of logstash.log. Any idea how to troubleshoot the above?

root@localhost:/var/log# tail -f logstash.log
tail: cannot open ‘logstash.log’ for reading: No such file or directory
root@localhost:/var/log# cd logstash/
root@localhost:/var/log/logstash# tail -f logstash.log
{:timestamp=>"2018-05-09T14:10:31.256000+0530", :message=>"SIGTERM received. Shutting down the pipeline.", :level=>:warn}
{:timestamp=>"2018-05-09T14:16:28.769000+0530", :message=>"SIGTERM received. Shutting down the pipeline.", :level=>:warn}
{:timestamp=>"2018-05-09T14:19:28.381000+0530", :message=>"SIGTERM received. Shutting down the pipeline.", :level=>:warn}
{:timestamp=>"2018-05-09T15:04:20.584000+0530", :message=>"SIGTERM received. Shutting down the pipeline.", :level=>:warn}

{:timestamp=>"2018-05-09T15:07:47.107000+0530", :message=>"SIGTERM received. Shutting down the pipeline.", :level=>:warn}

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.