GREEDYDATA does not work

Hello,

I have Logstash v2.3 with the following configuration:
> input {
> beats {
> port => 5044
> }
> }
> filter {
> grok {
> match => ["message", "(?m)[%{HTTPDATE:msg_timestamp}] [%{LOGLEVEL:log_level}] %{GREEDYDATA:log_message}"]
> }
> date {
> match => ["msg_timestamp", "dd/MMM/YYYY:HH:mm:ss Z"]
> target => "@timestamp"
> }
> }
> output {
> elasticsearch {
> ...
> }
> }

Moreover, I have the multiline pattern enabled in filebeat as:

pattern: ^[
negate: true
match: after

In log_message I'm expecting any kind of character, possibly across multiple lines. So messages such as the following should match, but grok is failing:
[26/Oct/2016:10:10:29 +0200] [DEBUG] Preparing/etc
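For reference, a short Python sketch of the multiline match this pattern is aiming for: in the regex dialect grok uses, `(?m)` makes `.` match newlines, which is what Python calls `re.DOTALL`, and GREEDYDATA is just `.*`. The `.+?` and `\w+` here are simplified stand-ins for HTTPDATE and LOGLEVEL, with the brackets escaped:

```python
import re

# Two physical lines joined by filebeat's multiline handling
msg = "[26/Oct/2016:10:10:29 +0200] [DEBUG] first line\nsecond line"

# re.DOTALL plays the role of grok's (?m): "." also matches "\n",
# so the trailing (.*) (the GREEDYDATA stand-in) swallows both lines.
m = re.match(r"\[(.+?)\] \[(\w+)\] (.*)", msg, re.DOTALL)
print(m.group(3))  # 'first line\nsecond line'
```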

However, in http://grokdebug.herokuapp.com/ everything works perfectly. What should I change? Is there any known problem/bug with GREEDYDATA? What should I use instead?

Thanks in advance,
Cheers
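As an aside on the date filter in the configuration above: the Joda pattern dd/MMM/YYYY:HH:mm:ss Z corresponds to the strptime format in this Python sketch (the mapping to strptime codes is my own, not from the thread):

```python
from datetime import datetime

ts = "26/Oct/2016:10:10:29 +0200"

# %d/%b/%Y:%H:%M:%S %z mirrors the Joda pattern dd/MMM/YYYY:HH:mm:ss Z
parsed = datetime.strptime(ts, "%d/%b/%Y:%H:%M:%S %z")
print(parsed.isoformat())  # 2016-10-26T10:10:29+02:00
```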

I'm assuming you've escaped your square brackets, and if I make that change it works fine:

$ cat test.config 
input { stdin { } }
output { stdout { codec => rubydebug } }
filter {
  grok {
    match => {
      "message" => "(?m)\[%{HTTPDATE:msg_timestamp}\] \[%{LOGLEVEL:log_level}\] %{GREEDYDATA:log_message}"
    }
  }
}
$ echo '[26/Oct/2016:10:10:29 +0200] [DEBUG] Preparing/etc' | logstash -f test.config
Settings: Default pipeline workers: 8
Pipeline main started
{
          "message" => "[26/Oct/2016:10:10:29 +0200] [DEBUG] Preparing/etc",
         "@version" => "1",
       "@timestamp" => "2016-10-26T17:31:02.403Z",
             "host" => "bertie",
    "msg_timestamp" => "26/Oct/2016:10:10:29 +0200",
        "log_level" => "DEBUG",
      "log_message" => "Preparing/etc"
}
Pipeline main has been shutdown
stopping pipeline {:id=>"main"}
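The escaping matters because an unescaped `[` opens a character class in the regex engine grok uses; Python's `re` behaves the same way on this point, so a quick sketch shows the difference (again with `.+?` and `\w+` as simplified stand-ins for HTTPDATE and LOGLEVEL):

```python
import re

line = "[26/Oct/2016:10:10:29 +0200] [DEBUG] Preparing/etc"

# Unescaped: [...] is read as a character class, so the literal "["
# at the start of the line is never matched.
unescaped = r"[(.+?)] [(\w+)] (.*)"
# Escaped: \[ and \] match the literal square brackets.
escaped = r"\[(.+?)\] \[(\w+)\] (.*)"

print(re.match(unescaped, line))           # None: no match at all
print(re.match(escaped, line).groups())    # the three expected captures
```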

This match does not work against your example line.

HTTPDATE does not seem to be correct for that format.

Oh, it looks like you have not escaped the square brackets; these have to be \[ and \].

Oops, I didn't realise the escaping was missing in my post; in my actual configuration it's like the following:
(?m)\[%{HTTPDATE:msg_timestamp}\] \[%{LOGLEVEL:log_level}\] %{GREEDYDATA:log_message}

I seem to have a problem with the escape character. I have noticed that messages like "XXXX /etc" work without any problem, but messages like "XXX/etc" do not (note the space before the "/"). What do you think the issue could be? Why does the "/" sometimes cause problems and sometimes not? I give an example below:

...
input_type	  	log
message	  	[26/Oct/2016:10:10:29 +0200] [DEBUG] Preparing/etc
offset	  	438,072
tags	  	beats_input_codec_plain_applied, _grokparsefailure
...
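For what it's worth, "/" is not a metacharacter in grok's regex dialect, nor in Python's, so it should never need escaping; a quick sketch showing both variants matching:

```python
import re

# "/" has no special meaning in the pattern, so it needs no escaping:
# both the spaced and unspaced message bodies match the same pattern.
pattern = re.compile(r"\[(\w+)\] (.*)")
print(pattern.match("[DEBUG] Preparing/etc").groups())   # ('DEBUG', 'Preparing/etc')
print(pattern.match("[DEBUG] Preparing /etc").groups())  # ('DEBUG', 'Preparing /etc')
```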

I don't see any issue, and the line you provided works in the grok debugger. GREEDYDATA takes everything.

Maybe the tag is getting set by a different filter? Try adding tag_on_failure => ["ANYTHING"] to this grok statement, just to see whether it is still this grok that is failing.

https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html#plugins-filters-grok-tag_on_failure

If the event still gets _grokparsefailure instead of your custom tag, the failure is coming from something else. Otherwise, please post your config.

That's the thing: there is no problem with the pattern itself (it also works in http://grokdebug.herokuapp.com/ ), but it does not run properly when there is an escape character. I have no other filter, as shown in my configuration, so the problem must be coming from this grok:
input {
beats {
port => 5044
}
}
filter {
grok {
match => ["message", "(?m)\[%{HTTPDATE:msg_timestamp}\] \[%{LOGLEVEL:log_level}\] %{GREEDYDATA:log_message}"]
}
date {
match => ["msg_timestamp", "dd/MMM/YYYY:HH:mm:ss Z"]
target => "@timestamp"
}
}
output {
elasticsearch {
...
}
}

Is there anything that I should set? Anything I should be aware of?

match => {
    "message" => "(?m)\[%{HTTPDATE:msg_timestamp}\] \[%{LOGLEVEL:log_level}\] %{GREEDYDATA:log_message}"
}

What you are showing above does not look correct. Have you tried with the exact configuration Magnus provided in his example above?

Yes, that is what I'm trying. I was editing my answer when you replied; I forgot the escaping here again.

But that is not exactly what Magnus provided. Your syntax is wrong. Either copy the entire filter configuration from the example (not just the expression) or have a look in the documentation to see how a match clause should be constructed.

I have tried that syntax, but it still does not work. It might be a problem with the escape character, not with how the grok is built. Any suggestions?

input {
beats {
port => 5044
}
}
filter {
grok {
match => {"message" => "(?m)\[%{HTTPDATE:msg_timestamp}\] \[%{LOGLEVEL:log_level}\] %{GREEDYDATA:log_message}"}
}
date {
match => ["msg_timestamp", "dd/MMM/YYYY:HH:mm:ss Z"]
target => "@timestamp"
}
}
output {
elasticsearch {
...
}
}

Can you replace the elasticsearch output with stdout { codec => rubydebug } and show us the output and exactly what is not working?