Fields are not added

Hi,

I have a problem with adding fields in Logstash.

part of the source code:

grok {
    patterns_dir => "../patterns"
    match => [ "message", "%{TIMESTAMP_ISO8601:logtime}\ [%{NUMBER:ThreadNum}]\ %{LOGLEVEL:LogLevel}\ %{GREEDYDATA:module}\ -\ %{GREEDYDATA:syslog_message}" ]
}

date {
    match => ["logtime", "YYYY-MM-dd HH:mm:ss,SSS"]
}

mutate {
    add_field => [ "action", "" ]
    add_field => [ "port", "0" ]
    add_field => [ "type", "" ]
}

grok {
    match => [ "syslog_message" , "Action:\ [%{WORD:action}]" ]
    match => [ "syslog_message" , "Port:\ [%{NUMBER:port}]" ]
    match => [ "syslog_message" , "Type:\ [%{WORD:type}]" ]
}

When I check the data in ES, only Action is visible; the other two fields, port and type, are not.

Some example values for syslog_message:
this is a test
Action; [Test]
Action: [Get], Port: [3], Type: [test]

All of them have been tested in the online Grok Debugger and work fine.
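For reference, simplified regex equivalents of the three grok patterns (here `%{WORD}` is approximated as `\w+` and `%{NUMBER}` as `\d+`; the real grok definitions are broader) each find their field when tested individually, which is what the online debugger checks:

```python
import re

# Simplified stand-ins for the three grok patterns
# (illustrative only, not the real grok pattern definitions).
patterns = {
    "action": r"Action: \[(?P<action>\w+)\]",
    "port":   r"Port: \[(?P<port>\d+)\]",
    "type":   r"Type: \[(?P<type>\w+)\]",
}

message = "Action: [Get], Port: [3], Type: [test]"

# Each pattern, tried on its own, matches somewhere in the message,
# just like the online grok debugger reports.
for name, pattern in patterns.items():
    m = re.search(pattern, message)
    print(name, "->", m.group(name) if m else None)
```

So the patterns themselves are fine in isolation; the question is how grok applies them together.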

What am I doing wrong?

Thanks for the help.

br,Dani


@danivu can you explain in more detail what you are trying to do and which version of Logstash/Elasticsearch you are using?

I did a quick and simple test using the following config, with just the mutate to add the fields, and it works fine:

input {
        tcp { port => 5011 }
}

filter {
        mutate{
                add_field => [ "action", "" ]
                add_field => [ "port", "0" ]
                add_field => [ "type", "" ]
        }
}

output {
        stdout { codec => rubydebug }
}

Sending "test" via TCP results in:

{
    "@timestamp" => 2017-01-13T16:08:56.495Z,
          "port" => [
        [0] 45396,
        [1] "0"
    ],
      "@version" => "1",
          "host" => "127.0.0.1",
        "action" => "",
       "message" => "test",
          "type" => "",
          "tags" => []
}

As you can see, port is there (element 1 in the port array), and action and type have also been added.
Now, sending this to Elasticsearch will not work, because the type of the ES document is taken from the type field, which you set to an empty string, and ES does not allow an empty string as _type.
There is a similar issue with port: Logstash adds a port field by default (at least with the tcp/udp inputs), so adding your own port results in an array of values, as seen above.
I would recommend using other names to avoid conflicting with the default fields, e.g. syslog_type and syslog_port.

Looking at it again, you do not need to add the fields with mutate if you add them in the grok anyway.
The problem with your grok is that multiple match patterns in a single grok filter act as an OR: by default (break_on_match => true), grok stops at the first pattern that matches.
Since the Action pattern matches, the Port and Type patterns are never even tried. That is the reason why only the action field is parsed.

E.g:

grok {
	match => [ "message" , "Action: \[%{WORD:action}\], Port: \[%{NUMBER:syslog_port}\], Type: \[%{WORD:syslog_type}\]" ]
}

results in:

{
     "@timestamp" => 2017-01-13T16:31:13.276Z,
           "port" => 45770,
    "syslog_type" => "test",
       "@version" => "1",
           "host" => "127.0.0.1",
         "action" => "Get",
        "message" => "Action: [Get], Port: [3], Type: [test]",
    "syslog_port" => "3",
           "tags" => []
}
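The first-match-wins behavior can be sketched in Python. This mimics grok's break_on_match option, which defaults to true; the regexes are simplified stand-ins for the grok patterns, not the real definitions:

```python
import re

# Simplified stand-ins for the three grok match patterns.
PATTERNS = [
    r"Action: \[(?P<action>\w+)\]",
    r"Port: \[(?P<syslog_port>\d+)\]",
    r"Type: \[(?P<syslog_type>\w+)\]",
]

def grok_like(message, break_on_match=True):
    """Apply patterns in order; with break_on_match (grok's default),
    stop after the first pattern that matches."""
    fields = {}
    for pattern in PATTERNS:
        m = re.search(pattern, message)
        if m:
            fields.update(m.groupdict())
            if break_on_match:
                break  # remaining patterns are never tried
    return fields

msg = "Action: [Get], Port: [3], Type: [test]"
print(grok_like(msg))                        # only 'action' is extracted
print(grok_like(msg, break_on_match=False))  # all three fields
```

Setting break_on_match => false on the grok filter makes it apply every pattern, which is the behavior you were expecting from the three separate match lines.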

Hi,

I'm using Logstash version 2.3.2. Thanks for the detailed answer.

This is log4net logging, and we want to get the events into ELK.

I also tried it without the mutate, and only the Action field was included in ES.
I defined 3 patterns as an OR because the message can also contain only some of the fields,
e.g.

grok {
    match => [ "message" , "Action: [%{WORD:action}], Port: [%{NUMBER:syslog_port}], Type: [%{WORD:syslog_type}]" ]
}

and it works fine.

When I use a separate match for each field:

grok {
    match => [ "message" , "Action: [%{WORD:action}]" ]
    match => [ "message" , "Port: [%{NUMBER:syslog_port}]" ]
    match => [ "message" , " Type: [%{WORD:syslog_type}]" ]
}
only the Action field is ever included.

This results in (rubydebug):

{
       "message" => "2017-01-16 08:40:23,596 [195] INFO  Service - Action: [ELK_DEP_DECLINED], Port: [50], Type: [FAILED]\r",
      "@version" => "1",
    "@timestamp" => "2017-01-16T08:40:23.596Z",
          "path" => "2017-01-16-08.log.11",
          "host" => "pppp",
          "type" => "log4net",
       "logtime" => "2017-01-16 08:40:23,596",
     "ThreadNum" => 195.0,
      "LogLevel" => "INFO",
"syslog_message" => "Action: [ELK_DEP_DECLINED], Port: [50], Type: [GBP]\r",
        "action" => "ELK_DEP_DECLINED"
}

Only action is read; the other two fields, Port and Type, are not.
The message field contains the original message.

When only the Action field is present, none of the fields are recognized:

{
       "message" => "2017-01-16 10:34:14,076 [71] INFO  Controller - Action: [ELK_DEP]\r",
      "@version" => "1",
    "@timestamp" => "2017-01-16T10:34:14.076Z",
          "path" => "C.2017-01-16-10.log.100",
          "host" => "010",
          "type" => "log4net",
       "logtime" => "2017-01-16 10:34:14,076",
     "ThreadNum" => 71.0,
      "LogLevel" => "INFO",
        "module" => "Controller",
"syslog_message" => "Action: [ELK_DEP]\r",
          "tags" => [
    [0] "_grokparsefailure"
]
}
So either the online debugger is not a reliable reference, or this is a bug in logstash-grok.

br,Dani

Hi

I made a workaround with if conditions:

if !( "Port" in [syslog_message] ) {
    grok {
        match => [ "syslog_message" , "Action: [%{WORD:action}]" ]
    }
}

if "Port" in [syslog_message] {
    grok {
        match => [ "syslog_message" , "Action: \[%{WORD:action}\], Port: \[%{NUMBER:syslog_port}\], Type: \[%{WORD:syslog_type}\]" ]
    }
}
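The conditional dispatch above can be sketched in Python (field names as in the config; the regexes are simplified stand-ins for the grok patterns):

```python
import re

# Full pattern for messages containing all three fields, and a
# fallback pattern for messages with only Action.
FULL = (r"Action: \[(?P<action>\w+)\], "
        r"Port: \[(?P<syslog_port>\d+)\], "
        r"Type: \[(?P<syslog_type>\w+)\]")
ACTION_ONLY = r"Action: \[(?P<action>\w+)\]"

def parse(syslog_message):
    # Mirror of the config: use the full pattern only when the
    # message contains "Port", otherwise match Action alone.
    pattern = FULL if "Port" in syslog_message else ACTION_ONLY
    m = re.search(pattern, syslog_message)
    return m.groupdict() if m else {}

print(parse("Action: [ELK_DEP]"))
print(parse("Action: [Get], Port: [3], Type: [test]"))
```

As noted, every new field combination needs its own branch, which is why this approach scales poorly compared to letting the filter try each field pattern independently.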

All fields are read now, both for messages with only action and for those with all three.

But this does not make sense, because there can be many combinations, and maintaining them all will take a lot of time.

br,Dani

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.