@danivu can you explain in more detail what you are trying to do and which version of Logstash/Elasticsearch you are using?
I did a quick and simple test using the following config, with just the mutate to add the fields, and it is working fine:
    input {
      tcp { port => 5011 }
    }
    filter {
      mutate {
        add_field => {
          "action" => ""
          "port"   => "0"
          "type"   => ""
        }
      }
    }
    output {
      stdout { codec => rubydebug }
    }
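One way to push a test line into that tcp input is with netcat (assuming an nc binary is available; host and port simply mirror the config above):

    echo "test" | nc localhost 5011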
Sending test via tcp results in:
    {
        "@timestamp" => 2017-01-13T16:08:56.495Z,
              "port" => [
            [0] 45396,
            [1] "0"
        ],
          "@version" => "1",
              "host" => "127.0.0.1",
            "action" => "",
           "message" => "test",
              "type" => "",
              "tags" => []
    }
As you can see, port is there (element [1] in the port array), and action and type have also been added.
Now, sending this to Elasticsearch will not work, as the _type of the indexed documents would be taken from the type field, which you set to an empty string, and ES does not allow an empty string as _type.
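To illustrate, here is a minimal sketch of why that fails with a default elasticsearch output (the hosts value is just a placeholder, not from your setup):

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        # document_type is not set, so the event's "type" field is
        # used as the document _type; an empty string is rejected by ES
      }
    }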
It's kind of the same issue with port: Logstash adds a port field by default (at least with the tcp/udp inputs), and adding your own port will result in an array of values under port, as seen above.
I would recommend using other names to avoid conflicts with the default fields, e.g. syslog_type and syslog_port.
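A minimal sketch of the same mutate with non-conflicting names (same placeholder values as above):

    mutate {
      add_field => {
        "action"      => ""
        "syslog_port" => "0"
        "syslog_type" => ""
      }
    }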
Looking at it again, you do not need to add the fields with mutate if you add them in the grok anyway.
The problem with your grok is that you defined it to match any one of the 3 patterns you provided; the way you wrote it, it's an OR.
If the line always starts with Action, patterns 2 and 3 will never match, as the message would need to start with Port or Type. That's the reason why it only parses the Action field.
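I'm guessing your filter looked roughly like this (a sketch of the OR form; the exact patterns are assumptions based on your field names):

    grok {
      # with grok's default break_on_match => true, matching stops at the
      # first pattern that succeeds, so only "action" ever gets extracted
      match => [
        "message", "^Action: \[%{WORD:action}\]",
        "message", "^Port: \[%{NUMBER:port}\]",
        "message", "^Type: \[%{WORD:type}\]"
      ]
    }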
Matching the whole line with a single pattern works instead, e.g.:

    grok {
      match => [ "message", "Action: \[%{WORD:action}\], Port: \[%{NUMBER:syslog_port}\], Type: \[%{WORD:syslog_type}\]" ]
    }
results in:
    {
         "@timestamp" => 2017-01-13T16:31:13.276Z,
               "port" => 45770,
        "syslog_type" => "test",
           "@version" => "1",
               "host" => "127.0.0.1",
             "action" => "Get",
            "message" => "Action: [Get], Port: [3], Type: [test]",
        "syslog_port" => "3",
               "tags" => []
    }