Custom fields visible in stdout, but not propagated to Elasticsearch

Hi ELK community. I am quite a newbie, so I apologize in advance if the question is too simple.
I have a configuration like this:

  1. Filebeat (version 6.6.0) and Logstash (6.6.0) running on host A. For filtering, I use the grok plugin. I have a few custom patterns in a mypatterns/ directory, generating a couple of new fields (see the sketch just after this list).

  2. A second Logstash (version 5.6.10) running on host B.

  3. An Elasticsearch instance (5.6.10) running on host C.
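
For context, the custom pattern files follow the usual grok format of one name plus regex per line. The real regexes don't matter for this question, so the sketch below of one of those pattern files is purely illustrative (only the names PATTERN1 and PATTERN2 are the real ones):

# illustrative patterns only -- the real regexes are different
PATTERN1 Starting job .*
PATTERN2 Finished job .*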

To send data from Logstash A to Logstash B I am using the lumberjack plugin. So, on host A I have something like this:

input {
    beats {
        port => "5044"
    }
}

filter {
    grok {
        match => { "message" => "%{PATTERN1:pattern1}"}
        add_field => { "event" => "starting" }
        break_on_match => false
        patterns_dir => ["/etc/logstash/patterns/starterlog"]
    }
    grok {
        match => { "message" => "%{PATTERN2:pattern2}"}
        add_field => { "event" => "finished" }
        break_on_match => false
        patterns_dir => ["/etc/logstash/patterns/starterlog"]
    }
}

output {
    lumberjack {
        codec => json
        hosts => "hostB"
        ssl_certificate => "/etc/logstash/certs/lumberjack.cert"
        port => 5555
    }
}

and the configuration in Logstash B is like this:

input {
    beats {
        codec => json
        port => 5555
        ssl => true
        ssl_certificate => "/etc/logstash/certs/lumberjack.cert"
        ssl_key => "/etc/logstash/certs/lumberjack.key"
    }
}

#output {
#    stdout { codec => rubydebug }
#}

output {
    elasticsearch {
        hosts => [ "hostC:9210" ]
        index => "myindex"
    }
}
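
To compare the two on exactly the same events, both outputs can also be enabled at once, since Logstash sends every event to every output listed in the block; this is simply the two snippets above merged, nothing new:

output {
    stdout { codec => rubydebug }
    elasticsearch {
        hosts => [ "hostC:9210" ]
        index => "myindex"
    }
}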

Here is the result: when I use the stdout output in Logstash B, everything seems to work fine. I see the JSON documents in stdout, with the custom fields created on host A, as expected.
However, when I switch to the elasticsearch output, the custom fields are missing. The index is created and contains all the standard fields (@timestamp, host, message, etc.), but none of mine (pattern1, pattern2, event).
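
A direct way to see what Elasticsearch actually stored, independent of any viewer, is to query the index and its mapping from a shell (assuming hostC:9210 is reachable from there):

# inspect one stored document exactly as Elasticsearch returns it
curl 'http://hostC:9210/myindex/_search?size=1&pretty'

# list the fields present in the index mapping
curl 'http://hostC:9210/myindex/_mapping?pretty'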

It is clearly not a network problem, since the index itself is created fine on host C from host B, so it has to be a configuration issue on the Logstash side.
Any clue, hint, or link to the exact piece of documentation explaining how to fix this is more than welcome.

Thanks a lot in advance.
Cheers,
Jose

Update: if I change the setup so that the first Logstash talks directly to Elasticsearch, I see exactly the same behavior. So it seems that the lumberjack plugin is not the issue... but I still need to find out where exactly the problem is :frowning:
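
Concretely, that test replaced the lumberjack output on host A with an elasticsearch output pointing straight at host C, roughly like this (same hosts and index as above):

output {
    elasticsearch {
        hosts => [ "hostC:9210" ]
        index => "myindex"
    }
}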
