If condition not working

I use Logstash + Filebeat + Elasticsearch.

In Filebeat the elasticsearch module is enabled so that the Elasticsearch logs get collected.

Now I want to define a simple condition in a Logstash pipeline. I want to do something when the Elasticsearch node has started. For that I use events with the "elasticsearch.server" event.dataset.

Setting up the condition like this does not work:

if [message] == "started"

So I ended up trying a regex, which also doesn't work:

if [message] =~ /^started$/

Looking at the document in Elasticsearch itself, I can see that it is fine and should match my condition:

"message": "started"

What's the issue???

The elasticsearch module normally sends to the ES ingest pipeline. Did you change Filebeat to send to Logstash?

It sounds like you are using Filebeat to ingest Elasticsearch logs and send them to Logstash. Is that correct?

If you use

output { stdout { codec => rubydebug } }

what does the event you want to test look like?
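If the full stream is too noisy, you can limit the debug output to the dataset in question; a minimal sketch, assuming the Filebeat module populates [event][dataset]:

output {
    if [event][dataset] == "elasticsearch.server" {
        stdout { codec => rubydebug }
    }
}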

Yes, Filebeat sends events to Logstash.

Yes, that's correct. I can't see the event at all, although I do see 'elasticsearch.gc' events.

output {
    if [@metadata][pipeline] {
        elasticsearch {
            hosts => ["https://elastic_server01:9200", "https://elastic_server02.bk.datev.de:9200"]
            manage_template => false
            index => "%{[@metadata][beat]}-%{[@metadata][version]}"
            pipeline => "%{[@metadata][pipeline]}"
            user => "logstash_writer"
            password => "${elasticsearch.password}"
        }
    } else {
        elasticsearch {
            hosts => ["https://elastic_server01.bk.datev.de:9200", "https://elastic_server02.bk.datev.de:9200"]
            manage_template => false
            index => "%{[@metadata][beat]}-%{[@metadata][version]}"
            user => "logstash_writer"
            password => "${elasticsearch.password}"
        }
    }
    stdout { codec => rubydebug }
}

EDIT: I found the event:

{
    "event" => {
        "module" => "elasticsearch",
        "dataset" => "elasticsearch.server",
        "timezone" => "+01:00"
    },
    "@timestamp" => 2020-03-05T09:31:09.815Z,
    "ecs" => {
        "version" => "1.4.0"
    },
    "fileset" => {
        "name" => "server"
    },
    "agent" => {
        "version" => "7.6.0",
        "id" => "972693ee-ca89-4755-9d13-a07ca3e7938e",
        "type" => "filebeat",
        "ephemeral_id" => "ef2f4a2e-4649-4a05-96e2-6aca66e2bd8b",
        "hostname" => "my_server02"
    },
    "host" => {
        "hostname" => "my_server02",
        "id" => "3c4c59e5f48c4a4a93f7287374e2cc3b",
        "containerized" => false,
        "architecture" => "x86_64",
        "name" => "my_server02",
        "os" => {
            "kernel" => "3.10.0-1062.12.1.el7.x86_64",
            "platform" => "rhel",
            "version" => "7.7 (Maipo)",
            "family" => "redhat",
            "codename" => "Maipo",
            "name" => "Red Hat Enterprise Linux Server"
        }
    },
    "message" => "[2020-03-05T10:31:08,439][INFO ][o.e.n.Node               ] [my_server02] started",
    "tags" => [
        [0] "beats_input_codec_plain_applied",
        [1] "node_started"
    ],
    "service" => {
        "type" => "elasticsearch"
    },
    "log" => {
        "offset" => 126301,
        "file" => {
            "path" => "/u01/services/elasticsearch/logs/P244_Lab_server.log"
        }
    },
    "@version" => "1",
    "input" => {
        "type" => "log"
    },
    "restart_begin" => 2020-03-05T09:30:49.359Z
}

Well, that is unexpected. Now I understand why it doesn't work. The message shown in Kibana is not the message in the Logstash pipeline.

The message field in the Logstash pipeline is the raw input message, still containing all the unextracted fields. In most cases it is the raw line collected from the log.

The message field in the filebeat-* index is the 'cleaned up' message; the timestamp and other information have already been extracted.

That is very misleading. How can I access the 'cleaned up' message in the Logstash pipeline?

I'm guessing that it is not possible out of the box, because the final 'message' field is generated by the Elasticsearch ingest pipeline and not by Logstash?
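You can get close to it in Logstash itself by stripping the log header with a filter before testing the condition. A minimal sketch, not the module's actual pipeline; the pattern and the log_message field name are assumptions based on the message shown above:

filter {
    if [event][dataset] == "elasticsearch.server" {
        # Split "[timestamp][LEVEL][logger] [node] rest" into its parts;
        # the remainder after the node name lands in [log_message]
        grok {
            match => { "message" => "^\[%{TIMESTAMP_ISO8601:log_ts}\]\[%{LOGLEVEL:log_level}\s*\]\[%{NOTSPACE:log_logger}\s*\] \[%{NOTSPACE:log_node}\] %{GREEDYDATA:log_message}$" }
            tag_on_failure => ["_es_server_grok_failure"]
        }
    }
}

With that in place, an exact comparison such as if [log_message] == "started" behaves like the message field you see in Kibana.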

Try

if "started" in [message]

The solution is:

if [message] =~ /\] started$/ {
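For reference, here it is in a complete filter block; a minimal sketch that adds the node_started tag visible in the rubydebug output above:

filter {
    if [event][dataset] == "elasticsearch.server" and [message] =~ /\] started$/ {
        mutate { add_tag => ["node_started"] }
    }
}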

A more precise solution would be (not tested):

if [message] =~ /\[%{[agent][hostname]}\] started$/ {
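One caveat with that variant: as far as I know, Logstash does not expand %{...} sprintf references inside conditional expressions, so the hostname would be matched literally. An alternative that compares against the actual field value is a ruby filter; a sketch, untested:

filter {
    ruby {
        code => '
            host = event.get("[agent][hostname]")
            msg  = event.get("message").to_s
            # exact suffix match: "... [<hostname>] started"
            if host && msg.end_with?("[#{host}] started")
                event.tag("node_started")
            end
        '
    }
}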

I wanted an exact match, not just any message containing the word 'started'.
