Logstash Unable to Output Data to Elasticsearch

I am using Elasticsearch 8.13.4, Logstash 8.13.4, and Filebeat 8.13.4. I followed the steps in the documentation here to get familiar with Logstash, but I am unable to get data into Elasticsearch. I see the following warning message:

/home/tom/logstash-8.13.4/vendor/bundle/jruby/3.1.0/gems/manticore-0.9.1-java/lib/manticore/client.rb:534: warning: already initialized constant Manticore::Client::ByteArrayEntity

My Logstash config is:

input {
  beats {
    port => "5044"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  geoip {
    source => "[source][address]"
    target => "geoip"
  }
}
output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "tom_test"
    user => "user"
    password => "password"
    ssl_enabled => true
    ssl_certificate_authorities => ["../elasticsearch-8.13.4/config/certs/http_ca.crt"]
  }
}
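While debugging, a stdout output can be added alongside elasticsearch so events are printed to the console as they pass through the pipeline (a sketch based on my config above; the credentials and CA path are from my local setup):

```plaintext
output {
  # Print each event to the console to confirm events reach the output stage
  stdout { codec => rubydebug }

  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "tom_test"
    user => "user"
    password => "password"
    ssl_enabled => true
    ssl_certificate_authorities => ["../elasticsearch-8.13.4/config/certs/http_ca.crt"]
  }
}
```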

Can anyone provide assistance or guidance on how to resolve this issue?

Additionally, I noticed that some examples in the documentation do not execute successfully:

  1. The first-pipeline.conf configuration file provided in the documentation is as follows:

input {
    beats {
        port => "5044"
    }
}
filter {
    grok {
        match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
}
output {
    stdout { codec => rubydebug }
}

The example Logstash output shown in the documentation is:

{
    "request" => "/presentations/logstash-monitorama-2013/images/kibana-search.png",
    "agent" => "\"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1700.77 Safari/537.36\"",
    "offset" => 325,
    "auth" => "-",
    "ident" => "-",
    "verb" => "GET",
    "prospector" => {
        "type" => "log"
    },
    "input" => {
        "type" => "log"
    },
    "source" => "/path/to/file/logstash-tutorial.log",
    "message" => "83.149.9.216 - - [04/Jan/2015:05:13:42 +0000] \"GET /presentations/logstash-monitorama-2013/images/kibana-search.png HTTP/1.1\" 200 203023 \"http://semicomplete.com/presentations/logstash-monitorama-2013/\" \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1700.77 Safari/537.36\"",
    "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
    "referrer" => "\"http://semicomplete.com/presentations/logstash-monitorama-2013/\"",
    "@timestamp" => "2017-11-09T02:51:12.416Z",
    "response" => "200",
    "bytes" => "203023",
    "clientip" => "83.149.9.216",
    "@version" => "1",
    "beat" => {
        "name" => "My-MacBook-Pro.local",
        "hostname" => "My-MacBook-Pro.local",
        "version" => "6.0.0"
    },
    "host" => "My-MacBook-Pro.local",
    "httpversion" => "1.1",
    "timestamp" => "04/Jan/2015:05:13:42 +0000"
}

However, the actual output data structure is as follows:

{
    "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
    "input" => {
        "type" => "filestream"
    },
    "@timestamp" => "2024-08-12T06:38:39.793Z",
    "timestamp" => "04/Jan/2015:05:30:37 +0000",
    "user_agent" => {
        "original" => "Mozilla/5.0 (X11; Linux x86_64; rv:24.0) Gecko/20140205 Firefox/24.0 Iceweasel/24.3.0"
    },
    "event" => {
        "original" => "86.1.76.62 - - [04/Jan/2015:05:30:37 +0000] \"GET /reset.css HTTP/1.1\" 200 1015 \"http://www.semicomplete.com/projects/xdotool/\" \"Mozilla/5.0 (X11; Linux x86_64; rv:24.0) Gecko/20140205 Firefox/24.0 Iceweasel/24.3.0\""
    },
    "ecs" => {
        "version" => "8.0.0"
    },
    "host" => {
        "name" => "bumblebee"
    },
    "log" => {
        "offset" => 24033,
        "file" => {
            "inode" => "581022",
            "device_id" => "64768",
            "path" => "/home/tom/logstash-tutorial-dataset"
        }
    },
    "source" => {
        "address" => "86.1.76.62"
    },
    "agent" => {
        "id" => "0b49ea59-b9e2-4782-a3e2-680c90a73d84",
        "version" => "8.13.4",
        "ephemeral_id" => "c66d1aae-5c33-422a-998a-86d4fbea79b9",
        "name" => "bumblebee",
        "type" => "filebeat"
    },
    "@version" => "1",
    "http" => {
        "request" => {
            "method" => "GET",
            "referrer" => "http://www.semicomplete.com/projects/xdotool/"
        },
        "version" => "1.1",
        "response" => {
            "body" => {
                "bytes" => 1015
            },
            "status_code" => 200
        }
    },
    "url" => {
        "original" => "/reset.css"
    },
    "message" => "86.1.76.62 - - [04/Jan/2015:05:30:37 +0000] \"GET /reset.css HTTP/1.1\" 200 1015 \"http://www.semicomplete.com/projects/xdotool/\" \"Mozilla/5.0 (X11; Linux x86_64; rv:24.0) Gecko/20140205 Firefox/24.0 Iceweasel/24.3.0\""
}

Consequently, the following geoip plugin configuration from the documentation does not work:

geoip {
    source => "clientip"
}
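Under the `pipeline.ecs_compatibility: v8` default, grok places the client IP in `[source][address]` rather than `clientip`, so the geoip filter appears to need this form instead (as in my config above; the `geoip` target name is my choice):

```plaintext
filter {
  geoip {
    # With ECS compatibility v8, the parsed client IP lives under [source][address]
    source => "[source][address]"
    target => "geoip"
  }
}
```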

  2. When outputting data to Elasticsearch, the configuration file in the documentation is as follows:

output {
    elasticsearch {
        hosts => [ "localhost:9200" ]
    }
}

Since Elasticsearch 8 and later start with security enabled by default, the above configuration tries to connect over HTTP while Elasticsearch only accepts HTTPS, so the connection fails.
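A sketch of an output block that works against a default secure 8.x install (the host, credentials, and CA path are from my local setup and would differ elsewhere):

```plaintext
output {
  elasticsearch {
    # HTTPS plus the cluster's CA certificate are required by default in 8.x
    hosts => ["https://localhost:9200"]
    ssl_enabled => true
    ssl_certificate_authorities => ["../elasticsearch-8.13.4/config/certs/http_ca.crt"]
    user => "user"
    password => "password"
  }
}
```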

Are the examples provided in the documentation guaranteed to work, or might there be issues with the documentation?


Hey Tom,
Thanks for sharing the first-pipeline.conf file and the output. The data looks correct, so there is no problem between Filebeat and Logstash.

Let's start debugging your Logstash config with the following command:

/usr/share/logstash/bin/logstash --config.test_and_exit -f your_logstash_conf.conf

Please note that Elasticsearch can create either a data stream or a regular index. Run the following API call and share the output so we can check:

GET _cat/indices/tom_test

Thank you for your reply. I ran:

bin/logstash -f first-pipeline.conf --config.test_and_exit

The log output is as follows:

Using bundled JDK: /home/tom/logstash-8.13.4/jdk
/home/tom/logstash-8.13.4/vendor/bundle/jruby/3.1.0/gems/concurrent-ruby-1.1.9/lib/concurrent-ruby/concurrent/executor/java_thread_pool_executor.rb:13: warning: method redefined; discarding old to_int
/home/tom/logstash-8.13.4/vendor/bundle/jruby/3.1.0/gems/concurrent-ruby-1.1.9/lib/concurrent-ruby/concurrent/executor/java_thread_pool_executor.rb:13: warning: method redefined; discarding old to_f
Sending Logstash logs to /home/tom/logstash-8.13.4/logs which is now configured via log4j2.properties
[2024-08-13T14:48:12,312][INFO ][logstash.runner          ] Log4j configuration path used is: /home/tom/logstash-8.13.4/config/log4j2.properties
[2024-08-13T14:48:12,327][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.13.4", "jruby.version"=>"jruby 9.4.5.0 (3.1.4) 2023-11-02 1abae2700f OpenJDK 64-Bit Server VM 17.0.11+9 on 17.0.11+9 +indy +jit [x86_64-linux]"}
[2024-08-13T14:48:12,330][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2024-08-13T14:48:12,341][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-08-13T14:48:12,350][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-08-13T14:48:12,762][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2024-08-13T14:48:13,594][INFO ][org.reflections.Reflections] Reflections took 175 ms to scan 1 urls, producing 132 keys and 468 values
/home/tom/logstash-8.13.4/vendor/bundle/jruby/3.1.0/gems/amazing_print-1.6.0/lib/amazing_print/formatter.rb:37: warning: previous definition of cast was here
[2024-08-13T14:48:14,499][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
Configuration OK
[2024-08-13T14:48:14,501][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash

Logstash starts normally, but after Filebeat starts, Logstash shows the following warning:

/home/tom/logstash-8.13.4/vendor/bundle/jruby/3.1.0/gems/manticore-0.9.1-java/lib/manticore/client.rb:534: warning: already initialized constant Manticore::Client::ByteArrayEntity

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.