Logstash plugin azure_event_hubs: undefined method `getHostContext'

Logstash version: 8.15.1
OS: Amazon Linux 2
Steps to reproduce:

rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch

Added the Elastic repository in /etc/yum.repos.d/logstash.repo

[logstash-8.x]
name=Elastic repository for 8.x packages
baseurl=https://artifacts.elastic.co/packages/8.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md
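
and then installed Logstash from that repository (the standard package install; roughly):

sudo yum install logstash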

In /etc/logstash/logstash.yml, all settings remained the same except:

api.http.host: MY_IP

In /etc/logstash/jvm.options, all settings remained the same except the heap size:

-Xms3762m
-Xmx3762m

(The instance has 8 GB of RAM, so the heap is set to just under half of that.)

In /etc/logstash/conf.d/my-input.conf

input {
  azure_event_hubs {
    config_mode => "advanced"
    threads => 8
    decorate_events => true
    event_hubs => [
      { "EVENT_HUB_NAME" => {
        event_hub_connection => "Endpoint=sb://...."
        initial_position => "beginning"
        consumer_group => "CONSUMER_NAME"
      }}
    ]
  }
}
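
(For reference, per the plugin docs the basic-mode equivalent of the above would be roughly the sketch below, with the hub name carried in the connection string's EntityPath and the same placeholder values. I have not tested whether basic mode behaves any differently here.)

input {
  azure_event_hubs {
    config_mode => "basic"
    threads => 8
    decorate_events => true
    consumer_group => "CONSUMER_NAME"
    initial_position => "beginning"
    event_hub_connections => ["Endpoint=sb://....;EntityPath=EVENT_HUB_NAME"]
  }
}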

In /etc/logstash/conf.d/my-output.conf

output {
  rabbitmq {
    vhost => "VHOST_NAME"
    exchange => "EXCHANGE_NAME"
    exchange_type => "direct"
    host => "MY_RABBITMQ_HOST"
    durable => true
    key => "MY_KEY"
    user => "MY_USER"
    password => "MY_PASS"
    port => 5672
  }
}
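
(If it helps with reproducing: the pipeline does start, so the config itself parses. For reference, a syntax-only check can be run with something like the following, using the standard package paths.)

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/ --config.test_and_exit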

I have successfully installed azure_event_hubs:

/usr/share/logstash/bin/logstash-plugin install logstash-input-azure_event_hubs
Using bundled JDK: /usr/share/logstash/jdk
Validating logstash-input-azure_event_hubs
Resolving mixin dependencies
Installing logstash-input-azure_event_hubs
Installation successful

and then started the service:

service logstash start

Output:

Sep 12 21:50:43 MY_IP systemd: Started logstash.
Sep 12 21:50:43 MY_IP logstash: Using bundled JDK: /usr/share/logstash/jdk
Sep 12 21:50:46 MY_IP dhclient[2234]: XMT: Solicit on eth0, interval 109200ms.
Sep 12 21:51:03 MY_IP logstash: Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
Sep 12 21:51:03 MY_IP logstash: [2024-09-12T21:51:03,825][INFO ][logstash.runner          ] Log4j configuration path used is: /etc/logstash/log4j2.properties
Sep 12 21:51:03 MY_IP logstash: [2024-09-12T21:51:03,834][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.15.1", "jruby.version"=>"jruby 9.4.8.0 (3.1.4) 2024-07-02 4d41e55a67 OpenJDK 64-Bit Server VM 21.0.4+7-LTS on 21.0.4+7-LTS +indy +jit [x86_64-linux]"}
Sep 12 21:51:03 MY_IP logstash: [2024-09-12T21:51:03,838][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms3762m, -Xmx3762m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
Sep 12 21:51:03 MY_IP logstash: [2024-09-12T21:51:03,845][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
Sep 12 21:51:03 MY_IP logstash: [2024-09-12T21:51:03,846][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
Sep 12 21:51:04 MY_IP logstash: [2024-09-12T21:51:04,958][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
Sep 12 21:51:05 MY_IP logstash: [2024-09-12T21:51:05,757][INFO ][org.reflections.Reflections] Reflections took 207 ms to scan 1 urls, producing 138 keys and 481 values
Sep 12 21:51:06 MY_IP logstash: [2024-09-12T21:51:06,035][INFO ][logstash.codecs.json     ] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
Sep 12 21:51:06 MY_IP logstash: [2024-09-12T21:51:06,097][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
Sep 12 21:51:06 MY_IP logstash: [2024-09-12T21:51:06,262][INFO ][logstash.outputs.rabbitmq][main] Connected to RabbitMQ {:url=>"amqp://RABBITMQ"}
Sep 12 21:51:06 MY_IP logstash: [2024-09-12T21:51:06,326][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/etc/logstash/conf.d/my-input.conf", "/etc/logstash/conf.d/my-output.conf"], :thread=>"#<Thread:0x539a6fb4 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
Sep 12 21:51:06 MY_IP logstash: [2024-09-12T21:51:06,932][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.6}
Sep 12 21:51:06 MY_IP logstash: [2024-09-12T21:51:06,945][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
Sep 12 21:51:06 MY_IP logstash: [2024-09-12T21:51:06,959][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
Sep 12 21:51:06 MY_IP logstash: [2024-09-12T21:51:06,962][INFO ][logstash.inputs.azureeventhubs][main][f1e9b248855f4a1c1785796fb0bdc9f6e83e4ff99f7463bf822f0bac4c8c3f59] Event Hub EVENT_HUB_NAME is initializing...
Sep 12 21:51:06 MY_IP logstash: [2024-09-12T21:51:06,962][WARN ][logstash.inputs.azureeventhubs][main][f1e9b248855f4a1c1785796fb0bdc9f6e83e4ff99f7463bf822f0bac4c8c3f59] You have NOT specified a `storage_connection_string` for EVENT_HUB_NAME. This configuration is only supported for a single Logstash instance.
Sep 12 21:51:06 MY_IP logstash: [2024-09-12T21:51:06,969][INFO ][com.microsoft.azure.eventprocessorhost.EventProcessorHost][main][f1e9b248855f4a1c1785796fb0bdc9f6e83e4ff99f7463bf822f0bac4c8c3f59] host logstash-51407a62-30ca-4a61-ab83-fdf0a879e27d: New EventProcessorHost created.
Sep 12 21:51:07 MY_IP logstash: [2024-09-12T21:51:06,970][ERROR][logstash.inputs.azureeventhubs][main][f1e9b248855f4a1c1785796fb0bdc9f6e83e4ff99f7463bf822f0bac4c8c3f59] Event Hub failed during initialization. {:event_hub_name=>"EVENT_HUB_NAME", :exception=>#<NoMethodError: undefined method `getHostContext' for #<Java::ComMicrosoftAzureEventprocessorhost::EventProcessorHost:0x11daf61c>
Sep 12 21:51:07 MY_IP logstash: Did you mean?  get_host_name>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-input-azure_event_hubs-1.4.7/lib/logstash/inputs/azure_event_hubs.rb:408:in `block in run'"]}
Sep 12 21:51:08 MY_IP logstash: [2024-09-12T21:51:08,087][WARN ][logstash.outputs.rabbitmq][main] RabbitMQ connection was closed {:url=>"amqp://RABBITMQ", :automatic_recovery=>true, :cause=>#<Java::ComRabbitmqClient::ShutdownSignalException: clean connection shutdown; protocol method: #method<connection.close>(reply-code=200, reply-text=OK, class-id=0, method-id=0)>}
Sep 12 21:51:08 MY_IP logstash: [2024-09-12T21:51:08,088][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
Sep 12 21:51:08 MY_IP logstash: [2024-09-12T21:51:08,465][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:main}
Sep 12 21:51:08 MY_IP logstash: [2024-09-12T21:51:08,473][INFO ][logstash.runner          ] Logstash shut down.
Sep 12 21:51:08 MY_IP systemd: logstash.service holdoff time over, scheduling restart.
Sep 12 21:51:08 MY_IP systemd: Stopped logstash.
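
For reference, the plugin version in the backtrace is logstash-input-azure_event_hubs 1.4.7; it can be double-checked with something like:

/usr/share/logstash/bin/logstash-plugin list --verbose logstash-input-azure_event_hubs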

Any idea how to solve this, please? Thank you!

Can anyone give an opinion on this, please?
@leandrojmp Sorry for tagging you directly; I saw your reply on my other topic. Could you please help with this topic as well? Thank you very much!