Logstash - Collecting SNMP data from network devices

Hello Gentlemen,

I installed the Logstash agent on an on-prem Windows server and configured it to collect SNMP data from network devices. However, it fails with this error:

[2022-12-20T09:05:57,307][FATAL][org.logstash.Logstash ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit

I tried to follow the solution in this forum, but I cannot locate the .lock file in the directory. I am also not able to start/stop Logstash, and it is not found under Services. Below is my config:

input {
  snmp {
    hosts => [{host => "udp:10.1.133.250/161" community => "public"}]
  }
}

output {
  elasticsearch {
    cloud_id => ["dXMtY2VudHJhbDEuZ2NwLmNsb3VkLmVzLmlvOjQ0MyQyZDNjMjJkYWEzZWQ0ZTczYmM3OGZjOWVlNzE3ZDNiNiQ4YzU5YmUyY2JiY2E0ZGU5YmY2ZDg4ZjFjNjI5MjI0Ng=="]
    cloud_auth => "elastic:R618RKOpso1cznDlp0qzcqDl"
    index => "snmp"
  }
}

Please advise on any troubleshooting steps I should take to resolve the issue.

Thanks in advance.

How did you start LS, as a batch file or as a service? You might have lost nssm.exe.
Have you checked a few more lines of the log?
Do you have enough memory?
As repeated a million times, have you tried starting with log.level: debug?

It's a batch file, I believe.

Memory is enough.

Yes, I changed log.level: info to log.level: debug.

Below is the full error log:

[2022-12-22T08:25:57,333][INFO ][logstash.runner ] Log4j configuration path used is: C:\Program Files\logstash-8.5.3\config\log4j2.properties
[2022-12-22T08:25:57,357][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.5.3", "jruby.version"=>"jruby 9.3.9.0 (2.6.8) 2022-10-24 537cd1f8bc OpenJDK 64-Bit Server VM 17.0.5+8 on 17.0.5+8 +indy +jit [x86_64-mswin32]"}
[2022-12-22T08:25:57,362][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2022-12-22T08:25:57,438][FATAL][org.logstash.Logstash ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:790) ~[jruby.jar:?]
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:753) ~[jruby.jar:?]
at C_3a_.Program_20_Files.logstash_minus_8_dot_5_dot_3.lib.bootstrap.environment.(C:\Program Files\logstash-8.5.3\lib\bootstrap\environment.rb:91) ~[?:?]

Is there anything with [ERROR] before or after the last [FATAL] line?

Log4j configuration path used is: C:\Program Files\logstash-8.5.3\config\log4j2.properties

Can you check how you start it? Where is the .conf file located? Have you moved any directories?


The config path looks correct, and the logstash.yml file is under the correct path.

I haven't done any special start; I just run the batch file after reconfiguring the .yml file. How do I restart it on Windows?

The new logs look like this:

[2022-12-22T09:50:17,512][INFO ][logstash.runner ] Log4j configuration path used is: C:\Program Files\logstash-8.5.3\config\log4j2.properties
[2022-12-22T09:50:17,531][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.5.3", "jruby.version"=>"jruby 9.3.9.0 (2.6.8) 2022-10-24 537cd1f8bc OpenJDK 64-Bit Server VM 17.0.5+8 on 17.0.5+8 +indy +jit [x86_64-mswin32]"}
[2022-12-22T09:50:17,537][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2022-12-22T09:50:17,548][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"fb_apache", :directory=>"C:/Program Files/logstash-8.5.3/modules/fb_apache/configuration"}
[2022-12-22T09:50:17,569][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x43e78ed6 @directory="C:/Program Files/logstash-8.5.3/modules/fb_apache/configuration", @module_name="fb_apache", @kibana_version_parts=["6", "0", "0"]>}
[2022-12-22T09:50:17,575][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"netflow", :directory=>"C:/Program Files/logstash-8.5.3/modules/netflow/configuration"}
[2022-12-22T09:50:17,577][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x59af828a @directory="C:/Program Files/logstash-8.5.3/modules/netflow/configuration", @module_name="netflow", @kibana_version_parts=["6", "0", "0"]>}
[2022-12-22T09:50:17,654][INFO ][logstash.settings ] Creating directory {:setting=>"path.queue", :path=>"C:/Program Files/logstash-8.5.3/data/queue"}
[2022-12-22T09:50:17,885][INFO ][logstash.settings ] Creating directory {:setting=>"path.dead_letter_queue", :path=>"C:/Program Files/logstash-8.5.3/data/dead_letter_queue"}
[2022-12-22T09:50:17,922][DEBUG][logstash.runner ] Setting global FieldReference escape style: none
[2022-12-22T09:50:18,044][DEBUG][logstash.runner ] -------- Logstash Settings (* means modified) ---------
[2022-12-22T09:50:18,081][DEBUG][logstash.runner ] allow_superuser: true
[2022-12-22T09:50:18,089][DEBUG][logstash.runner ] node.name: "WFRSTOHLE00"
[2022-12-22T09:50:18,092][DEBUG][logstash.runner ] path.data: "C:/Program Files/logstash-8.5.3/data"
[2022-12-22T09:50:18,097][DEBUG][logstash.runner ] modules.cli: #<Java::OrgLogstashUtil::ModulesSettingArray: >
[2022-12-22T09:50:18,099][DEBUG][logstash.runner ] modules:
[2022-12-22T09:50:18,102][DEBUG][logstash.runner ] modules_list:
[2022-12-22T09:50:18,106][DEBUG][logstash.runner ] modules_variable_list:
[2022-12-22T09:50:18,108][DEBUG][logstash.runner ] modules_setup: false
[2022-12-22T09:50:18,113][DEBUG][logstash.runner ] config.test_and_exit: false
[2022-12-22T09:50:18,152][DEBUG][logstash.runner ] config.reload.automatic: false
[2022-12-22T09:50:18,158][DEBUG][logstash.runner ] config.reload.interval: #<Java::OrgLogstashUtil::TimeValue:0x30012b46>
[2022-12-22T09:50:18,163][DEBUG][logstash.runner ] config.support_escapes: false
[2022-12-22T09:50:18,167][DEBUG][logstash.runner ] config.field_reference.escape_style: "none"
[2022-12-22T09:50:18,171][DEBUG][logstash.runner ] metric.collect: true
[2022-12-22T09:50:18,178][DEBUG][logstash.runner ] pipeline.id: "main"
[2022-12-22T09:50:18,222][DEBUG][logstash.runner ] pipeline.system: false
[2022-12-22T09:50:18,228][DEBUG][logstash.runner ] pipeline.workers: 2
[2022-12-22T09:50:18,232][DEBUG][logstash.runner ] pipeline.batch.size: 125
[2022-12-22T09:50:18,236][DEBUG][logstash.runner ] pipeline.batch.delay: 50
[2022-12-22T09:50:18,242][DEBUG][logstash.runner ] pipeline.unsafe_shutdown: false
[2022-12-22T09:50:18,245][DEBUG][logstash.runner ] pipeline.reloadable: true
[2022-12-22T09:50:18,249][DEBUG][logstash.runner ] pipeline.plugin_classloaders: false
[2022-12-22T09:50:18,254][DEBUG][logstash.runner ] pipeline.separate_logs: false
[2022-12-22T09:50:18,258][DEBUG][logstash.runner ] pipeline.ordered: "auto"
[2022-12-22T09:50:18,261][DEBUG][logstash.runner ] pipeline.ecs_compatibility: "v8"
[2022-12-22T09:50:18,270][DEBUG][logstash.runner ] path.plugins:
[2022-12-22T09:50:18,307][DEBUG][logstash.runner ] config.debug: false
[2022-12-22T09:50:18,312][DEBUG][logstash.runner ] *log.level: "debug" (default: "info")
[2022-12-22T09:50:18,315][DEBUG][logstash.runner ] version: false
[2022-12-22T09:50:18,331][DEBUG][logstash.runner ] help: false
[2022-12-22T09:50:18,337][DEBUG][logstash.runner ] enable-local-plugin-development: false
[2022-12-22T09:50:18,340][DEBUG][logstash.runner ] log.format: "plain"
[2022-12-22T09:50:18,342][DEBUG][logstash.runner ] api.enabled: true
[2022-12-22T09:50:18,345][DEBUG][logstash.runner ] api.http.host: "127.0.0.1"
[2022-12-22T09:50:18,347][DEBUG][logstash.runner ] api.http.port: 9600..9700
[2022-12-22T09:50:18,350][DEBUG][logstash.runner ] api.environment: "production"
[2022-12-22T09:50:18,352][DEBUG][logstash.runner ] api.auth.type: "none"
[2022-12-22T09:50:18,355][DEBUG][logstash.runner ] api.auth.basic.password_policy.mode: "WARN"
[2022-12-22T09:50:18,375][DEBUG][logstash.runner ] api.auth.basic.password_policy.length.minimum: 8
[2022-12-22T09:50:18,378][DEBUG][logstash.runner ] api.auth.basic.password_policy.include.upper: "REQUIRED"
[2022-12-22T09:50:18,382][DEBUG][logstash.runner ] api.auth.basic.password_policy.include.lower: "REQUIRED"
[2022-12-22T09:50:18,384][DEBUG][logstash.runner ] api.auth.basic.password_policy.include.digit: "REQUIRED"
[2022-12-22T09:50:18,387][DEBUG][logstash.runner ] api.auth.basic.password_policy.include.symbol: "OPTIONAL"
[2022-12-22T09:50:18,394][DEBUG][logstash.runner ] api.ssl.enabled: false
[2022-12-22T09:50:18,441][DEBUG][logstash.runner ] queue.type: "memory"
[2022-12-22T09:50:18,448][DEBUG][logstash.runner ] queue.drain: false
[2022-12-22T09:50:18,450][DEBUG][logstash.runner ] queue.page_capacity: 67108864
[2022-12-22T09:50:18,453][DEBUG][logstash.runner ] queue.max_bytes: 1073741824
[2022-12-22T09:50:18,465][DEBUG][logstash.runner ] queue.max_events: 0
[2022-12-22T09:50:18,468][DEBUG][logstash.runner ] queue.checkpoint.acks: 1024
[2022-12-22T09:50:18,472][DEBUG][logstash.runner ] queue.checkpoint.writes: 1024
[2022-12-22T09:50:18,475][DEBUG][logstash.runner ] queue.checkpoint.interval: 1000
[2022-12-22T09:50:18,479][DEBUG][logstash.runner ] queue.checkpoint.retry: true
[2022-12-22T09:50:18,496][DEBUG][logstash.runner ] dead_letter_queue.enable: false
[2022-12-22T09:50:18,500][DEBUG][logstash.runner ] dead_letter_queue.max_bytes: 1073741824
[2022-12-22T09:50:18,507][DEBUG][logstash.runner ] dead_letter_queue.flush_interval: 5000
[2022-12-22T09:50:18,510][DEBUG][logstash.runner ] dead_letter_queue.storage_policy: "drop_newer"
[2022-12-22T09:50:18,512][DEBUG][logstash.runner ] slowlog.threshold.warn: #<Java::OrgLogstashUtil::TimeValue:0x42bd5540>
[2022-12-22T09:50:18,523][DEBUG][logstash.runner ] slowlog.threshold.info: #<Java::OrgLogstashUtil::TimeValue:0xa04462d>
[2022-12-22T09:50:18,527][DEBUG][logstash.runner ] slowlog.threshold.debug: #<Java::OrgLogstashUtil::TimeValue:0x1235d778>
[2022-12-22T09:50:18,531][DEBUG][logstash.runner ] slowlog.threshold.trace: #<Java::OrgLogstashUtil::TimeValue:0x75ee4105>
[2022-12-22T09:50:18,535][DEBUG][logstash.runner ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[2022-12-22T09:50:18,539][DEBUG][logstash.runner ] keystore.file: "C:/Program Files/logstash-8.5.3/config/logstash.keystore"
[2022-12-22T09:50:18,541][DEBUG][logstash.runner ] path.queue: "C:/Program Files/logstash-8.5.3/data/queue"
[2022-12-22T09:50:18,544][DEBUG][logstash.runner ] path.dead_letter_queue: "C:/Program Files/logstash-8.5.3/data/dead_letter_queue"
[2022-12-22T09:50:18,549][DEBUG][logstash.runner ] path.settings: "C:/Program Files/logstash-8.5.3/config"
[2022-12-22T09:50:18,552][DEBUG][logstash.runner ] path.logs: "C:/Program Files/logstash-8.5.3/logs"
[2022-12-22T09:50:18,554][DEBUG][logstash.runner ] xpack.monitoring.enabled: false
[2022-12-22T09:50:18,560][DEBUG][logstash.runner ] xpack.monitoring.elasticsearch.hosts: ["http://localhost:9200"]
[2022-12-22T09:50:18,563][DEBUG][logstash.runner ] xpack.monitoring.collection.interval: #<Java::OrgLogstashUtil::TimeValue:0x609047a6>
[2022-12-22T09:50:18,568][DEBUG][logstash.runner ] xpack.monitoring.collection.timeout_interval: #<Java::OrgLogstashUtil::TimeValue:0x211b6256>
[2022-12-22T09:50:18,572][DEBUG][logstash.runner ] xpack.monitoring.elasticsearch.username: "logstash_system"
[2022-12-22T09:50:18,592][DEBUG][logstash.runner ] xpack.monitoring.elasticsearch.ssl.verification_mode: "certificate"
[2022-12-22T09:50:18,595][DEBUG][logstash.runner ] xpack.monitoring.elasticsearch.sniffing: false
[2022-12-22T09:50:18,599][DEBUG][logstash.runner ] xpack.monitoring.collection.pipeline.details.enabled: true
[2022-12-22T09:50:18,609][DEBUG][logstash.runner ] xpack.monitoring.collection.config.enabled: true
[2022-12-22T09:50:18,612][DEBUG][logstash.runner ] monitoring.enabled: false
[2022-12-22T09:50:18,614][DEBUG][logstash.runner ] monitoring.elasticsearch.hosts: ["http://localhost:9200"]
[2022-12-22T09:50:18,620][DEBUG][logstash.runner ] monitoring.collection.interval: #<Java::OrgLogstashUtil::TimeValue:0x44ca5127>
[2022-12-22T09:50:18,634][DEBUG][logstash.runner ] monitoring.collection.timeout_interval: #<Java::OrgLogstashUtil::TimeValue:0x2fb05142>
[2022-12-22T09:50:18,643][DEBUG][logstash.runner ] monitoring.elasticsearch.username: "logstash_system"
[2022-12-22T09:50:18,644][DEBUG][logstash.runner ] monitoring.elasticsearch.ssl.verification_mode: "certificate"
[2022-12-22T09:50:18,659][DEBUG][logstash.runner ] monitoring.elasticsearch.sniffing: false
[2022-12-22T09:50:18,662][DEBUG][logstash.runner ] monitoring.collection.pipeline.details.enabled: true
[2022-12-22T09:50:18,664][DEBUG][logstash.runner ] monitoring.collection.config.enabled: true
[2022-12-22T09:50:18,667][DEBUG][logstash.runner ] node.uuid: ""
[2022-12-22T09:50:18,674][DEBUG][logstash.runner ] xpack.management.enabled: false
[2022-12-22T09:50:18,683][DEBUG][logstash.runner ] xpack.management.logstash.poll_interval: #<Java::OrgLogstashUtil::TimeValue:0x34260b5e>
[2022-12-22T09:50:18,685][DEBUG][logstash.runner ] xpack.management.pipeline.id: ["main"]
[2022-12-22T09:50:18,688][DEBUG][logstash.runner ] xpack.management.elasticsearch.username: "logstash_system"
[2022-12-22T09:50:18,693][DEBUG][logstash.runner ] xpack.management.elasticsearch.hosts: ["https://localhost:9200"]
[2022-12-22T09:50:18,694][DEBUG][logstash.runner ] xpack.management.elasticsearch.ssl.verification_mode: "certificate"
[2022-12-22T09:50:18,696][DEBUG][logstash.runner ] xpack.management.elasticsearch.sniffing: false
[2022-12-22T09:50:18,698][DEBUG][logstash.runner ] --------------- Logstash Settings -------------------
[2022-12-22T09:50:18,821][DEBUG][logstash.config.source.multilocal] Reading pipeline configurations from YAML {:location=>"C:/Program Files/logstash-8.5.3/config/pipelines.yml"}
[2022-12-22T09:50:19,484][FATAL][org.logstash.Logstash ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:790) ~[jruby.jar:?]
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:753) ~[jruby.jar:?]
at C_3a_.Program_20_Files.logstash_minus_8_dot_5_dot_3.lib.bootstrap.environment.(C:\Program Files\logstash-8.5.3\lib\bootstrap\environment.rb:91) ~[?:?]

Check your pipelines.yml file.

Please enclose your configs/logs in a code block. It will make them easier to read, and hence help us help you.


Where is your logstash.conf, or whatever the file that contains your input and output is named? Most likely you have a logstash.conf which is fine, but LS cannot see it on the path. I don't see it inside the config directory, only logstash-sample.conf.

And again:
Can you check how you start it? Batch or NSSM? If it is from the command line or Task Scheduler, then you need to put C:\Program Files\logstash-8.5.3\config\something in quotes "" because of the spaces in the directory path.
To run this on the Windows platform you should use:
"C:\Program Files\logstash-8.5.3\bin\logstash.bat" -f "C:\Program Files\logstash-8.5.3\config\logstash-sample.conf"

Have you moved the LS directories or the logstash.conf?


The input/output config was in the logstash.yml file in that directory, not in a .conf file. I have now added a new logstash.conf file and put the input/output config in it.

I start it with the batch file, not NSSM.

I did not move any files. By default I can see only a logstash.yml file, not a logstash.conf file.

My pipelines.yml is all commented out.

That's the issue right there. The logstash.yml file should contain Logstash settings, not Logstash pipeline configurations. You should place your pipeline configurations/definitions, i.e. input, filter, output, in separate config files which are either passed on the command line with the -f flag, as mentioned by @Rios, or, if you are running Logstash as a Windows service, referenced from pipelines.yml so that it loads your pipeline config files.
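As a rough sketch of the service case, assuming a pipeline file at the path used elsewhere in this thread (the pipeline id here is just an example), a pipelines.yml entry could look like this:

- pipeline.id: snmp
  path.config: "C:/Program Files/logstash-8.5.3/config/logstash-snmp.conf"

When you start Logstash with -f on the command line instead, pipelines.yml is ignored, as the later logs in this thread also show.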


Just to add a few more things to @hendry.lim's reply:

  • Apart from the default values, you may change the log level in logstash.yml:
    log.level: info
    Once LS is running, then go on to fine-tuning params in logstash.yml.
  • Put your pipeline/processing configuration, the one you copied in the first post, into a file, for instance logstash-snmp.conf, and run it like this (see also the config-test sketch after this list):
    "C:\Program Files\logstash-8.5.3\bin\logstash.bat" -f "C:\Program Files\logstash-8.5.3\config\logstash-snmp.conf" --path.settings "C:\Program Files\logstash-8.5.3\config"
  • I wouldn't change anything in pipelines.yml; leave it at its default with no extra params, just to get LS running.

Once LS starts processing SNMP messages, then change specific params in the .yml step by step.
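If you are not sure whether the file even parses, you could first run a config test with the same paths; the --config.test_and_exit flag only validates the configuration and then exits:

"C:\Program Files\logstash-8.5.3\bin\logstash.bat" -f "C:\Program Files\logstash-8.5.3\config\logstash-snmp.conf" --path.settings "C:\Program Files\logstash-8.5.3\config" --config.test_and_exit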


I renamed the file to logstash-snmp.conf, changed the log.level back to info, and reverted the changes in pipelines.yml.

I ran the command and received the error below.

Any idea which part might be wrong? Thanks so much in advance, @Rios and @hendry.lim, for all the suggestions.

In PS:
Invoke-Expression "& 'C:\Program Files\logstash-8.5.3\bin\logstash.bat' -f 'C:\Program Files\logstash-8.5.3\config\logstash-snmp.conf' --path.settings 'C:\Program Files\logstash-8.5.3\config\'"

or just

cd "C:\Program Files\logstash-8.5.3\bin"
.\logstash.bat -f "C:\Program Files\logstash-8.5.3\config\logstash-snmp.conf" --path.settings "C:\Program Files\logstash-8.5.3\config"

You can also use this in CMD:

"C:\Program Files\logstash-8.5.3\bin\logstash.bat" -f "C:\Program Files\logstash-8.5.3\config\logstash-snmp.conf" --path.settings "C:\Program Files\logstash-8.5.3\config"


[2022-12-22T14:23:17,724][ERROR][logstash.inputs.snmp ] Unknown setting 'host' for snmp

input {
  snmp {
    hosts => [{host => "udp:10.1.133.250/161"
      community => "public"}
    ...

And after that you will get the next message:
[ERROR][logstash.javapipeline ][main] Pipeline error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: at least one get OID, one walk OID, or one table OID is required>,

Add your OIDs and read the documentation, for example:
get => ["1.3.6.1.2.1.1.1.0", "1.3.6.1.2.1.1.3.0", "1.3.6.1.2.1.1.5.0"]


I corrected the config and also changed the community value as per our internal discussion, then ran the command. The process hangs like this:

PS C:\Program Files\logstash-8.5.3\bin> .\logstash.bat -f "C:\Program Files\logstash-8.5.3\config\logstash-snmp.conf" --path.settings "C:\Program Files\logstash-8.5.3\config"
"Using bundled JDK: C:\Program Files\logstash-8.5.3\jdk\bin\java.exe"
Sending Logstash logs to C:/Program Files/logstash-8.5.3/logs which is now configured via log4j2.properties
[2022-12-22T15:55:47,098][INFO ][logstash.runner ] Log4j configuration path used is: C:\Program Files\logstash-8.5.3\config\log4j2.properties
[2022-12-22T15:55:47,112][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.5.3", "jruby.version"=>"jruby 9.3.9.0 (2.6.8) 2022-10-24 537cd1f8bc OpenJDK 64-Bit Server VM 17.0.5+8 on 17.0.5+8 +indy +jit [x86_64-mswin32]"}
[2022-12-22T15:55:47,118][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2022-12-22T15:55:47,303][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2022-12-22T15:55:52,901][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2022-12-22T15:55:53,373][INFO ][org.reflections.Reflections] Reflections took 584 ms to scan 1 urls, producing 125 keys and 438 values
[2022-12-22T15:55:54,997][INFO ][logstash.javapipeline ] Pipeline main is configured with pipeline.ecs_compatibility: v8 setting. All plugins in this pipeline will default to ecs_compatibility => v8 unless explicitly configured otherwise.
[2022-12-22T15:55:55,122][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["//127.0.0.1"]}
[2022-12-22T15:55:55,791][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@2d3c22daa3ed4e73bc78fc9ee717d3b6.us-central1.gcp.cloud.es.io:443/]}}
[2022-12-22T15:55:57,749][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@2d3c22daa3ed4e73bc78fc9ee717d3b6.us-central1.gcp.cloud.es.io:443/"}
[2022-12-22T15:55:57,918][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.5.2) {:es_version=>8}
[2022-12-22T15:55:57,926][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>8}
[2022-12-22T15:55:58,149][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. data_stream => auto resolved to false
[2022-12-22T15:55:58,145][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. data_stream => auto resolved to false
[2022-12-22T15:55:58,159][WARN ][logstash.outputs.elasticsearch][main] Elasticsearch Output configured with ecs_compatibility => v8, which resolved to an UNRELEASED preview of version 8.0.0 of the Elastic Common Schema. Once ECS v8 and an updated release of this plugin are publicly available, you will need to update this plugin to resolve this warning.
[2022-12-22T15:55:58,386][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2022-12-22T15:55:58,975][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["C:/Program Files/logstash-8.5.3/config/logstash-snmp.conf"], :thread=>"#<Thread:0x465f47a1 run>"}
[2022-12-22T15:56:00,981][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>1.99}
[2022-12-22T15:56:01,065][INFO ][logstash.inputs.snmp ][main] using plugin provided MIB path C:/Program Files/logstash-8.5.3/vendor/bundle/jruby/2.6.0/gems/logstash-input-snmp-1.3.1/lib/mibs/logstash
[2022-12-22T15:56:01,111][INFO ][logstash.inputs.snmp ][main] using plugin provided MIB path C:/Program Files/logstash-8.5.3/vendor/bundle/jruby/2.6.0/gems/logstash-input-snmp-1.3.1/lib/mibs/ietf
(eval):1426: warning: key "bfdSessDiag" is duplicated and overwritten on line 1423
(eval):1462: warning: key "bfdSessDiag" is duplicated and overwritten on line 1459
(eval):1658: warning: key "ipNetToMediaPhysAddress" is duplicated and overwritten on line 1663
(eval):2013: warning: key "mplsXCOperStatus" is duplicated and overwritten on line 2010
(eval):2049: warning: key "mplsXCOperStatus" is duplicated and overwritten on line 2046
(eval):1136: warning: key "ospfTrapControlGroup" is duplicated and overwritten on line 1133
(eval):2175: warning: key "pwOperStatus" is duplicated and overwritten on line 2172
(eval):2219: warning: key "pwOperStatus" is duplicated and overwritten on line 2216
(eval):1835: warning: key "rdbmsGroup" is duplicated and overwritten on line 1840
(eval):732: warning: key "rip2GlobalGroup" is duplicated and overwritten on line 729
(eval):732: warning: key "rip2IfStatGroup" is duplicated and overwritten on line 733
(eval):732: warning: key "rip2IfConfGroup" is duplicated and overwritten on line 737
(eval):732: warning: key "rip2PeerGroup" is duplicated and overwritten on line 741
(eval):3011: warning: key "slapmPolicyMonitorStatus" is duplicated and overwritten on line 3016
(eval):3063: warning: key "slapmPolicyMonitorStatus" is duplicated and overwritten on line 3068
(eval):3279: warning: key "slapmSubcomponentMonitorStatus" is duplicated and overwritten on line 3288
(eval):3337: warning: key "slapmSubcomponentMonitorStatus" is duplicated and overwritten on line 3346
(eval):3389: warning: key "slapmPRMonStatus" is duplicated and overwritten on line 3394
(eval):3441: warning: key "slapmPRMonStatus" is duplicated and overwritten on line 3446
(eval):3669: warning: key "slapmSubcomponentMonitorStatus" is duplicated and overwritten on line 3678
(eval):3727: warning: key "slapmSubcomponentMonitorStatus" is duplicated and overwritten on line 3736
[2022-12-22T15:56:07,261][INFO ][logstash.inputs.snmp ][main] ECS compatibility is enabled but target option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the target option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2022-12-22T15:56:07,296][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2022-12-22T15:56:07,842][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

It's not normal for the command to hang like this for more than 15 minutes, right?

I changed the LS conf file again, switching the community value to "private", and executed the command. I am getting this error:

PS C:\Program Files\logstash-8.5.3\bin> .\logstash.bat -f "C:\Program Files\logstash-8.5.3\config\logstash-snmp.conf" --path.settings "C:\Program Files\logstash-8.5.3\config"
"Using bundled JDK: C:\Program Files\logstash-8.5.3\jdk\bin\java.exe"
Sending Logstash logs to C:/Program Files/logstash-8.5.3/logs which is now configured via log4j2.properties
[2022-12-23T07:11:58,421][INFO ][logstash.runner ] Log4j configuration path used is: C:\Program Files\logstash-8.5.3\config\log4j2.properties
[2022-12-23T07:11:58,442][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.5.3", "jruby.version"=>"jruby 9.3.9.0 (2.6.8) 2022-10-24 537cd1f8bc OpenJDK 64-Bit Server VM 17.0.5+8 on 17.0.5+8 +indy +jit [x86_64-mswin32]"}
[2022-12-23T07:11:58,455][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2022-12-23T07:11:58,640][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2022-12-23T07:12:02,349][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2022-12-23T07:12:06,077][INFO ][org.reflections.Reflections] Reflections took 336 ms to scan 1 urls, producing 125 keys and 438 values
[2022-12-23T07:12:08,372][INFO ][logstash.javapipeline ] Pipeline main is configured with pipeline.ecs_compatibility: v8 setting. All plugins in this pipeline will default to ecs_compatibility => v8 unless explicitly configured otherwise.
[2022-12-23T07:12:08,466][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["//127.0.0.1"]}
[2022-12-23T07:12:09,158][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@2d3c22daa3ed4e73bc78fc9ee717d3b6.us-central1.gcp.cloud.es.io:443/]}}
[2022-12-23T07:12:10,555][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@2d3c22daa3ed4e73bc78fc9ee717d3b6.us-central1.gcp.cloud.es.io:443/"}
[2022-12-23T07:12:10,689][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.5.2) {:es_version=>8}
[2022-12-23T07:12:10,721][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>8}
[2022-12-23T07:12:10,966][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. data_stream => auto resolved to false
[2022-12-23T07:12:11,002][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. data_stream => auto resolved to false
[2022-12-23T07:12:11,011][WARN ][logstash.outputs.elasticsearch][main] Elasticsearch Output configured with ecs_compatibility => v8, which resolved to an UNRELEASED preview of version 8.0.0 of the Elastic Common Schema. Once ECS v8 and an updated release of this plugin are publicly available, you will need to update this plugin to resolve this warning.
[2022-12-23T07:12:11,132][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2022-12-23T07:12:11,527][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["C:/Program Files/logstash-8.5.3/config/logstash-snmp.conf"], :thread=>"#<Thread:0x5ccdf5e7 run>"}
[2022-12-23T07:12:12,915][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>1.38}
[2022-12-23T07:12:13,018][INFO ][logstash.inputs.snmp ][main] using plugin provided MIB path C:/Program Files/logstash-8.5.3/vendor/bundle/jruby/2.6.0/gems/logstash-input-snmp-1.3.1/lib/mibs/logstash
[2022-12-23T07:12:13,065][INFO ][logstash.inputs.snmp ][main] using plugin provided MIB path C:/Program Files/logstash-8.5.3/vendor/bundle/jruby/2.6.0/gems/logstash-input-snmp-1.3.1/lib/mibs/ietf
(eval):1426: warning: key "bfdSessDiag" is duplicated and overwritten on line 1423
(eval):1462: warning: key "bfdSessDiag" is duplicated and overwritten on line 1459
(eval):1658: warning: key "ipNetToMediaPhysAddress" is duplicated and overwritten on line 1663
(eval):2013: warning: key "mplsXCOperStatus" is duplicated and overwritten on line 2010
(eval):2049: warning: key "mplsXCOperStatus" is duplicated and overwritten on line 2046
(eval):1136: warning: key "ospfTrapControlGroup" is duplicated and overwritten on line 1133
(eval):2175: warning: key "pwOperStatus" is duplicated and overwritten on line 2172
(eval):2219: warning: key "pwOperStatus" is duplicated and overwritten on line 2216
(eval):1835: warning: key "rdbmsGroup" is duplicated and overwritten on line 1840
(eval):732: warning: key "rip2GlobalGroup" is duplicated and overwritten on line 729
(eval):732: warning: key "rip2IfStatGroup" is duplicated and overwritten on line 733
(eval):732: warning: key "rip2IfConfGroup" is duplicated and overwritten on line 737
(eval):732: warning: key "rip2PeerGroup" is duplicated and overwritten on line 741
(eval):3011: warning: key "slapmPolicyMonitorStatus" is duplicated and overwritten on line 3016
(eval):3063: warning: key "slapmPolicyMonitorStatus" is duplicated and overwritten on line 3068
(eval):3279: warning: key "slapmSubcomponentMonitorStatus" is duplicated and overwritten on line 3288
(eval):3337: warning: key "slapmSubcomponentMonitorStatus" is duplicated and overwritten on line 3346
(eval):3389: warning: key "slapmPRMonStatus" is duplicated and overwritten on line 3394
(eval):3441: warning: key "slapmPRMonStatus" is duplicated and overwritten on line 3446
(eval):3669: warning: key "slapmSubcomponentMonitorStatus" is duplicated and overwritten on line 3678
(eval):3727: warning: key "slapmSubcomponentMonitorStatus" is duplicated and overwritten on line 3736
[2022-12-23T07:12:18,211][INFO ][logstash.inputs.snmp ][main] ECS compatibility is enabled but target option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the target option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2022-12-23T07:12:18,227][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2022-12-23T07:12:18,373][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2022-12-23T07:12:21,330][ERROR][logstash.inputs.snmp ][main][0da1120beb9b72d1952f3f798864d3bb2f4d3505a9f4355ae2d543d8e0606dc9] error invoking get operation, ignoring {:host=>"10.1.133.250", :oids=>["1.3.6.1.2.1.1.1.0", "1.3.6.1.2.1.1.3.0", "1.3.6.1.2.1.1.5.0"], :exception=>#<LogStash::SnmpClientError: timeout sending snmp get request to target 10.1.133.250/161>, :backtrace=>["C:/Program Files/logstash-8.5.3/vendor/bundle/jruby/2.6.0/gems/logstash-input-snmp-1.3.1/lib/logstash/inputs/snmp/base_client.rb:39:in `get'", "C:/Program Files/logstash-8.5.3/vendor/bundle/jruby/2.6.0/gems/logstash-input-snmp-1.3.1/lib/logstash/inputs/snmp.rb:210:in `block in poll_clients'", "org/jruby/RubyArray.java:1865:in `each'", "C:/Program Files/logstash-8.5.3/vendor/bundle/jruby/2.6.0/gems/logstash-input-snmp-1.3.1/lib/logstash/inputs/snmp.rb:202:in `poll_clients'", "C:/Program Files/logstash-8.5.3/vendor/bundle/jruby/2.6.0/gems/logstash-input-snmp-1.3.1/lib/logstash/inputs/snmp.rb:197:in `block in run'", "C:/Program Files/logstash-8.5.3/vendor/bundle/jruby/2.6.0/gems/logstash-input-snmp-1.3.1/lib/logstash/inputs/snmp.rb:380:in `every'", "C:/Program Files/logstash-8.5.3/vendor/bundle/jruby/2.6.0/gems/logstash-input-snmp-1.3.1/lib/logstash/inputs/snmp.rb:196:in `run'", "C:/Program Files/logstash-8.5.3/logstash-core/lib/logstash/java_pipeline.rb:411:in `inputworker'", "C:/Program Files/logstash-8.5.3/logstash-core/lib/logstash/java_pipeline.rb:402:in `block in start_input'"]}

[The same ERROR entry, with the same exception and backtrace, repeats every 30 seconds from 2022-12-23T07:12:51,277 through 2022-12-23T07:18:51,284.]

error invoking get operation, ignoring {:host=>"10.1.133.250", :oids=>["1.3.6.1.2.1.1.1.0", "1.3.6.1.2.1.1.3.0", "1.3.6.1.2.1.1.5.0"], :exception=>#<LogStash::SnmpClientError: timeout sending snmp get request to target 10.1.133.250/161>, :backtrace=>["C:/Program Files/logstash-8.5.3/vendor/bundle/jruby/2.6.0/gems/logstash-input-

I see the error: timeout sending the SNMP get request to the device.

I checked internally and confirmed the connection is established between the device and the Elastic cluster server.

Any advice? Thank you.

I appreciate this isn't what you asked, but since you appear to have installed Logstash just for the purpose of collecting SNMP data, might I suggest two things:

1 - check out Filebeat; there is a simple-to-configure SNMP module
2 - amend your posts to remove/obfuscate the cloud_id and cloud_auth, and change your elastic password ASAP, as you have just posted them on a public forum


Thanks @Craig_Lawson. I agree; the original question was resolved by @Rios's and @hendry.lim's suggestions. Even though it led to another error, I should open a new topic for that.

Yes, I did change it.

Can you advise any link/document showing the Filebeat config to set up SNMP? That would be very helpful.

From the logs you posted previously, there was an error connecting to your device:
SnmpClientError: timeout sending snmp get request to target 10.1.133.250/161
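
One way to rule out the Logstash side is to try the same GET directly from the Windows host that runs Logstash, for example with the net-snmp command-line tools if you have them available (the version, community and OID below are just the ones used earlier in this thread):

snmpget -v2c -c public 10.1.133.250 1.3.6.1.2.1.1.1.0

If that also times out, the problem is reachability of UDP/161 from that host (a firewall, an ACL on the device, or the agent not accepting that community string) rather than the Logstash config.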

AFAIK, none of the Beats support SNMP (at least not the Elastic-supported Beats). We use Logstash to poll SNMP, and a custom method to receive SNMP traps, as the Logstash snmptrap input does not support SNMP v3.


Exactly. That is the reason why, when I saw the error message, I quickly checked, and we confirmed the devices are connected properly to our ELK cluster server where we installed Logstash. We installed an SNMP tracking app and can see that the devices are able to communicate with the Elastic server.

I'm not sure which part of the Logstash config is going wrong. I appreciate all the comments and inputs on this.

I will check whether the devices are using SNMP v3 to send the data.