Logstash Pipeline Fails to Start When Syncing MongoDB to Elasticsearch

I'm new to Logstash and Elasticsearch. I want to sync my MongoDB data into Elasticsearch using the logstash-input-mongodb plugin.
My mongodata.conf is:

input {
    uri => 'mongodb://127.0.0.1:27017/final?ssl=true'
    placeholder_db_dir => '/opt/logstash-mongodb/'
    placeholder_db_name => 'logstash_sqlite.db'
    collection => 'twitter_stream'
    batch_size => 5000
}
filter {
}
output {
    stdout {
        codec => rubydebug
    }
    elasticsearch {
        action => "index"
        index => "twitter_stream"
        hosts => ["localhost:9200"]
    }
}

When I run bin/logstash -f /etc/logstash/conf.d/mongodata.conf --path.settings /etc/logstash/, the following error is displayed:

Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2020-02-28T08:48:20,246][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-02-28T08:48:20,331][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.6.0"}
[2020-02-28T08:48:20,883][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \t\r\n], "#", "{" at line 2, column 13 (byte 21) after input {\n uri ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:47:in compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:55:in compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:17:in block in compile_sources'", "org/jruby/RubyArray.java:2580:in map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:14:in compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:161:in initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:27:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:326:in block in converge_state'"]}
[2020-02-28T08:48:21,114][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2020-02-28T08:48:25,969][INFO ][logstash.runner ] Logstash shut down.

Please help me; I don't have any idea what is wrong here.

You need to tell Logstash which input plugin you want to use by wrapping the settings in a plugin block:

input {
    mongodb {
        uri => 'mongodb://127.0.0.1:27017/final?ssl=true'
        ...
    }
}

The mongodb input is not included in Logstash by default, so you will need to install it first with bin/logstash-plugin install logstash-input-mongodb.
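Putting it together, a corrected mongodata.conf could look like this (a sketch that simply carries over the connection and output settings from your original file):

```
input {
    mongodb {
        uri => 'mongodb://127.0.0.1:27017/final?ssl=true'
        placeholder_db_dir => '/opt/logstash-mongodb/'
        placeholder_db_name => 'logstash_sqlite.db'
        collection => 'twitter_stream'
        batch_size => 5000
    }
}
output {
    stdout {
        codec => rubydebug
    }
    elasticsearch {
        action => "index"
        index => "twitter_stream"
        hosts => ["localhost:9200"]
    }
}
```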

I changed the config as you said, to tell Logstash which input to use. Now another error appears, shown below:

Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
    [2020-02-29T15:52:57,572][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
    [2020-02-29T15:52:57,658][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.6.0"}
    [2020-02-29T15:52:58,929][INFO ][org.reflections.Reflections] Reflections took 28 ms to scan 1 urls, producing 20 keys and 40 values 
    [2020-02-29T15:52:59,828][INFO ][logstash.inputs.mongodb  ] Using version 0.1.x input plugin 'mongodb'. This plugin isn't well supported by the community and likely has no maintainer.
    [2020-02-29T15:53:01,035][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
    [2020-02-29T15:53:01,185][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
    [2020-02-29T15:53:01,235][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
    [2020-02-29T15:53:01,239][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
    [2020-02-29T15:53:01,283][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
    [2020-02-29T15:53:01,325][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
    [2020-02-29T15:53:01,356][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
    [2020-02-29T15:53:01,361][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, "pipeline.sources"=>["/etc/logstash/conf.d/mongodata.conf"], :thread=>"#<Thread:0x59832e88 run>"}
    [2020-02-29T15:53:01,389][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
    D, [2020-02-29T15:53:02.434226 #14599] DEBUG -- : MONGODB | EVENT: #<TopologyOpening topology=Unknown[]>
    D, [2020-02-29T15:53:02.449904 #14599] DEBUG -- : MONGODB | Topology type 'unknown' initializing.
    D, [2020-02-29T15:53:02.500051 #14599] DEBUG -- : MONGODB | EVENT: #<TopologyChanged prev=Unknown[] new=Unknown[localhost:27017]>
    D, [2020-02-29T15:53:02.503619 #14599] DEBUG -- : MONGODB | Topology type 'Unknown' changed to type 'Unknown'.
    D, [2020-02-29T15:53:02.532637 #14599] DEBUG -- : MONGODB | EVENT: #<ServerOpening address=localhost:27017 topology=Unknown[localhost:27017]>
    D, [2020-02-29T15:53:02.536434 #14599] DEBUG -- : MONGODB | Server localhost:27017 initializing.
    D, [2020-02-29T15:53:02.596045 #14599] DEBUG -- : MONGODB | Waiting for up to 29.97 seconds for servers to be scanned: #<Cluster topology=Unknown[localhost:27017] servers=[#<Server address=localhost:27017 UNKNOWN>]>
    D, [2020-02-29T15:53:02.792976 #14599] DEBUG -- : MONGODB | EVENT: #<ServerDescriptionChanged address=localhost:27017 topology=Single[localhost:27017] prev=#<Mongo::Server:Description:0x2056 config={} average_round_trip_time=> new=#<Mongo::Server:Description:0x2054 config={"ismaster"=>true, "maxBsonObjectSize"=>16777216, "maxMessageSizeBytes"=>48000000, "maxWriteBatchSize"=>100000, "localTime"=>2020-02-29 07:53:02 UTC, "logicalSessionTimeoutMinutes"=>30, "connectionId"=>29, "minWireVersion"=>0, "maxWireVersion"=>8, "readOnly"=>false, "ok"=>1.0} average_round_trip_time=0.174114>>
    D, [2020-02-29T15:53:02.799393 #14599] DEBUG -- : MONGODB | Server description for localhost:27017 changed from 'unknown' to 'standalone'.
    D, [2020-02-29T15:53:02.833400 #14599] DEBUG -- : MONGODB | EVENT: #<TopologyChanged prev=Unknown[localhost:27017] new=Single[localhost:27017]>
    D, [2020-02-29T15:53:02.834710 #14599] DEBUG -- : MONGODB | Topology type 'Unknown' changed to type 'Single'.
    [2020-02-29T15:53:02,844][INFO ][logstash.inputs.mongodb  ][main] Registering MongoDB input
    D, [2020-02-29T15:53:03.339575 #14599] DEBUG -- : MONGODB | EVENT: #<ServerDescriptionChanged address=localhost:27017 topology=Single[localhost:27017] prev=#<Mongo::Server:Description:0x2054 config={"ismaster"=>true, "maxBsonObjectSize"=>16777216, "maxMessageSizeBytes"=>48000000, "maxWriteBatchSize"=>100000, "localTime"=>2020-02-29 07:53:02 UTC, "logicalSessionTimeoutMinutes"=>30, "connectionId"=>29, "minWireVersion"=>0, "maxWireVersion"=>8, "readOnly"=>false, "ok"=>1.0} average_round_trip_time=0.174114> new=#<Mongo::Server:Description:0x2064 config={"ismaster"=>true, "maxBsonObjectSize"=>16777216, "maxMessageSizeBytes"=>48000000, "maxWriteBatchSize"=>100000, "localTime"=>2020-02-29 07:53:03 UTC, "logicalSessionTimeoutMinutes"=>30, "connectionId"=>31, "minWireVersion"=>0, "maxWireVersion"=>8, "readOnly"=>false, "ok"=>1.0} average_round_trip_time=0.143011>>
    D, [2020-02-29T15:53:03.341407 #14599] DEBUG -- : MONGODB | Server description for localhost:27017 changed from 'standalone' to 'standalone'.
    D, [2020-02-29T15:53:03.343816 #14599] DEBUG -- : MONGODB | EVENT: #<TopologyChanged prev=Single[localhost:27017] new=Single[localhost:27017]>
    D, [2020-02-29T15:53:03.344563 #14599] DEBUG -- : MONGODB | Topology type 'Single' changed to type 'Single'.
    D, [2020-02-29T15:53:03.509494 #14599] DEBUG -- : MONGODB | [7] localhost:27017 #1 | final.listCollections | STARTED | {"listCollections"=>1, "cursor"=>{}, "nameOnly"=>true, "lsid"=>{"id"=><BSON::Binary:0x2070 type=uuid data=0x68e86ed90de44e32...>}}
    D, [2020-02-29T15:53:03.526470 #14599] DEBUG -- : MONGODB | [7] localhost:27017 | final.listCollections | FAILED | wrong number of arguments (given 2, expected 1) | 0.007495s
    [2020-02-29T15:53:04,289][ERROR][logstash.javapipeline    ][main] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<ArgumentError: wrong number of arguments (given 2, expected 1)>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bson-4.8.0-java/lib/bson/hash.rb:115:in `from_bson'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/dbref.rb:104:in `from_bson'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/protocol/serializers.rb:268:in `deserialize'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/protocol/serializers.rb:211:in `deserialize'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/protocol/message.rb:319:in `deserialize_field'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/protocol/message.rb:160:in `block in deserialize'", "org/jruby/RubyArray.java:1814:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/protocol/message.rb:156:in `deserialize'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/server/connection_base.rb:109:in `block in deliver'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/server/connectable.rb:83:in `ensure_connected'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/server/connection_base.rb:100:in `deliver'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/server/connection.rb:395:in `deliver'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/server/connection_base.rb:93:in `dispatch'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/executable.rb:56:in `block in dispatch_message'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/server/connection_pool.rb:557:in `with_connection'", 
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/server.rb:417:in `with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/executable.rb:55:in `dispatch_message'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/executable.rb:50:in `get_result'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/executable.rb:29:in `block in do_execute'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/response_handling.rb:73:in `unpin_maybe'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/executable.rb:26:in `do_execute'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/executable.rb:38:in `execute'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/op_msg_or_command.rb:27:in `execute'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/collections_info.rb:44:in `execute'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/database/view.rb:168:in `send_initial_query'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/database/view.rb:58:in `block in collection_names'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/retryable.rb:61:in `block in read_with_retry_cursor'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/retryable.rb:316:in `modern_read_with_retry'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/retryable.rb:117:in `read_with_retry'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/retryable.rb:60:in `read_with_retry_cursor'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/database/view.rb:57:in 
`collection_names'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/database.rb:120:in `collection_names'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-mongodb-0.4.1/lib/logstash/inputs/mongodb.rb:137:in `get_collection_names'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-mongodb-0.4.1/lib/logstash/inputs/mongodb.rb:156:in `update_watched_collections'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-mongodb-0.4.1/lib/logstash/inputs/mongodb.rb:182:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:200:in `block in register_plugins'", "org/jruby/RubyArray.java:1814:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:199:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:310:in `start_inputs'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:270:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:154:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:109:in `block in start'"], "pipeline.sources"=>["/etc/logstash/conf.d/mongodata.conf"], :thread=>"#<Thread:0x59832e88 run>"}
    [2020-02-29T15:53:04,301][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
    [2020-02-29T15:53:04,468][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
    [2020-02-29T15:53:09,519][INFO ][logstash.runner          ] Logstash shut down.

I don't have any idea about this either. Is there a solution for this error?

Try removing the action setting from the elasticsearch output and see if that helps.
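In other words, the suggestion is to drop the action line from the elasticsearch output block (a sketch; "index" is the default action for this output anyway):

```
output {
    elasticsearch {
        index => "twitter_stream"
        hosts => ["localhost:9200"]
    }
}
```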

I removed action and still get an error:

> Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
> [2020-02-29T17:56:22,872][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
> [2020-02-29T17:56:22,954][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.6.0"}
> [2020-02-29T17:56:24,178][INFO ][org.reflections.Reflections] Reflections took 28 ms to scan 1 urls, producing 20 keys and 40 values 
> [2020-02-29T17:56:25,067][INFO ][logstash.inputs.mongodb  ] Using version 0.1.x input plugin 'mongodb'. This plugin isn't well supported by the community and likely has no maintainer.
> [2020-02-29T17:56:26,275][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
> [2020-02-29T17:56:26,404][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
> [2020-02-29T17:56:26,448][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
> [2020-02-29T17:56:26,452][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
> [2020-02-29T17:56:26,495][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
> [2020-02-29T17:56:26,538][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
> [2020-02-29T17:56:26,564][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
> [2020-02-29T17:56:26,568][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, "pipeline.sources"=>["/etc/logstash/conf.d/mongodata.conf"], :thread=>"#<Thread:0x1cbb20f2 run>"}
> [2020-02-29T17:56:26,604][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
> D, [2020-02-29T17:56:27.661091 #10131] DEBUG -- : MONGODB | EVENT: #<TopologyOpening topology=Unknown[]>
> D, [2020-02-29T17:56:27.728825 #10131] DEBUG -- : MONGODB | Topology type 'unknown' initializing.
> D, [2020-02-29T17:56:27.776549 #10131] DEBUG -- : MONGODB | EVENT: #<TopologyChanged prev=Unknown[] new=Unknown[localhost:27017]>
> D, [2020-02-29T17:56:27.779607 #10131] DEBUG -- : MONGODB | Topology type 'Unknown' changed to type 'Unknown'.
> D, [2020-02-29T17:56:27.806262 #10131] DEBUG -- : MONGODB | EVENT: #<ServerOpening address=localhost:27017 topology=Unknown[localhost:27017]>
> D, [2020-02-29T17:56:27.808724 #10131] DEBUG -- : MONGODB | Server localhost:27017 initializing.
> D, [2020-02-29T17:56:27.866321 #10131] DEBUG -- : MONGODB | Waiting for up to 29.98 seconds for servers to be scanned: #<Cluster topology=Unknown[localhost:27017] servers=[#<Server address=localhost:27017 UNKNOWN>]>
> D, [2020-02-29T17:56:28.053383 #10131] DEBUG -- : MONGODB | EVENT: #<ServerDescriptionChanged address=localhost:27017 topology=Single[localhost:27017] prev=#<Mongo::Server:Description:0x2056 config={} average_round_trip_time=> new=#<Mongo::Server:Description:0x2054 config={"ismaster"=>true, "maxBsonObjectSize"=>16777216, "maxMessageSizeBytes"=>48000000, "maxWriteBatchSize"=>100000, "localTime"=>2020-02-29 09:56:27 UTC, "logicalSessionTimeoutMinutes"=>30, "connectionId"=>75, "minWireVersion"=>0, "maxWireVersion"=>8, "readOnly"=>false, "ok"=>1.0} average_round_trip_time=0.16498600000000002>>
> D, [2020-02-29T17:56:28.058878 #10131] DEBUG -- : MONGODB | Server description for localhost:27017 changed from 'unknown' to 'standalone'.
> D, [2020-02-29T17:56:28.093351 #10131] DEBUG -- : MONGODB | EVENT: #<TopologyChanged prev=Unknown[localhost:27017] new=Single[localhost:27017]>
> D, [2020-02-29T17:56:28.094469 #10131] DEBUG -- : MONGODB | Topology type 'Unknown' changed to type 'Single'.
> [2020-02-29T17:56:28,103][INFO ][logstash.inputs.mongodb  ][main] Registering MongoDB input
> D, [2020-02-29T17:56:28.601391 #10131] DEBUG -- : MONGODB | EVENT: #<ServerDescriptionChanged address=localhost:27017 topology=Single[localhost:27017] prev=#<Mongo::Server:Description:0x2054 config={"ismaster"=>true, "maxBsonObjectSize"=>16777216, "maxMessageSizeBytes"=>48000000, "maxWriteBatchSize"=>100000, "localTime"=>2020-02-29 09:56:27 UTC, "logicalSessionTimeoutMinutes"=>30, "connectionId"=>75, "minWireVersion"=>0, "maxWireVersion"=>8, "readOnly"=>false, "ok"=>1.0} average_round_trip_time=0.16498600000000002> new=#<Mongo::Server:Description:0x2064 config={"ismaster"=>true, "maxBsonObjectSize"=>16777216, "maxMessageSizeBytes"=>48000000, "maxWriteBatchSize"=>100000, "localTime"=>2020-02-29 09:56:28 UTC, "logicalSessionTimeoutMinutes"=>30, "connectionId"=>77, "minWireVersion"=>0, "maxWireVersion"=>8, "readOnly"=>false, "ok"=>1.0} average_round_trip_time=0.1359092>>
> D, [2020-02-29T17:56:28.603520 #10131] DEBUG -- : MONGODB | Server description for localhost:27017 changed from 'standalone' to 'standalone'.
> D, [2020-02-29T17:56:28.606495 #10131] DEBUG -- : MONGODB | EVENT: #<TopologyChanged prev=Single[localhost:27017] new=Single[localhost:27017]>
> D, [2020-02-29T17:56:28.607332 #10131] DEBUG -- : MONGODB | Topology type 'Single' changed to type 'Single'.
> D, [2020-02-29T17:56:28.761074 #10131] DEBUG -- : MONGODB | [7] localhost:27017 #1 | final.listCollections | STARTED | {"listCollections"=>1, "cursor"=>{}, "nameOnly"=>true, "lsid"=>{"id"=><BSON::Binary:0x2070 type=uuid data=0xef19401a623a42db...>}}
> D, [2020-02-29T17:56:28.780752 #10131] DEBUG -- : MONGODB | [7] localhost:27017 | final.listCollections | FAILED | wrong number of arguments (given 2, expected 1) | 0.008188s
> [2020-02-29T17:56:29,498][ERROR][logstash.javapipeline    ][main] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<ArgumentError: wrong number of arguments (given 2, expected 1)>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bson-4.8.0-java/lib/bson/hash.rb:115:in `from_bson'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/dbref.rb:104:in `from_bson'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/protocol/serializers.rb:268:in `deserialize'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/protocol/serializers.rb:211:in `deserialize'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/protocol/message.rb:319:in `deserialize_field'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/protocol/message.rb:160:in `block in deserialize'", "org/jruby/RubyArray.java:1814:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/protocol/message.rb:156:in `deserialize'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/server/connection_base.rb:109:in `block in deliver'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/server/connectable.rb:83:in `ensure_connected'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/server/connection_base.rb:100:in `deliver'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/server/connection.rb:395:in `deliver'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/server/connection_base.rb:93:in `dispatch'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/executable.rb:56:in `block in dispatch_message'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/server/connection_pool.rb:557:in `with_connection'", 
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/server.rb:417:in `with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/executable.rb:55:in `dispatch_message'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/executable.rb:50:in `get_result'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/executable.rb:29:in `block in do_execute'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/response_handling.rb:87:in `add_server_diagnostics'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/executable.rb:28:in `block in do_execute'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/response_handling.rb:73:in `unpin_maybe'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/executable.rb:26:in `do_execute'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/executable.rb:38:in `execute'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/shared/op_msg_or_command.rb:27:in `execute'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/operation/collections_info.rb:44:in `execute'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/database/view.rb:168:in `send_initial_query'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/database/view.rb:58:in `block in collection_names'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/retryable.rb:61:in `block in read_with_retry_cursor'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/retryable.rb:316:in `modern_read_with_retry'", 
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/retryable.rb:117:in `read_with_retry'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/retryable.rb:60:in `read_with_retry_cursor'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/database/view.rb:57:in `collection_names'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.3/lib/mongo/database.rb:120:in `collection_names'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-mongodb-0.4.1/lib/logstash/inputs/mongodb.rb:137:in `get_collection_names'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-mongodb-0.4.1/lib/logstash/inputs/mongodb.rb:156:in `update_watched_collections'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-mongodb-0.4.1/lib/logstash/inputs/mongodb.rb:182:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:200:in `block in register_plugins'", "org/jruby/RubyArray.java:1814:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:199:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:310:in `start_inputs'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:270:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:154:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:109:in `block in start'"], "pipeline.sources"=>["/etc/logstash/conf.d/mongodata.conf"], :thread=>"#<Thread:0x1cbb20f2 run>"}
> [2020-02-29T17:56:29,510][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
> [2020-02-29T17:56:29,684][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
> [2020-02-29T17:56:34,749][INFO ][logstash.runner          ] Logstash shut down.

Older versions of Java were more forgiving of calling a method with too many arguments. Modern versions are not. This is most likely a bug in the plugin. You may be able to work around it by installing an older version of Java.
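To illustrate the class of failure: the ArgumentError in the log is a Ruby arity mismatch, where a method defined with one parameter is called with two. This is a generic sketch, not the plugin's actual code; from_bson here is a hypothetical stand-in for the library method named in the backtrace.

```ruby
# Hypothetical stand-in: a library method defined to take one argument.
def from_bson(buffer)
  buffer
end

begin
  # A newer caller passes a second argument the old definition does not accept.
  from_bson("bytes", :relaxed)
rescue ArgumentError => e
  puts e.message  # "wrong number of arguments (given 2, expected 1)"
end
```

This kind of mismatch typically means the versions of two gems (here, the mongo driver and the bson gem bundled with the plugin) disagree about a method's signature, which is why it reads as a bug in the plugin rather than in your config.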