Logstash SNS output failing

Hi,

I am trying to send logs to an SNS topic through Filebeat and Logstash, but it is failing with the error message below.

When I send the output to stdout instead, it works without any issues.

Can someone guide me on how to resolve this issue?

Sending Logstash logs to /usr/local/Cellar/logstash/7.8.0/libexec/logs which is now configured via log4j2.properties
[2020-07-23T09:30:19,450][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-07-23T09:30:19,533][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.8.0", "jruby.version"=>"jruby 9.2.11.1 (2.5.7) 2020-03-25 b1f55b1a40 Java HotSpot(TM) 64-Bit Server VM 25.231-b11 on 1.8.0_231-b11 +indy +jit [darwin-x86_64]"}
[2020-07-23T09:30:20,846][INFO ][org.reflections.Reflections] Reflections took 29 ms to scan 1 urls, producing 21 keys and 41 values 
NEWER VERSION AVAILABLE: Please upgrade to AWS SDK For Ruby V3
[2020-07-23T09:30:22,282][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>16, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2000, "pipeline.sources"=>["/usr/local/etc/logstash/logstash.conf"], :thread=>"#<Thread:0x386b4ffa run>"}
[2020-07-23T09:30:22,878][INFO ][logstash.inputs.beats    ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2020-07-23T09:30:22,892][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-07-23T09:30:22,964][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-07-23T09:30:22,970][INFO ][org.logstash.beats.Server][main][717c4c0d8c895f360390a7ff084c7f11a1e966a8a8ce2e0727c8e70148b36e54] Starting server on port: 5044
[2020-07-23T09:30:23,167][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-07-23T09:30:32,843][ERROR][org.logstash.execution.WorkerLoop][main] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.
org.jruby.exceptions.NoMethodError: (NoMethodError) undefined method `bytesize' for {"name"=>"AMBIN02006"}:Hash
	at RUBY.trim_bytes(/usr/local/Cellar/logstash/7.8.0/libexec/logstash-core/lib/logstash/util/unicode_trimmer.rb:16) ~[?:?]
	at RUBY.send_sns_message(/usr/local/Cellar/logstash/7.8.0/libexec/vendor/bundle/jruby/2.5.0/gems/logstash-output-sns-4.0.7/lib/logstash/outputs/sns.rb:98) ~[?:?]
	at usr.local.Cellar.logstash.$7_dot_8_dot_0.libexec.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_sns_minus_4_dot_0_dot_7.lib.logstash.outputs.sns.register(/usr/local/Cellar/logstash/7.8.0/libexec/vendor/bundle/jruby/2.5.0/gems/logstash-output-sns-4.0.7/lib/logstash/outputs/sns.rb:66) ~[?:?]
	at RUBY.encode(/usr/local/Cellar/logstash/7.8.0/libexec/vendor/bundle/jruby/2.5.0/gems/logstash-codec-plain-3.0.6/lib/logstash/codecs/plain.rb:40) ~[?:?]
	at RUBY.encode(/usr/local/Cellar/logstash/7.8.0/libexec/logstash-core/lib/logstash/codecs/delegator.rb:48) ~[?:?]
	at org.logstash.instrument.metrics.AbstractSimpleMetricExt.time(org/logstash/instrument/metrics/AbstractSimpleMetricExt.java:65) ~[logstash-core.jar:?]
	at org.logstash.instrument.metrics.AbstractNamespacedMetricExt.time(org/logstash/instrument/metrics/AbstractNamespacedMetricExt.java:64) ~[logstash-core.jar:?]
	at RUBY.encode(/usr/local/Cellar/logstash/7.8.0/libexec/logstash-core/lib/logstash/codecs/delegator.rb:47) ~[?:?]
	at RUBY.receive(/usr/local/Cellar/logstash/7.8.0/libexec/vendor/bundle/jruby/2.5.0/gems/logstash-output-sns-4.0.7/lib/logstash/outputs/sns.rb:81) ~[?:?]
	at usr.local.Cellar.logstash.$7_dot_8_dot_0.libexec.logstash_minus_core.lib.logstash.outputs.base.multi_receive(/usr/local/Cellar/logstash/7.8.0/libexec/logstash-core/lib/logstash/outputs/base.rb:105) ~[?:?]
	at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1809) ~[jruby-complete-9.2.11.1.jar:?]
	at usr.local.Cellar.logstash.$7_dot_8_dot_0.libexec.logstash_minus_core.lib.logstash.outputs.base.multi_receive(/usr/local/Cellar/logstash/7.8.0/libexec/logstash-core/lib/logstash/outputs/base.rb:105) ~[?:?]
	at org.logstash.config.ir.compiler.OutputStrategyExt$AbstractOutputStrategyExt.multi_receive(org/logstash/config/ir/compiler/OutputStrategyExt.java:138) ~[logstash-core.jar:?]
	at org.logstash.config.ir.compiler.AbstractOutputDelegatorExt.multi_receive(org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:121) ~[logstash-core.jar:?]
	at usr.local.Cellar.logstash.$7_dot_8_dot_0.libexec.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/local/Cellar/logstash/7.8.0/libexec/logstash-core/lib/logstash/java_pipeline.rb:278) ~[?:?]
[2020-07-23T09:30:32,843][ERROR][org.logstash.execution.WorkerLoop][main] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.
org.jruby.exceptions.NoMethodError: (NoMethodError) undefined method `bytesize' for {"name"=>"AMBIN02006"}:Hash
	at RUBY.trim_bytes(/usr/local/Cellar/logstash/7.8.0/libexec/logstash-core/lib/logstash/util/unicode_trimmer.rb:16) ~[?:?]
	at RUBY.send_sns_message(/usr/local/Cellar/logstash/7.8.0/libexec/vendor/bundle/jruby/2.5.0/gems/logstash-output-sns-4.0.7/lib/logstash/outputs/sns.rb:98) ~[?:?]
	at usr.local.Cellar.logstash.$7_dot_8_dot_0.libexec.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_sns_minus_4_dot_0_dot_7.lib.logstash.outputs.sns.register(/usr/local/Cellar/logstash/7.8.0/libexec/vendor/bundle/jruby/2.5.0/gems/logstash-output-sns-4.0.7/lib/logstash/outputs/sns.rb:66) ~[?:?]
	at usr.local.Cellar.logstash.$7_dot_8_dot_0.libexec.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_codec_minus_plain_minus_3_dot_0_dot_6.lib.logstash.codecs.plain.encode(/usr/local/Cellar/logstash/7.8.0/libexec/vendor/bundle/jruby/2.5.0/gems/logstash-codec-plain-3.0.6/lib/logstash/codecs/plain.rb:40) ~[?:?]
	at RUBY.encode(/usr/local/Cellar/logstash/7.8.0/libexec/logstash-core/lib/logstash/codecs/delegator.rb:48) ~[?:?]
	at org.logstash.instrument.metrics.AbstractSimpleMetricExt.time(org/logstash/instrument/metrics/AbstractSimpleMetricExt.java:65) ~[logstash-core.jar:?]
	at org.logstash.instrument.metrics.AbstractNamespacedMetricExt.time(org/logstash/instrument/metrics/AbstractNamespacedMetricExt.java:64) ~[logstash-core.jar:?]
	at RUBY.encode(/usr/local/Cellar/logstash/7.8.0/libexec/logstash-core/lib/logstash/codecs/delegator.rb:47) ~[?:?]
	at RUBY.receive(/usr/local/Cellar/logstash/7.8.0/libexec/vendor/bundle/jruby/2.5.0/gems/logstash-output-sns-4.0.7/lib/logstash/outputs/sns.rb:81) ~[?:?]
	at usr.local.Cellar.logstash.$7_dot_8_dot_0.libexec.logstash_minus_core.lib.logstash.outputs.base.multi_receive(/usr/local/Cellar/logstash/7.8.0/libexec/logstash-core/lib/logstash/outputs/base.rb:105) ~[?:?]
	at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1809) ~[jruby-complete-9.2.11.1.jar:?]
	at usr.local.Cellar.logstash.$7_dot_8_dot_0.libexec.logstash_minus_core.lib.logstash.outputs.base.multi_receive(/usr/local/Cellar/logstash/7.8.0/libexec/logstash-core/lib/logstash/outputs/base.rb:105) ~[?:?]
	at org.logstash.config.ir.compiler.OutputStrategyExt$AbstractOutputStrategyExt.multi_receive(org/logstash/config/ir/compiler/OutputStrategyExt.java:138) ~[logstash-core.jar:?]
	at org.logstash.config.ir.compiler.AbstractOutputDelegatorExt.multi_receive(org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:121) ~[logstash-core.jar:?]
	at usr.local.Cellar.logstash.$7_dot_8_dot_0.libexec.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/local/Cellar/logstash/7.8.0/libexec/logstash-core/lib/logstash/java_pipeline.rb:278) ~[?:?]
[2020-07-23T09:30:32,950][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

Filebeat configuration:

filebeat.config.modules:
  enabled: true
  path: /etc/filebeat/modules.d/*.yml

filebeat.modules:
  - module: apache
    # Access logs
    access:
      enabled: true
      var.paths: ["/var/log/httpd/access_log","/var/log/httpd/ssl_access_log"]
    # Error logs
    error:
      enabled: true
      var.paths: ["/var/log/httpd/error_log*","/var/log/httpd/ssl_error_log"]

filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/httpd/*.log

output.logstash:
  hosts: ["logstash-domain:5044"]

Logstash configuration:

input {
    beats {
        port => 5044
    }
}

filter {
    grok {
        match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    date {
        match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
    geoip {
        source => "clientip"
    }
}

output {
	sns {
		arn => "arn:aws:sns:us-east-1:1234567890:logstash-topic"
		region => "us-east-1"
	}
}

Regards,
Vishnu

I have no experience at all with this plugin, but it seems to be processing a field whose content is {"name"=>"AMBIN02006"} while expecting a string, and therefore crashes. I guess this is the content of the host field, which it is trying to use as the sns_subject? In that case, flattening "host":{"name"=>"AMBIN02006"} to "host":"AMBIN02006" might make it work.
If that doesn't help, you should probably post the rubydebug output of an example event to give us a clue about what might be wrong.
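The crash can be reproduced in plain Ruby: `bytesize` is a String method, so a byte-trimming helper that assumes it receives a String fails when handed the nested host object. A minimal sketch (the `trim_bytes` below is an illustration of the failure mode, not the plugin's actual code):

```ruby
# Illustrative stand-in for the byte-trimming helper in the stack trace:
# it assumes its argument is a String and calls String#bytesize on it.
def trim_bytes(value, max_bytes)
  return value if value.bytesize <= max_bytes
  value.byteslice(0, max_bytes)
end

puts trim_bytes("AMBIN02006", 100)   # a String has #bytesize, so this works

begin
  trim_bytes({ "name" => "AMBIN02006" }, 100)  # the nested host object
rescue NoMethodError => e
  puts e.message   # => undefined method `bytesize' for ... Hash
end
```

This is why flattening the host value to a plain string avoids the crash.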

Thanks for your input. Yes, that's my host value, and I have no idea why that field is being processed. I am new to the Elastic Stack.

Now I have updated the filter section as below, and it seems to be working. Can you check whether it's correct?

filter {

    mutate {
        split => ["hostname", "."]
        add_field => { "shortHostname" => "%{hostname[0]}" }
    }

    mutate {
        rename => ["shortHostname", "hostname" ]
    }

    mutate {
        split => ["host", "."]
        add_field => { "shortHostname" => "%{host[0]}" }
    }

    mutate {
        rename => ["shortHostname", "host" ]
    }
}

I believe I found the issue. Adding the filter below solved it.

filter {
	mutate { add_field => { "sns_subject" => "%{[host][name]}" } }
}
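To see why adding the field helps: the SNS output takes the message subject from the event's sns_subject field and falls back to the host field when it is absent (%{host} is its documented default). A minimal Ruby sketch of that fallback, with event access simplified to a plain Hash (an illustration, not the plugin's code):

```ruby
# Hedged sketch of the subject selection described in this thread:
# use "sns_subject" if present, otherwise fall back to "host".
def pick_subject(event)
  event["sns_subject"] || event["host"]
end

with_fix    = { "sns_subject" => "AMBIN02006",
                "host" => { "name" => "AMBIN02006" } }
without_fix = { "host" => { "name" => "AMBIN02006" } }

pick_subject(with_fix)     # => "AMBIN02006" (a String; byte-trimming works)
pick_subject(without_fix)  # => {"name"=>"AMBIN02006"} (a Hash; #bytesize crashes)
```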

What I meant was mutate { rename => {"[host][name]" => "host"} }, which would have led you to the same sns_subject, as %{host} is the default value for it. It's even better that you defined the field explicitly now, because it's clearer :slight_smile:

Hi everyone, @Jenni I just found that the above solution works on my local Logstash (version 7.8.0), and I can see the message in SQS.
But when I try with the Logstash Docker container on AWS (same version, 7.8.0), I am getting the following error. Any idea?

	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_sns_minus_4_dot_0_dot_7.lib.logstash.outputs.sns.receive(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-sns-4.0.7/lib/logstash/outputs/sns.rb:81) ~[?:?]
	at usr.share.logstash.logstash_minus_core.lib.logstash.outputs.base.multi_receive(/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:105) ~[?:?]
	at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1809) ~[jruby-complete-9.2.11.1.jar:?]
	at usr.share.logstash.logstash_minus_core.lib.logstash.outputs.base.multi_receive(/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:105) ~[?:?]
	at org.logstash.config.ir.compiler.OutputStrategyExt$AbstractOutputStrategyExt.multi_receive(org/logstash/config/ir/compiler/OutputStrategyExt.java:138) ~[logstash-core.jar:?]
	at org.logstash.config.ir.compiler.AbstractOutputDelegatorExt.multi_receive(org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:121) ~[logstash-core.jar:?]
	at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:278) ~[?:?]
[2020-07-23T13:51:04,380][ERROR][org.logstash.execution.WorkerLoop][main] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.
org.jruby.exceptions.StandardError: (NetworkingError) execution expired
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.aws_minus_sdk_minus_core_minus_2_dot_11_dot_501.lib.seahorse.client.plugins.raise_response_errors.call(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.501/lib/seahorse/client/plugins/raise_response_errors.rb:15) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.aws_minus_sdk_minus_core_minus_2_dot_11_dot_501.lib.aws_minus_sdk_minus_core.plugins.jsonvalue_converter.call(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.501/lib/aws-sdk-core/plugins/jsonvalue_converter.rb:20) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.aws_minus_sdk_minus_core_minus_2_dot_11_dot_501.lib.aws_minus_sdk_minus_core.plugins.idempotency_token.call(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.501/lib/aws-sdk-core/plugins/idempotency_token.rb:18) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.aws_minus_sdk_minus_core_minus_2_dot_11_dot_501.lib.aws_minus_sdk_minus_core.plugins.param_converter.call(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.501/lib/aws-sdk-core/plugins/param_converter.rb:20) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.aws_minus_sdk_minus_core_minus_2_dot_11_dot_501.lib.seahorse.client.plugins.response_target.call(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.501/lib/seahorse/client/plugins/response_target.rb:21) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.aws_minus_sdk_minus_core_minus_2_dot_11_dot_501.lib.seahorse.client.request.send_request(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.501/lib/seahorse/client/request.rb:70) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.aws_minus_sdk_minus_core_minus_2_dot_11_dot_501.lib.seahorse.client.base.define_operation_methods(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.501/lib/seahorse/client/base.rb:207) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_sns_minus_4_dot_0_dot_7.lib.logstash.outputs.sns.send_sns_message(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-sns-4.0.7/lib/logstash/outputs/sns.rb:103) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_sns_minus_4_dot_0_dot_7.lib.logstash.outputs.sns.register(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-sns-4.0.7/lib/logstash/outputs/sns.rb:66) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_codec_minus_json_minus_3_dot_0_dot_5.lib.logstash.codecs.json.encode(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-codec-json-3.0.5/lib/logstash/codecs/json.rb:42) ~[?:?]
	at usr.share.logstash.logstash_minus_core.lib.logstash.codecs.delegator.encode(/usr/share/logstash/logstash-core/lib/logstash/codecs/delegator.rb:48) ~[?:?]
	at org.logstash.instrument.metrics.AbstractSimpleMetricExt.time(org/logstash/instrument/metrics/AbstractSimpleMetricExt.java:65) ~[logstash-core.jar:?]
	at org.logstash.instrument.metrics.AbstractNamespacedMetricExt.time(org/logstash/instrument/metrics/AbstractNamespacedMetricExt.java:64) ~[logstash-core.jar:?]
	at usr.share.logstash.logstash_minus_core.lib.logstash.codecs.delegator.encode(/usr/share/logstash/logstash-core/lib/logstash/codecs/delegator.rb:47) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_sns_minus_4_dot_0_dot_7.lib.logstash.outputs.sns.receive(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-sns-4.0.7/lib/logstash/outputs/sns.rb:81) ~[?:?]
	at usr.share.logstash.logstash_minus_core.lib.logstash.outputs.base.multi_receive(/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:105) ~[?:?]
	at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1809) ~[jruby-complete-9.2.11.1.jar:?]
	at usr.share.logstash.logstash_minus_core.lib.logstash.outputs.base.multi_receive(/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:105) ~[?:?]
	at org.logstash.config.ir.compiler.OutputStrategyExt$AbstractOutputStrategyExt.multi_receive(org/logstash/config/ir/compiler/OutputStrategyExt.java:138) ~[logstash-core.jar:?]
	at org.logstash.config.ir.compiler.AbstractOutputDelegatorExt.multi_receive(org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:121) ~[logstash-core.jar:?]
	at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:278) ~[?:?]
warning: thread "[main]>worker0" terminated with exception (report_on_exception is true):
java.lang.IllegalStateException: org.jruby.exceptions.StandardError: (NetworkingError) execution expired
	at org.logstash.execution.WorkerLoop.run(org/logstash/execution/WorkerLoop.java:105)
	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(jdk/internal/reflect/NativeMethodAccessorImpl.java:62)
	at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(jdk/internal/reflect/DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:566)
	at org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:426)
	at org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:293)
	at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:278)
	at org.jruby.RubyProc.call(org/jruby/RubyProc.java:318)
	at java.lang.Thread.run(java/lang/Thread.java:834)
Caused by: org.jruby.exceptions.StandardError: (NetworkingError) execution expired
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.aws_minus_sdk_minus_core_minus_2_dot_11_dot_501.lib.seahorse.client.plugins.raise_response_errors.call(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.501/lib/seahorse/client/plugins/raise_response_errors.rb:15)
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.aws_minus_sdk_minus_core_minus_2_dot_11_dot_501.lib.aws_minus_sdk_minus_core.plugins.jsonvalue_converter.call(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.501/lib/aws-sdk-core/plugins/jsonvalue_converter.rb:20)
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.aws_minus_sdk_minus_core_minus_2_dot_11_dot_501.lib.aws_minus_sdk_minus_core.plugins.idempotency_token.call(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.501/lib/aws-sdk-core/plugins/idempotency_token.rb:18)
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.aws_minus_sdk_minus_core_minus_2_dot_11_dot_501.lib.aws_minus_sdk_minus_core.plugins.param_converter.call(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.501/lib/aws-sdk-core/plugins/param_converter.rb:20)
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.aws_minus_sdk_minus_core_minus_2_dot_11_dot_501.lib.seahorse.client.plugins.response_target.call(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.501/lib/seahorse/client/plugins/response_target.rb:21)
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.aws_minus_sdk_minus_core_minus_2_dot_11_dot_501.lib.seahorse.client.request.send_request(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.501/lib/seahorse/client/request.rb:70)
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.aws_minus_sdk_minus_core_minus_2_dot_11_dot_501.lib.seahorse.client.base.define_operation_methods(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.501/lib/seahorse/client/base.rb:207)
	at RUBY.send_sns_message(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-sns-4.0.7/lib/logstash/outputs/sns.rb:103)

That looks like a problem with your connection, but unfortunately I don't know what to do about that.

Yes, it was the network. I managed to solve the issue by creating a VPC endpoint for SNS.

Thank you.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.