No events in DLQ


#1

Hi all,
I'm working with Logstash 5.5 and the new DLQ feature.

I've installed the plugin version 1.0.5 for LS 5.5:

/usr/share/logstash/bin/logstash-plugin list --verbose
...
logstash-input-dead_letter_queue (1.0.5)
logstash-input-elasticsearch (4.0.4)
...

This is the logstash.yml:

# ------------ Dead-Letter Queue Settings --------------
# Flag to turn on dead-letter queue.
#
dead_letter_queue.enable: true
#
# If using dead_letter_queue.enable: true, the directory path where the data files will be stored.
# Default is path.data/dead_letter_queue
#
path.dead_letter_queue: /data/logstashData/dead_letter_queue

In the dead_letter_queue directory I have ten empty files (1.log ... 10.log), and in the same Logstash pipeline I have two inputs:
Kafka
DLQ

input {
  kafka {
    bootstrap_servers => "...:9092,...:9092"
    topics_pattern => "logstash-55*"  
    consumer_threads => 3
    decorate_events => true          
    codec => "json"
  }
  
  dead_letter_queue {
    path => "/data/logstashData/dead_letter_queue" 
    commit_offsets => true 
    codec => "json"
  }
}

To handle dead events automatically (the JSON documents I ingest can have conflicting field mappings), I use a "log_version" field, incremented by a Ruby script in the pipeline, that determines the name of the index (ex: logstash-*-VERSION-YYYY.MM.DD).
This way, if an event cannot be indexed, it's put into the DLQ with version=1; the DLQ input then reads it, the version is incremented to 2, and so on.
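A simplified sketch of this approach (the field name, tag, and index pattern are illustrative; the real script may differ):

filter {
  # Bump the version only for events re-read from the DLQ;
  # assumes the dead_letter_queue input adds tags => ["from_dlq"]
  if "from_dlq" in [tags] {
    ruby {
      code => "event.set('log_version', (event.get('log_version') || 1).to_i + 1)"
    }
  }
}

output {
  elasticsearch {
    index => "logstash-%{log_version}-%{+YYYY.MM.dd}"
  }
}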

Now Elasticsearch gives me this exception: MapperParsingException: object mapping for [mapping_request.payload] tried to parse field [payload] as object, but found a concrete value.

I expect this event to be in the DLQ, but nothing has changed.

Logstash Pipeline API:

"inputs" : [ {
        "id" : "0fc849ea2666cb7428155be4777136ae64990ef7-2",
        "events" : {
          "out" : 0,
          "queue_push_duration_in_millis" : 0
        },
        "name" : "dead_letter_queue"
      }, {
        "id" : "0fc849ea2666cb7428155be4777136ae64990ef7-1",
        "events" : {
          "out" : 15209085,
          "queue_push_duration_in_millis" : 2000258
        },
        "name" : "kafka"
      } ],

How can I verify that the DLQ is configured correctly?

Thank you.


(Rob Bavey) #2

Hi,

Couple of quick questions for you:

  • Do you have an example of the data you are trying to ingest and the mapping (please remove any sensitive information)?
  • What version of the logstash-output-elasticsearch plugin are you using?
  • Do you see any related error messages in the Logstash log?
  • Just to clarify: is each of the .log files in the dead letter queue directory empty, or a single byte?

Thanks!

Rob


#3

Hi, thank you for your reply. It works today, so I'm now able to process some (but not all) dead events.

About your questions:

  1. I cannot mutate the fields because I don't know which field is problematic; I'm ingesting JSON logs that are very long and vary a lot. Is there a metadata field that indicates the specific problem field? Otherwise I cannot process them, and the easiest way is to create a separate index with the same pattern.
  2. logstash-output-elasticsearch (7.3.6)
  3. With the DLQ up and running, I sometimes get this error:

[2017-07-25T16:41:19,389][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Plugin: <LogStash::Inputs::DeadLetterQueue path=>"/data/logstashData/dead_letter_queue", sincedb_path=>"/data/logstashDLQData/dlq_offset", commit_offsets=>true, codec=><LogStash::Codecs::JSON id=>"json_73337b45-c626-4cae-aab9-b2f4d8775a1e", enable_metric=>true, charset=>"UTF-8">, id=>"52ff26acd4e4d46066179db0a3cdf329cbe3c13c-1", enable_metric=>true, pipeline_id=>"main">
Error:
[... the same error repeats roughly once a second, with an empty Error: line each time ...]
[2017-07-25T16:41:47,544][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Plugin: <LogStash::Inputs::DeadLetterQueue path=>"/data/logstashData/dead_letter_queue", sincedb_path=>"/data/logstashDLQData/dlq_offset", commit_offsets=>true, codec=><LogStash::Codecs::JSON id=>"json_73337b45-c626-4cae-aab9-b2f4d8775a1e", enable_metric=>true, charset=>"UTF-8">, id=>"52ff26acd4e4d46066179db0a3cdf329cbe3c13c-1", enable_metric=>true, pipeline_id=>"main">
Error: invalid checksum of record

Please note that some JSON documents are very long and some fields may contain Base64-encoded PDFs, etc. I really don't know the exact content.

Thank you.


(Rob Bavey) #4

Hi,

Glad to hear you are having some success today! Thank you for the information; let's see if we can help you with your other issues.

The metadata includes the exception payload from Elasticsearch, but it is not parsed into separate metadata fields, e.g.:
"reason" => "Could not index event to Elasticsearch. status: 400, action: [\"index\", {:_id=>nil, :_index=>\"logstash-2017.06.22\", :_type=>\"logs\", :_routing=>nil}, 2017-06-22T01:29:29.804Z Suyogs-MacBook-Pro-2.local {\"geoip\":{\"location\":\"home\"}}], response: {\"index\"=>{\"_index\"=>\"logstash-2017.06.22\", \"_type\"=>\"logs\", \"_id\"=>\"AVzNayPze1iR9yDdI2MD\", \"status\"=>400, \"error\"=>{\"type\"=>\"mapper_parsing_exception\", \"reason\"=>\"failed to parse\", \"caused_by\"=>{\"type\"=>\"illegal_argument_exception\", \"reason\"=>\"illegal latitude value [266.30859375] for geoip.location\"}}}}"
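If you want that string indexed alongside the event, you can copy it out of @metadata in a filter. A sketch, assuming the [@metadata][dead_letter_queue][reason] field set by the dead_letter_queue input (the target field name "failure_reason" is illustrative):

filter {
  mutate {
    # Copy the failure reason out of @metadata so it survives into the indexed event
    add_field => { "failure_reason" => "%{[@metadata][dead_letter_queue][reason]}" }
  }
}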

This sounds like it could be a bug in the dead letter queue plugin. If you can reproduce this, would you mind running Logstash with debug logging (run with --log.level=debug on the command line) and posting the same error, if it comes up, so we have more information to track down your issue?

Thanks,

Rob


#5

Hi,

Elasticsearch gave me this message 4 or 5 times today:

MapperParsingException: object mapping for [mapping_request.payload] tried to parse field [payload] as object, but found a concrete value

But Logstash didn't put the events in the DLQ. The last update to the DLQ files was yesterday. There's no way to get them indexed :confused:

If I enable debug logging in Logstash, it produces more data than I can manage :confused:

Is there any other way?

Thanks


(Rob Bavey) #6

Hi,

I've filed a bug ticket for Logstash (https://github.com/elastic/logstash/issues/7820), because it looks like there could be an issue at our end. If you could review the bug entry and add as much information as you can to the ticket: log files, config files, and any data that can help us reproduce the problem at our end would be hugely useful.

Thanks!

Rob


#7

Thank you.


(system) #8

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.