Logstash: max_bytes_length_exceeded exception

Hi all,

I have a short question, which might be a very stupid one.

I have a document whose "message" field is 34321 bytes long, so I get a response like this:

response=>{"index"=>{"_index"=>"kafka-test-car-2018.01.15", "_type"=>"logs", "_id"=>"AWD6HJIWT8X2jMCM5ROz", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Document contains at least one immense term in field=\"message\" (whose UTF8 encoding is longer than the max length 32766), all of which were skipped. Please correct the analyzer to not produce such terms. The prefix of the first immense term is: '[123, 34, 104, 101, 97, 100, 101, 114, 115, 34, 58, 123, 34, 97, 101, 95, 116, 114, 97, 99, 107, 105, 110, 103, 34, 58, 34, 73, 68, 58]...', original message: bytes can be at most 32766 in length; got 34321", "caused_by"=>{"type"=>"max_bytes_length_exceeded_exception", "reason"=>"max_bytes_length_exceeded_exception: bytes can be at most 32766 in length; got 34321"}}}}

The content of this very large field (named "message") does not need to be searchable, but I still want the document to be indexed. According to Kibana it is of type string.

My template looks like this:

{
  "template" : "test-*",
  "settings" : {
    "index.refresh_interval" : "15s",
    "number_of_shards" : 10,
    "number_of_replicas" : 0
  },
  "mappings" : {
    "_default_" : {
      "_all" : {"enabled" : true},
      "dynamic_templates" : [ {
        "string_fields" : {
          "match" : "*",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "not_analyzed", "omit_norms" : true
          }
        }
      } ],
      "properties" : {
        "@version": { "type": "string", "index": "not_analyzed" },
        "geoip" : {
          "type" : "object",
          "dynamic": true,
          "properties" : {
            "location" : { "type" : "geo_point" }
          }
        },
        "message": {
          "type" : "string",
          "index": "not_analyzed",
          "ignore_above": 32700
        }
      }
    }
  }
}

But I still get the above error. What am I missing here?

Thanks in advance!
Anna

Edit: I am using Logstash 5.2 and Elasticsearch 5.2.

Hi,
I think you need this setting on your Elasticsearch master node:

http.max_initial_line_length

Hope this helps.

My setting is "http.max_initial_line_length: 150kb".

@tension83
Thanks for your reply :slight_smile:

But I don't think this will help me, as this is just related to the HTTP API, right?
My problem occurs while indexing with Logstash.

OK, I found the correct mapping via trial and error :joy:
If you have the same problem, use this (10922 is 32766 / 3, which keeps multi-byte UTF-8 values under Lucene's byte limit, since ignore_above counts characters rather than bytes):

{
    "template": "test-*",
    "settings": {
      "index": {
        "number_of_shards": "10",
        "number_of_replicas": "0",
        "refresh_interval": "15s"
      }
    },
    "mappings": {
      "_default_": {
        "dynamic_templates": [
          {
            "string_fields": {
              "mapping": {
                "ignore_above": 10922,
                "index": "not_analyzed",
                "omit_norms": true,
                "type": "string"
              },
              "match_mapping_type": "string",
              "match": "*"
            }
          }
        ],
        "_all": {
          "enabled": true
        },
        "properties": {
          "geoip": {
            "dynamic": true,
            "type": "object",
            "properties": {
              "location": {
                "type": "geo_point"
              }
            }
          },
          "@version": {
            "index": "not_analyzed",
            "type": "string"
          },
          "message": {
            "ignore_above": 10922,
            "index": "not_analyzed",
            "type": "string"
          }
        }
      }
    },
    "aliases": {}
}
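
In case it helps anyone else: this is roughly how I point Logstash at the template, assuming the JSON above is saved as /etc/logstash/templates/test-template.json (the path, hosts and index name here are placeholders, adjust them to your setup):

    output {
      elasticsearch {
        hosts => ["localhost:9200"]                                # placeholder, point this at your cluster
        index => "test-%{+YYYY.MM.dd}"                             # must match the "test-*" template pattern
        manage_template => true
        template => "/etc/logstash/templates/test-template.json"   # hypothetical path to the JSON above
        template_name => "test"
        template_overwrite => true                                 # replace the template already stored in the cluster
      }
    }

template_overwrite => true makes sure the template already stored in the cluster is replaced. Note that the new mapping only applies to indices created afterwards; existing indices keep their old mapping.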

This topic may be closed :slight_smile: :sunflower:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.