Unable to ship data to Logstash 6.8 via Python

We have a Python script to collect data and ship it to Logstash. Please find the Python script and Logstash config below.

import socket
import json
import logging
from datetime import datetime
import sys

print("starting to send data to Elastic search")
# Create TCP/IP socket
print("Creating TCP/IP socket")
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
message = []
try:
    # Connect to port where server is running
    server_address = ('<host>', 50504)
    sock.connect(server_address)
    data = {'@test' : 'test1', '@message': 'python test message', '@tags': ['python', 'test']}
    sock.sendall(json.dumps(data).encode())
    print("Sent")
except socket.error as e:
    sys.stderr.write(str(e))
finally:
    sock.close()
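
Before involving Logstash at all, it can help to confirm what the script actually puts on the wire. A throwaway listener, purely illustrative, binding the same placeholder port used above and printing whatever arrives:

import socket

# Debug-only TCP listener: bind the same placeholder port as the sender above
# and print the raw bytes it receives.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind(('0.0.0.0', 50504))
listener.listen(1)
print("Waiting for a connection on port 50504")
conn, addr = listener.accept()
try:
    chunks = []
    while True:
        chunk = conn.recv(4096)
        if not chunk:
            break
        chunks.append(chunk)
    print("Received from %s: %r" % (addr, b"".join(chunks)))
finally:
    conn.close()
    listener.close()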

The Logstash conf looks like the following:

 input {
   tcp {
     port  => 50504
     type  => "xxx"
     id    => "yyy"
     codec => json
   }
 }

 filter {
   if [type] == "xxx" {
     json {
       source => "message"
     }
   }

   date {
     match => ["time", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSZ", "ISO8601", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss"]
   }
 }

 output {
   elasticsearch {
     hosts => ["<hosts>:4080"]
     index => "logstash-%{[type]}-s3-%{+YYYY.MM.dd}"
     id    => "zzz"
   }
 }

We are getting the following error in the Logstash log:

[2020-07-02T16:08:12,176][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-xxx-s3-2020.07.02", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x119d0bf>], :response=>{"index"=>{"_index"=>"logstash-convergence-status-s3-2020.07.02", "_type"=>"_doc", "_id"=>nil, "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"[_default_] mappings are not allowed on new indices and should no longer be used. See [https://www.elastic.co/guide/en/elasticsearch/reference/current/breaking-changes-7.0.html#default-mapping-not-allowed] for more information."}}}}

We wanted to check whether Python can be used to ship logs to Logstash 6.8 and above.

That error appears to come from Elasticsearch 7.0 or above, not 6.8.
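
If in doubt, you can confirm which version you are actually writing to by querying the cluster root. A quick sketch, using the placeholder host/port from this thread:

import json
import urllib.request

# Ask the Elasticsearch root endpoint for its version; localhost:4080 is the
# placeholder address used elsewhere in this thread.
with urllib.request.urlopen("http://localhost:4080/") as resp:
    info = json.load(resp)
print(info["version"]["number"])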

You have an index template that includes a _default_ mapping, which was deprecated in 6.0 and results in an error in 7.0 or above. Update the template and remove the _default_ mapping.
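
A minimal sketch of what that update could look like, assuming the template is the one named xxx on localhost:4080 as in the curl output below (include_type_name makes the typed mapping structure visible on 7.x). Treat it as illustrative and verify the stored template afterwards:

import json
import urllib.request

ES = "http://localhost:4080"   # placeholder host/port used in this thread
NAME = "xxx"                   # placeholder template name used in this thread

# Fetch the legacy template with its typed mappings, drop any _default_ type,
# and PUT the cleaned template back.
url = "%s/_template/%s?include_type_name=true" % (ES, NAME)
with urllib.request.urlopen(url) as resp:
    template = json.load(resp)[NAME]

template.get("mappings", {}).pop("_default_", None)

req = urllib.request.Request(
    url,
    data=json.dumps(template).encode(),
    headers={"Content-Type": "application/json"},
    method="PUT",
)
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode())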

The Logstash version is logstash-6.8.6-1.noarch and the Elasticsearch version is elasticsearch-7.7.1-1.x86_64.

OK, so update the template and remove the _default_ mapping.

The template was already updated as part of the ELK 7 upgrade. The following is the current Elasticsearch template:

curl -XGET localhost:4080/_template/xxx?pretty
{
  "xxx" : {
    "order" : 0,
    "index_patterns" : [
      "xxx-*"
    ],
    "settings" : {
      "index" : {
        "number_of_shards" : "2",
      }
    },
    "mappings" : {
      "properties" : {
        "hostname" : {
          "type" : "keyword"
        },
        "@timestamp" : {
          "format" : "strict_date_optional_time||epoch_millis",
          "type" : "date"
        },
        "port" : {
          "type" : "long"
        },
        "logdate" : {
          "format" : "strict_date_optional_time||epoch_millis",
          "type" : "date"
        },
        "@version" : {
          "type" : "keyword"
        },
        "type" : {
          "type" : "keyword"
        }
      }
    },
    "aliases" : { }
  }
}

Note:
Logstash was upgraded to logstash-7.7.1-1.x86_64 and restarted. It was still giving the same error.

What do you get from

curl -XGET localhost:4080/_template/xxx?include_type_name
Here is the output:

{
  "xxx": {
    "order": 0,
    "index_patterns": [
      "xxx-*"
    ],
    "settings": {
      "index": {
        "number_of_shards": "2",
      }
    },
    "mappings": {
      "_doc": {
        "properties": {
          "hostname": {
            "type": "keyword"
          },
          "@timestamp": {
            "format": "strict_date_optional_time||epoch_millis",
            "type": "date"
          },
          "port": {
            "type": "long"
          },
          "logdate": {
            "format": "strict_date_optional_time||epoch_millis",
            "type": "date"
          },
          "@version": {
            "type": "keyword"
          },
          "type": {
            "type": "keyword"
          }
        }
      }
    },
    "aliases": {}
  }
}

Try setting

manage_template => false

on the elasticsearch output. Logstash may be trying to install its built-in 6.x index template, which includes a _default_ mapping.

I tried adding the above to the Logstash conf:

output {
  elasticsearch {
    hosts => ["<host>:4080"]
    index => "logstash-%{[type]}-s3-%{+YYYY.MM.dd}"
    id    => "zzz"
    http_compression => true
    manage_template  => false
  }
}

It is still returning the same error in the logs:

[2020-07-02T18:30:07,054][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-xxx-s3-2020.07.02", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x62492c0f>], :response=>{"index"=>{"_index"=>"logstash-xxx-s3-2020.07.02", "_type"=>"_doc", "_id"=>nil, "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"[_default_] mappings are not allowed on new indices and should no longer be used. See [https://www.elastic.co/guide/en/elasticsearch/reference/current/breaking-changes-7.0.html#default-mapping-not-allowed] for more information."}}}}

OK, check the output of

curl -XGET localhost:4080/logstash-xxx-s3-2020.07.02/_mapping?include_type_name

Are there any other templates that would be getting applied? Check

curl  -XGET localhost:4080/_template

Do you get an error for

curl  -XGET localhost:4080/_index_template
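
One way to see which templates would apply is to pull every legacy template and flag the ones whose index_patterns match the index being written. A rough sketch, using the placeholder index name from the error above:

import fnmatch
import json
import urllib.request

ES = "http://localhost:4080"             # placeholder host/port from this thread
INDEX = "logstash-xxx-s3-2020.07.02"     # placeholder index name from the error

# List every legacy template and report the ones whose index_patterns match the
# target index, noting whether a _default_ mapping is present.
with urllib.request.urlopen("%s/_template?include_type_name=true" % ES) as resp:
    templates = json.load(resp)

for name, body in templates.items():
    patterns = body.get("index_patterns", [])
    if any(fnmatch.fnmatch(INDEX, pattern) for pattern in patterns):
        has_default = "_default_" in body.get("mappings", {})
        print("%s matches %s (contains _default_: %s)" % (name, patterns, has_default))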

The logstash-xxx-s3-2020.07.02 index isn't getting created on the Elasticsearch end.

There was one overriding template with the index pattern logstash-*:

{
  "logstash" : {
    "order" : 0,
    "version" : 60001,
    "index_patterns" : [
      "logstash-*"
    ],
    "settings" : {
      "index" : {
        "refresh_interval" : "5s"
      }
    },
    "mappings" : { },
    "aliases" : { }
  }
}

After that template was deleted, the index got created and data started flowing in.
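
For reference, removing a legacy template like that is a single DELETE request. A minimal sketch using the template name logstash shown above (worth confirming nothing else still relies on the template first):

import urllib.request

# Delete the overriding legacy template; "logstash" is the template name shown above.
req = urllib.request.Request("http://localhost:4080/_template/logstash", method="DELETE")
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode())   # expect {"acknowledged":true}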

Thank you for the help, @Badger.
