Elasticsearch index error

elasticsearch.yml

```yaml
network.host: localhost
http.port: 9200
transport.host: localhost
transport.tcp.port: 9300
discovery.seed_hosts:
  - localhost:9300
cluster.initial_master_nodes:
  - ${HOSTNAME}
action.auto_create_index: .monitoring*,.watches,.triggered_watches,.watcher-history*,.ml*
```

filebeat.yml

```yaml
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      #- /var/log/auth.log
      - /var/log/custom.log
    json.keys_under_root: true
    json.add_error_key: true

filebeat.registry.path: /var/lib/filebeat/registry
filebeat.registry.file_permissions: 0600
filebeat.shutdown_timeout: 5s
```

```yaml
output.logstash:
  hosts: ["localhost:5044"]
  ssl.certificate_authorities: ["/etc/certs/logstash.crt"]

setup.kibana:
  host: "localhost:5601"
```

```yaml
processors:
  - rename:
      fields:
        - from: "log.file.path"
          to: "path"
      ignore_missing: false
      fail_on_error: true
```

```yaml
logging.level: info
logging.to_files: true
logging.files:
  path: /var/log/filebeat
  name: filebeat
  keepfiles: 7
  permissions: 0644
```

```
[2021-01-10T15:02:28,890][WARN ][logstash.outputs.elasticsearch][main][62fa879f9cb3d6d767fe396c109c26137c7dc5006c2178486e9ce8686fcd58c5] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-7.10.1", :routing=>nil, :_type=>"_doc"}, #LogStash::Event:0x2f050055], :response=>{"index"=>{"_index"=>"filebeat-7.10.1-2020.12.24-000001", "_type"=>"_doc", "_id"=>"wjmI63YBHDLDNbTjU-Dc", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [txnDateTime] of type [keyword] in document with id 'wjmI63YBHDLDNbTjU-Dc'. Preview of field's value: '{date={month=1, year=2021, day=7}, time={hour=10, nano=0, minute=23, second=1}}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:478"}}}}}
```

Thanks

Please format your code/logs/config using the </> button, or markdown-style backticks. It makes things easier to read, which helps us help you 🙂

Specifically, `Preview of field's value: '{date={month=1, year=2021, day=7}, time={hour=10, nano=0, minute=23, second=1}}'` is not valid for that field type. You need to find what is generating this and fix it.
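For context: the preview shows the timestamp arriving as a nested object (separate `date` and `time` sub-objects) rather than as a single string, which a `keyword` field cannot hold. The real fix is in whatever application serializes the logs, but as an illustration of the target format, here is a hedged sketch (the function name is hypothetical, the input structure is taken from the error preview) of flattening that object into an ISO 8601 string:

```python
from datetime import datetime

def flatten_txn_datetime(value: dict) -> str:
    """Convert the {'date': {...}, 'time': {...}} object from the error
    preview into an ISO 8601 string Elasticsearch can map as `date`."""
    d, t = value["date"], value["time"]
    return datetime(
        d["year"], d["month"], d["day"],
        t["hour"], t["minute"], t["second"],
        microsecond=t.get("nano", 0) // 1000,  # nanoseconds -> microseconds
    ).isoformat()

raw = {
    "date": {"month": 1, "year": 2021, "day": 7},
    "time": {"hour": 10, "nano": 0, "minute": 23, "second": 1},
}
print(flatten_txn_datetime(raw))  # 2021-01-07T10:23:01
```

Once the field arrives as a string like this, it can be mapped as `date` instead of colliding with the existing mapping.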

Thanks for the reply.
The first time, the log format was wrong. Now I write the logs in JSON format and they all look better, but the index does not show them, because the earlier documents had the wrong format. How can I update the field mappings so that no logs are lost?

What does your Logstash output config look like?

Logstash output:

```
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}"
  }
  stdout {
    codec => rubydebug
  }
}
```
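If fixing the log producer is not immediately possible, the object form could also be flattened in the Logstash pipeline before this output. A hedged sketch using a `ruby` filter (field names taken from the error message; assumes the 7.x pipeline above):

```
filter {
  # Flatten the object form of [txnDateTime] into an ISO 8601 string
  # so Elasticsearch can index it as a date instead of failing.
  if [txnDateTime][date] {
    ruby {
      code => '
        d = event.get("[txnDateTime][date]")
        t = event.get("[txnDateTime][time]")
        event.set("txnDateTime",
          format("%04d-%02d-%02dT%02d:%02d:%02d",
                 d["year"], d["month"], d["day"],
                 t["hour"], t["minute"], t["second"]))
      '
    }
  }
}
```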

```
GET filebeat-7.10.1/_mapping
```

```json
"clientRequestDateTime" : {
  "type" : "keyword",
  "ignore_above" : 1024
}
```

But it will be like this:

```json
"clientRequestDateTime" : {
  "properties" : {
    "date" : {
      "properties" : {
        "day" : { "type" : "long" },
        "month" : { "type" : "long" },
        "year" : { "type" : "long" }
      }
    },
    "time" : {
      "properties" : {
        "hour" : { "type" : "long" },
        "minute" : { "type" : "long" },
        "nano" : { "type" : "long" },
        "second" : { "type" : "long" }
      }
    }
  }
}
```
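To keep the documents already indexed under the old mapping while accepting the corrected format, one common approach is to create a new index with the intended mapping and copy the data across with `_reindex`. A sketch (the new index name is hypothetical; documents still carrying the object form would need an ingest pipeline or script to flatten them first):

```
PUT filebeat-fixed-000001
{
  "mappings": {
    "properties": {
      "clientRequestDateTime": { "type": "date" }
    }
  }
}

POST _reindex
{
  "source": { "index": "filebeat-7.10.1-2020.12.24-000001" },
  "dest":   { "index": "filebeat-fixed-000001" }
}
```

An index template matching `filebeat-*` with the same `date` mapping would also stop future indices from dynamically mapping the field as an object.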

Thanks