Logstash schema-less input fields

Hi,
I'm new to Logstash. What I'm trying to do is store the request and response JSON of my web app in Elasticsearch using Logstash, without changing their JSON format.
The problem is that a field named start_time is a long in some APIs and a string in others, which causes the following error in Logstash and data loss in Elasticsearch.

[WARN ] 2021-02-24 11:30:30.071 [[main]>worker2] elasticsearch - Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x6c703a1d>], :response=>{"index"=>{"_index"=>"logstash", "_type"=>"_doc", "_id"=>"AuAN03cBCS4Qzugpwaws", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [data.payload.keepers.start_time] of type [long] in document with id 'AuAN03cBCS4Qzugpwaws'. Preview of field's value: '01/01/1399'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"For input string: \"01/01/1399\""}}}}}

And this is my config file:

input {
  file {
    path => "/home/ali/logstash/1.log"
  }
}

filter {
  json {
    source => "message"
    remove_field => ["message"]
  }
}

output {
  elasticsearch {
    hosts => ["*****"]
    user => "****"
    password => "****"
  }
}

How can I force Logstash to use schema-less records? I'd really appreciate detailed help.

The issue is in Elasticsearch, not Logstash. A field can only have one type in a given index. If dynamic mapping is enabled, then the first document that is indexed determines the field's type. You can override that using an index template.
