Logstash adds @version, which is not in the original document

Hi,
I have a Logstash config that moves data between two Elasticsearch clusters. When I run it, I get the error below. Any idea how to solve it?

input {
    elasticsearch {
        hosts => ["<source>"]
        user => "**"
        password => "**"
        index => "idx-uk-category-0005"
        size => 1000
        scroll => "10m"
        codec => "json"
        docinfo => true
    }
}
# filter section intentionally left empty
filter {
}
output {
    elasticsearch {
        hosts => ["<target>"]
        user => "**"
        password => "**"
        index => "idx-uk-category-0005"
    }
    stdout { codec => rubydebug { metadata => true } }
}

This is the error:

[main][07849e361f87e2bca773dc6ffd4bb555a7e9d671c249307fdab01f4cdcf6fb4e] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"idx-uk-category-0005", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x62c197e2>], :response=>{"index"=>{"_index"=>"idx-uk-category-0005", "_type"=>"_doc", "_id"=>"4RtK5nUBh2BTJtDt2rUv", "status"=>400, "error"=>{"type"=>"strict_dynamic_mapping_exception", "reason"=>"mapping set to strict, dynamic introduction of [@version] within [_doc] is not allowed"}}}}

Correct, Logstash always adds a @version field when it creates the event. You can remove it using

mutate { remove_field => [ "@version" ] }

in your filter section.

Thank you, this seems to solve the problem. I also found another field being added, so the final filter should be

mutate { remove_field => [ "@version", "@timestamp" ] }
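
For reference, the complete filter section then looks like this (assuming no other transformations are needed):

filter {
    mutate { remove_field => [ "@version", "@timestamp" ] }
}

Removing @timestamp only matters here because the target index has a strict mapping that doesn't define it; in a pipeline writing to a normal index you would usually keep it.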

But I found another problem: Logstash uses an auto-generated _id instead of the original _id from the source document. Any idea how to force it to use the original _id?

I figured out how to force Logstash to use the original document id: I added this to the output section:

document_id => "%{[@metadata][_id]}"
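
For anyone finding this later, a sketch of the full output section with that setting (same placeholder hosts and credentials as the config above):

output {
    elasticsearch {
        hosts => ["<target>"]
        user => "**"
        password => "**"
        index => "idx-uk-category-0005"
        document_id => "%{[@metadata][_id]}"
    }
}

Note that this relies on docinfo => true in the elasticsearch input, which is already set in the config above. One caveat: on newer Logstash versions with ECS compatibility enabled, the input plugin's docinfo_target defaults to [@metadata][input][elasticsearch] instead of [@metadata], so the reference may need to be "%{[@metadata][input][elasticsearch][_id]}" depending on your version and settings.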