Logstash - Sending "_id" in my document generates error

I have Logstash using the file input and elasticsearch output plugins, configured this way:

input {
  file {
    path => "D:/log_streaming/my_app/*.log"
    start_position => "beginning"
    sincedb_path => "NUL"
    mode => "read"
    file_completed_action => "delete"
  }
}
filter {
  json {
    source => "message"
  }
  mutate {
    remove_field => ["path", "host", "message", "@version"]
  }
}
output {
  file {
    path => "/log_streaming/my_app/log/logelastic-%{+yyyy-MM-dd_HH.mm.ss.SSS}.log"
  }
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "journaling_insert"
  }
}

The input file is formatted as follows:

{"CHAINCODE":"8971","REMOTEIP":"1.111.1.11","STOREATTRIBUTE5":"StoreDB Value","DATETIME":"2025-03-24T17:52:56.238","STOREATTRIBUTE4":"StoreDB Value","FLSECURITY":{"SID":"1111"},"FLCUSTOMER":{"FIRSTNAME":"Gandalf","LASTNAME":"the Grey"},"EVENTID":"16","STOREATTRIBUTE3":"Passed Value","STOREATTRIBUTE2":"StoreDB Value"}
{"CHAINCODE":"8971","REMOTEIP":"1.111.1.11","STOREATTRIBUTE5":"StoreDB Value","DATETIME":"2025-03-24T17:52:56.238","STOREATTRIBUTE4":"StoreDB Value","FLCUSTOMER":{"FIRSTNAME":"Gandalf","LASTNAME":"the Grey"},"FLTRANSACTIONATTRIBUTES":{"INVOICENUMBER":"1111"},"EVENTID":"17","DRAWERIDENT":"test","STOREATTRIBUTE3":"StoreDB Value","STOREATTRIBUTE2":"StoreDB Value"}

And all is good.
But if I try to include the "_id" I want for each document, in this format:

{"STOREATTRIBUTE2":"StoreDB Value","FLCUSTOMER":{"LASTNAME":"the Grey","FIRSTNAME":"Gandalf"},"EVENTID":"16","_id":"BCEE8FD9-1356-4636-BBBA-DA93C813BED7","STOREATTRIBUTE5":"StoreDB Value","@timestamp":"2025-03-24T17:08:49.344Z","REMOTEIP":"1.111.1.11","DATETIME":"2025-03-24T17:08:47.718","FLSECURITY":{"SID":"1111"},"CHAINCODE":"8971","STOREATTRIBUTE3":"Passed Value","STOREATTRIBUTE4":"StoreDB Value"}
{"DRAWERIDENT":"test","FLTRANSACTIONATTRIBUTES":{"INVOICENUMBER":"1111"},"STOREATTRIBUTE2":"StoreDB Value","EVENTID":"17","@timestamp":"2025-03-24T17:08:49.344Z","_id":"88426CCD-C1AC-4D8A-BE94-634EC01FB505","STOREATTRIBUTE5":"StoreDB Value","FLCUSTOMER":{"LASTNAME":"the Grey","FIRSTNAME":"Gandalf"},"REMOTEIP":"1.111.1.11","DATETIME":"2025-03-24T17:08:47.718","CHAINCODE":"8971","STOREATTRIBUTE3":"StoreDB Value","STOREATTRIBUTE4":"StoreDB Value"}

I receive an error:

Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"journaling_insert", :routing=>nil}, {"STOREATTRIBUTE2"=>"StoreDB Value", "FLCUSTOMER"=>{"LASTNAME"=>"the Grey", "FIRSTNAME"=>"Gandalf"}, "EVENTID"=>"16", "_id"=>"BCEE8FD9-1356-4636-BBBA-DA93C813BED7", "STOREATTRIBUTE5"=>"StoreDB Value", "@timestamp"=>2025-03-24T17:08:49.344Z, "REMOTEIP"=>"1.111.1.11", "DATETIME"=>"2025-03-24T17:08:47.718", "FLSECURITY"=>{"SID"=>"1111"}, "CHAINCODE"=>"8971", "STOREATTRIBUTE3"=>"Passed Value", "STOREATTRIBUTE4"=>"StoreDB Value"}], :response=>{"index"=>{"_index"=>"journaling-000002", "_type"=>"_doc", "_id"=>"T2sgyZUBTmjvSlXy5fsP", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"Field [_id] is a metadata field and cannot be added inside a document. Use the index API request parameters."}}}}

How can I inject the "_id" into my document?

Look at the format of the bulk API call. The _id is supplied in the first line, along with the operation and index name, not as part of the document.
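
For illustration, here is roughly what a bulk request looks like, with a trimmed version of the first sample document above (index name taken from your config). The _id sits on the action line, not inside the document source, which is exactly what the mapper_parsing_exception is complaining about:

POST /_bulk
{"index":{"_index":"journaling_insert","_id":"BCEE8FD9-1356-4636-BBBA-DA93C813BED7"}}
{"CHAINCODE":"8971","EVENTID":"16","STOREATTRIBUTE3":"Passed Value"}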

Try

mutate { rename => { "_id" => "[@metadata][_id]" } }

and add document_id => "%{[@metadata][_id]}" to the elasticsearch output.
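
Put together, a minimal sketch of the relevant pieces (hosts and index assumed to match your config above):

filter {
  mutate {
    # Stash the value: [@metadata] fields are visible to the pipeline
    # but are never serialized into the document sent to outputs.
    rename => { "_id" => "[@metadata][_id]" }
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "journaling_insert"
    # Use the stashed value as the Elasticsearch document id.
    document_id => "%{[@metadata][_id]}"
  }
}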

Yes, if I use the http output plugin I can set the "_id" on the action line, as you helped me do in a previous question, but with the elasticsearch plugin I only pass the document lines, and "_id" is not accepted there.

So I will try your mutate for the _id and let you know.

I guess Logstash automatically applies it to every line to be inserted, since each line becomes its own event.

@Badger, it works. The final configuration is:

input {
  file {
    path => "D:/log_streaming/my_app/records/*.log"
    start_position => "beginning"
    sincedb_path => "NUL"
    mode => "read"
    file_completed_action => "delete"
  }
}
filter {
  json {
    source => "message"
  }
  mutate {
    rename => { "_id" => "[@metadata][_id]" }
    remove_field => ["path", "host", "message", "@version"]
  }
}
output {
  file {
    path => "/log_streaming/my_app/logelastic/log-%{+yyyy-MM-dd_HH.mm.ss.SSS}.log"
  }
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "journaling_insert"
    document_id => "%{[@metadata][_id]}"
    # Note: doc_as_upsert only takes effect with action => "update";
    # with the default "index" action it is a no-op.
    doc_as_upsert => "true"
  }
}
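
As a quick sanity check (assuming Elasticsearch on localhost:9200 as in the config, and using one of the sample ids above), you can fetch a document by id to confirm it was indexed under the expected _id:

curl "http://localhost:9200/journaling_insert/_doc/BCEE8FD9-1356-4636-BBBA-DA93C813BED7"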