How to remove fields in logstash output

Yeah, I set the codec to json in the input settings and removed the rename+mutate of [id]. This updates the exact Elasticsearch document, but the document source then contains an [id] field, which I don't want stored in the source.
My Logstash config file looks like this:

input {
  kafka {
    bootstrap_servers => "xx.xx.xx.xx:9092"
    #client_id => "logstash1" # lets a logical application name identify the consumer beyond just ip/port
    group_id => "im_group"
    auto_offset_reset => "latest" # automatically reset the offset to the latest offset
    consumer_threads => 3
    topics => ["im"]
    #type => "message"
    codec => json {
      charset => "UTF-8"
    }
  }
}

filter {
  mutate {
    add_field => { "@fields" => "%{fields}" }
    rename => { "[operation]" => "[@metadata][operation]" }
    #rename => { "[id]" => "[@metadata][id]" }
  }
  json {
    source => "@fields"
    remove_field => ["@fields", "@version", "@timestamp", "fields"]
  }
}
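For context, a message on the im topic is assumed to look roughly like this (a hypothetical example; the field names are taken from the result document shown below, and the nesting of the payload as a JSON string under fields is an assumption based on the filter above):

```json
{
  "operation": "update",
  "id": "Bn-Jt2kBUAnQ0IgRcSc2",
  "fields": "{\"accept\":0,\"content\":\"xxxxxxxxxxxxx\",\"from_id\":12796,\"id\":\"Bn-Jt2kBUAnQ0IgRcSc2\",\"to_id\":12797}"
}
```

Note that if [id] also appears inside fields, the json filter re-creates the top-level [id] when it parses @fields, which would undo a rename done in the first mutate.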

output {
  if [@metadata][operation] == "create" {
    elasticsearch {
      hosts => ["xx.xx.xx.xx:9200"]
      index => "im"
      timeout => 300
      #user => "elastic"
      #password => "changeme"
    }
  } else if [@metadata][operation] == "update" {
    elasticsearch {
      hosts => ["xx.xx.xx.xx:9200"]
      index => "im"
      action => "update"
      doc_as_upsert => true
      timeout => 300
      document_id => "%{[id]}"
      #document_id => "%{[@metadata][id]}"
      #user => "elastic"
      #password => "changeme"
    }
  }
}
The result of the update looks like this:
{
  "_id": "Bn-Jt2kBUAnQ0IgRcSc2",
  "_index": "im",
  "_score": 1,
  "_source": {
    "accept": 0,
    "content": "xxxxxxxxxxxxx",
    "created_at": "2019-03-25 10:12:56",
    "from_id": 12796,
    "id": "Bn-Jt2kBUAnQ0IgRcSc2",
    "is_revoke": 0,
    "sign": "im_messages",
    "to_id": 12797,
    "updated_at": "2019-03-25 14:14:45"
  },
  "_type": "doc"
}
The [id] value appears both as the document's _id and inside _source; I don't want it added to the source.
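One way to get this (a sketch, not tested against this exact pipeline): do the commented-out rename, but in a second mutate placed after the json filter. Filters run in order, so renaming in the first mutate doesn't help if the json filter re-creates [id] when it parses fields; renaming afterwards moves [id] into @metadata, which is available to the output for sprintf references but is never written into _source:

```conf
filter {
  # ... existing mutate + json filters as above ...

  # Move [id] out of the event body. Fields under @metadata
  # can be referenced in the output but are not indexed.
  mutate {
    rename => { "[id]" => "[@metadata][id]" }
  }
}

output {
  elasticsearch {
    hosts         => ["xx.xx.xx.xx:9200"]
    index         => "im"
    action        => "update"
    doc_as_upsert => true
    timeout       => 300
    document_id   => "%{[@metadata][id]}"
  }
}
```

This matches the commented-out document_id => "%{[@metadata][id]}" line already in the update branch; only the placement of the rename changes.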