HTTP filter leaves fields that are inserted into Elasticsearch

Basically my pipeline looks like this:

input {
    (...) #some input
}
filter {
    http {
        (...) #http request
    }
}
output {
    elasticsearch {
        (...)
        action => "update"
        doc_as_upsert => true
    }
}

I need the http filter to perform a particular HTTP request, but nothing more; I am not reading the response data back.
The unwanted result of using that filter is that fields produced by it are inserted into the Elasticsearch document being updated, for example: headers, body, @version, @timestamp, ...
I tried to use mutate with remove_field to get rid of those unnecessary fields, which works for
headers and body, but I am left with fields like @version and @timestamp, which I cannot remove (the document will not be updated in Elasticsearch otherwise) but also do not want in my document's _source.
Is there a way to run the http filter without it adding new fields to the event?
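One possible workaround, sketched here under the assumption that the http filter's target_body and target_headers options are available in your plugin version: point both targets at sub-fields of [@metadata]. Fields under @metadata are available during the pipeline run but are not serialized by outputs, so the response never reaches the Elasticsearch document. The URL below is a hypothetical placeholder.

filter {
    http {
        url => "http://example.com/hook"   # hypothetical endpoint
        verb => "POST"
        # Redirect the response into @metadata, which outputs do not write
        target_body => "[@metadata][http_response_body]"
        target_headers => "[@metadata][http_response_headers]"
    }
}

This avoids the need for a follow-up mutate/remove_field step for headers and body, though it does not affect @version or @timestamp, which are added by Logstash itself.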

@version and @timestamp are not added by the http filter. They are added to all events by Logstash itself. If I recall correctly, @timestamp is mandatory; Elasticsearch requires it to be present. Not sure about @version.
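If it turns out @version is not required in your setup, it can usually be dropped with a plain mutate filter; a minimal sketch (leaving @timestamp in place, since the Elasticsearch output's default date-based index naming relies on it):

filter {
    mutate {
        # Drop @version from the event before it reaches the output
        remove_field => ["@version"]
    }
}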

Yes, my mistake.
I would delete this topic, but I do not have permission to do so.