Join two indexes on one primary key with the Logstash elasticsearch filter

I am trying to join a transform index with my new index. I used this as a reference: How to join fields from multi events into single event? It seems this should work with the elasticsearch filter plugin.
Transform index name: transform_ndex
The transform index contains these fields: extract.keyword, service.keyword.cardinality, timestamp_tries.max, userID.keyword
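For reference, a document in transform_ndex should look roughly like this, based on the field names above (all values are made up):

```json
{
  "extract.keyword": "5053",
  "userID.keyword": "alice@example.com",
  "timestamp_tries.max": 1650000000,
  "service.keyword.cardinality": 3
}
```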
Here is the Logstash configuration for my new index:

input {
    beats {
        port => 5053
    }
}

filter {
    csv {
        skip_header => true
        columns => ["Name", "Email", "date"]
        separator => ","
    }

    grok {
        match => { "[log][file][path]" => "%{POSINT:extract}" }
    }

    elasticsearch {
        hosts => ["127.0.0.1"]
        index => "transform_authentication"
        query => "extract.keyword:%{[extract]} AND userID.keyword:%{[Email]}"
        fields => {
            "timestamp_tries.max" => "timestamp_tries.max"
            "service.keyword.cardinality" => "service.keyword.cardinality"
        }
    }
}

output {
    stdout { codec => rubydebug }
}
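Conceptually, what I expect the elasticsearch filter to do is a per-event lookup-and-copy. A minimal Python sketch of that logic (the field names come from my indices; the lookup data and values are made up):

```python
# Stand-in for the transform_ndex index (hypothetical documents).
transform_docs = [
    {"extract.keyword": "5053", "userID.keyword": "alice@example.com",
     "timestamp_tries.max": 1650000000, "service.keyword.cardinality": 3},
]

def enrich(event, docs):
    """Copy the join fields from the first matching transform document
    into the event, matching on extract and Email."""
    for doc in docs:
        if (doc["extract.keyword"] == event.get("extract")
                and doc["userID.keyword"] == event.get("Email")):
            event["timestamp_tries.max"] = doc["timestamp_tries.max"]
            event["service.keyword.cardinality"] = doc["service.keyword.cardinality"]
            break
    return event

# A matching event gains both transform fields; a non-matching one is unchanged.
event = enrich({"extract": "5053", "Email": "alice@example.com"}, transform_docs)
```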

The field extract in the new index should match extract.keyword in transform_ndex, and Email in the new index should match userID.keyword in transform_ndex.
Once the query matches, I want to copy the fields timestamp_tries.max and service.keyword.cardinality from transform_ndex into the new index.
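After a successful lookup, I would expect the rubydebug output for an event to include something like this (values are only illustrative):

```
{
                        "extract" => "5053",
                          "Email" => "alice@example.com",
            "timestamp_tries.max" => 1650000000,
    "service.keyword.cardinality" => 3
}
```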

However, it keeps returning an error.

I don't know whether I misunderstood the concept or made a mistake in the config file. Please tell me how to do this. Thank you.