Thanks for the fast response.
What's odd to me is that all the other filters and outputs are working fine;
only this filter produces no output when Logstash runs as a service.
For reference, I have used this git source.
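In case it helps narrow things down, below is a minimal debug-output sketch I could drop in next to 90-output.conf to check whether events from that filter ever reach the output stage. The tag name is only a placeholder, not taken from my actual config, so it would need to match whatever the filter really sets.

# Temporary debug output (sketch only).
# "filter_in_question" is a hypothetical placeholder tag, not from my real config.
output {
  if "filter_in_question" in [tags] {
    stdout { codec => rubydebug }   # print matching events in full
  }
}

When Logstash runs as a service, that stdout ends up in the journal (journalctl -u logstash), which makes it easy to compare against a foreground run.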
[2020-05-27T20:16:49,252][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.7.0"}
[2020-05-27T20:16:57,173][INFO ][org.reflections.Reflections] Reflections took 61 ms to scan 1 urls, producing 21 keys and 41 values
[2020-05-27T20:17:36,236][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
[2020-05-27T20:17:36,596][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://127.0.0.1:9200/"}
[2020-05-27T20:17:36,676][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-05-27T20:17:36,682][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[2020-05-27T20:17:36,786][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//127.0.0.1"]}
[2020-05-27T20:17:36,820][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-05-27T20:17:36,829][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-05-27T20:17:36,838][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-05-27T20:17:36,844][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[2020-05-27T20:17:36,859][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-05-27T20:17:36,886][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2020-05-27T20:17:36,902][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-05-27T20:17:36,981][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-05-27T20:17:36,986][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-05-27T20:17:37,231][INFO ][logstash.filters.geoip ][main] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-6.0.3-java/vendor/GeoLite2-City.mmdb"}
[2020-05-27T20:17:37,490][INFO ][logstash.filters.geoip ][main] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-6.0.3-java/vendor/GeoLite2-City.mmdb"}
[2020-05-27T20:17:37,938][INFO ][logstash.filters.geoip ][main] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-6.0.3-java/vendor/GeoLite2-City.mmdb"}
[2020-05-27T20:17:38,859][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been created for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2020-05-27T20:17:38,868][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>48, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>6000, "pipeline.sources"=>["/etc/logstash/conf.d/10-import.conf", "/etc/logstash/conf.d/30-filter-mail.conf", "/etc/logstash/conf.d/30-filter-nginx.conf", "/etc/logstash/conf.d/31-filter-auth.conf", "/etc/logstash/conf.d/50-filter-dovecot.conf", "/etc/logstash/conf.d/50-filter-postfix.conf", "/etc/logstash/conf.d/50-filter-postgrey.conf", "/etc/logstash/conf.d/51-filter-postfix-postproc.conf", "/etc/logstash/conf.d/65-filter-spamd.conf", "/etc/logstash/conf.d/90-output.conf"], :thread=>"#<Thread:0x5441fe run>"}
[2020-05-27T20:17:47,497][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-05-27T20:17:47,660][INFO ][filewatch.observingtail ][main][da7aa9bb1b33dce2079843c8dfd2ee334309b530fc4189372c6a0d146bb59c70] START, creating Discoverer, Watch with file and sincedb collections
[2020-05-27T20:17:47,659][INFO ][filewatch.observingtail ][main][9ce5eb4961358279a3ac473ad32cbc26a8a3e31cec0e82c6bc6b3d810bbd3712] START, creating Discoverer, Watch with file and sincedb collections
[2020-05-27T20:17:47,661][INFO ][filewatch.observingtail ][main][df2ca38f8542ce31331753b1b083ce4299bcf6b81c2ccd1e97634cf7edcc813b] START, creating Discoverer, Watch with file and sincedb collections
[2020-05-27T20:17:47,675][INFO ][filewatch.observingtail ][main][e3bc4c168dfc480910a5e5d6e71c2afcf8733f8a5ca5fc1cadab35c3b75b6f86] START, creating Discoverer, Watch with file and sincedb collections
[2020-05-27T20:17:47,828][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-05-27T20:17:48,596][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}