Logstash does not pick up new logs with the Elasticsearch input plugin

Hi guys!
I have an issue with Logstash Elasticsearch input plugin. I try to transfer data from one ES cluster to other ES cluster, but Logstash transfer data only to time when pipeline starting. For example, pipeline started at 11:30, on the new ES cluster I can see data only till this time.
Logstash does not update data after this time.

But if I change the Logstash input (I used cloudwatch for the test), everything works fine.

My config:

    input {
      # Read all documents from Elasticsearch matching the given query
      elasticsearch {
        hosts => "primary-elasticsearch-host:9200"
        index => "*"
        query => '{ "query": {"match_all": {}} }'
        size => 2000
        scroll => "2m"
        docinfo => true
      }
    }


    output {

      elasticsearch {
        hosts => ["new-elasticsearch-host:9200"]
        index => "infra-%{+YYYY.MM.dd}"
        user => "user"
        password => "password"
        manage_template => false
      }
    }
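From what I can tell, the elasticsearch input runs the query once (as a scroll) and then stops, unless a `schedule` is set. Is something like this the intended way to keep pulling new data? (The cron expression below is just a guess at a reasonable interval.)

    input {
      elasticsearch {
        hosts => "primary-elasticsearch-host:9200"
        index => "*"
        query => '{ "query": {"match_all": {}} }'
        size => 2000
        scroll => "2m"
        docinfo => true
        # re-run the query every minute (cron syntax)
        schedule => "* * * * *"
      }
    }

I guess re-running a match_all query would also re-read old documents, so I would probably need to set `document_id => "%{[@metadata][_id]}"` in the output (using the docinfo metadata) to avoid duplicates?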

My Logstash logs:

    OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
    WARNING: An illegal reflective access operation has occurred
    WARNING: Illegal reflective access by com.headius.backport9.modules.Modules to method java.lang.Object.finalize()
    WARNING: Please consider reporting this to the maintainers of com.headius.backport9.modules.Modules
    WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
    WARNING: All illegal access operations will be denied in a future release
    Installing file: /usr/share/logstash/plugins/logstash-offline-plugins-7.4.0.zip
    Install successful
    OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
    WARNING: An illegal reflective access operation has occurred
    WARNING: Illegal reflective access by com.headius.backport9.modules.Modules (file:/usr/share/logstash/logstash-core/lib/jars/jruby-complete-9.2.8.0.jar) to field java.io.FileDescriptor.fd
    WARNING: Please consider reporting this to the maintainers of com.headius.backport9.modules.Modules
    WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
    WARNING: All illegal access operations will be denied in a future release
    Thread.exclusive is deprecated, use Thread::Mutex
    Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
    [2020-05-27T20:40:27,899][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
    [2020-05-27T20:40:27,915][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
    [2020-05-27T20:40:29,017][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
    [2020-05-27T20:40:29,027][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.4.0"}
    [2020-05-27T20:40:29,111][INFO ][logstash.agent           ] No persistent UUID file found. Generating new UUID {:uuid=>"380e64a0-909b-4306-b46c-67e8f6e8d74d", :path=>"/usr/share/logstash/data/uuid"}
    [2020-05-27T20:40:33,210][INFO ][org.reflections.Reflections] Reflections took 92 ms to scan 1 urls, producing 20 keys and 40 values
    [2020-05-27T20:40:35,617][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://admin:xxxxxx@elasticsearch-master:9200/]}}
    [2020-05-27T20:40:36,234][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://admin:xxxxxx@elasticsearch-master:9200/"}
    [2020-05-27T20:40:36,397][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
    [2020-05-27T20:40:36,400][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
    [2020-05-27T20:40:36,426][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//elasticsearch-master:9200"]}
    [2020-05-27T20:40:36,797][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
    [2020-05-27T20:40:36,804][INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, :thread=>"#<Thread:0x72363f3b run>"}
    [2020-05-27T20:40:38,396][INFO ][logstash.javapipeline    ] Pipeline started {"pipeline.id"=>"main"}
    [2020-05-27T20:40:38,750][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
    [2020-05-27T20:40:40,400][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
    [2020-05-27T20:41:53,518][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"6-infra-2020.05.27", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x69853a6>], :response=>{"index"=>{"_index"=>"6-infra-2020.05.27", "_type"=>"_doc", "_id"=>"ELveV3IBVEPZ9qCeuAEy", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [processed_json.ts] of type [float] in document with id 'ELveV3IBVEPZ9qCeuAEy'. Preview of field's value: '2020-05-27T12:15:53.913Z'", "caused_by"=>{"type"=>"number_format_exception", "reason"=>"For input string: \"2020-05-27T12:15:53.913Z\""}}}}}
    [2020-05-27T20:41:53,520][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"6-infra-2020.05.27", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x12adf3b6>], :response=>{"index"=>{"_index"=>"6-infra-2020.05.27", "_type"=>"_doc", "_id"=>"EbveV3IBVEPZ9qCeuAEy", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [processed_json.ts] of type [float] in document with id 'EbveV3IBVEPZ9qCeuAEy'. Preview of field's value: '2020-05-27T12:15:53.962Z'", "caused_by"=>{"type"=>"number_format_exception", "reason"=>"For input string: \"2020-05-27T12:15:53.962Z\""}}}}}
    [2020-05-27T20:41:53,521][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"6-infra-2020.05.27", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x69ba885e>], :response=>{"index"=>{"_index"=>"6-infra-2020.05.27", "_type"=>"_doc", "_id"=>"ErveV3IBVEPZ9qCeuAEy", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [processed_json.ts] of type [float] in document with id 'ErveV3IBVEPZ9qCeuAEy'. Preview of field's value: '2020-05-27T12:15:53.965Z'", "caused_by"=>{"type"=>"number_format_exception", "reason"=>"For input string: \"2020-05-27T12:15:53.965Z\""}}}}}

What is wrong with my config?
Any ideas?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.