ElasticSearch Input Plugin has no log output

I want to periodically pull data from one Elasticsearch cluster into another through Logstash, but I only see the startup logs and no other log output. Thanks!
config:

input {
  #beats {
  #  port => 5044
  #}
  elasticsearch {
    hosts => ["10.1.25.106:9210"]
    index => "chat-*"
    query => '{"query":{"range": {"sendTime": {"gt": "now-2d/d", "lt": "now/d"}}}}'
    #size => 500
    schedule => "* 11 * * * Asia/Shanghai"
    #schedule => "45 * * * *"
    scroll => "5m"
    docinfo => true
    docinfo_target => "[@metadata][doc]"
  }
}

output {
  elasticsearch {
    hosts => ["http://10.12.12.111:9200"]
    index => "%{[@metadata][_index]}"
    #user => "elastic"
    #password => "changeme"
  }
}

logs:

Using JAVA_HOME defined java: /home/hollycrm/app/tools/jdk1.8.0_251
WARNING, using JAVA_HOME while Logstash distribution comes with a bundled JDK
Sending Logstash logs to /home/hollycrm/app/tools/logstash-7.10.1/logs which is now configured via log4j2.properties
[2021-12-25T10:58:32,719][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.10.1", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc Java HotSpot(TM) 64-Bit Server VM 25.251-b08 on 1.8.0_251-b08 +indy +jit [linux-x86_64]"}
[2021-12-25T10:58:33,513][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2021-12-25T10:58:36,772][INFO ][org.reflections.Reflections] Reflections took 58 ms to scan 1 urls, producing 23 keys and 47 values 
[2021-12-25T10:58:37,961][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://10.12.12.111:9200/]}}
[2021-12-25T10:58:38,221][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://10.12.12.111:9200/"}
[2021-12-25T10:58:38,295][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2021-12-25T10:58:38,302][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2021-12-25T10:58:38,407][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://10.12.12.111:9200"]}
[2021-12-25T10:58:38,520][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2021-12-25T10:58:38,646][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/home/hollycrm/app/tools/logstash-7.10.1/config/logstash-sample.conf"], :thread=>"#<Thread:0x690701c1 run>"}
[2021-12-25T10:58:38,783][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2021-12-25T10:58:40,087][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.43}
[2021-12-25T10:58:40,353][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2021-12-25T10:58:40,425][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2021-12-25T10:58:40,975][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
/home/hollycrm/app/tools/logstash-7.10.1/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/cronline.rb:77: warning: constant ::Fixnum is deprecated

This schedule can be read as "once per minute, between 11:00am and 11:59am Shanghai-local time".

Your logs only cover a window of time that is not between 11:00 and 11:59 (they stop at 10:58), so I would not expect the input to have scheduled any work yet.
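If the intent was to run the pull once per day rather than every minute of that hour, the schedule needs an explicit minute field. A minimal sketch of the input with only the schedule line changed, assuming 11:00 Shanghai time is the intended run time (rufus-scheduler cron fields are minute, hour, day-of-month, month, day-of-week, optionally followed by a timezone):

input {
  elasticsearch {
    hosts => ["10.1.25.106:9210"]
    index => "chat-*"
    query => '{"query":{"range": {"sendTime": {"gt": "now-2d/d", "lt": "now/d"}}}}'
    # run once per day at 11:00 Asia/Shanghai instead of every minute between 11:00 and 11:59
    schedule => "0 11 * * * Asia/Shanghai"
    scroll => "5m"
    docinfo => true
    docinfo_target => "[@metadata][doc]"
  }
}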

Logstash does not print a log entry when data is extracted successfully. The problem was a config error: after changing docinfo_target => "[@metadata][doc]" to docinfo_target => "[@metadata]", the data is pulled normally. Thank you.
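For reference, keeping docinfo_target => "[@metadata][doc]" should also work if the output's index sprintf is updated to point at the nested path where the docinfo fields land; a minimal sketch of that variant (hosts and other options as in the original config):

output {
  elasticsearch {
    hosts => ["http://10.12.12.111:9200"]
    # with docinfo_target => "[@metadata][doc]", the source document's _index lives at [@metadata][doc][_index]
    index => "%{[@metadata][doc][_index]}"
  }
}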
