Logstash stops loading log data into ES

Here's the output from starting Logstash:

/opt/logstash/bin/logstash -f /opt/logstash/config/logstash.conf
Sending Logstash logs to /opt/logstash/logs which is now configured via log4j2.properties
[2020-04-14T09:11:20,604][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-04-14T09:11:20,844][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.6.2"}
[2020-04-14T09:11:24,473][INFO ][org.reflections.Reflections] Reflections took 88 ms to scan 1 urls, producing 20 keys and 40 values 
[2020-04-14T09:11:28,066][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://10.71.4.9:9200/]}}
[2020-04-14T09:11:28,530][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://10.71.4.9:9200/"}
[2020-04-14T09:11:28,650][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-04-14T09:11:28,660][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-04-14T09:11:28,899][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//10.71.4.9:9200"]}
[2020-04-14T09:11:29,029][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-04-14T09:11:29,117][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been created for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2020-04-14T09:11:29,129][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/opt/logstash/config/logstash.conf"], :thread=>"#<Thread:0x3ede1de4 run>"}
[2020-04-14T09:11:29,317][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-04-14T09:11:31,820][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/opt/logstash/data/plugins/inputs/file/.sincedb_2ce6f232da790296fc018978c1ba0a04", :path=>["/var/log/*.log", "/var/log/message"]}
[2020-04-14T09:11:31,889][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-04-14T09:11:32,023][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-04-14T09:11:32,039][INFO ][filewatch.observingtail  ][main] START, creating Discoverer, Watch with file and sincedb collections
[2020-04-14T09:11:32,609][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

The Logstash input and output configuration file is as follows:

input {
    file {
        path => ["/var/log/*.log", "/var/log/message"]
        type => "system"
        start_position => "beginning"
    }
}
output {
    elasticsearch {
        hosts => ["10.71.4.9:9200"]
        index => "system-log-%{+YYYY.MM.dd}"
    }
    stdout { codec => rubydebug }
}
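
As a side check, a minimal pipeline like the sketch below (reusing the same host and index pattern from the config above) can be fed lines on stdin to confirm that the Elasticsearch output itself keeps working, independent of any file tracking:

input {
    stdin { }
}
output {
    elasticsearch {
        hosts => ["10.71.4.9:9200"]
        index => "system-log-%{+YYYY.MM.dd}"
    }
    stdout { codec => rubydebug }
}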

When Logstash starts for the first time it works and data is loaded into ES, but after that there are no updates.
The Kibana screen shows that there is no data after 16:00 yesterday.

Has any data been appended to any of the tracked log files since Logstash initially processed them?
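
I ask because start_position => "beginning" only applies to files the sincedb has never seen; after the initial read (recorded in the sincedb file mentioned in your startup log), the file input behaves like tail -f and only emits lines that are appended afterwards. If you want to force a full re-read to rule out the pipeline itself, a common test-only approach is to stop persisting read offsets. A sketch reusing the paths from your config (not for production, since every restart re-reads everything):

input {
    file {
        path => ["/var/log/*.log", "/var/log/message"]
        type => "system"
        start_position => "beginning"
        # test only: /dev/null means read offsets are never persisted,
        # so each restart re-reads the files from the beginning
        sincedb_path => "/dev/null"
    }
}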
