Logstash can't load logs into ES, it just loops and does nothing


(Lam) #1

Hi,

I am a beginner trying to load logs with Logstash 6.2.4. Sometimes it works, and sometimes it just sits in a loop and does nothing. The normal startup log is below:

There is no error message at all. Is this a Logstash issue? A .conf file issue (although it worked before)? Or an Elasticsearch problem? Elasticsearch is running with default settings only.

[2018-11-04T11:20:21,530][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/elastic/logstash-6.2.4/modules/fb_apache/configuration"}
[2018-11-04T11:20:21,560][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/elastic/logstash-6.2.4/modules/netflow/configuration"}
[2018-11-04T11:20:21,905][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-11-04T11:20:22,990][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.4"}
[2018-11-04T11:20:23,920][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-11-04T11:20:32,048][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-11-04T11:20:32,572][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-11-04T11:20:32,587][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-11-04T11:20:32,806][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-11-04T11:20:32,939][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-11-04T11:20:32,955][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-11-04T11:20:32,970][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-11-04T11:20:33,002][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-11-04T11:20:33,062][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2018-11-04T11:20:33,417][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"C:/elastic/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-City.mmdb"}
[2018-11-04T11:20:34,992][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x56c9ad8b run>"}
[2018-11-04T11:20:35,101][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}
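
For context, the .conf is roughly along these lines. This is only a sketch: the file path, geoip filter, and Elasticsearch output match the log above, but the csv filter and its column names are placeholders, not the exact config.

input {
  file {
    path => "C:/elastic/ddilogs/threats2018*.csv"
    start_position => "beginning"
  }
}

filter {
  # csv filter assumed from the .csv inputs; column names are placeholders
  csv {
    separator => ","
    columns => ["timestamp", "source_ip", "threat_name"]
  }
  # geoip filter is loaded per the log; the source field name is a placeholder
  geoip {
    source => "source_ip"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
  }
}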


(Lam) #2

Here are the debug logs:

==== it loops here and does nothing ====

[2018-11-03T17:53:55,467][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-03T17:53:55,467][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-03T17:53:58,186][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x21acfc92 sleep>"}
[2018-11-03T17:54:00,497][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-03T17:54:00,497][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-03T17:54:03,200][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x21acfc92 sleep>"}
[2018-11-03T17:54:05,529][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-03T17:54:05,544][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-03T17:54:08,109][DEBUG][logstash.inputs.file ] _globbed_files: C:/elastic/ddilogs/threats2018*.csv: glob is: ["C:/elastic/ddilogs/threats201807.csv", "C:/elastic/ddilogs/threats201808.csv", "C:/elastic/ddilogs/threats201809.csv", "C:/elastic/ddilogs/threats201810.csv"]
[2018-11-03T17:54:08,200][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x21acfc92 sleep>"}
[2018-11-03T17:54:10,560][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-03T17:54:10,562][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-03T17:54:13,200][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x21acfc92 sleep>"}
[2018-11-03T17:54:15,569][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-03T17:54:15,570][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-03T17:54:18,201][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x21acfc92 sleep>"}
[2018-11-03T17:54:20,577][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-03T17:54:20,578][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-03T17:54:23,217][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x21acfc92 sleep>"}
[2018-11-03T17:54:23,567][DEBUG][logstash.inputs.file ] _globbed_files: C:/elastic/ddilogs/threats2018*.csv: glob is: ["C:/elastic/ddilogs/threats201807.csv", "C:/elastic/ddilogs/threats201808.csv", "C:/elastic/ddilogs/threats201809.csv", "C:/elastic/ddilogs/threats201810.csv"]
[2018-11-03T17:54:25,591][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-03T17:54:25,591][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-03T17:54:28,218][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x21acfc92 sleep>"}
[2018-11-03T17:54:30,602][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-03T17:54:30,602][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-03T17:54:33,228][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x21acfc92 sleep>"}
[2018-11-03T17:54:35,623][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-03T17:54:35,624][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-03T17:54:38,229][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x21acfc92 sleep>"}
[2018-11-03T17:54:38,883][DEBUG][logstash.inputs.file ] _globbed_files: C:/elastic/ddilogs/threats2018*.csv: glob is: ["C:/elastic/ddilogs/threats201807.csv", "C:/elastic/ddilogs/threats201808.csv", "C:/elastic/ddilogs/threats201809.csv", "C:/elastic/ddilogs/threats201810.csv"]
[2018-11-03T17:54:40,633][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-03T17:54:40,633][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-03T17:54:43,234][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x21acfc92 sleep>"}
[2018-11-03T17:54:45,641][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-03T17:54:45,641][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-03T17:54:48,241][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x21acfc92 sleep>"}
[2018-11-03T17:54:50,655][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-03T17:54:50,655][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-03T17:54:53,255][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x21acfc92 sleep>"}
[2018-11-03T17:54:54,196][DEBUG][logstash.inputs.file ] _globbed_files: C:/elastic/ddilogs/threats2018*.csv: glob is: ["C:/elastic/ddilogs/threats201807.csv", "C:/elastic/ddilogs/threats201808.csv", "C:/elastic/ddilogs/threats201809.csv", "C:/elastic/ddilogs/threats201810.csv"]
[2018-11-03T17:54:55,675][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-03T17:54:55,675][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-03T17:54:58,268][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x21acfc92 sleep>"}
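
From the debug log, the file input keeps globbing the four CSV files but never reads any lines from them. As far as I understand, that pattern usually means the sincedb already records those files as fully read from an earlier run. If that is the cause, an input block along these lines should force a re-read. This is only a sketch; "NUL" is the Windows equivalent of "/dev/null" for disabling the sincedb, and start_position only applies to files the sincedb has not seen before.

input {
  file {
    path => "C:/elastic/ddilogs/threats2018*.csv"
    # read each file from the top instead of only tailing new lines
    start_position => "beginning"
    # do not persist read positions, so already-read files are re-read ("NUL" on Windows)
    sincedb_path => "NUL"
  }
}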


(Lam) #3

I switched from Logstash 6.2.4 back to 6.4.2, and now it works. However, I was on 6.4.2 originally and it didn't work then either; someone suggested switching to 6.2.4, which is how I ended up with the problem above.

Is something wrong somewhere? Any advice?


(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.