Logstash plugin error

This is the error I was getting:

[ERROR][logstash.pipeline        ] A plugin had an unrecoverable error. Will restart this plugin.
Plugin: <LogStash::Inputs::Elasticsearch hosts=

But now that error isn't even showing up; it just keeps looping this:

[INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"arcsight", :directory=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/x-pack-5.6.4-java/modules/arcsight/configuration"}
[INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://logstash_system:xxxxxx@localhost:9200/]}}
[INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://logstash_system:xxxxxx@localhost:9200/, :path=>"/"}
[WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://logstash_system:xxxxxx@localhost:9200/"}
[INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[INFO ][logstash.pipeline        ] Starting pipeline {"id"=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>2}
[INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://logstash_system:xxxxxx@localhost:9200/]}}
[INFO ][logstash.licensechecker.licensereader] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://logstash_system:xxxxxx@localhost:9200/, :path=>"/"}
[WARN ][logstash.licensechecker.licensereader] Restored connection to ES instance {:url=>"http://logstash_system:xxxxxx@localhost:9200/"}
[INFO ][logstash.pipeline        ] Pipeline .monitoring-logstash started
[INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>1000}
[INFO ][logstash.pipeline        ] Pipeline main started
[INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[WARN ][logstash.agent           ] stopping pipeline {:id=>".monitoring-logstash"}
[WARN ][logstash.agent           ] stopping pipeline {:id=>"main"}

I'm trying to get data from an index using a Logstash input config, and then use rsyslog afterwards to tag it and send it on. Here's the input config (with a couple of placeholders instead of the real values so I can post it here):

input {
  elasticsearch {
    hosts => "host:9300"
    #ssl => true
    #ca_file => "/app/elasticsearch-5.2.0/config/x-pack/ca.crt"
    user => "elastic"
    password => "password"
    index => "index"
    query => '{
      "query": {
        "range": {
          "time": { "gte": "now-15m", "lt": "now" }
        }
      },
      "_source": ["time", "alert_name", "payload.hits.hits._source.as1", "payload.hits.hits._source.ipSrc", "payload.hits.hits._source.portSrc", "payload.hits.hits._source.g1", "payload.hits.hits._source.ipDst", "payload.hits.hits._source.p2"]
    }'
    size => 5000
    scroll => "2000m"
    codec => "json"
  }
}

filter {
  split { field => "[payload][hits][hits]" }
}

output {
  stdout { codec => line { format => "%{[payload][hits][hits]} %{[alert_name]}" } }
}
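One thing worth checking before restarting Logstash: the `query` option is passed to Elasticsearch verbatim, so it has to be valid JSON (for example, `"_source"` needs a colon before its array, which the original snippet was missing). A minimal sketch of how to sanity-check the query string with Python's standard `json` module before dropping it into the config; the field names below are copied from the config above and are placeholders, not real values:

```python
import json

# A trimmed copy of the query string from the elasticsearch input above.
# "time" and "alert_name" etc. are the placeholder field names from the post.
query = '''{
  "query": {
    "range": {
      "time": { "gte": "now-15m", "lt": "now" }
    }
  },
  "_source": ["time", "alert_name", "payload.hits.hits._source.as1"]
}'''

try:
    parsed = json.loads(query)
    # Report the top-level keys so you can confirm the structure is what
    # Elasticsearch expects (a "query" clause plus optional "_source").
    print("valid JSON, top-level keys:", sorted(parsed))
except json.JSONDecodeError as exc:
    print("invalid JSON:", exc)
```

If the string fails to parse here, the elasticsearch input will fail in the same way, which is one common cause of the "unrecoverable error, will restart this plugin" loop.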
