Hello Team,
Could you please help me with the error below? I am trying to pull data from a URL in CSV format and store it in a CSV output file. I have been googling around but could not find an answer, so the Elastic community is my last hope. Please let me know what I am doing wrong in my configuration.
Below is the error:
[2019-04-02T19:44:09,476][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.7.0"}
[2019-04-02T19:44:21,765][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-04-02T19:44:22,070][INFO ][logstash.inputs.http_poller] Registering http_poller Input {:type=>nil, :schedule=>{"every"=>"2s"}, :timeout=>nil}
[2019-04-02T19:44:22,125][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"<LogStash::Inputs::HTTP_Poller schedule=>{\"every\"=>\"2s\"}, urls=>{\"minemeld\"=>\"http://192.168.56.30/feeds/DARP-output-feedgreen?tr=1&v=csv&f=indicator|threatIP&f=confidence&f=sources|feeds\", \"codec\"=>\"line\"}, id=>\"b2c19b71025ef17abe0e54722f8bdb435e1a5296d1b6da6c1007e741a1fb431f\", enable_metric=>true, codec=><LogStash::Codecs::JSON id=>\"json_4a209d70-7ae1-47ef-90f5-cf5951a5a4c6\", enable_metric=>true, charset=>\"UTF-8\">, request_timeout=>60, socket_timeout=>10, connect_timeout=>10, follow_redirects=>true, pool_max=>50, pool_max_per_route=>25, keepalive=>true, automatic_retries=>1, retry_non_idempotent=>false, validate_after_inactivity=>200, keystore_type=>\"JKS\", truststore_type=>\"JKS\", cookies=>true, metadata_target=>\"@metadata\">", :error=>"Invalid URL http://192.168.56.30/feeds/DARP-output-feedgreen?tr=1&v=csv&f=indicator|threatIP&f=confidence&f=sources|feeds", :thread=>"#<Thread:0x5e7c605a run>"}
[2019-04-02T19:44:23,987][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: Invalid URL http://192.168.56.30/feeds/DARP-output-feedgreen?tr=1&v=csv&f=indicator|threatIP&f=confidence&f=sources|feeds>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-http_poller-4.0.5/lib/logstash/inputs/http_poller.rb:105:in `validate_request!'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-http_poller-4.0.5/lib/logstash/inputs/http_poller.rb:97:in `normalize_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-http_poller-4.0.5/lib/logstash/inputs/http_poller.rb:57:in `block in setup_requests!'", "org/jruby/RubyHash.java:1419:in `each'", "org/jruby/RubyEnumerable.java:833:in `map'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-http_poller-4.0.5/lib/logstash/inputs/http_poller.rb:57:in `setup_requests!'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-http_poller-4.0.5/lib/logstash/inputs/http_poller.rb:47:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:259:in `register_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:270:in `block in register_plugins'", "org/jruby/RubyArray.java:1792:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:270:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:413:in `start_inputs'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:311:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:217:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:176:in `block in start'"], :thread=>"#<Thread:0x5e7c605a run>"}
[2019-04-02T19:44:24,039][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2019-04-02T19:44:24,690][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
Below is my config file:
input {
  http_poller {
    schedule => { "every" => "2s" }
    urls => {
      minemeld => "http://192.168.56.30/feeds/DARP-output-feedgreen?tr=1&v=csv&f=indicator|threatIP&f=confidence&f=sources|feeds"
      codec => "line"
    }
  }
}

filter {
  csv {
    separator => ","
    columns => ["threatIP","confidence","feeds"]
  }
}

output {
  csv {
    fields => ["threatIP","confidence","feeds"]
    path => "/tmp/darpintel.csv"
  }
}
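
In case it helps, this is the variant I was thinking of trying next. It is only a guess on my side: I moved the codec option out of the urls hash (I believe it is an option of the http_poller input itself, not of an individual URL), and I percent-encoded the | characters in the query string as %7C, since I suspect they are what makes the URL invalid.

input {
  http_poller {
    schedule => { "every" => "2s" }
    urls => {
      # only name => URL entries in here, no codec entry
      minemeld => "http://192.168.56.30/feeds/DARP-output-feedgreen?tr=1&v=csv&f=indicator%7CthreatIP&f=confidence&f=sources%7Cfeeds"
    }
    # codec set on the input itself, so each line of the CSV feed becomes an event
    codec => "line"
  }
}

Would that be the correct way to set it up, or am I missing something else?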