Running Perl Script Exec Input - No Output

Hello,

My goal is to take the output of a Perl script and parse it with Logstash. I can run the script successfully from the shell, and it prints the CSV data that I want to parse and export to Elasticsearch.

Example:

kyle@ELK:$ ./test.pl XX.XX.XX.XX -pa XXXXXXXX -o csv
41|222|1|65685|1461169607|484943|1000000|1|1|21|2|192.168.50.100|8.8.8.8|8|0|1|7|2|1|0|0|0|d2903b34-065d-11e6-9cd8-86bbaec155d7|9999997|0|1296|0|268434434|00000000-0000-0000-0000-000057167b10|7dfee1ba-063f-11e6-b296-d10a16b9b5d8|7377cab8-063f-11e6-8234-8411ab8a4f0a|7e3dbda4-063f-11e6-b296-d10a16b9b5d8|73a70404-063f-11e6-8234-8411ab8a4f0a|1461169567|2|14605|0|840|0

When I attempt to run the same script from the exec input, it runs, but I do not see any output in logstash.stdout, so I have no clue what it is doing. Am I going about this the right way? Will the exec input accept the command-line flags?

input {
  exec {
    command => 'perl test.pl XX.XX.XX.XX -pa XXXXXXXX -o csv'
    interval => 10000
  }
}

filter {
  csv {
    separator => "|"
    columns => ..Removed ..
  }
  if ([Removed] == " Removed") {
    drop { }
  }
}

output {
  stdout {
    codec => rubydebug
  }
}

This should work fine. Are you sure you're not nuking each message with the drop filter?

$ /opt/logstash/bin/logstash -e 'input { exec { command => "echo hello" interval => 1 } } output { stdout { codec => rubydebug } }'
Settings: Default pipeline workers: 2
Pipeline main started
{
       "message" => "hello\n",
      "@version" => "1",
    "@timestamp" => "2016-04-20T17:22:17.047Z",
          "host" => "hallonet",
       "command" => "echo hello"
}
{
       "message" => "hello\n",
      "@version" => "1",
    "@timestamp" => "2016-04-20T17:22:17.989Z",
          "host" => "hallonet",
       "command" => "echo hello"
}
^CSIGINT received. Shutting down the agent. {:level=>:warn}
stopping pipeline {:id=>"main"}
{
       "message" => "hello\n",
      "@version" => "1",
    "@timestamp" => "2016-04-20T17:22:18.989Z",
          "host" => "hallonet",
       "command" => "echo hello"
}
Pipeline main has been shutdown
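
On the drop filter question: if the intent is only to discard the CSV header row, matching the literal header text is safer than matching on a data value. A minimal sketch, assuming the first column is named block_type and that the header line repeats the column names verbatim:

filter {
  csv {
    separator => "|"
    columns => [ "block_type", "block_length", ... ]
  }
  # Hypothetical guard: drop only rows whose first field is the
  # literal header text, so a real data row can never match.
  if [block_type] == "block_type" {
    drop { }
  }
}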

Almost positive. The drop filter is only taking out the CSV headers. I tested it against a file input and it matched and dropped exactly what I wanted. To be doubly sure, I will take the filter out.

Regards,
Kyle Ross

Removed the drop filter and still no love. I also removed all other configuration files to make sure that nothing else was catching the events first. I then mimicked the command you ran above.

kyle@ELK:/opt/logstash/bin$ sudo ./logstash -e 'input { exec { command => "perl test.pl XX.XX.XX.XX -pa XXXXXXXXXXX -o csv" interval => 3600 } } output { stdout { codec => rubydebug } }'
Settings: Default pipeline workers: 2
Can't open perl script "test.pl": No such file or directory
Logstash startup completed
{
       "message" => "",
      "@version" => "1",
    "@timestamp" => "2016-04-20T18:28:48.202Z",
          "host" => "ELK",
       "command" => "perl test.pl XX.XX.XX.XX -pa XXXXXXXX -o csv"
}

The strange thing is that when I run this from the configuration file, I do not see this error in the Logstash logs. If I use the full path to the script, it complains that a Perl module is not installed (it is; the module sits in the same folder as the Perl script).

Is there a specific folder where I should place the scripts? I currently have them in /var/lib/logstash.

Oh. I wouldn't make any assumptions about the current directory of Logstash, and you shouldn't store the scripts in /var/lib/logstash anyway. Change into the directory containing the script files first, i.e. change your command to cd /some/path && perl test.pl ....
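
For example (the /opt/scripts path is just an assumption; use wherever you move the script to):

input {
  exec {
    # cd first so the script's relative module lookups resolve;
    # /opt/scripts is an assumed location, not a requirement.
    command => 'cd /opt/scripts && perl test.pl XX.XX.XX.XX -pa XXXXXXXX -o csv'
    interval => 10000
  }
}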

Great. Thank you. I am now past the Perl errors. Now I seem to be butting heads with the plugin.

{:timestamp=>"2016-04-20T16:11:03.513000-0400", :message=>"Error while running command", :command=>"perl test.pl XX.XX.XX.XX -pa XXXXXXXX -o csv", :e=>#<IOError: closed stream>, :backtrace=>["org/jruby/RubyIO.java:3067:in read'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-exec-2.0.6/lib/logstash/inputs/exec.rb:80:in execute'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-exec-2.0.6/lib/logstash/inputs/exec.rb:43:in inner_run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-exec-2.0.6/lib/logstash/inputs/exec.rb:37:in run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.4-java/lib/logstash/pipeline.rb:334:in inputworker'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.4-java/lib/logstash/pipeline.rb:328:in start_input'"], :level=>:error}
{:timestamp=>"2016-04-20T16:11:04.536000-0400", :message=>#<LogStash::PipelineReporter::Snapshot:0x7b81410e @data={:events_filtered=>1, :events_consumed=>1, :worker_count=>2, :inflight_count=>1, :worker_states=>[{:status=>"dead", :alive=>false, :index=>0, :inflight_count=>0}, {:status=>"sleep", :alive=>true, :index=>1, :inflight_count=>1}], :output_info=>[{:type=>"stdout", :config=>{"codec"=>"rubydebug"}, :is_multi_worker=>false, :events_received=>1, :workers=><Java::JavaUtilConcurrent::CopyOnWriteArrayList:-2006038812 [<LogStash::Outputs::Stdout codec=><LogStash::Codecs::RubyDebug metadata=>false>, workers=>1>]>, :busy_workers=>0}], :thread_info=>[{"thread_id"=>18, "name"=>"[base]>worker1", "plugin"=>["LogStash::Filters::CSV", {"separator"=>"|", "columns"=>["block_type", "block_length", "sensor_id", "event_id", "event_second", "event_microsecond", "signature_id", "generator_id", "signature_revision", "classification_id", "priority_id", "ip_source", "ip_destination", "sport_itype", "dport_icode", "protocol", "impact_flag", "impact", "blocked", "mpls_label", "vlanId", "pad", "policy_uuid", "user_id", "web_application_id", "client_application_id", "application_protocol_id", "firewall_rule_id", "firewall_policy_uuid", "interface_ingress_uuid", "interface_egress_uuid", "security_zone_ingress_uuid", "security_zone_egress_uuid", "connection_second", "connection_instance_id", "connection_counter", "ip_src_country", "ip_dst_country", "num_ioc"]}], "backtrace"=>["[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.4-java/lib/logstash/pipeline.rb:301:in synchronize'", "[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.4-java/lib/logstash/pipeline.rb:301:in inflight_batches_synchronize'", "[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.4-java/lib/logstash/pipeline.rb:226:in worker_loop'", "[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.4-java/lib/logstash/pipeline.rb:193:in start_workers'"], "blocked_on"=>nil, "status"=>"sleep", "current_call"=>"[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.4-java/lib/logstash/pipeline.rb:301:in synchronize'"}], :stalling_threads_info=>[{"thread_id"=>18, "name"=>"[base]>worker1", "plugin"=>["LogStash::Filters::CSV", {"separator"=>"|", "columns"=>["block_type", "block_length", "sensor_id", "event_id", "event_second", "event_microsecond", "signature_id", "generator_id", "signature_revision", "classification_id", "priority_id", "ip_source", "ip_destination", "sport_itype", "dport_icode", "protocol", "impact_flag", "impact", "blocked", "mpls_label", "vlanId", "pad", "policy_uuid", "user_id", "web_application_id", "client_application_id", "application_protocol_id", "firewall_rule_id", "firewall_policy_uuid", "interface_ingress_uuid", "interface_egress_uuid", "security_zone_ingress_uuid", "security_zone_egress_uuid", "connection_second", "connection_instance_id", "connection_counter", "ip_src_country", "ip_dst_country", "num_ioc"]}], "current_call"=>"[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.4-java/lib/logstash/pipeline.rb:301:in synchronize'"}]}>, :level=>:warn}
{:timestamp=>"2016-04-20T16:11:04.542000-0400", :message=>"The shutdown process appears to be stalled due to busy or blocked plugins. Check the logs for more information.", :level=>:error}
{:timestamp=>"2016-04-20T17:14:41.856000-0400", :message=>"Error parsing csv", :field=>"message", :source=>"", :exception=>#<NoMethodError: undefined method `each_index' for nil:NilClass>, :level=>:warn}