Logstash Configuration Error (Failed to execute action)

Hi,

I'm only two days into the ELK Stack, so I know I've probably missed something obvious.
I'm running version 6.2.4 on RHEL 7.4.
I'm getting the following error looping in /var/log/logstash/logstash-plain.log:

[2018-05-11T07:31:21,342][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x407c162a@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:247 sleep>"}
[2018-05-11T07:31:21,437][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, => at line 24, column 19 (byte 881) after filter {\n    csv {\n        columns => [\"date\", \"time\", \"time_taken\", \"c_ip\", \"cs_username\", \"cs_auth_group\", \"s-supplier-name\", \"s-supplier-ip\", \"s-supplier-country\", \"s-supplier-failures\", \"x_exception_id\", \"sc_filter_result\", \"cs_categories\", \"cs_referer\", \"sc_status\", \"s_action\", \"cs_method\", \"rs_content_type\", \"cs_uri_scheme\", \"cs_host\", \"cs_uri_port\", \"cs_uri_path\", \"cs_uri_query\", \"cs_uri_extension\", \"cs_user_agent\", \"s_ip\", \"sc_bytes\", \"cs_bytes\", \"x_virus_id\", \"x_bluecoat_application_name\", \"x_bluecoat_application_operation\"]\n\tseparator => \" \"\n    }\n\n    if [date] and [time] {\n        mutate {\n            add_field => { \"timestamp\" => \"%{date} %{time}\" }\n        }\n        date {\n            match => [\"timestamp\", \"YYYY-MM-dd HH:mm:ss\" ]\n            timezone => ['UTC']\n        }\n    }\n\noutput {\n    elasticsearch ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:51:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:169:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:315:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:299:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:90:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:348:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
[2018-05-11T07:31:21,450][INFO ][logstash.inputs.metrics  ] Monitoring License OK
[2018-05-11T07:31:23,278][INFO ][logstash.pipeline        ] Pipeline has terminated {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x407c162a@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:247 run>"}

But when I run a config test, it passes:

 /usr/share/logstash/bin/logstash -f bluecoat_pipeline.conf --config.test_and_exit
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[INFO ] 2018-05-11 07:40:32.478 [main] scaffold - Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[INFO ] 2018-05-11 07:40:32.513 [main] scaffold - Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[INFO ] 2018-05-11 07:40:32.637 [main] scaffold - Initializing module {:module_name=>"arcsight", :directory=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/x-pack-6.2.4-java/modules/arcsight/configuration"}
[WARN ] 2018-05-11 07:40:32.848 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
Configuration OK
[INFO ] 2018-05-11 07:40:34.236 [LogStash::Runner] runner - Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash

Since I can only post 7,000 characters, I've put my config below; please read on.

The error mentions "line 24, column 19", but that position doesn't match anything in my config file.

Here is my config:

input {
     tcp {
        port => 51401
         }
}
filter {
    # drop comment lines
    if ([message] =~ /^#/) {
      drop{}
    }
    csv {
        columns => ["date", "time", "time_taken", "c_ip", "cs_username", "cs_auth_group", "s_supplier_name", "s_supplier_ip", "s_supplier_country", "s_supplier_failures", "x_exception_id", "sc_filter_result", "cs_categories", "cs_referer", "sc_status", "s_action", "cs_method", "rs_content_type", "cs_uri_scheme", "cs_host", "cs_uri_port", "cs_uri_path", "cs_uri_query", "cs_uri_extension", "cs_user_agent", "s_ip", "sc_bytes", "cs_bytes", "x_virus_id", "x_bluecoat_application_name", "x_bluecoat_application_operation", "x_bluecoat_application_groups", "cs_threat_risk", "x_bluecoat_transaction_uuid"]
        separator => " "
    }
    # parse timestamp
    if [date] and [time] {
        mutate {
            add_field => { "timestamp" => "%{date} %{time}" }
        }
        date {
            match => ["timestamp", "YYYY-MM-dd HH:mm:ss" ]
            timezone => ['UTC']
        }
    }
    # enrich log entry with destination geolocation info
    if ( [s_supplier_ip] and [s_supplier_ip] != "-" ){
        geoip {
            source => "s_supplier_ip"
        }
    } else if ( [cs_host] and [cs_host] != "-" ){
        mutate {
            add_field => {"cs_host_ip" => "%{cs_host}" }
        }
        dns {
            resolve => ["cs_host_ip"]
            action => "replace"
        }
        geoip {
            source => "cs_host_ip"
        }
    }
    # parse User-Agent header
    if ([cs_user_agent] and [cs_user_agent] != "" and [cs_user_agent] != "-") {
        useragent {
            source => "cs_user_agent"
            prefix => "user_agent_"
        }
    }
    # split Blue Coat web site categories into an array
    if ([cs_categories] and [cs_categories] != "" and [cs_categories] != "-") {
        mutate {
            split => { "cs_categories" => ";" }
        }
    }
    # type convert number fields
    mutate {
        convert => ["sc_bytes", "integer",
                     "time_taken", "integer",
                     "r_port", "integer",
                     "s_port", "integer",
                     "cs_bytes", "integer",
                     "duration", "integer"
                   ]
    }
    # cleanup
    mutate {
        remove_field => ["message", "date", "time"]
    }
}
output {
    elasticsearch {
        hosts => ["localhost:9200"]
    }
}

Config created by @airman604 (twitter) from https://medium.com/@airman604/elk-stack-and-blue-coat-logs-part-3-9a36f520b1a

I pretty much copied it and only changed the input, the "columns", and the output.

Please let me know what I missed, or if you need any more info.

Thank you

Perhaps bluecoat_pipeline.conf isn't the only file in your configuration directory (/etc/logstash/conf.d or whatever)?
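
Logstash concatenates every config file it picks up from that directory into a single pipeline, so a leftover copy gets parsed together with the real config. For example (file names here are hypothetical):

$ ls /etc/logstash/conf.d/
bluecoat_pipeline.conf
bluecoat_pipeline_old.conf    <- leftover backup, still loaded and concatenated with the file above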


@magnusbaeck

I was under the impression that Logstash only read the *.conf files, so I had left some backup files in that directory thinking they would just be ignored.
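
It turns out which files get loaded depends on what path.config in /etc/logstash/pipelines.yml points at. This is just a sketch -- my actual file may differ:

# /etc/logstash/pipelines.yml (sketch, package-install paths assumed)
- pipeline.id: main
  # with a glob, only files ending in .conf are loaded:
  path.config: "/etc/logstash/conf.d/*.conf"
  # with a bare directory, every file in it is loaded, backups included:
  # path.config: "/etc/logstash/conf.d"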

After moving the other files, it seems to be working.
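
For anyone else who hits this: I assume a config test that points -f at the whole directory (instead of a single file) would have caught it, since Logstash concatenates everything it finds there, e.g.:

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/ --config.test_and_exit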

Thank you

