Logstash ERROR with CSV parsing in a Pod

Hey there,

I'm currently trying to automatically ingest a CSV file when launching a Logstash pod connected to ECK, and I get the following error:

[ERROR] 2021-06-28 07:25:43.494 [Converge PipelineAction::Create<main>] agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \\t\\r\\n], \"#\", [A-Za-z0-9_-], '\"', \"'\", [A-Za-z_], \"-\", [0-9], \"[\", \"{\" at line 3, column 13 (byte 30) after input {\n  file {\n    path => ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:187:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:72:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:389:in `block in converge_state'"]}
[INFO ] 2021-06-28 07:25:43.492 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
[INFO ] 2021-06-28 07:25:43.595 [LogStash::Runner] runner - Logstash shut down.

The CSV structure is the following:

header1,header2,header3,header4,header5,header6,header7,header8
0,3,Mr. Surname1 Name1,male,22,1,0,7.25
1,1,Mrs. Surname2 (Second surname) Name2,female,38,1,0,71.2833
[...]

And here is the conf:

  logstash.conf: |
    input {
      file {
        path => /data/file.csv
        start_position => "beginning"
        sincedb_path => "/dev/null"
      }
    }
    filter {
      csv {
        separator => ","
        skip_header => "true"
        columns => [
          "header1",
          "header2",
          "header3",
          "header4",
          "header5",
          "header6",
          "header7",
          "header8"
        ]
      }
    }
    output {
      elasticsearch {
        index => "index1"
        hosts => [ "${ES_HOSTS}" ]
        user => "${ES_USER}"
        password => "${ES_PASSWORD}"
        cacert => '/etc/logstash/certificates/ca.crt'
      }
    }
    stdout {}

The confs and the file are loaded with ConfigMaps, and they're indeed present in the pod when it launches, so Logstash should have access to them.

From what I understand, the issue is with the parsing itself, but the structure doesn't seem complicated, and I don't understand what could be wrong with the filter...

Any idea what could be happening?

Thanks in advance for your help!

Cheers,

Julien

Hi,

Put quotes around /data/file.csv.
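For example, the input block from your config with the path quoted (everything else unchanged):

    input {
      file {
        path => "/data/file.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
      }
    }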

Cad.


Thanks a lot! It was indeed simply that... I never quite get when quotes/double quotes are needed in YAML or not, so I guess I'll just default to using them from now on.
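For what it's worth, the quoting requirement here comes from the Logstash pipeline configuration language itself rather than from YAML: string values in plugin settings need to be quoted, and single or double quotes both work. For example:

    path => "/data/file.csv"
    path => '/data/file.csv'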
