CSV Import - Keys sporadically wrong

Hello,

I have this simple csv test file:

D1,A1,A2,A3
15.2.2019,Firstname1,Lastname1,1200
16.2.2019,Firstname2,Lastname2,2200

which I import with this config file:

input {
  file {
    path => "/Users/tbra/appl/elastic/logstash-6.6.0/input/test.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    autodetect_column_names => true
    autogenerate_column_names => true
  }
}

output {
  stdout {}
}

When I start Logstash I receive this correct output:

{
    "A2" => "Lastname1",
    "A3" => "1200",
    "path" => "/Users/tbra/appl/elastic/logstash-6.6.0/input/test.csv",
    "@timestamp" => 2019-02-15T12:41:15.762Z,
    "@version" => "1",
    "message" => "15.2.2019,Firstname1,Lastname1,1200",
    "host" => "TBRMac.local",
    "D1" => "15.2.2019",
    "A1" => "Firstname1"
}
{
    "A2" => "Lastname2",
    "A3" => "2200",
    "path" => "/Users/tbra/appl/elastic/logstash-6.6.0/input/test.csv",
    "@timestamp" => 2019-02-15T12:41:15.763Z,
    "@version" => "1",
    "message" => "16.2.2019,Firstname2,Lastname2,2200",
    "host" => "TBRMac.local",
    "D1" => "16.2.2019",
    "A1" => "Firstname2"
}

After restarting Logstash I receive this incorrect output:

{
    "host" => "TBRMac.local",
    "15.2.2019" => "D1",
    "1200" => "A3",
    "@version" => "1",
    "path" => "/Users/tbra/appl/elastic/logstash-6.6.0/input/test.csv",
    "@timestamp" => 2019-02-15T12:42:00.346Z,
    "Firstname1" => "A1",
    "message" => "D1,A1,A2,A3",
    "Lastname1" => "A2"
}
{
    "host" => "TBRMac.local",
    "15.2.2019" => "16.2.2019",
    "1200" => "2200",
    "@version" => "1",
    "path" => "/Users/tbra/appl/elastic/logstash-6.6.0/input/test.csv",
    "@timestamp" => 2019-02-15T12:42:00.432Z,
    "Firstname1" => "Firstname2",
    "message" => "16.2.2019,Firstname2,Lastname2,2200",
    "Lastname1" => "Lastname2"
}

What am I doing wrong?
Thanks, best regards
Thomas

P.S: I am using Logstash 6.6.0

Have you set --pipeline.workers 1?
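For reference, the worker count can be set either per run on the command line or persistently in logstash.yml; the config file name below is just an example:

```shell
# One-off, on the command line (test.conf is your pipeline config):
bin/logstash -f test.conf --pipeline.workers 1

# Or persistently, by adding this line to config/logstash.yml:
#   pipeline.workers: 1
```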

Hello Badger,

thank you - that was the solution 🙂

Maybe this important precondition should be mentioned prominently in the csv plugin documentation.

Best regards
Thomas

I agree. There is an issue open for the underlying problem, but it would be good to get the requirement into the documentation, because the issue is unlikely to be fixed, so the requirement will not change.
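To illustrate the underlying problem, here is a minimal Python sketch (not actual Logstash code; the class and names are hypothetical) of what autodetect_column_names effectively does: the filter learns the column names from the first event it happens to process, so with more than one pipeline worker a data row can reach the filter before the header row and its values get mistaken for column names.

```python
class CsvFilter:
    """Hypothetical stand-in for the csv filter with autodetect enabled."""

    def __init__(self):
        self.columns = None  # learned from the first event the filter sees

    def process(self, line):
        values = line.split(",")
        if self.columns is None:   # whatever arrives first becomes the header
            self.columns = values
            return None            # the header row emits no data event
        return dict(zip(self.columns, values))

lines = [
    "D1,A1,A2,A3",
    "15.2.2019,Firstname1,Lastname1,1200",
]

# One worker: events arrive in file order, the header is detected correctly.
f = CsvFilter()
ordered = [f.process(l) for l in lines]
print(ordered[1])   # {'D1': '15.2.2019', 'A1': 'Firstname1', ...}

# Multiple workers: a data row may be processed before the header row,
# so its values become the column names - the broken output above.
f = CsvFilter()
shuffled = [f.process(l) for l in reversed(lines)]
print(shuffled[1])  # {'15.2.2019': 'D1', 'Firstname1': 'A1', ...}
```

With a single worker the ordering is deterministic, which is why --pipeline.workers 1 fixes it.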

Agreed that it should at least be documented: https://github.com/logstash-plugins/logstash-filter-csv/pull/70

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.