Error parsing csv - NoMethodError

Hello,

I am getting a lot of the following messages in Logstash's logs:

[WARN ][logstash.filters.csv     ] Error parsing csv {:field=>"message", :source=>"", :exception=>#<NoMethodError: undefined method `each_index' for nil:NilClass>}
[WARN ][logstash.filters.csv     ] Error parsing csv {:field=>"message", :source=>"", :exception=>#<NoMethodError: undefined method `each_index' for nil:NilClass>}
[WARN ][logstash.filters.csv     ] Error parsing csv {:field=>"message", :source=>"", :exception=>#<NoMethodError: undefined method `each_index' for nil:NilClass>}

I have the impression that there is always the same number of error lines.

Here is my Logstash config:

input {
  file {
    path => "/data/logstash/stats/ALERTEP_JOUR_*.csv"
    type => "alertp"
    max_open_files => 30000
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  if [type] == "alertp" {
    csv {
      separator => ";"
      skip_empty_columns => true
      columns => [ "POID", "ID0", "TYPE", "POIEV", "CREATED", "MO_T", "READ_ACCESS", "WRITE_ACCESS", "ACOBJ_DB", "AC_OBJ_ID0", "AC_OBJ_TYPE", "AC_OBJ_REV", "CREOR", "CREMIT", "CURREL", "DURATION", "DATE_DELETE", "MESSAGE_ID", "SDN", "PERCENTAGE", "MESSAGE_PRIORITY", "RESOURCE", "MESSAGE_STATUS", "MESSAGE_DATE_SENT", "MESSAGE_TEXT", "SO_NAME", "DATE_SENT", "STATUS", "DATE_ACK", "INDICATOR_VALUE", "MESSAGE_TYPE" ]
    }
    if [message] =~ "\bPOID\b" {   #DELETE HEADER
      drop { }
    }
    if [message] =~ "\brows\b" {  #DELETE useless line
      drop { }
    }
    if [message] =~ /^\s*$/ {   #DELETE BLANK LINE
      drop { }
    }
    if "_csvparsefailure" in [tags] {   #DELETE ALL MESSAGE when parsing is faillure
      drop { }
    }
    date {
      match => [ "CREATED" , "UNIX" ]
      #remove_field => ["CREATED"]
    }
  }
}
output {
  if [type] == "alertp" {
    elasticsearch {
      hosts => ["opm1zels01.com:9200","opm1zels02.com:9200","opm1zels03.com:9200"]
      index => "alertp-%{+YYYY.MM.dd}"
    }
  }
}

And here are some example lines:

101;290605206;/suivi_alerte;3;1503445487;1503445553;L;L;101;14218737171;/accnt;95;-52224;0;0;172800;1503618346;101018818505;000081167549;99,87134;1;2030076;4;1503445546;http://bl2.com;;1503445549;01;1503445553;0;2
101;290629;/suivi_alerte;3;1503446025;1503446045;L;L;101;77092760;/accnt;0;;0;0;172800;1503618839;101018818538;000071297691;;1;3;1;1503446039;mobe.;L24;1503446042;00;1503446045;0;3
101;2905383851;/suivi_alerte;3;1503446347;1503446357;L;L;101;19016637;/accnt;0;-512000;0;-48684;172800;1503619143;101018818573;00000028308;90,4532;1;2030104;1;1503446343;http://bl2.com;;1503446354;00;1503446357;0;1
101;29056547;/suivi_alerte;3;1503448124;1503448777;L;L;101;149078570958;/accnt;55;-52224;0;0;172800;1503620980;101018818711;0009469010;80,356780;1;2030076;1;1503448180;http://bl.com;;1503448772;00;1503448777;0;2

My hypotheses about the errors:

=> negative numeric values create errors
=> the slash "/" in the lines creates errors
=> IP addresses in the lines create errors
=> percentages (99,8912 / 80,2370 ...) create errors

What do you think about that?

I also use this mapping in Elasticsearch:

"mappings": {
    "_default_": {
      "dynamic_templates": [
        {
          "strings_as_keywords": {
            "match_mapping_type": "string",
            "mapping": {
              "type": "keyword"
            }
          }
        }
      ],
      "_all": {
        "enabled": false
      }
    }
  }
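
For reference, a mapping like this is typically installed as an index template so it applies to each daily alertp-* index. A rough sketch only (the template name is just an example, and on Elasticsearch 5.x the index pattern key is "template" instead of "index_patterns"):

PUT _template/alertp
{
  "index_patterns": ["alertp-*"],
  "mappings": {
    "_default_": {
      "dynamic_templates": [
        {
          "strings_as_keywords": {
            "match_mapping_type": "string",
            "mapping": {
              "type": "keyword"
            }
          }
        }
      ],
      "_all": {
        "enabled": false
      }
    }
  }
}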

Wild guess here - do you think you might have some blank lines in the CSV files?

That was it, the solution is:

Put the blank-line drop (the "if" that matches blank lines) at the beginning of the filter, before the csv filter.
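
In other words, move the blank-line check above the csv filter so it never sees an empty message. Roughly like this (same columns as in the config above, just reordered):

filter {
  if [type] == "alertp" {
    if [message] =~ /^\s*$/ {   #DELETE BLANK LINE first, before the csv filter runs
      drop { }
    }
    csv {
      separator => ";"
      skip_empty_columns => true
      columns => [ "POID", "ID0", "TYPE", "POIEV", "CREATED", "MO_T", "READ_ACCESS", "WRITE_ACCESS", "ACOBJ_DB", "AC_OBJ_ID0", "AC_OBJ_TYPE", "AC_OBJ_REV", "CREOR", "CREMIT", "CURREL", "DURATION", "DATE_DELETE", "MESSAGE_ID", "SDN", "PERCENTAGE", "MESSAGE_PRIORITY", "RESOURCE", "MESSAGE_STATUS", "MESSAGE_DATE_SENT", "MESSAGE_TEXT", "SO_NAME", "DATE_SENT", "STATUS", "DATE_ACK", "INDICATOR_VALUE", "MESSAGE_TYPE" ]
    }
    if [message] =~ "\bPOID\b" {   #DELETE HEADER
      drop { }
    }
    if [message] =~ "\brows\b" {  #DELETE useless line
      drop { }
    }
    if "_csvparsefailure" in [tags] {   #DELETE ALL MESSAGES when parsing fails
      drop { }
    }
    date {
      match => [ "CREATED" , "UNIX" ]
    }
  }
}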
