Importing a CSV file fails

Hello,

I am trying to import a CSV file into Elasticsearch and visualize it in Kibana, but it doesn't work.

Both Elasticsearch and Kibana are running.

The Logstash config file is:

input {
  file {
    path => "C:/Users/florenth/bureau/analyseFH/donnee_qivivo.csv"
    start_position => "beginning"
    sincedb_path => "NULL"
  }
}
filter {
  csv {
    separator => ";"
    columns => ["Mois","NB_heure_chauffage","Temp_int","Temps_présence","temp_ext"]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "logstash_qivivo_test"
  }
  stdout { codec => rubydebug }
}

The Logstash command and its output are:

Microsoft Windows [version 10.0.18362.778]
(c) 2019 Microsoft Corporation. All rights reserved.

C:\Users\florenth>cd C:\Users\florenth\Desktop\AnnalyseFH\

C:\Users\florenth\Desktop\AnnalyseFH>logstash -f logstash.conf
Sending Logstash logs to C:/ELK/logstash-7.6.2/logs which is now configured via log4j2.properties
[2020-05-11T14:34:05,355][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-05-11T14:34:05,474][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.6.2"}
[2020-05-11T14:34:07,386][INFO ][org.reflections.Reflections] Reflections took 61 ms to scan 1 urls, producing 20 keys and 40 values
[2020-05-11T14:34:10,192][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-05-11T14:34:10,376][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-05-11T14:34:10,441][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-05-11T14:34:10,445][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-05-11T14:34:10,502][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2020-05-11T14:34:10,572][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-05-11T14:34:10,622][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been created for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2020-05-11T14:34:10,632][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["C:/Users/florenth/Desktop/AnnalyseFH/logstash.conf"], :thread=>"#<Thread:0x1e536c2f run>"}
[2020-05-11T14:34:10,679][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-05-11T14:34:12,196][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-05-11T14:34:12,260][INFO ][filewatch.observingtail  ][main] START, creating Discoverer, Watch with file and sincedb collections
[2020-05-11T14:34:12,283][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-05-11T14:34:12,645][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Could you help me?

That should be "NUL", not "NULL". On Windows, NUL is the null device, so `sincedb_path => "NUL"` discards the sincedb state; `"NULL"` is just a regular file name. If that does not help, set log.level to trace and see what the filewatch module logs.
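With that change applied, the file input from the original post would look like this (a sketch; the path is kept exactly as posted):

```
input {
  file {
    path => "C:/Users/florenth/bureau/analyseFH/donnee_qivivo.csv"
    start_position => "beginning"
    # NUL is the Windows null device; on Linux/macOS you would use "/dev/null"
    sincedb_path => "NUL"
  }
}
```

And to raise the log level to trace without editing logstash.yml, you can pass it on the command line:

```
logstash -f logstash.conf --log.level trace
```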

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.