CSV load with Logstash not working

Hello everyone,

I am trying to load a CSV file with Logstash using the following configuration:

############################################################################################################################
input
{
  file
  {
    path => "C:\Users\BEKRISO\KIBANA7.0.1\INPUT\9r_piste_audit.csv"
    start_position => "beginning"
    sincedb_path => "C:\Users\BEKRISO\KIBANA7.0.1\sincedb"
  }
}

############################################################################################################################

filter
{
  csv
  {
    separator => ","
    columns => ["Date et heure","Utilisateur","Code","Libellé évènement","Code retour","Application","Code site","Objet Start","Usage cache","Valeur avant modif","Valeur après modif"]
  }
}

##############################################################################################################################

output
{
  elasticsearch
  {
    hosts => "cas0000658713:9200"
    index => "monbeaunode_1"
  }

  stdout {}
}

############################################################################################################################

and I don't understand why my file isn't loaded. This is what I get as a result:

C:\Users\BEKRISO\KIBANA7.0.1>TITLE Logstash

C:\Users\BEKRISO\KIBANA7.0.1>del .\sincedb

C:\Users\BEKRISO\KIBANA7.0.1>REM set JAVA_HOME=C:\Users\BEKRISO\KIBANA7.0.1\Java\jre1.8.0_72

C:\Users\BEKRISO\KIBANA7.0.1>set JAVA_HOME=C:\Users\BEKRISO\KIBANA7.0.1\Java\jre1.8.0_72

C:\Users\BEKRISO\KIBANA7.0.1>.\logstash\bin\logstash.bat -f logstash7-testbp.conf -l .\log\logstash7-testbp.log --debug --verbose
Sending Logstash logs to .\log\logstash7-testbp.log which is now configured via log4j2.properties
[2019-05-28T17:44:51,570][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-05-28T17:44:51,586][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.0.1"}
[2019-05-28T17:44:59,024][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://cas0000658713:9200/]}}
[2019-05-28T17:44:59,212][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://cas0000658713:9200/"}
[2019-05-28T17:44:59,258][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-05-28T17:44:59,258][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[2019-05-28T17:44:59,274][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//cas0000658713:9200"]}
[2019-05-28T17:44:59,290][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-05-28T17:44:59,305][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x15bcca0 run>"}
[2019-05-28T17:44:59,399][INFO ][logstash.outputs.elasticsearch] Index Lifecycle Management is set to 'auto', but will be disabled - Index Lifecycle management is not installed on your Elasticsearch cluster
[2019-05-28T17:44:59,399][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-05-28T17:45:00,368][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-05-28T17:45:00,462][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-05-28T17:45:00,462][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-05-28T17:45:00,899][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Can someone help me solve this problem, please?

Use forward slashes instead of backslashes in the Windows paths for the file input.
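For example, the input section of the configuration above could look like this (a minimal sketch reusing the paths from your config; adjust them to your actual layout):

input {
  file {
    # Forward slashes work for Windows paths in the file input
    path => "C:/Users/BEKRISO/KIBANA7.0.1/INPUT/9r_piste_audit.csv"
    start_position => "beginning"
    # On Windows, "NUL" can also be used here if you don't want to persist the sincedb while testing
    sincedb_path => "C:/Users/BEKRISO/KIBANA7.0.1/sincedb"
  }
}

Since your batch script already deletes the old sincedb before each run, Logstash should then read the file from the beginning on the next start.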


Thank you for your help; it works now.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.