Logstash is unable to import a CSV file

I am trying to import a CSV file into Elasticsearch using Logstash. No error is displayed on screen after running logstash.bat, but no index is created in Elasticsearch. Please help.

Below is the content of the last log file.
[2018-10-01T00:21:52,539][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.1"}
[2018-10-01T00:21:59,852][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-10-01T00:22:00,730][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-10-01T00:22:00,743][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-10-01T00:22:01,164][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-10-01T00:22:01,300][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-10-01T00:22:01,312][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-10-01T00:22:01,436][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-10-01T00:22:01,404][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2018-10-01T00:22:01,651][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-10-01T00:22:02,983][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/Users/ahkartika/Downloads/logstash-6.4.1/data/plugins/inputs/file/.sincedb_c810ab01ee71279c8ef52f6ad226d496", :path=>["C:\TESLA\ELASTIC\data\cars.csv"]}
[2018-10-01T00:22:03,053][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x6f6a12e4 run>"}
[2018-10-01T00:22:03,172][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-10-01T00:22:03,173][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-10-01T00:22:03,845][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Does your file input configuration include a start_position => "beginning" directive? By default the file input plugin runs in "tail" mode and only emits events for lines added to the file after it has been opened.

You can also switch it to mode => "read" so that it reads whatever is present in the file when it is opened, instead of continuing to wait for new lines to be appended (a combined sketch follows the sincedb note below).

You may need to delete the sincedb file in order to get the plugin to "forget" where it last left off:

C:/Users/ahkartika/Downloads/logstash-6.4.1/data/plugins/inputs/file/.sincedb_c810ab01ee71279c8ef52f6ad226d496
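
For reference, here is a minimal sketch of a file input combining all three suggestions. It assumes the CSV sits at the path shown in your log (written with forward slashes), so adjust it to wherever the file actually lives:

input {
  file {
    path => "C:/TESLA/ELASTIC/data/cars.csv"   # assumed location, taken from your log
    mode => "read"                             # read what is in the file now instead of tailing it
    start_position => "beginning"              # only relevant in tail mode, harmless in read mode
    sincedb_path => "NUL"                      # don't persist the read position
  }
}

One caveat: if I remember the 6.4 file input docs correctly, read mode's file_completed_action defaults to deleting the file once it has been read, so set it to "log" (together with a file_completed_log_path) if you want to keep the CSV.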

Thanks for your reply.
I have added start_position => "beginning" and mode => "read", and I also removed the sincedb file.
It still cannot import the CSV file.

Are you ingesting from Windows or Linux? What does your Logstash config file look like?

Both Elasticsearch and Logstash are installed on Windows.
Here is my config file:
input {
  file {
    path => "C:\ELASTIC\data\cars.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => [ "maker", "model", "mileage", "manufacture_year", "engine_displacement", "engine_power", "body_type", "color_slug", "stk_year", "transmission", "door_count", "seat_count", "fuel_type", "date_created", "date_last_seen", "price_eur" ]
  }
  mutate { convert => ["mileage", "integer"] }
  mutate { convert => ["price_eur", "float"] }
  mutate { convert => ["engine_power", "integer"] }
  mutate { convert => ["door_count", "integer"] }
  mutate { convert => ["seat_count", "integer"] }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "cars"
    document_type => "sold_cars"
  }
  stdout {}
}

OK, yeah, I had this problem the other week and spent about two hours trying to figure out why I couldn't ingest a simple CSV. It turns out that the path needs to be in UNIX format (change all "\" to "/"). This seems to have changed between 6.3.x and 6.4.x, as the Windows-style path format used to work.

Change your input to:

input {
  file {
    path => "C:/ELASTIC/data/cars.csv"
    sincedb_path => "NUL"
    start_position => "beginning"
  }
}
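
A note on the sincedb_path => "NUL" line: NUL is the Windows null device, so the plugin never persists how far it has read and will re-ingest the whole file from the beginning each time the pipeline starts (the equivalent on Linux would be "/dev/null"). Remove that line once you want Logstash to remember its position across restarts.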

I am getting the same error on Logstash 6.4.1, but it works on Logstash 6.0.0.
