Failing to import a simple CSV

Hello, I'm trying to load some data from a CSV file to start learning Elasticsearch, but I've been having problems since yesterday and I can't import the data. This is what I did:

I downloaded the ZIP file and extracted it. I'm using Windows 7. I uncommented the heap size settings in the config/jvm.options file:

-Xms4g
-Xmx4g

I ran elasticsearch.exe and visited http://localhost:9200/ in my browser, and I got:

{
  "name" : "D2-iPGl",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "uRuuK5voT3uZ3QvoHzf3QQ",
  "version" : {
    "number" : "6.5.4",
    "build_flavor" : "default",
    "build_type" : "zip",
    "build_hash" : "d2ef93d",
    "build_date" : "2018-12-17T21:17:40.758843Z",
    "build_snapshot" : false,
    "lucene_version" : "7.5.0",
    "minimum_wire_compatibility_version" : "5.6.0",
    "minimum_index_compatibility_version" : "5.0.0"
  },
  "tagline" : "You Know, for Search"
}
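For what it's worth, the same check works from the command line, and it can also confirm that the heap settings from jvm.options took effect (this assumes curl is available on the machine):

curl "http://localhost:9200/_nodes/jvm?pretty"

The response lists jvm.mem.heap_init_in_bytes and jvm.mem.heap_max_in_bytes per node, which should reflect the 4g setting.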

Then I ran:

bin\logstash -f files\logstash_products.conf

The file has the content from https://github.com/pranav-shukla/learningelasticstack/tree/master/chapter-03#import-product-data-into-elasticsearch, with some modifications to the paths, since I'm running it on Windows:

input {
  file {
    path => "C:/Users/myusername/logstash-6.5.4/files/products.csv"
    start_position => "beginning"
    sincedb_path => "null"
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
}

filter {
  csv {
    separator => ","
    columns => ["id","title","description","manufacturer","price"]
  }

  mutate {
    remove_field => ["@version","@timestamp","path","host","tags","message"]
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "amazon_products"
    document_type => "products"
  }
  stdout {}
}
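For context, the columns option in the csv filter supplies the field names, so each line of products.csv is expected to be plain data. A representative row (made-up values, just to illustrate the format) would look like:

1,"Example Product","A short description","Acme",19.99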

And the output is always the same:

[2019-01-18T16:18:09,682][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x487627f8 run>"}
[2019-01-18T16:18:09,754][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-01-18T16:18:09,769][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-01-18T16:18:10,079][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

And then it just seems to freeze.
Where can I get more feedback to find out what's wrong? What should I do?
Thanks in advance,

That should be

sincedb_path => "NUL"

On Windows, NUL is the null device, so no sincedb state is persisted and the file is read from the beginning on every run. With sincedb_path => "null", Logstash instead writes its state to an ordinary file named null, and once that file records products.csv as fully read, the events are never emitted again.
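A minimal corrected input block, keeping everything else the same:

input {
  file {
    path => "C:/Users/myusername/logstash-6.5.4/files/products.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
}

If the earlier runs left behind an ordinary file named null, it's worth deleting it before re-running, since it may already record the CSV as fully read. After re-running, the import can be verified with a count query (again assuming curl is available):

curl "http://localhost:9200/amazon_products/_count?pretty"

And for more feedback in general, Logstash can be started with a more verbose log level, e.g. bin\logstash -f files\logstash_products.conf --log.level debug.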
