Unable to send the CSV file to Elasticsearch, and I am not getting any output on the console either

This is the Logstash configuration file:

input {
  file {
    path => "C:\Users\pr389076\Prakhar\cars.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}

filter {
  csv {
    separator => ","
    columns => [ "maker", "model", "mileage", "manufacture_year", "engine_displacement", "engine_power", "body_type", "stk_year", "transmission", "door_count", "seat_count", "fuel_type", "date_created", "date_last_seen", "price_eur" ]
  }
  mutate { convert => ["mileage", "integer"] }
  mutate { convert => ["price_eur", "float"] }
  mutate { convert => ["engine_power", "integer"] }
  mutate { convert => ["door_count", "integer"] }
  mutate { convert => ["seat_count", "integer"] }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "cars"
    document_type => "sold_cars"
    manage_template => false
  }
  stdout {}
}

When I run this command:
logstash -f first-pipeline.config

I am not getting any output in Elasticsearch or on the console.
The console output:
C:\Users\pr389076\Prakhar\logstash-6.4.0\logstash-6.4.0\bin>logstash -f first-pipeline.config
Sending Logstash logs to C:/Users/pr389076/Prakhar/logstash-6.4.0/logstash-6.4.0/logs which is now configured via log4j2.properties
[2018-09-19T14:14:55,712][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-09-19T14:14:57,758][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.0"}
[2018-09-19T14:15:02,730][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"cars", manage_template=>false, id=>"35b602260b15e33fd18ec63192aab428211215cc0125cb5e49d9ede4f2fb762b", hosts=>[//localhost:9200], document_type=>"sold_cars", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_d0a01641-6cc6-4ace-b11f-4c6aab186691", enable_metric=>true, charset=>"UTF-8">, workers=>1, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2018-09-19T14:15:05,581][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-09-19T14:15:07,190][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-09-19T14:15:07,210][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-09-19T14:15:07,830][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-09-19T14:15:08,100][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-09-19T14:15:08,128][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-09-19T14:15:08,228][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-09-19T14:15:09,731][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x3041c624 run>"}
[2018-09-19T14:15:09,905][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-09-19T14:15:09,953][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-09-19T14:15:10,691][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

start_position does not work the way you expect: it only controls where tailing begins for a file the input has not seen before, and in the default tail mode Logstash keeps watching the file for new lines rather than reading it once and finishing.

If you simply want to read a file from beginning to end, try read mode. The relevant settings are listed here, with a sketch of the input block after the list:
mode
file_completed_action
file_completed_log_path
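
A minimal sketch of your input block using read mode. The path is the one from your config (written with forward slashes, which tend to be safer than backslashes in this plugin's path option on Windows), and file_completed.log is a hypothetical log path you would replace with your own:

input {
  file {
    path => "C:/Users/pr389076/Prakhar/cars.csv"
    mode => "read"                    # read the file once, from beginning to end
    file_completed_action => "log"    # keep the file and record its name when finished (the default is "delete")
    file_completed_log_path => "C:/Users/pr389076/Prakhar/file_completed.log"  # required when the completed action logs
    sincedb_path => "NUL"             # Windows equivalent of /dev/null, so read positions are not persisted
  }
}

With this input, start_position is not needed, and your filter and output sections can stay as they are; adding codec => rubydebug to the stdout output makes the events easier to inspect on the console.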
