Logstash -f fails to create index - Windows environment


#1

Logstash version: 6.3.0
Running on a Windows 7 64-bit machine.

My config file:


input {
  file {
    path => "c:/lax/station.csv"
    start_position => "beginning"
    sincedb_path => "c:/lax"
  }
}

filter {
  csv {
    separator => ","
    columns => ["id", "name", "slug"]
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "stations"
    document_type => "station_details"
  }
  stdout {}
}


I executed the following command from the Windows command prompt:

logstash -f c:\lax\logstash_csv.config

I saw the output below in the command prompt window, but no index had been created when I checked in Kibana:


Sending Logstash's logs to C:/Users/hg/Downloads/logstash-6.3.0/logs which is now configured via log4j2.properties
[2018-06-29T10:29:57,916][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-06-29T10:29:59,748][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.3.0"}
[2018-06-29T10:30:05,966][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch hosts=>[//localhost], index=>"stations", document_type=>"station_details", id=>"e6d8989aaccfcbb4237999a23f80aa35129f683058d1efa36b41c276769663c6", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_96c96752-bd3c-4934-b063-ac63416f3b53", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2018-06-29T10:30:09,581][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-06-29T10:30:11,363][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-06-29T10:30:11,399][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-06-29T10:30:12,130][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-06-29T10:30:12,278][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-06-29T10:30:12,278][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-06-29T10:30:12,324][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-06-29T10:30:12,501][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-06-29T10:30:12,705][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2018-06-29T10:30:15,081][ERROR][logstash.pipeline        ] Error registering plugin {:pipeline_id=>"main", :plugin=>"<LogStash::Inputs::File path=>["c:/lax/station.csv"], start_position=>"beginning", sincedb_path=>"c:/lax", id=>"d284a4d33a34039f8364fccbe87b708ce950cdb5acc7effbf37b64b6b8387e30", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_ab35b07b-fd5d-480c-91f9-6fae8fed7dad", enable_metric=>true, charset=>"UTF-8">, stat_interval=>1, discover_interval=>15, sincedb_write_interval=>15, delimiter=>"\n", close_older=>3600>", :error=>"The "sincedb_path" argument must point to a file, received a directory: "c:/lax"", :thread=>"#<Thread:0x25337751 run>"}
[2018-06-29T10:30:15,443][ERROR][logstash.pipeline        ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<ArgumentError: The "sincedb_path" argument must point to a file, received a directory: "c:/lax">, :backtrace=>["C:/Users/hg/Downloads/logstash-6.3.0/vendor/bundle/jruby/2.3.0/gems/logstash-input-file-4.0.5/lib/logstash/inputs/file.rb:232:in `register'", "C:/Users/hg/Downloads/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:340:in `register_plugin'", "C:/Users/hg/Downloads/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:351:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "C:/Users/hg/Downloads/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:351:in `register_plugins'", "C:/Users/hg/Downloads/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:498:in `start_inputs'", "C:/Users/hg/Downloads/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:392:in `start_workers'", "C:/Users/hg/Downloads/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:288:in `run'", "C:/Users/hg/Downloads/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:248:in `block in start'"], :thread=>"#<Thread:0x25337751 run>"}
[2018-06-29T10:30:15,555][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}
[2018-06-29T10:30:16,555][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}


(Magnus Bäck) #2

Here's the error message:

The "sincedb_path" argument must point to a file, received a directory
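The file input's sincedb_path must name a writable file, not a directory. A minimal corrected input section, assuming an arbitrary file name such as station.sincedb (the name is just an example; on Windows you can also use "NUL" to discard sincedb state entirely):

input {
  file {
    path => "c:/lax/station.csv"
    start_position => "beginning"
    # Must point to a file; "station.sincedb" is an arbitrary example name.
    sincedb_path => "c:/lax/station.sincedb"
  }
}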


#3

Many thanks, Magnus!


(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.