Hello
I'm using Logstash 6.1.0 to parse CSV files and send them to Elasticsearch.
With Elasticsearch 2.1.0 this works fine, but with Elasticsearch 6.1.0 I get a Logstash error:
[2017-12-22T17:34:04,553][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-12-22T17:34:04,578][ERROR][logstash.outputs.elasticsearch] Failed to install template. {:message=>"Template file '' could not be found!", :class=>"ArgumentError", :backtrace=>[
  "C:/appl/elk/logstash-6.1.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.2-java/lib/logstash/outputs/elasticsearch/template_manager.rb:31:in `read_template_file'",
  "C:/appl/elk/logstash-6.1.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.2-java/lib/logstash/outputs/elasticsearch/template_manager.rb:17:in `get_template'",
  "C:/appl/elk/logstash-6.1.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.2-java/lib/logstash/outputs/elasticsearch/template_manager.rb:7:in `install_template'",
  "C:/appl/elk/logstash-6.1.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.2-java/lib/logstash/outputs/elasticsearch/common.rb:57:in `install_template'",
  "C:/appl/elk/logstash-6.1.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.2-java/lib/logstash/outputs/elasticsearch/common.rb:26:in `register'",
  "C:/appl/elk/logstash-6.1.0/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:9:in `register'",
  "C:/appl/elk/logstash-6.1.0/logstash-core/lib/logstash/output_delegator.rb:43:in `register'",
  "C:/appl/elk/logstash-6.1.0/logstash-core/lib/logstash/pipeline.rb:343:in `register_plugin'",
  "C:/appl/elk/logstash-6.1.0/logstash-core/lib/logstash/pipeline.rb:354:in `block in register_plugins'",
  "org/jruby/RubyArray.java:1734:in `each'",
  "C:/appl/elk/logstash-6.1.0/logstash-core/lib/logstash/pipeline.rb:354:in `register_plugins'",
  "C:/appl/elk/logstash-6.1.0/logstash-core/lib/logstash/pipeline.rb:743:in `maybe_setup_out_plugins'",
  "C:/appl/elk/logstash-6.1.0/logstash-core/lib/logstash/pipeline.rb:364:in `start_workers'",
  "C:/appl/elk/logstash-6.1.0/logstash-core/lib/logstash/pipeline.rb:288:in `run'",
  "C:/appl/elk/logstash-6.1.0/logstash-core/lib/logstash/pipeline.rb:248:in `block in start'"]}
[2017-12-22T17:34:04,587][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2017-12-22T17:34:04,627][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x394787d5 run>"}
[2017-12-22T17:34:05,353][INFO ][logstash.pipeline        ] Pipeline started {"pipeline.id"=>"main"}
[2017-12-22T17:34:05,491][INFO ][logstash.agent           ] Pipelines running {:count=>1, :pipelines=>["main"]}
[2017-12-22T17:34:07,245][ERROR][logstash.outputs.elasticsearch] Encountered a retryable error. Will Retry with exponential backoff {:code=>401, :url=>"http://localhost:9200/_bulk"}
Here is my logstash.conf:
input {
  file {
    path => ["C:/file/current/*.csv"]
    sincedb_path => "....\data\logst\since.db"
    start_position => "beginning"
    ignore_older => 0
  }
}
filter {
  csv {
    separator => ";"
    source => "message"
    columns => ["ID","Value","Timestamp","DateAction","ServerName"]
    remove_field => ["message","host","path","@version","@timestamp"]
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "myindex"
    document_type => "mytype"
  }
  stdout { codec => rubydebug }
}
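The 401 on /_bulk makes me wonder whether the cluster expects authentication (I'm not sure it does on this machine). If that were the cause, I suppose the output section would need credentials, roughly like this sketch (the user name and password below are just placeholders, not my real settings):

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "myindex"
    document_type => "mytype"
    # assumption: only relevant if security is enabled on the cluster;
    # "logstash_writer" / "changeme" are placeholder credentials
    user => "logstash_writer"
    password => "changeme"
  }
  stdout { codec => rubydebug }
}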
My CSV file looks like this:
ID;Value;Timestamp;DateAction;ServerName
numeric.saw.uint8;12;2017-12-22T16:59:58;2017-12-22T17:00:01;PLENO-PBHFLBG
numeric.saw.int8;-77;2017-12-22T16:59:58;2017-12-22T17:00:01;PLENO-PBHFLBG
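For reference, this is what I expect the stdout rubydebug output to look like for the first data row, given the filter above (just my understanding; all values stay strings because I don't use a convert option, and message/host/path/@version/@timestamp are removed):

{
         "ID" => "numeric.saw.uint8",
      "Value" => "12",
  "Timestamp" => "2017-12-22T16:59:58",
 "DateAction" => "2017-12-22T17:00:01",
"ServerName" => "PLENO-PBHFLBG"
}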
Regards
Driss