Hi, I am new to data analytics and trying to do a POC for one of our clients using the ELK stack. My Kibana and Elasticsearch servers are working fine, and Logstash starts when I run bin/logstash -r -f /Users/gitika.dua/Documents/Data_Analytics/logstash-7.0.0/bin/logstash_demo.config. However, I get the error below and my CSV data never makes it into Elasticsearch:
Sending Logstash logs to /Users/gitika.dua/Documents/Data_Analytics/logstash-7.0.0/logs which is now configured via log4j2.properties
[2019-04-23T11:07:49,439][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-04-23T11:07:49,472][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-04-23T11:07:49,497][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-04-23T11:07:49,510][ERROR][logstash.javapipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::File start_position=>"beginning", path=>["/Users/gitika.dua//Users/gitika.dua/Desktop/annexA.csv"], id=>"359754cabed553e6577193e9891db97dcf87b490699bba6068fa4b89e8b6eece", sincedb_path=>"/dev/null", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_d2cae1f6-2ea0-4b52-9fbd-61805aad62dc", enable_metric=>true, charset=>"UTF-8">, stat_interval=>1.0, discover_interval=>15, sincedb_write_interval=>15.0, delimiter=>"\n", close_older=>3600.0, mode=>"tail", file_completed_action=>"delete", sincedb_clean_after=>1209600.0, file_chunk_size=>32768, file_chunk_count=>140737488355327, file_sort_by=>"last_modified", file_sort_direction=>"asc">
Error: Permission denied - Permission denied
Exception: Errno::EACCES
Stack: org/jruby/RubyFile.java:1263:in `utime'
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/fileutils.rb:1133:in `block in touch'
org/jruby/RubyArray.java:1792:in `each'
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/fileutils.rb:1130:in `touch'
/Users/gitika.dua/Documents/Data_Analytics/logstash-7.0.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.10/lib/filewatch/sincedb_collection.rb:22:in `initialize'
/Users/gitika.dua/Documents/Data_Analytics/logstash-7.0.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.10/lib/filewatch/observing_base.rb:62:in `build_watch_and_dependencies'
/Users/gitika.dua/Documents/Data_Analytics/logstash-7.0.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.10/lib/filewatch/observing_base.rb:56:in `initialize'
/Users/gitika.dua/Documents/Data_Analytics/logstash-7.0.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.10/lib/logstash/inputs/file.rb:332:in `start_processing'
/Users/gitika.dua/Documents/Data_Analytics/logstash-7.0.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.10/lib/logstash/inputs/file.rb:337:in `run'
/Users/gitika.dua/Documents/Data_Analytics/logstash-7.0.0/logstash-core/lib/logstash/java_pipeline.rb:297:in `inputworker'
/Users/gitika.dua/Documents/Data_Analytics/logstash-7.0.0/logstash-core/lib/logstash/java_pipeline.rb:290:in `block in start_input'
[2019-04-23T11:07:49,714][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-04-23T11:07:50,517][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
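From the stack trace, the failure seems to happen while the file input touches its sincedb path (fileutils touch raising Errno::EACCES), not while reading the CSV itself. As a sanity check I plan to look at the permissions on both paths from my terminal (macOS, so output may differ on other systems):

# check ownership/permissions of the sincedb target and of the CSV file
ls -l /dev/null
ls -l /Users/gitika.dua/Desktop/annexA.csv
# confirm which user the logstash process runs as
whoami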
Here is my config file:
input {
  file {
    path => "/Users/gitika.dua//Users/gitika.dua/Desktop/annexA.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => [
      "state/data/@class", "state/data/pasReferenceNumber", "state/data/slapNumber", "state/data/farmName",
      "state/data/farmAddress", "state/data/cphhGBPigHerdNumber", "state/data/signed", "state/data/date",
      "state/data/vetAddress", "state/data/ov", "state/data/fsa", "state/data/fbo",
      "state/data/linearId/externalId", "state/data/linearId/id", "state/data/participants/0",
      "state/data/participants/1", "state/data/participants/2", "state/contract", "state/notary",
      "state/encumbrance", "state/constraint/@class", "state/constraint/key", "ref/txhash", "ref/index"
    ]
  }
  mutate { convert => ["state/data/pasReferenceNumber", "integer"] }
  mutate { convert => ["state/data/cphhGBPigHerdNumber", "integer"] }
  mutate { convert => ["state/data/date", "integer"] }
  mutate { convert => ["ref/index", "integer"] }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "demo"
    document_type => "demo_data"
  }
}
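In case it is relevant, this is the variant of the input block I plan to try next: collapsing the doubled home-directory prefix in path and pointing sincedb_path at a regular file my user owns instead of /dev/null (the sincedb file name below is just one I made up; I have not yet confirmed this avoids the permission error):

input {
  file {
    # single absolute path to the CSV (the original value had /Users/gitika.dua duplicated)
    path => "/Users/gitika.dua/Desktop/annexA.csv"
    start_position => "beginning"
    # write the sincedb to a file in a directory I own, rather than /dev/null
    sincedb_path => "/Users/gitika.dua/Documents/Data_Analytics/sincedb_annexA"
  }
}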
Looking forward to your help.