Hello,
I am new to ELK.
I am using :
- elasticsearch-2.1.0
- logstash-2.1.1
- kibana-4.3.0-windows
I tried to configure ELK to monitor my application logs. I followed different tutorials and different Logstash configurations, but I keep getting this error:
[logstash-*] IndexNotFoundException[no such index]
Yes, I tried this : logstash -e 'input { stdin { } } output { stdout {} }'
and it is working fine.
And I added this output section: output { elasticsearch { hosts => ["localhost:9200"] } stdout { codec => rubydebug } }
But it still doesn't work, because at http://localhost:9200/_cat/indices
I see only this: yellow open .kibana 1 1 1 0 3.1kb 3.1kb
Correction:
I tried the same steps in Ubuntu and it was working.
Then I deleted the index in Elasticsearch with: curl -XDELETE http://localhost:9200/logstash-2015.12.30/
and tried to recreate it with a different config file, but Logstash didn't send the new index to Elasticsearch.
Hi
I am in the same situation and having the same problem on windows.
I followed the instructions and they don't work.
Logstash is not creating the index in Elasticsearch.
Why?
I found this advice (quoted): "To make Logstash read and process your input every time you run it, set the sincedb_path option to /dev/null."
So I used this solution: input { file { path => "/path/to/logstash-tutorial.log" start_position => beginning sincedb_path => "/dev/null" } }
and it is working
Thank you.
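Putting the pieces from this thread together, the working setup can be sketched as a single pipeline file. This is a minimal sketch, not a definitive config: the log path is the tutorial's placeholder, and the output section is the one quoted earlier in the thread.

```conf
# Minimal sketch combining the file input above with the
# elasticsearch/stdout output from earlier in the thread.
# The path is the tutorial placeholder, not a real file.
input {
  file {
    path => "/path/to/logstash-tutorial.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
```

After running Logstash with this file, a logstash-* index should appear in the _cat/indices output.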
I did
sincedb_path => "/dev/null"
and Logstash created the index in Elasticsearch.
However, Logstash keeps reading the file and re-sending it, as if in a loop.
My input file contains only one line.
Now I have a thousand identical lines (hits) in Elasticsearch.
Yes.
I am trying this example.
I used the one line they provided.
When I set
sincedb_path => "/dev/null"
Logstash kept sending the content again and again,
because on Windows there is no /dev/null.
I tried
sincedb_path => "nul"
and it works so far.
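The Windows workaround above can be sketched as an input section like this. It is an assumption-laden sketch: the log path is a hypothetical Windows location, and "nul" is the Windows null device that the poster reports works in place of /dev/null.

```conf
input {
  file {
    # hypothetical Windows path -- adjust for your system
    path => "C:/path/to/logstash-tutorial.log"
    start_position => "beginning"
    # "nul" is the Windows null device, per the workaround above;
    # like /dev/null, it discards the read position every run
    sincedb_path => "nul"
  }
}
```

Note that, as discussed below, discarding the sincedb this way re-reads the whole file on every run.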
You both need to understand that the sincedb file keeps track of how far Logstash has processed each file it reads; by setting sincedb_path to /dev/null, you are saying that you don't want to track that progress.
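A sketch of the alternative this explanation points at: give sincedb_path a real, writable file so read offsets persist between runs and lines are not re-ingested. The specific file path here is an assumption; any writable location works.

```conf
input {
  file {
    path => "/path/to/logstash-tutorial.log"
    start_position => "beginning"
    # Persist read offsets so restarts resume where they left off
    # instead of re-reading the whole file each run.
    # This path is an assumption -- use any writable location.
    sincedb_path => "/var/lib/logstash/sincedb-tutorial"
  }
}
```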
Hi @warkolm,
Let me share what I infer from your reply:
when sincedb_path is set to nul, every time Logstash runs it starts reading from the beginning of the file, which causes duplicate entries.
Right?
The reason I am touching sincedb in the first place is that I could not get Logstash to create an index in Elasticsearch.
Any suggestion to solve this problem?