Logstash Index error : [logstash-*] IndexNotFoundException[no such index]

Yes, I tried this :
logstash -e 'input { stdin { } } output { stdout {} }'
and it is working fine.

And I added this in my output:
output { elasticsearch { hosts => ["localhost:9200"] } stdout { codec => rubydebug } }
But it still doesn't work, because here:
http://localhost:9200/_cat/indices
I have only this :
yellow open .kibana 1 1 1 0 3.1kb 3.1kb

I tried the same steps in Ubuntu and it is working immediately.

Thank you for your time

Correction:
I tried the same steps in Ubuntu and it was working.
Then I deleted the index in Elasticsearch with:
curl -XDELETE http://localhost:9200/logstash-2015.12.30/
and tried to recreate it with a different config file, but Logstash didn't send the new index to Elasticsearch.

Does anyone know why?

Hi
I am in the same situation, having the same problem on Windows.
I followed the instructions and they don't work.
Logstash is not creating the index in Elasticsearch.
Why?

I don't know why.

To make Logstash read and process your input every time you run it, set the "sincedb_path" option to /dev/null (cit.)

But I found this solution:
input { file { path => "/path/to/logstash-tutorial.log" start_position => beginning sincedb_path => "/dev/null" } }
and it is working.

Thank you.
I did
sincedb_path => "/dev/null"
and Logstash created the index in Elasticsearch.

However, Logstash keeps reading the file and sending it, as if in a loop.
I made the input file with one line.
Now I have a thousand identical lines (hits) in Elasticsearch.

I tried only this example
https://www.elastic.co/guide/en/logstash/current/advanced-pipeline.html
with their logs, and it is working fine.

Now I am starting to "play" with different logs.
I'll let you know.

yes.
I am trying this example.
I used the one line they provided.
When I set
sincedb_path => "/dev/null"
Logstash kept sending the content again and again,
because on Windows there is no /dev/null.

I tried
sincedb_path => "nul"
and it works so far.
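The reason "/dev/null" causes endless re-reads can be seen directly on a POSIX shell: the device discards everything written to it, so an offset "saved" there is never actually retained. A minimal check:

```shell
# On POSIX systems, /dev/null discards all writes,
# so a sincedb pointed there never stores any offset.
printf '123456' > /dev/null      # attempt to persist an offset
wc -c < /dev/null                # prints 0: nothing was retained
```

On Windows there is no such device at that path, which is why the "nul" device (or a real writable file) is needed instead.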

You both need to understand that sincedb keeps track of how far LS has processed in any file it reads; by setting it to /dev/null you are saying that you don't want to track that progress.
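That offset-tracking behaviour can be sketched in plain shell. This is an analogy only, using throwaway temp files, not Logstash's actual sincedb file format: persist a byte offset after each run, and resume from it on the next run.

```shell
# Analogy for sincedb-style offset tracking (not the real sincedb format).
LOG=$(mktemp); SINCEDB=$(mktemp)
printf 'line1\n' >> "$LOG"
wc -c < "$LOG" | tr -d ' ' > "$SINCEDB"   # remember how far we have read
printf 'line2\n' >> "$LOG"                # new data arrives later
OFFSET=$(cat "$SINCEDB")
tail -c +"$((OFFSET + 1))" "$LOG"         # resume: emits only "line2"
rm -f "$LOG" "$SINCEDB"
```

If the "remember" step writes to /dev/null instead of a real file, every run starts from offset 0 and the whole file is re-sent.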

Hi @warkolm,
Let me share what I infer from your reply:
when sincedb is set to nul, every time Logstash runs it will start reading from the beginning of the file, which will cause duplicate entries.
Right?

The reason I am touching sincedb is that I could not get Logstash to create an index in Elasticsearch.
Any suggestion to solve this problem?

If you set it to /dev/null it will.

Hi @yahoo,
Did you resolve this problem: "Logstash is run it will start reading from the beginning of the file"?

If yes, could you explain to me how, please?

Yes, I did, after many trials and roaming through the web.
I am in a Windows environment.
What worked for me is pointing the sincedb path to a file in an area where I have read/write/modify (r/w/m) permission:

sincedb_path => "D:\dir1\dir2\..\.sincedb_gpuz" 

Please let me know if it works for you

Thank you,
I did the same, but I made a mistake; see my post:
Duplicated logs from logstash after append logs - #5 by carmelom

and I sorted it out with this command line:

echo "test log message" >> logfile.log
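The trick works because `>>` appends new bytes without truncating the file, so the file input sees fresh data past its recorded offset. A quick check of that behaviour (the file name here is illustrative):

```shell
# '>>' appends: earlier content stays, and only the new bytes are "unread"
LOGFILE=$(mktemp)
echo "first log message" >> "$LOGFILE"
echo "test log message"  >> "$LOGFILE"
wc -l < "$LOGFILE"               # 2 lines: nothing was overwritten
tail -n 1 "$LOGFILE"             # only the newly appended line is new data
rm -f "$LOGFILE"
```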

PS: my Logstash was not working very well on Windows, so I moved to Ubuntu, which is easier to manage.

This is what is in my logstash conf file. Indexes are not getting created in Elasticsearch; I am getting the no-such-index error (http://localhost:9200/logstash-2016.01.19/_search?q=response=200). I am using Windows 7. What am I doing wrong? Please help.

input {
  file {
    path => "C:\Users\rx1234\Documents\tmp\logstash-tutorial"
    start_position => beginning
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  geoip {
    source => "clientip"
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout {}
}

Try adding this line to your configuration:

either:
sincedb_path => "/dev/null" (this may be Linux-only)
or:
sincedb_path => "null" (this may be the one for Windows)
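For context, a sketch of where the option goes inside the file input block (the path is a placeholder; on Windows, a writable file path or the "nul" device was reported above to work where /dev/null does not):

```
input {
  file {
    path => "C:/path/to/logstash-tutorial-dataset.txt"
    start_position => "beginning"
    sincedb_path => "null"
  }
}
```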

Then verify whether Logstash is creating the index:
curl -XGET http://localhost:9200/_cat/indices?v
or
http://localhost:9200/_cat/indices?v

I added
sincedb_path => "null"
to the config file. Output from Logstash is pasted below. Still no index. Since the stdout {} statement is also in the conf file, if the index did get created, shouldn't there be some stdout output?

Any other ideas?

C:\Users\rx1234\Documents\logstash-2.1.1\bin>logstash -f first-pipeline.conf
io/console not supported; tty will not be manipulated
Settings: Default filter workers: 4
Logstash startup completed

Found the problem: the input file needed an extension. The ".txt" was missing earlier:

logstash-tutorial-dataset.txt
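A pre-flight check like this can catch the same mistake early: confirm that the exact path, extension included, actually exists before starting Logstash. The path below is illustrative:

```shell
# Fail fast if the configured input file (with its extension) is missing
INPUT="${TMPDIR:-/tmp}/logstash-tutorial-dataset.txt"   # illustrative path
touch "$INPUT"                                          # stand-in for the real dataset
if [ -f "$INPUT" ]; then
  echo "ok: $INPUT exists"
else
  echo "missing: check the file name and extension" >&2
fi
rm -f "$INPUT"
```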

I have actually been trying the example explained here: Parsing Logs with Logstash | Logstash Reference [8.11] | Elastic

and like most of you, I got the Logstash index error.

I added the sincedb_path => "/dev/null" but no luck

I only got this:
logstash-config-files]$ ../logstash-2.3.4/bin/logstash -f ./first-pipeline.conf
Settings: Default pipeline workers: 12
Pipeline main started
.
It stalls like this indefinitely with no output, and checking Elasticsearch shows that the index still doesn't exist:

$ curl -XGET 'localhost:9200/logstash-$DATE/_search?pretty&q=response=200'
{
  "error" : {
    "root_cause" : [ {
      "type" : "index_not_found_exception",
      "reason" : "no such index",
      "resource.type" : "index_or_alias",
      "resource.id" : "logstash-$DATE",
      "index" : "logstash-$DATE"
    } ],
    "type" : "index_not_found_exception",
    "reason" : "no such index",
    "resource.type" : "index_or_alias",
    "resource.id" : "logstash-$DATE",
    "index" : "logstash-$DATE"
  },
  "status" : 404
}

FYI, using stdout {} instead of elasticsearch gives me some output, but I really need to test this with Elasticsearch, and no luck so far.

Bearing in mind that Elasticsearch is running and the log file is in the right path, which I made sure of!

Thanks,