Logstash Index error : [logstash-*] IndexNotFoundException[no such index]

Hi @yahoo,
Did you resolve this problem: "when Logstash is run it will start reading from the beginning of the file"?

If yes, could you explain to me how, please?

Yes, I did, after many trials and roaming the web.
I am in a Windows environment.
What worked for me was pointing sincedb_path at a file in an area where I have read/write/modify permission:

sincedb_path => "D:\dir1\dir2\..\.sincedb_gpuz" 

Please let me know if it works for you

Thank you,
I did the same, but I made a mistake; see my post:
Duplicated logs from logstash after append logs - #5 by carmelom

and I sorted it out with this command line:

echo "test log message" >> logfile.log

PS: my Logstash was not working very well on Windows, so I moved to Ubuntu, where it is easier to manage.

This is what is in my Logstash config file. Indexes are not getting created in Elasticsearch, and I am getting the "no such index" error (http://localhost:9200/logstash-2016.01.19/_search?q=response=200). I am using Windows 7. What am I doing wrong? Please help.

input {
  file {
    path => "C:\Users\rx1234\Documents\tmp\logstash-tutorial"
    start_position => beginning
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  geoip {
    source => "clientip"
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout {}
}
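As an aside, the %{COMBINEDAPACHELOG} grok pattern in the filter above only matches lines in Apache's "combined" log format. A small sketch (the request line and path below are made-up examples) for appending one such line to the input file, so the file input has something new to pick up:

```shell
# A made-up request line in Apache "combined" format, which the
# %{COMBINEDAPACHELOG} grok pattern expects.
line='127.0.0.1 - - [19/Jan/2016:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 1024 "-" "Mozilla/5.0"'
echo "$line"
# Append it to whatever file your "path" setting points at, e.g.:
#   echo "$line" >> /tmp/logstash-tutorial-dataset.txt
```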

Try adding one of these lines to your configuration:

sincedb_path => "/dev/null" (maybe this is only for Linux)
or
sincedb_path => "null" (maybe this is for Windows)

Then verify whether Logstash is creating the index:
curl -XGET http://localhost:9200/_cat/indices?v
or
http://localhost:9200/_cat/indices?v
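The _cat/indices listing is plain text, one row per index, and ?v adds a header row. A sketch of pulling just the index names out of that output with awk (the captured listing below is a hypothetical example, with columns trimmed for brevity):

```shell
# Hypothetical (trimmed) output of: curl -s 'http://localhost:9200/_cat/indices?v'
listing='health status index               pri rep docs.count
yellow open   logstash-2016.01.19   5   1        100
yellow open   .kibana               1   1          2'
# The index name is the third whitespace-separated column; skip the header row.
echo "$listing" | awk 'NR > 1 { print $3 }'
```

If a logstash-* name shows up in that column, the output plugin is writing to Elasticsearch.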

I added
sincedb_path => "null"
to the config file. The output from Logstash is pasted below. Still no index. Since the stdout {} statement is also in the conf file, if the index did get created, shouldn't there be some stdout output?

Any other ideas?

C:\Users\rx1234\Documents\logstash-2.1.1\bin>logstash -f first-pipeline.conf
io/console not supported; tty will not be manipulated
Settings: Default filter workers: 4
Logstash startup completed

Found the problem: the input file path needed an extension. The ".txt" was missing earlier:

logstash-tutorial-dataset.txt
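If you hit the same thing, it can save time to confirm the exact filename exists before starting Logstash. A tiny sketch (the path here is a hypothetical example; substitute the one from your "path" setting):

```shell
# Verify the file the "path" setting points at actually exists
# (hypothetical location; substitute your own).
f="$HOME/logstash-tutorial-dataset.txt"
if [ -f "$f" ]; then
  echo "found: $f"
else
  echo "missing: $f"
fi
```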

I have actually been trying the example explained here: Parsing Logs with Logstash | Logstash Reference [8.11] | Elastic

and, like most of you, got the Logstash index error.

I added sincedb_path => "/dev/null", but no luck.

I only got this:
logstash-config-files]$ ../logstash-2.3.4/bin/logstash -f ./first-pipeline.conf
Settings: Default pipeline workers: 12
Pipeline main started
.
It would stall like this indefinitely with no output, while checking Elasticsearch shows there is still no index:

$ curl -XGET 'localhost:9200/logstash-$DATE/_search?pretty&q=response=200'
{
  "error" : {
    "root_cause" : [ {
      "type" : "index_not_found_exception",
      "reason" : "no such index",
      "resource.type" : "index_or_alias",
      "resource.id" : "logstash-$DATE",
      "index" : "logstash-$DATE"
    } ],
    "type" : "index_not_found_exception",
    "reason" : "no such index",
    "resource.type" : "index_or_alias",
    "resource.id" : "logstash-$DATE",
    "index" : "logstash-$DATE"
  },
  "status" : 404
}
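Note that in the single-quoted curl above, the shell never expands $DATE, so Elasticsearch is literally asked for an index named logstash-$DATE, which will 404 even when indexing works. A sketch that builds the real name (assuming the default daily logstash-YYYY.MM.DD index naming) and uses double quotes so the variable expands:

```shell
# Build today's index name; by default the elasticsearch output writes
# to daily indices named logstash-YYYY.MM.DD (dates in UTC).
idx="logstash-$(date -u +%Y.%m.%d)"
echo "$idx"
# Double quotes let ${idx} expand, unlike the single-quoted original:
#   curl -XGET "localhost:9200/${idx}/_search?pretty&q=response=200"
```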

FYI, using stdout {} instead of elasticsearch gives me some output, but I really need to test this with Elasticsearch, and so far no luck.

Bearing in mind that Elasticsearch is running and the log file is at the right path, which I made sure of!

Thanks,