Logstash & Beats: index_not_found_exception


(brouk) #1

Hello,

I am following the Logstash tutorial. This command
curl -XGET 'localhost:9200/logstash-2018.06.24/_search?q=response=200'
returns the following error, instead of the data shown in the tutorial:
{ "error" : { "root_cause" : [ { "type" : "index_not_found_exception", "reason" : "no such index", "resource.type" : "index_or_alias", "resource.id" : "logstash-2018.06.24", "index_uuid" : "_na_", "index" : "logstash-2018.06.24" } ], "type" : "index_not_found_exception", "reason" : "no such index", "resource.type" : "index_or_alias", "resource.id" : "logstash-2018.06.24", "index_uuid" : "_na_", "index" : "logstash-2018.06.24" }, "status" : 404
Elasticsearch is running correctly on localhost. This is the content of logstash.conf, as given in the tutorial:
input {
    beats {
        port => "5044"
    }
}

filter {
    grok {
        match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    geoip {
        source => "clientip"
    }
}

output {
    elasticsearch {
        hosts => [ "localhost:9200" ]
    }
}

and the Logstash process is started, as the tutorial indicates, with:
bin/logstash -f logstash.conf --config.test_and_exit

What is the reason for this error?


(Christian Dahlqvist) #2

I think you will need to elaborate a bit more on your issue if anyone is going to be able to help you...


(brouk) #3

I did it.


(Christian Dahlqvist) #4

If you specify --config.test_and_exit, you do not actually process any data. It is meant as a first step to verify that your config is OK. If validation passes, you need to remove that flag to process data.
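For example, a typical sequence would be something like this (assuming your config file is named logstash.conf, as above):

bin/logstash -f logstash.conf --config.test_and_exit   # only validates the config, then exits
bin/logstash -f logstash.conf                          # actually starts the pipeline and processes events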

You can also look at this introduction to Logstash.


(brouk) #5

Hello Christian,
Yes, I know. I run that to test my logstash.conf configuration and I get the expected result: Config Validation Result: OK. Exiting Logstash. After that I run the command with --config.reload.automatic to restart Logstash and reload logstash.conf automatically.
But I still get the same issue!?
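To be precise, the full command I run for that step (roughly, following the tutorial but with my config file logstash.conf) is:

bin/logstash -f logstash.conf --config.reload.automatic   # starts Logstash and re-reads logstash.conf whenever it changes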


(Christian Dahlqvist) #6

What happens if you start Logstash like this: bin/logstash -f logstash.conf


(brouk) #7

According to the Logstash tutorial, the first command (--config.test_and_exit) is for testing the new configuration, and the second command (--config.reload.automatic) is for starting Logstash with automatic reloading after any change to logstash.conf. (Both commands work fine.)
When I set my Logstash output to stdout { codec => rubydebug }, I get the parsed log events quite clearly on the console. The question is: if I configure my Logstash output to Elasticsearch, it should in theory store my log events in Elasticsearch, and for that I need an index. Following the tutorial guide, I run this command:
curl -XGET 'localhost:9200/logstash-2018.06.24/_search?pretty&q=response=200'
and at this point I get the error. How can I store my newly parsed log events in Elasticsearch and browse and visualize them in Kibana?
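If it helps, the output section I am experimenting with looks roughly like this (just a sketch, keeping the rubydebug stdout alongside the elasticsearch output so the events stay visible on the console):

output {
    elasticsearch {
        hosts => [ "localhost:9200" ]   # send events to the local Elasticsearch node
    }
    stdout {
        codec => rubydebug              # also print each parsed event to the console
    }
}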


(Christian Dahlqvist) #8

Your data may not end up in that particular daily index, so instead do: curl -XGET 'localhost:9200/logstash-*/_search?pretty&q=response=200'. This will query all Logstash indices.

You can also run curl -XGET localhost:9200/_cat/indices to verify that indices have been created.
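For example, something like this (the ?v parameter just adds a header row to the _cat output; the index names and counts will of course differ on your system):

curl -XGET 'localhost:9200/_cat/indices?v'                             # list all indices with a header row
curl -XGET 'localhost:9200/logstash-*/_search?pretty&q=response=200'   # search across every logstash-* index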


(system) #9

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.