I'm quite new to the ELK stack and have reached the limit of my searching for an answer. I'm trying to index some logs that I assume have been sent over through Logstash, but in Kibana I get the error "Unable to fetch mapping. Do you have any indices matching the pattern?".

A tcpdump shows traffic going to port 9200 on the loopback interface, so I assume that is Logstash sending my syslog traffic to Elasticsearch, since I don't have a Kibana page open that would generate that traffic. The problem is that I don't know how to verify whether Elasticsearch has actually received my data.

In elasticsearch.yml I declared a 'path.data', which I assume is where the logs should end up, but none of the files there appear to be growing, which tells me (if I'm understanding this correctly) that Elasticsearch is not receiving the data. Does ES possibly store this data somewhere else?

I also couldn't find any configuration for Kibana that tells it where to look for these logs, other than pointing it at port 9200 for ES. Is that all the configuration Kibana needs to find the data?
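For what it's worth, here's how I've been trying to check things on the Elasticsearch side so far (assuming everything is on the default localhost:9200; the logstash-* index pattern is just the default Kibana suggests, so that part is an assumption on my end):

```
# List every index Elasticsearch knows about; if Logstash is writing,
# I'd expect to see something like logstash-YYYY.MM.DD show up here.
curl 'http://localhost:9200/_cat/indices?v'

# Count documents in any logstash-* indices (errors out if none exist).
curl 'http://localhost:9200/logstash-*/_count?pretty'

# Basic health check, just to confirm Elasticsearch itself is responding.
curl 'http://localhost:9200/_cluster/health?pretty'
```

My thinking is that if nothing shows up in the indices list, the problem is between Logstash and Elasticsearch rather than in Kibana, but correct me if that logic is wrong.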
Another question: the Kibana service runs under a user named kibana, and when logging into Kibana I'm using an account created for Kibana. Elasticsearch runs under a different user, so wouldn't the kibana user need permission to read that data as well?
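In case it's relevant, this is roughly how I checked which users the two services run as (the unit names are just what they're called on my box, so that's an assumption):

```
# Show the user each relevant process is running as
ps -eo user,comm | grep -Ei 'kibana|elasticsearch|java'

# On a systemd system, the unit files also show the User= setting
systemctl cat kibana elasticsearch 2>/dev/null | grep -i 'User='
```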
One more question, just for confirmation: the logstash.conf file has three sections, input, filter, and output. Am I right in assuming that Logstash only sends data on to the output if the log matches the filter configuration?
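For reference, my logstash.conf looks roughly like this; the port, the grok pattern, and the output settings are just how I have it on my box, so please treat the specifics as assumptions rather than a known-good config:

```
input {
  # Listen for syslog traffic forwarded from my hosts
  syslog {
    port => 5514
  }
}

filter {
  # Try to parse the syslog line; events that fail to match
  # just get tagged with _grokparsefailure
  grok {
    match => { "message" => "%{SYSLOGLINE}" }
  }
}

output {
  # Ship events to the local Elasticsearch instance
  # (older Logstash versions use "host" instead of "hosts")
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
```

That's the setup behind my question about whether events that don't match the filter ever make it to the output.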
Sorry for jumping around, if I need to create another thread I will, thanks!