I am facing an issue where I can see that an index has not been created after some changes in my Logstash pipeline file.
Can you please help me with the questions below? I am having difficulty understanding this.
I have a simple architecture: Filebeat watches the log files and sends them to Logstash, and Logstash sends them to Elasticsearch.
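For context, here is a minimal sketch of the kind of Logstash pipeline I mean (the port, hosts, and index pattern are assumptions for illustration, not my exact config):

```
# Logstash pipeline sketch -- all values assumed for illustration
input {
  beats {
    port => 5044                        # Filebeat ships events to this port
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    # A date pattern like this in the index name is what would cause
    # a new index to appear each day as events arrive
    index => "filebeat-%{+YYYY.MM.dd}"
  }
}
```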
If daily log files are being created (e.g. log1, log2, etc.), will new indices also get created daily (e.g. index1, index2, etc.), or will there be one index getting bigger in size day by day? What makes indices get created daily in the same way the log files are created daily?
Is there a particular time of day when Elasticsearch creates the new daily index?
If one day the log file is not created (due to some issue), will the new index also not get created for that day?
And if the log file gets created the next day, will the index also get created automatically, or do we have to do something to get it created?
- Say I have 30 files which are indexed into Elasticsearch. If my Elasticsearch data gets deleted or corrupted (due to any issue), and I then deploy Elasticsearch again, those files will be indexed again, so I am trying to understand where my loss is here. As long as I have the source log files, I can get them indexed by deploying Elasticsearch again and again (not counting saved dashboards etc., which will be lost anyway), so I can safely delete Elasticsearch. Is there anything I am missing, or any disadvantage here that I am not understanding?
In other words, even if something happens to Elasticsearch, as long as I have the source log files I can get them indexed and run queries on them by deploying Elasticsearch again, so I could delete Elasticsearch purposely or accidentally and not worry, because I still have those log files intact.
I am just trying to understand how much risk there is if Elasticsearch gets deleted but I still have the log files, given that my only purpose is to view those logs in Kibana and run queries on them.
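For clarity, by "run queries" I just mean simple searches from Kibana Dev Tools, something like this (the index name and search term here are made-up examples):

```
GET filebeat-2024.01.15/_search
{
  "query": {
    "match": { "message": "error" }
  }
}
```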