The correct use of Logstash

I have four types of log files.
Every day I get the four types from several machines, so for each machine I have: t1-20200703, t2-20200703, t3-20200703 and t4-20200703.
After a few days I will have hundreds of files. How can I organize all this in Elasticsearch and in Kibana?
I'm using logstash.

		elasticsearch {
			hosts => ["localhost:9200"]
			index => "%{type}%{+yyyy.MM.dd}"
		}

Thank you

There's not really enough info here to be helpful, but some general guidelines:

  • You need to avoid having too many indices and shards. You should probably combine log files if they are for the same application, and you should almost certainly combine logs from different hosts.
  • You should research ILM (index lifecycle management); it is the "new way", replacing date-based indices, and it automates the management and deletion of old indices.
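
To make the second point concrete, here is a minimal sketch of an ILM policy; the policy name, rollover thresholds and retention period below are illustrative, not from the thread:

```
PUT _ilm/policy/logs-policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": { "max_age": "1d", "max_size": "50gb" }
        }
      },
      "delete": {
        "min_age": "30d",
        "actions": { "delete": {} }
      }
    }
  }
}
```

With a policy like this, Elasticsearch rolls the write index over daily (or at 50 GB, whichever comes first) and deletes indices 30 days after rollover, so you never manage date-named indices by hand.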

Could you please explain how I can combine log files if they are for the same application?

Just send them to the same index.
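
For example, a single output for all four types could look like the sketch below; the index name `applogs` is illustrative. The `type` value is still stored as a field on each document, so you can filter on it in Kibana:

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # One shared daily index instead of one index per type
    index => "applogs-%{+yyyy.MM.dd}"
  }
}
```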

My index takes the date as part of its name, so its name automatically changes when there is new data!

As @rugenl pointed out, using ILM is the better way of automating index management on the Elastic side.

If you think logs from different machines have a similar structure, then you can store them in the same index. Consider adding host information to the logs, so that you can filter on those fields to get the logs from a particular machine. Check out the Elastic Common Schema (ECS) to standardize naming conventions.
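
As a sketch, host and service information could be added in the filter stage with the `mutate` plugin; the ECS-style field names below follow the schema, but the values are placeholders you would replace per machine or derive from your input:

```
filter {
  mutate {
    add_field => {
      # ECS field names; values here are placeholders
      "[host][name]"    => "machine-01"
      "[service][name]" => "my-app"
    }
  }
}
```

Note that some inputs (such as the file input) already populate a `host` field automatically, so check what your events contain before adding fields.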

Once you start using ILM, there is no need to specify a particular index name; instead you provide an alias, which is a static name. Elasticsearch will automatically write into a date-based backing index and roll over automatically based on your ILM policy.
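
In the Logstash elasticsearch output this is configured with the `ilm_*` options rather than `index`; the alias and policy names below are illustrative:

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    ilm_enabled        => true
    ilm_rollover_alias => "applogs"      # static alias you write to
    ilm_pattern        => "{now/d}-000001"
    ilm_policy         => "logs-policy"  # must already exist in Elasticsearch
  }
}
```

Logstash then writes to the `applogs` alias, and Elasticsearch creates and rolls over the date-based backing indices (e.g. `applogs-2020.07.03-000001`) according to the policy.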

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.