Ingest data to a data stream through Filebeat

Hi @Debasis_Mallick

What you are trying to do is somewhat orthogonal to data streams... If you want daily indices created exactly at 12:00 UTC etc...

You should create daily indices, not data streams... but here are a couple of thoughts.

  1. If you create the data stream and load the first data at ~12:00 AM UTC (which will create the backing index), and then use a 1-day rollover in the ILM policy, the rollover should happen at approximately the same time each day.

  2. You could create a simple bash / cron script and roll over the data stream each day using the _rollover API.

  3. Go back to plain indices and use the daily index name syntax
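For options 1 and 2, a minimal sketch of the two API calls involved (assuming Elasticsearch on localhost:9200 and a data stream named my-datastream; both names are placeholders):

```shell
# Option 1: ILM policy whose hot phase rolls over the backing index
# once it is 1 day old (policy name "daily-rollover" is a placeholder):
curl -s -X PUT "http://localhost:9200/_ilm/policy/daily-rollover" \
  -H 'Content-Type: application/json' -d'
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": { "max_age": "1d" }
        }
      }
    }
  }
}'

# Option 2: force a rollover yourself from cron instead, e.g. with a
# crontab entry that fires at 12:00 AM UTC every day:
#   0 0 * * * curl -s -X POST "http://localhost:9200/my-datastream/_rollover"
curl -s -X POST "http://localhost:9200/my-datastream/_rollover"
```

With option 2 the rollover time is exact (cron-driven), whereas option 1 depends on when ILM next evaluates the policy conditions.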

I think those are your options...

By the way, you have not explained why it is so important to have these data sets in separate daily indices. We hear this request often, and often it is for "appearance" reasons only.

These are my thoughts


Thanks @stephenb for these suggestions; I will check whether they are feasible. Can you help with the above query, which is about the rollover not happening automatically when no CSV files are present in the paths folder mentioned above? Is there any way to check logs related to rollover for the data stream?

- type: filestream
  id: my-filestream-id
  enabled: true
  paths:
    - /cbdata/elastic/cb4lv1/democsv1/*.csv
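To answer the "is there any way to check" part directly, rollover state can be inspected via the ILM explain and data stream APIs (a sketch; my-datastream and the host are placeholders for your setup):

```shell
# Show the ILM phase/step for each backing index of the data stream,
# including any error that is currently blocking rollover:
curl -s "http://localhost:9200/my-datastream/_ilm/explain?pretty"

# List the data stream's backing indices and its current generation,
# to confirm whether a rollover actually happened:
curl -s "http://localhost:9200/_data_stream/my-datastream?pretty"
```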

Adding to the above points, we also observed that one day the index rollover happened at 11:50 AM IST and the next day it happened at 12:00 PM IST. As you mentioned in the earlier thread, it should roll over at the same time. Please correct me if I am wrong.

Thanks,
Debasis
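A note on the ~10-minute drift described above, offered as an assumption worth checking: ILM evaluates rollover conditions on a schedule controlled by the cluster setting `indices.lifecycle.poll_interval`, which defaults to 10 minutes, so a rollover can fire anywhere within that window rather than exactly at the max_age boundary. A sketch for inspecting and, if desired, tightening it (host and interval are placeholders):

```shell
# Show the effective ILM poll interval (falls back to the 10m default):
curl -s "http://localhost:9200/_cluster/settings?include_defaults=true&filter_path=*.indices.lifecycle.poll_interval&pretty"

# Optionally lower it, e.g. to 1 minute (adds a little master-node load):
curl -s -X PUT "http://localhost:9200/_cluster/settings" \
  -H 'Content-Type: application/json' \
  -d '{"persistent": {"indices.lifecycle.poll_interval": "1m"}}'
```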