How can I insert historical data into a TSDB (time series data stream)?

I followed Set up a time series data stream (TSDS) | Elasticsearch Guide [8.5] | Elastic,
but when I insert the data I get an error.

PUT metrics-weather_sensors-dev/_bulk
{ "create":{ } }
{ "@timestamp": "2099-05-06T16:21:15.000Z", "sensor_id": "HAL-000001", "location": "plains", "temperature": 26.7,"humidity": 49.9 }
{ "create":{ } }
{ "@timestamp": "2099-05-06T16:25:42.000Z", "sensor_id": "SYKENET-000001", "location": "swamp", "temperature": 32.4, "humidity": 88.9 }


          "type": "illegal_argument_exception",
          "reason": "the document timestamp [2099-05-06T16:21:15.000Z] is outside of ranges of currently writable indices [[2022-11-03T03:38:56.000Z,2022-11-03T09:44:56.178Z]]"
        }

No backing index covering 2099-* has been created.

If I want to insert data from 2021, how can I get Elasticsearch to automatically create a backing index such as aaaaa-2021.01.01-000001?

Thanks.

Did you run through the other steps on that page? What was the outcome of those steps?

Yes, the ILM policy, settings, mappings, and index template were all created:

{
  "template": {
    "settings": {
      "index": {
        "lifecycle": {
          "name": "my-lifecycle-policy"
        },
        "mode": "time_series",
        "codec": "best_compression",
        "routing": {
          "allocation": {
            "include": {
              "_tier_preference": "data_hot"
            }
          }
        },
        "time_series": {
          "end_time": "2022-11-03T10:01:56.000Z",
          "start_time": "2022-11-03T04:01:56.000Z"
        },
        "look_ahead_time": "3h",
        "routing_path": [
          "sensor_id",
          "location"
        ]
      }
    },
    "mappings": {
      "dynamic": "true",
      "dynamic_date_formats": [
        "strict_date_optional_time",
        "yyyy/MM/dd HH:mm:ss Z||yyyy/MM/dd Z"
      ],
      "dynamic_templates": [],
      "date_detection": true,
      "numeric_detection": false,
      "properties": {
        "@timestamp": {
          "type": "date",
          "format": "strict_date_optional_time"
        },
        "humidity": {
          "type": "half_float",
          "time_series_metric": "gauge"
        },
        "location": {
          "type": "keyword",
          "time_series_dimension": true
        },
        "sensor_id": {
          "type": "keyword",
          "time_series_dimension": true
        },
        "temperature": {
          "type": "half_float",
          "time_series_metric": "gauge"
        }
      }
    },
    "aliases": {}
  }
}

Shouldn't I be able to set the start and end time myself, for example:

"time_series": {
      "end_time": "2099-05-06T16:21:15.000Z",
      "start_time": "2016-05-06T16:21:15.000Z"
    }

A tsdb data stream is hard coded to set the index.time_series.start_time and index.time_series.end_time settings upon creation of a backing index.

When a data stream is newly created, the first backing index's start_time setting is set to now - look_ahead_time (now-3h in your case) and its end_time setting is set to now + look_ahead_time (now+3h in your case). Only data that falls into that range is accepted.

So in your example you will need to adjust the @timestamp field values so that they fall into that range in order to get these documents accepted. I think the docs should be improved to highlight this fact.
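For example, taking the writable range from the error message above ([2022-11-03T03:38:56.000Z, 2022-11-03T09:44:56.178Z]), a bulk request whose timestamps fall inside that window should be accepted. The timestamps below are only illustrative; your actual window depends on when your data stream was created:

PUT metrics-weather_sensors-dev/_bulk
{ "create":{ } }
{ "@timestamp": "2022-11-03T06:21:15.000Z", "sensor_id": "HAL-000001", "location": "plains", "temperature": 26.7, "humidity": 49.9 }
{ "create":{ } }
{ "@timestamp": "2022-11-03T06:25:42.000Z", "sensor_id": "SYKENET-000001", "location": "swamp", "temperature": 32.4, "humidity": 88.9 }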

A tsdb data stream is designed to accept recent metric data (based on the current time and the index.look_ahead_time index setting). As a tsdb data stream gets older and has rolled over, the older backing indices can still accept writes via the data stream (unlike regular data streams). But the most recent backing index is designed to handle recent metric data. The best way to see for what time range a data stream will accept writes is to check the get data stream API, which returns a tsdb data stream's temporal range.
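For example, using the data stream name from this thread:

GET _data_stream/metrics-weather_sensors-dev

For a tsdb data stream the response should include a time_series section per data stream, roughly like the abbreviated excerpt below (the exact timestamps will differ and field names may vary slightly by version):

"time_series": {
  "temporal_ranges": [
    { "start": "2022-11-03T03:38:56.000Z", "end": "2022-11-03T09:44:56.178Z" }
  ]
}

Documents whose @timestamp falls outside these ranges are rejected.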

If you're just curious about how well tsdb works for your metric data set, I think it is best to just create a tsdb index (an index with index.mode=time_series) and then experiment with how well your data compresses.
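A minimal sketch of such a standalone time_series index, reusing the weather sensor fields from this thread (the index name and the start/end window are just placeholders; adjust them to your own data):

PUT weather_sensors_tsdb_test
{
  "settings": {
    "index": {
      "mode": "time_series",
      "routing_path": ["sensor_id", "location"],
      "time_series": {
        "start_time": "2021-01-01T00:00:00.000Z",
        "end_time": "2023-01-01T00:00:00.000Z"
      }
    }
  },
  "mappings": {
    "properties": {
      "@timestamp":  { "type": "date" },
      "sensor_id":   { "type": "keyword", "time_series_dimension": true },
      "location":    { "type": "keyword", "time_series_dimension": true },
      "temperature": { "type": "half_float", "time_series_metric": "gauge" },
      "humidity":    { "type": "half_float", "time_series_metric": "gauge" }
    }
  }
}

Because you pick start_time and end_time yourself here, this is also a convenient way to load historical data (e.g. from 2021) for testing. After indexing, something like GET _cat/indices/weather_sensors_tsdb_test?v&h=index,docs.count,store.size gives a rough idea of how well the data compresses.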


Thanks.
