Elasticsearch as a time series database

We currently have data that looks something like this.

Index: device_data_2020-01-01, with docs that look something like this:

{"deviceId": "a001", "timestamp": "2020-02-21T19:33:00", "metrics": {"port0": 12, "port1": 13}}
{"deviceId": "a002", "timestamp": "2020-02-21T19:33:00", "metrics": {"port0": 14, "port1": 15, "port3": 22}}
{"deviceId": "a001", "timestamp": "2020-02-21T19:34:00", "metrics": {"port0": 13, "port1": 12}}
and so on ...

A single document is around 5 KB. The column names are not fixed, and it is pretty nice that they get indexed automatically.
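
(For reference, you can check what the automatic mapping produced. Below is the request plus an abbreviated sketch of what Elasticsearch's dynamic-mapping defaults would typically generate for docs like the ones above: numbers become long, strings become text with a keyword sub-field, and the timestamp is detected as a date.)

GET device_data_2020-01-01/_mapping

{
  "device_data_2020-01-01": {
    "mappings": {
      "properties": {
        "deviceId": {
          "type": "text",
          "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
        },
        "timestamp": { "type": "date" },
        "metrics": {
          "properties": {
            "port0": { "type": "long" },
            "port1": { "type": "long" }
          }
        }
      }
    }
  }
}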

The only issue we keep hitting is the artificial limits, like max columns per index and max rows returned (we rely heavily on Elasticsearch's SQL functionality, which makes creating reports so much easier).
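
(For the row limit specifically, the SQL endpoint supports cursor-based paging: request a page with fetch_size, then post the returned cursor back for the next page. A sketch, with an illustrative query:)

POST _sql?format=json
{
  "query": "SELECT deviceId, timestamp, metrics.port0 FROM \"device_data_*\" ORDER BY timestamp",
  "fetch_size": 1000
}

POST _sql?format=json
{
  "cursor": "<cursor value from the previous response>"
}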

Two questions:

Is Elasticsearch a good fit for this, or are we abusing it? (At the moment the data is around a gig, but we expect it to grow to 4 TB.)

We don't use Elasticsearch for any kind of full-text search. Is there any way we can disable text-search indexing across the device_data_* indices? I don't know if we will save anything, though.
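
(A sketch of one way to do this, assuming Elasticsearch 7.x and the legacy template API; the template name is made up. A dynamic template maps every new string field to a plain keyword instead of the default text + keyword multi-field, which skips full-text analysis entirely:)

PUT _template/device_data
{
  "index_patterns": ["device_data_*"],
  "mappings": {
    "dynamic_templates": [
      {
        "strings_as_keyword": {
          "match_mapping_type": "string",
          "mapping": { "type": "keyword" }
        }
      }
    ],
    "properties": {
      "timestamp": { "type": "date" }
    }
  }
}

(This only affects indices created after the template exists; mappings on existing indices can't be changed in place.)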

By columns, do you mean you are getting warnings about exceeding field limits?

Yes, but that is not the problem; I know you can change the field limit.
The primary question is: "is Elasticsearch right for a use case where time-series data is retained for 2 to 3 years?", which no one seems to be able to answer.

I've been using ES for storing log data for 7-8 years now, and it's probably our most popular use case.
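
(For the retention side, the usual pattern is exactly what you are doing, time-based indices, combined with an index lifecycle policy that deletes indices past a certain age. A sketch, assuming ILM is available; the policy name and the 3-year cutoff are illustrative:)

PUT _ilm/policy/device_data_retention
{
  "policy": {
    "phases": {
      "hot": { "actions": {} },
      "delete": {
        "min_age": "1095d",
        "actions": { "delete": {} }
      }
    }
  }
}

(Attach it by adding "index.lifecycle.name": "device_data_retention" to the settings in the device_data_* index template.)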
