Hello, dear community!
Apologies in advance if some of the questions below are naive; I'm just starting to learn the wonders of Elasticsearch.
I want to store ~10 billion records containing price history for different cryptocurrencies.
I was advised to store them all in this form:
PUT /coin_charts/_doc/bitcoin-1642716000000
{
  "currency": "bitcoin",
  "priceUsd": 36000,
  "time": 1642716000000
}
A few questions about this:
- Is it normal to store this many records with such a structure? Later on, I will very often need to search on the "time" and "currency" fields to obtain the value of a certain cryptocurrency at a given time. For that, I issue this query:
POST /coin_charts/_search
{
  "size": 1,
  "query": {
    "bool": {
      "must": [
        {"term": {"currency": "bitcoin"}},
        {"range": {"time": {"gt": 1357110000000}}}
      ]
    }
  }
}
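Since both clauses are exact matches that don't need relevance scoring, one common variant (assuming the goal is the first data point at or after the given timestamp) moves them into a `filter` context and adds a `sort` on `time` so that `"size": 1` returns a deterministic hit:

```
POST /coin_charts/_search
{
  "size": 1,
  "query": {
    "bool": {
      "filter": [
        {"term": {"currency": "bitcoin"}},
        {"range": {"time": {"gt": 1357110000000}}}
      ]
    }
  },
  "sort": [{"time": "asc"}]
}
```

In `filter` context no scores are computed, and frequently repeated filters can be served from the node-level query cache.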
- I need to get information about different currencies for 100 completely different dates. I thought it would make sense to fetch all of this in one request. Is that possible? If so, how should I modify the query from question #1? Right now I have to issue a separate request for each date.
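One way to batch those 100 lookups into a single HTTP round trip is the Multi Search API (`_msearch`), which takes newline-delimited pairs of header and body lines. The currencies and timestamps below are just placeholder examples:

```
POST /coin_charts/_msearch
{}
{"size": 1, "sort": [{"time": "asc"}], "query": {"bool": {"filter": [{"term": {"currency": "bitcoin"}}, {"range": {"time": {"gte": 1642716000000}}}]}}
{}
{"size": 1, "sort": [{"time": "asc"}], "query": {"bool": {"filter": [{"term": {"currency": "ethereum"}}, {"range": {"time": {"gte": 1357110000000}}}]}}
```

Each header/body pair produces one entry in the `responses` array of the reply, in the same order as the request, so the results can be matched back to the dates that were asked for.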
- I noticed that adding new records is very slow, about 100-150 ms per document. Is that normal? It seems to me that indexing was many times faster in version 5. Is it possible to speed up the addition of new records? Maybe there is some multi-threaded or batched indexing mechanism, or something similar?
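Much of that per-document latency is typically per-request overhead rather than raw indexing cost. The standard way to speed up ingestion is the Bulk API, which indexes many documents in a single request; the second document below is made up purely for illustration:

```
POST /coin_charts/_bulk
{"index": {"_id": "bitcoin-1642716000000"}}
{"currency": "bitcoin", "priceUsd": 36000, "time": 1642716000000}
{"index": {"_id": "bitcoin-1642716060000"}}
{"currency": "bitcoin", "priceUsd": 36012, "time": 1642716060000}
```

Bulk requests of a few thousand documents (or a few megabytes) per call, sent from several client threads in parallel, are a typical starting point; the optimal size is found by benchmarking against the actual cluster.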
- What server specifications are optimal for this workload, and are there additional settings I should apply so that requests are processed as quickly as possible? Speed is the main priority.
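On the settings side, one index setting commonly tuned for ingest-heavy workloads is `refresh_interval`: raising it above the default of 1s trades how quickly new documents become searchable for higher indexing throughput. A sketch, where 30s is an arbitrary example value:

```
PUT /coin_charts/_settings
{
  "index": {
    "refresh_interval": "30s"
  }
}
```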