This is pretty standard for Logstash-type data.
Use daily indexes; don't use TTL.
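With daily indices, a query over a small time window only ever touches one or two indices, and old data can be dropped by deleting whole indices instead of relying on TTL. A minimal sketch of the idea in plain Python (the `logstash-YYYY.MM.dd` naming follows Logstash's default; the prefix and helper names here are just illustrative):

```python
from datetime import datetime, timedelta

def daily_index(ts, prefix="logstash"):
    # Logstash-style daily index name, e.g. "logstash-2014.07.14"
    return f"{prefix}-{ts:%Y.%m.%d}"

def indices_for_range(start, end, prefix="logstash"):
    # Only the daily indices overlapping [start, end] need searching,
    # so a one-hour query hits at most two indices.
    names = []
    day = start.date()
    while day <= end.date():
        names.append(f"{prefix}-{day:%Y.%m.%d}")
        day += timedelta(days=1)
    return names
```

You would then point your search at `indices_for_range(...)` rather than at a single huge index, and expire data with a daily index delete (e.g. via curator) instead of per-document TTLs.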
Regards,
Mark Walkom
Infrastructure Engineer
Campaign Monitor
email: markw@campaignmonitor.com
web: www.campaignmonitor.com
On 14 July 2014 11:40, LiMac cnwangyong@gmail.com wrote:
Hi folks,
I am trying to index a huge amount of time-series data. The total will be
about 5k docs per second, continuing for several months. I also need to
search these data, but only within a very small time range, maybe one hour.
Is there any best practice for this kind of use case? Thanks!
Alan
--
You received this message because you are subscribed to the Google Groups
"elasticsearch" group.
To unsubscribe from this group and stop receiving emails from it, send an
email to elasticsearch+unsubscribe@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/elasticsearch/7b659b5f-7f50-483d-a2d5-de9c2e4b650c%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.