I am new to ES and would like to monitor 100,000 endpoints with Metricbeat.
Can you please help me with the questions below:
Can I monitor 100,000 systems with Beats? Is that possible?
Can I store that data in Elasticsearch, given that the data generated by those endpoints will be huge?
What points should I consider when doing something like this, since all the endpoints will be pointing to one Elasticsearch server?
Yes, that should be possible. Elasticsearch can scale to handle very large data volumes.
If the data volume is huge you will need a sizeable Elasticsearch cluster, not just a single server. It is probably also impractical to have 100,000 Beats write directly to Elasticsearch, so you will likely need to design an ingest pipeline as described in this blog post.
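For example, instead of pointing every agent at the cluster, each Metricbeat instance could ship through a load-balanced Logstash tier. A minimal sketch (module selection, period, and hostnames are placeholders, not a recommendation for your workload):

```yaml
# metricbeat.yml — minimal sketch; adjust modules and hosts to your setup
metricbeat.modules:
  - module: system
    metricsets: ["cpu", "memory", "network"]
    period: 30s

# Ship to a Logstash tier rather than directly to Elasticsearch,
# load-balancing across the listed instances.
output.logstash:
  hosts: ["logstash1:5044", "logstash2:5044"]
  loadbalance: true
```

With 100,000 agents you would also want to manage this configuration centrally (e.g. with a configuration-management tool) rather than by hand.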
Can I use Kafka as a buffer before storing to ES, or should the data go directly to ES after filtering and processing via Logstash? I want to run some ML algorithms on the monitored data.
Will there be any performance issues if I run my ML algorithms on ES data?
Can anyone from the ES team reply on this thread?
Yes, that should be possible.
There is really not a lot of information about exactly what you will be running, which makes it hard to judge what impact it may have on the performance of the cluster.
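On the Kafka question: Metricbeat can write to a Kafka topic, and Logstash can then consume from it, filter, and index into Elasticsearch. A minimal Logstash pipeline sketch, assuming a topic named "metricbeat" and placeholder broker/cluster addresses:

```
input {
  kafka {
    bootstrap_servers => "kafka1:9092,kafka2:9092"
    topics            => ["metricbeat"]
    group_id          => "logstash-metrics"
    codec             => "json"
  }
}

filter {
  # drop, enrich, or reshape events here before indexing
}

output {
  elasticsearch {
    hosts => ["http://es1:9200"]
    index => "metricbeat-%{+YYYY.MM.dd}"
  }
}
```

The Kafka layer decouples the 100,000 producers from the cluster, so indexing slowdowns cause consumer lag rather than dropped data.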
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.