Hi friends,
I'm new to ELK, so please bear with me.
I have a grid product (object storage): a cluster of nodes (servers with disks) grouped into zones, with data stored using object policies with replicas.
The cluster has an HTTP API from which I can get its info & stats; the calls return JSON replies, which are easy to push into Elasticsearch. The problem is that the product doesn't keep historical logs, so I want to use Elasticsearch to store that data (mainly the statistics) over time and present it with Kibana.
The bricks that I can get via the HTTP calls are:
0 - Cluster info
1 - Node info
2 - Zones
3 - Policies info
4 - Statistics summary cluster (sum of all nodes)
5 - Statistics detailed cluster (sum of all nodes)
6 - Statistics detailed per node
Most of these are fairly static (I'll check for updates daily); the most dynamic is #6, which I plan to collect every 5 minutes.
My plan is to wget/curl the per-node statistics every 5 minutes (the call returns JSON, see below) and push it into Elasticsearch.
Given the above, how would you recommend I structure the indexes so that I can easily build a nice Kibana dashboard with historical tables/graphs?
wget -q --user=XXX --password=XXX "http://10.12.11.51:8088/mgmt/statistics_detail?address=10.12.11.51" -O - | /usr/bin/python -mjson.tool > statistics_detail_single
{
"avgDelLatency": [
0,
"ms"
],
"avgGetLatency": [
0,
"ms"
],
"avgPutLatency": [
0,
"ms"
],
"fileDeletesPerSec": [
0,
"FDPS"
],
"fileReadsPerSec": [
0,
"FRPS"
],
"fileWritesPerSec": [
0,
"FWPS"
],
"getThroughput": [
0.0,
"MB/s"
],
"putThroughput": [
0.0,
"MB/s"
],
"totalDelFailureCount": [
0,
"requests"
],
"totalDelSuccessCount": [
0,
"requests"
],
"totalGetFailureCount": [
0,
"requests"
],
"totalGetSuccessCount": [
0,
"requests"
],
"totalPutFailureCount": [
0,
"requests"
],
"totalPutSuccessCount": [
66761,
"requests"
]
}
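To make this concrete, here is a rough sketch of what I had in mind for massaging one of these replies before indexing: flatten each `[value, "unit"]` pair into a numeric field plus a `_unit` field, and add a timestamp and node name so each document can go into a daily index. (The field names, node field, and index naming scheme below are just placeholders I made up, not anything the product provides.)

```python
import json
from datetime import datetime, timezone

def flatten_stats(raw: dict, node: str) -> dict:
    """Turn the API's {"metric": [value, "unit"]} pairs into a flat
    document that Elasticsearch/Kibana can aggregate on."""
    doc = {
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "node": node,
    }
    for metric, (value, unit) in raw.items():
        doc[metric] = value              # numeric value, e.g. 66761
        doc[metric + "_unit"] = unit     # unit string, e.g. "requests"
    return doc

# Example with a snippet of the reply above:
raw = json.loads('{"totalPutSuccessCount": [66761, "requests"],'
                 ' "getThroughput": [0.0, "MB/s"]}')
doc = flatten_stats(raw, node="10.12.11.51")

# A daily index name could then be derived from the timestamp:
index = "statistics_detail-" + doc["@timestamp"][:10]  # e.g. statistics_detail-2016-01-31
```

Does splitting the value from the unit like this, with one daily (or weekly) time-based index for the 5-minute samples and separate small indexes for the mostly-static bricks (#0-#5), sound like a sensible layout?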