Real-Life Large-Scale Application and Question

Hello to all.

My company contracted an AWS server, and I am using Kibana to monitor the data.
We have 4 "products," each with 6 IoT devices (sensors). Each sensor reading generates one document, and the date is the only field all documents have in common.
Altogether this generates approx. 700K documents per day, stored in one index per day. It has been working like this for several months now.
I started using Sense to pull some specific data, but I got stuck on the following question (probably an easy one, but not for a newbie).
Please see the following query, which filters all documents down to some specific sensors:

GET /logstash-*/_search
{
  "query": {
    "filtered": {
      "query": {
        "query_string": {
          "query": "uuid.sensorId:\"189\" OR uuid.sensorId:\"373\" OR uuid.sensorId:\"221\" OR uuid.sensorId:\"125\" OR uuid.sensorId:\"389\" OR uuid.sensorId:\"181\" OR uuid.sensorId:\"165\" OR uuid.sensorId:\"381\""
        }
      },
      "filter": {
        "bool": {
          "must": [
            {
              "match": {
                "measure.Status": {
                  "query": 1
                }
              }
            },
            {
              "match": {
                "measure.protectionAlarm": {
                  "query": 0
                }
              }
            }
          ]
        }
      }
    }
  },
  "aggs": {
    "collected_values": {
      "date_histogram": {
        "field": "measure.timestamp",
        "interval": "day",
        "format": "yyyy-MM-dd",
        "min_doc_count": 1
      }
    }
  }
}

THE RESULT:
I get all the buckets with the dates on which the 4 products worked fine (Status = 1 and protectionAlarm = 0). A partial glimpse of the hits looks like this:
"_index": "logstash-2016.03.22",
"_type": "logs",
"_id": "AVOcZuWGnLM3x4WLbj8X",
...
"_index": "logstash-2016.04.11",
"_type": "logs",
"_id": "AVQDPuKX3-ObnL4jC1FB",
...
"buckets": [
  {
    "key_as_string": "2015-12-04",
    "key": 1449187200000,
    "doc_count": 884
  },
  {
    "key_as_string": "2016-02-01",
    "key": 1454284800000,
    "doc_count": 27
  },
  ...

Now, my problem: I want to retrieve the documents generated by 4 different sensors on the dates that appear in those buckets.
What query or command would cross-reference the two?
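To show the kind of thing I mean: a second query, taking one bucket date by hand (2016-02-01, from the buckets above) and filtering on 4 other sensors via a range filter over that day. The sensor IDs 201–204 here are just placeholders, not my real IDs, and I don't know if this manual two-step approach is the right way to do it:

```
GET /logstash-*/_search
{
  "query": {
    "filtered": {
      "query": {
        "query_string": {
          "query": "uuid.sensorId:\"201\" OR uuid.sensorId:\"202\" OR uuid.sensorId:\"203\" OR uuid.sensorId:\"204\""
        }
      },
      "filter": {
        "range": {
          "measure.timestamp": {
            "gte": "2016-02-01",
            "lt": "2016-02-02"
          }
        }
      }
    }
  }
}
```

Ideally I would like something that feeds all the bucket dates in automatically instead of copying them one by one.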

Thanks for any help

I've moved this to the Elasticsearch category, the User Groups one is for meetup groups :slight_smile:
