I see that Packetbeat records cumulative values in its interim flow captures. What is the best way to extract the derivatives (per-interval rates) from these? A Beat processor, sending to Logstash, or something clever in Kibana (note we do not yet have Timelion on our Kibana)?
I set up a flow capture filtered on TCP port 7222 to capture Tibco EMS traffic, but, looking at the "start_time" and "last_time" fields, each flow seems to span a long period of time (one spanned half a day). I understand that Tibco EMS tries to keep a continuous connection open to send its data over, so looking only at FINAL flow stats is not practical here.
{
  "_index": "xxxxxx-packetbeat-6.2.0-xxxxx",
  "_type": "doc",
  "_id": "AWGzrzrlEVEpPOEFhA9H",
  "_score": null,
  "_source": {
    "@timestamp": "2018-02-20T14:47:40.019Z",
    "last_time": "2018-02-20T14:47:38.912Z",
    "type": "flow",
    "source": {
      "mac": "xxxxxxxxxxxxxxx",
      "ip": "xxxxxxxxxxxxx",
      "port": 7222,
      "stats": {
        "net_packets_total": 5099,
        "net_bytes_total": 1602913
      }
    },
    "dest": {
      "ip": "xxxxxxxx",
      "port": xxxx,
      "stats": {
        "net_packets_total": 5168,
        "net_bytes_total": 352122
      },
      "mac": "xxxxxxxxxxxxx"
    },
    "start_time": "2018-02-20T13:13:48.994Z",
    "beat": {
      "name": "Tibco-Packetbeat",
      "hostname": "xxxxxxxxxxxx",
      "version": "6.2.0"
    },
    "flow_id": "EQQA////DP//////FP8BAAEAUFatAPMAUFatAXEKOoAqCjqAml3H1Uk",
    "final": false,
    "transport": "tcp"
  },
  "fields": {
    "last_time": [
      1519138058912
    ],
    "start_time": [
      1519132428994
    ],
    "@timestamp": [
      1519138060019
    ]
  },
  "sort": [
    1519138060019
  ]
}
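To make the question concrete, here is a minimal sketch of the post-processing I have in mind, assuming events are consumed in @timestamp order and grouped by flow_id (the function name flow_rates and this Python shape are my own illustration, not an existing Beat/Logstash feature; field names match the event above):

```python
# Sketch: turn Packetbeat's cumulative flow counters into per-interval
# rates by differencing successive interim events with the same flow_id.
from datetime import datetime

def parse_ts(ts):
    """Parse Packetbeat's ISO-8601 timestamps, e.g. 2018-02-20T14:47:38.912Z."""
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%fZ")

def flow_rates(events):
    """Yield (last_time, bytes_per_sec, packets_per_sec) for each interval."""
    last = {}  # flow_id -> (timestamp, bytes_total, packets_total)
    for ev in events:
        fid = ev["flow_id"]
        ts = parse_ts(ev["last_time"])
        stats = ev["source"]["stats"]
        b, p = stats["net_bytes_total"], stats["net_packets_total"]
        if fid in last:
            prev_ts, prev_b, prev_p = last[fid]
            dt = (ts - prev_ts).total_seconds()
            if dt > 0:
                yield ev["last_time"], (b - prev_b) / dt, (p - prev_p) / dt
        last[fid] = (ts, b, p)
```

If this can instead be done server-side, I believe Elasticsearch's Derivative pipeline aggregation on a date histogram of max(source.stats.net_bytes_total) would compute the same deltas, but I am unsure how to scope that per flow_id in a Kibana visualization.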