The documentation (Network fields) specifies that the field "network.bytes" is the sum of "source.bytes" and "destination.bytes".
In my case these fields are missing. I only have the following values (a short sketch of what I expected follows the list):
netflow.octet_delta_count
netflow.packet_delta_count
netflow.post_octet_delta_count
netflow.post_packet_delta_count
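To make explicit what I expected, here is a small Python sketch of the documented relationship. The mapping of the raw netflow counters onto source.bytes / destination.bytes is exactly the part I am unsure about, and the numbers are made up:

```python
# Documented relationship from the Network fields page:
#   network.bytes = source.bytes + destination.bytes
def network_bytes(source_bytes: int, destination_bytes: int) -> int:
    return source_bytes + destination_bytes

# What I actually receive per flow record (example values, made up):
record = {
    "netflow.octet_delta_count": 1_500,
    "netflow.post_octet_delta_count": 4_200,
}

# What I cannot tell is which of these counters (if any) is supposed to feed
# source.bytes and destination.bytes so that network.bytes can be derived.
print(network_bytes(record["netflow.octet_delta_count"],
                    record["netflow.post_octet_delta_count"]))  # 5700, if that mapping were right
```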
On the other side, the Meraki documentation (Meraki NetFlow overview) specifies the NetFlow template that is sent, and in particular two fields used to report bytes:
bytes
out_bytes
I don't understand why I have different values, what the relationship between them is, and how network.bytes is calculated. Is it possible to have some explanation about this?
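For reference, here is how I currently think the names line up, written as a tiny Python table. The pairing is only my own reading of the NetFlow v9 / IPFIX element numbers and may well be wrong:

```python
# My working guess (please correct me if wrong): Meraki's template uses the
# classic NetFlow v9 field names, while the documents in Elasticsearch carry
# the IPFIX-style names for what should be the same elements.
PAIRING_GUESS = [
    # (Meraki field, element number, field I see in Elasticsearch)
    ("bytes",     1,  "netflow.octet_delta_count"),
    ("out_bytes", 23, "netflow.post_octet_delta_count"),
]

for meraki_field, element_id, es_field in PAIRING_GUESS:
    print(f"{meraki_field:<10} (element {element_id:>2})  ->  {es_field}")
```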
Hey @jquintard, can you confirm that the network.bytes field is a number? If so, it will be storing the number of bytes itself. Timelion currently ignores the Kibana index-pattern field formatters, so if you'd like to convert from bytes to kB or MB in Timelion, you'll have to do this yourself.
@Brandon_Kobel, it's a number. But I think you have not understood my problem (my poor English, probably). It's not a unit issue; I know how to divide the number in Timelion. It's just that the bytes from the two sources (Meraki dashboard and Kibana) are not the same, or even approximately the same.
Example from the previous screenshot:
At 12 AM:
Meraki: around 1.2 Mb/s, i.e. 0.15 MB/s
Kibana: around 41,676 MB/s
From 0.15 MB/s to 41,676 MB/s, the gap is huge.
I don't understand how the value could be 41,676 MB/s (it's just impossible).
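To make the comparison explicit, here is the arithmetic behind those numbers as a plain Python sketch; the input values are simply what I read off the two dashboards:

```python
# Values read off the two dashboards at the same time of day.
meraki_mbps = 1.2                  # Meraki reports megabits per second
meraki_MBps = meraki_mbps / 8      # 8 bits per byte -> 0.15 MB/s

kibana_MBps = 41_676               # what I read on the Timelion graph, in MB/s

print(f"Meraki: {meraki_MBps:.2f} MB/s")
print(f"Kibana: {kibana_MBps} MB/s")
print(f"Ratio:  {kibana_MBps / meraki_MBps:,.0f}x larger")  # around 277,840x
```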
This is the previous graph from the thread, but with only one minute of data. I receive 22 flows. The first reports 68.8 MB, the last 137.2 MB, so in 1 minute, 137.2 - 68.8 = 68.4 MB of data are received/sent by the interface. The bandwidth is therefore 68.4 / 60 = 1.14 MB/s...
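As a sketch of that calculation (the assumption is mine: that the first and last of the 22 flow values behave like a cumulative byte count for the interface over the minute):

```python
# Assumption (mine): the first and last of the 22 flow values behave like a
# cumulative byte counter for the interface over the one-minute window.
first_MB = 68.8
last_MB = 137.2
window_s = 60

delta_MB = last_MB - first_MB          # 68.4 MB transferred during the window
bandwidth_MBps = delta_MB / window_s   # roughly 1.14 MB/s

print(f"{delta_MB:.1f} MB over {window_s} s -> {bandwidth_MBps:.2f} MB/s")
```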
@rashid, sorry to ask for your help. I have read the post Brandon pointed to, but that is already what I do. Do you know why there is such a huge gap between what I see in my Meraki dashboard and what I get in Kibana? How can I troubleshoot this? Thanks for your help.