Does response time equal latency, or is there a separate calculation for latency?
When I use the sample dashboard for Topbeat, I see a latency diagram in Kibana with response time on the X-axis. Does it measure only traffic between the server and the client, or also between client and client?
Response time is the difference between the time the first byte of the request is seen and the time the first byte of the response message is seen (you could call that latency).
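The definition above can be sketched in a few lines. This is a hypothetical helper, not Packetbeat code; the function name and the use of epoch-second floats are assumptions for illustration only:

```python
def response_time_ms(t_first_request_byte: float, t_first_response_byte: float) -> float:
    """Response time as described above: time of the first byte seen of the
    response minus time of the first byte seen of the request, measured at
    the capture point. Inputs are timestamps in seconds; result is in ms."""
    return (t_first_response_byte - t_first_request_byte) * 1000.0

# Example: request's first byte captured at t=10.000 s,
# response's first byte captured at t=10.035 s -> roughly 35 ms.
print(response_time_ms(10.000, 10.035))
```

Note that this is latency as seen at the capture point: the same exchange measured on the client host and on the server host would give different numbers, since each side only sees the delay from its own vantage point.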
What do you mean by "between client and client"?
But doesn't latency mean something more like delay, e.g. from congestion?
I meant between client-side computers.
Can you elaborate? I don't fully get what you mean.
- From what I've seen on the web, latency seems to have a slightly different meaning than response time, since latency is more about delay. However, Packetbeat provides a latency diagram based on response time, so I'd like to know whether the response time in Packetbeat is simply equal to latency.
- I have one centralized ES server and a few client servers running Logstash and Packetbeat. So I'd like to know whether Packetbeat only collects data between the ES server and the clients, or also between the clients themselves.
- Response time equals the difference between the first byte seen of the request and the first byte seen of the response. It is the latency at the point of measurement.
- Packetbeat collects all traffic it sees, depending on configuration. It constructs a BPF filter from the configured protocols, and you can customize the packet filter (similar to tcpdump syntax) to restrict collection even more. What does "between clients" mean here? Which clients are communicating directly, and which protocols are they using?
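To make the filtering point concrete, here is a sketch of what a custom filter might look like in `packetbeat.yml`. The host address and ports are made up for illustration, and the exact option names may vary by Packetbeat version, so check the configuration reference for your release:

```yaml
interfaces:
  device: any
  # Custom BPF filter (tcpdump syntax). This example restricts capture to
  # traffic to/from one host (e.g. the ES server at 10.0.0.5), so traffic
  # between other machines is ignored even if it passes this interface:
  bpf_filter: "host 10.0.0.5 and tcp port 9200"

protocols:
  http:
    ports: [9200]
```

Without such a filter, Packetbeat measures whatever traffic for the configured protocols actually traverses the interface it sniffs, which on a client host is normally only that client's own connections, not traffic exchanged purely between other machines.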
This topic was automatically closed after 21 days. New replies are no longer allowed.