Hello, everyone!
In this post, I would like to share how to calculate bandwidth utilization from RMON data.
RMON is a monitoring standard that tracks the packets a port transmits and receives. It counts packets by frame-size bucket and records totals such as total packets, bad packets, oversize packets, FCS (aka CRC) errors, jabbers, broadcasts, unicasts, and so on. Some RMON outputs do not show bandwidth utilization directly, but we can calculate it from the counters provided.
To calculate the utilization, take the Good Octets Received (bytes) and Good Octets Transmitted (bytes) counters. Some RMON outputs show these as Received Bytes and Transmitted Bytes instead. Then take the capture sampling period (in seconds). If no sampling period is provided, assume the counters are sampled once per second (1 second).
Here is the formula (bytes are multiplied by 8 to convert them to bits):

Rx Utilization (%) = (Good Octets Received x 8) / (Sampling Period x Configured Bandwidth) x 100
Tx Utilization (%) = (Good Octets Transmitted x 8) / (Sampling Period x Configured Bandwidth) x 100

Good Octets Received = the value of the Good Octets Received / Received Bytes counter
Good Octets Transmitted = the value of the Good Octets Transmitted / Transmitted Bytes counter
Sampling Period = the capture sampling period, in seconds. If no sampling period is provided, assume 1 second.
Configured Bandwidth = the port speed (10 Gbps, 1 Gbps, 100 Mbps, or 10 Mbps), expressed in bits per second
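As a quick sketch, the calculation above can be written in Python. The function name and the sample counter values below are my own illustration, not part of any RMON output:

```python
def utilization_percent(octets, sampling_period_s, bandwidth_bps):
    """Bandwidth utilization (%) from an RMON octet (byte) counter."""
    bits = octets * 8  # RMON counts bytes; bandwidth is in bits per second
    return bits / (sampling_period_s * bandwidth_bps) * 100

# Example: a 1 Gbps port that received 12,500,000 good octets in a 1-second period
rx_util = utilization_percent(12_500_000, 1, 1_000_000_000)
print(f"Rx utilization: {rx_util:.1f}%")  # 12,500,000 bytes x 8 = 100 Mbit -> 10.0%
```

On a full-duplex link the receive and transmit directions each have the full port bandwidth, so run the calculation separately for the Rx and Tx counters rather than summing them.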
That's all. I hope you found it useful.
Thank you!


