Latency and Bandwidth: Brothers from Another Mother

When it comes to network connectivity, bandwidth and latency are ‘brothers from another mother’ – close enough to be related but different enough to be noticeable. Understanding this distinction can mean the difference between an acceptable user experience and utter frustration.

Bandwidth is a measure of how much data can move between two nodes (measured in bits per second – Mbps, Gbps and so on), while latency is a measure of the delay in moving that data (measured in milliseconds). In other words, bandwidth measures size and latency measures speed.
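
To make the distinction concrete, here is a minimal, illustrative sketch (the numbers and function are hypothetical, not tied to any particular network or tool) showing how both measures contribute to the time it takes to deliver data: latency adds a fixed delay, while bandwidth determines how long the payload itself takes to move.

```python
# Illustrative only: a rough transfer-time estimate that ignores protocol
# overhead, congestion and round trips.

def transfer_time_seconds(payload_bits: float, bandwidth_bps: float, latency_ms: float) -> float:
    """Time to deliver a payload = one-way latency + time to push the bits through."""
    return (latency_ms / 1000.0) + (payload_bits / bandwidth_bps)

# A tiny 1 KB request on a 100 Mbps link is dominated by latency...
print(transfer_time_seconds(8_000, 100e6, latency_ms=150))   # ~0.15 s
# ...while a 10 GB replication job is dominated by bandwidth.
print(transfer_time_seconds(80e9, 100e6, latency_ms=150))    # ~800 s
```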

Do not conflate bandwidth and latency – size and speed are different measures. Imagine a car and a bus leaving Saskatoon at the same time, heading for a Rider game. The car seats four, the bus forty-eight. The car arrives in Regina thirty minutes sooner than the bus and gets a head start on tailgating. The car has lower latency than the bus, but the bus delivers twelve times more people.

It depends on your requirement. If getting there as fast as you can to party is your thing, then latency (speed) is what matters to you. However, if your requirement is to get as many people to the game as efficiently as possible, then bandwidth (size) is more important.

Bandwidth is crucial when you need to move large files – data replication is a good example. If those files need to arrive ‘on time’, however, then latency becomes vital. Think about a recent Zoom video call you had with your team: a conversation that flowed seamlessly versus a jumbled mess of people talking over each other – that is the difference between acceptable and unacceptable latency.

Why is low latency so important?

Well, imagine high latency occurring during a remote medical procedure. Or how about an autonomous vehicle out making a delivery? I would not want to be crossing the street when the AV’s braking algorithm is delayed by 150 milliseconds.

Most people focus on bandwidth as the main contributor to a poor network user experience, but latency can be the real culprit – it does not matter how much traffic you can move if it doesn’t arrive precisely when it’s needed. Any business application that demands fast, secure and reliable data access – such as machine data analytics, security analytics or operational analytics – needs low latency to be successful.

Conversely, business applications that do not require working with hot data (or primary workloads), such as archiving, backup and disaster recovery, can function without the strict high-performance and low-latency requirements of other functions. For every organization in every industry, reduced network latency can open doors to new cloud projects and flexibility, helping companies cut operational and infrastructure costs across the board.

The key takeaway here is that having enough bandwidth, while necessary, is not enough to ensure the performance of remote or cloud-based applications. High latency can have an extremely negative effect on collaboration tools, impacting productivity and derailing your cloud deployment strategy.

Next time: What Impacts Latency and tips on how you can improve it.

Download our latest infographic on latency here.
