Speedboat

Latency, The Next Broadband Measuring Stick?

For years now there has been a very narrow focus on the performance of broadband networks: bandwidth.  Every carrier touts how fast its latest plan is, but the truth is that doubling the speed of a connection does not always double the value of the experience.  There are system limitations and perhaps even technical issues to be resolved, but the bottom line is that looking at broadband one-dimensionally may be shortsighted.  There are other key metrics to consider, especially latency, reliability, and speed boosting.

Latency Lesson 101

Latency is the amount of time it takes data to travel from a computer to a predetermined spot on the ‘net and back again.  Latency translates readily into the snappiness, or responsive feel, of a program.  Low latency indicates a faster response time between the computer and whatever it is connecting to, and once minimum bandwidth thresholds have been met, it is in many regards the better overall metric.
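As a sketch of what latency measures, the time of a single TCP handshake approximates one round trip. The code below is illustrative only; the host and port are placeholders, and any reachable server would do:

```python
# Minimal latency sketch: time one TCP handshake, which costs roughly one
# round trip between this machine and the remote host.
import socket
import time

def rtt_ms(host, port=80, timeout=2.0):
    """Approximate round-trip time to host:port, in milliseconds."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # the connection completing means the handshake round trip is done
    return (time.perf_counter() - start) * 1000.0

# Usage (requires network access), e.g.:
#   print(f"{rtt_ms('example.com'):.1f} ms")
```

Real latency tools such as ping use ICMP rather than TCP, but the idea is the same: stamp the clock, wait for the far end to answer, and read the clock again.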

There is some relationship between latency and bandwidth, since moving larger quantities of data in both directions at once gets the job done sooner.  A reasonable analogy is a massive freeway where you can change the number of lanes available in either direction as well as the number of on/off ramps.  On/off ramps naturally cause a slight, temporary slowdown, and the cars representing data can sometimes hit a traffic jam.  With upload speeds tending to be far smaller than download speeds, the average data freeway might have ten lanes in one direction and only one or two in the other!  This means that when a car full of data makes a round trip, the return leg is likely to be the slowest and most agonizing.
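The lopsided round trip is easy to put numbers on. A back-of-the-envelope sketch, using made-up plan speeds rather than any specific carrier's offering:

```python
# Back-of-the-envelope sketch: how an asymmetric link skews transfer times.
# The 50 Mbps down / 5 Mbps up figures are illustrative, not a real plan.
def transfer_seconds(size_mb, rate_mbps):
    """Seconds to move size_mb megabytes at rate_mbps megabits per second."""
    return (size_mb * 8) / rate_mbps  # 8 bits per byte

download = transfer_seconds(100, 50)  # 100 MB down a 50 Mbps pipe
upload = transfer_seconds(100, 5)     # the same 100 MB back up a 5 Mbps pipe

print(f"down: {download:.0f} s, up: {upload:.0f} s")  # → down: 16 s, up: 160 s
```

Same file, same freeway, but the trip back up the narrow on-ramp takes ten times as long.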

Such an imbalance is referred to as asymmetrical, though symmetrical forms of broadband are also available.  While asymmetrical service tends to be the most affordable, it is not necessarily a bad thing.  That said, the disparity between download and upload speeds can cause programs that make full use of the bandwidth to display ‘lag’, as can situations in which a bandwidth-intensive application is paired with an application that requires low latency.  Simply put, the on/off ramps are causing a digital traffic jam.  Quality of service (QoS) technologies aim to alleviate this problem by offering express lanes for through traffic, but that creates a secondary bottleneck in the high-speed lanes, not unlike carpool lanes on a real freeway; one day these lanes are fast, the next they are the slowest.
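One common way a QoS scheme carves out its express lane is a priority queue: packets tagged latency-sensitive get dequeued ahead of bulk traffic. A toy sketch (the class and packet names here are invented for illustration, not any router's actual API):

```python
# Toy QoS sketch: an "express lane" built on a priority queue.
# Latency-sensitive packets jump ahead of bulk traffic.
import heapq

class QosQueue:
    def __init__(self):
        self._heap = []
        self._seq = 0  # tie-breaker keeps FIFO order within a traffic class

    def enqueue(self, packet, latency_sensitive=False):
        priority = 0 if latency_sensitive else 1  # lower number = served first
        heapq.heappush(self._heap, (priority, self._seq, packet))
        self._seq += 1

    def dequeue(self):
        return heapq.heappop(self._heap)[2]

q = QosQueue()
q.enqueue("backup-chunk-1")
q.enqueue("voip-frame", latency_sensitive=True)
q.enqueue("backup-chunk-2")
print(q.dequeue())  # → voip-frame (it jumps the line)
```

The carpool-lane problem is visible here too: if everything gets tagged latency-sensitive, the express lane is just as congested as the regular ones.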

Reliability is King

Consistency is a big deal in networking for the same reasons outlined above.  How reliable a connection tends to be is merely the tip of the iceberg, and there is plenty under the surface.  The factors below the waterline are similarly difficult to see, yet just as menacing to latency-sensitive tasks.  They include network utilization by neighbors, backbone load, and even the temperature!  This is one of the reasons that providers like Comcast do their best to control bandwidth consumption.

If the neighbors are hogging the bandwidth in the area, the much larger data freeway your connection feeds into may already be clogged.  A similar situation can exist where data flows from an ISP to the backbone of the Internet, and it becomes especially pronounced whenever regional weather knocks out backup and secondary communications systems.  Lastly, as temperature increases, the signal quality of electrical wiring decreases, which causes signals to be repeated in order to register properly.  The net result is that a number of factors beyond the control of the average user can each impact the reliability of a service, and when reliability goes down, latency suffers proportionately.
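The link between reliability and latency can be sketched with simple arithmetic: when a fraction of packets is lost or garbled and must be resent, the average delivery takes more transmissions. Assuming independent losses and ignoring timeout and queuing delays (a deliberate simplification):

```python
# Rough sketch: with loss probability p, a packet needs 1 / (1 - p)
# transmissions on average (a geometric distribution), so average latency
# scales by the same factor.  Timeout and queuing delays are ignored here.
def effective_latency_ms(base_rtt_ms, loss_rate):
    """Average delivery time once retransmissions are accounted for."""
    return base_rtt_ms / (1.0 - loss_rate)

for loss in (0.0, 0.01, 0.05, 0.10):
    print(f"{loss:.0%} loss -> {effective_latency_ms(30.0, loss):.1f} ms")
```

A 30 ms connection with 10% loss behaves, on average, like a 33 ms connection, and in practice timeouts make the penalty considerably worse than this simple model suggests.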

Speed Boost or Speeding Ticket?

Speed boost technologies come in many shapes and forms, but they are very much like other technologies taking the forefront in everything from the CPU to the graphics card: they can go above and beyond for short periods of time when the demand is there.  The problem, much as on the CPU or GPU performance side of a computer, is that the performance is not predictable.  Worse, some types of speed boosting technology use cached data to supply old information, which can actually increase latency as the new, correct information is fetched behind the scenes.

Author: Chad Weirick

Photo Credit: Ross Elliott
