Most people assume that their bandwidth is their internet speed, but that is not quite true: bandwidth is simply the amount of data a connection can carry per second. For example, if your internet provider offers you 100 Mbps, it does not mean you will get a download speed of 100 megabits per second, only that the connection could carry at most 100 megabits each second. Your effective internet speed comes from a combination of bandwidth and latency.
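To make the distinction concrete, here is a minimal sketch of the best-case transfer time bandwidth alone allows; the 500 MB file size is an illustrative assumption, and real transfers are slower once latency and protocol overhead are included.

```python
def transfer_time_seconds(size_megabytes: float, bandwidth_mbps: float) -> float:
    """Best-case time to move a file, ignoring latency and overhead."""
    size_megabits = size_megabytes * 8  # 1 byte = 8 bits
    return size_megabits / bandwidth_mbps

# A 500 MB file on a 100 Mbps link needs at least 40 seconds:
print(transfer_time_seconds(500, 100))  # → 40.0
```

Note the unit trap: providers advertise mega*bits* per second, while file sizes are usually shown in mega*bytes*, hence the factor of 8.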
What is Latency?
Latency is defined as the amount of time it takes to send information from one machine to another. It can also be described as the delay between the moment a server sends information and the moment the client receives it. Latency is most commonly measured in milliseconds (ms) and is often reported as the ping when running speed tests.
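One rough way to observe latency yourself is to time a TCP handshake, which requires a full round trip to the server. This is a sketch, not a precise ping replacement (ICMP ping and dedicated tools are more accurate); the host name is an illustrative assumption.

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443, timeout: float = 2.0) -> float:
    """Approximate round-trip latency by timing a TCP handshake."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established, so one round trip has completed
    return (time.perf_counter() - start) * 1000.0

if __name__ == "__main__":
    # Hypothetical target; any reachable server works.
    print(f"{tcp_rtt_ms('example.com'):.1f} ms")
```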
What is the difference between bandwidth and latency?
The best way to explain both these concepts is to visualize data flow through cables as water flowing through pipes.
- Latency corresponds to how long the water takes to travel the length of the pipe.
- Bandwidth corresponds to the width of the pipe, which determines how much water can move through it at a given time.
From this analogy, we can conclude that latency measures speed while bandwidth measures capacity. There is, however, an important relationship between the two.
The relationship between bandwidth and latency
There is a cause-and-effect relationship between latency and bandwidth, meaning one affects how the other behaves, and ultimately your internet speed. More precisely, the amount of bandwidth you have can greatly affect the latency you get: low bandwidth causes congestion (more data is being sent than the bandwidth allows, so data queues up), which in turn increases latency.
This can be illustrated by comparing a small one-lane road with a highway: 20 cars travelling on a five-lane highway will reach their destination faster than 20 cars squeezed onto a one-lane road.
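The queuing effect described above can be sketched as a quick calculation: the extra delay a packet sees is the backlog already queued ahead of it divided by the link's bandwidth. The backlog figures are illustrative assumptions.

```python
def queueing_delay_ms(backlog_megabits: float, bandwidth_mbps: float) -> float:
    """Extra delay, in ms, that new data sees when `backlog_megabits`
    of traffic is already queued on a link of `bandwidth_mbps`."""
    return backlog_megabits * 1000.0 / bandwidth_mbps

# 50 Mb queued on a 100 Mbps link adds 500 ms of latency;
# the same backlog on a 1000 Mbps link adds only 50 ms.
print(queueing_delay_ms(50, 100))   # → 500.0
print(queueing_delay_ms(50, 1000))  # → 50.0
```

This is why upgrading bandwidth can lower latency even though bandwidth itself is not a speed: a wider link drains the queue faster.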
What is an acceptable latency?
The perfect latency figure is 0 ms, but that is physically impossible: signals need time to travel, and even fiber optic cable, the fastest practical medium we currently use to transfer data over long distances, cannot eliminate that delay. The typical latency of fiber optic cable is about 4.9 microseconds per kilometer (0.0049 ms/km), and imperfections in the cable can increase it further.
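The per-kilometer figure above gives a hard floor on latency over any given distance. As a sketch, the distance below is the approximate New York to London great-circle distance (an illustrative assumption; real cable routes are longer):

```python
FIBER_MS_PER_KM = 0.0049  # figure quoted above for fiber optic cable

def propagation_delay_ms(distance_km: float) -> float:
    """One-way propagation delay over fiber, ignoring routing and queuing."""
    return distance_km * FIBER_MS_PER_KM

# Roughly 5,570 km from New York to London gives a one-way floor
# of about 27 ms, so a round trip cannot beat roughly 55 ms.
print(round(propagation_delay_ms(5570), 1))  # → 27.3
```

No amount of extra bandwidth can push latency below this floor; only moving the client and server closer together can.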
The acceptable latency figure is relative to the use case of the internet connection:
- If you are streaming video, latency of up to 4–5 seconds (4,000–5,000 ms) is acceptable
- If using VoIP (voice over IP), latency should be much lower, at about 150–200 ms
- If playing games online, the desired figure is lower still, at 30–50 ms
What factors affect latency?
Besides bandwidth, which we discussed previously, there are two main factors that affect latency:
The main factor is the distance between the client and the server. Since data takes time to traverse distance, the farther you are from the source of the data, the longer it will take to reach your machine.
The second factor is congestion, which goes hand in hand with bandwidth. The lower the bandwidth of your connection, the higher the chance of experiencing congestion. Think of it as a traffic jam where a road narrows: more cars arrive than there are lanes to accommodate them, so some must wait before they can continue to their destination.
Ways to reduce latency
If you are the end user, there is little you can do to reduce latency beyond choosing servers located geographically near you and dedicating your connection to the task at hand, which reduces congestion caused by other processes sharing the connection. For example, if you wish to download a large file as fast as possible, try not to watch YouTube over the same connection, as that takes bandwidth away from the download.
As a server admin, you could use the following techniques:
- Making fewer external HTTP requests
- Using a CDN
- Using prefetching methods
- Enabling browser caching
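As one sketch of the last technique, a server can set a Cache-Control header so that returning visitors reuse their local copy instead of re-fetching the asset, eliminating the request (and its latency) entirely. This minimal example uses Python's standard library; the asset body and one-day lifetime are illustrative assumptions.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class CachingHandler(BaseHTTPRequestHandler):
    """Serves a static asset with a long-lived Cache-Control header."""

    def do_GET(self):
        body = b"/* hypothetical static asset */"
        self.send_response(200)
        # Cache for one day (86400 s); the browser reuses its local copy
        # instead of making another round trip to the server.
        self.send_header("Cache-Control", "public, max-age=86400")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), CachingHandler).serve_forever()
```

The same idea underlies CDNs: serve the content from somewhere closer to the user, or, better still, from the user's own machine.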
For a more in-depth tutorial, check out this article on Network Optimization. (coming soon)