What is latency? It’s the time delay in the network. In other words, how long it takes to get a packet from node to node. You might be inclined to think that latency has something to do with bandwidth, but they are quite different characteristics.
Here’s an example. Suppose we are transmitting a data stream up to a satellite in geosynchronous orbit. One application is video feeds from the other side of the world. You see them on the TV news every night. Have you noticed that there is a slight delay in the connection? The TV anchor has to wait a second for the reporter to reply or they’ll talk over the top of each other. That pause is latency. It has to do with how long it takes for the signal to go up to the satellite and back down to the far location.
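To put a rough number on that pause, here's a quick back-of-the-envelope sketch in Python. The only inputs are the approximate altitude of a geostationary orbit (about 35,786 km) and the speed of light; treat the results as ballpark figures, not measurements.

```python
# Rough propagation delay for a geostationary satellite hop.
# Figures are approximations: ~35,786 km orbital altitude, signal at the speed of light.

SPEED_OF_LIGHT_KM_S = 299_792        # km per second in a vacuum
GEO_ALTITUDE_KM = 35_786             # approximate altitude of a geostationary orbit

one_way_s = GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S   # ground station up to the satellite
hop_s = 2 * one_way_s                                # up to the satellite and back down
question_and_answer_s = 2 * hop_s                    # the anchor's question takes one hop, the reply another

print(f"Up to the satellite:      {one_way_s * 1000:.0f} ms")
print(f"One full hop (up + down): {hop_s * 1000:.0f} ms")
print(f"Question and reply:       {question_and_answer_s * 1000:.0f} ms")
```

That works out to roughly a quarter second per hop and close to half a second before the anchor hears an answer, which is exactly the pause you see on the news.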
Now, how much do you suppose that it would speed up the television feed if we doubled the bandwidth? The answer: Not at all. As long as you don’t have packets waiting in a queue because of insufficient bandwidth, doubling, tripling or making the bandwidth 100x won’t speed up the circuit at all.
Why? Because the radio signal traveling through space is already moving at the speed of light. The Einsteinian limit of 186,000 miles per second sets the floor on how low latency can be between point A and point B. You can’t make the signal go faster, but you can sure make it go slower.
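Here's a minimal sketch of that floor. The city-pair distances are rough great-circle figures assumed for illustration; the point is that no amount of bandwidth gets you below these numbers.

```python
# The speed of light sets a hard floor on one-way latency between two points.
# Distances below are approximate great-circle figures, used only for illustration.

SPEED_OF_LIGHT_MI_S = 186_000   # miles per second, in a vacuum

routes_mi = {
    "New York - Chicago": 710,
    "New York - London": 3_460,
    "London - Tokyo": 5_940,
}

for route, miles in routes_mi.items():
    floor_ms = miles / SPEED_OF_LIGHT_MI_S * 1000
    print(f"{route}: at least {floor_ms:.1f} ms one way, no matter the bandwidth")
```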
Signals traveling to satellites take long enough as it is. If you have a satellite Internet connection, you know that quite well. VoIP is impossible unless you want to use your connection like a walkie-talkie. Only one person can talk at a time and then wait for the other to reply. That’s called half-duplex. We’re used to full duplex, where two people can talk at the same time and still hear each other without interruption.
Low latency was inherent in the analog telephone system. There was very little but wire connecting two telephone sets. It takes maybe a millisecond to go a hundred miles. The same is true for TDM (Time Division Multiplexed) circuits such as T1 lines. You lose a little in the conversion process, but the synchronized channel propagates as fast as the signal can travel in copper wires or glass fibers.
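As a rough illustration, signals in copper or glass travel at something like two-thirds of the vacuum speed of light. The exact velocity factor depends on the cable or fiber, so treat 0.66 here as an assumed rule of thumb:

```python
# Signals in copper or fiber travel at roughly two-thirds the speed of light in a vacuum.
# The exact velocity factor depends on the medium; 0.66 is an assumed rule of thumb.

SPEED_OF_LIGHT_MI_S = 186_000
VELOCITY_FACTOR = 0.66

def wire_delay_ms(miles: float) -> float:
    """One-way propagation delay over a copper or fiber path of the given length."""
    return miles / (SPEED_OF_LIGHT_MI_S * VELOCITY_FACTOR) * 1000

print(f"100 miles of wire or fiber: {wire_delay_ms(100):.2f} ms")    # roughly 0.8 ms
print(f"1,000 miles:                {wire_delay_ms(1000):.2f} ms")   # roughly 8 ms
```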
So, what slows things down on the Internet? It’s all those routers between source and destination. Each one adds some milliseconds or fraction thereof to the process. Get a dozen or more routers between you and the other end of the transmission, and you’ll notice the latency building up.
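Here's a crude sketch of how that builds up, using made-up but plausible per-router figures purely for illustration:

```python
# Each router in the path adds processing and queuing delay on top of propagation.
# The per-hop figures below are illustrative assumptions, not measurements.

propagation_ms = 8.0    # say, roughly 1,000 miles of fiber end to end
per_router_ms = 0.5     # assumed average processing/queuing delay per router

for hop_count in (2, 6, 12, 20):
    total_ms = propagation_ms + hop_count * per_router_ms
    print(f"{hop_count:2d} routers in the path: ~{total_ms:.1f} ms one way")
```

On a real connection you can watch the same thing happen with traceroute, which reports the round-trip time to each router along the path.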
Sometimes latency is important and sometimes it isn’t. The Internet was designed for things like scientific data file transfers and email. Neither of these applications is going to be much affected by a few dozen or hundred milliseconds of latency. It wasn’t until real-time interactive applications came along that anyone really took note of the latency issue.
What is latency sensitive? VoIP telephony for sure. Even so, 100 ms is considered quite workable for phone conversations. That same 100 ms is a real annoyance to real-time gamers, who will notice the lag between action and reaction. And in high speed financial trading, even a few milliseconds can be the difference between profit and loss. With new financial centers being built and traders increasingly using computers to automatically place their trades, the subject of networking latency has become a hot topic for very high speed networks.
So how do you decrease the latency on your network? First, forget the Internet or anything modeled after it. The Internet was designed to be self-healing, so it will route your packets any which way it can to get them to their destination. The lowest latency networks have high speed fiber in as straight a line as possible between locations. There is also as little equipment as possible between end points. Signal regenerators may be needed, but switches and routers need to be minimized. Whatever switching and routing equipment does exist has to be designed to minimize latency by running as fast as possible internally and doing as few functions as absolutely necessary. The more you process a signal, the longer it takes.
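To see why route and equipment both matter, here's an illustrative comparison. Every figure in it is an assumption: the detour factor, the device counts, and the per-device delays. The point is the relative sizes, not the exact numbers.

```python
# Comparing an idealized low-latency circuit with a typical routed path between the same two cities.
# All figures here are assumptions for illustration: distances, detour factors, and per-device delays.

SPEED_IN_FIBER_MI_S = 186_000 * 0.66   # light in glass, roughly 2/3 of its vacuum speed

def one_way_ms(path_miles: float, devices: int, per_device_ms: float) -> float:
    """Propagation delay over the fiber path plus processing delay at each device."""
    return path_miles / SPEED_IN_FIBER_MI_S * 1000 + devices * per_device_ms

straight_line_mi = 710   # e.g. roughly New York to Chicago

# Dedicated low-latency circuit: near-straight fiber, a handful of regenerators.
dedicated = one_way_ms(straight_line_mi * 1.05, devices=4, per_device_ms=0.05)

# Ordinary routed service: fiber follows rights-of-way, a dozen routers in the path.
routed = one_way_ms(straight_line_mi * 1.4, devices=12, per_device_ms=0.5)

print(f"Dedicated low-latency route: ~{dedicated:.1f} ms one way")
print(f"Typical routed path:         ~{routed:.1f} ms one way")
```

The straighter path and the shorter equipment list each shave off milliseconds, and in this business milliseconds are the whole game.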
If low latency is essential to your operation, you need to specify that. Just saying that you want a 10 Gbps connection will guarantee you bandwidth, but not necessarily the lowest latency possible. Many major carriers are sensitive to the needs of financial trading and other businesses where network latency makes a difference. They offer special low latency network connections designed specifically for those needs.
Do you have a low latency requirement for your business? If so, be sure you say so when you check high bandwidth network service prices and availability for your locations.