Jitter is the variation in latency, i.e. the difference in transmission delay, between packets transmitted between two systems in a computer network. This article covers the technical definition of the term, its consequences, and solutions for addressing the problem.
In computer networking, jitter refers to the variation in end-to-end transmission delay between packets of the same stream as they travel from one system to another.
This means that one packet takes longer than another to travel between the two systems; this is also called latency variation. Jitter has several causes: it can result from network congestion, timing drift, or routing changes.
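As a minimal illustration of latency variation, the sketch below takes a set of hypothetical one-way packet delays (the numbers are invented for the example) and reports the spread between the slowest and fastest packet, one simple way to express jitter:

```python
# Hypothetical one-way delays (in ms) observed for five packets of one stream.
delays_ms = [20.0, 22.5, 21.0, 35.0, 23.0]

# A simple spread metric: difference between the largest and smallest delay.
# With zero jitter, every packet would have the same delay and this would be 0.
jitter_spread_ms = max(delays_ms) - min(delays_ms)
print(f"latency variation (max - min): {jitter_spread_ms} ms")  # 15.0 ms
```

Real measurement tools use more refined statistics (see the RFC 3393 approach below in the article), but the idea is the same: jitter quantifies how much the delay varies, not how large it is.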
In electronics, the same term describes the fluctuation of a signal, which can be caused by electromagnetic interference.
Why is computer jitter a problem?
This effect is particularly problematic for real-time communications such as VoIP telephony, video conferencing, and other real-time services. Streaming, however, is largely unaffected, because the videos are already recorded and the player caches the content before playing it back.
If jitter is high (large latency variation), packets will sometimes all arrive at once, while at other times none will arrive. In a conference call, for example, the packets carry the caller's words. When the latency increases, the words stop arriving and the stream falls behind.
When the latency drops again, the words in transit all reach the recipient at the same time and the result is unintelligible: sentence fragments are lost. Jitter is also a problem for hosted servers and other VDI (Virtual Desktop Infrastructure) setups, for the same reasons.
How to measure jitter?
To measure this phenomenon, one approach is to look at the packets with the largest variation in transmission delay over a given time period. The transmission delay is measured from the start of the packet at the source to the arrival of the end of the packet at the destination.
For the instantaneous variation in packet delay (the delay difference between successive packets), RFC 3393 describes how to measure jitter. For example, if packets are sent every 10 ms but the second packet is received 20 ms after the first, the instantaneous delay variation is +10 ms: the packets spread apart, which is referred to as "dispersion". Conversely, if the second packet is received only 5 ms after the first, the instantaneous delay variation is -5 ms: the packets bunch together, which is referred to as "agglutination".
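The successive-difference measurement can be sketched in a few lines. The timestamps below are invented to match the examples above (packets sent every 10 ms); note that sign conventions vary between sources, and here a positive value means the later packet was delayed more than the earlier one:

```python
# Send and receive timestamps (ms) for three packets sent every 10 ms.
# The second arrives 20 ms after the first; the third, 5 ms after the second.
sent_ms = [0, 10, 20]
recv_ms = [50, 70, 75]

# One-way delay of each packet.
delays = [r - s for s, r in zip(sent_ms, recv_ms)]  # [50, 60, 55]

# Instantaneous packet delay variation between successive packets:
# D(i+1) - D(i), following the RFC 3393 definition.
ipdv = [delays[i + 1] - delays[i] for i in range(len(delays) - 1)]
print(ipdv)  # [10, -5]: packets first spread apart, then bunch together
```

A positive difference corresponds to dispersion (the gap between packets grows) and a negative one to agglutination (the gap shrinks).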
How can we limit this phenomenon?
To limit this phenomenon for multimedia streams, you can use a jitter buffer, either on a network router or on a computer at the receiving end. This is an area in which packets are collected, stored, and sent on to the voice processor at equally spaced intervals.
The application consuming the network packets then receives them from the buffer, which reduces the latency variation. There are two types of jitter buffers: static and dynamic. A static buffer is hardware-based and configured by the manufacturer. A dynamic buffer is software-based, can be configured by the network administrator, and is designed to adapt to changes in network delays. Another way to limit jitter is to route traffic over the most stable routes.
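The buffering idea can be sketched as follows. This is a simplified model, not a real implementation: the function name, the hold time, and the arrival times are all illustrative assumptions. Packets arrive irregularly, are held for an initial delay, and are then released at a fixed cadence:

```python
def playout_times(arrival_ms, hold_ms, interval_ms):
    """Return the time (ms) at which each packet is handed to the decoder.

    arrival_ms  -- irregular arrival times of the packets
    hold_ms     -- initial buffering delay before playout starts
    interval_ms -- fixed spacing at which packets are released
    """
    playout = []
    for i, arrived in enumerate(arrival_ms):
        # Each packet gets an evenly spaced playout slot.
        scheduled = arrival_ms[0] + hold_ms + i * interval_ms
        # A packet arriving after its slot is late; a real buffer would
        # drop it or glitch. Here we simply release it on arrival.
        playout.append(max(scheduled, arrived))
    return playout

arrivals = [0, 25, 30, 58, 80]          # jittery arrivals (ms)
print(playout_times(arrivals, 40, 20))  # [40, 60, 80, 100, 120]
```

The irregular arrival spacing (25, 5, 28, 22 ms gaps) comes out as a steady 20 ms cadence, at the cost of the initial 40 ms hold: the classic trade-off between jitter tolerance and added latency.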