
Network congestion control algorithms, are they really fair?


A new study on network congestion points to possible bias in the algorithms that were designed to ensure all users get fair access to the network. The researchers report that congestion control algorithms, including TCP BBR, can be biased, and that this inevitably leads to starvation, a situation in which one or more senders receive almost no bandwidth compared with other senders.

Computers and devices that transmit data over the Internet break the data into smaller packets and use special congestion control algorithms to decide the rate at which those packets are sent. Without such an algorithm, a computer could not choose a sensible sending rate, because it lacks information such as the quality of the network connection or how many other senders are sharing the network.

Congestion control algorithms infer congestion from packet loss and delay and use that to decide how fast to transmit. If packets are sent too slowly, the available bandwidth is not fully utilized; if they are sent too quickly, queues build up and packets are dropped, and those dropped packets must be retransmitted, adding further delay. In other words, the congestion control algorithm's job is to minimize delay while sharing the available network capacity fairly among users.
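To make the idea concrete, here is a minimal sketch of a delay-based controller. This is not BBR or any algorithm from the study; the class name, thresholds, and constants are invented for illustration. It keeps probing for more bandwidth while the measured round-trip time stays near its baseline, and backs off once the extra delay suggests a queue is building up.

```python
# Toy delay-based congestion controller (illustrative only; not BBR or any
# production algorithm). The sending rate is adjusted from round-trip-time
# samples: low delay -> probe for more bandwidth, rising delay -> back off.

class ToyDelayController:
    def __init__(self, initial_rate_mbps=1.0, min_rate_mbps=0.1):
        self.rate = initial_rate_mbps      # current sending rate (Mbps)
        self.min_rate = min_rate_mbps      # never drop below this floor
        self.base_rtt = float("inf")       # lowest RTT seen so far (ms)

    def on_rtt_sample(self, rtt_ms):
        # Track the minimum RTT as an estimate of the uncongested path delay.
        self.base_rtt = min(self.base_rtt, rtt_ms)

        # Queueing delay = how much this sample exceeds the baseline.
        queueing_delay = rtt_ms - self.base_rtt

        if queueing_delay < 5.0:           # hypothetical 5 ms tolerance
            self.rate += 0.1               # additive increase: probe for more
        else:
            # multiplicative decrease: assume the queue is filling up
            self.rate = max(self.min_rate, self.rate * 0.8)
        return self.rate
```

The increase-gently, back-off-sharply shape mirrors the general pattern many real controllers follow, but real algorithms such as BBR estimate bandwidth and delay in far more sophisticated ways.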

However, the Internet is complex, and packets can be delayed or lost for reasons that have nothing to do with congestion. Congestion control algorithms cannot distinguish delays caused by congestion from delays unrelated to congestion, which the researchers collectively call jitter. This unpredictable jitter confuses the algorithms, leading them to send packets at unequal rates, and in the worst case a sender is almost completely shut out of the network.
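The effect can be imitated with a toy experiment (again purely illustrative, with made-up numbers and no relation to the paper's formal model): two senders share a bottleneck and follow the same back-off rule, but one of them also sees random non-congestion jitter in its delay samples, and that alone is enough to push it toward starvation.

```python
import random

# Toy experiment (illustrative only): two senders back off whenever their
# measured delay exceeds a threshold. Sender B's delay samples also contain
# random non-congestion jitter, so it backs off far more often and ends up
# with a much smaller share of the link.

random.seed(0)
CAPACITY = 10.0      # hypothetical bottleneck capacity (Mbps)
THRESHOLD = 5.0      # delay threshold that triggers a back-off (ms)

rate_a, rate_b = 1.0, 1.0
for step in range(10_000):
    # Queueing delay grows when the combined rate exceeds the link capacity.
    queue_delay = max(0.0, (rate_a + rate_b - CAPACITY) * 5.0)

    # Sender A sees only congestion delay; sender B also sees random jitter.
    delay_a = queue_delay
    delay_b = queue_delay + random.uniform(0.0, 8.0)

    rate_a = rate_a + 0.01 if delay_a < THRESHOLD else max(0.1, rate_a * 0.8)
    rate_b = rate_b + 0.01 if delay_b < THRESHOLD else max(0.1, rate_b * 0.8)

print(f"sender A: {rate_a:.2f} Mbps, sender B: {rate_b:.2f} Mbps")
# In a typical run, sender A ends up with most of the link while sender B
# hovers near its floor rate: a starvation-like imbalance caused by jitter.
```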

The team points out that, given the complexity of network paths and everything they can do to data packets, it is essentially impossible for current congestion control algorithms to avoid these problems. Even after experimenting with existing algorithms and designing new ones, they found that in every case there were scenarios in which some sender was shut out of the network. The researchers said they were surprised by this result, since such algorithms are widely believed to be fair.

The study found that current algorithms that converge to a steady delay cannot prevent starvation, but that the problem might be avoidable with an algorithm that does not concentrate its delay in this way. The team plans further research into whether such an algorithm can be found, noting that the fact that so simple a problem went unnoticed for so long in widely used algorithms shows how difficult it is to understand these algorithms through empirical experiments alone. Related content can be checked here.
