Occasionally I read about Quality of Service provisions which can be used, for example, to ensure that video data transmitted across networks is treated with priority so that frames do not get dropped during videoconferencing.
This strikes me as very Wrong.
It used to be that there were networks and there were clients on those networks, which used more or less bandwidth depending on the application. Some communications across the networks needed minimal lag, while for others timing was less critical (for file transfers, say, it typically doesn't matter whether a given packet arrives a tenth of a second late or exactly "on time" relative to other packets).
Then along came applications such as video streaming. These typically needed large bandwidth; they also typically wanted minimal lag and minimal data loss.
Then what happens? Not only is this newcomer tolerated, it's actually rewarded for its bandwidth-guzzling, time-sensitive nature!
Essentially, it seems that QoS is telling such applications, "It's perfectly fine for you to use up much of the bandwidth previously shared by many other applications, and we'll even let you through more quickly than existing, more polite and bandwidth-conscious applications!"
It's not my fault that Alice wants to videoconference with Bob rather than, say, calling him on the phone or sending him (plain-text) email; I don't quite see why my data packets should suddenly be demoted to second-class citizens just because she thinks that real-time video streaming should be preferred and this preference "enforced" with QoS.
In my mind, their packets should have no more priority than mine, and should compete equally and fairly for bandwidth; preferably, they'd even be throttled to ensure that they do not take up an excessive amount of bandwidth and to reduce the slowdown that other applications suffer.
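To make the contrast concrete, here's a toy sketch (my own illustration, not any real router's implementation) of the two scheduling philosophies: a strict-priority drain, where a QoS-favoured class always goes first, versus a round-robin fair queue, where every traffic class gets one packet per turn. The class names and traffic are made up.

```python
# Toy comparison of strict-priority vs fair (round-robin) scheduling.
# Hypothetical illustration only; real QoS schedulers are far richer.
from collections import deque

def strict_priority(queues):
    """QoS-style: fully drain higher-priority classes before lower ones."""
    order = []
    for prio in sorted(queues):          # 0 = highest priority
        q = deque(queues[prio])
        while q:
            order.append(q.popleft())
    return order

def round_robin(queues):
    """Fair-share style: one packet per class per turn, no class favoured."""
    order = []
    qs = [deque(pkts) for _, pkts in sorted(queues.items())]
    while any(qs):
        for q in qs:
            if q:
                order.append(q.popleft())
    return order

traffic = {0: ["video1", "video2", "video3"],   # QoS-favoured class
           1: ["file1", "file2", "file3"]}      # best-effort class

print(strict_priority(traffic))
# → ['video1', 'video2', 'video3', 'file1', 'file2', 'file3']
print(round_robin(traffic))
# → ['video1', 'file1', 'video2', 'file2', 'video3', 'file3']
```

Under strict priority, the best-effort class sees every one of its packets delayed behind the entire video queue; under round robin, both classes make steady progress — which is the "compete equally and fairly" behaviour I have in mind.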
Or am I grossly misunderstanding something about QoS?