The effects of variable network behaviour on audio or video are quite hard to estimate.
Typically, a playout buffer will be sized according to the delays in the encoder, compressor, decoder, decompressor and transmission, together with the overall delay budget, so as to put a drop-dead deadline on the arrival of packets over the Internet.
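The budgeting described above can be sketched as follows; the delay components and their millisecond values are purely illustrative, not figures from the text:

```python
# Sketch of a playout-deadline calculation: the deadline is the capture time
# plus the codec and transmission delay budget plus a jitter margin.
# All parameter names and values are illustrative assumptions.

def playout_deadline(capture_ts_ms, encode_ms, network_ms, decode_ms, jitter_margin_ms):
    """Absolute time (ms) by which the packet must arrive to be played out."""
    return capture_ts_ms + encode_ms + network_ms + decode_ms + jitter_margin_ms

def usable(arrival_ts_ms, deadline_ts_ms):
    # A packet arriving after its drop-dead deadline is treated as lost.
    return arrival_ts_ms <= deadline_ts_ms
```

A packet captured at time 0 with a 20 ms encode delay, 50 ms nominal network delay, 10 ms decode delay and a 40 ms jitter margin must arrive by t = 120 ms; anything later is discarded just as if it had been dropped in the network.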
Packets are either lost in transit, re-ordered, or delayed, so that some number simply do not arrive in time. The loss distributions in the Internet are complex, and change as the network evolves. However, the degradation of the received and perceived signal will increase with compression and loss: since compression exploits redundancy, it removes the signal's very tolerance for loss. Lossy compression is worse still, since it reduces the signal to its bare essential components.
The designer of a compression scheme for packet transmission has two possible approaches to this problem: firstly, careful choice of compression scheme and ratio, to minimize the impact of loss at the average delivered rate; secondly, the use of smart protection by re-adding redundancy to the packets in a way that is specifically designed to protect against packet loss.
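One well-known instance of the second approach is to piggyback a low-bit-rate copy of each frame in the following packet, so a single lost packet can be patched from its successor. The sketch below assumes this scheme; the function names and the `low_rate` re-encoder are hypothetical:

```python
# Sketch of redundancy-based protection: packet n carries frame n plus a
# low-bit-rate copy of frame n-1 (scheme and names are illustrative).

def protect(frames, low_rate):
    """Pair each primary frame with a low-rate copy of its predecessor."""
    packets = []
    prev_redundant = None
    for f in frames:
        packets.append((f, prev_redundant))
        prev_redundant = low_rate(f)
    return packets

def recover(received):
    """received: dict of seq -> (primary, redundant) for packets that arrived.
    Returns seq -> best available frame, patching single gaps."""
    out = {seq: primary for seq, (primary, _) in received.items()}
    for seq, (_, redundant) in received.items():
        if seq - 1 not in out and redundant is not None:
            out[seq - 1] = redundant  # fill the gap with the low-rate copy
    return out
```

The receiver then plays the low-rate copy in place of the lost frame: quality dips briefly instead of dropping out, at the cost of a modest bandwidth overhead and one extra packet of delay before the gap can be filled.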
Suffice it to say that under heavy load there are significant losses, and that they are quite often spatially and temporally correlated (that is to say, if a packet is lost, the next packet to the same place is more likely to be lost, and packets destined for similar areas of the network are also more likely to be lost).
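Temporally correlated loss of this kind is commonly modelled with a two-state (Gilbert-style) Markov chain: a "good" state with rare losses and a "bad" state that tends to persist, producing bursts. The transition probabilities below are illustrative, not measured values:

```python
import random

# Two-state loss model: a common way to capture temporally correlated
# (bursty) packet loss. Transition probabilities are illustrative.

def gilbert_losses(n, p_good_to_bad=0.05, p_bad_to_good=0.3, seed=1):
    """Return a list of n booleans; True means the packet was lost."""
    rng = random.Random(seed)
    lost, state_bad = [], False
    for _ in range(n):
        if state_bad:
            state_bad = rng.random() >= p_bad_to_good  # bad state persists
        else:
            state_bad = rng.random() < p_good_to_bad   # occasional entry to bad
        lost.append(state_bad)
    return lost
```

With these parameters the long-run loss rate is roughly p_gb / (p_gb + p_bg) ≈ 14%, but the losses arrive in runs rather than independently, which is exactly what defeats schemes designed for uniform random loss.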
The size of a packet carrying audio or video in the Internet is typically in the range of 320 bytes up to 1500 bytes. This means that even a single lost packet takes a significant amount of media data with it.
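A quick calculation makes the point concrete; the 64 kbit/s codec rate assumed here is illustrative (it corresponds to ordinary PCM telephony audio):

```python
# How much media one packet carries, assuming an illustrative codec
# rate of 64 kbit/s: payload_bytes * 8 bits / (kbit/s) gives milliseconds.

def packet_duration_ms(payload_bytes, codec_kbps):
    return payload_bytes * 8 / codec_kbps
```

At 64 kbit/s a 320-byte packet holds 40 ms of audio and a 1500-byte packet holds 187.5 ms, so one lost packet can erase several whole phonemes of speech.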
There are a variety of enhancement approaches to protecting the media stream by clever mapping of the data into packets. One can go further and add generic packet-level FEC codes, or even media-specific FEC [#!rat!#][#!freephone!#]; or, when the delays are sufficiently low, or in a playback situation, recover from loss using retransmission strategies. Some researchers propose a hybrid of media-specific FEC and retransmission.
Finally, it remains to be seen whether new scalable CODECs might be devised that, combined with intelligent distribution of the compressed media data over a sequence of packets, lead to better quality in the face of the Internet's current curious, and somewhat pathological, loss behaviour.
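One form such intelligent distribution might take is interleaving: spreading consecutive frames across several packets so that losing one packet costs scattered single frames (which concealment can patch) rather than a contiguous burst. This is a minimal sketch under that assumption; the geometry is illustrative:

```python
# Sketch of interleaved distribution of frames over packets: packet i
# carries frames i, i+depth, i+2*depth, ... (depth is illustrative), so a
# single packet loss removes isolated frames rather than a contiguous run.

def interleave(frames, depth):
    """Split frames into `depth` packets, each taking every depth-th frame."""
    return [frames[i::depth] for i in range(depth)]

def affected_frames(packet_index, depth, n_frames):
    """Which frame numbers are lost if this one packet is lost."""
    return list(range(packet_index, n_frames, depth))
```

With eight frames and a depth of four, losing one packet costs frames 0 and 4, say, which are far enough apart for a decoder to conceal; the price is extra buffering delay of one whole interleaving group at the receiver.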