Change default audio to 44100 Hz, not 48000 Hz

DeMoN

Member
Some months ago you made it possible to choose 44100 Hz again, and I really thank you for that.

But you should also make 44100 Hz the default.

Nobody in the world needs 48 kHz for a stream; it is a complete waste of bitrate in the wrong direction to do lossy audio coding at a completely unnecessary 48 kHz.

Why is 48khz the default?

A low bitrate is much better invested in 44.1 kHz. That way you keep more quality instead of wasting it on an unnecessary 48 kHz.

Please change the default to 44.1 kHz.

People without much knowledge tend to stay at the default settings, of course. So lots of people now stream low-bitrate audio at 48 kHz :S
 

Muf

Forum Moderator
DeMoN said:
Why is 48khz the default?
Because 99.9% of audio devices and games default to it, and by using the same sample rate as the audio device you avoid having to resample the audio, which reduces the quality needlessly. For a given MP3/AAC bitrate there is no quality/bandwidth advantage to using 44.1 kHz, as psychoacoustic modeling will eliminate inaudible frequencies anyway. Also, since 99.9% of audio devices default to 48 kHz as their playback frequency, you avoid your viewers having to resample from 44.1 kHz back to the 48 kHz of their audio device, again reducing quality needlessly.
 

Krazy

Town drunk
I'm also curious where you are getting this notion that the higher sample rate is bad for lower bitrates. I've not been able to find any information saying this at all. I'm not even really sure sample rate and bitrate are completely related.
 

ThoNohT

Developer
Of course they are related. Say we are using 16-bit audio; then every sample takes 16 bits, or 2 bytes. More samples per second means more bits needed.

So for uncompressed 16-bit 44.1 kHz audio, you need 705600 bits per second. For uncompressed 16-bit 48 kHz audio, you need 768000 bits per second. So while the difference is only a whopping 8.8%, sample rate is related to bitrate. Now, of course, we do compress audio. So in the end, the bitrate is whatever you set it to anyway; the only difference is that the encoder has to compress about 9% more data to hit it.
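The arithmetic above can be sketched in a few lines of Python (a hypothetical helper for illustration, not anything from OBS; the figures assume mono audio, as in the example):

```python
def pcm_bitrate(sample_rate_hz: int, bits_per_sample: int = 16, channels: int = 1) -> int:
    """Bits per second of raw, uncompressed PCM audio."""
    return sample_rate_hz * bits_per_sample * channels

rate_441 = pcm_bitrate(44100)  # 44100 * 16 = 705600 bits/s
rate_48 = pcm_bitrate(48000)   # 48000 * 16 = 768000 bits/s

# Relative increase when going from 44.1 kHz to 48 kHz:
print(f"{(rate_48 / rate_441 - 1) * 100:.1f}%")  # about 8.8%
```

For stereo both numbers simply double (`channels=2`), so the 8.8% ratio between the two rates stays the same.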

Which means you really shouldn't worry about the sample rate vs. bitrate question at all. What's more important, as Muf said, is the playback frequency that most common devices use, and aiming to keep the number of resampling steps to a minimum.
 