Question / Help Stream bitrate measured in bytes but record bitrate measured in bits

jeffrey6301

New Member
I've done a lot of tests: my internet upload speed is around 40 Mbps, and I can stream at 5000 kbps, but if I set it any higher, frames start to drop.
That makes it seem like the stream bitrate is measured in bytes.
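Here's the arithmetic behind that hunch as a quick Python sketch (just the two numbers from above; reading the ratio as exactly 1/8 is my interpretation):

```python
# My numbers from above: upload capacity vs. the highest stable stream setting.
upload_mbps = 40      # measured upload speed, in megabits per second
stable_kbps = 5000    # highest stream bitrate before frames start to drop

upload_kbps = upload_mbps * 1000
print(upload_kbps / stable_kbps)  # 8.0, i.e. drops begin at exactly 1/8 of the bandwidth
```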

But when I set my record bitrate to 8000 kbps, the output file of a 23:28 video is only 1,436,891,385 bytes.
If it is bytes: 8000*1000*(23*60+28) = 11,264,000,000 bytes.
If it is bits: 8000*1000*(23*60+28)/8 = 1,408,000,000 bytes.
So obviously it's bits.
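For anyone who wants to check the math, here is the same comparison as a small Python sketch (attributing the ~2% overshoot to container/audio overhead is my own assumption):

```python
# Expected file size for a 23:28 recording at 8000 kbps, under both readings.
bitrate_kbps = 8000
duration_s = 23 * 60 + 28          # 1408 seconds
measured_bytes = 1_436_891_385     # actual size of my output file

size_if_bytes = bitrate_kbps * 1000 * duration_s       # 11,264,000,000
size_if_bits = bitrate_kbps * 1000 * duration_s // 8   # 1,408,000,000

print(size_if_bytes, size_if_bits, measured_bytes)
print(measured_bytes / size_if_bits)  # ~1.02, within ~2% of the "bits" prediction
```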

I did another test to confirm it: recording at 64000 kbps for 53 seconds gives an output file of 426,880,710 bytes.
64000*1000*53/8 = 424,000,000 bytes

Almost the same, which means the record bitrate is measured in bits.
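Working backwards from the file size gives the same answer; a quick sketch with the numbers above:

```python
# Work backwards: the effective bitrate implied by the 53-second test file.
file_bytes = 426_880_710
duration_s = 53

effective_kbps = file_bytes * 8 / duration_s / 1000
print(round(effective_kbps))  # ~64435, close to the 64000 kbps I set
```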
Then why is the stream bitrate measured in bytes?
It can't be that my upload speed is 40 Mbps yet frames start to drop once the stream uses only 1/8 of the bandwidth.
 