1080p@60fps means a resolution of 1920x1080.
So the number of pixels is 1920 * 1080 = 2073600 pixels.
One raw pixel has 8 bits for red, 8 bits for green and 8 bits for blue, thus 24 bits or 3 bytes.
In bytes, one raw frame is 2073600 pixels * 3 bytes = 6220800 bytes per frame (one frame is one rendered image).
60 frames per second means 60 full raw frames every second, thus 6220800 bytes per frame * 60 frames per second = 373248000 bytes per second.
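If you want to double-check the raw data rate, here is the same arithmetic as a small Python snippet:

```python
# Raw data rate of uncompressed 1080p60 video (24-bit RGB).
width, height = 1920, 1080
bytes_per_pixel = 3                        # 8 bits each for R, G, B
frame_bytes = width * height * bytes_per_pixel
raw_bytes_per_second = frame_bytes * 60    # 60 full frames per second

print(frame_bytes)           # 6220800
print(raw_bytes_per_second)  # 373248000
```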
If you stream this with bitrate 6000 (the bitrate recommended by Twitch for 1080p 60 fps), you are sending at 6000 kbps, that is 6000 kilobits per second, which is 6000 * 1000 bits per second = 6000000 bits per second.
8 bits are 1 byte, so 6000000 bits per second = 6000000 / 8 bytes per second = 750000 bytes per second.
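The bitrate conversion in Python, for completeness:

```python
# Convert the 6000 kbps stream bitrate into bytes per second.
bitrate_kbps = 6000
bits_per_second = bitrate_kbps * 1000         # 6000000 bits per second
compressed_bytes_per_second = bits_per_second // 8

print(compressed_bytes_per_second)  # 750000
```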
Now you can compare the ratio between raw data and compressed data. Raw data is 373248000 bytes per second. Compressed data is 750000 bytes per second. This is a ratio of 750000 / 373248000 = 0.002. This means video encoding compresses the data by the factor 0.002 (or to about 1/500 of the raw size, if you want it descriptive).
Per pixel: with 24 bits per raw pixel, this is 24 * 0.002 = 0.048 bits per compressed pixel.
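Putting both sides together, the ratio and the bits-per-pixel figure fall out like this (the numbers are rounded in the text above; the exact ratio is slightly above 0.002):

```python
# Compression ratio and effective bits per compressed pixel.
raw_bytes_per_second = 1920 * 1080 * 3 * 60     # 373248000
compressed_bytes_per_second = 6000 * 1000 // 8  # 750000

ratio = compressed_bytes_per_second / raw_bytes_per_second
bits_per_pixel = 24 * ratio                     # 24 raw bits per pixel, scaled down

print(round(ratio, 4))           # 0.002
print(round(bits_per_pixel, 3))  # 0.048
```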
tl;dr
A 1080p 60fps stream with bitrate 6000 carries 0.048 bits per pixel.
You wanted math, nothing more, nothing less; you got math, nothing more, nothing less. But I doubt you can deduce anything useful from it with regard to your streams or recordings.
In the end you need to balance the trinity of resolution, bitrate (i.e. quality), and resource consumption into something that gives the best-looking video for your given hardware and your given raw material. Usually, you can only ever get 2 out of the 3. That's different for every one of us here in the forum. If you just want a setting suited to your machine, fire up OBS, run Tools -> Auto Configuration Wizard, and forget about all the math.