No problem! What I mean is that when you measure % of CPU used, it usually answers the question "Of all the cycles the CPU performed in one second, how many of those were for OBS?" But CPUs can run at different clock rates (cycles per second) at different times - and the operating system (i.e. Windows or Linux) influences that.
As an example, with my laptop unplugged I can see the following clock rates (only showing the first four cores):
Code:
user@host:~$ cat /proc/cpuinfo | grep "MHz"
cpu MHz : 1397.164
cpu MHz : 1397.332
cpu MHz : 1400.000
cpu MHz : 1397.320
And when I look at htop, OBS is reporting about 15-16% of one CPU when idle.
Whereas here's what I see with the laptop plugged in, again using cpuinfo:
Code:
user@host:~$ cat /proc/cpuinfo | grep MHz
cpu MHz : 2984.146
cpu MHz : 2992.961
cpu MHz : 1494.243
cpu MHz : 1494.066
The numbers bounce around a bit because the P-state is responding to load/demand. When I look at htop, OBS is reporting about 13% of one CPU. Note that overall I see about 3% usage across all the cores on my AMD Ryzen 9 4900HS when OBS is idle.
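If you want to watch those clocks move in real time, something like this works on most Linux systems (assuming your kernel exposes per-core MHz in /proc/cpuinfo, which it usually does):
Code:
watch -n1 'grep "MHz" /proc/cpuinfo'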
What I'm saying is: we don't know if Windows or Linux is using "more" CPU yet because we don't know what clock rates Windows/Linux have been working with. The %age isn't enough on its own.
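To make that concrete with my numbers above, you can normalise the percentage by the clock rate into a rough "MHz worth of work" figure. This ignores that each core runs at its own clock and that the process migrates between cores, so treat it purely as a back-of-the-envelope sketch:
Code:
# ~15% of a ~1400 MHz core (unplugged)  vs  ~13% of a ~3000 MHz core (plugged in)
awk 'BEGIN { printf "unplugged:  ~%.0f MHz of work\nplugged in: ~%.0f MHz of work\n", 0.15*1400, 0.13*3000 }'
# -> roughly 210 MHz vs 390 MHz
So even though the plugged-in percentage looks lower, OBS is actually getting through more cycles per second - which is exactly why a raw percentage can't be compared across machines or power states on its own.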
Finally: you might also want to look at the GPU load. I use nvtop (see https://github.com/Syllo/nvtop) and, again, the % GPU used depends on whether I'm plugged in or not, because the GPU also changes its clock rate based on demand/power settings. So when I'm recording (which requires encoding on the GPU) I might see 23% GPU when my laptop is unplugged, but only 5% when plugged in - because the GPU clock rate has changed from 200MHz to 1750MHz.
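If you want a quick one-off reading of the GPU clock alongside its utilisation (nvtop shows both live), nvidia-smi can report them - this assumes an NVIDIA GPU with the driver tools installed:
Code:
# current graphics clock and GPU utilisation, refreshed every second
nvidia-smi --query-gpu=clocks.current.graphics,utilization.gpu --format=csv -l 1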
Of course, this all changes a lot once I start adding things to the scene, so these numbers are just to illustrate some of the quirks of measuring CPU/GPU usage ... they're not intended to show how much CPU/GPU % you might actually see in the wild.