I'm still seeing similar posts that seem to describe the same sort of issue, so I'm going to add mine here and see if I can get some pointers. This seems like a bit more than just GPU-overload-related.
I can't seem to figure out and solve this issue involving OBS's FPS with DX11 vs. DX12 — some real performance "weirdness". I've tried so many different approaches and can't seem to put my finger on what's really going on, especially since when I run the same type of test on an older CPU/GPU combo I get better results, with realistically the only difference being that it's Intel-based. DX12 gives me insane headroom to avoid GPU overload on the AMD one. I never noticed the issue on my old system; I only came across it with the new one.
Now I've spent weeks testing different ideas: 2+ year old Windows ISOs, older OBS versions, 32/64-bit, SLOBS, NDI, NVENC, even a new motherboard. In Division 2, Tomb Raider, etc., the same thing happens, especially with DX11. I have to massively limit settings/FPS to avoid the GPU overload. Oddly, DX11 has the issue at 1440/140 with settings majorly reduced. If I change to DX12, I can literally max everything — I'm talking RTX shadows, 4K, etc. — and still barely dent the OBS FPS, which seems like a crazy performance jump. Obviously DX12 would be the way to go, but it's so finicky with a lot of games.
I see the GPU overload on the Intel system too, but nothing nearly as significant. So this has been pointing me toward something with either Windows, or perhaps AMD itself somehow, since I can't understand how the much older hardware handles it better. I see a lot of people with the same thing happening posting with some form of AMD hardware — though perhaps I'm grasping at straws here.
Old system quick rundown: 4930K / GTX 780 Ti
New: Ryzen 2700X / RTX 2080 Ti (I've even tried pushing NVENC onto my GTX 1080)
A recent OBS log: https://obsproject.com/logs/YBKfF-UKAmYUCv0Z
I've gone through all the ideas I can come up with, for weeks now. I'll gladly run through them again with someone from OBS etc. if needed, to try to either solve this or figure out what the root cause could be. Thanks guys!