Game and Window capture fails because D3D11 texture's pixel format is incorrect

xOTV6dj3

New Member
Display capture works fine. Both Game and Window capture just display a black screen.

Relevant OBS log:
19:29:04.980: [game-capture: 'Game Capture'] d3d11_shtex_init: failed to create texture
19:29:04.980: [game-capture: 'Game Capture'] ----------------- d3d11 capture freed ----------------
19:29:04.996: [game-capture: 'Game Capture'] create_d3d11_tex: failed to create texture (0x80070057): The parameter is incorrect.

Because I'm developing the D3D11 application, I can capture the debug output. There seems to be one of these errors for each corresponding entry in the OBS log:
D3D11 ERROR: ID3D11Device::CreateTexture2D: CreateResource: DXGI_FORMAT_R8G8B8A8_UNORM and DXGI_FORMAT_R8G8B8A8_UNORM_SRGB cannot be created with D3D10_DDI_RESOURCE_MISC_SHARED. Consider DXGI_FORMAT_B8G8R8A8_UNORM[_SRGB] instead. These formats are available in Direct3D9 as D3DFMT_A8R8G8B8, and in Direct3D10.1 (when CheckFormatSupport so indicates) and for all Direct3D11 devices. [ STATE_CREATION ERROR #1048617: CREATERESOURCE_DXGI_FORMAT_R8G8B8A8_CANNOT_BE_SHARED]
D3D11 ERROR: ID3D11Device::CreateTexture2D: Returning E_INVALIDARG, meaning invalid parameters were passed. [ STATE_CREATION ERROR #104: CREATETEXTURE2D_INVALIDARG_RETURN]
create_d3d11_tex: failed to create texture (0x80070057): The parameter is incorrect.


The output seems to indicate that OBS attempts to use DXGI_FORMAT_R8G8B8A8_UNORM, when it should use DXGI_FORMAT_B8G8R8A8_UNORM.
I have tried changing the back buffer format of my swap chain, but that didn't seem to have any effect on OBS.

I successfully tried capturing Overwatch. OBS log file says:
20:02:16.350: [game-capture: 'Game Capture'] attempting to hook process: Overwatch.exe
20:02:16.392: [game-capture: 'Game Capture'] using direct hook
20:02:16.417: [game-capture: 'Game Capture'] Hooked to process: Overwatch.exe
20:02:16.417: [game-capture: 'Game Capture'] (half life scientist) everything.. seems to be in order
20:02:16.429: [game-capture: 'Game Capture'] Hooked DXGI
20:02:16.437: [game-capture: 'Game Capture'] d3d11 shared texture capture successful
20:02:16.448: [game-capture: 'Game Capture'] shared texture capture successful

So it seems to work in principle.
Either something is wrong generally with my setup, or my application initializes D3D11 in a way that is not expected by OBS. How could I debug this? Is there any information for developers on how to make an application compatible with OBS?

Full log (although I doubt this will add much): https://obsproject.com/logs/l5MLbaBI-tVPGtcz
 

qhobbes

Active Member
1. You are running Windows 10 1803, which has not been supported by Microsoft since 2019-11-12. We recommend updating to the latest Windows release to ensure continued security, functionality, and compatibility.
2. Run OBS as Admin
3. Display and Game Capture Sources interfere with each other. Never put them in the same scene.
 

xOTV6dj3

New Member
Those points are irrelevant.

I found the problem by more or less randomly changing things, and after digging around in the OBS source code I also realised my earlier assessment was mistaken. Changing the swap chain's buffer format to BGRA does indeed solve the problem.
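For reference, the fix amounts to something like the following swap chain description. This is a hedged sketch, not my actual code; the names (sd, width, height, hwnd) are illustrative.

```cpp
// Sketch: create the swap chain with a BGRA back buffer instead of RGBA.
DXGI_SWAP_CHAIN_DESC sd = {};
sd.BufferCount       = 2;
sd.BufferDesc.Width  = width;
sd.BufferDesc.Height = height;
sd.BufferDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM; // was DXGI_FORMAT_R8G8B8A8_UNORM
sd.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
sd.OutputWindow      = hwnd;
sd.SampleDesc.Count  = 1;
sd.Windowed          = TRUE;
```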

The primary issue is that OBS unconditionally attempts to allocate the capture texture with whatever format the original swap chain's back buffer has.
However, my application uses the D3D11 API with hardware feature level 9.3 (i.e. D3D_FEATURE_LEVEL_9_3 in D3D11CreateDevice).

When creating a texture resource with D3D10_DDI_RESOURCE_MISC_SHARED (such that it can be accessed by another process), hardware feature level 9.3 only permits BGRA textures, as that is the native pixel format of GDI and the DWM. Note that D3D9 uses the old, native-endian nomenclature (D3DFMT_A8R8G8B8 means BGRA on LE machines), whereas D3D10 and newer always use byte-order nomenclature. Capture works fine regardless of pixel format with hardware feature level 10.0 or newer.
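To make the constraint concrete, here is a hedged sketch (not taken from the OBS source) of the shared texture a capture tool would have to create on a feature-level-9.3 device; device, tex, width and height are assumed to be in scope, and the field values other than Format and MiscFlags are illustrative.

```cpp
// Sketch: a cross-process-shareable texture on a D3D_FEATURE_LEVEL_9_3 device.
D3D11_TEXTURE2D_DESC desc = {};
desc.Width            = width;   // match the back buffer size
desc.Height           = height;
desc.MipLevels        = 1;
desc.ArraySize        = 1;
desc.Format           = DXGI_FORMAT_B8G8R8A8_UNORM; // BGRA: the only shareable choice at FL 9.3
desc.SampleDesc.Count = 1;
desc.Usage            = D3D11_USAGE_DEFAULT;
desc.BindFlags        = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET;
desc.MiscFlags        = D3D11_RESOURCE_MISC_SHARED; // enables cross-process sharing
HRESULT hr = device->CreateTexture2D(&desc, nullptr, &tex);
// With DXGI_FORMAT_R8G8B8A8_UNORM here, a 9.3 device rejects the call with
// E_INVALIDARG, which OBS reports as 0x80070057.
```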

This would primarily affect applications that use DXGI while targeting 9.3 hardware for compatibility during the early transition phase: probably a small number of games from the Vista era, in the late 2000s, when hardware was advancing rapidly.
There is not much information on this topic either, just hints that this is indeed an issue with feature level 9.3, as well as some vague statements that BGRA might be the universally more compatible format. This post suggests that BGRA was briefly dropped in DXGI 1.0, only to return in DXGI 1.1; however, that does not match my own observations.

A fallback strategy of using BGRA when texture creation fails could improve compatibility with the small number of games from this time span. However, depending on how exactly OBS uses the texture (I didn't dive that deep into the source), it might be necessary to explicitly swizzle the color data.
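The swizzle itself is cheap if done on the CPU: RGBA8 and BGRA8 differ only in the order of the R and B bytes within each 32-bit pixel. A minimal, self-contained sketch (assuming tightly packed 4-byte pixels; in practice a shader-side swizzle would be preferable):

```cpp
#include <cstdint>
#include <cstddef>

// Swap the R and B bytes of each 32-bit pixel in place, converting
// RGBA8 <-> BGRA8. The operation is its own inverse.
void swizzle_rgba_bgra(uint8_t* pixels, size_t pixel_count) {
    for (size_t i = 0; i < pixel_count; ++i) {
        uint8_t* p = pixels + i * 4;
        uint8_t tmp = p[0];
        p[0] = p[2]; // blue moves to byte 0
        p[2] = tmp;  // red moves to byte 2
    }
}
```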

It is probably not worth implementing, but I suppose it is worth noting that capture won't work in this configuration.
 