Question / Help Problem with game capture and laptop

BamboeCHOP

New Member
Hello everyone,

Yesterday I got my new laptop (a Lenovo Y580) and tried to stream with it. I set OBS up the same way as on my desktop PC and selected the nVidia GPU (not the Intel one) in the OBS settings. However, I get a black screen when trying to stream Dota 2 using the global sources.

I did everything as described in these two threads: http://obsproject.com/forum/viewtopic.php?f=5&t=5965 & http://obsproject.com/forum/viewtopic.p ... 0&start=10 (last post)

Logfile:
Code:
10:48:48: Open Broadcaster Software v0.542b - 32bit (´・ω・`)
10:48:48: -------------------------------
10:48:48: CPU Name: Intel(R) Core(TM) i7-3630QM CPU @ 2.40GHz
10:48:48: CPU Speed: 2395MHz
10:48:48: Physical Memory:  4095MB Total, 4095MB Free
10:48:48: stepping id: 9, model 10, family 6, type 0, extmodel 1, extfamily 0, HTT 1, logical cores 8, total cores 4
10:48:48: monitor 1: pos={0, 0}, size={1920, 1080}
10:48:48: Windows Version: 6.2 Build 9200 
10:48:48: Aero is Enabled
10:48:48: -------------------------------
10:48:48: OBS Modules:
10:48:48: Base Address     Module
10:48:48: 00EC0000         OBS.exe
10:48:48: 66B30000         OBSApi.dll
10:48:48: 668B0000         DShowPlugin.dll
10:48:48: 6BDB0000         GraphicsCapture.dll
10:48:48: 6BC60000         NoiseGate.dll
10:48:48: 6A6F0000         PSVPlugin.dll
10:48:48: ------------------------------------------
10:48:48: Adapter 1
10:48:48:   Video Adapter: NVIDIA GeForce GTX 660M
10:48:48:   Video Adapter Dedicated Video Memory: 2091712512
10:48:48:   Video Adapter Shared System Memory: 2147479552
10:48:48: ------------------------------------------
10:48:48: Adapter 2
10:48:48:   Video Adapter: NVIDIA GeForce GTX 660M
10:48:48:   Video Adapter Dedicated Video Memory: 2091712512
10:48:48:   Video Adapter Shared System Memory: 2147479552
10:48:48: ------------------------------------------
10:48:48: Adapter 3
10:48:48:   Video Adapter: Microsoft Basic Render Driver
10:48:48:   Video Adapter Dedicated Video Memory: 2091712512
10:48:48:   Video Adapter Shared System Memory: 2147479552
10:48:48: =====Stream Start: 2013-08-03, 10:48:48===============================================
10:48:48:   Multithreaded optimizations: On
10:48:48:   Base resolution: 1920x1080
10:48:48:   Output resolution: 1280x720
10:48:48: ------------------------------------------
10:48:48: Loading up D3D10...
10:48:48: Playback device Default
10:48:48: ------------------------------------------
10:48:48: Using desktop audio input: Lautsprecher (Realtek High Definition Audio)
10:48:48: ------------------------------------------
10:48:48: Using auxilary audio input: Mikrofon (Realtek High Definition Audio)
10:48:49: ------------------------------------------
10:48:49: Audio Encoding: AAC
10:48:49:     bitrate: 192
10:48:49: Using graphics capture
10:48:49: ------------------------------------------
10:48:49: Video Encoding: x264
10:48:49:     fps: 60
10:48:49:     width: 1280, height: 720
10:48:49:     preset: veryfast
10:48:49:     CBR: no
10:48:49:     CFR: no
10:48:49:     max bitrate: 3000
10:48:49:     buffer size: 1000
10:48:49:     quality: 10
10:48:49: ------------------------------------------
Warning -- D3D10Texture::CreateFromSharedHandle: Failed to open shared handle, result = 0x80070057
Warning -- SharedTexCapture::Init: Could not create shared texture
10:48:55: Total frames rendered: 223, number of frames that lagged: 1 (0.45%) (it's okay for some frames to lag)
10:48:55: =====Stream End: 2013-08-03, 10:48:55=================================================
10:48:58: 
10:48:58: Profiler results:
10:48:58: 
10:48:58: ==============================================================
10:48:58: frame - [100%] [avg time: 11.138 ms (cpu time: avg 2.102 ms, total 468.75 ms)] [avg calls per frame: 1] [children: 99.9%] [unaccounted: 0.0718%]
10:48:58: | frame preprocessing and rendering - [85.7%] [avg time: 9.55 ms (cpu time: avg 0.63 ms, total 140.625 ms)] [avg calls per frame: 1] [children: 0.00898%] [unaccounted: 85.7%]
10:48:58: | | scene->Preprocess - [0.00898%] [avg time: 0.001 ms (cpu time: avg 0 ms, total 0 ms)] [avg calls per frame: 1]
10:48:58: | video encoding and uploading - [14.2%] [avg time: 1.58 ms (cpu time: avg 1.471 ms, total 328.125 ms)] [avg calls per frame: 1] [children: 14%] [unaccounted: 0.233%]
10:48:58: | | flush - [1.01%] [avg time: 0.113 ms (cpu time: avg 0.14 ms, total 31.25 ms)] [avg calls per frame: 1]
10:48:58: | | CopyResource - [0.189%] [avg time: 0.021 ms (cpu time: avg 0 ms, total 0 ms)] [avg calls per frame: 0]
10:48:58: | | conversion to 4:2:0 - [0.0449%] [avg time: 0.005 ms (cpu time: avg 0 ms, total 0 ms)] [avg calls per frame: 0]
10:48:58: | | call to encoder - [12.7%] [avg time: 1.414 ms (cpu time: avg 1.331 ms, total 296.875 ms)] [avg calls per frame: 0]
10:48:58: | | sending stuff out - [0.00898%] [avg time: 0.001 ms (cpu time: avg 0 ms, total 0 ms)] [avg calls per frame: 0]
10:48:58: | Convert444Threads - [296%] [avg time: 32.968 ms (cpu time: avg 1.121 ms, total 250 ms)] [avg calls per frame: 1]
10:48:58: ==============================================================
10:48:58:

captureHookLog:
Code:
2013-08-03, 10:48:49: we're booting up: 
10:48:49: D3D9 Present
10:48:49: D3D9EndScene called
10:48:49: D3DPRESENT_PARAMETERS {
10:48:49: 	BackBufferWidth: 1920
10:48:49: 	BackBufferHeight: 1080
10:48:49: 	BackBufferFormat: D3DFMT_A8R8G8B8
10:48:49: 	BackBufferCount: 1
10:48:49: 	MultiSampleType: D3DMULTISAMPLE_NONE
10:48:49: 	MultiSampleQuality: 0
10:48:49: 	SwapEffect: D3DSWAPEFFECT_DISCARD
10:48:49: 	hDeviceWindow: 459636
10:48:49: 	Windowed: true
10:48:49: 	EnableAutoDepthStencil: true
10:48:49: 	AutoDepthStencilFormat: D3DFMT_D24S8
10:48:49: 	Flags: None
10:48:49: 	FullScreen_RefreshRateInHz: 0
10:48:49: 	PresentationInterval: 1
10:48:49: };
10:48:49: successfully set up d3d9 hooks
10:48:49: D3D9Present called
10:48:50: DoD3D9GPUHook: success - d3d9ex
10:48:51: D3DSURFACE_DESC {
10:48:51: 	Format: D3DFMT_A8R8G8B8
10:48:51: 	Type: D3DRTYPE_SURFACE
10:48:51: 	Usage: D3DUSAGE_RENDERTARGET 
10:48:51: 	Pool: D3DPOOL_DEFAULT
10:48:51: 	MultiSampleType: D3DMULTISAMPLE_NONE
10:48:51: 	MultiSampleQuality: 0
10:48:51: 	Width: 1920
10:48:51: 	Height: 1080
10:48:51: };
10:48:51: successfully capturing d3d9 frames via GPU
10:48:52: NV Capture available
10:48:52: FBO available
10:48:52: GL Present
10:48:52: (half life scientist) everything..  seems to be in order
10:48:58: ---------------------- Cleared D3D9 Capture ----------------------
10:48:58: D3D9EndScene called
10:48:58: D3D9Present called
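
For reference: the warning in the logfile above, "D3D10Texture::CreateFromSharedHandle: Failed to open shared handle, result = 0x80070057", is E_INVALIDARG. That call is OBS trying to open the shared texture the capture hook created inside the game's process, and opening such a handle typically fails when the two devices sit on different adapters, which is exactly the Optimus situation here. In other words, the capture side worked (the hook log reports "successfully capturing d3d9 frames via GPU"), but OBS's own device could not accept the handle. A minimal sketch of that open on the OBS side (function and variable names are illustrative, not OBS's actual code):
Code:
// Sketch: opening a D3D10 shared texture handle on the OBS side.
// This is the kind of call that fails with E_INVALIDARG (0x80070057)
// when the handle was created by the game on one adapter (nVidia)
// while OBS's device lives on another (Intel).
#include <d3d10.h>

HRESULT OpenCapturedTexture(ID3D10Device *obsDevice, HANDLE sharedHandle,
                            ID3D10Texture2D **texOut)
{
    return obsDevice->OpenSharedResource(sharedHandle,
                                         __uuidof(ID3D10Texture2D),
                                         (void **)texOut);
}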

I hope someone can help me.

'bamboe
 

BamboeCHOP

New Member
R1CH said:
You should not choose the GPU in the OBS settings if you have a laptop.

You mean I should select the Intel GPU in the OBS settings? I tried that too; I still just get a black screen with the global scene from Dota 2.
 

Kharay

Member
No, what he means is that you should force the issue at the driver level, not at the software level. In other words, make sure that both the game and OBS use the discrete GPU (the nVidia one). You should be able to do this in nVidia's Control Panel, which lets you specify which piece of software runs on which GPU.
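
As an aside, natively compiled programs can also request the discrete GPU themselves through exported variables that the vendor drivers check at process start. A minimal C++ sketch (the nVidia export is documented for Optimus; the AMD one is the PowerXpress counterpart and may postdate this thread):
Code:
// Sketch: exports the Optimus / PowerXpress drivers look for when a
// process starts, to decide whether it runs on the discrete GPU.
extern "C" {
    // nVidia Optimus: a nonzero value requests the high-performance GPU.
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    // AMD PowerXpress counterpart.
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

That only helps with software you compile yourself, though; for the game and OBS, the per-program setting in the nVidia Control Panel is the right lever.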
 

BamboeCHOP

New Member
I already did this. I also tried right-clicking OBS and starting it with the nVidia GPU, but I still get a black screen when using the global sources. Both OBS and Dota 2 are set to use the nVidia GPU in the nVidia settings.
 

belboz

Member
When I use my laptop, I have to make sure the games use the Nvidia chipset, and then make sure OBS is set to the integrated Intel chipset IN OBS (it doesn't seem to matter what I specify in the Nvidia panel for OBS; I have to do it in the OBS program itself). Then it works (keeping in mind that some games require running OBS as admin and such).

To make things more difficult, most of the time I stream or preview, OBS "loses" the Intel chipset in the selection box: it goes from two entries (one Intel, one Nvidia) to two Nvidia entries and one Microsoft Basic Render Driver, as you show. I have to close OBS and re-run it; it then detects the Nvidia and Intel as the only two devices, I select Intel, and everything works again. So be aware that almost every time you preview or stream, you may lose the device and have to close and re-run OBS. I mentioned this a month or two back (the changed device names, along with going from two to three devices in OBS) and was told that is the way it works. I just found it odd that when I run OBS the first time I have my two devices, but after a preview or stream I have three: two Nvidia and one Microsoft Basic Render Driver. Seemed odd to me, but from what the devs have said it is normal, and I guess a byproduct of having a laptop with dual graphics chipsets.

I suspect the reason I can get things to work like this is that the Nvidia chip still renders the game into the same video memory used by the Intel chipset. (From what I have read, the Nvidia does the heavy lifting, but the Intel chipset is still used to display the video memory on the laptop screen.)

So to sum up, here is what I do.

1) Games and OBS are set to use Nvidia in the Nvidia Control Panel (I also have Nvidia set as the default device in said panel)
2) Run OBS (as normal user or admin, depending on the game being streamed) and select the Intel chipset in OBS
3) Preview or stream, and it works
4) If I stop the preview or stream, I close OBS, re-run it as in #2, and make sure Intel is still selected
5) Stream or preview again
6) Go back to #4 if needed

I can also note that I am 100% sure the game is using the Nvidia chipset: the in-game frame rates are beastly, and if I instead specify that the game use Intel and then play, the frame rates drop big time. Plus, the Nvidia GPU activity app shows both the game and OBS as using it.

So I recommend trying the above, and just remember that once you stream or preview you may lose the Intel device in OBS's video selection. Just re-run OBS and it should come back.
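
For what it's worth, the device list OBS shows comes straight from DXGI adapter enumeration, which is why the software "Microsoft Basic Render Driver" can appear alongside the real GPUs. A minimal, self-contained sketch of that enumeration (standard DXGI calls, not OBS's actual code):
Code:
// Enumerate the adapters Windows exposes to D3D applications. On an
// Optimus laptop this is where the Intel, Nvidia, and software
// "Microsoft Basic Render Driver" entries come from.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory *factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void **)&factory)))
        return 1;

    IDXGIAdapter *adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        // Description is a wide string, e.g. "NVIDIA GeForce GTX 660M"
        // or "Microsoft Basic Render Driver".
        wprintf(L"Adapter %u: %s\n", i, desc.Description);
        adapter->Release();
    }
    factory->Release();
    return 0;
}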
 

BamboeCHOP

New Member
belboz said:
When I use my laptop, I have to make sure the games use the Nvidia chipset, and then make sure OBS is set to the integrated Intel chipset IN OBS [...]

I love you! <3 Looks like it works. I will test it again once I'm at home; I can't stream from work. :D
 

belboz

Member
Glad to be of help.

I am hoping this is just a glitch and that future versions of OBS won't lose the Intel entry from the list and replace it with two Nvidia entries and the Microsoft one. That would make things much easier, since you wouldn't have to worry about closing and re-running OBS between previews or streams.
 