Question / Help: Late frames detected / bad resolution/FPS/bitrate combination

LainGG

New Member
Occurs during streaming and recording in OBS 64-bit.

Hardware:
Intel Core i7-5930K CPU @ 3.50GHz
MSI X99A GODLIKE GAMING (MS-7883) motherboard
NVIDIA GeForce GTX 980 Ti
16GB DDR4 RAM

Reposting the OBS log below.
 

LainGG

New Member
Code:
07:33:02: CLR host plugin strings not found, dynamically loading 4 strings
07:33:02: CLRHost::Initialize() attempting to load and start the .NET runtime (null)
07:33:02: CLRHost::Initialize() Found version v2.0.50727 .NET runtime
07:33:02: CLRHost::Initialize() Found version v4.0.30319 .NET runtime
07:33:02: CLRHost::Initialize() attempting to use v4.0.30319 .NET runtime
07:33:02: CLRHost::LoadInteropLibrary() load the assembly plugins\CLRHostPlugin\CLRHost.Interop.dll
07:33:02: CLRHost::LoadPlugins() attempting to load the plugin assembly CLRBrowserSourcePlugin
07:33:02: CLRHost:: Could not find/load browser settings at location C:\Users\Lain\AppData\Roaming\OBS\pluginData\browser.json
07:33:02: CLRHost:: Exception: System.IO.FileNotFoundException: Could not find file 'C:\Users\Lain\AppData\Roaming\OBS\pluginData\browser.json'.
07:33:02: File name: 'C:\Users\Lain\AppData\Roaming\OBS\pluginData\browser.json'
07:33:02:    at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
07:33:02:    at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy, Boolean useLongPath, Boolean checkHost)
07:33:02:    at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share)
07:33:02:    at CLRBrowserSourcePlugin.Shared.BrowserSettings.Reload()
07:33:02: CLRHost::LoadPlugins() successfully added CLR plugin [Type: CLRBrowserSourcePlugin.CLRBrowserSourcePlugin, Name: CLR Browser Source Plugin]
07:33:08: Open Broadcaster Software v0.657b - 64bit ( ^ω^)
07:33:08: -------------------------------
07:33:08: CPU Name: Intel(R) Core(TM) i7-5930K CPU @ 3.50GHz
07:33:08: CPU Speed: 3500MHz
07:33:08: Physical Memory:  16278MB Total, 8832MB Free
07:33:08: stepping id: 2, model 63, family 6, type 0, extmodel 1, extfamily 0, HTT 1, logical cores 12, total cores 6
07:33:08: monitor 1: pos={1920, 0}, size={1920, 1080}
07:33:08: monitor 2: pos={0, 0}, size={1920, 1080}
07:33:08: Windows Version: 10.0 Build 10586 (revision 0)
07:33:08: Aero is Enabled
07:33:08: -------------------------------
07:33:08: OBS Modules:
07:33:08: Base Address     Module
07:33:08: 000000003FFB0000 OBS.exe
07:33:08: 00000000C7220000 OBSApi.dll
07:33:08: 00000000CC400000 CLRHostPlugin.dll
07:33:08: 00000000CC2E0000 DShowPlugin.dll
07:33:08: 00000000CB4F0000 GraphicsCapture.dll
07:33:08: 00000000C6A60000 NoiseGate.dll
07:33:08: 00000000C6A30000 PSVPlugin.dll
07:33:08: 00000000C6A00000 scenesw.dll
07:33:08: ------------------------------------------
07:33:08: Adapter 1
07:33:08:   Video Adapter: NVIDIA GeForce GTX 980 Ti
07:33:08:   Video Adapter Dedicated Video Memory: 2046492672
07:33:08:   Video Adapter Shared System Memory: 4239480832
07:33:08:   Video Adapter Output 1: pos={0, 0}, size={1920, 1080}, attached=true
07:33:08:   Video Adapter Output 2: pos={1920, 0}, size={1920, 1080}, attached=true
07:33:08: =====Stream Start: 2016-05-07, 07:33:08===============================================
07:33:08:   Multithreaded optimizations: On
07:33:08:   Base resolution: 1919x1040
07:33:08:   Output resolution: 1916x1040
07:33:08: ------------------------------------------
07:33:08: Loading up D3D10 on NVIDIA GeForce GTX 980 Ti (Adapter 1)...
07:33:08: ------------------------------------------
07:33:08: Audio Format: 48000 Hz
07:33:08: ------------------------------------------
07:33:08: Audio Channels: 2 Ch
07:33:08: Playback device Default
07:33:08: ------------------------------------------
07:33:08: Using desktop audio input: Headset Earphone (HyperX 7.1 Audio)
07:33:08: Global Audio time adjust: 0
07:33:08: ------------------------------------------
07:33:08: Using auxilary audio input: Microphone (Razer Seiren)
07:33:08: Mic time offset: 0
07:33:08: ------------------------------------------
07:33:08: Audio Encoding: AAC
07:33:08:     bitrate: 128
07:33:08: Using graphics capture
07:33:08: Trying to hook process: Overwatch.exe
07:33:08: Scene buffering time set to 700
07:33:08: ------------------------------------------
07:33:08: Video Encoding: x264
07:33:08:     fps: 60
07:33:08:     width: 1916, height: 1040
07:33:08:     preset: veryfast
07:33:08:     profile: high
07:33:08:     keyint: 250
07:33:08:     CBR: yes
07:33:08:     CFR: yes
07:33:08:     max bitrate: 2000
07:33:08:     buffer size: 2000
07:33:08: ------------------------------------------
07:33:08: SharedTexCapture hooked
07:41:03: Audio timestamp for device 'Microphone (Razer Seiren)' was behind target timestamp by 80
08:25:22: Audio timestamp for device 'Microphone (Razer Seiren)' was behind target timestamp by 90
09:19:29: Audio timestamp for device 'Microphone (Razer Seiren)' was behind target timestamp by 100
09:51:05: FlushBufferedVideo: Flushing 21 packets over 334 ms
09:51:05: Total frames encoded: 496560, total frames duplicated: 137946 (27.78%)
09:51:05: Total frames rendered: 371505, number of late frames: 119947 (32.29%) (it's okay for some frames to be late)
09:51:05: 
09:51:05: Profiler time results:
09:51:05: 
09:51:05: ==============================================================
09:51:05: video thread frame - [100%] [avg time: 20.486 ms] [children: 92.1%] [unaccounted: 7.86%]
09:51:05: | scene->Preprocess - [0.0391%] [avg time: 0.008 ms]
09:51:05: | GPU download and conversion - [92.1%] [avg time: 18.868 ms] [children: 1.78%] [unaccounted: 90.3%]
09:51:05: | | flush - [1.61%] [avg time: 0.329 ms]
09:51:05: | | CopyResource - [0.132%] [avg time: 0.027 ms]
09:51:05: | | conversion to 4:2:0 - [0.0439%] [avg time: 0.009 ms]
09:51:05: Convert444Threads - [100%] [avg time: 1.247 ms] [children: 99.6%] [unaccounted: 0.401%]
09:51:05: | Convert444toNV12 - [99.6%] [avg time: 1.242 ms]
09:51:05: encoder thread frame - [100%] [avg time: 2.762 ms] [children: 0.579%] [unaccounted: 99.4%]
09:51:05: | sending stuff out - [0.579%] [avg time: 0.016 ms]
09:51:05: ==============================================================
09:51:05: 
09:51:05: 
09:51:05: Profiler CPU results:
09:51:05: 
09:51:05: ==============================================================
09:51:05: video thread frame - [cpu time: avg 1.212 ms, total 450516 ms] [avg calls per frame: 1]
09:51:05: | scene->Preprocess - [cpu time: avg 0.007 ms, total 2812.5 ms] [avg calls per frame: 1]
09:51:05: | GPU download and conversion - [cpu time: avg 0.268 ms, total 99578.1 ms] [avg calls per frame: 1]
09:51:05: | | flush - [cpu time: avg 0.208 ms, total 77531.3 ms] [avg calls per frame: 1]
09:51:05: | | CopyResource - [cpu time: avg 0.015 ms, total 5843.75 ms] [avg calls per frame: 1]
09:51:05: | | conversion to 4:2:0 - [cpu time: avg 0.007 ms, total 2859.38 ms] [avg calls per frame: 1]
09:51:05: Convert444Threads - [cpu time: avg 1.196 ms, total 1.7773e+006 ms] [avg calls per frame: 4]
09:51:05: | Convert444toNV12 - [cpu time: avg 1.192 ms, total 1.77163e+006 ms] [avg calls per frame: 4]
09:51:05: encoder thread frame - [cpu time: avg 2.475 ms, total 1.22895e+006 ms] [avg calls per frame: 1]
09:51:05: | sending stuff out - [cpu time: avg 0.014 ms, total 7093.75 ms] [avg calls per frame: 1]
09:51:05: ==============================================================
09:51:05: 
09:51:05: =====Stream End: 2016-05-07, 09:51:05=================================================
09:52:37: No Intel graphics adapter visible in QSVHelper.exe, Optimus problem?
09:52:37: CUDA loaded successfully
09:52:38: 1 CUDA capable devices found
09:52:38: [ GPU #0 - < GeForce GTX 980 Ti > has Compute SM 5.2, NVENC Available ]
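
For context on the "bad resolution/FPS/bitrate combination" part of the title: the log shows x264 encoding 1916x1040 at 60 fps capped at 2000 kbps CBR, which leaves very little data per pixel. A minimal sketch of that arithmetic, assuming a purely illustrative 0.05 bits-per-pixel floor (not OBS's exact threshold):

Code:
# Rough sanity check for a resolution/FPS/bitrate combination.
# The 0.05 bits-per-pixel floor is an illustrative assumption, not
# the exact rule OBS uses for its warning.

def bits_per_pixel(width, height, fps, bitrate_kbps):
    """Average bits available per pixel per frame."""
    return (bitrate_kbps * 1000) / (width * height * fps)

def check_settings(width, height, fps, bitrate_kbps, floor=0.05):
    bpp = bits_per_pixel(width, height, fps, bitrate_kbps)
    print(f"{width}x{height} @ {fps} fps, {bitrate_kbps} kbps -> {bpp:.3f} bits/pixel")
    if width % 2 or height % 2:
        print("  warning: odd dimension; encoders generally want even width/height")
    if bpp < floor:
        print("  warning: very low bitrate for this resolution/FPS; expect heavy compression")

check_settings(1916, 1040, 60, 2000)   # settings from the log above -> ~0.017
check_settings(1280, 720, 30, 2000)    # 720p30 at the same bitrate -> ~0.072

Lowering the output resolution or frame rate, or raising the bitrate, moves that number back into a more comfortable range.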
 

Harold

Active Member
Most of the problem is coming from your video card. What PCI-E mode is the card running in?
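GPU-Z will show the current link speed, or here is a minimal sketch that reads it through nvidia-smi (assuming nvidia-smi is on your PATH; on Windows it usually lives in the NVSMI folder under the NVIDIA driver install):

Code:
# One-shot query of the GPU's current vs. maximum PCI-E link state.
import subprocess

FIELDS = ("pcie.link.gen.current,pcie.link.gen.max,"
          "pcie.link.width.current,pcie.link.width.max")

out = subprocess.check_output(
    ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
    text=True,
)
gen_cur, gen_max, width_cur, width_max = [v.strip() for v in out.split(",")]
print(f"PCI-E link: gen {gen_cur} of {gen_max}, width x{width_cur} of x{width_max}")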
 

LainGG

New Member
Is it actually necessary to run it in x16?
All 5 of my expansion slots are x16; I think the card just drops to x8 as a power-saving feature at idle. I'll run a test and see whether it changes to x16 while in-game with OBS running, as it should.
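A minimal sketch for that test, polling the same nvidia-smi fields every few seconds so the jump from idle to full speed is visible while the game and OBS are running (the 5-second interval and the utilization.gpu column are just for convenience):

Code:
# Log the PCI-E link state periodically; stop with Ctrl+C.
import subprocess
import time

QUERY = "pcie.link.gen.current,pcie.link.width.current,utilization.gpu"

try:
    while True:
        out = subprocess.check_output(
            ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
            text=True,
        ).strip()
        gen, width, util = [v.strip() for v in out.split(",")]
        print(f"{time.strftime('%H:%M:%S')}  gen {gen}  x{width}  GPU {util}")
        time.sleep(5)
except KeyboardInterrupt:
    pass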
 

Harold

Active Member
Not usually. At idle, cards typically drop to x16 1.1 rather than x16 3.0 (or whatever the highest supported PCI-E generation is).
 

Harold

Active Member
It depends on what other cards you have in your system. If other cards are using too many PCI-E lanes, you won't have enough left for your video card to run at x16.
 

LainGG

New Member
It depends on what other cards you have in your system. If other cards are using too many PCI-E lanes, you won't have enough left for your video card to run at x16.
I'm only using one PCI-E slot, and the current card is an MSI 980 Ti Lightning.
 