Question / Help: Screen tearing for viewers?

KeyboardSpartan

New Member
Hello everyone,

For the life of me I cannot fix this issue on my own after trying for over six months. I give up... please send help.

Pretty much everything is running smoothly with OBS; there were a few crashes here and there while streaming The Witcher 3 and GTA V, although those now appear to be fixed.

The only issue that's confusing me is screen tearing: there is NO screen tearing in the game to my eyes, but there is for viewers watching the stream and in the past broadcasts.

So here's what's been tried so far: (some things mentioned here may not even relate... but why not try everything?!)

Encoder changed from x264 to NVENC and back again.
FPS changed from 60 to 30.
Resolution from 720p to 1080p and back down to 720p.
Different bitrates and buffer sizes.
PhysX switched from the dedicated 980 to the CPU.
NVIDIA drivers reinstalled (DDU used): old, new, and hotfix/TDR drivers.
OBS completely uninstalled and reinstalled (both the 32-bit and 64-bit versions).
Game Capture and Global Source both used; I have not tried Window Capture yet.

Just a heads up, I'm using a GSYNC monitor (BenQ XL2420G) and have been playing around with a few settings to no avail.

Here is a link to a past broadcast taken today to show you what's happening:
http://www.twitch.tv/keyboardspartan/v/8305134

(Skip the first 30 seconds)

Any ideas at all?
 

Attachments

  • 2015-07-23-1425-31.log
    31.3 KB

KeyboardSpartan

New Member
Fair enough, can't believe how simple it could be... completely overlooked that!

I'll try it later as I'm not currently in, but do you think there's a way to stop the tearing without V-Sync?

The input lag, not to mention the loss of 40-60 FPS, is quite nasty.

Cheers for that!
 

FerretBomb

Active Member
Nope, that's what vsync is for: to stop tearing. You can greatly reduce the delay, to around one frame at most, by turning on triple buffering (which is what THAT is for: removing the input delay added by vsync).

Simplified Tech Explanation:
Essentially, computer video works by having two drawing areas (buffers). Think of them like pages in a notebook. One is being shown on-screen, while the video card is drawing the next one. When it's done, they switch. So the one that was being shown on-screen is now being drawn on, and the other is being displayed.

Non-vsync switches the pages as soon as the monitor wants another one, regardless of whether the new frame has finished being drawn over the last one on the page. That incomplete frame is the 'tear'.

VSync makes it so that it doesn't switch until the full page is done being drawn. This delays things just a little, but it adds up and causes the video card to stop and start a lot. It's not as bad if your video card can keep a vsync-enabled framerate faster than your refresh rate, but if it drops below that, it has to work 'on twos', meaning half the refresh rate, which is where the framerate really cuts down and looks bad.

Triple-buffered vsync adds a third 'page' (hence, triple buffering). One is being shown on-screen, while the other two are being drawn to in alternating order, as FAST as the video card can draw them (which is also why triple buffering can overheat video cards... it 'unlocks' them to run as fast as they possibly can, just in the background). So there's always one completely done. When the monitor wants a new one, it takes the last fully-drawn page and puts it up.
This means that, absolute worst case, you'll be one frame behind the action (if the newest frame was one pixel away from being completed), assuming your video card can maintain a non-vsync framerate above your monitor's refresh rate (if it can't put out frames as fast as the monitor wants them, it isn't going to help much, if at all). Most of the time you'll be about half a frame behind on average.
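
If it helps to see the hand-off in code, here's a rough toy sketch of the triple-buffer idea. The class and frame numbers are made up purely for illustration; this isn't OBS or driver code:

```python
# Toy sketch of the "pages in a notebook" idea above.
# Names and numbers are made up for illustration; not OBS or driver code.

class TripleBuffer:
    """Three pages: one shown on screen, two being drawn to in alternating order."""

    def __init__(self):
        self.on_screen = 0      # frame number the monitor is currently showing
        self.last_complete = 0  # newest page the card has fully finished drawing

    def gpu_finished_frame(self, frame_number):
        # The card renders as fast as it can and simply remembers the newest
        # page it has completely finished.
        self.last_complete = frame_number

    def monitor_refresh(self):
        # On each refresh the monitor takes the last fully drawn page, so it
        # never shows a half-drawn one (no tearing) and is at most one frame
        # behind the newest work the card has done.
        self.on_screen = self.last_complete
        return self.on_screen


pages = TripleBuffer()
pages.gpu_finished_frame(1)
pages.gpu_finished_frame(2)     # the card got ahead of the monitor
print(pages.monitor_refresh())  # shows frame 2, the newest complete page
```

The point is simply that the monitor only ever gets handed a completely finished page, which is why the tearing goes away.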

It's received a lot of bad press from 'competitive' gamers who don't really understand the tech underlying the games they play (and/or are just looking for a reason to blame a loss on), and just see 60fps vs 200+.
 

VooDoo

Member
Thing is, Gsync and 120Hz+ monitors also remove tearing. That being said, a Gsync monitor with vsync on still has a 1ms response and no tearing for you, so you shouldn't be getting any input lag at any refresh rate. OBS should also display what the monitor is displaying; a capture device itself can falter and show tearing that the monitor doesn't, but OBS should be drawing the exact image he is seeing when playing PC games. I'd honestly try NVENC if he hasn't already.
 

FerretBomb

Active Member
Please don't. Gsync is a monitor/card interaction which is not widely supported, and handles the buffers in nonstandard ways that are not a direct parallel to vsync. With gsync, he will not see tearing (and doesn't), but as OBS is grabbing from the on-card buffers and is NOT gsync aware/enabled, his viewers still will (and are).

NVENC is a very poor quality encode, and should be avoided if at all possible, other than for local-only recordings.
 

KeyboardSpartan

New Member
First of all, thank you for all of the information!

I was aware of how triple buffering works, but it's nice to get a full refresher on the subject. The explanation of how OBS grabs from the on-card buffers makes complete sense; it's a shame... but thank you for the heads up.

Triple buffering is currently off; however, I'll try TW3 with it on today and report back with the outcome.

The NVIDIA Control Panel is showing "Preferred Refresh Rate" set to "Highest Available". I'm not entirely sure it'll make a difference if I place it on "Application Controlled", though, as it's set to 144Hz at the moment and I'm only receiving around 75-100 FPS in game. (Also, weirdly, turning off G-Sync seems to boost the FPS by around 10-20 in some instances; there are a few threads where others have had this issue too.)
 

KeyboardSpartan

New Member
Now there appears to just be major stuttering whilst V-SYNC / Triple Buffering is enabled.

Tried a number of combinations:
G-Sync on / fully off
Adaptive V-Sync vs. On vs. On (Smooth), etc.
Triple buffering on and off

The latest recordings show what I mean, damn :/
 

VooDoo

Member
FerretBomb said:
Please don't. Gsync is a monitor/card interaction which is not widely supported, and handles the buffers in nonstandard ways that are not a direct parallel to vsync. With gsync, he will not see tearing (and doesn't), but as OBS is grabbing from the on-card buffers and is NOT gsync aware/enabled, his viewers still will (and are).

NVENC is a very poor quality encode, and should be avoided if at all possible, other than for local-only recordings.
Encode quality is near negligible unless you're a partner with quality options or pushing the bitrate. In most cases I don't see non-partnered streams pulling even high-quality 720p.
 

FerretBomb

Active Member
VooDoo said:
Encode quality is near negligible unless you're a partner with quality options or pushing the bitrate. In most cases I don't see non-partnered streams pulling even high-quality 720p.
You've got that completely backward. Encoding quality is MORE important when you have a low available bitrate. Better-quality encoding/compression means that your stream will look better at the same bitrate. One of the best things that a non-partnered streamer can do to improve their stream visually is to move to a slower x264 preset, using more CPU to compress the video more efficiently and at a higher quality rate. It's not a silver bullet to let you run 1080p@30 or 720p@60 on 2000kbps, but a Slower-encoded stream is going to look leaps and bounds better than one encoded with Ultrafast.

NVENC on the other hand, looks like a poop hot pocket at low bitrates (rough parity with x264 Ultrafast for quality). QSV is in the same boat. They're useless for streaming unless you have a ridiculously weak CPU (Core i3 or similar) which is otherwise unable to stream, or are doing local-recording only, where you can throw a TON of bitrate at them (since you're just recording to hard drive) so the resulting video doesn't look like crap.
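
If anyone wants to see the preset difference for themselves, one rough way is to encode the same clip twice at the same bitrate and compare the results by eye. A minimal sketch, assuming you have ffmpeg with libx264 installed; the input file name and bitrate below are just placeholders:

```python
# Encode the same clip twice at the same bitrate, varying only the x264 preset,
# then compare the two outputs by eye. Assumes ffmpeg with libx264 is installed;
# the input file name and bitrate are placeholders.
import subprocess

SOURCE = "sample_gameplay.mp4"   # placeholder input clip
BITRATE = "2000k"                # identical bitrate for both encodes

for preset in ("ultrafast", "slower"):
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-c:v", "libx264",
        "-preset", preset,
        "-b:v", BITRATE, "-maxrate", BITRATE, "-bufsize", BITRATE,
        "-an",                   # drop audio; the video is what we're comparing
        f"out_{preset}.mp4",
    ], check=True)
```

At the same bitrate, the 'slower' output should look noticeably cleaner in motion, which is the whole point above.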

---

KeyboardSpartan, did you switch back to x264 encoding? The end of your log shows you using NVENC.
 

KeyboardSpartan

New Member
Yes FerretBomb, haven't been using NVENC at all since testing again.

Attached the latest log to this post.
 

Attachments

  • 2015-07-24-1233-33.log
    13.5 KB

FerretBomb

Active Member
Those are fairly short tests (each a minute or less), but from what I'm seeing in the log, it should be working well. There are no duplicated frames, no dropped frames, and the processing times for conversions and such are in the sub-ms range.

The only thing I could suggest would be turning your webcam down to 720p mode, as it's currently at 1080p and could be causing some hardware-level blocking; the C920 is well known for flooding the USB bus, and since you're streaming at 720p the extra resolution isn't going to improve anything, even on a fullscreen cam shot.
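
As a rough back-of-the-envelope check (assuming the worst case of uncompressed YUY2 at 2 bytes per pixel, which is not necessarily what the camera is actually sending), here's roughly how much the two modes ask of the USB bus:

```python
# Back-of-the-envelope USB bandwidth for a webcam, assuming the worst case of
# uncompressed YUY2 (2 bytes per pixel). Real cameras often compress, so treat
# these as rough upper bounds rather than measurements.
def mb_per_second(width, height, fps, bytes_per_pixel=2):
    return width * height * bytes_per_pixel * fps / 1_000_000

print(f"1080p30 raw: ~{mb_per_second(1920, 1080, 30):.0f} MB/s")
print(f" 720p30 raw: ~{mb_per_second(1280, 720, 30):.0f} MB/s")
# For comparison, USB 2.0 tops out around 60 MB/s in theory and less in
# practice, which is why a 1080p cam can hog the bus.
```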

Of course, add the usual comment about 3500kbps still being way too high for a non-partnered stream, and the other one about 60fps being patently unnecessary for streaming in 99.999% of cases.
 

KeyboardSpartan

New Member
Would a longer test give better results, though?

I changed the webcam from 1080p to 720p (didn't even realise it was set to that), lowered the bitrate to 3000, and finally dropped the FPS to 30, but unfortunately there is still some stuttering; something seems determined to keep the stuttering going even though the build doesn't appear to be stressed at all.
 

FerretBomb

Active Member
Try lowering to 2000kbps. 720p@30fps, 2000kbps is the recommended 'golden point' for non-partnered streamers.
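
As a rough sanity check on why that combination works, you can compare how many bits each setting leaves per pixel per frame. A quick sketch; the settings listed are just examples, and there's no official cut-off, higher simply means each frame gets more data to work with:

```python
# Rough bits-per-pixel comparison for a few example stream settings.
# There's no official cut-off; higher just means each frame gets more data.
def bits_per_pixel(width, height, fps, bitrate_kbps):
    return (bitrate_kbps * 1000) / (width * height * fps)

for label, w, h, fps, kbps in [
    ("720p30 @ 2000kbps",  1280,  720, 30, 2000),
    ("720p60 @ 3500kbps",  1280,  720, 60, 3500),
    ("1080p30 @ 3500kbps", 1920, 1080, 30, 3500),
]:
    print(f"{label}: {bits_per_pixel(w, h, fps, kbps):.3f} bits per pixel")
```

720p at 30fps with 2000kbps comes out with the most bits to spend on each pixel, which is why it tends to hold up best at non-partnered bitrates.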
 

FerretBomb

Active Member
Don't use a custom buffer, no.
There are only a few very specific reasons to ever use one, and almost no one ever needs to unless they're locally recording and set it to 0 to unlock the bitrate to 'use whatever is needed for perfect quality, and eat big chunks of my hard drive please'.

Also, if you're watching from the same device you're streaming from, it can cause stuttering; best to monitor from a laptop, tablet, or smartphone.
 

KeyboardSpartan

New Member
The problem seems to change each time something is resolved... there's no stuttering anymore, although now I'm getting a Texture > Map Failed error (this could be down to the OC on the GPUs, I guess, but it wasn't happening before).

I've read about people adding a line in "Custom x264 Encoder Settings" to possibly resolve this, or am I mistaken?

Thank you for the time you've put into the issues I'm having by the way, I really appreciate it.
 

FerretBomb

Active Member
Sounds like the stutter was a bitrate/playback issue, depending on whether you swapped to another device or not.

Can't help with the texture map error, sadly; the only thing I remember about that one is that it's generally an intermittent issue, usually resolved with a driver update or downgrade, if memory serves. Never had it myself.

Make sure you understand ANY custom x264 settings you put in: exactly what they do, and why you're adding them. There's a LOT of very bad 'best x264 settings' floating around, parroted by people who have no idea what they're talking about and just heard them from someone else. In many cases they can significantly harm performance... in others, they set values... to the default values, and rely on the placebo effect. :b
 