On Win7, everything works. This is a screenshot of my plugin development for NVENC. In the official new version, nothing has appeared yet except max bitrate.
h264_nvenc AVOptions:
  -preset            <int>        E..V.... Set the encoding preset (from 0 to 11) (default medium)
     default                      E..V....
     slow                         E..V.... hq 2 passes
     medium                       E..V.... hq 1 pass
     fast                         E..V.... hp 1 pass
     hp                           E..V....
     hq                           E..V....
     bd                           E..V....
     ll                           E..V.... low latency
     llhq                         E..V.... low latency hq
     llhp                         E..V.... low latency hp
     lossless                     E..V....
     losslesshp                   E..V....
  -profile           <int>        E..V.... Set the encoding profile (from 0 to 3) (default main)
     baseline                     E..V....
     main                         E..V....
     high                         E..V....
     high444p                     E..V....
  -level             <int>        E..V.... Set the encoding level restriction (from 0 to 51) (default auto)
     auto                         E..V....
     1                            E..V....
     1.0                          E..V....
     1b                           E..V....
     1.0b                         E..V....
     1.1                          E..V....
     1.2                          E..V....
     1.3                          E..V....
     2                            E..V....
     2.0                          E..V....
     2.1                          E..V....
     2.2                          E..V....
     3                            E..V....
     3.0                          E..V....
     3.1                          E..V....
     3.2                          E..V....
     4                            E..V....
     4.0                          E..V....
     4.1                          E..V....
     4.2                          E..V....
     5                            E..V....
     5.0                          E..V....
     5.1                          E..V....
  -rc                <int>        E..V.... Override the preset rate-control (from -1 to INT_MAX) (default -1)
     constqp                      E..V.... Constant QP mode
     vbr                          E..V.... Variable bitrate mode
     cbr                          E..V.... Constant bitrate mode
     vbr_minqp                    E..V.... Variable bitrate mode with MinQP
     ll_2pass_quality             E..V.... Multi-pass optimized for image quality (only for low-latency presets)
     ll_2pass_size                E..V.... Multi-pass optimized for constant frame size (only for low-latency presets)
     vbr_2pass                    E..V.... Multi-pass variable bitrate mode
  -rc-lookahead      <int>        E..V.... Number of frames to look ahead for rate-control (from -1 to INT_MAX) (default -1)
  -surfaces          <int>        E..V.... Number of concurrent surfaces (from 0 to 64) (default 32)
  -cbr               <boolean>    E..V.... Use cbr encoding mode (default false)
  -2pass             <boolean>    E..V.... Use 2pass encoding mode (default auto)
  -gpu               <int>        E..V.... Selects which NVENC capable GPU to use. First GPU is 0, second is 1, and so on. (from -2 to INT_MAX) (default any)
     any                          E..V.... Pick the first device available
     list                         E..V.... List the available devices
  -delay             <int>        E..V.... Delay frame output by the given amount of frames (from 0 to INT_MAX) (default INT_MAX)
  -no-scenecut       <boolean>    E..V.... When lookahead is enabled, set this to 1 to disable adaptive I-frame insertion at scene cuts (default false)
  -forced-idr        <boolean>    E..V.... If forcing keyframes, force them as IDR frames. (default false)
  -b_adapt           <boolean>    E..V.... When lookahead is enabled, set this to 0 to disable adaptive B-frame decision (default true)
  -spatial-aq        <boolean>    E..V.... set to 1 to enable Spatial AQ (default false)
  -temporal-aq       <boolean>    E..V.... set to 1 to enable Temporal AQ (default false)
  -zerolatency       <boolean>    E..V.... Set 1 to indicate zero latency operation (no reordering delay) (default false)
  -nonref_p          <boolean>    E..V.... Set this to 1 to enable automatic insertion of non-reference P-frames (default false)
  -strict_gop        <boolean>    E..V.... Set 1 to minimize GOP-to-GOP rate fluctuations (default false)
  -aq-strength       <int>        E..V.... When Spatial AQ is enabled, this field is used to specify AQ strength. AQ strength scale is from 1 (low) - 15 (aggressive) (from 1 to 15) (default 8)
  -cq                <int>        E..V.... Set target quality level (0 to 51, 0 means automatic) for constant quality mode in VBR rate control (from 0 to 51) (default 0)
  -aud               <boolean>    E..V.... Use access unit delimiters (default false)
  -bluray-compat     <boolean>    E..V.... Bluray compatibility workarounds (default false)
  -init_qpP          <int>        E..V.... Initial QP value for P frame (from -1 to 51) (default -1)
  -init_qpB          <int>        E..V.... Initial QP value for B frame (from -1 to 51) (default -1)
  -init_qpI          <int>        E..V.... Initial QP value for I frame (from -1 to 51) (default -1)
  -qp                <int>        E..V.... Constant quantization parameter rate control method (from -1 to 51) (default -1)
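For anyone who wants to try these outside OBS: the options above map directly onto an ffmpeg command line. A minimal sketch, assuming an ffmpeg build with NVENC support; the file names, preset choice, and bitrate are placeholders, and the flags are taken from the dump above:

```shell
# Assemble an h264_nvenc encode using options from the dump above.
# input.mp4/output.mp4 and the 6000k bitrate are placeholders.
enc_args="-c:v h264_nvenc -preset llhq -rc cbr -b:v 6000k"
enc_args="$enc_args -rc-lookahead 20 -spatial-aq 1 -aq-strength 8"
echo "ffmpeg -i input.mp4 $enc_args -c:a copy output.mp4"
```

Printing the assembled command first makes it easy to double-check the flags; drop the `echo` to run the encode for real.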
Holy crap, THIS is what I've been wanting for YEARS with NVENC.
Where did you get this? How did you enable all these extra custom settings?
Is this in an older version of OBS, or in this new beta one?
I would love having settings this granular with NVENC. I know you can add a bunch of custom flags for NVENC if you're using straight-up FFmpeg without OBS, but they've never added a "custom flags" field in OBS for that sort of thing.
I would be very interested in this feature set. I'd even donate or pay for something like this.
I've noticed that if you use AMD's hardware encoder, you get this sort of granularity. How come NVENC doesn't support this?
Was it strictly because there wasn't a proper API for it until now, or was it just too much work to add? Will we be getting this after the beta ends? I'd be happy with even just a "custom flags" field.
Why would you need such granular control?
Most users have no idea what any of those settings do, so it would only be confusing to them. Also, any of these settings could already be used for NVENC through custom ffmpeg output, which is meant for advanced users.
The new NVENC encoder will not be able to use those settings directly; NVENC itself supports them, but OBS would need to specifically expose them to the user.
The fact that most users have no idea what all these settings are for does not mean they are not needed. There could be an advanced mode where they sit at their default values: whoever knows what they do will adjust them as needed, and whoever doesn't can stay in simple mode. We need full control, not two on/off knobs.
Can you give a specific example where having this level of granular control would help, or are you just assuming that "more options = better"? Even with the limited options that are exposed in the current nvenc implementation, we see people who make things worse by changing settings they do not understand. For the less than 1% of users who might actually need them, does it make sense to confuse the other 99.9%? You need to understand that there are millions of users of OBS, and sometimes we need to find a balance between full control and ease of use.
Why, then, does the x264 encoder have a command-line field in which you can specify any parameter that encoder v0.148.2762 supports, while NVENC has nothing like it? You can encode video to a file using FFmpeg, but for some reason streaming through it is impossible. Where's the justice?
Is there a way to reserve some GPU resources for OBS? I have noticed that if I use my 144 Hz monitor without V-Sync in games, the GPU goes to 100% usage and OBS starts stuttering and lagging. Can I set OBS's priority above the games'?
So I tried it last night. I have a 2080 Ti, and it looked blurrier in motion than when I stream with my i7 7700K at veryfast with bitrate at 8000. I was very sad. I recorded and streamed at the same time, and I also stopped the recording and had the same outcome. I'm lost here. Oh, I forgot: I'm streaming Blackout on Xbox. My upload is not an issue (900+ Mbps).
Hi thank you for the improvements. I've just tried a local recording with the latest posted beta, no streaming, and encountered an issue with the recording introducing screen tearing. I uploaded a clip of it to youtube: https://youtu.be/DxpiToKykCk
Here are my settings (using game capture plugin from release version):
My specs:
Windows 10 64bit 1803
8700k@5ghz
1050 ti 398.36
8gb 2666 ram
OBS 22.0.2-268-g10997e9d7
I've had issues in the past with OBS introducing stutters/tearing so was hoping maybe these latest improvements would put it on par with Shadowplay as far as smoothness goes. The game Pixeljunk Eden is a simple 2d game and does not put any strain on the system at all so I thought it would be a good test.
/* psycho aq */
if (nv_get_cap(enc, NV_ENC_CAPS_SUPPORT_TEMPORAL_AQ)) {
	config->rcParams.enableAQ = psycho_aq;
	config->rcParams.enableTemporalAQ = psycho_aq;
}
Why would you enable both at the same time? In my experience, Temporal AQ is bad with high-motion games. AQ strength would also be nice to be able to configure, but I guess it's too confusing, right? :P
Here's a quote from the NVENC_VideoEncoder_API_ProgGuide:
Temporal AQ tries to adjust encoding QP (on top of QP evaluated by the rate control algorithm) based on temporal characteristics of the sequence. Temporal AQ improves the quality of encoded frames by adjusting QP for regions which are constant or have low motion across frames but have high spatial detail, such that they become better reference for future frames. Allocating extra bits to such regions in reference frames is better than allocating them to the residuals in referred frames because it helps improve the overall encoded video quality. If majority of the region within a frame has little or no motion, but has high spatial details (e.g. high-detail non-moving background) enabling temporal AQ will benefit the most.
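Going through ffmpeg directly, the two AQ modes can already be toggled independently, which avoids the both-or-nothing coupling in the snippet above. A sketch using the same option names as the dump earlier in the thread; the file names and bitrate are placeholders:

```shell
# Spatial AQ on with explicit strength, temporal AQ off
# (the ProgGuide favors temporal AQ for static, high-detail scenes,
# so turning it off for high-motion games is a reasonable choice).
aq_args="-spatial-aq 1 -aq-strength 8 -temporal-aq 0"
echo "ffmpeg -i gameplay.mkv -c:v h264_nvenc -rc vbr -b:v 10M $aq_args out.mkv"
```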
I was having an issue when recording Assetto Corsa Competizione today, which is a graphically intensive game. On checking, the recorded videos would have full audio but only a few frames of recorded video, i.e. it would record one frame, freeze on it for 30 seconds or so, and then record another one.
A few times it recorded footage normally for a bit and then went to that behaviour. I noticed when I clicked Stop Recording that it didn't seem to work, and I had to click it a few times for it to stop.
I was originally running beta 6, then checked this forum and downloaded beta 9, with the same result. I'm going to go back to v22 stable to re-test that.
PC specs:
i7-4790K, 32 GB DDR3, GTX 2080, 2x Samsung SSD. Running at 3440x1440.
I recorded in the regular Assetto Corsa the other day and it was fine. Not sure if Windows updates could have broken it... I'll try to do more testing.
In NVIDIA's presentation it was said that stream quality on RTX cards (I have an RTX 2080 Ti) would be on the level of x264 Medium. I tried streaming COD4 Blackout on Twitch with bitrate 6000 and Max Quality, and was surprised by the amount of pixelation in the background while driving, for example when looking up at the sky. To test, I tried recording a video at 10K bitrate and comparing it with x264 Medium. As a result, x264 Medium had no pixelation on a light background such as the sky, while the new NVENC had a huge number of visible pixels. I don't understand why this is happening. Did beta 9 only improve performance, with NVENC quality not yet improved?
Is there a way to reserve some GPU resources for OBS? I have noticed that if I use my 144 Hz monitor without V-Sync in games, the GPU goes to 100% usage and OBS starts stuttering and lagging. Can I set OBS's priority above the games'?
When you turn on G-Sync, you automatically enable vertical sync and cap frame output at your monitor's refresh rate. If you turn it off and don't enable regular vertical sync or a frame limit, the video card will push out the maximum FPS it can, which results in 100% load on the video core.
You can reduce the monitor's refresh rate to 100-120 Hz. You need to understand that no matter how powerful your video card is, if you want to use it for streaming, its load in the game should be at most 85-90%. How you achieve that is up to you: by limiting the frame rate, by reducing graphics settings in the game, or both.