NVENC Future Development and Tensor Cores

Bensam123

Member
Cool update. Anyway, I'm now looking to get an RTX card for my streaming PC (second PC), as it offers better encoding than the processor is currently capable of — a Ryzen 1700 can only encode so much at 1080p@60 FPS with other sources and overhead on top of it. I'm specifically interested in the psycho-visual tuning portion, as that's not something x264 is currently capable of doing by itself in software (unless someone makes a plugin or something that does that).

I've read the usual suspects, and everything says the 1660 Ti is currently the lowest-end card that supports all the improvements. However, since it doesn't support ray tracing, is there a possibility that future improvements might come down the pipeline that require the RTX hardware? Not ray tracing specifically, but rather the Tensor cores being programmed to help with fidelity, filtering, or other visual improvements. I know CUDA is a unique skillset, as is FPGA development, so even if it's possible, I'm not sure anyone here would have the means to accomplish it.

Either way, I'm looking to save $70, but not at the expense of future development — that's what this question is really about.

Would this be a better question for the Nvidia devtalk forums? I'm not sure how many of the features added in this update come from Nvidia versus from OBS.
 

dodgepong

Administrator
Community Helper
I think the Nvidia forums are probably a better place to talk about this. The work on this update was largely done by us, but the subject you're asking about isn't really related to what OBS does.
 

Bensam123

Member
Did you guys implement features that were already available in NVENC, or are dynamic B-frames and psycho-visual tuning something you developed yourselves? I guess that's a better way to ask this. I've definitely seen Turing cores used for some pretty interesting things in crypto, as well as CUDA, but if that's not really what you did and you just wired up already-available features, it's a pretty good bet nothing will change until their next-gen 3XXX cards, as Nvidia isn't really proactive between generations.
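For reference, the features in question do appear to map onto options that NVENC itself exposes, and you can see them surfaced through FFmpeg's NVENC wrapper. A sketch (flag names assume a recent FFmpeg build with `h264_nvenc` and a Turing-class GPU; the mapping to OBS's "Psycho Visual Tuning" label is an assumption — OBS roughly corresponds it to adaptive quantization and look-ahead):

```shell
# Hedged sketch: NVENC encode enabling B-frames-as-reference
# (-b_ref_mode, Turing and newer) plus spatial/temporal adaptive
# quantization and look-ahead, which is roughly what OBS exposes
# as "Psycho Visual Tuning" and "Look-ahead".
ffmpeg -i input.mp4 -c:v h264_nvenc -preset slow \
       -rc vbr -b:v 6M -bf 2 -b_ref_mode middle \
       -spatial-aq 1 -temporal-aq 1 -rc-lookahead 20 \
       output.mp4
```

If those flags exist in the driver/SDK already, that would suggest the OBS work was plumbing rather than new encoder features — which is the crux of the question above.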
 