Turing NVENC or wait for Intel Arc Hardware Accelerated AV1 encoder?

aberniando

New Member
My cousin offered me his 1660 Super for $175 US. I don't play games on my PC, only on console; I'd just like to use NVENC to record gameplay. Should I buy his 1660 Super (Turing NVENC) or wait for Intel's cheapest new graphics card (rumor says it's gonna be around $150), which comes with a hardware-accelerated AV1 encoder?
 

Lawrence_SoCal

Active Member
What is your priority? current spend? stability? hope for future features? time to implement?

The reviews of the just-released A380 basically say the drivers are still a problem and performance is unimpressive, even against low expectations. Will driver updates fix the issues? Unknown at this point.

So, if you want something NOW that is stable, Turing NVENC is a solid choice. I too am hopeful that Intel's AV1 helps us move past H.264... but that is a gamble, and having the entire ecosystem (recording, editors, streaming, etc.) move in the AV1 direction could easily take a year or more (in terms of stable, released, non-beta code). This stuff takes time to work out the kinks. Will Intel's first Arc GPUs be able to support whatever the end result is? Or might there be a caveat/incompatibility, with the next hardware release being the one that is really usable/desirable? No good way to know that.
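One quick way to gauge how far along AV1 tooling is on your own machine is to ask your ffmpeg build which AV1 encoders it exposes. This is just a sanity-check sketch: it assumes ffmpeg is installed, and the encoder names that show up (libaom-av1, libsvtav1, av1_qsv, etc.) depend entirely on how that build was configured and what drivers are present.

```shell
# List AV1-capable encoders in the local ffmpeg build, if any.
# An empty result means this build can't encode AV1 at all yet.
ffmpeg -hide_banner -encoders | grep -i av1
```
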
I'm personally in the market for a new computer, and for video editing an AV1 encoder would provide nice future-proofing. But I expect it to be 3-6 months before we know whether the current generation of Intel Arc GPUs is all that folks have hoped this new GPU competitor will be, or whether we need to follow the MS model and wait for v3 to get a reasonably functioning version of what was promised in v1.

So, obviously all of the above is based on long industry experience, but no hands-on time with an Arc GPU. I'd love to hear other knowledgeable input.
 

culfrankie

New Member
What GPU would you recommend for streaming at 1080p at 60fps without dropping frames? I was thinking of getting one of the Intel Arcs, but the reviews don't seem reassuring, and AMD's encoders don't look too good yet. What is the lowest I can go with the Nvidia GPUs to comfortably stream at 1080p or 1440p at 60fps?
 

Lawrence_SoCal

Active Member
What GPU would you recommend for streaming at 1080p at 60fps without dropping frames? I was thinking of getting one of the Intel Arcs, but the reviews don't seem reassuring, and AMD's encoders don't look too good yet. What is the lowest I can go with the Nvidia GPUs to comfortably stream at 1080p or 1440p at 60fps?
Unfortunately that really isn't a good question, as it leaves critical context/criteria unstated, so I would not recommend a GPU based on it. And if this is for gaming, that's not my area of expertise/interest, so I wouldn't recommend a specific card regardless (any more than I'd recommend a specific car to someone asking for a vehicle to 'drive to work and get groceries').
See my original reply... what are your priorities? If this is about a hardware purchase today, for streaming today, and you are okay with replacing the GPU in a year or two, that is a different set of criteria than wanting to buy today and use for 5 years. It has been 3.5 months since my reply above, but with Intel only now releasing the rest of the Arc GPU lineup (expected months ago), the jury is still out. All of my earlier response still applies.

Streaming today using H.264? Most GPU encode-offload systems can handle that easily enough using OBS Studio, with appropriate settings for the rest of the system (RAM, CPU, not using PoS plugins, etc.). The issue is your requirements (in WAY more detail than posted); if you wish to pursue that, 1. check for other similar postings... there are many... and 2. start your own thread.
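For a sense of what "encode offload" means outside of OBS, here is a minimal ffmpeg sketch that pushes H.264 encoding onto an NVIDIA GPU via NVENC. The filenames and bitrate are hypothetical placeholders, and it assumes an ffmpeg build compiled with nvenc support plus an NVENC-capable card:

```shell
# Offload H.264 encoding to the GPU via NVENC (NVIDIA-only).
# ~6 Mbit/s with a matching maxrate is roughly in the range
# commonly used for 1080p60 streaming.
ffmpeg -i gameplay.mkv -c:v h264_nvenc -preset p5 \
       -b:v 6M -maxrate 6M -bufsize 12M \
       -c:a aac -b:a 160k stream_test.mp4
```

OBS does essentially the same thing internally when you pick the NVENC encoder: the CPU muxes and handles audio while the dedicated encoder block on the GPU does the heavy lifting.
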

My thoughts at a high level, as of the start of October '22:
The Intel GPUs appear to be fine, as long as you are aware of the requirements (CPU, motherboard, ReBAR, etc.) and aren't focused on legacy game/video modes. For gaming, especially with a history/library of older games, Intel's GPU drivers aren't optimized the way nVidia's and AMD's are after well over a decade of maturing (or stumbling, as the case may be).
AMD suffers from making decent-to-great hardware but backing it with atrocious required software (like the H.264 SDK: recently updated, still not great, 5-8+ years late). The AMD GPUs are fine when someone else writes the software (a la Apple with its Intel-CPU-based systems, etc.). The issue today is that AMD appears to have presumed H.264 was legacy and put its effort into H.265 (which is a licensing mess and therefore rarely used in streaming). That gamble was a miserable failure (like Intel's approach to 10nm CPU manufacturing).
That leaves nVidia as the least-worst option.
But if you want AV1 for future-proofing, only the latest nVidia RTX 4080/4090 cards have an AV1 encoder, and there aren't comparison tests out yet (that I'm aware of). They're also obviously many times more expensive than the Arc A770. So... it depends. For future-proofing, and if one can be patient, I'd wait for a 4060/4050 vs. A770 encoder comparison and decide then.
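If you ever end up with both cards in hand, a rough DIY comparison is to encode the same clip with each vendor's hardware AV1 encoder at the same bitrate and judge the results (visually, or with a metric like VMAF). The clip names are hypothetical, and the encoder names below (av1_qsv for Intel Quick Sync on Arc, av1_nvenc for RTX 40-series) assume a recent enough ffmpeg build that exposes them:

```shell
# Encode the same source with each hardware AV1 encoder at a fixed bitrate,
# then compare the outputs for quality at the same file size.
ffmpeg -i clip.mkv -c:v av1_qsv   -b:v 6M arc_av1.mp4     # Intel Arc
ffmpeg -i clip.mkv -c:v av1_nvenc -b:v 6M nvidia_av1.mp4  # RTX 40-series
```
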
 