Question / Help: Question About Encoding on AVerMedia Live Gamer HD 2 (GC570)

Cedzilla

New Member
Is encoding with the Avermedia Live Gamer HD 2 on par with the x264 encoder?

I am streaming Fallout 4 with an i7-4790K and wanted to know if this capture card can help increase performance while outputting a good quality video.

Thank you for your time!
 

Rodney

Forum Moderator
No, the AVerMedia encoders are usually pretty bad. I'm also not sure if you can even use it with OBS.

Capture cards also don't help with performance unless you have a two PC setup.
 

obsgyn

Member
I'm planning on ordering the newer AVerMedia Live Gamer HD 2 (GC570) capture card today for use in a two PC stream setup. That their encoders are bunk should be irrelevant since all the card is doing is capturing the signal and my PC is doing the encoding? So do you know of any reasons why I should avoid this card? I'm using the latest version of OBS Studio and the streaming rig is an i7 4790k with a pair of 970s (not that the GPU matters).

I considered a Magewell, but the 4K that all the big streamers use doesn't seem to be purchasable *anywhere* and a lot of reviews for it seem to complain about the version of HDMI it supports being kind of limiting. Also the 4K Plus Magewell, which you do seem to be able to buy some places, is $900 before tax.

I considered a BlackMagic Intensity 4k card which I know a lot of big streamers use and is as low as about $200 (same as the AVermedia), but I've also heard a lot of issues about Black Magic Intensity with OBS over the years and it often being a bit janky and out of date.

Which seems to leave you with the two very mainstream consumer options of Avermedia and Elgato.

When it comes to these two brands, it seems that there is a mix of love and hate for both of them, so it's difficult to pick a winner. -- With any luck, perhaps someone can recommend a much better alternative to me?

I intend to play at 2560x1440 at as high of a refresh rate as I can (gsync, 144hz display, 1080ti) while streaming at 1080/60 which means that with any card that isn't labeled a 4k card (and probably even then, due to refresh rate limits), I'll end up having to run a local copy of OBS Studio in preview mode on the gaming PC and then sending *THAT* signal to the streaming PC (presuming I can do that via HDMI - I won't know how to do this until I actually have my capture card and everything so I can actually try it).

Anyway, sorry to bug you - but I just haven't found much out there about the Live Gamer HD 2 card and I've not done two PC streaming or used a capture card before. It seems like your options are "spend an ungodly amount to get a high quality but very dated and still limited device" or "spend a couple hundred bucks and suffer a lot of jank with a consumer product".

Also, sorry if it seems like I'm hijacking this thread. When I searched for discussions about this card here, this seemed the most relevant and my issue didn't seem to deviate enough to spam-up a new thread.

Thank you and regards!
 

Fenrir

Forum Admin
I'm planning on ordering the newer AVerMedia Live Gamer HD 2 (GC570) capture card today for use in a two PC stream setup. That their encoders are bunk should be irrelevant since all the card is doing is capturing the signal and my PC is doing the encoding? So do you know of any reasons why I should avoid this card? I'm using the latest version of OBS Studio and the streaming rig is an i7 4790k with a pair of 970s (not that the GPU matters).

GPU does matter. OBS uses GPU to composite your scene, and if you start adding a lot of sources and create a complex scene, it will need a proper GPU to render correctly. Especially if your goal is 60fps.

I considered a Magewell, but the 4K that all the big streamers use doesn't seem to be purchasable *anywhere* and a lot of reviews for it seem to complain about the version of HDMI it supports being kind of limiting. Also the 4K Plus Magewell, which you do seem to be able to buy some places, is $900 before tax.

The Magewell 4K+ is a fantastic card, but yes, the price is a bit high.

I considered a BlackMagic Intensity 4k card which I know a lot of big streamers use and is as low as about $200 (same as the AVermedia), but I've also heard a lot of issues about Black Magic Intensity with OBS over the years and it often being a bit janky and out of date.

This was true, but as of 19.0.0 (the changes are done, they are just pending the 19.0.0 release in the next few weeks at most) the BMD cards are very solid with OBS. There were some audio downmix problems, and some sync issues that we were able to identify and correct. I've been using the BMD Intensity Pro 4k for a few months now, and no issues at all after those fixes.

Which seems to leave you with the two very mainstream consumer options of Avermedia and Elgato.

When it comes to these two brands, it seems that there is a mix of love and hate for both of them, so it's difficult to pick a winner. -- With any luck, perhaps someone can recommend a much better alternative to me?

I intend to play at 2560x1440 at as high of a refresh rate as I can (gsync, 144hz display, 1080ti) while streaming at 1080/60 which means that with any card that isn't labeled a 4k card (and probably even then, due to refresh rate limits), I'll end up having to run a local copy of OBS Studio in preview mode on the gaming PC and then sending *THAT* signal to the streaming PC (presuming I can do that via HDMI - I won't know how to do this until I actually have my capture card and everything so I can actually try it).

This is where things kinda fall apart. 144fps gaming + any capture card is going to be a huge headache for you. The method you described is a super hacky workaround to the limitations of capture cards, but is your best bet. You set the capture card itself as another monitor, and project OBS' output to that to capture.

Anyway, sorry to bug you - but I just haven't found much out there about the Live Gamer HD 2 card and I've not done two PC streaming or used a capture card before. It seems like your options are "spend an ungodly amount to get a high quality but very dated and still limited device" or "spend a couple hundred bucks and suffer a lot of jank with a consumer product".

Also, sorry if it seems like I'm hijacking this thread. When I searched for discussions about this card here, this seemed the most relevant and my issue didn't seem to deviate enough to spam-up a new thread.

Thank you and regards!

This probably should have been its own thread, but it's all good. My personal recommendation is any of the BMD PCIe cards, a Magewell (expensive), or the StarTech (slight jank).
 

obsgyn

Member
GPU does matter. OBS uses GPU to composite your scene, and if you start adding a lot of sources and create a complex scene, it will need a proper GPU to render correctly. Especially if your goal is 60fps.

That didn't occur to me (I was thinking merely of the encoding portion).


This was true, but as of 19.0.0 (the changes are done, they are just pending the 19.0.0 release in the next few weeks at most) the BMD cards are very solid with OBS. There were some audio downmix problems, and some sync issues that we were able to identify and correct. I've been using the BMD Intensity Pro 4k for a few months now, and no issues at all after those fixes.

This is great to hear. It sounds like reviews have picked up since the troublesome launch of the card. How do you feel about the tiny fan on the heatsink? I've heard it is intolerably loud (which could be a problem streaming as I run rather quiet setups). Do you send audio to your streaming rig via HDMI as well? (I presume that can be done with the BMD).


This is where things kinda fall apart. 144fps gaming + any capture card is going to be a huge headache for you. The method you described is a super hacky workaround to the limitations of capture cards, but is your best bet. You set the capture card itself as another monitor, and project OBS' output to that to capture.

That's unfortunate. People seemed to suggest the OBS Preview method was a simple work around that didn't have problems.

Is the continued limitation even on the expensive pro/industrial capture cards a matter of pure data transfer capacity? Since it should just be passing the signal through for the CPU to encode, theoretically they should handle whatever you throw at them, right? So I presume the problem is that even with HDMI 2.0, you're looking at like 18Gbps at *best* which isn't even enough for 4k@60? And introducing capture cards with DisplayPort only opens up to 25Gbps at *best*, I think? So until the industry comes out with a new transmission protocol/format, capture cards are literally not going to go past 4k@30.
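[Editor's note: since the paragraph above is reasoning about raw link capacity, here is a back-of-the-envelope sketch of the numbers. It assumes 8-bit 4:4:4 RGB (24 bits per pixel) and ignores HDMI blanking intervals, which add real overhead in practice; the 14.4 Gbps figure is HDMI 2.0's ~18 Gbps line rate minus its 8b/10b encoding overhead.]

```python
# Rough uncompressed video bandwidth vs. an HDMI 2.0 link budget.
# Assumes 24 bits per pixel (8-bit RGB 4:4:4) and ignores
# blanking-interval overhead, so real requirements are higher.

def bandwidth_gbps(width, height, fps, bpp=24):
    """Uncompressed pixel data rate in gigabits per second."""
    return width * height * fps * bpp / 1e9

modes = {
    "1080p60":  (1920, 1080, 60),
    "1440p60":  (2560, 1440, 60),
    "1440p144": (2560, 1440, 144),
    "4k30":     (3840, 2160, 30),
    "4k60":     (3840, 2160, 60),
}

# HDMI 2.0 signals ~18 Gbps on the wire, but 8b/10b encoding
# leaves roughly 14.4 Gbps for actual pixel data.
HDMI20_EFFECTIVE = 14.4

for name, (w, h, fps) in modes.items():
    gbps = bandwidth_gbps(w, h, fps)
    fits = "fits" if gbps <= HDMI20_EFFECTIVE else "exceeds"
    print(f"{name:>8}: {gbps:5.2f} Gbps ({fits} HDMI 2.0)")
```

By this rough pixel math, even 8-bit 4k@60 (~11.9 Gbps) squeezes under HDMI 2.0's effective budget, and 1440p@60 (~5.3 Gbps) is well within it; in practice the ceilings tend to come from blanking overhead, higher bit depths, and each card's supported mode list (EDID) rather than raw bandwidth alone.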

So the only way to work around those limitations is to reduce what you are sending to the capture card to a signal it can carry. Such as gaming at 2560x1440@120hz and sending as 1080p@60hz -- which requires you to define the capture card as such a display and send to it as such, which will cause possible issues with tearing, I presume? *OR* previewing via OBS at 1080p@60hz and sending *that* . . . which can also introduce tearing and other potential complications?

Do I understand, then, that the only ideal way I can do it is to play games at exactly the same resolution and refresh rate that the capture card supports? And I presume gsync has to be disabled?

As an aside, the BMD supports 1080p@60 and 4k@30. However, it does not explicitly state 2560x1440 as a supported resolution at all. Does this mean that it outright does not handle it, even though it should technically have the data transfer capacity for 2560x1440 at (I believe) near 60hz?

Sorry for the 20 questions and sorry for continuing them *here*. I've done a lot of research for quite a while and it is just so hard to find anything real about any of this. Most of the time, at best, it's just a bunch of cryptic youtube videos of streamers who somehow got some sort of a setup sort of working for them that they themselves don't really even understand and try to explain how it all works or how they got it to work. Actual facts and reasons for how things are seem elusive and all I've been able to do is patchwork things in my head based on my software dev background.

Anyway, thanks very much for your time again. It is very much appreciated.
 

Fenrir

Forum Admin
The OBS preview method does work, but it's not perfect. You might still run into some issues.

And yes, the lack of >60fps support does indeed extend into the professional space.

I don't use a 2 PC setup, I only capture consoles with my IP4K (PS4 and DreamCast) and have no issues with either, but they're only running at most at 1080p 60fps. I haven't tested the IP4K at 1440p, so I'm not sure offhand what the options are there.