Question / Help Basic question regarding Avermedia C985 PCI-E 1x CC

bruschetta

New Member
dodgepong said:
If you're going to buy cables and adapters, I recommend http://www.monoprice.com/

I concur with Boildown's suggestion to convert DVI to HDMI rather than Displayport to HDMI.

Do neither of your monitors have DisplayPort input?

Primary:
http://www.amazon.com/gp/product/B0044U ... UTF8&psc=1

Secondary:
http://www.newegg.com/Product/Product.a ... 6824254100

Both include DVI-D inputs, and the primary monitor also includes an HDMI input.

I should say the secondary monitor is the cheap one; it's a basic LCD with a 60 Hz maximum refresh rate. The primary was a bit pricier, as it supports 3D and a 120 Hz maximum.
 

bruschetta

New Member
OK, so I had a lot of free time on my hands, and I put together this diagram to show which connections are available and which are currently in use...

http://i.imgur.com/Z0byXQm.png

I didn't mention earlier that I use a receiver for 5.1 surround because it seemed unrelated to the topic, but since we're talking more about connections now, I figured I'd include it. The LCD TV connects to the receiver, which in turn connects to the HDMI port of the GPU.

If you go to the NVIDIA Control Panel and select "Set up digital audio", you will see an area where a receiver or other device can be listed. When my receiver and the LCD monitor are both on, my receiver's name shows up and the GPU recognizes it. I can then configure the output in the Windows sound options, and bingo: 5.1 surround speakers.

Here is a blank diagram you can edit to show how YOU would set it up:

http://i.imgur.com/Hga7EIu.png
 

Boildown

Active Member
It looks like you'll need an external passive adapter to convert DisplayPort to DVI or HDMI... more info: http://en.wikipedia.org/wiki/DisplayPort#Overview. I'd be interested to find out how well it works with the capture card, but if it doesn't work, you can send it to the secondary display instead. There really is no other way I can see to do it.
 

Muf

Forum Moderator
Boildown said:
It looks like you'll need an external passive adapter to convert DisplayPort to DVI or HDMI...
Correction: you need an external active adapter. One that actually converts the signal, instead of mapping pins on the connector.
 

Boildown

Active Member
Muf said:
Correction: you need an external active adapter. One that actually converts the signal, instead of mapping pins on the connector.

The Wikipedia page I linked says:

The DisplayPort signal is not compatible with DVI or HDMI. However, Dual-Mode DisplayPorts are designed to transmit a single-link DVI or HDMI 1.2/1.4 protocol across the interface through the use of an external passive adapter that selects the desired signal and converts the electrical signaling from LVDS to TMDS. Analog VGA and dual-link DVI require powered active adapters to convert the protocol and signal levels, and do not rely on Dual-Mode.

1080p60 falls under single-link DVI / HDMI 1.2. So as long as the DisplayPort output on the video card is dual-mode, a passive adapter is fine. Either that, or the Wikipedia article is wrong, or I'm unaware of some other factor.
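
To sanity-check that with numbers, here's a quick back-of-the-envelope calculation in Python, using the standard CEA-861 timing for 1080p60 (2200x1125 total, including blanking):

# Does 1080p60 fit in single-link DVI / HDMI 1.2?
# Standard CEA-861 timing for 1080p60: 2200x1125 total incl. blanking.
H_TOTAL, V_TOTAL, REFRESH = 2200, 1125, 60

pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH / 1e6
print(f"1080p60 pixel clock: {pixel_clock_mhz:.1f} MHz")  # 148.5 MHz

SINGLE_LINK_MAX_MHZ = 165.0  # single-link DVI / HDMI 1.2 ceiling
print("Fits single-link:", pixel_clock_mhz <= SINGLE_LINK_MAX_MHZ)  # True

148.5 MHz is comfortably under the 165 MHz single-link limit, which is why the dual-mode spec only needs a passive adapter for this case.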
 

FerretBomb

Active Member
I believe the Wiki article is wrong. I know that I specifically needed to buy an active adapter to drive three monitors on my card; a passive adapter (which I picked up by accident first) will NOT do the job. Back in the day this was because video cards only had two DACs (a de facto standard that allowed two monitors without the extra cost of supporting three, which almost no one used). I'm not sure whether this still applies to DVI/HDMI, as they're supposed to be digital to begin with.

But I do know that a passive DP->HDMI adapter will NOT work in some cases, and an active one is a safer bet (if about ten bucks more expensive).
 

Muf

Forum Moderator
TMDS ports (DVI/HDMI) need a clock source per port because the frequency is derived from the video mode. All DisplayPort outputs can share a single clock source because they use a fixed frequency. To save costs, triple output graphics cards only have two clock sources, so if you want to use all three ports, you need either a native DisplayPort monitor or an active adapter.
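
If it helps, here's a toy model of that clock-source limit in Python. The two-PLL count and the output mixes are illustrative, not taken from any specific card:

# Toy model: a card with three outputs but only two clock generators.
# TMDS outputs (DVI/HDMI, including passive DP adapters) each consume
# a clock source; native DP (or an active adapter) does not.
def clocks_needed(outputs):
    """outputs: list of 'tmds' or 'dp' connections."""
    return sum(1 for o in outputs if o == "tmds")

AVAILABLE_CLOCKS = 2

for setup in (["tmds", "tmds", "dp"],     # two DVI/HDMI + native DP
              ["tmds", "tmds", "tmds"]):  # three TMDS via passive adapters
    ok = clocks_needed(setup) <= AVAILABLE_CLOCKS
    print(setup, "->", "works" if ok else "needs an active adapter")

This is exactly why FerretBomb's third monitor needed an active adapter: the active adapter presents itself to the card as a native DisplayPort device, so it doesn't consume one of the two clock sources.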
 

bruschetta

New Member
So to keep things simple, I swapped the HDMI-to-DisplayPort cable to run from my receiver (HDMI) to my video card (DisplayPort). Video and sound on the television look and... sound the same, so I'm fine with that. I then used an HDMI-to-HDMI cable to run from the XCAPTURE-1 to the HDMI port of my video card, leaving the DVI ports/cables as they were...

FerretBomb, you were correct: as advanced and 'powerful' as ultrabooks may seem (mine at least, for this intended use), they are not necessarily viable as a secondary/streaming PC. When attempting to stream Battlefield 4, which as you know is a CPU-intensive game, the ultrabook just wasn't cutting it: very laggy and sluggish viewer-end stream quality overall. As soon as I went live, the CPU load would immediately max out at 100% on all cores and rarely drop. I'm also unsure whether it was the laptop itself, the XCAPTURE-1, or both causing this, but there were massive flickering issues on the stream. So for now, I will continue to use the laptop for work, travel, and media.
 

Muf

Forum Moderator
bruschetta said:
FerretBomb, you were correct: as advanced and 'powerful' as ultrabooks may seem (mine at least, for this intended use), they are not necessarily viable as a secondary/streaming PC. When attempting to stream Battlefield 4, which as you know is a CPU-intensive game, the ultrabook just wasn't cutting it: very laggy and sluggish viewer-end stream quality overall. As soon as I went live, the CPU load would immediately max out at 100% on all cores and rarely drop. I'm also unsure whether it was the laptop itself, the XCAPTURE-1, or both causing this, but there were massive flickering issues on the stream. So for now, I will continue to use the laptop for work, travel, and media.
Did you enable Quick Sync?
 

bruschetta

New Member
Muf said:
Did you enable Quick Sync?

Let me explain my mistake. First off, I absolutely love that the OBS community is so involved and willing to help one another; these forums are AWESOME. Secondly, before I was even aware of OBS's existence, I fell into the world of XSplit, which is where the majority of my streaming understanding came from...

I have been using XSplit this entire time (since May of 2012?). About a year ago I purchased what I believe was a 2-3 year XSplit Pro subscription at a reasonable discount, which included some (not so) neat features. When trying to find answers to minute and advanced questions, I could not find anything useful outside the HardOCP and Tom's Hardware forums, which provided some decent responses. THIS place is the holy grail of stream troubleshooting, tech tips, and the like.

Having said that, about a month ago I downloaded OBS and became familiar with the program, and it did not 'look' (keyword: look) so different from XSplit, apart from having more in-depth and unfamiliar optional settings. Let me just say that streaming with OBS compared to XSplit on the ultrabook and XCAPTURE-1 setup is like night and day... I'm dumbfounded, at my own ignorance I suppose.

For one thing, the flickering issues and the overall jittery/sluggish viewer-end stream quality are completely gone (default CPU preset this entire time, 720p 60fps, Quick Sync off). And by the way, I do not believe XSplit has an option to enable Quick Sync. As for CPU utilization, I noticed that OBS uses slightly fewer CPU resources; it still hovers around the 95-100% mark, but not like XSplit (which was consistently at 99-100%, pretty discouraging).

Now, with Quick Sync enabled, CPU load went down to about 30-45% across all cores, rarely reaching 50%... Quality takes a beating with Quick Sync on, but the CPU performance is outstanding. I have yet to attempt 1080p at 30fps and 60fps; I will report back with the results. I have some questions:

1. Does lowering the CPU preset actually affect stream quality while Quick Sync is on? I *gradually* lowered it to medium, even reaching slow, and CPU utilization stayed mostly the same on my desktop and laptop, with no noticeable change in quality. I will NOT be trying this with Quick Sync turned off, since when it IS off and the CPU preset is at default, CPU load sits at 95-100% as I said above.

2. In a nutshell, or perhaps in some detail, how does Quick Sync combine with OBS to offer such outstanding performance? http://en.wikipedia.org/wiki/Intel_Quick_Sync_Video basically describes Quick Sync as prioritizing speed over quality.

I can safely say that OBS is now my primary and preferred tool for streaming.
 

R1CH

Forum Admin
Developer
1. The x264 presets do not affect quicksync since it's a completely different encoder (unrelated to x264).

2. QuickSync encoding is basically built into your CPU as a hardware feature, so it runs far faster than software. The quality of the QuickSync encoder is limited due to space concerns in the physical chip (more complex hardware uses more transistors).
 

Muf

Forum Moderator
If you can reach your desired frame rate consistently with Quick Sync disabled, and you're not using your laptop for anything else while streaming, I would just use x264 instead. The quality, as you noted, is higher with x264 than with Quick Sync. However, if you're having trouble maintaining a solid 30 or 60fps, Quick Sync may offer a solution, as lowering the preset (to superfast or ultrafast) will almost certainly drop x264's quality below what you'd get with Quick Sync (mostly because anything below veryfast disables in-loop filtering, which makes everything look very blocky).
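
If you want to compare the two encoders outside OBS, a rough side-by-side test with ffmpeg can make the tradeoff visible. This is only a sketch: it assumes an ffmpeg build with libx264 and Quick Sync (h264_qsv) support on your PATH, and input.mp4 is a hypothetical test clip:

# Rough side-by-side test of x264 presets vs. Quick Sync outside OBS.
# Assumes ffmpeg (built with libx264 and h264_qsv) is on PATH and
# "input.mp4" is a local test clip.
import subprocess
import time

TESTS = {
    "x264_veryfast.mp4": ["-c:v", "libx264", "-preset", "veryfast"],
    "x264_ultrafast.mp4": ["-c:v", "libx264", "-preset", "ultrafast"],
    "quicksync.mp4": ["-c:v", "h264_qsv"],
}

for outfile, encoder_args in TESTS.items():
    cmd = ["ffmpeg", "-y", "-i", "input.mp4",
           *encoder_args,
           "-b:v", "3500k",  # same target bitrate for a fair comparison
           "-an",            # drop audio; we only care about video
           outfile]
    start = time.time()
    subprocess.run(cmd, check=True)
    print(f"{outfile}: encoded in {time.time() - start:.1f}s")

Then eyeball the three output files: at the same bitrate, the slower x264 preset should look best, and Quick Sync should encode fastest with the least CPU load.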
 

bruschetta

New Member
R1CH said:
1. The x264 presets do not affect quicksync since it's a completely different encoder (unrelated to x264).

2. QuickSync encoding is basically built into your CPU as a hardware feature, so it runs far faster than software. The quality of the QuickSync encoder is limited due to space concerns in the physical chip (more complex hardware uses more transistors).

I see, thank you.

Muf said:
If you can reach your desired frame rate consistently with Quick Sync disabled, and you're not using your laptop for anything else while streaming, I would just use x264 instead. The quality, as you noted, is higher with x264 than with Quick Sync. However, if you're having trouble maintaining a solid 30 or 60fps, Quick Sync may offer a solution, as lowering the preset (to superfast or ultrafast) will almost certainly drop x264's quality below what you'd get with Quick Sync (mostly because anything below veryfast disables in-loop filtering, which makes everything look very blocky).

Interesting, I'll keep that in mind. A couple of things: in the video settings, when I downscale my resolution to, say, 720p, the 'Filter' drop-down menu becomes selectable. Would you suggest messing with this, e.g. changing it from bilinear to bicubic? Is this similar, in a sense, to increasing the resolution scale (like in Guild Wars 2 or BF4)? Also, does it affect the CPU, and by how much?

Also, the only real issue I have now is getting audio to my stream. I am using the 3.5mm splitter method on the headphone/speaker-out port of the desktop PC: one end connected to my headphones, and a 3.5mm male-to-male extension running to the only available port on the laptop, which is a headphone-out / audio-out 3.5mm jack. Is this correct? Maybe I am doing something wrong.
 

Muf

Forum Moderator
bruschetta said:
Interesting, I'll keep that in mind. A couple of things: in the video settings, when I downscale my resolution to, say, 720p, the 'Filter' drop-down menu becomes selectable. Would you suggest messing with this, e.g. changing it from bilinear to bicubic? Is this similar, in a sense, to increasing the resolution scale (like in Guild Wars 2 or BF4)? Also, does it affect the CPU, and by how much?
Downscaling in OBS is done on the GPU, so changing the filter only affects GPU usage. I would definitely recommend using Lanczos (the highest quality filter available).
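
Not how OBS does it internally (OBS scales on the GPU), but if you want to eyeball the difference between the filters yourself, a quick CPU-side sketch with Pillow works; frame.png is a hypothetical 1920x1080 screenshot:

# Compare downscale filters on a still frame. OBS does this on the
# GPU; Pillow on the CPU is just a way to see the quality difference.
# "frame.png" is a hypothetical 1920x1080 game screenshot.
from PIL import Image

frame = Image.open("frame.png")
target = (1280, 720)

for name, filt in (("bilinear", Image.BILINEAR),
                   ("bicubic", Image.BICUBIC),
                   ("lanczos", Image.LANCZOS)):
    frame.resize(target, filt).save(f"downscaled_{name}.png")

# Lanczos keeps fine detail (text, HUD elements) noticeably sharper
# than bilinear, at the cost of a bit more computation per frame.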
 