Camera detected when video is standard definition, not high definition

I've noticed a couple of weird things. When I plug in an HDMI capture card and try to avoid using VLC, my Mac cannot pick up a camera over USB and a high-definition picture over HDMI at the same time.

But when I use an S-Video capture card for analog video, or an HDMI card that down-converts to 480p, I can have a camera and a capture source simultaneously.

Two questions. First, the computer with the HDMI card maxes out at USB 2.0, but it also has a Thunderbolt dock. If I were to route either the HDMI capture or the camera through a USB 3 port on the Thunderbolt dock, would that give the hardware enough headroom to accept both the high-definition video and a camera?

Second, why does it seem like it only accepts one source, either a camera or high-definition video, but not both? It also seems like OBS relies heavily on VLC. I would use VLC, except it clogs the home network for my dad's videos, and any time I use VLC without connecting to a network it causes errors. Is that just the way OBS is built, to use VLC for video?
 
Hello? Any help? Why can't I have an HDMI-over-USB 2 input and a camera input at the same time, unless I use a "degraded to standard definition" setup? In that case I can use an Xbox 360 Vision Camera or a PS2 EyeToy alongside the capture.

Is only a certain amount of data allowed to be put through at once?
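
Here is my rough math, in case it helps explain what I'm seeing. This is only a sketch and assumes the capture device sends uncompressed YUY2 at 2 bytes per pixel, which I believe is common for cheap HDMI-to-USB dongles; real devices may compress, so the exact numbers could be off:

```python
# Back-of-the-envelope bandwidth check (assumes uncompressed YUY2,
# 2 bytes per pixel; actual devices may use MJPEG or other compression).

USB2_MBIT = 480  # USB 2.0 high-speed signaling rate in Mbit/s (theoretical max)

def video_mbit(width, height, fps, bytes_per_pixel=2):
    """Approximate uncompressed video bit rate in Mbit/s."""
    return width * height * bytes_per_pixel * fps * 8 / 1_000_000

for name, (w, h) in {"1080p": (1920, 1080),
                     "720p": (1280, 720),
                     "480p": (640, 480)}.items():
    rate = video_mbit(w, h, 30)
    print(f"{name} @ 30 fps ~ {rate:.0f} Mbit/s of a {USB2_MBIT} Mbit/s bus")
```

If that math is anywhere near right, 1080p alone (~995 Mbit/s uncompressed) would need roughly double what USB 2.0 can carry, and 720p (~442 Mbit/s) nearly fills the bus by itself, while 480p (~147 Mbit/s) leaves plenty of room for a webcam on the same bus. That would match what I'm seeing, but I'd appreciate confirmation.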

Also, which is better: SD plus a webcam, or HD game footage only? Since I plan to do Jukebox and Jackpots, an honesty cam on my hands would be full disclosure. Plus the screen shrinks anyway.
 