Xbox to laptop

Aderyn

New Member
Hi, I will be getting a capture card to connect an Xbox Series S to a laptop, to use the laptop as a monitor for playing downloaded games only (no recording or multiplayer at this point). The help page for this seems to be missing. Do I choose the virtual camera options in the setup, i.e. Auto-Configuration Wizard = virtual camera, with the source set as Video Capture Device? I am not sure whether "streaming" in the settings means online multiplayer, live streaming to YouTube, or streaming from an Xbox to a laptop, so I am a bit confused. Thanks in advance.
 

AaronD

Active Member
The virtual camera is fed from OBS, as something for other apps to pick up. You can use it to "professionally" produce your feed to an online meeting, for example, instead of them always seeing your camera by itself.

When you plug in the capture card, it'll appear in the same list as a physical camera and behave the same way.

---

By the way, I hope you didn't get one of those cheap, ubiquitous, deceptively-advertised things. They're around $15 to $20, and make all kinds of claims, but what you actually get is practically random.

A common problem is that they make a big deal about being USB 3, and how much better USB 3 is than USB 2 (which it is), but what you actually get is a USB 3 *connector* with a USB 2 chip behind it. Thus it's limited to USB 2 speeds despite all the hype!

USB 2 doesn't have the data rate to handle HD video. Just not happening. So a USB 2 capture card has to compress the video *in the card*, before it even gets to the PC, just to cram it through such a small hose. The de-facto standard for that compression is MJPEG, which is simply a JPG still image of each frame, with no knowledge of the other frames. So if you've seen the "JPG fuzzies" on a photograph or screenshot, that's what it's doing to your video, *in the card*, not the PC or OBS.
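A quick back-of-envelope calculation shows why. The figures below are assumptions for illustration: 16 bits per pixel for uncompressed YUY2 (4:2:2) video, and roughly 280 Mbps of real-world USB 2 throughput out of the nominal 480.

```python
# Rough check (assumed round numbers): raw HD video vs. USB 2 bandwidth.
# This is why a USB 2 capture card must compress (usually MJPEG) on-board.

def video_bitrate_mbps(width, height, fps, bits_per_pixel=16):
    """Raw bitrate in megabits/s. 16 bpp assumes uncompressed YUY2 (4:2:2)."""
    return width * height * fps * bits_per_pixel / 1e6

USB2_EFFECTIVE_MBPS = 280  # real-world ceiling, well under the nominal 480

hd1080p60 = video_bitrate_mbps(1920, 1080, 60)  # ~1991 Mbps
hd720p30 = video_bitrate_mbps(1280, 720, 30)    # ~442 Mbps

print(f"1080p60 raw: {hd1080p60:.0f} Mbps vs USB 2 ~{USB2_EFFECTIVE_MBPS} Mbps")
print(f"720p30 raw:  {hd720p30:.0f} Mbps -- still too big by a wide margin")
```

Even modest 720p30 needs more bandwidth than USB 2 can realistically deliver, so the card has no choice but to compress before the video ever reaches the PC.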

To fix that, stick with the name brands, for at least a chance at some honesty and accountability. And expect to pay about $100 per channel.

Also do some research and try to find one that works on Linux, even if you'll never use Linux yourself. The reason is that Linux has drivers built-in for almost every existing standard that there is. So, working on Linux probably means that they've followed one of those standards, instead of rolling their own. Following an existing standard means that anyone can and someone probably will write a driver to support that standard on future systems.

A proprietary thing will likely be abandoned soon after its one production batch is done, which turns perfectly good hardware into e-waste: the maker won't write a driver for the system you're going to need in the future (to avoid getting hacked on the internet), and no one else can either, because the protocol is still a trade secret. (Or at least, it takes a lot more effort to eavesdrop on the communications of an old working system and reverse-engineer it.)
 

Aderyn

New Member
Thank you, I haven't bought anything yet, but yes, I was going to buy a cheap (very cheap) one. I did notice reviews mentioning the USB issue, which seemed a tad dodgy. My second plan was to just buy a used monitor or TV screen, which is probably the simplest option.
 

qhobbes

Active Member
I have an Elgato HD60X. It works great for capturing, but when using the OBS preview as the screen there is some input lag. This is expected, so keep that in mind for the games you would want to play this way. I only tried it with Gran Turismo 7 (single player); the lag was noticeable but not game-breaking.

I would go with the used monitor or tv screen.
 