Image blurry, ghosting through VGA capture

hunterjwizzard

New Member
I am attempting to use OBS to digitize the display output from a retro gaming PC. I am not recording anything or doing anything with audio; I'm just trying to play the old computer on a big modern display that it doesn't natively support. As I have it configured currently, the image is blurry with some ghosting to the right.

Log files:

Here's the setup: the old PC has VGA out attached to a 1x4 VGA splitter and a CRT monitor. A port on the splitter is then attached to the DVI In on a Datapath VisionAV-SDI capture card via a passive VGA-to-DVI adapter. The source PC is running at 1024x768, 32-bit color. It's probably running at 60 Hz, but the setting just says "optimal" on that computer.

The "capture" PC is a windows 10 system with 32gb of RAM and an 8-core 3.0ghz Xeon CPU, dual nVidia 1070s in SLI, running the OBS 30.0.2.

I'm fairly new to OBS, any help is appreciated.
 

AaronD

Active Member
Something in your analog (VGA) chain is not fast enough to keep up with the high-speed analog signal. Could be anything, but your description says it has to be on the analog side. The digital side just takes what it gets, already messed up before the conversion.

1024x768p60 is 47,185,920 pixels per second. It's actually higher, because the signal also spends some time drawing a black frame around the video that you don't see and that doesn't count towards the official resolution, which makes the electronics of a CRT much easier. But I'm ignoring that here.

The worst case, a full-black and full-white single-pixel vertical pinstripe, would be half that, or about 24 MHz on each of the color signals. To pass it accurately, though, you need a fair amount of headroom, ideally about 10x the maximum frequency. So if something can't handle a 240 MHz analog signal while also keeping its amplitude accurate, you'll probably have some ghosting/smearing to the right for the higher-frequency parts (fine detail), as you describe.
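For reference, here's that arithmetic as a quick Python sketch (it ignores blanking, as above; the real pixel clock for the standard VESA 1024x768@60 timing is 65 MHz):

width, height, refresh = 1024, 768, 60

pixel_rate = width * height * refresh  # 47,185,920 pixels per second
worst_case_hz = pixel_rate / 2         # 1-px on/off pinstripe: ~23.6 MHz per color
headroom_hz = worst_case_hz * 10       # ~10x headroom: ~236 MHz analog bandwidth

print(f"pixel rate:   {pixel_rate:,} px/s")
print(f"worst case:   {worst_case_hz / 1e6:.1f} MHz")
print(f"10x headroom: {headroom_hz / 1e6:.0f} MHz")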

---

I made my own custom VGA processor about a year ago, based on what I had done with analog audio before then, and what I knew about how the video signal works. My own circuit design and board, hand-soldered with paste and hot air.

Coming from the relatively low-frequency audio world (20 kHz max), I got the rules of thumb all wrong for the high-frequency world of video, and got the same ghosting/smearing to the right that you describe. The audio-sensible resistor values that I used interacted with the parasitic capacitance that every circuit board has, making a lowpass filter. (That happens in every circuit, including audio, but there the cutoff frequency is well above audible, so it doesn't matter.) I swapped them all out for much lower values (closer to a short circuit but still resistive), and the ghosting/smearing went away.
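To put some hypothetical numbers on that (illustrative values, not my actual board): the series resistance and the board's parasitic capacitance form a simple RC lowpass with a corner at f_c = 1 / (2*pi*R*C):

import math

def rc_cutoff_hz(r_ohms, c_farads):
    # -3 dB corner of an RC lowpass: f_c = 1 / (2*pi*R*C)
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

c_parasitic = 20e-12  # assume ~20 pF of board parasitics (hypothetical)

print(f"10 kOhm: {rc_cutoff_hz(10e3, c_parasitic) / 1e6:.2f} MHz")  # ~0.80 MHz - kills video
print(f"100 Ohm: {rc_cutoff_hz(100, c_parasitic) / 1e6:.2f} MHz")   # ~80 MHz - much better

An audio-sensible 10 kOhm puts the corner below 1 MHz, nowhere near the ~240 MHz figure above; dropping the resistor two orders of magnitude moves the corner up by the same factor.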

Part of my reason to use the too-high resistor values was to not exceed the datasheets' maximum current specification for each of the active parts. Audio often goes to the full supply rail, so that's the number you put into that equation. What I didn't realize yet is that video *doesn't* go to the full supply rail, and it almost never sees a short circuit like audio often does. So the audio rule of thumb, about surviving full-scale into a dead short, is all wrong for video: it protects against something that never happens, and causes other problems in the process. Maybe the designers of one of your VGA boxes made the same mistake?
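To illustrate with hypothetical datasheet numbers: if a part is limited to, say, 50 mA of output current, the audio rule sizes the series resistor for the full rail into a short, while VGA video only ever swings about 0.7 V into a 75-ohm terminated load:

i_max = 0.05    # assumed 50 mA datasheet limit (hypothetical)
v_rail = 5.0    # audio rule: full supply rail into a dead short
v_video = 0.7   # VGA full-scale video level
r_load = 75.0   # standard VGA termination

r_audio_min = v_rail / i_max     # audio rule demands R >= 100 Ohm in series
i_video = v_video / r_load       # video worst case: ~9.3 mA, well under the limit

print(f"audio rule: R >= {r_audio_min:.0f} Ohm")
print(f"video draw: {i_video * 1e3:.1f} mA into {r_load:.0f} Ohm")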
 

hunterjwizzard

New Member
Interesting.

So I did some testing based on this and comments by others. I took the VGA splitter out of the chain, and that made the ghosting stop, but the image was still a bit fuzzy. (Also, Windows 98 doesn't play well with having the capture card as its only monitor, hence my initial use of the splitter.)

I then took another desktop with a DVI port and hooked that directly to the capture card: no splitters, no adapters, just DVI to DVI. The image was still fuzzy even then, which leads me to suspect I am missing some settings in OBS.
 

koala

Active Member
In Settings > Video, the Base (Canvas) Resolution and the Output (Scaled) Resolution should exactly match the capture resolution of your capture device, so that no rescaling is performed within OBS.
According to your log, your capture device has some rather strange settings:
15:39:03.667: [DShow Device: 'Video Capture Device'] settings updated:
15:39:03.667: video device: Datapath VisionAV-SDI Video 01
15:39:03.667: video path: \\?\root#media#0000#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\dgc133video@0
15:39:03.667: resolution: 672x506
15:39:03.667: flip: 0
15:39:03.667: fps: 160.09 (interval: 62465)
If your analog device outputs 1024x768 at 60 Hz, it should read resolution: 1024x768 and fps: 60 here, not 160.
If these are the default or automatic settings of the capture device, explicitly set the actual, desired values in the capture device properties in OBS.
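For what it's worth, that interval field looks like a DirectShow frame interval in 100-nanosecond units, which is where the odd 160.09 comes from:

interval = 62465                    # from your log, in 100 ns units
print(f"{1e7 / interval:.2f} fps")  # 160.09

print(f"{1e7 / 60:.0f}")            # 166667 - the interval a true 60 Hz source would report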
 

hunterjwizzard

New Member
Ah ha!

That might actually have done the trick.

I went to Settings > Video, and both the base and output resolutions were set to 1920x1080. I changed both to 1024x768 and set the FPS value to 60, still testing with the DVI output system.

The image is still a little bit fuzzy, but then I set up a comparison. I won't go into details, but the image looks about as sharp as another 1024x768 signal displayed in 4:3 on the same monitor. I had been hoping to get a sharper picture. Are there any other tips for fine-tuning an image?
 