Question / Help I want to stream 1080p60

Aegrit

New Member
Hey guys,

I'm new here, and I came across someone who streams Battlefield 4 at 1080p with 60 FPS. He lists his specs as a second PC with 2x Xeon 2690s and a Datapath VisionDVI-DL capture card. For the x264 encoding, would two 2690s or two 2695s work better? Are the extra cores worth the extra $$$, or does it not really matter? His rig seems to do the job with the 2690s anyway.

I also have no experience with capture cards. The specs on the Datapath card are amazing, but they seem a bit overkill; I won't be connecting Stryker towers in a surgical environment or CAD equipment, just streaming. What should I be looking for in a capture card? I only want 1080p at 60 FPS, so what strikes you as an alternative to this $1,500 card?
 

alpinlol

Active Member
You're talking about widgitybear... his system is actually performing so well that he records everything at the same time in XSplit. Then again, I don't really understand the point of spending so much money on streaming, since 1080p60 is just pure shit with Flash.

Capture-card-wise, you actually do need a really strong capture card for 1080p60, since you have to transfer a lot of data. Then again, his card has something like a 600 Mbit transfer rate, if I remember correctly, which is truly far more than you actually need.
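For reference, here's roughly how much raw data 1080p60 actually is (a quick sketch assuming 24-bit RGB; the real number depends on the pixel format the card delivers):

```python
# Rough uncompressed bandwidth for a 1080p60 capture feed (my own math,
# assuming 24-bit RGB; actual needs depend on the card's pixel format).
width, height, fps, bits_per_pixel = 1920, 1080, 60, 24

bits_per_second = width * height * fps * bits_per_pixel
print(f"~{bits_per_second / 1e9:.2f} Gbit/s")    # ~2.99 Gbit/s
print(f"~{bits_per_second / 8 / 1e6:.0f} MB/s")  # ~373 MB/s
```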
 

Aegrit

New Member
Yeah, exactly, widgitybear. I didn't want to link his stream since I don't know the forum rules! I guess that would have been fine, though. I see what you mean about 1080p60 being shit with Flash; I've read a few articles. My PC and internet connection are already pretty good, so I could view his stream without any issue, but I can see how many people would have difficulty viewing a stream at that quality.

As far as capture cards go, it's something like that 600 Mbit transfer rate, 600 or 650. I don't think I would need that much juice and could go with less. The Vision DVI-DL can capture at 2K resolutions, so it is definitely overkill. Maybe I want overkill though, I'm not sure yet. I just know that I know very little about capture cards, and I'm hoping for some alternative suggestions! Thanks for your response, alpin! :)
 

FerretBomb

Active Member
Yep, at this point even trying to stream 1080@60 is a terrible idea. You really don't have the bandwidth available (if streaming to Twitch, 3500 is the cap, which is barely enough for halfway decent 1080p) to make it work, regardless of preset. Even past that, the Flash player tends to drop a smoking fat one trying to decode at that res/rate.

More cores mean more computing power, the ability to use a slower encoding preset, and better quality video. But at that point you're probably running into diminishing returns pretty damn hard.

The Datapath Vision DVI-DL is going to be fairly futureproofed for when streaming starts to go into the 4K arena (supports a 4Kx4K canvas, not just 2k). However, if you're just looking for 1080@60 capture, take a look at the Micomsoft SC-512N1-L/DVI (http://www.solarisjapan.com/sc-512n1-l- ... ure-board/). About $330 USD, but it apparently works quite well.

Just to repeat: trying to livestream (especially to Twitch) at 1080p@60fps is, generally speaking, non-viable right now. To get even baseline 'acceptable' image clarity you'd have to drastically exceed 3500kbps; figure closer to 6000 as a starting point. At that point many of your viewers will complain about buffering, and the Flash player may fall behind, desync audio/video, stutter, or simply crash outright.
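To put rough numbers on that (a quick back-of-the-envelope; the ~0.1 bits-per-pixel figure is just a common ballpark for fast-motion game footage, not a hard rule):

```python
# Back-of-the-envelope bits-per-pixel check. The ~0.1 bpp "acceptable quality"
# figure is a rough rule of thumb for fast-motion game footage.

def bits_per_pixel(bitrate_kbps, width, height, fps):
    return (bitrate_kbps * 1000) / (width * height * fps)

for label, kbps, w, h, fps in [
    ("1080p60 @ 3500kbps", 3500, 1920, 1080, 60),
    ("1080p60 @ 6000kbps", 6000, 1920, 1080, 60),
    ("720p60  @ 3500kbps", 3500, 1280, 720, 60),
]:
    print(f"{label}: {bits_per_pixel(kbps, w, h, fps):.3f} bpp")

# 1080p60 @ 3500kbps: 0.028 bpp  <- far below the rule of thumb
# 1080p60 @ 6000kbps: 0.048 bpp  <- still thin, hence the buffering
# 720p60  @ 3500kbps: 0.063 bpp  <- why 720p60 is the usual compromise
```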
 

Lain

Forum Admin
Developer
There is only one real problem with streaming 1080p 60fps (other than needing a powerful computer to encode it): Flash. Its H.264 decoder is terrible, and it's too slow to decode 1080p 60fps unless you have a powerhouse machine. It's terrible.
 

Aegrit

New Member
Thanks, FerretBomb. For sure, buffering is an issue; I ran into it a bit on widgitybear's stream. I don't know that I would want to stream at that quality regularly or religiously, but I think I would like to be able to at times. Sort of a way to throw both my money & time away in style ;) Seriously though, I am currently having some issues with chop getting 1080p30, which is something I may want for YouTube. Granted, this chop is occurring on the slow preset; it's fine on medium. I do hear what you're saying, that 720p60 is more realistic for streaming, and I really, really appreciate the voice of reason!!! :)

I like the thought of futureproofing, and it's not a bad idea at all. Thanks for the link to an alternative capture card.

I do wonder a bit about the futureproofing aspect, though. It's impossible to predict the pricing of these components when the time comes. When streaming does go into the 4K arena... how long do you think it will be until that happens, anyway? There's a lot of hype surrounding an upcoming Haswell 8-core, and I wonder how much closer that would get to the 4K mark, and whether I should just wait for that release. Would you be able to share any further thoughts on that? I suppose I'm really still thinking about what I want to do at this point, and I'm not going to jump to any conclusions. Thanks again!
 

Boildown

Active Member
Pretty sure all of the Datapath cards listed here can do 1080p60: https://obsproject.com/forum/viewtopic.php?f=11&t=1180 . So can a number of others, like the ExtremeCap U3, the Micomsoft SC-512N1, and the Micomsoft XCapture-1, plus probably others that I'm forgetting.

The reason to use the Datapath -DL, which is an awesome card I'd love to have myself, is to be able to capture higher resolutions and/or framerates than 1080p60 on the input side; for example, 1080p120 or 1440p60. On the output side it's reduced or downscaled back to 1080p60. If you don't have a 120Hz or 1440p monitor, you don't need that.

And there's yet another way to stream 1080p60 with two PCs that you should know about, one that doesn't need a capture card at all: https://obsproject.com/forum/viewtopic.php?f=18&t=6757 . Personally I'd look into this option first. I may need to use it myself if I ever get a G-Sync monitor, since that would eliminate the possibility of cloning my screen for my capture card (I have a Datapath -E1S).

Edit: Aww I was distracted and beaten to the punch by a bunch of posts.

I wouldn't worry about 4K. No one is going to want to game in any competitive sense at that resolution, because the framerates will be so low. Even if you did game at that resolution, streaming it would be a nightmare; take what FerretBomb was saying and multiply it by four. In reality everyone will do a 2x downscale and stream it at 1080p at the highest. I would actually predict that 1440p will be the next "thing" for competitive gaming, but it'll still be downscaled to 720p or 1080p for streaming.
 

FerretBomb

Active Member
I'd say the investment would pay off more in being able to run a comparatively slow preset, allowing you to either drop the bitrate aggressively (meaning less buffering for your viewers), or keep the bitrate as-is and inch a little closer to the never-attainable 1:1 video stream. It's one of the things I've been looking into more lately, debating between a secondary encoder box serving optional duty as a standalone gaming rig, or just going with consumer-grade parts.

At the moment, the $330 option is probably going to be your best choice, assuming you don't ever intend to run at 1440p (the Next Big Thing according to people who have a 1440p monitor, and something that no one else really gives half a used fig about).

Really, to stream in 4K you're going to need a dedicated encoder rig with current-day processors and whatnot. Probably more than a dual-CPU setup, if possible; I'd expect a quad-socket quad-core build with HT, even if the rate of diminishing returns hits somewhere after 24 threads (currently). Even then, you could just populate three of the sockets if you're on a budget. When you're looking at a setup along those lines, $1,500 for a capture card that can pull it in is going to be fairly middling-to-low on the list of things that may break the budget.
 

Boildown

Active Member
Very Slow is so much more computationally heavy than even Medium that I'm not sure it's even possible to do it in realtime with the best multicore CPU setup available. You'd probably spend a truckload of money just to get to the Slow preset and wonder what you've done. But I could be wrong; what happened to the guy who was getting that 16-core setup recently?
 

Boildown

Active Member
From: http://r-1.ch/analyze-twitch-vod.php?ur ... F502521606


Video:
Resolution: 1920x1080
FPS: 60
Keyint: ± (estimated)
Audio:
aac, 44100 Hz, stereo, s16
x264 Settings:
vbv-maxrate: 6500
vbv-bufsize: 6500
crf: (OBS: q?)
rc: cbr
Estimated preset: medium

So you see, even with that badass dedicated encoding PC, he's only managing the Medium preset, and he's "cheating" by using a 6500/6500 bitrate/buffer.

If you run the trial version of FRAPS, put the overlay in the upper left corner and watch the framerate you actually get playing back one of his streams (or anyone's stream) in fullscreen mode. For me, it hovers around 40 but varies between 30 and 50 pretty wildly. It works this way for anyone's 60fps Twitch stream. If I view a YouTube video, however, FRAPS doesn't show this kind of variability in framerate.

So anyways, the point is that trying to duplicate this guy's quality is a fool's errand, because unless you get Twitch to partner with you, you can't stream at a high enough bitrate to make it look good, and even if you did, Twitch's flash implementation can't show 60 FPS anyways. You'd be better off streaming 1080p40, as that's all that Twitch/Flash can really manage, and that way your bitrate isn't spread out among so many frames.
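To illustrate the per-frame budget point with some quick numbers of my own (not output from the analyzer):

```python
# Per-frame bit budget at a fixed bitrate -- my own illustration of the
# "bitrate spread over fewer frames" point.
bitrate_kbps = 3500

for fps in (60, 40, 30):
    print(f"1080p{fps} @ {bitrate_kbps}kbps: ~{bitrate_kbps / fps:.0f} kbit per frame")

# 1080p60: ~58 kbit/frame
# 1080p40: ~88 kbit/frame
# 1080p30: ~117 kbit/frame
```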
 

dodgepong

Administrator
Community Helper
Analyzing a VOD also doesn't give the frame time graph that I mean...you have to run the analyzer while the stream itself is actually live, and use the analyzer I linked above (not the VOD analyzer).

But yes, widgitybear streams at a very high bitrate (higher than Twitch wants people to stream at), and he is also partnered, which means that if people can't watch his super-epeen 1080p60 6500kbps stream (i.e. most people), they can choose lower-quality options. That's not an option for the majority of streamers.
 

Aegrit

New Member
Thanks, guys. Yeah, I did notice some trouble watching the stream. I understand it may not be viable, maybe for several years, maybe longer. But come on, who doesn't want higher quality?? ;) I do agree with Boildown that 1080p40 might be a good alternative. I also think 720p60 and trying to reach those slower presets would be cool; like Ferret said, try to get to that 1:1.

Boildown, I think the thread you were referencing is: viewtopic.php?f=5&t=10722

I'm definitely more interested in the ability to ramp up the encoding done by the CPU and reduce the bitrate. The CPUs I was looking at are listed below. I was thinking of possibly doing two of the 2690s; we've already seen their ability to handle 1080p60 at 6000+ kbps bitrates.

(x2) 2690 2.9GHz, 8 cores, 16 threads (16 cores, 32 threads)
(x2) 2695 v2 2.4GHz, 12 cores, 24 threads (24 cores, 48 threads)

I've been doing a lot of research and have really begun to spin my wheels. As much as I've read about x264, it hasn't helped me understand it much better. I've read somewhere that up to 22 threads in use with OBS is recommended, and that it takes a lot of resources to keep syncing threads beyond that point, resulting in diminishing returns. And this wiki (http://mewiki.project357.com/wiki/X264_Settings#threads) states there is an internal limit on threads set at 128, and that it realistically should not be set that high.

It's obvious that 2x 2695s would give better performance for multi-threaded processing up to their ceiling of 48 threads. It's my understanding that OBS could utilize all 48 threads, or 46 threads leaving 2 for the OS, etc. But how well would it utilize threads 23-46? How would it perform compared to 2x 2690s with higher clocks and 32 threads? I would be interested in some benchmarks, but I don't think they're out there. Is the limitation x264 & Flash more so than the hardware?
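For what it's worth, my reading of that wiki page is that with threads left on auto, x264 asks for roughly 1.5x your logical cores (hard-capped at 128). A quick sketch of what that means for these two setups (my own arithmetic and assumptions):

```python
# With "threads" on auto, x264 requests roughly 1.5x the logical core count,
# hard-capped at 128 (per the wiki linked above). Numbers are my own arithmetic.

X264_THREAD_MAX = 128

def auto_threads(logical_cores):
    return min(int(logical_cores * 1.5), X264_THREAD_MAX)

for name, logical in [("2x 2690 (32 threads)", 32), ("2x 2695 v2 (48 threads)", 48)]:
    print(f"{name}: x264 auto ~= {auto_threads(logical)} frame threads")

# 2x 2690:    auto ~= 48 frame threads
# 2x 2695 v2: auto ~= 72 frame threads
# Either way that's well past the ~22-24 threads where the syncing overhead
# (and the small quality cost of extra frame threads) starts eating the gains.
```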

I guess we're talking about a very niche thing here: building a mini-server for streaming. There's an 8-core Sandy Bridge coming, probably within the next 3 or 4 months, and the 8-core Haswell coming probably some time from August to October. Anyone should be able to easily stream 720p60, probably even up to the medium preset in OBS, when these processors hit.

TL;DR: Would it be better to have 24 overclocked threads, or would a 48-thread setup be better? I'm wondering how much the higher thread count is limited in OBS, and hoping someone with more experience can help! Thanks for reading.
 

Boildown

Active Member
The x264 thread limitation makes it pretty clear to me that any cores beyond 2/3rds of that thread limit are going to be wasted. The thread limit, as you noted, is 22; let's round that to 24 and take 2/3rds of it. So 16. I guess if you really want to spend a lot of money, you could turn off hyperthreading and make it 16 full cores, and that would mean getting two of those 2690s.

If you look at the chart of Intel server CPUs, as the core count goes up, the speed of each core goes down. So I don't think you really want to start going to the higher core-count CPUs, certainly not the 15-core that is coming in a few months, because it's clocked a lot slower. Theoretically you could do 4x 4-core CPUs, because they're clocked the highest, but I don't know if there's a penalty for not having all the cores on the same CPU.
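Here's a crude way to put numbers on that tradeoff, just cores times base clock (my own arithmetic; real x264 scaling is nowhere near this linear, and the 3.4GHz quad is a hypothetical high-clock part):

```python
# Crude "aggregate core-GHz" comparison (cores x base clock) for the
# clock-vs-core-count tradeoff. Treat it as a rough proxy only.

options = {
    "2x E5-2690    (2 x 8c  @ 2.9GHz)": 2 * 8 * 2.9,
    "2x E5-2695 v2 (2 x 12c @ 2.4GHz)": 2 * 12 * 2.4,
    "4x quad-core  (4 x 4c  @ 3.4GHz)": 4 * 4 * 3.4,  # hypothetical high-clock quads
}
for name, core_ghz in options.items():
    print(f"{name}: ~{core_ghz:.1f} core-GHz")

# 2x E5-2690:    ~46.4
# 2x E5-2695 v2: ~57.6
# 4x quad-core:  ~54.4
# On paper the 2695s "win", but once x264 stops scaling past ~16-24 threads,
# the 2690s' higher per-core clock is worth more in practice.
```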

Personally I think this is all a waste of money, a huge waste of it lol. If I were going to spend multi-thousands on a dedicated encoder, I would get two six-cores at the highest clocks available and turn hyperthreading on, for 24 virtual cores, or just get one eight-core when the new ones come out in a few months. I'm just worried that you'll still only be running "Slow" when it's all said and done, and not reach anything like Slower or Very Slow. And I wonder whether anyone would even notice the difference at whichever one you reach.

Here's an idea: try to get Widgititydude to stream at 3500 for a bit to help you decide if it's worth it.
 

Videophile

Elgato
In the end, I think it would be better to get a 4930K (or, once the new X99 chipset comes out, one of those) and just stream at 720p60. 1080p is going to look fairly pixelated at any bitrate lower than 6500, as shown by Widgity. I too once had the dream of 1080p, but it's just not worth it. At 6500, 720p60 looks absolutely gorgeous.

Just my $.02.

-Shrimp
 