Question / Help Urgent (NEW CPU FOR SLOWER / SLOWEST PRESET)

DrLove

New Member
Hi everyone! I have been lurking on the forums for many months, and decided to make an account since I just can't find out exactly what I need (CPU-wise).

Now I am wondering if any of you would be able to help me out...

I am building a new PC at the minute, in the 3,100 Euro range. The thing is, I have been a verified Twitch streamer for 4.5 years (old partner).
My streaming build is basically dedicated; I don't use the PC for anything else, as I stream with OBS through my PS4.

My PC is 90% ready; the only thing I need now is to find the right CPU. At the moment I am running a Ryzen 5 1600X (a very good CPU).
I can run OBS at the "SLOW" preset with 720p60, and 1080p60 on the "NORMAL" preset.

Now I am in a bit of a pinch. Does anyone know whether the Ryzen Threadripper 1950X or the Ryzen 7 2700X is capable of running the Slower / Slowest presets at either of those resolutions (without frame drops)?

I have done a lot of research, but I can't find anything solid about it. So before I shell out 1,000 Euros for the Threadripper (or not, lol), I'd like to be certain.


This was my first post here, so I am sorry if it isn't in the right category. I really hope someone can give me some answers, as I have run dry looking online; been seriously everywhere :(
 

carlmmii

Active Member
I don't have personal experience with any of the Ryzen chips, but we can make some educated guesses.

In regards to the 2700X: compared to your 1600X, it's going to be better at encoding, yes. 8 cores vs 6 cores, a slightly higher clock speed, better IPC. If I had to guess, you'd see about a 25-40% improvement in encoding throughput, but that's just a guess -- not many review sites do actual stream-encoding comparisons, so the only thing to go by is scaling up by the spec difference (minus the usual overhead).

Does this translate into a bump in cpu quality preset ability? Maybe. Will it go farther than the "SLOW" preset for 1080p60? Probably not.

Threadripper is in a different league.
 

alexitx

New Member
It depends on many factors, especially the games you will be streaming. If there is fast movement, lots of detail, etc., you will need a very powerful CPU.

I don't have much experience with either of those CPUs personally, but I've talked to a few streamers about hardware requirements for streaming certain games at high resolution and framerate (720p60 / 1080p60). As far as I know, they use a TR 1920X or 1950X at 3.80 / 4.00 GHz for streaming action / adventure / triple-A games at 1080p60 on the Slow preset. Even then, these CPUs are pushed to their limits.

In my opinion the 2700X would be a big improvement over the 1600X for heavier presets, but Threadripper is the way to go for streaming 1080p60 triple-A titles (and similar) on the Medium / Slow presets and below, according to relatively big streamers who use these CPUs and have quite a lot of experience.
 

johnypilgrim

New Member
If you're going for 1080p60 and the slowest encoding settings and have the money to spend, then Threadripper is the way to go. However, if you can hold off purchasing for 35-60 days, the second-generation Threadrippers are going to be released before September, including their new 32-core/64-thread monster. Even if you don't go for that particular model, we're looking at a performance improvement comparable to the jump between first- and second-gen Ryzen, at the same prices.
 

DrLove

New Member
Hey guys, thank you all very much for the replies and info. I am actually VERY familiar with OBS; I've been testing different settings and looking up others who have tried the same for the last 1.5 years. My Ryzen 5 1600X was able to run 720p60 on the SLOW preset at a 4500 kbps bitrate, and it looked REALLY good, but there's always a way to go higher. Again, thank you!

I also decided to put my own rig together! This is what I got:

Cooler Master MasterCase MC500
AMD Ryzen Threadripper 1950X (overclocked to 4.0/4.1 GHz)
G.Skill Trident Z RGB DDR4 3200MHz quad kit, 32GB
Asus GTX 1080 Ti ROG Strix OC 11GB GDDR5X
ASRock Fatal1ty X399 Professional Gaming
2x Samsung 860 EVO 500GB (SSD)
Corsair AX1200i 1200W (I know 1200W might be a little overkill, but it's future-proof lol)

I wonder how good I can get it to look on the graphics card alone. In my old rig I have a GTX 1050 Ti (which is a REALLY good card imho).

Might post an update once I've tried it all!
 

Harold

Active Member
The presets slower than Medium offer very minimal quality improvements for the amount of extra CPU time they require.
 

Boildown

Active Member
Take your vertical resolution and divide by 40 and you have a rough estimate of the number of threads that x264 can utilize when encoding. By default, x264 opens 1.5 threads per logical CPU core. If you do the math with these two facts, it leads to the conclusion that you should get a 6C/12T CPU for 720p video. But I've found that x264's default 1.5x thread calculation tends to open too many threads, and you get better performance using a threads=x custom option, where x is an integer smaller than the default (just for CPUs with more threads than cores, i.e. hyperthreaded, or whatever AMD calls it).

So doing the math again for 720p with hyperthreading turned off: you can make use of up to an 18-core CPU, and more than 18 cores is just wasted, assuming the vertical/40 estimate is correct. But considering that clock speed goes down as core count goes up in any CPU you can buy, you'll want to stop short of an actual 18C CPU.

For 720p I'd probably go for the fastest 8-, 10-, or 12-core CPU you can buy (actual cores, not logical cores). For an 8-core I'd keep hyperthreading on (probably), and for a 12-core I'd turn it off (probably; actual testing would decide for certain). For 1080p: 1080/40 = 27, and 27/1.5 = 18. So you can use an 18-core CPU, but a 27-core CPU is probably overkill (and you lose clock speed, so overkill is bad).
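The arithmetic above can be sketched in a few lines of Python. This is my own illustration of the rule of thumb described in this post; the /40 estimate and the 1.5x default are heuristics, not official x264 constants:

```python
# Sketch of the thread-count rule of thumb (heuristic, not an x264 spec).

def max_useful_threads(vertical_res):
    """Rough ceiling on threads x264 can utilize: vertical resolution / 40."""
    return vertical_res // 40

def default_x264_threads(logical_cores):
    """x264's default spawn count: about 1.5 threads per logical core."""
    return int(logical_cores * 1.5)

def suggested_physical_cores(vertical_res):
    """Physical cores (hyperthreading off) whose default 1.5x spawn
    just reaches the useful-thread ceiling."""
    return round(max_useful_threads(vertical_res) / 1.5)

print(max_useful_threads(720))         # 18 threads are useful at 720p
print(suggested_physical_cores(720))   # 12 physical cores
print(max_useful_threads(1080))        # 27 threads are useful at 1080p
print(suggested_physical_cores(1080))  # 18 physical cores
```

Note that a 6C/12T CPU also lands on the 720p ceiling, since 12 logical cores x 1.5 = 18 threads; the difference is whether those threads run on physical or hyperthreaded cores.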

Slower is probably achievable for 720p, but for 1080p I doubt it's even possible (for a typical high-action video game). The CPUs at the ideal core counts just don't have the IPC to pull it off yet.

All this is theory, and I haven't seen anyone do a legit analysis of it. The tradeoffs are many and hard/expensive to test, because you can't just buy a high-core-count CPU and thread-limit it -- the clock speed will differ. You'd actually have to buy or borrow the various CPUs and test them, and that's ridiculously expensive.

When reviewers do their x264 performance tests, they typically use the same preset and measure the time. But we need live encoding, not encoding of a premade video: time is the constant, not a variable. No one tests this way. They'd also have to make sure their results are repeatable. I tried to bench my CPU using the same one-minute input video and got widely varying results; only when I ran 5-minute tests did I even approach repeatable results. Five minutes or more per test is just too long for CPU reviewers, so they don't do it. Reviewers also use movies as input in their x264 tests, not game recordings, and the video complexity of the average Overwatch match is far above that of just about anything in TV and cinema.

Now for the non-theoretical, my practical experience:

I have a dual Xeon E5-2670 (8C/16T per CPU) as described here: https://www.reddit.com/r/Twitch/comments/47bzdc/budget_friendly_secondary_streaming_pc_guide . I can do 720p40 on Slow (actually a hybrid between Slow and Slower; sorry, I haven't tried 720p60 -- I streamed like this when I was limited to 3000 kbps up). And I can do 1080p60 on Medium, now that I have faster upload speeds. I thought 1080p60 on Medium at 6 Mbps looked excellent, and I have no intention of spending any money to improve on it. Maybe when I go ultrawide 1440p (I've been thinking about a monitor upgrade for years) I'll consider it, but not yet; it just wouldn't result in a noticeable improvement. Considering my CPUs are Sandy Bridge-era, there ought to be significant improvements in more modern CPUs.
 

Kenshin9977

New Member
Take your vertical resolution and divide by 40 and you have a rough estimate of the number of threads that x264 can utilize when encoding. x264 opens 1.5 threads per logical CPU core by default.
Dividing by 40 and then by 1.5 is equivalent to dividing by 60.
Otherwise, thanks -- when looking for info about x264's "slowest" preset and how many cores are used, you're the only one who gave a concise answer.
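The divide-by-60 shortcut is easy to sanity-check numerically (my own quick check, using the same heuristic figures from the post above):

```python
import math

# (height / 40) useful threads, spawned at 1.5 threads per logical core,
# works out to height / 60 logical cores. Check a few common resolutions.
for height in (480, 720, 1080, 1440, 2160):
    assert math.isclose((height / 40) / 1.5, height / 60)

print("height/40 threads at 1.5 per core == height/60 logical cores")
```

So for 720p that is 12 logical cores, and for 1080p it is 18, matching the figures worked out earlier in the thread.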
 