Can I separate the resolution of the recording from Livestream?

We had internet degradation this past Sunday. Our church service was 1 hour 16 minutes, but only 39 minutes made it to the Livestream. “Saved by the recording,” I thought. And sure enough, all 1 hour and 16 minutes were there, with the audio intact as well. But the video was pixelated from the time the internet started to struggle. I am guessing that the video for the recording was being degraded to match the reduced bandwidth that was sometimes available. Our internet bandwidth is contracted at 100 Mbps down and 20 Mbps up. I did a Speedtest on my phone and found, at that time, 20 Mbps down and 0.1 Mbps up.

I would like to set up the OBS system to maintain full fidelity on the recording, no matter what gymnastics it goes through to fit through the available Livestream pipe. TIA
 

AaronD

Active Member
Is the recording set to use the stream encoder? If so, then there's no way to avoid this. The singular encoder is being messed with to produce a bitstream that works, and "Use stream encoder" simply siphons that existing bitstream off to the file as well.

If you want them to be independent, then the recording needs its own separate encoder, which results in *two* encoders running simultaneously. Choose something other than "Use stream encoder", give it its own settings, and see if your system can do that.
 
That sounds like what I should do. Based on previous forum advice, we just upgraded the computer with a 14th-gen Intel processor and an Nvidia graphics card. I think we will be able to run two encoders. I am currently 500 miles from the Livestream computer, so I’ll check this on the weekend when I return. I appreciate the quick response. Thanks.
 

R1CH

Forum Admin
Developer
If you were using the dynamic bitrate option, this is expected in the recording as the bitrate affects the encoder. However if you were encountering dropped frames, these should not be visible in the recording as frame dropping happens on the network level after encoding has taken place.

A modern NVIDIA GPU with NVENC will have no trouble producing 2 streams, one at high bitrate / resolution for recording and a lower one for streaming.
 
No dropped frames, but yes, it was a dynamic bitrate. Interesting: the old computer, most likely set up differently a long time ago and running OBS 27.something, showed a bitrate during livestreaming of about 10k, and OBS claimed to be using 10% of the CPU. Some entity dropped frames from time to time; some weeks none and some weeks a fair amount. I assumed it to be internet related. The new computer shows a bitrate during normal operation of about 2.7k with an acceptable result, and not a dropped frame since installation last month, running on OBS “whatever is the latest stable version” (meaning I forget the exact number) recommended by the software on startup. CPU usage is 0.1%.

I believe the difference between 39 minutes and 1 hour 16 minutes was due to frequent “disconnecting”, “reconnecting”, “reconnection successful” cycles, with a tone recorded each time. The recording seemed to go back to full strength when we were offline from the livestream.
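A quick sanity check (a sketch using numbers from this thread: a ~2,700 kbps stream and the 0.1 Mbps measured upload; the 1.5x safety margin is just a common rule of thumb, not an OBS figure) shows why the connection kept dropping:

```python
# Rough headroom check: can the measured upload link carry the stream?
def upload_ok(stream_kbps: float, upload_mbps: float, margin: float = 1.5) -> bool:
    """True if the upload link has enough headroom for the stream bitrate."""
    return upload_mbps * 1000 >= stream_kbps * margin

print(upload_ok(2700, 20))    # normal Sunday, 20 Mbps up  -> True
print(upload_ok(2700, 0.1))   # degraded, 0.1 Mbps up      -> False
```

With only 0.1 Mbps available, even a heavily throttled stream cannot fit, so repeated disconnects are exactly what you'd expect.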
 

Lawrence_SoCal

Active Member
details matter... an Intel 14th Gen CPU doesn't say which CPU (they range from low-power, battery-optimized puny units that struggle to web browse, to powerhouse models... and lots in between)... same for GPU... model #s matter ;^)... Presuming an i7 and a 4060 or better, you'll be fine (probably even for 4K video, which my 4+ yr old system should handle as well).

On our i7-10700K and 1660 Super, I run a 1080p30 stream at 7,000 kbps and then Record at about 3.5x that bitrate with no issue (and barely using CPU/GPU -> 10-20%?). I originally set this up when we were on an unstable DSL (AT&T)... fortunately they gave us a free upgrade to Fiber and I'm much happier since.

Because livestream video gets so highly compressed (as expected with the free video storage of a typical CDN (YouTube/Facebook, etc)), I'd not want to use that resulting video for marketing, etc. ie when handing video off to a family from a service (Baptism, Wedding, etc), I'd much rather be working from much higher quality locally recorded video. Our services are about the same length, and our Recordings are typically around 11GB. Just starting to make a dent, 4.5 years later, on a 4TB HDD set up for archiving Recordings (around $60 2.5 yrs ago)
- I Record to an SSD and was attentive to Disk I/O. I then auto-remux and move the resulting video file(s) to the Archive HDD when convenient
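Those file sizes follow directly from bitrate and duration; here's a sketch of the arithmetic (the 24,500 kbps figure is just 3.5x the 7,000 kbps stream rate mentioned above, and the estimate ignores audio and container overhead):

```python
# Estimate recording size from video bitrate and duration.
# size (bytes) = bitrate (bits/s) * seconds / 8
def recording_size_gb(bitrate_kbps: float, minutes: float) -> float:
    """Approximate file size in decimal GB for a constant-bitrate recording."""
    bits = bitrate_kbps * 1000 * minutes * 60
    return bits / 8 / 1e9

# A 75-minute service recorded at 3.5x a 7,000 kbps stream rate:
print(round(recording_size_gb(24_500, 75), 1))  # -> 13.8
```

That lands in the same ballpark as the ~11 GB per service noted above; the actual recording bitrate is evidently a bit under the full 3.5x.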
 
14th gen i7-14700; 16GB; 1TB SSD; cannot recall the Nvidia model. It was midrange of the selections available, about a $400 add-on to the config. Lenovo brand, Win11 Pro 64.
 

Lawrence_SoCal

Active Member
Probably a 4060 (guessing on OEM markups)... way more powerful than what I have... so as long as careful with background OS processes (ie beware OS and certain app defaults, like MS Office/OneDrive, etc) you should be fine for even 4K video when the time comes for a camera upgrade [though obviously no real point in that until you have better Upload bandwidth]

And you may need to have someone be more attentive to network bandwidth controls (vs the typical next-to-zero controls)... ie Quality of Service (QoS) to ensure your livestream traffic is unhindered [there are numerous approaches to this, and it depends on the specifics of your environment]. I've seen numerous Houses of Worship with a single Internet connection that also carries Guest WiFi... or the office PCs, etc... any number of things can cause unexpected bandwidth contention. The challenge for a typical HoW is having someone around who understands networking well enough to address this.
 
Thanks. I am learning this stuff by braille (bumping into things and learning how to avoid them next time). I’ll make sure that streaming has priority on the network, even if most people who worship here don’t know how to connect to our WiFi. And I’ve already bumped into OneDrive and disabled it on all the church systems (awful thing). We use a local Synology DiskStation for backups, and none happen on Sundays.

When I get back into town I’ll confirm I put a strong enough NVIDIA in there.

I appreciate the continued dialog here. Each post is helpful.
 
Update: I separated the encoding streams and it worked well. Thanks for all the comments. No internet outage this week for the real test. But both the stream and the recording looked very good.

The Nvidia in the system is a T1000 8GB GDDR6.

I chose x264 encoding rather than NVENC for both, an arbitrary choice. Perhaps I should change that.
 

AaronD

Active Member
Update: I separated the encoding streams and it worked well. Thanks for all the comments. No internet outage this week for the real test. But both the stream and the recording looked very good.
Great! I wonder how you would simulate the problem that you actually had. You can't just pull the plug; you'd have to reduce the bandwidth but keep the connection. Is there a setting in your router that you can squeeze down and put back?

I chose x264 encoding rather than NVENC for both, an arbitrary choice. Perhaps I should change that.
x264 is okay if you have a monster CPU. It does the video encoding in software on the CPU, like any other task, and that takes a fair amount of work.

If you can offload that job to a chunk of silicon that doesn't do anything else anyway, it frees up the CPU for other things. NVENC is that dedicated chunk of silicon on an NVIDIA GPU, and it does nothing else anyway. Some GPUs support only 1 encode session, some support 2 (your stream and recording, separately and simultaneously, with independent settings), some support more. Yours does at least 2, so you should be good there.
 

Lawrence_SoCal

Active Member
A quick Google search indicates the T1000 is a number of years old (2019 mobile, 2021 desktop?) and supposedly a slightly cut-down version of the GTX 1650... not sure if that's the model you have? If yes, it seems an odd pairing for a 14th gen Intel CPU
https://www.techpowerup.com/gpu-specs/quadro-t1000-mobile.c3435 2019
https://www.techpowerup.com/gpu-specs/t1000.c3797 Desktop 2021
This card ? https://www.lenovo.com/us/en/p/accessories-and-software/graphics-cards/graphics_cards/4x61j52233

Regardless, it should have NVENC and allow GPU encode offload, freeing up the CPU as noted earlier

The primary reasons to use CPU x264 are: 1. you have lots (and lots) of spare CPU cycles, and 2. you prefer the look of the CPU-encoded version over the NVENC version.

Since you want to save the Recorded video locally as a backup, and if a much higher quality video is desired for alternate uses (ex. putting it on a DVD for a family, a snippet for marketing, or whatever), then there is a good chance you'll want to record at a higher bitrate that may well cause issues for a consumer CPU, and using Nvidia NVENC (instead of CPU x264) would be advisable... but it depends
 
Update: Yes, Lawrence, we have the card you show in the Lenovo link above.

I separated the encoding for the Livestream and the recording, and used NVENC for both. Looking at the Windows device list, GPU 0 was the Intel processor and GPU 1 was the Nvidia. But when I put GPU 1 in the spec on OBS, it barfed. It worked using GPU 0 and NVENC.

My Task Manager in Win11 does not show the GPU as a column as I have seen in other posts, and I have not figured out how to add it. But using the Nvidia tools, with a stream and recording started, the GPU is in the mid-30% range.

Now I am playing with the quality settings on the advanced video output screen of OBS, turning them up incrementally. Is there a best practice for an HD stream at 30fps for these settings using NVENC, or do I just play with it until I like it or it craters?
 

Lawrence_SoCal

Active Member
$400 for that card... ouch... grossly over-priced... oh well... if you were within the return window, I'd return it... but otherwise, live and learn

Yea, I've avoided Win11 for a reason so far... can't help there.

As for advanced output settings - no, there really isn't an optimal/best practice, as it depends on system workload vs resource availability, and desired output (which varies by use case)
- the main thing to be aware of is that Record file sizes climb quickly at higher bitrates, for only incrementally better video (past a certain point). That is where it is usually personal preference in these circumstances (the professional world has metrics, but they're not really useful for us). So my recommendation is to think about intended use cases, and leave some buffer room to work with. In my case, an approx 75 min 1080p30 recording at 11+GB is plenty... I have the disk space and hardware resources. I could go even higher quality, but for the extremely rare scenario where that would be desired, I'm OK with what we have. And folks who see the local recording are impressed, so I stopped fiddling at that point. I could most likely get by with smaller video files, but it's not really worth the time/effort considering the low cost of archive HDD space. I had more important things to work on in terms of automating and documenting our processes.
- The question, I suspect, presuming you're NOT pushing hardware capabilities (which you shouldn't), is what video quality you want post video editing. If just taking snippets of video, you can use avidemux and avoid re-encoding. Otherwise, determine which video editor would be used (Resolve, Premiere, iMovie, whatever), and then edit a Recording (beware: not all editors like the same video settings). Then output/encode the edit. Do you/whomever like the output? Adjust from there, if required.
Assuming single edits from source (not multi-generational edits, with lossy encoding losses each generation), my goal was to Record at better quality than I need, so that the post-edit output is at least as good as desired (I targeted someone watching on a large-screen TV or 4K monitor; yes, for 1080p30 content)

- the above being the rationale I used. not all that sophisticated, and worked for us.
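For a concrete starting point on the "how high a bitrate" question, one widely used heuristic (a rule of thumb, not an OBS or NVENC recommendation; the bpp values below are my own assumed ballpark figures) is bits-per-pixel: multiply resolution by framerate by a quality factor, roughly 0.1 bpp for a stream and a few times that for a local recording:

```python
# Rule-of-thumb starting bitrate: width * height * fps * bits-per-pixel.
# bpp ~= 0.1 is a common streaming baseline; ~0.3+ for a quality local recording.
def suggested_kbps(width: int, height: int, fps: int, bpp: float) -> int:
    """Suggested starting encoder bitrate in kbps for the given video mode."""
    return round(width * height * fps * bpp / 1000)

print(suggested_kbps(1920, 1080, 30, 0.1))   # 1080p30 stream    -> 6221
print(suggested_kbps(1920, 1080, 30, 0.35))  # 1080p30 recording -> 21773
```

Those numbers line up reasonably with the 7,000 kbps stream and ~3.5x recording figures mentioned earlier in the thread; from there, adjust by eye as described above.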
 