Encoder overloaded when output set to NAS

twanzplays

New Member
Hello all,

Hope everyone is doing well. First time posting here.

OBS reports that the encoder is overloaded when I save recordings to my network attached storage. Saving recordings to local SATA or NVMe drives does not result in any encoder overload. I have tried both SMB and iSCSI to rule out the way Windows addresses an SMB share. I am using Nvidia's recommended output settings (MKV container, CQP 15).

System specs:
Ryzen 9 3950X
32GB of memory
RTX 2080 TI
Canvas: 1440p
Output: 1080p/60

The NAS unit is a Synology DS1821+ with 3 bonded Gigabit connections so other network traffic to and from the NAS should not interfere with the recording speed.

Everything is connected via Gigabit Ethernet, which might very well be the problem. But since the recordings are usually in the 100-200 Mbps range, I should have plenty of headroom on that connection. The NAS has two SHR arrays (RAID 5 equivalents): one all-SSD and one all-HDD. Both can saturate a Gigabit connection when reading and writing data, and both result in encoder overload when used as a recording location for OBS.
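As a back-of-envelope sanity check on that headroom claim, using the figures from the post (a single Gigabit link and the upper end of the observed recording bitrate):

```python
# Rough headroom check; figures taken from the post, not measured.
link_mbps = 1000          # single Gigabit link
recording_mbps = 200      # upper end of the observed recording bitrate

headroom = link_mbps / recording_mbps
print(f"{headroom:.0f}x headroom")  # 5x headroom
```

So raw bandwidth alone should leave roughly 5x headroom, which suggests the bottleneck, if any, is latency or how OBS writes to the share rather than throughput.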

Has anyone else seen a similar issue? Should I be doing something differently?

Thank you all for your thoughts and suggestions.
 

Lawrence_SoCal

Active Member
I don't know specifically. However, there were some comments made recently about how OBS handles network file shares, so I'm not completely surprised to see an issue like yours mentioned. I'm not saying a problem is expected, or what exactly the issue is.

With that said, from a basic compute/data perspective, you are adding an extra single point of failure by saving to NAS. For example, I record/stream to the OS/NVMe drive. After the recording is complete, I let a cloud backup sync the file. Only after that finishes do I move the resulting recording(s) off the NVMe drive to a local HDD for archiving. I understand the appeal of saving only once, but depending on the space you have, I'd be more inclined to save locally and have a script do the data move afterwards.
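The "save locally, move afterwards" approach can be sketched as a small script. This is only an illustration: the folder paths and the .mkv extension are hypothetical, and the age check is a crude way to avoid touching a file OBS may still be writing.

```python
# Sketch: move finished OBS recordings from a local NVMe folder to an archive
# location (e.g. a mounted NAS share). Paths below are hypothetical examples.
import shutil
import time
from pathlib import Path

SOURCE = Path(r"D:\OBS\Recordings")   # local NVMe recording folder (assumed)
DEST = Path(r"\\nas\recordings")      # NAS share (assumed)
MIN_AGE_SECONDS = 300                 # skip files that may still be recording

def move_finished_recordings(source=SOURCE, dest=DEST):
    """Move .mkv files that have not been modified recently; return new paths."""
    moved = []
    dest.mkdir(parents=True, exist_ok=True)
    for f in source.glob("*.mkv"):
        # Only move files untouched for a while, so active recordings are left alone.
        if time.time() - f.stat().st_mtime > MIN_AGE_SECONDS:
            shutil.move(str(f), str(dest / f.name))
            moved.append(dest / f.name)
    return moved
```

A script like this could run on a schedule (Task Scheduler on Windows), which is essentially what FreeFileSync or a cloud-backup agent does in a more robust way.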
 

twanzplays

New Member
Thank you for the insight. I have FreeFileSync scheduled to copy recordings nightly from the NVMe recording drive to the NAS. I thought I might record directly to the more robust storage on the NAS instead of to a single NVMe drive.

I thought iSCSI might solve the network-write issue, since iSCSI presents a block device that Windows can format as NTFS and use natively as a local disk, but my Gigabit connection might just not be up to the task.
 

Lawrence_SoCal

Active Member
A Gigabit link tops out around 125 MB/s, which is less than a typical SATA SSD's sustained write throughput, though still plenty for a 100-200 Mbps recording. Again, I understand going for robustness, but at that point you are writing a file over the network instead of to local NVMe. In a points-of-failure analysis you are also adding multiple cables, the switch, an extra power supply, etc. You are far better off saving locally and copying afterwards. Only in a datacenter-type setup, with redundant NICs, switches, power sources, a five-nines storage system, etc., would that not be the case.
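For rough numbers (typical published figures, not measurements from this system):

```python
# Approximate sustained sequential write throughputs in MB/s; typical figures only.
gigabit_link_MBps = 1000 / 8   # 1 Gbps ≈ 125 MB/s on the wire, before protocol overhead
sata_ssd_MBps = 500            # common SATA SSD sequential write (assumed typical value)
recording_MBps = 200 / 8       # a 200 Mbps recording needs about 25 MB/s

print(gigabit_link_MBps, sata_ssd_MBps, recording_MBps)  # 125.0 500 25.0
```

In other words, the network link is the slowest path here, but it still carries several times what the recording itself needs; any overload is more likely caused by write latency or buffering over SMB/iSCSI than by raw bandwidth.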
 