Schauerland
Member
So, I know that I and some other people have problems with scene rendering (rendering lags/stalls).
It seems that rendering drops frames if the scene is larger than 1920x1080 (I have a 1440p monitor, so my scene is 2560x1440).
Now, if the GPU is fully stressed BUT the in-game FPS is still 60, the scene render starts dropping frames.
This results in smooth gameplay but a non-smooth recording (there are no encoding lags).
This gets even worse when I use NVENC to encode, but whatever.
I have no rendering lags when the scene is 1920x1080, even if I use the same in-game settings (1440p and so on).
Other recording tools can still record smoothly in that situation (because they don't have to render a scene, just encode the captured frames; I understand that - see the rough sketch at the end of this post).
The question is: how can this be improved (can it even be improved)?
My first thought was an option to render the scene on the CPU, but I assume the scene render process is so deeply embedded in the code that you can't extract it and run it on the CPU?
Any other ideas?
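Just to make clear what I mean by the difference between rendering a scene and just encoding captured frames, here is a rough sketch. This is NOT OBS code, the function names are made up, and it only exists to show the extra per-frame GPU step I'm talking about:

```
#include <cstdio>

// Hypothetical stand-ins for the real capture/encode steps - names are made up.
struct Frame {};

Frame capture_game_frame()       { return Frame{}; } // grab the game's frame
Frame composite_scene(Frame f)   { return f; }       // GPU scene render (sources, overlays, scaling)
void  encode_frame(const Frame&) {}                   // hand the frame to the encoder (x264/NVENC)

int main() {
    // Compositing recorder: every output frame needs a GPU scene render
    // before encoding, so a fully loaded GPU can stall this step,
    // which shows up as dropped/lagged render frames in the recording.
    for (int i = 0; i < 60; ++i) {
        Frame f = capture_game_frame();
        Frame composited = composite_scene(f); // extra GPU work per frame
        encode_frame(composited);
    }

    // "Dumb" capture tool: no scene to render, the captured frame
    // goes more or less straight to the encoder.
    for (int i = 0; i < 60; ++i) {
        Frame f = capture_game_frame();
        encode_frame(f);
    }

    std::puts("pipeline sketch only - no real capture here");
    return 0;
}
```

That extra composite step is (as far as I understand it) why GPU load hurts the recording even while the game itself still runs at 60 FPS.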