OBS branch with AMD VCE support.

jackun

Developer
@Daranah For now, AMF assumes you game on an AMD card, as it uses the Video Adapter index from OBS' Video settings to select the DirectX device. If you are only doing local recordings, uncheck "Use AMF instead..." to use the older OVE API. OVE should only select AMD cards, but you can set the device index or "topoID" if it still selects an nVidia card. You probably also need to check "Disable D3D11 OpenCL interop".
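For reference, a minimal sketch of what selecting the DirectX device by adapter index looks like. This is illustrative only, not the plugin's actual code; the assumption is that adapterIndex comes straight from the Video Adapter setting.

#include <d3d11.h>
#include <dxgi.h>

// Hypothetical helper: create a D3D11 device on a specific DXGI adapter.
ID3D11Device* CreateDeviceOnAdapter(UINT adapterIndex)
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return nullptr;

    // Whatever card sits at this index is what the encoder runs against;
    // if the index points at an nVidia card, AMF ends up on the wrong device.
    IDXGIAdapter* adapter = nullptr;
    if (FAILED(factory->EnumAdapters(adapterIndex, &adapter))) {
        factory->Release();
        return nullptr;
    }

    ID3D11Device* device = nullptr;
    D3D11CreateDevice(adapter, D3D_DRIVER_TYPE_UNKNOWN, nullptr, 0,
                      nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, nullptr);

    adapter->Release();
    factory->Release();
    return device;  // nullptr on failure
}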
 

Deleted member 30350

This might sound pretty stupid, but will this make it into the official builds at some point? Hardware encoding would be a huge improvement.
 

jackun

Developer
The official build is getting scrapped anyway, so I think I won't bother with a merge (there's some VCE/OpenCL-specific code in OBS' main executable too). But obs-studio will most likely get its own VCE encoder as well.
 

AndersE

New Member
So I'm recording to file only, at 30 fps and 1050p, using a 7950. 60 fps was too stuttery, so I gave up on it fairly quickly. 100% resolution can give pixelated results regardless of bitrate; I'm still experimenting from time to time, but the slightest downscale solves it anyway. Other than that I'd been recording fine for the last few weeks, until the other night when I had to reduce my CPU clock speed: the stutter came back, occurring every now and then, maybe a clear hitch every minute or so, but sometimes more frequently. I've selected the Speed preset, since that seems to be what's recommended; that's all I've done. Does anybody have ideas on how to optimize this for me? I can sacrifice quality to get rid of the stutter. I'm recording at bitrates around 40-50k, and it wasn't a problem before.
 

dping

Active Member
So I'm recording to file only, at 30 fps and 1050p, using a 7950. 60 fps was too stuttery, so I gave up on it fairly quickly. 100% resolution can give pixelated results regardless of bitrate; I'm still experimenting from time to time, but the slightest downscale solves it anyway. Other than that I'd been recording fine for the last few weeks, until the other night when I had to reduce my CPU clock speed: the stutter came back, occurring every now and then, maybe a clear hitch every minute or so, but sometimes more frequently. I've selected the Speed preset, since that seems to be what's recommended; that's all I've done. Does anybody have ideas on how to optimize this for me? I can sacrifice quality to get rid of the stutter. I'm recording at bitrates around 40-50k, and it wasn't a problem before.
We need your full specs and a log file. Are you using AMF or OVE? Try, just try, enabling DX11 interop and the other DX11 setting.
 

Siavash

New Member
What does the Twitch analyzer say? r-1.ch/analyzer/
The Twitch analyzer gave me:
  • Resolution: 1280x720
  • Frame Rate: 30 FPS
  • Audio: aac, 44100 Hz, stereo, s16, 117 kb/s
  • Total Bitrate: 1227 kbps
  • H264 Profile: Main
  • x264 Preset: veryfast / superfast (estimated)
  • Stream Uptime: 0h 3m 13s
The video always looked like 30 fps (the actual framerate); it's just the source filters reporting 15/15.17. I'm glad the Twitch analyzer sees the video as it is, though.
I also used the nginx-rtmp stat page and the ffprobe/MPC-HC/VLC source filters to test the stream or file, and they show the video as 15/15.17 fps.

@Siavash Change the file extension to mp4 if it isn't already. FFDShow shows 15 fps with FLV regardless of the actual fps. Maybe something is off with the H264 bitstream, meh.
I am talking about streaming, but I dumped to mp4 and the source filter still claims 15/15.17 fps.
Funny, I can count ~30 frames just looking at the frame counter, but the filter shows 15/15.17.
 
Last edited:

Scyna

New Member
Anyone got any ideas how many B frames are needed on an AMD 290? Never mind, I tried 1, 2, 3, 4, 5, 15, and 16 B frames and the video looks messed up. I'm guessing it's broken.
 
Last edited:

jackun

Developer
2015-01-20 [64bit]: This is a quick fix for a mutex race condition until I get some time to do a more thorough check.

Try it if you get a crash like:
Video thread stack trace:
Stack EIP Arg0 Arg1 Arg2 Arg3 Address
ntdll.dll!0x7fff5c510c8a
ntdll.dll!0x7fff5c4b8b61
ntdll.dll!0x7fff5c4b7124
obsvceamf.dll!VCEEncoder::RequestBuffers+0xbe
obs.exe!OBS::MainCaptureLoop+0x1795
obs.exe!OBS::MainCaptureThread+0x9
kernel32.dll!0x7fff5c1713d2
ntdll.dll!0x7fff5c4f03c4

Encode thread stack trace:
Stack EIP Arg0 Arg1 Arg2 Arg3 Address
ntdll.dll!0x7fff5c510c8a
ntdll.dll!0x7fff5c4b8b61
ntdll.dll!0x7fff5c4b7124
obsvceamf.dll!VCEEncoder::Encode+0x68
obs.exe!OBS::ProcessFrame+0xdb
obs.exe!OBS::EncodeLoop+0x4aa
obs.exe!OBS::EncodeThread+0x9
kernel32.dll!0x7fff5c1713d2
ntdll.dll!0x7fff5c4f03c4
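The traces show the video thread inside RequestBuffers and the encode thread inside Encode, both waiting. A minimal sketch of the kind of locking fix this implies, assuming a shared buffer list; the member names here are illustrative, not the plugin's actual code:

#include <mutex>
#include <vector>

class VCEEncoder {
    std::mutex m_bufferMutex;           // one lock guards the shared list
    std::vector<void*> m_inputBuffers;  // stand-in for the real buffer type

public:
    // Called from the video thread (OBS::MainCaptureLoop in the trace).
    void* RequestBuffer()
    {
        std::lock_guard<std::mutex> lock(m_bufferMutex);
        if (m_inputBuffers.empty())
            return nullptr;
        void* buf = m_inputBuffers.back();
        m_inputBuffers.pop_back();
        return buf;
    }

    // Called from the encode thread (OBS::EncodeLoop) when a buffer is done.
    void ReturnBuffer(void* buf)
    {
        std::lock_guard<std::mutex> lock(m_bufferMutex);
        m_inputBuffers.push_back(buf);
    }
};

Both paths take the same lock, so the two threads can no longer race on the list.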
 

molkemon

New Member
OK, first of all, thanks for this awesome branch; the difference in performance hit is staggering.

Sadly, it seems that for now there is no way to achieve decent quality at 3500 kbit/sec for a 1920x1080@30 recording; high motion is badly pixelated. That's OK though, since I mostly record for YouTube, and the video looks fine from 7000 kbit/sec upwards.

I currently only record World of Warcraft.

That being said, I'm a bit unsure as to which settings are optimal (is there any sort of documentation for this branch outside this forum post?)

Currently I'm using: Bitrate: 7500, Buffer Size: 15000, IDR: 60, GOP: 15. I'm fairly certain these settings are alright.
Quality preset, peak VBR encoding.
As already mentioned, I record 1080p@30fps.

Now my main question is on OVE vs AMF. Which is better, or is either of them better at all? Using AMF with the DX11 engine crashes WoW and OBS; DX9 and Host work fine though. I read that Host offloads some tasks to the CPU, but since for WoW both CPU and GPU are somewhat capped at ~50% each, it doesn't seem to matter much. I read in this post that AMF has a slightly bigger impact on performance, but this is very hard for me to verify. Also, quality-wise, both OVE and AMF seem identical to me (very pixelated at 3500, good at 7000+). I really would just like to know which one is "better".

B-Frames/other confusion:
I have an MSI 3G Gaming R9 280. The non-X 280 supposedly shouldn't even support VCE, but it works without problems. That said, even the X version is listed under VCE 1.0, which supposedly doesn't support B frames. Setting B frames (with either OVE or AMF) seems to work okay though (I set it to 6 and get no errors or anything). On the other hand, I don't even know if it actually encodes with B frames; MediaInfo doesn't tell. Is there any way I can find out if the encoding has B frames?

Also, even though I have set VCE to use VBR, on the Video settings tab CBR and CBR padding are ALWAYS checked. If I uncheck them, save, leave, and reopen the settings, they are checked again. VBR encoding is working though (if I stand still in-game, the average bitrate of the video is far below the maximum allowed, so this seems to be just a UI bug). I can, however, change the quality modifier (1-10) if I uncheck CBR, change it, and save; the number will stick. Also, which number should I use here? I suppose with high bitrates (7k+), 10 should be OK?

Sorry for the load of questions^^
 

dping

Active Member
Sorry for the load of questions^^

Q: That being said, I'm a bit unsure as to which settings are optimal (is there any sort of documentation for this branch outside this forum post?)

A: For individual features, yes and no. If you google a term like GOP, you'll want to search for it as "GOP h264"; that's how I got the descriptions for 50% of the features. The other 50% comes from the presets folder: open it and you'll get a brief description of what each preset does.

Q: Now my main question is on OVE vs AMF. Which is better, or is either of them better at all?

A: OVE has been in development for a while now, which means its code, though old, is more mature than AMF, which was released with Media SDK 1.1 in August 2014. Newer might mean new features, but it can also mean it's not completely optimized. That being said, I don't lean towards OVE, for a few reasons of personal preference and because OVE won't do 48 fps while AMF will.

Q: Is there any way I can find out if the encoding has B frames?

A: The R9 260, 260X, 285, 290, 290X, and 295 can all do B frames. The 270, 270X, 280, and 280X cannot, but the R9 280 does have a VCE encoder, probably the same one the 280X has, and they are both on the VCE list on the wiki. Since the R9 280 is a rebranded 7950, it will work like the rest of the 7900-series cards.


Q: ...10 should be OK?

A: Play with it. Which VBR are you using? There should be a few in the VCE window. From what I can see, the VCE CBR/VBR settings override the settings in the main encoder window. As for the quality balance, playing with it will change your max bitrate output, or at least it used to, from what I remember.

Also, the max bitrate for 1080p@30:

// - 20 MBits/sec for 1080p and frame rate less than or equal to 30 FPS
 

jackun

Developer
Q: Is there any way I can find out if the encoding has B frames?

With AMF, open the log window; if BPicturesPattern is greater than zero, you have B frames. It doesn't count as an error if setting the B frame count fails, but if needed, I could turn more failed settings into "fatal" ones.
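Roughly, what that boils down to in the public AMF SDK; a hedged sketch only (the property name is the one the log prints, but the surrounding function is hypothetical):

#include "components/VideoEncoderVCE.h"

// Try to enable B frames and report whether the device accepted them.
bool TryEnableBFrames(amf::AMFComponentPtr encoder, amf_int64 pattern)
{
    // AMF_VIDEO_ENCODER_B_PIC_PATTERN is the L"BPicturesPattern" property.
    AMF_RESULT res = encoder->SetProperty(AMF_VIDEO_ENCODER_B_PIC_PATTERN, pattern);
    if (res != AMF_OK)
        return false;  // VCE 1.0 parts reject it; warn instead of aborting

    amf_int64 actual = 0;
    encoder->GetProperty(AMF_VIDEO_ENCODER_B_PIC_PATTERN, &actual);
    return actual > 0;  // greater than zero means B frames are in use
}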
 
Last edited:

Deleted member 30350

Question: does this type of encoding produce better quality than using CPU and x264?
 

molkemon

New Member
With AMF, open the log window; if BPicturesPattern is greater than zero, you have B frames. It doesn't count as an error if setting the B frame count fails, but if needed, I could turn more failed settings into "fatal" ones.

Ah thanks, BPicturesPattern does fail, so this works as expected.
Well, maybe not a fatal abort, but some sort of warning message would be good, like "B frame encoding failed; B frames are not supported by your device, encoding without B frames".

Question: does this type of encoding produce better quality than using CPU and x264?

No, x264 is far superior in quality. This method is significantly faster though; you'll have almost zero impact on gaming performance, but you need to use MUCH higher bitrates (1080p@30 fps looks terrible at anything below 7-8k kbit/sec).
 

Deleted member 30350

Oh that sucks :( What's the point of hardware encoding then? Quality matters just as much as performance. I mean, my usual settings (downscale to 720p, 30 fps, 2500 kbit) look borderline crap at times when streaming BF4.
 

Osiris

Active Member
Oh that sucks :( What's the point of hardware encoding then? Quality matters just as much as performance. I mean, my usual settings (downscale to 720p, 30 fps, 2500 kbit) look borderline crap at times when streaming BF4.

If you are just recording, you are not limited by the bitrate, so you can crank that up to achieve the same quality as x264 gets at a lower bitrate. For streaming you are severely limited on the bitrate.
 

Lucil

Member
Just how do B frames work, and what's their advantage to us, if any? Whatever number I put in, the video looks weird.
I understand how they work in the MPEG style, but that's more about video editing and storage space,
but in the OBS AMF it just looks horrid.
 
Last edited:

dping

Active Member
Just how do B frames work, and what's their advantage to us, if any? Whatever number I put in, the video looks weird.
I understand how they work in the MPEG style, but that's more about video editing and storage space,
but in the OBS AMF it just looks horrid.

To explain that, let me start with I frames. I frames are full standalone pictures that don't refer to any other frame; they are like a picture able to fully stand on its own.

B frames are frames that can predict from reference frames both ahead of and behind them.
P frames (which are the only prediction frames with VCE 1.0) can only predict from the frames behind them.

Being able to reference frames behind and/or ahead means that the GOP, or group of pictures, can be compressed more: fewer I frames are needed, because B frames need less information to reconstruct a fully viewable picture.

The problem arises if B frames get out of order or mixed up, or if there are too many B frames and not enough reference frames to predict from. This often happens over a network (packet loss, frames received out of order, or decoders that don't match the encoder's frame order).

Since we are talking about this: the IDR period is the maximum distance between two IDR frames. An IDR frame is basically a refresh, a wipe of the P and B frame history, telling the decoder not to reference anything before it. IDRs are required both for seeking (like on a DVD or Blu-ray) and for joining a stream: playback waits for an IDR frame, since that is the first picture that can actually be decoded, as all the other frames reference something else.
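A toy sketch of how frame types lay out within one GOP for a given B-frame count (illustrative only; real encoders also reorder frames into decode order, which is where the network problems above come from):

#include <cstdio>

int main()
{
    const int gopSize = 15;  // frames per GOP, e.g. the "GOP: 15" setting
    const int bframes = 2;   // consecutive B frames between references

    for (int i = 0; i < gopSize; ++i) {
        char type;
        if (i == 0)
            type = 'I';  // standalone picture, references nothing
        else if (bframes > 0 && i % (bframes + 1) != 0)
            type = 'B';  // predicts from references before and after
        else
            type = 'P';  // predicts from earlier frames only
        putchar(type);
    }
    putchar('\n');  // prints IBBPBBPBBPBBPBB for these settings
}

With bframes = 0 you get the all-P pattern a VCE 1.0 card produces, which matches the "messed up" video reported earlier when B frames are forced on those parts.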


EDIT: @jackun, a while back were you experimenting with SPS/PPS NALs? I thought you set the NAL size to the GOP. I've been reading about HLS, and you want to set it to the IDR (key frames), not the GOP.
 
Last edited:

jackun

Developer
EDIT: @jackun, a while back were you experimenting with SPS/PPS NALs? I thought you set the NAL size to the GOP. I've been reading about HLS, and you want to set it to the IDR (key frames), not the GOP.
Probably.

B-frames with the Quality preset start to "flutter" for some reason, while with the Balanced preset they seem to be fine. Setting the QP delta to 0 didn't seem to make a difference either.
 