Some people believe there is an absolute setting for bitrate/buffer; there isn't. Your bitrate and buffer depend heavily on what you're encoding, so whatever formula he's using to get that number is pointless.
Higher-action scenes require more bandwidth to achieve a high-quality experience. All of this is subjective, because quality is subjective, and so is the bandwidth needed to reach a given level of it. For instance, you could get away with streaming LoL at 720p@30 on a 1500 or 2000 bitrate, whereas if you play CoD or another FPS you'll probably need around 2500 or 3000 to get similar quality at the exact same resolution and FPS. The amount of action on the screen is entirely different.
That is just an example, though. Someone who plays LoL may spend a lot of time panning around the map, so the action on screen can be higher and quality can suffer because of it. They would need a higher bitrate to compensate. Conversely, they could lower their quality settings (resolution/FPS) to fit the same bitrate, but that obviously costs quality where more bandwidth would have helped.
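If it helps to see the tradeoff laid out, here's a rough sketch in Python. The numbers are just the ballpark 720p@30 figures from the examples above, not any kind of formula, and the motion categories are my own labels:

```python
# Rough starting points only: ballpark 720p@30 figures from the examples
# above, not a formula. Your own content and eyes decide what's acceptable.
SUGGESTED_BITRATE_KBPS_720P30 = {
    "low_motion": 1500,     # mostly static camera, e.g. LoL on the low end
    "medium_motion": 2000,  # LoL with a lot of map panning
    "high_motion": 3000,    # fast FPS games like CoD
}

def suggest_bitrate(motion: str) -> int:
    """Return a starting-point bitrate in kbps for 720p@30 content."""
    return SUGGESTED_BITRATE_KBPS_720P30[motion]

print(suggest_bitrate("high_motion"))  # 3000 kbps as a starting point
```

Treat it as a starting point and adjust by eye from there.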
People generally recommend setting the buffer equal to the bitrate because there's rarely a reason to set it differently. I go with a 1.5x buffer size myself, but that's just me; in practice I've seen very little reason to stray from a 1:1 ratio.
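The buffer math itself is trivial, a ratio times the bitrate. A minimal sketch, assuming you express the buffer as a multiple of the bitrate in kbps:

```python
def buffer_size(bitrate_kbps: int, ratio: float = 1.0) -> int:
    """Buffer size in kb for a given bitrate; 1.0 is the common 1:1 setting."""
    return int(bitrate_kbps * ratio)

print(buffer_size(2500))       # 2500 -> the usual 1:1 buffer
print(buffer_size(2500, 1.5))  # 3750 -> the 1.5x buffer I mentioned
```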