x264 has several speed presets, listed here from lowest to highest CPU usage: ultrafast, superfast, veryfast, faster, fast, medium, slow, slower, veryslow, placebo
By "preset", I mean exactly what it sounds like: a pre-determined set of x264 settings, so you don't have to tweak them all manually yourself. These presets have been tested by lots of people and work well for general use, depending on what you want to get out of your encoder. The actual details of what each preset sets can be found here:
http://dev.beandog.org/x264_preset_reference.html (I'm told this is slightly out of date, but you get the idea.)
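If you're curious what choosing a preset actually looks like outside of OBS, it's just one option handed to the encoder, which then expands into all of those individual settings for you. A quick sketch using ffmpeg's libx264 encoder (the file names are placeholders):

    # "veryfast" here is shorthand for a whole bundle of individual x264
    # options (reference frames, motion estimation, etc.) like the ones
    # listed in the reference above.
    ffmpeg -i input.mkv -c:v libx264 -preset veryfast -crf 23 output.mp4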
The basic idea is that, all else being equal (same bit rate, etc.), less CPU usage results in worse quality and more CPU usage results in better quality, because the presets control how much time the encoder spends compressing each frame to look good within its constraints. Sometimes you need to reduce CPU usage to get acceptable performance, and the higher-CPU presets can be difficult to use effectively on normal consumer hardware.
However, the amount of effort the CPU spends compressing each frame isn't the only factor in video quality. Bit rate also determines how much information you can put into each frame of video. If you are allowed to cram more data into each frame, you don't need to spend as much CPU time on compression, so you can make each frame look better just by cranking up the bit rate.
Thus, you can get a good-looking video with relatively low CPU usage by using a low-CPU preset (like ultrafast) with a higher bit rate.
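To make that concrete, here's a rough ffmpeg sketch of the trade-off (the bit rates are made-up illustrative numbers; what actually looks good depends entirely on your content and resolution):

    # More CPU, less bit rate: a slower preset squeezes more quality out of
    # each bit, so a moderate bit rate can still look good.
    ffmpeg -i input.mkv -c:v libx264 -preset slow -b:v 6000k slow_6mbps.mp4

    # Less CPU, more bit rate: ultrafast compresses less efficiently, so you
    # throw extra bits at it to reach roughly comparable quality.
    ffmpeg -i input.mkv -c:v libx264 -preset ultrafast -b:v 12000k ultrafast_12mbps.mp4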
This is what is happening with the OBS setting in question. "Software (x264)" will choose the veryfast preset with a setting that will require a certain amount of bit rate to look "Indistinguishable" (according to your Recording Quality setting), and the "Software (x264 low CPU usage preset)" setting will use ultrafast, but ask x264 to effectively use more bit rate to make up for the quality loss imposed by the lower-usage CPU preset. (Technically, rather than asking for "more bit rate", it sets a lower CRF value, which naturally requires more bit rate, but I'm trying to keep the explanation simple.)
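If you wanted to imitate that behaviour by hand, it would look roughly like this (I'm not claiming these are the exact CRF values OBS uses; they're placeholders to show the direction of the adjustment, where a lower CRF means a higher quality target and therefore more bit rate):

    # "Software (x264)": veryfast, at whatever CRF matches your quality setting.
    ffmpeg -i capture.mkv -c:v libx264 -preset veryfast -crf 23 recording.mp4

    # "Software (x264 low CPU usage preset)": ultrafast, with the CRF nudged
    # lower to buy back the quality the faster preset gives up.
    ffmpeg -i capture.mkv -c:v libx264 -preset ultrafast -crf 20 recording_lowcpu.mp4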
The end results will probably look a bit different, but the goal is roughly the same quality, trading CPU usage for file size. I recommend trying both and seeing which one works best for you in terms of quality, CPU usage, and resulting file size.