I'll assume you're using x264 for both streaming and recording.
If you are using simple output mode and have your recording quality set to "same as stream", recording shares the exact same encoder instances (and encoded data) that the stream uses, so there should be no noticeable difference in CPU usage in that case.
If you set your recording quality to "high quality" or "indistinguishable quality", it will use different settings than streaming, and will spawn separate encoder objects for recording. Because those settings are oriented more towards recording, CPU usage could differ. However, when I tested this (high quality for recording vs. CBR for streaming, on a 6-core machine), both hovered around 18-25% CPU -- no conclusive difference.
However, if you are writing this because you stream and record at the same time, and recording quality is set to anything other than "same as stream", you will be running multiple video encoders at once, so your CPU usage will roughly double -- but only while both are active. If recording quality is set to "same as stream", streaming and recording share the same encoder instances, and CPU usage is unaffected.
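To illustrate the difference, here's a rough sketch (this is not actual OBS code, just a toy model of the behavior described above) of how "same as stream" reuses one encoder instance while the other quality settings spawn a second one:

```python
# Toy model of encoder sharing -- NOT real OBS internals.
# Each live encoder instance costs CPU; sharing one avoids that cost.

class X264Encoder:
    def __init__(self, settings):
        self.settings = settings  # e.g. rate control, CRF, bitrate

def create_encoders(recording_quality, stream_settings, record_settings):
    stream_enc = X264Encoder(stream_settings)
    if recording_quality == "same as stream":
        # Recording reuses the streaming encoder: no extra video encoding load.
        record_enc = stream_enc
    else:
        # "High quality" / "indistinguishable quality": a second, separate
        # encoder runs alongside the first, roughly doubling encode CPU.
        record_enc = X264Encoder(record_settings)
    return stream_enc, record_enc

s, r = create_encoders("same as stream", {"rc": "CBR"}, {"crf": 18})
print(s is r)  # True -- one encoder instance serves both outputs
```

The settings names here (`rc`, `crf`) are just placeholders; the point is only that "same as stream" means one encoder object, anything else means two.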
Recording can also spawn a separate AAC encoder to ensure higher-quality audio, although CPU usage for audio encoders is insignificant compared to video encoders and should make almost no noticeable difference.
(Also yes, please link or upload logs so we can see what you're doing)