I'm confused... is there a drawback to not using a higher bitrate for your camera?
There are two stages of compression you're referencing. The first occurs between the camera and OBS: the camera compresses its feed, and OBS decodes that data into the actual frames that are composited in the scene. The second is the final encode of the composited output. You want the camera stage to be as high quality as is practical, because any artifacts introduced there get compressed again in the final encode (garbage in -> garbage out).
In a general sense, if you're talking about the same compression algorithm, when re-encoding you always want the source bitrate to be higher than your final bitrate. That way, re-compressing the artifacts left over from the original compression doesn't make as much of a difference in the final output. There will always be some residual quality loss, and how much is very much a case-by-case thing, but in general you want as high a quality as is reasonable going into the final encode.
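If you want to put a rough number on that residual loss, here's a minimal sketch. It assumes ffmpeg is on your PATH and uses a hypothetical high-quality clip named clean_source.mp4 as a stand-in for the camera's uncompressed feed; the 8000/2500 kbps figures are just placeholders, not recommendations. It compares a single direct encode at the final bitrate against the two-generation chain (camera encode, then final encode), scoring both against the original with SSIM.

```python
# Sketch: how much extra quality does the first (camera) generation cost?
# Assumes ffmpeg is on PATH; "clean_source.mp4" and the bitrates are placeholders.
import re
import subprocess

def encode(src, dst, kbps):
    """Single-pass x264 encode of src at the given video bitrate (audio dropped)."""
    subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", "libx264",
                    "-b:v", f"{kbps}k", "-an", dst],
                   check=True, capture_output=True)

def ssim(reference, distorted):
    """Overall SSIM of distorted vs. reference, parsed from ffmpeg's ssim filter."""
    out = subprocess.run(["ffmpeg", "-i", distorted, "-i", reference,
                          "-lavfi", "ssim", "-f", "null", "-"],
                         capture_output=True, text=True)
    return float(re.search(r"All:([0-9.]+)", out.stderr).group(1))

# One-stage baseline: the clean source encoded straight to the final bitrate.
encode("clean_source.mp4", "direct_2500k.mp4", 2500)
# Two-stage chain: camera compression first, then the final stream encode.
encode("clean_source.mp4", "camera_8000k.mp4", 8000)
encode("camera_8000k.mp4", "chained_2500k.mp4", 2500)

print("direct  encode SSIM:", ssim("clean_source.mp4", "direct_2500k.mp4"))
print("chained encode SSIM:", ssim("clean_source.mp4", "chained_2500k.mp4"))
```

The gap between those two SSIM numbers is the residual loss from the extra generation; how big it is depends on your content, codecs, and bitrates.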
Now, it's that "reasonable" that is the tricky bit. Do you need perfect image quality if your webcam is only taking up a small corner of your scene, compressed down to a 2500 kbps stream? Probably not. If there's a tradeoff in your situation where you're considering a lower camera bitrate for some kind of benefit, then just test: do multiple encodes of the same scene (using your target bitrate for OBS's output), changing only the camera's bitrate. See whether it makes a difference in the final output, and whether that difference is worth the tradeoff.
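If you'd rather automate that test, something along these lines would work (same assumptions as the sketch above: ffmpeg on PATH, a placeholder clean_source.mp4 standing in for your scene, and made-up bitrate values). It holds the final output bitrate fixed, sweeps the camera bitrate, and prints an SSIM score for each final encode so you can see where the differences stop mattering to you.

```python
# Sketch of the "multiple encodes" test: fixed final bitrate, varying camera bitrate.
# ffmpeg on PATH, the file names, and all bitrate values are assumptions.
import re
import subprocess

def encode(src, dst, kbps):
    """Single-pass x264 encode of src at the given video bitrate (audio dropped)."""
    subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", "libx264",
                    "-b:v", f"{kbps}k", "-an", dst],
                   check=True, capture_output=True)

def ssim(reference, distorted):
    """Overall SSIM of distorted vs. reference, parsed from ffmpeg's ssim filter."""
    out = subprocess.run(["ffmpeg", "-i", distorted, "-i", reference,
                          "-lavfi", "ssim", "-f", "null", "-"],
                         capture_output=True, text=True)
    return float(re.search(r"All:([0-9.]+)", out.stderr).group(1))

FINAL_KBPS = 2500                        # stand-in for your stream's output bitrate
CAMERA_KBPS = [2000, 4000, 8000, 16000]  # candidate camera bitrates to compare

for cam_kbps in CAMERA_KBPS:
    cam = f"camera_{cam_kbps}k.mp4"
    final = f"final_from_{cam_kbps}k.mp4"
    encode("clean_source.mp4", cam, cam_kbps)  # stage 1: the camera's own compression
    encode(cam, final, FINAL_KBPS)             # stage 2: the final stream encode
    print(f"camera @ {cam_kbps:>5} kbps -> SSIM {ssim('clean_source.mp4', final):.4f}")
```

SSIM is just one convenient proxy here; your eyes on the actual outputs are the real test, especially for a webcam that only occupies a small part of the frame.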