You cannot use an arbitrarily high bitrate for streaming. For example, your tests show that your available physical upload bandwidth with your provider is almost exactly 18000 kbps: at everything above 18000 (including 18500) you dropped frames due to bandwidth, and at everything below you didn't.
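If you want to automate that kind of test instead of trying values by hand, a binary search over candidate bitrates converges quickly. This is only a sketch: `frames_dropped_at` is a hypothetical callback that would run a short test stream at the given bitrate and report whether frames were dropped, it is not part of any real streaming tool.

```python
def find_max_bitrate(low_kbps, high_kbps, frames_dropped_at, step_kbps=250):
    """Binary-search the highest bitrate (in kbps) that streams without drops.

    `frames_dropped_at(bitrate_kbps)` is a hypothetical callback that runs a
    short test stream at the given bitrate and returns True if frames were
    dropped due to bandwidth.
    """
    while high_kbps - low_kbps > step_kbps:
        mid = (low_kbps + high_kbps) // 2
        if frames_dropped_at(mid):
            high_kbps = mid   # drops observed: the limit is below mid
        else:
            low_kbps = mid    # clean run: the limit is at or above mid
    return low_kbps           # highest tested bitrate without drops


# Example: with the numbers above, this would converge near 18000 kbps.
# max_ok = find_max_bitrate(2000, 25000, frames_dropped_at=my_test_stream)
```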
And then there's the streaming service and the clients (your viewers). Everything you send is relayed to the clients. Some streaming services re-encode (transcode) the stream, some don't.
If the service you use transcodes, the stream is usually re-encoded to a generic bitrate the service considers best practice. That is definitely lower than 18000 kbps, so the stream the clients receive is well below 18000, and the quality they see is lower than the quality you saw.
And if the service you use doesn't transcode, it either relays the stream verbatim, cuts packets when the data rate is too high, or drops the connection altogether. If it relays the stream verbatim, your clients are choked by an 18000 kbps stream. That is far too much for half of all clients worldwide, if not more. As far as I remember, two years ago the average internet download speed was about 5.6 Mbit/s (a bitrate of 5600 kbps). It has increased a lot since then, but you will still lose most of your viewers if you push 18000 at them. In every one of these cases, 18000 isn't a good bitrate to stream with.
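To see why, a quick back-of-the-envelope check helps. This is a rough sketch; the 1.4 headroom factor (room for protocol overhead and other traffic on the viewer's line) is my own assumption, not a fixed rule:

```python
def viewer_can_watch(download_mbps, stream_kbps, headroom=1.4):
    """Rough check: can a viewer with `download_mbps` sustain `stream_kbps`?

    The headroom factor is an assumed safety margin for protocol overhead
    and whatever else is using the viewer's connection.
    """
    return download_mbps * 1000 >= stream_kbps * headroom


print(viewer_can_watch(5.6, 18000))   # False: the old 5.6 Mbit/s average can't keep up
print(viewer_can_watch(5.6, 3500))    # True: a moderate bitrate fits comfortably
print(viewer_can_watch(25.0, 18000))  # False: even 25 Mbit/s is marginal at 18000 kbps with headroom
```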
Best practice can be derived from Twitch's encoding guidelines:
https://stream.twitch.tv/encoding/
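Putting the two constraints together, the practical rule is: stay at or below whatever maximum the service recommends, and well below your own measured upload. A minimal sketch of that rule; the 6000 kbps cap and the 0.75 upload fraction are placeholder assumptions, check the guideline page above for the current numbers:

```python
def choose_stream_bitrate(upload_kbps, service_max_kbps=6000, upload_fraction=0.75):
    """Pick a streaming bitrate from two constraints.

    `service_max_kbps` is a placeholder for the maximum the service recommends
    (see the encoding guidelines); `upload_fraction` is an assumed safety
    margin so the stream never saturates your physical upload.
    """
    return min(service_max_kbps, int(upload_kbps * upload_fraction))


# With the 18000 kbps upload measured above, the service cap is the limit:
print(choose_stream_bitrate(18000))   # 6000
# On a slow 4000 kbps upload, your own line becomes the limit instead:
print(choose_stream_bitrate(4000))    # 3000
```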
And by the way, let go of the idea that you can produce the perfect stream. That isn't possible. With some streamers you may think you are seeing a perfect stream, but in fact you aren't: you don't see the hardware cost behind it, and you judge other people's streams with different eyes than your own. You can also only compare streams of the same visual complexity. A good stream of a shooter with rather bland graphics is much easier to achieve than a good stream of something visually sophisticated like an RPG.