Anyone can recommend anything... it's not really helpful/meaningful without boundary considerations of budget and expected system lifetime. A system where you intend to replace the GPU in 2 years, and the motherboard/CPU (or entire system) in 3 or 4 years, will be different from something you want to last longer. And if you want a stable, reliable system for 3-4 years or longer, then beware of overclocking (period), especially without beefy cooling... the lifespans of consumer CPUs/circuit boards (most electronics) are shortened by higher heat.
Is gaming the only heavy workload planned? No quality 4K video editing, or anything else?
From a streaming perspective, is your focus only on the current streaming targets you mention, or are you willing to spend a bit more and get something also fully capable of AV1 streaming, as that takes over from the ancient H.264 largely in use today? But beware, some current AV1-capable GPU encoders are first gen, with all the 'teething' problems that go with that. From a value perspective, I'd be inclined to wait for RTX 5xxx GPUs to be released, then buy an RTX 4xxx series (or even 3xxx, depending on other requirements) GPU, then upgrade the GPU once your target platforms' AV1 support has rolled out publicly and is stable. Intel GPUs look interesting, but are still a work in progress... AMD... well, their H.264 encoder has been so under-developed (until recently, like almost a decade late) that the caveats that come with their cards aren't worth it to me. BUT... it depends... You can spend more, get nVidia, and likely (but not guaranteed) have a more stable/reliable experience (avoiding, from both vendors, some gaming video hacks that don't play well with streaming).
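(Aside: if you want to sanity-check which hardware encoders a given GPU/driver combo actually exposes, one quick way is to ask an ffmpeg build. A minimal sketch, assuming ffmpeg is installed and on your PATH; encoder names like av1_nvenc / av1_qsv / av1_amf are the usual ffmpeg identifiers, but whether they show up depends on how ffmpeg was built and on your drivers:)

```python
# List AV1/H.264 encoders exposed by the local ffmpeg build.
# Assumption: ffmpeg is on PATH. Hardware encoders (av1_nvenc,
# av1_qsv, av1_amf, h264_nvenc, ...) only appear if ffmpeg was
# built against the vendor SDK and the driver is present.
import subprocess

result = subprocess.run(
    ["ffmpeg", "-hide_banner", "-encoders"],
    capture_output=True, text=True, check=True,
)

for line in result.stdout.splitlines():
    if "av1" in line or "h264" in line:
        print(line.strip())
```

Seeing an encoder listed only reflects the ffmpeg build; a short test encode against that encoder is the real confirmation that the GPU/driver can actually use it.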
Then, reliable networking requires Ethernet, not WiFi (said based on decades of networking experience, including SMB and enterprise markets). Is old-school, default 1GbE sufficient, or will you want/need faster? 2.5GbE is cute, but 10GbE is already cost-practical for high-traffic home devices (small port-count managed switches are affordable now).
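(For a rough sense of why link speed matters once large video files are involved, a back-of-the-envelope sketch; the 0.94 efficiency factor is my assumption for real-world protocol overhead, and actual throughput also depends on disks, NICs, and the file protocol:)

```python
# Rough best-case transfer time for a large file at common Ethernet speeds.
# EFFICIENCY is an assumed factor for protocol overhead; real numbers
# vary with disks, NICs, switches, and the transfer protocol.
FILE_GB = 100          # e.g. a batch of raw 4K footage
EFFICIENCY = 0.94

for name, gbps in [("1GbE", 1), ("2.5GbE", 2.5), ("10GbE", 10)]:
    seconds = (FILE_GB * 8) / (gbps * EFFICIENCY)
    print(f"{name:>6}: ~{seconds / 60:5.1f} min for {FILE_GB} GB")
```

Roughly 14 minutes at 1GbE vs about a minute and a half at 10GbE for the same 100 GB, which is why I only call 2.5GbE "cute".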
Beyond that, you have your basic conundrum of AMD, which is more power-efficient (in general) for a given CPU performance... but their related BIOS, drivers, chipset software, etc. are (and have been for a very long time) severely underwhelming (inadequate quality/maturity, imo). Intel has a far more mature software dev process and tends to have higher quality drivers (not that both vendors don't have issues... I'm talking at a high-level, broad-overview perspective... select products from Intel can be more problematic). Years ago, AMD could claim much lower CPU vulnerability risk, but that doesn't seem to be the case anymore. Intel is in the middle of a turnaround, with all the risk that entails. Fanboys on both sides, who can be hard to ignore.

Personally, I'm in the market for a mobile (laptop) workstation, and I'd love an AMD CPU (for power efficiency) combined with a mobile RTX GPU... but there is not a single AMD Tier 1 workstation model out there that also includes Thunderbolt 4/5 (needed for large video transfers) or at least enough USB4 (40 Gb/s) ports... so I'll probably end up with an Intel-based system... again... grumble.

From a desktop perspective, a significant benefit of AMD CPUs is the typically longer lifespan of their CPU sockets, so next-gen CPU upgrades on the same motherboard are often possible (vs Intel, which almost always requires a new motherboard... hate that).
The above is meant as general food for thought... you may already be aware of much of it... but without a budget and more detailed requirement expectations, it's not really viable to give much more concrete advice.