Here are my log files:
Trying several servers:
https://gist.github.com/e4eafbcfe8cfb5b741e1
Trying one server:
https://gist.github.com/fa8af5a783097ed1c328
I am trying to stream a game to Twitch. My bitrate is set to 3500 kbps, and speedtest gives me an upload of 50 Mb/s.
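Quick sanity check on the math, sketched in Python (the audio bitrate and overhead factor are my assumptions, not measured values):

```python
# Back-of-the-envelope bandwidth check for the stream.
video_kbps = 3500   # OBS video bitrate (my setting)
audio_kbps = 64     # typical AAC audio bitrate (assumption)
overhead = 1.2      # headroom for RTMP/TCP overhead and bitrate spikes (assumption)

needed_mbps = (video_kbps + audio_kbps) * overhead / 1000
upload_mbps = 50    # speedtest result

print(f"Stream needs ~{needed_mbps:.1f} Mb/s of a {upload_mbps} Mb/s uplink "
      f"({needed_mbps / upload_mbps:.0%})")
# -> Stream needs ~4.3 Mb/s of a 50 Mb/s uplink (9%)
```

Even with generous headroom, the stream needs under 10% of my measured uplink, so raw bandwidth shouldn't be the problem.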
The problem is the stream is completely unwatchable on any device - it shows 2-4 seconds of footage, then buffers for about 30 seconds. If I lower the resolution and/or bitrate, I get a few more seconds of footage, but the same issue persists.
However, the info at the bottom says I'm dropping 0 packets (0.00%). I switched servers from US East NY to VA (I am in Massachusetts), and the same issue persisted. Then I switched to FL.
When I had FL selected as the server, the bitrate indicator fluctuated between green, yellow, and red. Dropped packets would spike every few seconds, from 10% to 20% and up past 50%. Watching the stream on Twitch is exactly the same as with the other two servers.
So it seems the NY and VA servers are not correctly reporting back to me how many frames have been dropped, or something to that effect.
I do believe my ISP is throttling this port, and I am trying to understand why. But I find it extremely hard to believe that I am dropping 0 packets to NY and so many packets to FL when both give the same performance. I tried the rest of the EU/Asia servers, and roughly half of them show 0% drop while the other half show 50%+ drop.
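To take the streaming software out of the equation, I put together a quick probe that times TCP connects to port 1935 (the standard RTMP port) on a few ingest servers. The hostnames below are placeholders, not real ingest addresses; substitute the actual hosts from your server list. This only tests the handshake, not sustained throughput:

```python
# Probe RTMP ingest servers on TCP 1935, independent of the streaming app.
import socket
import time

SERVERS = [
    "ingest-ny.example.com",   # hypothetical: US East NY
    "ingest-va.example.com",   # hypothetical: US East VA
    "ingest-fl.example.com",   # hypothetical: US FL
]
PORT = 1935
ATTEMPTS = 10

for host in SERVERS:
    times, failures = [], 0
    for _ in range(ATTEMPTS):
        start = time.monotonic()
        try:
            # Time a full TCP handshake to the ingest port.
            with socket.create_connection((host, PORT), timeout=3):
                times.append((time.monotonic() - start) * 1000)
        except OSError:
            failures += 1
    avg = sum(times) / len(times) if times else float("nan")
    print(f"{host}: {len(times)}/{ATTEMPTS} connects, avg {avg:.0f} ms, {failures} failed")
```

If every server answers the handshake cleanly while the stream still stalls, that would point at sustained-throughput shaping rather than an outright blocked port.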
For what it's worth, I've followed the advice in the other thread for tackling dropped packets, and the other sticky about the stream not playing back correctly doesn't seem relevant given my bitrate.
I will continue troubleshooting on my end, but I can provide any information needed.