It really depends, because the 'gaming latency problem' isn't a static thing.
Some games use traditional client-server communication. Clearing up local bufferbloat can certainly help there, but if the best-case RTT to the server is 80-100ms, the experience still may not be great.
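For what it's worth, a quick way to tell whether bufferbloat (as opposed to baseline RTT) is the problem is to watch ping while the link is saturated. Rough sketch, with the target host and the numbers purely as examples:

    # baseline: ping a nearby host while the link is idle
    ping -c 20 1.1.1.1
    # then start a large upload/download and run the same ping again;
    # if RTT jumps by hundreds of ms under load, that's bufferbloat,
    # which smart queue management (e.g. fq_codel) on the router can usually fix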
However, a non-zero number of games only use a centralized server for certain functions; the rest of the gameplay traffic is peer to peer, with one client selected as the host for that session. Latency becomes a complete crapshoot there.
There's also the fact that a LOT of gaming netcode is pretty bad. It's either written for a specific range of RTTs, outside of which it performs terribly, or it's complete ass, to the point where kids actually inject latency/loss into their local network to screw with the game's behavior and give themselves an advantage.
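On Linux the usual tool for that kind of latency/loss injection is netem via tc. A minimal sketch, where the interface name and the delay/loss numbers are just placeholders:

    # add ~100ms of delay with 20ms jitter and 2% packet loss on eth0
    sudo tc qdisc add dev eth0 root netem delay 100ms 20ms loss 2%
    # remove it again
    sudo tc qdisc del dev eth0 root netem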