RTT doesn't drive performance, (bw*delay)-loss does.
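To make that shorthand concrete: one standard way to quantify how loss (together with the bandwidth-delay product) caps bulk TCP throughput is the Mathis et al. steady-state estimate. This is a sketch, not the poster's exact metric; the function name and parameter choices are mine.

```python
from math import sqrt

def mathis_throughput_bps(mss_bytes, rtt_s, loss_rate):
    """Mathis et al. steady-state TCP throughput estimate:
        rate <= (MSS / RTT) * (C / sqrt(p)),  C ~ sqrt(3/2)
    Illustrates that packet loss, not raw link bandwidth,
    caps bulk-transfer performance on a lossy path."""
    C = sqrt(3.0 / 2.0)
    return (mss_bytes * 8 / rtt_s) * (C / sqrt(loss_rate))

# 1460-byte MSS, 100 ms RTT, 1% loss: roughly 1.4 Mbit/s,
# no matter how fat the underlying pipe is.
print(mathis_throughput_bps(1460, 0.1, 0.01))
```

Note that the estimate goes to infinity as loss goes to zero, so it only describes loss-limited transfers; on a clean path the window (and hence the bandwidth-delay product) is the binding constraint instead.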
FWIW, the heavily interactive apps are more affected by RTT than they are by bandwidth. Network games are the new TELNET: they despise varying latency, and are generally oblivious to bandwidth. Your point is still mostly valid, in that the only thing they hate more than varying latency is packet loss, but if the network isn't losing packets then RTT does affect "performance" for the heavily interactive apps.
I wasn't talking about network performance in general (and I would never, ever recommend (bw*delay)-loss as a routing metric!). This is in the context of so-called global server load balancing: RTT may or may not matter in the decisions such a system must make ("serve this client from which proxy?").
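A toy sketch of that decision, assuming the GSLB weights loss far more heavily than RTT (per the point above about interactive apps). The proxy names and the penalty constant are invented for illustration:

```python
def pick_proxy(candidates):
    """Choose which proxy should serve a client.

    candidates: dict of proxy name -> (rtt_ms, loss_fraction).
    Loss is penalized far more heavily than raw RTT, since the
    apps in question hate loss even more than varying latency.
    The weighting constant is hypothetical.
    """
    LOSS_PENALTY_MS = 10_000  # assumption: 1% loss "costs" 100 ms

    def score(entry):
        rtt_ms, loss = entry
        return rtt_ms + loss * LOSS_PENALTY_MS

    return min(candidates, key=lambda name: score(candidates[name]))

# "lon" has the lowest RTT, but its 2% loss pushes its score past "sfo".
proxies = {"sfo": (20.0, 0.00), "nyc": (70.0, 0.00), "lon": (15.0, 0.02)}
print(pick_proxy(proxies))  # -> sfo
```

The point of the sketch is only that RTT is one input among several; whether it dominates depends entirely on what the system weights it against.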