Well, the idea is that, for most code, the refresh rate doesn't really matter: timeouts are approximate by design. If the refresh rate is higher than 60 Hz, it just means a timeout will fire closer to its requested delay, i.e. more accurately. In my use cases, I think the benefit with respect to graphics outweighs the loss of timing accuracy.
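To make the trade-off concrete, here's a minimal sketch (in Python, purely illustrative; the function name and frame model are my own assumptions, not anything from an actual implementation). It models a timeout whose expiry is only checked once per frame, so the worst-case overshoot is one frame period, which shrinks as the refresh rate rises:

```python
import math

def actual_fire_time_ms(timeout_ms: float, refresh_hz: float) -> float:
    """Hypothetical model: a timeout is only checked once per frame,
    so it fires at the first frame boundary at or after timeout_ms."""
    frame_ms = 1000.0 / refresh_hz
    frames = math.ceil(timeout_ms / frame_ms)
    return frames * frame_ms

# A 40 ms timeout checked at 60 Hz (16.7 ms frames) overshoots by ~10 ms;
# at 144 Hz (6.9 ms frames) it overshoots by under 2 ms.
print(actual_fire_time_ms(40, 60))   # fires at ~50 ms
print(actual_fire_time_ms(40, 144))  # fires at ~41.7 ms
```

So the error is bounded by one frame period: never worse than ~16.7 ms at 60 Hz, and proportionally smaller at higher refresh rates.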