Timer durations are minimums, not absolutes, and 0ms never actually means 0ms. This applies to both setTimeout and setInterval. Like anything else that is nondeterministic and outside your code's control, timer durations are one of the JavaScript Common Pitfalls.
So, why are the times minimums and not actual times? The main reason is that there is no way to tell what else will be going on when the timer is due to fire. If something else is executing synchronously, the timer has no way to interrupt it and run; it simply has to wait until the event loop gets back around to it. In most cases this only accounts for a 1-2ms delay in timers firing.
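For example, a rough sketch of the effect: a timer asked to fire in 10ms still has to wait out whatever synchronous work is in front of it (the 100ms busy loop here is just an arbitrary stand-in for that work).

var start = Date.now()

setTimeout(function () {
  // Fires only once the blocking loop below has finished,
  // so this logs ~100ms rather than ~10ms.
  console.log('timer fired after ' + (Date.now() - start) + 'ms')
}, 10)

// Block the event loop with ~100ms of synchronous work.
while (Date.now() - start < 100) {}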
With setTimeout(fn, 0) we notice much larger lags and more platform-dependent variance.
var start = Date.now()
setTimeout(function () {
  var next = Date.now()
  console.log(next - start)
  setTimeout(function () {
    var last = Date.now()
    console.log(last - next)
  }, 0)
}, 0)
In Node the first one takes ~1ms, and the second one takes closer to ~15ms on my machine. In current Chrome and Firefox both take 0-2ms, even though by the HTML5 spec the inner call should take a minimum of 4ms. In older browsers, 10ms was the common minimum for the inner call.
Safari seems to honor the 4ms minimum for the second timer, but I sometimes get results up to 20ms for either one. That just kills you if you're trying to do something like an animation loop where you're attempting to update something 60 times a second. Even if your drawing and processing took no time at all, sometimes Safari or Node would eat up all of your time just waiting to fire the timer. setImmediate or requestAnimationFrame are better solutions for those tight asynchronous loops.
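For an animation loop in the browser, a sketch along these lines is usually the better shape (the draw function here is just a placeholder for your own update and render work; setImmediate is the rough Node equivalent for tight loops that don't paint anything).

function draw () {
  // ... update state and render one frame here ...

  // Ask the browser to call us again right before the next repaint,
  // instead of guessing at a 16ms setTimeout.
  requestAnimationFrame(draw)
}

requestAnimationFrame(draw)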
The reason for the minimum time on the inner setTimeout gets into the subtle mechanics of Timer Precedence, which are probably better left for another page.