I'm seeing differences between Firefox and Safari in the last digit of the output from
Safari (12.1.1) gives
-0.5235987755982988 but Firefox (Mac/67.0) gives
This is of course a tiny difference. However, one would expect every implementation to yield identical output for identical input. A difference like this could, for example, cause an if statement to follow different paths depending on the browser.
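To make the branching concern concrete, here is a minimal sketch. It assumes, purely for illustration, that the expression was something like Math.asin(-0.5) (whose value is close to the digits quoted above); the actual expression from the question is not shown, so this call is an assumption, not taken from the original.

```javascript
// Hypothetical stand-in: Math.asin(-0.5) is assumed here; the
// question's actual expression was not preserved in the text.
const x = Math.asin(-0.5);

// A strict comparison against a literal that matches one engine's
// result can flip between browsers if another engine rounds the
// final ulp differently.
if (x === -0.5235987755982988) {
  console.log("matches the Safari digits exactly");
} else {
  console.log("differs in the last ulp");
}
```

Robust cross-browser code therefore compares with a tolerance rather than exact equality, e.g. Math.abs(x - expected) < 1e-12, so a last-ulp discrepancy cannot change which branch runs.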
Does what I'm seeing violate any version of the ECMAScript spec?