Thing is, if it were any decent programming language, divide by zero wouldn't return diddly squat. That should be a fatal error, a concept JavaScript goes above and beyond to avoid.
JavaScript's numbnuttery just plods on like you did nothing wrong.
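Don't take my word for it. Paste this into any JS console (browser or Node, doesn't matter) and watch it shrug:

```js
// None of these stop anything. No exception, no error, nothing.
console.log(1 / 0);    // Infinity
console.log(-1 / 0);   // -Infinity
console.log(0 / 0);    // NaN
console.log("still running"); // ...and it prints, like nothing happened
```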
Though I have the same criticism of HTML. Maybe it's because I learned assembly first, compiled languages second, and didn't start using interpreted languages "for real" until I'd already had two and a half decades of programming under my belt...
But if you detect an error, the program should flipping stop running. Not "quietly log and keep going."
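Something like this, roughly. The function names here are just mine, nothing standard, but it shows the difference between the two attitudes:

```js
// What I want: detect the error, stop right there.
function safeDivide(a, b) {
  if (b === 0) {
    throw new RangeError("division by zero"); // halts unless someone deliberately catches it
  }
  return a / b;
}

// What too much code actually does: note it, then carry on anyway.
function sloppyDivide(a, b) {
  if (b === 0) {
    console.error("division by zero"); // quietly logged... and promptly ignored
  }
  return a / b; // plods on, hands back Infinity or NaN
}
```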
Though to be fair, I spent a decade working in Ada. By comparison, it feels like "normal" C-syntax languages were designed to create bugs on purpose... and the further you stray from "actual C", the worse it gets.
But what do I know? I still don't think this was a joke: