JavaScript Quirks Memes

Posts tagged with JavaScript quirks

JS Is A Very Respectable Language

JavaScript really said "consistency is for COWARDS" and honestly? It committed to the bit. 💀 So you've got an array foo = [1, 2, 3] and you're like "hey what's at index -2?" JavaScript casually returns undefined because negative indices don't exist in JS arrays... EXCEPT when you use .at(-2), which is specifically designed to handle negative indices, and suddenly it's like "oh you want the second element from the end? Here's your 2, bestie!" Then you assign foo[-2] = 4, which JavaScript happily accepts because arrays are objects and you just created a STRING property called "-2" on that array object. So now foo[-2] returns 4 from the object property while foo.at(-2) STILL returns 2 from the actual array position. Same syntax, completely different universes. Very respectable. Very normal. Nothing to see here. 🎪
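
Don't take the post's word for it; a quick console sketch (same foo as above) shows both universes side by side:

    const foo = [1, 2, 3];

    foo[-2];       // undefined - bracket access with a negative index finds no array element
    foo.at(-2);    // 2 - .at() counts back from the end

    foo[-2] = 4;   // creates an object property whose key is the string "-2"
    foo[-2];       // 4 - reads that object property
    foo.at(-2);    // 2 - still reads the real second-from-last element
    foo.length;    // 3 - the array itself never noticed a thing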

Stop Doing NaNs

Ah, the eternal JavaScript nightmare: NaN (Not a Number) - which ironically is of type number yet doesn't equal itself. Because that makes perfect sense! The IEEE 754 floating-point standard really outdid itself here. "Let's create a special value that represents calculation errors but make it behave in the most counterintuitive ways possible!" My favorite part is JavaScript trying to be helpful: "You want to convert 'hello' to a number? Sure thing! Here's a NaN for your trouble. No errors thrown, just silent mathematical chaos." And then we wonder why our date calculations suddenly think it's the year NaN. The hex(983061) at the bottom is the cherry on top - it comes out as 0xF0015, which reads as "FOOLS" if you squint. Even the hexadecimal is trolling us.
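
If you want to relive the chaos, this is all plain spec-mandated NaN weirdness you can reproduce in any console (nothing here is specific to the meme image):

    typeof NaN;              // "number" - NaN is, officially, a number
    NaN === NaN;             // false - the only JS value not equal to itself
    Number("hello");         // NaN - no error thrown, just silent mathematical chaos
    "hello" * 2;             // NaN - arithmetic on nonsense also fails quietly
    isNaN("oops");           // true - the old global isNaN coerces first, so strings "are" NaN
    Number.isNaN("oops");    // false - the stricter check only flags actual NaN values
    Object.is(NaN, NaN);     // true - unless you ask Object.is, which disagrees with ===
    (983061).toString(16);   // "f0015" - squint and it spells FOOLS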

The Magic Number Of Zeroes

JavaScript's parseInt() function is like that one coworker who ignores all your emails until you add exactly six zeroes after the decimal point. The function stubbornly returns 0 for every tiny decimal you throw at it, until suddenly - at 0.0000005 - it decides "Oh, I see a 5 now!" and returns 5. The culprit isn't floating-point precision at all: parseInt() converts its argument to a string first, and once a number drops below 1e-6 that string switches to exponential notation, so "5e-7" gets read as a 5 followed by gibberish. It's like watching someone squint harder and harder at tiny text until they finally give up and just read whatever letter they think they see. The number-to-string conversion gods have spoken, and they've chosen chaos.
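
You can watch the squinting happen yourself; the whole trick is the stringification step described above:

    parseInt(0.5);           // 0 - parses "0.5" up to the decimal point
    parseInt(0.000005);      // 0 - still "0.000005" as a string
    parseInt(0.0000005);     // 5 - String(0.0000005) is "5e-7", so parseInt reads the 5 and stops
    String(0.000005);        // "0.000005"
    String(0.0000005);       // "5e-7" - below 1e-6, numbers stringify in exponential notation
    parseInt("0.0000005");   // 0 - hand it the string yourself and sanity returns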

The Sort Of Surprise Every JavaScript Developer Deserves

Innocent newbie: "I'll just use array.sort() to sort these numbers!" JavaScript: *sorts lexicographically* "Did I stutter?" Nothing says "welcome to JavaScript" quite like discovering your numbers are being sorted as strings. That moment when you realize you need array.sort((a,b) => a-b) and question all your life choices that led you to web development. It's basically JavaScript's hazing ritual - "Oh, you thought programming would make sense? That's adorable."
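
For anyone who hasn't been hazed yet, the ritual is easy to reproduce:

    [10, 1, 21, 2].sort();                // [1, 10, 2, 21] - elements compared as strings, so "10" < "2"
    [10, 1, 21, 2].sort((a, b) => a - b); // [1, 2, 10, 21] - the comparator you were supposed to know about
    ["b", "a", "c"].sort();               // ["a", "b", "c"] - strings were the plan all along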