JavaScript history: undefined

Two tweets by Brendan Eich shed light on the history behind JavaScript having both undefined and null [1].


The first version of JavaScript did not have exception handling, which is why JavaScript so often converts automatically [2] and/or fails silently (tweet).
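
You can still see that legacy in any modern engine: failed conversions produce NaN instead of throwing, and operands are coerced rather than rejected. A minimal REPL sketch (the expressions below are my own illustrations, not taken from the tweets):

> Number('abc')   // no exception – the failure is signaled by NaN
NaN
> '5' * 2         // the string is silently converted to a number
10
> undefined + 1   // still no error, just NaN propagating
NaN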


JavaScript copied Java’s approach of partitioning values into primitives and objects [3]. null is the value for “not an object”. The precedent from C (but not from Java) is to convert null to 0 (C has pointers, not references, and lets you perform arithmetic with them).
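
This partition is easy to observe via typeof; the snippet below is my own illustration (the fact that typeof null reports 'object' is itself a separate, well-known historical accident):

> typeof 123      // a primitive
'number'
> typeof {}       // an object
'object'
> typeof null     // null lives on the object side: “not an object”
'object'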


Remaining problem: in JavaScript, any variable can hold both primitives and objects, whereas in Java a variable’s static type restricts it to one of the two kinds. JavaScript therefore needs a value that means “neither a primitive nor an object”. That value could have been null, but at the time Eich wanted something that wasn’t “reference-y” (associated with objects) and that did not convert to 0 (tweet). Now you know why undefined and null are converted to different numbers:


> Number(undefined)
NaN
> Number(null)
0
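
In practice this split still shows: the language itself produces undefined wherever a value is simply missing, while null is mostly assigned (or returned by a few built-ins) to mean “no object”. A few illustrative REPL lines of my own:

> let x; x                                  // uninitialized variable
undefined
> ({}).missingProperty                      // non-existent property
undefined
> (function (a) { return a; })()            // parameter that was not passed
undefined
> Object.getPrototypeOf(Object.prototype)   // end of a prototype chain: no object
null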


References:

1. JavaScript quirk 2: two “non-values” – undefined and null
2. JavaScript quirk 1: implicit conversion of values
3. Categorizing values in JavaScript

