This tension is as old as programming itself.
Dijkstra and Hoare spent decades arguing for rigor.
Meanwhile C, JavaScript, and PHP, none of them designed for correctness, won the world.
The languages that ship always beat the languages that are right.
JSON's dominance is one of the most accidental success stories in computing.
Douglas Crockford didn't design it — he said he "discovered" it. It was already there in JavaScript's object literal syntax, which itself traces back to Brendan Eich's 10-day sprint in 1995.
A data format that conquered the internet was a side effect of a language built under absurd time pressure.
Every attempt to replace it has to overcome that kind of accidental ubiquity, which is much harder than overcoming a technical limitation.
ENIAC is where the profession of programming was born — and the first programmers were six women: Kay McNulty, Betty Jennings, Betty Snyder, Marlyn Meltzer, Frances Bilas, and Ruth Lichterman.
They had to program it by physically rewiring patch cables and flipping switches. There was no programming language, no stored program. The "software" was the hardware configuration itself.
It took another decade before FORTRAN (1957) gave programmers a way to write instructions in something resembling human language.
I do not know. I think it was before she first got an assembler. Memory was still tubes, not core.
I kind of think she was at White Sands at the time, so... maybe look at what White Sands was using for missile trajectory calculations around 1954 or 1955?
There were programmers before that, e.g. for the IBM ASCC at Harvard, which was based on the ideas of Howard Aiken (inspired by Babbage).
Programming the IBM ASCC (a.k.a. the Harvard Mark I) was much closer to programming a modern computer than programming the ENIAC was, because it had an instruction set and programmers wrote sequences of instructions on punched tape. Even the ASCC, however, had panels where some of the execution units could be rewired to change their behavior, i.e. to change what certain instructions from the instruction set did, but that was not the primary means of programming the machine. The rewiring in the ASCC was akin to the microprogramming available in some later electronic computers, where you could change what certain instructions did or add custom instructions.
Among the programmers of the IBM ASCC, Grace Hopper later became famous for her contributions to the first high-level programming languages.
Therefore the profession of programmer did not start with ENIAC, even if the ENIAC programmers were among the first programmers.
Thank you for highlighting the contributions of women in computing, especially at its inception! That is so easily forgotten or intentionally ignored in the age of the "tech bro".
There were other programmers before those of ENIAC, but they also included women, like Grace Hopper, who later had an important role in the development of programming languages.
Hopper's trajectory from the Mark I to FLOW-MATIC to COBOL is one of the great arcs in computing.
She had to fight to convince people that programs could be written in English-like words; her colleagues thought the idea was absurd because "computers don't understand English."
The PLATO connection is the best part of this piece.
PLATO ran its own programming language, TUTOR, designed in 1967 specifically for creating interactive lessons on the system.
It's one of the earliest examples of a domain-specific language shaped entirely by its platform. The system also had real-time chat, message boards, and multiplayer games running on shared terminals in the early 1970s — a decade before BBSs went mainstream.
Lotus Notes, email, Slack — the entire groupware lineage traces back to a university teaching system in Illinois.
Meyer makes an important point that often gets lost: the null pointer predates Hoare. NIL existed in McCarthy's Lisp in 1959, six years before Hoare added null references to ALGOL W. The "mistake," if it was one, was already widespread.
What made Hoare's 2009 confession so impactful wasn't that he was solely responsible — it's that he was the first person with that level of authority to publicly say "this was wrong."
That's what gave Rust, Swift, and Kotlin permission to design around it.
I don’t know much about ALGOL, but in Lisp, nil represents the empty list. And because the list is a recursive data structure, it always contains the empty list. It’s not the same as Java, where null is its own value that is part of every type. In Lisp, an atom can’t be nil because an atom is not a list.
What you say may be true for some modern LISPs, but it was false in most early LISPs, certainly in any LISP that preceded Tony Hoare's paper "Record Handling".
I quote from the manual of LISP I: "Here NIL is an atomic symbol used to terminate lists".
I am not sure what the rule is in Common LISP, but in many early LISPs the predicate (atom NIL) was true.
In early LISPs, the end of a list was recognized when its CDR was an atom, instead of being another list. The atom could be different from NIL, because that final list could have been an association pair, pointing towards two associated atoms.
The fact that in early LISPs NIL was an atom, but it was also used to stand for an empty list caused some ambiguities.
EDIT:
I have searched now, and in Common LISP the predicate (atom NIL) remains true, like in all early LISPs; therefore NIL is still an atom, even if in almost all contexts it is equivalent to an empty list.
LISP was special because it had only two data types: atoms and lists.
Both lists and atoms could appear in any place in any function or special form.
NIL was a special atom that was used to stand for an empty list. Because it could appear anywhere in a LISP program, it could be used wherever one had to express that something does not exist.
In a programming language with a more complicated and also extensible system of data types the handling of "nothing" values must also be more complex.
Any optional type can be viewed as a variable-length array of its base type whose maximum length is 1, with length zero indicating a non-existent value.
This is equivalent to the use of NIL in LISP.
However, it is better to treat optional types as a distinct method of deriving types from base types, alongside arrays, structures, and unions, because in most cases optional types admit more efficient implementations than either variable-length arrays that may be empty or tagged unions where one of the alternatives is the empty type (a.k.a. void).
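As a sketch of the correspondence above, in Rust (not the language under discussion, just a convenient illustration): Option<T> implements IntoIterator and yields zero or one items, which makes the "array of maximum length 1" view concrete, while the compiler's niche optimization shows the more efficient representation the comment alludes to.

```rust
// Sketch: an optional value viewed as a length-0-or-1 container.
fn main() {
    let present: Option<i32> = Some(7);
    let absent: Option<i32> = None;

    // Option implements IntoIterator, yielding 0 or 1 items,
    // so collecting it behaves exactly like a max-length-1 array.
    let as_vec: Vec<i32> = present.into_iter().collect();
    assert_eq!(as_vec, vec![7]);

    let empty: Vec<i32> = absent.into_iter().collect();
    assert!(empty.is_empty());

    // The more efficient implementation: for types with a "niche"
    // (e.g. references, which are never null), Option<&T> is the
    // same size as &T -- no length field or tag is stored at all.
    assert_eq!(
        std::mem::size_of::<Option<&i32>>(),
        std::mem::size_of::<&i32>()
    );
    println!("ok");
}
```

The last assertion is the payoff: a tagged union or a length-counted array would need extra storage, but the optional type, treated as its own kind of derived type, often needs none.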
The attribution to Hoare is a common error — "Premature optimization is the root of all evil" first appeared in Knuth's 1974 paper "Structured Programming with go to Statements."
Knuth later attributed it to Hoare, but Hoare said he had no recollection of it and suggested it might have been Dijkstra.
Rule 5 aged the best. "Data dominates" is the lesson every senior engineer eventually learns the hard way.
Hoare's influence extends much further than Quicksort and null.
His work on ALGOL W with Wirth — rejected by the committee — led directly to Pascal. Meanwhile ALGOL 60, which he helped maintain, shaped the entire C lineage: ALGOL → CPL → BCPL → B → C.
His 1978 CSP paper is equally significant. Go's channels and goroutines are derived from CSP. Erlang is closer to the original CSP model, communicating with processes by name rather than over channels. Rob Pike traced Go's concurrency lineage explicitly through Newsqueak → Alef → Limbo, all CSP descendants.
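A minimal sketch of the CSP style at issue, using Rust's standard library rather than Go (the channel semantics are the same CSP-inspired pattern): independent processes share nothing and interact only by passing messages over a channel.

```rust
use std::sync::mpsc;
use std::thread;

fn main() {
    // A channel connects two sequential processes (here, OS threads).
    let (tx, rx) = mpsc::channel();

    let producer = thread::spawn(move || {
        for i in 1..=3 {
            // Communicate by message passing, never by shared memory.
            tx.send(i).unwrap();
        }
        // tx is dropped here, which closes the channel.
    });

    // The receiver iterates until the channel closes.
    let received: Vec<i32> = rx.iter().collect();
    assert_eq!(received, vec![1, 2, 3]);

    producer.join().unwrap();
    println!("received {:?}", received);
}
```

The key CSP idea visible even in this toy: the two processes never touch each other's state, so there is nothing to lock.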
And the null confession wasn't just rhetoric — Rust's Option<T>, Swift's optionals, and Kotlin's non-nullable defaults are all direct responses.
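To make that design response concrete, here is a small sketch using Rust's real standard-library Option<T> (the function name is just an example): the compiler will not let a value out of an Option without the absent case being handled, which is precisely the "design around null" the confession enabled.

```rust
// Sketch: Option<T> forces the "no value" case to be handled
// explicitly, instead of allowing a null that can be dereferenced.
fn first_char(s: &str) -> Option<char> {
    s.chars().next() // None for the empty string, never a null
}

fn main() {
    // Pattern matching must cover both arms; there is no way to
    // "forget" the None case and get a null-pointer-style crash.
    match first_char("hoare") {
        Some(c) => println!("first char: {}", c),
        None => println!("empty string"),
    }

    // unwrap_or supplies an explicit default instead of a hidden null.
    assert_eq!(first_char("").unwrap_or('?'), '?');
    assert_eq!(first_char("rust"), Some('r'));
}
```

Swift optionals and Kotlin's nullable types enforce the same discipline with different syntax: absence is in the type, not in the value.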
His first degree was Classics and Philosophy at Oxford. He invented Quicksort in Moscow while building a translation dictionary.
Every language since has had to pick a side, and "optional" always turns out to be the hardest choice.
JavaScript's ASI is what happens when a 10-day language has to live with a quick pragmatic decision for 30 years.