Controversial ideas in Tech that end up working well


As a programmer, it is your job to put yourself out of business. What you do today can be automated tomorrow.

Douglas McIlroy

1940s: digital vs analog computers

Vannevar Bush: scientist, participant in the Manhattan Project, member of the AT&T board of directors, inventor of the Differential Analyzer machine, and more. While he is the conceptual father of the "personal computer" — the Memex of his As We May Think - The Atlantic — he was famously against digital computers: he did not believe they could be built reliably and favoured analog computation instead.

1950s: absolute binary vs assembler

Renting an IBM 701 cost about $300/hour, which, adjusted for inflation, works out to roughly $1 per second today ($300/h then is on the order of $3,600/h now). As assemblers (or "symbolic systems", as they were known back then) were "too wasteful of the machine time", people were sceptical and kept programming in absolute binary instead, not worrying about the "human time" it took to do so.

Richard Hamming, in The Art of Doing Science and Engineering, chapter 4, mentions:

At the time [the assembler] first appeared I would guess about 1% of the older programmers were interested in it — using [assembly] was “sissy stuff”, and a real programmer would not stoop to wasting machine capacity to do the assembly. Yes! Programmers wanted no part of it, though when pressed they had to admit their old methods used more machine time in locating and fixing up errors than the [assembler] ever used. One of the main complaints was when using a symbolic system you do not know where anything was in storage — though in the early days we supplied a mapping of symbolic to actual storage, and believe it or not they later lovingly pored over such sheets rather than realize they did not need to know that information if they stuck to operating within the system — no! When correcting errors they preferred to do it in absolute binary.

1960s: assembler vs high-level languages

Fortran (Formula Translating System) was created to translate the "human language" of mathematical formulas into machine language.

Richard Hamming recalls that it was opposed by almost all programmers:

  1. “It can’t be done”,
  2. “it will be so inefficient that you cannot afford it”,
  3. “even if it did work, no respectable programmer would use it — it was only for sissies!”

Algol (Algorithmic Language): at the Panel Discussion on “Philosophies for Efficient Processor Construction” at the 1962 “International Symposium of Symbolic Languages in Data Processing”, a large group of participants was sceptical about the usefulness of recursive procedures in the ALGOL 60 language (which E. W. Dijkstra had implemented a year earlier) — source
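To see what was at stake: a recursive procedure needs a fresh activation record for every call (i.e. a runtime call stack), whereas FORTRAN-era implementations typically allocated each procedure's locals statically, so a second call would clobber the first one's state. A minimal sketch of a recursive procedure, written in C rather than ALGOL for readability (the example is illustrative, not taken from the 1962 discussion):

```c
#include <stdio.h>

/* A recursive procedure: each call gets its own copy of `n`
 * on the call stack. With static allocation of locals, a
 * nested call would overwrite the outer call's state. */
static unsigned long factorial(unsigned int n) {
    if (n <= 1)
        return 1;                /* base case ends the recursion */
    return n * factorial(n - 1); /* recursive call pushes a new frame */
}

int main(void) {
    printf("10! = %lu\n", factorial(10));
    return 0;
}
```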

Compiled (AOT) vs interpreted (VMs, JIT) languages

Lisp (1959), Forth (1970), and Smalltalk (1972) are interpreted — and each was initially dismissed as too slow for serious work, before VM and JIT techniques closed much of the gap.
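As a concrete picture of the interpreted side, here is a sketch of a toy bytecode VM with a fetch-decode-execute dispatch loop, in C. The opcodes and program are invented for illustration (they do not correspond to any real VM); an AOT compiler would instead translate the same arithmetic into native instructions before the program ever runs:

```c
#include <stdio.h>

/* Toy stack-machine opcodes (illustrative only). */
enum { OP_PUSH, OP_ADD, OP_MUL, OP_PRINT, OP_HALT };

static void run(const int *code) {
    int stack[64], sp = 0;
    for (int pc = 0; ; ) {
        switch (code[pc++]) {           /* fetch-decode-execute loop */
        case OP_PUSH:  stack[sp++] = code[pc++]; break;
        case OP_ADD:   sp--; stack[sp - 1] += stack[sp]; break;
        case OP_MUL:   sp--; stack[sp - 1] *= stack[sp]; break;
        case OP_PRINT: printf("%d\n", stack[sp - 1]); break;
        case OP_HALT:  return;
        }
    }
}

int main(void) {
    /* (2 + 3) * 4 — decoded at run time, opcode by opcode. */
    const int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD,
                            OP_PUSH, 4, OP_MUL, OP_PRINT, OP_HALT };
    run(program);
    return 0;
}
```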

Memory management: manual vs automatic (GC, reference counting)
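The question here is whether the programmer or the runtime decides when memory is freed. Below is a minimal sketch in C contrasting manual free() with a toy reference-counting scheme (the RcStr/rc_* names are invented for illustration; production collectors and ARC-style systems are far more involved):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Manual management: the caller must remember to free(). */
static void manual_demo(void) {
    char *s = malloc(6);
    strcpy(s, "hello");
    printf("%s\n", s);
    free(s);                     /* forget this and you leak */
}

/* Automatic flavour: a minimal reference-counted buffer.
 * The object frees itself when the last reference is released. */
typedef struct { int refs; char data[6]; } RcStr;

static RcStr *rc_new(const char *src) {
    RcStr *p = malloc(sizeof *p);
    p->refs = 1;
    strcpy(p->data, src);
    return p;
}
static RcStr *rc_retain(RcStr *p)  { p->refs++; return p; }
static void   rc_release(RcStr *p) { if (--p->refs == 0) free(p); }

int main(void) {
    manual_demo();
    RcStr *a = rc_new("world");
    RcStr *b = rc_retain(a);     /* two owners now */
    rc_release(a);               /* still alive: b holds a reference */
    printf("%s\n", b->data);
    rc_release(b);               /* refcount hits 0, memory is freed */
    return 0;
}
```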

Object-Oriented Programming

Edsger Dijkstra joked:

object-oriented programming is an exceptionally bad idea which could only have originated in California.

Linus Torvalds at gmane.comp.version-control.git said:

limiting your project to C means that people don’t screw that up, and also means that you get a lot of programmers that do actually understand low-level issues and don’t screw things up with any idiotic “object model” crap.

Rob Pike in comp.os.plan9 commented:

object-oriented design is the roman numerals of computing

Joe Armstrong (more in Why OO Sucks by Joe Armstrong) said elsewhere:

The problem with object-oriented languages is they’ve got all this implicit environment that they carry around with them. You wanted a banana but what you got was a gorilla holding the banana and the entire jungle

Other
