@AndreasS2501
Last active December 13, 2016 09:38
(trying to) Understand Alan Kay (better)

I'm writing this in response to this interview/talk with Alan Kay: https://www.youtube.com/watch?v=fhOHn9TClXY

This in turn prompted some interesting responses on Twitter, like:

https://twitter.com/tomaspetricek/status/804704923317309441 or this one https://twitter.com/viktorklang/status/803535678466326528

and, in my opinion, you can even stretch it far enough to connect it to this:

https://twitter.com/curious_reader/status/806401151713366016

By "it" I mean Alan Kay's philosophy, not the talk per se, since his talks are always more about his philosophy than about any specific topic; he just embeds the topic at hand in that philosophy.

It is quite a challenge to discuss Alan Kay's ideas on Twitter, because they require a lot of context, or even an acknowledgment that there is a context.

Context

I think many people from the IT industry struggle to understand Alan Kay because of perspective. Here I can throw in a very fitting quote from Alan Kay himself:

"A change in perspective is worth 80 IQ points."

So I would assume that many people who saw the talk at Codemesh IO, and many who saw it online, were trying to understand what he said from a perspective that is in some sense tied to the IT industry.

And this is the first mistake, if you want to put it that way. Alan Kay is a scientist who came to the field of computing when it was still in its infancy and still growing. As a matter of fact, at the time he got into computing (the 1960s), many people already proficient in other domains came to computing to explore and experiment.

This may seem trivial, but it is not. Today the IT industry, and even academia, is so busy with itself that ideas from other fields find it almost impossible to penetrate the armor built by decades of tradition in the computing industry and academia.

If you want to understand this topic more deeply and consciously, I would recommend the Dijkstra diaries (the EWD notes): 447, 1284, 1036, 361

Converse?

Sometimes statements from Alan Kay like "Linux is a distraction" may seem strange, but I think that is just because of, again, perspective. Our perspective when trying to understand his views is unfavourable.

Some time ago Alan Kay did an AMA on Hacker News, and a discussion came up about how "data could be bad". Rich Hickey, the creator of Clojure, is a big fan of "data". Personally, I find Clojure and the ideas Rich Hickey incorporated into it beautiful: things like functional programming and data. They are great and help mitigate the chaos the IT industry is in. Here one can clearly see the difference in perspective. While Rich Hickey did amazing things (to quote him: "creating a new programming language is an act of insanity"), he created Clojure so programmers could deal better with their daily problems. In that context, the context of our current IT industry, his statements make absolute sense (data is good, etc.).
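To make the "data" stance concrete, here is a minimal sketch in Python (Clojure itself would use immutable maps and keywords; the record fields here are invented for illustration): model domain information as plain, generic data structures and manipulate them with ordinary, reusable functions, rather than hiding them behind bespoke classes.

```python
# The "data" stance in miniature: plain dicts (standing in for
# Clojure maps) plus generic functions, instead of custom classes.

def with_tag(record, tag):
    """Return a new record with `tag` appended; the original is untouched."""
    return {**record, "tags": record.get("tags", ()) + (tag,)}

talk = {"speaker": "Alan Kay", "venue": "Code Mesh", "year": 2016}

tagged = with_tag(talk, "philosophy")

print(tagged["tags"])    # generic access works on any record
print("tags" in talk)    # False: the original data was never mutated
```

The point of the example is that any tool that understands dicts (serializers, diffing, generic queries) works on `talk` for free, which is exactly the leverage Hickey argues "data" gives you.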

But why, then, does Alan Kay say "What if 'data' is a really bad idea?"? Well, for one thing, he is not concerned exclusively with the IT industry, but with humanity at large. Why treat programmers in a special way? Why hasn't the computer revolution happened yet? Why can't every object in a program have a URL and use it to communicate and cooperate with other objects (I think blockchains will, to a degree, fulfil this promise)? That is the context Alan Kay thinks in. Yes, it is this comprehensive.
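The "every object has a URL" idea can be sketched in a few lines. This is a hypothetical toy, not Kay's design: the registry, the `obj://` scheme, and the method names are all invented here. The essential property it tries to show is that objects are reachable only through an address, and all cooperation happens by sending messages, never by reaching into another object's internals.

```python
# Toy sketch of addressable, message-passing objects.
# Everything here (registry, obj:// scheme, receive/send) is
# invented for illustration; in a real system the URL would
# cross machine boundaries.

registry = {}  # maps URL -> object

class Counter:
    def __init__(self, url):
        self.value = 0
        registry[url] = self      # publish the object under an address

    def receive(self, message):   # the object's single entry point
        if message == "increment":
            self.value += 1
        return self.value

def send(url, message):
    """Deliver a message to whatever object lives at `url`."""
    return registry[url].receive(message)

Counter("obj://example/counter/1")
send("obj://example/counter/1", "increment")
print(send("obj://example/counter/1", "increment"))  # 2
```

Because callers only ever hold a URL and a message vocabulary, the object behind the address could be replaced by one on another machine without the callers noticing, which is the late-binding, network-scale kind of "object" Kay talks about, as opposed to data passed around in the open.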

Alan Kay said he has read about 20,000 books. You can get a partial idea of what those books were about here: http://wiki.c2.com/?AlanKaysReadingList http://www.squeakland.org/resources/books/readingList.jsp So one can imagine that he has a much broader spectrum of ideas influencing his work.

Another phrase he sometimes mentions is: "let's face it, computing turned into pop culture". I think this means that, in his view, the current IT industry (and largely academia too) only tries to solve problems, or to improve, within the scope of where it is today. So I think he sees the IT industry and academia trapped in a kind of Blub paradox (see Paul Graham): a situation where you cannot really see outside of your own field. I think he looks at the situation today and sees it as a particular paradigm, in the sense of the scientific-revolution paradigms described by Thomas S. Kuhn. This is why he says the computer revolution hasn't happened yet: because there has been rather little progress when looking back from the 1970s.

If you look at the computer as a medium, it hasn't changed that much in the last 50 years. Sure, computers got faster, but it's still mostly GUI software and interaction via keyboard or a pointing interface (mouse/touch). In the 70's many people anticipated a different interaction model with the computer, but as of today little of this exists in reality. Maybe augmented/virtual reality and brain-computer interfaces will change that.

So?

I guess thinking in terms of "the human condition", "the structure of scientific revolutions" or "the computer revolution hasn't happened yet" seems very distant to people from the IT industry, but I think that if we actually want to advance on the problems humanity faces, it is not a bad idea to try to take them into account.

Conceiving personal computing was the paradigm shift that allowed him to envision even further...

Files as an example of "just data"; for the disadvantages, see Ted Nelson.
