There's a lot of water-cooler buzz about the relationship between capitalism and computation. Did early computation make capitalism possible? Did capitalism lead to the elaboration of computation? And/or, should capitalism be understood as a form of distributed computation? Is capitalism an algorithm? Am I a computer? Is the solar system one big atom? Are pre-Enlightenment forms of similitude still the basis of most humanistic reasoning?
The major problem is that neither concept is well-specified. Or rather, some academic communities don't bother specifying either term at all, while others specify them quite clearly but are at odds with one another over the best specification. In a talk I gave earlier this year, I mapped out some popular definitions for each, with very rough dates for each definition's emergence:
Of course, one solution is to stop using these terms and refer explicitly to the social or technical architectures in question. When well-specified, computer or computation and capitalism should contain clearly defined components, with equally well-defined interfaces between them. Then we could actually compare and validate research instead of retreating into hostile camps: for example, comparing mechanically computed algorithms with credit currency, or digital computers with modern debt instruments.
A few years ago I was interviewing employees at a defense contractor and spoke with a woman who met her husband through some of the very first stable Internet connections, in the late 1970s. They divorced. The Internet is still here, though. The Internet always wins.