@non
Last active Aug 29, 2015
The analogy is comparing writing programs to buying goods:
writing a program with static types is equivalent to buying
with cash, and writing a program with dynamic types is
equivalent to buying with credit.
The "cost" in either case is mental energy and time. When I
write a program with static types I spend a lot of effort up
front, guided by the type system and compiler, to handle edge
cases and generate a totally correct solution. I have to
fully understand any libraries I'm using and characterize
the problem in terms of types.
When I write a program with dynamic types, I typically get
something I can run much faster, but then I struggle through
compose/test/debug cycles until the program is eventually
correct. The energy cost here is spread out (potentially
over days or weeks) instead of happening earlier.
The correct program is the thing I am buying in this analogy,
not the currency I'm spending.
@non non commented Jan 29, 2014

Crucially, if I am very time/energy poor, it may be easier to start working with a dynamic language, and defer "payment" until later when I have a more concrete idea of the errors/problems/whatever. But the total cost is usually higher (for me anyway).

@milessabin milessabin commented Jan 29, 2014

I see what you're getting at, but I think the analogy is misleading. Monetary costs, whether up front cash or credit, are purely instrumental. When you say "writing a program with static types is equivalent to buying with cash, and writing a program with dynamic types is equivalent to buying with credit" you're strongly suggesting that types (and tests) are purely instrumental in the same way. I don't think this does justice to either (though unsurprisingly I'm more partisan wrt types ;-).

To see how quickly the analogy breaks down, consider that in the monetary case getting the same thing cheaper is better (other things being equal), and in the limit the best outcome is free. Your analogy suggests that a program with fewer (or no) types and tests is better (so long as it actually works). I think this is false (and I would continue to think that even if I were certain, for whatever reason, that the program behaved as intended). I think this is clearest in the case of types: we might structure our program in ways that make essential use of them (e.g. by way of type classes). I expect the same might be said for tests, but I'll leave it to a test partisan to make that case.
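The "essential use of types" point can be sketched even in Python. Below is a hypothetical example (the names `Monoid`, `IntSum`, and `fold` are invented for illustration) using `typing.Protocol` as a rough structural analogue of a Haskell/Scala type class: the generic fold only makes sense because of the type abstraction, so the types are load-bearing rather than instrumental.

```python
from typing import Protocol, TypeVar

T = TypeVar("T")

class Monoid(Protocol[T]):
    """Type-class-like interface: an identity element plus an associative combine."""
    def empty(self) -> T: ...
    def combine(self, a: T, b: T) -> T: ...

class IntSum:
    """Monoid 'instance' for integers under addition."""
    def empty(self) -> int:
        return 0
    def combine(self, a: int, b: int) -> int:
        return a + b

def fold(m: Monoid[T], xs: list[T]) -> T:
    """One generic fold works for any Monoid instance; the program's
    structure depends essentially on the type abstraction."""
    acc = m.empty()
    for x in xs:
        acc = m.combine(acc, x)
    return acc
```

This is only a loose analogue (Protocols are structural, real type classes are resolved at compile time), but it illustrates types doing structural work rather than acting as mere spent currency.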

I'm also uncomfortable with the way you slide into costs being "mental energy and time" rather than the costs being the types and tests. Which is it? If it's really mental energy you care about, doesn't that leave it open that both types and tests might actually save mental energy now or later?

@non non commented Jan 29, 2014

I do think that both types and tests save mental energy. In fact, someone who practices strict test-driven development may get the same benefits from tests as I get from types under my model (or not, I'm agnostic on this point for the purposes of our discussion).

I think the cost has to be time/energy because there is no limit to how many types I use. Given infinite time/energy I don't mind "spending" as many types (or tests) as I need, so I don't see them as a constraint. I guess what I am trying to get at is that types, tests, and ad-hoc debugging will all potentially arrive at a correct solution, but the graph (in terms of time) and the area under the curve of that graph are different in each case.

I don't think that fewer tests/types is better, but I do think that minimizing time/energy spent is a useful metric. For instance, if I'm trying to make a game in 48 hours (say, for the Ludum Dare competition), working in Python or JavaScript may end up being easier, since I get a semi-working prototype more quickly than I would if I were using Haskell. Someone else's experience may be different here; I'm just trying to illustrate the point.

In conclusion, I never intended types or tests to be thought of as a cost, but given the previous article I can see how my argument was framed that way.

@non non commented Jan 29, 2014

An implication of my argument is that if you can show that for every time t, one strategy's curve is less than another's, then that strategy is strictly better. The implication of the analogy is that the type curve starts out higher but then flattens out when correct, compared to a testing curve that keeps ascending over time as more bugs are found and tests are added.
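That dominance claim can be sketched with made-up cost curves (the functions and numbers below are purely illustrative, not measurements): the static curve starts higher and flattens once the program is correct, the dynamic curve starts near zero and keeps climbing, so the curves cross and neither strategy dominates the other at every t.

```python
def static_cost(t: float) -> float:
    # front-loaded: high initial effort that flattens once the program is correct
    return 8.0 + 2.0 * min(t, 2.0)

def dynamic_cost(t: float) -> float:
    # back-loaded: cheap start, but debugging effort keeps accumulating
    return 3.0 * t

def dominates(f, g, ts) -> bool:
    """True iff f's cumulative cost is strictly below g's at every sampled time."""
    return all(f(t) < g(t) for t in ts)

ts = [0.5 * i for i in range(1, 21)]  # sample t = 0.5 .. 10.0
# With these curves, neither dominates(static_cost, dynamic_cost, ts)
# nor dominates(dynamic_cost, static_cost, ts) holds: the curves cross,
# so "strictly better" depends on the time horizon.
```

Comparing the areas under the curves over a fixed horizon is then a way of comparing total cost even when neither curve dominates pointwise.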

@milessabin milessabin commented Jan 29, 2014

Apologies if my reading was excessively literal-minded. I think the idea of the "area under the curve" and front- vs. back-loading is useful. I think it would probably be wise to have at least two dimensions though: one for costs, one for benefits, without the assumption that they are directly commensurable.

I'm also wondering if your experience of extremely time constrained programming competitions is shaping the way you're thinking about this? If it is, do you think your conclusions carry over to less constrained circumstances?

@milessabin milessabin commented Jan 29, 2014

Yeah, pareto optimality would be nice ... you'll be lucky to get it though ;-)

@non non commented Jan 29, 2014

I totally agree that I'm ignoring qualitative things like beauty and whatnot, which is definitely a problem. I certainly like some programs more than others!

So, besides time-constrained programming competitions, I guess I have had experiences where I started working on something I knew absolutely nothing about. I've done this with static and dynamic types and I would say that I found the "activation energy" lower with dynamic types. This can apply to a domain I'm totally ignorant of, or a project that's too big to contemplate doing on my own.

For instance, the text editor I wrote (about 10K lines of Python) evolved as I wanted to add features and fix bugs, and didn't have an up-front design. It's possible that the (immense) amount of energy I put into it between 2006 and 2011 would have been much lower with static types, but it's also possible I would never have gotten an initial prototype working well enough to get excited, stick with it, add features, etc. Using it in a day-to-day way was a huge motivation to make it better/faster.

I think I could probably write a better version of this editor in Scala (or Idris or whatever) these days, but I have a good idea of what design will work and why, and what types I'll need. It's possible it would have worked just as well then, but it's also possible I would have struggled and given up (not having enough energy/intelligence/etc for the up-front cost).

@milessabin milessabin commented Jan 29, 2014

One thing that strikes me as a little odd is the timing: types are first and tests come later? Really? Isn't that just a strawman ... surely nobody really thinks that's an accurate picture? TBH, I don't even think it's a good approximation.

From my completely personal and anecdotal PoV, I do sketching first to rough out a structure and then refine and rework. Types and (some) tests work for me for the initial phase, but I could easily imagine someone with different preferences putting more emphasis on tests up front. But either way this really doesn't feel like the kind of periodization you're claiming.

@non non commented Jan 29, 2014

So, I don't know where types and tests came into things, since I was mostly talking about dynamic vs static types.

What I am really talking about is building types/tests into an upfront design, getting things "working" (compiling/passing), then moving on, versus implementing something, seeing it work in an ad-hoc way, moving on, and then returning to fix bugs/refactor/whatever later as needed. Dynamic types are compatible with either strategy, but static types are less amenable to getting something fuzzy working in an ad-hoc way.
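That contrast can be sketched in Python (a hypothetical cart-total example; the data shape and names are invented for illustration): the ad-hoc version runs immediately against whatever dicts show up, while the up-front version asks you to commit to the data's shape before implementing.

```python
from dataclasses import dataclass

# Ad-hoc dynamic style: start running immediately and cope with
# whatever fuzzy shapes the data happens to have.
def total(items):
    return sum(i["price"] * i.get("qty", 1) for i in items)

# Up-front typed style: pin down the data's shape first, then implement.
@dataclass
class Item:
    price: float
    qty: int = 1

def total_typed(items: list[Item]) -> float:
    return sum(i.price * i.qty for i in items)
```

Both arrive at the same answer; the difference is when the shape of the data has to be decided, which is the front-loading the thread is arguing about.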

I certainly don't spend days in a laboratory slowly building types, so I imagine my workflow is somewhat similar to yours. But I've never built a large project (game, editor, whatever) from start to finish with static types by myself, so the statically-typed projects I've worked on are usually smaller and more constrained (building an R-Tree, building a segmented sieve, implementing some algorithms, interval arithmetic, etc.).

@non non commented Jan 29, 2014

To be clear, I find a ton of joy in working with static types so that when my program compiles it often works correctly without bugs.

I am just trying to reconcile this with the fact that I haven't yet written any large personal tools comparable to those I built in Python back when that was my go-to language.

@milessabin milessabin commented Jan 29, 2014

Doesn't Spire count?

@non non commented Jan 30, 2014

Well, Spire is an interesting case. I mean, it's a big project, but as a library it's pretty modular. Also, there have been many collaborators (which is something a good type system has made way easier).
