Courtesy of /u/bane on HackerNews: http://web.archive.org/web/20141017022007/https://news.ycombinator.com/item?id=7373301
It seems that they go in a few directions:
The most common seems to be to try to generalize, because relearning most of your job skills every few years starts to get annoying the 20th time you've had to do it. It's different when you're younger and everything is new; you chalk up a major tooling change as just something else to learn. But when the next hot platform or architecture or whatever comes out, you get tired of running in exactly the same place.

You also start to get a long view on things, where all these new things coming out don't really seem to offer any advantage to you that keeps development fun. It's just more and more layers of abstraction, and you start to see the nth demo of WebGL maxing out a modern 4-core GPU system doing exactly what you did 20 years ago with a single 32-bit core, 1/5th the transistor count, and all in software.