# The curse of knowledge, beginner’s mind, and barriers to creativity
*“The best way to learn is to learn from those who have come before you, who have figured it out and written down their wisdom. So I read to get their knowledge, to take advantage of the work they’ve done to figure it out.”*
But how, then, do you avoid only thinking like them, and making the same mistakes they did? Suppose that once we learn something and take it to heart (once it makes a lot of things make sense), it becomes harder to reject that idea and think of new ideas that might actually be closer to the truth. If so, how do we avoid thinking like the status quo and never challenging it?
This is a question that I’ve dealt with for most of my life, and over the past year I found out it has a name. The curse of knowledge is the idea that people who have more knowledge are unable to think about things from the perspective of those who have less knowledge. It’s similar to the innovator’s dilemma in that it describes a situation where the first person to think of something is the last person to challenge it, which makes the initial innovator blind to challenges to it, because of their steadfast devotion to their own innovation. Another related term is paradigm blindness, which gets closer to what I’m talking about, but is a lesser-used term. All of these ideas refer to a party that has a strong belief in something, a belief that may have paid off and been beneficial to their way of seeing things. However, that belief also *restricts their ability to think about new approaches*.
My worry is that this is present in everything that we learn—that it is *endemic* in learning itself. For example, learning about the dual process theory (of System 1, the more automatic, instinctive system of thinking (think 2+2), and System 2, the slower, deliberative system of thinking (think 7x8)) that Tversky & Kahneman popularized makes it hard to conceive of thinking in a different way. Or, learning about programming, like say this script:
    password = input("Enter your password.\n")
    if password == "swordfish":
        print("Welcome")
…well, how could you conceive of a programming language that operates differently than that? That seems so fundamental and simple that thinking about a different approach is extremely difficult.[^1]
The problem with the curse of knowledge is that **learning becomes tunnel vision**. Learning about something, understanding it, and believing it also means rejecting conflicting knowledge. (Learning about dual-process thinking makes me reject other conflicting models of cognition, for example.)[^2] But the net result is that I’ve tunnel-visioned myself into thinking about things in a certain way, to the detriment of being able to think about them differently and come up with new conceptions.
And it’s not enough to just *will* myself to think about things differently. If we assume the somewhat crude model of learning as reinforcing neural pathways, so that the facts, beliefs, and habits we hold strongest are like deepened grooves of mental patterns, we can see that thinking differently means challenging these deeply entrenched and familiar ideas. Further, many of these aren’t just simple ideas; we use some of these concepts as *ways to think*. So trying to will yourself to think differently, rather than just accepting these concepts, is like trying to move a rug while you’re sitting on it.
That’s because a huge component of these beliefs and things that we learn are *frameworks of thought*. We use these frameworks as foundational elements to engage in thinking itself. A good example in psychology is dual-process theory, as we’ve already touched upon. One in economics is the idea of allowing people to act in self-interest and increasing incentives for behavior. The problem is that these concepts become so second-nature and subconscious that we don’t even think to challenge them, and our conscious *and* subconscious thought (the latter of which is thought to contribute strongly to ‘dormant’ creativity) is subject to these frameworks of thought. We become blind to the paradigms that we are using, and it’s so much harder for us to see outside these paradigms, to come up with alternative explanations, because we’re using them unconsciously.
This is getting kind of depressing, so let’s bring in a nice Buddhist conception of it — they always have nice conceptions — which every Zen Buddhist knows about. In his wildly popular (and Steve Jobs-approved) book, *Zen Mind, Beginner’s Mind*, Shunryu Suzuki throws down a nice conception of what beginner’s mind is:
> "In the beginner's mind there are many possibilities, but in the expert's there are few."
The beginner is open to many opportunities, possibilities, and ways that something can be. The expert already knows, ostensibly, what possibilities there are. But the expert has signed away his ability to think about things openly—when he comes across an idea outside of his scope of knowledge, he can come up with objections to reject that idea.
## Personally…
Yeah, this is a pretty big problem for me personally, and it is a blocker when I learn about anything new. I worry about delving deeply into an important subject for fear that learning the traditional knowledge in that field will only make me think in traditional ways—not just on the surface, but deeply and subconsciously, using these frameworks without even knowing it and thus without the ability to control it.
And being at college has taken this problem to a whole different level. I have a few personal ideas about behavioral science given my dilettante-esque experience with some of it. I worry that learning the traditional body of knowledge will make me think in traditional ways, unable to challenge those ideas because I don’t even know that I’m using them.
But rejecting all of current knowledge and forging a new path, while it could uncover some new ideas, seems to be ultimately misguided since it would ignore everything that we’ve found out up to now.
## What do we do?
We might say: **that’s the way it has to be, but the end result is beneficial.** In an age of complexity, where making any progress in a field requires an immense amount of specialized knowledge as a precursor, it might be necessary to learn a lot, get tunnel-visioned, and whatnot to make breakthroughs. For example, in psychology, we can’t just ignore what has come before us and try to blaze a totally new trail. This view says that my ideas about psychology, built from little traditional knowledge, might be kind of cool, but nine times out of ten, they’ll be kind of stupid, most likely false, and almost certainly misinformed, and it’s only with the traditional knowledge in a field that I’ll be able to make any real progress. While there are some outsiders who go into a field and completely revolutionize it, they may be few and far between; probabilistically, it might be better to accept the tunnel vision in exchange for knowledge that gives you a higher chance of building new ideas than a beginner has.
![](http://i.imgur.com/hbn5Cof.png)
Or, we might say: **we can get the benefits of knowledge while still retaining an open mind**. In this conception, we can still hold on strongly to things like dual process theory, but find strategies to combat tunnel vision and try to challenge the status quo, restoring the ability to see things in a different light and avoiding paradigm blindness.
Relating to this idea, Will Johnson says:
> “[The Making of the Atomic Bomb](https://www.goodreads.com/book/show/16884.The_Making_of_the_Atomic_Bomb)” has a really good section about what makes a brilliant scientist. His hypothesis is that the best discoveries are made by people with 100% understandings of their field, but an unusual ability to remove themselves from groupthink.
One way is to systematically challenge our thoughts and use some form of root-cause analysis to understand the structure of thought we are using. Essentially, questioning what led to a certain idea, and then questioning all of those precursor ideas, with the hope that you can gain clarity on whether the first principles are right or not. It seems like it would work, but this process is immensely difficult and hugely uncertain, since it’s doubtful that we can be fully impartial and objective while carrying out this process.
Brian Tobal[^3] suggested that meditation might help us do this. While it does seem like the automatic answer to everything nowadays, there is actually a good case to be made for it: presumably, if we meditate to increase awareness, then we can potentially be more aware of when we are using these frameworks of thought that are deeply embedded, and also question whether they are right or not. Perhaps greater awareness will allow us to be a bit more impartial about evaluating whether our “usual thinking,” like seeing something in a System 1 / System 2 light, is really the right answer, and allow us to come up with new ways to think outside of those well-worn grooves.
Finally, we can look into history and see what percentage of people actually succeeded as outsiders who came into a separate field and revolutionized it, versus those who revolutionized it from within.
## What’s the point?
The importance of all this is: the tunnel-visioning that happens with the innovator’s dilemma is certainly not limited to innovation. If we can see it happening in science as well, we might be able to be more aware of it, so that we can question whether our belief in the moment is actually useful. Scientists, paradoxically, are expected to specialize, causing tunnel vision, but they are also expected to “think outside the box!” and come up with new ideas. If paradigm blindness is influential in their thinking, then building systems to escape paradigm blindness is essential. Otherwise, we might continue learning, aware of the tunnel vision we are engaging in, yet unable to think in new ways.
[^1]: More on functional programming and other paradigms later.
[^2]: If we see it in this way, we can call learning predominantly zero-sum: we learn something and accept it, and reject the things that are not that.
[^3]: Brian Tobal is a former teacher and current product manager at Thinkful, and works on a lot of ideas on learning, knowledge retention, and education.