[00:00.000 --> 00:16.280] When I was a little kid, I really hated visiting my native country, India.
[00:16.280 --> 00:23.320] It was hot, it was polluted, I didn't recognize any of my relatives, but most of all, it made
[00:23.320 --> 00:24.320] me really sad.
[00:24.320 --> 00:34.640] I saw a lot of things in India at a pretty young age, hunger, pain, anger, desperation.
[00:34.640 --> 00:39.040] One day, when I was about eight, my cousins and I went to a small street side ice cream
[00:39.040 --> 00:47.160] parlor in Hyderabad, and we each got little ice cream cones for less than a dollar.
[00:47.160 --> 00:51.880] As we were walking outside, we saw something that's sadly familiar to a lot of us who live
[00:51.880 --> 00:54.800] in big cities like San Francisco and Berkeley.
[00:54.800 --> 01:00.680] A woman outside the ice cream parlor was begging for money for food.
[01:00.680 --> 01:08.920] She was frail, and even though she was wrapped in a rag, I could count every one of her ribs.
[01:08.920 --> 01:15.520] I didn't have any money to give, but without thinking, I handed over my ice cream cone.
[01:15.520 --> 01:22.320] As we were walking away, my cousins stared at me, you're crazy, they said, why would
[01:22.320 --> 01:24.680] you ever do that?
[01:24.680 --> 01:32.040] And I didn't know how to respond, so we just walked home in uncomfortable silence.
[01:32.040 --> 01:37.740] That night, and a lot of nights after that, when everyone else in the house was asleep,
[01:37.740 --> 01:45.480] I would stay up, and I would sob into my pillow, because the world was unfair and full of pain.
[01:45.480 --> 01:52.480] Because what was normal was to ignore that pain, and it was crazy to try to do something
[01:52.480 --> 01:58.000] about it, no matter how small.
[01:58.000 --> 02:04.480] Because I knew that after that old woman finished her ice cream cone, she would be hungry again,
[02:04.480 --> 02:09.840] and I wouldn't be there to help her, because I felt powerless.
[02:09.840 --> 02:15.080] So I want to ask you, how many of you have had experiences like this, where you're forced
[02:15.080 --> 02:25.120] to confront just how unfair the world is, and just how privileged you really are?
[02:25.120 --> 02:26.120] So how do you react?
[02:26.120 --> 02:31.440] How do you feel when you have an experience like this?
[02:31.440 --> 02:33.400] Shame?
[02:33.400 --> 02:37.720] You feel like you don't deserve the comfort that you have?
[02:37.720 --> 02:41.120] You feel like you shouldn't have it, you don't deserve it?
[02:41.120 --> 02:46.760] You feel guilt that you should be doing more to help, instead of just living your life
[02:46.760 --> 02:48.640] as it is?
[02:48.640 --> 02:52.960] Do you feel anger that other people, people more powerful than you, people richer than
[02:52.960 --> 02:54.680] you, aren't helping?
[02:54.680 --> 02:59.240] Why should you feel guilty when they're not doing anything either?
[02:59.240 --> 03:04.760] Do you feel apathy, because you've seen sights like this hundreds of times, thousands of
[03:04.760 --> 03:08.040] times, every day, on the way to work?
[03:08.040 --> 03:11.640] And you really should feel something, but you have papers to write, you have kids to
[03:11.640 --> 03:15.400] raise, you have a life to live.
[03:15.400 --> 03:18.600] Or do you feel like you just want to deny it all?
[03:18.600 --> 03:22.880] Like it's too hard for you, you don't want to think about it, and you just want it to
[03:22.880 --> 03:23.880] go away?
[03:23.880 --> 03:26.840] That was my personal favorite as a child.
[03:26.840 --> 03:33.200] I begged my parents not to take me back to India, my dad can attest.
[03:33.200 --> 03:38.360] I even stayed up at night praying that we wouldn't move back, that we wouldn't ever
[03:38.360 --> 03:43.840] be taken back, so that I wouldn't have to see something like that ever again.
[03:43.840 --> 03:46.560] And my prayers were answered.
[03:46.560 --> 03:54.040] Life got busy, and my trips to India became less and less frequent.
[03:54.040 --> 04:00.000] I got wrapped up in an upper middle class world, in my upper middle class life.
[04:00.000 --> 04:05.720] And while I'd like to say that those experiences of suffering stayed with me in the back of
[04:05.720 --> 04:10.760] my mind, truthfully, they were pretty far back.
[04:10.760 --> 04:11.760] And I was content.
[04:11.760 --> 04:19.080] But if there's anyone who's really good at ruining a happy, contented life, it's Peter
[04:19.080 --> 04:20.080] Singer.
[04:20.080 --> 04:27.120] When I was 14 years old, I read his book, The Life You Can Save.
[04:27.120 --> 04:31.840] In that book, Peter Singer argues that if you would ruin an expensive suit to save the
[04:31.840 --> 04:37.600] life of a child drowning in a shallow pond, then you should give up a few thousand dollars
[04:37.600 --> 04:41.480] to save the life of a child dying of a preventable disease.
[04:41.480 --> 04:45.440] Those two situations were no different.
[04:45.440 --> 04:51.680] Peter Singer took me back to that time when I was eight years old, and faced with such
[04:51.680 --> 04:55.360] a stark reminder of how unfair our world is.
[04:55.360 --> 05:00.040] Peter Singer reminded me that even though that old woman wasn't in front of me anymore,
[05:00.040 --> 05:01.040] she still mattered.
[05:01.040 --> 05:06.160] And after that, I was compelled to learn more.
[05:06.160 --> 05:11.960] I learned that for every single American, for each and every one of you, there are three
[05:11.960 --> 05:19.640] people living on less than $2.50 a day, less than the price of a cup of coffee.
[05:19.640 --> 05:25.800] I learned that adverse effects from climate change are going to kill or displace tens
[05:25.800 --> 05:31.320] of millions of the most vulnerable people and animals in the next few years.
[05:31.320 --> 05:38.280] I learned about the cruelty and racial injustice of our criminal justice system.
[05:38.280 --> 05:43.360] I learned about the horrifying ways in which we treat the animals that we slaughter for
[05:43.360 --> 05:44.360] food.
[05:44.360 --> 05:50.960] I learned about the shame and powerlessness of unemployment and homelessness.
[05:50.960 --> 05:55.920] And I learned about the burden of disease and death.
[05:55.920 --> 06:01.240] By the end of it all, I felt like the world was on fire.
[06:01.240 --> 06:05.120] And relatedly, I was a horrible person.
[06:05.120 --> 06:09.180] Because I wasn't doing anything to stop it.
[06:09.180 --> 06:11.800] I always thought of myself as a hero.
[06:11.800 --> 06:17.400] I always imagined that I would be that person who would go rescue strangers from a burning
[06:17.400 --> 06:18.400] building.
[06:18.400 --> 06:23.600] I always imagined that when something sufficiently dramatic happened, when the time came, I would
[06:23.600 --> 06:24.600] step up.
[06:24.600 --> 06:30.920] But I realized then that even though I couldn't see it, the time had come.
[06:30.920 --> 06:34.640] So I had to step up.
[06:34.640 --> 06:37.280] But what would I actually do?
[06:37.280 --> 06:40.920] There are so many causes worth dedicating a lifetime to.
[06:40.920 --> 06:42.520] So many problems.
[06:42.520 --> 06:43.520] I was overwhelmed.
[06:43.520 --> 06:45.840] How would I actually choose?
[06:45.840 --> 06:48.280] What would I choose?
[06:48.280 --> 06:53.880] And that's when I discovered effective altruism, which was still a very, very small social movement
[06:53.880 --> 06:57.520] in 2011 when I was a junior in high school.
[06:57.520 --> 07:03.240] Effective altruism proposed a radical, but very simple plan for your life.
[07:03.240 --> 07:08.040] Figure out how to do the most good, and then do it.
[07:08.040 --> 07:12.280] Needless to say, simple does not mean easy.
[07:12.280 --> 07:17.520] But one insight from effective altruism really can make it a lot easier, and that's cause
[07:17.520 --> 07:18.520] prioritization.
[07:18.520 --> 07:24.160] The idea that you should actively compare different causes, figure out how much good
[07:24.160 --> 07:29.760] each one does, and only support the causes that do the most good.
[07:29.760 --> 07:36.040] To a lot of people, this might seem unfair or crass or cold.
[07:36.040 --> 07:37.800] Should we support all causes?
[07:37.800 --> 07:41.200] Don't all causes deserve some support?
[07:41.200 --> 07:44.320] I don't think so.
[07:44.320 --> 07:46.800] Causes don't deserve support.
[07:46.800 --> 07:48.520] People deserve support.
[07:48.520 --> 07:51.760] And causes are supposed to support people.
[07:51.760 --> 07:57.760] And sometimes we have to ruthlessly prioritize between causes in order to be fair to people.
[07:57.760 --> 07:58.760] Let me give you an example.
[07:58.760 --> 08:04.360] Suppose you have $1,000 to spare, and you're trying to figure out where to give it.
[08:04.360 --> 08:09.760] You could give it to an organization that trains seeing eye dogs to support people with
[08:09.760 --> 08:10.840] blindness.
[08:10.840 --> 08:15.100] Or you could give it to an organization that provides insecticide treated malaria nets
[08:15.100 --> 08:17.840] to prevent transmission of malaria.
[08:17.840 --> 08:23.040] What would happen if we tried to be fair and gave $500 to each?
[08:23.040 --> 08:31.080] Well, it takes about $45,000 to train a seeing eye dog from a puppy to graduation.
[08:31.080 --> 08:37.680] So your $500 would support about 1% of that, or about one week of training.
[08:37.680 --> 08:42.960] On the other hand, it costs about $10 to produce and distribute a malaria net.
[08:42.960 --> 08:49.200] Each net protects two people, usually children under the age of five, from getting malaria
[08:49.200 --> 08:51.760] for three years.
[08:51.760 --> 08:58.640] So with your $500, you could protect 100 children for three years from a debilitating and often
[08:58.640 --> 08:59.640] fatal disease.
[08:59.640 --> 09:06.000] Add that to the first $500 you spent, and this is how much good you've done.
[09:06.000 --> 09:10.280] And this is how much good you would have done if you had spent the other $500 on the malaria
[09:10.280 --> 09:14.160] charity as well.
[09:14.160 --> 09:21.040] In reality, if we try to be fair and support both of these causes equally, what we're saying
[09:21.040 --> 09:27.840] is that we value helping one person a little bit over helping 100 people a lot.
[09:27.840 --> 09:30.320] And that's not fair.
[09:30.320 --> 09:37.280] Supporting all causes equally is unfair to people, and they're who we care about.
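
As a minimal sketch of the arithmetic above, here is the comparison in Python. The figures (about $45,000 per guide dog, about $10 per net, two people protected per net for three years) are the speaker's illustrative estimates, and the names below are made up for this sketch.

# Cost-effectiveness arithmetic from the talk, using the speaker's rough figures.
GUIDE_DOG_COST = 45_000   # dollars to train one seeing eye dog
NET_COST = 10             # dollars to produce and distribute one malaria net
PEOPLE_PER_NET = 2        # people protected by each net
YEARS_PER_NET = 3         # years of protection per net

def guide_dog_fraction(donation):
    """Fraction of one guide dog's training covered by a donation."""
    return donation / GUIDE_DOG_COST

def people_protected(donation):
    """Number of people protected from malaria for YEARS_PER_NET years."""
    return (donation / NET_COST) * PEOPLE_PER_NET

# Splitting $1,000 "fairly" between the two charities:
print(guide_dog_fraction(500))   # ~0.011, i.e. about 1% of one dog's training
print(people_protected(500))     # 100.0 people protected for ~3 years

# Giving the full $1,000 to the malaria charity instead:
print(people_protected(1000))    # 200.0 people protected for ~3 years
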
[09:37.280 --> 09:42.520] And we can't necessarily rely on our personal experience or instinct to let us know what
[09:42.520 --> 09:44.400] causes are the best either.
[09:44.400 --> 09:48.720] John Green puts this beautifully in his book, The Fault in Our Stars.
[09:48.720 --> 09:55.000] In this book, the main character Hazel's favorite book stars a cancer survivor named Anna.
[09:55.000 --> 09:56.800] This is what Hazel says about Anna.
[09:56.800 --> 10:02.040] Anna decides that being a person with cancer who starts a cancer charity is a bit narcissistic.
[10:02.040 --> 10:07.880] So she starts a charity called the Anna Foundation for People with Cancer Who Want to Cure Cholera.
[10:07.880 --> 10:10.880] I aspire to be an Anna.
[10:10.880 --> 10:17.840] And effective altruists understand, like Anna, that choosing from our heart is unfair too.
[10:17.840 --> 10:18.840] We're human.
[10:18.840 --> 10:22.440] We're biased to want to support those who are like us.
[10:22.440 --> 10:25.680] Our experience informs the things that we think are urgent.
[10:25.680 --> 10:29.720] And if we don't try to move past that and actually do the research, we'll end up choosing unfairly too.
[10:29.720 --> 10:34.480] So if we can't choose from our heart, we need some kind of framework to choose the best
[10:34.480 --> 10:35.480] cause.
[10:35.480 --> 10:38.800] Here's one that I find particularly useful.
[10:38.800 --> 10:42.120] We can break a cause down into three components.
[10:42.120 --> 10:47.120] Importance, tractability, and neglectedness.
[10:47.120 --> 10:49.600] Let's start with importance.
[10:49.600 --> 10:52.480] Importance is the product of scale and severity.
[10:52.480 --> 10:56.200] Scale refers to the number of people or animals this problem affects.
[10:56.200 --> 11:01.440] And severity refers to how badly it affects each one.
[11:01.440 --> 11:05.440] Even though it might seem hard, we can actually compare really disparate causes that do very
[11:05.440 --> 11:11.160] different things relatively easily on the dimension of importance.
[11:11.160 --> 11:19.000] For example, let's consider the causes of leukemia and climate change.
[11:19.000 --> 11:24.800] 10,000 people die of leukemia every single year.
[11:24.800 --> 11:27.640] That is horrifying when you think about it.
[11:27.640 --> 11:32.320] Each one of those squares is a thousand human lives.
[11:32.320 --> 11:40.600] Every 50 minutes, someone with hopes and dreams, someone with a family and a future dies a
[11:40.600 --> 11:45.240] painful death of leukemia.
[11:45.240 --> 11:51.920] But in the next few years, from the effects of climate change, 10,000 additional people
[11:51.920 --> 11:58.000] are expected to die of heat stroke alone.
[11:58.000 --> 12:07.080] Millions of others will die from the effects of increased extreme weather and emerging
[12:07.080 --> 12:10.440] tropical diseases.
[12:10.440 --> 12:16.560] From this we can see that climate change is a more important cause than leukemia.
[12:16.560 --> 12:19.440] Not because leukemia isn't important.
[12:19.440 --> 12:25.080] Not because it's not worth dedicating your life to fighting leukemia.
[12:25.080 --> 12:32.640] But because we're only so many people and we can't solve every problem because the world
[12:32.640 --> 12:37.640] is unfair and full of pain.
[12:37.640 --> 12:42.960] And when we're trying to calculate importance,
[12:42.960 --> 12:49.760] it's actually very, very crucial to do the math, to go find the numbers, to figure out
[12:49.760 --> 12:54.560] how many people a problem affects, to figure out how badly it affects them.
[12:54.560 --> 12:57.920] Because I don't know about you, but I'm a bleeding heart.
[12:57.920 --> 13:02.640] If I were to just make up numbers for how important each cause was, everything would
[13:02.640 --> 13:06.320] be an 11 on a scale from 1 to 10.
[13:06.320 --> 13:11.440] But there can be a world of difference between two causes that both seem like urgent life
[13:11.440 --> 13:12.440] and death situations.
[13:12.440 --> 13:17.840] And if we don't pay attention to that difference, we're going to leave behind people that we
[13:17.840 --> 13:21.320] could have helped.
[13:21.320 --> 13:26.840] We could be working on the most important cause in the world, but it wouldn't matter
[13:26.840 --> 13:29.480] if we couldn't make any progress on it.
[13:29.480 --> 13:33.560] And that's the second criterion for a cause.
[13:33.560 --> 13:34.560] Tractability.
[13:34.560 --> 13:37.880] Tractability basically means that we can see a path forward.
[13:37.880 --> 13:41.720] We have good evidence that putting additional resources, additional time and money into
[13:41.720 --> 13:46.680] a problem will produce results.
[13:46.680 --> 13:50.640] Let's think about the tractability of two political causes.
[13:50.640 --> 13:54.840] The first is criminal justice reform in the United States.
[13:54.840 --> 14:02.000] Every year, police in the US shoot and kill 1,000 people without trial.
[14:02.000 --> 14:09.360] Unarmed black men are seven times more likely to be killed than unarmed white men.
[14:09.360 --> 14:14.120] The Black Lives Matter movement focuses especially on the issue of this police violence.
[14:14.120 --> 14:19.960] But in truth, this kind of excessive cruelty and racial inequality is present at all levels
[14:19.960 --> 14:28.800] of our criminal justice system, from sentencing to trials to bail.
[14:28.800 --> 14:31.200] The second cause is foreign aid reform.
[14:31.200 --> 14:38.080] The United States spends about 1% of its federal budget on foreign aid.
[14:38.080 --> 14:41.440] Over half of this aid is actually military aid.
[14:41.440 --> 14:46.080] Even the non-military aid overwhelmingly goes to countries where we have a political stake
[14:46.080 --> 14:51.360] or a military presence, rather than the countries that need the aid the most.
[14:51.360 --> 14:56.120] If we could convince the United States government to slightly increase their foreign aid budget,
[14:56.120 --> 15:00.400] or if we could convince them to redirect more of that aid to proven cost-effective health
[15:00.400 --> 15:04.160] interventions, we could save millions of lives.
[15:04.160 --> 15:10.000] Unfortunately, very few people know about or care about foreign aid reform.
[15:10.000 --> 15:15.400] It's not in the public eye, and it's a federal issue decided by a small group of people behind
[15:15.400 --> 15:16.600] closed doors.
[15:16.600 --> 15:21.000] It's very, very hard for an outside activist to influence.
[15:21.000 --> 15:24.840] On the other hand, we're in a really exciting moment in our country for criminal justice
[15:24.840 --> 15:26.840] reform.
[15:26.840 --> 15:28.680] It's all over the news media.
[15:28.680 --> 15:34.440] There's overwhelming bipartisan support for things like cameras on cops, reclassifying
[15:34.440 --> 15:40.320] a lot of felonies as misdemeanors, retroactively reducing sentences, releasing prisoners.
[15:40.320 --> 15:46.000] This is a rare moment where we can help the disadvantaged a lot by cutting the size of
[15:46.000 --> 15:48.560] government.
[15:48.560 --> 15:56.320] Criminal justice reform is a vastly more tractable cause in our current political environment.
[15:56.320 --> 16:00.200] It's important to note, though, that tractability depends on your situation.
[16:00.200 --> 16:02.640] It can depend on your location.
[16:02.640 --> 16:09.560] For example, as people in the United States, it's relatively easy for us to help people
[16:09.560 --> 16:15.640] overseas by providing health interventions or economic interventions.
[16:15.640 --> 16:21.240] It's much harder and riskier for us to try and mess with other countries' political systems.
[16:21.240 --> 16:27.640] On the other hand, someone in Uganda or Kenya might be much better placed to be an activist
[16:27.640 --> 16:32.000] towards their own government and have them change their policies.
[16:32.000 --> 16:34.840] It obviously depends on the skills you have as a person.
[16:34.840 --> 16:39.440] The world's top biochemist is much better placed to work on the problem of aging than
[16:39.440 --> 16:43.080] on the problem of criminal justice reform.
[16:43.080 --> 16:45.900] This means that tractability can change.
[16:45.900 --> 16:51.320] It means that choosing the best cause to work on at the moment is, in a sense, opportunistic, and
[16:51.320 --> 16:55.320] that we should look for good opportunities to work on the causes that we think are the
[16:55.320 --> 16:56.880] most important.
[16:56.880 --> 17:02.600] It also means that we should build up skills so that the most important problems in the
[17:02.600 --> 17:05.040] world become more tractable for us.
[17:05.040 --> 17:10.240] Rohin will talk about this a lot more later this afternoon in a career workshop, which
[17:10.240 --> 17:13.680] you can see in your programs.
[17:13.680 --> 17:15.640] So that was tractability.
[17:15.640 --> 17:16.640] What about neglectedness?
[17:16.640 --> 17:25.360] Neglectedness is the idea that a cause is getting a lot fewer resources than its importance
[17:25.360 --> 17:30.280] and its tractability would suggest it should get.
[17:30.280 --> 17:34.000] Other things being equal, we want to support causes that are more neglected.
[17:34.000 --> 17:36.560] This is not out of a sense of fairness.
[17:36.560 --> 17:41.280] This is not out of a sense of trying to balance out the money that different causes get for
[17:41.280 --> 17:43.000] their own sake.
[17:43.000 --> 17:46.640] It's the idea of picking low-hanging fruit.
[17:46.640 --> 17:52.440] If there are millions of people and billions of dollars already going into a cause, then
[17:52.440 --> 17:57.480] chances are that the most cost-effective, easiest things you could do to do a lot of
[17:57.480 --> 17:59.160] good have already been taken.
[17:59.160 --> 18:03.680] If you're not highly, highly specialized or if you're not taking a dramatically different
[18:03.680 --> 18:09.160] approach than everyone else in the field, you will be working on the toughest problems,
[18:09.160 --> 18:13.080] the ones that all those millions of people and billions of dollars haven't yet been able
[18:13.080 --> 18:14.080] to solve.
[18:14.080 --> 18:18.600] So in a sense, neglectedness implies that a cause is more tractable.
[18:18.600 --> 18:22.680] If you're going into a smaller cause that's less established, that doesn't have much money
[18:22.680 --> 18:31.160] or funding, then you can lay foundational groundwork that has a multiplying impact because
[18:31.160 --> 18:33.680] other people can build on your work.
[18:33.680 --> 18:40.000] And because if you hadn't stepped in and helped that cause, it's highly unlikely that other
[18:40.000 --> 18:42.000] people would have filled that gap.
[18:42.000 --> 18:47.240] Whereas for a less neglected cause, that cause is likely to have been helped anyway.
[18:47.240 --> 18:52.280] Let's look at two broad causes to illustrate the idea of neglectedness.
[18:52.280 --> 18:55.600] The first is human welfare in the United States.
[18:55.600 --> 18:58.680] The second is farm animal welfare.
[18:58.680 --> 19:01.360] Let's look at some numbers.
[19:01.360 --> 19:07.400] There are seven times as many farm animals in the United States as there are humans.
[19:07.400 --> 19:13.520] But 97% of donations go to organizations that aid humans.
[19:13.520 --> 19:22.120] Of the other 3% that goes to animals, the vast majority goes to pet shelters.
[19:22.120 --> 19:29.000] So farm animals are vastly more neglected than both humans and pet animals.
[19:29.000 --> 19:36.160] And there's a good case to be made that they suffer a lot more than pet animals.
[19:36.160 --> 19:39.600] And here's a very important point.
[19:39.600 --> 19:42.960] Neglected causes are going to look like bad causes.
[19:42.960 --> 19:44.400] They're going to look weird.
[19:44.400 --> 19:46.080] They're going to look like fiction.
[19:46.080 --> 19:49.520] They're going to look a lot less important than other causes.
[19:49.520 --> 19:52.440] They're going to look a lot less tractable than other causes.
[19:52.440 --> 19:55.560] That's why they're neglected.
[19:55.560 --> 19:58.800] The analogy to investing is really appropriate here.
[19:58.800 --> 20:03.000] This is what Microsoft looked like in 1978.
[20:03.000 --> 20:06.480] Would you have invested in them?
[20:06.480 --> 20:08.040] Most people didn't.
[20:08.040 --> 20:11.680] And now they're worth $290 billion.
[20:11.680 --> 20:17.600] The key to being a good investor and to being a good altruist is to dig past the first impressions
[20:17.600 --> 20:25.560] and actually do the research so you can be the one that makes the bet that pays off.
[20:25.560 --> 20:31.360] So we want to look for causes that are highly important, highly tractable, and highly neglected
[20:31.360 --> 20:34.520] relative to their importance and tractability.
[20:34.520 --> 20:40.440] That already narrows down the space of causes a lot, no matter what your values are.
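
One rough way to make this framework concrete is to score each cause on the three criteria and compare. The sketch below assumes a simple multiplicative score (importance as scale times severity, as defined earlier, weighted by tractability and neglectedness); the combination rule and all of the numbers are hypothetical placeholders, not the speaker's estimates.

# Toy cause-prioritization sketch based on the importance / tractability /
# neglectedness framework from the talk. Causes and scores are placeholders.
from dataclasses import dataclass

@dataclass
class Cause:
    name: str
    scale: float          # how many people or animals the problem affects
    severity: float       # how badly it affects each one (0-10)
    tractability: float   # how much progress extra resources buy (0-10)
    neglectedness: float  # how under-resourced the cause is (0-10)

    def importance(self):
        # The talk defines importance as the product of scale and severity.
        return self.scale * self.severity

    def score(self):
        # One possible way to combine the three criteria; other rules exist.
        return self.importance() * self.tractability * self.neglectedness

causes = [
    Cause("Cause A", scale=1e7, severity=8, tractability=6, neglectedness=2),
    Cause("Cause B", scale=1e6, severity=7, tractability=5, neglectedness=9),
]

for cause in sorted(causes, key=lambda c: c.score(), reverse=True):
    print(f"{cause.name}: {cause.score():.2e}")
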
[20:40.440 --> 20:43.480] But does this mean that there's one true cause?
[20:43.480 --> 20:48.680] Was all of this a lead up for me to tell you that the thing you should be working on is
[20:48.680 --> 20:53.760] X and you can just turn your brain off and go do that for the rest of your life?
[20:53.760 --> 20:54.760] Absolutely not.
[20:54.760 --> 20:57.800] This is not a solved problem.
[20:57.800 --> 21:02.880] And even though a lot of effective altruists use this framework or something like it to
[21:02.880 --> 21:08.800] find the causes that they're working on, there's still a huge amount of variation.
[21:08.800 --> 21:10.080] We have different values.
[21:10.080 --> 21:11.560] We have different beliefs.
[21:11.560 --> 21:15.480] We have different assumptions and different priorities.
[21:15.480 --> 21:22.960] So let's look at some of the causes that are very popular in the effective altruism movement.
[21:22.960 --> 21:29.880] Here are the three causes: global poverty, animal welfare, and reducing existential risk
[21:29.880 --> 21:32.320] or global catastrophic risk.
[21:32.320 --> 21:34.920] I will go through these one by one.
[21:34.920 --> 21:39.800] In our first block of the conference, which has been shifted over a little
[21:39.800 --> 21:44.180] because of the delays, we'll be focusing on these causes.
[21:44.180 --> 21:49.280] So for global poverty, like I said before, over a billion people in this world are living
[21:49.280 --> 21:53.200] on less than $2.50 a day.
[21:53.200 --> 21:58.480] Many of those people are affected by easily preventable diseases that we've
[21:58.480 --> 22:03.160] mostly eliminated in our countries.
[22:03.160 --> 22:08.720] Anjali Gopal will be talking to you about this right after my talk, whenever that ends,
[22:08.720 --> 22:16.440] somewhere other than 10 Evans; we'll tell you where.
[22:16.440 --> 22:18.000] So that's global poverty.
[22:18.000 --> 22:19.800] Next we have farm animal welfare.
[22:19.800 --> 22:25.320] JC Rees in the back is going to be talking to you about effective animal activism.
[22:25.320 --> 22:30.760] He's going to do his talk twice, once right after this one and once at probably around
[22:30.760 --> 22:34.280] 10:40 in this room, 3 Evans.
[22:34.280 --> 22:40.040] There are three times as many farm animals as there are humans in the world.
[22:40.040 --> 22:45.640] And the vast majority of them are kept in industrial farming operations where they live
[22:45.640 --> 22:49.280] very, very short, very, very painful lives.
[22:49.280 --> 22:52.720] And it can actually be extremely cheap to help them.
[22:52.720 --> 22:55.800] So JC will tell you more about that.
[22:55.800 --> 23:01.880] And the last cause is existential risk reduction or global catastrophic risk reduction.
[23:01.880 --> 23:07.280] A global catastrophic risk is a situation where if it happened, it could dramatically
[23:07.280 --> 23:14.000] derail human civilization or lead to the extinction of humanity or life on earth in general.
[23:14.000 --> 23:21.880] This might be something like a US-Russia nuclear war, a bioterrorist attack, a pandemic, or
[23:21.880 --> 23:25.440] risks from an artificial intelligence going wrong.
[23:25.440 --> 23:31.080] Jeff Alexander will talk to you about that in the second block, after Anjali, in whichever
[23:31.080 --> 23:34.880] room Anjali talks in.
[23:34.880 --> 23:39.600] So these are the three causes that are the most popular in effective altruism, but they're
[23:39.600 --> 23:42.360] not by any means the only causes.
[23:42.360 --> 23:46.760] Many people, for example, believe that opening our borders to immigration is the most effective
[23:46.760 --> 23:50.360] way that we can fight poverty all over the world.
[23:50.360 --> 23:55.720] Others believe that educating people who are already altruistically minded on rationality
[23:55.720 --> 24:02.840] and decision making is the best way that they can have a leveraged impact on the world.
[24:02.840 --> 24:08.120] And these are only our current best guesses.
[24:08.120 --> 24:12.360] The point of effective altruism is not to come in, pick one of these three tracks based
[24:12.360 --> 24:16.640] on what seems interesting to you and just defend that to the death.
[24:16.640 --> 24:21.320] Causes will change over time within a person, within the movement as a whole, and will differ
[24:21.320 --> 24:23.240] from person to person.
[24:23.240 --> 24:29.680] So it's all about learning from each other.
[24:29.680 --> 24:32.440] So effective altruists don't all support the same cause.
[24:32.440 --> 24:37.320] And if we don't support the same cause through our whole life, then what the hell do we have
[24:37.320 --> 24:38.320] in common?
[24:38.320 --> 24:42.240] How do you do it?
[24:42.240 --> 24:47.440] So the first point is to support the cause that you currently think is the best cause.
[24:47.440 --> 24:49.960] We've talked about this a lot.
[24:49.960 --> 24:52.080] It means not going with your gut.
[24:52.080 --> 24:56.300] It means not just giving to the first organization that asks for your money.
[24:56.300 --> 24:59.960] It means not just working for the first non-profit where you get a job.
[24:59.960 --> 25:05.480] It means taking a step back, figuring out your values, and figuring out where you can
[25:05.480 --> 25:08.840] have the most impact.
[25:08.840 --> 25:11.680] But it also means challenging your beliefs.
[25:11.680 --> 25:15.400] Because even though you're working on the cause that you currently think is best, no
[25:15.400 --> 25:18.720] one person has it all figured out.
[25:18.720 --> 25:22.600] So if some person at this conference disagrees with you about what the most important thing
[25:22.600 --> 25:27.960] to work on is, don't just smile and agree to disagree.
[25:27.960 --> 25:30.440] You're on the same side.
[25:30.440 --> 25:32.120] Ask them to convince you.
[25:32.120 --> 25:37.800] You are both trying to figure out how you can best allocate your limited, limited resources
[25:37.800 --> 25:40.320] to do the most good.
[25:40.320 --> 25:45.040] You're on a team, so try to learn from each other.
[25:45.040 --> 25:49.760] And the last is expanding your compassion.
[25:49.760 --> 25:55.120] A lot of people got into effective altruism, myself included, because we realized we didn't
[25:55.120 --> 25:58.160] just care about ourselves and our immediate community.
[25:58.160 --> 26:03.360] We cared, at least in some sense, about all humans.
[26:03.360 --> 26:07.760] I realized this when I was eight years old and I gave the ice cream cone to that old
[26:07.760 --> 26:08.760] lady.
[26:08.760 --> 26:15.160] And I realized it again when I was 14 and I read Peter Singer for the first time.
[26:15.160 --> 26:22.600] And I turned to effective altruism to ask what it should mean to care about all humans.
[26:22.600 --> 26:24.360] How should I change my behavior?
[26:24.360 --> 26:28.880] What should I be doing if I really want to put that principle into practice, caring about
[26:28.880 --> 26:30.720] all of humanity?
[26:30.720 --> 26:37.640] And this is why most effective altruists think that global poverty is such a pressing cause.
[26:37.640 --> 26:42.840] Because when you let all of humanity into your bubble, when you let all of humanity
[26:42.840 --> 26:47.760] into the circle of people that you're caring for, a lot of the concerns that are very local
[26:47.760 --> 26:54.080] to us, while they're very real, kind of become overwhelmed by the sheer numbers of people
[26:54.080 --> 26:58.000] that are outside of our immediate community.
[26:58.000 --> 27:04.000] The thing about being involved in this community is that you never stop caring.
[27:04.000 --> 27:09.480] Most effective altruists care, at least somewhat, about animals, non-human animals, as well
[27:09.480 --> 27:10.480] as humans.
[27:10.480 --> 27:14.200] Although most of us struggle with just how much we're supposed to weight the interests
[27:14.200 --> 27:17.200] of animals against the interests of humans.
[27:17.200 --> 27:22.880] Those of us who have reflected on it and think that we do care have often gone vegetarian
[27:22.880 --> 27:24.160] or vegan.
[27:24.160 --> 27:28.800] And many effective altruists support reducing factory farming for the same reasons that
[27:28.800 --> 27:31.760] others support ending global poverty.
[27:31.760 --> 27:37.400] Because when we let all of these animals, tens of billions, into the circle of things
[27:37.400 --> 27:44.040] that we care about, then a lot of human concerns get pushed aside out of sheer numbers, out
[27:44.040 --> 27:49.360] of the sheer suffering that most of these animals endure.
[27:49.360 --> 27:54.680] And most effective altruists also care about future humans and animals, care about how
[27:54.680 --> 27:59.120] our actions affect generations to come.
[27:59.120 --> 28:04.040] And again, most of us struggle with how to weight people who will be alive
[28:04.040 --> 28:06.480] in the future against people who are alive today.
[28:06.480 --> 28:08.160] It's not a solved problem.
[28:08.160 --> 28:13.960] But those of us who decide that we care a lot about future beings often work on
[28:13.960 --> 28:20.840] reducing existential risk, because if humanity goes extinct or if civilization
[28:20.840 --> 28:29.800] is derailed, then billions, trillions of future people may not come into existence or may
[28:29.800 --> 28:32.920] live worse lives.
[28:32.920 --> 28:38.280] I have by no means achieved global empathy.
[28:38.280 --> 28:43.560] Most of my day is spent thinking about myself, and I care way more about my friends than
[28:43.560 --> 28:45.840] I care about people living far away.
[28:45.840 --> 28:49.840] I care way more about humans than I care about animals.
[28:49.840 --> 28:55.880] And I can barely plan for the next five years, much less the next 5,000.
[28:55.880 --> 29:02.720] But the thing about being in this community is that no one ever lets me get away with
[29:02.720 --> 29:06.560] just stopping and packing up and going home.
[29:06.560 --> 29:18.360] I'm always being challenged to grow, to care more, to think more deeply, and to do more.
[29:18.360 --> 29:23.520] So our second block of the conference, which will happen probably around noon, will be
[29:23.520 --> 29:30.400] talking about the mental skills that we can use to get better at this.
[29:30.400 --> 29:35.440] Duncan Sabien, an instructor at the Center for Applied Rationality, will be talking to
[29:35.440 --> 29:39.440] us about breaking out of our mental comfort zones and being able to better change our
[29:39.440 --> 29:43.880] minds and better change the minds of other people productively.
[29:43.880 --> 29:50.400] Stephen Frey, the co-founder of the neurotech working group NeurotechX, will be talking
[29:50.400 --> 29:57.280] to us about how we can navigate the seemingly infinite space of possibilities to make smart
[29:57.280 --> 30:02.920] decisions that will work out well in a number of different situations.
[30:02.920 --> 30:11.040] Nate Soares, who I'm personally very excited about, is going to be talking to us about
[30:11.040 --> 30:17.040] some of our emotional frailties, about how we can burn out, about how we can motivate
[30:17.040 --> 30:24.800] ourselves to do good through guilt and self-criticism and self-hate, and how that's unsustainable,
[30:24.800 --> 30:31.280] and how we can break out of that and find internal motivation to positively, actively
[30:31.280 --> 30:36.440] be excited about doing more good.
[30:36.440 --> 30:38.200] Let me tell you a little secret.
[30:38.200 --> 30:41.480] I say that I'm an effective altruist.
[30:41.480 --> 30:46.400] That just means a person trying to be effective at altruism.
[30:46.400 --> 30:47.720] This is the ideal.
[30:47.720 --> 30:50.680] You figure out how to do the most good and then you do it.
[30:50.680 --> 30:54.840] But in reality, you figure out how to do a little more good than you're doing today,
[30:54.840 --> 30:57.160] and then you work on that.
[30:57.160 --> 31:03.160] And as you act on what you think is the best thing to do right now, your values change,
[31:03.160 --> 31:07.240] your beliefs change, you learn more, you grow more, and you figure out how to do a little
[31:07.240 --> 31:12.480] more good, and you keep going.
[31:12.480 --> 31:14.840] We're human.
[31:14.840 --> 31:18.440] Sometimes we won't do as much good as we're planning to do.
[31:18.440 --> 31:20.800] Sometimes we'll be selfish.
[31:20.800 --> 31:22.440] Sometimes we'll take a break.
[31:22.440 --> 31:25.400] Sometimes we'll make bad decisions.
[31:25.400 --> 31:31.520] But it's important not to let the fact that you can't meet your own high standards mean
[31:31.520 --> 31:34.600] that you should just lower your standards.
[31:34.600 --> 31:39.920] Being consistent is not the highest virtue in the world.
[31:39.920 --> 31:47.880] You can do so much more good by aiming high and then falling short than you could by deciding
[31:47.880 --> 31:53.920] that you're just a bad person and you'll pack up and give up and go home.
[31:53.920 --> 31:59.200] Our keynote speaker, Larissa MacFarquhar, has written a beautiful book that I think every
[31:59.200 --> 32:06.080] single one of you should read called Strangers Drowning about what it's like to be an ordinary
[32:06.080 --> 32:13.400] human trying really, really hard to do an extraordinary amount for other people.
[32:13.400 --> 32:20.920] She's going to talk to us about how our world often sneers at do-gooders like that, how
[32:20.920 --> 32:27.440] society often jumps at the chance to accuse someone trying to do a lot of good of hypocrisy
[32:27.440 --> 32:33.760] or failing to meet their own standards, as if that were the only measure that mattered.
[32:33.760 --> 32:37.840] And she's going to talk about why those people keep trying.
[32:37.840 --> 32:43.000] So if you're facing criticism like that, if someone says, oh, you're a vegetarian, why
[32:43.000 --> 32:44.000] aren't you a vegan?
[32:44.000 --> 32:48.600] Oh, you're giving 10%, why aren't you giving 20%?
[32:48.600 --> 32:50.560] Keep trying.
[32:50.560 --> 32:54.040] The people you're trying to help will thank you even if other people roll their eyes at
[32:54.040 --> 32:57.320] you.
[32:57.320 --> 33:05.840] So these principles get us pretty far, but we have a long way to go.
[33:05.840 --> 33:14.360] Peter Singer may have shattered my happy, peaceful life when I read that book.
[33:14.360 --> 33:20.040] He may have brought back all of these ugly emotions that I wanted to hide away, the emotions
[33:20.040 --> 33:25.840] that bubble up when you notice the unfairness and sadness in the world.
[33:25.840 --> 33:29.680] But he also gave me the tools to fight that unfairness.
[33:29.680 --> 33:33.080] He gave me a sense of resolve.
[33:33.080 --> 33:40.840] And he introduced me to a movement, a community that has my back, that supports me as I try
[33:40.840 --> 33:52.080] to improve, and that truly gives me so much hope for the world.
[33:52.080 --> 33:53.880] And that's what I want to leave you with.
[33:53.880 --> 33:56.840] The world is getting better.
[33:56.840 --> 34:00.600] There are solutions to the world's toughest problems.
[34:00.600 --> 34:03.160] We don't have to hide from them.
[34:03.160 --> 34:05.900] They may be difficult, but we're smart.
[34:05.900 --> 34:09.880] They may take a while, but we're determined.
[34:09.880 --> 34:14.320] So please keep that in mind as you listen to these people in the first half of our conference
[34:14.320 --> 34:16.320] this morning.
[34:16.320 --> 34:22.360] Please try to learn from them, because there really are actionable things that we can do.
[34:22.360 --> 34:28.080] I'm going to leave you with a statistic that's just the beginning of what I hope will be
[34:28.080 --> 34:31.920] your journey as effective altruists, if you're not on that journey already.
[34:31.920 --> 34:37.840] It takes less than $3,000 to save a human life.
[34:37.840 --> 34:41.600] So yes, we can do it.
[34:41.600 --> 34:48.000] The world is getting better, and it's because ordinary people like you decided to do something
[34:48.000 --> 34:50.200] extraordinary.
[34:50.200 --> 34:51.860] So thank you for coming.
[34:51.860 --> 34:54.480] Thank you for being patient with our technical difficulties.
[34:54.480 --> 34:55.480] Thank you for waking up early.
[34:55.480 --> 34:58.040] And thank you for stepping up.
[34:58.040 --> 34:59.040] Have a wonderful conference.
[34:59.040 --> 35:09.040] Thank you.