Notes from CoEd Ethics, London 2018

CoEd:Ethics

Introduction by Anne Currie (@anne_e_currie)

  • Why are we here?
    • Is there a problem with ethics in and around technology use at the moment?
    • If there is an issue, what can we as engineers do about it?
      • What might stop me?
      • How can I get around that?
  • There is a workbook for the conference
    • It looks a bit like the ones DareConf used to do
  • Ed: this is quite exciting. It feels like the start of something. Let’s see how the day goes

“When Data Kills” – Cori Crider

Introduction

  • Human Rights Lawyer
  • Human rights lawyers tend to fly around the world and talk to people about what happened to them
    • not a lot of traditional lawyering in a courtroom

What are the most egregious violations at the moment in the world?

  • Weaponised AI: Here and Now

Outline

  • Talking about what led to the US dropping a Hellfire missile on a Yemeni wedding in 2012
  • Revenge of the Nerds
  • What does the future hold?

Actors

  • Faisal bin Ali Jaber
    • Environmental Engineer
    • Met Cori and said “I want to talk to you about what happened at the wedding of my eldest son”
    • Handed over a hard drive
    • Yemeni weddings are over multiple days
  • Salem bin Ali Jaber
    • an imam
    • preaching against extremism and Al Qaeda
  • Waleed bin Ali Jaber
    • Only policeman in the village
    • Salem’s nephew

Timeline

  • Friday before the wedding, Salem did a sermon against Al Qaeda
  • After the main day of the wedding, 3 young men (aged 16 to 18) come to the village
    • they wanted to talk to Salem
    • people weren’t sure what they wanted
    • Salem and Waleed went to meet them
  • Waleed, Salem, and 3 men were blown up by a Hellfire missile

Images of the aftermath

  • We’re only being shown appropriate images
  • the hard drive contains images of bodies (or what was left of them)

The ripple effect of these types of bombing

  • “The resentment created by American use of unmanned strikes … is much greater than the average American appreciates. They are hated on a visceral level, even by people who’ve never seen one or seen the effects of one.”
    • Ret General Stanley McChrystal
    • Great recruiting tool for Al Qaeda

Faisal went to Washington

  • Met people in the Obama White House
  • No apology
  • 7 months later, an envelope of $100,000 was presented to him
    • to help the families who had lost income

Signature Strike

  • “We kill people based on metadata”
    • Michael Hayden, NSA Director, CIA Director
  • these are called a signature strike
    • Selecting, targeting and killing people based on patterns of life, defined by an algorithm
  • The CIA sees 3 guys doing jumping jacks in the desert and thinks it’s a terrorist training camp

Snowden revealed these systems

  • Skynet
    • Yes, as per Terminator
    • evidently the NSA has a sense of humour
  • We have an NSA slidedeck which talks about it
  • performs courier detection via Machine Learning
    • how it works
      • The system takes 7 supposedly known couriers and trains against them
        • JFC. 7!?
        • that seems like a really small dataset to use for training
    • how does it perform?
      • Tags an Al Jazeera journalist as a militant, because he has funny travel patterns and talks to shady people (see the base-rate sketch below)
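
Ed: a back-of-the-envelope sketch of why a classifier trained on only 7 known positives and run over a whole population is worrying: even a seemingly low false-positive rate swamps the handful of genuine targets. Every number below is an assumption made up for illustration; the leaked slides do not publish error rates.

```python
# Hypothetical base-rate illustration. ALL numbers are assumptions chosen only
# to show the shape of the problem, not figures from the NSA slide deck.
population = 55_000_000      # assumed number of monitored mobile-phone users
true_couriers = 100          # assumed number of genuine couriers among them
false_positive_rate = 0.001  # assumed: 0.1% of innocent people get flagged
true_positive_rate = 0.5     # assumed: half of the real couriers get flagged

flagged_innocent = (population - true_couriers) * false_positive_rate
flagged_real = true_couriers * true_positive_rate
precision = flagged_real / (flagged_real + flagged_innocent)

print(f"innocent people flagged: {flagged_innocent:,.0f}")       # ~55,000
print(f"real couriers flagged:   {flagged_real:,.0f}")           # 50
print(f"chance a flagged person is a courier: {precision:.2%}")  # ~0.09%
```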

What has the effect been?

  • 100s of people have been killed in the Drone Wars
  • See the Bureau of Investigative Journalism
    • ~ 1000 civilians have been killed in this way

An observation on the current serving president - Donald Trump

  • Rules are looser now
  • 6,000 civilian casualties in Iraq and Syria last year alone
  • Trump was told that drone operators waited for the wife and children of a target to move out of range
    • “Why did you wait?”
  • That feels like a worrying escalation to target families

The takeaway: people use algorithms to kill people

  • What about killer robots?
    • The CIA and special forces like algorithms because they’re fast (versus doing investigation like they used to)
    • But are they accurate?
    • The claim – “We’re killing these sons of bitches faster than they can grow them”
      • That doesn’t stand up. McChrystal said it.
      • It’s getting people lining up to join Al Qaeda

Revenge of the Nerds aka Google: Project Maven

  • The Algorithmic Warfare Cross-Functional Team
  • Used Google’s AI to process drone feed images
  • Google employees protested it internally
  • Got leaked to media
  • Media hammered Google
  • Google said it was for non-offensive uses
    • we know this stuff is hard
    • images of black people labelled as monkeys
    • Distinguishing a journalist with a camera from a terrorist with a Stinger
  • 3,000 GOOG employees wrote an open letter criticising GOOG
  • a dozen staff resigned over the issue
  • GOOG caved

Are we seeing a Tech Ethics renaissance?

  • Worrying things are happening in tech
    • Amazon and facial Rekognition
      • nice Stasi kit there
    • Microsoft - ICE
  • AI is going to revolutionise things
  • We should absolutely consider how things will be used

What you can do

  • Know your power (4% of GOOG employees forced the issue)
  • Negotiate your work, not just your cash
  • Ask questions
    • Am I going to go to jail? (e.g. for fudging VW emissions tests)
  • Get help
    • Talk to people across society

coricrider.com

  • In Yemen, stories like Faisal’s are pretty common
  • We are responsible for fixing that

Questions

  • Should we try to encourage the best people to build the tools, so that they have a better ‘correct’ kill ratio?
    • Who is owning the tool?
    • How are they using it?
    • How might they use it?
    • Are you working for Dr Evil?
  • What about AI principles? See https://futureoflife.org/ai-principles/ Are the GOOG AI Principles enough?
    • They are more the beginning of a conversation
  • It feels like we’re really dependent on whistleblowers (Snowden, leaks to the press etc)?
    • Collective bargaining
    • Unions!
    • Most people don’t know how to / can’t defend against a determined nation-state attacker wanting to root out whistleblowers
    • Looking at how dissidents in the 70s were arrested in the GDR or similar, they would arrest people, then arrest all of the people in their address book. That’s only got faster with modern technology.

“Data Citizens: why we should all care about data ethics” - Dr Caitlin McDonald

Is data ethics distinct from ethics?

  • We can draw on frameworks from civics
  • We do not need to invent / discover everything from scratch.
  • In civic life, citizens have mechanisms for influencing law-makers
    • Voting
    • Lobbying
  • What are the equivalents for data science?
    • See Weapons of Math Destruction by Cathy O’Neil
  • Often data citizens cannot see or understand what the rules are
  • We have no recourse to challenging or influencing the rules.
  • Just because a thing is mathematical, it does not mean that it’s fair
  • Data scientists are making ethical decisions ALL THE TIME
    • They need to be aware of this and acknowledge it

The EU defines some models

Most data citizens are not data scientists

  • Data citizens can keep themselves informed as to what’s gone wrong, and how data can be used

Questions

  • If we want to think of data science more like we think of the law, how do we ensure that people don’t try to game the system if they know how it works? For example, credit scoring. Where are the incentives for technologists to act correctly?
    • I am not entirely sure that people need this. Most people don’t set out to be bad actors.

“Data Science in Action” – Emma Prest & Clare Kitching

Introduction

  • Emma is from DataKind UK
    • DataKind UK do pro-bono data science for charities
  • Clare has done some work with DataKind
  • Going to talk about one of the projects they’ve done for a client

Examples of data science gone wrong

  • Order substitution algorithm
    • Substituted a sack of potatoes for Prosecco
  • Pothole detection app for Boston
    • App was downloaded and used by affluent population, so they got the best roads
  • Samaritans Radar app
    • Picking up on sentiment in tweets
    • Google it

What are the ethical questions here?

  • Intention
  • Data and algorithms

Client story

  • A small foodbank
  • Trying to identify who needs additional support
  • Who is dependent on the foodbank?

What if?

  • What could go wrong?
    • We wrongly predict who needs extra support
      • that doesn’t seem like a massively bad thing
      • presumably humans would spot and correct it
    • The model is implemented without any human intervention
      • in 3 years time, everyone involved has left and the algorithm is in charge!
      • the people receiving support have social services contact, so that should address that
    • What happens if the model is used to ration foodbank support?
      • how could someone misuse my tool?

Their Data

  • full set of data for the last 3 years. It contains:
    • Lots of personally identifiable data
    • Reason for current referral
    • Historical referrals
    • Referral pattern (derived from the above)

The framing question

  • What would this look like in the Daily Mail?
    • not again
    • policy making by fear of the DM
  • Took action to review sensitive variables
    • Gender
    • Country of birth
  • Reviewed completeness and relevance (see the sketch after this list)
    • are they properly populated in the entire data set?
      • no, not for a lot of things
    • How does removing them from the model change the predictions?
      • it was fine; the model still performed well without them
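
Ed: a minimal sketch of the kind of check described above: compare the model’s performance with and without the sensitive variables, and look at how well populated those fields are. The file name, column names (`gender`, `country_of_birth`, `needs_support`) and choice of classifier are hypothetical placeholders, not the foodbank’s real data or DataKind’s actual pipeline.

```python
# Hypothetical sketch: does dropping the sensitive variables change the model?
# File name, column names and model choice are placeholders for illustration.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

df = pd.read_csv("referrals.csv")           # assumed: one row per referral
sensitive = ["gender", "country_of_birth"]  # the variables under review
target = "needs_support"

# How well populated are the sensitive fields across the whole data set?
print(df[sensitive].isna().mean())

y = df[target]
X_full = pd.get_dummies(df.drop(columns=[target])).fillna(0)
X_reduced = pd.get_dummies(df.drop(columns=[target] + sensitive)).fillna(0)

for label, X in [("with sensitive variables", X_full),
                 ("without sensitive variables", X_reduced)]:
    scores = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5)
    print(f"{label}: mean cross-validated accuracy {scores.mean():.3f}")
```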

What would we do differently next time?

  • discuss ethics at kickoff
    • sensitive data
    • worst-case scenarios
  • build bias assessment checking approach into both:
    • data
    • algorithm
  • build ethics into the implementation

Questions

  • How can you minimise bias?
    • Be explicit
    • Try to get different people involved

“Ethics: A psychological perspective” – Andrea Dobson

20 years ago, a group of friends had a thing about shoplifting

  • Tried it
  • Felt so much guilt and shame
  • No rush
  • Knew it was bad
  • Did it anyway

Why do good people sometimes make bad decisions?

Overview

  • 3 areas:
    • Conformity
    • Obedience
    • What can be Done?
  • Most of what we know about these is from unethical studies done in the 1950s :)

Conformity

  • Visual test comparing line lengths
  • Solomon Asch was a social scientist back in the 50s
  • StackOverflow developer survey
    • There was a question: “How would developers report ethical problems with code?”
    • ~76,000 responses
    • Answer split:
      • Depends – 46.6%
      • Yes, internally – 35.7%
      • Yes, public – 13.1%
      • No – 4.6%

Obedience

  • Do as you are told
  • https://en.wikipedia.org/wiki/Stanley_Milgram
    • American who watched the Nuremberg Trials
      • I was only following orders
  • Obedience to authority study
    • 65% administered the highest shock (which would have been fatal)
    • others have replicated the study with 80% obedience rates
  • Another StackOverflow survey question
    • Who is ultimately responsible for code that accomplishes something unethical?
      • think of the VW emissions cheating
    • ~64,500 responses
    • Answer split:
      • Upper management
      • person that came up with the idea
      • the developer who implemented it

What can be done?

  • Being aware of conformity helps
    • Bad examples from CEOs, presidents can trickle down

What is your personal stance?

  • You need to define your own line in the sand
  • GOOG Project Maven example from Cori’s talk
  • Speak up
    • Needs courage
    • 300 BC: Aristotle said fear is an innate part of courage
    • Courage is persistence in the face of fear
  • It can make us happier

Questions

  • How can we distinguish between good and bad conformity? For example, listening respectfully to your talk and applauding at the end
    • Emotional signals
    • Do you feel guilt/shame?
  • How can we create environments where people feel comfortable speaking up?
    • Psychological safety is essential if you want to grow.
    • Leaders need to be aware of this.
    • Explain to people that you also don’t know where you are going. It’s a shared journey.

“Thinking Ethically at Scale” – Yanqing Chen

  • We care about ethics because we want the world to be a better place
    • We want to appear to be good people

Many of us create technology with a positive impact

  • medical technology
  • educational resources
  • autonomous vehicles
    • 1.3 million people die from road traffic injuries every year

Create technology quickly and do it well

  • Another reference to Google Project Maven
    • If you’re a software engineer at a major tech company, you have 1/12th of the power needed to cancel a contract with the military

Why can’t we have both?

  • Feels morally good, but makes the world worse?
  • Makes the world better, but doesn’t feel morally good?

Scope insensitivity

  • Psychological bias
  • A Study:
    • Donations for saving birds were flat, even as the number of birds involved varied by 2 orders of magnitude
      • We cannot imagine large numbers

Effective Altruism

  • How can we best use our limited resources to help others?
  • The QALY approach
    • What is a QALY worth?
    • NHS is prepared to spend about £20,000 per QALY
      • ~ £2.30 per hour of healthy life
    • Can we gain a QALY for less?
      • YES!
  • What does a QALY cost? (see the worked comparison below)
    • Looking at HIV treatments
      • Antiretroviral therapy – 2 QALY/£1000
      • Prevention of transmission during pregnancy – 8 QALY/£1000
      • Distribution of condoms – 20 QALY/£1000
  • How do you maximise the effect of your personal donations?
    • Giving What We Can
    • GiveWell
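
Ed: a small worked comparison using the figures quoted above (the ~£20,000-per-QALY NHS threshold versus the HIV intervention numbers). Treat it as illustrative arithmetic only, not a health-economics model.

```python
# Worked comparison using the figures quoted in the talk.
NHS_THRESHOLD_PER_QALY = 20_000  # £ the NHS is roughly prepared to pay per QALY
hours_per_year = 365 * 24
print(f"NHS threshold: ~£{NHS_THRESHOLD_PER_QALY / hours_per_year:.2f} "
      f"per hour of healthy life")                    # ~£2.28 per hour

# QALYs gained per £1,000 spent, for the HIV interventions mentioned above
interventions = {
    "antiretroviral therapy": 2,
    "preventing transmission during pregnancy": 8,
    "condom distribution": 20,
}
for name, qalys_per_1000 in interventions.items():
    print(f"{name}: ~£{1_000 / qalys_per_1000:,.0f} per QALY")
# Each of these is far below the £20,000 threshold, which is the point:
# a QALY can be gained for much less.
```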

An Example of Effective Altruism

  • Decreasing incidence of malaria in Africa
  • Malaria is no longer the largest killer in Africa
  • Bed nets have been really effective
  • Cost of saving a child’s life has gone up, because we’ve been so effective with bed nets
  • => The most effective actions are not always the most intuitive

Another example: No Lean Season (Google it)

  • https://www.evidenceaction.org/beta-no-lean-season/#intro-no-lean-season/
    • No Lean Season reduces the negative effects of seasonality on the poorest in rural agricultural areas by enabling labor mobility that increases incomes. It is a new program that we are testing in Evidence Action Beta’s portfolio, based on rigorous experimental evidence.
    • We give a travel subsidy of $20 to very poor rural laborers so they can send a family member to a nearby city to find a job during the period between planting and harvesting. This is the time in rural areas when there are no jobs, no income, and when families miss meals. This seasonal poverty affects 600 million people around the world.
    • With a temporary job during this ‘lean season,’ households are able to put an additional meal on the table for every member of the family each and every day. That’s 500 additional meals during the lean season.
  • A ticket out of seasonal poverty
  • Move people out of rural communities to drive rickshaws in the city seasonally
  • Gives them bus tickets to do that
  • This is 5 times more effective at keeping people out of poverty than giving them food
  • It also has network effects – they tell their friends that it’s working for them

The impacts of our actions

  • We care about ethics because we want the world to be a better place
  • We have to take action to make this happen
  • How can you spot opportunities in your daily life?
    • Like the No Lean Season matchmaking for jobs and inactive workforce

One approach: Who How What

  • Asks the questions:
    • Who can have an impact on your goal?
    • How can they help/obstruct?
    • What can you do?
  • => Find the set of whats that gives you the biggest impact for the largest set of people

Closing thoughts

  • We’re trying to make the world a better place
  • Heroic Responsibility
    • Harry Potter and the Methods of Rationality fanfic
    • You’ve got to get the job done no matter what

Questions

  • How does Effective Altruism ensure that they avoid bias and aren’t playing God?
    • They are a young movement, but they are generally having a positive impact

“Ethical Design” – Harry Trimble

Introduction

  • Outline
    • Recognising Power
    • Applying Ethics
    • Languages, tools
  • Services should respect our rights
  • It’s getting harder to tell design and software development apart
    • Collaboration is happening more, lines are being blurred
    • Is the distinction useful any more?

Software is politics now

  • Wor Richard Pope
  • Designers have gone where the money/power is
    • 60s - advertising
    • 90s - consumer electronics
    • now - software

Designers as power brokers

  • Wor Stephen McCarthy

Design is a position of power

  • Are you making things better?
  • Are you helping incumbents hoard more power?

Design education’s big gap

  • Understanding the role of power
    • George Aye

A personal experience

  • Worked on design in government for
    • driving licensing
    • childminder permits
  • Was asked to work on design for an interface for doing weapons exports
    • Told boss not comfortable
  • Ed: I disagree with Harry’s claim that weapons export is unethical
    • Is exporting to rebels fighting oppressive regimes unethical?
    • Is exporting to Saudi Arabia/Syria?
    • It’s nuanced (maybe!)

Experience at the NHS

  • Worked on helping people recovering from gambling addiction to track their progress
  • Track it, motivation
  • Design ethics seemed reasonable
  • Data ethics / privacy seemed less so

Projects By If example

  • Prototype for telecom bills
    • Bills box reads a barcode at the bottom of your utility bill
    • Add new housemates to the bills
  • Interesting modelling around:
    • informed consent
    • permissions for groups

Exploring consent and data minimisation for Oxfam

  • Your Work Counts service
    • Used in the middle east
    • There is often a power relationship at play when sharing data/giving consent
  • Design questions explored:
    • Can you delegate consent to a 3rd party/expert?
    • Can you give proof that something happened?
    • Example of inadvertent data capture
      • Photo of a water pipe in a village
      • Also contains people’s faces

Machine Intelligence

  • Increasingly entering important areas of our lives
    • Finding a job
    • managing your finances
    • getting medical treatment
  • If gave evidence to UK Parliament about how this might work

Another Projects By If prototype

  • Something that looks very like Universal Credit (UC)
    • Here are your benefits
    • You have been sanctioned for these reasons
    • Captures an audit trail of the versions of the machines/software involved in that decision (see the sketch below)
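
Ed: a hypothetical sketch of the kind of audit record such a prototype might capture for each automated decision: which model, software and rule versions were involved. Field names and example values are illustrative assumptions, not Projects by IF’s actual design.

```python
# Hypothetical audit-trail record for an automated benefits decision.
# Field names and example values are illustrative, not the prototype's schema.
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json

@dataclass
class DecisionAuditRecord:
    decision_id: str
    outcome: str                 # e.g. "sanction applied"
    reasons: list[str]           # the reasons shown to the claimant
    model_version: str           # version of the model that scored the case
    software_version: str        # version of the service that ran it
    ruleset_version: str         # version of the policy rules in force
    decided_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

record = DecisionAuditRecord(
    decision_id="example-001",
    outcome="sanction applied",
    reasons=["missed appointment recorded"],
    model_version="risk-model 2018.07.1",
    software_version="decisions-service 4.2.0",
    ruleset_version="sanctions-policy 2018-06",
)
print(json.dumps(asdict(record), indent=2))
```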

Languages, tools

  • We need a shared language
    • Names have power
    • Giving a thing a name and defining it helps us have better conversations
    • Data Permissions Catalogue
      • Verifiable Proof example
      • Proximity Sharing
      • Decision Testing

“A Responsible Dev Process” – Sam Brown & Adam Samdoz

  • What are the questions I need to be aware of for ethical development?
  • How does that apply to my daily job?

Facebook is selling all the data

Breaking Silos

  • development
  • operations
  • management
    • really blame heavy culture
    • lots of finger-pointing
  • people need to be aware of the wider context in which they operate
    • Ed: Systems Thinking, innit?

Multi-disciplinary teams - aka Devops and stuff

  • one empowered team
  • team is responsible for everything
    • including ethical decisions
  • Great Responsibility!

But how do you know what to do?

  • Responsible Technology from doteveryone
  • Responsible Technology considers the social impact it creates and seeks to understand and minimise its potential unintended consequences
    • Do not knowingly create or deepen existing inequalities
    • Recognise and respect dignity and rights
    • Give people confidence and trust in their use

Created the 3C model

  • The model:
    • Context
      • looking beyond the individual user and taking into account the technology’s potential impact and consequences on society
      • see published diagrams
    • Continuity
      • ensuring best practice in technology that accounts for real human lives
      • Ed: I had this down as Consequence, but the blog post says Continuity?
    • Contribution
      • sharing how value is created in a transparent and understandable way
  • not an ethics bible
    • a model for responsible practice
  • Responsible Technology product assessment

Questions

  • your model is encouraging compliance activities which are expensive. It feels like SMEs won’t be able to compete with larger companies
    • what you have is a snapshot
    • you should be able to continuously improve and come back to it in the future

“Mitsuku” – Steve Worswick

What is a chatbot?

  • a computer program that responds like a real person
    • Siri
    • Cortana
    • Alexa
  • a demo with Mitsuku

History

  • used to be a dance/techno music producer
  • Scottish Clubland 3 compilation album

Competitions

  • wrote Mitsuku in 2005
  • started doing comps in 2010
  • Has won the Loebner Prize for the last 2 years
  • Siri placed 14th in 2013

Features

  • Global userbase
  • child friendly
  • something something Ex Factor?
    • Ed: I didn’t understand this reference. Some UK TV thing, apparently

Training

  • supervised learning
  • all content is created by Steve
    • no potty mouth
  • very time-consuming
  • unsupervised learning
    • chatbot learns from its users
  • Unsupervised feels unethical
    • would you structure a child’s learning, or just sit them in front of Google?
    • cf Microsoft Tay
      • Mitsuku’s attackers
        • trying to corrupt it like Tay

User population

  • Category A (abusive people): 30%
  • Category B (general users): 50%
  • Category C (academics and sceptics): 20%

How do you handle abuse in chatbots?

  • tame answers?
    • that tends to be seen as weak
  • overly aggressive responses (swear back at it)
    • that tends to escalate the situation
  • instead, Mitsuku has a warning system (sketched after this list)
    • 5 strikes => something like an IP ban
  • not all keywords are abusive
    • “I want to have sex with you”
    • “What sex are you?”
  • Outcome
    • by diverting abuse, users started to behave
    • they enjoyed the banter with the bot
    • humour worked well
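
Ed: a minimal sketch of the five-strike approach described above. The phrase list, thresholds and replies are placeholders; Mitsuku itself is written in AIML, not Python.

```python
# Minimal sketch of a strike-based abuse handler (placeholders throughout).
from collections import defaultdict

ABUSIVE_PHRASES = {"example insult"}   # placeholder; real matching has to be
                                       # smarter ("What sex are you?" is fine)
MAX_STRIKES = 5

strikes = defaultdict(int)             # warnings per user, keyed by e.g. IP
banned = set()

def fallback_reply(message: str) -> str:
    """Stand-in for the real chatbot engine."""
    return "Tell me more."

def handle_message(user_id: str, message: str) -> str:
    if user_id in banned:
        return ""                      # ignore banned users entirely
    if any(phrase in message.lower() for phrase in ABUSIVE_PHRASES):
        strikes[user_id] += 1
        if strikes[user_id] >= MAX_STRIKES:
            banned.add(user_id)        # roughly "something like an IP ban"
            return "That was your fifth warning. Goodbye."
        # divert with humour rather than swearing back or sounding weak
        return f"Warning {strikes[user_id]} of {MAX_STRIKES}: play nicely."
    return fallback_reply(message)
```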

Lessons learned

  • allow your bot to be mean to users. Treat them as they treat the bot
  • check what the bot is learning

Romantic interactions

  • I love you Mitsuku
  • Thanks, I like you a lot too
  • Put them straight in the friend zone!

Suicidal thoughts

  • try to point people towards professional help

Questions

  • Can you speak a bit about the underlying tech? It sounds like a lot of if statements?
    • Yes
    • Tried pattern-matching, but didn’t get good results.
    • Resulting bots were stupid
    • AIML language
  • Should we personify machines? Women being seen as subservient?
    • Reflects the origins: it was intended for a gaming site aimed at 18- to 30-year-old males
    • It’s a piece of software
  • How big is the template file?
    • pizza bot would be maybe 10,000 lines
    • currently at 350,000 intents
  • It’s like an echo chamber. No state is shared between users. Is there an ethical question about creating something like that?
    • that’s people
    • their own experience
    • reflecting their own biases

Panel with Gareth Rushgrove

How much do we rely on education as a lever for the change that we want to see, rather than regulation?

  • politicians need education before they can regulate
  • GDPR is regulation that is an important step
  • there are existing laws, like the Equality Act
  • one of the interesting things about this conference is that it’s connected lots of interesting people
    • Anne met with a Lord, who was interested in what’s going on
    • 3 aspects to policing ethics in tech
      • the law
      • users educate themselves
      • getting Apple/GOOG to do it themselves
    • Anne suggested that tech folk can also help with that; ie a 4th aspect
    • a question was asked in the UK House of Lords
      • “Will AI be subject to health & safety legislation?”
      • the minister answered yes, AI and ML in the UK is subject to that
      • Will be interesting to see if that actually pans out
      • TODO find relevant bit in Hansard
  • laws tend to lag tech and other developments

Do we think there are any negative effects of using ethics as a product?

  • chatbot was very intolerant of abuse
    • That cost revenue
      • Relaxed the restrictions
    • Was that the right thing to do?
  • Politics is an attempt to do that
  • Projectsbyif is an interesting thing around that
    • Harry feels it’s more like a market imperative
  • It’s a good thing that consumers care about ethics
  • If ethics is being used as a dishonest attempt to win market share, the feeling is that will be called out.

Are we doing the classic Not Invented Here for this? What prior art exists? To this (ex-GDS) questioner, the GDS design principles looked very similar to some things IBM wrote in the 60s. Can we learn from earlier smart people?

  • ethical code of conduct for sysadmins
  • The ACM has recently updated theirs / it launches next week?
  • Other fields have this
    • it’s part of the standard training
  • Professional bodies should cover this
    • what does a good industry body look like?
  • Around 30% of the audience reckoned they’d had ethical training
  • People don’t see it as a thing relevant to them
    • job interviews cover Python, or Go, or k8s.
    • they do not cover ethics
    • so people invest in tech skills, but not so much in ethical thinking

Is Apple ethical? Should I just chuck my phone away?

  • Restart project
    • the most ethical phone you have is your current one
    • doing some interesting work around supply chains
  • they are doing a lot of interesting work
    • differential privacy
    • do not track in the browser
  • Apple aren’t as bad as Facebook
  • Can we untwine ethics from utility?
    • Facebook lost $87B, but it bounced back
    • People get value out of FB

How can we do stuff now?

  • there are industry bodies
  • A panelist met someone from a union recently
    • general secretary of the TUC
    • actually, unions are not just to get a good deal for workers
    • can also help with communication between management and workers
    • in tech, we tend not to be unionised
      • soft comfortable jobs
  • not many people in industry bodies
    • maybe 6 in the entire audience
  • How do we change the attitudes towards unionising and/or professional bodies?
    • Deliveroo developers should be helping Deliveroo delivery people!
  • feels like there is a global need for representation

Are there ways that we can think about technology so that we can fix the existing structures?

  • Read the book After the Internet
  • Having a little badge, transparency
  • Can we spot biases in these interactions?
  • Go and test with users and see what you think
    • Example: an “are you on benefits?” question
    • caused anxiety with the users
    • changed it to explain why the question was being asked
  • Chinese government example
    • Social Credit system, entirely algorithmic

What weight should we give to ethics in a business focused on making money?

  • surely that will change over time?
    • Consumers will make a choice
  • see vegan food
    • people have educated themselves and the industry has grown
  • Cambridge Analytica has forced that idea to be something people care about

What is one small step everyone here can take to empower themselves?

  • Join a union!
  • Think about what matters to you
    • work to do that
  • practice
  • collective responsibility to be more informed
  • start a conversation with people that you work with

My unasked question - are we worried that we are rapidly approaching the point where people will not be able to comprehend the rules that are in systems?
