ML in law symposium

Andreas’ idea: given that decisions must be explainable, maximize the performance we can get under that constraint.
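A minimal sketch of that framing (my own illustration, not from the symposium, assuming a scikit-learn setup): fix an explainable model class, here depth-capped decision trees, and maximize accuracy within it.

# Hypothetical sketch: maximize accuracy subject to an explainability
# constraint, modeled here as a cap on decision-tree depth.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

best = max(
    (DecisionTreeClassifier(max_depth=d, random_state=0).fit(X_tr, y_tr)
     for d in range(1, 6)),                 # the depth cap is the constraint
    key=lambda clf: clf.score(X_te, y_te),  # the performance we maximize
)
print(best.get_depth(), best.score(X_te, y_te))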

My three takeaways

  • Tech folks have a tendency to “fly in and fix everything.” That feels like a dangerous approach here. It’s far better to stand on the shoulders of existing legal precedent, which has studied fairness, discrimination, and bias for decades, even if that slows progress.
  • Machine learning systems mirror and amplify bias by default. We cannot simply ignore sensitive attributes: a system that minimizes average loss fits the majority at the minority’s expense (“disparate mistreatment”); the sketch after this list illustrates the effect. Pithy corollary: this problem will only go away if we devote resources to making it go away.
  • Providing explanations for decisions is the only humane way to build automatic classification systems. Why? If I can’t test a result, I can’t contest it. If decisions must be testable and explainable, they will be much more reliable as a result.
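A hypothetical illustration of the second point (my own construction with numpy and scikit-learn, not from the talks): a single model trained to minimize average loss fits the 90% group’s pattern and leaves the 10% group near chance.

# Two groups whose labels follow different features; average-loss training
# lets the majority dominate the fit, so the minority's error rate is far
# worse (disparate mistreatment).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_maj = rng.normal(size=(9000, 2))
y_maj = (X_maj[:, 0] > 0).astype(int)   # majority: label follows feature 0
X_min = rng.normal(size=(1000, 2))
y_min = (X_min[:, 1] > 0).astype(int)   # minority: label follows feature 1

clf = LogisticRegression().fit(np.vstack([X_maj, X_min]),
                               np.concatenate([y_maj, y_min]))
print("majority accuracy:", clf.score(X_maj, y_maj))   # high
print("minority accuracy:", clf.score(X_min, y_min))   # near chance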
Org-mode entities
=================
* User-defined additions (variable org-entities-user)
* Letters
** Latin
Symbol   Org entity   LaTeX code   HTML code
-----------------------------------------------------------
À        \Agrave      \`{A}        &Agrave;

NIPS ML in the Law symposium 2016 notes

Including notes for the second session and the first panel.


Aaron Roth: Quantitative tradeoffs between fairness and accuracy in machine learning

Rev. Dr. William J. Barber, II, President of the North Carolina NAACP, November 11, 2017

This is a transcript of the "Post-Election 'Moral Message Moving Forward' NC NAACP Press Call" held on November 11, 2017. https://www.youtube.com/watch?v=xYxAQv6IC5I

I apologize for any errors in transcription.

SHIRI: Welcome to the North Carolina NAACP press call. My name is Shiri and I will be your operator for today's call. Please note that this conference is being recorded. I would like to now turn this call over to Tyler Swanson. You may begin.

SWANSON: Thank you. Tonight, Reverend Dr. William J. Barber II, president of the North Carolina NAACP, is making an ultimate public statement to all one hundred branches of the ... of the NC NAACP, members of the Forward Together moral movement, and the state of North Carolina. Dr. Barber will take questions immediately after his statement.

gcr / a.py
Created September 23, 2016 17:38
Right-aligned python
import re
from IPython.core.magic import register_cell_magic, cell_magic, magics_class, Magics

# Splits a line into its content and the whitespace padding after it.
python_regex = re.compile(r"^(.*?)(\s*)$")

@magics_class
class RightAlignMagics(Magics):
    @cell_magic
    def right_align(self, line, cell):
        # The gist preview truncates here; a plausible completion, assuming
        # the magic treats each line's trailing padding as its real
        # indentation so right-aligned source becomes ordinary Python:
        fixed = []
        for src in cell.splitlines():
            body, pad = python_regex.match(src).groups()
            fixed.append(pad + body.lstrip())
        self.shell.run_cell("\n".join(fixed))
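A hypothetical usage note (the registration line and example cell are my own, not shown in the gist):

from IPython import get_ipython
get_ipython().register_magics(RightAlignMagics)

# A cell such as the following would then run its right-aligned body:
# %%right_align
#                                           print("hello")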
{
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "ein.tags": [
          "worksheet-0"
        ]
      },
      "source": [
#!/usr/bin/env python
"""
Convert the XML file into a nice-looking HTML file suitable for reading in
Chrome.
"""
import sys
from glob import glob
json = require 'cjson'
function buildNcduLayer(name, module)
   local result = nil
   if torch.isTensor(module) then
      if module:numel() ~= 0 then
         local strt = {name..': [' .. torch.typename(module) .. ' of size '}
         for i=1,module:nDimension() do
            table.insert(strt, module:size(i))
            if i ~= module:nDimension() then
               table.insert(strt, 'x')   -- preview truncates here; plausible completion
            end
         end
         table.insert(strt, ']')
         result = table.concat(strt)
      end
   end
   return result
end
th> x = torch.randn(3,3)
[0.0001s]
th> x
-0.9764 1.3443 0.4054
1.7598 1.9367 -0.6121
-0.1593 -0.0788 -0.2321
[torch.DoubleTensor of size 3x3]
[0.0002s]
th> x:mean(1)
 0.2080  1.0674 -0.1463
[torch.DoubleTensor of size 1x3]