
ICS Colloquium: Speaker Dr. Robin Burke - Zoom Meeting Chat
Margaret Perkoff
23:34
Netflix movie recommendations
Nick Hunkins
23:34
YouTube recommendations
Brandon M. Booth
23:37
Classic: Netflix
Christopher Davis
23:38
Amazon
Bill Penuel (he/him/his)
23:40
Amazon
Sage O Sherman
23:41
Amazon
Tamara Sumner
23:42
Amazon
Tom Williams (He/Him)
23:43
Goodreads
Matthew Menten
23:44
Spotify Discover
Christopher Davis
23:52
LL Bean
Sarah Fahmy (she, hers)
23:54
ResearchGate
Bill Penuel (he/him/his)
23:57
“I’m Feeling Lucky” in the early days of Google
Jean M. Bowen
24:00
Audible
Emily Johns-O'Leary (she/her/hers)
24:07
Bookshop.org, Bookbar
Sarah Fahmy (she, hers)
24:15
Instagram, Facebook
Bill Penuel (he/him/his)
24:24
Twitter
Matthew Menten
24:29
Netflix: “You’ll love this generic top ten trending show”
Neal McBurnett
24:38
Google Assistant news
Neal McBurnett
30:39
Black Software: The Internet & Racial Justice, from the AfroNet to Black Lives Matter, by Charlton D. McIlwain (November 1, 2019)
Tom Williams (He/Him)
53:50
? - Re: Accuracy, I think it's Noble who points out that equal prediction accuracy can be problematic -- e.g. some recidivism prediction systems have equal accuracy for both white and black defendants, but while these systems err at the same rate for both populations, they more frequently err *in favor of* white defendants and more frequently err *against* black defendants. Do similar issues arise in the types of recommender applications that you focus on?
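[Editorial note: the distinction raised in this question can be made concrete with a small sketch. The following Python example uses made-up confusion-matrix counts (all numbers are hypothetical, not from any real recidivism system) to show that two groups can have identical overall accuracy while the errors fall in opposite directions.]

```python
# Minimal sketch (hypothetical numbers): two groups with identical
# overall accuracy but opposite false-positive / false-negative profiles.

def rates(tp, fp, fn, tn):
    """Return (accuracy, false positive rate, false negative rate)."""
    acc = (tp + tn) / (tp + fp + fn + tn)
    fpr = fp / (fp + tn)   # negatives wrongly predicted positive
    fnr = fn / (fn + tp)   # positives wrongly predicted negative
    return acc, fpr, fnr

# "Positive" = predicted to recidivate.
# Group A: errors are mostly false positives (system errs *against* the group).
group_a = dict(tp=40, fp=20, fn=5, tn=35)
# Group B: same accuracy, but errors are mostly false negatives
# (system errs *in favor of* the group).
group_b = dict(tp=40, fp=5, fn=20, tn=35)

for name, cm in [("A", group_a), ("B", group_b)]:
    acc, fpr, fnr = rates(**cm)
    print(f"Group {name}: accuracy={acc:.2f}  FPR={fpr:.2f}  FNR={fnr:.2f}")

# Output:
# Group A: accuracy=0.75  FPR=0.36  FNR=0.11
# Group B: accuracy=0.75  FPR=0.12  FNR=0.33
```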
Sarah Fahmy (she, hers)
57:21
Also I’m curious about who determines these fairness metrics, and how. How are elements of “power,” “privilege,” “education,” etc. influenced by notions of white saviorism and misconceptions of different communities?
Tamara Sumner
01:03:43
Sarah, that is why we are adopting a Responsible Innovation approach (Stilgoe 2013) in the AI Institute. It prioritizes changing who participates in design and development.
Arturo Cortez
01:04:09
^^^
Neal McBurnett
01:06:03
Can a system allow users to customize fairness objectives and find the intersection between the organization's altruistic goals and their own?
Sarah Fahmy (she, hers)
01:07:38
That’s great and so important!! Glad the AI Institute will follow that! I’m always curious about the identities in the room/who is present when decisions are made, since that influences personal bias.
Neal McBurnett
01:08:21
More generally, how about allowing for third-party recommender systems (or parameters) in some circumstances, chosen by users? Cf. socially responsible investing.
Matthew Menten
01:31:10
Gah, those things are frustrating to hear! Thanks for maintaining virtue in a field that has raised so many ethical dilemmas!