Resources

Investigating opaque algorithms can be a challenge. This page has some pointers to example investigations, some methodologically helpful references, and some guidance on how you might use Freedom of Information (FOI) requests in the U.S. to find out more information.

Examples

  • Inside the Algorithm That Tries to Predict Gun Violence in Chicago. New York Times. 2017. [Link]
  • Minority Neighborhoods Pay Higher Car Insurance Premiums Than White Areas With the Same Risk. ProPublica. 2017. [Link]
  • Machine Bias. ProPublica. 2016. [Link]
  • Uber seems to offer better service in areas with more white people. That raises some tough questions. Washington Post. 2016. [Link]
  • How Google Shapes the News You See About the Candidates. Slate. 2016. [Link]
  • Googling Politics. Slate. 2016. [Link]
  • The Tiger Mom Tax: Asians Are Nearly Twice as Likely to Get a Higher Price from Princeton Review. ProPublica. 2015. [Link]
  • Why Google Search Results Favor Democrats. Slate. 2015. [Link]
  • Sex, Violence, and Autocomplete Algorithms. Slate. 2013. [Link]
  • The Apple ‘Kill List’: What Your iPhone Doesn’t Want You to Type. The Daily Beast. 2013. [Link]
  • Websites Vary Prices, Deals Based on Users’ Information. Wall Street Journal. 2012. [Link]
  • Message Machine: Reverse Engineering the 2012 Campaign. ProPublica. 2012. [Link]

References

  • How to Hold Algorithms Accountable. MIT Technology Review. 2016. [Link]
  • Algorithmic Accountability: Journalistic Investigation of Computational Power Structures. Digital Journalism. 2015. [Link]
  • Algorithmic Transparency in the News Media. Digital Journalism. 2016. [Link]
  • Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms. Preconference at the 64th Annual Meeting of the International Communication Association. 2014. [Link]
  • The Black Box Society: The Secret Algorithms That Control Money and Information. Harvard University Press. 2015. [Link]
  • Auditing Algorithms @ Northeastern University. [Link] (many papers here)

Relevant Workshops / Conferences

  • Fairness, Accountability and Transparency in Machine Learning (FATML). [Link]
  • Data and Algorithmic Transparency (DAT). [Link]
  • FAT* Conference [Link]
  • AAAI/ACM Conference on AI, Ethics, and Society [Link]

How to FOIA an Algorithm

In the fall of 2015, we instructed students at the University of Maryland to use Freedom of Information (FOI) requests to obtain information about criminal risk assessment algorithms in use in different U.S. states. The endeavor is detailed in this article:

  • We need to know the algorithms the government uses to make important decisions about us. The Conversation. 2016. [Link]

Here’s the language that we suggested students use in their requests:

  • Copies of any open-source statistical assessment tools and any contracts for digital or statistical assessment tools or related services used in bail decisions.
  • Copies of any open-source statistical assessment tools and any contracts for digital or statistical assessment tools or related services used in sentencing decisions.
  • Copies of any open-source statistical assessment tools and any contracts for digital or statistical assessment tools or related services used in parole and probation decisions.
  • Variables that form the basis of the assessment tools.
  • Data used as the basis for training the assessment tools.
  • Algorithms or algorithmic processes applied to bail, sentencing and/or parole and probation decisions.
  • Source code for such algorithms.
  • Mathematical descriptions of such algorithms or assessments.
  • Assessments or evaluations of such digital tools and/or algorithms.
  • Any memos or communications or reports dealing with these algorithms.

A growing number of FOI requests for algorithms have been submitted via the Muckrock platform, and examining the approaches and language used in those requests may also be instructive. A variety of example requests are available through Muckrock's Uncovering Algorithms project.