Resources

Investigating opaque algorithms can be a challenge. This page offers pointers to example investigations, methodologically useful references, and guidance on using Freedom of Information (FOI) requests in the U.S. to learn more.

Examples

  • Machine Bias. ProPublica. 2016. [Link]
  • The Tiger Mom Tax: Asians Are Nearly Twice as Likely to Get a Higher Price from Princeton Review. ProPublica. 2015. [Link]
  • Websites Vary Prices, Deals Based on Users’ Information. Wall Street Journal. 2012. [Link]
  • Uber seems to offer better service in areas with more white people. That raises some tough questions. Washington Post. 2016. [Link]
  • How Google Shapes the News You See About the Candidates. Slate. 2016. [Link]
  • Why Google Search Results Favor Democrats. Slate. 2015. [Link]
  • Sex, Violence, and Autocomplete Algorithms. Slate. 2013. [Link]
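The pricing investigations above share a common logic: query the same service under systematically varied user profiles and compare the outcomes. A minimal sketch of that audit pattern is below; everything in it is hypothetical — `quote_price` stands in for a real service's response (a real audit would scrape or query the actual site), and the ZIP codes and surcharge rule are invented for illustration.

```python
import statistics

def quote_price(zip_code: str) -> float:
    """Hypothetical pricing service: a stand-in for a real website.
    In this toy version, some ZIP codes trigger a surcharge."""
    base = 100.0
    surcharge = 25.0 if zip_code.startswith("1") else 0.0
    return base + surcharge

def audit(zip_codes_a, zip_codes_b):
    """Query the service for two groups of ZIP codes and
    return the difference in mean quoted prices."""
    mean_a = statistics.mean(quote_price(z) for z in zip_codes_a)
    mean_b = statistics.mean(quote_price(z) for z in zip_codes_b)
    return mean_a - mean_b

# Group A: ZIPs that trigger the (simulated) surcharge; Group B: others.
gap = audit(["10001", "11201", "10468"], ["60601", "90012", "73301"])
print(f"Average price gap between groups: ${gap:.2f}")  # → $25.00
```

In a real investigation the grouping variable (ZIP code demographics, browser fingerprint, account history) and the statistical test would need far more care — see the Auditing Algorithms references below — but the input-variation structure is the same.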

References

  • Algorithmic Accountability: Journalistic Investigation of Computational Power Structures. Digital Journalism. 2015. [Link]
  • Algorithmic Transparency in the News Media. Digital Journalism. 2016. [Link]
  • Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms. Preconference at the 64th Annual Meeting of the International Communication Association. 2014. [Link]
  • How to Hold Algorithms Accountable. MIT Technology Review. 2016. [Link]
  • Auditing Algorithms @ Northeastern University. [Link]
  • Fairness, Accountability and Transparency in Machine Learning (FATML). [Link]

FOI Pointers

In the fall of 2015, we instructed students at UMD to use Freedom of Information requests to obtain information about the criminal risk assessment algorithms in use in different U.S. states. That effort is detailed in this article:

  • We need to know the algorithms the government uses to make important decisions about us. The Conversation. 2016. [Link]

Here’s the language that we suggested students use in their requests:

  • Copies of any open-source statistical assessment tools and any contracts for digital or statistical assessment tools or related services used in bail decisions.
  • Copies of any open-source statistical assessment tools and any contracts for digital or statistical assessment tools or related services used in sentencing decisions.
  • Copies of any open-source statistical assessment tools and any contracts for digital or statistical assessment tools or related services used in parole and probation decisions.
  • Variables that form the basis of the assessment tools.
  • Data used as the basis for training the assessment tools.
  • Algorithms or algorithmic processes applied to bail, sentencing and/or parole and probation decisions.
  • Source code for such algorithms.
  • Mathematical descriptions of such algorithms or assessments.
  • Assessments or evaluations of such digital tools and/or algorithms.
  • Any memos or communications or reports dealing with these algorithms.

A growing number of FOI requests for algorithms have been submitted via the MuckRock platform. Examining the approaches and language used in those requests may also be instructive: Read more here.