
Auditing Algorithms for Discrimination

As reliance on algorithmic decisionmaking expands, concerns are growing about the potential for arbitrary, unfair, or discriminatory outcomes in areas such as employment, credit markets, and criminal justice. Legal scholars have lamented the lack of accountability of these automated decision processes and called for greater transparency. They argue that the way to avoid unfair or discriminatory algorithms is to demand greater disclosure of how they operate. Accountable Algorithms resists this call for transparency, calling it “a naive solution.” Instead, it argues that technology offers tools—“a new technological toolkit”—that can better assure accountability.

One of the examples that Kroll et al. rely on to illustrate their argument is the goal of ensuring that algorithms do not discriminate. Many commentators have pointed out the risk that automated decision processes may produce biased outcomes, and in prior work, I have argued that serious policy concerns are raised when these algorithms exacerbate historic inequality or disadvantage along the lines of race, sex, or other protected characteristics, a phenomenon I have referred to as “classification bias.” Recognizing that the precise meaning of discrimination is uncertain and contested, Kroll et al. do not try to resolve those debates. Instead, they survey the available technical tools without choosing among the competing definitions, suggesting that these tools will be more effective at ensuring nondiscrimination than calls for transparency.

Transparency involves outside scrutiny of a decision process, for example, by allowing third parties to examine the computer code or the decision criteria it implements. Auditing is another method for promoting transparency. When the goal is nondiscrimination, auditing could involve techniques to verify that an algorithm follows a specified rule, for example, that sorting does not occur based on race or sex. Alternatively, auditing for discrimination could take the form of examining inputs and outputs to detect when a decision process systematically disadvantages particular groups. The latter form of auditing does not involve direct examination of the decision process, but it is useful for detecting discriminatory patterns in outcomes. This type of auditing, in the form of field experiments, is well established in the social science literature as a technique for testing for discrimination in decisions such as employment and consumer transactions. Auditing the effects of decisionmaking algorithms similarly offers a method of detecting when they may be biased against particular groups. Kroll et al., however, express skepticism about auditing as a strategy, arguing that it is not only technically limited but also likely restricted by law. More specifically, they suggest that when an algorithm is found to have a disparate impact, the Supreme Court’s decision in Ricci v. DeStefano may prevent correcting for that bias.
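To make the second form of auditing concrete, the following is a minimal sketch, not drawn from Kroll et al., of how an outcome audit might work in code. It computes selection rates by group from a set of decisions and compares them under the EEOC’s four-fifths rule of thumb, under which a ratio of the lowest group’s selection rate to the highest group’s that falls below 0.8 is commonly treated as evidence of adverse impact. The data, group labels, and function names are illustrative assumptions.

```python
from collections import defaultdict

def audit_selection_rates(records):
    """Compute per-group selection rates from (group, selected) records.

    An outcome audit of this kind examines only inputs and outputs,
    never the decision process itself.
    """
    totals = defaultdict(int)
    selected = defaultdict(int)
    for group, was_selected in records:
        totals[group] += 1
        if was_selected:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates):
    """Ratio of the lowest group selection rate to the highest.

    Under the EEOC's four-fifths rule of thumb, a ratio below 0.8
    is commonly treated as evidence of adverse impact.
    """
    return min(rates.values()) / max(rates.values())

# Hypothetical hiring outcomes: (group, selected).
records = [("A", True), ("A", True), ("A", False), ("A", True),
           ("B", True), ("B", False), ("B", False), ("B", False)]

rates = audit_selection_rates(records)
print(rates)                                   # {'A': 0.75, 'B': 0.25}
print(f"ratio: {disparate_impact_ratio(rates):.2f}")  # 0.33, below 0.8
```

Because the audit sees only inputs and outputs, it can flag a disparate impact without any access to the underlying code, which is why it complements, rather than depends on, transparency of the decision process itself.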

This Essay responds to Kroll et al., arguing that, despite its limitations, auditing for discrimination should remain an important part of the strategy for detecting and responding to biased algorithms. Technical tools alone cannot reliably prevent discriminatory outcomes because the causes of bias often lie not in the code but in broader social processes, as when an algorithm is trained on data that reflect past discriminatory decisions. Even the best available technical tools therefore cannot guarantee that algorithms are unbiased. Avoiding discriminatory outcomes will require awareness of the actual impact of automated decision processes, and that awareness comes through auditing.

Fortunately, the law permits the use of auditing to detect and correct for discriminatory bias. To the extent that Kroll et al. suggest otherwise, their conclusion rests on a misreading of the Supreme Court’s decision in Ricci. That case narrowly held that an employer may not take an adverse action against identifiable individuals based on race; it does not bar revising an algorithm prospectively to remove bias. Such an approach is entirely consistent with the law’s clear preference for voluntary efforts to comply with nondiscrimination goals.
