Joshua Kroll and Ed Felten
Important decisions about people are increasingly made by algorithms: Votes are counted; voter rolls are purged; financial aid decisions are made; taxpayers are chosen for audits; air travelers are selected for search; credit eligibility decisions are made. Citizens, and society as a whole, have an interest in making these processes more transparent. Yet the full basis for these decisions is rarely available to affected people: the algorithm or some of its inputs may be secret; the implementation may be secret; or the process may not be precisely described. A person who suspects the process went wrong has little recourse. And an oversight authority who wants to ensure that decisions are made according to an acceptable policy has little assurance that the proffered decision rules match the decisions actually rendered for users.
To address this problem, we propose accountable algorithms, which produce both a result and a proof that can convince a skeptical party that a consistent policy was applied correctly to accurate data to produce the announced result. Critically, the proof can convince an observer while maintaining the secrecy of parts of the policy used to determine the output, and the privacy of individuals' personal data.
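One building block behind such proofs is a cryptographic commitment: a decision maker can publish a binding fingerprint of its policy before any decisions are made, and later let an overseer confirm that the policy was fixed in advance without revealing it to the public in the meantime. The following is a minimal hash-based sketch of this idea only, not the paper's full construction, which also requires zero-knowledge proofs of correct execution; the example policy string is hypothetical.

```python
import hashlib
import secrets

def commit(policy: bytes) -> tuple[bytes, bytes]:
    # Publish the digest; keep the policy and random nonce secret.
    nonce = secrets.token_bytes(32)
    digest = hashlib.sha256(nonce + policy).digest()
    return digest, nonce

def verify(digest: bytes, nonce: bytes, policy: bytes) -> bool:
    # An overseer given (nonce, policy) checks them against the
    # earlier public commitment.
    return hashlib.sha256(nonce + policy).digest() == digest

# Decision maker commits to a (hypothetical) audit-selection policy
# before processing anyone.
policy = b"audit if reported_income < 0.5 * industry_average"
digest, nonce = commit(policy)

# Later, an oversight authority can confirm the policy was not
# changed or tailored per individual after the fact.
assert verify(digest, nonce, policy)
assert not verify(digest, nonce, b"some other policy")
```

The hash binds the decision maker to one policy while the digest alone reveals nothing useful about it; the randomized nonce prevents an observer from confirming guesses about a low-entropy policy by brute force.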
Our methods use the tools of computer science to cryptographically ensure the technical properties that can be proven, while providing the information necessary for a political, legal, or social oversight process to operate effectively. Combining the technology of verified computation with the operation of non-technical governance structures offers the best hope of governing algorithmic decision processes in practice.