New York will tackle unfair biases in automated city services

City Council is calling for a task force on the matter.

Whether we're aware of them or not, algorithms affect a huge part of our lives. Now, in a US first, New York is taking steps to address potential algorithmic biases in services provided by municipal agencies. City Council has passed a bill that would, if signed by Mayor de Blasio, create a task force to examine whether and how service algorithms are biased, how citizens can appeal algorithmic decisions they consider unfair, and whether agency source code could be made publicly available.

"Automated decision systems" are responsible for determining outcomes on a wide range of matters between the city and its citizens. Take eligibility for bail, for example. The training data used to produce algorithms for this system may contain biases that unjustly favor one group of individuals over another. The task force would look at how certain groups, such as the elderly, immigrants, the disabled and minorities, are affected by these automated processes.

The bill, named Intro 1696-A, is not as wide-reaching as advocates had initially hoped. An earlier version would have mandated that all agencies making decisions with algorithms make their source code publicly available. The passed version simply requires that the task force look into the feasibility of doing so.

If the bill is signed, the task force would need to be formed within three months, but its report wouldn't be due for 18 months, which is fair given the size of such a data-intensive task and, of course, its importance. Weeding out algorithmic biases and challenging the systems that allow them to exist in the first place will have a massive civic impact and set vital precedents for the rest of the country.