In 2017, New York City passed a law to keep the city’s automated systems in check. Local Law 49, the first law of its kind in the nation, established a task force to examine the hidden algorithms governing life in New York and suggested a way for experts to study those tools for error and bias.
As automated systems take over more and more decision-making in cities around the country, the new group could ask key questions about the systems that are being used in New York. Which tools decide who is first — or last — in line for government services? Does automation favor some neighborhoods more than others? If so, who’s being left behind? The new task force could examine those questions and recommend changes where necessary.
But the New York task force now shows signs of fraying, raising troubling questions for the algorithmic accountability movement nationwide. Some members have turned openly critical of the city, accusing officials of failing to provide transparent access to information, effectively turning the task force into a publicity effort instead of a source of accountability.
At a city council hearing earlier this month, city officials said they had not yet come up with a definition of automated decision systems (ADS), the tools that the task force is meant to examine, and they couldn’t identify a single instance of an automated system that the task force could study in detail. “It has taken more time than we originally thought it would take,” an official admitted at the hearing.
With nothing to study, critics say, the task force is toothless and able to provide only broad policy recommendations — the kind that experts would have been able to suggest without convening a task force at all. New York University assistant professor and task force member Julia Stoyanovich told The Verge that if no examples are forthcoming, “then there was really no point in forming the task force at all.”
“I would not have signed up for this task force if I knew this was just a formal sort of an exercise,” she says.
While a full list of ADS used in New York City might never be produced, experts have identified many, both in the city and around the United States, although the tools are largely invisible to the city residents they affect. As the task force continues its work, the AI Now Institute has released some publicly acknowledged examples of automated tools used in the city, pointing out systems that are used to determine where students will be sent to school, how to fund neighborhood fire departments, and which building inspections to prioritize.
In an interview, task force chair and director of the Mayor’s Office of Operations, Jeff Thamkittikasem, said some dissent was a natural part of the process and that the task force has been “a great opportunity to have challenging conversations.” The city, he says, is also working on providing new, specific examples, including two from the Department of Transportation and Department of Education. “I’m hoping in a couple of weeks we can get that,” he says, adding that the task force is “moving pretty judiciously to get toward recommendations” and on track to issue a report in December.
Thamkittikasem says the task force had several questions at its earliest meetings about what types of systems the law might apply to. The law the city passed, officials said at the hearing, was written so broadly that it could be read to include tools like calculators. Concerns about privacy and security have also prevented officials from turning over detailed information about systems in use by the city, they said. Stoyanovich calls this an “excuse,” saying there are ways to make the information available.
The tone marks a sharp turn from how the law was once greeted. In a statement announcing the formation of the task force last year, Mayor Bill de Blasio hailed the plan as a way to bring New York City to the forefront of technology policy. “The establishment of the Automated Decision Systems Task Force is an important first step towards greater transparency and equity in our use of technology,” de Blasio said in the statement.
As the first plan of its kind in the United States, the New York law established a model, and other local governments have launched similar programs in the time since. But the internal dissent raises questions about the transparency that’s necessary to make that model work.
Advocates say time may now be short to make any major changes. At the end of this month, the task force will hold the first of two public engagement sessions. Thamkittikasem says the city is working to bring residents to the public meetings and to testify about how they may have been affected by ADS. Along with gathering input on automated systems, the law also requires the task force to write a report with recommendations for using those systems at the end of the year. “I’m unclear if the public will actually be engaged,” Rashida Richardson, director of policy research at the AI Now Institute, says.
Richardson, along with a group of local advocates, signed on to a letter in August outlining ways the task force could effectively work with the public. But last month, many of the same people sent another letter, this time asking the city to speed up its outreach, writing, “we fear that given this timeline the window of opportunity for the type of meaningful public engagement that would inform the Task Force’s work is rapidly closing.”
Stoyanovich worries that the problems might be systemic and that the law essentially allowed city officials to govern their own accountability. The law doesn’t give outside experts any mechanism to demand information on automated systems, which might uncover information that could be damaging to the city’s reputation.
“I was hopeful, and still am somewhat hopeful,” she says, “although far less now.”