One of the main rotundas in the Philadelphia House of Corrections

Bobby Chen / Billy Penn

The Reentry Project

Can Philly’s new technology predict recidivism without being racist?

A tool to help predict whether someone who’s been arrested will reoffend does not factor in race. But it could consider convictions.

So you’ve just been arrested. Welcome to Philadelphia’s criminal justice system. You’ll soon be whisked into a room where bail will be assigned to you — likely in a matter of seconds — by a bail magistrate or a judge who has access to your record. Maybe they’ll let you go. Maybe they’ll set your bail sky-high. Maybe they won’t offer you bail at all.

And if you’re one of the people with the latter fate or someone who can’t post cash bail, you’ll await trial in a jail cell on State Road. Good luck.

The current process of assigning bail is far from scientific. As part of sweeping changes now underway in Philadelphia’s criminal justice system, city officials are working with top data scientists to develop a computerized risk assessment tool that weighs a variety of factors and assigns each defendant a label: low-, medium- or high-risk. Bail would be set from there, and the ultimate goal is to get more pretrial defendants out of the city’s jails while working to eventually end cash bail entirely.
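To make the mechanics concrete, here is a minimal sketch of how a numeric risk score might be bucketed into those three labels. The feature names, weights and cutoffs below are invented for illustration; Philadelphia has not published its model.

```python
# Hypothetical sketch of a pretrial risk tool: a weighted sum of
# defendant features, bucketed into low/medium/high. All names,
# weights and cutoffs are invented for illustration.

def risk_score(features, weights):
    """Simple weighted sum of defendant features (illustrative only)."""
    return sum(weights[name] * value for name, value in features.items())

def risk_label(score, low_cutoff=3, high_cutoff=7):
    """Map a numeric risk score to a categorical label."""
    if score < low_cutoff:
        return "low"
    if score < high_cutoff:
        return "medium"
    return "high"

weights = {"prior_arrests": 1.0, "failed_appearances": 2.0, "age_under_25": 1.5}
defendant = {"prior_arrests": 2, "failed_appearances": 1, "age_under_25": 1}

score = risk_score(defendant, weights)
print(score, risk_label(score))  # 5.5 medium
```

Everything downstream, including how high bail is set or whether a defendant is released, would hinge on which bucket the score lands in.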

Criminal justice reform advocates see the end goal as a good one. But there’s a real concern that computerized risk assessment tools could predict recidivism by weighing factors that serve as a proxy for race and socioeconomic status, ultimately incarcerating more black and brown defendants while allowing white defendants to go free.

Hannah Sassaman, the policy director at the Media Mobilizing Project, who was recently awarded a fellowship to study risk assessment models, said that even with race and zip code excluded, as Philadelphia has promised, other factors can stand as proxies for race, whether it’s conviction record, job status or arrest history.

“If we know convictions are caused by those systemic racist factors, how can we have convictions as a proxy for dangerousness?” she said.
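The proxy concern can be shown with a toy calculation, using entirely invented numbers: if two groups behave identically but one is policed more heavily, a rule built on conviction counts rather than race still reproduces the disparity.

```python
# Toy illustration of the proxy problem: race is never an input,
# yet unequal enforcement makes convictions carry the disparity.
# All numbers are invented for illustration.

offenses_per_1000 = {"A": 100, "B": 100}          # identical behavior
convictions_per_offense_pct = {"A": 30, "B": 60}  # unequal enforcement

# Conviction counts each group shows up with in the data:
convictions_per_1000 = {
    g: offenses_per_1000[g] * convictions_per_offense_pct[g] // 100
    for g in offenses_per_1000
}

# A "race-blind" rule keyed to conviction counts inherits the gap:
threshold = 40
flagged_high_risk = {g: n > threshold for g, n in convictions_per_1000.items()}

print(convictions_per_1000)  # {'A': 30, 'B': 60}
print(flagged_high_risk)     # {'A': False, 'B': True}
```

Identical underlying behavior, different labels: the enforcement gap passes straight through the “neutral” feature.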

The development of a new risk assessment tool for Philadelphia is one of a number of strategies being implemented by a team of stakeholders working on a three-year project to reduce Philadelphia’s prison population by a third. Much of that effort is being funded by a $3.5 million grant from the MacArthur Foundation, which selected Philadelphia to take part in its Safety and Justice Challenge.

A large part of that jail population reduction comes in the pretrial arena. Ultimately, the city wants to move toward ending cash bail entirely, a change criminal justice reform advocates have pushed for, arguing that cash bail disproportionately incarcerates people simply because they are poor. The development of a risk assessment tool goes hand-in-hand with that goal.

Michael Bouchard, director of pretrial services for the First Judicial District of Pennsylvania, said in an interview earlier this month that “the goal with implementing a new risk tool is to reduce or eliminate cash bail.”

“Once we have a risk tool and once we have a model with numbers,” he said, “we’ll be able to allocate our resources in the pretrial arena to provide those that are more suited for community supervision than pretrial incarceration.”

ABC Bail Bonds in Center City, Philadelphia

Angela Gervasi / Billy Penn

But there’s no timeline, at least not publicly, for when that tool will be finalized and implemented. City leaders are still working with Dr. Richard Berk, a statistician at the University of Pennsylvania known nationally for his work in predictive modeling, to develop the tool. Berk said there’s no risk assessment instrument yet, though he’s examined some prototypes and is “waiting for feedback.”

Gabriel B. Roberts, a spokesman for the First Judicial District of Pennsylvania, said in a statement provided to Billy Penn that “the risk assessment tool is just one of 19 initiatives funded by the MacArthur grant to safely reduce Philadelphia County’s jail population while also reducing racial and ethnic disparities.”

“As with other initiatives, every effort will be made to reduce racial and ethnic disparities,” he said. “To that end, the model, which is still being developed, will not include any information concerning race or zip code.”

Sassaman and other advocates she’s working with say that doesn’t go far enough. Since July, through a 2017 Soros Justice Advocacy Fellowship from the Open Society Foundations, she has been studying predictive models in other jurisdictions while pushing for a seat at the table as Philadelphia officials develop the city’s tool.

“If they’re in the process of designing a tool, part of that process should be directly working with people,” Sassaman said. “Not just letting policy-makers make political and moral decisions, but also everyday people from the poorest big city in America should be a structural part of that conversation.”

Hannah Sassaman, policy director at Media Mobilizing Project

Danya Henninger

Larry Krasner, the Democratic nominee for district attorney and the favorite to win the election in November, agreed, saying any risk assessment tool developed by the city needs to be open source, adding “there is a real danger that the components going into the risk assessment are proxies for race and for socioeconomic status.”

“The threshold there is we have to be able to look inside that black box,” Krasner said in an interview earlier this month. “It cannot be wholly proprietary. It cannot be closed in terms of how it processes the information. It cannot be closed in terms of what inputs there are.”

For now, those developing the tool are keeping tight-lipped about what factors are going into it, save for promising race and zip code won’t be among them.

Sassaman said she’s confident that will change.

“If my very liberty is being determined by a computer program that is invisible to me,” she said, “then I have every right to watch it, and I have every right to make sure it is trained on data that is local.”