Opinion

Pennsylvania officials need to track the use of algorithms, which already make important decisions about schools, funding, and criminal justice

Rampant use of artificial intelligence can create new problems, with baked-in biases that can grow if unchecked.

Danya Henninger / Billy Penn
Walker Gosrich

State and local governments are widely using algorithms to automate decisions. These tools are powerful, but can easily backfire. Their applications are growing fast, which means now is the time for Philadelphia and Pennsylvania lawmakers to start tracking and studying their use.

We need to ensure algorithms are effective and equitable, so governments can harness the efficiency of automation without trampling the rights of their residents.

What’s the big deal? You may already realize that we interact with algorithms whenever we’re online. These software routines decide whose profiles we see on dating apps, filter spam from our inboxes, and tag and sort our photos. More subtly, algorithms also shape our offline experiences: they streamline bus routes, provide insurance quotes, set bail, evaluate students, decide credit card limits, and help the government apportion public funds.

The humans behind these computer programs are usually well-intentioned, creating them to help government agencies make faster decisions or cut companies’ costs — ideally so the savings can be passed on to taxpayers and customers.

But rampant use of AI is creating problems that go unseen, obscured by complex statistics until it’s too late.

Biases creep into algorithms easily. After all, most engineers who design them aren’t paid for perfection; they’re paid for cheap and efficient solutions. And after algorithms are deployed, it can be almost impossible to detect bias. Few systems build in audits of the decisions — “A computer did it, how could it be unfair?” — and subtle trends that indicate unfairness are buried in heaps of data that can be spread across an organization and are often kept private for proprietary reasons.

So algorithms continue on, cementing into computer code the racial and gender biases that lurk beneath the surface of society. They are automating harmful and discriminatory practices, which will continue to grow if left unchecked.

Pennsylvania already uses algorithms to make high-risk, sensitive decisions about residents — in schools, public funding, and criminal justice, among other domains.

These are all areas in which it’s important to get decisions right. As such, they present an opportunity to lead by example.

The Pa. General Assembly and Philadelphia City Council can both take action now: they can establish a task force to keep track of state and local use of algorithms. This low-cost first step will empower Pennsylvania to develop a smart regulatory approach to automation, allowing government agencies and the public to reap the benefits of this new technology while minimizing the risks to residents.

Already in Pa., algorithms are being applied to important decisions without proper safeguards. This is true when government agencies or researchers design algorithms, but it is even more evident when private companies are contracted to build and run them, a common practice. The system used to evaluate students and teachers in Pennsylvania public schools, for example, is contracted from a company called SAS Institute Inc. Researchers recently found that this and other similar tools are racially biased — perhaps even more so than comparable tools on the market.

Intentional policymaking around algorithm use is so important because bias and errors in algorithms tend to sneak through, with disastrous consequences.

Even small biases in an algorithm can have huge effects when it is scaled up to make thousands of decisions about people’s lives every second. The consistency and scale with which algorithms are applied mean harms can easily balloon, and even become systemic, erecting more barriers for communities that are already marginalized.

How can we rein in the damage that algorithmic decision systems cause? Start by examining and learning more about the systems we’re already using.

Lawmakers in Harrisburg could pass bills modeled after similar measures proposed in the New York and Washington state legislatures. Councilmembers in Philly can build on the work of a similar task force in New York City. These task forces would test for bias, find best practices, and publish reports to inform the public and suggest future legislation. This would add much-needed oversight, increase transparency, and facilitate research to continue to make algorithms more fair.

Leaders need to take action now to establish proper oversight and ensure equitable treatment for all residents.
