AI for the police?


A crime-fighting AI company altered evidence to please police, a new investigation claims — the latest in a rising chorus of criticism.

What’s new: ShotSpotter, which makes a widely used system of the same name that detects the sound of gunshots and triangulates their location, modified the system’s findings in some cases, Vice reported.

Why it matters: ShotSpotter’s technology is deployed in over 100 U.S. cities and counties. The people who live in those places need to be able to trust criminal justice authorities, which means they must be able to trust the AI systems those authorities rely on. The incidents described in legal documents could undermine that trust — and potentially trust in other automated systems.

We’re thinking: There are good reasons for humans to analyze the output of AI systems and occasionally modify or override their conclusions. Many systems keep humans in the loop for this very reason. It’s crucial, though, that such systems be transparent and subject to ongoing, independent audits to ensure that any modifications have a sound technical basis.

Do you think AI can help solve crimes, or is it too big a risk?

To read the full story, click HERE