You've Been Flagged as a Threat: Predictive AI Technology Puts a Target on Your Back
[Rutherford] You’ve been flagged as a threat.

Before long, every household in America will be similarly flagged and assigned a threat score.

Without having ever knowingly committed a crime or been convicted of one, you and your fellow citizens have likely been assessed for behaviors the government might consider devious, dangerous or concerning; assigned a threat score based on your associations, activities and viewpoints; and catalogued in a government database according to how you should be approached by police and other government agencies based on your particular threat level.
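
Mechanically, such a scoring system need not be exotic. Here is a minimal sketch in Python, assuming a simple weighted-sum model; every signal name, weight and cutoff below is hypothetical, since no agency's actual criteria are public.

    # Hypothetical sketch of weighted "threat scoring." All signal names,
    # weights and thresholds are invented for illustration only.
    from dataclasses import dataclass

    @dataclass
    class CitizenProfile:
        flagged_associations: int  # e.g., contacts on a watchlist
        flagged_activities: int    # e.g., purchases or travel deemed "concerning"
        flagged_viewpoints: int    # e.g., monitored speech or posts

    # Assumed weights; viewpoints counted most heavily to mirror the
    # article's concern about speech-based scoring.
    WEIGHTS = {"associations": 2.0, "activities": 1.5, "viewpoints": 3.0}

    def threat_score(p: CitizenProfile) -> float:
        return (WEIGHTS["associations"] * p.flagged_associations
                + WEIGHTS["activities"] * p.flagged_activities
                + WEIGHTS["viewpoints"] * p.flagged_viewpoints)

    def threat_tier(score: float) -> str:
        # Arbitrary cutoffs mapping a score to a recommended police "approach."
        if score >= 10:
            return "high"
        if score >= 5:
            return "elevated"
        return "low"

    print(threat_tier(threat_score(CitizenProfile(1, 2, 1))))  # -> elevated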

If you’re not unnerved by the ramifications of how such a program could be used and abused, keep reading.

It’s just a matter of time before you find yourself wrongly accused, investigated and confronted by police based on a data-driven algorithm or risk assessment cobbled together by an artificial-intelligence program.

Consider the case of Michael Williams, who spent almost a year in jail for a crime he didn’t commit. Williams was behind the wheel when a passing car fired at his vehicle, killing his 25-year-old passenger Safarian Herring, who had hitched a ride.

Williams had no motive, no eyewitnesses saw the shooting, no gun was found in the car, and Williams himself drove Herring to the hospital. Police nevertheless charged the 65-year-old man with first-degree murder based on ShotSpotter, a gunshot detection program that had picked up a loud bang on its network of surveillance microphones and triangulated the noise to correspond with a noiseless security video showing Williams’ car driving through an intersection. The case was eventually dismissed for lack of evidence.
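
The localization step itself is standard signal processing. The Python sketch below shows time-difference-of-arrival multilateration, the general technique such sensor networks are reported to use to place a sound; the sensor layout, timings and solver here are assumptions for illustration, not ShotSpotter's actual implementation.

    # Assumed illustration of acoustic multilateration: recover a sound's
    # origin from its arrival times at several known microphone positions.
    import numpy as np
    from scipy.optimize import least_squares

    SPEED_OF_SOUND = 343.0  # m/s, roughly, at 20 C

    # Invented microphone positions (x, y) in meters.
    sensors = np.array([[0.0, 0.0], [500.0, 0.0], [0.0, 500.0], [500.0, 500.0]])

    # Simulated arrival times of a bang from an unknown point, plus timing noise.
    true_source = np.array([180.0, 260.0])
    arrivals = np.linalg.norm(sensors - true_source, axis=1) / SPEED_OF_SOUND
    arrivals += np.random.default_rng(0).normal(0.0, 1e-3, len(sensors))

    def residuals(params):
        # params = (x, y, t0): candidate origin and unknown emission time.
        x, y, t0 = params
        dists = np.linalg.norm(sensors - np.array([x, y]), axis=1)
        return (t0 + dists / SPEED_OF_SOUND) - arrivals

    # Find the origin that best explains the observed arrival times.
    fit = least_squares(residuals, x0=[250.0, 250.0, 0.0])
    print("estimated origin:", fit.x[:2])  # close to (180, 260)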

Although gunshot detection programs like ShotSpotter are gaining popularity with law enforcement agencies, prosecutors and courts alike, they are riddled with flaws, mistaking "dumpsters, trucks, motorcycles, helicopters, fireworks, construction, trash pickup and church bells...for gunshots."

As an Associated Press investigation found, "the system can miss live gunfire right under its microphones, or misclassify the sounds of fireworks or cars backfiring as gunshots."

In one community, ShotSpotter worked less than 50% of the time.

Then there’s the human element of corruption, which invariably gets added to the mix. In some cases, "employees have changed sounds detected by the system to say that they are gunshots." Forensic reports prepared by ShotSpotter’s employees have also "been used in court to improperly claim that a defendant shot at police, or provide questionable counts of the number of shots allegedly fired by defendants."

The same company that owns ShotSpotter also owns a predictive policing program that aims to use gunshot detection data to "predict" crime before it happens. Both Presidents Biden and Trump have pushed for greater use of these predictive programs to combat gun violence in communities, despite the fact that they have not been found to reduce gun violence or increase community safety.
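
To see why, consider a deliberately crude stand-in for this kind of "prediction": rank map grid cells by historical alert counts and flag the top cells for patrol. The vendor's real model is proprietary and not described here; this assumed toy version only shows how extrapolating past alerts forward tends to send police back to the same neighborhoods: detections beget patrols, which beget more detections.

    # Toy stand-in for "predictive policing" on gunshot-detection data.
    # The real model is proprietary; this only illustrates the feedback loop.
    from collections import Counter

    # Invented past alerts, binned into (grid_x, grid_y) map cells.
    past_alerts = [(2, 3), (2, 3), (5, 1), (2, 3), (5, 1), (0, 0)]

    def predicted_hotspots(alerts, top_n=2):
        # "Prediction" here is just extrapolating past alert density forward.
        return [cell for cell, _ in Counter(alerts).most_common(top_n)]

    print(predicted_hotspots(past_alerts))  # -> [(2, 3), (5, 1)]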

The rationale behind this fusion of widespread surveillance, behavior prediction technologies, data mining, precognitive technology, and neighborhood and family snitch programs is purportedly to enable the government to take preemptive steps to combat crime (or whatever the government has chosen to outlaw at any given time).
Posted by: Besoeker 2022-05-19
http://www.rantburg.com/poparticle.php?ID=633335