See a cute person on the bus? Take a photo of them, and this computer program will look up and find all their personal information…
You may have good reason to worry that police use of facial recognition will erode your privacy – many departments are already using software that raises serious concerns. The New York Times has learned that over 600 law enforcement agencies in the US and Canada have signed up in the past year to use software from little-known startup Clearview AI that can match uploaded photos (even those with imperfect angles) against over three billion images reportedly scraped from the web, including Facebook and YouTube. While it has apparently helped solve some cases, it also opens the door to serious abuse – police could intimidate protesters, stalk people and otherwise misuse the system with few obstacles.
Part of the problem stems from a lack of oversight. There has been no real public input into the adoption of Clearview's software, and the company's ability to safeguard data hasn't been tested in practice. Clearview itself remained highly secretive until late 2019. It's certainly capable of looking at search data if it wants – police who helped test the software for the NYT's story got calls asking if they'd been talking to the media.
You should google the YouTube video “red flag laws Virginia”
They undoubtedly used this type of software to show up on an elderly gentleman's doorstep and harass him prior to the 2nd amendment rally.
I was at a rally in DC. The day of the rally, we couldn't get cell service at all; my daughter got some kind of "all circuits are busy" message when she tried to call me. So yeah, ■■■■ happens.
Or it could be that none of that happened. Cell service was not jammed, and all the pictures I've seen of the rally are proof in themselves that the media was not steered away from the event.