'Too dangerous': Why even Google was afraid to release this technology (www.npr.org)
from throws_lemy@lemmy.nz to privacy@lemmy.ca on 13 Oct 2023 16:42
https://lemmy.nz/post/2349142

Imagine strolling down a busy city street and snapping a photo of a stranger, then uploading it into a search engine that almost instantaneously helps you identify the person.

This isn’t a hypothetical. It’s possible now, thanks to a website called PimEyes, considered one of the most powerful publicly available facial recognition tools online.

#privacy


autotldr@lemmings.world on 13 Oct 2023 16:45 next collapse

This is the best summary I could come up with:


Imagine strolling down a busy city street and snapping a photo of a stranger, then uploading it into a search engine that almost instantaneously helps you identify the person.

A basic version of PimEyes is free for anyone to use, but the company offers advanced features, like alerts on images that users may be interested in when a new photo appears online, for a monthly subscription fee.

Gobronidze said PimEyes now blocks access in 27 countries, including Iran, China and Russia, over fears government authorities could use the service to target protesters and dissidents.

“These benefits are being used as a pretext for government and industry simply to expand their power and profits, without any meaningful gains anyway,” said Woodrow Hartzog, a Boston University School of Law professor who specializes in facial recognition technology.

And while Big Tech companies have been holding back, smaller startups pushing the technology, like PimEyes and another called Clearview AI, which provides AI-powered face search engines to law enforcement, are gaining momentum.

Silicon Valley giants had developed the powerful chatbots for years in labs, but kept them a secret until a smaller startup, OpenAI, made ChatGPT available to the public.


The original article contains 1,428 words, the summary contains 196 words. Saved 86%. I’m a bot and I’m open source!

tacosanonymous@lemm.ee on 13 Oct 2023 17:26 next collapse

Okay, we need legal protections against this yesterday. Wtf?

baconisaveg@lemmy.ca on 13 Oct 2023 19:30 next collapse

Maybe because it doesn’t work? I uploaded a picture of someone I know, it found 2 ‘free’ results, neither of them matched, but it then offered me a ‘deep search’ for only $20.47.

And at the end of the day, the responsibility is still on YOU to quit fucking uploading your mug everywhere and making it publicly searchable.

Rodeo@lemmy.ca on 13 Oct 2023 22:26 collapse

What about cameras in public places uploading your image without your consent?

Remember last year when that mall in Calgary got caught selling face data of random customers? You don’t even need to have any social media at all to be affected by this.

CommanderCloon@lemmy.ml on 13 Oct 2023 19:39 next collapse

That will help make these tools less accessible, but it won’t stop them from existing.

nik282000@lemmy.ca on 13 Oct 2023 22:35 collapse

That cat has been out of the bag for nearly 20 years. If you have ever posted photos of yourself online (on a platform that you do not own and operate) your images are free for that platform to use however they want. The way to not be in a face search engine is to not upload your face into search engines.

thezeesystem@lemmy.world on 13 Oct 2023 18:39 next collapse

Thanks! This will definitely be helpful for getting people to understand that there should be regulations in place to prevent it.

Basically: ask someone if I can take their photo, then show them how incredibly easy it is to find everything about them.

extant@lemmy.world on 13 Oct 2023 20:06 next collapse

Reverse image search is offered by Google, but it’s not focused on faces. There are many companies that do this; they just target governments and other corporations as their customers instead of advertising it publicly. What do you think Meta does with everything that’s uploaded to it?
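For what it’s worth, these face search engines typically work by converting each face into an embedding vector and ranking indexed images by similarity to the query. This is a toy sketch of that idea, not how PimEyes or Clearview actually implement it; the file names and 3-dimensional vectors are made up (real systems use 128+ dimensional embeddings from a neural network):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Mock "face embeddings" keyed by image name. Purely illustrative:
# a real index holds millions of vectors produced by a face model.
index = {
    "person_a.jpg": [0.9, 0.1, 0.3],
    "person_b.jpg": [0.2, 0.8, 0.5],
    "person_c.jpg": [0.4, 0.4, 0.9],
}

def search(query_embedding, k=1):
    """Return the k indexed images most similar to the query embedding."""
    ranked = sorted(
        index,
        key=lambda name: cosine_similarity(index[name], query_embedding),
        reverse=True,
    )
    return ranked[:k]

# A query vector close to person_a's embedding ranks that image first.
print(search([0.85, 0.15, 0.35]))
```

At real scale, the linear scan in `search` is replaced by an approximate nearest-neighbor index, which is where the “beefy hardware” requirement comes from.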

ultratiem@lemmy.ca on 13 Oct 2023 20:27 collapse

“Eric Schmidt as far back as 2011, said this was the one technology that Google had developed and decided to hold back, that it was too dangerous in the wrong hands — if it was used by a dictator, for example,” Hill said.

Yeah, I’m sure they didn’t just keep it to themselves and use it for their nefarious purposes. Definitely not what happened.

nik282000@lemmy.ca on 13 Oct 2023 22:38 collapse

Governments already have this tech. It’s not a secret or even very complicated; it just takes some really beefy hardware (or a lot of time). The FBI used it to identify people in the Capitol attack.

ultratiem@lemmy.ca on 13 Oct 2023 23:38 collapse

Yeah, I’m sure they’ve had it for decades. Not saying it’s not out there, just that it’s a bit disingenuous to suggest Google doesn’t employ, and even sell, the service.

It makes it seem like Google locked it away like some Akira project.