Adversarial images on clothing to combat AI facial recognition without covering the face (petapixel.com)
from listless@lemmy.cringecollective.io to privacy@lemmy.ml on 19 Jul 2024 02:17
https://lemmy.cringecollective.io/post/58437

#privacy


CaptainSpaceman@lemmy.world on 19 Jul 2024 02:42 next collapse

This will work for about 10 minutes. Better off wearing a facemask, bandana, juggalo face paint, etc

lessthanluigi@lemmy.world on 19 Jul 2024 03:00 next collapse

I’ll take the juggalo face paint even though I am not a juggalo

just_another_person@lemmy.world on 19 Jul 2024 03:02 collapse

Wrong answer.

strawberry@kbin.run on 19 Jul 2024 05:14 collapse

Can't they still ID you with a mask? Apple's Face ID can work with a mask, though I suppose that has depth data too.

TragicNotCute@lemmy.world on 19 Jul 2024 20:48 collapse

Almost certainly. The facial ID is good enough that US customs didn’t even want to see my passport. Just a photo was enough to let me back in the country. I even significantly changed my hair between departure and arrival. Shit is scary.

z00s@lemmy.world on 19 Jul 2024 03:45 next collapse

The method that Cap_able has patented allows the wearer to incorporate the algorithm into the fabric of the clothing and still look stylish.

I was with you up until the stylish bit

LodeMike@lemmy.today on 19 Jul 2024 03:53 next collapse

Stylish != fashionable

z00s@lemmy.world on 19 Jul 2024 09:16 collapse

<img alt="1000000504" src="https://lemmy.world/pictrs/image/0ada0bfe-beb2-4a82-ad10-8438d18de165.jpeg">

LodeMike@lemmy.today on 19 Jul 2024 09:22 collapse

Oh right. I forget that definitions change

z00s@lemmy.world on 19 Jul 2024 14:43 collapse

When did it change?

LodeMike@lemmy.today on 19 Jul 2024 18:46 collapse

IDK. 15 years ago maybe

z00s@lemmy.world on 20 Jul 2024 00:11 collapse

maybe it never changed

finley@lemm.ee on 19 Jul 2024 04:02 next collapse

Everyone’s a critic

FinalRemix@lemmy.world on 19 Jul 2024 09:09 collapse

I for one think looking like a texture ripped from DooM is stylish.

muntedcrocodile@lemm.ee on 19 Jul 2024 04:14 next collapse

Unfortunately it's a cat and mouse game, except the cat is an easily deployable software problem and the mouse is a buy-new-clothing hardware problem.

notfromhere@lemmy.ml on 19 Jul 2024 05:38 next collapse

LED clothing, anyone?

muntedcrocodile@lemm.ee on 19 Jul 2024 06:02 next collapse

If it's radioactive then it will disrupt the image sensor. It might also disrupt your DNA, but you don't need that, do ya?

notfromhere@lemmy.ml on 19 Jul 2024 06:09 next collapse

Light emitting diode (screens sewn on fabric) not lead lined 😁

muntedcrocodile@lemm.ee on 19 Jul 2024 06:15 collapse

Fuck, I'm an idiot. But maybe an automated laser targeting system.

Etterra@lemmy.world on 19 Jul 2024 09:31 collapse

I guess I’ll just wear radium paint then.

merde@sh.itjust.works on 19 Jul 2024 06:57 collapse

they’ve been around for some time now: www.reflectacles.com

<img alt="" src="https://sh.itjust.works/pictrs/image/a78a54c2-8583-4773-ae50-041e7bea3356.jpeg">

Ghost uses a frame-applied material that reflects both infrared and visible light. In low light environments they will maintain your privacy on cameras using infrared for illumination and also block 3D infrared facial mapping during both day & night. The visible light reflection can make you anonymous in images/videos using a flash in low light.

01189998819991197253@infosec.pub on 19 Jul 2024 15:03 collapse

Ghost is $170 (US, I’m assuming). Not great but not bad for a wicked cool looking pair of sunglasses. Considering Ray-Bans are around $200 (and, no offense, look like they’re from Tesco), and that Ghost are privacy focused, I’d say that price seems not that bad. Still high, though.

Umbrias@beehaw.org on 19 Jul 2024 07:39 collapse

If they can target the underlying architecture of the models like nightshade does, it will actually be quite hard to deal with for the surveillance companies.
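To illustrate the idea: adversarial patterns like these are typically found by gradient-based attacks on the target model. Below is a toy FGSM-style sketch against a hypothetical linear "detector" — purely illustrative, not Cap_able's or Nightshade's actual method:

```python
import numpy as np

# Hypothetical linear "person detector": a positive score means "person found".
rng = np.random.default_rng(0)
w = rng.normal(size=16)  # stand-in for the model's learned weights

def score(x):
    return float(w @ x)

# An input the toy detector currently flags (correlated with w by construction).
x = 0.5 * w
assert score(x) > 0

# FGSM-style perturbation: step against the gradient of the detection score.
# For a linear model, the gradient with respect to x is just w.
eps = 2.0
x_adv = x - eps * np.sign(w)

print(score(x), score(x_adv))  # the perturbed input scores strictly lower
```

Real detectors are deep networks, so the gradient comes from backpropagation rather than being read off directly, and a wearable pattern additionally has to survive printing, fabric distortion, and arbitrary camera angles — which is what makes physical attacks hard, and why vendors can often patch around any single fixed pattern.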

muntedcrocodile@lemm.ee on 20 Jul 2024 13:04 collapse

Interesting concept, if we can target and poison the definitely stolen training data.

DemBoSain@midwest.social on 19 Jul 2024 04:54 next collapse

This would be a good article if the pictures actually showed people wearing the clothes.

01189998819991197253@infosec.pub on 19 Jul 2024 15:13 collapse

Literally the header image…

<img alt="" src="https://infosec.pub/pictrs/image/a2fcc163-3b88-4cf5-a497-c614e844fb63.webp">

DemBoSain@midwest.social on 19 Jul 2024 19:21 next collapse

I see a couple people, and some oddly colored blobs.

01189998819991197253@infosec.pub on 19 Jul 2024 20:16 collapse

Oh! HHahhahhhHah! That’s a good joke! Wooshed right over my head hahahahahahahah!

Edit: correct autocorrect

possiblylinux127@lemmy.zip on 20 Jul 2024 04:40 collapse

What’s with the floating heads?

DetachablePianist@lemmy.ml on 19 Jul 2024 05:00 next collapse

🎶"Because I’m tacky…" 🎵

Blaster_M@lemmy.world on 19 Jul 2024 05:13 next collapse

<img alt="" src="https://imgs.xkcd.com/comics/license_plate.png">

dRLY@lemmy.ml on 21 Jul 2024 06:03 collapse

A friend of mine's dad worked in some capacity with the pigs, which led to my friend finding out about something people had discovered, either by random luck attempting something like the comic or from interacting with cops: in the city I lived in, there's a "panic" signal that auto-calls for lots of backup, triggered by entering a specific letter or number multiple times. For some reason I want to say it was either zero or O, but I don't remember offhand.

So when they would quickly input a plate with enough taps without thinking, it would cause resources to be pulled and wasted. Not great for attracting attention to the driver, since it's basically pulling aggro, but it could be great for moving attention away from somewhere else.

Cheradenine@sh.itjust.works on 19 Jul 2024 05:47 next collapse

William Gibson’s Ugly Shirt come to life

dipak@lemmy.ml on 19 Jul 2024 09:37 next collapse

Good for privacy! But I really doubt it would work for all recognition systems.

Some funny pitfalls that may occur: self-driving cars might prefer to hit that person if forced to choose between him and some other human, and street-mapping cars might not blur his face for lack of detection.

fernandu00@lemmy.ml on 19 Jul 2024 11:28 next collapse

$246?! I can’t afford that. For that price I’d rather avoid cameras and such. Cool technology though

whydudothatdrcrane@lemmy.ml on 19 Jul 2024 15:01 next collapse

Absolutely cool. I will have to revise all my internalized cyberpunk imagery though.

01189998819991197253@infosec.pub on 19 Jul 2024 15:12 next collapse

Similar tech has been around for a while, and it almost always gets beaten.

CableMonster@lemmy.ml on 19 Jul 2024 20:30 next collapse

The AI was probably already patched 5 minutes after the article came out.

ssm@lemmy.sdf.org on 19 Jul 2024 21:39 collapse

You can’t really “patch” LLMs like most software; you’d have to retrain them, no?

CableMonster@lemmy.ml on 19 Jul 2024 23:29 next collapse

Oh, I don't know. I'd just assume they could update (or retrain) to adapt pretty quickly.

ssm@lemmy.sdf.org on 20 Jul 2024 00:09 collapse

I don’t know either, I wasn’t trying to be condescending or anything.

OhNoMoreLemmy@lemmy.ml on 20 Jul 2024 12:14 collapse

Yeah but they don’t use LLMs for this, they’ll use some other kind of machine learning mixed in a big pipeline of data processing. It makes it really hard to guess how much work it would take to fix. It might require retraining, might just require an easy patch of the rest of the pipeline.

My guess is that they’re just shitty jumpers and there’s nothing to fix anyway.

geneva_convenience@lemmy.ml on 19 Jul 2024 21:01 next collapse

Their demo video looks horrible. They are using a trash algorithm to demo the detection failing.

JokeDeity@lemm.ee on 20 Jul 2024 13:24 collapse

The girl also moves extremely slowly and permanently has her arms out to the side at the elbows. I assume this is the only way they could get the results they wanted to show.

whotookkarl@lemmy.world on 19 Jul 2024 21:18 next collapse

$400-700 for a single article of clothing, with no mention of what facial recognition software this affects, how effective it is, what the failure rate and error bounds are, etc. Sounds like a scam.

Cethin@lemmy.zip on 19 Jul 2024 23:11 collapse

I wouldn’t call it a “scam” just manipulative marketing. This stuff doesn’t seem like it’d work for any of the modern facial recognition options, but that’s just a guess. If it did work well and they were proud of it, you can be sure that’d be part of the marketing, so it at best is mediocre if not useless.

ruplicant@sh.itjust.works on 20 Jul 2024 11:53 next collapse

I wouldn’t call it a “scam” just manipulative marketing

the difference?

HumanPerson@sh.itjust.works on 21 Jul 2024 14:25 collapse

Not who you asked, but I think some might argue that it would be a scam if you ordered it and it didn't arrive, or something like that. If it works against even one facial recognition model, then technically it's just bad marketing. Either way is bad, though.

JokeDeity@lemm.ee on 20 Jul 2024 13:22 collapse

So I don't know if you guys actually read the article or not, but they absolutely DO claim that it works against YOLO, which they say is the most popular recognition software. I don't know how factual any of that is, but they do make the statement.

some_guy@lemmy.sdf.org on 20 Jul 2024 02:21 next collapse

It’s only a matter of time before a cop charges someone with obstruction for trying to disrupt a camera system (during the commission of a crime, I mean).

possiblylinux127@lemmy.zip on 20 Jul 2024 04:40 collapse

Or they just work around it

Etterra@lemmy.world on 20 Jul 2024 02:48 next collapse

So I guess we’re wearing broken JPEGs now huh?

possiblylinux127@lemmy.zip on 20 Jul 2024 04:39 next collapse

I want this to be a thing

dRLY@lemmy.ml on 21 Jul 2024 05:49 collapse

Glitch art clothing would be dope even if it didn’t help with AI fuzzing.

NigelFrobisher@aussie.zone on 20 Jul 2024 12:04 collapse

Those people are just dressed like regular Australians.