Every lens leaves a blur signature—a hidden fingerprint in every photo. With it, we can tell apart ‘identical’ phones by their optics, deblur images, and render realistic blurs. (blur-fields.github.io)
from Pro@programming.dev to privacy@lemmy.ml on 12 Sep 09:33
https://programming.dev/post/37302987

cross-posted from: programming.dev/post/37278389

Optical blur is an inherent property of any lens system and is challenging to model in modern cameras because of their complex optical elements. To tackle this challenge, we introduce a high‑dimensional neural representation of blur—the lens blur field—and a practical method for acquisition.

The lens blur field is a multilayer perceptron (MLP) designed to (1) accurately capture variations of the lens 2‑D point spread function over image‑plane location, focus setting, and optionally depth; and (2) represent these variations parametrically as a single, sensor‑specific function. The representation models the combined effects of defocus, diffraction, aberration, and accounts for sensor features such as pixel color filters and pixel‑specific micro‑lenses.
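As a loose sketch (not the authors' architecture; the layer sizes, input parameterization, and PSF tap scheme here are all assumptions for illustration), a blur field can be pictured as a small MLP that maps a query consisting of image-plane position, focus setting, depth, and a PSF tap offset to a normalized PSF sample:

```python
import numpy as np

class BlurFieldMLP:
    """Toy stand-in for a lens blur field: a tiny MLP mapping a query
    (image-plane x, y, focus setting, scene depth, plus a PSF tap offset)
    to a non-negative PSF sample. Weights are random here; in practice
    they would be fit to calibration photos of a known target."""

    def __init__(self, in_dim=7, hidden=64, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.standard_normal((in_dim, hidden)) * 0.1
        self.b1 = np.zeros(hidden)
        self.W2 = rng.standard_normal((hidden, hidden)) * 0.1
        self.b2 = np.zeros(hidden)
        self.W3 = rng.standard_normal((hidden, 1)) * 0.1
        self.b3 = np.zeros(1)

    def psf(self, x, y, focus, depth, taps=9):
        """Evaluate a taps x taps PSF at one (x, y, focus, depth) query."""
        dx, dy = np.meshgrid(np.linspace(-1, 1, taps), np.linspace(-1, 1, taps))
        n = dx.size
        q = np.stack([np.full(n, x), np.full(n, y), np.full(n, focus),
                      np.full(n, depth), dx.ravel(), dy.ravel(), np.ones(n)],
                     axis=1)
        h = np.maximum(q @ self.W1 + self.b1, 0)      # ReLU layer 1
        h = np.maximum(h @ self.W2 + self.b2, 0)      # ReLU layer 2
        out = np.exp(h @ self.W3 + self.b3).ravel()   # strictly positive samples
        return (out / out.sum()).reshape(taps, taps)  # PSF sums to 1

field = BlurFieldMLP()
psf = field.psf(x=0.3, y=-0.5, focus=0.8, depth=2.0)
```

The appeal of such a parametric form is that one network weight vector stands in for the whole 5‑D table of PSFs, which is what makes per-device comparison tractable.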

We provide a first‑of‑its‑kind dataset of 5‑D blur fields—for smartphone cameras, camera bodies equipped with a variety of lenses, etc. Finally, we show that acquired 5‑D blur fields are expressive and accurate enough to reveal, for the first time, differences in optical behavior of smartphone devices of the same make and model.

#privacy


hellfire103@lemmy.ca on 12 Sep 10:49 next collapse

This was mentioned in Little Brother by Cory Doctorow. So, what do we do about it?

interdimensionalmeme@lemmy.ml on 12 Sep 11:12 collapse

Have a coordinated volunteer project where people print and photograph a special patterned image designed to map the blur and other aberrations of their particular lens. With hundreds of thousands of samples, we train a micro-distortion ML model that subtly shifts and distorts the pixels just enough to make positive lens identification impossible. Then have something auto-apply this filter (and discard the originals) on every picture before it even has a chance of being uploaded to the cloud.
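The filtering step of this idea could look something like the sketch below, assuming SciPy and a smooth sub-pixel random warp (the `strength` and `smooth` knobs are made-up values, not anything validated against actual lens identification):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def micro_distort(img, strength=0.5, smooth=25, seed=None):
    """Warp an image by a smooth random flow field whose displacement is
    at most `strength` pixels everywhere, scrambling fine PSF structure
    while leaving the picture visually unchanged."""
    rng = np.random.default_rng(seed)
    h, w = img.shape[:2]
    flow = rng.standard_normal((2, h, w))
    flow = gaussian_filter(flow, sigma=(0, smooth, smooth))  # low-frequency warp
    flow *= strength / (np.abs(flow).max() + 1e-9)           # cap displacement
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    coords = np.array([ys + flow[0], xs + flow[1]])
    if img.ndim == 2:
        return map_coordinates(img, coords, order=1, mode='reflect')
    return np.stack([map_coordinates(img[..., c], coords, order=1, mode='reflect')
                     for c in range(img.shape[2])], axis=-1)

# Usage: a smooth test image barely changes visually, but every pixel moves.
img = np.outer(np.linspace(0, 1, 64), np.ones(64))
warped = micro_distort(img, seed=0)
```

Whether sub-pixel warping alone actually defeats blur-field matching is an open question; a counter-argument appears further down the thread.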

asbestos@lemmy.world on 12 Sep 11:37 next collapse

Splendid

icelimit@lemmy.ml on 12 Sep 12:47 next collapse

This blur persists past digital (automatic) post processing? Or is otherwise still uniquely traceable?

pupbiru@aussie.zone on 13 Sep 09:26 collapse

i’d guess that the digital processing is a well known change so you can account for it

after all, modern post processing on a phone is, afaik, done on the raw sensor data, so it uses a lot more data than is actually stored in the JPEG: it probably leads to more information being available than if it weren’t done (more shadow detail rather than crushed blacks, etc)

webghost0101@sopuli.xyz on 12 Sep 13:23 next collapse

As much as this can be a privacy issue couldn’t this be a feature to tell apart ai and real photos?

Ross_audio@lemmy.world on 12 Sep 13:50 collapse

Just my guess. I could be wrong:

As the lens blur is mathematically fairly simple and spread across the whole image it’s likely already consistently replicated by AI in a similar way to real photos.

It’s easier for generative AI to spot, “understand”, and replicate a mathematical pattern than the number of fingers on a hand or limbs on a body.

webghost0101@sopuli.xyz on 12 Sep 14:38 next collapse

Also a guess, but isn’t a hand or any biological form also the result of a mathematical pattern?

I do see how ai could replicate “a” blur but what it might not be able to do (yet) is replicate the unique blur of a specific device.

So maybe you couldn’t prove something is AI, but the physical lens could serve as proof that it is not.

Ross_audio@lemmy.world on 12 Sep 20:24 next collapse

Hands appear differently in different positions all over the frame of the photo, so I maintain the hand pattern is less consistent and harder to replicate than lens blur.

But you’re right as the blur is a fingerprint you can match it to a lens and prove a photo is real that way.

It could be a useful tactic, since so far most AI detection works by trying to find and prove AI fakery rather than by proving a photo is real.

aashd123@feddit.nl on 13 Sep 06:33 collapse

You wouldn’t share your physical lens for high-risk work (i.e. where you are anonymous) and since there’s no way to know whether a specific “blur” was produced by a physical lens or by AI, this won’t help in proving if something is AI.

howrar@lemmy.ca on 13 Sep 12:10 collapse

It also helps that the current generation of image generation models essentially work by “deblurring” some random noise. Having a blur in the resulting image just means the model has to do less work in a sense.

Core_of_Arden@lemmy.ml on 12 Sep 13:46 next collapse

Everything leaves behind a unique pattern.

Endymion_Mallorn@kbin.melroy.org on 12 Sep 14:49 next collapse

So you're saying, always 'scratch' your lens and get a repair shop to replace it with a generic lens. And if possible, get the CCD changed to a compatible one as well.

lime@feddit.nu on 12 Sep 15:06 collapse

…for every photo

Endymion_Mallorn@kbin.melroy.org on 12 Sep 16:04 collapse

That's a challenge, but even moving to a "generic" lens should help to reduce the identifiability.

CookieOfFortune@lemmy.world on 12 Sep 20:54 next collapse

Why would a generic lens be any better? These distortions are part of the lens design and manufacturing. Arguably, a lower quality lens would be easier to identify.

Endymion_Mallorn@kbin.melroy.org on 12 Sep 21:51 collapse

But not identifiable to a specific type of phone.

CookieOfFortune@lemmy.world on 12 Sep 22:12 next collapse

Ah I see. Do non-phone-specific generic lenses exist? They all seem pretty specialized to me.

Endymion_Mallorn@kbin.melroy.org on 13 Sep 07:16 collapse

I know that a lot of the cheap Android handsets, which we mostly encounter as prepaid, have interchangeable camera bits.

porous_grey_matter@lemmy.ml on 12 Sep 22:40 collapse

I didn’t see anything to suggest that these aberrations indicate anything about a type of phone? They’re unique to each lens…

lime@feddit.nu on 14 Sep 13:00 collapse

but the problem isn’t “we can tell this photo is from a specific phone”, it’s “we can tell these two photos are from the same phone”.

sefra1@lemmy.zip on 12 Sep 15:17 next collapse

It’s old news that you should never use the same camera for two images that need separate identities.

The same applies to radio transmitters and probably to every analogue medium, like a microphone, preamp, or ADC.

Anything that doesn’t work purely in the digital domain is most likely traceable, and I wouldn’t be surprised if proprietary software like Adobe’s started embedding hidden fingerprints into its files to “enforce their copyright” or “better collaborate with law enforcement”.

I tend to complain that ROMs like GrapheneOS don’t allow spoofing the IMEI, which should be basic functionality of every privacy-enabled phone. Yet if you require real privacy, the electronic “fingerprint” of the radio itself is probably enough to track someone if they really want to.

There’s also a technique where they can infer the time and location of a recording just from the oscillations of the utility power grid’s frequency captured in it.

irmadlad@lemmy.world on 12 Sep 17:02 next collapse

It’s old news that you should never use the same camera for two images that need separate identities.

Sanitize metadata and EXIF data?

sefra1@lemmy.zip on 12 Sep 17:12 collapse

That’s probably enough to stop your online mates from doxxing you, but a powerful enough adversary can trace the unique, nuanced fingerprints that a camera lens introduces into the picture and compare them with images from other sources like social media.

There are many steps that can introduce patterns: the way the lens blurs, as explained in the article; sensor readout noise patterns; a speck of dust; scratches. I bet chromatic aberrations are also different between multiple copies of a lens.
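The sensor-noise part of this has a well-known toy form (a simplified take on PRNU-style matching; the denoiser, array sizes, and noise levels below are arbitrary illustration, not a forensic-grade pipeline): average the high-frequency residuals of many photos from one camera to estimate its fixed pattern, then correlate a new photo's residual against it.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def noise_residual(img, sigma=2.0):
    """High-frequency residual: the image minus a blurred copy of itself."""
    return img - gaussian_filter(img, sigma)

def fingerprint(images):
    """Average residuals over many photos so scene content and shot noise
    cancel out, leaving the fixed per-pixel pattern of the sensor."""
    return np.mean([noise_residual(im) for im in images], axis=0)

def correlate(residual, fp):
    """Normalized cross-correlation between a residual and a fingerprint."""
    a, b = residual - residual.mean(), fp - fp.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Toy demo: two fake sensors, each with its own fixed noise pattern.
rng = np.random.default_rng(1)
pat_a, pat_b = rng.standard_normal((32, 32)), rng.standard_normal((32, 32))

def shoot(pat):
    scene = gaussian_filter(rng.standard_normal((32, 32)), 4) * 5  # smooth scene
    return scene + 0.2 * pat + 0.1 * rng.standard_normal((32, 32))  # + shot noise

fp_a = fingerprint([shoot(pat_a) for _ in range(40)])
fp_b = fingerprint([shoot(pat_b) for _ in range(40)])
test_res = noise_residual(shoot(pat_a))  # new photo from camera A
```

With enough photos, `test_res` correlates clearly better with `fp_a` than with `fp_b`, which is the same matching logic the lens-blur fingerprint enables at the optics level.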

LesserAbe@lemmy.world on 12 Sep 22:23 next collapse

It’s news to me. Do you have any further reading about it you can share?

Mgineer@lemmy.ml on 13 Sep 07:52 next collapse

Anything that doesn’t work purely in the digital domain is most likely traceable

at this point I believe that digital is easier to trace, as every device that has ever connected to the Internet, or to a device that has, has probably been bugged

pupbiru@aussie.zone on 13 Sep 09:23 collapse

The same applies to radio transmitters and every analogue medium like probably microphone or preamp or ADC.

exactly why, when you buy any halfway decent mic, there’s the option to buy them in matched sets: they’ll have come off the production line together, so their imperfections are as close to each other as possible and they sound as identical as they can

communism@lemmy.ml on 12 Sep 20:42 next collapse

Is it possible to use some kind of random noise algorithm to modify the image so that devices can’t be uniquely identified like this anymore? Or would that not work?

interdimensionalmeme@lemmy.ml on 12 Sep 20:48 collapse

The noise would have to be enough to obscure the lens’ aberration, and that would be an obnoxious amount of noise. Instead I think a better solution is to add micro-distortion strategically to make identification ambiguous/inconclusive

pupbiru@aussie.zone on 13 Sep 09:28 collapse

perhaps simply putting something like cling wrap over the lens and moving it for each photo would be enough: adding some scratches and roughness that slightly changes each time you move it

interdimensionalmeme@lemmy.ml on 14 Sep 07:13 collapse

Some parts of the image would not be changed enough, and like a partial fingerprint they could still be traced; the entire image has to be micro-distorted digitally to thoroughly jumble up the micro-blur image signature.

The sensor also has a unique noise “grain” structure, and that has to be masked as well

CookieOfFortune@lemmy.world on 12 Sep 20:57 collapse

That means we could get better image processing by applying this analysis in reverse, deconvolving with the measured blur field. It would also reduce this type of fingerprinting.

Eagle0110@lemmy.world on 13 Sep 09:34 collapse

Uhhh I feel like reversing this wouldn’t be much easier than trying to reverse hash functions lol

CookieOfFortune@lemmy.world on 13 Sep 09:42 collapse

We already apply distortion correction based on lens profiles. Probably all cell phone lenses do this. They just aren’t corrected based on individual phones.