By spoofing the fingerprint, developers can make their automated tools impersonate real users more convincingly, thereby bypassing bot detection.
Many OSS projects and personal web servers run bot detection because they would otherwise drown under traffic from (AI) scrapers and other bots. Hosting or bandwidth costs are often unsustainable without bot protection.
If you don’t want to kill these projects, honor robots.txt by default, throttle your requests, and don’t try to circumvent bot blocks. Check whether there’s a purpose-built API available to bots. If they don’t want to offer such an API, go find something else to do.
Most open source projects just hit everything with a PoW captcha instead of trying to guess if a user is real or not, so trying to spoof enough to look like a real user won’t change all that much anymore.
That’s true. The reason is that there’s a lot of bot traffic spoofing real users, sometimes even coming through residential proxies.
When bots spoof users well, the last option left to projects is these PoW captchas that annoy everyone. Enshittification continues.
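The PoW (proof-of-work) idea mentioned above can be sketched roughly as follows: the server hands out a challenge, and the client must find a nonce whose hash has some number of leading zeros, which is cheap to verify but costly for bots at scale. This is a toy illustration of the general scheme (used by tools such as Anubis), not any project's actual protocol; the challenge string and difficulty here are made up:

```python
import hashlib


def solve_pow(challenge: str, difficulty: int = 4) -> int:
    """Find a nonce so sha256(challenge + nonce) starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # first nonce that meets the difficulty target
        nonce += 1


def verify_pow(challenge: str, nonce: int, difficulty: int = 4) -> bool:
    """Server side: a single hash check, regardless of how hard solving was."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)
```

The asymmetry is the point: solving takes on the order of 16^difficulty hash attempts, while verification is one hash, so the server can challenge every visitor cheaply.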
ghodawalaaman@programming.dev
on 18 Mar 08:54
well, if a person decides to use this to attack small OSS projects’ servers, then we have failed as humanity. I shared this article to fight against big tech surveillance; if people use it to damage FOSS projects, I highly discourage that behavior.
The article focuses on techniques that help bots spoof browsers, to make them impersonate a typical human visitor.
It’s not obvious how this helps people protect themselves against surveillance while online. Browsing via Python scripts is not practical. But it’s handy for writing scrapers.
It’s certainly useful to misbehaving bots that try to evade anti-bot protections.
MonkderVierte@lemmy.zip
on 18 Mar 13:58
I removed the Firefox version number and half the internet broke. I replaced the whole string with Dillo 3/2.0, and this fixed most sites that “don’t work” without JS.
Well, this was before the current anti-scraping captcha-everything era.
ghodawalaaman@programming.dev
on 18 Mar 17:40
What works for me is to use Firefox with tracking protection set to strict (or Librewolf with resistFingerprinting disabled and WebGL enabled), then install Jshelter and set it like below:
Time precision: High
Locally generated images: Little lies
Locally generated audio: Little lies
Graphic card information: Little lies for maximum privacy, or disabled to minimise captchas/shadowbans
WebAssembly speed-up: Enabled
Everything else disabled as Firefox/Librewolf takes care of them
Once you do that, FingerprintJS Pro (fingerprint.com) should give you a different ID every time you clear cookies and change the IP.
Really nice article
things were simpler back then 🤧
i had the same thought; this shit is going to suck when i’m old and disconnected enough to not understand what’s going on.
That ai header image is a real turnoff to anything they have to say