Your Therapists’ Notes Could Become Fodder For AI (jacobin.com)
from Novocirab@feddit.org to privacy@lemmy.ml on 18 Sep 00:11
https://feddit.org/post/18983476

#privacy

threaded - newest

obsidianfoxxy7870@lemmy.blahaj.zone on 18 Sep 00:27 next collapse

I also hate that there is never an option to opt out of services like this.

muxika@lemmy.world on 18 Sep 01:02 next collapse

Jeff Goldblum’s Jurassic Park monologue feels appropriate here.

“You stood on the shoulders of geniuses to accomplish something as fast as you could, and before you even knew what you had, you patented it, and packaged it, and slapped it on a plastic lunch box, and now you’re selling it!”

Albbi@lemmy.ca on 18 Sep 02:14 collapse

Great quote!

A 32-year-old movie… I love/hate when old movie quotes are more relevant today than when the movie came out.

solarvector@lemmy.dbzer0.com on 18 Sep 19:47 next collapse

Sure you can. Just, you know, don’t need therapy.

doinky@lemmy.today on 18 Sep 22:56 collapse

Not universally true. With my healthcare network there is a consent form that allows you to decline the use of AI for online appointments, and for in-person visits they request verbal consent. I make sure to decline every time.

Just wanted to share so that people can check if opting out is possible (for now at least).

ReversalHatchery@beehaw.org on 20 Sep 04:59 collapse

one can only hope it has any meaning. what platform do they use for online appointments? zoom? ms teams? google meet?

doinky@lemmy.today on 20 Sep 19:27 collapse

My network (my insurance is with an HMO so it’s all centralized with all the physical and mental healthcare providers I have access to) has you click on your appointment in the app or online, and then it opens Zoom.

I have to use Teams for work and the idea of using that for therapy is kind of funny for some reason…

ReversalHatchery@beehaw.org on 20 Sep 23:32 collapse

yeah. I wouldn’t expect that opt-out to be honored. maybe by your provider, but not by zoom.

irmadlad@lemmy.world on 18 Sep 01:40 next collapse

I wonder how hard it would be to set up an AI honeypot that attracted AI scrapers, but all the data contained in the honeypot was poisoned.

XenGi@feddit.org on 18 Sep 01:43 collapse

Pretty easy. It’s called a tar pit.

irmadlad@lemmy.world on 18 Sep 01:49 next collapse

Ding! I just didn’t have the proper verbiage.

Related link

ganymede@lemmy.ml on 18 Sep 05:52 collapse

tar pits target the scrapers.

were you also talking about poisoning the training data?

two distinct (but imo highly worthwhile) things

tar pits are a bit like turning the tap off (or to a useless trickle). fortunately it’s well understood how to do it efficiently and it’s difficult to counter.

poisoning is a whole other thing. i’d imagine if nothing comes out of the tap the poison is unlikely to prove effective. there could perhaps be some clever ways to combine poisoning with tarpits in series, but in general they’d be deployed separately or at least in parallel.

bear in mind that to meaningfully deploy a tar pit against scrapers you usually need some level of control over the server, so it may not help much with the exact problem in the article (except for some short term fuckery perhaps). poisoning, on the other hand, is probably the more relevant approach for this problem
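for anyone curious, here’s a minimal sketch of a tar pit in python (standard library only; the trap path, link names and timings are made up, not from any real deployment). the idea is just to drip an endless stream of fake links at anything that wanders into the trapped path, so a scraper burns connections and time while real visitors never see it:

```python
# Minimal tar pit sketch: serve an endless, very slow trickle of fake links
# under /trap/ so crawlers that follow them get stuck. Illustrative only.
import random
import time
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class TarPitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Only the trap path is served; everything else 404s, so real
        # visitors (who never see the hidden link) are unaffected.
        if not self.path.startswith("/trap/"):
            self.send_response(404)
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        try:
            while True:
                # Emit one more link back into the trap, then stall.
                link = f'<a href="/trap/{random.getrandbits(32):08x}">more</a>\n'
                self.wfile.write(link.encode())
                self.wfile.flush()
                time.sleep(10)  # the "useless trickle"
        except (BrokenPipeError, ConnectionResetError):
            pass  # the scraper gave up

if __name__ == "__main__":
    # ThreadingHTTPServer so one stuck crawler connection doesn't block others.
    ThreadingHTTPServer(("0.0.0.0", 8080), TarPitHandler).serve_forever()
```

you’d then expose /trap/ only where bots will find it (e.g. a link hidden from humans, or a path disallowed in robots.txt that polite crawlers skip and greedy ones don’t).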

[deleted] on 18 Sep 03:37 collapse

.

solrize@lemmy.ml on 18 Sep 01:58 next collapse

Meanwhile the guy who breached a Finnish therapy database and held 33,000 records for ransom just got out of prison after serving around 2 years of a 6-year sentence:

en.wikipedia.org/wiki/Vastaamo_data_breach

rabbitcat@sopuli.xyz on 18 Sep 13:48 collapse

To make matters worse, he basically “served” those 2 years in a hotel, because that’s what Finnish prisons are. He clearly wasn’t punished enough. I’m afraid he will go back to his criminal life…

eldavi@lemmy.ml on 18 Sep 19:13 next collapse

i think that the worst thing about this is that the people committing the same crime are made millionaires and will never receive any sort of comeuppance for it.

rabbitcat@sopuli.xyz on 19 Sep 03:48 collapse

True.

UltraGiGaGigantic@lemmy.ml on 19 Sep 03:05 next collapse

Prisons shouldn’t be cruel and unusual. If you think Finnish prisons are so good compared to being free in your country… what do you think that says about your country?

rabbitcat@sopuli.xyz on 19 Sep 03:47 next collapse

I already got a bit off topic, and I’m afraid this would go even further from the topic if I answered your question. I’d rather keep this sub on topic. And also, one of the rules of this community is literally: “Try to keep things on topic”.

[deleted] on 19 Sep 07:09 collapse

.

HiddenLayer555@lemmy.ml on 19 Sep 07:19 collapse

Finland to financial criminals victimizing the entire country:

<img alt="" src="https://lemmy.ml/pictrs/image/3345c159-5c53-4d95-b82a-f5e13dba000a.jpeg">

Finland to the actual victims:

<img alt="" src="https://lemmy.ml/pictrs/image/0b288a40-c59d-4271-8e33-e59d71d75a17.png">

(Hi from Canada where the courts do the same thing and then get all high and mighty about being “progressive” and “rehabilitative” when the victims express their grievances)

Jimmycrackcrack@lemmy.ml on 18 Sep 06:32 next collapse

You know, as with a lot of these tech advances that impinge upon privacy and put us at risk in the name of profit, the buy-in, the thing they’re offering in exchange, IS actually pretty worthwhile. This is extremely useful. It’s such a shame that all this cool Star Trek shit that I would have been giddy about as a kid has been realised, but at a sinister and often hidden cost.

Is there any way this can be done on local metal? Would it achieve the same level of accuracy and sophistication in the progress notes? Because if this can be offered to the therapists who wanted it badly enough in the first place that they either knowingly or unwittingly sacrificed their patients’ privacy for it, maybe they can be given an alternative.

Novocirab@feddit.org on 18 Sep 19:53 next collapse

I think nothing stands in the way of doing this on local metal, as far as technology goes. Something similar has already been achieved for personal photo sorting, e.g. ente.io
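As a rough sketch of what that could look like (purely illustrative: it assumes a locally hosted model served through Ollama on the therapist’s own machine, and the model name, prompt, and file names are my own placeholders, not anything from the article), the whole loop can stay on local hardware:

```python
# Illustrative sketch of fully local note drafting: the transcript is read
# from disk, sent to a locally running Ollama server, and the draft notes
# are printed. Nothing leaves the machine. Model name, prompt, and file
# name are placeholders, not taken from any product mentioned in the article.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint

def draft_progress_note(transcript: str, model: str = "llama3") -> str:
    prompt = (
        "Summarise the following therapy session transcript into brief, "
        "neutral progress notes:\n\n" + transcript
    )
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    with open("session_transcript.txt") as f:
        print(draft_progress_note(f.read()))
```

Quality would obviously depend on the local model, but the point is that nothing technical forces the transcript or the notes to ever leave the building.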

ReversalHatchery@beehaw.org on 20 Sep 05:05 collapse

in your opinion what advantage would AI give to experienced therapists?

Jimmycrackcrack@lemmy.ml on 20 Sep 08:06 collapse

The specific application in this instance was that it creates “progress notes”. Admittedly, having only the information from the article and no background in this field myself, I can only make assumptions about what those are like, but as the name implies they chart a client’s progress through therapy, which suggests a lot of summarising of information gleaned during sessions. And since doing that for you would necessarily require creating a transcript, I guess it provides that too. The creators of the service portray this as tedious and time-consuming work, and they obviously have a vested interest in casting it in that light, but taken at its word, I’d say the advantage would be in automating some of the tedious and time-consuming aspects of the job.

As I suspect you were driving at with your question, there are a lot of ways this could go wrong. The privacy concerns are chief among them when the service is offered the way it is here: processing happens outside the therapist’s own clinic, by third parties, and the information is shared with additional parties and used for many purposes, with only a flimsy promise of “anonymisation” that appears to be hollow. It could also affect how the therapy is conducted: the tool makes choices about how to summarise the information, which will influence the decisions a therapist makes, and the therapist might have summarised it differently if they were writing the notes themselves. Then again, all of that hinges on how effective it actually turns out to be; if it can be evaluated and found to be generally good, then tentatively it seems like it could be a pretty helpful tool for a therapist.

But my comment was really directed at what feels like a sad state of affairs across the board with recent tech advances, generative AI included, wherever they’re applied: something that is often lost in these conversations is that the technology really shows promise or is quite impressive, but because of the manner of its development or the surveillance profit model behind it, it’s basically tainted and ruined. I often come across commentary that fails to distinguish between the negative aspects of how these technologies came about and are monetised, and the technology itself, where the latter is simply cast as inherently undesirable even though there’s clearly enough reason for people to find it appealing for it to end up in use in the first place.

ReversalHatchery@beehaw.org on 20 Sep 15:34 collapse

that’s plausible. in my opinion a therapist should make the effort to form their own interpretation of what has been said, instead of relying on a machine that digests the session in a uniform way. a patient’s words can mean a lot of things, depending even on things like their body language. but I have to admit I’m even more concerned about the privacy consequences you pointed out. data like that simply can’t go unabused, in my opinion. too tempting. I wouldn’t even want to run a business that merely stores it without abusing it; it’s too risky.

bathing_in_bismuth@sh.itjust.works on 18 Sep 09:00 next collapse

Haha, because of my notes AI will finally be defeated!

prex@aussie.zone on 18 Sep 12:56 next collapse

<img alt="" src="https://aussie.zone/pictrs/image/147772e8-fce1-45dd-988c-ea06fbfee484.gif">

StarMerchant938@lemmy.world on 18 Sep 13:12 next collapse

Haha, joke’s on them, I can’t AFFORD therapy. OR land a job with good insurance.

mondomon@lemmy.world on 18 Sep 20:22 next collapse

This is really bad. We don’t need Palantir getting a hold of our private health info. My damn doctor is even using this shit.

UltraGiGaGigantic@lemmy.ml on 19 Sep 03:10 next collapse

Don’t worry everyone, I’ve undiagnosed myself. Everything is okay. I’m fine. This is fine.

Someonelol@lemmy.dbzer0.com on 19 Sep 04:37 next collapse

Those who were too paranoid to visit a shrink were justified all along.

HiddenLayer555@lemmy.ml on 19 Sep 07:02 next collapse

This is why I self medicate with cannabis instead of going to therapy. Never have to worry about my dumbass high thoughts getting stolen. /s /s

DieserTypMatthias@lemmy.ml on 19 Sep 09:00 collapse

Thank god mine still uses a paper notebook.