AI roleplay chatbot privacy thing? (please please don't judge me)
from MeowerMisfit817@lemmy.world to privacy@lemmy.ca on 09 Nov 18:15
https://lemmy.world/post/38564984

I was using perchance.org/ai-chat, but they've switched to a VERY crappy LLM now.

#privacy


wizardbeard@lemmy.dbzer0.com on 09 Nov 18:41

I’m not sure exactly what you’re asking for, but the only way to ensure privacy with LLMs is to self-host.

MeowerMisfit817@lemmy.world on 10 Nov 00:34

I heard about an open-source AI roleplay website, but…

6nk06@sh.itjust.works on 09 Nov 18:55

The cloud is someone else’s computer. Also we judge you.

Sxan@piefed.zip on 09 Nov 22:20

Not for the role-play, but for the AI use.

MeowerMisfit817@lemmy.world on 10 Nov 00:32

bro 🥺

Nelots@piefed.zip on 09 Nov 19:45

I see your previous post got deleted. I’m just going to paste my old comment here in case you didn’t see it. Feel free to ignore it if you did, I guess:

How good’s your computer? Running locally is always the best option, but an 8-13GB model is never going to be as good as the stuff you’d find hosted by major companies. But hey, no limits and it literally never leaves your PC.

You can find models on Hugging Face, and if you don’t know what you’re looking for, there’s a subreddit with a weekly discussion of enthusiasts’ favorite models. I don’t remember the sub’s name, but you should be able to find it easily enough with a Google search like “reddit weekly AI model thread”. Go to the poster’s profile and you’ll find all of the old threads you can read through for recommendations.
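To put the “8-13GB model” figure above in context, a rough rule of thumb (an approximation, not an exact formula) is that a quantized GGUF model takes about parameter-count × bits-per-weight ÷ 8 bytes, plus a gigabyte or two of headroom for the KV cache and runtime:

```python
def approx_model_size_gb(params_billion: float, bits_per_weight: float) -> float:
    """Rough on-disk / in-memory size of a quantized model.

    Rule of thumb: size ≈ parameters × (bits per weight / 8) bytes.
    Common 4-bit quants land around ~4.5 effective bits per weight;
    budget an extra 1-2 GB for the KV cache and runtime overhead.
    """
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A 13B model at ~4.5 bits per weight is roughly 7.3 GB on disk,
# so it's plausible on a 16 GB RAM machine (CPU inference) or a
# GPU with 10-12 GB of VRAM once you add overhead.
print(f"{approx_model_size_gb(13, 4.5):.1f} GB")  # → 7.3 GB
```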

MeowerMisfit817@lemmy.world on 10 Nov 00:34

I don’t have a computer yet.

yardratianSoma@lemmy.ca on 10 Nov 04:12

FYI, I don’t roleplay often, but when I do, it’s with a local LLM in LM Studio, for instance. OpenAI and the others already have too much; they don’t need to know my kinks too.

Offline AI has the benefit of being swappable, so if you don’t like the results, you can just swap in a new model.
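The swappability point above can be sketched in code. LM Studio (like llama.cpp's server and Ollama) can expose an OpenAI-compatible HTTP endpoint locally, so changing models is just changing one string in the request; the base URL and model name below are placeholders, and only the pure payload-building part is exercised here:

```python
import json
import urllib.request

def build_request(prompt: str, model: str) -> dict:
    """Build an OpenAI-style chat payload.
    Swapping models is just a matter of changing the `model` string."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt: str, model: str,
         base_url: str = "http://localhost:1234/v1") -> str:
    """POST one chat turn to a local OpenAI-compatible server.
    1234 is LM Studio's default port; adjust for other runtimes."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_request(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Trying a different model is then `chat("...", "some-other-model")` with no other code changes, and nothing ever leaves localhost.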