Can really recommend LM Studio for local AI, with Phi-4 or Gemma as privacy-friendly, ChatGPT-like LLMs.
from tootnbuns@lemmy.dbzer0.com to privacy@lemmy.ml on 02 May 01:10
https://lemmy.dbzer0.com/post/43352159

Honestly, Phi-4 is just ~8GB quantized, which even the 2070 in my laptop can handle, and it's not that much worse than ChatGPT for most stuff.
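
Rough back-of-envelope (my own numbers, assuming Phi-4's ~14B parameters and a 4-bit quant) for why the file lands around that size:

```python
# Approximate size of a quantized model: params * bits-per-weight, plus some
# overhead for embeddings/metadata. Numbers here are ballpark, not exact.
def quantized_size_gb(params_billion: float, bits_per_weight: float, overhead: float = 1.1) -> float:
    """Rough on-disk / VRAM footprint of a quantized model in GB."""
    return params_billion * (bits_per_weight / 8) * overhead

if __name__ == "__main__":
    print(f"Phi-4 (~14B) at Q4: ~{quantized_size_gb(14, 4):.1f} GB")  # roughly 7-8 GB
    print(f"Gemma 27B at Q4:   ~{quantized_size_gb(27, 4):.1f} GB")   # bigger quants push toward ~20 GB
```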

LM Studio runs pretty well as an AppImage on my Linux boxes. On my workstation I run it with a 3090; the new Gemma model is around 20GB and has been a really good model in my experience so far.
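
If you want to use it from scripts instead of the chat UI, LM Studio can also expose a local OpenAI-compatible server (you start it from the app; the default address is http://localhost:1234/v1). A minimal sketch using the standard openai Python client, where the model name is just a placeholder for whatever identifier your LM Studio instance shows for the loaded model:

```python
from openai import OpenAI

# Point the client at LM Studio's local server instead of the OpenAI cloud.
# The API key isn't checked locally, so any non-empty string works.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

resp = client.chat.completions.create(
    model="phi-4",  # placeholder: use the model identifier LM Studio reports
    messages=[{"role": "user", "content": "Summarize why local LLMs help with privacy."}],
)
print(resp.choices[0].message.content)
```

Everything stays on your own machine, which is the whole point of the privacy angle.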

It’s also by Microsoft (disclaimer: absolutely fuck Microsoft), which owns ChatGPT, so it makes sense that it’s somewhat capable.

#privacy


commander@lemmy.world on 02 May 01:39

You ever try Alpaca?

flathub.org/apps/com.jeffser.Alpaca

Open source alternative to LM Studio for simple chat with local LLMs

joeldebruijn@lemmy.ml on 02 May 06:21

Microsoft owns ChatGPT … 🤔🙈