llama.cpp-based AI chat app for macOS
Chat with LLMs on your Mac without installing any other software. Every conversation is saved locally, and all conversations happen offline.
https://github.com/psugihara/FreeChat/assets/282016/fd546e39-7657-4ccd-a44f-0b872547a629
Join the TestFlight beta: https://6032904148827.gumroad.com/l/freechat-beta
Or download on the Mac App Store: https://apps.apple.com/us/app/freechat/id6458534902
Or build from source via “Archive” in Xcode after completing dev setup below.
The main goal of FreeChat is to make open, local, private models accessible to more people.
FreeChat is a native LLM appliance for macOS that runs completely locally. Download it and ask your LLM a question without doing any configuration. It's a local, llama.cpp-powered take on OpenAI's chat, with no login or tracking. You should be able to install it from the Mac App Store and use it immediately.
Once you’re up and running, it’s fun to try different models in FreeChat. The AI training community is releasing new models basically every day. FreeChat is compatible with any gguf formatted model that llama.cpp works with.
Models are usually named with their parameter count (e.g. 7B) and are distributed with different levels of lossy compression applied (quantization). The general rule of thumb is that models with more parameters tend to be slower but wiser, and heavier quantization makes a model dumber.
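As a rough illustration of that naming convention, here is a small sketch that pulls the parameter count and quantization level out of a typical community gguf filename. The filename used below is hypothetical, and naming varies between uploaders, so treat this as a heuristic rather than a spec:

```python
import re

def parse_model_name(filename):
    """Extract the parameter count (e.g. 7B) and quantization level
    (e.g. Q4_K_M) from a typical community gguf filename.
    Naming conventions vary, so both fields may come back as None."""
    params = re.search(r"(\d+(?:\.\d+)?)[bB]", filename)
    quant = re.search(r"(Q\d_[A-Z0-9_]+|Q\d)", filename)
    return (
        params.group(1) + "B" if params else None,
        quant.group(1) if quant else None,
    )

# Hypothetical filename following a common community pattern:
print(parse_model_name("llama-2-7b-chat.Q4_K_M.gguf"))  # → ('7B', 'Q4_K_M')
```

Reading the two fields this way is handy when comparing downloads: a 13B Q2 file and a 7B Q8 file can be similar sizes on disk but behave quite differently.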
To find models, try Hugging Face. Most models have a linked "model card" from the author that describes the model's training and abilities.
Contributions are very welcome. Let’s make FreeChat simple and powerful.
This project would not be possible without the hard work of:
Also many thanks to Billy Rennekamp, Elliot Hursh, Tomás Savigliano, Judd Schoenholtz, Alex Farrill for invaluable spitballing sessions.