Chat with your favourite LLaMA models
LlamaChat allows you to chat with LLaMA, Alpaca, and GPT4All models, all running locally on your Mac.
- Vicuna (coming soon)
Convert models with ease
LlamaChat can import raw published PyTorch model checkpoints directly, or your pre-converted `.ggml` model files.
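If you prefer to convert a checkpoint yourself before importing it, llama.cpp has shipped a conversion script for this purpose. A minimal sketch, assuming a LLaMA 7B checkpoint laid out in `models/7B/` and the `convert.py` script from a llama.cpp checkout (script names and paths have changed across llama.cpp versions, so check the version you have):

```shell
# Clone llama.cpp, which bundles the conversion script (hypothetical paths).
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp

# Install the Python dependencies the converter needs.
pip install numpy sentencepiece

# Convert the raw PyTorch checkpoint in models/7B/ to a ggml-format file,
# which can then be imported into LlamaChat directly.
python convert.py models/7B/
```

Either way, LlamaChat handles the import itself, so conversion by hand is optional.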
LlamaChat is powered by open-source libraries including llama.cpp and llama.swift.
LlamaChat is 100% free and fully open-source, and always will be.
See something missing? Open a Pull Request.
Download for Mac
$ brew install --cask llamachat
v1.2.0 | Built for Intel processors and Apple Silicon | Requires macOS 13