Supports both Intel and Apple Silicon Macs. Requires macOS 14+.
Easily configure multiple Ollama server connections, whether to a local Ollama server or a remote one.
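As a sketch of what a remote setup involves: by default, Ollama only listens on localhost, so a remote machine must expose its server on an external interface. This is standard Ollama configuration (via the `OLLAMA_HOST` environment variable), not something specific to Ollamac Pro:

```shell
# On the remote machine: bind Ollama to all interfaces
# (the default is 127.0.0.1:11434, which is local-only)
OLLAMA_HOST=0.0.0.0:11434 ollama serve
```

You can then add the server in Ollamac Pro using its address, e.g. `http://<remote-ip>:11434`.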
Ollamac Pro supports the latest Ollama Chat and Completion API, allowing you to interact with Ollama's latest models and features.
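For reference, this is roughly what a request to Ollama's chat endpoint (`POST http://localhost:11434/api/chat`) looks like. The endpoint and field names follow Ollama's public API; `llama3` is just an example model name:

```python
import json

# Payload for Ollama's /api/chat endpoint.
# "llama3" is an example; substitute any model you have pulled.
payload = {
    "model": "llama3",
    "messages": [
        {"role": "user", "content": "Why is the sky blue?"}
    ],
    "stream": False,  # request a single JSON response instead of a stream
}
body = json.dumps(payload)
```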
Use multi-modal models to describe and chat about your images. Ollamac Pro supports the latest multi-modal models, so you can bring images directly into your conversations.
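Under the hood, Ollama's chat and generate endpoints accept images as base64-encoded strings in an `images` list on a message. A minimal sketch (the byte string below is a placeholder standing in for a real image file):

```python
import base64

# Placeholder bytes; in practice you would use open("photo.png", "rb").read()
image_bytes = b"\x89PNG\r\n\x1a\n"
encoded = base64.b64encode(image_bytes).decode("ascii")

# A chat message with an attached image, per Ollama's API shape
message = {
    "role": "user",
    "content": "Describe this image.",
    "images": [encoded],
}
```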
Easily configure Ollama parameters such as the seed, temperature, top-k, and many more.
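These map onto the `options` object in Ollama's API. A short sketch of what such a payload looks like; the values shown are illustrative, not recommendations:

```python
# Sampling options as accepted under the "options" key of Ollama's API
options = {
    "seed": 42,          # fixed seed for reproducible sampling
    "temperature": 0.7,  # higher values produce more random output
    "top_k": 40,         # sample only from the 40 most likely tokens
}

# Example request body using these options ("llama3" is an example model name)
payload = {"model": "llama3", "prompt": "Hello", "options": options}
```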
Privacy is non-negotiable. Ollamac Pro follows a strict zero-tracking policy, keeping your data and activity confidential so you can focus on creating, worry-free. Your data is never sent to any server.
Ollamac Pro is built for efficiency and speed, catering to the professional developer's need for a fast, lightweight, and native application that enhances productivity without compromise.
Check out the roadmap page.
macOS Sonoma 14 or higher is recommended.
Windows is currently not supported. It is not currently on our roadmap.
We are currently in beta, so we're not on the App Store just yet.
Yes, we do! Just contact us by email.
Create a GitHub issue here: https://github.com/gregorym/ollamac-pro/issues/new, or send an email to greg@ollamac.com.