Your gateway to both Ollama & Apple MLX models
Inspired by the Ollama and Apple MLX projects, and frustrated by the dependency on external applications like Bing, ChatGPT, etc., I wanted my own personal chatbot as a native macOS application. Sure, there are alternatives like Streamlit or Gradio (which are browser-based) and others like Ollamac, LM Studio, MindMac, etc., which are good but restrictive in some way (by license, price, or versatility). I also wanted to enjoy both Ollama models (based on llama.cpp) and MLX models (which are suitable for image generation, audio generation, etc., and heck, I own a Mac with Apple silicon) through a single uniform interface.
All this led to this project (PyOllaMx) and a sister project called PyOMlx.
I'm using these in my day-to-day workflow and I intend to keep developing them for my own use and benefit.
If you find them valuable, feel free to use them and contribute to this project as well. Please star this repo to show your support and make my day!
I'm planning to work on the next items listed in roadmap.md. Feel free to comment your thoughts (if any) and influence my work (if interested).
macOS DMGs are available in Releases
PyOllaMx: a chatbot application capable of chatting with both Ollama and Apple MLX models. For this app to function, it needs both the Ollama and PyOMlx macOS apps running. These two apps serve their respective models on localhost for PyOllaMx to chat with.
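Under the hood, Ollama exposes a documented REST API on localhost (port 11434 by default), which is the kind of interface a chat client like PyOllaMx can call. Below is a minimal sketch using only the Python standard library; the endpoint and payload shape follow Ollama's public API docs, but PyOllaMx's actual internals may differ, and PyOMlx's MLX endpoint is app-specific so it is not shown here.

```python
# Minimal sketch of chatting with a locally served Ollama model via its
# documented REST API. Assumes Ollama is running on the default port.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_request(model: str, prompt: str) -> bytes:
    """Build the JSON body for a non-streaming Ollama chat call."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # one complete response instead of streamed chunks
    }
    return json.dumps(payload).encode("utf-8")

def chat(model: str, prompt: str) -> str:
    """Send one chat turn and return the assistant's reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_chat_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

if __name__ == "__main__":
    print(chat("mistral", "Why is the sky blue?"))
```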
PyOMlx: a macOS app capable of discovering, loading, and serving Apple MLX models downloaded from the Apple MLX Community repo on Hugging Face.
ollama pull <model name>
ollama pull mistral
This command downloads Ollama models to a location known to PyOllaMx.
[!TIP] As of PyOllaMx v0.0.4, you can download & manage ollama models right within PyOllaMx's ModelHub. Check the v0.0.4 release page for more details
Use the Hugging Face CLI:
pip install huggingface_hub hf_transfer
export HF_HUB_ENABLE_HF_TRANSFER=1
huggingface-cli download mlx-community/CodeLlama-7b-Python-4bit-MLX
This command downloads MLX models to a location known to PyOllaMx.
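That "known location" is the Hugging Face hub cache, which `huggingface-cli` writes to by default: `~/.cache/huggingface/hub`, overridable via the `HF_HOME` or `HF_HUB_CACHE` environment variables, with each repo stored as a `models--<org>--<name>` snapshot directory. A sketch of resolving it, assuming the apps scan this standard cache:

```python
# Sketch: resolving the Hugging Face hub cache directory where
# huggingface-cli places downloaded model snapshots.
import os
from pathlib import Path

def hf_hub_cache_dir() -> Path:
    """Resolve the HF hub cache directory, honoring env overrides."""
    if "HF_HUB_CACHE" in os.environ:
        return Path(os.environ["HF_HUB_CACHE"])
    hf_home = Path(os.environ.get("HF_HOME",
                                  Path.home() / ".cache" / "huggingface"))
    return hf_home / "hub"

def repo_cache_dir(repo_id: str) -> Path:
    """Directory for one repo's snapshots, e.g. models--org--name."""
    return hf_hub_cache_dir() / ("models--" + repo_id.replace("/", "--"))
```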
Now you can download Ollama models right within PyOllaMx's Model Hub tab. You can also inspect existing models and delete models right within PyOllaMx instead of using the Ollama CLI. This greatly simplifies the user experience. And before you ask: yes, I'm working to bring similar functionality for MLX models from the Hugging Face hub. Please stay tuned.
Click the release version link above to view demo gifs explaining the features.
Limitations:
Click the release version link above to view demo gifs explaining the features.