ollama-docker-inference

Run Ollama inference in Docker behind NGINX Proxy Manager, secured with a custom API key (just like OpenAI).
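
Since clients are meant to connect just as they would to OpenAI, a minimal sketch of a client call could look like the following. It assumes NGINX Proxy Manager exposes Ollama's OpenAI-compatible /v1 endpoint at a hypothetical host (ollama.example.com) and validates the custom API key sent in the Authorization header; the host, key, and model name are placeholders, not values from this project.

```python
from openai import OpenAI

# Hypothetical proxied endpoint and custom key configured in NGINX Proxy Manager.
client = OpenAI(
    base_url="https://ollama.example.com/v1",
    api_key="my-custom-api-key",
)

# Standard OpenAI-style chat completion, served by Ollama behind the proxy.
response = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```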

Apache-2.0 license
