A demonstration of tool integration with Ollama in a FastAPI application. This project demonstrates how to effectively connect external tools like weather and time services to an Ollama-powered chat interface.
This project demonstrates how to integrate tools with Ollama in a FastAPI application. It showcases three main pieces:

- A FastAPI server (`app.py`) that connects an Ollama-powered chat endpoint to external tools.
- Example tool integrations, such as weather and time services.
- A command-line client (`send_request.py`) for interacting with the server.
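Tool integration in this style generally means describing each tool to the model and dispatching the model's tool calls to real Python functions. The sketch below shows that dispatch step; `get_weather` and `get_time` are hypothetical stand-ins for the project's actual weather and time helpers, whose names and signatures may differ:

```python
from datetime import datetime, timezone

# Hypothetical tool implementations; the real project wires these up
# to actual weather and time services.
def get_weather(city: str) -> str:
    # Stub response; a real version would call a weather API.
    return f"The weather in {city} is sunny."

def get_time() -> str:
    # Returns the current UTC time as an ISO-8601 string.
    return datetime.now(timezone.utc).isoformat()

# Registry mapping tool names (as the model emits them) to callables.
TOOLS = {"get_weather": get_weather, "get_time": get_time}

def dispatch_tool_call(call: dict) -> str:
    """Execute one tool call of the form {'name': ..., 'arguments': {...}}."""
    func = TOOLS[call["name"]]
    return func(**call.get("arguments", {}))

if __name__ == "__main__":
    # Simulate a tool call as the model might request it.
    call = {"name": "get_weather", "arguments": {"city": "Melbourne"}}
    print(dispatch_tool_call(call))
```

The server's job is then just to forward the model's requested tool calls through a registry like this and feed the results back into the conversation.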
FastAPI and Requests:
You can install FastAPI and Requests using pip:

```shell
pip install fastapi requests
```
Ollama:
Follow the instructions on the Ollama GitHub repository to install Ollama itself.

To install the Ollama Python client library, use:

```shell
pip install ollama
```

Ensure that the `llama3.1` model is available. You can download it through Ollama's CLI (for example, `ollama pull llama3.1`) or the web interface.
```shell
git clone https://github.com/darcyg32/Ollama-Tools-Integration-with-FastAPI-Demo
cd Ollama-Tools-Integration-with-FastAPI-Demo
```
Using the Command-Line Script:
You can use `send_request.py` to interact with the FastAPI server. Here's how to use it:

```shell
python send_request.py
```

- You can customize the initial system instructions at the `# Initialize the conversation with a system message` comment.
- You can customize the initial user message at the `# Append the user's initial message to the conversation` comment.
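Conceptually, `send_request.py` builds an Ollama-style message list (a system message followed by the user's first message) and posts it to the FastAPI server. A minimal sketch using only the standard library; the `/chat` endpoint path and payload shape are assumptions for illustration, not the project's exact API:

```python
import json
import urllib.request

def build_conversation(system_prompt: str, user_message: str) -> list[dict]:
    # Initialize the conversation with a system message, then append
    # the user's initial message -- mirroring the comments in send_request.py.
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_message},
    ]

def send_chat(messages: list[dict], url: str = "http://localhost:8000/chat") -> str:
    # Hypothetical endpoint; check app.py for the real route and port.
    req = urllib.request.Request(
        url,
        data=json.dumps({"messages": messages}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()

if __name__ == "__main__":
    messages = build_conversation(
        "You are a helpful assistant with access to weather and time tools.",
        "What's the weather in Melbourne right now?",
    )
    print(send_chat(messages))
```

Editing the two arguments to `build_conversation` corresponds to the two customization points noted above.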
By default, the application expects your Ollama instance to be running at `http://localhost:11434`. Update the URL in `app.py` if your Ollama instance is hosted elsewhere.

For reference, this project was developed and tested on the following hardware:
This project is licensed under the MIT License - see the LICENSE file for details.