# Check if all Llama providers are the same

MIT License

This is a quick test to see whether all the new Llama 3.1 server providers return the same tokens, using Weights & Biases Weave Evaluations: https://wandb.me/weave
To get started with this project, you need to install the required dependencies. Follow the steps below:

1. Create a virtual environment (optional but recommended):

   ```
   python3 -m venv venv
   source venv/bin/activate  # On Windows use `venv\Scripts\activate`
   ```

2. Install the requirements:

   ```
   pip install -r requirements.txt
   ```

3. Set up environment variables:

   Copy the `.env.example` file to a new file named `.env`:

   ```
   cp .env.example .env
   ```

   Then open the `.env` file and add your secrets:
   - `WANDB_API_KEY`
   - `OPENROUTER_API_KEY`
   - `GROQ_API_KEY`
   - `TOGETHER_API_KEY`

   Your `.env` file should look something like this:

   ```
   WANDB_API_KEY=your_wandb_api_key_here
   OPENROUTER_API_KEY=your_openrouter_api_key_here
   GROQ_API_KEY=your_groq_api_key_here
   TOGETHER_API_KEY=your_together_api_key_here
   ```
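At runtime these values are read from the environment; projects like this typically use a library such as python-dotenv to load them from `.env`. As a minimal stdlib sketch of the same idea (the `load_env` helper below is illustrative, not the notebook's actual code):

```python
import os

def load_env(path=".env"):
    # Minimal stand-in for python-dotenv's load_dotenv():
    # reads KEY=value lines, skipping blanks and comments.
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition("=")
            # Don't overwrite variables already set in the environment.
            os.environ.setdefault(key.strip(), value.strip())

# Only load if the file exists (e.g. when running outside the repo).
if os.path.exists(".env"):
    load_env()
```

After loading, the API keys are available via `os.environ["WANDB_API_KEY"]` and friends.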
   Note: Make sure to keep your `.env` file private and never commit it to version control.
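One simple safeguard, assuming the project lives in a git repository, is to add `.env` to `.gitignore`:

```shell
# Keep .env out of version control (run once from the repo root)
echo ".env" >> .gitignore
```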
**Run the Python notebook `run_evals.ipynb`**
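The core check the notebook performs can be sketched independently of Weave: gather each provider's completion for the same prompt and test whether they agree exactly. The helper names below are hypothetical, not taken from the notebook:

```python
from collections import Counter

def exact_match(outputs: dict[str, str]) -> bool:
    # True only if every provider returned identical text.
    texts = list(outputs.values())
    return all(t == texts[0] for t in texts)

def disagreements(outputs: dict[str, str]) -> dict[str, str]:
    # Providers whose output differs from the most common completion.
    majority, _count = Counter(outputs.values()).most_common(1)[0]
    return {name: text for name, text in outputs.items() if text != majority}
```

An exact-match scorer like this is deliberately strict: even a one-character difference in capitalization or whitespace counts as disagreement, which is exactly what the token-level comparison is looking for.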