LLM Agent Routing Project: This project implements a routing system for user queries using LangChain and Groq's Gemma2-9b-It model. The system intelligently routes questions to arXiv, Wikipedia, or the LLM itself, giving efficient, targeted responses for AI research questions, questions about people, and general queries.
Built on the LangChain framework and Groq's Gemma2-9b-It model, the router directs each query to the most relevant data source: arxiv_search (for AI research papers), wiki_search (for information about people), or the LLM itself (for general queries).
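To illustrate the routing decision, here is a minimal stdlib-only sketch. In the real project an LLM with structured output chooses the datasource; the keyword heuristic and the `RouteQuery` schema below are stand-ins (illustrative assumptions, not the project's actual code) so the flow can be shown without an API key.

```python
from dataclasses import dataclass

@dataclass
class RouteQuery:
    # One of "arxiv_search", "wiki_search", or "llm"
    datasource: str

# Illustrative keyword list standing in for the LLM's judgement.
RESEARCH_TERMS = {"agent", "memory", "transformer", "prompt", "rag"}

def route(question: str) -> RouteQuery:
    """Pick a datasource for a question (heuristic stand-in for the LLM router)."""
    words = set(question.lower().replace("?", "").split())
    if words & RESEARCH_TERMS:
        return RouteQuery(datasource="arxiv_search")
    if question.lower().startswith("who is"):
        return RouteQuery(datasource="wiki_search")
    return RouteQuery(datasource="llm")

print(route("Who is Shahrukh Khan?").datasource)                # wiki_search
print(route("What are the types of agent memory?").datasource)  # arxiv_search
```

In the actual system this decision is made by prompting Gemma2-9b-It to emit a structured `datasource` field, but the dispatch shape is the same.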
Clone the repository:
git clone https://github.com/MadhanMohanReddy2301/SmartChainAgents.git
Navigate to the project directory:
cd SmartChainAgents
Install the required dependencies:
pip install -r requirements.txt
Set up your environment variables by adding your Groq API key:
export GROQ_API_KEY=your_groq_api_key
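The scripts read this key from the environment at startup. A small helper like the following (a sketch; the project may read the variable directly) fails fast with a clear message when the key is missing:

```python
import os

def require_groq_key(env=os.environ):
    """Return the Groq API key, or raise a clear error if it is not set."""
    key = env.get("GROQ_API_KEY")
    if not key:
        raise RuntimeError(
            "GROQ_API_KEY is not set; export it before running graph.py"
        )
    return key
```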
Initialize the LLM router system by running the script:
python graph.py
Test the routing functionality with some example queries:
question_router.invoke({"question": "Who is Shahrukh Khan?"})
question_router.invoke({"question": "What are the types of agent memory?"})
The router returns a decision naming the data source to use, as in the examples below.
# Example routing result for a Wikipedia-style question
{
  "datasource": "wiki_search"
}
# Example routing result for an AI research question
{
  "datasource": "arxiv_search"
}
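Once the router has produced one of these results, the graph dispatches on the `datasource` field. A minimal sketch of that dispatch (the handler functions here are illustrative placeholders, not the project's actual tools):

```python
def wiki_search(question: str) -> str:
    # Placeholder for the real Wikipedia tool call.
    return f"[wikipedia results for: {question}]"

def arxiv_search(question: str) -> str:
    # Placeholder for the real arXiv tool call.
    return f"[arxiv results for: {question}]"

def llm_answer(question: str) -> str:
    # Placeholder for a direct LLM completion.
    return f"[llm answer for: {question}]"

HANDLERS = {
    "wiki_search": wiki_search,
    "arxiv_search": arxiv_search,
    "llm": llm_answer,
}

def dispatch(route: dict, question: str) -> str:
    """Send the question to the handler named by the router's output."""
    handler = HANDLERS.get(route["datasource"], llm_answer)
    return handler(question)

print(dispatch({"datasource": "wiki_search"}, "Who is Shahrukh Khan?"))
```

Unknown datasource values fall back to the LLM handler, so a malformed routing decision still yields an answer.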