This project is a robust and scalable multi-agent stock investment and analysis platform built using a Flask backend and a Next.js frontend, leveraging CrewAI for sophisticated multi-agent interactions.
Once you have successfully completed every step mentioned below, you can also play around with the demo:
https://github.com/user-attachments/assets/26c7ec92-955e-4a6e-98af-ac9a8404d9f3
The platform is designed to provide detailed insights and recommendations for stock investments by utilizing a multi-agent approach, enabling users to make informed decisions even with limited prior knowledge of stocks and market analysis.
```bash
cd Stock_investment_Analysis_Crew/Stock_Analyzer
```
4. Build both the frontend and backend with Docker Compose (the frontend and backend can also be built separately with `docker build` and their individual Dockerfiles):

```bash
docker-compose build
```
5. Run the build (run `docker-compose down` before recreating the build):

```bash
docker-compose up
```
```bash
git clone https://github.com/pravincoder/Stock_investment_Analysis_Agent.git
cd Stock_investment_Analysis_Agent/Stock_Analyzer
conda env create -f environment.yml
conda activate stock_guru
python app.py
```
```bash
cd nextjs_app
npx create-next-app@latest --js --tailwind --eslint
```

Keep every option at its default while setting up the app, then start the dev server:

```bash
npm run dev
```
Note:- You need to run both the backend Flask app and the frontend at the same time, as they communicate cross-origin via flask-cors.
.env File Setup

Create a `.env` file and add the code below (make sure to add the API keys from the specified platforms):
```bash
# API Key for GROQ
GROQ_API_KEY=your_groq_api_key_here

# API Key for OpenAI
OPENAI_API_KEY=your_openai_api_key_here

# Base URL for OpenAI API (default for local setup)
OPENAI_API_BASE=http://localhost:11434/v1

# Model name for OpenAI (adjust based on available models)
OPENAI_MODEL_NAME=mistral:latest

# Optional: API Key for LangChain
LANGCHAIN_API_KEY=your_langchain_api_key_here

# Optional: Enable LangChain tracing
LANGCHAIN_TRACING_V2=true
```
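A sketch of how the backend might read these settings (the variable names come from the `.env` above; the helper function and its fallback values are illustrative, not the project's actual code):

```python
import os

def load_settings():
    """Read the API settings from the environment (illustrative helper)."""
    return {
        "groq_api_key": os.getenv("GROQ_API_KEY"),  # required for Groq calls
        "openai_api_base": os.getenv("OPENAI_API_BASE", "http://localhost:11434/v1"),
        "openai_model_name": os.getenv("OPENAI_MODEL_NAME", "mistral:latest"),
    }
```

In practice a library such as python-dotenv is commonly used to load the `.env` file into the process environment before reading it.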
Use the files in the `setup` folder to create a Llama 3, Mistral 8B, ... or your own LLM model.

Commands/steps to set up the Ollama LLM are coming soon!
Once the servers are running, you can access the application in your browser by navigating to http://localhost:3000.
The Flask backend exposes several API endpoints that can be used for data retrieval and interaction:
- `POST /analyze-stock`: Analyze a stock and return the Analysis and Investment reports in Markdown format. (Detailed documentation for each endpoint can be provided here; in the near future we may offer separate API endpoints for the two reports.)
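A hypothetical client call against that endpoint (the JSON field name `stock`, the backend port 5000, and the response shape are assumptions, not documented behavior):

```python
import requests

def analyze_stock(ticker, base_url="http://localhost:5000"):
    """POST a ticker to the backend and return the Markdown reports (sketch)."""
    resp = requests.post(
        f"{base_url}/analyze-stock",
        json={"stock": ticker},  # field name assumed
        timeout=300,             # multi-agent runs can take minutes
    )
    resp.raise_for_status()
    return resp.text
```

Calling `analyze_stock("AAPL")` while the Flask server is up would return the combined Markdown reports as a string.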
We welcome contributions! To get started:
1. Create a feature branch (`git checkout -b feature-name`).
2. Commit your changes (`git commit -m 'Add some feature'`).
3. Push the branch (`git push origin feature-name`).

Please ensure your code adheres to the project's coding standards and passes all tests.
This project is licensed under the MIT License - see the LICENSE file for details.
For any questions, issues, or suggestions, please contact: