LocalAGI

LocalAGI: Locally run AGI powered by LLaMA, ChatGLM and more large language models.

APACHE-2.0 License



A locally running AGI powered by ChatGLM-6B.

**Updates**:

[2023/06/25] ChatGLM2-6B released, the second-generation version of ChatGLM-6B. Inference is 42% faster than the first generation, and with INT4 quantization, 6G of GPU memory supports dialogue lengths of up to 8K tokens (up from 1K).

Clone the repository:

git clone https://github.com/EmbraceAGI/LocalAGI

Install the Python dependencies:

cd LocalAGI
pip install -r requirements.txt

Start the ChatGLM API

First install the extra dependencies, then run the server:

pip install fastapi uvicorn

python chatglm_server.py
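chatglm_server.py wraps the ChatGLM model in a FastAPI app (hence the fastapi/uvicorn dependencies). As a rough, standard-library-only sketch of the request/response contract — the response field names are assumptions and the model call is replaced by an echo stub, so this is illustrative only:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def fake_chat(prompt, history):
    """Stand-in for the real ChatGLM model.chat() call."""
    response = f"echo: {prompt}"
    return response, history + [[prompt, response]]

class ChatHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Body format matches the curl example below: {"prompt": "...", "history": [...]}
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length) or b"{}")
        response, history = fake_chat(body.get("prompt", ""), body.get("history", []))
        # Response field names here are assumptions, not the real server's schema.
        payload = json.dumps({"response": response, "history": history, "status": 200})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload.encode("utf-8"))

    def log_message(self, fmt, *args):
        pass  # silence per-request logging

# To serve on port 8001 like chatglm_server.py (blocks forever):
# HTTPServer(("127.0.0.1", 8001), ChatHandler).serve_forever()
```

Passing the running history back with each request is what lets a stateless HTTP endpoint carry a multi-turn conversation.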

By default the API is served on local port 8001 and is called via POST:

curl -X POST "http://127.0.0.1:8001" \
     -H 'Content-Type: application/json' \
     -d '{"prompt": "", "history": []}'
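The same call can be made from Python with only the standard library. The helper name is hypothetical, and the shape of the returned JSON is an assumption based on the curl example above:

```python
import json
import urllib.request

def query_chatglm(prompt, history=None, url="http://127.0.0.1:8001"):
    """POST a prompt and conversation history to the ChatGLM API, return parsed JSON."""
    payload = json.dumps({"prompt": prompt, "history": history or []}).encode("utf-8")
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# With chatglm_server.py running:
# result = query_chatglm("Hello")
# print(result["response"])
```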

Run LocalAGI

With the ChatGLM API running, configure the .env file:

LLM_MODEL=chatglm-6b   # chatglm-6b / llama

# RUN CONFIG
OBJECTIVE=   # the overall objective of the run
# For backwards compatibility
# FIRST_TASK can be used instead of INITIAL_TASK
INITIAL_TASK=     # the first task to execute
Then start LocalAGI:

python local_agi.py
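The backwards-compatibility note in the .env above implies the config loader accepts either variable name. A minimal sketch of that fallback, using the variable names from the .env (the helper function itself is hypothetical):

```python
import os

def get_initial_task(default=""):
    """Read INITIAL_TASK, falling back to the legacy FIRST_TASK name."""
    return os.environ.get("INITIAL_TASK") or os.environ.get("FIRST_TASK") or default

# The legacy name still works on its own:
os.environ["FIRST_TASK"] = "Develop a task list"
assert get_initial_task() == "Develop a task list"

# But INITIAL_TASK takes precedence when both are set:
os.environ["INITIAL_TASK"] = "Make a plan"
assert get_initial_task() == "Make a plan"
```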

For the Chinese version, configure .env in the same way and run local_agi_zh.py instead:

python local_agi_zh.py
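The OBJECTIVE / INITIAL_TASK settings suggest a BabyAGI-style loop: work through a task queue toward the objective, one LLM call per task. A highly simplified sketch with the LLM call stubbed out — the structure and function names are illustrative assumptions, not the actual local_agi.py code:

```python
from collections import deque

def stub_llm(prompt):
    """Stand-in for a call to the local ChatGLM/LLaMA API."""
    return f"result for: {prompt}"

def run_agent(objective, initial_task, max_steps=3):
    """Work through a task queue toward the objective, one LLM call per task."""
    tasks = deque([initial_task])
    results = []
    for _ in range(max_steps):
        if not tasks:
            break
        task = tasks.popleft()
        result = stub_llm(f"Objective: {objective}\nTask: {task}")
        results.append((task, result))
        # A real agent would ask the LLM to propose follow-up tasks here
        # and reprioritize the queue; this sketch stops when the queue drains.
    return results

results = run_agent("write a haiku", "draft three lines")
```

With a real model behind stub_llm, the same loop runs entirely locally — which is the point of pairing the agent with the self-hosted ChatGLM API above.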

Tested environment:

  • Ubuntu 18.04
  • Python 3.8
  • GPU: 3090 Ti, CUDA 11+

**Note: ChatGLM needs about 12G of GPU memory and the text2vec-large-chinese embedding model about 3G, roughly 15G in total; if GPU memory is insufficient, a quantized version of ChatGLM can be used.**

Related Projects