LocalAGI: run an AGI agent locally with ChatGLM-6B
**News**:

[2023/06/25] ChatGLM2-6B has been released. Compared with ChatGLM-6B, inference is 42% faster, and under INT4 quantization the dialogue length supported by 6 GB of GPU memory grows from 1K to 8K tokens.
Clone the repository:

git clone https://github.com/EmbraceAGI/LocalAGI

Install the Python dependencies:

cd LocalAGI
pip install -r requirements.txt
pip install fastapi uvicorn
Run chatglm_server.py to deploy the ChatGLM API service:

python chatglm_server.py

By default the service listens on port 8001 and is called via POST:
curl -X POST "http://127.0.0.1:8001" \
-H 'Content-Type: application/json' \
-d '{"prompt": "", "history": []}'
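The same call can be made from Python. The sketch below wraps the endpoint with the standard library only; it assumes the server replies with JSON containing "response" and "history" fields, as the stock ChatGLM api.py does — adjust the field names if chatglm_server.py differs.

```python
import json
import urllib.request

CHATGLM_URL = "http://127.0.0.1:8001"  # default port of chatglm_server.py


def build_payload(prompt, history=None):
    # Request body matching the curl example above.
    return {"prompt": prompt, "history": history or []}


def chat(prompt, history=None):
    # Assumption: the response JSON carries "response" (the reply text)
    # and "history" (the updated conversation), like ChatGLM's api.py.
    data = json.dumps(build_payload(prompt, history)).encode("utf-8")
    req = urllib.request.Request(
        CHATGLM_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["response"], body["history"]
```

Passing the returned history back into the next call keeps the conversation stateful on the client side.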
This completes the deployment of the ChatGLM API service.
Edit the .env file and set the relevant parameters:
LLM_MODEL=chatglm-6b # options: chatglm-6b / llama
# RUN CONFIG
OBJECTIVE= # the objective you want the agent to pursue
# For backwards compatibility
# FIRST_TASK can be used instead of INITIAL_TASK
INITIAL_TASK= # the first task to seed the task list
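The variables above drive the run. As a minimal sketch of how they could be consumed (the actual parsing inside local_agi.py may differ), note in particular the FIRST_TASK fallback mentioned in the comments:

```python
import os


def load_run_config(env=os.environ):
    """Read the run configuration from environment variables.

    Illustrative only; variable names match the .env file above.
    """
    llm_model = env.get("LLM_MODEL", "chatglm-6b")
    objective = env.get("OBJECTIVE", "")
    # FIRST_TASK is accepted as a fallback for INITIAL_TASK
    # (backwards compatibility, as noted in the .env comments).
    initial_task = env.get("INITIAL_TASK") or env.get("FIRST_TASK", "")
    return {
        "llm_model": llm_model,
        "objective": objective,
        "initial_task": initial_task,
    }
```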
Run the agent:

python local_agi.py
For the Chinese version, configure .env in the same way and run local_agi_zh.py instead:

python local_agi_zh.py
**Note: ChatGLM requires about 12 GB of GPU memory, and the text2vec-large-chinese embedding model about another 3 GB, roughly 15 GB in total. If your GPU has less memory, consider using a quantized ChatGLM model.**
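The budget in the note can be summed up as a quick back-of-the-envelope check. The numbers below are the rough estimates from this README (full-precision ChatGLM ~12 GB, embedding model ~3 GB, INT4 ~6 GB per the ChatGLM2-6B news item), not measurements:

```python
def required_vram_gb(quantized_int4=False):
    """Rough GPU memory budget in GB, using this README's estimates."""
    chatglm = 6 if quantized_int4 else 12  # INT4 figure from the news item
    embedding = 3  # text2vec-large-chinese
    return chatglm + embedding
```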