Note
- Env
- install Python
- install Anaconda3 or Miniconda
- conda update conda
- conda install pip
- install Git and Git LFS
- go to https://pytorch.org/get-started/locally/ and pick your platform/CUDA version to get the install command for the next step
- open an Anaconda PowerShell prompt
- pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
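  A quick sanity check that the CUDA build (not the CPU wheel) was installed; this is just a sketch, assuming the cu118 wheel from the command above:

  ```python
  # Sanity check that the CUDA 11.8 build of PyTorch is in use.
  import torch

  print(torch.__version__)          # expected to end in +cu118 for the cu118 wheel
  print(torch.cuda.is_available())  # True once the GPU driver is visible to PyTorch
  ```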
- create a working directory for everything (e.g. GPT) and clone the following repos into it
- Langchain-Chatchat: git clone https://github.com/chatchat-space/Langchain-Chatchat.git
- chatglm2-6b: git clone https://huggingface.co/THUDM/chatglm2-6b
- m3e-base: git clone https://huggingface.co/moka-ai/m3e-base
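  The two model repos rely on Git LFS, so a clone can silently end up with small pointer files instead of real weights. A rough check, assuming the repos were cloned into the GPT directory created above (the *.bin / *.safetensors patterns are an assumption about the usual Hugging Face layout):

  ```python
  # Rough check that Git LFS actually pulled the model weights
  # (LFS pointer files are only a few hundred bytes).
  from pathlib import Path

  GPT_DIR = Path("GPT")  # adjust to your actual directory

  for repo in ("chatglm2-6b", "m3e-base"):
      repo_dir = GPT_DIR / repo
      weights = list(repo_dir.glob("*.bin")) + list(repo_dir.glob("*.safetensors"))
      total_mb = sum(f.stat().st_size for f in weights) / 1e6
      print(f"{repo}: {len(weights)} weight file(s), {total_mb:.0f} MB")
      if total_mb < 100:
          print(f"  -> weights look missing; run 'git lfs pull' inside {repo_dir}")
  ```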
- Langchain-Chatchat/configs
- copy each *.py.example to *.py
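  A small helper to do that copy in one go; the configs path is an assumption about where you run it from:

  ```python
  # Copy every *.py.example in configs/ to the matching *.py (skip existing files).
  import shutil
  from pathlib import Path

  cfg = Path("Langchain-Chatchat/configs")
  for example in cfg.glob("*.example"):
      target = example.with_suffix("")  # drops the trailing ".example"
      if not target.exists():
          shutil.copy(example, target)
          print(f"copied {example.name} -> {target.name}")
  ```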
- modify server_config.py
    DEFAULT_BIND_HOST = "your ip"
- modify model_config.py (see the config sketch below)
    find and modify: embedding_model_dict = { "m3e-base": "/your_path/GPT/m3e-base", ... }
    check: EMBEDDING_MODEL = "m3e-base"
    find and modify: llm_model_dict = { "chatglm2-6b": { "local_model_path": "/your_path/GPT/chatglm2-6b", ... } }
    check and modify: LLM_MODEL = "chatglm2-6b"
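  Putting the edits above together, the touched entries end up roughly like the sketch below. This only illustrates the fields named above; /your_path is a placeholder and every other key should stay as it appears in the *.example files:

  ```python
  # server_config.py (excerpt)
  DEFAULT_BIND_HOST = "your ip"  # e.g. "0.0.0.0" to listen on all interfaces

  # model_config.py (excerpt) -- only the entries changed above are shown
  embedding_model_dict = {
      "m3e-base": "/your_path/GPT/m3e-base",  # point at the local clone
      # ... other embedding entries unchanged ...
  }
  EMBEDDING_MODEL = "m3e-base"

  llm_model_dict = {
      "chatglm2-6b": {
          "local_model_path": "/your_path/GPT/chatglm2-6b",  # point at the local clone
          # ... other keys for this entry unchanged ...
      },
      # ... other LLM entries unchanged ...
  }
  LLM_MODEL = "chatglm2-6b"
  ```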
- install Langchain-Chatchat and run it (in the Anaconda PowerShell prompt)
    cd Langchain-Chatchat
    conda create -n Chatchat python==3.10
    conda activate Chatchat
    pip3 install -r requirements.txt
    python init_database.py --recreate-vs (first run only)
- run Langchain-Chatchat
    python startup.py -a (run all; on the first run, enter an email address to activate the service)
- Check
- firewall: allow python
- firewall: allow inbound 8501/TCP (see the port probe below)
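  Once python startup.py -a is running, a plain TCP probe of port 8501 (the port opened above, commonly the Streamlit WebUI port) confirms the firewall rules took effect. A minimal sketch, assuming you replace the host placeholder with your server's address:

  ```python
  # Minimal TCP probe of the port opened in the firewall step above.
  import socket

  HOST, PORT = "127.0.0.1", 8501  # replace HOST with your DEFAULT_BIND_HOST / server IP

  try:
      with socket.create_connection((HOST, PORT), timeout=3):
          print(f"port open -- try http://{HOST}:{PORT} in a browser")
  except OSError as exc:
      print(f"cannot reach {HOST}:{PORT}: {exc}")
  ```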
- INFO
    https://zhuanlan.zhihu.com/p/645807338?utm_id=0
    https://zhuanlan.zhihu.com/p/652530137
    https://zhuanlan.zhihu.com/p/617208490
    https://zhuanlan.zhihu.com/p/630049721?utm_id=0