

Tuesday, August 27, 2024

Converting a Hugging Face model to GGUF

Conda

  • conda create -n gguf  # one time
  • conda activate gguf
  • pip install huggingface_hub

Create a working directory

  • mkdir convert  # one time
  • cd convert

Clone the project and update

  • git clone https://github.com/ggml-org/llama.cpp.git  # one time
  • To update later: cd llama.cpp
    • git pull origin
  • In the conda environment: pip install -r requirements.txt

Hugging Face access token

  • Go to Hugging Face and create an access token.
  • In the conda environment, log in and enter the access token when prompted:
    huggingface-cli login

Model

  • In the convert directory: mkdir model  # one time
  • Create a file named download.py in the model directory  # one time
    from huggingface_hub import snapshot_download
    model_id = "[hf model_name like xxx/xxxxxx]"
    local_dir = "model/model_name"
    snapshot_download(repo_id=model_id, local_dir=local_dir, local_dir_use_symlinks=False, revision="main")
  • For each new model, edit ./model/download.py and set:
    model_id="[hf model_name like xxx/xxxxxx]"
    local_dir="model/model_name"
  • In the conda environment: python ./model/download.py
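As a sketch, the download script can also take the model id on the command line instead of being edited for each model; the argument names and default directory layout here are my own choice, not part of the original notes:

```python
import argparse

def parse_args(argv=None):
    # Parse the model id and target directory from the command line
    # instead of hard-coding them in the script.
    p = argparse.ArgumentParser(description="Download a HF model snapshot")
    p.add_argument("model_id", help="e.g. Qwen/Qwen2-1.5B-Instruct")
    p.add_argument("--local-dir", default=None,
                   help="defaults to model/<repo name>")
    args = p.parse_args(argv)
    if args.local_dir is None:
        # Use the part after the "/" of the repo id as the folder name.
        args.local_dir = "model/" + args.model_id.split("/")[-1]
    return args

def download(args):
    # Imported here so the script needs huggingface_hub only when
    # actually downloading.
    from huggingface_hub import snapshot_download
    snapshot_download(repo_id=args.model_id, local_dir=args.local_dir,
                      revision="main")
```

To run it from the command line, call `download(parse_args())` under an `if __name__ == "__main__":` guard, e.g. `python download.py Qwen/Qwen2-1.5B-Instruct`.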

Example: download a model from Hugging Face (set model_id and local_dir)

  • Create download.py with the following and save it:
    from huggingface_hub import snapshot_download
    model_id = "Qwen/Qwen2-1.5B-Instruct"
    snapshot_download(repo_id=model_id, local_dir="Qwen2-1.5B-Instruct", local_dir_use_symlinks=False, revision="main")
  • python download.py

Convert

  • python ./llama.cpp/convert_hf_to_gguf.py -h
  • python ./llama.cpp/convert_hf_to_gguf.py ./model/[model_name]  --outtype q8_0 --verbose --outfile ./[model_name]-q8_0.gguf
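The convert command above can be sketched as a small helper that assembles the invocation; `build_convert_cmd` and its defaults are my own naming, not part of llama.cpp, and the paths assume the layout from these notes (convert/llama.cpp, convert/model):

```python
import subprocess

def build_convert_cmd(model_name, outtype="q8_0"):
    # Assemble the convert_hf_to_gguf.py command line for a model
    # directory under ./model, writing the .gguf next to the script.
    outfile = f"./{model_name}-{outtype}.gguf"
    return [
        "python", "./llama.cpp/convert_hf_to_gguf.py",
        f"./model/{model_name}",
        "--outtype", outtype,
        "--verbose",
        "--outfile", outfile,
    ]

# To actually run it (requires the cloned llama.cpp and a downloaded model):
# subprocess.run(build_convert_cmd("Qwen2-1.5B-Instruct"), check=True)
```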

Ollama

  • cd .ollama/models
  • Create a Modelfile named [model_name] containing:
    FROM C:\path\convert\model_name.gguf
  • Run: ollama create [model_name] -f ./[model_name]
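A minimal Modelfile sketch for the import step; the path and the temperature value are placeholders of my own, not from the original notes:

```
FROM ./Qwen2-1.5B-Instruct-q8_0.gguf
PARAMETER temperature 0.7
```

After `ollama create`, the model can be tried with `ollama run [model_name]`.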
Updated on 2025/8/28