
Monday, September 16, 2024

Use Axolotl for training

Env

  • no internet access
  • qlora_root: C:\qlora
  • gguf_root: C:\gguf

Dataset 

  • c:/qlora/output_dataset/instruction_dataset.parquet

PS:Axolotl

  • docker run --gpus '"all"' --rm -it winglian/axolotl:main-latest

PS:Dataset

  • docker cp C:\qlora\output_dataset\instruction_dataset.parquet container_name:/workspace/axolotl/examples/instruction_dataset.parquet
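
container_name in the copy commands is the running Axolotl container started above; a quick way to look it up (plain docker, nothing Axolotl-specific):

  • docker ps --format "{{.Names}}"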

PS:LLM

  • docker cp C:\Meta-Llama-3.1-8B container_name:/workspace/axolotl/examples/Meta-Llama-3.1-8B/

Axolotl:Qlora

  • open ./examples/llama-3/qlora.yml and update the model path and the parquet dataset path (see the sketch below)
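
A sketch of the keys to change, assuming the container paths from the docker cp steps above; type: alpaca is an assumption about the instruction format, adjust it to match the parquet columns:

    base_model: ./examples/Meta-Llama-3.1-8B
    datasets:
      - path: ./examples/instruction_dataset.parquet
        ds_type: parquet
        type: alpaca   # assumption: change to the dataset's actual prompt format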

Axolotl:Training

  • CUDA_VISIBLE_DEVICES="" python -m axolotl.cli.preprocess examples/llama-3/qlora.yml
  • accelerate launch -m axolotl.cli.train examples/llama-3/qlora.yml

Axolotl:Test (needs internet)

  • accelerate launch -m axolotl.cli.inference examples/llama-3/qlora.yml --lora_model_dir="./outputs/qlora-out" --gradio

Axolotl:Merged

  • python3 -m axolotl.cli.merge_lora examples/llama-3/qlora.yml --lora_model_dir="./outputs/qlora-out"

PS:Export

  • docker cp container_name:/workspace/axolotl/outputs/qlora-out/merged C:\merged

GGUF

  • convert the merged model to merged_f16.gguf with llama.cpp (see the sketch below and the conversion notes dated August 27)
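
A sketch of this step, reusing the llama.cpp setup from the August 27 notes below; the input C:\merged comes from the export step above, and writing the output under gguf_root is an assumption:

  • python ./llama.cpp/convert_hf_to_gguf.py C:\merged --outtype f16 --verbose --outfile C:\gguf\merged_f16.gguf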

Ollama

  • ollama run merged (the model has to be created from the GGUF first, see the sketch below)
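
A minimal sketch of that registration, assuming merged_f16.gguf from the GGUF step above sits in the current directory; the model name merged matches the run command:

  • create file:Modelfile and save
    # path assumes the GGUF is next to the Modelfile
    FROM ./merged_f16.gguf
  • ollama create merged -f Modelfile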


Friday, September 6, 2024

unsloth

Conda

  • conda install -c conda-forge xformers
  • pip install xformers
  • conda config --add channels conda-forge
  • conda update conda

Env

  • conda create --name unsloth python=3.11 pytorch-cuda=12.1 pytorch cudatoolkit -c pytorch -c nvidia -y
  • conda activate unsloth
  • pip install xformers

Install

  • pip install "unsloth[colab-new] @ git+https://github.com/unslothai/unsloth.git"
  • pip install --no-deps trl peft accelerate bitsandbytes

Tuesday, August 27, 2024

Convert a Hugging Face model to GGUF

config env

  • mkdir model and cd model
  • conda create -n gguf
  • conda activate gguf
  • pip install huggingface_hub

download the model from HF: set model_id and local_dir

  • create file:download.py and save
    from huggingface_hub import snapshot_download

    # download the full model snapshot into a local directory (no symlinks, so the files are portable)
    model_id = "Qwen/Qwen2-1.5B-Instruct"
    snapshot_download(repo_id=model_id, local_dir="Qwen2-1.5B-Instruct", local_dir_use_symlinks=False, revision="main")
  • python download.py

config llama.cpp

  • cd ..
  • git clone https://github.com/ggerganov/llama.cpp.git
  • cd llama.cpp
  • pip install -r requirements.txt

run python llama.cpp/convert_hf_to_gguf.py -h to see the available options

convert

  • cd ..
    check the path to the model directory
  • #full precision: f16 or f32
    python ./llama.cpp/convert_hf_to_gguf.py ./model/Qwen2-1.5B-Instruct --outtype f32 --verbose --outfile ./Qwen2-1.5B-Instruct_f32.gguf
  • #this script's --outtype accepts f32/f16/bf16/q8_0; the k-quants (q2_k, q3_k_l/q3_k_m/q3_k_s, q4_k_m/q4_k_s, q5_k_m/q5_k_s, q6_k) need a separate llama-quantize pass, see the sketch below
    python ./llama.cpp/convert_hf_to_gguf.py ./model/Qwen2-1.5B-Instruct --outtype q8_0 --verbose --outfile ./Qwen2-1.5B-Instruct_q8_0.gguf
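
For the k-quant types a separate pass is needed; a sketch, assuming llama.cpp has been compiled so the llama-quantize binary exists (it is not provided by the pip requirements above, and its location depends on the build):

  • ./llama.cpp/llama-quantize ./Qwen2-1.5B-Instruct_f32.gguf ./Qwen2-1.5B-Instruct_q4_k_m.gguf Q4_K_M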

Friday, August 16, 2024

Perplexica update

Check

  • docker volume ls | findstr perplexica_backend-dbstore
  • docker run -it --rm -v perplexica_backend-dbstore:/data busybox ls -l /data

Backup

  • docker run --rm -v perplexica_backend-dbstore:/data -v C:/backup:/backup busybox tar cvf /backup/perplexica_backend-dbstore.tar /data
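
An optional check of the backup itself: listing the archive contents confirms the tar was written (same backup path as above):

  • docker run --rm -v C:/backup:/backup busybox tar tvf /backup/perplexica_backend-dbstore.tar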

Remove

  • stop and remove the old container: docker stop container_name then docker rm container_name

Update

  • go to the Perplexica directory and run git pull origin master
  • check config.toml and docker-compose.yaml

ReBuild

  • go to the Perplexica directory and run docker compose up -d --build

Recovery

  • #the archive members start with data/, so extract at / (with -C /data the files would end up nested in /data/data)
    docker run --rm -v perplexica_backend-dbstore:/data -v C:/backup:/backup busybox tar xvf /backup/perplexica_backend-dbstore.tar -C /

Check

  • docker run -it --rm -v perplexica_backend-dbstore:/data busybox ls -l /data

AnythingLLM update (updated 2024.09.23)

Get-New

  • docker pull mintplexlabs/anythingllm

Remove

  • stop and remove the old container: docker stop container_name then docker rm container_name

Run

  • docker run -d -p 3001:3001 --cap-add SYS_ADMIN -v C:/anythingllm/storage:/app/server/storage -v C:/anythingllm/env.txt:/app/server/.env -e STORAGE_DIR="/app/server/storage" --add-host=host.docker.internal:host-gateway --name anythingllm mintplexlabs/anythingllm
---------------------------------------
update by pulling from GitHub and building with Docker
  • git clone https://github.com/Mintplex-Labs/anything-llm
  • copy ./docker to ./anythingllm
  • go into ./anything-llm/anythingllm and rename .env.example to .env
  • PS> docker compose up -d (--build)
change port number
  • .env
    SERVER_PORT=3001
  • docker-compose.yml
    ports:
    - "3001:3001"
change name
  • docker-compose.yml
    name: anythingllm
    container_name: anythingllm


Chroma update

Backup

  • docker volume ls
  • docker run --rm -v chroma_chroma-data:/data -v C:/backup:/backup busybox tar cvf /backup/chroma_chroma-data.tar /data

Remove

  • stop and remove the old container: docker stop container_name then docker rm container_name

Get-New

  • get new version from https://github.com/chroma-core/chroma/releases
  • docker pull ghcr.io/chroma-core/chroma:0.5.4
  • #mount the existing data volume so the restored data is visible; /chroma/chroma as the in-container path is an assumption based on the compose defaults
    docker run -p 8000:8000 -v chroma_chroma-data:/chroma/chroma ghcr.io/chroma-core/chroma:0.5.4

Recovery

  • #as with the Perplexica restore, the archive members start with data/, so extract at /
    docker run --rm -v chroma_chroma-data:/data -v C:/backup:/backup busybox tar xvf /backup/chroma_chroma-data.tar -C /
  • docker run -it --rm -v chroma_chroma-data:/data busybox ls -l /data