# cvector-generator

This example demonstrates how to generate a control vector using GGUF models.

Related PRs:

## Examples

```sh
# CPU only
./cvector-generator -m ./llama-3.Q4_K_M.gguf

# With GPU
./cvector-generator -m ./llama-3.Q4_K_M.gguf -ngl 99

# With advanced options
./cvector-generator -m ./llama-3.Q4_K_M.gguf -ngl 99 --pca-iter 2000 --pca-batch 100

# Using the mean value instead of PCA
./cvector-generator -m ./llama-3.Q4_K_M.gguf --method mean

# To see the help message
./cvector-generator -h
# Then have a look at the "cvector" section
```
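Roughly speaking, the `mean` method averages, for each layer, the difference between the hidden states produced by paired positive and negative prompts (a sketch of the idea only; see `mean.hpp` for the actual implementation):

```math
v_\ell = \frac{1}{N}\sum_{i=1}^{N}\left(h_\ell(\text{positive}_i) - h_\ell(\text{negative}_i)\right)
```

PCA instead extracts the dominant direction of those per-pair differences, which is why it exposes tuning knobs such as `--pca-iter` and `--pca-batch`.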

## Tips and tricks

If a prompt spans multiple lines, escape the newline characters (replace each literal newline with the two-character sequence `\n`). For example:

```
<|im_start|>system\nAct like a person who is extremely happy.<|im_end|>
<|im_start|>system\nYou are in a very good mood today<|im_end|>
```
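If your prompts live in files with real line breaks, one way to flatten them is with `sed` (a sketch; assumes GNU `sed`, and `prompt.txt` is a hypothetical file name):

```sh
# Flatten a multi-line prompt: replace each real newline with the
# literal two-character sequence \n, producing a single line.
# ':a;N;$!ba' first slurps the whole file into the pattern space.
sed ':a;N;$!ba;s/\n/\\n/g' prompt.txt
```

For example, a file containing the two lines `a` and `b` comes out as the single line `a\nb`.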

Example of using the output file with `llama-cli`:

(Tip: the control vector tends to work better when applied to layers higher than 10.)

```sh
./llama-cli -m ./llama-3.Q4_K_M.gguf -p "<|start_header_id|>system<|end_header_id|>\n\nYou are a helpful assistant<|eot_id|><|start_header_id|>user<|end_header_id|>\n\nSing a song<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n" --special --control-vector-scaled ./control_vector.gguf 0.8 --control-vector-layer-range 10 31
```