ik_llama.cpp/examples
| Name | Last commit | Date |
| --- | --- | --- |
| baby-llama | Merge mainline - Aug 12 2024 (#17) | 2024-08-12 15:14:32 +02:00 |
| batched | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| batched-bench | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| batched.swift | Merge mainline llama.cpp (#3) | 2024-07-27 07:55:01 +02:00 |
| benchmark | build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 2024-06-13 00:41:52 +01:00 |
| convert-llama2c-to-ggml | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| cvector-generator | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| deprecation-warning | Merge mainline llama.cpp (#3) | 2024-07-27 07:55:01 +02:00 |
| embedding | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| eval-callback | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| export-lora | Merge vulkan code from mainline up to commit of 6/28/2025 (#563) | 2025-07-02 08:49:42 +02:00 |
| gbnf-validator | llama : add token matching support to llama-grammar (#1220) | 2026-02-03 07:57:17 +02:00 |
| gguf | Merge mainline llama.cpp (#3) | 2024-07-27 07:55:01 +02:00 |
| gguf-hash | Merge mainline llama.cpp (#3) | 2024-07-27 07:55:01 +02:00 |
| gguf-split | gguf-split : update (#444) | 2025-05-23 08:07:42 +03:00 |
| gritlm | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| imatrix | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| infill | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| jeopardy | build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 2024-06-13 00:41:52 +01:00 |
| llama-bench | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| llama.android | Merge mainline llama.cpp (#3) | 2024-07-27 07:55:01 +02:00 |
| llama.swiftui | Merge mainline llama.cpp (#3) | 2024-07-27 07:55:01 +02:00 |
| llava | add dry sampler (#513) | 2025-06-19 10:24:53 +03:00 |
| lookahead | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| lookup | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| main | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| main-cmake-pkg | Merge mainline llama.cpp (#3) | 2024-07-27 07:55:01 +02:00 |
| mtmd | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| parallel | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| passkey | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| perplexity | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| quantize | Allow quantization of ffn_gate_inp (#896) | 2025-11-05 10:44:32 +02:00 |
| quantize-stats | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| retrieval | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| rpc | Refactor chat and server file (#1062) | 2025-12-15 08:27:20 +01:00 |
| save-load-state | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| server | server: add string ban in speculative path (#1274) | 2026-02-17 12:33:28 +01:00 |
| simple | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| speculative | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| sweep-bench | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| sycl | Merge mainline - Aug 12 2024 (#17) | 2024-08-12 15:14:32 +02:00 |
| tokenize | spec : add self speculative decoding, ngram and refactor (#1261) | 2026-02-13 19:04:55 +01:00 |
| base-translate.sh | build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 2024-06-13 00:41:52 +01:00 |
| chat-13B.bat | Create chat-13B.bat (#592) | 2023-03-29 20:21:09 +03:00 |
| chat-13B.sh | build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 2024-06-13 00:41:52 +01:00 |
| chat-persistent.sh | build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 2024-06-13 00:41:52 +01:00 |
| chat-vicuna.sh | build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 2024-06-13 00:41:52 +01:00 |
| chat.sh | build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 2024-06-13 00:41:52 +01:00 |
| CMakeLists.txt | Port mdmd from mainline + Qwen2/2.5-VL support (#798) | 2025-09-27 08:45:29 +02:00 |
| convert_legacy_llama.py | Merge mainline llama.cpp (#3) | 2024-07-27 07:55:01 +02:00 |
| json_schema_pydantic_example.py | Merge mainline llama.cpp (#3) | 2024-07-27 07:55:01 +02:00 |
| json_schema_to_grammar.py | Update grammar (#1023) | 2025-11-30 18:45:38 +01:00 |
| llama.vim | llama.vim : added api key support (#5090) | 2024-01-23 08:51:27 +02:00 |
| llm.vim | llm.vim : stop generation at multiple linebreaks, bind to <F2> (#2879) | 2023-08-30 09:50:55 +03:00 |
| Miku.sh | build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 2024-06-13 00:41:52 +01:00 |
| pydantic_models_to_grammar_examples.py | Merge mainline llama.cpp (#3) | 2024-07-27 07:55:01 +02:00 |
| pydantic_models_to_grammar.py | Merge mainline llama.cpp (#3) | 2024-07-27 07:55:01 +02:00 |
| reason-act.sh | build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 2024-06-13 00:41:52 +01:00 |
| regex_to_grammar.py | Merge mainline llama.cpp (#3) | 2024-07-27 07:55:01 +02:00 |
| server_embd.py | Merge mainline llama.cpp (#3) | 2024-07-27 07:55:01 +02:00 |
| server-llama2-13B.sh | build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 2024-06-13 00:41:52 +01:00 |
| ts-type-to-grammar.sh | JSON schema conversion: faster repetitions, min/maxLength for strings, cap number length (#6555) | 2024-04-12 19:43:38 +01:00 |