ik_llama.cpp/ggml
Name            Last commit                                         Date
cmake           Merge mainline llama.cpp (#3)                       2024-07-27 07:55:01 +02:00
include         Hadamard transforms for K-cache - CPU only (#1033)  2025-12-04 06:51:11 +01:00
src             Unroll for loop for repacked BF16 MATMUL (#1047)    2025-12-08 06:09:45 +01:00
.gitignore      Merge mainline llama.cpp (#3)                       2024-07-27 07:55:01 +02:00
CMakeLists.txt  Enable fusion by default (#939)                     2025-11-11 10:35:48 +02:00