ik_llama.cpp/ggml
2025-07-12 12:20:00 +03:00
cmake Merge mainline llama.cpp (#3) 2024-07-27 07:55:01 +02:00
include Change KQ mask padding to 64 (#574) 2025-07-03 10:43:27 +02:00
src Check if MMQ should be used before using it 2025-07-12 12:20:00 +03:00
.gitignore Merge mainline llama.cpp (#3) 2024-07-27 07:55:01 +02:00
CMakeLists.txt Merge vulkan code from mainline up to commit of 6/28/2025 (#563) 2025-07-02 08:49:42 +02:00