ik_llama.cpp/ggml

Latest commit: 14ef9ebe9a by Iwan Kawrakow, 2025-07-14 17:28:55 +03:00
    Vulkan: fix u_batch > 4096/n_active_experts for coopmat1.
    Without this fix we get an assert. We get the same assert in mainline too.
cmake           Merge mainline llama.cpp (#3)                                     2024-07-27 07:55:01 +02:00
include         It compiles                                                       2025-07-14 11:43:37 +03:00
src             Vulkan: fix u_batch > 4096/n_active_experts                       2025-07-14 17:28:55 +03:00
.gitignore      Merge mainline llama.cpp (#3)                                     2024-07-27 07:55:01 +02:00
CMakeLists.txt  Merge vulkan code from mainline up to commit of 6/28/2025 (#563)  2025-07-02 08:49:42 +02:00