ik_llama.cpp/ggml (last updated 2025-12-29 14:18:27 +01:00)

Name            Last commit                                              Date
cmake           Merge mainline llama.cpp (#3)                            2024-07-27 07:55:01 +02:00
include         Async compute graph evaluation (2 or more GPUs) (#1089)  2025-12-27 08:18:06 +01:00
src             Fix Windows build (#1097)                                2025-12-29 14:18:27 +01:00
.gitignore      Merge mainline llama.cpp (#3)                            2024-07-27 07:55:01 +02:00
CMakeLists.txt  Graph parallel: the next generation (#1080)              2025-12-24 08:31:48 +01:00