ik_llama.cpp/examples/server/webui

Latest commit: 15159a87d4 by firecoperana, 2025-11-05 10:43:46 +02:00

Add vision support in llama-server (#901)

* server: add support for vision model
  webui: add support for vision model

* server : remove hack for extra parallel slot #10187

* llama : fix KV shift for qwen2vl #13870

* add no-context-shift parameter

---------

Co-authored-by: firecoperana <firecoperana>
Name                Last commit                                                                Date
dist                Add vision support in llama-server (#901)                                  2025-11-05 10:43:46 +02:00
public              Webui improvement (#481)                                                   2025-06-08 14:38:47 +03:00
src                 Add vision support in llama-server (#901)                                  2025-11-05 10:43:46 +02:00
.prettierignore     Webui improvement (#481)                                                   2025-06-08 14:38:47 +03:00
eslint.config.js    Webui improvement (#481)                                                   2025-06-08 14:38:47 +03:00
index.html          Webui: New Features for Conversations, Settings, and Chat Messages (#618)  2025-07-20 12:33:55 +02:00
package-lock.json   Add vision support in llama-server (#901)                                  2025-11-05 10:43:46 +02:00
package.json        Add vision support in llama-server (#901)                                  2025-11-05 10:43:46 +02:00
postcss.config.js   Webui improvement (#481)                                                   2025-06-08 14:38:47 +03:00
tailwind.config.js  Webui improvement (#481)                                                   2025-06-08 14:38:47 +03:00
tsconfig.app.json   Webui improvement (#481)                                                   2025-06-08 14:38:47 +03:00
tsconfig.json       Webui improvement (#481)                                                   2025-06-08 14:38:47 +03:00
tsconfig.node.json  Webui improvement (#481)                                                   2025-06-08 14:38:47 +03:00
vite.config.ts      Add vision support in llama-server (#901)                                  2025-11-05 10:43:46 +02:00