ik_llama.cpp/examples/server/webui/src/components
firecoperana 2421a7e12b
Webui: improve scroll and bug fixes (#1082)
* Webui: fix message scroll-back caused by setPending

  - smooth scrolling
  - remove throttle
  - increase scroll margin

# Conflicts:
#	examples/server/public/index.html.gz
#	examples/server/webui/dist/index.html
#	examples/server/webui/src/utils/app.context.tsx

* webui: don't scroll to bottom when the conversation changes or a message is edited

# Conflicts:
#	examples/server/public/index.html.gz
#	examples/server/webui/dist/index.html

* Webui: fix save config error

# Conflicts:
#	examples/server/public/index.html.gz
#	examples/server/webui/dist/index.html

* Webui: include the API key when requesting the model name

# Conflicts:
#	examples/server/public/index.html.gz
#	examples/server/webui/dist/index.html

* Update

* webui: fix loading dots display issue

# Conflicts:
#	examples/server/public/index.html.gz
#	examples/server/webui/dist/index.html
#	examples/server/webui/src/components/ChatMessage.tsx

* Webui: cancel auto-scroll when the user scrolls up

---------

Co-authored-by: firecoperana <firecoperana>
2025-12-24 12:30:26 +01:00
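The scroll fixes above hinge on detecting whether the user has scrolled away from the bottom of the chat. A minimal sketch of that check follows; it is not the actual useChatScroll.tsx implementation, and the function name and margin value are illustrative assumptions:

```typescript
// Decide whether streaming output should keep the view pinned to the bottom.
// If the user has scrolled up more than `margin` pixels from the bottom,
// auto-scroll is cancelled until they return; a generous margin keeps smooth
// scrolling from flickering right at the boundary.
export function shouldAutoScroll(
  scrollTop: number,    // current scroll offset of the chat container
  clientHeight: number, // visible height of the container
  scrollHeight: number, // total height of the scrollable content
  margin = 80,          // tolerance in pixels (illustrative assumption)
): boolean {
  const distanceFromBottom = scrollHeight - (scrollTop + clientHeight);
  return distanceFromBottom <= margin;
}
```

During streaming, a scroll hook would run this check on each incoming chunk and only call `scrollTo({ behavior: 'smooth' })` when it returns true, which matches the "cancel auto-scroll when the user scrolls up" behaviour the commit describes.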
CanvasPyInterpreter.tsx Webui improvement (#481) 2025-06-08 14:38:47 +03:00
ChatInputExtraContextItem.tsx Add vision support in llama-server (#901) 2025-11-05 10:43:46 +02:00
ChatMessage.tsx Webui: improve scroll and bug fixes (#1082) 2025-12-24 12:30:26 +01:00
ChatScreen.tsx Webui: improve scroll and bug fixes (#1082) 2025-12-24 12:30:26 +01:00
Header.tsx webui update (#1003) 2025-11-24 07:03:45 +01:00
MarkdownDisplay.tsx webui update (#1003) 2025-11-24 07:03:45 +01:00
ModalProvider.tsx Webui: New Features for Conversations, Settings, and Chat Messages (#618) 2025-07-20 12:33:55 +02:00
SettingDialog.tsx Webui: improve scroll and bug fixes (#1082) 2025-12-24 12:30:26 +01:00
Sidebar.tsx webui update (#1003) 2025-11-24 07:03:45 +01:00
useChatExtraContext.tsx webui update (#1003) 2025-11-24 07:03:45 +01:00
useChatScroll.tsx Webui: improve scroll and bug fixes (#1082) 2025-12-24 12:30:26 +01:00
useChatTextarea.ts Add vision support in llama-server (#901) 2025-11-05 10:43:46 +02:00