llama.cpp/tools/server/webui/src/lib/services
Rohan Jain 974c8c94cc
webui: add setting for first-line chat titles (#21797)
* webui: add setting for first-line chat titles

Add an opt-in setting (`titleGenerationUseFirstLine`) to use the first
non-empty line of a prompt as the generated conversation title.

Previously, the complete multi-line prompt was used as the title, which
produced very long titles for complex queries; combined with the
"Ask for confirmation before changing conversation title" setting, the
confirmation dialog would overflow.
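
A minimal sketch of the first-line extraction this setting enables. The
helper name `firstNonEmptyLine` and the definition of `NEWLINE_SEPARATOR`
are assumptions for illustration (the PR only mentions that a
`NEWLINE_SEPARATOR` import exists in `tools/server/webui/src/lib/utils/text.ts`):

```typescript
// Hypothetical definition; the actual constant lives in the webui's
// text utilities and may differ.
const NEWLINE_SEPARATOR = /\r?\n/;

// Return the first non-empty line of the prompt to use as a short
// conversation title; fall back to the trimmed prompt when every
// line is blank (e.g. whitespace-only input).
export function firstNonEmptyLine(prompt: string): string {
  const line = prompt
    .split(NEWLINE_SEPARATOR)
    .map((l) => l.trim())
    .find((l) => l.length > 0);
  return line ?? prompt.trim();
}
```

With the opt-in `titleGenerationUseFirstLine` setting enabled, a title
would be derived from this helper instead of the full prompt text.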

* Update tools/server/webui/src/lib/utils/text.ts

Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>

* Update tools/server/webui/src/lib/utils/text.ts

Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>

* webui: Run build to update the bundle

As requested in:
https://github.com/ggml-org/llama.cpp/pull/21797#pullrequestreview-4094935065

* webui: Fix missing import for NEWLINE_SEPARATOR

---------

Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>
2026-04-13 09:30:46 +02:00
| File | Last commit | Date |
|------|-------------|------|
| chat.service.ts | webui: Add option to pre-encode conversation for faster next turns (#21034) | 2026-04-09 09:10:18 +02:00 |
| database.service.ts | webui: Conversation forking + branching improvements (#21021) | 2026-03-28 13:38:15 +01:00 |
| index.ts | webui: Agentic Loop + MCP Client with support for Tools, Resources and Prompts (#18655) | 2026-03-06 10:00:39 +01:00 |
| mcp.service.ts | webui: MCP Diagnostics improvements (#21803) | 2026-04-13 07:58:38 +02:00 |
| models.service.ts | webui: Improve model parsing logic + add unit tests (#20749) | 2026-03-19 12:25:50 +01:00 |
| parameter-sync.service.spec.ts | common/parser: add proper reasoning tag prefill reading (#20424) | 2026-03-19 16:58:21 +01:00 |
| parameter-sync.service.ts | webui: add setting for first-line chat titles (#21797) | 2026-04-13 09:30:46 +02:00 |
| props.service.ts | webui: Architecture and UI improvements (#19596) | 2026-02-14 09:06:41 +01:00 |