Mirror of https://github.com/ggerganov/llama.cpp (synced 2026-04-14 03:05:38 +02:00)
* server : fix first message identification

  When using the OpenAI SDK (https://github.com/openai/openai-node/blob/master/src/lib/ChatCompletionStream.ts#L623-L626) we noticed that the expected assistant role is missing in the first streaming message. Fix this by correctly checking for the first message.

  Co-authored-by: Piotr Stankiewicz <piotr.stankiewicz@docker.com>
  Signed-off-by: Dorin Geman <dorin.geman@docker.com>

* server : fix checks for first role message for stream=True

  Co-authored-by: Piotr Stankiewicz <piotr.stankiewicz@docker.com>
  Signed-off-by: Dorin Geman <dorin.geman@docker.com>

---------

Signed-off-by: Dorin Geman <dorin.geman@docker.com>
Co-authored-by: Piotr Stankiewicz <piotr.stankiewicz@docker.com>
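The fix above concerns the shape of OpenAI-style streaming responses: SDK clients such as openai-node's `ChatCompletionStream` expect the *first* chunk's `delta` to carry `"role": "assistant"`, while subsequent chunks carry only content. As a minimal sketch (not the actual llama.cpp server code, which is C++; all names here are illustrative), a chunk generator that tracks whether it has emitted the first message might look like:

```python
def stream_chunks(token_deltas, model="llama"):
    """Hypothetical sketch of an OpenAI-style chat streaming response.

    The first chunk's delta must include {"role": "assistant"} so that
    streaming clients can identify the assistant message; later chunks
    carry only {"content": ...}, and a final chunk sets finish_reason.
    """
    first = True  # track whether we are emitting the first message
    for text in token_deltas:
        delta = {"content": text}
        if first:
            # the bug being fixed: forgetting to attach the role here
            delta = {"role": "assistant", "content": text}
            first = False
        yield {
            "object": "chat.completion.chunk",
            "model": model,
            "choices": [{"index": 0, "delta": delta, "finish_reason": None}],
        }
    # terminal chunk: empty delta, finish_reason set
    yield {
        "object": "chat.completion.chunk",
        "model": model,
        "choices": [{"index": 0, "delta": {}, "finish_reason": "stop"}],
    }

chunks = list(stream_chunks(["Hel", "lo"]))
```

A regression test along these lines (as in `test_chat_completion.py`) would assert that the role appears in the first chunk only.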
Files in this directory:

- test_basic.py
- test_chat_completion.py
- test_completion.py
- test_ctx_shift.py
- test_embedding.py
- test_infill.py
- test_lora.py
- test_rerank.py
- test_security.py
- test_slot_save.py
- test_speculative.py
- test_template.py
- test_tokenize.py
- test_tool_call.py
- test_vision_api.py