From 12a369cc67db9ae4708e7138b3eedc8bb196d49f Mon Sep 17 00:00:00 2001
From: leejet
Date: Sat, 11 Apr 2026 18:41:12 +0800
Subject: [PATCH] docs: update readme

---
 README.md                 |  3 +++
 examples/server/README.md | 30 ++++++++++++++++++++++++++++++
 2 files changed, 33 insertions(+)

diff --git a/README.md b/README.md
index b5bb4975..514a22b9 100644
--- a/README.md
+++ b/README.md
@@ -15,6 +15,9 @@ API and command-line option may change frequently.***
 
 ## 🔥Important News
 
+* **2026/04/11** 🚀 stable-diffusion.cpp now uses a brand-new embedded web UI.
+  👉 Details: [PR #1408](https://github.com/leejet/stable-diffusion.cpp/pull/1408)
+
 * **2026/01/18** 🚀 stable-diffusion.cpp now supports **FLUX.2-klein**
   👉 Details: [PR #1193](https://github.com/leejet/stable-diffusion.cpp/pull/1193)
 
diff --git a/examples/server/README.md b/examples/server/README.md
index e813a2f1..e27d973f 100644
--- a/examples/server/README.md
+++ b/examples/server/README.md
@@ -1,3 +1,33 @@
+# Example
+
+The following example starts `sd-server` with a standalone diffusion model, VAE, and LLM text encoder:
+
+```
+.\bin\Release\sd-server.exe --diffusion-model ..\models\diffusion_models\z_image_turbo_bf16.safetensors --vae ..\models\vae\ae.sft --llm ..\models\text_encoders\qwen_3_4b.safetensors --diffusion-fa --offload-to-cpu -v --cfg-scale 1.0
+```
+
+What this example does:
+
+* `--diffusion-model` selects the standalone diffusion model
+* `--vae` selects the VAE decoder
+* `--llm` selects the text encoder / language model used by this pipeline
+* `--diffusion-fa` enables flash attention in the diffusion model
+* `--offload-to-cpu` reduces VRAM pressure by keeping weights in RAM when possible
+* `-v` enables verbose logging
+* `--cfg-scale 1.0` sets the default CFG scale for generation
+
+After the server starts successfully:
+
+* the web UI is available at `http://127.0.0.1:1234/`
+* the native async API is available under `/sdcpp/v1/...`
+* the compatibility APIs are available under `/v1/...` and `/sdapi/v1/...`
+
+If you want to use a different host or port, pass:
+
+```bash
+--listen-ip <ip> --listen-port <port>
+```
+
 # Frontend
 
 ## Build with Frontend
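A short client sketch of how the compatibility API documented above might be exercised. This assumes the `/sdapi/v1/txt2img` route accepts the common stable-diffusion-webui JSON schema; the field names (`prompt`, `steps`, `cfg_scale`) are assumptions, since this patch does not document the request body:

```python
import json
import urllib.request

# Build a txt2img request against the A1111-style compatibility endpoint.
# The payload fields follow the common stable-diffusion-webui schema and
# may differ from what sd-server actually accepts.
payload = {
    "prompt": "a photo of a cat",
    "steps": 20,
    "cfg_scale": 1.0,
}

req = urllib.request.Request(
    "http://127.0.0.1:1234/sdapi/v1/txt2img",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Sending the request requires a running sd-server:
# with urllib.request.urlopen(req) as resp:
#     result = json.load(resp)
```

The request object is only constructed here, not sent, so the sketch stays self-contained; uncomment the last lines against a live server started as shown above.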