InvokeAI/invokeai/backend/patches
Lincoln Stein dfc66b7142
Feature: Add FLUX.2 LOKR model support (detection and loading) (#8909)
* Add FLUX.2 LOKR model support (detection and loading) (#88)

Fix BFL LOKR models being misidentified as AIToolkit format

Fix alpha key warning in LOKR QKV split layers

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: lstein <111189+lstein@users.noreply.github.com>

* Fix BFL→diffusers key mapping for non-block layers in FLUX.2 LoRA/LoKR

BFL's FLUX.2 model uses different names than diffusers' Flux2Transformer2DModel
for top-level modules (embedders, modulations, output layers). The existing
conversion only handled block-level renames (double_blocks→transformer_blocks),
causing "Failed to find module" warnings for non-block LoRA keys like img_in,
txt_in, modulation.lin, time_in, and final_layer.

---------

Co-authored-by: Copilot <198982749+Copilot@users.noreply.github.com>
Co-authored-by: lstein <111189+lstein@users.noreply.github.com>
Co-authored-by: Alexander Eichhorn <alex@eichhorn.dev>
2026-02-27 00:45:13 +00:00
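The commit above describes remapping BFL-style state-dict keys to diffusers-style module paths, extending the existing block-level renames (double_blocks→transformer_blocks) to top-level modules like img_in and final_layer. A minimal sketch of such a remapper is below; the target names (x_embedder, context_embedder, etc.) are illustrative assumptions, not the exact tables used by InvokeAI or diffusers' Flux2Transformer2DModel.

```python
# Hypothetical BFL -> diffusers key remapping for FLUX LoRA/LoKR
# state-dict keys. All target module names here are assumptions for
# illustration, not the actual conversion tables in the repository.

# Assumed top-level (non-block) renames: BFL name -> diffusers-style name.
NON_BLOCK_RENAMES = {
    "img_in": "x_embedder",
    "txt_in": "context_embedder",
    "time_in": "time_text_embed.timestep_embedder",
    "final_layer": "norm_out",
}

# Assumed block-level renames already handled by the prior conversion.
BLOCK_RENAMES = {
    "double_blocks": "transformer_blocks",
    "single_blocks": "single_transformer_blocks",
}

def remap_key(bfl_key: str) -> str:
    """Map one BFL-style LoRA key to a diffusers-style key.

    Unknown prefixes pass through unchanged, which is what would
    previously have triggered "Failed to find module" warnings for
    non-block keys before the fix described above.
    """
    prefix, _, rest = bfl_key.partition(".")
    if prefix in BLOCK_RENAMES:
        return f"{BLOCK_RENAMES[prefix]}.{rest}"
    if prefix in NON_BLOCK_RENAMES:
        new_prefix = NON_BLOCK_RENAMES[prefix]
        return f"{new_prefix}.{rest}" if rest else new_prefix
    return bfl_key
```

For example, `remap_key("double_blocks.0.img_attn.qkv.lora_A.weight")` yields `"transformer_blocks.0.img_attn.qkv.lora_A.weight"`, while a non-block key such as `"img_in.lora_A.weight"` maps to `"x_embedder.lora_A.weight"` instead of falling through unmapped.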
layers feat: Add Z-Image LoRA support 2025-12-01 22:23:30 +01:00
lora_conversions Feature: Add FLUX.2 LOKR model support (detection and loading) (#8909) 2026-02-27 00:45:13 +00:00
__init__.py Rename backend/lora/ to backend/patches 2024-12-17 13:20:19 +00:00
layer_patcher.py chore: fix ruff checks 2025-12-14 19:51:22 +05:30
model_patch_raw.py Rename LoRAModelRaw to ModelPatchRaw. 2024-12-17 13:20:19 +00:00
pad_with_zeros.py Push LoRA layer reshaping down into the patch layers and add a new FluxControlLoRALayer type. 2024-12-17 13:20:19 +00:00