Chenkin-UniControl-XL (ckn Anime All-in-One ControlNet)

中文 | English


Chinese Introduction

Chenkin-UniControl-XL Cover (ZH)

Welcome to the official repository for Chenkin-UniControl-XL!

Built by the ckn team on top of the latest ckn v0.5 base model, this is a dedicated all-in-one Universal ControlNet tailored for anime/2D generation. It eliminates the common failure modes of traditional ControlNets on anime images: broken art styles, muddy colors, and an overly "AI-generated" look.

Features

  1. 8-in-One Control: This model fuses the 8 most commonly used ControlNet types into a single model, supporting the following control modes:
    • OpenPose (pose skeleton)
    • Depth (depth map)
    • HED / PIDI / Scribble / TED (scribble and soft-edge extraction)
    • Canny / Lineart / MLSD (precise line art and architectural lines)
    • Normal (normal map)
    • Tile (tiled redraw and detail upscaling)
    • Inpaint / Outpaint (local repaint and canvas extension)
    • Fuse (experimental fusion mode)
  2. Native Style Preservation: Trained on the aesthetic foundation of ckn v0.5's high-quality anime dataset of tens of millions of images, so none of the control modes will pollute the base model's color and lighting.
  3. Deeply Customized ComfyUI Nodes: The model uses a special underlying architecture and must be paired with the dedicated advanced nodes developed by our team to unlock its full functionality.

Killer Feature: Fuse (Multi-Condition Fusion Control)

Among the 8 control modes, we have specially opened up the architecturally most hardcore one: Fuse.

In traditional workflows, using "lineart + depth + pose" at the same time requires loading 3 ControlNets, which not only blows up VRAM usage and slows generation to a crawl, but also lets the control weights interfere with one another.

Fuse mode changes this completely:

  • Merged calls, doubled performance: It lets you feed in multiple control conditions at once (e.g. any 2 or all 3 of them); a dedicated Condition Transformer fuses the features at the lowest level, merging multiple ControlNet calls into a single call.
  • No fighting between controls: The control signals are reconciled before they ever reach the main model, eliminating the image degradation caused by stacking multiple ControlNets and enabling extremely complex, precise control on very little VRAM.

Must Read: Mandatory Plugin Dependency

Note: This model cannot be used with ComfyUI's native ControlNet nodes!

To support the complex routing behind the "all-in-one" design and its high degree of control freedom, you must install and use the dedicated plugin led by team member @chinoll: ComfyUI-Advanced-ControlNet

Core advantages of the dedicated plugin:

  • Multi-mode switching: Switch seamlessly between the 8 control modes above via a dropdown menu right in the node.
  • Soft Weights: Faithfully reproduces the "My prompt is more important" and "ControlNet is more important" options from WebUI for extremely fine-grained weight transitions.
  • Timestep Scheduling: Lets the control kick in or drop out within a specific range of sampling steps, avoiding the stiffness caused by over-control.

Installing the dedicated node:

cd custom_nodes
git clone https://github.com/chinoll/ComfyUI-Advanced-ControlNet.git

(Or search for ComfyUI-Advanced-ControlNet in ComfyUI Manager and install it from there.)

Download & Setup

  1. Download the .safetensors model file from this repository.
  2. Place it in your ComfyUI/models/controlnet/ directory.
  3. In ComfyUI, load the model with the Load Advanced ControlNet Model node and connect it to the Apply Advanced ControlNet node for generation.
  4. Reference workflow: We provide a basic reference workflow for Fuse mode; download example/controlnet-fuse.json and drag it straight into ComfyUI to try it out.

License & Credits

  • License: This model inherits the open-source spirit of the ckn base model and is released under the fair-ai-public-license-1.0-sd license. Any form of commercialization or reselling is strictly prohibited.
  • Team credits: Thanks to the ckn team's Frontier Technology Lab, and to team member @chinoll for the core technical work on training this all-in-one model and on the underlying architecture of the dedicated node.

English Introduction

Chenkin-UniControl-XL Cover (EN)

Welcome to the official repository for Chenkin-UniControl-XL!

Trained on top of the latest ckn v0.5 base model, this is a dedicated Universal ControlNet tailored specifically for anime/2D generation workflows. It solves the common pain points of traditional ControlNets, such as style degradation, muddy colors, or an overly "AI-generated" look when applied to anime illustrations.

Features

  1. 8-in-One Control: This model integrates 8 of the most commonly used ControlNet types into a single base model, supporting the following modes:
    • OpenPose
    • Depth
    • HED / PIDI / Scribble / TED
    • Canny / Lineart / MLSD
    • Normal
    • Tile
    • Inpaint / Outpaint
    • Fuse (Experimental)
  2. Native Style Preservation: Trained on the aesthetic foundation of ckn v0.5's high-quality anime dataset of tens of millions of images, so none of the control modes will pollute the color and lighting of the base model.
  3. Deeply Customized ComfyUI Nodes: Due to its unique underlying architecture, this model must be paired with the exclusive advanced nodes developed by our team members to unlock its full functionality.

Killer Feature: Fuse (Multi-Condition Fusion Control)

Among the 8 control modes, we have specially opened up the architecturally most advanced one: the Fuse mode.

In traditional workflows, using "Lineart + Depth + Pose" simultaneously requires loading 3 separate ControlNets, which not only drives VRAM usage up and generation speed down, but also lets the control weights interfere with one another.

The Fuse mode completely revolutionizes this:

  • Merged Calls, Doubled Performance: It allows you to input multiple control conditions simultaneously (e.g., choose 2 or 3). The underlying Condition Transformer naturally fuses these various features, merging multiple ControlNet calls into a single call.
  • No Control Interference: Multiple control signals are synergized at the base level before entering the main model, completely solving the issue of image degradation caused by stacking multiple ControlNets. This achieves extremely complex and precise control with minimal VRAM usage.
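
The single-call idea behind Fuse can be pictured in a few lines. The toy NumPy function below (the function name and the normalized weighted sum are illustrative assumptions, not the model's actual Condition Transformer) combines several control maps into one conditioning tensor so the ControlNet only needs to run once:

```python
import numpy as np

def fuse_conditions(cond_maps, weights):
    """Toy stand-in for multi-condition fusion: combine several control
    feature maps (e.g. lineart, depth, pose) into a single conditioning
    tensor so the ControlNet is invoked once rather than once per map.

    The real model uses a learned Condition Transformer; the normalized
    weighted sum here only illustrates the single-call idea.
    """
    weights = np.asarray(weights, dtype=np.float32)
    weights = weights / weights.sum()                 # keep overall signal strength stable
    stacked = np.stack(cond_maps).astype(np.float32)  # (n_conditions, H, W)
    return np.tensordot(weights, stacked, axes=1)     # contract over conditions -> (H, W)

# Two conditions, lineart weighted twice as strongly as depth:
lineart, depth = np.ones((4, 4)), np.zeros((4, 4))
fused = fuse_conditions([lineart, depth], [2, 1])
```

Because the weights are normalized before mixing, adding a third condition never inflates the overall control strength, which is one way to avoid the "controls fighting each other" problem described above.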

Mandatory: Exclusive Plugin Dependency

Note: This model cannot be used directly with ComfyUI's native ControlNet nodes!

To support the complex routing behind the "All-in-One" design and its high degree of control freedom, you MUST install and use the dedicated plugin led by our team member @chinoll: ComfyUI-Advanced-ControlNet.

Core advantages of the exclusive plugin:

  • Multi-Mode Switching: Seamlessly switch between the 8 control modes mentioned above directly via a dropdown menu in the node.
  • Soft Weights: Perfectly replicates the "My prompt is more important" and "ControlNet is more important" features from WebUI, allowing for extremely fine-grained weight transitions.
  • Timestep Scheduling: Allows the ControlNet to intervene or exit at specific sampling step intervals, preventing stiff generations.
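
Timestep scheduling is easy to picture as a per-step strength function. Below is a minimal sketch with a hypothetical helper name and a simple on/off window; the plugin's real scheduling options are richer than this:

```python
def control_strength(step, total_steps, start=0.0, end=0.6, base=1.0):
    """Return the ControlNet weight to apply at a given sampling step.

    Control is applied at full strength only while the normalized
    progress lies in [start, end]; later steps run unconstrained, so
    the sampler is free to refine fine details without stiffness.
    """
    progress = step / max(total_steps - 1, 1)
    return base if start <= progress <= end else 0.0

# With end=0.6, control guides roughly the first 60% of a 20-step run:
schedule = [control_strength(s, 20) for s in range(20)]
```

Ending control early like this is the usual remedy for the "over-controlled, stiff image" failure mode the bullet above mentions.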

Node Installation:

cd custom_nodes
git clone https://github.com/chinoll/ComfyUI-Advanced-ControlNet.git

(Or search for ComfyUI-Advanced-ControlNet in ComfyUI Manager and install it from there.)

Download & Setup

  1. Download the .safetensors model file from this repository.
  2. Place it in your ComfyUI/models/controlnet/ directory.
  3. In ComfyUI, use the Load Advanced ControlNet Model node to load this model, and connect it to the Apply Advanced ControlNet node for generation.
  4. Reference Workflow: We provide a basic reference workflow for the Fuse mode. You can download example/controlnet-fuse.json and drag it directly into ComfyUI to experience it.
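
As a quick sanity check after step 1, you can list the tensor names inside the downloaded file without loading any weights. This stdlib-only sketch relies on the published .safetensors layout (an 8-byte little-endian header length followed by a JSON header); the function name is our own:

```python
import json
import struct

def list_safetensors_keys(path):
    """List tensor names stored in a .safetensors file.

    The format begins with an 8-byte little-endian unsigned integer
    giving the length of a JSON header; the header maps each tensor
    name to its dtype, shape, and byte offsets, and the raw weight
    data follows it.
    """
    with open(path, "rb") as f:
        (header_len,) = struct.unpack("<Q", f.read(8))
        header = json.loads(f.read(header_len))
    # "__metadata__" is an optional non-tensor entry in the header.
    return sorted(k for k in header if k != "__metadata__")
```

If this prints an empty list or raises, the download is likely corrupt and should be re-fetched before troubleshooting anything in ComfyUI.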

License & Credits

  • License: This model inherits the open-source spirit of the ckn base model and follows the fair-ai-public-license-1.0-sd license. Any form of commercialization or reselling is strictly prohibited.
  • Team Credits: Thanks to the ckn team's Frontier Technology Lab and our team member @chinoll for their core technical breakthroughs in training this all-in-one model and developing the underlying architecture for the exclusive node.