# By https://github.com/blepping
# License: Apache 2.0
# Experimental MSW-MSA attention implementation ported to ComfyUI from: https://github.com/megvii-research/HiDiffusion
# Lightly tested; may or may not work correctly.
#
# *** NOTE ***
# This is *NOT* a full implementation of HiDiffusion, only the MSW-MSA attention component, which mainly
# improves performance. By itself it will not enable generating at higher resolutions than the model normally supports.
#
# Usage: Copy into the custom_nodes directory, then connect the ApplyMSWMSAAttention node.
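The install step above can be sketched as follows. This is a minimal illustration, assuming a ComfyUI checkout at `$COMFY_ROOT`; the script filename `mswmsa_attention.py` is hypothetical (use whatever name you saved the file under):

```shell
# Hypothetical install sketch; $COMFY_ROOT stands in for your ComfyUI checkout.
COMFY_ROOT="${COMFY_ROOT:-$(mktemp -d)}"            # demo fallback path
mkdir -p "$COMFY_ROOT/custom_nodes"
touch mswmsa_attention.py                           # placeholder for the downloaded script
cp mswmsa_attention.py "$COMFY_ROOT/custom_nodes/"  # ComfyUI scans this dir on startup
```

After restarting ComfyUI, the ApplyMSWMSAAttention node should appear and can be connected to the model.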
# Made by https://github.com/blepping
# I recommend using the repo version instead, as it's easier to install (currently there is no functional difference):
# https://github.com/blepping/ComfyUI-ApplyResAdapterUnet
# Very experimental ComfyUI node for https://github.com/bytedance/res-adapter
# Usage:
# Put resadapter.py in the custom_nodes/ directory.
# Put the resolution_normalization.safetensors model in models/unet.
# Patch the model with ApplyResAdapterUnet and load the LoRA part normally.
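The file layout described in the usage steps can be sketched as below. This is only an illustration, assuming a ComfyUI checkout at `$COMFY_ROOT`; placeholder files stand in for the real downloads:

```shell
# Layout sketch; $COMFY_ROOT stands in for your ComfyUI checkout.
COMFY_ROOT="${COMFY_ROOT:-$(mktemp -d)}"            # demo fallback path
mkdir -p "$COMFY_ROOT/custom_nodes" "$COMFY_ROOT/models/unet"
touch resadapter.py resolution_normalization.safetensors            # placeholders
cp resadapter.py "$COMFY_ROOT/custom_nodes/"                        # the node script
cp resolution_normalization.safetensors "$COMFY_ROOT/models/unet/"  # the UNet weights
```

With the files in place, restart ComfyUI, patch the model with the ApplyResAdapterUnet node, and load the LoRA part as you would any other LoRA.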