Credit to Helaman
Originally uploaded at: https://openmodeldb.info/models/4x-Nomos8kHAT-L-otf
Hybrid Attention Transformer (HAT) combines channel attention and self-attention and makes use of their complementary advantages. To enhance the interaction between neighboring window features, HAT also employs an overlapping cross-attention module.
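To make the "channel attention plus self-attention" idea concrete, here is a minimal PyTorch sketch of one hybrid block: a window self-attention branch fused with a squeeze-and-excitation style channel-attention branch. This is not the official HAT code; it omits the overlapping cross-attention module and uses plain non-overlapping windows, and names like `HybridAttentionBlock`, `window_size`, and `squeeze` are illustrative assumptions.

```python
# Illustrative sketch only (not the official HAT implementation).
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention over (B, C, H, W)."""

    def __init__(self, channels: int, squeeze: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // squeeze, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // squeeze, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        # Re-weight each channel by a learned global statistic.
        return x * self.fc(self.pool(x))


class HybridAttentionBlock(nn.Module):
    """Window self-attention in parallel with a convolutional channel-attention branch."""

    def __init__(self, channels: int, heads: int = 4, window_size: int = 8):
        super().__init__()
        self.window_size = window_size
        self.norm = nn.LayerNorm(channels)
        self.attn = nn.MultiheadAttention(channels, heads, batch_first=True)
        self.cab = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.GELU(),
            ChannelAttention(channels),
        )

    def forward(self, x):  # x: (B, C, H, W), H and W divisible by window_size
        b, c, h, w = x.shape
        ws = self.window_size
        # Partition into non-overlapping windows -> (B * num_windows, ws*ws, C).
        windows = (
            x.view(b, c, h // ws, ws, w // ws, ws)
            .permute(0, 2, 4, 3, 5, 1)
            .reshape(-1, ws * ws, c)
        )
        tokens = self.norm(windows)
        sa, _ = self.attn(tokens, tokens, tokens)
        # Reverse the window partition back to (B, C, H, W).
        sa = (
            sa.reshape(b, h // ws, w // ws, ws, ws, c)
            .permute(0, 5, 1, 3, 2, 4)
            .reshape(b, c, h, w)
        )
        # Fuse the self-attention and channel-attention branches residually.
        return x + sa + self.cab(x)


if __name__ == "__main__":
    block = HybridAttentionBlock(channels=64)
    out = block(torch.randn(1, 64, 32, 32))
    print(out.shape)  # torch.Size([1, 64, 32, 32])
```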
To use this (and other HAT upscalers) with Automatic1111 and Forge, follow these steps (a scripted version of the same steps appears after the list):
Create a folder in \webui\models\ and name it HAT
Download the model file, either from this page or from the source link above
Place the file in \webui\models\HAT\
Restart your webui
Note: If you have issues getting the model to work, change the file extension from .pt to .pth
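For convenience, here is a small Python sketch of the same manual steps. The paths are assumptions: adjust `WEBUI_DIR` to your own webui install and `MODEL_FILE` to wherever you saved the downloaded file. It creates the HAT folder, copies the model in, and renames the extension to .pth as suggested in the note above.

```python
# Helper sketch for the manual steps above; paths are placeholders you must adjust.
from pathlib import Path
import shutil

WEBUI_DIR = Path(r"C:\stable-diffusion-webui")              # assumed webui install location
MODEL_FILE = Path(r"C:\Downloads\4x-Nomos8kHAT-L-otf.pt")   # the file you downloaded

# Step 1: create the HAT folder under \webui\models\
hat_dir = WEBUI_DIR / "models" / "HAT"
hat_dir.mkdir(parents=True, exist_ok=True)

# Steps 2-3: copy the model into the HAT folder, renaming .pt to .pth.
target = hat_dir / (MODEL_FILE.stem + ".pth")
shutil.copy2(MODEL_FILE, target)
print(f"Placed model at {target}; now restart your webui.")
```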