https://github.com/deepghs/waifuc/tree/main
The training data for this model was collected with Waifuc, a robust tool for crawling, filtering, and processing training sets. With Waifuc and a single Ryzen 5 5600G machine, more than 3000 training images can be crawled and processed in about 2 hours. Training a character LoRA on a dataset of this size gives a clear quality advantage over training with only limited data.
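For reference, below is a minimal sketch of a waifuc crawling-and-processing pipeline, following the quickstart pattern shown in the repository linked above. The source site, character tag, selected actions, image count, and output directory are illustrative placeholders, not the exact pipeline used for this model:

```python
from waifuc.source import DanbooruSource
from waifuc.action import (
    ModeConvertAction,     # convert images to RGB on a white background
    NoMonochromeAction,    # drop monochrome / sketch images
    FilterSimilarAction,   # drop near-duplicate images
    FaceCountAction,       # keep images with exactly one face
    PersonSplitAction,     # crop out each person
    TaggingAction,         # auto-tag images with a WD14-style tagger
    FirstNSelectAction,    # stop after N accepted images
)
from waifuc.export import SaveExporter

if __name__ == '__main__':
    # 'some_character_tag' and the output path are placeholders.
    source = DanbooruSource(['some_character_tag', 'solo'])
    source.attach(
        ModeConvertAction('RGB', 'white'),
        NoMonochromeAction(),
        FilterSimilarAction('all'),
        FaceCountAction(1),
        PersonSplitAction(),
        FaceCountAction(1),
        TaggingAction(force=True),
        FirstNSelectAction(3000),  # roughly the dataset size mentioned above
    ).export(SaveExporter('data/character_dataset'))
```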
Trigger words are listed on this page; other tags can be found in the LoRA metadata.
Recommended base model: Kohaku_XL (https://civitai.com/models/162577/kohaku-xl-beta)
Weight: 0.8~1
If the halo does not appear, add "halo" to the prompt.