# Sora (Blue Archive NPC)
Decided to try completely changing things up after seeing an anon's suggestions on /hdg/. Used batch size 12, lower resolution, and a high learning rate. Can't tell if it's any better or worse quality-wise than my usual LoRAs, but it was a lot faster to train.
Be forewarned that this LoRA is prone to generating NSFW results, since Sora has an unusually high proportion of very lewd art.
## Usage
Didn't do much tag pruning, so you'll need to include a lot of tags in your prompts.
Use any or all of the following tags to summon Sora: `sora, halo, mini wings, white wings, two side up, blonde hair, forehead`
For her usual outfit, `blue apron, white shirt` is typically enough. You can also add `off-shoulder, strap slip, bowtie, short sleeves` if desired.
Sometimes her wings appear too high; `low wings` can help with this. Her wings are also usually a bit too big but that's kind of hard to control.
Weight 1 seems to work fine.
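
For anyone generating with diffusers instead of a webui, here's a minimal sketch of the prompting above; the base checkpoint and LoRA filename are hypothetical placeholders, not part of this release.

```python
# Minimal sketch, assuming an SD 1.5 base and diffusers >= 0.17.
# The checkpoint name and LoRA path are hypothetical placeholders.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("./sora_ba_npc.safetensors")  # hypothetical filename

prompt = (
    "sora, halo, mini wings, white wings, two side up, blonde hair, forehead, "
    "blue apron, white shirt, low wings"
)
image = pipe(
    prompt,
    negative_prompt="lowres, bad anatomy",
    cross_attention_kwargs={"scale": 1.0},  # LoRA weight 1, as noted above
).images[0]
image.save("sora.png")
```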
## Training
*All parameters are provided in the accompanying JSON files.*
- Trained on a set of 128 images, repeated 25 times, for 3 epochs (128 images * 25 repeats / 12 batch size * 3 epochs = 800 steps; see the sketch after this list)
- Dataset included a mixture of SFW and NSFW.
- Batch size of 12, up from my usual 3
- Resolution of 512, down from my usual 832
- General learning rate of 1.2e-3, significantly higher than the Kohya default (1e-6, I believe)
- My understanding is that when this learning rate is set alongside the text encoder and unet rates, it serves only as one of the inputs to the AdamW optimizer, which adjusts the effective learning rate as needed. So even though this value is high, training is not actually running at it constantly.
- Text encoder learning rate of 1.5e-5
- Unet learning rate of 1.5e-4
- `constant_with_warmup` scheduler instead of `cosine`
- Initially tagged with the WD1.4 ConvNeXtV2 tagger model. Tags were minimally pruned/edited.
- Used `network_dim` 128 (same as usual) and `network_alpha` 64
- Trained without VAE.
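
To make the step arithmetic above concrete, here's a small Python sketch. The dict keys loosely mirror kohya-style config fields but are illustrative; consult the accompanying JSON files for the actual schema.

```python
# Illustrative sketch of this run's hyperparameters and step math.
# Key names are loosely kohya-style but hypothetical; see the
# accompanying JSON files for the exact values and schema.
config = {
    "train_batch_size": 12,      # up from my usual 3
    "resolution": 512,           # down from my usual 832
    "learning_rate": 1.2e-3,     # general LR fed to the optimizer
    "text_encoder_lr": 1.5e-5,
    "unet_lr": 1.5e-4,
    "lr_scheduler": "constant_with_warmup",
    "network_dim": 128,
    "network_alpha": 64,
    "optimizer": "AdamW",
    "epochs": 3,
}

num_images = 128
repeats = 25
total_steps = num_images * repeats * config["epochs"] // config["train_batch_size"]
print(total_steps)  # 800
```

This mirrors the back-of-the-envelope formula above; the trainer's actual step count can differ slightly depending on how partial batches are handled.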