timntorres
51e3dc9cca
Sanitize hypernet name input.
2022-10-21 16:52:24 +03:00
AUTOMATIC
03a1e288c4
turns out LayerNorm also has weight and bias and needs to be pre-multiplied and trained for hypernets
2022-10-21 10:13:24 +03:00
AUTOMATIC1111
0c5522ea21
Merge branch 'master' into training-help-text
2022-10-21 09:57:55 +03:00
timntorres
4ff274e1e3
Revise comments.
2022-10-21 09:55:00 +03:00
timntorres
5245c7a493
Issue #2921: Give PNG info to Hypernet previews.
2022-10-21 09:55:00 +03:00
AUTOMATIC
c23f666dba
a more strict check for activation type and a more reasonable check for type of layer in hypernets
2022-10-21 09:47:43 +03:00
discus0434
6b38c2c19c
Merge branch 'AUTOMATIC1111:master' into master
2022-10-20 18:51:12 +09:00
AUTOMATIC
930b4c64f7
allow float sizes for hypernet's layer_structure
2022-10-20 08:18:02 +03:00
discus0434
6f98e89486
update
2022-10-20 00:10:45 +00:00
DepFA
166be3919b
allow overwriting old hypernetwork
2022-10-20 00:09:40 +01:00
DepFA
d6ea584137
change html output
2022-10-20 00:07:57 +01:00
discus0434
2ce52d32e4
fix for #3086 failing to load any previous hypernet
2022-10-19 16:31:12 +00:00
AUTOMATIC
c6e9fed500
fix for #3086 failing to load any previous hypernet
2022-10-19 19:21:16 +03:00
discus0434
3770b8d2fa
enable writing the hypernetwork layer structure manually
2022-10-19 15:28:42 +00:00
discus0434
42fbda83bb
move layer options into the create-hypernetwork UI
2022-10-19 14:30:33 +00:00
discus0434
7f8670c4ef
Merge branch 'master' into master
2022-10-19 15:18:45 +09:00
Silent
da72becb13
Use training width/height when training hypernetworks.
2022-10-19 09:13:28 +03:00
discus0434
e40ba281f1
update
2022-10-19 01:03:58 +09:00
discus0434
a5611ea502
update
2022-10-19 01:00:01 +09:00
discus0434
6021f7a75f
add options to custom hypernetwork layer structure
2022-10-19 00:51:36 +09:00
AngelBottomless
703e6d9e4e
check NaN for hypernetwork tuning
2022-10-15 17:15:26 +03:00
AUTOMATIC
c7a86f7fe9
add option to use batch size for training
2022-10-15 09:24:59 +03:00
AUTOMATIC
03d62538ae
remove duplicate code for log loss, add step, make it read from options rather than gradio input
2022-10-14 22:43:55 +03:00
AUTOMATIC
326fe7d44b
Merge remote-tracking branch 'Melanpan/master'
2022-10-14 22:14:50 +03:00
AUTOMATIC
c344ba3b32
add option to read generation params for learning previews from txt2img
2022-10-14 20:31:49 +03:00
AUTOMATIC
354ef0da3b
add hypernetwork multipliers
2022-10-13 20:12:37 +03:00
Melan
8636b50aea
Add learn_rate to csv and removed a left-over debug statement
2022-10-13 12:37:58 +02:00
Melan
1cfc2a1898
Save a csv containing the loss while training
2022-10-12 23:36:29 +02:00
AUTOMATIC
c3c8eef9fd
train: change filename processing to be more simple and configurable
train: make it possible to make text files with prompts
train: rework scheduler so that there's less repeating code in textual inversion and hypernets
train: move epochs setting to options
2022-10-12 20:49:47 +03:00
AUTOMATIC
ee015a1af6
change textual inversion tab to train
remake train interface to use tabs
2022-10-12 11:05:57 +03:00
Milly
2d006ce16c
xy_grid: Find hypernetwork by closest name
2022-10-12 10:40:10 +03:00
AUTOMATIC
6be32b31d1
report that training with medvram is possible
2022-10-11 23:07:09 +03:00
AUTOMATIC
d6fcc6b87b
apply lr schedule to hypernets
2022-10-11 22:03:05 +03:00
AUTOMATIC
6a9ea5b41c
prevent extra modules from being saved/loaded with hypernet
2022-10-11 19:22:30 +03:00
AUTOMATIC
d4ea5f4d86
add an option to unload models during hypernetwork training to save VRAM
2022-10-11 19:03:08 +03:00
AUTOMATIC
6d09b8d1df
produce error when training with medvram/lowvram enabled
2022-10-11 18:33:57 +03:00
AUTOMATIC
d682444ecc
add option to select hypernetwork modules when creating
2022-10-11 18:04:47 +03:00
AUTOMATIC
b0583be088
more renames
2022-10-11 15:54:34 +03:00
AUTOMATIC
873efeed49
rename hypernetwork dir to hypernetworks to prevent clash with an old filename that people who use zip instead of git clone will have
2022-10-11 15:51:30 +03:00