stable-diffusion-webui-gfx803.git (branch: master)
stable-diffusion-webui by AUTOMATIC1111, with patches for gfx803 GPUs and a Dockerfile
path: root/modules/textual_inversion
Age         Commit message  (Author)
2023-01-13  Merge branch 'master' into tensorboard  (AUTOMATIC1111)
2023-01-13  Merge pull request #6689 from Poktay/add_gradient_settings_to_logging_file  (AUTOMATIC1111)
2023-01-13  print bucket sizes for training without resizing images #6620  (AUTOMATIC)
2023-01-13  Merge pull request #6620 from guaneec/varsize_batch  (AUTOMATIC1111)
2023-01-12  add gradient settings to training settings log files  (Josh R)
2023-01-12  Allow creation of zero vectors for TI  (Shondoit)
2023-01-11  set descriptions  (Vladimir Mandic)
2023-01-10  Support loading textual inversion embeddings from safetensors files  (Lee Bousfield)
2023-01-11  Enable batch_size>1 for mixed-sized training  (dan)
2023-01-09  make a dropdown for prompt template selection  (AUTOMATIC)
2023-01-09  remove/simplify some changes from #6481  (AUTOMATIC)
2023-01-09  Merge branch 'master' into varsize  (AUTOMATIC1111)
2023-01-08  make it possible for extensions/scripts to add their own embedding directories  (AUTOMATIC)
2023-01-08  skip images in embeddings dir if they have a second .preview extension  (AUTOMATIC)
2023-01-08  Move batchsize check  (dan)
2023-01-08  Add checkbox for variable training dims  (dan)
2023-01-08  Allow variable img size  (dan)
2023-01-07  CLIP hijack rework  (AUTOMATIC)
2023-01-06  rework saving training params to file #6372  (AUTOMATIC)
2023-01-06  Merge pull request #6372 from timntorres/save-ti-hypernet-settings-to-txt-rev...  (AUTOMATIC1111)
2023-01-06  allow loading embeddings from subdirectories  (Faber)
2023-01-05  typo in TI  (Kuma)
2023-01-05  Include model in log file. Exclude directory.  (timntorres)
2023-01-05  Clean up ti, add same behavior to hypernetwork.  (timntorres)
2023-01-05  Add option to save ti settings to file.  (timntorres)
2023-01-04  Merge branch 'master' into gradient-clipping  (AUTOMATIC1111)
2023-01-04  use shared function from processing for creating dummy mask when training inp...  (AUTOMATIC)
2023-01-04  fix the merge  (AUTOMATIC)
2023-01-04  Merge branch 'master' into inpaint_textual_inversion  (AUTOMATIC1111)
2023-01-04  Merge pull request #6253 from Shondoit/ti-optim  (AUTOMATIC1111)
2023-01-03  add job info to modules  (Vladimir Mandic)
2023-01-03  Save Optimizer next to TI embedding  (Shondoit)
2023-01-02  feat(api): return more data for embeddings  (Philpax)
2023-01-02  fix the issue with training on SD2.0  (AUTOMATIC)
2022-12-31  changed embedding accepted shape detection to use existing code and support t...  (AUTOMATIC)
2022-12-31  validate textual inversion embeddings  (Vladimir Mandic)
2022-12-24  fix F541 f-string without any placeholders  (Yuval Aboulafia)
2022-12-14  Fix various typos  (Jim Hays)
2022-12-03  Merge branch 'master' into racecond_fix  (AUTOMATIC1111)
2022-12-03  Merge pull request #5194 from brkirch/autocast-and-mps-randn-fixes  (AUTOMATIC1111)
2022-12-02  Fix divide by 0 error  (PhytoEpidemic)
2022-11-30  Use devices.autocast instead of torch.autocast  (brkirch)
2022-11-27  Merge pull request #4688 from parasi22/resolve-embedding-name-in-filewords  (AUTOMATIC1111)
2022-11-27  Merge remote-tracking branch 'flamelaw/master'  (AUTOMATIC)
2022-11-27  set TI AdamW default weight decay to 0  (flamelaw)
2022-11-26  Add support Stable Diffusion 2.0  (AUTOMATIC)
2022-11-23  small fixes  (flamelaw)
2022-11-21  fix pin_memory with different latent sampling method  (flamelaw)
2022-11-20  moved deepdanbooru to pure pytorch implementation  (AUTOMATIC)
2022-11-20  fix random sampling with pin_memory  (flamelaw)