path: root/modules/textual_inversion
Age | Commit message | Author
2023-01-11 | set descriptions | Vladimir Mandic
2023-01-10 | Support loading textual inversion embeddings from safetensors files | Lee Bousfield
2023-01-11 | Enable batch_size>1 for mixed-sized training | dan
2023-01-09 | make a dropdown for prompt template selection | AUTOMATIC
2023-01-09 | remove/simplify some changes from #6481 | AUTOMATIC
2023-01-09 | Merge branch 'master' into varsize | AUTOMATIC1111
2023-01-08 | make it possible for extensions/scripts to add their own embedding directories | AUTOMATIC
2023-01-08 | skip images in embeddings dir if they have a second .preview extension | AUTOMATIC
2023-01-08 | Move batchsize check | dan
2023-01-08 | Add checkbox for variable training dims | dan
2023-01-08 | Allow variable img size | dan
2023-01-07 | CLIP hijack rework | AUTOMATIC
2023-01-06 | rework saving training params to file #6372 | AUTOMATIC
2023-01-06 | Merge pull request #6372 from timntorres/save-ti-hypernet-settings-to-txt-revised | AUTOMATIC1111
             Save hypernet and textual inversion settings to text file, revised.
2023-01-06 | allow loading embeddings from subdirectories | Faber
2023-01-05 | typo in TI | Kuma
2023-01-05 | Include model in log file. Exclude directory. | timntorres
2023-01-05 | Clean up ti, add same behavior to hypernetwork. | timntorres
2023-01-05 | Add option to save ti settings to file. | timntorres
2023-01-04 | Merge branch 'master' into gradient-clipping | AUTOMATIC1111
2023-01-04 | use shared function from processing for creating dummy mask when training inpainting model | AUTOMATIC
2023-01-04 | fix the merge | AUTOMATIC
2023-01-04 | Merge branch 'master' into inpaint_textual_inversion | AUTOMATIC1111
2023-01-04 | Merge pull request #6253 from Shondoit/ti-optim | AUTOMATIC1111
             Save Optimizer next to TI embedding
2023-01-03 | add job info to modules | Vladimir Mandic
2023-01-03 | Save Optimizer next to TI embedding | Shondoit
             Also adds a check to load only .PT and .BIN files as embeddings, since .optim files are now saved in the same directory.
2023-01-02 | feat(api): return more data for embeddings | Philpax
2023-01-02 | fix the issue with training on SD2.0 | AUTOMATIC
2022-12-31 | changed embedding accepted shape detection to use existing code and support the new alt-diffusion model, and reformatted messages a bit #6149 | AUTOMATIC
2022-12-31 | validate textual inversion embeddings | Vladimir Mandic
2022-12-24 | fix F541 f-string without any placeholders | Yuval Aboulafia
2022-12-14 | Fix various typos | Jim Hays
2022-12-03 | Merge branch 'master' into racecond_fix | AUTOMATIC1111
2022-12-03 | Merge pull request #5194 from brkirch/autocast-and-mps-randn-fixes | AUTOMATIC1111
             Use devices.autocast() and fix MPS randn issues
2022-12-02 | Fix divide by 0 error | PhytoEpidemic
             Fixes an edge case where a weight of 0, which occasionally appears in some specific situations, was crashing the script.
2022-11-30 | Use devices.autocast instead of torch.autocast | brkirch
2022-11-27 | Merge pull request #4688 from parasi22/resolve-embedding-name-in-filewords | AUTOMATIC1111
             resolve [name] after resolving [filewords] in training
2022-11-27 | Merge remote-tracking branch 'flamelaw/master' | AUTOMATIC
2022-11-27 | set TI AdamW default weight decay to 0 | flamelaw
2022-11-26 | Add support for Stable Diffusion 2.0 | AUTOMATIC
2022-11-23 | small fixes | flamelaw
2022-11-21 | fix pin_memory with different latent sampling method | flamelaw
2022-11-20 | moved deepdanbooru to pure pytorch implementation | AUTOMATIC
2022-11-20 | fix random sampling with pin_memory | flamelaw
2022-11-20 | remove unnecessary comment | flamelaw
2022-11-20 | Gradient accumulation, autocast fix, new latent sampling method, etc | flamelaw
2022-11-19 | Merge pull request #4812 from space-nuko/feature/interrupt-preprocessing | AUTOMATIC1111
             Add interrupt button to preprocessing
2022-11-19 | change StableDiffusionProcessing to internally use sampler name instead of sampler index | AUTOMATIC
2022-11-17 | Add interrupt button to preprocessing | space-nuko
2022-11-13 | resolve [name] after resolving [filewords] in training | parasi