Age         Commit message                                                (Author)
2023-06-27  alternate fix for catch errors when retrieving extension index #11290  (AUTOMATIC)
2023-06-27  Merge pull request #11315 from guming3d/master  (AUTOMATIC1111)
            fix: adding elem_id for img2img resize to and resize by tabs
2023-06-27  Merge pull request #11415 from netux/extensions-toggle-all  (AUTOMATIC1111)
            Add checkbox to check/uncheck all extensions in the Installed tab
2023-06-27  Merge pull request #11146 from AUTOMATIC1111/api-quit-restart  (AUTOMATIC1111)
            api quit restart
2023-06-27  Merge branch 'master' into dev  (AUTOMATIC)
2023-06-27  Merge branch 'release_candidate'  (AUTOMATIC)
2023-06-27  Merge branch 'release_candidate' into dev  (AUTOMATIC)
2023-06-27  Merge pull request #11189 from daswer123/dev  (AUTOMATIC1111)
            Zoom and pan: More options in the settings and improved error output
2023-06-27  Merge pull request #11136 from arch-fan/typo  (AUTOMATIC1111)
            fixed typos
2023-06-27  Merge pull request #11199 from akx/makedirs  (AUTOMATIC1111)
            Use os.makedirs(..., exist_ok=True)
2023-06-27  Merge pull request #11294 from zhtttylz/Fix_Typo_of_hints.js  (AUTOMATIC1111)
            Fix Typo of hints.js
2023-06-27  Merge pull request #11408 from wfjsw/patch-1  (AUTOMATIC1111)
            Strip whitespaces from URL and dirname prior to extension installation
2023-06-27  add missing infotext entry for the pad cond/uncond option  (AUTOMATIC)
2023-06-25  feat(extensions): add toggle all checkbox to Installed tab  (Martín (Netux) Rodríguez)
            Small QoL addition. While there is an option to disable all extensions via the radio buttons at the top, that only acts as an added flag and doesn't actually change the state of the extensions in the UI. A use case for this checkbox is disabling all extensions except for a few, which is important when debugging extensions. You could do that before, but you'd have to uncheck and recheck every extension one by one.
2023-06-25  Strip whitespaces from URL and dirname prior to extension installation  (Jabasukuriputo Wang)
            This avoids some cryptic errors caused by accidental spaces around URLs.
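The stripping fix above amounts to trimming both inputs before they reach git and the filesystem. A minimal sketch, with a hypothetical helper name (the real code lives in the webui's extension-install handler):

```python
def sanitize_install_args(url: str, dirname: str) -> tuple[str, str]:
    """Strip stray whitespace that would otherwise break `git clone`
    and directory creation. Illustrative helper, not the actual
    webui function."""
    return url.strip(), dirname.strip()
```

An accidentally pasted trailing space in a repository URL then no longer produces a cryptic clone failure.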
2023-06-19  fix: adding elem_id for img2img resize to and resize by tabs  (George Gu)
2023-06-18  Fix Typo of hints.js  (zhtttylz)
2023-06-18  update the description of --add-stop-route  (w-e-w)
2023-06-14  terminate -> stop  (w-e-w)
2023-06-14  response 501 if not able to restart  (w-e-w)
2023-06-14  update workflow kill test server  (w-e-w)
2023-06-14  rename routes  (w-e-w)
2023-06-14  Formatting code with Prettier  (Danil Boldyrev)
2023-06-14  Reworked the disabling of functions, refactored part of the code  (Danil Boldyrev)
2023-06-13  Use os.makedirs(..., exist_ok=True)  (Aarni Koskela)
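The pattern this commit standardizes on, shown as a small sketch:

```python
import os


def ensure_dir(path: str) -> None:
    # Replaces the racy two-step pattern:
    #     if not os.path.exists(path):
    #         os.makedirs(path)
    # With exist_ok=True the call is idempotent: it succeeds whether
    # or not the directory already exists, with no check-then-create
    # race between processes.
    os.makedirs(path, exist_ok=True)
```

Calling it twice on the same path is safe, which is exactly what the old two-step check could not guarantee under concurrency.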
2023-06-12  remove console.log  (Danil Boldyrev)
2023-06-12  Improved error output, improved settings menu  (Danil Boldyrev)
2023-06-12  remove fastapi.Response  (w-e-w)
2023-06-12  move _stop route to api  (w-e-w)
2023-06-10  quit restart  (w-e-w)
2023-06-09  fixed typos  (arch-fan)
2023-06-09  Merge branch 'dev' into release_candidate  (AUTOMATIC)
2023-06-09  add changelog for 1.4.0  (AUTOMATIC)
2023-06-09  linter  (AUTOMATIC)
2023-06-09  Merge pull request #11092 from AUTOMATIC1111/Generate-Forever-during-generation  (AUTOMATIC1111)
            Allow activation of Generate Forever during generation
2023-06-09  Merge pull request #11087 from AUTOMATIC1111/persistent_conds_cache  (AUTOMATIC1111)
            persistent conds cache
2023-06-09  Merge pull request #11123 from akx/dont-die-on-bad-symlink-lora  (AUTOMATIC1111)
            Don't die when a LoRA is a broken symlink
2023-06-09  Merge pull request #10295 from Splendide-Imaginarius/mk2-blur-mask  (AUTOMATIC1111)
            Split mask blur into X and Y components, patch Outpainting MK2 accordingly
2023-06-09  Merge pull request #11048 from DGdev91/force_python1_navi_renoir  (AUTOMATIC1111)
            Forcing Torch Version to 1.13.1 for RX 5000 series GPUs
2023-06-09  Don't die when a LoRA is a broken symlink  (Aarni Koskela)
            Fixes #11098
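The defensive pattern behind this fix can be sketched as follows; the scanner shown is illustrative, not the webui's actual LoRA discovery code:

```python
import os


def list_model_files(dirpath: str) -> list[str]:
    """Collect files from a model directory, skipping broken symlinks
    instead of crashing on them."""
    found = []
    for name in sorted(os.listdir(dirpath)):
        full = os.path.join(dirpath, name)
        # os.path.exists() follows symlinks, so it returns False for a
        # symlink whose target is missing; os.path.islink() confirms
        # the entry is a link rather than a genuinely absent file.
        if os.path.islink(full) and not os.path.exists(full):
            print(f"Skipping broken symlink: {full}")
            continue
        found.append(full)
    return found
```

The key point is that a dangling symlink is silently skipped with a warning rather than raising when the loader later tries to open it.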
2023-06-09  Split Outpainting MK2 mask blur into X and Y components  (Splendide Imaginarius)
            Fixes unexpected noise in non-outpainted borders when using the MK2 script.
2023-06-09  Split mask blur into X and Y components  (Splendide Imaginarius)
            Prerequisite to fixing the Outpainting MK2 mask blur bug.
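A Gaussian blur with independent X and Y strengths, as these commits introduce, can be sketched as a separable convolution (pure NumPy for illustration; the webui's own masking code is structured differently):

```python
import numpy as np


def gaussian_kernel1d(sigma: float, radius: int) -> np.ndarray:
    """Normalized 1-D Gaussian kernel of width 2*radius + 1."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()


def blur_mask_xy(mask: np.ndarray, sigma_x: float, sigma_y: float) -> np.ndarray:
    """Blur a 2-D mask with separate horizontal and vertical sigmas.

    A 2-D Gaussian is separable: convolve each row with the X kernel,
    then each column with the Y kernel. Setting one sigma to 0 blurs
    along a single axis only.
    """
    out = mask.astype(np.float64)
    if sigma_x > 0:
        kx = gaussian_kernel1d(sigma_x, max(1, int(3 * sigma_x)))
        out = np.apply_along_axis(lambda r: np.convolve(r, kx, mode="same"), 1, out)
    if sigma_y > 0:
        ky = gaussian_kernel1d(sigma_y, max(1, int(3 * sigma_y)))
        out = np.apply_along_axis(lambda c: np.convolve(c, ky, mode="same"), 0, out)
    return out
```

For outpainting, blurring only along the axis that crosses the new border avoids feathering the mask into edges that were never extended, which is the source of the noise the MK2 fix addresses.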
2023-06-08  Generate Forever during generation  (w-e-w)
2023-06-08  persistent conds cache  (w-e-w)
            Update shared.py
2023-06-07  Merge pull request #11058 from AUTOMATIC1111/api-wiki  (AUTOMATIC1111)
            link footer API to Wiki when API is not active
2023-06-07  Merge pull request #11066 from aljungberg/patch-1  (AUTOMATIC1111)
            Fix upcast attention dtype error.
2023-06-06  Fix upcast attention dtype error.  (Alexander Ljungberg)
            Without this fix, enabling the "Upcast cross attention layer to float32" option while also using `--opt-sdp-attention` breaks generation with an error:

            ```
            File "/ext3/automatic1111/stable-diffusion-webui/modules/sd_hijack_optimizations.py", line 612, in sdp_attnblock_forward
                out = torch.nn.functional.scaled_dot_product_attention(q, k, v, dropout_p=0.0, is_causal=False)
            RuntimeError: Expected query, key, and value to have the same dtype, but got query.dtype: float key.dtype: float and value.dtype: c10::Half instead.
            ```

            The fix is to make sure to upcast the value tensor too.
2023-06-06  Skip forcing Python and PyTorch versions if TORCH_COMMAND already set  (DGdev91)
2023-06-06  link footer API to Wiki when API is not active  (w-e-w)
2023-06-06  Write "RX 5000 Series" instead of "Navi" in err  (DGdev91)