Age | Commit message | Author
2022-10-08 | chore: Fix typos | Aidan Holland
2022-10-08 | Break after finding the local directory of stable diffusion | Edouard Leurent
    Otherwise, we may override it with one of the next two paths ('.' or '..') if the repository is also present there, and then the local paths of the other modules (taming transformers, codeformers, etc.) won't be found in sd_path/../. Fixes https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/1085
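    A minimal sketch of the search-and-break pattern this commit describes; the candidate list, marker file, and variable names below are illustrative assumptions, not the repository's exact code:

        import os

        # Assumed candidate locations, checked in order; '.' and '..' come last.
        candidates = ['repositories/stable-diffusion', '.', '..']

        sd_path = None
        for candidate in candidates:
            # Treat a known source file as the marker that this is the SD repo.
            if os.path.exists(os.path.join(candidate, 'ldm/models/diffusion/ddpm.py')):
                sd_path = os.path.abspath(candidate)
                break  # the fix: without this, a later '.' or '..' hit overrides sd_path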
2022-10-08 | add 'Ignore last layers of CLIP model' option as a parameter to the infotext | AUTOMATIC
2022-10-08 | make --force-enable-xformers work without needing --xformers | AUTOMATIC
2022-10-08 | Added ability to ignore last n layers in FrozenCLIPEmbedder | Fampai
2022-10-08 | Update ui.py | DepFA
2022-10-08 | TI preprocess wording | DepFA
    I had to check the code to work out what splitting was 🤷🏿
2022-10-08 | Merge branch 'master' into dev/deepdanbooru | Greendayle
2022-10-08 | add --force-enable-xformers option and also add messages to console regarding cross attention optimizations | AUTOMATIC
2022-10-08 | add fallback for xformers_attnblock_forward | AUTOMATIC
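    A minimal sketch of the fallback pattern named in this commit; the function names and the attempt/fall-back structure are assumptions for illustration, not the repository's exact code:

        import torch  # q, k, v below are torch tensors, e.g. shape (batch, seq, heads, dim)

        def optimized_attention(q, k, v):
            # xformers' memory-efficient kernel; raises NotImplementedError
            # when no backend supports the given dtype/shape.
            import xformers.ops
            return xformers.ops.memory_efficient_attention(q, k, v)

        def baseline_attention(q, k, v):
            # Plain scaled-dot-product attention as the unoptimized path.
            scale = q.shape[-1] ** -0.5
            weights = (q @ k.transpose(-2, -1) * scale).softmax(dim=-1)
            return weights @ v

        def attention_with_fallback(q, k, v):
            try:
                return optimized_attention(q, k, v)
            except NotImplementedError:
                # Fall back instead of aborting generation mid-batch.
                return baseline_attention(q, k, v)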
2022-10-08 | made deepdanbooru optional, added to readme, automatic download of deepbooru model | Greendayle
2022-10-08 | alternate prompt | Artem Zagidulin
2022-10-08 | Add GZipMiddleware to root demo | DepFA
2022-10-08 | check for ampere without destroying the optimizations. again. | C43H66N12O12S2
2022-10-08 | check for ampere | C43H66N12O12S2
2022-10-08 | check for 3.10 | C43H66N12O12S2
2022-10-08 | Merge branch 'master' into dev/deepdanbooru | Greendayle
2022-10-08 | why did you do this | AUTOMATIC
2022-10-08 | fix conflicts | Greendayle
2022-10-08 | Fixed typo | Milly
2022-10-08 | restore old opt_split_attention/disable_opt_split_attention logic | AUTOMATIC
2022-10-08 | simplify xformers options: --xformers to enable and that's it | AUTOMATIC
2022-10-08 | emergency fix for xformers (continue + shared) | AUTOMATIC
2022-10-08 | Merge pull request #1851 from C43H66N12O12S2/flash | AUTOMATIC1111
    xformers attention
2022-10-08 | Update sd_hijack.py | C43H66N12O12S2
2022-10-08 | Update requirements_versions.txt | C43H66N12O12S2
2022-10-08 | Update launch.py | C43H66N12O12S2
2022-10-08 | update sd_hijack_opt to respect new env variables | C43H66N12O12S2
2022-10-08 | add xformers_available shared variable | C43H66N12O12S2
2022-10-08 | default to split attention if cuda is available and xformers is not | C43H66N12O12S2
2022-10-08 | check for OS and env variable | C43H66N12O12S2
2022-10-08 | fix bug where, when using prompt composition, hijack_comments generated before the final AND would be dropped | MrCheeze
2022-10-08 | Remove duplicate event listeners | guaneec
2022-10-08 | fix glob path in hypernetwork.py | ddPn08
2022-10-08 | fix AND broken for long prompts | AUTOMATIC
2022-10-08 | fix bugs related to variable prompt lengths | AUTOMATIC
2022-10-08 | Update requirements.txt | C43H66N12O12S2
2022-10-08 | install xformers | C43H66N12O12S2
2022-10-08 | do not let user choose his own prompt token count limit | AUTOMATIC
2022-10-08 | check specifically for skipped | Trung Ngo
2022-10-08 | Add button to skip the current iteration | Trung Ngo
2022-10-08 | Merge remote-tracking branch 'origin/master' | AUTOMATIC
2022-10-08 | let user choose his own prompt token count limit | AUTOMATIC
2022-10-08 | fix: handle the case when state_dict does not exist | leko
2022-10-08 | use new attnblock for xformers path | C43H66N12O12S2
2022-10-08 | Update sd_hijack_optimizations.py | C43H66N12O12S2
2022-10-08 | add xformers attnblock and hypernetwork support | C43H66N12O12S2
2022-10-08 | add info about cross attention javascript shortcut code | AUTOMATIC
2022-10-08 | implement removal | DepFA
2022-10-08 | context menu styling | DepFA