Age  Commit message  Author
2022-10-08  simplify xformers options: --xformers to enable and that's it  (AUTOMATIC)
2022-10-08  emergency fix for xformers (continue + shared)  (AUTOMATIC)
2022-10-08  Merge pull request #1851 from C43H66N12O12S2/flash  (AUTOMATIC1111)
            xformers attention
2022-10-08  Update sd_hijack.py  (C43H66N12O12S2)
2022-10-08  Update requirements_versions.txt  (C43H66N12O12S2)
2022-10-08  Update launch.py  (C43H66N12O12S2)
2022-10-08  update sd_hijack_opt to respect new env variables  (C43H66N12O12S2)
2022-10-08  add xformers_available shared variable  (C43H66N12O12S2)
2022-10-08  default to split attention if cuda is available and xformers is not  (C43H66N12O12S2)
2022-10-08  check for OS and env variable  (C43H66N12O12S2)
2022-10-08  fix bug where when using prompt composition, hijack_comments generated before the final AND will be dropped  (MrCheeze)
2022-10-08  Remove duplicate event listeners  (guaneec)
2022-10-08  fix glob path in hypernetwork.py  (ddPn08)
2022-10-08  fix AND broken for long prompts  (AUTOMATIC)
2022-10-08  fix bugs related to variable prompt lengths  (AUTOMATIC)
2022-10-08  Update requirements.txt  (C43H66N12O12S2)
2022-10-08  install xformers  (C43H66N12O12S2)
2022-10-08  do not let user choose his own prompt token count limit  (AUTOMATIC)
2022-10-08  check specifically for skipped  (Trung Ngo)
2022-10-08  Add button to skip the current iteration  (Trung Ngo)
2022-10-08  Merge remote-tracking branch 'origin/master'  (AUTOMATIC)
2022-10-08  let user choose his own prompt token count limit  (AUTOMATIC)
2022-10-08  fix: handles when state_dict does not exist  (leko)
2022-10-08  use new attnblock for xformers path  (C43H66N12O12S2)
2022-10-08  Update sd_hijack_optimizations.py  (C43H66N12O12S2)
2022-10-08  add xformers attnblock and hypernetwork support  (C43H66N12O12S2)
2022-10-08  add info about cross attention javascript shortcut code  (AUTOMATIC)
2022-10-08  implement removal  (DepFA)
2022-10-08  context menu styling  (DepFA)
2022-10-08  Context Menus  (DepFA)
2022-10-08  Add hypernetwork support to split cross attention v1  (brkirch)
            * Add hypernetwork support to split_cross_attention_forward_v1
            * Fix device check in esrgan_model.py to use devices.device_esrgan instead of shared.device
2022-10-08  edit-attention browser compatibility and readme typo  (Jairo Correa)
2022-10-08  delete broken and unnecessary aliases  (C43H66N12O12S2)
2022-10-08  switch to the proper way of calling xformers  (C43H66N12O12S2)
2022-10-07  linux test  (Greendayle)
2022-10-07  even more powerful fix  (Greendayle)
2022-10-07  loading tf only in interrogation process  (Greendayle)
2022-10-07  Merge branch 'master' into dev/deepdanbooru  (Greendayle)
2022-10-07  make it possible to use hypernetworks without opt split attention  (AUTOMATIC)
2022-10-07  do not stop working on failed hypernetwork load  (AUTOMATIC)
2022-10-07  support loading VAE  (AUTOMATIC)
2022-10-07  added support for hypernetworks (???)  (AUTOMATIC)
2022-10-07  Update sd_hijack.py  (C43H66N12O12S2)
2022-10-07  Update sd_hijack.py  (C43H66N12O12S2)
2022-10-07  Update sd_hijack.py  (C43H66N12O12S2)
2022-10-07  Update requirements.txt  (C43H66N12O12S2)
2022-10-07  Update shared.py  (C43H66N12O12S2)
2022-10-07  Update sd_hijack.py  (C43H66N12O12S2)
2022-10-07  add xformers attention  (C43H66N12O12S2)
2022-10-06  added ctrl+up or ctrl+down hotkeys for attention  (AUTOMATIC)