path: root/modules/hypernetworks
Age         Commit message (Author)
2022-10-29  Merge pull request #3858 from R-N/log-csv (AUTOMATIC1111)
            Fix log off by 1 #3847
2022-10-29  Merge pull request #3717 from benkyoujouzu/master (AUTOMATIC1111)
            Add missing support for linear activation in hypernetwork
2022-10-29  Re enable linear (AngelBottomless)
2022-10-28  Fix log off by 1 (Muhammad Rizqi Nur)
2022-10-28  Add missing support for linear activation in hypernetwork (benkyoujouzu)
2022-10-27  Disable unavailable or duplicate options (AngelBottomless)
2022-10-26  patch bug (SeverianVoid's comment on 5245c7a) (timntorres)
2022-10-26  remove duplicate keys and lowercase (AngelBottomless)
2022-10-26  Weight initialization and More activation func (AngelBottomless)
            - add weight init
            - add weight init option in create_hypernetwork
            - fstringify hypernet info
            - save weight initialization info for further debugging
            - fill bias with zero for He/Xavier
            - initialize LayerNorm with Normal
            - fix loading weight_init
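The weight-initialization entry above names the He and Xavier schemes with zero-filled biases. As a rough illustration of what those schemes mean (a sketch, not the repository's actual code; `init_std` is a hypothetical helper):

```python
import math

def init_std(fan_in: int, fan_out: int, scheme: str) -> float:
    """Standard deviation for normal-distributed weight init.

    Hypothetical helper: He (Kaiming) init targets ReLU-family
    activations and scales by fan-in only; Xavier (Glorot) targets
    roughly linear activations and scales by fan-in plus fan-out.
    In both schemes the commit fills biases with zero rather than
    sampling them.
    """
    if scheme == "he":
        return math.sqrt(2.0 / fan_in)
    if scheme == "xavier":
        return math.sqrt(2.0 / (fan_in + fan_out))
    raise ValueError(f"unknown scheme: {scheme}")
```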
2022-10-24  check length for variance (AngelBottomless)
2022-10-24  convert deque -> list (AngelBottomless)
            I don't feel this being efficient
2022-10-24  statistics for pbar (AngelBottomless)
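The two entries above ("check length for variance", "statistics for pbar") pair naturally: a sample standard deviation is undefined for fewer than two values. A minimal sketch of the idea, assuming the progress bar reports a mean and standard deviation over recent losses (`pbar_stats` is a hypothetical name):

```python
import statistics

def pbar_stats(losses: list[float]) -> tuple[float, float]:
    # Hypothetical sketch: stdev raises for fewer than two samples,
    # hence the length check the commit title refers to.
    if not losses:
        return 0.0, 0.0
    mean = statistics.mean(losses)
    std = statistics.stdev(losses) if len(losses) >= 2 else 0.0
    return mean, std
```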
2022-10-24  cleanup some code (AngelBottomless)
2022-10-24  Hypernetworks - fix KeyError in statistics caching (AngelBottomless)
            Statistics logging has changed to {filename: list[losses]}, so it
            has to use loss_info[key].pop()
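The KeyError fix above hinges on the cache's shape. A hedged sketch of a `{filename: list[losses]}` structure as the commit body describes it (`loss_info`, `record_loss`, and the 320 cap are illustrative names and values, not the repository's):

```python
from collections import defaultdict

# Hypothetical per-file loss cache shaped as described in the commit
# body: each dataset filename maps to its own list of recent losses.
loss_info: defaultdict[str, list[float]] = defaultdict(list)

def record_loss(filename: str, loss: float, maxlen: int = 320) -> None:
    losses = loss_info[filename]
    losses.append(loss)
    if len(losses) > maxlen:
        # With plain lists (the deque was converted away two commits up),
        # trimming must index into the per-key list: loss_info[key].pop(0).
        losses.pop(0)
```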
2022-10-23  Update hypernetwork.py (DepFA)
2022-10-22  Allow tracking real-time loss (AngelBottomless)
            Someone had 6000 images in their dataset, and it was shown as 0,
            which was confusing. This will allow tracking the real-time
            dataset-average loss for registered objects.
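The fix above replaces a misleading constant 0 with an actual average over the losses registered so far. The behaviour can be sketched as (an assumed shape, not the repository's code):

```python
def dataset_average(losses: list[float]) -> float:
    # Hypothetical sketch: average whatever losses have been registered
    # so far, instead of showing 0 until the whole (e.g. 6000-image)
    # dataset has been seen.
    return sum(losses) / len(losses) if losses else 0.0
```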
2022-10-22  Update hypernetwork.py (AngelBottomless)
2022-10-22  small fix (discus0434)
2022-10-22  Merge branch 'AUTOMATIC1111:master' into master (discus0434)
2022-10-22  small fix (discus0434)
2022-10-22  add an option to avoid dying relu (discus0434)
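"Dying ReLU" means units whose pre-activations go permanently negative stop receiving any gradient. One common remedy is a LeakyReLU-style activation; the commit likely offers something along these lines, though the exact option it adds is not shown in this log:

```python
def leaky_relu(x: float, negative_slope: float = 0.01) -> float:
    # Keep a small slope on the negative side so the unit never sits in
    # an exactly-zero-gradient region; this is the standard LeakyReLU
    # form (illustrative only, not the repository's implementation).
    return x if x > 0.0 else negative_slope * x
```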
2022-10-22  added a guard for hypernet training that will stop early if weights are getting no gradients (AUTOMATIC)
2022-10-22  Merge branch 'master' of upstream (discus0434)
2022-10-22  add dropout (discus0434)
2022-10-21  Remove unused variable. (timntorres)
2022-10-21  Match hypernet name with filename in all cases. (timntorres)
2022-10-21  Sanitize hypernet name input. (timntorres)
2022-10-21  turns out LayerNorm also has weight and bias and needs to be pre-multiplied and trained for hypernets (AUTOMATIC)
2022-10-21  Merge branch 'master' into training-help-text (AUTOMATIC1111)
2022-10-21  Revise comments. (timntorres)
2022-10-21  Issue #2921 - Give PNG info to Hypernet previews. (timntorres)
2022-10-21  a more strict check for activation type and a more reasonable check for type of layer in hypernets (AUTOMATIC)
2022-10-21  Revert "fix bugs and optimizations" (aria1th)
            This reverts commit 108be15500aac590b4e00420635d7b61fccfa530.
2022-10-21  fix bugs and optimizations (AngelBottomless)
2022-10-20  only linear (AngelBottomless)
2022-10-20  generalized some functions and option for ignoring first layer (AngelBottomless)
2022-10-20  Merge branch 'AUTOMATIC1111:master' into master (discus0434)
2022-10-20  allow float sizes for hypernet's layer_structure (AUTOMATIC)
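The entry above relaxes `layer_structure` from integer to float entries. Under the assumption that each entry is a multiplier of the base embedding width (a sketch; `layer_widths` is a hypothetical helper, not the actual implementation), float sizes make intermediate multipliers like 1.5 possible:

```python
def layer_widths(base_dim: int, layer_structure: list[float]) -> list[int]:
    # Hypothetical sketch: each multiplier scales the base dimension,
    # so [1, 1.5, 1] on a 768-wide embedding gives 768 -> 1152 -> 768.
    return [int(base_dim * m) for m in layer_structure]
```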
2022-10-20  update (discus0434)
2022-10-20  allow overwrite old hn (DepFA)
2022-10-20  change html output (DepFA)
2022-10-19  fix for #3086 failing to load any previous hypernet (discus0434)
2022-10-19  fix for #3086 failing to load any previous hypernet (AUTOMATIC)
2022-10-19  enable to write layer structure of hn himself (discus0434)
2022-10-19  layer options moves into create hnet ui (discus0434)
2022-10-19  Merge branch 'master' into master (discus0434)
2022-10-19  Use training width/height when training hypernetworks. (Silent)
2022-10-19  update (discus0434)
2022-10-19  update (discus0434)
2022-10-19  add options to custom hypernetwork layer structure (discus0434)