author    Alexander Ljungberg <aljungberg@wireload.net>  2023-06-06 21:45:30 +0100
committer GitHub <noreply@github.com>  2023-06-06 21:45:30 +0100
commit    d9cc0910c8aca481f294009526897152901c32b9 (patch)
tree      c850c5e4316b1cf830224dcae01aa5aecff589a3 /style.css
parent    baf6946e06249c5af9851c60171692c44ef633e0 (diff)
Fix upcast attention dtype error.
Without this fix, enabling the "Upcast cross attention layer to float32" option while also using `--opt-sdp-attention` breaks generation with an error:

```
File "/ext3/automatic1111/stable-diffusion-webui/modules/sd_hijack_optimizations.py", line 612, in sdp_attnblock_forward
  out = torch.nn.functional.scaled_dot_product_attention(q, k, v, dropout_p=0.0, is_causal=False)
RuntimeError: Expected query, key, and value to have the same dtype, but got query.dtype: float key.dtype: float and value.dtype: c10::Half instead.
```

The fix is to upcast the value tensor too.
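For illustration, a minimal sketch of the kind of change described, assuming the upcast option has already converted `q` and `k` to float32 while `v` is still float16 (the function name here is hypothetical; the actual patch is in `sdp_attnblock_forward` in `modules/sd_hijack_optimizations.py`):

```python
import torch

def sdp_attention_with_upcast(q, k, v):
    # Sketch of the fix: scaled_dot_product_attention requires query, key,
    # and value to share one dtype. With cross-attention upcasting enabled,
    # q and k may arrive as float32 while v is still float16 (c10::Half),
    # so align v with q's dtype before the call.
    if v.dtype != q.dtype:
        v = v.to(q.dtype)
    return torch.nn.functional.scaled_dot_product_attention(
        q, k, v, dropout_p=0.0, is_causal=False
    )
```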
Diffstat (limited to 'style.css'):
0 files changed, 0 insertions, 0 deletions