From c12d960d1ee4f9134c2516862ef991ec52d3f59e Mon Sep 17 00:00:00 2001
From: Robin Rombach
Date: Wed, 7 Dec 2022 15:10:21 +0100
Subject: [PATCH] add details on precision for 2.1

---
 README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.md b/README.md
index 926327c..50c5974 100644
--- a/README.md
+++ b/README.md
@@ -13,6 +13,7 @@ new checkpoints. The following list provides an overview of all currently availa
 *Version 2.1*
 - New stable diffusion model (_Stable Diffusion 2.1-v_, [HuggingFace](https://huggingface.co/stabilityai/stable-diffusion-2-1)) at 768x768 resolution and (_Stable Diffusion 2.1-base_, [HuggingFace](https://huggingface.co/stabilityai/stable-diffusion-2-1-base)) at 512x512 resolution, both based on the same number of parameters and architecture as 2.0 and fine-tuned on 2.0, on a less restrictive NSFW filtering of the [LAION-5B](https://laion.ai/blog/laion-5b/) dataset.
+By default, the attention operation of the model is evaluated at full precision when `xformers` is not installed. To enable fp16 (which can cause numerical instabilities with the vanilla attention module on the v2.1 model), run your script with `ATTN_PRECISION=fp16 python <thescript.py>`
 
 **November 24, 2022**
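
For context, below is a minimal sketch of how an attention layer can honour such an `ATTN_PRECISION` switch. The environment-variable name is taken from the patch above, but the helper name, default value, and surrounding structure are illustrative assumptions rather than the repository's exact code.

```python
import os

import torch
from torch import einsum

# Illustrative sketch: read the switch referenced in the patch.
# (The assumed default and where this lookup lives in the real repo may differ.)
_ATTN_PRECISION = os.environ.get("ATTN_PRECISION", "fp32")


def scaled_attention_scores(q: torch.Tensor, k: torch.Tensor, scale: float) -> torch.Tensor:
    """Hypothetical helper: compute the attention similarity matrix q @ k^T * scale.

    With ATTN_PRECISION=fp32 (the assumed default), q and k are upcast so the
    large matmul is accumulated in full precision, avoiding the fp16
    instabilities mentioned for the v2.1 model.
    """
    if _ATTN_PRECISION == "fp32":
        # Upcast before the matmul; a full implementation would also disable
        # autocast around this step so the result is not silently downcast again.
        q, k = q.float(), k.float()
    # With ATTN_PRECISION=fp16 the half-precision inputs are used directly:
    # faster and lighter on memory, but potentially unstable on v2.1.
    return einsum("b i d, b j d -> b i j", q, k) * scale


if __name__ == "__main__":
    # Toy shapes: batch of 2, 77 tokens, 64-dim heads.
    q = torch.randn(2, 77, 64, dtype=torch.float16)
    k = torch.randn(2, 77, 64, dtype=torch.float16)
    sim = scaled_attention_scores(q, k, scale=64 ** -0.5)
    print(sim.shape, sim.dtype)  # under the fp32 default: torch.Size([2, 77, 77]) torch.float32
```

Launched as in the patch, e.g. `ATTN_PRECISION=fp16 python <thescript.py>`, such a helper would skip the upcast and keep the attention matmul in half precision.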