Model Families

Optimization techniques

Latent Consistency Models (LCM)

LCMs overcome the slow iterative sampling of latent diffusion models (LDMs), enabling fast inference with minimal steps on any pre-trained LDM (e.g., Stable Diffusion). Instead of iteratively denoising, an LCM directly predicts the solution of the reverse diffusion process in latent space, so only a handful of steps are needed. A high-quality 768x768 LCM distilled from Stable Diffusion requires only 32 A100 GPU-hours of training (8 GPUs for ~4 hours) and supports 2-4 step inference.
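The core idea, direct solution prediction instead of many small denoising steps, can be illustrated with a toy sketch. This is not the LCM training or sampling code; it uses a hypothetical one-dimensional probability-flow ODE with a known analytical consistency function (which a real LCM would learn via distillation in latent space), and a `multistep_sample` helper invented here to mimic few-step consistency sampling:

```python
import math
import random

# Toy probability-flow ODE: dx/dt = x, so trajectories are x(t) = x0 * exp(t).
# The consistency function maps ANY point on a trajectory straight back to its
# origin x0 in a single evaluation -- the property an LCM distills.
def consistency_fn(x, t):
    return x * math.exp(-t)

def multistep_sample(x_T, T, steps, noise_scale=0.1, rng=None):
    """Few-step sampling: jump to the predicted x0, re-noise to a smaller
    time, and jump again (2-4 such steps in practice for LCMs)."""
    rng = rng or random.Random(0)
    x, t = x_T, T
    times = [T * (1 - (k + 1) / steps) for k in range(steps)]  # T -> 0
    for t_next in times:
        x0 = consistency_fn(x, t)  # direct solution prediction
        if t_next > 0:
            # perturb back onto a trajectory at the earlier time t_next
            x = x0 * math.exp(t_next) + noise_scale * rng.gauss(0, 1)
        else:
            x = x0
        t = t_next
    return x

x0_true = 1.5
T = 2.0
x_T = x0_true * math.exp(T)       # a "fully noised" endpoint
sample = multistep_sample(x_T, T, steps=4)
```

Each iteration lands close to the true `x0` immediately; the extra steps only refine the estimate after re-noising, which is why 2-4 steps suffice where an iterative ODE solver would need dozens.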