Understanding the Latent Space of Diffusion Models through the Lens of Riemannian Geometry
Despite the success of diffusion models (DMs), we still lack a thorough understanding of their latent space. To understand the latent space 𝐱_t ∈ 𝒳, we analyze it from a geometrical perspective. Specifically, we utilize the pullback metric to find the local latent basis in 𝒳 and the corresponding local tangent basis in ℋ, the intermediate feature maps of DMs. The discovered latent basis enables unsupervised image editing through latent space traversal. We investigate the discovered structure from two perspectives. First, we examine how the geometric structure evolves over diffusion timesteps. Our analysis shows that 1) the model focuses on low-frequency components early in the generative process and attunes to high-frequency details later; 2) at early timesteps, different samples share similar tangent spaces; and 3) the simpler the dataset a DM is trained on, the more consistent its tangent space is at each timestep. Second, we investigate how the geometric structure changes with text conditioning in Stable Diffusion. The results show that 1) similar prompts yield comparable tangent spaces; and 2) the model depends less on text conditions at later timesteps. To the best of our knowledge, this paper is the first to present image editing through 𝐱-space traversal and to provide thorough analyses of the latent structure of DMs.
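The construction described above can be illustrated with a minimal sketch: treating the model's intermediate feature map as a differentiable function f: 𝒳 → ℋ, the pullback metric on 𝒳 is JᵀJ with J = ∂f/∂𝐱_t, so the local latent basis is given by the top right singular vectors of J and the local tangent basis in ℋ by the corresponding left singular vectors. The Python sketch below is not the authors' implementation; the toy network standing in for the U-Net, the `local_basis` helper, and the editing step size are illustrative assumptions, and a real DM would require a low-rank or power-iteration scheme rather than an explicit Jacobian.

```python
# Minimal sketch (under the assumptions stated above): local latent/tangent
# bases from the pullback metric, via an SVD of the Jacobian of f at x_t.
import torch

def local_basis(f, x, n_directions=3):
    """Top right/left singular vectors of J = df/dx at x (hypothetical helper)."""
    jac = torch.autograd.functional.jacobian(f, x)         # shape: (dim_h, dim_x)
    u, s, vh = torch.linalg.svd(jac, full_matrices=False)  # J = U S V^T
    latent_basis = vh[:n_directions]        # directions in X (rows of V^T)
    tangent_basis = u[:, :n_directions].T   # corresponding directions in H
    return latent_basis, tangent_basis, s[:n_directions]

# Toy stand-in for the latent -> feature map; in practice this would be the
# frozen U-Net evaluated up to a chosen intermediate layer at a fixed timestep t.
f = torch.nn.Sequential(torch.nn.Linear(8, 32), torch.nn.Tanh(), torch.nn.Linear(32, 16))

x_t = torch.randn(8)
latent_basis, tangent_basis, sv = local_basis(lambda x: f(x), x_t)

# Illustrative editing step: traverse x_t along the leading latent direction.
x_edited = x_t + 0.5 * latent_basis[0]
```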