diffusion-hyperfeatures.github.io - Diffusion Hyperfeatures: Searching Through Time and Space for Semantic Correspondence

Description: Diffusion Hyperfeatures is a framework for consolidating the multi-scale and multi-timestep internal representations of a diffusion model for tasks such as semantic correspondence.

Keywords: diffusion models, representation learning, diffusion hyperfeatures, hypercolumns, semantic correspondence


Diffusion models have been shown to be capable of generating high-quality images, suggesting that they could contain meaningful internal representations. Unfortunately, the feature maps that encode a diffusion model's internal information are spread not only over layers of the network, but also over diffusion timesteps, making it challenging to extract useful descriptors.

We propose Diffusion Hyperfeatures, a framework for consolidating multi-scale and multi-timestep feature maps into per-pixel feature descriptors that can be used for downstream tasks. These descriptors can be extracted for both synthetic and real images using the generation and inversion processes. We evaluate the utility of our Diffusion Hyperfeatures on the task of semantic keypoint correspondence: our method achieves superior performance on the SPair-71k real image benchmark. We also demonstrate that our method, which is trained and evaluated on real image pairs, can be applied to synthetic image pairs with unseen objects and compositions.
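
To make the idea concrete, below is a minimal sketch (not the authors' released code) of how multi-timestep, multi-layer feature maps could be consolidated into per-pixel descriptors with a small learned aggregation module. The class name AggregationNetwork, the channel widths, the output dimension, and the output resolution are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AggregationNetwork(nn.Module):
    # Projects each layer's feature map to a common width, upsamples to a shared
    # resolution, and mixes all (timestep, layer) maps with learned weights.
    def __init__(self, channel_dims, num_timesteps, out_dim=384, out_res=64):
        super().__init__()
        self.out_res = out_res
        # One 1x1 projection per layer, since channel widths differ across layers.
        self.projections = nn.ModuleList(
            [nn.Conv2d(c, out_dim, kernel_size=1) for c in channel_dims]
        )
        # One learned mixing weight per (timestep, layer) pair.
        self.mixing_weights = nn.Parameter(
            torch.zeros(num_timesteps, len(channel_dims))
        )

    def forward(self, feats):
        # feats[t][l]: tensor of shape (B, C_l, H_l, W_l) from timestep t, layer l.
        w = self.mixing_weights.flatten().softmax(dim=0).view_as(self.mixing_weights)
        hyper = 0.0
        for t, per_layer in enumerate(feats):
            for l, fmap in enumerate(per_layer):
                x = self.projections[l](fmap)
                x = F.interpolate(x, size=(self.out_res, self.out_res),
                                  mode="bilinear", align_corners=False)
                hyper = hyper + w[t, l] * x
        return hyper  # (B, out_dim, out_res, out_res): per-pixel descriptors

# Toy usage: two timesteps, three layers with differing channel widths.
feats = [[torch.randn(1, c, r, r) for c, r in [(320, 32), (640, 16), (1280, 8)]]
         for _ in range(2)]
net = AggregationNetwork(channel_dims=[320, 640, 1280], num_timesteps=2)
hyperfeatures = net(feats)  # -> (1, 384, 64, 64)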

We extract feature maps varying across timesteps and layers from the diffusion process and consolidate them with our lightweight aggregation network to create our Diffusion Hyperfeatures, in contrast to prior methods that select a subset of raw diffusion features. For real images, we extract these features from the inversion process, and for synthetic images we extract them from the generation process. Given a pair of images, we find semantic correspondences by performing a nearest-neighbor search over their Diffusion Hyperfeatures.
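
As a hedged illustration of the correspondence step, the sketch below performs a cosine-similarity nearest-neighbor lookup between two per-pixel descriptor maps; the tensor shapes and the helper name match_keypoints are assumptions for this example, not the paper's implementation.

import torch
import torch.nn.functional as F

def match_keypoints(hyper_a, hyper_b, keypoints_a):
    # hyper_a, hyper_b: (D, H, W) descriptor maps; keypoints_a: list of (row, col).
    D, H, W = hyper_b.shape
    a = F.normalize(hyper_a, dim=0)                  # unit-norm descriptors
    b = F.normalize(hyper_b, dim=0).reshape(D, -1)   # (D, H*W)
    matches = []
    for r, c in keypoints_a:
        sims = a[:, r, c] @ b                        # cosine similarity to every target pixel
        idx = sims.argmax().item()
        matches.append((idx // W, idx % W))          # flat index back to (row, col)
    return matches

# Toy usage: where does the descriptor at (10, 20) in image A land in image B?
hyper_a = torch.randn(384, 64, 64)
hyper_b = torch.randn(384, 64, 64)
print(match_keypoints(hyper_a, hyper_b, [(10, 20)]))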
