Restart: A Novel Sampling Technique for Differential Equation-Based Generative Models

Differential equation-based deep generative models are powerful tools for modeling high-dimensional data in various fields. These models generate samples by solving a differential equation backward in time, transforming a simple noise distribution into a complex data distribution. Prior samplers fall into two classes: ODE samplers, whose generation trajectories are deterministic, and SDE samplers, whose trajectories are stochastic.
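To make the distinction concrete, here is a minimal NumPy sketch of one deterministic ODE step versus one stochastic SDE step under the variance-exploding convention (noise level σ(t) = t). The `score` function is a toy analytic stand-in for a learned score network, so all names and the toy distribution are illustrative assumptions, not any particular model's API:

```python
import numpy as np

def score(x, sigma):
    # Hypothetical stand-in for a learned score network s_theta(x, sigma)
    # ~ grad_x log p_sigma(x); here the exact score of a toy Gaussian whose
    # data distribution is N(0, I), so p_sigma = N(0, (1 + sigma^2) I).
    return -x / (1.0 + sigma ** 2)

def ode_step(x, sigma, d_sigma):
    # Deterministic Euler step of the probability-flow ODE, moving from
    # noise level sigma down to sigma - d_sigma (d_sigma > 0).
    return x + d_sigma * sigma * score(x, sigma)

def sde_step(x, sigma, d_sigma, rng):
    # Stochastic Euler-Maruyama step of the reverse-time SDE: the score
    # drift is doubled and fresh Gaussian noise is injected at every step.
    drift = 2.0 * sigma * score(x, sigma)
    noise = np.sqrt(2.0 * sigma * d_sigma) * rng.standard_normal(x.shape)
    return x + d_sigma * drift + noise
```

The only differences between the two updates are the doubled drift and the Gaussian noise injected at every SDE step; that injected noise is what contracts accumulated errors, at the price of slower convergence per step.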

Several studies have shown that each class of sampler has advantages in different scenarios. ODE solvers incur smaller discretization errors, yielding decent sample quality even at larger step sizes; however, their quality plateaus quickly as the compute budget grows. SDE solvers, on the other hand, continue to improve sample quality in the large NFE (number of function evaluations) regime, but at the cost of increased sampling time.

Taking inspiration from these findings, MIT researchers developed a new sampling technique called Restart, which combines the advantages of both ODE and SDE samplers. The Restart algorithm alternates between two subroutines: a Restart forward process that injects a large amount of noise to “restart” the original backward process, and a Restart backward process that runs the deterministic backward ODE.

Restart decouples randomness from the drift: its forward process adds substantially more noise than earlier SDEs, leading to stronger contraction of accumulated errors. By cycling forward and backward multiple times, Restart keeps discretization errors low and permits ODE-like step sizes, since its backward processes are deterministic. The Restart interval is usually placed near the end of the simulation, where the contraction effect is largest, and multiple Restart intervals can be used on more challenging tasks to reduce errors made early in sampling.
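Putting the pieces together, the loop below is a minimal sketch of the Restart procedure, reusing the toy `ode_step` from the sketch above. The schedule handling, interval placement, and all names are illustrative assumptions, not the authors' reference implementation:

```python
import numpy as np

def restart_sampler(x_init, sigmas, t_min, t_max, K, rng):
    # sigmas : decreasing noise levels, sigmas[0] = sigma_max, sigmas[-1] ~ 0
    #          (t_min and t_max are assumed to lie on this grid)
    # t_min, t_max : the Restart interval, typically near the end of sampling
    # K : number of forward/backward Restart iterations

    def ode_down(x, start, stop):
        # Run deterministic backward-ODE steps from level `start` to `stop`.
        for hi, lo in zip(sigmas, sigmas[1:]):
            if hi <= start and lo >= stop:
                x = ode_step(x, hi, hi - lo)
        return x

    # 1) Ordinary backward ODE from sigma_max down to t_min.
    x = ode_down(x_init, sigmas[0], t_min)

    # 2) Restart loop: the forward process injects a large amount of fresh
    #    noise to jump from t_min back up to t_max; the backward process
    #    then re-runs the deterministic ODE over the interval.
    for _ in range(K):
        # Forward jump: variance grows from t_min^2 to t_max^2 (VE convention).
        eps = np.sqrt(t_max ** 2 - t_min ** 2) * rng.standard_normal(x.shape)
        x = ode_down(x + eps, t_max, t_min)

    # 3) Finish the backward ODE from t_min down to the final level.
    return ode_down(x, t_min, sigmas[-1])
```

Because each Restart iteration replaces the per-step noise of an SDE with a single large noise injection followed by a deterministic ODE sweep, the error-contracting benefit of stochasticity is retained while the discretization error stays close to that of a pure ODE sampler.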

Experimental results show that Restart outperforms state-of-the-art ODE and SDE solvers in both quality and speed across various datasets and pre-trained models. For example, on CIFAR-10 with the VP model, Restart achieves a 10× speedup over previous SDE solvers, and on ImageNet 64×64 with EDM, a 2× speedup, while also outperforming ODE solvers in the small NFE regime.

The researchers also applied Restart to a Stable Diffusion model pre-trained on LAION 512×512 images for text-to-image generation. Restart improved upon prior samplers by striking a better balance between text-image alignment/visual quality and diversity.

To fully harness the potential of the Restart framework, the team plans to develop a more automated method for selecting its hyperparameters based on error analysis.

For more details, check out the paper and the GitHub repository.

