
Nvidia Develops Alias-Free GAN for Smoother Image Generation

The new model can do everything StyleGAN2 does, but more gracefully: it fixes the signal-processing flaw that causes aliasing and makes certain details look glued to the image, lagging behind when the object moves.

"Interpreting all signals in the network as continuous, we derive generally applicable, small architectural changes that guarantee that unwanted information cannot leak into the hierarchical synthesis process," explains Nvidia.

The GIF below demonstrates the "glued pixels" problem solved by the new Alias-Free GAN. In the video generated by StyleGAN2, small details like facial hair and wrinkles do not move organically; instead, they stay frozen in the same spots. In Nvidia's example, all details transform coherently.

The same applies to generating so-called morphs, i.e., sequences of images that gradually shift from one into another. In the StyleGAN2 example, you can clearly see the transition point, while in Nvidia's version it is smooth and fast.
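
Morphs like these are typically produced by interpolating between latent codes and decoding each intermediate point. A minimal sketch, assuming a pretrained generator `G` that maps a latent vector to an image (a hypothetical handle, not the paper's API):

```python
import torch

def morph(G, z_a: torch.Tensor, z_b: torch.Tensor, steps: int = 60):
    """Yield frames that shift gradually from G(z_a) to G(z_b)."""
    for t in torch.linspace(0.0, 1.0, steps):
        z = (1.0 - t) * z_a + t * z_b  # straight-line path in latent space
        yield G(z)
```

How natural the in-between frames look depends entirely on the generator; the aliasing fixes are what keep fine details from snapping between positions along the path.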

Nvidia's Alias-Free GAN processes images in a fundamentally different way from StyleGAN2: multi-scale phase signals that follow the features seen in the final image control both the appearance and the relative positions of those features. "The local-oriented oscillations form a basis that enables hierarchical localization. The construction appears to make it natural for the network to construct them from the low-frequency input Fourier features," Nvidia adds.
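
To illustrate what "low-frequency input Fourier features" means, here is a minimal sketch, assuming PyTorch; the grid size and frequency count are illustrative, not the paper's exact configuration. Each pixel receives the sine and cosine of a few low-frequency waves evaluated at its continuous coordinates, giving the network a smooth, translation-aware basis to build on.

```python
import math
import torch

def fourier_features(size: int = 36, n_freqs: int = 8) -> torch.Tensor:
    # continuous pixel coordinates in [-1, 1]
    coords = torch.linspace(-1.0, 1.0, size)
    y, x = torch.meshgrid(coords, coords, indexing="ij")
    # random 2D frequency vectors, drawn near zero so the input band stays narrow
    freqs = torch.randn(n_freqs, 2)
    phase = 2.0 * math.pi * (freqs[:, 0, None, None] * x + freqs[:, 1, None, None] * y)
    return torch.cat([torch.sin(phase), torch.cos(phase)], dim=0)  # (2*n_freqs, size, size)

inp = fourier_features()
print(inp.shape)  # torch.Size([16, 36, 36])
```

Because these features are defined at every continuous coordinate rather than on a fixed pixel grid, shifting them shifts every downstream detail along with them, which is what lets the generated features move coherently instead of sticking to pixel positions.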

If you are ready to take a deep dive into the math behind Alias-Free GAN, you can read more here. Don't forget to join our new Telegram channel and our Discord, and follow us on Instagram and Twitter, where we share breakdowns, the latest news, awesome artworks, and more.
