Oleg Zabluda's blog
Tuesday, November 27, 2018
 
"""
The authors proposed a model (BigGAN) with modifications focused on the following three aspects:

- Scalability: As the authors discovered that GANs benefit dramatically from scaling, they introduced two architectural changes to improve scalability (described in detail in the paper’s Appendix B), while at the same time improving conditioning by applying orthogonal regularization to the generator.

- Robustness: The orthogonal regularization applied to the generator makes the model amenable to the “truncation trick” so that fine control of the trade-offs between fidelity and variety is possible by truncating the latent space.

- Stability: The authors discovered and characterized instabilities specific to large-scale GANs, and devised techniques to minimize them — although full training stability came at a relatively high cost in performance.

In addition to its performance boost at 128×128 resolutions, BigGAN also outperformed the previous SotA at 256×256 and 512×512 resolutions on ImageNet.
"""
https://medium.com/syncedreview/biggan-a-new-state-of-the-art-in-image-synthesis-cf2ec5694024
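The two ideas the quote pairs together — the off-diagonal orthogonal regularizer on the generator's weights and the "truncation trick" on the latent vector — can be sketched roughly as below. This is a minimal NumPy illustration of the general technique, not the paper's code; the function names, the β value, and the resampling loop are my own illustrative choices.

```python
import numpy as np

def orthogonal_penalty(w, beta=1e-4):
    """BigGAN-style orthogonal regularization:
    beta * || W^T W * (1 - I) ||_F^2
    Only off-diagonal entries of W^T W are penalized, pushing
    columns of W toward mutual orthogonality without constraining
    their norms."""
    wtw = w.T @ w
    off_diag = wtw * (1.0 - np.eye(wtw.shape[0]))
    return beta * np.sum(off_diag ** 2)

def truncated_z(batch, dim, threshold, rng=None):
    """Truncation trick: sample z ~ N(0, I) and resample any
    coordinate whose magnitude exceeds `threshold`. Smaller
    thresholds trade variety for fidelity."""
    rng = rng or np.random.default_rng(0)
    z = rng.standard_normal((batch, dim))
    mask = np.abs(z) > threshold
    while mask.any():
        z[mask] = rng.standard_normal(int(mask.sum()))
        mask = np.abs(z) > threshold
    return z
```

An exactly orthogonal weight matrix (e.g. the identity) incurs zero penalty, and `truncated_z(16, 128, 0.5)` returns latents whose every coordinate lies in [-0.5, 0.5].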
