Formatting Instructions for ICLR2020 Conference Submissions


How PyTorch Transposed Conv1D Works
Deconvolution and Checkerboard Artifacts
Flow-based Deep Generative Models
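The transposed-convolution and checkerboard-artifact entries above can be illustrated with a minimal NumPy sketch (a naive scatter-add implementation, not PyTorch's actual code): with kernel size 3 and stride 2, output positions receive an uneven number of kernel contributions, which is the overlap pattern behind checkerboard artifacts.

```python
import numpy as np

def conv_transpose1d(x, w, stride):
    """Naive 1-D transposed convolution (no padding): each input element
    scatters a scaled copy of the kernel into the output."""
    n, k = len(x), len(w)
    out = np.zeros((n - 1) * stride + k)
    for i in range(n):
        out[i * stride : i * stride + k] += x[i] * w
    return out

# All-ones input and kernel make the output equal to the per-position
# overlap count: kernel size 3 with stride 2 gives the uneven pattern
# 1, 1, 2, 1, 2, 1, 2, 1, 1 -- the source of checkerboard artifacts.
overlap = conv_transpose1d(np.ones(4), np.ones(3), stride=2)
```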


【Flow-based Model】
Variational Inference with Normalizing Flows
NICE: Non-linear Independent Components Estimation
Density estimation using Real NVP
Glow: Generative Flow with Invertible 1x1 Convolutions
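The NICE/Real NVP/Glow line of work above is built on affine coupling layers. A toy NumPy sketch (the scale and translation "networks" here are fixed functions standing in for learned ones) showing the exact invertibility and the cheap triangular log-determinant:

```python
import numpy as np

def net_s(x1):
    """Toy stand-in for the learned scale network s(x1)."""
    return np.tanh(0.5 * x1)

def net_t(x1):
    """Toy stand-in for the learned translation network t(x1)."""
    return 0.3 * x1

def coupling_forward(x):
    """Real NVP-style affine coupling: the first half passes through
    unchanged and parameterizes an affine map of the second half."""
    d = len(x) // 2
    x1, x2 = x[:d], x[d:]
    s = net_s(x1)
    y2 = x2 * np.exp(s) + net_t(x1)
    log_det = np.sum(s)              # Jacobian is triangular
    return np.concatenate([x1, y2]), log_det

def coupling_inverse(y):
    d = len(y) // 2
    y1, y2 = y[:d], y[d:]
    s = net_s(y1)                    # x1 == y1, so s and t are recomputable
    x2 = (y2 - net_t(y1)) * np.exp(-s)
    return np.concatenate([y1, x2])

x = np.array([0.5, -1.0, 2.0, 0.3])
y, log_det = coupling_forward(x)
x_rec = coupling_inverse(y)          # exactly recovers x
```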

【GAN】
Large Scale GAN Training for High Fidelity Natural Image Synthesis
A Style-Based Generator Architecture for Generative Adversarial Networks
HoloGAN: Unsupervised Learning of 3D Representations from Natural Images
Few-Shot Adversarial Learning of Realistic Neural Talking Head Models
SinGAN: Learning a Generative Model from a Single Natural Image
Towards a Deeper Understanding of Adversarial Losses under a Discriminative Adversarial Network Setting

●Generative Adversarial Networks
Generative Adversarial Nets
Adversarial Feature Learning
Adversarially Learned Inference
Autoencoding beyond pixels using a learned similarity metric
●Adversarial Autoencoder
Adversarial Autoencoders
●Wasserstein GAN
Wasserstein GAN
●Gradient Penalty
Improved Training of Wasserstein GANs
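A minimal NumPy sketch of the WGAN-GP objective (a linear critic is used here so the input gradient is available in closed form; real implementations use autograd):

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=3)               # linear critic f(x) = w @ x

real = rng.normal(size=(8, 3))
fake = rng.normal(size=(8, 3)) + 2.0

# Interpolate between real and fake samples, as in WGAN-GP.
eps = rng.uniform(size=(8, 1))
x_hat = eps * real + (1 - eps) * fake

# For a linear critic, grad_x f(x_hat) = w at every point, so the
# gradient penalty (||grad f|| - 1)^2 is constant over the batch.
grad_norm = np.linalg.norm(w)
penalty = np.mean((grad_norm - 1.0) ** 2)

# Critic loss: E[f(fake)] - E[f(real)] + lambda * penalty (lambda = 10).
critic_loss = fake.mean(axis=0) @ w - real.mean(axis=0) @ w + 10.0 * penalty
```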
●Perceptual Loss
Perceptual Loss for Real-Time Style Transfer and Super-Resolution
●Hinge Loss
Hierarchical Implicit Models and Likelihood-Free Variational Inference (Tran, Ranganath, Blei, 2017)
Geometric GAN (Lim & Ye, 2017)
Spectral Normalization for Generative Adversarial Networks (Miyato, Kataoka, Koyama, Yoshida, 2018)
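The hinge formulation used by Geometric GAN and SNGAN, as a NumPy sketch on toy critic outputs:

```python
import numpy as np

def d_hinge_loss(d_real, d_fake):
    """Discriminator hinge loss: penalize real scores below +1
    and fake scores above -1."""
    return np.mean(np.maximum(0.0, 1.0 - d_real)) + \
           np.mean(np.maximum(0.0, 1.0 + d_fake))

def g_hinge_loss(d_fake):
    """Generator side: simply maximize the critic score on fakes."""
    return -np.mean(d_fake)

d_real = np.array([1.5, 0.2, -0.5])   # toy critic outputs on real images
d_fake = np.array([-2.0, 0.3])        # toy critic outputs on fakes
```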
●Feature Matching Loss
High-Resolution Image Synthesis and Semantic Manipulation with Conditional GANs
●Instance Normalization
Instance Normalization: The Missing Ingredient for Fast Stylization
●Spectral Normalization
Spectral Normalization for Generative Adversarial Networks
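Spectral normalization divides a weight matrix by its largest singular value, estimated by power iteration. A one-shot NumPy sketch (SNGAN actually persists the vector u across training steps and does a single iteration per step):

```python
import numpy as np

def spectral_normalize(W, n_iters=50):
    """Estimate the top singular value of W by power iteration and
    divide it out, so the normalized matrix has spectral norm ~1."""
    u = np.ones(W.shape[0]) / np.sqrt(W.shape[0])
    for _ in range(n_iters):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    sigma = u @ W @ v                 # converged Rayleigh quotient
    return W / sigma, sigma

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 6))
W_sn, sigma = spectral_normalize(W)
```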
●Adaptive Instance Normalization (AdaIN)
Arbitrary Style Transfer in Real-time with Adaptive Instance Normalization
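AdaIN normalizes each channel of the content features (plain instance normalization) and then rescales it to the style features' channel statistics. A NumPy sketch on toy (C, H, W) tensors:

```python
import numpy as np

def adain(x, y, eps=1e-5):
    """Adaptive Instance Normalization: whiten each channel of the
    content features x, then re-color with the channel-wise mean and
    std of the style features y. Shapes are (C, H, W)."""
    mu_x = x.mean(axis=(1, 2), keepdims=True)
    std_x = x.std(axis=(1, 2), keepdims=True)
    mu_y = y.mean(axis=(1, 2), keepdims=True)
    std_y = y.std(axis=(1, 2), keepdims=True)
    return std_y * (x - mu_x) / (std_x + eps) + mu_y

rng = np.random.default_rng(0)
content = rng.normal(size=(3, 8, 8))
style = rng.normal(loc=2.0, scale=0.5, size=(3, 8, 8))
out = adain(content, style)   # out carries the style's channel statistics
```

With target mean 0 and std 1 this reduces to plain instance normalization.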
●Self-Attention
Self-Attention Generative Adversarial Networks
●Projection Discriminator
cGANs with Projection Discriminator
●Inception Score
Improved Techniques for Training GANs
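The Inception Score from the paper above is exp(E_x[KL(p(y|x) || p(y))]). A NumPy sketch computing it from a matrix of per-image class probabilities (in practice p(y|x) comes from a pretrained Inception network):

```python
import numpy as np

def inception_score(probs, eps=1e-12):
    """IS = exp(mean KL(p(y|x) || p(y))) from an (images, classes)
    matrix of class probabilities whose rows sum to 1."""
    p_y = probs.mean(axis=0, keepdims=True)    # marginal class distribution
    kl = np.sum(probs * (np.log(probs + eps) - np.log(p_y + eps)), axis=1)
    return float(np.exp(kl.mean()))

# Confident *and* diverse predictions score high (here: number of
# classes); uniform predictions score the minimum of 1.
sharp = np.eye(4)                # 4 images, each a different one-hot class
uniform = np.full((4, 4), 0.25)
```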
●Fréchet Inception Distance (FID)
GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium
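FID is the Fréchet distance ||μ₁ − μ₂||² + Tr(C₁ + C₂ − 2(C₁C₂)^½) between Gaussians fitted to Inception activations. A NumPy sketch simplified to diagonal covariances, so the matrix square root is elementwise (the full version needs a general matrix square root):

```python
import numpy as np

def fid_diag(mu1, var1, mu2, var2):
    """Frechet distance between two Gaussians with diagonal covariances:
    ||mu1 - mu2||^2 + sum(var1 + var2 - 2*sqrt(var1*var2))."""
    return float(np.sum((mu1 - mu2) ** 2)
                 + np.sum(var1 + var2 - 2.0 * np.sqrt(var1 * var2)))

mu1, var1 = np.array([0.0, 0.0]), np.array([1.0, 1.0])
mu2, var2 = np.array([1.0, 0.0]), np.array([4.0, 1.0])
# mean term contributes 1.0, variance terms contribute (1+4-4) = 1.0
```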
●Structural Similarity (SSIM)
Image Quality Assessment: From Error Visibility to Structural Similarity
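A single-window NumPy sketch of the SSIM index (Wang et al. average this over local 11x11 windows; constants K1 = 0.01, K2 = 0.03 as in the paper):

```python
import numpy as np

def ssim_global(x, y, data_range=1.0):
    """SSIM computed over the whole image as one window:
    luminance, contrast, and structure terms in one expression."""
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx**2 + my**2 + c1) * (vx + vy + c2))

rng = np.random.default_rng(0)
img = rng.uniform(size=(8, 8))
# ssim_global(img, img) == 1; any distortion pushes the score below 1.
```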
●Perception-Distortion Tradeoff
The Perception-Distortion Tradeoff
●Cosine Similarity (CSIM)
ArcFace: Additive Angular Margin Loss for Deep Face Recognition
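CSIM is the cosine similarity between identity embeddings (e.g. from an ArcFace-style face recognition network), used to measure identity preservation. A NumPy sketch:

```python
import numpy as np

def csim(a, b):
    """Cosine similarity between two embedding vectors: 1 for the
    same direction, 0 for orthogonal, -1 for opposite."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

v = np.array([1.0, 0.0])
# Scale-invariant: csim(v, 2v) = 1; orthogonal vectors give 0.
```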

【Meta Learning】
Meta-Learning: Learning to Learn Fast - Lil'Log
What is Few-shot Learning?【Generalizing from a few examples: A survey on few-shot learning】
MetaGAN: An Adversarial Approach to Few-Shot Learning
Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
●Reptile / First-Order MAML
On First-Order Meta-Learning Algorithms
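Reptile needs no second-order gradients: adapt to a task for a few SGD steps, then move the initialization toward the adapted parameters. A toy NumPy sketch on scalar quadratic tasks (losses and hyperparameters are illustrative, not from the paper):

```python
import numpy as np

def reptile_1d(tasks, meta_steps=200, inner_steps=5, alpha=0.1, eps=0.5):
    """Reptile on scalar tasks f_a(theta) = 0.5 * (theta - a)^2."""
    theta = 0.0
    for step in range(meta_steps):
        a = tasks[step % len(tasks)]
        phi = theta
        for _ in range(inner_steps):
            phi -= alpha * (phi - a)       # inner SGD on the task loss
        theta += eps * (phi - theta)       # Reptile meta-update
    return theta

# The initialization drifts toward the middle of the task family,
# from which every task is reachable in a few gradient steps.
theta = reptile_1d([1.0, 2.0, 3.0])
```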
●Implicit MAML
Meta-Learning with Implicit Gradients
Modular Meta-Learning with Shrinkage
Fast Context Adaptation via Meta-Learning
Task-Agnostic Meta-Learning for Few-shot Learning

【The Lottery Ticket Hypothesis】
The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks
What's Hidden in a Randomly Weighted Neural Network?
Training BatchNorm and Only BatchNorm: On the Expressive Power of Random Features in CNNs
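The core lottery-ticket procedure from the first paper above, sketched in NumPy: train, prune the smallest-magnitude weights, then rewind the surviving weights to their original initialization ("w_trained" here is just simulated, not actually trained):

```python
import numpy as np

def magnitude_prune_and_rewind(w_init, w_trained, sparsity):
    """One round of lottery-ticket pruning: keep the largest-magnitude
    trained weights and reset the survivors to their init values."""
    k = int(round(sparsity * w_trained.size))        # weights to remove
    threshold = np.sort(np.abs(w_trained).ravel())[k]
    mask = np.abs(w_trained) >= threshold
    return w_init * mask, mask

rng = np.random.default_rng(0)
w_init = rng.normal(size=(10, 10))
w_trained = w_init + rng.normal(scale=0.5, size=(10, 10))  # stand-in for training
ticket, mask = magnitude_prune_and_rewind(w_init, w_trained, sparsity=0.8)
# ticket keeps 20% of the weights, at their *initial* values.
```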