Will Grathwohl
Research Scientist, DeepMind
Verified email at deepmind.com
Title · Cited by · Year
Disentangling space and time in video with hierarchical variational auto-encoders
W Grathwohl, A Wilson
arXiv preprint arXiv:1612.04440, 2016
Cited by 1550 · 2016
FFJORD: Free-form continuous dynamics for scalable reversible generative models
W Grathwohl, RTQ Chen, J Bettencourt, I Sutskever, D Duvenaud
arXiv preprint arXiv:1810.01367, 2018
Cited by 1054* · 2018
Invertible residual networks
J Behrmann, W Grathwohl, RTQ Chen, D Duvenaud, JH Jacobsen
International conference on machine learning, 573-582, 2019
Cited by 701 · 2019
Your classifier is secretly an energy based model and you should treat it like one
W Grathwohl, KC Wang, JH Jacobsen, D Duvenaud, M Norouzi, ...
arXiv preprint arXiv:1912.03263, 2019
Cited by 616 · 2019
Backpropagation through the void: Optimizing control variates for black-box gradient estimation
W Grathwohl, D Choi, Y Wu, G Roeder, D Duvenaud
arXiv preprint arXiv:1711.00123, 2017
Cited by 331 · 2017
Reduce, reuse, recycle: Compositional generation with energy-based diffusion models and MCMC
Y Du, C Durkan, R Strudel, JB Tenenbaum, S Dieleman, R Fergus, ...
International conference on machine learning, 8489-8510, 2023
Cited by 116 · 2023
Learning the Stein Discrepancy for Training and Evaluating Energy-Based Models without Sampling
W Grathwohl, KC Wang, JH Jacobsen, D Duvenaud, R Zemel
International Conference on Machine Learning, 2020
Cited by 96 · 2020
Oops I took a gradient: Scalable sampling for discrete distributions
W Grathwohl, K Swersky, M Hashemi, D Duvenaud, C Maddison
International Conference on Machine Learning, 3831-3841, 2021
Cited by 94 · 2021
Deep reinforcement learning and simulation as a path toward precision medicine
BK Petersen, J Yang, WS Grathwohl, C Cockrell, C Santiago, G An, ...
Journal of Computational Biology 26 (6), 597-604, 2019
Cited by 88* · 2019
Continuous diffusion for categorical data
S Dieleman, L Sartran, A Roshannai, N Savinov, Y Ganin, PH Richemond, ...
arXiv preprint arXiv:2211.15089, 2022
Cited by 82 · 2022
Understanding the limitations of conditional generative models
E Fetaya, JH Jacobsen, W Grathwohl, R Zemel
arXiv preprint arXiv:1906.01171, 2019
Cited by 61* · 2019
Denoising diffusion samplers
F Vargas, W Grathwohl, A Doucet
arXiv preprint arXiv:2302.13834, 2023
Cited by 48 · 2023
Score-based diffusion meets annealed importance sampling
A Doucet, W Grathwohl, AG Matthews, H Strathmann
Advances in Neural Information Processing Systems 35, 21482-21494, 2022
Cited by 29 · 2022
Self-conditioned embedding diffusion for text generation
R Strudel, C Tallec, F Altché, Y Du, Y Ganin, A Mensch, W Grathwohl, ...
arXiv preprint arXiv:2211.04236, 2022
Cited by 26 · 2022
Optimal design of stochastic DNA synthesis protocols based on generative sequence models
EN Weinstein, AN Amin, WS Grathwohl, D Kassler, J Disset, D Marks
International Conference on Artificial Intelligence and Statistics, 7450-7482, 2022
Cited by 22 · 2022
Joint energy-based models for semi-supervised classification
S Zhao, JH Jacobsen, W Grathwohl
ICML 2020 Workshop on Uncertainty and Robustness in Deep Learning 1, 2020
Cited by 19 · 2020
Gradient-based optimization of neural network architecture
W Grathwohl, E Creager, SKS Ghasemipour, R Zemel
Cited by 15 · 2018
No MCMC for me: Amortized samplers for fast and stable training of energy-based models
D Duvenaud, J Kelly, K Swersky, M Hashemi, M Norouzi, W Grathwohl
International Conference on Learning Representations (ICLR), 2021
Cited by 11 · 2021
Annealed importance sampling meets score matching
A Doucet, WS Grathwohl, AGG Matthews, H Strathmann
ICLR Workshop on Deep Generative Models for Highly Structured Data, 2022
Cited by 9 · 2022
No conditional models for me: Training joint EBMs on mixed continuous and discrete data
J Kelly, WS Grathwohl
Energy Based Models Workshop-ICLR 2021, 2021
Cited by 5 · 2021