Dong Yin
Research Scientist, DeepMind
Verified email at google.com - Homepage
Title | Cited by | Year
Byzantine-robust distributed learning: Towards optimal statistical rates
D Yin, Y Chen, R Kannan, P Bartlett
International Conference on Machine Learning, 5650-5659, 2018
Cited by 182 | 2018
Rademacher complexity for adversarially robust generalization
D Yin, R Kannan, P Bartlett
International Conference on Machine Learning, 7085-7094, 2019
Cited by 86 | 2019
A Fourier perspective on model robustness in computer vision
D Yin, R Gontijo Lopes, J Shlens, ED Cubuk, J Gilmer
Advances in Neural Information Processing Systems 32, 13276-13286, 2019
Cited by 75 | 2019
PhaseCode: Fast and efficient compressive phase retrieval based on sparse-graph codes
R Pedarsani, D Yin, K Lee, K Ramchandran
IEEE Transactions on Information Theory 63 (6), 3663-3691, 2017
Cited by 60 | 2017
Gradient diversity: a key ingredient for scalable distributed learning
D Yin, A Pananjady, M Lam, D Papailiopoulos, K Ramchandran, P Bartlett
International Conference on Artificial Intelligence and Statistics, 1998-2007, 2018
Cited by 57* | 2018
Improving robustness without sacrificing accuracy with patch gaussian augmentation
RG Lopes, D Yin, B Poole, J Gilmer, ED Cubuk
arXiv preprint arXiv:1906.02611, 2019
Cited by 46 | 2019
Defending against saddle point attack in Byzantine-robust distributed learning
D Yin, Y Chen, R Kannan, P Bartlett
International Conference on Machine Learning, 7074-7084, 2019
Cited by 31 | 2019
Robust federated learning in a heterogeneous environment
A Ghosh, J Hong, D Yin, K Ramchandran
arXiv preprint arXiv:1906.06629, 2019
Cited by 29 | 2019
Learning mixtures of sparse linear regressions using sparse graph codes
D Yin, R Pedarsani, Y Chen, K Ramchandran
IEEE Transactions on Information Theory 65 (3), 1430-1451, 2019
Cited by 18 | 2019
Sub-linear time support recovery for compressed sensing using sparse-graph codes
X Li, D Yin, S Pawar, R Pedarsani, K Ramchandran
arXiv preprint arXiv:1412.7646, 2014
Cited by 17 | 2014
Distributed sequence memory of multidimensional inputs in recurrent networks
AS Charles, D Yin, CJ Rozell
The Journal of Machine Learning Research 18 (1), 181-217, 2017
Cited by 13 | 2017
Fast and robust compressive phase retrieval with sparse-graph codes
D Yin, K Lee, R Pedarsani, K Ramchandran
2015 IEEE International Symposium on Information Theory (ISIT), 2583-2587, 2015
Cited by 13 | 2015
Compressed sensing using sparse-graph codes for the continuous-alphabet setting
D Yin, R Pedarsani, X Li, K Ramchandran
2016 54th Annual Allerton Conference on Communication, Control, and …, 2016
Cited by 9 | 2016
An Efficient Framework for Clustered Federated Learning
A Ghosh, J Chung, D Yin, K Ramchandran
arXiv preprint arXiv:2006.04088, 2020
Cited by 6 | 2020
Sparse constraint affine projection algorithm with parallel implementation and application in compressive sensing
D Yin, HC So, Y Gu
2014 IEEE International Conference on Acoustics, Speech and Signal …, 2014
Cited by 6* | 2014
Sub-linear time support recovery for compressed sensing using sparse-graph codes
X Li, D Yin, S Pawar, R Pedarsani, K Ramchandran
IEEE Transactions on Information Theory 65 (10), 6580-6619, 2019
Cited by 5 | 2019
SOLA: Continual Learning with Second-Order Loss Approximation
D Yin, M Farajtabar, A Li
arXiv preprint arXiv:2006.10974, 2020
Cited by 4 | 2020
Stochastic Gradient and Langevin Processes
X Cheng, D Yin, P Bartlett, M Jordan
arXiv preprint arXiv:1907.03215, 2019
Cited by 4* | 2019
A maximum-entropy approach to off-policy evaluation in average-reward MDPs
N Lazic, D Yin, M Farajtabar, N Levine, D Gorur, C Harris, D Schuurmans
Advances in Neural Information Processing Systems 33, 2020
Cited by 3 | 2020