Eyyüb Sari
Huawei Noah's Ark Lab, Montréal
Verified email at huawei.com
Title · Cited by · Year
How does batch normalization help binary training?
E Sari, M Belbahri, VP Nia
arXiv preprint arXiv:1909.09139, 2019
Cited by 40 · 2019
Differentiable mask for pruning convolutional and recurrent networks
RK Ramakrishnan, E Sari, VP Nia
2020 17th Conference on Computer and Robot Vision (CRV), 222-229, 2020
Cited by 16 · 2020
Adaptive Binary-Ternary Quantization
R Razani, G Morin, E Sari, VP Nia
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2021
Cited by 10 · 2021
Foothill: A quasiconvex regularization for edge computing of deep neural networks
M Belbahri, E Sari, S Darabi, VP Nia
International Conference on Image Analysis and Recognition, 3-14, 2019
Cited by 6 · 2019
Demystifying and Generalizing BinaryConnect
T Dockhorn, Y Yu, E Sari, M Zolnouri, V Partovi Nia
Advances in Neural Information Processing Systems 34, 2021
Cited by 5 · 2021
Batch Normalization in Quantized Networks
E Sari, VP Nia
arXiv preprint arXiv:2004.14214, 2020
Cited by 5 · 2020
iRNN: Integer-only Recurrent Neural Network
E Sari, V Courville, VP Nia
arXiv preprint arXiv:2109.09828, 2021
Cited by 3 · 2021
Smart Ternary Quantization
G Morin, R Razani, VP Nia, E Sari
arXiv preprint arXiv:1909.12205, 2019
Cited by 3 · 2019
Understanding BatchNorm in Ternary Training
E Sari, VP Nia
Journal of Computational Vision and Imaging Systems 5 (1), 2-2, 2019
Cited by 2 · 2019
Neural network pruning
VP Nia, RK Ramakrishnan, EH Sari
US Patent App. 17/012,818, 2021
Cited by 1 · 2021
Foothill: A Quasiconvex Regularization Function
M Belbahri, E Sari, S Darabi, VP Nia
2019
Foothill Regularizer as a Binary Quantizer
M Belbahri, E Sari, S Darabi, X Li, M Courbariaux, VP Nia