Mohammadreza Tayaranian Hosseini
Verified email at mail.mcgill.ca - Homepage
Title · Cited by · Year
SkippyNN: An embedded stochastic-computing accelerator for convolutional neural networks
R Hojabr, K Givaki, SMR Tayaranian, P Esfahanian, A Khonsari, ...
Proceedings of the 56th Annual Design Automation Conference 2019, 1-6, 2019
54 · 2019
Efficient Fine-Tuning of BERT Models on the Edge
D Vucetic, M Tayaranian, M Ziaeefard, JJ Clark, BH Meyer, WJ Gross
2022 IEEE International Symposium on Circuits and Systems (ISCAS), 1838-1842, 2022
34 · 2022
On the Resilience of Deep Learning for Reduced-voltage FPGAs
K Givaki, B Salami, R Hojabr, SMR Tayaranian, A Khonsari, D Rahmati, ...
2020 28th Euromicro International Conference on Parallel, Distributed and …, 2020
21 · 2020
Is Integer Arithmetic Enough for Deep Learning Training?
A Ghaffari, MS Tahaei, M Tayaranian, M Asgharian, V Partovi Nia
Advances in Neural Information Processing Systems 35, 27402-27413, 2022
14 · 2022
Towards Fine-tuning Pre-trained Language Models with Integer Forward and Backward Propagation
MT Hosseini, A Ghaffari, MS Tahaei, M Rezagholizadeh, M Asgharian, ...
Findings of the Association for Computational Linguistics: EACL 2023, 1867-1876, 2023
5* · 2023
Embedded stochastic-computing accelerator architecture and method for convolutional neural networks
M Najafi, SR Hojabrossadati, K Givaki, SMR Tayaranian, P Esfahanian, ...
US Patent App. 17/159,347, 2021
5 · 2021
Efficient Fine-Tuning of Compressed Language Models with Learners
D Vucetic, M Tayaranian, M Ziaeefard, JJ Clark, BH Meyer, WJ Gross
arXiv preprint arXiv:2208.02070, 2022
4 · 2022
Automatic Pruning of Fine-tuning Datasets for Transformer-based Language Models
M Tayaranian, SH Mozafari, BH Meyer, JJ Clark, WJ Gross
arXiv preprint arXiv:2407.08887, 2024
2024
Faster Inference of Integer SWIN Transformer by Removing the GELU Activation
M Tayaranian, SH Mozafari, JJ Clark, B Meyer, W Gross
arXiv preprint arXiv:2402.01169, 2024
2024