Llion Jones
Verified email at google.com
Title · Cited by · Year
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Advances in neural information processing systems, 5998-6008, 2017
Cited by 9944 · 2017
Tensor2tensor for neural machine translation
A Vaswani, S Bengio, E Brevdo, F Chollet, AN Gomez, S Gouws, L Jones, ...
arXiv preprint arXiv:1803.07416, 2018
Cited by 246 · 2018
The best of both worlds: Combining recent advances in neural machine translation
MX Chen, O Firat, A Bapna, M Johnson, W Macherey, G Foster, L Jones, ...
arXiv preprint arXiv:1804.09849, 2018
Cited by 199 · 2018
One model to learn them all
L Kaiser, AN Gomez, N Shazeer, A Vaswani, N Parmar, L Jones, ...
arXiv preprint arXiv:1706.05137, 2017
Cited by 184 · 2017
Natural questions: a benchmark for question answering research
T Kwiatkowski, J Palomaki, O Redfield, M Collins, A Parikh, C Alberti, ...
Transactions of the Association for Computational Linguistics 7, 453-466, 2019
Cited by 149 · 2019
Character-level language modeling with deeper self-attention
R Al-Rfou, D Choe, N Constant, M Guo, L Jones
Proceedings of the AAAI Conference on Artificial Intelligence 33, 3159-3166, 2019
Cited by 93 · 2019
Wikireading: A novel large-scale language understanding task over wikipedia
D Hewlett, A Lacoste, L Jones, I Polosukhin, A Fandrianto, J Han, ...
arXiv preprint arXiv:1608.03542, 2016
Cited by 93 · 2016
Lingvo: a modular and scalable framework for sequence-to-sequence modeling
J Shen, P Nguyen, Y Wu, Z Chen, MX Chen, Y Jia, A Kannan, T Sainath, ...
arXiv preprint arXiv:1902.08295, 2019
Cited by 44 · 2019
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, Ł Kaiser, I Polosukhin
Advances in neural information processing systems, 5998-6008, 2017
Cited by 37 · 2017
Attention is All you Need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, Ł Kaiser, I Polosukhin
Advances in Neural Information …, 2017
Cited by 19 · 2017
Accurate supervised and semi-supervised machine reading for long documents
D Hewlett, L Jones, A Lacoste, I Gur
Proceedings of the 2017 Conference on Empirical Methods in Natural Language …, 2017
Cited by 13 · 2017
Byte-level machine reading across morphologically varied languages
T Kenter, L Jones, D Hewlett
Thirty-Second AAAI Conference on Artificial Intelligence, 2018
Cited by 8 · 2018
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, Ł Kaiser
arXiv preprint arXiv:1601.03317, 2017
Cited by 4
WikiReading: A novel large-scale language understanding task over wikipedia
A Lacoste, A Fandrianto, D Hewlett, D Berthelot, I Polosukhin, J Han, ...
Cited by 1 · 2016
Multi-task multi-modal machine learning system
NM Shazeer, AN Gomez, LM Kaiser, JD Uszkoreit, LO Jones, NJ Parmar, ...
US Patent App. 16/689,025, 2020
2020
Machine translation using neural network models
Z Chen, MR Hughes, Y Wu, M Schuster, X Chen, LO Jones, NJ Parmar, ...
US Patent App. 16/521,780, 2020
2020
Attention-based sequence transduction neural networks
NM Shazeer, AN Gomez, LM Kaiser, JD Uszkoreit, LO Jones, NJ Parmar, ...
US Patent App. 16/559,392, 2019
2019
Attention-based sequence transduction neural networks
NM Shazeer, AN Gomez, LM Kaiser, JD Uszkoreit, LO Jones, NJ Parmar, ...
US Patent 10,452,978, 2019
2019
Large Scale Multi-Domain Multi-Task Learning with MultiModel
L Kaiser, AN Gomez, N Shazeer, A Vaswani, N Parmar, L Jones, ...
2018
The Best of Both Worlds: Combining Recent Advances in Neural Machine Translation
A Bapna, G Foster, L Jones, M Hughes, M Johnson, M Chen, M Schuster, ...
2018