Llion Jones
Sakana AI
Verified email at sakana.ai - Homepage
Title
Cited by
Year
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Advances in neural information processing systems 30, 2017
Cited by 118159 · 2017
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
NIPS, 2017
Cited by 2599* · 2017
Natural questions: a benchmark for question answering research
T Kwiatkowski, J Palomaki, O Redfield, M Collins, A Parikh, C Alberti, ...
Transactions of the Association for Computational Linguistics 7, 453-466, 2019
Cited by 2138 · 2019
ProtTrans: Toward understanding the language of life through self-supervised learning
A Elnaggar, M Heinzinger, C Dallago, G Rehawi, Y Wang, L Jones, ...
IEEE transactions on pattern analysis and machine intelligence 44 (10), 7112 …, 2021
Cited by 1088 · 2021
Attention is all you need. arXiv 2017
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762, 2023
Cited by 1036 · 2023
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, Ł Kaiser, I Polosukhin
Advances in neural information processing systems, 2017
Cited by 731 · 2017
Tensor2tensor for neural machine translation
A Vaswani, S Bengio, E Brevdo, F Chollet, AN Gomez, S Gouws, L Jones, ...
arXiv preprint arXiv:1803.07416, 2018
Cited by 612 · 2018
Attention is all you need (2017)
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762, 2019
Cited by 505 · 2019
The best of both worlds: Combining recent advances in neural machine translation
MX Chen, O Firat, A Bapna, M Johnson, W Macherey, G Foster, L Jones, ...
arXiv preprint arXiv:1804.09849, 2018
Cited by 501 · 2018
Character-level language modeling with deeper self-attention
R Al-Rfou, D Choe, N Constant, M Guo, L Jones
Proceedings of the AAAI conference on artificial intelligence 33 (01), 3159-3166, 2019
Cited by 416 · 2019
One model to learn them all
L Kaiser, AN Gomez, N Shazeer, A Vaswani, N Parmar, L Jones, ...
arXiv preprint arXiv:1706.05137, 2017
Cited by 382 · 2017
Attention is all you need. CoRR abs/1706.03762 (2017)
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Cited by 265 · 2017
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, Ł Kaiser, I Polosukhin
Advances in neural information processing systems, 2017
Cited by 257 · 2017
Attention is all you need
N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, Ł Kaiser, ...
Advances in neural information processing systems 30, 5998-6008, 2017
Cited by 225 · 2017
Lingvo: a modular and scalable framework for sequence-to-sequence modeling
J Shen, P Nguyen, Y Wu, Z Chen, MX Chen, Y Jia, A Kannan, T Sainath, ...
arXiv preprint arXiv:1902.08295, 2019
Cited by 199 · 2019
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Proceedings of the 31st International Conference on Neural Information Processing Systems, Curran Associates Inc., Red Hook, NY, USA, 2017
Cited by 172 · 2017
Wikireading: A novel large-scale language understanding task over wikipedia
D Hewlett, A Lacoste, L Jones, I Polosukhin, A Fandrianto, J Han, ...
arXiv preprint arXiv:1608.03542, 2016
Cited by 159 · 2016
Attention is all you need. 2017. doi: 10.48550
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762, 2017
Cited by 98 · 2017
Attention is all you need. NIPS (2017)
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762, 2017
Cited by 88 · 2017
ProtTrans: Towards cracking the language of Life’s code through self-supervised deep learning and high performance computing. arXiv 2020
A Elnaggar, M Heinzinger, C Dallago, G Rehawi, Y Wang, L Jones, ...
arXiv preprint arXiv:2007.06225, 2020
Cited by 88 · 2020
Articles 1–20