Machel Reid
Research Scientist, Google DeepMind
Verified email at google.com - Homepage
Title · Cited by · Year
Large Language Models are Zero-Shot Reasoners
T Kojima, SS Gu, M Reid, Y Matsuo, Y Iwasawa
NeurIPS 2022, 2022
1885 · 2022
Gemini: a family of highly capable multimodal models
G Team, R Anil, S Borgeaud, Y Wu, JB Alayrac, J Yu, R Soricut, ...
arXiv preprint arXiv:2312.11805, 2023
427 · 2023
Can Wikipedia help offline reinforcement learning?
M Reid, Y Yamada, SS Gu
arXiv preprint arXiv:2201.12122, 2022
87 · 2022
LEWIS: Levenshtein Editing for Unsupervised Text Style Transfer
M Reid, V Zhong
Findings of the Annual Meeting of the Association for Computational Linguistics (ACL), 2021
60 · 2021
A Few Thousand Translations Go a Long Way! Leveraging Pre-trained Models for African News Translation
DI Adelani, JO Alabi, A Fan, J Kreutzer, X Shen, M Reid, D Ruiter, ...
NAACL 2022, 2022
42* · 2022
DiffusER: Diffusion via Edit-based Reconstruction
M Reid, VJ Hellendoorn, G Neubig
The Eleventh International Conference on Learning Representations, 2022
32* · 2022
Subformer: Exploring Weight Sharing for Parameter Efficiency in Generative Transformers
M Reid, E Marrese-Taylor, Y Matsuo
Findings of Empirical Methods in Natural Language Processing (EMNLP), 2021
31 · 2021
Gemma: Open models based on gemini research and technology
G Team, T Mesnard, C Hardin, R Dadashi, S Bhupatiraju, S Pathak, ...
arXiv preprint arXiv:2403.08295, 2024
26 · 2024
AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages
M Reid, J Hu, G Neubig, Y Matsuo
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021
26 · 2021
Learning to Model Editing Processes
M Reid, G Neubig
Findings of Empirical Methods in Natural Language Processing (EMNLP), 2022
24 · 2022
VCDM: Leveraging Variational Bi-encoding and Deep Contextualized Word Representations for Improved Definition Modeling
M Reid, E Marrese-Taylor, Y Matsuo
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020
21 · 2020
PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining
M Reid, M Artetxe
Conference of the North American Chapter of the Association for Computational Linguistics (NAACL), 2021
17 · 2021
Low-Resource Machine Translation Using Cross-Lingual Language Model Pretraining
F Zheng, M Reid, E Marrese-Taylor, Y Matsuo
AmericasNLP Workshop, NAACL 2021, 2021
14 · 2021
M2D2: A Massively Multi-domain Language Modeling Dataset
M Reid, V Zhong, S Gururangan, L Zettlemoyer
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2022
13 · 2022
On the impact of data augmentation on downstream performance in natural language processing
I Okimura, M Reid, M Kawano, Y Matsuo
Proceedings of the Third Workshop on Insights from Negative Results in NLP …, 2022
11 · 2022
mmT5: Modular multilingual pre-training solves source language hallucinations
J Pfeiffer, F Piccinno, M Nicosia, X Wang, M Reid, S Ruder
arXiv preprint arXiv:2305.14224, 2023
10 · 2023
Variational Inference for Learning Representations of Natural Language Edits
E Marrese-Taylor, M Reid, Y Matsuo
AAAI 2021, 2020
8 · 2020
Gemini 1.5: Unlocking multimodal understanding across millions of tokens of context
M Reid, N Savinov, D Teplyashin, D Lepikhin, T Lillicrap, J Alayrac, ...
arXiv preprint arXiv:2403.05530, 2024
6 · 2024
On the role of parallel data in cross-lingual transfer learning
M Reid, M Artetxe
arXiv preprint arXiv:2212.10173, 2022
4 · 2022
BUFFET: Benchmarking large language models for few-shot cross-lingual transfer
A Asai, S Kudugunta, XV Yu, T Blevins, H Gonen, M Reid, Y Tsvetkov, ...
arXiv preprint arXiv:2305.14857, 2023
2 · 2023