Michael Matena
PhD Student, UNC Chapel Hill
No verified email
Title · Cited by · Year
Exploring the limits of transfer learning with a unified text-to-text transformer
C Raffel, N Shazeer, A Roberts, K Lee, S Narang, M Matena, Y Zhou, W Li, ...
Journal of machine learning research 21 (140), 1-67, 2020
Cited by 14936, 2020
Merging models with fisher-weighted averaging
MS Matena, CA Raffel
Advances in Neural Information Processing Systems 35, 17703-17716, 2022
Cited by 122, 2022
Do transformer modifications transfer across implementations and applications?
S Narang, HW Chung, Y Tay, W Fedus, T Fevry, M Matena, K Malkan, ...
arXiv preprint arXiv:2102.11972, 2021
Cited by 82, 2021
Exploring the limits of transfer learning with a unified text-to-text transformer. arXiv
C Raffel, N Shazeer, A Roberts, K Lee, S Narang, M Matena, Y Zhou, W Li, ...
arXiv preprint arXiv:1910.10683, 2019
Cited by 70, 2019
Exploring the limits of transfer learning with a unified text-to-text transformer
A Roberts, C Raffel, K Lee, M Matena, N Shazeer, PJ Liu, S Narang, W Li, ...
Google, Tech. Rep., 2019
Cited by 40, 2019
A Combinatorial Perspective on the Optimization of Shallow ReLU Networks
MS Matena, CA Raffel
Advances in Neural Information Processing Systems 35, 22187-22198, 2022
Cited by 1, 2022
NPEFF: Non-Negative Per-Example Fisher Factorization
M Matena, C Raffel
arXiv preprint arXiv:2310.04649, 2023
2023
Articles 1–7