Hao Sun
Verified email at pku.edu.cn
Title · Cited by · Year
Multispeech: Multi-speaker text to speech with transformer
M Chen, X Tan, Y Ren, J Xu, H Sun, S Zhao, T Qin, TY Liu
arXiv preprint arXiv:2006.04664, 2020
Cited by 93 · 2020
Token-level ensemble distillation for grapheme-to-phoneme conversion
H Sun, X Tan, JW Gan, H Liu, S Zhao, T Qin, TY Liu
arXiv preprint arXiv:1904.03446, 2019
Cited by 68 · 2019
LightPAFF: A two-stage distillation framework for pre-training and fine-tuning
K Song, H Sun, X Tan, T Qin, J Lu, H Liu, TY Liu
arXiv preprint arXiv:2004.12817, 2020
Cited by 16 · 2020
Knowledge distillation from BERT in pre-training and fine-tuning for polyphone disambiguation
H Sun, X Tan, JW Gan, S Zhao, D Han, H Liu, T Qin, TY Liu
2019 IEEE Automatic Speech Recognition and Understanding Workshop (ASRU …, 2019
Cited by 14 · 2019