Gyeongman Kim
Verified email at kaist.ac.kr
Title · Cited by · Year
Distilling linguistic context for language model compression
G Park, G Kim, E Yang
Proceedings of the 2021 Conference on Empirical Methods in Natural Language …, 2021
26 · 2021
Diffusion Video Autoencoders: Toward Temporally Consistent Face Video Editing via Disentangled Video Encoding
G Kim, H Shim, H Kim, Y Choi, J Kim, E Yang
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2022
17 · 2022
PromptKD: Distilling Student-Friendly Knowledge for Generative Language Models via Prompt Tuning
G Kim, D Jang, E Yang
arXiv preprint arXiv:2402.12842, 2024
1 · 2024
SeamsTalk: Seamless Talking Face Generation via Flow-Guided Inpainting
Y Jeong, G Kim, D Jang, J Hwang, E Yang
IEEE Access, 2024
2024