Se June Joo
KAIST AI
No verified email
Title
Cited by
Year
The cot collection: Improving zero-shot and few-shot learning of language models via chain-of-thought fine-tuning
S Kim, SJ Joo, D Kim, J Jang, S Ye, J Shin, M Seo
arXiv preprint arXiv:2305.14045, 2023
67
2023
Mind the gap! injecting commonsense knowledge for abstractive dialogue summarization
S Kim, SJ Joo, H Chae, C Kim, S Hwang, J Yeo
arXiv preprint arXiv:2209.00930, 2022
18
2022
Cotever: Chain of thought prompting annotation toolkit for explanation verification
S Kim, SJ Joo, Y Jang, H Chae, J Yeo
arXiv preprint arXiv:2303.03628, 2023
8
2023
How Well Do Large Language Models Truly Ground?
H Lee, S Joo, C Kim, J Jang, D Kim, KW On, M Seo
arXiv preprint arXiv:2311.09069, 2023
4
2023
Latent Action Pretraining From Videos
S Ye, J Jang, B Jeon, S Joo, J Yang, B Peng, A Mandlekar, R Tan, ...
arXiv preprint arXiv:2410.11758, 2024
2024
Semiparametric Token-Sequence Co-Supervision
H Lee, D Kim, J Jun, S Joo, J Jang, KW On, M Seo
arXiv preprint arXiv:2403.09024, 2024
2024
Development and Implementation of a Conditional Gated Multilayer Perceptron Model for Natural Language Processing
G Son, S Kim, SJ Joo, W Cho, J Na
Proceedings of the Korea Information Processing Society Conference 28 (2), 1116-1119, 2021
2021