Graph transformer networks. S Yun, M Jeong, R Kim, J Kang, HJ Kim. Advances in Neural Information Processing Systems 32, 2019. Cited by 298.
HATS: A hierarchical graph attention network for stock movement prediction. R Kim, CH So, M Jeong, S Lee, J Kim, J Kang. arXiv preprint arXiv:1908.07999, 2019. Cited by 52.
Global stock market prediction based on stock chart images using deep Q-network. J Lee, R Kim, Y Koh, J Kang. IEEE Access 7, 167260-167277, 2019. Cited by 43.
DeepNAP: Deep neural anomaly pre-detection in a semiconductor fab. C Kim, J Lee, R Kim, Y Park, J Kang. Information Sciences 457, 1-11, 2018. Cited by 19.
SAIN: Self-attentive integration network for recommendation. S Yun, R Kim, M Ko, J Kang. Proceedings of the 42nd International ACM SIGIR Conference on Research and …, 2019. Cited by 9.
MAPS: Multi-Agent reinforcement learning-based Portfolio management System. J Lee, R Kim, SW Yi, J Kang. arXiv preprint arXiv:2007.05402, 2020. Cited by 8.
A deep neural spoiler detection model using a genre-aware attention mechanism. B Chang, H Kim, R Kim, D Kim, J Kang. Pacific-Asia Conference on Knowledge Discovery and Data Mining, 183-195, 2018. Cited by 8.
Predicting multiple demographic attributes with task specific embedding transformation and attention network. R Kim, H Kim, J Lee, J Kang. Proceedings of the 2019 SIAM International Conference on Data Mining, 765-773, 2019. Cited by 7.
Graph Transformer Networks: Learning meta-path graphs to improve GNNs. S Yun, M Jeong, S Yoo, S Lee, SS Yi, R Kim, J Kang, HJ Kim. Neural Networks, 2022.