Ming Ding
Verified email at mails.tsinghua.edu.cn
Title
Cited by
Year
GPT understands, too
X Liu, Y Zheng, Z Du, M Ding, Y Qian, Z Yang, J Tang
AI Open, 2023
Cited by 701* · 2023
GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training
J Qiu, Q Chen, Y Dong, J Zhang, H Yang, M Ding, K Wang, J Tang
KDD 2020, 2020
Cited by 638 · 2020
CogView: Mastering Text-to-Image Generation via Transformers
M Ding, Z Yang, W Hong, W Zheng, C Zhou, D Yin, J Lin, X Zou, Z Shao, ...
NeurIPS 2021, 2021
Cited by 362 · 2021
Cognitive graph for multi-hop reading comprehension at scale
M Ding, C Zhou, Q Chen, H Yang, J Tang
ACL 2019, 2019
Cited by 243 · 2019
All NLP tasks are generation tasks: A general pretraining framework
Z Du, Y Qian, X Liu, M Ding, J Qiu, Z Yang, J Tang
ACL 2022, 2021
Cited by 215* · 2021
Towards Knowledge-Based Recommender Dialog System
Q Chen, J Lin, Y Zhang, M Ding, Y Cen, H Yang, J Tang
Proceedings of the 2019 Conference on Empirical Methods in Natural Language …, 2019
Cited by 172 · 2019
ProNE: Fast and Scalable Network Representation Learning
J Zhang, Y Dong, Y Wang, J Tang, M Ding
Proceedings of the 28th International Joint Conference on Artificial …, 2019
Cited by 171 · 2019
GLM-130B: An open bilingual pre-trained model
A Zeng, X Liu, Z Du, Z Wang, H Lai, M Ding, Z Yang, Y Xu, W Zheng, X Xia, ...
arXiv preprint arXiv:2210.02414, 2022
Cited by 150* · 2022
Are we really making much progress? Revisiting, benchmarking, and refining heterogeneous graph neural networks
Q Lv*, M Ding*, Q Liu, Y Chen, W Feng, S He, C Zhou, J Jiang, Y Dong, ...
KDD 2021, 2021
Cited by 131 · 2021
Understanding Negative Sampling in Graph Representation Learning
Z Yang*, M Ding*, C Zhou, H Yang, J Zhou, J Tang
KDD 2020, 2020
Cited by 125 · 2020
Semi-supervised learning on graphs with generative adversarial nets
M Ding, J Tang, J Zhang
Proceedings of the 27th ACM International Conference on Information and ¡¦, 2018
Cited by 117 · 2018
CogView2: Faster and Better Text-to-Image Generation via Hierarchical Transformers
M Ding, W Zheng, W Hong, J Tang
NeurIPS 2022, 2022
Cited by 113 · 2022
M6: A Chinese multimodal pretrainer
J Lin, R Men, A Yang, C Zhou, M Ding, Y Zhang, P Wang, A Wang, ...
arXiv preprint arXiv:2103.00823, 2021
Cited by 110 · 2021
CogLTX: Applying BERT to Long Texts
M Ding, C Zhou, H Yang, J Tang
NeurIPS 2020, 2020
Cited by 108 · 2020
CogVideo: Large-scale Pretraining for Text-to-Video Generation via Transformers
W Hong*, M Ding*, W Zheng, X Liu, J Tang
ICLR 2023, 2022
Cited by 97 · 2022
MixGCF: An Improved Training Method for Graph Neural Network-based Recommender Systems
T Huang, Y Dong, M Ding, Z Yang, W Feng, X Wang, J Tang
KDD 2021, 2021
Cited by 83 · 2021
M6-UFC: Unifying multi-modal controls for conditional image synthesis
Z Zhang, J Ma, C Zhou, R Men, Z Li, M Ding, J Tang, J Zhou, H Yang
NeurIPS 2021, 2021
Cited by 59* · 2021
WuDaoCorpora: A super large-scale Chinese corpora for pre-training language models
S Yuan, H Zhao, Z Du, M Ding, X Liu, Y Cen, X Zou, Z Yang, J Tang
AI Open 2, 65-68, 2021
Cited by 41* · 2021
Controllable Generation from Pre-trained Language Models via Inverse Prompting
X Zou, D Yin, Q Zhong, M Ding, Z Yang, J Tang
arXiv preprint arXiv:2103.10685, 2021
Cited by 37 · 2021
Adaptive Diffusion in Graph Neural Networks
J Zhao, Y Dong, M Ding, E Kharlamov, J Tang
Advances in Neural Information Processing Systems (NeurIPS 2021), 2021
Cited by 29 · 2021
Articles 1–20