
Shengding Hu
Verified email at mails.tsinghua.edu.cn
Title    Cited by    Year
Graph neural networks: A review of methods and applications
J Zhou, G Cui, S Hu, Z Zhang, C Yang, Z Liu, L Wang, C Li, M Sun
AI open 1, 57-81, 2020
8980    2020
Parameter-efficient fine-tuning of large-scale pre-trained language models
N Ding, Y Qin, G Yang, F Wei, Z Yang, Y Su, S Hu, Y Chen, CM Chan, ...
Nature machine intelligence 5 (3), 220-235, 2023
1373    2023
Efficient GPT-4V level multimodal large language model for deployment on edge devices
Y Yao, T Yu, A Zhang, C Wang, J Cui, H Zhu, T Cai, C Chen, H Li, W Zhao, ...
Nature Communications 16 (1), 5509, 2025
894*    2025
Enhancing chat language models by scaling high-quality instructional conversations
N Ding, Y Chen, B Xu, Y Qin, S Hu, Z Liu, M Sun, B Zhou
Proceedings of the 2023 Conference on Empirical Methods in Natural Language …, 2023
712    2023
Olympiadbench: A challenging benchmark for promoting agi with olympiad-level bilingual multimodal scientific problems
C He, R Luo, Y Bai, S Hu, Z Thai, J Shen, J Hu, X Han, Y Huang, Y Zhang, ...
Proceedings of the 62nd Annual Meeting of the Association for Computational …, 2024
615    2024
Knowledgeable prompt-tuning: Incorporating knowledge into prompt verbalizer for text classification
S Hu, N Ding, H Wang, Z Liu, J Li, M Sun
511    2021
Minicpm: Unveiling the potential of small language models with scalable training strategies
S Hu, Y Tu, X Han, C He, G Cui, X Long, Z Zheng, Y Fang, Y Huang, ...
arXiv preprint arXiv:2404.06395, 2024
510    2024
Tool learning with foundation models
Y Qin, S Hu, Y Lin, W Chen, N Ding, G Cui, Z Zeng, X Zhou, Y Huang, ...
ACM Computing Surveys 57 (4), 1-40, 2024
477    2024
Openprompt: An open-source framework for prompt-learning
N Ding, S Hu, W Zhao, Y Chen, Z Liu, H Zheng, M Sun
Proceedings of the 60th Annual Meeting of the Association for Computational …, 2022
416    2022
Delta tuning: A comprehensive study of parameter efficient methods for pre-trained language models
N Ding, Y Qin, G Yang, F Wei, Z Yang, Y Su, S Hu, Y Chen, CM Chan, ...
arXiv preprint arXiv:2203.06904, 2022
331    2022
∞Bench: Extending long context evaluation beyond 100K tokens
X Zhang, Y Chen, S Hu, Z Xu, J Chen, M Hao, X Han, Z Thai, S Wang, ...
Proceedings of the 62nd Annual Meeting of the Association for Computational …, 2024
198    2024
Prototypical verbalizer for prompt-based few-shot tuning
G Cui, S Hu, N Ding, L Huang, Z Liu
arXiv preprint arXiv:2203.09770, 2022
136    2022
Seed1.5-Thinking: Advancing superb reasoning models with reinforcement learning
BD Seed, J Chen, T Fan, X Liu, L Liu, Z Lin, M Wang, C Wang, X Wei, ...
arXiv preprint arXiv:2504.13914, 2025
116    2025
Graph Policy Network for Transferable Active Learning on Graphs
S Hu, Z Xiong, M Qu, X Yuan, MA Côté, Z Liu, J Tang
NeurIPS'20, 2020
96    2020
Decoder-only or encoder-decoder? interpreting language model as a regularized encoder-decoder
Z Fu, W Lam, Q Yu, AMC So, S Hu, Z Liu, N Collier
arXiv preprint arXiv:2304.04052, 2023
81    2023
Copen: Probing conceptual knowledge in pre-trained language models
H Peng, X Wang, S Hu, H Jin, L Hou, J Li, Z Liu, Q Liu
arXiv preprint arXiv:2211.04079, 2022
61    2022
Sparse structure search for delta tuning
S Hu, Z Zhang, N Ding, Y Wang, Y Wang, Z Liu, M Sun
Advances in Neural Information Processing Systems, 2022
47*    2022
Prosparse: Introducing and enhancing intrinsic activation sparsity within large language models
C Song, X Han, Z Zhang, S Hu, X Shi, K Li, C Chen, Z Liu, G Li, T Yang, ...
Proceedings of the 31st International Conference on Computational …, 2025
42    2025
Articles 1–20