Peng Li (李鹏)
E-mail: lip21 [at] m.fudan.edu.cn; pengli.ds [at] gmail.com
I am currently a third-year M.S. student in the NLP Group at Fudan University, advised by Prof. Xipeng Qiu. I am also very fortunate to work with Prof. Hongyuan Mei and Prof. Matthew R. Walter at TTIC. Before coming to Fudan, I earned my B.S. degree from the School of Data Science and Engineering at East China Normal University, where I worked with Prof. Yuanbin Wu.
My research interests lie mainly in foundation models for robot learning. My goal is to build general-purpose robots that can interact with the real world autonomously and continually improve. To this end, I am (1) enhancing the perception, reasoning, planning, acting, and learning capabilities of robots with existing CV/NLP foundation models; (2) designing and pretraining robotics-specific foundation models; and (3) building new evaluation benchmarks.
*I am actively looking for research-oriented full-time positions in foundation models × robotics. Here is my CV. Please feel free to contact me!
MANGO: A Benchmark for Evaluating Mapping and Navigation Abilities of Large Language Models
Peng Ding*, Jiading Fang*, Peng Li*, Kangrui Wang*, Xiaochen Zhou*, Mo Yu, Jing Li, Matthew Walter, Hongyuan Mei (α)
ArXiv 2024 | Website | Paper | GitHub
Statler: State-Maintaining Language Models for Embodied Reasoning
Takuma Yoneda*, Jiading Fang*, Peng Li*, Huanyu Zhang*, Tianchong Jiang, Shengjie Lin, Ben Picker, David Yunis, Hongyuan Mei, Matthew R. Walter
ICRA 2024 | Website | Paper | GitHub
MOSS: An Open Conversational Language Model
🌟 MOSS was the first ChatGPT-like LLM and the first plugin-augmented LLM in China, and it is fully open-sourced with 11.8k+ stars on GitHub.
Tianxiang Sun, Xiaotian Zhang, Zhengfu He, Peng Li, Qinyuan Cheng, Hang Yan, Xiangyang Liu, Yunfan Shao, Qiong Tang, Xingjian Zhao, Ke Chen, Yining Zheng, Zhejian Zhou, Ruixiao Li, Jun Zhan, Yunhua Zhou, Linyang Li, Xiaogui Yang, Lingling Wu, Zhangyue Yin, Xuanjing Huang, Xipeng Qiu
Machine Intelligence Research 2024 | Website | Paper | GitHub
CodeIE: Large Code Generation Models are Better Few-Shot Information Extractors
Peng Li*, Tianxiang Sun*, Qiong Tang, Hang Yan, Yuanbin Wu, Xuanjing Huang, Xipeng Qiu
ACL 2023 | Website | Paper | GitHub