Qingyun Wu


Assistant Professor,
College of Information Science and Technology,
Penn State University
Email: qingyun.wu [at] psu.edu

About me

I am an Assistant Professor at the College of Information Science and Technology (IST) at Penn State University, which I joined in fall 2021. Before that, I was a Postdoctoral Researcher at Microsoft Research New York City from 2020 to 2021, hosted by Dr. John Langford. I received my Ph.D. in Computer Science from the University of Virginia in 2020, advised by Dr. Hongning Wang.

My research interests span machine learning and artificial intelligence. At present, I am particularly focused on Automated Machine Learning (AutoML) and Large Language Models (LLMs), and on their potential to shape the next generation of intelligent information systems.

To prospective students:

I am looking for self-motivated students (interns, fall 2024 PhD applicants, or collaborating students) who are interested in LLM agents, particularly multi-agent LLM systems (check out our AutoGen project below). If you share this interest, please send me an email with your CV and a brief explanation of why you are a good fit for this research. To keep things organized, please use the subject line: “Application for LLM Agent Research.”

Projects

  • Open-source project on LLM agents: AutoGen: Enabling Next-Gen LLM Applications via Multi-Agent Conversation. (Trending: 5k GitHub stars within a week; see the usage sketch after this list.)
    [project github] (My role: creator and maintainer)

  • Open-source project on AutoML and tuning: FLAML: A Fast Library for Automated Machine Learning & Tuning. (2.6k stars, 1 million downloads; see the usage sketch after this list.)
    [project github] (My role: creator and maintainer)
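
To give a flavor of what “multi-agent conversation” means in AutoGen, below is a minimal two-agent sketch. It assumes the pyautogen package is installed and an OpenAI API key is available in the environment; configuration keys and defaults vary across AutoGen versions, so treat this as an illustration rather than the canonical setup.

```python
# Minimal two-agent AutoGen sketch (assumes pyautogen is installed and
# OPENAI_API_KEY is set; configuration keys may differ across versions).
from autogen import AssistantAgent, UserProxyAgent

# LLM-backed assistant that writes replies (and code) during the conversation.
assistant = AssistantAgent(
    name="assistant",
    llm_config={"config_list": [{"model": "gpt-4"}]},
)

# Proxy for the human user: here it runs fully automated and executes any
# code the assistant proposes in a local "coding" directory.
user_proxy = UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "coding", "use_docker": False},
)

# The proxy starts the conversation; the two agents exchange messages until
# the task is finished or a termination condition is met.
user_proxy.initiate_chat(assistant, message="What is 123 * 456? Show your work.")
```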

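Likewise, here is a minimal sketch of FLAML's AutoML interface. It assumes FLAML is installed with its AutoML extras and that scikit-learn is available for a toy dataset; the 60-second time budget and accuracy metric are illustrative choices, not library defaults.

```python
# Minimal FLAML sketch (assumes `pip install "flaml[automl]"` and scikit-learn;
# the 60-second budget and accuracy metric are illustrative choices).
from flaml import AutoML
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)

automl = AutoML()
automl.fit(
    X_train=X,
    y_train=y,
    task="classification",   # FLAML also supports regression, ranking, etc.
    time_budget=60,          # seconds to spend searching learners and hyperparameters
    metric="accuracy",
)

print(automl.best_estimator)  # learner selected within the budget, e.g. "lgbm"
print(automl.best_config)     # its tuned hyperparameters
```
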
Research Highlights

  • AutoGen: Enabling Next-Gen LLM Applications via Multi-Agent Conversation.
    Qingyun Wu, Gagan Bansal, Jieyu Zhang, Yiran Wu, Shaokun Zhang, Erkang Zhu, Beibin Li, Li Jiang, Xiaoyun Zhang, and Chi Wang. arXiv:2308.08155. [paper] [github]

  • An Empirical Study on Challenging Math Problem Solving with GPT-4.
    Yiran Wu, Feiran Jia, Shaokun Zhang, Hangyu Li, Erkang Zhu, Yue Wang, Yin Tat Lee, Richard Peng, Qingyun Wu, and Chi Wang. arXiv:2306.01337. [paper] [code]

  • Targeted Hyperparameter Optimization with Lexicographic Preferences Over Multiple Objectives.
    Shaokun Zhang, Feiran Jia, Chi Wang, and Qingyun Wu. In ICLR 2023. (Notable-top-5%) [paper] [code]

  • ChaCha for Online AutoML.
    Qingyun Wu, Chi Wang, John Langford, Paul Mineiro, and Marco Rossi. In ICML 2021. [paper] [code]

  • FLAML: A Fast and Lightweight AutoML Library.
    Chi Wang*, Qingyun Wu*, Markus Weimer, and Erkang Zhu. In MLSys 2021. (* indicates equal contribution) [paper] [code]

  • Variance Reduction in Gradient Exploration for Online Learning to Rank.
    Huazheng Wang, Sonwoo Kim, Eric McCord-Snook, Qingyun Wu, and Hongning Wang. In SIGIR 2019. (Best Paper Award) [paper] [code]