Research

Research interests: green computing, large language model fine-tuning

2024

  1. Understanding Multi-Dimensional Efficiency of Fine-Tuning Large Language Models Using SpeedUp, MemoryUp, and EnergyUp
    Dayuan Chen, Noe Soto, Jonas F Tuttle, and 1 more author
    In 2024 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW), 2024

2023

  1. Unveiling the Truth: An Analysis of the Energy and Carbon Footprint of Training an OPT Model using DeepSpeed on the H100 GPU
    Alexander Song, Dayuan Chen, and Ziliang Zong
    In Proceedings of the 14th International Green and Sustainable Computing Conference, 2023
  2. Evaluating the Carbon Impact of Large Language Models at the Inference Stage
    Brad Everman, Trevor Villwock, Dayuan Chen, and 3 more authors
    In 2023 IEEE International Performance, Computing, and Communications Conference (IPCCC), 2023
  3. Carbon-Aware Software Deployment in the Cloud (Independent Research)
    2023