Aug 15

Green Serverless Computing for Resource-Efficient AI Training

Principal Investigators & Key Members: Kok-Seng Wong, PhD

Research Team Details:

Project Information

Context/Introduction/Motivation:

With the increasing demand for artificial intelligence (AI) applications, there is a growing need for efficient, sustainable computing solutions for AI model training. Traditional server-based architectures often consume substantial energy and resources, raising environmental concerns and driving up operational costs. Serverless computing offers a promising alternative: computing resources are provisioned on demand and scale automatically with the workload. This research explores the feasibility of leveraging serverless computing for efficient AI training, focusing on minimizing energy consumption and improving resource utilization. Specifically, this research aims to:

  • Develop a highly efficient serverless AI training architecture that reduces computing overhead.
  • Minimize energy consumption and improve resource utilization through dynamic scaling.
  • Lower operational costs, making AI training more affordable for startups, researchers, and
    enterprises.
  • Advance sustainable AI practices, aligning with global efforts to reduce carbon emissions and energy waste.

Methodologies/Activities:

  1. Data Collection & Performance Metrics: Gathering and analyzing AI training datasets while defining key energy efficiency and performance evaluation metrics.
  2. Serverless AI Training Architecture Design: Reviewing existing cloud-based serverless platforms (e.g., AWS Lambda, Google Cloud Functions, Azure Functions) and designing a customized serverless framework optimized for AI workloads.
  3. Real-World Use-Case Implementation & Testing: Deploying AI training tasks in serverless environments and measuring execution time, scalability, and energy consumption.
  4. Evaluation & Sustainability Impact Analysis: Comparing serverless AI training performance with traditional computing methods, assessing cost reductions, and quantifying carbon footprint reductions.
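To make activities 2–3 concrete, a serverless training task is typically packaged as a stateless function handler that the platform invokes on demand. The sketch below is a minimal local simulation only: the Lambda-style `handler(event, context)` signature, the toy linear-model training step, and the timing logic are illustrative assumptions, not the project's actual framework or any specific provider's API.

```python
import json
import random
import time

def train_step(w, batch, lr=0.01):
    """One SGD step for a toy linear model y = w*x (stand-in for a real training step)."""
    grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
    return w - lr * grad

def handler(event, context=None):
    """Lambda-style entry point: runs event['steps'] training steps on a synthetic batch."""
    random.seed(0)
    # Synthetic data with true slope 3.0; a real deployment would pull a shard from storage.
    batch = [(x, 3.0 * x) for x in (random.uniform(-1, 1) for _ in range(32))]
    w = 0.0
    start = time.perf_counter()
    for _ in range(event["steps"]):
        w = train_step(w, batch)
    elapsed = time.perf_counter() - start
    # Execution time is one of the metrics the study measures per invocation.
    return {"statusCode": 200,
            "body": json.dumps({"weight": w, "seconds": elapsed})}

# Local invocation standing in for a cloud trigger.
result = handler({"steps": 1000})
print(json.loads(result["body"]))  # learned weight approaches the true slope 3.0
```

Keeping each invocation stateless and short is what lets the platform bill only for actual compute time, which is the source of the efficiency gains the project targets.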

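For the sustainability analysis in activity 4, one common estimation approach (an assumption here, not this project's measured methodology) is to derive energy from average power draw and billed duration, then convert to CO2 with a grid carbon-intensity factor. All figures below are illustrative placeholders.

```python
def energy_kwh(avg_power_watts: float, duration_s: float) -> float:
    """Energy = power x time, converted from watt-seconds (joules) to kWh."""
    return avg_power_watts * duration_s / 3.6e6

def co2_kg(kwh: float, grid_intensity_kg_per_kwh: float = 0.4) -> float:
    """CO2 estimate; 0.4 kg/kWh is a placeholder grid-intensity factor."""
    return kwh * grid_intensity_kg_per_kwh

# Illustrative comparison: an always-on server vs. serverless billed only while training.
server_kwh = energy_kwh(avg_power_watts=300, duration_s=24 * 3600)     # on all day
serverless_kwh = energy_kwh(avg_power_watts=300, duration_s=2 * 3600)  # billed for 2 h

savings = 1 - serverless_kwh / server_kwh
print(f"server: {server_kwh:.2f} kWh, serverless: {serverless_kwh:.2f} kWh, "
      f"saving {savings:.0%}, CO2 avoided ~{co2_kg(server_kwh - serverless_kwh):.2f} kg")
```

In practice the study would replace the placeholder power and duration values with measurements from the deployed workloads, but the arithmetic structure of the comparison stays the same.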

Expected Outcomes:

This project aims to develop a highly efficient serverless AI training architecture that significantly reduces computing overhead, optimizes energy consumption, and improves resource utilization through dynamic scaling. By minimizing operational costs, this solution will make AI training more accessible and affordable for startups, researchers, and enterprises. Furthermore, it aligns with global efforts to promote sustainable AI practices by lowering carbon emissions and reducing energy waste.

Future Development/Impacts:

In the future, the project will focus on optimizing the serverless AI training framework by enhancing dynamic resource allocation, model parallelization, and adaptive load balancing to further improve efficiency. Additionally, integration with edge computing will be explored to decentralize AI training, reducing reliance on large-scale cloud data centers and lowering overall energy consumption. The project will also foster industry and research collaborations by partnering with cloud service providers, AI research institutions, and industry stakeholders to broaden adoption and applicability. Ultimately, this initiative contributes to sustainable AI development by promoting energy-efficient computing and green computing practices, and by encouraging organizations to integrate serverless AI frameworks into their corporate sustainability strategies.