Multi-Objective Machine Learning

Real-world applications of deep learning often have to contend with objectives beyond predictive performance, i.e., with multiple equally important and competing objectives or criteria. Examples include cost functions pertaining to invariance (e.g., to photometric or geometric variations), semantic independence (e.g., from age or race in face recognition systems), privacy (e.g., mitigating leakage of sensitive information), algorithmic fairness (e.g., demographic parity), generalization across multiple domains, and computational complexity (e.g., FLOPs, compactness), to name a few. In such applications, a single solution that simultaneously optimizes all objectives is no longer attainable; instead, the goal becomes finding a set of solutions that characterizes the trade-off among the objectives. Multiple approaches have been developed for such problems, including simple scalarization and population-based methods.
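As a point of reference for the hands-on portion, the sketch below illustrates the simple scalarization approach mentioned above: a weighted sum of the objectives is minimized for a sweep of weights, and each minimizer contributes one point to an approximate Pareto front. The toy bi-objective problem, the names `f1`, `f2`, and `scalarized`, and the use of `scipy.optimize.minimize` are illustrative assumptions for this sketch, not code from the tutorial.

```python
# Minimal sketch of weighted-sum scalarization on a toy convex
# bi-objective problem (illustrative only).
import numpy as np
from scipy.optimize import minimize

def f1(x):
    # First objective, e.g., a surrogate for predictive loss.
    return (x[0] - 1.0) ** 2 + x[1] ** 2

def f2(x):
    # Second, competing objective, e.g., a cost such as complexity.
    return x[0] ** 2 + (x[1] - 1.0) ** 2

def scalarized(x, w):
    # Weighted-sum scalarization: a single scalar objective per weight w.
    return w * f1(x) + (1.0 - w) * f2(x)

# Sweep the weight over [0, 1]; each solve yields one trade-off point.
pareto_front = []
for w in np.linspace(0.0, 1.0, 11):
    res = minimize(scalarized, x0=np.zeros(2), args=(w,))
    pareto_front.append((f1(res.x), f2(res.x)))

for p in pareto_front:
    print("f1 = %.3f, f2 = %.3f" % p)
```

For convex problems like this toy example, the weight sweep recovers the entire Pareto front; for non-convex fronts, population-based methods such as those covered in the tutorial are typically preferred.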

This tutorial aims to provide a comprehensive introduction to the fundamentals, recent advances, and applications of multi-objective optimization (MOO), followed by hands-on coding examples. Emerging applications of MOO include:

  • hardware-aware neural architecture search;
  • multi-task learning as multi-objective optimization;
  • designing neural networks for secure inference.

We will also summarize potential research directions at the intersection of MOO and ML/CV research.


Time             Title                                                                   Speaker
09:00am-09:20am  Introduction                                                            Vishnu Boddeti

Foundational Concepts
09:20am-10:00am  Introduction to Multi-Objective Optimization                            Kalyanmoy Deb
10:00am-10:30am  MOEA/D: Multi-Objective Evolutionary Algorithm Based on Decomposition   Qingfu Zhang

10:30am-10:50am  Break

Applications
10:50am-11:15am  Multi-Objective Optimization for Multi-Task Learning                    Xi Lin
11:15am-11:40am  Designing CNNs for Secure Inference                                     Vishnu Boddeti
11:40am-12:05pm  Multi-Objective Neural Architecture Search                              Zhichao Lu
12:05pm-12:30pm  Hands-On Session (Notebook)                                             Zhichao Lu