Open Conference Systems, CLADAG2023

Efficiency and Robustness in Supervised Learning
Anand N. Vidyashankar, Giacomo Francisci, Fengnan Deng, Xiaoran Jiang

Last modified: 2023-07-02

Abstract


In recent years, there has been increasing interest in building machine-learning systems that perform adequately when the training and test data differ. In supervised learning, this problem has been addressed within the distributionally robust framework, wherein the ambiguity set of test distributions is allowed to vary within a neighborhood of the training distribution. While such methods are useful, the tradeoff between statistical efficiency and robustness remains unclear. Focusing on the out-of-distribution generalization problem, in this presentation we describe a precise notion of statistical efficiency and relate the loss of efficiency to the gain in robustness in these contexts. We illustrate our ideas with examples from label shift estimation arising in diagnostic problems, privacy and utility in healthcare, and generative adversarial networks.
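To make the label shift setting concrete, the following is a minimal illustrative sketch (not the authors' method) of black-box shift estimation: when only the class proportions change between training and test data, the unknown test label distribution can be recovered from a classifier's confusion matrix on training data and its prediction frequencies on test inputs. All numerical values below are hypothetical.

```python
import numpy as np

# Hypothetical confusion matrix of a fixed classifier, estimated on
# held-out training data: C[i, j] = P(predict class i | true class j).
C = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# Unknown test label distribution under label shift (for illustration).
q_true = np.array([0.3, 0.7])

# Under label shift, the distribution of the classifier's predictions
# on test inputs satisfies mu[i] = sum_j C[i, j] * q[j].
mu_hat = C @ q_true

# Estimate the test label distribution by solving the linear system
# C q = mu_hat (requires C to be invertible / well-conditioned).
q_est = np.linalg.solve(C, mu_hat)
print(q_est)
```

In practice `mu_hat` would be the empirical frequency of predicted labels on the test sample, so `q_est` recovers `q_true` only up to sampling error and the conditioning of `C`.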