LOCALIZED META-LEARNING: A PAC-BAYES ANALYSIS FOR META-LEARNING BEYOND GLOBAL PRIOR

Abstract

Meta-learning methods learn meta-knowledge from a collection of training tasks and aim to facilitate the learning of new tasks under a task-similarity assumption. Such meta-knowledge is often represented as a fixed distribution; this, however, may be too restrictive to capture task-specific information, because the discriminative patterns in the data may change dramatically across tasks. In this work, we aim to equip the meta learner with the ability to model and produce task-specific meta-knowledge and, accordingly, present a localized meta-learning framework based on PAC-Bayes theory. In particular, we propose a Local Coordinate Coding (LCC) based prior predictor that allows the meta learner to adaptively generate local meta-knowledge for specific tasks. We further develop a practical algorithm with deep neural networks based on the bound. Empirical results on real-world datasets demonstrate the efficacy of the proposed method.



Figure 1: Illustration of the localized meta-learning framework. Instead of using global meta-knowledge for all tasks, we tailor the meta-knowledge to each specific task.
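As an illustrative sketch (not the paper's implementation), the LCC-based prior predictor can be thought of as follows: a task embedding is encoded as a sparse, nonnegative weighted combination of learned anchor points, and the task-specific prior is formed by combining per-anchor prior parameters with those weights. All names (`lcc_coordinates`, `local_prior_mean`, the anchor count, and the softmax-style weighting) are assumptions introduced here for illustration.

```python
import numpy as np

def lcc_coordinates(v, bases, k=3):
    """Soft local coordinates of embedding v w.r.t. anchor points.

    Only the k nearest anchors receive nonzero weight, and the
    weights are normalized to sum to one (a simplified LCC scheme,
    assumed here for illustration).
    """
    d = np.linalg.norm(bases - v, axis=1)   # distances to all anchors
    idx = np.argsort(d)[:k]                 # k nearest anchors
    w = np.exp(-d[idx])                     # closer anchors weigh more
    gamma = np.zeros(len(bases))
    gamma[idx] = w / w.sum()                # sparse, sums to 1
    return gamma

def local_prior_mean(task_embedding, bases, base_priors, k=3):
    """Task-specific prior mean as a local combination of per-anchor priors."""
    gamma = lcc_coordinates(task_embedding, bases, k)
    return gamma @ base_priors              # weighted sum of anchor priors

# Toy usage: 8 anchors in a 4-d task-embedding space, 10-d prior parameters.
rng = np.random.default_rng(0)
bases = rng.normal(size=(8, 4))        # learned anchor points (assumed)
base_priors = rng.normal(size=(8, 10)) # one prior mean per anchor (assumed)
task = rng.normal(size=4)              # embedding of a new task
mu = local_prior_mean(task, bases, base_priors)
```

In this sketch, two tasks with nearby embeddings share most of their active anchors and therefore receive similar priors, while distant tasks draw on different anchors, which is the intuition behind replacing a single global prior with local meta-knowledge.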

