Multi-task learning (MTL) is a subfield of machine learning in which multiple tasks are solved simultaneously, exploiting commonalities and differences between them. Learning related tasks jointly can improve classification performance, and the regularization induced by requiring an algorithm to perform well across related tasks can be superior to uniform complexity penalization. MTL has also been shown to benefit the learning of seemingly unrelated tasks.
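The shared-structure idea can be sketched in a few lines: a common representation feeds several task-specific heads, and the joint objective sums the per-task losses, so the shared parameters must serve every task at once. The sketch below is a hypothetical toy forward pass (random data, illustrative layer sizes), not code from the course.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: 4-dim inputs, two regression tasks sharing one hidden layer.
X = rng.normal(size=(8, 4))          # batch of 8 examples
y_task_a = rng.normal(size=(8, 1))   # targets for task A
y_task_b = rng.normal(size=(8, 1))   # targets for task B

W_shared = rng.normal(size=(4, 16)) * 0.1   # parameters shared by both tasks
W_head_a = rng.normal(size=(16, 1)) * 0.1   # task-A-specific head
W_head_b = rng.normal(size=(16, 1)) * 0.1   # task-B-specific head

h = np.maximum(X @ W_shared, 0.0)    # shared ReLU representation

loss_a = float(np.mean((h @ W_head_a - y_task_a) ** 2))
loss_b = float(np.mean((h @ W_head_b - y_task_b) ** 2))

# Joint objective: minimizing the sum pressures W_shared to capture
# structure useful for both tasks, acting as an implicit regularizer.
joint_loss = loss_a + loss_b
```

Gradient steps on `joint_loss` would update `W_shared` with signal from both tasks, which is the mechanism behind the regularization effect described above.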
Stanford University
Fall 2022
This course emphasizes leveraging shared structure across multiple tasks to improve learning efficiency in deep learning. It provides a thorough treatment of multi-task and meta-learning algorithms, covering topics such as self-supervised pre-training, few-shot learning, and lifelong learning. The course is designed for graduate-level students; prerequisites include an introductory machine learning course.