On May 4th, 2022 at 15:30, I graduated with my PhD from the University of Amsterdam. My thesis is titled Information Sharing Methods for Multi-Task Learning and contains a bunch of papers from my research between 2017 and 2022. Yay!


You can watch my defence in the video below:

Or, if you enjoy light reading, you can download my thesis here.

Thesis Abstract

This thesis investigates information sharing for Multi-Task Learning (MTL) in the multimedia and computer vision domains. More specifically, we investigate how, when, and where information sharing should occur in order to maximize the positive and minimize the negative influence between the tasks performed in MTL. In the first chapter, we introduce a benchmark dataset dubbed OmniArt, which contains heterogeneously annotated data from digitized visual artworks. Through OmniArt, we begin to observe the relationships that form between the multiple tasks defined on the same dataset. In the second chapter, we investigate how secondary latent features, such as back-propagated gradients, help in determining between-task similarities. In the third chapter, we study task relationships through feature-wise transformations at the filter level, which allow MTL models to share knowledge between large numbers of tasks. To distinguish the setting in which MTL is performed on a very large number of tasks, this chapter also introduces Many-Task Learning (MaTL), which occurs when more than 20 tasks are trained simultaneously. Finally, in the fourth chapter, we investigate the relationships between tasks and scales in the form of a universal attention module. These works broaden the scope of applicability of MTL approaches. By controlling the way information is processed within MTL, we gain the ability to approach a problem from multiple perspectives and discover relationships hidden in the underlying dataset. As multitasking comes naturally to humans, we hope this work contributes to making MTL the paradigm of choice in general machine learning.
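
To give a flavour of the gradient-based view of task similarity mentioned for the second chapter, here is a minimal sketch (not the actual method from the thesis): it measures the cosine similarity between the gradients that two task losses induce on a shared encoder. The toy encoder, task heads, and data are all hypothetical placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical setup: a shared encoder with two task-specific heads.
shared = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 64))
head_a = nn.Linear(64, 10)  # e.g. classification task A
head_b = nn.Linear(64, 10)  # e.g. classification task B

x = torch.randn(16, 32)            # a dummy batch
y_a = torch.randint(0, 10, (16,))  # dummy labels for task A
y_b = torch.randint(0, 10, (16,))  # dummy labels for task B

def shared_grad(loss):
    """Flatten the gradients of `loss` w.r.t. the shared parameters."""
    grads = torch.autograd.grad(loss, list(shared.parameters()), retain_graph=True)
    return torch.cat([g.reshape(-1) for g in grads])

features = shared(x)
loss_a = F.cross_entropy(head_a(features), y_a)
loss_b = F.cross_entropy(head_b(features), y_b)

# Cosine similarity of the two tasks' gradients on the shared trunk:
# positive values suggest the tasks pull the shared weights in
# compatible directions; negative values hint at interference.
affinity = F.cosine_similarity(shared_grad(loss_a), shared_grad(loss_b), dim=0)
print(f"task affinity (gradient cosine): {affinity.item():.3f}")
```

In practice one would average such similarities over many batches; a consistently negative value is one signal that hard parameter sharing between two tasks may do more harm than good.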