
Task Routing

2019

Researcher

Tags: PyTorch, Deep Learning, Multi-Task Learning, Research, ICCV 2019

Overview

Task Routing is the official PyTorch implementation of the ICCV 2019 Oral presentation "Many Task Learning With Task Routing". Developed at the University of Amsterdam, this research introduces a novel approach to handling hundreds of classification tasks within a single neural network model.

Traditional multi-task learning (MTL) struggles to scale beyond a handful of tasks. Task Routing addresses this by introducing the Task Routing Layer (TRL): a conditional feature-wise transformation mechanism that enables efficient many-task learning without a massive expansion in parameters.

Architecture

Input Image + Task ID (which task to perform?)
        ↓
Shared Convolutional Layers (VGG-11 backbone)
        ↓
Task Routing Layer (TRL): feature-wise transformation applying task-specific γ and β to the convolutional activations
        ↓
Task-Specific Classifier (fully connected layers)
        ↓
Classification Result (task-specific prediction)

Key Innovation

Task Routing Layer (TRL)

The TRL applies conditional feature-wise linear modulation to convolutional activations. For each task, it learns task-specific γ (scale) and β (shift) parameters that transform the shared feature maps:

output = γ_task * conv_features + β_task

This lightweight mechanism allows the model to adapt shared representations to hundreds of different tasks with minimal additional parameters: only two values (γ and β) per feature channel per task.
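The transform described above can be sketched as a small PyTorch module. This is an illustrative implementation of the γ/β modulation as the text describes it, not the official layer from taskrouting.py; the class and parameter names are assumptions.

```python
import torch
import torch.nn as nn

class TaskRoutingLayer(nn.Module):
    """Sketch of a task-conditional feature-wise transform.

    Learns a per-channel scale (gamma) and shift (beta) for each task
    and applies them to shared convolutional activations. Names and
    structure are illustrative; see taskrouting.py for the real layer.
    """

    def __init__(self, num_tasks: int, num_channels: int):
        super().__init__()
        # One (gamma, beta) pair per channel per task: two values each.
        self.gamma = nn.Parameter(torch.ones(num_tasks, num_channels))
        self.beta = nn.Parameter(torch.zeros(num_tasks, num_channels))

    def forward(self, x: torch.Tensor, task_id: int) -> torch.Tensor:
        # x: (batch, channels, H, W); broadcast the per-channel params.
        g = self.gamma[task_id].view(1, -1, 1, 1)
        b = self.beta[task_id].view(1, -1, 1, 1)
        return g * x + b  # output = gamma_task * features + beta_task
```

At initialization (γ = 1, β = 0) the layer is the identity, so training starts from the unmodified shared representation.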

Features

Scalable Architecture: Handles 100+ classification tasks in a single model

Efficient Parameters: Minimal overhead compared to task-specific models

PyTorch Implementation: Standalone layer that integrates with any CNN architecture

Reference Models: Includes VGG-11 implementations with and without BatchNorm

Comprehensive Evaluation: Tested on 5 datasets plus Visual Decathlon challenge

Repository Contents

taskrouting.py

Standalone PyTorch layer implementation that can be integrated into any convolutional architecture

routed_vgg.py

Sample VGG-11 model with task routing integration, demonstrating how to use TRL in practice
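routed_vgg.py wires the TRL into a full VGG-11; the toy model below is a minimal sketch of the same pattern, with shared convolutional features, a task-conditioned γ/β transform, and one classifier head per task. All names here are illustrative assumptions, not the repository's API.

```python
import torch
import torch.nn as nn

class ManyTaskNet(nn.Module):
    """Minimal many-task model sketch: shared backbone + routing + heads."""

    def __init__(self, num_tasks, num_classes_per_task, channels=16):
        super().__init__()
        # Shared feature extractor (stand-in for the VGG-11 backbone).
        self.features = nn.Sequential(
            nn.Conv2d(3, channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Task-conditional scale/shift, as in the TRL description.
        self.gamma = nn.Parameter(torch.ones(num_tasks, channels))
        self.beta = nn.Parameter(torch.zeros(num_tasks, channels))
        # One classifier head per task.
        self.heads = nn.ModuleList(
            nn.Linear(channels, n) for n in num_classes_per_task
        )

    def forward(self, x, task_id):
        h = self.features(x)
        h = (self.gamma[task_id].view(1, -1, 1, 1) * h
             + self.beta[task_id].view(1, -1, 1, 1))
        return self.heads[task_id](h.flatten(1))
```

The task ID selects both the routing parameters and the output head, so a single shared backbone serves every task.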

Configuration Files

Templates for setting unit counts, task counts, and sigma ratios for model instantiation
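A configuration along these lines might look like the following sketch. The key names and values are assumptions for illustration only; the repository's actual templates may use a different format.

```python
# Hypothetical model-instantiation config; keys are illustrative, not
# the repository's real template schema.
config = {
    "num_units": 512,   # convolutional units per routed layer
    "num_tasks": 100,   # number of classification tasks to route
    "sigma": 0.5,       # sigma ratio controlling sharing across tasks
}
```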

Research Impact

Presented as an Oral paper at ICCV 2019, one of the top computer vision conferences. The work introduced the concept of "Many Task Learning" (MaTL) - defined as multi-task learning with 20+ tasks - distinguishing it from conventional MTL approaches that typically handle 2-5 tasks.

The research demonstrates that task routing can successfully fit hundreds of tasks into a single model while maintaining performance competitive with task-specific models and outperforming traditional MTL baselines. This has significant implications for model deployment, memory efficiency, and transfer learning.

Conference
ICCV 2019 Oral
Institution
University of Amsterdam
GitHub Impact
64 ⭐ 16 🍴

Applications

Multi-Domain Learning: Train a single model across diverse visual domains simultaneously

Continual Learning: Add new tasks without catastrophic forgetting or architecture changes

Model Compression: Replace hundreds of task-specific models with a single efficient network

Transfer Learning: Leverage shared representations across related classification tasks