# iTAML

**Repository Path**: hchouse/iTAML

## Basic Information

- **Project Name**: iTAML
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2020-04-13
- **Last Updated**: 2020-12-19

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# iTAML: An Incremental Task-Agnostic Meta-learning Approach

Official implementation of "iTAML: An Incremental Task-Agnostic Meta-learning Approach" (CVPR 2020) [(paper link)](https://arxiv.org/abs/2003.11652).

iTAML (accepted at the IEEE Conference on Computer Vision and Pattern Recognition, Seattle, Washington, 2020) hypothesizes that generalization is a key factor for continual learning. In this pursuit, we learn a set of generalized parameters that are specific neither to old nor to new tasks, by introducing a novel meta-learning approach that seeks to maintain an equilibrium between all the encountered tasks. This equilibrium is ensured by a task-agnostic meta-update rule which avoids catastrophic forgetting. When presented with a continuum of data, our model automatically identifies the task and quickly adapts to it with just a single update.

This repository is implemented using PyTorch and includes code for running the incremental learning experiments on the MNIST, SVHN, CIFAR100, ImageNet, and MS-Celeb-10K datasets.
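The task-agnostic meta-update described above can be illustrated with a dependency-free sketch: copies of the shared parameters are briefly adapted to each task, and the shared parameters then move toward the average of the adapted copies, so no single task dominates. All names (`inner_sgd`, `meta_update`, the toy gradient functions) are illustrative stand-ins, not the repository's API, and the real implementation operates on PyTorch model parameters.

```python
# Conceptual sketch of a task-agnostic meta-update in the spirit of iTAML.
# Parameters are plain lists of floats here; the repo uses PyTorch tensors.

def inner_sgd(theta, grad_fn, lr=0.1, steps=3):
    """Adapt a copy of the shared parameters to one task with a few SGD steps."""
    theta = list(theta)
    for _ in range(steps):
        grads = grad_fn(theta)
        theta = [w - lr * g for w, g in zip(theta, grads)]
    return theta

def meta_update(theta, task_grad_fns, meta_lr=0.5):
    """Move the shared parameters toward the mean of the task-adapted copies,
    maintaining an equilibrium between all encountered tasks."""
    adapted = [inner_sgd(theta, g) for g in task_grad_fns]
    return [w + meta_lr * (sum(a[i] for a in adapted) / len(adapted) - w)
            for i, w in enumerate(theta)]
```

With two symmetric toy tasks (one pulling the parameter toward +1, one toward -1), the meta-update leaves the shared parameter balanced between them rather than drifting toward either task.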

(a) iTAML overall learning process
### Dependencies

This code requires the following:

* matplotlib==3.2.1
* numpy==1.18.2
* pandas==1.0.3
* Pillow==7.0.0
* scipy==1.4.1
* torch==1.4.0
* torchvision==0.5.0

Run `pip3 install -r requirements.txt` to install all the dependencies.

### Data

All the data loading is handled in `incremental_dataloader.py`, and the experimental settings for the datasets are handled in the `args` class in `train_
(b) iTAML performance on CIFAR100 with class-incremental settings of 5, 10, and 20 classes per task, respectively.
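The class-incremental settings mentioned in (b) amount to partitioning the label space into consecutive groups that arrive as sequential tasks. The helper below is a hypothetical illustration of that split, not the repository's `incremental_dataloader.py`.

```python
# Illustrative sketch: split a label space into incremental tasks,
# e.g. CIFAR100 with 20 classes per task -> 5 sequential tasks.
# make_task_splits is a hypothetical helper, not part of the repository.

def make_task_splits(num_classes, classes_per_task):
    """Partition class ids [0, num_classes) into consecutive tasks."""
    assert num_classes % classes_per_task == 0, "tasks must tile the label space"
    return [list(range(start, start + classes_per_task))
            for start in range(0, num_classes, classes_per_task)]
```

For example, `make_task_splits(100, 20)` yields 5 tasks, the first covering classes 0-19 and the last covering classes 80-99; a dataloader would then filter each dataset split to the classes of the current task.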

(c) iTAML performance on ImageNet-100, ImageNet-1K and MS-Celeb-10K.
### Credits

Thanks to https://github.com/khurramjaved96/incremental-learning for the preliminary implementation of the data loader.

### Contact

Jathushan Rajasegaran - jathushan.rajasegaran@inceptioniai.org or brjathu@gmail.com