In the MUTAMUR project (Multitask Learning with Multilingual Resources for Better Natural Language Understanding), we investigate methods for knowledge sharing and transfer between machine learning models in natural language processing.


Modern machine learning models in natural language processing require large amounts of training data to reach high quality. While this training data is task-specific, many tasks are related both in terms of machine learning algorithms and language representations. MUTAMUR investigates new machine learning methods that exploit these relationships, and aims to develop better natural language processing systems for tasks and languages with little task-specific training data.


Project Head:

Rico Sennrich


Project members:

Farhad Nooralahzadeh

Chantal Amrhein

Jannis Vamvas


Previous project members:

Duygu Ataman

Annette Rios