Thursday, November 28th, 2024
Contrastive Methods for Enhanced Representation Learning
Abstract:
At the heart of deep learning’s success is its ability to learn rich, hierarchical representations of data through extensive training on large labeled datasets. However, such labeled data is often limited or costly to acquire, which poses a significant challenge for many practical applications. To address this issue, researchers have developed self-supervised and semi-supervised learning techniques, which leverage unlabeled data to enhance model performance. These approaches have shown great promise in maximizing the utility of limited labeled examples, making them particularly attractive in scenarios where labeled data is scarce.
This thesis explores the advancement and application of contrastive learning techniques to learn better representations, especially visual representations, in the contexts of semi-supervised learning, self-supervised learning, long-tailed multi-label classification, and bias mitigation. Contrastive learning is a powerful general framework that learns representations by contrasting positive pairs (similar data points) against negative pairs (dissimilar data points), thereby encouraging the model to capture meaningful patterns in the data. Over the course of this research, we aim to address the challenges of learning effective data representations in these various contexts by proposing novel frameworks and methodologies that leverage the strengths of contrastive learning.
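To give a concrete flavor of the contrastive principle described above, here is a minimal InfoNCE-style loss sketch in PyTorch. It is an illustrative example only, not the formulation developed in the thesis: each embedding in one view is pulled toward its counterpart in a second view (positive pair) and pushed away from all other embeddings in the batch (negatives).

```python
# Minimal sketch of an InfoNCE-style contrastive loss (illustrative only;
# the exact losses and frameworks proposed in the thesis may differ).
import torch
import torch.nn.functional as F


def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """z1[i] and z2[i] form a positive pair; all other pairings in the batch act as negatives."""
    z1 = F.normalize(z1, dim=1)          # project embeddings onto the unit sphere
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature   # cosine-similarity matrix, scaled by temperature
    targets = torch.arange(z1.size(0), device=z1.device)  # positives lie on the diagonal
    return F.cross_entropy(logits, targets)


# Toy usage: embeddings of two augmented views of the same 8 samples.
view1, view2 = torch.randn(8, 128), torch.randn(8, 128)
print(info_nce_loss(view1, view2).item())
```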
Date and place
Thursday, November 28th, 2024, at 3:45 PM
Bâtiment IMAG, Seminar Room 2
and Zoom
Jury composition
Massih-Reza Amini
Professor, Université Grenoble Alpes (Thesis supervisor)
Jakob Verbeek
Researcher, FAIR, Meta (Reviewer)
Stéphane Canu
Professor, Université de Normandie (Reviewer)
Julien Mairal
Research Director, INRIA Grenoble (Examiner)
Isabelle Guyon
Research Director, Google DeepMind (Examiner)