Monday, July 4th, 2022
Neural network architecture search for extreme classification and for partially labeled learning
Deep learning applications are expanding rapidly and show no signs of slowing down. Neural network topologies are becoming larger and more complex in order to tackle challenging real-life problems.
This increased complexity necessitates more time and expertise from professionals, as well as a significant financial investment for AI companies.
Neural Architecture Search (NAS) is a novel Machine Learning paradigm that seeks to determine the best neural network architecture for a given problem. NAS techniques, however, have so far been studied and developed only on limited, well-defined Machine Learning problems, which are not representative of all existing ML scenarios.
This thesis focuses on the research and development of NAS approaches for new tasks, as well as for a new learning framework that is more relevant to real-world applications.
In particular, we proposed a neuro-evolutionary NAS framework to tackle the extreme multi-label classification challenge.
We combined convolutional and recurrent networks to build a search space better suited to this task.
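To make the idea of a neuro-evolutionary search over a hybrid convolutional/recurrent space concrete, here is a minimal sketch. It is not the thesis's actual method: the gene pool, hyperparameter ranges, mutation scheme, and the stand-in fitness function are all illustrative assumptions.

```python
import random

# Hypothetical gene pool mixing convolutional and recurrent layers,
# loosely illustrating a hybrid search space (values are made up).
LAYER_CHOICES = [
    ("conv", {"filters": [32, 64, 128], "kernel": [3, 5]}),
    ("recurrent", {"units": [64, 128], "bidirectional": [True, False]}),
]

def random_layer(rng):
    kind, space = rng.choice(LAYER_CHOICES)
    return {"kind": kind, **{k: rng.choice(v) for k, v in space.items()}}

def random_genome(rng, depth=4):
    # A genome is simply an ordered list of layer genes.
    return [random_layer(rng) for _ in range(depth)]

def mutate(genome, rng):
    # Point mutation: resample one layer gene at random.
    child = [dict(g) for g in genome]
    child[rng.randrange(len(child))] = random_layer(rng)
    return child

def evolve(fitness, generations=10, pop_size=8, seed=0):
    # Truncation selection: keep the best half, refill with mutants.
    rng = random.Random(seed)
    population = [random_genome(rng) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        population = parents + [mutate(rng.choice(parents), rng) for _ in parents]
    return max(population, key=fitness)

# Stand-in fitness: in a real NAS run this would be a trained model's
# validation score (e.g. precision@k for extreme multi-label data).
def toy_fitness(genome):
    return sum(1 for g in genome if g["kind"] == "recurrent")

best = evolve(toy_fitness)
```

In practice the fitness evaluation dominates the cost, since each candidate genome must be trained (at least partially) before it can be scored.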
We evaluated the performance of the searched networks on several datasets. We also addressed the reconstruction of an RSSI (Received Signal Strength Indicator) map, a harder problem because input data are scarce and only partially annotated. To this end, we proposed a dynamic architecture search framework for the semantic segmentation task that requires only a minimal number of annotated samples. Within this framework, we investigated several semi-supervised learning algorithms to determine which exploited unlabeled samples most effectively.
We examined a range of strategies, including both "traditional" and more recent semi-supervision approaches, as well as self-supervision approaches.
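As an illustration of one "traditional" semi-supervision scheme mentioned above, the sketch below implements pseudo-labeling with a nearest-centroid classifier: fit on the labeled points, add confident predictions on unlabeled points as pseudo-labels, then refit. The classifier, the margin-based confidence, and the threshold are illustrative assumptions, not the thesis's actual algorithms.

```python
import math

def fit_centroids(points, labels):
    # Mean of the 2-D points belonging to each class.
    sums, counts = {}, {}
    for p, y in zip(points, labels):
        sums.setdefault(y, [0.0, 0.0])
        counts[y] = counts.get(y, 0) + 1
        sums[y][0] += p[0]
        sums[y][1] += p[1]
    return {y: (s[0] / counts[y], s[1] / counts[y]) for y, s in sums.items()}

def predict(centroids, p):
    dists = {y: math.dist(p, c) for y, c in centroids.items()}
    y = min(dists, key=dists.get)
    # Confidence proxy: margin between the two closest centroids.
    others = [d for k, d in dists.items() if k != y]
    margin = min(others) - dists[y] if others else float("inf")
    return y, margin

def pseudo_label(labeled, labels, unlabeled, threshold=1.0):
    centroids = fit_centroids(labeled, labels)
    new_pts, new_lbls = list(labeled), list(labels)
    for p in unlabeled:
        y, margin = predict(centroids, p)
        if margin >= threshold:  # keep only confident pseudo-labels
            new_pts.append(p)
            new_lbls.append(y)
    return fit_centroids(new_pts, new_lbls)

# Toy usage: two labeled points, three unlabeled; the ambiguous point
# at (5, 5) falls below the margin threshold and is discarded.
centroids = pseudo_label([(0, 0), (10, 10)], [0, 1], [(1, 1), (9, 9), (5, 5)])
```

The same loop shape carries over to deep semi-supervised segmentation, where the classifier is a neural network and the confidence comes from its per-pixel predictive scores.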
Updated 30 June 2022