Proceedings of the IEEE

When Multitask Learning Meets Partial Supervision: A Computer Vision Review
Maxime Fontana, Michael Spratling, Miaojing Shi
Keywords: Reviews, Optimization methods, Computer vision, Sparse matrices, Natural language processing, Computational modeling, Autonomous driving, Deep learning, Biomedical imaging, Multitasking, Medical robotics, Visualization, Vision Tasks, Multi-task Learning, Partial Supervision, Multi-task Fashion, Multi-objective Optimization, Multiple Tasks, Convolutional Neural Network, Semi-supervised Learning, Depth Estimation, Maximum A Posteriori, Self-supervised Learning, Pareto Front, Pareto Optimal Solutions, Few-shot Learning, Neural Architecture Search, Auxiliary Task, Shared Representation, Unlabeled Images, Multi-task Model, Ground-truth Bounding Box, Multi-task Learning Method, Shared Parameters, Self-supervised Task, Surface Normals, Multi-task Learning Model, Task Loss, Source Task, Pareto Optimal, Deep Reinforcement Learning, Depth Prediction, deep learning (DL), medical imaging, minimal supervision, multitask learning (MTL), robotic surgery, visual understanding
Abstract: Multitask learning (MTL) aims to learn multiple tasks simultaneously while exploiting their mutual relationships. By using shared resources to compute multiple outputs at once, this learning paradigm can offer lower memory requirements and inference times than the traditional approach of using a separate method for each task. Previous work in MTL has mainly focused on fully supervised methods, as task relationships (TRs) can be leveraged not only to lower the data dependency of those methods but also to improve performance. However, MTL introduces its own set of challenges due to a complex optimization scheme and a higher labeling requirement. This article focuses on how MTL could be utilized under different partial supervision settings to address these challenges. First, it analyzes how MTL traditionally uses different parameter sharing techniques to transfer knowledge between tasks. Second, it presents the challenges arising from such a multiobjective optimization (MOO) scheme. Third, it introduces how task groupings (TGs) can be achieved by analyzing TRs. Fourth, it focuses on how partially supervised methods applied to MTL can tackle the aforementioned challenges. Lastly, it presents the available datasets, tools, and benchmarking results of such methods. The reviewed articles, categorized following this work, are available at https://github.com/Klodivio355/MTL-CV-Review.
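To make the parameter-sharing and multiobjective-optimization ideas from the abstract concrete, below is a minimal sketch of hard parameter sharing with a scalarized multi-task loss, assuming a PyTorch-style setup. The module, the two tasks (semantic segmentation and depth), and the loss weights are illustrative assumptions, not the authors' implementation.

# Minimal sketch of hard parameter sharing in MTL (illustrative only).
# A shared backbone computes one representation; each task keeps its own head,
# and the per-task losses are combined into a single weighted objective.
import torch
import torch.nn as nn

class HardSharingMTL(nn.Module):
    def __init__(self, num_seg_classes=21):
        super().__init__()
        # Shared encoder: these parameters are reused by every task.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
        )
        # Task-specific heads: only these parameters are private to a task.
        self.seg_head = nn.Conv2d(64, num_seg_classes, 1)   # semantic segmentation
        self.depth_head = nn.Conv2d(64, 1, 1)               # monocular depth

    def forward(self, x):
        feats = self.backbone(x)
        return {"seg": self.seg_head(feats), "depth": self.depth_head(feats)}

def multitask_loss(outputs, targets, weights=None):
    # Scalarized multi-objective loss: a weighted sum of per-task losses.
    # Under partial supervision, a task's term would simply be omitted for
    # samples that lack labels for that task.
    weights = weights or {"seg": 1.0, "depth": 1.0}
    losses = {
        "seg": nn.functional.cross_entropy(outputs["seg"], targets["seg"]),
        "depth": nn.functional.l1_loss(outputs["depth"], targets["depth"]),
    }
    return sum(weights[t] * losses[t] for t in losses)

if __name__ == "__main__":
    model = HardSharingMTL()
    x = torch.randn(2, 3, 64, 64)
    targets = {
        "seg": torch.randint(0, 21, (2, 64, 64)),
        "depth": torch.rand(2, 1, 64, 64),
    }
    loss = multitask_loss(model(x), targets)
    loss.backward()
    print(loss.item())

A weighted sum is only the simplest way to scalarize the multi-objective problem; the review itself surveys more principled alternatives such as Pareto-based optimization and learned task weightings.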