The Video and Image Processing Cup (VIP Cup), a student competition presented by the IEEE Signal Processing Society, gives students the opportunity to work together to solve real-life problems using video and image processing methods. After submissions are evaluated, three finalist teams are selected to present their work and compete for the grand prize at ICIP 2021.
Interested in competing? The submission deadline is July 30, 2021. For full competition details, eligibility requirements, and team registration, visit the IEEE Signal Processing Society website.
Every person spends around one third of their life in bed. For an infant or a young toddler this fraction can be much higher, and for bed-bound patients it can reach 100% of their time. Automatic non-contact human pose estimation has received a great deal of attention in the artificial intelligence (AI) community in recent years, thanks to the introduction of deep learning and its power in AI modeling. However, state-of-the-art vision-based algorithms in this field can hardly cope with the challenges of in-bed human behavior monitoring, which include significant illumination changes (e.g., full darkness at night), heavy occlusion (e.g., the subject covered by a sheet or a blanket), and privacy concerns that hinder the large-scale data collection necessary for training any AI model.
Theoretically, transferring a pose estimator trained on labeled uncovered cases to unlabeled covered ones can be framed as a domain adaptation problem. Although multiple datasets exist for human pose estimation, they consist mainly of RGB images of daily activities, which exhibit a huge domain shift from our target setting. And although domain adaptation has been studied in machine learning for decades, mainstream algorithms focus on classification rather than on regression tasks such as pose estimation. The 2021 VIP Cup challenge is thus a domain adaptation problem for regression, with a practical application to in-bed human pose estimation, that has not been addressed before.
In this 2021 VIP Cup challenge, we seek computer vision-based solutions for in-bed pose estimation under the covers, where no annotations are available for covered cases during model training, while contestants have access to large amounts of labeled data for uncovered cases. Successful completion of this task would enable in-bed behavior monitoring technologies to work on novel subjects and in novel environments where no prior training data is accessible.
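To give a feel for the kind of unsupervised domain adaptation this setup calls for, the sketch below aligns the second-order statistics of "source" (uncovered) features with "target" (covered) features using CORAL (correlation alignment). This is one classic baseline idea, not the organizers' method or a competitive solution; the synthetic features, function names, and NumPy-only setup are all illustrative assumptions standing in for a real backbone's output.

```python
import numpy as np

def _sym_matrix_power(M, p):
    """Power of a symmetric positive semi-definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.clip(w, 1e-12, None) ** p) @ V.T

def coral_align(X_src, X_tgt, eps=1e-6):
    """CORAL: re-color source features so their covariance matches the target's.

    X_src: features from labeled (uncovered) images, shape (n_s, d)
    X_tgt: features from unlabeled (covered) images, shape (n_t, d)
    Returns source features whose second-order statistics match the target
    domain; a pose regressor trained on these aligned features (with the
    source labels) may transfer better to the covered domain.
    """
    C_src = np.cov(X_src, rowvar=False) + eps * np.eye(X_src.shape[1])
    C_tgt = np.cov(X_tgt, rowvar=False) + eps * np.eye(X_tgt.shape[1])
    # Whiten with the source covariance, then re-color with the target's.
    return X_src @ _sym_matrix_power(C_src, -0.5) @ _sym_matrix_power(C_tgt, 0.5)

# Toy demo: synthetic features standing in for a real feature extractor.
rng = np.random.default_rng(0)
X_src = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 4))        # "uncovered"
X_tgt = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 4)) + 1.0  # "covered"
X_aligned = coral_align(X_src, X_tgt)
```

In a real pipeline the regression head (keypoint coordinates or heatmaps) would be trained on the aligned source features; the point here is only that statistics-matching techniques, unlike many classification-oriented adaptation methods, carry over directly to regression.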
For more information about the competition, please visit: IEEE VIP Cup 2021
Each team must be composed of: (i) one faculty member (the Supervisor); (ii) at most one graduate student (the Tutor); and (iii) at least 3 but no more than 10 undergraduates. At least three of the undergraduate team members must be either IEEE Signal Processing Society (SPS) members or SPS student members.
Augmented Cognition Lab at Northeastern University