The beauty of synchronized dancing lies in the synchronization of body movements among multiple dancers. While dancers record their practice with cameras, standard video interfaces do not efficiently support them in identifying segments where they are not well synchronized. This fails to close a tight loop in the iterative practice process (i.e., capturing a practice, reviewing the video, and practicing again). We present SyncUp, a system providing multiple interactive visualizations to support the practice of synchronized dancing, liberating users from the manual inspection of recorded practice videos. By analyzing videos uploaded by users, SyncUp quantifies two aspects of synchronization in dancing: pose similarity among multiple dancers and the temporal alignment of their movements. The system then highlights which body parts and which portions of the dance routine require further practice to achieve better synchronization. The results of our system evaluations show that our pose similarity estimates and temporal alignment predictions correlated well with human ratings. Participants in our qualitative user evaluation highlighted the benefits and potential uses of SyncUp, confirming that it would enable quick iterative practice.
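To give a concrete flavor of the pose-similarity idea, the sketch below compares two dancers' 2D keypoints by normalizing each pose (removing position and body-size differences) and taking a cosine similarity. This is only an illustrative simplification under assumed inputs (lists of `(x, y)` keypoints from some pose estimator); it is not the metric actually used by SyncUp.

```python
import math

def normalize(pose):
    """Center a pose on its centroid and scale to unit RMS size.

    `pose` is a hypothetical list of (x, y) keypoints, e.g. from a
    pose estimator; normalization removes translation and scale so
    only the body configuration is compared.
    """
    cx = sum(x for x, _ in pose) / len(pose)
    cy = sum(y for _, y in pose) / len(pose)
    centered = [(x - cx, y - cy) for x, y in pose]
    scale = math.sqrt(sum(x * x + y * y for x, y in centered) / len(centered)) or 1.0
    return [(x / scale, y / scale) for x, y in centered]

def pose_similarity(pose_a, pose_b):
    """Cosine similarity between two normalized poses (1.0 = identical shape)."""
    a, b = normalize(pose_a), normalize(pose_b)
    dot = sum(ax * bx + ay * by for (ax, ay), (bx, by) in zip(a, b))
    norm_a = math.sqrt(sum(x * x + y * y for x, y in a))
    norm_b = math.sqrt(sum(x * x + y * y for x, y in b))
    return dot / (norm_a * norm_b)

# A translated and uniformly scaled copy of the same pose scores 1.0,
# since normalization removes those differences.
pose = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 2.0)]
shifted = [(2 * x + 5, 2 * y + 3) for x, y in pose]
print(round(pose_similarity(pose, shifted), 6))
```

A per-body-part variant of such a score (e.g., computed over arm or leg keypoints separately) is what would let a system point out *which* limbs are out of sync, as SyncUp's visualizations do.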



Zhongyi Zhou, Anran Xu, and Koji Yatani. 2021. SyncUp: Vision-based Practice Support for Synchronized Dancing. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 5, 3, Article 143 (Sept 2021), 25 pages. (paper)

Zhongyi Zhou and Koji Yatani. "A Synchronized Dance Practice Support System Using Human Pose Analysis." IPSJ SIG UBI, December 2020. Student Encouragement Award. (paper)

Zhongyi Zhou, Yuki Tsubouchi, and Koji Yatani. 2019. Visualizing Out-of-synchronization in Group Dancing. In The Adjunct Publication of UIST 2019, 107–109. (paper)