
Publication Achievements

VE Lab’s (Prof. Young Ho Chai) three papers accepted to ACM VRST 2025 (AI Top-tier Conference)

Administrator │ 2025-10-11



We are delighted to announce that three papers from the Virtual Environments Lab (VE Lab, Prof. Young Ho Chai) have been accepted to the ACM Symposium on Virtual Reality Software and Technology (VRST) 2025 [LINK].


Title: 

MoPriC: Two Stage Approach for Text Guided Motion-Primitives Composition


Authors:

Jeong Yeon Lee, Soung Sill Park, Young Ho Chai


Abstract:

Text-to-motion generative models suffer from the long-term dependency problem: as the motion length increases, it becomes difficult to maintain the context of the text instructions. In addition, current MoCap datasets include only predefined actions and fail to reflect diverse individual styles. To address these limitations, we introduce MoPriC, a two-stage motion composition framework that produces sequential motions from elementary motion primitives guided by text descriptions. We also present DancePrimitives, a new dataset of motion primitives collected to capture the semantics of each unit motion.
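
As a rough illustration of the two-stage "select, then compose" idea described in the abstract (and not the paper's actual implementation), the Python sketch below picks motion primitives from a hypothetical library based on a text prompt, then stitches them into one sequence with short cross-fades. The primitive names, array shapes, and the keyword-matching stage are all assumptions made for this example.

```python
import numpy as np

# Hypothetical primitive library: name -> (frames, joints, 3) clip of joint positions.
# Neither the names nor the shapes come from the paper; they only illustrate the
# generic two-stage "select, then compose" pipeline.
PRIMITIVES = {
    "step_left": np.random.randn(30, 22, 3),
    "spin":      np.random.randn(45, 22, 3),
    "arm_wave":  np.random.randn(30, 22, 3),
}

def select_primitives(text: str, k: int = 3) -> list[str]:
    """Stage 1 (illustrative): map a text prompt to k primitive names.
    A learned text-motion embedding would go here; we fall back to keyword matching."""
    hits = [name for name in PRIMITIVES if name.split("_")[0] in text.lower()]
    return (hits + list(PRIMITIVES))[:k]

def compose(names: list[str], blend: int = 10) -> np.ndarray:
    """Stage 2 (illustrative): concatenate primitives, cross-fading `blend` frames
    at each seam so consecutive clips join smoothly."""
    out = PRIMITIVES[names[0]].copy()
    for name in names[1:]:
        nxt = PRIMITIVES[name]
        w = np.linspace(0.0, 1.0, blend)[:, None, None]
        seam = (1.0 - w) * out[-blend:] + w * nxt[:blend]
        out = np.concatenate([out[:-blend], seam, nxt[blend:]], axis=0)
    return out

motion = compose(select_primitives("step left, then spin"))
print(motion.shape)  # e.g. (85, 22, 3)
```

A real system would replace the keyword matching with a learned text-motion embedding and the linear cross-fade with a learned transition model.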


__________________________________________________________________________________________________________________________________________________________________________________________


Title: 

Minimal Input to Maximal Movement: Real-Time Avatar Control with Dual-Sensor


Authors:

Se Hyeok Yoo, Kyung Min Kim, Young Ho Chai


Abstract:

We present a real-time dance control system that enables users to manipulate complex full-body movements through simple hand gestures. The system demonstrates movement modification using consumer-grade hardware, requiring only a webcam and an IMU sensor for gesture capture. This research highlights the potential of accessible hardware for intuitive avatar control, providing immediate and responsive interaction in digital environments.
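
For readers unfamiliar with this style of control, the sketch below shows the general shape of a real-time loop that fuses one webcam-derived signal with one IMU signal and maps them to a few avatar parameters. The sensor-reading functions and the mapping are hypothetical placeholders, not the system described in the paper.

```python
import math
from dataclasses import dataclass

# Illustrative only: the sensor readers and the mapping below are stand-ins that
# show how a minimal dual-sensor gesture loop might drive avatar controls.

@dataclass
class GestureState:
    hand_height: float   # normalized 0..1 from the webcam hand tracker
    wrist_roll: float    # radians from the IMU

def read_webcam_hand() -> float:
    """Placeholder for a webcam-based hand tracker (e.g. normalized wrist height)."""
    return 0.5

def read_imu_roll() -> float:
    """Placeholder for an IMU driver returning wrist roll in radians."""
    return 0.0

def gesture_to_avatar(g: GestureState) -> dict[str, float]:
    """Map two scalar inputs to higher-level avatar controls: hand height scales
    the movement amplitude, wrist roll selects the turning rate."""
    return {
        "amplitude": max(0.0, min(1.0, g.hand_height)),
        "turn_rate": math.sin(g.wrist_roll),  # smooth and bounded in [-1, 1]
    }

def control_loop(steps: int = 3) -> None:
    for _ in range(steps):
        g = GestureState(read_webcam_hand(), read_imu_roll())
        print(gesture_to_avatar(g))

control_loop()
```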


__________________________________________________________________________________________________________________________________________________________________________________________


Title: 

Partial Joint Correction of Abnormal Motion Data via Reward Function Design in Virtual Environments


Authors:

Hyun Beom Kim, Soung Sill Park, Young Ho Chai


Abstract:

Most reinforcement learning-based humanoid motion studies emphasize full-body imitation, limiting selective correction of abnormal joints. This study proposes a reward function design that corrects abnormal joint behavior while preserving motion style, using two approaches: periodic positional targets and pre-trained joint angle references. Applied separately, both guided the agent to recover natural swing motion. Experiments in a physics-based simulation showed improved joint mobility and corrected gait patterns. The results highlight that targeted correction is achievable with imperfect motion data through reward design alone, with potential applications in rehabilitation simulations and user-feedback systems in virtual environments.
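
The core idea, imitating the reference motion on most joints while steering selected abnormal joints toward a cleaner target, can be sketched as a reward function. The weights, joint indices, and periodic target below are illustrative assumptions only; the paper's actual reward design may differ.

```python
import numpy as np

# A minimal sketch of "full-body imitation reward plus a corrective term for
# selected joints". All constants here are assumptions made for the example.

ABNORMAL = [3]               # hypothetical index of the joint to correct
W_IMITATE, W_CORRECT = 1.0, 2.0

def periodic_target(t: float, amplitude: float = 0.6, period: float = 1.2) -> float:
    """Periodic positional target for the corrected joint (e.g. a swing angle)."""
    return amplitude * np.sin(2.0 * np.pi * t / period)

def reward(agent_q: np.ndarray, ref_q: np.ndarray, t: float) -> float:
    """agent_q / ref_q: joint angles of the agent and the (imperfect) reference."""
    normal = [j for j in range(len(agent_q)) if j not in ABNORMAL]
    # Imitate the reference everywhere except the abnormal joints...
    r_imitate = np.exp(-np.sum((agent_q[normal] - ref_q[normal]) ** 2))
    # ...and pull the abnormal joints toward a clean periodic target instead.
    err = sum((agent_q[j] - periodic_target(t)) ** 2 for j in ABNORMAL)
    r_correct = np.exp(-err)
    return W_IMITATE * r_imitate + W_CORRECT * r_correct

q_agent = np.zeros(8)
q_ref = np.zeros(8)
print(reward(q_agent, q_ref, t=0.3))
```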




