Exploring ASL Motion Capture with MediaPipe and RPM
By Mugi
Mugi shares early SignMate experiments capturing ASL motion, testing MediaPipe, RPM, and ASL Mocap datasets to evaluate gesture accuracy and complexity.