Vision-Driven Robot Movement
By Eman
Eman demos a vision-driven robot control system that reads printed signs to interpret commands and move autonomously in real time.
More from the Lab
Zeineb introduces her medical AI project focused on early brain tumor detection, explaining how AI, precision, and analysis can improve patient outcomes.
Munkhchimeg Sergelen explains how SignMate avatars are created and why clean, precise motion data is essential for accurate ASL handshapes and movements.
Mugi shares early SignMate experiments capturing ASL motion, testing MediaPipe, RPM, and ASL Mocap datasets to evaluate gesture accuracy and complexity.
"I like Jira" became "has issues with Jira." Zeineb shows how we're fixing data accuracy in Yield's AI interviews.
40 seconds per turn was too slow. Eman parallelized agents, fixed caching, and cut latency by 12%. Here's how.
Pain points had 75% coverage. Everything else? Under 10%. Mugi shows how we taught Yield to go wider, not just deeper.