[R&D] Google Gemini + Matterport SDK: Hand-Gesture Navigation Demo
Tosolini (TosoliniProductions) | Bellevue, Washington
Experience a new hands-free way to navigate Matterport 3D tours. In this prototype, I used Google Gemini 3 and a bit of vibe-coding to control a Matterport virtual tour entirely through hand gestures. The webcam detects open/closed hands in real time to move left, right, or forward inside the space. This experiment explores how AI models can make 3D environments more accessible and more intuitive to explore.

Special thanks to Peder Nelson of The Museum of Flight in Seattle, WA for the ongoing R&D collaboration.

Try the demo yourself (webcam required; you'll need to allow webcam access): https://gemini.google.com/share/17bfeefb0327

Tools used:
• Google Gemini 3
• Matterport SDK
• Webcam hand-gesture recognition via vibe-coding
Post 1
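
For readers curious how the pieces might fit together, here is a minimal TypeScript sketch of the gesture-to-camera mapping, not the author's actual code. It assumes a Matterport showcase already connected through the SDK (the mpSdk handle) and a separate webcam hand-tracking step, passed in here as a detectGesture callback; the gesture labels, the 30-degree turn, and the 750 ms polling interval are illustrative choices, not details taken from the demo.

// Minimal sketch under the assumptions stated above.
type Gesture = 'open-left' | 'open-right' | 'open-forward' | 'closed';

// Map a detected gesture onto a Matterport camera action.
async function applyGesture(mpSdk: any, gesture: Gesture): Promise<void> {
  switch (gesture) {
    case 'open-left':
      // Turn the camera 30 degrees to the left (horizontal, vertical).
      await mpSdk.Camera.rotate(-30, 0);
      break;
    case 'open-right':
      // Turn the camera 30 degrees to the right.
      await mpSdk.Camera.rotate(30, 0);
      break;
    case 'open-forward':
      // Step toward the sweep currently in view.
      await mpSdk.Camera.moveInDirection(mpSdk.Camera.Direction.FORWARD);
      break;
    case 'closed':
      // Closed hand: hold position.
      break;
  }
}

// detectGesture wraps whatever webcam hand-landmark model is used; it is a
// placeholder supplied by the caller, not part of the Matterport SDK.
function startGestureNavigation(
  mpSdk: any,
  detectGesture: () => Promise<Gesture>,
  intervalMs = 750
): () => void {
  let busy = false;
  const timer = setInterval(async () => {
    if (busy) return; // skip a tick while the previous move is still running
    busy = true;
    try {
      const gesture = await detectGesture();
      await applyGesture(mpSdk, gesture);
    } finally {
      busy = false;
    }
  }, intervalMs);
  return () => clearInterval(timer); // call the returned function to stop
}

The busy flag throttles the loop so camera moves are not queued faster than they complete, which would otherwise make the walkthrough feel jittery.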
DanSmigrod (WGAN Forum Founder and Advisor) | Atlanta, Georgia
@Tosolini Thank you for sharing. Minority Report with Tom Cruise immediately came to mind watching you hand-gesture through a Matterport space. Way cool! Happy Thanksgiving, Dan

Post 2
Tosolini (TosoliniProductions) | Bellevue, Washington
@dansmigrod Thanks! Vibe coding with the MP SDK wasn't a smooth ride. Both ChatGPT and Claude failed, but persistence with Gemini 3 paid off.

Post 3
DanSmigrod (WGAN Forum Founder and Advisor) | Atlanta, Georgia
@Tosolini Glad you persisted. Using hand gestures to "walk through" a Matterport space is exciting. Vibe coding is pretty cool; I used it to create a WGAN calculator. Happy Thanksgiving, Dan

Post 4