When people think of AR/VR, many of them think of headsets: Oculus Rift, Google Cardboard, Microsoft HoloLens. Why should AR/VR designers care about Leap Motion?
Hands are really the universal human user interface. When people try on a VR headset for the first time, they instinctively start reaching out with their hands. Because that’s how we interact with the real world. In VR, you can’t see your traditional keyboard/mouse/controller, so you really need to reimagine how to interact with digital content at a fundamental level. We believe that bare-handed tracking is the way forward because it’s instantly accessible, requires no additional hardware, and tricks your brain. You can see your real-life fingers brush against a virtual electric surface or connect a series of musical nodes, and you get transported to that other world.
What’s the easiest way for UX/UI designers to start working with the Leap Motion sensor?
We have a robust set of Unity assets that make it really easy to get started, including a set of core UI Widgets. These are really fundamental interfaces – button, slider, text, and time dial – designed so that you can quickly drop them into any project. Fire up the Widgets demo, try it out, dig into the code, and let your imagination run wild. You should also check out some of the projects that are just emerging from the 2015 3D Jam.
What creative software do you recommend UX/UI designers interested in AR/VR start learning? What design software is compatible with Leap Motion?
Unity or Unreal – either of these powerful game engines is an excellent starting point for breaking into AR/VR. Most of the assets already available for Leap Motion + VR are designed for Unity, so I would recommend that a first-time VR designer start there. We do a lot of our prototyping and design in Maya, a 3D design suite, but just about any 3D modeling package that can export to Unity will work. There are also some really cool ways of adapting paper prototyping to work for 3D interfaces.
What are some interaction design terms you hear around the Leap Motion office that might be new to a screen-based UX/UI designer?
Screen-based designers are used to virtual objects that are trapped behind screens. These can be pulled, pushed, swiped, clicked, dragged, but they really exist in a world all their own. When that world starts spilling into ours, terms from industrial design start to slip into everyday conversation. These terms may not be new to screen-based designers, but they will become second nature.
Affordance – The physical aspects of an object that suggest how it ought to be used. A teapot has a handle and a spout that suggest both how you’re supposed to grab it, and what you’re supposed to do next.
Ergonomics – Screen-based designers typically don’t have to worry about how their interfaces will affect the human body. But if you’re developing a 3D interface, user comfort is an essential consideration.
Intuitive – Let’s throw some warning signs around this word right now. Nothing is intuitive – it either builds on existing expectations, or it affords a particular use. And everything is subject to user testing.
Locomotion – Getting from one place to another is pretty straightforward in the real world. But in VR, there are no perfect solutions, only experimental hardware and software approaches. This is one aspect of VR interaction design you need to watch in the months and years ahead.
Can you recommend any blogs, books, or articles for UX/UI designers interested in applying their discipline to the AR/VR space?
The Sci-Fi Interfaces blog is one of the greatest resources I can recommend to any designer who wants to break into this space. Everyone who straps on a headset and dives into other worlds brings a wealth of cultural expectations that we’ve learned from decades of movies and TV shows. Some sci-fi interfaces are great, some are awful, but they all express how we as a society believe that the future will be interactive. How technology can make us feel powerful and in control. How everything can be at our fingertips. UX/UI designers need to be able to ride this wave of expectations or risk being pulled under. That doesn’t mean trying to achieve the Minority Report interface in real life, but identifying what about that interface inspires us.
Leap Motion and Oculus VR have both published best practices guidelines for virtual reality development. I can’t recommend these enough for avoiding common pitfalls. If you can read through both and your project meets all the guidelines, it’s already stronger than the majority of VR projects out there.
Beyond that, here are some core resources on our blog and Medium channel:
- “Our Kids Are Going to Get Really Weird” — David Holz on AR/VR in 2022
- Break Out Your Scissors: The Secret of Rapid 3D Prototyping for AR/VR & IoT
- Build-a-Button Workshop: VR Interaction Design from the Ground Up
- Designers + Geeks: Building Virtual Reality
- From Idea to Demo: Your VR Development Roadmap
- How Do Fictional UIs Influence Today’s Motion Controls?
- How to Throw a Block in WebVR
- Leap Motion VR Getting Started
- Taking Motion Control Ergonomics Beyond Minority Report
- VR Game Design with Weightless Creator Martin Schubert
- VR Interface Design and the Future of Hybrid Reality
- What Do VR Interfaces and Teapots Have in Common?
- What Would a Truly 3D Operating System Look Like?
There’s also our latest post, “Storytelling in the 4th Dimension: What’s the Future of VR Cinema?”, which came out just a few weeks ago.
What’s currently the greatest challenge to fluid hand-computer interaction in the AR/VR space?
Game engines are not currently designed with hands in mind, so physics and interactivity get really tricky. Just being able to pick up an object relies on concepts that are foreign to these engines – most of the time, the object will get knocked around or pop out from between your fingers. This requires some thoughtful design and scripting to get right.

More broadly, the biggest design challenge that you’ll face is that the sensor is always on. With a mouse or touchscreen, you always have an obvious active/inactive state, because there’s a physical object you can touch and feel. With Leap Motion, however, your hands are constantly in the frame, and there is no automatic transition from passive to active. Setting user expectations with constant dynamic visual feedback is key. See, for example, Zach Kinstner’s VR Guitar project, which uses color cues to reinforce your sense of your hands in 3D space.
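One common way to tame the always-on problem is hysteresis: require a strong signal to enter the “active” state and a much weaker one to leave it, so the interface doesn’t flicker when the hand hovers near the boundary. The sketch below is illustrative only – the thresholds and per-frame pinch values are hypothetical, standing in for the normalized pinch strength (0.0–1.0) that hand-tracking APIs typically report; it is not Leap Motion’s actual implementation.

```python
# Illustrative sketch: treating an always-on hand sensor as "active"
# only after a deliberate pinch, with hysteresis to prevent flicker.
# Thresholds and input values are hypothetical examples.

ENTER_THRESHOLD = 0.8  # pinch strength required to become active
EXIT_THRESHOLD = 0.4   # pinch strength below which we return to passive

class PinchState:
    def __init__(self):
        self.active = False

    def update(self, pinch_strength):
        """Feed one frame of normalized pinch strength (0.0-1.0).

        Returns the current state. UI code would pair this with
        continuous visual feedback, e.g. a cursor that brightens as
        pinch_strength approaches the activation threshold.
        """
        if not self.active and pinch_strength >= ENTER_THRESHOLD:
            self.active = True
        elif self.active and pinch_strength <= EXIT_THRESHOLD:
            self.active = False
        return self.active

# A stream of per-frame pinch strengths: ramp up, hover near the
# boundary, then release. Hysteresis keeps the mid-range frames stable.
state = PinchState()
frames = [0.1, 0.5, 0.85, 0.7, 0.6, 0.3, 0.1]
print([state.update(s) for s in frames])
# → [False, False, True, True, True, False, False]
```

The key design choice is the gap between the two thresholds: values between 0.4 and 0.8 never change the state on their own, which is exactly the “no automatic transition” behavior a physical button gives you for free.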
Can you talk about Leap Motion’s internal approach to UX/UI design?