Use Your Hands! An interview with Alex Colgan, VR Community Lead at Leap Motion



Alex Colgan is VR Community Lead at Leap Motion. This month, he answers questions from Designation AR/VR’s Aaron Faucher about the importance of hand interactions in the future of UX/UI design.

When people think AR/VR, many of them think about headwear: Oculus Rift, Google Cardboard, Microsoft Hololens. Why should AR/VR designers care about Leap Motion?

Hands are really the universal human user interface. When people try on a VR headset for the first time, they instinctively start reaching out with their hands. Because that’s how we interact with the real world. In VR, you can’t see your traditional keyboard/mouse/controller, so you really need to reimagine how to interact with digital content at a fundamental level. We believe that bare-handed tracking is the way forward because it’s instantly accessible, requires no additional hardware, and tricks your brain. You can see your real-life fingers brush against a virtual electric surface or connect a series of musical nodes, and you get transported to that other world.


What’s the easiest way for UX/UI designers to start working with the Leap Motion sensor?

We have a robust set of Unity assets that make it really easy to get started, including a set of core UI Widgets. These are really fundamental interfaces – button, slider, text, and time dial – designed so that you can quickly drop them into any project. Fire up the Widgets demo, try it out, dig into the code, and let your imagination run wild. You should also check out some of the projects that are just emerging from the 2015 3D Jam.

The Leap Motion sensor uses infrared cameras to translate real-world hands into virtual space. (Source: Leap Motion)

What creative software do you recommend UX/UI designers interested in AR/VR start learning? What design software is compatible with Leap Motion?

Unity or Unreal – either of these powerful game engines is an excellent starting point for breaking into AR/VR. Most of the assets already available for Leap Motion + VR are designed for Unity, so I would recommend that a first-time VR designer start there. We do a lot of our prototyping and design using Maya, a 3D design suite, but just about any 3D modelling tool that can export to Unity will work. There are also some really cool ways of adapting paper prototyping to work for 3D interfaces.

Until now, VR has relied on game controllers or head motions to interact with objects in virtual space. By mounting the Leap Motion sensor on a VR headset like Oculus Rift, users can reach out and touch virtual worlds. (Source: RoadToVR)

What are some interaction design terms you hear around the Leap Motion office that might be new to a screen-based UX/UI designer?

Screen-based designers are used to virtual objects that are trapped behind screens. These can be pulled, pushed, swiped, clicked, dragged, but they really exist in a world all their own. When that world starts spilling into ours, terms from industrial design start to slip into everyday conversation. These terms may not be new to screen-based designers, but they will become second nature.

Affordance – The physical aspects of an object that suggest how it ought to be used. A teapot has a handle and a spout that suggest both how you’re supposed to grab it, and what you’re supposed to do next.

Ergonomics – Screen-based designers typically don’t have to worry about how their interfaces will affect the human body. But if you’re developing a 3D interface, user comfort is an essential consideration.

Intuitive – Let’s throw some warning signs around this word right now. Nothing is intuitive – it either builds on existing expectations, or it affords a particular use. And everything is subject to user testing.

Locomotion – Getting from one place to another is pretty straightforward in the real world. But in VR, there are no perfect solutions, only experimental hardware and software approaches. This is one aspect of VR interaction design you need to watch in the months and years ahead.

Can you recommend any blogs, books, or articles for UX/UI designers interested in applying their discipline to the AR/VR space?

The Sci-Fi Interfaces blog is one of the greatest resources I can recommend to any designer who wants to break into this space. Everyone who straps on a headset and dives into other worlds brings a wealth of cultural expectations that we’ve learned from decades of movies and TV shows. Some sci-fi interfaces are great, some are awful, but they all express how we as a society believe that the future will be interactive. How technology can make us feel powerful and in control. How everything can be at our fingertips. UX/UI designers need to be able to ride this wave of expectations or risk being pulled under. That doesn’t mean trying to achieve the Minority Report interface in real life, but identifying what about that interface inspires us.

Leap Motion and Oculus VR have both published best practices guidelines for virtual reality development [1] [2]. I can’t recommend these enough to avoid common pitfalls. If you can read through both, and your project meets all the guidelines, it’s already stronger than the majority of VR projects out there.

Beyond that, there are some core resources on our blog and Medium channel, including our latest post, Storytelling in the 4th Dimension: What’s the Future of VR Cinema?, which came out just a few weeks ago.

The Leap Motion controller allows for new interface possibilities in an AR/VR context. (Source: Leap Motion)

What’s currently the greatest challenge to fluid hand-computer interaction in the AR/VR space?

Game engines are not currently designed with hands in mind, so physics and interactivity get really tricky. Just picking up an object relies on concepts that are foreign to those engines – most of the time, it gets knocked around or pops out from your fingers. It takes some thoughtful design and scripting to get this right. More broadly, the biggest design challenge you’ll face is that the sensor is always on. With a mouse or touchscreen, you always have an obvious active/inactive state, because there’s a physical object you can touch and feel. With Leap Motion, however, your hands are constantly in the frame, and there is no automatic transition from passive to active. Setting user expectations with constant dynamic visual feedback is key. See, for example, Zach Kinstner’s VR Guitar project, which uses color cues to reinforce your sense of hands in 3D space.
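One common way to design around that missing passive-to-active transition is a small state machine with hysteresis: a continuous pinch or grab value (many hand trackers report something like a 0.0–1.0 strength) is mapped onto discrete passive / hover / active states, with separate enter and exit thresholds so the state doesn’t flicker near a single cutoff. The sketch below is a hypothetical illustration of that pattern, not Leap Motion API code; the class name and threshold values are invented for the example.

```python
PASSIVE, HOVER, ACTIVE = "passive", "hover", "active"

class PinchStateMachine:
    """Map a continuous 0..1 pinch strength to discrete UI states.

    Separate enter/exit thresholds (hysteresis) prevent flicker when the
    value oscillates near a single cutoff, and the intermediate HOVER
    state gives room for the "constant dynamic visual feedback" that
    tells the user how close they are to activating something.
    """

    def __init__(self, hover_on=0.3, active_on=0.8, active_off=0.6, hover_off=0.2):
        self.state = PASSIVE
        self.hover_on, self.active_on = hover_on, active_on
        self.active_off, self.hover_off = active_off, hover_off

    def update(self, strength):
        # Passive -> hover once the hand starts closing.
        if self.state == PASSIVE and strength >= self.hover_on:
            self.state = HOVER
        if self.state == HOVER:
            if strength >= self.active_on:      # hover -> active on a firm pinch
                self.state = ACTIVE
            elif strength < self.hover_off:     # hover -> passive on release
                self.state = PASSIVE
        elif self.state == ACTIVE and strength < self.active_off:
            self.state = HOVER                  # active -> hover as the pinch relaxes
        return self.state

# Simulated per-frame pinch strengths: approach, pinch, release.
sm = PinchStateMachine()
print([sm.update(s) for s in (0.1, 0.4, 0.7, 0.9, 0.7, 0.5, 0.1)])
# -> ['passive', 'hover', 'hover', 'active', 'active', 'hover', 'passive']
```

The gap between `active_on` (0.8) and `active_off` (0.6) is what keeps a trembling hand from rapidly toggling a button; the hover state in between is where you would render the color or glow cues the answer above describes.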

Can you talk about Leap Motion’s internal approach to UX/UI design?

We’re strong believers in the power of rapid prototyping and iteration. Paper prototypes, rapid 3D models in Maya, quick-and-dirty JavaScript demos. These save a lot of time and energy, not just by making it obvious whether something will work, but also by communicating ideas to your fellow team members. After that, it’s user testing, user testing, user testing. I seriously can’t stress enough how much we value user testing as part of our internal development process. In fact, we wrote a big blog post about it. The one thing to bear in mind is that you don’t need a fancy setup to do proper user testing – just a few pieces of off-the-shelf hardware and a few neighbors who will give you honest feedback.
