
Beyond Touch: The Future of Natural UI Design

Advances in touch-based computing open up a host of new ways for users to interact with apps. Developers will need to keep up and be keenly aware of newer natural UI design methods, such as voice, gesture and even neural input, says InterKnowlogy's Tim Huckaby.

First there was nothing but a monochromatic screen and a keyboard. Then along came the mouse, and developers actually had to remove a hand from the keyboard. Then came the touch interface, with its swipes and sweeps. Still other ways to interact with and control technology are coming down the pike. Tim Huckaby opened the Tuesday, November 19 keynote session at Live! 360, held at the Loews Royal Pacific Resort in Orlando, Florida.

His keynote address, titled "User Interaction Design in a NUI World," gave attendees a fascinating glimpse of some of the user interaction possibilities coming in the next few years. "As a culture, we are changing," Huckaby says. "Even two-year-olds know how to use an iPad."

Huckaby says he has always been fascinated by user interaction design, the way users interact with digital systems through touch, gesture, voice commands or even neural input. "We are a decade away from a truly neural-based interface," he says. 

He demonstrated a couple of 3D apps developed by his company InterKnowlogy, and walked through a few of the milestones in the technology marketplace that have advanced the science of the natural interface, among them January 27, 2010, the date Apple unveiled the first iPad. "These people paved the way for touch-based computing."

As hardware gets more capable, faster and less expensive, the potential for more advanced interface development grows, according to Huckaby. He then laid out his Seven Tenets of Engaging User Experience:

  • Use the Power of Faces
  • Use Food, Sex and Danger
  • Tell a Story
  • Build Commitment Over Time
  • Use Natural Interactions
  • "Game-ification"
  • Make it Intuitive and Easy

For that last tenet, he discussed what he calls the Grandma Huckaby factor: if his grandmother can use it, then he's done his job. He gave a brief description of each of his tenets, then concluded with a prediction for the crowd of developers, saying, "If you're not doing it now, you will be soon."

Huckaby's next demonstration was the CNN magic wall, which his company developed. This interactive screen displayed election results during the recent midterm elections; Huckaby said the app processed 2GB to 3GB of data per second as the results were tallied.

Later in the presentation, Huckaby was joined by Philip Japikse, another veteran speaker at Live! 360 events. The two demonstrated how to use and display animation in ways users will find acceptable. "We don't live in Hogwarts, so when things just appear, people can be freaked out," says Japikse. They discussed design factors for animations that are gentle on the user, moving elements from one position to another with fluidity.
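
To make that idea concrete, here is a minimal sketch of that kind of fluid movement: easing an element between two positions instead of letting it simply appear. It's written in TypeScript against the browser DOM; the "panel" element, duration and easing curve are illustrative assumptions, not code from the session.

    // Illustrative sketch: glide an element between two positions with an
    // ease-in-out curve instead of letting it "just appear" at the target.
    function easeInOutCubic(t: number): number {
      // Starts and ends gently, accelerating through the middle.
      return t < 0.5 ? 4 * t * t * t : 1 - Math.pow(-2 * t + 2, 3) / 2;
    }

    function slideTo(el: HTMLElement, fromX: number, toX: number, ms: number): void {
      const start = performance.now();
      function frame(now: number): void {
        const t = Math.min((now - start) / ms, 1);            // normalized time, 0..1
        const x = fromX + (toX - fromX) * easeInOutCubic(t);  // eased position
        el.style.transform = `translateX(${x}px)`;
        if (t < 1) requestAnimationFrame(frame);              // keep going until done
      }
      requestAnimationFrame(frame);
    }

    // Usage: glide a (hypothetical) panel 300 pixels to the right over 400ms.
    const panel = document.getElementById("panel");
    if (panel) slideTo(panel, 0, 300, 400);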

The final user interface and experience demonstration was the most impressive. Using a Microsoft Kinect 3D camera, Huckaby operated an app with hand motions and voice commands. The avatar that represented his motions moved smoothly around the screen and from icon to icon. "It's an amazing camera," he says. "The camera sees so well, it can look at your cheek and determine your heartbeat." Huckaby discussed how this could be particularly effective in a sterile environment such as an operating room or other medical setting, and how developers can expect to see more of these types of input devices and methods as both the hardware and software evolve.
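
Part of what makes a camera-tracked cursor move smoothly is filtering the jitter in the per-frame readings. As a rough illustration only, using hypothetical types rather than the actual Kinect SDK, one common approach is an exponential moving average over the raw hand positions:

    // Illustrative sketch: damp per-frame jitter from a tracking camera by
    // blending each raw sample into a running average (hypothetical types;
    // this is not the Kinect SDK).
    interface Point { x: number; y: number; }

    class SmoothedCursor {
      private current: Point | null = null;

      // alpha in (0, 1]: higher tracks the hand tightly, lower is smoother.
      constructor(private readonly alpha: number = 0.3) {}

      update(raw: Point): Point {
        if (this.current === null) {
          this.current = { ...raw };  // first sample: nothing to blend yet
        } else {
          this.current.x += this.alpha * (raw.x - this.current.x);
          this.current.y += this.alpha * (raw.y - this.current.y);
        }
        return this.current;
      }
    }

    // Usage: feed each frame's raw hand position through the filter and
    // draw the on-screen cursor at the smoothed point.
    const cursor = new SmoothedCursor(0.25);
    console.log(cursor.update({ x: 412, y: 230 }));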

Live! 360 wraps up 1105 Media's 2014 event calendar. For the 2015 event schedule, go to live360events.com, vslive.com and techmentorevents.com.

About the Author

Lafe Low is the editorial liaison for ECG Events.
