Beyond Touch: Developing for the Future of Natural UI Design

Advances in touch-based computing open up a host of new ways for users to interact with apps. Developers will need to keep up and be keenly aware of newer natural UI design methods, such as voice, gesture, and even neural inputs, says InterKnowlogy's Tim Huckaby.

First there was nothing but a monochromatic screen and a keyboard. Then along came the mouse, and developers had to actually remove a hand from the keyboard. Then came the touch interface, with its swipe and sweep. Still other ways to interact with and control technology are coming down the pike. Tim Huckaby delivered the Tuesday, November 19 keynote session at Live! 360, held at the Loews Royal Pacific Resort in Orlando, Florida.

His keynote address, entitled "User Interaction Design in a NUI World," gave attendees a fascinating glimpse at some of the user interaction possibilities coming in the next few years. "As a culture, we are changing," Huckaby says. "Even two-year-olds know how to use an iPad."

Huckaby says he has always been fascinated by user interaction design, the way users interact with digital systems through touch, gesture, voice commands or even neural input. "We are a decade away from a truly neural-based interface," he says. 

He demonstrated a couple of 3D apps developed by his company, InterKnowlogy, and walked through a couple of milestones in the technology marketplace that have advanced the science of the natural interface. One of those milestones came on January 27, 2010, when Apple unveiled the first iPad. "These people paved the way for touch-based computing."

As hardware gets more capable, faster and less expensive, the potential for more advanced interface development grows, according to Huckaby. He then laid out his Seven Tenets of Engaging User Experience.

  • Use the Power of Faces
  • Use Food, Sex and Danger
  • Tell a Story
  • Build Commitment Over Time
  • Use Natural Interactions
  • "Game-ification"
  • Make it Intuitive and Easy

For that last tenet, he discussed what he calls the Grandma Huckaby factor -- if his grandmother could use it, then he's done his job. He gave a brief description of each of his tenets, then concluded with a prediction for the crowd of developers, saying, "If you're not doing it now, you will be soon."

Huckaby's next demonstration was the CNN magic wall, which his company developed. This interactive screen displayed election results data during the recent midterm elections. Huckaby said the app processed 2GB to 3GB of data per second as the election results were tallied.

Later in the presentation, Huckaby was joined by Philip Japikse, another veteran speaker at Live! 360 events. The two demonstrated how to use and display animation in ways that are more acceptable to the user. "We don't live in Hogwarts, so when things just appear, people can be freaked out," says Japikse. They discussed design factors for presenting animations that are gentle on the user, moving elements from one position to another with fluidity.

The final user interface and experience demonstration was the most impressive. Using a Microsoft Kinect 3D camera, Huckaby operated an app with hand motions and voice commands. The avatar that represented his motions moved smoothly around the screen and from icon to icon. "It's an amazing camera," he says. "The camera sees so well, it can look at your cheek and determine your heartbeat." Huckaby discussed how this could be particularly effective in a sterile environment such as an operating room or other medical setting, and how you can expect to see more of these types of input devices and methods as both the hardware and software evolve.

Live! 360 wraps up 1105 Media's 2014 event calendar. For the 2015 event schedule, go to live360events.com, vslive.com and techmentorevents.com.

About the Author

Lafe Low is the editorial liaison for ECG Events.
