Computer Vision, AI Help Devs Build the New Experience
Computer vision, artificial intelligence (AI) and machine learning may all sound like far-off technologies, but Tim Huckaby showed how developers can work these into their applications today.
During his Wednesday keynote address at Visual Studio Live! in Anaheim, titled "The Future of Technology Seen through the Eyes of Computer Vision," Huckaby presented a lively series of video clips, examples, and demos of technologies that are starting to find their way into common usage across a variety of industries and use cases.
"We're going to talk about the future and demonstrate what we're doing now," says Huckaby, the founder and executive chairman of InterKnowlogy, Actus, and VSBLTY. "It's crazy cool and powerful." Even so, he admits the ramifications do occasionally border on "big brother."
Nevertheless, he says this type of technology is rapidly becoming more practical and accessible. "You're going to be doing this," Huckaby tells the assembled developers. "It's just a matter of time. And it's going to be easy."
He discussed virtual reality and its place in entertainment software. What he is involved in is similar, but an evolution of that. "What we do is augmented reality; what Microsoft calls 'mixed reality,'" he says. "You can see the world around you, there are just artificial objects in there. And some of these objects can be interactive holograms."
Machine learning and AI can be difficult to define, he says, but the most interesting and compelling aspect of machine learning is its ability to find anomalies. Describing the recent tragic shooting in Las Vegas, he explains how more capable security cameras might have been able to detect what led up to that event. "We're heading into a new world of security," he says.
Next he moved on to the first of several brief videos he used to support his presentation. This first two-minute video was created to present a concept for a Hollywood movie. "People are counting on us to have that white and black spaceship appear in front of you," says Huckaby, describing one of the effects in the short preview, "then transition back to the screen. They have to make movies interactive and have to go 3D in a way that's not annoying. And they're counting on us to make that happen."
Huckaby showed another movie preview video that depicted an actor wearing augmented reality (AR) glasses. "We are building the tools to help you guys build those AR experiences easily," he says. That second video clip also showed the actor approaching a Tesla vehicle, which Huckaby says will someday incorporate touch screens and other levels of display.
Huckaby then discussed the increasing capabilities of both hardware and software in terms of Moore's law, the observation that transistor counts, and with them computing capacity, essentially double every couple of years. "People say Moore's Law has slowed down," he says. "I say it has not. And the implications are unbelievable ... Why am I telling you this? All this machine learning and AI and cloud and everything else gets easier and better."
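As a rough illustration of the compounding Huckaby is pointing to (the numbers here are generic, not his), repeated doubling adds up quickly:

```python
# Compounding doubling, Moore's-law style: capability after n
# doubling periods is 2**n times the starting baseline.
def growth(periods: int) -> int:
    """Return the multiplier after `periods` doublings."""
    return 2 ** periods

# Ten doubling periods -- roughly two decades at one doubling
# every two years -- is about a thousandfold increase.
print(growth(10))  # -> 1024
```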
Development teams are working on making these capabilities more accessible. "Machines are calculating faster and faster, but we still need humans to create algorithms for this stuff," he says. And those human capabilities are increasing along with the technology. He explains how Microsoft has teams of developers working on APIs to deliver AR capabilities to applications.
His next video helped put that idea into context. He showed a video of an engineer wearing AR glasses. "That was almost two years ago. That's an engineer that works at Microsoft and he is blind," he says. "All he can see through those glasses is about 2,000 objects. That's nothing for someone who is blind. But a couple years before that, they could only see 200 objects [through the glasses]. Today we're close to an unlimited number of classifiers. And that's because of machine learning and because of so much power in the cloud."
With this level of power, says Huckaby, comes a commensurate level of responsibility. He tells of how Microsoft CEO Satya Nadella speaks of the collective responsibility to use technology in a way that helps the world.
Huckaby then moved on to talk about facial recognition technologies already being used in retail environments. "The coolest implementation is freezers in grocery and convenience stores," he says. "The screens run engaging digital content on touch screens, and use the technology to 'see' the humans [viewing the content]. If you see white people drinking champagne, and the system sees you are Asian, it will change the image to Asian people drinking champagne."
And that technology has possibilities far beyond retail, explains Huckaby. "If an Amber Alert kid walks in front of that, or someone off the FBI's most wanted list—boom. It will get you."
Developers can already build this type of capability into apps and systems they're developing. "This source code is available to you," he says. "You can take this technology right now and build facial recognition for your company."
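To give a sense of how accessible this is, here is a minimal sketch of constructing a request to a cloud face-detection service, modeled on the Azure Cognitive Services Face API's v1.0 `detect` operation. The endpoint, key, and image URL below are placeholders, and the request is only built, not sent:

```python
# Sketch: building a face-detection request against a cloud API
# (modeled on the Azure Face API v1.0 detect operation).
# Endpoint, key, and image URL are hypothetical placeholders.
import json
from urllib import parse, request


def build_detect_request(endpoint, subscription_key, image_url,
                         attributes=("age", "gender")):
    """Construct (without sending) a POST request asking the
    service to detect faces in the image at `image_url` and
    return the listed face attributes."""
    query = parse.urlencode({
        "returnFaceId": "true",
        "returnFaceAttributes": ",".join(attributes),
    })
    url = f"{endpoint}/face/v1.0/detect?{query}"
    body = json.dumps({"url": image_url}).encode("utf-8")
    return request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Ocp-Apim-Subscription-Key": subscription_key,
        },
        method="POST",
    )


req = build_detect_request(
    "https://westus.api.cognitive.microsoft.com",  # placeholder region
    "YOUR-KEY-HERE",                               # placeholder key
    "https://example.com/face.jpg",                # placeholder image
)
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` (given a real key) would return a JSON array of detected faces with the requested attributes.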
The demonstrations and the video clips he presented show where technology is going in terms of computer vision. "We are building holodeck-like experiences right now [with the HoloLens]," he says. "There's spatial awareness. It maps the room around it, and it's just a Windows device. And it's just going to get better."
The next Visual Studio Live! event will be part of Live! 360, Nov. 12-17 at the Royal Pacific Resort in Orlando, Fla.
Lafe Low is the editorial liaison for ECG Events.