Q&A
The Science Behind User Interface Design
User interface design expert Billy Hollis is annoyed whenever he spots a design flaw -- even a tiny one where a small tweak could make an application more intuitive for users. He finds them everywhere, even in our favorite IDE, Visual Studio (see more on that below).
He believes UI design is more than just creating visually appealing screens -- it's about understanding the science of how users perceive and interact with apps. Developers and designers who grasp the principles of the human visual system can create interfaces that are not only functional but also intuitive and engaging.
That notion drives his upcoming session titled "What Do Your Users Really See: The Science Behind User Interface Design" at the big Visual Studio Live! developer conference coming to Las Vegas in March.
Led by Hollis -- a developer, designer, speaker, and author with a reputation for insightful talks -- this introductory-level session delves into the scientific principles that influence how users process visual information on their screens. Topics include Gestalt principles for grouping and highlighting, the phenomena of inattentional and change blindness, and how users scan and navigate views. The session promises practical takeaways for creating clearer, more effective interfaces, all grounded in cognitive science.
Hollis is the right man for the job, as he runs a consulting practice in Nashville where he and his team focus on UX, advanced UI development, rules-based architectures, and more. He teaches design classes for UX and technical classes on XAML for the Universal Windows Platform and XAML for WPF.
He said attendees will explore real-world examples of poorly designed screens, learn about common pitfalls, and discover ways to leverage new UI technologies to enhance user experiences. The session will also include hands-on tests to help participants understand how they personally perceive visual information, offering fresh perspectives on design challenges.
It's part of the "Developing New Experiences" track at VSLive!, running from March 10-14 at the Paris Las Vegas Hotel & Casino. Hollis said it's an excellent opportunity for developers and designers to gain actionable insights into crafting user-friendly interfaces that respect how users truly see and think about their screens.
Attendees are promised they will:
- Learn how the science behind the human visual system affects user perception of app screens
- See examples of poorly designed screens that don't respect how the visual system works
- Find out ways to apply new UI technologies to leverage how users see screens
We caught up with Hollis to discuss the science behind UI design, how you can apply these principles in your own projects, and how to learn more about the subject and prepare for the session.
VisualStudioMagazine: What inspired you to present a session on this topic?
Hollis: Developers and designers sometimes clash because designers tend to be intuitive, and developers tend to be analytical. Developers need reasons for things.
When it comes to what works in design, there is actually science behind most of it, and that resonates with developers. In the design world, we refer to design principles, many of which are based in the science of the human visual system, and in our cognitive limitations.
"I designed this session to present some of that science, so that developers and other non-designers can themselves begin to appreciate why certain designs work well and others don't."
Billy Hollis, Developer, Designer, Speaker, Author
I designed this session to present some of that science, so that developers and other non-designers can themselves begin to appreciate why certain designs work well and others don't. These principles will also help them avoid some common mistakes when designing views and web pages, even in the absence of any formal design effort.
You're going to discuss the Gestalt principles and how they aid in creating user-friendly interfaces. Can you name just one of these principles that is most often overlooked in modern UI design?
I think Gestalt Proximity is most often overlooked, to a great extent because of the tendency for apps to end up with crowded screens. Gestalt Proximity says that when things are grouped together, the visual system automatically considers them related. When a screen is crowded, it's often necessary to cram things together, with no spacing to establish related groups. If related things are not close together or are grouped in with unrelated items, the user may not perceive them as related.
I even see this design principle violation in Visual Studio. For example, when specifying a color, the three RGB values are related in a way that the alpha channel for transparency is not. Yet they're just all stacked together, as this screen shot shows. This would be a more intuitive design if there were some separation between the bottom of the RGB values and the alpha channel (labeled A) value. It's a small issue, but a hallmark of good design is to get the small things right.
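In XAML for WPF -- one of the technologies Hollis teaches -- the separation he describes comes down to layout spacing. The following is a minimal, hypothetical sketch (the binding names `Red`, `Green`, `Blue`, and `Alpha` are illustrative, not from any real color editor): extra top margin on the alpha row gives the visual system the Gestalt-proximity cue that it is distinct from the RGB group.

```xml
<!-- Hypothetical WPF color-editor layout illustrating Gestalt proximity.
     The RGB rows sit close together; the alpha row gets extra margin. -->
<StackPanel Width="160">
    <DockPanel Margin="0,2">
        <Label Content="R" Width="24"/>
        <TextBox Text="{Binding Red}"/>
    </DockPanel>
    <DockPanel Margin="0,2">
        <Label Content="G" Width="24"/>
        <TextBox Text="{Binding Green}"/>
    </DockPanel>
    <DockPanel Margin="0,2">
        <Label Content="B" Width="24"/>
        <TextBox Text="{Binding Blue}"/>
    </DockPanel>
    <!-- Larger top margin separates alpha from the RGB group -->
    <DockPanel Margin="0,14,0,2">
        <Label Content="A" Width="24"/>
        <TextBox Text="{Binding Alpha}"/>
    </DockPanel>
</StackPanel>
```

A dozen extra pixels of margin is all it takes for users to perceive the RGB values as one related group and the alpha channel as something separate.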
Can you provide one or two examples of how these mistakes negatively impact user experience and engagement?
A few months ago, I made a credit card payment to the wrong card. I have two cards with that company. In the accounts screen, everything is rather jammed together, and the button for the second card was too close to the information for the first card. Unwinding the mistake took time and was frustrating for me. A relatively small amount of separation would have prevented that mistake.
Another puzzling mistake is the Start screen for Windows 11. In Windows 10, tiles to launch apps could be various sizes, and they could be separated into groups. In Windows 11, icons to launch apps are just all dumped into a screen, identically sized, and with no ability to group them. Here are the two designs side-by-side. I don't really understand why the Windows team replaced a design that worked fine with one that wasn't nearly as good.
Since I use a Windows 10 machine most of the time, I find using Windows 11 on my laptop rage-inducing. It's simply harder to find and use the apps and documents I want to use. The Windows 10 Start screen respected the Gestalt principles of proximity and similarity, and the Windows 11 Start screen does not. It has no structure to help the visual system navigate to what I need.
For developers and designers already familiar with basic UX principles, what advanced concepts from this session would push their skills further?
This session will get into cognitive limitations in memory and attention span, and discuss ways to relieve the user of having to observe and remember so much. For example, I'll talk about how current business apps often require far too much attention and memory to handle common business workflows, and how better design can overcome that problem. Such designs can speed up users dramatically, slash the amount of training required, and reduce the number of mistakes they make.
I consider this an advanced topic because neither designers nor developers tend to do a good job in designing for workflow, even though it has enormous value.
What resources would you recommend for developers to get up to speed with the science behind UI design and prepare for your session?
I start pretty much from ground zero, so attendees can come with no preparation. But if the subject really excites them, I have a couple of books that I highly recommend.
The go-to book for understanding how the visual and cognitive systems work, and how that affects UX design, is "Designing with the Mind in Mind" by neurobiologist Dr. Jeff Johnson. Another well-known book in the field is "Don't Make Me Think" by Steve Krug.
Note: Those wishing to attend the conference can save hundreds of dollars by registering early, according to the event's pricing page. "Save $400 when you Register by the Super Early Bird savings deadline of Jan. 17," said the organizer of the event, which is presented by the parent company of Visual Studio Magazine.
About the Author
David Ramel is an editor and writer at Converge 360.