Fast Company: What does UI design look like after screens go away? Fjord’s Andy Goodman explains.
For better or worse, a large amount of design work these days is visual. That makes sense, since the most essential products we interact with have screens. But as the internet of things surrounds us with devices that can hear our words, anticipate our needs, and sense our gestures, what does that mean for the future of design, especially as those screens go away?
Last week at San Francisco’s SOLID Conference, Andy Goodman, group director of Fjord, shared his vision of a new design paradigm in which our interfaces are no longer constrained by screens and instead become haptic, automated, and ambient. He calls it Zero UI. We talked to him about what it means.
Zero UI isn’t really a new idea. If you’ve ever used an Amazon Echo, changed a channel by waving at a Microsoft Kinect, or set up a Nest thermostat, you’ve already used a device that could be considered part of Goodman’s Zero UI thinking. It’s all about getting away from the touchscreen and interfacing with the devices around us in more natural ways: haptics, computer vision, voice control, and artificial intelligence. Zero UI is the design component of all these technologies as they pertain to what we call the internet of things.
“If you look at the history of computing, starting with the Jacquard loom in 1801, humans have always had to interact with machines in a really abstract, complex way,” Goodman says.
Over time, these methods have become less complex: the punch card gave way to machine code, machine code to the command line, command line to the GUI. But machines still force us to come to them on their terms, speaking their language. The next step is for machines to finally understand us on our own terms, in our own natural words, behaviors, and gestures. That’s what Zero UI is all about.
Read the full article on Fast Company.