Alexa, order me a dollhouse
A new year, and I’m already super nostalgic.
Looking back to the halcyon days of 2016, one of the Fjord trends was “Taking Things Off the Thinking List”, where we asserted that minimizing the burden of choice and automating certain decisions for people would continue to become the norm. Anticipatory services that read your mind, sending you more shampoo before you even notice you are running low?
Love it. Do it. Sign me up yesterday. Dreams really do come true!
The 2017 Trend “Unintended Consequences” has a direct and sobering follow-on connection to that utopia, however. We’ve a collection of cautionary tales on this front, and the latest is a whopper – a news anchor in San Diego recently ordered a bunch of dollhouses from his viewers’ Amazon Echo devices, on air. Oops.
Poor Alexa in the above example didn’t know who was talking to her, or that he was on a newscast; she was all muddled up. Without the right context, you see, the story often goes quite wrong!
Indeed, everything in our lives is driven by our context, the framework surrounding our personal story and social experience. We write our own life stories within this merely by trudging along, forming opinions which fuel our decision-making. The act of making a decision itself is a narrative act, and increasingly it’s the algorithms shaping the narrative for us. When they do so without our full bidding, they are chipping away at our power.
Many of these decisions are minor, inconsequential (what socks shall I wear today?), while others are obviously massive (whom should I vote for?). With two extreme examples such as these, it would seem obvious which ones we’d wish to outsource.
This past year has shown us that, in fact, we must take a very close look at the mechanisms of these decisions, because this outsourcing is happening in ways people don’t understand. In these circumstances, people’s reality can be hijacked, their context infected, their experience refracted and made opaque. Often they are none the wiser that there’s an algorithm manipulating their experience, engineering the distribution of these unfiltered messages; the lack of accountability and transparency, compounded by the speed and scale of distribution, makes the problem worse. Enter the world of “post-truth”.
Nowhere is this truer than on social media. Twitter’s algorithmic timeline and the inscrutable, ever-shifting Facebook feed are disrupting our sense of linear temporality by replacing time, our most basic human benchmark (is nothing sacred!?), with their own computational notions of what I should see first, optimized for profit.
For me, the experience of parsing this content is beyond confusing. I no longer inhabit a legitimate narrative context in these spaces; meaning collapses; I float around on a wave of nonsense.
Hillary Clinton is running a child sex ring out of the basement of a pizza restaurant? Oh look, my god-daughter turned six. I’ll never guess what happened when this bobcat befriended an actual porcupine!
There are remedies, ways to regain a sense of order. Twitter lets you turn off the algorithmic timeline, though many people won’t find their way to that functionality. Facebook, reacting to one part of the problem, is hiring veteran newswoman Campbell Brown as media liaison to combat fake news. Well, I say: great. Keep going.
As designers, our primary obligation – whether we are designing brave chatbots, motherly robots, edgy smart buildings or practical mega-applications – should be to serve the user by creating opportunities for meaningful narrative, a positive experience: a utility, benefit, delight or even a joy. Human-centred design demands it, and requires that going forward we empower users with the proper context to engage in discernment and the crafting of their own story.
We do this by listening to their needs and understanding the real requirements, without making assumptions about boundaries and empowerment – and by really, really thinking through what might happen if our delegated decision-making logic, god forbid, fails us.
Image by Kevin Burkett from Philadelphia, Pa., USA