Soon, AI will be used as the scapegoat for anything that doesn’t work
Following on from my last two posts, I’ve continued to ruminate on instances where inflexible rules (often compounded by unnecessary complexity) created problematic circumstances that could only be solved by humans with empathy and empowerment.
I loved the @mildlyamused tweet that Stefan referred to in his post, where Donna talked about “arguing with a motion sensor about whether or not your hands are in the sink”. As we continue to implement inflexibility and poor design in software, we’re going to see more and more of this.
It’s easy to blame the software or firmware or hardware. All designed and implemented by us. Soon we’re going to blame “AI” for everything that doesn’t work, an ironic echo of the definition of AI that Douglas Hofstadter attributed to Larry Tesler, sometimes called Tesler’s Theorem: AI is “whatever hasn’t been done yet”.
Many years ago, around 1986 or 1987, I’d planned a day out playing golf with a bunch of colleagues. We’d booked a tee time at Gatton Manor, near Ockley, which at the time prided itself on having more holes with water hazards than any other course; it also had the longest Championship par 5 in the country, the 16th I think, at 625 yards. (The course has been shut for development for many years; I hear it’s reopened, and I plan to visit soon).
Anyway, back to my story. I’ve never driven in my life, as in driven a motor vehicle. (After seeing me play off the tee, the unkind amongst you would say that the distinction is a tautology).
Since I didn’t drive, I’d arranged to meet the rest of the four-ball at Camberley station. Why Camberley? Because they lived there. So I wandered down to Windsor and Eton Riverside station at the crack of dawn and asked for a return to Camberley, changing at Staines. The man at the ticket counter asked me when I’d be returning; when I told him, he issued me a Cheap Day Return, and I went to the platform in a state of anticipatory bliss. (I’m like that, many small things fill me with joy).
Now you can’t get to Camberley without passing through Ascot and Bagshot first. When I got to Bagshot, I was shaken out of my early-morning reverie by the raucous sounds emanating from the car park there. My colleagues and golf partners. Trying, urgently, to attract my attention. Which they succeeded at. So I got off the train, one stop “early”, accompanied by my clubs and bag.
Even at that unearthly time, there was a ticket inspector at the exit. I gave him my ticket.
“This ticket’s not valid. You can’t get off here.” “Why ever not? I’m paid up to the next stop, and this is a valid ‘alighting’ stop, isn’t it?” “Yes, but not on this ticket. Cheap Day Returns from Windsor cannot be used at this time to get to Bagshot, but are valid from Camberley onwards.”
Heaven help us.
I know I’ve written about this before, probably over a decade ago. I gave him my card, told him he could write to fine me or whatever he had to do, and barged past him to meet my friends, reverie now no longer as blissful as before. (British Rail actually wrote to me to explain the rules and to “let me off” grudgingly just this once).

Hyperlinks subvert hierarchies
Now all this was dreamt up by people and imposed on people in a very analogue way; as more and more of what we do goes digital, we’re going to find more things that go wrong. That doesn’t mean we shouldn’t be digitising processes. What it does mean is that process design in a digital world really needs to understand the concept of leeway.
David Weinberger, writing in The Cluetrain Manifesto, introduced me to the lovely phrase “Hyperlinks subvert hierarchies”. In this particular context, I think of complex rules and processes as the hierarchy, and trust-based human interactions as the hyperlink.
A classic example for me is something I’ve had to face a few times:
I’m heading for the airport. I get a notification that the plane I’m booked on is delayed by a few hours. So I make use of the time doing something else and then arrive at the airport well in time for the now-delayed flight. Hand baggage only as usual. (Paper) boarding pass in my hand. (Why paper? I will come to that later). Walk up to the turnstiles that welcome me to security. Place pass on scanner. Red light. You can’t pass Go or collect 200 or anything else.
Why was I stopped? Because the system thinks I’ve missed my flight, or at least that I’ve missed the deadline by which I should have passed through security and headed for the gate. It doesn’t matter that the flight is late, and that the plane hasn’t even arrived at the gate to let people board. All that matters is that I’ve missed the preset deadline.
This has happened to me more times than I would have liked. Each time, I was late on purpose, to catch a flight that was later still. Each time, I had to find someone with the hey-presto ability to magic me past the turnstile. At least once, in a mainland Europe airport, I’ve had to vault the turnstile, inelegantly and clumsily, to make it through security on time; it was the last flight to depart that night, and there was no one left manning the check-in desks.
Stuff like this happens all the time, and it isn’t tied to any one underlying process. We’ve moved a lot of functionality to the cloud, and made our smart devices carry critical workloads.
It doesn’t all work the way it’s meant to. Bar codes and QR codes don’t scan. Image recognition doesn’t work. Phones run out of battery. Taps, towel dispensers and soap dispensers don’t recognise the presence of your hands. Signals are weak. Surfaces overheat. Touch screens become unresponsive. Sunlight makes the screen unreadable; ambient noise makes the instruction unhearable.
I was at a phone shop, waiting patiently to be served, finally at the front of the queue, when a woman came running into the shop and beseeched me to let her jump the queue and be served before me. All she wanted was a charged battery, or the chance to put a little bit of charge on her phone. She’d come from the shop across the road, groceries piled up in her cart, and gone to pay, only to find her phone had “randomly” run out of battery, even though she was sure it was over 50% ten minutes earlier. Her phone was her only way to pay. She had a parking slot that was about to run out and a child that needed picking up from school.
It all worked out. But it needed shop assistants in both shops to understand, to make exceptions; it needed the others involved to be human and not mechanical, social and not selfish.
The good thing is that most of the time, humans are human. Social. Empathetic. Machines are machines.
Trust is the hyperlink that can subvert hierarchies.
Herb Kim, who runs the wonderful Thinking Digital conference, shared a story on one of the social channels a few days ago. He’s one of those people who regularly shares stories that will enrich and encourage others, rather than criticise them or cut them down. One such story was about a Dutch supermarket operator who’d opened slow-checkout queues to help combat loneliness amongst the elderly. (An aside: legend has it that when Mahatma Gandhi first visited the West, he was asked what he thought of Western civilisation. He considered the question and replied, “I think it would be a good idea”.)
I love the very idea. People don’t necessarily do something for the “obvious” or “visible” reason. When we design stuff or automate stuff, we need to consider that.
As I write this, my local train operator plans to shut a number of ticket offices down. No more ticket counters. Instead, machines that are platform-mounted, exposed to the elements, with screens that are hard to read in what passes for sunlight here. Machines. With hard-to-follow instructions based on complex rules. With error conditions that are often neither displayed nor guessable. Offering choices that are very context dependent, missing the nuances that a trained and experienced ticket clerk provides.
My local station has a wonderful ticket clerk. She’s part of the community, part of what makes the village tick. She knows everyone, greets everyone, knows what they need. Knows enough about them to give them advice that is precision-tailored for them. Asks the right questions. And does all this in an environment she works to make friendly and welcoming. Christmas decorations and Easter bunnies and everything. Warm when needed, cool when needed.
Her personality and attitude and service are probably key reasons why some amongst the elderly feel less lonely. They come for a chat, not just to buy tickets to go somewhere. They come for the warmth they are guaranteed before they go wherever they’re going. These things are really important in villages with the demographics we have, an ageing population with many people living on their own.
My village isn’t unique. All over the West, the demographics are heading that way. All over the West, there are instances of misunderstandings as to what a process is really about: the human contact, the ability to have conversations, the camaraderie, the shelter, the dignity and purpose that come from all that.
We can’t really blame the devices for not working. They’re just hierarchies doing what they do best.
What we can do is be the hyperlinks. Subvert the hierarchies.
Planning for offline behaviours
When I was at the Euro final a couple of years ago, the ticket technology just failed to keep up, and people stormed the gates. Ugly scenes. I’ve seen variants all over the place, where the “design” of a process is focused on objectives like preventing fraud and controlling secondary markets, while paying limited attention to the things that will fail and the leeways needed. I think David Birch, when writing about digital currencies, often makes the point that no digital currency will work unless it caters explicitly for offline behaviours.
Devices will fail. Lost, stolen, out of range, battery dead, screen shattered, microphone or camera not working, Bluetooth disabled, whatever.
Processes will not work as intended. The connection with the mother ship may be down. The mother ship may itself have problems. Stuff happens.
The conditions in which the failure happens are usually the least helpful. Inclement weather; a time when no one is around; a time when everyone is around and everyone is frustrated; long queues here, deserted halls there.
Stuff happens. Stuff that can be fixed because people design for human intervention. Empowered and empathetic.
Many years ago, I gave lectures on the topic of Designing for Loss of Control. Many years ago, I spoke here and there on The Future of Lurk. Both topics continue to intrigue me. Not everyone is digital-savvy, not everyone has the wherewithal to work around the obstacles formed by failing systems, processes and connections. And yet we continue to entrust much that is critical to that world.
It can work.
It will work.
Only if we make the allowances we need to. Only if we treat people with respect, with tolerance, with dignity. Only if we work on the basis of mutual trust.
Only if we continue to be human.