[Note: This is the sixth in a series of posts about the Social Enterprise and the Big Shift. The first post provided an introduction and overall context; the second looked specifically at collaboration, working together; the third looked at optimising performance, enjoying work, working more effectively. The fourth, Doing By Learning, looked at how work gets done in the enterprise, and provided the context in which such flows should be seen. The fifth delved into the subject of flows in detail, and introduced the concept of enterprise social objects. This post looks at such objects in detail. In my next post I will look at filtering and curation. The last three posts (in the series of ten) will then look at innovation, motivation and radical transformation.]
Jonathan Sacks has an interesting article in the Times today, headlined It Is The End of a Dangerous Experiment; sadly, it’s paywalled. Written in the wake of the LIBOR scandal, Sacks emphasises the importance of morality in markets, a reminder that ethics cannot be legislated for, that a shared moral code is needed. He reminds us of the inherent weakness in the Invisible Hand proposed by Adam Smith by making reference to Von Neumann, game theory and the Prisoner’s Dilemma. As he puts it, “Two or more rational agents, each acting in their own self-interest, will produce an outcome that is bad for both, individually and collectively”, something the Invisible Hand theory does not allow for.
Sacks makes a critical point about market economies: they can be risk-based or trust-based. Regulatory frameworks work well in risk-based markets, but not in trust-based ones. What he said reminded me of a recent speech by Andrew Haldane, Executive Director for Financial Stability at the Bank of England, headlined Tails of the Unexpected. The original paper, written by Haldane with his colleague Benjamin Nelson, is available for download using that link. Haldane makes a similar point about risk and uncertainty: much of the financial sector models risk on assumptions of normal distributions, when conditions of uncertainty call instead for fat-tailed distributions. When I read the Sacks article, my immediate temptation was to mash the two together, to describe markets as being risk-based or uncertainty-based, leading to the assertion that risk-based markets work well under strong regulation, but uncertainty-based markets need something more, something based on a shared moral code.
Trust.
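[An aside for the quantitatively minded: Haldane’s distinction is easy to make concrete. The sketch below is mine, not Haldane and Nelson’s; it uses a Student-t distribution with three degrees of freedom as an arbitrary stand-in for a fat-tailed world, rescaled to unit variance so the comparison with the normal distribution is like-for-like.]

```python
# A rough illustration, mine rather than Haldane and Nelson's: how much likelier
# a "k-sigma" loss becomes once the normal distribution is swapped for a
# fat-tailed one. A Student-t with 3 degrees of freedom is an arbitrary stand-in
# for "fat-tailed", rescaled to unit variance so the comparison is like-for-like.
import math
from scipy.stats import norm, t

DF = 3
SCALE = math.sqrt((DF - 2) / DF)   # rescale the t so its variance is 1

for k in (3, 5, 10):
    p_normal = norm.cdf(-k)                 # chance of a k-sigma-or-worse loss, thin tails
    p_fat = t.cdf(-k, df=DF, scale=SCALE)   # the same loss under the fat-tailed stand-in
    print(f"{k}-sigma: normal ~ {p_normal:.1e}, fat-tailed ~ {p_fat:.1e}")
```

Even this crude comparison shows why a model built on normal-distribution assumptions can be serenely confident about extreme events that a fat-tailed world produces with uncomfortable regularity; that, in essence, is Haldane’s point about modelling risk when what you actually face is uncertainty.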
We live in uncertain times. We have always lived in uncertain times. And we will continue to live in uncertain times. One of the key theses of the Big Shift is that as digital infrastructures permeate the world, and as public policy shifts towards greater mobility of resources, the speed of change increases, as does the level of uncertainty. The need to move from stocks to flows, from experience curves to collaboration curves, from past actions to continued learning: all of this is predicated on the growing level of uncertainty within global markets.
Uncertainty-based markets require an underpinning of trust in order to function effectively.
I’ve said before that trust is hard to legislate for; it’s also hard to automate. Regulation is different: rules can be automated. [An aside. I’m reminded of Sugata Mitra’s wonderful pronouncement on teachers: a teacher that can be replaced by a computer should be replaced by a computer. Rules can be replaced by systems. Trust cannot.]
Trust is necessary for the efficient functioning of organisms in conditions of uncertainty. During the late 1980s and early 1990s, I spent some time reading up on “incomplete contracts” in the context of firm theory. While I came across a number of papers on the subject at the time, the one that stood out for me was Erik Brynjolfsson’s December 1991 paper An Incomplete Contracts Theory of Information, Technology and Organisation. It allowed me to conceive of information assets differently; it made me think harder about the impact of designing information assets to be shareable, “flexible”; and it made me ponder over the consequences of having information assets controlled centrally. [An aside: Not surprisingly, that interest in firm-theory-meets-incomplete-contracts resulted in my meeting Erik a few years later, and, serendipitously, was given a further boost when I had the opportunity to spend time with Tom Malone soon after he’d written The Future of Work. Tom came to see me in hospital in San Francisco last week: Tom, if you’re reading this, much thanks.]
[While on the subject of hospitals — you may have noticed I’d gone quiet here for a while. It was because I’d somehow contracted a serious infection that needed urgent treatment in hospital, spent nearly a fortnight in hospital, and I’m still recovering. Many of you found ways of letting me know you cared, and I am immensely grateful for that. My salesforce.com colleagues were brilliant, they kept me in high spirits during what was a very painful and worrying episode in my life. Colleagues from my childhood, my early years and my more recent past also showed up, amazingly. Particular thanks also go out to Sheldon Renan, coming all the way from Portland, and to John Hagel and John Seely Brown, who found time to have a wonderful dinner with me the day before I was admitted to hospital. They helped provide grist to my mill as I lay supine for too long.]
Though some of you may dismiss it as “pop”, I liked the simple message in Patrick Lencioni’s The Five Dysfunctions of a Team: an absence of trust leads to a fear of conflict, which in turn reduces commitment; that makes for an environment where people avoid accountability, and thereby leads to poor execution.
Teams, like firms and markets, thrive on trust.
The Social Enterprise is about raising performance levels in uncertain environments. Which makes trust a must. But trust cannot be legislated or mandated. How then are we to engender trust?
Society has had mechanisms for engendering trust for aeons. Some of these are based on transparency, openness and the power of inspection. Look at me, my hands are empty, I bear no weapons — the origins of the handshake. Some are based instead on “trusted domains” doing the introducing — as in the passport — Her Britannic Majesty’s Secretary of State Requests and requires in the Name of Her Majesty all those whom it may concern to allow the bearer to pass freely without let or hindrance, and to afford the bearer such assistance and protection as may be necessary. Yet others use badges of achievement or honour — as in coats of arms or the diplomas of the medical and legal professions.
But all these were structured and explicit means of engendering trust. I think people are a whole lot more complicated than that, and that we all have a slew of tacit ways of giving and receiving trust. And my hunch is that Social Objects have key roles to play in this regard. [For more on the theory of Social Objects, you may want to refer to these three posts.]
When you trust someone, you make yourself vulnerable. This is true whether the “you” in question is a person, a firm, a market, or humanity. By making yourself vulnerable, you engender trust. Vulnerability is the essence of trust.
One of the ways you make yourself vulnerable is by sharing. I make myself vulnerable when I share my thoughts here; over the years, people have written some fairly disturbing comments, publicly as well as privately. Had I not been prepared to be vulnerable, I would not have chosen to share my thoughts like this.
The act of sharing helps build trust. When you get together with your friends and you discuss the books you read, the films you saw, the meals you ate, the places you visited, the things you experienced, you’re sharing. The topics you speak about are social objects, things you have in common, things that encourage commentary and opinion to be expressed freely and openly. And after a while, the “social object” isn’t the important thing; what matters is the relationship that forms as a result of the comments, the advice, the observations.
What’s actually happening is that you’re building strong bonds as a result of the things you have in common, bonds that help you stay committed and loyal to each other when faced with things you don’t have in common, where you don’t see eye to eye. To me the interactions are tacit ways of establishing and building trust.
At work, these social objects take on different guises. They’re not just films and restaurants and holidays and TV shows and concerts any more; they’re documents and spreadsheets and presentations and photographs and videos. As people work together, they share these objects, build communities around them, grow the moss of commentary around the rolling stone of the object. In some ways I am reminded of Erik Brynjolfsson’s “information assets”, which can be designed for central control or for sharing: a decision that deeply affects the sense of ownership workers experience, which in turn informs their later actions and behaviour and influences whether they act as “owners” or not.
Knowledge work involves the use of many digital information assets. Some of them are highly structured and architected, with formal steps and approval processes; these are certainly enterprise objects, and they form a critical part of workflow. But they’re not social objects, and workflow is not what I’m talking about.
I’m talking about posting a draft for a talk and saying hey guys, could you tell me what you think of this? does it work for you? how can I improve it? I’m talking about reading a medical report and saying I really don’t like the look of this, there’s something not quite right about it, my sixth sense is on overdrive and I need your help. I’m talking about looking at a customer complaint and saying you know what, I think we’re barking up the wrong tree, I think this is not a router problem, it’s a credit problem, that’s why we’re not solving it. I’m talking about oops, I’ve got an unexpected opportunity to pitch to the ceo of an aerospace company about what we can do for them, but it’s in fifteen minutes, does anyone know of any work we’ve done in that sector? And, when you get five different responses with attachments, I’m talking about the comments that let you choose which one is best.
I’m talking about ways of sharing, of collaborating around information assets, ways that tacitly engender trust.
They involve social objects. But the objects aren’t important per se; what matters is the quality of the interactions that ensue.
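[If it helps to picture what such an enterprise social object might look like as an information asset, here is a deliberately crude sketch in code. Every name and field in it is invented for illustration; it describes no particular product. The point it tries to capture is the one above: the object carries its commentary with it, and it is the conversation around the object that accumulates the value.]

```python
# A purely illustrative sketch of an enterprise social object: an information
# asset designed for sharing, with the "moss of commentary" attached to it.
# Every name here is invented for illustration.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class Comment:
    author: str
    text: str
    when: datetime = field(default_factory=datetime.now)

@dataclass
class SocialObject:
    owner: str
    title: str
    payload_uri: str                           # the draft, spreadsheet, video, report ...
    comments: List[Comment] = field(default_factory=list)

    def share(self, note: str) -> "SocialObject":
        """Sharing is the act that invites commentary and, tacitly, trust."""
        self.comments.append(Comment(self.owner, note))
        return self

    def respond(self, author: str, text: str) -> None:
        """Each response adds to the conversation that forms around the object."""
        self.comments.append(Comment(author, text))

# e.g. posting a draft and asking for help, as in the examples above
draft = SocialObject("jp", "Draft talk on trust", "files/draft-talk-v1.pptx")
draft.share("Could you tell me what you think of this? How can I improve it?")
draft.respond("colleague", "Slide 7 needs a concrete example; otherwise it works for me.")
```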
Hagel, Seely Brown and Davison speak of participants, interactions and environment as the key components of creation spaces. These participants need to trust each other. The environment helps them deal with explicit ways of building trust. You need the interactions in order to deal with the tacit ways of building trust.
When trust is formalised and structured and regulated, it can be gamed. Identities can be victims of theft, so to speak. [Although every time I hear that phrase, I am reminded of Mitchell and Webb’s wonderful skit on Identity Theft, a must-listen.]
Increasingly, what we need is a more personal experience of trust, one that is tacit, one that is based on interactions and experience. But this is hard to scale, unless we learn more about the value of digital social objects, of information assets, in this context.