Musing gently about the impact of change and the time it takes

We’re all very connected now, and news travels fast. There was a time when what constituted “news” used to be corroborated before it was sent on its way. There was a time when news had to be new to be news. There was a time when news had to be true to be news.

Those were the days.

Some of you may remember reading Bombardiers, a book by Po Bronson, which came out over 20 years ago. Here’s a review that might intrigue you enough to read it.

The principal reason for mentioning that book is that Bronson’s satire was instrumental in getting me to understand something I’ve written about before, over 10 years ago.

That something is this, paraphrasing Bronson:

In the Stone Age, Might was Right. And you could figure out who was Mighty quite easily. In the Industrial Age, Money Ruled. And you could figure out Who had the Money quite easily. In the Information Age, figuring out Who has The Information isn’t easy. In fact it is Very Hard. Because the time taken to verify the information can exceed the speed at which information changes.

So it doesn’t matter if the UK doesn’t spend £350m per week on the EU, it doesn’t matter if the NHS never receives that theoretical weekly sum. It doesn’t matter if Mexico never pays for the wall, or even if that wall never gets built.

Things move on. Turning back the clock is hard. Very hard.

There is something that is a little easier. Studying the impact of change, looking objectively at the data. Not easy, but easier.

It means knowing the right data to collect, the right way to collect it. It means preserving that data and making it accessible. It means protecting that data from change and from misuse.

It also means looking at the data over a long enough period. That’s what we didn’t do with cigarettes; but at least that battle’s over. That’s what we didn’t do about sugar; and that battle isn’t over yet. That’s what we’ve been trying to do with climate change, and that battle’s barely begun. That’s what we’re probably at the nascent stages on with fracking: the issue is not about hydraulic fracturing per se, not even about horizontal drilling, but about safe ways of disposing of the waste water, in particular learning from the seismologists who’ve been studying this aspect.

These battles get won and lost by people who are in positions of authority for far shorter periods than the time it takes for their decisions to have an effect.

And for us to learn about that effect, understand it and respond to it. Adapt as needed.

Things move on. Turning back the clock is hard. Very hard.

We need to get better at studying the impact of change over time. Proper, longitudinal study, collecting and preserving the right data sets, with the relevant discipline and safeguards in place. That’s why I have been fascinated by and supportive of Web Science.

This need for good longitudinal data is also the reason why I have been so taken with Amara’s Law, about our tendency to overestimate the effect of a technology in the short run and to underestimate it in the long run.

Take cricket for example. There was wailing and gnashing of teeth when limited-overs cricket was introduced, more of the same when T20 was introduced. The death of test cricket was foretold and prophesied, and later announced. Apparently it’s been dying ever since.

Not according to the data, especially when you look at longitudinal data. This was something I wrote about recently.

Taken over nearly a century and a half, comprising well over two thousand Tests, the data indicated that since the introduction of the changes, Tests were more prone to ending in win/loss results rather than draws, that more runs were being scored and more quickly, and that perhaps counterintuitively, the number of long individual innings (as exemplified by triple centuries scored) was also on the increase.

Events earlier this week have allowed me to look into another data set, suitably longitudinal, which reinforces all this.

I started with the hypothesis that one reason why Tests may be ending in win-loss results more often is that batsmen have learnt to truly accelerate run-scoring in bursts, using skills acquired in the short game. I surmised that we may be seeing more such bursty behaviour in the 3rd innings, thereby setting up for grandstand finishes, sometimes with “sporting declarations”. I also surmised that this bursty behaviour would be able to act as a potential insurance policy against any time lost due to inclement weather.

But it was all hypothesis. I needed the facts. At the very least I needed a suitable data set collected over a sensible time period. The recent Australia-Pakistan Test gave me the catalyst I needed. The Australians scored 241 for 2 in their second innings before declaring. By itself that wasn’t unusual. But they scored the runs at a rate of 7.53 RPO, something I would associate readily with T20, something I would expect in a small percentage of 50-over games, but something I would consider a complete outlier in the five-day game.

So I went and had a look.

In the history of Test cricket, covering somewhere between 15000 and 18000 innings, there have been just 10 instances where a run rate (RPO) of 6 or more per over has been sustained in an innings lasting 20 or more overs.

  • England 237/6d, RPO 6.12, 3rd innings, Mar 2009
  • Pakistan 128/2, RPO 6.19, 4th innings, Oct 1978
  • West Indies 154/1, RPO 6.20, 4th innings, Mar 1977
  • Australia 251/6d, RPO 6.27, 3rd innings, Jan 2015
  • Australia 264/4d, RPO 6.28, 3rd innings, Nov 2015
  • South Africa 189/3d, RPO 6.37, 3rd innings, Mar 2012
  • Pakistan 164/2, RPO 6.60, 4th innings, Nov 1978
  • South Africa 340/3d, RPO 6.80, 2nd innings, Mar 2005
  • West Indies 173/6, RPO 6.82, 4th innings, Feb 1983
  • Australia 241/2d, RPO 7.53, 3rd innings, Jan 2017

The first limited-overs international was played in 1971. All ten instances took place after that date. The first T20 international was played in 2005. Six of the ten instances took place after that date. In all ten cases, the team putting their foot on the accelerator didn’t lose; in half the cases they won.
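For what it’s worth, the filter behind that list is simple to reproduce. Here’s a minimal sketch in Python; the innings records below are a made-up illustrative sample rather than the full Test data set, and overs are simplified to whole-over decimal counts:

```python
# Sketch of the filter described above: innings of 20 or more overs
# with a sustained run rate (RPO = runs / overs) of 6 or more.
# These records are illustrative, not the full data set.
innings = [
    # (team, runs, overs, match innings number, date)
    ("Australia", 241, 32.0, 3, "Jan 2017"),     # the 7.53 RPO declaration
    ("South Africa", 340, 50.0, 2, "Mar 2005"),  # 6.80 RPO
    ("England", 145, 30.0, 1, "Jun 1999"),       # 4.83 RPO: filtered out
]

def run_rate(runs: int, overs: float) -> float:
    """Runs per over."""
    return runs / overs

fast_sustained = [
    (team, round(run_rate(runs, overs), 2), date)
    for team, runs, overs, _inns, date in innings
    if overs >= 20 and run_rate(runs, overs) >= 6
]
print(fast_sustained)
# [('Australia', 7.53, 'Jan 2017'), ('South Africa', 6.8, 'Mar 2005')]
```

The real exercise, of course, runs the same filter over every scorecard in the archive, which is where the fifteen-to-eighteen-thousand innings come from.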

As it is with cricket, so it is with many other things. When you change things, it takes time to figure out the real effects. Longitudinal studies are important. This is as true for technology change as for any other change. With all change, there is an Amara’s Law in operation. We tend to overestimate the short-term effects and underestimate the longer-term impact.

Tracking the impact of change requires good baseline data and a consistent way of collecting and preserving the data over long periods. That’s not a trivial task. It is made more complex with the need to protect the data from corruption and misuse.

While I love cricket, I only use it as an example here, to illustrate how longitudinal studies can help assess the impact of change, objectively and reliably.

A sideways look at cognitive surpluses and knowledge “management”

Doc Searls used to keep reminding me of something he attributed to Don Marti: Information doesn’t want to be free, it wants to be $5.99. Incidentally, talking about Doc Searls, here’s a bonus for Cluetrain devotees: the first time all four original Cluetrainers were together in one place, at Defrag in Denver last week.

Where was I? Oh yes, information doesn’t want to be free. You know something, sometimes I feel the same about knowledge. It doesn’t want to be free. As Paula Thornton said some years ago, maybe knowledge doesn’t want to be managed either.

Ever since I read Clay Shirky’s Cognitive Surplus earlier this year, I’ve been thinking about the book’s implications for “knowledge management” in the enterprise. Which is why I wrote what I did yesterday, and planned to follow up today. Which is what I’m doing here.

Let’s start with knowledge. For the sake of simplicity, I’m going to define knowledge in the enterprise as “information about anything and everything that makes our customers’ lives easier; as a corollary, information about anything and everything that helps us make our customers’ lives easier”. I feel that such a definition is in keeping with the ethos of Peter Drucker’s immortal saying “People make shoes. Not money”. If we make our customers’ lives easier, they will thank us for it. With their attention, their time, their loyalty, and even their money.

Using this definition, the management of knowledge can be defined as “the process by which we create, collect and share information that makes our customers’ lives easier”.

So who should be involved in such a process? Who would know the most about what would make our customers’ lives easier?

Our customers.

If you accept that logic, then the customer should be at the heart of any knowledge management system.

Who else? People who deal with the customer. Those who “touch” the customer. Followed by people who know something about the products or services those customers want or sometimes even need. Followed by people who know something about the process by which the products or services get created, delivered and exchanged for value.

Which means pretty much everybody in the enterprise. The extended enterprise. All the way to the customer.

Okay, so that’s the what and the who of knowledge management. Let’s take a look at the how.

One way of defining the how is to look at the things that failed in the past.

  • The right people weren’t involved.
  • Access and edit permissions were hierarchical rather than networked, preserving “expertise”.
  • Information was inaccurate; without scale, the costs to correct were too high.
  • Information was incomplete: only text could be captured; image and sound were missing.
  • Information was out of date; production, reproduction and distribution costs were prohibitive, reviews were infrequent.
  • Information was inaccessible. Analogue, poorly indexed, and hard to copy.

Today, all these failures can be dealt with. Scale is not an issue for companies designed to make proper use of the internet. Network-based architectures are inherently more flexible than their hierarchical predecessors: role- and function-based permissioning is simpler to implement. Smartphones allow us to capture all types of media, not just text. Connectivity is pretty much ubiquitous. And the information is held digitally in the cloud, taggable, searchable, retrievable. From anywhere. Anytime.

Taking a leaf out of Clay’s book:

We have the means. Cloud computing infrastructures. Smart phones. Cloud services that allow people to converse with each other, share and annotate digital objects, improve upon them.

We have the motives. Human beings are inherently social, we like sharing. We enjoy the bonding, the peer respect, the recognition. No man is an iland, intire of it selfe.

We just haven’t had the opportunity before. Enlightened bosses are now providing that opportunity, by focusing on outcomes rather than input timesheets, allowing their staff to determine what happens with their cognitive surpluses.

Knowledge workers, part of the tertiary sector, are intrinsically different from those employed in the historical primary and secondary sectors of agriculture and manufacturing. Their work is lumpy, amorphous, misshapen, non-linear.

This is not a new problem. Many “professionals” faced real challenges of scheduling and prioritisation, and found it impossible to have true predictability in workflow. Ask a doctor. A nurse. A teacher. A policeman or fireman. Their lives have been about lumpiness and unpredictability and non-linearity.

But we were stuck in the manufacturing mindset, so we pretended these anomalies didn’t exist. And we designed our education and healthcare institutions as if they were industrial in origin. Look what they’ve done to my song, ma.

Today is another day. We now have means, motive and opportunity. All we have to do is to allow people to make use of their cognitive surpluses. Focus on outcomes rather than inputs. And make everything we do centre on the customer.

[Okay, after a period of quiet I’ve now written two posts in two days. Which means the risk of getting flamed is high. I await your comments. With some trepidation. ]

It all began when the fat man sang

One of my favourite t-shirts, second only to Help>Slip>Franklin’s. [That’s a reference to one of the finest sequences ever played live or laid on vinyl: Help On The Way, Slipknot and Franklin’s Tower, taken in sequence from Blues for Allah.] Both t-shirts, by the way, available from zazzle.

You guessed it. I’m one of those. A Deadhead. And proud to be one. If you check out the end of the About Me section of this blog, written when I started blogging, you’ll find these words:

my thoughts on opensource were probably more driven by Jerry Garcia than by Raymond or Stallman or Torvalds

It’s been a long strange trip for fans of the Grateful Dead recently: for example, the March 2010 edition of The Atlantic had an article entitled Management Secrets of the Grateful Dead.

Image credit: Zachariah O’Hora

The article talks about the inauguration of the Grateful Dead archive at the University of California, Santa Cruz. Some years earlier, Strategy + Business, a prestigious management journal, published an article entitled How to “Truck” the Brand: Lessons from the Grateful Dead.

The Atlantic. University archives. Management journals. Just what is it about the Dead? A fan site that’s really a social network, one of the earliest to understand the value of social media in bringing the fan base together and giving them a space to inhabit. A dominant position in live music: the Dead have their own tab in the Internet Archive (the only entity, band or otherwise, to have one) and account for 10% of the overall Live Music collection there. A Google Earth mashup that shows you the precise locations and times of Dead concerts. Sites dedicated to trading the music of the Grateful Dead. A shirts Hall of Fame. A gazillion ties. [I should know, I have over 50 of them…]

A long strange trip indeed. So here’s my personal perspective on why the Dead succeeded.

1. It’s all about performance. Unlike most other bands, the Dead were a touring band. They played. And played. And played. Between 1963 and 2007 the Rolling Stones performed live 1597 times, or about 35 times a year. As against that, the Grateful Dead performed live 2380 times between 1965 and 1995, or about 77 times a year. Very few bands keep up that level of performance.

And so it is in business. People care about what you do, not what you claim to have done or how good your marketing is. Particularly now, when the cost of discovering truth is lower than ever before, what matters is how a company performs. Not how it says it will perform. Which is why customer experience has become so important.

2. It’s all about participation. Studio performances are not the same as live music: when you see what gets traded in Dead circles, you begin to understand why. Live sessions are real, organic, they change from session to session. Audiences are not locked away on couches or in straitjackets, they participate. Because they can. And they want to.

Companies need to understand this as well, particularly as the analog world shifts to digital. The cost of participation gets lowered. There was a time when I used to get really irritated with management consultants who would bring their powerpoint decks when meeting with me, always in analog, always taking care not to leave it behind. [In case I tried to copy it or, Heaven forfend, amend it, add to it.] What tosh. I’d already paid through the nose for the material.

Contrast that sort of short-term thinking with the vision inherent in Garcia saying “When we’re done with it, they can have it”, when asked about fans taping their shows.

3. It’s all about improvisation. John Lennon, another of my favourites, is reported to have said:

Life is what happens to you while you’re making other plans

When you look at the way they performed at concerts, you notice many interesting characteristics. They didn’t seem to have a predefined list of songs or sets; there was a lot of jamming and improvisation within the songs, drawn from a vast array of songs whose “design” made such improvisation possible. Garcia suggested more than once that they made up the song list as they went on, basing it on active feedback from the fans.

Lineups varied; band members performed in other bands or groups; everything about the culture of the band screamed responsiveness, adaptability.

4. It’s all about passion. Quality matters. And quality is a function of passion, of persistence, of practice. What the Dead did they did as a labour of love. Unless you enjoy what you do, there isn’t any point.

When you’re passionate about something, then you take the values inherent in that something and live your life according to those values. They permeate everything you do. I had the privilege of spending some time with John Perry Barlow, erstwhile lyricist for the Grateful Dead, cattle rancher, founder member of the Electronic Frontier Foundation, poet, what-have-you. And he was a perfect example of how his values affected everything he did and does.

If you haven’t done so already, you should read his essays The Economy of Ideas and The Next Economy of Ideas, along with the oft-quoted A Declaration of the Independence of Cyberspace.

In the end, what the Grateful Dead stood for are principles. Principles of openness and participation, principles of performance and passion, principles that allowed them to improvise and respond.

Companies would do well to pay heed.

The Right Thing

When it comes to leadership, I’ve tended to go along with Max De Pree’s definition, paraphrased here. A leader’s first job is to provide vision and strategy; his next is to say thank you. In between those two a leader is a servant and a debtor. So yes, I guess you could say I believe in soft-hands leadership.

From an enterprise perspective, I’ve always felt that true leadership is about Doing The Right Thing, and that the role of control functions is to make sure that Things Are Done The Right Way. Choosing the Right Thing is therefore of fundamental importance, and is often to do with principles and values rather than metrics and measurements. When you start thinking about being 57% right, that’s when you land up in Another Fine Mess.

Which is one of the reasons I was so glad to see this yesterday:

At that point it didn’t look like India had enough runs on the board, and Dhoni was the last recognised batsman. And the captain. Victory was not assured, and he could have stayed on. But he Does The Right Thing. He walks.

And talking about doing the right thing, I was glad to see that the ICC reversed its decision and awarded England the “Darrell Hair” match, as was the case originally. There were no grounds to call it a draw; worse, it was a very dangerous precedent to set. So I am glad to see that the original decision has been reinstated.

Why did it take this long? How come it happened at all? Or, as friend and erstwhile colleague Dom Sayers suggested, is this the last step in the rehabilitation of Hair? And the answer to all those questions is “I don’t know”.

What I do know is this. Doing the right thing is important. And doing the right thing is a key facet of leadership. So well done Dhoni.

Freewheeling about excavating information and stuff like that

Do you remember enterprise application integration? Those were the days. First you paid to bury your information in someone’s proprietary silo, then you paid to excavate it from there, then you paid again to bury it again in someone else’s silo. Everybody was happy. Except for the guys paying the bills.

I went to see the guys in Osmosoft yesterday, it’s always a pleasure visiting them. At BT Design, our approach to innovation has a significant community focus: Web21C, now integrated into Ribbit, was formed on that basis; both Osmosoft and Ribbit are excellent examples of what can be done with open multisided platforms.

While I was there, I spent some time with Jeremy Ruston who founded the firm and leads the team. Incidentally, it was good to see Blaine Cook there, I hadn’t seen him since he joined BT. Welcome to the team, Blaine.

When it comes to opensource, Jeremy’s one of the finest brains I know, we’re really privileged to have him. We got to talking, and somehow or other, one of the topics that came up was the ways and means we have to figure out if someone’s any good, in the context of hiring. After all, there is no strategy in the world that can beat the one that begins “First hire good people”.

When you’re hiring people with experience, the best information used to come from people you knew who’d already worked with her or him. Nothing beats a good recommendation from a trusted domain. You can do all the interviews you want, run all the tests you can find, do all the background searching you feel like; over time, the trusted domain recommendation trumps the rest.

Now obviously this does not work when the person has not worked before, where there is no possibility of a trusted domain recommendation. Which is why people still use tests and interviews and background checks.

Which brings me to the point of this post. Jeremy brought up an issue that he’d spoken to me about quite some time ago, something I’m quite keen on: the use of subversion commit logs as a way of figuring out how good someone is.
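To give a flavour of the idea: `svn log --xml` emits structured XML, and even a crude summary of commits per author falls out in a few lines. The log entries below are invented for illustration; in practice you would feed in the real output, and a serious assessment would look at the messages and the diffs, not just the counts.

```python
# A minimal sketch: summarising commit activity per author from the
# XML that `svn log --xml` produces. The sample log is made up.
import xml.etree.ElementTree as ET
from collections import Counter

sample_log = """<?xml version="1.0"?>
<log>
  <logentry revision="101"><author>alice</author><msg>Fix parser edge case</msg></logentry>
  <logentry revision="102"><author>bob</author><msg>Refactor I/O layer</msg></logentry>
  <logentry revision="103"><author>alice</author><msg>Add tests for parser</msg></logentry>
</log>"""

root = ET.fromstring(sample_log)
commits_by_author = Counter(
    entry.findtext("author") for entry in root.iter("logentry")
)
print(commits_by_author.most_common())
# [('alice', 2), ('bob', 1)]
```

Crude, yes. But the point stands: the record is public, structured, and queryable, which is more than can be said for most CVs.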

And that got me thinking. Here we are, in a world where people are being told: Don’t be silly and record what you do in Facebook; don’t tell people everything you do via Twitter; don’t this; don’t that; after all, the bogeyman will come and get you, all these “facts” about your life will come back to haunt you.

As a counterpoint to this, we have the opensource community approach. Do tell everyone precisely what you are doing, record it in logs that everyone can see. Make sure that the logs are available in perpetuity. After all, how else will people find out how good you are?

Transparency can and should be a good thing. Abundant transparency can and should be a better thing, rather than scarce transparency. Right now we have a lot of scarce transparency; people can find out things about you, but only some people. Which would be fine, if you could choose who the people were. Do you have any idea who can access your credit rating? Your academic records? Do you have any idea who decided that?

Scarce information of this sort leads to secrets and lies and keeps whole industries occupied. Maybe we need to understand more about how the opensource community works. Which, incidentally, is one of the reasons why BT chose to champion Osmosoft.

An aside: David Cushman, whom I’d known electronically for a while, tweeted the likelihood of his being near the new Osmosoft offices around the time of my visit, so it made sense to connect up with him as well. It was good to meet him, and it reminded me of something I tweeted a few days ago. How things change. In the old days relationships began face to face and over time moved into remote and virtual and electronic. Nowadays that process has been reversed. Quite often, you’ve known someone electronically for a while, then you get to meet them. Intriguing.

Finally, my thanks to gapingvoid for the illustration, which I vaguely remembered as “Excavation 47”. It was a strange title so it stuck. Which reminds me, I have to start saving up to buy one of his lithographs, they’re must-haves.