Musing lazily about work and play

A decade ago, soon after becoming Global CIO at Dresdner Kleinwort Wasserstein, I started looking into how I could embed what we now call “social software” into the everyday operations of that institution. By that time we were already pioneers in the use of wikis, already well established in our use of instant messaging, and we had begun quite serious experiments in the use of smart mobile devices at work; what we were trying to do was to add blogging to our portfolio and bring all the tools together into a more harmonious whole. It was a real privilege to lead that department; we had an amazing array of truly world-class talent there. [If you used to work there and you’re reading this, you have no idea how grateful I am for having had that privilege. A fabulous team. And I’m still grateful.]

Andrew McAfee, then at Harvard, chanced upon our work, spent time with us and helped us understand the formal context of what we were discovering. You can read about it in his seminal book Enterprise 2.0: New Collaborative Tools for Your Organization’s Toughest Challenges; or if you prefer to test the water first, you can read the original article he wrote for the MIT Sloan Management Review, Enterprise 2.0: The Dawn of Emergent Collaboration.

During those years, I had the opportunity to talk to a lot of people about these emergent tools: friends and colleagues, industry participants, observers, consultants, academics, in fact anyone and everyone who had an opinion. And there were many opinions. [Strange that. You don’t tend to find that everyone has an opinion as to how to treat something on a balance sheet. You don’t tend to find that everyone has an opinion on the meaning of a clause in a contract. But when it comes to IT ….. My name is Legion, for we are Many.]

Anyway, amongst all these varied opinions, one set stood out. That we would fail in our quest to implement social software at the bank because the very names of the tools we used were so “silly”. Proponents of this particular set of opinions felt that life was hard enough when companies had “stupid” made-up names like Google, and that I would face a real uphill battle because I was proposing using trivial-sounding things like “blogs” and “wikis”. After all, this was “work”, not “play”, and people at work had more important things to do than to play.

At that time, I was getting to the point where I was working on a move to a “four-pillar” architecture for enterprise software, something I spoke about in this video of a closed architects meeting in December 2005, and as reported here by old friend Phil Wainewright, who was present. We were already committed to publish-subscribe models within the bank and had a sensible bus architecture, so I was keen to improve the subscribe capability, thinking at the time it would be something along the lines of what Netvibes offered then. Facebook was just emerging, and Twitter hadn’t yet begun. But the principles of back-end publishers and RSS and pub-sub and corporate “social” networks had already been established: Bloomberg Chat had been around for a while, ICQ had mutated into various forms, we’d been learning from Groove and Jabber, and Parlano MindAlign had shown us the art of the possible (before Microsoft trampled all over them). We were ready to bring all this together.
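
For readers who never met the pattern in the wild, here is a minimal sketch of publish-subscribe, the model described above. It is purely illustrative: the topic names and the toy in-process bus are mine, and assume nothing about the bank’s actual systems.

```python
# A minimal publish-subscribe sketch (illustrative only; topic names
# and this toy in-process bus bear no relation to the bank's systems).
from collections import defaultdict
from typing import Callable

class Bus:
    """A tiny message bus: publishers and subscribers never see each
    other; all they share is a topic name."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[str], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[str], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: str) -> None:
        for handler in self._subscribers[topic]:
            handler(message)

bus = Bus()
bus.subscribe("research/notes", lambda m: print("desk A received:", m))
bus.subscribe("research/notes", lambda m: print("desk B received:", m))
bus.publish("research/notes", "morning note published")
```

The point of the pattern is the decoupling: a publisher doesn’t need to know who subscribes, which is exactly what made blogs-plus-RSS feel like a natural fit for the subscribe pillar.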

Part of my energy was being used on something possibly more ambitious, moving the bank away from Microsoft and on to Apple. Project Jobsworth. The reason I have the mail id [email protected] (used to be [email protected]). The reason I call myself @jobsworth on Twitter and in many other digital places. You can read about that project here.

Al-Noor Ramji, my predecessor at the bank, had set many things in train that I could use and extend: a commitment to open source, not just as consumer but as producer and contributor. Under his leadership we were early on to Java, committed wholeheartedly to a publish-subscribe mindset, and developed what later became OpenAdaptor. The environment was fertile, the talent pool amazing. And the time was right: we needed to innovate to help reduce costs and drive up speed and quality.

None of this was being done because we thought it was fashionable or cool or different. At the time, my influences included John Seely Brown’s Social Life of Information, Steven Johnson’s Emergence, Christopher Alexander’s A Pattern Language, Brian Arthur on Increasing-Returns models, Howard Rheingold on Virtual Communities (and later Amy Jo Kim as well), pretty much everything written by Esther Dyson and the editorial team at Release 1.0; and of course The Cluetrain Manifesto. Chris Locke even came and spoke to us in 2000 and in 2001, first in Bangalore and later in London. Later influences include Tom Malone and The Future of Work, Kathy Sierra on everything to do with Creating Passionate Users, and Carlota Perez on Technological Revolutions and Financial Capital.

So you can see where our heads were at the time. We believed that we were in a new paradigm, one where digital infrastructure was writing new rules, where the customer was central, where information was social, where activity was emergent, communal and collaborative, where community mattered, where tools were becoming mobile, where software was about patterns, where identity and intellectual property were being redefined by the internet. And based on those beliefs, we thought that work was changing, that the people at work were changing, that the tools of work needed to change.

And those are the kind of reasons why I’m at Salesforce.com today, where I’m chief scientist. [And why I’m a trustee of the Web Science Trust, and why I’m a venture partner at Anthemis.com].

Those are the reasons why I’m looking forward to Cloudforce Social Enterprise Tour London 2012 on May 22nd, which is showing all the signs of becoming the largest single-day tech event ever.

As with open source, as with information becoming social, as with smart mobile devices, as with the Facebooks and Twitters of this world, the Social Enterprise is no fad.

The Social Enterprise is a consequence. A consequence of customers becoming empowered and discovering how to use their voice. A consequence of Moore’s Law meeting Metcalfe’s Law, and then evolving a further forty years or so. If you want to know about the connected customer, read about it here in the Telegraph or here on CloudBlog.
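
By way of back-of-the-envelope illustration (mine, not the post’s, and the user counts are entirely hypothetical), here is roughly what “Moore’s Law meeting Metcalfe’s Law” compounds to over forty years:

```python
# Back-of-the-envelope arithmetic, with hypothetical numbers, showing
# why Moore's Law compounding plus Metcalfe's Law is so potent.
years = 40
doublings = years / 2                 # the classic ~two-year doubling period
capability_gain = 2 ** doublings      # ~1,000,000x more capability in 40 years

# Metcalfe: network value grows roughly with the square of connected users.
users_then, users_now = 1e6, 2e9      # illustrative user counts only
value_gain = (users_now / users_then) ** 2   # ~4,000,000x more network value

print(f"{capability_gain:,.0f}x capability, {value_gain:,.0f}x network value")
```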

In a similar context, the blurring of lines between work and play is nothing new. As I found out the hard way using terms like blogs and wikis, people push back on the terms because work is meant to be “serious”. Meant to be. Not is.

Work and play are not mutually exclusive. If you want to get into this, a good place to start is Mihaly Csikszentmihalyi’s Flow. You should also read Pat Kane’s The Play Ethic. And maybe riffle through Michael Schrage’s Serious Play. But even that will only scratch the surface.

Which brings me on to today. Today, where the changing work landscape has meant that we need new ways of looking at work, new tools, new processes. Tools and processes that relate to collective and collaborative activity in “real time”, that deal with nonlinear and lumpy processes, that recognise the importance of patterns. Work environments that are distributed and geographically dispersed, with people needing to “shift time and place”. Engaging with empowered, always-on, mobile customers and partners and staff.

I started looking closely at a number of communities to try and learn from them: traders who used to operate in open outcry markets and later shifted to trading floors; traffic and flow controllers, particularly air traffic controllers; network operators (which is why I moved from Dresdner Kleinwort to BT); and gamers.

Yes, gamers.

I thought they had a lot to teach me. And they do.

So I will continue to observe them and learn from them. And see how to apply that learning at work. Which is what I was doing when the term “gamification” intruded onto my radar. I have spoken on the subject a few times, explaining why the lipstick of gamification cannot fix the pig of work, explaining why this sea change at work is something far more than just the transposition of some game mechanics.

Which makes this Pew Internet report intriguing reading. If you have the time to read this, then I strongly recommend you read this critique by Sebastian Deterding as well. He makes some crucial points, particularly on competition, on generations, and on work/play. I have had the opportunity to meet and to spend time conversing with Sebastian, and would urge those interested in this space to follow him.

The Friday Question: 18 May 2012

First, the answer to yesterday’s prequel. I asked people to point out what six named people have in common, and why one was distinctively the odd one out. The six were Bohr, Curie, Einstein, Fermi, Nobel and Rutherford.

The common element was easy; I wanted more than “scientists”. What I was looking for was that they all have chemical elements named after them. Most of you got that.

The odd man out was a bit harder. There were some facile choices: some of you pointed out, correctly, that all bar Nobel had won the Nobel Prize. Some indicated that Nobel was possibly the only chemist; some that Marie Curie was the only woman. The best answer I saw was that the elements themselves were all synthetic except for curium, which would make Curie the exception.

But it wasn’t the answer I was looking for. Perhaps I have to take even more care with setting the question… composing unGoogleable questions continues to be a challenge. My understanding is that all that has ever been seen of curium in its natural state is at “trace” level and nothing more, and that curium is otherwise obtained by bombarding uranium or plutonium with neutrons in nuclear reactors. But technically all your answers are correct, and I will find better ways of framing the question.

I was looking for more. I wanted to lead people down the rabbit holes of “appearing on banknotes” (neither Fermi nor Nobel has done so) or “appearing on stamps” (they all have). And the real answer I wanted was this:

While they all have craters on the Moon named after them, only Bohr’s is fully visible from Earth; the remainder are at best marginally visible, and more often than not deeply rooted in the far side.

So on to today’s question.

Marshall is to Allen as Hercules is to what?

The Friday Question: A prequel

I think I missed last week’s question; my apologies. So here’s a simple teaser instead, while I work on the question for tomorrow.

Name the odd one out as well as what they have in common.

Niels Bohr. Pierre and Marie Curie. Albert Einstein. Enrico Fermi. Alfred Nobel. Ernest Rutherford.

Getting the odd one out correctly means nothing. You must answer both parts.

Of blue raincoats and polka dot bikinis

Ah the last time we saw you/you looked so much older

Your famous blue raincoat was torn at the shoulder

Leonard Cohen, Famous Blue Raincoat, 1971

It was an itsy-bitsy teeny weeny yellow polka dot bikini

That she wore for the first time today

Brian Hyland, Itsy-Bitsy Teeny Weeny Yellow Polka Dot Bikini, 1960

Two songs from my childhood, both immensely memorable. One a novelty song that charted its way to the top, the other a haunting, lilting melody. Guess which one I had to learn to dance to at the age of 14? [I’ll have you know that dancing to Leonard Cohen is no laughing matter!].

So what are these songs doing in “a blog about information”?

Let me try and explain. Famous blue raincoat. [Incidentally, it was a Burberry]. Not blue famous raincoat. Itsy-bitsy teeny weeny yellow polka dot bikini, not polka dot yellow itsy-bitsy teeny weeny bikini.

Why?

Because the other forms just don’t sound right, don’t feel right; there’s something wrong that you can’t quite put your finger on, but they’re not right.

Because adjectives have an order, a hierarchy; an order that is tacitly understood, learnt and practised by native English speakers; an order that has to be explained explicitly to non-native speakers of the language.

An order that goes something like this:

Quantity. Opinion. Size. Age. Shape. Colour. Origin. Material. Purpose.

Visit this site to see how non-natives get to learn the order and hierarchy. You may also find it of interest to read these posts on the subject. Other languages appear to be less hierarchical when it comes to adjective placement and order. [If you’re interested, you can even take a look at Dalcurian adjectives :-)]
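
To make the tacit rule explicit, here is a small sketch (mine, not from any linguistics reference; the mini-lexicon is hypothetical and laughably incomplete):

```python
# The tacit adjective hierarchy made explicit (illustrative sketch;
# this mini-lexicon is hypothetical and nowhere near complete).
ORDER = ["quantity", "opinion", "size", "age", "shape",
         "colour", "origin", "material", "purpose"]

LEXICON = {
    "famous": "opinion",
    "blue": "colour",
    "itsy-bitsy": "size",
    "teeny": "size",
    "weeny": "size",
    "yellow": "colour",
}

def order_adjectives(adjectives):
    # sorted() is stable: adjectives of the same class keep their given order
    return sorted(adjectives, key=lambda a: ORDER.index(LEXICON[a]))

print(" ".join(order_adjectives(["blue", "famous"])), "raincoat")
print(" ".join(order_adjectives(["yellow", "itsy-bitsy", "teeny", "weeny"])),
      "polka dot bikini")
# famous blue raincoat
# itsy-bitsy teeny weeny yellow polka dot bikini
```

Native speakers run something like this without ever being shown the table; non-native speakers have to load the table by hand.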

I had three reasons to write this post:

One, having known about this for some time, and having been reminded of it regularly more recently, I wanted to share it with you, in case you were as interested in it as I was. English is a wonderful, living, just-slightly-insane language.

Two, I think it’s a great example of tacit knowledge, something we need to understand better as we move forward with the web. We know it, but don’t know we know it. We use it, without knowing we’re using it.

Three, I think it’s a great example of how the web works, allowing me to write a post like this, linking to stuff that lets you dig into it if you choose to.

Incidentally, when people come and argue with me about apps and HTML5, I’ve tended to use just one word in reply.

Links.

Musing lazily about the Digital Divide

According to the International Telecommunication Union, and as referred to in Wikipedia, this was the state of the Global Digital Divide in 2010.

Digital divides come in many forms: between continents, between countries, within countries; between age groups, between genders, between professions. There are even digital divides between companies and customers, particularly if the company’s inclined to imitate a dinosaur [In which case the company will suffer the same fate as the dinosaur].

When I worked in regulated industries, particularly in finance and telecoms, I had at least one regular source of joy: meetings where we were considering doing something new. I would count the minutes before someone asked, very well-meaningly, “Have you considered the compliance implications?” At which point everyone nodded sagely and went back to staring serenely into their coffee, secure in the knowledge that very little “new” was going to happen.

It never took long. Most of the time, the question had been asked before the ten-minute marker.

Legacies come with costs.

While working at Dresdner Kleinwort, sometime in 1998, I was asked to pop over to Warsaw for a couple of days in order to assess the “Year 2000 readiness” of a number of Polish banks; they were considering flotations and my role was to perform part of the due diligence.

It was a very quick trip, validating what I’d already found out. None of them had done any real preparation for the Year 2000.

They didn’t need to. They were so late to the table that they’d leapfrogged the problem.

It was something that really resonated with me, because of what I’d seen in Calcutta time and time again: yesterday’s pioneers leave amazing legacies… with amazing costs to follow. Younger, later participants don’t face the same brownfield challenges.

At LIFT in Geneva this year, David Rowan gave an excellent talk on why Start-up Entrepreneurs should move to Africa; afterwards, I had the chance to talk to him briefly over dinner, and what he said resonated as well. As a result, I started looking more closely into how “wired” Africa was becoming. Here’s the current intra-Africa optical fibre network, courtesy the UbuntuNet Alliance:

[Map of the intra-Africa optical fibre network, via the UbuntuNet Alliance]

David also sent me off to check out what was happening undersea. Here’s what is projected to happen by 2014, from Steve Song’s excellent ManyPossibilities blog:

Africa has already earned a good reputation for pragmatic progress, particularly from a communications viewpoint, with “guerrilla innovation” around wireless and mesh; the mobile story is also very strong.

When I wax lyrical about the Dark Continent, some people respond by trying to move the argument to India and China, wanting desperately to show me that the digital divide is present there. It used to take me three years to get a landline in the India I left in 1980. Today, I can get a mobile phone there faster than I can get one in the UK, with less paperwork, and at lower comparable cost.

When I quote stories like that one, I get dismissive shrugs and suggestions that the technology in question is usually dated and second-rate. Which is why I smiled when I saw the recent Apple results, where it turned out that over 20% of Apple’s iPhone business was in… China.

Carlota Perez, one of my favourite authors and economists, is someone you absolutely must read. Her Technological Revolutions and Financial Capital is one of the few books I have read cover to cover over a dozen times. [One day I shall write a post about those books]. At the Triple Helix Conference in memory of Chris Freeman at Stanford last year (slides here), she summarised one of the key ideas of her book as “The shift from financial mania and collapse to Golden Ages occurs when enabled by regulation and policies to shape and widen markets”.

Sometimes when I see what happens in that murky space where incumbents and regulators act as haruspices over the entrails of mummified intellectual property regimes, I start thinking wistfully of a different world. One where regulation and policy enables Golden Ages to occur, unhampered by the acts of erstwhile market participants.

Maybe that different world is already there in the West. Eastman Kodak, with a commanding position in the world of film, and with over 1000 digital photo patents, went into bankruptcy earlier this year. Polaroid, which defined a whole new world of “instant” photography, has been going bankrupt regularly and repeatedly since 2001, and finally sold off some of its core patents earlier this month.

Patents. Stocks of knowledge, as John Hagel, John Seely Brown and Lang Davison would probably call them, in the context of their seminal The Big Shift.

And while all that was going on, a small, young company whose apparent raison d’être was to make digital photographs à la Kodak look like they were taken on Polaroids got bought by Facebook for a cool billion dollars.

Instagram understood flows. Understood the importance of cloud, mobile, social and open.

All this makes me think.

Maybe I should be telling my children and grandchildren (to-be, in case anyone was wondering): Go South, Young Man/Woman/Child.

Maybe Africa is it. Maybe Africa will leapfrog everyone else in welcoming a Carlota Perez Golden Age, with everyone connected and empowered with compute and storage and bandwidth affordably and effectively; maybe this will happen because they have no legacy to hold them back in this context, no haruspices, no mummified anythings. Maybe Africa will gain from the scale that India and China generate, and put that scale to work before anyone else.

Incidentally, this is not the first time I’ve raised this idea. Read Why It’s Over if you want a slightly different context.