Rambling about creativity and capital and content and frames


The tragic death of Michael Jackson has dominated much of the news this past week, even overshadowing the Iran situation in some quarters. Strange but true. Jackson’s death has had some unusual consequences, as people try and deal with their own reactions in different and creative ways. While the original story broke, I believe, on TMZ, Twitter was the river that carried the news to the world.

And Twitter was overwhelmed. Which meant the arrival of the much-loved Fail Whale:

[Image: the Fail Whale]

Which led someone to come up with this:

[Image]

This concerned a small number of people, who were worried that the image may cause offence. Which in turn led someone else to this:

[Image]

And so it went on, as people sought more and more creative ways of expressing their emotions and paying tribute to Michael Jackson. Wallpaper downloads. Posters. Photographs. Videos. Collages and montages. All in double-quick time. For me the most creative was this mashup:

[Image: BillieTweets screenshot]

BillieTweets. Someone has taken a Billie Jean video and made the lyrics visual using tweets in which the relevant word has been highlighted. Follow the link to see how it works. [Thanks to the Scobleizer for the heads-up. And safe travels.]

All this is part of the magic of the web, the value that is generated when people have the right access and tools and ideas. Human beings are so incredibly creative.

In this context of creativity and web, Jonathan Zittrain, or JZ as he gets called, made a number of critical points in his excellent book The Future of the Internet And How to Stop It

[Image: book cover]

One of those key points is to do with the “generative” web, the phrase he uses to describe the open and innovative and creative aspects of the web; JZ spends time articulating the rise of locked-down devices, services and whole environments as a direct response to the ostensibly anarchic nature of the generative web, with its inherent vulnerabilities and weaknesses. [If you haven’t read the book, do so; it’s worth it.]

The implied tension between “generative” and “secure” in JZ’s book resonated, in a strange kind of way, with some of the ideas in Carlota Perez’s Technological Revolutions and Financial Capital:

[Image: book cover]

The book remains one of my all-time favourites; I’ve probably read it a dozen times since it was published. And I’ve given away many, many copies, something I have done with only a very small number of books, including The Social Life of Information, The Cluetrain Manifesto and Community Building on the Web.

The resonant piece was this: One of Perez’s seminal findings was the difference between financial capital and production capital.

In Perez’s view, financial capital “represents the criteria and behaviour of those agents who possess wealth in the form of money or other paper assets … their purpose remains tied to having wealth in the form of money (liquid or quasi-liquid) and making it grow. To achieve this purpose, they use … intermediaries … It is the behaviour of these intermediaries, while fulfilling the function of making money from money, that can be observed and analysed as the behaviour of financial capital.” In essence, financial capital serves as the agent for reallocating and redistributing wealth.

Perez goes on to say that “the term production capital embodies the motives and behaviours of those agents who generate new wealth by producing goods or performing services.”

Through these distinctions, she clearly delineates the differences between the “process of creating wealth and the enabling mechanisms”; these distinctions are then played out through a number of “surges” or paradigm shifts. An incredible book.

For some time now, I’ve been wrestling with the connections between Zittrain’s generative web and Perez’s production capital, and formed my own views of the progressive-versus-conservative tensions that can be drawn from such a juxtaposition.

All this came to the fore again in the context of copyright and content, as I read Diane Gurman’s excellent First Monday piece on Why Lakoff Still Matters: Framing the Debate on Copyright Law and Digital Publishing.

I give the abstract of the article here:

In 2004, linguist and cognitive scientist George Lakoff popularized the idea of using metaphors and “frames” to promote progressive political issues. Although his theories have since been criticized, this article asserts that his framing is still relevant to the debate over copyright law as applied to digital publishing, particularly in the field of scholarly journals. Focusing on issues of copyright term extension and the public domain, open access, educational fair use, and the stewardship and preservation of digital resources, this article explores how to advocate for change more effectively — not by putting a better “spin” on proposed policies — but by using coherent narratives to frame the issues in language linked to progressive values.

Reading the article took me back to Perez and to Zittrain. Our Lakoffian frames of “strict father” and “nurturant parent” are in many ways congruent with the generative-versus-secure and production-versus-financial continua described by JZ and Carlota. As Gurman says:

Lakoff’s nurturant parent embodies values of equality, opportunity, openness and concern for the general welfare of all individuals. Under the progressive economic model, markets should serve the common good and democracy…. The strict father frame, on the other hand, centres on issues of authority and control. The moral credo expresses the belief that if people are disciplined and pursue their self-interest they will become prosperous and self-reliant. The favoured economic model is that of a free market operating without government interference.

A free market operating without government interference. Hmmm. I remember those.

Despite the credit crunch, the economic meltdowns, the rise in fraud, despite the socialisation of losses and the privatisation of gains that ensued, many things have not changed. And they must. We need to move to a generative-internet, production-capital world. And for that maybe we need to think about what Diane Gurman is saying.

We need to frame our arguments around our values rather than just on the facts and figures; we need to weave a coherent narrative based on public benefit via empowerment and access.

We can see the implications of this divide in many of the arguments that are being had in the digital domain. For example, the recent announcement by Ofcom of its intention to enforce regulated access to premium (and hitherto exclusive) content is a case in point, where the same arguments prevail.

The response of the incumbent, while understandable, is benighted. You only have to look at the public benefit implications, particularly those to do with human progress and innovation.

The returns expected from production capital differ from those expected out of financial capital for a variety of reasons; the most important reason is that when you’re in the business of creating value and wealth, rather than redistributing it, the returns tend to be somewhat less than astronomical.

Thinking about innovation and business models

I’ve always maintained that people who “think opensource” work on useful things, solve problems, create value; they don’t focus on the business model at the outset but instead concentrate on the value they create.

In Peter Drucker’s words, “people make shoes, not money”. Make something that is worthwhile and people will pay you for it. Figure out what shoes you’re good at making and then make them well. You will make money as a result.

Knowing in advance how you’re going to make money from snake oil may sound like you have a business model; what you have is snake oil. And that’s the problem you need to concentrate on first, the fact that you’re not creating anything of value.

And sometimes the process of calculating and measuring benefits can get in the way. Many years ago, when I worked for Burroughs Corporation, I learnt this the hard way. This was the early 1980s, and software/services was just emerging as a business. Until then, all the margin was in hardware, so we “shifted tin”. We gave away the software and the services in order to sell the hardware. Then, as the cost of human capital rose, and investable capital became scarce, this equation began to shift. It became more and more important to understand the true cost of software projects before starting them.

So we instituted something called the Phase Review Process, borrowed from the US Navy if I remember correctly, and implemented it within the firm. Every project had to undergo a phase review at inception and then at each phase.

Which was all fine and dandy. Unless you were just about to start a project that would cost a total of £25,000 inclusive of everything. Which was less than the lowest possible total cost of the phase review process. But I was lucky, my management understood this issue, and it was mandated that projects had to exceed £100,000 in total planned cost before they needed to be put through the Phase Review Process.

Why am I writing all this? Well, some years ago I remember reading about something called the polypill; the newspaper articles referred to this paper which had been published in the BMJ in 2003.

The principle was simple. Six tried and tested medications to be combined into one pill that could potentially reduce cardiovascular disease by 80%.

When I first read the articles, I was intrigued. But I didn’t know much about the drugs involved. I knew nothing about statins, other than some vague notion that they were wonder drugs that combated high cholesterol with some wonder side effects. I knew even less about ACE inhibitors and beta-blockers, though I may have come across the beta-blockers as something to do with performance enhancement. Folic acid was something pregnant women took; and diuretics meant you had plumbing problems.

Aspirin I knew about, although I had no idea it could be obtained in cardio doses.

But that was in 2003. Since then, as many of you will know, I have had reason to get to know this particular cocktail of pharmacology quite intimately. Nevertheless, I’d forgotten all about the polypill.

Until a few weeks ago, when I read this on the BBC web site. The polypill could become reality in five years’ time, it said. And then I remembered what I’d read all those years ago, when they said … that the polypill could become reality in five years’ time.

And that made me think. Slowly. Very slowly. And my thoughts went a little like this:

One, cardiovascular disease is the single biggest cause of death facing humans.

Two, people had come up with a cheap and effective way of reducing the risk of cardiovascular disease by 80%.

Three, this had happened six or seven years ago.

Four, with a little bit of luck and a following wind, we may see something happen in five years.

Of course I’m oversimplifying, but I don’t believe I’m exaggerating. A strange world we live in.

I’m not by nature a conspiracy theorist. I believe man landed on the moon nearly forty years ago. I don’t believe in little green men or UFOs. Neither do I believe that Big Oil makes sure that substitutes for gasoline never surface.

But here is what I believe. I believe there is some evidence that the polypill does not exist today because it’s hard to make money from it.

Why? Because the ingredients in the polypill are all out of patent, all “generic”. Because of the way drugs are trialled, it’s prohibitively expensive to bring a new drug to market unless you have some monopoly rents to come, patents to exploit and exhaust.

So it is possible that the cost of trialling a cocktail of generic drugs exceeds the potential income from selling the cocktail. And so no polypill.
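The back-of-envelope shape of that argument can be sketched in a few lines. To be clear, every figure below is invented purely for illustration (the function and its numbers are my own, not real trial costs or drug prices):

```python
# A toy sketch of the polypill economics described above.
# All numbers are hypothetical, chosen only to show the shape of the
# argument: without a patent, the revenue window is tiny.

def polypill_case(trial_cost, price_per_year, patients, years_of_exclusivity):
    """Return the net position of funding a trial for an unpatentable pill."""
    # With generic ingredients there is no patent to keep competitors out,
    # so the sponsor captures only a brief window of sales before copies
    # arrive and the price collapses.
    revenue = price_per_year * patients * years_of_exclusivity
    return revenue - trial_cost

# Hypothetical numbers: a $300m trial, a $20-a-year pill, 5m patients,
# and a single year before generic copies arrive.
net = polypill_case(trial_cost=300e6, price_per_year=20,
                    patients=5e6, years_of_exclusivity=1)
print(net)  # negative: the trial costs more than the sales it unlocks
```

Swap in whatever numbers you like; the point is only that without exclusivity the revenue term collapses, and the trial cost dominates. Give the same sketch twenty years of patent protection and the sign flips, which is the monopoly-rent story in miniature.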

No mention of the number of lives potentially saved and minor stuff like that.

Now I take statins, beta blockers, ACE inhibitors, diuretics, blood thinners and anticoagulants daily. You could say I have an amateur interest in all this. A passion, even, given that the medication has worked wonders on my heart and on my life expectancy.

This is not meant to be a diatribe against doctors or the medical profession or even the pharmaceutical industry: they have all treated me really well, and I owe them a debt of gratitude.

What I am trying to do is to point out that sometimes we hold up innovation by concentrating on the wrong thing at the start. And sometimes it’s because of the anchors and frames of the way we do things.

So I was thinking. Opensource people solve generic problems. Is there a way to opensource the trials of generic drugs, to change the mechanics and dynamics of drug trials for generics? Is there a way to adopt the opensource principle of “privatising losses and socialising gains”, the exact opposite of what happened during the credit crunch?

I wonder.

Views?

Of markets and black swans and opensource and software

I was reading a column by Nassim Nicholas Taleb in the Financial Times yesterday, where he lists “ten principles for a Black Swan-proof world”. You can read the whole piece here, or download the pdf from Taleb’s own site.

I couldn’t help noticing how striking the relationship was, between what he was putting forward for financial markets, and what we consider generally to be the principles of opensource and of good software development. So I decided to list each of his principles below, with brief comments in italics seeking to explain the same thing in a software context.

  1. What is fragile should break early while it is still small. Make sure you test early, test often, test small.
  2. No socialisation of losses and privatisation of gains. Opensource is pretty much predicated on this: “losses” are borne by individual contributors, “gains” are shared by all participants
  3. People who were driving a school bus blindfolded (and crashed it) should never be given a new bus. Opensource communities make use of tools like commit logs for this very purpose, looking at the prior contributions made by a participant before letting the changes in
  4. Do not let someone making an “incentive bonus” manage a nuclear power plant — or your financial risks. We’ve all seen what happens when incentives for technology staff are not aligned properly with business objectives. This is why the most important job of a CIO is “dial-tone”, reliable secure operations at an affordable price
  5. Counterbalance complexity with simplicity. Build software on a “high cohesion, loose coupling” basis
  6. Do not give children sticks of dynamite, even if they come with a warning. Make sure that the people who make software decisions actually have software experience
  7. Only Ponzi schemes should depend on confidence. Governments should never need to “restore confidence”. How many Linux ads did you see when Linux came out?
  8. Do not give an addict more drugs if he has withdrawal pains. Don’t cure proprietary-software addictions by giving people more proprietary software
  9. Citizens should not depend on financial assets or fallible “expert advice” for their retirement. Rely on something real. Code. Code is King. Not slideware.
  10. Make an omelette with the broken eggs. Again an opensource principle. Cannibalise. Reuse.
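Principle 1 has a very concrete software reading. A minimal sketch in Python, with an invented function purely for illustration:

```python
# "What is fragile should break early while it is still small."
# A deliberately small function and an equally small test for it: the
# point of "test early, test small" is that fragile logic breaks here,
# in a cheap unit test, not later in production.

def parse_version(s):
    """Parse a dotted version string like '2.10.3' into a tuple of ints."""
    return tuple(int(part) for part in s.split("."))

def test_parse_version():
    # Each assertion exercises one tiny behaviour; a failure pinpoints
    # the break while it is still small and cheap to fix.
    assert parse_version("1.0") == (1, 0)
    assert parse_version("2.10.3") == (2, 10, 3)

test_parse_version()
```

The same discipline scales up: small pieces, tested as they are built, are the software equivalent of letting the fragile thing break while it is still small.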

So what would happen if financial markets were run on opensource principles? Complete transparency. Open inspection. Visible track records. Compartmentalisation of losses, sharing of gains. Moderation not regulation. And yes, the capacity to “fork”.

[Update, with Tongue Firmly in Cheek: If the banking system needs to learn from the software industry, is the opposite true as well? Are we approaching a time when large software firms will need bailing out? http://www.infoworld.com/d/adventures-in-it/microsoft-asks-feds-bailout-720 ]

Code_swarm and community

I wrote recently about a conversation with Jerm about commit logs, opensource and hiring; Ted chipped in to the debate with a reminder for us to visit code_swarm, a project I’d been aware of but only peripherally. There is much about the project that makes it interesting, even remarkable; one that gives me personal pleasure is the usage of Processing as the visualisation tool.

What Michael Ogawa has done is find better and better ways to visualise the human interactions that take place in software development, particularly community-based development. Many of the lessons we’ve learnt in opensource are made tangible and graspable by all, just by watching what happens. The organic nature of the process is brought out beautifully. Anyone interested in opensource and community-based development would do well to take a look. I think there are applications for the use of code_swarm in many open multisided platforms; as and when we use them I shall keep you posted.

In the meantime, I hope you enjoy the visualisations created by Michael and his team. Our thanks to them.

Freewheeling about excavating information and stuff like that

Do you remember enterprise application integration? Those were the days. First you paid to bury your information in someone’s proprietary silo, then you paid to excavate it from there, then you paid again to bury it in someone else’s silo. Everybody was happy. Except for the guys paying the bills.

I went to see the guys in Osmosoft yesterday; it’s always a pleasure visiting them. At BT Design, our approach to innovation has a significant community focus: Web21C, now integrated into Ribbit, was formed on that basis; both Osmosoft and Ribbit are excellent examples of what can be done with open multisided platforms.

While I was there, I spent some time with Jeremy Ruston who founded the firm and leads the team. Incidentally, it was good to see Blaine Cook there, I hadn’t seen him since he joined BT. Welcome to the team, Blaine.

When it comes to opensource, Jeremy’s one of the finest brains I know; we’re really privileged to have him. We got to talking, and somehow or other, one of the topics that came up was the ways and means we have to figure out if someone’s any good, in the context of hiring. After all, there is no strategy in the world that can beat the one that begins “First hire good people”.

When you’re hiring people with experience, the best information used to come from people you knew who’d already worked with her or him. Nothing beats a good recommendation from a trusted domain. You can do all the interviews you want, run all the tests you can find, do all the background searching you feel like; over time, the trusted domain recommendation trumps the rest.

Now obviously this does not work when the person has not worked before, where there is no possibility of a trusted domain recommendation. Which is why people still use tests and interviews and background checks.

Which brings me to the point of this post. Jeremy brought up an issue that he’d spoken to me about quite some time ago, something I’m quite keen on: the use of subversion commit logs as a way of figuring out how good someone is.

And that got me thinking. Here we are, in a world where people are being told: Don’t be silly and record what you do in Facebook; don’t tell people everything you do via Twitter; don’t this; don’t that; after all, the bogeyman will come and get you, all these “facts” about your life will come back to haunt you.

As a counterpoint to this, we have the opensource community approach. Do tell everyone precisely what you are doing, record it in logs that everyone can see. Make sure that the logs are available in perpetuity. After all, how else will people find out how good you are?
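The idea of reading ability off an open commit log can be sketched very simply. The snippet below parses Subversion-style `svn log --xml` output and tallies commits per author; the sample log, revision numbers and names are invented for illustration:

```python
# A minimal sketch of treating a public commit log as an open record of
# someone's work: parse svn-log-style XML and count commits per author.
# The sample log below is made up purely for illustration.
import xml.etree.ElementTree as ET
from collections import Counter

SAMPLE_LOG = """<?xml version="1.0"?>
<log>
  <logentry revision="101"><author>jeremy</author><msg>Fix wikitext parser</msg></logentry>
  <logentry revision="102"><author>blaine</author><msg>Add federation hooks</msg></logentry>
  <logentry revision="103"><author>jeremy</author><msg>Refactor saver modules</msg></logentry>
</log>"""

def commits_by_author(log_xml):
    """Count commits per author in an svn-log-style XML document."""
    root = ET.fromstring(log_xml)
    return Counter(entry.findtext("author") for entry in root.iter("logentry"))

print(commits_by_author(SAMPLE_LOG))  # Counter({'jeremy': 2, 'blaine': 1})
```

A raw count is of course the crudest possible signal; the real value is in reading the commits themselves, which is precisely what a public log in perpetuity makes possible.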

Transparency can and should be a good thing. Abundant transparency can and should be a better thing, rather than scarce transparency. Right now we have a lot of scarce transparency; people can find out things about you, but only some people. Which would be fine, if you could choose who the people were. Do you have any idea who can access your credit rating? Your academic records? Do you have any idea who decided that?

Scarce information of this sort leads to secrets and lies and keeps whole industries occupied. Maybe we need to understand more about how the opensource community works. Which, incidentally, is one of the reasons why BT chose to champion Osmosoft.

An aside: David Cushman, whom I’d known electronically for a while, tweeted the likelihood of his being near the new Osmosoft offices around the time of my visit, so it made sense to connect up with him as well. It was good to meet him, and it reminded me of something I tweeted a few days ago. How things change. In the old days relationships began face to face and over time moved into remote and virtual and electronic. Nowadays that process has been reversed. Quite often, you’ve known someone electronically for a while, then you get to meet them. Intriguing.

Finally, my thanks to gapingvoid for the illustration, which I vaguely remembered as “Excavation 47”. It was a strange title so it stuck. Which reminds me, I have to start saving up to buy one of his lithographs, they’re must-haves.