Fossil files

I’m in the middle of moving offices, with all the packing/unpacking/throwing away it entails. There is a cathartic feel to it.

And, for an information ferret like me, there is enormous temptation to stop and read bits of things I have resolutely refused to throw away. Occasionally, I give in to that temptation. To read. Never to throw away.
This morning I came across an old New Yorker article by Malcolm Gladwell, one that I had filed for a different reason… to remind me to order a copy of any book that contained the article “The Dark Side of Charisma” by Hogan, Raskin and Fazzini. Which I finally did today, nearly four years later, by ordering Measures of Leadership by Kenneth E. Clark & Miriam B. Clark (Eds.).

But I couldn’t help re-reading the Gladwell article. And I just loved the ending. I quote:

They [the consultants] were there looking for people who had the talent to think outside the box. It never occurred to them that, if everyone had to think outside the box, maybe it was the box that needed fixing. Fossilfool time again.

I think the same is true for walled-garden approaches to information and permissioning. In true sixties Suppose-They-Gave-A-War-And-Nobody-Came fashion, what happens if there are DRM/IPR walled gardens everywhere, but all the co-creation is taking place in the open spaces in between? We’re soon going to find out.

Four Pillars: Digital Equivalents of Opposable Thumbs


Over the last five years, my thoughts and actions have moved more and more into a community-driven, market-standard-based, opensource-influenced, platform-independent and device-agnostic IT world.

This was not always the case.

Originally I was comfortable with the notion that TCOs for enterprise IT were optimised by strategic vendor relationships, and that one could outsource many of the problems of software development, deployment and operations to specified vendors. The reasoning was simple: prevent undue proliferation of architectures, toolsets, standards and methodologies by sticking with one vendor, in effect leveraging that vendor’s “ecosystem”. Reduce EAI and regression testing costs as a result, improve time to market and use strategic procurement techniques to drive vendor costs down further.
This was seen as great from a control and governance perspective: it yielded readymade reams of stuff you could use for setting out source and target architectures, the roadmaps from source to target, the standards that drove people towards the target architecture, the strategic decisions underlying the roadmaps, and the detailed implementation plans that went with each piece. And the Control Gods looked at it and saw that it was good.

It was good. On paper. And probably very meaningful for environments that had all the following characteristics:

  • a stable 3-year business environment and outlook
  • a stable 3-year management environment and outlook
  • a technology environment that was largely homogeneous to begin with
  • a single-vendor “ecosystem” that covered a meaningful proportion of the technical needs
  • a complete lack of a blame culture amongst business sponsors, so that decisions could be reviewed and changed if needed
  • a complete lack of Not-Invented-Here amongst the inhouse developer community, so that tendencies to reinvent wheels and mousetraps were avoided

Reality tended to be slightly different from this, and as a result, the gains to be had from relatively narrow single-vendor-ecosystem strategies were not there. Enterprise application integration costs tended to skyrocket, either visibly through longer integration, regression and acceptance testing, or less visibly through high incidences of bugs and reduced systems availability. Problems were exacerbated by the existence of multiple proprietary architectures each hell-bent on non-cooperation with the rest.

As people recognised that hybrid environments were the norm and not the exception, we had to find ways of solving the EAI problem. So then we got comfortable with:

  • attempts at bus-based architectures and increased componentisation to simplify the technology foundations
  • attempts at use of rapid prototyping, RAD, XP, pair-programming to improve the quality of requirements capture;
  • attempts at use of time-boxing and time-placing to reduce scope creep;
  • attempts at use of options theory to augment classical DCF theory to improve IT investment appraisal
  • attempts at outsourcing, far-shoring, near-shoring and here-shoring to improve access to skills and reduce wage bills
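The options-theory point deserves a small worked sketch. Here is a toy Python illustration, with every figure and the one-step binomial setup invented purely for illustration, not a real appraisal method: a cheap pilot that preserves the right, but not the obligation, to invest later can be worth more than a commit-everything DCF view, even when the plain expected NPV is zero.

```python
# Toy contrast between classical DCF and an options-flavoured view.
# All numbers are invented; risk-neutral pricing niceties are ignored.

def npv(cash_flows, rate):
    """Classical DCF: discount a list of yearly cash flows to today."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def real_option_value(up_value, down_value, prob_up, cost, rate):
    """One-step binomial: pay `cost` next year only if the project is worth it."""
    up = max(up_value - cost, 0.0)      # exercise only in the good state
    down = max(down_value - cost, 0.0)  # walk away otherwise
    return (prob_up * up + (1 - prob_up) * down) / (1 + rate)

rate = 0.10
# Commit 100 today: a 50/50 chance the payoff next year is 180 or 40.
commit_now = 0.5 * npv([-100, 180], rate) + 0.5 * npv([-100, 40], rate)
# Pilot first: spend 10 today, decide next year whether to invest the 100.
pilot_then_decide = -10 + real_option_value(180, 40, 0.5, 100, rate)

# The option to wait beats blind commitment in this toy setup.
print(f"commit now: {commit_now:.1f}, pilot first: {pilot_then_decide:.1f}")
```

The flexibility to abandon is what classical DCF prices at zero, and what the options view makes visible.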

These were different ways of ensuring we Did The Right Thing and drove the best value out of technology spend, but they failed to appease the control gods. Each of the techniques stated above placed some level of strain on the way we had historically communicated what we did. Architecture and standards and roadmaps and strategy papers and implementation plans became harder to maintain to any worthwhile accuracy. It did not mean that work was not being done, just that the reporting mechanisms of the past struggled with the present.

The static approaches could not cater for the repeated one-offs such as EMU and Y2K and Basle II and IAS and IFRS and US GAAP and Sarbanes-Oxley, to name but a few. Consultants understood this and exploited the opportunities to the full. And those that were left out of the first feeding frenzy focused hard on Six Sigma and Balanced Scorecard and Smartsourcing and EVA and BPM and and and, corrupting the difference between published and living reality even more. There were many emperors and many sets of new clothes.

Thankfully, Moore’s Law and Metcalfe’s Law, the consequent price-performance gains, and the significant recessionary market pressures at the turn of the century all meant that real IT costs continued to fall, at least until the surpluses were eaten up doing “mandatory” consultant-generated programmes.

In the meantime, IT departments the world over learnt to make silk purses out of sows’ ears. They learnt to do more and more with less and less as business needs and markets became more volatile. They learnt to move painfully from raw inventory management through asset management to a portfolio-based approach to managing IT. They learnt that the complexity of their environments grew on a power law basis as everything became connected.
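That power-law claim can be made concrete with a back-of-envelope sketch (the system counts are illustrative): connecting n systems point-to-point needs on the order of n² links, which is precisely the pain that the bus-based architectures mentioned earlier were meant to relieve.

```python
# Why connected environments feel like they grow on a power-law basis:
# point-to-point integration scales quadratically, a shared bus linearly.

def point_to_point_links(n):
    """Every system talks directly to every other: n*(n-1)/2 links."""
    return n * (n - 1) // 2

def bus_adapters(n):
    """With a shared message bus, each system needs just one adapter."""
    return n

for n in (10, 50, 200):
    print(f"{n} systems: {point_to_point_links(n)} direct links vs {bus_adapters(n)} bus adapters")
```

At 200 systems the direct-link count is nearly twenty thousand, which is the sense in which “everything became connected” made complexity explode.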
In the midst of all this, some good things happened. The opensource movement got some real traction, and Doc Searls’ D-I-Y IT models came closer to reality. Tools to support Four Pillars became more readily available and more consistent. Telephony became software. Cool design became important again as a result of the iPod halo. Flash memory and NAND RAM began to drive the environment differently, just as virtualisation and service orientation were becoming reality.

A new set of ecosystems was also emerging. Ecosystems that weren’t single-vendor, yet still delivered reduced TCO. Ecosystems that were adaptive and responsive to external stimuli. Ecosystems built on community standards with market-driven principles. Four-pillar tools that supported search, syndication, fulfilment and conversation came from different vendors but worked together. That allowed information to move freely across their perceived boundaries, at the behest of the owner of the information. [Which was NEVER the vendor, anyway].

Which brings me to the point of this long post.

We are on the verge of a new digital revolution. As with the Industrial Revolution, this means we will need new sets of “machine tools”. Machine tools with a difference.

The Wikipedia definition of Machine Tool:

A machine tool is a powered mechanical device, typically used to fabricate metal components of machines by the selective removal of metal. The term machine tool is usually reserved for tools that use a power source other than human movement, but they can be powered by people if appropriately set up […]. Devices that fabricate components by selective addition of material are called rapid prototyping machines.

The age we’re entering needs three types of “machine tool”:


  • Those that selectively remove information to create product, the “old way”
  • Those that selectively add information to create product, the “web” way
  • Those that selectively add information to co-create product, the “tomorrow” way

Man learnt to really use tools when he designed them to make use of his opposable thumbs.
We need to discover what the digital equivalents of the thumbs are, before we learn to use the tools properly.

And my hunch is that the answer lies in the identity and authentication and permissioning space; that is what will take our nascent machine tools to Main Street.

Methinks the Lady’s Competitors Doth Protest Too Much

Like everyone else, I had heard rumours and read the odd blog post about Mark Thompson’s Fleming Memorial Lecture last night. [Mark is the Director-General of the BBC]. And I thought to myself, it could just be possible that the Beeb were about to get it. Then, Malc of Accidental Light bludgeoned me into reading the full text of the lecture. Just by saying one word. Seminal. Thank you Malc.

Anyone interested in the future of what was “big media”, or for that matter the fast-converging TMT sector, should read the full text. Which you can find here. I’ve now read it fully once, and absolutely love what it says. I’m not going to summarise it, instead I will encourage you to read it for yourself.
Sure, there are pitfalls. Some will worry that product placement and stealth advertising are unavoidable, and that the strategy espoused will actually help make this happen. I don’t care. If there was something I worried about, it would be a lack of detail about platform/device independence from a consumer standpoint. More later.

In the meantime, take a look at the squeals coming from the trad feeding trough, linked here. Rivals criticise BBC strategy. Hilarious.

If they pull this off (and I hope they will) then I look forward to watching Walking With Dinosaurs (the Remake), chronicling the end of old media. Because that’s what it looks like.

Four Pillars: Taking the Empire out of Foundation and Empire

To me this particular Empire is all to do with proprietary behaviour, be it about Lock-in Layers or Digital Wrongs or Intellectual Property Wrongs.

So let me tell some stories.
I found the picture below in Tantek Celik’s Flickr, getting there via Matt: ComicPress via my own WordPress dashboard. Thank you Tantek. Thank you Matt.
[Image: Tantek’s photograph of a row of soap dispensers]

Tantek wondered about the number of soap dispensers visible and the failure scenarios in terms of vendors, refills and service that could have led to the nonsense in the picture. You can read his Flickr “blog” here. My not-really-cynical interpretation of Dispenser Row is that there were cyclical changes in staffing in the Procurement Department, and every change meant a new vendor for things like soap and loo paper and towels and driers. And it was probably cheaper to leave the old ones there than to remove and repair the unsightly holes in the mirror.
Then I saw this story in the Financial Times, about a Chinese DVD producer who was undercutting the piracy market there by selling cheap legal DVDs. The story included the immortal phrases below:

Some international companies have already begun to respond. Warner Home Video’s Chinese joint venture CAV Warner this month began trial sales of a modestly packaged DVD edition of The Aviator priced at just Rmb12, and already issues some DVDs in China a month earlier than in the US.

So let me get this right. If I’ve understood the story correctly, Zoke is a Chinese DVD manufacturer who made the news by launching “retail sales of The Promise just 15 days after it hit local cinemas. And the Promise DVD sold for just Rmb10 (US$1.25) — a price that contrasts with the US$20-30 typically charged for a recent-release Hollywood picture.” Those quotes are from the FT story. You can read more Zoke stories here.
What’s wrong with this picture, I keep wondering. Chinese audiences now get legal DVDs earlier than in the US, for about a twentieth of the US price. Yup, that would do it. That would kill “piracy”.

Makes you think a bit about what causes “piracy” in the first place. I think I would like DVDs at a twentieth of the price and released two weeks after the film hits the big screen. I look forward to the next unsightly mess, as there are attempts made to stop the grey exports.

Random Walk Three is around my own kitchen. Our dishwasher broke down; it was cheaper to buy new than to repair. And then I had to do this very frustrating thing. Pay twice as much to “integrate” the device into my kitchen as to have it stand-alone. I don’t know what the US market does for this, but in the UK, if I pay X for a free-standing household appliance, I tend to have to pay 2X for the easily-integrable-cover-off version. So I have the choice of paying 2X or spoiling the consistency of kitchen cabinet facades.

And this made me think of the more proprietary software vendors and how they act. They seem to think it’s okay to say to us “If you want the luxury of our stuff working with your stuff and being coherent and consistent, then you’re going to have to pay. A lot”. In any other world I would be suing someone. But try and find out what rights you have as a software consumer. Diddly squat. Or maybe that’s unfair. You have the right to inadvertent trespass on the vendor’s rights, and the privilege, no, right, of being sued if this happens.

The final random walk is a reverse reverse chronological one, looking at Tim Berners-Lee’s first blog post. I reproduce it in its entirety here:

  • So I have a blog

    Submitted by timbl on Mon, 2005-12-12 14:52. ::
    In 1989 one of the main objectives of the WWW was to be a space for sharing information. It seemed evident that it should be a space in which anyone could be creative, to which anyone could contribute. The first browser was actually a browser/editor, which allowed one to edit any page, and save it back to the web if one had access rights.

    Strangely enough, the web took off very much as a publishing medium, in which people edited offline. Bizarrely, they were prepared to edit the funny angle brackets of HTML source, and didn’t demand a what you see is what you get editor. WWW was soon full of lots of interesting stuff, but not a space for communal design, for discourse through communal authorship.

    Now in 2005, we have blogs and wikis, and the fact that they are so popular makes me feel I wasn’t crazy to think people needed a creative space. In the mean time, I have had the luxury of having a web site to which I have write access, and I’ve used tools like Amaya and Nvu which allow direct editing of web pages. With these, I haven’t felt the urge to blog with blogging tools. Effectively my blog has been the Design Issues series of technical articles.

    That said, it is nice to have a machine to do the administrative work of handling the navigation bars and comment buttons and so on, and it is nice to edit in a mode in which you can do limited damage to the site. So I am going to try this blog thing using blog tools. So this is for all the people who have been saying I ought to have a blog.

Thank you, Sir Tim. [Yes I did change the spelling of discourse and bizarre, please forgive my editorial foibles].

Intriguing to think that, as averred by Tim, we were willing to do the funny-angled-bracket thing and not insist on WYSIWYG.

Not any more. The consumerisation and Generation M movements have made sure of that. When we scaled out our wiki implementation, the first “customer” (i.e. non-IT) screams were for WYSIWYG. And not as a nice-to-have. But on a “what the hell are you playing at?” basis.

Random walks over. Let’s summarise.

  • Some of the problems we have today are caused by our own buying behaviour; we think we procure well, but in software and hardware terms we’re aeons away from where we should be. And we don’t know much about decommissioning. When we get taken out to lunch, we don’t realise just how often we are lunch. Tantek’s soap dispensers are an example.
  • Some problems are caused by sheer greed on the part of some vendors. Prices that are not set by cost or by “what the market will bear”, but instead based on “what we can get away with”. The Zoke story is an example of how we can change this.
  • Yet other problems are caused by real misconceptions as to who the customer is and what rights the customer has. Whose data it is. Whose systems. Whose flows. Whose processes. And at what price. The kitchen appliance example tries to show this.
  • And yet more problems are caused by our own willingness to continue with a “holy of holies” approach to IT. Following the footsteps of doctors, lawyers, priests. You won’t understand it, it’s too complex. We know best. We have our jargon, our rituals, our secret conclave. And you can’t come in. The Berners-Lee inertia story tries to capture this.

Making software platform-independent and device-agnostic. Minimising the costs of enterprise application integration by becoming smarter at what we do, not just blindly driven by yesterday (usually vendor-defined and matured to perfection) process. Using the opensource community as our Zoke and getting things at a twentieth of the price and months early. Avoiding stupidity in DRM and IPR. Bringing a bit of Ralph Nader consumerism into our community.

And guys, it probably begins with the internet. So once again, do whatever you can to support what’s happening at Pulver.

Save the internet.

Four pillars: The disaggregation and reaggregation of search

Brendon Mclean tipped me the wink on Splunk. A search engine explicitly for logs and message queues and database transactions and the like, “IT information”. Some time ago Chris Locke had told me about Krugle. Finding source code and related technical documentation. Dohop, from Iceland, concentrates on building the best travel search engine. Dibdabdoo is all about hand-finished web laundering for kids, using human judgement to validate kid-friendly content.

So. While Google go serious on Appliance and One Box, and people like FAST come at the enterprise in a different way, there are people spending time and energy building specialised search. And I’m still trying to work out why.

So I tried to see what attributes search could have. For example:

  • The space being covered: a disk or server or many of them, at one’s desk, behind a firewall, everywhere, the web proper.
  • The type of thing being covered: text or file or image or music or whatever, as narrow or as broad as needed
  • The way the space is covered or indexed or checked for changes.
  • The way the searcher interacts with the engine and the engine with the searcher, including personalisation and relevance heuristics
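To make the “type of thing being covered” attribute concrete, here is a hedged toy sketch in Python, emphatically not how Splunk actually works: a log-only index that can exploit type-specific structure (severity levels) which a generic text engine would ignore. All log lines and field conventions are invented.

```python
from collections import defaultdict

class LogSearch:
    """Toy Splunk-flavoured index, specialised on one 'type of thing':
    log lines with a leading severity, so searches can filter by level."""

    def __init__(self):
        self.by_term = defaultdict(set)   # term -> set of line numbers
        self.by_level = defaultdict(set)  # severity -> set of line numbers
        self.lines = []

    def add(self, line):
        i = len(self.lines)
        self.lines.append(line)
        level, _, message = line.partition(" ")
        self.by_level[level.upper()].add(i)
        for term in message.lower().split():
            self.by_term[term].add(i)

    def search(self, term, level=None):
        hits = self.by_term.get(term.lower(), set())
        if level is not None:
            hits = hits & self.by_level.get(level.upper(), set())
        return [self.lines[i] for i in sorted(hits)]

idx = LogSearch()
idx.add("ERROR payment gateway timeout")
idx.add("INFO payment accepted")
idx.add("ERROR disk quota exceeded")

print(idx.search("payment"))                 # both payment lines
print(idx.search("payment", level="ERROR"))  # only the gateway timeout
```

The specialisation is the point: because the engine knows its corpus is logs, it can index a severity field that a one-size-fits-all crawler would treat as just another word.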

Early search was all about the space being covered and the way it was covered or made relevant. And as I understand more about the Splunks and Krugles of this world, the bulk of today’s innovation seems to be about the “type of thing being covered”, with a little bit on the interaction between searcher and engine. iTunes search became Spotlight this way, I guess.
I wonder. I promised Steve Patrick and Phil Dawes I would never start a “semantic web project” at the bank, because our own internal equivalents of industry body and standards body and vendor would kick into overdrive to kill it every which way, a sort of natural antibody ever-present in large organisations. What I wanted instead was a Steven Johnson emergence.

Maybe this, the emergence of the Krugles and the Splunks, is how some parts of the semantic web will come to be. The data that Tim Berners-Lee wants to see migrating to the web may not always get there via standards like RDF, however hard we try. Because standards are meat and drink to lock-in specialists, about as meaningful and as useful as governments and regulators in preventing lock-in.

But a million different Krugles and Splunks covering different areas deeply and doing it in such a way that information ecosystems can evolve? Some sort of high-cohesion-loose-coupling approach to layered search. Open on standards and agnostic on platforms and opensource in approach. [Opensource free as in freedom, not as in gratis, in case people think otherwise]. Guerilla and emergent in business model and approach. Maybe.
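Here is a hedged sketch of what that high-cohesion-loose-coupling layering could look like, with invented corpora, names and engines: each vertical engine stays deep and self-contained, and the only coupling is a single one-function contract that a merging layer fans queries across.

```python
# Federating many deep vertical engines behind one thin interface.
# The engines and corpora below are invented stand-ins for the likes of
# Krugle (code search) and Splunk (log search).

from typing import Callable, Dict, List

# The loose-coupling contract: a query string in, ranked hits out.
Engine = Callable[[str], List[str]]

def federated_search(engines: Dict[str, Engine], query: str) -> Dict[str, List[str]]:
    """Fan the query out to every vertical engine, keeping results per source."""
    return {name: engine(query) for name, engine in engines.items()}

code_corpus = ["def parse_feed(url): ...", "def merge_sort(xs): ..."]
log_corpus = ["WARN feed parser slow", "INFO cache hit"]

engines: Dict[str, Engine] = {
    "code": lambda q: [d for d in code_corpus if q in d],
    "logs": lambda q: [d for d in log_corpus if q in d.lower()],
}

results = federated_search(engines, "feed")
print(results)
```

Each engine is free to be as deep and idiosyncratic as it likes internally; the ecosystem only depends on the thin, agreed interface, which is the loose coupling that lets new Krugles and Splunks plug in without a standards body blessing them first.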

I could be talking absolute tosh, but isn’t that partly what blogs are for? To start the snowballs rolling and to see what happens. If I have to go by the progress made by the zillions of standards bodies in IT, I’d rather back the guerilla approach.