Over the last five years, my thoughts and actions have moved more and more into a community-driven, market-standard-based, opensource-influenced, platform-independent and device-agnostic IT world.
This was not always the case.
Originally I was comfortable with the notion that total cost of ownership (TCO) for enterprise IT was optimised by strategic vendor relationships, and that one could outsource many of the problems of software development, deployment and operations to selected vendors. The reasoning was simple: prevent undue proliferation of architectures, toolsets, standards and methodologies by sticking with one vendor, in effect leveraging that vendor’s “ecosystem”. Reduce enterprise application integration (EAI) and regression testing costs as a result, improve time to market, and use strategic procurement techniques to drive vendor costs down further.
This was seen as great from a control and governance perspective: it yielded readymade reams of stuff you could use for setting out source and target architectures, the roadmaps from source to target, the standards that drove people towards the target architecture, the strategic decisions underlying the roadmaps, and the detailed implementation plans that went with each piece. And the Control Gods looked at it and saw that it was good.
It was good. On paper. And probably very meaningful for environments that had all the following characteristics:
- a stable 3-year business environment and outlook
- a stable 3-year management environment and outlook
- a technology environment that was largely homogeneous to begin with
- a single-vendor “ecosystem” that covered a meaningful proportion of the technical needs
- a complete lack of a blame culture amongst business sponsors, so that decisions could be reviewed and changed if needed
- a complete lack of Not-Invented-Here amongst the inhouse developer community, so that tendencies to reinvent wheels and mousetraps were avoided
Reality tended to be slightly different from this, and as a result, the gains to be had from relatively narrow single-vendor-ecosystem strategies were not there. Enterprise application integration costs tended to skyrocket, either visibly through longer integration, regression and acceptance testing, or less visibly through high incidences of bugs and reduced systems availability. Problems were exacerbated by the existence of multiple proprietary architectures each hell-bent on non-cooperation with the rest.
As people recognised that hybrid environments were the norm and not the exception, we had to find ways of solving the EAI problem. So then we got comfortable with:
- attempts at bus-based architectures and increased componentisation to simplify the technology foundations
- attempts at use of rapid prototyping, RAD, XP, pair-programming to improve the quality of requirements capture
- attempts at use of time-boxing and time-placing to reduce scope creep
- attempts at use of options theory to augment classical DCF theory to improve IT investment appraisal (see the sketch after this list)
- attempts at outsourcing, far-shoring, near-shoring and here-shoring to improve access to skills and reduce wage bills
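To make that last attempt concrete, here is a minimal sketch of classical DCF appraisal augmented with a simple one-step binomial “option to defer”. The cash flows, discount rate and probabilities are illustrative assumptions, not figures from any real programme.

```python
# Minimal sketch: classical DCF appraisal plus a one-step binomial
# "option to defer" an IT project. All figures (cash flows, discount
# rate, up/down outcomes, probabilities) are illustrative assumptions.

def npv(cash_flows, rate):
    """Classical DCF: discount each year's cash flow back to today."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def deferral_option_value(invest_now_npv, up_npv, down_npv, p_up, rate):
    """One-step binomial view of the option to wait a year.

    If we defer, next year we invest only in the favourable state
    (taking max(state NPV, 0)), then discount that expectation back.
    The option value is the excess over committing today.
    """
    wait_value = (p_up * max(up_npv, 0) + (1 - p_up) * max(down_npv, 0)) / (1 + rate)
    return max(wait_value - max(invest_now_npv, 0), 0)

if __name__ == "__main__":
    rate = 0.10
    # Year 0 outlay of -1000, then five years of 260 inflows.
    base = npv([-1000, 260, 260, 260, 260, 260], rate)
    # If we wait a year, uncertainty resolves: strong or weak demand.
    up = npv([-1000, 340, 340, 340, 340, 340], rate)
    down = npv([-1000, 180, 180, 180, 180, 180], rate)
    option = deferral_option_value(base, up, down, p_up=0.5, rate=rate)
    print(f"Commit-now NPV: {base:.0f}")
    print(f"Value of the option to defer: {option:.0f}")
    print(f"Expanded NPV (DCF + option): {max(base, 0) + option:.0f}")
```

In this toy case the commit-now NPV is slightly negative, yet the flexibility to wait a year and invest only in the favourable state is worth something; that is the kind of value classical DCF alone never shows.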
These were different ways of ensuring we Did The Right Thing and drove the best value out of technology spend, but they failed to appease the Control Gods. Each of the techniques above placed some level of strain on the way we had historically communicated what we did. Architecture and standards and roadmaps and strategy papers and implementation plans became harder to maintain to any worthwhile accuracy. That did not mean work was not being done, just that the reporting mechanisms of the past struggled with the present.
The static approaches could not cater for the repeated one-offs such as EMU and Y2K and Basle II and IAS and IFRS and US GAAP and Sarbanes-Oxley, to name but a few. Consultants understood this and exploited the opportunities to the full. And those that were left out of the first feeding frenzy focused hard on Six Sigma and Balanced Scorecard and Smartsourcing and EVA and BPM and and and, widening the gap between published reality and living reality even more. There were many emperors and many sets of new clothes.
Thankfully, Moore’s Law and Metcalfe’s Law, the consequent price-performance gains, and the significant recessionary market pressures at the turn of the century meant that real IT costs continued to fall, at least until the surpluses were eaten up doing “mandatory” consultant-generated programmes.
In the meantime, IT departments the world over learnt to make silk purses out of sows’ ears. They learnt to do more and more with less and less as business needs and markets became more volatile. They learnt to move painfully from raw inventory management through asset management to a portfolio-based approach to managing IT. They learnt that the complexity of their environments grew on a power law basis as everything became connected.
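To put a rough number on that power-law claim: with point-to-point integration, n connected systems imply on the order of n² potential interfaces. The sketch below, with arbitrary illustrative system counts, shows how quickly that bites.

```python
# A rough illustration of why "everything became connected" hurts:
# with n systems and point-to-point integration, the number of
# potential interfaces grows quadratically, n * (n - 1) / 2.
# The system counts below are arbitrary illustrative figures.

def potential_interfaces(n_systems: int) -> int:
    """Number of distinct pairwise links between n systems."""
    return n_systems * (n_systems - 1) // 2

for n in (10, 50, 100, 500):
    print(f"{n:>4} systems -> {potential_interfaces(n):>7} potential interfaces")
```

Ten systems give 45 potential interfaces; five hundred give well over a hundred thousand. No wonder the static reporting mechanisms struggled.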
In the midst of all this, some good things happened. The opensource movement got some real traction, and Doc Searls’ D-I-Y IT models came closer to reality. Tools to support Four Pillars became more readily available and more consistent. Telephony became software. Cool design became important again as a result of the iPod halo. Flash memory and NAND RAM began to drive the environment differently, just as virtualisation and service orientation were becoming reality.
A new set of ecosystems was also emerging. Ecosystems that weren’t single-vendor yet still delivered reduced TCO. Ecosystems that were adaptive and responsive to external stimuli. Ecosystems built on community standards with market-driven principles. Four-pillar tools that supported search, syndication, fulfilment and conversation came from different vendors but worked together. That allowed information to move freely across their perceived boundaries, at the behest of the owner of the information. [Which was NEVER the vendor, anyway].
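As a small, hedged illustration of what “built on community standards” buys you: because syndication is an open format (Atom, RFC 4287), any consumer can read any publisher’s feed using nothing more than a standard XML parser. The feed document below is a made-up example, not any vendor’s real output.

```python
# Sketch of the "works together" point: syndication is a community
# standard (Atom, RFC 4287), so any consumer can read any publisher's
# feed with nothing more than the standard library.
# The feed below is a made-up example document.

import xml.etree.ElementTree as ET

ATOM_NS = {"atom": "http://www.w3.org/2005/Atom"}

SAMPLE_FEED = """<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example weblog</title>
  <entry>
    <title>Machine tools for the digital revolution</title>
    <link href="http://example.org/2006/03/machine-tools"/>
    <updated>2006-03-26T18:30:02Z</updated>
  </entry>
</feed>"""

def entries(feed_xml: str):
    """Yield (title, link) pairs from an Atom feed document."""
    root = ET.fromstring(feed_xml)
    for entry in root.findall("atom:entry", ATOM_NS):
        title = entry.findtext("atom:title", default="", namespaces=ATOM_NS)
        link = entry.find("atom:link", ATOM_NS)
        href = link.get("href") if link is not None else ""
        yield title, href

for title, href in entries(SAMPLE_FEED):
    print(f"{title} -> {href}")
```

The consumer never needs to know, or care, which vendor produced the feed; the standard is the contract.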
Which brings me to the point of this long post.
We are on the verge of a new digital revolution. As with the Industrial Revolution, this means we will need new sets of “machine tools”. Machine tools with a difference.
The Wikipedia definition of Machine Tool:
A machine tool is a powered mechanical device, typically used to fabricate metal components of machines by the selective removal of metal. The term machine tool is usually reserved for tools that used a power source other than human movement, but they can be powered by people if appropriately set up …[……]…. Devices that fabricate components by selective addition of material are called rapid prototyping machines.
The age we’re entering needs three types of “machine tool”:
- Those that selectively remove information to create product, the “old way”
- Those that selectively add information to create product, the “web” way
- Those that selectively add information to co-create product, the “tomorrow” way
Man learnt to really use tools when he designed them to make use of his opposable thumbs.
We need to discover what the digital equivalents of those thumbs are before we can learn to use the tools properly.
And my hunch is that they lie in the identity and authentication and permissioning space; that is what will take our nascent machine tools to Main Street.
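For what it’s worth, here is one deliberately tiny sketch of what owner-controlled permissioning might look like; the names and the shape of the API are hypothetical illustrations, not a reference to any particular identity product or standard.

```python
# A deliberately tiny sketch of owner-controlled permissioning: the owner
# of a piece of information grants or revokes access to named identities,
# and every read is checked against those grants. Names and API shape are
# hypothetical illustrations only.

from dataclasses import dataclass, field

@dataclass
class OwnedInformation:
    owner: str
    content: str
    _grants: set[str] = field(default_factory=set)

    def grant(self, requester: str, identity: str) -> None:
        """Only the owner may add an identity to the grant list."""
        if requester != self.owner:
            raise PermissionError("only the owner can grant access")
        self._grants.add(identity)

    def revoke(self, requester: str, identity: str) -> None:
        """Only the owner may remove an identity from the grant list."""
        if requester != self.owner:
            raise PermissionError("only the owner can revoke access")
        self._grants.discard(identity)

    def read(self, identity: str) -> str:
        """Information moves only at the behest of its owner."""
        if identity != self.owner and identity not in self._grants:
            raise PermissionError(f"{identity} has no grant from {self.owner}")
        return self.content

doc = OwnedInformation(owner="alice", content="my calendar feed")
doc.grant(requester="alice", identity="bob")
print(doc.read("bob"))        # allowed: alice granted bob access
doc.revoke(requester="alice", identity="bob")
# doc.read("bob") would now raise PermissionError
```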