The Arab Spring of the West

As you’ve probably noticed by now, the Arab Spring, the collective name for the protests in the Middle East and North Africa since 18 December last year, has captured the public’s imagination. If you want to remind yourself of the events that constitute the Arab Spring, you could do worse than try the Guardian’s excellent interactive timeline, a snapshot of which is shown below:

 

This post is not about the Arab Spring per se. There are many other places you can read about it, covered by many people who know far more about it than I do.

While all this has been going on, there’s been a second, much gentler conflagration amongst the digerati. The Blefuscudian question they’ve been trying to address is this: What role did social networks and social media in general play in the Arab Spring? The Big-Endians say Everything, the Little-Endians say Nothing, and while they continue to argue I am sure we will all live happily ever after. This post is not about them either.

What this post is about is a question that’s been troubling me for some time. And that is this: What is the Western equivalent of the Arab Spring? I’ve also been thinking of the natural follow-up question to it: When will it happen?

I may be completely wrong. [If I am, I’m sure you’ll tell me.]

But.

I have a sneaking suspicion that the Arab Spring of the West is already upon us. Why?

The original Arab Spring, the Arab Spring of the Arabs, was about disaffected people, mainly youth, giving vent to their feelings about injustice and inequality and unreasonable behaviour of the powers-that-be, by rising up and challenging the control structures around them.

I guess it’s natural for us to think that there won’t be a Western equivalent: after all, there are no comparable conditions of injustice and inequality and unreasonable behaviour.

Perhaps there aren’t.

But then again, maybe there are.

Disaffected youth.

Giving vent to their feelings.

Feelings spurred by injustice and inequality and unreasonable behaviour.

Rising up to challenge the control structures around them.

Hmmm.

It gets me thinking.

The US State Department, Amazon, EveryDNS, Mastercard, Visa, Wikileaks, Assange, Manning…..

Sony, the PlayStation Network, not-Anonymous, Anonymous, hacking of PS3s, GeoHot…..

Super-injunctions, Justice Eady, Lord Chief Justice Judge (really, that’s his name), CTB, NEJ, LNS, JIH and for that matter CDE, FGH and LMN….and all that jazz

Hmmm.

Maybe that’s the way the Arab Spring will look in the West. As “traditional” control structures like superinjunctions and DMCA and CFAA are found to be unjust and unreasonable by disaffected youth. As they rise up and challenge the control structures. In the West.

I wonder.

Artificial scarcities tend to get met by artificial abundances. Over time, the artificial scarcities lose. No one, not even Qadhafi, can sustain being a Qadhafi forever.

A coda:  All this talk about scarcity and abundance reminded me of the old Shaw quote:

“If you have an apple and I have an apple and we exchange these apples then you and I will still each have one apple. But if you have an idea and I have an idea and we exchange these ideas, then each of us will have two ideas.”

 

 

On firehoses and filters: Part 1

Image above courtesy of Library of Congress, Prints and Photographs Online Catalog.

 

I’ve never been worried about information overload, tending to treat it as a problem of consumption rather than one of production or availability: you don’t have to listen to everything, read everything, watch everything. As a result, when, some years ago, I heard Clay Shirky describe it as “filter failure”, I found myself nodding vigorously (as we Indians are wont to do, occasionally sending confusing signals to onlookers and observers).

 

Filtering at the point of consumption rather than production. Photo courtesy The National Archives UK.

 

Ever since then, I’ve been spending time thinking about the hows and whys of filtering information, and have arrived “provisionally” at the following conclusions, my three laws of information filtering:

1. Where possible, avoid filtering “on the way in”; let the brain work out what is valuable and what is not.

2. Always filter “on the way out”: think hard about what you say or write for public consumption: why you share what you share.

3. If you must filter “on the way in”, then make sure the filter is at the edge, the consumer, the receiver, the subscriber, and not at the source or publisher.

 

What am I basing all this on? Let’s take each point in turn:

a. Not filtering at all on inputs

One of the primary justifications for even thinking about this came from my childhood and youth in India, surrounded by mothers and children and crowds and noise. Lots of mothers and children. Lots and lots of mothers and children, amidst lots and lots of crowds. And some serious noise as well. Which is why I was fascinated by the way mothers somehow managed to recognise the cry of their own children, and could remain singularly unperturbed, going placidly about their business amidst the noise and haste. This ability to ignore the cries of all the other babies while being watchful and responsive to one particular cry fascinated me. Years later, I experienced it as a parent, nowhere near as good at it as my wife was, but the capacity was there. And it made me marvel at how the brain evolves to do this.

Photo courtesy BBC

There are many other justifications. Over the years I’ve spent quite a lot of time reading Michael Polanyi, who originally introduced the “Rumsfeld” “unknown unknowns” concept to us (the things we know we know; the things we know we don’t know; and the things we don’t know we don’t know). I was left with the view that I should absorb everything like a new sponge, letting my brain work out what is worth responding to, what should be stored for later action, what should be discarded. And, largely, it’s worked for me. Okay, so what? Why should my personal experience have any bearing on this? I agree. Which is why I would encourage you to read The Aha! Moment: The Cognitive Neuroscience of Insight, by Kounios and Beeman. Or, if you prefer your reading a little bit less academic, try The Unleashed Mind: Why Creative People are Eccentric. In fact, as shown below, the cover of the latest issue of Scientific American MIND actually uses the phrase “An Unfiltered Mind” when promoting that particular article.

 

b. Filtering outputs

We live in a world where more and more people have the ability to publish what they think, feel or learn about, via web sites, blogs, microblogs and social networks. We live in a world where this “democratised” publishing has the ability to reach millions, perhaps billions. These are powerful abilities. And with those powerful abilities come powerful responsibilities. Responsibilities related to truth and accuracy, responsibilities related to wisdom and sensitivity. Responsibilities related to curation and verification. None of this is new. Every day we fill in forms with caveats that state that what we say is true to the best of our knowledge and ability; every day, as decent human beings, we take care not to offend or handicap people because of their caste, creed, race, gender, age. Every day we take care to protect minors, to uphold the confidentiality of our families and friends and colleagues and employers and trading partners and customers. Sometimes, some of these things are enforced within contracts of employment. All of them, however, should come under the umbrella term “common decency”.

 

These principles have always been at the forefront of cyberspace, and were memorably and succinctly put for WELL members as YOYOW, You Own Your Own Words. Every one of us does own our own words. Whatever the law says. It’s not about the law, it’s about human decency. We owe it to our fellow humans.

When we share, it’s worth thinking about why we share, something I wrote about here and here.

c. Filtering by subscriber, not by publisher

Most readers of this blog are used to having a relatively free press around them, despite superinjunctions and despite the actions taken to suppress Wikileaks. A relatively free press, with intrinsic weaknesses. Weaknesses brought about by largely narrow ownership of media properties, weaknesses exacerbated by proprietary anchors and frames, the biases that can corrupt publication, weaknesses underpinned by the inbuilt corruptibility of broadcast models. Nevertheless, a relatively free press.

The augmentation of mainstream media by the web in general, and by “social media” in particular, is often seen as the cause of information overload. With the predictable consequence that the world looks to the big web players to solve the problem.

Which they are keen to do.

Google, Facebook, Microsoft et al are all out there, trying to figure out the best way of giving you what you want. And implementing the filtering mechanisms to do this. Filtering mechanisms that operate at source.

There is a growing risk that you will only be presented with information that someone else thinks is what you want to see, read or hear. Accentuating your biases and prejudices. Increasing groupthink. Narrowing your frame of reference. If you want to know more about this, it is worth reading Eli Pariser’s book on The Filter Bubble. Not much of a reader? Then try this TED talk instead. Jonathan Zittrain, in The Future of the Internet and How to Stop It, has been warning us of this for a while.

Now Google, Microsoft, Facebook, all mean well. They want to help us. The filters-at-source are there to personalise service to us, to make things simple and convenient for us. The risks that Pariser and Zittrain speak of are, to an extent, unintended consequences of well-meaning design.

But there’s a darker side to it. Once you concentrate solely on the design of filterability at source, it is there to be used. By agencies and bodies of all sorts and descriptions, ranging from less-than-trustworthy companies to out-and-out malevolent governments. And everything in between.

We need to be very careful. Very very careful. Which is why I want to concentrate on subscriber-filters, not publisher-filters.

Otherwise, while we’re all so busy trying to prevent Orwell’s Nineteen Eighty-Four, we’re going to find ourselves bringing about Huxley’s Brave New World. And, as Huxley predicted, perhaps actually feeling good about it.

 

More to follow. Views in the meantime?

 

 

More sniffing around Twitter, Chatter and pheromones

[Note: This is my third post in a series I’ve been writing on this topic; the two previous posts immediately precede this one]. What I want to do here is touch on a few subjects that came up in earlier posts, where I didn’t really have the time or space to express what I meant adequately. My intention in sharing all this is to give you as much depth as I can into my thoughts on the use of tools like Twitter and Chatter.

 

Connected versus channelled

Some of you may have noticed that, in previous posts, I appear to make a big thing of wanting to place filters at the point of receipt rather than at the point of dissemination, at the “subscriber” level rather than at the “publisher” level. This is no random thought, it represents something I have believed in ever since I took up blogging: you will find it a recurrent theme in the kernel for this blog. There are a number of reasons for it, and I’m going to try and articulate them as succinctly as I dare.

Michael Polanyi, in helping us understand what he meant by “tacit knowledge”, is reputed to have said something along the lines of “there are things we know we know, things we know we don’t know, things we don’t know we know and things we don’t know we don’t know”. That fourth bit, the things we don’t know we don’t know, has always intrigued me. As a result, I used to walk around telling myself: “filter on the way out, not on the way in. Let everything come in, you don’t know what you don’t know.” What I was trying to do was to minimise the building of anchors and frames that would constrain or corrupt what was allowed to enter my head, what Einstein called “common sense: the collection of prejudices collected by age eighteen”.

When I see words like “connected” and “channelled” they conjure up different meanings, heavily laden with my own prejudices, despite all my efforts to avoid such prejudices. “Channelled” suggests a one-way street, a broadcast model, a structure where I am a recipient of a signal with all the choices made by the sender of the signal. “Connected”, on the other hand, has a sense of being two-way, interactive, with some sort of parity or equality between the things that are connected.

There’s also something else, something darker, harder to put my finger on, evoking a deep sense of distrust. And it’s rooted in some modern variant of Say’s Law: Supply creates its own demand. What do I mean? Well, let’s take terrorism laws. Come, perform an experiment with me. Open a separate tab or window in your browser, bring up Google and enter the term “UK terror laws used to snoop”. Just look at what you get. Here’s a sample list of the things that local councils have used terror laws to check: whether

  • nurseries were selling plants unlawfully
  • a child lived in a school catchment area
  • fishermen were gathering shellfish illegally
  • alcohol was being sold to the under-aged
  • benefit claims were fraudulent
  • people’s dogs were fouling
  • people were littering
  • cows were meandering
  • calls were made to 900 number phone lines

It’s a much, much longer list, with over 470 councils invoking the laws over 10,000 times in a nine-year period. Why do they do this? Because they can.

Coming from a family of journalists, and having lived as an adult through the “Emergency” years in India, and having been on the receiving end of some of the power that such states wield, I’ve felt more strongly about such misuse than most.

With all this in mind, I want to remain connected, not channelled. I want to be able to choose what I can know about, learn about, be told about. I don’t want to block out what I don’t know. I don’t want the technology to have tools for censorship built in, which in effect is what happens when filters are designed into publishers. It is too easy to game the publisher end of the market, far harder to game the subscriber end.

So I try and avoid filtering at source. I have no problem with tags, with providing people the metadata that simplifies filtering at subscriber level. But the mechanisms for tagging at source should be designed in a way that they can’t become choke points used by the unprincipled.

 

Avoiding echo chambers, groupthink and herd behaviour

When social networks are used to share information upon which decisions may be made, you will always hear someone bring up the echo-chamber risk. After all, if you put a bunch of like-minded people together, you will get repeated assertions of the same thing. Or so the theory goes.

Wrong. Now this is not deep research, but anecdotally the results have been positive enough for me to want to assert this. Social networks bring together people who have a few common interests, rather than people who hold common views about those interests, or who replicate those interests. My twitter followers are not clones of me. Very few of them are into chillies and capsaicin in a big way; very few have the same “retarded hippie” tastes in music I do; very few are as crazy about cooking (and eating) as I am; very few are Indian and 53; very few go to church every Sunday. Some do. But not all.

Social networks create value because people in the networks come together, drawn by what they have in common, but creating value because of what they don’t have in common.

There have been a number of discussions recently about the “dangers” of direct democracy: how could we possibly run anything, manage anything, lead anything,  based on the statistically expressed will of the Great Unwashed?

Surely what will happen is that people will keep on asking for faster horses.

Perhaps.

Maybe.

But who are we to decide that everyone else is wrong?

The tools we have today allow for greater dissemination of information than we’ve ever had before. Attempts to control, suppress or subvert the free passage of information are becoming harder and harder to pull off, there’s a Wikileaks waiting to happen in every command-and-control centralised hierarchical set-up. These tools are becoming ubiquitous, affordable, effective, and the empowerment of the edge continues apace. Snap polls are no longer about random sampling, not when there’s a Facebook around. [Incidentally, don’t underestimate the value of having good polling mechanisms in systems like Twitter and Chatter].

Democratisation does not yield dummification. Except perhaps in the eyes of elitist experts.

 

Signals, not trails: improving our work lives

Some of the comments I’ve received, some of the references I’ve been pointed towards, have a tendency to veer towards a trail-like analogy for lifestreaming and workstreaming. This is possibly due to my use of the pheromone analogy. If that has happened I am sorry, that was not my intention. If anything, my use of the wikipedia article in the first post was an attempt to avoid just that, by showing that the pheromone classification went way beyond the concept of trail.

Since then, on a the-physics-is-different basis, I’ve tried to bring in the time dimension as well. The signals we share as we workstream are separable by time, and each “layer” of time does not in any way corrupt other layers, contiguous or not. And I feel the very existence of these signal histories helps us improve our work lives dramatically.

How?

In four ways.

Firstly, they give us institutional memory as to what happened, what was done. This allows us to break away from blame cultures, move towards an environment of “We have not failed, we have found ten thousand ways that do not work”…. but with a difference. By being able to record the conditions under which something did not work, we learn something about the conditions under which something will work. And we can form the equivalent of seed-banks under the icecaps of organisations, storing the seeds we need for conditions that do not exist today, but could exist at a future date.

Secondly, they give us the ability to trend behaviours and forecast with somewhat more accuracy than has been the case in the past, based on data rather than political connections. It used to be said that history will always be written from the perspective of the hunter until lions learn to speak. Well, lions can speak. Now. Histories are less likely to be corrupt if they are constructed by bringing together squadrons of disparate tweetstreams. This sort of crowdsourcing of information has been happening for some time now; I could not hide my glee when I learnt that 18th century ships’ captains’ logs were being used to conduct climate change research. [And thank you, everyone involved in the project, for making sure the output was not behind a paywall, that it was searchable and retrievable. How wonderful.]

Thirdly (and this may be my most controversial point), I think they make our work more interesting. Humour me on this. One of the most depressing things about the Industrial Revolution, assembly-line thinking and division of labour was the way human beings were somewhat dehumanised as a result, becoming narrow specialists good at doing mind-numbingly boring things well. Five or six years ago, I had the pleasure of listening to John Seely Brown and John Hagel at a Supernova conference (thank you Kevin Werbach) talking about motorcycle factories in China and how collaboration took place because people weren’t working sequentially. And it got me thinking.

It got me thinking about the new generation, and how they seemed comfortable multi-tasking, how they were being accused of being ADHD as if ADHD was an epidemic [if you have not watched Sir Ken Robinson’s talk on changing education paradigms, stop everything you’re doing and watch this 11 minute video. Then watch the whole thing, the hour long version, link provided below the summary. Thank you RSA!]

It got me thinking about knowledge workers and the lumpiness of knowledge work, the implications for the generation of cognitive surplus in the enterprise.

And it got me to a point where I saw the possibility that division of labour was a thing of the past. That for the millennial knowledge worker in a social network with workstreaming, switching costs were tending to zero.

More to chew on. I’ll be back. Comment away.

Thinking more about Twitter, Chatter and knowledge worker pheromones

Introduction

This post is a follow-up to one I wrote a few days ago; based on the twitter and mail feedback, and on the comments I’ve received via blog and facebook, it seemed worthwhile to continue discussions on this train of thought.

Summary of previous post

Let me first summarise where I was trying to go with the previous post:

  • It’s normal and natural for human beings to “publish” signals that can be shared
  • Signals can be of many types: alerts and alarms, territorial markers, calls for action, pure unadulterated information
  • Signals can be shared in small groups or made available to all
  • As the tools for sharing improve, and as they become accessible by more and more people, the sharing of signals will grow
  • Twitter and Chatter are the leading examples of consumer and enterprise tools for sharing signals
  • All this could have significant benefits for us, at home, at work and at play (that’s if we know the difference any more), particularly as we grasp the value of knowledge worker cognitive surplus

The role of technology in all this

At the outset I want to make sure people understand that technology is nothing more than an enabler to all this. People must want to share, to make productive use of their cognitive surplus. It may sound altruistic, it may sound Utopian, all this emphasis on sharing; we all have to get it into our heads that man is a social animal; we all have to get it into our heads that sharing creates value, both at home and at work; short-sighted, ill-thought-out and sometimes downright nefarious approaches to “intellectual property rights” over the last fifty years or so have blinkered people from seeing this fundamental point.

Sharing is a very people-centric concept, part of our culture, part of our values. In a sense these posts are not about Twitter or Chatter, but about the existence of toolsets that make sharing easier, more accessible, more affordable. If we start believing that it’s all about the technology, we will start convincing ourselves that events in Tunisia and Egypt and Libya were about the technology rather than the will of the people. So hey, let’s be careful out there.

When I was young, I was taught that there were three “waves” to technology adoption. In wave 1, there was a substitution effect: people used the new technology to do something they used to do with something else: substituting the horse for the car. In wave 2, there was an “increased use” effect: A horse would take you maybe 40 miles in a day at best, a car could take you 400 miles in that same day. So people could travel further. And in wave 3, we had “embedded use”, where the technology was an intrinsic part of a new product or service, unseen before: smartcards are an example.

The Nineties and the dot-com boom led to a newish taxonomy for markets and products and services, as startups and venture capitalists desperately tried to put labels on things: there was a flowering of “categories” and “category-busters”. Maybe I’ve read too much Kevin Kelly (though I don’t really think that is possible): I would strongly recommend every one of his books, right up to his latest “What Technology Wants”. Over the years it became clear I was beginning to get hung up on seeing various aspects of technology strictly from an evolutionary standpoint.

This came to a head recently when I was reading Tim Flannery’s Here On Earth, another wonderful book. Go out and buy it now. Stop reading this post. Come back later. You won’t regret it. Anyway, let me tell you about the a-ha moment I had while reading that book.

Technology speeds up evolution

In Here On Earth, Flannery spends some time talking about technology as a means of radically speeding up evolution; the way I interpreted what he was saying, he was comparing the time taken for species to grow armour or fangs or claws with the time taken for humans to make suits of armour and spears.

This resonated a lot with me, because for the past two years I’ve been writing a book about information seen from the perspective of food: information ingredients, preparation of information, nutrients in information, toxic information, how information is processed, the concept of information waste, all around the fulcrum of an information diet. Part of the stimulus for writing that book was gaining a deeper understanding of cooking as an “external stomach”.

Implications for the pheromone concept as applied to Twitter and Chatter

With all this in mind, it is not enough for us to visualise tweets as pheromones, copying nature; we should look further, see what we can do that we could not have done before, extending what nature shows us. I’ve tried to build a small list to help people think this through:

  • Pheromones aren’t archivable, indexable, searchable, retrievable
  • Pheromones can’t be analysed for historical trends
  • Pheromones can’t be mashed
  • Segmenting pheromones isn’t easy
  • So what we do with digital pheromones should do all this, overcome the constraints of natural pheromones

An aside: Whenever I try and learn something, I start with understanding similarities with other things I know, then I concentrate on the differences. In a way that’s what I’m doing with this post; the previous post concentrated on similarities between tweeting and pheromones, this one will concentrate on differences and why they’re valuable.

Histories, trending and analytics

Unlike pheromones, tweets are digital and can easily be archived, stored, tagged and classified, searched for, retrieved at will. So the signals are more easily accessible to a larger group of people. In this context, we have to think of the signalling aspect of tweets, from an economics viewpoint, as “extreme nonrival goods”… one person’s use of the signal does not impact the ability of others to use the same signal. Obviously there are constraints to do with scarcities and abundances: an ant can follow a trail to a store of food only to find that the store has been used up; physical things tend to obey laws of scarcity, while digital things tend to obey laws of abundance. [Don’t get me started on the abominations taking place in the Digital Economy Act space; that’s for another post, another time.]

There are no time series for pheromone tracks, nor an easy way to find out “what’s trending now”. These are very powerful capabilities in the twittersphere, as long as the data is there, accessible and exportable. The implications for what used to be called “knowledge management” are radical and extreme. Visualisation tools become more and more important, in terms of tag clouds and heat maps and radar diagrams and the like.
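To make “what’s trending now” concrete, here is a minimal sketch (in Python, with invented sample data, so the function name and the simple scoring rule are my assumptions, not anyone’s actual implementation): compare each tag’s share of recent traffic against its share over a longer baseline, and rank the tags whose recent share has jumped.

```python
from collections import Counter

def trending(signals, now, window=3600, baseline=86400):
    """signals: list of (timestamp, tag) pairs.
    Ranks tags by how much their share of the recent window
    exceeds their share of the longer baseline period."""
    recent = Counter(tag for t, tag in signals if now - t <= window)
    past = Counter(tag for t, tag in signals if now - t <= baseline)
    n_recent = sum(recent.values()) or 1
    n_past = sum(past.values()) or 1
    scores = {
        tag: (recent[tag] / n_recent) / (past[tag] / n_past)
        for tag in recent  # any tag seen recently was also seen in the baseline
    }
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical sample data: timestamps in seconds, one tag per signal
signals = [(0, "food"), (100, "food"), (200, "music"),
           (86000, "conference"), (86100, "conference"), (86200, "food")]
print(trending(signals, now=86400))  # → ['conference', 'food']
```

Real trending systems weight by recency decay and correct for tag popularity far more carefully; the point here is only that time-series analysis like this is trivial for archived digital signals and impossible for pheromones.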

Understanding the location implications of the signals is also very valuable. Some years ago I was told that every desktop in Google had its latitude and longitude embedded in the desktop ID; this became very valuable for doing analysis of things like prediction markets, when you want to see the impact of physical adjacency on the results.

But that was in another country, and besides the wench is dead. Today, we live in times when workspaces no longer need to have desks, so the concept of the desktop becomes more and more questionable. Location becomes something dynamic, which is why the GPS or equivalent in smartphones and tablets is coming to the fore.

The physics is different

Many years ago, when I was first venturing into virtual worlds, I remember reading an article which really struck me at the time. What it said was that in virtual worlds, “the physics is different”. With no gravity, no heat, no light, no cold, there was no reason you couldn’t fly or starve or walk around naked for that matter.

There’s a bit of the-physics-is-different about tweets. The pheromone concept tends to deal with signals given by a single author, then amplified (or allowed to decay) by other authors either overlaying the signal or avoiding it. Which means in effect that the signals are aggregated in the same physical area and in the same period of time. Tweets don’t face that constraint. As a result, we can daisy-chain tweets by a single person over a period of time, “geographically dispersed” in space or by logical context. Or we can aggregate all tweets sharing some characteristic or the other. Or, for that matter, block out all tweets that share specific characteristics.

This ability to tune in or tune out at such a level of granularity is of critical value, particularly when it comes to filtering.
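The daisy-chaining, aggregating and tuning-out described above can be sketched in a few lines of Python; the record layout and function names below are invented for illustration.

```python
from operator import itemgetter

# Hypothetical tweet records: (author, timestamp, tags, text)
tweets = [
    ("jobsworth", 10, {"food"}, "chillies in the pan"),
    ("other", 15, {"music"}, "new album"),
    ("jobsworth", 40, {"music"}, "blipping away"),
    ("jobsworth", 90, {"food"}, "dinner served"),
]

def daisy_chain(tweets, author):
    """One person's signals re-assembled in time order —
    something a pheromone trail, fixed in one place, cannot do."""
    return sorted((t for t in tweets if t[0] == author), key=itemgetter(1))

def aggregate(tweets, tag):
    """All signals sharing a characteristic, regardless of author."""
    return [t for t in tweets if tag in t[2]]

def tune_out(tweets, tag):
    """Block every signal carrying a given characteristic."""
    return [t for t in tweets if tag not in t[2]]
```

Each operation works on the archived stream after the fact, in any combination, which is exactly the granularity of tuning in and tuning out that natural pheromones lack.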

The need for good filtering

If everyone tweeted and everything tweeted, soon all would be noise and no signal. As Clay Shirky said, there is no such thing as information overload, there’s only filter failure.  In other words, information overload is not a production problem but one of consumption.

This is important. Too often, whenever there is a sense of overload, people start trying to filter at the production point. In a publish-subscribe environment, this translates to asking the publisher to take action to solve the problem. My instinct goes completely against this. I think we should always allow publishing to carry on unfettered, unhampered, and that all filtering should take place at the edge, at the subscriber level. There’s something very freedom-of-expression and freedom-of-speech about it. But it goes further: the more we try and concentrate on building filters at publisher level, the more we build systems open to bullying and misuse by creating central bottlenecks. Choke points are dangerous in such environments.

It is far better to build filters at subscriber level. Take my twitter feed, @jobsworth. Most of my tweets are about four things: my thoughts about information, often related to blog posts; the food I’m cooking and eating; the music I’m listening to; and my summaries and reports on conferences and workshops and seminars. [I tend not to tweet at sports events because of the spoiler risk].

So while there is a fairly low underlying tweet level, my twitter activity is bursty, lumpy.  At weekends it goes up as I play music at blip.fm/jobsworth; when I’m at conferences it can go up to 50 tweets an hour; and, also usually at weekends, when I’m cooking, the tweet frequency goes up. This lumpiness is uncomfortable for some people, they find that every now and then I dominate their tweetspace.

As a result, over the years, a number of people have asked me whether I can suppress one or the other, e.g. by cutting the link between twitter and facebook, blip.fm and twitter, et cetera, or, more often, asking me to fragment my twitter ID, have one for food and one for music and so on and so forth.

Letting the subscriber do the filtering

This is not a radical idea. The whole point of personalisation is that it takes place at the edge and not at the core. So, to solve the lumpiness problem, we need better subscriber tools. With such tools, you should be able to say, I want to follow @jobsworth’s conference tweets and his food tweets but not his music tweets, while someone else says the precise opposite. And all I will have to do is to ensure that the 21st century equivalent of hashtags is used to segment and categorise and classify my tweets. Implementing filters at publisher level is a broadcast concept, and, furthermore, runs the risk of misuse: every time you build a choke point, someone will come along and try to exert undue influence over that choke point. We shouldn’t have governments and quasi-governments telling publishers and ISPs what to publish and what not to publish, not in lands where words like “free” have any meaning.
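A subscriber-side filter of the kind just described can be sketched very simply; in this Python sketch (all names and data invented for illustration) the publisher just tags everything, and each subscriber independently decides what gets through.

```python
def make_filter(wanted_tags):
    """Return a predicate admitting only tweets that carry a wanted tag.
    The filter lives with the subscriber; the published stream is untouched."""
    wanted = set(wanted_tags)
    return lambda tweet: bool(wanted & tweet["tags"])

# Hypothetical single-ID stream, segmented only by tags
stream = [
    {"author": "jobsworth", "tags": {"conference"}, "text": "great keynote"},
    {"author": "jobsworth", "tags": {"music"}, "text": "blipping away"},
    {"author": "jobsworth", "tags": {"food"}, "text": "chillies again"},
]

# One subscriber wants conference and food tweets, another only music;
# both are served from the same unfragmented publisher ID.
mine = list(filter(make_filter({"conference", "food"}), stream))
theirs = list(filter(make_filter({"music"}), stream))
print(len(mine), len(theirs))  # → 2 1
```

The design point is that there is no choke point to capture: nothing at the publisher end needs to change for any subscriber’s preferences, which is why the filter belongs at the edge.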

Lifestreaming implications for workstreaming

I’ve used “tweets” as a generic term, not bothering to differentiate between Twitter and Chatter. In most cases, the same things hold true for both. But there are a few strategic differences.

Firstly, unlike Twitter, Chatter tends to operate in a space between systems of engagement and systems of record (see my post on this distinction here); in enterprises, identity is subject to strict verification, making this simpler to do. Secondly, a Chatter world is likely to have many inanimate publishers, and the “asymmetric follow” (a term I first saw used by James Governor of RedMonk) becomes important. [Think about it for a minute. You get spam in your email? Boo-hoo. How much of your email spam is actually formal and internal to your company, rather than as a result of malign external forces?]

The ability to have strong verification of identity in a corporate context has many benefits, since it is then possible to have formal attributes, values and characteristics associated with an individual. This is where “gamification” and “badges” start having real tangible value in the enterprise, with different classes of badges, some personally asserted, some bestowed by a third party, some “earned” as a result of completing an activity or activities successfully.

And on to Part 3

Again, I shall wait for feedback, and if people continue to be interested, I shall take this further. This time I intend to concentrate on the lumpiness of knowledge work; why workstreaming, in combination with a couple of other techniques, can make sensible use of the cognitive surplus; how this will allow enterprises of all sizes to move away from traditional politically charged blame-cultures to genuine value-builders.

And most importantly, I want to discuss why lifestreaming and workstreaming actually make us smarter human beings, in comparison with the dumbing-down that took place during the Industrial Age. Assembly line. Division of labour. Broadcast mindset. Scarcity-focused. Hierarchical in nature. Five constructs that have destroyed education, healthcare and government, and will soon destroy all industry. If we let them.

Thinking about Twitter and Chatter: the knowledge worker’s pheromones

Introduction: Background and influences

If you’ve been reading this blog for a while, then you’d have come across the name of Clay Spinuzzi. I’ve been following his work for about five years now, and had the privilege of meeting him for breakfast while vacationing in Austin in the summer of 2008. Clay introduced the term “ambient signalling” into my thinking, a key ingredient in the process that led to this post. It also made me read his work on organisational genres and on participatory design, often from the perspective of networks.

A number of other people have also influenced me considerably when it comes to this particular post, and I’d like to declare my debt to them at the outset. Howard Rheingold (whom I was meant to meet earlier this week, but couldn’t, as a result of my unwisely choosing to tear ankle ligaments rather than fall down) really set the scene for me over two decades ago in his Whole Earth Review and WELL writings, followed by his excellent book The Virtual Community in the early 1990s; one of his other books, Smart Mobs, which I read nearly a decade ago, was also a key influence.

The next breakthrough came when, as a subscriber to Chris “Rageboy” Locke’s Entropy Gradient Reversals, I was taken on my first ride on the Cluetrain. I can’t tell you how many times I’ve read The Cluetrain Manifesto, but it’s in double digits. [Disclosure: I met Chris soon after the book was published, Doc soon after that, David shortly after that and Rick a few years later; they remain good friends of mine, and I was deeply honoured when they asked me to contribute a guest chapter to the 10th Anniversary edition of the book.] Cluetrain really got me thinking about community interaction from a business perspective, beyond the fortress-like walls of the corporation.

Around the same time, I had the opportunity to read Amy Jo Kim’s excellent Community Building on the Web. Over the years I’ve bought at least five copies of the book; it’s one of those regularly borrowed, rarely returned. I had the opportunity to meet her at Supernova some years ago. The book sparked my personal interest in game design and game mechanics from a business perspective, something Amy Jo continues to work on with her usual flair.

Shortly after that, I read Steven Johnson’s Emergence, deepening my understanding of slime mould and ants and swarming, all in the context I’ve described above. I’ve had the opportunity to meet Steven since, and continue to follow his work as well; he really helped me understand something about headlessness, about how everything is a node in the network.

The final, and critical, influence was that of Clay Shirky, whose blog I followed religiously over the years, even when his output shifted more to books and speeches. It was he who helped me bring all this together with the trenchancy of his analysis of how communities work. Insights from him on three aspects helped establish my thinking: that to create sustainable commons material, the cost of repair should be no higher than the cost of damage (the undo button in Wikipedia is an example); that there is no such thing as information overload, only filter failure (which helped me understand something about the recommending, curating, filtering roles of network members); and that in the age of knowledge workers, much good can be achieved by effective use of what Clay terms Cognitive Surplus, the title of his most recent book. I’ve known Clay for some time as well. We had a delightful lunch together in Davis this year, and I visited him at Tisch when I was last in New York, a wonderful set-up.

Why am I telling you all this? Two reasons. Firstly, to give thanks where thanks is due, to point out the people who have influenced and inspired me in this particular context. Secondly, as a consequence, to give you the opportunity to go deeper into the influences, research things for yourself. Some of you obviously know all this, have read all the books, met the authors; for you, this may seem onerous and repetitive, and you’ve probably skipped all this anyway. This introductory section is for the rest of the readers.

Pheromones

For a few years now, I’ve been looking at information systems and services as if they were biological in origin, serene in the attitude that, if the 20th century was meant to be the age of physics, the century we live in will be characterised as the age of biology. Using that perspective, it was only a matter of time before the “ambient signalling” spoken about by Clay Spinuzzi would start feeling like the pheromones laid down by ants as they go about their chores. Which, unsurprisingly, led me to Wikipedia, to start reading up on pheromones.

There I found the definition of “pheromone” to be: a secreted or excreted chemical factor that triggers a social response in members of the same species. Now that interested me greatly. A “social response”, as opposed to any other kind of response. So I looked further, and Wikipedia informed me that Peter Karlson and Martin Lüscher introduced the term “pheromone” in 1959, to describe chemical signals that trigger specific behaviours in two or more “conspecifics”, members of the same species. The intended etymology of the neologism was itself of interest: they formed the word from the Greek roots pherein (to transport, to bear) and hormone (stimulus, impetus); so a pheromone became a carrier of stimuli. Hmmm.

Types of pheromones

It turns out that there are many types of pheromones, classified in different ways. You have the concept of primer, releaser and information pheromones: primers kick off developmental changes, releasers make you change your behaviour, and information pheromones just tell you things. If you look further, you find far more detailed classifications: aggregation, alarm, epideictic, signal, territorial, trail, sex, and so on. If you’re interested, please read the Wikipedia article on pheromones yourself; it gives you the basics on pheromone types.

Intriguingly, “there are physical limits on the practical size of organisms employing pheromones”.

Twitter, Chatter and their pub-sub nature

I’ve always thought of Twitter as a publish-subscribe mechanism, and, in similar vein, of Chatter as an enterprise bus with pub-sub built in. [Disclosure: I’m Chief Scientist at Salesforce, the people behind Chatter. It’s one of the key reasons I joined the company]. I’ve been lucky enough to work with people who’ve believed in bus-based architectures for some time now; and, ever since delving into EDI in the early 1980s, I’ve been a convert to recipient-driven (“beneficiary-driven”) transaction and messaging systems.

Over the years, as I’ve continued to play with Twitter and with Chatter, their innate strength-from-simplicity has become more apparent to me. Which is why I wrote posts describing Twitter as a “submarine in the ocean of the web” some years ago. I’ve been able to use Twitter to rescue hamsters lost down holes in floorboards, to get visas for foreign travel, to collect hand-me-down recipes for ragu, to acquire limited-release CDs in the city of origin, the list goes on and on. And, now that I’ve been using Chatter in anger for the past six months, I’ve been able to learn something about its differentiated value. How following “things” as well as people becomes valuable in a business context. How exception handling becoming the norm is no longer a frightening thought. How closed and open groups can overlap and coexist. How institutional memory is established and rekindled, how new forms of knowledge leadership emerge as a result.
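Neither Twitter nor Chatter exposes its internals here, so the following is just a toy sketch in Python of the pub-sub idea as I describe it above — topics, followers, fan-out — with entirely made-up names (the `Bus` class, the `opportunity:0061` record ID). The one Chatter-ish point it illustrates is that a “thing”, such as a record, can be followed exactly like a person.

```python
# A toy publish-subscribe bus. Anything -- a person or a "thing" such
# as a business record -- is simply a topic that others can follow.
# All identifiers here are hypothetical, for illustration only.

from collections import defaultdict

class Bus:
    def __init__(self):
        self.followers = defaultdict(list)   # topic -> list of callbacks

    def follow(self, topic, callback):
        self.followers[topic].append(callback)

    def publish(self, topic, message):
        # Fan out the message to every follower of this topic.
        for cb in self.followers[topic]:
            cb(topic, message)

inbox = []
bus = Bus()

# Follow a person and a "thing" (an opportunity record, say) alike.
bus.follow("@jobsworth", lambda t, m: inbox.append((t, m)))
bus.follow("opportunity:0061", lambda t, m: inbox.append((t, m)))

bus.publish("@jobsworth", "Using Chatter in anger")
bus.publish("opportunity:0061", "Stage moved to Closed Won")
print(inbox)
```

The asymmetry matters: a topic needs to know nothing about who follows it, which is what makes “many inanimate publishers” and the asymmetric follow cheap to support.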

Tweets as pheromones, and their Chatter equivalents

Nearly a year ago, I spent some time looking at why we share and what we share, using tools like Twitter. More recently, I took this further, in a three-part post looking at social objects and their role in such communal enterprise activities, which you can read here, here and here. And a few weeks ago, I tried to put all this in the context of why sharing is important in the enterprise, a theme I looked at tongue-in-cheek here.

Which brings me to the nub of this post.

“Birds do it, bees do it, even educated fleas do it”

Like falling in love, providing signals meant for sharing is normal and natural. We have to start thinking of tweets as the knowledge worker’s pheromones. Signalling. Alerting. Marking out “territory”. Warning off. Pointing towards food or shelter. Looking for relationship. Sometimes preparatory, sometimes catalytic, sometimes just plain old informative.

But always social, always designed to share.

Sometimes only visible to your “conspecifics”, to those belonging to your own species.

Sometimes visible to all.

Sometimes reinforced by repeated overlays and relays.

But always, always social, always, always designed to share.

More later, as I extend this theme into compound and multi-authored shared signals, and into the immense value in being able to record, replay, aggregate and analyse them.

That’s for my next post… that is, if I find the comments and feedback such that writing a next post becomes worth the while. [Reminds me of an apocryphal Churchill-Bernard Shaw story. Shaw is meant to have sent Churchill a pair of tickets to the opening night of one of his plays, saying “bring a friend… if you have one”. Churchill is meant to have replied, returning the tickets, “can’t make opening night. will make second. if you have one.”]