More sniffing around Twitter, Chatter and pheromones

[Note: This is my third post in a series I’ve been writing on this topic; the two previous posts immediately precede this one]. What I want to do here is touch on a few subjects that came up in earlier posts, where I didn’t really have the time or space to express what I meant adequately. My intention in sharing all this is to give you as much depth as I can into my thoughts on the use of tools like Twitter and Chatter.

 

Connected versus channelled

Some of you may have noticed that, in previous posts, I appear to make a big thing of wanting to place filters at the point of receipt rather than at the point of dissemination, at the “subscriber” level rather than at the “publisher” level. This is no random thought: it represents something I have believed in ever since I took up blogging, and you will find it a recurrent theme in the kernel for this blog. There are a number of reasons for it, and I’m going to try and articulate them as succinctly as I dare.

Michael Polanyi, in helping us understand what he meant by “tacit knowledge”, is reputed to have said something along the lines of “there are things we know we know, things we know we don’t know, things we don’t know we know and things we don’t know we don’t know”. That fourth bit, the things we don’t know we don’t know, has always intrigued me. As a result, I used to walk around telling myself: “filter on the way out, not on the way in. Let everything come in, you don’t know what you don’t know.” What I was trying to do was to minimise the building of anchors and frames that would constrain or corrupt what was allowed to enter my head, what Einstein called “common sense: the collection of prejudices collected by age eighteen“.

When I see words like “connected” and “channelled” they conjure up different meanings, heavily laden with my own prejudices, despite all my efforts to avoid such prejudices. “Channelled” suggests a one-way street, a broadcast model, a structure where I am a recipient of a signal with all the choices made by the sender of the signal. “Connected”, on the other hand, has a sense of being two-way, interactive, with some sort of parity or equality between the things that are connected.

There’s also something else, something darker, harder to put my finger on, evoking a deep sense of distrust. And it’s rooted in some modern variant of Say’s Law: supply creates its own demand. What do I mean? Well, let’s take terrorism laws. Come, perform an experiment with me. Open a separate tab or window in your browser, bring up Google and enter the term “UK terror laws used to snoop”. Just look at what you get. Here’s a sample of the things local councils have used terror laws for, checking whether:

  • nurseries were selling plants unlawfully
  • a child lived in a school catchment area
  • fishermen were gathering shellfish illegally
  • alcohol was being sold to the under-aged
  • benefit claims were fraudulent
  • people’s dogs were fouling
  • people were littering
  • cows were meandering
  • calls were made to 900 number phone lines

It’s a much much longer list, with over 470 councils invoking the laws over 10,000 times in a nine year period. Why do they do this? Because they can.

Coming from a family of journalists, and having lived as an adult through the “Emergency” years in India, and having been on the receiving end of some of the power that such states wield, I’ve felt more strongly about such misuse than most.

With all this in mind, I want to remain connected, not channelled. I want to be able to choose what I can know about, learn about, be told about. I don’t want to block out what I don’t know. I don’t want the technology to have tools for censorship built in, which in effect is what happens when filters are designed into publishers. It is too easy to game the publisher end of the market, far harder to game the subscriber end.

So I try and avoid filtering at source. I have no problem with tags, with providing people the metadata that simplifies filtering at subscriber level. But the mechanisms for tagging at source should be designed in a way that they can’t become choke points used by the unprincipled.

 

Avoiding echo chambers, groupthink and herd behaviour

When social networks are used to share information upon which decisions may be made, you will always hear someone bring up the echo-chamber risk. After all, if you put a bunch of like-minded people together, you will get repeated assertions of the same thing. Or so the theory goes.

Wrong. Now this is not deep research, but anecdotally the results have been positive enough for me to want to assert this. Social networks bring together people who have a few common interests, rather than people who hold common views about those interests, or who replicate those interests. My twitter followers are not clones of me. Very few of them are into chillies and capsaicin in a big way; very few have the same “retarded hippie” tastes in music I do; very few are as crazy about cooking (and eating) as I am; very few are Indian and 53; very few go to church every Sunday. Some do. But not all.

Social networks create value because people in the networks come together, drawn by what they have in common, but creating value because of what they don’t have in common.

There have been a number of discussions recently about the “dangers” of direct democracy: how could we possibly run anything, manage anything, lead anything,  based on the statistically expressed will of the Great Unwashed?

Surely what will happen is that people will keep on asking for faster horses.

Perhaps.

Maybe.

But who are we to decide that everyone else is wrong?

The tools we have today allow for greater dissemination of information than we’ve ever had before. Attempts to control, suppress or subvert the free passage of information are becoming harder and harder to pull off; there’s a Wikileaks waiting to happen in every command-and-control centralised hierarchical set-up. These tools are becoming ubiquitous, affordable, effective, and the empowerment of the edge continues apace. Snap polls are no longer about random sampling, not when there’s a Facebook around. [Incidentally, don’t underestimate the value of having good polling mechanisms in systems like Twitter and Chatter].

Democratisation does not yield dummification. Except perhaps in the eyes of elitist experts.

 

Signals, not trails: improving our work lives

Some of the comments I’ve received, some of the references I’ve been pointed towards, have a tendency to veer towards a trail-like analogy for lifestreaming and workstreaming. This is possibly due to my use of the pheromone analogy. If that has happened, I am sorry; that was not my intention. If anything, my use of the Wikipedia article in the first post was an attempt to avoid just that, by showing that the pheromone classification went way beyond the concept of trail.

Since then, on a the-physics-is-different basis, I’ve tried to bring in the time dimension as well. The signals we share as we workstream are separable by time, and each “layer” of time does not in any way corrupt other layers, contiguous or not. And I feel the very existence of these signal histories helps us improve our work lives dramatically.

How?

In four ways.

Firstly, they give us institutional memory as to what happened, what was done. This allows us to break away from blame cultures, move towards an environment of “We have not failed, we have found ten thousand ways that do not work”…. but with a difference. By being able to record the conditions under which something did not work, we learn something about the conditions under which something will work. And we can form the equivalent of seed-banks under the icecaps of organisations, storing the seeds we need for conditions that do not exist today, but could exist at a future date.

Secondly, they give us the ability to trend behaviours and forecast with somewhat more accuracy than has been the case in the past, based on data rather than political connections. It used to be said that history will always be written from the perspective of the hunter until lions learn to speak. Well, lions can speak. Now. Histories are less likely to be corrupted if they are constructed by bringing together squadrons of disparate tweetstreams. This sort of crowdsourcing of information has been happening for some time now; I could not hide my glee when I learnt that 18th-century ships’ captains’ logs were being used to conduct climate change research. [And thank you, everyone involved in the project, for making sure the output was not behind a paywall, that it was searchable and retrievable. How wonderful.]

Thirdly (and this may be my most controversial point) I think they make our work more interesting. Humour me on this. One of the most depressing things about the Industrial Revolution, assembly-line thinking and division of labour was the way human beings were somewhat dehumanised as a result, becoming narrow specialists good at doing mind-numbingly boring things well. Five or six years ago, I had the pleasure of listening to John Seely Brown and John Hagel at a Supernova conference (thank you Kevin Werbach) talking about motorcycle factories in China and how collaboration took place because people weren’t working sequentially. And it got me thinking.

It got me thinking about the new generation, and how they seemed comfortable multi-tasking, how they were being accused of being ADHD as if ADHD were an epidemic. [If you have not watched Sir Ken Robinson’s talk on changing education paradigms, stop everything you’re doing and watch this 11-minute video. Then watch the whole thing, the hour-long version, link provided below the summary. Thank you RSA!]

It got me thinking about knowledge workers and the lumpiness of knowledge work, the implications for the generation of cognitive surplus in the enterprise.

And it got me to a point where I saw the possibility that division of labour was a thing of the past. That for the millennial knowledge worker in a social network with workstreaming, switching costs were tending to zero.

More to chew on. I’ll be back. Comment away.

Thinking more about Twitter, Chatter and knowledge worker pheromones

Introduction

This post is a follow-up to one I wrote a few days ago; based on the twitter and mail feedback, and on the comments I’ve received via blog and facebook, it seemed worthwhile to continue discussions on this train of thought.

Summary of previous post

Let me first summarise where I was trying to go with the previous post:

  • It’s normal and natural for human beings to “publish” signals that can be shared
  • Signals can be of many types: alerts and alarms, territorial markers, calls for action, pure unadulterated information
  • Signals can be shared in small groups or made available to all
  • As the tools for sharing improve, and as they become accessible by more and more people, the sharing of signals will grow
  • Twitter and Chatter are the leading examples of consumer and enterprise tools for sharing signals
  • All this could have significant benefits for us, at home, at work and at play (that’s if we know the difference any more), particularly as we grasp the value of knowledge worker cognitive surplus

The role of technology in all this

At the outset I want to make sure people understand that technology is nothing more than an enabler in all this. People must want to share, to make productive use of their cognitive surplus. It may sound altruistic, it may sound Utopian, all this emphasis on sharing; but we all have to get it into our heads that man is a social animal, and that sharing creates value, both at home and at work. Short-sighted, ill-thought-out and sometimes downright nefarious approaches to “intellectual property rights” over the last fifty years or so have blinkered people from seeing this fundamental point.

Sharing is a very people-centric concept, part of our culture, part of our values. In a sense these posts are not about Twitter or Chatter, but about the existence of toolsets that make sharing easier, more accessible, more affordable. If we start believing that it’s all about the technology, we will start convincing ourselves that events in Tunisia and Egypt and Libya were about the technology rather than the will of the people. So hey, let’s be careful out there.

When I was young, I was taught that there were three “waves” to technology adoption. In wave 1, there was a substitution effect: people used the new technology to do something they used to do with something else, substituting the car for the horse. In wave 2, there was an “increased use” effect: a horse would take you maybe 40 miles in a day at best, a car could take you 400 miles in that same day. So people could travel further. And in wave 3, we had “embedded use”, where the technology was an intrinsic part of a new product or service, unseen before: smartcards are an example.

The Nineties and the dot.com boom led to a newish taxonomy for markets and products and services, as startups and venture capitalists desperately tried to put labels on things: there was a flowering of “categories” and “category-busters”. Maybe I’ve read too much Kevin Kelly (though I don’t really think that is possible): I would strongly recommend every one of his books, right up to his latest, “What Technology Wants”. Over the years it became clear I was beginning to get hung up on seeing various aspects of technology strictly from an evolutionary standpoint.

This came to a head recently when I was reading Tim Flannery’s Here On Earth, another wonderful book. Go out and buy it now. Stop reading this post. Come back later. You won’t regret it. Anyway, let me tell you about the a-ha moment I had while reading that book.

Technology speeds up evolution

In Here On Earth, Flannery spends some time talking about technology as a means of radically speeding up evolution; the way I interpreted what he was saying, he was comparing the time taken for species to grow armour or fangs or claws with the time taken for humans to make suits of armour and spears.

This resonated a lot with me, because for the past two years I’ve been writing a book about information seen from the perspective of food: information ingredients, preparation of information, nutrients in information, toxic information, how information is processed, the concept of information waste, all around the fulcrum of an information diet. Part of the stimulus for writing that book was gaining a deeper understanding of cooking as an “external stomach”.

Implications for the pheromone concept as applied to Twitter and Chatter

With all this in mind, it is not enough for us to visualise tweets as pheromones, copying nature; we should look further, see what we can do that we could not have done before, extending what nature shows us. I’ve tried to build a small list to help people think this through:

  • Pheromones aren’t archivable, indexable, searchable, retrievable
  • Pheromones can’t be analysed for historical trends
  • Pheromones can’t be mashed
  • Segmenting pheromones isn’t easy
  • So what we do with digital pheromones should do all this, overcoming the constraints of natural pheromones; a small sketch of what that might look like follows this list
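To make that contrast concrete, here is a minimal sketch in Python of what a digital “pheromone” might look like once those constraints are gone. The class and field names are entirely my own invention, for illustration only: each signal carries an author, a timestamp and tags, and the archive around it can be indexed, searched and segmented at will.

```python
from dataclasses import dataclass, field
from datetime import datetime
from collections import defaultdict

@dataclass
class Signal:
    """A digital 'pheromone': unlike the chemical kind, it persists and carries metadata."""
    author: str
    body: str
    tags: set = field(default_factory=set)
    when: datetime = field(default_factory=datetime.utcnow)

class SignalArchive:
    """Archivable, indexable, searchable, segmentable -- everything a scent trail is not."""
    def __init__(self):
        self.signals = []                  # the archive: nothing decays unless we want it to
        self.by_tag = defaultdict(list)    # the index: immediate lookup by tag

    def publish(self, signal):
        self.signals.append(signal)
        for tag in signal.tags:
            self.by_tag[tag].append(signal)

    def search(self, tag=None, author=None):
        """Segment the archive by tag, by author, or by both."""
        candidates = self.by_tag[tag] if tag else self.signals
        return [s for s in candidates if author is None or s.author == author]

# usage
archive = SignalArchive()
archive.publish(Signal("jobsworth", "Slow-cooking a ragu today", {"food"}))
archive.publish(Signal("jobsworth", "Conference keynote starting", {"conference"}))
print(len(archive.search(tag="food", author="jobsworth")))  # -> 1
```

Nothing here decays unless we decide it should, which is precisely the point.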

An aside: Whenever I try and learn something, I start with understanding similarities with other things I know, then I concentrate on the differences. In a way that’s what I’m doing with this post; the previous post concentrated on similarities between tweeting and pheromones, this one will concentrate on differences and why they’re valuable.

Histories, trending and analytics

Unlike pheromones, tweets are digital and can easily be archived, stored, tagged and classified, searched for, retrieved at will. So the signals are more easily accessible to a larger group of people. In this context, we have to think of the signalling aspect of tweets, from an economics viewpoint, as “extreme nonrival goods”: one person’s use of the signal does not impact the ability of others to use the same signal. Obviously there are constraints to do with scarcities and abundances: an ant can follow a trail to a store of food only to find that the store has been used up; physical things tend to obey laws of scarcity, while digital things tend to obey laws of abundance. [Don’t get me started on the abominations taking place in the Digital Economy Act space; that’s for another post, another time.]

There are no time series for pheromone tracks, nor an easy way to find out “what’s trending now”. These are very powerful capabilities in the twittersphere, as long as the data is there, accessible and exportable. The implications for what used to be called “knowledge management” are radical and extreme. Visualisation tools become more and more important, in terms of tag clouds and heat maps and radar diagrams and the like.
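As a rough illustration of the “what’s trending now” point, here is a small sketch, again in Python and again with invented names: given timestamped, tagged signals, trending is little more than counting tags inside a recent time window, something no pheromone track can offer.

```python
from collections import Counter
from datetime import datetime, timedelta

def trending(signals, window_hours=1, top_n=5):
    """Count tags seen within the last `window_hours` and return the most frequent."""
    cutoff = datetime.utcnow() - timedelta(hours=window_hours)
    counts = Counter(
        tag
        for s in signals
        if s["when"] >= cutoff
        for tag in s["tags"]
    )
    return counts.most_common(top_n)

# usage with a couple of hand-made signals
now = datetime.utcnow()
sample = [
    {"when": now, "tags": ["food", "ragu"]},
    {"when": now, "tags": ["food"]},
    {"when": now - timedelta(hours=3), "tags": ["music"]},  # outside the window, ignored
]
print(trending(sample))  # -> [('food', 2), ('ragu', 1)]
```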

Understanding the location implications of the signals is also very valuable. Some years ago I was told that every desktop in Google had its latitude and longitude embedded in the desktop ID; this became very valuable for doing analysis of things like prediction markets, when you want to see the impact of physical adjacency on the results.

But that was in another country, and besides the wench is dead. Today, we live in times when workspaces no longer need to have desks, so the concept of the desktop becomes more and more questionable. Location becomes something dynamic, which is why the GPS or equivalent in smartphones and tablets is coming to the fore.

The physics is different

Many years ago, when I was first venturing into virtual worlds, I remember reading an article which really struck me at the time. What it said was that in virtual worlds, “the physics is different”. With no gravity, no heat, no light, no cold, there was no reason you couldn’t fly or starve or walk around naked for that matter.

There’s a bit of the-physics-is-different about tweets. The pheromone concept tends to deal with signals given by a single author, then amplified (or allowed to decay) by other authors either overlaying the signal or avoiding it. Which means in effect that the signals are aggregated in the same physical area and in the same period of time. Tweets don’t face that constraint. As a result, we can daisy-chain tweets by a single person over a period of time, “geographically dispersed” in space or by logical context. Or we can aggregate all tweets sharing some characteristic or the other. Or, for that matter, block out all tweets that share specific characteristics.

This ability to tune in or tune out at such a level of granularity is of critical value, particularly when it comes to filtering.

The need for good filtering

If everyone tweeted and everything tweeted, soon all would be noise and no signal. As Clay Shirky said, there is no such thing as information overload, there’s only filter failure.  In other words, information overload is not a production problem but one of consumption.

This is important. Too often, whenever there is a sense of overload, people start trying to filter at the production point. In a publish-subscribe environment, this translates to asking the publisher to take action to solve the problem. My instinct goes completely against this. I think we should always allow publishing to carry on unfettered, unhampered, and that all filtering should take place at the edge, at the subscriber level. There’s something very freedom-of-expression and freedom-of-speech about it. But it goes further: the more we try and concentrate on building filters at publisher level, the more we build systems open to bullying and misuse by creating central bottlenecks. Choke points are dangerous in such environments.

It is far better to build filters at subscriber level. Take my twitter feed, @jobsworth. Most of my tweets are about four things: my thoughts about information, often related to blog posts; the food I’m cooking and eating; the music I’m listening to; and my summaries and reports on conferences and workshops and seminars. [I tend not to tweet at sports events because of the spoiler risk].

So while there is a fairly low underlying tweet level, my twitter activity is bursty, lumpy. At weekends it goes up as I play music at blip.fm/jobsworth; when I’m at conferences it can go up to 50 tweets an hour; and, also usually at weekends, when I’m cooking, the tweet frequency goes up. This lumpiness is uncomfortable for some people: they find that every now and then I dominate their tweetspace.

As a result, over the years, a number of people have asked me whether I can suppress one or the other, e.g. by cutting the link between twitter and facebook, blip.fm and twitter, et cetera, or, more often, asking me to fragment my twitter ID, have one for food and one for music and so on and so forth.

Letting the subscriber do the filtering

This is not a radical idea. The whole point of personalisation is that it takes place at the edge and not at the core. So, to solve the lumpiness problem, we need better subscriber tools. With such tools, you should be able to say, I want to follow @jobsworth’s conference tweets and his food tweets but not his music tweets, while someone else says the precise opposite. And all I will have to do is to ensure that the 21st century equivalent of hashtags is used to segment and categorise and classify my tweets. Implementing filters at publisher level is a broadcast concept, and, furthermore, runs the risk of misuse: every time you build a choke point, someone will come along and try to exert undue influence over that choke point. We shouldn’t have governments and quasi-governments telling publishers and ISPs what to publish and what not to publish, not in lands where words like “free” have any meaning.
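By way of illustration, here is a minimal sketch of what such a subscriber-side filter could look like, in Python; the handles and hashtags below are just examples, not a real API. The publisher does nothing but tag; every subscriber applies their own include and exclude rules at the point of receipt.

```python
def subscriber_filter(tweets, follow, include_tags=None, exclude_tags=None):
    """Filter a stream at the subscriber's end: the publisher just publishes and tags."""
    for tweet in tweets:
        if tweet["author"] not in follow:
            continue
        tags = set(tweet["tags"])
        if include_tags and not tags & include_tags:
            continue  # none of the tags this subscriber cares about
        if exclude_tags and tags & exclude_tags:
            continue  # something this subscriber has chosen to tune out
        yield tweet

# one subscriber wants my conference and food tweets, but not the music
stream = [
    {"author": "jobsworth", "tags": ["conference"], "text": "Liveblogging the keynote"},
    {"author": "jobsworth", "tags": ["music"], "text": "Blipping some Leonard Cohen"},
    {"author": "jobsworth", "tags": ["food"], "text": "The ragu is on"},
]
for t in subscriber_filter(stream, follow={"jobsworth"},
                           include_tags={"conference", "food"},
                           exclude_tags={"music"}):
    print(t["text"])
```

All the choices sit with the subscriber; the publisher’s only job is to tag honestly.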

Lifestreaming implications for workstreaming

I’ve used tweets as a generic term, not bothering to differentiate between Twitter and Chatter. The same things hold true for both Twitter and Chatter, in most cases. But there are a few strategic differences.

Firstly, unlike Twitter, Chatter tends to operate in a space between systems of engagement and systems of record (see my post on this distinction here); in enterprises, identity is subject to strict verification, making this simpler to do. Secondly, a Chatter world is likely to have many inanimate publishers, and the “asymmetric follow” (a term I first saw used by James Governor of RedMonk) becomes important. [Think about it for a minute. You get spam in your email? Boo-hoo. How much of your email spam is actually formal and internal to your company, rather than as a result of malign external forces?]

The ability to have strong verification of identity in a corporate context has many benefits, since it is then possible to have formal attributes, values and characteristics associated with an individual. This is where “gamification” and “badges” start having real tangible value in the enterprise, with different classes of badges, some personally asserted, some bestowed by a third party, some “earned” as a result of completing an activity or activities successfully.
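A tiny sketch of how those classes of badge might be represented; the names below are mine, purely for illustration, and not taken from any particular product.

```python
from dataclasses import dataclass
from enum import Enum

class BadgeClass(Enum):
    ASSERTED = "self-asserted"      # claimed by the individual
    BESTOWED = "bestowed"           # granted by a third party
    EARNED = "earned"               # awarded on successful completion of an activity

@dataclass
class Badge:
    name: str
    badge_class: BadgeClass
    verified_identity: str          # who the badge belongs to, in the corporate directory

ragu_master = Badge("Ragu Master", BadgeClass.EARNED, verified_identity="jobsworth@example.com")
```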

And on to Part 3

Again, I shall wait for feedback, and if people continue to be interested, I shall take this further. This time I intend to concentrate on the lumpiness of knowledge work; why workstreaming, in combination with a couple of other techniques, can make sensible use of the cognitive surplus; how this will allow enterprises of all sizes to move away from traditional politically charged blame-cultures to genuine value-builders.

And most importantly, I want to discuss why lifestreaming and workstreaming actually make us smarter human beings, in comparison with the dumbing-down that took place during the Industrial Age with its assembly line, division of labour, broadcast mindset, built on economics of scarcity, hierarchical to the extreme. Assembly line. Division of labour. Broadcast mindset.  Scarcity-focused. Hierarchical in nature. Five constructs that have destroyed education, healthcare and government, and will soon destroy all industry. If we let them.

Thinking about Twitter and Chatter: the knowledge worker’s pheromones

Introduction: Background and influences

If you’ve been reading this blog for a while, then you’d have come across the name of Clay Spinuzzi. I’ve been following his work for about five years now, and had the privilege of meeting him for breakfast while vacationing in Austin in the summer of 2008. Clay introduced the term “ambient signalling” into my thought process, a key ingredient in the thinking that led to this post. It led me to read his work on organisational genres and on participatory design, often within the perspective of networks.

A number of other people have also influenced me considerably when it comes to this particular post, and I’d like to declare my debt to them at the outset. Howard Rheingold (whom I was meant to meet earlier this week, but couldn’t, as a result of my unwisely choosing to tear ankle ligaments rather than fall down) really set the scene for me over two decades ago in his Whole Earth Review and WELL writings, followed by his excellent book on The Virtual Community in the early 1990s; one of his other books, Smart Mobs, which I read nearly a decade ago, was also a key influence.

The next breakthrough came when, as a subscriber to Chris “Rageboy” Locke’s Entropy Gradient Reversals, I was taken on my first ride on the Cluetrain. I can’t tell you how many times I’ve read The Cluetrain Manifesto, but it’s in double digits. [Disclosure: I met Chris soon after the book was published, Doc soon after that, David shortly after that and Rick a few years later; they remain good friends of mine, and I was deeply honoured when they asked me to contribute a guest chapter to the 10th Anniversary edition of the book.] Cluetrain really got me thinking about community interaction from a business perspective, beyond the fortress-like walls of the corporation.

Around the same time, I had the opportunity to read Amy Jo Kim’s excellent Community Building on the Web. Over the years I’ve bought at least five copies of the book; it’s one of those regularly borrowed, rarely returned. I had the opportunity to meet her at Supernova some years ago. The book sparked my personal interest in game design and game mechanics from a business perspective, something Amy Jo continues to work on with her usual flair.

Shortly after that, I read Steven Johnson’s Emergence, deepening my understanding of slime mould and ants and swarming, all against the context I’ve described above. I’ve had the opportunity to meet Steven since, and continue to follow his work as well; he really helped me understand something about headlessness, how everything is a node in the network.

The final, and critical, influence was that of Clay Shirky, whose blog I followed religiously over the years, even when his output shifted more to books and speeches. It was he who helped me bring all this together with the trenchancy of his analysis of how communities work. Insights from him on three aspects helped establish my thinking: to create sustainable commons material, the cost of repair should be at least as low as the cost of damage (the undo button in Wikipedia is an example); that there is no such thing as information overload, only filter failure (which helped me understand something about the recommending, curating, filtering roles of network members); and that in the age of knowledge workers, much good can be achieved by effective use of what Clay terms Cognitive Surplus, the title of his most recent book. I’ve known Clay for some time as well. We had a delightful lunch together in Davis this year, and I visited him at Tisch when I was last in New York, a wonderful set-up.

Why am I telling you all this? Two reasons. Firstly, to give thanks where thanks is due, to point out the people who have influenced and inspired me in this particular context. Secondly, as a consequence, to give you the opportunity to go deeper into the influences, research things for yourself. Some of you obviously know all this, have read all the books, met the authors; for you, this may seem onerous and repetitive, and you’ve probably skipped all this anyway. This introductory section is for the rest of the readers.

Pheromones

For a few years now, I’ve been looking at information systems and services as if they were biological in origin, serene in the attitude that, if the 20th century was meant to be the age of physics, the century we live in will be characterised as the age of biology. Using that perspective, it was only a matter of time before the “ambient signalling” spoken about by Clay Spinuzzi would start feeling like the pheromones laid down by ants as they go about their chores. Which, unsurprisingly, led me to Wikipedia, to start reading up on pheromones.

There I found the definition of “pheromone” to be: a secreted or excreted chemical factor that triggers a social response in members of the same species. Now that interested me greatly. A “social response” as opposed to any other kind of response. So I looked further, and Wikipedia informed me that Peter Karlson and Martin Luscher introduced the term “pheromone” in 1959, to describe chemicals that influence specific behaviours in two or more “conspecifics”, members of the same species. The intended etymology of the neologism was itself of interest: they formed this word from the Greek roots pherein (to transport, to bear) and hormone (stimulus, impetus); so a pheromone became something that was a carrier of stimuli. Hmmm.

Types of pheromones

It turns out that there are many types of pheromones, classified in different ways. You have the concept of primer, releaser and information pheromones: primers kick off developmental changes, releasers make you change your behaviour, and information pheromones just tell you things. If you look further, you find far more detailed classifications of pheromones: aggregation, alarm, epideictic, signal, territorial, trail, sex, and so on. If you’re interested, please read the Wikipedia article on pheromones yourself, which gives you the basics on pheromone types.

Intriguingly, “there are physical limits on the practical size of organisms employing pheromones”.

Twitter, Chatter and their pub-sub nature

I’ve always thought of Twitter as a publish-subscribe mechanism, and, in similar vein, of Chatter as an enterprise bus with pub-sub built in. [Disclosure: I’m Chief Scientist at Salesforce, the people behind Chatter. It’s one of the key reasons I joined the company]. I’ve been lucky enough to work with people who’ve believed in bus-based architectures for some time now; and, ever since delving into EDI in the early 1980s, I’ve been a convert to recipient “beneficiary” driven transaction and messaging systems.

Over the years, as I’ve continued to play with Twitter and with Chatter, their innate strength-from-simplicity has become more apparent to me. Which is why I wrote posts describing Twitter as a “submarine in the ocean of the web” some years ago. I’ve been able to use Twitter to rescue hamsters lost down holes in floorboards, to get visas for foreign travel, to collect hand-me-down recipes for ragu, to acquire limited-release CDs in the city of origin, the list goes on and on. And, now that I’ve been using Chatter in anger for the past six months, I’ve been able to learn something about its differentiated value. How following “things” as well as people becomes valuable in a business context. How exception handling becoming the norm is no longer a frightening thought. How closed and open groups can overlap and coexist. How institutional memory is established and rekindled, how new forms of knowledge leadership emerge as a result.
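To make the pub-sub framing concrete, here is a toy sketch in Python of a bus where “things” (a record, an opportunity, a case) can be followed just as easily as people. It is an illustration of the idea only, with invented names throughout, and emphatically not a description of how Chatter itself is built.

```python
from collections import defaultdict

class Bus:
    """A toy publish-subscribe bus: publishers emit, subscribers choose what to follow."""
    def __init__(self):
        self.followers = defaultdict(list)   # topic -> list of callbacks

    def follow(self, topic, callback):
        self.followers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.followers[topic]:
            callback(topic, message)

bus = Bus()
inbox = []

# follow a person and a "thing" with equal ease
bus.follow("user:jobsworth", lambda t, m: inbox.append((t, m)))
bus.follow("record:opportunity-42", lambda t, m: inbox.append((t, m)))

bus.publish("user:jobsworth", "Workstreaming: drafting the Q3 note")
bus.publish("record:opportunity-42", "Stage changed to Negotiation")
print(inbox)
```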

Tweets as pheromones, and their Chatter equivalents

Nearly a year ago, I spent some time looking at why we share and what we share, using tools like Twitter. More recently, I took this further, in a three-part post looking at social objects and their role in such communal enterprise activities, which you can read here, here and here. And a few weeks ago, I tried to put all this in the context of why sharing is important in the enterprise, a theme I looked at tongue-in-cheek here.

Which brings me to the nub of this post.

“Birds do it, bees do it, even educated fleas do it”

Like falling in love, providing signals meant for sharing is normal and natural. We have to start thinking of tweets as the knowledge worker’s pheromones. Signalling. Alerting. Marking out “territory”. Warning off. Pointing towards food or shelter. Looking for relationship. Sometimes preparatory, sometimes catalytic, sometimes just plain old informative.

But always social, always designed to share.

Sometimes only visible to your “conspecifics”, to those belonging to your own species.

Sometimes visible to all.

Sometimes reinforced by repeated overlays and relays.

But always always social, always always designed to share.

More later, as I extend this theme into compound and multi-authored shared signals, and of the immense value in being able to record, replay, aggregate, analyse.

That’s for my next post….that’s if I find the comments and feedback such that writing a next post becomes worth the while. [Reminds me of an apocryphal Churchill-Bernard Shaw story. Shaw is meant to have sent Churchill a pair of tickets to the opening night of one of his plays, saying “bring a friend… if you have one”. Churchill is meant to have replied, returning the tickets, “can’t make opening night. will make second. if you have one.”]


Musing about sharing and social in business

To paraphrase Peter Drucker, the primary purpose of a business is to create customers, people who are able and willing to part with their money to buy goods and services from you.

To paraphrase Ronald Coase, the primary purpose of a firm is to reduce business transaction costs, principally the costs of information, search, contracting and enforcement.

Words like “sharing” and “social” are often treated as fluffy and ephemeral and Utopian and otherworldly, dismissed as being too pinko-lefty-tree-hugger to make business sense.

Which raises the question: what makes business sense?

In this context, I think it’s reasonable to assert that anything that helps businesses to create customers and/or to reduce transaction costs is worth doing, “makes business sense”.

So any debate about the value of “sharing” and of “social” in the enterprise needs to consider these things.

Social networks reduce transaction costs in a number of ways. Search costs are reduced as a result of network-driven recommendations, votes, ratings, increasing signal and quieting the noise. Some of these are “passive” outcomes, where the value is a function of the overall size of the network, provided as an anonymous aggregation. Much of collaborative filtering falls into this category. As against this, we also have “active” outcomes, where the value is a function of the size of your network, provided as an “onymous” recommendation to you by a person or people with real knowledge of you and your needs. Contracting costs are reduced as a result of the ability to have transferable, “nested” trust: you extend trust to the people trusted by those you trust. Enforcement costs also reduce as a result of this trust daisy-chain, since adherence to common principles and rules is made easier in an environment of trust, simplifying the process of recourse in the event of breach.
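As a back-of-the-envelope illustration of that “nested” trust, here is a small sketch in Python: trust propagates along the daisy-chain, discounted a little at each hop. The graph, the names and the decay factor are all invented for the example.

```python
def nested_trust(trust_graph, start, decay=0.5, max_hops=3):
    """Propagate trust outwards from `start`, discounting by `decay` at each hop."""
    scores = {start: 1.0}
    frontier = [start]
    for hop in range(1, max_hops + 1):
        next_frontier = []
        for person in frontier:
            for friend in trust_graph.get(person, []):
                candidate = decay ** hop
                if candidate > scores.get(friend, 0.0):
                    scores[friend] = candidate
                    next_frontier.append(friend)
        frontier = next_frontier
    return scores

# I trust Asha directly; Asha trusts Ben; Ben trusts Chloe
graph = {"me": ["asha"], "asha": ["ben"], "ben": ["chloe"]}
print(nested_trust(graph, "me"))  # -> {'me': 1.0, 'asha': 0.5, 'ben': 0.25, 'chloe': 0.125}
```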

Sharing also reduces transaction costs in a number of ways. As Linus’s Law put it, given enough eyeballs all bugs are shallow. Transparently shared information is less likely to contain errors, as a result of the crowdsourced “proofreading” that takes place. Tasks that would otherwise be hard to parallelise are made parallel through sharing techniques, as seen in the way people study satellite imagery in order to find big things that have been lost for years. Even the tragic death of Steve Fossett had an unintended outcome: the discovery of the locations of many crashed planes as a result of thousands of people poring over thousands of images. There’s also the question of liberating latent “cognitive surplus”. Knowledge work is intrinsically lumpy in character: it comes in peaks and troughs rather than in a smoothed out manner. But people are not prepared to look unemployed as a result of that lumpiness, for fear of losing their jobs. So they busy themselves with the only tool provided to them: meetings. By sharing information resources and providing the right tools to create, find, check and correct information where required, firms can convert such cognitive surplus into tangible value, as error rates drop.

In similar vein, social networks help us understand customer intentions, needs and wants more accurately, by bringing the customer into the conversation and providing that customer with a voice. Tools to monitor, analyse and report on the conversations taking place are improving as we speak.

These conversations don’t take place in walled gardens but out in the open, the shared spaces of life, both in culture as well as in technology and architecture. Markets are conversations, as Doc Searls said. It is through the sharing of social objects that communities form and grow.

The social objects have another critical use: they form rolling stones that gather the moss of metadata we all need. As we move from hierarchies of product and customer to networks of capabilities and relationships, the topology of the business and the firm changes. Vertical integration is replaced by an architecture best defined as high cohesion with loose coupling or, if you prefer, small pieces loosely joined, to quote David Weinberger.

This horizontalisation of industry and business is taking place while one other major shift takes place, the democratisation of access to real computing power. Neither telcos nor IT companies have any control of the device at the edge of the service. Without end-to-end control, and in a small-pieces-daisy-chained world, it is no longer possible to have a finite number of low-volatility repeatable processes.

Instead we have patterns.

And as we move from process management and execution to pattern recognition and response, the metadata we collect becomes more important; the ability to spot patterns in the data and in the metadata becomes more important; the ability to extrapolate from the data to the pattern becomes more important. Which in turn means we have to get better at how we describe things, again something that requires us to consider the concept of cognitive surplus, how to apply it, how to convert it into value for our customers, for our business, for our shareholders.

Businesses exist to create customers. They are organised into firms in order to reduce transaction costs. The use of social and sharing tools within the enterprise needs to be looked at in these contexts and not dismissed out of hand as is often the case.


A lazy Sunday playlist

The man that hath no music in himself,
Nor is not mov’d with concord of sweet sounds,
Is fit for treasons, stratagems, and spoils.

The motions of his spirit are dull as night
And his affections dark as Erebus:
Let no such man be trusted.

So said the Bard via the voice of Lorenzo in The Merchant of Venice. One of my favourite quotations.

I’ve always loved music, and tend to have a song playing in my head much of the time, whatever I’m doing. Which may sound strange, especially since neither I nor my siblings (nor, for that matter, our parents) showed any significant sign of being “musical”. Other than the usual teenage-angst thing of playing guitar, I can’t remember any of us actually picking up a musical instrument.

But we had relatives and friends aplenty who made up for our shortcomings in this respect, and the house I grew up in reverberated much of the day (and possibly even more of the night) with music. There was music everywhere.

This, despite growing up before television, and before the video recorder had made its messy inroads into our lives. This, despite the frequent paucity of electrical power and the relative absence of battery-driven solid-state radios.

For the most part, the music we listened to was based on vinyl, sometimes lacquer, and the sounds scratched their way through turntables and valve amplifiers and out of simple, sturdy speakers. In later years the cassette player became the norm, given its then-unprecedented capacity to work on mains power as well as on battery. And we listened and swayed and sang along. And we even learnt to dance…. to Leonard Cohen…

Wonderful times. We were very privileged: there were some very talented musicians around then. And I’ve considered myself incredibly lucky to have been able to watch many of them “live” in later years, a trend that continues to this day. So for example in the last few years I’ve seen Steve Winwood, Neil Young, Eric Clapton, Crosby, Stills and Nash, Donovan, Don McLean, Cat Stevens, just to name a few.

I still have a bunch of their vinyl albums (and, thankfully, the ability to add to that collection as the vinyl gets retro-reissued).

I still have a bunch of their works on pre-recorded cassette tape (though I no longer have a cassette tape recorder in the house).

I still have a few thousand CDs of their works.

And I still go to watch them in concert. [Those that are still alive, that is].

If you’ve followed me on twitter (where I exist as @jobsworth) or directly at blip.fm/jobsworth, you’ve probably noticed that I tend to listen to a very narrow band of music, deeply engrossed in the period 1966-73, with occasional forays into the world that existed before and after. This is for a number of reasons.

Time. I can only listen to so much music.

Familiarity. It’s the music I grew up with, music that I’ve heard many many times.

Preference. I happen to like the styles, the genres, the whole nine yards. Everything about the music of the time.

But there’s one more reason.

An important reason.

The music was *brilliant*. And continues to be brilliant. From a time when singer-songwriters were the norm, when musicians actually played musical instruments, when the word harmony was to do with voices and not perfume, when lyrics were worth learning.

Now I’m showing my age. Every age is entitled to its music. I’m just glad my age had the music it did.

Why?

Here are a bunch of reasons why. And you know something? I could write a hundred posts like this, and still not run out of songs. So if you haven’t heard of them, do listen. And run to your favourite download site. And buy the ones you like. [And for those of you familiar with the music already, I hope I’ve contributed to your lazy Sunday.]

 

http://www.youtube.com/watch?v=EzF_MoXOU1E

http://www.youtube.com/watch?v=TYoPUkfaahI

http://www.youtube.com/watch?v=4tZtJIL5va4

http://www.youtube.com/watch?v=1gvvC0qOQng

http://www.youtube.com/watch?v=6pHKkuK3mUU

http://www.youtube.com/watch?v=uQYDvQ1HH-E