Is there a Gresham’s Law for information?

The kernel for this post is a comment and a question from Stephen Smoliar on a recent post of mine. [I think this post should come with a health warning on its length and its subject matter. You have been warned :-) ]

Gresham’s Law, simply put, states that Bad Money Drives Out Good.

If “Something” commoditised and in circulation has real and measurable intrinsic value, it can be displaced in circulation by “Something Else” of lower intrinsic value, provided that Something Else has its artificial value upheld or warranted in some way. Please do look at the linked Wikipedia entry if you are not familiar with the term, rather than relying on any of my mutterings.
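To make the mechanism concrete, here is a toy sketch with invented numbers (mine, not anything from Gresham): when every coin must be accepted at the same face value, a rational holder settles debts with the debased coins first and hoards the full-weight ones.

```python
# Toy illustration of the Gresham's Law mechanism. Two coin types share the
# same face value (legal tender), but differ in intrinsic (metal) value.
# The numbers are invented purely for illustration.
INTRINSIC = {"good": 1.00, "bad": 0.60}

def spend(wallet):
    """Spend one coin: a rational holder parts with the lowest-intrinsic
    coin first, since any coin settles the same debt at face value."""
    coin = min(wallet, key=lambda c: INTRINSIC[c])
    wallet.remove(coin)
    return coin

wallet = ["good"] * 50 + ["bad"] * 50
circulation = [spend(wallet) for _ in range(60)]   # sixty purchases

print("hoarded:", {c: wallet.count(c) for c in INTRINSIC})
print("in circulation:", {c: circulation.count(c) for c in INTRINSIC})
# Every "bad" coin enters circulation before a single "good" coin does:
# the bad money has driven out the good.
```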

Is there a Gresham’s Law for information? Not yet. Could there be a Gresham’s Law for information? There will be, if we let it happen. Should it happen? I think not; I hope not.

My reasons:

  • Information, particularly digital information, is an extreme nonrival good
  • For digital information to have value, we (the consumers/producers of information) must impute that value to information
  • That imputation of value should not come from seeking to make an abundant commodity scarce, but from new ways of imputing value to digital information
  • We already have new ways of imputing value to such information
  • The velocity of information is increasing, and traditional responses will not scale

We have the power to prevent Gresham’s Law from being applicable to information. It is up to us.

Let me now take each point in turn.

Information, particularly digital information, is an extreme nonrival good.

I quote from Rishab Aiyer Ghosh’s opening essay in CODE below:

  • The economic basis for intellectual property is nonobvious, to say the least. Unlike most forms of property, intellectual property is almost unique in requiring state support for its very existence. While it is helpful to have state protection for a plot of land, it can also be protected by, for instance, putting a fence around it, and a chair can be protected by sitting on it. Such acts of protection express your possession of your property. Information is not just an extreme nonrival good, in that many people can enjoy its benefits at the same time; information is also unusual in that ownership over it cannot be expressed through a public act of possession. You can possess information if you keep it to yourself — in which case it remains private, and nobody knows what it is that you possess. As soon as you make public the information you claim to own, it is public information that everyone can access since you no longer have any natural control over it. The extreme nonrival nature of information means that any expression of possession you make over it, after publishing it, is impotent, and your “ownership” of published information can only be guaranteed through external support, such as by the state.
  • […..]
  • The external protection of such a hard-to-possess form of property also runs against the gradient of economic sense. Information can be reproduced infinitely with no inherent marginal cost of reproduction — any cost is solely related to the medium of production. Since something with a zero marginal cost of reproduction is clearly not scarce, it also has no value that can be naturally protected.

The extreme nonrival nature of information is something to be cherished, to be nurtured, to be protected.

Next point.

For digital information to have value, we (the consumers/producers of information) must impute that value to information

This time I shall quote from Norbert Wiener. In a book titled Extrapolation, Interpolation, and Smoothing of Stationary Time Series, he has this to say:

Let us now turn from the study of time series to that of communications engineering. This is the study of messages and their transmission, whether these messages be sequences of dots and dashes, as in the Morse Code or the teletypewriter, or sound-wave patterns, as in the telephone or phonograph, or patterns representing visual images, as in telephoto services and television. In all communication engineering —[….]— the message to be transmitted is represented as some sort of array of measurable quantities distributed in time. [……] For the existence of a message, it is indeed essential that variable information be transmitted. The transmission of a single fixed item of information is of no communicative value. We must have a repertory of possible messages, and over this repertory a measure determining the probability of these messages.
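Wiener’s last sentence is essentially the setup for an information measure. A minimal sketch using Shannon’s entropy formula as that measure (my choice of concretisation, not Wiener’s own notation): a repertory containing a single fixed message carries zero bits, and the measure grows as the repertory of possible messages widens.

```python
from math import log2

def entropy(probabilities):
    """Shannon entropy, in bits, of a repertory of possible messages."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A single fixed message (probability 1) has no communicative value.
print(entropy([1.0]))                      # 0.0 bits

# Four equally likely messages carry two bits each.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits

# A skewed probability measure over the same repertory carries less.
print(entropy([0.7, 0.1, 0.1, 0.1]))       # ~1.36 bits
```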

From Wiener to Winterson. Jeanette Winterson has this to say about reading:

Good books are detonating devices, able to trigger something in the mind of the reader — a memory perhaps, or a revelation, or an understanding not possible by any other means…..The introverted nature of reading is part of its power. No one knows what you are thinking as you read. No one can see what changes might be taking place under the surface of your silent repose. It is this unaccountability to external authority that makes reading both defiant and an act of free will.

Information itself has no value. We have to impute that value, by interpreting it, giving it meaning, giving it credence. Esther Dyson covered some of this ground in a recent post in her new blog. Stephen Smoliar was on a parallel path in his recent post. The whole Semantic Web concept is built around how we can impute meaning, and therefore value, to the bits. And it is we who do that imputing.
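A minimal sketch of what that imputation can look like in Semantic Web terms (the triples and names below are invented purely for illustration): the bits mean nothing by themselves; it is the statements we layer on top of them that a machine, or another reader, can then work with.

```python
# Invented subject-predicate-object triples: meaning imputed to otherwise
# uninterpreted strings, in the spirit of the Semantic Web.
triples = [
    ("greshams_law", "statedBy", "Thomas Gresham"),
    ("greshams_law", "appliesTo", "money"),
    ("information", "isA", "nonrival_good"),
    ("information", "valuedBy", "interpretation"),
]

def query(subject=None, predicate=None, obj=None):
    """Return every triple matching the given pattern (None is a wildcard)."""
    return [t for t in triples
            if subject in (None, t[0])
            and predicate in (None, t[1])
            and obj in (None, t[2])]

# What have we asserted about information?
for s, p, o in query(subject="information"):
    print(f"{s} --{p}--> {o}")
```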

As Stewart Brand said, Information Wants To Be Free. [Or maybe the updated version is as Don Marti said, via Doc Searls, Information wants to be $6.95]

Through meaning and interpretation comes value. Remember that the only fact on a financial statement is the cash position; everything else is conventional representation. And in this context, think Enron. Think Sarbanes-Oxley. Think revenue recognition. Think back-dating of options. It is through meaning and interpretation that information has value.

That imputation of value should not come from seeking to make an abundant commodity scarce, but from new ways of imputing value to digital information

The more I think about it, the more I worry about the “evolutionary” response to freeriders and to vandals, as we increase the number of blocks and filters we place on the World Live Web, on the Writable Web. People like Doc and Esther and Chris and Cory and Ross have successfully hammered the idea of the web being live and writable into my head, and now it won’t go away.

Clay Shirky first brought this to my attention, and it seemed reasonable. Innovations adapt to survive and thrive; for things like Wikipedia, he suggests that increased governance is an evolutionary adaptation necessary for survival. I have tended to agree with him, but now I’m not so sure. I think the retrograde nature of the adaptation is a cause for concern, and that we ought to look at new governance models, not variations of the old.

On the inside front cover of Democracy and The Problem of Free Speech, Cass Sunstein is quoted as follows: “Our government now protects speech that causes harm yet forbids speech that is essential.” I have this growing fear that we will live with far too many unintended and unwished-for consequences if we fall into the Increased Governance trap for community information. It is the community that creates the information, maintains it, corrects it where needed, imputes value to it, consumes it, archives it. And we must keep it that way.

We already have new ways of imputing value to such information

Linus’s Law, Given Enough Eyeballs, All Bugs Are Shallow, has itself been exposed to a good many eyeballs. Blogs and wikis are social-software instances where the same Law holds true. The opensource movement has already proven the worth of the Law; let us not give up now and revert to prior, and potentially destructive, governance models.

And we have new ways, ways that make use of the power of emergence a la Steven Johnson, of the power of democratised innovation a la von Hippel and Benkler. Ways like Reputation. Ways like Community Ratings and Collaborative Filtering. Ways like improving the concepts and implementation of the Semantic Web. Ways like better Pattern Recognition and Contextual Search, as Esther suggested. If you want examples, take a look at what Tom Bell (of the Chapman University School of Law) is doing. I quote from the abstract:


Copyrights and patents promote only superficial progress in the sciences and useful arts. Copyright law primarily encourages entertaining works, whereas patent law mainly inspires marginal improvements in mature technologies. Neither form of intellectual property does much to encourage basic research and development. Essential progress suffers.

Prediction markets offer another way to promote the sciences and useful arts. In general, prediction markets support transactions in claims about unresolved questions of fact. A prediction market specifically designed to promote progress in the sciences and useful arts – call it a scientific prediction exchange or SPEx – would support transactions in a variety of prediction certificates, each one of which promises to pay its bearer in the event that an associated claim about science, technology, or public policy comes true. Like other, similar markets in information, a scientific prediction exchange would aggregate, measure, and share the opinions of people paid to find the truth.

Because it would reward accurate answers to factual questions, a SPEx would encourage essential discoveries about the sciences and useful arts. Researchers and developers in those fields could count on the exchange to turn their insights into profit. In contrast to copyrights or patents, therefore, a SPEx would target fundamental progress. Furthermore, and in contrast to copyrights and patents, the exchange would not impose deadweight social costs by legally restricting access to public goods. To the contrary, a scientific prediction exchange would generate a significant positive externality: Claim prices that quantify the current consensus about vital controversies.
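A rough sketch of the core mechanism as I read it (a toy logarithmic market scoring rule of my own, not Bell's actual SPEx design): each certificate pays 1 if its claim comes true, so its going price can be read as the market's consensus probability that the claim is true.

```python
from math import exp, log

class ClaimMarket:
    """Toy certificate market for a single claim, run by a logarithmic
    market scoring rule (LMSR). An illustration only, not the SPEx design
    described in Bell's abstract."""

    def __init__(self, liquidity=10.0):
        self.b = liquidity
        self.shares = {"true": 0.0, "false": 0.0}   # certificates sold so far

    def _cost(self, shares):
        return self.b * log(sum(exp(q / self.b) for q in shares.values()))

    def price(self, outcome):
        """Price of a certificate paying 1 if `outcome` occurs, readable as
        the market's current consensus probability."""
        total = sum(exp(q / self.b) for q in self.shares.values())
        return exp(self.shares[outcome] / self.b) / total

    def buy(self, outcome, quantity):
        """Buy certificates; returns the amount the trader is charged."""
        before = self._cost(self.shares)
        self.shares[outcome] += quantity
        return self._cost(self.shares) - before

market = ClaimMarket()
print(round(market.price("true"), 2))   # 0.5 before any trades
cost = market.buy("true", 12)           # a researcher backs the claim
print(round(cost, 2), round(market.price("true"), 2))   # ~7.7 paid, price ~0.77
```

The point of the toy is the one the abstract makes: accurate beliefs are rewarded through claim prices, not through restricting access to the underlying knowledge.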

I’m not saying we have the right answers already. What I am saying is that we need to look for the right answers; what I am also saying is that current retrograde governance suggestions are inappropriate.

The velocity of information is increasing, and traditional responses will not scale

And now for something completely different. Take a look at what Po Bronson said in Bombardiers:

Eggs Igino had been studying Economics for six years, and he’d never seen such a perfect display of the Third Law. He sat down at the small round table in the kitchen and tried to gather his thoughts. The First Law of Information Economics was simple. Knowledge is power. The Second Law was only a little more complicated: Knowledge is not a candy bar. If you eat a candy bar, the candy bar is gone. And if you give it to a friend, then he gets to eat it and you don’t. But with knowledge, you can’t use it up, and you can’t get rid of it by giving it away. This leads to the corollary to the Second Law: Word travels fast. Knowledge spreads much faster and more easily than any physical product, mostly because telling your friends doesn’t make you poorer. If knowledge spreads effortlessly to everyone, and if knowledge is power, then one logical conclusion was that everyone would have power. The other logical conclusion was that the power of knowledge was fleeting and temporary and we would all be powerless. Eggs Igino pulled a paper napkin off the breakfast cart and wrote on it with one of the corporate pens in light blue ink:

1. Knowledge is Power!

2. Knowledge is not a Candy Bar

2(b). Word Travels Fast

He stared at his theories. He underlined each of them twice as he rehearsed their logic. It was just so beautiful to see the salespeople so powerless and their world going to hell. For an intellectual like Igino, it was as beautiful as mitochondria in a petri dish or a mouse in a maze. Then he wrote below the other lines in large, energetic, slashing letters:

3. Power is Temporary!!!

We have learnt about the power of many. We have learnt about the corruptions that take place when reading/writing power is in the hands of a few. History is not just littered with examples; even the history we read has had its fair share of corruption.

Let’s not allow Gresham’s Law to become applicable to Information. Let’s keep traditional governance models where they deserve to be, filed somewhere even Google cannot find, and let’s concentrate on using the power of many, of peer review and rating and pressure and action.

I realise this may offend some of you. No offence intended, I am doing what I normally do, thinking aloud about things that matter to me. Always happy to be proven wrong. Comments welcome.

How do I love thee Wikipedia? Let me count the links.

16 thoughts on “Is there a Gresham’s Law for information?”

  1. It is writing like this that gives me hope that conversations stimulated by blogs may be the hope of the future of education! My immediate reaction, JP, is to now appeal to your wisdom on the question of economic bubbles, since this is the topic of my latest blog entry at:

    http://blog.360.yahoo.com/blog-Mff23hgidqmHGqbcv.lfskakEtS6qLVHUEMFUG4-?cq=1&p=86

    My perspective on the bubble phenomenon may be the reason why I am not as optimistic as you are on whether or not we can stave off a Gresham’s Law of information.

    I think you are absolutely right in driving your argument around the principle of new ways of imputing value to information. To a great extent this is the driving logic behind the way in which the judiciary branch of government was conceived in the United States Constitution, however much that logic may currently be under fire. My concern, however, is that we get so fixated on those new ways of imputing value that we recklessly abandon the old ways, rather than seeking a synthetic path that reconciles the diversity of interpretations. For example if we do not pursue such synthetic thinking when confronted with Tom Bell’s SPEx proposal, we may miss the historical context within which progress in copyright and patent laws is less superficial than Bell’s rhetoric would have us believe.

    This may lead to a fourth law to tag on to Igino’s third. Yes, power is definitely temporary. This is probably the most important lesson of history, no matter which historian is telling the story. Nevertheless, every version of history needs to be read critically; and this is the domain of the law I wish to propose:

    Critical Thinking varies inversely with the Velocity of Information!

    (In a way this is my variation of Simon’s proposition that a wealth of information creates a poverty of attention.) Unfortunately, critical thinking is an extremely fragile skill. In my “bubble blog entry” I invoked Mark Taylor’s “theory” of CONFIDENCE GAMES. However, much of what Taylor has to say can probably be traced back to a much earlier source: Gustave Le Bon’s THE CROWD: A STUDY OF THE POPULAR MIND. The Dover edition chose the right piece of text to excerpt on their back cover: “The masses have never thirsted after truth. … Whoever can supply them with illusions is easily their master; whoever attempts to destroy their illusions is always their victim.” One might say that Le Bon viewed crowds the same way that thermodynamics views large collections of molecules, in which case Gresham’s Law of Information may be as inevitable as entropy.

  2. Hi JP,
    Nice to find you again after all those years. I’m assuming you’ll know who I am if you are the right JP - everything fits, doesn’t it, in a serendipitous way?
    Get in touch if you feel like it.
    DD

  3. Stephen, have you read Carlota Perez? Technological Revolutions and Financial Capital. Let me know what you think about it.

    Hey Devangshu, great to hear from you. What are you doing, and where? And how do I get in touch? Welcome to the conversation.

  4. I’m not sure I follow how Gresham’s Law _could_ apply to information.

    When applied to money, the key ingredient to make Gresham’s Law work is the legal tender law that forces merchants to accept both good money and bad money as if they had equal value. But what “legal tender” laws exist to force us to accept bad information?

  5. JP, I just read the introduction to Perez’ book. Sadly, the San Francisco Public Library has not included it in their collection; so, for the moment, I shall have to make do with what she has made available on the Web. A couple of years ago Brian Arthur gave a PARC Forum about post-bubble economies, drawing upon the historical evidence of the evolution of the railway and canal systems in England and the United States. Arthur would probably call himself a Schumpeterian, which is consistent with Perez’ prediction that he would give little “attention to the real economy of the production of goods and services.” I was able to use her Detailed Table of Contents to get a good sense of the narrative she discusses in Part II. What interests me, however, is the impact of Igino’s Law 2b (Word travels fast). I see Web 2.0 as a bubble that has begun to inflate before the narrative of what we might call “the new economy bubble” has run its course. I believe this to be the case because there is actually a Law 2b’: Word travels faster every day! The very pace of the flow of information may outstrip the pace of Perez’ narrative (or, as a society, we now “speed-write” the narratives of our lives, as well as “speed-read” them). In any event I really appreciated the pointer.

    Turning now to Neil Bartlett, while there may not be “legal tender” laws, there are a variety of situations in which we are obliged to accept bad information. In this respect I am not thinking of Wikipedia, because that is a situation where we can always take the initiative of seeking out other sources. On the other hand the world of software development, whether or not open-source is involved, is often forced by project management constraints to take-what-you-can-get. The open source movement was supposed to liberate software development from the “black box” thinking based in APIs; but, in the real world of code generation, there is often an enormous gap between having the source code and understanding it well enough to adapt it to your needs. Since I used to develop software in LISP, just about all the code was transparent to me; and sometimes a candidate for appropriation was so complex that it would be more productive for me to write my own version from scratch. Other developers working on tighter schedules do not always have that luxury. They take what they can and hope for the best, and we are often the ones tripped up by their misfortunes!

    Now open-source evangelists will tell you that all programmers share their code with a sense of good will and that defective code will be improved through group participation. I can see that being the case when the community is a relatively tightly-knit social network. However, in those situations there is often some centralized authority (or authority-clique) that tries to moderate the overall quality of content. A more general setting might be vulnerable to malicious acts, such as submitting source code that is complex enough to conceal the fact that it is carrying a virus. I would conjecture that, for example, a student programming club could be highly vulnerable in this respect. All this is just a way of reiterating the fear of the inevitability of entropy that I voiced in my last comment!

  6. I guess I have more faith in the multitude. “Given enough eyeballs” is an important condition. It’s like any other market: liquidity is king for anything commoditised. And opensource can and should be primarily about commodity.

  7. Hi JP,
    The email is my first name at gmail.com. One of the nicer things about having an unusual name.
    DD

  8. Now, let’s not get too excited by the terms “good money” and “bad money” in Gresham’s Law. If you trust Wikipedia’s definition, it’s clear that “bad” means money whose nominal value is greater than its intrinsic value. A dollar bill is worth $1, although the paper it’s printed on is probably worth less than 1 cent. That’s “bad money” by Gresham’s definition, but it’s still worth $1 as long as the American economy holds up. In fact, all of our money has been entirely nominal for years, and it’s not even backed by real money anymore since we went off the gold standard (in 1971, if I remember rightly). It’s all a massive trust game: Money is worth what we believe it is worth.

    Now, what would it mean to apply this to information? Not that “bad information” drives out “good information” in the simplistic sense of the terms “bad” and “good” — i.e. Wikipedia will drive out Britannica (although that is almost certainly going to happen). Instead, I’d read this to mean that “bad information” is information that has nominal value rather than “intrinsic” value. In other words, information that is valuable to the extent that a community of people agree it is valuable — without reference to some external criterion, like, say, physical reality.

    In this sense, I’d argue that Gresham’s Law is already valid for the blogosphere, in spades. (Or more accurately, the Googlesphere.) Information is valued exactly to the extent that it’s linked to. Any reference to external validation is irrelevant — what matters is how many people blog you. That’s Gresham’s Law in action right there.
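    To make that concrete, here is a toy PageRank-style calculation over an invented link graph: a page's "value" is computed entirely from who links to it, with no reference to any external criterion.

    ```python
    # Toy PageRank over an invented link graph (outlinks per page): value
    # flows to whatever gets linked to, regardless of external validation.
    outlinks = {
        "careful-essay": ["meme", "blog-b"],   # links out, but nobody links back
        "meme": ["blog-a"],
        "blog-a": ["meme"],
        "blog-b": ["meme", "blog-a"],
    }
    pages = list(outlinks)
    rank = {p: 1.0 / len(pages) for p in pages}
    damping = 0.85

    for _ in range(50):   # power iteration
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in outlinks.items():
            for t in targets:
                new[t] += damping * rank[page] / len(targets)
        rank = new

    for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
        print(f"{page:15s} {score:.3f}")
    # "meme" tops the ranking; "careful-essay" sits at the bottom, because
    # nothing in this graph links to it.
    ```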

  9. Tend to agree with you Dylan, which is why I am concerned.

    My post dealt with keeping the nominal versus intrinsic distinctions simple, and concentrated on where the warranty for the nominal value came from. If that warranty can be obtained by gaming the system, then “bad” information will drive out “good”.

    My thesis is that we can prevent this, by looking hard at the provider of the warranty and ensuring that the warranting process is kept honest. We haven’t got it right, the system can be gamed, but I’d like to think that we will learn more and that we can minimise the impact of Gresham’s Law as a result.

    There is, of course, a separate discussion to be had on what constitutes good information and bad information in an “intrinsic value” sense. Is centrally controlled propaganda good? Is a reasonably well-written blog bad? There is some capacity to make “factual” distinctions between good and bad information, although I am uncomfortable with it; I have seen too many spin doctors at work, and the Lakoff and Chomsky arguments come to mind. But when it comes to ideas, it is positively dangerous to start defining ideas as good and bad, whatever the yardstick used and however well-meaning the attempt is. That path leads to thought police very quickly.

  10. Actually, I think it is just as dangerous to go out on a limb over the concept of “intrinsic value” as it is to fall back on the oversimplification of “good money!” This is why I prefer to refer to paper currency as instances of “fiat money:” the value is assigned by the fiat of an authority that is accepted by those who exchange the currency (the Federal Reserve, in the case of the United States). There is nothing “intrinsic” about the value of specie currency. As a matter of fact, in THE MONEY GAME the other Adam Smith talked about exchanging a dollar bill (back in the days when these carried the label “Silver Certificate”) for silver. He got silver all right; but he got it in a form whose “intrinsic” value was probably less than that of the dollar bill!

    Needless to say, it makes even less sense to talk about the “intrinsic value” of information. As JP has demonstrated in another blog entry, any value information has resides in how it is interpreted. This depends on both WHO is doing the interpreting and WHEN the interpretation takes place.

    Now, what may differentiate information from other commodities is that we may wish to talk about instances of information having NEGATIVE value. An example would be a piece of source code at an open-source site that has a virus concealed in it. Another example might be the intentionally planted bogus data about John Seigenthaler’s connection to the assassination of Bobby Kennedy. Back in the day Einstein warned us that just about any scientific discovery could be used for good or ill purposes. Perhaps we need to refine this warning with a better theory of negative information value.

  11. Which is why we’re having this conversation. There’s a lot to work out and get right, the answers aren’t simple, and much can be gained or lost in the words we use and the meanings we seek to portray or absorb.

  12. Your supposition of applying Gresham’s Law to information reminds me of Dr Stockmann’s speech in Ibsen’s play, “An Enemy of the People”, where he declares (as translated for Penguin Books): “What sort of truths do the majority always rally around? Why, truths so stricken with age that they are practically decrepit! But when a truth’s as old as that, gentlemen, it’s well on the way to becoming a lie.”

    Of course, Ibsen wrote this play over one hundred years ago, in 1882, so by its own logic this truth has aged into a falsehood. ;-)

    Geoff

  13. Every time someone reads a post I wrote over two years ago, I realise how remiss I am in not tagging and archiving the posts in a more convenient way. Any tips for how I could do this easily for over a thousand posts?

