Info Libre

I’ve been re-reading Promises to Keep by William W. Fisher III. An absolute must-read for people interested in digital rights and in alternative compensation models.

An excerpt from the inside front cover:

If the available technologies were exploited fully, the cost of audio and video recordings would drop sharply, the income of artists would rise, many more artists could reach global audiences, the variety of music and films popularly available would increase sharply, and listeners and viewers would be able to participate much more easily in the shaping of their cultural environments.

Utopia maybe. But a utopia I believe in and one I will do everything in my power to help create.

Which brings me to the point of this post. I was reminded of Canto Livre, the digital library of Brazilian music, while reading the Fisher book. I have no idea how it is doing, and would love pointers.

Is there anyone out there already creating an open-source pool of free-to-net material? Free for one of three possible reasons: it is out of copyright; the copyright owner contributes it to the pool for free; or the copyright owner agrees to place it in the pool under an alternative compensation scheme.

Tom Maddox and Dom Sayers have recently asked me to provide free-to-net versions of stuff I own. Where can I place this stuff for posterity?

If enough people placed stuff in a pool like this, and the pool collected donations, would we not subvert the need for a tax-based compensation system?

Someone please tell me where my stuff should go. Even Google is an answer. Or Creative Commons. I just want to know where to put the stuff, a virtual place where no one can later claim copyright….

Why would you want to turn [customers] away? Alan Rusbridger on walled gardens

Another example of how the web works. I have to be on vacation in India and reading Steven Johnson’s blog to find out about something Alan Rusbridger said recently…

I quote from the Rusbridger speech, via Steven:

The more of a wall that you put around, whether it’s a wall of payment or a wall of registration, the more you’re repelling people rather than building an audience for the day when we hope that advertising will come in like the cavalry and rescue us. So I think at the moment, the smarter thing to do is to make your content available everywhere and to have it aggregated and linked to like mad by everybody in the world, because that way you will reach a gigantic audience. And that matters journalistically. If you’re in the business of journalism for influence, and because of the Guardian worldview that you believe in, it’s terrific to have an audience of 14 million instead of 400,000. That’s wonderful. So why would you want to turn them away?

I have often felt that there is no such thing as a bad customer, just a customer who does not fit your business model. And in the past, businesses have spent time discarding customers as “not relevant to the business model”. And one firm’s rejects became some other firm’s Most Valuable Customers, as many airline and credit-card “bottom-feeder” businesses have shown.

This does not make sense. If you want a sustainable business, then adapt the business model to suit your customers, not the other way around. It is the relationship that matters. And you will work out a way for all parties in a relationship to gain as a result. Otherwise it is not a relationship.
You can find the Steven Johnson post here, with links to the original speech audio as well as some very useful comments. [And yes, Steven, I am looking forward to reading Ghost Map!]

Four Pillars: Thinking about search

Doc Searls pointed me to a thought-provoking post by Tristan Louis called Where Virtual and Physical meet. Amongst other things, Tristan makes the point that as genres like SecondLife evolve with their own community standards, the question of how these standards operate in a court of law is moot.

The question per se is not new. Larry Lessig (in Code and Other Laws of Cyberspace) had this wonderful story about two neighbours, one growing beautiful flowers that killed all that touched them, the other growing pedigree canines that exhibited powerful and sensitive emotions. Petal from flower person floats into pedigree chum’s garden. Dog dies. Painfully. Lawsuit. Prosecution says “make the flowers not kill”. Defence says “but this mix of beauty and death is what sells, tell neighbour not to create dogs that feel so much pain”. Judge rules that flowers can kill only when paid for. Cyberlaw. All my paraphrase, I don’t have the book to hand and I read it maybe seven years ago. Apologies for any misinterpretation or misquoting.
What is new is that genres like SecondLife exist. What’s this got to do with search, you ask? Patience, patience.

Separate thread. Joel On Software gets me tangentially to a place where a community formed off Joel is discussing Microsoft and AJAX. Arguments about whether anything on AJAX can be found on MSDN or not. Turns out that MSDN is “localised” and search results relate to a specific location.

Separate thread. In the March issue of Release 1.0 (which I’m still reading) Esther Dyson makes the point that “now that it has found a business model, search is likely to evolve rapidly in two directions that empower the user to filter rather than find….”

Separate thread. A recent conversation in the blogosphere where I work looks at using microformats such as hCard to try to resolve person/role conflicts between different applications, viewed from a workflow perspective.
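
To make the microformat angle concrete, here is a toy sketch, not taken from the conversation in question, of how hCard-style fields might be used to reconcile a person/role record held by two different applications. Only the field names (fn, org, role, email) come from the hCard vocabulary; the records, the matching key and the conflict rule are all my own illustration.

```python
# Toy illustration: reconciling person/role data from two applications
# using hCard-style fields. The records and matching rules are made up;
# only the field names (fn, org, role, email) come from the hCard vocabulary.

def merge_hcards(a, b):
    """Merge two hCard-like dicts that describe the same person.

    We treat 'email' as the stable identity key and report any field
    where the two applications disagree, rather than silently picking one.
    """
    if a.get("email") != b.get("email"):
        raise ValueError("records do not describe the same person")

    merged, conflicts = {}, {}
    for field in set(a) | set(b):
        va, vb = a.get(field), b.get(field)
        if va and vb and va != vb:
            conflicts[field] = (va, vb)   # e.g. the person/role conflict
            merged[field] = va            # arbitrary default: keep source A
        else:
            merged[field] = va or vb
    return merged, conflicts


hr_record = {"fn": "Jane Doe", "org": "Example Ltd", "role": "Analyst",
             "email": "jane@example.com"}
workflow_record = {"fn": "Jane Doe", "org": "Example Ltd", "role": "Approver",
                   "email": "jane@example.com"}

merged, conflicts = merge_hcards(hr_record, workflow_record)
print(conflicts)   # {'role': ('Analyst', 'Approver')}
```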

The Tristan Louis post that took me on my random walk started that process by talking about the community rules in SecondLife and how they would stand up in court. And that made me think how easy it was for virtual rules to become bricks in the walled gardens.

Somewhere in my head all these threads are the same. Once you move away from request-response 100%-accurate-lies models of database queries, you move into a new world.

Non-deterministic “probabilistic” search with relevance and ranking and heuristics-based feedback loops is this new world. Metadata and tags and microformats are just routes to improving how we improve. Improve search and find and archive and retrieve. And even improve mix and mash and cook and come up with something completely different.
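
A minimal sketch of what I mean, with everything in it invented for illustration: crude relevance scoring over a tiny corpus, ranked results, and a feedback loop in which documents searchers actually choose get nudged up the ranking.

```python
# Toy sketch of probabilistic-style search: term-overlap relevance scores,
# ranked results, and a feedback loop that boosts documents people choose.
# The corpus, scoring formula and boost factor are all illustrative.
from collections import Counter

docs = {
    "d1": "open source pool of free music and film",
    "d2": "walled gardens and registration walls repel audiences",
    "d3": "microformats like hcard improve how we find and filter",
}
feedback = Counter()  # clicks recorded against each document


def score(query, doc_id):
    q = set(query.lower().split())
    d = set(docs[doc_id].lower().split())
    relevance = len(q & d) / len(q)          # crude term-overlap relevance
    boost = 1 + 0.2 * feedback[doc_id]       # heuristic feedback boost
    return relevance * boost


def search(query):
    ranked = sorted(docs, key=lambda d: score(query, d), reverse=True)
    return [(d, round(score(query, d), 2)) for d in ranked]


print(search("find and filter music"))
feedback["d1"] += 3                          # searchers keep choosing d1
print(search("find and filter music"))       # ranking shifts with feedback
```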
And search means different things in this world.

  • To some it means resolve conflict, deduplicate, cleanse information.
  • To some it means improve information by empowering the creator/consumer.
  • To some it still means provide an answer I can use.
  • To all it means improve when successful.

It is when the locus of the search is constrained, consciously or unconsciously, by someone other than the searcher, and this happens without the searcher’s knowledge or consent, that we hit a major problem.

So whether it is Google or Yahoo in China or MSDN re AJAX or community standards in SecondLife, we have to look at the garden wall and who’s doing the building.

When there are no walls, search can be refine or filter or cleanse or repair or fix or even good old find. Where there are walls of omission or commission, search is a corrupt and poor reflection of what it could be. 

Learning about blogging

I received a comment from 2036AD about my last post, and it was good constructive feedback. I guess I fell into my own trap, and rather than be Thinker or Linker, I wandered into Stinker territory.

The rules for creating value when linking appear simple:

  1. Tell people enough about what you are linking to for them to decide whether or not to find out more.
  2. Tell people why you are linking and pointing it out; provide some opinion of your own. Is it good? Is it bad? Does it matter? Why?
  3. Make sure you explain any jargon or specialist terms within your post.
  4. The reader must have enough information (as a result of your post) to understand what is being linked to and to decide whether or not to acquire more information by following the link.

These four criteria make the difference between Linker and Stinker. With my EULAlyzer post I went into Stinker territory. Thank you 2036AD for pulling me up.

And so as not to make it worse, here’s what I should have said. Someone has come up with a tool to analyse the terms and conditions in an End User Licence Agreement. This is interesting and useful. But it currently only works on Windows. Which makes it less interesting and useful.

I went on to add that the end-user agreements in the ISP, cableco and telco space are also proving challenging, and that having a tool to analyse them would be useful.
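
For what it’s worth, here is a hedged sketch of the kind of thing such a tool might do. This is not how EULAlyzer actually works, and every phrase on the watch-list, along with the sample text, is my own illustrative guess.

```python
# Illustrative sketch only: flag sentences in an end-user agreement that
# contain phrases a reader might want to look at twice. The watch-list and
# the sample text are invented; this is not EULAlyzer's actual method.
import re

WATCH_PHRASES = [
    "third party", "third parties", "we may modify", "without notice",
    "collect information", "perpetual licence", "arbitration",
]


def flag_clauses(agreement_text):
    sentences = re.split(r"(?<=[.;])\s+", agreement_text)
    flagged = []
    for sentence in sentences:
        hits = [p for p in WATCH_PHRASES if p in sentence.lower()]
        if hits:
            flagged.append((hits, sentence.strip()))
    return flagged


sample = ("We may modify these terms at any time without notice. "
          "Usage data may be shared with third parties.")
for phrases, clause in flag_clauses(sample):
    print(phrases, "->", clause)
```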

EULAlyzer

LifeHacker pointed me towards this while I was travelling. Interesting concept, once it works on multiple platforms.

Now if someone somewhere could come up with the equivalent for analysing the end-user agreements offered by every ISP, cableco or telco, in the run-up to WalledGardenia, we could all benefit.