Getting Better

  • I’ve got to admit it’s getting better
  • A little better all the time
  • I have to admit it’s getting better
  • It’s getting better

John Lennon/Paul McCartney, Getting Better (Sgt Pepper’s Lonely Hearts Club Band)

I listen to a lot of music. Particularly in the evenings and late nights, particularly at weekends, and particularly when I’m travelling. But I used to listen to a lot more music, in my teens and early adulthood. Between the ages of 15 and 22 I must have listened to music for at least eight hours a day, sometimes twice that.

As you can imagine, my musical taste was heavily influenced as a result; so even now, most of the time, I listen to music created between 1965 and 1975, give or take a few years on either edge. It’s not that I don’t listen to any other music: I do. But I tend to think that there were so many wonderful albums made during those years, so many talented musicians, that I don’t need to venture out from there. Call it my comfort zone if you must. I just happen to think the music was great.

Now many of the people I listen to are dead, sometimes as a result of personal excess, sometimes as a result of accident and tragedy. So it comes as an incredible privilege to me when I get to see any of my boyhood heroes play live, when I get to see the musicians and bands of my youth in the flesh. Over the years, I’ve been able to see the Grateful Dead, Crosby, Stills and Nash, Neil Young, Bob Dylan, Steve Winwood, Eric Clapton, John Mayall, Queen, the Rolling Stones, Led Zeppelin, John Martyn, Van Morrison, Cat Stevens, Donovan, Don McLean, the Moody Blues, just to name a few. And I am so grateful.

In the early 1980s, getting tickets was very hard; you tended to have to queue up at the venue box office. Sometimes, if you were very lucky, you got them on the phone. And if you couldn’t get them in person or on the phone, you didn’t go. Scalper prices were too high. By the early 1990s, phone-based sales had become more common, at least for the bands I wanted to watch. And it took till the late 1990s and early 2000s before the web became a potential route. Potential. I use that word advisedly. The early days of web sales were diabolical, even more roulette-ish than the telephone. Sites crashed more often than British Rail cancelled trains. You couldn’t get through. And when you did get through, the tickets had all gone.

If you really wanted to see someone, and you just couldn’t get through, you still had the touts. But their prices weren’t cheap, so it was not something you could do anytime you liked.

Roll forward to today. I’d been travelling for some time, came home, went through my personal mail, and found an email from the Royal Albert Hall. Offering me the opportunity to buy Eric Clapton tickets for next May before they went on general sale tomorrow. How convenient. Why was this? I don’t really know. I assume it was because I’d bothered to register some years ago, that I’d listed my preferences, and that, over the years, I’d bought a considerable number of tickets. Any of the above. All of the above.

Who was I to complain? So I clicked on the link, hoping against hope that the early release tickets hadn’t sold out. And then I was taken somewhere I’d never been taken before:

A waiting room. How nice. With a little counter that counted down to when it would be my turn. When I clicked on the link, I was something like 1250th in the queue; in about 20 minutes I was through. But I didn’t have to sit there doing nothing while I waited. And I got the tickets I wanted. Restricted view, but I know those seats and they’re good enough for me.

We’ve come a long way in the last 30 years, even if it doesn’t always feel that way. Priority booking for registrants. Alerts and offers based on profile and preference. A humane, almost-friendly queueing system, with excellent feedback loops. Keeping the customer informed.

All that, in the month before I get to see Santana for the first time, Winwood for the nth time and Jeff Beck on his own for the second time.

I’ve got to admit it’s getting better…

Bear necessities

There’s been a lot of commotion on the web about a particular video going viral a few days ago. When I heard about it, my instinct was to do nothing; after all, there was an NSFW warning emblazoned right across it. So I forgot all about it.

Then an old friend of mine, Philippa Davis, pinged me about it via Facebook, all the way from South Africa. And she wasn’t the type to be sending me smut. So I took a look this evening.

And I loved it. Just the kind of thing that lets me see the art of the possible on the web. Over 4.3m views already.

I don’t want to say any more about it. Turn down the volume on your machine, make sure you’re by yourself, and then click here. And let me know what you think.

Musing about a new kind of literacy

My thanks to Tochis for the wonderful photograph above.

A full twenty-six years after the eponymous year of Orwell’s dystopian novel, we are only just getting used to the idea of Big Brother watching us. For many of us, this sense of being watched seems to have been built around physical constructs, around the usage of devices such as cameras.

For the older ones amongst us, Big Brother may be less about devices and more about people: for people like me, the concept of a surveillance society may bring forth images more akin to the Cold War and to state control: twitching curtains, informers and spies. Even Spy vs Spy spies. Especially Antonio Prohias’s Spy vs Spy spies.

My thanks to arkworld for the lovely tribute to Mad Magazine’s Spy vs Spy series above.

I get the impression that the post-Vietnam Space Invaders generation thinks of a surveillance society differently; they view things much more in a Star Wars kind of way, particularly in the sense of the Strategic Defense Initiative, so the attention shifts to a postmodern Military-Industrial complex. We all have our crosses to bear.

My thanks to Karf Oohlu for his fantastic creations above.

As we all know, those days are history. There’s a new game in town, where the surveillance is all digital. Where everything we do is monitored and recorded and analysed and used, ostensibly to help us. Ostensibly. A world of digital fingerprints.

My thanks to Caroline Bosher for putting the concept above together so elegantly.

We’ve gotten used to the idea of people “following” us in a digital world, subscribing to stuff we publish. Here we know that others are watching us. It is completely within our individual gift.

We’ve gotten used to the idea that when we visit somewhere, our web browsers may accept tiny little poison-pill cookies. While these beasties are capable of being used as spyware, we appear to be able to stop our browsers from accepting them and to clear them from our caches; we seem to be in control.

Some of us have even gotten used to the idea that we can keep rough track of the number of people who have accessed a particular site, what browsers they used, how they got there, where they went next: a whole pile of stats. Just take a look at an example of what StatCounter tells me about this blog:

It’s not just about where we go on the web; the metadata that attaches to our actions is pretty rich already. Take a look at what the Exif data holds for a normal Flickr photograph that I uploaded. If you’ve used Flickr, you’ve probably done the same.
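For the curious, here is a minimal sketch of how easily that metadata can be read out of a file. It assumes Python with the Pillow imaging library, and the filename is purely hypothetical; it shows the idea, nothing more.

    # Minimal sketch: dump whatever Exif metadata a photograph carries.
    # Assumes the Pillow library; "holiday_photo.jpg" is a hypothetical filename.
    from PIL import Image
    from PIL.ExifTags import TAGS

    def dump_exif(path):
        """Print the Exif tags the camera or phone left in the file:
        make and model, timestamps, orientation, software and more."""
        exif = Image.open(path).getexif()
        for tag_id, value in exif.items():
            print(f"{TAGS.get(tag_id, tag_id)}: {value}")

    dump_exif("holiday_photo.jpg")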

These are all things we’re getting used to.

But there’s stuff we’re not yet used to.

And it’s all to do with the concept of privacy. Whose privacy is it anyway?

If I upload something on to the web, and I want to know who sees it, do I have the right? Or do you have the right not to tell me you saw it?

Let’s say that what I “uploaded” is a blog post. Then it’s easy; you’re probably in your comfort zone. What happens if what you looked at is my music playlist? You’re still pretty cool about letting me know you saw it. So let’s make it a little harder. What happens if what you were looking at is my CV? Now sometimes you don’t want me to know that.

Whose privacy is it anyway?

Incidentally, some time ago, I had to wait up for one of my children to get home. So I was idly looking at “watcher” statistics on Wikipedia, randomly trying to see what gives there. Who or what is watched the most? How do different groupings of people or things fare? So here are some of the highlights. First I name the article, then the number of watchers.

  • Obama 2024. Bush Jr 1922. Bill Clinton 833. Hillary Clinton 778. Saddam 766. Churchill 760. Bin Laden 748. Palin 697. Blair 691. Cameron 248.
  • Gates 901. Jobs 696. Berners-Lee 237. Zuckerberg 141. Page 94. Ellison 85.
  • Gandhi 931. Lincoln 916. Martin Luther King Jr 858. JFK 747. Queen Elizabeth II 740. Mandela 603.
  • Jesus 1483. Mohammed 1240. Scientology 977. God 920. Darwin 854. Hawking 715. Dawkins 599.
  • Lost 1155. Simpsons 1149. Heroes 791.
  • Manchester United 712. Liverpool FC 563. Chelsea 531.
  • Google 1336. Microsoft 889. Facebook 766. Apple 673.
  • Lady Gaga 496. Ashton Kutcher 129. Ev Williams <30.
  • Abortion 697. AIDS 687. Climate change 258.
  • Michael Jackson 1463. Madonna 734. Dylan 730. Lennon 724. Presley 676.
  • India 2270. US 1658. China 923.

But the overall winner from about a hundred I tried?

Katrina at 2872. Even the September 11 attacks could only muster up 1337.
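If you want to poke at these numbers yourself, here is a minimal sketch of the sort of query involved. It assumes Python with the requests library and leans on the public MediaWiki API’s prop=info with inprop=watchers; the article titles are just examples, and Wikipedia withholds the count for pages with only a handful of watchers, which is presumably why Ev Williams shows up above as “<30”.

    # Minimal sketch: pull watcher counts from the public MediaWiki API.
    # Assumes the 'requests' library; the article titles are illustrative.
    import requests

    API_URL = "https://en.wikipedia.org/w/api.php"

    def watcher_counts(titles):
        """Return a {title: watchers} mapping via prop=info & inprop=watchers.
        Wikipedia omits the figure for sparsely watched pages, so some
        entries may come back as None."""
        params = {
            "action": "query",
            "prop": "info",
            "inprop": "watchers",
            "titles": "|".join(titles),
            "format": "json",
        }
        data = requests.get(API_URL, params=params, timeout=10).json()
        return {page["title"]: page.get("watchers")
                for page in data["query"]["pages"].values()}

    print(watcher_counts(["Hurricane Katrina", "September 11 attacks"]))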

Many of the things we do are recorded, and we know about it. Many of the things we do are recorded, and we give permission for that recording to take place. Some of the things we do are recorded with our permission, yet we don’t understand enough about what we have permitted.

So we need to know more about all of this. Which is something that the VRM people are working hard on.

A new kind of literacy is needed. Many are working on this, but we all need to think harder about it.

Incidentally, the millennials may be more clued up on privacy than we give them credit for. Their views are different, their values are different; they may start off naive and trusting, but they cotton on fast. So when it comes to privacy settings on Facebook, my gut feel is that people under 28 are more likely to have sorted out their settings to their satisfaction than people over 28.

Any views?

Musing about inclusion in technology

My thanks to Phillie Casablanca for the delightfully evocative notice above.

I was born a foreigner.

While my hereditary roots were from southern India, I was born and brought up in Calcutta, as was my father before me. And for the first 23 years of my life, I knew no other city. Never lived anywhere else. But my surname gave away my southern roots: I wasn’t a true Bengali.

I am one of five siblings. When we were young, we used to spend a good deal of time every summer in Tambaram, on the campus of Madras Christian College. My grandfather was Professor of Chemistry there. Though I had bloodlines traced back to those parts, my accent gave away my north-eastern roots: I wasn’t a true Tamil.

I was born a foreigner.

A somewhat privileged foreigner, born into a Brahmin family (and an ostensibly well-to-do one at that). A family that took multiple copies of the Statesman so that we could each do the Times crossword on an unsullied diagram. Using a pen, of course. A family that played billiards and duplicate bridge and Scrabble and chess. A family that devoured the written word.

So I didn’t really know much about being discriminated against. But, as Einstein reminded us, common sense is the collection of prejudices one collects by age eighteen. And I’m sure I had my fair share of prejudices. With three sisters, a bevy of aunts and a truly matriarchal grandmother, it was somewhat difficult for me to inculcate gender bias into my prejudice collection.

Which was probably a good thing, since the first boss I had was a woman, and since the person who gave me the job, her boss, was also a woman; I wrote about them as part of my Ada Lovelace Day pledge some time ago here.  [Incidentally, some of you may be aware of this recent incident in my life, which somehow made it into the Times City Diary, and thence into syndicated journals far and wide. Including one in Hong Kong. Which led to my getting back in touch with the woman who started me off in my professional career.]

That first job was a great job, and I learnt a lot. A genuine meritocracy; the nearest I came to any form of discrimination was when it came to publicity shots for the firm; a small number of us, foreign in origin, skin or gender, used to get wheeled in for all such occasions. It was done in such a spirit that we didn’t really consider it tokenism.

I soon learnt a little bit about discrimination the hard way, when my skull and forehead made repeated contact with some fairly large Doc Martens belonging to a group of young gentlemen with very short haircuts, and the resultant coma kept me quiet for a short while. But that was a rare and aberrant event, and all of twenty-seven years ago.

When I look back on the last thirty years, I tend to think of the industry I work for as fairly inclusive; perhaps it had more to do with the firms I worked for. BT, where I’ve been for the last four years (how time flies), for example, has an exemplary record on diversity and inclusiveness; people like Sally Davis, CEO of BT Wholesale, and Caroline Waters, director of people and policy, lead by example. Caroline was recently awarded an OBE for services to diversity and equal opportunity.

In many ways, the industry is itself designed to be inclusive: it’s about brains, not brawn. It is possible to work in an office as well as remotely. Shiftwork is possible, and there are opportunities to work in or with many timezones. The industry is barely old enough to have become ageist, so we’ve largely been able to avoid that. The work we do helps people use computers and communicate regardless of physical or linguistic constraints; in many cases computers can be used to overcome those constraints.

Which brings me to the reason for this post: the recent debates about Women in Tech.  Shefaly Yogendra has done an excellent job in bringing together the different strands of argument and discussion, while providing us with the origins and context of the debate here.

Anyway, a number of people, including @shefaly, @thinkmaya and @freecloud, wanted to know where I stand on this issue.

So here’s my two-penn’orth:

We can all argue about the why, but there’s no disputing the what. Women are underrepresented in a number of dimensions in the tech world, and this is noticeable in conference line-ups and in start-up founder lists. This is particularly odd because there are a lot of talented women in this space: I am privileged to count many of them amongst my friends. There are many possible reasons for this phenomenon, and many possible ways of fixing it.

I think we need to make sure that one possible reason is dealt with, because it’s the kind of reason we could easily overlook. An anchoring-and-framing kind of reason. Let me give you an example.

Take The Indus Entrepreneurs, TiE for short. Many of you must have heard of them. While TiE is an inclusive network that advises, supports and mentors would-be entrepreneurs, its origins were different. I believe TiE was created to ensure that people of South Asian extraction were given the funding opportunities they were otherwise being denied. There was general acceptance of the engineering excellence of such people, but for some reason question marks were raised about their ability to run companies. Which meant that the “engineers” never got funded when they went forward with business plans.

I think we need to make sure that something similar is not happening here, in terms of unintended consequences as a result of anchors and frames. We need to make sure that we eradicate prejudices that go along the lines of: Women don’t code. Founders must code. So women can’t found startups…

Generalisations, like comparisons, are always odious. Many parts of the industry are open and inclusive and meritocratic. Nevertheless, the numbers don’t add up, the evidence suggests we have a bias somewhere, and we have to do something, do whatever we can, to correct it. So I’m all for what people like TED and DLD are doing.

Systemic problems often need systemic solutions; awareness-raising initiatives can often provide the quantum of energy required to remove historical biases, particularly subtle ones.

Does the Web make experts dumb? Part 3: The issues

Thanks for all the comments and conversation on the previous two posts. At this stage, I think it would be worthwhile setting out a simple list of principles and seeing if I can get your feedback on them. I feel that it will help move the argument forward constructively.

The principles I’d like to put forward are:

1. No one can become an expert without access to information. The web helps provide and broaden this access.

2. Access and opportunity alone are not enough. Will and perseverance are also required. The web does nothing to undermine these, and may actually reinforce perseverance by making it easier to become an expert.

3. Having access to a mentor or moderator is valuable, particularly one who has the experience and critical skills related to the expertise sought. Teachers were mentors and moderators for centuries, before chalk-and-talk broadcast was adopted as an Assembly Line norm. Good teachers continue to mentor and moderate. The web facilitates this, allowing asynchronous communications with relevant links and bibliography, as well as synchronous communications when meeting face to face is not possible.

4. Having access to a mentor or moderator who can inspire as well is invaluable. This is how expertise will really flourish. The web facilitates this as well. You only need to see one TED talk to understand how people can be helped, motivated, inspired by someone they don’t know and haven’t ever met.

5. There are 72 million children of school-going age not at school today. Rather than argue about the nature and role of experts and expertise, we should be doing everything in our power to ensure that every one of them has access to basic education as a human right. Queen Rania and her cohort are doing great things in this respect; the World Economic Forum’s Global Education Initiative, where Queen Rania is also involved, is a good place to start if you want to know more.

6. Of course the web gives us the opportunity to be superficial about learning, about knowledge, about expertise. But then this was true of all previous paradigms as well. What has changed is that the web allows us to delve deeper if we want to. And it makes that easier.

7. Of course face-to-face learning, with a moderator present, is invaluable. Of course the sense of community that comes from being in a classroom with other students is invaluable. But if for some reason this is not possible, then let’s not pooh-pooh the value of putting a computer with web access into a hole in the wall, and allowing for minimal moderation. This is what Sugata Mitra has been demonstrating, and more power to his elbow. You can keep track of what he’s been doing here: http://www.sugatam.wikispaces.com

8. The web is still in its infancy; there’s a lot broken with it. There is a lot that can, and should, be done in the context of curation, of indexing, of search tools, of filters and visualisation tools, of the semantic underpinnings. Read Esther Dyson’s recent post on the future of internet search if you have time; it’s a brilliant piece. You can find it here. See what Tim Berners-Lee, Wendy Hall, Nigel Shadbolt, Rosemary Leith, Noshir Contractor, Jim Hendler et al are up to at the Web Science Trust.

9. The privileged position of the expert may be under stress. An environment where more people can become experts is a good thing, and should be encouraged. An environment where their heredity and background becomes irrelevant, where what matters is their willingness to apply themselves, is a good thing. So don’t let people convince you otherwise.

10. Education trumps everything. Access, opportunity, facilitation, motivation and inspiration are critical. In all this the web is an aid; it is not the answer by itself. But it helps.

In this series of posts, I have not tried to make out that individuals working in dark rooms on their own, with access to the web, will suddenly become experts. If that is the impression given, I have failed to communicate my message.

What I have been trying to say is this: people are saying the web dumbs us down. This is wrong. The web can dumb us down, but only if we choose to let it.

Comments welcome.