The democratisation of the dev

Science museums have always entranced me. I must have been around eight or nine when I was first taken to the Birla Industrial and Technological Museum in Calcutta, a wonderful institution. I went there regularly, sometimes out of sheer boredom. Not far from there was an ice rink. For a pittance I could walk around the museum in torrid heat, and then walk over to cool off in the ice rink. I never skated. I just luxuriated in the coolth for a while and then went home.


When I went there, I had no idea of the history of the place. For that I needed Wikipedia, from which I learnt that the site and historical mansion used to belong to the Tagores, and that it was bought from the Tagore family by GD Birla with the intention of making it his family home, and that he gave it to Jawaharlal Nehru to use for the advancement of science in India. I’m grateful to all of them.

It was a magical place. Magical, despite the muddy morass we often had to tramp through to get from exhibit to exhibit. Magical, despite the intense heat of every room, cubicle, bus. Yes, bus. Some of the exhibits were in buses, and they’d be sent around the country as-is. You had to make sure that the exhibits you wanted to see were “in station” at the time. Crowds and crowds of children all the time, laughing and yelling. But then that was true of every school, every neighbourhood, every maidan, so who cared?

We laughed and yelled some of the time, like when the Hall of Mirrors was in residence. But most of the time we were quiet. Spellbound.

I remember the first time I saw a room where everything worked via motion sensors. Doors opened as you approached them. Lights and fans came on as if by magic, some by sensing your presence, some requiring you to wave, some needing to be clapped into action.

I remember all that. Intensely. Even though it must have been 1967 or 1968.

In those days what mattered was access to viewing. When you have nothing else, then just being able to look is a luxury. Most of the time we looked at print and photo exhibits; occasionally we watched some film. Sometimes, if we were really lucky, we looked at physical objects, often built to smaller-than-life scale in order to conserve scarce capital. The motion-sensor room I described above was one of those “real” things, tiny, probably smaller than most urban bathrooms. But it felt huge. Because it lit your imagination.

I left India in 1980. At the time, I knew precisely one person who had a computer, Anu Thakur. He had a Commodore PET, one of the earliest ones, bought in 1977, and he allowed me to touch it and feel it. He even taught me how to play Star Trek on it in 1978. He was an incredible guy: he even had a Martin guitar and knew how to play flamenco style. I don’t know if he is still alive; if any of you reading this knows him or his family, please pass on my regards.

The India I left in 1980 was one where you waited three years for a telephone line. A relatively thick line that came into the house and had a black Bakelite device with a rotary dial at the end of it. Remember them?

I couldn’t have studied Computer Science when I left India; the discipline had not yet made it into institutional offerings.

During the 1980s, the world of computing was in turmoil, with everything changing at a rate of knots. In the space of a decade, we learnt that “shifting tin” wasn’t the way forward, that hardware margins were history; that software and services, free at the time, were to be charged for; that character-based “dumb” terminals were to be replaced by the soon-to-be-ubiquitous graphical-user-interface PCs; that proprietary architectures were going to be supplanted by somewhat more open ones; that traditional “CODASYL” databases were soon to be replaced by relational ones. In the midst of all this, IBM and AT&T were considered too big to be allowed to continue the way they were, and faced significant antitrust moves against them. So IBM gave away the PC Operating System space (and found a grateful recipient in Microsoft) and AT&T freed up Unix.

That in turn helped the Indian software/services industry go ballistic; until then the barriers to entry to proprietary architectures were vast and unforgiving.

If all this wasn’t enough, during that same decade, the internet was quietly leaving defence and academia and proto-utopia to enter the mainstream, the beginnings of the web were being formed, and the first commercially usable transportable phones were showing up.

The 1980s were a decade of immense transition, setting the scene for what we see today. By the end of the decade the stirrings of democratisation were in place: the PC, the phone, Unix, and a move from hardware-only to hardware-and-software-and-services. More and more people had access to CPU cycles, to storage, and to bandwidth and connectivity, a trend that continues to this day.

There was something else happening. In the past, telecommunications and computing were separate disciplines; and even within computing there were clear demarcations between general-purpose computing and embedded systems. As recently as a decade ago, when people were left with the “problem” of preparing for “Year 2000”, these were separate departments with independent plans.

All that was also changing. First, in the early 1990s, the distinction between computing and telephony began to blur; and within a decade, the separation between general-purpose computing and embedded systems became unsustainable.

[Of course, Shirky’s Law continued to show its power, and institutions continued to fight for the reason they came into being. But it’s a losing battle. And now it’s over, except for those who will not see.]

Today the democratisation of the dev is nearly complete. Linux, Apache. Mozilla, Android. Arduino, Raspberry Pi. Java, Ruby, Sinatra. Hadoop. GitHub. Storm, Cassandra, Kafka. Tessel.io, Pinocc.io. RepRap. These are all just examples; I could have filled this page if I wanted to. The individual names do not matter. What matters is the principle: barriers to entry have come down sharply, and now anyone can develop software. For any market. Working on (almost) any device. With access to low-cost tin and wire and pipe. And distribution.
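To make the point concrete, here is a minimal sketch, an illustration only: a complete, working web service using nothing but a stock Python installation. The handler name and the port are arbitrary choices of mine. What once required a proprietary stack and a licence fee now fits in a dozen lines on any commodity machine.

    # A tiny HTTP service using only Python's standard library.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class HelloHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Respond to every GET with a plain-text greeting.
            body = b"Hello, world\n"
            self.send_response(200)
            self.send_header("Content-Type", "text/plain; charset=utf-8")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # Port 8000 is an arbitrary choice; any free port will do.
        HTTPServer(("", 8000), HelloHandler).serve_forever()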

Which is why I am so very excited by the announcement of the Salesforce $1m hackathon.

By now most of you know I work for Salesforce. It’s an amazing company, and one of the most amazing things about it is the platform at its core: what it is, what it represents; the sheer scale of the developer community around it; the number of apps built by customers; the size and vibrancy of the partner and ISV community; the smarts at the heart of its multitenant architecture; the daily transaction volume. Every way I look at it, the platform is something else. Incredible.

Now, with the hackathon coming up, I can see that many, many more people will learn more about it and empower themselves to change the world where it means something to them. For nonprofits. In health, education and welfare. Affecting industry and logistics and distribution. In retailing, sales and marketing. Across the board.

Because it is now possible. More posts to follow as the stories develop.


6 thoughts on “The democratisation of the dev”

  1. Dear JP,

    I see that the democratisation would be complete with the next step in this evolution. I don’t know half the names you have mentioned, but if I have to make use of this evolution all by myself, I need a way to make these work interdependently, and I should be able to configure them the way I want without getting lost in the technical things.

    What IBM could do with 100k employees will probably be done with 10k employees at Google; and if this evolution is to take shape, then one person should be able to accomplish what IBM once needed 100k employees for, and Google 10k.
    Venkat

  2. Largest Single Hackathon Prize Ever! — fantastic bounty from Salesforce.com
    (move over consumer internet companies — iPad first prize no longer cuts the mustard).

  3. The good news: you can add “commoditization” of enterprise software to the list. With a declarative approach, no coders are needed to handle the required business logic, and already one business analyst could replace 100 IBMers! That will dramatically cut the cost of building next-generation enterprise software with built-in easy change; it might even make SaaS profitable…?

  4. ‘Commoditization’ using a declarative approach has democratized business logic, agreed: a huge application-productivity effect for business developers, delivered by Parker Harris / Benioff et al (with profitability).

    The Node.js small-module movement and the open-source GitHub community, turning the browser into a canonical Lego-like ecosystem, portend another ‘Turning-Force.com’ platform effect coming for business developers. My eyes are on ‘reactive programming’.

  5. Just for clarity, our “declarative” is an architecture, not programming. Our approach is based upon two key facts: people create all information, and business logic never changes. Research shows people require support from fewer than 13 work/task types (including the UI), human and system; see the research published here: http://www.igi-global.com/chapter/object-model-development-engineering/78620

    So: no code change, no code generation or compiling, yet it addresses all business requirements; BPM discipline with rules, events, collaboration/workflow, audit, state, real time, roles/performers, management hierarchy, asynchronous work, and intelligent forms, all with intelligent orchestration of data from any source to any device. Twenty years of R&D, proven working with early adopters; we call it Enterprise Adaptive Software, 6GL a reality and the new alternative to COTS and custom programming. Welcome to the new world. Wide adoption will cost big vendors $bns, but that is only fair, as their lack of required advancement of enterprise software has cost customers $bns! So a BIG challenge… and UK based!

