I read a beautiful article, "Why software is eating the world", by Marc Andreessen. It made me
reflect on how the world of computers and networks has evolved over
the last 20 years. It is comfortable for us to think that the world
has only evolved in incremental ways. But as I look around me, it
seems that the hard-driving pace of change on many fronts has added
up to fundamental change, taking us far from the comfort zone of
people of my vintage.
All the way till the 1980s, business computing was dominated by
databases. The basic story was one of capturing data, storing it, and
summoning it forth with queries.
At first, databases were the exclusive preserve of mainframes and
minicomputers. The PC revolution made it possible for small databases
to be held on the desktop. It's interesting to note that at first,
we got PCs without networks. We evolved from databases stored on
remote mainframes or minicomputers to databases stored on PCs. All
the way into the late 1980s, it was quite a cool thing to have a
standalone PC holding a database where certain queries could be
executed.
The first wave of change, of the early 1990s, was networks in the
form of TCP/IP (the universal communication protocol) and the
Internet (the universal network). Now, suddenly, the data centre
became more interesting. Instead of storing and manipulating data at
the desktop, we could do so many better things by storing and
manipulating data at a big central computer. The desktop diminished
from being the location of data and computation to being the
location of the user interaction.
Then came a series of surprises which have added up to a
qualitative shift.
1. The network got ubiquitous
First, the Internet went everywhere for the road warrior armed with
the laptop computer. Crashing prices of laptop computers and then
netbooks meant that essentially everyone had one. So workers started
spending much more time outside the office (with 100%
connectivity).
Software had to adapt itself to reach out on an Internet
scale. This killed off applications which worked on the scale of the
LAN. The software that the busy road warrior used was the software
that worked effortlessly on his laptop.
Today, 1 Mb/s wireless networks are common and 50 to 100 Mb/s
offerings are on the anvil. This is relentlessly shifting the
balance of convenience to mobility.
In a place like India, the low-end staff might not have netbooks
and/or Internet on the go. So for certain very low-end applications,
it might make sense to hug the desktop at the workplace. For any
modestly well paid person, laptops / netbooks coupled with 3G or CDMA
networks are the norm, and hence being tethered to the office network
is quite limiting.
2. The user interfaces got better
In the 1980s, software came with fat manuals. Users actually sat
down in training classes. A remarkable feature of the new world is
how the manuals and training are gone. Software is incredibly
capable but there are no manuals. Google Maps or Amazon or Apple
Mail are very powerful programs, but the fundamental assumption is
that a reasonable person can just start tinkering with them and
learn more as he goes.
The modern office worker gets no formal training in software all
his life. The modern knowledge worker learns major tools (e.g. a
programming language) and often puts in enormous effort for these.
But the rest, the ordinary flow of day to day life,
where new software systems come up all the time, is handled without
formal training.
Once the modern office worker has faced high quality UI design from
Google and the like, where there is zero training and zero marketing,
it becomes much harder to accept training. Standards have changed; in
the olden days, people would actually try to learn. Today, knowledge
workers are willing to get training in programming languages (e.g. R
or Stata) but not in applications. The MBAs are generally
training-proof.
3. All of us got busy
There was a time when one purposefully went about the work day
systematically doing certain things with certain software
tools. Knowledge workers have become deluged with information and
with stimuli. We have gone from being an information scarce economy
to being an attention scarce economy.
Software and information systems are now competing for the
attention of the user. The scarce resource is now the mind share of
the user. This is linked to the problem of user interfaces. If
something has a complicated user interface, and there are a hundred
other tasks that need to be done, the user ignores the complicated
thing. Software systems that don't fly immediately just die.
4. Peers determine where attention is directed
In a world where the knowledge worker is bombarded with hundreds of
things every day, what does he do? He tends to direct his scarce
time into the things that come well recommended. The
recommendations of respected peers are supremely important in
determining what a person does.
High powered sales campaigns have lost power. The person just asks
his friends what they do. The impulses through the day coming into
each person - over email, IM, Twitter, social networks, etc. - are
the de facto controllers of the person's time.
Peers are thus the gatekeepers to the user. The stuff that is
striking and remarkable gets noticed and pointed out to friends. What
gets pointed out tends to get a high Google PageRank.
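The PageRank idea just mentioned, that a page pointed to by many well-regarded pages earns a high rank, can be sketched as a simple power iteration. This is a toy illustration on made-up links, not Google's production algorithm:

```python
# Toy PageRank by power iteration. `links` maps each page to the
# pages it points to; with damping, every page keeps a small base
# rank and passes the rest along its outlinks.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            share = damping * rank[p] / len(outs)
            for q in outs:
                new[q] += share
        rank = new
    return rank

# Page "c" is pointed to by both "a" and "b", so it ends up on top.
ranks = pagerank({"a": ["c"], "b": ["c"], "c": ["a"]})
```

The fixed point rewards exactly the dynamic the essay describes: being pointed at by others is what raises your rank, not anything you do yourself.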
The importance of high pressure sales dropped. Some of the most
successful firms got by with negligible sales departments. Their
stuff was intuitive and good, and got immediately picked up.
5. Network effects leading to user generated content
The old model was one of corporations producing information and
users consuming information. In that power structure, the user was
only a source of revenue.
In the new world, the critical story is about kicking off network
effects. The systems that win are those that get better because of
one more user interaction.
At the simplest, user interactions kick off impulses to peers which
brings in more customers (viral marketing). But very soon, user
interactions generate relevant data. Google watches what users click
and uses that to improve search. Amazon tells you that the people
who liked this book also liked that book. Amazon has user-generated
content in terms of reviews.
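The "people who liked this book also liked that book" behaviour can be illustrated with a simple co-occurrence count over purchase baskets. The data and book names below are invented for illustration; Amazon's real system is far more elaborate:

```python
from collections import Counter
from itertools import combinations

# Toy item-to-item recommendation: count how often two books appear
# in the same customer's basket, then suggest the most frequent
# co-purchases. Hypothetical baskets, not real data.
baskets = [
    {"sicp", "taocp", "k&r"},
    {"sicp", "taocp"},
    {"sicp", "taocp"},
    {"sicp", "k&r"},
]

co = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co[(a, b)] += 1
        co[(b, a)] += 1

def also_liked(book, k=2):
    # Top-k books most often bought together with `book`.
    scores = Counter({b: n for (a, b), n in co.items() if a == book})
    return [b for b, _ in scores.most_common(k)]

recs = also_liked("sicp")
```

Note how every additional basket makes the recommendations better without anyone at the company doing anything: this is the network effect in miniature.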
Good systems create a warm and supportive environment in which
users contribute bug fixes and feature suggestions. These systems ride
the power of user eyeballs and brains to get better. The power
structure has changed. IBM DB2 used to be designed in a temple and
then went out to the helpless masses. Google's world is critically
linked to the users at so many levels (a receptive environment for
bug reports, feature requests, user generated content, and usage
data being turned back into strengthening the system).
The bottom line: Successful designers found ways to harness every
single user and user interaction to build the quality, the content
and the footprint of the system. Stalinist structures, which
disempowered the user and treated him only as a source of revenue,
stand isolated and stagnant.
6. Loss of power of enterprise IT
In the old world, enterprise IT mattered more. Grave decisions were
made by enterprise IT managers and then thousands of users fell in
line. In the new world, users forge ahead with their laptops and
tablets and mobile phones, exercising enormous autonomous choice
about how they spend their time. Consumer considerations, and the
loyalty of each individual user, are far more important than they
used to be. The enterprise IT department is much less of a
gatekeeper. In the olden days, hardware and software was sold to
enterprise IT, which made decisions for everyone inside the
organisation. In the new world, usage is won one user at a time, and
it is contestable every day.
7. CPUs became too cheap to meter
In the old world, computation was something scarce. The money that
went into building data centres was carefully weighed. System
designers carefully did things that were parsimonious in the use of
CPU.
With the rise of parallel computation, bringing 1000 CPUs into a
problem became cheap. Successful designers were those that found
ways to deploy incredibly large amounts of computer power to do
things that delight users. Google and Amazon are spending millions
of clock cycles in the back end, thinking about how to handle the
next move, as the mouse cursor moves! When faced with a choice
between doing something nice that users will like, versus doing
something that saves compute power, the former always won.
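The fan-out pattern this section describes, throwing workers at every item instead of economizing on one CPU, is a one-liner with today's libraries. A minimal sketch using Python's standard library (a thread pool for simplicity; genuinely CPU-bound work in Python would use a process pool or a cluster, but the pattern is the same):

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in for real per-item work, e.g. scoring one document or
# precomputing the response to one possible user action.
def score(n):
    return (n * n) % 97

# Fan the whole problem out across a pool of workers. Scaling from
# 8 workers to thousands of machines changes the plumbing, not the
# shape of the code.
with ThreadPoolExecutor(max_workers=8) as pool:
    scores = list(pool.map(score, range(1000)))
```

The designer's job shifts from saving cycles to finding work worth doing with them.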
8. Unexpected revenue sources
Who would have imagined that an ad agency would become the most
powerful author of operating systems for mobile phones in the world?
When hardware got dramatically cheap, and the Internet generated
access to eyeballs on an unimaginable scale, new revenue models came
about which were surprisingly different from the way we used to
think earlier.