29 June 2016
In 2003, I had the good fortune of being a founding member of the AWS team. I spent more than a decade helping define, build and operate the services that comprise Amazon's cloud offering. We knew from the beginning that the cloud could redefine how developers build, deploy and manage applications. We wanted to make it as easy for college students in dorm rooms to build scalable and reliable applications as it had been for developers in large organizations. While there's still much more innovation needed to fully realize the vision of a comprehensive cloud platform, AWS has largely delivered on that early goal.

Recently, I have been thinking more deeply about why and how the cloud came to be in the first place. Many people credit virtualization, the growth of the Internet and the emergence of web-scale computing as the primary drivers. But fewer recognize the fundamental economic inversion that made cloud computing possible, and indeed necessary.

When I first started as a professional software developer in the late 1980s, machines were the scarce resource and the humans who programmed them were relatively plentiful and inexpensive. I spent a lot of time waiting for my code to compile and link, and hours reducing the memory footprint of my programs and optimizing my algorithms to save processing time. Disk space was at such a premium that I'd bias towards using bits instead of bytes. It took weeks, if not months, to procure more compute and storage.

My experience as a developer was typical. Since machines were the scarce resource, companies made significant investments in people and infrastructure to optimize their ability to get the most value from their computers. Organizations were willing to bear high coordination costs to extract the maximum business-serving value from the applications the machines enabled them to run. In traditional IT infrastructure, those coordination costs include the people who monitor and manage the hardware, the software developers who create business-serving value, and the servers, storage, networking hardware and other physical resources required to deliver that value. However, something fundamental happened, starting in the early 2000s as best I can estimate. Moore's Law, which had compounded for more than 30 years prior, suddenly changed the economics of IT. Computing power became cheaper than developers' time.

So instead of the machines being the scarce resource, the bottleneck became the developers building the business-serving applications. As computing power grew less expensive, developers' time became the higher-cost item, and it no longer made sense for people to wait for the machines. Companies started to optimize for the inverse: the machines should wait for the unscalable resource, the developer, whose time grew relatively more expensive as the price of computing power continued to decline.
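
To make the inversion concrete, here is a rough back-of-envelope sketch. Every figure in it is a hypothetical assumption chosen purely for illustration, not a number from my own experience, but any values in this neighborhood tell the same story: paying for idle machines became far cheaper than paying for idle developers.

```python
# Back-of-envelope illustration of the cost inversion.
# All figures below are hypothetical assumptions, used only to show the comparison.

HOURLY_DEVELOPER_COST = 75.00   # loaded cost of a developer, $/hour (assumption)
HOURLY_INSTANCE_COST = 0.10     # cost of an extra on-demand server, $/hour (assumption)
WORKING_DAYS_PER_YEAR = 250

# Suppose a developer spends 30 minutes a day waiting on a constrained machine.
idle_developer_hours = 0.5 * WORKING_DAYS_PER_YEAR
cost_of_waiting = idle_developer_hours * HOURLY_DEVELOPER_COST

# Versus keeping an extra machine available for every working hour of the year.
extra_machine_hours = 8 * WORKING_DAYS_PER_YEAR
cost_of_extra_compute = extra_machine_hours * HOURLY_INSTANCE_COST

print(f"Developer time lost to waiting: ${cost_of_waiting:,.0f} per year")
print(f"Extra on-demand compute:        ${cost_of_extra_compute:,.0f} per year")
```

With assumptions like these, the developer's waiting time costs tens of times more than the spare capacity that would eliminate it, which is exactly the inversion described above.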

So how did this lead to the emergence of the cloud? Quite simply, the inversion in where the value lay -- with the developer rather than the machines -- meant that the coordination costs organizations had long accepted in order to operate their own IT infrastructure were no longer tenable.

Companies looking for ways to reduce coordination costs found it efficient to outsource and automate their IT infrastructure. The scarce compute resources that had previously been rationed and managed were now plentiful and instantaneously accessible to developers with the swipe of a credit card. 

Why continue to pour capital into building expensive data centers and paying for the hardware, software and staff required to operate them when you can procure these resources on demand and pay only for what you use in the cloud? You no longer had to be a developer in a large organization: you could be that college kid in the dorm room.
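
As a minimal sketch of what procuring compute on demand looks like in practice, the snippet below launches a single server with the AWS SDK for Python (boto3). The image ID and instance type are placeholders chosen only for illustration; the point is simply that capacity arrives in seconds via an API call rather than a months-long procurement cycle.

```python
# Minimal sketch: launching an on-demand server with the AWS SDK for Python (boto3).
# The AMI ID and instance type are placeholders, not recommendations.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-00000000000000000",  # placeholder image ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}; you pay only for the time it runs.")
```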

I believe we've only begun to see the changes the cloud will enable. Development methodologies such as Agile, Lean and DevOps, in which developers are empowered to manage the entire application lifecycle, will help them iterate more quickly, saving significant time and labor expense. Platform services will provide developers with automated, higher-level abstractions so they can more easily and quickly build, test, deploy and operate their applications. Machine learning and AI will augment developer productivity by increasingly eliminating rote, tedious and repetitive programming and testing tasks.

The cloud has inaugurated an irreversible change in how applications will be built, one that places the developer at the center.

So, why the cloud? Because, the developer.
 