In software, truly disruptive technologies are often dismissed at first as toys that will never reach mainstream adoption. There are several reasons for this. First, human beings are naturally resistant to change. Even though we like things to improve, we have a hard time letting go of the certainty we feel in the way things are now. A second reason is lack of vision or imagination. For a lot of people, it is genuinely hard to imagine something working differently than it does today. It takes an open and curious mind to lead us toward change and to visualize the potential of a simple MVP to scale into a full-featured technology. The naysayers often limit their view to the functionality (or lack thereof) of the product, without taking in the larger view of the limitations the technology is helping to overcome. Early adopters move first because they see that they have access to functionality that was previously unavailable to them, whether because of cost, complexity or both. They did a trade-off analysis of accepting limited features vs. gaining expanded functionality, and expanded functionality won.
As we know, adoption becomes a self-fulfilling prophecy. Access to a wider user base brings down costs, and lower prices result in an uptake in adoption. This momentum drives application development and use cases for the new technology that simply weren't possible with existing solutions.
This is a classic scenario in technology, and cloud computing is a clear example of it. When Amazon launched Amazon Web Services (AWS) in 2006, the industry initially dismissed it as little more than a novelty. When it started to gain traction with small businesses that couldn't justify the cost of building their own technology, the critics said it would never be robust enough for enterprise businesses. According to IDG, the number of organizations with at least one application in the cloud increased from 57 percent in 2012 to 72 percent in 2015, and large-scale enterprises are spending around $3 million per year on cloud services. It seems wide-scale adoption is alive and well in the cloud.
At Bitnami, we were early adopters of AWS. A pivotal moment for us came when we reached a point where the limitations of existing technology forced us to envision a better way. We searched the market for a new solution that was also affordable. We found our answer when AWS announced the first version of its Elastic Block Store (EBS) offering in August 2008.
We had set up a small data center in our offices, which included dozens of physical machines. Our technology requires a continuous build-and-test cycle, using hundreds of libraries and components. In 2008, it became apparent to us that we needed a better way to manage backups and snapshots of virtual machines. We considered multiple options, including building our own Network Attached Storage (NAS) setup. However, we concluded that we would need some of the functionality provided by storage appliance vendors.
So, we contacted the enterprise vendors to get pricing. We were sticker-shocked! Even the most basic setups were well into the tens of thousands of dollars. The real kicker came when we asked the question, "OK, how much storage would we be getting for that amount of money?" You could hear a bit of a muffled laugh at the other end of the phone. The reply was, "Well, this price only includes the chassis ... you will need to buy the disks separately." Those were enterprise-grade SCSI disks that cost many times what even high-end commodity ones cost. By that point, we had pretty much resigned ourselves to the fact that we would have to bite the bullet and purchase the solution. But we were a scrappy startup at the time, and we couldn't stop searching for a better answer. That answer came later that same week, with AWS' EBS announcement.
EBS gave us the ability to attach network storage to virtual machines running in the AWS cloud, take and store incremental snapshots, and easily recreate machines from those snapshots. All of this was API-driven, with a minimal, usage-based cost of $0.10 per GB per month. Of course, this offering did not provide all the functionality of an on-premises enterprise appliance, but it was orders of magnitude cheaper. We decided to invest further in automating the build-and-test system and arrived at a solution that was good enough for our use case. We eventually got rid of the on-premises server farm, and we now run hundreds of machines across multiple cloud platforms.
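To make the "orders of magnitude cheaper" claim concrete, here is a back-of-envelope comparison using the figures above: EBS's $0.10-per-GB price (billed monthly at launch) against the appliance quotes that started "well into the tens of thousands of dollars" for the chassis alone. The $30,000 chassis figure and 2 TB working set are illustrative assumptions, not numbers from the article.

```python
# Back-of-envelope cost comparison of the two storage options described above.
# The EBS price comes from the article; the chassis cost and storage size
# are illustrative assumptions.

EBS_PRICE_PER_GB_MONTH = 0.10    # 2008 EBS pricing cited in the article
APPLIANCE_CHASSIS_COST = 30_000  # assumed quote, enterprise SCSI disks extra
STORAGE_GB = 2_000               # hypothetical 2 TB of build/test snapshots

ebs_monthly = STORAGE_GB * EBS_PRICE_PER_GB_MONTH
months_covered = APPLIANCE_CHASSIS_COST / ebs_monthly

print(f"EBS: ${ebs_monthly:,.0f}/month")
print(f"The chassis price alone would cover {months_covered:,.0f} months of EBS")
```

Under these assumptions, the appliance chassis alone costs as much as more than a decade of EBS storage, before buying a single disk, which is why the trade-off analysis favored the less-featured cloud option.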
AWS clearly took the lead in this market. In fact, Gartner found that the size of its business is more than 10x that of the next 14 largest cloud providers combined. All of the big players -- Amazon, Google, Microsoft and other tier-1 cloud providers -- are expanding their offerings beyond processing and storage to include analytics, database back-ends, video encoding and streaming, log management, monitoring and dozens of other services. These offerings tend to be simpler and less functional than their established enterprise counterparts, but they are also significantly cheaper, more convenient to use, available on a pay-as-you-go basis, and quite simply good enough for many users. The critics who claim these offerings won't win the enterprise because they lack this or that feature are taking too narrow a view. Looking myopically for feature parity out of the gate doesn't take into consideration the overall benefit of the solution and the longer view. The leading cloud platforms are taking a page from Microsoft's platform dominance in the '90s: they can release a limited, sometimes buggy, product and still win in the long term by virtue of iterating quickly and being bundled in with the platform.
Some of the arguments against cloud-based offerings are not purely technical, but point out real or perceived legal and security shortcomings. Over time, these will also be addressed. If there is a significant economic incentive (and there is with the cloud), those with the biggest stakes in the game will find a way to address these issues in a way that opens the doors to adoption. In many ways, this mirrors what we saw with successful open source projects like Linux, MySQL or JBoss. Linux was not considered "enterprise-ready" for a long time. It was restricted to developers' workstations or to powering file-and-print servers. Over time, it added features such as journaling filesystems, multiprocessor support and others. It eventually became "good enough". We can see parallels to the cloud in the setbacks open source experienced from licensing and security issues. Concerns were voiced: "Can I safely use open source software? Won't Microsoft sue me? How can it be secure if the bad guys have access to the source code?" However, the voices were quieted when the first Internet bubble burst and, all of a sudden, Linux running on commodity Dell hardware was much more attractive than a Sun E10K SPARC machine running Solaris, despite the latter being much more "enterprise-ready" than the former.
At this point, it is not a question of whether enterprises embrace the cloud, but when and how. Cloud offerings will become the backbone of enterprise computing. The unknown factors are how quickly it will happen and which vendors will dominate.