In case you had any doubt that the cloud is now the norm, consider this: 96% of enterprises are using the cloud, and 81% have adopted a multi-cloud strategy. Simply put, we’ve entered the age of the “multi-cloud.”
Today’s multi-cloud reality might seem like a new phenomenon, but the shift has been decades in the making. To understand the current cloud computing and security landscape, let’s take a look at the key figures and events that shaped the terrain. (Thanks to McAfee for providing such a thorough overview, which we’ve boiled down for this article).
1963: Strides into space and the cloud
During John F. Kennedy’s presidency, all eyes were on the space race—except in the U.S. Pentagon’s Advanced Research Projects Agency (ARPA), where researchers were exploring a different uncharted territory. In 1963, the head of the agency’s Information Processing Techniques Office, J.C.R. Licklider, penned a memo describing a network of computers that would allow data to be updated and shared with programs that were “somewhere else.” They would do so by sharing a common language.
Essentially describing the precursor to the Internet and the TCP/IP protocols, Licklider wrote, “Consider the situation in which several different centers are netted together, each center being highly individualistic and having its own special language and its own special way of doing things. ... With a sophisticated network-control system, I would not decide whether to send the data and have them worked on by programs somewhere else, or bring in programs and have them work on my data. I have no great objection to making that decision, for a while at any rate, but, in principle, it seems better for the computer, or the network, somehow, to do that.”
Licklider’s insights played a key role in the launch of ARPANET, the earliest incarnation of the Internet, and laid the groundwork for future cloud computing technologies. But he wasn’t blind to the larger implications, writing about the challenges of collaborative editing and of keeping data secure when it isn’t housed in a single physical location.
1972: The birth of virtualised computing
As ARPA was developing its technologies, IBM was busy with its own innovations. In 1972, it released the VM/370, which set the stage for today’s virtualised data centers. The VM/370 was “a multi-access time-shared system” with a control program similar to modern hypervisors, which opened the door for multiple virtual machines to run operating systems in “time-shared mode.” It also featured a conversational monitor system for “general-purpose, time-sharing capability.”
1998: A renewed focus on computing environments
The next decade or so was dominated by personal computers, which took crucial attention away from mainframe and datacenter computing environments. That changed in 1998, when VMware filed a patent for what it described as a “Virtualisation system including a virtual machine monitor for a computer with a segmented architecture… particularly well-adapted for virtualising computers in which the hardware processor has an Intel x86 architecture.”
VMware’s technology hit shelves a year later, positioning the company at the head of the pack. By using x86, it added extra value to existing datacenter infrastructures. As McAfee put it, cloud computing likely would not exist today had it not been for the enhanced efficiency and affordability that the VMware technology enabled.
1999: Salesforce.com arrives on the scene
By the late 1990s, a virtualised infrastructure was finally beginning to make sense for enterprises. This opened the door for the modern CRM market, with Salesforce.com arriving on the scene at the decade’s end to challenge Oracle and SAP.
The CRM market is expected to reach $35 billion in the next few years.
2003: Setting the stage for Amazon Web Services (AWS)
To manage its internal needs, Amazon began exploring a new automated, web services-reliant infrastructure proposed by one of its lead engineers. CEO Jeff Bezos described the company’s 2003 decision this way: “Basically, what we decided to do is build a [set of APIs] between those two layers so that you could just do coarse-grained coordination between those two groups.”
2006: The launch of AWS
A few years later, that internal infrastructure became Amazon Web Services (AWS). Its original announcement described a web-based service “that provides resizable compute capacity in the cloud… designed to make web-scale computing easier for developers. Amazon EC2 changes the economics of computing by allowing you to pay only for capacity that you actually use.”
Today, that little internal project is generating $10 billion in annual revenue.
2010: The multi-cloud arrives
Microsoft Azure launched in 2010, and Google Cloud Platform arrived on the market a year later. With this growing competition, choosing a single provider was no longer beneficial. Enter the multi-cloud.
2012: More data, more security issues
With the growth of Salesforce and Amazon, the cloud services space exploded. This spurred a flood of data and, in 2012, the arrival of Cloud Access Security Brokers (CASBs) as a guard against data loss and complex compliance issues.
Fast-forward to 2018, and more vendors than ever have entered the market with automated deployment technology aimed at helping organisations manage multiple cloud providers. Security solutions have largely followed suit, and the giants in the space—AWS, Google and Azure—have documented their shared responsibility for their customers’ security challenges.
The rise of the multi-cloud has brought key opportunities, as well as many of the challenges that Licklider wrote about all those decades ago. Contact our team today to explore how our experts can help you manage the multi-cloud and avoid costly cloud security issues.