In 1963, DARPA (the Defense Advanced Research Projects Agency) presented MIT with $2 million for Project MAC. The funding included a requirement that MIT develop technology allowing a “computer to be used by two or more people, simultaneously.” The result was one of those gigantic, archaic computers that used reels of magnetic tape for memory, and it became a precursor to what is now collectively known as cloud computing: it acted as a primitive cloud, with two or three people accessing it at once. The word “virtualization” was used to describe this arrangement, though the word’s meaning later expanded.
In 1969, J. C. R. Licklider helped develop the ARPANET (Advanced Research Projects Agency Network), a very primitive version of the Internet. Licklider, known as “Lick,” was both a psychologist and a computer scientist, and promoted a vision called the “Intergalactic Computer Network,” in which everyone on the planet would be interconnected by way of computers and able to access information from anywhere. (What could such an unrealistic, impossible-to-pay-for fantasy of the future look like?) The Intergalactic Computer Network, otherwise known as the Internet, is necessary for access to the cloud.
The meaning of virtualization began shifting in the 1970s, and it now describes the creation of a virtual machine that acts like a real computer, with a fully functional operating system. The concept of virtualization evolved with the Internet as businesses began offering “virtual” private networks as a rentable service. The use of virtual computers became popular in the 1990s, leading to the development of the modern cloud computing infrastructure.
The Late 1990s
In its early stages, the term “cloud” was used to express the empty space between the end user and the provider. In 1997, Professor Ramnath Chellapa of Emory University defined cloud computing as a new “computing paradigm, where the boundaries of computing will be determined by economic rationale, rather than technical limits alone.” This somewhat ponderous description rings true in describing the cloud’s evolution.
The cloud gained popularity as companies gained a better understanding of its services and usefulness. In 1999, Salesforce became a popular example of using cloud computing successfully, pioneering the idea of using the Internet to deliver software programs to end users. The program (or application) could be accessed by anyone with an Internet connection, and businesses could purchase the software in an on-demand, cost-effective manner, without leaving the office.
The Early 2000s
In 2002, Amazon introduced its web-based retail services. It was the first major business to treat the common practice of using only 10% of a computer’s capacity as a problem to be solved. The cloud computing infrastructure model gave Amazon the flexibility to use its computing capacity much more efficiently. Soon after, other large organizations followed its example.
In 2006, Amazon launched Amazon Web Services (AWS), which offers online services to other websites and clients, including storage and computation. One of its services, Amazon Mechanical Turk, provides on-demand “human intelligence” through a crowdsourced workforce. Another is the Elastic Compute Cloud (EC2), which allows individuals to rent virtual computers and run their own programs and applications on them.
In the same year, Google launched its Google Docs service. Google Docs was originally based on two separate products, Google Spreadsheets and Writely. Google purchased Writely, which gave users the ability to save documents, edit documents, and transfer them into blogging systems. (These documents are compatible with Microsoft Word.) Google Spreadsheets (acquired from 2Web Technologies in 2005) is an Internet-based program allowing users to develop, update, and edit spreadsheets, and to share the data online. It is an Ajax-based program compatible with Microsoft Excel, and the spreadsheets can be saved in HTML format.
In 2007, IBM, Google, and several universities joined forces to develop a server farm for research projects needing both fast processors and huge data sets. The University of Washington was the first to sign up and use resources provided by IBM and Google. Carnegie Mellon University, MIT, Stanford University, the University of Maryland, and the University of California at Berkeley quickly followed suit. The universities immediately realized that computing experiments could be done faster and for less money with IBM and Google supporting their research. Since much of the research focused on problems IBM and Google had interests in, the two companies also benefited from the arrangement. 2007 was also the year Netflix launched its streaming video service, using the cloud, and provided support for the practice of “binge-watching.”
In 2008, Eucalyptus offered the first AWS API-compatible platform, which was used for deploying private clouds. In the same year, OpenNebula released the first open-source software for deploying private and hybrid clouds. Many of its most innovative features focused on the needs of major businesses.
2010 and Beyond
Although private clouds were initiated in 2008, they were still undeveloped and not very popular. Concern about poor security in public clouds was a strong driving force promoting the use of private clouds. By 2010, companies such as AWS and Microsoft had developed private clouds that were fairly functional. (2010 was also when the OpenStack project made a free, open-source, do-it-yourself cloud platform, which became very popular, available to the general public.)
The concept of hybrid clouds was introduced in 2011. A hybrid requires a fair amount of interoperability between a private and a public cloud, along with the ability to shift workloads back and forth between the two. At the time, very few businesses had systems capable of doing this, though many wanted to because of the tools and storage public clouds could offer.
In 2011, IBM introduced the IBM SmartCloud framework in support of Smarter Planet (a cultural thinking project). Then Apple launched iCloud, which focuses on storing more personal information (photos, music, videos, etc.). Also during this year, Microsoft began advertising the cloud on television, making the general public aware of its ability to store photos or videos with easy access.
Oracle introduced the Oracle Cloud in 2012, offering the three basics for business: IaaS (Infrastructure-as-a-Service), PaaS (Platform-as-a-Service), and SaaS (Software-as-a-Service). These “basics” quickly became the norm, with some public clouds offering all of these services, while others focused on offering only one. Software-as-a-Service became quite popular.
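The practical difference between the three service models is who manages which layer of the stack. The sketch below is a common textbook simplification (the layer names and split points are illustrative, not drawn from any formal standard):

```python
# Illustrative sketch of the IaaS / PaaS / SaaS split: which layers the
# cloud vendor manages versus the customer. The layer list and split
# points below are a common simplification, not a formal specification.
LAYERS = ["application", "runtime", "operating system", "virtual machine", "hardware"]

# Index in LAYERS at which responsibility shifts from customer to vendor.
VENDOR_MANAGES_FROM = {"IaaS": 3, "PaaS": 1, "SaaS": 0}

def responsibilities(model):
    """Return {layer: 'vendor' or 'customer'} for a given service model."""
    split = VENDOR_MANAGES_FROM[model]
    return {layer: ("vendor" if i >= split else "customer")
            for i, layer in enumerate(LAYERS)}

if __name__ == "__main__":
    for model in ("IaaS", "PaaS", "SaaS"):
        managed = [l for l, who in responsibilities(model).items() if who == "vendor"]
        print(f"{model}: vendor manages {', '.join(managed)}")
```

Under this simplification, an IaaS customer still runs their own operating system on rented virtual machines, a PaaS customer only deploys their application, and a SaaS customer simply uses the finished software.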
CloudBolt was founded in 2012. The company gets credit for developing a hybrid cloud management platform that helped organizations build, deploy, and manage both private and public clouds, helping to resolve the interoperability problems between the two.
Multi-cloud use began when organizations started using SaaS providers for certain services, such as human resources, customer relationship management, and supply chain management. This started becoming popular in roughly 2013–2014. While this use of SaaS providers remains quite popular, a broader philosophy of using multiple clouds for their specific services and advantages has since developed. Part of that philosophy is avoiding becoming locked into a single cloud because of interoperability problems.
By 2014, cloud computing had developed its basic features, and security had become a major concern. Cloud security has become a fast-growing service, because of its importance to customers. Cloud security has advanced significantly in the last few years, and now provides protection comparable to traditional IT security systems. This includes the protection of critical information from accidental deletion, theft, and data leakage. Having said that, security is, and may always be, the primary concern of most cloud users.
Currently, application developers are among the primary users of cloud services. In 2016, the cloud began to shift from developer-friendly to developer-driven, as application developers began taking full advantage of the tools the cloud made available. Recognizing the need, and the potential for profit, cloud vendors developed (and continue to develop) the tools app developers want and need, and a large number of services now strive to be developer-friendly to draw more customers.
Although primitive containers have been around since 2004 (Solaris containers), these early containers were very limited and restricted to certain computer systems. It wasn’t until 2013, when Docker introduced a container format and tooling that were extremely functional, that these tools caught on. It is no coincidence that the growth of Docker and the growth of container use happened simultaneously.
In 2017, hundreds of tools that had been around for years were modified to make working with containers easier. Kubernetes, developed by Google in 2014 and then released as open source, was one of these. Kubernetes is a container orchestration system designed to automate application deployment, scaling, and management.
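To make the “automate deployment, scaling, and management” point concrete, here is a minimal sketch of a Kubernetes Deployment manifest. The names (`web`) and the container image (`nginx:1.25`) are placeholders chosen for illustration; the point is that the operator declares a desired state and Kubernetes maintains it, restarting or rescheduling containers as needed:

```yaml
# Minimal illustrative Deployment: Kubernetes keeps three replicas of
# the container running and replaces any replica that fails.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web            # placeholder name for this example
spec:
  replicas: 3          # desired state: three running copies
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
        - name: web
          image: nginx:1.25   # placeholder image; any container image works
          ports:
            - containerPort: 80
```

Scaling then becomes a one-line change (editing `replicas`) rather than a manual provisioning exercise, which is the automation the paragraph above describes.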
The Future of Cloud Computing
The coronavirus pandemic accelerated the use of the Internet for eCommerce and remote work. Automated data governance software for dealing with a growing number of Internet laws and regulations seems a reasonable prediction for the future of the cloud. For more on the future of the cloud, check out The Cloud Computing Trends in 2022.