Over the years, cloud computing has emerged as the dominant form of delivering IT. Although many are familiar with the benefits of cloud computing, very few understand how cloud computing came to be.
Don’t get me wrong, focusing on what cloud technology can do for your business is important. However, history matters too.
Cloud technology, like all modern technology, wouldn’t be possible without the various technologies that came before it. Understanding the history behind cloud computing matters because it (1) informs us of the challenges others faced and overcame in the past, and (2) lays the foundation on which we can base future strategies.
Read on to discover the 7 technological breakthroughs that led to the emergence of cloud computing.
1. Mainframe Computers
One could argue that IT began with mainframe computers. These early computers were used in the 50s and 60s by large corporations to process sizeable amounts of data.
By today’s standards, mainframes were difficult to use. There was no direct human interface (no keyboards or screens); the only way to enter data was through punch cards, paper tape, or hard wiring.
If you were a smaller organization that needed computing power, your salvation did not arrive until the mid-60s when minicomputers emerged. Minicomputers were not as powerful as mainframes; however, they had value because they processed more data than a human could in a reasonable amount of time and were cheaper than mainframes.
2. The PC
As the use of mainframes and minicomputers became common, the need for users to process their own information emerged. Mainframes couldn’t meet this need because they were inaccessible to most staff members. Minicomputers were also out of the question because they were simply too expensive for each staff member to have.
In response to the growing demand for personal computing power, IBM introduced the PC, Microsoft introduced the first version of Windows, and Steve Jobs introduced the Apple Macintosh computer.
3. Computer Networks
As mainframes, minicomputers, and desktop PCs became standard, people began developing ways to connect computers so they could communicate with each other. The first network to connect computers was ARPANET.
The Defense Advanced Research Projects Agency (DARPA) began the development of ARPANET in 1969. The aim of this network was to give researchers convenient access to mainframes that were too far away from their work locations. Between its inception in 1969 and 1981, ARPANET grew from a network of 4 locations to more than 213.
The National Science Foundation was another organization that began coordinating its projects over its own network called the National Science Foundation Network (NSFNET). The NSFNET became the backbone of the Internet.
While ARPANET and NSFNET grew, networks for connecting computers within an organization also emerged. These networks were known as Local Area Networks (LANs). LANs allowed organizations to share data and devices, such as printers, within a distinct geographical area such as an office.
4. HTML and HTTP
A physicist named Tim Berners-Lee invented the World Wide Web (WWW) in 1989, using Hypertext Markup Language (HTML) as its publishing language. The idea of the Web came from the need to organize and pool information between researchers at remote sites around the world.
HTML is the standard markup language for documents designed to be displayed on a web browser. Hypertext Transfer Protocol (HTTP), which was also developed by Berners-Lee, is a simple protocol for retrieving other documents via hypertext links. Without HTML and HTTP, delivering cloud services wouldn’t be possible.
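To make the pairing concrete, here is a minimal sketch in Python: a small HTML document of the kind early browsers rendered, the raw HTTP request a browser would send to fetch it, and a parser that pulls out the hypertext links that tie documents together. The page content and hostname are made up for illustration.

```python
from html.parser import HTMLParser

# A minimal HTML document (contents are illustrative).
page = """<html>
<head><title>My First Page</title></head>
<body><p>Hello, <a href="http://example.com/next.html">next page</a></p></body>
</html>"""

# The raw HTTP/1.0-style request a browser would send to retrieve it.
request = "GET /index.html HTTP/1.0\r\nHost: example.com\r\n\r\n"

class LinkExtractor(HTMLParser):
    """Collect the hypertext links that make the Web a web."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":  # anchor tags carry the hyperlinks
            self.links.extend(value for name, value in attrs if name == "href")

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # → ['http://example.com/next.html']
```

Following each extracted link with another HTTP request is, in essence, how the Web is browsed.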
5. Clustering and Grid Computing
A common challenge in IT is maximizing the use of computer resources. Clustering and grid computing help address this challenge.
Clustering is a way to group computer resources together to maximize their functioning. Computer clusters are architectures rather than a type of computer. All computer clusters have one thing in common: they all appear to the user as one device rather than several interconnected devices.
Grid computing on the other hand is a platform made up of loosely interconnected computer resources. It operates with true distributed processing and does not appear as one system to the user.
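The grid idea can be sketched in miniature with Python's standard library. A real grid farms work out to loosely coupled machines; here, as a stand-in, a thread pool on one machine plays the role of the workers, while the caller sees only a single `map`-like interface.

```python
from concurrent.futures import ThreadPoolExecutor

def square(n):
    # Stand-in for a unit of work farmed out to one node of the grid.
    return n * n

# To the caller this looks like one call on one system; behind the
# scenes the work is split across whichever workers are free.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(square, range(8)))

print(results)  # → [0, 1, 4, 9, 16, 25, 36, 49]
```

Swapping the thread pool for real networked workers changes the plumbing but not the pattern, which is why this style of decomposition translated so naturally to cloud platforms.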
6. Service-Oriented Architecture (SOA)
Service-Oriented Architecture (SOA) is a software engineering technique that made building software easier by allowing developers to build software from components of a business process rather than needing to “reinvent the wheel” each time software is created. An example of SOA architecture is a website that uses a service from one vendor to provide shopping cart functionality and another service from a different vendor to provide credit card validation.
In a nutshell, SOA can be seen as an orchestration of services from different organizations. With it, components of cloud computing such as software reuse and pay-as-you-go were made possible.
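The shopping example above can be sketched as an orchestration of two stand-in "services". In a real SOA these would be remote calls to independent vendors rather than local functions; the service names and prices here are hypothetical.

```python
def cart_service(items):
    """Hypothetical shopping-cart service: totals the order."""
    return sum(price for _name, price in items)

def card_validation_service(card_number):
    """Hypothetical credit-card check using the Luhn algorithm."""
    digits = [int(d) for d in reversed(card_number)]
    total = sum(digits[0::2])  # digits in odd positions, as-is
    total += sum(sum(divmod(d * 2, 10)) for d in digits[1::2])  # doubled
    return total % 10 == 0

def checkout(items, card_number):
    # The orchestration layer: composes services it did not build.
    if not card_validation_service(card_number):
        return None
    return cart_service(items)

order = [("book", 12.50), ("pen", 1.25)]
print(checkout(order, "4539148803436467"))  # a valid Luhn test number → 13.75
```

Neither service knows about the other; the website simply wires them together, which is exactly the reuse that later made pay-as-you-go cloud services practical.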
7. Virtualization
In computing, virtualization means creating multiple virtual copies of a resource that all share the same physical hardware. The copies run side by side on a single server, but to each user a copy appears to be its own dedicated resource.
Many came to understand that treating computer resources individually was not effective because each device has an excess capacity that could be used. Virtualization creates efficiency by consolidating servers to improve utilization and cost effectiveness.
A major advantage of virtualization is that it makes infrastructure more scalable. A virtualized infrastructure in which you purchase computing resources can sense your usage and scale those resources with demand. As resources are added or removed, the amount charged is adjusted accordingly.
There you have it: the 7 tech breakthroughs that led to cloud computing. As you went through each one, you may have noticed that the driving force behind these breakthroughs was the needs of businesses and organizations striving for specific goals.
The innovation hasn’t stopped. Cloud computing is helping organizations meet their strategic goals by enabling greater agility, flexibility, and scalability. As the needs of organizations shift in the future, we can be certain that new breakthroughs will emerge to meet those needs.