Put simply, cloud computing is the storage and access of data and programs over the internet instead of on a computer’s hard drive. The cloud allows servers, storage and applications to be delivered to an organisation’s computers and devices through the internet.
In the 1950s, computers were huge, occupying entire rooms, and enormously expensive to purchase and maintain. The solution was “time sharing”, which gave multiple users shared access to data and CPU time on a single machine. This was the premise of cloud computing.
In 1969, the ARPANET (Advanced Research Projects Agency Network) – the network that became the basis of the internet – came online. It grew out of J.C.R. Licklider’s vision “for everyone on the globe to be interconnected and accessing programs and data at any site, from anywhere”.
The 1970s saw IBM release an operating system known as VM (Virtual Machine), which allowed administrators to run multiple virtual systems, or “virtual machines”, on a single physical node. VM took the 50s “time sharing” model to the next level, and many basic virtualisation functions in common use today can be traced back to this early system.
In the 1990s, telecommunications companies began offering virtualised private network connections, which allowed traffic to be shifted as necessary for better network balance and more control over bandwidth usage. As the internet became more accessible over the decade, virtualisation for PC-based systems also took hold.
Cloud usage became prominent in the 2000s. Corporate solutions from organisations such as IBM and Oracle became popular, offering businesses services such as Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS). Consumer use also exploded through products such as Apple’s iCloud, Google Apps and Dropbox.