Cloud computing
Cloud computing metaphor
The group of networked elements providing services need not be individually addressed or managed by users; instead, the entire provider-managed suite of hardware and software can be thought of as an amorphous cloud.
Cloud computing is the on-demand availability of computer system resources, especially data storage (cloud storage) and computing power, without direct active management by the user.
Large clouds often have functions distributed over multiple locations, each location being a data center. Cloud computing relies on sharing of resources to achieve coherence and economies of scale, typically using a “pay-as-you-go” model, which can help in reducing capital expenses but may also lead to unexpected operating expenses for unaware users.
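The pay-as-you-go trade-off can be made concrete with a back-of-the-envelope calculation. The hourly rate and usage figures below are illustrative assumptions, not any provider’s actual pricing:

```python
# Illustrative pay-as-you-go cost sketch. The rate and usage numbers
# are made-up assumptions for demonstration, not real provider pricing.

HOURLY_RATE = 0.10        # assumed cost per VM-hour (USD)
HOURS_PER_MONTH = 730     # average hours in a month

def monthly_cost(vm_count: int, hours: float = HOURS_PER_MONTH) -> float:
    """Cost of running `vm_count` VMs for `hours` at the assumed rate."""
    return vm_count * hours * HOURLY_RATE

# Steady baseline: 4 VMs running all month -- no up-front capital expense.
baseline = monthly_cost(4)

# A forgotten test cluster (10 VMs left running all month) silently
# dwarfs the baseline -- the "unexpected operating expenses" above.
forgotten = monthly_cost(10)
print(f"baseline ${baseline:.2f}, with surprise ${baseline + forgotten:.2f}")
```

The point of the sketch is that pay-as-you-go converts a fixed capital cost into a variable one: the bill tracks whatever is left running, whether or not it is doing useful work.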
Value proposition
Advocates of public and hybrid clouds note that cloud computing allows companies to avoid or minimize up-front IT infrastructure costs. Proponents also claim that cloud computing allows enterprises to get their applications up and running faster, with improved manageability and less maintenance, and that it enables IT teams to more rapidly adjust resources to meet fluctuating and unpredictable demand, providing burst computing capability: high computing power during periods of peak demand.
References to the phrase “cloud computing” appeared as early as 1996, with the first known mention in a Compaq internal document.
The cloud symbol was used to represent networks of computing equipment in the original ARPANET as early as 1977, and in the CSNET by 1981, both predecessors to the Internet itself. The word cloud was used to denote a network on telephony schematics; the implication of this simplification is that the specifics of how the endpoints of a network are connected are not relevant to understanding the diagram.
The term cloud was used to refer to platforms for distributed computing as early as 1993, when Apple spin-off General Magic and AT&T used it in describing their (paired) Telescript and PersonaLink technologies.
In Wired’s April 1994 feature “Bill and Andy’s Excellent Adventure II,” Andy Hertzfeld commented on Telescript, General Magic’s distributed programming language.
During the 1960s, the initial concepts of time-sharing became popularized via RJE (Remote Job Entry); this terminology was mostly associated with large vendors such as IBM and DEC.
Full time-sharing solutions were available by the early 1970s on platforms such as Multics (on GE hardware), Cambridge CTSS, and the earliest UNIX ports (on DEC hardware). Yet the “data center” model, in which users submitted jobs to operators to run on IBM mainframes, was overwhelmingly predominant.
In the 1990s, telecommunications companies, who previously offered primarily dedicated point-to-point data circuits, began offering virtual private network (VPN) services with comparable quality of service, but at a lower cost. By switching traffic as they saw fit to balance server use, they could use overall network bandwidth more effectively.
They began to use the cloud symbol to denote the demarcation point between what the provider was responsible for and what users were responsible for. Cloud computing extended this boundary to cover all servers as well as the network infrastructure. As computers became more widespread, scientists and technologists explored ways to make large-scale computing power available to more users through time-sharing. They experimented with algorithms to optimize the infrastructure, platform, and applications, to prioritize CPUs and increase efficiency.
The use of the cloud metaphor for virtualized services dates at least to General Magic in 1994, where it was used to describe the universe of “places” that mobile agents in the Telescript environment could go. As described by Andy Hertzfeld:
“The beauty of Telescript,” says Andy, “is that now, instead of just having a device to program, we now have the entire Cloud out there, where a single program can go and travel to many different sources of information and create a sort of a virtual service.”
The use of the cloud metaphor is credited to General Magic communications employee David Hoffman, based on long-standing use in networking and telecom. In addition to use by General Magic itself, it was also used in promoting AT&T’s associated PersonaLink Services.
In July 2002, Amazon created subsidiary Amazon Web Services, with the goal to “enable developers to build innovative and entrepreneurial applications on their own.”
In March 2006, Amazon introduced its Simple Storage Service (S3), followed by Elastic Compute Cloud (EC2) in August of the same year. These products pioneered the use of server virtualization to deliver IaaS on a cheaper, on-demand pricing basis.
In April 2008, Google released the beta version of Google App Engine.
App Engine was a PaaS (one of the first of its kind) that provided fully maintained infrastructure and a deployment platform for users to create web applications using common languages/technologies such as Python, Node.js, and PHP. The goal was to eliminate the need for some administrative tasks typical of an IaaS model, while creating a platform where users could easily deploy such applications and scale them to demand.
NASA’s Nebula
In early 2008, NASA’s Nebula, enhanced in the RESERVOIR European Commission-funded project, became the first open-source software for deploying private and hybrid clouds, and for the federation of clouds.
Gartner
By mid-2008, Gartner saw an opportunity for cloud computing “to shape the relationship among consumers of IT services, those who use IT services and those who sell them” and observed that “organizations are switching from company-owned hardware and software assets to per-use service-based models” so that the “projected shift to computing…will result in dramatic growth in IT products in some areas and significant reductions in other areas.”
NSF
In 2008, the US National Science Foundation began the Cluster Exploratory program to fund academic research using Google–IBM cluster technology to analyze massive amounts of data.
Project Andromède
In 2009, the government of France announced Project Andromède to create a “sovereign cloud,” or national cloud computing, with the government to spend €285 million. The initiative failed badly, and Cloudwatt was shut down on 1 February 2020.
Azure
In February 2010, Microsoft released Microsoft Azure, which had been announced in October 2008.
In July 2010, Rackspace Hosting and NASA jointly launched an open-source cloud-software initiative known as OpenStack.
The OpenStack project was intended to help organizations offer cloud-computing services running on standard hardware. The early code came from NASA’s Nebula platform as well as from Rackspace’s Cloud Files platform. As an open-source offering, alongside other open-source solutions such as CloudStack, Ganeti, and OpenNebula, it has attracted attention from several key communities. Several studies compare these open-source offerings based on a set of criteria.
On March 1, 2011, IBM announced the IBM SmartCloud framework to support Smarter Planet.
Among the various components of the Smarter Computing foundation, cloud computing is a critical part. On June 7, 2012, Oracle announced the Oracle Cloud. This cloud offering was positioned as the first to provide users with access to an integrated set of IT solutions, including the Applications (SaaS), Platform (PaaS), and Infrastructure (IaaS) layers.
Google Compute Engine
In May 2012, Google Compute Engine was released in preview, before being rolled out to general availability in December 2013.
Linux on Microsoft Azure
In 2019, Linux was the most common OS used on Microsoft Azure. In December 2019, Amazon announced AWS Outposts, a fully managed service that extends AWS infrastructure, AWS services, APIs, and tools to virtually any customer data center, co-location space, or on-premises facility for a truly consistent hybrid experience.
With operating system-level virtualization essentially creating a scalable system of multiple independent computing devices, idle computing resources can be allocated and used more efficiently.
Virtualization provides the agility required to speed up IT operations and reduces cost by increasing infrastructure utilization. Autonomic computing automates the process through which the user can provision resources on demand. By minimizing user involvement, automation speeds up the process, reduces labor costs, and reduces the possibility of human error. Cloud computing uses concepts from utility computing to provide metrics for the services used. Cloud computing attempts to address the QoS (quality of service) and reliability problems of other grid computing models.
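The autonomic, on-demand provisioning described above is often realized as a control loop that compares observed load against current capacity and grows or shrinks the pool of virtual machines accordingly. A minimal sketch, where the thresholds, per-VM capacity, and in-memory "pool" are illustrative assumptions rather than any real cloud API:

```python
# Minimal sketch of an autonomic scaling loop: given observed load,
# add or remove capacity to keep utilization inside a target band.
# The thresholds and per-VM capacity are illustrative assumptions,
# not a real provider's API or defaults.

CAPACITY_PER_VM = 100      # requests/sec one VM can serve (assumed)
SCALE_UP_AT = 0.80         # grow the pool above 80% utilization
SCALE_DOWN_AT = 0.30       # shrink it below 30% utilization

def autoscale(vms: int, load: float) -> int:
    """Return the new VM count for the observed load (requests/sec)."""
    utilization = load / (vms * CAPACITY_PER_VM)
    if utilization > SCALE_UP_AT:
        # Provision enough VMs to bring utilization back under the band:
        # ceiling division of load by the per-VM capacity at the target.
        vms = -(-int(load) // int(CAPACITY_PER_VM * SCALE_UP_AT))
    elif utilization < SCALE_DOWN_AT and vms > 1:
        vms -= 1           # release one idle VM back to the shared pool
    return vms

pool = 2
for load in [120, 450, 450, 60, 60]:   # simulated traffic samples
    pool = autoscale(pool, load)
    print(f"load={load:>3} req/s -> {pool} VMs")
```

This is the sense in which automation "minimizes user involvement": no operator decides when to add the fourth machine; the loop does, using the same utilization metrics that utility-style billing is based on.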
Cloud computing shares characteristics with:
- Client–server model: client–server computing refers broadly to any distributed application that distinguishes between service providers (servers) and service requestors (clients).
- Computer bureau: a service bureau providing computer services, particularly from the 1960s to the 1980s.
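The provider/requestor split in the client–server model above can be illustrated with a minimal TCP request/response pair; the payload and the single-request handler are arbitrary choices for the example:

```python
# Minimal client-server illustration: a server (service provider)
# answers one request from a client (service requestor) over TCP.
# The payload and one-shot handler are arbitrary, for illustration only.
import socket
import threading

# Provider side: bind a listening socket; port 0 lets the OS pick a free port.
srv = socket.create_server(("127.0.0.1", 0))
port = srv.getsockname()[1]

def serve_once() -> None:
    """Accept a single connection and echo the request back."""
    conn, _addr = srv.accept()
    with conn:
        conn.sendall(b"echo: " + conn.recv(1024))

threading.Thread(target=serve_once, daemon=True).start()

# Requestor side: connect to the provider, send a request, read the reply.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello")
    reply = client.recv(1024)
print(reply)
srv.close()
```

The asymmetry is the defining feature: the server passively waits and answers, while the client initiates. Cloud services keep this shape but move the server side behind the provider's demarcation point.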