The Advent and Future of Cloud Computing Technology

This article is an excerpt from the Shortform book guide to "Thank You for Being Late" by Thomas L. Friedman. Shortform has the world's best summaries and analyses of books you should be reading.


What exactly is the cloud? How has the advent of cloud computing technology changed the world?

The cloud is a term for services and software that run on the internet instead of locally on your computer (for example, Netflix, Dropbox, or Microsoft Office 365). Like networking, the cloud is a major driver of acceleration in every other part of computing because it allows any type of technology to be constantly updated, improved, or shared.

In this article, we’ll discuss how the cloud came to be and its implications for the future of computing.

The Power of the Cloud

The advent of cloud computing technology is unique because it’s both powerful and far-reaching. In the past, tools were either one or the other.

The cloud increases the power of:

  • Machines. Machines now have most of the same five senses as people and a “brain” (computer) to interpret them. For example, machines have cameras to see things, and their programming can recognize subjects in images and compare them. They have microphones and can recognize and produce speech. They can touch—they can drive cars or vacuum while navigating around the furniture in your house. People are working on teaching computers to work with taste and smell.
  • Flows. Flows are the movement of ideas, intangible goods, and communication around the world. For example, online banking is a flow of money among people, banks, and industries all over the world.
  • Individual reach. Before the supernova, a single person was limited in how many people they could connect with and influence; one person acting alone could harm or help only one other person at a time. Now, a single person who invents a life-saving vaccine can share that medicine with the entire world.
  • Collective reach. Humans now have the ability to affect global systems such as climate and ecosystems.

An example of the increasing power of machines is the design of jet engine parts. Pre-supernova, when GE designed a new jet-engine part, it was a two-year process. They had to design the part, build the tools that could build the part, build a prototype using the new tools, build the actual part, and then test it. Today, it takes a week. GE designs the part on a computer, sends it to a 3-D printer, and then tests it multiple times a day.

How Did We Get Here?

The dot-com boom of the 1990s and early 2000s paved the way for the supernova. During the boom, people and companies overinvested in internet technology, which resulted in fiber-optic cables being laid all over the world and a decrease in the cost of connectivity. The internet became easy, fast, free, and universal.

The supernova emerged in 2007, and it made things less complicated. Complexity became like the internet: easy, fast, free, and, on top of that, invisible. For example, consider Amazon’s “one-click” checkout. From the customer’s perspective, you only have to click one button to place an order. Behind the scenes, many complicated processes and lines of code are running, but from the customer’s end, things have never been simpler. This benefits Amazon: the easier it makes checkout, the more likely the customer is to complete a purchase.
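To make that pattern concrete, here’s a minimal, purely hypothetical sketch of a single “one-click” call with the complicated backend steps hidden behind it. None of these function or service names are Amazon’s; they’re placeholders for illustration.

```python
# Hypothetical sketch: one call the customer triggers, hiding the complexity behind it.
# The function and service names are invented for illustration, not Amazon's.

def one_click_order(customer_id: str, item_id: str) -> str:
    """The only step the customer sees: one call, one click."""
    charge_saved_card(customer_id, item_id)                # billing system
    warehouse = reserve_inventory(item_id)                 # inventory system
    shipment = schedule_delivery(customer_id, warehouse)   # logistics system
    send_confirmation_email(customer_id, shipment)         # notification system
    return shipment

# Each stub below stands in for an entire backend service the customer never sees.
def charge_saved_card(customer_id, item_id): return {"status": "charged"}
def reserve_inventory(item_id): return "warehouse-42"
def schedule_delivery(customer_id, warehouse): return "tracking-123"
def send_confirmation_email(customer_id, shipment): pass

print(one_click_order("customer-1", "item-9"))  # the customer only ever sees this one call
```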

The supernova is transforming all industries around the world. Previously, problems were expensive and complicated to solve because businesses didn’t have access to the information they needed to create a solution. Now, however, it’s practically free to get all sorts of information.

The Eras of Computing

According to John E. Kelly III, an IBM senior vice president, there are three eras of computing:

Era #1: The “Tabulating Era.” This era lasted from the 1900s to the 1940s and was made up of machines that did a single task, such as punch-card systems.

Era #2: The “Programming Era.” This era lasted from the 1950s to around the 2000s. In it, people programmed machines using software and algorithms, which are series of rules and steps. The rules and steps are determined by people, and while programmable computers are powerful, they can only do what they’ve been programmed to do, and they can only handle certain kinds of data. Home computers, smartphones, and the internet are part of this era.

  • For example, when IBM was building a program to translate from English to Spanish, they hired linguists and asked them to teach the programmers grammar. Once the programmers understood the languages, they could then write a series of rules for translation. However, this approach didn’t work.
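A toy Python sketch of what that rule-based approach looks like, and why it struggles, may help. The dictionary and rules here are invented for illustration and are not IBM’s actual system.

```python
# Hypothetical toy example of rule-based translation (not IBM's system).
# Programmers hand-write dictionaries and grammar rules; the machine only applies them.

english_to_spanish = {"the": "el", "cat": "gato", "sleeps": "duerme"}

def rule_based_translate(sentence: str) -> str:
    # Apply the hand-written word rules one at a time.
    words = sentence.lower().split()
    return " ".join(english_to_spanish.get(w, f"<unknown:{w}>") for w in words)

print(rule_based_translate("The cat sleeps"))  # "el gato duerme"
print(rule_based_translate("Time flies"))      # breaks: idioms and ambiguity have no rule
```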

Era #3: The “Cognitive Era.” This era started in 2007 (it required advances in processing power, storage, networking, software, and the supernova) and is still unfolding. Cognitive computers are designed to adapt to different situations and types of information. They don’t give a single correct solution the way a programmable computer does; they give a list of the most likely answers and each answer’s probability of being correct.

  • For example, when IBM’s programmable translation project failed, the company took a new approach. It abandoned teaching the computers languages and instead taught them to compare pairs of human-translated texts and determine which translation is most accurate. If a computer processes enough examples, it starts recognizing patterns of what’s right and wrong. Neither programmers nor computers need to learn a language; translation becomes a matter of statistics.
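Here’s a toy, hypothetical sketch of the statistical idea: instead of grammar rules, the program counts how often English and Spanish words appear together in human-translated sentence pairs, then scores candidate translations by those counts. The tiny corpus and scoring function are invented for illustration; real systems use vastly more data and more sophisticated models.

```python
# Hypothetical toy example of the statistical approach (not IBM's actual system).
# The computer never "learns Spanish"; it counts co-occurrences in human-translated
# sentence pairs and scores candidate translations by those counts.

from collections import Counter

# A tiny stand-in for a large corpus of human-translated sentence pairs.
parallel_corpus = [
    ("the cat sleeps", "el gato duerme"),
    ("the cat eats", "el gato come"),
    ("the dog sleeps", "el perro duerme"),
]

# Count English/Spanish word co-occurrences across aligned sentence pairs.
cooccurrence = Counter()
for english, spanish in parallel_corpus:
    for e in english.split():
        for s in spanish.split():
            cooccurrence[(e, s)] += 1

def score(english: str, candidate: str) -> int:
    """Higher score = candidate looks more like past human translations."""
    return sum(cooccurrence[(e, s)] for e in english.split() for s in candidate.split())

candidates = ["el gato duerme", "el perro come"]
best = max(candidates, key=lambda c: score("the cat sleeps", c))
print(best)  # "el gato duerme" wins purely on statistics, not grammar
```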

Will Cognitive Computers Take Over the World?

Some people are scared of cognitive computers and worry that they might take over the world. This is unlikely for two reasons: 

  • Reason #1: As we learned from the translation example above, machines don’t “learn” the same way that people do. Machine learning is based on statistics, pattern recognition, and examples.
  • Reason #2: Cognitive computers learn about specific subjects, such as geology or geography. A computer that’s designed to understand geology is going to be able to keep up with developments in that field, but it’s not going to start doing anything else.

Watson

Watson was the first cognitive computer—you may remember it from its appearance on Jeopardy! in 2011.

Watson “learned” to play Jeopardy! the way any cognitive computer learns: through pattern recognition and statistics. One set of algorithms helped Watson understand what the question was asking (was the answer a place, a date, and so on). Another set searched all the data Watson had access to and analyzed the probability that each piece of that data was the correct answer. Each candidate answer was given a degree of confidence, and if the confidence level was high enough, Watson would give the answer. Then, once Watson’s answer was determined to be right or wrong, that result became one more example in Watson’s database, feeding back into its statistical calculations.
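The answer-with-confidence pattern described above can be sketched in a few lines of Python. Everything here, including the function, the made-up candidate scores, and the 0.7 threshold, is hypothetical rather than Watson’s actual code.

```python
# Hypothetical sketch of the answer-with-confidence pattern described above.
# Candidate answers are scored, and the system only answers when its best
# candidate clears a confidence threshold.

def answer_question(question, candidates, threshold=0.7):
    """candidates maps each possible answer to an estimated probability of being correct."""
    best_answer, confidence = max(candidates.items(), key=lambda item: item[1])
    if confidence >= threshold:
        return best_answer, confidence  # confident enough to buzz in
    return None, confidence             # stay silent rather than guess

# Example: in a real system the probabilities would come from searching and scoring data.
scores = {"Toronto": 0.14, "Chicago": 0.88, "Springfield": 0.31}
print(answer_question("Which U.S. city ...?", scores))  # ('Chicago', 0.88)
```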

The latest version of Watson lives in the supernova and is currently learning all known medical research. Watson isn’t just a repository of information, however; it helps doctors with diagnostics.

For example, if IBM gives Watson 3,000 images, 6% of which show melanoma, Watson will use its algorithm to determine distinguishing features of melanomas such as shape and color. Then, when it’s given an image of a patient, it can identify whether it shows a melanoma or not. If Watson handles the diagnostics, then a doctor can focus on the patient, leveraging the human-only skills of judgment and empathy.
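A minimal sketch of learning distinguishing features from labeled examples looks something like the following, assuming each image has already been reduced to a couple of made-up feature scores. This is a toy nearest-centroid classifier for illustration only, not IBM’s actual method.

```python
# Hypothetical sketch of learning "distinguishing features" from labeled examples.
# Each image is reduced to two invented feature numbers, and a new image is labeled
# by whichever group of past examples it sits closer to.

def centroid(rows):
    """Average each feature across a list of (feature1, feature2) examples."""
    n = len(rows)
    return tuple(sum(r[i] for r in rows) / n for i in range(2))

def distance_sq(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Toy training data: (irregular_shape_score, color_variation_score), label
training = [
    ((0.9, 0.8), "melanoma"), ((0.8, 0.9), "melanoma"),
    ((0.2, 0.1), "benign"),   ((0.1, 0.3), "benign"), ((0.3, 0.2), "benign"),
]

melanoma_center = centroid([f for f, label in training if label == "melanoma"])
benign_center = centroid([f for f, label in training if label == "benign"])

def classify(features):
    """Label a new image by whichever group of past examples it sits closer to."""
    if distance_sq(features, melanoma_center) < distance_sq(features, benign_center):
        return "melanoma"
    return "benign"

print(classify((0.85, 0.7)))  # "melanoma"
print(classify((0.15, 0.2)))  # "benign"
```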

Intelligence and Creativity

In the 21st century, how much knowledge you can keep in your head won’t be a measure of your intelligence because technology can hold knowledge for you. Instead, intelligence will be measured by your ability to work with technology to access the information you need.

Additionally, creativity will no longer be about coming up with ideas; it will be about asking the right questions. A person doesn’t have to actually create a design; they just have to pick the best of the options a computer comes up with.

The Supernova and the Economy

Though the supernova has been around for years now, its effects aren’t yet showing up in productivity, the ratio of output (goods and services) to input (the labor needed to produce those goods and services). Productivity improvements usually drive growth, so economists debate why the ratio isn’t changing. There are two schools of thought:

  • We’re past the days of steadily rising growth. Economist Robert Gordon thinks all the really big innovations (such as air travel and women’s empowerment) have already happened, so we should expect growth to level off.
  • The economy is still adjusting and adapting. Business professor Erik Brynjolfsson thinks we will see growth, but we’re still in the transition phase. This delay has happened before in human history. When electricity was introduced, for example, it took time to redesign the world’s infrastructure: it wasn’t simply a matter of retrofitting existing factories with electricity; completely new factories had to be designed to make the most efficient use of it. The supernova could be the same situation: factories, businesses, and governments are still working out how to set up the supernova in a way that lets them access its full potential.

A 2015 study found a large gap between the most digitized industries and everyone else, even though all industries are adopting the supernova. The less digitized sectors are the big players in terms of GDP, so they weigh heavily on the overall numbers. The study concluded that the US economy is operating at only 18% of its digital potential.

Even if the numbers don’t show that our economies are more productive yet, it’s obvious that technology is making our world more powerful.

All Industries Will Become Computable

As technology and the supernova accelerate, everything is becoming digitizable and all industries will become computable. There’s a three-step process:

  • Digitization. Analog methods become digital.
    • For example, Uber digitized the process of hailing a cab—instead of standing on the street and waving at traffic, you can use your phone to book a ride.
  • Disruption. Digitization changes the way companies do business.
    • The emergence of Uber affected the entire taxi industry. People choose to ride with Uber instead of a taxi because Uber is more economical.
  • Democratization. Digitization and disruption allow anyone to get involved with any industry.
    • Anyone with a car can become a taxi driver anywhere that Uber is legal.

Extended Industry Example: Airbnb

Airbnb is a company that allows anyone in the world to turn their vacation home, spare bedroom, or couch into a bed and breakfast. Someone with rental space posts a description of it on Airbnb’s website, and then someone looking for accommodation gets in touch via the website, pays via the website, and stays in the host’s home. Airbnb doesn’t own any property, but the company is larger than all the major hotel chains.

Airbnb relies on the supernova. It came into being in 2007, at that sweet spot when the internet became easy, fast, free, and universal, and complexity became easy, fast, free, and invisible. Airbnb was possible because:

  • People were already comfortable paying for things online.
  • The technology existed for people to link their online identities to their real identities—critical for a platform that relies on strangers sleeping in the same space.
  • Smartphones allowed people to easily and quickly take pictures of their spaces.
  • Messaging systems existed, allowing people to talk about travel details and get to know each other in advance.
  • Rating systems existed—also critical to establishing trust between strangers.

———End of Preview———

Like what you just read? Read the rest of the world's best book summary and analysis of Thomas L. Friedman's "Thank You for Being Late" at Shortform.

Here's what you'll find in our full Thank You for Being Late summary :

  • The problems that arise when the world changes faster than humanity can adapt
  • How to adapt to technology, globalization, and climate change
  • The importance of taking time to reflect and reorient

