In this episode of Stuff You Should Know, the hosts explore the evolution of data centers from their military computing origins in the 1940s to their central role in today's digital economy. They trace how data centers developed from early business mainframes to modern cloud computing infrastructure, examining the companies and technologies that shaped this transformation.
The hosts also discuss how artificial intelligence is driving unprecedented demand for data center expansion, with tech giants investing heavily in GPU-powered facilities. This growth brings significant environmental challenges, as modern data centers consume resources equivalent to small towns. The episode puts these developments in perspective by comparing facilities like Meta's Hyperion data center to the energy requirements of major cities.

The evolution of data centers traces a remarkable journey from military computing to today's digital economy. Starting with the British Colossus and American ENIAC in the 1940s, these early military computers laid the foundation for modern computing, despite their limited storage capabilities.
The 1950s marked a pivotal shift as businesses began adopting on-site data processing. IBM emerged as an industry leader, with its System/360 mainframe supporting critical operations like the Apollo 11 mission. As personal computers and the internet gained prominence, data centers expanded to meet growing storage needs, enabling the rise of e-commerce and web services.
Cloud computing transformed data management by enabling remote data access without on-site infrastructure. Companies like Dropbox pioneered new business models by utilizing cloud services from providers like Amazon Web Services. This shift spurred the rapid growth of hyperscale data centers, with tech giants like Google, Amazon, and Microsoft leading the expansion.
Josh Clark explains that AI has dramatically changed data center requirements, with GPUs becoming essential due to their superior parallel processing capabilities. Major tech firms are investing heavily in AI infrastructure, with Morgan Stanley projecting nearly $3 trillion in data center spending from 2025 to 2030.
Chuck Bryant and Josh Clark emphasize that while these developments are important, they come with significant environmental challenges. Modern AI data centers consume as much electricity and water as a town of 50,000 people, raising serious sustainability concerns. Meta's Hyperion data center alone is expected to use about half of New York City's peak energy load, highlighting the mounting pressure on energy grids and natural resources.
1-Page Summary
The History and Evolution of Data Centers
The journey from the colossal machines of the mid-20th century to today’s ubiquitous digital economy was made possible by the evolution of data centers.
The earliest data centers were massive electronic machines used predominantly by the military. A notable first in programmable computing was the British Colossus, used in World War II to decrypt Hitler's intercepted messages. Built with cutting-edge technology like vacuum tubes alongside manual switches and plugs, Colossus was vital in decrypting communications, most notably between Hitler and Goebbels. Meanwhile, the American ENIAC, used to compute missile trajectories, demonstrated early data processing capabilities. Though neither machine was geared toward storage, both laid the groundwork for modern computing. Today, the home of Colossus at Bletchley Park, known as Block H, is preserved as the National Museum of Computing.
The 1950s represented a turning point as companies began to process data on-site, stepping beyond military applications. Mainframes, initially a term for the cabinets containing telecommunications gear, became vital in business settings. Lyons, a UK tea shop chain, was a pioneer in this regard: its LEO (Lyons Electronic Office) computer handled payroll and stock management while also running computations for the Ministry of Defence.
IBM emerged as a leader in this space in the early 1950s; an early sign of its clout was a mainframe unit leased in 1952 for $16,000 a month. Clark highlights IBM's System/360 of the 1960s as a benchmark in mainframe technology, supporting endeavors like the Apollo 11 moon mission. Even today, mainframes remain in use at organizations such as Visa and healthcare providers, prized for unparalleled reliability and security.
With the rise of personal computers and the internet, the need for dat ...
Impact of Cloud Computing on Data Centers
The emergence of cloud computing has revolutionized data storage and processing, leading to significant changes in how data centers operate and the services they provide.
Cloud computing has profoundly affected data management by allowing for remote data access without needing on-site infrastructure. This technology emerged in the early 2000s and marked a notable departure from traditional data storage methods. Rather than keeping data localized, cloud computing relies on third-party managed off-site data centers, enabling interconnected storage and access.
An example of the transformative power of cloud computing is Dropbox's business model. The company capitalized on cloud services by purchasing storage from Amazon Web Services, then reselling it in tiered plans to end users. This approach demonstrates a scalable, cloud-based data storage service model that was not possible before the cloud era.
The cloud shift has acc ...
AI's Growing Role and GPU-powered Data Center Demand
The rise of AI technologies like OpenAI's ChatGPT has led to a surge in demand for high-performance data centers equipped with powerful GPUs, creating both opportunities and significant environmental challenges.
Josh Clark states that the advent of AI has considerably changed data center operations, as CPUs are no longer sufficient for AI's processing demands. GPUs have become essential because their parallel processing capabilities let them run many operations simultaneously, which is crucial for AI workloads. NVIDIA chips, in particular, are in high demand among major companies building AI applications, and that demand has driven up prices for ordinary NVIDIA graphics cards, leaving gamers struggling to buy them.
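To make the parallelism point concrete, here is a minimal Python sketch (not from the episode; NumPy stands in for the GPU libraries real AI systems rely on). Computing a matrix product one cell at a time, the way a lone processor core would, is far slower than handing the whole operation to a library that fans the arithmetic out across many units at once, and GPUs push that idea to the extreme with thousands of cores.

```python
import time
import numpy as np

# Small stand-ins for the matrices an AI model multiplies constantly;
# production models use far larger ones, which is why GPU parallelism matters.
size = 200
a = np.random.rand(size, size)
b = np.random.rand(size, size)

def matmul_loops(x, y):
    """Multiply matrices one cell at a time, like a single core with no parallelism."""
    n, k = x.shape
    m = y.shape[1]
    out = np.zeros((n, m))
    for i in range(n):
        for j in range(m):
            for p in range(k):
                out[i, j] += x[i, p] * y[p, j]
    return out

start = time.time()
slow = matmul_loops(a, b)
print(f"one operation at a time: {time.time() - start:.2f}s")

start = time.time()
fast = a @ b  # one call, spread across many arithmetic units simultaneously
print(f"parallel/vectorized:     {time.time() - start:.4f}s")

print("results match:", np.allclose(slow, fast))
```

On most machines the looped version takes seconds while the vectorized call finishes in milliseconds; GPUs widen that gap enormously, which is why they have become indispensable for AI workloads.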
AI data centers demand significant investment because running complex models takes hundreds of thousands of GPUs. xAI's Colossus machine in Memphis, Tennessee, exemplifies this scale of infrastructure, housing 200,000 GPUs. Financial giants are recognizing the trend: Morgan Stanley estimates nearly $3 trillion will be spent on data centers from 2025 to 2030, half of it on hardware. Microsoft, Amazon, Google, and Meta are making monumental investments, with Microsoft committing $30 billion to UK data centers and planning 100 more AI data centers in the UK alone.
The environmental implications of this AI data center boom are profound. Data centers consume as much electricity and water as a town of 50,000 people. They use water for evaporative cooling to counteract the heat generated by the proces ...
