
Four CEOs on the Future of AI: CoreWeave, Perplexity, Mistral, and IREN

By All-In Podcast, LLC

In this episode of All-In, industry leaders discuss the evolution of data centers from cryptocurrency mining to AI computing infrastructure. The conversation explores how companies like CoreWeave and IREN have adapted to market changes by positioning their facilities near renewable energy sources and optimizing them for AI workloads, while facing challenges in GPU access and supply chain constraints.

The discussion also examines AI's broader effects on the workforce and local economies. The guests explain how AI technology is making it easier for individuals to run businesses independently by automating previously labor-intensive tasks, while also creating new opportunities through community development initiatives, including partnerships with educational institutions and local infrastructure investment programs.


This is a preview of the Shortform summary of the Mar 23, 2026 episode of All-In with Chamath, Jason, Sacks & Friedberg.



1-Page Summary

Transition From Bitcoin to AI In Data Centers

CoreWeave founder Michael Intrator discusses his company's evolution from cryptocurrency mining to AI computing solutions. Initially using crypto mining to bootstrap its operations, CoreWeave pivoted away from crypto's volatility toward the more stable and growing AI sector by 2020-2021. The company now strategically positions its data centers near renewable energy sources, optimizing them for AI model training and inference workloads.

Challenges Of Building Large-Scale Data Centers For AI

Daniel Roberts and Michael Intrator outline the complexities of developing AI-focused data centers. While IREN secured adequate power through early land acquisition, Roberts notes that responding to digital demand curves remains challenging, particularly regarding supply chain constraints. Intrator describes the competitive process of accessing GPUs, highlighting industry-wide supply limitations that extend beyond processors to components like memory and networking equipment.

AI & Automation's Impact on Workforce & Job Market

The discussion reveals AI's transformative effect on work and local economies. Intrator emphasizes how AI is democratizing innovation by lowering operational barriers, while Srinivas points to AI's ability to automate traditionally time-consuming tasks. The technology enables single individuals to run businesses more efficiently, as noted by Calacanis, who reflects on how AI tools now handle tasks that once required dedicated professionals.

Roberts describes IREN's commitment to community development, partnering with educational institutions to train future technicians and investing in local infrastructure. The company strategically places data centers in areas with existing electrical infrastructure, often revitalizing communities affected by industrial decline through local hiring and retraining initiatives.


Additional Materials

Clarifications

  • Bootstrapping operations means using initial resources or profits to fund and grow a business. Bitcoin mining generates revenue by validating transactions and earning cryptocurrency rewards. CoreWeave used profits from mining to invest in infrastructure and technology for AI computing. This approach helped it build a financial foundation before shifting focus.
  • Cryptocurrency mining is volatile because its profitability depends on fluctuating coin prices and network difficulty. Market prices for cryptocurrencies can change rapidly due to speculation, regulation, or technological shifts. In contrast, AI computing serves a growing, stable demand for data processing and model training. This steady demand reduces financial unpredictability compared to mining.
  • AI model training involves feeding large amounts of data into algorithms so they can learn patterns and make predictions. Inference workloads occur when the trained model processes new data to generate outputs or decisions in real time. Training requires significant computational power and time, while inference is typically faster but still demands efficient hardware. Both tasks are critical for developing and deploying AI applications effectively.
  • Digital demand curves in data center development represent the fluctuating need for computing resources over time. These curves show how demand rises and falls based on factors like user activity, application usage, and market trends. Understanding them helps data centers optimize capacity and energy use to meet peak loads without overspending. Managing these curves is complex due to unpredictable spikes and long-term growth patterns.
  • Supply chain constraints for data center hardware include shortages of semiconductors, which are critical for GPUs and CPUs. Manufacturing delays arise from limited production capacity and geopolitical tensions affecting supply routes. Additionally, specialized components like high-speed memory and networking gear face long lead times due to complex fabrication processes. These factors collectively slow down hardware availability and increase costs.
  • GPUs (Graphics Processing Units) are essential for AI because they can perform many calculations simultaneously, speeding up the training of complex AI models. The demand for GPUs is very high due to the rapid growth of AI applications, causing supply shortages. Companies compete intensely to secure enough GPUs to power their data centers and maintain performance. This competition drives up prices and limits availability, impacting AI development timelines.
  • Memory in AI data centers stores the large datasets and intermediate results needed for training and running AI models efficiently. Networking equipment connects servers and storage, enabling fast data transfer and communication essential for distributed AI workloads. High-speed, low-latency networks reduce delays and bottlenecks during model training and inference. Both components are critical to maintaining performance and scalability in AI computing environments.
  • AI "democratizes innovation" by making advanced tools and technologies accessible to a wider range of people and businesses, not just experts or large companies. It lowers operational barriers by automating complex tasks, reducing the need for specialized skills and expensive resources. This enables individuals and small teams to develop, test, and deploy new ideas faster and more cost-effectively. As a result, innovation becomes more inclusive and widespread across different industries and communities.
  • AI automates tasks such as data entry, scheduling, customer support, and content generation. It also handles complex processes like data analysis, pattern recognition, and decision-making. These tasks traditionally required significant human time and effort. Automation speeds up workflows and reduces errors.
  • Partnering with educational institutions helps create a skilled workforce tailored to the company's technical needs. It ensures a steady pipeline of qualified technicians familiar with the latest technologies. This collaboration supports local employment and economic growth by providing relevant training and job opportunities. It also helps reduce hiring costs and improves employee retention through community engagement.
  • Industrial decline occurs when factories and traditional industries shut down, leading to job losses and economic downturn in communities. This often results in reduced local income, population decline, and deteriorating infrastructure. Data centers can revitalize these areas by creating new jobs, attracting investment, and upgrading electrical and digital infrastructure. This shift helps stabilize local economies and provides new career opportunities for residents.
  • Locating data centers near renewable energy sources reduces carbon emissions and lowers long-term energy costs. Proximity to existing electrical infrastructure ensures reliable power supply and reduces the expense and time needed to build new power lines. It also minimizes energy loss during transmission, improving overall efficiency. These factors support sustainable, cost-effective, and scalable data center operations.
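The bootstrapping and volatility points in the bullets above come down to one back-of-the-envelope formula: a miner's expected revenue is its share of network hash power times total daily coin issuance, valued at the current price. The sketch below uses invented figures, not numbers from the episode:

```python
def daily_mining_revenue(miner_hashrate, network_hashrate,
                         block_reward, blocks_per_day, coin_price):
    """Expected daily revenue: your share of network hash power
    times the total coin issuance per day, at the current price."""
    share = miner_hashrate / network_hashrate
    coins_per_day = block_reward * blocks_per_day
    return share * coins_per_day * coin_price

# Illustrative only: a 1 PH/s miner on a 500 EH/s network,
# 3.125 BTC block reward, ~144 blocks/day.
rev_high = daily_mining_revenue(1e15, 500e18, 3.125, 144, 60_000)
rev_low  = daily_mining_revenue(1e15, 500e18, 3.125, 144, 20_000)  # price crash
print(round(rev_high, 2), round(rev_low, 2))  # 54.0 18.0
```

Revenue scales linearly with coin price while electricity costs stay fixed, which is exactly the volatility that pushed these operators toward steadier AI workloads.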

Counterarguments

  • While CoreWeave's pivot to AI may have been strategic, it's important to consider that the AI sector also has its own forms of volatility, such as regulatory changes, ethical concerns, and rapid technological advancements that could disrupt current business models.
  • The reliance on renewable energy sources is commendable, but it may not always be feasible or cost-effective in all locations, potentially limiting the scalability and placement of data centers.
  • Securing adequate power and managing supply chain constraints are indeed challenges, but these issues are not unique to AI-focused data centers and are faced by various industries, suggesting that broader systemic solutions are needed.
  • The competitive access to GPUs and other components could lead to increased costs, which might be passed on to consumers or result in reduced profitability for data center operators.
  • AI's ability to democratize innovation and automate tasks is significant, but it could also lead to job displacement and require substantial societal adjustments to address the needs of the workforce.
  • The claim that AI enables individuals to run businesses more efficiently does not account for the learning curve and potential barriers to entry associated with understanding and implementing AI technologies.
  • IREN's commitment to community development and local hiring is positive, but it's important to ensure that these initiatives are sustainable and that the jobs created are of high quality and offer long-term prospects.
  • Placing data centers in areas with existing electrical infrastructure is practical, but it may also lead to concerns about resource allocation, environmental impact, and the perpetuation of centralization in certain regions.


Transition From Bitcoin to AI In Data Centers

Intrator, the founder of CoreWeave, discusses the company's strategic pivot from cryptocurrency mining to AI compute solutions in the face of heightened demand and market volatility.

CoreWeave Shifted From Crypto Mining to AI Compute Amid Demand Surge

CoreWeave, initially established for cryptocurrency mining, used mining revenue to fund and bootstrap its data center operations. However, as volatility in the crypto market became a significant challenge, the company began exploring more stable and diverse computing uses.

Cryptocurrency Funded CoreWeave's Data Centers, but the Company Shifted Focus to AI's Potential

Intrator mentions that since its inception in 2017, CoreWeave has weathered several "crypto winters," periods when the value of cryptocurrency fell dramatically. These challenges prompted CoreWeave to look beyond the unstable world of crypto and consider the burgeoning field of AI. By 2020 and 2021 the shift was clear, as CoreWeave focused its efforts on accommodating the computational demands of neural networks and other AI applications.

CoreWeave Builds Efficient Data Centers Optimized For AI Workloads

Leveraging the cash flow generated in its early days of crypto mining, CoreWeave has been transitioning its infrastructure to better support the specific needs of AI computing.

Company Locates Facilities Near Renewable Energy For Sustainable, Cost-Effective Power

CoreWeave strategically locates its data centers near renewable energy sources to ensure the sustainability and cost-effectiveness of its operations. Th ...


Additional Materials

Clarifications

  • "Crypto winters" refer to extended periods when cryptocurrency prices fall sharply and remain low for months or years. These downturns reduce investor confidence and slow market activity, causing financial strain for companies reliant on crypto revenue. They often follow speculative bubbles and can lead to reduced funding and operational challenges in the crypto industry. The term highlights the cyclical and volatile nature of cryptocurrency markets.
  • Cryptocurrency mining involves using powerful computers to solve complex mathematical problems that validate transactions on a blockchain. Miners earn cryptocurrency rewards for their work, which can be sold for cash. This revenue provides funds to cover the high costs of running and expanding data center infrastructure. Thus, mining profits help finance equipment, electricity, and maintenance expenses.
  • AI compute solutions refer to specialized hardware and software systems designed to handle the large-scale processing required for training and running artificial intelligence models. These solutions often include GPUs, TPUs, or other accelerators optimized for parallel computations. They support tasks like neural network training, data inference, and real-time AI applications. Efficient AI compute solutions reduce latency and energy consumption while maximizing performance.
  • Neural networks are complex algorithms modeled after the human brain that require large amounts of data and calculations to learn patterns. AI applications use these networks to perform tasks like image recognition, language processing, and decision-making. Training neural networks involves repeatedly adjusting millions or billions of parameters, demanding significant computational power and memory. This process requires specialized hardware, such as GPUs or TPUs, to handle the intense parallel processing efficiently.
  • Locating data centers near renewable energy sources reduces reliance on fossil fuels, lowering carbon emissions. It helps companies meet regulatory and consumer demands for sustainability. Proximity to energy sources also minimizes transmission losses, improving efficiency. Additionally, renewable energy can offer more stable and potentially lower long-term power costs.
  • Low-latency computing power means minimal delay between data input and processing output. It is crucial for AI tasks that require real-time or near-real-time responses, such as autonomous driving or interactive applications. Lower latency improves efficiency and accuracy in AI model training and inference. This capability helps data centers handle complex AI workloads faster and more effectively.
  • AI model training is the process where a model learns patterns from large datasets by adjusting its internal parameters. Inference is when the trained model uses what it has learned to ma ...
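The training-versus-inference split in the bullets above can be shown with a deliberately tiny model: training repeatedly adjusts a parameter against data, while inference is a single cheap forward pass. This is an illustrative toy, not anything discussed in the episode:

```python
# Training: iteratively fit a single parameter w so that w * x ≈ y.
# Inference: apply the learned w to new inputs in one step.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # samples of y = 2x

def train(data, lr=0.01, epochs=200):
    w = 0.0                          # single learnable parameter
    for _ in range(epochs):          # expensive, iterative phase
        for x, y in data:
            grad = 2 * (w * x - y) * x   # d/dw of squared error
            w -= lr * grad
    return w

def infer(w, x):
    return w * x                     # cheap, single forward pass

w = train(data)
print(round(w, 3), round(infer(w, 10.0), 2))  # 2.0 20.0
```

Real models differ only in scale: billions of parameters instead of one, which is why training demands far more compute than serving.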


Challenges Of Building Large-Scale Data Centers For AI

Building and outfitting data centers to support the computational demands of AI is a formidable challenge. Daniel Roberts of IREN and Michael Intrator of CoreWeave describe the process of developing these centers and the bottlenecks they encounter in sourcing the necessary hardware.

Building and Outfitting Data Centers Is a Challenge

Roberts provides insight into the initial stages of creating a data center: securing land, obtaining permits, and ensuring grid connections are in place. He shares the company's surprise at the scale of its flagship Texas site, which boasts a capacity of 750 megawatts. While the power industry views power itself as the constraint, for IREN it is not an issue, because the company secured land and power early. The real challenge, Roberts says, is responding to digital demand curves: the strain they place on supply chains affects every aspect of construction, down to components such as memory.
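Roberts's 750-megawatt figure can be put in rough perspective with some power-budget arithmetic. The overhead factor and per-GPU draw below are illustrative assumptions, not figures from the episode:

```python
# Rough power-budget arithmetic for a 750 MW site. The PUE and
# per-GPU draw are illustrative assumptions, not episode figures.

site_mw = 750                 # stated flagship-site capacity
pue = 1.3                     # assumed facility overhead (cooling, losses)
kw_per_gpu_all_in = 1.4       # assumed: GPU plus its share of server/network

usable_it_mw = site_mw / pue                  # power left for IT equipment
gpus = usable_it_mw * 1000 / kw_per_gpu_all_in
print(f"~{gpus:,.0f} accelerators")           # order of magnitude only
```

Under these assumptions the site supports roughly four hundred thousand accelerators, which is why a single facility of this size is material to industry-wide GPU demand.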

Sourcing and Installing Power, Cooling, Networking, and Compute Hardware Is an Ongoing Battle Against Supply Constraints

Intrator speaks to the competitive process of getting access to GPUs: demand is so high that customer orders are served strictly in order of arrival, without preferential treatment, a stark testament to the supply constraints. He recalls a period of disequilibrium when the world lacked enough GPUs for all AI use cases. Constraints extend beyond GPUs to other critical components, including electricity, power shells, memory, storage, networking, and optics.
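The first-come, first-served allocation Intrator describes can be sketched as a simple queue that fills orders strictly in arrival order until supply runs out (an illustrative model, not CoreWeave's actual system):

```python
from collections import deque

def allocate(orders, supply):
    """orders: list of (customer, gpus_requested) in arrival order.
    Fill each order in turn until the available supply is exhausted."""
    queue, filled = deque(orders), {}
    while queue and supply > 0:
        customer, wanted = queue.popleft()
        granted = min(wanted, supply)
        filled[customer] = granted
        supply -= granted
    for customer, _ in queue:   # anyone still queued gets nothing
        filled[customer] = 0
    return filled

print(allocate([("A", 500), ("B", 300), ("C", 400)], supply=700))
# {'A': 500, 'B': 200, 'C': 0}
```

The sketch makes the fairness trade-off concrete: arrival order, not size or urgency, decides who gets hardware, so late arrivals can be shut out entirely.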

Daniel Roberts echoes the sentiment, mentioning a demand surge they cannot meet and highlighting the supply constraint in providing the necessary computing power for AI.

Managing Rapid Technological Change In AI Is Critical

IREN Must Upgrade Facilities for Latest GPU Architectures and Hardware Innovations

While no specific details are provided about upgrading for the latest GPU architectures, it's clear IREN must continually update its facilities to stay current. The company is actively replacing Bitcoin-mining hardware with AI chips to accommodate the burgeoning demand for AI computation. With Nvidia ...


Additional Materials

Clarifications

  • Grid connections refer to the infrastructure linking a data center to the regional electrical power grid. They include substations, transformers, and transmission lines that deliver stable, high-capacity electricity. Reliable grid connections ensure continuous power supply, preventing outages that could disrupt data center operations. They also allow data centers to draw the large amounts of electricity needed for AI computing workloads.
  • Digital demand curves represent the fluctuating and often rapidly increasing need for digital resources like computing power and data storage. These curves impact supply chains by creating unpredictable spikes in demand, which strain the availability of hardware components. This volatility makes it difficult for suppliers to forecast and meet orders promptly. Consequently, manufacturers and data centers face delays and shortages in critical equipment.
  • In data center construction, a "power shell" (powered shell) is a building delivered with its core electrical infrastructure in place, such as the grid connection, transformers, switchgear, and power distribution, but without the tenant's racks and IT equipment. Delivering powered shells lets operators install compute hardware quickly as it arrives. Because transformers and switchgear have long lead times, powered shells are themselves a supply-constrained input to data center buildouts.
  • GPUs (Graphics Processing Units) are essential for AI because they can perform many calculations simultaneously, speeding up the training and inference of complex AI models. Their architecture is optimized for the matrix and vector operations common in machine learning tasks. High demand arises from the rapid growth of AI applications requiring massive computational power. Limited manufacturing capacity and the specialized nature of GPUs further constrain supply.
  • Bitcoin-related hardware primarily refers to specialized machines called ASICs designed for cryptocurrency mining. These devices consume significant power but are optimized for a specific task, unlike versatile AI chips. As AI demand grows, data centers repurpose space and power from less flexible Bitcoin miners to more efficient AI processors. This shift maximizes computational resources for AI workloads, which require different hardware capabilities.
  • Nvidia is a leading manufacturer of GPUs, which are specialized processors optimized for AI and graphics tasks. Google, Amazon, and Meta design custom silicon chips tailored to their specific AI workloads, improving efficiency and performance beyond standard GPUs. These companies invest heavily in hardware innovation to gain competitive advantages in AI computing. Their developments influence data center design and the availability of cutting-edge AI processing power.
  • "Time to compute" refers to the total duration required to process AI models and generate results. It is a hurdle because longer computation times slow down development, experimentation, and deployment of AI applications. Faster computation enables quicker iteration and real-time responsiveness, which are critical for advancing AI capabilities. Reducing "time to compute" demands powerful hardware and efficient data center infrastructure.
  • Large language models are AI systems trained on vast text data to understand and generat ...
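The capacity-planning tension behind "digital demand curves" can be illustrated with a toy demand series: facilities are sized to the peak, so spiky demand forces over-building relative to average utilization. All figures are invented:

```python
# Toy hourly demand for compute capacity (MW); invented figures.
hourly_demand_mw = [40, 42, 45, 60, 95, 120, 110, 70, 50, 44]

peak = max(hourly_demand_mw)                       # capacity must cover this
avg = sum(hourly_demand_mw) / len(hourly_demand_mw)
utilization = avg / peak                           # fraction of capacity used
print(f"peak={peak} MW, avg={avg:.1f} MW, utilization={utilization:.0%}")
```

Even this mild spike leaves roughly 40% of peak capacity idle on average, which is the planning difficulty Roberts points to.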

Counterarguments

  • While securing land and power early can mitigate some constraints, it does not address the ongoing challenges of maintaining a sustainable and environmentally friendly power supply, especially as data centers consume vast amounts of energy.
  • The first-come, first-served approach to GPU allocation may be seen as fair, but it could disadvantage smaller companies or startups that lack the purchasing power or speed of larger corporations.
  • The focus on continual hardware upgrades may lead to increased electronic waste if not managed properly, raising concerns about the environmental impact of rapid technological turnover.
  • The emphasis on staying agile and adaptable to new hardware innovations might overshadow the importance of software optimization and algorithmic efficiency, which can also yield significant performance improvements.
  • The narrative that Nvidia leads GPU technology and sets industry standards could be challenged by the emergence of competitive technologies and the potential for open-source or alternative hardware to disrupt the market.
  • The rush to replace Bitcoin-related hardware with AI chips may not consider the potential for future blockchain technologies that could be more energy-efficient or have applications beyond cryptocurrency.
  • The concept of "time to compute" as a hurdle may not fully acknowledge the role of edge computing and ...


AI & Automation's Impact on Workforce & Job Market

The integration of AI and automation into various industries is not only reshaping the job landscape but also creating new opportunities and transforming local economies.

AI-Driven Automation Reshapes Jobs, Creates New Opportunities

Michael Intrator comments on AI’s potential to inspire the workforce by lowering the barrier to operations. This accessibility allows for an unprecedented level of human creativity, as people with innovative ideas can now leverage AI to create solutions and products that were previously unattainable. Intrator observes that AI dismantles barriers that once limited human creativity, now empowering the world's eight billion people to access tools that can overcome seemingly insurmountable challenges.

Srinivas further expands on AI's capability to automate tasks that traditionally required significant human effort, such as conducting research or managing business processes. Work that once took a week, or the full attention of a researcher or producer, can now be completed far faster: AI can, for example, prepare for interviews by parsing through years of podcasts and other material, work that was once part of a researcher's job.

Srinivas envisions a future where businesses could become considerably autonomous through the use of AI, suggesting that a small business might be operated by a single individual using AI tools. This could open new avenues for income generation that don't require the owners' active, day-to-day management. Calacanis reflects on the transformative changes in startup operations over the years, pointing out that tasks once necessitating the hiring of professionals can now be managed by founders using AI and automation tools, including recruitment and HR responsibilities.

Srinivas highlights AI’s potential to accelerate the process of converting an idea into a successful business venture by streamlining the startup process. This enables entrepreneurs to move forward without having to write complex prompts or master intricate programming languages.

IREN Partners With Universities and Trade Schools to Train Future Data Center Technicians and Tradespeople

Daniel Roberts emphasizes that the workforce needs to prepare for a new era in which skills to engage with AI and automation are critical. To address this need, IREN is actively partnering with universities and trade colleges to train data center technicians and tradespeople, ensuring that they are equipped with the knowledge and skills to thrive in the evolving job market.

IREN Employs and Supports Local Communities Near Its Facilities

As IREN introduces new technologies into its operations, it mai ...


Additional Materials

Clarifications

  • Data center technicians maintain and troubleshoot the physical hardware and network systems that keep data centers running smoothly. They ensure servers, cooling systems, and power supplies operate efficiently to prevent downtime. Tradespeople in this context may include electricians, HVAC specialists, and maintenance workers who support the infrastructure and safety of the facility. Their combined work ensures continuous, reliable operation of data centers critical for AI and automation technologies.
  • "Lowering the barrier to operations" means making complex tasks easier to perform by reducing the skills, time, or resources needed. In AI, this often involves user-friendly tools that automate difficult processes. This allows more people, even without specialized training, to use advanced technology effectively. As a result, innovation and productivity become accessible to a wider audience.
  • AI can analyze large amounts of audio and text data by converting speech to text using speech recognition technology. It then uses natural language processing (NLP) to understand and summarize key points from the content. This allows AI to quickly extract relevant information from extensive podcast archives. As a result, it can prepare concise briefs or highlight important topics for interview preparation.
  • Businesses becoming "considerably autonomous" through AI means that many routine tasks, such as customer service, inventory management, and marketing, can be handled by AI systems without constant human intervention. This reduces the need for large staff and allows a single person to oversee operations efficiently. AI can analyze data, make decisions, and execute actions automatically, enabling smoother and faster business processes. This autonomy helps businesses operate continuously and adapt quickly to changes with minimal manual input.
  • AI accelerates business creation by providing user-friendly tools that automate complex tasks like market research, product design, and customer engagement. These tools use natural language processing and pre-built models, allowing entrepreneurs to generate business plans, marketing content, and prototypes without coding. AI platforms often include drag-and-drop interfaces and templates that simplify development and testing. This reduces the need for technical expertise, enabling faster and more accessible startup launches.
  • IREN’s partnerships with universities and trade schools help create a skilled workforce tailored to emerging AI and automation technologies. These collaborations ensure training programs are up-to-date with industry needs, increasing employability for graduates. They also help bridge the gap between education and practical job requirements in data center operations. This proactive approach supports long-term economic stability by preparing workers for future job demands.
  • Data centers require large amounts of reliable electricity to operate efficiently. Areas with strong electrical infrastructure can support this high energy demand without costly upgrades. Industrial decline often leaves behind underutilized facilities and a skilled workforce seeking new employment. Repurposing these sites helps reduce costs and stimulates local ...
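The key-point extraction step described in the podcast-analysis bullet above can be approximated, in miniature, by a classic frequency-based extractive scorer. This is a toy stand-in, not Perplexity's actual pipeline:

```python
import re
from collections import Counter

def top_sentences(text, k=1):
    """Score sentences by summed word frequency and return the top k.
    A toy stand-in for the 'extract key points' step described above."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    words = re.findall(r'[a-z]+', text.lower())
    freq = Counter(words)
    scored = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r'[a-z]+', s.lower())),
        reverse=True)
    return scored[:k]

doc = ("GPUs train models. Data centers host GPUs. "
       "GPUs and data centers both need power.")
print(top_sentences(doc, k=1))
# ['GPUs and data centers both need power.']
```

Production systems replace the frequency heuristic with language models, but the shape of the task, ranking passages by salience and keeping the top few, is the same.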

Counterarguments

  • AI and automation may lead to job displacement, particularly for roles that are highly automatable, creating short-term unemployment and requiring workers to transition to new types of employment.
  • The benefits of AI-driven innovation may not be evenly distributed, potentially exacerbating economic inequality if certain regions or demographics are left behind in the transition.
  • The reliance on AI and automation could lead to a loss of certain skills and expertise as machines take over tasks that humans previously performed, potentially eroding the craftsmanship and human touch in some industries.
  • The integration of AI into businesses may raise ethical concerns, such as privacy issues, algorithmic bias, and the need for transparent AI decision-making processes.
  • The pace of AI and automation adoption may outstrip the ability of educational institutions to adapt, leading to a skills gap where the workforce is not adequately prepared for the new job market.
  • There may be resistance from employees and labor unions concerned about job security, working conditions, and the implications of AI on worker autonomy and surveillance in the workplace.
  • The environmental impact of data centers and the increased energy consumption associated with AI and automation technologies could be a concern, particularly in areas with less sustainable energy sources.
  • Small businesses may face challenges in adopting AI tools due to costs, complexity, or lack of expertise, potentially giving larger corporations an unfair advantage and hindering competition.
  • The assumption that AI will lead to more creativity and innovation may not hold true for all individuals or sectors, as some people may find their creativity stifled by technology or struggle to adapt to new ways of working.
  • The claim that AI can perform parts of a re ...

