← All clusters
Meta’s natural gas binge could power South Dakota | TechCrunch
active
Event type: other
Topic: ai infrastructure
Organization: Microsoft
Country: United States
Articles: 8
Unique sources: 4
Importance / Momentum: 1.96 / 0
Period: 01.04.2026 18:35 — 06.04.2026 15:15
Created: 06.04.2026 06:56:14
Articles in cluster: 8
Title | Source | Published | Score
Meta’s natural gas binge could power South Dakota | TechCrunch | techcrunch | 01.04.2026 18:35 | 1
Embedding sim.: 1
Entity overlap: 1
Title sim.: 1
Time proximity: 1
NLP type: other
NLP organization: Meta
NLP topic: ai infrastructure
NLP country: United States

Open original
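The per-article rows below pair a single Score with four similarity signals (embedding, entity overlap, title, time proximity). As a purely hypothetical illustration of how such signals are typically blended into one score, here is a weighted combination; the weights are invented for this sketch and the cluster engine's actual formula is not documented here.

```python
# Hypothetical illustration: blending per-article similarity signals
# into one score via a convex (weighted) combination. The weights are
# invented for this sketch, not the cluster engine's real formula.

def blend_score(embedding_sim, entity_overlap, title_sim, time_proximity,
                weights=(0.55, 0.15, 0.15, 0.15)):
    """Weighted average of four similarity signals, each in [0, 1]."""
    signals = (embedding_sim, entity_overlap, title_sim, time_proximity)
    assert abs(sum(weights) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(w * s for w, s in zip(weights, signals))

# The seed article matches itself perfectly on every signal,
# so its blended score is 1.
print(blend_score(1, 1, 1, 1))
```

With all four signals at 1 the blend is exactly 1, matching the seed row; weaker matches (lower entity or title overlap) pull the score down proportionally.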

Data centers have gotten so large that their power demands now rival those of entire U.S. states. Take Meta’s Hyperion AI data center: when completed, it will draw as much electricity as South Dakota. Last week, Meta announced it would fund seven natural gas power plants, on top of the three it had already committed to building, to support the $27 billion data center. Combined, the 10 power plants in Louisiana will generate around 7.5 gigawatts of electricity, slightly more than the generating capacity of the entire Mount Rushmore State.

Like many tech companies, Meta has touted its climate and environmental bona fides over the years. It regularly publishes sustainability reports, and it frequently crows about its renewable energy purchases. It effectively bought a nuclear power plant for 20 years. Meta’s Hyperion data center site in Louisiana will test the company’s commitments.

Natural gas has been hailed as a “bridge fuel”: build a few natural gas power plants now while renewables, batteries, and nuclear get their legs under them. That’s almost certainly how Meta is justifying the move internally. But people have been making the bridge-fuel argument for decades, and it’s wearing a little thin. Renewables and batteries have plummeted in price while prices for gas turbines have skyrocketed. Meta has been a leading purchaser of solar, batteries, and nuclear in recent years, which makes the decision to go big on natural gas all the more perplexing. TechCrunch reached out to Meta. The company did not reply to multiple requests for comment.
The massive turbines in Louisiana will dump 12.4 million metric tons of CO2 into the atmosphere every year, according to TechCrunch’s calculations, which are based on data from the Department of Energy. That is 50% more than Meta’s entire carbon footprint in 2024, the most recent year for which such numbers are available. That figure understates the climate impact, too, since it doesn’t include leaks from the natural gas supply chain. Methane, the main component of natural gas, warms the planet 84 times more than carbon dioxide over a 20-year period. Even leakage rates of 0.2% along the supply chain can make natural gas’s climate impact worse than coal’s. In the U.S., natural gas production and pipelines leak methane at a rate that’s closer to 3%. That’s hardly clean power.

The company’s latest sustainability report makes no mention of methane leaks. It doesn’t mention methane or natural gas at all. And yet the fuel is poised to become one of the largest contributors to Meta’s carbon footprint in the coming years. The company may well stick to its climate pledge and find a way to offset those emissions through carbon removal credits. But now it will need a lot more of them, along with an honest accounting of exactly how much methane will leak into the atmosphere in service of feeding its new power plants.

Topics: AI, Climate, data centers, Exclusive, Facebook, Meta, natural gas

Tim De Chant, Senior Reporter, Climate. Tim De Chant is a senior climate reporter at TechCrunch.
He has written for a wide range of publications, including Wired magazine, the Chicago Tribune, Ars Technica, The Wire China, and NOVA Next, where he was founding editor. De Chant is also a lecturer in MIT’s Graduate Program in Science Writing, and he was awarded a Knight Science Journalism Fellowship at MIT in 2018, during which time he studied climate technologies and explored new business models for journalism. He received his PhD in environmental science, policy, and management from the University of California, Berkeley, and his BA degree in environmental studies, English, and biology from St. Olaf College. You can contact or verify outreach from Tim by emailing tim.dechant@techcrunch.com.

© 2026 TechCrunch Media LLC.
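TechCrunch’s 12.4-million-ton figure is a back-of-envelope estimate from plant capacity. A sketch of that kind of arithmetic follows; the capacity factor and emissions intensity below are assumed, illustrative values, not TechCrunch’s actual inputs.

```python
# Back-of-envelope CO2 estimate for a fleet of gas plants. The capacity
# comes from the article; the capacity factor and emissions intensity
# are assumed, illustrative values, not TechCrunch's actual inputs.

CAPACITY_GW = 7.5          # combined capacity of the 10 Louisiana plants
CAPACITY_FACTOR = 0.46     # assumed fraction of the year at full output
KG_CO2_PER_KWH = 0.41      # assumed intensity for gas-fired generation
HOURS_PER_YEAR = 8760

kwh_per_year = CAPACITY_GW * 1e6 * HOURS_PER_YEAR * CAPACITY_FACTOR
tonnes_co2 = kwh_per_year * KG_CO2_PER_KWH / 1000   # kg -> metric tons

print(f"{tonnes_co2 / 1e6:.1f} million metric tons CO2 per year")
```

With these assumed inputs the estimate lands near the article’s 12.4 million metric tons; different capacity-factor or intensity assumptions would shift it proportionally.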
AI companies are building huge natural gas plants to power data centers. What could go wrong? | TechCrunch techcrunch 03.04.2026 19:48 0.989
Embedding sim.: 1
Entity overlap: 0.7895
Title sim.: 1
Time proximity: 1
NLP type: other
NLP organization: Microsoft
NLP topic: ai infrastructure
NLP country: United States

Open original

Who doesn’t love a good round of FOMO? From dot-com to Web 2.0, virtual reality to blockchain, the tech industry has had its share of fear of missing out on a trend. The AI bubble is the big daddy of them all. Its first offspring, the rush to lock down power for data centers, is now begetting a mad dash to secure natural gas supplies and equipment. If FOMOs could have babies, then the AI bubble is already having grandkids.

Microsoft said on Tuesday that it’s working with Chevron and Engine No. 1 to build a natural gas power plant in West Texas that could grow to produce 5 gigawatts of electricity. This week Google confirmed that it’s working with Crusoe to build a 933 MW natural gas power plant in North Texas. And last week, Meta announced that it was adding another seven natural gas power plants to its Hyperion data center in Louisiana, bringing the site to 7.46 GW of capacity, enough to power the entire state of South Dakota. Are we missing anyone?

The recent investments are concentrated in the southern U.S., home to some of the largest natural gas deposits in the world. Recently, the U.S. Geological Survey estimated that one region alone holds enough to supply energy to the entire United States for 10 months. Every data center operator seems to want a part of it.

The scramble for natural gas has led to a shortage of turbines for the power plants, with prices likely to rise 195% by the end of this year relative to 2019 prices, according to Wood Mackenzie. The equipment contributes 20% to 30% of the cost of a power plant. Companies won’t be able to place new orders until 2028, and it’s taking six years to get turbines delivered, the consultancy notes. That means tech companies are betting that the AI fever won’t break, that AI will continue to need exponential amounts of power, and that natural gas generation will be necessary for success in the AI era.
They may come to regret that third assumption. Natural gas supplies in the U.S. are plentiful, and because shipping the fuel isn’t cheap, the country remains somewhat insulated from turmoil in the Middle East. But supplies aren’t unlimited, and recently, growth in production in the big three regions, which are responsible for three-quarters of all U.S. shale gas production, has slowed considerably.

It’s not clear how insulated tech companies are from price swings, since none of them has disclosed specific terms of its agreements. A lot will depend on how firm the prices in those contracts are. Even if the contracted prices are as firm as can be, the companies could still face repercussions. Because natural gas generates about 40% of the electricity in the U.S., according to the Energy Information Administration, electricity prices are closely tied to natural gas prices. Tech companies might be able to shield themselves from scrutiny for a bit by moving their gas power plants behind the meter, that is, by skipping the grid and connecting them directly to their data centers. But natural gas isn’t an unlimited resource, and if their ambitions grow too big, even the behind-the-meter operations could drive up power prices for everyone. We’ve all seen how that’s played out.
It won’t just be regular households getting upset, either. Other industries, including those that remain much more dependent on natural gas and can’t yet turn to renewables, might balk at data centers grabbing so much of the resource. Powering a data center with wind, solar, and batteries is easy. Running a petrochemical plant? Not so much.

Then there’s the weather. One cold winter could change the calculus by driving up demand among households. Wellheads might freeze off, crimping supplies dramatically, as happened in Texas in 2021. When gas runs short, suppliers will face a choice: keep the AI data centers running, or let people heat their homes?

By snapping up natural gas supplies and moving behind the meter, tech companies can claim that they’re “bringing their own power” and not straining the electrical grid. But in reality, they’re just shifting their use from one grid to another: the natural gas grid. The AI rush has illustrated just how physically constrained the digital world remains. Does it make sense for them to bet big on a finite resource? Tech companies might regret falling for the FOMO.

Topics: AI, Climate, data centers, Enterprise, Google, Meta, Microsoft, natural gas

Tim De Chant, Senior Reporter, Climate
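Wood Mackenzie’s two numbers compound: if turbines were 20% to 30% of a plant’s cost at 2019 prices and turbine prices rise 195%, total plant cost climbs far more than a casual reading suggests. A quick sketch, holding all non-turbine costs flat (a simplification made only for illustration):

```python
# How a 195% turbine price rise feeds into total plant cost, assuming
# turbines were 20-30% of plant cost at 2019 prices and all other
# costs stay flat (an illustrative simplification, not WoodMac's model).

def plant_cost_increase(turbine_share, turbine_price_rise=1.95):
    """Fractional increase in total plant cost when only turbines reprice."""
    return turbine_share * turbine_price_rise

for share in (0.20, 0.30):
    print(f"turbine share {share:.0%}: total plant cost up "
          f"{plant_cost_increase(share):.1%}")
```

Under these assumptions, a 20% turbine share implies roughly a 39% rise in total plant cost, and a 30% share implies closer to 59%.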
AI companies are building huge natural gas plants to power data centers. What could go wrong? | TechCrunch techcrunch 03.04.2026 19:48 0.714
Embedding sim.: 0.8071
Entity overlap: 0.2647
Title sim.: 0.2326
Time proximity: 0.8491
NLP type: other
NLP organization: Microsoft
NLP topic: ai infrastructure
NLP country: United States

Open original

A New Google-Funded Data Center Will Be Powered by a Massive Gas Plant wired 02.04.2026 18:27 0.71
Embedding sim.: 0.8178
Entity overlap: 0.1724
Title sim.: 0.1589
Time proximity: 0.8579
NLP type: other
NLP organization: Google
NLP topic: ai infrastructure
NLP country: United States

Open original

Molly Taft, Science, Apr 2, 2026

A New Google-Funded Data Center Will Be Powered by a Massive Gas Plant

Documents show that one of Google’s new data centers would be powered by a natural gas plant that emits millions of tons of emissions each year, an increasingly common trend in the industry. Photo-Illustration: Darrell Jackson; Getty Images

A new data center being built with investments from Google will be partly powered by a natural gas project whose yearly emissions are equivalent to putting more than 970,000 additional gas-powered cars on the road. According to a Texas state air permit application, the Goodnight data center campus in Armstrong County, Texas, will be partly powered by private natural gas turbines that will emit more than 4.5 million tons of greenhouse gases each year. That is more than ten times the yearly emissions of an average natural gas plant, and more per year than those of an average coal plant.

Michael Thomas, the founder of Cleanview and author of a new report on Google’s power strategy for its data centers, says that Google’s focus on and continued commitment to renewables is often held up by environmental groups as an example of Big Tech doing things right. But the plans for this campus, he alleges, illustrate how even big tech companies with stated climate goals and a public commitment to renewable energy are exploring fossil fuel investments as the AI race heats up. While the Goodnight campus is not the biggest fossil fuel project planned in the US to power data centers, nor the one that will create the most emissions, the fact that the company is seemingly now exploring private, off-the-grid gas power for its data centers “suggests that something is changing,” he says.

AI infrastructure company Crusoe began constructing the data center in May, according to local media reports. In November, Google announced that it would be making a $40 billion AI investment in Texas.
As part of that investment, the company joined Crusoe to help build the data center already under construction in Armstrong County. The air permit application, filed in January, specifies that of the six buildings at the campus, the first four will be connected to the electric grid, while the fifth and sixth will be powered by the on-site gas plant.

In response to questions from WIRED for this story, Google spokesperson Chrissy Moy said the company does not have a “contract in place” for gas power at this facility. In addition to more than 900 megawatts of natural gas, the Goodnight campus would include 265 megawatts of wind power, according to a separate interconnection request made with Texas’s Public Utility Commission. Google says it does have an “agreement” for this wind energy. Moy added that the company is “signed on to the data center campus,” but noted that “a permit for an energy project doesn't necessarily confirm contracted energy plans for the data center, and isn't mutually exclusive to other energy sources.”

As data center developers face lengthy wait times to connect to electricity grids and rising concerns over consumer electric bills, they’re increasingly turning to building their own energy, or what’s known as behind-the-meter power. For these projects, gas is king; data centers are now driving a US boom in natural gas. Nearly 100 gigawatts of natural gas-fired power are currently in development throughout the US solely to power data centers, according to research published by the nonprofit Global Energy Monitor in January. Per that research, there are at least 15 projects in development across the US that are larger than the Goodnight campus. Several of these projects have only just been announced or are still in the development phase, and have not yet filed air permits detailing just how much greenhouse gas they will emit.
But the numbers that have been made public are jaw-dropping: the air permit application for OpenAI and Oracle’s Project Jupiter in New Mexico declares that it could emit 14 million tons of greenhouse gases each year, more than three times the emissions of the Goodnight campus. Meanwhile, Crusoe is developing several other projects in Texas as part of the massive Stargate campus; one of the gas projects involved would emit almost 8 million tons of greenhouse gases, according to the state permit application.

“Grid growth can't match AI demand, so a pragmatic 'all-of-the-above' strategy is essential—with gas as a critical bridge,” Cully Cavness, the cofounder and president of Crusoe, told WIRED in a statement. “This isn't the destination; it's the foundation we build on while investing in batteries, solar, wind, and small modular nuclear reactors. We're not waiting for a carbon-free grid—we're building the path to one.”

Other tech companies are publicly embracing new gas build-outs. This week, Microsoft signed a deal with oil giant Chevron to supply up to 2.5 gigawatts of gas power for a data center in West Texas.

For his part, Thomas sees behind-the-meter power potentially becoming the main power strategy for data center developers. “It’s important to note how novel this is,” he says. “This is not something that any business was doing up until a year ago or so, and now it is so popular. The speed is so much better than waiting for the grid.”

Since the start of the AI arms race, Big Tech companies that previously shared aggressive climate goals have admitted to backtracking as they increasingly build out power-hungry data centers. Despite a nearly 50 percent increase in overall emissions over the past five years, Google claimed in its sustainability report last year that it had reduced its data center emissions by 12 percent. And the company has publicly touted its commitment to renewable power.
In addition to the Armstrong campus, Google’s Texas investment includes a data center in Haskell County that will, per a company press release, “be built alongside a new solar and battery storage plant.” Google is also building out a number of large behind-the-meter renewable energy projects, as Thomas explored in a recent report.

With an administration in charge that champions data center buildouts, scorns greenhouse gas reporting policies, and pushes American natural gas, it seems likely that behind-the-meter gas power will develop in spite of the big emissions cost. In March, the White House convened executives from seven big tech companies, including Google, to sign a nonbinding agreement to protect ratepayers, including a pledge to “build, bring, or buy the new generation resources and electricity needed to satisfy their new energy demands.” Experts told WIRED that this agreement was mostly symbolic, as neither data center developers nor the White House has much control over policies that would lower electric bills.

Some lawmakers, however, are questioning Big Tech about the climate impacts of its data center projects. Just a few days after the White House event, three Democratic senators sent letters to a number of AI companies and data center developers, including xAI, OpenAI, and Meta, expressing concern about specific large-scale data center projects and their potential impact on the environment and the climate. (The lawmakers did not send a letter to Google, but did send one to Crusoe asking about an unrelated project.) The senators, Sheldon Whitehouse of Rhode Island, Chris Van Hollen of Maryland, and Martin Heinrich of New Mexico, asked that executives from these companies answer several questions about their planned data centers, including why they decided to power the data centers with natural gas as opposed to renewables.
“It’s well established that climate upheaval and huge economic impacts will result if we fail to limit global temperature increase to no more than 1.5 degrees Celsius above preindustrial levels,” the senators wrote in their letter to tech executives, laying out the need to significantly reduce greenhouse gas emissions to meet this goal. “I would ask that you explain how your actions are consistent with this goal, and if they are not, why you don’t think that matters.”
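The “970,000 cars” equivalence used earlier in the piece is standard arithmetic: annual plant emissions divided by a per-vehicle annual CO2 figure. The sketch below uses the commonly cited EPA-style average of about 4.6 metric tons per passenger vehicle per year; WIRED’s exact inputs aren’t published here.

```python
# Converting annual plant emissions into a "cars on the road" equivalent.
# 4.6 t CO2/year per passenger vehicle is the commonly cited EPA-style
# average; WIRED's exact inputs for its 970,000-car figure aren't given.

PLANT_TONNES_PER_YEAR = 4.5e6   # Goodnight campus air-permit figure
TONNES_PER_CAR_PER_YEAR = 4.6   # assumed average per passenger vehicle

car_equivalent = PLANT_TONNES_PER_YEAR / TONNES_PER_CAR_PER_YEAR
print(f"~{car_equivalent:,.0f} cars")
```

With these inputs the result is a bit over 978,000 cars, consistent with the article’s “more than 970,000.”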
People would rather have an Amazon warehouse in their backyard than a data center | TechCrunch techcrunch 03.04.2026 19:20 0.666
Embedding sim.: 0.7666
Entity overlap: 0.1
Title sim.: 0.1462
Time proximity: 0.8519
NLP type: other
NLP organization: Harvard University
NLP topic: ai infrastructure
NLP country: United States

Open original

As data centers have grown and proliferated, so too has the backlash. When asked about different industrial facilities being built in their neighborhoods, 40% of respondents to a new Harvard/MIT poll supported the building of a data center in their area, with 32% opposed. One fun tidbit from the survey, per Axios: more people would rather have an e-commerce warehouse.

Two-thirds of respondents in the 1,000-person poll, conducted in November, were worried that a new data center in their region would nudge electricity prices higher. Interest in jobs and economic growth helped the case for data centers, according to Axios, though that sentiment may fade, as most data center projects don’t employ many people once they’re up and running.

Another survey, conducted last month and published earlier this week by Quinnipiac University, found much more opposition to data center construction. That poll found 65% of Americans oppose building an AI data center in their community. Only 24% of the 1,397 U.S. adults surveyed supported one being built.

The new polls suggest that the debate over data centers is far from settled, and continued discontent from such a large swath of the electorate is likely to keep spilling over into politics. Data centers once worked quietly in the background, more or less. Not anymore.

Topics: AI, data centers, Enterprise, In Brief, polling
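When comparing the Harvard/MIT and Quinnipiac numbers, sampling error is worth keeping in mind: a simple random sample of 1,000 carries roughly a ±3-point margin at 95% confidence, from the standard formula z·√(p(1−p)/n). A sketch using that textbook formula follows; real polls apply design weights, so the pollsters’ published margins may differ slightly.

```python
import math

# Textbook 95% margin of error for a simple random sample proportion.
# Real polls apply design weights, so published margins can differ.

def margin_of_error(p, n, z=1.96):
    """Half-width of the 95% CI for a sample proportion p with n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) for each poll's sample size:
for n in (1000, 1397):
    print(f"n={n}: ±{margin_of_error(0.5, n):.1%}")
```

At p = 0.5 this gives about ±3.1 points for the 1,000-person poll and about ±2.6 points for the 1,397-person poll, so the large gap between the two surveys is well outside sampling noise.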
Who is liable when AI agents go wrong in business? the_register_ai 05.04.2026 10:00 0.653
Embedding sim. 0.7606
Entity overlap 0.0513
Title sim. 0.1488
Time proximity 0.7726
NLP type regulation
NLP organization Oracle
NLP topic ai governance
NLP country United Kingdom

Open original

AI + ML: AI agents promise to "run the business," but who is liable if things go wrong? Vendors tout the potential, but responsibility remains unclear. By Lindsay Clark, Sun 5 Apr 2026 // 10:00 UTC. "You can't blame it on the box," says the boss of a UK financial regulator. What about the people who sold you the box? Good luck with that, says a global tech analyst. "When AI agents... are considered to operate on behalf of an organization, decision-making risk becomes ambiguous and unpredictable. It also signals AI risk redistribution with unknown parameters." With AI agents now promising to "actively run the business," anyone looking for an explanation of who might take responsibility for the output of the supposedly world-conquering statistical machines might arrive at the paragraph above, not unreasonably. The stakes are high. The largest enterprise application providers are now talking about using AI agents to automate decisions in HR, finance, and supply chain management. LLM hallucinations in performance summaries, incorrect regulatory filings, and critical supplies failing to turn up are among the risks weighing on businesses that hand decision-making to AI. While tech suppliers eye a trillion-dollar opportunity in AI, who carries the can if it goes wrong? "There's a historic assumption that the vendor will be picking up liability if the thing is going to go wrong. That's the point of origin for more or less all of these discussions," said Malcolm Dowden, senior technology lawyer at Pinsent Masons. Users might be forgiven for having high expectations for AI, given the vendors' claims. Announcing an expansion of its AI Agent Studio for Fusion Applications, Oracle said the technology would be "capable of reasoning, taking action across business systems, and continuously executing processes" such that its software could "actively run the business, with the governance, trust, and security that enterprises require."
In legal terms, though, vendors might see things differently. Dowden said: "If you think of a normal tool or system, its behavior is predictable, so the giver of a warranty can have some pretty clear sense of how much liability you're taking on. That's different with AI. The more we get down the chain to what used to be called non-deterministic AI – mostly what falls into that agentic AI category – that gives a much greater scope for unexpected behaviors. That's the big concern from a vendor perspective, if you're giving a warranty about how something will behave, but it's inherently unpredictable, then that makes it a very uncomfortable contractual promise to make." It might also be concerning for the businesses using these systems, given what is at stake and the responsibilities they are expected to take. For example, in the UK this week, the Financial Reporting Council (FRC) could not have been clearer in its guidance for AI adoption. "While technology changes, the fundamental principle of our regulatory framework does not: it is people – the firms and Responsible Individuals – who are accountable for audit quality." Or as FRC executive director Mark Babington told the Financial Times : "You can't blame it on the box. If you use this technology, you are still accountable for it." Nonetheless, technology buyers can at least try to hold their suppliers to account in the terms of the contract. For example, users deploying AI to screen job applications should be aware that they could be challenged under data protection law because it is automated decision-making. The UK's enforcer, the Information Commissioner's Office, has recently said it backs automation so long as users monitor for bias, are transparent with job seekers and explain their right to recourse. Dowden said on questions such as bias in the training model, user organizations would be liable as they are data controllers under UK law. 
"They would then be looking to lay off that liability on the vendor through contractual provisions about explaining how the AI works, or a contractual obligation to make sure there is no inherent bias." However, vendors are very likely to push back on a straightforward assertion that the bias must be in the model itself, he said. They will want to look at the interaction between the model, the algorithm and the user prompts. "We're seeing in terms of negotiated warranties things like a promise that the system has been tested for bias, and the test will be regularly updated, and the models will be calibrated, but no assumption of responsibility if the bias can be traced to the way in which the prompts have been created and formulated. Both sides are essentially looking to establish the other as the liable party. That's where negotiations are tending to focus," Dowden said. Gartner has predicted that by mid-2026, new categories of unlawful AI-informed decision-making will generate more than $10 billion in remediation costs across global AI vendors and enterprises that leverage AI. Lydia Clougherty Jones, Gartner VP analyst, said decision-making by AI agents may take AI liability to a new level. "When AI agents... are considered to operate on behalf of an organization, decision-making risk becomes ambiguous and unpredictable. It also signals AI risk redistribution with unknown parameters," she said. "Organizations that fail to immediately adopt defensible AI, make AI-ready data 'AI-decision-making ready' and extensively overhaul ML model explainability are at risk of significant loss of investment, government investigations, civil penalties and, in some cases, criminal liability." Clougherty Jones recommended that users should get to grips with the idea of "defensible AI." That means focusing on techniques, including AI decision-making, "that can reliably and repeatedly withstand scrutiny, questioning, and examination." 
Organizations might want to deploy content and decision-making guardrails for language-model-based solutions across the entire life cycle of AI from data to model to output, she said. Last week, Balaji Abbabatulla, Gartner vice president and lead analyst for Oracle, said there was a lot of legal language to protect the vendors in terms of technology. Instead of being legally liable, they talk about monitoring, observability and audits. "The difference between AI agent decisions and human decisions is the scale and the pace of those decisions, and they could quickly cascade," he said. "If there's something wrong and if it's not identified and prevented, then it could quickly cascade before anybody even takes note of the issue. They're talking about continuous monitoring to identify exceptions: guardian agents, as we call them. But the issue around liability is the key challenge for all vendors." It was precisely the risk of erroneous output cascading unnoticed that worried vendors about accepting liability, said Georgina Kon, Linklaters partner in digital, data and commercial law. "The magnification risk is massive but also there is the difficulty in working out who is responsible," Kon said. "A lot of the current laws don't really lend themselves particularly easily, because it assumes always that a human or company is doing something and that's not true. But you can't also have a world where people are creating agents and not liable for them. What it comes down to is what the market can bear commercially." For this reason, the vendors were soft-launching products and testing them out with users first.
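The "guardian agent" pattern the Gartner analysts describe — an independent checker that reviews each agent decision before it takes effect, so errors can't cascade unnoticed — can be sketched in a few lines. This is a minimal illustration, not any vendor's actual implementation; all class names, policy thresholds, and fields here are hypothetical.

```python
# A minimal sketch of the "guardian agent" idea: every action an AI agent
# proposes passes through an independent policy check before it executes,
# and anything outside policy is escalated to a human. All names and
# thresholds below are hypothetical illustrations.

from dataclasses import dataclass, field

@dataclass
class ProposedAction:
    kind: str          # e.g. "purchase_order", "regulatory_filing"
    amount: float      # monetary value of the action
    confidence: float  # the agent's self-reported confidence, 0..1

@dataclass
class GuardianAgent:
    max_amount: float = 10_000.0   # policy: spend ceiling per action
    min_confidence: float = 0.9    # policy: confidence floor
    audit_log: list = field(default_factory=list)

    def review(self, action: ProposedAction) -> str:
        """Return 'execute' or 'escalate', recording the decision so
        every automated choice remains auditable after the fact."""
        within_policy = (action.amount <= self.max_amount
                         and action.confidence >= self.min_confidence)
        decision = "execute" if within_policy else "escalate"
        self.audit_log.append((action.kind, decision))
        return decision

guardian = GuardianAgent()
print(guardian.review(ProposedAction("purchase_order", 500.0, 0.95)))        # execute
print(guardian.review(ProposedAction("regulatory_filing", 50_000.0, 0.99)))  # escalate
```

The audit log is the point: as the lawyers quoted here note, what vendors are willing to stand behind are the processes and safeguards they followed, and a reviewable record of each decision is the simplest such safeguard.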
As with social media in the early part of the century, the way people will deploy and respond to AI agents is yet to play out, Kon said. "When you have things like AI, it's just another crest of a hill where you have no idea what's ahead of you, because these agents could be unexpected, they could learn the wrong thing as well. No wonder vendors won't take responsibility for everything, but what they can take responsibility for are the processes they followed, and the safeguards that they have implemented. From a profitability perspective, there will come a point where it's not attractive for them to develop agents that they then might have typical contractual liability for." However, some users were happy to go ahead and deploy agents so they can stay on the bleeding edge of their market or gain process efficiency, accepting the risk themselves. It will depend on the sector, Kon said, with financial services and healthcare, for example, being more conservative in their approach. AI investment is set to reach $2.52 trillion this year, with the bulk of it coming from hyperscalers, model builders, and software companies. They will want to see a good return on the outlay. Any senior IT manager or director will testify to the bold marketing claims of the vendors promising to automate internal decision-making at an unprecedented speed and scale. But holding them liable for the output will remain a challenge until the law is clearer, and cases have gone through the courts. The major application vendors were offered the opportunity to explain how much liability they accept in their customers' implementation of AI agents. Microsoft and SAP refused to comment. Workday, Salesforce, ServiceNow, and Oracle have not responded. Despite the industry hype, matching market claims to legal responsibility remains a difficult circle for them to square.
Cisco CEO Chuck Robbins wants data centers in space the_verge_ai 06.04.2026 15:15 0.636
Embedding sim. 0.7541
Entity overlap 0.1071
Title sim. 0.1478
Time proximity 0.5958
NLP type other
NLP organization Cisco
NLP topic ai infrastructure
NLP country United States

Open original

Today, I’m talking with Chuck Robbins, CEO of Cisco. Cisco is one of those big companies that everyone has heard of but that most of us don’t have to interact with very much; it’s not really a consumer brand. But all of us are in some way using Cisco’s products and services every day because it makes a huge amount of networking equipment for other big companies, like telecoms and ISPs. It’s a guarantee that somewhere between me recording this and you watching, listening to, or reading it, the bits have passed through Cisco products. Without the actual routers and switches and silicon — and the software to make those things work — there’s no internet, there’s no cloud, and there’s no AI. That’s Cisco’s new big business, of course: building all the networking needed inside all of the data centers the AI companies are trying to build. Chuck and I spent a lot of time discussing that. First, where should we build all these data centers? Because it’s not clear that anyone wants them around. A data center is a really unpleasant neighbor to have: It’s loud, it’s ugly, and it uses a ton of electricity, making rates for regular people go up. AI itself is polling pretty badly with Americans, and there’s now fairly robust, bipartisan opposition to new data center builds all over the country. So I had to start by asking Chuck what feels, strangely, like one of the most urgent questions of the moment: Should we build data centers in space? Elon Musk sure seems to think the answer is yes, and he’s pushing SpaceX that way. Sam Altman — along with a whole bunch of experts who understand how cooling and radiation work in orbit — thinks we’re not there yet. So I had to ask Chuck which way he’s leaning, and I was a little surprised how quickly and emphatically he answered.
You’ll also hear me ask very directly whether Chuck thinks AI is a bubble, and you’ll hear him say very directly that he thinks it is. And he would know: During the dot-com bubble, Cisco — the internet builder — was very briefly the most valuable company in the world. Beyond the AI of it, I love bringing big companies that are kind of hidden in plain sight onto Decoder, and Cisco is a perfect example. Chuck has made some big bets around chip investments to position Cisco on what he calls the leading edge — but not bleeding edge — that are really fascinating when you think about the kind of infrastructure he sells to companies all over the world. Those companies are dealing with an increasingly fractured global landscape, and asking big questions about data. Who owns data? Where can it be stored? Should the internet have a kill switch in different countries? They’re important questions, but they also don’t have easy answers, and you’ll hear Chuck really delve into how complicated it is keeping the world connected in the deeply weird realities of 2026. Okay: Cisco CEO Chuck Robbins. Here we go. This interview has been lightly edited for length and clarity. Chuck Robbins, you are the CEO of Cisco. Welcome to Decoder. It’s great to be here. Thank you. I’m excited that you’re here in person. I have a lot of questions for you. It seems like a very complicated time to run an infrastructure company — which is fundamentally what Cisco is — especially one for global infrastructure. The internet’s a global network, and that seems to be under a lot of pressure from a lot of different directions. So, I want to get into a lot of things with you. But I actually want to start with a question I’ve been dying to ask you ever since we scheduled this interview. I thought, finally, I can ask this question and someone will be able to tell me the answer. Should we put data centers in space? Absolutely. Yes? And we will. You think so? I think so.
Right now we’re dealing with lots of power constraints, and up there you don’t have that. And if you think about the people who are talking about putting data centers in space, I wouldn’t doubt them. Elon [Musk]. Yeah. And there’s a lot of stuff we’re working on right now. We’re thinking through what we need to do to our portfolio to make it work properly in the conditions that might exist up there. But I think we’re going to see it. I think we are. So Elon’s plan — he recently filed for approval for this plan — is to launch a million satellites as part of a constellation. He’s launched constellations before. You mentioned power, that’s obviously solar. Can’t we just do solar power here on Earth? Is that not a possibility? Well, up there it’s unlimited and unimpeded, so it’s just easier. You don’t have to deal with a lot of the challenges, like people who don’t want these data centers in or near their communities. So, that’s obviously off the table. I think it solves a lot of problems. There are a lot of challenges figuring out how to make it all happen. But again, given his history, I wouldn’t doubt him. We’re going to prepare so our technology is ready. What does that preparation look like for you? It’s very early stages right now. Our teams literally came to me I think about two or three months ago. My head of product said, “We really have to be prepared for data centers in space.” I looked at him like he was crazy. Subsequent to that, we’ve just been talking about how we don’t even know everything we need to do yet. We’re in the early stages of just making sure the atmospheric issues, the temperatures, all of those things are taken into consideration. But at some level, we don’t have to deal with the cooling and things of that nature, which add a lot of weight to the product because you first start thinking about how do we get them up there. So, there are a lot of things that our team’s thinking through right now. 
What does that networking stack look like for you? Do you have to invent a whole bunch of new stuff? Is it the same stuff without as many cooling loads or with less energy needs? I think it’s generally the same with perhaps different interfaces for different satellite technologies, things like that. It shouldn’t be too dissimilar. Do you want to be on the bleeding edge of that, or are you waiting and seeing if Elon can prove it out? No, I’d like to be on the leading edge. How about we say that? Maybe not bleed, but let’s lead. What does that investment look like for you? Are you going to send up a team? The teams that currently build our data centers are the logical ones to actually do this analysis, and I think that’s what’s happening right now. To me, the cooling piece of it seems challenging in a lot of ways. You have to move the heat out of the products. There’s no air in space. That’s not going to naturally happen. You’re getting way beyond me pretty quickly. I’m just curious. We’ve written a bunch of “ should we put data centers in space ?” stories now, and I was dying to ask you these questions because it feels like someone has to do a lot of basic R&D work to make this happen. I’d say six months from now, have my chief product officer do this, and he can go through a lot of that with you. Fair enough. Let me ask you the flip side of this; you mentioned this already. There are problems with building data centers in the United States and around the world. I want to come to that in more depth. But are we just running away from the problems of politics and saying we’ll just do it in space where there’s no one to get in our way? I don’t think that’s it. I think it just eliminates a lot of the challenges that you’re facing on the planet. Let me assure you. I grew up on a farm in Georgia, so the last thing I ever thought I’d be talking about are data centers in space. Even five years ago, I wouldn’t have thought I’d be talking about it. 
If you think about a lot of the dynamics we’re dealing with, I don’t think it’s politics so much as it is the physical limitations, the community. There is an aspect where a lot of the people in the communities don’t want these things in their backyards, and I get that. Sam Altman is one who says, “I don’t think they should be in their backyards.” We’ve got enough rural areas in this country where we ought to be able to put these things, but we’ll see. Sam Altman also notably says putting data centers in space is a pipe dream. So who are you going to believe? Does he? So who’ve you got? I wouldn’t bet against Elon. All right, fair enough. Let’s talk about Cisco for a second. You’ve been CEO for 11 years. You’ve been there for almost 30, I want to say. This is a company that goes boom and bust with boom and busts. I think in the dot-com era, Cisco was briefly the world’s most valuable company. For about a day, I think, yeah. And this is a company that, when it’s time to do infrastructure, can be one of the big growth drivers. Infrastructure’s cool again. It’s time to build, as they say. What is Cisco to you right now? How would you describe this company? We securely connect everything. That’s basically what we do. We connect systems, we connect people, we connect things, and we do it in a secure way. We’re connecting AI data centers, we’re connecting GPUs within AI data centers. It’s primarily about secure connectivity. I think when people have thought about connecting everything, they’ve thought about, honestly, the last mile. Like, you build the big internet, that’s an enterprise problem. Then, we’re going to do 5G. Or it’s Mobile World Congress and we’re going to do 6G now. Who knows when that’s going to be. But that’s the big internet people have long thought about. I know you have a big corporate business. The turn for networking right now feels like data centers. It feels like we’re building these big data centers. 
We’re going to link up a bunch of GPUs in ways we haven’t linked them up before. We have different kinds of workloads because of AI. Is that a meaningful difference to how you conceive of Cisco? It is. There’s certainly more and faster innovation around things like the silicon we design ourselves that goes into the data centers. The continued evolution of data centers is forcing us to drive those cycles faster. If you look at our enterprise data center business — going back to 2010 or 2008, when the cloud came along — there was a belief that there was never going to be another private data center built. And if you look at the last eight quarters, our enterprise data center networking business has had double-digit growth in six quarters and high single digits in the other two. So, we see that business growing. If you go back five or six years, we had relatively zero business from the big hyperscalers, and this year, we’ll do billions. And most of that’s driven by AI infrastructure and their data centers. So, I think your assumption is accurate. Is that just their lack of capacity? Amazon or Microsoft wants to build out another data center, but they can’t do it themselves because they’re building so fast so they turn to you? No, we’re selling them equipment that they’re using to build their own data centers. So they’re building them. They are building them. So what was the turn? Why did that line start growing for you when it wasn’t growing before? Success in business is always a combination of good decisions and a lot of luck. The luck struck in 2016 when one of my engineers, who built our hardware, came to me and said, “There’s this silicon company in Israel that I think we should buy.” It gave us the opportunity to standardize on a single silicon architecture across the entire portfolio. So in 2016, we bought this company called Leaba . 
Fast forward and we’re one of basically three companies in the world that can build the networking silicon that’s needed to connect these GPUs, run the training models, and run these AI data centers. So, that was a big part of what’s helped us get there. And to be candid, if we didn’t have that silicon today, we would not be participating in this phase. Otherwise, I’d be buying merchant silicon like all my competitors, and I’d be just like everybody else. So, that’s the biggest thing that’s differentiated us and got us to this point. We have a lot of competitors or would-be competitors come on the show and talk about networking. That seems like a growing business for a class of companies. The one that I’m particularly interested in is Nvidia. You guys have a deep partnership with Nvidia. They just had GTC. Jensen [Huang, Nvidia CEO] is out there pointing out that their networking business is huge. It’s bigger than yours in some ways. It did $31 billion in its last fiscal year. I think you guys were at $20 billion in the last quarter. It’s billions bigger than yours. Is it a threat that Nvidia is so deep into building up the networking component? Because it’s obviously selling the GPUs. There’s a place it could go. It can just expand its footprint. Is that competition? Is that coopetition? How’s that work? It’s coopetition. If you look at the big hyperscalers, they actually build their own integrated architecture using best of breed or whoever they want to use. They are very good at balancing their spending across multiple vendors. They like to have optionality. They want diversity at the silicon level. That’s how they think. You see some neoclouds as an example. Nvidia sells a fully integrated stack that has networking included in it. That’s the path of least resistance, and it helps them get there faster. So sometimes they’ll buy that.
If you look at the enterprise, most enterprises have built 40 years of knowledge, processes, everything around our platforms and our technology. That’s why what we can do together in the enterprise is a big part of why Nvidia values our partnership. The other thing we have, which no one else has, is security. As we move to this agentic era with agents operating all over your infrastructure, you have to do security in the network because the latency requirements are going to require full-time security on these agents all the time. I’m doing access, validation, and identity validation of agents. We’re the only networking company that has a big security business. None of our security competitors have a networking business. So it’s a big advantage to us as we go forward. We just had the CEO of Okta on the show, and his entire pitch was, “I will build you a kill switch for your agents.” Is that competition for you? Is that something that will work alongside what you’re planning? I actually think there’s a great opportunity for us to partner with Okta. That kill switch might be implemented at the network layer because we may see something happening that it won’t see at the upper layers. So we’ll figure it out, but the teams are working on this day and night right now. The deal is being made here on Decoder . You heard it here first. Exactly. This seems like the opportunity. When I say Cisco’s a company that grows with booms and busts, the amount of compute that everyone is describing that they need in order to deploy agents at scale across the enterprise and to train the next generation of models is vast. You are obviously going to help build the data centers that supply a lot of that compute. The question I have is, do you see the revenue on the back end of that? This is a lot of growth, a lot of forward investment. For them? For them and for you, right? Well, we’re getting the revenue now. We would not expect this buildout to end anytime soon. 
Everybody wants to compare this to the dot-com era, right? Is it a bubble? Is it going to bust? I’m like, well, did the dot-com bust or did the winners emerge, the losers failed, and now we have what we have? If they hadn’t been successful, we wouldn’t be talking about anything we’re talking about today. So, it wasn’t like it went away. People lost money, but the winners emerged. I think you’re going to see the same thing here. The difference is that, in a lot of cases, the companies that are spending so much money on this infrastructure view it as an existential issue for their survival. They’re going to continue to build, and they’re going to continue to invest. I think they’ve proven that over the last few years, and I think we have a long way to go. We’re very early in this cycle. I have two thoughts about this I’m eager to push you on. One — and this is just related to the infrastructure — a big part of the bubble there was that we built a fiber network that sat dark for ages. You can say whether that was good or bad, but we had it. And the fiber itself was valuable, even if it wasn’t full of traffic yet. Is a data center valuable on the same scale? If you build a data center, and there isn’t the consumer workload to run it, you can’t just show up 20 years from now and plug into the fiber the way that you could. I think that the difference is that, unlike that fiber, these data centers are being used day one at full capacity. I mean, they’re just being used. In our world, it’s about the networking connectivity, but it’s also about optics. We haven’t talked about optics, but we made some strategic acquisitions in optics, which has also been a big deal for us. Because at some point, you won’t be able to get the packets off the processor over copper because the speed’s just too great. So us having both those technologies in house is another benefit as we look to the future. 
The other question I’ve been really thinking about with the dot-com era comparison is less dot-com and more mobile. If you look at the promise of the dot-com era, it was, “We’re going to take the economy, and we’re going to move it onto the internet broadly. You’re going to buy your pet food online, and maybe you weren’t going to buy it on a Dell desktop PC.” It actually happened when we got to mobile. We just moved the economy onto the internet. Everyone’s doing e-commerce, and it turns out buying pet food from Amazon on a computer totally works when that computer is a phone. Then, Apple and Google get to extract rent from everybody for all their purchases and games. We have an economy that works that way. The promise of AI is we’re going to do it again. We’re going to move the economy a third time to the next paradigm in computing. What’s the evidence you see that that is happening or will happen at the scale necessary to support this investment? Well, if you look at some of the early agentic platforms, you heard Jensen this week talking about OpenClaw . I guess when this is broadcast, it would have been two or three weeks ago, but nonetheless. If you just look at the early promise of what that can do for you, I think you’re going to see it automate a lot. It’s going to make your whole purchasing process different. I think it’s yet to evolve, but I just reflect back to 2007 when the iPhone came out. None of us had any clue what we’d be doing with that phone today, none of us. Maybe there were some people somewhere who were such visionaries that they saw it coming. But the application portfolio that we have today is much broader than we ever thought it would be. I think you’re going to see the same thing emerge around AI. We don’t know what is going to come. We have ideas about things that we think will happen, but we don’t know everything that’s going to happen. I mean, this stuff’s changing so fast. 
I talked to Kevin Weil at OpenAI and he’s like, “We’d sit down and have meetings about what are we going to do the next two months, and then three weeks later, we throw it out and start over because everything’s changing so quickly.” I think that’s the way we’re going to all have to operate, which is going to be very uncomfortable for a lot of people. Is that changing the way you’re selling your products to build this capacity? Because if you don’t know what the capacity is for, it must change how it’s being built. It’s changing a lot about how we design silicon. These customers are so big that they’re a market of one. So, we have unique requirements coming from an individual company, which we haven’t had to deal with in the past. We built general-purpose silicon, we sold it to everybody, and it worked. So, you have different applications, different use cases, different customers that are leading us to move faster and build more variants of this technology than we would have in the past. That’s right up against the insatiable demand of other silicon providers, right? There is a capacity crunch for chips, there’s a capacity crunch for RAM. How is that working for you? Are you able to get the flexibility you need? Yeah. Certainly, when you look at fab capacity, we could use more, but the world could use more. I don’t think you’d get anybody on here who builds chips that wouldn’t say, “I’d love to have more capacity.” Same thing for memory. We’re in a crunch for probably 18 months doing everything we can to try to secure what we need. We feel pretty good about where we are right now, but we’ll see how the demand plays out over the next year and a half. I’ve talked to people about RAM margins, like consumer laptop vendors who say, “There might not be consumer laptops this year.” It might just be priced out. You might never be able to cover the cost to just put a stick of memory in a cheap laptop. You might just be out.
The CEO of Razer, which makes gaming laptops with lots of fun lights, was like, “Week to week, I don’t know what the margin on that product will be.” It’s true. You’ve got to build a big piece of the infrastructure puzzle. The GPU is useless without the networking. This at least has to equalize somewhere for you, right? You’ll say, “Look, this is our margin to build the networking, to get the value out of the GPUs that we’re buying at super high rates from Nvidia and whoever else.” Is that working in the market to at least equalize your prices? Networking equipment uses a lot less memory than compute platforms do. So, we still have memory in every networking device, but it’s a much smaller percentage of the BOM than it would be in a — That’s “bill of materials.” Thank you, sorry about that. It’s a much smaller percentage than it would be in, like, a server. The customers understand… what I keep trying to explain to them is that price increases are happening upstream from us. We’re just an absorber of the price increase. We’re having to do more frequent price increases than we have in the past, and we’re having to change our terms to deal with the same thing that your other guest talked about, which is the dynamic nature of the pricing that we’re seeing right now in the memory space. But when you go to the large hyperscalers… I said earlier that it’s existential. So, what we’ve just adopted with them is a more transparent model that says, “Here’s what we need. Here’s how it works. Here’s our pricing.” And they generally understand. Because there are other choices, especially, for you to provide — Everybody’s in the same boat. It’s not like you’re going to go somewhere else and somebody’s going to give you memory at 10 percent of the cost of what we’re offering. Everybody’s just trying to deal with the capacity crunch right now. 
This brings me to the Decoder questions because my next set of questions after this are how you’re handling this interlocking set of complicated puzzle pieces. Tell me how Cisco is structured right now. How big is the company? How is it organized? We’re 85,000 people, plus some contractors. We’re functionally structured like most companies. We’ve got a sales organization. We’ve got a product organization. The one change I made about 18 months ago was to consolidate all of our products under a single leader for the first time that I can remember. It’s a big complex portfolio, so we did that. We’ve got a services organization. It’s fairly functional. Pretty standard. You’ve been reducing the size of the company pretty substantially over the past three years, I would say. You had two big rounds of layoffs in 2024. You just had some other little layoffs. Most of the time those are rapid reallocations that we need to do. It’s unfortunate, but it’s not… Typically, the ones we’ve done have not been about reducing the total head count. At least, they have not generally been that way up until now. I was reading some coverage of those changes. There’s a lot of, “Are these AI-related layoffs?” Is that on your mind? That you might be thinking about new kinds of structures, new kinds of engineering structures? As an example, let’s say that our engineers become twice as productive because of coding. This year, we’ll have five or six products that’ll be 100 percent written by AI. Next year, we’ll probably have 70 percent of our code be written by AI. You still have to test it. You’ve still got to go through all that stuff. But let’s say you make them twice as productive, just to simplify the math here. The companies are going to have to decide, “Am I going to maintain the same pace of innovation with half the people? 
Or am I going to double my pace of innovation with the same number of people?” I think different companies are going to make different decisions with some in-between variants. I think that’s where we’re heading. But we’ve got to see this all come to life. We’re seeing the early successes of coding, but we haven’t seen the unintended downsides that we haven’t figured out yet. My head of product was saying that we’ve got 20 or 30-year-old code that’s integrated in the systems that’s written in C++, as an example. That head of product told me, “We took all these old lines of code, we compressed it by about 20 percent, and we converted it to a modern language using AI.” My first response to him was, “You better test that like crazy before you put it in a product and then put it in a customer environment.” There’s a lot of stuff we’re still learning as we go through there. Stay on that for one second. Cisco code can’t fail, right? The networking components should not go down in the same way that… I don’t know, how we are resilient to Amazon being broken for five minutes and then it coming back to life, right? The world stops. Yeah. If Cisco fails, something bad happens in an escalating, catastrophic way. I get those calls, by the way. [ Laughs ] I have a lot of listeners who are like, “What’s Chuck’s phone number? Because I manage a Cisco portfolio.” We’ll give it out at the end. Okay. Great. Perfect. Stick to the end of the episode. There’ll be an affiliate code when you call. [Laughs] How do you think about that risk? I keep joking about how I ask everybody the org chart question. I’ve asked it for five years, and there’s two answers: we’re functional or we’re divisional, and we get through it. Now, I think we’re on the cusp of seeing some of the weirdest org charts in business history. “I manage a team of two people and 500 agents.” Meta is about to do one manager to 50 individual contributors all using agents to write code. 
I don’t know how any of that’s going to work. You can’t take some of those risks, but you’re describing the productivity gains that might come with some of those risks. How are you thinking about that? We need a little more runtime. You’re right. The whole mental model around our software development is different from the one around these models. Kevin Weil from OpenAI made a comment at our AI summit, and he said, “You guys should be using these models when they’re working properly 10 percent of the time just to get to use them.” I sat there and I listened to that comment. It’s just a different way of thinking. Granted, they’re going to get it to full… But you go into it recognizing that it’s still evolving. We don’t have that luxury. Our stuff has to work. We’ll have to figure this out as we go, but we’ve seen how dependent the world is on technology functioning properly. We’ll have to just assess it as we get closer, but I think there’s going to be an awful lot of testing that has to get done. But the flip side is that we think AI can help us find bugs more quickly. It can help us assess customers’ infrastructure and say, “Hey, you’re running these four versions of our software. We’ve seen a lot of instances where when you’re running those four, it’s created a problem.” Or, there’s cybersecurity risks in certain parts of the code that AI can help us find. There are a lot of upsides. There’s a lot of opportunity for AI to help us become more reliable safely. You’ve mentioned security several times now. The flip side of deploying AI to help with security is your adversaries who attack might be able to deploy AI to attack you much more efficiently. And they are. How is that playing out for you? The emulations that you’re going to see, like email and video simulations and people replicating me, are just going to get crazier. So, we have to be better at using our tools. I have also been a big proponent of all of the security competitors in the industry laying down our weapons. 
We still compete but in service to our customers. I believe we have to more effectively share intelligence in real time today to help our customers deal with this because any one of us on our own is going to be less effective than all of us together. That’s a big thing we’ve been pushing. We’ve been building a lot of capabilities. There are a lot of opportunities to integrate our platforms and our threat intelligence. If you think about what you can do with models, like training on threat intelligence and conditions that led up to threat vulnerabilities, there’s an awful lot we can do to get ahead of this. And we need to do that. I think this brings me into the other Decoder question that I ask everybody. This is the one that I think is pressure for everybody. At the scale of change you’re describing here, how do you make decisions? What’s your framework? When I wrote my thesis during the process of becoming CEO and the board was assessing the candidates, one of the things that I called out in the document — and this is 12 years or 11 and a half years ago — I said that the industry is moving so rapidly that you’re going to need a team-based strategy. You have to have a lot of people developing strategy because there’s no one individual. There’s some brilliant minds, so I’m not ruling any one human out, but there’s no one individual who can come up with the exact right strategy every time they’re assessing what they need to do. So, we spend a lot of time together as a team. We spend anywhere from one to three hours together every Monday. We go off-site together for two to three days every quarter. And the way we make decisions… Look, 99 percent of the decisions get made below me because they’re easy or because two smart people agree. When they get to me or any other CEO, you’re usually assessing two potential bad choices. Or you have two smart people who completely disagree, which tells you it’s complicated. 
In general, we just spend a lot of time in transparent discussion and open communication about how we’re going to make the decisions. At the end of the day, I own them. I have this belief that when a decision goes really well, you give everybody else the credit, and when it goes very poorly, it’s all on me. That’s just how you have to operate. To the decisions question, you are dealing with a vast amount of uncertainty, right? There’s a vast amount of uncertainty with how the global internet will be structured. What do the hyperscalers need as they build out new capacity for uncertain workloads? Who knows. We’re going to sell a bunch of products to the neoclouds, which have circular financing. Those bills might not get paid, which I want to come to. That is a lot of uncertainty. I would say whether or not all of this infrastructure investment pays off in GDP growth is the biggest uncertainty of all. How are you dealing with that? You haven’t even gotten to three or four other big ones. Go ahead. What are they? Well, you’ve got the geopolitical situation, you’ve got sovereignty requirements emerging all around the world. You’ve got two wars around the world. You’ve got tariffs, you’ve got memory costs, you’ve got all these things that we’re all trying to navigate. So, it’s pretty complicated. That is a lot of uncertainty on your decision-making. You’re saying it all rolls up to you when it goes wrong. Has that affected how you’re making choices? Faster. You just have to move faster. We had an all-hands with our entire company yesterday. We do it once a month. I told them, “Look, if speed and change makes you uncomfortable, you’re going to be uncomfortable because it is a world where companies can get seriously damaged in a very short period of time.” This is what’s driving a lot of the investments. There’s a big FOMO issue in the C-suite today. CEOs are like, “What am I missing? 
What’s my competitor going to do that I don’t know about?” We used to say, “Get 80 percent of the information you can, make the decision, and then adjust as you go.” And I think that’s… Maybe it’s not 80 percent anymore, but you’re going to have to take that approach. You’re going to have to be willing to take risks, and you’re going to have to be comfortable being uncomfortable. And if you’re not, it’s going to be a pretty complicated and stressful time. Billions of dollars in capital are being allocated for infrastructure. Does it come up that the products that might pay this off don’t exist? Does it come up next to the FOMO? Depends on the customer. If you look at the [telecom operators], the cloud providers, the people whose core business is highly dependent upon products that we build. Everybody is, but we will have those conversations with their CEOs and their leadership team. You go to Mobile World Congress, as an example. We were just there, and the CEOs from some of those carriers and service providers are in every meeting. So, they care. When you get into the enterprise space, some of them are super technical. They understand the value of technology. So, they want to talk about trends. They want to talk about what we see other companies doing or what we’re doing as an enterprise that they should be thinking about. But usually, if there’s something big on the table, my only position with them is, “If you go with us, you have my personal commitment that we’ll throw all the resources you need to make you successful.” That’s usually all they want to know. I feel like there’s a split in the market right now. I understand the enterprise use cases for AI. I understand why you’d want to build as fast as you can there — particularly in software development, as you described. We can see the benefit. We talk to developers all the time here at The Verge. They’re like, “Our entire job is different.” The world has changed. The market has cracked open. 
Something is going to happen there. Then, downstream of that, you can say, “Well, we hired a bunch of engineers to build us business process automation. Maybe we get way more value out of those engineers and we get way more automation.” There’s something in the enterprise that’s going to happen with AI that feels like I understand the value. Do you see any consumer applications of that scale beyond just telling Alexa to buy me shoes? Quite honestly, I don’t yet, apart from Google Search getting a lot weirder over the past two years. I don’t have any great examples yet. You’re right. You are seeing some horizontal areas in the enterprise that are consistent across almost every company, like coding. Customer service is one that everybody’s working on. You start to see some emerging horizontal use cases in legal. We’re seeing a lot of use cases in our people organization, too. I think those are pretty standard. Everybody’s at least aware of those opportunities. People are at different stages on the journey. But, I’m not the consumer expert by any stretch. We’re purely B2B, so that’s where I spend all my time. If I saw it, I would have probably read it on something you published. We’re looking for it every day. The reason I’m asking is because I think this relates to why I started out asking about data centers in space. I’ve heard [Google CEO] Sundar Pichai say variations on this idea. Without the big consumer application that everybody understands and can see the benefit of, putting the data center in the backyard is becoming an increasingly harder sell. The power requirements, the water requirements — which I know are controversial and often argued about — just the energy, resources, and requirements of the data centers are making them unpopular. I don’t think it’s all that. I don’t think the environmental argument historically wins in America. I drive a V8 Mustang, and I’m going to keep driving that car. 
We have an EV that’s parked right next to it, but those cars are popular for reasons. Fast fashion, enormous environmental impact. People like it. There isn’t an AI product for consumers that they like so much that it just transcends objections they can reach for. We’re seeing it play out in really weird ways. In bipartisan ways, people are pushing back against the data centers. Are you going to be able to hit your goals if data center construction slows? Is there a way to overcome those objections without the great consumer product? I think there is. Look, we’ve been the most innovative country on the planet for a very long time, and that’s not going to change. Some of the smartest people in the world are actually trying to solve these problems, and they will. By the way, I think if you give some of those residents the greatest AI tools that they’ve ever seen in their lives on their phones, they still don’t want the data center in their backyard. I don’t think they’re going to say, “Oh, this is great. Go ahead and drive my energy costs through the roof and I’ll be okay with it.” That’s not going to be the gating factor. I think those apps will come. We saw a little bit of this with 5G. They didn’t want radio towers. You remember that whole thing? Oh, I remember. This is that at a much greater scale, but I think we’ll figure it out. The 5G comparison’s really interesting. I know you just came back from Mobile World Congress. At least the telecom industry understood that they had to describe some applications that all of this build-out will accomplish. The ones that got me every time were, “We’re going to have self-driving cars and we’re going to do robot surgery.” There were all these demos of these things. I went to endless CES demos. A self-driving car demo is fundamentally very boring. You don’t want to be in an exciting, self-driving car demo. You don’t. I sat in a lot of them at CES and pretended to be very excited that 5G would drive the car. 
That car looks like every other car driving. Yeah. And it’s like, “I would like this to be as unexciting as possible.” Yes. Maybe there was one demo of a 5G surgery, and it was still backstopped by wired internet. 6G has the same sort of application problem, right? We don’t know what it’s for. AI has the same problem. You can’t describe what it’s for in a way that might overcome the objection. That feels like a fairly unique point for the whole industry to be at, where the next generation of technology is very exciting to a handful of providers. It’s the future of your business in real ways, and the applications are harder and harder to describe. You’ve seen it all. I’m asking you this question on a big sweep. If we’re talking about the internet, it was easy to describe what it might do for people. I actually disagree with you. I think a lot of people instantly saw what the phone would become. There was an excitement there. That’s where you got startup founders from. This one just seems more nascent to me. How would you place that in your sweep of history? You’re right, we didn’t have the number of use cases. I think if you asked the telecom CEOs, they would probably say that they’re disappointed in the return they got on all the investment around 5G. That’s pretty well-understood. I think robotics in general could be the real driver of 6G utilization once it gets built. But again, its early days are being defined. Typically, we’ve talked about these tech transitions for years and years and years before they come to fruition. AI is different. We did talk about it for a long time, and then all of a sudden it broke loose. The pace at which it’s changing is just unprecedented. I think we’ll have to see on 6G; it’s still TBD. I don’t expect that you’ll see the same mistakes made from a speed-of-investment perspective until they become more clear. The internet also was hand-in-hand with globalization, right? 
We both have iPhones that are made all over the world. You had this giant global network. Maybe this is going to lead to an age of prosperity. Maybe this is going to lead to an age of extreme labor displacement. You can read that however you want. There are a lot of opinions about what the internet and particularly globalized manufacturing led to. That’s all being undone. You can see that’s being undone every single day. Whether that’s with tariffs in an effort to bring manufacturing back to the United States — which we’ve talked about on the show with many of your peers — or whether it’s, “Hey, we’re going to put up big walls on the internet.” Australia’s going to have a social media ban for teens. They’ve got to enforce that somehow. That’s probably going to happen at the network layer. You have the European Union saying, “The data has to be here. We have to put the data here. The European data has to be in Europe.” You build the networks. I’m imagining all of this is just one more layer of complication, even as you describe how we should have global systems that bring us to an era of shared prosperity. How are you dealing with that? I think what you’re seeing play out is not only do countries want data sovereignty, they want to have sovereign control over any technology they’re using. And it’s not limited to Europe at this point. They don’t want the US to have the ability to impede the use of those products under any conditions. As an example, with some of the meeting platforms like Webex or Zoom, customers don’t want any other country — I’ll say the US, but any other country — to have the ability to cut off access to these platforms if they’re going to invest and use them for critical reasons in other countries. Let’s use Europe as an example. In many cases, European companies that build the technology they use in their infrastructure don’t have that capacity at scale. 
So then they have to default to, “Who are the companies that I trust?” And trust becomes such a big… It’s a big discussion obviously around AI, but it is really a big deal. So for us, as an example, we have always tried to be good citizens and good members of the community in every country we operate in. We’ve had education programs for 25, 30 years that train learners on digital skills around the world. Last year alone, we had 5 million learners around the world go through one of these programs. So, that trust element’s going to become very important. The technology is one thing. You have to build technology that can be deployed the way they want it to be deployed. Then, you have to have a very high degree of trust when you work with them. Is this changing how you’re architecting some of your products? It is. What are some specifics? Well, you’d love to have a cloud solution. Historically, what you would do — and a lot of companies were built this way — is build global instances, partition them, and sell them off to different customers. As an example, if you go to Germany and Germany says, “I want to have my version of that running in my country,” it’s architecturally different than how you might have designed it to begin with. We’re now designing a lot of those control or cloud-oriented systems so that they can be structured to run in a country alone, and we don’t think about building global instances anymore. From your perspective, this is a very different way of building the internet, right? It just isn’t the thing. My first experience with the internet was watching the coffee pot at the University of Cambridge. This was the promise when I was a kid watching a one-frame-a-minute live stream of a coffee pot and being like, “I can go there.” You watched live streams of coffee pots, for real? Do you remember this? In 1994, the first video feed on the internet was a coffee pot in Cambridge. It blew my mind when I was a kid. In some ways, it’s more than ever, right? 
You can watch live streams of all the coffee pots ever. Any one you want. If you want to. In some ways, this is just closing down, right? Every country is saying, “Our citizens are here and we’re going to manage what they do. Their lives are on the internet. We are going to control the internet in our country.” That’s happening all over the place in all kinds of ways. It’s honestly happening state to state. The internet in California looks different than the internet in Texas today. You’re the networking provider for many of these countries, for many of these companies. When you think about the sweep of what the internet might look like, when you think about the amount of compute that’s happening in a data center as opposed to happening locally on my laptop, (which is always a kind of dance) what does the internet of the next five years look like to you? Well, it’s going to be more fragmented for sure. You’re seeing the cloud providers build in regions and certain places, and they’re having to re-architect to think about this. I’m not sure most of the functionality we use for the internet today is going to change much, to be honest. There are going to be controls that will exist, but I don’t think it’s going to change the core, normal operating functionality of how it works today. They’ll be there in the case of an issue or an emergency. Now, it’s not the network’s issue where you store data and all those kinds of things, so how that plays out is independent of what we would think about. But I think you get into times of crisis and that’s when you might see things happen differently. If you’re a certain country that gets into a conflict and you want to isolate yourself from a communications perspective so you can trust that your communication’s clear, then that might create short-term dynamics for your citizens. But I don’t think it’ll be meaningfully different on a day-to-day basis. We’re seeing that right now. 
India shuts off the internet in Kashmir all the time. The Iranian internet is on and off every day. Are those things that your customers are coming to you and saying, “The government wants this capability of the network one level up. Can you help us build it?” They are having those conversations primarily around not wanting to have a third party or another country disintermediate their capabilities through tech by having some control or a kill switch. That’s typically what they’re talking about. How does that play into AI a bit? Now, we have these workloads built on networks that you’re supplying, you’ve got a bunch of agents doing stuff all day long, and you’re saying, “We’re going to be the security provider for it.” At some point, does Donald Trump get to say, “Turn off the agents, it’s getting out of control?” You have to think about security at an agent level. It’s like you would do at an employee level but on steroids. You’ll need to apply five to 10 times more security, maybe more. I’m just throwing numbers out. We have to figure that out as we get going, but it’s certainly going to introduce an entry point for bad actors to do things that you wouldn’t want them to be able to do. We’re learning, and everybody is working on this problem simultaneously right now. I feel like for most of this conversation, my assumption has been that this is going to keep going. This is going to keep working. The problems will be complicated, everyone will work diligently, we’ll solve them, someone will invent the consumer product, and all of this will pay off. What if it doesn’t? What if this bubble pops? What does that look like for you? What’ll happen is there’ll be lost, misplaced capital. There’ll be companies that shut their doors. Then, the winners will emerge, and we’ll build out at scale just like we saw with the first wave. I suspect that’s what will happen. I think there are certainly going to be companies that will cease to exist. They’re going to go away. 
That’s what happens with any of these early things. You take a risk. That’s why the reward is so high; it’s risk-reward. It’s the nature of these massive transitions, and this is bigger and faster than anything we’ve ever seen. The amount of capital tied up in what you might call circular financing with some of these neoclouds seems dangerous to me. It seems like if I had to point at where things will get shaky first, it will be, “Well, we did a lot of forward investment with a lot of debt investment into neoclouds against workloads that themselves have not yet paid off.” Eventually, the bill will come due or the investments don’t happen. Is that a risk that’s on your mind? It is, but not particularly for us. We’re super conservative. I’ve heard instances where we’ve looked at financials and have chosen not to do business with some of these folks. I think every company has to make their own decisions. We also have creative financing solutions that protect us so we can work through. We learned a lot in 2000 because we were doing a lot of that back then. The neoclouds, are you in them or are you staying away from them? No, we’re in some of them. Some of them want to use us. Honestly, a lot of them want partnerships with us because they want the enterprise access. They don’t have a robust enterprise sales force, and they think we can help them there. So, in many cases, we work together to figure that out. The other way I can see things maybe not getting shaky but changing dramatically so that it changes the investment available in the AI industry is if inference becomes more valuable than training, right? So far, all the emphasis has been on running these GPUs red-hot to do training because the next version of the model will finally be capable enough to, I don’t know, be your girlfriend. Whatever it is that they think they’re going to do. Something about training has been the point. We’re going to build AGI. 
They don’t want to say it, but they’re saying it all the time. There’s a chance that the models are good enough, and it’s actually just inference now. We’re just going to run agents and Claude Code is big enough to meaningfully affect enterprise cost dynamics. Does that change your business if we’re done with training and, actually, inference is the point? No, it is actually great. I don’t think you’re going to be done with training, by the way. I think the inferencing stuff is going to be additive. Do you need all the new data center build-outs if it’s inferencing instead of the training? I mean, we would like to participate, and we’d obviously like to see that continue to grow because it’s good for our business. But some people believe the inferencing side is going to be bigger. I think that it’s going to be very distributed. You think about how a lot of enterprise customers are going to want to do inferencing at a point of interaction with a customer and garner immediate value in that interaction, and that’s going to be very distributed. Distributed compute requires high-performance networks, which is good for us. So, we like that. This is what I was mentioning: the dynamic between the edge and the data center seems to always be changing. I think I saw some press releases out of Nvidia’s GTC about more compute coming to the edge of certain big network providers. Are you seeing that play out? We don’t know where it’s supposed to be, so everyone is investing in both the edge and the data center? It’s still early at the edge. I think everybody believes they’re going to need it, and we’re seeing certain applications where people are starting to pilot it. I think this may be a good opportunity for the telecom providers. There has always been this thesis that edge compute was going to be a big benefit for them. That was the thesis of 5G. 
I won’t say the name, but I went to a very long dinner with one of the major telecom providers, and they told me all about self-driving cars powered by edge networks. But you could see this become something. There are discussions now of inferencing grids and the dynamic routing of these inferencing requests based on everything from cost of power at a given time of day to capacity that’s available. I mean, there’s a lot of thinking about how this plays out. I think it’s still TBD, but it’s coming. So, I want to bring this all back around. The business here is building data centers with people, with big customers. It’s part of it. We also connect all the employees and everything else, too. Well, sure. I mean, do you want me to talk about Webex for another hour? Because I have a lot of notes about Webex. [ Laughs ] We can talk about anything you want. Apple uses Webex. Does Tim Cook ever say, “Dude, can you just make the Mac client a little bit better?” No, it’s actually better than most others. Do you use it? I’m a journalist. I’m on calls with these companies all of the time. So Webex comes up in my life. Okay, good. I’m glad to hear that. I’m just telling you, find the person, the native Mac client — All right. I’m going to get one of my guys on the phone with you and make sure that — We’ve got to do it on the show, and we’ll just go through a demo together. But you’ve got to be there. I’ve got to be there? Yeah. All right. We’ll do live notes on a Webex call. For you to be happy with Webex, I’ll do that. Every time an enterprise software CEO comes on the show, I’m like, “Do you use your product?” And I would say it’s 50-50. I do all day long. You obviously do. All day long. But I was also a coder early in my life, so I’m a little weird. I’ve used Claude Code, so I’m — You’re in it. Yeah. All right. But I’m saying the growth of the business, the explosive growth that everyone is seeing, is in AI, right? 
It’s in building this new generation of data centers, this new generation of compute. I just keep circling around it, but the problem is that people don’t want those data centers near them, and I have yet to see the argument for why that should happen. In my mind, the argument is great consumer products. If you’re like, “That’s where Netflix comes from,” I think people will calm down. But that’s not the argument we’re making right now. There isn’t a product like Netflix.

That’s where Netflix comes from.

I think if you were like, “Netflix is building a data center in your town,” people would be like, “That rules.”

Yeah, it’s going to be faster.

Right. Is Tom Cruise going to be there? You would have some emotional connection to the thing that’s happening. We don’t have that right now. The pressure on not building these data centers is only going to go up in weird ways. In Alabama, there’s a state senator who proposed blocking solar build-outs as a way to reduce data center interest in his state. That’s a weird outcome. What happens if we can’t build more data centers? What happens if the public just doesn’t buy in?

We’ll build them in space faster, I guess.

This is why I started off asking if you’re just trying to escape the political problems of Earth.

I don’t think they’re political problems. I think they are issues of utility and power, cooling and water, and all those things. They’re all interconnected. Again, I don’t wake up every day and deal with this issue, but the people who do are very smart people. I think the thing a consumer will be okay with is if you go in and not only build a data center but somehow increase the utility capacity of that community or do something positive in that community beyond streaming Netflix faster. That’s when they’ll be okay with it because I don’t think their concerns are around it being unsightly or anything like that.
I think the issue is the concern over the inflationary pressure that it puts on utilities and the things they need. In my hometown of Racine, Wisconsin, there was supposed to be a Foxconn factory, and that never came to pass. Now, it’s going to be a Microsoft data center. Instead of the 13,000 or 15,000 jobs that Foxconn promised that site, there are going to be like a couple thousand. That’s a lot of water and a lot of power without the economic lift you’d expect, and then maybe there are inflationary pressures on power or other utilities. As your customers are building out, are you working with them to reduce those pressures, to find ways to make the data centers more efficient?

Our role is really around the power consumption of the platforms that we sell, and that’s a massive part of our innovation cycle. We want to deliver higher performance and lower power consumption every time. So, that’s the role we play in that space.

Well, Chuck, you’ve given us a lot of time. What’s next for Cisco? What should we be looking out for?

It’s hard to predict what’s going to happen. As I said earlier, we had a high degree of luck with the optics and silicon investments that we made. We had some smart people who were suggesting that we make them, but they’ve turned out to be magical for us right now. For the next three to five years, we’re going to be spending every ounce of our energy on secure connectivity in this agentic era. But I mean, I don’t know what we’ll need to do three years from now because things are changing so quickly. I think we’re as prepared as we can be.

Well, we’ll need to have you back sooner than three years to see where the pulse is. Thank you so much for being on Decoder, man.

Thanks, man.

Questions or comments about this episode? Hit us up at decoder@theverge.com. We really do read every email!
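Robbins’s “inferencing grid” idea (dynamically routing inference requests based on the cost of power at a given time of day and the capacity that’s available) is, at its core, a small scheduling problem. Below is a minimal sketch assuming a cheapest-eligible-site policy under a latency budget; the site names, numbers, and policy are hypothetical illustrations, not anything Cisco has described.

```python
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    power_cost: float   # $/kWh at the current hour
    capacity: int       # concurrent requests the site can still absorb
    latency_ms: float   # network round-trip from the user

def route(sites, latency_budget_ms=100.0):
    """Send the request to the cheapest site with spare capacity
    that still meets the latency budget."""
    eligible = [s for s in sites
                if s.capacity > 0 and s.latency_ms <= latency_budget_ms]
    if not eligible:
        raise RuntimeError("no site can take the request")
    return min(eligible, key=lambda s: s.power_cost)

sites = [
    Site("edge-pop",      power_cost=0.18, capacity=4,      latency_ms=8),
    Site("regional-dc",   power_cost=0.09, capacity=500,    latency_ms=35),
    Site("hyperscale-dc", power_cost=0.05, capacity=10_000, latency_ms=140),
]
print(route(sites).name)  # prints "regional-dc"
```

With a 100 ms budget the cheapest hyperscale site is excluded and the regional data center wins; relaxing the budget shifts the same request toward cheaper, more distant power, which is the trade-off Robbins describes.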
The Ridiculously Nerdy Intel Bet That Could Rake in Billions | wired | 06.04.2026 09:00 | score 0.626
Embedding sim.: 0.7464
Entity overlap: 0.1026
Title sim.: 0.0876
Time proximity: 0.6358
NLP type: partnership
NLP organization: Intel
NLP topic: ai infrastructure
NLP country: United States


Lauren Goode | Business | Apr 6, 2026, 5:00 AM

The Ridiculously Nerdy Intel Bet That Could Rake in Billions

Advanced chip packaging is suddenly at the center of the AI boom. Intel is going all in.

Sixteen miles north of Albuquerque, in Rio Rancho, New Mexico, an Intel chip plant sits on more than 200 acres of land. The site was established in the 1980s, part of it built on top of a sod farm. In 2007, as Intel’s business faltered, operations in one of the key fabs, Fab 9, came to a halt. Employees say families of raccoons and a badger took up residence in the space. Then, in January 2024, the dormant fab was booted up again. Intel funneled billions into the facility, including $500 million it was granted from the US CHIPS Act. Now, Fab 9 and its neighbor, Fab 11X, are critical infrastructure for one of Intel’s quietly fast-growing businesses: advanced chip packaging. Packaging involves combining multiple chiplets, or smaller components, into a single, custom chip package. Over the past six months, Intel has been signaling that its advanced packaging business, which operates within the Foundry chip-making arm of the company, is having a growth spurt. The effort puts it head-to-head with Taiwan Semiconductor Manufacturing Company (TSMC), which far surpasses Intel’s production in terms of scale. But in an era where AI is driving demand for all kinds of computing power, and leading nearly every major tech company to consider making its own custom chips, Intel thinks this effort can help it grab a bigger slice of the AI pie. During a quarterly earnings call in January, Intel CEO Lip-Bu Tan claimed that Intel’s packaging is a “very big differentiator” from competitors.
Chief financial officer Dave Zinsner said on the same call that the company expects to see revenue from packaging “come in even before we start to see meaningful wafer revenue.” Zinsner said he had revised his packaging revenue projections over the past 12 to 18 months, from hundreds of millions of dollars to “well north of $1 billion.” Zinsner elaborated on this in March at the Morgan Stanley Technology, Media, and Telecom conference, when he called Intel’s packaging “ironically, the more interesting part of the Foundry business today,” adding that the company was “close to closing some deals that are in the billions of dollars per year, in terms of revenue on packaging.” Multiple sources say that Intel has been in ongoing talks with at least two large customers for its advanced packaging services: Google and Amazon, which both make their own custom chips but outsource parts of the fabrication process. These deals would be a boon for beleaguered chipmaker Intel, which is attempting a comeback—partially funded by the US government—after years of stagnation and missing out on mobile chips. A spokesperson for Google, Lee Fleming, declined to comment, saying that Google doesn’t publicly discuss its supplier relationships. Amazon spokesperson Doron Aronson also declined to comment. Intel said it does not comment on specific customers. Intel’s ambitions for its advanced packaging business depend largely on whether the company can secure outside customers like these tech giants. Since 2024, the company has effectively been split into two: There’s the long-standing “product” side, where Intel designs and sells cost-efficient CPUs to PC makers and data centers; and the aspirational Foundry side, where Intel makes advanced semiconductors. Intel’s Foundry plans and the number of advanced chip systems it can yield are closely watched signals among tech analysts and investors, who in recent years have seen Intel cycle through CEOs and start and stop fab buildouts.
Zinsner, for one, said at the Morgan Stanley conference that he now believes Intel Foundry’s packaging business can achieve the same 40 percent gross margins that it claims on the rest of its products. It’s still an extremely challenging proposition. “Packaging is not as easy as saying ‘I want to run 100,000 wafers per month,’” says Jim McGregor, a longtime chip industry analyst and the founder of Tirias Research, referring to a continuous flow of chips in various stages of production. “It really comes down to whether Intel’s [packaging] fabs can make deals. If we see them expanding those operations more, that’s an indicator that they have.” Last month, Anwar Ibrahim, the prime minister of Malaysia, revealed in a post on Facebook that Intel is expanding its Malaysian chip-making facilities, which were first established back in the 1970s. Ibrahim said the head of Intel’s Foundry, Naga Chandrasekaran, had “outlined plans to commence the first phase” of expansion, which would include advanced packaging. “I welcome Intel's decision to begin operations for the complex later this year,” a translated version of Ibrahim’s post read. An Intel spokesperson, John Hipsher, confirmed that it’s building out additional chip assembly and test capacity in Penang, “amid rising global demand for Intel Foundry packaging solutions.”

Package Store

According to Chandrasekaran, who took over Intel’s Foundry operations in 2025 and spoke exclusively with WIRED during the reporting of this story, the term “advanced packaging” itself didn’t exist a decade ago. Chips have always required some sort of integration of transistors and capacitors, which control and store energy. For a long time the semiconductor industry was focused on miniaturization, or shrinking the size of components on chips. As the world began demanding more from its computers in the 2010s, chips started to get even more dense with processing units, high-bandwidth memory, and all of the necessary connective parts.
Eventually, chipmakers started to take a system-in-package or package-on-package approach, in which multiple components were stacked on top of one another in order to squeeze more power and memory out of the same surface space. 2D stacking gave way to 3D stacking. TSMC, the world’s leading semiconductor manufacturer, began offering packaging technologies like CoWoS (chip on wafer on substrate) and, later, SoIC (system on integrated chip) to customers. Essentially, the pitch was that TSMC would handle not just the front end of chip-making—the wafer part—but also the back end, where all of the chip tech would be packaged together. Intel had ceded its chip manufacturing lead to TSMC at this point but continued to invest in packaging. In 2017 it introduced a process called EMIB, or embedded multi-die interconnect bridge, which was unique because it shrunk the actual connections, or bridges, between the components in the chip package. In 2019, it introduced Foveros, an advanced die-stacking process. The company’s next packaging advancement was a bigger leap: EMIB-T. Announced last May, EMIB-T promises to improve power efficiency and signal integrity between all the components on the chips. One former Intel employee with direct knowledge of the company’s packaging efforts tells WIRED that Intel’s EMIB and EMIB-T are designed to be a more “surgical” way of packaging chips than TSMC’s approach. Like most chip advancements, this is supposed to be more power efficient, save space, and, ideally, save customers money in the long run. The company says EMIB-T will roll out in fabs this year. Unsurprisingly, AI has been a big catalyst for these changes. “Because of AI, advanced packaging has really come to the forefront,” Chandrasekaran said. “Even more so than the silicon itself, chip packaging is going to transform how this AI revolution comes to fruition over the next decade.” Intel began readying for mass production of EMIB-T in Rio Rancho, New Mexico.
The Rio Rancho facility houses around 2,700 Intel employees, roughly 200 fewer than it had last year; Tan slashed Intel’s workforce after he took over as CEO. The surrounding land is arid desert. As is the case with a lot of tech infrastructure expansions, local advocacy groups have expressed serious concerns about Intel’s water usage and the fumes the plant is giving off. (Intel claims it recycles water at the Rio Rancho site.) A short tour inside Rio Rancho’s Fab 9 doesn’t reveal much to the untrained eye. It’s slightly less “clean” than Intel’s Fab 52 semiconductor plant in Arizona, since its method of removing air particles is different, but the standard clean room precautions and sanitized, zipped-up suits are still required for entry. Inside the fab, hair-thin silicon wafers are mounted, diced, and mold-ground. Katie Prouty, the Rio Rancho site plant manager and a 31-year veteran of Intel, emphasizes during a walk-through that one of Intel’s selling points for advanced packaging is that customers can opt to use Intel for any part of the process, or “enter and exit the highway” at any point. A customer can, for example, purchase wafers from one entity, then come to Intel’s fabs for the next step; or contract with an outsourced semiconductor assembly and test company for traditional chip packaging, then use Intel for advanced packaging. “That’s not something Intel did before. We never took in other customers’ wafers,” Prouty said. “That’s been a huge mindset shift.” Competent, cutting-edge technology? Check. Chips packaged specifically for AI? Check. Flexibility for customers with certain needs? Check. So, where are all the customers? One former Intel employee, speaking on background, said that Intel’s target packaging customers may be hesitant to announce partnerships with Intel for a couple of reasons.
They’re either waiting to see if the company can deliver on its fab expansion promises, or they’re concerned TSMC could potentially allocate fewer wafers to customers once they say they’re using Intel for packaging. It’s not the tech itself they would be taking a risk on, the former employee added; it’s the broader market dynamics. Chandrasekaran is more circumspect. “I think we want to be very disciplined around the idea of: We don’t talk about our customers. Successful foundries don’t say, ‘We have signed up these customers.’ We want the customers to talk about our product.” Intel may want to consider adopting another motto: If they come, we will build it—and at great capital expense. The big indicator that the customers have arrived, Chandrasekaran says, will be a notable jump in Intel Foundry’s spending. “As we sign up these customers, we’ll have to increase our capital expenditures,” he says. “And then the street will see it.”