The day after he was sworn in as the 47th president, Donald Trump appeared in the Roosevelt Room of the White House, flanked by three of the most consequential figures in the booming artificial intelligence industry: OpenAI co-founder Sam Altman, Oracle co-founder Larry Ellison and SoftBank founder Masayoshi Son. They represent, respectively, the cutting edge of AI computing, the world’s leading database software provider and a Japan-based venture capital giant that finances and underwrites AI infrastructure.
The occasion was the launch of a joint venture, dubbed “Stargate,” that plans to invest a staggering US $500 billion in massive, state-of-the-art data centres built specifically for a transformative technology capable of everything from writing essays to concocting deepfake videos to powering AI assistants and chatbots that can take care of an ever-growing list of tasks.
On this occasion, the normally hyperbolic Trump deployed a plausible superlative to describe the new partnership: “the largest AI infrastructure project in history,” according to The Guardian. He added, “We have to get this stuff built.”
In the 10 months since, the hype around the AI data centre sector has only grown louder, larger and more breathless. Market analysts such as Paul Kedrosky, a tech entrepreneur and MIT research fellow, say that AI, and the handful of global tech giants that dominate the space (Microsoft, Meta, chip maker Nvidia and others), account for almost a third of the extraordinary run-up in the S&P 500 this year. “Building in data centres is something in excess of 50 per cent of GDP growth in the United States in the first half of this year,” Kedrosky said in a recent CBC Radio interview. “I can go on and on, but the numbers are just staggering in terms of both the growth and the size of this one sector and its contribution to the economy.”

Moody’s, the debt rating agency, estimates that data centre investment could exceed US $2 trillion over the next four years; other forecasts, such as one from McKinsey, put the figure at nearly US $7 trillion by 2030. According to a recent market assessment by Goldman Sachs, all this new investment is driven in large measure by the fact that an older generation of data centres (the sprawling server farms that house much of the Internet and provide cloud-computing services to corporate customers) simply isn’t capable of handling the processing needs of AI applications.
So far, most centres are located in, or proposed for, the United States and China, but other countries are getting in on the AI gold rush. Malaysia has emerged as the leading data centre hub in Southeast Asia, accounting for more than two-thirds of projects under construction in the region. South Korea recently unveiled plans for what would be the world’s largest data centre, in the Jeollanam-do region in the country’s southwest. Backed by a California firm, Stock Farm Road (which was co-founded by LG’s Brian Koo), the initial US $10-billion investment is expected to grow to US $35 billion once the facility is fully built out later this decade. It will require a staggering 3 gigawatts of power, enough to supply a city of about 2.5 million households. (Stargate recently announced it would be constructing data centres in South Korea, as well as Norway and Argentina.)
Elsewhere, a £3.9-billion (US $4.5-billion) data centre project proposed for North Lanarkshire, in Scotland, will rise on the site of a disused steelworks and be powered in part by on-site battery storage. Not to be outdone, Meta is constructing a US $50-billion AI data centre in Louisiana, dubbed Hyperion, which is expected to be the largest in the United States.
In Canada, Telus recently opened an “AI factory” in the Rimouski region of Quebec, a facility that handles both the training of algorithms and the other computing tasks required to power an AI application. It serves corporate clients such as OpenText, League and Accenture through secure links to Telus’s network. Telus says the facility uses 99 per cent renewable energy and 75 per cent less water for cooling than the industry standard. The company plans to scale it up in response to demand within Canada for AI applications that don’t rely on foreign data centres, a way of ensuring data protection and security, especially for customers in sectors such as government and health care.
“Sovereignty means Canadian organizations can finally develop cutting-edge AI solutions with complete Canadian operational control over every layer of the technology stack, from chips to software to storage,” says Telus spokesperson Tricia Lo. “When done correctly, sovereignty secures the freedom to develop solutions tailored specifically to Canadian markets, regulations and customer needs, rather than adapting generic offerings.”
Telus’s decision to locate in a region with an abundance of inexpensive hydroelectric power is no coincidence. Around the world, the new geography of AI data centres will require a quantum leap in the provision of electricity, which runs both the computers and the cooling systems that keep them from overheating. BloombergNEF forecast this spring that U.S. data centre power demand will more than double by 2035, rising from almost 35 gigawatts in 2024 to 78 gigawatts. (For context, 78 gigawatts is about half of Canada’s installed electrical capacity.) The bulk of the electricity consumed by these facilities will be generated in the United States, China and the European Union, and about half will come from fossil fuels. As energy demands grow, some data centre developers are building natural gas–fired generators on their own sites to ensure an adequate electricity supply.
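For a rough sense of scale (the figure of roughly 150 gigawatts for Canada’s total installed generating capacity is an assumption added here for illustration; it does not appear in the BloombergNEF forecast):

$$\frac{78\ \text{GW}}{35\ \text{GW}} \approx 2.2 \qquad \text{and} \qquad \frac{78\ \text{GW}}{150\ \text{GW}} \approx \frac{1}{2}$$

In other words, projected U.S. data centre demand more than doubles over the decade, and by 2035 it approaches half of what Canada’s entire grid can generate.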
These eye-popping aggregate figures have a way of obscuring the physical reality of the data centre boom and how it looks from ground level. The sprawling glass, concrete and steel facilities, typically built behind highly secure perimeters, resemble suburban distribution hubs, minus the long rows of loading bays for freight trucks. They tend to give off a noticeable hum and are frequently situated next to a power substation. And although data centre operators use billions of litres of water for cooling, their facilities are often located in regions of the United States that are already experiencing high or extreme water stress, according to a Bloomberg News analysis published last spring.
“They are just like massive cold storage buildings,” says architect Kelly Alvarez Doran, the co-founder of Ha/f Climate Design, a London-based practice that specializes in low-carbon design. He describes the architecture of data centres as “innocuous” and points out that the proximity to water and power is the critical piece. “It has a lot of the same geographical requirements as a nuclear power station, but the main thing is their big footprint.”
Traditional data centres, packed with servers and storage capacity, are not new and indeed have been part of the portfolio of commercial property developers for years. But the sheer scale of AI data centres reflects the fact that these applications don’t work at all like conventional digital platforms, such as streaming services, social media or e-commerce.
“What’s different about AI is that every time you query it, for example when you use ChatGPT, it’s generating these responses one word at a time, and it involves a lot of computation,” says Nandita Vijaykumar, an assistant professor of computer science at the University of Toronto. The AI platforms that can generate answers or videos in response to prompts, she adds, are based on “large matrix multiplications. It’s not very complex math, but it is very large scale. The data centres are actually supporting computer infrastructure that is intended for very large-scale computation.”
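To make that concrete, here is a deliberately tiny sketch in Python. It is not drawn from the article or from any real model: the vocabulary, the eight-dimensional vectors and the random weights are invented for illustration. But it has the same shape as the computation Vijaykumar describes, in which each new word requires its own round of matrix multiplication.

```python
# A toy "one word at a time" generator. Real AI models work with billions of
# parameters on specialized hardware; everything here is scaled down and
# randomly initialized purely to show where the matrix multiplications sit.
import numpy as np

rng = np.random.default_rng(0)
vocab = ["the", "data", "centre", "hums", "softly", "."]
d = 8                                         # toy vector size (real models: thousands)
W_embed = rng.normal(size=(len(vocab), d))    # maps each word to a vector
W_out = rng.normal(size=(d, len(vocab)))      # maps a vector back to a score per word

def next_word(word: str) -> str:
    """One generation step: a lookup, a matrix multiplication, then pick the top-scoring word."""
    h = W_embed[vocab.index(word)]            # the current word as a vector
    scores = h @ W_out                        # the matrix multiplication
    return vocab[int(np.argmax(scores))]      # choose the highest-scoring next word

# Generating even a short phrase repeats the whole computation once per word.
word = "the"
for _ in range(5):
    word = next_word(word)
    print(word, end=" ")
```

Scaled up, every query to a chatbot repeats steps like these across enormous matrices, which is why the hardware inside a hyperscale facility is built for little else.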
Nvidia, a California firm founded in 1993, began life developing the 3D graphics hardware needed to power computer games and has effectively cornered the market for the highly specialized chips and servers required to run AI algorithms and applications. The new “hyperscale” data centres are filled with Nvidia’s technology, which has propelled the company to the top tier of global tech giants, with a market capitalization of roughly US $5 trillion.

What’s increasingly clear is that the rapidly expanding footprint of the extravagantly capitalized data centre industry is becoming a fixture of the landscape, especially in populous regions. In previous eras, the arrival of heavy industry delivered both a boon, in terms of jobs, and an environmental shock: think of the smoke-belching factories that produced steel for automakers in places such as Detroit, Pittsburgh and Hamilton.
With the data centre industry, the earlier, smaller-scale facilities, many of them geared toward cloud-computing services, tended to be built in dense metropolitan areas with high-quality networks and proximity to users, according to a 2020 Harvard Business School study entitled “Where the Cloud Rests: The Economic Geography of Data Centers.”
Some communities have aggressively courted hyperscale developers to build within their municipal boundaries, and the results have been mixed.
Loudoun County, in the exurbs of the Washington metropolitan region in northern Virginia, sits at the pointy end of this trend: the municipality has long offered generous tax breaks and other incentives to data centre builders, and it became a destination. According to a 2023 report by state authorities, “Northern Virginia is the largest data centre market in the world, constituting 13 percent of all reported data centre operational capacity globally and 25 percent of capacity in the Americas.”
Loudoun, according to a recent Bloomberg CityLab investigation, has nearly 200 data centres comprising 49 million square feet of space, with another 117 in the pipeline. Loudoun’s county supervisor reported earlier this year that data centres provide the county with almost US $900 million in annual tax revenue, which covers 95 per cent of the municipality’s budget.
Many residents, however, have chafed at the impact on their communities. These sprawling, humming facilities are located cheek-by-jowl with residential neighbourhoods, and homeowners in some cases find their backyards opening onto the looming, mute walls of a data centre. “I never thought that a data centre would be built across the street from my house,” Emily Kasabian, a local resident, told the BBC in October. “I would not have bought this house if I had known what was going in across the street.” In fact, earlier this year, Loudoun’s county board changed the zoning for data centres, making it more difficult for developers to get approval. Previously, data centre builders received virtually automatic approvals from the county’s planning department.
The reasons for the pullback are partly political, rooted in the sense of encroachment that Loudoun homeowners have complained of, but they are also about energy. The state audit conducted in 2023 concluded that it would be “very difficult” for the state to provide the transmission lines and renewable energy sources needed to satisfy a so-called “unconstrained” growth scenario for data centres within the region. “For example, new solar facilities would have to be added at twice the annual rate they were added in 2024, and the amount of new wind generation needed would exceed the potential capabilities of all offshore wind sites that have so far been secured for future development,” the audit’s authors wrote. The Trump administration’s decision to halt or cancel many new offshore wind and solar projects further muddies the waters, despite the president’s keen interest in AI and in cryptocurrency, an intensely energy-intensive industry that also relies on data centres.

Pushback is occurring in other jurisdictions as well. The Malaysian government earlier this year moved to slow data centre growth, according to Reuters, in response to mounting strains on the country’s electrical grid, as well as political pressure from the United States, which wants to prevent China from gaining access to Nvidia’s specialized AI hardware.
Others are concerned that the AI and data centre–driven stock market bubble will burst, as happened in the early 2000s with the dot-com collapse. “The wreckage on the other side is going to be extreme,” Kedrosky told CBC.
After Morgan Stanley released a set of bullish analyses about data centres earlier this year, the investment bank received, and then addressed, some pushback from its clients, the Financial Times reported last summer. “Calling back to the tech boom of the mid-to-late 90s, investors have been asking about the possibility that this investment cycle for data centres could be a bubble. While we agree that it is a lot of financing, very quickly, and in service of a technology that has yet to generate material revenues (GenAI), we believe there are a few important differentiating factors about this situation.” Those factors included a different financing environment and Morgan Stanley’s prediction that AI will finally become profitable this year.
The prospect of a bursting bubble, whether triggered by a gold rush mentality among investors or by the inability of electrical utilities to supply all the power data centres crave, has raised fears that communities like Loudoun will be left, at least for a while, with a legacy of half-finished data centres, or centres that can never be switched on because the firms responsible for building out the energy grid couldn’t keep up.
“The major stock indexes are at the mercy of these AI stocks,” observes Tyler Hamilton, senior director for climate at the MaRS Discovery District in Toronto and an expert on emerging clean technologies. “As you know, the stock market’s driven by psychology, and as soon as people go, ‘Okay, this is overhyped,’ then it can go in another direction. I personally equated this very much to the days when there was an overbuild of fibre, anticipating demand for the internet. You saw a lot of companies that just went on this massive build of fibre beside railroad tracks, and that whole market crashed.”
Environmentalists who watch the hockey stick growth in the data centre industry often find themselves pondering how these outsized structures can be constructed in ways that use electricity, water and space more efficiently, as do the data centre developers themselves. Hamilton says AI chip makers like Nvidia are actively looking to improve the energy efficiency of their products, so they cast off less heat as they process prompts and therefore operate more cost-effectively.
Cleantech start-ups, he adds, are now looking at ways to capture the waste heat that comes out of data centres, which is, for now, mainly shunted into lakes and rivers because water is used as a coolant. “Some data centres are locating in colder regions and just piping up the heat outdoors, but that’s obviously limited,” he says. Other approaches involve capturing the waste heat and using it to generate electricity that can offset the data centre’s own power needs. “That technology is still at the pilot scale, but it shows promise [to commercialize] in the 2030–35 time frame.”
Some municipalities are also looking to collect the waste heat from data centres and either store it for later energy use, or pump it into a district heating system that distributes energy to homes and commercial buildings. “There’s a company in Quebec called QScale that has been experimenting with the co-location of greenhouses with data centres,” Hamilton says, referring to a technique for capturing and recycling heat in order to offset the energy required by these indoor growing operations.
But architect Alvarez Doran points out that some very aggressive data centre developers, among them Elon Musk, have eschewed these kinds of climate-focused energy partnerships and instead have equipped their facilities with gas generators. Others have resorted to using repurposed jet engines to power their centres. “I think, inherently, they’re problematic pieces of infrastructure because of how much power they’re consuming and the kind of bubble that we’re seeing right now, how rapidly they’re getting built and the amount of investment going into them,” he says. “How much of that hardware is going to be obsolete in two or three years and then what happens to that whole facility?”
There’s an intriguing footnote to the emerging pace and scale of the AI data centre industry. For decades, computer scientists and tech companies have tracked a phenomenon known as “Moore’s Law,” the observation that the number of transistors that can be packed onto a chip doubles roughly every two years, making computers steadily more powerful even as their circuitry shrinks. Case in point: today’s smartphone is vastly more powerful than the room-sized mainframe computers of the 1970s.
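As a back-of-envelope illustration of that pace (the 50-year span and the strict two-year doubling interval are simplifying assumptions used here only to show the arithmetic):

$$2^{50/2} = 2^{25} \approx 3.4 \times 10^{7}$$

That is, roughly a 30-million-fold increase in the number of transistors that fit on a chip between the mid-1970s and today.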
Will Moore’s Law eventually apply to data centres, allowing hard-driving consortia such as Stargate to build them so they’re physically more compact, more energy-efficient and less thirsty? University of Toronto’s Vijaykumar says AI chip makers and data centre engineers are already working to streamline these operations. “The problem is that even though we’re making the technologies more efficient and smaller, we use them a lot more,” she says of the rapidly expanding universe of AI products. “The growth trajectory just enables more applications and more use cases, which generates more demands and then we still need more [data centres].”
The future, in other words, is coming into focus: with or without a major correction in the AI data centre bubble, cities and regions around the world will be increasingly ringed by clusters of these humming, power-hungry monoliths—the physical expression of our seemingly insatiable appetite for the strange fruits of the AI revolution.


