![Rethinking Data Center Demand: The Future of AI, Energy Consumption, and Load Projections Artwork](https://www.buzzsprout.com/rails/active_storage/representations/redirect/eyJfcmFpbHMiOnsibWVzc2FnZSI6IkJBaHBCS2JkTndjPSIsImV4cCI6bnVsbCwicHVyIjoiYmxvYl9pZCJ9fQ==--5288a34c5d72ec21f8456515587a4e1435256a24/eyJfcmFpbHMiOnsibWVzc2FnZSI6IkJBaDdDVG9MWm05eWJXRjBPZ2hxY0djNkUzSmxjMmw2WlY5MGIxOW1hV3hzV3docEFsZ0NhUUpZQW5zR09nbGpjbTl3T2d0alpXNTBjbVU2Q25OaGRtVnlld1k2REhGMVlXeHBkSGxwUVRvUVkyOXNiM1Z5YzNCaFkyVkpJZ2x6Y21kaUJqb0dSVlE9IiwiZXhwIjpudWxsLCJwdXIiOiJ2YXJpYXRpb24ifX0=--1924d851274c06c8fa0acdfeffb43489fc4a7fcc/Buzzsprout%20Thumbnail%20-%20Energy%20Future.png)
Energy Future: Powering Tomorrow’s Cleaner World
"Energy Future: Powering Tomorrow's Cleaner World" invites listeners on a journey through the dynamic realm of energy transformation and sustainability. Delve into the latest innovations, trends, and challenges reshaping the global energy landscape as we strive for a cleaner, more sustainable tomorrow. From renewable energy sources like solar and wind to cutting-edge technologies such as energy storage and smart grids, this podcast explores the diverse pathways toward a greener future. Join industry experts, thought leaders, and advocates as they share insights, perspectives, and strategies driving the transition to a more sustainable energy paradigm. Whether discussing policy initiatives, technological advancements, or community-driven efforts, this podcast illuminates the opportunities and complexities of powering a cleaner, brighter world for future generations. Tune in to discover how we can collectively shape the energy future and pave the way for a cleaner, more sustainable world.
Rethinking Data Center Demand: The Future of AI, Energy Consumption, and Load Projections
Can AI revolutionize the way we think about energy consumption and power grid demands? Venture into the world of cutting-edge AI technology with us as we examine the projected 125,000 megawatt surge in data center electricity demand, driven not just by traditional needs but also by the expanding crypto universe. We uncover the breakthrough of DeepSeek, China's open-source AI model that claims to match the performance of leading proprietary models at a fraction of the energy and cost. This development shook Wall Street, as tech giants like NVIDIA faced significant market value losses, sparking intense scrutiny over the legitimacy of DeepSeek's extraordinary claims.
Join us for a thorough exploration of DeepSeek's potential to change the AI landscape and its implications for future energy consumption. We explore whether this could lead to Jevons paradox, where cheaper computational capabilities result in increased usage. Despite the mixed reviews and questions about its creative task limitations, DeepSeek's impact is undeniable, particularly in inference and real-time decision-making. We unpack these concepts, sharing insights into how AI's ability to recognize patterns is reshaping real-world applications. This episode offers a captivating look at the intersection of technology and energy, a must-listen for anyone curious about the future of AI.
Just when you get comfortable thinking you know something, you find out that maybe you don't. In a series of videos late last year, I addressed the issue of exploding data center electricity demand and the enormous number of applications utilities had received. In recent months I've been tracking these in a spreadsheet based on various articles and releases in the trade press, and thus far I've got over 125,000 megawatts of new projected data center demand. Not all of this demand is AI-related. Some new load will serve your typical data center applications, while some may even be serving crypto loads, now that crypto's in fashion in Washington. But there have been some subtle signs that perhaps this new load might not be as big as headlines suggest. Skepticism was already the word of the day before news came out of China last week that an open-source AI large language model, an LLM there called DeepSeek, was nearly as good as some of the proprietary models being built here in the US by some of the biggest players in the space. The news that mattered most to markets was that it was not only competitive but much cheaper, using fewer chips and far less power. DeepSeek reported that its model took only two months and less than six million dollars to build using the less advanced and less costly NVIDIA H800 chip. The one-day carnage on Wall Street was amazing to behold. Leading chip maker NVIDIA's share price fell off a cliff, losing 17% and $600 billion (with a B) of market value. Modular nuclear and fuel cell stocks got savaged as well, shedding up to 25% of their stock prices. They quickly rebounded, though, as over the ensuing week additional news filtered out that perhaps those numbers weren't quite so reliable, coupled with accusations that there'd been some so-called distilling, i.e., transferring knowledge from OpenAI to DeepSeek, or at least some reverse engineering from other AI models. So it wasn't like DeepSeek was built from scratch.
Speaker 1:Now come three questions related to the grid and future power consumption. First, how much of DeepSeek's claims will eventually prove to be true, both in terms of the time and resources to build the LLM and the implications in terms of power, and what the other large language models essentially need for chips and power as they've been brute-forcing their way through their trainings? Second, is the model really that good? Third, if one can really build AI capabilities more cheaply, does that in fact lead to Jevons paradox, i.e., the less expensive computational capacity is, the more of it we'll use? As far as the first claim, that remains to be verified. However, if it's remotely true, it could dramatically change how much that current energy-intensive, brute-force, huge-chip approach is applied to LLM model development in the future. That would bring energy consumption figures way down, though nobody knows by quite how much. This is still all too new. The second claim also may not stand up to further scrutiny.
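The Jevons paradox question can be made concrete with a toy calculation. All of the numbers below are hypothetical, chosen only to illustrate the mechanism: if efficiency gains cut the energy per query tenfold but the lower cost drives usage up more than tenfold, total energy consumption still rises.

```python
# Toy illustration of Jevons paradox -- hypothetical numbers only.
def total_energy(queries_per_day, energy_per_query_wh):
    """Total daily energy in watt-hours."""
    return queries_per_day * energy_per_query_wh

# Before an efficiency breakthrough: costly queries, modest usage.
before = total_energy(queries_per_day=1_000_000, energy_per_query_wh=3.0)

# After: each query takes 10x less energy, but demand grows 20x
# because the service is now cheap enough for far more use cases.
after = total_energy(queries_per_day=20_000_000, energy_per_query_wh=0.3)

print(before, after)  # per-query energy fell 10x, yet total energy doubled
```

The point of the sketch is only that per-unit efficiency and total consumption can move in opposite directions; whether real LLM demand is elastic enough for that to happen is exactly the open question.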
Speaker 1:As noted, some anecdotal evidence I've seen, and others have seen, suggests that DeepSeek is not really that good at answering some simple questions, and OpenAI has made some claims that need to be verified. But what is true? The model is pretty good. A New York Times tech reporter who spent half of the Monday a week ago playing with the tech came away impressed, noting that it compared well with OpenAI's ChatGPT and Anthropic's Claude. It solved some complex math, physics and reasoning problems at twice the speed of ChatGPT, and its responses to computer programming questions were "as in-depth and speedy" as its competitors'. It wasn't quite so good at composing poetry, planning vacations or coming up with recipes, but so what, if it's almost as good at a fraction of the price? It looks like there's a there there. The next question then comes down to use, or so-called inference. DeepSeek is free, and it was the most frequently downloaded app two weeks ago.
Speaker 1:As defined by Perplexity, ai quote inference involves using the patterns and relationships learned during training to solve real-world tasks without further learning. For instance, a self-driving car recognizing a stop sign on an unfamiliar road is an example of inference. Unquote provision of that response to my query that I did with perplexity was also an example of inference. See what I did there. Inference can also help with real-time decision making and involves a number of steps First, data preparation. Second, model loading. Third, processing and prediction. Fourth, output generation to give you the information or the results you seek.
Speaker 1:Inference is very energy-intensive, so if we use less energy on LLMs but they get cheaper and more ubiquitous, what does that mean for energy consumption? In the arena of inference, we're so early into the adaptation and adoption of these tools that nobody knows. But as far as the electricity required, we could be in the midst of a typical Gartner hype cycle, such as the one we experienced in the late-90s dot-com frenzy, when Pets.com's sock puppet was going to dominate the dog food industry. Admittedly, 25% of Dominion Energy's demand in Virginia is already dedicated to serving data centers, and AI will clearly have many uses, some of which we can only imagine today. So we'll certainly see more energy use, but the LLMs may run into various limits with declining economies of scale that would eventually reduce expected demand. There'll also be substantial gains in processing and cooling efficiencies that drive energy requirements down, and we will probably see those results in the years to come. Right now, we're still in the very early days of throwing money, a first version of chips, and data at the opportunity, but checkbooks and coffers are not unlimited, and a focus on efficiency will inevitably follow; it always does. There will also be companies that don't survive the race, which will probably be dominated by only a few deep-pocketed participants, although scrappy, low-budget startups like DeepSeek suggest that perhaps an oligopoly isn't inevitable. But if this does go the same way the search engine race did, we'll be left with only a small number of well-resourced players, and this LLM quest may yield similar results, with most companies failing or being consolidated. And if you don't believe me, you can go ask Jeeves.
Speaker 1:There's also a big issue related to these headline demand numbers. The data companies may be filing many more applications with utilities for supply than they intend to actually develop, because of the way the process for interconnection with utilities actually works. Only a small number of utilities have rigorous procedures for evaluating the applications to ensure they're likely to get to physical service. The best ones, like seasoned veteran Dominion Energy, require proof of control of land, a financial commitment from the data company to support required engineering studies, and signature of a construction letter of authorization obligating the applicant to pay for all project-related expenditures, regardless of whether the project breaks ground. Only then does an electric service agreement, an ESA, get signed that makes its way into the forecast. A review of various forecasts in other parts of the country demonstrates that this same level of rigor is not routinely applied. Thus it's quite likely that data companies are submitting multiple interconnection requests and utilities are over-reporting the capacity numbers.
Speaker 1:Many data companies are likely doing what you and I would do. If we needed lots of power as fast as possible, we'd logically submit multiple applications to numerous utilities with the hope that at least some of those would get to yes. It's not possible to gain insight into exactly what's happening at any point in time, since the industry's competitiveness demands a high degree of confidentiality, but it's very likely there are numerous placeholder phantom requests. The analog on the bulk power supply side of the industry may be instructive, where over 10,000 generation projects wait in transmission interconnection queues and, if recent history is a guide, fewer than 20% of those endeavors will actually get built. If utilities further tighten up their load interconnection requirements and implement more rigorous procedures that require higher upfront financial commitments, we may get a better sense as to how many real applications are out there.
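If the generation-queue attrition rate carried over to data center load requests, a big if, offered purely as a back-of-envelope illustration, the headline figure would shrink dramatically:

```python
# Back-of-envelope only: apply the sub-20% generation-queue completion
# rate to the ~125,000 MW of announced data center demand. Load
# requests may well behave differently than generation projects, so
# this is an analogy, not a forecast.
announced_mw = 125_000       # projected new data center demand, in MW
completion_rate = 0.20       # fewer than 20% of queued generation gets built

likely_mw = announced_mw * completion_rate
print(likely_mw)  # roughly 25,000 MW actually served, if the analogy holds
```

Even this crude cut suggests why phantom requests matter: a utility planning for the headline number could overbuild by a factor of several.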
Speaker 1:It's clear that AI has real value to society, and we are now beginning to see some use cases emerge. It's also clear we're in the very early days, with rapidly evolving technologies and business models and many unanswered questions. However, getting past the current hype cycle will take some time. We won't know the full implications until we start to see some projects proceed while others are canceled. And if you don't believe me, ask Perplexity AI; it tells me, "several factors suggest that only a fraction of the proposed projects will likely be completed." Amen to that. Thanks for watching, and we'll see you again, hopefully next week.