Recently the New York Times reported the following:
Why Don’t Data Centers Use More Green Energy? (1/2)
Reliance on fossil fuels is almost unavoidable — at least for now.
The New York Times - By Evan Gorelick
Sept. 27, 2025, 8:00 a.m. ET
It’s been a big week for A.I. data centers. That means it’s
also been a big week for coal and natural gas.
Nvidia this week announced a $100 billion investment to support OpenAI’s enormous build-out of data centers that use its chips. The next day, OpenAI said it had signed deals with SoftBank and Oracle to build five new data centers as part of the Stargate Project, a $500 billion plan for A.I. infrastructure. (The three companies unveiled it at the White House back in January.)
The announcements are the latest in a global push to speed the construction of A.I. data centers. OpenAI, Amazon, Google, Meta and Microsoft are together spending more than $325 billion on them by the end of the year. To stay on the bleeding edge, the companies want the latest processors, cooling systems, facilities — all running 24/7 on mind-bending quanta of electricity.
In the U.S., more than half of that power is coming from fossil fuels.
President Trump, who called green energy a “scam” at the U.N. General Assembly this week, has enthusiastically endorsed natural gas, coal and oil. He has also subsidized them. As part of his official A.I. plan, he pledged to scrap “radical climate dogma and bureaucratic red tape” and fast-track fossil fuel projects instead.
But there are reasons beyond politics that help explain why smog-spewing fossil fuels have become the go-to power source for futuristic data centers. The pairing is almost unavoidable — at least for now.
Renewables
Sprawling solar farms, wind farms and hydroelectric dams are the planet's best energy options, and usually the cheapest. Their economics have made them the fastest-growing source of power for data centers worldwide.
But renewables often can’t shoulder the load alone, despite being a major part of the A.I. power plan. That’s because servers hum and whir around the clock — not just when the sun is up or the wind is blowing. They demand a constant, stable flow of electricity. If power falters, even for a few seconds, companies lose thousands of dollars, sometimes more.
There’s a fix: Companies can pair solar and wind farms with massive batteries that store power and then release it in a steady stream. But storing energy that way is relatively pricey and may still fall short of providing the nonstop energy that data centers need. “Batteries are a great way to shift daytime electricity to evening electricity — but not a great way to shift July electricity to January electricity,” said Matthew Bunn, a professor at Harvard who studies energy policy. So even the greenest facilities rely on fossil fuels or the local grid for backup, he told DealBook.
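To make that July-versus-January point concrete, here is a rough back-of-envelope sketch. It is not from the article; the 1-gigawatt load, the 12-hour overnight window, and the assumed winter shortfall are illustrative numbers chosen only to show the scale gap between daily and seasonal storage.

```python
# Back-of-envelope sketch: batteries can shift hours far more easily than seasons.
# All figures below are illustrative assumptions, not numbers from the article.

LOAD_GW = 1.0          # hypothetical data-center load, running around the clock

# Daily shifting: store enough daytime solar to cover the hours after sunset.
OVERNIGHT_HOURS = 12
daily_storage_gwh = LOAD_GW * OVERNIGHT_HOURS                         # 12 GWh

# Seasonal shifting: cover an assumed 30% solar shortfall across a 90-day winter.
WINTER_DAYS = 90
WINTER_SHORTFALL = 0.30
seasonal_storage_gwh = LOAD_GW * 24 * WINTER_DAYS * WINTER_SHORTFALL  # 648 GWh

print(f"Daily shift needs    ~{daily_storage_gwh:.0f} GWh of storage")
print(f"Seasonal shift needs ~{seasonal_storage_gwh:.0f} GWh of storage")
print(f"That is roughly {seasonal_storage_gwh / daily_storage_gwh:.0f}x more battery capacity")
```

Under these assumptions the seasonal job needs about fifty times the battery capacity of the nightly one, which is the gap Bunn is describing.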
Another challenge: The largest data-center campuses will consume several gigawatts of power. (As part of this week's deals, OpenAI agreed to use Nvidia chips in data centers totaling at least 10 gigawatts.) To generate one gigawatt continuously, a renewable plant needs roughly 12.5 million solar panels, enough to cover nearly 5,000 football fields. Wind turbines need even more room. Many data centers sit near towns that don't have that kind of space.
(to be continued)
為何數據中心不更多地使用綠色能源? (1/2)
對化石燃料的依賴幾乎不可避免 - 至少目前是如此。
對人工智能數據中心來說,這一周是意義非凡的。這意味著,對於煤炭和天然氣來說,這也是一個意義非凡的一週。
英偉達本週宣布投資1,000億美元,支持OpenAI使用其晶片大規模建設數據中心。第二天,OpenAI宣佈已與軟銀和甲骨文簽署協議,將建造五個新的數據中心,作為「星際之門」計劃的一部分。 「星際之門」計劃是一項耗資5,000億美元的人工智能基礎設施計劃。 (這三家公司於今年1月在白宮公佈了該計劃。)
這些聲明是全球加速建設人工智能數據中心的最新舉措。截至今年底,OpenAI、亞馬遜、Google、Meta 和微軟在這些數據中心上的總支出將超過 3,250 億美元。為了保持領先地位,這些公司需要最新的處理器、冷卻系統和設施 - 所有這些都需要全天候運行,耗電量驚人。
在美國,超過一半的電力來自化石燃料。
特朗普總統本週在聯合國大會上稱綠色能源是「騙局」;他一直熱情地支持天然氣、煤炭和石油,並對這些能源提供補貼。作為其官方人工智能計劃的一部分,他承諾將廢除「激進的氣候教條和官僚主義的繁文縟節」,轉而加快化石燃料項目的推進。
但除了政治因素之外,還有一些其他原因可以解釋為什麼排放著霧霾的化石燃料已成為未來數據中心的首選能源。這種組合幾乎是不可避免的 - 至少目前是如此。
再生能源
龐大的太陽能發電場、風力發電場和水力發電大壩是地球最佳的能源選擇,通常也是最便宜的。它們的經濟優勢使其成為全球數據中心成長最快的電力來源。
然而,儘管再生能源是人工智能電力規劃的重要組成部分,但它們往往無法獨自承擔全部負荷。這是因為服務器晝夜不停地運轉 - 不僅僅是在有太陽或有風的時候。它們需要持續穩定的電力供應。即使只有幾秒鐘的斷電,公司也會損失數千美元,有時甚至更多。
有一個解決辦法:公司可以將太陽能發電場和風力發電場與大型電池配對,這些電池可以儲存電能,然後穩定地釋放。但這種儲能方式成本相對較高,而且可能仍無法滿足數據中心不間斷供電的需求。哈佛大學研究能源政策的教授 Matthew Bunn 表示:「電池是把白天的電力轉移到晚上的好方法,但不是把七月的電力轉移到一月的好方法」。因此,他告訴 DealBook,即使是最環保的設施也依賴化石燃料或當地電網作為備用電源。
另一個挑戰是:最大的數據中心園區將消耗數千兆瓦的電力。(作為本週協議的一部分,OpenAI 同意在至少 10 千兆瓦的數據中心中使用英偉達晶片。)為了持續生產一千兆瓦的電力,一座可再生能源廠就需要大約 1,250 萬塊太陽能電池板 - 足以覆蓋近 5,000 個足球場。風力渦輪機需要更大的空間。許多靠近城鎮的數據中心沒有這樣的空間。
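The 12.5 million panels and roughly 5,000 football fields cited above can be reproduced with a rough calculation. The panel rating, capacity factor, panel area, and field area below are illustrative assumptions of mine, not figures from the article.

```python
# Rough check of the "12.5 million panels / nearly 5,000 football fields" figures.
# Panel rating, capacity factor, and area numbers are illustrative assumptions.

TARGET_WATTS = 1e9          # one gigawatt of continuous output

PANEL_RATED_WATTS = 400     # assumed nameplate rating of one panel
CAPACITY_FACTOR = 0.20      # assumed average output vs. nameplate (night, clouds, angle)

avg_watts_per_panel = PANEL_RATED_WATTS * CAPACITY_FACTOR      # 80 W on average
panels_needed = TARGET_WATTS / avg_watts_per_panel             # 12.5 million panels

PANEL_AREA_M2 = 2.0         # assumed area of one panel
FIELD_AREA_M2 = 5350        # assumed area of an American football field with end zones

total_area_m2 = panels_needed * PANEL_AREA_M2
fields_covered = total_area_m2 / FIELD_AREA_M2                 # about 4,700 fields

print(f"Panels needed:   {panels_needed / 1e6:.1f} million")
print(f"Area covered:    {total_area_m2 / 1e6:.0f} km^2")
print(f"Football fields: ~{fields_covered:,.0f}")
```

Real solar farms also need spacing between rows, access roads and inverters, so the actual footprint is larger; the point is only that the article's order of magnitude follows from ordinary panel specifications.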
(待續)