Hi
Welcome (back) to The Prompt. AI isn’t well understood, but we learn a lot in our work that can help. In this newsletter, we share some of those learnings with you. If you find them helpful, make sure you’re signed up for the next issue. You have choices here – to receive our weekly US newsletter, our monthly EU newsletter, or if you really want to load up, then both.
[Insight] Why the US and China get the data center dollars
The world is racing to build data centers that can generate ever more powerful AI models. We know increased computing power, coupled with more training data, will continue to accelerate AI capabilities – in turn, opening new possibilities for what people are able to achieve. AI scaling laws have held through multiple generations of large language models, and we expect this trajectory to continue.
OpenAI is in overdrive to expand our data center capacity. We’re constructing a massive data center campus in Abilene, Texas, as part of our $500 billion Stargate Project; scoping dozens of other potential sites across America; and looking for other investment opportunities around the world through our OpenAI for Countries initiative, which will help us export democratic AI.
All countries should be building AI infrastructure. It’s crucial to economic competitiveness and will create a range of economic benefits – including local jobs, technical know-how, and complementary businesses and innovations – across the AI value chain.
Despite this clear infrastructure imperative, AI-ready data centers and investment plans remain concentrated in two countries: the US and China. Yes, the European Union is discussing ways to build AI “gigafactories” across the bloc – and we’re looking to participate with European partners. Japan and the UK are building or greenlighting data centers, too. These are all important initiatives, but for the time being, we can still expect AI infrastructure investments to be concentrated in the US and China unless two factors change: energy prices and access to data.
Here’s why:
1. Data centers have big energy needs, and energy availability and pricing will play a critical role in guiding AI infrastructure investments. Industrial energy prices are currently around three times higher in the EU than in the US, and four times higher in the UK – factors former Italian Prime Minister Mario Draghi cited in his landmark report on European competitiveness.
2. The EU’s approach to data exists for deeply considered reasons, but the upshot is reduced odds that the EU becomes a center for AI development on par with the US and China. The US “fair use” doctrine has underpinned generations of American technology advances. This approach to copyright ensures new, transformative technologies can be built using existing works – as a US federal court held in June. Japan and Singapore have similar copyright protections for innovators, which is encouraging for future AI investments if their energy prices are competitive, too. The UK’s copyright rules permit only narrow research uses – a big reason why the largest AI labs currently aren’t developing models in the UK.
Across all of these jurisdictions, we can see a clear pattern emerging: the more complex the copyright rules are for AI development, the less data is available for new innovators, and the harder it is for that jurisdiction to attract large-scale AI infrastructure investments.
Allowing these differences to persist would be a shame. It would slow AI progress globally and widen countries’ competitiveness gap with the US – a gap which, in Europe’s case, is due to different rates of technology adoption, according to Dr. Draghi.
We know the formula: making the raw materials for AI more readily available will enable greater AI investment. Now’s the time to get the policy settings right and put infrastructure in place that will serve us all. – Adam Cohen, Head of Economic Policy
[Policy] The energy trifecta
“[I]t's hard to overstate how important energy is to the future here,” our CEO Sam Altman recently told a Senate committee. AI companies are moving fast to secure compute – us included. But sufficient compute requires sufficient energy to produce it. We see three paths to pursue at once:
1. Build, baby, build: Put in place federal and state regulatory frameworks that help, not hinder, energy project development. This will require streamlining permitting processes at both federal and state agencies, including FERC and the NRC, as we move to the next generation of energy supplies. States are also getting in the game by allowing off-the-public-grid and dedicated (or ring-fenced) power generation and by extending the life of nuclear facilities.
2. All of the above: Consider all forms of energy to meet this need. We will need solar, natural gas, and nuclear to meet growing energy demands and keep electricity affordable for citizens.
3. Always be innovating: Look to AI to modernize how we generate, distribute, and use energy. From improving grid stability to creating ever more efficient solar panels, AI could swiftly usher this promising energy source along. For example, less than 1/100th of 1 percent of the materials we know could be produced as next-generation photovoltaics – the means of turning light into electricity, as in solar panels – has been synthesized so far by humans working in labs. Imagine the myriad solar-panel options we could discover by deploying AI. AI can pave the way to greater power generation, efficiency, and safety. – John McCarrick, Head of Global Energy Policy
[Event] What investing in talent means to us
Energy and compute aren’t the only keys to building AI – people are another. Some eye-popping offers are being extended these days to a handful of terrifically talented researchers, including to folks at OpenAI. Some of these offers are coming with deadlines of just a few hours – literally “exploding offers” – or with restrictions on whether or how they can be discussed.
We believe that how a company treats and invests in talent reflects its values and character. At OpenAI, we invest in talent by cultivating it. Through our researcher residency program, for example, we offer 6-month terms to candidates from fields beyond AI, like physics and neuroscience – the kinds of fields where our tools are already being deployed, or soon will be.
We’re also cultivating talent across product, engineering, infrastructure, scaling, and safety – because it takes all of that, not only phenomenal researchers, to build an enduring AI company and credibly call yourself one.
“There is no established path for building toward artificial general intelligence, nor for collaborating and working with increasingly capable AI,” says Joaquin Quinonero Candela, head of recruiting at OpenAI. “In the absence of a blueprint, we don’t want or expect conventional thinking.”
So maybe you want to join us? On July 24, Joaquin will share more on his approach to recruiting in an OpenAI Forum talk, “Careers at the Frontier: Hiring the Future at OpenAI.” We’ve got 3,700 RSVPs so far, and we’d love to see you there, too.
[Prompt] Ask the disruptor about past disruptions
We’re convinced AI is going to change the world. But how does it compare with past technological disruptions? And what lessons can we learn from industrial automation and the Internet? OpenAI’s Lauren Oliphant, a member of our Go To Market team, suggests this prompt:
[About] OpenAI Academy
The Academy is OpenAI’s free online and in-person AI literacy training program for beginners through experts.
On Thursday in Utah, the Academy – in partnership with Talent Ready Utah, a state-backed workforce initiative – will hold a live, in-person session with the state’s workforce commission.
9:00 AM – 12:00 PM MDT on July 10
[Disclosure]
Graphics created by Base Three using ChatGPT.