




From the way we work to the way we stay healthy, the AI revolution has the potential to dramatically improve every aspect of our lives.
But if the United States and other nations are to shape the trajectory of this technology, they cannot rely exclusively on what private companies develop.
University research has driven the development of AI and laid the foundation for the AI business boom we are experiencing today. Just as importantly, the leaders of those pioneering AI companies could not have been cultivated without academia.
Today, however, large foundation models such as ChatGPT, Claude, and Gemini demand computing power and training data on a scale far more readily available to private companies, which has pushed industry to the forefront of the field.
Empowering our universities to pursue AI research at the frontier alongside private companies will be key to realizing the field's long-term potential. Doing so will require redressing the imbalance between academia and industry in access to computing resources.
The greatest strength of academia is its ability to conduct long-term research programs and fundamental research that pushes the boundaries of knowledge. The freedom to explore and experiment with bold, cutting-edge theories will lead to new discoveries and innovations, laying the groundwork for future technological advancements.
While the tools underpinning large foundation models are increasingly accessible, many fundamental questions remain unanswered, because the models themselves largely remain a "black box."
For example, we know that AI models tend to produce "hallucinations," but the precise causes are still poorly understood.
Universities can plan for a future where AI truly benefits humanity because they are not subject to the pressures of commercial and market forces. Expanding academia's access to resources will make AI research and its applications more inclusive.
U.S. President Joe Biden issued an executive order on AI in October 2023 that authorized the National Artificial Intelligence Research Resource (NAIRR) pilot program. This is a step in the right direction.
By partnering with private industry, NAIRR will create a shared research infrastructure for AI.
If it realizes its full potential, NAIRR could become a centerpiece of efforts to give academic researchers more efficient access to GPU computing power. However, even a well-funded NAIRR risks spreading its resources too thinly across many small allocations.
As some have suggested, this fragmentation could be mitigated by concentrating NAIRR's support on a limited number of flagship projects.
But we should also look for more creative solutions to keep enough GPU resources in the hands of academia. Here are some of our initial thoughts:
First, the United States should upgrade its supercomputing infrastructure with large-scale GPU clusters and open it to AI research, enabling academic researchers to collaborate with the national laboratories on the field's major challenges.
Second, the U.S. government should explore ways to reduce the cost of high-end GPUs for academic institutions, for example through financial assistance such as grants or R&D tax credits.
New York has launched several initiatives along these lines, making universities key partners in the state's AI development. That model should be emulated across the United States.
Finally, export control restrictions may leave U.S. chipmakers with excess inventories of high-end AI chips. In such cases, the government could purchase these chips and distribute them to universities and other academic institutions.
These actions could spur a surge of AI research and innovation in academia. University researchers have no shortage of ambitious and diverse ideas, but they are often stopped in their tracks by a lack of resources.
Providing universities with adequate computing resources will allow their work to complement research in the private sector.
As a result, academia can become an indispensable hub for technological advancement, facilitating interdisciplinary collaboration, long-term research, training the next generation of AI talent, and promoting ethical innovation.
Historically, similar investments have played a key role in innovation. The post-World War II United States forged a symbiotic relationship between government, academia, and industry that took humans to the moon, planted the seeds of Silicon Valley, and created the Internet.
We need to ensure that academia maintains a strong position in the innovation ecosystem, and investing in its computing power is a necessary first step.