■ In this week's AI Lab: AI is changing how chips are designed and used. Plus, the importance of the open language model ecosystem, reasoning with physics, and a project generating realistic virtual humans.
AI may soon make it easier to design and program chips
Nvidia is the undisputed king of AI chips. But thanks to the AI it helped build, the champ could soon face growing competition.
Modern AI runs on Nvidia designs, a dynamic that has propelled the company to a market cap of well over $4 trillion. Each new generation of Nvidia chip allows companies to train more powerful AI models using hundreds or thousands of processors networked together inside vast data centers. One reason for Nvidia’s success is that it provides software to help program each new generation of chip. That expertise may soon be less of a differentiator.
A startup called Wafer is training AI models to do one of the most difficult and important jobs in AI—optimizing code so that it runs as efficiently as possible on a particular silicon chip.
Subscribe today to continue receiving my full newsletter each week. I've been writing about AI and related themes for over a decade (since neural networks were considered a dead end, in fact). I'm fascinated by how innovation occurs and how it affects the economy. And I want to understand the global picture, rather than just the world as viewed through the lens of Silicon Valley.
So, if you're keen to keep track of advances from the world of computing—AI, robotics, quantum, chipmaking, and more—that have the potential to transform the way we live ... this is the newsletter for you.