Separately, J&J announced on the same day its own partnership with Nvidia, relying on the AI company’s foundation models to create simulated environments in which surgical teams can plan kidney stone procedures. J&J says this application of so-called “physical AI” will streamline the process of mapping out procedures, make it easier to train doctors, and result in more consistent and better clinical outcomes for patients.
“There’s only so many hours in the day,” says Neda Cvijetic, senior vice president and global head of robotics and digital research and development for J&J’s MedTech division. “Sometimes it’s super helpful to see that difficult case in a very realistic, simulated environment first, to help best prepare.”
“The business leaders have less patience with generic platforms,” says Zurikiya. “They’ll want something that’s customized to what they need.”
“We don’t even just want the life sciences knowledge model,” says Rau. “We want one that knows Lilly.”
Lilly’s Chief AI Officer Thomas Fuchs adds that the greatest AI advancements will come from the combination of the company’s trove of proprietary data, the compute investments Lilly is making to train large foundation models, and then deploying that tech to thousands of chemists and biologists, who can use those AI tools to make new discoveries.
Fuchs says that precise science can’t be replicated by every large pharmaceutical company. That would be like an astronomer relying on a telescope sold by a big-box retailer. “We are building a space-based telescope,” says Fuchs.
Kimberly Powell, a VP of healthcare at Nvidia who worked on the J&J surgical AI project, touts the potential for physical AI to combine advances in computer vision and large language models to turn AI systems into physical workers.
“There is a future goal of how we go from robotic-assisted surgery to robotic surgery, where the robot is actually taking some action on its own,” says Powell. “We’re laying all the groundwork to do that.”
John Kell



