"Before you build an AI Factory, you build the digital twin."
– Jensen Huang, NVIDIA CEO
Last week at NVIDIA GTC 2025, the tech world got a glimpse into the future of AI, digital twins, and accelerated computing. Jensen Huang’s keynote set the stage for a new era where AI and simulation are deeply intertwined—one where digital twins are not just useful but essential for building the next generation of AI systems.
Working for a company that specializes in digital twins and AI-driven solutions, I found that Huang's statement captures our vision: digital twins are the foundation for industrial AI and next-level decision-making.
Here are some of my biggest takeaways from the event.
Goodbye renderings, welcome operational digital twins
Getting back to Huang’s keynote, he emphasized that before industries can fully harness AI Factories (large-scale AI training and inference systems), they must first create accurate, real-time digital twins of their operations. “Operational Digital Twin” was a term that our team found useful. Having worked with photorealistic digital twins, I’ve learned that some people think they are just pretty renderings for marketing. Today they are much more: a tool that serves the full lifecycle, from planning to future simulations.
This approach enables businesses to optimize processes in a risk-free virtual environment, train AI models on synthetic data before real-world deployment, and accelerate automation in the manufacturing, logistics, and energy sectors.
Dish-washing humanoids and Disney robots
A standout theme at GTC was Physical AI, the convergence of AI and robotics in real-world applications. NVIDIA introduced advancements in AI-powered robots that learn from digital twins before being deployed in the field, healthcare AI models trained in virtual patient environments, and smart infrastructure where digital twins predict and prevent system failures.
Strolling down the expo hall at the San Jose Convention Center, it became clear that robots and humanoids are making their way into our everyday lives. One of the most impressive humanoids was NEO Gamma from the Norwegian company 1X, whose human-like movements (based on GR00T N1) were astonishingly real. NEO Gamma was casually washing dishes and watering plants. Clever, even though getting used to having him in my living room would probably take some time.
NVIDIA also announced a collaboration with Disney Research and Google DeepMind to launch Newton, an open-source physics engine that helps robots learn complex tasks with greater precision. Disney will be among the first to use Newton to enhance its robotic character platform, including the Star Wars-inspired “BDX” droids, which even joined Jensen Huang on stage during his keynote. With Disney involved, I’m sure we’ll soon see pet-like furry robots that respond to our emotions.

Younite Team at NVIDIA Headquarters, from left to right: Sami Heinonen, John McGarry, Anne Eskola, Laura Olin
What is the role of a human?
I was also lucky enough to present a case, even though the competition this year was tough. My presentation was about Virtual Stage, a live performance planning tool based on digital twin technology that we have created with the Finnish National Opera and Ballet.
My final statement was: when AI takes over the daily routines, art still belongs to humans. We are not yet at a point where AI could independently take care of everything. And for some things, such as dance or opera, hopefully we never will be. Those will always require a human: an interpretation based on real emotions, memories, and life experiences that are beyond what a machine can ever do.
If you want to watch GTC presentations, you can register for free and click here.
Link to my presentation (at 18:00 minutes)
Laura Olin, Chief of Staff, Younite