Synopsys ramps up work around physical AI to help build adaptive machines capable of sensing and interacting with their physical environments
The latest addition to AI’s lexicon is physical AI, and it is already entering the mainstream.
You may have heard Nvidia beating the drum loudest, but now long-time partner Synopsys is carrying the torch forward.
Ahead of the AI India Summit, set to take place in New Delhi next week, Synopsys is doubling down on physical AI.
Physical AI in action
In an interview with CNBC TV18, senior VP of innovation Prith Banerjee, formerly CTO of Ansys, said the company is zeroing in on this new subset of AI following its acquisition of Ansys, targeting industries like high-tech, automotive, aerospace, healthcare, and energy.
Banerjee said that Synopsys is specifically looking at silicon-based, software-enabled, AI-driven systems that power these sectors — that is, robots, self-driving vehicles, cameras, and so on.
“When we look at cars these days, you have a software-defined vehicle…the car is becoming intelligent with the software that runs on the electronic chips that drive these things,” he said.
Like digital twins, physical AI is a disruptive force powering what is now popularly known as Industry 4.0. Its purpose is to take autonomous systems to the next level, where they can sense and interact with the world more intelligently while requiring minimal human assistance or intervention. Examples include rule-based, training-based, and context-based robotics — each designed to perform a specific combination of tasks at different levels of complexity and volume.
Many of these systems are in production today, performing varied tasks like running power grid operations, assisting in surgeries, navigating city streets, and working alongside humans on factory floors.
The push is most tangible in the manufacturing sector, where early adopters like Amazon and Foxconn, which operate massive robotic fleets, are seeing real benefits. According to a paper published by the World Economic Forum, Amazon reported 25% faster delivery times and 25% greater efficiency, while Foxconn achieved a 40% reduction in deployment times and a 15% cut in operational costs.
But upgrading robots programmed for basic lift-and-shift work so they can understand and reason like intelligent systems is no small feat. It requires deep technological foundations.
Banerjee emphasized that Synopsys is moving past the large language models (LLMs) that power chatbots to more sophisticated foundational AI models that “understand the physics of the world around us.”
Agent engineers, the next wave of AI
He said that Synopsys is specifically investing in agentic AI for designing the next generation of solutions. As self-driving systems become increasingly complex, companies are facing skill shortages that are opening a demand-supply gap.
Banerjee noted that building chips manually at the volume required to power swarms of intelligent systems across factory floors would require billions of engineers.
Synopsys developed AgentEngineers to get around this problem. Unveiled in March 2025, these are AI agents designed to work as hands-on builders with their human counterparts.
While an army of bot engineers sounds scary, Banerjee was quick to add that the agents are meant to complement human engineers, working hand in hand with them to accelerate manufacturing rather than replacing them.
Synopsys’ physical AI play makes sense now more than ever. Ansys brings a sizable portfolio of physics-based simulation solutions, which not only reinforces Synopsys’ leadership in the electronic design automation and IP markets but also opens doors to other sectors where advanced AI systems are increasingly being adopted.
Synopsys will showcase its AI-augmented simulation solutions at the AI India Summit next week.
