Synopsys ramps up work around physical AI to help build adaptive machines capable of sensing and interacting with physical environments
The latest addition to AI’s lexicon, physical AI, is already entering mainstream vocabulary.
You may have heard Nvidia beating the drum loudest, but now long-time partner Synopsys is carrying the torch forward.
Ahead of the AI India Summit, set to take place in New Delhi next week, Synopsys is doubling down on physical AI.
Physical AI in action
In an interview with CNBC TV18, Prith Banerjee, senior VP of innovation and formerly CTO of Ansys, said that the company is zeroing in on this new subset of AI following its acquisition of Ansys, targeting industries like high-tech, automotive, aerospace, healthcare, and energy.
Banerjee said that Synopsys is specifically looking at the silicon-based, software-enabled, AI-driven systems that power these sectors: robots, self-driving vehicles, cameras, and so on.
“When we look at cars these days, you have a software-defined vehicle…the car is becoming intelligent with the software that runs on the electronic chips that drive these things,” he said.
Like digital twins, physical AI is a disruptive force powering what is now popularly known as Industry 4.0. Its purpose is to take autonomous systems to the next level, where they can sense and interact with the world more intelligently and require minimal human assistance or intervention. Examples of physical AI-powered intelligent systems include rule-based, training-based, and context-based robotics, each designed to perform a specific combination of tasks at different levels of complexity and volume.
Many of these systems are in production today, operating power grids, assisting surgeries, navigating city streets, and working alongside humans on factory floors.
The push is already tangible across the manufacturing sector, where early adopters like Amazon and Foxconn, which operate massive robotic fleets, are seeing real benefits. According to a paper published by the World Economic Forum, Amazon reported 25% faster delivery times and 25% greater efficiency, while Foxconn achieved a 40% reduction in deployment times and a 15% cut in operational costs.
But getting robots programmed for basic lift-and-shift tasks to understand and reason like intelligent systems is no small upgrade. It’s a feat that requires deep technological foundations.
Banerjee emphasized that Synopsys is moving past the large language models (LLMs) that power chatbots to more sophisticated foundational AI models that “understand the physics of the world around us.”
Agent engineers, the next wave of AI
He said that Synopsys is specifically investing in agentic AI to design the next generation of solutions. As self-driving systems have grown increasingly complex, companies are facing skill shortages that are widening the gap between demand and supply of engineering talent.
Banerjee noted that manually building chips at the volume required to power swarms of intelligent systems across factory floors would require billions of engineers.
Synopsys is developing “agent engineers” to get around the problem. These AI agents will work as hands-on builders with their human counterparts, he said.
While an army of bot engineers may sound alarming, Banerjee was quick to add that the agents will complement human engineers, working hand in hand to accelerate manufacturing rather than replacing them.
The move makes sense after the Ansys acquisition, which brought a sizable portfolio of physics-based simulation solutions that can not only reinforce Synopsys’ leadership in the electronic design automation and IP markets, but also open the door to other sectors where advanced AI systems are being adopted.
Synopsys will showcase its AI-augmented simulation solutions at the AI India Summit next week.
