Advanced Computing in the Age of AI | Thursday, October 29, 2020

AI, Big Data Propelling Chip Design 

AI, and the big data generating its algorithms, are transforming chip design in unexpected ways, accelerating fundamental tasks like finding the optimal tradeoff between performance and power consumption.

“Chips can help produce better chips” in terms of frequency, power and other performance design parameters, Karl Freund of Moor Insights & Strategy told this week’s AI Hardware Summit. For a range of AI applications spanning automotive, networking and AI acceleration, design objectives were met between 84 and 89 percent faster using only a single engineer, according to a recent study by chip design leader Synopsys.

“These results… are game-changing,” Freund said. In one example, an AI algorithm placed transistors from different logic blocks within an IC design “in completely unintuitive manners,” the analyst noted. The results: better frequency, lower power and reduced chip area.

“With the advent of AI and big data, this gives us a new dimension, a new arsenal in our weaponry,” added Arun Venkatachar, vice president of AI and central engineering at Synopsys. “It’s really the data that’s generating the algorithms now, rather than the other way around.”

The tool vendor is using AI to improve simulation, emulation, chip stacking and formal verification as well as for design synthesis before transferring IC designs to a fab.

AI is being applied “all the way through the [design] food chain, trying to get optimal solutions for these complex problems,” Venkatachar said.

Synopsys recently unveiled an AI-based tool called Design Space Optimization, billed as the first autonomous AI application for chip design. The tool is touted as identifying the best design tradeoffs between power and performance, reducing that process from months to days. Synopsys said the tool is now in production with leading semiconductor manufacturers.
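The kind of search such a tool automates can be illustrated with a toy example: given candidate design configurations, each with an estimated performance and power, keep only the Pareto-optimal tradeoffs, i.e. those not beaten on both axes by another candidate. This is a generic sketch of design-space exploration, not Synopsys's actual algorithm, and all the design names and numbers below are hypothetical.

```python
# Toy sketch of design-space exploration: find the Pareto-optimal
# power/performance tradeoffs among candidate chip configurations.
# (Hypothetical data; not Synopsys's Design Space Optimization tool.)

def pareto_front(candidates):
    """Keep configurations not dominated by any other candidate.

    One config dominates another if it has higher-or-equal performance
    AND lower-or-equal power, with at least one strict improvement.
    """
    front = []
    for name, perf, power in candidates:
        dominated = any(
            p2 >= perf and w2 <= power and (p2 > perf or w2 < power)
            for _, p2, w2 in candidates
        )
        if not dominated:
            front.append((name, perf, power))
    return front

# Hypothetical candidate designs: (name, performance in GHz, power in W)
designs = [
    ("A", 3.2, 15.0),
    ("B", 3.0, 10.0),   # on the front: slower than A, but less power
    ("C", 2.8, 12.0),   # dominated by B: slower AND hungrier
    ("D", 3.5, 22.0),   # on the front: fastest overall
]

print(pareto_front(designs))  # A, B and D survive; C is dominated
```

In a real flow the candidate list would be enormous and each evaluation expensive, which is why an AI-guided search over this space, rather than exhaustive enumeration, is the selling point.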

Among them is Intel, where engineer Fadi Aboud said the chip maker has applied AI across its design domains, spanning analysis, debugging and IC optimization. Applications include model creation and flow prediction along with faster design simulations.

“We have seen really outstanding results using AI… but it doesn’t come for free,” Aboud added. Key challenges include data availability and mechanisms for transferring and scaling chip designs to other projects. “This is not easy,” the Intel engineer noted.

Another summit panelist, Google’s Satrajit Chatterjee, was asked what differentiates silicon design from other AI applications. The answer, according to the Google engineering manager and machine learning researcher: “The main difference from other applications is that most [machine learning] problems [are] synthetic prediction problems.

“In other words, we want ML to predict the result of some calculation that would otherwise take too long,” he added. The answers to those calculations would give chip designers more visibility into IC performance later in the EDA workflow.
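Chatterjee's point can be sketched in miniature: run the expensive calculation on only a handful of designs, fit a cheap surrogate model to those results, then use the surrogate to predict the outcome everywhere else. The "timing analysis" function and its numbers below are stand-ins invented for illustration, not a real EDA calculation.

```python
# Minimal sketch of ML as a "synthetic prediction problem": train a
# cheap surrogate on a few results of a slow calculation, then use it
# to predict results for new inputs. (Illustrative only; real EDA
# flows use far richer models and features.)

def slow_timing_analysis(gate_count):
    """Stand-in for an expensive EDA run (e.g., full timing closure)."""
    return 0.5 + 0.001 * gate_count  # made-up critical-path delay, ns

def fit_linear_surrogate(xs, ys):
    """Least-squares line y = a*x + b through the sampled points."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return lambda x: a * x + b

# Run the slow analysis on only a few sample designs...
samples = [1000, 5000, 20000]
results = [slow_timing_analysis(x) for x in samples]

# ...then predict new designs instantly with the surrogate.
predict = fit_linear_surrogate(samples, results)
print(round(predict(10000), 3))  # close to slow_timing_analysis(10000)
```

The payoff is the same one Chatterjee describes: the designer gets an early estimate of a quantity that would otherwise only be known much later in the EDA workflow.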

As AI for chip design evolves, one question is whether new AI-based tools will come from vendors like Synopsys or spring from homegrown projects based on Python and other standard AI software.

For Google, “It’s certainly a combination of techniques,” Chatterjee said, not coincidentally advocating better “cloud coordination” to distribute design steps and provide required computing resources.

Aboud said Intel also mixes EDA vendor and open source software in its chip design workflows. “We definitely have our own homegrown solutions as well,” he added.

“This is one of the main challenges: We are now thinking how we should build our [chip design] infrastructure in a way that allows us to take EDA solutions, deploy them as fast as possible but also allow us to keep innovating and developing internal solutions and giving them back when EDA catches up,” Aboud added. “We definitely see the need for both.”

About the author: George Leopold

George Leopold has written about science and technology for more than 30 years, focusing on electronics and aerospace technology. He previously served as executive editor of Electronic Engineering Times. Leopold is the author of "Calculated Risk: The Supersonic Life and Times of Gus Grissom" (Purdue University Press, 2016).
