Now Google is using AI to design chips, far faster than human engineers can do the job

In only six hours, the model could generate a design that optimizes the placement of different components on the chip. (Image: Kokouu / Getty Images)

A team of researchers from Google has unveiled a new AI model that can generate complex chip designs in hours – a burdensome, intricate task that typically takes human engineers months to complete. 

The researchers fed a dataset of 10,000 chip layouts to a machine-learning model, which was then fine-tuned with reinforcement learning. In only six hours, the model could generate a design that optimizes the placement of different components on the chip, producing a final layout that satisfies operational requirements such as processing speed and power efficiency. 
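In broad strokes, this casts placement as a game played one move at a time: each move drops a component onto a grid, and the reward favors layouts with short estimated wiring. Below is a minimal, hypothetical sketch of that loop in Python – the grid size, connections and random stand-in policy are all invented here, not taken from Google's system, which uses a learned neural-network policy and a richer cost signal.

```python
# Minimal sketch of sequential chip placement framed as an RL episode
# (illustrative only -- the grid, nets and policy are invented).
import random

GRID = 16     # coarse placement grid, one component per cell (assumption)
MACROS = 20   # number of macro blocks to place
NETS = [(random.randrange(MACROS), random.randrange(MACROS))
        for _ in range(40)]   # toy two-pin connections between macros

def play_episode(policy):
    """Place every macro with the given policy and return the reward."""
    taken, pos = set(), {}
    for m in range(MACROS):
        cell = policy(m, taken)        # the agent's action: pick a free cell
        taken.add(cell)
        pos[m] = divmod(cell, GRID)    # (row, col) coordinates on the grid
    # Proxy cost: total Manhattan wirelength over all connections.
    wirelength = sum(abs(pos[a][0] - pos[b][0]) + abs(pos[a][1] - pos[b][1])
                     for a, b in NETS)
    return -wirelength                 # shorter wiring => higher reward

def random_policy(macro, taken):
    """Stand-in for a trained policy network: picks any free cell."""
    return random.choice([c for c in range(GRID * GRID) if c not in taken])

print("reward of one random layout:", play_episode(random_policy))
```

In the published work, the reward also accounts for factors such as congestion and density, and the pre-training on past layouts is what lets the policy generalize to new chip blocks instead of starting from scratch each time.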

The method’s success is such that Google has already used the model to design its next generation of tensor processing units (TPUs), which run in the company’s data centers to improve the performance of various AI applications.  

“Our RL (reinforcement learning) agent generates chip layouts in just a few hours, whereas human experts can take months,” tweeted Anna Goldie, research scientist at Google Brain, who took part in the research. “These superhuman AI-generated layouts were used in Google’s latest AI accelerator (TPU-v5)!” 

Modern chips contain billions of different components laid out and connected on a piece of silicon the size of a fingernail. For example, a single processor will typically contain tens of millions of logic gates, also called standard cells, and thousands of memory blocks, known as macro blocks – which then have to be wired together. 

The placement of standard cells and macro blocks is crucial in determining how quickly signals can be transmitted across the chip, and therefore how efficient the end device will be.  
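A common proxy that placement tools use for this is half-perimeter wirelength (HPWL): for every net, take the bounding box of the components it connects and add up the box's width and height. A short sketch, with component names and coordinates invented purely for illustration:

```python
# Half-perimeter wirelength (HPWL), a standard proxy for wiring cost:
# for each net, sum the width + height of the bounding box of its pins.
def hpwl(nets, positions):
    total = 0
    for net in nets:                        # a net is a list of connected parts
        xs = [positions[p][0] for p in net]
        ys = [positions[p][1] for p in net]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

# Invented example: two macro blocks and one standard cell.
positions = {"macro_a": (0, 0), "macro_b": (3, 4), "cell_1": (1, 2)}
nets = [["macro_a", "cell_1"], ["macro_a", "macro_b", "cell_1"]]
print(hpwl(nets, positions))   # lower totals mean shorter, faster wiring
```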

This is why much of engineers’ work focuses on optimizing the chip’s layout. It starts with placing the larger macro blocks, a process called “floorplanning”, which consists of finding the best configuration for these components while keeping in mind that standard cells and wiring will still have to fit in the remaining space. 

The number of possible layouts for macro blocks is colossal: according to Google’s researchers, there are potentially ten to the power of 2,500 different configurations to test – that is, a one followed by 2,500 zeros. 
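A back-of-envelope count shows how fast such numbers blow up (the figures below are invented for illustration, not taken from Google's paper): placing just 1,000 distinct blocks, one per cell, on a 100 × 100 grid already allows roughly 10^3,978 orderings – vastly more than the ~10^80 atoms in the observable universe.

```python
# Back-of-envelope size of a macro-placement search space
# (illustrative figures only): placing N distinct blocks on a G x G
# grid, one per cell, allows (G^2)! / (G^2 - N)! arrangements.
from math import lgamma, log

def log10_arrangements(grid, n_blocks):
    cells = grid * grid
    # log10( cells! / (cells - n_blocks)! ), via log-gamma to avoid overflow
    return (lgamma(cells + 1) - lgamma(cells - n_blocks + 1)) / log(10)

print(round(log10_arrangements(100, 1000)))   # ~3978, i.e. about 10**3978
```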

What’s more: once an engineer has come up with a layout, it is likely that they will have to subsequently tweak and adjust the design as standard cells and wiring are added. Each iteration can take up to several weeks. 

Given the painstaking complexity of floorplanning, the whole process seems an obvious match for automation. Yet for several decades, researchers have failed to come up with a technology that can remove the burden of floorplanning for engineers.  

Chip designers can rely on computer software to assist them in the task, but it still takes many months to work out how best to assemble components on the device. 

And the challenge is only getting harder. The often-cited Moore’s Law predicts that the number of transistors on a chip doubles roughly every two years – meaning that engineers face a search problem that grows exponentially over time, while still having to meet tight schedules. 
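At a two-year doubling cadence, the arithmetic compounds quickly (the starting figure below is a rough order of magnitude, chosen for illustration):

```python
# Transistor counts under a two-year doubling cadence (illustrative):
count = 50e9   # rough order of magnitude for a large chip today
for years in (2, 6, 10):
    print(f"in {years} years: {count * 2 ** (years / 2):.1e} transistors")
```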

This is why Google’s apparently successful attempt to automate floorplanning could be game-changing. “Very nice work from Google on deep RL-based optimization for chip layout,” tweeted Yann LeCun, chief AI scientist at Facebook, congratulating the team on overcoming “40 years” of attempts at resolving the challenge. 

Google’s new AI model could hardly land at a better time: the semiconductor industry is currently rocked by a global shortage of chips that is hitting a number of sectors, ranging from consumer electronics to the automotive industry. 

While the shortage has been caused by insufficient capacity at the fabrication level, rather than by how semiconductors are designed, cutting the time it takes to develop next-generation chips could still be a welcome relief for the entire supply chain. 

The scientific journal Nature, for one, welcomed the new method. “Researchers at Google have managed to greatly reduce the time needed to design microchips,” it said. “This is an important achievement and will be a huge help in speeding up the supply chain.” 

Although the machine-learning model could impact the industry as a whole, it will be worth keeping an eye on Google’s own use of the technology, too.  

The search giant has long been explicit that its ambition is to create custom processors in-house, particularly in the form of systems-on-chips (SoCs). 

Source: ZDNet
