If you are a great athlete, you need to find new tests of endurance to push yourself to new achievements. If you’re a musician, you need to take on new compositions of greater complexity to advance your virtuosity.
For decades, Intel made possible one of the greatest technological achievements of modern times, the mass-market spread of computing technology. Following that with a next act is challenging.
To find new realms, the company in part relies upon a team of scientists working at the forefront of aspects of computing that could conceivably, someday, be mainstream.
“We want to pick things that we can scale to the planet,” said Rich Uhlig, senior fellow, vice president and director of Intel Labs, in a video interview with ZDNet. Labs is the research unit that does fundamental scientific research into topics such as quantum computing.
Labs serves a company, Intel, that is a virtuoso at building things to scale to hundreds of millions of units.
“What we do in the labs is pick projects that are not going to be of interest to just a small set of individuals,” said Uhlig.
“We scale to billions over time” he said of Intel’s prowess in manufacturing, “and we want to focus on problems that have meaning at that scale.”
Uhlig spoke to ZDNet in advance of its Labs Day, a virtual event held Thursday that offered a peek at some of the key research endeavors that Uhlig and team are working on. It was the first such tour of the labs the company had done in seven years.
Uhlig’s keynote, titled “In Pursuit of 1000x: Disruptive Research for the Next Decade of Computing,” opened the day and focused on the theme of advancing computing by orders of magnitude.
The day showcased progress in multiple areas.
In quantum computing, the Lab’s James Clark, head of quantum hardware, talked about scaling up spin qubits, as well as using what’s called “cryo-CMOS,” which is intended to simplify the complex interconnections that are used in today’s quantum systems.
Clark’s talk was complemented by a presentation from the Lab’s head of quantum applications, Anne Matsuura, about the “full stack” of hardware and software that will be required to run actual quantum workloads.
Also: Intel details Horse Ridge II as helping overcome quantum computing hurdle
The theme of scale plays a big part in the quantum work. In particular, scaling quantum is a big challenge for everyone in the industry, noted Uhlig.
“To solve practical problems in quantum, you need to get to millions of qubits at some point,” he said. Today’s systems from IBM, Honeywell, Google and others, known as noisy intermediate-scale quantum, or NISQ, machines, have only a handful, at most tens, of qubits.
“If you don’t have the technologies in place to get to scale, and we do mean millions [of qubits], then you are really not going to be solving the practical problems.”
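One intuition for why scale is so hard, sketched here as an illustration rather than anything from Intel's presentations: even classically simulating a quantum system requires tracking an amount of state that doubles with every qubit added, which is also why small NISQ machines cannot yet tackle practical workloads.

```python
# Illustrative only: a full classical simulation of an n-qubit system must
# store 2**n complex amplitudes, so resource needs grow exponentially with
# the qubit count -- one way to see why the jump from tens of qubits to
# millions is a qualitative change, not an incremental one.

def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory to hold an n-qubit state vector (complex128 amplitudes)."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 30, 50):
    print(f"{n} qubits -> {statevector_bytes(n):,} bytes")
```

At 30 qubits the state vector already occupies 16 GiB; at 50 it exceeds the memory of any single machine.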
As just one example of the scale challenge in quantum, debugging devices is a thorny problem Intel has been focused on.
“We are able to take wafers of spin qubits and test them at low temperature,” Uhlig said. “That’s super-important in order to make rapid progress in the field because it can take days to bring a dilution refrigerator down to temperature,” which slows the whole debug cycle.
“By being able to test all the devices with one cycle of bringing the fridge to where it needs to be, you can make much more rapid progress.”
A session with James Jaussi, head of the company’s PHY optical effort, showcased new developments in using fiber optics for signaling between chips. That work has been in development in the Lab for over a decade, Uhlig noted. Now, however, it is moving from discrete parts, such as transceivers, into the chip package itself.
“We are bringing some new technologies like micro-ring modulators, integrated lasers, and optical amplifiers that we can integrate straight into the compute package,” he explained. The effort aims to greatly reduce the energy demand of moving bits over copper wire, a broad trend in the computing industry today.
Machine programming, presented by researcher Justin Gottschlich, is another long-running effort, aimed at making breakthroughs in conventional computer programming. Gottschlich presented an update on progress in ways to program computers using intent, with less explicit coding.
Neuromorphic computing, where circuits mimic the activity of brain synapses, has been a focus for Intel for years, with the company having developed its own neuromorphic research chip, called Loihi. The head of neuromorphic, Mike Davies, gave an update on progress, including the benchmark testing of neuromorphic prototypes.
Also: Intel, partners make new strides in Loihi neuromorphic computing chip development
“Recently, we are building ever larger configurations that bring multiple Loihi chips into large clusters,” Uhlig told ZDNet. That work involves the research community outside Intel that has been given the Loihi hardware and software to work with. “We are getting into a phase where it is moving beyond research to explorations of future products and applications,” he said.
Again, the focus with neuromorphic, as with photonics, is on energy use: a biological brain is far more efficient than a conventional computer running a deep learning experiment. “If you look at a cockatiel brain,” said Uhlig, referring to the small parrot, “it’s a couple grams, and it’s amazing what a cockatiel can do in navigating complex environments.”
“There’s something almost mysterious there that we want to get at to build fundamental capabilities into computing systems.”
Yet another area is federated computing, where data can be used even when it is hidden behind a firewall. The idea is that sensitive data, such as medical data, can be kept private while still being used for analysis. The Lab’s Rosario Cammarota and Jason Martin, two principal engineers, gave a talk on what’s called Trusted Federated Learning. As an adjunct, they also discussed the technology of homomorphic encryption.
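The core idea can be sketched in a few lines. This is a toy illustration of federated averaging in general, not Intel's Trusted Federated Learning system: each site trains on its own private records and shares only model parameters, which a coordinator averages, so raw data never crosses the firewall.

```python
# Minimal sketch of federated averaging (illustrative, not Intel's design):
# sites run local training steps on private data; only the resulting model
# weights leave each site, and a coordinator averages them.
from statistics import mean

def local_update(w, local_data, lr=0.1):
    """One gradient step of a toy 1-D linear model y = w*x on local data."""
    grad = mean(2 * x * (w * x - y) for x, y in local_data)
    return w - lr * grad

def federated_average(site_weights):
    """The coordinator sees only weights, never the underlying records."""
    return mean(site_weights)

# Two hypothetical hospitals jointly fit y = 2x without pooling records.
site_a = [(1.0, 2.0), (2.0, 4.0)]
site_b = [(3.0, 6.0), (4.0, 8.0)]
w = 0.0
for _ in range(50):
    w = federated_average([local_update(w, site_a), local_update(w, site_b)])
print(round(w, 3))  # converges toward 2.0
```

Homomorphic encryption, the adjunct topic in the talk, goes a step further: it would let the coordinator compute that average over encrypted weights without ever decrypting them.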
All the day’s presentations are detailed on Intel’s website.
All of these areas of exploration in some way come back to Intel’s focus on being a “data company,” as the company puts it, but also Intel’s core value as a computing company.
“Whether it’s about data movement, in the case of silicon photonics, or new ways of doing computation [such as quantum], at our core, we are a compute company, we want to be there with the technology if there’s an emergent new way of doing things,” said Uhlig.
“We think both neuromorphic and quantum speak to that.”
Another aspect of the projects is to find ways to make computing more inclusive. Machine programming, for example, fits with a new mindset around extending computer programming to people who are neither programmers by calling nor by profession, said Uhlig.
“Today, there is a small set of people who program systems,” observed Uhlig.
“Machine programming is about liberating people so that they can express their intent,” he said. “We want to make it so that if you are an expert in an area, if you know what you want to get out of data, it becomes much easier to do so.”
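Expressing intent rather than code is the key shift. As a toy illustration (not Intel's machine programming system), one simple form of the idea is program synthesis from examples: the user states what they want as input/output pairs, and the system searches for a program that satisfies them.

```python
# Toy programming-by-intent sketch (illustrative only): the "program" is
# found by enumerating candidates until one matches the user's examples.
CANDIDATES = {
    "double":  lambda x: x * 2,
    "square":  lambda x: x * x,
    "negate":  lambda x: -x,
    "add_one": lambda x: x + 1,
}

def synthesize(examples):
    """Return the name of the first candidate consistent with all examples."""
    for name, fn in CANDIDATES.items():
        if all(fn(inp) == out for inp, out in examples):
            return name
    return None

# Intent expressed as examples: "I want the square of my input."
print(synthesize([(2, 4), (3, 9)]))  # -> square
```

Real machine programming research operates over vastly larger program spaces, but the contract is the same: the domain expert supplies intent, the machine supplies the code.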
At the same time that Intel itself seeks markets of potential scale, the company has historically been able to bring together partnerships and develop ecosystems. And the lab has been part of that. The Thunderbolt technology used for high-speed connections on laptops is just one example: it was developed inside Labs and then commercialized through work with Apple.
A more profound example is how Intel promoted WiFi wireless networking two decades ago. WiFi might not be as ubiquitous today if not for the efforts of Intel to promote it as a generally available industry standard.
And then there’s the historical partnership with Microsoft, a dynamic duo that made possible personal computing. With all the Labs work, said Uhlig, Intel continues to look for alliances.
“You can see these patterns in the way we do our research, we are not keeping the technology to ourselves,” he said. “Intel recognizes the importance of developing ecosystems.”
The company is “on a journey,” he said, with government offices, such as the Department of Energy, on quantum computing. In the case of neuromorphic, a partner, Accenture Labs, discussed applications of neuromorphic computing during the sessions.
Intel is also helping to nurture academic research outside of the Lab’s own work.
“We have pretty extensive programs in funding academic research across the planet,” noted Uhlig. “We are continuously renewing those investments,” he said. In some cases, the research grants are direct, but in other areas, Intel will set up a center and send its researchers to a university to work on site.
“The top researchers in a given area are oftentimes inventing the future, and if we are sitting right next to them as they’re doing that, publishing together with them, then it really gives us an edge in various areas.”
Intel also is one of the largest contributors to open source software development. “We know it’s a strategic move because it stimulates new usages and applications,” said Uhlig. It also serves to keep the company plugged into what people want to do.
At the end of the day, being a virtuoso doesn’t mean much if one is not giving back. That thread underlies some of the most profound work the Lab has done.
Years of research helping the late physicist Stephen Hawking communicate despite his disability, for example, led to advances that inform technologies for assistive computing.
A notion of being of service informs the Lab’s work and its relation to Intel’s business, said Uhlig.
“As long as you are advancing generationally, if you are always providing more value, you will always be relevant,” said Uhlig. “You have to deliver value over time, consistently, that’s really our mindset.”