The power of two: quantum or neuromorphic computing?


SOURCE: COMPUTERWEEKLY.COM
OCT 06, 2021

No one wants to run a computer that must be kept at temperatures close to absolute zero – a constraint that is pushing the development of new computing architectures

Some problems are simply too complex for even the most powerful of today’s computers, and researchers are trying to overcome the limits of traditional computer designs to solve them.

The von Neumann architecture, which has defined the layout of computing for the past 75 years, is being pushed in directions that it was never designed to take. This is the classical computing architecture, which effectively defines the way a processor fetches program instructions from memory, runs them and stores values back into memory.

But the stored program architecture described by John von Neumann is less efficient at solving certain classes of complex problems than entirely new approaches to computing.

Quantum computing is one of the new approaches to computing that opens up the ability to run calculations that would be impossible to complete on a classical computing architecture. However, quantum computers are currently highly specialised devices. Only recently have scientists been able to demonstrate a device that does not need to be supercooled and kept at near absolute zero (-270°C).

Peter Chapman, CEO and president of IonQ, which recently listed on the NYSE, said that when running handwriting recognition, an 11-qubit quantum computer outperformed classical computing and was more accurate in its ability to handle noisy data. “Machine learning is the first application area that will go to quantum computing,” he said. “It is much faster at creating models, and models are better.”

Unlike the classical approach, which needs to be programmed in a way that can compensate for noise in the dataset, “a little bit of noise actually helps”, he said.

What is more, although Moore’s Law has held true for classical computer architectures, with processing power doubling every 18 months to two years, quantum computing is scaling even faster. “We are doubling the number of qubits every 10 months,” said Chapman.

In a machine with n qubits, the computational power is expressed as 2^n. In effect, each additional qubit doubles the processing power. To put this into perspective, said Chapman, the number of simultaneous states that a 120-qubit system could handle would be equivalent to the number of atoms in the universe.
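To make the arithmetic concrete, the sketch below (illustrative Python, not IonQ’s software) shows why simulating n qubits classically requires 2^n amplitudes – the exponential growth Chapman is describing.

```python
# Illustrative sketch: the state of n qubits is a vector of 2**n amplitudes,
# so each extra qubit doubles the memory a classical simulation needs.
import numpy as np

def uniform_superposition(n_qubits: int) -> np.ndarray:
    """State vector after putting every qubit into equal superposition."""
    dim = 2 ** n_qubits  # one amplitude per basis state
    return np.full(dim, 1 / np.sqrt(dim), dtype=np.complex128)

for n in (1, 2, 11, 30):
    print(f"{n} qubits -> {2 ** n:,} amplitudes")
# 30 qubits already need 2**30 complex numbers (~16 GiB) to store classically.
```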

According to Chapman, modelling certain chemical reactions would require the computational power that is only available in the realms of quantum computing. But even in the real world, certain types of optimisations are simply too complicated for classical computing.

There are numerous reports about how the programmers who developed the route optimisation software for logistics firm UPS allowed only right turns in their calculations. Looking at route optimisation, Chapman said: “What we do today is far from the optimal route as there is a set of cheats that programmers have figured out.”

If an individual driver makes 120 deliveries a day, the number of different permutations of routes is a 200-digit number, he said. Multiply that by the number of drivers and, from a calculations perspective, the problem space quickly becomes astronomical. “A quantum approach offers a different way to solve the problem,” said Chapman.
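That 200-digit figure is easy to verify: 120 stops can be visited in 120! different orders. A quick check in Python (an illustrative calculation, not IonQ’s routing code):

```python
# The number of orderings of 120 delivery stops is 120 factorial -- far too
# many to enumerate, which is why routing software relies on heuristics.
import math

routes = math.factorial(120)
print(len(str(routes)))  # 199 digits, roughly the "200-digit number" quoted
```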

IonQ is developing a quantum computer that does not need to be supercooled. According to its roadmap, the company plans to offer a rack-mounted quantum computer by 2023.

Neuromorphic alternative

Such a system would avoid the latency of running quantum computing as a cloud resource, supporting high-performance computing applications that need low-latency connectivity to supercomputers, as well as applications that rely on real-time processing.

It is this idea of taking computing from the cloud towards the edge that is driving Intel’s new-generation Loihi chip architecture for neuromorphic computing. Loihi 2, unveiled at the end of September, is Intel’s second-generation neuromorphic research chip. The company has also released Lava, an open source software framework for developing neuro-inspired applications.

Neuromorphic computing adapts the fundamental properties of neural architectures found in nature to build a new model of computer architecture.

The paper Advancing neuromorphic computing with Loihi describes neuromorphic computing as classes of brain-inspired computation that challenge the von Neumann model. The paper’s authors said one of the most promising application areas of neuromorphic technology is in emulating how the biological brain has evolved to solve the challenges of interacting with dynamic and often unpredictable real-world environments.

Mirroring the biological world, a neuromorphic chip has neurons, synapses that provide neuron-to-neuron connectivity, and dendrites, which enable a neuron to receive messages from multiple other neurons.

According to Intel’s specifications, each Loihi 2 chip consists of microprocessor cores and up to 128 fully asynchronous neuron cores connected by a network-on-chip (NoC). The neuron cores are optimised for neuromorphic workloads, each implementing a group of “spiking” neurons, including all synapses connecting to the neurons.

All communication between neuron cores takes the form of spike messages, mimicking the way neurons signal one another in a biological brain. Whereas the previous Loihi chip had three embedded microprocessor cores, Intel said it has doubled the number in Loihi 2 to six.

Garrick Orchard, a researcher at Intel Labs, said: “We are not trying to directly model biology, but taking some things we think are important.”

On the Loihi chip, one part of the chip functions as the neuron’s core to model biological neuron behaviour, he said. “We have a bit of code that describes the neuron,” he added. There are also neuromorphic computing versions of biological synapses and dendrites, all built using asynchronous digital complementary metal-oxide semiconductor (CMOS) technology.
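The “bit of code that describes the neuron” is, in spirit, a spiking neuron model. The sketch below shows a minimal leaky integrate-and-fire neuron in Python – an illustration of the kind of model neuromorphic cores implement, not Intel’s implementation or the Lava API; the function name and constants are invented for the example.

```python
# A minimal leaky integrate-and-fire (LIF) neuron -- an illustrative sketch of
# the "spiking" behaviour neuromorphic cores implement. Not Intel's code.
import numpy as np

def lif_run(input_spikes: np.ndarray, weight: float = 0.6,
            decay: float = 0.9, threshold: float = 1.0) -> np.ndarray:
    """Simulate one LIF neuron over discrete timesteps.

    input_spikes: array of 0/1 input spikes, one entry per timestep.
    Returns an array of 0/1 output spikes.
    """
    v = 0.0                          # membrane potential
    out = np.zeros_like(input_spikes)
    for t, s in enumerate(input_spikes):
        v = decay * v + weight * s   # leak, then integrate the weighted input
        if v >= threshold:           # threshold crossing -> emit a spike
            out[t] = 1
            v = 0.0                  # reset after spiking
    return out

spikes_in = np.array([1, 1, 0, 1, 1, 1, 0, 0, 1, 1])
print(lif_run(spikes_in))  # [0 1 0 0 1 0 0 0 1 0]
```

Because state only changes when a spike arrives, such neurons sit idle most of the time – the property that lets asynchronous hardware like Loihi process sensor data at low power.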

Deep neural networks

Given that neuromorphic computing is inspired by biological systems, deep neural networks (DNNs) for machine learning are one of the application areas being targeted. Orchard added: “Using neuromorphic computing for a DNN is something people understand, but we need to differentiate. We are not trying to be a DNN accelerator. There’s more to AI than deep learning.”

Where Loihi 2 and neuromorphic computing as a whole seem to be a good fit is in edge computing, processing sensor data at low latency. Orchard said it could be used within a microphone or camera to offer visual and tactile perception similar to that of biological systems, in robotic arm controllers that adapt to the weight of the object being lifted, or within a drone to provide very low-latency control.

Within a datacentre environment, a neuromorphic computer could power a recommendation engine or be used in scientific computing to model how forces propagate through a physical structure, said Orchard.

There is some overlap in application areas with quantum computing. Orchard said a neuromorphic computer can be applied to a certain class of hard optimisation problems, such as scheduling at train operator Deutsche Bahn, which is currently investigating its use.

But although there may be an overlap in application areas, Orchard said a neuromorphic computer is much easier to scale up than a quantum computer. Loihi 2 can scale simply by wiring chips together. “You can build very large systems,” he added.

With Loihi 2 and Lava, neuromorphic computing is pushing closer to commercialisation, said Orchard.

Both Intel and IonQ are looking at putting next-generation computing nearer to the edge. Intel’s approach with Loihi is effectively about designing a semiconductor chip that behaves in a similar way to a biological neuron, then running biologically inspired algorithms on this new architecture. Quantum computing, by contrast, is built on a foundation of quantum physics.

Although they are very different, both approaches offer an insight into how computationally complex problems could be tackled in the future.
