A revolution in computing: driven by a new breed of materials - Annual Report 2016

11 August 2017

Tomorrow’s computers

When the first powerful computers began to appear during World War II, they were viewed solely as military tools. No one could have predicted that within 40 years, computers would become so compact that you could fit one on a desk. Fast-forward to today, and around 90% of New Zealanders now own a smartphone: a computer that could outperform all of those early machines combined, despite weighing less than 200 grams.

We can thank the rapid development of low-cost microelectronics for our ability to shrink computers without compromising on their performance. But there could be a new revolution coming to computing, and it is MacDiarmid Institute researchers who are leading the charge.

Quantum leap

No matter how sophisticated they may be, most of today’s computers rely on the same basic mechanism—electrical circuits, each containing millions of tiny switches called transistors that flick between on and off. Though seemingly simple, this binary system allows us to store and process complex data as strings of ones and zeros, or ‘bits’. 
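To make the idea concrete (an illustration of our own, not from the researchers), a few lines of Python show how an ordinary word reduces to the string of bits a conventional computer actually stores:

```python
# Encode the word "qubit" into the ones and zeros a classical computer stores.
text = "qubit"
bits = "".join(f"{byte:08b}" for byte in text.encode("ascii"))  # 8 bits per character
print(bits)  # 40 ones and zeros in total
```

Every character becomes eight bits, and all classical computation ultimately comes down to manipulating strings like this one.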

And now quantum computers could extend our vocabulary even further.

Quantum computers rely on quantum bits (nicknamed ‘qubits’) that can have three possible states: on, off, or both simultaneously. This weird effect, called superposition, is incredibly fragile, so in practice qubits must be kept at fractions of a degree above absolute zero, but it could increase our computing power exponentially. Making qubits is a huge challenge, though. Even in silicon, the most widely used element in electronics, impurities can ruin a stable quantum bit. So MacDiarmid Principal Investigators Dr Andreas Markwitz (GNS Science) and Dr Grant Williams (Victoria University of Wellington) are working with MacDiarmid students Prasanth Gupta and Konrad Suschke, along with a team from the University of Melbourne, to develop a brand new way to produce silicon qubits.
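The ‘both simultaneously’ idea can be sketched mathematically: a qubit’s state is a pair of complex amplitudes over the on and off states, and measuring it yields one outcome or the other with probabilities given by the squared magnitudes of those amplitudes. A minimal illustration in Python (our own sketch, not the team’s code):

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)  # the 'off' basis state |0>
ket1 = np.array([0, 1], dtype=complex)  # the 'on' basis state |1>

# An equal superposition: loosely, 'on and off at the same time'.
psi = (ket0 + ket1) / np.sqrt(2)

# Measuring collapses the qubit; each outcome's probability is the
# squared magnitude of its amplitude.
probs = np.abs(psi) ** 2
print(probs)  # a 50/50 chance of reading off (0) or on (1)
```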

This work involves building a world-first: a compact, pure silicon-28 ion source that can be used to implant quantum computing ‘islands’ into standard silicon wafers. This approach could provide a reliable, scalable way to produce qubits that are stable enough for use in a wide range of applications.

“For certain problems, such as molecular simulations and drug design, quantum computing might well be a total game-changer,” said Dr Markwitz, who received a 2016 MBIE Smart Ideas grant for this work. “If this work is successful it would lead to new computer chips that would be much faster for certain kinds of problems, and would give New Zealand a significant share of the global electronics market.”


Silicon qubits aren’t the only contender for computing’s ‘next big thing’. In fact, another MacDiarmid Principal Investigator, Associate Professor Ben Ruck from Victoria University of Wellington, is looking instead towards superconducting logic. One of the challenges of conventional silicon circuits is that as they’re pushed harder, they dissipate increasing amounts of energy as heat. That’s why computers employ fans: to help cool the circuits down. Scale that effect up to something the size of an internet server farm, and you’re facing an ever-growing energy bill.

A new breed of materials

Associate Professor Ruck’s research aims to replace conventional silicon transistors with superconducting circuits. These have zero electrical resistance, so they lose almost no energy as heat, making them many times more efficient than today’s computer chips. Superconducting circuits also promise greater processing power, thanks to the design of their components. Although, like qubits, these circuits need low temperatures to operate, this is not quantum computing. It’s more like conventional computing done with a new breed of materials.

Associate Professor Ruck is collaborating with a number of MacDiarmid researchers on this work, including Associate Investigators and Victoria University of Wellington researchers Associate Professor Franck Natali, Emeritus Professor Joe Trodahl and Dr Simon Granville. And he’s quietly confident about the future: “Silicon chips are fast approaching their physical and economic limits, so there’s reasonably little doubt that superconducting computing will become a reality,” said Associate Professor Ruck. “It won’t be in your desktop computer, but it will be invaluable to those sprawling data centres that we all rely on.”

Electric dreams

Tomorrow’s computers won’t just get faster, though. They’ll get smarter too. Unlike conventional computers, the human brain is a master of complex tasks like pattern recognition. It’s this skill that allows us to learn from experience—we continuously recognise and characterise objects and actions, and can recall them when needed. Part of this ability comes from the way our brain is constructed—as a network of neurons connected by synapses. For MacDiarmid Principal Investigator Professor Simon Brown from the University of Canterbury, this provided inspiration for a new approach to computing.

Beyond ones and zeros

Professor Brown’s approach, called neuromorphic computing, uses tiny circuits to mimic the operation of the human brain. “We are deliberately trying to replicate the structure of the brain, so there are no ones and zeros here,” explains Professor Brown, who received a 2016 MBIE Smart Ideas grant for this work. “Instead, data forms a physical pathway across the complex series of connections present in the chip, similar to how memories are formed in the brain.”
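As a toy illustration of that pathway idea (a hypothetical sketch of our own, not a model of Professor Brown’s actual devices), imagine a small network of connections whose strengths grow each time a signal travels along them, so a repeated input literally etches its own route:

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.uniform(0.1, 0.2, size=(4, 4))  # connection strengths between 4 nodes

def stimulate(path, rate=0.05):
    """Strengthen every connection along a signal's path (Hebbian-style)."""
    for a, b in zip(path, path[1:]):
        weights[a, b] += rate

# The same input pattern arrives over and over...
for _ in range(20):
    stimulate([0, 2, 3])

# ...and the route 0 -> 2 -> 3 now far outweighs the unused connections:
# the 'memory' is the reinforced pathway itself, not stored ones and zeros.
```

No binary representation appears anywhere here; the network’s physical state is the computation.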

Working alongside Victoria University of Wellington Principal Investigator Dr Natalie Plank and Postdoctoral Fellow Dr Saurabh Bose, Professor Brown eventually aims to set up a commercial company to fabricate these chips. While neuromorphic devices aren’t going to replace all types of computing, they could have a huge impact on image recognition, perhaps offering a route to real-time, low-power data processing on a chip small enough to fit into your smartphone.