
Quantum Computing’s Challenging Liftoff to Commercialization

28 Mar 2019 1:00pm

If there is one grisly engineering lesson to be learned from the recent pair of tragic Boeing 737 Max airplane crashes, it may be that the complex systems we build (such as aircraft) require a previously unfathomable level of safety testing. Early reports seem to indicate that what confounded the aircraft’s guidance system was actually a safety feature, and yet its malfunction evaded countless hours of testing and modeling.

And it is such complex system modeling that may finally bring quantum computing into the commercial realm.

Like the blockchain, quantum computing is one of those technologies that have been touted as revolutionary for some time now, promising solutions that may be possible if not readily feasible. Quantum computing has been slowly approaching the commercial sector of IT for about a decade now, with each new generation of quantum-bit (qubit) computers showing more promise in solving actual problems, the kind of problems people pay money to solve.

D-Wave’s Edward “Denny” Dahl, at the Inside Quantum Tech conference.

So the Inside Quantum Technology conference, held last week in Boston, positioned itself as the first commercially-oriented quantum computing event, and set out to answer the question of where quantum computing stands in terms of actually generating serious amounts of revenue.

The answer is: Not quite yet.

“There’s not going to be a billion-dollar quantum industry next year. Or the year after that. Or the year after that,” said Whurley (who goes by a single name), the CEO of venture capital firm StrangeWorks, during his talk.

Nonetheless, the pace of innovation is indeed accelerating and catching the eyes of investors and system builders.

“This technology is moving out of academia and into the commercial arena,” said Travis Scholten, an IBM quantum computing applications researcher, during his talk. IBM offers a cloud quantum computing service. And with a toolkit already available, Microsoft plans to offer a quantum co-processing service through its Azure cloud; it has even moved its quantum research team into the Azure group. D-Wave, which has offered quantum computers for well over a decade, offers a full free minute of hosted quantum computing (which is a lot more than you’d expect, or would probably even use).

But there are technical hurdles still ahead, not to mention the considerable challenge in getting software developers worldwide to abandon their IntelliSense and learn physics. And even if fully commercialized, there are but a handful of initial uses where this radical approach makes sense, notably those now being tackled in the high-performance computing (HPC) community.

The good news is that industry players have settled on a handful of architectural approaches, letting everyone more or less get to work on building out all the tooling that will be needed. If successful, they will have helped us move past the era of digital-only computing.

State of the Art

Physicist Richard Feynman thought up the idea of “quantum computing” in the early 1980s as a way to simulate complex scientific problems, in particular problems in quantum mechanics (Feynman himself found it necessary to think out problems through diagrams). Mapping a quantum state back to a binary “digital” one would consume far too many resources. Why not just map a problem in quantum physics onto a system that ran on quantum properties to begin with?

Quantum properties could be captured in the form of quantum bits, or qubits. They are usually forged from an electron or the nucleus of a phosphorus atom, possibly switched by a Josephson junction. By their nature, qubits are capable of holding two different states simultaneously, each in varying degrees, expressed as probabilities — perfect for quantum modeling.

“It can be completely ‘0’ or completely ‘1’ or some combination in-between,” explained Edward “Denny” Dahl, a principal research scientist for D-Wave, the quantum computing system maker. “It’s a very abstract model of computation.”
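Dahl’s description maps directly onto the standard mathematical picture: a qubit is a pair of complex amplitudes whose squared magnitudes give the probabilities of reading out a 0 or a 1. A minimal NumPy sketch (the values here are illustrative, not from any vendor’s hardware):

```python
import numpy as np

# A single qubit |psi> = alpha|0> + beta|1>, stored as two complex amplitudes.
# Measurement yields 0 with probability |alpha|^2 and 1 with probability |beta|^2.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)   # an equal superposition
state = np.array([alpha, beta], dtype=complex)

probs = np.abs(state) ** 2
assert np.isclose(probs.sum(), 1.0)  # amplitudes must stay normalized
print(probs)                         # "completely 0, completely 1, or in between"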

From D-Wave’s Edward “Denny” Dahl


Designs for a quantum computer have fallen largely into two camps, Dahl explained.

One is the gate model, based on the Quantum Fourier Transform (QFT), a quantum implementation of the Fast Fourier Transform (FFT), a method of separating out signals from noise by transforming a time-based signal into a frequency-based one (or vice versa). Devised by Peter Shor, the QFT applied these transform principles at the quantum level, where they could be put to work in phase-shifting quantum gates.
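The connection between the two transforms is concrete: the QFT’s unitary matrix is, up to normalization, the same matrix the classical (inverse) FFT applies. A NumPy sketch of that matrix — classical code describing the gate-model operation, not code that runs on a quantum machine:

```python
import numpy as np

def qft_matrix(n_qubits):
    """Unitary matrix of the Quantum Fourier Transform on n_qubits."""
    n = 2 ** n_qubits
    omega = np.exp(2j * np.pi / n)           # primitive n-th root of unity
    j, k = np.meshgrid(np.arange(n), np.arange(n))
    return omega ** (j * k) / np.sqrt(n)

U = qft_matrix(3)
# The QFT is unitary (it preserves probability), unlike the raw DFT matrix.
assert np.allclose(U @ U.conj().T, np.eye(8))
# Up to a sqrt(N) normalization it is the classical inverse FFT:
x = np.random.rand(8) + 1j * np.random.rand(8)
assert np.allclose(U @ x, np.fft.ifft(x) * np.sqrt(8))
```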

This approach carries some heavy overhead in error correction, Dahl warned. As many as 100 qubits could be needed to error-correct a production qubit.  “Error correction is possible, but the overhead is significant,” Dahl said.

Dahl is partial to his own company’s approach: annealing, a centuries-old metallurgical technique of adding thermal energy to a system (traditionally heat) and then removing that energy so as to bend the material into the desired state. Instead of thermal energy, D-Wave’s annealing approach uses quantum fluctuations (bubbles of uncertainty that pop up in an otherwise uniform space) to build a configuration space. Each set of qubits gets a corresponding, more numerous, set of couplers to tie them together.

If you can turn a problem into a landscape of some sort, it can be computed in quantum form, Dahl promised, adding that a set of about 8,000 numbers can make a landscape.
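The “landscape” Dahl describes can be made concrete with an Ising-style energy function, the general form D-Wave machines minimize: per-spin biases plus coupler terms. The sketch below uses made-up bias and coupler values and simply enumerates every configuration — something only feasible for a toy problem, which is exactly why an annealer is interesting at scale:

```python
import itertools
import numpy as np

# Toy Ising landscape: E(s) = sum_i h_i*s_i + sum_{i<j} J_ij*s_i*s_j,
# with spins s_i in {-1, +1}. An annealer searches for the lowest-energy
# configuration; here we brute-force all 2^n of them (illustrative values).
h = np.array([1.0, -1.0, 0.5])            # per-spin biases
J = {(0, 1): -2.0, (1, 2): 1.5}           # couplers tying spins together

def energy(s):
    return float(h @ s + sum(Jij * s[i] * s[j] for (i, j), Jij in J.items()))

best = min(itertools.product([-1, 1], repeat=3), key=energy)
print(best, energy(best))  # (1, 1, -1) -4.0
```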

The field seems to be settling on a common architectural style. Repeatedly throughout the conference, people discussed quantum co-processing rather than quantum computing. Much of the preparation work will continue to be done on what attendees referred to again and again as “classical computers.”

Market Optimization

Feynman’s to-do list for a quantum computer has proved remarkably prescient. Today, the best use cases are those problems too complicated to be worked out easily, even by a fleet of commercial servers.

Dahl mentioned three areas where D-Wave is seeing interest: optimization, material simulation, and machine learning.

Optimization may be something like scheduling large numbers of flights, or deliveries, in the most efficient way possible. English online grocer Ocado uses a D-Wave machine to generate optimum routes for robots to assemble market baskets of food for customers.

Material simulation is an emerging use case, one in which quantum computers characterize the properties of materials, such as phase transitions. They can make full use of the Hamiltonian operator, which sums the kinetic and potential energies of a set of particles.

Modeling larger molecules requires larger-depth circuits, IBM’s Scholten further explained. The process of discovering these properties would be iterative, he said, with a trial state shipped to a quantum platform, which estimates its energy and returns the result to the adjoining classical system.
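The loop Scholten describes resembles what the field calls a variational hybrid algorithm: a classical optimizer proposes trial states, the quantum processor scores them. A sketch of the shape of that loop, with the “quantum” step stood in for by a classical 2×2 simulation (a made-up toy Hamiltonian, not IBM’s actual stack):

```python
import numpy as np

# Hybrid iterative loop: classical optimizer proposes a trial state, the
# "quantum" side estimates its energy <psi|H|psi>, and the estimate drives
# the next proposal. H here is an illustrative 2x2 matrix.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def trial_state(theta):
    # One-parameter ansatz: a normalized real state on a 2-level system.
    return np.array([np.cos(theta), np.sin(theta)])

def estimate_energy(theta):
    psi = trial_state(theta)
    return float(psi @ H @ psi)       # what the quantum co-processor would return

# Crude classical outer loop: grid search over the ansatz parameter.
thetas = np.linspace(0, np.pi, 200)
best_theta = min(thetas, key=estimate_energy)
print(estimate_energy(best_theta))    # approaches the lowest eigenvalue, about -1.118
```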

IBM’s Travis Scholten.

IBM itself has been looking at quantum computing to tackle large jobs in machine learning and risk analysis. With ML, the company wants to increase the number of dimensions in modeling, with quantum computing offering a way to compare two different models. Quantum computing could also vastly decrease the amount of time it takes to do Monte Carlo simulations, an essential ingredient for risk analysis.
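For context on what would be sped up: classical Monte Carlo risk analysis means simulating many random outcomes and reading a loss quantile off the results. A minimal value-at-risk sketch with made-up return parameters — the quantum proposal (quantum amplitude estimation) targets exactly this sampling loop, quadratically reducing the number of samples needed for a given accuracy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Classical Monte Carlo value-at-risk: simulate many portfolio returns,
# then read off the 95th-percentile loss. Parameters are illustrative.
n_paths = 100_000
returns = rng.normal(loc=0.05, scale=0.2, size=n_paths)  # toy annual returns
losses = -returns

var_95 = np.quantile(losses, 0.95)   # loss exceeded only 5% of the time
print(round(var_95, 3))              # roughly -0.05 + 1.645 * 0.2, i.e. ~0.28
```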

One company that is already taking a serious look at quantum computing for competitive advantage is aerospace company Airbus, explained Paolo Bianco, who is the Airbus global research and technology cooperation manager. The company has long been a fervent user of high-performance computing (HPC), also known as supercomputing, to tackle solve of its largest calculations. It is similarly bullish for quantum, having invested in at least one start-up, QC Ware.

Airbus has been looking at quantum computing to do Fault Tree Analysis, which is a technique that could be used for finding the minimal number of failures that would lead to a system failure (something Airbus competitor Boeing might have needed a bit more of). As with others, Airbus is looking at a classical-quantum hybrid approach, where quantum could define the minimal set and hand it to classical or HPC computing to finish the task.
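To make the classical half of that hybrid concrete: a fault tree combines basic component failures through AND/OR gates, and a minimal cut set is a smallest combination of failures that triggers the top event. A brute-force sketch on a made-up three-event tree (real trees are far too large for enumeration, which is the opening for quantum search):

```python
from itertools import chain, combinations

# Toy fault tree: the top event occurs if (A and B) fail together, or C fails.
BASIC_EVENTS = ["A", "B", "C"]

def top_event_fails(failed):
    return ("A" in failed and "B" in failed) or ("C" in failed)

def minimal_cut_sets():
    """Enumerate failure subsets smallest-first, keeping only minimal ones."""
    cuts = []
    subsets = chain.from_iterable(
        combinations(BASIC_EVENTS, r) for r in range(1, len(BASIC_EVENTS) + 1))
    for s in subsets:
        s = frozenset(s)
        # Keep s only if it triggers the top event and no smaller cut is inside it.
        if top_event_fails(s) and not any(c < s for c in cuts):
            cuts.append(s)
    return cuts

print(minimal_cut_sets())  # two cut sets: {C} alone, and {A, B} together
```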

To spur more innovation in the field, the aerospace giant launched the Airbus Quantum Computing Challenge, a set of tricky flight-physics problems (including aircraft climb optimization, and wing box and aircraft loading design optimization). Thus far, 292 teams and individuals from 44 countries have registered to compete.

Secret Ingredient: People


That’s a smart move by Airbus. The biggest challenge for quantum computing is not the technology, but rather building up a workforce to program and run quantum computing devices. This is particularly tricky given that the skills needed for classical software development are entirely different from those that will be needed for quantum programming, speakers and attendees at the conference repeatedly said.

“You can’t take a software developer and make a quantum computing engineer,” Whurley said. “You need to understand the physics in order to affect these machines.”

Just as early computing required an understanding of the underlying hardware, so too does quantum computing require an understanding of physics, many at the conference argued. While the traditional view has held that it is a matter of building out the right abstractions, through frameworks and such, others, like Whurley, argue that the gap is too large: quantum computing can’t be expressed in the language of classical computing.

“It’s an education thing, not an abstraction thing,” he said.

Whurley’s Austin-based StrangeWorks is looking at “long-term investment” in quantum computing, though he admits it is too early to start picking winners. We are not in the Intel vs. AMD stage of the industry yet, he said.

When will we get there? Optimists in the field predict quantum computing will turn into a full-fledged industry as soon as 2021, while others don’t expect it to bloom for decades.

“It is a very long term view we need to have in this space,” Whurley said.

Quantum Superiority

A lot of people at this conference spoke of when we might see (the unfortunately-named) “quantum supremacy,” that point where quantum computing reaches a level of proficiency that can’t be matched by classical computing (“legacy computing”?).

Reaching quantum supremacy, however, has its own problem: ensuring accuracy.

Many consider a quantum computer with 50 qubits to be the point at which quantum surpasses even the most powerful HPC machines. Yet how would you check the work? Practitioners like Google deliberately keep their designs simple so their results can be checked by classical computers, noted Joe Emerson, CEO of Quantum Benchmark and professor of applied mathematics at the University of Waterloo.

From Joe Emerson, CEO of Quantum Benchmark.


Digital computing is completely different from quantum computing: digital computing is discrete and predictable, while quantum computing is continuous and error-prone. The power of quantum computing grows exponentially, but the error rate grows even faster, he said.
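The exponential growth is easy to quantify, and it explains the 50-qubit checking problem: simulating n qubits classically means storing 2^n complex amplitudes. A quick back-of-the-envelope calculation:

```python
# Memory needed to hold a full n-qubit state vector classically,
# at 16 bytes per complex128 amplitude.
for n in (30, 40, 50):
    gib = 2 ** n * 16 / 2 ** 30
    print(f"{n} qubits -> {gib:,.0f} GiB")
# 30 qubits -> 16 GiB (a laptop)
# 40 qubits -> 16,384 GiB (a large cluster)
# 50 qubits -> 16,777,216 GiB, i.e. ~16 PiB (beyond any single machine)
```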

“Error correction is at the heart of the architecture you want to build,” agreed Microsoft’s director of quantum computing business development, Ben Porter, during his own presentation.

Feature image by skeeze from Pixabay.
