If the 2020s are the decade of AI, then the 2030s will be the decade of Quantum Computing. A large-scale quantum computer could change the world, performing in minutes certain calculations that would take the largest supercomputers millions of years. The impact on applications such as cryptography, chemistry, energy and climate, and finance would be enormous.

Quantum computing builds on the principles of quantum mechanics, most notably superposition and entanglement. Combined, these properties allow compute power to scale exponentially: a quantum computer with just 50 quantum bits (qubits) would outperform all but the largest supercomputers today. Thanks to the richness of physics, there are at least a dozen physical systems that can serve as quantum bits, from superconducting loops to entangled photons.
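To get a feel for that exponential scaling, consider a back-of-envelope estimate: an n-qubit state is described by 2^n complex amplitudes, so even writing down the state of ~50 entangled qubits exceeds the memory of classical machines. The short sketch below is an illustrative calculation (not from the talk); the byte-per-amplitude figure assumes double-precision complex numbers.

```python
# Back-of-envelope (illustrative assumption, not from the talk): classical
# memory needed just to store the state vector of n entangled qubits.
# An n-qubit state has 2**n complex amplitudes; assume 16 bytes each
# (complex128: two 64-bit floats).

BYTES_PER_AMPLITUDE = 16


def state_vector_bytes(n_qubits: int) -> int:
    """Bytes needed to hold all 2**n_qubits complex amplitudes."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE


def human_readable(n_bytes: float) -> str:
    """Format a byte count with binary prefixes."""
    for unit in ("B", "KiB", "MiB", "GiB", "TiB", "PiB"):
        if n_bytes < 1024:
            return f"{n_bytes:,.1f} {unit}"
        n_bytes /= 1024
    return f"{n_bytes:,.1f} EiB"


for n in (20, 35, 50):
    size = state_vector_bytes(n)
    print(f"{n} qubits: {2**n:,} amplitudes, ~{human_readable(size)}")

# At 50 qubits the state vector alone is ~16 PiB, which is why a processor
# with ~50 entangled qubits already strains the largest supercomputers.
```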

Today’s quantum processors, however, are limited to tens to hundreds of entangled quantum bits. If you believe the hype, a commercially relevant system is just around the corner. The reality is that we are still at mile one of a marathon, and many fundamental questions remain unanswered. At Intel, our approach is to build our qubits using the humble transistor, relying on the continued evolution of Moore’s Law to realize quantum computing.

Is this too complex? Remember that Richard Feynman, who worked on the Manhattan Project and won a Nobel Prize in Physics, once said, “If you think you understand quantum mechanics, then you don’t understand quantum mechanics.” In this talk, let’s see if we can cover the subject at a level that both daunts us with the challenges and excites us with the possibilities.