Will we see a quantum computing revolution?

On the brink of technological maturity, startups are taking quantum computing to the next level. But is it a revolution or a bubble?

In late 2019, Google announced a quantum computer that completed a calculation in about three minutes. That might not sound like much (you can order takeout for dinner with your phone in less time), but Google estimated that the same task would have taken the fastest classical supercomputer some 10,000 years. This marked yet another milestone in the development of quantum computers. Judging by the sheer number of startups in the space, as well as the hundreds of millions of dollars in funding, quantum computing seems to be the next big thing in tech.

Market projections back that up: by 2025, the quantum computing market is expected to hit $770 million. Quantum cryptography alone might account for $214 million of that, since it promises connections whose security rests on the laws of physics, an extra layer of protection for every field from finance to healthcare.

Given this information, investors are understandably looking to pour cash into startups dealing with quantum computing. Between 2017 and 2018, investors put more than $450 million into quantum computing. That’s more than four times the amount that was invested in the two years prior.

The question that nobody can answer yet is whether this surge in investment is inflating a bubble. If that bubble bursts, the market could cool off into “quantum winters,” analogous to AI winters: droughts of investment activity that have slowed the growth of AI several times over the past decades.

Whether or not the current boom is a bubble has implications across the field. Right now, investing in or working at a quantum computing startup means accepting an uncertain future. That doesn’t mean quantum computing won’t succeed in the long run, but you’ll want some deeper knowledge before making a decision.

Because of its enormous computing power, quantum computing might help bring about new scientific breakthroughs: discovering life-saving drugs, developing new materials for more efficient devices and buildings, inventing financial strategies for a comfortable retirement, and finding new algorithms to allocate resources or manage supply chains. For now, however, some challenges must be overcome before we can truly speak of a “quantum revolution.”

Key concepts of quantum computing

In short, quantum computers can store and process far more information on fewer processors than classical computers can. A classical processor encodes all information in sequences of zeros and ones, each zero or one being one bit. A quantum processor stores information in qubits, the quantum analogue of bits. The difference: each qubit can be a zero and a one at the same time.

Think about it like a coin: When lying on the table, it’s either heads or tails. But if you spin it, there’s no way to tell whether it’s heads or tails; in a sense, it is both. Depending on how it spins, there may be, for example, a 70 percent chance that you get heads and a 30 percent chance that you get tails once the coin comes to rest. This concept of being in two states at once is called superposition.
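The 70/30 coin maps directly onto how a qubit’s measurement statistics work: each single measurement still yields a plain 0 or 1, and only the frequencies over many repeated measurements reveal the underlying probabilities. Here is a minimal sketch in plain Python; the 70 percent probability is just the illustrative figure from the coin example, not a value from any real device:

```python
import random

# Illustrative qubit in superposition: the squared amplitudes give
# a 70% chance of measuring 0 and a 30% chance of measuring 1.
P_ZERO = 0.7

def measure_many(n_shots=100_000, seed=42):
    """Simulate repeated measurements of identically prepared qubits."""
    rng = random.Random(seed)
    zeros = sum(rng.random() < P_ZERO for _ in range(n_shots))
    return zeros / n_shots

print(measure_many())  # close to 0.7, even though every single shot is 0 or 1
```

Note that the superposition itself is invisible in any one shot; it only shows up statistically, which is exactly why the spinning coin (rather than a coin at rest) is the right picture.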

When many qubits are combined, you end up with a probability for every combination of zeros and ones, and the number of combinations grows exponentially. For example, each letter you’re reading on your screen is encoded in a sequence of eight zeros and ones: eight classical bits. Describing the joint state of eight qubits, by contrast, takes 2⁸ = 256 numbers, one per combination. In that sense, an eight-qubit register carries as much descriptive information as 256 classical bits, enough, in principle, for 32 letters instead of a single one.
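The exponential scaling behind these numbers is easy to check directly: a classical description of n qubits needs one amplitude per combination of zeros and ones, i.e. 2ⁿ numbers. A quick sketch:

```python
# A classical description of n qubits needs one amplitude per
# combination of zeros and ones: 2**n numbers in total.
def amplitudes(n_qubits: int) -> int:
    return 2 ** n_qubits

print(amplitudes(8))   # 256
print(amplitudes(50))  # 1125899906842624, about a quadrillion
```

The second line is the reason roughly 50 qubits is such a significant threshold: tracking a quadrillion amplitudes is already at the edge of what classical supercomputers can manage.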

The behavior of a qubit can be described with a wave, just like a point on a spinning coin. Picture by author.

The current state of the art is around 50 qubits, and since the state space grows exponentially, such a machine juggles about 2⁵⁰, i.e. a quadrillion (1,000 trillion), amplitudes at once. At the end of the process, you measure the final state of each qubit and read off the result. Even the biggest supercomputers can’t track that much state.

Making 50 or more qubits work together isn’t as easy as it sounds, however. Interference, a basic wave phenomenon, makes qubits error-prone. Earlier, I described a qubit in superposition as a spun coin. Now imagine two qubits whizzing around like coins on a table. If you track the position of a point on one coin over time, it traces out a wave (see figure). Just like sound and water waves, these waves travel through space and time with a certain amplitude and wavelength.

Sometimes, two waves interfere destructively, and as a result you measure nothing. Picture the two coins again: both spinning on the tabletop at the same speed, say clockwise, but with one coin always lagging half a turn behind the other (see the “destructive interference” section of the figure below). In quantum computing, we often can’t separate the two measurements, so effectively we measure the two waves as one. In this scenario, we never measure anything at all, because the positions of the coins cancel out!
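The half-turn lag is a phase shift of π, and a sine wave shifted by π is the exact negative of the original, so the sum vanishes at every instant. A small sketch with Python’s math module, modeling each coin-point as a sine wave:

```python
import math

def wave(t, phase=0.0):
    """Position of a point on a spinning coin, modeled as a sine wave."""
    return math.sin(2 * math.pi * t + phase)

# The second coin lags half a turn (a phase shift of pi), so at every
# moment its wave is the negative of the first: the sum is always zero.
ts = [i / 100 for i in range(200)]
combined = [wave(t) + wave(t, phase=math.pi) for t in ts]
print(max(abs(v) for v in combined))  # ~0.0: total destructive interference
```

Shift the phase to 0 instead of π and the two waves add up instead, which is the “constructive interference” panel of the same figure.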

A closely related problem is decoherence: when qubits interact with their environment in uncontrolled ways, the delicate phase relationships between their waves get scrambled, destroying the interference patterns a computation relies on. At the moment, this problem is yet to be fully solved. Decoherence causes errors, and we obviously don’t want errors in software that might have a huge impact on human lives.

The third and final key concept of quantum computing is entanglement, which has no analogue in the classical computing world. When two quantum particles become entangled, their measurement outcomes are perfectly correlated: for some entangled states, measuring both particles always returns the same result. It’s as if you had two coins that were somehow magically connected. If you stopped one coin from spinning in Tokyo and the other in London, both would show the same face, heads or tails, no matter how far apart they are.
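Those correlation statistics can be caricatured in a few lines. The sketch below is a classical cartoon, not a real quantum simulation: it models a maximally entangled pair by collapsing both “coins” to one shared random value, which reproduces the perfect correlation while each individual outcome stays random:

```python
import random

def measure_entangled_pair(rng):
    """Both measurements collapse to one shared random outcome (0 or 1)."""
    outcome = rng.randint(0, 1)
    return outcome, outcome  # Tokyo result, London result

rng = random.Random(7)
shots = [measure_entangled_pair(rng) for _ in range(1_000)]
print(all(tokyo == london for tokyo, london in shots))  # True: always correlated
print(sum(t for t, _ in shots) / len(shots))            # ~0.5: each face is random
```

The cartoon captures only the correlations; what makes real entanglement genuinely quantum is that no hidden shared value fixed in advance can explain all of its measurement statistics.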

This phenomenon is extremely important for quantum encryption: in the near future, we might store one quantum particle on one computer and entangle it with a particle on another. A connection secured this way is attractive because any eavesdropping disturbs the entangled state in a way that is immediately detectable. Entanglement also underpins protocols such as quantum teleportation, which transfer quantum states between machines with the help of an ordinary classical channel, and which could form the basis of a quantum internet in the decades to come.

Schematic of destructive and constructive interference of two waves
