Microsoft Topological Quantum Computing Approach
For faster quantum computing, Microsoft builds a better qubit
Microsoft's new approach to quantum computing is "very close," an executive
says.
Mostly borrowed & updated from Steve Lamb in Microsoft Land….
Google just announced quantum supremacy, a milestone in which the radically
different nature of a quantum computer lets it vastly outpace a traditional
machine. But Microsoft expects progress of its own by redesigning the core
element of quantum computing, the qubit.
Microsoft has been working on a qubit technology called a topological qubit
that it expects will deliver benefits from quantum computing technology that
today are mostly just a promise. After spending five years figuring out the
complicated hardware of topological qubits, the company is almost ready to
put them to use, said Krysta Svore, general manager of Microsoft's quantum
computing software work.
"We've really spent the recent few years developing that technology," Svore
said Thursday after a talk at the IEEE International Conference on Rebooting
Computing. "We believe we're very close to having that."
Quantum computers are hard to understand, hard to build, hard to operate and
hard to program. Since they only work when chilled to a tiny fraction of a
degree above absolute zero -- colder than outer space -- you're not likely
to have a quantum laptop anytime soon.
But running them in data centers where customers can tap into them could
deliver profound benefits by tackling computing challenges that classical
computers can't handle. Among examples Svore offered are solving chemistry
problems like making fertilizer more efficiently, or routing trucks to speed
deliveries and cut traffic.
Better qubits
Classical computers store data as a bit that represents either a 0 or a 1.
Qubits, though, can store a combination of 0 and 1 simultaneously through a
peculiar quantum physics principle called superposition. And qubits can be
ganged together through another phenomenon called entanglement. Together,
the phenomena should enable quantum computers to explore an enormous number
of possible solutions to a problem at the same time.
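As a rough illustration of these two ideas, a state vector of four amplitudes is enough to simulate a two-qubit register in plain Python. The sketch below (an illustration only, not Microsoft's tooling) puts one qubit into superposition with a Hadamard gate, then entangles the pair with a CNOT, producing the perfectly correlated "Bell" state:

```python
import math

# A 2-qubit state is a vector of 4 amplitudes over |00>, |01>, |10>, |11>.
state = [1, 0, 0, 0]  # start in |00>

def hadamard_on_qubit0(s):
    """Put the first qubit into an equal superposition of 0 and 1."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[2]), h * (s[1] + s[3]),
            h * (s[0] - s[2]), h * (s[1] - s[3])]

def cnot(s):
    """Flip the second qubit when the first is 1, entangling the pair."""
    return [s[0], s[1], s[3], s[2]]

bell = cnot(hadamard_on_qubit0(state))
probs = [abs(a) ** 2 for a in bell]
print(probs)  # ~[0.5, 0, 0, 0.5]: only 00 and 11 ever occur, perfectly correlated
```

Doubling the register to n qubits requires 2**n amplitudes, which is exactly why classical simulation of quantum machines becomes intractable so quickly.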
One of the basic quantum computing problems is that qubits are easily
perturbed. That's why the heart of a quantum computer is housed in a
refrigerated container the size of a 55-gallon drum.
Even with that isolation, though, individual qubits today can perform
useful work for only a fraction of a second. To compensate, quantum computer
designers plan to use a technique called error correction that yokes many
qubits together into a single effective qubit, called a logical qubit. The
idea is that logical qubits can keep performing useful processing work even
when some of their underlying physical qubits have gone astray.
The key advantage of Microsoft's topological qubit is that fewer physical
qubits are needed to make one logical qubit, Svore said.
Specifically, she thinks one logical qubit will require 10 to 100 physical
qubits with Microsoft's topological qubits. That compares to something like
1,000 to 20,000 physical qubits for other approaches.
"We believe that overhead will be far less," she said. That'll mean quantum
computers will become practical with far fewer qubits.
By comparison, Google's Sycamore quantum computing chip used 53 physical
qubits. For serious quantum computing work, researchers are hoping to reach
qubit levels of at least a million.
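The arithmetic behind these figures is simple multiplication. The sketch below compares total physical-qubit budgets for an assumed target of 100 logical qubits (the target is an assumption for illustration), using the overhead ranges quoted above:

```python
def physical_budget(logical_qubits, overhead):
    """Total physical qubits needed at a given physical-to-logical ratio."""
    return logical_qubits * overhead

target_logical = 100
for label, overhead in [("topological (low end)", 10),
                        ("topological (high end)", 100),
                        ("other approaches (low end)", 1_000),
                        ("other approaches (high end)", 20_000)]:
    print(label, physical_budget(target_logical, overhead))
```

At the 20,000:1 ratio, 100 logical qubits already require two million physical qubits; at 10:1 the same machine needs only a thousand.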
One drawback of Microsoft's topological qubits, though, is that they're not
available yet. Alternative designs might not work as well, but they're in
real-world testing today.
Better quantum computing algorithms
Microsoft is also trying to improve other aspects of quantum computing. One
is the control system, which in today's quantum computers is a snarl of
hundreds of wires, each an expensive coaxial cable used to communicate with
qubits.
On Monday at Microsoft's Ignite conference, the company also showed off a
new quantum computer control system developed with the University of Sydney
that uses many fewer wires -- down from 216 to just three, Svore said. "We
think this will scale to tens of thousands of qubits and beyond."
And Svore pushed for progress on quantum computing software, too, urging
professors to introduce their students to learning and improving quantum
computing algorithms.
In one example of those benefits, Microsoft tackled an aspect of that
nitrogen-fixing fertilizer problem that simply couldn't be solved on a
classical machine -- but found that a quantum computer would still take
30,000 years.
That's faster than a classical computer that would require "the lifetime of
the universe," but still not practical, she said. But with algorithm
improvements, Microsoft found a way to shorten that to just a day and a
half.
"New algorithms can be a breakthrough in how to solve something," Svore
said. "We need to make them better, we need to optimize them, we need to be
pushing."
Developing a Topological Qubit
As quantum technologies advance, we get closer to finding solutions to some
of the world’s most challenging problems. While this new paradigm holds
incredible possibility, quantum computing is very much in its infancy. To
fully embrace the power and potential of quantum computing, the system must
be engineered to meet the demands of the solutions the world needs most.
The fragile nature of qubits is well-known as one of the most significant
hurdles in quantum computing. Even the slightest interference can cause
qubits to collapse, making the solutions we’re pursuing impossible to
identify because the computations cannot be completed.
Microsoft is addressing this challenge by developing a topological qubit.
Topological qubits are protected from noise because their quantum
information is stored non-locally, at two spatially separated points, making
our quantum computer more robust against outside interference. This
increased stability will help the quantum computer scale to complete longer,
more complex computations, bringing the solutions we need within reach.
Topology and quantum computing
Topology is a branch of mathematics describing structures that experience
physical changes such as being bent, twisted, compacted, or stretched, yet
still maintain the properties of the original form. When applied to quantum
computing, topological properties create a level of protection that helps a
qubit retain information despite what’s happening in the environment. The
topological qubit achieves this extra protection in two different
ways:
- Electron fractionalization: By splitting the electron, quantum information is stored in both halves, behaving similarly to data redundancy. If one half of the electron runs into interference, there is still enough information stored in the other half to allow the computation to continue.
- Ground state degeneracy: Topological qubits are engineered to have two ground states—known as ground state degeneracy—making them much more resistant to environmental noise. Normally, achieving this protection isn’t feasible because there’s no way to discriminate between the two ground states. However, topological systems can use braiding or measurement to distinguish the difference, allowing them to achieve this additional protection.
The path to the topological qubit
Now years into the development of the topological qubit, the journey
began with a single question: “Could a topological qubit be achieved?”
Working with theory as a starting point, Microsoft brought together
mathematicians, computer scientists, physicists, and engineers to explore
possible approaches. These experts collaborated, discussed methods, and
completed countless equations to take the first steps on the path toward
realizing a topological qubit.
Modeling and experimentation work hand-in-hand as an ongoing, iterative
cycle, guiding the design of the topological qubit. Throughout this process,
the Microsoft team explored possible materials, ways to apply control
structure, and methods to stabilize the topological qubit.
A team member proposed the use of a superconductor in conjunction with a
strong magnetic field to create a topological phase of matter—an approach
that has been adopted toward realizing the topological qubit. While bridging
these properties had long been theorized, it had never been done in such a
controlled way prior to this work.
To create the exact surface layer needed for the qubit, chemical compounds
are currently being grown in Microsoft labs using a technique called
“selective area growth.” Chosen for its atomic-level precision, this unique
method can be described as spraying atoms in the exact arrangement needed to
achieve the properties required.
The team continues testing functional accuracy through device simulation, to
ensure that every qubit will be properly tuned, characterized, and
validated.
Bridging fields to advance technology
Many fields of knowledge have come together to realize the topological
qubit, including mathematics, theoretical physics, solid state physics,
materials science, instrumentation and measurement technology, computer
science, quantum algorithms, quantum error correction, and software
applications development.
Bridging these fields has led to breakthrough techniques across all aspects
of realizing a topological qubit, including:
- Theory and simulation – Turning a vision into reality by creating a rapid design, simulation, and prototyping process
- Fabrication – Pioneering unique fabrication approaches and finding new ways to bridge properties
- Materials growth – Developing inventive methods to create materials using special growth techniques to create the exact properties required at nanoscale
- Measurement and quantum control – Tuning devices for accuracy in function and measurement
A complete quantum system from hardware to software
The process of building a quantum computer includes creating the raw
materials needed to make topological quantum devices, fabricating the cold
electronics and refrigeration systems, and developing the overall
infrastructure needed to bring the solution to life. In addition, our system
includes everything you need to program the quantum computer, including a
control system, software, development tools, and Azure services—a
combination we refer to as our full quantum stack.
Because quantum and classical work together, Microsoft Azure is a perfect
environment for quantum processing and deployment. With data stored in
Azure, developers will be able to access quantum processing alongside
classical processing, creating a streamlined experience.
Using the complete Microsoft quantum system, what would the start-to-finish
experience look like?
Beginning with a problem you may be able to solve with a quantum algorithm…
You would start by building your solution in Visual Studio, using the tools
found in the Microsoft Quantum Development Kit.
Using Q#, a language created specifically for quantum development, you would
write the code for your solution with the help of the extensive Microsoft
quantum libraries.
When your code is complete, you would run a quantum simulation to check for
bugs and validate that your solution is ready for deployment.
Once validated, you would be ready to run your solution on the quantum
computer.
Your quantum solution would be deployed from within Microsoft Azure, using
the quantum computer as a co-processor. As many scenarios will use both
quantum and classical processing, Azure will streamline workflows as
real-time or batch applications, later connecting results directly into your
business processes.
Together, this full quantum stack pairs with familiar tools to create an
integrated, streamlined environment for quantum processing.
Scalability, from top to bottom
Quantum computers can help address some of the world’s toughest problems,
provided the quantum computer has enough high-quality qubits to find the
solution. While the quantum systems of today may be able to add a high
number of qubits, the quality of the qubits is the key factor in creating
useful scale. From the cooling system to qubits to algorithms,
scalability is a fundamental part of the Microsoft vision for quantum
computing.
The topological qubit is a key ingredient in our scalable quantum system.
Different from traditional qubits, a topological qubit is built in a way
that automatically protects the information it holds and processes. Due to
the fragile nature of conventional qubits, this protection offers a landmark
improvement in performance, providing added stability and requiring fewer
qubits overall. This critical benefit makes the ability to scale possible.
Microsoft has been working on scalable quantum computing for well over a
decade, creating its first quantum computing group—known as Station Q—in
2006. Through this sustained investment, we have connected some of the
brightest minds in the industry and academia to make
this dream a reality. Blending physics, mathematics, engineering, and
computer science, teams around the globe work daily to advance the
development of the topological qubit and the Microsoft vision for quantum
computing.
Empowering the quantum revolution
At Microsoft, we envision a future where quantum computing is available to a
broad audience, scaling as needed to solve some of the world’s toughest
challenges. Our quantum approach begins within familiar tools you know and
use such as Visual Studio. It provides development resources to build and
simulate your quantum solutions. And it continues with deployment through
Azure for a streamlined combination of both quantum and classical
processing.
As the path to build a quantum computer continues, challenges from across
industries await solutions from this new computational power. One of the
many examples of high-impact problems that can be solved on a quantum
computer is developing a new alternative to fertilizer production. Making
fertilizer requires a notable percentage of the world’s annual production of
natural gas. This implies high cost, high energy waste, and substantial
greenhouse emissions. Quantum computers can help identify a new alternative
by analyzing nitrogenase, an enzyme that soil bacteria use to convert
nitrogen to ammonia naturally. To address this problem, a quantum computer would require
at least 200 fault-free qubits—far beyond the small quantum systems of
today. In order to find a solution, quantum computers must scale up. The
challenge, however, is that scaling a quantum computer isn’t as simple as
merely adding more qubits.
Building a quantum computer differs greatly from building a classical
computer. The underlying physics, the operating environment, and the
engineering each pose their own obstacles. With so many unique challenges,
how can a quantum computer scale in a way that makes it possible to solve
some of the world’s most challenging problems?
Navigating obstacles
Most quantum computers require temperatures colder than those found in deep
space. To reach these temperatures, all the components and hardware are
contained within a dilution refrigerator—highly specialized equipment that
cools the qubits to just above absolute zero. Because standard electronics
don’t work at these temperatures, a majority of quantum computers today use
room-temperature control. With this method, controls on the outside of the
refrigerator send signals through cables, communicating with the qubits
inside. The challenge is that this method ultimately reaches a roadblock:
the heat created by the sheer number of cables limits the output of signals,
restraining the number of qubits that can be added.
As more control electronics are added, more effort is needed to maintain the
very low temperature the system requires. Increasing both the size of the
refrigerator and the cooling capacity is a potential option; however, this
would require additional logistics to interface with the room-temperature
electronics, which may not be a feasible approach.
Another alternative would be to break the system into separate
refrigerators. Unfortunately, this isn’t ideal either because the transfer
of quantum data between the refrigerators is likely to be slow and
inefficient.
At this stage in the development of quantum computers, size is therefore
limited by the cooling capacity of the specialized refrigerator. Given these
parameters, the electronics controlling the qubits must be as efficient as
possible.
Physical qubits, logical qubits, and the role of error correction
By nature, qubits are fragile. They require a precise environment and state
to operate correctly, and they’re highly prone to outside interference. This
interference is referred to as ‘noise’, a persistent challenge and
a well-known reality of quantum computing. As a result, error correction
plays a significant role.
As a computation begins, the initial set of qubits in the quantum computer
is referred to as ‘physical qubits’. Error correction works by grouping
many of these fragile physical qubits to create a smaller number of
usable qubits that can withstand noise long enough to complete the
computation. These stronger, more stable qubits used in the computation are
referred to as ‘logical qubits’.
In classical computing, noisy bits are corrected through redundancy (for
example, parity checks and Hamming codes), which is a way to fix errors as
they occur. A similar process occurs in quantum computing, but it is more
difficult to achieve. This
results in significantly more physical qubits than the number of logical
qubits required for the computation. The ratio of physical to logical qubits
is influenced by two factors: 1) the type of qubits used in the quantum
computer, and 2) the overall size of the quantum computation performed. And
due to the known difficulty of scaling the system size, reducing the ratio
of physical to logical qubits is critical. This means that instead of just
aiming for more qubits, it is crucial to aim for better qubits.
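As a concrete classical example of the parity-based codes mentioned above, the Hamming(7,4) code protects 4 data bits with 3 parity bits, a 7:4 physical-to-logical ratio, and its recomputed parity checks form a "syndrome" that pinpoints any single flipped bit:

```python
def hamming_encode(d):
    """Encode 4 data bits as 7 bits with 3 parity bits (Hamming(7,4))."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]  # bit positions 1..7

def hamming_correct(c):
    """Recompute parities; their pattern (the syndrome) names the flipped bit."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 0 means no error detected
    if syndrome:
        c[syndrome - 1] ^= 1  # flip the offending bit back
    return [c[2], c[4], c[5], c[6]]  # extract the 4 data bits

word = [1, 0, 1, 1]
sent = hamming_encode(word)
sent[4] ^= 1  # corrupt one bit in transit
print(hamming_correct(sent))  # recovers [1, 0, 1, 1]
```

Quantum codes follow the same syndrome-measurement philosophy, but must do so without collapsing the encoded superposition, which is what drives the much larger physical-to-logical ratios quoted in the article.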
Stability and Scale with a Topological Qubit
The topological qubit is a type of qubit that offers more immunity to noise
than many traditional types of qubits. Topological qubits are more robust
against outside interference, meaning fewer total physical qubits are needed
when compared to other quantum systems. With this improved performance, the
ratio of physical to logical qubits is reduced, which in turn, creates the
ability to scale.
As we know from Schrödinger’s cat, outside interactions can destroy quantum
information. Any interaction from a stray particle, such as an electron, a
photon, a cosmic ray, etc., can cause the quantum computer to decohere.
There is a way to prevent this: parts of the electron can be separated,
creating an increased level of protection for the information stored. This
form of topological protection relies on the Majorana quasi-particle, which
was predicted in 1937 and was detected for the first
time in the Microsoft Quantum lab in the Netherlands in 2012. This
separation of the quantum information creates a stable, robust building
block for a qubit. The topological qubit provides a better foundation with
lower error rates, reducing the ratio of physical to logical qubits. With
this reduced ratio, more logical qubits are able to fit inside the
refrigerator, creating the ability to scale.
If topological qubits were used in the example of nitrogenase simulation,
the required 200 logical qubits would be built out of thousands of physical
qubits. However, if more traditional types of qubits were used, tens or even
hundreds of thousands of physical qubits would be needed to achieve 200
logical qubits. The topological qubit’s improved performance causes this
dramatic difference; fewer physical qubits are needed to achieve the logical
qubits required.
Developing a topological qubit is extremely challenging and is still
underway, but these benefits make the pursuit well worth the effort.
A solid foundation to tackle problems unsolved by today’s computers
A significant number of logical qubits are required to address some of the
important problems currently unsolvable by today’s computers. Yet common
approaches to quantum computing require massive numbers of physical qubits
in order to reach these quantities of logical qubits—creating a huge
roadblock to scalability. Instead, a topological approach to quantum
computing requires far fewer physical qubits than other quantum systems,
making scalability much more achievable.
Providing a more solid foundation, the topological approach offers robust,
stable qubits, and helps to bring the solutions to some of our most
challenging problems within reach.
Microsoft's Approach: Topological Systems
At Microsoft Quantum, our ambition is to help solve some of the world’s most
complex problems by developing scalable quantum technology. Our global team
of researchers, scientists, and engineers are addressing this challenging
task by developing a topological qubit.
To realize this vision, our teams have been making advances in materials and
device fabrication, designing the precise physical environment required to
support the topological state of matter. The latest discovery by the team
expands the landscape for creating and controlling the exotic particles
critical for enabling topological superconductivity in nanoscale devices.
Discovery: a new route to topology
Our qubit architecture is based on nanowires, which under certain conditions
(low-temperature, magnetic field, material choice) can enter a topological
state. Topological quantum hardware is intrinsically robust against local
sources of noise, making it particularly appealing as we scale up the number
of qubits.
An intriguing feature of topological nanowires is that they support Majorana
zero modes (MZMs) that are neither fermions nor bosons. Instead, they obey
different, more exotic quantum exchange rules. If kept apart and braided
around each other, similar to strands of hair, MZMs remember when they
encircle each other. Such braiding operations act as quantum gates on a
state, allowing for a new kind of computation that relies on the topology of
the braiding pattern.
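The non-Abelian character of these exchange rules, i.e. that the order of braids matters, can be checked numerically. In the sketch below, four Majorana operators are written in a standard Jordan-Wigner matrix form (a textbook construction, not a model of Microsoft's hardware), each exchange is the unitary (1 + γᵢγⱼ)/√2, and two successive braids are shown not to commute:

```python
import math

# Pauli matrices and helpers for tiny complex matrix algebra.
I2 = [[1, 0], [0, 1]]
X = [[0, 1], [1, 0]]
Y = [[0, -1j], [1j, 0]]
Z = [[1, 0], [0, -1]]

def kron(a, b):
    """Kronecker (tensor) product of two square matrices."""
    n, m = len(a), len(b)
    return [[a[i // m][j // m] * b[i % m][j % m]
             for j in range(n * m)] for i in range(n * m)]

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Four Majorana operators for two fermionic modes (Jordan-Wigner form).
g1, g2, g3, g4 = kron(X, I2), kron(Y, I2), kron(Z, X), kron(Z, Y)

def braid(gi, gj):
    """Unitary for exchanging Majoranas i and j: (1 + gi*gj)/sqrt(2)."""
    prod = matmul(gi, gj)
    n = len(prod)
    return [[(prod[i][j] + (1 if i == j else 0)) / math.sqrt(2)
             for j in range(n)] for i in range(n)]

U12, U23 = braid(g1, g2), braid(g2, g3)
ab, ba = matmul(U12, U23), matmul(U23, U12)
noncommuting = any(abs(ab[i][j] - ba[i][j]) > 1e-9
                   for i in range(4) for j in range(4))
print(noncommuting)  # True: the order of braids matters (non-Abelian statistics)
```

Because the braid unitaries depend only on which strands wrap around which, small local perturbations leave the resulting gate unchanged; that is the topological protection in computational form.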
A topological qubit is constructed by arranging several nanowires hosting
MZMs in a comb-like structure and coupling them in a specific way that lets
them share multiple MZMs. The first step in building a topological qubit is
to reliably establish the topological phase in these nanowires.
While exploring the conditions for the creation of topological
superconductivity, the team discovered a topological quantum vortex state in
the core of a semiconductor nanowire surrounded on all sides by a
superconducting shell. They were very surprised to find Majorana modes in
the structure, akin to a topological vortex residing inside of a nanoscale
coaxial cable.
With hindsight, the findings can now be understood as a novel topological
extension of a 50-year-old piece of physics known as the Little-Parks
effect. In the Little-Parks effect, a superconductor in the shape of a
cylindrical shell – analogous to a soda straw – adjusts to an external
magnetic field threading the cylinder by jumping to a “vortex state” in
which the quantum wavefunction around the cylinder carries a twist. The
quantum wavefunction must close on itself.
Thus, the wavefunction phase accumulated by going around the cylinder must
take the values zero, one, two, and so on, in units of 2π. This has been
known for decades. What had not been explored in depth was what those twists
do to the semiconductor core inside the superconducting shell. The
surprising discovery made by the Microsoft team—experiment and theory—was
that a twist in the shell, under appropriate conditions, can create a
topological state in the core, with MZMs localized at opposite ends.
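The fluxoid quantization at the heart of the Little-Parks effect can be sketched in a few lines: the shell adopts the integer phase winding nearest the applied flux (in units of the flux quantum), stepping 0, 1, 2, ... as the field increases. This is a minimal illustration of the quantization rule only; which winding states turn the core topological is the subject of the discovery described above:

```python
def winding_number(flux_quanta):
    """The shell's phase winding jumps to the integer nearest the applied
    flux measured in flux quanta (fluxoid quantization)."""
    return round(flux_quanta)

# Sweeping the field: the winding steps 0 -> 1 -> 2 near half-integer flux.
for phi in [0.0, 0.4, 0.6, 1.0, 1.4, 1.6]:
    print(phi, winding_number(phi))
```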
While signatures of Majorana modes have been reported in related systems
without the fully surrounding cylindrical shell, these previous realizations
placed rather stringent requirements on materials and required large
magnetic fields. This discovery places few requirements on materials and
needs a smaller magnetic field, expanding the landscape for creating and
controlling Majoranas.
Worldwide collaboration
What started as two separate papers – one experimental, the other
theoretical – was combined into a single publication that tells the complete
story, with mutual support of experiment, theory, and numerics.
Of course, looking back, deep connections to previous ideas and experiments
can now be recognized, and results that were first mysterious now seem
inevitable. That is the nature of scientific progress: from seemingly
impossible to seemingly obvious after a few months of making, measuring, and
thinking.
Saulius Vaitiekėnas, then a PhD student and postdoc at the Niels Bohr
Institute, University of Copenhagen, and now a newly minted Microsoft
researcher, was the main experimentalist. As he comments, “The paper
represents a series of surprises. And it was really exciting to see so many
different disciplines come together, all in a united activity.”
Roman Lutchyn, Principal Research Manager and lead of the theoretical
effort, reflected on the collaboration process. “Microsoft Quantum started
with just a small group in Santa Barbara. Now we’ve grown into a much
broader organization with labs all around the world – Copenhagen, Delft,
Purdue, Sydney, Redmond, among others. I think this paper is a landmark in
our partnership between teams and is a model of how we can work effectively
together as one team – around the world – on related ideas in physics,
ultimately generating new and potentially important results.”
Charles Marcus, Scientific Director of Microsoft Quantum Lab – Copenhagen
and lead for the experimental effort, concurs, “[This paper is an example]
where two results – from theory and experiment – help each other to make
more conclusive statements about physics. Otherwise, we would have been left
with more abstract theory; and experimentally, we would have measurements
but may have hedged on interpretation. By merging theory and experiment, the
overall story is stronger and also more interesting, seeing the connection
to related phenomena in different systems.”
Inside Microsoft’s Quest to Make Quantum Computing Scalable
The company’s researchers are building a system that’s unlike any other
quantum computer being developed today.
There’s no shortage of big tech companies building quantum computers, but
Microsoft claims its approach to manufacturing qubits will make its quantum
computing systems more powerful than others’. The company’s researchers are
pursuing “topological” qubits, which store data in the path of moving exotic
Majorana particles. This is different from storing it in the state of
electrons, which is fragile.
That’s according to Krysta Svore, research manager in Microsoft’s Quantum
Architectures and Computation group. The Majorana particle paths -- with a
fractionalized electron appearing in many places along them -- weave
together like a braid, which makes for a much more robust and efficient
system, she said in an interview with Data Center Knowledge. These qubits
are called “topological qubits,” and the systems are called “topological
quantum computers.”
With other approaches, it may take 10,000 physical qubits to create a
logical qubit that’s stable enough for useful computation, because the state
of the qubits storing the answer to your problem “decoheres” very easily,
she said. It’s harder to disrupt an electron that’s been split up along a
topological qubit, because the information is stored in more places.
In quantum mechanics, particles are described by wavefunctions. Coherence is
achieved when waves that interfere with each other have the same frequency
and constant phase relation. In other words, they don’t have to be in phase
with each other, but the difference between the phases has to remain
constant. If it does not, the particle states are said to decohere.
“We’re working on a universally programmable circuit model, so any other circuit-based quantum machine will be able to run the same class of algorithms, but we have a big differentiator,” Svore said. “Because the fidelity of the qubit promises to be several orders of magnitude better, I can run an algorithm that’s several orders of magnitude bigger. If I can run many more operations without decohering, I could run a class of algorithm that in theory would run on other quantum machines but that physically won’t give a good result. Let’s say we’re three orders of magnitude better; then I can run three orders of magnitude more operations in my quantum circuit.”
Theoretically, that could mean a clear advantage of a quantum computer over
a classical one. “We can have a much larger circuit which could
theoretically be the difference between something that shows quantum
advantage or not. And for larger algorithms, where error corrections are
required, we need several orders of magnitude less overhead to run that
algorithm,” she explained.
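The back-of-the-envelope version of this argument: if each operation fails independently with probability p, a circuit of roughly budget/p operations can run before errors swamp the result, so a thousand-fold better fidelity buys a thousand-fold deeper circuit. The error rates and the 50% error budget below are arbitrary assumptions for illustration:

```python
def ops_before_decoherence(error_per_op, total_error_budget=0.5):
    """Roughly how many gates fit before accumulated error exceeds the
    budget: with independent errors, total ~ n * p, so n ~ budget / p."""
    return round(total_error_budget / error_per_op)

baseline = ops_before_decoherence(1e-3)  # a 0.1% error rate per gate
better = ops_before_decoherence(1e-6)    # three orders of magnitude better
print(baseline, better, better // baseline)  # 1000x more operations
```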
A Hardware and Software System that Scales
Microsoft has chosen to focus on topological qubits because the researchers
believe it will scale, and the company is also building a complete hardware
stack to support the scaling. “We’re building a cryogenic computer to
control the topological quantum chip; then we're building a software system
where you can compile millions of operations and beyond.”
The algorithms running on the system could be doing things like quantum
chemistry – looking for more efficient fertilizer or a room-temperature
superconductor – or improving machine learning. Microsoft Research has
already shown that deep learning trains faster with a quantum computer. With
the same deep learning models in use today, Svore says, the research shows
“quadratic speedups” even before you start adding quantum terms to the data
model, which seems to improve performance even further.
Redesigning a Programming Language
To get developers used to the rather different style of quantum programming,
Microsoft will offer a new set of tools in preview later this year (which
doesn’t have a name yet) that’s a superset built on what it learned from the
academics, researchers, students, and developers who used Liquid, an
embedded domain specific language in F# that Microsoft created some years
ago.
The language itself has familiar concepts like functions, if statements,
variables, and branches, but it also has quantum-specific elements and a
growing set of libraries developers can call to help them build quantum
apps.
“We’ve almost completely redesigned the language; we will offer all the
things Liquid had, but also much more, and it’s not an embedded language.
It’s really a domain-specific language designed upfront for scalable quantum
computing, and what we’ve tried to do is raise the level of abstraction in
this high-level language with the ability to call vast numbers of libraries
and subroutines.”
Some of those are low-level subroutines like an adder, a multiplier, and
trigonometry functions, but there are also higher-level functions that are
commonly used in quantum computing. “Tools like phase estimation, amplitude
amplification, amplitude estimation -- these are very common frameworks for
your quantum algorithms. They’re the core framework for setting up your
algorithm to measure and get the answer out at the end [of the computation],
and they’re available in a very pluggable way.”
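Amplitude amplification, one of the frameworks Svore names, can be illustrated with a toy state-vector calculation. The sketch below is a minimal pure-Python illustration, not Microsoft's library code: the two-qubit register, the choice of marked state, and the single iteration are all illustrative assumptions.

```python
# A minimal sketch of amplitude amplification (one Grover iteration)
# on a 2-qubit state vector. Everything here -- the register size,
# the marked item, the iteration count -- is an illustrative assumption.

N = 4                 # 2 qubits -> 4 basis states
marked = 3            # assume the "good" state is |11>

# Start in the uniform superposition (what a Hadamard on every qubit gives).
state = [1 / N ** 0.5] * N

def oracle(amps):
    """Flip the sign of the marked state's amplitude."""
    return [-a if i == marked else a for i, a in enumerate(amps)]

def diffusion(amps):
    """Invert every amplitude about the mean (the Grover diffusion step)."""
    mean = sum(amps) / len(amps)
    return [2 * mean - a for a in amps]

# One iteration suffices for 1 marked item out of 4.
state = diffusion(oracle(state))
probs = [round(a * a, 6) for a in state]
print(probs)  # the marked state now has probability 1.0
```

After one oracle-plus-diffusion step, all the probability has been amplified onto the marked state, which is exactly the "set up your algorithm to measure and get the answer out at the end" role Svore describes.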
A key part of making the language accessible is the way it’s integrated into
Visual Studio, Microsoft’s IDE. “I think this is a huge step forward,” Svore
told us. “It makes it so much easier to read the code because you get the
syntax coloring and the debugging; you can set a breakpoint, you can
visualise the quantum state.”
Being able to step through your code to understand how it works is critical
to learning a new language or a new style of programming, and quantum
computing is a very different style of computing.
“As we’ve learned about quantum algorithms and applications, we’ve put what
we’ve learned into libraries to make it easier for a future generation of
quantum developers,” Svore said. “Our hope is that as a developer you’re not
having to think at the lower level of circuits and probabilities. The
ability to use these higher-level constructs is key.”
Hybrid Applications
The new language will also make it easier to develop hybrid applications
that use both quantum and classical computing, which Svore predicts will be
a common pattern. “With the quantum computer, many of the quantum apps and
algorithms are hybrid. You're doing pre- and post-processing, or in some
algorithms you’ll even be doing a very tight loop with a classical
supercomputer.”
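That hybrid loop pattern can be sketched in miniature: a classical outer loop repeatedly calls a quantum subroutine and post-processes the result. Below, the "quantum" step is a simulated one-qubit measurement and the optimizer is a plain grid search; both are illustrative assumptions, not any real Microsoft API.

```python
import math

# A hedged sketch of the hybrid quantum/classical pattern: a classical
# loop calls a (simulated) quantum subroutine and keeps the best result.
# The one-qubit "energy" function and the grid search are assumptions
# for illustration only.

def quantum_expectation(theta):
    """Simulate preparing Ry(theta)|0> and measuring <Z>."""
    a0 = math.cos(theta / 2)   # amplitude of |0>
    a1 = math.sin(theta / 2)   # amplitude of |1>
    return a0 * a0 - a1 * a1   # <Z> = P(0) - P(1) = cos(theta)

# Classical outer loop: scan the parameter, keep the lowest "energy".
best_theta, best_energy = min(
    ((t, quantum_expectation(t))
     for t in (i * 2 * math.pi / 100 for i in range(100))),
    key=lambda pair: pair[1],
)
print(round(best_energy, 3))  # -1.0, found near theta = pi
```

In a real deployment the inner call would go to quantum hardware (or a simulator) while the search, and any pre- and post-processing, stays on classical machines.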
How Many Qubits Can You Handle?
Microsoft, she says, is making progress with its topological qubits, but
since it's impossible to put any kind of date on when a working system might
emerge from all this work, the company will release a quantum simulator so
you can actually run the programs you write, along with the other
development tools.
Depending on how powerful your system is, you’ll be able to simulate between
30 and 33 qubits on your own hardware. For 40 qubits and more, you can do
the simulation on Azure.
“At 30 qubits, it takes roughly 16GB of classical memory to store that
quantum state, and each operation takes a few seconds,” Svore explains. But
as you simulate more qubits, you need far more resources: each additional
qubit doubles the memory required, so adding ten qubits multiplies it by two
to the power of 10. Simulating 40 qubits takes about 16TB, and going from 40
to 41 qubits doubles that again. Pretty soon, you're hitting petabytes of
memory. "At 230
qubits, the amount of memory you need is 10^80 bytes, which is more bytes
than there are particles in the physical universe, and one operation takes
the lifetime of the universe,” Svore said. “But in a quantum computer, that
one operation takes 100 nanoseconds.”
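Those figures are easy to sanity-check: a full state vector for n qubits holds 2^n complex amplitudes, so memory doubles with every qubit added. A back-of-the-envelope script, assuming the 16 bytes per amplitude that the 16GB-at-30-qubits figure implies:

```python
# Sanity-check the article's simulation-memory figures. A state vector
# for n qubits holds 2**n complex amplitudes; assume 16 bytes each
# (two 64-bit floats), which is what 16GB at 30 qubits implies.

BYTES_PER_AMPLITUDE = 16  # complex128: 8-byte real + 8-byte imaginary

def state_vector_bytes(n_qubits):
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

print(state_vector_bytes(30) / 2 ** 30)  # 16.0 -> 16 GB at 30 qubits
print(state_vector_bytes(40) / 2 ** 40)  # 16.0 -> 16 TB at 40 qubits
print(state_vector_bytes(41) / state_vector_bytes(40))  # 2.0 per extra qubit
```

The doubling per qubit is why classical simulation tops out in the low tens of qubits on a workstation, and why Azure-scale hardware only buys a handful more.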
Microsoft’s broad-based quantum effort
LIQUi|> is one of a number of quantum computing projects Microsoft
researchers have been spearheading for more than a decade, in the quest to
create the next generation of computing that will have a profound effect on
society.
In addition to the QuArC research group, Microsoft’s Station Q research lab,
led by renowned mathematician Michael Freedman, is pursuing an approach
called topological quantum computing that they believe will be more stable
than other quantum computing methods.
The idea is to design software, hardware and other elements of quantum
computing all at the same time.
“This isn’t just, ‘Make the qubits.’ This is, ‘Make the system,’” Wecker
said.
A qubit is a unit of quantum information, and it’s the key building block to
a quantum computer. Using qubits, researchers believe that quantum computers
could very quickly evaluate multiple solutions to a problem at the same
time, rather than sequentially. That would give scientists the ability to do
high-speed, complex calculations, allowing biologists, physicists and
chemists to get information they never thought possible before.
LIQUi|> - Station Q overview
Fertilizer, batteries and climate change
Take fertilizer, for example. Fertilizers are crucial to feeding the world’s
growing population because they allow plants to develop better and faster.
But synthetic fertilizer relies on natural gas, and lots of it: That’s
expensive, depletes an important natural resource and adds to pollution.
Using a quantum computer, Wecker said, scientists think they could map the
chemical process used by bacteria that naturally create fertilizer, making
it easier to develop an alternative to the current, natural-gas-based
synthetic fertilizer.
The incredible power of quantum computers also could be used to figure out
how to create organic batteries that don’t rely on lithium, and Wecker said
they could help to create systems for capturing carbon emissions
effectively, potentially reducing the effects of climate change.
Researchers believe that quantum computers will be ideal for challenges like
this, which involve mapping complex physical systems, but they also know
that they won’t be the best choice for all computing problems. That’s
because quantum computers operate very differently from classical digital
computers.
Although quantum computers can process data much faster, it's much more
difficult to get the results of their calculations: measuring a quantum
state collapses it, yielding only one of its many possible outcomes. A
person using a quantum system needs to know the right question to ask in
order to efficiently get the answer they want.
For now at least, quantum computer scientists also are struggling to create
systems that can run lots of qubits. Because qubits are essentially a scarce
resource, Svore said another big research focus is on how to minimize the
number of qubits needed to do any algorithm or calculation. That’s also one
of the main focuses of Station Q, which is using an area of math called
topology to find ways to use fewer qubits.
Wecker said that’s another major advantage to a system like LIQUi|>: It
will help researchers figure out how best to use these unique computers.
As quantum computing technology becomes increasingly sophisticated, the
techniques required to calibrate and certify device performance are becoming
commensurately sophisticated. In this talk, I will discuss the need for QCVV
(quantum characterization, verification, and validation) protocols to
facilitate advances towards fault-tolerant universal quantum computation. In
particular, I'll examine what kind of errors we expect nascent quantum
information processors to suffer from, and how the QCVV tools may be used
for detecting, diagnosing, and ultimately correcting such errors. To
illustrate this point, I will examine the role gate set tomography (GST)
played in characterizing quantum operations on a trapped-Yb-ion qubit, and
how GST was iteratively used to a) make the qubit gate behavior Markovian
and b) verify that the errors on the qubit operations were below the
threshold for fault-tolerance. Lastly, several "GST-adjacent" QCVV
protocols, such as drift- and cross-talk detection will be examined, and the
future of QCVV research will be discussed.
This work was supported by the Intelligence Advanced Research Projects
Activity (IARPA), and Sandia's Laboratory Directed Research and Development
(LDRD) Program. Sandia National Laboratories is a multimission laboratory
managed and operated by National Technology and Engineering Solutions of
Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc.,
for the U.S. Department of Energy's National Nuclear Security Administration
under contract DE-NA-0003525.
Better Quantum Living Through QCVV