28 November 2021

Achieving Scalability in Quantum Computing

 


As the path toward building a quantum computer continues, challenges from across industries await solutions from this new computational power. One example of a high-impact problem a quantum computer could solve is finding an alternative to today's fertilizer production. Making fertilizer consumes a few percent of the world's annual natural gas production, which means high costs, wasted energy, and substantial greenhouse gas emissions. Quantum computers could help identify an alternative process by analyzing nitrogenase, an enzyme that certain bacteria use to convert atmospheric nitrogen into ammonia naturally. To address this problem, a quantum computer would require at least 200 fault-free qubits, far beyond the small quantum systems of today. To find a solution, quantum computers must scale up. The challenge, however, is that scaling a quantum computer isn't as simple as adding more qubits.

Building a quantum computer differs greatly from building a classical computer. The underlying physics, the operating environment, and the engineering each pose their own obstacles. With so many unique challenges, how can a quantum computer scale in a way that makes it possible to solve some of the world’s most challenging problems?


Navigating obstacles

Most quantum computers require temperatures colder than those found in deep space. To reach these temperatures, all the components and hardware are contained within a dilution refrigerator, a piece of highly specialized equipment that cools the qubits to just above absolute zero. Because standard electronics don't work at these temperatures, most quantum computers today use room-temperature control: controls outside the refrigerator send signals through cables to communicate with the qubits inside. The problem is that this method eventually hits a roadblock. The heat carried in by the sheer number of cables limits how many signals can be sent, capping the number of qubits that can be added.
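
To get a feel for why cabling becomes the bottleneck, consider a rough back-of-envelope estimate, sketched below in Python. Every number here is an illustrative assumption rather than a measured specification; actual heat loads depend on the cable type, thermal anchoring, and fridge design.

    # Back-of-envelope estimate of the cabling heat load (all values are
    # illustrative assumptions, not measured specifications).
    heat_per_cable_w = 1e-3    # assume ~1 mW conducted in per coax line
    cables_per_qubit = 2       # assume one control and one readout line
    cooling_budget_w = 1.0     # assume ~1 W of cooling power at the 4 K stage

    max_qubits = cooling_budget_w / (heat_per_cable_w * cables_per_qubit)
    print(f"Cabling alone caps the system near {max_qubits:.0f} qubits")

Under these assumptions the wiring alone caps the machine at a few hundred qubits, which is why the control electronics themselves must move into the cold.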

As more control electronics are added, more effort is needed to maintain the very low temperature the system requires. Increasing both the size of the refrigerator and its cooling capacity is a potential option; however, this would require additional logistics to interface with the room-temperature electronics, which may not be feasible.

An alternative would be to split the system across separate refrigerators. Unfortunately, this isn't ideal either, because transferring quantum data between refrigerators is likely to be slow and inefficient.

At this stage in the development of quantum computers, size is therefore limited by the cooling capacity of the specialized refrigerator. Given these parameters, the electronics controlling the qubits must be as efficient as possible.

Physical qubits, logical qubits, and the role of error correction

By nature, qubits are fragile. They require a precise environment and state to operate correctly, and they're highly prone to outside interference. This interference is referred to as 'noise', a constant challenge and a well-known reality of quantum computing. As a result, error correction plays a significant role.

As a computation begins, the qubits physically present in the quantum computer are referred to as 'physical qubits'. Error correction works by grouping many of these fragile physical qubits into a smaller number of usable qubits that can stay resilient to noise long enough to complete the computation. These stronger, more stable qubits used in the computation are referred to as 'logical qubits'.

In classical computing, noisy bits are corrected through redundancy, using schemes such as parity bits and Hamming codes to fix errors as they occur. A similar process exists in quantum computing, but it is harder to achieve, not least because quantum states cannot simply be copied. The result is that a computation needs significantly more physical qubits than logical qubits. The ratio of physical to logical qubits is influenced by two factors: 1) the type of qubits used in the quantum computer, and 2) the overall size of the quantum computation performed. And because scaling the system size is so difficult, reducing the ratio of physical to logical qubits is critical. Instead of just aiming for more qubits, it is crucial to aim for better qubits.
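
As a classical analogy, the sketch below shows a 3-bit repetition code with majority-vote decoding in Python. Quantum error correction is substantially harder, since qubits cannot be copied outright, but the core trade is the same: spend several noisy physical units to get one more reliable logical unit.

    import random

    def encode(bit):
        # One "logical" bit stored redundantly as three "physical" bits.
        return [bit, bit, bit]

    def noisy_channel(bits, p_flip=0.05):
        # Each physical bit flips independently with probability p_flip.
        return [b ^ (random.random() < p_flip) for b in bits]

    def decode(bits):
        # Majority vote recovers the logical bit unless 2+ bits flipped.
        return int(sum(bits) >= 2)

    trials = 100_000
    errors = sum(decode(noisy_channel(encode(0))) for _ in range(trials))
    print(f"logical error rate: {errors / trials:.4f}")   # ~0.007 vs. 0.05 raw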


Stability and scale with a topological qubit

The topological qubit is a type of qubit that offers more immunity to noise than many traditional types of qubits. Topological qubits are more robust against outside interference, meaning fewer total physical qubits are needed than in other quantum systems. With this improved performance, the ratio of physical to logical qubits shrinks, which, in turn, creates the ability to scale.

As we know from Schrödinger's cat, outside interactions can destroy quantum information. Any interaction with a stray particle, such as an electron, a photon, or a cosmic ray, can cause the quantum computer to decohere.

There is a way to guard against this: the quantum information of an electron can be split between physically separated parts, hiding the stored information from local disturbances. This is a form of topological protection based on the Majorana quasiparticle, which was predicted in 1937 and whose signatures were first detected in the Microsoft Quantum lab in the Netherlands in 2012. This separation of quantum information creates a stable, robust building block for a qubit. The topological qubit provides a better foundation with lower error rates, reducing the ratio of physical to logical qubits. With this reduced ratio, more logical qubits fit inside the refrigerator, creating the ability to scale.

If topological qubits were used for the nitrogenase simulation, the required 200 logical qubits could be built from thousands of physical qubits. With more traditional types of qubits, tens or even hundreds of thousands of physical qubits would be needed to reach 200 logical qubits. The topological qubit's improved performance drives this dramatic difference: far fewer physical qubits are needed per logical qubit.
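
The sketch below makes that difference concrete with a generic, surface-code-style overhead estimate. The threshold, target error rate, and the 2*d*d qubit count are assumed round numbers for illustration; real overheads depend on the specific code and machine.

    # Illustrative error-correction overhead (assumed constants throughout).
    # Logical error rate per qubit ~ (p / P_TH) ** ((d + 1) / 2) at distance d.
    P_TH = 1e-2           # assumed threshold error rate of the code
    TARGET = 1e-12        # assumed acceptable logical error rate
    LOGICAL = 200         # logical qubits needed for the nitrogenase example

    def physical_per_logical(p):
        d = 3
        while (p / P_TH) ** ((d + 1) / 2) > TARGET:
            d += 2                 # code distance grows in odd steps
        return 2 * d * d           # rough qubit count, ancillas included

    for p in (1e-3, 1e-4, 1e-6):   # better qubits = lower physical error p
        n = physical_per_logical(p)
        print(f"p = {p:.0e}: {n:>5} physical per logical, {n * LOGICAL:>7} total")

Under these assumptions, improving the physical error rate from 1e-3 to 1e-6 shrinks the total from roughly 200,000 physical qubits to roughly 10,000, which is the sense in which better qubits beat more qubits.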

Developing a topological qubit is extremely challenging and is still underway, but these benefits make the pursuit well worth the effort.

A solid foundation to tackle problems unsolved by today’s computers

A significant number of logical qubits are required to address some of the important problems currently unsolvable by today’s computers. Yet common approaches to quantum computing require massive numbers of physical qubits in order to reach these quantities of logical qubits—creating a huge roadblock to scalability. Instead, a topological approach to quantum computing requires far fewer physical qubits than other quantum systems, making scalability much more achievable.

Providing a more solid foundation, the topological approach offers robust, stable qubits and helps bring solutions to some of our most challenging problems within reach.

Myth vs. reality: a practical perspective on quantum computing

There’s a lot of speculation about the potential for quantum computing, but to get a clearer vision of the future impact, we need to disentangle myth from reality. At this week’s virtual Q2B conference, we take a pragmatic perspective to cut through the hype and discuss the practicality of quantum computers, how to future-proof quantum software development, and the real value obtained today through quantum-inspired solutions on classical computers.


Achieving practical quantum advantage

Dr. Matthias Troyer, Distinguished Scientist with Microsoft Quantum, explains what it will take for quantum computing to be better and faster than classical computing in his talk, Disentangling Hype from Reality: Achieving Practical Quantum Advantage. People talk about many problems they hope quantum computers can help with, including fighting cancer, forecasting the weather, and countering climate change. A pragmatic approach to determining real speedups lets us focus the work on the areas that will deliver impact.

For example, quantum computers have limited I/O capability and thus will not be good at big-data problems. Where quantum does excel is large compute problems on small data. This includes chemistry and materials science, with game-changing applications such as designing better batteries, new catalysts, and quantum materials, or countering climate change. But even for compute-intensive problems, we need to take a closer look. Troyer explains that each operation in a quantum algorithm is slower than its classical counterpart by more than ten orders of magnitude. The algorithm must therefore provide a speedup large enough to overcome the slowdown intrinsic to the quantum system; quadratic speedups will not suffice, so we need superquadratic speedups.
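
The arithmetic behind that conclusion is simple enough to sketch. With an assumed effective slowdown of ten orders of magnitude per operation (an illustrative figure, not a machine specification), a quadratic speedup only breaks even at problem sizes no one would wait for, while an exponential speedup breaks even almost immediately:

    import math

    SLOWDOWN = 1e10       # assumed effective per-operation slowdown

    # Quadratic speedup: quantum does sqrt(N) ops where classical does N.
    # Quantum wins only when SLOWDOWN * sqrt(N) < N, i.e. N > SLOWDOWN**2.
    crossover = SLOWDOWN ** 2
    years = crossover * 1e-9 / (3600 * 24 * 365)   # classical ops at 1 ns each
    print(f"quadratic breakeven: N = {crossover:.0e} ops "
          f"(~{years:.0f} years of classical work)")

    # Exponential speedup: quantum does ~log2(N) ops where classical does N.
    n = 2
    while SLOWDOWN * math.log2(n) >= n:
        n *= 2
    print(f"exponential breakeven: N ~ {n:.0e} ops (minutes of classical work)")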

Troyer is optimistic about the potential of quantum computing but brings a realistic perspective on what is needed to reach practical quantum advantage: small-data/big-compute problems, superquadratic speedups, fault-tolerant quantum computers scaling to millions of qubits and beyond, and the tools and systems to develop the algorithms that run on them.



Future-proofing quantum development

Developers and researchers want to ensure they invest in languages and tools that will adapt to the capabilities of more powerful quantum systems in the future. Microsoft’s open-source Quantum Intermediate Representation (QIR) and the Q# programming language provide developers with a flexible foundation that protects their development investments.

QIR is a new Microsoft-developed intermediate representation for quantum programs that is hardware- and language-agnostic, so it can serve as a common interface between many languages and target quantum computation platforms. Based on the popular open-source LLVM intermediate language, QIR is designed to enable the development of a broad and flexible ecosystem of software tools for quantum development.

As quantum computing capabilities evolve, we expect large-scale quantum applications will take full advantage of both classical and quantum computing resources working together. QIR provides full capabilities for describing rich classical computation fully integrated with quantum computation. It’s a key layer in achieving a scaled quantum system that can be programmed and controlled for general algorithms.
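
As a toy analogy (not QIR's actual LLVM-based format), the sketch below shows the architectural idea in Python: front ends lower different source languages to one shared instruction list, and interchangeable back ends consume it.

    # Toy intermediate representation -- an analogy only, NOT real QIR,
    # which is expressed in LLVM IR. The point is the decoupling: any
    # front end can emit this list; any back end can consume it.
    program = [
        ("h", 0),            # Hadamard on qubit 0
        ("cnot", 0, 1),      # entangle qubits 0 and 1
        ("measure", 0),
        ("measure", 1),
    ]

    def simulator_backend(ir):
        for inst in ir:
            print("simulate:", inst)

    def hardware_backend(ir):
        for inst in ir:
            print("compile to pulses:", inst)

    simulator_backend(program)   # the same IR targets either platform
    hardware_backend(program)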

In his presentation at the Q2B conference, Future-Proofing Your Quantum Development with Q# and QIR, Microsoft Senior Software Engineer Stefan Wernli explains to a technical audience why QIR and Q# are practical investments for long-term quantum development. Learn more about QIR in our recent Quantum Blog post.

Quantum-inspired optimization solutions today

At the same time, there are ways to get practical value today through “quantum-inspired” solutions that apply quantum principles for increased speed and accuracy to algorithms running on classical computers.

We are already seeing how quantum-inspired optimization solutions can solve complex transportation and logistics challenges. An example is Microsoft’s collaboration with Trimble Transportation to optimize its transportation supply chain, presented at the Q2B conference in Freight for the Future: Quantum-Inspired Optimization for Transportation by Anita Ramanan, Microsoft Quantum Software Engineer, and Scott Vanselous, VP Digital Supply Chain Solutions at Trimble.

Trimble’s Vanselous explains how today’s increased dependence on e-commerce and shipping has fundamentally raised expectations across the supply chain. However, there was friction in the supply chain because of siloed data between shippers, carriers, and brokers; limited visibility; and a focus on task optimization vs. system optimization. Trimble and Microsoft are designing quantum-inspired load matching algorithms for a platform that enables all supply chain members to increase efficiency, minimize costs, and take advantage of newly visible opportunities. 


Many industries—automotive, aerospace, healthcare, government, finance, manufacturing, and energy—have tough optimization problems where these quantum-inspired solutions can save time and money. And these solutions will only get more valuable when scaled quantum hardware becomes available and provides further acceleration.

Building a bridge to the future of supercomputing with quantum acceleration

Using supercomputing and new tools for understanding quantum algorithms in advance of scaled hardware gives us a view of what may be possible in a future with scaled quantum computing. Microsoft’s new Quantum Intermediate Representation (QIR), designed to bridge different languages and different target quantum computation platforms, is bringing us closer to that goal. Several Department of Energy (DOE) national laboratories are using this Microsoft technology in their research at the new National Quantum Initiative (NQI) quantum research centers.

As quantum computing capabilities mature, we expect most large-scale quantum applications will take full advantage of both classical and quantum computing resources working together. QIR provides a vital bridge between these two worlds by providing full capabilities for describing rich classical computation fully integrated with quantum computation.

QIR is central to a new collaboration between Microsoft and DOE's Pacific Northwest National Laboratory (PNNL), born out of NQI's Quantum Science Center (QSC), which is led by DOE's Oak Ridge National Laboratory (ORNL). The goal of the PNNL project is to measure the impact of noisy qubits on the accuracy of quantum algorithms, specifically the Variational Quantum Eigensolver (VQE). Running the algorithm in simulation on a supercomputer requires a language in which to write it and another representation to map it onto the machine. PNNL wrote the VQE algorithm in Microsoft's Q# language, and QIR then provided the bridge, allowing easy translation and mapping to the supercomputer for the simulation.
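
For readers unfamiliar with VQE, the sketch below shows the idea at its smallest scale in Python: a parameterized circuit prepares a trial state, and a classical outer loop tunes the parameter to minimize the measured energy. The one-qubit Hamiltonian and ansatz are invented for illustration; PNNL's study ran far larger chemistry problems written in Q#.

    import numpy as np

    # Toy one-qubit Hamiltonian H = Z + 0.5 X (illustrative only).
    Z = np.array([[1, 0], [0, -1]], dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    H = Z + 0.5 * X

    def ansatz(theta):
        # Trial state Ry(theta)|0> = (cos(theta/2), sin(theta/2)).
        return np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

    def energy(theta):
        psi = ansatz(theta)
        return float(np.real(psi.conj() @ H @ psi))

    # Classical outer loop: scan the parameter, keep the minimum energy.
    thetas = np.linspace(0, 2 * np.pi, 1000)
    best = min(thetas, key=energy)
    print(f"VQE estimate: {energy(best):.4f}")
    print(f"exact ground state: {np.linalg.eigvalsh(H)[0]:.4f}")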

The PNNL team is showcasing the simulation running on ORNL’s Summit supercomputer at this week’s virtual International Conference for High Performance Computing, Networking, Storage, and Analysis (SC20). You can view their presentation here: Running Quantum Programs at Scale through an Open-Source, Extensible Framework.

Q# and QIR are also helping to advance research at ORNL, which is accelerating progress by enabling the use of the Q# language for all QSC members, including four national labs, three industry partners, and nine universities. ORNL is integrating Q# and QIR into its existing quantum computing framework, so ORNL researchers can run Q# code on a wide variety of targets including both supercomputer-based simulators and actual hardware devices. Supporting Q# is important to ORNL’s efforts to encourage experimentation with quantum programming in high-level languages.

The ORNL team is using QIR to develop quantum optimizations that work for multiple quantum programming languages. Having a shared intermediate representation allows the team to write optimizations and transformations that are independent of the original programming language. ORNL chose to use QIR because, being based on the popular LLVM suite, it integrates seamlessly with ORNL’s existing platform and provides a common platform that can support all of the different quantum and hybrid quantum/classical programming paradigms.


Since QIR is based on the open source LLVM intermediate language, it will enable the development of a broad ecosystem of software tools around the Q# language. The community can use QIR to experiment and develop optimizations and code transformations that will be crucial for unlocking quantum computing.

Microsoft technology is playing a crucial role in DOE’s NQI initiative connecting experts in industry, national labs, and academia to accelerate our nation’s progress towards a future with scaled quantum computing.

Learn more about the latest developments in quantum computing from Microsoft and our QSC national lab partner PNNL in these virtual SC20 conference sessions.

Complex quantum programs will require programming frameworks with many of the same features as classical software development, including tools to visualize the behavior of programs and diagnose issues. The Microsoft Quantum team presents new tools being added to the Microsoft Quantum Development Kit (QDK) for visualizing the execution flow of a quantum program at each step. These tools are valuable for experienced developers and researchers as well as for students and newcomers who want to explore and understand quantum algorithms interactively.

Dr. Krysta Svore, Microsoft's General Manager of Quantum Systems and Software, is on this year's exotic-systems panel. The SC20 panel will revisit predictions from past years' sessions, discuss what actually happened, and predict what will be available in computing systems in 2025, 2030, and 2035.

As quantum computers evolve, simulations of quantum programs on classical computers will be essential for validating quantum algorithms, understanding the effects of system noise, and designing applications for future quantum computers. In this paper, PNNL researchers first propose a new multi-GPU programming methodology that constructs a virtual BSP (bulk synchronous parallel) machine on top of modern multi-GPU platforms, then apply it to build a multi-GPU density-matrix quantum simulator. Their simulator is more than 10x faster than a corresponding state-vector quantum simulator on various platforms.
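
A density-matrix simulator tracks the full matrix rho rather than a state vector, which is what lets it model noise, at the price of squaring the memory cost. The minimal single-qubit sketch below illustrates that distinction; it bears no resemblance in scale to PNNL's multi-GPU simulator.

    import numpy as np

    # One noisy qubit, simulated as a density matrix (illustrative only).
    rho = np.array([[1, 0], [0, 0]], dtype=complex)     # start in |0><0|
    H_GATE = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

    rho = H_GATE @ rho @ H_GATE.conj().T                # apply a Hadamard

    # Depolarizing noise: with probability p the state becomes maximally
    # mixed. A pure state vector cannot represent this mixture, which is
    # why noise studies use density matrices.
    p = 0.1
    rho = (1 - p) * rho + p * np.eye(2) / 2

    print(f"P(measure 0) = {np.real(rho[0, 0]):.3f}")
    print(f"purity Tr(rho^2) = {np.real(np.trace(rho @ rho)):.3f}")  # < 1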

Full stack ahead: Pioneering quantum hardware allows for controlling up to thousands of qubits at cryogenic temperatures

Quantum computing offers the promise of solutions to previously unsolvable problems, but in order to deliver on this promise, it will be necessary to preserve and manipulate information that is contained in the most delicate of resources: highly entangled quantum states. One thing that makes this so challenging is that quantum devices must be ensconced in an extreme environment in order to preserve quantum information, but signals must be sent to each qubit in order to manipulate this information—requiring, in essence, an information superhighway into this extreme environment. Both of these problems must, moreover, be solved at a scale far beyond that of present-day quantum device technology.

Microsoft’s David Reilly, leading a team of Microsoft and University of Sydney researchers, has developed a novel approach to the latter problem. Rather than employing a rack of room-temperature electronics to generate voltage pulses to control qubits in a special-purpose refrigerator whose base temperature is 20 times colder than interstellar space, they invented a control chip, dubbed Gooseberry, that sits next to the quantum device and operates in the extreme conditions prevalent at the base of the fridge. They’ve also developed a general-purpose cryo-compute core that operates at the slightly warmer temperatures comparable to that of interstellar space, which can be achieved by immersion in liquid Helium. This core performs the classical computations needed to determine the instructions that are sent to Gooseberry which, in turn, feeds voltage pulses to the qubits. These novel classical computing technologies solve the I/O nightmares associated with controlling thousands of qubits.


Quantum computing could impact chemistry, cryptography, and many more fields in game-changing ways. The building blocks of quantum computers are not just zeroes and ones but superpositions of zeroes and ones. These foundational units of quantum computation are known as qubits (short for quantum bits). Combining qubits into complex devices and manipulating them can open the door to solutions that would take lifetimes for even the most powerful classical computers.

Despite the unmatched potential computing power of qubits, they have an Achilles' heel: great instability. Because quantum states are easily disturbed by the environment, researchers must go to extraordinary lengths to protect them, cooling them to nearly absolute zero and isolating them from outside disruptions such as electrical noise. Hence, a full system of many components is needed to maintain a regulated, stable environment. But all of this must be accomplished while still enabling communication with the qubits. Until now, this has required a bird's-nest-like tangle of cables, which can work for limited numbers of qubits (and perhaps even at an "intermediate scale") but not for large-scale quantum computers.


Microsoft Quantum researchers are playing the long game, using a holistic approach to aim for quantum computers at the larger scale needed for applications with real impact. Aiming for this bigger goal takes time, forethought, and a commitment to looking toward the future. In that context, the challenge of controlling large numbers of qubits looms large, even though quantum computing devices with thousands of qubits are still years in the future.

Enter the team of Microsoft and University of Sydney researchers, headed by Dr. David Reilly, who have developed a cryogenic quantum control platform that uses specialized CMOS circuits to take digital inputs and generate many parallel qubit control signals, allowing scaled-up support for thousands of qubits, a leap ahead of previous technology. The chip powering this platform, called Gooseberry, resolves several I/O issues in quantum computers by operating at 100 millikelvin (mK) while dissipating little enough power that it does not exceed the cooling capacity of a standard commercially available research refrigerator at these temperatures. This sidesteps the otherwise insurmountable challenge of running thousands of wires into a fridge.


Their work is detailed in the paper "A Cryogenic Interface for Controlling Many Qubits," published in Nature Electronics this month. They've also extended this research to create a first-of-its-kind general-purpose cryo-compute core, one step up the quantum stack. This core operates at around 2 kelvin (K), a temperature that can be reached by immersion in liquid helium. Although this is still very cold, it is 20 times warmer than the temperature at which Gooseberry operates, and 400 times as much cooling power is therefore available. With the luxury of dissipating 400 times as much heat, the core is capable of general-purpose computing. Both visionary pieces of hardware are critical advances toward large-scale quantum computing and are the result of years of work.

Both chips help manage communication between different parts of a large-scale quantum computer—and between the computer and its user. They are the key elements of a complex “nervous system” of sorts to send and receive information to and from every qubit, but in a way that maintains a stable cold environment, which is a significant challenge for a large-scale commercial system with tens of thousands of qubits or more. The Microsoft team has navigated many hurdles to accomplish this feat.

The big picture: Topological quantum computing and the quantum stack

Quantum computing devices are often measured by how many qubits they contain. However, not all qubits are created equal, so raw qubit counts often make for apples-to-oranges comparisons. Microsoft Quantum researchers are pioneering the development of topological qubits, which have a high level of error protection built in at the hardware level. This reduces the overhead needed for software-level error correction and enables meaningful computations to be done with fewer physical qubits.

Although this is one of the unique features of Microsoft's approach, it is not the only one. In the quantum stack, qubits make up the base. The quantum plane consists of a series of topological qubits (themselves made of semiconductors, superconductors, and dielectrics), along with the gates, wiring, and other packaging that help process information from raw qubits. The vital processes of communication occur one layer up, at the quantum-classical interface. The Gooseberry chip and the cryo-compute core work together to bookend this communication. The core sits at the bottom of the classical-compute portion of the stack, while Gooseberry is unique among control platforms in that it sits right down with the qubits, at the same temperature as the quantum plane, converting classical instructions from the cryo-compute core into voltage signals sent to the qubits.

Play it cool: Dissipating heat in a CMOS-based control platform

Why does it matter where the Gooseberry chip sits? It is partly an issue of heat. When the wires connecting the control chip to the qubits are long (as they would have to be if the control chip were at room temperature), significant heat can be generated inside the fridge. Putting the control chip near the qubits avoids this problem, but the trade-off is that heat generated by the chip itself could then warm up the qubits. Gooseberry navigates these competing effects by sitting near, but not too near, the qubits: it lives in the refrigerator but is thermally isolated from the qubits, so heat created by the chip is drawn away from them and into the mixing chamber.

Placing the chip near the qubits at the quantum plane solves one set of temperature problems but creates another: to operate there, the chip must function at the qubits' own temperature of 100 mK. Operating standard bulk CMOS chips at this temperature is challenging, so the chip instead uses fully depleted silicon-on-insulator (FDSOI) technology, which optimizes it for cryogenic operation. Its transistors have a back-gate bias, a fourth terminal that can be used to compensate for changes in temperature. This system of transistors and gates allows qubits to be calibrated individually, with the transistors sending individualized voltages to each qubit.

Gates galore: No need for separate control lines from room temperature to every qubit

Another advantage of Gooseberry is that its design charges the electrical gates controlling the qubits from a single voltage source that cycles through the gates in "round-robin" fashion, recharging each as necessary. Previous qubit controllers required one-to-one cables from multiple voltage sources at room temperature or 4 K, compromising the ability to operate qubits at large scale. The design pioneered by Dr. Reilly's team greatly reduces the heat such a controller dissipates. The cryogenic temperatures also help make this possible: the extreme cold lets capacitors hold their charge longer, so the gates need recharging less often and produce less heat and fewer disruptions to qubit stability.
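
A toy model of why this works, with assumed values rather than Gooseberry's actual figures: a charge-locked gate voltage droops as the capacitor leaks, and the droop rate sets how often the single source must come back around.

    import math

    # Toy charge-locking model (assumed values, not Gooseberry's specs).
    C = 100e-15          # assume 100 fF gate capacitance
    R_LEAK = 1e15        # assume enormous leakage resistance in the cold
    TAU = R_LEAK * C     # RC time constant: 100 s with these numbers

    DROOP = 0.001        # tolerate 0.1% voltage droop per gate
    hold = -TAU * math.log(1 - DROOP)
    gates = 1000
    print(f"each gate holds to spec for ~{hold:.2f} s, so refreshing")
    print(f"{gates} gates needs only ~{gates / hold:.0f} charging ops per second")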


The Gooseberry chip comprises both digital and analog blocks. Coupled digital logic circuits handle communication, waveform memory, and autonomous operation of the chip through a finite-state machine (FSM), and the digital part of the chip also includes a master oscillator. The chip uses a Serial Peripheral Interface (SPI) for easy communication higher up the quantum stack. The analog component is a series of cells, called "charge-lock fast-gate" (CLFG) cells, that perform two functions. First, the charge-lock function charges the gates, as described above, with the voltage stored on each gate tailored to an individual qubit. Second, information is processed by changing the voltages on the gates in the "fast-gating" function, which creates the pulses that physically manipulate the qubits and ultimately direct the processing of information in them.

Benchmarking results of the cryo-CMOS control with a quantum dot chip

Low power dissipation is a key challenge in communicating with qubits efficiently via these pulses. Three variables determine the power dissipated: voltage level, frequency, and capacitance. The voltage needed is set by the qubit, and the frequency is set by both the qubit and the clock rate of the quantum plane. That leaves capacitance as the only variable that can be lowered to reduce dissipation when charging gates and sending pulses: low capacitance means low dissipation. The capacitors in this system are tiny, spaced close together, and located very near the quantum plane, so shuttling charge between them to communicate with the qubits takes as little power as possible.
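
The relationship this paragraph describes is the standard dynamic-dissipation formula P = C * V^2 * f. Plugging in assumed, illustrative values shows why tiny capacitances matter so much at the cold stage:

    # Dynamic power for toggling a capacitive load: P = C * V^2 * f.
    # All values are assumed for illustration, not Gooseberry's figures.
    C = 10e-15     # assume 10 fF effective capacitance per cell
    V = 0.5        # assume 0.5 V swing on the qubit gates
    F = 1e6        # assume 1 MHz pulse rate per cell
    CELLS = 1000   # assume a thousand control cells

    per_cell = C * V ** 2 * F
    print(f"per cell: {per_cell * 1e9:.2f} nW")          # ~2.5 nW
    print(f"total:    {per_cell * CELLS * 1e6:.2f} uW")  # ~2.5 uW

Under these assumptions the whole array dissipates only a few microwatts, comfortably below the cooling power a large dilution refrigerator can supply near 100 mK.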


The researchers tested the Gooseberry chip by connecting it to a GaAs-based quantum dot (QD) device. Some of the gates in the quantum dot device were also connected to a digital-to-analog converter (DAC) at room temperature, to compare the results with standard control approaches. Power leakage from the CLFG cells is measured by a second quantum dot in the device, and measurements of the QD conductance provide a way to monitor the charge-locking process. The temperatures of the chip's components are tracked as the control chip powers up, showing that the temperature stays below 100 mK across the necessary range of frequencies, or clock speeds. See the paper for more details on the benchmarking process.

Extrapolating from these results, the researchers estimated the total system power needed for the Gooseberry control chip as a function of frequency and the number of output gates. Taking into account both the clock speed and the temperature needed for topological qubits, the results show that the chip can operate within acceptable limits while communicating with thousands of qubits. This CMOS-based control approach also appears feasible for qubit platforms based on electron spins or gatemons.

Proof of principle that general-purpose compute is possible at cryogenic temperatures

The general-purpose cryo-compute core is a more recent development that continues the progress made with Gooseberry. It is a general-purpose CPU operating at cryogenic temperatures; at present it runs at approximately 2 K and handles some of the triggering, manipulation, and handling of data. With fewer limitations from temperature, it also handles branching decision logic, which requires more digital circuit blocks and transistors than Gooseberry has. The core acts as an intermediary between Gooseberry and executable code written by developers, allowing software-configurable communication between the qubits and the outside world. This technology proves it is possible to compile and run many types of code (written with current tools) in a cryogenic environment, expanding what can be accomplished with qubits controlled by the Gooseberry chip.


There’s no doubt that both Gooseberry and the cryo-compute core represent big steps forward for quantum computing, and having these concepts peer-reviewed and validated by other scientists is another leap ahead. But there are still many more leaps needed by researchers before a meaningful quantum computer can be realized. This is one of the reasons Microsoft has chosen to focus on the long game. While it might be nice to ramp up one aspect of quantum computers—such as the number of qubits—there are many concepts to be developed beyond the fundamental building blocks of quantum computers, and researchers at Microsoft Quantum and the University of Sydney aren’t stopping with these results.


Projects like the Gooseberry chip and cryo-compute core take years to develop, but these researchers aren’t waiting to put new quantum projects into motion. The idea is to keep scaffolding prior work with new ideas so that all of the components necessary for quantum computing at large scale will be in place, enabling Microsoft to deliver solutions to many of the world’s most challenging problems.

More Information

https://cloudblogs.microsoft.com/quantum/2018/05/16/achieving-scalability-in-quantum-computing/

https://cloudblogs.microsoft.com/quantum/2018/06/06/the-microsoft-approach-to-quantum-computing/

https://cloudblogs.microsoft.com/quantum/2021/10/07/the-azure-quantum-ecosystem-expands-to-welcome-qiskit-and-cirq-developer-community/

https://news.microsoft.com/europe/2018/09/24/microsoft-and-the-university-of-copenhagen-are-building-the-worlds-first-scalable-quantum-computer/

https://www.microsoft.com/en-us/research/research-area/quantum-computing/?facet%5Btax%5D%5Bmsr-research-area%5D%5B%5D=243138&facet%5Btax%5D%5Bmsr-content-type%5D%5B%5D=post

https://azure.microsoft.com/en-us/resources/whitepapers/search/?term=quantum

https://azure.microsoft.com/en-us/solutions/quantum-computing/#news-blogs

https://sc20.supercomputing.org/

https://www.microsoft.com/en-us/research/research-area/quantum-computing/?facet%5Btax%5D%5Bmsr-research-area%5D%5B0%5D=243138&sort_by=most-recent

https://www.microsoft.com/en-us/research/blog/state-of-the-art-algorithm-accelerates-path-for-quantum-computers-to-address-climate-change/

https://www.microsoft.com/en-us/research/blog/full-stack-ahead-pioneering-quantum-hardware-allows-for-controlling-up-to-thousands-of-qubits-at-cryogenic-temperatures/

https://arxiv.org/abs/2007.14460

https://www.microsoft.com/en-us/research/publication/quantum-computing-enhanced-computational-catalysis/

https://ionq.com/

https://www.honeywell.com/us/en/company/quantum

https://www.honeywell.com/us/en/news/2020/06/quantum-scientific-papers











