
Quantum computing takes flight



Quantum computers promise to perform certain tasks much faster than conventional (classical) computers. In essence, a quantum computer carefully orchestrates quantum effects (superposition, entanglement and interference) to explore a vast computational space and ultimately converge on a solution, or solutions, to a problem. If the number of quantum bits (qubits) and operations reaches even modest levels, carrying out the same task on a state-of-the-art supercomputer becomes intractable on any reasonable timescale, a regime called quantum computational supremacy (ref. 1). However, reaching this regime requires a robust quantum processor, because each additional imperfect operation incessantly chips away at overall performance. It has therefore been questioned whether a sufficiently large quantum computer could ever be controlled in practice. But now, in a paper in Nature, Arute et al. (ref. 2) report quantum supremacy using a 53-qubit processor.


Arute and colleagues chose a task that is related to random-number generation: namely, sampling the output of a pseudorandom quantum circuit. This task is carried out by a sequence of operational cycles, each of which applies operations called gates to every qubit in an n-qubit processor. These operations include randomly selected single-qubit gates and prescribed two-qubit gates. The output is then determined by measuring each qubit. The resulting strings of 0s and 1s are not uniformly distributed over all 2^n possibilities. Instead, they have a preferential, circuit-dependent structure, with some strings being much more likely than others because of quantum entanglement and quantum interference. Repeating the experiment and sampling a sufficiently large number of these solutions results in a distribution of likely outcomes. Simulating this probability distribution on a classical computer using even today's leading algorithms becomes exponentially more challenging as the number of qubits and operational cycles grows.
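
To make this sampling task concrete, here is a minimal sketch of a toy statevector simulator: it applies random single-qubit gates and a fixed two-qubit gate over several cycles, and shows that the output probabilities are far from uniform. The qubit count, gate set and layout are simplified assumptions for illustration, not the circuits actually run on Sycamore.

```python
# Toy sketch of random-circuit sampling. Illustration only: the gate
# set, layout and qubit count are simplified assumptions, not the
# Sycamore circuits.
import numpy as np

rng = np.random.default_rng(0)
n = 5            # toy qubit count (Sycamore used 53)
cycles = 14      # operating cycles, as in the verifiable experiments

def apply_1q(state, gate, q):
    """Apply a 2x2 gate to qubit q of an n-qubit statevector."""
    state = state.reshape([2] * n)
    state = np.tensordot(gate, state, axes=([1], [q]))
    return np.moveaxis(state, 0, q).reshape(-1)

def apply_2q(state, gate, q1, q2):
    """Apply a 4x4 gate to qubits q1 < q2."""
    state = state.reshape([2] * n)
    state = np.tensordot(gate.reshape(2, 2, 2, 2), state,
                         axes=([2, 3], [q1, q2]))
    return np.moveaxis(state, [0, 1], [q1, q2]).reshape(-1)

def random_su2():
    """Haar-ish random single-qubit unitary via QR decomposition."""
    m = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, _ = np.linalg.qr(m)
    return q

cz = np.diag([1, 1, 1, -1]).astype(complex)  # fixed two-qubit gate

state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0
for c in range(cycles):
    for q in range(n):                       # random 1-qubit gates
        state = apply_1q(state, random_su2(), q)
    for q in range(c % 2, n - 1, 2):         # alternating 2-qubit layer
        state = apply_2q(state, cz, q, q + 1)

probs = np.abs(state) ** 2
print("max/min output probability:", probs.max() / probs.min())
# The ratio is far from 1: some bit strings are much likelier than
# others, which is the structure that the sampling task probes.
```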

In their experiment, Arute et al. used a quantum processor dubbed Sycamore. This processor comprises 53 individually controllable qubits, 86 couplers that are used to turn nearest-neighbour two-qubit interactions on or off, and a scheme for measuring all of the qubits simultaneously. In addition, the authors used 277 digital-to-analog converters to control the processor.

When all of the qubits were operated simultaneously, each single-qubit and two-qubit gate had a fidelity of roughly 99 to 99.9%, fidelity being a measure of how similar the actual outcome of an operation is to the ideal outcome. Achieving such fidelities is one of the remarkable technical accomplishments that made this work possible. Arute and colleagues determined the fidelities using a protocol known as cross-entropy benchmarking (XEB). This protocol was introduced last year (ref. 3) and offers certain advantages over other methods for diagnosing systematic and random errors.
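
For the linear version of XEB, the fidelity is estimated as F_XEB = 2^n * mean(P(x_i)) - 1, where the x_i are the measured bit strings and P is the ideal output distribution obtained from a classical simulation of the same circuit. The sketch below illustrates the estimator using a synthetic Porter-Thomas-like distribution standing in for a real circuit's output; it is an illustration under those assumptions, not the authors' analysis code.

```python
# Sketch of the linear XEB estimator:
# F_XEB = 2**n * mean(P_ideal(x_i)) - 1.
# Assumes the ideal output distribution `probs` is available from a
# classical simulation of the circuit.
import numpy as np

def xeb_fidelity(probs, samples):
    """probs: length-2**n array of ideal probabilities.
    samples: sampled bit-string indices (ints in [0, 2**n))."""
    n = int(np.log2(len(probs)))
    return 2 ** n * probs[samples].mean() - 1

rng = np.random.default_rng(1)
n = 10
dim = 2 ** n
# Porter-Thomas-like distribution, typical of random circuits:
probs = rng.exponential(scale=1.0 / dim, size=dim)
probs /= probs.sum()

ideal = rng.choice(dim, size=100_000, p=probs)  # noiseless sampler
uniform = rng.integers(dim, size=100_000)       # pure-noise sampler
print(xeb_fidelity(probs, ideal))    # ~1 for a perfect processor
print(xeb_fidelity(probs, uniform))  # ~0 for uncorrelated noise
```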


The authors' demonstration of quantum supremacy involved sampling solutions from a pseudorandom circuit implemented on Sycamore and then comparing these results with simulations carried out on several powerful classical computers, including the Summit supercomputer at Oak Ridge National Laboratory in Tennessee (see go.nature.com/35zfbuu). Summit is currently the world's leading supercomputer, capable of carrying out about 200 million billion operations per second. It comprises roughly 40,000 processor units, each of which contains billions of transistors (electronic switches), and it has 250 million gigabytes of storage. Approximately 99% of Summit's resources were used for the classical sampling.

Testing quantum supremacy on the sampling problem is challenging, because this is precisely the regime in which classical simulations are infeasible. To address this issue, Arute et al. first carried out experiments in a classically verifiable regime using three different circuits: a full circuit, a patch circuit and an elided circuit (Fig. 1). The full circuit used all n qubits and was the most difficult to simulate. The patch circuit cut the full circuit into two patches, each of which had about n/2 qubits and was individually much easier to simulate. Finally, the elided circuit made a limited number of two-qubit connections between the two patches, resulting in a level of computational difficulty intermediate between those of the full circuit and the patch circuit.

Figure 1 | Three types of quantum circuit. Arute et al. (ref. 2) demonstrate that a quantum processor containing 53 quantum bits (qubits) and 86 couplers (qubit links) can carry out a particular task much faster than a conventional computer can simulate it. Their demonstration is based on three quantum circuits: a full circuit, a patch circuit and an elided circuit. The full circuit contains all 53 qubits and is the hardest to simulate on a conventional computer. The patch circuit cuts the full circuit into two patches, each of which is comparatively easy to simulate. Finally, the elided circuit connects these two patches using a reduced number of two-qubit operations along the reintroduced inter-patch links, and is intermediate between the full and patch circuits in terms of its ease of simulation.
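
A back-of-the-envelope calculation shows why the patch circuit is so much easier to simulate: a brute-force statevector simulation must store 2^n complex amplitudes, whereas two independent patches need only about 2^(n/2) amplitudes each. The sketch below makes this concrete (the paper's actual simulations used more sophisticated methods; this is purely illustrative).

```python
# Memory needed for brute-force statevector simulation: a full
# n-qubit circuit stores 2**n complex amplitudes; two independent
# patches of ~n/2 qubits store only 2**(n//2) + 2**(n - n//2).
# Illustrative only, not the paper's simulation method.
n = 53
bytes_per_amp = 16  # one complex128 amplitude
full = 2 ** n * bytes_per_amp
patch = (2 ** (n // 2) + 2 ** (n - n // 2)) * bytes_per_amp
print(f"full circuit : {full / 1e15:.0f} PB")   # ~144 PB
print(f"patch circuit: {patch / 1e9:.1f} GB")   # ~3.2 GB
```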

The authors selected a simplified set of two-qubit gates and limited the number of cycles to 14 to produce full, patch and elided circuits that could be simulated in a reasonable amount of time. Crucially, the classical simulations for all three circuits yielded consistent XEB fidelities for up to n = 53 qubits, providing evidence that the patch and elided circuits serve as good proxies for the full circuit. The full-circuit simulations also agreed with calculations based solely on the individual fidelities of the single-qubit and two-qubit gates. This finding indicates that the errors remain well described by a simple, localized model, even as the number of qubits and operations increases.
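
The localized error model mentioned above can be stated simply: the predicted circuit fidelity is the product of the individual gate and measurement fidelities. Here is a rough sketch, using placeholder error rates of the order reported rather than the paper's exact values.

```python
# Localized ("digital") error model: predicted circuit fidelity is
# the product of individual gate and measurement fidelities. Error
# rates below are placeholders of the right order of magnitude, not
# the paper's exact values.
n, cycles = 53, 14
e1, e2, em = 0.0016, 0.0062, 0.038  # 1-qubit, 2-qubit, measurement
n_1q = n * cycles                   # one 1-qubit gate per qubit/cycle
n_2q = (n // 2) * cycles            # ~n/2 two-qubit gates per cycle
f_pred = (1 - e1) ** n_1q * (1 - e2) ** n_2q * (1 - em) ** n
print(f"predicted XEB fidelity: {f_pred:.3%}")  # fraction of a percent
```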

The longest directly verifiable run by Arute and colleagues was performed on the full circuit (containing 53 qubits) over 14 cycles. The quantum processor collected one million samples in 200 seconds, reaching an XEB fidelity of 0.8% (with a sensitivity floor of about 0.1% set by the sample statistics). By comparison, carrying out the same sampling task to 0.8% fidelity on a classical computer (containing about one million processor cores) took 130 seconds, and a full-precision (100% fidelity) classical verification took 5 hours. Given the hugely disparate physical resources involved, these results already indicate a clear advantage of the quantum hardware over its classical counterpart.
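
The quoted 0.1% sensitivity floor follows from sample statistics: the standard error of the XEB estimate scales roughly as one over the square root of the number of samples (ignoring prefactors, a simplifying assumption made here).

```python
# Why one million samples give a ~0.1% sensitivity floor: the
# standard error of the XEB estimator scales roughly as
# 1 / sqrt(N_samples). Prefactors are ignored for simplicity.
import math
n_samples = 1_000_000
floor = 1 / math.sqrt(n_samples)
print(f"statistical floor ~ {floor:.1%}")  # ~0.1%
```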

The authors then pushed the circuits into the supremacy regime, which is not directly verifiable. They used a broader set of two-qubit gates to spread entanglement more widely throughout the 53-qubit processor, and they increased the number of cycles from 14 to 20. The full circuit could not be simulated or directly verified in a reasonable amount of time, so Arute et al. simply archived these quantum data for future reference, in case vastly more efficient classical algorithms are one day discovered that would permit verification. Nevertheless, the patch circuit, the elided circuit and the calculated XEB fidelities all agreed. When 53 qubits were run over 20 cycles, the XEB fidelity estimated using these proxies remained greater than 0.1%. Sycamore sampled the solutions in just 200 seconds, whereas classical sampling to 0.1% fidelity would take 10,000 years, and full verification would take several million years.

This demonstration of quantum supremacy over today's leading classical algorithms running on the world's fastest supercomputers is a truly remarkable achievement and a milestone for quantum computing. It provides experimental evidence that quantum computers represent a model of computation that is fundamentally different from that of classical computers (ref. 4). It also counters critiques (refs 5, 6) of the controllability and viability of quantum computation in an extraordinarily large computational space (containing at least the 2^53 states used here).

Much work remains, however, before quantum computers become a practical reality. In particular, algorithms will need to be developed that can be commercialized and that can run on the noisy (error-prone) intermediate-scale quantum processors that will be available in the near term (ref. 1). Researchers will also need to demonstrate robust protocols for quantum error correction that will enable sustained, fault-tolerant operation in the long term.

The demonstration by Arute and colleagues is in many ways reminiscent of the Wright brothers' first flights. Their aircraft, the Wright Flyer, was not the first machine to fly, nor did it solve any pressing transportation problem. It neither heralded the widespread adoption of aircraft nor marked the beginning of the end for other modes of transport. Instead, the event is remembered for having demonstrated a new operational regime: the self-propelled flight of a craft heavier than air. It is what the event represented, rather than what it practically accomplished, that was paramount. And so it is with this first report of quantum computational supremacy.
