QUANTUM FRONTIERS - FUTUROLOGY CHRONICLE No 32


Your Editor of the Futurology Chronicle “Independent and Sponsor free.” MAY 2024 – Edition – 4th Year

CONTENTS

PART 1 - HISTORIC
INTRODUCTION 6
97 YEARS AGO 7-9
TEN PILLARS OF QUANTUM TECHNOLOGY 10-12
QUANTUM COMPUTING FOUNDERS 13-14
QUANTUM COMPUTING TIMELINE 15-17

PART 2 - PATHETIC
ELEGANT, COMPLEX, INEFFICIENT CHANDELIER 19-21
CRUCIAL DILUTION REFRIGERATOR 22-23
QUANTUM ERROR CORRECTION FATALITY 24-25
IS IT NISQ'S FAULT? 26-28
THE CHASE: NISQ TO FTQC AND TO ISQ ALGO 29-30

PART 3 - FRANTIC
Q MACHINE LEARNING 32-33
SPINTRONICS IN QC 34-36
SURREAL NUMBERS AND QC 37-39
TIME CRYSTALS AND QC 40-42
SURFACE CODE AND LOW-DENSITY CODE 43-45

PART 4 - REALISTIC
PHOTONIC QUANTUM COMPUTING 49-50
REALITY CHECK BY SCEPTICS 51-57
EXASCALE SWEET REVENGE UP TO NOW 58-61
FUTURE OBSTACLES FOR QC 62-63
CHINA TECH ASCENDANCY: QC AND AI AMBITIONS 64-65

PART 5 - FUTURISTIC
LATTICE-BASED CRYPTO VS QC 67-69
AI INTEGRATION WITH QC 70-71
CONVERGENCE AI AND QC 72-74
CONCLUSIVE REALITY CHECK 75-76
LAST MINUTE BREAKING NEWS 77-81
SOURCES 82
NEXT EDITION JUNE: CLIMATE TECH 83
SIGNATURE STATEMENT 84

PART 1 - HISTORIC -

INTRODUCTION

The emergence of quantum computing from the foundations of quantum physics has captivated my attention and research endeavors for many years.

The inaugural issue of this chronicle in January 2021 (No 1: Attoseconds quantum computing- now!) was composed with a sense of anticipation for what I considered the pinnacle of imminent technological advances.

Over the past four years, I have monitored every development, leading to the present update, which is delivered with a tone of disillusionment.

Billions have been invested in this field, with thousands of eminent researchers from various nations dedicating extensive hours to achieve results that can best be described as modest and limited in scope.

The term "winter" is now being used by analysts to denote a period during which venture capital, frustrated by the lack of substantial outcomes, has withdrawn its support, leaving only state funding to sustain ongoing efforts.

This "winter period" echoes the 32 years spanning the early 1990s to the breakthrough in November 2022 with the advent of ChatGPT, during which AI's reputation within the broader computing industry was akin to the perception of “homeopathy within mainstream medicine”.

Currently, AI's growth appears unbounded, attracting vast investments to the extent that the financial sector shows little interest in other areas.

Quantum computing, having failed to capitalize on its moment in the limelight, remains marginalized, with many of the field's brightest minds shifting their focus to AI, leaving only the most dedicated scientists in pursuit of the elusive breakthrough.

There is speculation that AI could provide a solution, with some envisioning a potential integration or merger between the two fields, combining extreme computational power with AI's capabilities for universal learning.

Despite being disheartened by the limited progress in quantum computing, I am reminded by the history of other innovations to afford it another opportunity to emerge from this "winter period" through modesty and diligent effort, potentially with AI as an ally, to become a robust and significant technology for humanity's future.

97 YEARS AGO!

The 1927 Solvay Conference (Brussels) on quantum mechanics was a pivotal event that led to key discoveries and discussions in the field. Here are some of the significant outcomes and contributions from the conference (colorized photo)

The key debaters, circled in the photo: top row, E. Schrödinger; middle row, L. de Broglie and N. Bohr; front row, M. Planck and A. Einstein.

The “Copenhagen” Interpretation:

The conference was dominated by disputes about the ideas behind quantum mechanics, particularly focusing on the Copenhagen interpretation.

Physicists like Niels Bohr, Werner Heisenberg, and Max Born presented and defended this interpretation, which postulated that indeterminacy in quantum theory was fundamental and should be accepted by scientists.

Debate with Einstein:

Albert Einstein, a prominent figure at the conference, engaged in intense debates challenging the newly formed quantum theory. Einstein led thought experiments trying to disprove the Heisenberg Uncertainty Principle and quantum mechanics itself.

However, Niels Bohr responded with keen rebuttals, using Einstein's own theory of relativity against him, ultimately winning the argument.


Quantum Mechanics Development:

The discussions at the conference helped to establish quantum mechanics as a leading field of research and accelerated the dissemination of the Copenhagen interpretation among physicists.

This interpretation, emphasizing the probabilistic nature of quantum phenomena, became a prevailing view in quantum mechanics.

The Solvay Conference on Quantum Mechanics can be seen as the foundational bedrock upon which the concept of quantum computing was later built.

At its core, the conference laid down the theoretical groundwork essential for understanding and harnessing quantum mechanics, which underpins quantum computing, as follows.

Establishment of Quantum Theory Principles:

The conference discussions centered on the Copenhagen Interpretation and the principles of quantum mechanics, such as the superposition of states and the probabilistic nature of quantum phenomena.

These principles are fundamental to quantum computing, which relies on quantum bits or qubits that can exist in multiple states simultaneously, unlike classical bits.

Resolution of Quantum Paradoxes:

The intense debates, especially between Einstein and Bohr, helped clarify the conceptual underpinnings of quantum mechanics.

By addressing and resolving these paradoxes, the conference contributed to a deeper understanding of quantum behavior, which was crucial for the development of quantum algorithms and error correction methods in quantum computing.

Acceleration of Quantum Research:

By solidifying the Copenhagen Interpretation as the prevailing view, the Solvay Conference accelerated research in quantum mechanics.

This laid the groundwork for future discoveries in quantum entanglement, coherence, and decoherence, all key concepts in the operation of quantum computers.

Cross-Disciplinary Collaboration: The conference demonstrated the value of collaboration among physicists from various backgrounds.


This collaborative spirit is mirrored in the multidisciplinary approach required for quantum computing, which combines quantum physics, computer science, mathematics, and engineering.

In essence, the Solvay Conference not only advanced the understanding of quantum mechanics but also set the stage for the future technological applications of quantum computing.

By establishing the theoretical foundation and encouraging a culture of rigorous debate and collaboration, the conference indirectly paved the way for the exploration of quantum computing as a powerful tool for solving complex problems beyond the reach of classical computers.

It is widely recognized that Albert Einstein grew weary of the debate and feigned a late concession to his colleagues. Seeking a break from the monotony, he famously retreated to the serene shores of Lake Geneva to revel in the calming breeze. (colorized photograph)

…………

THE 10 PILLARS OF QUANTUM TECHNOLOGY

Quantum Mechanics, a cornerstone of modern physics, unveils the bewildering and intricate behaviors of particles at the atomic and subatomic levels.

This foundational theory emerged from the 1927 Solvay Conference to revolutionize our understanding of the natural world, challenging conventional notions of space, time, and matter.

By introducing fundamental principles such as wave-particle duality, superposition, entanglement, and the uncertainty principle, Quantum Mechanics set the stage for a profound paradigm shift in science and technology.

From this bedrock of quantum theory, ten specialized quantum domains have progressively emerged, each extending the principles of Quantum Mechanics to explore new horizons and solve complex problems across various fields:

1. Computing:

Quantum computing utilizes the principles of quantum mechanics to process information at speeds unattainable by classical computers, especially for certain types of problems.

2. Physics:

Quantum physics underpins all quantum technologies, providing the theoretical foundation for understanding and exploiting phenomena like superposition and entanglement.

3. Super chemistry:

Often related to the use of quantum mechanics in chemistry, this might involve quantum simulations for understanding chemical reactions at a quantum level, potentially leading to new materials and drugs.


4. Cryptography:

Quantum cryptography leverages quantum mechanical properties to create secure communication channels that are theoretically immune to eavesdropping.

5. Communication:

Quantum communication uses quantum states to transmit information, promising ultra-secure networks by implementing quantum key distribution (QKD).

6. Sensing and Metrology:

Quantum sensing and metrology use quantum states or phenomena to measure physical quantities with unprecedented precision, improving navigation, timing, and imaging technologies.

7. Simulation:

Quantum simulation employs quantum computers to simulate complex quantum systems, offering insights into materials science, pharmacology, and physics that are infeasible with classical computing.

8. Materials:

The study and development of quantum materials explore substances with distinct quantum mechanical properties, such as topological insulators and superconductors, for use in technology.

9. Spintronics: Spintronics involves the study of the intrinsic spin of the electron and its associated magnetic moment, in addition to its fundamental electronic charge, in solid-state devices.

10. Quantum Algorithms and Software:

Beyond the hardware and theoretical physics, the development of quantum algorithms and software is crucial for realizing the potential of quantum computing.


This includes the creation of algorithms that can run on quantum processors to solve specific problems more efficiently than classical algorithms, as well as the software infrastructure needed to program, control, and interface with quantum computers.

…………………….

These domains collectively represent the forefront of quantum research and application. As the field continues to evolve, it's possible that new specialized domains will emerge, further expanding the scope of quantum technology's impact.


QUANTUM COMPUTING FOUNDERS

Quantum computing emerged as a distinct concept in the early 1980s, its foundation laid by multiple researchers whose work transcended disciplinary boundaries.

It is challenging to attribute the inception of quantum computing to a singular individual, given its interdisciplinary roots. However, several key figures and milestones significantly contributed to the early development of the field:

Paul Benioff is often recognized for presenting the first quantum mechanical model of a computer.

His 1980 publication delineated a quantum mechanical model of Turing machines, positing that computers could operate under the principles of quantum mechanics.

Richard Feynman, a renowned physicist, stands out as a pivotal figure in quantum computing's nascent stages.

During a 1981 conference on physics and computation at MIT, Feynman highlighted the inefficiency of simulating quantum systems with classical computers, due to the exponential resources required.

He proposed that a quantum mechanics-based computer would be more apt for such simulations.

His 1982 paper further elaborated on the idea, explicitly suggesting the development of quantum computers to simulate quantum systems.

David Deutsch (Portrait next page) at the University of Oxford extended Feynman's propositions by introducing the quantum Turing machine, thereby providing a theoretical framework for quantum computing.

Deutsch's work articulated the concept of the universal quantum computer, demonstrating that quantum computers could theoretically execute any computation achievable by classical computers, with the added potential for significantly faster processing of certain tasks.


The collective endeavors of Benioff, Feynman, and Deutsch during the early 1980s marked the inception of quantum computing as a field of research. Their pioneering contributions underscored the efficiency of quantum computers in solving specific problems—particularly in simulating quantum physical processes—more effectively than their classical counterparts, catalyzing the field's evolution in the years that followed.


QUANTUM COMPUTING TIMELINE

1980 Paul Benioff of the Argonne National Laboratory publishes a paper describing a quantum mechanical model of a Turing machine or a classical computer, the first to demonstrate the possibility of quantum computing.

1981 In a keynote speech titled Simulating Physics with Computers, Richard Feynman of the California Institute of Technology argues that a quantum computer has the potential to simulate physical phenomena that a classical computer cannot.

1985 David Deutsch of the University of Oxford formulates a description for a quantum Turing machine.

1992 The Deutsch–Jozsa algorithm is one of the first examples of a quantum algorithm that is exponentially faster than any possible deterministic classical algorithm

1993 The first paper describing the idea of quantum teleportation is published

1994 Peter Shor of Bell Laboratories develops a quantum algorithm for factoring integers that has the potential to decrypt RSA-encrypted communications, a widely-used method for securing data transmissions

1996 Lov Grover of Bell Laboratories invents the quantum database search algorithm.

1998 First demonstration of quantum error correction; first proof that a certain subclass of quantum computations can be efficiently emulated with classical computers

1999 Yasunobu Nakamura of the University of Tokyo and Jaw-Shen Tsai of Tokyo University of Science demonstrate that a superconducting circuit can be used as a qubit.

2002 The first version of the Quantum Computation Roadmap, a living document involving key quantum computing researchers, is published.

2004 First five-photon entanglement demonstrated by Jian-Wei Pan's group at the University of Science and Technology in China

2011 The first commercially available quantum computer is offered by D-Wave Systems.


2012 1QB Information Technologies (1QBit), the first dedicated quantum computing software company, is founded.

2014 Physicists at the Kavli Institute of Nanoscience at the Delft University of Technology, The Netherlands, teleport information between two quantum bits separated by about 10 feet with zero percent error rate

2017 Chinese researchers report the first quantum teleportation of independent single-photon qubits from a ground observatory to a low Earth orbit satellite with a distance of up to 1400 km

2018 The National Quantum Initiative Act is signed into law by President Donald Trump, establishing the goals and priorities for a 10-year plan to accelerate the development of quantum information science and technology applications in the United States

2019 Google claims to have reached quantum supremacy by performing a series of operations in 200 seconds that would take a supercomputer about 10,000 years to complete;

IBM responds by suggesting it could take 2.5 days instead of 10,000 years, highlighting techniques a supercomputer may use to maximize computing speed.

2020 Notable advancements in quantum computing included Honeywell's breakthrough trapped-ion system, supported by its innovative quantum charge-coupled device (QCCD) architecture; Google's AI Quantum team conducting a groundbreaking chemical simulation on a quantum computer; and Intel introducing Horse Ridge II, a cryogenic control chip designed to enhance the scalability of quantum computers.

2021 The most striking advancements came from a team including researchers from the University of Texas at San Antonio, who achieved high-fidelity, laser-free universal control of trapped-ion qubits.

2022 IBM unveiled an ambitious new roadmap to practical quantum computing, announcing plans to develop a system with over 4,000 qubits. Orchestrated by intelligent software, these new modular and networked processors are intended to tap into the strengths of both quantum and classical computing to reach near-term Quantum Advantage.

2023 In July, Google announced that their Sycamore quantum processor, equipped with 53 superconducting qubits, demonstrated quantum supremacy by performing a complex task of generating pseudo-random numbers in just 200 seconds. This task would take existing supercomputers approximately 10,000 years, showcasing a significant quantum advantage.

In August, IBM introduced the Osprey processor with 433 superconducting qubits, setting a new record for quantum chip qubit count. Though qubit quantity doesn't solely define computing power, IBM highlighted its achievement of sub-0.1% two-qubit error rates, enhancing reliability with quantum error correction and underscoring the importance of balancing qubit expansion with noise reduction for quantum computing scalability.

In October, the University of Chicago successfully transmitted entangled particles over 52 miles using underground fiber optic cables in the Chicago suburbs, a significant step toward building quantum networks by extending quantum connections through precise particle control.

In November, Chinese scientists achieved the first intercontinental quantum-encrypted video call, showcasing quantum cryptography's potential by using unhackable keys encoded in photons for secure transmission between cities in Asia.

The same month, IonQ's trapped-ion quantum computer achieved a two-qubit operation fidelity of 99.9%, setting a new low for gate error rates and marking a critical step towards practical quantum computing by enabling effective error correction.

The challenge of "error correction" remains a significant hurdle for quantum computing researchers, symbolizing a persistent obstacle in the field. When breakthroughs occur, they are celebrated momentarily before the complex reality of quantum computing demands a return to further innovation.

Despite decades of research since the 1980s and considerable global investment, quantum computing has yet to achieve robust operational status, reflecting the cautious optimism that pervades the sector.

………………….

PART 2 - PATHETIC -


ELEGANT, COMPLEX, INEFFICIENT CHANDELIER

A quantum computer “chandelier” model is an intricately built, avant-garde construction that houses the quantum processor chip at its center. For the qubits to operate properly, the device must be kept at temperatures as low as possible, near absolute zero, thanks to its chandelier-like configuration.

To handle microwave signals, regulate qubits, and guarantee coherence for computer operations, the architecture calls for several tiers of cooling systems and complex engineering.


In addition to being aesthetically pleasing, the chandelier model is essential to the quantum computing process because it creates the right atmosphere for quantum activities to occur.

Quantum computing fundamentally differs from traditional computing in its use of qubits rather than bits. Qubits, which enable complicated calculations, exist in a superposition of states until measured, in contrast to bits, which are binary in traditional computing and can only be either 0 or 1.

This special characteristic allows quantum computers to handle exponentially more data.

To generate qubits, these machines rely on the quantum state of physical objects, such as electron spin or photon polarization, whose qualities remain undefined before measurement.
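To make superposition and measurement concrete, here is a minimal sketch in plain Python with NumPy (not tied to any particular quantum SDK): a single qubit is represented as a two-component state vector, placed into an equal superposition with a Hadamard gate, and then sampled many times.

```python
import numpy as np

# A qubit is a normalized 2-component complex vector: amplitudes for |0> and |1>.
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate creates an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0  # amplitudes (1/sqrt(2), 1/sqrt(2))

# Measurement probabilities are the squared magnitudes of the amplitudes (Born rule).
probs = np.abs(state) ** 2

# Sampling many measurements shows roughly 50/50 outcomes; each single
# measurement collapses the superposition to a definite 0 or 1.
rng = np.random.default_rng(seed=7)
samples = rng.choice([0, 1], size=10_000, p=probs)
print("P(0) =", probs[0], " P(1) =", probs[1])
print("observed frequencies:", np.bincount(samples) / len(samples))
```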


To effectively handle difficult issues like chemical simulations or encryption, quantum algorithms make use of these superpositions and entanglements.

Complex engineering is required to keep qubits at very low temperatures, almost at absolute zero, in the physical architecture of quantum computers, such as IBM's System One.

Layers of cooling techniques are used in the chandelier-like design to maintain coherence for computing activities while housing the quantum processor chip.

Logic operations are performed on the qubits by gates and circuits, which are managed by microwave signals. Quantum computers need error correction techniques in order to handle frequent faults and increase dependability.

Though it currently confronts difficulties with error correction and decoherence, quantum computing has enormous promise to solve complicated issues.


CRUCIAL DILUTION REFRIGERATOR

Dilution refrigerators are essential for chilling quantum computers to extremely low temperatures in the field of quantum computing. The preservation of the quantum bits, or qubits, in a superposition state and the reduction of decoherence depend on these refrigerators.

Recent developments include IBM's Goldeneye cryogenic system and a comparable large-scale unit built at Fermilab, considered among the largest dilution refrigerators in the world. These developments emphasize how crucial cryogenic cooling is to the proper functioning of quantum computers.

To keep qubits stable and coherent during sophisticated quantum computations, dilution refrigerators are an essential part of quantum computing installations.

By exploiting the cooling effect produced when two helium isotopes, helium-3 and helium-4, are mixed, a dilution refrigerator can reach extremely low temperatures. There are two cooling stages in this process: 300 K to 3 K, and 3 K to 20 mK.

To reach 3K, the first cooling step uses pulse tube cooling, which is comparable to evaporation cooling and uses nitrogen first, then helium.

Then, depending on the amount of helium-3 present, a two-phase combination of diluted and concentrated helium-3 and helium-4 is used.

The temperature drops to 20 mK when helium-3 undergoes a phase transition that removes heat. Using a compressor and a device known as a still in the cryostat, helium-3 is recycled inside the system while maintaining constant cooling.

Even with recycling, there may be a loss factor when it comes to the gradual depletion of helium atoms over time. For quantum computing applications, the recycling process seeks to reduce this loss and preserve the dilution refrigerator's efficiency.

China's news agency reports that the nation has started producing large quantities of the "EZ-Q Fridge," a dilution refrigerator required for superconducting quantum computing chips.


Considering continuing technological restrictions from abroad, this development could constitute a breakthrough in equipment essential to quantum computing.

China has successfully produced dilution refrigerators in large quantities for the first time; these are essential for providing superconducting quantum computers with operating temperatures near absolute zero.

A defiant optimism to keep the limitations in check, as usual!

Dilution refrigerator insertion into “chandelier” IBM Quantum computer

QUANTUM ERROR CORRECTION FATALITY!

A central, potentially fatal challenge in quantum computing is the need for quantum error correction, which guards against errors in quantum data brought on by quantum noise and decoherence. The easiest way to put this into plain English is to break some difficult concepts into simpler ones.

Quantum Fragility Challenge

Imagine writing a secret message on a piece of paper so thin that a slight breeze could smear the ink and render the message unintelligible. Qubits are similar to that extremely thin piece of paper in the context of quantum computing. Quantum bits, the fundamental building blocks of quantum information, are highly environment-sensitive. Their condition can be altered by minuscule vibrations, electromagnetic waves, or even heat. Analogously to smearing your secret message, this sensitivity can produce inaccuracies in the calculations.

The Fundamentals of error correction

In classical computing, such as on your computer or smartphone, additional data is added to detect and fix faults. For instance, to guarantee that a text message reaches its destination accurately even in the event of data corruption, your phone adds extra information when you send it.

We want to accomplish something similar in quantum computing: fix any faults without disturbing the quantum state of the qubits. It's complicated because, according to the principles of quantum measurement, directly testing a qubit to look for faults would destroy its quantum features. It is akin to attempting to determine whether a soap bubble contains a flaw without popping it.

How does this work?

A smart technique is used in quantum error correction. One qubit's information is distributed across multiple other qubits rather than being checked individually. Encoding is the term for this procedure. Even without direct quantum state measurements for each individual qubit, the system can nonetheless identify and fix any errors that impact one of these qubits.


Say you have a family recipe that is really special to you and you don't want to lose it or make a mistake. You write the recipe and ingredients on multiple cards rather than just one. The information is safeguarded and dispersed among the other cards, so even if one is accidentally stained with coffee, you can still recover the entire recipe.
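To make the redundancy idea concrete, here is a toy sketch (the three-qubit bit-flip repetition code, not any production error-correction scheme): one logical value is spread across three physical bits, two parity checks locate a flip without ever revealing the encoded value, and a decoder repairs any single error.

```python
import random

def encode(logical_bit):
    # Spread one logical bit across three physical bits (bit-flip repetition code).
    return [logical_bit] * 3

def noisy_channel(bits, p):
    # Each physical bit is flipped independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

def syndrome(bits):
    # Parity checks (analogous to measuring Z1Z2 and Z2Z3): they reveal where a
    # flip happened without revealing the encoded logical value itself.
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    s = syndrome(bits)
    flip_position = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(s)  # (0, 0) means no error seen
    if flip_position is not None:
        bits[flip_position] ^= 1
    return bits

def logical_error_rate(p, trials=100_000):
    errors = 0
    for _ in range(trials):
        received = correct(noisy_channel(encode(0), p))
        errors += received != [0, 0, 0]  # decoding fails only if two or three bits flipped
    return errors / trials

if __name__ == "__main__":
    random.seed(0)
    for p in (0.01, 0.05, 0.10):
        print(f"physical error {p:.2f} -> logical error {logical_error_rate(p):.4f}")
```

In a real quantum code the parity checks are measured with extra ancilla qubits so the data qubits are never observed directly, but the counting logic above captures why redundancy helps: the logical error rate falls to roughly 3p² instead of p.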

The scaling challenge

While quantum error correction appears to be the ideal solution, it depends on numerous additional qubits for redundant information encoding as well as processes for error repair that must occur continuously. Strong error correction requires hundreds or even thousands of qubits, which are difficult to create and maintain now.

Much effort is being put into developing more effective quantum error correction methods as well as ways to reduce the sensitivity of qubits to their surroundings.

The aim is to develop a quantum computer that can do intricate computations over extended periods of time without errors affecting the outcomes.

Today, addressing this “fatality” will mark a pivotal step toward ensuring the reliability and viability of quantum computing. This progression is crucial to perform computations that far surpass the capabilities of today's most advanced classical computers.

……………..

IS IT NISQ'S FAULT?

The term "noisy intermediate scale quantum" (NISQ) computing was first used by John Preskill in 2018. He pointed out that the number of logical or even physical qubits in a quantum computer system limits its size and that quantum computers today, even in 2023, are prone to significant error rates.

This basically indicates that they can't be trusted to do generic calculations.

Even though universities and other academic institutions have been funding this stage of development for several decades, at this moment a quantum computer is usually no more efficient at solving problems than a classical computer.

Some industry analysts anticipate a "quantum winter" because of this basic fallibility. Some industry insiders predict that the market will emerge from the NISQ period in the coming years, while others think the engineering hurdles to overcome the NISQ era would keep the sector locked for decades.

Is the NISQ era upon us?

Indeed, we are living in the NISQ era, which is characterised by high mistake rates and a dearth of useable qubits in quantum devices.

Even though the industry has progressed beyond merely functioning in laboratories, faults still occur in commercially used quantum computers; identifying and fixing these errors remains a major problem.

But as physical qubit counts rise and our knowledge of quantum error correction (QEC) deepens, experts predict a transition from the NISQ era to one characterised by more dependable quantum hardware and software.

Late last year (in a preprint) and early this year (in Nature), a Google team showed that quantum error correction works as well as theory predicts. Even in the NISQ age, there are still a lot of problems we need to solve, such as improving qubit performance, cutting noise, and creating far more effective error correction techniques.

We are expected to continue making progress this decade, leaving behind the NISQ era and arriving, in the post-NISQ period, at a completely error-corrected system with thousands to several million qubits.


How We View NISQ Computing

Condensed matter physicist and consultant to The Quantum Insider Chris Coleman claims that the NISQ era is quantum computing's equivalent of the first wave of artificial intelligence.

There is no denying that the field is advancing steadily, even though obstacles still need to be overcome. In many cases, we are witnessing the groundwork for greater things to come. This is seen throughout the ecosystem.

Coleman went on to note that innovations in supporting systems like cryogenics, optics, control, and readout, as well as on the quantum level, are what are driving performance improvements. He attributes this to extensive study and collaboration between academia and business.

This demonstrates how eager the supply chain is to adjust to the shifting needs of the industry.

Coleman stated, "Developing dynamic compilation strategies to maximise processing power from existing devices and make quantum systems more user-friendly is another characteristic of NISQ."

Major NISQ themes include error correction and benchmarking, where we find interdisciplinary research in quantum theory and hardware to help us move past this noisy, mistake-prone stage.

There are already a number of approaches being investigated to provide fault tolerance, even if error correction can be one of the main obstacles to present systems.

He went on to say that accessibility is another significant feature of the NISQ era, made possible by the astounding number of systems that have gone online and are accessible via the cloud, providing a wide variety of systems for investigation.

This accessibility is supporting early adopters and the hunt for near-term applications in addition to driving basic research.

Examining NISQ computing from a distance reveals a multifaceted discipline with many challenging issues that remain unsolved, but also a committed community of researchers and business people collaborating to address these issues.


As new quantum devices with more qubits and better coherence times are developed, we anticipate that the field of quantum computing will continue to grow quickly in the coming years.

These devices will allow researchers to perform increasingly complex quantum computations and possibly even show that certain problems can benefit from a quantum advantage over classical algorithms.

Nevertheless, NISQ-era machines are unlikely to be able to tackle complex issues due to their limitations, which include their vulnerability to noise and errors.

This indicates that, in the long run, error-corrected quantum computers that can tackle far more complicated and large-scale issues than the NISQ devices of today will probably be developed as part of the future of quantum computing.


THE CHASE: NISQ to FTQC and to ISQ ALGO!

Noisy Intermediate-Scale Quantum (NISQ) algorithms and Fault-Tolerant Quantum Computing (FTQC) algorithms represent two distinct approaches in quantum computing, now joined by the arrival of intermediate-scale quantum (ISQ) algorithms.

NISQ Algorithms:

NISQ algorithms are designed for quantum processors in the NISQ era, characterized by quantum processors with up to 1000 qubits that lack fault-tolerance capabilities or the size to achieve quantum advantage[3].

Characteristics: NISQ devices are sensitive to noise, prone to quantum decoherence, and lack continuous quantum error correction[3].

Applications: NISQ algorithms like Variational Quantum Eigensolver (VQE) and Quantum Approximate Optimization Algorithm (QAOA) leverage NISQ devices but offload some calculations to classical processors, showing success in quantum chemistry and potential applications across various fields[3].
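To illustrate the hybrid quantum-classical pattern that VQE-type NISQ algorithms follow, here is a minimal, self-contained sketch in plain NumPy (a toy single-qubit problem, not any vendor's API): a parameterized trial state Ry(θ)|0⟩ is prepared, the energy ⟨ψ(θ)|H|ψ(θ)⟩ is evaluated, and a classical optimizer updates θ. On real hardware the expectation value would be estimated from repeated measurements; here it is computed exactly.

```python
import numpy as np

# Toy Hamiltonian for one qubit: H = Z (ground-state energy -1, ground state |1>).
H = np.array([[1.0, 0.0],
              [0.0, -1.0]])

def ansatz(theta):
    # Parameterized trial state |psi(theta)> = Ry(theta)|0>.
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    # Expectation value <psi|H|psi>; a quantum processor would estimate this by sampling.
    psi = ansatz(theta)
    return float(psi @ H @ psi)

# Classical outer loop: simple gradient descent on the circuit parameter.
theta, lr = 0.3, 0.4
for step in range(200):
    grad = (energy(theta + 1e-4) - energy(theta - 1e-4)) / 2e-4  # finite difference
    theta -= lr * grad

print(f"optimized theta = {theta:.3f}, energy = {energy(theta):.4f} (exact minimum: -1)")
```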

FTQC Algorithms:

FTQC algorithms are part of a long-term approach that assumes the availability of a large number of qubits for fault-tolerant quantum computing[2].

Feasibility*: FTQC algorithms become feasible under the assumption of having a significant number of qubits available, unlike the constraints of NISQ devices[2].

Quantum Error Correction: FTQC algorithms operate in a fault-tolerant environment where continuous quantum error correction is possible, enabling more complex computations with higher accuracy[2].

In summary, NISQ algorithms cater to the current limitations of noisy intermediate-scale quantum devices, focusing on practical applications within these constraints. On the other hand, FTQC algorithms look towards a future where fault-tolerant quantum computing with a larger number of qubits allows for more sophisticated and accurate computations.


ISQ Algorithms:

The ISQ era is currently being developed by most research groups, with a key milestone being the demonstration of a fully programmable logical qubit with extremely low error rates.

This achievement would enable the emergence of devices with a few error-corrected logical qubits. ISQ devices are still intermediate-scale, but noise is less of a problem due to the benefits of error correction, allowing for quantum algorithms with much longer circuit depths.

The motivation for embracing the ISQ era is to optimize quantum algorithms so they can run with lower error-correction requirements. This means performing a careful balancing act to achieve objectives such as reducing the number of non-Clifford operations while respecting the constraints of ISQ devices. The algorithms suitable for the ISQ era are different from NISQ (since we have access to more powerful devices) and full fault tolerance (since they still have restrictions on qubit number and circuit depth).

Let us now honor those brave scientists who have, in a truly magnificent show of defiance, turned their backs on the enticing advances of the AI rocket ship to toil in the abrasive trenches of quantum computing—standing firm to the very end. These trailblazers are bravely battling the never-ending agony of noise errors in their dreams while never giving up on finding a solution to this nagging headache.

Still, we must give them our highest compliments. It's not as though they can simply give up. No, they are embarking on a heroic journey to traverse the research "winter," equipped only with their cunning and maybe a hint of caffeine-induced optimism. So let's give them our support; they will undoubtedly require it to overcome this chilly season and come out on top.

……………

PART 3 - FRANTIC -


Q MACHINE LEARNING

Groups such as Quantinuum are presently engaged in the development of adaptable and efficient algorithms for Quantum Machine Learning in order to optimise the capabilities of existing quantum computing technology.

This work entails foundational research that examines the nexus between quantum computing and machine learning in order to create novel methods for data processing and optimisation.

Data analysis, optimisation, computational chemistry, and other fields stand to benefit greatly from the synergy between quantum computing and machine learning, which is why quantum machine learning is a promising field.

Applications in a variety of industries could be revolutionised by quantum machine learning (QML), including:

Creation of Quantum Algorithms: QML can assist in the creation of quantum algorithms, allowing for the more precise and effective resolution of challenging issues[2].

The development of quantum error correction algorithms, which guarantee the dependability and stability of quantum computing systems, can be aided by QML.

Comprehending Quantum Systems: By offering insights into their behaviour and their uses, QML can aid in the comprehension of quantum systems.

Using quantum computing capabilities, machine learning algorithms can improve upon existing machine learning techniques, including reservoir computing, autoencoders, Boltzmann machines, parameterized circuits, and support vector machines (SVM).

QML can enhance data analysis and optimisation procedures by processing enormous volumes of data at previously unheard-of rates, producing predictions and insights that are more precise.

These applications show off QML's revolutionary potential, showcasing how it may harness the special powers of quantum computing to produce groundbreaking breakthroughs in a variety of sectors.


There may be a few advantages to utilising quantum machine learning over classical machine learning.

Speed: When compared to classical methods, quantum machine learning algorithms can offer a quadratic speedup for sampling and unstructured search problems, resulting in faster optimisation and decision-making processes (see the sketch after this list).

Efficiency: The special powers of quantum computing, such as entanglement and superposition, enable more effective data processing and optimisation, which raises the general efficiency of machine learning algorithms.

Ideal Solutions: Using quantum gradient descent techniques, in particular, quantum machine learning can identify ideal solutions faster than classical techniques, enhancing the precision and potency of artificial intelligence models.

Enhanced Information Processing: There are benefits to using quantum bits over classical bits for information processing, which allows for more complex information analysis and processing in machine learning applications.

Compared with typical classical methodologies, these advantages demonstrate how Quantum Machine Learning can greatly improve the speed, efficiency, and accuracy of machine learning algorithms, opening the door for more sophisticated applications in a variety of fields.
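The quadratic speedup mentioned in the "Speed" point above can be made concrete with a back-of-envelope comparison (a sketch only, assuming the textbook query counts: about N/2 classical guesses on average versus roughly (π/4)·√N Grover iterations for an unstructured search over N items).

```python
import math

def classical_queries(n_items):
    # Unstructured search: on average, half the items must be checked.
    return n_items / 2

def grover_queries(n_items):
    # Grover's algorithm needs about (pi / 4) * sqrt(N) oracle calls.
    return (math.pi / 4) * math.sqrt(n_items)

for exponent in (10, 20, 30, 40):
    n = 2 ** exponent
    print(f"N = 2^{exponent:>2}: classical ~{classical_queries(n):.3e} queries, "
          f"Grover ~{grover_queries(n):.3e} queries")
```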

Pure brains, unswayed by extremely high salaries, remain deeply involved in QML research, even though many of the brightest minds have flocked to the AI players, as was indicated in the preamble. One day, let's hope for some sweet retaliation!


SPINTRONICS IN QC

Spintronics uses electron spin to store and process data in quantum computing. While electron charge carries information in classical electronics, spintronics uses electron spin to control current flow.

The following are the main ideas behind spintronics in QC:

Spin-Polarized Transport: Spintronics devices produce spin-polarized currents, which manipulate the conduction electrons' spin orientation to transfer information over a wire.

Applications such as magnetic sensors, memory storage, and maybe quantum computing depend on this spin-dependent transport.

Active Control of Spin Dynamics: By actively regulating electron spin dynamics, spintronics seeks to advance beyond passive spin devices.

Through the active manipulation of spin dynamics, new technologies appropriate for quantum information processing and computation, such as spin transistors, spin filters, and new memory devices, may be developed.

Spin-Valley Coupling: By overcoming the drawbacks of more conventional techniques like electron spin resonance (ESR), a novel approach of regulating electron spin in silicon quantum dots has been created.

This technique offers a new avenue for the use of silicon quantum dots in quantum computing by utilising "spin-valley coupling," which is the process of manipulating the spin and valley states of electrons using a voltage pulse instead of fluctuating magnetic fields.

To sum up, spintronics in quantum computing makes use of the special qualities of electron spin to actively manipulate electron spins for information processing and storage, hence enabling quicker, more energy-efficient, and possibly more powerful computing systems.


Recent advances in spintronics for quantum computing have shown promise for improving the use of spin in quantum technology. The following are some major developments:

Molecular Spintronics: A novel technology termed "molecular spintronics" has been put forth, providing hope for quantum computing based on electron spins in minuscule semiconductor particles.

By utilising the special qualities of molecular structures, this technology has the potential to completely alter the field of quantum computing[1].

Predictive Modelling with Semiconductors: The use of spin on semiconductors for quantum technologies has advanced significantly as a result of predictive modelling.


In particular, halide perovskites have changed the way that semiconductors are thought about, opening up new avenues for quantum technology-related developments in computing, sensing, and communication[2].

Spin-Based Quantum Phenomena: In order to develop new tools and technologies, spintronics is presently investigating spin-based quantum phenomena. One of spintronics' lofty objectives is the creation of spin-based quantum computers in solid-state architectures, which seek to enhance and add new features to current electronic systems. The future of quantum computing and information processing depends on these developments[4].

2D Materials and Quantum Future: Spintronics in conjunction with 2D materials is influencing quantum technologies in the future. Spintronics is opening the door for novel applications that utilise quantum processes to boost computing power by taking advantage of the inherent quantum characteristics of particles like electrons[5].

All things considered, these new discoveries demonstrate the continued advancements in spintronics for quantum computing, offering creative ways to take advantage of electron spins and expand the realm of quantum information processing and computation.


SURREAL NUMBERS AND QC

Surreal numbers, introduced by John Horton Conway, indeed form a fascinating and vast number system capable of describing not only real numbers but also infinities and infinitesimals.

This extensive number system can be thought of as a superstructure that includes all the conventional numbers we're familiar with, plus much more, structured in a way that allows for the representation of a wide range of mathematical concepts.

Creation narrative: Starting with 0 and moving on to 1 and -1, Knuth's book walks readers through the process of creating surreal numbers (see the sketch below). Further numbers, such as dyadic fractions, emerge as the days of this creation story go by.

But day ω, an idea taken from set theory that denotes a countable infinity, is where the true magic happens. This illustrates the power of Conway's conceptual framework, since all real numbers, including irrational numbers and non-dyadic fractions, suddenly materialise.
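The finite days of this creation story are easy to sketch in code (an informal illustration only; surreal numbers are properly defined by left/right sets, but the values born on each finite day are exactly the integers and dyadic fractions generated below).

```python
from fractions import Fraction

def numbers_born_by_day(days):
    """Values of the surreal numbers created on each finite day.

    Day 0 creates 0; each later day creates one number beyond each end of the
    line and one number between each pair of neighbours (their midpoint).
    """
    born = [Fraction(0)]
    history = [list(born)]
    for _ in range(days):
        new = [born[0] - 1]                  # one step below the current minimum
        for left, right in zip(born, born[1:]):
            new.append((left + right) / 2)   # midpoint between adjacent numbers
        new.append(born[-1] + 1)             # one step above the current maximum
        born = sorted(set(born) | set(new))
        history.append(new)
    return history

for day, created in enumerate(numbers_born_by_day(3)):
    print(f"day {day}: {[str(x) for x in created]}")
```

Running it reproduces the familiar picture: day 1 brings -1 and 1, day 2 brings -2, -1/2, 1/2 and 2, and so on with ever finer dyadic fractions.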

Beyond Infinity: The study of surreal numbers continues on day ω, exploring domains beyond the purview of conventional mathematics.

New numbers such as ω + 1 and ω - 1 take us beyond infinity. Infinitesimals are numbers so small that they defy our standard conception of size and scale, adding another layer of complexity to the story.

A Universe of numbers: Because of their immense size, surreal numbers constitute a class rather than merely a set. They extend beyond that and include every known kind of number, including real, rational, and natural ones. Giving a fresh perspective on the mathematical universe, they characterise infinities and infinitesimals.

A different kind of Number Line : There are lots of holes in the surreal number line as opposed to the real number line, which is tightly packed and has no gaps. Our conventional notions of continuity and proximity are challenged by these holes, which stand for the infinitesimals that lie between surreal numbers.

Novel Basis for Analysis : One must reconsider mathematical analysis in light of the peculiarities of surreal numbers. Under "nonstandard analysis," a discipline that takes into account the quirks of surreal numbers, traditional notions of derivatives and integrals dissolve.


Quantum computing (QC), on the other hand, is a field of computing focused on the development of computer technology based on the principles of quantum theory, which explains the behavior of energy and material on the quantum (atomic and subatomic) level.

Quantum computers use quantum bits or qubits, which can represent and store information in a way that is vastly different from classical bits. The relationship between surreal numbers and quantum computing isn't direct in the sense that quantum computing doesn't rely on the concept of surreal numbers to function.

Quantum computing is rooted in quantum mechanics, leveraging phenomena such as superposition, entanglement, and quantum interference to perform computations. However, surreal numbers contribute to the broader mathematical and theoretical landscape in which concepts like quantum computing are developed. Surreal numbers offer new perspectives and tools for understanding concepts of continuity, order, and magnitude, which can be fundamental in theoretical physics and mathematics.

While they don't directly impact quantum computing technology, the exploration of surreal numbers can inspire novel ways of thinking about mathematical problems and potentially lead to new insights in fields indirectly related to quantum computing, such as theoretical physics, mathematical logic, and the foundations of mathematics.

In summary, while there isn't a direct relationship where quantum computing uses surreal numbers in its operations or theory, both belong to the rich tapestry of mathematical and physical sciences, exploring and expanding our understanding of the universe from different angles.


TIME CRYSTALS AND QC

Nobel laureate Frank Wilczek initially presented the groundbreaking idea of time crystals in quantum physics in 2012. These structures display "time-translation symmetry breaking," which is the oscillation between states without the use of energy.

Time crystal research and development could have a big impact on quantum computing by improving our knowledge of quantum systems and possibly paving the way for more reliable quantum computers.

Quantum Computing and Time Crystals: Using quantum computers, like Google's Sycamore system, researchers have successfully produced time crystals, illustrating the potential of quantum devices to realise new phases of matter. By assisting in the regulation of quantum systems and possibly enhancing the resilience of quantum computers against mistakes, the development of time crystals could enhance quantum computing.

Potential Future Implications - Although the development of time crystals is an important scientific accomplishment, there may be future implications for memory in quantum computers as they continue to be investigated in practical applications.

Time crystals are an intriguing contender for future developments in quantum computing, and scientists are constantly investigating them to learn more about their nature and possible uses.

These latest findings open the door for further discoveries and breakthroughs in quantum technology by highlighting the fascinating developments in the realm of time crystals and its relationship to quantum computing.

QC Challenges in Creating Time Crystals:

In order to fully utilise this new phase of matter, researchers are aggressively tackling the following issues associated with the creation of time crystals using quantum computers:

Decoherence: Decoherence, or the gradual decay of quantum states, is a result of quantum systems' extreme sensitivity to outside perturbations[2]. One major issue in the creation and observation of time crystals is maintaining coherence, particularly in larger systems .


Time crystal illustration of QC experiment

System Size and Coherence Time: The scope and length of time crystal experiments are restricted by quantum device imperfections, such as finite size and coherence time[1]. Resolving these issues is essential to conducting effective research into the characteristics and behaviour of time crystals[1].

Verification of Ideal Behaviour: It is difficult to conduct experiments to confirm that time crystals exhibit ideal behaviour, such as infinite oscillations from any state.

Although protocols have been established to efficiently explore multiple states of time crystals, researchers are still working to improve methods for thorough analysis.


Error Correction: Due to flaws in hardware and external circumstances, quantum computers are prone to errors by nature. In order to guarantee precise time crystal observation and research, scientists are investigating ways to rectify mistakes inside quantum systems.

Practical Applications: Although the development of time crystals is revolutionary, there is still work to be done in terms of practical uses. One of the biggest challenges facing researchers is figuring out how to use the special qualities of time crystals for practical quantum computing applications.

These obstacles must be overcome before time crystals can be created and used to advance the area of quantum computing, provide new understanding of quantum systems, and perhaps transform the capabilities of upcoming quantum technologies.


SURFACE CODE AND LOW-DENSITY CODE

Shor's algorithm has served as a model of quantum computing capabilities for many years. This set of instructions, created in 1994 by Peter Shor, enables a device to break big numbers into their prime factors far quicker than a standard, classical computer by taking advantage of quantum mechanics' peculiarities.
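As a reminder of what Shor's algorithm actually buys, here is a toy sketch of the classical part of the factoring recipe (the period of aˣ mod N is found by brute force here; the whole point of Shor's algorithm is that a quantum computer finds that period efficiently even for the huge N used in RSA).

```python
from math import gcd

def find_period(a, n):
    # Brute-force order finding: smallest r > 0 with a^r = 1 (mod n).
    # This is the step a quantum computer performs efficiently in Shor's algorithm.
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    assert gcd(a, n) == 1, "a must be coprime to n"
    r = find_period(a, n)
    if r % 2 == 1:
        return None  # odd period: pick another a
    x = pow(a, r // 2, n)
    if x == n - 1:
        return None  # trivial case: pick another a
    return gcd(x - 1, n), gcd(x + 1, n)

# Factor 15 with the classic textbook choice a = 7: period 4, factors 3 and 5.
print(shor_classical_part(15, 7))
```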

The first noteworthy advancement in Shor's algorithm since its creation was made in August when a computer scientist created an even quicker version of the method.

However, quantum computers that are useful are still far off. Tiny mistakes in real life can quickly build up, causing computations to be ruined and any quantum benefits to be lost.

In fact, a group of computer scientists demonstrated late last year that a classical algorithm performs about as well as a quantum algorithm that incorporates errors for a particular issue.

However, there is still hope: research conducted in August revealed that some error-correcting codes, also referred to as low-density parity check (LDPC) codes, are at least ten times more effective than the norm.

Researchers rely on the technique of employing additional qubits for redundant information encoding, known as quantum error correction. The surface code, released in 1998, arranged qubits into a square grid reminiscent of a Minesweeper board. It was extremely tolerant of misbehaving qubits, became established as the gold standard for error correction, and influenced the development of quantum processors and quantum road maps.

The drawback of the surface code is that it creates a voracious appetite for qubits, because ever larger blocks of inferior qubits are required to safeguard each trustworthy qubit more effectively.

These codes with "nonlocal" links have attracted the attention of quantum information theorists; they are colloquially referred to as LDPC codes.

Low-density parity check (LDPC) codes are a new family of quantum error-correcting codes that have been demonstrated to be more efficient than the surface code, which is the current gold standard, in new simulations from two groups.


With the use of these codes, a large group of qubits that frequently make mistakes are reduced to a much smaller group of "protected" qubits.

Compared to the surface code, LDPC codes in the two simulations were able to create protected qubits from 10–15 times fewer raw qubits. The emergence of more powerful quantum devices could be accelerated by these codes, according to experimental plans.

In an effort to see if any of the LDPC codes may be incorporated into modern devices, researchers have been examining the performance of these codes on smaller systems.

Based on an LDPC code from a 2012 study, IBM researchers recently released a simulation of the smallest and most detailed LDPC blueprint to date.

The group discovered that by employing 288 raw qubits that failed 0.1% of the time to make 12 protected qubits with a failure rate 10,000 times lower, its code was able to protect its reliable qubits much more efficiently than the industry standard.
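A back-of-envelope comparison illustrates why the reported 10–15× saving is plausible. This is a rough sketch only: it assumes the commonly quoted surface-code estimates of roughly 2d² physical qubits per logical qubit and a logical error rate of about 0.1·(p/p_th)^((d+1)/2) with a threshold near 1%, while the LDPC side simply reuses the 288-qubit, 12-logical-qubit figures reported above.

```python
# Numbers reported in the IBM LDPC simulation described above.
ldpc_physical, ldpc_logical = 288, 12
physical_error = 1e-3                   # raw qubits fail 0.1% of the time
target_logical_error = 1e-3 / 10_000    # "a failure rate 10,000 times lower"

# Commonly quoted surface-code estimates (assumptions, not exact figures):
# logical error ~ 0.1 * (p / p_th)^((d+1)/2) with threshold p_th ~ 1%,
# and ~2 * d^2 physical qubits per logical qubit at code distance d.
p_th = 1e-2
d = 3
while 0.1 * (physical_error / p_th) ** ((d + 1) / 2) > target_logical_error:
    d += 2  # surface-code distances are odd

surface_per_logical = 2 * d * d
ldpc_per_logical = ldpc_physical / ldpc_logical

print(f"surface code: distance {d}, ~{surface_per_logical} physical qubits per logical qubit")
print(f"LDPC code:    ~{ldpc_per_logical:.0f} physical qubits per logical qubit")
print(f"overhead ratio ~{surface_per_logical / ldpc_per_logical:.1f}x")
```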

Nikolas Breuckmann (German, University of Bristol), the LDPC code genius

With devices with hundreds of qubits on the horizon, the simulation straddles the line between today's and tomorrow's error correction. Similar findings were reported by a multi-institution collaboration of researchers led by Liang Jiang of the University of Chicago and Mikhail Lukin of Harvard University.

There are two primary obstacles that could prevent these developments from becoming widespread: first, it is difficult to establish nonlocal connections among qubits, particularly for firms such as IBM that fabricate qubits from stationary superconducting circuits.

Lukin et al. combined a surface code-protected quantum processor with an LDPC-protected quantum memory, taking small steps towards resolving these fundamental flaws.

Additionally, they customised their simulations for a particular class of freely travelling qubits that are well suited for setting up distant links. Though it is currently not feasible to do so in a practical manner, there is a clear future for LDPC codes.


PART 4 - REALISTIC -


PHOTONIC QUANTUM COMPUTING

Utilizing photons as qubits, photonic quantum computing is a parallel branch of quantum computing, in contrast to the classic “chandelier” qubit model.

In this method, quantum bits are represented by photons, and the information they convey is either 1 or 0. In some designs, a scattering unit interacts with a single atom to encode information onto photons, and the photons are stored in a ring that forms part of the photonic quantum computer.

Between the photon and the atom, this interaction produces entanglement, which makes it possible to modify quantum states without being limited by distance.

The principal benefits of photonic quantum computing are its room temperature functioning, which eliminates the need for large cooling systems, its basic components, and its versatility in performing different quantum processes.

The key advantages of using photons in quantum computing include:

Easy generation and measurement: Photons are relatively easy to generate and measure, making them a suitable choice for quantum computing[2].

Weak interaction with the environment: Photons interact very weakly with their environment, which helps in maintaining the quantum state of the qubits and reduces the likelihood of errors[2].

Easy transfer of qubits: Photons can be easily transferred from one point to another, which is essential for quantum communication and the integration of quantum computers into existing fiber-optic-based telecommunications systems[4].

Potential for room-temperature operation: Unlike other quantum computing approaches that require cryogenic temperatures to maintain qubits, photonic quantum computers can, in principle, operate at room temperature.

Scalability: Photonic quantum computers have the potential to scale more easily than other approaches, as they can be integrated into optical chips with all the necessary components[2].


Entanglement and interaction: Despite not interacting directly with each other, photons can be entangled, allowing for the manipulation of quantum states without the need for direct interaction between qubits.

Integration with existing technologies: Photonic quantum computers can potentially integrate with existing fiber-optic-based telecommunications systems, which could facilitate the development of a quantum Internet.

These advantages make photons a promising choice for quantum computing, and researchers are actively exploring their potential in various quantum computing applications and systems. (Image: Xanadu, Canada, photonic quantum computer)

Despite the 7 advantages, there are 7 disadvantages to consider:

Difficulty in generating and measuring: While photons are relatively easy to generate and measure, the process of generating and measuring qubits in a photonic quantum computer can be challenging. This is because the photons must be generated in a specific state and then measured with high precision to extract the desired quantum information.

Limited interaction between photons: Photons do not interact with one another directly, which is a significant disadvantage in quantum computing. This lack of interaction makes it difficult to create entangled states, which are essential for quantum computing operations.


Scalability: Photonic quantum computers face challenges in scaling up their operations. The current systems rely on bulk optic components, which will be difficult to scale efficiently. The path forward is integration, but this process is still in development.

Efficient construction of two-photon gates: Optical quantum computing, which uses photons, faces challenges in constructing two-photon gates. These gates are crucial for quantum computing operations, but current methods rely on linear optics with projective measurements, which can be inefficient.

Need for high-efficiency sources of indistinguishable single photons: Developing high-efficiency sources of indistinguishable single photons is a significant challenge in photonic quantum computing. These sources are essential for generating and manipulating qubits.

Low-loss scalable optical circuits: Creating low-loss scalable optical circuits is another challenge in photonic quantum computing. These circuits are necessary for the efficient transfer and manipulation of qubits.

High-efficiency single-photon detectors: Developing high-efficiency single-photon detectors is crucial for measuring the quantum states of photons.

These disadvantages highlight the ongoing research and development efforts required to overcome the challenges and fully harness the potential of photonic quantum computing.

Potential solutions to the disadvantages of photonic quantum computing are actively being researched and developed to enhance the capabilities of this technology. Some of the solutions include:

Boson Sampling: Researchers have demonstrated quantum advantage with photonic quantum computers using a specific computation known as Boson Sampling. This method leverages the bosonic nature of photons to control the behavior of other photons, enabling computations that outperform classical computers significantly.

Integration and Scaling: To address scalability challenges, the path forward for photonic quantum computing involves integration. Startups like “Xanadu” and “Psi Quantum” are developing optical chips with all the necessary components to facilitate scaling and efficient operation.

Improved Components: Companies in the quantum optical ecosystem are continuously working on developing improved components for photonic

49

quantum computers. This includes advancements in single-photon sources, detectors, and optical circuits to enhance the efficiency and performance of these systems.

Efficient Construction of Two-Photon Gates: Efforts are being made to improve the construction of two-photon gates in optical quantum computing. Research is focused on developing more efficient methods for creating these gates, which are essential for quantum computing operations.

High-Efficiency Single-Photon Detectors: Companies like “ID Quantique” are working on enhancing single-photon detectors to improve the measurement of quantum states in photonic quantum computing systems. These detectors play a crucial role in the reliability and accuracy of quantum computations.

By addressing these challenges through innovative research and technological advancements, the potential of photonic quantum computing can be further realized, paving the way for more efficient and powerful quantum computing systems in the future.


If you weigh the central advantage of room-temperature operation against the main disadvantage of scalability, which is being actively addressed, my personal vote goes to the groups focused on photonics rather than the classic approaches.

I have been betting on them since 2020 and I keep faith in these teams! Let's see whether I am contradicted by ….. 2030!

50
……………

REALITY CHECK BY SCEPTICS

Contrary to popular belief, the quantum computing revolution may not happen very soon and may prove more limited than hoped. That is the takeaway from a small but vocal group of well-known sceptics in and around the nascent field of quantum computing.

Many problem areas have been held up as applications for quantum computing, including financial modelling, logistics optimization, and machine learning acceleration.

According to some of the more audacious timetables put forward by firms that specialize in quantum computing, these devices could start affecting practical problems within a few years. However, there is a growing backlash against what many consider to be exaggerated claims made for the technology.

Yann LeCun, the leader of AI research at Meta, recently made news for casting doubt on the likelihood that quantum computers will have a significant impact anytime soon.

At a press conference honoring Meta's Fundamental AI Research team's tenth anniversary, he stated that while the technology is "a fascinating scientific topic," he was less certain about "the possibility of actually fabricating quantum computers that are actually useful."

Despite LeCun's lack of expertise in quantum computing, prominent individuals in the industry are also issuing a warning. There is currently "a tremendous amount of hype" in the sector, according to Oskar Painter, head of quantum hardware at Amazon Web Services, and "it can be difficult to filter the optimistic from the completely unrealistic."

The fact that modern quantum computers are incredibly error-prone presents a fundamental challenge. There are still potential applications for these so-called "noisy intermediate-scale quantum" (NISQ) machines, according to some.

According to Painter, however, there is a growing realisation that this is improbable and that quantum error-correction systems will be essential to the development of useful quantum computers.

51

To develop more resilient "logical qubits," the leading idea entails distributing information over many physical qubits; however, this could require up to 1,000 physical qubits for every logical one. Some have even proposed, although the view is not widely accepted, that quantum error correction might not be achievable at all.
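
To put that overhead into rough numbers, the short Python sketch below uses the standard surface-code bookkeeping of d x d data qubits plus (d x d - 1) measurement qubits per logical qubit; the code distances are illustrative and not tied to any particular machine.

def physical_per_logical(d):
    # Surface-code estimate: d*d data qubits + (d*d - 1) measurement qubits.
    return 2 * d * d - 1

for d in (11, 17, 23):
    print(d, physical_per_logical(d))   # 241, 577, 1057

A code distance of 23 already implies roughly 1,057 physical qubits per logical qubit, consistent with the "up to 1,000" figure quoted above.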

In any case, Painter argues, achieving these ideas at the necessary scales and speeds is still a long way off. He states, "I would estimate at least a decade out given the remaining technical challenges in realizing a fault-tolerant quantum computer capable of running billions of gates over thousands of qubits."

52
IBM QC Superconductor CONDOR

Timescales are not the only issue. Matthias Troyer, a technical fellow at Microsoft who oversees the company's quantum computing initiatives, co-authored a paper in Communications of the ACM in May that made the argument that there were fewer applications than some may have you believe where quantum computers could offer a significant advantage.

"Over the past ten years, we have discovered that a great deal of what people have suggested has not worked," he states. "And then we discovered a few really basic causes for that."

The ability to answer problems far quicker than classical computers is the core promise of quantum computing, albeit the precise speed at which this can be achieved varies. According to Troyer, there are two areas where quantum algorithms seem to offer an exponential speed increase.

One is factoring big numbers, which may allow for the breach of the public key cryptography that underpins the internet. The other is quantum system simulation, which has potential uses in materials science and chemistry.

Numerous other issues, such as fluid dynamics, medication design, and optimisation, have been addressed by quantum algorithms. Promised speed increases, however, don't always materialise; occasionally, they amount to a quadratic gain, which means that the time required for a problem to be solved by a quantum algorithm is equal to the square root of the time required by a classical approach.

According to Troyer, these improvements could be swiftly erased by quantum computers' enormous processing cost. A qubit operates orders of magnitude slower than a transistor because it is significantly more difficult to operate.

This indicates that a classical computer will always be faster for simpler problems, while the quantum computer's advantage depends on how quickly the classical algorithm's complexity scales.


53

Troyer and his colleagues pitted a single Nvidia A100 GPU against a theoretical fault-tolerant quantum computer with 10,000 "logical qubits" whose gates were substantially faster than those of today's hardware.

According to Troyer, their findings indicate that a quantum algorithm that achieves a quadratic speedup would need to execute for decades or even millennia in order to surpass a conventional algorithm on problems large enough to be practically meaningful.
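
The arithmetic behind that conclusion can be sketched in a few lines of Python. In the toy model below, a classical machine performs n elementary operations while the quantum machine needs only the square root of n, but each logical quantum operation is vastly slower; both throughput figures are assumptions chosen for illustration, and Troyer's full analysis adds further error-correction and input/output overheads on top.

import math

classical_rate = 1e14   # classical operations per second (GPU-class; assumed)
quantum_rate = 1e5      # logical quantum operations per second (assumed)

for exp in (12, 15, 18, 21):
    n = 10 ** exp                                 # problem size
    t_classical = n / classical_rate              # classical cost: n operations
    t_quantum = math.sqrt(n) / quantum_rate       # quadratic speedup: sqrt(n) operations
    print(f"n=1e{exp}: classical {t_classical:.1e} s, quantum {t_quantum:.1e} s")

Even with these generous assumptions, the quantum machine only breaks even around n of 10^18, by which point the classical run already takes hours; adding realistic overheads pushes the crossover out by many more orders of magnitude.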

Data bandwidth is another important barrier. The slow working rates of qubits essentially restrict how quickly classical data may enter and exit a quantum computer. According to Troyer, this is probably going to be hundreds or perhaps millions of times slower than traditional computers, even in the most optimistic future scenarios. This implies that for the foreseeable future, data-intensive applications like database searching and machine learning will most likely remain unattainable.

Troyer concludes that the main benefit of quantum computing will come from solving small-data problems at exponentially faster speeds. "The rest is lovely theory, but it won't be applicable," he continues. According to him, the study had little effect

54

on the quantum community, but many Microsoft customers expressed gratitude for the clarity it provided on practical uses for quantum computing. He claims that several businesses, especially in the finance industry, have reduced or closed their teams dedicated to quantum computing.

Scott Aaronson, University of Texas:

Scott Aaronson, professor of computer science at the University of Texas at Austin, argues that anyone who has been following developments in quantum computing research should not be too surprised by these limits.

He states, "I think skepticism was always warranted because there are these claims about how quantum computing will revolutionise machine learning, optimisation, finance, and all these industries. If they are only now realising that, then good for them."

Even though he believes that practical applications are still a way off, he is actually encouraged by recent advancements in the field.

This month, scientists from Harvard and the quantum computing firm QuEra showed that they could produce 48 logical qubits using a 280 qubit processor, which is a significant increase over what has been accomplished in earlier studies.

According to Aaronson, "this was definitely the biggest experimental advance maybe for several years."

"It's a little disappointing when you say quantum computing is going to fix every problem in the world, but it doesn't, or not right now.”

Yuval Boger, QuEra:

Although the experiment was a lab demonstration, Yuval Boger, chief marketing officer at QuEra, is quick to point out that some have reevaluated their timelines for fault-tolerant quantum computing as a result of the findings. However, he also claims to have observed a pattern of businesses covertly diverting their resources from quantum computing.

According to him, one reason for this has been the increased interest in AI following the release of large language models. He acknowledges, however, that some in the business have overstated the technology's near-term promise and that the excitement has been misleading.

55

Hype, he notes, "helps attract investments and inspires talented individuals to enter the field." However, it is a little disappointing when one claims that quantum computing will cure every issue in the world and then discovers that it cannot, or at least not now.

The uses of quantum computers may be more limited than first thought, even in the fields where they appear to hold the greatest promise. Recent publications from a multi-institutional team and the scientific software business Schrödinger have revealed that only a small subset of quantum chemistry issues are likely to profit from quantum speedups.

Philipp Harbach, Merck KGaA: Philipp Harbach, global head of group digital innovation at the German pharmaceutical giant Merck KGaA in Darmstadt, Germany (not to be confused with the American company Merck), notes that it is also critical to keep in mind that many businesses already have established and effective quantum chemistry workflows that run on classical hardware.

He argues that the public was given an inaccurate picture of what the quantum computer would make possible: it will primarily accelerate existing procedures rather than open up an entirely new, disruptive application field, and that is an important distinction.

For the past six years or so, Harbach's group has been examining how quantum computing relates to Merck's research. Although NISQ devices might be useful for some extremely specific problems, they have concluded that unless fault-tolerance is realised, quantum computing will not have a major impact on industry.

Even then, according to Harbach, just how revolutionary that impact could be will depend on the precise use cases and products a company is working on.

When problems grow too big for traditional computers to handle, quantum computers excel at offering precise solutions. According to Harbach, that might be very helpful for some applications, like creating new catalysts. However, Merck is primarily interested in chemical challenges that need rapid screening of a large number of candidate molecules.

"Approximations are sufficient because most problems in quantum chemistry do not scale exponentially," he claims. "These are well-behaved problems; all you need

56

to do is increase the system size to make them faster." But there can still be hope, according to Troyer at Microsoft.

Even if quantum computers are only able to solve a small range of issues in fields like materials science and chemistry, the implications could still be revolutionary. Materials have a significant impact on humanity, he adds, citing the Stone, Bronze, Iron, and Silicon Ages as examples.

57

EXASCALE SWEET REVENGE ... UP TO NOW!

The world's fastest supercomputer can compute numbers at a speed that can be hard to comprehend. However, according to Jack Dongarra, a computer scientist at the University of Tennessee, "it would take four years to equal what that computer can do in one second if everybody on Earth were to do one calculation per second." The supercomputer is known as Frontier.

Here are a few other specifications: compared with the 16 or 24 processor cores in the most powerful laptops, Frontier uses about 50,000 processors. It consumes 20 million watts, as opposed to the roughly 65 watts used by a laptop. Building it cost $600 million.

The first exascale computers, or computers capable of performing an exaflop, a quintillion (10^18) floating point operations per second, were introduced when Frontier went online. Since then, researchers have been working to build more of these incredibly fast computers; in 2024, several exascale machines are expected to go online in the US and Europe.
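
Dongarra's comparison is easy to sanity-check in Python with rough figures; the world population and Frontier's sustained speed used below are approximations.

world_population = 8e9              # people, each doing one calculation per second
seconds_per_year = 365.25 * 24 * 3600
frontier_ops_per_second = 1.1e18    # roughly Frontier's HPL score of 1.1 exaflops

years = frontier_ops_per_second / (world_population * seconds_per_year)
print(round(years, 1))              # about 4.4 years of everyone calculating, per Frontier-second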

However, speed by itself is not the goal. Exascale computers are being built by researchers to investigate science and engineering issues in astronomy, biology, climate change, and other domains that were previously unexplorable.

Frontier will be used in the coming years by scientists to execute the most intricate computer simulations that have ever been created by humans. They intend to investigate unresolved mysteries of the natural world and create novel technologies ranging from medical to transportation.

For example, Evan Schneider of the University of Pittsburgh is simulating the evolution of our galaxy throughout time using Frontier. She is especially curious about the movement of gas into and out of the Milky Way. In a sense, a galaxy breathes: gas enters it and condenses into stars through gravity, but gas also exits the galaxy as stars burst and liberate matter. Schneider investigates the processes involved in galaxies' exhalation.

"We can assess whether we're understanding physics correctly by comparing the simulations to the actual observed universe," explains Schneider.

58

Schneider is using Frontier to create a high-resolution computer model of the Milky Way that will allow her to zoom in on individual exploding stars.

This implies that both the large-scale properties of our galaxy at 100,000 light-years and the properties of the supernovas at roughly 10 light-years across must be captured by the model.

In another project, engineers simulating aircraft engine fans have suggested removing the nacelle, the outer structural frame, to expose the blades like a pinwheel and allow larger fans. "Early in the design phase, the simulations allow us to obtain a detailed view of the aerodynamic performance," says Priebe, one of the researchers running these simulations.

They provide engineers with information on how to sculpt the fan blades to improve aerodynamics or reduce noise. Priebe's research on turbulence, the chaotic motion of a disturbed fluid (in this case, the air around the fan), will benefit notably from Frontier. Turbulence is an everyday occurrence.

The curl of smoke rising from an extinguished candle and the crashing of ocean waves are examples of it. However, precisely predicting the flow of a turbulent fluid remains a challenge for scientists.

This is because it responds both to microscopic factors, such as individual nitrogen molecules in the air rubbing against one another, and to macroscopic factors, such as pressure and temperature variations.

59

Nuclear fusion, the chaotic process by which the sun produces energy by pressing atomic nuclei together to form other elements, is another process that physicists can now simulate.

With the goal of developing fusion as a clean energy technology, they wish to gain a deeper understanding of the process. Although multi-scale simulations of this kind have long been a mainstay of supercomputing, Frontier is able to support more diverse scales than in the past.

To use Frontier, authorized scientists connect to the supercomputer remotely and submit their jobs over the internet. Oak Ridge wants to maximize the machine's utilization, so roughly 90% of its processors should be busy with calculations seven days a week.

With the exascale, the need for more processing power doesn't end. According to Messer, Oak Ridge is already thinking about the upcoming generation of computers, which would be computationally three to five times as powerful as Frontier. However, a significant obstacle remains: the enormous energy footprint. Even in idle mode, the electricity that Frontier consumes is enough to power thousands of houses.

"Growing machines bigger and bigger is probably not a sustainable approach for us," remarks Messer. Oak Ridge engineers have pushed to increase the supercomputers' efficiency with new inventions, such as a novel cooling technique, as the laboratory has developed progressively larger machines.

About 10% of the energy used by Summit, the forerunner to Frontier that is still operational at Oak Ridge, goes to cooling. In contrast, Frontier uses between 3% and 4% of its energy for cooling. This improvement came from cooling the supercomputer with ambient-temperature water instead of chilled water.

Supercomputers of the future would be able to simulate even more scales at once. Schneider's galaxy simulation, for instance, offers resolution down to tens of light-years using Frontier. That is not yet fine enough to capture individual supernovas, so researchers must currently simulate each explosion separately. A future supercomputer may be able to combine all of these scales.

60

These supercomputers push the boundaries of science by more faithfully mimicking the complexity of nature and technology. Scientists can now grasp the expanse of the cosmos thanks to a more accurate galaxy simulation.

It is not necessary to construct an unaffordable wind tunnel when a precise model of the air turbulence surrounding an aeroplane fan exists. Scientists are able to forecast the future of our planet thanks to improved climate models. Put differently, they provide us a fresh resource to get ready for an unpredictable future.

With an HPL score of 1.102 exaflop/s, the Frontier system at Oak Ridge National Laboratory (ORNL) in the United States leads the most recent worldwide supercomputer ranking. It is followed by:

1. Supercomputer Fugaku, installed at the RIKEN Center for Computational Science in Japan, with an HPL score of 442.01 PFlop/s.

2. LUMI, installed at EuroHPC/CSC in Finland, with an HPL score of 309.1 PFlop/s.

3. Leonardo, installed at EuroHPC/CINECA in Italy, with an HPL score of 239 PFlop/s.

4. Summit, an IBM Power System installed at Oak Ridge National Laboratory in the USA, with an HPL score of 148.60 PFlop/s.

5. Sierra, installed at the Lawrence Livermore National Laboratory in California, USA, with an HPL score of 94.64 PFlop/s.

The next top-ranked supercomputer expected after Frontier is Europe's first exascale machine, JUPITER, anticipated to come online in late 2024.

61

FUTURE OBSTACLES FOR QC

The first obstacle is selecting which of the various methods for putting quantum computing into practice will prevail. The significant investment costs associated with quantum computing and quantum circuits mean that trying out all the different strategies will be expensive in both time and money. At present, the most likely outcome is that separate techniques will be used for different purposes.

QC firms are now investigating several methodologies, including quantum annealing, the universal quantum gate model, and the analogue quantum model.

The method most likely to be commercialised soon for solving challenging mathematical problems is quantum annealing.

Stable quantum processor production and error correction:

Exploiting the features of quantum mechanics requires manipulation at very small scales, sometimes smaller than an atom, and stability and error-verification issues arise at those scales.

According to quantum scientists, qubit error correction matters more than the total number of qubits acquired. Complex problem solving remains a challenge because of the unpredictability of qubits.

Preserving the harsh operational environment: IBM maintains a temperature of fifteen millikelvin, low enough to prevent ambient noise or heat from exciting the superconducting qubits, thereby improving stability and control.

Maintaining the temperature at such a low level is itself a delicate balancing act. Operating conditions will need to improve before a quantum computer or processor can be widely commercialized.

Researchers in quantum technology are trying to figure out how to operate quantum computers at higher temperatures, and a new record for operating temperature was set recently.

The maximum working temperature ever recorded is 1 kelvin, or about -272 degrees Celsius.

62

Nevertheless, getting these systems to run at ambient temperature will take a while longer.

But hopefully not as long as this engineer wiring an IBM computer in 1958!

Stability and error correction are two issues that depend on technological investment, research funding, and advances in quantum physics. Different organisations are pursuing a variety of approaches in the race to deliver the most widely usable quantum computing technology, and it will take some time to determine which strategy works best in which context.

63

CHINA'S TECH ASCENDANCY: QC & AI AMBITIONS

On March 5th in Beijing, Premier Li Qiang delivered an extensive work report to the National People's Congress, outlining China's ambitious plan to spearhead the development of strategic emerging industries.

Central to this strategy is the recent unveiling of a national strategic industry list, which earmarks Quantum Computing and Artificial Intelligence (AI) as critical areas of focus.

This move is part of China's broader aim to fortify its technological sovereignty and position itself as a leader in avant-garde technologies. By concentrating efforts on these domains, China aspires to carve out a path to technological self-reliance, ensuring its prowess in quantum computing and AI.

This focus is a testament to the country's ambition to dominate in strategic technological spheres and establish itself as a powerhouse in science and technology.

Substantial investments in research and development are being channeled into enhancing innovation capabilities across a myriad of sectors, a strategy designed to bolster national security and economic fortitude.

In addition to quantum computing and AI, the report underscores an intensified commitment to big data and introduces an "AI-plus" initiative.

It also announces plans to kickstart numerous major science and technology programs, aimed at addressing significant strategic and industrial development objectives.

The report eloquently states: "We will fully leverage the strengths of the new system for mobilizing resources nationwide to raise China's capacity for innovation across the board."

This declaration reflects Beijing's prioritization of technological self-sufficiency, especially in the wake of intensified trade tensions with the United States, which led to restrictions on exports of vital components such as chips to China.

64

By focusing on nurturing domestic innovation capabilities and reducing dependency on foreign technology suppliers, China aims to enhance its national security and economic resilience.

The government's increasing involvement in directing resources towards these ends is evident, as is the enhanced power granted to the Communist Party in shaping tech-related policies, marking a significant shift in government restructuring.

Moreover, China is dedicated to cultivating a new generation of top-tier scientists and innovation teams, alongside improving the mechanisms for identifying and nurturing leading innovators.

This concerted effort delineates China's strategy to not only advance its technological infrastructure but also to foster a robust ecosystem of scientific excellence and innovation.

65

PART 5

-FUTURISTIC-

66

LATTICE-BASED CRYPTO VS QC

Imagine that you had a massive, intricate hedge maze. You can take countless routes through this maze, some of which will lead to dead ends and others which will lead to the exit.

The maze is constructed in such a way (let's call this design a "lattice") that the sheer number of paths and decisions you have to make makes it practically impossible to find the exit without a map.

In the field of computer security, intricate puzzles like our maze are employed to safeguard data. The mathematical foundation of these puzzles is lattice mathematics, which deals with grids and patterns that are simple to set up but extremely challenging to solve without the proper key.

Using today's technology and knowledge, the National Institute of Standards and Technology (NIST) has chosen a few of these extremely difficult lattice-based puzzles to protect sensitive data. They are thus effective in safeguarding sensitive information.

But, like with any maze, there's always a chance that someone may discover a way around or create a brand-new technique to get past our preconceived notions of how soon the puzzle can be solved.

Saying there is "no guarantee that a future breakthrough could not crack them" is comparable to admitting that our elaborate security system may one day be broken by someone who finds a quicker or more effective way to go through the maze and get out.

Thus, even though lattice-based puzzles are currently among the strongest methods we have for information security, we are mindful that future technological or mathematical developments may be able to circumvent them.

In terms of cryptography and the explanation of mazes, a lattice is a mathematical structure that resembles an endlessly long grid with regular points spaced out in numerous directions, rather than a real maze.

Imagine a 3D checkerboard that stretches not just left and right and up and down, but also forward and backward, and even into higher dimensions beyond our three-dimensional world.

67

Every point on this grid represents a potential solution to an issue, and the lattice's mathematics explain the rules for getting from one point to another.

Certain movements or paths between points can be easily calculated if you know the initial conditions (much like when you have a map of the maze).

However, without the key, it can be very challenging to find the starting point when you only know the end point (much like when you try to find the maze's entrance from the exit without a map).

Cryptographers use lattices to formulate problems that are thought to be difficult to solve without the right key.

These challenging issues focus on locating specific spots inside the lattice or figuring out the best or quickest path between locations while adhering to strict guidelines.

Since these puzzles are hard for an outsider to solve without the key, lattice-based encryption is a good option for protecting data from future supercomputers (such as quantum computers) that might be able to crack most existing cryptography systems.

Lattices are structures made of points organised in a regular grid-like pattern in multidimensional space. NIST-selected algorithms make use of this mathematical foundation.

68

Complex mathematical problems can be solved using these points and the relationships between them.

The lattice-based approach to cryptography was developed around the intrinsic difficulty of certain computational problems within these structures, such as determining the shortest path between points or locating specific lattice points from incomplete information.

These problems lend themselves to robust cryptographic techniques because their complexity grows exponentially with the lattice's dimension, rendering them computationally infeasible to solve without the right cryptographic keys.
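
For readers who want to see what such a problem looks like in code, here is a deliberately tiny, insecure Python sketch of the Learning With Errors (LWE) idea that underlies several of the NIST-selected schemes: the public key is a batch of noisy linear equations, and recovering the secret from them is the "maze without a map". All parameters are toy values chosen only so the example runs.

import numpy as np

rng = np.random.default_rng(0)
n, q, m = 8, 97, 20                        # toy dimension, modulus, number of equations

s = rng.integers(0, q, n)                  # secret vector (the "map")
A = rng.integers(0, q, (m, n))             # public random matrix
e = rng.integers(-1, 2, m)                 # small noise hiding the secret
b = (A @ s + e) % q                        # public key: noisy equations, roughly A*s

def encrypt(bit):
    r = rng.integers(0, 2, m)              # random subset of the public equations
    return (r @ A) % q, (r @ b + bit * (q // 2)) % q

def decrypt(u, v):
    d = (v - u @ s) % q                    # the secret cancels the structure; only noise remains
    return int(q // 4 < d < 3 * q // 4)    # a value near q/2 means the encrypted bit was 1

u, v = encrypt(1)
print(decrypt(u, v))                       # -> 1

Standardised schemes follow this same pattern, but with dimensions and noise chosen so that recovering the secret is believed to be infeasible even for quantum computers.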

Nonetheless, the statement that "there is no guarantee that a future breakthrough could not crack them" recognises the transient nature of cryptography's security.

Due to the assumed computational difficulty of the underlying lattice issues, lattice-based cryptography systems are currently considered secure against both classical and quantum computational attacks; nonetheless, the field of computational mathematics is ever-changing.

These hardness assumptions could be disproved, either theoretically or in practice, which would leave the lattice-based cryptography techniques in use today vulnerable.

This assertion embodies a more general cryptographic principle, which states that the security of cryptographic algorithms depends on available information and skills and needs to be updated frequently to account for new discoveries.

69

AI INTEGRATION WITH QUANTUM COMPUTING

Integrating artificial intelligence (AI) with quantum computing represents a frontier in technological advancement, poised to redefine the capabilities of computational models and algorithms.

This integration leverages the unique properties of quantum bits (qubits) to process and analyze data in ways that are fundamentally different from classical computing, offering exponential speed-ups for specific AI tasks.

Quantum-enhanced machine learning, for instance, benefits from quantum parallelism and entanglement, allowing it to tackle pattern recognition and optimization problems with unprecedented efficiency. The creation of hybrid systems that combine classical and quantum computing elements marks a pragmatic approach to harnessing quantum capabilities, facilitating a gradual integration into existing AI infrastructures. However, this convergence is not without challenges.
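
As a concrete picture of such a hybrid loop, the short Python sketch below runs a classical optimiser around a "quantum" subroutine. The quantum device is replaced here by an exact single-qubit simulation, and every name and parameter is an illustrative assumption rather than any vendor's API.

import numpy as np

def expectation_z(theta):
    # State RY(theta)|0> = [cos(theta/2), sin(theta/2)]; return the <Z> expectation value.
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return state[0] ** 2 - state[1] ** 2              # equals cos(theta)

def parameter_shift_grad(theta):
    # Gradient rule usable on real hardware: only expectation values are needed.
    return 0.5 * (expectation_z(theta + np.pi / 2) - expectation_z(theta - np.pi / 2))

theta, lr = 0.1, 0.4                                   # initial guess and learning rate
for _ in range(60):                                    # the classical optimiser drives the loop
    theta -= lr * parameter_shift_grad(theta)

print(round(expectation_z(theta), 3))                  # approaches -1.0, the minimum of <Z>

The same pattern, with many qubits and a hardware backend in place of the toy simulator, is the basis of the variational algorithms most often proposed for near-term quantum machine learning.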

Technical hurdles such as error rates and qubit stability persist, prompting research into error correction techniques and quantum error mitigation to make quantum AI a practical reality.

The practical applications of quantum AI are vast and varied, promising to revolutionize fields such as drug discovery, materials science, financial modeling, climate modeling, and cybersecurity.

In drug discovery and materials science, for example, quantum AI can simulate molecular interactions at a quantum level with remarkable accuracy, drastically reducing the time and cost associated with developing new drugs. Financial modeling also stands to benefit, as quantum AI algorithms process vast datasets more efficiently, enhancing market trend predictions, risk assessment, and portfolio management.

Furthermore, the potential of quantum AI in processing complex environmental data could lead to more accurate climate models, informing policy and environmental protection efforts more effectively.

As for cybersecurity, quantum computing plays a dual role: it poses a threat to current encryption methods while simultaneously paving the way for virtually unbreakable encryption through quantum key distribution.
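
A toy version of quantum key distribution (the BB84 protocol, idealised here with no eavesdropper and no noise) shows the mechanism in a dozen lines of Python; the counts and random seed are arbitrary illustrative choices.

import numpy as np

rng = np.random.default_rng(7)
n = 32
alice_bits = rng.integers(0, 2, n)        # Alice's random raw key bits
alice_bases = rng.integers(0, 2, n)       # 0 = rectilinear basis, 1 = diagonal basis
bob_bases = rng.integers(0, 2, n)         # Bob picks his measurement bases at random

# When the bases match, Bob reads Alice's bit; otherwise his outcome is random.
bob_bits = np.where(alice_bases == bob_bases, alice_bits, rng.integers(0, 2, n))

keep = alice_bases == bob_bases           # the bases are compared publicly; matches are kept
key_alice, key_bob = alice_bits[keep], bob_bits[keep]
print(np.array_equal(key_alice, key_bob), len(key_alice))   # True, roughly n/2 shared bits

An eavesdropper who measures the photons in transit necessarily disturbs some of them, which Alice and Bob can detect by publicly comparing a sample of their key before using the rest.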

This dichotomy underscores the transformative potential of quantum AI across various domains, balancing risks with revolutionary advancements. Looking towards the future, the prospects of integrating AI

70

with quantum computing hinge on scaling quantum technologies to make them more accessible for AI applications.

Achieving quantum supremacy in specific AI tasks—where quantum computers can perform tasks beyond the reach of classical computers— would mark a pivotal moment in this integration, potentially transforming industries by solving previously intractable problems.

Interdisciplinary collaboration is paramount in advancing the integration of AI with quantum computing. The convergence of computer science, physics, and domain-specific knowledge is critical for overcoming the technical and conceptual challenges that lie ahead.

In conclusion, the integration of AI with quantum computing stands at the cusp of a new era in technological innovation, promising to address some of the most complex and pressing challenges facing society.

While obstacles remain, the path forward is illuminated by the potential for breakthroughs that harness the strengths of both AI and quantum computing. As researchers, policymakers, and technologists navigate this uncharted territory, their efforts will not only define the future of computing but also shape the future of humanity.

71

CONVERGENCE AI AND QUANTUM COMPUTING

The potential applications of the convergence of quantum computing with artificial intelligence (AI) are enormous. Studies and observations indicate that this convergence may open up countless possibilities in various AI use cases, enabling quantum computing to find uses in a variety of fields. Among the potential uses are:

Enhanced Machine Learning: AI and quantum computing together have the potential to greatly improve machine learning algorithms, allowing for more effective pattern identification and data processing.

Optimised Data Analysis: The power of quantum computing can transform data analysis by processing enormous volumes of data at previously unheard-of rates, producing predictions and insights that are more precise.

Improved Optimisation Problems: Compared to traditional computers, quantum algorithms are more adept at handling complicated optimisation issues, providing answers for the finance, logistics, and other sectors.

Secure Communication: By creating unbreakable encryption techniques, quantum computing can improve cybersecurity and guarantee secure communication routes.

Drug development and Material Science: By precisely mimicking molecular interactions, the convergence of AI and quantum computing can expedite drug development procedures and promote material science research.

These uses only scratch the surface of the revolutionary possibilities that result from the combination of AI and quantum computing, offering groundbreaking developments in a wide range of industries.

By utilising its special properties, quantum computing can greatly improve the accuracy of AI models. Here's how AI models can get more accurate thanks to quantum computing:

Faster Processing: AI algorithms can optimise and make decisions more quickly thanks to quantum computers' capacity to process a large number of options at once.

72

Forecasts and insights may become more accurate as a result of this speed.

Difficult Optimisation:

AI models are able to identify optimal solutions with more accuracy due to quantum computing's ability to handle difficult optimisation issues more effectively than conventional computers. This feature is especially helpful in situations where conventional computing techniques are ineffective.

Enhanced Data Processing:

By utilising qubits, quantum computing can process data more quickly and efficiently, which improves data analysis and helps AI models recognise patterns in data.

73
The overall accuracy of AI systems is influenced by this improved capacity for data processing.

AI models can attain greater levels of accuracy, faster processing rates, and enhanced optimisation by utilising the potential of quantum computing, opening the door for more sophisticated applications across a variety of industries.

The following are some particular AI algorithms that can profit from quantum computing:

Planning and Scheduling Algorithms: By more quickly examining workable solutions to challenging issues, quantum computing might improve AI applications like planning and scheduling.

Optimisation Algorithms: AI models can have their solutions optimised thanks to quantum computing, which enhances decision-making and produces more accurate results.

Machine Learning Algorithms: By facilitating quicker optimisation, more precise forecasts, and enhanced data processing capabilities, quantum computing has the potential to improve machine learning algorithms.

The utilisation of quantum superposition and entanglement by quantum algorithms might facilitate the parallel computation or verification of mathematical problems, hence augmenting the precision and efficacy of artificial intelligence models.

The convergence of AI and Quantum Computing is not just a technological evolution; it's a revolution.

As these technologies integrate, they promise to reshape industries, redefine problem-solving paradigms, and offer unprecedented business opportunities.

74
………………….

CONCLUSIVE REALITY CHECK

Quantum computing promises a dramatic leap beyond classical computing, solving complicated problems that lie beyond the capacity of the most powerful supercomputers available today.

Because of its potential to revolutionise sectors through advances in machine learning acceleration, logistics optimisation, and financial modelling, it is highly anticipated. The industry for quantum computing has even expressed optimism that these revolutionary impacts may occur in a matter of years.

Skepticism and Difficulties:

The full potential of quantum computing may not materialise as quickly or as fully as once thought, according to a wave of skepticism spearheaded by prominent scientists in the area. Opponents draw attention to the substantial technological obstacles that presently stand in the way of quantum computing's widespread use.

Qubits' intrinsic instability, high error rates, and the difficult problem of quantum error correction are a few examples. Obstacles like these highlight the gap between theoretical promise and real-world viability.

Cautionary and elucidating voices

Notable sceptics have voiced worries about the industry's propensity to exaggerate the technology's near-term potential, including Yann LeCun of Meta and Oskar Painter of Amazon Web Services.

Their cautious approach is reinforced by the current status of "noisy intermediate-scale quantum" (NISQ) processors, which, despite optimistic predictions, are not expected to be practically usable unless there are substantial improvements made to error correction methodologies.

Matthias Troyer of Microsoft agrees, stressing the magnitude of the limitations of quantum computing's advantage over traditional systems, with special attention to the consequences for data-intensive applications.

Quantum error correction technical obstacles:

Because qubits are highly susceptible to errors, quantum error correction systems are complex and have not yet been achieved at a practical scale; they represent a fundamental obstacle for quantum computing. According to estimates, it will take at least ten years to reach fault-tolerant quantum computing capable of performing complex operations across thousands of qubits.

75

Evaluation of Useful Applications of Quantum Computing

Notwithstanding these difficulties, the theoretical basis of quantum computing points to important benefits in particular fields, such as the simulation of quantum systems and cryptography. The reality check from industry experts (including Microsoft), however, indicates that there are fewer problems than previously thought where quantum computing could outperform classical systems. This has prompted a reassessment of the timeframe and applications of quantum developments.

The Effects on Sectors and Studies

As a result of these disclosures, research institutions and industries have started to modify their plans and expenditures. Several companies have scaled back or stopped their quantum computing ambitions, particularly in the finance sector, which has seen a noticeable shift. Still, there is cautious optimism regarding the technology's ability to transform domains like chemistry and materials research, where quantum computing may allow for new discoveries and advances in knowledge.

A Well-Rounded View of Quantum Computing in Conclusion

There is a conflict in the field of quantum computing between the revolutionary possibilities of this technology and the substantial technical obstacles that still need to be overcome.

Notwithstanding the fact that the road to fully functional quantum computers is more convoluted and takes longer than anticipated, the development of this revolutionary technology spurs creativity and teamwork.

The scientific community and business sector may concentrate on the most promising uses of quantum computing by keeping a realistic view of the technology's potential and availability.

Among tech executives, investors, and academics in the field of quantum computing, 72 percent anticipate the development of a fully fault-tolerant quantum computer by 2035. The other 28 percent expect this achievement to occur in 2040 or later.

This will guarantee that, when it does arrive, quantum computing will have a significant and long-lasting influence.

76

LAST MINUTE BREAKING NEWS

77

QUANTUM INTERNET IGNITED

An important step towards quantum networking has been reached: researchers have created, stored, and retrieved quantum information for the first time. This is essential for creating quantum networks for secure communication and distributed computation, which will help with problems such as financial risk optimisation, data decryption, molecular design, and material property research.

Researchers have developed a system that links two essential components and uses ordinary optical fibers to transport the quantum data to get around the problem of quantum information being lost when carried over long distances.

Large-scale information loss can occur in normal communications, but information on quantum networks cannot be tapped without erasing it. Researchers have developed a solution where both devices use the same wavelength to get around this issue.

Photons, which were not entangled, were generated by a quantum dot, and subsequently transferred to a quantum memory device. The photons were stored in a cloud of rubidium atoms. The memory was turned on and off by a laser, which made it possible to store and release photons as needed.

This is the first demonstration that such devices can communicate at the wavelengths used in telecommunications. The group will now focus on enhancing the system's performance, including ensuring that all photons are generated at the same wavelength, extending the system's capacity to store photons, and decreasing its overall size. As a proof of concept, it is a significant step towards upcoming quantum internet networks.

78

“Cerebras” - BREAKTHROUGH

The Wafer Scale Engine 3 (WSE-3), the largest computer chip in the world, created by Cerebras, will power an AI supercomputer.

• 4 trillion transistors

• 900,000 AI cores

• 125 petaflops of peak AI performance

• 44GB on-chip SRAM

• 5nm TSMC process

• External memory: 1.5TB, 12TB, or 1.2PB

• Trains AI models up to 24 trillion parameters

The third iteration of Cerebras' platform, the WSE-3, is intended to power AI systems like Anthropic's Claude 3 Opus and OpenAI's GPT-4.

With 900,000 AI cores, the chip delivers twice the performance of its predecessor while consuming the same amount of power.

The prior device had 2.6 trillion transistors and 850,000 AI cores; the jump to 4 trillion transistors roughly tracks Moore's Law, which holds that a computer chip's transistor count usually doubles every two years.

The WSE-3 chip will power the Dallas, Texas-based Condor Galaxy 3 supercomputer. When its 64 Cerebras CS-3 AI system "building blocks" are combined and activated, the WSE-3-powered system will deliver eight exaFLOPs of AI compute.
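
Those headline figures are internally consistent, as a quick Python check using the numbers from the spec list above shows.

petaflops_per_cs3 = 125        # peak AI performance quoted per WSE-3 system
systems = 64                   # CS-3 "building blocks" in Condor Galaxy 3
print(petaflops_per_cs3 * systems / 1000, "exaflops")   # -> 8.0 exaflops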

The network will reach a total of 16 exaFLOPs when paired with the Condor Galaxy 1 and Condor Galaxy 2 systems. Future artificial intelligence systems, potentially ten times larger than GPT-4, will be trained on the Condor Galaxy 3 supercomputer.

While the Condor Galaxy 3 could manage AI systems with about 24 trillion parameters, GPT-4 requires about 1.76 trillion.

79
80

BLACKWELL: NVIDIA'S LATEST AI REVOLUTION

The most awaited news at the NVIDIA conference (18 March) was the company's new GPU architecture, called Blackwell after the mathematician David Harold Blackwell, which is optimised for artificial intelligence. It replaces the Hopper architecture, which debuted in 2022.

The B200 was created by merging two GPUs. The GPUs to beat these days are the B100 and B200; forget the H100 (which several rivals are vying to unseat). Nvidia has concentrated on the B200, an assembly of two B100 GPU dies placed side by side in the same package (referred to by Nvidia as a "super AI chip"). The two dies are connected by a 10 TB/s NV-HBI link and work together as a single chip.

Rather than TSMC's N3 node, the B200 is fabricated with a TSMC 4NP manufacturing process tailored for Nvidia. It has 2.5 times more transistors overall than an H100 and is paired with 192 GB of HBM3e memory offering 8 TB/s of bandwidth. With 20 petaflops of processing power, it can be 2.5 times faster in FP8 precision and 5 times faster in FP4 (40 petaflops).

With this design, Nvidia claims that training and running inference on generative AI models with 1,000 billion parameters will be possible for 25 times less money and 25 times less power than with Hopper GPUs.

OpenAI, Amazon, Google, Meta, and Microsoft are among the first customers to be named; all five have reportedly already committed to reserving a significant portion of the 2024 B200 production.

81

SOURCES

Quantum Insider, Interesting Engineering, The Quantum Quarterly, Clubic.com, ScienceDaily.com, Wired.com, TechnologyReview.com, ScientificAmerican.com, Quantumzeitgeist.com, Quantamagazine.com, EETimes.com, SciTechDaily.com, SCMP.com, QZ.com, TaiwanNews.com, Futurism.com, AZoOptics.com, Sifted.eu, PhysicsWorld.com, AZoAI.com

82

PUBLISHING PROGRAM

June No 33 - CLIMATE TECH
October No 34 - HUMANOID 3.0
November No 35 - BIOHACKING
December No 36 - MOONBOUND

Next Edition- JUNE

83

Signature Statement

I appreciate your reading this month's issue of my independent futurology Chronicle. My mission is to provide you with a new, unbiased viewpoint on the most recent progress in science and technology, the advancement of space exploration, and the critical problems and solutions associated with climate change.

As a nonprofit publication, I work with total editorial autonomy and flexibility, ensuring that my views remain impartial and objective.

In the months to come, I want to provide you with more interesting and educational information, and I thank you for your friendly support. www.frank.blue

84
