Quantum Computing

What is Quantum Computing?

Quantum computers make use of quantum-mechanical phenomena such as superposition and entanglement. They consist of qubits, which, unlike classical binary bits, can exist in a superposition of the states 0 and 1. Most designs require cryogenic cooling to function properly and to control the qubits' energy states. This nascent technology promises enormous computing power and is widely expected to solve valuable problems that today's most powerful classical supercomputers cannot solve in any practical amount of time. It can be applied in finance, cryptography, medicine and chemistry, weather services, and government.

"Quantum supremacy is the ‘hello world’ moment we’ve been waiting for — the ability to solve a problem that classical computers practically cannot."
— Sundar Pichai, CEO of Google
IBM Quantum Computing is a prominent player in the advancement of quantum computing technologies, employing the principles of quantum mechanics to perform complex calculations. IBM’s quantum computers utilize qubits, which unlike classical binary bits, can exist in superposition, allowing them to process vast amounts of information simultaneously. Key quantum algorithms used include Shor’s Algorithm, designed for factoring large numbers exponentially faster than classical computers, and Grover’s Algorithm, used for unsorted database searching with a quadratic speedup. IBM’s approach also emphasizes quantum error correction techniques and the Quantum Volume metric, a holistic measure of quantum computer performance that accounts for qubit quality, quantity, and interconnectedness. IBM is pioneering cloud-based quantum computing with the IBM Quantum Experience and the Qiskit open-source software development kit, democratizing access to quantum resources and research.

Quantum Computing is the first really new computational technique since about 1940, when a mathematician from Princeton called John von Neumann really invented what is today's classical computer. Quantum computers don't operate on these fixed zero-one bits. They really operate on quantum mechanical phenomena, which means they deal more in probability: they may exist in multiple states, and they can compute problems without deterministic accuracy. So they may not always give you the same result each time, but they more closely mimic what's happening with materials, risk, and optimization. So in some sense, you're harnessing quantum mechanical phenomena in order to get to an answer for problems that may not even be feasible otherwise. And here's a really simple example: if you drink a coffee or a Coke, you know that the molecule in there that causes the burst of energy is caffeine. Chemists know caffeine has 160 electrons. Trying to understand the shape of that molecule on a classical computer would take something massive. We're talking state- or country-sized classical computers. Nobody can build a machine that big. I believe in the next few years that a quantum computer, something which is the size of this stage, is going to be able to solve a caffeine molecule. — Arvind Krishna, CEO of IBM

- Quantum Networks
- Quantum Simulators
- Post-Quantum Cryptography
- Quantum Sensors, Particle Generators, Atomic Clocks
- Quantum Cloud Computing
- Quantum Memories, Quantum Repeaters, Quantum Chips
- Quantum Software
- Quantum Computing, Quantum Annealers
- Quantum Materials
- Quantum Key Distribution
"Quantum will impact everything."
— Jay Gambetta, IBM
Inside a quantum computer's dilution refrigerator:
1 Shell:
When the computer is operational, five casings (like the white one shown at the top of the image) envelop the machine. These cans nest inside each other and act as thermal shields, keeping everything super cold and vacuum-sealed inside.
2 Nerves:
These photon-carrying cables deliver signals to and from the chip to drive qubit operations and return the measured results.
3 Heart:
Beneath the heat exchangers sits the "mixing chamber." Inside, two isotopes of liquid helium, helium-3 and helium-4, separate and evaporate, diffusing the heat.
4 Skeleton:
These gold plates separate cooling zones. At the bottom, they plunge to one-hundredth of a Kelvin—hundreds of times as cold as outer space.
5 Brain:
The QPU (quantum processing unit) features a gold-plated copper disk with a silicon chip inside that contains the machine’s brain.

Types of Quantum Computing

1. Quantum Annealing: A technique for solving optimization and probabilistic-sampling problems, i.e., determining the optimal state. D-Wave Systems' quantum computers use this technique, but many experts have been skeptical about calling them true quantum computers because of their limited scope. D-Wave has nevertheless reported cases where these systems outperform classical supercomputers.
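To make the optimization framing concrete, here is a toy Ising problem of the kind annealers target. The couplings and fields below are made-up illustrative values, and a classical brute-force search stands in for the physical anneal, which is only feasible because the problem is tiny:

```python
from itertools import product

# Toy Ising optimization problem: find spin values s_i in {-1, +1}
# minimising E = -sum J_ij * s_i * s_j - sum h_i * s_i.
J = {(0, 1): 1.0, (1, 2): -0.5}   # pairwise couplings (made-up values)
h = {0: 0.2, 1: 0.0, 2: -0.1}     # local fields (made-up values)

def energy(spins):
    """Ising energy of a spin configuration."""
    e = -sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    e -= sum(hi * spins[i] for i, hi in h.items())
    return e

# Exhaustive search over all 2^3 configurations; an annealer instead
# lets the physical system relax towards this ground state.
best = min(product([-1, 1], repeat=3), key=energy)
print(best, energy(best))   # ground state (1, 1, -1) with energy -1.8
```

Real annealing problems have thousands of spins, where exhaustive search is hopeless; that exponential blow-up is exactly the gap annealers aim to close.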

2. Quantum Simulation: Simulators are classical systems that execute the quantum operations given to them. IBM's Qiskit and other frameworks provide such simulators. These simulations are considered to run on 'logical' qubits, i.e., idealized qubits that no environmental change can affect.
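As a minimal illustration of what such a simulator does under the hood, here is a pure-Python single-qubit statevector sketch (a toy, not the Qiskit API): a state is a pair of complex amplitudes, and a gate is a linear map on them.

```python
import math

# A single-qubit state is a list [a, b] meaning a|0> + b|1>.

def hadamard(state):
    """Apply the Hadamard gate H = 1/sqrt(2) * [[1, 1], [1, -1]]."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

def probabilities(state):
    """Born rule: measurement probabilities are squared amplitude magnitudes."""
    return [abs(amp) ** 2 for amp in state]

zero = [1.0, 0.0]            # the |0> state
plus = hadamard(zero)        # H|0> = (|0> + |1>)/sqrt(2): a superposition
print(probabilities(plus))   # both outcomes roughly equally likely (~0.5 each)
```

Applying `hadamard` twice returns the qubit to |0>, which is the kind of interference effect classical probability alone cannot reproduce.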

3. Universal Quantum Computing: These are the most powerful machines and run on real qubits, which can be affected by outside environmental changes, a phenomenon called quantum decoherence. Building quantum computers that truly exhibit quantum behavior is hard, but several companies have made it possible, and research is still ongoing.


Niels Bohr

Niels Bohr was a pioneer of the Copenhagen interpretation of quantum mechanics, which states that physical reality is not determined until it is observed. He argued that quantum phenomena cannot be described by classical physics, but only by probabilistic laws and wave functions. He also introduced the concept of complementarity, which means that some properties of a system can only be measured at the expense of others.
📷 Keystone/Getty Images



The history of Quantum Physics ✍️

1900: Max Planck introduces the quantum hypothesis to explain black body radiation.
1905: Albert Einstein proposes the light quantum hypothesis to explain the photoelectric effect.
1924: Louis de Broglie suggests particles can exhibit wave-like behavior.
1925: Werner Heisenberg formulates matrix mechanics, the first version of quantum mechanics.
1926: Erwin Schrödinger develops wave mechanics and the Schrödinger equation.
1927: Heisenberg states the uncertainty principle, which shows precision limits for measuring pairs of physical properties.
1928: Paul Dirac formulates the Dirac equation, laying the groundwork for quantum field theory and predicting the existence of antimatter.
1935: Einstein, Podolsky, and Rosen publish the EPR paradox challenging quantum mechanics' completeness.
1947: Richard Feynman, Julian Schwinger, and Sin-Itiro Tomonaga independently develop the foundations of Quantum Electrodynamics (QED), which describes how light and matter interact.
1954: Chen Ning Yang and Robert Mills develop non-abelian gauge theories, paving the way for the development of QCD.
1961: Sheldon Glashow introduces a unification scheme for electromagnetic and weak interactions, an early step towards the Standard Model.
1964: John Bell proposes Bell's theorem, showing quantum entanglement cannot be explained by local hidden variables.
1964: Murray Gell-Mann and George Zweig, independently, propose the quark model for hadrons, leading to the concept of Quantum Chromodynamics (QCD) as the theory of strong interactions.
1973: David Gross, Frank Wilczek, and Hugh David Politzer discover asymptotic freedom in QCD, explaining how quarks behave inside nucleons, and strengthening QCD as the theory of the strong force.
1979: The Nobel Prize in Physics is awarded to Sheldon Glashow, Abdus Salam, and Steven Weinberg for their contributions to the unification of the weak force and electromagnetic interaction between elementary particles, using the framework of Quantum Field Theory.
1982: Alain Aspect's experiment confirms quantum entanglement, supporting Bell's theorem.

John S. Bell
I'm quite convinced of that: quantum theory is only a temporary expedient.
— John S. Bell (1928–1990)


Quantum Programming Language

  • Open Quantum Assembly Language (OpenQASM) is rather like classical assembly language, in that instructions can be pieced together to perform operations on registers and, in the quantum world, qubits. In this language you manipulate qubits and classical bits, perform measurements, and collect those measurements.
  • Q#: Microsoft's language is named after its "#" (sharp) family of languages such as C# and F#; Q# is its quantum language and is growing in popularity.
  • Qiskit: Perhaps the "Python" of the quantum world.
  • Cirq: One of the tech giants (Google) is behind this language and framework and supports it.
  • Strawberry Fields and PennyLane: These come from Xanadu, the photonic quantum computing company working on building a light-based quantum computer.
  • Silq: Covered here a short time ago, it aims to be a much higher-level quantum computing language. There are other languages, such as PyQuil (from Rigetti) and Quipper, but we wanted to cover some of the more popular in-use languages.

Applications of Quantum algorithms

Finance: Financial institutions rely on fast, secure transactions, which is why they are eager to learn what quantum computing can offer. Adoption will take time, since reliability is paramount in the finance sector and the field is in its early stages, but quantum computing has great potential to maximize profits and reduce risk.

Chemistry: Simulating molecular bonds and the structure of various molecules is a computationally expensive task. By leveraging the power of quantum computation, these bonds may be simulated to gain more knowledge and insight about them. This could also speed up the development of vaccines for various viruses.

Cryptography: The study of protecting information during communication by encrypting it with appropriate methods. Many fear that quantum computers would break existing encryption schemes, but the threat has also opened new possibilities, such as quantum-resistant cryptographic algorithms. There are already quantum key distribution protocols, such as BB84.
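Here is a classical toy sketch of BB84's sifting step. It models only the random basis choices and the public basis comparison; real BB84 transmits polarized photons and adds error checks to detect eavesdropping, neither of which is modeled here:

```python
import random

def bb84_key(n_bits, seed=0):
    """Toy BB84 sifting: keep only bits where sender and receiver
    happened to pick the same basis ('+' rectilinear, 'x' diagonal)."""
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("+x") for _ in range(n_bits)]
    bob_bases = [rng.choice("+x") for _ in range(n_bits)]
    # Sifting: bases are compared publicly; matching positions form the key.
    # (Mismatched bases would give Bob a random, useless measurement.)
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
            if a == b]

print(bb84_key(16))
```

On average half the bases match, so roughly half of the transmitted bits survive into the shared secret key.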

Machine Learning: As research in machine learning and deep learning grows, models are becoming more diverse and their applications are increasing. Model training and optimization take a lot of computational power, so building complex models takes time. Much research is under way in this area, named quantum machine learning.

Industrial Automation: Quantum computing would remove the limitations of classical computing and enable industrial automation to reach new levels of size and complexity. It would be applicable to industry in many, almost infinite ways. The quantum computer could take in inputs from sensors at every stage of the manufacturing process and monitor quality and precision. It could also manage the factory's output according to demand and logistics considerations, greatly reducing wastage.

Searching Big Data: A machine that can search the ever-growing amount of data being created, and locate connections within it, could have tremendous impact across many industries. Quantum computing offers the possibility of doing this significantly faster than classical computers. Further research will lead to the realization of this capability.

Examples of quantum secure algorithms:

- Lattice-based cryptography: Based on abstract mathematical structures; it currently looks like the most promising method.
- Code-based cryptography: Uses error-correcting codes, which allow stored or transmitted data to be checked for errors and corrected in real time.
- Multivariate cryptography: Based on solving systems of multivariate equations, which are hard to solve by brute force.


What are Quantum algorithms?

A quantum algorithm is an algorithm which runs on a realistic model of quantum computation. Although all classical algorithms can also be performed on a quantum computer, the term quantum algorithm is usually used for those algorithms which seem inherently quantum, or use some essential feature of quantum computation such as quantum superposition or quantum entanglement.
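As a toy illustration of such an algorithm, here is Grover's search over four items, simulated classically as a list of real amplitudes (a pedagogical sketch; a real run would use quantum hardware or a circuit simulator). For N = 4, a single Grover iteration concentrates essentially all probability on the marked item:

```python
import math

def grover(marked, n_items=4):
    """One Grover iteration over n_items, returning outcome probabilities."""
    amps = [1 / math.sqrt(n_items)] * n_items   # uniform superposition
    amps[marked] = -amps[marked]                # oracle: flip sign of the target
    mean = sum(amps) / n_items                  # diffusion: invert about the mean
    amps = [2 * mean - a for a in amps]
    return [a * a for a in amps]                # Born-rule probabilities

print(grover(marked=2))   # probability ~1 at index 2
```

A classical unstructured search needs on the order of N lookups, while Grover needs about sqrt(N) iterations, which is the quadratic speedup mentioned earlier.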


Quantum platforms

- IBM Q-Experience (bell states)
- Google Q-CTRL
- D-Wave Leap
- Quantum Programming Studio
- Rigetti
- Azure Quantum, Q# & QDK
- QC-Ware Backends
- StrangeWorks
- IonQ


Linear Algebra for Quantum Computing

Linear algebra can be called the universal language of quantum computing. Quantum computation inherits its linear-algebraic structure from quantum mechanics, which makes a solid grasp of the basic linear algebra operations essential for exploring this vast field. It also helps one understand how quantum algorithms really work.

Basic Linear Algebra Concepts:
1.) Vectors and Vector Spaces
2.) Matrices and Matrix Operations
3.) Spanning Sets, Linear Dependence, and Bases
4.) Hilbert Spaces, Orthonormality, and the Inner Product
5.) Outer Products and Tensor Products
6.) Eigenvectors and Eigenvalues
7.) Matrix Exponentials
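Two of the operations listed above, the inner product and the tensor product, can be sketched in a few lines of pure Python for vectors represented as lists of complex numbers:

```python
def inner(u, v):
    """Inner product <u|v>: conjugate the bra entries, multiply, sum."""
    return sum(a.conjugate() * b for a, b in zip(u, v))

def tensor(u, v):
    """Tensor (Kronecker) product: how multi-qubit states are built
    from single-qubit states."""
    return [a * b for a in u for b in v]

ket0 = [1 + 0j, 0j]          # |0>
ket1 = [0j, 1 + 0j]          # |1>
print(inner(ket0, ket1))     # orthogonal basis states: 0j
print(tensor(ket0, ket1))    # the two-qubit basis state |01>
```

The tensor product is why state space grows exponentially: n qubits need 2^n amplitudes, which is what makes classical simulation of large quantum systems so expensive.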


Qubits

Quantum computers leverage quantum mechanical phenomena to manipulate information. To do this, they rely on quantum bits, or qubits. Qubits are manipulated using the phenomena called superposition and entanglement.

Superposition refers to the quantum phenomenon where a quantum system can exist in multiple states, or multiple places, at the exact same time. In other words, a quantum state assigns to each possible outcome a probability of being found in that particular state.

Entanglement (sometimes called "spooky action at a distance") is a correlation between two or more quantum particles; in the context of quantum computing, these particles are qubits. Consider two qubits in the combined state |00⟩: by applying a Hadamard gate to one qubit and then a CNOT gate across both, we obtain an entangled state.
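A minimal pure-Python sketch of that Hadamard-then-CNOT construction: the two-qubit state is a list of four amplitudes ordered |00⟩, |01⟩, |10⟩, |11⟩, and the result is the Bell state with all probability split between |00⟩ and |11⟩.

```python
import math

def h_on_first(state):
    """Apply Hadamard to the first qubit of a two-qubit state."""
    s = 1 / math.sqrt(2)
    a00, a01, a10, a11 = state
    return [s * (a00 + a10), s * (a01 + a11),
            s * (a00 - a10), s * (a01 - a11)]

def cnot(state):
    """CNOT with the first qubit as control: flip the second qubit
    on the components where the control is 1 (swap |10> and |11>)."""
    a00, a01, a10, a11 = state
    return [a00, a01, a11, a10]

bell = cnot(h_on_first([1.0, 0.0, 0.0, 0.0]))   # start in |00>
print(bell)   # amplitude ~0.707 on |00> and |11>, zero elsewhere
```

Measuring either qubit now instantly fixes the other's outcome: you only ever see 00 or 11, never 01 or 10, which is the correlation that defines entanglement.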

Electron Spin Qubits: Electrons are considered natural qubits. To use them as qubits, there are attempts to embed them in solid crystals such as silicon, but controlling individual electrons is difficult: when electrons sit close together they interact, which results in decoherence.

Superconducting Qubits: These qubits use an electrical LC resonator built from a superconducting metal. Operated at very low temperatures, the resonator behaves as a quantum system. There are different flavours of superconducting qubits: transmon, flux, and fluxonium.

Ion-Trap Qubits: Electromagnetic fields from an optical laser trap ions (charged atoms or molecules) in space. The trap shields the ions from interaction with the environment. A qubit, i.e., a two-level quantum system, is then made from a pair of the ion's internal energy levels.


Quantum Computing vs Classical Computing

| Quantum Computing | Classical Computing |
| --- | --- |
| Information is stored in quantum bits (qubits), e.g., in the direction of an electron's spin | Information is stored in binary digits (bits), based on voltage/charge |
| Information can be 0, 1, or a superposition of 0 and 1 (both represented at the same time) | Information is 0 or 1 only |
| Can manipulate many pieces of data at a time | Manipulates one piece of data at a time |
| Power grows exponentially with the number of qubits | Power grows roughly 1:1 with the number of transistors |
| High error rates; must be kept ultracold | Low error rates; operates at room temperature |
| Well suited to optimization problems, data analysis, and simulation | Best suited to most everyday processing tasks |
| Circuit behavior is governed by quantum physics | Circuit behavior is governed by classical physics |
| Operations are defined by linear algebra over Hilbert space | Operations are defined by Boolean algebra |

Types of Cryptography

Quantum-breakable:
- RSA encryption: A message is encrypted using the intended recipient's public key, which the recipient then decrypts with a private key. The difficulty of computing the private key from the public key rests on the hardness of prime factorization.
- Diffie-Hellman key exchange: Two parties jointly establish a shared secret key over an insecure channel, which they can then use for encrypted communication. The security of the secret key relies on the hardness of the discrete logarithm problem.
- Elliptic curve cryptography: Mathematical properties of elliptic curves are used to generate public and private keys. The difficulty of recovering the private key from the public key rests on the hardness of the elliptic-curve discrete logarithm problem.

Quantum-secure:
- Lattice-based cryptography: Security is related to the difficulty of finding the nearest point in a lattice with hundreds of spatial dimensions, given an arbitrary location in space (associated with the public key); the nearest lattice point is associated with the private key.
- Code-based cryptography: The private key is associated with an error-correcting code and the public key with a scrambled and erroneous version of it. Security is based on the hardness of decoding a general linear code.
- Multivariate cryptography: These schemes rely on the hardness of solving systems of multivariate polynomial equations.
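To see concretely why factoring breaks RSA, here is a toy example with deliberately tiny primes. Real keys use primes hundreds of digits long, and Shor's algorithm would replace the trial-division step below with an exponentially faster quantum routine:

```python
# Toy RSA with tiny primes, purely to illustrate the dependence on factoring.
p, q = 61, 53                      # secret primes (toy-sized)
n = p * q                          # public modulus: 3233
phi = (p - 1) * (q - 1)            # 3120
e = 17                             # public exponent
d = pow(e, -1, phi)                # private exponent: modular inverse of e

msg = 65
cipher = pow(msg, e, n)            # encrypt with the public key (e, n)
plain = pow(cipher, d, n)          # decrypt with the private key d
print(plain == msg)                # round trip succeeds

# The "attack": factor n (trivial at this size), then rebuild d.
factor = next(k for k in range(2, n) if n % k == 0)
d_recovered = pow(e, -1, (factor - 1) * (n // factor - 1))
print(d_recovered == d)            # private key fully recovered
```

Everything above the attack is public knowledge except p, q, and d; once n is factored, the secrecy evaporates, which is exactly the Q-Day scenario discussed later.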

Timeline

A Brief Look at Some of the Crucial Events to Quantum Computing.

1838: Michael Faraday, an English physicist, experimentally shows the existence of negatively charged subatomic particles using cathode rays.
1887: Heinrich Hertz, a German physicist, discovers the photoelectric effect.
1900: Max Planck publishes his study on blackbody radiation, suggesting energy is quantized and paving the way for future physicists, such as Albert Einstein, to pursue quantum physics.
1918: Max Planck receives a Nobel Prize for his work on blackbody radiation. Quantum physicist Richard Feynman is born.
1927: The Solvay Conference in Brussels occurs, with 29 scientists in attendance. Here, the famous Bohr versus Einstein disagreement comes to light.
1980: Feynman and others begin considering the use of quantum systems to model other quantum systems, going beyond the mere two-state limitations imposed by classical computers.
1985: David Deutsch publishes his paper describing the quantum Turing machine.
1993: Ethan Bernstein and Umesh Vazirani show quantum computers have the potential to be far faster and more efficient than classical computers.
1994: Peter Shor develops Shor's algorithm to calculate the prime factors of an integer using a quantum computer. In the same year, Dan Simon shows quantum computers are capable of being exponentially faster than their classical counterparts.
1995: NIST and Caltech develop a method of using magnetic fields to shield particles from environmental influences that could lead to decoherence, although only a few bits can be created for a short period of time.
1996: Lov Grover develops Grover's algorithm, a quantum database search algorithm. Peter Shor and Andrew Steane also develop error-correcting codes aimed at decreasing decoherence.
1998: Neil Gershenfeld, Isaac Chuang, and Mark Kubinec create a 2-qubit quantum computer that uses nuclear magnetic resonance.
2000: NIST announces the possibility of creating a 4-qubit quantum computer using entangled beryllium atoms. A week after the announcement, researchers separately declare having developed a 7-qubit quantum computer using nuclear magnetic resonance.

Why study Quantum Mechanics and Quantum Computing?

Two main reasons: fundamental and technological.

Nature is fundamentally quantum.
To harness the secrets of the laws governing elementary particles (anomalous magnetic moment of muon, asymmetry problem), quantum spacetime (emergence of it, quantum information entropy and gauge-gravity duality), the problem of vacuum, etc…
Even the paper on quantum gravity I posted recently is written in the quantum-mechanical language of the density matrix formalism. So to be aware of state-of-the-art knowledge, you have to know quantum mechanics.
Power plants, smartphones you are reading this text from, TV, laptops - all due to developments of quantum mechanics.
Quantum computing. This one has relation to both fundamental science and technology.

Richard Feynman observed that since nature is fundamentally quantum, we can build up nature from scratch using quantum blocks called "qubits", which are state vectors in a two-dimensional complex Hilbert space. Such a device is called a "quantum computer". Qubits are quantum bits; gates are unitary matrices (operators); acting together they create quantum entanglement and transform the initial state.

How can it be used? Already today:

  • logistics optimization (Prof. Celso Jorge Villas-Bôas's group at UFSCar in São Carlos used a 5,000-qubit D-Wave quantum computer to speed up logistics tasks)
  • quantum finance
  • investigations on quantum neural networks speed up

Quantum computing does not yet have the biggest market size; wireless communication technologies and the petroleum industry do. But quantum computing, as part of AI programs, is under active investigation for application there.

  • CBPF had a quantum computing project sponsored by petroleum companies
  • there are tasks to be potentially solved faster via quantum computing algorithms

Chemistry/Pharmacy/Ecology:
- new materials development via quantum simulations

… to be explored. In such a growing, emerging field, where demand exceeds the supply of researchers and students, it is smart to get into quantum. Happy to be part of this mission and to start my new Quantum Class today. 😌📘📚


Roadmap

Quantum Roadmap


Q-Day

They call it Q-Day: the day a robust quantum computer will be able to crack the most common encryption methods used to secure our digital data.
This will have massive implications for all internet companies, banks and governments - as well as our own personal privacy.

We don't know when.
We don't know where.
But we know Q-Day will happen.

Quantum computing makes use of the laws of quantum mechanics which allows it to perform certain tasks more efficiently, such as solving a maze.



Quantum processor specs and qubit roadmap

| Feature | Google Quantum Willow (2024) | IBM Quantum Condor (2023) | IBM Quantum Starling (planned 2029) |
| --- | --- | --- | --- |
| Qubits (physical) | 105 superconducting transmon qubits | 1,121 superconducting qubits | Not specified; modular system with enhanced connectivity (e.g., C-couplers for 3D qubit links) |
| Logical qubits | Up to 49 data qubits in a distance-7 surface code (demonstrated scalable logical qubits) | N/A (focus on scale, not yet error-corrected at large scale) | 200 logical qubits (using qLDPC error-correction codes) |
| Key innovation | First below-threshold quantum error correction: errors drop exponentially with scale (e.g., roughly halved from 3x3 to 7x7 grids); logical error rate ~0.143% per cycle | Record-breaking qubit count for a general-purpose processor; 50% increase in qubit density over Osprey | Large-scale fault-tolerant system: 100 million quantum gates; ~20,000x more operations than current systems; ~90% lower physical-qubit overhead via qLDPC codes |
| Coherence time (T1) | Up to 100 µs (median ~85 µs; logical >291 µs) | Comparable to Osprey (~100-200 µs, not detailed) | Not specified; focus on modular, scalable fault tolerance |
| Performance benchmark | RCS task in <5 minutes (vs. an estimated 10^25 years on a supercomputer); verifiable quantum advantage | Similar performance to the 433-qubit Osprey; enables larger circuits for chemistry/optimization | 100 million gates on 200 logical qubits; foundational for drug discovery and materials science |
| Architecture | Square lattice with average connectivity 3.47 | Heavy-hexagonal lattice with cross-resonance gates | Modular processors (e.g., Loon 2025 for testing, Kookaburra 2026 for memory-logic integration); built in the Poughkeepsie data center |
| Status | Released; demonstrates a path to scalable fault tolerance | Released; milestone for hardware scale | Planned; roadmap includes Nighthawk (2025: up to 1,080 qubits) and Loon (2025) |

Quotes

52.9% Quantum computing market annual growth rate, 2022-2027
It seems clear that the present quantum mechanics is not in its final form. — Paul Dirac
If your bank tells you that your bank balance is on a quantum computer, choose a different bank. Because you don't want the answer to be a dollar today and a million dollars the next day. — Arvind Krishna, CEO of IBM
You can actually be at risk from a quantum computer, even though a [high-performance] quantum computer does not yet exist. This is often called 'harvest now, decrypt later'. — Dustin Moody, NIST
We're going to standardize a number of things so that we have a diversity of different mathematical problems to base our security on. — Dustin Moody, NIST
Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical.
— Richard Feynman, physicist
I think I can safely say that nobody understands quantum mechanics.
— Richard Feynman, physicist
Anyone who claims to understand quantum theory is either lying or crazy.
— Richard Feynman, physicist
I would rather have questions that can't be answered than answers that can't be questioned.
— Richard Feynman
Quantum computation… will be the first technology that allows useful tasks to be performed in collaboration between parallel universes.
— Prof. David Deutsch - The Fabric of Reality
Each qubit emerging in Hawking radiation becomes entangled with previously emitted qubits, so at the end of the evaporation process the state is again a pure state and the information has been revealed! But for a solar-mass black hole this will take 10^64 years.
The history of the universe is, in effect, a huge and ongoing quantum computation. The universe is a quantum computer.
— Seth Lloyd
Life is and will ever remain an equation incapable of solution, but it contains certain known factors.
— Nikola Tesla
Their pioneering experiments on entanglement have opened the door to new technology based on quantum information.
— Ben Skuse
It was in 1981 that Feynman wrote an extremely influential paper on the relationship between quantum theory and computation. One of the questions he asked was, “Can quantum systems be simulated probabilistically by a classical computer with local connections?” Feynman’s answer was in the negative, and he explained this answer by analyzing what he called the two-photon correlation experiment, which was designed to check the possibility of using hidden variables to produce the quantum mechanical results for an entangled system. Feynman explains the crucial element of his analysis that causes it to fail: it requires some probabilities to be negative. He emphasizes that the experiment has been performed and that the results agree with the quantum predictions. What Feynman constructs is undoubtedly just Bell’s Theorem. In the collection that he has edited on Feynman’s work on computation, which includes Feynman’s influential paper, Tony Hey comments that “Only Feynman could discuss ‘hidden variables,’ the Einstein-Podolsky-Rosen paradox and produce a proof of Bell’s Theorem, without mentioning John Bell.” Hey assumes that Feynman had read or heard of Bell’s work, almost certainly just the barest bones of it, but had not picked up or remembered his name. Feynman must also, of course, have known that the experiments had been performed. Hey remarks that Feynman “had no problem about the fact that he was sometimes recreating things that other people already knew—in fact I don’t think he could learn a subject any other way than by finding out for himself.”
Quantum technologies are difficult to understand, but that will not stop the disruption this set of emerging technologies will bring in the next few years! — Kevin Coleman
A quantum computer is going to be able to better simulate the quantum world, so simulation of atoms and molecules (this will allow quantum computers to aid in the design and discovery of new materials with tailored properties). If I am able to design a better material for energy storage, I can solve the problem of mobility. If I am able to design a better material as a fertiliser, I am able to solve the problem of hunger and food production. If I am able to design a new material that allows [us] to do CO2 capture, I am able to solve the problem of climate change.
— Alessandro Curioni, the director of the IBM Research Lab in Zurich
When people communicate over the Internet, anyone can listen to the conversation. So they have to first be encrypted. And the way encryption works between two people who haven't met is they have to rely on some algorithms known as RSA or Elliptic Curve, Diffie–Hellman, to exchange a secret key.
— Vadim Lyubashevsky, cryptographer at the IBM Research Lab in Zurich
Quantum computing is actually very different from our regular computing. It's not just that this is a more powerful version of what we have today. It's actually an entirely different framework for computing itself. It's not the case that a quantum computer is better at every task and will somehow speed up everything we do. There are very specific tasks that a quantum computer can actually do in ways that are better.
— Shohini Ghose, Quantum physicist at Wilfrid Laurier University in Canada
You don't have to be a physicist to be part of this new quantum computing revolution.
— Shohini Ghose, Quantum physicist at Wilfrid Laurier University in Canada
Even if the first quantum computer does not come for 20 years we are, in a sense, already late
— Christophe Petit, University of Birmingham
The day a big quantum computer is built, all the cryptography we are using today is dead
— Christophe Petit, University of Birmingham
To crack encryption, all you need is one working quantum computer under laboratory conditions
— Andersen Cheng, chief executive of Post-Quantum
A lot of nation states are building quantum computers and they just need a working engine to start cracking encryption
— Andersen Cheng, chief executive of Post-Quantum
In the public discourse, people are saying it will be 10 to 20 years until we have the first full commercially available quantum computer. In the cyber security domain, they say it will be more like five to 10 years, but the intelligence community [has] become worried… over the past two years. They believe a working quantum computer will arrive much earlier than we think.
— Andersen Cheng, chief executive of Post-Quantum
Entanglement: a somewhat mysterious feature of quantum mechanics that baffled even Einstein, who in his time declared it "spooky action at a distance".
Quantum Mechanics introduced us to a world where the very act of observing alters the reality we observe. The uncertainty principle emerges as a consequence of this strange and counterintuitive behavior, telling us that there are fundamental limits to the precision with which we can measure certain pairs of physical quantities in the quantum realm.
Our post-quantum cryptography program has leveraged the top minds in cryptography — worldwide — to produce this first group of quantum-resistant algorithms that will lead to a standard and significantly increase the security of our digital information.
— NIST Director Laurie E. Locascio
It's really about ultimately having a parallel internet. — Jack Hidary, CEO of quantum technology start-up SandboxAQ
The beauty of mathematics only shows itself to more patient followers — Maryam Mirzakhani

Codepen

See the Pen Quantum Circuit Inspector by Fabien Laurent Patrice Egot (@equant_org) on CodePen.


See the Pen Regenerate Quantum State by Fabien Laurent Patrice Egot (@equant_org) on CodePen.


See the Pen BraKetVue / ⟨𝜑|𝜓⟩ by Fabien Laurent Patrice Egot (@equant_org) on CodePen.


See the Pen Quantum Harmonic Oscillator by Fabien Laurent Patrice Egot (@equant_org) on CodePen.