happy equinox, anon
as i chill under the sun and think about the adventures for spring, i want to talk about some science/engineering i am excited about:
quantum computing might be undergoing a groundbreaking shift with microsoft's recent announcement of a
chip
hosting eight majorana-based topological qubits.
although i won't weigh in on the debate over whether
these results are overhyped,
i thought i could talk a little bit about the science behind topological quantum computers.
in 1936, alan turing introduced the universal turing machine,
a theoretical model of computation that could simulate any other turing machine.
in 1985, david deutsch extended this concept to quantum mechanics by proposing the
quantum turing machine,
a theoretical model capable of efficiently simulating quantum systems and laying the foundation for quantum computation.
remarkably, in 1994, peter shor demonstrated that
quantum computers could factor large numbers efficiently,
an exponential speedup over the best known classical algorithms, with profound implications for cryptography.
a lot of progress has happened in the field in the last decades,
however, one of the greatest challenges in quantum computing continues to be decoherence,
(i.e., when quantum information is lost due to interactions with the environment).
unlike classical systems, where errors can often be mitigated through cooling,
quantum systems require error correction mechanisms due to their continuous nature.
i explored various aspects of
quantum computing back during my phd,
but as a string theorist,
i was particularly drawn to the theoretical yet elegant approach of topological quantum
computing,
which is inherently fault-tolerant (i.e., resistant to decoherence). let's talk about it.
starting from the basics, a quantum computer is a computational device that harnesses the principles of quantum mechanics
(such as superposition,
entanglement, and
quantum interference) to process information in ways that classical computers cannot.
it does this by leveraging a primitive called the qubit, a normalized linear superposition of the
orthonormal states |0> and |1>,
which span the 2D complex vector space called a hilbert space:
|ψ> = α|0> + β|1>, with |α|^2 + |β|^2 = 1
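in code, a single qubit is just a normalized 2-component complex vector (a minimal numpy sketch, with an arbitrarily chosen equal superposition):

```python
import numpy as np

# |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)  # hypothetical choice of amplitudes
psi = np.array([alpha, beta])

# normalization check: total probability must be 1
assert np.isclose(np.vdot(psi, psi).real, 1.0)

# born rule: probabilities of measuring |0> or |1>
p0, p1 = np.abs(psi) ** 2
print(p0, p1)  # both ~0.5 up to floating point
```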
in a 3D space, particles can be classified as bosons (with integer spin quantum numbers) or
fermions (with half-integer spin quantum numbers).
when one particle in a 3D space is moved around another and returned to its original position,
this path is topologically equivalent to not moving the particle at all (because the path can be deformed into an arbitrarily small loop).
this constraint makes the statistics involved in exchanging fermions and bosons very simple, producing a phase change of π or 2π only.
frank wilczek introduced the term anyon in 1982, describing them as
composites of charged particles in (2+1)-dimensional models.
the statistics of these quasiparticles interpolate continuously between bosons and fermions,
i.e., they acquire arbitrary phase factors when exchanged (a phenomenon that has been observed in the fractional quantum hall effect).
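as a toy illustration of the phase bookkeeping (not a physical simulation): exchanging two identical particles multiplies the wavefunction by e^{iθ}, and while 3D only allows θ = 2π (bosons) or θ = π (fermions), anyons in 2D can take any θ in between:

```python
import numpy as np

# exchanging two identical particles multiplies the wavefunction by e^{i*theta}
def exchange_phase(theta):
    return np.exp(1j * theta)

print(exchange_phase(2 * np.pi))  # bosons: phase +1
print(exchange_phase(np.pi))      # fermions: phase -1
print(exchange_phase(np.pi / 3))  # an anyon: an arbitrary phase, allowed only in 2D
```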
in 2003, alexei kitaev published a groundbreaking idea in his paper "fault-tolerant quantum computation by anyons",
proposing that a 2D quantum system hosting non-abelian (i.e., non-commutative) anyonic excitations could function as a quantum computer.
in this model, computations are performed by braiding these anyons, inducing unitary transformations
on the system's quantum state.
one of the simplest models of non-abelian anyon is the fibonacci anyon,
which appears in the SU(2) witten–chern–simons topological quantum field theory
(yep, that simons).
the fibonacci model contains two particle types:
i) the vacuum (with 'charge' 0), denoted by 1, and
ii) the non-trivial anyon (with 'charge' 1), here denoted by τ.
their fusion rules can be written as:
1 ⊗ τ = τ (fusion with vacuum does nothing)
τ ⊗ 1 = τ (fusion with vacuum does nothing)
τ ⊗ τ = 1 ⊕ τ (annihilation or creation of another anyon)
(⊗ denotes the fusion of two particles and ⊕ denotes multiple possible outcomes)
this means that when two fibonacci anyons interact, they can either fuse into the vacuum (annihilating each other) or into another fibonacci anyon.
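the fusion rules above can be encoded as a small lookup table (a sketch, with the string 'tau' standing in for τ):

```python
# fibonacci fusion rules: fusing with the vacuum '1' does nothing,
# while tau x tau can give either the vacuum or another tau
FUSION = {
    ("1", "1"): ["1"],
    ("1", "tau"): ["tau"],
    ("tau", "1"): ["tau"],
    ("tau", "tau"): ["1", "tau"],
}

def fuse(a, b):
    """return the list of possible fusion outcomes of anyons a and b."""
    return FUSION[(a, b)]

print(fuse("tau", "tau"))  # ['1', 'tau']
```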
to see why braiding matters, imagine creating two pairs of τ anyons out of the vacuum and braiding one anyon from the first pair around one from the second.
if the braid were not performed, the two pairs would individually fuse to vacuum with 100% probability.
however, after the braiding, there is a non-zero probability that fusing one of the pairs gives τ instead of 1.
the net 'charge' of the whole system is still 0, though, so if the remaining two particles are fused, they must give the vacuum:
the fusion outcome of one pair determines the fusion outcome of the other.
thus, their non-abelian nature ensures that exchanging them implements unitary transformations, forming the basis of topological quantum computation:
1. create qubits from non-abelian anyons
2. move the anyons around, 'braiding' them to perform a computation
3. measure the state of the anyons by fusion
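step 2 can be sketched numerically. for three τ anyons, the fusion space is two-dimensional, and the elementary braids act through the R and F matrices of the fibonacci model (the phase conventions below are the commonly quoted ones, but they vary across the literature):

```python
import numpy as np

phi = (1 + np.sqrt(5)) / 2  # golden ratio

# R matrix: phases picked up when two adjacent tau anyons are exchanged,
# in the basis labeled by their fusion outcome (1 or tau)
R = np.diag([np.exp(-4j * np.pi / 5), np.exp(3j * np.pi / 5)])

# F matrix: change of fusion basis; it is real, symmetric, and its own inverse
F = np.array([[1 / phi, 1 / np.sqrt(phi)],
              [1 / np.sqrt(phi), -1 / phi]])
assert np.allclose(F @ F, np.eye(2))

sigma1 = R          # braid the first pair of anyons
sigma2 = F @ R @ F  # braid the second pair, after a change of fusion basis

# both braids are unitary: braiding only rotates the encoded quantum state
for s in (sigma1, sigma2):
    assert np.allclose(s @ s.conj().T, np.eye(2))
```

composing such braid matrices in different orders approximates arbitrary single-qubit gates, which is exactly what step 2 exploits.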
for fibonacci anyons, the number of possible fusion outcomes (the dimension of the hilbert space)
grows according to the fibonacci sequence as more anyons are added.
this gives fibonacci anyons a quantum dimension equal to the golden ratio, φ = (1 + √5)/2 ≈ 1.618.
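you can check this growth by counting fusion trees (a quick sketch; `fusion_dim` is a name i made up):

```python
# count the fusion paths of n tau anyons: the total state-space dimension
# follows the fibonacci sequence, so it grows like phi^n
def fusion_dim(n):
    # paths[label] = number of fusion trees whose running total charge is label
    paths = {"1": 1, "tau": 0}  # start from the vacuum
    for _ in range(n):
        paths = {
            "1": paths["tau"],                # tau x tau -> 1
            "tau": paths["1"] + paths["tau"]  # 1 x tau -> tau, tau x tau -> tau
        }
    return paths["1"] + paths["tau"]

dims = [fusion_dim(n) for n in range(1, 13)]
print(dims[:8])           # [1, 2, 3, 5, 8, 13, 21, 34]
print(dims[-1] / dims[-2])  # ratio of successive dims approaches the golden ratio
```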
in summary, fibonacci anyons naturally protect against errors thanks to their topological properties:
the quantum information lives in a topologically ordered state with non-local entanglement, so small local perturbations do not disrupt it.
plus, the robustness of these systems allows for long-lasting quantum information storage.
if you are interested in learning more, i have some notes here
(or specifically on fibonacci anyons here).