Thresholds for the distributed surface code in the presence of memory decoherence

Kavli Affiliate: Tim H. Taminiau

| First 5 Authors: Sébastian de Bone, Paul Möller, Conor E. Bradley, Tim H. Taminiau, David Elkouss

| Summary:

In the search for scalable, fault-tolerant quantum computing, distributed
quantum computers are promising candidates. These systems can be realized in
large-scale quantum networks or condensed onto a single chip with closely
situated nodes. We present a framework for numerical simulations of a memory
channel using the distributed toric surface code, where each data qubit of the
code is part of a separate node, and the error-detection performance depends on
the quality of four-qubit Greenberger-Horne-Zeilinger (GHZ) states generated
between the nodes. We quantitatively investigate the effect of memory
decoherence and evaluate the advantage of GHZ creation protocols tailored to
the level of decoherence. We do this by applying our framework for the
particular case of color centers in diamond, employing models developed from
experimental characterization of nitrogen-vacancy centers. For diamond color
centers, coherence times during entanglement generation are orders of magnitude
lower than coherence times of idling qubits. These coherence times represent a
limiting factor for applications, but previous surface code simulations did not
treat them as such. Introducing limiting coherence times as a prominent noise
factor makes it imperative to integrate realistic operation times into
simulations and incorporate strategies for operation scheduling. Our model
predicts error probability thresholds for gate and measurement errors that are at
least a factor of three lower than in prior work with more idealized noise
models. We also find a threshold of $4\cdot10^2$ in the ratio between the
entanglement-generation rate and the decoherence rate, setting a benchmark for
experimental progress.
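As a rough illustration of the rate-ratio benchmark stated above (the $4\cdot10^2$ threshold is from the summary; the function name and the example rates below are hypothetical, chosen only to show the arithmetic):

```python
# Benchmark from the summary: the entanglement-generation rate must
# exceed the memory decoherence rate by a factor of at least 4e2.
RATIO_THRESHOLD = 4e2

def meets_benchmark(entanglement_rate_hz: float,
                    decoherence_rate_hz: float) -> bool:
    """Return True if the rate ratio clears the 4e2 threshold."""
    return entanglement_rate_hz / decoherence_rate_hz >= RATIO_THRESHOLD

# Hypothetical numbers: 10 kHz entanglement generation against a
# 5 Hz memory decoherence rate gives a ratio of 2e3, above threshold.
print(meets_benchmark(10_000.0, 5.0))  # ratio 2000 >= 400 -> True
```

This is only a sketch of the criterion as stated; the paper itself derives the threshold from full surface-code simulations, not from a single rate ratio check.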

| Search Query: ArXiv Query: search_query=au:"Tim H. Taminiau"&id_list=&start=0&max_results=3
