Below is a list of my publications. This list is not always up to date, so if you are looking for a recent paper, you are advised to check either my Google Scholar profile or the arXiv preprint server.


We explore the space “just above” BQP by defining a complexity class PDQP (Product Dynamical Quantum Polynomial time) which is larger than BQP but does not contain NP relative to an oracle. The class is defined by imagining that quantum computers can perform measurements that do not collapse the wavefunction. This (non-physical) model of computation can efficiently solve problems such as Graph Isomorphism and Approximate Shortest Vector which are believed to be intractable for quantum computers. Furthermore, it can search an unstructured N-element list in O(N^{1/3}) time, but no faster than Ω(N^{1/4}), and hence cannot solve NP-hard problems in a black box manner. In short, this model of computation is more powerful than standard quantum computation, but only slightly so.

Our work is inspired by previous work of Aaronson on the power of sampling the histories of hidden variables. However, Aaronson’s work contains an error in its proof of the lower bound for search, and hence it is unclear whether or not his model allows for search in logarithmic time. Our work can be viewed as a conceptual simplification of Aaronson’s approach, with a provable polynomial lower bound for search.

Scientific Reports 5, 18281

In ordinary, non-relativistic, quantum physics, time enters only as a parameter and not as an observable: a state of a physical system is specified at a given time and then evolved according to the prescribed dynamics. While the state can, and usually does, extend across all space, it is only defined at one instant of time, in conflict with special relativity where space and time are treated on an equal footing. Here we ask what would happen if we defined the notion of the quantum density matrix for multiple spatial and temporal measurements. We introduce the concept of a pseudo-density matrix which treats space and time indiscriminately. This matrix in general fails to be positive for timelike separated measurements, motivating us to define a measure of causality that discriminates between spacelike and timelike correlations. Important properties of this measure, such as monotonicity under local operations, are proved. Two qubit NMR experiments are presented that illustrate how a temporal pseudo-density matrix approaches a genuinely allowed density matrix as the amount of decoherence is increased between two consecutive measurements.

With recent progress on experimental quantum information processing, an important question has arisen as to whether it is possible to verify arbitrary computation performed on a quantum processor. A number of protocols have been proposed to achieve this goal; however, all are interactive in nature, requiring that the computation be performed interactively, with back-and-forth communication between the verifier and one or more provers. Here we propose two methods for verifying quantum computation in a non-interactive manner, based on recent progress in the understanding of the local Hamiltonian problem. Provided that the provers compute certain witnesses for the computation, this allows the result of a quantum computation to be verified after the fact, a property not seen in current verification protocols.

Gaussian processes (GP) are a widely used model for regression problems in supervised machine learning. Implementation of GP regression typically requires O(n^3) logic gates. We show that the quantum linear systems algorithm [Harrow et al., Phys. Rev. Lett. 103, 150502 (2009)] can be applied to Gaussian process regression (GPR), leading to an exponential reduction in computation time in some instances. We show that even in some cases not ideally suited to the quantum linear systems algorithm, a polynomial increase in efficiency still occurs.
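For context, the classical O(n^3) bottleneck that the quantum linear systems algorithm targets can be seen in a minimal sketch of exact GP regression (illustrative only; the kernel, length scale, and noise level below are arbitrary choices, not taken from the paper):

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale):
    # Squared-exponential kernel k(x, x') = exp(-(x - x')^2 / (2 l^2))
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-d2 / (2 * length_scale ** 2))

def gp_predict(X, y, X_star, length_scale=0.1, noise=1e-2):
    # Exact classical GPR: the Cholesky factorisation of the n x n matrix
    # (K + sigma^2 I) costs O(n^3) basic operations -- the scaling the
    # quantum linear systems algorithm can improve in favourable cases.
    K = rbf_kernel(X, X, length_scale) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return rbf_kernel(X_star, X, length_scale) @ alpha

# Fit a noiseless sine curve and predict near a training point.
X = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * X)
mean = gp_predict(X, y, np.array([0.25]))  # ground truth: sin(pi/2) = 1
```

The point of the sketch is the Cholesky step: it dominates the cost at O(n^3), whereas the quantum approach described in the paper attacks the same linear solve.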

A permutation-invariant code on m qubits is a subspace of the symmetric subspace of the m qubits. We derive permutation-invariant codes that can encode an increasing amount of quantum information while suppressing leading-order spontaneous decay errors. To prove the result, we use elementary number theory together with prior theory on permutation-invariant codes and quantum error correction.

The connection between certain entangled states and graphs has been heavily studied in the context of measurement-based quantum computation as a tool for understanding entanglement. Here we show that this correspondence can be harnessed in the reverse direction to yield a graph data structure which allows for more efficient manipulation and comparison of graphs than any possible classical structure. We introduce efficient algorithms for many transformation and comparison operations on graphs represented as graph states, and prove that no classical data structure can have similar performance for the full set of operations studied.
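To give a flavour of the classical cost being compared against, here is a sketch of local complementation, a transformation commonly associated with graph states (on a graph state it corresponds to a constant number of single-qubit Clifford operations); the specific operation set studied in the paper may differ, and the graph below is purely illustrative:

```python
import numpy as np

def local_complement(adj, v):
    """Local complementation at vertex v: toggle every edge between
    pairs of neighbours of v. This naive classical update touches
    O(deg(v)^2) adjacency-matrix entries, whereas on a graph state the
    corresponding change is a constant number of local operations."""
    nbrs = np.flatnonzero(adj[v])
    for i, a in enumerate(nbrs):
        for b in nbrs[i + 1:]:
            adj[a, b] ^= True
            adj[b, a] ^= True
    return adj

# Triangle 0-1-2 plus pendant edge 2-3; complement at vertex 2.
adj = np.zeros((4, 4), dtype=bool)
for a, b in [(0, 1), (1, 2), (0, 2), (2, 3)]:
    adj[a, b] = adj[b, a] = True
local_complement(adj, 2)
# Edge 0-1 is removed; edges 0-3 and 1-3 are created.
```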

The recent discovery of fully-homomorphic classical encryption schemes has had a dramatic effect on the direction of modern cryptography. Such schemes, however, implicitly rely on the assumption that certain computational problems are intractable. Here we present a quantum encryption scheme which is homomorphic for arbitrary classical and quantum circuits which have at most some constant number of non-Clifford gates. Unlike classical schemes, the security of the scheme we present is information theoretic, satisfying entropic security definitions, and hence independent of the computational power of an adversary.

Physical Review Letters 114 (22), 220502

Blind quantum computation allows a user to delegate a computation to an untrusted server while keeping the computation hidden. A number of recent works have sought to establish bounds on the communication requirements necessary to implement blind computation, and a bound based on the no-programming theorem of Nielsen and Chuang has emerged as a natural limiting factor. Here we show that this constraint only holds in limited scenarios, and show how to overcome it using a novel method of iterated gate teleportations. This technique enables drastic reductions in the communication required for distributed quantum protocols, extending beyond the blind computation setting. Applied to blind quantum computation, this technique offers significant efficiency improvements, and in some scenarios offers an exponential reduction in communication requirements.

As progress on experimental quantum processors continues to advance, the problem of verifying the correct operation of such devices is becoming a pressing concern. The recent discovery of protocols for verifying computation performed by entangled but non-communicating quantum processors holds the promise of certifying the correctness of arbitrary quantum computations in a fully device-independent manner. Unfortunately, all known schemes have prohibitive overhead, with resources scaling as extremely high degree polynomials in the number of gates constituting the computation. Here we present a novel approach based on a combination of verified blind quantum computation and Bell state self-testing. This approach has dramatically reduced overhead, with resources scaling as only O(m^4 ln m) in the number of gates m.

Physical Review A 91 (1), 012342

Boson sampling has been presented as a simplified model for linear optical quantum computing. In the boson-sampling model, Fock states are passed through a linear optics network and sampled via number-resolved photodetection. It has been shown that this sampling problem likely cannot be efficiently classically simulated. This raises the question as to whether there are other quantum states of light for which the equivalent sampling problem is also computationally hard. We present evidence, without using a full complexity proof, that a very broad class of quantum states of light—arbitrary superpositions of two or more coherent states—when evolved via passive linear optics and sampled with number-resolved photodetection, likely implements a classically hard sampling problem.

Proceedings of ITCS '15 pp 103-112

We give a quantum interactive proof system for the local Hamiltonian problem on n qubits in which (i) the verifier has a single round of interaction with five entangled provers, (ii) the verifier sends a classical message of O(log n) bits to each prover, who replies with a constant number of qubits, and (iii) completeness and soundness are separated by an inverse polynomial in n. As the same class of proof systems, without entanglement between the provers, is included in QCMA, our result provides the first indication that quantum multiprover interactive proof systems with entangled provers may be strictly more powerful than unentangled-prover interactive proof systems. A distinguishing feature of our protocol is that the completeness property requires honest provers to share a large entangled state, obtained as the encoding of the ground state of the local Hamiltonian via an error-correcting code. Our result can be interpreted as a first step towards a multiprover variant of the quantum PCP conjecture.

Encryption schemes often derive their power from the properties of the underlying algebra on the symbols used. Inspired by group theoretic tools, we use the centralizer of a subgroup of operations to present a private-key quantum homomorphic encryption scheme that enables a broad class of quantum computation on encrypted data. A particular instance of our encoding hides up to a constant fraction of the information encrypted. This fraction can be made arbitrarily close to unity with overhead scaling only polynomially in the message length. This highlights the potential of our protocol to hide a non-trivial amount of information, and is suggestive of a large class of encodings that might yield better security.

Physical Review A 90 (5), 050303

Homomorphic encryption is a form of encryption which allows computation to be carried out on the encrypted data without the need for decryption. The success of quantum approaches to related tasks in a delegated computation setting has raised the question of whether quantum mechanics may be used to achieve information theoretically secure fully homomorphic encryption. Here we show, via an information localisation argument, that deterministic fully homomorphic encryption necessarily incurs exponential overhead if perfect security is required.

Physical Review X 4 (4), 041041

Exquisite quantum control has now been achieved in small ion traps, in nitrogen-vacancy centres and in superconducting qubit clusters. We can regard such a system as a universal cell with diverse technological uses from communication to large-scale computing, provided that the cell is able to network with others and overcome any noise in the interlinks. Here we show that loss-tolerant entanglement purification makes quantum computing feasible with the noisy and lossy links that are realistic today: With a modestly complex cell design, and using a surface code protocol with a network noise threshold of 13.3%, we find that interlinks which attempt entanglement at a rate of 2 MHz but suffer 98% photon loss will result in kilohertz computer clock speeds (i.e. rate of high fidelity stabilizer measurements). Improved links would dramatically increase the clock speed. Our simulations employed local gates of a fidelity already achieved in ion trap devices.

Physical Review Letters 112 (13), 130502

Deterministic quantum computation with one quantum bit (DQC1) [E. Knill and R. Laflamme, Phys. Rev. Lett. 81, 5672 (1998)] is a model of quantum computing where the input is restricted to containing a single qubit in a pure state and has all other qubits in a completely mixed state. Only the single pure qubit is measured at the end of the computation. While it is known that DQC1 can efficiently solve several problems for which no known classical efficient algorithms exist, the question of whether DQC1 is really more powerful than classical computation remains open. In this Letter, we introduce a slightly modified version of DQC1, which we call DQC1_k, where k output qubits are measured, and show that DQC1_k cannot be classically efficiently simulated for any k≥3 unless the polynomial hierarchy collapses at the third level.
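The canonical DQC1 task is estimating the normalized trace of a unitary: after a Hadamard, a controlled-U, and a second Hadamard on the clean qubit, the expectation of Z on that qubit equals Re tr(U)/2^n. A small density-matrix simulation (an illustrative sketch, not code from the Letter) reproduces this identity:

```python
import numpy as np

def dqc1_expect_z(U):
    """Simulate the one-clean-qubit circuit on a d-dimensional unitary U
    and return <Z> on the clean qubit, which equals Re tr(U) / d."""
    d = U.shape[0]
    I = np.eye(d)
    # Initial state: |0><0| on the clean qubit, maximally mixed register.
    rho = np.kron(np.array([[1, 0], [0, 0]]), I / d)
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    HI = np.kron(H, I)
    # Controlled-U: |0><0| (x) I + |1><1| (x) U
    CU = np.kron(np.diag([1, 0]), I) + np.kron(np.diag([0, 1]), U)
    V = HI @ CU @ HI                      # H, then controlled-U, then H
    rho = V @ rho @ V.conj().T
    Z = np.kron(np.diag([1, -1]), I)
    return np.real(np.trace(Z @ rho))

# Check against the direct trace for a random 2-qubit unitary (d = 4).
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(A)
est = dqc1_expect_z(U)
```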

Physical Review Letters 111 (23), 230502

Blind quantum computation allows a client with limited quantum capabilities to interact with a remote quantum computer to perform an arbitrary quantum computation, while keeping the description of that computation hidden from the remote quantum computer. While a number of protocols have been proposed in recent years, little is currently understood about the resources necessary to accomplish the task. Here, we present general techniques for upper and lower bounding the quantum communication necessary to perform blind quantum computation, and use these techniques to establish concrete bounds for common choices of the client’s quantum capabilities. Our results show that the universal blind quantum computation protocol of Broadbent, Fitzsimons, and Kashefi comes within a factor of 8/3 of optimal when the client is restricted to preparing single qubits. However, we describe a generalization of this protocol which requires exponentially less quantum communication when the client has a more sophisticated device.

Nature Physics 9, 727–731

Quantum computers are expected to offer substantial speed-ups over their classical counterparts and to solve problems intractable for classical computers. Beyond such practical significance, the concept of quantum computation opens up fundamental questions, among them the issue of whether quantum computations can be certified by entities that are inherently unable to compute the results themselves. Here we present the first experimental verification of quantum computation. We show, in theory and experiment, how a verifier with minimal quantum resources can test a significantly more powerful quantum computer. The new verification protocol introduced here uses the framework of blind quantum computing and is independent of the experimental quantum-computation platform used. In our scheme, the verifier is required only to generate single qubits and transmit them to the quantum computer. We experimentally demonstrate this protocol using four photonic qubits and show how the verifier can test the computer’s ability to perform quantum computation.

Advances in Cryptology–ASIACRYPT 2014 pp 406-425

Delegating difficult computations to remote large computation facilities, with appropriate security guarantees, is a possible solution for the ever-growing needs of personal computing power. For delegated computation protocols to be usable in a larger context – or simply to securely run two protocols in parallel – the security definitions need to be composable. Here, we define composable security for delegated quantum computation, and prove that several known protocols are composable, including Broadbent, Fitzsimons and Kashefi’s Universal Blind Quantum Computation protocol.

We distinguish between protocols which provide only blindness – the computation is hidden from the server – and those that are also verifiable – the client can check that it has received the correct result. We show that the composable security definition capturing both these notions can be reduced to a combination of two distinct stand-alone security definitions.

Phys. Rev. A 88 (2), 022310

Quantum states of light are the obvious choice for communicating quantum information. To date, encoding information into the polarization states of single photons has been widely used as these states form a natural closed two-state qubit. However, photons are able to encode much more—in principle, infinite—information via the continuous spatiotemporal degrees of freedom. Here we consider the information capacity of an optical quantum channel, such as an optical fiber, where a spectrally encoded single photon is the means of communication. We use the Holevo bound to calculate an upper bound on the channel capacity, and relate this to the spectral encoding basis and the spectral properties of the channel. Further, we derive analytic bounds on the capacity of such channels, and, in the case of a symmetric two-state encoding, calculate the exact capacity of the corresponding channel.
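For pure signal states the Holevo bound reduces to the entropy of the average state, χ = S(ρ̄). The following numerical sketch evaluates it for a symmetric two-state encoding with a given overlap between the signal states (the two-dimensional embedding is an illustrative model, not the paper's calculation):

```python
import numpy as np

def holevo_two_pure_states(overlap, p=0.5):
    # Holevo bound chi = S(rho_bar) - sum_i p_i S(rho_i); for pure signal
    # states each S(rho_i) = 0, so chi = S(rho_bar). Two pure states with
    # |<psi0|psi1>| = overlap span a 2-d subspace, so we may represent
    # them as real vectors in that subspace:
    psi0 = np.array([1.0, 0.0])
    psi1 = np.array([overlap, np.sqrt(1.0 - overlap ** 2)])
    rho = p * np.outer(psi0, psi0) + (1 - p) * np.outer(psi1, psi1)
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-(evals * np.log2(evals)).sum())

chi_orth = holevo_two_pure_states(0.0)    # orthogonal states: a full bit
chi_same = holevo_two_pure_states(1.0)    # identical states: nothing
chi_mid = holevo_two_pure_states(0.5)     # partial overlap: in between
```

For equiprobable states the eigenvalues of ρ̄ are (1 ± overlap)/2, so χ interpolates smoothly between one bit (orthogonal spectra) and zero (indistinguishable spectra).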

While the above discussion may paint a bleak picture for cryptography in a world where large scale quantum computers are available, all is not lost. As we have seen, certain areas of cryptography, such as symmetric-key ciphers and hash functions, are not inherently vulnerable to quantum attacks. Indeed, in these areas there do exist information theoretically secure protocols, which are of course invulnerable to quantum attacks. However, quantum attacks cause problems for areas of cryptography where such information theoretically secure classical schemes do not and cannot exist, such as public key ciphers, digital signatures and key exchange protocols.

Computation for Humanity - Information Technology to Advance Society, CRC Press

The success of the abstract model of computation, in terms of bits, logical operations, programming language constructs, and the like, makes it easy to forget that computation is a physical process. Our cherished notions of computation and information are grounded in classical mechanics, but the physics underlying our world is quantum. In the early 1980s researchers began to ask how computation would change if we adopted a quantum mechanical, instead of a classical mechanical, view of computation. Slowly, a new picture of computation arose, one that gave rise to a variety of faster algorithms, novel cryptographic mechanisms, and alternative methods of communication. Small quantum information processing devices have been built, and efforts are underway to build larger ones. Even apart from the existence of these devices, the quantum view on information processing has provided significant insight into the nature of computation and information, and a deeper understanding of the physics of our universe and its connections with computation.

We start by describing aspects of quantum mechanics that are at the heart of a quantum view of information processing. We give our own idiosyncratic view of a number of these topics in the hopes of correcting common misconceptions and highlighting aspects that are often overlooked. A number of the phenomena described were initially viewed as oddities of quantum mechanics. It was quantum information processing, first quantum cryptography and then, more dramatically, quantum computing, that turned the tables and showed that these oddities could be put to practical effect. It is these applications we describe next. We conclude with a section describing some of the many questions left for future work, especially the mysteries surrounding where the power of quantum information ultimately comes from.

Physical Review Letters 109 (15), 150501

In the setting of networked computation, data security can be a significant concern. Here we consider the problem of allowing a server to remotely manipulate client supplied data, in such a way that both the information obtained by the client about the server’s operation and the information obtained by the server about the client’s data are significantly limited. We present a protocol for achieving such functionality in two closely related models of restricted quantum computation—the boson sampling and quantum walk models. Because of the limited technological requirements of the boson sampling model, small scale implementations of this technique are feasible with present-day technology.

Blind Quantum Computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client’s input, output and computation remain private. A desirable property for any BQC protocol is verification, whereby the client can verify with high probability whether the server has followed the instructions of the protocol, or if there has been some deviation resulting in a corrupted output state. A verifiable BQC protocol can be viewed as an interactive proof system leading to consequences for complexity theory.

The authors, together with Broadbent, previously proposed a universal and unconditionally secure BQC scheme where the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. In this paper we extend that protocol with new functionality allowing blind computational basis measurements, which we use to construct a new verifiable BQC protocol based on a new class of resource states. We rigorously prove that the probability of failing to detect an incorrect output is exponentially small in a security parameter, while resource overhead remains polynomial in this parameter. The new resource state allows entangling gates to be performed between arbitrary pairs of logical qubits with only constant overhead. This is a significant improvement on the original scheme, which required that any computation to be performed first be put into a nearest-neighbour form, incurring linear overhead in the number of qubits. Such an improvement has important consequences for efficiency and fault-tolerance thresholds.

Science 335 (6066), 303-308

Quantum computers, besides offering substantial computational speedups, are also expected to preserve the privacy of a computation. We present an experimental demonstration of blind quantum computing in which the input, computation, and output all remain unknown to the computer. We exploit the conceptual framework of measurement-based quantum computation that enables a client to delegate a computation to a quantum server. Various blind delegated computations, including one- and two-qubit gates and the Deutsch and Grover quantum algorithms, are demonstrated. The client only needs to be able to prepare and transmit individual photonic qubits. Our demonstration is crucial for unconditionally secure quantum cloud computing and might become a key ingredient for real-life applications, especially when considering the challenges of making powerful quantum computers widely available.

Physical Review A 84 (1), 012103

Entangled states can potentially be used to outperform the standard quantum limit by which every classical sensor is bounded. However, entangled states are very susceptible to decoherence, so it is not clear whether a quantum strategy subject to realistic noise can really yield a sensor superior to classical technology. This paper presents an investigation of how a quantum sensor composed of many spins is affected by independent dephasing. We adopt general noise models including non-Markovian effects, and in these noise models the performance of the sensor depends crucially on the exposure time of the sensor to the field. We have found that, by choosing an appropriate exposure time within the non-Markovian time region, an entangled sensor does actually beat the standard quantum limit. Since independent dephasing is one of the most typical sources of noise in many systems, our results suggest a practical and scalable approach to beating the standard quantum limit.

Physical Review A 83 (6), 060303

In distributed quantum computation, small devices composed of a single or a few qubits are networked together to achieve a scalable machine. Typically, there is an optically active matter qubit at each node, so that photons are exploited to achieve remote entanglement. However, in many systems the optically active states are unstable or poorly defined. We report a scheme to perform a high-fidelity entanglement operation even given severe instability. The protocol exploits the existence of optically excited states for phase acquisition without actually exciting those states; it functions with or without cavities and does not require number-resolving detectors.

In his famous 1981 talk, Feynman proposed that unlike classical computers, which would presumably experience an exponential slowdown when simulating quantum phenomena, a universal quantum simulator would not. An ideal quantum simulator would be controllable, and built using existing technology. In some cases, moving away from gate-model-based implementations of quantum computing may offer a more feasible solution for particular experimental implementations. Here we consider an adiabatic quantum simulator which simulates the ground state properties of sparse Hamiltonians consisting of one- and two-local interaction terms, using sparse Hamiltonians with at most three-local interactions. Properties of such Hamiltonians can be well approximated with Hamiltonians containing only two-local terms. The register holding the simulated ground state is brought adiabatically into interaction with a probe qubit, followed by a single diabatic gate operation on the probe which then undergoes free evolution until measured. This allows one to recover e.g. the ground state energy of the Hamiltonian being simulated. Given a ground state, this scheme can be used to verify the QMA-complete problem LOCAL HAMILTONIAN, and is therefore likely more powerful than classical computing.

Physical Review Letters 106 (16), 167204

Electron and nuclear spins have been employed in many of the early demonstrations of quantum technology. However, applications in real world quantum technology are limited by the difficulty of measuring single spins. Here we show that it is possible to rapidly and robustly amplify a spin state using a lattice of ancillary spins. The model we employ corresponds to an extremely simple experimental system: a homogeneous Ising-coupled spin lattice in one, two, or three dimensions, driven by a continuous microwave field. We establish that the process can operate at finite temperature (imperfect initial polarization) and under the effects of various forms of decoherence.

Physical Review A 82 (1), 010302

In a distributed quantum computer, scalability is accomplished by networking together many elementary nodes. Typically the network is optical and internode entanglement involves photon detection. In complex networks the entanglement fidelity may be degraded by the twin problems of photon loss and dark counts. Here we describe an entanglement protocol which can achieve high fidelity even when these issues are arbitrarily severe; indeed the method succeeds with finite probability even if the photon detectors are entirely removed from the network. An experimental demonstration should be possible with existing technologies.

Physical Review A 82 (4), 042114

The field of quantum metrology promises measurement devices that are fundamentally superior to conventional technologies. Specifically, when quantum entanglement is harnessed, the precision achieved is supposed to scale more favorably with the resources employed, such as system size and time required. Here, we consider measurement of magnetic-field strength using an ensemble of spin-active molecules. We identify a third essential resource: the change in ensemble polarization (entropy increase) during the metrology experiment. We find that performance depends crucially on the form of decoherence present; for a plausible dephasing model, we describe a quantum strategy, which can indeed beat the standard strategy.

Physical Review Letters 104 (5), 050501

The creation of complex entangled states, resources that enable quantum computation, can be achieved via simple “probabilistic” operations which are individually likely to fail. However, typical proposals exploiting this idea carry a severe overhead in terms of the accumulation of errors. Here, we describe a method that can rapidly generate large entangled states with an error accumulation that depends only logarithmically on the failure probability. We find that the approach may be practical for success rates in the sub-10% range. The assumptions that we make, including parallelism and high connectivity, are appropriate for real systems including those based on measurement-induced entanglement. This result therefore indicates the feasibility of such devices.

Formal Methods for Quantitative Aspects of Programming Languages, 43-86

Measurement-based quantum computation (MBQC) is a novel approach to quantum computation where the notion of measurement is the main driving force of computation. This is in contrast with the more traditional circuit model, which is based on unitary operations. We review here the mathematical model underlying MBQC and the first quantum cryptographic protocol designed using the unique features of MBQC.

Electronic Notes in Theoretical Computer Science 258 (2), 35-49

In many proposed architectures for quantum computing, the physics of the system prevents qubits from being individually controlled. In such systems universal computation may be possible via bulk manipulation of the system. Here, we describe a method to execute globally controlled quantum information processing which admits a fault tolerant quantum error correction scheme. Our scheme nominally uses three species of addressable two-level systems which are arranged in a one dimensional array in a specific periodic arrangement. We show that the scheme possesses a fault tolerant error threshold.

50th Annual IEEE Symposium on Foundations of Computer Science, 517-526

We present a protocol which allows a client to have a server carry out a quantum computation for her such that the client’s inputs, outputs and computation remain perfectly private, and where she does not require any quantum computational power or memory. The client only needs to be able to prepare single qubits randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. Our protocol is interactive: after the initial preparation of quantum states, the client and server use two-way classical communication which enables the client to drive the computation, giving single-qubit measurement instructions to the server, depending on previous measurement outcomes. Our protocol works for inputs and outputs that are either classical or quantum. We give an authentication protocol that allows the client to detect an interfering server; our scheme can also be made fault-tolerant. We also generalize our result to the setting of a purely classical client who communicates classically with two non-communicating entangled servers, in order to perform a blind quantum computation. By incorporating the authentication protocol, we show that any problem in BQP has an entangled two-prover interactive proof with a purely classical verifier. Our protocol is the first universal scheme which detects a cheating server, as well as the first protocol which does not require any quantum computation whatsoever on the client’s side. The novelty of our approach is in using the unique features of measurement-based quantum computing which allows us to clearly distinguish between the quantum and classical aspects of a quantum computation.

International Journal of Quantum Information, 8 (1-2), 219

This review provides a gentle introduction to one-way quantum computing in distributed architectures. One-way quantum computation shows significant promise as a computational model for distributed systems, particularly those architectures which rely on probabilistic entangling operations. We review the theoretical underpinnings of one-way quantum computation and discuss the practical issues related to exploiting the one-way model in distributed architectures.

Science 324 (5931), 1166-1168

Quantum entangled states can be very delicate and easily perturbed by their external environment. This sensitivity can be harnessed in measurement technology to create a quantum sensor with a capability of outperforming conventional devices at a fundamental level. We compared the magnetic field sensitivity of a classical (unentangled) system with that of a 10-qubit entangled state, realized by nuclei in a highly symmetric molecule. We observed a 9.4-fold quantum enhancement in the sensitivity to an applied field for the entangled system and show that this spin-based approach can scale favorably as compared with approaches in which qubit loss is prevalent. This result demonstrates a method for practical quantum field sensing technology.

Information Theoretic Security, 181-182

We introduce the first protocol for the anonymous transmission of a quantum state that is information-theoretically secure against an active adversary, without any assumption on the number of corrupt participants. The anonymity of the sender and receiver is perfectly preserved, and the privacy of the quantum state is protected except with exponentially small probability. Even though a single corrupt participant can cause the protocol to abort, the quantum state can only be destroyed with exponentially small probability: if the protocol succeeds, the state is transferred to the receiver and otherwise it remains in the hands of the sender (provided the receiver is honest).

Physical Review Letters 99 (3), 30501

Conventional quantum computing schemes are incompatible with nanometer-scale “hardware,” where the closely packed spins cannot be individually controlled. We report the first experimental demonstration of a global control paradigm: logical qubits delocalize along a spin chain and are addressed via the two terminal spins. Using NMR studies on a three-spin molecule, we implement a globally clocked quantum mirror that outperforms the equivalent swap network. We then extend the protocol to support dense qubit storage and demonstrate this experimentally via Deutsch and Deutsch-Jozsa algorithms.

We describe a method to execute globally controlled quantum information processing which admits a fault-tolerant quantum error correction scheme. Our scheme nominally uses three species of addressable two-level systems which are arranged in a one-dimensional array in a specific periodic pattern. We show that the scheme possesses a fault-tolerant error threshold.

New Journal of Physics 9 (6), 196

Given a suitably large and well-connected (complex) graph state, any quantum algorithm can be implemented purely through local measurements on the individual qubits. Measurements can also be used to create the graph state: path erasure techniques allow one to entangle multiple qubits by determining only global properties of the qubits. Here, this powerful approach is extended by demonstrating that even imperfect path erasure can produce the required graph states with high efficiency. By characterizing the degree of error in each path erasure attempt, one can subsume the resulting imperfect entanglement into an extended graph state formalism. The subsequent growth of the improper graph state can be guided, through a series of strategic decisions, in such a way as to bound the growth of the error and eventually yield a high-fidelity graph state. As an implementation of these techniques, we develop an analytic model for atom (or atom-like) qubits in mismatched cavities, under the double-heralding entanglement procedure of Barrett and Kok (2005 Phys. Rev. A 71 060310). Compared to straightforward post-selection techniques, our protocol offers a dramatic improvement in growing complex high-fidelity graph states.

Physical Review A 75 (4), 042303

Graph states (or cluster states) are the entanglement resource that enables one-way quantum computing. They can be grown by projective measurements on the component qubits. Such measurements typically carry a significant failure probability. Moreover, they may generate imperfect entanglement. Here we describe strategies to adapt growth operations in order to cancel incurred errors. Nascent states that initially deviate from the ideal graph states evolve toward the desired high fidelity resource without impractical overheads. Our analysis extends the diagrammatic language of graph states to include characteristics such as tilted vertices, weighted edges, and partial fusion, which arise from experimental imperfections. The strategies we present are relevant to parity projection schemes such as optical path erasure with distributed matter qubits.

Advances in Cryptology–ASIACRYPT 2007, 460-473

We present the first protocol for the anonymous transmission of a quantum state that is information-theoretically secure against an active adversary, without any assumption on the number of corrupt participants. The anonymity of the sender and receiver, as well as the privacy of the quantum state, are perfectly protected except with exponentially small probability. Even though a single corrupt participant can cause the protocol to abort, the quantum state can only be destroyed with exponentially small probability: if the protocol succeeds, the state is transferred to the receiver and otherwise it remains in the hands of the sender (provided the receiver is honest).

Physical Review Letters 97 (9), 090502

We describe a new design for a q-wire with perfect transmission using a uniformly coupled Ising spin chain subject to global pulses. In addition to allowing for the perfect transport of single qubits, the design also yields the perfect “mirroring” of multiply encoded qubits within the wire. We further utilize this global-pulse generated perfect mirror operation as a “clock cycle” to perform universal quantum computation on these multiply encoded qubits, where the interior of the q-wire serves as the quantum memory while the q-wire ends perform one- and two-qubit gates.

New Journal of Physics 8 (8), 141

Using the graph-state approach to quantum computation, one can avoid the need for complex array nanostructures in which quantum bits (qubits) interact directly. Instead one can employ simple ‘atom-like’ nanostructures, coupled over macroscopic distances via optical emissions. Here, we describe a robust coupling procedure, which we call brokering, that is especially well suited to nanostructures bearing both nuclear and electron spins. We describe how this approach can be implemented with N-V centre materials.

Physical Review A, 72(5), 050301

We study the dynamics of a single excitation in an infinite XXZ spin chain, which is launched from the origin. We study the time evolution of the spread of entanglement in the spin chain and obtain an expression for the second-order spatial moment of concurrence, about the origin, for both ordered and disordered chains. In this way, we show that a finite central disordered region can lead to sustained superballistic growth in the second-order spatial moment of entanglement within the chain.