On April 19th I was on a Skype call with Dorit Aharonov (yes, Skype remembers the date :-)), one of the pioneers of quantum complexity theory, and even more so of the research around the local Hamiltonian problem and the class QMA. The call was meant to be about a workshop we are organizing together with John Watrous, but towards the end of the call Dorit switched topics and told me she had just been reminded of a short survey she had agreed to write for the SIGACT complexity column hosted by Lane A. Hemaspaandra. Dorit’s column was meant to be a survey of recent developments around the quantum PCP conjecture, addressed to a wide audience from computer science (SIGACT is ACM’s “Special Interest Group on Algorithms and Computation Theory”). Although the column was originally due a couple of days later, Lane had agreed to extend the deadline by about two weeks. Would I be interested in co-authoring the survey with Dorit (as well as with Itai Arad, who was also part of the effort)?

My parents were about to visit for a week, and I was right in the midst of planning second visits for my job applications, but I thought, why not — I’ll learn the topic, and writing a survey might just be the right kind of occupation to keep my mind going during these incessant travels. In any case, the deadline was at most a couple weeks away: worst comes to worst, two weeks from now, we’re done.

Well, I’m happy to report that, a mere *5 months* after our initial call, the survey is finished and has just been posted to the arXiv! (To be fair, the guest column in SIGACT was actually submitted — in somewhat of a frenzy — much earlier than that; we sent the final version on May 14th after Lane and the SIGACT editors had graciously accepted a couple of extra weeks’ delay.) If there is any lesson to learn from the experience it would probably be that deadlines are good, but they *must be hard deadlines*. A “semi-hard” deadline, one that you feel you should really respect, but could nevertheless probably extend at the last minute by being sufficiently persuasive, does not help: it means you are constantly rushing to meet the next instantiation of the fleeting deadline without ever reaching the relief of getting the task done. Of course, in our case the delay between submission to SIGACT and posting on the arXiv is due to not having *any* deadline anymore, which means other tasks took priority for each of us. Searching through my email I can find one of us encouragingly stating that “we’ll be done by the end of the week”, or something similar, every couple weeks over the past three months!

This being said, I am quite happy with the result we arrived at. Although the survey is far from comprehensive (we did originally have a page limit, which by now we have of course copiously blown past), I dare hope that it provides a reasonable (very) high-level introduction to the feast of quantum PCP delicacies that the courageous reader will treat herself to by letting the survey lead her through the five-course menu, with many references provided by way of mignardises in case she has any space left. Although personally I still have a lot on my plate (and space in my stomach), I already reaped big benefits just from trying to catch up with Dorit and Itai, whose understanding of the topic runs much deeper than mine (and indeed, being forced to do so was my original motivation for joining the writing effort). Both put a huge effort into assembling a comprehensive picture from the many recent, and often quite subtle, results touching on the conjecture. I hope that the storyline we eventually put together will be as enlightening to other computer scientists (to whom the survey is primarily addressed) as it was to me.

Before I release the starving hordes, I can briefly point to a couple aspects of the survey that I found particularly enlightening:

**1. Multipartite entanglement**

In the survey we spend a good deal of effort to communicate some of the beauty and magic of multipartite entanglement. (To be fair, I should really say, Dorit and Itai spent a good deal of effort… here I was definitely the learner!) It is only by first getting a flavor of the many counter-intuitive consequences of entanglement that one can really appreciate the subtleties behind the quantum PCP conjecture. A simple, but already striking, example is given by the so-called *CAT states*. These are the two very simple $n$-qubit states

$|\mathrm{CAT}^{\pm}\rangle = \frac{1}{\sqrt{2}}\big( |0\cdots 0\rangle \pm |1\cdots 1\rangle \big)$

that are orthogonal (hence perfectly distinguishable), but such that their reduced density matrices on any subset of at most $n-1$ qubits are exactly identical. The only possible way to distinguish the two states is by having access to absolutely *all* $n$ qubits. No such phenomenon exists classically — two distinct strings must differ in at least one coordinate.
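This identical-marginals phenomenon is easy to check numerically. Here is a minimal NumPy sketch (my own illustration, not from the survey) for the three-qubit CAT states $(|000\rangle \pm |111\rangle)/\sqrt{2}$:

```python
import numpy as np

n = 3                                       # number of qubits
v0 = np.zeros(2 ** n); v0[0] = 1.0          # |00...0>
v1 = np.zeros(2 ** n); v1[-1] = 1.0         # |11...1>
cat_plus = (v0 + v1) / np.sqrt(2)
cat_minus = (v0 - v1) / np.sqrt(2)

# Orthogonal, hence perfectly distinguishable -- given access to all n qubits:
assert abs(np.dot(cat_plus, cat_minus)) < 1e-12

def reduced_density(state, keep):
    """Reduced density matrix on the (sorted) qubits in `keep`; trace out the rest."""
    psi = state.reshape([2] * n)
    traced = [q for q in range(n) if q not in keep]
    rho = np.tensordot(psi, psi.conj(), axes=(traced, traced))
    d = 2 ** len(keep)
    return rho.reshape(d, d)

# Yet on any n-1 = 2 (or fewer) qubits the two states look exactly the same:
for keep in ([0], [1], [0, 1], [0, 2], [1, 2]):
    assert np.allclose(reduced_density(cat_plus, keep),
                       reduced_density(cat_minus, keep))
print("all reduced densities on <= n-1 qubits agree")
```

The cross terms $|0\cdots 0\rangle\langle 1\cdots 1|$ vanish as soon as a single qubit is traced out, which is why every proper marginal of the two states collapses to the same mixture of $|0\cdots 0\rangle$ and $|1\cdots 1\rangle$.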

**The toric code.** Kitaev’s toric code is a particularly beautiful example of a family of four quantum states, defined on a two-dimensional square grid with one qubit on each edge, which exemplifies many more features of multipartite entanglement. One reason why this “code” is so beautiful is that its properties are related to the *topology* of the torus, with which the grid is identified by matching opposite borders. For instance, any “string” of Pauli operators which *does not* wrap around the torus leaves each of the four code states invariant — “indeed”, such strings are homotopic to the null string. However, the “group” of size $4$ comprised of strings wrapping around the torus along one or the other possible direction maps these states to one another.

Note how many quote marks I have used in the preceding paragraph; amazingly, though, all these statements can be made perfectly clear and rigorous. For me, working out some of the details came as a revelation — although I had frequently heard of the toric code, I had never actually kneaded it with my own hands… I’ll refer you to Section 2 of the survey for more details (incidentally, while working on that section I looked for some good lecture notes on the toric code: *surely*, such notes *must* exist, but I couldn’t find any that were aimed at the neophyte; any suggestions?).
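If you want to knead the toric code with your own hands, the smallest torus is already instructive. Below is a brute-force NumPy sketch (my own, with my own edge-labeling conventions — not from the survey) for a $2\times 2$ torus with one qubit per edge, i.e. 8 qubits: it builds the star and plaquette stabilizers, checks that the code space is four-dimensional, that a contractible loop of $Z$’s acts trivially on it, and that a non-contractible loop commutes with every stabilizer yet acts non-trivially:

```python
import numpy as np
from functools import reduce

X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])
I2 = np.eye(2)

L = 2                        # smallest torus: an L x L grid, one qubit per edge
n = 2 * L * L                # 8 qubits

def h(i, j):                 # horizontal edge to the right of vertex (i, j)
    return (i % L) * L + (j % L)

def v(i, j):                 # vertical edge below vertex (i, j)
    return L * L + (i % L) * L + (j % L)

def pauli(op, qubits):
    """Tensor product acting as `op` on `qubits` and as the identity elsewhere."""
    return reduce(np.kron, [op if q in qubits else I2 for q in range(n)])

# Star operators: X on the four edges meeting at a vertex.
stars = [pauli(X, {h(i, j), h(i, j - 1), v(i, j), v(i - 1, j)})
         for i in range(L) for j in range(L)]
# Plaquette operators: Z on the four edges bounding a face.
plaqs = [pauli(Z, {h(i, j), h(i + 1, j), v(i, j), v(i, j + 1)})
         for i in range(L) for j in range(L)]

# Projector onto the joint +1 eigenspace of all stabilizers: the code space.
P = reduce(np.matmul, [(np.eye(2 ** n) + S) / 2 for S in stars + plaqs])
assert round(np.trace(P).real) == 4        # four code states, as promised

# A contractible Z-loop (the boundary of one plaquette) fixes the code space...
assert np.allclose(P @ plaqs[0] @ P, P)

# ...while a non-contractible Z-loop (a row of horizontal edges wrapping around
# the torus) commutes with every stabilizer yet acts non-trivially on it.
loop = pauli(Z, {h(0, j) for j in range(L)})
assert all(np.allclose(loop @ S, S @ loop) for S in stars + plaqs)
assert not np.allclose(P @ loop @ P, P)
print("code dimension 4; wrapping loop is a non-trivial logical operator")
```

The counting behind the first assertion: 8 qubits, 4 star and 4 plaquette generators with two relations (the product of all stars, and of all plaquettes, is the identity), so the code space has dimension $2^{8-6}=4$.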

**The NLTS conjecture.** The No Low-energy Trivial States (NLTS) conjecture, recently introduced by Hastings, is a weaker conjecture than the quantum PCP conjecture; it attempts to capture the physical intuition for why the latter is hard. Very roughly, the NLTS conjecture states that there exist Hamiltonians all of whose low-energy states are “non-trivial”. The precise definition of non-trivial is a bit subtle, but intuitively it means that the state cannot be generated, or approximated, by a low-depth circuit; “non-trivial” thus gives a computational definition of a highly entangled state. It is clear that the quantum PCP conjecture implies the NLTS conjecture (any trivial low-energy state can serve as a classical witness for the energy of the Hamiltonian), but the converse is unknown, as the NLTS conjecture does not make any claims about the computational difficulty of approximating the ground energy of its Hamiltonians. Freedman and Hastings have proposed intricate constructions of Hamiltonians coming close to satisfying the NLTS condition, but the conjecture still stands and looks like a very good (though by no means easy) target for further research on the quantum PCP conjecture. In Section 4 of the survey we also discuss objections to the conjecture coming from the study of commuting Hamiltonians.

**2. An exponential size PCP for QMA**

In Section 5 of the survey we exhibit an explicit “exponential-sized PCP for QMA”. What does this mean? Take an instance of the $k$-local Hamiltonian problem, as described here. Suppose a *classical* verifier sets out to distinguish the two cases $E_0 \leq a$ or $E_0 \geq b$, where $E_0$ denotes the ground energy (the smallest eigenvalue of the Hamiltonian) and $a, b$ are separated by an inverse polynomial in the number of qubits. To help him in this task, the verifier may ask for access to a *classical* proof of the statement “$E_0 \leq a$”. For instance, he could ask for a classical description of the ground state $|\psi_0\rangle$. A possible naïve description would be the list of all complex amplitudes of $|\psi_0\rangle$ in the standard basis. Unfortunately, not only will this description require exponentially many bits to be written down, but verifying it also requires exponentially many bits to be read — requiring much more verification time than our verifier cares to spend on the problem. Indeed, this “naïve” representation does not respect the locality of the problem: evaluating the energy of even a single $k$-local term requires taking a partial trace over the remainder of the system, an exponentially long calculation. Can we do better?

A first attempt to improve the above would be to require the proof to contain pre-computed descriptions of all possible reduced density matrices of the ground state, on any subset of $k$ qubits. This only requires polynomially many bits… which should make us suspicious: it would put the problem in NP! The catch is that, even though the energy of any local term may now be estimated efficiently, we still need to check that the reduced densities given in the proof are consistent with a single global state — itself a QMA-complete problem! We have come full circle.

Nevertheless, as we recount in the survey, it follows from well-known arguments that at the very least an efficiently verifiable exponential-size PCP — an exponential-size proof that can be checked by reading only a constant number of bits — for the local Hamiltonian problem *must* exist; briefly, this comes as a consequence of the well-known chain of inclusions QMA $\subseteq$ PSPACE $=$ IP $\subseteq$ PCP. Unwinding the arguments behind these inclusions, however, requires going through a long series of intricate reductions, and it is not immediately clear what the resulting proof looks like.

In the survey we show that one can give a very intuitive PCP; although the resulting proof and verification procedure build heavily on classical tools, such as the sum-check protocol, they still manage to keep the “quantum” structure of the problem on the surface, making crucial use of its “qubit-by-qubit” locality. Details of the construction are given in the survey, but before going there I invite you to find your own; indeed, there should be many ways to obtain such a PCP, some of them more elegant than ours. Now for the major open question: can you make the PCP any shorter by allowing the verifier and proof to be quantum (of course still limiting the number of queries to a constant, and requiring completeness and soundness of the PCP to be separated by a constant as well)?
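The sum-check protocol mentioned above is a purely classical ingredient, and its core loop is short. Here is a toy sketch (my own illustration, not the survey’s verifier): an honest prover certifies the sum of a toy multilinear polynomial over the Boolean cube, round by round, with arithmetic over a prime field; the polynomial, variable names, and modulus are all chosen for illustration only.

```python
import random

P = 2 ** 31 - 1          # a prime modulus: all arithmetic is over F_P

def g(x):
    """A toy multilinear polynomial in m = 3 variables (the 'table' being summed)."""
    x1, x2, x3 = x
    return (x1 * x2 + 2 * x2 * x3 + 3 * x3 + 5) % P

m = 3
claimed = sum(g((a, b, c)) for a in (0, 1) for b in (0, 1) for c in (0, 1)) % P

def partial_sum(prefix, xi):
    """Honest prover: sum g with the first variables fixed to `prefix`, the next
    variable set to xi, and the remaining variables summed over {0, 1}."""
    rest = m - len(prefix) - 1
    return sum(g(tuple(prefix) + (xi,) + tuple((k >> j) & 1 for j in range(rest)))
               for k in range(2 ** rest)) % P

def sumcheck(claim):
    prefix = []
    for _ in range(m):
        g0, g1 = partial_sum(prefix, 0), partial_sum(prefix, 1)  # prover's message
        if (g0 + g1) % P != claim:                               # verifier's check
            return False
        r = random.randrange(P)                                  # random challenge
        claim = (g0 + r * (g1 - g0)) % P   # evaluate the (linear) message at r
        prefix.append(r)
    return claim == g(tuple(prefix))       # final spot-check of g at a random point

assert sumcheck(claimed)                   # the honest prover passes
assert not sumcheck((claimed + 1) % P)     # a false claim is rejected (w.h.p.)
```

Since $g$ is multilinear, each round’s message is a degree-one polynomial, so two evaluations determine it; the verifier never touches more than one evaluation of $g$ itself, which is the germ of how an exponentially long proof can be checked with very few queries.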