Complexity Theory

  • Joachim von zur Gathen

    Universität Bonn, Germany
  • Oded Goldreich

    Weizmann Institute of Science, Rehovot, Israel
  • Claus-Peter Schnorr

    Universität Frankfurt, Germany
  • Madhu Sudan

    Massachusetts Institute of Technology, Cambridge, United States

Abstract

The workshop Complexity Theory was organized by Joachim von zur Gathen (Bonn), Oded Goldreich (Rehovot), Claus-Peter Schnorr (Frankfurt), and Madhu Sudan (Cambridge). The workshop was held on June 5–11, 2005, and attended by approximately 50 participants spanning a wide range of interests within the field of Computational Complexity. Sixteen talks were presented in the mornings and attended by all participants. In addition, extensive interaction took place in smaller groups: several more specialized sessions were held in the afternoons, typically attended by 5–20 participants, and numerous informal meetings of 2–5 participants took place at various times (including at night).

The Oberwolfach Meeting on Complexity Theory is marked by a long tradition and a continuous transformation. Originally focused on Algebraic and Boolean Complexity, the meeting has continuously evolved to cover a wide variety of areas, most of which did not even exist at the time of the first meeting (in 1972). The format has also changed drastically in recent meetings: the focus is now on interactions in small specialized sessions, while unity is maintained via general plenary sessions. In addition to inviting many of the most prominent researchers in the field, the organizers try to identify and invite a fair number of promising young researchers. The current meeting marks the retirement from the organizing team of Claus Schnorr, the last and youngest member of the founding team.

Computational Complexity (a.k.a. Complexity Theory) is a central field of Computer Science with a remarkable list of celebrated achievements as well as vibrant research activity. The field is concerned with the study of the intrinsic complexity of computational tasks, and this study tends to aim at generality: it focuses on natural computational resources and considers the effect of limiting these resources on the class of problems that can be solved. Computational complexity is related to, and has substantial interaction with, other areas of mathematics such as number theory, algebra, combinatorics, coding theory, and optimization.

The workshop focused on several sub-areas of complexity theory, and its nature may best be exemplified by a brief survey of some of the meeting's highlights.

The complexity of Undirected Connectivity. For more than two decades, undirected connectivity was one of the most appealing examples of the computational power of randomness. Whereas every graph (e.g., a planar graph representing a maze) can be efficiently traversed by a deterministic algorithm, the classical deterministic algorithms required extensive use of (extra) memory (i.e., linear in the size of the graph). On the other hand, it was known that, with high probability, a random walk (of polynomial length) visits all vertices in the corresponding connected component. Thus, the randomized algorithm requires a minimal amount of auxiliary memory (i.e., logarithmic in the size of the graph). Even after more than a decade of focused attention on the issue, a significant gap remained between the space complexity of randomized and deterministic polynomial-time algorithms for this natural and ubiquitous problem. After deterministic polynomial-time primality testing was discovered in 2002, undirected connectivity became the most famous example where randomized computations seemed more powerful than deterministic ones.
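
As a concrete illustration, here is a minimal Python sketch of the random-walk algorithm just described; the graph representation, the step bound (reflecting the classical polynomial cover-time bound for undirected graphs), and all names are illustrative choices, not taken from any work presented at the meeting.

```python
import random

def random_walk_connected(adj, s, t, steps):
    """Randomized st-connectivity: walk randomly from s and report
    whether t was visited.  Only the current vertex and a step counter
    are stored, i.e. a logarithmic number of bits of workspace."""
    v = s
    for _ in range(steps):
        if v == t:
            return True
        v = random.choice(adj[v])  # move to a uniformly random neighbour
    return v == t

# Toy maze: a path 0-1-2-3 plus an isolated vertex 4 (with a self-loop).
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2], 4: [4]}
n = len(adj)
# A walk of polynomially many steps covers the whole connected
# component with high probability.
print(random_walk_connected(adj, 0, 3, 2 * n**3))   # True (w.h.p.)
print(random_walk_connected(adj, 0, 4, 2 * n**3))   # False
```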

In the workshop, Omer Reingold presented his recent breakthrough result asserting that any graph can be traversed by a deterministic polynomial-time algorithm that uses only a logarithmic amount of auxiliary memory. His algorithm is based on a novel approach that departs from previous attempts, which tried to derandomize the random-walk algorithm. Instead, Reingold's algorithm traverses a virtual graph, which (being an “expander”) is easy to traverse in deterministic logarithmic space, and maps this virtual traversal to a real traversal of the actual input graph. The virtual graph is constructed in (logarithmically many) iterations, where in each iteration the graph becomes easier to traverse.
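
The toy sketch below illustrates only the starting point of this approach, namely why a constant-degree graph of logarithmic diameter (as every constant-degree expander is) can be traversed deterministically with small memory, by enumerating all short walks. It is emphatically not Reingold's construction; the rotation-map interface and the example graph (a cycle, which is not an expander) are assumptions made purely for the demonstration.

```python
from itertools import product

def short_walk_connected(rot, deg, s, t, length):
    """Deterministic st-connectivity in a degree-`deg` graph given by a
    rotation map rot(v, i) -> i-th neighbour of v, by trying every walk
    of the given length.  The workspace is the current vertex plus the
    walk's label, i.e. O(length * log(deg) + log n) bits; when the graph
    has logarithmic diameter, length = O(log n) suffices."""
    for labels in product(range(deg), repeat=length):
        v = s
        for i in labels:
            v = rot(v, i)
            if v == t:
                return True
    return s == t

# Toy 2-regular cycle on 8 vertices: any walk of length >= the distance
# from s to t can reach t, so enumeration finds it deterministically.
rot = lambda v, i: (v + 1) % 8 if i == 0 else (v - 1) % 8
print(short_walk_connected(rot, 2, 0, 4, 4))   # True
```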

A new proof of the PCP Theorem. The PCP Theorem is one of the most influential and impressive results of complexity theory. Proven in the early 1990's, the theorem asserts that membership in any NP-set can be verified, with constant error probability (say 1%), by a verifier that probes a polynomially long (redundant) proof at only a constant number of randomly selected bit locations. The PCP Theorem led to a breakthrough in the study of the complexity of combinatorial approximation problems. Its original proof is very complex and involves the composition of two highly non-trivial proof systems, each minimizing a different parameter of the asserted PCP system (i.e., the proof length and the number of probed bits).
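
As a small taste of such local testing, the following Python sketch implements the classical Blum–Luby–Rubinfeld linearity test, a basic example of probing a long table in a constant number of random locations per trial, which appears as an ingredient in the original proof of the PCP Theorem. The concrete functions and parameters are illustrative choices.

```python
import random

def blr_linearity_test(f, n, trials):
    """Blum-Luby-Rubinfeld test: probe the table of f: {0,1}^n -> {0,1}
    at three correlated random points and check f(x) ^ f(y) == f(x ^ y).
    Linear functions always pass; functions far from all linear
    functions are rejected with constant probability per trial."""
    for _ in range(trials):
        x = random.getrandbits(n)
        y = random.getrandbits(n)
        if f(x) ^ f(y) != f(x ^ y):
            return False   # caught a violation of linearity
    return True

n = 8
parity = lambda x: bin(x).count("1") % 2          # linear over GF(2)
majority = lambda x: bin(x).count("1") > n // 2   # far from linear
print(blr_linearity_test(parity, n, 100))     # True: linear, always passes
print(blr_linearity_test(majority, n, 100))   # False (w.h.p.)
```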

In the workshop, Irit Dinur presented an alternative approach to the proof of the PCP Theorem. Her recent breakthrough approach leads to a simpler proof of the PCP Theorem as well as to resolving an important open problem regarding PCP systems (namely, constructing a PCP system having proofs of almost-linear rather than polynomial length). Dinur's approach is based on gradually improving the performance of PCP-like systems, starting with a trivial system and performing (logarithmically) many amplification steps. In each step, the PCP-like system is composed with itself in a way that almost preserves all parameters while drastically improving one particular parameter.

Extracting randomness. Extracting almost-perfect randomness from weak sources of (imperfect) randomness is crucial for the actual use of randomized procedures. The latter are analyzed assuming they are given access to a perfect random source, while in reality one typically has access only to sources of weak randomness (e.g., having constant entropy rate). Indeed, the problem has attracted a lot of attention in the last couple of decades. In the 1990's and early 2000's, the focus was on single-source extractors that utilize a very short auxiliary random seed. After more than a decade of impressive progress, culminating in an almost optimal construction, the focus has shifted back to “seedless” extraction from a few independent weak sources. In the workshop, Avi Wigderson surveyed the progress made on the latter problem in the last couple of years, and the techniques used towards this end. His presentation was followed by a specialized session devoted to this subject.
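
A minimal illustration of seedless extraction from two independent sources is the classical inner-product extractor of Chor and Goldreich, sketched below in Python: if the two sources are independent and each has min-entropy rate above one half, the output bit is close to uniform. The “weak source” model used here (some bits fixed, the rest uniform) is an artificial choice for the demonstration.

```python
import random

def inner_product_extractor(x, y):
    """Chor-Goldreich two-source extractor: the inner product of two
    n-bit strings over GF(2), i.e. the parity of the bitwise AND."""
    return bin(x & y).count("1") % 2

def weak_source(n, fixed_mask, fixed_bits):
    """An artificially defective source: the masked bits are fixed and
    the remaining bits are uniform, so min-entropy stays high."""
    return (random.getrandbits(n) & ~fixed_mask) | fixed_bits

n = 32
samples = [inner_product_extractor(weak_source(n, 0xFF, 0xA5),
                                   weak_source(n, 0xFF00, 0x1200))
           for _ in range(100000)]
print(sum(samples) / len(samples))   # close to 0.5
```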

Cryptography. Modern Cryptography is intimately related to Complexity Theory. A new aspect of this relationship was manifested in a talk by Yuval Ishai, which described recent work by him, Eyal Kushilevitz, and their graduate student Benny Applebaum. They showed that, for many central cryptographic primitives, secure implementations of moderate complexity (which exist under standard complexity assumptions) can be transformed into secure implementations of very low (in fact, minimal) complexity; that is, each output bit in the resulting implementations can be computed in constant time. Additional works in the area of Cryptography were presented and discussed in a specialized session devoted to this area.

Holographic Reductions. Standard (many-to-one) reductions between computational problems utilize gadgets that enforce a correspondence between global solutions and sequences of partial local solutions (within the gadgets). In the workshop, Les Valiant presented a novel type of reduction, called holographic, in which individual global solutions are not combinations of corresponding local solutions; rather, the set of global solutions is a combination of the sets of local solutions. He presented holographic reductions between counting problems, noting that the corresponding gadgets cannot be implemented in the standard (non-holographic) manner. These reductions (to a problem solvable in polynomial time) yield polynomial-time algorithms for problems that were not previously known to be efficiently solvable.

The complexity of Matrix Multiplication. Improved algorithms for matrix multiplication were the focus of extensive research in the 1970's and 1980's, culminating in an O(n^2.376)-time algorithm for multiplying two n-by-n matrices. Much of the progress on this question has occurred at the various Oberwolfach meetings on Complexity Theory. In the workshop, Chris Umans presented a novel approach to the design of such algorithms. So far, this approach has not yielded an improved algorithm; however, it yields a significantly simpler proof of the fact that matrix multiplication can be performed in O(n^2.41) steps. This is remarkable in light of the formidable complexity of previous proofs in the area. Additional works in the area of Algebraic Complexity were presented and discussed in a specialized session devoted to this area.
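
For orientation, the sketch below shows the classical Strassen algorithm, the historical starting point of this line of work, which already beats the cubic bound by using seven recursive products instead of eight, giving O(n^2.81) time. It is, of course, neither the Coppersmith–Winograd algorithm nor the group-theoretic approach presented at the meeting; the power-of-two restriction is a simplification for the sketch.

```python
import numpy as np

def strassen(A, B):
    """Strassen's divide-and-conquer multiplication of n-by-n matrices
    (n a power of two here): 7 recursive products instead of 8,
    yielding O(n^{log2 7}) = O(n^2.81) arithmetic operations."""
    n = A.shape[0]
    if n == 1:
        return A * B
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)
    top = np.hstack([M1 + M4 - M5 + M7, M3 + M5])
    bottom = np.hstack([M2 + M4, M1 - M2 + M3 + M6])
    return np.vstack([top, bottom])

A = np.random.randint(0, 10, (4, 4))
B = np.random.randint(0, 10, (4, 4))
print(np.allclose(strassen(A, B), A @ B))   # True
```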

Additional topics that were discussed in the workshop include a geometric approach to combinatorial optimization problems (see Sanjeev Arora's extended abstract), the pursuit of even stronger PCP systems (see extended abstracts by Eli Ben-Sasson and Oded Regev), computational problems regarding integer lattices (see specialized session devoted to the topic), the complexity of approximation problems (see Julia Chuzhoy's extended abstract), computational problems in coding theory (see Eyal Kushilevitz's extended abstract), the relation between worst-case and average-case complexity (see extended abstracts by Adi Akavia and Amnon Ta-Shma), and Quantum Computing (see extended abstracts by Scott Aaronson and Ran Raz).

This report contains extended abstracts of the sixteen plenary talks as well as summaries of the specialized sessions, which were written by the organizers of these sessions. In addition, the report includes three extended abstracts of talks given in the specialized sessions (by Peter Bürgisser, Ran Raz, and Ronen Shaltiel).

Cite this article

Joachim von zur Gathen, Oded Goldreich, Claus-Peter Schnorr, Madhu Sudan, Complexity Theory. Oberwolfach Rep. 2 (2005), no. 2, pp. 1431–1504

DOI 10.4171/OWR/2005/26