
Page 1: Polynomial Space - cs.cmu.edu

Polynomial Space

Klaus Sutner

Carnegie Mellon University

2019/03/26

Page 2: Polynomial Space - cs.cmu.edu

1 Context Sensitive Languages

2 Linear Bounded Automata

3 Polynomial Space

Page 3: Polynomial Space - cs.cmu.edu

Towards Polynomial Space 2

Logarithmic space and linear (deterministic) space clearly make algorithmic sense. But how about nondeterministic linear space?

Generalizing context-free grammars naturally leads to context-sensitive grammars, and the obvious parsing “algorithm” for a context-sensitive language is in NSPACE(n) (and, by Savitch, in SPACE(n^2)).

Somewhat surprisingly, it’s just a small step from there to all of PSPACE.

Page 4: Polynomial Space - cs.cmu.edu

Context-Sensitive Grammars 3

Definition (CSG)

A context-sensitive grammar (CSG) is a grammar where all productions are of the form

αAβ → αγβ   where γ ≠ ε

For technical reasons, some authors also allow S → ε, in which case S may not appear on the righthand side of any production. A language is context-sensitive if it can be generated by a context-sensitive grammar.

Note the constraint that the replacement string γ ≠ ε; as a consequence we have

α ⇒ β   implies   |α| ≤ |β|

Page 5: Polynomial Space - cs.cmu.edu

Brute-Force Recognition 4

Lemma

Every context-sensitive language is decidable.

Proof.

Suppose w ∈ Σ* and n = |w|. In any potential derivation (α_i)_{i<N} we have |α_i| ≤ n.

So consider the derivation graph D (recall the pictures from last time for CFLs):

vertices are Γ^{≤n},

edges are one-step derivations α ⇒ β.

Then w is in L iff w is reachable from vertex S in D.

□
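To make the reachability argument concrete, here is a minimal Python sketch (the grammar encoding as (α, β) string pairs and the helper name derives are assumptions for illustration, not part of the slides). It explores the derivation graph D by breadth-first search over sentential forms of length at most |w|; monotonicity keeps the search space finite, which is all the lemma needs.

from collections import deque

def derives(grammar, start, w):
    # brute-force CSL recognition: BFS over sentential forms of length <= |w|
    n = len(w)
    seen = {start}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        if u == w:
            return True
        for alpha, beta in grammar:
            # apply alpha -> beta at every position where alpha occurs in u
            i = u.find(alpha)
            while i != -1:
                v = u[:i] + beta + u[i + len(alpha):]
                if len(v) <= n and v not in seen:   # monotonicity: prune anything longer than w
                    seen.add(v)
                    queue.append(v)
                i = u.find(alpha, i + 1)
    return False

# the grammar for { a^n b^n c^n | n >= 1 } from two slides below
G = [("S", "aSBc"), ("S", "abc"), ("cB", "Bc"), ("bB", "bb")]
print(derives(G, "S", "aabbcc"))   # True
print(derives(G, "S", "aabbc"))    # False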

Page 6: Polynomial Space - cs.cmu.edu

Needless to say . . . 5

Lemma

Not all decidable languages are context-sensitive.

Proof. Here is a cute diagonalization argument for this.

Let (x_i) be an effective enumeration of Σ* and (G_i) an effective enumeration of all CSGs over Σ (say, both in length-lex order). Set

L = { x_i | x_i ∉ L(G_i) }

By the lemma, L is decidable.

But L cannot be context-sensitive by the usual diagonal mumbo-jumbo.

□

Page 7: Polynomial Space - cs.cmu.edu

Example: Counting 6

We know that the language

L = { a^n b^n c^n | n ≥ 1 }

is not context free. Here is a context-sensitive grammar G for L:

let V = {S,B} and set

S → aSBc | abc
cB → Bc

bB → bb

A typical derivation looks like

S ⇒ a^{n+1} b c (Bc)^n ⇒ a^{n+1} b B^n c^{n+1} ⇒ a^{n+1} b^{n+1} c^{n+1}

It follows by induction that L ⊆ L(G).

Page 8: Polynomial Space - cs.cmu.edu

Not so fast . . . 7

Alas, we also need to show that L(G) ⊆ L.

This is a bit harder: we need to show that the productions cannot be abused in some unintended way to generate other strings. Recall that there is no restriction on the order in which productions can be applied; they just have to match.

E.g., the following is allowed:

S ⇒ aaSBcBc ⇒ aaSBBcc

Exercise

Figure out the details.

Page 9: Polynomial Space - cs.cmu.edu

Example: More Counting 8

It is also known that the language

L = { x ∈ {a, b, c}* | #_a x = #_b x = #_c x }

is not context free. But, again, it is easily seen to be context-sensitive:

let V = {S, S′, A, B, C} and set

S → S′ | ε
S′ → S′ABC | ABC

XY → YX   for all X, Y ∈ {A, B, C}
A → a

B → b

C → c

Note that most productions are actually context-free. The critical ones are the commutation productions for {A, B, C}.

Page 10: Polynomial Space - cs.cmu.edu

Closure Properties 9

Theorem

Context-sensitive languages are closed under union, concatenation, Kleene star and reversal. They are also closed under ε-free homomorphisms.

Proof. Straightforward by manipulating the grammar. □

Note that arbitrary homomorphisms do not work in this case: they erase too much information and can force too large a search.

Page 11: Polynomial Space - cs.cmu.edu

Two Innocent Questions 10

Are CSL closed under intersection?

Are CSL closed under complement?

The answer is Yes in both cases (so this is quite different from context-free languages).

The proof for intersection can be based on a machine model, and is much easier than the proof for complement (which requires a special and very surprising counting technique; see next lecture).

Page 12: Polynomial Space - cs.cmu.edu

Normal Form 11

Theorem (Kuroda)

Every context-sensitive grammar can be written with productions of the form

A → BC     AB → CD     A → a

The proof is very similar to the argument for Chomsky normal form for CFGs (using only productions A → BC and A → a).

Note that the recognition algorithm becomes particularly simple when the CSG is given in Kuroda normal form: we first get rid of all terminals and then operate only on pairs of consecutive variables.

Page 13: Polynomial Space - cs.cmu.edu

Monotonicity 12

Derivations in CSGs are length-non-decreasing. Correspondingly, define a grammar to be monotonic if all productions are of the form

π : α → β   where α ∈ Γ* V Γ*, β ∈ Γ*, |α| ≤ |β|

As we have seen already, a monotonic grammar can only generate a decidable language.

In fact, these look rather similar to context-sensitive grammars except that we are now allowed to manipulate the context (but see 2 slides down). In particular, every CSG is monotonic.

Page 14: Polynomial Space - cs.cmu.edu

Monotonic is Context-Sensitive 13

Theorem

A language is context-sensitive iff it is generated by a monotonic grammar.

Proof.

Using LBAs introduced below, it suffices to show that every monotonic language can be accepted by an LBA. Initially, some terminal string a_1, a_2, . . . , a_n is written on the tape.

Search handle: the head moves to the right a random number of places, say, to a_i.

Check righthand side: the machine verifies that a_i, . . . , a_j = β where α → β is a production.

Replace by lefthand side: the block a_i, . . . , a_j is replaced by α, possibly leaving some blanks.

Collect: remove possible blanks by shifting the rest of the tape left.

This loop repeats until the tape is reduced to S and we accept. If any of the guesses goes wrong we reject.

Page 15: Polynomial Space - cs.cmu.edu

Tiny Example 14

One can also directly convert monotonic productions to context-sensitive ones. As a simple example, consider a commutativity rule

AB → BA

Here are equivalent context-sensitive rules:

AB → AX

AX → BX

BX → BA

Here X is a new variable; in each rule, exactly one variable is being replaced. Note how the left/right context is duly preserved.

Page 16: Polynomial Space - cs.cmu.edu

And the Difference? 15

CFL recognition is cubic time, via a dynamic programming algorithm that basically enumerates all possible parse trees.

Why would a similar approach not also work for CSLs? As Kuroda NF shows, on the face of it, the productions seem only mildly more complicated.

Recall how one can associate undecidability with CFGs: we are interested in the language C(M) of all accepting computations of a TM:

# C_0 # C_1 # . . . # C_n #

Unfortunately, this language fails to be context-free: the copy language { ww | w ∈ Σ* } is not context-free, and this one is worse.

Page 17: Polynomial Space - cs.cmu.edu

Tricks 16

Consider the complement Σ* − C(M).

Consider a representation where we use alternating mirror images:

# C_0 # C_1^op # C_2 # C_3^op . . . # C_n #

This works essentially since palindromes are context-free.

Claim

The resulting language of non-computations is context-free.

Page 18: Polynomial Space - cs.cmu.edu

And Context-Sensitive? 17

For context-sensitive languages there is no problem: we can directly generate C(M). This requires a bit of work using a grammar directly, but is fairly easy to see from the machine model (see below): an LBA can easily perform the necessary checks.

As a consequence, even some of the most basic questions about CSGs are undecidable. This is really quite surprising.

Theorem

It is undecidable whether a CSG generates the empty language.

Page 19: Polynomial Space - cs.cmu.edu

Decidability Overview 18

             x ∈ L   L = ∅   L = Σ*   L = K   L ∩ K = ∅
regular        Y       Y        Y        Y         Y
DCFL           Y       Y        Y        Y         N
CFL            Y       Y        N        N         N
CSL            Y       N        N        N         N
decidable      Y       N        N        N         N
semi-dec.      N       N        N        N         N

Standard decision problems for various language classes.

Needless to say, for the decidable ones we would like a more precise complexity classification (in which case it may matter how precisely the instance is given).

Page 20: Polynomial Space - cs.cmu.edu

1 Context Sensitive Languages

2 Linear Bounded Automata

3 Polynomial Space

Page 21: Polynomial Space - cs.cmu.edu

A Machine Model? 20

In order to find a parser for CSLs it seems natural to look for an associated machine model:

semidecidable — Turing machines

context free — pushdown automata

regular — finite state machines

We need a machine model that is stronger than pushdown automata (FSM plus stack), but significantly weaker than full Turing machines.

Note that two stacks won’t work; we need a different approach.

Page 22: Polynomial Space - cs.cmu.edu

Backwards Derivation 21

Suppose L is context-sensitive via G. The idea is to run a derivation of G backwards, starting at a string x of terminals.

To this end, nondeterministically guess a handle in the current string, a place where there is a substring of the form β, where α → β is a production in G. Erase β and replace it by α. Rinse and repeat.

The original string x is in L iff we can ultimately reach S this way.

Of course, this is yet another path existence problem in a suitable digraph.
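As a concrete illustration, here is a minimal sketch of the reduction step in Python (the encoding of productions as (α, β) string pairs and the function name reduce_once are assumptions). One nondeterministic step erases some handle β and writes back α; acceptance then asks whether S is reachable from x by iterating this step.

def reduce_once(grammar, u):
    # all strings obtained from u by replacing one handle beta by alpha,
    # for some production alpha -> beta of the grammar
    out = set()
    for alpha, beta in grammar:
        i = u.find(beta)
        while i != -1:
            out.add(u[:i] + alpha + u[i + len(beta):])
            i = u.find(beta, i + 1)
    return out

# one backwards step for the a^n b^n c^n grammar from earlier
G = [("S", "aSBc"), ("S", "abc"), ("cB", "Bc"), ("bB", "bb")]
print(reduce_once(G, "aabbcc"))   # {'aabBcc'}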

Page 23: Polynomial Space - cs.cmu.edu

Linear Bounded Automata 22

Definition

A linear bounded automaton (LBA) is a type of one-tape, nondeterministic Turing machine acceptor where the input is written between special end-markers and the computation can never leave the space between these markers (nor overwrite them).

Thus the initial configuration looks like

# q_0 x_1 x_2 . . . x_n #

and the tape head can never leave this part of the tape.

It may seem that there is not enough space to perform any interesting computations on an LBA, but note that we can use a sufficiently large tape alphabet to “compress” the input to a fraction of its original size and make room.
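As a toy illustration of the compression idea (the pack/unpack helpers are assumptions, not the actual LBA construction): recode k input symbols into one cell over the alphabet Σ^k, which frees all but a 1/k fraction of the tape for scratch work.

def pack(tape, k=2):
    # k old cells per new cell, over an alphabet of size |Sigma|^k
    return [tuple(tape[i:i + k]) for i in range(0, len(tape), k)]

def unpack(cells):
    return [s for cell in cells for s in cell]

t = list("abbaab")
print(pack(t))                  # [('a', 'b'), ('b', 'a'), ('a', 'b')]
print(unpack(pack(t)) == t)     # True: no information is lost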

Page 24: Polynomial Space - cs.cmu.edu

The Myhill-Landweber-Kuroda Theorem 23

The development happened in stages:

Myhill 1960 considered deterministic LBAs.

Landweber 1963 showed that they produce only context-sensitive languages.

Kuroda 1964 generalized to nondeterministic LBAs and showed that this produces precisely all the context-sensitive languages.

Theorem

A language is accepted by a (nondeterministic) LBA iff it is context-sensitive.

Page 25: Polynomial Space - cs.cmu.edu

Proof 24

One direction is by definition of an LBA.

For the opposite direction it is a labor of love to check in great gory detail that the workings of an LBA can be described by a context-sensitive grammar.

□

Note that nondeterminism is critical here: grammars are naturally nondeterministic and a deterministic machine would have to search through all possible choices. That seems to require more than just linear space (open problem, see below).

Page 26: Polynomial Space - cs.cmu.edu

Digression: Physical Realizability II 25

It was recognized already in the 1950s that Turing machines are, in many ways, too general to describe anything resembling the type of computation that was possible on emerging digital computers.

The Rabin-Scott finite state machine paper was one attempt to impose a radical constraint on Turing machines that brings them into the realm of “feasible” computation.

Myhill’s introduction of LBAs is another attempt at constructive restrictions. As we now know, LBAs are still a rather generous interpretation of the notion of feasible computation; real, practical algorithms need further constraints.

Still, it is a perfectly good model and there are many interesting problems that fit perfectly into this framework.

Page 27: Polynomial Space - cs.cmu.edu

Intersection Closure 26

Theorem

CSL are closed under intersection.

Proof.

Given two LBAs M_i for L_i, we can construct a new LBA for L_1 ∩ L_2 by using a 2-track tape alphabet Γ = Σ × Σ.

The upper track is used to simulate M_1, the lower track is used to simulate M_2.

It is easy to check that the simulating machine is again an LBA (it will sweep back and forth over the whole tape, updating both tracks by one step on the way).

□

Page 28: Polynomial Space - cs.cmu.edu

Why Not CFG? 27

In essence, the argument says that we can combine two LBAs into a single one that checks for intersection. This is entirely similar to the argument for FSMs.

Burning Question: Why can’t we do the same for PDAs?

Because we cannot in general combine two stacks into a single one (though this works in some cases; the stack height differences need to be bounded). But in general, two stacks suffice to simulate a Turing machine.

Page 29: Polynomial Space - cs.cmu.edu

1 Context Sensitive Languages

2 Linear Bounded Automata

3 Polynomial Space

Page 30: Polynomial Space - cs.cmu.edu

More Problems 29

So context-sensitive recognition is naturally in NSPACE(n), and by Savitch, in SPACE(n^2).

Two natural questions:

Are there other natural decision problems of this kind?

Is there any reason to consider larger space classes, say, SPACE(n^42)?

Page 31: Polynomial Space - cs.cmu.edu

Quantified Boolean Formulae 30

Definition

A quantified Boolean formula (QBF) is a formula consisting of propositional connectives “and,” “or” and “not,” as well as existential and universal quantifiers.

If all variables in a QBF are bound by a quantifier then the formula has a truth value: we can simply expand it out as in

∃x ϕ(x) ↦ ϕ(0) ∨ ϕ(1)

∀x ϕ(x) ↦ ϕ(0) ∧ ϕ(1)

In the end we are left with an exponential-size propositional formula without variables, which can simply be evaluated in linear time.
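Here is a minimal recursive evaluator along these lines (the formula representation as nested tuples tagged 'var'/'not'/'and'/'or'/'exists'/'forall' is an assumption for illustration). It applies exactly the two expansion rules, but keeps only the current assignment rather than the expanded formula, so the space used is polynomial in the number of variables while the time is exponential.

def qbf_value(phi, env=None):
    # evaluate a closed QBF given as nested tuples
    env = env or {}
    op = phi[0]
    if op == 'var':
        return env[phi[1]]
    if op == 'not':
        return not qbf_value(phi[1], env)
    if op == 'and':
        return qbf_value(phi[1], env) and qbf_value(phi[2], env)
    if op == 'or':
        return qbf_value(phi[1], env) or qbf_value(phi[2], env)
    if op == 'exists':   # ∃x ϕ  ↦  ϕ(0) ∨ ϕ(1)
        return any(qbf_value(phi[2], {**env, phi[1]: b}) for b in (False, True))
    if op == 'forall':   # ∀x ϕ  ↦  ϕ(0) ∧ ϕ(1)
        return all(qbf_value(phi[2], {**env, phi[1]: b}) for b in (False, True))
    raise ValueError(op)

# ∀x ∃y (x ↔ y), written with ∧, ∨, ¬ only
phi = ('forall', 'x', ('exists', 'y',
       ('or', ('and', ('var', 'x'), ('var', 'y')),
              ('and', ('not', ('var', 'x')), ('not', ('var', 'y'))))))
print(qbf_value(phi))   # True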

Page 32: Polynomial Space - cs.cmu.edu

Validity of QBF 31

Problem:   Validity of Quantified Boolean Formulae (QBF)
Instance:  A quantified Boolean sentence ϕ.
Question:  Is ϕ valid?

Note that many properties of propositional formulae can easily be expressed in terms of QBF. For example, ϕ(x_1, x_2, . . . , x_n) is satisfiable iff

∃x_1, . . . , x_n ϕ(x_1, x_2, . . . , x_n)   is valid

Likewise, the formula is a tautology iff

∀x_1, . . . , x_n ϕ(x_1, x_2, . . . , x_n)   is valid

Page 33: Polynomial Space - cs.cmu.edu

QBF 32

Lemma

Validity of QBF can be checked by a deterministic LBA.

Proof. As a case in point, consider the formula

∀x_1, . . . , x_n ∃y_1, . . . , y_m ϕ(x, y)

The argument easily extends to any other formula.

To check validity we use two loops, one counting from 0 to 2^n − 1 for x and another counting from 0 to 2^m − 1 for y.

Page 34: Polynomial Space - cs.cmu.edu

Validity Tester 33

for x = 0 .. 2^n − 1 do
    v = 0
    for y = 0 .. 2^m − 1 do
        if ϕ(x, y) then v = 1; break
    if v = 0 then return No
return Yes

Of course, the running time is exponential, but linear space is quite enough.

□

Page 35: Polynomial Space - cs.cmu.edu

QBF Solvers 34

http://satisfiability.org/

http://www.qbflib.org/

There is a lot of current research on building powerful QBF solvers.

Page 36: Polynomial Space - cs.cmu.edu

Succinct Representations 35

A quantified Boolean formula can be construed as a succinct representation of a much larger ordinary Boolean formula: deciding validity becomes harder because the input is much shorter.

Succinct representations appear in many places:

Page 37: Polynomial Space - cs.cmu.edu

Succinct Hypercube 36

Vertex set: 2^d

Edge set: all (x, y) ∈ 2^d × 2^d such that dist(x, y) = 1

Here dist(x, y) refers to the Hamming distance: |{ i | x_i ≠ y_i }|.

More generally, we could use a Boolean formula Φ(x, y) to determine the edges in a graph G(2^d, Φ) on 2^d.

And the formula could be represented by a ROBDD.
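A tiny sketch of the succinct view (the predicate name edge and the value of d are illustrative assumptions): the edge relation of the d-dimensional hypercube is just a predicate on pairs of bit vectors, so the 2^d-vertex graph is never written out explicitly.

from itertools import product

def edge(x, y):
    # succinct edge predicate of the hypercube: Hamming distance exactly 1
    return sum(xi != yi for xi, yi in zip(x, y)) == 1

d = 3
vertices = list(product((0, 1), repeat=d))       # expanded only for this tiny demo
print(sum(edge((0,) * d, y) for y in vertices))  # 3: every vertex of the d-cube has degree d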

Page 38: Polynomial Space - cs.cmu.edu

Games 37

Many games such as Hex, Checkers, Chess, . . . are naturally finite (at least with the right set of rules). Thus, from the complexity perspective, they are trivial.

However, they often generalize to an infinite class of games. For example, Hex can obviously be played on any n × n board.

Incidentally, John Nash proved that there is no draw in Hex (in 1949).

Page 39: Polynomial Space - cs.cmu.edu

Winning 38

There are several natural questions one can ask about these generalized games:

Is a particular board position winning?

Does the first player have a winning strategy?

If the total number of moves in a game is polynomial in the size of the board (Checkers, Hex, Reversi), then these questions are typically in PSPACE: we can systematically explore the game tree to check if a position is winning.
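A minimal sketch of that game-tree search (the interface with moves and is_terminal_loss is an assumption): the recursion depth is bounded by the number of remaining moves, so a polynomial bound on game length gives polynomial space, even though the running time may be exponential.

def first_player_wins(position, moves, is_terminal_loss):
    # the player to move wins iff some move leads to a position losing for the opponent
    if is_terminal_loss(position):
        return False
    return any(not first_player_wins(p, moves, is_terminal_loss)
               for p in moves(position))

# toy game: a pile of n tokens, remove 1 or 2, the player who cannot move loses
moves = lambda n: [n - k for k in (1, 2) if n - k >= 0]
loses = lambda n: n == 0
print([first_player_wins(n, moves, loses) for n in range(7)])
# [False, True, True, False, True, True, False] -- losing exactly when n is divisible by 3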

Page 40: Polynomial Space - cs.cmu.edu

Integer Circuit Evaluation 39

Consider a circuit with n integer inputs; we think of x ∈ Z as the singleton {x}. Internal nodes all have indegree 2 (but unbounded outdegree), and come in three types:

Union      A ∪ B

Sum        A ⊕ B = { a + b | a ∈ A, b ∈ B }

Product    A ⊗ B = { a · b | a ∈ A, b ∈ B }

We assume there is exactly one output node of outdegree 0; the subset of Z appearing there is the result of evaluating the circuit.

Problem:   Integer Circuit Evaluation (ICE)
Instance:  An ICE circuit C, an integer a.
Question:  Is a in the output set?
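As a baseline, here is a naive evaluator that computes the set at every gate explicitly (the circuit encoding as a topologically ordered gate list is an assumption). It follows the definition directly, but the intermediate sets can grow exponentially, which is exactly why the PSPACE bound below needs a different idea.

def ice_eval(gates):
    # gates in topological order: ('in', x), ('union', i, j), ('sum', i, j), ('prod', i, j),
    # where i, j are indices of earlier gates; returns the set at the last gate
    val = []
    for g in gates:
        if g[0] == 'in':
            val.append({g[1]})
        elif g[0] == 'union':
            val.append(val[g[1]] | val[g[2]])
        elif g[0] == 'sum':
            val.append({a + b for a in val[g[1]] for b in val[g[2]]})
        elif g[0] == 'prod':
            val.append({a * b for a in val[g[1]] for b in val[g[2]]})
    return val[-1]

# ({2} ∪ {3}) ⊕ ({2} ∪ {3}) = {4, 5, 6}
C = [('in', 2), ('in', 3), ('union', 0, 1), ('sum', 2, 2)]
print(ice_eval(C))       # {4, 5, 6}
print(5 in ice_eval(C))  # True: the membership query of the ICE problem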

Page 41: Polynomial Space - cs.cmu.edu

PSPACE 40

A ∪ B is linear; we might also admit arbitrary finite sets as input.

But: A ⊕ B and A ⊗ B are both potentially quadratic, so the sets defined by an ICE can be exponentially large.

Lemma

ICE is in PSPACE.

Proof. We cannot simply evaluate the circuit in polynomial space: the sets might be too large.

But we can nondeterministically guess the one element in each set that is actually needed to obtain the target value a in the end, and follow these witnesses throughout the circuit.

□

Page 42: Polynomial Space - cs.cmu.edu

Comment 41

This result may seem a bit contrived, but it actually is quite natural: there is a long-standing question of how circuit evaluation compares to formula evaluation (it should be slightly harder if the fan-out is higher than 1).

For example, Boolean circuit evaluation is P-complete, but evaluating a Boolean formula is in NC^1 (presumably a smaller class based on circuits, more later).

More generally, there are interesting connections between various kinds of circuits (or straight-line programs) and small complexity classes.

Page 43: Polynomial Space - cs.cmu.edu

Boring Completeness 42

As before with NP, it is not too hard to come up with a totally artificial PSPACE-complete problem:

K = { e # x # 1^t | x accepted by M_e in t = |x|^e + e space }

Because of the padding, we can easily check membership in K in linear space.

And the standard reduction shows that the problem is hard.

Not too interesting; here is a much better problem.

Page 44: Polynomial Space - cs.cmu.edu

Quantified Boolean Formulae 43

Theorem

Validity testing for quantified Boolean formulae is PSPACE-complete.

Proof.

Membership in PSPACE is easy to see: we can check validity by brute force using just linear space.

To see this, note that every block of existential and universal quantifiers can be considered as a binary counter: for k Boolean variables we have to check values 0, . . . , 2^k − 1.

This is exponential time but linear space.

Page 45: Polynomial Space - cs.cmu.edu

Proof Cont’d 44

For hardness one uses the fact that a configuration C′ of a Turing machine can be expressed by a (large) collection C of Boolean variables, much as in Cook-Levin, plus ideas from Savitch’s theorem.

We will construct a quantified Boolean formula Φ_k of polynomial size such that

Φ_k(C_1, C_2) ⇐⇒ ∃ t ≤ 2^k (C′_1 ⊢_M^t C′_2).

This is straightforward for k = 0, 1: copy the appropriate parts of the Cook-Levin argument.

For k > 1 note that Φ_k(C_1, C_2) iff ∃C (Φ_{k−1}(C_1, C) ∧ Φ_{k−1}(C, C_2)).

Unfortunately, this direct approach causes an exponential blow-up in size. Therefore we use a trick:

Φ_k(C_1, C_2) ⇐⇒ ∃C ∀D_1, D_2 ( ((C_1, C) = (D_1, D_2) ∨ (C, C_2) = (D_1, D_2)) ⇒ Φ_{k−1}(D_1, D_2) )

Page 46: Polynomial Space - cs.cmu.edu

Quoi? 45

So Φ_k(C_1, C_2) means that for some witness C and all choices of D_1 and D_2 we have

(C_1, C) = (D_1, D_2) ∨ (C, C_2) = (D_1, D_2)   implies   Φ_{k−1}(D_1, D_2)

This just gets us around having to write down Φ_{k−1} twice.
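To see why this matters, compare the size recurrences (writing c for the polynomial cost of the quantifier prefix, the equality tests and the base case; this bookkeeping is an informal estimate, not from the slides): the naive construction gives |Φ_k| ≈ 2 |Φ_{k−1}| + c, hence exponential in k, while the trick gives |Φ_k| ≈ |Φ_{k−1}| + c, hence |Φ_k| ≈ k · c. Since k is polynomial in the input size, so is Φ_k.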

□

Depending on your temperament, you will recognize this as a brilliant technical maneuver, or dismiss it as a slimy trick.

Page 47: Polynomial Space - cs.cmu.edu

Context Sensitive Recognition 46

We have seen that for context-sensitive grammars it is decidable whether a word is generated by the grammar.

Problem:   Context Sensitive Recognition (CSR)
Instance:  A CSG G = 〈V, Σ, P, S〉 and a word x ∈ Σ*.
Question:  Is x in L(G)?

In fact, we can solve this problem in NSPACE(n) by guessing where in the current string to apply a production of G backwards.

Ultimately we will wind up in S iff the given string is in the language.

Page 48: Polynomial Space - cs.cmu.edu

CSR Completeness 47

Theorem

Context Sensitive Recognition is PSPACE-complete.

Proof.

For hardness use the following well-known fact from language theory: ε ∉ L ∈ NSPACE(n) implies L is context-sensitive (via monotonic grammars).

Now let L ⊆ Σ* be in PSPACE, say, L ∈ SPACE(p(n)). Define

L′ = { x # 1^{p(|x|)} | x ∈ L }

By the previous remark L′ is context-sensitive. In fact, a grammar G for L′ can be constructed in polynomial time from a Turing machine for L.

Hence the map f defined by f(x) = (G, x # 1^{p(|x|)}) is a polynomial-time reduction from L to CSR.

□

Page 49: Polynomial Space - cs.cmu.edu

PSPACE and FSMs 48

Here is a question regarding the intersection of a family of DFAs.

Problem:   DFA Intersection
Instance:  A list A_1, . . . , A_m of DFAs.
Question:  Is ⋂ L(A_i) empty?

Note that m, the number of machines, is not fixed. We can check emptiness in linear time on the accessible part of the product machine A = ∏ A_i; alas, the latter has exponential size in general.

Theorem (Kozen 1977)

The DFA Intersection Problem is PSPACE-complete.

Page 50: Polynomial Space - cs.cmu.edu

The Proof 49

The problem is in linear NSPACE as follows.

Let n_i = |A_i| and set n = ∏ n_i.

set p_i = init(A_i) for all i
for t = 1 .. n do
    guess a ∈ Σ
    compute p_i = δ_i(p_i, a) for all i
    if p_i ∈ F_i for all i then return Yes
return No

As usual, this nondeterministic “algorithm” may produce false No’s, but a Yes guarantees that the intersection is indeed nonempty.
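For comparison, here is a deterministic sketch that explores only the accessible part of the product machine (the DFA encoding as (init, delta, finals) triples is an assumption). It answers the nonemptiness question exactly, but in the worst case it visits exponentially many product states, which is precisely what the nondeterministic algorithm above avoids.

from collections import deque

def intersection_nonempty(dfas):
    # dfas: list of (init, delta, finals), delta a dict {(state, letter): state};
    # BFS over the accessible part of the product automaton
    alphabet = {a for (_, delta, _) in dfas for (_, a) in delta}
    start = tuple(init for (init, _, _) in dfas)
    seen, queue = {start}, deque([start])
    while queue:
        states = queue.popleft()
        if all(q in finals for q, (_, _, finals) in zip(states, dfas)):
            return True                    # some word is accepted by every DFA
        for a in alphabet:
            nxt = tuple(delta.get((q, a)) for q, (_, delta, _) in zip(states, dfas))
            if None not in nxt and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

# A1: an even number of a's; A2: the last letter is an a
A1 = (0, {(0, 'a'): 1, (1, 'a'): 0, (0, 'b'): 0, (1, 'b'): 1}, {0})
A2 = (0, {(0, 'a'): 1, (1, 'a'): 1, (0, 'b'): 0, (1, 'b'): 0}, {1})
print(intersection_nonempty([A1, A2]))   # True, witnessed e.g. by "aa"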

Page 51: Polynomial Space - cs.cmu.edu

For hardness, let T be a Turing machine with polynomial space bound p(n) ≥ n. Consider IDs encoded as strings

# a_1 a_2 . . . a_l p b_1 b_2 . . . b_r

A whole computation has the form

# I_0 # I_1 . . . # I_{m−1} $

# and $ are new markers and we may safely assume that the length of each ID is exactly p(n) (pad with blank tape symbols). Also, assume that m is even (change the TM).

The idea is to construct a family of DFAs that make sure that I_{2i} ↦ I_{2i+1} is correct, and another family that checks I_{2i+1} ↦ I_{2i+2}. We will only deal with the first case.

Page 52: Polynomial Space - cs.cmu.edu

Here is machine A_k, where 0 ≤ k ≤ p(n) − 3.

It reads # Σ^k x y z and remembers x, y and z;

It then skips p(n) − k − 3 letters, reads a # and skips k more letters.

Then the machine reads the next three letters x′, y′, z′ and checks that they are compatible with the transition relation of the Turing machine.

Lastly, it skips another p(n) − k − 3 letters, and either reads a # and starts all over, or reads $ and accepts.

For example, if x = p, y = a and δ(p, a) = (b, q, +1) then x′ = b, y′ = q and z′ = z. Of course, there may be multiple possibilities.

Page 53: Polynomial Space - cs.cmu.edu

Combining “even” A_k machines with corresponding “odd” machines ensures that every accepted string is a proper sequence of IDs.

Lastly, we add one more DFA that checks that the sequence of IDs starts at the proper initial ID, and ends in an accepting ID (say, after emptying out the tape).

All the DFAs together accept precisely one string (coding the accepting computation of T) iff T(ε) ↓.

It is straightforward to modify the construction to check for T(x) ↓ instead.

□

Page 54: Polynomial Space - cs.cmu.edu

Determinization 53

We can think of a nondeterministic automaton A as a succinct representation of an equivalent deterministic automaton B: the accessible part of the power automaton, as detailed in the standard Rabin-Scott construction.

The size of B has no better than an exponential bound

|B| ≤ 2^|A|

Unfortunately, this bound is tight in general. It remains tight when one considers only semiautomata where Q = F = I, even when they are nearly deterministic and co-deterministic.
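A minimal sketch of the accessible subset construction (the NFA encoding as a dict from (state, letter) to sets of states is an assumption): it builds exactly the accessible part of the power automaton, and on bad inputs, such as the examples on the next two slides, the number of subset-states produced comes close to 2^|A|.

from collections import deque

def power_automaton(initials, delta, alphabet):
    # Rabin-Scott: accessible part of the power automaton of an NFA
    start = frozenset(initials)
    seen, queue, trans = {start}, deque([start]), {}
    while queue:
        P = queue.popleft()
        for a in alphabet:
            Q = frozenset(q for p in P for q in delta.get((p, a), ()))
            trans[(P, a)] = Q
            if Q not in seen:
                seen.add(Q)
                queue.append(Q)
    return seen, trans

# a 2-state NFA where the letter a nondeterministically stays put or moves on
delta = {(0, 'a'): {0, 1}, (1, 'a'): {1}, (0, 'b'): {0}, (1, 'b'): set()}
states, _ = power_automaton({0}, delta, 'ab')
print(sorted(map(set, states), key=len))   # [{0}, {0, 1}]: only 2 of 4 subsets are accessible here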

Page 55: Polynomial Space - cs.cmu.edu

Blow-Up Example 1 54

Consider the following DFA A = 〈[n], Σ, δ, 1, {1}〉 where [n] = {1, . . . , n}, Σ = {a, b, c} and the transition function is given by

δ_a is a cyclic shift on [n],

δ_b is the transposition that interchanges 1 and 2,

δ_c sends both 1 and 2 to 2, and is the identity elsewhere.

It is well known that A has a transition semigroup of maximal size n^n.

Furthermore, the reversal rev(A) of A produces a power automaton of maximal size 2^n, and the power automaton turns out to be reduced.

Page 56: Polynomial Space - cs.cmu.edu

Blow-Up Example 2 55

Here is a 6-state semiautomaton.

X = b: the machine is deterministic and co-deterministic, so the power automaton has size 1.

X = a: the power automaton has maximal size 2^6.

[Figure: the 6-state semiautomaton, with transitions labeled a and b and a single transition labeled X.]

Page 57: Polynomial Space - cs.cmu.edu

Power Automaton Reachability 56

Problem:   Power Automaton Reachability (PAR)
Instance:  A nondeterministic automaton A, a set P ⊆ Q of states.
Question:  Is P a state in the power automaton of A?

Theorem

PAR is PSPACE-complete.

Proof.

Membership in PSPACE follows from the usual guess-one-letter-at-a-time argument (as opposed to guessing the whole witness string).

Page 58: Polynomial Space - cs.cmu.edu

For hardness one can exploit a feature of Kozen’s hardness proof of DFA Intersection: all the DFAs in the argument have a unique final state.

So we can let A = ⋃ A_i be the disjoint union of the Kozen machines, and set

P = { fin(A_i) | i = 1, . . . , m }

□

We will see shortly that counting the states of the power automaton is similarly hard.

Page 59: Polynomial Space - cs.cmu.edu

And More 58

Many of the generalized games from above turn out to be PSPACE-complete.

ICE is PSPACE-complete.

Testing whether two regular expressions are equivalent is PSPACE-complete (even if one of them is Σ*). The same holds true for nondeterministic finite state machines.