I would like to do more teaching, but as our department has a hard enough time keeping all but the core classes open, and there aren't very many assistantships (due to our department nearly folding last school year), I don't have the opportunity. I might have the chance at a community college after Sara and I graduate--a community college since that will probably be the closest institute of higher learning in the area--but that's for the future.

Anyway, the class, COSC 4200: Computation and Complexity, covers some basic computation theory in the first half of the semester, focusing on regular languages and their associated finite automata, and on context-free languages with their associated grammars and push-down automata; it then delves into Turing machines and some complexity theory in the second half.

Since these theory topics are within my area of study, I'm always fascinated by them. Personally, I find playing with DFAs and PDAs quite enjoyable. I tried to describe them to my wife as the tinker toys of computational theory, but sadly she took that to mean that talking about them was a waste of time. Sigh.

I personally find it interesting how some very simple paradigms of computation can accomplish a fair amount of work. It is also interesting to note the closure properties of these language classes. Regular languages, for example, are closed under union, intersection, complement, and concatenation. Context-free languages are closed under union and concatenation, but not under complement or intersection. More importantly, for regular languages, nondeterminism is no more powerful than determinism, one of the few cases where we know this happens (another being PSPACE = NPSPACE, thanks to Savitch's Theorem). Of course, we can argue that we can only make this assertion because we restrict ourselves to a finite number of states with no memory device (like a stack or a queue), and there is still an exponential blowup in the number of states when converting an NFA to a DFA. That exponential blowup is one intuition for why P is widely believed (though not proven) to be strictly contained in NP.
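As a small illustration of closure under intersection, here is a sketch of the classic product construction in Python. The dict-based DFA encoding and the two example machines are my own choices for illustration, not anything from the lecture:

```python
from itertools import product

def dfa_intersection(d1, d2):
    """Product construction: a DFA whose language is L(d1) ∩ L(d2).

    Each DFA is a dict with keys 'states', 'alphabet', 'delta'
    (a dict mapping (state, symbol) -> state), 'start', and 'accept'.
    The product machine simulates both DFAs in lockstep.
    """
    pairs = set(product(d1['states'], d2['states']))
    delta = {}
    for (q1, q2) in pairs:
        for a in d1['alphabet']:
            delta[((q1, q2), a)] = (d1['delta'][(q1, a)],
                                    d2['delta'][(q2, a)])
    return {'states': pairs,
            'alphabet': set(d1['alphabet']),
            'delta': delta,
            'start': (d1['start'], d2['start']),
            'accept': {(q1, q2) for q1 in d1['accept']
                                for q2 in d2['accept']}}

def accepts(dfa, w):
    """Run the DFA on string w and report acceptance."""
    q = dfa['start']
    for a in w:
        q = dfa['delta'][(q, a)]
    return q in dfa['accept']

# Example: "even number of 0s" intersected with "ends in 1".
even_zeros = {'states': {'e', 'o'}, 'alphabet': {'0', '1'},
              'delta': {('e', '0'): 'o', ('o', '0'): 'e',
                        ('e', '1'): 'e', ('o', '1'): 'o'},
              'start': 'e', 'accept': {'e'}}
ends_in_1 = {'states': {'z', 'n'}, 'alphabet': {'0', '1'},
             'delta': {('z', '0'): 'z', ('n', '0'): 'z',
                       ('z', '1'): 'n', ('n', '1'): 'n'},
             'start': 'z', 'accept': {'n'}}
both = dfa_intersection(even_zeros, ends_in_1)
```

Note that the product machine has |Q1| × |Q2| states, which is why intersecting many DFAs can get expensive even though the construction itself is simple.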

On the other hand, context-free languages are essentially defined nondeterministically (CFLs are exactly the languages accepted by nondeterministic PDAs), and CFLs are strictly more powerful than DCFLs; the even-length palindromes are a classic example of a CFL that no deterministic PDA accepts. So this is a case where the separation between determinism and nondeterminism is actually known.

One of the neat things about playing with DFAs is that they are essentially memoryless. They consume the input as they work, and what memory they have is a tiny, finite amount encoded in the states. For example, we can craft a DFA that remembers the last *c* bits it has read, where *c* is some constant. Useful, but very limited, considering that most problems require unbounded amounts of memory to solve.
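To make the "last c bits" machine concrete, here is a rough Python sketch where the current state literally *is* the window of recently read bits. The encoding is my own illustration:

```python
from itertools import product

def last_c_bits_dfa(c):
    """A DFA over {0,1} whose current state records the last c bits read.

    States are all bit strings of length 0..c (the shorter ones occur
    before c bits have been consumed), so there are 2^(c+1) - 1 states
    in total: a DFA pays for its "memory" in state count.
    """
    states = [''.join(bits)
              for k in range(c + 1)
              for bits in product('01', repeat=k)]
    # On reading bit a in state q, slide the window: keep the last c bits.
    delta = {(q, a): (q + a)[-c:] for q in states for a in '01'}
    return {'states': set(states), 'start': '', 'delta': delta}

def run(dfa, w):
    """Return the final state, i.e., the remembered window."""
    q = dfa['start']
    for a in w:
        q = dfa['delta'][(q, a)]
    return q
```

For instance, running `last_c_bits_dfa(3)` on `"0110101"` ends in state `"101"`. The exponential state count here is the same phenomenon as the NFA-to-DFA blowup: finite control can fake memory, but only at exponential cost.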

This limitation makes moving up to PDAs more exciting (at least for me), because PDAs are essentially NFAs with a stack. A stack operates under a LIFO (last in, first out) discipline but has unbounded capacity. This means we can remember what we've read; the catch is that when we pop it back off, we get it in reverse order.
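A quick sketch of the stack at work: the textbook language {0^n 1^n : n ≥ 0}, which no DFA can recognize, falls easily to a stack. This is a plain Python rendering of the PDA idea rather than a formal PDA tuple:

```python
def accepts_0n1n(w):
    """Stack-based recognizer sketch for {0^n 1^n : n >= 0}.

    Push while reading 0s, then pop one 0 per 1; accept only if the
    input is exhausted exactly when the stack empties.
    """
    stack = []
    i = 0
    # Phase 1: push every leading 0.
    while i < len(w) and w[i] == '0':
        stack.append('0')
        i += 1
    # Phase 2: cancel one pushed 0 for each 1.
    while i < len(w) and w[i] == '1':
        if not stack:
            return False          # more 1s than 0s
        stack.pop()
        i += 1
    # Accept iff all input was consumed and the counts matched.
    return i == len(w) and not stack
```

Notice that the 0s come back off the stack in reverse order; that doesn't matter here because they're all identical, which is exactly why counting languages are such comfortable PDA territory.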

One of the things I wondered about when I first learned this material was what would happen if we used a queue instead of a stack. The answer: we get the equivalent of a Turing machine. So having a queue gives us all the computational capabilities (minus a speed factor) of the PCs we have in front of us.
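To hint at why a queue is so much stronger, here is a sketch (my own, using Python's `collections.deque` rather than the formal queue-automaton definition) that recognizes {a^n b^n c^n : n ≥ 0}, a language no PDA can accept. The trick is that a queue can be "rewound": cycle every symbol from the front back to the rear and you can re-read your memory in its original order, as many times as you like.

```python
from collections import deque

def accepts_anbncn(w):
    """FIFO-queue recognizer sketch for {a^n b^n c^n : n >= 0}.

    The only memory used is one queue. Each full cycle through the
    queue is one "pass" over the remembered string -- the repeated
    re-reading a stack cannot do.
    """
    q = deque(w)
    # Pass 0: check the shape a* b* c*, restoring the queue as we go.
    phase = 0  # 0 = reading a's, 1 = b's, 2 = c's
    for _ in range(len(q)):
        s = q.popleft()
        if s == 'a' and phase == 0:
            pass
        elif s == 'b' and phase <= 1:
            phase = 1
        elif s == 'c':
            phase = 2
        else:
            return False           # wrong symbol or out of order
        q.append(s)
    # Later passes: each one deletes exactly one 'a', one 'b', one 'c'.
    # Equal counts means the queue eventually empties cleanly.
    while q:
        seen = {'a': False, 'b': False, 'c': False}
        for _ in range(len(q)):
            s = q.popleft()
            if seen[s]:
                q.append(s)        # keep for the next pass
            else:
                seen[s] = True     # delete this occurrence
        if not all(seen.values()):
            return False           # counts were unequal
    return True
```

The slowdown the equivalence costs is visible here: each pass touches the whole queue, so this runs in roughly quadratic time where a Turing machine with a tape could be cleverer. But the point stands: FIFO memory, unlike LIFO memory, gets you the full power of the machine on your desk.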

I was rather pleased when the class thought to ask about that during lecture yesterday.