July 30, 2007 2:01 PM

Preparing for a Fall Marathon

I haven't written about running in a while, but there hasn't been much to say. I've been building my mileage back up to a respectable weekly average (in the mid 30s), with an eye toward being ready for marathon training. My last few weeks have been interrupted by only one mishap, a double-hamstring injury brought on not by running but by an intense day of landscaping. Last week, I managed 38 miles with two rest days, which puts me in good shape to begin training.

At the end of October, I'll be running the Marine Corps Marathon, my first "destination" marathon. It's also the latest in the year I will have run a marathon, with my previous four all coming in the first half of October. As a result, this is the latest I have ever started my official training plan for a marathon. This affects training in two ways: more of my mileage will come after the hottest part of the summer, and more of my mileage will come during the academic year. For a university prof or student, this means spending more hours on the road, away from work, and being more tired when doing CS. I think I will have to get to bed earlier most nights and so change some of my routine.

I am again using a 12-week plan for "advanced" runners that I read about in Runner's World, designed by running coach Bob Williams. Via Google, I found a 16-week plan by Williams, but it's much more complicated than the plan I am using; I prefer workouts that don't require a lot of switching gears. The plan I am using puts me on the track twice most weeks, once doing long repeats (≥ 800m) to build long speed and once doing shorter repeats (< 800m) to increase leg turnover, improve form and efficiency, and build strength for longer speed.

Last year, I customized this plan quite a bit, spreading it over 14 weeks and adding a lot of miles. This year, I am sticking to the plan even more closely than last year, not just the speed workouts but also the off-day workouts, the order and length of the long runs, and the weekly mileage recommendations. I suppose that having run several hundred miles fewer this year than last has me feeling a bit less cocky, and I also think it's time to let the expert guide me. My only customization this year is to stick in an extra week next week, between Weeks 1 and 2, while I am on the road to Indiana and southern California for a reunion and a little R&R before the school year -- and heavy training -- commence. Next week, I'll just work on my aerobic base with some mixed-speed road running.

Wish me luck.


Posted by Eugene Wallingford | Permalink | Categories: Running

July 27, 2007 6:58 PM

A Nice Example of a Functional Programming Pattern

David Altenburg gives a nice example of a pattern you'll find in functional programs, which he calls Enumerate, Map, Filter, Accumulate. This isn't the typical snappy pattern name, but it does show deference to Chapter 2 of Structure and Interpretation of Computer Programs, which discusses in some detail this way of processing a stream of data. I like this blog entry because it illustrates nicely a difference between how functional programmers think and how OO programmers think. He even gives idiomatic Java and Ruby code to demonstrate the difference more clearly. Seeing the Ruby example also makes clear just how powerful first-class blocks can be in an OO language.
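
Just to make the shape concrete, here is a tiny illustration of my own in Ruby -- not Altenburg's code, just the same pipeline: enumerate a range of numbers, filter the even ones, map each to its square, and accumulate the results into a sum.

    # enumerate, filter, map, accumulate -- a toy example
    sum_of_even_squares = (1..10)          # enumerate the candidates
      .select { |n| n.even? }              # filter: keep the even ones
      .map    { |n| n * n }                # map: square each survivor
      .inject(0) { |sum, n| sum + n }      # accumulate into a single value

    puts sum_of_even_squares               # => 220

The pipeline reads much like the signal-flow examples in SICP's Chapter 2: each stage consumes the stream produced by the stage before it.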

SICP is a veritable catalog of the patterns of programs written in a functional style. They aren't written in Alexander's pattern style, but they are patterns nonetheless.

(In a snarkier mood, I might say: I am sure some Lisp aficionado will explain that Enumerate, Map, Filter, Accumulate isn't a pattern because we could define an enumapfilacc macro and make it go away...)


Posted by Eugene Wallingford | Permalink | Categories: Patterns

July 26, 2007 1:21 PM

Agile Themes: Honesty and The Prime Directive

My last post looked at the relationship between honesty and blocking, motivated by a recent thread on the XP discussion list. In another thread, I encountered Dale Emery's message on The Prime Directive, and that got me to thinking about being honest with myself about my own behavior, and how to get better.

If you read much in the agile world, you'll run across the phrase "Prime Directive" a lot. I'm not a Trekkie, though I have enjoyed the several movies and TV series, but the first thing I think of when I hear the phrase is James T. Kirk. That's not what the agile folks are talking about... even if that directive raises interesting questions for a software person introducing agile methods to an organization!

If you google "prime directive agile", the first link is to Bob Martin's The Prime Directive of Agile Development, which is: Never be blocked. This is an ironic choice of words, given what I discussed in my previous post, but Martin is using an analogy from billiards, not football: An agile developer "makes sure that the shot he is taking sets up the next shot he expects to take.... A good agile developer never takes a step that stops his progress, or the progress of others." This is a useful notion, I think, but again not what most agilists mean when they speak of the Prime Directive.

They are referring instead to Norm Kerth's use of the phrase in the realm of project retrospectives, in which teams learn from the results of a recently-completed project in order to become a better team for future projects. Here is the Prime Directive for retrospectives, according to Norm:

The prime directive says:

Regardless of what we discover, we understand and truly believe that everyone did the best job they could, given what they knew at the time, their skills and abilities, the resources available, and the situation at hand.

At the end of a project everyone knows so much more. Naturally we will discover decisions and actions we wish we could do over. This is wisdom to be celebrated, not judgement used to embarrass.

This directive creates an environment in which people can examine past actions and results without fear of blame or reprisal. Instead the whole team can find ways to improve. When we look back at behavior and results in this context, we can be honest -- with our teammates and with ourselves. It's hard to improve oneself without facing the brutal facts that define our world and our person.

Emery's article focuses on the power of the phrase "given what they knew at the time". He does not view it as a built-in excuse -- well, I didn't know any better, so... -- but rather as a challenge to identify and adjust the givens that limit us.

I apply The Prime Directive to my personal work by saying, "I did the best I could, given..." then fill in the givens. Then I set to work removing or relaxing the limiting conditions so that I perform better in the future. Usually, the most important conditions are the conditions within me, the conditions that I created.... If I created those conditions (and I did), then they are the conditions I can most directly improve.

Well said. Being honest with myself isn't easy, nor is following through on what I learn when I am. I take this as a personal challenge for the upcoming year.

(By the way, I strongly recommend Norm Kerth's book on retrospectives, as well as his pattern language on the transition from software analysis to design, Caterpillar's Fate. Norm is an engaging speaker and doer who celebrates the human element in whatever he touches. I reported on a talk he gave at PLoP 2004 on myth and patterns back in the early days of this blog.)


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

July 26, 2007 1:08 PM

Agile Themes: Honesty and Blocking

I recently wrote about a long-running thread on the XP discussion list about defining 'agile'. Another theme I've noticed across several threads is honesty. This entry and the one that follows look at two facets of the theme.

In one thread that seems to be dying down, the list has discussed the ethics of "blocking", a term that Scott Ambler borrowed from (American) football to describe teams that create the facade of following the official software development methodology while behind the scenes doing what they think is best to deliver the software. Scott wrote about this behavior, in which some members of the team protect the agile process by Running Interference for the rest of the team, in a 2003 Software Development article.

Is it right to do this? As developers, do we want to live our lives doing one thing and saying that we do another? I'm leery of any prescription that requires me to lie, yet I see shades of gray here. I don't think that my employer or our client is better served by actually following a process that is likely to fail to deliver the software as promised. Or, if my team is capable of delivering the software reasonably using the official methodology, then why do I need to lie in order to use an agile process? For me, programming in an agile way is a lot more fun, so there is that, but then maybe I need to find a place that will let me do that -- or start my own.

As I mentioned last time, I have not been able to follow the list discussion 100%, and I can't recall if Kent Beck ever chimed in. But I can imagine what he might say, given the substance and tone of his postings the last few years. If you have to lie -- even if we give it an innocuous name like "blocking"; even if we view it as a last resort -- then something is wrong, and you should think long and hard about how to make it right. Agile developers value people over processes, and honesty is one way we demonstrate that we value people.

George Dinwiddie has a blog entry that considers a more pragmatic problem with blocking. We may be getting the job done in the short term, but blocking is shortsighted and may hurt the agile cause in the long run. If we give the appearance of succeeding via the official route, our employer and customer are likely to conclude that the official route is a good one -- and that will make it even harder to introduce agile practices into the workplace. There is practical value in telling the truth, even if it requires us to take small steps. After all, agile developers ought to appreciate the power of small steps.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

July 25, 2007 7:45 AM

Miscellaneous Blogging Thoughts

... at the end of a long day.

  1. I must be an old hand at blogging now. I let Knowing and Doing's third anniversary pass without comment. And I let my 500th post -- and my 512th! -- go by with no fanfare.
  2. I continue to be amazed by Google and the blogosphere. While preparing to visit my old hometown for my upcoming high school reunion, I googled "Scott Merrell ARC" in hopes of finding out whether the same Mr. Merrell still owned and operated ARC Sheet Metal, where I worked as a part-time sheet metal apprentice and installer of ductwork during several high school summers. Scott had met me at local chess tournaments and took me on for a job for which I had no training or particular native talent. He patiently taught me a few skills and endured a few mistakes. I thought it might be nice to stop in to see Scott after twenty years, introduce him to my wife and daughters, and maybe give him one more shot in his pet Lasker's variation against my Petrov's Defense. It was unlikely that a small local sheet metal shop would have a web page, but it cost me nothing to try.

    I found only one relevant link -- the first link on the results page, of course -- but it was not for the shop. Instead it led to a blog entry written by a friend of Scott's son, which quoted the full text of the son's eulogy for his father. My good friend and former boss died this past March after a long battle with lung disease. (In addition to being a chess hound and a professional sheet metal man, he smoked far too much.) The eulogy almost brought me to tears as it reminisced about the decent man I, too, remembered fondly and respected so. I have no simple way to contact Scott's son to thank him for sharing his eulogy, but I did leave a comment on the blog.

    Not many years ago, the idea that I could have learned about Scott's passing in this way and read the eulogy would have been unthinkable. The connection was indirect, impersonal in some ways, but deeply personal. For all its shortcomings, our technology makes the world a better place to live.

  3. I don't write a personal blog like the one that quoted Scott's eulogy, this entry and a few others notwithstanding. Dave Winer expressed one of the powerful reasons for writing a blog in his essay The unedited voice of a person. A blog like mine provides an outlet for thinking out loud, developing professional ideas in front of an audience, and sharing the small insights that would likely never appear in a refereed publication in a journal or conference. Writing without an editor creates a little fear, but soon the fear is counterbalanced by the freedom that comes from not having to carry someone else's reputation into the written word. By having readers, I do feel the weight of expectation, as I don't want to waste the valuable time of others. But the voice here can be mine, and only mine.
  4. Besides, I like Winer's explanation for why comments are not the be-all, end-all of a blog. I've always told myself, and anyone who asked why I don't have comments, that I would add them soon, but I have remained too lazy or busy to set them up. The lightweight, shell-based blogging tool I use, an old version of Nanoblogger, doesn't support comments out of the box, and in fact seems to require magical incantations to make a third-party add-on work with it. And I don't have the time or inclination to write my own just now.

    But I don't actually mind not having comments. I sometimes miss the interactivity that comments would enable, but managing comments and combatting comment spam takes time, time that I would rather spend reading and blogging.

  5. Last summer, Brad DeLong wrote a fun but more academic essay, The Invisible College, for the Chronicle of Higher Education Review that describes well why I like to blog. Even without comments enabled, I receive e-mail from smart, interesting people all over the world who have read something I wrote here, discussing some point I made, offering alternatives, and suggesting new ideas and resources to me. My academic work benefits from this invisible college. With any luck, some of my ideas might reach the eyes of non-computer scientists and start a conversation outside the confines of my discipline.

    Oh, and he's spot on about that procrastinating thing.

Back to paradise.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

July 24, 2007 3:42 PM

Agile Themes: Defining Agile

Every so often the XP discussion list takes off in a frenzy of activity. The last few weeks have seen such a frenzy, which makes it difficult for readers like me to keep up. Fortunately, I am often able to find some nuggets of value with a two-pronged heuristic approach to reading: I usually home in on messages from a few folks whose posts I have found valuable in the past and scan the sub-threads that follow, and I occasionally select messages pseudo-randomly based on subject lines that tickle my fancy. I'm sure that I miss out on some valuable conversations along the way, but I do find enough to keep my wheels moving.

One long-running recent thread has focused on "defining agile" -- that is, being able to tell when an organization is using agile methods or not. If the list were made up solely of academics, this thread would be a natural. Academics love to put things into categories and name them. But this list is made up mostly of practitioners, and their interest in defining agile comes from many different directions, not the least of which is dispelling hype and distinguishing practices that help folks succeed.

I used to be much more dogmatic about naming things and separating methods and approaches and people into categories. A prime example was separating "real" object-oriented programming from the charade that some people display in a language that supports objects. Over time, I have soured on the battles and am more likely to espouse the point of view expressed by Andy Hunt somewhere back in the thread:

Instead of a test for agile, how about a test for "was your project a success or was it horked?" If it was a success, call it anything you want. If not, don't dare call it agile. :-)

This sort of pragmatism is reminiscent of Alan Turing's sidestepping of the question "what is intelligence?" in his seminal Computing Machinery and Intelligence. Such a definition makes it hard for agilists to defend their turf, but it lets folks who want to build systems get down to business, rather than argue.

That said, I think that George Dinwiddie has done a nice job of capturing the essence of defining agile methods in a blog entry responding to the thread: using feedback that is frequent, timely, aligned with our desired goals, and pervasive. If you have read much of this blog, especially back in the first couple of years, you know that I have written frequently of the value of using feedback in many different circumstances, from developing software to teaching a class to training for and running a marathon. My appreciation of Dinwiddie's characterization is unsurprising.

Tim Haughton created a branch off this thread with a post that defines the Three Colours of Agile. Haughton reminds us that we need to tell different stories when we are dealing with the different parties involved in software projects. In particular, the customer who will use our software and the manager who oversees the development team have much different views on software development than the developers themselves have. Most of our discussion about agile methods focuses on the practices of developers and only peripherally with our interface to the rest of the world, if at all. Telling the right story at the right time can make a huge difference in whether another person buys into a new method proposed by developers. When we communicate the value of agile methods to the customer and manager from their perspective, the approach looks so much more palatable than a laundry list of developer practices they don't understand. Again, frequent readers of this blog will recognize a recurring theme of story telling.


Posted by Eugene Wallingford | Permalink | Categories: Software Development

July 23, 2007 1:59 PM

Intelligent Game Playing in the News

Two current events have me thinking about AI, one good and one sad.

First, after reporting last week that checkers has been solved by Jonathan Schaeffer's team at the University of Alberta, this week I can look forward to the Man vs. Machine Poker Challenge at AAAI'07. The computer protagonist in this event, Polaris, also hails from Alberta and Schaeffer's poker group. In this event, which gets under way shortly in Vancouver, Polaris will play a duplicate match against two elite human pros, Phil Laak and Ali Eslami. Laak and Eslami will play opposite sides of the same deal against Polaris, in an attempt to eliminate the luck of the draw from the result.
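
For readers who have not seen duplicate scoring before, here is a toy sketch of the idea in Ruby, with invented numbers: each deck is played twice with the human and computer seats mirrored, and summing the human results across the pair cancels much of the luck of the cards.

    # toy duplicate scoring -- the deals and chip counts are made up
    deals = [
      { laak_vs_polaris: +300, eslami_vs_polaris: -150 },  # same cards, seats reversed
      { laak_vs_polaris: -500, eslami_vs_polaris: +650 }
    ]

    human_total = deals.inject(0) do |total, deal|
      total + deal[:laak_vs_polaris] + deal[:eslami_vs_polaris]
    end

    puts human_total   # => 300; a positive total means the humans came out ahead

A hot run of cards helps one side at one table and hurts it at the mirror table, so what remains in the combined total is mostly skill.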

I don't know much about computer card-playing. Back when I was teaching AI in the mid-1990s, I used Matthew Ginsberg's text, and from his research learned a bit about programs that play bridge. Of course, bridge players tend to view their game as a more intellectual task than poker (and as more complex than, say, chess), whereas poker introduces the human element of bluffing. It will be fun seeing how a "purely rational" being like Polaris bluffs and responds to bluffs in this match. If poker is anything at all like chess, I figure that the program's dispassionate stance will help it respond to bluffs in a powerful way. Making bluffs seems a different animal altogether.

I wish I could be in Vancouver to see the matches. Back in 1996 I was fortunate to be at AAAI'96 in Philadelphia for the first Kasparov-Deep Blue match. The human champ won a close match that year before losing to Deep Blue the next. We could tell from Kasparov's demeanor and behavior during this match, as well as from his public statements, that he was concerned that humans retain their superiority over machines. Emotion and mental intimidation were always a part of his chess.

In contrast, former World Series of Poker champion Laak seems unconcerned at the prospect that Polaris might beat him in this match, or soon; indeed, he seems to enjoy the challenge and understand the computational disadvantage that we humans face in these endeavors. That's a healthier attitude, both long term and for playing his match this week. But I appreciated Kasparov's energy during that 1996 match, as it gave us demonstrative cues about his state of mind. I'll never forget the time he made a winning move and sat back smugly to put his wristwatch back on. Whenever Garry put his watch back on, we knew that he thought he was done with the hard work of winning the game.

The second story is sadder. Donald Michie, a pioneer in machine learning, has died. Unlike with many of the other founders of my first love in computing, I never had any particular connection to Michie or his work, though I knew his name well from the series of volumes on machine learning that he compiled and edited, as they are staples of most university libraries. But then I read in the linked Times On-Line article:

In 1960 he built Menace, the Matchbox Educable Noughts and Crosses Engine, a game-playing machine consisting of 300 matchboxes and a collection of glass beads of different colours.

We Americans know Noughts and Crosses as tic-tac-toe. It turns out that Michie's game-playing machine -- one that needed a human CPU and peripherals in order to run -- was the inspiration for an article by Martin Gardner, which I read as a sophomore or junior in high school. This article was one of my first introductions to machine learning and fueled the initial flame of my love for AI. I even built Gardner's variant on Michie's machine, a set of matchboxes to play Hexapawn, and watched it learn to play a perfect game. It was no Chinook or Deep Blue, but it made this teenager's mind marvel at the possibilities of machine intelligence.

So, I did have a more direct connection to Michie, and had simply forgotten! RIP, Dr. Michie.


Posted by Eugene Wallingford | Permalink | Categories: Computing, Personal

July 20, 2007 7:38 PM

A Reunion with Reunion

My 25th high school reunion is next month. (I can just hear the pencils at work as students, current and former, figure out just how old I am.) So I took this opportunity to re-read Alan Lightman's novel Reunion, which is about a college professor's 30th college reunion. I first read this book when it came out several years ago, but the theme was more timely this time around.

I first learned about Lightman, a physicist-turned-novelist whose fact and fiction both rest on a physics foundation, from an endnote in David Bodanis's E=mc2, which referred me to Einstein's Dreams. This was an unusual book, only a couple of dozen short chapters, consisting of a few fictional vignettes of Einstein's thinking and discussions with Michele Besso as he reconceptualized time for his theory of relativity, interspersed among twenty or so fictional dreams that Einstein might have had about worlds in which time behaves differently than it does in our world. For example, in one world, time passes faster when one is at higher altitudes; in another, one occasionally gets stuck to a single place in time; in yet another, time moves backward.

I found this book delightful, both creative and wonderfully written. The conversations between Einstein and Besso sounded authentic to this non-physicist, and the dream chapters were both "whimsical" and "provocative" (words I borrow from a literary review of the book) -- what would it be like if different neighborhoods lived in different decades or even centuries? Lightman writes as a poet, spare with words and description, precise in detail. Yet the book had a serious undercurrent, as it exposed some of the questions that physicists have raised about the nature of time, and how time interacts with human experience.

Later I found Reunion. It's more of a traditional human story, and I expect that some of my friends would derogate it as "chick lit". But I disagree. First, it's a man's story: a 52-year-old man keenly aware that time has passed beyond his dreams; a 22-year-old man alive with promise unaware that he is reaching branches in time that can never be passed again. And while its structure is that of a traditional novel, the underlying current is one of time's ambiguity: looking back, looking forward, standing still. Lightman even resorts in the shortest of passages to a common device which in other authors' hands is cliché, but which in his seems almost matter of fact. It's not science fiction because it sticks close to the way a real person might feel in this world, where time seems to move monotonically forward but in which our lives are a complex mishmash of present and past, future and never-was.

I enjoyed Reunion again and, though it's a bit of a downer, it hasn't diminished my anticipation of stepping back in time to see people who were once my friends, and who because of how time works in my mind will always be my friends, to reminisce about back-when and since-then, and what-now. Time's linearity will show through, of course, in the graying of hair and the onset of wrinkles...


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

July 19, 2007 3:03 PM

Checkers -- Solved!

I told the story of Jonathan Schaeffer's SIGCSE talk on the history of Chinook back in March. In that talk, he said that his team was a few months away from solving checkers. They have done it, as this Scientific American article reports:

Jonathan Schaeffer's quest for the perfect game of checkers has ended. ... after putting dozens of computers to work night and day for 18 years -- jump, jump, jump -- he says he has solved the game -- king me! "The starting position, assuming no side makes a mistake, is a draw," he says.

The proof is on-line, but the best proof is Chinook, Schaeffer's checker-playing program that is now the one player in the world that will never lose a game of checkers. You can still play Chinook on-line, if you got game.

This is easily the most complex game to be solved by computation, and the result depends on several different areas of computer science: AI, distributed computing, parallel programming, and databases most prominent among them. Chinook's endgame database now contains approximately 39 trillion positions and is the practical keystone of its play. Chinook searches deep, like many masters, but now it can relatively quickly terminate its analysis not with a heuristic static evaluation function but a database look-up that guarantees correctness. So even analytical mistakes early in the game can be corrected for as soon as the program reaches a solved position.
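
Here is a toy sketch in Ruby of how a solved-position table changes the search. The game graph, values, and names are all invented; this is not Chinook's code. The point is only the control flow: an exact database hit ends the analysis on that line, and the fallible heuristic is consulted only when the table has no answer and the depth limit (or a dead end) is reached.

    # toy negamax with an "endgame database" -- everything here is invented
    MOVES      = { a: [:b, :c], b: [:d], c: [], d: [] }   # tiny made-up game graph
    ENDGAME_DB = { d: 0 }                                 # assume :d is known to be a draw

    def static_eval(position)
      0    # stand-in heuristic: an estimate, not a guarantee
    end

    def search(position, depth)
      return ENDGAME_DB[position] if ENDGAME_DB.key?(position)   # exact value; stop here
      successors = MOVES.fetch(position, [])
      return static_eval(position) if depth.zero? || successors.empty?
      successors.map { |succ| -search(succ, depth - 1) }.max
    end

    puts search(:a, 3)   # => 0 in this toy graph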

I am a game player, primarily chess, and I know some folks who will call this database an unfair advantage. But the best game players I know have always capitalized on their better memories and better computational skills; I don't know why Chinook or any other program should be held to a different standard. But what a memory that is!

I am finally ready to believe that, if Chinook were to play Marion Tinsley -- may he rest in peace [*] -- in another match, it would not lose, and most likely would win. Even the great Tinsley made an occasional error.

And if you have not yet read Schaeffer's book One Jump Ahead on my earlier recommendation, well, shame on you. Do so now.

But is checkers really dead, as the popular press is now saying? Not at all. It is still a fun game for people to play, and a great mental battlefield. It's just that now we have an objective standard against which to measure ourselves.

----

[*] Or should that be "rest in piece"?


Posted by Eugene Wallingford | Permalink | Categories: Computing

July 19, 2007 9:27 AM

Copying the Masters

Ludwig van Beethoven, on copying Beethoven, in Copying Beethoven:

The world doesn't need another Beethoven. But it may need you.

Whether you are a composer copying Beethoven, a programmer copying Ward, or a pattern writer copying Alexander, learn from the masters, and then be yourself.

(On the movie itself: It is a highly fictionalized account of the last year of Beethoven's life. It is a good story that could have been a better movie, but Ed Harris is convincing as the master. Worth a couple of hours.)


Posted by Eugene Wallingford | Permalink | Categories: Teaching and Learning

July 18, 2007 4:20 PM

Mathematics as "Social Construct"

Many folks like to make analogies between mathematics and art, or computer science and creative disciplines. But there are important ways in which these analogies come up short. Last time, I wrote about Reuben Hersh's view of how to teach math right. The larger message of the article, though, was Hersh's view of mathematics as a social construct, a creation of our culture much as law, religion, and money are.

One of the neat things about Edge is that it not only gives interviews with thinkers but also asks thinkers from other disciplines to comment on the interviews. In the same issue as Hersh's interview is a response by Stanislas Dehaene, a mathematician-turned-neuroscientist who has studied the cognition of reading and number. He agrees that the view of number as a Platonic ideal is untenable but then draws on his knowledge of cognitive science to remind us that math is not like art and religion as social constructs in two crucial ways: universality and effectiveness. First, there are some mathematical universals to which all cultures have converged, and for which we can construct arguments sufficient to convince any person:

If the Pope is invited to give a lecture in Tokyo and attempts to convert the locals to the Christian concept of God as Trinity, I doubt that he'll convince the audience -- Trinity just can't be "proven" from first principles. But as a mathematician you can go to any place in the world and, given enough time, you can convince anyone that 3 is a prime number, or that the 3rd decimal of Pi is a 1, or that Fermat's last theorem is true.

I suspect that some cynics might argue that this is true precisely because we define mathematics as an internally consistent set of definitions and rules -- as a constructed system. Yet I myself am sympathetic to claims of the universality of mathematics beyond social construction.

Second, mathematics seems particularly effective as the language of science. Dehaene quotes Einstein, "How is it possible that mathematics, a product of human thought that is independent of experience, fits so excellently the objects of physical reality?" Again, a cynic might claim that much of mathematics has been defined for the express purpose of describing our empirical observations. But that really begs the question. What are the patterns common to math and science that make this convergence convenient, even possible?

Dehaene's explanation for universality and effectiveness rests in evolutionary biology -- and patterns:

... mathematical objects are universal and effective, first, because our biological brains have evolved to progressively internalize universal regularities of the external world ..., and second, because our cultural mathematical constructions have also evolved to fit the physical world. If mathematicians throughout the world converge on the same set of mathematical truths, it is because they all have a similar cerebral organization that (1) lets them categorize the world into similar objects ..., and (2) forces them to find over and over again the same solutions to the same problems ....

The world and our brains together drive us to recognize the patterns that exist in the world. I am reminded of a principle that I think I first learned from Patrick Henry Winston in his text Artificial Intelligence, called The Principle of Convergent Intelligence:

The world manifests constraints and regularities. If an agent is to exhibit intelligence, then it must exploit these constraints and regularities, no matter the nature of its physical make-up.

The close compatibility of math and science marveled at by Einstein and Dehaene reminds me of another of Winston's principles, Winston's Principle of Parallel Evolution:

The longer two situations have been evolving in the same way, the more likely they are to continue to evolve in the same way.

(If you never had the pleasure of studying AI from Winston's text, now in its third edition, then you missed the joy of his many idiosyncratic principles. They are idiosyncratic in that you'll read them nowhere else, certainly not under the names he gives them. But they express truths he wants you to learn. They must be somewhat effective, if I remember some from my 1986 grad course and from teaching out of his text in the early- to mid-1990s. I am sure that most experts consider the text outdated -- the third edition came out in 1992 -- but it still has a lot to offer the AI dreamer.)

So, math is more than "just" a mental construct because it expresses regularities and constraints that exist in the real world. I suppose that this leaves us with another question: do (or can) law and religion do the same, or do they necessarily lie outside the physical world? I know that some software patterns folks will point us to Christopher Alexander's arguments on the objectivity of art; perhaps our art expresses regularities and constraints that exist in the real world, too, only farther from immediate human experience.

These are fun questions to ponder, but they may not tell us much about how to do better mathematics or how to make software better. For those of us who make analogies between math (or computer science) and the arts, we are probably wise to remember that math and science reflect patterns in our world, or at least patterns closer to our immediate experience than those reflected by some of our other pursuits.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 17, 2007 8:17 PM

Mathematics, Problems, and Teaching

I'm surprised by how often I encounter the same topic in two different locations on the same day. The sources may be from disparate times, but they show up on my radar nearly coincident. Coincidence, perhaps.

Yesterday I ran across a link to an old article at Edge called What Kind of a Thing is a Number?. This is an interview with Reuben Hersh, a mathematician with a "cultural" view of mathematics. More on that notion later, but what caught my eye was Hersh's idea of how to teach math right:

A good math teacher starts with examples. He first asks the question and then gives the answer, instead of giving the answer without mentioning what the question was. He is alert to the body language and eye movements of the class. If they start rolling their eyes or leaning back, he will stop his proof or his calculation and force them somehow to respond, even to say "I don't get it." No math class is totally bad if the students are speaking up. And no math lecture is really good, no matter how beautiful, if it lets the audience become simply passive. Some of this applies to any kind of teaching, but math unfortunately is conducive to bad teaching.

Computer science isn't mathematics, but it, too, seems conducive to a style of teaching in which students are disengaged from the material. Telling students how to write a program is usually not all that enlightening; they need to do it to understand. Showing students how to write a program may be a step forward, because at least then they see a process in action and may have the gumption to ask "what?" and "why?" at the moments they don't understand. But I find that students often tune out when I demonstrate for too long how to do something. It's too easy for me to run ahead of what they know and can do, and besides, how I do something may not click in the right way with how they think and do.

The key for Hersh is "interaction, communication". But this creates a new sort of demand on instructors: they have to be willing to shut up, maybe for a while. This is uncomfortable for most faculty, who learned in classrooms where professors spoke and students took notes. Hersh tells a story in which he had to wait and wait, and then sit down and wait some more.

It turned out to be a very good class. The key was that I was willing to shut up. The easy thing, which I had done hundreds of times, would have been to say, "Okay, I'll show it to you." That's perhaps the biggest difficulty for most, nearly all, teachers--not to talk so much. Be quiet. Don't think the world's coming to an end if there's silence for two or three minutes.

This strategy presumes that students have something to say, and just need encouragement and time to say it. In a course like Hersh's, on problem solving for teachers, every student has a strategy for solving problems, and if the instructor's goal is to bring out into the open different strategies in order to talk about them, that works great. But what about, say, my compilers course? This isn't touchy-feely; it has well-defined techniques to learn and implement. Students have to understand regular expressions and finite-state machines and context-free grammars and automata and register allocation algorithms... Do I have time to explore the students' different approaches, or even care what they are?

I agree with Hersh: If I want my students actually to learn how to write a compiler, then yes, I probably want to know how they are thinking, so that I can help them learn what they need to know. How I engage them may be different than sending them to the board to offer their home-brew approach to a problem, but engagement in the problems they face and with the techniques I'd like them to learn is essential.

This sort of teaching also places a new demand on students. They have to engage the material before they come to class. They have to read the assigned material and do their homework. Then, they have to come to class prepared to be involved, not just lean against a wall with a Big Gulp in their hands and their eyes on the clock. Fortunately, I have found that most of our students are willing to get involved in their courses and do their part. It may be a cultural shift for them, but they can make it with a little help. And that's part of the instructor's job, yes -- to help students move in the right direction?

That was one article. Later the same afternoon, I received ACM's regular mailing on news items and found a link to this article, on an NSF award received by my friend Owen Astrachan to design a new curriculum for CS based on... problems. Owen's proposal echoes Hersh's emphasis on problem-before-solution:

Instead of teaching students a lot of facts and then giving them a problem to solve, this method starts out by giving them a problem.... Then they have to go figure out what facts they need to learn to solve it.

This approach allows students to engage a real problem and learn what they need to know in a context that matters to them, to solve something. In the article, Owen even echoes the new demand made of instructors, being quiet:

With problem-based learning, the faculty person often stands back while you try to figure it out, though the instructor may give you a nudge if you're going the wrong way.

... and the new demand made of students, to actively engage the material:

And [the student] might spend a couple of weeks on a problem outside of class.... So you have to do more work as a student. It's kind of a different way of learning.

The burden on Astrachan and his colleagues on this grant is to find, refine, and present problems that engage students. There are lots of cool problems that might excite us as instructors -- from the sciences, from business, from the social sciences, from gaming and social networking and media, and so on -- but finding something that works for a large body of students over a long term is not easy. I think Owen understands this; this is something he has been thinking about for a long time. He and I have discussed it a few times over the years, and his interest in redesigning how we teach undergraduate CS is one of the reasons I asked him to lead a panel at the OOPSLA 2004 Educators' Symposium.

Frank Oppenheimer's Exploratorium

This is also a topic I've been writing about for at least that long, including entries here on how Problems Are The Thing and before that on Alan Kay's talks ... at OOPSLA 2004! I think that ultimately Kay has the right idea in invoking Frank Oppenheimer's Exploratorium as inspiration: a wide-ranging set of problems that appeal to the wide-ranging interests of our students while at the same time bringing them "face to face with the first simple idea of science: The world is not always as it seems." This is a tall challenge, one better suited to a community working together (if one by one) than to a single researcher or small group alone. My colleagues and I have been chipping away slowly at the fringes of it in our ChiliPLoP project. I am looking forward to the pedagogical infrastructure and ideas that come from Owen's NSF project. If anyone can lay a proper foundation for problems as the centerpiece of undergrad CS, he and his crew can.

Good coincidences. Challenging coincidences.


Posted by Eugene Wallingford | Permalink | Categories: Computing, Teaching and Learning

July 12, 2007 7:32 PM

Stay Focused

A former student recently wrote:

I am periodically reminded of a saying that is usually applied to fathers but fits teachers well -- when you are young it's amazing how little they know, but they get much smarter as I get older.

For a teacher, this sort of unsolicited comment is remarkably gratifying. It is also humbling. What I do matters. I have to stay on top of my game.


Posted by Eugene Wallingford | Permalink | Categories: Personal, Teaching and Learning

July 11, 2007 7:02 PM

Heard at a Summit on K-12 Education

As I mentioned last month, the Board of Regents in my state has charged the public universities with a major reform effort aimed primarily at K-12 science and math education. My university, as the state's old teachers' college, is leading the effort. Today, we hosted a "summit" attended by researchers and practitioners at all three public universities, folks from other four-year colleges, K-12 science and math teachers, legislators, representatives from the state Department of Education, and many other folks. It was a big crowd.

Here are some of the things I heard in my break-out group of twelve:

  1. Education researchers talk a lot about students and teachers "talking about thinking" and "talking about learning".
  2. The teachers on the front lines face many of the same problems we face in the university, including students who would rather spend their time at their part-time jobs than doing homework in a challenging course.
  3. "Context matters."
  4. "How do we measure the quality of teaching?" Too often the answer from the Establishment is that it's a really hard problem to quantify, or that we can't quantify it all. Unfortunately, when we can't measure our "output" we are going to have a hard time knowing when we have succeeded or failed.
  5. "You don't need to be a physicist to teach physics." No, but you need to know some physics -- more and at a deeper level than what you hope to teach. And you need to be able to think like a physicist, and be able to do a little physics when the situation calls for it.

I can see now why this problem is so hard to solve. We can't specify our target very clearly, which makes it hard to agree on what to do to solve it, or to know if we have. There are so many different stakeholders with so many different ideas at stake. It's pretty daunting. I can see why some folks want to "start over" in the form of charter schools that can implement a particular set of ideas relatively free of the constraints of the education establishment and various other institutional and personal agendas.

My initial thought is that the best way to start is to start small. Pick a small target that you can measure, try an idea, get feedback, and then improve. Each time you meet a target, grow your ambitions by adding another "requirement" to the system. Do all of this in close consultation with parents and other "customers". This sounds "agile" in the agile software sense, but in a way it's just the scientific method at work. It will be slow, but it could make progress, whereas trying to wrestle the whole beast to the ground at once seems foolhardy and a waste of time. Starting from scratch in a new school (greenfield development) also seems a lot easier than working in an existing school (legacy development).


Posted by Eugene Wallingford | Permalink | Categories: Teaching and Learning

July 10, 2007 5:53 PM

Thinking Ahead to OOPSLA

OOPSLA 2007 logo

I haven't written much in anticipation of OOPSLA 2007, but not because I haven't been thinking about it. In years when I have had a role in content, such as the 2004 and 2005 Educators' Symposia or even the 2006 tutorials track, I have been excited to be deep in the ideas of a particular part of OOPSLA. This year I have blogged just once, about the December planning meeting. (I did write once from the spring planning meeting, but about a movie.) My work this year for the conference has been in an administrative role, as communications chair, which has focused on sessions and schedules and some web content. To be honest, I haven't done a very good job so far, but that is a subject for another post. For now, let's just say that I have not been a very good Mediator nor a good Facade.

I am excited about some of the new things we are doing this year to get the word out about the conference. At the top of this list is a podcast. Podcasts have been around for a while, but they are just now becoming a part of the promotional engine for many organizations. We figured that hearing about some of the cool stuff that will happen at OOPSLA this year would complement what you can read on the web. So we arranged to have two outfits, Software Engineering Radio and DimSumThinking, co-produce a series of episodes on some of the hot topics covered at this year's conference.

Our first episode, on a workshop titled No Silver Bullet: A Retrospective on the Essence and Accidents of Software Engineering, organized by Dennis Mancl, Steven Fraser, and Bill Opdyke, is on-line at the OOPSLA 2007 Podcast page. Stop by, give it a listen, and subscribe to the podcast's feed so that you don't miss any of the upcoming episodes. (We are available in iTunes, too.) We plan to roll new interviews out every 7-10 days for the next few months. Next up is a discussion of a Scala tutorial with Martin Odersky, due out on July 16.

If you would like to read a bit more about the conference, check out conference chair Richard Gabriel's The (Unofficial) How To Get Around OOPSLA Guide, and especially his ooPSLA Impressions. As I've written a few times, there really isn't another conference like OOPSLA. Richard's impressions page does a good job of communicating just that, mostly in the words of people who've been to OOPSLA and seen it for themselves.

While putting together some of his podcast episodes, Daniel Steinberg of DimSumThinking ran into something different than usual: fun.

I've done three interviews for the oopsla podcast -- every interviewee has used the same word to describe OOPSLA: fun. I just thought that was notable -- I do a lot of this sort of thing and that's not generally a word that comes up to describe conferences.

And that fun comes on top of the ideas and the people you will encounter, both of which will stretch you. We can't offer a Turing Award winner every year, but you may not notice with all the intellectual ferment. (And this year, we can offer John McCarthy as an invited speaker...)


Posted by Eugene Wallingford | Permalink | Categories: Computing, Patterns, Software Development

July 09, 2007 7:28 PM

Preparing for Fall Compilers Course (Almost)

Summer is more than half over. I had planned by now to be deep in planning for my fall compilers course, but other work has kept me busy. I have to admit also to suffering from a bout of intellectual hooky. Summer is a good time for a little of that.

Compilers is a great course, in so many ways. It is one of the few courses of an undergraduate's curriculum in which students live long enough with code that is big enough to come face-to-face with technical debt. Design matters, implementation matters, efficiency matters. Refactoring matters. The course brings together all of the strands of the curriculum into a real project that requires knowledge from the metal up to the abstraction of language.

In the last few weeks I've run across several comments from professional developers extolling the virtues of taking a compilers course, and often lamenting that too many schools no longer require compilers for graduation. We are one such school; compilers is a project option competing with several others. Most of the others are perceived to be easier, and they probably are. But few of the others offer anything close to the sort of capstone experience that compilers does.

In a comment on this post titled Three Things I Learned About Software in College, Robert Blum writes:

Building OSes and building compilers are the two ends of the spectrum of applied CS. Learn about both, and you'll be able to solve most problems coming your way.

I agree, but a compilers course can also illuminate theoretical CS in ways that other courses don't. Many of the neat ideas that undergrads learn in an intro theory course show up in the first half of compilers, where we examine grammars and build scanners and parsers.
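
As a small illustration of that connection, here is a toy scanner in Ruby, driven by the same regular expressions students meet in the theory course. The token set is invented for this example and not taken from any particular project language.

    # a toy scanner -- token names and patterns invented for illustration
    TOKEN_PATTERNS = [
      [:number, /\A\d+/],
      [:ident,  /\A[A-Za-z_]\w*/],
      [:op,     /\A[+\-*\/=]/],
      [:lparen, /\A\(/],
      [:rparen, /\A\)/]
    ]

    def scan(source)
      tokens = []
      until (source = source.lstrip).empty?
        kind, text = nil, nil
        TOKEN_PATTERNS.each do |k, re|
          if (match = source[re])
            kind, text = k, match
            break
          end
        end
        raise "unexpected character: #{source[0, 1]}" unless text
        tokens << [kind, text]
        source = source[text.length..-1]
      end
      tokens
    end

    p scan("x = (3 + y1) * 42")
    # => [[:ident, "x"], [:op, "="], [:lparen, "("], [:number, "3"], [:op, "+"],
    #     [:ident, "y1"], [:rparen, ")"], [:op, "*"], [:number, "42"]]

Each pattern describes a regular language, so the scanner is in effect a small collection of finite-state machines tried in order -- theory made concrete in a handful of lines.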

My favorite recent piece on compilers is ultra-cool Steve Yegge's Rich Programmer Food. You have to read this one -- promise me! -- but I will tease you with Yegge's own precis:

Gentle, yet insistent executive summary: If you don't know how compilers work, then you don't know how computers work. If you're not 100% sure whether you know how compilers work, then you don't know how they work.

Yegge's article is long but well worth the read.

As for my particular course, I face many of the same issues I faced the last time I taught it: choosing a good textbook, choosing a good source language, and deciding whether to use a parser generator for the main project are three big ones. If you have any suggestions, I'd love to hear from you. I'd like to build a larger, more complete compiler for my students to have as a reading example, and writing one would be the most fun I could have getting ready for the course.

I do think that I'll pay more explicit attention in class to refactoring and other practical ideas for writing a big program this semester. The extreme-agile idea of 15 compilers in 15 days, or something similar, still holds me in fascination, but at this point I'm in love more with the idea than with the execution, because I'm not sure I'm ready to do it well. And if I can't do it well, I don't want to do it at all. This course is too valuable -- and too much fun -- to risk on an experiment in whose outcome I don't have enough confidence.

I'm also as excited about teaching the course as the last time I taught it. On a real project of this depth and breadth, students have a chance to take what they have learned to a new level:

How lasts about five years, but why is forever.

(I first saw that line in Jeff Atwood's article Why Is Forever. I'm not sure I believe that understanding why is a right-brain attribute, but I do believe in the spirit of this assertion.)


Posted by Eugene Wallingford | Permalink | Categories: Computing, Software Development, Teaching and Learning

July 07, 2007 7:10 AM

Quick Hits, Saturday Edition

Don't believe me about computational processes occurring in nature? Check out Clocking In And Out Of Gene Expression, via The Geomblog. Some genes turn other genes on and off. To mark time, they maintain a clock by adding ubiquitin molecules to a chain; when the chain reaches a length of five, the protein is destroyed. That sounds a lot like a Turing machine using a separate tape as a counter...
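
If you will forgive a toy rendering of the analogy, here it is in Ruby. This is not a model of the biology, just the counter idea: the protein "ticks" by adding a ubiquitin to its chain and is marked for destruction once the chain reaches length five, the threshold mentioned in the article.

    # a unary counter in protein's clothing -- purely illustrative
    class ClockProtein
      DESTRUCTION_LENGTH = 5

      def initialize
        @ubiquitin_chain = []
      end

      def tick
        @ubiquitin_chain << :ubiquitin unless destroyed?
      end

      def destroyed?
        @ubiquitin_chain.length >= DESTRUCTION_LENGTH
      end
    end

    protein = ClockProtein.new
    4.times { protein.tick }
    puts protein.destroyed?   # => false
    protein.tick
    puts protein.destroyed?   # => true

The chain is a counter written in unary, which is just how the simplest Turing machine constructions keep count on a spare tape.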

Becky Hirta learned something that should make all of us feel either better or worse: basic math skills are weak everywhere. We can feel better because it's not just our students, or we can feel worse because almost no one can do basic math. One need not be able to solve linear equations to learn how to write most software, but an inability to learn how to solve linear equations doesn't bode well.

Hey, I made the latest Carnival of the Agilists. The Carnival dubs itself "the bi-weekly blogroll that takes a sideways slice through the agile blogosphere". It's a nice way for me to find links to articles on agile software development that I might otherwise have missed.


Posted by Eugene Wallingford | Permalink | Categories: Computing, Software Development, Teaching and Learning

July 06, 2007 12:00 PM

Independence Day Reading

While watching a little Wimbledon on television the other day, I read a couple of items on my daunting pile of papers to read. Among the first was the on-line excerpt of Scott Rosenberg's recent book Dreaming in Code, about the struggles of the team developing the open-source "interpersonal information manager" Chandler. In the introduction, Rosenberg says about software:

Never in history have we depended so completely on a product that so few know how to make well.

In context, the emphasis is on "know how to make well". He was speaking of the software engineering problem, the knowledge of how to make software. But my thoughts turned immediately to what is perhaps a more important problem: "so few". The world depends so much on computer software, yet we can't seem to attract students to study computer science or (for those who think that is unnecessary) to want to learn how to make software. Many young people think that the idea of making software is beyond them -- too hard. But most don't think much about programming at all. Software is mundane, too ordinary. Where is the excitement?

Later I was reading Space for improvement, on "re-engaging the public with the greatest adventure of our time": space travel. Now space travel still seems pretty cool to me, one of the great scientific exercises of my lifetime, but polls show that most Americans, while liking the idea in the abstract, don't care all that much about space travel when it comes down to small matters of paying the bills.

The focus of the article is on the shortcomings of how NASA and others communicate the value and excitement of space travel to the public. It identifies three problems. The first is an "unrelenting positiveness" in PR, which may keep risk-averse legislators happy but gives the public the impression that space travel is routine. The second is a lack of essential information from Mission Control during launches and flights, information that would allow the PR folks to tell a more grounded story. But author Bob Mahoney thinks that the third and most important obstacle in the past has been a presumption that has run through NASA PR for many years:

The presumption? That the public can't understand or won't appreciate the deeper technical issues of spaceflight. By assuming a disinterested and unintelligent public, PAO [NASA's Public Affairs Office] and the mainstream media have missed out completely on letting the public share in the true drama inherent in space exploration.

If you presume a disinterested and unintelligent public, then you won't -- can't -- tell an authentic story. And in the case of space travel, the authentic story, replete with scientific details and human drama, might well snag the attention of the voting public.

I can't claim that software development is "the greatest adventure of our time", but I think we in computing can learn a couple of things from reading this article. First, tell people the straight story. Trust them to understand their world and to care about things that matter. If the public needs to know more math and science to understand, teach them more. Second, I think that we should tell this story not just to adults, but to our children. The only way we can expect students to want to learn how to make software or to learn computer science is if they understand why these things matter and if they believe that they can contribute. Children are in some ways a tougher audience. They still have big imaginations and so are looking for dreams that can match their imagination, and they are pretty savvy when it comes to recognizing counterfeit stories.


Posted by Eugene Wallingford | Permalink | Categories: Computing, Teaching and Learning

July 05, 2007 3:59 PM

Language Science

My previous entry discussed the scientific study of language. It occurred to me that the scientists who object to computing as science might also object to the study of language as "real" science. Certainly, the biological and neurological foundation of language seems to meet the criterion that a phenomenon occur in the natural world in order to be in the realm of science. But what of syntax and semantics? Even if we computer scientists speak of "natural language", do they occur naturally in the world?

According to Chomsky, the answer is 'yes': People are born with a universal grammar implemented in their brains, which makes it possible for us to learn what otherwise might not be learnable in a tractable way. This is a claim that is open to empirical study, at least in principle, though the story of the Pirahã shows how difficult it is to support or falsify claims in this area. (And according to the article I cited last time, there are some who are concerned with how engaged Chomsky himself is in empirical verification these days.)

But isn't natural language man-made? We grow and evolve our vocabulary with some degree of intention. Syntax changes over time, too, and at least part of that change is intentional.

Not everyone thinks of language as a designed artifact. Consider this quote from an essay at Design Observer:

According to a new book by linguist David Harrison, "Languages can package knowledge in radically different ways, thus facilitating different ways of conceptualizing, naming, and discussing the world." If languages package information, can they be considered design objects?

I'm a computer scientist, so my first thought was, "Well, duh." Had I written a separate blog entry on that piece, I would have titled it "Yet Another Example of Why Non-Computer Scientists Should Study CS". Maybe, though, I am betrayed by living in a world of artificial -- designed -- languages. (Well, if you can call a language like Perl "designed".)

I'm not a linguist, so it is hard for me to say to what extent primitive languages such as Pirahã, especially pre-literate ones, are designed and to what extent they flow out of the machinery of the human brain.

Interestingly, a few years ago we had a multidisciplinary language science seminar at my university. You can see the web page from our last full semester of activity. It was a diverse crew, ranging from folks interested in the biological and neurological side of language, to a cognitive psychologist, to linguists, on up to teachers of modern languages and literature profs interested in the use of language in poetry and prose. And there was one person interested in artificial languages and their interplay with natural language -- me. The group is dead now, but I found it quite valuable both as a teacher and as someone interested in programming languages. I miss our biweekly sessions.


Posted by Eugene Wallingford | Permalink | Categories: General

July 04, 2007 9:19 PM

Recursion, Natural Language, and Culture

M.C. Escher, 'Hands'

It's not often that one can be reading a popular magazine, even one aimed at an educated audience, and run across a serious discussion of recursion. Thanks to my friend Joe Bergin for pointing me to The Interpreter, a recent article in The New Yorker by Reporter at Large John Colapinto. The article tells the story of the Pirahã, a native tribe in Brazil with a most peculiar culture and a correspondingly unusual language. You see, while we often observe recursion in nature, one of the places we expect to see it is in natural language -- in the embedding of sentence-like structures within other sentences. But the Pirahã don't use recursion in their language, because their world view makes abstract structure meaningless.

Though recursion plays a critical role in Colapinto's article, the article is not really about recursion; it is about a possible crack in Chomsky's universal grammar hypothesis about language, and some of the personalities and technical issues involved. Dan Everett is a linguist who has been working with the Pirahã since the 1970s. He wrote his doctoral dissertation on how the Pirahã language fit into the Chomskyan framework, but upon further study and a new insight now "believes that Pirahã undermines Noam Chomsky's idea of a universal grammar." As you might imagine, Chomsky and his disciples disagree.

What little I learned about the Pirahã language makes me wonder what it must be like to learn it -- or try to. On the one hand, it's a small language, with only eight consonants and three vowels. But that's just the beginning of its simplicity:

The Pirahã, Everett wrote, have no numbers, no fixed color terms, no perfect tense, no deep memory, no tradition of art or drawing, and no words for 'all', 'each', 'every', 'most', or 'few' -- terms of quantification believed by some linguists to be among the common building blocks of human cognition. Everett's most explosive claim, however, was that Pirahã displays no evidence of recursion, a linguistic operation that consists of inserting one phrase inside another of the same type..."

This language makes Scheme look like Ada! Of course, Scheme is built on recursion, and Everett's claim that the Pirahã don't use it -- can't, culturally -- is what rankles many linguists the most. Chomsky has built the most widely accepted model of language understanding on the premise that "To come to know a human language would be an extraordinary intellectual achievement for a creature not specifically designed to accomplish this task." And at the center of this model is "the capacity to generate unlimited meaning by placing one thought inside another", what Chomsky calls "the infinite use of finite means", after the nineteenth-century German linguist Wilhelm von Humboldt.

According to Everett, however, the Pirahã do not use recursion to insert phrases one inside another. Instead, they state thoughts in discrete units. When I asked Everett if the Pirahã could say, in their language, "I saw the dog that was down by the river get bitten by a snake", he said, "No. They would have to say, 'I saw the dog. The dog was at the beach. A snake bit the dog.'" Everett explained that because the Pirahã accept as real only that which they observe, their speech consists only of direct assertions ("The dog was at the beach."), and he maintains that embedded clauses ("that was down by the river") are not assertions but supporting, quantifying, or qualifying information -- in other words, abstractions.

The notion of recursion as abstraction is natural to us programmers, because inductive definitions are by their nature abstractions over the sets they describe. But I had never before thought of recursion as a form of qualification. When presented in the form of an English sentence such as "I saw the dog that was down by the river get bitten by a snake", it makes perfect sense. I'll need to think about whether it makes sense in a useful way for my programs.
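
To make the contrast concrete for programmers, here is a small sketch -- my own toy representation, not anything Everett or Colapinto propose -- of the dog-and-snake report packaged two ways: as a nested structure, the way English embeds clauses, and as a flat sequence of discrete assertions, the way Everett describes Pirahã speech.

    # Toy illustration: the same report packaged two ways.

    # English-style: clauses nest inside clauses, recursively.
    # ("located-at the river" qualifies "the dog" from inside the report.)
    nested_report = (
        "saw",
        "I",
        ("bitten-by",
         ("the dog", ("located-at", "the river")),   # embedded qualifying clause
         "a snake"),
    )

    # Pirahã-style, as Everett describes it: a flat sequence of direct
    # assertions, with no assertion embedded inside another.
    flat_report = [
        ("saw", "I", "the dog"),
        ("located-at", "the dog", "the beach"),
        ("bit", "a snake", "the dog"),
    ]

    def depth(form):
        """How many levels of structure sit inside structure."""
        if isinstance(form, (tuple, list)):
            return 1 + max((depth(item) for item in form), default=0)
        return 0

    print(depth(nested_report))                 # 4 -- meaning built by embedding
    print(max(depth(a) for a in flat_report))   # 1 -- each assertion stands alone

The nested version carries its qualifications inside the report itself; the flat version leaves the listener to stitch the assertions together.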

Here is one more extended passage from the article, which discusses an idea from Herb Simon that appears in the latest edition of the Simon book I mentioned in my last entry:

In his article, Everett argued that recursion is primarily a cognitive, not a linguistic, trait. He cited an influential 1962 article, "The Architecture of Complexity," by Herbert Simon, a Nobel Prize-winning economist, cognitive psychologist, and computer scientist, who asserted that embedding entities within like entities (in a recursive tree structure of the type central to Chomskyan linguistics) is simply how people naturally organize information. ... "Simon argues that this is essential to the way humans organize information and is found in all human intelligence systems. If Simon is correct, there doesn't need to be any specific linguistic principle for this because it's just general cognition." Or, as Everett sometimes likes to put it: "The ability to put thoughts inside other thoughts is just the way humans are, because we're smarter than other species." Everett says that the Pirahã have this cognitive trait but that it is absent from their syntax because of cultural constraints.

This seems to be a crux in Everett's disagreement with the Chomsky school: Is it sufficient -- even possible -- for the Pirahã to have recursion as a cognitive trait but not as a linguistic trait? For many armchair linguists, the idea that language and thought go hand in hand is almost an axiom. I can certainly think recursively even when my programming language doesn't let me speak recursively. Maybe the Pirahã have an ability to organize their understanding of the world using nested structures (as Simon says they must) without having the syntactic tools for expressing such structures linguistically (as Everett says they cannot).
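
For what it's worth, here is a minimal sketch of that distinction in programming terms, in Python only for convenience: the same recursive idea -- summing the numbers in a nested structure -- stated once as a recursive function and once with an explicit stack and a loop. The thought is recursive either way; the second version just never speaks recursively.

    # One recursive idea, spoken two ways.

    def total_recursive(tree):
        """Sum all numbers in a nested list, stated recursively."""
        if isinstance(tree, list):
            return sum(total_recursive(child) for child in tree)
        return tree

    def total_iterative(tree):
        """The same idea, with the recursion managed by an explicit stack."""
        total, stack = 0, [tree]
        while stack:
            node = stack.pop()
            if isinstance(node, list):
                stack.extend(node)   # defer the children, as the call stack would
            else:
                total += node
        return total

    nested = [1, [2, [3, 4]], 5]
    print(total_recursive(nested))   # 15
    print(total_iterative(nested))   # 15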

I found this to be a neat article for more reasons than just its references to recursion. Here are a few other ideas that occurred to me as I read.

Science and Faith Experience

At UNICAMP (State Univ. of Campinas in Brazil), in the fall of 1978, Everett discovered Chomsky's theories. "For me, it was another conversion experience," he said.

Everett's first conversion experience happened when he became a Christian in the late 1960s, after meeting his wife-to-be. It was this first conversion that led him to learn linguistics in the first place and work with the Pirahã under the auspices of the Summer Institute of Linguistics, an evangelical organization. He eventually fell away from his faith but remained a linguist.

Some scientists might balk at Everett likening his discovery of Chomsky to a religious conversion, but I think he is right on the mark. I know what it's like as a scholar to come upon a new model for viewing the world and to feel as if I am seeing a new world entirely. In grad school, for me it was the generic task theory of Chandrasekaran, which changed how I viewed knowledge systems and foreshadowed my later move into the area of software patterns.

It was interesting to read, even briefly, the perspective of someone who had undergone both a religious conversion and a scientific conversion -- and fallen out of both, as his personal experiences created doubts for which his faiths had no answers.

Science as Objective

Obvious, right? No. Everett has reinterpreted data from his doctoral dissertation now that he has shaken the hold of his Chomskyan conversion. Defenders of Chomsky's theory say that Everett's current conclusions are in error, but he now says that

Chomsky's theory necessarily colored his [original] data-gathering and analysis. "'Descriptive work' apart from theory does not exist. We ask the questions that our theories tell us to ask."

Yes. When you want to build generic task models of intelligent behavior, you see the outlines of generic tasks wherever you look. You can tell yourself to remain skeptical, and to use an objective eye, but the mind has its own eye.

Science is a descriptive exercise, and how we think shapes what we see and how we describe it. Do you see objects or higher-order procedures when you look at a problem to describe or when you conceive a solution? Our brains are remarkable pattern machines and can fall under the spell of a pattern easily. This is true even in a benign or helpful sense, such as what I experienced after reading an article by Bruce Schneier and seeing his ideas in so many places for a week or so. My first post in that thread is here, and the theme spread throughout this blog for at least two weeks thereafter.

Intellectually Intimidating Characters

Everett occupied an office next to Chomsky's; he found the famed professor brilliant but withering. "Whenever you try out a theory on someone, there's always some question that you hope they won't ask," Everett said. "That was always the first thing Chomsky would ask."

That is not a fun feeling, and not the best way for a great mind to help other minds grow -- unless used sparingly and skillfully. I've been lucky that most of the intensely bright people I've met have had the respect, politeness -- and skill -- to help me come along on the journey, rather than torch me with their brilliance at every opportunity.

Culture Driving Language

One of the key lessons we see from the Pirahã is that culture is a powerful force, especially a culture so long isolated from the world and now so closely held. But you can see this phenomenon even in relatively short-term educational and professional habits such as programming styles. I see it when I teach OO to imperative programmers, and when I teach functional programming to imperative OO programmers. (In a functional programming course, the procedural and OO programmers realize just how similar their imperative roots are!) Their culture has trained them not to use the muscles in their minds that rely on the new concepts. But those muscles are there; we just need to exercise them, and build them up so they are as strong as the well-practiced muscles.
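
Here is a toy illustration of those muscles, in Python only because it happens to support both styles: the same small computation written the way imperative habits suggest and the way functional habits suggest. Neither is wrong; each exercises a different set of mental muscles.

    # Sum of the squares of the even numbers, in two programming cultures.

    numbers = [1, 2, 3, 4, 5, 6]

    # Imperative habit: mutate an accumulator inside a loop.
    total = 0
    for n in numbers:
        if n % 2 == 0:
            total += n * n

    # Functional habit: describe the result as a transformation of the data.
    total_fp = sum(n * n for n in numbers if n % 2 == 0)

    assert total == total_fp == 56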

What Is Really Universal?

Hollywood blockbusters, apparently:

That evening, Everett invited the Pirahã to come to his home to watch a movie: Peter Jackson's remake of "King Kong". (Everett had discovered that the tribe loves movies that feature animals.) After nightfall, to the grinding sound of the generator, a crowd of thirty or so Pirahã assembled on benches and on the wooden floor of Everett's [house]. Everett had made popcorn, which he distributed in a large bowl. Then he started the movie, clicking ahead to the scene in which Naomi Watts, reprising Fay Wray's role, is offered as a sacrifice by the tribal people of an unspecified South Seas island. The Pirahã shouted with delight, fear, laughter, and surprise -- and when Kong himself arrived, smashing through the palm trees, pandemonium ensued. Small children, who had been sitting close to the screen, jumped up and scurried into their mothers' laps; the adults laughed and yelled at the screen.

The Pirahã enjoy movies even when the technological setting is outside their direct experience -- and for them, what is outside their direct experience seems outside their imagination. The story reaches home. From their comments, the Pirahã seemed to understand King Kong in much the way we did, and they picked up on the cultural cues that did fit into their experience. A good story can do that.

Eugene sez: The Interpreter is worth a read.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

July 03, 2007 8:02 AM

Computational Processes in Nature

Back in February I wrote a speculative piece on computer science as science, in which I considered ways in which CS is a scientific discipline. As a graduate student, I grew up on Herb Simon's book The Sciences of the Artificial, so the notion of studying phenomena that are contingent on the intentions of their designer has long seemed a worthwhile goal. But "real" scientists, those who study the natural world, have never been persuaded by Simon's arguments. For them, science deals with natural phenomena; programs and more comprehensive computer systems are man-made; and so computing as a science of the artificial is not a real science.

As the natural sciences develop, though, we have begun to learn something that computer scientists have sensed for a long time: computational processes occur in the natural world. Peter Denning has taken dead aim at this observation in his new essay Computing is a Natural Science, published in this month's Communications of the ACM. He opens with his take-home claim:

Computing is now a natural science. Computation and information processes have been discovered in the deep structures of many fields. Computation was present long before computers were invented, but the remarkable shift to this realization occurred only in the last decade. We have lived for so long in the belief that computing is a science of the artificial, it may be difficult to accept that many scientists now see information processes abundantly in nature.

Denning supports his claim with examples from biology and physics, in which natural computations now form the basis of much of the science, in the form of DNA and quantum electrodynamics, respectively. In many ways, the realization that computation lies in the deep structures of many natural systems is a vindication of Norbert Wiener, who in the 1940s and 1950s wrote of information as a fundamental element of systems that communicate and interact, whether man-made or living.

The article continues with a discussion of some of the principles discovered and explored by computer scientists, all of which seem to have correlates in natural phenomena. The table in his paper, available on his web site as a PDF file, lists a few key ones, such as intractability, compression, locality, bottlenecks, and hierarchical aggregation. That these principles help us to understand man-made systems better and to design better systems should not distract us from their role in helping us to understand computations in physical, chemical, and biological systems.

There is some talk on my campus of forming a "school of technology" into which the Department of Computer Science might move. From my department's perspective, this idea offers some potential benefits and some potential costs. One of the potential costs that concerns me is that being in a school of technology might stigmatize the discipline as merely a department of applications. This might well limit people's perception of the department and its mission, and that could limit the opportunities available to us. At a time when we are working so hard to help folks understand the scientific core of computing, I'm not keen on making a move that seems to undermine our case.

Explicating the science of computing has been Denning's professional interest for many years now. You can read more about his work on his Great Principles of Computing web site. There is also an interview with Denning that discusses some of his reasons for pursuing this line of inquiry in the latest issue of ACM's Ubiquity magazine. As Denning points out there, having computing as a common language for talking about the phenomena we observe in natural systems is an important step in helping the sciences that study those systems advance. That we can use the same language to describe designed systems -- as well as large interactive systems that haven't been designed in much detail, such as economies -- only makes computer science all the more worthy of our efforts.


Posted by Eugene Wallingford | Permalink | Categories: Computing