February 28, 2010 7:33 PM

Another Running Month in Review

A month still hardly deserves its own review, but two months in a row of steady running and mileage on the rise leave me feeling a little tired but back in something of a groove. February began with a week of light recovery at 24 miles followed by weeks of 29, 32.3, and 32.6 miles. The last two weeks I bumped my Sunday long run to 10 miles and felt good afterwards. My body is even beginning to respond with some speed.

The big change this month has been a return outdoors. I tired of running indoors and tried a few more 15-degree days, then a few 10s, and finally even a couple of days down around the 0-degree mark -- and found I enjoyed them! Two years of difficult running off and on had killed my hardiness. Now I feel it coming back strong. It probably sounds insane when I say I enjoyed a run this week at 4 degrees below zero, but I did!

I'm not quite back into the mid-30s for weekly mileage, which was my ambitious hope at the end of last month, but I can imagine it happening soon. The weather is slowly, imperceptibly, turning to spring, which means more and more good mornings to run outdoors.

This week I plan a few adjustments to my weekly schedule in anticipation of a few days on the road in Milwaukee for SIGCSE 2010 followed by a few days in St. Louis on spring break with my family. Running in other places invigorates me, and I look forward to some new sights as spring comes alive.


Posted by Eugene Wallingford | Permalink | Categories: Running

February 27, 2010 9:40 AM

Increasing Duplication to Eliminate Duplication

In a recent entry, I discussed how Kent Beck's design advice "Exploit Symmetries" improves our ability to refactor code. When we take two things that are similar and separate them into parts that are either identical or different, we maximize the repetition in our code. This enables us to factor the repetition in the sharpest way.

Here is a simple example from Scheme. Suppose we are writing a procedure to walk down a vector and count how many of its items satisfy a particular condition. Along the way, we might produce code something like this:

  (define count-occurrences-of-test-at
    (lambda (test? von position)
      (if (>= position (vector-length von))
          0
          (if (test? (vector-ref von position))
              (+ 1 (count-occurrences-of-test-at test? von (+ position 1)))
              (count-occurrences-of-test-at test? von (+ position 1))))))

Our procedure duplicates code, but it may not be obvious at first how to factor it away. The problem is that the duplication occurs nested in a larger expression at two different levels: one occurrence is the entire else clause of the inner if expression, while the other is buried inside the computation in the then clause.

As a first step, we can increase the symmetry in our code by rewriting the else clause as a similar computation:

  (define count-occurrences-of-test-at
    (lambda (test? von position)
      (if (>= position (vector-length von))
          0
          (if (test? (vector-ref von position))
              (+ 1 (count-occurrences-of-test-at test? von (+ position 1)))
              (+ 0 (count-occurrences-of-test-at test? von (+ position 1)))))))

Now we see that the duplicated recursive call appears in the same position in both arms, and the if expression itself is about choosing whether to add 1 or 0 to the value of the recursive call. We can use one of the distributive laws of code -- (if c (+ 1 e) (+ 0 e)) is equivalent to (+ (if c 1 0) e) -- to factor out the repetition:

  (define count-occurrences-of-test-at
    (lambda (test? von position)
      (if (>= position (vector-length von))
          0
          (+ (if (test? (vector-ref von position)) 1 0)
             (count-occurrences-of-test-at test? von (+ position 1))))))

Voilà! No more duplication. By increasing the duplication in our code, we create a more symmetric relation, and the symmetry enables us to eliminate the duplication entirely. I have never thought of myself as thinking in terms of symmetry when I write code, but I do think in terms of regularity. My mind prefers code with regular form, both on the surface and in the programming structures I use. Oftentimes, my thorniest refactoring problems arise when I let irregular structure sneak into my code. When some duplication or complexity makes me uneasy, I find that taking the preparatory step of increasing regularity can help me see a way to simpler code.

Of course, we might approach this problem differently altogether, from a functional point of view, and write a different sort of solution:

  (define count-occurrences-of-test
    (lambda (test? von)
      ;; map each item to 1 or 0, then sum the results
      (apply + (vector->list
                 (vector-map (lambda (item) (if (test? item) 1 0)) von)))))

This eliminates another form of duplication that we find across many procedures that operate on vectors: the common structure of code that simulates a loop over a vector. That is yet another form of regularity that we can exploit, once we begin to recognize it. Then, when we write new code, we can look for ways to express the solution in terms of the functional mapping pattern, so that we don't have to roll our own loop by hand. When imperative programmers begin to see this form of symmetry, they are on their way to becoming functional programmers. (It is also the kind of symmetry at the heart of MapReduce.)
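
As a quick check -- this transcript is my own, not from the original entry -- both versions give the same answer when we count the even numbers in a small vector:

  > (count-occurrences-of-test-at even? (vector 1 4 2 5 3 6 4 7) 0)
  4
  > (count-occurrences-of-test even? (vector 1 4 2 5 3 6 4 7))
  4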


Posted by Eugene Wallingford | Permalink | Categories: Patterns, Software Development

February 23, 2010 6:34 PM

Strength Later, Weakness Now

One of the things I've always liked about Scheme is that the procedures I write are indistinguishable from the primitive procedures of the language. All procedures are applied prefix and named using the same binding mechanism. This similarity extends beyond simple procedure definitions to other features such as variable arity and special forms, which aren't procedures at all.

All this makes Scheme an especially handy tool for creating domain-specific languages. I can create a whole new language for the domain I'm working in, say, language processing or financial accounting, simply by defining the procedures and forms that embody the domain's concepts. When I write programs using these procedures, they mix seamlessly with Scheme's primitive procedures and forms to create a nearly friction-less writing -- and reading -- experience.
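
For instance -- a sketch of my own, not an example from the course -- two definitions for a toy accounting vocabulary are bound and applied exactly like the primitives they sit beside:

    (define debit
      (lambda (balance amount)
        (- balance amount)))

    (define credit
      (lambda (balance amount)
        (+ balance amount)))

Calls to them then mix freely with built-in procedures such as map:

    > (credit (debit 100 25) 40)
    115
    > (map (lambda (balance) (credit balance 50)) (list 100 250 75))
    (150 300 125)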

I've always touted this feature of the language to my students, who usually learn Scheme as a tool for writing interpreters and other simple language processors in our programming languages course. However, over the last couple of offerings of the course, I have begun to realize that this strength is a weakness for many beginners.

At the outset of their journey into functional programming, students have so many things to think about (language syntax, new programming principles, idioms to accomplish tasks they understand well, and so on) that a lack of friction may actually hurt them. They have trouble finding the boundary between the language Scheme and the language we build on top of it. For example, when we first began to implement recursive procedures, I gave students a procedure sequence:

    (define sequence
      (lambda (start finish)
        (if (> start finish)
            '()
            (cons start (sequence (+ start 1) finish)))))

as a simple example and as a tool for testing code that processes lists of numbers. For weeks afterwards, I had students e-mailing me because they wrote code that referred to sequence: "Why am I getting errors?" Well, because it's not a primitive and you don't define it. "But we use it in class all the time." Well, I define it each time I need it. "Oh, I guess I forgot." This sequence has re-played itself many times already this semester, with several other pieces of code.
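
To be concrete -- these calls are my own illustration, not part of a course handout -- sequence simply builds a list of consecutive integers, which then feeds whatever list-processing procedure we happen to be testing:

    > (sequence 1 5)
    (1 2 3 4 5)
    > (sequence 4 1)
    ()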

I suppose you could say the students ought to be more disciplined in their definition and use of code, or that I ought to do a better job of teaching them how to write and reuse code, or simply that the students need better memories. One or all of these may be true, but I think there is more happening here. A language with no friction between primitives and user-defined code places one more burden on students who are already juggling a lot of new ideas in their minds.

As students become more experienced with the language and the new style of programming, they have a better chance to appreciate the value of seamless layers of code as they grow a program. They begin to notice that the lack of friction helps them, as they don't have to slow down to handle special cases, or to change how a piece of code works when they decide to add an abstraction between the code and its clients. Whereas before the lack of friction slowed them down while they pondered boundaries and looked up primitives, now it helps them move faster.

This phenomenon is not peculiar to functional programming or Scheme. I think it is also true of OOP and Java. Back when Java was first becoming part of CS 1 at many schools, many of my colleagues objected to the use of home-grown packages and libraries. The primary argument was that students would not be learning to write "real Java" (which is so wrong!) and that their code would not be as portable (which is true). In retrospect, I think a more compelling case can be made that the use of home-grown packages might interfere with students cognitively as they learn the language and the boundaries around it. There are elements of this in both of those objections, but I now think of it as the crux of the issue.

This phenomenon is also not a necessary condition to learning functional programming or Scheme. A number of schools use Scheme in their first-year courses and do just fine. Perhaps instructors at these schools have figured out ways to avoid this problem entirely, or perhaps they rely on some disciplines to help students work around it. I may need to learn something from them.

I have noticed my students having this difficulty the last two times we've offered this course and not nearly as much before, so perhaps our students are changing. On the other hand, maybe this reflects an evolution in my own recognition and understanding of the issue. In either case, my job is pretty much the same: find ways to help students succeed. All suggestions are welcome.


Posted by Eugene Wallingford | Permalink | Categories: Computing, Teaching and Learning

February 22, 2010 6:56 PM

I'll Do Homework, But Only for a Grade

In the locker room one morning last week, I overheard two students talking about their course work. One of the guys eventually got himself pretty worked up while talking about one professor, who apparently gives tough exams, and exclaimed, "We worked two and a half hours on that homework, and he didn't even grade it!"

Yesterday, I was sitting with my daughters while they did some school work. One of them casually commented, "We all stopped putting too much effort into Teacher Smith's homework when we figured out s/he never grades it."

I know my daughter's situation up close and so know what she means. She tends to go well beyond the call of duty on her assignments, in large part because she is in search of a perfect grade. With time an exceedingly scarce and valuable resource, she faces an optimization problem. It turns out she can put less effort into her homework than she ordinarily does and still do fine on her tests. With no prospect of a higher grade to pull her along, she is willing to economize a bit on the assignment and spend her time elsewhere.

Maybe that's just what the college student meant when I overheard him that morning. Perhaps he is actually overinvesting in his homework relative to its value for learning, because he seeks a higher grade on the homework component of the course. That's not the impression I got from my unintentional eavesdropping, though. I left the locker room thinking that he sees value in doing the homework only if it is graded, only if it contributes to his course grade.

This is the impression too many college students give their instructors. If it doesn't "count", why do it?

Maybe I was like that in college, too. I know that grades were important to me, and as a double-major trying to graduate in four years after spending much of my freshman year majoring in something else, I was taking a heavy class load. Time was at a premium. Who has time or energy to do things that don't count?

Even if I did not understand it then, I know now that the practice itself is an invaluable part of how I learned. Without lots of practice writing code, we don't even learn the surface details of our language, such as syntax and idiom, let alone reach a deep understanding of solving problems. In the more practical terms expressed by the student in the locker room: without lots of practice, most every exam will seem too long, too difficult, and too harshly graded. That prof of his has found a way to get the student to invest time in learning. What a gift!

We cannot let the professor off the hook, though. If s/he tells the class that the assignment will be graded, or even simply gives students the impression that it "counts for something", then not to grade the assignment is a deception. Such a tactic is justified only in exceptional circumstances, and not only on moral grounds. As Teacher Smith has surely learned by now, students are smart enough not to fall for a lie too many times before they direct their energies elsewhere.

In general, though, homework is a gift: a chance to learn under controlled conditions. I'm pretty sure that students don't see it this way. This reminds me of a conversation I had with my colleague Mark Jacobson a couple of weeks ago. We were discussing the relative abundance -- and paucity -- of a grateful attitude among faculty in general. He recalled that, in his study of the martial arts, he had encountered two words for "thank you". One, suki, from the Japanese martial arts, means to see events in our lives as opportunity or gift. Another, sugohasameeda, comes from Korean Tae Kwon Do and is used to say, "Thank you for the workout".

Suki and sugohasameeda are related. One expresses suki when things do not go the way we wish, such as when we have a flat tire or when a work assignment doesn't match our desires. One expresses sugohasameeda in gratitude to one's teacher for the challenging and painful work that makes us grow, such as workouts that demand our all. I see elements of both in the homework we are assigned. Sugohasameeda seems to be spot-on with homework, yet suki comes into play, too, in cases such as the instructor going counter to our expectations and not grading an assignment.

I do not find myself in the role of student as much these days, but I can see so many ways that I can improve my own sense of gratefulness. I seem to live sugohasameeda more naturally these days, though incompletely. I am far too often lacking in suki. My daily life would be more peaceful and whole if I could recognize the opportunity to grow through undesired events with gratitude.

One final recollection. Soon after taking my current job, I met an older gentleman who had worked in a factory for 30+ years. He asked where I worked, and when I said, "I teach at the university", he said, "That beats workin' for a livin'". My first reaction was akin to amused indignation. He obviously didn't know anything about what my job was like.

Later I realized that there was a yin to that yang. I am grateful to have a career in which I can do so many cool things, explore ideas whenever they call to me, and work with students who learn and help me to learn -- to do things I love every day. So, yeah, I guess my job does beat "workin' for a livin'".

I just wish more students would take their homework seriously.

~~~~

My colleague Mark also managed to connect his ideas about gratitude from the martial arts to the 23rd Psalm of the Christian Bible. The green pastures to which it famously refers are not about having everything exactly as I want it, but seeing all things as they are -- as gift, as opportunity, as suki. I continue to learn from him.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

February 21, 2010 7:32 PM

Typos and Uncertainty

Last week, a student asked me why one of my examples in the programming assignment said this:

     > (insertion-sort > '(1 4 2 5 3 6 4 7))
     (1 2 3 4 4 5 6 7)

Shouldn't the answer be (7 6 5 4 4 3 2 1)? Or was there something that he didn't understand?
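
For context, the insertion sort in this assignment takes the comparison procedure as its first argument. A minimal sketch of my own -- not the official solution handed out in class -- might look like this:

     (define insertion-sort
       (lambda (compare? lst)
         (if (null? lst)
             '()
             (insert compare? (car lst) (insertion-sort compare? (cdr lst))))))

     (define insert
       (lambda (compare? new sorted)
         (cond ((null? sorted) (list new))
               ((compare? new (car sorted)) (cons new sorted))
               (else (cons (car sorted) (insert compare? new (cdr sorted)))))))

With < each item is inserted before the first element it is less than, giving an ascending result; with > the same code produces the descending list the student expected.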

At first, I thought his question was a polite way of pointing out my typo, but as we talked it became clear that he felt some real uncertainty about the answer.

How? Surely it was obvious that sorting the list in descending order should produce the second list. This seemed all the more obvious because the previous example on the same page sorted the same input with < and had the correct output! What could he be thinking?

Sometimes, I ask myself such things rhetorically, out of wonder or frustration. Over the years, though, I have learned to take these questions seriously, because they are the key to understanding what's going on with my students.

In the Case of the Backward Bracket, I recognize a lesson I have learned before: Even the smallest error or inconsistency can create major doubt in the mind of a novice. Originally, I wrote "fragile novice", and it's true that some novices are more fragile than others. But to be a beginner is by its nature to be fragile. Our minds are still learning to see, so when we see something that is wrong, we are willing to believe that something is wrong with us.

Learning functional programming and Scheme puts my students in the position of facing problems they feel confident solving -- if only they could use their favorite programming style and language. Right now, though, they struggle with a new way of seeing, and this creates uncertainty for them. It makes them tentative, maybe even scared. They see my typo and wonder what it is they don't get.

This lesson means at least two things to me as a teacher. First, I need to be extra careful to weed out mistakes in what I tell, show, and ask them. I want them to be able to focus as much as possible on the necessary complexity in the problems, not on distractions that result from my fingerfehlers. Second, I need to keep my eyes open for moments when this kind of uncertainty and fear begin to dominate my students' minds, whether in class or working on their own. By recognizing the situation early enough and intervening, carefully, I may be able to help them stay on a productive path toward understanding.


Posted by Eugene Wallingford | Permalink | Categories: Teaching and Learning

February 19, 2010 4:33 PM

Thoughts on How to Design

Smalltalk Best Practice Patterns is one of my favorite programming books. In it, Kent Beck records some of the design and implementation patterns that he has observed in Smalltalk systems over the years. What makes the book most valuable, though, is that most of its patterns apply beyond Smalltalk, to other object-oriented programming languages and even to non-OO languages. That's because it is really about how we think about and design our programs.


Kent's latest design endeavor is what he calls the Responsive Design Project, and he reports some of his thinking so far in a recent blog entry. The entry includes a number of short patterns of design. These are not patterns that show up in designs, but patterns of thinking that help give rise to designs. Being hip deep in teaching functional design style to students whose experience is imperative programming, I find that many of Kent's lessons hit home for me.

Inside or Outside. Change the interface or the implementation but not both at the same time.


This is a classic that bears repeating. It's tempting to start making big changes to even a small piece of code, but whenever we conflate changes to interface and implementation, we risk creating more complexity than our small brains can manage.

Isolate Changes. Before making a change, isolate the area to be changed from the rest of the system so you can change an entire element at a time. For example, before changing a part of a procedure, extract the area to be changed into its own procedure. Make the change, then inline the changed sub-procedure if appropriate.

This one is beyond the ken of most intermediate-level students, so it doesn't show up in my courses often. When we confine a change to a small box, we control the complexity of the change and the range of its effect. This technique can even be used to control software evolution at a higher level.
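
Here is a tiny Scheme sketch of my own to illustrate the move; total-price and item-price are hypothetical names, not anything from Kent's essay. Suppose we want to change how each item contributes to a total:

  ;; Before: the part we want to change is buried inside the map.
  ;; (item-price is assumed to be defined elsewhere.)
  (define total-price
    (lambda (items)
      (apply + (map item-price items))))

  ;; Step 1: extract the area to be changed into its own procedure.
  (define contribution
    (lambda (item)
      (item-price item)))

  (define total-price
    (lambda (items)
      (apply + (map contribution items))))

  ;; Step 2: make the change inside the isolated procedure, say a 10% discount.
  (define contribution
    (lambda (item)
      (* 9/10 (item-price item))))

  ;; Step 3: inline contribution back into total-price, if appropriate.

The discount itself is beside the point; what matters is that the edit touches only contribution, while total-price and its callers keep their shape the whole time.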

Exploit Symmetries. Divide similar elements into identical parts and different parts.

Many beginning programmers find it counter-intuitive that the best way to eliminate duplicated code is to increase the level of duplication, maximizing the repetition to the point that it can be factored out in the cleanest and sharpest way. I have come to know this pattern well but sense that its value runs much deeper than the uses to which I've put it thus far.

Then there is the seemingly contradictory pair

Cultivate Confidence. Master your tools. Your feeling of mastery will improve your cognition.

and

Cultivate Humility. Try tools or techniques you aren't comfortable with. Being aware of your limitations will improve your effectiveness.

Of course, the practices themselves aren't contradictory at all, though the notion that one can be confident and humble at the same time might seem to be. But even that holds no contradiction, because it's all about the edge between mastery and learning. I often talk about these patterns in my programming classes, if only in the hope that a student who is already starting to sense the tension between hubris and humility will know that it's okay to walk the line.

Finally, my nominee for best new pattern name:

Both. Faced with design alternatives without a clear winner, do it every way. Just coding each alternative for an hour is more productive than arguing for days about which is better in theory, and a lot more satisfying.

You may have heard the adage, "Listen to your code", which is often attributed to Kent or to Ward Cunningham. This pattern goes one step beyond. Create the code that can tell you what you need to hear. Time spent talking about what code might do is often much less productive than simply writing the code and finding out directly.

Early in his essay, Kent expresses the lesson that summarizes much of what follows as

Our illusion of control over software design is a dangerous conceit best abandoned.

He says this lesson "disturbs and excites me". I guess I'm not too disturbed by this notion, because feeling out of control when working in a new domain or style or language has become routine for me. I often feel as if I'm stumbling around in the dark while a program grows into just what it needs to be, and then I see it. Then I feel like I'm in control. I knew where I was going all the time.

In my role as a teacher, this pattern holds great danger. It is easy after the fact to walk into a classroom and expound at length about a design or a program as if I understood it all along. Students sometimes think that they should feel in control all the time, too, and when they don't they become discouraged or scared. But the controlled re-telling of the story is a sham; I was no more in control while writing my program than they will be when they write theirs.

What excites me most about this pattern is that it lifts a burden from our backs that we usually didn't know we were carrying. Once we get it, we are able to move on to the real business of writing and learning, confident in the knowledge that we'll feel out of control much of the time.


Posted by Eugene Wallingford | Permalink | Categories: Software Development, Teaching and Learning

February 15, 2010 10:13 PM

Luck, Embracing Failure, and State of Mind

This morning, Kevlin Henney tweeted:

Being lucky is not generally a matter of luck RT @gregyoung: http://is.gd/8qdIk

That shortened URL points to an article called Be Lucky: It's an Easy Skill to Learn. The author, psychologist Richard Wiseman, reports some of his findings after a decade studying people who consider themselves lucky or unlucky. Not surprisingly, one's state of mind has as much to do with perception of luck as any events in the observable world. He has identified three common threads that anyone can use to become luckier:

  • Trust your intuition.
  • Use variety and pseudorandom behavior to create opportunities for unexpected benefits.
  • See the positive in each event.

One of the things that struck me about this article was the connection of unlucky people to tension.

... unlucky people are generally much more tense than lucky people, and research has shown that anxiety disrupts people's ability to notice the unexpected.

Tension relates directly to all three of the above bullets. Tense people tend to overthink situations, looking to optimize some metric, and thus quash their gut instincts. They tend to seek routine as a way to minimize distraction and uncertainty, which cause them to miss opportunities. And their tension tends to cause them to see the negative in any event that does not match their desired optimum. Perhaps the key to luck is nothing more than relaxation!

When I think of times I feel unlucky -- and I must sheepishly admit that this happens all too often -- I can see the tension that underlies Wiseman's results. But for me this usually manifests itself as frustration. This thought, in turn, reminded me of a blog entry I wrote a year ago on embracing failure. In it, I considered Rich Pattis's observation about how hard computer science must feel to beginners, because it is a discipline learned almost wholly by failure. Not just occasional failure, but a steady stream of failures ranging from syntax errors to misunderstanding complex abstractions. Succeeding in CS requires a certain mindset, one that embraces, fights through, or otherwise copes with failure in a constructive way. Some of us embrace it with gusto, seeing failure as a challenge to surmount, not a comment on our value or skills.

I wonder now if there might be a connection between seeing oneself as lucky and embracing failure. Lucky people find the positive in negative events; successful programmers see valuable information in error messages and are empowered to succeed. Lucky people seek out variety and the opportunities it offers; successful programmers try out new techniques, patterns, and languages, not because they seek out failure but because they seek opportunities to learn. Lucky people respect their hunches; successful programmers have the hubris to believe they can see their way to a working program.

If relaxation is the key to removing tension, and removing tension is the key to being lucky, and being lucky is a lot like being a successful programmer, then perhaps the key to succeeding as a programmer is nothing more than relaxation! Yes, that's a stretch, but there is something there.

One last connection. There have been a couple of articles in the popular press recently about an increase in the prevalence of cheating, especially in CS courses. This has led to discussions of cheating in a number of places where CS faculty hang out. I imagine there is a close connection between feeling frustrated and tense and feeling like one needs to cheat to succeed. If we can lower the level of tension in our classrooms by lowering the level of frustration, there may be a way for us to stem the growing tide of students cheating. The broader the audience we have in any given classroom, the harder this is to achieve. But we do have tools available to us, including having our students work in domains that give more feedback, more visibly, sooner, and more frequently.

One of my favorite comments in all the on-line discussion of cheating in CS is Comment 1 to Mark Guzdial's blog entry, by Steve Tate:

About a decade ago I was chatting with some high school teachers when my university hosted a programming contest for high school kids. One teacher pointed out that her best CS students were those that also played either music or golf -- her theory was that they were used to tasks where you are really bad at first, but you persevere and overcome that. But you have to be able to accept that you'll really stink at it for a good long while.

This struck me as a neat way to make a connection between learning to program and learning music or sports. I had forgotten about the discussion of "meaningful failure" in my own entry... Tate explains the connection succinctly.

Whatever the connections among tension, fear of failure, cheating, and luck, we need to find ways to help students and novice developers learn how to take control of their own destiny -- even if it is only in helping them cultivate their own sense of good luck.


Posted by Eugene Wallingford | Permalink | Categories: Software Development, Teaching and Learning

February 11, 2010 5:40 PM

Creativity and the Boldness of Youth

While casting about Roy Behrens's blog recently, I came across a couple of entries that connected with my own experience. In one, Behrens discusses Arthur Koestler and his ideas about creativity. I enjoyed the entire essay, but one of its vignettes touched a special chord with me:

In 1971, as a graduate student at the Rhode Island School of Design, I finished a book manuscript in which I talked about art and design in relation to Koestler's ideas. I mailed the manuscript to his London home address, half expecting that it would be returned unopened. To my surprise, not only did he read it, he replied with a wonderfully generous note, accompanied by a jacket blurb.

My immediate reaction was "Wow!", followed almost imperceptibly by "I could never do such a thing." But then my unconscious called my bluff and reminded me that I had once done just such a thing.

Back in 2004, I chaired the Educators' Symposium at OOPSLA. As I first wrote back then, Alan Kay gave the keynote address at the Symposium. He also gave a talk at the main conference, his official Turing Award lecture. The Educators' Symposium was better, in large part because we gave Kay the time he needed to say what he wanted to say.

2004 was an eventful year for Kay, as he won not only the Turing Award but also the Draper Prize and Kyoto Prize. You might guess that Kay had agreed to give his Turing address at OOPSLA, given his seminal influence on OOP and the conference, and then consented to speak a second time to the educators.

But his first commitment to speak was to the Educators' Symposium. Why? At least in part because I called him on the phone and asked.

Why would an associate professor at a medium-sized regional public university dare to call the most recent Turing Award winner on the phone and ask him to speak at an event on the undercard of a conference? Your answer is probably as good as mine. I'll say one part boldness, one part hope, and one part naivete.

All I know is that I did call, hoping to leave a message with his secretary and hoping that he would later consider my request. Imagine my surprise when his secretary said, "He's across the hall just now; let me get him." My heart began to beat in triple time. He came to the phone, said hello, and we talked.

For me, it was a marvelous conversation, forty-five minutes chatting with a seminal thinker in my discipline, of whose work I am an unabashed fan. We discussed ideas that we share about computer science, computer science education, and universities. I was so caught up in our chat that I didn't consider just how lucky I was until we said our goodbyes. I hung up, and the improbability of what had just happened soaked in.

Why would someone of Kay's stature agree to speak at a second-tier event before he had even been contacted to speak at the main event? Even more, why would he share so much time talking to me? There are plenty of reasons. The first that comes to mind is most important: many of the most accomplished people in computer science are generous beyond my ken. This is true in most disciplines, I am sure, but I have experienced it firsthand many times in CS. I think Kay genuinely wanted to help us. He was certainly willing to talk to me at some length about my hopes for the symposium and the role he could play.

I doubt that this was enough to attract him, though. The conference venue being Vancouver helped a lot; Kay loves Vancouver. The opportunity also to deliver his Turing Award lecture at OOPSLA surely helped, too. But I think the second major reason was his longstanding interest in education. Kay has spent much of his career working toward a more authentic kind of education for our children, and he has particular concerns with the state of CS education in our universities. He probably saw the Educators' Symposium as an opportunity to incite revolution among teachers on the front-line, to encourage CS educators to seek a higher purpose than merely teaching the language du jour and exposing students to a kind of computing calcified since the 1970s. I certainly made that opportunity a part of my pitch.

For whatever reason, I called, and Kay graciously agreed to speak. The result was a most excellent keynote address at the symposium. Sadly, his talk did not incite a revolt. It did plant seeds in the minds of at least a few of us, so there is hope yet. Kay's encouragement, both in conversation and in his talk, inspires me to this day.

Behrens expressed his own exhilaration "to be encouraged by an author whose books [he] had once been required to read". I am in awe not only that Behrens had the courage to send his manuscript to Koestler but also that he and Koestler continued to correspond by post for over a decade. My correspondence with Kay since 2004 has been only occasional, but even that is more than I could have hoped for as an undergrad, when I first heard of Smalltalk, or, as a grad student, when I first felt the power of Kay's vision by living inside a Smalltalk image for months at a time.

I have long hesitated to tell this story in public, for fear that crazed readers of my blog would deluge his phone line with innumerable requests to speak at conferences, workshops, and private parties. (You know who you are...) Please don't do that. But for a few moments once, I felt compelled to make that call. I was fortunate. I was also a recipient of Kay's generosity. I'm glad I did something I never would do.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

February 10, 2010 6:43 PM

Recent Connections: Narrative and Computation

Reader Clint Wrede sent me a link to A Calculus of Writing, Applied to a Classic, another article about author Zachary Mason and his novel The Lost Books of the Odyssey. I mentioned Mason and his book in a recent entry, Diverse Thinking, Narrative, Journalism, and Software, which considered the effect of Mason's CS background on his approach to narrative. In "A Calculus of Writing", Mason makes that connection explicit:

"What I'm interested in scientifically is understanding thought with computational precision," he explained. "I mean, the romantic idea that poetry comes from this deep inarticulable ur-stuff is a nice idea, but I think it is essentially false. I think the mind is articulable and the heart probably knowable. Unless you're a mystic and believe in a soul, which I don't, you really don't have any other conclusion you can reach besides that the mind is literally a computer."

I'm not certain whether the mind is or is not a computer, but I share Mason's interest in "understanding thought with computational precision". Whether poets and novelists create through a computational process or not, building ever-more faithful computational models of what they do interests people like Mason and me. It also seems potentially valuable as a way to understand what it means to be human, a goal scientists and humanists share.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

February 09, 2010 7:13 PM

Programs as Art

In my previous entry I mentioned colleague and graphic designer Roy Behrens. My first blog articles featuring Behrens mentioned or centered on material from Ballast Quarterly Review, a quarterly commonplace book he began publishing in the mid-1980s. I was excited to learn recently that Behrens is beginning to reproduce material from BALLAST on-line in his new blog, The Poetry of Sight. He has already posted both entries I've seen before and entries new to me. This is a wonderful resource for someone who likes to make connections between art, design, psychology, literature, and just about any other creative discipline.

All this is prelude to my recent reading of the entry Art as Brain Surgery, which recounts a passage from an interview with film theorist Ray Carney that begins with the idea behind the entry's title:

The greatest works [of art] do brain surgery on their viewers. They subtly reprogram our nervous systems. They make us notice and feel things we wouldn't otherwise.

I read this passage as a potential challenge to an idea I had explored previously: programming is art. That article looked at the metaphor from poet William Stafford's perspectives on art. Carney looks at art from a different position, one which places a different set of demands on the metaphor. For example,

One of the principal ways [great works of art] do this is through the strangeness of their styles. Style creates special ways of knowing. ... Artistic style induces unconventional states of awareness and sensitivity.

This seems to contradict a connection to programming, a creative discipline in which we seem to prefer -- at least in our code -- convention over individuality, recognizability over novelty, and the obvious over the subtle. When we have to dig into an unfamiliar mass of legacy code, the last thing we want is "unconventional states of awareness and sensitivity". We want to grok the code, and now, so that we can extend and modify it effectively and confidently.

Yet I think we find beauty in programming styles that extend our way of thinking about the world. Many OO and procedural programmers encounter functional programming and see it as beautiful, in part because it does just what Carney says great art does:

It freshens and quickens our responses. It limbers up our perceptions and teaches us new possibilities of feeling and understanding.

The ambitious among us then try to take these new possibilities back to our other programming styles and imbue the code we write there with them. We turn our new perceptions into the conventions and patterns that make our code recognizable and obvious. But this also makes our code subtle in its own way, bearing a foreign beauty and sense of understanding in the way it solves the work-a-day problems found in the program's specs. The best software patterns do this: they not only solve a problem but teach us that it can be solved at all, often by bringing an outside influence to our programs.

Perhaps it's just me, but there is something poetic in how I experience the emotional peaks of writing programs. I feel what Carney says:

The greatest works of art are not alternatives to or escapes from life, but enactments of what it feels like to live at the highest pitch of awareness -- at a level of awareness most people seldom reach in their ordinary lives.

The first Lisp interpreter, which taught us that code is data. VisiCalc, which brought program as spreading activation to our desktops, building on AI work in the 1950s and 1960s. Smalltalk. Unix. Quicksort and mergesort, implemented in thousands of programs in thousands of ways, always different but always perceptibly the same. Programmers experience these ideas and programs at the highest pitch of awareness. I walk away from the computer some days hoping that other people get to feel the way I am feeling, alive with fire deep in my bones.

The greatest works are inspired examples of some of the most exciting, demanding routes that can be taken through experience. They bring us back to life.

These days, more than ever, I relish the way even reading a good program can bring me back to life. That's to say nothing of writing one.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

February 08, 2010 2:19 PM

Calling C.P. Snow

A lot has been going on at my university the last few months to keep me busy. With significant budget cuts and a long-term change in state funding of higher education, we are beginning to see changes across campus. Last month our provost announced a move that will affect me and my department intimately: the merger of the College of Natural Sciences (CNS) with the College of Humanities and Fine Arts (CHFA). Computer Science will go from being one department among seven science/math/technology departments to a member of a college twice as large and at least that much more diverse.

The merger came as a surprise to many of us on campus, so there is a lot to do beyond simply combining operating budgets and clerical staffs. I expect everything to work out fine in the end. Colleges of arts and sciences are a common way to organize universities like ours, both of the existing colleges contain good people and many good programs, and we have a dean especially well-suited to lead the merger. Still, the next eighteen months promise to deliver a lot of uncertainty and change. Change is hard, and the resulting college will be something quite different from who we are now. Part of me is excited... There are some immediate benefits for me and CS, as we will now be in the same college with colleagues such as Roy Behrens, and with the departments with whom we have been working on a new major in digital media. Multidisciplinary work is easier to do at the university when the collaborating departments fall under the same administrative umbrella.

We are only getting started on working toward the merger, but I've already noticed some interesting differences between the two faculties. For example, at the first meeting of the department heads in my college with a faculty leader from the other college, we learned that the humanities folks have been working together on a college-wide theme of internationalization. As part of this, they have been reading a common book and participating in reading groups to discuss it.

This is a neat idea. The book provides a common ground for their faculty and helps them to work together toward a common goal. The discussion unifies their college. Together, they also create a backdrop against which many of them can do their scholarly work, share ideas, and collaborate.

Now that we are on the way to becoming one college, the humanities faculty have invited us to join them in the conversation. This is a gracious offer, which creates an opportunity for us all to unify as a single faculty. The particular theme for this year, internationalization, is one that has relevance in both the humanities and the sciences. Many faculty in the sciences are deeply invested in issues of globalization. For this reason, there may well be some cross-college discussion that results, and this interaction will likely promote the merger of the colleges.

That said, I think the act of choosing a common book to read and discuss in groups may reflect a difference between the colleges, one that is either a matter of culture or a matter of practice. For the humanities folks, this kind of discussion is a first-order activity. It is what they do within and across their disciplines. For the science folks, this kind of discussion is a second-order activity. There are common areas of work across the science departments, such as bioinformatics, but even then the folks in biology, chemistry, computer science, and math are all working on their own problems in their own ways. A general discussion of issues in bioinformatics is viewed by most scientists as talk about bioinformatics, not as doing bioinformatics.

I know that this is a superficial analysis and that the reality holds more shades of gray than sharp lines. At its best, it is a simplification. Still, I found it interesting to see and hear how science faculty responded to the offer.

Over the longer term, it will be interesting to see how the merger of colleges affects what we in the sciences do, and how we do it. I expect something positive will happen overall, as we come into more frequent contact with people who think a little differently than we do. I also expect the day-to-day lives of most science faculty (and humanities faculty as well) will go on as they are now. Letterhead will change, the names of secretaries will change, but scholarly lives will go on.

The changes will be fun. Getting out of ruts is good for the brain.


Posted by Eugene Wallingford | Permalink | Categories: General

February 01, 2010 10:22 PM

A Blogging Milestone -- 10**3

Thanks to Allyn Bauer for noticing that my recent entry on The Evolution of the Textbook was the 1000th posting to this blog. Five and a half years is a long time. I am glad I'm still at it. The last few months have been difficult on the non-teaching and non-CS side of my life, and I feel like my inspiration to write about interesting ideas has been stop-and-go. But I am glad I'm still at it.

While thinking about my 1000th post, I decided to take a look back at the other digits:

Number 100 refers to #99, a review of Kary Mullis's Dancing Naked in the Mind Field. That article, in combination with the entries on algorithmic patterns and textbooks, seems pretty consistent with how I think about this blog: ideas encountered, considered, and applied. Looking at numbers 1 and 10 led me to read over the monthly archive for July 2004. Revisiting old thoughts evokes -- or creates -- a strange sort of memory, one that I enjoy.

I hope that the next 1000 entries are as much fun to write.


Posted by Eugene Wallingford | Permalink | Categories: General