March 14, 2024 12:37 PM

Gene Expression

Someone sent me this image, from a slide deck they ran across somewhere:

A slide labeled 'Gene Expression'. The main image is a casual shot of actor Gene Wilder, labeled 'One Gene'. There are three side images of Wilder as iconic characters he played in 'Willy Wonka & the Chocolate Factory', 'Young Frankenstein', and 'Blazing Saddles'. There are arrows from the main image to the three side images, labeled 'Many Outcomes'.

I don't know what to do with it other than to say this:

As a person named 'Eugene' and an admirer of Mr. Wilder's work, I smile every time I see it. That's a clever way to reinforce the idea of gene expression by analogy, using actors and roles.

When I teach OOP and FP, I'm always looking for simple analogies like this from the non-programming world to reinforce ideas that we are learning about in class. My OOP repertoire is pretty deep. As I teach functional programming each spring, I'm still looking for new FP analogies all the time.

~~~~~

Note: I don't know the original source of this image. If you know who created the slide, please let me know via email, Mastodon, or Twitter (all linked in the sidebar). I would love to credit the creator.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

January 27, 2024 7:10 PM

Today in "It's not the objects; it's the messages"

Alan Kay is fond of saying that object-oriented programming is not about the objects; it's about the messages. He also looks to the biological world for models of how to think about and write computer programs.

This morning I read two things on the exercise bike that brought these ideas to mind, one from the animal kingdom and one from the human sphere.

First was a surprising little article on how an invasive ant species is making it harder for Kenyan lions to hunt zebras, with elephants playing a pivotal role in the story, too. One of the scientists behind the study said:

"We often talk about conservation in the context of species. But it's the interactions which are the glue that holds the entire system together."

It's not just the animals. It's the interactions.

Then came @jessitron reflecting on what it means to be "the best":

And then I remembered: people are people through other people. Meaning comes from between us, not within us.

It's not just the people. It's the interactions.

Both articles highlighted that we are usually better served by thinking about interactions within systems, and not simply the components of the system. That way lies a more reliable approach to building robust software. Alan Kay is probably somewhere nodding his head.

The ideas in Jessitron's piece fit nicely into the software analogy, but they mean even more in the world of people that she is reflecting on. It's easy for each of us to fall into the habit of walking around the world as an I and never quite feeling whole. Wholeness comes from connection to others. I occasionally have to remind myself to step back and see my day in terms of the students and faculty I've interacted with, whom I have helped and who have helped me.

It's not (just) the people. It's the interactions.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

January 21, 2024 8:28 AM

A Few Thoughts on How Criticism Affects People

The same idea popped up in three settings this week: a conversation with a colleague about student assessments, a book I am reading about women writers, and a blog post I read on the exercise bike one morning.

The blog post is by Ben Orlin at Math With Bad Drawings from a few months ago, about an occasional topic of this blog: being less wrong each day [ for example, 1 and 2 ]. This sentence hit close enough to home that I saved it for later.

We struggle to tolerate censure, even the censure of idiots. Our social instrument is strung so tight, the least disturbance leaves us resonating for days.

Perhaps this struck a chord because I'm currently reading A Room of One's Own, by Virginia Woolf. In one early chapter, Woolf considers the many reasons that few women wrote poetry, fiction, or even non-fiction before the 19th century. One is that they had so little time and energy free to do so. Another is that they didn't have space to work alone, a room of one's own. But even women who had those things had to face a third obstacle: criticism from men and women alike that women couldn't, or shouldn't, write.

Why not shrug off the criticism and soldier on? Woolf discusses just how hard that is for anyone to do. Even many of our greatest writers, including Tennyson and Keats, obsessed over every unkind word said about them or their work. Woolf concludes:

Literature is strewn with the wreckage of men who have minded beyond reason the opinions of others.

Orlin's post, titled Err, and err, and err again; but less, and less, and less, makes an analogy between the advance of scientific knowledge and an infinite series in mathematics. Any finite sum in the series is "wrong", but if we add one more term, it is less wrong than the previous sum. Every new term takes us closer to the perfect answer.

a black and white portrait of a bearded man
Source: Wikipedia, public domain

He then goes on to wonder whether the same is, or could be, true of our moral development. His inspiration is American psychologist and philosopher William James. I have mentioned James as an inspiration myself a few times in this blog, most explicitly in Pragmatism and the Scientific Spirit, where I quote him as saying that consciousness is "not a thing or a place, but a process".

Orlin connects his passage on how humans receive criticism to James's personal practice of trying to listen only to the judgment of ever more noble critics, even if we have to imagine them into being:

"All progress in the social Self," James says, "is the substitution of higher tribunals for lower."

If we hold ourselves to a higher, more noble standard, we can grow. When we reach the next plateau, we look for the next higher standard to shoot for. This is an optimistic strategy for living life: we are always imperfect, but we aspire to grow in knowledge and moral development by becoming a little less wrong each step of the way. To do so, we try to focus our attention on the opinions of those whose standard draws us higher.

Reading James almost always leaves my spirit lighter. After Orlin's post, I feel a need to read The Principles of Psychology in full.

These two threads on how people respond to criticism came together when I chatted with a colleague this week about criticism from students. Each semester, we receive student assessments of our courses, which include multiple-choice ratings as well as written comments. The numbers can be a jolt, but their effect is nothing like that of the written comments. Invariably, at least one student writes a negative response, often an unkind or ungenerous one.

I told my colleague that this is a recurring theme for almost every faculty member I have known: Twenty-nine students can say "this was a good course, and I really like the professor", but when one student writes something negative... that is the only comment we can think about.

The one bitter student in your assessments is probably not the ever more noble critic that James encourages you to focus on. But, yeah. Professors, like all people, are strung pretty tight when it comes to censure.

Fortunately, talking to others about the experience seems to help. And it may also remind us to be aware of how students respond to the things we say and do.

Anyway, I recommend both the Orlin blog post and Woolf's A Room of One's Own. The former is a quick read. The latter is a bit longer but a smooth read. Woolf writes well, and once my mind got on the book's wavelength, I found myself engaged deeply in her argument.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

July 04, 2023 11:55 AM

Time Out

Any man can call time out, but no man
can say how long the time out will be.
-- Books of Bokonon

I realized early last week that it had been a while since I blogged. June was a morass of administrative work, mostly summer orientation. Over the month, I had made notes for several potential posts, on my web dev course, on the latest book I was reading, but never found -- made -- time to write a full post. I figured this would be a light month, only a couple of short posts, if only I could squeeze another one in by Friday.

Then I saw that the date of my most recent post was May 26, with the request for ideas about the web course coming a week before.

I no longer trust my sense of time.

This blog has certainly become much quieter over the years, due in part to the kind and amount of work I do and in part to choices I make outside of work. I may even have gone a month between posts a few fallow times in the past. But June 2023 became my first calendar month with zero posts.

It's somewhat surprising that a summer month would be the first to shut me out. Summer is a time of no classes to teach, fewer student and faculty issues to deal with, and fewer distinct job duties. This occurrence is a testament to how much orientation occupies many of my summer days, and how at other times I just want to be AFK.

A real post or two are on their way, I promise -- a promise to myself, as well as to any of you who are missing my posts in your newsreader. In the meantime...

On the web dev course: thanks to everyone who sent thoughts! There were a few unanimous, or near unanimous, suggestions, such as to have students use VS Code. I am now learning it myself, and getting used to an IDE that autocompletes pairs such as "". My main prep activity up to this point has been watching David Humphrey's videos for WEB 222. I have been learning a little HTML and JavaScript and a lot of CSS and how these tools work together on the modern web. I'm also learning how to teach these topics, while thinking about the differences between my student audience and David's.

On the latest book: I'm currently reading Shop Class as Soulcraft, by Matthew Crawford. It came out in 2010 and, though several people recommended it to me then, I had never gotten around to it. This book is prompting so many ideas and thoughts that I'm constantly jotting down notes and thinking about how these ideas might affect my teaching and my practice as a programmer. I have a few short posts in mind based on the book, if only I commit time to flesh them out. Here are two passages, one short and one long, from my notes.

Fixing things may be a cure for narcissism.

Countless times since that day, a more experienced mechanic has pointed out to me something that was right in front of my face, but which I lacked the knowledge to see. It is an uncanny experience; the raw sensual data reaching my eye before and after are the same, but without the pertinent framework of meaning, the features in question are invisible. Once they have been pointed out, it seems impossible that I should not have seen them before.

Both strike a chord for me as I learn an area I know only the surface of. Learning changes us.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

May 26, 2023 12:37 PM

It's usually counterproductive to be doctrinaire

A short passage from Innocence, by Penelope Fitzgerald:

In 1927, when they moved me from Ustica to Milan, I was allowed to plant a few seeds of chicory, and when they came up I had to decide whether to follow Rousseau and leave them to grow by the light of nature, or whether to interfere in the name of knowledge and authority. What I wanted was a decent head of chicory. It's useless to be doctrinaire in such circumstances.

Sometimes, you just want a good head of chicory -- or a working program. Don't let philosophical ruminations get in the way. There will be time for reflection and evaluation later.

A few years ago, I picked up Fitzgerald's short novel The Bookshop while browsing the stacks at the public library. I enjoyed it despite the fact that (or perhaps because) it ended in a way that didn't create a false sense of satisfaction. Since then I have had Fitzgerald on my list of authors to explore more. I've read the first fifty pages or so of Innocence and quite like it.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

May 07, 2023 8:36 AM

"The Society for the Diffusion of Useful Knowledge"

I just started reading Joshua Kendall's The Man Who Made Lists, a story about Peter Mark Roget. Long before compiling his namesake thesaurus, Roget was a medical doctor with a local practice. After a family tragedy, though, he returned to teaching and became a science writer:

In the 1820s and 1830s, Roget would publish three hundred thousand words in the Encyclopaedia Britannica and also several lengthy review articles for the Society for the Diffusion of Useful Knowledge, the organization affiliated with the new University of London, which sought to enable the British working class to educate itself.

What a noble goal, enabling the working class to educate itself. And what a cool name: The Society for the Diffusion of Useful Knowledge!

For many years, my university has provided a series of talks for retirees, on topics from various departments on campus. This is a fine public service, though without the grand vision -- or the wonderful name -- of the Society for the Diffusion of Useful Knowledge. I suspect that most universities depend too much on tuition and cost-cutting these days to mount an ambitious effort to enable the working class to educate itself.

Mental illness ran in Roget's family. Kendall wonders if Roget's "lifelong desire to bring order to the world" -- through his lecturing, his writing, and ultimately his thesaurus, which attempted to classify every word and concept -- may have "insulated him from his turbulent emotions" and helped him stave off the depression that afflicted several of his family members.

Academics often have an obsessive connection with the disciplines they practice and study. Certainly that sort of focus can be bad for a person when taken too far. (Is it possible for an obsession not to go too far?) For me, though, the focus of studying something deeply, organizing its parts, and trying to communicate it to others through my courses and writing has always felt like a gift. The activity has healing properties all its own.

In any case, the name "The Society for the Diffusion of Useful Knowledge" made me smile. Reading has the power to heal, too.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

April 26, 2023 12:15 PM

Cultivating a Way of Seeing

Sometimes, I run across a sentence I wish I had written. Here are several paragraphs by Dan Bouk I would be proud to have written.

Museums offer a place to practice looking for and acknowledging beauty. This is, mostly, why I visit them.

As I wander from room to room, a pose diverts me, a glance attracts me, or a flash of color draws my eye. And then I look, and look, and look, and then move on.

Outside the museum, I find that this training sticks. I wander from subway car to platform, from park to city street, and a pose diverts me, a glance attracts me, or a flash of color draws my eye. People of no particular beauty reveal themselves to be beautiful. It feels as though I never left the museum, and now everything, all around me, is art.

This way of seeing persists, sometimes for days on end. It resonates with and reinforces my political commitment to the equal value of each of my neighbors. It vibrates with my belief in the divine spark, the image of God, that animates every person.

-- Dan Bouk, in On Walking to the Museum, Musing on Beauty and Safety


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

April 09, 2023 8:24 AM

It Was Just Revision

There are several revised approaches to "what's the deal with the ring?" presented in "The History of The Lord of the Rings", and, as you read through the drafts, the material just ... slowly gets better! Bit by bit, the familiar angles emerge. There seems not to have been any magic moment: no electric thought in the bathtub, circa 1931, that sent Tolkien rushing to find a pen.

It was just revision.

Then:

... if Tolkien can find his way to the One Ring in the middle of the fifth draft, so can I, and so can you.

-- Robin Sloan, How The Ring Got Good


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

March 31, 2023 3:57 PM

"I Just Need a Programmer, er, Writer"

This line from Chuck Wendig's post on AI tools and writing:

Hell, it's the thing every writer has heard from some jabroni who tells you, "I got this great idea, you write it, we'll split the money 50/50, boom."

... brought to mind one of my most-read blog posts ever, "I Just Need a Programmer":

As head of the Department of Computer Science at my university, I often receive e-mail and phone calls from people with The Next Great Idea. The phone calls can be quite entertaining! The caller is an eager entrepreneur, drunk on their idea to revolutionize the web, to replace Google, to top Facebook, or to change the face of business as we know it. ...

They just need a programmer. ...

The opening of that piece sounds a little harsh more than a decade later, but the basic premise holds. And, as Wendig notes, it holds beyond the software world. I even once wrote a short follow-up when accomplished TV writer Ken Levine commented on his blog about the same phenomenon in screenwriting.

Some misconceptions are evergreen.

Adding AI to the mix adds a new twist. I do think human execution in telling stories will still matter, though. I'm not yet convinced that the AI tools have the depth of network to replace human creativity.

However, maybe tools such as ChatGPT can be the programmer people need. A lot of folks are putting these tools to good use creating prototypes, and people who know how to program are using them effectively as accelerators. Execution will still matter, but these programs may be useful contributors on the path to a product.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

February 26, 2023 8:57 AM

"If I say no, are you going to quit?"

Poet Marvin Bell, in his contribution to the collection Writers on Writing:

The future belongs to the helpless. I am often presented that irresistible question asked by the beginning poet: "Do you think I am any good?" I have learned to reply with a question: "If I say no, are you going to quit?" Because life offers any of us many excuses to quit. If you are going to quit now, you are almost certainly going to quit later. But I have concluded that writers are people who you cannot stop from writing. They are helpless to stop it.

Reading that passage brought to mind Ted Gioia's recent essay on musicians who can't seem to retire. Even after accomplishing much, these artists never seem to want to stop doing their thing.

Just before starting Writers on Writing, I finished Kurt Vonnegut's Sucker's Portfolio, a slim 2013 volume of six stories and one essay not previously published. The book ends with an eighth piece: a short story unfinished at the time of Vonnegut's death. The story ends mid-sentence and, according to the book's editor, at the top of an unfinished typewritten page. In his mid-80s, Vonnegut was creating stories to the end.

I wouldn't mind if, when it's my time to go, folks find my laptop open to some fun little programming project I was working on for myself. Programming and writing are not everything there is to my life, but they bring me a measure of joy and satisfaction.

~~~~~

This week was a wonderful confluence of reading the Bell, Gioia, and Vonnegut pieces around the same time. So many connections... not least of which is that Bell and Vonnegut both taught at the Iowa Writers' Workshop.

There's also an odd connection between Vonnegut and the Gioia essay. Gioia used a quip attributed to the Roman epigrammist Martial:

Fortune gives too much to many, but enough to none.

That reminded me of a story Vonnegut told occasionally in his public talks. He and fellow author Joseph Heller were at a party hosted by a billionaire. Vonnegut asked Heller, "How does it make you feel to know that guy made more money yesterday than Catch-22 has made in all the years since it was published?" Heller answered, "I have something he'll never have: the knowledge that I have enough."

There's one final connection here, involving me. Marvin Bell was the keynote speaker at Camouflage: Art, Science & Popular Culture, an international conference organized by graphic design prof Roy Behrens at my university and held in April 2006. Participants really did come from all around the world, mostly artists or designers of some sort. Bell read a new poem of his and then spoke of:

the ways in which poetry is like camouflage, how it uses a common vocabulary but requires a second look in order to see what is there.

I gave a talk at the conference called NUMB3RS Meets The DaVinci Code: Information Masquerading as Art. (That title was more timely in 2006 than in 2023...) I presented steganography as a computational form of camouflage: not quite traditional concealment, not quite dazzle, but a form of dispersion uniquely available in the digital world. I recall that audience reaction to the talk was better than I feared when I proposed it to Roy. The computer science topic meshed nicely with the rest of the conference lineup, and the artists and writers who saw the talk seemed to appreciate the analogy. Anyway, lots of connections this week.
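For readers curious what "information masquerading as art" looks like in code, here is a toy least-significant-bit scheme in Python: message bytes are dispersed through the low-order bits of pixel values, where the change is visually imperceptible. (This is my own minimal sketch of the general idea, not the demo from the talk.)

```python
def hide(pixels, message):
    """Embed message bytes, LSB-first, in the low bits of pixel values."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("message too long for this image")
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit  # change only the lowest bit
    return stego

def reveal(pixels, length):
    """Recover `length` hidden bytes from the low bits of pixel values."""
    message = bytearray()
    for b in range(length):
        byte = 0
        for i in range(8):
            byte |= (pixels[b * 8 + i] & 1) << i
        message.append(byte)
    return bytes(message)
```

Each pixel value moves by at most 1, so the carrier image looks unchanged; the message is spread thinly across it rather than concealed in any one place, which is what makes it feel more like dazzle or dispersion than a hidden compartment.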


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

February 13, 2023 10:34 AM

The Exuberance of Bruce Springsteen in Concert

Bruce Springsteen, on why he puts on such an intense physical show:

So the display of exuberance is critical. "For an adult, the world is constantly trying to clamp down on itself," he says. "Routine, responsibility, decay of institutions, corruption: this is all the world closing in. Music, when it's really great, pries that shit back open and lets people back in, it lets light in, and air in, and energy in, and sends people home with that and sends me back to the hotel with it. People carry that with them sometimes for a very long period of time."

This passage is from a 2012 profile of the Boss, We Are Alive: Bruce Springsteen at Sixty-Two. A good read throughout.

Another comment from earlier in the piece has been rumbling around my head since I read it. Many older acts, especially those of Springsteen's vintage, have become essentially "their own cover bands", playing the oldies on repeat for nostalgic fans. The Boss, though, "refuses to be a mercenary curator of his past" and continually evolves as an artist. That's an inspiration I need right now.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

December 22, 2022 1:21 PM

The Ability to Share Partial Results Accelerated Modern Science

This passage is from Lewis Thomas's The Lives of a Cell, in the essay "On Societies as Organisms":

The system of communications used in science should provide a neat, workable model for studying mechanisms of information-building in human society. Ziman, in a recent "Nature" essay, points out, "the invention of a mechanism for the systematic publication of fragments of scientific work may well have been the key event in the history of modern science." He continues:
A regular journal carries from one research worker to another the various ... observations which are of common interest. ... A typical scientific paper has never pretended to be more than another little piece in a larger jigsaw -- not significant in itself but as an element in a grander scheme. The technique of soliciting many modest contributions to the store of human knowledge has been the secret of Western science since the seventeenth century, for it achieves a corporate, collective power that is far greater than any one individual can exert [italics mine].

In the 21st century, sites like arXiv lowered the barrier to publishing and reading the work of other scientists further. So did blogs, where scientists could post even smaller, fresher fragments of knowledge. Blogs also democratized science, by enabling scientists to explain results for a wider audience and at greater length than journals allow. Then came social media sites like Twitter, which made it even easier for laypeople and scientists in other disciplines to watch -- and participate in -- the conversation.

I realize that this blog post quotes an essay that quotes another essay. But I would never have seen the Ziman passage without reading Thomas. Perhaps you would not have seen the Thomas passage without reading this post? When I was in college, the primary way I learned about things I didn't read myself was by hearing about them from classmates. That mode of sharing puts a high premium on having the right kind of friends. Now, blogs and social media extend our reach. They help us share ideas and inspirations, as well as helping us to collaborate on science.

~~~~

I first mentioned The Lives of a Cell a couple of weeks ago, in If only ants watched Netflix.... This post may not be the last to cite the book. I find something quotable and worth further thought every few pages.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns, Teaching and Learning

November 27, 2022 9:38 AM

I Toot From the Command Line, Therefore I Am

Like so many people, I have been checking out new social media options in the face of Twitter's upheaval. None are ideal, but for now I have focused most of my attention on Mastodon, a federation of servers implemented using the ActivityPub protocol. Mastodon has an open API, which makes it attractive to programmers. I've had an account there for a few years (I like to grab username wallingf whenever a new service comes out) but, like so many people, hadn't really used it. Now feels more like the time.

On Friday, I spent a few minutes writing a small script that posts to my Mastodon account from the command line. I occasionally find that sort of thing useful, so the script has practical value. Really, though, I just wanted to play a bit in code and take a look at Mastodon's API.

Several people in my feed posted, boosted, and retweeted a link to this DEV Community article, which walks readers through the process of posting a status update using curl or Python. Everything worked exactly as advertised, with one small change: the Developers link that used to be in the bottom left corner of one's Mastodon home page is now a Development link on the Preferences page.

I've read a lot in the last few weeks about how the culture of Mastodon is different from the culture of Twitter. I'm trying to take seriously the different culture. One concrete example is the use of content warnings or spoiler alerts to hide content behind a brief phrase or tag. This seems like a really valuable practice, useful in a number of different contexts. At the very least, it feels like the Subject: line on an email message or a Usenet News post. So I looked up how to post content warnings with my command-line script. It was dead simple, all done in a few minutes.

There may be efficiency problems under the hood with how Mastodon requests work, or so I've read. The public interface seems well done, though.

I went with Python for my script, rather than curl. That fits better with most of my coding these days. It also makes it easier to grow the script later, if I want. bash is great for a few lines, but I don't like to live inside bash for very long. On any code longer than a few lines, I want to use a programming language. At a couple of dozen lines, my script was already long enough to merit a real language. I went mostly YAGNI this time around. There are no classes, just a sequence of statements to build the HTTP request from some constants (server name, authorization token) and command-line args (the post, the content warning). I did factor the server name and authorization token out of the longer strings and include an option to write the post via stdin. I want the flexibility of writing longer toots now, and I don't like magic constants. If I ever need to change servers or tokens, I never have to look past the first few lines of the file.
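A minimal sketch of such a script, using only Python's standard library, might look like the code below. The server URL and token are placeholders you would fill in yourself, but the /api/v1/statuses endpoint, the Bearer authorization header, and the status and spoiler_text fields are the ones Mastodon's API documents:

```python
import sys
import urllib.parse
import urllib.request

SERVER = "https://mastodon.social"  # placeholder: your instance's URL
TOKEN = "YOUR-ACCESS-TOKEN"         # placeholder: from Preferences > Development

def build_request(status, spoiler=None):
    """Build a POST request for Mastodon's /api/v1/statuses endpoint."""
    fields = {"status": status}
    if spoiler:
        # a non-empty spoiler_text hides the post behind a content warning
        fields["spoiler_text"] = spoiler
    return urllib.request.Request(
        SERVER + "/api/v1/statuses",
        data=urllib.parse.urlencode(fields).encode("utf-8"),
        headers={"Authorization": "Bearer " + TOKEN},
    )

def main():
    args = sys.argv[1:]
    if not args:
        print("usage: toot.py TEXT|- [CONTENT-WARNING]", file=sys.stderr)
        return
    # "-" means read the body of the post from stdin
    status = sys.stdin.read() if args[0] == "-" else args[0]
    spoiler = args[1] if len(args) > 1 else None
    with urllib.request.urlopen(build_request(status, spoiler)) as response:
        print(response.status)

if __name__ == "__main__":
    main()
```

Passing `-` as the text reads the post from stdin, and an optional second argument becomes the content warning; everything configurable sits in the first few lines of the file.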

As I briefly imagined morphing the small but growing script into a Toot class, I recalled a project I gave my Intermediate Computing students back in 2009 or so: implement the barebones framework of a Twitter-like application. That felt cutting edge back then, and most of the students really liked putting their new OO design and programming skills to use in a program that seemed to matter. It was good fun, and a great playground for so many of the ideas they had learned that semester.

All in all, this was a great way to spend a few minutes on a free afternoon. The API was simple to use, and the result is a usable new command. I probably should've been grading or doing some admin work, but profs need a break, too. I'm thankful to enjoy little programming projects so much.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

November 23, 2022 1:27 PM

I Can't Imagine...

I've been catching up on some items in my newsreader that went unread last summer while I rode my bike outdoors rather than inside. This passage from a blog post by Fred Wilson at AVC touched on a personal habit I've been working on:

I can't imagine an effective exec team that isn't in person together at least once a month.

I sometimes fall into a habit of saying or thinking "I can't imagine...". I'm trying to break that habit.

I don't mean to pick on Wilson, whose short posts I enjoy for insight into the world of venture capital. "I can't imagine" is a common trope in both spoken and written English. Some writers use it as a rhetorical device, not as a literal expression. Maybe he meant it that way, too.

For a while now, though, I've been trying to catch myself whenever I say or think "I can't imagine...". Usually my mind is simply being lazy, or too quick to judge how other people think or act.

It turns out that I usually can imagine, if I try. Trying to imagine how that thinking or behavior makes sense helps me see what other people might be thinking, what their assumptions or first principles are. Even when I end up remaining firm in my own way of thinking, trying to imagine usually puts me in a better position to work with the other person, or explain my own reasoning to them more effectively.

Trying to imagine can also give me insight into the limits of my own thinking. What assumptions am I making that lead me to have confidence in my position? Are those assumptions true? If yes, when might they not be true? If no, how do I need to update my thinking to align with reality?

When I hear someone say, "I can't imagine..." I often think of Russell and Norvig's textbook Artificial Intelligence: A Modern Approach, which I used for many years in class [1]. At the end of one of the early chapters, I think, they mention critics of artificial intelligence who can't imagine the field of AI ever accomplishing a particular goal. They respond cheekily, to the effect that this says less about AI than it says about the critics' lack of imagination. I don't think I'd ever seen a textbook dunk on anyone before, and as a young prof and open-minded AI researcher, I very much enjoyed that line [2].

Instead of saying "I can't imagine...", I am trying to imagine. I'm usually better off for the effort.

~~~~

[1] The Russell and Norvig text first came out in 1995. I wonder if the subtitle "A Modern Approach" is still accurate... Maybe theirs is now a classical approach!

[2] I'll have to track that passage down when I am back in my regular office and have access to my books. (We are in temporary digs this fall due to construction.) I wonder if AI has accomplished the criticized goal in the time since Russell and Norvig published their book. AI has reached heights in recent years that many critics in 1995 could not imagine. I certainly didn't imagine a computer program defeating a human expert at Go in my lifetime, let alone learning to do so almost from scratch! (I wrote about AlphaGo and its intersection with my ideas about AI a few times over the years: [ 01/2016 | 03/2016 | 05/2017 | 05/2018 ].)


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

October 02, 2022 9:13 AM

Twitter Replies That No One Asked For

I've been pretty quiet on Twitter lately. One reason is that my daily schedule has been so different for the last six or eight weeks: I've been going for bike rides with my wife at the end of the work day, which means I'm most likely to be reading Twitter late in the day. By then, many of the threads I see have played themselves out. Maybe I should jump in anyway? Even after more than a decade, I'm not sure I know how to Twitter properly.

Here are a few Twitter replies that no one asked for and that I chose not to send at the time.

• When people say, "That's the wrong question to ask", what they often seem to mean -- and should almost always say -- is, "That's not the question I would have asked."

• No, I will not send you a Google Calendar invite. I don't use Google Calendar. I don't even put every event into the calendaring system I *do* use.

• Yes, I will send you a Zoom link.

• COVID did not break me for working from home. Before the pandemic, I almost never worked at home during the regular work day. As a result, doing so felt strange when the pandemic hit us all so quickly. But I came first to appreciate and then to enjoy it, for many of the same reasons others enjoy it. (And I don't even have a long or onerous commute to campus!) Now, I try to work from home one day a week when schedules allow.

• COVID also did not break me for appreciating a quiet and relatively empty campus. Summer is still a great time to work on campus, when the pace is relaxed and most of the students who are on campus are doing research. Then again, so is fall, when students return to the university, and spring, when the sun returns to the world. The takeaway: It's usually a great time to be on campus.

I realize that some of these replies in absentia are effectively subtweets at a distance. All the more reason to post them here, where everyone who reads them has chosen to visit my blog, rather than in a Twitter thread filled with folks who wouldn't know me from Adam. They didn't ask for my snark.

I do stand by the first bullet as a general observation. Most of us -- me included! -- would do better to read everyone else's tweets and blog posts as generously as possible.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

September 18, 2022 9:37 AM

Dread and Hope

First, a relatively small-scale dread. From Jeff Jarvis in What Is Happening to TV?

I dread subscribing to Apple TV+, Disney+, Discovery+, ESPN+, and all the other pluses for fear of what it will take to cancel them.

I have not seen a lot of popular TV shows and movies in the last decade or two because I don't want to deal with the hassle of unsubscribing from some service. I have a list of movies to keep an eye out for in other places, should they ever appear, or to watch at their original homes, should my desire to see them ever outgrow my preference to avoid certain annoyances.

Next, a larger-scale source of hope, courtesy of Neel Krishnaswami in The Golden Age of PL Research:

One minor fact about separation logic. John C. Reynolds invented separation logic when he was 65. At the time that most people start thinking about retirement, he was making yet another giant contribution to the whole field!

I'm not thinking about retirement at all yet, but I am past my early days as a fresh, energetic, new assistant prof. It's good to be reminded every once in a while that the work we do at all stages of our careers can matter. I didn't make giant contributions when I was younger, and I'm not likely to make a giant contribution in the future. But I should strive to keep doing work that matters. Perhaps a small contribution remains to be made.

~~~~

This isn't much of a blog post, I know. I figure if I can get back into the habit of writing small thoughts down, perhaps I can get back to blogging more regularly. It's all about the habit. Wish me luck.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

August 29, 2022 4:44 PM

Radio Silence

a photo of a dolomite outcropping in Backbone State Park, Iowa

I did not intend for August to be radio silence on my blog and Twitter page. The summer just caught up with me, and my brain took care of itself, I guess, by turning off for a bit.

One bit of newness for the month was setting up a new Macbook Air. I finally placed my order on July 24. It was scheduled to arrive the week of August 10-17 but magically appeared on our doorstep on July 29. I've been meaning to write about the experience of setting up a new Mac laptop after working for seven years on a trusty Macbook Pro, but that post has been a victim of the August slowdown. I can say this: I pulled out the old Macbook Pro to watch Netflix on Saturday evening... and it felt *so* heavy. How quickly we adjust to new conditions and forget how lucky we were before.

Another pleasure in August was meeting up with Daniel Steinberg over Zoom. I remember back near the beginning of the pandemic Daniel said something on Twitter about getting together for a virtual coffee with friends and colleagues he could no longer visit. After far too long, I contacted him to set up a chat. We had a lot of catching up to do and ended up discussing teaching, writing, programming, and our families. It was one of my best hours for the month!

My wife and I took advantage of the last week before school started by going on a couple of hikes. We visited Backbone State Park for the first time and spent an entire day walking and enjoying scenery that most people don't associate with Iowa. The image at the top of this post comes from the park's namesake trail, which showcases some of the dolomite limestone cliffs left over from before the last glaciers. Here's another shot, of an entrance to a cave carved out by icy water that still flows beneath the surface:

a photo of the entrance to a dolomite cave in Backbone State Park, Iowa

Closer to home, we took a long morning to walk through Hartman Reserve, a county preserve. Walking for a couple of hours as the sun rises and watching the trees and wildlife come to light is a great way to shake some rust off the mind before school starts.

I had a tough time getting ready mentally for the idea of a new school year. This summer's work offered more burnout than refreshment. As the final week before classes wound down, I had to get serious about class prep -- and it freed me up a bit. Writing code, thinking about CS, and getting back into the classroom with students still energize me. This fall is my compilers course. I'm giving myself permission to make only a few targeted changes in the course plan this time around. I'm hoping that this lets me build some energy and momentum throughout the semester. I'll need that in order to be there for the students.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

August 15, 2022 12:49 PM

No Comment

a picture of the orchid in my office from April 2021

From the closing pages of The Orchid Thief, which I mentioned in my previous post:

"The thing about computers," Laroche said, "the thing that I like is that I'm immersed in it but it's not a living thing that's going to leave or die or something. I like having the minimum number of living things to worry about in my life."

Actually, I have two comments.

If Laroche had gotten into open source software, he might have found himself with the opposite problem: software that won't die. Programmers sometimes think, "I know, I'll design and implement my own programming language!" Veterans of the programming languages community always seem to advise: think twice. If you put something out there, other people will use it, and now you are stuck maintaining a package forever. The same can be said for open source software more generally. Oh, and did I mention it would be really great if you added this feature?

I like having plants in my home and office. They give me joy every day. They also tend to live a lot longer than some of my code. The hardy orchid featured above bloomed like clockwork twice a year for me for five and a half years. Eventually it needed more space than the pot in my office could give, so it's gone now. But I'm glad to have enjoyed it for all those years.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Software Development

July 31, 2022 8:54 AM

Caring about something whittles the world down to a more manageable size

In The Orchid Thief, there is a passage where author Susan Orlean describes a drive across south Florida on her way to a state preserve, where she'll be meeting an orchid hunter. She ends the passage this way:

The land was marble-smooth and it rolled without a pucker to the horizon. My eyes grazed across the green band of ground and the blue bowl of sky and then lingered on a dead tire, a bird in flight, an old fence, a rusted barrel. Hardly any cars came toward me, and I saw no one in the rearview mirror the entire time. I passed so many vacant acres and looked past them to so many more vacant acres and looked ahead and behind at the empty road and up at the empty sky; the sheer bigness of the world made me feel lonely to the bone. The world is so huge that people are always getting lost in it. There are too many ideas and things and people, too many directions to go. I was starting to believe that the reason it matters to care passionately about something is that it whittles the world down to a more manageable size. It makes the world seem not huge and empty but full of possibility. If I had been an orchid hunter I wouldn't have seen this space as sad-making and vacant--I think I would have seen it as acres of opportunity where the things I loved were waiting to be found.

John Laroche, the orchid hunter at the center of The Orchid Thief, comes off as obsessive, but I think many of us know that condition. We have found an idea or a question or a problem that grabs our attention, and we work on it for years. Sometimes, we'll follow a lead so far down a tunnel that it feels a lot like the swamps Laroche braves in search of the ghost orchid.

Even a field like computer science is big enough that it can feel imposing if a person doesn't have a specific something to focus their attention and energy on. That something doesn't have to be forever... Just as Laroche had cycled through a half-dozen obsessions before turning his energy to orchids, a computer scientist can work deeply in an area for a while and then move on to something else. Sometimes, there is a natural evolution in the problems one focuses on, while other times people choose to move into a completely different sub-area. I see a lot of people moving into machine learning these days, exploring how it can change the sub-field they used to focus exclusively on.

As a prof, I am fortunate to be able to work with young adults as they take their first steps in computer science. I get to watch many of them find a question they want to answer, a problem they want to work on for a few years, or an area they want to explore in depth until they master it. It's also sad, in a way, to work with a student who never quite finds something that sparks their imagination. A career in software, or anything, really, can look as huge and empty as Orlean's drive through south Florida if someone doesn't care deeply about something. When they do, the world seems not huge and empty, but full of possibility.

I'm about halfway through The Orchid Thief and am quite enjoying it.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

June 28, 2022 4:12 PM

You May Be Right

Billy Joel performing 'We Didn't Start the Fire' at Notre Dame Stadium, June 25, 2022

I first saw Billy Joel perform live in 1983, with a college roommate and our girlfriends. It was my first pop/rock concert, and I fancied myself the biggest Billy Joel fan in the world. The show was like magic to a kid who had been listening to Billy's music on vinyl, and the radio, for years.

Since then, I've seen him more times than I can remember, most recently in 2008. My teenaged daughters went with me to that one, so it was magic for more reasons than one. I've even seen a touring Broadway show built around his music. So, yeah, I'm still a fan.

On Saturday morning, I drove to Elkhart, Indiana, to meet up with three friends from college to go see Billy perform outdoors at Notre Dame Stadium. We bought our tickets in October 2019, pre-COVID, expecting to see the show in the summer of 2020. After two years of postponement, Billy, the venue, and the fans were ready to go. Six hours is a long way to drive to see a two- or three-hour show, especially knowing that I had to drive six hours back the next morning. I'm not a college student any more!

You may be right; I may be crazy. But I would drive six hours again to see Billy. Even at 73, he puts on a great show. I hope I have that kind of energy -- and the desire to still do my professional thing -- when I reach that age. (I don't expect that 50,000 students will pay to see me do it, let alone drive six hours.) For this show, I had the bonus of being able to visit with good friends, one of whom I've known since grade school, after too long a time.

I went all fanboy in my short post about the 2008 concert, so I won't bore you again with my hyperbole. I'll just say that Billy performed "She's Always A Woman" and "Don't Ask Me Why" again, along with a bunch of the old favorites and a few covers: I enjoyed his impromptu version of "Don't Let the Sun Go Down on Me", bobbles and all. He played piano for one of his band members, Mike DelGuidice, who sang "Nessun Dorma". And the biggest ovation of the night may have gone to Crystal Taliafero, a multi-talented member of Billy's group, for her version of "Dancing in the Streets" during the extended pause in "The River of Dreams".

This concert crowd was the most people I've been around in a long time... I figured a show in an outdoor stadium was safe enough, with precautions. (I was one of the few folks who wore a mask in the interior concourse and restrooms.) Maybe life is getting back to normal.

If this was my last time seeing Billy Joel perform live, it was a worthy final performance. Who knows, though. I thought 2008 might be my last live show.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

June 14, 2022 2:48 PM

A Two Cultures Theory of Meetings

snow falling on a redwood cabin

Courtesy of Chad Orzel's blog:

This ended up reminding me of the Two Cultures theory of meetings that I heard (second-hand) from a former Dean (a Classics professor, for the record). This was prompted by her noticing that the scientists and engineers always seemed grumpy and impatient about having meetings during work hours, where folks from the non-STEM fields were more cheerful. She realized that this was largely because the STEM folks tended to do research in their labs and offices on campus, during the day, so having a meeting was directly taking them away from productive time. For folks on the more literary side of academia, the actual scholarly work of reading and writing was mostly done elsewhere— at home, in coffee shops, at archives— and at times outside the normal academic workday— in the evening, during the summer, etc. As a result, they tended to only come to campus for classes, meetings, and socialization, and the latter two tended to blend together.

Now I'm thinking back over my years as a faculty member and department head. I've been attending meetings mostly with administrators for so long now that my experience is blunted: the days of science department heads and of arts and humanities department heads differ less than the days of their corresponding faculty do. Most admins seem reconciled, if ruefully, to their meetings.

Being a computer scientist affects my experience, too. Most of our faculty are software people who can read and write code from anywhere. In this regard, we are perhaps more like arts and humanities folks than other scientists are. When I think back on my interactions with CS colleagues, the ones least likely to want to meet at any old time are (1) people who do work with hardware in their labs and (2) people doing the most serious research. The second group tend to guard their creative time more carefully in all respects.

The other thing coloring my experience is... me. I am frequently grumpy and impatient about having meetings at all, during regular work hours or not, because so many of them come up on the wrong side of the cost/benefit ledger. A lot of university meetings happen only because they are supposed to happen. Many of my colleagues are congenial about this and manage to find ways to put the time to good use for them and, presumably, many other participants. I'd generally like to get back to work on more pressing, or interesting, matters.

But that is getting a bit far afield from the basic observation of a Two Cultures-style split, which is founded, I think, on the notion that the meetings in question are essential or at least important enough to hold. In that narrower context, I think Chad's colleague may be on to something.

~~~~~

Photo by Nikola Johnny Mirkovic on Unsplash.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

June 08, 2022 1:51 PM

Be a Long-Term Optimist and a Short-Term Realist

Before I went to bed last night, I stumbled across actor Robert De Niro speaking with Stephen Colbert on The Late Show. De Niro is, of course, an Oscar winner with fifty years working in films. I love to hear experts talk about what they do, so I stayed up a few extra minutes.

I think Colbert had just asked De Niro to give advice to actors who were starting out today, because De Niro was demurring: he didn't like to give advice, and everyone's circumstances are different. But then he said that, when he himself was starting out, he went on lots of auditions but always assumed that he wasn't going to get the job. There were so many ways not to get a job, so there was no reason to get his hopes up.

Colbert related that anecdote to his own experience getting started in show business. He said that whenever he had an acting job, he felt great, and whenever he didn't have a job, pessimism set in: he felt like he was never going to work again. De Niro immediately said, "oh, I never felt that way". He always felt like he was going to make it. He just had to keep going on auditions.

There was a smile on Colbert's face. He seemed to have trouble squaring De Niro's attitude toward auditions with his claimed confidence about eventual success. Colbert moved on with his interview.

It occurred to me that the combination of attitudes expressed by De Niro is a healthy, almost necessary, way to approach big goals. In the short term, accept that each step is uncertain and unlikely to pay off. Don't let those failures get you down; they are the price of admission. For the long term, though, believe deeply that you will succeed. That's the spirit you need to keep taking steps, trying new things when old things don't seem to work, and hanging around long enough for success to happen.

De Niro's short descriptions of his own experiences revealed how both sides of his demeanor contributed to him ultimately making it. He never knew what casting agents, directors, and producers were looking for, so he was willing to read every part in several different ways. Even though he didn't expect to get the job, maybe one of those people would remember him and mention him to a friend in the business, and maybe that connection would pay off. All he could do was audition.

The self-assurance De Niro seemed to feel almost naturally reminded me of things that Viktor Frankl and John McCain said about their ability to survive time in war camps. Somehow, they were able to maintain a confidence that they would eventually be free again. In the end, they were lucky to survive, but their belief that they would survive had given them a strength to persevere through much worse treatment than simply being rejected for a part in a movie. That perseverance helped them stay alive and take actions that would leave them in a position to be lucky.

I realize that the story De Niro tells, like those of Frankl and McCain, is potentially suspect due to survivor bias. We don't get to hear from people who believed that they would make it as actors but never did. Even so, their attitude seems like a pragmatic one to adopt, if we can manage it: be a long-term optimist and a short-term realist. Do everything we can to hang around long enough for fortune to find us.

Like De Niro, I am not much one to give advice. In the haze of waking up and going back to sleep last night, though, I think his attitude gives us a useful model to follow.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

May 30, 2022 8:32 AM

I Have Written That Code

Last month, I picked up a copy of The Writing Life by Annie Dillard at the library. It's one of those books that everyone seems to quote, and I had never read it. I was pleased to find it is a slim volume.

It didn't take long to see one of the often-quoted passages, on the page before the first chapter:

No one expects the days to be gods. -- Emerson

Then, about a third of the way in, came the sentences for which everyone knows Dillard:

How we spend our days is, of course, how we spend our lives. What we do with this hour, and that one, is what we are doing.

Dillard's portrayal of the writing life describes some of the mystery that we non-writers imagine, but mostly it depicts the ordinariness of daily grind and the extended focus that looks like obsession to those of us on the outside.

Occasionally, her stories touched on my experience as a writer of programs. Consider this paragraph:

Every year the aspiring photographer brought a stack of his best prints to an old, honored photographer, seeking his judgment. Every year the old man studied the prints and painstakingly ordered them into two piles, bad and good. Every year that man moved a certain landscape print into the bad stack. At length he turned to the young man: "You submit this same landscape every year, and every year I put it on the bad stack. Why do you like it so much?" The young photographer said, "Because I had to climb a mountain to get it."

I have written that code. I bang my head against some problem for days or weeks. Eventually, I find a solution. Sometimes it's homely code that gets the job done; usually it seems more elegant than it is, in relief against the work that went into discovering it. Over time, I realize that I need to change it, or delete it altogether, in order to make progress on the system in which it resides. But... the mountain.

It's a freeing moment when I get over the fixation and make the change the code needs. I'll always have the mountain, but my program needs to move in a different direction.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

April 04, 2022 5:45 PM

Leftover Notes

Like many people, I carry a small notebook most everywhere I go. It is not a designer's sketchbook or engineer's notebook; it is intended primarily for capturing information and ideas, à la Getting Things Done, before I forget them. Most of the notes end up being transferred to one of my org-mode todo lists, to my calendar, or to a topical file for a specific class or project. Write an item in the notebook, transfer it to the appropriate bin, and cross it off in the notebook.

I just filled the last line of my most recent notebook, a Field Notes classic that I picked up as schwag at Strange Loop a few years ago. Most of the notebook is crossed out, a sign of successful capture and transfer. As I thumbed back through it, I saw an occasional phrase or line that never made it into a more permanent home. That is pretty normal for my notebooks. At this point, I usually recycle the used notebook and consign untracked items to lost memories.

For some reason, this time I decided to copy all of the untracked items down and savor the randomness of my mind. Who knows, maybe I'll use one of these notes some day.

The Feds

basic soul math

I want to be #0

routine, ritual

gallery.stkate.edu

M. Dockery

www.wastetrac.org/spring-drop-off-event

Crimes of the Art

What the Puck

Massachusetts ombudsman

I hope it's still funny...

chessable.com

art gallery

ena @ tubi

In Da Club (50 Cent)

Gide 25; 28 May : 1

HFOSS project

April 4-5: Franklin documentary

Mary Chapin Carpenter

"Silent Parade" by Keigo Higashino

www.pbs.org -- search Storm Lake

"Hello, Transcriber" by Hannah Morrissey

Dear Crazy Future Eugene

I recognize most of these, though I don't remember the reason I wrote all of them down. For whatever reason, they never reached an actionable status. Some books and links sound interesting in the moment, but by the time I get around to transcribing them elsewhere, I'm no longer interested enough to commit to reading, watching, or thinking about them further. Sometimes, something pops into my mind, or I see something, and I write it down. Better safe than sorry...

That last one -- Dear Crazy Future Eugene -- ends up in a lot of my notebooks. It's a phrase that has irrational appeal to me. Maybe it is destined to be the title of my next blog.

There were also three multiple-line notes that were abandoned:

poem > reality
words > fact
a model is not identical

I vaguely recall writing this down, but I forget what prompted it. I vaguely agree with the sentiment even now, though I'd be hard-pressed to say exactly what it means.

Scribble pages that separate notes from full presentation
(solutions to exercises)

This note is from several months ago, but it is timely. Just this week, a student in my class asked me to post my class notes before the session rather than after. I don't do this currently in large part because my sessions are a tight interleaving of exercises that the students do in class, discussion of possible solutions, and use of those ideas to develop the next item for discussion. I think that Scribble, an authoring system that comes with Racket, offers a way for me to build pages I can publish in before-and-after form, or at least in an outline form that would help students take notes. I just never get around to trying the idea out. I think the real reason is that I like to tinker with my notes right up to class time... Even so, the idea is appealing. It is already in my planning notes for all of my classes, but I keep thinking about it and writing it down as a trigger.
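The before-and-after idea would be easy to prototype, because a Scribble file is just a Racket program. Here is a minimal sketch of what I have in mind, with a flag that controls whether the solutions appear. The session content, the flag name, and the file layout are all made up for illustration; this is not my actual course infrastructure.

```racket
#lang scribble/base

@; One source file per session. Render with show-solutions? set to #f
@; for the version posted before class, and #t for the version after.
@(define show-solutions? #t)

@title{Session 12: Folding Over Lists}

@bold{Exercise.} Write @tt{(count-of pred lst)}, which counts the
items in @tt{lst} that satisfy @tt{pred}, using @tt{foldr}.

@(if show-solutions?
     (list @bold{A possible solution.}
           @verbatim{
             (define (count-of pred lst)
               (foldr (lambda (x acc)
                        (if (pred x) (add1 acc) acc))
                      0
                      lst))})
     '())
```

Running this through the `scribble` command would produce the full page; flipping the flag before rendering would produce the outline version that students could take notes on during class.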

generate scanner+parser? expand analysis,
codegen (2 stages w/ IR -- simple exps, RTS, full)
optimization! would allow bigger source language?

This is evidence that I'm often thinking about my compiler course and ways to revamp it. This idea is also already in the system. But I keep prompting myself to think about it again.

Anyway, that was a fun way to reflect on the vagaries of my mind. Now, on to my next notebook: a small pocket-sized spiral notebook I picked up for a quarter in the school supplies section of a big box store a while back. My friend Joe Bergin used to always have one of these in his shirt pocket. I haven't used a spiral-bound notebook for years but thought I'd channel Joe for a couple of months. Maybe he will inspire me to think some big thoughts.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

February 13, 2022 12:32 PM

A Morning with Billy Collins

It's been a while since I read a non-technical article and made as many notes as I did this morning on this Paris Review interview with Billy Collins. Collins was poet laureate of the U.S. in the early 2000s. I recall reading his collection, Sailing Alone Around the Room, at PLoP in 2002 or 2003. Walking the grounds at Allerton with a poem in mind changes one's eyes and ears. Had I been blogging by then, I probably would have commented on the experience, and maybe one or two of the poems, in a post.

As I read this interview, I encountered a dozen or so passages that made me think about things I do, things I've thought, and even things I've never thought. Here are a few.

I'd like to get something straightened out at the beginning: I write with a Uni-Ball Onyx Micropoint on nine-by-seven bound notebooks made by a Canadian company called Blueline. After I do a few drafts, I type up the poem on a Macintosh G3 and then send it out the door.

Uni-Ball Micropoint pens are my preferred writing implement as well, though I don't write enough on paper any more to make buying a particular pen worth the effort. Unfortunately, just yesterday my last Uni-Ball Micro wrote its last line. Will I order more? It's a race between preference and sloth.

I type up most of the things I write these days on a 2015-era MacBook Pro, often connected to a Magic Keyboard. With the advent of the M1 MacBook Pros, I'm tempted to buy a new laptop, but this one serves me so well... I am nothing if not loyal.

The pen is an instrument of discovery rather than just a recording implement. If you write a letter of resignation or something with an agenda, you're simply using a pen to record what you have thought out. In a poem, the pen is more like a flashlight, a Geiger counter, or one of those metal detectors that people walk around beaches with. You're trying to discover something that you don't know exists, maybe something of value.

Programming may be like writing in many ways, but the search for something to say isn't usually one of them. Most of us sit down to write a program to do something, not to discover some unexpected outcome. However, while I may know what my program will do when I get done, I don't always know what that program will look like, or how it will accomplish its task. This state of uncertainty probably accounts for my preference in programming languages over the years. Smalltalk, Ruby, and Racket have always felt more like flashlights or Geiger counters than tape recorders. They help me find the program I need more readily than Java or C or Python.

I love William Matthews's idea--he says that revision is not cleaning up after the party; revision is the party!

Refactoring is not cleaning up after the party; refactoring is the party! Yes.

... nothing precedes a poem but silence, and nothing follows a poem but silence. A poem is an interruption of silence, whereas prose is a continuation of noise.

I don't know why this passage grabbed me. Perhaps it's just the imagery of the phrases "interruption of silence" and "continuation of noise". I won't be surprised if my subconscious connects this to programming somehow, but I ought to be suspicious of the imposition. Our brains love to make connections.

She's this girl in high school who broke my heart, and I'm hoping that she'll read my poems one day and feel bad about what she did.

This is the sort of sentence I'm a sucker for, but it has no real connection to my life. Though high school was a weird and wonderful time for me, as it was for so many, I don't think anything I've ever done since has been motivated in this way. Collins actually goes on to say the same thing about his own work. Readers are people with no vested interest. We have to engage them.

Another example of that is my interest in bridge columns. I don't play bridge. I have no idea how to play bridge, but I always read Alan Truscott's bridge column in the Times. I advise students to do the same unless, of course, they play bridge. You find language like, South won with dummy's ace, cashed the club ace and ruffed a diamond. There's always drama to it: Her thirteen imps failed by a trick. There's obviously lots at stake, but I have no idea what he's talking about. It's pure language. It's a jargon I'm exterior to, and I love reading it because I don't know what the context is, and I'm just enjoying the language and the drama, almost like when you hear two people arguing through a wall, and the wall is thick enough so you can't make out what they're saying, though you can follow the tone.

I feel seen. Back when we took the local daily paper, I always read the bridge column by Charles Goren, which ran on the page with the crossword, crypto cipher, and other puzzles. I've never played bridge; most of what I know about the game comes from reading Matthew Ginsberg's papers about building AI programs to bid and play. Like Collins, I think I was merely enjoying the sound of the language, a jargon that sounds serious and silly at the same time.

Yeats summarizes this whole thing in "Adam's Curse" when he writes: "A line will take us hours maybe, / Yet if it does not seem a moment's thought / Our stitching and unstitching has been naught."

I'm not a poet, and my unit of writing is rarely the line, but I know a feeling something like this in writing lecture notes for my students. Most of the worst writing consists of paragraphs and sections I have not spent enough time on. Most of the best sounds natural, a clean distillation of deep understanding. But those paragraphs and sections are the result of years of evolution. That's the time scale on which some of my courses grow, because no course ever gets my full attention in any semester.

When I finish a set of notes, I usually feel like the stitching and unstitching have not yet reached their desired end. Some of the text "seems a moment's thought", but much is still uneven or awkward. Whatever the state of the notes, though, I have to move on to the next task: grading a homework assignment, preparing the next class session, or -- worst of all -- performing the administrivia that props up the modern university. More evolution awaits.

~~~~

This was a good read for a Sunday morning on the exercise bike, well recommended. The line on revision alone was worth the time; I expect it will be a stock tool in my arsenal for years to come.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Software Development, Teaching and Learning

November 22, 2021 2:23 PM

Quick Hits

It's been another one of those months when I think about blogging a lot but never set aside time to write. Rather than wait for the time to finish a piece I'm writing, about the process of writing a demo code generator for my compiler students, I thought I'd drop a few tidbits now, just for fun. Maybe that will break the ice for writing this holiday week.

• Two possible titles for my next blog: Dear Crazy Future Eugene and Eugene Wallingford's Descent Into Madness. (Hey to Sheldon Cooper.)

• A nice quote from one of my daughters' alumni magazines: A biology major who is now an executive at a nonprofit agency was asked about the value of having majored in science.

When science is taught the right way, she said, "it is relevant in just about every situation".
Everyone can benefit from thinking like a scientist, and feeling comfortable with that mode of thinking. (Hey to Chad Orzel and Eureka: Discovering Your Inner Scientist.)

• Dan Wang on the US's ability to be a manufacturer of batteries:

Batteries are hard to ship and tend to be developed for particular automakers. So they're made close to the site of auto assembly. The US could be a big battery maker if only it built the charging network and offered subsidies on the scale of Europe and China, it's not hard.
The worlds of manufacturing and big industry are different in fundamental ways from software. I learn a lot from Wang's deep dives into process knowledge and investment. A lot of his ideas apply to software, too.


Posted by Eugene Wallingford | Permalink | Categories: General

October 10, 2021 1:53 PM

Strange Loop 3: This and That

The week after Strange Loop has been a blur of catching up with all the work I didn't do while attending the conference, or at least trying. That is actually good news for my virtual conference: despite attending Strange Loop from the comfort of my basement, I managed not to get sucked into the vortex of regular business going on here.

A few closing thoughts on the conference:

• Speaking of "the comfort of my basement", here is what my Strange Loop conference room looked like:

my Strange Loop 2021 home set-up, with laptop on the left, 29-inch monitor in the center, and a beverage to the right

The big screen is a 29" ultra-wide LG monitor that I bought last year on the blog recommendation of Robert Talbert, which has easily been my best tech purchase of the pandemic. On that screen you'll see vi.to, the streaming platform used by Strange Loop, running in Safari. To its right, I have emacs open on a file of notes and occasionally an evolving blog draft. There is a second Safari window open below emacs, for links picked up from the talks and the conference Slack channels.

On the MacBook Pro to the left, I am running Slack, another emacs shell for miscellaneous items, and a PDF of the conference schedule, marked up with the two talks I'm considering in each time slot.

That set-up served me well. I can imagine using it again in the future.

• Attending virtually has its downsides, but also its upsides. Saturday morning, one attendee wrote in the Slack #virtual-attendees channel:

Virtual FTW! Attending today from a campsite in upstate New York and enjoying the fall morning air

I was not camping, but I experienced my own virtual victories at lunch time, when I was able to go for a walk with my wife on our favorite walking trails.

• I didn't experience many technical glitches at the conference. There were some serious AV issues in the room during Friday's second slot. Being virtual, I was able to jump easily into and out of the room, checking in on another talk while they debugged on-site. In another talk, we virtual attendees missed out on seeing the presenter's slides. The speaker's words turned out to be enough for me to follow. Finally, Will Byrd's closing keynote seemed to drop its feed a few times, requiring viewers to refresh their browsers occasionally. I don't have any previous virtual conferences to compare to, but this all seemed pretty minor. In general, the video and audio feeds were solid and of high fidelity.

• One final note, not related to The Virtual Experience. Like many conferences, Strange Loop has so many good talks that I usually have to choose among two or three talks I want to see in each slot. This year, I kept track of alt-Strange Loop, the schedule of talks I didn't attend but really wanted to. Comparing this list to the list of talks I did attend gives a representative account of the choices I faced. It also would make for a solid conference experience in its own right:

  • FRI 02 -- Whoops! I Rewrote it in Rust (Brian Martin)
  • FRI 03 -- Keeping Your Open Source Project Accessible to All (Treva Williams)
  • FRI 04 -- Impacting Global Policy by Understanding Litter Data (Sean Doherty)
  • FRI 05 -- Morel, A Functional Query Language (Julian Hyde)
  • FRI 06 -- Software for Court Appointed Special Advocates (Linda Goldstein)
  • SAT 02 -- Asami: Turn your JSON into a Graph in 2 Lines (Paula Gearon)
  • SAT 03 -- Pictures Of You, Pictures Of Me, Crypto Steganography (Sean Marcia)
  • SAT 04 -- Carbon Footprint Aware Software Development (Tejas Chopra)
  • SAT 05 -- How Flutter Can Change the Future of Urban Communities (Edward Thornton)
  • SAT 06 -- Creating More Inclusive Tech Spaces: Paths Forward (Amy Wallhermfechtel)

There is a tie for the honor of "talk I most wanted to see but didn't": Wallhermfechtel on creating more inclusive tech spaces and Marcia on crypto steganography. I'll be watching these videos on YouTube some time soon!

As I mentioned in Day 1's post, this year I tried to force myself out of my usual zone, to attend a wider range of talks. Both lists of talks reflect this mix. At heart I am an academic with a fondness for programming languages. The tech talks generally lit me up more. Even so, I was inspired by some of the talks focused on community and the use of technology for the common good. I think I used my two days wisely.

That is all. Strange Loop sometimes gives me the sort of inspiration overdose that Molly Mielke laments in this tweet. This year, though, Strange Loop 2021 gave me something I needed after eighteen months of pandemic (and even more months of growing bureaucracy in my day job): a jolt of energy, and a few thoughts for the future.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

October 01, 2021 5:46 PM

Strange Loop 1: Day One

On this first day of my first virtual conference, I saw a number of Strange Loop-y talks: several on programming languages and compilers, a couple by dancers, and a meta-talk speculating on the future of conferences.

• I'm not a security guy or a cloud guy, so the opening keynote "Why Security is the Biggest Benefit of Using the Cloud" by AJ Yawn gave me a chance to hear what people in this space think and talk about. Cool trivia: Yawn played a dozen college basketball games for Leonard Hamilton at Florida State. Ankle injuries derailed his college hoops experience, and now he's a computer security professional.

• Richard Marmorstein's talk, "Artisanal, Machine-Generated API Libraries" was right on topic with my compiler course this semester. My students would benefit from seeing how software can manipulate AST nodes when generating target code.

Marmorstein uttered two of the best lines of the day:

  • "I could tell you a lot about Stripe, but all you need to know is Stripe has an API."
  • "Are your data structures working for you?"

I've been working with students all week trying to help them see how an object in their compiler such as a token can help the compiler do its job -- and make the code simpler to boot. Learning software design is hard.
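To make the idea concrete, here is a minimal sketch of the kind of token object I have in mind. This is hypothetical illustration, not code from my course or from any student compiler; the names `Token`, `is_a`, and `expect` are mine. The point is that once the token knows its own category and location, parser code like `expect` becomes shorter and its error messages become better for free.

```python
# A hypothetical token object for a compiler front end.
from dataclasses import dataclass

@dataclass(frozen=True)
class Token:
    category: str   # e.g., "ID", "NUM", "LPAREN"
    lexeme: str     # the matched source text
    line: int       # where it appeared, for error messages

    def is_a(self, category):
        return self.category == category

def expect(tokens, category):
    """Consume the next token if it matches, else report a useful error."""
    tok = tokens[0]
    if tok.is_a(category):
        return tokens.pop(0)
    raise SyntaxError(f"line {tok.line}: expected {category}, found '{tok.lexeme}'")

toks = [Token("LPAREN", "(", 1), Token("ID", "x", 1)]
print(expect(toks, "LPAREN").lexeme)   # prints "(" -- the token does the matching work
```

Compare this to passing around bare strings: the parser would have to re-derive the category and would have no line number to report when something goes wrong.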

• I learned a bit about the Nim programming language from Aditya Siram. As you might imagine, a language designed at the nexus of Modula/Oberon, Python, and Lisp appeals to me!

• A second compiler-oriented talk, by Richard Feldman, demonstrated how opportunistic in-place mutation, a static optimization, can help a pure functional program outperform imperative code.
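A toy sketch of the idea behind that optimization, as I understand it, not Feldman's actual implementation: a pure "update" operation must copy its input, unless the runtime can prove the caller holds the only reference, in which case it can safely reuse the storage and skip the copy. Here the `unique` flag stands in for what a real compiler would establish statically.

```python
# A toy illustration of opportunistic in-place mutation. The `unique`
# flag is a stand-in for a compile-time proof that no one else can
# observe the input, so mutating it is indistinguishable from copying.
def functional_set(xs, i, v, unique=False):
    """Return a list equal to xs with xs[i] replaced by v.

    When `unique` is True, mutate in place and skip the O(n) copy;
    the function remains observably pure either way.
    """
    out = xs if unique else list(xs)
    out[i] = v
    return out

a = [1, 2, 3]
b = functional_set(a, 1, 99)                       # shared: a is untouched
c = functional_set([4, 5, 6], 0, 0, unique=True)   # sole owner: storage reused
print(a, b, c)   # prints [1, 2, 3] [1, 99, 3] [0, 5, 6]
```

The payoff is that a functional program keeps its value semantics while its hot loops run with the memory behavior of imperative code.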

• After the talk "Dancing With Myself", an audience member complimented Mariel Pettee on "nailing the Strange Loop talk". The congratulations were spot-on. She hit the technical mark by describing the use of two machine learning techniques, variational autoencoders and graph neural networks. She hit the aesthetic mark by showing how computer models can learn and generate choreography. When the video for this talk goes live, you should watch.

Pettee closed with the expansive sort of idea that makes Strange Loop a must-attend conference. Dance has no universal language for "writing" choreography, and video captures only a single instance or implementation of a dance, not necessarily the full intent of the choreographer. Pettee had expected her projects to show how machine learning can support invention and co-creation, but now she sees how work like this might provide a means of documentation. Very cool. Perhaps CS can help to create a new kind of language for describing dance and movement.

• I attended Laurel Lawson's "Equitable Experiential Access: Audio Description" to learn more about ways in which videos and other media can provide a fuller, more equitable experience to everyone. Equity and inclusion have become focal points for so much of what we do at my university, and they apply directly to my work creating web-based materials for students. I have a lot to learn. I think one of my next steps will be to experience some of my web pages (session notes, assignments, resource pages) solely through a screen reader.

• Like all human activities, traditional in-person conferences offer value and extract costs. Crista Lopes used her keynote closing Day 1 to take a sober look at the changes in their value and their costs in the face of technological advances over the last thirty years.

If we are honest with ourselves, virtual conferences are already able to deliver most of the value of in-person conferences (and, in some ways, provide more value), at much lower cost. The technology of going virtual is the easy part. The biggest challenges are social.

~~~~~

A few closing thoughts as Day 1 closes.

As Crista said, "Taking paid breaks in nice places never gets old." My many trips to OOPSLA and PLoP provided me with many wonderful physical experiences. Being in the same place with my colleagues and friends was always a wonderful social experience. I like driving to St. Louis and going to Strange Loop in person; sitting in my basement doesn't feel the same.

With time, perhaps my expectations will change.

It turns out, though, that "virtual Strange Loop" is a lot like "in-person Strange Loop" in one essential way: several cool new ideas arrive every hour. I'll be back for Day Two.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

September 30, 2021 4:42 PM

Off to Strange Loop

the Strange Loop splash screen from the main hall, 2018

After a couple of years away, I am attending Strange Loop. 2018 seems so long ago now...

Last Wednesday morning, I hopped in my car and headed south to Strange Loop 2018. It had been a few years since I'd listened to Zen and the Art of Motorcycle Maintenance on a conference drive, so I popped it into the tape deck (!) once I got out of town and fell into the story. My top-level goal while listening to Zen was similar to my top-level goal for attending Strange Loop this year: to experience it at a high level; not to get bogged down in so many details that I lost sight of the bigger messages. Even so, though, a few quotes stuck in my mind from the drive down. The first is an old friend, one of my favorite lines from all of literature:

Assembly of Japanese bicycle require great peace of mind.

The other was the intellectual breakthrough that unified Phaedrus's philosophy:

Quality is not an object; it is an event.

This idea has been on my mind in recent months. It seemed a fitting theme, too, for Strange Loop.

There will be no Drive South in 2021. For a variety of reasons, I decided to attend the conference virtually. The persistence of COVID is certainly one of the big reasons. Alex and the crew at Strange Loop are taking all the precautions one could hope for to mitigate risk, but even so I will feel more comfortable online this year than in rooms full of people from across the country. I look forward to attending in person again soon.

Trying to experience the conference at a high level is again one of my meta-level goals for attending. The program contains so many ideas that are new to me; I think I'll benefit most by opening myself to areas I know little or nothing about and seeing where the talks lead me.

This year, I have a new meta-level goal: to see what it is like to attend a conference virtually. Strange Loop is using Vito as its hub for streaming video and conference rooms and Slack as its online community. This will be my first virtual conference, and I am curious to see how it feels. With concerns such as climate change, public health, and equity becoming more prominent as conference-organizing committees make their plans, I suspect that we will be running more and more of our conferences virtually in the future, especially in CS. I'm curious to see how much progress has been made in the last eighteen months and how much room we have to grow.

This topic is even on the program! Tomorrow's lineup concludes with Crista Lopes speaking on the future of conferences. She's been thinking about and helping to implement conferences in new ways for a few years, so I look forward to hearing what she has to say.

Whatever the current state of virtual conferences, I fully expect that this conference will be a worthy exemplar. It always is.

So, I'm off to Strange Loop for a couple of days. I'll be in my basement.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

August 29, 2021 10:19 AM

Launching the Compiler Project with New Uncertainties

We will be forming project teams in my course this week, and students will begin work in earnest on Friday. Or so thinks the prof, who releases the first assignment on Thursday... I can dream.

I noticed one change this year when I surveyed students about their preferences for forming teams. In an ordinary year, most students submit at least one or two names of others in the class with whom they'd like to work; some already have formed the teams they want to work in. A few indicate someone they'd rather not work with, usually based on experiences in previous courses. This helps me help them form teams with a mix of new and familiar, with some hedge against expected difficulties. It's never perfect, but most years we end up with a decent set of teams and project experiences.

This year, though, students barely offered any suggestions for forming teams. Most students expressed no preference for whom they want to work with, and no one indicated someone they don't want to work with.

At first, this seemed strange to me, but then I realized that it is likely an effect of three semesters distorted by COVID-19. With one semester forced online and into isolation, a second semester with universal masking, no extracurricular activities, and no social life, and a third semester with continued masking and continued encouragement not to gather, these students have had almost no opportunity to get to know one another!

This isolation eliminates one of the great advantages of a residential university, both personally and professionally. I made so many friends in college, some of whom I'm still close to, and spent time with them whenever I wasn't studying (which, admittedly, was a lot). But it also affects the classroom, where students build bonds over semesters of taking courses together in various configurations. Those bonds carry over into a project course such as mine, where they lubricate the wheels of teams who have to work together more closely than before. They at least begin the project knowing each other a bit and sharing a few academic experiences.

Several students in my class this semester said, "I have no friends in this class" or even "I don't know any other CS majors". That is sad. It also raises the stakes for the compiler project, which may be their only chance to make acquaintances in their major before they graduate. I feel a lot more responsibility as I begin to group students into teams this semester, even as I know that I have less information available than ever before for doing a credible job.

I'm going to keep all this in mind as the semester unfolds and pay closer attention to how students and teams seem to be doing. Perhaps this course can not only help them have a satisfying and educational experience building a big piece of software, but also help them form some of the personal bonds that add grace notes to their undergrad years.

~~~~~

On an unrelated note, I received word a couple of weeks ago that this blog had been selected by Feedspot as one of the Top 20 Computer Science Blogs on the web. It's always nice to be recognized in this way. Given how little I've blogged over the last couple of years, it is rather generous to include me on this list! I see there a number of top-quality blogs, several of which I read religiously, and most of which post entries with admirable regularity. It remains a goal of mine to return to writing here more regularly. Perhaps two entries within a week, light as they are, offer hope.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

May 06, 2021 3:19 PM

Sometimes You Have To Just Start Talking

I have been enjoying a few of James Propp's essays recently. Last month he wrote about the creation of zero. In Who Needs Zero, he writes:

But in mathematics, premature attempts to reach philosophical clarity can get in the way of progress both at the individual level and at the cultural level. Sometimes you have to just start talking before you understand what you're talking about.

This reminded me of a passage by Iris Murdoch in Metaphysics as a Guide to Morals, which I encountered in one of Robin Sloan's newsletters:

The achievement of coherence is itself ambiguous. Coherence is not necessarily good, and one must question its cost. Better sometimes to remain confused.

My brain seems hardwired to seek out and create abstractions. Perhaps it's just a deeply ingrained habit. Even so I am a pragmatist at heart. As Propp says, "Zero is as zero does."

Allowing oneself to remain confused, to forge ahead without having reached clarity yet, is essential to doing research, or to learning anything at all, really.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

February 28, 2021 9:47 AM

Find Your Passion? Master Something.

A few weeks ago, a Scott Galloway video clip made the rounds. In it, Galloway was saying something about "finding your passion" that many people have been saying for a long time, only in that style that makes Galloway so entertaining. Here's a great bit of practical advice on the same topic from tech guru Kevin Kelly:

Following your bliss is a recipe for paralysis if you don't know what you are passionate about. A better motto for most youth is "master something, anything". Through mastery of one thing, you can drift towards extensions of that mastery that bring you more joy, and eventually discover where your bliss is.

My first joking thought when I read this was, "Well, maybe not anything..." I mean, I can think of lots of things that don't seem worth mastering, like playing video games. But then I read about professional gamers making hundreds of thousands of dollars a year, so who am I to say? Find something you are good at, and get really good at it. As Galloway says, like Chris Rock before him, it's best to become good at something that other people will pay you for. But mastery of anything opens doors that passion can only bang on.

The key to the "master something, anything" mantra is the next sentence of Kelly's advice. When we master something, our expertise creates opportunities. We can move up or down the hierarchy of activities built from that mastery, or to related domains. That is where we are most likely to find the life that brings us joy. Even better, we will find it in a place where our mastery helps us get through the inevitable drudge work and over the inevitable obstacles that will pop in our way. I love to program, but some days debugging is a slog, and other days I butt up against thorny problems beyond my control. The good news is that I have skills to get through those days, and I like what I'm doing enough to push on through to the more frequent moments and days of bliss.

Passion is wonderful if you have it, but it's hard to conjure up on its own. Mastering a skill, or a set of skills, is something every one of us can do, and by doing it we can find our way to something that makes us happy.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

February 27, 2021 11:12 AM

All The Words

In a Paris Review interview, Fran Lebowitz joked about the challenge of writing:

Every time I sit at my desk, I look at my dictionary, a Webster's Second Unabridged with nine million words in it and think, All the words I need are in there; they're just in the wrong order.

Unfortunately, thinks this computer scientist, writing is a computationally more intense task than simply putting the words in the right order. We have to sample with replacement.

Computational complexity is the reason we can't have nice things.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

January 05, 2021 3:33 PM

Today's Reading

Lots of good stuff on the exercise bike this morning...

Henry Rollins on making things because he must:

I'm a shipbuilder. I don't want to sail in them. I want you to sail in them. I'm just happy that they leave the harbor so I can have an empty workplace.

Deirdre Connolly on the wonder of human achievement:

We, ridiculous apes with big brains and the ability to cooperate, can listen to the universe shake from billions of light years away, because we learned math and engineering. Wild.

Sonya Mann on our ultimate task:

Our labor is the same as it ever was. Your job is to pioneer a resilient node in the network of civilization -- to dodge the punches, roll with the ones that you can't, and live to fight another day. That's what our ancestors did for us and it's what we'll do for those who come next: hold the line, explore when there's surplus, stay steady, and go down swinging when we have to.

Henry Rollins also said:

"What would a writer do in this situation?" I don't know, man. Ask one. And don't tell me what he said; I'm busy.

Back to work.


Posted by Eugene Wallingford | Permalink | Categories: General

January 03, 2021 5:08 PM

On the Tenth Day of Christmas...

... my daughter gave to me:

Christmas gifts from Sarah!

We celebrated Part 2 of our Zoom Family Christmas this morning. A package from one of our daughters arrived in the mail after Part 1 on Christmas Day, so we reprised our celebration during today's weekly call.

My daughter does not read my blog, at least not regularly, but she did search around there for evidence that I might already own these titles. Finding none, she ventured the long-distance gift. It was received with much joy.

I've known about I Am a Strange Loop for over a decade but have never read it. Somehow, Surfaces and Essences flew under my radar entirely. A book that is new to me!

These books will provide me many hours of enjoyment. Like Hofstadter's other books, they will probably bend my brain a bit and perhaps spark some welcome new activity.

~~~~~

Hofstadter appears in this blog most prominently in a set of entries I wrote after he visited my university in 2012.

I did mention I Am a Strange Loop in a later entry after all, a reflection on Alan Turing, representation, and universal machines. I'm glad that entry did not undermine my daughter's gift!


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

January 01, 2021 12:01 PM

This Version of the Facts

The physicist Leo Szilard once announced to his friend Hans Bethe that he was thinking of keeping a diary: "I don't intend to publish it; I am merely going to record the facts for the information of God." "Don't you think God knows the facts?" Bethe asked. "Yes," said Szilard. "He knows the facts, but He does not know this version of the facts."

I began 2021 by starting to read Disturbing the Universe, Freeman Dyson's autobiographical attempt to explain to people who are not scientists what the human situation looks like to someone who is a scientist. The above passage opens the author's preface.

Szilard's motive seems like a pretty good reason to write a blog: to record one's own version of the facts, for oneself and for the information of God. Unlike Szilard, we have an alternative in between publishing and not publishing. A blog is available for anyone to read, at almost no cost, but ultimately it is for the author, and maybe for God.

I've been using the long break between fall and spring semesters to strengthen my blogging muscle and redevelop my blogging habit. I hope to continue to write more regularly again in the coming year.

Dyson's book is a departure from my recent reading. During the tough fall semester, I found myself drawn to fiction, reading Franny and Zooey by J. D. Salinger, The Bell Jar by Sylvia Plath, The Lucky Ones by Rachel Cusk, and The Great Gatsby by F. Scott Fitzgerald, with occasional pages from André Gide's diary in the downtime between books.

I've written about my interactions with Cusk before [ Outline, Transit, Kudos ], so one of her novels is no surprise here, but what's with those classics from sixty years ago or more? These stories, told by deft and observant writers, seemed to soothe me. They took the edge off of the long days. Perhaps I could have seen a run of classic books coming... In the frustrating summer run-up to fall, I read Thomas Mann's Death in Venice and Ursula Le Guin's The Lathe of Heaven.

For some reason, yesterday I felt the urge to finally pick up Dyson's autobiography, which had been on my shelf for a few months. A couple of years ago, I read most of Dyson's memoir, Maker of Patterns, and found him an amiable and thoughtful writer. I even wrote a short post on one of his stories, in which Thomas Mann plays a key role. At the time, I said, "I've never read The Magic Mountain, or any Mann, for that matter. I will correct that soon. However, Mann will have to wait until I finish Dyson...". 2020 may have been a challenge in many ways, but it gave me at least two things: I read my first Mann (Death in Venice is much more approachable than The Magic Mountain...), and it returned me to Dyson.

Let's see where 2021 takes us.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

December 27, 2020 10:10 AM

What Paul McCartney Can Teach Us About Software

From Sixty-Four Reasons to Celebrate Paul McCartney, this bit of wisdom that will sound familiar to programmers:

On one of the tapes of studio chatter at Abbey Road you can hear McCartney saying, of something they're working on, "It's complicated now. If we can get it simpler, and then complicate it where it needs to be complicated..."

People talk a lot about making software as simple as possible. The truth is, software sometimes has to be complicated. Some programs perform complex tasks. More importantly, programs these days often interact in complex environments with a lot of dissimilar, distributed components. We cannot avoid complexity.

As McCartney knows about music, the key is to make things as simple as can be and introduce complexity only where it is essential. Programmers face three challenges in this regard:

  • learning how to simplify code,
  • learning how to add complexity in a minimal, contained fashion, and
  • learning how to recognize the subtle boundary between essential simplicity and essential complexity.

I almost said that new programmers face those challenges, but after many years of programming, I feel like I'm still learning how to do all three of these things. I suspect other experienced programmers are still learning, too.

On an unrelated note, another passage in this article spoke to me personally as a programmer. While discussing McCartney's propensity to try new things and to release everything, good and bad, it refers to some of the songs on his most recent album (at that time) as enthusiastically executed misjudgments. I empathize with McCartney. My hard drive is littered with enthusiastically executed misjudgments. And I've never written the software equivalent of "Hey Jude".

McCartney just released a new album this month at the age of 78. The third album in a trilogy conceived and begun in 1970, it has already gone to #1 in three countries. He continues to write, record, and release, and collaborates frequently with today's artists. I can only hope to be enthusiastically producing software, and in tune with the modern tech world, when I am his age.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

December 21, 2020 8:30 AM

Watching the Scientific Method Take Off in Baseball

I make it a point never to believe anything
just because it's widely known to be so.
-- Bill James

A few years ago, a friend was downsizing and sent me his collection of The Bill James Abstract from the 1980s. Every so often I'll pick one up and start reading. This week, I picked up the 1984 issue.

my stack of The Bill James Abstract, courtesy of Charlie Richter

It's baseball, so I enjoy it a lot. My #1 team, the Cincinnati Reds, were in the summer doldrums for most of the 1980s but on their way to a surprising 1990 World Series win. My #2 team, the Detroit Tigers, won it all in 1984, with a performance so dominating that it seemed almost preordained. It's fun to reminisce about those days.

It's even more fascinating to watch the development of the scientific method in a new discipline.

Somewhere near the beginning of the 1984 abstract, James announces the stance that underlies all his work: never believe anything just because everyone else says it's true. Scattered through the book are elaborations of this philosophy. He recognizes that understanding the world of baseball will require time, patience, and revision:

In many cases, I have no clear evidence on the issue, no way of answering the question. ... I guess what I'm saying is that if we start trying to answer these questions now, we'll be able to answer them in a few years. An unfortunate side effect is that I'm probably going to get some of the answers wrong now; not only some of the answers but some of the questions.

Being wrong is par for the course for scientists; perhaps James felt some consolation in that this made him like Charles Darwin. The goal isn't to be right today. It is to be less wrong than yesterday. I love that James tells us that, early in his exploration, even some of the questions he is asking are likely the wrong questions. He will know better after he has collected some data.

James applies his skepticism and meticulous analysis to everything in the game: which players contribute the most offense or defense to the team, and how; how pitching styles affect win probabilities; how managers approach the game. Some things are learned quickly but are rejected by the mainstream. By 1984, for example, James and people like him knew that, on average, sacrifice bunts and most attempts to steal a base reduced the number of runs a team scores, which means that most of them hurt the team more than they help. But many baseball people continued to use them too often tactically and even to build teams around them strategically.

At the time of this issue, James had already developed models for several phenomena in the game, refined them as evidence from new seasons came in, and expanded his analysis into new areas. At each step, he channels his inner scientist: look at some part of the world, think about why it might work the way it does, develop a theory and a mathematical model, test the theory with further observations, and revise. James also loves to share his theories and results with the rest of us.

There is nothing new here, of course. Sabermetrics is everywhere in baseball now, and data analytics have spread to most sports. By now, many people have seen Moneyball (a good movie) or read the Michael Lewis book on which it was based (even better). Even so, it really is cool to read what are in effect diaries recording what James is thinking as he learns how to apply the scientific method to baseball. His work helped move an entire industry into the modern world. The writing reflects the curiosity, careful thinking, and humility that so often lead to the goal of the scientific mind:

to be less wrong than yesterday


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

December 10, 2020 3:36 PM

The Fate of Most Blogs

... was described perfectly by André Gide on August 8, 1891, long before the digital computer:

More than a month of blanks. Talking of myself bores me. A diary is useful during conscious, intentional, and painful spiritual evolutions. Then you want to know where you stand. But anything I should say now would be harpings on myself. An intimate diary is interesting especially when it records the awakening of ideas; or the awakening of the senses at puberty; or else when you feel yourself to be dying.

There is no longer any drama taking place in me; there is now nothing but a lot of ideas stirred up. There is no need to write myself down on paper.

Of course, Gide kept writing for many years after that moment of doubt. The Journals of André Gide are an entertaining read. I feel seen, as the kids say these days.


Posted by Eugene Wallingford | Permalink | Categories: General

December 08, 2020 2:06 PM

There Is No Step Two, But There Is A Step Three

In a not-too-distant post, Daniel Steinberg offered two lessons from his experience knitting:

So lesson one is to start and lesson two is to keep going.

This reminded me of Barney Stinson's rules for running a marathon (10s video):

Here's how you run a marathon.

Step 1: Start running.

<pause>

Oh, yeah -- there's no Step 2.

Daniel offers more lessons, though, including Lesson Three: Ask for help. After running the New York Marathon with no training, Barney learned this lesson the hard way. Several hours after the marathon, he found that he no longer had control of his legs, got stuck on the subway because he could not stand up on his own, and had to call the gang for help.

I benefit a lot from reading Daniel's blog posts, and Barney probably could have, too. We're all better off now that Daniel is writing for his blogs and newsletters regularly again. They are full of good stories, interesting links, and plenty of software wisdom.


Posted by Eugene Wallingford | Permalink | Categories: General, Running

July 16, 2020 10:47 AM

Dreaming in Git

I recently read a Five Books interview about the best books on philosophical wonder. One of the books recommended by philosopher Eric Schwitzgebel was Diaspora, a science fiction novel by Greg Egan I've never read. The story unfolds in a world where people are able to destroy their physical bodies to upload themselves into computers. Unsurprisingly, this leads to some fascinating philosophical possibilities:

Well, for one thing you could duplicate yourself. You could back yourself up. Multiple times.

And then have divergent lives, as it were, in parallel but diverging.

Yes, and then there'd be the question, "do you want to merge back together with the person you diverged from?"

Egan wrote Diaspora before the heyday of distributed version control, before darcs and mercurial and git. With distributed VCS, a person could checkout a new personality, or change branches and be a different person every day. We could run diffs to figure out what makes one version of a self so different from another. If things start going too wrong, we could always revert to an earlier version of ourselves and try again. And all of this could happen with copies of the software -- ourselves -- running in parallel somewhere in the world.

And then there's Git. Imagine writing such a story now, with Git's complex model of versioning and prodigious set of commands and flags. Not only could people branch and merge, checkout and diff... A person could try something new without ever committing changes to the repository. We'd have to figure out what it means to push origin or reset --hard HEAD. We'd be able to rewrite history by rebasing, amending, and squashing. A Git guru can surely explain why we'd need to --force-with-lease or --unset-upstream, but even I can imagine the delightful possibilities of git stash in my personal improvement plan.

Perhaps the final complication in our novel would involve a merge so complex that we need a third-party diff tool to help us put our desired self back together. Alas, a Python library or Ruby gem required by the tool has gone stale and breaks an upgrade. Our hero must find a solution somewhere in her tree of blobs, or be doomed to live a forever splintered life.

If you ever see a book named Dreaming in Git or Bug Report on an airport bookstore's shelves, take a look. Perhaps I will have written the first of my Git fantasies.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Software Development

July 07, 2020 2:42 PM

Spurious Precision

When I pestered Conway for more details regarding the seminal Moscow meeting that inspired his triumphant half-day of discovery, he begged off. He was loath to add any "spurious precision", as he came to refer to his embellishments, advertent or accidental. "My memory. My memory is a liar," he said. "It's a good liar. It deceives even me."

I love the phrase "spurious precision". It's a great name for something I see in the world -- and, all too often, in my own mind. I should be as careful with my own memory as Conway tried to be in this instance.

(From a 2015 profile in The Guardian.)


Posted by Eugene Wallingford | Permalink | Categories: General

July 01, 2020 3:19 PM

Feeling Unstuck Amid the Pandemic

Rands recently wrote about his work-from-home routine. I love the idea of walking around a large wooded yard while doing audio meetings... One of his reasons for feeling so at ease struck a chord with me:

Everyone desperately wants to return to normality. I am a professional optimist, but we are not returning to normal. Ever. This is a different forever situation, and the sooner we realize that and start to plan accordingly, the sooner we will feel unstuck.

I have written or spoken a variation of this advice so many times over my fifteen years as department head, most often in the context of state funding and our university budget.

Almost every year for my first decade as head, we faced a flat or reduced budget, and every time several university colleagues expressed a desire to ride the storm out: make temporary changes to how we operate and wait for our budgets to return to normal. This was usually accompanied by a wistful desire that we could somehow persuade legislators of our deep, abiding value and thus convince them to allocate more dollars to the university or, failing that, that new legislators in some future legislature would have different priorities.

Needless to say, the good old days never returned, and our budget remained on a downward slide that began in the late 1990s. This particular form of optimism was really avoidance of reality, and it led to many people living in a state of disappointment and discomfort for years. Fortunately, over the last five or ten years, most everyone has come to realize that what we have now is normal and has begun to plan accordingly. It is psychologically powerful to accept reality and begin acting with agency.

As for the changes brought on by the pandemic, I must admit that I am undecided about how much of what has changed over the last few months will be the normal way of the university going forward.

My department colleagues and I have been discussing how the need for separation among students in the classroom affects how we teach. Our campus doesn't have enough big rooms for everyone to move each class into a room with twice the capacity, so most of us are looking at ways to teach hybrid classes, with only half of our students in the classroom with us on any given day. This makes most of us sad and even a little depressed: how can we teach our courses as well as we always have in the past when new constraints don't allow us to do what we have optimized our teaching to do?

I have started thinking of the coming year in terms of hill climbing, an old idea from AI. After years of hard work and practice, most of us are at a local maximum in our teaching. The pandemic has disoriented us by dropping us at a random point in the environment. The downside of this change in position is that we are no longer at our locally-optimal point for teaching our courses. The upside is that we get to search again under new conditions. Perhaps we can find a new local maximum, perhaps even one higher than our old max. If not, at least we have conducted a valuable experiment under trying conditions and can use what we learn going forward.
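For readers who haven't met the idea, hill climbing is easy to sketch. Here is a minimal one-dimensional version in Python; the function name, the toy landscape, and the parameters are my own, not from any AI text in particular:

```python
import random

def hill_climb(f, x, step=0.1, tries=500, seed=0):
    """Greedy local search: propose a random nearby point and
    move there whenever it scores higher than where we stand."""
    rng = random.Random(seed)
    for _ in range(tries):
        candidate = x + rng.uniform(-step, step)
        if f(candidate) > f(x):
            x = candidate
    return x

# A toy landscape with a single peak at x = 2.
peak = hill_climb(lambda x: -(x - 2) ** 2, x=0.0)
```

The analogy is exact in one respect: the search only ever accepts moves that improve on where it stands, so where you start determines which peak you can reach. A random restart, like the one the pandemic handed us, is the classic way out of a mediocre local maximum.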

This analogy helps me approach my new course with more positive energy. A couple of my colleagues tell me it has helped them, too.

As many others have noted, the COVID-19 crisis has accelerated a few changes that were already taking place in our universities, in particular in the use of digital technology to engage students and to replace older processes. Of the other changes we've seen, some will certainly stick, but I'm not sure anyone really knows which ones. Part of the key to living with the uncertainty is not to tie ourselves too closely to what we did before.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading, Teaching and Learning

May 22, 2020 3:34 PM

What Good Can Come From All This?

Jerry Seinfeld:

"What am I really sick of?" is where innovation begins.

Steve Wozniak:

For a lot of entrepreneurs, they see something and they say, "I have to have this," and that will start them building their own.

Morgan Housel:

Necessity is the mother of invention, so our willingness to solve problems is about to surge.

A lot of people are facing a lot of different stresses right now, with the prospect that many of those stresses will continue on into the foreseeable future. For instance, I know a lot of CS faculty who are looking at online instruction and remote learning much more carefully now that they may be doing it again in the fall. Many of us have some things to learn, and some real problems need to be solved.

"What am I really sick of?" can turn the dial up on our willingness to solve problems that have been lingering in the background for a while. Let's hope that some good can come from the disruption.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

April 08, 2020 2:42 PM

Three Quotes on Human Behavior

2019, Robin Sloan:

On the internet, if you stop speaking: you disappear. And, by corollary: on the internet, you only notice the people who are speaking nonstop.

Some of the people speaking nonstop are the ones I wish would disappear for a while.

~~~~~

1947, from Italy's Response to the Coronavirus:

Published in 1947, The Plague has often been read as an allegory, a book that is really about the occupation of France, say, or the human condition. But it's also a very good book about plagues, and about how people react to them -- a whole category of human behavior that we have forgotten.

A good book is good on multiple levels.

~~~~~

1628, William Harvey, "On the Motion of the Heart and Blood in Animals":

Doctrine, once sown, strikes deep its root, and respect for antiquity influences all men. Still the die is cast, and my trust is in my love of the truth and the candour of cultivated minds.

I don't know why, but the phrase "the candor of cultivated minds" really stuck with me when I read it this week.


Posted by Eugene Wallingford | Permalink | Categories: General

April 06, 2020 1:57 PM

Arithmetic is Fundamental

From a September 2009 edition of Scientific American, in a research report titled "Animals by the Numbers":

Recent studies, however, have uncovered new instances of a counting skill in different species, suggesting that mathematical abilities could be more fundamental in biology than previously thought. Under certain conditions, monkeys could sometimes outperform college students.

Having watched college students attempt to convert base 10 to base 2 using a standard algorithm, I am not surprised.
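For the record, the standard algorithm in question, repeated division by 2 while collecting remainders, is only a few lines of code. This sketch is my own rendering, not anything from the report:

```python
def to_binary(n):
    """Convert a non-negative integer to a binary string by
    repeated division: the remainders, read in reverse, are the bits."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder is the next bit, low to high
        n //= 2
    return "".join(reversed(bits))
```

The step that trips students up is the last one: the remainders come out lowest bit first, so the answer is read in reverse.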

One animal, rewarded with Kool-Aid, was 10 to 20 percent less accurate than college students but beat them in reaction time. "The monkeys didn't mind missing every once in a while," Cantlon recounts. "It wants to get past the mistake and on to the next problem where it can get more Kool-Aid, whereas college students can't shake their worry over guessing wrong."

Well, that changes things a bit. Our education system trains the willingness to fail out of our students. Animals face different kinds of social pressure.

That said, 10-20 percent less accurate is only a letter grade or two on many grading scales. Not too bad for our monkey friends, and they get some Kool-Aid to boot.

My wife was helping someone clean out their house and brought home a bunch of old Scientific Americans. I've had a good time browsing through the articles and seeing what people were thinking and saying a decade ago. The September 2009 issue was about the origins of ideas and products, including the mind. Fun reading.


Posted by Eugene Wallingford | Permalink | Categories: General

March 15, 2020 9:35 AM

Things I've Been Reading

This was a weird week. It started with preparations for spring break and an eye on the news. It turned almost immediately into preparations for at least two weeks of online courses and a campus on partial hiatus. Of course, we don't know how the COVID-19 outbreak will develop over the next three weeks, so we may be facing the remaining seven weeks of spring semester online, with students at a distance.

Here are three pieces that helped me get through the week.

Even If You Believe

From When Bloom Filters Don't Bloom:

Advanced data structures are very interesting, but beware. Modern computers require cache-optimized algorithms. When working with large datasets that do not fit in L3, prefer optimizing for a reduced number of loads over optimizing the amount of memory used.

I've always liked the Bloom filter. It seems such an elegant idea. But then I've never used one in a setting where performance mattered. It still surprises me how well current architectures and compilers optimize performance for us in ways that our own efforts can only frustrate. The article is also worth reading for its link to a nice visualization of the interplay among the parameters of a Bloom Filter. That will make a good project in a future class.
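For readers who haven't used one, the core of a Bloom filter fits in a few lines of Python. This is a minimal sketch of my own; the class name, default sizes, and the salted-hash scheme are my choices, not the article's:

```python
import hashlib

class BloomFilter:
    """A minimal Bloom filter: k hash functions over an array of m bits.
    Membership tests may report false positives, never false negatives."""
    def __init__(self, m=1024, k=3):
        self.m, self.k = m, k
        self.bits = bytearray(m)  # one byte per bit, for simplicity

    def _indexes(self, item):
        # Derive k indexes by hashing the item with k different salts.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):
        for idx in self._indexes(item):
            self.bits[idx] = 1

    def __contains__(self, item):
        return all(self.bits[idx] for idx in self._indexes(item))
```

The elegance and the performance problem are visible in the same place: each lookup makes k probes at effectively random positions in the bit array, which is exactly the scattered memory access pattern the article warns about once the array no longer fits in cache.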

Even If You Don't Believe

From one of Tyler Cowen's long interviews:

Niels Bohr had a horseshoe at his country house across the entrance door, a superstitious item, and a friend asked him, "Why do you have it there? Aren't you a scientist? Do you believe in it?" You know what was Bohr's answer? "Of course I don't believe in it, but I have it there because I was told that it works, even if you don't believe in it."

You don't have to believe in good luck to have good luck.

You Gotta Believe

From Larry Tesler's annotated manual for the PUB document compiler:

In 1970, I became disillusioned with the slow pace of artificial intelligence research.

The commentary on the manual is like a mini-memoir. Tesler writes that he went back to the Stanford AI lab in the spring of 1971. John McCarthy sent him to work with Les Earnest, the lab's chief administrator, who had an idea for a "document compiler", à la RUNOFF, for technical manuals. Tesler had bigger ideas, but he implemented PUB as a learning exercise. Soon PUB had users, who identified shortcomings that were in sync with Tesler's own ideas.

The solution I favored was what we would now call a WYSIWYG interactive text editing and page layout system. I felt that, if the effect of any change was immediately apparent, users would feel more in control. I soon left Stanford to pursue my dream at Xerox PARC (1973-80) and Apple Computer (1980-1997).

Thus began the shift to desktop publishing. And here I sit, in 2020, editing this post using emacs.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

February 10, 2020 2:37 PM

Some Things I Read Recently

Campaign Security is a Wood Chipper for Your Hopes and Dreams

Practical campaign security is a wood chipper for your hopes and dreams. It sits at the intersection of 19 kinds of status quo, each more odious than the last. You have to accept the fact that computers are broken, software is terrible, campaign finance is evil, the political parties are inept, the DCCC exists, politics is full of parasites, tech companies are run by arrogant man-children, and so on.

This piece from last year has some good advice, plenty of sarcastic humor from Maciej, and one remark that was especially timely for the past week:

You will fare especially badly if you have written an app to fix politics. Put the app away and never speak of it again.

Know the Difference Between Neurosis and Process

In a conversation between Tom Waits and Elvis Costello from the late 1980s, Waits talks about tinkering too long with a song:

TOM: "You have to know the difference between neurosis and actual process, 'cause if you're left with it in your hands for too long, you may unravel everything. You may end up with absolutely nothing."

In software, when we keep code in our hands for too long, we usually end up with an over-engineered, over-abstracted boat anchor. Let the tests tell you when you are done, then stop.

Sometimes, Work is Work

People say, "if you love what you do you'll never work a day in your life." I think good work can be painful--I think sometimes it feels exactly like work.

Some weeks more than others. Trust me. That's okay. You can still love what you do.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Software Development

January 26, 2020 10:23 AM

The Narrative Impulse

Maybe people don't tell stories only to make sense of the world, but rather sometimes to deceive themselves?

It was an interesting idea, I said, that the narrative impulse might spring from the desire to avoid guilt, rather than from the need -- as was generally assumed -- to connect things together in a meaningful way; that it was a strategy calculated, in other words, to disburden ourselves of responsibility.

This is from Kudos, by Rachel Cusk. Kudos is the third book in an unconventional trilogy, following Outline and Transit. I blogged on a passage from Transit last semester, about making something that is part of who you are.

I have wanted to recommend Cusk and these books, but I do not feel up to the task of describing how or why I think so highly of them. They are unorthodox narratives about narrative. To me, Cusk is a mesmerizing story-teller who intertwines stories about people and their lives with the fabric of story-telling itself. She seems to value the stories we tell about ourselves, and yet see through them, to some overarching truth.

As for my own narrative impulse, I think of myself as writing posts for this blog in order to make connections among the many things I learn -- or at least that is what I tell myself. Cusk has me taking seriously the idea that some of the stories I tell may come from somewhere else.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

January 06, 2020 3:13 PM

A Writing Game

I recently started reading posts in the archives of Jason Zweig's blog. He writes about finance for a living but blogs more widely, including quite a bit about writing itself. An article called On Writing Better: Sharpening Your Tools challenges writers to look at each word they write as "an alien object":

As the great Viennese journalist Karl Kraus wrote, "The closer one looks at a word, the farther away it moves." Your goal should be to treat every word you write as an alien object: You should be able to look at it and say, What is that doing here? Why did I use that word instead of a better one? What am I trying to say here? How can I get to where I'm going if I use such stale and lifeless words?

My mind immediately turned this into a writing game, an exercise that puts the idea into practice. Take any piece of writing.

  1. Choose a random word in the document.
  2. Change the word -- or delete it! -- in a way that improves the text.
  3. Go to 1.

Play the game for a fixed number of rounds or for a fixed period of time. A devilish alternative is to play until you get so frustrated with your writing that you can't continue. You could then judge your maturity as a writer by how long you can play in good spirits.

We could even automate the mechanics of the game by writing a program that chooses a random word in a document for us. Every time we save the document after a change, it jumps to a new word.
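The mechanics really are simple. A minimal sketch in Python; the function name and the word pattern are mine:

```python
import random
import re

def pick_random_word(text, seed=None):
    """Pick one word at random from the text: the word the player
    must then change or delete to improve the prose."""
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return None
    return random.Random(seed).choice(words)
```

Run it against a draft and the game begins: `pick_random_word(draft_text)` hands you the next alien object to interrogate. The editor integration, jumping to the word on each save, is the only part that takes real work.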

As with most first ideas, this one can probably be improved. Perhaps we should bias word selection toward words whose replacement or deletion are most likely to improve our writing. Changing "the" or "to" doesn't offer the same payoff as changing a lazy verb or deleting an abstract adverb. Or does it? I have a lot of room to improve as a writer; maybe fixing some "the"s and "to"s is exactly what I need to do. The Three Bears pattern suggests that we might learn something by tackling the extreme form of the challenge and seeing where it leads us.

Changing or deleting a single word can improve a piece of text, but there is bigger payoff available, if we consider the selected word in context. The best way to eliminate many vague nouns is to turn them back into verbs, where they act with vigor. To do that, we will have to change the structure of the sentence, and maybe the surrounding sentences. That forces us to think even more deeply about the text than changing a lone word. It also creates more words for us to fix in following rounds!

I like programming challenges of this sort. A writing challenge that constrains me in arbitrary ways might be just what I need to take time more often to improve my work. It might help me identify and break some bad habits along the way. Maybe I'll give this a try and report back. If you try it, please let me know the results!

And no, I did not play the game with this post. It can surely be improved.

Postscript. After drafting this post, I came across another article by Zweig that proposes just such a challenge for the narrower case of abstract adverbs:

The only way to see if a word is indispensable is to eliminate it and see whether you miss it. Try this exercise yourself:
  • Take any sentence containing "actually" or "literally" or any other abstract adverb, written by anyone ever.
  • Delete that adverb.
  • See if the sentence loses one iota of force or meaning.
  • I'd be amazed if it does (if so, please let me know).

We can specialize the writing game to focus on adverbs, another part of speech, or almost any writing weakness. The possibilities...


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns, Teaching and Learning

November 25, 2019 6:06 PM

Demonstrating That Two Infinities Are Equal

I remember first learning as a student that some infinities are bigger than others. For some sets of numbers, it was easy to see how. The set of integers is infinite, and the set of real numbers is infinite, and it seemed immediately clear that there are fewer integers than reals. Demonstrations and proofs of the fact were cool, but I already knew what they showed me.

Other relationships between infinities were not so easy to grok. Consider: There are an infinite number of points on a sheet of paper. There are an infinite number of points on a wall. These infinities are equal to one another. But how? Mathematician Yuri Manin demonstrates how:

I explained this to my grandson, that there are as many points in a sheet of paper as there are on the wall of the room. "Take the sheet of paper, and hold it so that it blocks your view of the wall completely. The paper hides the wall from your sight. Now if a beam of light comes out of every point on the wall and lands in your eye, it must pass through the sheet of paper. Each point on the wall corresponds to a point on the sheet of paper, so there must be the same number of each."

I remember reading that explanation in school and feeling both amazed and enlightened. What sorcery is this? So simple, so beautiful. Informal proofs of this sort made me want to learn more mathematics.
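In the language of set theory, Manin's beam of light is a bijection. In symbols (my notation, not his):

```latex
% Place the eye at a point E, with the paper P held between E and
% the wall W.  Projection through E sends each wall point w to the
% point where the segment from E to w crosses the paper:
f : W \to P, \qquad f(w) = \overline{Ew} \cap P
% Distinct wall points lie on distinct rays through E, so f is
% injective; every paper point lies on a ray from E that continues
% on to the wall, so f is surjective.  Hence |W| = |P|.
```

The sheet of paper and the wall are matched point for point, which is exactly what it means for two sets to have the same cardinality.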

Manin told the story quoted above in an interview a decade or so ago with Mikhail Gelfand, We Do Not Choose Mathematics as Our Profession, It Chooses Us. It was a good read throughout and reminded me again how I came to enjoy math.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

October 30, 2019 3:30 PM

A Few Ideas from Economist Peter Bernstein

I found all kinds of wisdom in this interview with economist Peter Bernstein. It was originally published in 2004 and then updated online a couple of years ago. A lot of the wisdom sounds familiar, as most general wisdom does, but occasionally Bernstein offers a twist. For instance, I like this passage:

I make no excuses or apologies for changing my mind. The world around me changes, for one thing, but also I am continuously learning. I have never finished my education and probably never will.... I'm always telling myself, "I must sit down and explain why I said this, and why I was wrong."

People often speak of the virtue of changing our minds, but Bernstein goes further: he feels a need to explain both the reason he thought what he did and the reason he was wrong. That sort of post-mortem can be immensely helpful to the rest of us as we try to learn, and the humility of explaining the error keeps us all better grounded.

I found quotable passages on almost every page. One quoted Leibniz, which I paraphrased as:

von Leibniz told Bernoulli that nature works in patterns, but "only for the most part". The other part -- the unpredictable part -- tends to be where the action is.

Poking around the fringes of a model that is pretty good or a pattern of thought that only occasionally fails us often brings surprising opportunities for advancement.

Many of Bernstein's ideas were framed specifically as about investing, of course, such as:

The riskiest moment is when you're right. That's when you're in the most trouble, because you tend to overstay the good decisions.

and:

Diversification is not only a survival strategy but also an aggressive strategy, because the next windfall might come from a surprising place.

These ideas are powerful outside the financial world, too, though. Investing too much importance in a productive research area can be risky because it becomes easy to stay there too long after the world starts to move away. Diversifying our programming language skills and toolsets might look like a conservative strategy that limits rapid advance in a research niche right now, but it also equips us to adapt more quickly when the next big idea happens somewhere we don't expect.

Anyway, the interview is a good long-but-quick read. There's plenty more to consider, in particular his application of Pascal's wager to general decision making. Give it a read if it sounds interesting.


Posted by Eugene Wallingford | Permalink | Categories: General

October 27, 2019 10:23 AM

Making Something That Is Part Of Who You Are

The narrator in Rachel Cusk's "Transit" relates a story told to her by Pavel, the Polish builder who is helping to renovate her flat. Pavel left Poland for London to make money after falling out with his father, a builder for whom he worked. The event that prompted his departure was a reaction to a reaction. Pavel had designed and built a home for his family. After finishing, he showed it to his father. His father didn't like it, and said so. Pavel chose to leave at that moment.

'All my life,' he said, 'he criticise. He criticise my work, my idea, he say he don't like the way I talk -- even he criticise my wife and my children. But when he criticise my house' -- Pavel pursed his lips in a smile -- 'then I think, okay, is enough.'

I generally try to separate myself from the code and prose I write. Such distance is good for the soul, which does not need to be buffeted by criticism, whether external or internal, of the things I've created. It is also good for the work itself, which is free to be changed without being anchored to my identity.

Fortunately, I came out of home and school with a decent sense that I could be proud of the things I create without conflating the work with who I am. Participating in writers' workshops at PLoP conferences early in my career taught me some new tools for hearing feedback objectively and focusing on the work. Those same tools help me to give feedback better. I use them in an effort to help my students develop as people, writers and programmers independent of the code and prose they write.

Sometimes, though, we make things that are expressions of ourselves. They carry part of us in their words, in what they say to the world and how they say it. Pavel's house is such a creation. He made everything: the floors, the doors, and the roof; even the beds his children slept in. His father had criticized his work, his ideas, his family before. But criticizing the house he had dreamed and built -- that was enough. Cusk doesn't give the reader a sense that this criticism was a last straw; it was, in a very real way, the only straw that mattered.

I think there are people in this world who would like just once in their lives to make something that is so much a part of who they are that they feel about it as Pavel does his house. They wish to do so despite, or perhaps because of, the sharp line it would draw through the center of life.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

September 14, 2019 2:56 PM

Listen Now

In a YC Female Founder Story, Danielle Morrill gives a wise answer to an old question:

Q: What do you wish someone had told you when you were 15?
I think people were telling me a lot of helpful things when I was 15 but it was very hard to listen.

This may seem more like a wry observation than a useful bit of wisdom. The fifteen-year-olds of today are no more likely to listen to us than we were to listen to adults when we were fifteen. But that presumes young people have more to learn than the rest of us. I'm a lot older than 15, and I still have plenty to learn.

Morrill's answer is a reminder to me to listen more carefully to what people are telling me now. Even now that can be hard, with all the noise out there and with my own ego getting in my way. Setting up my attention systems to identify valuable signals more reliably can help me learn faster and make me a lot more productive. It can also help future-me not want to look back wistfully so often, wishing someone had told me now what I know then.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

August 02, 2019 2:48 PM

Programming is an Infinite Construction Kit

As he so often did, Marvin Minsky loved to tell us about the beauty of programming. Kids love to play with construction sets like Legos, TinkerToys, and Erector sets. Programming provides an infinite construction kit: you never run out of parts!

In the linked essay, which was published as a preface to a 1986 book about Logo, Minsky tells several stories. One of the stories relates that once, as a small child, he built a large tower out of TinkerToys. The grownups who saw it were "terribly impressed". He inferred from their reaction that:

some adults just can't understand how you can build whatever you want, so long as you don't run out of sticks and spools.

Kids get it, though. Why do so many of us grow out of this simple understanding as we get older? Whatever its cause, this gap between children's imaginations and the imaginations of adults around them creates a new sort of problem when we give the children a programming language such as Logo or Scratch. Many kids take to these languages just as they do to Legos and TinkerToys: they're off to the races making things, limited only by their expansive imaginations. The memory on today's computers is so large that children never run out of raw material for writing programs. But adults often don't possess the vocabulary for talking with the children about their creations!

... many adults just don't have words to talk about such things -- and maybe, no procedures in their heads to help them think of them. They just do not know what to think when little kids converse about "representations" and "simulations" and "recursive procedures". Be tolerant. Adults have enough problems of their own.

Minsky thinks there are a few key ideas that everyone should know about computation. He highlights two:

Computer programs are societies. Making a big computer program is putting together little programs.

Any computer can be programmed to do anything that any other computer can do--or that any other kind of "society of processes" can do.

He explains the second using ideas pioneered by Alan Turing and long championed in the popular sphere by Douglas Hofstadter. Check out this blog post, which reflects on a talk Hofstadter gave at my university celebrating the Turing centennial.

The inability of even educated adults to appreciate computing is a symptom of a more general problem. As Minsky says toward the end of his essay, people who don't appreciate how simple things can grow into entire worlds are missing something important. If you don't understand how simple things can grow into complex systems, it's hard to understand much at all about modern science, including how quantum mechanics accounts for what we see in the world and even how evolution works.

You can usually do well by reading Minsky; this essay is a fine example of that. It comes linked to an afterword written by Alan Kay, another computer scientist with a lot to say about both the beauty of computing and its essential role in a modern understanding of the world. Check both out.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

July 30, 2019 3:27 PM

"Eugene-Past Knew Things That Eugene-Present Does Not"

A few months back, Mark Guzdial began to ponder a new research question:

I did some literature searches, and found a highly relevant paper: "Task specific programming languages as a first programming language." And the lead author is... me. I wrote this paper with Allison Elliott Tew and Mike McCracken, and published it in 1997. I honestly completely forgot that I had written this paper 22 years ago. Guzdial-past knew things that Guzdial-present does not.

I know this feeling too well. It seems that whenever I look back at an old blog post, especially from the early years, I am surprised to have already thought something, and usually to have thought it better and more deeply than I'm thinking it now! Perhaps this says something about the quality of my thinking now, or the quality of my blogging then. Or maybe it's simply an artifact of time and memory. In any case, stumbling across a link to an ancient blog entry often leads to a few moments of pleasure after an initial bit of disorientation.

On a related note, the fifteenth anniversary of my first blog post passed while I was at Dagstuhl earlier this month. For the first few years, I regularly wrote twelve to twenty posts a month. Then for a few years I settled into a pattern of ten to twelve monthly. Since early 2017, though, I've been in the single digits, with fewer substantial entries. I'm not giving Eugene-2025 much material to look back on.

With a new academic year soon upon us, I hope to write a bit more frequently and a bit more in depth about my programming, my teaching, and my encounters with computer science and the world. I think that will be good for me in many ways. Sometimes, knowing that I will write something encourages me to engage more deeply than I might otherwise. Nearly every time, the writing helps me to make better sense of the encounter. That's one way to make Eugene-Present a little smarter.

As always, I hope that whoever is still reading here finds it worth their time, too.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

July 05, 2019 12:40 PM

A Very Good Reason to Leave Your Home and Move to a New Country

He applied to switch his major from mathematics to computer science, but the authorities forbade it. "That is what tipped me to accept the idea that perhaps Russia is not the best place for me," he says. "When they wouldn't allow me to study computer science."

-- Sergey Aleynikov, as told to Michael Lewis and reported in Chapter 5 of Flash Boys.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 01, 2019 11:59 AM

Wandering the Stacks

In You Are Here, Ben Hunt writes:

You know what I miss most about the world before Amazon? I miss going to the library and looking up a book in the card catalog, searching the stacks for the book in question, and then losing myself in the experience of discovery AROUND the book I was originally searching for. It's one of the best feelings in the world, and I'm not sure that my children have ever felt it. I haven't felt it in at least 20 years.

My daughters, now in their mid-20s, have felt it. We were a library family, not a bookstore family or an Amazon family. Beginning as soon as they could follow picture books, we spent countless hours at the public library in our town and the one in the neighboring city. We took the girls to Story Time and to other activities, but mostly we went to read and wander and select a big stack of books to take home. The books we took home never lasted as long as we thought they would, so back we'd go.

I still wander the stacks myself, both at the university library and, less often these days, the local public libraries. I always start with a few books in mind, recommendations gathered from friends and articles I've read, but I usually bring home an unexpected bounty. Every year I find a real surprise or two, books I love but would never have known about if I hadn't let myself browse. Even when I don't find anything surprising to take home, it's worth the time I spend just wandering.

Writing a little code often makes my day better. So does going to the library. Walking among books, starting with a goal and then aimlessly browsing, calms me on days I need calming and invigorates me on days when my energy is down. Some days, it does both at the same time. Hunt is right: It's one of the best feelings in the world. I hope that whatever else modern technology does for our children, it gives them something to rival this feeling.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

June 28, 2019 3:39 PM

Another Peter Principle-Like Observation

Raganwald tweeted:

If you design a language for people who have a talent for managing accidental complexity, you'll beget more and more accidental complexity over time.
Someone who can manage accidental complexity will always take on more if it makes them more productive.

This reminded me of a blog post from last November in which I half-jokingly coined The Peter Principle of Software Growth:

Software grows until it exceeds our capacity to understand it.

In the case of Raganwald's tweet, languages that enable us to handle accidental complexity well lead to more accidental complexity, because the people who use them will be more ambitious -- until they reach their saturation point. Both of these observations about software resemble the original Peter Principle, in which people who succeed are promoted until they reach a point at which they can't, or don't, succeed.

I am happy to dub Raganwald's observation "The Peter Principle of Accidental Complexity", but after three examples, I begin to recognize a pattern... Is there a general name for this phenomenon, in which successful actors advance or evolve naturally until they reach a point at which they can't, or don't, succeed?

If you have any ideas, please email me or respond on Twitter.

In a playful mood at the end of a strange and hectic week, I am now wondering whether there is a Peter Principle of Peter Principles.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

June 18, 2019 3:09 PM

Notations, Representations, and Names

In The Power of Simple Representations, Keith Devlin takes on a quote attributed to the mathematician Gauss: "What we need are notions, not notations."

While most mathematicians would agree that Gauss was correct in pointing out that concepts, not symbol manipulation, are at the heart of mathematics, his words do have to be properly interpreted. While a notation does not matter, a representation can make a huge difference.

Spot on. Devlin's opening made me think of that short video of Richard Feynman that everyone always shares, on the difference between knowing the name of something and knowing something. I've seen people mis-interpret Feynman's words in both directions. The people who share this video sometimes seem to imply that names don't matter. Others dismiss the idea as nonsense: how can you not know the names of things and claim to know anything?

Devlin's distinction makes clear the sense in which Feynman is right. Names are like notations. The specific names we use don't really matter and could be changed, if we all agreed. But the "if we all agreed" part is crucial. Names do matter as a part of a larger model, a representation of the world that relates different ideas. Names are an index into the model. We need to know them so that we can speak with others, read their literature, and learn from them.

This brings to mind an article with a specific example of the importance of using the correct name: Through the Looking Glass, or ... This is the Red Pill, by Ben Hunt at Epsilon Theory:

I'm a big believer in calling things by their proper names. Why? Because if you make the mistake of conflating instability with volatility, and then you try to hedge your portfolio today with volatility "protection" ...., you are throwing your money away.

Calling a problem by the wrong name might lead you to the wrong remedy.

Feynman isn't telling us that names don't matter. He's telling us that knowing only names isn't valuable. Names are not useful outside the web of knowledge in which they mean something. As long as we interpret his words properly, they teach us something useful.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

April 28, 2019 10:37 AM

The Smart Already Know They Are Lucky

Writes Matthew Butterick:

As someone who had a good run in the tech world, I buy the theory that the main reason successful tech founders start another company is to find out if they were smart or merely lucky the first time. Of course, the smart already know they were also lucky, so further evidence is unnecessary. It's only the lucky who want proof they were smart.

From a previous update to The Billionaire's Typewriter, recently updated again. I'm not sure this is the main reason that most successful tech founders start another company -- I suspect that many are simply ambitious and driven -- but I do believe that most successful people are lucky many times over, and that the self-aware among them know it.


Posted by Eugene Wallingford | Permalink | Categories: General

March 31, 2019 4:07 PM

Writing Advice to the Aspiring Kurt Vonnegut

In the fall of 1945, Kurt Vonnegut was serving out the last few months of his military commitment after returning home from Dresden. During the day, he did paperwork in the secretarial pool, and at night he wrote stories in the hopes of making a living as a writer when he left the service. One day his wife, Jane, sent four of his stories to one of those agents who used to advertise in magazines and promise to help frustrated writers get into the business. Her cover letter touted Kurt's desire, ambition, and potential.

The agent wrote back with clear-eyed advice for an aspiring professional writer:

You say you think that Kurt is a potential Chekhov. To this I fervently reply "Heaven Save Him!" This is a very revealing statement. I'm glad you made it. I hope the virus has not become so entrenched that it can't be driven out of his system. I recognize the symptoms of a widely prevailing ailment.... Read Chekhov and enjoy him, yes, and all of the other great and inspiring ones, but don't encourage Kurt, or anybody else, to try to write like them. If you want to sell in the current market, you have got to write "current literature". I warmly applaud Kurt's desire to "say something" that will have some influence, however small, that will do something to help uplift humanity. Every writer worth a hoot has ambition. But don't think that it can't be done in terms of current fiction.... So then, what it adds up to or boils down to is this: you have got to master the current technique if you want acceptance for anything, good or drivel, in the current market. The "message to humanity" is a by-product: it always has been.... If you want to make a living writing you will first of all write to entertain, to divert, to amuse. And that in itself is a noble aim.

What a generous response. I don't know if he responded this way to everyone who contacted him, or if he saw something special in Jane Vonnegut's letter. But this doesn't feel like a generic form letter.

It's easy to idealize classic works of art and the writers, poets, and playwrights who created them. We forget sometimes that they were writing for an audience in their own time, sometimes a popular one, and that most often they were using the styles and techniques that connected with the people. Shakespeare and Mozart -- and Chekhov -- made great art and pushed boundaries, but they did so in their "current market". They entertained and amused those who saw performances of their works. And that's more than just okay; it, too, is a noble aim.

I found this story early in Charles Shields's And So It Goes. Shields met Vonnegut in the last year of his life and received his blessing to write the definitive biography of his life. It's not a perfect book, but it's easy to read and contains a boatload of information. I'm not sure why I'm just now getting around to reading it.


Posted by Eugene Wallingford | Permalink | Categories: General

February 28, 2019 4:29 PM

Ubiquitous Distraction

This morning, while riding the exercise bike, I read two items within twenty minutes or so that formed a nice juxtaposition for our age. First came The Cost of Distraction, an old blog post by L.M. Sacasas that reconsiders Kurt Vonnegut's classic story, "Harrison Bergeron" (*). In the story, it is 2081, and the Handicapper General of the United States ensures equality across the land by offsetting any advantages any individual has over the rest of the citizenry. In particular, those of above-average intelligence are required to wear little earpieces that periodically emit high-pitched sounds to obliterate any thoughts in progress. The mentally- and physically-gifted Harrison rebels, to an ugly end.

Soon after came Ian Bogost's Apple's AirPods Are an Omen, an article from last year that explores the cultural changes that are likely to ensue as more and more people wear AirPods and their ilk. ("Apple's most successful products have always done far more than just make money, even if they've raked in a lot of it....") AirPods free the wearer in so many ways, but they also bind us to ubiquitous distraction. Will we ever have a free moment to think deeply when our phones and laptops now reside in our heads?

As Sacasas says near the end of his post,

In the world of 2081 imagined by Vonnegut, the distracting technology is ruthlessly imposed by a government agency. We, however, have more or less happily assimilated ourselves to a way of life that provides us with regular and constant distraction. We have done so because we tend to see our tools as enhancements.

Who needs a Handicapper General when we all walk down to the nearest Apple Store or Best Buy and pop distraction devices into our own ears?

Don't get me wrong. I'm a computer scientist, and I love to program. I also love the productivity my digital tools provide me, as well as the pleasure and comfort they afford. I'm not opposed to AirPods, and I may be tempted to get a pair someday. But there's a reason I don't carry a smart phone and that the only iPod I've ever owned is a 1GB first-gen Shuffle. Downtime is valuable, too.

(*) By now, even occasional readers know that I'm a big Vonnegut fan who wrote a short eulogy on the occasion of his death, nearly named this blog after one of his short stories, and returns to him frequently.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

January 29, 2019 1:46 PM

Dependencies and Customizable Books

Shriram Krishnamurthi, in Books as Software:

I have said that a book is a collection of components. I have concrete evidence that some of my users specifically excerpt sections that suit their purpose. ...

I forecast that one day, rich document formats like PDF will recognize this reality and permit precisely such specifications. Then, when a user selects a group of desired chapters to generate a thinner volume, the software will automatically evaluate constraints and include all dependencies. To enable this we will even need "program" analyses that help us find all the dependencies, using textual concordances as a starting point and the index as an auxiliary data structure.

I am one of the users Krishnamurthi speaks of, who has excerpted sections from his Programming Languages: Application and Interpretation to suit the purposes of my course. Though I've not written a book, I do post, use, adapt, and reuse detailed lecture notes for my courses, and as a result I have seen both sides of the divide he discusses. I occasionally change the order of topics in a course, or add a unit, or drop a unit. An unseen bit of work is to account for the dependencies among concepts, examples, problems, and code in the affected sections, but also in the new whole. My life is simpler than that of book writers, who have to deal at least in part with rich document formats: I do everything in a small, old-style subset of HTML, which means I can use simple text-based tools for manipulating everything. But dependencies? Yeesh.

Maybe I need to write a big makefile for my course notes. Alas, that would not help me manage dependencies in the way I'd like, or in the way Krishnamurthi forecasts. As such, it would probably make things worse. I suppose that I could create the tool I need.
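The dependency-inclusion step Krishnamurthi forecasts is, at heart, a graph-reachability problem: from the chapters a reader selects, follow dependency edges until the set closes. Here is a minimal sketch in Python; the chapter names and the dependency table are invented for illustration, not drawn from any real book.

```python
def closure(selected, deps):
    """Return the selected chapters plus all of their transitive
    dependencies, given a dict mapping each chapter to the
    chapters it depends on."""
    result = set()
    stack = list(selected)
    while stack:
        chapter = stack.pop()
        if chapter not in result:
            result.add(chapter)
            stack.extend(deps.get(chapter, []))
    return result

# Hypothetical dependency graph for generating a thinner volume:
deps = {
    "interpreters": ["recursion", "first-class-functions"],
    "higher-order-functions": ["first-class-functions"],
    "recursion": [],
    "first-class-functions": [],
}

print(sorted(closure({"interpreters"}, deps)))
# ['first-class-functions', 'interpreters', 'recursion']
```

The hard part, of course, is not this traversal but building the `deps` table in the first place -- exactly the "program analysis" over concordances and indexes that Krishnamurthi imagines.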


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

December 31, 2018 1:44 PM

Preserve Process Knowledge

This weekend I read the beginning of Dan Wang's How Technology Grows. One of the themes he presses is that when a country loses its manufacturing base, it also loses its manufacturing knowledge base. This in turn damages the economy's ability to innovate in manufacturing, even on the IT front. He concludes:

It can't be an accident that the countries with the healthiest communities of engineering practice are also in the lead in designing tools for the sector. They're able to embed knowledge into new tools, because they never lost the process knowledge in the first place.
Let's try to preserve process knowledge.

I have seen what happens within an academic department or a university IT unit when it loses process knowledge it once had. Sometimes, the world has changed in a way that makes the knowledge no longer valuable, and the loss is simply part of the organization's natural evolution. But other times the change that precipitated the move away from expertise is temporary or illusory, and the group suddenly finds itself unable to adapt to other changes in the environment.

The portion of the article I read covered a lot of ground. For example, one reason that a manufacturing base matters so much is that services industries have inherent limits, summarized in:

[The] services sector [has] big problems: a lot of it is winner-take-all, and much of the rest is zero-sum.

This longer quote ends a section in which Wang compares the economies of manufacturing-focused Germany and the IT-focused United States:

The US and Germany are innovative in different ways, and they each have big flaws. I hope they fix these flaws. I believe that we can have a country in which wealth is primarily created by new economic activity, instead of by inheritance; which builds new housing stock, instead of permitting current residents to veto construction; which has a government willing to think hard about new projects that it should initiate, instead of letting the budget run on autopilot. I don't think that we should have to choose between industry and the internet; we can have a country that has both a vibrant industrial sector and a thriving internet sector.

This paragraph is a good example of the paper's sub-title, "a restatement of definite optimism". Wang writes clearly and discusses a number of issues relevant to IT as the base for a nation's economy. How Technology Grows is an interesting read.


Posted by Eugene Wallingford | Permalink | Categories: General

December 26, 2018 2:44 PM

It's Okay To Say, "I Don't Know." Even Nobel Laureates Do It.

I ran across two great examples of humility by Nobel Prize-winning economists in recent conversations with Tyler Cowen. When asked, "Should China and Japan move to romanized script?", Paul Romer said:

I basically don't know the answer to that question. But I'll use that as a way to talk about something else ...

Romer could have speculated or pontificated; instead, he acknowledged that he didn't know the answer and pivoted the conversation to a related topic he had thought about (reforming spelling in English, for which he offered an interesting computational solution). By shifting the topic, Romer added value to the conversation without pretending that any answer he could give to the original question would have more value than as speculation.

A couple of months ago, Cowen sat with Paul Krugman. When asked whether he would consider a "single land tax" as a way to encourage a more active and more equitable economy, Krugman responded:

I just haven't done my homework on that.

... and left it there. To his credit, Cowen did not press for an uninformed answer; he moved on to another question.

I love the attitude that Krugman and Romer adopt and really like Krugman's specific answer, which echoed his response to another question earlier in the conversation. We need more people answering questions this way, more often and in more circumstances.

Such restraint is probably even more important in the case of Nobel laureates. If Romer and Krugman choose to speculate on a topic, a lot of people will pay attention, even if it is a topic they know little about. We might learn something from their speculations, but we might also forget that they are only uninformed speculation.

I think what I like best about these answers is the example that Romer and Krugman set for the rest of us: It's okay to say, "I don't know." If you have not done the homework needed to offer an informed answer, it's often best to say so and move on to something you're better prepared to discuss.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

December 24, 2018 2:55 PM

Using a Text Auto-Formatter to Enhance Human Communication

More consonance with Paul Romer, via his conversation with Tyler Cowen: They were discussing how much harder it is to learn to read English than other languages, due to its confusing orthography and in particular the mismatch between sounds and their spellings. We could adopt a more rational way to spell words, but it's hard to change the orthography of a language spoken by a large, scattered population. Romer offered a computational solution:

It would be a trivial translation problem to let some people write in one spelling form, others in the other because it would be word-for-word translation. I could write you an email in rationalized spelling, and I could put it through the plug-in so you get it in traditional spelling. This idea that it's impossible to change spelling I think is wrong. It's just, it's hard, and we should -- if we want to consider this -- we should think carefully about the mechanisms.

This sounds similar to a common problem and solution in the software development world. Programmers working in teams often disagree about the orthography of code, not the spelling so much as its layout, the use of whitespace, and the placement of punctuation. Being programmers, we often address this problem computationally. Team members can stylize their code any way they see fit but, when they check it into the common repository, they run it through a language formatter. Often, these formatters are built into our IDEs. Nowadays, some languages even come with a built-in formatting tool, as Go does with gofmt.

Romer's email plug-in would play a similar role in human-to-human communication, enabling writers to use different spelling systems concurrently. This would make it possible to introduce a more rational way to spell words without having to migrate everyone to the new system all at once. There are still challenges to making such a big change, but they could be handled in an evolutionary way.
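Because the translation Romer describes is word-for-word, the plug-in is little more than a dictionary lookup applied to each word. A toy sketch in Python -- the three-entry "rationalized spelling" table is invented for illustration:

```python
import re

# Hypothetical mapping from rationalized spellings back to
# traditional ones; a real plug-in would carry a full lexicon.
TO_TRADITIONAL = {
    "thru": "through",
    "tho": "though",
    "enuf": "enough",
}

def translate(text, table):
    """Replace each whole word that appears in the table,
    leaving punctuation and unknown words untouched."""
    def swap(match):
        word = match.group(0)
        return table.get(word.lower(), word)
    return re.sub(r"[A-Za-z]+", swap, text)

print(translate("Not enuf time, tho we push thru.", TO_TRADITIONAL))
# Not enough time, though we push through.
```

Running the same table in the other direction would let the traditional-spelling reader write back to the reformer, which is what makes the gradual, two-system transition plausible.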

Maybe Romer's study of Python is turning him into a computationalist! Certainly, being a programmer can help a person recognize the possibility of a computational solution.

Add this idea to his recent discovery of C.S. Peirce, and I am feeling some intellectual kinship to Romer, at least as much as an ordinary CS prof can feel kinship to a Nobel Prize-winning economist. Then, to top it all off, he lists Slaughterhouse-Five as one of his two favorite novels. Long-time readers know I'm a big Vonnegut fan and nearly named this blog for one of his short stories. Between Peirce and Vonnegut, I can at least say that Romer and I share some of the same reading interests. I like his tastes.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 26, 2018 2:04 PM

Self-Help from Hamming

In yesterday's post, I mentioned re-reading Richard Hamming's 1986 talk, You and Your Research. Hamming himself found it useful to manage his own behavior in order to overcome his personal faults, in service of his goal to do great work. I have faults, too, and need occasional reminders to approach my work more intentionally.

I've been at low ebb recently with my own creative work, so there is plenty of low-hanging fruit to be picked after this read. In the short term, I plan to...

  • focus my reading and programming time on material that contributes to specific research and teaching problems I'm working on. In particular, as Hamming says, "you need to keep up more to find out what the problems are than ... to find the solutions" -- then get to work actually solving problems.

  • attend seminars in other departments regularly next semester, especially in our science departments. This action works in the opposite direction from the first bullet, as it broadens my vision beyond my own work. Its benefit is in providing a cross-fertilization of ideas and giving me more chances to converse with smart people outside my area who are solving interesting problems.

I'm also our department head, an administrative role that diverts much of my attention and energy from doing computer science. Hamming doesn't dismiss "management" outright, as so many scientists do. That's heartening, because organizations need good leaders to help create the conditions in which scientists do great work. He even explains why a capable scientist might reasonably choose to become a manager: "The day your vision, what you think needs to be done, is bigger than what you can do single-handedly, then you have to move toward management."

When I became head, I had some ideas about our department that I wanted to help implement from a leadership position. Do I still have such ideas that I need to drive forward? If so, then I need to focus my administrative work on those goals. If not, then I need to think about next steps.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading, Personal

November 17, 2018 4:00 PM

Superior Ideas

In February 1943, while physicist Freeman Dyson was an undergrad at Cambridge, an American friend sent him a copy of Kurt Gödel's "The Consistency of the Continuum Hypothesis". Dyson wrote home about it to his parents:

I have been reading the immortal work (it is only sixty pages long) alternately with The Magic Mountain and find it hard to say which one is better. Mann of course writes better English (or rather the translator does); on the other hand the superiority of the ideas in Gödel just about makes up for that.

Imagine that, only five years later, Dyson would be "drinking tea with Gödel at his home in Princeton". Of course, after having taken classes with the likes of Hardy and Dirac, Dyson was well-prepared. He seems to have found himself surrounded by superior ideas much of his life and, despite his modesty, added a few himself.

I've never read The Magic Mountain, or any Mann, for that matter. I will correct that soon. However, Mann will have to wait until I finish Dyson's Maker of Patterns, in which I found this passage. It is a quite readable memoir that interleaves letters Dyson wrote to his family over the course of thirty-some years with explanatory text and historical asides. I'm glad I picked it up.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

October 21, 2018 9:53 AM

Find the Hard Work You're Willing to Do

I like this passage from John Urschel Goes Pro, about the former NFL player who is pursuing a Ph.D. in math:

The world thinks mathematicians are people for whom math is easy. That's wrong. Sure, some kids, like Urschel, have little trouble with school math. But everyone who starts down the road to creating really new mathematics finds out what Urschel did: It's a struggle. A prickly, sometimes lonely struggle whose rewards are uncertain and a long time coming. Mathematicians are the people who love that struggle.

It's cliché to tell kids to "find their passion". That always seems to me like an awful lot of pressure to put on young adults, let alone teenagers. I meet with potential CS majors frequently, both college students and high school students. Most haven't found their passion yet, and as a result many wonder if there is something wrong with them. I do my best to assure them that, no, there is nothing wrong with them. It's an unreasonable expectation placed on them by a world that, usually with good intentions, is trying to encourage them.

I don't think there is anything I'd rather be than a computer scientist, but I did not walk a straight path to being one. Some choices early on were easy: I like biology as a body of knowledge, but I never liked studying biology. That seemed a decent sign that maybe biology wasn't for me. (High-school me didn't understand that there might be a difference between school biology and being a biologist...) But other choices took time and a little self-awareness.

From the time I was eight years old or so, I wanted to be an architect. I read about architecture; I sent away for professional materials from the American Institute of Architects; I took courses in architectural drafting at my high school. (There was an unexpected benefit to taking those courses: I got to meet a lot of people who were not part of my usual academic crowd.) Then I went off to college to study architecture... and found that, while I liked many things about the field, I didn't really like to do the grunt work that is part of the architecture student's life, and when the assigned projects got more challenging, I didn't really enjoy working on them.

But I had enjoyed working on the hard projects I'd encountered in my programming class back in high school. They were challenges I wanted to overcome. I changed my major and dove into college CS courses, which were full of hard problems -- but hard problems that I wanted to solve. I didn't mind being frustrated for an entire semester one year, working in assembly language and JCL, because I wanted to solve the puzzles.

Maybe this is what people mean when they tell us to "find our passion", but that phrase seems pretty abstract to me. Maybe instead we should encourage people to find the hard problems they like to work on. Which problems do you want to keep working on, even when they turn out to be harder than you expected? Which kinds of frustration do you enjoy, or at least are willing to endure while you figure things out? Answers to these very practical questions might help you find a place where you can build an interesting and rewarding life.

I realize that "Find your passion" makes for a more compelling motivational poster than "What hard problems do you enjoy working on?" (and even that's a lot better than "What kind of pain are you willing to endure?"), but it might give some people a more realistic way to approach finding their life's work.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

October 04, 2018 4:46 PM

Strange Loop 6: Index + This and That

the view from the Stifel Theater

For my convenience and yours, here are all of Strange Loop 2018 posts:

... and a few parting thoughts of the non-technical variety:

  • All the images used in these posts are photos I took at the conference. They are licensed CC Attribution-ShareAlike 3.0 Unported.

  • On Day One, Jason Dagit kept saying H.E., for "homomorphic encryption". For a while I was confused, because my brain kept hearing A.G.

  • I left my laptop in the hotel room this year, in order to engage more with the talks and the people than with a web browser. I'm glad I did: I enjoyed the talks more. I also took fewer and more focused notes. That made blogging easier and quicker.

  • I also decided not to acquire swag as greedily as usual, and I did a pretty good job of holding back... except for this beautiful hard-bound Jane Street notebook with graphed pages:
    swag from Jane Street Capital
    "Enter the Monad." Very nice. They must be doing well.

  • I left St. Louis with a lot of plastic. The Stifel Theater, the conference's main venue, does not recycle plastic. Like many conference-goers, I went through a fair number of water and soda bottles. I hate to see all that plastic go into the landfill and, having driven down, I did not have to contribute. Twice a day, I took whatever bottles I had emptied, and whatever other bottles I found lying around, back to my car and threw them in the trunk. When I got home, they went straight into the recycling bin. Yet another advantage to driving over flying.

I think that's all from Strange Loop 2018. It was fun.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

October 02, 2018 4:04 PM

Strange Loop 5: Day Two

the video screen announcing Philip Wadler's talk

Friday was a long day, but a good one. The talks I saw were a bit more diverse than on Day One: a couple on language design (though even one of those covered a lot more ground than that), one on AI, one on organizations and work-life, and one on theory:

• "All the Languages Together", by Amal Ahmed, discussed a problem that occurs in multi-language systems: when code written in one language invalidates the guarantees made by code written in the other. Most languages are not designed with this sort of interoperability baked in, and their FFI escape hatches make anything possible within foreign code. As a potential solution, Ahmed offered principled escape hatches designed with specific language features in mind. The proposed technique seems like it could be a lot of work, but the research is in its early stages, so we will learn more as she and her students implement the idea.

This talk is yet another example of how so many of our challenges in software engineering are a result of programming language design. It's good to see more language designers taking issues like these seriously, but we have a long way to go.

• I really liked Ashley Williams's talk on the evolution of async in Javascript and Rust. This kind of talk is right up my alley... Williams invoked philosophy, morality, and cognitive science as she reviewed how two different language communities incorporated asynchronous primitives into their languages. Programming languages are designed, to be sure, but they are also the result of "contingent turns of history" (à la Foucault). Even though this turned out to be more of a talk about the Rust community than I had expected, I enjoyed every minute. Besides, how can you not like a speaker who says, "Yes, sometimes I'll dress up as a crab to teach."?

(My students should not expect a change in my wardrobe any time soon...)

• I also enjoyed "For AI, by AI", by Connor Walsh. The talk's subtitle, "Freedom & Evolution of the Algopoetic Avant-Garde", was a bit disorienting, as was its cold open, but the off-kilter structure of the talk was easy enough to discern once Walsh got going: first, a historical review of humans making computers write poetry, followed by a look at something I didn't know existed... a community of algorithmic poets — programs — that write, review, and curate poetry without human intervention. It's a new thing, of Walsh's creation, that looks pretty cool to someone who became drunk on the promise of AI many years ago.

I saw two other talks the second day:

  • the after-lunch address by Philip Wadler, "Categories for the Working Hacker", which I wrote about separately
  • Rachel Krol's Some Things May Never Get Fixed, about how organizations work and how developers can thrive despite how they work

I wish I had more to say about the last talk but, with commitments at home, the long drive beckoned. So, I departed early, sadly, hopped in my car, headed west, and joined the mass exodus that is St. Louis traffic on a Friday afternoon. After getting past the main crush, I was able to relax a bit with the rest of Zen and the Art of Motorcycle Maintenance.

Even a short day at Strange Loop is a big win. This was the tenth Strange Loop, and I think I've been to five, or at least that's what my blog seems to tell me. It is awesome to have a conference like this in Middle America. We who live here benefit from the opportunities it affords us, and maybe folks in the rest of the world get a chance to see that not all great computing ideas and technology happen on the coasts of the US.

When is Strange Loop 2019?


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

September 29, 2018 6:19 PM

Strange Loop 1: Day One

the Strange Loop splash screen from the main hall

Last Wednesday morning, I hopped in my car and headed south to Strange Loop 2018. It had been a few years since I'd listened to Zen and the Art of Motorcycle Maintenance on a conference drive, so I popped it into the tapedeck (!) once I got out of town and fell into the story. My top-level goal while listening to Zen was similar to my top-level goal for attending Strange Loop this year: to experience it at a high level; not to get bogged down in so many details that I lost sight of the bigger messages. Even so, a few quotes stuck in my mind from the drive down. The first is an old friend, one of my favorite lines from all of literature:

Assembly of Japanese bicycle require great peace of mind.

The other was the intellectual breakthrough that unified Phaedrus's philosophy:

Quality is not an object; it is an event.

This idea has been on my mind in recent months. It seemed a fitting theme, too, for Strange Loop.

On the first day of the conference, I saw mostly a mixture of compiler talks and art talks, including:

• @mraleph's "Six Years of Dart", in which he reminisced on the evolution of the language, its ecosystem, and its JIT. I took at least one cool idea from this talk. When he compared the performance of two JITs, he gave a histogram comparing their relative performances, rather than an average improvement. A new system often does better on some programs and worse on others. An average not only loses information; it may mislead.

• Jason Dagit's "Your Secrets are Safe with Julia", about a system that explores the use of homomorphic encryption to compile secure programs. In this context, the key element of security is privacy. As Dagit pointed out, "trust is not transitive", which is especially important when it comes to sharing a person's health data.

• I just loved Hannah Davis's talk on "Generating Music From Emotion". She taught me about data sonification and its various forms. She also demonstrated some of her attempts to tease multiple dimensions of human emotion out of large datasets and to use these dimensions to generate music that reflects the data's meaning. Very cool stuff. She also showed the short video Dragon Baby, which made me laugh out loud.

• I also really enjoyed "Hackett: A Metaprogrammable Haskell", by Alexis King. I've read about this project on the Racket mailing list for a few years and have long admired King's ability in posts there to present complex ideas clearly and logically. This talk did a great job of explaining that Haskell deserves a powerful macro system like Racket's, that Racket's macro system deserves a powerful type system like Haskell's, and that integrating the two is more challenging than simply adding a stage to the compiler pipeline.
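The point from the Dart talk about averages is easy to make concrete. Here is a toy comparison using hypothetical speedup ratios (these numbers are mine, not from the talk):

```python
# Hypothetical relative speedups (new JIT time / old JIT time) on six
# benchmarks; a ratio below 1.0 means the new JIT is faster there.
ratios = [0.5, 0.6, 0.7, 1.3, 1.4, 1.5]

mean = sum(ratios) / len(ratios)
print(f"average ratio: {mean:.2f}")  # prints 1.00 -- "no change", apparently

# The per-benchmark breakdown tells the real story: big wins on some
# programs, comparable losses on others.
faster = sum(r < 1.0 for r in ratios)
slower = sum(r > 1.0 for r in ratios)
print(f"faster on {faster} benchmarks, slower on {slower}")
```

The mean says the two JITs perform identically; the histogram view shows a bimodal result that would matter a lot to users on either side of it.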

I saw two other talks the first day:

  • the opening keynote address by Simon Peyton Jones, "Shaping Our Children's Education in Computing" [ link ]
  • David Schmüdde, "Misuser" [ link ]

My thoughts on these talks are more extensive and warrant short entries of their own, to follow.

I had almost forgotten how many different kinds of cool ideas I can encounter in a single day at Strange Loop. Thursday was a perfect reminder.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

September 13, 2018 3:50 PM

Legacy

In an interview at The Great Discontent, designer John Gall is asked, "What kind of legacy do you hope to leave?" He replies:

I have no idea; it's not something I think about. It's the thing one has the least control over. I just hope that my kids will have nice things to say about me.

I admire this answer.

No one is likely to ask me about my legacy; I'm just an ordinary guy. But it has always seemed strange when people -- presidents, artists, writers, film stars -- are asked this question. The idea that we can or should craft our own legacy like a marketing brand seems venal. We should do things because they matter, because they are worth doing, because they make the world better, or at least better than it would be without us. It also seems like a waste of time. The simple fact is that most of us won't be remembered long beyond our deaths, and only then by close family members and friends. Even presidents, artists, writers, and film stars are mostly forgotten.

To the extent that anyone will have a legacy, it will be decided in the future by others. As Gall notes, we don't have much control over how that will turn out. History is full of people whose place in the public memory turned out much differently than anyone might have guessed at the time.

When I am concerned that I'm not using my time well, it's not because I am thinking of my legacy. It's because I know that time is a precious and limited resource and I feel guilty for wasting it.

About the most any of us can hope is that our actions in this life leave a little seed of improvement in the world after we are gone. Maybe my daughters and former students and friends can make the world better in part because of something in the way I lived. If that's what people mean by their legacy, great, but it's likely to be a pretty nebulous effect. Not many of us can be Einstein or Shakespeare.

All that said, I do hope my daughters have good things to say about me, now and after I'm gone. I love them, and like them a lot. I want to make their lives happier. Being remembered well by them might also indicate that I put my time on Earth to good use.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

September 03, 2018 7:24 AM

Lay a Split of Good Oak on the Andirons

There are two spiritual dangers in not owning a farm. One is the danger of supposing that breakfast comes from the grocer, and the other that heat comes from the furnace.

The remedy for the first, according to Aldo Leopold, is to grow a garden, preferably in a place without the temptation and distraction of a grocery store. The remedy for the second is to "lay a split of good oak on the andirons" and let it warm your body "while a February blizzard tosses the trees outside".

I ran across Leopold's The Sand County Almanac in the local nature center late this summer. After thumbing through the pages during a break in a day-long meeting indoors, I added it to my long list of books to read. My reading list is actually a stack, so there was some hope that I might get to it soon -- and some danger that it would be buried before I did.

Then an old high school friend, propagating a meme on Facebook, posted a picture of the book and wrote that it had changed his life, changed how he looked at the world. That caught my attention, so I anchored it atop my stack and checked a copy out of the university library.

It now serves as a quiet read for this city boy on a dark and rainy three-day weekend. There are no February blizzards here yet, of course, but autumn storms have lingered for days. In an important sense, I'm not a "city boy", as my big-city friends will tell me, but I've lived my life mostly sheltered from the reality of growing my own food and heating my home by a wonderful and complex economy of specialized labor that benefits us all. It's good to be reminded sometimes of that good fortune, and also to luxuriate in the idea of experiencing a different kind of life, even if only for a while.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

August 09, 2018 1:03 PM

Gerald Weinberg Has Passed Away

I just read on the old Agile/XP mailing list that Jerry Weinberg passed away on Tuesday, August 7. The message hailed Weinberg as "one of the finest thinkers on computer software development". I, like many, was a big fan of his work.

My first encounter with Weinberg came in the mid-1990s when someone recommended The Psychology of Computer Programming to me. It was already over twenty years old, but it captivated me. It augmented years of experience in the trenches developing computer software with a deep understanding of psychology and anthropology and the firm but gentle mindset of a gifted teacher. I still refer back to it after all these years. Whenever I open it up to a random page, I learn something new again. If you've never read it, check it out now. You can buy the ebook -- along with many of Weinberg's books -- online through LeanPub.

After the first book, I was hooked. I never had the opportunity to attend one of Weinberg's workshops, but colleagues lavished them with praise. I should have made more of an effort to attend one. My memory is foggy now, but I do think I exchanged email messages with him once back in the late 1990s. I'll have to see if I can dig them up in one of my mail archives.

Fifteen years ago or so, I picked up a copy of Introduction to General Systems Thinking tossed out by a retiring colleague, and it became the first in a small collection of Weinberg books now on my shelf. As older colleagues retire in the coming years, I would be happy to salvage more titles and extend my collection. It won't be worth much on the open market, but perhaps I'll be able to share my love of Weinberg's work with students and younger colleagues. Books make great gifts, and more so a book by Gerald Weinberg.

Perhaps I'll share them with my non-CS friends and family, too. A couple of summers back, my wife saw a copy of Are Your Lights On?, a book Weinberg co-wrote with Donald Gause, sitting on the floor of my study at home. She read it and liked it a lot. "You get to read books like that for your work?" Yes.

I just read Weinberg's final blog entry earlier this week. He wasn't a prolific blogger, but he wrote a post every week or ten days, usually about consulting, managing, and career development. His final post touched on something that we professors experience at least occasionally: students sometimes solve the problems we set before them better than we expected, or better than we ourselves can do. He reminded people not to be defensive, even if it's hard, and to see the situation as an opportunity to learn:

When I was a little boy, my father challenged me to learn something new every day before allowing myself to go to bed. Learning new things all the time is perhaps the most important behavior in my life. It's certainly the most important behavior in our profession.

Weinberg was teaching us to the end, with grace and gratitude. I will miss him.

Oh, and one last personal note: I didn't know until after he passed that we shared the same birthday, a few years apart. A meaningless coincidence, of course, but it made me smile.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

August 07, 2018 3:04 PM

Too Bad Richard Feynman Didn't Have a Blog

There is a chapter in "Surely You're Joking, Mr. Feynman" about Feynman's work with biologists over summers and sabbaticals at Princeton and Cal Tech. He used a sabbatical year to work in a colleague's lab on bacteriophages, ribosomes, and RNA. After describing how he had ruined a potentially "fantastic and vital discovery" through sloppiness, he writes:

The other work on the phage I never wrote up -- Edgar kept asking me to write it up, but I never got around to it. That's the trouble with not being in your own field: You don't take it seriously.

I did write something informally on it. I sent it to Edgar, who laughed when he read it. It wasn't in the standard form that biologists use -- first, procedures, and so forth. I spent a lot of time explaining things that all the biologists knew. Edgar made a shortened version, but I couldn't understand it. I don't think they ever published it. I never published it directly.

Too bad Feynman didn't have a blog. I'll bet I could have learned something from his write-up. Not being a biologist, I generally can use some explanation intended for a lay reader, and Feynman's relaxed style might pull me through a biology paper. (Of all the sciences, biology is usually the biggest chore for me to learn.)

These days, scientists can post their informal writings on their blogs with little or no fuss. Standard form and formal style are for journals and conferences. Blog readers prefer relaxed writing and, for the most part, whatever form works best for the writer in order to get the ideas out to the world.

Imagine what a trove of stories Feynman could have told on his blog! He did tell them, of course, but in books like "Surely You're Joking, Mr. Feynman". But not everyone is going to write books, or have books written for them, so I'm glad to have the blogs of scientists, economists, and writers from many disciplines in my newsreader. For those who want something more formal before, or instead of, taking on the journal grind, we have arXiv.org. What a time to be alive.

Of course, when you read on in the chapter, you learn that James Watson (of Watson & Crick fame) heard about Feynman's work, thought it was interesting, invited Feynman to give a seminar talk at Harvard, and then went into the lab with him to conduct an experiment that very same week. I guess it all worked out for Feynman in the end.


Posted by Eugene Wallingford | Permalink | Categories: General

August 05, 2018 10:21 AM

Three Uses of the Knife

I just finished David Mamet's Three Uses of the Knife, a wide-ranging short book with the subtitle: "on the nature and purpose of drama". It is an extended essay on how we create and experience drama -- and how these are, in the case of great drama, the same journey.

Even though the book is only eighty or so pages, Mamet characterizes drama in so many ways that you'll have to either assemble a definition yourself or accept the ambiguity. Among them, he says that the job of drama and art is to "delight" us and that "the cleansing lesson of the drama is, at its highest, the worthlessness of reason."

Mamet clearly believes that drama is central to other parts of life. Here's a cynical example, about politics:

The vote is our ticket to the drama, and the politician's quest to eradicate "fill in the blank", is no different from the promise of the superstar of the summer movie to subdue the villain -- both promise us diversion for the price of a ticket and a suspension of disbelief.

As reader, I found myself using the book's points to ruminate about other parts of life, too. Consider the first line of the second essay:

The problems of the second half are not the problems of the first half.

Mamet uses this to launch into a consideration of the second act of a drama, which he holds equally to be a consideration of writing the second act of a drama. But with fall semester almost upon us, my thoughts jumped immediately to teaching a class. The problems of teaching the second half of a class are quite different from the problems of teaching the first half. The start of a course requires the instructor to lay the foundation of a topic while often convincing students that they are capable of learning it. By midterm, the problems include maintaining the students' interest as their energy flags and the work of the semester begins to overwhelm them. The instructor's energy -- my energy -- begins to flag, too, which echoes Mamet's claim that the journey of the creator and the audience are often substantially the same.

A theme throughout the book is how people immerse themselves in story, suspending their disbelief, even creating story when they need it to soothe their unease. Late in the book, he connects this theme to religious experience as well. Here's one example:

In suspending their disbelief -- in suspending their reason, if you will -- for a moment, the viewers [of a magic show] were rewarded. They committed an act of faith, or of submission. And like those who rise refreshed from prayers, their prayers were answered. For the purpose of the prayer was not, finally, to bring about intercession in the material world, but to lay down, for the time of the prayer, one's confusion and rage and sorrow at one's own powerlessness.

This all makes the book sound pretty serious. It's a quick read, though, and Mamet writes with humor, too. It feels light even as it seems to be a philosophical work.

The following paragraph wasn't intended as humorous but made me, a computer scientist, chuckle:

The human mind cannot create a progression of random numbers. Years ago computer programs were created to do so; recently it has been discovered that they were flawed -- the numbers were not truly random. Our intelligence was incapable of creating a random progression and therefore of programming a computer to do so.

This reminded me of a comment that my cognitive psychology prof left on the back of an essay I wrote in class. He wrote something to the effect, "This paper gets several of the particulars incorrect, but then that wasn't the point. It tells the right story well." That's how I felt about this paragraph: it is wrong on a couple of important facts, but it advances the important story Mamet is telling ... about the human propensity to tell stories, and especially to create order out of our experiences.
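For the curious, the mundane truth behind Mamet's claim is that early generators really were flawed, just not in the mystical way he suggests. Von Neumann's middle-square method, one of the first computer PRNGs, falls into short cycles almost immediately. A tiny sketch, with a seed chosen to make the collapse obvious:

```python
def middle_square(seed, n=8):
    """Von Neumann's middle-square method: square a 4-digit number
    and take the middle four digits as the next value."""
    x, values = seed, []
    for _ in range(n):
        x = (x * x) // 100 % 10000  # middle 4 digits of the 8-digit square
        values.append(x)
    return values

# With seed 4100 the generator cycles after only four values:
# 8100 -> 6100 -> 2100 -> 4100 -> 8100 -> ...
print(middle_square(4100))
```

Flawed, yes, but deterministically so; the story about human intelligence is Mamet's, not the mathematics'.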

Oh, and thanks to Anna Gát for bringing the book to my attention, in a tweet to Michael Nielsen. Gát has been one of my favorite new follows on Twitter in the last few months. She seems to read a variety of cool stuff and tweet about it. I like that.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

July 17, 2018 2:32 PM

Get Attached to Solving Problems for People

In Getting Critiqued, Adam Morse reflects on his evolution from art student to web designer, and how that changed his relationship with users and critiques. Artists create things in which they are, at some level, invested. Their process matters. As a result, critiques, however well-intentioned, feel personal. The work isn't about a user; it's about you. But...

... design is different. As a designer, I don't matter. My work doesn't matter. Nothing I make matters in the context of my process. It's all about the people you are building for. You're just trying to solve problems for people. Once you realize this, it's the most liberating thing.

Now, criticism isn't really about you as artist. It's about how well the design meets the needs of the user. With that in mind, the artist can put some distance between himself or herself and think about the users. That's probably what the users are paying for anyway.

I've never been a designer, but I was fortunate to learn how better to separate myself from my work by participating in the software patterns community and its writers' workshop format. From the workshops, I came to appreciate the value of providing positive and constructive feedback in a supportive way. But I also learned to let critiques from others be about my writing and not about me. The ethos of writers' workshops is one of shared commitment to growth and so creates as supportive a framework as possible in which to deliver suggestions. Now, even when I'm not in a conspicuously supportive environment, I am better able to detach myself from my work. It's never easy, but it's easier. This mindset can wear off a bit over time, so I find an occasional inoculation via PLoP or another supportive setting to be useful.

Morse offers another source of reminder: the designs we create for the web -- and for most software, too -- are not likely to last forever. So...

Don't fall in love with borders, gradients, a shade of blue, text on blurred photos, fancy animations, a certain typeface, flash, or music that autoplays. Just get attached to solving problems for people.

That last sentence is pretty good advice for programmers and designers alike. If we detach ourselves from our specific work output a bit and instead attach ourselves to solving problems for other people, we'll be able to handle their critiques more calmly. As a result, we are also likely to do better work.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

June 03, 2018 10:50 AM

Two Thoughts on Teaching

... from my morning reading.

First, a sentence from Bryan Caplan, about one of his influences, philosopher Michael Huemer:

I think what's great about this book, and really all of Mike's work, is he always tries to start off with premises that make sense to people who don't already agree, and then try to get somewhere.

I value people who take the time to construct arguments in this way. It's surprisingly rare in academic discourse and public discourse. Teachers usually learn pretty quickly, though, that the most effective way to teach is to start where your students are: recognize the state of their knowledge and respect their current beliefs. I try to remind myself of this principle regularly during a course, or I'm likely to go off track.

Second, the closing exchange from a 1987 interview with Stanley Kubrick. Kubrick has been talking about how the critics' views of his films tend to evolve over time. The interviewer wrapped up the conversation with:

Well, you don't make it easy on viewers or critics. You create strong feelings, but you won't give us any easy answers.

That's because I don't have any easy answers.

That seems like a pretty good aspiration to have for teaching, that people can say it creates strong feelings but doesn't give any easy answers. Much of teaching is simpler than this, of course, especially in a field such as computer science. A closure is something that we can understand as it is, as is, say, an algorithm for parsing a stream of tokens. But after you learn a few concepts and start trying to build or understand a complex system, easy answers are much harder to come by. Even so, I do hope that students leave my courses with strong feelings about their craft. Those feelings may not match my own, and they'll surely still be evolving, but they will be a product of the student engaging with some big ideas and trying them out on challenging problems.

Maybe if I keep reading interesting articles on the exercise bike and making connections to my craft, I can get this computer science thing down better.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

June 01, 2018 3:05 PM

Prepare to Appreciate the Solution

This post isn't really about chess, though it might seem at first to be.

In The Reviled Art, chess grandmaster Stuart Rachels says that most grandmasters don't like composed chess problems because they are too difficult. It's easy to imagine why average chessplayers find problems too difficult: they aren't all that great at chess. But why grandmasters? Rachels contends that problems are hard for tournament players because they are counterintuitive: the solutions contradict the intuitions developed by players whose chess skill is developed and sharpened over the board.

Rachels then says:

Most problems stump me too, so I conceive of the time I spend looking at them as time spent preparing to appreciate their solutions -- not as time spent trying to solve them.

I love this attitude. If I view time spent banging my head against a puzzle or a hard problem as "trying to solve the problem", then not solving the problem might feel like failure. If I view that time as "preparing to appreciate the solution", then I can feel as if my time was well spent even if I don't solve it -- as long as I can appreciate the beauty or depth or originality of the solution.

This attitude is helpful outside of chess. Maybe I'm trying to solve a hard programming problem or trying to understand a challenging area of programming language theory that is new to me. I don't always solve the problem on my own or completely understand the new area without outside help or lots of time reading and thinking. But I often do appreciate the solution once I see it. All the time I spent working on the problem prepared me for that moment.

I often wish that more of my students would adopt Rachels's attitude. I frequently pose a problem for them to work on for a few minutes before we look at a solution, or several candidates, as a group. All too often some students look at the problem, think it's too difficult, and then just sit there waiting for me to show them the answer. This approach often results in them feeling two kinds of failure: they didn't solve the problem, and they don't even appreciate the solution when they see it. They haven't put in the work thinking about it that prepares their minds to really get the solution. Maybe I can do more to help students realize that the work is worth the effort even if they don't think they can solve the problem. Send me your suggestions!

Rachels's point about the counterintuitiveness of composed chess problems indicates another way in which trying to solve unorthodox problems can be worthwhile. Sometimes our intuitions let us down because they are too narrow, or even wrong. Trying to solve an unorthodox problem can help us broaden our thinking. My experience with chess compositions is that most of the ideas I need to solve them will not be helpful in over-the-board play; those kinds of positions simply don't occur in real games. But a few themes do apply, and practicing with them helps me learn how to play better in game situations. If nothing else, working on unorthodox problems reminds me to look outside the constraints of my intuitions sometimes when a problem in real life seems too hard.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

May 09, 2018 4:02 PM

Middles

In an old blog post promoting his book on timing, Daniel Pink writes:

... Connie Gersick's research has shown that group projects rarely progress in a steady, linear way. Instead, at the beginning of a project, groups do very little. Then at a certain moment, they experience a sudden burst of activity and finally get going. When is that moment? The temporal midpoint. Give a team 34 days, they get started in earnest on day 17. Give a team 11 days, they really get going on day 6. In addition, there’s other research showing that being behind at the midpoint--in NBA games and in experimental settings--can boost performance in the second half.

So we need to recognize midpoints and try to use them as a spark rather than a slump.

I wonder if this research suggests that we should favor shorter projects over longer ones. If most of us start going full force only at the middle of our projects, perhaps we should make the middle of our projects come sooner.

I'll admit that I have a fondness for short over long: short iterations over long iterations in software development, quarters over semesters in educational settings, short books (especially non-fiction) over long books. Shorter cycles seem to lead to higher productivity, because I spend more time working and less time ramping up and winding down. That seems to be true for my students and faculty colleagues, too.

In the paragraph that follows the quoted passage, Pink points inadvertently to another feature of short projects that I appreciate: more frequent beginnings and endings. He talks about the poignancy of endings, which adds meaning to the experience. On the other end of the cycle are beginnings, which create a sense of newness and energy. I always look forward to the beginning of a new semester or a new project for the energy it brings me.

Agile software developers know that, on top of these reasons, short projects offer another potent advantage: more opportunities to take stock of what we have learned and feed that learning back into what we do.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

May 04, 2018 1:25 PM

No Venom Here

Ken Perlin liked Ready Player One at the theater and then went off to read some reviews:

Many critics seem incensed, indignant, left sputtering in outrage at the very idea of a Spielberg film that is simply fun, a pop confection designed mainly to entertain and delight.
Perhaps some of it is their feeling of horror that modern pop culture might be something worthy of celebrating, simply for the sake of celebrating a phenomenon that many people find delightful. But why the extreme degree of venom?

I am an unashamed fan of pop culture: music, TV, movies, and all the rest. My biggest complaint these days is that I can't keep up with all the good stuff being created... (It helps that I'm not as big a fan of superhero movies as most people.) Critics can claim to serve as the gatekeepers of culture if they want, but I'll enjoy "Shut Up and Dance" [ YouTube ] all the same.


Posted by Eugene Wallingford | Permalink | Categories: General

March 22, 2018 4:05 PM

Finally, Some Good News

It's been a tough semester. On top of the usual business, there have been a couple of extra stresses. First, I've been preparing for the departure of a very good friend, who is leaving the university and the area for family and personal reasons. Second, a good friend and department colleague took an unexpected leave that turned into a resignation. Both departures cast a distant pall over my workdays. This week, though, has offered a few positive notes to offset the sadness.

Everyone seems to complain about email these days, and I certainly have been receiving and sending more than usual this semester, as our students and I adjust to the change in our faculty. But sometimes an email message makes my day better. Exhibit 1, a message from a student dealing with a specific issue:

Thank you for your quick and helpful response!
Things don't look so complicated or hopeless now.

Exhibit 2, a message from a student who has been taming the bureaucracy that arises whenever two university systems collide:

I would like to thank you dearly for your prompt and thorough responses to my numerous emails. Every time I come to you with a question, I feel as though I am receiving the amount of respect and attention that I wish to be given.

Compliments like these make it a lot easier to muster the energy to deal with the next batch of email coming in.

There has also been good news on the student front. I received email from a rep at a company in Madison, Wisconsin, where one of our alumni works. They are looking for developers to work in a functional programming environment and are having a hard time filling the positions locally, despite the presence of a large and excellent university in town. Our alum is doing well enough that the company would like to hire more from our department, which is doing a pretty good job, too.

Finally, today I spoke in person with two students who had great news about their futures. One has accepted an offer to join the Northwestern U. doctoral program and work in the lab of Kenneth Forbus. I studied Forbus's work on qualitative reasoning and analogical reasoning as a part of my own Ph.D. work and learned a lot from him. This is a fantastic opportunity. The other student has accepted an internship to work at PlayStation this summer, working on the team that develops the compilers for its game engines. He told me, "I talked a lot about the project I did in your course last semester during my interview, and I assume that's part of the reason I got an offer." I have to admit, that made me smile.

I had both of these students in my intro class a few years back. They would have succeeded no matter who taught their intro course, or the compiler course, for that matter, so I can't take any credit for their success. But they are outstanding young men, and I have had the pleasure of getting to know them over the last four years. News of the next steps in their careers makes me feel good, too.

I think I have enough energy to make it to the end of the semester now.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

March 12, 2018 3:43 PM

Technology is a Place Where We Live

Yesterday morning I read The Good Room, a talk Frank Chimero gave last month. Early on in the talk, Chimero says:

Let me start by stating something obvious: in the last decade, technology has transformed from a tool that we use to a place where we live.

This sentence jumped off the page both for the content of the assertion and for the decade time frame with which he bounds it. In the fall of 2003, I taught a capstone course for non-majors that is part of my university's liberal arts core. The course, titled "Environment, Technology, and Society", brings students from all majors on campus together in a course near the end of their studies, to apply their general education and various disciplinary expertises to problems of some currency in the world. As you might guess from the title, the course focuses on problems at the intersection of the natural environment, technology, and people.

My offering of the course put a twist on the usual course content. We focused on the man-made environment we all live in, which even by 2003 had begun to include spaces carved out on the internet and web. The only textbook for the course was Donald Norman's The Design of Everyday Things, which I think every university graduate should have read. The topics for the course, though, had a decided IT flavor: the effect of the Internet on everyday life, e-commerce, spam, intellectual property, software warranties, sociable robots, AI in law and medicine, privacy, and free software. We closed with a discussion of what an educated citizen of the 21st century ought to know about the online world in which they would live in order to prosper as individuals and as a society.

The change in topic didn't excite everyone. A few came to the course looking forward to a comfortable "save the environment" vibe and were resistant to considering technology they didn't understand. But most were taking the course with no intellectual investment at all, as a required general education course they didn't care about and just needed to check off the list. In a strange way, their resignation enabled them to engage with the new ideas and actually ask some interesting questions about their future.

Looking back now after fifteen years, the course design looks pretty good. I should probably offer to teach it again, updated appropriately, of course, and see where young people of 2018 see themselves in the technological world. As Chimero argues in his talk, we need to do a better job building the places we want to live in -- and that we want our children to live in. Privacy, online peer pressure, and bullying all turned out differently than I expected in 2003. Our young people are worse off for those differences, though I think most have learned ways to live online in spite of the bad neighborhoods. Maybe they can help us build better places to live.

Chimero's talk is educational, entertaining, and quotable throughout. I tweeted one quote: "How does a city wish to be? Look to the library. A library is the gift a city gives to itself." There were many other lines I marked for myself, including:

  • Penn Station "resembles what Kafka would write about if he had the chance to see a derelict shopping mall." (I'm a big Kafka fan.)
  • "The wrong roads are being paved in an increasingly automated culture that values ease."
Check the talk out for yourself.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

March 06, 2018 4:11 PM

A Good Course in Epistemology

Theoretical physicist Marcelo Gleiser, in The More We Know, the More Mystery There Is:

But even if we did [bring the four fundamental forces together in a common framework], and it's a big if right now, this "unified theory" would be limited. For how could we be certain that a more powerful accelerator or dark matter detector wouldn't find evidence of new forces and particles that are not part of the current unification? We can't. So, dreamers of a final theory need to recalibrate their expectations and, perhaps, learn a bit of epistemology. To understand how we know is essential to understand how much we can know.

People are often surprised to hear that, in all my years of school, my favorite course was probably PHL 440 Epistemology, which I took in grad school as a cognate to my CS courses. I certainly enjoyed the CS courses I took as a grad student, and as an undergrad, too, but my study of AI was enhanced significantly by courses in epistemology and cognitive psychology. The prof for PHL 440, Dr. Rich Hall, became a close advisor to my graduate work and a member of my dissertation committee. Dr. Hall introduced me to the work of Stephen Toulmin, whose model of argument influenced my work immensely.

I still have the primary volume of readings that Dr. Hall assigned in the course. Looking back now, I'd forgotten how many of W.V.O. Quine's papers we'd read... but I enjoyed them all. The course challenged most of my assumptions about what it means "to know". As I came to appreciate different views of what knowledge might be and how we come by it, my expectations of human behavior -- and my expectations for what AI could be -- changed. As Gleiser suggests, to understand how we know is essential to understanding what we can know, and how much.

Gleiser's epistemology meshes pretty well with my pragmatic view of science: it is descriptive, within a particular framework and necessarily limited by experience. This view may be why I gravitated to the pragmatists in my epistemology course (Peirce, James, Rorty), or perhaps the pragmatists persuaded me better than the others.

In any case, the Gleiser interview is a delightful and interesting read throughout. His humble view of science may get you thinking about epistemology, too.

... and, yes, that's the person for whom a quine in programming is named. Thanks to Douglas Hofstadter for coining the term and for giving us programming nuts a puzzle to solve in every new language we learn.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Patterns, Personal

January 22, 2018 3:50 PM

Same Footage, Different Film

In In the Blink of an Eye, Walter Murch tells the story of human and chimpanzee DNA, about how the DNA itself is substantially the same and how the sequencing, which we understand less well, creates different beings during the development of the newborn. He concludes by bringing the analogy back to film editing:

My point is that the information in the DNA can be seen as uncut film and the mysterious sequencing as the editor. You could sit in one room with a pile of dailies and another editor could sit in the next room with exactly the same footage and both of you could make different films out of the same material.

This struck me as quite the opposite of what programmers do. When given a new problem and a large language in which to solve it, two programmers can choose substantially different source material and yet end up telling the same story. Functional and OO programmers, say, may decompose the problem in a different way and rely on different language features to build their solutions, but in the end both programs will solve the same problem and meet the same user need. Like the chimp and the human, though, the resulting programs may be better adapted for living in different environments.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

January 17, 2018 3:51 PM

Footnotes

While discussing the effective use of discontinuities in film, whether motion within a context or a change of context, Walter Murch tells a story about... bees:

A beehive can apparently be moved two inches each night without disorienting the bees the next morning. Surprisingly, if it is moved two miles, the bees also have no problem: They are forced by the total displacement of their environment to re-orient their sense of direction, which they can do easily enough. But if the hive is moved two yards, the bees become fatally confused. The environment does not seem different to them, so they do not re-orient themselves, and as a result, they will not recognize their own hive when they return from foraging, hovering instead in the empty space where the hive used to be, while the hive itself sits just two yards away.

This is fascinating, as well as being a really cool analogy for the choices movie editors face when telling a story on film. Either change so little that viewers recognize the motion as natural, or change enough that they re-orient their perspective. Don't stop in the middle.

What is even cooler to me is that this story appears in a footnote.

One of the things I've been loving about In the Blink of an Eye is how Murch uses footnotes to teach. In many books, footnotes contain minutia or references to literature I'll never read, so I skip them. But Murch uses them to tell stories that elaborate on or deepen his main point but which would, if included in the text, interrupt the flow of the story he has constructed. They add to the narrative without being essential.

I've already learned a couple of cool things from his footnotes, and I'm not even a quarter of the way into the book. (I've been taking time to mull over what I read...) Another example: while discussing the value of discontinuity as a story-telling device, Murch adds a footnote that connects this practice to the visual discontinuity found in ancient Egyptian painting. I never knew before why the perspective in those drawings was so unusual. Now I do!

My fondness for Murch's footnotes may stem from something more than their informative nature. When writing up lecture notes for my students, I like to include asides, digressions, and links to optional readings that expand on the main arc of the story. I'd like for them to realize that what they are learning is part of a world bigger than our course, that the ideas are often deeper and have wider implications than they might realize. And sometimes I just like to entertain with a connection. Not all students care about this material, but for the ones who do, I hope they get something out of them. Students who don't care can do what I do in other books: skip 'em.

This book gives me a higher goal to shoot for when including such asides in my notes: elaborate without being essential; entice without disrupting.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

January 07, 2018 10:25 AM

95:1

This morning, I read the first few pages of In the Blink of an Eye, an essay on film editing by Walter Murch. He starts by talking about his work on Apocalypse Now, which took well over a year in large part because of the massive amount of film Coppola shot: 1,250,000 linear feet, enough for 230 hours of running time. The movie ended up being about two hours and twenty-five minutes, so Murch and his colleagues culled 95 minutes of footage for every minute that made it into the final product. A more typical project, Murch says, has a ratio of 20:1.
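Murch's numbers hold up to a quick back-of-the-envelope check. This sketch assumes the standard 35mm projection rate of 90 feet per minute (24 frames per second), which is not stated in the essay but is the usual conversion:

```python
# Sanity-check Walter Murch's figures for Apocalypse Now.
feet_shot = 1_250_000                # linear feet of film shot
minutes_shot = feet_shot / 90        # 35mm runs at about 90 feet per minute
hours_shot = minutes_shot / 60       # roughly the 230 hours Murch reports
final_cut = 2 * 60 + 25              # final running time: 145 minutes
ratio = minutes_shot / final_cut     # minutes shot per minute kept

print(f"{hours_shot:.0f} hours shot, roughly {ratio:.0f}:1")
```

The arithmetic lands right around the 230 hours and 95:1 ratio quoted in the essay, with small differences due to rounding.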

Even at 20:1, Murch's story puts into clearer light the amount of raw material I create when designing a typical session for one of my courses. The typical session mixes exposition, examples, student exercises, and (less than I'd like to admit) discussion. Now, whenever I feel like a session comes up short of my goal, I will think back on Murch's 20:1 ratio and realize how much harder I might work to produce enough material to assemble a good session. If I want one of my sessions to be an Apocalypse Now, maybe I'll need to shoot higher.

This motivation comes at a favorable time. Yesterday I had a burst of what felt like inspiration for a new first day to my Programming Languages course. At the end of the brainstorm came what is now the working version of my opening line in the course: "In the beginning, there was assembly language." Let's see if I have enough inspiration -- and make enough time -- to turn the idea into what I hope it can be: a session that fuels my students' imagination for a semester's journey through Racket, functional programming, and examining language ideas with interpreters.

I do hope, though, that the journey itself does not bring to mind Apocalypse Now.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

January 05, 2018 1:27 PM

Change of Terms

I received a Change of Terms message yesterday from one of my mutual fund companies, which included this unexpected note:

Direction to buy or sell Vanguard funds must be placed online or verbally and may no longer be submitted in writing.

I haven't mailed Vanguard or any other financial services company a paper form or a paper check in years, but still. When I was growing up, I never would have imagined that I would see the day when you could not mail a letter to a company in order to conduct financial business. Busy, busy, busy.

In the academic world, this is the time for another type of change of terms, as we prepare to launch our spring semester on Monday. The temperatures in my part of the country the last two weeks make the name of the semester a cruel joke, but the hope of spring lives.

For me, the transition is from my compiler course to my programming languages course. Compilers went as well this fall as it has gone in a long time; I really wish I had blogged about it more. I can only hope that Programming Languages goes as well. I've been reading about some ways I might improve the course pedagogically. That will require me to change some old habits, but trying to do so is part of the fun of teaching. I intend to blog about my experiences with the new ideas. As I said, the hope of spring lives.

In any case, I get to write Racket code all semester, so at least I have that going for me, which is nice.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

December 28, 2017 8:46 AM

You Have to Learn That It's All Beautiful

In this interview with Adam Grant, Walter Isaacson talks about some of the things he learned while writing biographies of Benjamin Franklin, Albert Einstein, Steve Jobs, and Leonardo da Vinci. A common theme is that all four were curious and interested in a wide range of topics. Toward the end of the interview, Isaacson says:

We of humanities backgrounds are always doing the lecture, like, "We need to put the 'A' in 'STEM', and you've got to learn the arts and the humanities." And you get big applause when you talk about the importance of that.

But we also have to meet halfway and learn the beauty of math. Because people tell me, "I can't believe somebody doesn't know the difference between Mozart and Haydn, or the difference between Lear and Macbeth." And I say, "Yeah, but do you know the difference between a resistor and a transistor? Do you know the difference between an integral and a differential equation?" They go, "Oh no, I don't do math, I don't do science." I say, "Yeah, but you know what, an integral equation is just as beautiful as a brush stroke on the Mona Lisa." You've got to learn that they're all beautiful.

Appreciating that beauty made Leonardo a better artist and Jobs a better technologist. I would like for the students who graduate from our CS program to know some literature, history, and art and appreciate their beauty. I'd also like for the students who graduate from our university with degrees in literature, history, art, and especially education to have some knowledge of calculus, the Turing machine, and recombinant DNA, and appreciate their beauty.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

December 21, 2017 2:42 PM

A Writer with a Fondness for Tech

I've not read either of Helen DeWitt's novels, but this interview from 2011 makes her sound like a technophile. When struggling to write, she finds inspiration in her tools:

What is to be done?

Well, there are all sorts of technical problems to address. So I go into Illustrator and spend hours grappling with the pen tool. Or I open up the statistical graphics package R and start setting up plots. Or (purists will be appalled) I start playing around with charts in Excel.

... suddenly I discover a brilliant graphic solution to a problem I've been grappling with for years! How to display poker hands graphically in a way that sets a series of strong hands next to the slightly better hands that win.

Other times she feels the need for a prop, à la Olivier:

I may have a vague idea about a character -- he is learning Japanese at an early age, say. But I don't know how to make this work formally, I don't know what to do with the narrative. I then buy some software that lets me input Japanese within my word-processing program. I start playing around, I come up with bits of Japanese. And suddenly I see that I can make visible the development of the character just by using a succession of kanji! I don't cut out text -- I have eliminated the need for 20 pages of text just by using this software.

Then she drops a hint about a work in progress, along with a familiar name:

Stolen Luck is a book about poker using Tuftean information design to give readers a feel for both the game and the mathematics.

DeWitt sounds like my kind of person. I wonder if I would like her novels. Maybe I'll try Lightning Rods first; it sounds like an easier read than The Last Samurai.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 24, 2017 12:30 PM

Thousand-Year Software

I recently read an old conversation between Neil Gaiman and Kazuo Ishiguro that started out as a discussion of genre but covered a lot of ground, including how stories mutate over time and how the time scale of stories is so much larger than that of human lives. Here are a few of the passages about stories and time:

NG   Stories are long-lived organisms. They're bigger and older than we are.

NG   You sit there reading Pepys, and just for a minute, you kind of get to be 350, 400 years older than you are.

KI   There's an interesting emotional tension that comes because of the mismatch of lifespans in your work, because an event that might be tragic for one of us may not be so for the long-lived being.

KI   I'm often asked what my attitude is to film, theatrical, radio adaptations of my novels. It's very nice to have my story go out there, and if it's in a different form, I want the thing to mutate slightly. I don't want it to be an exact translation of my novel. I want it to be slightly different, because in a very vain kind of way, as a storyteller, I want my story to become like public property, so that it gains the status where people feel they can actually change it around and use it to express different things.

This last comment by Ishiguro made me think of open-source software. It can be adapted by anyone for almost any use. When we fork a repo and adapt it, how often does it grow into something new and considerably different? I often tell my compiler students about the long, mutated life of P-code, which was related by Chris Clark in a 1999 SIGPLAN Notices article:

P-code is an example [compiler intermediate representation] that took on a life of its own. It was invented by Niklaus Wirth as the IL for the ETH Pascal compiler. Many variants of that compiler arose [Ne179], including the UCSD Pascal compiler that was used at Stanford to define an optimizer [Cho83]. Chow's compiler evolved into the MIPS compiler suite, which was the basis for one of the DEC C compilers -- acc. That compiler did not parse the same language nor use any code from the ETH compiler, but the IL survived.

That's not software really, but a language processed by several generations of software. What are other great examples of software and languages that mutated and evolved?

We have no history with 100-year-old software yet, of course, let alone 300- or 1000-year-old software. Will we ever? Software is connected to the technology of a given time in ways that stories are not. Maybe, though, an idea that is embodied in a piece of software today could mutate and live on in new software or new technology many decades from now? The internet is a system of hardware and software that is already evolving into new forms. Will the world wide web continue to have life in a mutated form many years hence?

The Gaiman/Ishiguro conversation turned out to be more than I expected when I first found it. Good stuff. Oh, and as I wrap up this post, this passage resonates with me:

NG   I know that when I create a story, I never know what's going to work. Sometimes I will do something that I think was just a bit of fun, and people will love it and it catches fire, and sometimes I will work very hard on something that I think people will love, and it just fades: it never quite finds its people.

Been there, done that, my friend. This pretty well describes my experience blogging and tweeting all these years, and even writing for my students. I am a less reliable predictor of what will connect with readers than my big ego would ever have guessed.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 05, 2017 9:42 AM

One Way I'm Like Maurice Sendak

In this conversation, the interviewer asked Maurice Sendak, then age eighty-three, how long he could work in one stretch. The illustrator said that two hours was a long stretch for him.

Because I'm older, I get tired and nap. I love napping. Working and napping and reading and seeing my students. They're awfully nice. They're young and they're hopeful.

I'm not quite eighty-three, but I agree with every sentence in Sendak's answer. I could do worse than be as productive and as cantankerous for as long as he was.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

September 19, 2017 2:52 PM

Still Skeptical About Tweetstorms

The last couple of months have been the sparsest extended stretch on my blog since I began writing here in 2004. I have missed the feeling of writing, and I've wanted to write, but I guess never wanted it enough to set aside time to do the work. (There may be a deeper reason, the idea of which merits more thinking.) It's also a testament to the power of habit in my life: when I'm in the habit of writing, I write; when I fall out of the habit, I don't. During my unintended break from blogging, I've remained as active as usual on Twitter. But I haven't done much long-form writing other than lecture notes for my compiler class.

And that includes writing tweetstorms.

I'm one of those people who occasionally snarks on Twitter about tweetstorms. They always seem like a poor substitute for a blog entry or an essay. While I've probably written my last snarky tweet about tweetstorms, I remain skeptical of the form.

That said, my curiosity was aroused when Brian Marick, a writer and programmer whose work I always enjoy, tweeted yesterday:

[Note re: "write a blog post". I think the tweetstorm is different lit'ry form, and I like exploring it.]

I would love for Brian or anyone else to be able to demonstrate the value in a tweetstorm that is distinct from equivalent writing in other forms. I've read many tweetstorms that I've enjoyed, including the epic Eric Garland disquisition considered by many to be the archetype of the genre. But in the end, every tweetstorm looks like either a bullet-point presentation that could be delivered in PowerPoint, or something that could stand on its own as an essay, if only the sentences were, you know, assembled into paragraphs.

I am sympathetic to the idea that there may be a new literary form lurking here. Like any constraint, the 140-character limit on tweets causes writers to be creative in a new way. Chaining a sequence of similarly constrained statements together as a meaningful whole requires a certain skill, and writers who master the style can pull me through to the end, almost despite myself. But I would read through to the end of a blog entry written as skillfully, and I wouldn't have to do the assembly of the work in my head as I go.

Perhaps the value lies in Twitter as an interaction mechanism. Twitter makes it easy to respond to and discuss the elements of a tweetstorm at the level of the individual tweet. That's handy, but it can also be distracting. Not every Twitter platform manages the threading as well as it could. It's also not a new feature of the web; any blogging platform can provide paragraph-level linking as a primitive, and discussion forums are built on modular commentary and linking. Maybe tweetstorms are popular precisely because Twitter is a popular medium of the day. They are the path of least resistance.

That leads to what may be the real reason that people explore the form: Twitter lowers the barrier of entry into blogging to almost nothing: install an app, or point a web browser to your homepage, and you have a blogging platform. But that doesn't make the tweetstorm a new literary form of any particular merit. It's simply a chunking mechanism enforced by the nature of a limited interface. Is there anything more to it than that?

I'm an open-minded person, so when I say I'm skeptical about something, I really am open to changing my mind. When someone I respect says that there may be something to the idea, I know I should pay attention. I'll follow Brian's experiment and otherwise keep my mind open. I'm not expecting to undergo a conversion, but I'm genuinely curious about the possibilities.


Posted by Eugene Wallingford | Permalink | Categories: General

July 27, 2017 1:36 PM

How can we help students overcome "naturalness bias"?

In Leadership as a Performing Art, Ed Batista discusses, among other things, a "naturalness bias" that humans have when evaluating one another. Naturalness is "a preference for abilities and talents that we perceive as innate over those that appear to derive from effort and experience". Even when people express a preference for hard work and experience, they tend to judge more positively people who seem to be operating on natural skill and talent. As Batista notes, this bias affects not only how we evaluate others but also how we evaluate ourselves.

As I read this article, I could not help but think about how students who are new to programming and to computer science often react to their own struggles in an introductory CS course. These thoughts reached a crescendo when I came to these words:

One commonly-held perspective is that our authentic self is something that exists fully formed within us, and we discover its nature through experiences that feel more (or less) natural to us. We equate authenticity with comfort, and so if something makes us feel uncomfortable or self-conscious, then it is de facto inauthentic, which means we need not persist at it (or are relieved of our responsibility to try). But an alternative view is that our authentic self is something that we create over time, and we play an active role in its development through experiences that may feel uncomfortable or unnatural, particularly at first. As INSEAD professor of organizational behavior Herminia Ibarra wrote in The Authenticity Paradox in 2015,

Because going against our natural inclinations can make us feel like impostors, we tend to latch on to authenticity as an excuse for sticking with what's comfortable... By viewing ourselves as works-in-progress and evolving our professional identities through trial and error, we can develop a personal style that feels right to us and suits our organizations' changing needs. That takes courage, because learning, by definition, starts with unnatural and often superficial behaviors that can make us feel calculating instead of genuine and spontaneous. But the only way to avoid being pigeonholed and ultimately become better leaders is to do the things that a rigidly authentic sense of self would keep us from doing.

So many CS students and even computing professionals report suffering from impostor syndrome, sometimes precisely because they compare their internal struggles to learn with what appears to be the natural ability of their colleagues. But, as Ibarra says, learning, by definition, starts with the unnatural. To be uncomfortable is, in one sense, to be in a position to learn.

How might we teachers of computer science help our students overcome the naturalness bias they unwittingly apply when evaluating their own work and progress? We need strategies to help students see that CS is something we do, not something we are. You can feel uncomfortable and still be authentic.

This distinction is at the foundation of Batista's advice to leaders and, I think, at the foundation of good advice to students. When students can distinguish between their behavior and their identity, they are able to manage more effectively the expectations they have of their own work.

I hope to put what I learned in this article to good use both for my students and myself. It might help me be more honest -- and generous -- to myself when evaluating my performance as a teacher and an administrator, and more deliberate in how I try to get better.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading, Teaching and Learning

June 23, 2017 2:04 PM

No Summer Job? Learn How to Program

The article Why Aren't American Teenagers Working Anymore? comments on a general trend I have observed locally over the last few years: most high school students don't have summer jobs any more. At first, you might think that rising college tuition would provide an incentive to work, but the effect is almost the opposite:

"Teen earnings are low and pay little toward the costs of college," the BLS noted this year. The federal minimum wage is $7.25 an hour. Elite private universities charge tuition of more than $50,000.

Even in-state tuition at a public university has grown large enough to put it out of reach of the typical summer job. Eventually, there is almost no point in working a low-paying job; you'll have to borrow a significant amount anyway.

These days, students have another alternative that might pay off better in the long run anyway. With a little gumption and free resources available on the web, many students can learn to program, build websites, and make mobile apps. Time spent not working a job but developing skills that are in high demand and which pay well might be time well spent.

Even as a computer scientist, though, I'm traditional enough to be a little uneasy with this idea. Don't young people benefit from summer jobs in ways other than a paycheck? The authors of this article offer the conventional thinking:

A summer job can help teenagers grow up as it expands their experience beyond school and home. Working teens learn how to manage money, deal with bosses, and get along with co-workers of all ages.

You know what, though... A student working on an open-source project can also learn how to deal with people in positions of relative authority and how to get along with collaborators of all ages. They might even get to interact with people from other cultures and make a lasting contribution to something important.

Maybe instead of worrying about teenagers getting summer jobs we should introduce them to programming and open-source software.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

June 15, 2017 2:14 PM

The Melancholy Department Head

In The Melancholy Dean, Matt Reed notes that, while most management books speak at the level of the CEO or a founder, most managers work further down the chain of command.

Most managers are closer to deans than to presidents. They're in the middle. ... it's not unusual that they find themselves tasked with carrying out policies with which they personally disagree. When success in a position relies largely on "soft power", having to carry out positions with which you personally disagree can be a real strain.

Obviously, if the disagreements become too large or frequent, the right move is to step out of the role. But that's the exception. More commonly, there's a vague sense of "I wouldn't have done it that way" that falls well short of a crisis of conscience, but can be enough to sap motivation. That's especially true when budgets are tightening and adverse decisions are made for you.

I have seen this happen to deans, but I also know the feeling myself. Here, department heads are administrators, and formally they depend upon the dean and provost for their positions. As public universities have to face falling state appropriations, increasing regulatory requirements, and increased competition for students, they often find themselves operating with more of a corporate mentality than the hallowed halls of academia we all dream of from yesteryear. Even with good and open leaders making decisions in upper administration (which I have been fortunate to have in my time as an administrator), more agency lives outside the department, more of the department head's time is spent carrying out activities defined elsewhere, and fewer strategic decisions are made by the head and faculty within the department.

It does wear on a person. These days, academic middle managers of all sorts have to cultivate the motivation they need to carry on. The good news is, through it all, we are helping students, and helping faculty help students. Knowing that, and doing at least a little programming every day, helps me relieve whatever strain I might feel. Even so, I could use more closure most days of the week.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

June 10, 2017 10:28 AM

98% of the Web in One Sentence

Via Pinboard's creator, the always entertaining Maciej Cegłowski:

Pinboard is not much more than a thin wrapper around some carefully tuned database queries.

You are ready to make your millions. Now all you need is an idea.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

June 06, 2017 2:39 PM

Using Programs and Data Analysis to Improve Writing, World Bank Edition

Last week I read a tweet that linked to an article by Paul Romer. He is an economist currently working at the World Bank, on leave from his chair at NYU. Romer writes well, so I found myself digging deeper and reading a couple of his blog articles. One of them, Writing, struck a chord with me both as a writer and as a computer scientist.

Consider:

The quality of written prose should be higher in documents that will have many readers.

This is true of code, too. If a piece of code will be read many times, whether by one person or several, then each minute spent making it shorter and clearer improves reading comprehension every single time. That's even more important in code than in text, because so often we read code in order to change it. We need to understand it at even deeper level to ensure that our changes have the intended effect. Time spent making code better repays itself many times over.

Romer caused a bit of a ruckus when he arrived at the World Bank by insisting, to some of his colleagues' displeasure, that everyone in his division write clearer, more concise reports. His goal was admirable: He wanted more people to be able to read and understand these reports, because they deal with policies that matter to the public.

He also wanted people to trust what the World Bank was saying by being able more readily to see that a claim was true or false. His article looks at two different examples that make a claim about the relationship between education spending and GDP per capita. He concludes his analysis of the examples with:

In short, no one can say that the author of the second claim wrote something that is false because no one knows what the second claim means.

In science, writing clearly builds trust. This trust is essential for communicating results to the public, of course, because members of the public do not generally possess the scientific knowledge they need to assess the truth of a claim directly. But it is also essential for communicating results to other scientists, who must understand the claims at a deeper level in order to support, falsify, and extend them.

In the second half of the article, Romer links to a study of the language used in the World Bank's yearly reports. It looks at patterns such as the frequency of the word "and" in the reports and the ratio of nouns to verbs. (See this Financial Times article for a fun little counterargument on the use of "and".)
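This kind of lexical analysis really is only a few lines of code away. Here is a minimal sketch in Python of counting the relative frequency of "and" in a passage; the function name and the sample sentence are mine, for illustration, not taken from the study itself:

```python
import re
from collections import Counter

def word_stats(text):
    """Compute a simple lexical statistic of the kind the study describes:
    the relative frequency of 'and', a crude proxy for list-like,
    conjunction-heavy prose."""
    # Tokenize on runs of letters/apostrophes, case-insensitively.
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = len(words)
    return {
        "total_words": total,
        "and_frequency": counts["and"] / total if total else 0.0,
    }

sample = ("Growth and stability and inclusion and resilience are goals, "
          "and the Bank supports them.")
stats = word_stats(sample)
print(f"'and' makes up {stats['and_frequency']:.1%} of the words")
```

Swap in the text of an actual report and the same few lines give you the statistic the study tracks; measuring the noun-to-verb ratio would require a part-of-speech tagger, but open-source tools exist for that, too.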

Romer wants this sort of analysis to be easier to do, so that it can be used more easily to check and improve the World Bank's reports. After looking at some other patterns of possible interest, he closes with this:

To experiment with something like this, researchers in the Bank should be able to spin up a server in the cloud, download some open-source software and start experimenting, all within minutes.

Wonderful: a call for storing data in easy-to-access forms and a call for using (and writing) programs to analyze text, all in the name not of advancing economics technically but of improving its ability to communicate its results. Computing becomes a tool integrated into the process of the World Bank doing its designated job. We need more leaders in more disciplines thinking this way. Fortunately, we hear reports of such folks more often these days.

Alas, data and programs were not used in this way when Romer arrived at the World Bank:

When I arrived, this was not possible because people in ITS did not trust people from DEC and, reading between the lines, were tired of dismissive arrogance that people from DEC displayed.

One way to create more trust is to communicate better. Not being dismissively arrogant is, too, though calling that sort of behavior out may be what got Romer in so much hot water with the administrators and economists at the World Bank in the first place.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Software Development

May 31, 2017 2:28 PM

Porting Programs, Refactoring, and Language Translation

In his commonplace book A Certain World, W.H. Auden quotes C.S. Lewis on the controversial nature of translation:

[T]ranslation, by its very nature, is a continuous implicit commentary. It can become less tendentious only by becoming less of a translation.

Lewis was merely acknowledging a truth about language: Translators must have a point of view, and often that point of view will be controversial.

I once saw Kurt Vonnegut speak with a foreign language class here many years ago. One of the students asked him what he thought about the quality of the translations done for his books. Vonnegut laughed and said that his books were so peculiar and so steeped in Americana that translating one was akin to writing a new book. He said that his translators deserved all the royalties from the books they created by translating him. They had to write brand new works.

These memories came to mind again recently while I was reading Tyler Cowen's conversation with Jhumpa Lahiri, especially when Lahiri said this:

At one point I was talking about this idea, in antiquity: in Latin, the word for "translator" is "interpreter". I teach translation now, and I talk a lot to my students about translation being the most intimate form of reading and how there was the time when translating and interpreting and analyzing were all one thing.

As my mind usually does, it began to think about computer programs.

Like many programmers, I often find myself porting a program from one language to another. This is clearly translation but, as Vonnegut and Lahiri tell us, it is also a form of interpretation. To port a piece of code, I have to understand its meaning and express that meaning in a new language. That language has its own constructs, idioms, patterns, and set of community practices and expectations. To port a program, one must have a point of view, so the process can be, to use Lewis's word, tendentious.

I often refactor code, too, both my own programs and programs written by others. This, too, is a form of translation, even though it leaves the new code written in the same language as the original. Refactoring is necessarily an opinionated act, and thus tendentious.

Occasionally, I refactor a program in order to learn what it does and how it does it. In those cases, I'm not judging the original code as anything but ill-suited to my current state of knowledge. Even so, when I get done, I usually like my version better, if only a little bit. It expresses what I learned in the process of rewriting the code.

It has always been hard for me to port a program without refactoring it, and now I understand why. Both activities are a kind of translation, and translation is by its nature an activity that requires a point of view.

This fall, I will again teach our "Translation of Programming Languages" course. Writing a compiler requires one to become intimate not only with specific programs, the behavior of which the compiler must preserve, but also the language itself. At the end of the project, my students know the grammar, syntax, and semantics of our source language in a close, almost personal way. The target language, too. I don't mind if my students develop a strong point of view, even a controversial one, along the way. (I'm actually disappointed if the stronger students do not!) That's a part of writing new software, too.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns, Software Development, Teaching and Learning

May 21, 2017 10:07 AM

Computer Programs Have Much to Learn, and Much to Teach Us

In his recent interview with Tyler Cowen, Garry Kasparov talks about AI, chess, politics, and the future of creativity. In one of the more intriguing passages, he explains that building databases for chess endgames has demonstrated how little we understand about the game and offers insight into how we know that chess-playing computer programs -- now so far beyond humans that even the world champion can only score occasionally against commodity programs -- still have a long way to improve.

He gives as an example a particular position with a king, two rooks, and a knight on one side versus a king and two rooks on the other. Through the retrograde analysis used to construct endgame databases, we know that, with ideal play by both sides, the stronger side can force checkmate in 490 moves. Yes, 490. Kasparov says:

Now, I can tell you that -- even being a very decent player -- for the first 400 moves, I could hardly understand why these pieces moved around like a dance. It's endless dance around the board. You don't see any pattern, trust me. No pattern, because they move from one side to another.

At certain points I saw, "Oh, but white's position has deteriorated. It was better 50 moves before." The question is -- and this is a big question -- if there are certain positions in these endgames, like seven-piece endgames, that take, by the best play of both sides, 500 moves to win the game, what does it tell us about the quality of the game that we play, which is an average 50 moves? [...]

Maybe with machines, we can actually move our knowledge much further, and we can understand how to play decent games at much greater lengths.

But there's more. Do chess-playing computer programs, so much superior to even the best human players, understand these endgames either? I don't mean "understand" in the human sense, but only in the sense of being able to play games of that quality. Kasparov moves on to his analysis of games between the best programs:

I think you can confirm my observations that there's something strange in these games. First of all, they are longer, of course. They are much longer because machines don't make the same mistakes [we do] so they could play 70, 80 moves, 100 moves. [That is] way, way below what we expect from perfect chess.

That tells us that [the] machines are not perfect. Most of those games are decided by one of the machines suddenly. Can I call it losing patience? Because you're in a position that is roughly even. [...] The pieces are all over, and then suddenly one machine makes a, you may call, human mistake. Suddenly it loses patience, and it tries to break up without a good reason behind it.

That also tells us [...] that machines also have, you may call it, psychology, the pattern and the decision-making. If you understand this pattern, we can make certain predictions.

Kasparov is heartened by this, and it's part of the reason that he is not as pessimistic about the near-term prospects of AI as some well-known scientists and engineers are. Even with so-called deep learning, our programs are only beginning to scratch the surface of complexity in the universe. There is no particular reason to think that the opaque systems evolved to drive our cars and fly our drones will be any more perfect in their domains than our game-playing programs, and we have strong evidence from the domain of games that programs are still far from perfect.

On a more optimistic note, advances in AI give us an opportunity to use programs to help us understand the world better and to improve our own judgment. Kasparov sees this in chess, in the big gaps between the best human play, the best computer play, and perfect play in even relatively simple positions; I wrote wistfully about this last year, prompted by AlphaGo's breakthrough. But the opportunity is much more valuable when we move beyond playing games, as Cowen alluded in an aside during Kasparov's explanation: Imagine how bad our politics will look in comparison to computer programs that do it well! We have much to learn.

As always, this episode of Conversations with Tyler was interesting and evocative throughout. If you are a chess player, there is a special bonus. The transcript includes a pointer to Kasparov's Immortal Game against Veselin Topalov at Wijk aan Zee in 1999, along with a discussion of some of Kasparov's thoughts on the game beginning with the pivotal move 24. Rxd4. This game, an object of uncommon beauty, will stand as an eternal reminder why, even in the face of advancing AI, it will always matter that people play and compete and create.

~~~~

If you enjoyed this entry, you might also like Old Dreams Live On. It looks more foresightful now that AlphaGo has arrived.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

April 15, 2017 10:39 AM

Science Seeks Regularity

A week or so ago I tweeted that Carver Mead was blowing my mind: an electron a mile long! I read about that idea in this Spectator interview that covers both Mead's personal life and his professional work in engineering. Well worth a read.

Mead is not satisfied with the current state of physics and biology, or at least with the incomplete theories that we seem to have accepted in lieu of a more coherent conceptual understanding of how the world works. Ultimately, he sides with Einstein in his belief that there is a more coherent explanation:

I think Einstein was being a scientist in the truest sense in his response to the Copenhagen interpretation. He said that none of us would be scientists if deep down we didn't believe there is a set of regularities in the operation of physical law. That is a matter of faith. It is not something anybody has proven, but none of us would be scientists if we didn't have that faith.

Like Einstein, Mead believes that unpredictability at the lowest levels of a system does not imply intrinsic uncertainty. We need a different view that brings regularities to the forefront of our theories.

I also like this line from near the end of the interview:

People don't even know where to put the decimal point.

Mead says this as part of his assurance that artificial intelligence is nowhere near the level of what even the fruit fly can do, let alone the human brain. A lot has happened in AI during the fifteen years since this interview; a computer program even beats our best players in Go now. Still, there is so much that we don't understand and cannot replicate.

I wonder if Mead's "decimal point" aphorism also might apply, metaphorically, to his view of the areas of science in which we have settled for, or are currently stuck with, unsatisfying theories. Our mathematical models cover a lot of ground, decimal point-wise, but there is still a simpler, more coherent picture to see. Maybe, though, that is the engineer in Mead showing through.


Posted by Eugene Wallingford | Permalink | Categories: General

April 02, 2017 12:02 PM

Reading an Interview with John McPhee Again, for the First Time

"This weekend I enjoyed Peter Hessler's interview of McPhee in The Paris Review, John McPhee, The Art of Nonfiction No. 3."

That's a direct quote from this blog. Don't remember it? I don't blame you; neither do I. I do remember blogging about McPhee back when, but as I read the same Paris Review piece again last Sunday and this, I had no recollection of reading it before, no sense of déjà vu at all.

Sometimes having a memory like mine is a blessing: I occasionally get to read something for the first time again. If you read my blog, then you get to read my first impressions for a second time.

I like this story that McPhee told about Bob Bingham, his editor at The New Yorker:

Bingham had been a writer-reporter at The Reporter magazine. So he comes to work at The New Yorker, to be a fact editor. Within the first two years there, he goes out to lunch with his old high-school friend Gore Vidal. And Gore says, What are you doing as an editor, Bobby? What happened to Bob Bingham the writer? And Bingham says, Well, I decided that I would rather be a first-rate editor than a second-rate writer. And Gore Vidal draws himself up and says, And what is wrong with a second-rate writer?

I can just hear the faux indignation in Vidal's voice.

McPhee talked a bit about his struggle over several years to write a series of books on geology, which had grown out of an idea for a one-shot "Talk of the Town" entry. The interviewer asked him if he ever thought about abandoning the topic and moving on to something he might enjoy more. McPhee said:

The funny thing is that you get to a certain point and you can't quit. Because I always worried: if you quit, you'll quit again. The only way out was to go forward, to learn your way and write your way out of it.

I know that feeling. Sometimes, I really do need to quit something and move on, but I always wonder whether quitting this time will make it easier to do next time. Because sometimes, I need to stick it out and, as McPhee says, learn my way out of the difficulty. I have no easy answers for knowing when quitting is the right thing to do.

Toward the end of the interview, the conversation turned to the course McPhee teaches at Princeton, once called "the literature of fact". The university first asked him to teach on short notice, over the Christmas break in 1974, and he accepted immediately. Not everyone thought it was a good idea:

One of my dear friends, an English teacher at Deerfield, told me: Do not do this. He said, Teachers are a dime a dozen -- writers aren't. But my guess is that I've been more productive as a writer since I started teaching than I would have been if I hadn't taught. In the overall crop rotation, it's a complementary job: I'm looking at other people's writing, and the pressure's not on me to do it myself. But then I go back quite fresh.

I know a lot of academics who feel this way. Then again, it's a lot easier to stay fresh in one's creative work if one has McPhee's teaching schedule, rather than a full load of courses:

My schedule is that I teach six months out of thirty-six, and good Lord, that leaves a lot of time for writing, right?

Indeed it does. Indeed it does.

On this reading of the interview, I marked only two passages that I wrote about last time. One came soon after the above response, on how interacting with students is its own reward. The other was a great line about the difference between mastering technique and having something to say: You demonstrated you know how to saddle a horse. Now go find the horse.

That said, I unconsciously channeled this line from McPhee just yesterday:

Writing teaches writing.

We had a recruitment event on campus, and I was meeting with a dozen or so prospective students and their entourages. We were talking about our curriculum, and I said a few words about our senior project courses. Students generally like these courses, even though they find them difficult. The students have never had to write a big program over the course of several months, and it's harder than it looks. The people who hire our graduates like these courses, too, because they know that these courses are places where students really begin to learn to program.

In the course of my remarks, I said something to the effect, "You can learn a lot about programming in classes where you study languages and techniques and theory, but ultimately you learn to write software by writing software. That's what the project courses are all about." There were a couple of experienced programmers in the audience, and they were all nodding their heads. They know McPhee is right.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

March 10, 2017 2:51 PM

Reading Is A Profoundly Creative Act

This comes from Laura Miller, a book reviewer and essayist for Slate, in a Poets & Writers interview:

I also believe that reading is a profoundly creative act, that every act of reading is a collaboration between author and reader. I don't understand why more people aren't interested in this alchemy. It's such an act of grace to give someone else ten or fifteen hours out of your own irreplaceable life, and allow their voice, thoughts, and imaginings into your head.

I think this is true of all reading, whether fiction or nonfiction, literary or technical. I often hear CS profs tell their students to read "actively" by trying code out in an interpreter, asking continually what the author means, and otherwise engaging with the material. Students who do so have a chance to experience what Miller describes: turning over a few hours of their irreplaceable lives to someone who understands a topic well, allowing their voice, thoughts, and imaginings into their heads, and coming out on the other end of the experience with new thoughts -- and maybe even a new mind.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

February 12, 2017 11:11 AM

Howard Marks on Investing -- and Software Development

Howard Marks is an investor and co-founder of Oaktree Capital Management. He has a big following in the financial community for his views on markets and investing, which often stray from orthodoxy, and for his straightforward writing and speaking style. He's a lot like Warren Buffett, with less public notoriety.

This week I read Marks's latest memo [ PDF ] to Oaktree's investors, which focuses on expert opinion and forecasting. This memo made me think a lot about software development. Whenever Marks talked about experts predicting how the market would change and how investors should act, I thought of programming. His comments sound like the wisdom of an agile software developer.

Consider what he learned from the events of 2016:

  1. First, no one really knows what events are going to transpire.
  2. And second, no one knows what the market's reaction to those events will be.

Investors who got out of the market for the last couple of months of 2016, based on predictions about what would happen, missed a great run-up in value.

If a programmer cannot predict what will happen in the future, or how stakeholders will respond to these changes, then planning in too much detail is at best an inefficient use of time and energy. At worst it is a way to lock yourself into code that you really need to change but can't.

Or consider these thoughts on surprises (the emphasis in the original):

It's the surprises no one can anticipate that would move markets most if they were to happen. But (a) most people can't imagine them and (b) most of the time they don't happen. That's why they're called surprises.

To Marks, this means that investors should not try to get cute, predict the future, and outsmart the market. The best they can do is solid technical analysis of individual companies and invest based on observable facts about value and value creation.

To me, this means that we programmers shouldn't try to prepare for surprises by designing them into our software. Usually, the best we can do is to implement simple, clean code that does just what it does and no more. The only prediction we can make about the future is that we may well have to change our code. Creating clean interfaces and hiding implementation choices enable us to write code that is as straightforward as possible to change when the unimaginable happens, or even the imaginable.

Marks closes this memo with five quotes about forecasting from a collection he has been building for forty years. I like this line from former GE executive Ian Wilson, which expresses the conundrum that every designer faces:

No amount of sophistication is going to allay the fact that all of your knowledge is about the past and all your decisions are about the future.

It isn't really all that strange that the wisdom of an investor like Marks might be of great value to a programmer. Investors and programmers both have to choose today how to use a finite resource in a way that maximizes value now and in the future. Both have to make these choices based on knowledge gleaned from the past. Both are generally most successful when the future looks like the past.

A big challenge for investors and programmers alike is to find ways to use their experience of the past in a way that maximizes value across a number of possible futures, both the ones we can anticipate and the ones we can't.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

February 10, 2017 3:55 PM

Follow-Up on Learning by Doing and Ubiquitous Information

A few quick notes on my previous post about the effect of ubiquitous information on knowing and doing.

~~~~

The post reminded a reader of something that Guy Steele said at DanFest, a 2004 festschrift in honor of Daniel Friedman's 60th birthday. As part of his keynote address, Steele read from an email message he wrote in 1978:

Sussman did me a very big favor yesterday -- he let me flounder around trying to build a certain LISP interpreter, and when I had made and fixed a critical bug he then told me that he had made (and fixed) the same mistake in Conniver. I learned a lot from that bug.

Isn't that marvelous? "I learned a lot from that bug."

Thanks to this reader for pointing me to a video of Steele's DanFest talk. You can watch this specific passage at the 12:08 mark, but really: You now have a link to an hour-long talk by Guy Steele that is titled "Dan Friedman--Cool Ideas". Watch the entire thing!

~~~~

If all you care about is doing -- getting something done -- then ubiquitous information is an amazing asset. I use Google and StackOverflow answers quite a bit myself, mostly to navigate the edges of languages that I don't use all the time. Without these resources, I would be less productive.

~~~~

Long-time readers may have read the story about how I almost named this blog something else. ("The Euphio Question" still sets my heart aflutter.) Ultimately I chose a title that emphasized the two sides of what I do as both a programmer and a teacher. The intersection of knowing and doing is where learning takes place. Separating knowing from doing creates problems.

In a post late last year, I riffed on some ideas I had as I read Learn by Painting, a New Yorker article about an experiment in university education in which everyone made art as a part of their studies.

That article included a line that expressed an interesting take on my blog's title: "Knowing and doing are two sides of the same activity, which is adapting to our environment."

That's a cool thought, but a rather pedestrian sentence. The article includes another, more poetic line that fits in nicely with the theme of the last couple of days:

Knowing is better than not knowing, but knowing without doing is as good as not knowing.

If I ever adopt a new tagline for my blog, it may well be this sentence. It is not strictly true, at least in a universal sense, but it's solid advice nonetheless.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

January 28, 2017 8:10 AM

Curiosity on the Chessboard

I found a great story from Lubomir Kavalek in his recent column, Chess Champions and Their Queens. Many years ago, Kavalek was talking with Berry Withuis, a Dutch journalist, about Rashid Nezhmetdinov, who had played two brilliant queen sacrifices in the span of five years. The conversation reminded Withuis of a question he once asked of grandmaster David Bronstein:

"Is the Queen stronger than two light pieces?"

(The bishop and knight are minor, or "light", pieces.)

The former challenger for the world title took the question seriously. "I don't know," he said. "But I will tell you later."

That evening Bronstein played a simultaneous exhibition in Amsterdam and whenever he could, he sacrificed his Queen for two minor pieces. "Now I know," he told Withuis afterwards. "The Queen is stronger."

How is that for an empirical mind? Most chessplayers would have immediately answered "yes" to Withuis's question. But Bronstein -- one of the greatest players never to be world champion and author of perhaps the best book of tournament analysis in history -- didn't know for sure. So he ran an experiment!

We should all be so curious. And humble.

I wondered for a while if Bronstein could have improved his experiment by channeling Kent Beck's Three Bears pattern. (I'm a big fan of this learning technique and mention it occasionally here, most recently last summer.) This would require him to play many games from the other side of the sacrifice as well, with a queen against his opponents' two minor pieces. Then I realized that he would have a hard time convincing any of his opponents to sacrifice their queens so readily! This may be the sort of experiment that you can only conduct from one side, though in the era of chess computers we could perhaps find, or configure, willing collaborators.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns, Teaching and Learning

January 26, 2017 3:37 PM

Another "I'm Successful Because I Was Lucky" Admission

This one from essayist and cartoonist Tim Kreider, in an AdviceToWriters interview:

What's your advice to new writers?
I first have to say that whatever moderate success I may have achieved has been so much a result of dumb luck that I feel fraudulent presuming to offer any advice to young writers, as if I did any of this on purpose or according to plan.

I appreciate humble advice.

The advice Kreider goes on to give aspiring writers is mostly obvious, as he says up front that it will be, but he also shares a quote from Thoreau that I like:

Read the best books first, or you may not have a chance to read them at all.

As someone who most days runs out of time to read as much computer science as I want, I value this reminder.


Posted by Eugene Wallingford | Permalink | Categories: General

January 06, 2017 4:29 PM

Moving a Flatter Email Archive

I'm not a New Year's resolution person, but I did make a change recently that moved me out of my comfort zone. Here's a quick version of the story.

I'm a hierarchical guy, like a lot of computer scientists, I imagine. That helps me manage a lot of complexity, but sometimes it also consumes more personal time than I'd like.

I'm also a POP mail guy. For many years, Eudora was my client of choice. A while back, I switched to Mail.app on OS X. In both, I had an elaborate filing system in which research mail was kept in a separate folder from teaching mail, which was kept in a separate folder from personal mail, which was kept in a separate folder from .... There were a dozen or so top-level folders, each having sub-folders.

Soon after I became department head a decade or so ago, I began to experience the downsides of this approach as much as the upsides. Some messages wanted to live in two folders, but I had to choose one. Even when the choice was easy, I found myself spending too many minutes each week filing away messages I would likely never think of again.

For years now, my browser- and cloud-loving friends have been extolling to me the value of leaving all my mail on the server, hitting 'archive' when I wanted to move a message out of my inbox, and then using the mail client's search feature to find messages when I need them later. I'm not likely to become a cloud email person any time soon, but the cost in time and mental energy of filing messages hierarchically finally became annoying enough that I decided to move into the search era.

January 1 was the day.

But I wasn't ready to go all the way. (Change is hard!) I'd still like to have a gross separation of personal mail from professional mail, and gross separation among email related to teaching, research, professional work, and university administration. If Mail.app had tags or labels, I might use them, but it doesn't. At this point, I have six targeted archive folders:

  • department business (I'm chair)
  • university business
  • TPLoP business (I'm a regional editor)
  • correspondence with my wife and daughters
  • other personal correspondence
  • personal finance and commerce
In addition, I have a folder for the course I am currently teaching, a folder for bulk mail and unavoidable mailing lists, and a folder for everything else. Everything else includes messages from mailing lists I choose to be on, such as the Racket users listserv and personal lists. None of these has subfolders.

I still have three other small hierarchies. The first is where I keep folders for other courses I have taught or plan to teach. I like the idea of keeping course questions and materials easy to find. The second is for hot topics I am working on as department head. For instance, we are currently doing a lot of work on outcomes assessment, and it's helpful to have all those messages in a separate bin. When a topic is no longer hot, I'll transfer its messages to the department archive. The third is a set of two or three small to-do boxes. Again, it's helpful to an organizer like me to have such messages in a separate bin so that I can find and respond to them quickly; eventually those messages will move to the appropriate flat archive.

Yes, there is still a lot going on here, but it's a big change for me. So far, so good. I've not felt any urges to create subfolders yet, and I've used search to find things when I've needed them. After I become habituated to this new way of living, perhaps I'll feel daring enough to go even flatter.

Let's not talk about folders in my file system, though. Hierarchy reigns supreme there, as it always has.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

December 28, 2016 1:01 PM

Unclear on the Concept, or How Ken Griffey, Jr., is like James Monroe

Today I read an article on faithless electors, those members of the Electoral College who over the years did not vote for the candidate they were pledged. One story from 1820 made me think of baseball's Hall of Fame!

William Plummer, Sr. was pledged to vote for Democratic-Republican candidate James Monroe. Instead, he cast his vote for John Quincy Adams, also of the Democratic-Republican Party, although Adams was not a candidate in the 1820 election.

Supposedly, Plummer did not feel that the Electoral College should unanimously elect any president other than George Washington.

There are many Hall of Fame voters each year who practice Mr. Plummer's ostentatious electoral purity. They don't vote for any player on his first ballot, preserving "the legacy of their predecessors", none of whom -- neither Cobb nor Ruth, Mays nor Aaron -- were elected unanimously.

(Some leave a great player off the ballot for a more admirable reason: to use the vote to support a player they believe deserves entry but who does not receive many other votes. They hope to give such a player more time to attract a sufficient body of voters. I cut these voters a lot more slack than I cut the Plummers of the world.)

It was a silly idea in the case of President Monroe, whose unanimous election would have done nothing to diminish Washington's greatness or legacy, and it's a silly idea in the case of baseball greats like Ken Griffey, Junior.


Posted by Eugene Wallingford | Permalink | Categories: General

December 19, 2016 3:04 PM

Higher Education Has Become A Buyer's Market

... as last week's Friday Fragments reminds us.

Much of higher education is based on the premise of a seller's market. In a seller's market, the institution can decide the terms on which it will accept students. At the very elite, exclusive places, that's still largely true. Swarthmore turns away far more than it admits, and it does so on its own terms. But most of us aren't Swarthmore.

The effects of this change are numerous. It's hard to set prices, let alone correlate price and quality. University administrations are full of people confused by the shifting market. They are also full of people frantic at the thought of a drop in enrollment or retention. There are easy ways to keep these numbers up, of course, but most folks aren't willing to pay the associated price.

Interesting times, indeed.


Posted by Eugene Wallingford | Permalink | Categories: General

October 30, 2016 9:25 AM

Which Part of Speech Am I?

I saw a passage attributed to Søren Kierkegaard that I might translate as:

The life of humanity could very well be conceived as a speech in which different people represented the various parts of speech [...]. How many people are merely adjectives, interjections, conjunctions, adverbs; how few are nouns, verbs; how many are copula?

This is a natural thing to ponder around my birthday. It's not a bad thing to ask myself more often: Which part of speech will I be today?


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

October 22, 2016 2:00 PM

Competence and Creating Conditions that Minimize Mistakes

I enjoyed this interview with Atul Gawande by Ezra Klein. When talking about making mistakes, Gawande notes that humans have enough knowledge to cut way down on errors in many disciplines, but we do not always use that knowledge effectively. Mistakes come naturally from the environments in which we work:

We're all set up for failure under the conditions of complexity.

Mistakes are often more a matter of discipline and attention to detail than a matter of knowledge or understanding. Klein captures the essence of Gawande's lesson in one of his questions:

We have this idea that competence is not making mistakes and getting everything right. [But really...] Competence is knowing you will make mistakes and setting up a context that will help reduce the possibility of error but also help deal with the aftermath of error.

In my experience, this is a hard lesson for computer science students to grok. It's okay to make mistakes, but create conditions where you make as few as possible and in which you can recognize and deal with the mistakes as quickly as possible. High-discipline practices such as test-first and pair programming, version control, and automated builds make a lot more sense when you see them from this perspective.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Software Development, Teaching and Learning

October 15, 2016 10:47 AM

A View of Self-Driving Cars from 1956

A friend and fellow classic science fiction fan told me that one of his favorite books as a teenager was Robert Heinlein's The Door into Summer. I've read a lot of Heinlein but not this one, so I picked it up at the library.

Early in the book, protagonist Daniel B. Davis needed to make the most of the next twenty-four hours. He located his car, dropped some money into the "parking attendant", set course, and relaxed as the car headed out into traffic:

Or tried to relax. Los Angeles traffic was too fast and too slashingly murderous for me to be really happy under automatic control; I wanted to redesign their whole installation--it was not a really modern "fail safe". By the time we were west of Western Avenue and could go back on manual control I was edgy and wanted a drink.

This scene is set in December 1970; Heinlein wrote it in 1956. He may have missed by 45 years or more the date when self-driving cars would be common technology, but I think he got the feeling right. People like to be in control of their actions, especially when dropped into hectic conditions they can't control. Heinlein's character is an engineer, so naturally he thinks he could design a better system. None of my programmer friends are like this, of course.

It's also interesting to note that automatic control was required in the heaviest traffic. Once he got into a calmer setting, Davis could go back to driving himself. The system allows the human to drive only when he isn't a danger to other people, or even himself!

Today, it is commonplace to think that the biggest challenges of the move to self-driving cars are cultural, not technological: getting people to accept that the cars can drive themselves more safely than humans can drive them, and getting people to give up control. It's neat to see that Heinlein recognized this sixty years ago.


Posted by Eugene Wallingford | Permalink | Categories: General

October 06, 2016 2:46 PM

Computers Shouldn't Need a Restart Button (Memories of Minix)

An oldie but goodie from Andrew Tanenbaum:

Actually, MINIX 3 and my research generally is **NOT** about microkernels. It is about building highly reliable, self-healing, operating systems. I will consider the job finished when no manufacturer anywhere makes a PC with a reset button. TVs don't have reset buttons. Stereos don't have reset buttons. Cars don't have reset buttons. They are full of software but don't need them. Computers need reset buttons because their software crashes a lot. I know that computer software is different from car software, but users just want them both to work and don't want lectures why they should expect cars to work and computers not to work. I want to build an operating system whose mean time to failure is much longer than the lifetime of the computer so the average user never experiences a crash.

I remember loving MINIX 1 (it was just called Minix then, of course) when I first learned it in grad school. I did not have any Unix experience coming out of my undergrad and had only begun to feel comfortable with BSD Unix in my first few graduate courses. Then I was assigned to teach the Operating Systems course, working with one of the CS faculty. He taught me a lot, but so did Tanenbaum -- through Minix. That is one of the first times I came to really understand that the systems we use (the OS, the compiler, the DBMS) were just programs that I could tinker with, modify, and even write.

Operating systems is not my area, and I have no expertise for evaluating the whole microkernel versus monolith debate. But I applaud researchers like Tanenbaum who are trying to create general computer systems that don't need to be rebooted. I'm a user, too.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

October 02, 2016 10:03 AM

Tom Wolfe on Writer's Block

In the Paris Review's The Art of Fiction No. 123, Tom Wolfe tells how he learned about writer's block. Wolfe was working at Esquire magazine, and his first editor, Byron Dobell, had assigned him to write an article about car customizers. After doing all his research, he was totally blocked.

I now know what writer's block is. It's the fear you cannot do what you've announced to someone else you can do, or else the fear that it isn't worth doing. That's a rarer form. In this case I suddenly realized I'd never written a magazine article before and I just felt I couldn't do it. Well, Dobell somehow shamed me into writing down the notes that I had taken in my reporting on the car customizers so that some competent writer could convert them into a magazine piece. I sat down one night and started writing a memorandum to him as fast as I could, just to get the ordeal over with. It became very much like a letter that you would write to a friend in which you're not thinking about style, you're just pouring it all out, and I churned it out all night long, forty typewritten, triple-spaced pages. I turned it in in the morning to Byron at Esquire, and then I went home to sleep.

Later that day, Dobell called him to say that they were deleting the "Dear Byron" at the top of the memo and running the piece.

Most of us need more editing than that after we write anything, but... No matter; first you have to write something. Even if it's the product of a rushed all-nighter, just to get an obligation off our table.

When I write, and especially when I program, my reluctance to start usually grows out of a different sort of fear: the fear that I won't be able to stop, or want to. Even simple programming tasks can become deep holes into which we fall. I like that feeling, but I don't have enough control of my work schedule most days to be able to risk disappearing like that. What I could use is an extra dose of audacity or impetuosity. Or maybe a boss like Byron Dobell.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

September 28, 2016 3:16 PM

Language, and What It's Like To Be A Bat

My recent post about the two languages resonated in my mind with an article I finished reading the day I wrote the post: Two Heads, about the philosophers Paul and Pat Churchland. The Churchlands have been on a forty-year quest to change the language we use to describe our minds, from popular terms based in intuition and introspection to terms based in the language of neuroscience. Changing language is hard under any circumstances, and it is made harder when the science they need is still in its infancy. Besides, maybe more traditional philosophers are right and we need our traditional vocabulary to make sense of what it feels like to be human?

The New Yorker article closes with these paragraphs, which sound as if they were part of a proposal for a science fiction novel:

Sometimes Paul likes to imagine a world in which language has disappeared altogether. We know that the two hemispheres of the brain can function separately but communicate silently through the corpus callosum, he reasons. Presumably, it will be possible, someday, for two separate brains to be linked artificially in a similar way and to exchange thoughts infinitely faster and more clearly than they can now through the muddled, custom-clotted, serially processed medium of speech. He already talks about himself and Pat as two hemispheres of the same brain. Who knows, he thinks, maybe in his children's lifetime this sort of talk will not be just a metaphor.

If, someday, two brains could be joined, what would be the result? A two-selved mutant like Joe-Jim, really just a drastic version of Siamese twins, or something subtler, like one brain only more so, the pathways from one set of neurons to another fusing over time into complex and unprecedented arrangements? Would it work only with similar brains, already sympathetic, or, at least, both human? Or might a human someday be joined to an animal, blending together two forms of thinking as well as two heads? If so, a philosopher might after all come to know what it is like to be a bat, although, since bats can't speak, perhaps he would be able only to sense its batness without being able to describe it.

(Joe-Jim is a character from a science fiction novel, Robert Heinlein's Orphans of the Sky.)

What a fascinating bit of speculation! Can anyone really wonder why kids are drawn to science fiction?

Let me add my own speculation to the mix: If we do ever find a way to figure out what it's like to be a bat, people will find a way to describe what it's like to be a bat. They will create the language they need. Making language is what we do.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Patterns

September 26, 2016 2:54 PM

A Lesson from Tolkien about Commitment and Ignorance

Elrond has just addressed the Fellowship of the Ring, reminding them that only the Ring-Bearer is charged with completing the task ahead of them. The others "go with him as free companions", to assist in whatever ways they are able. He enters into an exchange with Gimli:

"The further you go, the less easy it will be to withdraw; yet no oath or bond is laid on to go further than you will. For you do not yet know the strength of your hearts, and you cannot foresee what each may meet upon the road."

"Faithless is he who says farewell when the road darkens," said Gimli.

"Maybe," said Elrond, "but let him not vow to walk in the dark who has not seen the nightfall."

"Yet sworn word may strengthen the quaking heart," said Gimli.

"Or break it," said Elrond. "Look not too far ahead!"

This is a tension we all live: the desire to make unconditional promises about the future to our lovers and compatriots despite not even knowing what is possible in that future. I love Elrond's response, "Let him not vow to walk in the dark who has not seen the nightfall." Members of the Fellowship found that their future contained evil beyond their comprehension and temptations beyond their imagination.

Our challenge is to constantly balance this tension: to live with the confidence of Gimli, but tempered by the pragmatic awareness of our ignorance that Elrond offers. Sometimes, commitment gives us the strength to continue on in the face of fear. Sometimes, though, there is no shame in turning back.


Posted by Eugene Wallingford | Permalink | Categories: General

September 25, 2016 9:40 AM

There Is Only One Culture, But Two Languages

W.H. Auden, in A Certain World, on the idea of The Two Cultures:

Of course, there is only one. Of course, the natural sciences are just as "humane" as letters. There are, however, two languages, the spoken verbal language of literature, and the written sign language of mathematics, which is the language of science. This puts the scientist at a great advantage, for, since like all of us he has learned to read and write, he can understand a poem or a novel, whereas there are very few men of letters who can understand a scientific paper once they come to the mathematical parts.

When I was a boy, we were taught the literary languages, like Latin and Greek, extremely well, but mathematics atrociously badly. Beginning with the multiplication table, we learned a series of operations by rote which, if remembered correctly, gave the "right" answer, but about any basic principles, like the concept of number, we were told nothing. Typical of the teaching methods then in vogue is the mnemonic which I had to learn.
Minus times Minus equals Plus:
The reason for this we need not discuss.

Sadly, we still teach young people that it's okay if math and science are too hard to master. They grow into adults who feel a chasm between "arts and letters" and "math and science". But as Auden notes rightly, there is no chasm; there is mostly just another language to learn and appreciate.

(It may be some consolation to Auden that we've reached a point where most scientists have to work to understand papers written by scientists in other disciplines. They are written in highly specialized languages.)

In my experience, it is more acceptable for a humanities person to say "I'm not a science person" or "I don't like math" than for a scientist to say something similar about literature, art, or music. The latter person is thought, silently, to be a Philistine; the former, an educated person with a specialty.

I've often wondered if this experience suffers from observation bias or association bias. It may well. I certainly know artists and writers who have mastered both languages and who remain intensely curious about questions that span the supposed chasm between their specialties and mine. I'm interested in those questions, too.

Even with this asymmetry, the presumed chasm between cultures creates low expectations for us scientists. Whenever my friends in the humanities find out that I've read all of Kafka's novels and short stories; that Rosencrantz and Guildenstern Are Dead is my favorite play, or that I even have a favorite play; that I really enjoyed the work of choreographer Merce Cunningham; that my office bookshelf includes the complete works of William Shakespeare and a volume of William Blake's poetry -- I love the romantics! -- most seem genuinely surprised. "You're a computer scientist, right?" (Yes, I like Asimov, Heinlein, Clarke, and Bradbury, too.)

Auden attributes his illiteracy in the language of mathematics and science to bad education. The good news is that we can reduce, if not eliminate, the language gap by teaching both languages well. This is a challenge for both parents and schools and will take time. Change is hard, especially when it involves the ways we talk about the world.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

September 22, 2016 3:03 PM

RSS > Email

Newsletters delivered by email seem all the rage these days. I subscribe to only two or three. Every once in a while, the bulk mailer used by these folks gets blacklisted by some spam filtering service used by our mail server, the mail server respects the blacklist, and I don't receive my newsletter. We've whitelisted the senders of two particular newsletters, but even so I occasionally don't receive the message.

 the standard icon for RSS subscription, via Wikipedia

This is one reason I still love RSS. My newsreader is in control of the exchange. Once authors post their articles and update their feeds, my newsreader can see them. I hit refresh, and the articles appear. RSS is not perfect; occasionally a blog updates its feed and I see a bunch of old articles in my reader. But I've been following some bloggers for well over a decade, and it has served us all well.

Do not expect me to hit you up for your email address anytime soon. I understand some of the reasons for going the newsletter route, but I think I'll keep publishing on my blog with a newsfeed for a while. That said, I love to hear from readers. Send me email any time, or tweet me at @wallingf.


Posted by Eugene Wallingford | Permalink | Categories: General

September 18, 2016 3:49 PM

Talking Shop

a photo of the Blueridge Orchard, visited by cyclists on the Cedar Valley Farm Ride

I agree with W.H. Auden:

Who on earth invented the silly convention that it is boring or impolite to talk shop? Nothing is more interesting to listen to, especially if the shop is not one's own.

My wife went on a forty-mile bike ride this morning, a fundraiser for the Cedar Valley Bicycle Collective, which visited three local farms. At those stops, I had the great fortune to listen to folks on all three farms talk shop. We learned about making ice cream and woodcarving at Three Pines Farm. We learned about selecting, growing, and picking apples -- and the damage hail and bugs can do -- at Blueridge Orchard. And the owner of the Fitkin Popcorn Farm talked about the popcorn business. He showed us the machines they use to sort the corn out of the field, first by size and then by density. He also talked about planting fields, harvesting the corn, and selling the product nationally. I even learned that we can pop the corn while it's still on the ears! (This will happen in my house very soon.)

I love to listen to people talk shop. In unguarded moments, they speak honestly about something they love and know deeply. They let us in on what it is like for them to work in their corner of the world. However big I try to make my world, there is so much more out there to learn.

The Auden passage is from his book A Certain World, a collage of poems, quotes, and short pieces from other writers with occasional comments of his own. Auden would have been an eclectic blogger! This book feels like a Tumblr blog, without all the pictures and 'likes'. Some of the passages are out of date, but they let us peek in on the mind of an accomplished poet. A little like good shop talk.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

September 16, 2016 12:41 PM

We Are All Mashups

Sheldon reminds Leonard that he never really had a personality, on The Big Bang Theory

There is a scene in The Big Bang Theory where Sheldon laments that, without realizing it, he had allowed his girlfriend to alter his personality. Leonard responds, "Well, you didn't really have a 'personality'. You just had some shows you liked."

This scene came to mind when I read a passage from Kenneth Goldsmith's Uncreative Writing earlier this week:

I don't think there's a stable or essential "me". I am an amalgamation of many things: books I've read, movies I've seen, television shows I've watched, conversations I've had, songs I've sung, lovers I've loved. In fact, I'm a creation of so many people and so many ideas, to the point where I feel I've actually had few original thoughts and ideas; to think that what I consider to be "mine" was "original" would be blindingly egotistical.

It is occasionally daunting when I realize how much I am a product of the works, people, and ideas I've encountered. How can I add anything new? But when I surrender to the fact that I can't, it frees me to write and do things that I like. What I make may not be new, but it can still be useful or valuable, even if only to me.

I wonder what it's like for kids to grow up in a self-consciously mash-up culture. My daughters have grown up in a world where technology and communication have given everyone the ability to mix and modify other work so easily. It's a big part of the entertainment they consume.

Mash-up culture must feel hugely empowering in some moments and hugely terrifying in others. How can anyone find his or her own voice, or say something that matters? Maybe they have a better sense than I did growing up that nothing is really new and that what really matters is chasing your interests, exploring the new lands you enter, and sharing what you find. That's certainly been the source of my biggest accomplishments and deepest satisfactions.

(I ran across the passage from Goldsmith on Austin Kleon's blog.)


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

September 10, 2016 10:41 AM

Messages Rather Than Conversations

Kevin Kelly, in Amish Hackers:

One Amish-man told me that the problem with phones, pagers, and PDAs (yes he knew about them) was that "you got messages rather than conversations". That's about as an accurate summation of our times as any. Henry, his long white beard contrasting with his young bright eyes told me, "If I had a TV, I'd watch it." What could be simpler?

Unlike some younger Amish, I still do not carry a smart phone. I do own a cell but use it only when traveling. If our home phone disappeared overnight, it would likely take several days before my wife or I even noticed.

I also own a television, a now-déclassé 32" flat screen. Henry is right: having a TV, I find myself watching it on occasion. I enjoy it but have to guard vigilantly against falling into a hypnotic trance. It turns out that I form certain habits quite easily.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

September 06, 2016 2:44 PM

"Inception" and the Simulation Argument

If Carroll's deconstruction of the simulation argument is right, then the more trouble we have explaining consciousness, the more that should push us to believe we're in a ground-level simulation. There's probably a higher-level version of physics in which consciousness makes sense. Our own consciousness is probably being run in a world that operates on that higher-level law. And we're stuck in a low-resolution world whose physics doesn't allow consciousness -- because if we weren't, we'd just keep recursing further until we were.

-- Scott Alexander, The View From Ground Level

two characters from the film 'Inception' walking in a dream world where space folds back on itself

In the latest installment of "You Haven't Seen That Yet?", I watched the film Inception yesterday. There was only one person watching, but still the film gets two hearty thumbs-up. All those Ellen Pages waking up, one after the other...

Over the last few years, I've heard many references to the idea from physics that we are living in a simulation, that our universe is a simulation created by beings in another universe. It seems that some physicists think and talk about this a lot, which seems odd to me. Empiricism can't help us much to unravel the problem; arguments pro and con come down to the sort of logical arguments favored by mathematicians and philosophers, abstracted away from observation of the physical world. It's a fun little puzzle, though. The computer science questions are pretty interesting, too.

Ideas like this are old hat to those of us who read a lot of science fiction growing up, in particular Philip K. Dick. Dick's stories were often predicated on suspending some fundamental attribute of reality, or our perception of it, and seeing what happened to our lives and experiences. Now that I have seen Memento (a long-time favorite of mine) and Inception, I'm pretty happy. What Philip K. Dick was with the written word to kids of my generation, Christopher Nolan is on film to a younger generation. I'm glad I've been able to experience both.

~~~~

The photo above comes from Matt Goldberg's review of Inception. It shows Arthur, the character played by Joseph Gordon-Levitt, battling with a "projection" in three-dimensional space that folds back on itself. Such folding is possible in dream worlds and is an important element in designing dreams that enable all the cool mind games that are central to the film.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

September 03, 2016 4:49 PM

The Innate Boundary

If you love a demanding task, one that requires both discipline and talent -- shooting hoops, playing drums, writing code -- you eventually discover an innate boundary: you can apprehend real virtuosity, especially as it's used to best you, but you can't quite incorporate it. You will never be more than almost-great.

-- Tad Friend, in Squash for the Midlife Slump.

Still, you get to love. That's a good thing.


Posted by Eugene Wallingford | Permalink | Categories: General

August 22, 2016 4:18 PM

A New Way to Debug Our Election Systems

In The Security of Our Election Systems, Bruce Schneier says that we no longer have time to sound the alarm about security flaws in our election systems and hope that government and manufacturers will take action. Instead...

We must ignore the machine manufacturers' spurious claims of security, create tiger teams to test the machines' and systems' resistance to attack, drastically increase their cyber-defenses and take them offline if we can't guarantee their security online.

How about this:

The students in my department love to compete in cyberdefense competitions (CDCs), in which they are charged with setting up various systems and then defending them against attack from experts for some period, say, twenty-four hours. Such competitions are growing in popularity across the country.

Maybe we should run a CDC with the tables turned. Manufacturers would be required to set up their systems and to run the full set of services they promise when they sell the systems to government agencies. Students across the US would then be given a window of twenty-four hours or more to try to crack the systems, with the manufacturers or even our election agencies trying to keep their systems up and running securely. Any vulnerabilities that the students find would be made public, enabling the manufacturers to fix them and the state agencies to create and set up new controls.

Great idea or crazy fantasy?


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

August 14, 2016 10:55 AM

Hemingway on Teachers, While Teaching

Ernest Hemingway sitting on a dock next to his boat, Pilar, in the 1930s

Early in Arnold Samuelson's With Hemingway: A Year in Key West and Cuba, Papa is giving an impromptu lecture about writing to two aspiring young writers. (He does that a lot in the book, whenever the men are out sailing and fishing.) This particular lecture was prompted by what he thought was bad advice in a book by a successful pulp fiction author on how to get started as a writer. An earlier session had focused on the shortcomings of going to college to learn how to become a writer.

Toward the end of his discourse, Hemingway tells the young writers to do daily writing exercises and generously offers to read their work, giving feedback on how to get better. This offer elicits a few more remarks about the idea of college writing professors:

"They ought to have me teach some of those college classes. I could teach them something. Most professors of English composition can tell the students what's wrong with their pieces but they don't know how to make them good, because, if they knew that, they'd be writers themselves and they wouldn't have to teach."

"What do you think of the life of a professor?"

"All right for a person who is vain and likes to have the adulation of his students. Well, now, do you fellows think you can remember everything Professor Hemingway has told you? Are you prepared for a written examination on the lecture?"

Teaching computer science must be different from teaching fiction writing. I have been teaching for quite a few years now and have never received any adulation. Then again, though, I've never experienced much derision either. My students seem to develop a narrower set of emotions. Some seem to like me quite a bit and look for chances to take another course with me. Other students are... indifferent. To them, I'm just the guy standing in the way of them getting to somewhere else they want to be.

Hemingway's "have to teach" dig is cliché. Perhaps the Ernest Hemingways and Scott Fitzgeralds of the world should be devoting all of their time to writing, but there have been any number of excellent authors who have supplemented their incomes and filled the down time between creative bursts by helping other writers find a path for themselves. Samuelson's book itself is a testament to how much Papa loved to share his wisdom and to help newcomers find their footing in a tough business. During all those hours at sea, Hemingway was teaching.

Still, I understand what Hemingway means when he speaks of the difference between knowing that something is bad and knowing how to make something good. One of the biggest challenges I faced in my early years as a professor was figuring out how to go beyond pointing out errors and weaknesses in my students' code to giving them concrete advice on how to design and write good programs. I'm still learning how to do that.

I'm lucky that I like to write programs myself. Writing code and learning new styles and languages is the only way to stay sharp. Perhaps if I were really good, I'd leave academia and build systems for Google or some hot start-up, as Hemingway would have it. I'm certainly under no illusion that I can simulate that kind of experience working at a university. But I do think a person can both do and teach, and that the best teachers are ones who take both seriously. In computer science, it is a constant challenge to keep up with students who are pushing ahead into a world that keeps changing.

~~~~

The photo above comes from the John F. Kennedy Presidential Library and Museum. It shows Hemingway sitting on a dock next to his boat, Pilar, sometime in the 1930s. The conversation quoted above took place on the Pilar in 1934.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

August 11, 2016 10:49 AM

To Founders in Search of Patience and Low Costs

Nils Pihl, CEO at Traintracks.io, writes about the benefits of launching the start-up in Beijing:

It took two years of hard work and late nights at the whiteboard to build a prototype of something we knew we could be proud of -- and what Silicon Valley investor would agree to fund something that would take two years to release? Not only that, but it would have cost us roughly 6 times as much money to develop it in Silicon Valley -- for no immediate benefit.

If moving to Beijing is not an option for you, fear not. You do not have to travel that far to find patient investors, great programmers, and low cost of living. Try Des Moines. Or St. Louis. Or Indianapolis. Or, if you must live in a Major World City, try Chicago. Even my small city can offer a good starting point, though programmers are not as plentiful as we might like.

The US Midwest has a lot of advantages for founders, but none of the smog you'll find in Beijing and much shorter commutes than you will find in all the places people tell you you have to go.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

August 04, 2016 11:51 AM

The Spirit Of Our Time: Sensation Brought To An Extreme

Jack Levine, on painting as a realist in the 1950s, a time of abstract expressionism and art as social commentary:

The difficulty is for me to be affirmative. I'm a little inhibited, as you have noticed, by not being against any of these people. The spirit of denunciation is more in the spirit of our time: sensation brought to an extreme.

Levine might just as well have been talking about today's social and political climate. Especially if he had had a Facebook or Twitter account.

~~~~

(This passage comes from Conversations with Artists. These entries also draw passages from it: [ 07/19 | 07/27 | 07/31 ]. This is my last entry drawn from the book, at least for now.)


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

August 03, 2016 1:56 PM

Programming: Don't Knock It Till You Try It

We have a fair number of students on campus outside of CS who want to become web designers, but few of them think they should learn to program. Some give it a try when one of our communications profs tells them how exciting and liberating it can be. In general, though, it's a hard sell. Programming sounds boring to them, full of low-level details better left to techies over in computer science.

This issue pervades the web design community. In The Bomb in the Garden, Matthew Butterick does a great job of explaining why the web as a design medium is worth saving, and pointing to ways in which programming can release the creativity we need to keep it alive.

Which brings me to my next topic--what should designers know about programming?

And I know that some of you will think this is beating a dead horse. But when we talk about restoring creativity to the web, and expanding possibilities, we can't avoid the fact that just like the web is a typographic medium, it's also a programmable medium.

And I'm a designer who actually does a lot of programming in my work. So I read the other 322,000 comments about this on the web. I still think there's a simple and non-dogmatic answer, which is this:

You don't have to learn programming, but don't knock it till you try it.

It's fun for me when one of the web design students majoring in another department takes his or her first programming class and is sparked by the possibilities that writing a program opens up. And we in CS are happy to help them go deeper into the magic.

Butterick speaks truth when he says he's a designer who does a lot of programming in his work. Check out Pollen, the publishing system he created to write web-based books. Pollen's documentation says that it "helps authors make functional and beautiful digital books". That's true. It's a very nice package.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 31, 2016 10:19 AM

"I Live In Air Filled With Images..."

Leonard Baskin waved his arms around his head:

I tell you honestly that I do not live in air. I live in air filled with images, waiting, waiting. And they are mad at me because I don't make them. This is not a fantasy. It is real, I assure you.

I know a few programmers who feel the same way about code. I have periods of such immediacy myself.

This is one of those double-edged phenomena, though. Many people would like to find some goal or activity that so enlivens their world, but they also do not want it to drive them to endless distraction. Fortunately, when we get deep into creating a new something, the swirl goes away for a while.

(This passage comes from Conversations with Artists, which I have now quoted a few times. I promise not to type the entire book into my blog.)


Posted by Eugene Wallingford | Permalink | Categories: General

July 13, 2016 11:19 AM

A Student Asks About Pursuing Research Projects

Faculty in my department are seeking students to work on research projects. I've sent a couple of messages to our student mailing list this week with project details. One of my advisees, a bright guy with a good mind and several interests, sent me a question about applying. His question got to the heart of a concern many students have, so I responded to the entire student list. I thought I'd share the exchange as an open letter to all students out there who are hesitant about pursuing an opportunity.

The student wrote something close to this:

Both professors' projects seem like great opportunities, but I don't feel even remotely qualified for either of them. I imagine many students feel like this. The projects both seem like they'd entail a really advanced set of skills -- especially needing mathematics -- but they also require students with at least two semesters left of school. Should I bother contacting them? I don't want to jump the gun and rule myself out.

Many students "self-select out" -- choose not to pursue an opportunity -- because they don't feel qualified. That's too bad. You would be surprised how often the profs would be able to find a way to include a student who is interested in their work. Sometimes, they work around a skill the student doesn't have by finding a piece of the project he or she can contribute to. More often, though, they help the student begin to learn the skill they need. We learn many things best by doing them.

Time constraints can be a real issue. One semester is not enough time to contribute much to some projects. A grant may run for a year and thus work best with a student who will be around for two or more semesters. Even so, the prof may be able to find a way to include you. They like what they do and like to work with other people who do, too.

My advice is to take a chance. Contact the professor. Stop in to talk with him or her about your interest, your skills, and your constraints. The worst case scenario is that you get to know the professor a little better while finding out that this project is not a good fit for you. Another possible outcome, though, is that you find a connection that leads to something fruitful. You may be surprised!

~~~~

Postscript. One student has stopped in already this morning to thank me for the encouragement and to say that he is going to contact one of the profs. Don't let a little uncertainty stand in the way of pursuing something you like.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

May 30, 2016 8:43 AM

"The One Form of Poverty That Should Be Shunned"

In her essay "The Importance of Being Scared", poet Wislawa Szymborska talked about the fairy tales of Hans Christian Andersen, which were often scary in ways that are out of sync with modern sensibilities. Of Andersen, she wrote:

He didn't believe that you should try to be good because it pays (as today's moral tales insistently advertise, though it doesn't necessarily turn out that way in real life), but because evil stems from intellectual and emotional stuntedness and is the one form of poverty that should be shunned.

I love that phrase: "the one form of poverty that should be shunned", as well as Andersen's prescription.

I need to read more Szymborska. She is often quite funny. This book review quotes a passage from the review of a book on caves that made me smile:

The first to discover caves were of course those animals who could find their way in the dark. Cavemen, who had already lost this gift, couldn't venture too far into their caves. They had to stick to the edges. It's not that they didn't have the nerve, they just didn't have flashlights.

That joke hits close to home in my work as a programmer and teacher, and even as department head. Sometimes, I don't need more nerve. I need a flashlight.


Posted by Eugene Wallingford | Permalink | Categories: General

May 15, 2016 9:36 AM

An Interview about Encryption

A local high student emailed me last week to say that he was writing a research paper about encryption and the current conversation going on regarding its role in privacy and law enforcement. He asked if I would be willing to answer a few interview questions, so that he could have a few expert quotes for his paper. I'm always glad when our local schools look to the university for expertise, and I love to help young people, so I said yes.

I have never written anything here about my take on encryption, Edward Snowden, or the FBI case against Apple, so I figured I'd post my answers. Keep in mind that my expertise is in computer science. I am not a lawyer, a political scientist, or a philosopher. But I am an informed citizen who knows a little about how computers work. What follows is a lightly edited version of the answers I sent the student.

  1. Do you use encryption? If so, what do you use?

    Yes. I encrypt several disk images that hold sensitive financial data. I use encrypted files to hold passwords and links to sensitive data. My work laptop is encrypted to protect university-related data. And, like everyone else, I happily use https: when it encrypts data that travels between me and my bank and other financial institutions on the web.

  2. In light of the recent news on groups like ISIS using encryption, and the Apple v. Department of Justice, do you support legislation that eliminates or weakens powerful encryption?

    I oppose any legislation that weakens strong encryption for ordinary citizens. Any effort to weaken encryption so that the government can access data in times of need weakens encryption for all people at all times and against all intruders.

  3. Do you think the general good of encryption (protection of data and security of users) outweighs or justifies its usage when compared to the harmful aspects of it (being used by terrorists groups or criminals)?

    I do. Encryption is one of the great gifts that computer science has given humanity: the ability to be secure in one's own thoughts, possessions, and communication. Any tool as powerful as this one can be misused, or used for evil ends.

    Encryption doesn't protect us from only the U.S. government acting in good faith. It protects people from criminals who want to steal our identities and our possessions. It protects people from the U.S. government acting in bad faith. And it protects people from other governments, including governments that terrorize their own people. If I were a citizen of a repressive regime in the Middle East, Africa, Southeast Asia, or anywhere else, I would want the ability to communicate without intrusion from my government.

    Those of us who are lucky to live in safer, more secure circumstances owe this gift to the people who are not so lucky. And weakening it for anyone weakens it for everyone.

  4. What is your response to someone who justifies government suppression of encryption with phrases like "What are you hiding?" or "I have nothing to hide."?

    I think that most people believe in privacy even when they have nothing to hide. As a nation, we do not allow police to enter our homes at any time for any reason. Most people lock their doors at night. Most people pull their window shades down when they are bathing or changing clothes. Most people do not have intimate relations in public view. We value privacy for many reasons, not just when we have something illegal to hide.

    We do allow the police to enter our homes when executing a search warrant, after the authorities have demonstrated a well-founded reason to believe it contains material evidence in an investigation. Why not allow the authorities to enter our digital devices under similar circumstances? There are two reasons.

    First, as I mentioned above, weakening encryption so that the government can access data in times of legitimate need weakens encryption for everyone all the time and makes them vulnerable against all intruders, including bad actors. It is simply not possible to create entry points only for legitimate government uses. If the government suppresses encryption in order to assist law enforcement, there will be disastrous unintended side effects to essential privacy of our data.

    Second, our digital devices are different than our homes and other personal property. We live in our homes and drive our cars, but our phones, laptops, and other digital devices contain fundamental elements of our identity. For many, they contain the entirety of our financial and personal information. They also contain programs that enact common behaviors and would enable law enforcement to recreate past activity not stored on the device. These devices play a much bigger role in our lives than a house.

  5. In 2013 Edward Snowden leaked documents detailing surveillance programs that overstepped boundaries spying on citizens. Do you think Snowden became "a necessary evil" to protect citizens that were unaware of surveillance programs?

    Initially, I was unsympathetic to Snowden's attempt to evade detainment by the authorities. The more I learned about the programs that Snowden had uncovered, the more I came to see that his leak was an essential act of whistleblowing. The American people deserve to know what their government is doing. Indeed, citizens cannot direct their government if they do not know what their elected officials and government agencies are doing.

  6. In 2013 to now, the number of users that are encrypting their data has significantly risen. Do you think that Snowden's whistleblowing was the action responsible for a massive rise in Americans using encryption?

    I don't know. I would need to see some data. Encryption is a default in more software and on more devices now. I also don't know what the trend line for user encryption looked like before his release of documents.

  7. Despite recent revelations on surveillance, millions of users still don't voluntarily use encryption. Do you believe it is fear of being labeled a criminal or the idea that encryption is unpatriotic or makes them an evil person?

    I don't know. I expect that there are a number of bigger reasons, including apathy and ignorance.


  8. Encryption defaults on devices like iPhones, where the device is encrypted while locked with a passcode is becoming a norm. Do you support the usage of default encryption and believe it protects users who aren't computer savvy?

    I like encryption by default on my devices. It comes with risks: if I lose my password, I lose access to my own data. I think that users should be informed that encryption is turned on by default, so that they can make informed choices.

  9. Should default encryption become required by law or distributed by the government to protect citizens from foreign governments or hackers?

    I think that we should encourage people to encrypt their data. At this point, I am skeptical of laws that would require it. I am not a legal scholar and do not know that the government has the authority to require it. I also don't know if that is really what most Americans want. We need to have a public conversation about this.

  10. Do you think other foreign countries are catching up or have caught up to the United States in terms of technical prowess? Should we be concerned?

    People in many countries have astonishing technical prowess. Certainly individual criminals and other governments are putting that prowess to use. I am concerned, which is one reason I encrypt my own data and encourage others to do so. I hope that the U.S. government and other American government agencies are using encryption in an effort to protect us. This is one reason I oppose the government mandating weakness in encryption mechanisms for its own purposes.

  11. The United States government disclosed that it was hacked and millions of employees information was compromised. Target suffered a breach that resulted in credit card information being stolen. Should organizations and companies be legally responsible for breaches like these? What reparations should they make?

    I am not a lawyer, but... Corporations and government agencies should take all reasonable precautions to protect the sensitive data they store about their customers and citizens. I suspect that corporations are already subject to civil suit for damages caused by data breaches, but that places burdens on people to recover damages for losses due to breached data. This is another area where we as a people need to have a deeper conversation so that we can decide to what extent we want to institute safeguards into the law.

  12. Should the US begin hacking into other countries infrastructures and businesses to potentially damage that country in the future or steal trade secrets similar to what China has done to us?

    I am not a lawyer or military expert, but... In general, I do not like the idea of our government conducting warfare on other peoples and other governments when we are not in a state of war. The U.S. should set a positive moral example of how a nation and a people should behave.

  13. Should the US be allowed to force companies and corporations to create backdoors for the government? What do you believe would be the fallout from such an event?

    No. See the third paragraph of my answer to #4.

As I re-read my answers, I realize that, even though I have thought a lot about some of these issues over the years, I have a lot more thinking to do. One of my takeaways from the interview is that the American people need to think about these issues and have public conversations in order to create good public policy and to elect officials who can effectively steward the government in a digital world. In order for this to happen, we need to teach everyone enough math and computer science that they can participate effectively in these discussions and in their own governance. This has big implications for our schools and science journalism.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

May 07, 2016 10:55 AM

Without Wonder, Without Awe

Henry Miller, in "The Books in My Life" (1969):

Every day of his life the common man makes use of what men in other ages would have deemed miraculous means. In the range of invention, if not in powers of invention, the man of today is nearer to being a god than at any time in history. (So we like to believe!) Yet never was he less godlike. He accepts and utilizes the miraculous gifts of science unquestioningly; he is without wonder, without awe, reverence, zest, vitality, or joy. He draws no conclusions from the past, has no peace or satisfaction in the present, and is utterly unconcerned about the future. He is marking time.

It's curious to me that this was written around the same time as Stewart Brand's clarion call that we are as gods. The zeitgeist of the 1960s, perhaps.

"The Books in My Life" really has been an unexpected gift. As I noted back in November, I picked it up on a lark after reading a Paris Review interview with Miller, and have been reading it off and on since. Even though he writes mostly of books and authors I know little about, his personal reflections and writing style click with me. Occasionally, I pick up one of the books he discusses, most recently Richard Jefferies's The Story of My Heart.

When other parts of the world seem out of sync, picking up the right book can change everything.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

May 05, 2016 1:45 PM

Philosopher-Programmer

In her 1942 book Philosophy in a New Key, philosopher Susanne Langer wrote:

A question is really an ambiguous proposition; the answer is its determination.

This sounds like something a Prolog programmer might say in a philosophical moment. Langer even understood how tough it can be to write effective Prolog queries:

The way a question is asked limits and disposes the ways in which any answer to it -- right or wrong -- may be given.

Try sticking a cut somewhere and see what happens...

It wouldn't be too surprising if a logical philosopher reminded me of Prolog, but Langer's specialties were consciousness and aesthetics. Now that I think about it, though, this connection makes sense, too.

Prolog can be a lot of fun, though logic programming always felt more limiting to me than most other styles. I've been fiddling again with Joy, a language created by a philosopher, but every so often I think I should earmark some time to revisit Prolog.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Software Development

April 28, 2016 4:12 PM

A Homeric Take on the Power of Programming

By participating in history instead of standing by to watch we shall at least be able to enjoy the present. ... You should read your Homer. Gods who manipulate the course of destiny are no more likely to achieve their private ambitions than are men who suffer the slings and arrows of outrageous fortune; but gods have much more fun!

If we are to be thwarted in our ambitions, let us at least enjoy the striving. Writing code is one way to strive bigger.

~~~~~

From We Are As Gods, the iconic piece in The Whole Earth Catalog that gave us the line, "We are as gods and might as well get good at it" -- misremembered, but improved in the misremembering.


Posted by Eugene Wallingford | Permalink | Categories: General

April 27, 2016 4:35 PM

"I Had No Need Of That Hypothesis"

Joshua Brown closes his blog Simple vs Complex with this delightful little story:

1802: Emperor Napoleon sits in state at the Chateau de Malmaison, ready to receive the mathematical physicist Pierre Laplace and his just-completed Celestial Mechanics. In this book, Laplace has explained the formation of the solar system for the first time and has modeled exactly how the planets and stars work. For all his brutality and battlefield expedience, Napoleon is a sophisticate and an enthusiast of the arts and sciences. He is intellectually curious.

"Tell me, Monsieur Laplace, how did the solar system come about?"

"A chain of natural causes would account for the construction and preservation of the celestial system," Laplace explains.

"But you don't mention God or his intervention even once, as Newton did?"

"I had no need of that hypothesis."

One hundred years earlier, Sir Isaac Newton had created a celestial model of his own. In it, he surmised that the planetary orbits were out of control and not stable, and that a God was needed to explain their course. Laplace went further than Newton, showing "it works without that, too."

Whatever one's position on faith in a supernatural deity, Laplace models precisely the attitude that scientists must bring to their work. Let's explain every phenomenon with the fewest and simplest hypotheses.


Posted by Eugene Wallingford | Permalink | Categories: General

April 25, 2016 1:26 PM

"So Little of the Great to Conceal"

In a recent post, Clive Thompson quotes a short passage from Carlo Rovelli's Seven Brief Lessons on Physics in which Rovelli notes that genius hesitates when it comes upon great ideas. Einstein introduced quantum theory with "It seems to me...", and Darwin demurred even in his own notebooks on natural selection with "I think...". Thompson writes:

It's not a bad litmus test for the people around us in everyday life. The ones who are proposing genuinely startling and creative ideas are liable to be ... careful about it. It's the ones with small ideas who are shouting them from the rooftops.

These thoughts brought to mind a wonderful passage from Okakura Kakuzo's The Book of Tea:

Perhaps we reveal ourselves too much in small things because we have so little of the great to conceal.

Those who encounter a great idea are most willing to let their uncertainty show. Those who express no uncertainty often have no greatness to conceal.

Earlier in the book, Okakura writes another line that I see quoted often:

Those who cannot feel the littleness of great things in themselves are apt to overlook the greatness of little things in others.

This passage takes on a different flavor for me when considered in the light of Rovelli's observation.


Posted by Eugene Wallingford | Permalink | Categories: General

April 22, 2016 12:03 PM

Universities, Cities, and Start-Ups

If I were the city of Des Moines, I'd be thinking about Paul Graham's advice on how to make Pittsburgh a startup hub. Des Moines doesn't have a Carnegie Mellon, but it is reasonably close to two major research universities and has a livable downtown. While Des Moines is not likely to become a major startup hub, it could create the sort of culture needed to sustain a healthy ecosystem for new companies. Such an ecosystem would strengthen its already solid, if unspectacular, IT industry.

Regarding the universities' role in this process, Graham says:

Being that kind of talent magnet is the most important contribution universities can make toward making their city a startup hub. In fact it is practically the only contribution they can make.

But wait, shouldn't universities be setting up programs with words like "innovation" and "entrepreneurship" in their names? No, they should not. These kind of things almost always turn out to be disappointments. They're pursuing the wrong targets. The way to get innovation is not to aim for innovation but to aim for something more specific, like better batteries or better 3D printing. And the way to learn about entrepreneurship is to do it, which you can't in school.

Our university has an entrepreneurship program. I like a lot of what they do for students, but I worry about it becoming more about studying entrepreneurship than about starting companies. Academics are great at creating programs to talk about stuff, and a lot of what I see our students do is reading about entrepreneurship and studying what other entrepreneurs have done and are doing. I'm reminded of an online Q-n-A with Elon Musk's ex-wife. She said that one thing Elon was not doing was sitting around thinking about what other entrepreneurs were doing.

As in so many things, I am also reminded of an aphorism from Kent Beck: "Do stuff, or talk about stuff, but don't talk about doing stuff." An entrepreneur does things. The best thing a university can do is to help students learn what they need to solve hard problems and then get out of their way.


Posted by Eugene Wallingford | Permalink | Categories: General

April 11, 2016 2:53 PM

A Tax Form is Really a Program

I finally got around to preparing my federal tax return this weekend. As I wrote a decade ago, I'm one of those dinosaurs who still does taxes by hand, using pencil and paper. Most of this work involves gathering data from various sources and entering numbers on a two-page Form 1040. My family's finances are relatively simple, I'm reasonably well organized, and I still enjoy the annual ritual of filling out the forms.

For supporting forms such as Schedules A and B, which enumerate itemized deductions and interest and dividend income, I reach into my books. My current accounting system consists of a small set of Python programs that I've been developing over the last few years. I keep all data in plain text files. These files are amenable to grep and simple Python programs, which I use to create lists and tally numbers to enter into forms. I actually enjoy the process and, unlike some people, enjoy reflecting once each year about how I support "we, the people" in carrying out our business. I also reflect on the Rube Goldberg device that is US federal tax code.
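A minimal sketch shows the idea, though the record format and function below are made up for illustration; my real files and scripts differ. Each transaction sits on one line of a text file, and a small function filters and sums one category:

```python
# Tally one category of income from a plain-text ledger.
# The record format here is invented for illustration:
#   date  category  source  amount
def tally(records, category):
    total = 0.0
    for record in records:
        fields = record.split()
        if len(fields) == 4 and fields[1] == category:
            total += float(fields[3])
    return total

ledger = ["2015-03-31 dividend VTSAX 120.00",
          "2015-04-15 interest SAVINGS 1.25",
          "2015-06-30 dividend VTSAX 123.45"]

print('%.2f' % tally(ledger, "dividend"))   # prints 243.45
```

Nothing here that grep and awk couldn't do, but having the logic in a small program makes it easy to reuse each spring.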

However, every year there is one task that annoys me: computing the actual tax I owe. I don't mind paying the tax, or the amount I owe. But I always forget how annoying the Qualified Dividends and Capital Gain Tax Worksheet is. In case you've never seen it, or your mind has erased its pain from your memory in an act of self-defense, here it is:

Qualified Dividends and Capital Gain Tax Worksheet--Line 44

It may not seem so bad at this moment, but look at that logic. It's a long sequence of "Enter the smaller of line X or line Y" and "Add lines Z and W" instructions, interrupted by an occasional reference to an entry on another form or a case statement to select a constant based on your filing status. By the time I get to this logic puzzle each year, I am starting to tire and just want to be done. So I plow through this mess by hand, and I start making mistakes.

This year I made a mistake in the middle of the form, comparing the wrong numbers when instructed to choose the smaller. I realized my mistake when I got to a line where the error resulted in a number that made no sense. (Fortunately, I was still alert enough to notice that much!) I started to go back and refigure from the line with the error, when suddenly sanity kicked in.

This worksheet is a program written in English, being executed by a tired, error-prone computer: me. I don't have to put up with this; I'm a programmer. So I turned the worksheet into a Python program.

This is what the Qualified Dividends and Capital Gain Tax Worksheet for Line 44 of Form 1040 (Page 44 of the 2015 instruction book) could be, if we weren't still distributing everything as dead PDF:

line   = [None] * 28

line[ 0] = 0.00            # unused
line[ 1] = XXXX            # 1040 line 43
line[ 2] = XXXX            # 1040 line 9b
line[ 3] = XXXX            # 1040 line 13
line[ 4] = line[ 2] + line[ 3]
line[ 5] = XXXX            # 4952 line 4g
line[ 6] = line[ 4] - line[ 5]
line[ 7] = line[ 1] - line[ 6]
line[ 8] = XXXX            # from worksheet
line[ 9] = min(line[ 1], line[ 8])
line[10] = min(line[ 7], line[ 9])
line[11] = line[ 9] - line[10]
line[12] = min(line[ 1], line[ 6])
line[13] = line[11]
line[14] = line[12] - line[13]
line[15] = XXXX            # from worksheet
line[16] = min(line[ 1], line[15])
line[17] = line[ 7] + line[11]
line[18] = line[16] - line[17]
line[19] = min(line[14], line[18])
line[20] = 0.15 * line[19]
line[21] = line[11] + line[19]
line[22] = line[12] - line[21]
line[23] = 0.20 * line[22]
line[24] = XXXX            # from tax table
line[25] = line[20] + line[23] + line[24]
line[26] = XXXX            # from tax table
line[27] = min(line[25], line[26])

i = 0
for l in line:
    print('{:>2} {:10.2f}'.format(i, l))
    i += 1

This is a quick-and-dirty first cut, just good enough for what I needed this weekend. It requires some user input, as I have to manually enter values from other forms, from the case statements, and from the tax table. Several of these steps could be automated, with only a bit more effort or a couple of input statements. It's also not technically correct, because my smaller-of tests don't guard for a minimum of 0. Maybe I'll add those checks soon, or next year if I need them.
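Both fixes are small. A pair of helpers along these lines (the names are my own invention) would prompt for the manually-entered values and clamp subtractions at zero, which is what the worksheet's "If zero or less, enter -0-" instructions call for:

```python
def ask(prompt):
    """Prompt the user for a dollar amount copied from another form."""
    return float(input(prompt + ": "))

def minus(a, b):
    """Subtract b from a, but never return less than zero."""
    return max(0.0, a - b)

# For example, line 7 of the worksheet would become:
#     line[7] = minus(line[1], line[6])
```

With those in place, the program would stand alone instead of needing me to pre-fill the XXXX entries by hand.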

Wouldn't it be nice, though, if our tax code were written as computer code, or if we could at least download worksheets and various forms as simple programs? I know I can buy commercial software to do this, but I shouldn't have to. There is a bigger idea at play here, and a principle. Computers enable so much more than sharing PDF documents and images. They can change how we write many ideas down, and how we think. Most days, we barely scratch the surface of what is possible.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

April 07, 2016 3:47 PM

Notes from Today's Reading

Getting Older

In Fun With Aging, "Dean Dad" Matt Reed pontificates on reaching a Certain Age.

When Mom was the age I am now, I was in grad school. That can't possibly be right, but that's what the math says.

When my mom was the age I am now, I was already in my second year as an assistant professor, a husband, and father to a two-year-old daughter. Wow.

Getting old: what a strange thing to happen to a little boy.

That said, I am one up on Reed: I know one of Justin Bieber's recent songs and quite like it.

An Interesting Juxtaposition

Earlier this week, I read The Real Reason Middle America Should Be Angry, about St. Louis's fall from national prominence. This morning, I read The Refragmentation, Paul Graham's essay on the dissolution of the 20th century's corporate and cultural order abetted, perhaps accelerated, by computation.

Both tell a story of the rise and fall of corporations across the 20th century. Their conclusions diverge widely, though, especially on the value of government policies that affect scale. I suspect there are elements of truth in both arguments. In any case, they make interesting bookends to the week.

A Network of Links

Finally, as I tweeted yesterday, a colleague told me that he was going to search my blog. He had managed to forget where his own blog lives, and he remembered that I linked to it once.

At first, I chuckled at this situation as a comment on his forgetfulness, and ruefully as a comment on the passing of the age of the blog. But later I realized that this is as much a comment on the wonderfulness of blogging culture, in which links are life and, as long as the network is alive, conversation can be revived.

I hope he blogs again.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

April 05, 2016 4:06 PM

Umberto Eco and the Ineffable Power of Books

In What Unread Books Can Teach Us Oliver Burkeman relates this story about novelist and scholar Umberto Eco:

While researching his own PhD, Eco recalls, he got deeply stuck, and one day happened to buy a book by an obscure 19th-century abbot, mainly because he liked the binding. Idly paging through it, he found, in a throwaway line, a stunning idea that led him to a breakthrough. Who'd have predicted it? Except that, years later, when a friend asked to see the passage in question, he climbed a ladder to a high bookshelf, located the book... and the line wasn't there. Stimulated by the abbot's words, it seems, he'd come up with it himself. You never know where good ideas will come from, even when they come from you.

A person can learn something from a book he or she has read, even if the book doesn't contain what the person learned. This is a much steadier path to knowledge than resting in the comfort that all information is available at the touch of a search engine.

A person's anti-library helps to make manifest what one does not yet know. As Eco reminds us, humility is an essential ingredient in this prescription.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

March 30, 2016 3:21 PM

Quick Hits at the University

This morning I read three pieces with some connection to universities and learning. Each had one passage that made me smart off silently as I pedaled.

From The Humanities: What's The Big Idea?:

Boyarin describes his own research as not merely interdisciplinary but "deeply post-disciplinary." (He jokes that when he first came to Berkeley, his dream was to be 5 percent in 20 departments.)

Good luck getting tenure that way, dude.

"Deeply post-disciplinary" is a great bit of new academic jargon. Universities are very much organized by discipline. Figuring out how to support scholars who work outside the lines is a perpetual challenge, one that we really should address at scale if we want to enable universities to evolve.

From this article on Bernie Sanders's free college plan:

Big-picture principles are important, but implementation is important, too.

Hey, maybe he just needs a programmer.

Implementing big abstractions is hard enough when the substance is technical. When you throw in social systems and politics, implementing any idea that deviates very far from standard practice becomes almost impossible. Big Ball of Mud, indeed.

From Yours, Isaac Asimov: A Life in Letters:

Being taught is the intellectual analog of being loved.

I'll remind my students of this tomorrow when I give them Exam 3, on syntactic abstraction. "I just called to say 'I love you'."

Asimov is right. When I think back on all my years in school, I feel great affection for so many of my teachers, and I recall feeling their affection for me. Knowledge is not only power, says Asimov; it is happiness. When people help me learn, they offer me new ways to be happy.

(The Foundation Trilogy makes me happy, too.)


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

March 16, 2016 2:33 PM

Doing the Obvious Things Well

The San Antonio Spurs perennially challenge for the championship of the National Basketball Association. Like most great NBA teams, they have several excellent players. However, the foundation of their success isn't the high-flying sort of offense that people often associate with professional basketball, but rather a meticulous defense, the sort of defense people usually associate with defensive specialists and hard-working journeymen.

What's the secret? This article tells us there is no magic:

It's easy to think some form of incomprehensible genius is responsible for the subtle components of an elite defense, but in all reality, doing the obvious thing and doing it well (which is the hard part) is often all it takes.

This is true of so many things in life. Figure out what you need to do to excel, and then practice it -- both in preparation for the act and in the act itself.

It's not very exciting, but grunt work and attention to detail are usually the primary components of excellence. A little luck helps, of course; the Spurs were able to draft all-time great Tim Duncan as David Robinson's career was winding down. But even that luck is tinged with an unexciting message... What makes Duncan great isn't flashy style and supernatural skills. It's mostly doing the obvious things and doing them well.


Posted by Eugene Wallingford | Permalink | Categories: General

February 12, 2016 3:34 PM

Computing Everywhere: Detecting Gravitational Waves

a linearly-polarized gravitational wave
Wikimedia Commons (CC BY-SA 3.0 US)

This week the world is excitedly digesting news that the interferometer at LIGO has detected gravitational waves being emitted by the merger of two black holes. Gravitational waves were predicted by Einstein one hundred years ago in his theory of General Relativity. Over the course of the last century, physicists have amassed plenty of indirect evidence that such waves exist, but this is the first time they have detected them directly.

The physics world is understandably quite excited by this discovery. We all should be! This is another amazing moment in science: Build a model. Make a falsifiable prediction. Wait for 100 years to have the prediction confirmed. Wow.

We in computer science can be excited, too, for the role that computation played in the discovery. As physicist Sabine Hossenfelder writes in her explanation of the gravitational wave story:

Interestingly, even though it was long known that black hole mergers would emit gravitational waves, it wasn't until computing power had increased sufficiently that precise predictions became possible. ... General Relativity, though often praised for its beauty, does leave you with one nasty set of equations that in most cases cannot be solved analytically and computer simulations become necessary.

As with so many cool advances in the world these days, whether in the sciences or the social sciences, computational modeling and simulation were instrumental in helping to confirm the existence of Einstein's gravitational waves.

So, fellow computer scientists, celebrate a little. Then, help a young person you know to see why they might want to study CS, alone or in combination with some other discipline. Computing is one of the fundamental tools we need these days in order to contribute to the great tableau of human knowledge. Even Einstein can use a little computational help now and then.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

January 17, 2016 10:07 AM

The Reluctant Mr. Darwin

Yesterday, I finished reading The Reluctant Mr. Darwin, a short biography of Charles Darwin by David Quammen published in 2006. It covers Darwin's life from the time he returns from his voyage on the HMS Beagle to his death in 1882, with a short digression to discuss Alfred Russel Wallace's early voyages and independent development of ideas on evolution and its mechanisms.

Before reading this book, I knew the basics of Darwin's theories but nothing about his life and very little about the milieu in which he worked and developed his theories. After reading, I have a better appreciation for the caution with which Darwin seemed to have worked, and the care he took to record detailed observations and to support his ideas with evidence from both nature and breeding. I also have a sense of how Wallace's work related to and affected Darwin's work. I could almost feel Darwin's apprehension upon receiving Wallace's letter from southeast Asia, outlining ideas Darwin had been developing, refining, and postponing for twenty years.

The Reluctant Mr. Darwin is a literary essay, not scholarly history. I enjoyed reading it. The book is at its best when talking about Darwin's life and work as a scientist, his attitudes and his work habits. The writing is clear, direct, and entertaining. When talking about Darwin's theories themselves, however, and especially about their effect in the world and culturally, the book comes across as too earnest and a bit too breathless for my taste. But this is a minor quibble. It's a worthwhile read.


Posted by Eugene Wallingford | Permalink | Categories: General

January 11, 2016 10:51 AM

Some Writing by Administrators Isn't Bad; It's Just Different

Jim Garland is a physicist who eventually became president of Miami University of Ohio. In Bad Writing by Administrators, Rachel Toor asked Garland how his writing evolved as he moved up the administrative hierarchy. His response included:

Truthfully, I did my deepest thinking as a beginning assistant professor, writing obscure papers on the quantum-mechanical properties of solids at liquid-helium temperatures. Over the years, I became shallower and broader, and by the time I left academe, I was worrying about the seating arrangement of donors in the president's football box.

I have experienced this even in my short step into the department head's office. Some of the writing I do as head is better than my writing before: clear, succinct, and qualified precisely. It is written for a different audience, though, and within a much different political context. My colleagues compliment me occasionally for having written a simple, straightforward note that says something they've been struggling to articulate.

Other times, my thinking is more muddled, and that shows through in what I write. When I try to fix the writing before I fix my thinking, I produce bad writing.

Some writing by administrators really is bad, but a lot of it is simply broader and shallower than what we write as academics. The broader it becomes, the less interesting the content is to the academics still living inside of us. Yet our target audience often can appreciate the value of that less interesting writing when it serves its purpose.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

December 26, 2015 2:12 PM

Moments of Alarm, Written Word Edition

In The Art of Fiction No. 156, George Plimpton asked William Styron, "Are you worried about the future of the written word?" Styron said, "Not really." But he did share the sort of moment that causes him alarm:

Not long ago I received in the mail a doctoral thesis entitled "Sophie's Choice: A Jungian Perspective", which I sat down to read. It was quite a long document. In the first paragraph it said, In this thesis my point of reference throughout will be the Alan J. Pakula movie of Sophie's Choice. There was a footnote, which I swear to you said, Where the movie is obscure I will refer to William Styron's novel for clarification.

Good thing there was an original source to consult.


Posted by Eugene Wallingford | Permalink | Categories: General

December 12, 2015 3:04 PM

Agreement: The Rare and Beautiful Exception

From How to Disagree:

Once disagreement starts to be seen as utterly normal, and agreement the rare and beautiful exception, we can stop being so surprised and therefore so passionately annoyed when we meet with someone who doesn't see eye-to-eye with us.

Sometimes, this attitude comes naturally to me. Other times, though, I have to work hard to make it my default stance. Things usually go better for me when I succeed.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

November 25, 2015 10:50 AM

It Started with a Tweet

Bret Victor's much-heralded What Can a Technologist Do About Climate Change? begins:

This started with a tweet. I'm embarrassed how often that happens.

Why be embarrassed? I am occasionally embarrassed when I tweet snarky and mildly regrettable things, but only because they are snarky and regrettable. However, having a thought, writing it down, and thinking some more is a perfectly honorable way to start writing an essay. Writing something down on Twitter has the advantage of sharing the idea with one's followers, which creates the possibility of getting feedback on the idea from smart, thoughtful people.

Sharing idle thoughts with the world can add value to them. They aren't always so idle.


Posted by Eugene Wallingford | Permalink | Categories: General

November 19, 2015 2:45 PM

Hope for the Mature Researcher

In A Primer on Graph Isomorphism, Lance Fortnow puts László Babai's new algorithm for the graph isomorphism problem into context. To close, he writes:

Also we think of theory as a young person's game, most of the big breakthroughs coming from researchers early in their careers. Babai is 65, having just won the Knuth Prize for his lifetime work on interactive proofs, group algorithms and communication complexity. Babai uses his extensive knowledge of combinatorics and group theory to get his algorithm. No young researcher could have had the knowledge base or maturity to be able to put the pieces together the way that Babai did.

We often hear that research, especially research aimed at solving our deepest problems, is a young person's game. Great work takes a lot of stamina. It often requires a single-minded focus that comes naturally to a young person but which is a luxury unavailable to someone with a wider set of obligations beyond work. Babai's recent breakthrough reminds us that other forces are at play, that age and broad experience can be advantages, too.

This passage serves as a nice counterweight to Garrison Keillor's The slow rate of learning... line, quoted in my previous post. Sometimes, slow and steady are what it takes to get a big job done.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 17, 2015 3:32 PM

Choice Passages from Recent Reads

Eric Schmidt, in an interview for Blitzscaling:

Every great project has started with a graduate student and an assistant professor looking for tenure.

~~~~~

Sir Peter Swinnerton-Dyer, quoted in an article about John Horton Conway:

I'm not sure that I can describe how charisma happens. It just is or isn't. And with most mathematicians it markedly isn't.

~~~~~

Alan Jacobs, in his 79 Theses on Technology For Disputation, as recorded by Chad Wellmon:

Everyone should sometimes write by hand, to recall what it's like to have second thoughts before the first ones are completely recorded.

~~~~~

Garrison Keillor, in his novel "Love Me":

We puritans overdramatize these things. We want there to be lions stalking us, whereas it's only some old coyote. Not utter degradation; just poor choices.

Truth, resignation, and freedom, all in one brief passage. I also like this one:

The slow rate of learning is discouraging to the older man, but thank God for illumination at whatever hour.

Yes, indeed.


Posted by Eugene Wallingford | Permalink | Categories: General

November 08, 2015 9:37 AM

Enthusiastic Recommendation Is Not A Vice

Novelist Henry Miller lamented one of his greatest vices, recommending books and authors too enthusiastically, but ultimately decided that he would not apologize for it:

However, this vice of mine, as I see it, is a harmless one compared with those of political fanatics, military humbugs, vice crusaders, and other detestable types. In broadcasting to the world my admiration and affection, my gratitude and reverence, ... I fail to see that I am doing any serious harm. I may be guilty of indiscretion, I may be regarded as a naïve dolt, I may be criticized justly or unjustly for my taste, or lack of it; I may be guilty, in the high sense, of "tampering" with the destiny of others; I may be writing myself down as one more "propagandist", but -- how am I injuring anyone? I am no longer a young man. I am, to be exact, fifty-eight years of age. (Je me nomme Louis Salavin.) Instead of growing more dispassionate about books, I find the contrary is taking place.

I'm a few years younger than Messrs. Miller and Salavin, but I share this vice of Miller's, as well as his conclusion. When you reach a certain age, you realize that admiration, affection, gratitude, and reverence, especially for a favorite book or author, are all to be cherished. You want to share them with everyone you meet.

Even so, I try to rein in my vice in the same way Miller himself knew he ought in his soberer moments, by having a lighter touch when I recommend. Broadcasting one's admiration and affection too enthusiastically often has the opposite effect to the one intended. The recipients either take the recommendation on its face and read with such high expectations that they will surely be disappointed, or they instinctively (if subconsciously) react with such skepticism that they read with an eye toward deflating the recommendation.

I will say that I have been enjoying The Books In My Life, from which the above passage comes. I've never read any of Miller's novels, only a Paris Review interview with him. This book about the books that shaped him has been a pleasant introduction to Miller's erudite and deeply personal style. Alas, the occasional doses of French are lost on me without the help of Google Translate.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

November 03, 2015 4:10 PM

Academic Computer Science is Not University IT

The Department of Biology operates a greenhouse so that its faculty can cultivate and study a wide range of plants. The greenhouse offers students a chance to see, smell, and touch real, living plants that they may never have encountered before. With the greenhouse as an ecosystem, students get to learn about the relationships among species and a little about how species evolve. The ecological setting in which the plants are grown provides the context needed for faculty to demonstrate realistically how organisms are connected within environments.

Faced with budget cuts, the university has decided that it is no longer cost-effective to have biology staff operate the greenhouse. We already have a grounds and landscaping unit as part of the physical plant, and its staff has expertise for working with a variety of plants as a part of managing lawns and gardens. To save money, the administration is centralizing all plant management services in grounds and landscaping. If the folks in Biology need anything done in or for the greenhouse, they call a designated contact person. They will have to streamline the offerings in the greenhouse, based on university-wide decisions about what kind of plants we can afford to support.

~~~~

The Department of Art has a number of faculty who specialize in drawing, both pencil and ink, and in painting. All students who major in art take two courses in drawing as part of the foundations sequence, and many studio art majors take painting. Both media help students learn to see and teach them about how their materials interact with their vision and affect the shape of their creative works.

Faced with budget cuts, the university has decided that it is no longer cost-effective to have the art faculty select and buy their own pencils, ink, and paints. We already have a couple of units on campus who purchase and use these materials. Operation and Maintenance does a wide variety of carpentry projects that include painting. All campus staff use pencils and ink pens, so Business Operations has purchasing agreements with several office supplies wholesalers. These agreements ensure that university staff can stock a range of pencils, pens, and paints at the best possible price.

When one of the drawing faculty calls over for a particular set of soft graphite pencils, ranging in hardness from 9B to H, she is told that the university has standardized on a set with a smaller range. Standardization allows us to buy in bulk and to save management overhead. "At least they aren't all No. 2 pencils," thinks the art prof.

When one of the painting faculty calls over to Facilities for a particular set of acrylic paints, the warehouse manager says, "Sure, just let me know what colors you need and we'll buy them. We have a great contract with Sherwin Williams." The prof isn't sure where he'll put all the one-gallon cans, though.

~~~~

... just kidding. No university would ever do that, right? Biologists run their own greenhouses and labs, and art faculty select, buy, and manage specialty materials in their studios. Yet academic Computer Science departments often work under nearly identical circumstances, because computers are part of the university's IT infrastructure.

Every few years at academic institutions, the budget and management pendulum swings toward centralization of IT services, as a way to achieve economies of scale and save money. Then, a few years later, it swings back toward decentralization, as a way to provide better and finer-grained services to individual departments. Too often, the services provided to CS faculty and students are forced to go along for the ride.

My university is going through one of its periodic recentralizations, at the orders of the Board of Regents. Every time we centralize, we have to have the same conversations about how Computer Science fits into the picture, because most non-CS people ultimately see our use of computers and software as fundamentally the same as, say, the English department's or the Psychology department's. However interesting those departments' use of technology is (and in this day, most faculty and students use technology in interesting ways, regardless of discipline), it is not the same thing as what Computer Science does.

Academic computing has never been limited to computer scientists, of course. Many mathematicians and physicists rely on a very different sort of computing than the folks who use it only for library-style research, writing, and presentation. So do faculty in a few other disciplines. Just as Biology and Art need specialized laboratories and materials, so do the departments working at the edge of computing. Computer Science is simply the discipline that is farthest out along this curve.

The same thing goes for support staff as for equipment. Few administrators would think of "centralizing" the lab technician and supplies manager for Biology or Chemistry into a non-academic unit on campus, or ask academic departments to depend on a non-academic unit to provide discipline-specific services that are critical to the departments' mission. Lab technicians and equipment managers need to be hired by the departments (or the college) that need them and serve the departments directly. So, too, do certain departments need to have system administrators and lab managers who work for them to meet the specialized needs of academic computing, serving the department or college directly.

Hardware and software are a computer scientist's greenhouse and artistic media. They are our library and our telescopes, our tallgrass prairie preserves and our mass spectrometers. It is essential that university administrations think of -- and provide for -- Computer Science and other computation-laden departments as academic disciplines first, and not just as consumers of generic IT services. Doing so requires, among other things, leaving control of essential hardware, software, and policies for their use within the academic departments.

~~~~~

Disclaimer. The vignettes above were written by me. I am very much neither a biologist nor a studio artist. If any of the details clash with reality, please see them as creative liberties taken by the author to serve a theme.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

October 30, 2015 4:35 PM

Taking Courses Broad and Wide

Nearly nine years ago, digital strategist Russell Davies visited the University of Oregon to work with students and faculty in the advertising program and wrote a blog entry about his stint there. Among his reflections on what the students should be doing and learning, he wrote:

We're heading for a multi-disciplinary world and that butts right up against a university business model. If I were preparing myself for my job right now I'd do classes in film editing, poetry, statistics, anthropology, business administration, copyright law, psychology, drama, the history of art, design, coffee appreciation, and a thousand other things. Colleges don't want you doing that, that destroys all their efficiencies, but it's what they're going to have to work out.

I give similar advice to prospective students of computer science: If they intend to take their CS degrees out into the world and make things for people, they will want to know a little bit about many different things. To maximize the possibilities of their careers, they need a strong foundation in CS and an understanding of all the things that shape how software and software-enhanced gadgets are imagined, made, marketed, sold, and used.

Just this morning, a parent of a visiting high school student said, after hearing about all the computer science that students learn in our programs, "So, our son should probably drop his plans to minor in Spanish?" They got a lot more than a "no" out of me. I talked about the opportunities to engage with the growing population of Spanish-speaking Americans, even here in Iowa; the opportunities available to work for companies with international divisions; and how learning a foreign language can help students study and learn programming languages differently. I was even able to throw in a bit about grammars and the role they play in my compiler course this semester.

I think the student will continue with his dream to study Spanish.

I don't think that the omnivorous course of study that Davies outlines is at odds with the "efficiencies" of a university at all. It fits pretty well with a liberal arts education, which even our B.S. students have time for. But it does call for some thinking ahead, planning to take courses from across campus that aren't already on some department's list of requirements. A good advisor can help with that.

I'm guessing that computer science students and "creatives" are not the only ones who will benefit from seeking a multi-disciplinary education these days. Davies is right. All university graduates will live in a multi-disciplinary world. It's okay for them (and their parents) to be thinking about careers when they are in school. But they should prepare for a world in which general knowledge and competencies buoy up their disciplinary knowledge and help them adapt over time.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

October 22, 2015 4:22 PM

Aramaic, the Intermediate Language of the Ancient World

My compiler course is making the transition from the front end to the back end. Our attention is on static analysis of abstract syntax trees and will soon turn to other intermediate representations.

In the compiler world, an "intermediate representation" or intermediate language is a notation used as a stepping stone between the abstract syntax tree and the machine language that is ultimately produced. Such a stepping stone allows the compiler to take smaller steps in the translation process and makes it easier to improve the code before getting down into the details of machine language.
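To make the idea concrete, here is a minimal sketch (my own illustration, not code from the course) of lowering a tiny arithmetic syntax tree into three-address code, one classic intermediate representation. The class names `Num` and `BinOp` and the function `lower` are hypothetical:

```python
import itertools
from dataclasses import dataclass

@dataclass
class Num:
    value: int

@dataclass
class BinOp:
    op: str
    left: object
    right: object

def lower(node, code, fresh):
    """Return the operand naming node's value, appending
    three-address instructions to `code` along the way."""
    if isinstance(node, Num):
        return str(node.value)
    left = lower(node.left, code, fresh)
    right = lower(node.right, code, fresh)
    t = f"t{next(fresh)}"          # invent a fresh temporary
    code.append(f"{t} = {left} {node.op} {right}")
    return t

# (2 + 3) * 4
ast = BinOp("*", BinOp("+", Num(2), Num(3)), Num(4))
code = []
result = lower(ast, code, itertools.count(1))
# code is now ["t1 = 2 + 3", "t2 = t1 * 4"], result is "t2"
```

Each instruction has at most one operator, which is what makes this form a convenient stepping stone: it is close enough to machine code to translate easily, but still abstract enough to analyze and improve.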

We sometimes see intermediate languages in the "real world", too. They tend to arise as a result of cultural and geopolitical forces and, while they usually serve different purposes in human affairs than in compiler affairs, they still tend to be practical stepping stones to another language.

Consider the case of Darius I, whose Persian armies conquered most of the Middle East around 500 BC. As John McWhorter writes in The Atlantic, at the time of Darius's conquest,

... Aramaic was so well-entrenched that it seemed natural to maintain it as the new empire's official language, instead of using Persian. For King Darius, Persian was for coins and magnificent rock-face inscriptions. Day-to-day administration was in Aramaic, which he likely didn't even know himself. He would dictate a letter in Persian and a scribe would translate it into Aramaic. Then, upon delivery, another scribe would translate the letter from Aramaic into the local language. This was standard practice for correspondence in all the languages of the empire.

For sixty years, many compiler writers have dreamed of a universal intermediate language that would ease the creation of compilers for new languages and new machines, to no avail. But for several hundred years, Aramaic was the intermediate representation of choice for a big part of the Western world! Alas, Greek and Arabic later came along to supplant Aramaic, which now seems to be on a path to extinction.

This all sounds a lot like the world of programming, in which languages come and go as we develop new technologies. Sometimes a language, human or computer, takes root for a while as the result of historical or technical forces. Then a new regime or a new culture rises, or an existing culture gains in influence, and a different language comes to dominate.

McWhorter suggests that English may have risen to prominence at just the right moment in history to entrench itself as the world's intermediate language for a good long run. We'll see. Human languages and computer languages may operate on different timescales, but history treats them much the same.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

October 15, 2015 8:18 AM

Perfection Is Not A Pre-Requisite To Accomplishing Something Impressive

In Not Your Typical Role Model, mathematician Hannah Fry tells us some of what she learned about Ada Lovelace, "the 19th century programmer", while making a film about her. Not all of it was complimentary. She concludes:

Ada was very, very far from perfect, but perfection is not a pre-requisite to accomplishing something impressive. Our science role models shouldn't always be there to celebrate the unachievable.

A lot of accomplished men of science were far from perfect role models, too. In the past, we've often been guilty of covering up bad behavior to protect our heroes. These days, we sometimes rush to judge them. Neither inclination is healthy.

By historical standards, it sounds like Lovelace's imperfections were all too ordinary. She was human, like us all. Lovelace thought some amazing things and wrote them down for us. Let's celebrate that.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

September 22, 2015 2:57 PM

"Good Character" as an Instance of Postel's Law

Mike Feathers draws an analogy I'd never thought of before in The Universality of Postel's Law: what we think of as "good character" can be thought of as an application of Postel's Law to ordinary human relations.

Societies often have the notion of 'good character'. We can attempt all sorts of definitions but at its core, isn't good character just having tolerance for the foibles of others and being a person people can count on? Accepting wider variation at input and producing less variation at output? In systems terms that puts more work on the people who have that quality -- they have to have enough control to avoid 'going off' on people when others 'go off on them', but they get the benefit of being someone people want to connect with. I argue that those same dynamics occur in physical systems and software systems that have the Postel property.

These days, most people talk about Postel's Law as a social law, and criticisms of it even in software design refer to it as creating moral hazards for designers. But Postel coined this "principle of robustness" as a way to talk about implementing TCP, and most references I see to it now relate to HTML and web browsers. I think it's pretty cool when a software design principle applies more broadly in the design world, or can even be useful for understanding human behavior far removed from computing. That's the sign of a valuable pattern -- or anti-pattern.
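For readers who haven't met the principle in code, here is a minimal sketch of my own (much simpler than the TCP and HTML cases above, and the function name is hypothetical): a date reader that is liberal in what it accepts but conservative in what it emits.

```python
from datetime import datetime

# Several spellings we tolerate on input...
ACCEPTED_FORMATS = ["%Y-%m-%d", "%m/%d/%Y", "%B %d, %Y"]

def normalize_date(text):
    """Accept varied date formats; always emit canonical ISO 8601."""
    for fmt in ACCEPTED_FORMATS:
        try:
            return datetime.strptime(text.strip(), fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    raise ValueError(f"unrecognized date: {text!r}")

# Liberal input, conservative output:
# normalize_date("July 30, 2015"), normalize_date("07/30/2015"),
# and normalize_date("2015-07-30") all return "2015-07-30".
```

As in Feathers's human analogy, the extra work lands on the tolerant party: the function absorbs the variation so that everything downstream sees only one well-behaved form.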


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Patterns, Software Development

September 19, 2015 11:56 AM

Software Gets Easier to Consume Faster Than It Gets Easier to Make

In What Is the Business of Literature?, Richard Nash tells a story about how the ideas underlying writing, books, and publishing have evolved over the centuries, shaped by the desires of both creators and merchants. One of the key points is that technological innovation has generally had a far greater effect on the ability to consume literature than on the ability to create it.

But books are just one example of this phenomenon. It is, in fact, a pattern:

For the most part, however, the technical and business-model innovations in literature were one-sided, far better at supplying the means to read a book than to write one. ...

... This was by no means unique to books. The world has also become better at allowing people to buy a desk than to make a desk. In fact, from medieval to modern times, it has become easier to buy food than to make it; to buy clothes than to make them; to obtain legal advice than to know the law; to receive medical care than to actually stitch a wound.

One of the neat things about the last twenty years has been the relatively rapid increase in the ability for ordinary people to write and disseminate creative works. But an imbalance remains.

Over a shorter time scale, this one-sidedness has been true of software as well. The fifty or sixty years of the Software Era have given us seismic changes in the availability, ubiquity, and backgrounding of software. People often overuse the word 'revolution', but these changes really have had an immense effect in how and when almost everyone uses software in their lives.

Yet creating software remains relatively difficult. The evolution of our tools for writing programs hasn't kept pace with the evolution in platforms for using them. Neither has the growth in our knowledge of how to make great software.

There is, of course, a movement these days to teach more people how to program and to support other people who want to learn on their own. I think it's wonderful to open doors so that more people have the opportunity to make things. I'm curious to see if the current momentum bears fruit or is merely a fad in a world that goes through fashions faster than we can comprehend them. It's easier still to toss out a fashion that turns out to require a fair bit of work.

Writing software is still a challenge. Our technologies have not changed that fact. But this is also true, as Nash reminds us, of writing books, making furniture, and a host of other creative activities. He also reminds us that there is hope:

What we see again and again in our society is that people do not need to be encouraged to create, only that businesses want methods by which they can minimize the risk of investing in the creation.

The urge to make things is there. Give people the resources they need -- tools, knowledge, and, most of all, time -- and they will create. Maybe one of the new programmers can help us make better tools for making software, or lead us to new knowledge.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Patterns, Software Development

September 11, 2015 3:55 PM

Search, Abstractions, and Big Epistemological Questions

Andy Soltis is an American grandmaster who writes a monthly column for Chess Life called "Chess to Enjoy". He has also written several good books, both recreational and educational. In his August 2015 column, Soltis talks about a couple of odd ways in which computers interact with humans in the chess world, ways that raise bigger questions about teaching and the nature of knowledge.

As most people know, computer programs -- even commodity programs one can buy at the store -- now play chess better than the best human players. Less than twenty years ago, Deep Blue first defeated world champion Garry Kasparov in a single game. A year later, Deep Blue defeated Kasparov in a closely contested six-game match. By 2005, computers were crushing Top Ten players with regularity. These days, world champion Magnus Carlsen is no match for his chess computer.

a position in which humans see the win, but computers don't

Yet there are still moments where humans shine through. Soltis opens with a story in which two GMs were playing a game the computers thought Black was winning, when suddenly Black resigned. Surprised journalists asked the winner, GM Vassily Ivanchuk, what had happened. It was easy, he said: it only looked like Black was winning. Well beyond the computers' search limits, it was White that had a textbook win.

How could the human players see this? Were they searching deeper than the computers? No. They understood the position at a higher level, using abstractions such as "being in the square" of a passed pawn and split passed pawns that divide a King like "pants". (We chessplayers are an odd lot.)

When you can define 'flexibility' in 12 bits,
it will go into the program.

Attempts to program computers to play chess using such abstract ideas did not work all that well. Concepts like king safety and piece activity proved difficult to implement in code, but eventually found their way into the programs. More abstract concepts like "flexibility", "initiative", and "harmony" have proven all but impossible to implement. Chess programs got better -- quickly -- when two things happened: (1) programmers began to focus on search, implementing metrics that could be applied rapidly to millions of positions, and (2) computer chips got much, much faster.

Pawn Structure Chess, by Andy Soltis

The result is that chess programs can beat us by seeing farther down the tree of possibilities than we do. They make moves that surprise us, puzzle us, and even offend our sense of beauty: "Fischer or Tal would have played this move; it is much more elegant." But they win, easily -- except when they don't. Then we explain why, using ideas that express an understanding of the game that even the best chessplaying computers don't seem to have.

This points out one of the odd ways computers relate to us in the world of chess. Chess computers crush us all, including grandmasters, using moves we wouldn't make and many of us do not understand. But good chessplayers do understand why moves are good or bad, once they figure it out. As Soltis says:

And we can put the explanation in words. This is why chess teaching is changing in the computer age. A good coach has to be a good translator. His students can get their machine to tell them the best move in any position, but they need words to make sense of it.

Teaching computer science at the university is affected by a similar phenomenon. My students can find on the web code samples to solve any problem they have, but they don't always understand them. This problem existed in the age of the book, too, but the web makes available so much material, often undifferentiated and unexplained, so, so quickly.

The inverse of computers making good moves we don't understand brings with it another oddity, one that plays to a different side of our egos. When a chess computer loses -- gasp! -- or fails to understand why a human-selected move is better than the moves it recommends, we explain it using words that make sense of the human move. These are, of course, the same words and concepts that fail us most of the time when we are looking for a move to beat the infernal machine. Confirmation bias lives on.

Soltis doesn't stop here, though. He realizes that this strange split raises a deeper question:

Maybe it's one that only philosophers care about, but I'll ask it anyway:

Are concepts like "flexibility" real? Or are they just artificial constructs, created by and suitable only for feeble, carbon-based minds?

(Philosophers are not the only ones who care. I do. But then, the epistemology course I took in grad school remains one of my two favorite courses ever. The second was cognitive psychology.)

Aristotle

We can implement some of our ideas about chess in programs, and those ideas have helped us create machines we can no longer defeat over the board. But maybe some of our concepts are simply fictions, "just so" stories we tell ourselves when we feel the need to understand something we really don't. I don't think so, but the pragmatist in me keeps pushing for better evidence.

Back when I did research in artificial intelligence, I always chafed at the idea of neural networks. They seemed to be a fine model of how our brains worked at the lowest level, but the results they gave did not satisfy me. I couldn't ask them "why?" and receive an answer at the conceptual level at which we humans seem to live. I could not have a conversation with them in words that helped me understand their solutions, or their failures.

Now we live in a world of "deep learning", in which Google Translate can do a dandy job of translating a foreign phrase for me but never tell me why it is right, or explain the subtleties of choosing one word instead of another. Add more data, and it translates even better. But I still want the sort of explanation that Ivanchuk gave about his win or the sort of story Soltis can tell about why a computer program only drew a game because it saddled itself with inflexible pawn structure.

Perhaps we have reached the limits of my rationality. More likely, though, is that we will keep pushing forward, bringing more human concepts and abstractions within the bounds of what programs can represent, do, and say. Researchers like Douglas Hofstadter continue the search, and I'm glad. There are still plenty of important questions to ask about the nature of knowledge, and computer science is right in the middle of asking and answering them.

~~~~

IMAGE 1. The critical position in Ivanchuk-Jobava, Wijk aan Zee 2015, the game to which Soltis refers in his story. Source: Chess Life, August 2015, Page 17.

IMAGE 2. The cover of Andy Soltis's classic Pawn Structure Chess. Source: the book's page at Amazon.com.

IMAGE 3. A bust of Aristotle, who confronted Plato's ideas about the nature of ideals. Source: Classical Wisdom Weekly.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

August 31, 2015 4:13 PM

Catch-22: Faculty and University Administration

I agree with Timothy Burke that the evolution of university administration is shaped in part by the unintended consequences of faculty behavior:

I think some of my colleagues across the country are potentially contributing to the creation of the distanced, professionalized, managerial administrations that they say that they despise, and they're doing it in part through half-voiced expectations about what an ideal administrator might be like.

This passage comes from Performing the Role, in which Burke discusses some of the fall-out from a botched faculty hiring at the University of Illinois last year. Even if you don't know much about the Salaita case, you may find Burke's piece worth reading. It captures pretty well how universities seem to be shifting toward a professionalized administrative class and the ways in which this shift clashes -- and meshes -- with faculty expectations and behavior.

This line, in particular, sums up a surprising amount of my experience as a department head for the last decade:

I think we somehow expect that administrative leaders should be unfailingly polite, deferential, patient, and solicitous when we're the ones talking with them and bold, confrontational, and aggressive when they're talking to anyone else.

The next one has affected me less directly, but I see it in the expectations across campus all the time:

We seem to expect administrative leaders to escape structural traps that we cannot imagine a way to escape from.

Burke ends the paragraph containing those sentences with a summary that many administrators can appreciate: "There's a lot of Catch-22 going on here."

Burke is always thoughtful, and thought-provoking, on matters of academia and culture. If those topics interest, his blog is often worth reading.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

August 25, 2015 1:57 PM

The Art of Not Reading

The beginning of a new semester brings with it a crush of new things to read, write, and do, which means it's a good time to remember this advice from Arthur Schopenhauer:

Hence, in regard to our subject, the art of not reading is highly important. This consists in not taking a book into one's hand merely because it is interesting the great public at the time -- such as political or religious pamphlets, novels, poetry, and the like, which make a noise and reach perhaps several editions in their first and last years of existence. Remember rather that the man who writes for fools always finds a large public: and only read for a limited and definite time exclusively the works of great minds, those who surpass other men of all times and countries, and whom the voice of fame points to as such. These alone really educate and instruct.

"The man who writes for fools always finds a large public." You do not have to be part of it. Time is limited. Read something that matters.

The good news for me is that there is a lot of writing about compilers by great minds. This is, of course, also the bad news. Part of my job is to help my students navigate the preponderance of worthwhile readings.

Reading in my role as department head is an altogether different matter...

~~~~

The passage above is from On Books and Reading, which is available via Project Gutenberg, a wonderful source of many great works.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

August 19, 2015 4:07 PM

Working Too Much Means Never Having to Say "No"

Among the reasons David Heinemeier Hansson gives in his advice to Fire the Workaholics is that working too much is a sign of bad judgment:

If all you do is work, your value judgements are unlikely to be sound. Making good calls on "is it worth it?" is absolutely critical to great work. Missing out on life in general to put more hours in at the office screams "misguided values".

I agree, in two ways. First, as DHH says, working too much is itself a general indicator that your judgment is out of whack. Second is the more specific case:

For workaholics, doing more work always looks like a reasonable option. As a result, when you are trying to decide, "Should I make this or not?", you never have to choose not to make the thing in question -- even when not making it is the right thing to do. That sort of indifferent decision making can be death in any creative endeavor.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

August 06, 2015 10:22 AM

Not So Different

Trevor Blackwell on The Lessons of Viaweb:

[Scott Kirsner]: What was the biggest challenge you faced with Viaweb?

[Trevor Blackwell]: Focusing every day on the few things that mattered and not getting distracted by the hundreds of things that didn't.

Maybe the life of a department head isn't all that different from the life of an entrepreneur after all. Well, except for the $49 million.


Posted by Eugene Wallingford | Permalink | Categories: General

July 30, 2015 2:45 PM

The Word Came First

James Somers's article You're Probably Using the Wrong Dictionary describes well how a good dictionary can change your life. In comparing a definition from Webster's 1913 Revised Unabridged Dictionary with a definition from the New Oxford Dictionary, which he offers as an exemplar of the pedestrian dictionaries we use today, he reminds us that words are elusive and their definitions only approximations:

Notice, too, how much less certain the Webster definition seems about itself, even though it's more complete -- as if to remind you that the word came first, that the word isn't defined by its definition here, in this humble dictionary, that definitions grasp, tentatively, at words, but that what words really are is this haze and halo of associations and evocations, a little networked cloud of uses and contexts.

Such poetry is not wasted on words; it is not, to use his own example from the essay, fustian. Words deserve this beauty, and a good dictionary.

There is also a more general reminder just beneath the surface here. In so many ways, more knowledge makes us less certain, not more, and more circumspect, not less. It is hard to make sharp distinctions within a complex web of ideas when you know a little about the web.

I strongly second Somers's recommendation of John McPhee's work, which I blogged about indirectly a few years ago. I also strongly second his recommendation of Webster's 1913 Revised Unabridged Dictionary. I learned about it from another blog article years ago and have been using it ever since. It's one of the first things I install whenever I set up a new computer.


Posted by Eugene Wallingford | Permalink | Categories: General

July 26, 2015 10:03 AM

A Couple of Passages on Disintermediation

"Disintermediation" is just a fancy word for getting other people out of the space between the people who create things and the people who read or listen to those things.

1. In What If Authors Were Paid Every Time Someone Turned a Page?, Peter Wayner writes:

One latter-day Medici posted a review of my (short) book on Amazon complaining that even 99 cents was too expensive for what was just a "blog post". I've often wondered if he was writing that comment in a Starbucks, sipping a $6 cup of coffee that took two minutes to prepare.

Even in the flatter world of ebooks, Amazon has the power to shape the interactions of creators and consumers and to influence strongly who makes money and what kind of books we read.

2. Late last year, Steve Albini spoke on the surprisingly sturdy state of the music industry:

So there's no reason to insist that other obsolete bureaux and offices of the lapsed era be brought along into the new one. The music industry has shrunk. In shrinking it has rung out the middle, leaving the bands and the audiences to work out their relationship from the ends. I see this as both healthy and exciting. If we've learned anything over the past 30 years it's that left to its own devices bands and their audiences can get along fine: the bands can figure out how to get their music out in front of an audience and the audience will figure out how to reward them.

Most of the authors and bands who aren't making a lot of money these days weren't making a lot of money -- or any money at all -- in the old days, either. They had few effective ways to distribute their writings or their music.

Yes, there are still people in between bands and their fans, and writers and their readers, but Albini reminds us how much things have improved for creators and audiences alike. I especially like his takedown of the common lament, "We need to figure out how to make this work for everyone." That sentence has always struck me as the reactionary sentiment of middlemen who no longer control the space between creators and audiences and thus no longer get their cut of the transaction.

I still think often about what this means for universities. We need to figure out how to make this internet thing work for everyone...


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 24, 2015 2:07 PM

Sentences of the Day

Three sentences stood out from the pages of my morning reading. The first two form an interesting dual around power and responsibility.

The Power to Name Things

Among the many privileges of the center, for example, is the power to name things, one of the greatest powers of all.

Costica Bradatan writes this in Change Comes From the Margins, a piece on social change. We programmers know quite well the power of good names, and thus the privilege we have in being able to create them and the responsibility we have to do that well.

The Avoidance of Power as Irresponsibility

Everyone's sure that speech acts and cultural work have power but no one wants to use power in a sustained way to create and make, because to have power persistently, in even a small measure, is to surrender the ability to shine a virtuous light on one's own perfected exclusion from power.

This sentence comes from the heart of Timothy Burke's All Grasshoppers, No Ants, his piece on one of the conditions he thinks ails our society as a whole. Burke's essay is almost an elaboration of Teddy Roosevelt's well-known dismissal of critics, but with an insightful expression of how and why rootless critics damage society as a whole.

Our Impotence in the Face of Depression

Our theories about mental health are often little better than Phlogiston and Ether for the mind.

Quinn Norton gives us this sentence in Descent, a personally-revealing piece about her ongoing struggle with depression. Like many of you, I have watched friends and loved ones fight this battle, which demonstrates all too readily the huge personal costs of civilization's being in such an early stage of understanding this disease, its causes, and its effective treatment.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 21, 2015 3:02 PM

'Send' Is The Universal Verb

In the mid-1980s, Ray Ozzie left IBM with the idea of creating an all-in-one software platform for business collaboration, based on his experience using the group messaging system in the seminal computer-assisted instruction system PLATO. Ozzie's idea eventually became Lotus Notes. This platform lives on today in an IBM product, but it never had the effect that Ozzie envisioned for it.

In Office, Messaging, and Verbs, Benedict Evans tells us that Ozzie's idea is alive and well and finally taking over the world -- in the form of Facebook:

But today, Facebook's platform on the desktop is pretty much Ray Ozzie's vision built all over again but for consumers instead of enterprise and for cat pictures instead of sales forecasts -- a combination of messaging with embedded applications and many different data types and views for different tasks.

"Office, Messaging, and Verbs" is an engaging essay about how collaborative work and the tools we use to do it co-evolve, changing each other in turn. You need a keyboard to do the task at hand... But is the task at hand your job, or is it merely the way you do your job today? The answer depends on where you are on the arc of evolution.

Alas, most days I need to create or consume a spreadsheet or two. Spreadsheets are not my job, but they are the way people in universities and most other corporate entities do too many of their jobs these days. So, like Jack Lemmon in The Apartment, I compute my cell's function and pass it along to the next person in line.

I'm ready for us to evolve further down the curve.

~~~~

Note: I added the Oxford comma to Evans's original title. I never apologize for inserting an Oxford comma.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 20, 2015 2:59 PM

Rethinking Accounting Software and Interfaces in the 1980s

In Magic Ink: Information Software and the Graphical Interface, Bret Victor reminds us that the dominant style of user interface today was created long before today's computers:

First, our current UI paradigm was invented in a different technological era. The initial Macintosh, for example, had no network, no mass storage, and little inter-program communication. Thus, it knew little of its environment beyond the date and time, and memory was too precious to record significant history. Interaction was all it had, so that's what its designers used. And because the computer didn't have much to inform anyone of, most of the software at the time was manipulation software -- magic versions of the typewriter, easel, and ledger-book. Twenty years and an internet explosion later, software has much more to say, but an inadequate language with which to say it.

William McCarthy, creator of the REA model of accounting

Victor's mention of the accounting ledger brings to mind the work being done since the early 1980s by Bill McCarthy, an accounting professor at Michigan State. McCarthy is motivated by a similar set of circumstances. The techniques by which we do financial accounting were created long before computers came along, and the constraints that made them necessary no longer exist. But he is looking deeper than simply the interaction style of accounting software; he is interested in upending the underlying model of accounting data.

McCarthy proposed the resources, events, agents (REA) model -- essentially an application of database theory from CS -- as an alternative to traditional accounting systems. REA takes advantage of databases and other computing ideas to create a more accurate model of a business and its activity. It eliminates many of the artifacts of double-entry bookkeeping, including debits, credits, and placeholder accounts such as accounts receivable and payable, because they can be generated in real time from more fine-grained source data. An REA model of a business enables a much wider range of decision support than the traditional accounting model while still allowing the firm to produce all the artifacts of traditional accounting as a side effect.
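The core idea can be sketched in a few lines of code. This is a hypothetical, minimal illustration in Python -- the names and structure are mine, not McCarthy's notation: record only fine-grained economic events, then derive a traditional artifact such as an accounts-receivable balance on demand instead of maintaining it as a separate account.

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str      # e.g., "sale" or "cash_receipt"
    agent: str     # the customer involved in the exchange
    amount: float  # value of the resource exchanged

def receivable(events, customer):
    """Derive the customer's open balance from raw events,
    rather than posting to a placeholder A/R account."""
    sales = sum(e.amount for e in events
                if e.kind == "sale" and e.agent == customer)
    receipts = sum(e.amount for e in events
                   if e.kind == "cash_receipt" and e.agent == customer)
    return sales - receipts

events = [
    Event("sale", "acme", 1000.0),
    Event("cash_receipt", "acme", 400.0),
    Event("sale", "zenith", 250.0),
]
print(receivable(events, "acme"))  # → 600.0
```

The point of the sketch is only the shape of the design: the event log is the single source of truth, and debits, credits, and account balances become queries over it.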

(I had the good fortune to work with McCarthy during my graduate studies and even helped author a conference paper on the development of expert systems from REA models. He also served on my dissertation committee.)

In the early years, many academic accountants reacted with skepticism to the idea of REA. They feared losing the integrity of the traditional accounting model, which carried a concomitant risk to the trust placed by the public in audited financial statements. Most of these concerns were operational, not theoretical. However, a few people viewed REA as somehow dissing the system that had served the profession so well for so long.

Victor includes a footnote in Magic Ink that anticipates a similar concern from interaction designers to his proposals:

Make no mistake, I revere GUI pioneers such as Alan Kay and Bill Atkinson, but they were inventing rules for a different game. Today, their windows and menus are like buggy whips on a car. (Although Alan Kay clearly foresaw today's technological environment, even in the mid-'70s. See "A Simple Vision of the Future" in his fascinating Early History of Smalltalk (1993).)

"They were inventing rules for a different game." This sentence echoes how I have always felt about Luca Pacioli, the inventor of double-entry bookkeeping. It was a remarkable technology that helped to enable the growth of modern commerce by creating a transparent system of accounting that could be trusted by insiders and outsiders alike. But he was inventing rules for a different game -- 500 years ago. Half a century dwarfs the forty or fifty year life of windows, icons, menus, and pointing and clicking.

I sometimes wonder what might have happened if I had pursued McCarthy's line of work more deeply. It dovetails quite nicely with software patterns and would have been well-positioned for the more recent re-thinking of financial support software in the era of ubiquitous mobile computing. So many interesting paths...


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 07, 2015 2:09 PM

Echoes: Symbols and Stories

Kevin Lawler, in Invention:

Steve Jobs gets credit for a lot of things he didn't do. Jobs himself said it best: "People like symbols, so I'm the symbol of certain things." Sometimes that means using Jobs as a stand-in for the many designers who work at Apple. Jobs usually makes for a good story. We like narratives, and we can build several entertaining ones around Jobs. Telling stories lets us gloss over other people by attributing their work to one person.

Peter Sheridan Dodds, in Homo Narrativus and the Trouble with Fame:

These two traits -- our compulsion to tell stories and our bias towards the individual -- conspire to ruin our intuitive understanding of fame.

The symbols we create and the stories we tell intertwine. Knowing that we are biased cognitively to treat them in a particular way puts us in a better position to overcome the bias.


Posted by Eugene Wallingford | Permalink | Categories: General

July 06, 2015 4:48 PM

Echoes: Aligning Expectations with Reality

Adam Bosworth, in Say No:

Have you ever been busy for an entire week and felt like you got nothing meaningful done? Either your expectations are off or your priorities are.

Brent Simmons, in Love:

If there's a way out of despair, it's in changing our expectations.

Good advice from two people who have been in the trenches for a while.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

July 03, 2015 12:50 PM

Good Music Uses Your Whole Mind

A couple of months back, someone posted a link to an interview with guitarist Steve Vai, to share its great story about how Vai came to work with Frank Zappa. I liked the entire piece, including the first paragraph, which sets the scene on how Vai got into music in the first place:

Steve Vai: I recall when I was very, very young I was always tremendously excited whenever I was listening to the radio or records. Even back then a peculiar thing happened that still happens to me today. When I listen to music I can't focus on anything else. When there's wallpaper music on the radio it's not a problem but if a good song comes on it's difficult for me to carry on a conversation or multitask. It's always odd to me when I'm listening to something or playing something for somebody and they're having a discussion in the middle of a piece of music [laughs].

I have this pattern. When a song connects with me, I want to listen; not read or talk, simply listen. And, yes, sometimes it's "just" a pop song. For a while, whenever "Shut Up and Dance" by Walk the Moon came on the radio, it had my full attention. Ah, who am I kidding? It still has that effect on me.

Also, I love Vai's phrase "wallpaper music". I often work with music on in the background, and some music I like knows how to stay there. For me, that's a useful role for songs to play. Working in an environment with some ambient noise is much better for me than working in complete silence, and music makes better ambient noise for me than life in a Starbucks.

When I was growing up, I noticed that occasionally a good song would come on the air, and my level of concentration managed to hold it at bay. When I realized that I had missed the song, I was disappointed. Invariably in those cases, I had been solving a math problem or writing a computer program. That must have been a little bit like the way Vai felt about music: I wanted to know how to do that, so I put my mind into figuring out how. I was lucky to find a career in which I can do that most of the time.

Oh, and by the way, Steve Vai can really play.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

June 22, 2015 3:27 PM

Strategy Under Time Constraints

an old analog chess clock

In Proving Too Much, Scott Alexander writes this about a rhetorical strategy that most people disapprove of:

Because here is a fundamental principle of the Dark Arts -- you don't need an argument that can't be disproven, only an argument that can't be disproven in the amount of time your opponent has available.

This is dark art in the world of ideas, where truth is more important than winning an argument. But it is a valuable strategy in games like chess, which are often played under time constraint. In competition, winning sometimes matters more than beauty or truth.

Suppose that my opponent has only a few minutes or seconds left on the clock. Suppose also that it's my move and that I have two possible moves to make. One is objectively better, in that it leads to the better expected outcome for me in theory, but it is easy for my opponent to find good responses. The other move is weaker, perhaps even allowing my opponent to get an advantage over me, but it would be hard for her to find the right path in the time available.

In this case, I may actually want to play the weaker move, because it maximizes my chance of winning in the circumstances of the game. My opponent has to use extra time to untangle the complexity of the position, and even if she finds the right move, there may not be enough time left to execute the plan. This approach is more volatile for me than playing the safer move, as it increases my risk of losing at the same time that it increases my chances of prevailing. But on balance, I am better off.
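The trade-off can be made concrete with a toy expected-value calculation. All of the numbers below are invented for illustration; the point is only that weighting each move's outcome by the chance the opponent finds the refutation in time can favor the objectively weaker move.

```python
def win_chance(p_found, value_if_found, value_if_missed):
    """Expected winning chances: mix the two outcomes by the
    probability the opponent finds the best reply in time."""
    return p_found * value_if_found + (1 - p_found) * value_if_missed

# The "objectively better" move: easy to answer, so the opponent
# almost always finds the right reply; I keep a small edge either way.
safe = win_chance(0.95, value_if_found=0.55, value_if_missed=0.60)

# The "weaker" move: dubious if refuted, but with seconds on the
# clock the opponent rarely untangles the complications.
tricky = win_chance(0.20, value_if_found=0.40, value_if_missed=0.85)

print(safe, tricky)  # the tricky move wins more often, despite being worse
```

With these made-up numbers, the safe move yields winning chances around 0.55 while the tricky one yields about 0.76 -- higher variance, but a better bet against a short clock.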

This may seem like a crazy strategy, but anyone who has played a lot of speed chess knows its value. Long-time world champion Emanuel Lasker was reputed to have employed a similar strategy, sometimes playing the move that would most unsettle the particular opponent he was playing that day, rather than the absolute best move. (Wikipedia says, though, that this reputation may have been undeserved.)

There are chessplayers who would object to this strategy as much as people object to its use in argumentation. There is truth in chess, too, and most chessplayers deeply appreciate making beautiful moves and playing beautiful games. Some grandmasters have sought beautiful combinations to their own detriment. For example, Mikhail Tal may have been able to retain or regain his world title if not for a propensity to seek complication in search of beauty. He gave us many brilliancies as a result, but he also lost just often enough to keep him on the fringes of the world championship.

Much of the time, though, we chessplayers are trying to win the game, and practicing the dark arts is occasionally the best way to do so. That may mean making a move that confounds the opponent just long enough to win the game.


Posted by Eugene Wallingford | Permalink | Categories: General

June 09, 2015 2:48 PM

I'm Behind on Blogging About My Courses...

... so much so, that I may never catch up. The last year and a half have been crazy, and I simply have not set aside enough time to blog. A big part of the time crunch was teaching three heavy preps in 2014: algorithms, agile software development, and our intro course. It is fitting, then, that blogging about my courses has suffered most of all -- even though, in the moment, I often have plenty to say. Offhand, I can think of several posts for which I once had big plans and for which I still have drafts or outlines sitting in my ideas/ folder:

  • readers' thoughts on teaching algorithms in 2014, along with changes I made to my course. Short version: The old canon still covers most of the important bases.
  • reflections on teaching agile dev again after four years. Short version: The best learning still happens in the trenches working with the students, who occasionally perplex me and often amaze me.
  • reflections on teaching Python in the intro course for the first time. Short version: On balance, there are many positives, but wow, there is a lot of language there, and way too many resources.
  • a lament on teaching programming languages principles when the students don't seem to connect with the material. Surprise ending: Some students enjoyed the course more than I realized.

Thoughts on teaching Python stand out as especially trenchant even many months later. The intro course is so important, because it creates habits and mindsets in students that often long outlive the course. Teaching a large, powerful, popular programming language to beginners in the era of Google, Bing, and DuckDuckGo is a Sisyphean task. No matter how we try to guide the students' introduction to language features, the Almighty Search Engine sits ever at the ready, delivering size and complexity when they really need simple answers. Maybe we need language levels à la the HtDP folks.

Alas, my backlog is so deep that I doubt I will ever have time to cover much of it. Life goes on, and new ideas pop up every day. Perhaps I can make time for the posts outlined above.

Right now, my excitement comes from the prospect of teaching my compilers course again for the first time in two years. The standard material still provides a solid foundation for students who are heading off into the world of software development. But in the time since I last taught the course, some neat things have happened in the compiler world that will make the course better, if only by putting the old stuff into a more modern context. Consider announcements just this week about Swift, in particular that the source code is being open-sourced and the run-time ported to Linux. The moment these two things happen, the language instantly becomes of greater interest to more of my students. Its openness also makes it more suitable as content for a university course.

So, there will be plenty to blog about, even if I leave my backlog untouched. That's a good thing.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

June 07, 2015 9:26 AM

Agile Moments, Ernest Hemingway Edition

I couldn't help thinking of big visible charts when I read this paragraph in The Paris Review's interview with Ernest Hemingway:

[Hemingway] keeps track of his daily progress -- "so as not to kid myself" -- on a large chart made out of the side of a cardboard packing case and set up against the wall under the nose of a mounted gazelle head. The numbers on the chart showing the daily output of words differ from 450, 575, 462, 1250, back to 512, the higher figures on days [he] puts in extra work so he won't feel guilty spending the following day fishing on the Gulf Stream.

He uses the chart to keep himself honest. Even our greatest writers can delude themselves into thinking they are making enough progress when they aren't. All the more so for those of us who are still learning, whether how to run a marathon, how to write prose, or how to make software. When a group of people are working together, a chart can help the individuals maintain a common, and honest, understanding of how the team is doing.

Oh, and notice Hemingway's technology: the side of a cardboard packing case. No fancy dashboard for this writer who is known for his direct, unadorned style. If you think you need a digital dashboard with toggles, flashing lights, and subviews, you are doing it wrong. The point of the chart is to keep you honest, not give you another thing to do when you are not doing what you should be doing.

There is another lesson in this passage beyond the chart, about sustainable pace. Most of the numbers are in the ballpark of 500 (average: 499 3/4!), except for one day when he put in a double day. Perhaps 500 words a day is a pace that Hemingway finds productive over time. Yet he allows himself an occasional bit of overtime -- for something important, like time away from his writing desk, out on the water. Many of us programmers need to be reminded every so often that getting away from our work is valuable, and worth an occasional 0 on the big visible chart. It's also a more human motivation for overtime than the mad rush to a release date.

A few pages later in the interview, we read Hemingway repeating a common adage among writers that also echoes nicely against the agile practices:

You read what you have written and, as you always stop when you know what is going to happen next, you go on from there.

Hemingway stops each day at a point where the story will pull him forward the next morning. In this, XP devotees can recognize the habit of ending each day with a broken test. In the morning, or whenever we next fire up our editors, the broken test tells us exactly where to begin and gives us a concrete goal. By the time the test passes, our minds are ready to move on to something new.

Agility is useful when fighting bulls. Apparently, it helps when writing novels, too.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

June 04, 2015 2:33 PM

If the Web is the Medium, What is the Message?

How's this for a first draft:

History may only be a list of surprises, but you sure as heck don't want to lose the list.

That's part of the message in Bret Victor's second 'Web of Alexandria' post. He puts it in starker terms:

To forget the past is to destroy the future. This is where Dark Ages come from.

Those two posts followed a sobering observation:

60% of my fav links from 10 yrs ago are 404. I wonder if Library of Congress expects 60% of their collection to go up in smoke every decade.

But it's worse than that, Victor tells us in his follow-up. As his tweet notes, the web has turned out to be unreliable as a publication medium. We publish items because we want them to persist in the public record, but often they don't persist for very long. However, the web has turned out to be a pernicious conversational medium as well. We want certain items shared on the web to be ephemeral, yet often those items are the ones that last forever. At one time, this may have seemed like only an annoyance, but now we know it to be dangerous.

The problem isn't that the web is a bad medium. In one sense, the web isn't really a medium at all; it's an infrastructure that enables us to create new kinds of media with historically uncharacteristic ease. The problem is that we are using web-based media for many different purposes, without understanding how each medium determines "the social and temporal scope of its messages".

The same day I read Victor's blog post, I saw this old Vonnegut quote fly by on Twitter:

History is merely a list of surprises. ... It can only prepare us to be surprised yet again.

Alas, on the web, history appears to be a list of cat pictures and Tumblr memes, with all the important surprises deleted when the author changed internet service providers.

In a grand cosmic coincidence, on the same day I read Victor's blog post and saw the Vonnegut quote fly by, I also read a passage from Marshall McLuhan in a Farnam Street post. It ends:

The modern world abridges all historical times as readily as it reduces space. Everywhere and every age have become here and now. History has been abolished by our new media.

The internet certainly amplifies the scale of McLuhan's worry, but the web has created a unique form of erasure. I'm sure McLuhan would join Victor in etching an item on history's list of surprises:

Protect the past.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

June 02, 2015 1:46 PM

"I Just Need a Programmer", Screenplay Edition

Noted TV writer, director, producer, and blogger Ken Levine takes on a frequently-asked question in the latest edition of his "Friday Questions" feature:

I have a great idea for a movie, but I'm not a writer, I'm not in show biz, and I don't live in New York or LA. What do I do with this great idea? (And I'm sure you've never heard this question before, right?)

Levine is gentle in response:

This question does come up frequently. I wish I had a more optimistic answer. But the truth is execution is more valued than ideas. ...

Is there any domain where this isn't true? Yet professionals in every domain seem to receive this question all the time. I certainly receive the "I just need a programmer..." phone call or e-mail every month. If I went to cocktail parties, maybe I'd hear it at them, too.

The bigger the gap between idea and product, the more valuable execution becomes relative to the idea itself. For many app ideas, executing the idea is not all that far beyond the reach of an ordinary person. Learn a little Objective-C, and away you go. In three or four years, you'll be set! By comparison, writing a screenplay that anyone in Hollywood will look at (let alone turn into a blockbuster film) seems like Mount Everest.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

June 01, 2015 2:21 PM

I'd Like to Be Bored for a While

When asked if he has ever been bored, Italo Calvino responded:

Yes, in my childhood. But it must be pointed out that childhood boredom is a special kind of boredom. It is a boredom full of dreams, a sort of projection into another place, into another reality. In adulthood boredom is made of repetition, it is the continuation of something from which we are no longer expecting any surprise. And I -- would that I had time to get bored today!

Children are better at boredom than adults are, because we let them be. We should let adults be good at boredom every once in a while, too.

(Passage from this Paris Review interview, which I quoted a couple of times several weeks ago.)


Posted by Eugene Wallingford | Permalink | Categories: General

May 26, 2015 2:59 PM

If You Want to Help Students, You May Want to Help Faculty

In The Missing Middle, Matt Reed recommends a way for donors to have a real effect at teaching universities: pay for conference travel.

I've mentioned before that the next philanthropist who wants to make a massive difference in the performance of teaching-intensive public colleges -- whether community colleges or the smaller four-years -- could do it by underwriting conference travel. Right now, most colleges are lucky to send one or two people to most conferences. When an entire team attends the same presentation, it's much easier to get what chemists call activation energy. I've seen it personally.

This echoes a conversation the department heads in my college had earlier this month. Ongoing professional development is important for faculty, both in their research and their teaching. Faculty who are struggling in the classroom need more help than others, but even good teachers need to have their batteries charged every once in a while.

There tends to be more support for faculty development in their research than in their teaching, even at so-called teaching universities. Even so, professional development in research is often a natural side effect of external funding for research, and faculty at these universities don't always conduct research at a scale that is competitive for external funding.

And faculty development in their teaching? There aren't many sources to support this other than the university budget itself. Given the current state of funding for public universities, which is likely the new normal, these funds are being squeezed out of the budget, if they were ever present at all.

Professors need to stay current in their profession, and many need to address weaknesses over the course of their careers. When universities neglect faculty development, the faculty suffer, and so do their students. Often, the best way to help students is to help the faculty.

All that said, I am not holding my breath that dollars will be coming in from donors any time soon. People love to help students directly, but indirect support for students and support for other parts of the university are notoriously hard sells.


Posted by Eugene Wallingford | Permalink | Categories: General

May 09, 2015 9:28 AM

A Few Thoughts on Graduation Day

Today is graduation day for the Class of 2015 at my university. CS students head out into the world, most with a job in hand or nearly so, ready to apply their hard-earned knowledge and skills to all variety of problems. It's an exciting time for them.

This week also brought two other events that have me thinking about the world in which my students will live and the ways in which we have prepared them. First, on Thursday, the Technology Association of Iowa organized a #TechTownHall on campus, where the discussion centered on creating and retaining a pool of educated people to participate in, and help grow, the local tech sector. I'm a little concerned that the TAI blog says that "A major topic was curriculum and preparing students to provide immediate value to technology employers upon graduation." That's not what universities do best. But then, that is often what employers want and need.

Second, over the last two mornings, I read James Fallows's classic The Case Against Credentialism, from the archives of The Atlantic. Fallows gives a detailed account of the "professionalization" of many lines of work in the US and the role that credentials, most prominently university degrees, have played in the movement. He concludes that our current approach is biased heavily toward evaluating the "inputs" to the system, such as early success in school and other demonstrations of talent while young, rather than assessing the outputs, namely, how well people actually perform after earning their credentials.

Two passages toward the end stood out for me. In one, Fallows wonders if our professionalized society creates the wrong kind of incentives for young people:

An entrepreneurial society is like a game of draw poker; you take a lot of chances, because you're rarely dealt a pat hand and you never know exactly what you have to beat. A professionalized society is more like blackjack, and getting a degree is like being dealt nineteen. You could try for more, but why?

Keep in mind that this article appeared in 1985. Entrepreneurship has taken a much bigger share of the public conversation since then, especially in the tech world. Still, most students graduating from college these days are likely thinking of ways to convert their nineteens into steady careers, not ways to risk it all on the next Amazon or Uber.

Then this quote from "Steven Ballmer, a twenty-nine-year-old vice-president of Microsoft", on how the company looked for new employees:

We go to colleges not so much because we give a damn about the credential but because it's hard to find other places where you have large concentrations of smart people and somebody will arrange the interviews for you. But we also have a lot of walk-on talent. We're looking for programming talent, and the degree is in no way, shape, or form very important. We ask them to send us a program they've written that they're proud of. One of our superstars here is a guy who literally walked in off the street. We talked him out of going to college and he's been here ever since.

Who would have guessed in 1985 the visibility and impact that Ballmer would have over the next twenty years? Microsoft has since evolved from the entrepreneurial upstart to the staid behemoth, and now is trying to reposition itself as an important player in the new world of start-ups and mobile technology.

Attentive readers of this blog may recall that I fantasize occasionally about throwing off the shackles of the modern university, which grow more restrictive every year as the university takes on more of the attributes of corporate and government bureaucracy. In one of my fantasies, I organize a new kind of preparatory school for prospective software developers, one with a more modern view of learning to program but also an attention to developing the whole person. That might not satisfy corporate America's need for credentials, but it may well prepare students better for a world that needs poker players as much as it needs blackjack players. But where would the students come from?

So, on a cloudy graduation day, I think about Fallows's suggestion that more focused vocational training is what many grads need, about the real value of a liberal university education to both students and society, and about how we can best prepare CS students to participate in the world. It is a world that needs not only their technical skills but also their understanding of what tech can and cannot do. As a society, we need them to take a prominent role in civic and political discourse.

One final note on the Fallows piece. It is a bit long, dragging a bit in the middle like a college research paper, but opens and closes strongly. With a little skimming through parts of less interest, it is worth a read. Thanks to Brian Marick for the recommendation.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

April 30, 2015 6:00 PM

Software is a Means of Communication, Just Like a Research Paper

I can't let my previous post be my only comment on Software in Scientific Research. Hinsen's bigger point is worth a post of its own.

Software is a means of communication, just like papers or textbooks.

... much like the math that appears in a paper or a textbook -- except that, done properly, a computer program runs and provides a dynamic demonstration of an idea.

The main questions asked about scientific software [qua software] are "What does it do?" and "How efficient is it?" When considering software as a means of communication, we would ask questions such as "Is it well-written, clear, elegant?", "How general is the formulation?", or "Can I use it as the basis for developing new science?".

This shift requires a different level of understanding of programs and programming than many scientists (and other people who do not program for a living) have. But it is a shift that needs to take place, so we should do all we can to help scientists and others become more fluent. (Hey to Software Carpentry and like-minded efforts.)

We take for granted that all researchers are responsible for being able to produce and, more importantly, understand the other essential parts of scientific communication:

We actually accept as normal that the scientific contents of software, i.e., the models implemented by it, are understandable only to software specialists, meaning that for the majority of users, the software is just a black box. Could you imagine this for a paper? "This paper is very obscure, but the people who wrote it are very smart, so let's trust them and base our research on their conclusions." Did you ever hear such a claim? Not me.

This is a big part of the challenge we face in getting faculty across the university to see the vital role that computing should play in modern education -- as well as the roles it should not play. The same is true in the broader culture. We'll see if efforts such as code.org can make a dent in this challenge.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

April 26, 2015 9:55 AM

Yesterday's Questions Can Have Different Answers Today

I wrote on Twitter Thursday [ 1 | 2 ] that I end up modifying my lecture notes every semester, no matter how well done they were the last time I taught the course. From one semester to the next, I find that I am more likely to change the introductions, transitions, and conclusions of a session than the body. The intros, transitions, and conclusions help to situate the material in a given place and time: the context of this semester and this set of students. The content, once refined, tends to stabilize, though occasionally I feel a need to present even it in a different way, to fit the current semester.

Novelist Italo Calvino knew this feeling as well, when he was preparing to be interviewed:

Rarely does an interviewer ask questions you did not expect. I have given a lot of interviews and I have concluded that the questions always look alike. I could always give the same answers. But I believe I have to change my answers because with each interview something has changed either inside myself or in the world. An answer that was right the first time may not be right again the second.

This echoes my experience preparing for lecture. The answer that was right the last time does not seem right again this time. Sometimes, I have changed. With any luck, I have learned new things since the last time I taught the course, and that makes for a better story. Sometimes, the world has changed: a new programming language such as Clojure or Scala has burst onto the scene, or a new trend in industry such as mobile app development has made a different set of issues relevant to the course. I need to tell a different story that acknowledges -- and takes advantage of -- these changes.

Something else always changes for a teacher, too: the students. It's certainly true the students in the class are different every time I teach a course. But sometimes, the group is so different from past groups that the old examples, stories, and answers just don't seem to work. Such has been the case for me this semester. I've had to work quite a bit to understand how my students think and incorporate that into my class sessions and homework assignments. This is part of the fun and challenge of being a teacher.

We have to be careful not to take this analogy too far. Teaching computer science is different from an author giving an interview about his or her life. For one thing, there is a more formal sense of objective truth in the content of, say, a programming language course. An object is still a closure; a closure is still an object that other code can interact with over time. These answers tend to stay the same over time. But even as a course communicates the same fundamental truths from semester to semester, the stories we need to tell about these truths will change.
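That duality is easy to see in code. Here is a minimal sketch in Python, my own illustration rather than anything from the course: a closure over local state can play the role of an object with methods.

```python
def make_counter(start=0):
    """Return a 'counter object' built from a closure over local state."""
    count = start

    def increment():
        nonlocal count
        count += 1
        return count

    def value():
        return count

    # A dict of functions stands in for an object's method table.
    return {"increment": increment, "value": value}

counter = make_counter()
counter["increment"]()
counter["increment"]()
print(counter["value"]())  # → 2
```

Swap the dict for a class with an instance variable and you have the object-oriented rendering of the same idea, which is exactly the equivalence the course tries to convey.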

Ever the fantastic writer, Calvino saw in his interview experience the shape of a new story, a meta-story of sorts:

This could be the basis of a book. I am given a list of questions, always the same; every chapter would contain the answers I would give at different times. ... The changes would then become the itinerary, the story that the protagonist lives. Perhaps in this way I could discover some truths about myself.

This is one of the things I like about teaching. I often discover truths about myself, and occasionally transform myself.

~~~~

The passages quoted above come from The Art of Fiction No. 130, Italo Calvino in The Paris Review. It's not the usual Paris Review interview, as Calvino died before the interviewer was done. Instead, it is a pastiche of four different sources. It's a great read nonetheless.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

April 09, 2015 3:26 PM

Two Factors for Succeeding at Research, or Investing

Think differently, of course. But be humble. These attitudes go hand-in-hand.

To make money in the markets, you have to think independently and be humble. You have to be an independent thinker because you can't make money agreeing with the consensus view, which is already embedded in the price. Yet whenever you're betting against the consensus there's a significant probability you're going to be wrong, so you have to be humble.

This applies equally well to doing research. You can't make substantial progress with the conventional wisdom, because it defines and limits the scope of the solution. So think differently. But when you leave the safety of conventional wisdom, you find yourself working in an immense space of ideas. There is a significant chance that you will be wrong a lot. So be humble.

(The quote is from Learn or Die: Using Science to Build a Leading-Edge Learning Organization by Ray Dalio.)


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

March 13, 2015 3:07 PM

Two Forms of Irrelevance

When companies become irrelevant to consumers.
From The Power of Marginal, by Paul Graham:

The big media companies shouldn't worry that people will post their copyrighted material on YouTube. They should worry that people will post their own stuff on YouTube, and audiences will watch that instead.

You mean Grey's Anatomy is still on the air? (Or, as today's teenagers say, "Grey's what?")

When people become irrelevant to intelligent machines.
From Outing A.I.: Beyond the Turing Test, by Benjamin Bratton:

I argue that we should abandon the conceit that a "true" Artificial Intelligence must care deeply about humanity -- us specifically -- as its focus and motivation. Perhaps what we really fear, even more than a Big Machine that wants to kill us, is one that sees us as irrelevant. Worse than being seen as an enemy is not being seen at all.

Our new computer overlords indeed. This calls for a different sort of preparation than studying lists of presidents and state capitals.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

March 10, 2015 4:45 PM

Learning to Program is a Loser's Game

After a long break from playing chess, I recently played a few games at the local club. Playing a couple of games in each of the last two weeks has reminded me that I am very rusty. I've made only two horrible blunders in four games, but I have made many small mistakes, the kind of errors that accumulate over time and make a position hard to defend, even untenable. Having played better in years past, these inaccuracies are irksome.

Still, I managed to win all four games. As I've watched games at the club, I've noticed that most games are won by the player who makes the second-to-last blunder. Most of the players are novices, and they trade mistakes: one player leaves his queen en prise; later, his opponent launches an underprepared attack that loses a rook; then the first player trades pieces and leaves himself with a terrible pawn structure -- and so on, the players trading weak or bad moves until the position is lost for one of them.

My secret thus far has been one part luck, one part simple strategy: winning by not losing.

This experience reminded me of a paper called The Loser's Game, which in 1975 suggested that it was no longer possible for a fund manager to beat market averages over time because most of the information needed to do well was available to everyone. To outperform the market average, a fund manager has to profit from mistakes made by other managers, sufficiently often and by a sufficient margin to sustain a long-term advantage. Charles Ellis, the author, contrasts this with the bull markets of the 1960s. Then, managers made profits based on the specific winning investments they made; in the future, though, the best a manager could hope for was not to make the mistakes that other investors would profit from. Fund management had transformed from being a Winner's Game to a Loser's Game.

the cover of Extraordinary Tennis for the Ordinary Tennis Player

Ellis drew his inspiration from another world, too. Simon Ramo had pointed out the differences between a Winner's Game and a Loser's Game in Extraordinary Tennis for the Ordinary Tennis Player. Professional tennis players, Ramo said, win based on the positive actions they take: unreturnable shots down the baseline, passing shots out of the reach of a player at the net, service aces, and so on. We duffers try to emulate our heroes and fail... We hit our deep shots just beyond the baseline, our passing shots just wide of the sideline, and our killer serves into the net. It turns out that mediocre players win based on the errors they don't make. They keep the ball in play, and eventually their opponents make a mistake and lose the point.

Ramo saw that tennis pros are playing a Winner's Game, and average players are playing a Loser's Game. These are fundamentally different games, which reward different mindsets and different strategies. Ellis saw the same thing in the investing world, but as part of a structural shift: what had once been a Winner's Game was now a Loser's Game, to the consternation of fund managers whose mindset is finding the stocks that will earn them big returns. The safer play now, Ellis says, is to minimize mistakes. (This is good news for us amateur investors!)

This is the same phenomenon I've been seeing at the chess club recently. The novices there are still playing a Loser's Game, where the greatest reward comes to those who make the fewest and smallest mistakes. That's not very exciting, especially for someone who fancies herself to be Adolf Anderssen or Mikhail Tal in search of an immortal game. The best way to win is to stay alive, making moves that are as sound as possible, and wait for the swashbuckler across the board from you to lose the game.

What does this have to do with learning to program? I think that, in many respects, learning to program is a Loser's Game. Even a seemingly beginner-friendly programming language such as Python has an exacting syntax compared to what beginners are used to. The semantics seem foreign, even opaque. It is easy to make a small mistake that chokes the compiler, which then spews an error message that overwhelms the new programmer. The student struggles to fix the error, only to find another error waiting somewhere else in the code. Or he introduces a new error while eliminating the old one, which makes even debugging seem scary. Over time, this can dishearten even the heartiest beginner.

What is the best way to succeed? As in all Loser's Games, the key is to make fewer mistakes: follow examples closely, pay careful attention to syntactic details, and otherwise not stray too far from what you are reading about and using in class. Another path to success is to make the mistakes smaller and less intimidating: take small steps, test the code frequently, and grow solutions rather than write them all at once. It is no accident that the latter sounds like XP and other agile methods; they help to guard us from the Loser's Game and enable us to make better moves.
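The "small steps, grow the solution" advice can be made concrete. Here is a small Python sketch of my own (the function and the steps are illustrative, not from any course): each tiny increment is verified before the next is attempted, so mistakes stay small and easy to find.

```python
def word_count(text):
    """Count occurrences of each word, case-insensitively."""
    counts = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1
    return counts

# Step 1: make the empty case work before writing anything else.
assert word_count("") == {}

# Step 2: a single word.
assert word_count("spam") == {"spam": 1}

# Step 3: repetition and case-folding, added only after steps 1-2 passed.
assert word_count("Spam spam eggs") == {"spam": 2, "eggs": 1}

print("all steps pass")
```

A beginner who works this way never faces more than one small, fresh error at a time, which is precisely the Loser's Game strategy: keep the ball in play.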

Just as playing the Loser's Game in tennis or investing calls for a different mindset, so, too does learning to program. Some beginners seem to grok programming quickly and move on to designing and coding brilliantly, but most of us have to settle in for a period of discipline and growth. It may not be exciting to follow examples closely when we want to forge ahead quickly to big ideas, but the alternative is to take big shots and let the compiler win all the battles.

Unlike tennis and Ellis's view of stock investing, programming offers us hope: Nearly all of us can make the transition from the Loser's Game to the Winner's Game. We are not destined to forever play it safe. With practice and time, we can develop the discipline and skills necessary to making bold, winning moves. We just have to be patient and put time and energy into the process of becoming less mistake-prone. By adopting the mindset needed to succeed in a Loser's Game, we can eventually play the Winner's Game.

I'm not too sure about the phrases "Loser's Game" and "Winner's Game", but I think that this analogy can help novice programmers. I'm thinking of ways that I can use it to help my students survive until they can succeed.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

February 16, 2015 4:15 PM

An Example of Science in Action

Here is another interesting piece from The New Yorker, this time on an example of science in action. Jon Krakauer is the author of Into the Wild, about adventurer Chris McCandless. Eighteen months ago, he published a claim that McCandless had likely died as a result of eating the seeds of Hedysarum alpinum, known as wild potato. Krakauer's theory was based on lab analysis of seeds from the plant showing that it contained a particular toxic alkaloid. A critic of the claim, Tom Clausen, suggested that Krakauer's theory would be credible only after being subjected to more thorough testing and published in a peer-reviewed journal.

So that's what Krakauer did. He worked with the same lab to do more thorough testing and found that his toxic alkaloid theory didn't hold up after all. Instead, detailed analysis found that Hedysarum alpinum instead contains an amino acid that acts as an antimetabolite and for which toxicity in animals has been well documented. This work went through peer review and is being published next month in a scientific journal.

That's how science works. If a claim is challenged by other scientists, it is subjected to further tests. When those tests undermine the claim, it is withdrawn. Often, though, the same tests that undermine one hypothesis can point us in the direction of another and give us the information we need to construct a better theory.

A cautionary lesson from science also jumps out of this article, though. While searching the scientific literature as part of the re-analysis of Hedysarum alpinum, Krakauer found a paper that pointed him in the direction of toxic non-protein amino acids. He writes:

I had missed this article in my earlier searches because I had been looking for a toxic alkaloid instead of a toxic amino acid. Clausen had been looking for a toxic alkaloid as well, when he and Edward Treadwell reported, in a peer-reviewed paper published in the journal Ethnobotany Research & Applications, that "no chemical basis for toxicity could be found" in H. alpinum seeds.

Clausen's team had been looking specifically for alkaloids, but then concluded more generally that "no chemical basis for toxicity could be found". This claim is broader than their results can support. Only the narrower claim that they could find no chemical basis for alkaloid toxicity seems warranted by the evidence. That is probably the conclusion Clausen's team should have drawn. Our conclusions should be as narrow as possible, given the data.

Anyway, Krakauer has written a fascinating article, accessible even to a non-biologist like me. Check it out.


Posted by Eugene Wallingford | Permalink | Categories: General

February 06, 2015 3:11 PM

What It Feels Like To Do Research

In one sentence:

Unless you tackle a problem that's already solved, which is boring, or one whose solution is clear from the beginning, mostly you are stuck.

This is from Alec Wilkinson's The Pursuit of Beauty, about mathematician Yitang Zhang, who worked a decade on the problem of bounded gaps between prime numbers. As another researcher says in the article,

When you try to prove a theorem, you can almost be totally lost to knowing exactly where you want to go. Often, when you find your way, it happens in a moment, then you live to do it again.

Programmers get used to being stuck, but tackling the twin prime problem is on a different level altogether. The same is true for any deep open question in math or computing.

I strongly recommend Wilkinson's article. It describes what life for untenured mathematicians is like, and how a single researcher can manage to solve an important problem.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

February 05, 2015 3:57 PM

If You Want to Become a Better Writer...

... write for undergraduates. Why?

Last fall, Steven Pinker took a stab at explaining why academics stink at writing. He hypothesizes that cognitive science and human psychology explain much of the problem. Experts often find it difficult to imagine that others do not know what experts know, which Pinker calls the curse of knowledge. They work around the limitations of short-term memory by packaging ideas into bigger and more abstract units, often called chunking. Finally, they tend to think about the things they understand well in terms of how they use them, not in terms of what they look like, a transition called functional fixity.

Toward the end of the article, Pinker summarizes:

The curse of knowledge, in combination with chunking and functional fixity, helps make sense of the paradox that classic style is difficult to master. What could be so hard about pretending to open your eyes and hold up your end of a conversation? The reason it's harder than it sounds is that if you are enough of an expert in a topic to have something to say about it, you have probably come to think about it in abstract chunks and functional labels that are now second nature to you but are still unfamiliar to your readers--and you are the last one to realize it.

Most academics aren't trying to write bad prose. They simply don't have enough practice writing good prose.

When Calvin explained to Hobbes, "With a little practice, writing can be an intimidating and impenetrable fog," he got it backward. Fog comes easily to writers; it's the clarity that requires practice. The naïve realism and breezy conversation in classic style are deceptive, an artifice constructed through effort and skill.

Wanting to write better is not sufficient. Exorcising the curse requires writers to learn new skills and to practice. One of the best ways to see if the effort is paying off is to get feedback: show the work to real readers and see if they can follow it.

That's where undergraduates come in. If you want to become a better writer or a better speaker, teach undergraduates regularly. They are about as far removed as you can get from an expert while still having an interest in the topic and some inclination to learn more about it.

When I write lecture notes for my undergrads, I have to eliminate as much jargon as possible. I have to work hard to put topics into the best order for learners, not for colleagues who are also expert in the area. I have to find stories to illuminate ideas, and examples to illustrate them. When I miss the intended mark on any of these attempts, my students will let me know, either through their questions or through their inability to perform as I expected. And then I try again.

My lecture notes are far from perfect, but they are always much better after a few iterations teaching a course than they are the first time I do. The weakest parts tend to be for material I'm adding to the course for the first time; the best parts tend to be revisions of existing material. These facts are no surprise to any writer or presenter, of course. Repetition and effort are how we make things better.

Even if you do not consider yourself a teacher by trade, if you want to improve your ability to communicate science, teach undergrads. Write lecture notes and explanations. Present to live students and monitor lab sessions. The students will reward you with vigorous feedback. Besides, they are good people to get to know.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

January 31, 2015 11:51 AM

Failure with a Purpose

I am committed to being wrong bigger and more often in 2015. Yet I am mindful of Avdi Grimm's admonition:

... failure isn't always that informative. You can learn a thousand different ways to fail and never learn a single way to succeed.

To fail for failure's sake is foolish and wasteful. In writing, the awful stuff you write when you start isn't usually valuable in itself, but rather for what you learn from studying and practicing. In science, failing isn't usually valuable in itself, but rather for what you learn when you prove an idea wrong. The scientist's mindset has a built-in correction for dealing with failure: every surprising result prompts a new attempt to understand why and build a better model.

As Grimm says, be sure you know what purpose your failure will serve. Sometimes, taking bigger risks intellectually can help us get off a plateau in our thinking, or even a local maximum. The failure pays off when we pay attention to the outcome and find a better hill to climb.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

January 29, 2015 4:27 PM

A Reminder from Marcus Aurelius

... from Book 6 of The Meditations, courtesy of George Berridge:

Item 52 from Book 6 of The Meditations, by Marcus Aurelius

You are not compelled to form any opinion about this matter before you, nor to disturb your peace of mind at all. Things in themselves have no power to extort a verdict from you.

This seems especially sound advice in this era, full of devices that enable other people to bombard our minds with matters they find Very Important Indeed. Maintain your peace of mind until you encounter a thing that your own mind knows to be important.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

January 19, 2015 2:14 PM

Beginners, Experts, and Possibilities

Last Thursday, John Cook tweeted:

Contrary to the Zen proverb, there may be more possibilities in the expert's mind than in the beginner's.

This summed up nicely one of the themes in my Programming Languages course that afternoon. Some students come into the course knowing essentially only one language, say Python or Ada. Others come knowing several languages well, including their intro language, Java, C, and maybe a language they learned on the job, such as Javascript or Scala.

Which group do you think has a larger view of what a programming language can be? The more knowledgeable, to be sure. This is especially true when their experience includes languages from different styles: procedural, object-oriented, functional, and so on.

Previous knowledge affects expectations. Students coming directly out of their first year courses are more likely to imagine that all languages are similar to what they already know. Nothing in their experience contradicts that idea.

Does this mean that the Zen notion of beginner's mind is wrongheaded? Not at all. I think an important distinction can be made between analysis and synthesis. In a context where we analyze languages, broad experience is more valuable than lack of experience, because we are able to bring to our seeing a wider range of possibilities. That's certainly my experience working with students over the years.

However, in a context where we create languages, broad experience can be an impediment. When we have seen many different languages, it can be difficult to create something that looks much different from the languages we've already seen. Something in our minds seems to pull us toward an existing language that already solves the constraint we are struggling with. Someone else has already solved this problem; their solution is probably best.

This is also my experience working with students over the years. My freshmen will almost always come up with a fresher language design than my seniors. The freshmen don't know much about languages yet, and so their minds are relatively unconstrained. (Fantastically so, sometimes.) The seniors often seem to end up with something that is superficially new but, at its core, thoroughly predictable.

The value of "Zen mind, beginner's mind" also follows a bit from the distinction between expertise and experience. Experts typically reach a level where they solve problems using heuristics. These patterns and shortcuts are efficient, but they also tend to be "compiled" and not all that open to critical examination. We create best when we are able to modify, rearrange, and discard, and that's harder to do when our default mode of thinking is in pre-compiled units.

It should not bother us that useful adages and proverbs contradict one another. The world is complex. As Bokononists say, Busy, busy, busy.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

January 09, 2015 3:40 PM

Computer Science Everywhere, Military Edition

Military Operations Orders are programs that are executed by units. Code re-use and other software engineering principles applied regularly to these.

An alumnus of my department, a CS major-turned-military officer, wrote those lines in an e-mail responding to my recent post, A Little CS Would Help a Lot of College Grads. Contrary to what many people might imagine, he has found what he learned in computer science to be quite useful to him as an Army captain. And he wasn't even a programmer:

One of the biggest skills I had over my peers was organizing information. I wasn't writing code, but I was handling lots of data and designing systems for that data. Organizing information in a way that was easy to present to my superiors was a breeze and having all the supporting data easily accessible came naturally to me.

Skills and principles from software engineering and project development apply to systems other than software. They also provide a vocabulary for talking about ideas that non-programmers encounter every day:

I did introduce my units to the terms border cases, special cases, and layers of abstraction. I cracked a smile every time I heard those terms used in a meeting.

Excel may not be a "real programming language", but knowing the ways in which it is a language can make managers of people and resources more effective at what they do.

For more about how a CS background has been useful to this officer, check out CS Degree to Army Officer, a blog entry that expands on his experiences.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

January 01, 2015 11:29 AM

Being Wrong in 2015

Yesterday, I read three passages about being wrong. First, this from a blog entry about Charles Darwin's "fantastically wrong" idea for how natural selection works:

Being wildly wrong is perfectly healthy in science, because when someone comes along to prove that you're wrong, that's progress. Somewhat embarrassing progress for the person being corrected, sure, but progress nonetheless.

Then, P.G. Wodehouse shared in his Paris Review interview that it's not all Wooster and Jeeves:

... the trouble is when you start writing, you write awful stuff.

And finally, from a touching reflection on his novelist father, this delicious sentence by Colum McCann:

He didn't see this as a failure so much as an adventure in limitations.

My basic orientation as a person is one of small steps, small progress, trying to be a little less wrong than yesterday. However, such a mindset can lead to a conservatism that inhibits changes in direction. One goal I have for 2015 is to take bigger risks intellectually, to stretch my thinking more than I have lately. I'll trust Wodehouse that when I start, I may well be awful. I'll recall Darwin's example that it's okay to be wildly wrong, because then someone will prove me wrong (maybe even me), and that will be progress. And if, like McCann's father, I can treat being wrong as merely an adventure in my limitations, perhaps fear and conservatism won't hold me back from new questions worth asking.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

December 26, 2014 8:32 AM

Editing and the Illusion of Thought

Martin Amis, in The Paris Review, The Art of Fiction No. 151:

By the way, it's all nonsense about how wonderful computers are because you can shift things around. Nothing compares with the fluidity of longhand. You shift things around without shifting them around--in that you merely indicate a possibility while your original thought is still there. The trouble with a computer is that what you come out with has no memory, no provenance, no history--the little cursor, or whatever it's called, that wobbles around the middle of the screen falsely gives you the impression that you're thinking. Even when you're not.

My immediate reaction was that Mr. Amis needs version control, but there is something more here.

When writing with pencil and paper, we work on an artifact that embodies the changes it has gone through. We see the marks and erasures; we see the sentence where it once was at the same time we see the arrow telling us where it now belongs. When writing in a word processor, our work appears complete, even timeless, though we know it isn't. Mark-up mode lets us see some of the document's evolution, but the changes feel more distant from our minds. They live out there.

I empathize with writers like Amis, whose experience predates the computer. Longhand feels different. Teasing out what was valuable, even essential, in previous experience and what was merely the limitation of our tools is one of the great challenges of any time. How do we make new tools that are worth the change, that enable us to do more and better?


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

December 14, 2014 9:38 AM

Social Media, Baseball, and Addiction

In a recent interview with Rolling Stone, rock star Geddy Lee revealed himself as a fan of baseball, but not of social media:

Geddy Lee isn't a big fan of social media. "I sometimes look on Twitter to follow baseball transactions," he says. "But that's it. I'm also not on Facebook or anything. I see it as an addiction and I have enough addictions. God knows I pick up my phone enough to check baseball scores."

As a baseball fan without a smart phone, I am in no rush to judge. I don't need more addictions, either.

The recently-concluded winter baseball meetings likely kept Lee as busy following transactions as they kept me, with several big trades and free agent signings. My Reds and Tigers both made moves that affect expectations for 2015.

Pitchers and catchers report in a little over two months. Lee and I will be checking scores again soon enough.


Posted by Eugene Wallingford | Permalink | Categories: General, Photos

December 02, 2014 2:53 PM

Other People's Best Interests

Yesterday I read:

It's hard for me to figure out people voting against their own self-interests.

I'm not linking to the source, because it wouldn't be fair to single the speaker out, especially when so many other things in the article are spot-on. Besides, I hear many different people express this sentiment from time to time, people of various political backgrounds and cultural experiences. It seems a natural human reaction when things don't turn out the way we think they should.

Here is something I've learned from teaching and from working with teams doing research and writing software:

If you find yourself often thinking that people aren't acting in their own self-interests, maybe you don't know what their interests are.

It certainly may be true that people are not acting in what you think is their own self-interest. But it's rather presumptuous to think that you know other people's best interests better than they do.

Whenever I find myself in this position, I have some work to do. I need to get to know my students, or my colleagues, or my fellow citizens, better. In cases where it's really true, and I have strong reason to think they aren't acting in their own best interest, I have an opportunity to help them learn. This kind of conversation calls for great care, though, because often we are dealing with people's identities and most deeply-held beliefs.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading, Teaching and Learning

November 25, 2014 1:43 PM

Concrete Play Trumps All

Areschenko-Johannessen, Bundesliga 2006-2007

One of the lessons taught by the computer is that concrete play trumps all.

This comment appeared in the review of a book of chess analysis [ paywalled ]. The reviewer is taking the author to task for talking about the positional factors that give one player "a stable advantage" in a particular position, when a commercially-available chess program shows the other player can equalize easily, and perhaps even gain an advantage.

It is also a fitting comment on our relationship with computers these days more generally. In areas such as search and language translation, Google helped us see that conventional wisdom can often be upended by a lot of data and many processors. In AI, statistical techniques and neural networks solve problems in ways that models of human cognition cannot. Everywhere we turn, it seems, big data and powerful computers are helping us to redefine our understanding of the world.

We humans need not lose all hope, though. There is still room for building models of the world and using them to reason, just as there is room for human analysis of chess games. In chess, computer analysis is pushing grandmasters to think differently about the game. The result is a different kind of understanding for the more ordinary of us, too. We just have to be careful to check our abstract understanding against computer analysis. Concrete play trumps all, and it tests our hypotheses. That's good science, and good thinking.

~~~~

(The chess position is from Areschenko-Johannessen 2006-2007, used as an example in Chess Training for Post-Beginners by Yaroslav Srokovski and cited in John Hartmann's review of the book in the November 2014 issue of Chess Life.)


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 23, 2014 8:50 AM

Supply, Demand, and K-12 CS

When I meet with prospective students and their parents, we often end up discussing why most high schools don't teach computer science. I tell them that, when I started as a new prof here, about a quarter of incoming freshmen had taken a year of programming in high school, and many other students had had the opportunity to do so. My colleagues and I figured that this percentage would go way up, so we began to think about how we might structure our first-year courses when most or all students already knew how to program.

However, the percentage of incoming students with programming experience didn't go up. It went way down. These days, about 10% of our freshmen know how to program when they start our intro course. Many of those learned what they know on their own. What happened, today's parents ask?

A lot of things happened, including the dot-com bubble, a drop in the supply of available teachers, a narrowing of the high school curriculum in many districts, and the introduction of high-stakes testing. I'm not sure how much each contributed to the change, or whether other factors may have played a bigger role. Whatever the causes, the result is that our intro course still expects no previous programming experience.

Yesterday, I saw a post by a K-12 teacher on the Racket users mailing list that illustrates the powerful pull of economics. He is leaving teaching for the software development industry, though reluctantly. "The thing I will miss the most," he says, "is the enjoyment I get out of seeing youngsters' brains come to life." He also loves seeing them succeed in the careers that knowing how to program makes possible. But in that success lies the seed of his own career change:

Speaking of my students working in the field, I simply grew too tired of hearing about their salaries which, with a couple of years experience, was typically twice what I was earning with 25+ years of experience. Ultimately that just became too much to take.

He notes that college professors probably know the feeling, too. The pull must be much stronger on him and his colleagues, though; college CS professors are generally paid much better than K-12 teachers. A love of teaching can go only so far. At one level, we should probably be surprised that anyone who knows how to program well enough to teach thirteen- or seventeen-year-olds to do it stays in the schools. If not surprised, we should at least be deeply appreciative of the people who do.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

November 11, 2014 7:53 AM

The Internet Era in One Sentence

I just love this:

When a 14-year-old kid can blow up your business in his spare time, not because he hates you but because he loves you, then you have a problem.

Clay Shirky attributes it to Gordy Thompson, who managed internet services at the New York Times in the early 1990s. Back then, it was insightful prognostication; today, it serves as an epitaph for many an old business model.

Are 14-year-old kids making YouTube videos to replace me yet?


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 01, 2014 3:27 PM

Passion is a Heavy Burden

Mark Guzdial blogged this morning about the challenge of turning business teachers into CS teachers. Where is the passion? he asks.

These days, I wince every time I hear the word 'passion'. We apply it to so many things. We expect teachers to have passion for the courses they teach, students to have passion for the courses they take, and graduates to have passion for the jobs they do and the careers they build.

Passion is a heavy burden. In particular, I've seen it paralyze otherwise well-adjusted college students who think they need to try another major, because they don't feel a passion for the one they are currently studying. They don't realize that often passion comes later, after they master something, do it for a while, and come to appreciate it in ways they could never imagine before. I'm sure some of these students become alumni who are discontent with their careers, because they don't feel passion.

I think requiring all CS teachers to have a passion for CS sets the bar too high. It's an unrealistic expectation of prospective teachers and of the programs that prepare them.

We can survive without passionate teachers. We should set our sights on more realistic and relevant goals:

  • Teachers should be curious. They should have a desire to learn new things.
  • Teachers should be professional. They should have a desire to do their jobs well.
  • Teachers should be competent. They should be capable of doing their jobs well.

Curiosity is so much more important than passion for most people in most contexts. If you are curious, you will like encountering new ideas and learning new skills. That enjoyment will carry you a long way. It may even help you find your passion.

Perhaps we should set similarly realistic goals for our students, too. If they are curious, professional, and competent, they will most likely be successful -- and content, if not happy. We could all do worse.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

October 23, 2014 4:21 PM

A Quick Word on the Yik Yak Controversy

There has been some controversy on my campus recently about a slew of hurtful posts made on the social media application Yik Yak. The following is something I wrote for my intro CS students, with minor changes.

Computing is often in the news, but we don't talk much about current events in class. That's not the focus of this course, and we have plenty to do...

But the recent news story in the Northern Iowan about Yik Yak has been on my mind. Yik Yak is a social media app that lets people make comments anonymously and vote on other people's comments. This kind of app has many possible uses, some of which are positive. Many people live under conditions where they need to be able to communicate anonymously.

Unfortunately, some people in the UNI area have been using it to post hurtful comments about various groups. This behavior is simply mean.

Yik Yak is a social app, so the controversy is about people and how they behave. In this regard, my reaction has been similar to so many others' reactions. I am always sad to be reminded that people actually think such things, and sadder to know that they feel compelled to say them out loud. To do so anonymously is an act of cowardice.

But this controversy is also about what we do, because Yik Yak is a program. We call it an "app", but that's just the term du jour. It is a computer program. Programmers wrote it.

We could have an interesting discussion about apps like this: their uses, the good and bad they enable, how to grow and guide communities of users, and so on. I do not use Yik Yak and am not a member of its community. I don't know much beyond what has been reported about it in the media. However, I have been part of Internet-based communities since I was in college, and they all seem to have a lot in common with one another. So this situation feels quite familiar to me.

I am not going to lecture a group of young people about the ways they communicate and congregate on-line. Let me just say this.

When you learn to program, you inherit power to affect the world. You can make things, programs and apps and services that real people use. You can choose to use your power to do good things, to make the world better. Or you can not choose to. Not choosing may mean creating something whose effects you did not consider, or whose community behaves in ways you did not intend.

Please take your power seriously. Think about the effects of what you do when you write a program. Choose wisely.


Posted by Eugene Wallingford | Permalink | Categories: General

October 21, 2014 3:13 PM

Ernest Hemingway, Programmer

In The Wave in the Mind, a collection of talks and essays, Ursula Le Guin describes Ernest Hemingway as "someone who did Man right". She also gives us insight into Hemingway's preferences in programming languages. Anyone who has read Hemingway knows that he loved short sentences. Le Guin tells us more:

Ernest Hemingway would have died rather than have syntax. Or semicolons.

So, Java and C are out. Python would fit. Or maybe Lisp. All the greats know Lisp.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

September 24, 2014 3:54 PM

Is It Really That Hard?

This morning, I tweeted:

Pretty sure I could build a git-based curriculum management system in two weeks that would be miles better than anything on the market now.

Yes, I know that it is easy to have ideas, and that carrying an idea through to a product is the real challenge. At least I don't just need a programmer...

My tweet was the result of temporary madness provoked by yet another round of listening to non-CS colleagues talk about one of the pieces of software we use on campus. It is a commercial product purchased for one task only, to help us manage the cycle of updating the university catalog. Alas, in its current state, it can handle only one catalog at a time. This is, of course, inconvenient. There are always at least two catalogs: the one in effect at this moment, and the one in progress of being updated. That doesn't even take into account all of the old catalogs still in effect for the students who entered the university when they were The Catalog.

Yes, we need version control. Either the current software does not provide it, or that feature is turned off.

The madness arises because of the deep internal conflict that occurs within me when I'm drawn into such conversations. Everyone assumes that programs "can't do this", or that the programmers who wrote our product were mean or incompetent. I could try to convince them otherwise by explaining the idea of version control. But their experience with commercial software is so uniformly bad that they have a hard time imagining I'm telling the truth. Either I misunderstand the problem, or I am telling them a white lie.

The alternative is to shake my head, agree with them implicitly, and keep thinking about how to teach my intro students how to design simple programs.

I'm convinced that a suitable web front-end to a git back end could do 98% of what we need, which is about 53% more than either of our last two commercial solutions has done for us.
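The core of what such a system needs is small. As a hypothetical sketch (all the names here are mine, not from any real product), the essential feature is that each catalog edition is a separate snapshot: old editions stay frozen for the students admitted under them, while a new draft is revised independently:

```python
# Minimal model of the versioning a catalog system needs:
# several editions in effect at once, plus a draft under revision.
# All names are illustrative, not from any real product.

class CatalogStore:
    def __init__(self):
        self.editions = {}   # edition name -> {course id: description}

    def publish(self, name, courses):
        """Freeze a catalog edition under a name, e.g. '2013-2014'."""
        self.editions[name] = dict(courses)   # store a snapshot, not a reference

    def draft_from(self, name):
        """Start a new draft as an independent copy of an existing edition."""
        return dict(self.editions[name])

store = CatalogStore()
store.publish("2013-2014", {"CS 1510": "Intro to Computing"})

draft = store.draft_from("2013-2014")
draft["CS 1510"] = "Introduction to Computing"   # revise the draft...
store.publish("2014-2015", draft)                # ...and publish it

# The old edition is untouched -- students who entered under it still see it.
print(store.editions["2013-2014"]["CS 1510"])    # Intro to Computing
print(store.editions["2014-2015"]["CS 1510"])    # Introduction to Computing
```

A git back end gives you this snapshot behavior for free, along with history and diffs; the remaining work is the web front end for editing and review.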

Maybe it's time for me to take a leave of absence, put together a small team of programmers, and do this. Yes, I would need a team. I know my limitations, and besides working with a few friends would be a lot more fun. The current tools in this space leave a lot of room for improvement. Built well and marketed well, this product would make enough money from satisfaction-starved universities to reward everyone on the team well enough for all to retire comfortably.

Maybe not. But the idea is free for the taking. All I ask is that if you build it, give me a shout-out on your website. Oh, and cut my university a good deal when we buy your software to replace whatever product we are grumbling about when you reach market.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

September 15, 2014 4:22 PM

It's All Just Keystrokes

Laurie Penny describes one effect of so many strands of modern life converging into the use of a single device:

That girl typing alone at the internet café might be finishing off her novel. Or she might be breaking up with her boyfriend. Or breaking into a bank. Unless you can see her screen, you can't know for sure. It's all just keystrokes.

Some of it is writing in ways we have always written; some of it is writing in ways only recently imagined. Some of it is writing for a computer. A lot of it is writing.

(Excerpt from Why I Write.)


Posted by Eugene Wallingford | Permalink | Categories: General

September 12, 2014 1:49 PM

The Suffocating Gerbils Problem

I had never heard of the "suffocating gerbils" problem until I ran across this comment in a Lambda the Ultimate thread on mixing declarative and imperative approaches to GUI design. Peter Van Roy explained the problem this way:

A space rocket, like the Saturn V, is a complex piece of engineering with many layered subsystems, each of which is often pushed to the limits. Each subsystem depends on some others. Suppose that subsystem A depends on subsystem B. If A uses B in a way that was not intended by B's designers, even though formally B's specification is being followed by A, then we have a suffocating gerbils problem. The mental image is that B is implemented by a bunch of gerbils running to exhaustion in their hoops. A is pushing them to do too much.

I first came to appreciate the interrelated and overlapping functionality of engineered subsystems in graduate school, when I helped a fellow student build a software model of the fuel and motive systems of an F-18 fighter plane. It was quite a challenge for our modeling language, because the functions and behaviors of the systems were intertwined and did not follow obviously from the specification of components and connections. This challenge motivated the project. McDonnell Douglas was trying to understand the systems in a new way, in order to better monitor performance and diagnose failures. (I'm not sure how the project turned out...)

We suffocate gerbils at the university sometimes, too. Some functions depend on tenure-track faculty teaching occasional overloads, or the hiring of temporary faculty as adjuncts. When money is good, all is well. As budgets tighten, we find ourselves putting demands on these subsystems to meet other essential functions, such as advising, recruiting, and external engagement. It's hard to anticipate looming problems before they arrive in full failure; everything is being done according to specification.

Now there's a mental image: faculty gerbils running to exhaustion.

If you are looking for something new to read, check out some of Van Roy's work. His Concepts, Techniques, and Models of Computer Programming offers all kinds of cool ideas about programming language design and use. I happily second the sentiment of this tweet:

Note to self: read all Peter Van Roy's LtU comments in chronological order and build the things that don't exist yet: http://lambda-the-ultimate.org/user/288/track?from=120&sort=asc&order=last%20post

There are probably a few PhD dissertations lurking in those comments.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

August 19, 2014 1:49 PM

The Universal Justification

Because we need it to tell better stories.

Ethan Zuckerman says that this is the reason people are addicted to big data, quoting Maciej Ceglowski's wonderful The Internet with a Human Face. But if you look deep enough, this is the reason that most of us do so many of the things we do. We want to tell better stories.

As I teach our intro course this fall, I am going to ask myself occasionally, "How does what we are learning today help my students tell a better story?" I'm curious to see how that changes the way I think about the things we do in class.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

July 28, 2014 1:00 PM

Sometimes, You Have To Speak Up For Yourself

Wisdom from the TV:

"Whatever happened to humility, isn't that a virtue or something?"

"One of the highest. People in power are always saying so."

It is worth noting that one of the antonyms of "humble" is "privileged".

~~~~

This passage apparently occurs in an episode of Orange Is The New Black. I've never seen the show, but the exchange is quoted in this discussion of the show.

I just realized how odd it is to refer to Orange Is The New Black as a TV show. It is a Netflix original series, which shows up on your TV only if you route your Internet viewing through that old box. Alas, 30- and 60-minute serialized shows have always been "TV" to me. I'm caught in the slipstream as our dominant entertainment media change forms.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

July 10, 2014 3:08 PM

The Passing of the Postage Stamp

In this New York Times article on James Baldwin's ninetieth birthday, scholar Henry Louis Gates laments:

On one hand, he's on a U.S. postage stamp; on the other hand, he's not in the Common Core.

I'm not qualified to comment on Baldwin and his place in the Common Core. In the last few months, I read several articles about and including Baldwin, and from those I have come to appreciate better his role in twentieth-century literature. But I also empathize with anyone trying to create a list of things that every American should learn in school.

What struck me in Gates's comment was the reference to the postage stamp. I'm old enough to have grown up in a world where the postage stamp held a position of singular importance in our culture. It enabled communication at a distance, whether geographical or personal. Stamps were a staple of daily life.

In such a world, appearing on a stamp was an honor. It indicated a widespread acknowledgment of a person's (or organization's, or event's) cultural impact. In this sense, the Postal Service's decision to include James Baldwin on a stamp was a sign of his importance to our culture, and a way to honor his contributions to our literature.

Alas, this would have been a much more significant and visible honor in the 1980s or even the 1990s. In the span of the last decade or so, the postage stamp has gone from relevant and essential to archaic.

When I was a boy, I collected stamps. It was a fun hobby. I still have my collection, even if it's many years out of date now. Back then, stamp collecting was a popular activity with a vibrant community of hobbyists. For all I know, that's still true. There's certainly still a vibrant market for some stamps!

But these days, whenever I use a new stamp, I feel as if I'm holding an anachronism in my hands. Computing technology played a central role in the obsolescence of the stamp, at least for personal and social communication.

Sometimes people say that we in CS need to do a better job helping potential majors see the ways in which our discipline can be used to effect change in the world. We never have to look far to find examples. If a young person wants to be able to participate in how our culture changes in the future, they can hardly do better than to know a little computer science.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

July 09, 2014 12:35 PM

Why I Blog, Ten Years On

A blog can be many things.

It can be an essay, a place to work out what I think, in the act of writing.

It can be a lecture, a place to teach something, however big or small, in my own way.

It can be memoir, a place to tell stories about my life, maybe with a connection to someone else's story.

It can be a book review or a conference review, a place to tell others about something I've read or seen that they might like, too. Or not.

It can be an open letter, a place to share news, good or bad, in a broadcast that reaches many.

It can be a call for help, a request for help from anyone who receives the message and has the time and energy to respond.

It can be a riff on someone else's post. I'm not a jazz musician, but I like to quote the melodies in other people's writing. Some blog posts are my solos.

It can be a place to make connections, to think about how things are similar and different, and maybe learn something in the process.

A blog is all of these, and more.

A blog can also be a time machine. In this mode, I am the reader. My blog reminds me who I was at another time.

This effect often begins with a practical question. When I taught agile software development this summer, I looked back to when I taught it last. What had I learned then but forgotten since? How might I do a better job this time around?

When I visit blog posts from the past, though, something else can happen. I sometimes find myself reading on. The words mesmerize me and pull me forward on the page, but back in time. It is not that the words are so good that I can't stop reading. It's that they remind me who I was back then. A different person wrote those words. A different person, yet me. It's quite a feeling.

A blog can combine any number of writing forms. I am not equally good writing in all of these forms, or even passably good in any of them. But they are me. Dave Winer has long said that a blog is the unedited voice of a person. This blog is the unedited voice of me.

When I wrote my first blog post ten years ago today, I wasn't sure if anyone wanted to hear my voice. Over the years, I've had the good fortune to interact with many readers, so I know someone is listening. That still amazes me. I'm glad that something you read here is worth the visit.

Back in those early days, I wondered if it even mattered whether anyone else would read. The blog as essay and as time machine are valuable enough on their own to make writing worth the effort to me. But I'll be honest: it helps a lot knowing that other people are reading. Even when you don't send comments by e-mail, I know you are there. Thank you for your time.

I don't write as often as I did in the beginning. But I still have things to say, so I'll keep writing.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

June 26, 2014 11:12 AM

Debunking Christensen?

A lot of people I know have been discussing the recent New Yorker article "debunking" Clayton Christensen's theory of disruptive innovation. I'm withholding judgment, because that usually is the right thing for me to do when discussing theories about systems we don't understand well and critiques of such theories. The best way to find out the answer is to wait for more data.

That said, we have seen this before in the space of economics and business management. A few years back, the book Good to Great by James Collins became quite popular on my campus, because our new president, an economist by training, was a proponent of its view of how companies had gone from being merely steady producers to being stars in their markets. He hoped that we could use some of its prescriptions to help transform our university from a decent public comprehensive into a better, stronger institution.

But in recent years we have seen critiques of Collins's theory. The problem: some of the companies that Collins touts in the book have fallen on hard times and been unable to sustain their greatness. (As I said, more data usually settles all scores.) Good to Great's prescriptions weren't enough for companies to sustain greatness; maybe they were not sufficient, or even necessary, for achieving (short-term) market dominance.

This has long been a weakness of the business management literature. When I was an undergrad double majoring in CS and accounting, I read a lot of case studies about successful companies, and my professors tried to help us draw out truths that would help any company succeed. Neither the authors of the case studies nor the professors seemed aware that we were suffering from a bad case of survivor bias. Sure, that set of strategies worked for Coca Cola. Did other companies use the same strategies and fail? If so, why? Maybe Coca Cola just got lucky. We didn't really know.

My takeaway from reading most business books of this sort is that they tell great stories. They give us post hoc explanations of complex systems that fit the data at hand, but they don't have much in the way of predictive power. Buying into such theories wholesale as a plan for the future is rarely a good idea.

These books can still be useful to people who read them as inspirational stories and a source of ideas to try. For example, I found Collins's idea of "getting the right people on the bus" to be helpful when I was first starting as department head. I took a broad view of the book and learned some things.

And that said, I have speculated many times here about the future of universities and even mentioned Christensen's idea of disruption a couple of times [ 1 | 2 ]. Have I been acting on a bad theory?

I think the positive reaction to the New Yorker article is really a reaction to the many people who have been using the idea of disruptive innovation as a bludgeon in the university space, especially with regard to MOOCs. Christensen himself has sometimes been guilty of speaking rather confidently about particular ways to disrupt universities. After a period of groupthink in which people know without evidence that MOOCs will topple the existing university model, many of my colleagues are simply happy to have someone speak up on their side of the debate.

The current way that universities do business faces a number of big challenges as the balance of revenue streams and costs shifts. Perhaps universities as we know them now will ultimately be disrupted. This does not mean that any technology we throw at the problem will be the disruptive force that topples them. As Mark Guzdial wrote recently,

Moving education onto MOOCs just to be disruptive isn't valuable.

That's the most important point to take away from the piece in the New Yorker: disruptors ultimately have to provide value in the market. We don't know yet if MOOCs or any other current technology experiment in education can do that. We likely won't know until after it starts to happen. That's one of the important points to take away from so much of the business management literature. Good descriptive theories often don't make good prescriptive theories.

The risk people inside universities run is falling into a groupthink of their own, in which something very like the status quo is the future of higher education. My colleagues tend to speak in more measured tones than some of the revolutionaries espousing on-line courses and MOOCs, but their words carry an unmistakable message: "What we do is essential. The way we do it has stood the test of time. No one can replace us." Some of my colleagues admit ruefully that perhaps something can replace the university as it is, but that we will all be worse off as a result.

That's dangerous thinking, too. Over the years, plenty of people who have said, "No one can do what we do as well as we do" have been proven wrong.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

May 05, 2014 4:35 PM

Motivated by Teaching Undergrads

Recently, a gentleman named Seth Roberts passed away. I didn't know Roberts and was not familiar with his work. However, several people I respect commented on his life and career, so I took a look at one colleague's reminiscence. Roberts was an interesting fellow who didn't do things the usual way for a research academic. This passage stood out:

Seth's academic career was unusual. He shot through college and graduate school to a tenure-track job at a top university, then continued to do publication-quality research for several years until receiving tenure. At that point he was not a superstar but I think he was still considered a respected member of the mainstream academic community. But during the years that followed, Seth lost interest in that thread of research (you can see this by looking at the dates of most of his highly-cited papers). He told me once that his shift was motivated by teaching introductory undergraduate psychology: the students, he said, were interested in things that would affect their lives, and, compared to that, the kind of research that leads to a productive academic career did not seem so appealing.

That last sentence explains, I think, why so many computer science faculty at schools that are not research-intensive end up falling away from traditional research and publishing. When you come into contact with a lot of undergrads, you may well find yourself caring more deeply about things that will affect their lives in a more direct way. Pushing deeper down a narrow theoretical path, or developing a novel framework for file system management that most people will never use, may not seem like the best way to use your time.

My interests have certainly shifted over the years. I found myself interested in software development, in particular tools and practices that students can use to make software more reliably and teaching practices that would help students learn more effectively. Fortunately, I've always loved programming qua programming, and this has allowed me to teach different programming styles with an eye on how learning them will help my students become better programmers. Heck, I was even able to stick with it long enough that functional programming became popular in industry! I've also been lucky that my interest in languages and compilers has been of interest to students and employers over the last few years.

In any event, I can certainly understand how Roberts diverged from the ordained path and turned his interest to other things. One challenge of leaving the ordained path is to retain the mindset of a scientist, seeking out opportunities to evaluate ideas and to disseminate the ones that appear to hold up. You don't need to publish in the best journals to disseminate good ideas widely. That may not even be the best route.

Another challenge is to find a community of like-minded people in which to work. An open, inquisitive community is a place to find new ideas, a place to try ideas out before investing too much in a doomed one, and a place to find the colleagues most of us need to stay sane while exploring what interests us. The software and CS worlds have helped create the technology that makes it possible to grow such communities in new ways, and our own technology now supports some amazing communities of software and CS people. It is a good time to be an academic or developer.

I've enjoyed reading about Roberts' career and learning about what seems to have been one of academia's unique individuals. And I certainly understand how teaching introductory undergrads might motivate a different worldview for an academic. It's good to be reminded that it's okay to care about the things that will affect the lives of our students now rather than later.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

April 27, 2014 7:20 PM

Knowing and Doing in the Wild, Antifragile Edition

a passage from Taleb's 'Antifragile' that mentions knowing and doing

Reader Aaron Friel was reading Taleb's Antifragile and came across a passage that brought to mind this blog. Because of "modernity's connectivity, and the newfound invisibility of causal chains", Taleb says, ...

The intellectual today is vastly more powerful and dangerous than before. The "knowledge world" causes separation of knowing and doing (within the same person) and leads to the fragility of society.

He wondered if this passage was the source of the title of my blog. Knowing and Doing predates Taleb's book by nearly a decade, so it wasn't the source. But the idea expressed in this passage was certainly central to how the blog got its name. I hoped to examine the relationship between knowing and doing, and in particular the danger of separating them in the classroom or in the software studio. So, I'm happy to have someone make a connection to this passage.

Even so, I still lust after naming my blog The Euphio Question. RIP, Mr. Vonnegut.


Posted by Eugene Wallingford | Permalink | Categories: General

April 22, 2014 2:56 PM

Not Writing At All Leads To Nothing

In a recent interview, novelist and journalist Anna Quindlen was asked if she ever has writer's block. Her answer:

Some days I fear writing dreadfully, but I do it anyway. I've discovered that sometimes writing badly can eventually lead to something better. Not writing at all leads to nothing.

I deal with CS students all the time who are paralyzed by starting on a programming assignment, for fear of doing it wrong. All that gets them is never done. My job in those cases is less likely to involve teaching them something new they need to do the assignment than to involve helping them get past the fear. A teacher sometimes has to be a psychologist.

I'd like to think that, at my advanced age and experience, I am beyond such fears myself. But occasionally they are there. Sometimes, I just have to force myself to write that first simple test, watch it fail, and ask myself, "What now?" As code happens, it may be good, or it may be bad, but it's not an empty file. Refactoring helps me make it better as I go along. I can always delete it all and start over, but by then I know more than I did at the outset, and I usually am ready to plow ahead.


Posted by Eugene Wallingford | Permalink | Categories: General

April 09, 2014 3:26 PM

Programming Everywhere, Vox Edition

In a report on the launch of Vox Media, we learn that the line between software developers and journalists at Vox is blurred, as writers and reporters work together "to build the tools they require".

"It is thrilling as a journalist being able to envision a tool and having it become a real thing," Mr. Topolsky said. "And it is rare."

It will be less rare in the future. Programming will become a natural part of more and more people's toolboxes.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

March 12, 2014 3:55 PM

Not Content With Content

Last week, the Chronicle of Higher Ed ran an article on a new joint major at Stanford combining computer science and the humanities.

[Students] might compose music or write a short story and translate those works, through code, into something they can share on the web.

"For students it seems perfectly natural to have an interest in coding," [the program's director] said. "In one sense these fields might feel like they're far apart, but they're getting closer and closer."

The program works in both directions, by also engaging CS students in the societal issues created by ubiquitous networks and computing power.

We are doing something similar at my university. A few years ago, several departments began to collaborate on a multidisciplinary program called Interactive Digital Studies which went live in 2012. In the IDS program, students complete a common core of courses from the Communication Studies department and then take "bundles" of coursework involving digital technology from at least two different disciplines. These areas of emphasis enable students to explore the interaction of computing with various topics in media, the humanities, and culture.

Like Stanford's new major, most of the coursework is designed to work at the intersection of disciplines, rather than pursuing disciplines independently, "in parallel".

The initial version of the computation bundle consists of an odd mix of application tools and opportunities to write programs. Now that the program is in place, we are finding that students and faculty alike desire more depth of understanding about programming and development. We are in the process of re-designing the bundle to prepare students to work in a world where so many ideas become web sites or apps, and in which data analytics plays an important role in understanding what people do.

Both our IDS program and Stanford's new major focus on something that we are seeing increasingly at universities these days: the intersections of digital technology and other disciplines, in particular the humanities. Computational tools make it possible for everyone to create more kinds of things, but only if people learn how to use new tools and think about their work in new ways.

Consider this passage by Jim O'Loughlin, a UNI English professor, from a recent position statement on the "digital turn" of the humanities:

We are increasingly unlikely to find writers who only provide content when the tools for photography, videography and digital design can all be found on our laptops or even on our phones. It is not simply that writers will need to do more. Writers will want to do more, because with a modest amount of effort they can be their own designers, photographers, publishers or even programmers.

Writers don't have to settle for producing "content" and then relying heavily on others to help bring the content to an audience. New tools enable writers to take greater control of putting their ideas before an audience. But...

... only if we [writers] are willing to think seriously not only about our ideas but about what tools we can use to bring our ideas to an audience.

More tools are within the reach of more people now than ever before. Computing makes that possible, not only for writers, but also for musicians and teachers and social scientists.

Going further, computer programming makes it possible to modify existing tools and to create new tools when the old ones are not sufficient. Writers, musicians, teachers, and social scientists may not want to program at that level, but they can participate in the process.

The critical link is preparation. This digital turn empowers only those who are prepared to think in new ways and to wield a new set of tools. Programs like our IDS major and Stanford's new joint major are among the many efforts hoping to spread the opportunities available now to a larger and more diverse set of people.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

March 11, 2014 4:52 PM

Change The Battle From Arguments To Tests

In his recent article on the future of the news business, Marc Andreessen has a great passage in his section on ways for the journalism industry to move forward:

Experimentation: You may not have all the right answers up front, but running many experiments changes the battle for the right way forward from arguments to tests. You get data, which leads to correctness and ultimately finding the right answers.

I love that clause: "running many experiments changes the battle for the right way forward from arguments to tests".

While programming, it's easy to get caught up in what we know about the code we have just written and assume that this somehow empowers us to declare sweeping truths about what to do next.

When students are first learning to program, they often fall into this trap -- despite the fact that they don't know much at all. From other courses, though, they are used to thinking for a bit, drawing some conclusions, and then expressing strongly-held opinions. Why not do it with their code, too?

No matter who we are, whenever we do this, sometimes we are right, and sometimes we are wrong. Why leave it to chance? Run a simple little experiment. Write a snippet of code that implements our idea, and run it. See what happens.

Programs let us test our ideas, even the ideas we have about the program we are writing. Why settle for abstract assertions when we can do better? In the end, even well-reasoned assertions are so much hot air. I learned this from Ward Cunningham: It's all talk until the tests run.
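A tiny illustration of settling a question with a test rather than an argument (the example question is mine, not Andreessen's or Cunningham's): suppose two programmers disagree about whether Python's sort preserves the relative order of equal elements. Instead of debating, write a five-line experiment and run it.

```python
# Experiment: is Python's sort stable? Sort records by their second
# field and see whether ties keep their original relative order.
records = [("b", 2), ("a", 1), ("c", 1), ("d", 2)]
by_value = sorted(records, key=lambda pair: pair[1])

# If the sort is stable, ("a", 1) stays before ("c", 1) and
# ("b", 2) stays before ("d", 2).
assert by_value == [("a", 1), ("c", 1), ("b", 2), ("d", 2)]
print("sort preserved the order of equal elements for this input")
```

The experiment takes less time than the argument, and it ends the discussion with data instead of opinion.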


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

February 25, 2014 3:31 PM

Abraham Lincoln on Reading the Comment Section

From Abraham Lincoln's last public address:

As a general rule, I abstain from reading the reports of attacks upon myself, wishing not to be provoked by that to which I cannot properly offer an answer.

These remarks came two days after Robert E. Lee surrendered at Appomattox Court House. Lincoln was facing abuse from the North and the South, and from within his party and without.

The great ones speak truths that outlive their times.


Posted by Eugene Wallingford | Permalink | Categories: General

February 22, 2014 2:05 PM

MOOCs: Have No Fear! -- Or Should We?

The Grumpy Economist has taught a MOOC and says in his analysis of MOOCs:

The grumpy response to moocs: When Gutenberg invented moveable type, universities reacted in horror. "They'll just read the textbook. Nobody will come to lectures anymore!" It didn't happen. Why should we worry now?

The calming effect of his rather long entry is mitigated by other predictions, such as:

However, no question about it, the deadly boring hour and a half lecture in a hall with 100 people by a mediocre professor teaching utterly standard material is just dead, RIP. And universities and classes which offer nothing more to their campus students will indeed be pressed.

In downplaying the potential effects of MOOCs, Cochrane seems mostly to be speaking about research schools and more prestigious liberal arts schools. Education is but one of the "goods" being sold by such schools; prestige and connections are often the primary benefits sought by students there.

I usually feel a little odd when I read comments on teaching from people who teach mostly graduate students and mostly at big R-1 schools. I'm not sure their experience of teaching is quite the same as the experience of most university professors. Consequently, I'm suspicious of the prescriptions and predictions they make for higher education, because our personal experiences affect our view of the world.

That said, Cochrane's blog spends a lot of time talking about the nuts and bolts of creating MOOCs, and his comments on fixed and marginal costs are on the mark. (He may be grumpy, but he is an economist!) And a few of his remarks about teaching apply just as well to undergrads at a state teaching university as they do to U. of Chicago's doctoral program in economics. One that stood out:

Most of my skill as a classroom teacher comes from the fact that I know most of the wrong answers as well as the right ones.

All discussions of MOOCs ultimately include the question of revenue. Cochrane reminds us that universities...

... are, in the end, nonprofit institutions that give away what used to be called knowledge and is now called intellectual property.

The question now, though, is how schools can afford to give away knowledge as state support for public schools declines sharply and relative cost structure makes it hard for public and private schools alike to offer education at a price reasonable for their respective target audiences. The R-1s face a future just as challenging as the rest of us; how can they afford to support researchers who spend most of their time creating knowledge, not teaching it to students?

MOOCs are a weird wrench thrown into this mix. They seem to taketh away as much as they giveth. Interesting times.


Posted by Eugene Wallingford | Permalink | Categories: General

February 05, 2014 4:08 PM

Eccentric Internet Holdout Delbert T. Quimby

Back in a July 2006 entry, I mentioned a 1995 editorial cartoon by Ed Stein, then of the Rocky Mountain News. The cartoon featured "eccentric Internet holdout Delbert T. Quimby", contentedly passing another day non-digitally, reading a book in his den and drinking a glass of wine. It's always been a favorite of mine.

The cartoon had at least one other big fan. He looked for it on the web but had no luck finding it. When he googled the quote, though, there my blog entry was. Recently, his wife uncovered a newspaper clipping of the cartoon, and he remembered the link to my blog post. In an act of unprovoked kindness, he sent me a scan of the cartoon. So, 7+ years later, here it is:

Eccentric Internet holdout Delbert T. Quimby passes yet another day non-digitally.

The web really is an amazing place. Thanks, Duncan.

In 1995, being an Internet holdout was not quite as radical as it would be today. I'm guessing that most of the holdouts in 2014 are Of A Certain Age, remembering a simpler time when information was harder to come by. To avoid the Internet and the web entirely these days is to miss out on a lot of life.

Even so, I am eccentric enough still to appreciate time off-line, a good book in my hand and a beverage at my side. Like my digital devices, I need to recharge every now and then.

(Right now, I am re-reading David Lodge's Small World. It's fun to watch academics being made good sport of.)


Posted by Eugene Wallingford | Permalink | Categories: General

February 03, 2014 4:07 PM

Remembering Generosity

For a variety of reasons, the following passage came to mind today. It is from a letter that Jonathan Schoenberg wrote as part of the "Dear Me, On My First Day of Advertising" series on The Egotist forum:

You got into this business by accident, and by the generosity of people who could have easily been less generous with their time. Please don't forget it.

It's good for me to remind myself frequently of this. I hope I can be as generous with my time to my students and colleagues as so many of my professors and colleagues were with theirs. Even when it means explaining nested for-loops again.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

January 27, 2014 3:29 PM

An Example of the Difference Between Scientists and Humanists

Earlier today, I tweeted a link to The origin of consciousness in the breakdown of the bicameral mind, in which Erik Weijers discusses an unusual theory about the origin of consciousness developed by Julian Jaynes:

[U]ntil a few thousand years ago human beings did not 'view themselves'. They did not have the ability: they had no introspection and no concept of 'self' that they could reflect upon. In other words: they had no subjective consciousness. Jaynes calls their mental world the bicameral mind.

It sounds odd, I know, but I found Jaynes's hypothesis to be a fascinating extrapolation of human history. Read more of Weijers's review if you might be interested.

A number of people who saw my tweet expressed interest in the article or a similar fascination with Jaynes's idea. Two people mentioned the book in which Jaynes presented his hypothesis. I responded that I would now have to dive into the book and learn more. How could I resist the opportunity?

Two of the comments that followed illustrate nicely the differing perspectives of the scientist and the humanist. First, Chris said:

My uncle always loved that book; I should read it, since I suspect serious fundamental evidentiary problems with his thesis.

And then Liz said:

It's good! I come from a humanities angle, so I read it as a thought experiment & human narrative.

The scientist thinks almost immediately of evidence and how well supported the hypothesis might be. The humanist thinks of the hypothesis first as a human narrative, and perhaps only then as a narrow scientific claim. Both perspectives are valuable; they simply highlight different forms of the claim.

From what I've seen on Twitter, I think that Chris and Liz are like me and most of the people I know: a little bit scientist, a little bit humanist -- interested in both the story and the argument. All that differs sometimes is the point from which we launch our investigations.


Posted by Eugene Wallingford | Permalink | Categories: General

January 27, 2014 11:39 AM

The Polymath as Intellectual Polygamist

Carl Djerassi, quoted in The Last Days of the Polymath:

Nowadays people [who] are called polymaths are dabblers -- are dabblers in many different areas. I aspire to be an intellectual polygamist. And I deliberately use that metaphor to provoke with its sexual allusion and to point out the real difference to me between polygamy and promiscuity.

On this view, a dilettante is merely promiscuous, making no real commitment to any love interest. A polymath has many great loves, and loves them all deeply, if not equally.

We tend to look down on dilettantes, but they can perform a useful service. Sometimes, making a connection between two ideas at the right time and in the right place can help spur someone else to "go deep" with the idea. Even when that doesn't happen, dabbling can bring great personal joy and provide more substantial entertainment than a lot of pop culture.

Academics are among the people these days with a well-defined social opportunity to explore at least two areas deeply and seriously: their chosen discipline and teaching. This is perhaps the most compelling reason to desire a life in academia. It even offers a freedom to branch out into new areas later in one's career that is not so easily available to people who work in industry.

These days, it's hard to be a polymath even inside one's own discipline. To know all sub-areas of computer science, say, as well as the experts in those sub-areas is a daunting challenge. I think back to the effort my fellow students and I put in over the years that enabled us to take the Ph.D. qualifying exams in CS. I did quite well across the board, but even then I didn't understand operating systems or programming languages as well as experts in those areas. Many years later, despite continued reading and programming, the gap has only grown.

I share the vague sense of loss, expressed by the author of the article linked to above, of a time when one human could master multiple areas of discourse and make fundamental advances to several. We are certainly better off for collectively understanding the world so much better, but the result is a blow to a certain sort of individual mind and spirit.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

January 26, 2014 3:05 PM

One Reason We Need Computer Programs

Code bridges the gap between theory and data. From A few thoughts on code review of scientific code:

... there is a gulf of unknown size between the theory and the data. Code is what bridges that gap, and specifies how edge cases, weird features of the data, and unknown unknowns are handled or ignored.

I learned this lesson the hard way as a novice programmer. Other activities, such as writing and doing math, exhibit the same characteristic, but it wasn't until I started learning to program that the gap between theory and data really challenged me.

Since learning to program myself, I have observed hundreds of CS students encounter this gap. To their credit, they usually buckle down, work hard, and close the gap. Of course, we have to close the gap for every new problem we try to solve. The challenge doesn't go away; it simply becomes more manageable as we become better programmers.

In the passage above, Titus Brown is talking to his fellow scientists in biology and chemistry. I imagine that they encounter the gap between theory and data in a new and visceral way when they move into computational science. Programming has that power to change how we think.

There is an element of this, too, in how techies and non-techies alike sometimes lose track of how hard it is to create a successful start up. You need an idea, you need a programmer, and you need a lot of hard work to bridge the gap between idea and executed idea.

Whether doing science or starting a company, the code teaches us a lot about our theory. The code makes our theory better.

As Ward Cunningham is fond of saying, it's all talk until the tests run.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

December 18, 2013 3:31 PM

Favorite Passages from Today's Reading

From The End of the Facebook Era:

This is why social networks [like Google+] are struggling even more than Facebook to get a foothold in the future of social networking. They are betting on last year's fashion -- they're fighting Facebook for the last available room on the Titanic when they should be looking at all of the other ships leaving the marina.

A lot of people and organizations in this world are fighting over the last available room on their sector's version of the Titanic. Universities may well be among them. Who is leaving the marina?

From We Need to Talk About TED:

Astrophysics run on the model of American Idol is a recipe for civilizational disaster.

...

TED's version [of deep technocultural shift] has too much faith in technology, and not nearly enough commitment to technology. It is placebo technoradicalism, toying with risk so as to re-affirm the comfortable.

I like TED talks as much as the next person, but I often wonder how much change they cause in the world, as opposed to serving merely as chic entertainment for the comfortable First World set.


Posted by Eugene Wallingford | Permalink | Categories: General

December 17, 2013 3:32 PM

Always Have At Least Two Alternatives

Paraphrasing Kent Beck:

Whenever I write a new piece of code, I like to have at least two alternatives in mind. That way, I know I am not doing the worst thing possible.

I heard Kent say something like this at OOPSLA in the late 1990s. This is advice I give often to students and colleagues, but I've never had a URL that I could point them to.

It's tempting for programmers to start implementing the first good idea that comes to mind. It's especially tempting for novices, who sometimes seem surprised that they have even one good idea. Where would a second one come from?

More experienced students and programmers sometimes trust their skill and experience a little too easily. That first idea seems so good, and I'm a good programmer... Famous last words. Reality eventually catches up with us and helps us become more humble.

Some students are afraid: afraid they won't get done if they waste time considering alternatives, or afraid that they will choose wrong anyway. Such students need more confidence, the kind born out of small successes.

I think the most likely explanation for why beginners don't already seek alternatives is quite simple. They have not developed the design habit. Kent's advice can be a good start.

One pithy statement is often enough of a reminder for more experienced programmers. By itself, though, it probably isn't enough for beginners. But it can be an important first step for students -- and others -- who are in the habit of doing the first thing that pops into their heads.

Do note that this advice is consistent with XP's counsel to do the simplest thing that could possibly work. "Simplest" is a superlative. Grammatically, that suggests having at least three options from which to choose!


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

December 03, 2013 3:17 PM

The Workaday Byproducts of Striving for Higher Goals

Why set audacious goals? In his piece about the Snowfall experiment, David Sleight says yes, and not simply for the immediate end:

The benefits go beyond the plainly obvious. You need good R&D for the same reason you need a good space program. It doesn't just get you to the Moon. It gives you things like memory foam, scratch-resistant lenses, and Dustbusters. It gets you the workaday byproducts of striving for higher goals.

I showed that last sentence a little Twitter love, because it's something people often forget to consider, both when they are working in the trenches and when they are selecting projects to work on. An ambitious project may have a higher risk of failure than something more mundane, but it also has a higher chance of producing unexpected value in the form of new tools and improved process.

This is also something that university curricula don't do well. We tend to design learning experiences that fit neatly into a fifteen-week semester, with predictable gains for our students. That sort of progress is important, of course, but it misses out on opportunities for students to produce their own workaday byproducts. And that's an important experience for students to have.

It also sets a bad example of what learning should feel like, and what it should do for us. Students generally learn what we teach them, or what we make easiest for them to learn. If we always set before them tasks of known, easily-understood dimensions, then they will have to learn after leaving us that the world doesn't usually work like that.

This is one of the reasons I am such a fan of project-based computer science education, as in the traditional compiler course. A compiler is an audacious enough goal for most students that they get to discover their own personal memory foam.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

November 26, 2013 1:38 PM

Saying Thanks, and Giving Back

When someone asked Benjamin Franklin why he had declined to seek a patent for his famous stove, he said:

I declined it from a principle which has ever weighed with me on such occasions, that as we enjoy great advantages from the inventions of others, we should be glad of an opportunity to serve others by any invention of ours.

This seems a fitting sentiment to recall as I look forward to a few days of break with my family for Thanksgiving. I know I have a lot to be thankful for, not the least of which are the inventions of so many others that confer great advantage on me. This week, I give thanks for these creations, and for the creators who shared them with me.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

November 21, 2013 3:06 PM

Agile Thoughts, Healthcare.gov Edition

Clay Shirky explains the cultural attitudes that underlie Healthcare.gov's problems in his recent essay on the gulf between planning and reality. The danger of this gulf exists in any organization, whether business or government, but especially in large organizations. As the number of levels grows between the most powerful decision makers and the workers in the trenches, there is an increasing risk of developing "a culture that prefers deluding the boss over delivering bad news".

But this is also a story of the danger inherent in so-called Big Design Up Front, especially for a new kind of product. Shirky oversimplifies this as the waterfall method, but the basic idea is the same:

By putting the most serious planning at the beginning, with subsequent work derived from the plan, the waterfall method amounts to a pledge by all parties not to learn anything while doing the actual work.

You may learn something, of course; you just aren't allowed to let it change what you build, or how.

Instead, waterfall insists that the participants will understand best how things should work before accumulating any real-world experience, and that planners will always know more than workers.

If the planners believe this, or they allow the workers to think they believe this, then workers will naturally avoid telling their managers what they have learned. In the best case, they don't want to waste anyone's time if sharing the information will have no effect. In the worst case, they might fear the results of sharing what they have learned. No one likes to admit that they can't get the assigned task done, however unrealistic it is.

As Shirky notes, many people believe that a difficult launch of Healthcare.gov was unavoidable, because political and practical factors prevented developers from testing parts of the project as they went along and adjusting their actions in response. Shirky hits this one out of the park:

That observation illustrates the gulf between planning and reality in political circles. It is hard for policy people to imagine that Healthcare.gov could have had a phased rollout, even while it is having one.

You can learn from feedback earlier, or you can learn from feedback later. Pretending that you can avoid problems you already know exist never works.

One of the things I like about agile approaches to software development is that they encourage us not to delude ourselves, or our clients. Or our bosses.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading, Software Development

November 14, 2013 2:55 PM

Toward A New Data Science Culture in Academia

Fernando Perez has a nice write-up, An Ambitious Experiment in Data Science, describing a well-funded new project in which teams at UC Berkeley, the University of Washington, and NYU will collaborate to "change the culture of universities to create a data science culture". A lot of people have been quoting Perez's entry for its colorful assessment of academic incentives and reward structures. I like this piece for the way Perez defines and outlines the problem, in terms of both data science across disciplines and academic culture in general.

For example:

Most scientists are taught to treat computation as an afterthought. Similarly, most methodologists are taught to treat applications as an afterthought.

Methodologists here includes computer scientists, who are often more interested in new data structures, algorithms, and protocols.

This "mirror" disconnect is a problem for a reason many people already understand well:

Computation and data skills are all of a sudden everybody's problem.

(Here are a few past entries of mine that talk about how programming and the nebulous "computational thinking" have spread far and wide: 1 | 2 | 3 | 4.)

Perez rightly points out that open-source software, while imperfect, often embodies the principles of science and scientific collaboration better than the academy does. It will be interesting to see how well this data science project can inject OSS attitudes into big research universities.

He is concerned because, as I have noted before, universities are, as a whole, a conservative lot. Perez says this in a much more entertaining way:

There are few organizations more proud of their traditions and more resistant to change than universities (churches and armies might be worse, but that's about it).

I think he gives churches and armies more credit than they deserve.

The good news is that experiments of the sort being conducted in the Berkeley/UW/NYU project are springing up on a smaller scale around the world. There is some hope for big change in academic culture if a lot of different people at a lot of different institutions experiment, learn, and create small changes that can grow together as they bump into one another.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 09, 2013 12:25 PM

An Unusual Day

My university is hosting an on-campus day today to recruit high school students and transfer students. On a day like this, I usually visit with one or two potential majors and chat with one or two others who might be interested in a CS or programming class. All are usually men.

Today was unusual.

Eight people visited the department to learn about the major.

I spoke with three people who intend to major in other areas, such as accounting and physics, and want to take a minor in CS.

I spoke with a current English major here who is set to graduate in May but is now thinking about employability and considering picking up a second degree in CS.

I spoke with three female students who are interested in CS. These include the English major and a student who has taken several advanced math courses at a good private school nearby, really likes them, and is thinking of combining math and CS in a major here.

The third is a high school freshman who has taken all the tech courses available at her school, helps the tech teacher with the school's computers, and wants to learn more. She told me, "I just think it would be cool to write programs and make things happen."

Some recruiting days are better than others. This is one.


Posted by Eugene Wallingford | Permalink | Categories: General

October 30, 2013 11:41 AM

Discipline Can Be Structural As Well As Personal

There is a great insight in an old post by Brian Marick, Discipline and Skill, which I re-read this week. The topic sentence asserts:

Discipline can be a personal virtue, but it must also be structural.

Extreme Programming illustrates this claim. It draws its greatest power from the structural discipline it creates for developers. Marick goes on:

For example, one of the reasons to program in pairs is that two people are less likely to skip a test than one is. Removing code ownership makes it more likely someone within glaring distance will see that you didn't leave code as clean as you should have. The business's absolute insistence on getting working -- really working -- software at frequent intervals makes the pain of sloppiness strike home next month instead of next year, stiffening the resolve to do the right thing today.

XP consists of a lot of relatively simple actions, but simple actions can be hard to perform, especially consistently and especially in opposition to deeply ingrained habits. XP practices work together to create structural discipline that helps developers "do the right thing".

We see the use of social media playing a similar role these days. Consider diet. People who are trying to lose weight or exercise more have to do some pretty simple things. Unfortunately, those things are not easy to do consistently, and they are opposed by deep personal and cultural habits. In order to address this, digital tool providers like FitBit make it easy for users to sync their data to a social media account and share with others.

This is a form of social discipline, supported by tools and practices that give structure to the actions people want to take. Just like XP. Many behaviors in life work this way.

(Of course, I'm already on record as saying that XP is a self-help system. I have even fantasized about XP's relationship to self-help in the cinema.)


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

October 12, 2013 11:27 AM

StrangeLoop: This and That, Volume 3

[My notes on StrangeLoop 2013: Table of Contents]

Six good talks a day is about my limit. Seven for sure. Each creates so much mental activity that my brain soon loses the ability to absorb more. Then, I need a walk.

~~~~

After Jenny Finkel's talk on machine learning, someone asked if Prismatic's system had learned any features or weights that she found surprising. I thought her answer was interesting. I paraphrase: "No. As a scientist, you should understand why the system is the way that it is, or find the bug if it shouldn't be that way."

In a way, this missed the point. I'm guessing the questioner was looking to hear about a case that required the team to dig in, because the answer was correct but they didn't know why yet, or was incorrect and the bug wasn't obvious. But Finkel's answer shows how matter-of-fact scientists can be about what they find. The world is as it is, and scientists try to figure out why. That's all.

~~~~

The most popular corporate swag this year was stickers to adorn one's laptop case. I don't put stickers on my gear, but I like looking at other people's stickers. My favorites were the ones that did more than simply display the company name. Among them were asynchrony:

asynchrony laptop sticker

-- which is a company name but also a fun word in its own right -- and data-driven:

O'Reilly laptop sticker

-- by O'Reilly. I also like the bound, graph-paper notebooks that O'Reilly hands out. Classy.

~~~~

In a previous miscellany I mentioned Double Multitasking Guy. Not me, not this time. I carried no phone, as usual, and this time I left my laptop back in the hotel room. Not having any networked technology in hand creates a different experience, if not a better one.

Foremost, having no laptop affects my blogging. I can't take notes as quickly, or as voluminously. One of the upsides of this is that it's harder for me to distract myself by writing complete sentences or fact-checking vocabulary and URLs. Quick, what is the key idea here? What do I need to look up? What do I need to learn next?

~~~~

With video recording now standard at tech conferences, and with StrangeLoop releasing its videos so quickly now, a full blow-by-blow report of each talk becomes somewhat less useful. Some people find summary reports helpful, though, because they either don't want to watch the full talks or don't have the time to do so. Short reports let these folks keep their pulse on the state of the world. Others are looking for some indication of whether they want to invest the time to watch.

For me, the reports serve another useful purpose. They let me do a little light analysis and share my personal impressions of what I hear and learn. Fortunately, that sort of blog entry still finds an audience.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

September 28, 2013 12:17 PM

StrangeLoop: This and That, Volume 2

[My notes on StrangeLoop 2013: Table of Contents]

I am at a really good talk and look around the room. So many people are staring at their phones, scrolling away. So many others are staring at their laptops, typing away. The guy next to me: doing both at the same time. Kudos, sir. But you may have missed the point.

~~~~

Conference talks are a great source of homework problems. Sometimes, the talk presents a good problem directly. Other times, watching the talk sets my subconscious mind in motion, and it creates something useful. My students thank you. I thank you.

~~~~

Jenny Finkel talked about the difference between two kinds of recommenders: explorers, who forage for new content, and exploiters, who want to see what's already popular. The former discovers cool new things occasionally but fails occasionally, too. The latter is satisfied most of the time but rarely surprised. As a conference-goer, I felt this distinction at play in my own head this year. When selecting the next talk to attend, I have to take a few risks if I ever hope to find something unexpected. But when I fail, a small regret tugs at me.

~~~~

We heard a lot of confident female voices on the StrangeLoop stages this year. Some of these speakers have advanced academic degrees, or at least experience in grad school.

~~~~

The best advice I received on Day 1 perhaps came not from a talk but from the building:

The 'Do not Climb on Bears' sign on a Peabody statue

"Please do not climb on bears." That sounds like a good idea most everywhere, most all the time.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

September 23, 2013 4:22 PM

StrangeLoop: This and That, Volume 1

[My notes on StrangeLoop 2013: Table of Contents]

the Peabody Opera House's Broadway series poster

I'm working on a post about the compiler talks I attended, but in the meantime here are a few stray thoughts, mostly from Day 1.

The Peabody Opera House really is a nice place to hold a conference of this size. If StrangeLoop were to get much larger, it might not fit.

I really don't like the word "architected".

The talks were scheduled pretty well. Only once in two days did I find myself really wanting to go to two talks at the same time. And only once did I hear myself thinking, "I don't want to hear any of these...".

My only real regret from Day 1 was missing Scott Vokes's talk on data compression. I enjoyed the talk I went to well enough, but I think I would have enjoyed this one more.

What a glorious time to be a programming language theory weenie. Industry practitioners are going to conferences and attending talks on dependent types, continuations, macros, immutable data structures, and functional reactive programming.

Moon Hooch? Interesting name, interesting sound.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

August 29, 2013 4:31 PM

Asimov Sees 2014, Through Clear Eyes and Foggy

Isaac Asimov, circa 1991

A couple of years ago, I wrote Psychohistory, Economics, and AI, in which I mentioned Isaac Asimov and one way that he had influenced me. I never read Asimov or any other science fiction expecting to find accurate predictions of the future. What drew me in was the romance of the stories, dreaming "what if?" for a particular set of conditions. Ultimately, I was more interested in the relationships among people under different technological conditions than I was in the technology itself. Asimov was especially good at creating conditions that generated compelling human questions.

Some of the scenarios I read in Asimov's SF turned out to be wildly wrong. The world today is already more different from the 1950s than the world of the Foundation, set thousands of years in the future. Others seem eerily on the mark. Fortunately, accuracy is not the standard by which most of us judge good science fiction.

But what of speculation about the near future? A colleague recently sent me a link to Visit to the World's Fair of 2014, an article Asimov wrote in 1964 speculating about the world fifty years hence. As I read it, I was struck by just how far off he was in some ways, and by how close he was in others. I'll let you read the story for yourself. Here are a few selected passages that jumped out at me.

General Electric at the 2014 World's Fair will be showing 3-D movies of its "Robot of the Future," neat and streamlined, its cleaning appliances built in and performing all tasks briskly. (There will be a three-hour wait in line to see the film, for some things never change.)

3-D movies are now common. Housecleaning robots are not. And while some crazed fans will stand in line for many hours to see the latest comic-book blockbuster, going to a theater to see a movie has become a much less important part of the culture. People stream movies into their homes and into their hands. My daughter teases me for caring about the time any TV show or movie starts. "It's on Hulu, Dad." If it's not on Hulu or Netflix or the open web, does it even exist?

Any number of simultaneous conversations between earth and moon can be handled by modulated laser beams, which are easy to manipulate in space. On earth, however, laser beams will have to be led through plastic pipes, to avoid material and atmospheric interference. Engineers will still be playing with that problem in 2014.

There is no one on the moon with whom to converse. Sigh. The rest of this passage sounds like fiber optics. Our world is rapidly becoming wireless. If your device can't connect to the world wireless web, does it even exist?

In many ways, the details of technology are actually harder to predict correctly than the social, political, and economic implications of technological change. Consider:

Not all the world's population will enjoy the gadgety world of the future to the full. A larger portion than today will be deprived and although they may be better off, materially, than today, they will be further behind when compared with the advanced portions of the world. They will have moved backward, relatively.

Spot on.

When my colleague sent me the link, he said, "The last couple of paragraphs are especially relevant." They mention computer programming and a couple of its effects on the world. In this regard, Asimov's predictions meet with only partial success.

The world of A.D. 2014 will have few routine jobs that cannot be done better by some machine than by any human being. Mankind will therefore have become largely a race of machine tenders. Schools will have to be oriented in this direction. ... All the high-school students will be taught the fundamentals of computer technology, will become proficient in binary arithmetic and will be trained to perfection in the use of the computer languages that will have developed out of those like the contemporary "Fortran" (from "formula translation").

The first part of this paragraph is becoming truer every day. Many people husband computers and other machines as they do tasks we used to do ourselves. The second part is, um, not true. Relatively few people learn to program at all, let alone master a programming language. And how many people understand this t-shirt without first receiving an impromptu lecture on the street?

Again, though, Asimov is perhaps closer on what technological change means for people than on which particular technological changes occur. In the next paragraph he says:

Even so, mankind will suffer badly from the disease of boredom, a disease spreading more widely each year and growing in intensity. This will have serious mental, emotional and sociological consequences, and I dare say that psychiatry will be far and away the most important medical specialty in 2014. The lucky few who can be involved in creative work of any sort will be the true elite of mankind, for they alone will do more than serve a machine.

This is still speculation, but it is already more true than most of us would prefer. How much truer will it be in a few years?

My daughters will live most of their lives post-2014. That worries the old fogey in me a bit. But it excites me more. I suspect that the next generation will figure the future out better than mine, or the ones before mine, can predict it.

~~~~

PHOTO. Isaac Asimov, circa 1991. Britannica Online for Kids. Web. 2013 August 29. http://kids.britannica.com/comptons/art-136777.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

August 28, 2013 3:07 PM

Risks and the Entrepreneurs Who Take Them

Someone on the SIGCSE mailing list posted a link to an article in The Atlantic that explores a correlation between entrepreneurship, teenaged delinquency, and white male privilege. The article starts with

It does not strike me as a coincidence that a career path best suited for mild high school delinquents ends up full of white men.

and concludes with

To be successful at running your own company, you need a personality type that society is a lot more forgiving of if you're white.

The sender of the link was curious what educational implications these findings have, if any, for how we treat academic integrity in the classroom. That's an interesting question, though my personal tendency to follow rules and not rock the boat has always made me more sensitive to the behavior of students who employ the aphorism "ask for forgiveness, not permission" a little too cavalierly for my taste.

My first reaction to the claims of this article was tied to how I think about the kinds of risks that entrepreneurs take.

When most people in the start-up world talk about taking risks, they are talking about the risk of failure and, to a lesser extent, the risk of being unconventional, not the risk of being caught doing something wrong. In my personal experience, the only delinquent behavior our entrepreneurial former students could be accused of is not doing their homework as regularly as they should. Time spent learning for their business is time not spent on my course. But that's not delinquent behavior; it's curiosity focused somewhere other than my classroom.

It's not surprising, though, that teens who were willing to take legal risks are more likely to be willing to take business risks, and (sadly) legal risks in their businesses. Maybe I've simply been lucky to have worked with students and other entrepreneurs of high character.

Of course, there is almost certainly a white male privilege associated with the risk of failure, too. White males are often better positioned financially and socially than women or minorities to start over when a company fails. It's also easier to be unconventional and stand out from the crowd when you don't already stand out from the crowd due to your race or gender. That probably accounts for the preponderance of highly-educated white men in start-ups better than a greater willingness to partake in "aggressive, illicit, risk-taking activities".


Posted by Eugene Wallingford | Permalink | Categories: General

July 24, 2013 11:44 AM

Headline: "Dinosaurs Object to Meteor's Presence"

Don't try to sell a meteor to a dinosaur...

Nate Silver recently announced that he is leaving the New York Times for ESPN. Margaret Sullivan offers some observations on the departure, with insight into how political writers at the Times viewed Silver and his work:

... Nate disrupted the traditional model of how to cover politics.

His entire probability-based way of looking at politics ran against the kind of political journalism that The Times specializes in: polling, the horse race, campaign coverage, analysis based on campaign-trail observation, and opinion writing. ...

His approach was to work against the narrative of politics. ...

A number of traditional and well-respected Times journalists disliked his work. The first time I wrote about him I suggested that print readers should have the same access to his writing that online readers were getting. I was surprised to quickly hear by e-mail from three high-profile Times political journalists, criticizing him and his work. ...

Maybe Silver decided to acquiesce to Hugh MacLeod's advice. Maybe he just got a better deal.

The world changes, whether we like it or not. The New York Times and its journalists probably have the reputation, the expertise, and the strong base they need to survive the ongoing changes in journalism, with or without Silver. Other journalists don't have the luxury of being so cavalier.

I don't know any more about attitudes inside the New York Times than what I see reported in the press, but Sullivan's article made me think of one of Anil Dash's ten rules of the internet:

When a company or industry is facing changes to its business due to technology, it will argue against the need for change based on the moral importance of its work, rather than trying to understand the social underpinnings.

I imagine that a lot of people at the Times are indeed trying to understand the social underpinnings of the changes occurring in the media and trying to respond in useful ways. But that doesn't mean that everyone on the inside is, or even that the most influential and high-profile people in the trenches are. And that adds an internal social challenge to the external technological challenge.

Alas, we see much the same dynamic playing out in universities across the country, including my own. Some dinosaurs have been around for a long time. Others are near the beginning of their careers. The internal social challenges are every bit as formidable as the external economic and technological ones.


Posted by Eugene Wallingford | Permalink | Categories: General

July 23, 2013 9:46 AM

Some Meta-Tweeting Silliness

my previous tweet
missed an opportunity,
should have been haiku

(for @fogus)


Posted by Eugene Wallingford | Permalink | Categories: General

July 15, 2013 2:41 PM

Version Control for Writers and Publishers

Mandy Brown again, this time on writing tools without memory:

I've written of the web's short-term memory before; what Manguel trips on here is that such forgetting is by design. We designed tools to forget, sometimes intentionally so, but often simply out of carelessness. And we are just as capable of designing systems that remember: the word processor of today may admit no archive, but what of the one we build next?

This is one of those places where the software world has a tool waiting to reach a wider audience: the version control system. Programmers using version control can retrieve previous states of their code all the way back to its creation. The granularity of the versions is limited only by the frequency with which they "commit" the code to the repository.

The widespread adoption of version control and the existence of public histories at places such as GitHub have even given rise to a whole new kind of empirical software engineering, in which we mine a large number of repositories in order to understand better the behavior of developers in actual practice. Before, we had to contrive experiments, with no assurance that devs behaved the same way under artificial conditions.

Word processors these days usually have an auto-backup feature to save work as the writer types text. Version control could be built into such a feature, giving the writer access to many previous versions without the need to commit changes explicitly. But the better solution would be to help writers learn the value of version control and develop the habits of committing changes at meaningful intervals.
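As a sketch of what that habit might look like for a writer (a minimal illustration, assuming git is installed; the file name and commit messages here are invented for the example):

```shell
# A writer's minimal version-control habit, sketched with git.
mkdir -p /tmp/ms-demo && cd /tmp/ms-demo
git init -q .
git config user.name 'A Writer'
git config user.email 'writer@example.com'

printf 'It was a dark and stormy night.\n' > chapter1.txt
git add chapter1.txt
git commit -qm 'First draft of chapter 1'      # a meaningful interval: a finished draft

printf 'The night was dark, and stormy besides.\n' > chapter1.txt
git add chapter1.txt
git commit -qm 'Revise the opening sentence'   # another: a completed revision

git log --oneline    # every committed version of the draft remains retrievable
```

Each commit is a snapshot the writer can return to later, which is exactly the archive that today's word processors fail to keep.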

Digital version control offers several advantages over the writer's (and programmer's) old-style history of print-outs of previous versions, marked-up copy, and notebooks. An obvious one is space. A more important one is the ability to search and compare old versions more easily. We programmers benefit greatly from a tool as simple as diff, which can tell us the textual differences between two files. I use diff on non-code text all the time and imagine that professional writers could use it to better effect than I.
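A tiny illustration of diff on prose (the file names and contents are invented for the example):

```shell
# Two drafts of the same sentence, saved as plain text files.
mkdir -p /tmp/drafts-demo && cd /tmp/drafts-demo
printf 'The quick brown fox jumps over the dog.\n' > draft1.txt
printf 'The quick brown fox jumps over the lazy dog.\n' > draft2.txt

# diff reports exactly which lines changed between the two versions;
# it exits non-zero when the files differ, hence the `|| true`.
diff draft1.txt draft2.txt || true
```

The same command works on any plain-text manuscript, which is one more argument for writing in plain text rather than a binary word-processor format.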

The use of version control by programmers leads to profound changes in the practice of programming. I suspect that the same would be true for writers and publishers, too.

Most version control systems these days work much better with plain text than with the binary data stored by most word processing programs. As discussed in my previous post, there are already good reasons for writers to move to plain text and explicit mark-up schemes. Version control and text analysis tools such as diff add another layer of benefit. Simple mark-up systems like Markdown don't even impose much burden on the writer, resembling as they do how so many of us used to prepare text in the days of the typewriter.

Some non-programmers are already using version control for their digital research. Check out William Turkel's How To for doing research with digital sources. Others, such as The Programming Historian and A Companion to Digital Humanities, don't seem to mention it. But these documents refer mostly to programs for working with text. The next step is to encourage adoption of version control for writers doing their own thing: writing.

Then again, it has taken a long time for version control to gain such widespread acceptance even among programmers, and it's not yet universal. So maybe adoption among writers will take a long time, too.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 11, 2013 2:57 PM

Talking to the New University President about Computer Science

Our university recently hired a new president. Yesterday, he and the provost came to a meeting of the department heads in humanities, arts, and sciences, so that he could learn a little about the college. The dean asked each head to introduce his or her department in one minute or less.

I came in under a minute, as instructed. Rather than read a litany of numbers that he can read in university reports, I focused on two high-level points:

  • Major enrollment has recovered nicely since the deep trough after the dot.com bust and is now steady. We have near-100% placement, but local and state industry could hire far more graduates.
  • For the last few years we have also been working to reach more non-majors, which is a group we under-serve relative to most other schools. This should be an important part of the university's focus on STEM and STEM teacher education.

I closed with a connection to current events:

We think that all university graduates should understand what 'metadata' is and what computer programs can do with it -- enough so that they can understand the current stories about the NSA and be able to make informed decisions as a citizen.

I hoped that this would be provocative and memorable. The statement elicited laughs and head nods all around. The president commented on the Snowden case, asked me where I thought he would land, and made an analogy to The Man Without a Country. I pointed out that everyone wants to talk about Snowden, including the media, but that's not even the most important part of the story. Stories about people are usually of more interest than stories about computer programs and fundamental questions about constitutional rights.

I am not sure how many people believe that computer science is a necessary part of a university education these days, or at least the foundations of computing in the modern world. Some schools have computing or technology requirements, and there is plenty of press for the "learn to code" meme, even beyond the CS world. But I wonder how many US university graduates in 2013 understand enough computing (or math) to understand this clever article and apply that understanding to the world they live in right now.

Our new president seemed to understand. That could bode well for our department and university in the coming years.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 08, 2013 1:05 PM

A Random Thought about the Metadata and Government Surveillance

In a recent mischievous mood, I decided it might be fun to see the following.

The next whistleblower with access to all the metadata that the US government is storing on its citizens assembles a broad list of names: Republican and Democrat; legislative, executive, and judicial branches; public officials and private citizens. The only qualification for getting on the list is that the person has uttered any variation of the remarkably clueless statement, "If you aren't doing anything wrong, then you have nothing to hide."

The whistleblower then mines the metadata and, for each person on this list, publishes a brief that demonstrates just how much someone with that data can conclude -- or insinuate -- about a person.

If they haven't done anything wrong, then they don't have anything to worry about. Right?


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

June 10, 2013 2:41 PM

Unique in Exactly the Same Way

Ah, the idyllic setting of my youth:

When people refer to "higher education" in this country, they are talking about two systems. One is élite. It's made up of selective schools that people can apply to -- schools like Harvard, and also like U.C. Santa Cruz, Northeastern, Penn State, and Kenyon. All these institutions turn most applicants away, and all pursue a common, if vague, notion of what universities are meant to strive for. When colleges appear in movies, they are verdant, tree-draped quadrangles set amid Georgian or Gothic (or Georgian-Gothic) buildings. When brochures from these schools arrive in the mail, they often look the same. Chances are, you'll find a Byronic young man reading "Cartesian Meditations" on a bench beneath an elm tree, or perhaps his romantic cousin, the New England boy of fall, a tousle-haired chap with a knapsack slung back on one shoulder. He is walking with a lovely, earnest young woman who apparently likes scarves, and probably Shelley. They are smiling. Everyone is smiling. The professors, who are wearing friendly, Rick Moranis-style glasses, smile, though they're hard at work at a large table with an eager student, sharing a splayed book and gesturing as if weighing two big, wholesome orbs of fruit. Universities are special places, we believe: gardens where chosen people escape their normal lives to cultivate the Life of the Mind.

I went to a less selective school than the ones mentioned here, but the vague ideal of higher education was the same. I recognized myself, vaguely, in the passage about the tousle-haired chap with a knapsack, though on a Midwestern campus. I certainly pined after a few lovely, earnest young women with a fondness for scarves and the Romantic poets in my day. These days, I have become the friendly, glasses-wearing, always-smiling prof in the recruiting photo.

The descriptions of movie scenes and brochures, scarves and Shelley and approachable professors, reminded me most of something my daughter told me as she waded through recruiting literature from so many schools a few years ago, "Every school is unique, dad, in exactly the same way." When the high school juniors see through the marketing facade of your pitch, you are in trouble.

That unique-in-the-same-way character of colleges and university pitches is a symptom of what lies at the heart of the coming "disruption" of what we all think of as higher education. The traditional ways for a school to distinguish itself from its peers, and even from schools it thinks of as lesser rivals, are becoming less effective. I originally wrote "disappearing", but they are now ubiquitous, as every school paints the same picture, stresses the same positive attributes, and tries not to talk too much about the negatives they and their peers face. Too many schools chasing too few tuition-paying customers accelerates the process.

Trying to protect the ideal of higher education is a noble effort now being conducted in the face of a rapidly changing landscape. However, the next sentence of the recent New Yorker article Laptop U, from which the passage quoted above comes, reminds us:

But that is not the kind of higher education most Americans know. ...

It is the other sort of higher education that will likely be the more important battleground on which higher ed is disrupted by technology.

We are certainly beginning to have such conversations at my school, and we are starting to hear rumblings from outside. My college's dean and our new university president recently visited the Fortune 100 titan that dominates local industry. One of the executives there gave them several documents they've been reading there, including "Laptop U" and the IPPR report mentioned in it, "An Avalanche is Coming: Higher Education and the Revolution Ahead".

It's comforting to know your industry partners value you enough to want to help you survive a coming revolution. It's also hard to ignore the revolution when your partners begin to take for granted that it will happen.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

June 07, 2013 1:53 PM

Sentences to Ponder

Henry Rollins:

When one beast dumps you, summon the guts to find another. If it tries to kill you, the party has definitely started. Otherwise, life is a slow retirement.

Rollins is talking about why he's not making music anymore, but his observation applies to other professions. We all know programmers who are riding out the long tail of an intellectual challenge that died long ago. College professors, too.

I have to imagine that this is a sad life. It certainly leaves a lot of promise unfulfilled.

If you think you have a handle on the beast, then the beast has probably moved on. Find a new beast with which to do battle.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

June 04, 2013 2:43 PM

A Simple Confession

My Unix-toting brethren may revoke my CS card for saying this, but I really do like to install programs this way:

    Installing ApplicationX

    1. Open the disk image
    2. Drag ApplicationX to your Applications folder
    3. Eject the disk image

The app loses points if I really do have to drag it to the Applications folder; the Desktop should do.

I understand the value in ./configure and make and setting paths and... but it sure is nice when I don't have to use them.


Posted by Eugene Wallingford | Permalink | Categories: General

May 31, 2013 1:44 PM

Quotes of the Week, in Four Dimensions

Engineering.

Michael Bernstein, in A Generation Ago, A Thoroughly Modern Sampling:

The AI Memos are an extremely fertile ground for modern research. While it's true that what this group of pioneers thought was impossible then may be possible now, it's even clearer that some things we think are impossible now have been possible all along.

When I was in grad school, we read a lot of new and recent research papers. But the most amazing, most educational, and most inspiring stuff I read was old. That's often true today as well.

Science.

Financial Agile tweets:

"If it disagrees with experiment, it's wrong". Classic.

... with a link to The Scientific Method with Feynman, which has a wonderful ten-minute video of the physicist explaining how science works. Among its important points is that guessing is huge part of science. It's just that scientists have a way of telling which guesses are right and which are wrong.

Teaching.

James Boyk, in Six Words:

Like others of superlative gifts, he seemed to think the less gifted could do as well as he, if only they knew a few powerful specifics that could readily be conveyed. Sometimes he was right!

"He" is Leonid Hambro, who played with Victor Borge and P. D. Q. Bach but was also well-known as a teacher and composer. Among my best teachers have been some extraordinarily gifted people. I'm thankful for the time they tried to convey their insights to the likes of me.

Art.

Amanda Palmer, in a conference talk:

We can only connect the dots that we collect.

Palmer uses this sentence to explain in part why all art is about the artist, but it means something more general, too. You can build, guess, and teach only with the raw materials that you assemble in your mind and your world. So collect lots of dots. In this more prosaic sense, Palmer's sentence applies not only to art but also to engineering, science, and teaching.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

May 26, 2013 9:45 AM

Programming Magic and Business Skeuomorphism

Designer Craig Mod offers Marco Arment's The Magazine as an exemplar of Subcompact Publishing in the digital age: "No cruft, all substance. A shadow on the wall."; a minimal disruptor that capitalizes on the digital medium without tying itself down with the strictures of twentieth-century hardcopy technology.

After detailing the advantages of Arment's approach, Mod points out the primary disadvantage: you have to be able to write an iOS application. Which leads to this gem:

The fact that Marco -- a programmer -- launched one of the most 'digitally indigenous' contemporary tablet publications is indicative of two things:
  1. Programmers are today's magicians. In many industries this is obvious, but it's now becoming more obvious in publishing. Marco was able to make The Magazine happen quickly because he saw that Newsstand was underutilized and understood its capabilities. He knew this because he's a programmer. Newsstand wasn't announced at a publishing conference. It was announced at the WWDC.
  2. The publishing ecosystem is now primed for complete disruption.

If you are a non-programmer with ideas, don't think "I just need a programmer"; instead think, "I need a technical co-founder". A lot of people think of programming as Other, as a separate world from what they do. Entrepreneurs such as Arment, and armies of young kids writing video games and apps for their friends, know instead that it is a tool they can use to explore their interests.

Mod offers a nice analogy from the design world to explain why entrenched industry leaders and even prospective entrepreneurs tend to fall into the trap of mimicking old technology in their new technologies: business skeuomorphism.

For example, designers "bring the mechanical camera shutter sound to digital cameras because it feels good" to users. In a similar way, a business can transfer a decision made under the constraints of one medium or market into a new medium or market in which the constraints no longer apply. Under new constraints, and with new opportunities, the decision is no longer a good one, let alone necessary or optimal.

As usual, I am thinking about how these ideas relate to the disruption of university education. In universities, as in the publishing industry, business skeuomorphism is rampant. What is the equivalent of the Honda N360 in education? Is it Udacity or Coursera? Enstitute? Or something simpler?


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

May 17, 2013 3:26 PM

Pirates and Tenure

I recently read The Sketchbook of Susan Kare, "the Artist Who Gave Computing a Human Face", which referred to the Apple legend of the Pirate Flag:

[Kare's] skull-and-crossbones design would come in handy when Jobs issued one of his infamous motivational koans to the Mac team: "It's better to be a pirate than join the Navy."

For some reason, that line brought to mind a favorite saying of one of my friends, Sid Kitchel:

Real men don't accept tenure.

If by some chance they do accept tenure, they should at least never move into administration, even temporarily. It's a bad perch from which to be a pirate.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

April 23, 2013 4:16 PM

"Something Bigger Than Me"

In this interview with The Setup, Patric King talks about his current work:

Right now, my focus is bringing well-designed marketing to industries I believe in, to help them develop more financing. ... It is not glamorous, but it is the right thing to do. Designing pretty things is nice, but it's time for me to do something bigger than me.

Curly says, 'one thing... just one thing'

That's a pretty good position to be in: bringing value to a company or industry you believe in. Sometimes, we find such positions by virtue of the career path we choose. Those of us who teach as a part of our jobs are lucky in this regard.

Other times, we have to make a conscious decision to seek positions of this sort, or create the company we want to be in. That's what King has done. His skill set gives him more latitude than many people have. Those of us who can create software have more freedom than most other people, too. What an opportunity.

King's ellipsis is filled with the work that matters to him. As much as possible, when the time is right, we all should find the work that replaces our own ellipses with something that really matters to us, and to the world.


Posted by Eugene Wallingford | Permalink | Categories: General

April 14, 2013 6:25 PM

Scientists Being Scientists

Watson and Crick announced their discovery of the double helix structure of DNA in Molecular Structure of Nucleic Acids, a marvel of concise science writing. It has been widely extolled for how much information it packs into a single page, including the wonderfully understated line, "It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material."

As I read this paper again recently, though, this passage stood out:

The previously published X-ray data on deoxyribose nucleic acid are insufficient for a rigorous test of our structure. So far as we can tell, it is roughly compatible with the experimental data, but it must be regarded as unproved until it has been checked against more exact results.

They are unpretentious sentences. They do nothing special, stating simply that more and better data are needed to test their hypothesis. This is not a time for hyperbole. It is a time to get back to the lab.

Just scientists being scientists.


Posted by Eugene Wallingford | Permalink | Categories: General

April 10, 2013 4:03 PM

Minor Events in the Revolution at Universities

This morning I ran across several articles that had me thinking yet again about the revolution I see happening in the universities (*).

First, there was this recent piece in the New York Times about software that grades essays. Such software is probably essential for MOOCs in many disciplines, but it would also be useful in large lecture sections of traditional courses at many universities. The software isn't perfect, and skeptics abound. But the creator of the EdX software discussed in the article says:

This is machine learning and there is a long way to go, but it's good enough and the upside is huge.

It's good enough, and the upside is huge. Entrenched players scoff. Classic disruption at work.

Then there was this piece from the Nieman Journalism Lab about an online Dutch news company that wants readers to subscribe to individual journalists. Is this really news in 2013? I read a lot of technical and non-technical material these days via RSS feeds from individual journalists and bloggers. Of course, that's not the model yet for traditional newspapers and magazines.

... but that's the news business. What about the revolution in universities? The Nieman Lab piece reminded me of an old article in Vanity Fair about Politico, a news site founded by a small group of well-known political journalists who left their traditional employers to start the company. They all had strong "personal brands" and journalistic credentials. Their readers followed them to their new medium. Which got me to thinking...

What would happen if the top 10% of the teachers at Stanford or Harvard or Williams College just walked out to start their own university?

Of course, in the time since that article was published, we have seen something akin to this, with the spin-off of companies like Coursera and Udacity. However, these new education companies are partnering with traditional universities and building off the brands of their partners. At this point in time, the brand of a great school still trumps the individual brands of most all its faculty. But one can imagine a bolder break from tradition.

What happens when technology gives a platform to a new kind of teacher who bypasses the academic mainstream to create and grow a personal brand? What happens when this new kind of teacher bands together with a few like-minded renegades to use the same technology to scale up to the size of a traditional university, or more?

That will never happen, or so many of us in the academy are saying. This sort of thinking is what makes the Dutch news company mentioned above seem like such a novelty in the world of journalism. Many journalists and media companies, though, now recognize the change that has happened around them.

Which leads to a final piece I read this morning, a short blog entry by Dave Winer about Ezra Klein's epiphany on how blogging and journalism are now part of a single fabric. Winer says:

It's tragic that it took a smart guy like Klein so long to understand such a basic structural truth about how news, his own profession, has been working for the last 15 years.

I hope we aren't saying the same thing about the majority of university professors fifteen or twenty years from now. As we see in computers that grade essays, sometimes a new idea is good enough, and the upside is huge. More and more people will experiment with good-enough ideas, and even ideas that aren't good enough yet, and as they do the chance of someone riding the upside of the wave to something really different increases. I don't think MOOCs are a long-term answer to any particular educational problem now or in the future, but they are one of the laboratories in which these experiments can be played out.

I also hope that fifteen or twenty years from now someone isn't saying about skeptical university professors what Winer says so colorfully about journalists skeptical of the revolution that has redefined their discipline while they worked in it:

The arrogance is impressive, but they're still wrong.

~~~~

(*).   Nearly four years later, Revolution Out There -- and Maybe In Here remains one of my most visited blog entries, and one that elicits more reader comments than most. I think it struck a chord.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

April 09, 2013 3:16 PM

Writing a Book Is Like Flying A Spaceship

I've always liked this quote from the preface of Pragmatic Ajax, by Gehtland, Galbraith, and Almaer:

Writing a book is a lot like (we imagine) flying a spaceship too close to a black hole. One second you're thinking "Hey, there's something interesting over there," and a picosecond later, everything you know and love has been sucked inside and crushed.

Programming can be like that, too, in a good way. Just be sure to exit the black hole on the other side.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

March 30, 2013 8:43 AM

"It's a Good Course, But..."

Earlier this week I joined several other department heads to eat lunch with a bunch of high school teachers who were on campus for the Physics Olympics. The teachers were talking shop about the physics courses at their schools, and eventually the conversation turned to AP Physics. One of the teachers said, "It's a good course, but..."

A lot of these teachers would rather not offer AP Physics at all. One teacher described how in earlier days they were able to teach an advanced physics course of their own design. They had freedom to adapt to the interest of their students and to try out new ideas they encountered at conferences. Even though the advanced physics course had first-year physics as a prerequisite, they had plenty of students interested and able to take the second course.

The introduction of AP Physics created some problems. It's a good course, they all agreed, but it is yet another AP course for their students to take, and yet another AP exam for the students to prepare for. Most students can't or don't want to take all the AP courses, due to the heavier workload and often grueling pace. So in the end, they lose potential students who choose not to take the physics class.

Several of these teachers tried to make this case to heads of their divisions or to their principals, but to no avail.

This makes me sad. I'd like to see as many students taking science and math courses in high school as possible, and creating unnecessary bottlenecks hurts that effort.

There is a lot of cultural pressure these days to accelerate the work that HS students do. K-12 school districts and their administrators see the PR boon of offering more, and more advanced courses. State legislators are creating incentives for students to earn college credit while in high school, and funding for schools can reflect that. Parents love the idea of their children getting a head start on college, both because it might save money down the line and because they earn some vicarious pleasure in the achievement of their children.

On top of all this, the students themselves often face a lot of peer pressure from their friends and other fellow students to be doing and achieving more. I've seen that dynamic at work as my daughters have gone through high school.

Universities don't seem as keen about AP as they used to, but they send a mixed message to parents and students. On the one hand, many schools give weight in their admission decisions to the number of AP courses completed. This is especially true with more elite schools, which use this measure as a way to demonstrate their selectivity. Yet many of those same schools are reluctant to give full credit to students who pass the AP exam, at least as major credit, and require students to take their intro course anyway.

This reluctance is well-founded. We don't see any students who have taken AP Computer Science, so I can't comment on that exam, but I've talked with several Math faculty here about their experiences with calculus. They say that AP Calculus teaches a lot of good material, but the rush to cover required calculus content often leaves students with weak algebra skills. Students manage to succeed in the course despite these weaknesses, but when they reach more advanced university courses -- even Calc II -- the weaknesses come back to haunt them.

As a parent of current and recent high school students, I have observed the student experience. AP courses try to prepare students for the "college experience" and as a result cover a lot of material. The students see them as grueling experiences, even when they enjoy the course content.

That concerns me a bit. For students who know they want to be math or science majors, these courses are welcome challenges. For the rest of the students, who take the courses primarily to earn college credit or to explore the topic, these courses are so grueling that they dampen the fun of learning.

Call me old-fashioned, but I think of high school as a time to learn about a lot of different things, to sample broadly from all areas of study. Sure, students should build up the skills necessary to function in the workplace and go to college, but the emphasis should be on creating a broadly educated citizen, not training a miniature college student. I'd rather students get excited about learning physics, or math, or computer science, so that they will want to dive deeper when they get to college.

A more relaxed, more flexible calculus class or physics course might attract more students than a grueling AP course. This is particularly important at a time when everyone is trying to increase interest in STEM majors.

My daughters have had a lot of great teachers, both in and out of their AP courses. I wish some of those teachers had had more freedom to spark student interest in the topic, rather than student and teacher alike facing the added pressure of taking the AP exam, earning college credits, and affecting college admission decisions.

It's a good course, but feel the thrill first.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

February 28, 2013 2:52 PM

The Power of a Good Abstract

Someone tweeted a link to Philip Greenspun's M.S. thesis yesterday. This is how you grab your reader's attention:

A revolution in earthmoving, a $100 billion industry, can be achieved with three components: the GPS location system, sensors and computers in earthmoving vehicles, and SITE CONTROLLER, a central computer system that maintains design data and directs operations. The first two components are widely available; I built SITE CONTROLLER to complete the triangle and describe it here.

Now I have to read the rest of the thesis.

You could do worse than use Greenspun's first two sentences as a template for your next abstract:

A revolution in <major industry or research area> can be achieved with <n> components: <component-1>, <component-2>, ... and <component-n>. The first <n-1> components are widely available. I built <program name> to meet the final need and describe it here.
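For fun, the template can even be mechanized. Here is a playful Python sketch of my own (not anything from Greenspun's thesis; the function name `startling_abstract` is made up) that fills in the two-sentence template from its blanks:

```python
def startling_abstract(area, components, program):
    """Fill Greenspun's two-sentence abstract template.

    area       -- the major industry or research area
    components -- list of n >= 2 component descriptions
    program    -- name of the system you built (the final component)
    """
    n = len(components)
    parts = ", ".join(components[:-1])
    first = (f"A revolution in {area} can be achieved with {n} components: "
             f"{parts}, and {components[-1]}.")
    second = (f"The first {n - 1} components are widely available; "
              f"I built {program} to complete the triangle and describe it here.")
    return first + " " + second

# Approximately reconstruct the abstract that opens Greenspun's thesis:
print(startling_abstract(
    "earthmoving, a $100 billion industry",
    ["the GPS location system",
     "sensors and computers in earthmoving vehicles",
     "SITE CONTROLLER, a central computer system that maintains design data "
     "and directs operations"],
    "SITE CONTROLLER"))
```

Of course, the point of the template is the thinking it forces, not the string concatenation; the toy just makes the slots explicit.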

I am adding this template to my toolbox of writing patterns, alongside Kent Beck's four-sentence abstract (scroll down to Kent's name), which generalizes the idea of one startling sentence that arrests the reader. I also like good advice on how to write concise, incisive thesis statements, such as that in Matt Might's Advice for PhD Thesis Proposals and Olin Shivers's classic Dissertation Advice.

As with any template or pattern, overuse can turn a good idea into a cliché. If readers repeatedly see the same cookie-cutter format, it begins to look stale and will cause the reader to lose interest. So play with variations on the essential theme: I have solved an important problem. This is my solution.

If you don't have a great abstract, try again. Think hard about your own work. Why is this problem important? What is the big win from my solution? That's a key piece of advice in Might's advice for graduate students: state clearly and unambiguously what you intend to achieve.

Indeed, approaching your research in a "test-driven" way makes a lot of sense. Before embarking on a project, try to write the startling abstract that will open the paper or dissertation you write when you have succeeded. If you can't identify the problem as truly important, then why start at all? Maybe you should pick something more valuable to work on, something that matters enough that you can write a startling abstract for the result. That's a key piece of advice shared by Richard Hamming in his You and Your Research.

And whatever you do, don't oversell a minor problem or a weak solution with an abstract that promises too much. Readers will be disappointed at best and angry at worst. If you oversell even a little bit too many times, you will become like the boy who cried wolf. No one will believe your startling claim even when it's on the mark.

Greenspun's startling abstract ends as strongly as it begins. Of course, it helps if you can close with a legitimate appeal to ameliorating poverty around the world:

This area is exciting because so much of the infrastructure is in place. A small effort by computer scientists could cut the cost of earthmoving in half, enabling poor countries to build roads and rich countries to clean up hazardous waste.

I'm not sure adding another automating refactoring to Eclipse or creating another database library can quite rise to the level of empowering the world's poor. But then, you may have a different audience.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns, Teaching and Learning

February 17, 2013 12:16 PM

The Disruption of Education: B.F. Skinner, MOOCs, and SkillShare

Here are three articles, all different, but with a connection to the future of education.

•   Matthew Howell, Teaching Programming

Howell is a software developer who decided to start teaching programming on the side. He offers an on-line course through SkillShare that introduces non-programmers to the basic concepts of computer programming, illustrated using Javascript running in a browser. This article describes some of his reasons for teaching the course and shares a few things he has learned. One was:

What is the ideal class size? Over the year, I've taught classes that ranged in size from a single person to as many as ten. Through that experience, I've settled on five as my ideal.

Anyone who has taught intro programming in a high school or university is probably thinking, um, yeah, that would be great! I once taught an intermediate programming section with fifty or so people, though most of my programming courses have ranged from fifteen to thirty-five students. All other things being equal, smaller is better. Helping people learn to write and make things almost always benefits from one-on-one time and time for small groups to critique design together.

Class size is, of course, one of the key problems we face in education these days, both K-12 and university. For a lot of teaching, n = 5 is just about perfect. For upper-division project courses, I prefer four groups of four students, for a total of sixteen. But even at that size, the costs a university incurs in offering such sections are rising a lot faster than its revenues.

With MOOCs all the rage, Howell is teaching at the other end of spectrum. I expect the future of teaching to see a lot of activity at both scales. Those of us teaching in the middle face bleaker prospects.

•   Mike Caulfield, B. F. Skinner on Teaching Machines (1954)

Caulfield links to this video of B.F. Skinner describing a study on the optimal conditions for self-instruction using "teaching machines" in 1954. Caulfield points out that, while these days people like to look down on Skinner's behaviorist view of learning, he understood education better than many of his critics, and that others are unwittingly re-inventing many of his ideas.

For example:

[Skinner] understands that it is not the *machine* that teaches, but the person that writes the teaching program. And he is better informed than almost the entire current educational press pool in that he states clearly that a "teaching machine" is really just a new kind of textbook. It's what a textbook looks like in an age where we write programs instead of paragraphs.

That's a great crystallizing line by Caulfield: A "teaching machine" is what a textbook looks like in an age where we write programs instead of paragraphs.

Caulfield reminds us that Skinner said these things in 1954 and cautions us to stop asking "Why will this work?" about on-line education. That question presupposes that it will. Instead, he suggests we ask ourselves, "Why will this work this time around?" What has changed since 1954, or even 1994, that makes it possible this time?

This is a rightly skeptical stance. But it is wise to be asking the question, rather than presupposing -- as so many educators these days do -- that this is just another recursion of the "technology revolution" that never quite seems to revolutionize education after all.

•   Clayton Christensen in Why Apple, Tesla, VCs, academia may die

Christensen didn't write this piece, but reporter Cromwell Schubarth quotes him heavily throughout on how disruption may be coming to several companies and industries of interest to his Silicon Valley readership.

First, Christensen reminds young entrepreneurs that disruption usually comes from below, not from above:

If a newcomer thinks it can win by competing at the high end, "the incumbents will always kill you".

If they come in at the bottom of the market and offer something that at first is not as good, the legacy companies won't feel threatened until too late, after the newcomers have gained a foothold in the market.

We see this happening in higher education now. Yet most of my colleagues here on the faculty and in administration are taking the position that leaves legacy institutions most vulnerable to overthrow from below. "Coursera [or whoever] can't possibly do what we do", they say. "Let's keep doing what we do best, only better." That will work, until it doesn't.

Says Christensen:

But now online learning brings to higher education this technological core, and people who are very complacent are in deep trouble. The fact that everybody was trying to move upmarket and make their university better and better and better drove prices of education up to where they are today.

We all want to get better. It's a natural desire. My university understands that its so-called core competency lies in the niche between the research university and the liberal arts college, so we want to optimize in that space. As we seek to improve, we aspire to be, in our own way, like the best schools in their niches. As Christensen pointed out in The Innovator's Dilemma, this is precisely the trend that kills an institution when it meets a disruptive technology.

Later in the article, Christensen talks about how many schools are getting involved in online learning, sometimes investing significant resources, but almost always in service of the existing business model. Yet other business models are being born, models that newcomers are willing -- and sometimes forced -- to adopt.

One or more of these new models may be capable of toppling even the most successful institutions. Christensen describes one such candidate, a just-in-time education model in which students learn something, go off to use it, and then come back only when they need to learn what they need to know in order to take their next steps.

This sort of "learn and use", on-the-job learning, whether online or in person, is a very different way of doing things from school as we know it. It is not especially compatible with the way most universities are organized to educate people. It is, however, plenty compatible with on-line delivery and thus offers newcomers to the market the pebble they may use to bring down the university.

~~~~

The massively open on-line course is one form the newcomers are taking. The smaller, more intimate offering enabled by the likes of SkillShare is another. It may well be impossible for legacy institutions caught in the middle to fend off challenges from both directions.

As Caulfield suggests, though, we should be skeptical. We have seen claims about technology upending schools before. But we should adopt the healthy skepticism of the scientist, not the reactionary skepticism of the complacent or the scared. The technological playing field has changed. What didn't work in 1954 or 1974 or 1994 may well work this time.

Will it? Christensen thinks so:

Fifteen years from now more than half of the universities will be in bankruptcy, including the state schools. In the end, I am excited to see that happen.

I fear that universities like mine are at the greatest risk of disruption, should the wave that Christensen predicts come. I don't know many university faculty who are excited to see it happen. I just hope they aren't too surprised if it does.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

February 07, 2013 5:01 PM

Quotes of the Day

Computational Thinking Division. From Jon Udell, another lesson that programming and computing teach us which can be useful out in the world:

Focus on understanding why the program is doing what it's doing, rather than why it's not doing what you wanted it to.

This isn't the default approach of everyone. Most of my students have to learn this lesson as a part of learning how to program. But it can be helpful outside of programming, in particular by influencing how we interact with people. As Udell says, it can be helpful to focus on understanding why one's spouse or child or friend is doing what she is doing, rather than on why she isn't doing what you want.

Motivational Division. From the Portland Ballet, of all places, several truths about being a professional dancer that generalize beyond the studio, including:

There's a lot you don't know.
There may not be a tomorrow.
There's a lot you can't control.
You will never feel 100% ready.

So get to work, even if it means reading the book and writing the code for the fourth time. That is where the fun and happiness are. All you can affect, you affect by the work you do.

Mac Chauvinism Division. From Matt Gemmell, this advice on a particular piece of software:

There's even a Windows version, so you can also use it before you've had sufficient success to afford a decent computer.

But with enough work and a little luck, you can afford better next time.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Managing and Leading

February 06, 2013 10:06 AM

Shared Governance and the 21st Century University

Mitch Daniels, the new president of Purdue University, says this about shared governance in An Open Letter to the People of Purdue, his initial address to the university community:

I subscribe entirely to the concept that major decisions about the university and its future should be made under conditions of maximum practical inclusiveness and consultation. The faculty must have the strongest single voice in these deliberations, but students and staff should also be heard whenever their interests are implicated. I will work hard to see that all viewpoints are fairly heard and considered on big calls, including the prioritization of university budgetary investments, and endeavor to avoid surprises even on minor matters to the extent possible.

Shared governance implies shared accountability. It is neither equitable nor workable to demand shared governing power but declare that cost control or substandard performance in any part of Purdue is someone else's problem. We cannot improve low on-time completion rates and maximize student success if no one is willing to modify his schedule, workload, or method of teaching.

Participation in governance also requires the willingness to make choices. "More for everyone" or "Everyone gets the same" are stances of default, inconsistent with the obligations of leadership.

I love the phrase, inconsistent with the obligations of leadership.

Daniels recently left the governor's house in Indiana for the president's house at Purdue. His initial address is balanced, open, and forward-looking. It is respectful of what universities do and forthright about the need to recognize changes in the world around us, and to change in response.

My university is hiring a new president, too. Our Board of Regents will announce its selection tomorrow. It is probably too much to ask that we hire a new president with the kind of vision and leadership that Daniels brings to West Lafayette. I do hope that we find someone up to the task of leading a university in a new century.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

February 03, 2013 11:10 AM

Faulkner Teaches How to Study

novelist William Faulkner, dressed for work

From this Paris Review interview with novelist William Faulkner:

INTERVIEWER

Some people say they can't understand your writing, even after they read it two or three times. What approach would you suggest for them?

FAULKNER

Read it four times.

The first three times through the book are sunk cost. At this moment, you don't understand. What should you do? Read it again.

I'm not suggesting you keep doing the same failing things over and over. (You know what Einstein said about insanity.) If you read the full interview, you'll see that Faulkner isn't suggesting that, either. He's suggesting you get back to work.

Studying computer science is different from reading literature. We can approach our study perhaps more analytically than the novel reader. And we can write code. As an instructor, I try to have a stable of ideas that students can try when they are having trouble grasping a new concept or understanding a reading, such as:

  • Assemble a list of specific questions to ask your prof.
  • Talk to a buddy who seems to understand what you don't.
  • Type in the code from the paper character by character, thinking about it as you do.
  • Draw a picture.
  • Try to explain the parts you do understand to another student.
  • Focus on one paragraph, and work backward from there to the ideas it presumes you already know.
  • Write your own program.

One thing that doesn't work very well is being passive. Often, students come to my office and say, "I don't get it." They don't bring much to the session. But the best learning is not passive; it's active. Do something. Something new, or just more.

Faulkner is quite matter-of-fact about creating and reading literature. If it isn't right, work to make it better. Technique? Method? Sure, whatever you need. Just do the work.

This may seem like silly advice. Aren't we all working hard enough already? Not all of us, and not all the time. I sometimes find that when I'm struggling most, I've stopped working hard. I get used to understanding things quickly, and then suddenly I don't. Time to read it again.

I empathize with many of my students. College is a shock to them. Things came easily in high school, and suddenly they don't. These students mean well but seem genuinely confused about what they should do next. "Why don't I understand this already?"

Sometimes our impatience is born from such experience. But as Bill Evans reminds us, some problems are too big to conquer immediately. He suggests that we accept this up front and enjoy the whole trip. That's good advice.

Faulkner shrugs his shoulders and tells us to get back to work.

~~~~

PHOTO. William Faulkner, dressed for work. Source: The Centered Librarian.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

January 26, 2013 5:52 PM

Computing Everywhere: Indirection

Alice: The hardest word you'll ever be asked to spell is "ichdericious".

Bob: Yikes. Which word?

A few of us have had fun with the quotations in English and Scheme over the last few days, but this idea is bigger than symbols as data values in programs or even words and strings in natural language. They are examples of a key element of computational thinking, indirection, which occurs in real life all the time.

A few years ago, my city built a new water park. To account for the influx of young children in the area, the city dropped the speed limit in the vicinity of the pool from 35 MPH to 25 MPH. The speed limit in that area had been 35 MPH for a long time, and many drivers had a hard time adjusting to the change. So the city put up a new traffic sign a hundred yards up the road, to warn drivers of the coming change. It looks like this one:

traffic sign: 40 MPH speed limit ahead

The white image in the middle of this sign is a quoted version of what drivers see down the road, the usual:

traffic sign: 40 MPH speed limit

Now, many people slow down to the new speed limit well in advance, often before reaching even the warning sign. Maybe they are being safe. Then again, maybe they are confusing a sign about a speed limit sign with the speed limit sign itself.

If so, they have missed a level of indirection.

I won't claim that computer scientists are great drivers, but I will say that we get used to dealing with indirection as a matter of course. A variable holds a value. A pointer holds the address of a location, which holds a value. A URL refers to a web page. The list goes on.
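The variable-pointer-URL chain can be simulated with a small name table. This is a minimal sketch in Python; the `env` table and `deref` helper are invented for illustration:

```python
# A toy "environment" mapping names to values. Following the table one
# lookup at a time mirrors the chain: URL -> pointer -> variable -> value.
env = {
    "x": 42,    # a variable holds a value
    "p": "x",   # a "pointer" holds the name of a location
    "u": "p",   # a "URL" refers to the pointer
}

def deref(name):
    """Follow one level of indirection: look a name up in the table."""
    return env[name]

assert deref("x") == 42                 # no indirection
assert deref(deref("p")) == 42          # one level of indirection
assert deref(deref(deref("u"))) == 42   # two levels
```

Missing a level of indirection here is exactly the driver's mistake: reading `"x"` where you meant to read what `"x"` refers to.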

Indirection is a fundamental element in the fabric of computation. As computation becomes an integral part of nearly everyone's daily life, there is a lot to be gained by more people understanding the idea of indirection and recognizing opportunities to put it to work to mutual benefit.

Over the last few years, Jon Udell has been making a valiant attempt to bring this issue to the attention of computer scientists and non-computer scientists alike. He often starts with the idea of a hyperlink in a web page, or the URL to which it is tied, as a form of computing indirection that everyone already groks. But his goal is to capitalize on this understanding to sneak the communication strategy of pass by reference into people's mental models.

As Udell says, most people use hyperlinks every day but don't use them as well as they might, because the distinction between "pass by value" and "pass by reference" is not a part of their usual mental machinery:

The real problem, I think, is that if you're a newspaper editor, or a city official, or a citizen, pass-by-reference just isn't part of your mental toolkit. We teach the principle of indirection to programmers. But until recently there was no obvious need to teach it to everybody else, so we don't.

He has made the community calendar his working example of pass by reference, and his crusade:

In the case of calendar events, you're passing by value when you send copies of your data to event sites in email, or when you log into an events site and recopy data that you've already written down for yourself and published on your own site.

You're passing by reference when you publish the URL of your calendar feed and invite people and services to subscribe to your feed at that URL.

"Pass by reference rather than by value" is one of Udell's seven ways to think like the web, his take on how to describe computational thinking in a world of distributed, network media. That essay is a good start on an essential module in any course that wants to prepare people to live in a digital world. Without these skills, how can we hope to make the best use of technology when it involves two levels of indirection, as shared citations and marginalia do?
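Udell's calendar example translates directly into code. Here is a hedged sketch in Python, with made-up event data: deep-copying the feed plays the role of e-mailing event details, while aliasing it plays the role of subscribing to the feed's URL.

```python
import copy

# The one authoritative calendar feed (event details are invented).
feed = [{"event": "Town Concert", "date": "Feb 01"}]

# Pass by value: an events site receives its own copy of the data.
site_copy = copy.deepcopy(feed)

# Pass by reference: a subscriber holds a reference to the feed itself.
subscriber = feed

# The organizer reschedules the concert...
feed[0]["date"] = "Feb 08"

assert site_copy[0]["date"] == "Feb 01"    # the copy is now stale
assert subscriber[0]["date"] == "Feb 08"   # the subscriber sees the change
```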

Quotation in Scheme and pass-by-reference are different issues, but they are related in a fundamental way to the concept of indirection. We need to arm more people with this concept than just CS students learning how programming languages work.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

January 25, 2013 4:47 PM

More on Real-World Examples of Quotation

My rumination on real-world examples of quotation to use with my students learning Scheme sparked the imaginations of several readers. Not too surprisingly, they came up with better examples than my own... For example, musician and software developer Chuck Hoffman suggested:

A song, he sang.
"A song", he sang.

The meaning of these is clearly different depending on whether we treat a song as a variable or as a literal.

My favorite example came from long-time friend Joe Bergin:

"Lincoln" has seven letters.
Lincoln has seven letters.

Very nice. Joe beat me with my own example!
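Joe's pair maps directly onto code, with quotation marks playing the role of Scheme's quote. A small Python analogue, with an invented binding for the name:

```python
# The binding is made up for illustration: the name "Lincoln" denotes a value.
Lincoln = "our sixteenth president"

assert len("Lincoln") == 7    # quoted: we ask about the name itself
assert len(Lincoln) == 23     # unquoted: we ask about the value it denotes
```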

As Chuck wrote, song titles create an interesting ambiguity: is someone singing a certain song, or singing in a way described by words that happen also to be a song's title? I have certainly found it hard to find words that are both part of a title or a reference and flow seamlessly in a sentence.

This turns out to be a fun form of word play, independent of its use as a teaching example. Feel free to send me your favorites.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

January 20, 2013 10:28 AM

Scored Discussions

My wife has been on a long-term substitute teaching assignment for the last few weeks. Yesterday, I ran across the following rubric used by one of the middle school teachers there to grade "scored discussions". The class reads a book, which they discuss as a group. Students are evaluated by their contribution to the discussion, including their observable behavior.

Productive behavior
  • Uses positive body language and eye contact (5)
  • Makes a relevant comment (1)
  • Offers supporting evidence (2)
  • Uses an analogy (3)
  • Asks a clarifying question (2)
  • Listens actively -- rephrases comment before responding (3)
  • Uses good speaking skills -- clear speech, loud enough, not too fast (2)

Nonproductive behavior

  • Not paying attention (-2)
  • Interrupting (-3)
  • Irrelevant comment (-2)
  • Monopolizing (-3)

Most adults, including faculty, should be glad that their behavior is not graded according to this standard. I daresay that many of us would leave meetings with a negative score more often than we would like to admit.

I think I'll use this rubric to monitor my own behavior at the next meeting on my calendar.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

December 31, 2012 8:22 AM

Building Things and Breaking Things Down

As I look toward 2013, I've been thinking about Alan Kay's view of CS as science [ link ]:

I believe that the only kind of science computing can be is like the science of bridge building. Somebody has to build the bridges and other people have to tear them down and make better theories, and you have to keep on building bridges.

In 2013, what will I build? What will I break down, understand, and help others to understand better?

One building project I have in mind is an interactive text. One analysis project in mind involves functional design patterns.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Patterns

December 29, 2012 8:47 AM

Beautiful Sentences

Matthew Ward, in the Translator's Note to "The Stranger" (Vintage Books, 1988):

I have also attempted to venture further into the letter of Camus's novel, to capture what he said and how he said it, not what he meant. In theory, the latter should take care of itself.

This approach works pretty well for most authors and most books, I imagine.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

December 12, 2012 4:18 PM

Be a Driver, Not a Passenger

Some people say that programming isn't for everyone, just as knowing how to tinker under the hood of one's car isn't for everyone. Some people design and build cars; other people fix them; and the rest of us use them as high-level tools.

Douglas Rushkoff explains why this analogy is wrong:

Programming a computer is not like being the mechanic of an automobile. We're not looking at the difference between a mechanic and a driver, but between a driver and a passenger. If you don't know how to drive the car, you are forever dependent on your driver to take you where you want to go. You're even dependent on that driver to tell you when a place exists.

This is CS Education week, "a highly distributed celebration of the impact of computing and the need for computer science education". As a part of the festivities, Rushkoff was scheduled to address members of Congress and their staffers today about "the value of digital literacy". The passage quoted above is one of ten points he planned to make in his address.

As good as the other nine points are -- and several are very good -- I think the distinction between driver and passenger is the key, the essential idea for folks to understand about computing. If you can't program, you are not a driver; you are a passenger on someone else's trip. They get to decide where you go. You may want to invent a new place entirely, but you don't have the tools of invention. Worse yet, you may not even have the tools you need to imagine the new place. The world is as it is presented to you.

Don't just go along for the ride. Drive.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

December 09, 2012 5:12 PM

Just Build Things

The advantage of knowing how to program is that you can. The danger of knowing how to program is that you will want to.

From Paul Graham's How to Get Startup Ideas:

Knowing how to hack also means that when you have ideas, you'll be able to implement them. That's not absolutely necessary..., but it's an advantage. It's a big advantage, when you're considering an idea ..., if instead of merely thinking, "That's an interesting idea," you can think instead, "That's an interesting idea. I'll try building an initial version tonight."

Writing programs, like any sort of fleshing out of big ideas, is hard work. But what's the alternative? Not being able to program, in which case you'll just need a programmer.

If you can program, what should you do?

[D]on't take any extra classes, and just build things. ... But don't feel like you have to build things that will become startups. That's premature optimization. Just build things.

Even the professor in me has to admit this is true. You will learn a lot of valuable theory, tools, and practices in class. But when a big idea comes to mind, you need to build it.

As Graham says, perhaps the best way that universities can help students start startups is to find ways to "leave them alone in the right way".

Of course, programming skills are not all you need. You'll probably need to be able to understand and learn from users:

When you find an unmet need that isn't your own, it may be somewhat blurry at first. The person who needs something may not know exactly what they need. In that case I often recommend that founders act like consultants -- that they do what they'd do if they'd been retained to solve the problems of this one user.

That's when those social science courses can come in handy.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 23, 2012 9:34 AM

In the Spirit of the Weekend

I am thankful for human beings' capacity to waste time.

We waste it in the most creative ways. My life is immeasurably better because other people have wasted time and created art and literature. Even much of the science and technology I enjoy came from people noodling around in their free time. The universe has blessed me, and us.

~~~~

At my house, Thanksgiving lasts the whole weekend. I don't mind writing a Thanksgiving blog the day after, even though the rest of the world has already moved on to Black Friday and the next season on the calendar. My family is, I suppose, wasting time.

This note of gratitude was prompted by reading a recent joint interview with Brian Eno and Ha-Joon Chang, oddities in their respective disciplines of music and economics. I am thankful for oddities such as Eno and Chang, who add to the world in ways that I cannot. I am also thankful that I live in a world that provides me access to so much wonderful information with such ease. I feel a deep sense of obligation to use my time in a way that repays these gifts I have been given.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

November 20, 2012 12:20 PM

The Paper Was Rejected, But Do Readers Care?

The research paper I discussed in a recent blog entry on student use of a new kind of textbook has not been published yet. It was rejected by ICER 2012, a CS education conference, for what are surely good reasons from the reviewers' perspective. The paper neither describes the results of an experiment nor puts the evaluation in the context of previous work. As the first study of this sort, though, that would be difficult to do.

That said, I did not hesitate to read the paper and try to put its findings to use. The authors have a solid reputation for doing good work, and I trust them to have done reasonable work and to have written about it honestly. Were there substantial flaws with the study or the paper, I trusted myself to take them into account as I interpreted and used the results.

I realize that this sort of thing happens every day, and has for a long time: academics reading technical reports and informal papers to learn from the work of their colleagues. But given the state of publishing these days, both academic and non-academic, I couldn't help but think about how the dissemination of information is changing.

Mark Guzdial's blog is a perfect example. He has developed a solid reputation as a researcher and as an interpreter of other people's work. Now, nearly every day, we can all read his thoughts about his work, the work of others, and the state of the world. Whether the work is published in a journal or conference or not, it will reach an eager audience. He probably still needs to publish in traditional venues occasionally in order to please his employer and to maintain a certain stature, but I suspect that he no longer depends upon that sort of publication in the way researchers did ten or thirty years ago.

True, Guzdial developed his reputation in part by publishing in journals and conferences, and they can still play that role for new researchers who are just developing their reputations. But there are other ways for the community to discover new work and recognize the quality of researchers and writers. Likewise, journals and conferences still can play a role in archiving work for posterity. But as the internet and web reach more and more people, and as we learn to do a better job of archiving what we publish there, that role will begin to fade.

The gates really are coming down.


Posted by Eugene Wallingford | Permalink | Categories: General

October 11, 2012 3:21 PM

Writing Advice for Me

I'm not a big fan of Top Ten lists on the web, unless they come from fellow Hoosier David Letterman. But I do like Number 9 on this list of writing tips:

Exclude all words that just don't add anything. This was the very best piece of advice I read when I first started blogging. Carefully re-read posts that you have written and try to remove all the extraneous words that add little or nothing.

This advice strikes a chord in me because I struggle to follow it, even when I am writing about it.


Posted by Eugene Wallingford | Permalink | Categories: General

October 01, 2012 7:40 AM

StrangeLoop 9: This and That

the Peabody Opera House

Every conference leaves me with unattached ideas floating around after I write up all my entries. StrangeLoop was no different. Were I a master of Twitter, one who live-posted throughout the conference, many of these might have been masterful tweets. Instead, they are bullets in a truly miscellaneous blog entry.

~~~~

The conference was at the Peabody Opera House (right), an 80-year-old landmark in downtown St. Louis. It shares a large city block with the Scottrade Center, home of the NHL Blues, and a large parking garage ideally positioned for a conference-goer staying elsewhere. The main hall was perfect for plenary sessions, and four side rooms fit the parallel talks nicely.

~~~~

When I arrived at 8:30 AM on Monday, the morning refreshment table contained, in addition to the perfunctory coffee, Diet Mountain Dew in handy 12-ounce bottles. Soda was available all day. This made me happy.

Sadly, the kitchen ran out of Diet Dew before Tuesday morning. Such is life. I still applaud the conference for meeting the preferences of its non-coffee drinkers.

~~~~

During the Akka talk, I saw some code on a slide that made me mutter Ack! under my breath. That made me chuckle.

~~~~

"Man, there are a lot of Macs and iPads in this room."
-- me, at every conference session

~~~~

the St. Louis Arch, down the street from the Opera House

On Monday, I saw @fogus across the room in his Manfred von Thun jersey. I bow to you, sir. Joy is one of my favorites.

After seeing @fogus's jersey tweet, I actually ordered one for myself. Unfortunately, it didn't arrive in time for the conference. A nice coincidence: Robert Floyd spent most of his career at Stanford, whose mascot is... the Cardinal. (The color, not the bird.)

~~~~

During Matthew Flatt's talk, I couldn't help but think Alan Kay would be proud. This is programming taken to the extreme. Kay always said that Smalltalk didn't need an operating system; just hook those primitives directly to the underlying metal. Racket might be able to serve as its own OS, too.

~~~~

I skipped a few talks. During lunch each day, I went outside to walk. That's good for my knee as well as my head. Then I skipped one talk that I wanted to see at the end of each day, so that I could hit the exercise bike and pool. The web will surely provide me reports of both ( The Database as a Value and The State of JavaScript ). Sometimes, fresh air and exercise are worth the sacrifice.

~~~~

my StrangeLoop 2012 conference badge

I turned my laptop off for the last two talks of the conference that I attended. I don't think the result was thinking more or better, but I definitely thought differently. Global connections seemed to surface more quickly, whereas typing notes seemed to keep me focused on local connections.

~~~~

Wednesday morning, as I hit the road for home, I ran into rush hour traffic driving toward downtown St. Louis. It took us 41 minutes to travel 12 miles. Much as I love St. Louis and this conference, I was glad to be heading home to a less crowded place.

~~~~

Even though I took walks at lunch, I was able to sneak into the lunch talks late. Tuesday's talk on Plato (OOP) and Aristotle (FP) brought a wistful smile. I spent a couple of years in grad school drawing inspiration for our lab's approach to knowledge-based systems from the pragmatists, in contrast to the traditional logical views of much of the AI world.

That talk contained two of my favorite sentences from the conference:

Computer scientists are applied metaphysicists.

And:

We have the most exciting job in the history of philosophy.

Indeed. We can encode, implement, and experiment with every model of the world we create. It is good to be the king.

This seems like a nice way to close my StrangeLoop posts for now. Now, back to work.


Posted by Eugene Wallingford | Permalink | Categories: General

September 19, 2012 4:57 PM

Don't Stop The Car

I'm not a Pomodoro guy, but this advice from The Timer Knows Best applies more generally:

Last month I was teaching my wife to drive [a manual transmission car], and it's amazing how easy stick shifting is if the car is already moving.... However, when the car is stopped and you need to get into 1st gear, it's extremely difficult. [So many things can go wrong:] too little gas, too much clutch, etc. ...

The same is true with the work day. Once you get going, you want to avoid coming to a standstill and having to get yourself moving again.

As I make the move from runner to cyclist, I have learned how much easier it is to keep moving on a bike than it is to start moving.

This is true of programming, too. Test-driven development helps us get started by encouraging us to focus on one new piece of functionality to implement. Keep it small, make it work, and move on to another small step. Pretty soon you are moving, and you are on your way.

Another technique many programmers use to get started is to write a failing test just before stopping the day before. This failing test focuses you even more quickly and recruits your own memory to help recreate the feeling of motion. It's like a way to leave the car running in second gear.

I'm trying to help my students, who are mostly still learning how to write code, learn how to get started when they program. Many of them seem repeatedly to find themselves sitting still, grinding their gears and trying to figure out how to write the next bit of code and get it running. Ultimately, the answer may come down to the same thing we learn when we learn to drive a stick: practice, practice, practice, and eventually you get the feel of how the gearshift works.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

August 31, 2012 3:22 PM

Two Weeks Along the Road to OOP

The month has flown by, preparing for and now teaching our "intermediate computing" course. Add to that a strange and unusual set of administrative issues, and I've found no time to blog. I did, however, manage to post what has become my most-retweeted tweet ever:

I wish I had enough money to run Oracle instead of Postgres. I'd still run Postgres, but I'd have a lot of cash.

That's an adaptation of a tweet originated by @petdance and retweeted my way by @logosity. I polished it up, sent it off, and -- it took off for the sky. It's been fun watching its ebb and flow as it reaches new sub-networks of people. From this experience I must learn at least one lesson: a lot of people are tired of sending money to Oracle.

The first two weeks of my course have led the students a few small steps toward object-oriented programming. I am letting the course evolve, with a few guiding ideas but no hard-and-fast plan. I'll write about the course's structure after I have a better view of it. For now, I can summarize the first four class sessions:

  1. Run a simple "memo pad" app, trying to identify behavior (functions) and state (persistent data). Discuss how different groupings of the functions and data might help us to localize change.
  2. Look at the code for the app. Discuss the organization of the functions and data. See a couple of basic design patterns, in particular the separation of model and view.
  3. Study the code in greater detail, with a focus on the high-level structure of an OO program in Java.
  4. Study the code in greater detail, with a focus on the lower-level structure of classes and methods in Java.

The reason we can spend so much time talking about a simple program is that students come to the course without (necessarily) knowing any Java. Most come with knowledge of Python or Ada, and their experiences with such different languages create an interesting space in which to encounter Java. Our goal this semester is for students to learn their second language as much as possible on their own, rather than having me "teach" it to them. I'm trying to expose them to a little more of the language each day, as we learn about design in parallel. This approach works reasonably well with Scheme and functional programming in a programming languages course. I'll have to see how well it works for Java and OOP, and adjust accordingly.

Next week we will begin to create things: classes, then small systems of classes. Homework 1 has them implementing a simple array-based class to an interface. It will be our first experience with polymorphic objects, though I plan to save that jargon for later in the course.
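As a rough illustration of the kind of assignment described (the interface, class, and method names here are my invention, not the actual homework), an array-based class implementing a simple interface might look something like this in Java:

```java
// A simple interface and an array-based implementation of it.
interface MemoPad {
    void add(String memo);
    int size();
    String get(int i);
}

class ArrayMemoPad implements MemoPad {
    private String[] memos = new String[10];
    private int count = 0;

    public void add(String memo) {
        if (count == memos.length) {            // grow the array when full
            String[] bigger = new String[2 * memos.length];
            System.arraycopy(memos, 0, bigger, 0, count);
            memos = bigger;
        }
        memos[count++] = memo;
    }

    public int size() { return count; }
    public String get(int i) { return memos[i]; }
}

public class PadDemo {
    public static void main(String[] args) {
        // The variable is typed by the interface, not the class:
        // a different implementation could be swapped in unchanged.
        MemoPad pad = new ArrayMemoPad();
        pad.add("grade homework 1");
        pad.add("prep Friday's class");
        System.out.println(pad.size() + " memos; first: " + pad.get(0));
    }
}
```

Declaring `pad` with the interface type is the polymorphism at work, even before we give it that name in class.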

Finally, this is the new world of education: my students are sending me links to on-line sites and videos that have helped them learn programming. They want me to check them out and share them with the other students. Today I received a link to The New Boston, which has among its 2500+ videos eighty-seven beginning Java and fifty-nine intermediate Java titles. Perhaps we'll come to a time when I can out-source all instruction on specific languages and focus class time on higher-level issues of design and programming...


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

August 09, 2012 1:36 PM

Sentences to Ponder

In Why Read?, Mark Edmundson writes:

A language, Wittgenstein thought, is a way of life. A new language, whether we learn it from a historian, a poet, a painter, or a composer of music, is potentially a new way to live.

Or from a programmer.

In computing, we sometimes speak of Perlis languages, after one of Alan Perlis's best-known epigrams: A language that doesn't affect the way you think about programming is not worth knowing. A programming language can change how we think about our craft. I hope to change how my students think about programming this fall, when I teach them an object-oriented language.

But for those of us who spend our days and nights turning ideas into programs, a way of thinking is akin to a way of life. That is why the wider scope of Wittgenstein's assertion strikes me as so appropriate for programmers.

Of course, I also think that programmers should follow Edmundson's advice and learn new languages from historians, writers, and artists. Learning new ways to think and live isn't just for humanities majors.

(By the way, I'm enjoying reading Why Read? so far. I read Edmundson's Teacher many years ago and recommend it highly.)


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

July 23, 2012 3:14 PM

Letting Go of Old Strengths

Ward Cunningham commented on what it's like to be "an old guy who's still a programmer" in his recent Dr. Dobb's interview:

A lot of people think that you can't be old and be good, and that's not true. You just have to be willing to let go of the strengths that you had a year ago and get some new strengths this year. Because it does change fast, and if you're not willing to do that, then you're not really able to be a programmer.

That made me think of the last comment I made in my posts on JRubyConf:

There is a lot of stuff I don't know. I won't run out of things to read and learn and do for a long, long time.

This is an ongoing theme in the life of a programmer, the life of a teacher, and the life of an academic: the choice we make each day between keeping up and settling down. Keeping up is a lot more fun, but it's work. If you aren't comfortable giving up what you were awesome at yesterday, it's even more painful. I've mostly been lucky to enjoy learning new stuff more than I've enjoyed knowing the old stuff. May you be so lucky.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

July 20, 2012 3:39 PM

A Philosopher of Imitation

Ian Bogost, in The Great Pretender: Turing as a Philosopher of Imitation, writes:

Intelligence -- whatever it is, the thing that goes on inside a human or a machine -- is less interesting and productive a topic of conversation than the effects of such a process, the experience it creates in observers and interlocutors.

This is a very nice one-sentence summary of Turing's thesis in Computing Machinery and Intelligence. I wrote a bit about Turing's ideas on machine intelligence a few months back, but the key idea in Bogost's essay relates more closely to my discussion in Turing's ideas on representation and universal machines.

In this centennial year of his birth, we can hardly go wrong in considering again and again the depth of Turing's contributions. Bogost uses a lovely turn of phrase in his title: a philosopher of imitation. What may sound like a slight or a trifle is, in fact, the highest of compliments. Turing made that thinkable.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 18, 2012 2:31 PM

Names, Values, and The Battle of Bull Run

the cover of 'Encyclopedia Brown Finds the Clues'

Author Donald Sobol died Monday. I know him best from his long-running series, Encyclopedia Brown. Like many kids of my day, I loved these stories. I couldn't get enough. Each book consisted of ten or so short mysteries solved by Encyclopedia or Sally Kimball, his de facto partner in the Brown Detective Agency. I wanted to be Encyclopedia.

The stories were brain teasers. Solving them required knowledge and, more important, careful observation and logical deduction. I learned to pay close attention while reading Encyclopedia Brown, otherwise I had no hope of solving the crime before Encyclopedia revealed the solution. In many ways, these stories prepared me for a career in math and science. They certainly were a lot of fun.

One of the stories I remember best after all these years is "The Case of the Civil War Sword", from the very first Encyclopedia Brown book. I'm not the only person who found it memorable; Rob Bricken ranks it #9 among the ten most difficult Encyclopedia Brown mysteries. The solution to this case turned on the fact that one battle had two different names. Northerners often named battles for nearby bodies of water or prominent natural features, while Southerners named them for the nearest town or prominent man-made features. So, the First Battle of Bull Run and the First Battle of Manassas were the same event.

This case taught me a bit of historical trivia and opened my mind to the idea that naming things from the Civil War was not trivial at all.

This story taught me more than history, though. As a young boy, it stood out as an example of something I surely already knew: names aren't unique. The same value can have different names. In a way, Encyclopedia Brown taught me one of my first lessons about computer science.
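In code, the lesson looks something like this small Java sketch: two variables can name the same object, just as two names can refer to the same battle.

```java
import java.util.ArrayList;
import java.util.List;

// Two names, one value -- the Bull Run/Manassas lesson in code.
public class TwoNames {
    public static void main(String[] args) {
        List<String> bullRun = new ArrayList<>();
        List<String> manassas = bullRun;   // a second name for the same object

        bullRun.add("July 21, 1861");

        // Both names see the same battle:
        System.out.println(manassas.get(0));       // prints "July 21, 1861"
        System.out.println(bullRun == manassas);   // same object: prints "true"
    }
}
```

Encyclopedia would approve: the names differ, but careful observation shows there is only one thing being named.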

~~~~

IMAGE: the cover of Encyclopedia Brown Finds the Clues, 1966. Source: Topless Robot.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 16, 2012 3:02 PM

Refactoring Everywhere: In Code and In Text

Charlie Stross is a sci-fi writer. Some of my friends have recommended his fiction, but I've not read any. In Writing a novel in Scrivener: lessons learned, he, well, describes what he has learned writing novels using Scrivener, an app for writers well known in the Mac OS X world.

I've used it before on several novels, notably ones where the plot got so gnarly and tangled up that I badly needed a tool for refactoring plot strands, but the novel I've finished, "Neptune's Brood", is the first one that was written from start to finish in Scrivener...

... It doesn't completely replace the word processor in my workflow, but it relegates it to a markup and proofing tool rather than being a central element of the process of creating a book. And that's about as major a change as the author's job has undergone since WYSIWYG word processing came along in the late 80s....

My suspicion is that if this sort of tool spreads, the long-term result may be better structured novels with fewer dangling plot threads and internal inconsistencies. But time will tell.

Stross's lessons don't all revolve around refactoring, but being able to manage and manipulate the structure of the evolving novel seems central to his satisfaction.

I've read a lot of novels that seemed like they could have used a little refactoring. I always figured it was just me.

The experience of writing anything in long form can probably be improved by a good refactoring tool. I know I find myself doing some pretty large refactorings when I'm working on the set of lecture notes for a course.

Programmers and computer scientists have the advantage of being more comfortable writing text in code, using tools such as LaTeX and Scribble, or homegrown systems. My sense, though, is that fewer programmers use tools like this, at least at full power, than might benefit from doing so.

Like Stross, I have a predisposition against using tools with proprietary data formats. I've never lost data stored in plaintext to version creep or application obsolescence. I do use apps such as VoodooPad for specific tasks, though I am keenly aware of the exit strategy (export to text or RTFD) and the pain trade-off at exit (the more VoodooPad docs I create, the more docs I have to remember to export before losing access to the app). One of the things I like most about MacJournal is that it's nothing but a veneer over a set of Unix directories and RTF documents. The flip side is that it can't do for me nearly what Scrivener can do.

Thinking about a prose writing tool that supports refactoring raises an obvious question: what sort of refactoring operations might it provide automatically? Some of the standard code refactorings might have natural analogues in writing, such as Extract Chapter or Inline Digression.

Thinking about automated support for refactoring raises another obvious question, the importance of which is surely as clear to novelists as to software developers: Where are the unit tests? How will we know we haven't broken the story?

I'm not being facetious. The biggest fear I have when I refactor a module of a course I teach is that I will break something somewhere down the line in the course. Your advice is welcome!


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

July 14, 2012 11:01 AM

"Most Happiness Comes From Friction"

Last time, I mentioned again the value in having students learn broadly across the sciences and humanities, including computer science. This is a challenge going in both directions. Most students like to concentrate on one area, for a lot of different reasons. Computer science looks intimidating to students in other majors, perhaps especially to the humanities-inclined.

There is hope. Earlier this year, the Harvard Magazine ran The Frisson of Friction, an essay by Sarah Zhang, a non-CS student who decided to take CS 50, Harvard's intro to computer science. Zhang tells the story of finding a thorny, semicolon-induced bug in a program (an extension for Google's Chrome browser) on the eve of her 21st birthday. Eventually, she succeeded. In retrospect, she writes:

Plenty of people could have coded the same extension more elegantly and in less time. I will never be as good a programmer as -- to set the standard absurdly high -- Mark Zuckerberg. But accomplishments can be measured in terms relative to ourselves, rather than to others. Rather than sticking to what we're already good at as the surest path to résumé-worthy achievements, we should see the value in novel challenges. How else will we discover possibilities that lie just beyond the visible horizon?

... Even the best birthday cake is no substitute for the deep satisfaction of accomplishing what we had previously deemed impossible -- whether it's writing a program or writing a play.

The essay addresses some of the issues that keep students from seeking out novel challenges, such as fear of low grades and fear of looking foolish. At places like Harvard, students who are used to succeeding find themselves boxed in by their friends' expectations, and their own, but those feelings are familiar to students at any school. Then you have advisors who subtly discourage venturing too far from the comfortable, out of their own unfamiliarity and fear. This is a social issue as big as any pedagogical challenge we face in trying to make introductory computer science more accessible to more people.

With work, we can help students feel the deep satisfaction that Zhang experienced. Overcoming challenges often leads to that feeling. She quotes a passage about programmers in Silicon Valley, who thrive on such challenges: "Most happiness probably comes from friction." Much satisfaction and happiness come out of the friction inherent in making things. Writing prose and writing programs share this characteristic.

Sharing the deep satisfaction of computer science is a problem with many facets. Those of us who know the satisfaction know it's a problem worth solving.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

July 13, 2012 12:02 PM

How Science -- and Computing -- Are Changing History

While reading a recent Harvard Magazine article about Eric Mazur's peer instruction technique in physics teaching, I ran across a link to an older paper that fascinated me even more! Who Killed the Men of England? tells several stories of research at the intersection of history, archaeology, genomics, evolution, demography, and simulation, such as the conquest of Roman England by the Anglo Saxons.

Not only in this instance, but across entire fields of inquiry, the traditional boundaries between history and prehistory have been melting away as the study of the human past based on the written record increasingly incorporates the material record of the natural and physical sciences. Recognizing this shift, and seeking to establish fruitful collaborations, a group of Harvard and MIT scholars have begun working together as part of a new initiative for the study of the human past. Organized by [professor of medieval history Michael] McCormick, who studies the fall of the Roman empire, the aim is to bring together researchers from the physical, life, and computer sciences and the humanities to explore the kinds of new data that will advance our understanding of human history.

... The study of the human past, in other words, has entered a new phase in which science has begun to tell stories that were once the sole domain of humanists.

I love history as much as computing and was mesmerized by these stories of how scientists reading the "material record" of the world are adding to our knowledge of the human past.

However, this is more than simply a one-way path of information flowing from scientists to humanists. The scientific data and models themselves are underconstrained. The historians, cultural anthropologists, and demographers are able to provide context to the data and models and so extract even more meaning from them. This is a true collaboration. Very cool.

The rise of science is erasing boundaries between the disciplines that we all studied in school. Scholars are able to define new disciplines, such as "the study of the human past", mentioned in the passage above. These disciplines are organized with a greater focus on what is being studied than on how we are studying it.

We are also blurring the line between history and pre-history. It used to be that history required a written record, but that is no longer a hard limit. Science can read nature's record. Computer scientists can build models using genomic data and migration data that suggest possible paths of change when the written and scientific record are incomplete. These ideas become part of the raw material that humanists use to construct a coherent story of the past.

This change in how we are able to study the world highlights the importance of a broad education, something I've written about a few times recently [ 1 | 2 | 3 ] and not so recently. This sort of scholarship is best done by people who are good at several things, or at least curious and interested enough in several things to get to know them intimately. As I wrote in Failure and the Liberal Arts, it's important both not to be too narrowly trained and not to be too narrowly "liberally educated".

Even at a place like Harvard, this can leave scholars in a quandary:

McCormick is fired with enthusiasm for the future of his discipline. "It is exciting. I jump up every morning. But it is also challenging. Division and department boundaries are real. Even with a generally supportive attitude, it is difficult [to raise funds, to admit students who are excellent in more than one discipline, and so on]. ..."

So I will continue to tell computer science students to take courses from all over the university, not just from CS and math. This is one point of influence I have as a professor, advisor, and department head. And I will continue to look for ways to encourage non-CS students to take CS courses and students outside the sciences to study science, including CS. As that paragraph ends:

"... This is a whole new way of studying the past. It is a unique intellectual opportunity and practically all the pieces are in place. This should happen here--it will happen, whether we are part of it or not."

"Here" doesn't have to be Harvard. There is a lot of work to be done.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

June 30, 2012 10:52 AM

"What Were Alleged to be Ideas"

James Webb Young begins his book A Technique for Producing Ideas with a prefatory note:

The subject is properly one which belongs to the professional psychologist, which I am not. This treatment of it, therefore, can have value only as an expression of the personal experience of one who has had to earn his living by producing what were alleged to be ideas.

With a little tweaking, such as occasionally substituting a different profession for psychologist, this would make a nice disclaimer for many of my blog entries.

Come to think of it, with a little tweaking, this could serve as the basis of a disclaimer for about 98% of the web.

Thanks to David Schmüdde for a pointer to Young's delightful little book.


Posted by Eugene Wallingford | Permalink | Categories: General

June 26, 2012 4:23 PM

Adventures in Advising

Student brings me a proposed schedule for next semester.

Me: "Are you happy with this schedule?"

Student: "If I weren't, why would I have made it?"

All I can think is, "Boy, are you gonna have fun as a programmer."


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

June 06, 2012 3:33 PM

Advice, Platitudes, and Reasonable Decisions

I recently listened to a short clip from Seth Godin's book "The Dip". In it, he quotes Vince Lombardi as saying, "winners never quit, and quitters never win", and then says something to the effect of:

Winners quit all the time. They just quit the right stuff at the right time.

This reminded me of my recent Good Ideas Aren't Always Enough, in which I talk briefly about Ward Cunningham's experience trying to create a universal mark-up language for wiki.

How did Ward know it was the right time to stop pushing for a universal mark-up? Perhaps success was right around the corner. Maybe he just needed a better argument, or a better example, or a better mark-up language.

Inherent in this sort of lesson is a generic variation of the Halting Problem. You can't be sure that an effort will fail until it fails. But the process may never fail explicitly, simply churning on forever. What then?

That's one of the problems with giving advice of the sort my entry gave, or of the sort that Godin gives in his book. The advice itself is empty, because the opposite advice is also true. You only know which advice is right in any given context after the fact -- if ever.

How did Ward know? I'm guessing a combination of:

  • knowledge about the problem,
  • experience with this problem and others like it,
  • relationship with the community of people involved,
  • and... a little luck.

And someone may come along some day with a better argument, or a better example, or a better mark-up language, and succeed. We won't know until it happens.

Maybe such advice is nothing more than platitude. Without any context, it isn't all that helpful, except as motivation to persevere in the face of a challenge (if you want to push on) or consolation in the face of a setback (if you want to focus your energy elsewhere). Still, I think it's useful to know that other people -- accomplished people -- have faced the same choice. Both outcomes are possible. Knowing that, we can use our knowledge, experience, and relationships to make choices that make sense in our current circumstances and live with the outcomes.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

June 01, 2012 4:39 PM

Good Ideas Aren't Always Enough

Ward Cunningham

In his recent Dr. Dobb's interview, Ward Cunningham talked about the wiki community's efforts to create a universal mark-up language. Despite the many advantages of a common language, the idea never took hold. Ward's post-mortem:

So the only thing I can conclude is that as nice as having a universal or portable mark-up would be, it's not nice enough to cause people to give up what they're working on when they work on their wiki.

This is an important lesson to learn, whatever your discipline or your community. It's especially important if you hope to be an agent of change. Good ideas aren't always enough to induce change, even in a community of people working together in an explicit effort to create better ideas. There needs to be enough energy to overcome the natural inertia associated with any set of practices.

Ward's next sentence embodies even more wisdom:

I accept that as the state of nature and don't worry about it too much anymore.

Denial locks you up. Either you continue in vain to push the rejected idea, or you waste precious time and energy lamenting the perceived injustice of the failure.

Acceptance frees you to move on to your project in peace.


Posted by Eugene Wallingford | Permalink | Categories: General

May 31, 2012 3:17 PM

A Department Head's Fantasy

(I recently finished re-reading Straight Man, the 1997 novel by Richard Russo. This fantasy comes straight out of the book.)

Hank Devereaux, beleaguered chair of the English department, has been called in to meet with Dickie Pope, campus CEO. He arrives at Pope's office just as the CEO is wrapping up a meeting with chief of security Lou Steinmetz and another man. Pope says, "Hank, why don't you go on in and make yourself comfortable. I want to walk these fellas to the door." We join Devereaux's narration:

When I go over to Dickie's high windows to take in the view, I'm in time to see the three men emerge below, where they continue their conversation on the steps.... Lou's campus security cruiser is parked at the curb, and the three men stroll toward it. They're seeing Lou off, I presume, .... But when they get to the cruiser, to my surprise, all three men climb into the front seat and drive off. If this is a joke on me, I can't help but admire it. In fact, I make a mental note to employ a version of it myself, soon. Maybe, if I'm to be fired today, I'll convene some sort of emergency meeting, inviting Gracie, and Paul Rourke, and Finny, and Orshee, and one or two other pebbles from my shoe. I'll call the meeting to order, then step outside on some pretext or other, and simply go home. Get Rachel [my secretary] to time them and report back to me on how long it takes them to figure it out. Maybe even get some sort of pool going.

My relationship with my colleagues is nothing like Devereaux's. Unlike him, I like my colleagues. Unlike his colleagues, mine have always treated me with collegiality and respect. I have no reason to wish them ill will or discomfort.

Still. It is a great joke. And I imagine that there are a lot of deans and department chairs and VPs out there who harbor dark fantasies of this sort all the time, especially during those inevitable stretches of politics that plague universities. Even the most optimistic among us can be worn down by the steady drip-drip-drip of dysfunction. There have certainly been days this year when I've gone home at the end of a long week with a sense of doom and a desire for recompense.

Fortunately, an occasional fantasy is usually all I need to deflate the doom and get back to business. That is the voyeuristic allure of novels like Straight Man for me.

But there may come a day when I can't resist temptation. If you see me walking on campus wearing a Groucho Marx nose and glasses, all bets are off.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

May 22, 2012 7:53 PM

A Few Days at JRubyConf

It's been fourteen months since I last attended a conference. I decided to celebrate the end of the year, the end of my compiler course, and the prospect of writing a little code this summer by attending JRubyConf 2012. I've programmed a fair amount in Ruby but have only recently begun to play with JRuby, an implementation of Ruby in Java which runs atop the JVM. There are some nice advantages to this, including the ability to use Java graphics with Ruby models and the ability to do real concurrency. It also offers me a nice combination for the summer: I will be teaching our sophomore-level intermediate computing course this fall, which focuses in large part on OO design and Java implementation, so JRuby will let me program in Ruby while doing a little class prep at the same time.

the Stone Arch Bridge in Minneapolis

Conference organizer Nick Sieger opened the event with the obligatory welcome remarks. He said that he thinks the overriding theme of JRubyConf is being a bridge. That is perhaps a natural theme for Minneapolis, a city of many bridges and the hometown of JRuby, its lead devs, and the conference. The image above is of the Stone Arch Bridge, as seen from the ninth level of the famed Guthrie Theater, the conference venue. (The yellow tint is from the window itself.)

The goal for the conference is to be a bridge connecting people to technologies. But it also aims to be a bridge among people, promoting what Sieger called "a more sensitive way of doing business". Emblematic of this goal were its Sunday workshop, a Kids CodeCamp, and its Monday workshop, Railsbridge. This is my first open-source conference, and when I look around I see the issue that so many people talk about. Of 150 or so attendees, there must be fewer than one dozen women and fewer than five African-Americans. The computing world certainly has room to make more and better connections into the world.

My next few entries will cover some of the things I learn at the conference. I start with a smile on my face, because the conference organizers gave me a cookie when I checked in this morning:

the sugar cookie JRubyConf gave me at check-in

That seems like a nice way to say 'hello' to a newcomer.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

May 11, 2012 2:31 PM

Get Busy; Time is Short

After an award-winning author had criticized popular literature, Stephen King responded with advice that is a useful reminder to us all:

Get busy. You have a short life span. You need to stop this crap about sitting there and talking about what we do, and actually do it. Because God gave you some talent, but he also gave you a certain number of years.

You don't have to be an award-winning author to waste precious time commenting on other people's work. Anyone with a web browser can fill his or her day talking about stuff, and not actually making stuff. For academics, it is a professional hazard. We need to balance the analytic and the creative. We learn by studying others' work and writing about it, but we also need to make time to make.

(The passage above comes from Stephen King, The Art of Fiction No. 189, in the wonderful on-line archive of interviews from the Paris Review.)


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

May 08, 2012 3:22 PM

Quality and Quantity, Thoroughbred Edition

I'll Have Another was not highly sought after as a yearling, when he was purchased for the relatively small sum of $11,000.

On Saturday, I'll Have Another rallied down the stretch to win the 2012 Kentucky Derby, passing Bodemeister, one of the race favorites that had led impressively from the gate. Afterward, a television commentator asked the horse's trainer, "What did you and the owner see in the horse way back that made you want to buy it?" The trainer's answer was unusually honest. He said something to the effect of:

We buy a lot of horses. Some work out, and some don't. There is a lot of luck involved. You do the right things and see what happens.

This is as good an example as I've heard in a while of the relationship between quantity and quality, which my memory often connects with stories from the book Art and Fear. People are way too fond of mythologizing successes and then romanticizing the processes that lead to them. In most vocations and most avocations, the best way to succeed is to do the right things, to work hard, be unlucky a lot, and occasionally get lucky.

This mindset does not diminish the value of hard work and good practices. No, it exalts their value. What it diminishes is our sense of control over outcomes in a complex world. Do your best and you will get better. Just keep in mind that we often have a lot less control over success and failure than our mythology tends to tell us.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

April 21, 2012 3:57 PM

A Conflict Between Fashion and the Unfashionable

Passage of the day, courtesy of Dave Winer:

They have started incubators in every major city on the planet. Unfortunately it hasn't been stylish to learn how to program for a number of years, so there aren't that many programmers available to hire. And it takes years to get really good at this stuff.

Hey, they just need a programmer. Or fifty.

While we teach CS students to program, we need to cultivate an entrepreneurial spirit, too. What an opportunity awaits someone with ideas and the ability to carry them out.


Posted by Eugene Wallingford | Permalink | Categories: General

April 20, 2012 3:14 PM

Better Than Everyone, University Edition

Seth Godin recently mentioned something that Clay Shirky has said about the television industry: Forty years ago, you only had to be better than two other shows. Now you have to be better than everybody.

At the same time technology makes it easier for people to put their creations in front of potential viewers, it makes it harder for established players to retain control over market share. As Godin summarized, "... with a million choices, each show earns the attention it gets in every single moment".

I've mused here periodically about how these same technological changes will ultimately affect universities. It seems that many people agree that education, even higher ed, is "ripe for disruption". Startups such as Boundless are beginning to take their shot at what seems an obvious market, the intersection of education and the beleaguered publishing industry: textbooks.

Though on-line education has been growing now for years, I haven't written anything about it. For one thing, I don't know what I really think of it yet. As much as I think out loud when I blog, I usually at least have a well-formed thought or two. When it comes to on-line education, my brain is still mostly full of mush.

Not long ago, the threat of on-line education to the way traditional universities operate did not seem imminent. That is, I think, starting to change. When the primary on-line players were non-traditional alternatives such as the University of Phoenix, it seemed easy enough to sell the benefits of the brick-and-ivy campus-based education to people. But as these schools slowly build a track record -- and an alumni base -- they will become a common enough part of the popular landscape that they become an acceptable alternative to many people. And as the cost of brick-and-ivy education rises, it becomes harder and harder to sell people on its value.

Of course, we now see a burgeoning in the number of on-line offerings from established universities. Big-name schools like MIT and Harvard have made full courses, and even suites of courses, available on-line. One of my more experienced colleagues began to get antsy when this process picked up speed a few years ago. Who wouldn't prefer MIT's artificial intelligence course over ours? These courses weren't yet available for credit, which left us with hope. We offer our course as part of a coherent program of study that leads to a credential that students and employers value. But in time...

... that would change. And it has. Udacity has spun itself off from Stanford and is setting its sights on a full on-line curriculum. A recent Computer World article talks about MITx, a similar program growing out of MIT. These programs are still being created and will likely offer a different sort of credential than the universities that gave birth to them, at least at the start. Is there still hope?

Less and less. As the article reports, other established universities are now offering full CS programs on-line. The University of Illinois at Springfield started in 2006 and now has more computer science students enrolled in its on-line undergrad and M.S. programs (171 and 146, respectively) than their on-campus counterparts (121 and 129). In June, Oregon State will begin offering a CS degree program on-line.

The natural reaction of many schools is to join in the rush. Schools like mine are putting more financial and faculty resources into the creation of on-line courses and programs, because "that's where the future lies".

I think, though, that Shirky's anecdote about the TV industry serves as an important cautionary tale. The caution has two prongs.

First, you have to adapt. When a disruptive technology comes along, you have to respond. You may think that you are good enough or dominant enough to survive the wave, but you probably aren't. Giants that retain their position atop a local maximum when a new technology redefines an industry quickly change from giants to dinosaurs.

Adapting isn't easy. Clayton Christensen and his colleagues have documented how difficult it is for a company that is very good at something and delivering value in its market to change course. Even with foresight and a vision, it is difficult to overcome inertia and external forces that push a company to stay on the same track.

Second, technology lowers barriers for producers and consumers alike. It's no longer enough to be the best teaching university in your state or neighborhood. Now you have to be better than everybody. If you are a computer science department, that seems an insurmountable task. Maybe you can be better than Illinois-Springfield (and maybe not!), but how can you be better than Stanford, MIT, and Harvard?

Before joining the rush to offer programs on-line, you might want to have an idea of what it is that you will be the best at, and for whom. With degrees from Illinois-Springfield, Oregon State, Udacity, Stanford, MIT, and Harvard only a few clicks away, you will have to earn the attention -- and tuition -- you receive from every single student.

But don't dally. It's lonely as the dominant player in a market that no longer exists.


Posted by Eugene Wallingford | Permalink | Categories: General

April 09, 2012 2:53 PM

Should I Change My Major?

Recruiting brochures for academic departments often list the kinds of jobs that students get when they graduate. Brochures for CS departments tend to list jobs such as "computer programmer", "system administrator", "software engineer", and "systems analyst". More ambitious lists include "CS professor" and "entrepreneur". I've been promoting entrepreneurship as a path for our CS grads for a few years now.

This morning, I was browsing the tables at one of my college's preview day sessions and came across my new all-time favorite job title for graduates. If you major in philosophy at my university, it turns out that one of the possible future job opportunities awaiting you is...

Bishop or Pope

Learning to program gives you superhuman strength, but I'm not sure a CS major can give you a direct line to God. I say, "Go for it."


Posted by Eugene Wallingford | Permalink | Categories: General

April 06, 2012 4:29 PM

A Reflection on Alan Turing, Representation, and Universal Machines

Douglas Hofstadter speaking at UNI

The day after Douglas Hofstadter spoke here on assertions, proofs, and Gödel's theorem, he gave a second public lecture hosted by the philosophy department. Ahead of time, we knew only that Hofstadter would reflect on Turing during his centennial. I went in expecting more on the Turing test, or perhaps a popular talk on Turing's proof of The Halting Problem. Instead, he riffed on Chapter 17 from I Am a Strange Loop.

In the end, we are self-perceiving, self-inventing, locked-in mirages that are little miracles of self-reference.

Turing, he said, is another peak in the landscape occupied by Tarski and Gödel, whose work he had discussed the night before. (As a computer scientist, I wanted to add to this set contemporaries such as Alonzo Church and Claude Shannon.) Hofstadter mentioned Turing's seminal paper about the Entscheidungsproblem but wanted to focus instead on the model of computation for which he is known, usually referred to by the name "Turing machine". In particular, he asked us to consider a key distinction that Turing made when talking about his model: that between dedicated and universal machines.

A dedicated machine performs one task. Human history is replete with dedicated machines, whether simple, like the wheel, or complex, such as a typewriter. We can use these tools with different ends in mind, but the basic work is fixed in their substance and structure.

The 21st-century cell phone is, in contrast, a universal machine. It can take pictures, record audio, and -- yes -- even be used as a phone. But it can also do other things for us, if we but go to the app store and download another program.

Hofstadter shared a few of his early personal experiences with programs enabling line printers to perform tasks for which they had not been specifically designed. He recalled seeing a two-dimensional graph plotted by "printing" mostly blank lines that contained a single *. Text had been turned into graphics. Taking the idea further, someone used the computer to print a large number of cards which, when given to members of the crowd at a football game, could be used to create a massive two-dimensional message visible from afar. Even further, someone used a very specific layout of the characters available on the line printer to produce a print-out that appeared from the other side of the room to be a black-and-white photograph of Raquel Welch. Text had been turned into image.

People saw each of these displays as images by virtue of our eyes and mind interpreting a specific configuration of characters in a certain way. We can take that idea down a level into the computer itself. Consider this transformation of bits:

0000 0000 0110 1011 → 0110 1011 0000 0000

A computer engineer might see this as a "left shift" of 8 bits. A computer programmer might see it as multiplying the number on the left by 256. A graphic designer might see us moving color from one pixel to another. A typesetter may see one letter being changed into another. What one sees depends on how one interprets what the data represent and what the process means.
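To make the multiplicity of readings concrete, here is a small Python sketch (my example, not one from the talk) that applies several of these interpretations to the same 16-bit pattern:

```python
# One 16-bit pattern, several interpretations.

bits = 0b0000000001101011          # the pattern on the left of the arrow

shifted = bits << 8                # the engineer's view: a left shift of 8 bits
product = bits * 256               # the programmer's view: multiplication by 256
assert shifted == product          # one transformation, two readings

# The typesetter's view: the low byte read as a character code.
print(chr(bits & 0xFF))            # 'k' (0x6B in ASCII)
print(chr((shifted >> 8) & 0xFF))  # still 'k', now sitting in the high byte
```

The point survives the translation to code: nothing in the bits themselves selects one of these readings over the others.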

Alan Turing was the first to express clearly the idea that a machine can do them all.

"Aren't those really binary numbers?", someone asked. "Isn't that real, and everything else interpretation?" Hofstadter said that this is a tempting perspective, but we need to keep in mind that they aren't numbers at all. They are, in most computers, pulses of electricity, or the states of electronic components, that we interpret as 0s and 1s.

After we have settled on interpreting those pulses or states as 0s and 1s, we then interpret configurations of 0s and 1s to mean something else, such as decimal numbers, colors, or characters. This second level of interpretation exposes the flaw in popular claims that computers can "only" process 0s and 1s. Computers can deal with numbers, colors, or characters -- anything that can be represented in any way -- when we interpret not only what the data mean but also what the process means.

(In the course of talking about representations, he threw in a cool numeric example: Given an integer N, factor it as 2^a * 3^b * 5^c * 7^d ... and use [a.b.c.d. ...] to stand for N. I see a programming assignment or two lying in wait.)
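As a first pass at one of those assignments, here is a minimal Python sketch of that encoding. The function names are mine, not Hofstadter's:

```python
def primes():
    """Yield the primes 2, 3, 5, 7, ... by trial division."""
    found = []
    n = 2
    while True:
        if all(n % p for p in found):
            found.append(n)
            yield n
        n += 1

def encode(n):
    """Return the list of exponents in n's prime factorization."""
    exponents = []
    for p in primes():
        if n == 1:
            break
        e = 0
        while n % p == 0:
            n //= p
            e += 1
        exponents.append(e)
    # Drop trailing zeros so that, e.g., 12 = 2^2 * 3 encodes as [2, 1].
    while exponents and exponents[-1] == 0:
        exponents.pop()
    return exponents

def decode(exponents):
    """Rebuild n from its list of prime exponents."""
    n = 1
    for p, e in zip(primes(), exponents):
        n *= p ** e
    return n

print(encode(360))   # 360 = 2^3 * 3^2 * 5, so [3, 2, 1]
assert decode(encode(360)) == 360
```

The same trick of mapping structures onto integers is, of course, the heart of Gödel numbering, which connects this aside back to the rest of the talk.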

The dual ideas of representation and interpretation take us into a new dimension. The Principia Mathematica describes a set of axioms and formal rules for reasoning about numeric structures. Gödel saw that it could be viewed at a higher level, as a system in its own right -- as a structure of integers. Thus the Principia can talk about itself. It is, in a sense, universal.

This is the launching point for Turing's greatest insight. In I Am a Strange Loop, Hofstadter writes:

Inspired by Gödel's mapping of PM into itself, Alan Turing realized that the critical threshold for this kind of computational universality comes exactly at the point where a machine is flexible enough to read and correctly interpret a set of data that describes its own structure. At this crucial juncture, a machine can, in principle, explicitly watch how it does any particular task, step by step. Turing realized that a machine that has this critical level of flexibility can imitate any other machine, no matter how complex the latter is. In other words, there is nothing more flexible than a universal machine. Universality is as far as you can go!

Alan Turing

Thus Turing was the first person to recognize the idea of a universal machine, circa 1935-1936: that a Turing machine can be given, as input, data that encodes its own instructions. This is the beginning of perhaps the biggest of the Big Ideas of computer science: the duality of data and program.

We should all be glad he didn't patent this idea.

Turing didn't stop there, of course, as I wrote in my recent entry on the Turing test. He recognized that humans are remarkably capable and efficient representational machines.

Hofstadter illustrates this with the idea of "hub", a three-letter word that embodies an enormous amount of experience and knowledge, chunked in numerous ways and accreted slowly over time. The concept is assembled in our minds out of our experiences. It is a representation. Bound up in that representation is an understanding of ourselves as actors in certain kinds of interactions, such as booking a flight on an airplane.

It is this facility with representations that distinguishes us humans from dogs and other animals. They don't seem capable of seeing themselves or others as representations. Human beings, though, naturally take other people's representations into their own. This results in a range of familiarities and verisimilitude. We "absorb" some people so well that we feel we know them intimately. This is what we mean when we say that someone is "in our soul". We use the word 'soul' not in a religious sense; we are referring to our essence.

Viewed this way, we are all distributed beings. We are "out there", in other people, as well as "in here", in ourselves. We've all had dreams of the sort Hofstadter used as an example, a dream in which his deceased father appeared, seemingly as real as he ever had been while alive. I myself recently dreamt that I was running, and the experience of myself was as real as anything I feel when I'm awake. Because we are universal machines, we are able to process the representations we hold of ourselves and of others and create sensations that feel just like the ones we have when we interact in the world.

It is this sense that we are self-representation machines that gives rise to the title of his book, "I am a strange loop". In Hofstadter's view, our identity is a representation of self that we construct, like any other representation.

This idea underlies the importance of the Turing test. It takes more than "just syntax" to pass the test. Indeed, syntax is itself more than "just" syntax! We quickly recurse into the dimension of representation, of models, and a need for self-reference that makes our syntactic rules more than "just" rules.

Indeed, as self-representation machines, we are able to have a sense of our own smallness within the larger system. This can be scary, but also good. It makes life seem precious, so we feel a need to contribute to the world, to matter somehow.

Whenever I teach our AI course, I encounter students who are, for religious or philosophical reasons, deeply averse to the idea of an intelligent machine, or even of scientific explanations of who we are. When I think about identity in terms of self-representation, I can't help but feel that, at an important level, it does not matter. God or not, I am in awe of who we are and how we got to here.

So, we owe Alan Turing a great debt. Building on the work of philosophers, mathematicians, and logicians, Turing gave us the essential insight of the universal machine, on which modern computing is built. He also gave us a new vocabulary with which to think about our identity and how we understand the world. I hope you can appreciate why celebrating his centennial is worthwhile.

~~~~

IMAGE 1: a photo of Douglas Hofstadter speaking at UNI, March 7, 2012. Source: Kevin C. O'Kane.

IMAGE 2: the Alan Turing centenary celebration. Source: 2012 The Alan Turing Year.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

April 04, 2012 4:39 PM

Computational Search Answers an Important Question

Update: Well, this is embarrassing. Apparently, Mat and I were the victims of a prank by the folks at ChessBase. You'd think that, after more than twenty-five years on the internet, I would be more circumspect at this time of year. Rather than delete the post, I will leave it here for the sake of posterity. If nothing else, my students can get a chuckle from their professor getting caught red-faced.

I stand behind my discussion of solving games, my recommendation of Rybka, and my praise for My 60 Memorable Games (my favorite chess book of all time). I also still marvel at the chess mind of Bobby Fischer.

~~~~

Thanks to reader Mat Roberts for pointing me to this interview with programmer Vasik Rajlich, which describes a recent computational result of his: one of the most famous openings in chess, the King's Gambit, is a forced draw.

Games are, of course, a fertile testbed for computing research, including AI and parallel computation. Many researchers make one of their goals to "solve" a game, that is, to show that, with best play by both players, a game has a particular outcome. Games with long histories and large communities of players naturally attract a lot of interest, and solving one of them is usually considered a valuable achievement.
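For a sense of what "solving" means at toy scale, here is a short Python sketch that solves one-pile Nim (each player takes 1-3 stones; taking the last stone wins) by exhaustive search with memoization -- a microscopic cousin of the checkers and King's Gambit computations described below:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def wins(stones):
    """True if the player to move wins with best play from this position."""
    # Any move that leaves the opponent in a losing position wins for us.
    return any(not wins(stones - take)
               for take in (1, 2, 3) if take <= stones)

# With best play, the multiples of 4 are lost for the player to move;
# every other position is won.
print([n for n in range(13) if not wins(n)])  # [0, 4, 8, 12]
```

Real solving efforts use the same basic idea -- prove the game-theoretic value of every relevant position -- but at scales that demand clusters, clever pruning, and months of computation.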

For us in CS, interest grows with the complexity of the game. Solving Connect Four was cool, but solving Othello on a full-sized board would be cooler. Almost five years ago, I blogged about what I still consider the most impressive result in this domain: the solving of checkers by Jonathan Schaeffer and his team at the University of Alberta.

the King's Gambit

The chess result is more limited. Rajlich, an International Master of chess and the programmer of the chess engine Rybka, has shown results only for games that begin 1.e4 e5 2.f4 exf4. If White plays 3.Nf3 -- the most common next move -- then Black can win with 3... d6. 3.Bc4 also loses. Only one move for White can force a draw, the uncommon 3.Be2. Keep in mind that these results all assume best play by both players from there on out. White can win, lose, or draw in all variations if either player plays a sub-optimal move.

I say "only" when describing this result because it leaves a lot of chess unsolved, all games starting with some other sequence of moves. Yet the accomplishment is still quite impressive! The King's Gambit is one of the oldest and most storied opening sequences in all of chess, and it remains popular to this day among players at every level of skill.

Besides, consider the computational resources that Rajlich had to use to solve even the King's Gambit:

... a cluster of computers, currently around 300 cores [created by Lukas Cimiotti, hooked up to] a massively parallel cluster of IBM POWER 7 Servers provided by David Slate, senior manager of IBM's Semantic Analysis and Integration department -- 2,880 cores at 4.25 GHz, 16 terabytes of RAM, very similar to the hardware used by IBM's Watson in winning the TV show "Jeopardy". The IBM servers ran a port of the latest version of Rybka, and computation was split across the two clusters, with the Cimiotti cluster distributing the search to the IBM hardware.

Oh, and this set-up had to run for over four months to solve the opening. I call that impressive. If you want something less computationally intensive yet still able to beat you, me, and everybody we know at chess, you can buy Rybka, a chess engine available commercially. (An older version is available for free!)

What effect will this result have on human play? Not much, practically speaking. Our brains aren't big enough or fast enough to compute all the possible paths, so human players will continue to play the opening, create new ideas, and explore the action in real time over the board. Maybe players with the Black pieces will be more likely to play one of the known winning moves now, but results will remain uneven between White and Black. The opening leads to complicated positions.

the cover of Bobby Fischer's 'My 60 Memorable Games'

If, like some people, you worry that results such as this one somehow diminish us as human beings, take a look again at the computational resources that were required to solve this sliver of one game, the merest sliver of human life, and then consider: This is not the first time that someone has claimed the King's Gambit busted. In 1961, an eighteen-year-old U.S. chess champion named Bobby Fischer published an article claiming that 1.e4 e5 2.f4 exf4 3.Nf3 was a forced loss. His prescription? 3... d6. Now we know for sure. Like so many advances in AI, this one leaves me marveling at the power of the human mind.

Well, at least Bobby Fischer's mind.

~~~~

IMAGE 1: The King's Gambit. Source: Wikimedia Commons.

IMAGE 2: a photograph of the cover of my copy of My 60 Memorable Games by Bobby Fischer. Bobby analyzes a King's Gambit or two in this classic collection of games.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

March 30, 2012 5:22 PM

A Reflection on Alan Turing, the Turing Test, and Machine Intelligence

Alan Turing

In 1950, Alan Turing published a paper that launched the discipline of artificial intelligence, Computing Machinery and Intelligence. If you have not read this paper, go and do so. Now. 2012 is the centennial of Turing's birth, and you owe yourself a read of this seminal paper as part of the celebration. It is a wonderful work from a wonderful mind.

This paper gave us the Imitation Game, an attempt to replace the question of whether a computer could be intelligent with something more concrete: a probing dialogue. The Imitation Game became the Turing Test, now a staple of modern culture and the inspiration for contests and analogies and speculation. After reading the paper, you will understand something that many people do not: Turing is not describing a way for us to tell the difference between human intelligence and machine intelligence. He is telling us that the distinction is not as important as we seem to think. Indeed, I think he is telling us that there is no distinction at all.

I mentioned in an entry a few years ago that I always have my undergrad AI students read Turing's paper and discuss the implications of what we now call the Turing Test. Students would often get hung up on religious objections or, as noted in that entry, a deep and a-rational belief in "gut instinct". A few ended up putting their heads in the sand, as Turing knew they might, because they simply didn't want to confront the implication of intelligences other than our own. And yet they were in an AI course, learning techniques that enable us to write "intelligent" programs. Even students with the most diehard objections wanted to write programs that could learn from experience.

Douglas Hofstadter, who visited campus this month, has encountered another response to the Turing Test that surprised him. On his second day here, in honor of the Turing centenary, Hofstadter offered a seminar on some ideas related to the Turing Test. He quoted two snippets of hypothetical man-machine dialogue from Turing's seminal paper in his classic Gödel, Escher, Bach. Over the years, he has occasionally run into philosophers who think the Turing Test is shallow, trivial to pass with trickery and "mere syntax". Some are concerned that it explores "only behavior". Is behavior all there is? they ask.

As a computer programmer, the idea that the Turing test explores only behavior never bothered me. Certainly, a computer program is a static construct and, however complex it is, we can read and understand it. (Students who take my programming languages course learn that even another program can read and process programs in a helpful way.) This was not a problem for Hofstadter either, growing up as he did in a physicist's household. Indeed, he found Turing's formulation of the Imitation Game to be deep and brilliant. Many of us who are drawn to AI feel the same. "If I could write a program capable of playing the Imitation Game," we think, "I will have done something remarkable."

One of Hofstadter's primary goals in writing GEB was to make a compelling case for Turing's vision.

Douglas Hofstadter

Those of us who attended the Turing seminar read a section from Chapter 13 of Le Ton beau de Marot, a more recent book by Hofstadter in which he explores many of the same ideas about words, concepts, meaning, and machine intelligence as GEB, in the context of translating text from one language to another. Hofstadter said the focus in this book is on the subtlety of words and the ideas they embody, and what that means for translation. Of course, these are some of the issues that underlie Turing's use of dialogue as sufficient for us to understand what it means to be intelligent.

In the seminar, he shared with us some of his efforts to translate a modern French poem into faithful English. His source poem had itself been translated from older French into modern French by a French poet friend of his. I enjoyed hearing him talk about "the forces" that pushed him toward and away from particular words and phrases. Le Ton beau de Marot uses creative dialogues of the sort seen in GEB, this time between the Ace Mechanical Translator (his fictional computer program) and a Dull Rigid Human. Notice the initials of his raconteurs! They are an homage to Turing. The Dull Rigid Human is, of course, DRH -- Douglas R. Hofstadter himself -- while the Ace Mechanical Translator, AMT, shares its initials with Alan M. Turing, the man who started this conversation over sixty years ago.

Like Hofstadter, I have often encountered people who object to the Turing test. Many of my AI colleagues are comfortable with a behavioral test for intelligence but dislike that Turing considers only linguistic behavior. I am comfortable with linguistic behavior because it captures what is for me the most important feature of intelligence: the ability to express and discuss ideas.

Others object that it sets too low a bar for AI, because it is agnostic on method. What if a program "passes the test", and when we look inside the box we don't understand what we see? Or worse, we do understand what we see and are unimpressed? I think that this is beside the point. Not to say that we shouldn't want to understand. If we found such a program, I think that we would make it an overriding goal to figure out how it works. But how an entity manages to be "intelligent" is a different question from whether it is intelligent. That is precisely Turing's point!

I agree with Brian Christian, who won the prize for being "The Most Human Human" in a competition based on Turing's now-famous test. In an interview with The Paris Review, he said,

Some see the history of AI as a dehumanizing narrative; I see it as much the reverse.

Turing does not diminish what it is to be human when he suggests that a computer might be able to carry on a rich conversation about something meaningful. Neither do AI researchers or teenagers like me, who dreamed of figuring out just what it is that makes it possible for humans to do what we do. We ask the question precisely because we are amazed. Christian again:

We build these things in our own image, leveraging all the understanding of ourselves we have, and then we get to see where they fall short. That gap always has something new to teach us about who we are.

As in science itself, every time we push back the curtain, we find another layer of amazement -- and more questions.

I agree with Hofstadter. If a computer could do what it does in Turing's dialogues, then no one could rightly say that it wasn't "intelligent", whatever that might mean. Turing was right.

~~~~

PHOTOGRAPH 1: the Alan Turing centenary celebration. Source: 2012 The Alan Turing Year.

PHOTOGRAPH 2: Douglas Hofstadter in Bologna, Italy, 2002. Source: Wikimedia Commons.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

March 27, 2012 4:53 PM

Faculty Workload and the Cost of Universities

This morning, @tonybibbs tweeted me a link to a Washington Post piece called Do college professors work hard enough?, wondering what I might think.

Author David Levy calls for "reforms for outmoded employment policies that overcompensate faculty for inefficient teaching schedules". Once, he says, faculty were generally underpaid relative to comparably educated professionals; now senior faculty at most state universities earn salaries roughly in line with comparable professionals.

Not changed, however, are the accommodations designed to compensate for low pay in earlier times. Though faculty salaries now mirror those of most upper-middle-class Americans working 40 hours for 50 weeks, they continue to pay for teaching time of nine to 15 hours per week for 30 weeks, making possible a month-long winter break, a week off in the spring and a summer vacation from mid-May until September.

My initial impressions after a quick read this morning were

  1. Yes, some faculty work too little.
  2. Most faculty work more than he seems to think.
  3. Changing #1 is hard.

After a second read, that is still my impression. Let me expand.

Before beginning, let me note that Levy mentions three kinds of institutions: research universities, teaching universities, and community colleges. I myself can't offer informed comment on community college faculty. I have spent my professional career as a faculty member and department head at a teaching university. I also spent six years in grad school at an R-1 institution and have many colleagues and friends who work at research schools. Finally, I am in Computer Science, not a more stable discipline. These are the experiences on which I draw.

First, #2. Levy seems willing to grant that faculty at research institutions work longer hours, or if not, at least that the work they do is so valuable as to earn high pay. I agree. Levy seems unwilling to grant similar effort or importance to what faculty at teaching universities do. He thinks himself generous in allowing that the latter might spend as much time in prep as in class and concludes that "the notion that faculty in teaching institutions work a 40-hour week is a myth".

At my school, data on faculty workload have routinely showed that on average faculty work more than fifty hours per week. When I was full time faculty, my numbers were generally closer to sixty. (As a department head for the last few years, I have generally worked more.) These numbers are self-reported, but I have good reason to trust them, having observed what faculty in my department do.

If we aren't meeting an R-1 school's stringent requirements for research, publication, and grant writing, what are we doing? We actually do spend more hours per week working outside the classroom than inside. We are preparing new course materials, meeting with students in office hours and the lab, and experimenting with new programming languages and technologies that can improve our courses (or making some of our course content obsolete). We advise undergrads and supervise their research projects. Many departments have small grad programs, which bring with them some of the duties that R-1 profs face.

We also do scholarship. Most teaching schools do expect some research and publication, though clearly not at the level expected by the R-1s. Teaching schools are also somewhat broader in the venues for publication that they accept, allowing teaching conferences such as SIGCSE or formal workshops like the SPLASH (neé OOPSLA) Educators' Symposium. Given these caveats, publishing a paper or more per year is not an unusual scholarship expectation at schools like mine.

During the summer, faculty at teaching universities are often doing research, writing, or preparing and teaching workshops for which they are paid little, if anything. Such faculty may have time for more vacation than other professionals, but I don't think many of them are sailing the Caribbean for the 20+ weeks that Levy presumes they have free.

Levy does mention service to the institution in the form of committee work. Large organizations do not run themselves. From what I remember of my time in grad school, most of my professors devoted relatively little time to committees. They were busy doing research and leading their teams. The university must have had paid staff doing a lot of the grunt work to keep the institution moving. At a school like mine, many faculty carry heavy service loads. Perhaps we could streamline the bureaucracy to eliminate some of this work, or hire staff to do it, but it really does consume a non-trivial amount of some faculty members' time and energy.

After offering these counterpoints -- which I understand may be seen as self-serving, given where I work -- what of #1? It is certainly the case that some university faculty work too little. Expectations for productivity in research and other scholarship have often been soft in the past, and only now are many schools coming to grips with the full cost of faculty productivity.

Recently, my school has begun to confront a long-term decline in real funding from the state, realizing that it cannot continue to raise tuition to make up for the gap. One administrative initiative asked department heads and faculty to examine scholarly productivity of faculty and assign professors who have not produced enough papers, grant proposals, or other scholarly results over a five-year period to teach an extra course. There were some problems in how administrators launched and communicated this initiative, but the idea is a reasonable one. If faculty are allocated time for scholarship but aren't doing much, then they can use that time to teach a course.

Most faculty reacted with skepticism and concern. (This was true of department heads as well, because most of us think of ourselves as faculty temporarily playing an administrator's role.)

That brings us to #3. Changing a culture is hard. It creates uncertainty. When expectations have been implicit, it is hard to make them explicit in a way that allows enforcement while at the same time recognizing the value in what most faculty have been doing. The very word "enforcement" runs counter to the academic culture, in which faculty are left free to study and create in ways that improve their students' education and in which it is presumed faculty are behaving honorably.

In this sense, Levy's article hits on an issue that faces universities and the people who pay for them: taxpayers, students and parents who pay tuition, and granting agencies. I agree with Levy that addressing this issue is essential as universities come to live in a world with different cost structures and different social contracts. He seems to understand that change will be hard. However, I'm not sure he has an accurate view of what faculty at teaching universities are already doing.


Posted by Eugene Wallingford | Permalink | Categories: General

February 29, 2012 4:40 PM

From Mass Producing Rule Followers to Educating Creators

The bottom is not a good place to be, even if you're capable of getting there.

Seth Godin's latest manifesto, Stop Stealing Dreams, calls for a change to the way we educate our children. I've written some about how changes in technology and culture will likely disrupt universities, but Godin bases his manifesto on a simpler premise: we have to change what we achieve through education because what we need has changed. Historically, he claims, our K-12 system has excelled at one task: "churning out kids who are stuck looking for jobs where the boss tells them exactly what to do".

As negatively as that is phrased, it may well have been a reasonable goal for a new universal system of compulsory education in the first half of the 1900s. But times have changed, technology has changed, our economy has changed, and our needs have changed. Besides, universal education is a reality now, not a dream, so perhaps we should set our sights higher.

I only began to read Godin's book this afternoon. I'm curious to see how well the ideas in it apply to university education. The role of our universities has changed over time, too, including rapid growth in the number of people continuing their education after high school. The number and variety of public universities grew through the 1960s and 1970s in part to meet the new demand.

Yet, at its root, undergraduate education is, for most students, a continuation of the same model they experienced K-12: follow a prescribed program of study, attend classes, do assignments, pass tests, and follow rules. A few students avail themselves of something better as undergrads, but it's really not until grad school that most people have a chance to participate fully in the exploration for and creation of knowledge. And that is the result of self-selection: those most interested in such an education seek it out. Alas, many undergrads seem hardly prepared to begin driving their own educations, let alone interested.

That is one of the challenges university professors face. From my experience as a student and a father of students, I know that many HS teachers are working hard to open their students' minds to bigger ideas, too -- when they have the chance, that is, amid the factory-style mass production system that dominates many high schools today.

As I sat down to write this, it occurred to me that learning to program is a great avenue toward becoming a creator and an innovator. Sadly, most CS programs seem satisfied to keep doing the same old thing: to churn out people who are good at doing what they are told. I think many university professors, myself included, could do better by keeping this risk in mind. Every day as I enter the classroom, I should ask myself what today's session will do for my students: kill a dream, or empower it?

While working out this morning, my iPod served up John Hiatt's song, "Everybody Went Low" (available on YouTube). The juxtaposition of "going low" in the song and Godin's warning about striving for the bottom created an interesting mash-up in my brain. As Hiatt sings, when you are at the bottom, there is:

Nothing there to live up to
There's nothing further down
Turn it off or turn around

Big systems with lots of moving parts are hard to turn around. I hope we can do it before we get too low.


Posted by Eugene Wallingford | Permalink | Categories: General

February 25, 2012 3:04 PM

I Did the Reading, and Watched the Video

David Foster Wallace, 2006

It seems that I've been running across David Foster Wallace everywhere for the last few months. I am currently reading his collection A Supposedly Fun Thing I'll Never Do Again. I picked it up for a tennis-turned-philosophy essay titled, improbably, "Tennis Player Michael Joyce's Professional Artistry as a Paradigm of Certain Stuff about Choice, Freedom, Discipline, Joy, Grotesquerie, and Human Completeness". (You know I am a tennis fan.) On the way to reading that piece, I got hooked on the essay about filmmaker David Lynch. I am not a fan of Wallace's fiction, but his literary non-fiction arrests me.

This morning, I am listening to a lengthy uncut interview with Wallace from 2003, courtesy of fogus. In it, Wallace comes across just as he does in his written work: smart, well-read, and deeply thoughtful. He also seems so remarkably pleasant -- not the sort of thing I usually think of as a default trait in celebrities. His pleasantness feels very familiar to me as a fellow Midwesterner.

The video also offers occasional haunting images, his mannerisms and especially his eyes. His obvious discomfort makes me uncomfortable as I watch. It occurs to me that I feel this way only because I know how his life ended, but I don't think that's true.

The interview contains many thought-provoking responses and interchanges. One particular phrase will stay with me for a while. Wallace mentions the fondness Americans have for the freedom of choice, the freedom to satisfy our desires. He reminds us that inherent in such freedom is a grave risk: a "peculiar kind of slavery", in which we feel we must satisfy our desires, we must act on our impulses. Where is the freedom in that prison?

There is also a simple line that appealed to the teacher in me: "It takes skill and education to get good enough at reading or listening to be able to derive pleasure from it." This is one of the challenges that faces teachers everywhere. Many things require skill and education -- and time -- in order for students to be able to derive satisfaction and even pleasure from them. Computer programming is one.

I recommend this interview to anyone interested in modern culture, especially American culture.

As I listened, I was reminded of this exchange from a short blog entry by Seth Godin from last year:

A guy asked his friend, the writer David Foster Wallace,

"Say, Dave, how'd y'get t'be so dang smart?"

His answer:

"I did the reading."

Wallace clearly did the reading.

~~~~

PHOTOGRAPH: David Foster Wallace at the Hammer Museum in Los Angeles, January 2006. Source: Wikimedia Commons.


Posted by Eugene Wallingford | Permalink | Categories: General

February 06, 2012 6:26 PM

Shopping Blog Entries to a Wider Audience

Over the last couple of years, our university relations department has been trying to promote more actively the university's role in the public sphere. One element of this effort is pushing faculty work and professional commentary out into wider circulation. For example, before and after the recent presidential caucuses in Iowa, they helped connect local political science profs with media who were looking for professional commentary from in the trenches.

Well, they have now discovered my blog and are interested in shopping several pieces of general interest to more traditional media outlets, such as national newspapers and professional magazines. Their first effort involves a piece I wrote about liberal education last month, which builds on two related pieces, here and here. I'm in the process of putting it into a form suitable for standalone publication. This includes polishing up some of the language, as well as not relying on links to other articles -- one of the great wins of the networked world.

Another big win of the networked world is the ease with which we can get feedback and make our ideas and our writing better. If you have any suggestions for how I might improve the presentation of the ideas in these pieces, or even the ideas themselves, please let me know. As always, I appreciate your thoughts and willingness to discuss them with me.

When I mentioned this situation in passing on Twitter recently, a former student asked whether my blog's being on the university's radar would cause me to write differently. The fact is that I have always tried to respect my university, my colleagues, and my students when I write, and to keep their privacy and integrity in mind. This naturally results in some level of self-censorship. Still, I have always tried to write openly and honestly about what I think and learn.

You can rest assured. This blog remains mine alone and will continue to speak in my voice. I will write as openly and honestly as ever. That is the only way that the things I write could ever be of much interest to readers such as you, let alone to me.


Posted by Eugene Wallingford | Permalink | Categories: General

January 25, 2012 3:45 PM

Pragmatism and the Scientific Spirit

the philosopher William James

Last week, I found myself reading The Most Entertaining Philosopher, about William James. It was good fun. I have always liked James. I liked the work of his colleagues in pragmatism, C.S. Peirce and John Dewey, too, but I always liked James more. For all the weaknesses of his formulation of pragmatism, he always seemed so much more human to me than Peirce, who did the heavy theoretical lifting to create pragmatism as a formal philosophy. And he always seemed a lot more fun than Dewey.

I wrote an entry a few years ago called The Academic Future of Agile Methods, which described the connection between pragmatism and my earlier work in AI, as well as agile software development. I still consider myself a pragmatist, though it's tough to explain just what that means. The pragmatic stance is too often confounded with a self-serving view of the world, a "whatever works is true" philosophy. Whatever works... for me. James's references to the "cash value" of truth didn't help. (James himself tried to undo the phrase's ill effects, but it has stuck. Even in the 1800s, it seems, a good sound bite was better than the truth.)

As John Banville, the author of the NY Times book review piece, says, "It is far easier to act in the spirit of pragmatism than to describe what it is." He then gives "perhaps the most concise and elegant definition" of pragmatism, by philosopher C. I. Lewis. It is a definition that captures the spirit of pragmatism as well as any few lines can:

Pragmatism could be characterized as the doctrine that all problems are at bottom problems of conduct, that all judgments are, implicitly, judgments of value, and that, as there can be ultimately no valid distinction of theoretical and practical, so there can be no final separation of questions of truth of any kind from questions of the justifiable ends of action.

This is what drew me to pragmatism while doing work in knowledge-based systems, as a reaction to the prevailing view of logical AI that seemed based in idealist and realist epistemologies. It is also what seems to me to distinguish agile approaches to software development from the more common views of software engineering. I applaud people who are trying to create an overarching model for software development, a capital-t Theory, but I'm skeptical. The agile mindset is, or at least can be, pragmatic. I view software development in much the way James viewed consciousness: "not a thing or a place, but a process".

As I read again about James and his approach, I remember my first encounters with pragmatism and thinking: Pragmatism is science; other forms of epistemology are mathematics.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

January 12, 2012 3:38 PM

At Least I'm Not Alone

One of the things I love about reading professional blogs and twitter feeds is reassuring myself that I am not crazy in many of my compulsions and obsessions.

On the exercise bike yesterday morning, I read Matt Might's End artificial scarcities to increase productivity. Many years ago I saw my colleague, friend, and hero Joe Bergin do something that I now do faithfully: always carry with me a pad of paper, small enough to fit comfortably in most any pocket, and a good pen. When your life is writing code, writing lectures, writing blog entries, you often want to write at the oddest of times. Now I am always ready to jot down any idea that comes into my head as soon as it does. I may throw it away later as a hare-brained scheme, but I prefer that to losing an idea for lack of a notepad.

Our house has pens, pencils, and usually paper in nearly every room. I have them in every bag I carry and in most coats I wear. The kind of pen matters some; I hate splotching and bleeding through. I have a fondness for a particular older model of Uniball pens, but I'm not obsessed with them. I do have a box of them in my desk at home, and every once in a while I'll pull one out to replace a pen that has run dry. They feel right in my hand.

Like Might, I have MacBook Pro power adapters in every room in which I work, as well as one in my travel bag. The cost of having three or four adapters has been well worth the peace of mind. I even have a back-up battery or two on hand most of the time. (My Pro is one of the older ones with the removable battery.) I like to have one each in my home and school offices, where I do most of my work and from which most excursions begin.

On the bike this morning, I read Rands in Repose's bag pr0n essay from last month. Loved it! Like Lopp and many other geeks, I have at times obsessed over my bag. Back in high school I carried an attache case my parents gave me for Christmas. (Yes, I was that guy.) Since college and grad school, I've gone through several styles of bag, including freebies given out at conferences and a couple of nice ones my wife gave me as gifts. A few have provided what I desire: compactness, with a few compartments but not too many.

One of my favorites was from SIGCSE in the late 1990s. I still have it, though it shows its age and wear. Another is a bag I got at one of the PLoP conferences in the early part of the previous decade. It was perfect for an iBook, but is too small for my Pro. I still have it, too, waiting for a time when it will fit my needs again. Both were products of the days of really good conference swag. My current bag is a simple leather case that my wife gave me. It's been serving me well for a couple of years.

Each person has his or her particular point of obsession. Mine is the way the shoulder strap attaches to the body of the bag. So many good bags have died too soon when the metallic clasp holding strap to body broke, or the clasp worked loose, or the fabric piece wore through.

Strange but true: One of my all-time favorite bags was a $5 blue vinyl diaper bag that my wife bought at a garage sale in the early 1990s. No one knew it was a diaper bag, or so I think; at a glance it was rather innocuous. This bag was especially useful at a time when I traveled a lot, attending 4-6 conferences a year and doing more personal travel than I do these days. The changing pad served as a great sleeve to protect my laptop (first a G3 clamshell, then an iBook). The side compartments designed to hold two baby bottles were great for bottles of water or soda. This was especially handy for a long day flying -- back when we could do such crazy things as carry drinks with us. This bag also passed Rands' airport security line test. It allowed for easy in-and-out of the laptop, and then rolled nicely on its side for going through x-ray. I still think about returning to this bag some day.

I'm sure that this sort of obsessiveness is a positive trait for programmers. So many of us have it, it must be.


Posted by Eugene Wallingford | Permalink | Categories: General

December 30, 2011 11:05 AM

Pretending

Kurt Vonnegut never hesitated to tell his readers the morals of his stories. The frontispiece of his novel Mother Night states its moral upfront:

We are what we pretend to be, so we must be careful about what we pretend to be.

Pretending is a core thread that runs through all of Vonnegut's work. I recognized this as a teenager, and perhaps it is what drew me to his books and stories. As a junior in high school, I wrote my major research paper in English class on the role fantasy played in the lives of Vonnegut's characters. (My teachers usually resisted my efforts to write about authors such as Kafka, Vonnegut, and Asimov, because they weren't part of "the canon". They always relented, eventually, and I got to spend more time thinking about works I loved.)

I first used this sentence about pretending in my eulogy for Vonnegut, which includes a couple of other passages on similar themes. Several of those are from Bokononism, the religion created in his novel Cat's Cradle as a way to help the natives of the island of San Lorenzo endure their otherwise unbearable lives. Bokononism had such an effect on me that I spent part of one summer many years ago transcribing The Books of Bokonon onto the web. (In these more modern times, I share Bokonon's wisdom via Twitter.)

Pretending is not just a way to overcome pain and suffering. Even for Vonnegut, play and pretense are the ways we construct the sane, moral, kind world in which we want to live. Pretending is, at its root, a necessary component in how we train our minds and bodies to think and act as we want them to. Over the years, I've written many times on this blog about the formation of habits of mind and body, whether as a computer scientist, a student, or a distance runner.

Many people quote Aristotle as the paradigm of this truth:

We are what we repeatedly do. Excellence, then, is not an act, but a habit.

I like this passage but prefer another of his, which I once quoted in a short piece, What Remains Is What Will Matter:

Excellence is an art won by training and habituation. We do not act rightly because we have virtue or excellence, but rather we have those because we have acted rightly.

This idea came charging into my mind this morning as I read an interview with Seth Godin. He and his interviewers are discussing the steady stream of rejection that most entrepreneurs face, and how some people seem able to fight through it to succeed. What if a person's natural "thermostat" predisposes them to fold in the face of rejection? Godin says:

I think we can reset our inclinations. I'm certain that pretending we can is better than admitting we can't.

Vonnegut and Aristotle would be proud. We are what we pretend to be. If we wish to be virtuous, then we must act rightly. If we wish to be the sort of person who responds to rejection by working harder and succeeding, then we must work harder. We become the person we pretend to be.

As children, we think pretending is about being someone we aren't. And there is great fun in that. As teenagers, sometimes we feel a need to pretend, because we have so little control over our world and even over our changing selves. As adults, we tend to want to put pretending away as child's play. But this obscures a truth that Vonnegut and Aristotle are trying to teach us:

Pretending is just as much about being who we are as about being who we aren't.

As you and I consider the coming of a new year and what we might resolve to do better or differently in the coming twelve months that will make a real difference in our lives, I suggest we take a page out of Vonnegut's playbook.

Think about the kind of world you want to live in, then live as if it exists.

Think about the kind of person you want to be, then live as if you are that person.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

December 19, 2011 4:49 PM

"I Love The Stuff You Never See"

I occasionally read and hear people give advice about how to find a career, vocation, or avocation that someone will enjoy and succeed in. There is a lot of talk about passion, which is understandable. Surely, we will enjoy things we are passionate about, and perhaps then we want to put in the hours required to succeed. Still, "finding your passion" seems a little abstract, especially for someone who is struggling to find one.

This weekend, I read A Man, A Ball, A Hoop, A Bench (and an Alleged Thread)... Teller!. It's a story about the magician Teller, one half of the wonderful team Penn & Teller, and his years-long pursuit of a particular illusion. While discussing his work habits, Teller said something deceptively simple:

I love the stuff you never see.

I knew immediately just what he meant.

I can say this about teaching. I love the hours spent creating examples, writing sample code, improving it, writing and rewriting lecture notes, and creating and solving homework assignments. When a course doesn't go as I had planned, I like figuring out why and trying to fix it. Students see the finished product, not the hours spent creating it. I enjoy both.

I don't necessarily enjoy all of the behind-the-scenes work. I don't really enjoy grading. But my enjoyment of the preparation and my enjoyment of the class itself -- the teaching equivalent of "the performance" -- carries me through.

I can also say the same thing about programming. I love to fiddle with source code, organizing and rewriting it until it's all just so. I love to factor out repetition and discover abstractions. I enjoy tweaking interfaces, both the interfaces inside my code and the interfaces my code's users see. I love that sudden moment of pleasure when a program runs for the first time. Users see the finished product, not the hours spent creating it. I enjoy both.

Again, I don't necessarily enjoy everything that I have to do behind the scenes. I don't enjoy twiddling with configuration files, especially at the interface to the OS. Unlike many of my friends, I don't always enjoy installing and uninstalling all the libraries I need to make everything work in the current version of the OS and interpreter. But that time seems small compared to the time I spend living inside the code, and that carries me through.

In many ways, I think that Teller's simple declaration is a much better predictor of what you will enjoy in a career or avocation than other, fancier advice you'll receive. If you love the stuff other folks never see, you are probably doing the right thing for you.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

December 05, 2011 4:34 PM

More on the Future of News and Universities

I remain endlessly fascinated with the evolution of the news industry in the Internet Age, and especially with the discussions of same within the industry itself. Last week, Clay Shirky posted Institutions, Confidence, and the News Crisis in response to Dean Starkman's essay in the Columbia Journalism Review, Confidence Game. It's clear that not everyone views the change enabled by the internet and the web as a good thing.

Of course, my interest in journalism quickly spills over into my interest in the future of my own institution, the university. In Revolution Out There -- And Maybe In Here, I first began to draw out the similarities between the media and the university, and since then I've written occasionally about connections [ 1 | 2 | 3 ]. Some readers have questioned the analogy, because universities aren't media outlets. But in several interesting ways, they are. Professors write textbooks, lectures, and supporting materials. Among its many purposes, a university course disseminates knowledge. Faculty can object that a course does more than that, which is true, but from many people's perspectives -- many students, parents, and state legislators included -- dissemination is its essential purpose.

Universities aren't solely about teaching courses. They also create knowledge, through basic and applied research, and through packaging existing work in new and more useful ways. But journalists also create and package knowledge in similar ways, through research, analysis, and writing. Indeed, one of the strongest arguments by journalism traditionalists like Starkman is that new models of journalism often make little or no account of public-interest reporting and the knowledge creation function of media institutions.

Most recently, I wrote about the possible death of bundling in university education, which I think is where the strongest similarity between the two industries lies. The biggest problems in journalism aren't with what journalists do but with the way in which they bundle, sell, and pay for what they do. This is also the weak link in the armor of the university. For a hundred years, we have bundled several different functions into a whole that was paid for by the public through its governments and through peoples' willingness to pay tuition. As more and more options become available to people, the people holding the purses are beginning to ask questions about the direct and indirect value they receive.

We in the universities can complain all we want about the Khan Academy and the University of Phoenix and how what we do is superior. But we aren't the only people who get to create the future. In the software development world, there has long been interest in apprenticeship models and other ways to prepare new developers that bypass the university. It's the software world's form of homeschooling.

(Even university professors are beginning to write about the weakness of our existing model. Check out Bryan Caplan's The Magic of Education for a discussion of education as being more about signaling than instruction.)

I look at my colleagues in industry who make a good living as teachers: as consultants to companies, as the authors of influential books and blogs, and as conference speakers. They are much like freelance journalists. We are even starting to see university instructors who want to focus on teaching leave higher education and move out into the world of consultants and freelance developers of courses and instructional material. Professors may not be able to start their own universities yet, the way doctors and lawyers can set up their own practices, but the flat world of the web gives them so many more options. As Shirky says of the journalism world, we need experiments like this to help us create the future.

In the journalism world, there is a divide between journalists arguing that we need existing media institutions to preserve the higher goals of journalism and journalists arguing that new models are arising naturally out of new technologies. Sometimes, the first group sounds like it is arguing for the preservation of institutions for their own sake, and the latter group sounds like it is rooting for existing institutions to fall, whatever the price. We in the university need to be mindful that institutions are not the same as their purpose. We have enough lead time to prepare ourselves for an evolution I think is inevitable, but only if we think hard and experiment ourselves.


Posted by Eugene Wallingford | Permalink | Categories: General

October 12, 2011 12:31 PM

Programming for Everyone -- Really?

TL;DR version: Yes.

Yesterday, I retweeted a message that is a common theme here:

Teaching students how to operate software, but not produce software, is like teaching kids to read & not write. (via @KevlinHenney)

It got a lot more action than my usual fare, both retweets and replies. Who knew? One of the common responses questioned the analogy by making another, usually of this sort:

Yeah, that would be like teaching kids how to drive a car, but not build a car. Oh, wait...

This sounds like a reasonable comparison. A car is a tool. A computer is a tool. We use tools to perform tasks we value. We do not always want to make our own tools.

But this analogy misses out on the most important feature of computation. People don't make many things with their cars. People make things with a computer.

When people speak of "using a computer", they usually mean using software that runs on a computer: a web browser, a word processor, a spreadsheet program. And people use many of these tools to make things.

As soon as we move into the realm of creation, we start to bump into limits. What if the tool we are given doesn't allow us to say or do what we want? Consider the spreadsheet, a general data management tool. Some people use it simply as a formatted data entry tool, but it is more. Every spreadsheet program gives us a formula language for going beyond what the creators of Excel or Numbers imagined.
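To make that point concrete: even a one-cell formula is already a small program. Here is a hypothetical sketch in Python (the formula and the passing threshold are invented for illustration) of the same kind of conditional logic a spreadsheet user writes every day:

```python
# A spreadsheet formula such as =IF(B2 >= 60, "pass", "fail")
# is a tiny program. The same logic, written as a Python function:
def grade_status(score):
    """Return "pass" or "fail", mirroring the hypothetical formula."""
    return "pass" if score >= 60 else "fail"

print(grade_status(72))  # pass
print(grade_status(41))  # fail
```

The spreadsheet user who writes that formula is programming, whether or not she would use the word.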

But what about the rest of our tools? Must we limit what we say to what our tool affords us -- to what our tool builders afford us?

A computer is not just a tool. It is also a medium of expression, and an increasingly important one.

If you think of programming as C or Java, then the idea of teaching everyone to program may seem silly. Even I am not willing to make that case here. But there are different kinds of programming. Even professional programmers write code at many levels of abstraction, from assembly language to the highest high-level language. Non-programmers such as physicists and economists use scripting languages like Python. Kids of all ages are learning to program in Scratch.

Scratch is a good example of what I was thinking when I retweeted. Scratch is programming. But Scratch is really a way to tell stories. Just like writing and speaking.

Alfred Thompson summed up this viewpoint succinctly:

[S]tudents need to be creators and not just consumers.

Kids today understand this without question. They want to make video mash-ups and interactive web pages and cutting-edge presentations. They need to know that they can do more than just use the tools we deign to give them.

One respondent wrote:

As society evolves there is an increasing gap between those that use technology and those that can create technology. Whilst this is a concern, it's not the lowest common denominator for communication: speaking, reading and writing.

The first sentence is certainly true. The question for me is: on which side of this technology divide does computing live? If you think of computation as "just" technology, then the second sentence seems perfectly reasonable. People use Office to do their jobs. It's "just a tool".

It could, however, be a better tool. Many scientists and business people write small scripts or programs to support their work. Many others could, too, if they had the skills. What about teachers? Many routine tasks could be automated in order to give them more time to do what they do best, teach. We can write software packages for them, but then we limit them to being consumers of what we provide. They could create, too.
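The kind of small script I have in mind needn't be fancy. Here is a hypothetical few lines of Python, with an invented gradebook, of the sort a teacher might write to automate one routine task:

```python
# Hypothetical gradebook: student names mapped to quiz scores.
# (The names and numbers are invented for illustration.)
gradebook = {
    "Alice": [88, 92, 79],
    "Bob": [75, 83, 90],
}

# Print each student's quiz average -- a routine task that
# a few lines of code can do for a whole class at once.
for name, scores in gradebook.items():
    average = sum(scores) / len(scores)
    print(f"{name}: {average:.1f}")
```

A teacher who can write ten lines like these is a creator, not just a consumer of whatever software we hand her.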

Is computing "just tech", or more? Most of the world acts like it is the former. The result is, indeed, an ever increasing gap between the haves and the have nots. Actually, the gap is between the can dos and the cannots.

I, and many others, think computation is more than simply a tool. In the wake of Steve Jobs's death last week, many people posted his famous quote that computing is a liberal art. Alan Kay, one of my inspirations, has long preached that computing is a new medium on the order of reading and writing. The list of people in the trenches working to make this happen is too numerous to include.

More practically, software and computer technology are the basis of much innovation these days. If we teach the new medium to only a few, the "5 percent of the population over in the corner" to whom Jobs refers, we exclude the other 95% from participating fully in the economy. That restricts economic growth and hurts everyone. It is also not humane, because it restricts people's personal growth. Everyone has a right to the keys to the kingdom.

I stand in solidarity with the original tweeter and retweeter. Teaching students how to operate software, but not produce software, is like teaching kids to read but not to write. We can do better.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

October 10, 2011 2:56 PM

Making Your Own Mistakes

Earlier today, @johndmitchell retweeted a link from Tara "Miss Rogue" Hunt:

RT @missrogue: My presentation from this morning at #ennovation: The 10 Mistakes I've made...so you don't have to http://t.co/QE0DzF9tw

Danger ahead!

I liked the title and so followed the link to the slide deck. The talk includes a few good quotes and communicates some solid experience on how to fail as a start-up, and also how to succeed. I was glad to have read.

The title notwithstanding, though, be prepared. Other people making mistakes will not -- cannot -- save you from making the same mistakes. You'll have to make them yourself.

There are certain kinds of mistakes that don't need to be made again, but that happens when we eliminate an entire class of problems. As a programmer, I mostly don't have to re-make the mistakes my forebears made when writing code in assembly. They learned from their mistakes and made tools that shield me from the problems I faced. Now, I write code in a higher-level language and let the tools implement the right solution for me.

Of course, that means I face a new class of problems, or an old class of problems in a new way. So I make new kinds of mistakes. In the case of assembly and compilers, I am more comfortable working at that level and am thus glad to have been shielded from those old error traps, by the pioneers who preceded me.

Starting a start-up isn't the sort of problem we are able to bypass so easily. Collectively, we aren't good at all at reliably creating successful start-ups. Because the challenges involve other people and economic forces, they will likely remain challenges well into our future.

Warning, proceed at your risk!

Even though Hunt and other people who have tried and failed at start-ups can't save us from making these mistakes, they still do us a service when they reflect on their experiences and share with us. They put up guideposts that say "Danger ahead!" and "Don't go there!"

Why isn't that enough to save us? We may miss the signs in the noise of our world and walk into the thicket on our own. We may see the warning sign, think "My situation is different...", and proceed anyway. We may heed their advice, do everything we can to avoid the pitfall, and fail anyway. Perhaps we misunderstood the signs. Perhaps we aren't smart enough yet to solve the problem. Perhaps no one is, yet. Sometimes, we won't be until we have made the mistake once ourselves -- or thrice.

Despite this, it is valuable to read about our forebears' experiences. Perhaps we will recognize the problem part of the way in and realize that we need to turn around before going any farther. Knowing other people's experiences can leave us better prepared not to go too far down into the abyss. A mistake partially made is often better than a mistake made all the way.

If nothing else, we fail and are better able to recognize our mistake after we have made it. Other people's experience can help us put our own mistake into context. We may be able to understand the problem and solution better by bringing those other experiences to bear on our own experience.

While I know that we have to make mistakes to learn, I don't romanticize failure. We should take reasonable measures to avoid problems and to recognize them as soon as possible. That's the ultimate value in learning what Hunt and other people can teach us.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

August 17, 2011 3:52 PM

Remind Me Again...

This post is a mild and perhaps petulant rant about shackling free software. Feel free to skip it if you like.

I've been setting up a new iMac over the last couple of days. I ran into some difficulties installing Remind, a powerful text-based Unix calendar program, that made me sad.

First of all, I need to say "thank you" to the creator of Remind. He wrote the first version of the program back in the 1970s and has maintained and updated it over the last 30+ years. It has always been free, both as in beer and as in speech. Like many Unix folks, I became a devoted user of the program almost as soon as I discovered it.

Why am I sad? When I went to download the latest version, the server detected that I was connecting via a Mac browser and took me to a page that said only not to use Remind on an Apple product. I managed to download the source but found its compressed format incompatible with the tools, both apps and command-line programs, that I use to unstuff archives on my Mac. I finally managed to extract the source, build it, and install it. When Remind runs on my new machine, the program displays this message:

You appear to be running Remind on an Apple product. I'd rather that you didn't. Remind execution will continue momentarily.

... and delays for 30 seconds.

Wow, he really is serious about discouraging people from running his program on an Apple machine.

This is, of course, well within his rights. Like many people, he feels strongly about Apple's approach to software and apps these days. On the Remind home page, he writes:

Remind can be made to run under Mac OS X, but I prefer you not to do that. Apple is even more hostile than Microsoft to openness, using both technical and legal means to hobble what its customers and developers are allowed to do. If you are thinking of buying an Apple product, please don't. If you're unfortunate enough to already own Apple products, please consider switching to an open platform like Linux or FreeBSD that doesn't impose "1984"-like restrictions on your freedom.

I appreciate his desire to support the spirit of free software, to the point of turning long-time users away from his work. When I have downloaded earlier versions of Remind, I have noticed and taken seriously the author's remarks about Apple's closed approach. This version goes farther than offering admonition; it makes life difficult for users. I have always wondered about the stridency of some people in the free software community. I understand that they feel the only way to make a stand on their principles is to damage the freedom and openness of their own software. And they may be right. Companies like Microsoft and Apple are not going to change just because an independent software developer asks them to.

Then again, neither am I. I do take seriously the concerns expressed by Remind's author and others like him. The simple fact, though, is that I'm not likely to switch from my Mac because I find one of my command-line Unix tools no longer available. I have concerns of my own with Apple's approach to software these days, but at this point I still choose to use its products.

If it becomes too difficult to install the new versions of Remind, what will I do? Well, I could install the older version I have cached on my machine. Or perhaps I'll run a script such as rem2ics to free my data from Remind's idiosyncratic representation into the RFC2445 standard format. Then I would look for or write a new tool. Remind's author might be pleased that I wouldn't likely adopt Apple's iCal program and that I would likely make any tool I wrote for myself available to the rest of the world. I would not, however, tell users of any particular platform not to use my code. That's not my style.

I may yet choose to go that route anyway. As I continue to think about the issue, I may decide to respect the author's wishes and not use his program on my machine. If I do so, it will be because I want to show him that respect or because I am persuaded by his argument, not because I have to look at a two-line admonition or wait 30 seconds every time I run the program.

~~~~

Note: I could perhaps have avoided all the problems by using a package manager for Macs such as homebrew to download and install Remind. But I have always installed Remind by hand in the past and started down that path again this time. I don't know if homebrew's version of Remind includes the 30-second delay at execution. Maybe next time I'll give this approach a try and find out.


Posted by Eugene Wallingford | Permalink | Categories: General

August 03, 2011 7:55 PM

Psychohistory, Economics, and AI

Or, The Best Foresight Comes from a Good Model

Hari Seldon from the novel Foundation

In my previous entry, I mentioned re-reading Asimov's Foundation Trilogy and made a passing joke about psychohistory being a great computational challenge. I've never heard a computer scientist mention psychohistory as a primary reason for getting involved with computers and programming. Most of us were lucky to see so many wonderful and more approachable problems to solve with a program that we didn't need to be motivated by fiction, however motivating it might be.

I have, though, heard and read several economists mention that they were inspired to study economics by the ideas of psychohistory. The usual reason for the connection is that econ is the closest thing to psychohistory in modern academia. Trying to model the behavior of large groups of people, and reaping the advantages of grouping for predictability, is a big part of what macroeconomics does. (Asimov himself was most likely motivated in creating psychohistory by physics, which excels at predicting the behavior of masses of atoms over predicting the behavior of individual atoms.)

As you can tell from recent history, economists are nowhere near the ability to do what Hari Seldon did in Foundation, but then Seldon did his work more than 10,000 years in the future. Maybe 10,000 years from now economists will succeed as much and as well. Like my economist friends, I too am intrigued by economics, which also shares some important features in common with computer science, in particular a concern with the trade-offs among limited resources and the limits of rational behavior.

The preface to the third book in Asimov's trilogy, Second Foundation, includes a passage that caught my eye on this reading:

He foresaw (or he solved his [system's] equations and interpreted its symbols, which amounts to the same thing)...

I could not help but be struck by how this one sentence captured so well the way science empowers us and changes the intellectual world in which we live. Before the rapid growth of science and broadening of science education, the notion of foresight was limited to personal experience and human beings' limited ability to process that experience and generalize accurately. When someone had an insight, the primary way to convince others was to tell a good story. Foresight could be feigned and sold through stories that sounded good. With science, we have a more reliable way to assess the stories we are told, and a higher standard to which we can hold the stories we are told.

(We don't always do well enough in using science to make us better listeners, or better judges of purported foresights. Almost all of us can do better, both in professional settings and personal life.)

As a young student, I was drawn to artificial intelligence as the big problem to solve. Like economics, it runs directly into problems of limited resources and limited rationality. Like Asimov's quote above, it runs directly into the relationship between foresight and accurate models of the world. During my first few years teaching AI, I was often surprised by how fiercely my students defended the idea of "intuition", a seemingly magical attribute of men and women forever unattainable by computer programs. It did me little good to try to persuade them that their belief in intuition and "gut instinct" was outside the province of scientific study. Not only didn't they care; that was an integral part of their belief. The best thing I could do was introduce them to some of the techniques used to write AI programs and to show them such programs behaving in a seemingly intelligent manner in a situation that piqued my students' interest -- and maybe opened their minds a bit.

Over the course of teaching those early AI courses, I was eventually able to see one of the fundamental attractions I had to the field. When I wrote an AI program, I was building a model of intelligent behavior, much as Seldon's psychohistory involved building a model of collective human behavior. My inspiration did not come from Asimov, but it was similar in spirit to the inspiration my economist friends' drew from Asimov. I have never been discouraged or deterred by any arguments against the prospect of artificial intelligence, whether my students' faith-based reasons or by purportedly rational arguments such as John Searle's Chinese room argument. I call Searle's argument "purportedly rational" because, as it is usually presented, ultimately it rests on the notion that human wetware -- as a physical medium -- is capable of representing symbols in a way that silicon or other digital means cannot.

I always believed that, given enough time and enough computational power, we could build a model that approximated human intelligence as closely as we desired. I still believe this and enjoy watching (and occasionally participating in) efforts that create more and more intelligent programs. Unlike many, I am undeterred by the slow progress of AI. We are only sixty years into an enterprise that may take a few thousand years. Asimov taught me that much.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 15, 2011 4:25 PM

The Death of Bundling in University Education?

Clay Shirky's latest piece talks a bit about the death of bundling in journalism, in particular in newspapers. Bundling is the phenomenon of putting different kinds of content into a single product and selling consumers the whole. Local newspapers contain several kinds of content: local news coverage, national news, sports, entertainment, classified ads, obituaries, help columns, comics, .... Most subscribers don't consume all this content and won't pay for it all. In the twentieth century, it worked to bundle it all together, get advertisers to buy space in the package, and get consumers to buy the whole package. The internet and web have changed the game.

As usual, Shirky talking about the state and future of newspapers sets me to thinking about the state and future of universities [ 1, 2, 3, 4 ]. Let me say upfront that I do not subscribe to the anti-university education meme traversing the internet these days, which seems especially popular among software people. Many of its proponents speak too glibly about a world without the academy and traditional university education. Journalism is changing, not disappearing, and I think the same will be true of universities. The questions are, How will universities change? How should they change? Will universities be pulled downstream against their will, or will they actively redefine their mission and methods?

I wonder about the potential death of bundling in university. Making an analogy to Shirky's argument helps us to see some of the dissatisfaction with universities these days. About newspapers, he says:

Writing about the Dallas Cowboys in order to take money from Ford and give it to the guy on the City Desk never made much sense.

It's not hard to construct a parallel assertion about universities:

Teaching accounting courses in order to take money from state legislatures and businesses and give it to the humanities department never made much sense.

Majors that prepare students for specific jobs and careers are like the sports section. They put students in the seats. States and businesses want strong economies, so they are willing to subsidize students' educations, in a variety of ways. Universities use part of the money to support higher-minded educational goals, such as the liberal arts. Everyone is happy.

Well, they were in the 20th century.

The internet and web have drastically cut the cost of sharing information and knowledge. As a result, they have cut the cost of "acquiring" information and knowledge. When the world views the value of the bundle as largely about the acquisition of particular ingredients (sports scores or obituaries; knowledge and job skills), the business model of bundling is undercut, and the people footing most of the bill (advertisers; states and businesses) lose interest.

In both cases, the public good being offered by the bundle is the one most in jeopardy by unbundling. Cheap and easy access to targeted news content means that there is no one on the production side of the equation to subsidize "hard" news coverage for the general public. Cheap and easy access to educational material on-line erodes the university's leverage for subsidizing its public good, the broad education of a well-informed citizenry.

Universities are different from newspapers in one respect that matters to this analogy. Newspapers are largely paid for by advertisers, who have only one motivation for buying ads. Over the past century, public universities have largely been paid for by state governments and thus the general public itself. This funder of first resort has an interest in both the practical goods of the university -- graduates prepared to contribute to the economic well-being of the state -- and the public goods of the university -- graduates prepared to participate effectively in a democracy. Even still, over the last 10-20 years we have seen a steep decline in the amount of support provided by state governments to so-called "state universities", and elected representatives seem to lack the interest or political will to reverse the trend.

Shirky goes on to explain why "[n]ews has to be subsidized, and it has to be cheap, and it has to be free". Public universities have historically had these attributes. Well, few states offer free university education to their citizens, but historically the cost has been low enough that cost was not an impediment to most citizens.

As we enter a world in which information and even instruction are relatively easy to come by on-line, universities must confront the same issues faced by the media: the difference between what people want and what people are willing to pay for; the difference between what the state wants and what the state is willing to pay for. Many still believe in the overarching value of a liberal arts component to university education (I do), but who will pay for it, require it, or even encourage it?

Students at my university have questioned the need to take general education courses since before I arrived here. I've always viewed helping them to understand why as part of the education I help to deliver. The state was paying for most of their education because it had an interest in both their economic development and their civic development. As the adage floating around the Twitter world this week says, "If you aren't paying for the product, you are the product." Students weren't our customers; they were our product.

I still mostly believe that. But now that students and parents are paying the majority of the cost of the education, a percentage that rises every year, it's harder for me to convince them of that. Heck, it's harder for me to convince myself of that.

Shirky says other things about newspapers that are plausible when uttered about our universities as well, such as:

News has to be subsidized because society's truth-tellers can't be supported by what their work would fetch on the open market.

and:

News has to be cheap because cheap is where the opportunity is right now.

and:

And news has to be free, because it has to spread.

Perhaps my favorite analog is this sentence, which harkens back to the idea of sports sections attracting automobile dealers to advertise and thus subsidize the local government beat (emphasis added):

Online, though, the economic and technological rationale for bundling weakens -- no monopoly over local advertising, no daily allotment of space to fill, no one-size-fits-all delivery system. Newspapers, as a sheaf of unrelated content glued together with ads, aren't just being threatened with unprofitability, but incoherence.

It is so very easy to convert that statement into one about our public universities. We are certainly being threatened with unprofitability. Are we also being threatened with incoherence?

Like newspapers, the university is rapidly finding itself in need of a new model. Most places are experimenting, but universities are remarkably conservative institutions when it comes to changing themselves. I look at my own institution, whose budget situation calls for major changes. Yet it has been slow, at times unwilling, to change, for a variety of reasons. Universities that depend more heavily on state funding, such as mine, need to adapt even more quickly to the change in funding model. It is perhaps ironic that, unlike our research-focused sister schools, we take the vast majority of our students from in-state, and our graduates are even more likely to remain in the state, to be its citizens and the engines of its economic progress.

Shirky says that we need the new news environment to be chaotic. Is that true of our universities as well?


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

July 10, 2011 7:53 PM

Dave Barry on Media Computation

humorist Dave Barry

For Father's Day, my daughter gave me the most recent book of essays by humorist Dave Barry, subtitled his "Amazing Tales of Adulthood". She thought I'd especially enjoy the chapter on dance recitals, which reports -- with only the slightest exaggeration, I assure you -- an experience shared by nearly every father of daughters these days. He nailed it, right down to not finding your daughter on-stage until her song is ending.

However, his chapter on modern technology expresses a serious concern that most readers of this blog will appreciate:

... it bothers me that I depend on so many things that operate on principles I do not remotely understand, and which might not even be real.

He is talking about all of modern technology, including his microwave oven, but he when he lists tools that baffle him, digital technology leads the way:

I also don't know how my cell phone works, or my TV, or my computer, or my iPod, or my GPS, or my camera that puts nineteen thousand pictures inside a tiny piece of plastic, which is obviously NOT PHYSICALLY POSSIBLE, but there it is.

He knows this is "digital" technology, because...

At some point ... all media -- photographs, TV, movies, music, oven thermometers, pornography, doorbells, etc. -- became "digital". If you ask a technical expert what this means, he or she will answer that the information is, quote, "broken down into ones and zeros." Which sounds good, doesn't it? Ones and zeros! Those are digits, all right!

The problem is, he has never seen the ones and zeros. No matter how closely he looks at his high-def digital television, he can't see any ones or zeros. He goes on to hypothesize that no one really understands digital technology, that this "digital" thing is just a story to dupe users, and that such technology is a serious potential threat to humanity.

Of course, Dave is just having fun, but from 10,000 feet, he is right. Take a random sample of 100 people from this planet, and you'd be lucky to find one person who could explain how an iPod or digital camera works. I know that we don't all have to understand all the details of all our tools, otherwise we would all be in trouble. But this has become a universal, omnipresent phenomenon. Digital computations are the technology of our time. Dave could have listed even more tools that use digital technology, had he wanted (or known). If you want to talk about threats to humanity, let's start talking planes, trains, and automobiles.

For so many people, every phase of life depends on or is dominated by digital computation. Shouldn't people have some inkling of how all this stuff works? This is practical knowledge, much as knowing a little physics is useful for moving around the world. Understanding digital technology can make people better users of their tools and help them dream up improvements.

But to me, this is also humanistic knowledge. Digital technology is a towering intellectual and engineering achievement, of this or any era. It empowers us, but it also stands as a testament to humanity's potential. It reflects us.

Dave talked about a threat lying in wait, and there is one here, though not the one he mentions. We need people who understand digital technology because we need people to create it. Contrary to his personal hypothesis, this stuff isn't sent from outer space to the Chinese to be packaged for sale in America.

After reading this piece, I had two thoughts.

First, I think we could do a lot for Dave's peace of mind if we simply enrolled him in a media computation course! He is more than welcome to attend our next offering here. I'll even find a way for him to take the course for free.

Second, perhaps we could get Dave to do a public service announcement for studying computer science and digital technology. He's a funny guy and might be able to convince a young person to become the next Alan Kay or Fran Allen. He is also the perfect age to appeal to America's legislators and school board members. Perhaps he could convince them to include digital technology as a fundamental part of general K-12 education.

I am pretty sure that I will need your help to make this happen. I am no more capable of convincing Dave Barry to do this than of producing a litter of puppies. (*)

~~~~

(*) Analogy stolen shamelessly from the same chapter.


Posted by Eugene Wallingford | Permalink | Categories: General

July 03, 2011 1:16 PM

A Few Percent

Novak Djokovic

I am a huge tennis fan. This morning, I watched the men's final at Wimbledon and, as much as I admire Roger Federer and Rafael Nadal for their games and attitudes, I really enjoyed seeing Novak Djokovic break through for his first title at the All-England Club. Djokovic has been the #3 ranked player in the world for the last four years, but in 2011 he has dominated, winning 48 of 49 matches and two Grand Slam titles.

After the match, commentator and former Wimbledon champion John McEnroe asked Djokovic what he had changed about his game to become number one. What was different between this year and last? Djokovic shrugged his shoulders, almost imperceptibly, and gave an important answer:

A few percent improvement in several areas of my game.

The difference for him was not an addition to his repertoire, a brand new skill he could brandish against Nadal or Federer. It was a few percentage points' improvement in his serve, in his return, in his volley, and in his ability to concentrate. Keep in mind that he was already the best returner of service in the world and strong enough in the other elements of his game to compete with and occasionally defeat two of the greatest players in history.

That was not enough. So he went home and got a little better in several parts of his game.

Indeed, the thing that stood out to me from his win this morning against Rafa was the steadiness of his baseline play. His ground strokes were flat and powerful, as they long have been, but this time he simply hit more balls back. He made fewer errors in the most basic part of the game, striking the ball, which put Nadal under constant pressure to do the same. Instead of making mistakes, Djokovic gave his opponent more opportunities to make mistakes. This must have seemed especially strange to Nadal, because this is one of the ways in which he has dominated the tennis world for the last few years.

I think Djokovic's answer is so important because it reminds us that learning and improving our skills are often about little things. We usually recognize that getting better requires working hard, but I think we sometimes romanticize getting better as being about qualitative changes in our skill set. "Learn a new language, or a new paradigm, and change how you see the world." But as we get better this becomes harder and harder to do. Is there any one new skill that will push Federer, Nadal, or Djokovic past his challengers? They have been playing and learning and excelling for two decades each; there aren't many surprises left. At such a high level of performance, it really is a few percent improvement in each area of the game that makes the difference.

Even for us mortals, whether playing tennis or writing computer programs, the real challenge -- and the hardest work -- often lies in making incremental improvements to our skills. In practicing the cross-court volley or the Extract Class refactoring thousands and thousands of times. In learning to concentrate a little more consistently when we tire by trying to concentrate a little more consistently over and over.

As Nadal said in his own post-game interview, the game is pretty simple. The challenge is to work hard and learn how to play it better.

Congratulations to Novak Djokovic for his hard work at getting a few percent better in several areas of his game. He has earned the accolade of being, for now, the best tennis player in the world.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

July 01, 2011 2:29 PM

What Would Have To Be True...

Yesterday, I read Esther Derby's recent post, Promoting Double Loop Learning in Retrospectives, which discusses ways to improve the value of our project retrospectives. Many people who don't do project retrospectives will still find Derby's article useful, because it's really about examining how we think and expanding possibilities.

One of the questions she uses to jump start deeper reflection is:

What would have to be true for [a particular practice] to work?

This is indeed a good question to ask when we are trying to make qualitative changes in our workplaces and organizations, for the reasons Derby explains. But it is also useful more generally as a communication tool.

I have a bad personal habit. When someone says something that doesn't immediately make sense to me, my first thought is sometimes, "That doesn't make sense." (Notice the two words I dropped...) Even worse, I sometimes say it out loud. That doesn't usually go over very well with the person I'm talking to.

Sometime back in the '90s, I read in a book about personal communication about a technique for overcoming this disrespectful tendency, which reflects a default attitude. The technique is to train yourself to think a different first thought:

What would have to be true in order for that statement to be true?

Rather than assume that what the person says is false, assume that it is true and figure out how it could be true. This accords my partner the respect he or she deserves and causes me to think about the world outside my own point of view. What I found in practice, whether with my wife or with a professional colleague, was that what they had said was true -- from their perspective. Sometimes we were starting from different sets of assumptions. Sometimes we perceived the world differently. Sometimes I was wrong! By pausing before reacting and going on the defensive (or, worse, the offensive), I found that I was saving myself from looking silly, rash, or mean.

And yes, sometimes, my partner was wrong. But now my focus was not on proving him wrong but on addressing the underlying cause of his misconception. That led to a very different sort of conversation.

So, this technique is not an exercise in fantasy. It is an exercise in more accurate perception. Sometimes, what would have to be true in the world actually is true. I just hadn't noticed. In other cases, what would have to be true in the world is how the other person perceives the world. This is an immensely useful thing to know, and it helps me to respond both more respectfully and more effectively. Rather than try to prove the statement false in some clinical way, I am better served by taking one of two paths:

  • helping the other person perceive the world more clearly, when his or her perception clashes with reality, or
  • recognizing that the world is more complicated than I first thought and that, at least for now, I am better served by acting from a state of contingency, in a world of different possible truths.

I am still not very good at this, and occasionally I slip back into old habits. But the technique has helped me to be a better husband as well as a better colleague, department head, and teacher.

Speaking as a teacher: It is simply amazing how different interactions with students can be when, after a student says something that seems to indicate they just don't get it, I ask myself, "What would have to be true in order for that statement to be true?" I have learned a lot about student misconceptions and about the inaccuracy of the signals I send students in my lectures and conversations just by stepping back and thinking, "What would have to be true..."

Sometimes, our imaginations are too small for our own good, and we need a little boost to see the world as it really is. This technique gives us one.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

June 30, 2011 10:26 AM

Manhood, and Programming, for Amateurs

A while back, I mentioned reading "Manhood for Amateurs". Like many of you, I collect quotes and passages. These days, I collect them in text files, a growing collection of digital commonplace books that I store in one of my folders of stuff. "Manhood for Amateurs" gave me many. Some seem as true about my life as programmer or teacher as they are about manhood or the world more generally.

... I never truly felt [the immensity of the universe] until I looked through a first-rate instrument at an unspoiled sky.

Call me crazy, but this brought to mind the summer I learned Smalltalk. I had been programming for ten years. When I first opened that Digitalk image, I felt like I'd walked through the back of C.S. Lewis's wardrobe. After working through a simple tutorial from the manual, I was ready to explore. The whole world of computer science seemed to lie before me, written in simple sentences. Even arithmetic was implemented right there for me to read, to play with.

Smalltalk was my first first-rate instrument.

(Scheme was my second. There is nothing there! Just a few functions. The paucity of types, functions, objects, and libraries let me focus on what had to be there.)

the dark tide of magical boredom [... was ...] the source of all my inspiration

I wonder how much of my desire to program early on was driven by the stark fact that, if I wanted the computer to do anything, I had to teach it. There are so many distractions these days, on the computer and off. Will some possible future programmers never have a chance to create a desire out of their own bored minds' search?

If we are conducting our lives in the usual fashion, each of us serves as a constant source of embarrassment to his or her future self....

Spoken like a programmer. If you don't believe me, dig out some of your old code and read it.

Everything you love most is a lifelong focus of insufficiency.

Chabon is speaking as a man, a son, a husband, a father, and also, I presume, a writer. I feel this insufficiency as a teacher and as a programmer.

Every work of art is a secret handshake, a challenge that seeks the password, a heliograph flashed from a tower window, an act of hopeless optimism in the service of bottomless longing.

Okay, so this is more poetic than some of my programmer friends care to be, but it made me think of some of the little gems of code I have stumbled upon over the years. They were conceived in a programmer's mind and made real, then shared with the world. One of the great joys of living in this age is the open-source software world, with the web and GitHub and CPAN available. It is so easy to find software created by a fellow artist out of love and hope and magic. It is so easy to share our own creations.

That leads me to one last quote, which comes from an essay in which Chabon describes his experience as a member of writers' workshops in his MFA program. He was a young poseur, being as dishonest to himself as he was to the people around him. He began to grow up when he realized something:

Without taking themselves half as seriously as I did, they were all twice as serious about what they were doing.

Take a look at all the wonderful work being done in the software world and being shared and written about for us. Then see if you can look yourself in the mirror and pretend you are anything but just another beginner on a path to being better. Then, get serious.

(Just don't mistake being serious with not having fun!)


Posted by Eugene Wallingford | Permalink | Categories: General

June 13, 2011 7:11 PM

A Few More Thoughts on the Future of Universities

Our state legislature still has not passed a budget for the next fiscal year, which leaves the university hanging, waiting to set its course for 2011-2012. We expect another round of big cuts, the latest in a stretch of more than a decade during which the funding base for state universities has eroded rapidly.

I've written before about the fault lines under higher education. I'm really not a Chicken Little sort of person, but I do think it's important that we pay attention to changes in the world and prepare for them -- maybe even get ahead of the curve and actively build an institution that serves the state and its people well.

Over the weekend, I read William Deresiewicz's recent piece in The Nation, Faulty Towers: The Crisis in Higher Education, which looks at the pyramid scheme that is graduate education in the humanities. Deresiewicz writes about places like Yale, but much of what he says applies across the academy. This passage made sirens go off in my head:

As Gaye Tuchman explains in Wannabe U (2009), a case study in the sorrows of academic corporatization, deans, provosts and presidents are no longer professors who cycle through administrative duties and then return to teaching and research. Instead, they have become a separate stratum of managerial careerists, jumping from job to job and organization to organization like any other executive: isolated from the faculty and its values, loyal to an ethos of short-term expansion, and trading in the business blather of measurability, revenue streams, mission statements and the like. They do not have the long-term health of their institutions at heart. They want to pump up the stock price (i.e., U.S. News and World Report ranking) and move on to the next fat post.

...

What we have in academia, in other words, is a microcosm of the American economy as a whole: a self-enriching aristocracy, a swelling and increasingly immiserated proletariat, and a shrinking middle class. The same devil's bargain stabilizes the system: the middle, or at least the upper middle, the tenured professoriate, is allowed to retain its prerogatives -- its comfortable compensation packages, its workplace autonomy and its job security -- in return for acquiescing to the exploitation of the bottom by the top, and indirectly, the betrayal of the future of the entire enterprise.

Things aren't quite that bad at my school. Most of our administrators are home-grown, not outside hires using us as the next rung on their career ladder. But we are susceptible to other trends identified in this article, in particular the rapid growth of the non-faculty staff, both mid-level administrators and support staff for the corporate and human services elements of the university.

Likewise, the situation is different with our faculty. We have relatively few adjuncts teaching courses, and an even smaller proportion of grad students. We are a "teaching university", and our tenured and tenure-track faculty teach three courses each semester. That's great for our students, but our productivity in the classroom makes scrounging for grants and external research dollars hard to do. We may be more productive in the classroom than our research-school brethren, but with less recourse to external dollars we are more dependent on state funding. Unfortunately, our board of regents and our state government don't seem to appreciate this and leave us hanging by much thinner threads as state appropriations dwindle. Now there is talk of assigning faculty who are less productive as researchers to teach a fourth class each semester, which will only further hamper our ability to create and disseminate knowledge -- and our ability to attract external funding.

The idea of career administrators hit close to home for me personally, too, as I enter my third term as a department head. I am at heart a computer scientist and programmer, not an administrator. But it's easy to get sucked into the vortex of paperwork and meetings. I need to think of this year not as the first year of my next term but as the first year of my last term, or perhaps as my third-to-last year. Such a mindset may be a better way for me to aim at goals I think most important while preparing the department for a transition to new leadership.

One last passage in Deresiewicz's article got me to thinking. While talking about the problems with tenure, he points out one of the problems of not having tenure: Who will pursue the kind of research that cannot be converted to a quick buck if faculty can expect to be jettisoned by universities at any time, but especially as they age and become more expensive than new hires?

Doctors and lawyers can set up their own practice, but a professor can't start his own university.

I've been thinking about this idea for a while but don't think I've written about it yet. It's something that really intrigues me. There are so many obstacles lying in the way of achieving the idea, and the differential immediate applied value of the various disciplines is only one. Yet it is an interesting thought experiment, one I hope to write about more in the future.


Posted by Eugene Wallingford | Permalink | Categories: General

June 09, 2011 6:17 PM

Language, Style, and Life

I should probably apologize for the florid prose in this recent post. It was written under the influence of "Manhood for Amateurs", Michael Chabon's book of essays on life, which I was reading at the time. Chabon is a novelist known for his evocative prose, and the language of "Manhood" captivated me. However, such style and vocabulary are perhaps best left to masters like Chabon. In the hands of amateur writers such as me, they pale in comparison.

I am often influenced as a writer by what I've been reading lately. There is something about the rhythm of some authors' words and sentences that vibrates in my mind and finds its way into my own words and sentences. Recognizing this tendency in myself, I often prepare for a bout of writing by reading a particular writer. For example, when I set out to write software patterns, I like to prime my brain by reading Kent Beck's Smalltalk Best Practice Patterns, for its spare, clean, readable style. I do the same when I code, sometimes. Browsing the current code base gets my mind ready to write new code and to refactor. Whenever I used to start a new Smalltalk project, I would browse the image to put myself in a Smalltalk frame of mind. These days, I'll do the same with Ruby -- GitHub is full of projects I admire by programmers whose work I respect.

I can strongly recommend Chabon's book. It gave me as much life as any book I've read in a long while. Men will find themselves on every page of "Manhood". American men of a certain age will recognize and appreciate its cultural allusions even more. Women will find a bit of insight into the minds of the men in their lives, and receive confirmation from one particularly honest man that, most of the time, we don't have a clue what we are doing.

Is there anything in "Manhood" specifically for programmers and other computer types? No, though there are a couple of references to computers, including this one in the essay "Art of Cake":

Cooking satisfies the part of me that enjoys struggling for days to transfer an out-of-print vinyl record by Klaatu to digital format, screwing with scratch filters and noise reducers, only to have the burn fail every time at the very same track.

I love to cook in much the way Chabon describes, but I must admit that I've never had quite the drive to tinker with settings, configuration files, and boot sectors that my Linux friends seem to have. Cooking fills this need better for me than installing the latest distribution of my operating system. My drive with computers has always been to create things with programs, and in that regard I was most at home in "Manhood" when he talked about writing.

Chabon does have an essay in the closing section of the book that echoes my observation that there is no normal, though his essay explores what it means for daily life to be normal. I usually see connections of this sort to my life, and readers of this blog won't be surprised if I write a post soon about how the truths of life that Chabon explores find themselves residing in the mind of this programmer and teacher.

One note in closing: Good Boy that I am, I must tell you that Chabon uses language I would never use, and he occasionally discusses frankly, though briefly, drug usage and sex. Fortunately, as I grew up, I learned that I could read about things I would never say or do, and benefit from the experience.


Posted by Eugene Wallingford | Permalink | Categories: General

May 23, 2011 2:34 PM

Plan A Versus Plan B

Late last week, Michael Nielsen tweeted:

"The most successful people are those who are good at Plan B." -- James Yorke

This is one of my personal challenges. I am a pretty good Plan A person. Historically, though, I am a mediocre Plan B person. This is true of creating Plan B, but more importantly of recognizing and accepting the need for Plan B.

Great athletes are good at Plan B. My favorite Plan B from the sporting world was executed by Muhammad Ali in the Rumble in the Jungle, his heavyweight title fight against George Foreman in October 1974. Ali was regarded by most at that time as the best boxer in the world, but in Foreman he encountered a puncher of immense power. At the end of Round 1, Ali realized that his initial plan of attacking Foreman was unlikely to succeed, because Foreman was also a quick fighter who had begun to figure out Ali's moves. So Ali changed plans, taking on greater short-term risk by allowing Foreman to hit him as much as he wanted, so long as the blows were not the kind likely to end the fight immediately. Over the next few rounds, Foreman began to wear down, unaccustomed to throwing so many punches for so many rounds against an opponent who did not weaken. Eventually, Ali found his opening, attacked, and ended the fight in Round 8.

This fight is burned in my mind for the all-time great Plan B moment: Ali sitting on his stool between the first and second rounds, eyes as wide and white as platters. I do not ever recall seeing fear in Muhammad Ali's eyes at any other time in his career, before or after this fight. He believed that Foreman could knock him out. But rather than succumb to the fear, he gathered himself, recalculated, and fought a different fight. Plan B. The Greatest indeed.

Crazy software developer that I am, I see seeds of Plan B thinking in agile approaches. Keep Plan A simple, so that you don't overcommit. Accept Plan B as a matter of course, refactoring in each cycle to build what you learn from writing the code back into the program. React to your pair's ideas and to changes in the requirements with aplomb.

There is good news: We can learn how to be better at Plan B. It takes effort and discipline, just as changing any of our habits does. For me, it is worth the effort.

~~~~

If you would like to learn more about the Rumble in the Jungle, I strongly recommend the documentary film When We Were Kings, which tells the story of this fight and how it came to be. Excellent sport, excellent art, and you can see Ali's Plan B moment with your own eyes.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Software Development

April 26, 2011 4:41 PM

Students Getting There Faster

I saw a graphic over at Gaping Void this morning that incited me to vandalism:

teaching is the art of getting people to where they need to be

A lot of people at our colleges and universities seem to operate under the assumption that our students need us in order to get where they need to be. In A.D. 1000, that may have been true. Since the invention of the printing press, it has been becoming increasingly less true. With the invention of the digital computer, the world wide web, and more and more ubiquitous network access, it's false, or nearly so. I've written about this topic from another perspective before.

Most students don't need us, not really. In my discipline, a judicious self-study of textbooks and all the wonderful resources available on-line, lots of practice writing code, and participation in on-line communities of developers can give most students a solid education in software development. Perhaps this is less true in other disciplines, but I think most of us greatly exaggerate the value of our classrooms for motivated students. And changes in technology put this sort of self-education within reach of more students in more disciplines every day.

Even so, there has never been much incentive for people not to go to college, and plenty of non-academic reasons to go. The rapidly rising cost of a university education is creating a powerful financial incentive to look for alternatives. As my older daughter prepares to head off to college this fall, I appreciate that incentive even more than I did before.

Yet McLeod's message resonates with me. We can help most students get where they need to be faster than they would get there without us.

In one sense, this has always been true. Education is more about learning than teaching. In the new world created by computing technologies, it's even more important that we in the universities understand that our mission is to help people get where they need to be faster and not try to sell them a service that we think is indispensable but which students and their parents increasingly see as a luxury. If we do that, we will be better prepared for reality as reality changes, and we will do a better job for our students in the meantime.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

April 18, 2011 4:44 PM

At the Penumbra of Intersections

Three comments on my previous post, Intersections, in decreasing order of interest to most readers.

On Mediocrity. Mediocrity is a risk if we add so many skills to our portfolio that we don't have the ability or energy to be good at all of them. This article talks about start-up companies, but I think its lesson applies more broadly to the idea of carving out one's niche. For start-ups, as in life, mediocrity is often a worse outcome than failure. When we fail, we know to move on, and we do. When we achieve mediocrity, sometimes we are just good enough to feel comfortable. It's hard to come up with the willingness to give up the security, or the energy it takes to push ourselves out of the local maximum. But then we miss out on the chance to reach our full potential.

Who Is "Non-Technical"? My post said, "my talk considered the role of programming in the future of people who study human communication, history, and other so-called non-technical fields". I qualified "non-technical", but still I wonder: How many disciplines are non-technical these days, in the era of big data and computation everywhere? How many of these disciplines will be non-technical in the same way 20 years from now?

Keller McBride's color spray artwork

Yak Shaving. I went looking for Venn diagrams to illustrate my post, and then realized I should just create my own. As I played with a couple of tools, I remembered a cool CS 1 assignment I used several years ago, and one student's solution in particular. Suddenly I was obsessed with using my own homegrown tool. That meant finding the course archive and Keller's solution. Then finding my own solution, which had a couple of extra features. Then digging out Dr. Java and making it work with a current version of the media comp tools. Then extending the simple graphics language we used, and refactoring my code, and... The good news is that I now have running again a nice, very simple tool for drawing simple graphics, one that can be used to annotate existing images. And I got to have fun tinkering with code for a while on a cloudy Sunday afternoon.


Posted by Eugene Wallingford | Permalink | Categories: General

April 17, 2011 12:43 PM

Intersections

It seems I've been running into intersections everywhere.

In one of Richard Feynman's letters, he wrote of two modes for scientists: deep and broad. Scientists who focus on one thing typically win the big awards, but Feynman reassured his correspondent that scientists who work broadly in the intersections of multiple disciplines can make valuable contributions.

Scott Adams wrote about the value of combining skills. John Cook commented on Adams's idea, and one of Cook's readers commented on Cook's comment.

A week ago Friday, I spoke at a gathering of professors, students, and local business people who are interested in interactive digital technologies. Among other things, my talk considered the role of programming in the future of people who study human communication, history, and other so-called non-technical fields. One of my friends and former students, now a successful entrepreneur who employs many of our current and former students, spoke about how to succeed in business as a start-up. His talk inspired the audience with the power of passion, but he also gave some practical advice. It is difficult to be the best at any one thing, but if you are very good at two or three or five, then you can be the best in a particular market niche. The power of the intersection.

Wade used a Venn diagram to express his idea:

a Venn diagram of two intersecting sets

The more skills -- "core competencies", in the jargon of business and entrepreneurship -- you add, the more unique your niche:

a Venn diagram of four intersecting sets

As I thought about intersections in all these settings, a few ideas began to settle in my mind:

Adding more circles to your Venn diagram is a good thing, even if you feel they limit your ability to excel in one of the other areas. Each circle adds depth to your niche at the intersection. Having several skills gives you the agility to shift your focus as the world changes -- and as you change.

At some point, adding more circles to your Venn diagram starts to hurt you, not help you. For most of us, there is a limit to the number of different areas we can realistically be good in. If we are unable to perform at a high level in all the areas, or keep up with the changes as they evolve, we end up being mediocre. Mediocrity isn't usually good enough to excel in the market, and it isn't a fun place to live.

The fact that we can create intersections in which to excel is a great opportunity for people who do not have the interest or inclination to focus on any one area too narrowly. Perhaps we can't all be Nobel Prize-winning physicists, but in principle we all can make our own niche.

The challenge is that you still have to work hard. This isn't about being the sort of dilettante who skims along the surface of knowledge without ever getting wet. It's about being good at several things, and that takes time and energy.

Of course, that is what makes Nobel Prize winners, too: hard work. They simply devote nearly all of their time and energy to one discipline.

I think it's good news that hard work is the common denominator of nearly all success. We may not control many things in this world, but we have control over how hard we work.


Posted by Eugene Wallingford | Permalink | Categories: General

April 12, 2011 7:55 PM

Commas, Refactoring, and Learning to Program

The most important topic in my Intelligent Systems class today was the comma. Over the last week or so, I had been grading their essays on communicating the structure and intent of programs. I was not all that surprised to find that their thoughts on communicating the structure and intent of programs were not always reflected in their essays. Writing well takes practice, and these essays are for practice. But the thing that stood out most glaringly from most of the papers was the overuse, misuse, and occasional underuse of the comma. So after I gave a short lecture on case-based reasoning, we talked about commas. Fun was had by all, I think.

On a more general note, I closed our conversation with a suggestion that perhaps they could draw on lessons they learn writing, documenting, and explaining programs to help them write prose. Take small steps when writing new content, not worrying as much about form as about the idea. Then refactor: spend time reworking the prose, rewriting, condensing, and clarifying. In this phase, we can focus on how well our text communicates the ideas it contains. And, yes, good structure can help, whether at the level of sentences, paragraphs, or the whole essay.

I enjoyed the coincidence of later reading this passage in Roy Behrens's blog, The Poetry of Sight:

Fine advice from poet Richard Hugo in The Triggering Town: Lectures and Essays on Poetry and Writing (New York: W.W. Norton, 1979)--

Lucky accidents seldom happen to writers who don't work. You will find that you may rewrite and rewrite a poem and it never seems quite right. Then a much better poem may come rather fast and you wonder why you bothered with all that work on the earlier poem. Actually, the hard work you do on one poem is put in on all poems. The hard work on the first poem is responsible for the sudden ease of the second. If you just sit around waiting for the easy ones, nothing will come. Get to work.

This is an important lesson for programmers, especially relative beginners, to learn. The hard work you do on one program is put in on all programs. Get to work. Write code. Refactor. Writing teaches writing.

~~~~

Long-time readers of this blog may recall that I once recommended The Triggering Town in an entry called Reading to Write. It is still one of my favorites -- and due for another reading soon!


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

April 04, 2011 7:26 PM

Saying "Hell Yeah! or No" to "Hell Yeah! or No"

Sometimes I find it hard to tell someone 'no', but I rarely regret it.

I have been thinking a lot lately about the Hell Yeah! or No mindset. This has been the sort of year that makes me want to live this way more readily. It would be helpful when confronting requests that come in day to day, the small stuff that so quickly clutters a day. It would also be useful when facing big choices, such as "Would you like another term as department head?"

Of course, like most maxims that wrap up the entire universe in a few words, living this philosophy is not as simple as we might like it to be.

The most salient example of this challenge for me right now has to do with granularity. Some "Hell Yeah!"s commit me to other yesses later, whether I feel passionate about them or not. If I accept another term as head, I implicitly accept certain obligations to serve the department, of course, and also the dean. As a department head, I am a player on the dean's team, which includes serving on certain committees across the college and participating in college-level discussions of strategy and tactics. The 'yes' to being head is, in fact, a bundle of yesses, more like a project in Getting Things Done than a next action.

Another thought came to mind while ruminating on this philosophy, having to do with opportunities. If I do not find myself with the chance to say "Hell Yeah!" very often, then I need to make a change. Perhaps I need to change my attitude about life, to accept the reality of where and who I am. More likely, though, I need to change my environment. I need to put myself in more challenging and interesting situations, and hang out with people who are more likely to ask the questions that provoke me to say "Hell Yeah!"


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

March 31, 2011 8:06 PM

My Erdos Number

Back in the early days of my blog, I wrote about the H number as a measure of a researcher's contribution to the scientific community. In that article, the mathematician Paul Erdos makes a guest appearance in a quoted discussion about the trade-off between a small number of highly influential articles and a large number of articles having smaller effect.

Erdos is perhaps the best example of the former. By most accounts, he published more papers than any other mathematician in history, usually detailing what he called "little theorems". He is also widely known for the number of different coauthors with whom he published, so much so that one's Erdos number is a badge of honor among mathematicians and computer scientists. The shorter the path between a researcher and Erdos in the collaboration graph of authors and co-authors, the more impressive.

Kevlin Henney recently pointed me in the direction of Microsoft's VisualExplorer, which finds the shortest paths between any author and Erdos. Now I know that my Erdos number is 3. To be honest, I was surprised to find that my number was so small. There are many paths of lengths four and five connecting me to Erdos, courtesy of several of my good buddies and co-authors who started their professional lives in mathematics. (Hey to Owen and Joe.)

But thanks to Dave West, I have a path of length 3 to Erdos. I have worked with Dave at OOPSLA and at ChiliPLoP on a new vision for computer science, software development, and university education. Like me, Dave has not published a huge number of papers, but he has an eclectic set of interests and collaborators. One of his co-authors published with Erdos. 1-2-3!

In the world of agile software development, we have our own graph-theoretic badge of honor, the Ward number. If you have pair-programmed with Ward Cunningham, your Ward number is 1... and so on. My Ward number is 2, via the same Joe in my Erdos network, Bergin.

Back in even earlier days of my blog, I wrote an entry connected to Erdos, via his idea of Proofs from THE BOOK. Erdos was a colorful character!

Yes, computer scientists and mathematicians like to have fun, even if their fun involves graphs and path-finding algorithms.
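For the fellow travelers who enjoy that sort of fun: an Erdos number is just the length of the shortest path in the collaboration graph, which a breadth-first search finds directly. Here is a minimal sketch in Python; the names and edges are entirely made up for illustration (real co-authorship data would come from a bibliography database), with "X" standing in for Dave's co-author.

```python
# Toy breadth-first search over a hypothetical collaboration graph.
# The graph below is illustrative only, not real co-authorship data.
from collections import deque

def collaboration_distance(graph, start, goal):
    """Return the length of the shortest co-author path from start
    to goal, or None if they are not connected."""
    if start == goal:
        return 0
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        person, dist = queue.popleft()
        for coauthor in graph.get(person, []):
            if coauthor == goal:
                return dist + 1
            if coauthor not in seen:
                seen.add(coauthor)
                queue.append((coauthor, dist + 1))
    return None

# Hypothetical data: Me -> Dave -> X -> Erdos gives a distance of 3.
graph = {
    "Me":    ["Dave", "Joe", "Owen"],
    "Dave":  ["Me", "X"],
    "X":     ["Dave", "Erdos"],
    "Joe":   ["Me"],
    "Owen":  ["Me"],
    "Erdos": ["X"],
}
print(collaboration_distance(graph, "Me", "Erdos"))  # → 3
```

The same search computes a Ward number, too; only the edge relation changes, from "co-authored a paper with" to "pair-programmed with".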


Posted by Eugene Wallingford | Permalink | Categories: General

March 28, 2011 8:14 PM

A Well-Meaning Headline Sends an Unfortunate Signal

Last week, the local newspaper ran an above-the-fold front-page story about the regional Physics Olympics competition. This is a wonderful public-service piece. It extols young local students who spend their extracurricular time doing math and physics, and it includes a color photo showing two students who are having fun. If you would like to see the profile of science and math raised among the general public, you could hardly ask for more.

Unless you read the headline:

Young Einsteins

I don't want to disparage the newspaper's effort to help the STEM cause, but the article's headline undermines the very message it is trying to send. Science isn't fun; it isn't for everyone; it is for brains. We're looking for smart kids. Regular people need not apply.

Am I being too sensitive? No. The headline sends a subtle message to students and parents. It sends an especially dangerous signal to young women and minorities. When they see a message that says, "Science kids are brainiacs", they are more likely than other kids to think, "They don't mean me. I don't belong."

I don't want anyone to mislead people about the study of science, math, and CS. They are not the easiest subjects to study. Most of us can't sleep through class, skip homework, and succeed in these courses. But discipline and persistence are more important ingredients to success than native intelligence, especially over the long term. Sometimes, when science and math come too easily to students early in their studies, they encounter difficulties later. Some come to count on "getting it" quickly and, when it no longer comes easily, they lose heart or interest. Others skate by for a while because they don't have to practice and, when it no longer comes easily, they haven't developed the work habits needed to get over the hump.

If you like science and math enough to work at them, you will succeed, whether you are an Einstein or not. You might even do work that is important enough to earn a Nobel Prize.


Posted by Eugene Wallingford | Permalink | Categories: General

February 28, 2011 4:16 PM

Unsurprising Feelings of Success

As reported in this New York Times obit, Arthur Ashe once said this about the whooping he put on my old favorite bad boy, Jimmy Connors, in the 1975 Wimbledon championship:

"If you're a good player," he said softly, "and you find yourself winning easily, you're not surprised."

I've never been a good enough tennis player to have that feeling on court. Likewise with running, where I am usually competing more with my own expectations than with other runners. In that self-competition, though, I have occasionally had the sort of performance that tops my realistic expectations but doesn't really surprise me. Preparation makes success possible.

In the non-competitive world of programming, I sometimes encounter this feeling. I dive into a new language, tool, or framework, expecting slow and unsteady progress toward competence or mastery. But then I seem to catch a wave and cruise forward, deeper and more confident than I had a right to expect. In those moments, it's good to step back and remember: we are never entitled to feel that way, but our past work has made those moments possible.

When those moments come, they are oh, so sweet. They make even more palatable the tough work we do daily, moving one shovel of dirt from here to there.


Posted by Eugene Wallingford | Permalink | Categories: General

February 19, 2011 9:55 AM

Takedown

As a blogger who sometimes worries that I am a person you've never heard of, "writing uninterestingly about the unexceptional", and that other people have already written about whatever I have to say, only better, I really enjoyed William Zinsser's recent takedown of a New York Times Book Review editor on the subject of writing memoirs. His paragraph that starts

Sorry to be so harsh, but I don't like people telling other people they shouldn't write about their life.

and ends

The Times can use its space more helpfully than by allowing a critic to hyperventilate on an exhausted subject. We don't have that many trees left.

is one of my favorite paragraphs in recent memory. I admit a juvenile fondness for hoisting someone with his own petard, and Zinsser does so masterfully.

One of the great beauties of the web is that anyone can write and publish. Readers choose to give their attention to the work that matters to them. No trees are harmed along the way.

In reading and writing together, we can all become better writers -- and better thinkers.

~~~~

I recommended Zinsser's On Writing Well, among several other books, in an earlier entry on reading to write. It remains at the top of my suggested reading list.


Posted by Eugene Wallingford | Permalink | Categories: General

February 05, 2011 9:57 AM

You Are Not Your Work

It is easy for me to get sucked into a mindset in which I equate myself with what I do. In Five Lies I No Longer Believe, Todd Henry writes:

I am not my work, and I am especially not defined by how my work is received. That is not an excuse for laziness; it's permission to engage fully and freely and to bless everyone I encounter without worrying about what they think of me. This is hard medicine to swallow in a culture that celebrates title and the little spaces we carve for ourselves in the marketplace. Not me, not any longer.

"I am what and whom and how I teach."

"I am the programs I create, the papers I write, the grants I receive."

"I am the head of the department."

It is dangerous to think this way when I fail in any one of these arenas. It undervalues who I am and what I can offer. It closes my eyes to other parts of my life.

It is also dangerous to think this way when I succeed. Even in success, this view diminishes me. And it creates an endless cycle of having to succeed again in order to be.

When we think we are what we do, we often constrain our actions based on what other people will think of us. That makes it hard to teach the hard truths, to make tough decisions, to lead people. It makes it hard to create things that matter.

Even if we tune out what other people think, we find that we are always judging ourselves. This is as restrictive and as counterproductive as worrying about other people's idea of us.

Having different roles in life and doing different kinds of things can help us avoid this trap. Activity, success, and failure in one arena are only a part of who we are. We have to be careful, though, not to equate ourselves with the sum of our activities, successes, and failures in all these areas. Whatever that sum is, we are more.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

January 08, 2011 10:41 AM

A Healthy Diet for the Mind

"You are what you eat." You probably heard this bon mot as a child. It encouraged us to eat healthy foods, so that we could grow up to be big and strong.

I think the same thing is true of what we read. When we eat, we consume nutrients and vitamins. When we read, we consume ideas. Some are meat and potatoes, others fruits and vegetables. Some are the broad base of a healthy diet, like breads and grains. Others are junk food. Ideas may even occasionally be roughage!

There probably isn't an idea analogue to the food pyramid. Even more than with the food we eat, there is no right answer for what and how much of any kind of literature we should read. There are many ways for any particular person to read a healthy diet. Still, there are kinds of literature that offer us ideas in different forms, different concentrations, and different modalities. Fiction is where most children start, whether historical, fantastical, or simply about life. Non-fiction, too, comes in many categories: biography, history, science, ... too many to mention.

However, I do think that writer Matthew Kelly is right when he says, "We need a diet of the mind just as much as we need a diet of the body." Just as we should be mindful of what we put in our bodies, we should be mindful of what we put in our minds.

Each person needs to find the reading balance that makes them healthy and happy. I tend to read a lot of technical literature in my own discipline. Academics are prone to this imbalance. One of my challenges is to read enough other kinds of things to maintain a balanced intellectual life. It turns out that reading outside my discipline can make me a better computer scientist, because it gives me more kinds of ideas to use. But the real reason to read more broadly is to have a balanced mind and life.

I know people who wonder why they need to bother reading fiction at all. It doesn't make them better programmers. It doesn't help them change the world via political action. Both of these statements are so, so wrong. Shakespeare and Euripides and Kurt Vonnegut can teach us about how to change the world and even how to become better programmers! But that's not the point. They also make us better people.

Whenever I encounter this sentiment, I always send my friends to Tim O'Reilly's The Benefits of a Classical Education. Most programmers I know hold O'Reilly Media in near reverence, so perhaps they'll listen to its founder when he says, "Classical stories come often to my mind, and provide guides to action". The fiction I've read has shaped how I think about life and problems and given me ways to think about solutions and actions. That's true not only of the classics but also of Kurt Vonnegut and Isaac Asimov, Arthur Clarke and Franz Kafka.

As I wrote recently, I've been reading Pat Conroy's My Reading Life. Near the end of the book, he tells a powerful story about him and his mom reading Thomas Wolfe's Look Homeward, Angel when he was a teenager. This book gave them a way to talk about their own tortured lives in a way they could never have done without a story standing between them and the truths they lived but could not speak. As Conroy says, "Literature can do many things; sometimes it can do even the most important things." I might go one step further: sometimes, only literature can do the most important things.

Sure, there is plenty of junk food for the mind, too. It is everywhere, in our books and our blogs, and on our TV and movie screens. But just as with food, we need not eliminate all sweets from our diets; we simply need to be careful about how much we consume. A few sweets are okay, maybe even necessary in some people's diets. We all have our guilty pleasures when it comes to reading. However, when my diet is dominated by junk, my mind becomes weaker. I become less healthy.

Some people mistakenly confuse medium with nutritional value. I hear people talk about blogs and Twitter as if they offer only the emptiest of empty calories, the epitome of junk reading. But the medium doesn't determine nutritional value. My Twitter feed is full of links to marvelous articles and conversation between solid thinkers about important ideas. Much is about computer science and software development, but I also learn about art, literature, human affairs, and -- in well-measured doses -- politics. My newsreader serves up wonderful articles, essays, analyses, and speculations. Sure, both come with a little sugar now and then, but that's just part of what makes it all so satisfying.

People should be more concerned when a medium that once offered nutritional value is now making us less healthy. Much of what we call "news" has in my mind gone from being a member of the grain food group to being junk food.

We have to be careful to consume only the best sources of ideas, at least most of the time, or risk wasting our minds. And when we waste our minds, we waste our gifts.

You are what you read. You become the stories you listen to. Be mindful of the diet of ideas you feed your mind.


Posted by Eugene Wallingford | Permalink | Categories: General

December 27, 2010 9:21 PM

Reading "My Reading Life"

Author Pat Conroy

I first became acquainted with Pat Conroy's work when, as a freshman in college, I watched The Great Santini as a requirement for one of my honors courses. This film struck me as both sad and hopeful. Since then, I have seen a couple of other movies adapted from his novels.

A few years ago, I read his My Losing Season, a memoir of his final season as a basketball player at The Citadel, 1966-1967. Conroy's story is essentially that of a mediocre college-level athlete coming to grips with the fact that he cannot excel at the thing he loves the most. This is a feeling I can appreciate.

The previous paragraph comes from a short review of My Losing Season written only a few weeks before I began writing this blog. That page gives summaries of several books I had read and enjoyed in the preceding months. (It also serves as an indication of how eager I was to write for a wider audience.)

This break I am reading Conroy's latest book, My Reading Life. It is about his love affair with words and writing, and with the people who brought him into contact with either. I stand by something I wrote in that earlier review: "Conroy is prone to overwrought prose and to hyperbole, but he's a good story teller." And I do love his stories.

I also value many of the things he values in life. In a moving chapter on the high school English teacher who became his first positive male role model and a lifelong friend and confidant, Conroy writes:

If there is more important work than teaching, I hope to learn about it before I die.

Then, in a chapter on words, he says:

Writing is the only way I have to explain my own life to myself.

Even more than usual, teaching and writing, whether prose or program, are very much on my mind these days. Reading My Reading Life is a great way to end my year.


Posted by Eugene Wallingford | Permalink | Categories: General

December 20, 2010 3:32 PM

From Occasionally Great to Consistently Good

Steve Martin's memoir, "Born Standing Up", tells the story of Martin's career as a stand-up comedian, from working the shops at Disneyland to being the biggest-selling concert comic ever at his peak. I like hearing people who have achieved some level of success talk about the process.

This was my favorite passage in the book:

The consistent work enhanced my act. I learned a lesson: It was easy to be great. Every entertainer has a night when everything is clicking. These nights are accidental and statistical: Like the lucky cards in poker, you can count on them occurring over time. What was hard was to be good, consistently good, night after night, no matter what the abominable circumstances.

"Accidental greatness" -- I love that phrase. We all like to talk about excellence and greatness, but Martin found that occasional greatness was inevitable -- a statistical certainty, even. If you play long enough, you are bound to win every now and then. Those wins are not achievements of performance so much as achievements of being there. It's like players and coaches in athletics who break records for the most X in their sport. "That just means I've been around a long time," they say.

The way to stick around a long time, as Martin was able to do, is to be consistently good. That's how Martin was able to be present when lightning struck and he became the hottest comic in the world for a few years. It's how guys like Don Sutton won 300+ games in the major leagues: by being good enough for a long time.

Notice the key ingredients that Martin discovered to becoming consistently good: consistent work; practice, practice, practice, and more practice; continuous feedback from audiences into his material and his act.

We can't control the lightning strikes of unexpected, extended celebrity or even those nights when everything clicks and we achieve a fleeting moment of greatness. As good as those feel, they won't sustain us. Consistent work, reflective practice, and small, continuous improvements are things we can control. They are all things that any of us can do, whether we are comics, programmers, runners, or teachers.


Posted by Eugene Wallingford | Permalink | Categories: General, Running, Software Development, Teaching and Learning

December 01, 2010 3:45 PM

"I Just Need a Programmer"

As head of the Department of Computer Science at my university, I often receive e-mail and phone calls from people with The Next Great Idea. The phone calls can be quite entertaining! The caller is an eager entrepreneur, drunk on their idea to revolutionize the web, to replace Google, to top Facebook, or to change the face of business as we know it. Sometimes the caller is a person out in the community; other times the caller is a university student in our entrepreneurship program, often a business major. The young callers project an enthusiasm that is almost infectious. They want to change the world, and they want me to help them!

They just need a programmer.

Someone has to take their idea and turn it into PHP, SQL, HTML, CSS, Java, and Javascript. The entrepreneur knows just what he or she needs. Would I please find a CS major or two to join the project and do that?

Most of these projects never find CS students to work on them. There are lots of reasons. Students are busy with classes and life. Most CS students have jobs they like. Those jobs pay hard cash, if not a lot of it, which is more attractive to most students than the promise of uncertain wealth in the future. And the idea does not excite other people as much as it excites the entrepreneur, who created it and is on fire with its possibilities.

A few of the idea people who don't make connections with a CS student or other programmer contact me a second and third time, hoping to hear good news. The younger entrepreneurs can become disheartened. They seem to expect everyone to be as excited by their ideas as they are. (The optimism of youth!) I always hope they find someone to help them turn their ideas into reality. Doing that is exciting. It also can teach them a lot.

Of course, it never occurs to them that they themselves could learn how to program.

A while back, I tweeted something about receiving these calls. Andrei Savu responded with a pithy summary of the phenomenon I was seeing:

@wallingf it's sad that they see software developers as commodities. product = execution != original idea

As I wrote about at greater length in a recent entry, the value of a product comes from the combination of having an idea and executing the idea. Doing the former or having the ability to do the latter aren't worth much by themselves. You have to put the two together.

Many "idea people" tend to think most or all of the value inheres to having the idea. Programmers are a commodity, pulled off the shelf to clean up the details. It's just a small matter of programming, right?

On the other side, some programmers tend to think that most or all of the value inheres to executing the idea. But you can't execute what you don't have. That's what makes it possible for me and my buddy to sit around over General Tso's chicken and commiserate about lost wealth. It's not really lost; we were never in its neighborhood. We were missing a vital ingredient. And there is no time machine or other mechanism for turning back the clock.

I still wish that some of the idea people had learned how to program, or were willing to learn, so that they could implement their ideas. Then they, too, could know the superhuman strength of watching ideas become tangible. Learning to program used to be an inevitable consequence of using computers. Sadly, that's no longer true. The inevitable consequence of using computers these days seems to be interacting with people we may or may not know well and watching videos.

Oh, and imagining that you have discovered The Next Great Thing, which will topple Google or Facebook. Occasionally, I have an urge to tell the entrepreneurs who call me that their ideas almost certainly won't change the world. But I don't, for at least two reasons. First, they didn't call to ask my opinion. Second, every once in a while a Microsoft or Google or Facebook comes along and does change the world. How am I to know which idea is that one in a gazillion that will? If my buddy and I could go back to 2000 and tell our younger and better-looking selves about Facebook, would those guys be foresightful enough to sit down and write it? I suspect not.

How can we know which idea is that one that will change the world? Write the program, work hard to turn it into what people need and want, and cross our fingers. Writing the program is the ingredient the idea people are missing. They are doing the right thing to seek it out. I wonder what it would be like if more people could implement their own ideas.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

November 22, 2010 2:18 PM

Tragedy and the Possibilities of Unlimited Storage

I've spent considerable time this morning cleaning out the folder on my desktop where I keep stuff. In one of the dozens of notes files I've created over the last year or so, I found this unattributed quote:

In 1961, the scholar and cultural critic George Steiner argued in a controversial book, "The Death of Tragedy", that theatrical tragedies had begun their steady decline with the rise of rationalism and the Enlightenment in the 17th century. The Greek notion of tragedy was premised on man's inability to control his fate in the face of capricious, often brutal gods. But the point of a tragedy was to dramatize man's ability to make choices whatever his uncontrollable end.

The emphasis was not on the death -- what the gods usually had in store -- but on what the hero died for: the state, love, his own dignity. Did he die nobly? Or with shame? For a worthy cause? Or pitiless self-interest? Tragedies, then, were ultimately "an investigation into the possibilities of human freedom", as Walter Kerr put it in "Tragedy and Comedy" (1967).

I like this passage now as much as I must have when I typed it up from some book I was reading. (I'm surprised I did not write down the source!) It reminds me that I face and make choices every day that reveal who I am. Indeed, the choices I make create who I am. That message feels especially important to me today.

And yes, I know there are better tools for keeping notes than dozens of text files thrown into nearly as many folders. I take notes using a couple of them as well. Sometimes I lack the self-discipline I need to lead an ordered life!


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

November 18, 2010 3:43 PM

The Will to Run, or Do Anything Else

In "How Do You Do It?", an article in the latest issue of Running Times about how to develop the intrinsic motivation to do crazy things like run every morning at 5:00 AM, ultrarunner Eric Grossman writes:

The will to run emerges gradually where we cultivate it. It requires humility -- we can't just decide spontaneously and make it happen. Yet we must hold ourselves accountable for anything about which we can say, "I could have done differently."

Cultivation, humility, patience, commitment, accountability -- all features of developing the habits I need to run on days I'd rather stay in bed. After a while, you do it, because that's what you do.

I think this paragraph is true of whatever habit of thinking and doing you are trying to develop, whether it's object-oriented programming, playing piano, or test-driven design.

~~~~

Eugene speaking at Tech Talk Cedar Valley, 2010/11/17

Or functional programming. Last night I gave a talk at Tech Talk Cedar Valley, a monthly meet-up of tech and software folks in the region. Many of these developers are coming to grips with a move from Java to Scala and are pedaling fast to add functional programming style to their repertoires. I was asked to talk about some of the basic ideas of functional programming. My talk was called "Don't Drive on the Railroad Tracks", referring to Bill Murray's iconic character in the movie Groundhog Day. After hundreds or thousands of days reliving February 2 from the same starting point, Phil Connors finally comes to understand the great power of living in a world without side effects. I hope that my talk can help software developers in the Cedar Valley reach that state of mind sooner than a few years from now.
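The heart of that state of mind fits in a few lines. Here is a minimal sketch of my own (not code from the talk, which used Ruby and Scheme): the first function changes hidden state with every call, while the pure version returns the same answer every time you give it the same inputs.

```python
# A side-effecting version: each call changes hidden state,
# so the "same" call can give a different answer tomorrow.
balance = 100

def withdraw(amount):
    global balance
    balance -= amount
    return balance

# A pure version: the state is passed in and a new value is
# returned, so equal inputs always give equal outputs.
def withdraw_pure(balance, amount):
    return balance - amount

print(withdraw(25))            # 75 -- and the global has changed
print(withdraw_pure(100, 25))  # 75
print(withdraw_pure(100, 25))  # 75, every time
```

Like Phil Connors waking up to the same February 2, the pure function always starts from the same world, which makes it trivially easy to reason about and to test.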

If you are interested, check out the slides of the talk (also available on SlideShare) and the code, in both Ruby and Scheme, that I used to illustrate some of the ideas.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 06, 2010 11:04 AM

Outer Space, Iowa, and Unexpected Connections

Poet Marvin Bell tells this story in his collection, A Marvin Bell Reader:

[In Star Trek, Captain] Kirk is eating pizza in a joint in San Francisco with a woman whose help he will need, when he decides to fess up about who he is and where he has come from. The camera circles the room, then homes in on Kirk and his companion as she bursts out with, "You mean you're from outer space?"

"No," says Kirk, "I'm from Iowa. I just work in outer space."

My life is in some ways a complement to Kirk's. I often feel like I'm from outer space. I just work in Iowa.

I briefly met Bell, the former poet laureate of Iowa, when he gave the keynote address at a camouflage conference a few years ago. I gave a talk there on steganography, which is a form of digital camouflage. While Bell's quote comes from his own book, I found it in the front matter of Cook Book: Gertrude Stein, William Cook, and Le Corbusier, a delightful little book by Roy Behrens. Longtime readers of this blog will recognize Behrens's name; his writing has led me to many interesting books and ideas. I have also written of Behrens's own scholarly work several times, most notably Teaching as Subversive Inactivity, Feats of Association, and Reviewing a Career Studying Camouflage. I am fortunate to have Roy as friend and colleague, right here in Cedar Falls, Iowa.

Cook Book tells the story of William Cook, a little known Iowa artist who left America for Europe as a young man and became a longtime friend of the celebrated American writer and expatriate Gertrude Stein. He later used his inheritance to hire a young, unknown Le Corbusier to design his new home on the outskirts of Paris. Behrens grew up in Cook's hometown of Independence, Iowa. If you would like a taste of the story before reading the book, read this short essay.

I am no longer surprised to learn of surprising connections among people, well-known and unknown alike. Yet I am always surprised at the particular connections that exist. A forgotten Iowa artist was a dear friend of one of America's most famous writers of the early 1900s? He commissioned one of the pioneers of modern architecture before anyone had heard of him? Pope Pius makes a small contribution to the expatriate Iowan's legacy?

Busy, busy, busy.

October 21, 2010 8:50 AM

Strange Loop Redux

StrangeLoop 2010 logo

I am back home from St. Louis and Des Moines, up to my neck in regular life. I recorded some of my thoughts and experiences from Strange Loop in a set of entries here:

Unlike most of the academic conferences I attend, Strange Loop was not held in a convention center or in a massive conference hotel. The primary venue for the conference was the Pageant Theater, a concert nightclub in the Delmar Loop:

The Pageant Theater

This setting gave the conference's keynotes something of an edgy feel. The main conference lodging was the boutique Moonrise Hotel a couple of doors down:

The Moonrise Hotel

Conference sessions were also held in the Moonrise and in the Regional Arts Commission building across the street. The meeting rooms in the Moonrise and the RAC were ordinary, but I liked being in human-scale buildings that had some life to them. It was a refreshing change from my usual conference venues.

It's hard to summarize the conference in only a few words, other than perhaps to say, "Two thumbs up!" I do think, though, that one of the subliminal messages in Guy Steele's keynote is also a subliminal message of the conference. Steele talked for half an hour about a couple of his old programs and all of his machinations twenty-five or forty years ago to make them run in the limited computing environments of those days. As he reconstructed the laborious effort that went into those programs in the first place, the viewer can't help but feel that the joke's on him. He was programming in the Stone Age!

But then he gets to the meat of his talk and shows us that how we program now is the relic of a passing age. For all the advances we have made, we still write code that transitions from state to state to state, one command at a time, just like our cave-dwelling ancestors in the 1950s.

It turns out that the joke is on us.
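A tiny example shows the style Steele was poking at. This sketch is mine, not Steele's: the first version transitions an accumulator from state to state, one command at a time, while the second describes the same result as a single expression.

```python
from functools import reduce

# Command-at-a-time style: mutate an accumulator, one step after another.
def total_imperative(prices):
    total = 0
    for p in prices:
        total = total + p   # a sequence of state transitions
    return total

# Expression style: describe how values combine; no visible state machine.
def total_functional(prices):
    return reduce(lambda acc, p: acc + p, prices, 0)

prices = [3, 1, 4, 1, 5]
print(total_imperative(prices))  # 14
print(total_functional(prices))  # 14
```

The two compute the same answer, but only the second says *what* the answer is rather than *how* to shuffle state until you arrive at it.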

The talks and conversations at Strange Loop were evidence that one relatively small group of programmers in the midwestern US are ready to move into the future.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Software Development

October 13, 2010 10:20 PM

Serendipitous Connections

I'm in St. Louis now for Strange Loop, looking at the program and planning my schedule for the next two days. The abundant options nearly paralyze me... There are so many things I don't know, and so many chances to learn. But there are a limited number of time slots in any day, so the chances overlap.

I had planned to check in at the conference and then eat at The Pasta House, a local pasta chain that my family discovered when we were here in March. (I am carbo loading for the second half of my travels this week.) But after I got to the motel, I was tired from the drive and did not relish getting into my car again to battle the traffic again. So I walked down the block to Bartolino's Osteria, a more upscale Italian restaurant. I was not disappointed; the petto di pollo modiga was exquisite. I'll hit the Pasta House tomorrow.

When I visit big cities, I immediately confront the fact that I am, or have become, a small-town guy. Evening traffic in St. Louis overwhelms my senses and saps my energy. I enjoy conferences and vacations in big cities, but when they end I am happy to return home.

That said, I understand some of the advantages to be found in large cities. Over the last few weeks, many people have posted this YouTube video of Steven Johnson introducing his book, "Where Good Ideas Come From". Megan McArdle's review of the book points out one of the advantages that rises out of all that traffic: lots of people mean lots of interactions:

... the adjacent possible explains why cities foster much more innovation than small towns: Cities abound with serendipitous connections. Industries, he says, may tend to cluster for the same reason. A lone company in the middle of nowhere has only the mental resources of its employees to fall back on. When there are hundreds of companies around, with workers more likely to change jobs, ideas can cross-fertilize.

This is one of the most powerful motivations for companies and state and local governments in Iowa to work together to grow a more robust IT industry. Much of the focus has been on Des Moines, the state capital and easily the largest metro area in the state, and on the Cedar Rapids/Iowa City corridor, which connects our second largest metro area with our biggest research university. Those areas are both home to our biggest IT companies and also home to a lot of people.

The best IT companies and divisions in those regions are already quite strong, but they will be made stronger by more competition, because that competition will bring more, and more diverse, people into the mix. These people will have more, and more diverse, ideas, and the larger system will create more opportunities for these ideas to bounce off one another. Occasionally, they'll conjoin to make something special.

The challenge of the adjacent possible makes me even more impressed by start-ups in my small town. People like Wade Arnold at T8 Webware are working hard to build creative programming and design shops in a city without many people. They rely on creating their own connections, at places like Strange Loop all across the country. In many ways, Wade has to think of his company as an incubator for ideas and a cultivator of people. Whereas companies in Des Moines can seek a middle ground -- large enough to support the adjacent possible but small enough to be comfortable -- companies like T8 must create the adjacent possible in any way they can.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

September 06, 2010 12:45 PM

Empiricism, Bias, and Confidence

This morning, Mike Feathers tweeted a link to an old article by Donald Norman, Simplicity Is Highly Overrated, and mentioned that he disagrees with Norman. Many software folks disagreed with Norman when he first wrote the piece, too. We in software, often being both designers and users, have learned to appreciate simplicity, both functionally and aesthetically. And, as Kent Beck suggested, products such as the iPod are evidence contrary to the claim that people prefer the appearance of complexity. Norman offered examples in support of his position, too, of course, and claimed that he has observed them over many years and in many cultures.

This seems like a really interesting area for study. Do people really prefer the appearance of complexity as a proxy for functionality? Is the iPod an exception, and if so why? Are software developers different from the more general population when it comes to matters of function, simplicity, and use?

When answering these questions, I am leery of relying on self-inspection and anecdote. Norman said it nicely in the addendum to his article:

Logic and reason, I have to keep explaining, are wonderful virtues, but they are irrelevant in describing human behavior.

He calls this the Engineer's Fallacy. I'm glad Norman also mentions economists, because much of the economic theory that drives our world was created from deep analytic thought, often well-intentioned but usually without much evidence to support it, if any at all. Many economists themselves recognize this problem, as in this familiar quote:

If economists wished to study the horse, they wouldn't go and look at horses. They'd sit in their studies and say to themselves, "What would I do if I were a horse?"

This is a human affliction, not just a weakness of engineers and economists. Many academics accepted the Sapir-Whorf Hypothesis, which conjectures that our language restricts how we think, despite little empirical support for a claim so strong. The hypothesis affected work in disciplines such as psychology, anthropology, and education, as well as linguistics itself. Fortunately, others subjected the hypothesis to study and found it lacking.

For a while, it was fashionable to dismiss Sapir-Whorf. Now, as a recent New York Times article reports, researchers have begun to demonstrate subtler and more interesting ways in which the language we speak shapes how we think. The new theories follow from empirical data. I feel a lot more confident in believing the new theories, because we have derived them from more reliable data than we ever had for the older, stronger claim.

(If you read the Times article, you will see that Whorf was an engineer, so maybe the tendency to develop theories from logical analysis and sparse data really is more prominent in those of us trained in the creation of artifacts to solve problems...)

We see the same tendencies in software design. One of the things I find attractive about the agile world is its predisposition toward empiricism. Just yesterday Jason Gorman posted a great example, Reused Abstractions Principle. For me, software abstractions that we discover empirically have a head start toward confident believability over the ones we design aforethought. We have seen them instantiated in actual code. Even more, we have seen them twice, so they have already been reused -- in advance of creating the abstraction.
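To make the idea concrete, here is a minimal, hypothetical sketch of the Reused Abstractions Principle: the abstraction is extracted only after we have seen the same shape in two pieces of working code. All names here are invented for illustration.

```python
# Two concrete, working functions that happen to share a shape:
# filter the items, transform the survivors, join the results.

def active_user_report(users):
    return ", ".join(u["name"].title() for u in users if u["active"])

def overdue_book_report(books):
    return ", ".join(b["title"].upper() for b in books if b["overdue"])

# Only now, having seen the pattern twice in real code, do we name it:
def report(items, keep, show):
    """Join a display string for every item that passes the keep test."""
    return ", ".join(show(item) for item in items if keep(item))

users = [{"name": "ada", "active": True}, {"name": "bob", "active": False}]
print(report(users, lambda u: u["active"], lambda u: u["name"].title()))  # Ada
```

The abstraction earns its believability because both original functions can now be rewritten as one-line calls to `report`.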

Given how frequently even domain experts are wrong in their forecasts of the future and their theorizing about the world, how frequently we are all betrayed by our biases and other subconscious tendencies, I prefer when we have reliable data to support claims about human preferences and human behavior. A flip but too often true way to say "design aforethought" is "make up".


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

August 24, 2010 4:30 PM

Dreaming, Doing, Perl, and Language Translation

Today, I quoted Larry Wall's 2000 Atlanta Linux Showcase Talk in the first day of my compilers course. In that talk, he gives a great example of using a decompiler to port code -- in this case, from Perl 5 to Perl 6. While re-reading the talk, I remembered something that struck me as wrong when I read it the first time:

["If you can dream it, you can do it"--Walt Disney]

"If you can dream it, you can do it"--Walt Disney. Now this is actually false (massive laughter). I think Walt was confused between necessary and sufficient conditions. If you *don't* dream it, you can't do it; that is certainly accurate.

I don't think so. I think this is false, too. (Laugh now.)

It is possible to do things you don't dream of doing first. You certainly have to be open to doing things. Sometimes we dream something, set out to do it, and end up doing something else. The history of science and engineering are full of accidents and incidental results.

I once was tempted to say, "If you don't start it, you can't do it; that is certainly accurate." But I'm not sure that's true either, because of the first "it". These days, I'm more inclined to say that if you don't start doing something, you probably won't do anything.

Back to Day 1 of the compilers course: I do love this course. The Perl quote in my lecture notes is but one element in a campaign to convince my students that this isn't just a compilers course. The value in the course material and in the project itself goes far beyond the creation of an old-style source language-to-machine language translator. Decompilers, refactoring browsers, cross-compilers, preprocessors, interpreters, and translators for all sorts of domain-specific languages -- a compilers course will help you learn about all of these tools, both how they work and how to build them. Besides, there aren't many better ways to consolidate your understanding of the breadth of computer science than to build a compiler.
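The smallest member of that family of tools is an interpreter, and even a toy one shows the essential move: walk a structured representation of a program and give it meaning. This is only a hypothetical miniature, not anything from the course itself.

```python
# A toy interpreter for arithmetic expressions represented as nested tuples,
# e.g. ("+", 1, ("*", 2, 3)) for 1 + (2 * 3). Purely illustrative.
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr):
    """Recursively evaluate a (op, left, right) tree; numbers evaluate to themselves."""
    if isinstance(expr, tuple):
        op, left, right = expr
        return OPS[op](evaluate(left), evaluate(right))
    return expr

print(evaluate(("+", 1, ("*", 2, 3))))  # 7
```

Swap the recursive evaluation for recursive code emission and the same skeleton becomes a compiler; that structural kinship is much of what the course teaches.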

The official title of my department's course is "Translation of Programming Languages". Back in 1994, before the rebirth of mainstream language experimentation and the growth of interest in scripting languages and domain-specific languages, this seemed like a daring step. These days, the title seems much more fitting than "Compiler Construction". Perhaps my friend and former colleague Mahmoud Pegah and I had a rare moment of foresight. More likely, Mahmoud had the insight, and I was simply wise enough to follow.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

August 18, 2010 5:04 PM

You May Be in the Software Business

In the category programming for all, Paul Graham's latest essay explains his ideas about What Happened to Yahoo. (Like the classic Marvin Gaye album and song, there is no question mark.) Most people may not care about programming, but they ought to care about programs. More and more, the success of an organization depends on software.

Which companies are "in the software business" in this respect? ... The answer is: any company that needs to have good software.

If this was such a blind spot for an Internet juggernaut like Yahoo, imagine how big a surprise it must be for everyone else.

If you employ programmers, you may be tempted to stay within your comfort zone and treat your tech group just like the rest of the organization. That may not work very well. Programmers are a different breed, especially great programmers. And if you are in the software business, you want good programmers.

Hacker culture often seems kind of irresponsible. ... But there are worse things than seeming irresponsible. Losing, for example.

Again: If this was such a blind spot for an Internet juggernaut like Yahoo, imagine how big an adjustment it would be for everyone else.

I'm in a day-long retreat with my fellow department heads in the arts and sciences, and it's surprising how often software has come up in our discussions. This is especially true in recruitment and external engagement, where consistent communication is so important. It turns out the university is in the software business. Unfortunately, the university doesn't quite get that.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

August 10, 2010 3:36 PM

In Praise of Attacking Hard Problems

With the analysis of Deolalikar's P != NP paper now under way in earnest, I am reminded of a great post last fall by Lance Fortnow, The Humbling Power of P v NP. Why should every theorist try to prove P = NP and P != NP?

Not because you will succeed but because you will fail. No matter what teeth you sink into P vs NP, the problem will bite back. Think you solved it? Even better. Not because you did but because when you truly understand why your proof has failed you will have achieved enlightenment.

You might even succeed, though I'm not sure if the person making the attempt achieves the same kind of enlightenment in that case.

Even if Deolalikar's proof holds up, Fortnow's short essay will still be valuable and true.

We'll just use a different problem as our standard.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

August 09, 2010 3:43 PM

Fail Early, Fail Often... But Win in the End

I seem to be running across the fail early, fail often meme a lot lately. First, in an interview on being wrong, Peter Norvig was asked how Google builds tolerance for the inevitable occasional public failures of its innovations "into a public corporation that's accountable to its bottom line". He responded:

We do it by trying to fail faster and smaller.

One of the ways they do this is by keeping iterations short and teams small.

Then this passage from Seth Godin's recent e-book, Insubordinate, jumped out as another great example:

As a result of David [Seuss]'s bias for shipping, we failed a lot. Products got returned. Commodore 64 computers would groan in pain as they tried to run software that was a little too advanced for their puny brains. It didn't matter, because we were running so fast that the successes supported us far more than the failures slowed us down.

In a rapidly changing environment, not to change is often a bigger risk than to change. In an environment most people don't understand well, in which information is unavailable and unevenly distributed, not to change is often a bigger risk than to change.

However, it's important not to fetish-ize failure, as some people seem to do. Dave Winer reminds us, embracing failure is a good way to fail. Sometimes, you have to look at what failure will mean and muster a level of determination that denies failure in order to succeed.

This all seems so contradictory... but it's not. As we humans often do, we create rules for behavior that are underspecified in terms of context and the problem being solved. There are a lot of trade-offs in the mix when we talk about success and failure. For example, we need to distinguish between failure in the large and failure in the small. When an agile developer is taking small steps, she can afford to fail on a few -- especially if the failure teaches her something about how to succeed more reliably in the future. The new information gained is worth the cost of the loss.

In the passage from Godin, successes happened, too, not only losses, and the wins more than offset the losses. In that context, it seems that the advice is not about failure so much as getting over fear of failure. When we fear failure so much that we do not act, we deprive ourselves not only of losses but also of wins. Not failing gets in the way of succeeding, of learning and growing.

Winer was talking about something different, what I'm calling in my mind "ultimate failure": sending the employees home, shutting the doors, and turning the lights off for good. That is different than the "transitory failures" Godin was talking about, the sort of failures we experience when we learn in a dynamic, poorly understood environment. Still, Winer might not take any comfort in that idea. His company was at the brink, and only making the product work and sell was good enough to pull it back. At that moment, he probably wasn't interested in thinking about what he could learn from his next failure.

Sometimes, even the small failures can close doors, at least for a while. That's why so many entrepreneurs and commentators on start-up companies encourage people to fail early, before too many resources have been sunk into the venture, before too many people have been drawn into the realm affected by success or failure -- when a failure means that the entrepreneur simply must start over with her primary assets: her energy, determination, and effort.

When I was decorating my first college dorm room, I hung three small quotes on the wall over the desk. One of them comes to mind now. It was from Don Shula, the head coach of my favorite pro football team, the Miami Dolphins:

Failure isn't fatal, and success isn't final.

This seemed like a good mantra to keep in mind as I embarked on a journey into the unknown. It has served me well for many years now, including my time as a programmer and a teacher.


Posted by Eugene Wallingford | Permalink | Categories: General

July 30, 2010 2:36 PM

Notes on Entry Past

I've been killing loose minutes this week by going through my stuff folder, moving files I want to keep to permanent homes and pitching files I have lost interest in or won't have time for anytime soon. As I sometimes do, I've run across quotes I stashed away for use in blog entries. Alas, some of the quotes would have been useful in pieces I wrote recently, but now they aren't likely to find a home any time soon.

I recall reading this quote from A. E. Stallings in a short essay by Tim O'Reilly on the value of a classical education:

[The ancients] showed me that technique was not the enemy of urgency, but the instrument.

But it would have been perfect in Form Matters. Improving your form doesn't slow you down in the long run, it makes you faster.

I read this in an entry by Philip Windley about how he had averted a potential disaster:

Automate everything.

... and this in a response to that entry by Gordon Weakliem:

You are not a machine, so stop repeating yourself.

Of course, I was immediately reminded of my own disaster, unaverted. As I look back at Weakliem's article, it is interesting to see how programming principles such as "don't repeat yourself" and practices such as pair programming show up in different contexts outside of software.

Finally, I found this snippet, a tweet by @KentBeck:

as your audience grows, the cost of failure rises. put positively, it'll never be cheaper to fail than today.

This would have been a great part of any number of entries about my agile software development course in May and my software engineering course last fall. Kent is a master of crystallizing ideas into neat little catchphrases I never forget. Perhaps this one would have stuck with a few students as they tried to move toward shorter and shorter iterations of the test-code-refactor cycles.


Posted by Eugene Wallingford | Permalink | Categories: General

July 06, 2010 1:03 PM

Vindicated

By H. G. Wells, no less:

"You have turned your back on common men, on their elementary needs and their restricted time and intelligence," H.G. Wells complained to Joyce after reading "Finnegans Wake." That didn't faze him. "The demand that I make of my reader," Joyce said, "is that he should devote his whole life to reading my works." To which the obvious retort is: Life's too short.

This passage comes from an article on the complexity of modern art. Some modern art works for me, but I long ago lost interest in writers who complicate their work seemingly with the goal of proving to me how smart they are. Some of my friends love such writers and look at me in the same way they look at children and puppies. I must admit, with no small measure of guilt, that I have occasionally wondered how much their interest in these writers rested in a hidden desire to show how smart they are.

I've mentioned at least a couple of times that I prefer small books to large, and on that criterion alone I could bypass "Ulysses" and "Finnegans Wake". Joyce compounds their length with sentence structures and made-up words that numb my small brain and squander my limited time. Their complexity and deeply-woven literary allusions may well reward the reader who devotes his life to studying Joyce. But for me, life is indeed too short.

I must admit that I very much enjoyed Joyce's comparatively svelte "Portrait Of The Artist As A Young Man". I also enjoyed H. G. Wells's science fiction, though as literature it never rises anywhere near the level of "Portrait".


Posted by Eugene Wallingford | Permalink | Categories: General

June 24, 2010 8:04 AM

Remembrance and Sustenance

All those things for which
we have no words are lost.
-- Annie Dillard, "Total Eclipse"

My family and I spent eight days on the road last week, with a couple of days in the Jamestown/Williamsburg area of Virginia and then a few days in Washington, D.C. I'd never spent more than a couple of hours in my nation's capital and enjoyed seeing the classic buildings in which our leaders work and the many monuments to past leaders and conflicts.

The Korean War Veterans Memorial in Washington, DC

The Korean War Veterans Memorial caught me by surprise. No one ever talks about this memorial, much like the war itself. I had no idea what it looked like, no expectations. When we came upon it from the rear, I became curious. Standing amid the soldiers trudging through a field, I was unnerved. They look over their shoulders, or they make eye contact with one another, or they stare ahead, blankly. This is no sterile monument of white limestone. It is alive, even as it reminds us of men who no longer are. When we reached the front of the memorial, we saw a wreath with a note of thanks from the Korean people. It brought tears to my eyes, and to my daughter's.

As touched as I was by the National Mall, most of my memories of the trip are of artwork we saw in the several galleries and gardens. I came to remember how much I like the paintings of Monet. This time, it was his "The Seine at Giverny" that gave me the most joy. I learned how much I enjoy the work of Camille Pissarro, another of the French impressionists who redefined what a painting could be and say in the 1800s. I even saw a few abstract pieces by Josef Albers, whom I quoted not long ago. That quote came back to me as I walked amid the creations of men, oblivious to computer programming and the debits and credits of department budgets. What happens happens mostly without me. Indeed.

One of Hiroshi Sugimoto's seascape photographs

I left Washington with a new inspiration, Hiroshi Sugimoto. My daughter and I entered one of the gallery rooms to find a bunch of canvasses filled with blacks, grays, and whites. "More modern nothingness," I thought at first. As we absorbed the images, though, one of us said out loud, "These look like pictures of the ocean. See here...?" We looked closer and saw different times of day, different clouds and fog, horizons crisp and horizons that were no more than imperceptible points on a continuum from dark ocean to light sky. Only upon leaving the room did we learn that these images were in fact seascapes. "This is modern art that works for me," said my daughter. I nodded agreement.

Sugimoto's seascapes are only one element of his work. I have many more of his images to discover.

I did not get through my eight days away without any thoughts of computer science. In the National Gallery of Art, we ran across this piece by Edward Ruscha, featured here:

Edward Ruscha's 'Lisp'

I vaguely recall seeing this image many years ago in a blog post at Lemonodor, but this time it grabbed me. My inner programmer was probably feeling the itch of a few days away from the keyboard. Perhaps Ruscha has his own inner programmer. When I did a Google Image search to find the link above, I found that he had also created works from the words 'self' and 'ruby'. We programmers can make our own art using Lisp, Self, and Ruby. Our art, like that of Monet, Pissarro, Sugimoto, and Ruscha, sustains us.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

April 24, 2010 1:25 PM

Futility and Ditch Digging

First, Chuck Hoffman tweeted, The life of a code monkey is frequently depressingly futile.

I had had a long week, filled with the sort of activities that can make a programmer pine for days as a code monkey, and I replied, Life in many roles is frequently depressingly futile. Thoreau was right.

The ever-timely Brian Foote reminded me:

Sometimes utility feels like futility, but someone's gotta do it.

Thanks, Brian. I needed to hear that.

I remember hearing an interview with musician John Mellencamp many years ago in which he talked about making the movie Falling from Grace. The interviewer was waxing on about the creative process and how different movies were from making records, and Mellencamp said something to the effect of, "A lot of it is just ditch digging: one more shovel of dirt." Mellencamp knew about that sort of manual labor because he had done it, digging ditches and stringing wire for a telephone company before making it as an artist. And he's right: an awful lot of every kind of working is moving one more shovel of dirt. It's not romantic, but it gets the job done.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

April 20, 2010 9:58 PM

Computer Code in the Legal Code

You may have heard about a recent SEC proposal that would require issuers of asset-backed securities to submit "a computer program that gives effect to the flow of funds". What a wonderful idea!

I have written a lot in this blog about programming as a new medium, a way to express processes and the entities that participate in them. The domain of banking and finance is a natural place for us to see programming enter into the vernacular as the medium for describing problems and solutions more precisely. Back in the 1990s, Smalltalk had a brief moment in the sunshine as the language of choice used by financial houses in the creation of powerful, short-lived models of market behavior. Using a program to describe models gave the investors and arbitrageurs not only a more precise description of the model but also a live description, one they could execute against live data, test and tinker with, and use as an active guide for decision-making.

We all know about the role played by computer models in the banking crisis over the last few years, but that is an indictment of how the programs were used and interpreted. The use of programs itself was and is the right way to try to understand a complex system of interacting, independent agents manipulating complex instruments. (Perhaps we should re-consider whether to traffic in instruments so complex that they cannot be understood without executing a complex program. But that is a conversation for another blog entry, or maybe a different blog altogether!)

What is the alternative to using a program to describe the flow of funds engendered by a particular asset-backed security? We could describe these processes using text in a natural language such as English. Natural language is supremely expressive but fraught with ambiguity and imprecision. Text descriptions rely on the human reader to do most of the work figuring out what they mean. They are also prone to gratuitous complexity, which can be used to mislead unwary readers.

We could also describe these processes using diagrams, such as a flow chart. Such diagrams can be much more precise than text, but they still rely on the reader to "execute" them as she reads. As the diagrams grow more complex, the more difficult it is for the reader to interpret the diagram correctly.

A program has the virtue of being both precise and executable. The syntax and semantics of a programming language are (or at least can be) well-defined, so that a canonical interpreter can execute any program written in the language and determine its actual value. This makes describing something like the flow of funds created by a particular asset-backed security as precise and accurate as possible. A program can be gratuitously complex, which is a danger. Yet programmers have at their disposal tools for removing gratuitous complexity and focusing on the essence of a program, more so than we have for manipulating text.

The behavior of the model can still be complex and uncertain, because it depends on the complexity and uncertainty of the environment in which it operates. Our financial markets and the economic world in which asset-backed securities live are enormously complex! But at least we have a precise description of the process being proposed.
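To give a flavor of what "executable flow of funds" might mean, here is a heavily simplified, hypothetical sketch of a payment waterfall: cash flows to ordered tranches, each paid in full before the next sees a cent. Real deal documents are vastly more intricate; every name and number here is invented for illustration.

```python
# Hypothetical sketch: a simple sequential-pay waterfall.
def waterfall(cash, tranches):
    """Distribute cash to (name, amount_owed) tranches in priority order.

    Returns a dict of payments; any cash left after all tranches are
    paid goes to the residual (equity) holder.
    """
    payments = {}
    for name, owed in tranches:
        paid = min(cash, owed)   # pay what is owed, or all remaining cash
        payments[name] = paid
        cash -= paid
    payments["residual"] = cash
    return payments

print(waterfall(100, [("senior", 70), ("mezzanine", 40)]))
# {'senior': 70, 'mezzanine': 30, 'residual': 0}
```

Even this toy makes the SEC's point: the rule "senior before mezzanine" is unambiguous here in a way that a paragraph of prose can never quite be, and anyone can run it against their own numbers.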

As one commentator writes:

When provisions become complex beyond a point, computer code is actually the simplest way to describe them... The SEC does not say so, but it would be useful to add that if there is a conflict between the software and textual description, the software should prevail.

Using a computer program in this way is spot on.

After taking this step, there are still a couple of important issues yet to decide. One is: What programming language should we use? A lot of CS people are talking about the proposal's choice of Python as the required language. I have grown to like Python quite a bit for its relative clarity and simplicity, but I am not prepared to say that it is the right choice for programs that are in effect "legal code". I'll let people who understand programming language semantics better than I make technical recommendations on the choice of language. My guess is that a language with a simpler, more precisely defined semantics would work better for this purpose. I am, of course, partial to Scheme, but a number of functional languages would likely do quite nicely.

Fortunately, the SEC proposal invites comments, so academic and industry computer scientists have an opportunity to argue for a better language. (Computer programmers seem to like nothing more than a good argument about language, even writing programs in their own favorite!)

The most interesting point here, though, is not the particular language suggested but that the proposers suggest any programming language at all. They recognize how much more effectively a computer program can describe a process than text or diagrams. This is a triumph in itself.

Other people are reminding us that mortgage-backed CDOs at the root of the recent financial meltdown were valued by computer simulations, too. This is where the proposal's suggestion that the code be implemented in open-source software shines. By making the source code openly available, everyone has the opportunity and ability to understand what the models do, to question assumptions, and even to call the authors on the code's correctness or even complexity. The open source model has worked well in the growth of so much of our current software infrastructure, including the simple in concept but complex in scale Internet. Having the code for financial models be open brings to bear a social mechanism for overseeing the program's use and evolution that is essential in a market that should be free and transparent.

This is also part of the argument for a certain set of languages as candidates for the code. If the language standard and implementations of interpreters are open and subject to the same communal forces as the software, this will lend further credibility to the processes and models.

I spend a lot of time talking about code in this blog. This is perhaps the first time I have talked about legal code -- and even still I get to talk about good old computer code. It's good to see programs recognized for what they are and can be.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

April 08, 2010 8:56 PM

Baseball, Graphics, Patterns, and Simplicity

I love these graphs. If you are a baseball fan or a lover of graphics, you will, too. Baseball is the most numbers-friendly of all sports, with a plethora of statistics that can be extracted easily from its mano-a-mano confrontations between pitchers and batters, catchers and baserunners, American League and National. British fan Craig Robinson goes a step beyond the obvious to create beautiful, information-packed graphics that reveal truths both quirky and pedestrian.

Some of the graphs are more complex than others. Baseball Heaven uses concentric rings, 30-degree wedges, and three colors to show that the baseball gods smile their brightest on the state of Arizona. I like some of these complex graphs, but I must admit that sometimes they seem like more work than they should be. Maybe I'm not visually-oriented in the right way.

I notice that many of my favorites have something in common. Consider this chart showing the intersection of the game's greatest home run hitters and the steroid era:

home run hitters and performance-enhancing drugs

It doesn't take much time looking at this graph for a baseball fanatic to sigh with regret and hope that Ken Griffey, Jr., has played clean. (I think he has.) A simple graphic, a poignant bit of information.

Next, take a look at this graph that answers the question, how does baseball's winningest team fare in the World Series?:

win/loss records and World Series performance

This is more complex than the previous one, but the idea is simple: sort teams by win/loss record, identify the playoff and World Series teams by color, and make the World Series winners the main axis of the graph. Who would have thought that the playoff team with the worst record would win the World Series almost as often as the team with the best record?

Finally, take a look at what is my current favorite from the site, an analysis of interleague play's winners and losers.

winners and losers in interleague play

I love this one not for its information but for its stark beauty. Two grids with square and rectangular cells, two primary colors, and two shades of each are all we need to see that the two leagues have played pretty evenly overall, with the American League dominating in recent years, and that the AL's big guns -- the Yankees, Red Sox, and Angels -- are big winners against their NL counterparts. This graph is so pretty, I want to put a poster-sized print of it on my wall, just so that I can look at it every day.

The common theme I see among these and my other favorite graphs is that they are variations of the unpretentious bar chart. No arcs, line charts with doubly-labeled axes, or 3D effects required. Simple colors, simple labels, and simple bars illuminating magnitudes of interest.

Why am I drawn to these basic charts? Am I too simple to appreciate the more complex forms, the more complex interweaving of dimensions and data?

I notice this as a common theme across domains. I like simple patterns. I am most impressed when writers and artists employ creative means to breathe life into unpretentious forms. It is far more creative to use a simple bar chart in a nifty or unexpected way than it is to use spirals, swirls of color, concentric closed figures, or multiple interlocking axes and data sources. To take a relationship, however complex, and boil its meaning down to the simplest of forms -- taken with a twist, perhaps, but unmistakably straightforward nonetheless -- that is artistry.

I find that I have similar tastes in programming. The simplest patterns learned by novice programmers captivate me: a guarded action or linear search; structural recursion over a BNF definition or a mutual recursion over two; a humble strategy object or factory method. Simple tools used well, adapted to the unique circumstances of a problem, exposing just the right amount of detail and shielding us from all that doesn't matter. A pattern used a million times never in the same way twice. My tastes are simple, but I can taste a wide range of flavors.
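Two of the humble patterns named above can be sketched in a few lines each. These are generic, hypothetical examples, not code from any particular course.

```python
# A guarded linear search: scan the items until one passes the test.
def find_first(items, test):
    for item in items:
        if test(item):
            return item
    return None

# Structural recursion over a BNF-like definition:
#     tree ::= leaf | (tree, tree)
# The shape of the code mirrors the shape of the grammar.
def count_leaves(tree):
    if isinstance(tree, tuple):
        left, right = tree
        return count_leaves(left) + count_leaves(right)
    return 1

print(find_first([3, 8, 5], lambda n: n % 2 == 0))    # 8
print(count_leaves(((1, 2), (3, (4, 5)))))            # 5
```

Neither pattern is novel, and that is the point: the craft lies in fitting these plain forms to the unique circumstances of a problem.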

Now that I think about it, I think this theme explains a bit of what I love about baseball. It is a simple game, played on a simple form with simple equipment. Though its rules address numerous edge cases, at bottom they, too, are as simple as one can imagine: throw the ball, hit the ball, catch the ball, and run. Great creativity springs from these simple forms and the simple rules that constrain them. Maybe this is why baseball fans see their sport as an art form, and why people like Craig Robinson are driven to express its truths in art of their own.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

March 29, 2010 7:25 PM

This and That, Volume 2

[A transcript of the SIGCSE 2010 conference: Table of Contents]

Some more miscellaneous thoughts from a few days at the conference...

Deja Vu All Over Again

Though it didn't reach the level of buzz, concurrency and its role in the CS curriculum made several appearances at SIGCSE this year. At a birds-of-a-feather session on concurrency in the curriculum, several faculty talked about the need to teach concurrent programming and thinking right away in CS1. Otherwise, we teach students a sequential paradigm that shapes how they view problems. We need to make a "paradigm shift" so that we don't "poison students' minds" with sequential thinking.

I closed my eyes and felt like I was back in 1996, when people were talking about object-oriented programming: objects first, early, and late, and poisoning students' minds with procedural thinking. Some things never change.

Professors on Parade

How many professors throw busy slides full of words and bullet points up on the projector, apologize for doing so, and then plow ahead anyway? Judging from SIGCSE, too many.

How many professors go on and on about the importance of active learning, then give straight lectures for 15, 45, or even 90 minutes? Judging from SIGCSE, too many.

Mismatches like these are signals that it's time to change what we say, or what we do. Old habits die hard, if at all.

Finally, anyone who thinks professors are that much different than students, take note. In several sessions, including Aho's talk on teaching compilers, I saw multiple faculty members in the audience using their cell phones to read e-mail, surf the web, and play games. Come on... We sometimes say, "So-and-so wrote the book on that", as a way to emphasize the person's contribution. Aho really did write the book on compilers. And you'd rather read e-mail?

I wonder how these faculty members managed not to pay attention before we invented cell phones.

An Evening of Local Cuisine

Some people may not be all that excited by Milwaukee as a conference destination, but it is a sturdy Midwestern industrial town with deep cultural roots in its German and Polish communities. I'm not much of a beer guy, but the thought of going to a traditional old German restaurant appealed to me.

My last night in town, I had dinner at Mader's Restaurant, which dates to 1902 and features a fine collection of art, antiques, and suits of medieval armour "dating back to the 14th century". Over the years they have served political dignitaries such as the Kennedys and Ronald Reagan and entertainers such as Oliver Hardy (who, if the report is correct, ate enough pork shanks on his visit to maintain his already prodigious body weight).

I dined with Jim Leisy and Rick Mercer. We started the evening with a couple of appetizers, including herring on crostinis. For dinner, I went with the Ritter schnitzel, which came with German mashed potatoes and julienne vegetables, plus a side order of spaetzle. I closed with a light creme brulee for dessert. After these delightful but calorie-laden dishes, I really should have run on Saturday morning!

Thanks to Jim and Rick for great company and a great meal.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

March 12, 2010 9:49 PM

SIGCSE This and That, Volume 1

[A transcript of the SIGCSE 2010 conference: Table of Contents]

Day 2 brought three sessions worth their own blog entries, but it was also a busy day meeting with colleagues. So those entries will have to wait until I have a few free minutes. For now, here are a few miscellaneous observations from conference life.

On Wednesday, I checked in at the table for attendees who had pre-registered for the conference. I told the volunteer my name, and he handed me my bag: conference badge, tickets to the reception and Saturday luncheon, and proceedings on CD -- all of which cost me in the neighborhood of $150. No one asked for identification. I thought, what a trusting registration.

This reminded me of picking up my office and building keys on my first day at my current job. The same story: "Hi, I'm Eugene", and they said, "Here are your keys." When I suggested to a colleague that this was perhaps too trusting, he scoffed. Isn't it better to work at a place where people trust you, at least until we have a problem with people who violate that trust? I could not dispute that.

The Milwaukee Bucks are playing at home tonight. At OOPSLA, some of my Canadian and Minnesotan colleagues and I have a tradition of attending a hockey game whenever we are in an NHL town. I'm as big a basketball fan as they are hockey fans, so maybe I should check out an NBA game at SIGCSE? The cheapest seat in the house is $40 or so and is far from the court. I would go if I had a posse to go with, but otherwise it's a rather expensive way to spend a night alone watching a game.

SIGCSE without my buddy Robert Duvall feels strange and lonely. But he has better things to do this week: he is a proud and happy new daddy. Congratulations, Robert!

While I was writing this entry, the spellchecker on my Mac flagged www.cs.indiana.edu and suggested I replace it with www.cs.iadiana.edu. Um, I know my home state of Indiana is part of flyover country to most Americans, but in what universe is iadiana an improvement?

People, listen to me: problem-solve is not a verb. It is not a word at all. Just say solve problems. It works just fine. Trust me.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

March 08, 2010 8:41 PM

A Knowing-and-Doing Gathering at SIGCSE?

SIGCSE 2010 logo

I'm off tomorrow to SIGCSE. I'm looking forward to several events, among them the media computation workshop, the New Educators Roundtable, several sessions on programming languages and compilers, and especially a keynote address by physics Nobel laureate Carl Wieman, who has lots to say about using science to teach science. It should be a busy and fun week.

A couple of readers have indicated interest in visiting with me over a coffee break at the conference. Reader Matthew Hertz suggested something more: an informal meeting of Knowing and Doing readers. The lack of comments on this blog notwithstanding, I love hearing from readers, whether they have ideas to share or concerns with my frequently sketchy logic. As a reader myself, I often like to put a face on the authors I read. A few readers of this blog feel the same. My guess is that readers of my blog probably have a lot in common, and they might gain as much from meeting each other as meeting me!

So. If you are interested in meeting up with me at SIGCSE, partaking in an informal gathering of Knowing and Doing readers, or both, drop me a line by e-mail or on Twitter @wallingf. I'll gauge interest and let everyone know the verdict. I'm sure that, if there's interest, we can find a time and space to connect.


Posted by Eugene Wallingford | Permalink | Categories: General

February 22, 2010 6:56 PM

I'll Do Homework, But Only for a Grade

In the locker room one morning last week, I overheard two students talking about their course work. One of the guys eventually got himself pretty worked up while talking about one professor, who apparently gives tough exams, and exclaimed, "We worked two and a half hours on that homework, and he didn't even grade it!"

Yesterday, I was sitting with my daughters while they did some school work. One of them casually commented, "We all stopped putting too much effort into Teacher Smith's homework when we figured out s/he never grades it."

I know my daughter's situation up close and so know what she means. She tends to go well beyond the call of duty on her assignments, in large part because she is in search of a perfect grade. With time an exceedingly scarce and valuable resource, she faces an optimization problem. It turns out she can put in less effort on her homework than she ordinarily does and still do fine on her tests. With no prospect that more time on the assignment will earn a higher grade, she is willing to economize a bit and spend her time elsewhere.

Maybe that's just what the college student meant when I overheard him that morning. Perhaps he is actually overinvesting in his homework relative to its value for learning, because he seeks a higher grade on the homework component of the course. That's not the impression I got from my unintentional eavesdropping, though. I left the locker room thinking that he sees value in doing the homework only if it is graded, only if it contributes to his course grade.

This is the impression too many college students give their instructors. If it doesn't "count", why do it?

Maybe I was like that in college, too. I know that grades were important to me, and as a double-major trying to graduate in four years after spending much of my freshman year majoring in something else, I was taking a heavy class load. Time was at a premium. Who has time or energy to do things that don't count?

Even if I did not understand then, I know now that the practice itself is an invaluable part of how I learned. Without lots of practice writing code, we don't even learn the surface details of our language, such as syntax and idiom, let alone reach a deep understanding of solving problems. In the more practical terms expressed by the student in the locker room, without lots of practice, most every exam will seem too long, too difficult, and too harshly graded. That prof of his has found a way to get the student to invest time in learning. What a gift!

We cannot let the professor off the hook, though. If s/he tells the class that the assignment will be graded, or even simply gives students the impression that it "counts for something", then not to grade the assignment is a deception. Such a tactic is justified only in exceptional circumstances, and not only on moral grounds. As Teacher Smith has surely learned by now, students are smart enough not to fall for a lie too many times before they direct their energies elsewhere.

In general, though, homework is a gift: a chance to learn under controlled conditions. I'm pretty sure that students don't see it this way. This reminds me of a conversation I had with my colleague Mark Jacobson a couple of weeks ago. We were discussing the relative abundance -- or paucity -- of a grateful attitude among faculty in general. He recalled that, in his study of the martial arts, he had encountered two words for "thank you". One, suki, from the Japanese martial arts, means to see events in our lives as opportunity or gift. Another, sugohasameeda, comes from Korean Tae Kwon Do and is used to say, "Thank you for the workout".

Suki and sugohasameeda are related. One expresses suki when things do not go the way we wish, such as when we have a flat tire or when a work assignment doesn't match our desires. One expresses sugohasameeda in gratitude to one's teacher for the challenging and painful work that makes us grow, such as workouts that demand our all. I see elements of both in the homework we are assigned. Sugohasameeda seems to be spot-on with homework, yet suki comes into play, too, in cases such as the instructor going counter to our expectations and not grading an assignment.

I do not find myself in the role of student as much these days, but I can see so many ways that I can improve my own sense of gratefulness. I seem to live sugohasameeda more naturally these days, though incompletely. I am far too often lacking in suki. My daily life would be more peaceful and whole if I could recognize the opportunity to grow through undesired events with gratitude.

One final recollection. Soon after taking my current job, I met an older gentleman who had worked in a factory for 30+ years. He asked where I worked, and when I said, "I teach at the university", he said, "That beats workin' for a livin'". My first reaction was akin to amused indignation. He obviously didn't know anything about what my job was like.

Later I realized that there was a yin to that yang. I am grateful to have a career in which I can do so many cool things, explore ideas whenever they call to me, and work with students who learn and help me to learn -- to do things I love every day. So, yeah, I guess my job does beat "workin' for a livin'".

I just wish more students would take their homework seriously.

~~~~

My colleague Mark also managed to connect his ideas about gratitude from the martial arts to the 23rd Psalm of the Christian Bible. The green pastures to which it famously refers are not about having everything exactly as I want it, but seeing all things as they are -- as gift, as opportunity, as suki. I continue to learn from him.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

February 11, 2010 5:40 PM

Creativity and the Boldness of Youth

While casting about Roy Behrens's blog recently, I came across a couple of entries that connected with my own experience. In one, Behrens discusses Arthur Koestler and his ideas about creativity. I enjoyed the entire essay, but one of its vignettes touched a special chord with me:

In 1971, as a graduate student at the Rhode Island School of Design, I finished a book manuscript in which I talked about art and design in relation to Koestler's ideas. I mailed the manuscript to his London home address, half expecting that it would be returned unopened. To my surprise, not only did he read it, he replied with a wonderfully generous note, accompanied by a jacket blurb.

My immediate reaction was "Wow!", followed almost imperceptibly by "I could never do such a thing." But then my unconscious called my bluff and reminded me that I had once done just such a thing.

Back in 2004, I chaired the Educators' Symposium at OOPSLA. As I first wrote back then, Alan Kay gave the keynote address at the Symposium. He also gave a talk at the main conference, his official Turing Award lecture. The Educators' Symposium was better, in large part because we gave Kay the time he needed to say what he wanted to say.

2004 was an eventful year for Kay, as he won not only the Turing Award but also the Draper Prize and Kyoto Prize. You might guess that Kay had agreed to give his Turing address at OOPSLA, given his seminal influence on OOP and the conference, and then consented to speak a second time to the educators.

But his first commitment to speak was to the Educators' Symposium. Why? At least in part because I called him on the phone and asked.

Why would an associate professor at a medium-sized regional public university dare to call the most recent Turing Award winner on the phone and ask him to speak at an event on the undercard of a conference? Your answer is probably as good as mine. I'll say one part boldness, one part hope, and one part naivete.

All I know is that I did call, hoping to leave a message with his secretary and hoping that he would later consider my request. Imagine my surprise when his secretary said, "He's across the hall just now; let me get him." My heart began to beat in triple time. He came to the phone, said hello, and we talked.

For me, it was a marvelous conversation, forty-five minutes chatting with a seminal thinker in my discipline, of whose work I am an unabashed fan. We discussed ideas that we share about computer science, computer science education, and universities. I was so caught up in our chat that I didn't consider just how lucky I was until we said our goodbyes. I hung up, and the improbability of what had just happened soaked in.

Why would someone of Kay's stature agree to speak at a second-tier event before he had even been contacted to speak at the main event? Even more, why would he share so much time talking to me? There are plenty of reasons. The first that comes to mind is most important: many of the most accomplished people in computer science are generous beyond my ken. This is true in most disciplines, I am sure, but I have experienced it firsthand many times in CS. I think Kay genuinely wanted to help us. He was certainly willing to talk to me at some length about my hopes for the symposium and the role he could play.

I doubt that this was enough to attract him, though. The conference venue being Vancouver helped a lot; Kay loves Vancouver. The opportunity also to deliver his Turing Award lecture at OOPSLA surely helped, too. But I think the second major reason was his longstanding interest in education. Kay has spent much of his career working toward a more authentic kind of education for our children, and he has particular concerns with the state of CS education in our universities. He probably saw the Educators' Symposium as an opportunity to incite revolution among teachers on the front-line, to encourage CS educators to seek a higher purpose than merely teaching the language du jour and exposing students to a kind of computing calcified since the 1970s. I certainly made that opportunity a part of my pitch.

For whatever reason, I called, and Kay graciously agreed to speak. The result was a most excellent keynote address at the symposium. Sadly, his talk did not incite a revolt. It did plant seeds in the minds of at least a few of us, so there is hope yet. Kay's encouragement, both in conversation and in his talk, inspires me to this day.

Behrens expressed his own exhilaration "to be encouraged by an author whose books [he] had once been required to read". I am in awe not only that Behrens had the courage to send his manuscript to Koestler but also that he and Koestler continued to correspond by post for over a decade. My correspondence with Kay since 2004 has been only occasional, but even that is more than I could have hoped for as an undergrad, when I first heard of Smalltalk or, as a grad student, when I first felt the power of Kay's vision by living inside a Smalltalk image for months at a time.

I have long hesitated to tell this story in public, for fear that crazed readers of my blog would deluge his phone line with innumerable requests to speak at conferences, workshops, and private parties. (You know who you are...) Please don't do that. But for a few moments once, I felt compelled to make that call. I was fortunate. I was also a recipient of Kay's generosity. I'm glad I did something I never would do.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

February 10, 2010 6:43 PM

Recent Connections: Narrative and Computation

Reader Clint Wrede sent me a link to A Calculus of Writing, Applied to a Classic, another article about author Zachary Mason and his novel The Lost Books of the Odyssey. I mentioned Mason and his book in a recent entry, Diverse Thinking, Narrative, Journalism, and Software, which considered the effect of Mason's CS background on his approach to narrative. In "A Calculus of Writing", Mason makes that connection explicit:

"What I'm interested in scientifically is understanding thought with computational precision," he explained. "I mean, the romantic idea that poetry comes from this deep inarticulable ur-stuff is a nice idea, but I think it is essentially false. I think the mind is articulable and the heart probably knowable. Unless you're a mystic and believe in a soul, which I don't, you really don't have any other conclusion you can reach besides that the mind is literally a computer."

I'm not certain whether the mind is or is not a computer, but I share Mason's interest in "understanding thought with computational precision". Whether poets and novelists create through a computational process or not, building ever-more faithful computational models of what they do interests people like Mason and me. It also seems potentially valuable as a way to understand what it means to be human, a goal scientists and humanists share.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

February 09, 2010 7:13 PM

Programs as Art

In my previous entry I mentioned colleague and graphic designer Roy Behrens. My first blog articles featuring Behrens mentioned or centered on material from Ballast Quarterly Review, a quarterly commonplace book he began publishing in the mid-1980s. I was excited to learn recently that Behrens is beginning to reproduce material from BALLAST on-line in his new blog, The Poetry of Sight. He has already posted both entries I've seen before and entries new to me. This is a wonderful resource for someone who likes to make connections between art, design, psychology, literature, and just about any other creative discipline.

All this is prelude to my recent reading of the entry Art as Brain Surgery, which recounts a passage from an interview with film theorist Ray Carney that captures the idea behind the entry's title:

The greatest works [of art] do brain surgery on their viewers. They subtly reprogram our nervous systems. They make us notice and feel things we wouldn't otherwise.

I read this passage as a potential challenge to an idea I had explored previously: programming is art. That article looked at the metaphor from poet William Stafford's perspectives on art. Carney looks at art from a different position, one which places a different set of demands on the metaphor. For example,

One of the principal ways [great works of art] do this is through the strangeness of their styles. Style creates special ways of knowing. ... Artistic style induces unconventional states of awareness and sensitivity.

This seems to contradict a connection to programming, a creative discipline in which we seem to prefer -- at least in our code -- convention over individuality, recognizability over novelty, and the obvious over the subtle. When we have to dig into an unfamiliar mass of legacy code, the last thing we want are "unconventional states of awareness and sensitivity". We want to grok the code, and now, so that we can extend and modify it effectively and confidently.

Yet I think we find beauty in programming styles that extend our way of thinking about the world. Many OO and procedural programmers encounter functional programming and see it as beautiful, in part because it does just what Carney says great art does:

It freshens and quickens our responses. It limbers up our perceptions and teaches us new possibilities of feeling and understanding.

The ambitious among us then try to take these new possibilities back to our other programming styles and imbue our code there with them. We turn our new perceptions into the conventions and patterns that make our code recognizable and obvious. But this also makes our code subtle in its own way, bearing a foreign beauty and sense of understanding in the way it solves the work-a-day problems found in the program's specs. The best software patterns do this: they not only solve a problem but teach us that it can be solved at all, often by bringing an outside influence to our programs.

Perhaps it's just me, but there is something poetic in how I experience the emotional peaks of writing programs. I feel what Carney says:

The greatest works of art are not alternatives to or escapes from life, but enactments of what it feels like to live at the highest pitch of awareness -- at a level of awareness most people seldom reach in their ordinary lives.

The first Lisp interpreter, which taught us that code is data. VisiCalc, which brought program as spreading activation to our desktops, building on AI work in the 1950s and 1960s. Smalltalk. Unix. Quicksort and mergesort, implemented in thousands of programs in thousands of ways, always different but always perceptibly the same. Programmers experience these ideas and programs at the highest pitch of awareness. I walk away from the computer some days hoping that other people get to feel the way I am feeling, alive with fire deep in my bones.

The greatest works are inspired examples of some of the most exciting, demanding routes that can be taken through experience. They bring us back to life.

These days, more than ever, I relish the way even reading a good program can bring me back to life. That's to say nothing of writing one.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

February 08, 2010 2:19 PM

Calling C.P. Snow

A lot has been going on at my university the last few months to keep me busy. With significant budget cuts and a long-term change in state funding of higher education, we are beginning to see changes across campus. Last month our provost announced a move that will affect me and my department intimately: the merger of the College of Natural Sciences (CNS) with the College of Humanities and Fine Arts (CHFA). Computer Science will go from being one department among seven science/math/technology departments to a member of a college twice as large and at least that much more diverse.

The merger came as a surprise to many of us on campus, so there is a lot to do beyond simply combining operating budgets and clerical staffs. I expect everything to work out fine in the end. Colleges of arts and sciences are a common way to organize universities like ours, both of the existing colleges contain good people and many good programs, and we have a dean especially well-suited to lead the merger. Still, the next eighteen months promise to deliver a lot of uncertainty and change. Change is hard, and the resulting college will be something quite different from who we are now. Part of me is excited... There are some immediate benefits for me and CS, as we will now be in the same college with colleagues such as Roy Behrens, and with the departments with whom we have been working on a new major in digital media. Multidisciplinary work is easier to do at the university when the collaborating departments fall under the same administrative umbrella.

We are only getting started on working toward the merger, but I've already noticed some interesting differences between the two faculties. For example, at the first meeting of the department heads in my college with a faculty leader from the other college, we learned that the humanities folks have been working together on a college-wide theme of internationalization. As part of this, they have been reading a common book and participating in reading groups to discuss it.

This is a neat idea. The book provides a common ground for their faculty and helps them to work together toward a common goal. The discussion unifies their college. Together, they also create a backdrop against which many of them can do their scholarly work, share ideas, and collaborate.

Now that we are on the way to becoming one college, the humanities faculty have invited us to join them in the conversation. This is a gracious offer, which creates an opportunity for us all to unify as a single faculty. The particular theme for this year, internationalization, is one that has relevance in both the humanities and the sciences. Many faculty in the sciences are deeply invested in issues of globalization. For this reason, there may well be some cross-college discussion that results, and this interaction will likely promote the merger of the colleges.

That said, I think the act of choosing a common book to read and discuss in groups may reflect a difference between the colleges, one that is either a matter of culture or a matter of practice. For the humanities folks, this kind of discussion is a first-order activity. It is what they do within and across their disciplines. For the science folks, this kind of discussion is a second-order activity. There are common areas of work across the science departments, such as bioinformatics, but even then the folks in biology, chemistry, computer science, and math are all working on their own problems in their own ways. A general discussion of issues in bioinformatics is viewed by most scientists as talk about bioinformatics, not bioinformatics itself.

I know that this is a superficial analysis and that it consists of more shades of gray than sharp lines. At its best, it is a simplification. Still I found it interesting to see and hear how science faculty responded to the offer.

Over the longer term, it will be interesting to see how the merger of colleges affects what we in the sciences do, and how we do it. I expect something positive will happen overall, as we come into more frequent contact with people who think a little differently than we do. I also expect the day-to-day lives of most science faculty (and humanities faculty as well) will go on as they are now. Letterhead will change, the names of secretaries will change, but scholarly lives will go on.,

The changes will be fun. Getting out of ruts is good for the brain.


Posted by Eugene Wallingford | Permalink | Categories: General

February 01, 2010 10:22 PM

A Blogging Milestone -- 10**3

Thanks to Allyn Bauer for noticing that my recent entry on The Evolution of the Textbook was the 1000th posting to this blog. Five and a half years is a long time. I am glad I'm still at it. The last few months have been difficult on the non-teaching and non-CS side of my life, and I feel like my inspiration to write about interesting ideas has been stop-and-go. But I am glad I'm still at it.

While thinking about my 1000th post, I decided to take a look back at the other digits:

Number 100 refers to #99, a review of Kary Mullis's Dancing Naked in the Mind Field. That article, in combination with the entries on algorithmic patterns and textbooks, seems pretty consistent with how I think about this blog: ideas encountered, considered, and applied. Looking at numbers 1 and 10 led me to read over the monthly archive for July 2004. Revisiting old thoughts evokes -- or creates -- a strange sort of memory, one that I enjoy.

I hope that the next 1000 entries are as much fun to write.


Posted by Eugene Wallingford | Permalink | Categories: General

January 29, 2010 7:01 PM

Diverse Thinking, Narrative, Journalism, and Software

A friend sent me a link to a New York Times book review, Odysseus Engages in Spin, Heroically, by Michiko Kakutani. My friend and I both enjoy the intersection of different disciplines and people who cross boundaries. The article reviews "The Lost Books of the Odyssey", a recent novel Kakutani calls "a series of jazzy, post-modernist variations on 'The Odyssey'" and "an ingeniously Borgesian novel that's witty, playful, moving and tirelessly inventive". Were the book written by a classicist, we might simply add it to our to-read list and move on, but it's not. Its author, Zachary Mason, is a computer scientist specializing in artificial intelligence.

I'm always glad to see examples of fellow computer scientists with interests and accomplishments in the humanities. Just as humanists bring a fresh perspective when they come to computer science, so do computer scientists bring something different when they work in the humanities. Mason's background in AI could well contribute to how he approaches Odysseus's narrative. Writing programs that make it possible for computers to understand or tell stories causes the programmer to think differently about understanding and telling stories more generally. Perhaps this experience is what enabled Mason to "[pose] new questions to the reader about art and originality and the nature of storytelling".

Writing a program to do any task has the potential to teach us about that task at a deeper level. This is true of mundane tasks, for which we often find our algorithmic description is unintentionally ambiguous. (Over the last couple of weeks, I have experienced this while working with a colleague in California who is writing a program to implement a tie-breaking procedure for our university's basketball conference.) It is all the more true for natural human behaviors like telling stories.

In one of those unusual confluences of ideas, the Times book review came to me the same week that I read Peter Merholz's Why Design Thinking Won't Save You, which is about the value, even necessity, of bringing different kinds of people and thinking to bear on the tough problems we face. Merholz is reacting to a trend in the business world to turn to "design thinking" as an alternative to the spreadsheet-driven analytical thinking that has dominated the world for the last few decades. He argues that "the supposed dichotomy between 'business thinking' and 'design thinking' is foolish", that understanding real problems in the world requires a diversity of perspectives. I agree.

For me, Kakutani's and Merholz's articles intersected in a second way as I applied what they might say about how we build software. Kakutani explicitly connects author Mason's CS background to his consideration of narrative:

["Lost Books" is] a novel that makes us rethink the oral tradition of entertainment that thrived in Homer's day (and which, with its reliance upon familiar formulas, combined with elaboration and improvisation, could be said to resemble software development) ...

When I read Merholz's argument, I was drawn to an analogy with a different kind of writing, journalism:

Two of Adaptive Path's founders, Jesse James Garrett and Jeffrey Veen, were trained in journalism. And much of our company's success has been in utilizing journalistic approaches to gathering information, winnowing it down, finding the core narrative, and telling it concisely. So business can definitely benefit from such "journalism thinking."

So can software development. This passage reminded me of a panel I sat on at OOPSLA several years ago, about the engineering metaphor in software development. The moderator of the panel asked folks in the audience to offer alternative metaphors for software, and Ward Cunningham suggested journalism. I don't recall all the connections he made, but they included working on tight deadlines, having work product reviewed by an editor, and highly stylized forms of writing. That metaphor struck me as interesting then, and I have since written about the relationship between software development and writing, for example here. I have also expressed reservations about engineering as a metaphor for building software, such as here and here.

I have long been coming to believe that we can learn a lot about how to build software better by studying intensely almost every other discipline, especially disciplines in which people make things -- even, say, maps! When students and their parents ask me to recommend minors and double majors that go well with computer science, I often mention the usual suspects but always make a pitch for broadening how we think, for studying something new, or studying intensely an area that really interests the students. Good will come from almost any discipline.

These days, I think that making software is like so many things and unlike them all. It's something new, and we are left to find our own way home. That is indeed part of the fun.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

January 22, 2010 9:23 PM

Calling C. S. Peirce

William Caputo channels the pragmatists:

These days, I believe the key difference between practice, value and principle (something much debated at one time in the XP community and elsewhere) is simply how likely we are to adjust them if things are going wrong for us (i.e., practices change a lot, principles rarely). But none should be immune from our consideration when our actions result in negative outcomes.

To the list of practice, value, and principle, pragmatists like Peirce, James, Dewey, and Mead would add knowledge. When we focus on their instrumental view of knowledge, it is easy to forget one of the critical implications of the view: that knowledge is contingent on experience and context. What we call "knowledge" is not unchanging truth about the universe; it is only less likely to change in the face of new experience than other elements of our belief system.

Caputo reminds us to be humble when we work to help others to become better software developers. The old pragmatists would concur, whether in asking us to focus on behavior over belief or to be open to continual adaptation to our environment. This guidance applies to teaching more than just software development.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

January 13, 2010 7:18 PM

Programming, Literacy, and Superhuman Strength

I've written occasionally here about programming as a new communications medium and the need to empower as many people as possible with the ability to write little programs for themselves. So it's probably not surprising that I read Clay Shirky's The Shock of Inclusion, which appears in Edge's How Has The Internet Changed The Way You Think?, with a thought about programming. Shirky reminds us that the revolution in thought created by the Internet is hardly in its infancy. We don't have a good idea of how the Internet will ultimately change how we think because the most important change -- to "cultural milieu of thought" -- has not happened yet. This sounds a lot like Alan Kay on the computer revolution, and like Kay, Shirky makes an analogy to the creation of the printing press.

When we consider the full effect of the Internet, as Shirky does in his essay, we think of its effect on the ability of individuals to share their ideas widely and to connect those ideas to the words of others. From the perspective of a computer scientist, I think of programming as a form of writing, as a medium both for accomplishing tasks and for communicating ideas. Just as the Internet has lowered the barriers to publishing and enables 8-year-olds to become "global publishers of video", it lowers the barriers to creating and sharing code. We don't yet have majority participation in writing code, but the tools we need are being developed and communities of amateur and professional programmers are growing up around languages, tools, and applications. I can certainly imagine a YouTube-like community for programmers -- amateurs, people we should probably call non-programmers who are simply writing for themselves and their friends.

Our open-source software communities have taught us not only that "collaboration between loosely joined parties can work at scales and over timeframes previously unimagined", as Shirky notes, but also others of his lessons learned from the Internet: that sharing is possible in ways far beyond the 20th-century model of publishing, that "post-hoc peer review can support astonishing creations of shared value", that whole areas of human exploration "are now best taken on by groups", that "groups of amateurs can sometimes replace single experts", and that the involvement of users accelerates the progress of research and development. The open-source software community is a microcosm of the Internet. In its own way, with some conscious intent by its founders, it is contributing to the creation of the sort of Invisible College that Shirky rightly points out is vital to capitalizing on this 500-year advance in man's ability to communicate. The OSS model is not perfect and has much room for improvement, but it is a viable step in the right direction.

All I know is, if we can put the power of programming into more people's hands and minds, then we can help more people to have the feeling that led Dan Meyer to write Put THAT On The Fridge:

... rather than grind the solution out over several hours of pointing, clicking, and transcribing, for the first time ever, I wrote twenty lines of code that solved the problem in several minutes.

I created something from nothing. And that something did something else, which is such a weird, superhuman feeling. I've got to chase this.

We have tools and ideas that make people feel superhuman. We have to share them!
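Meyer never shows his twenty lines, so the sketch below is purely a stand-in for the genre he describes: replacing hours of pointing, clicking, and transcribing with a short loop. Everything in it (the data, the column name, the helper function) is invented for illustration:

```python
# Hypothetical stand-in for the kind of script Meyer describes:
# summing a column across data that would otherwise be opened,
# read, and transcribed by hand, one cell at a time.
import csv
import io

def total_by_column(csv_text, column):
    """Sum one numeric column of CSV data given as a string."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return sum(float(row[column]) for row in reader)

data = "student,score\nalice,88\nbob,92\ncarol,75\n"
print(total_by_column(data, "score"))  # 255.0
```

Trivial to a programmer, yes. But for someone doing it by hand for the first time, this is the moment of "I created something from nothing."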


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

December 12, 2009 10:15 PM

The Computer Reconfigured Me

Joe Haldeman is a writer of some renown in the science fiction community. I have enjoyed a novel or two of his myself. This month he wrote the Future Tense column that closes the latest issue of Communications of the ACM, titled Mightier Than the Pen. The subhead really grabbed my attention.

Haldeman still writes his novels longhand, in bound volumes. I scribble lots of notes to myself, but I rarely write anything of consequence longhand any more. In a delicious irony, I am writing this entry with pen and paper during stolen moments before a basketball game, which only reminds me how much my penmanship has atrophied from disuse! Writing longhand gives Haldeman the security of knowing that his first draft is actually his first draft, and not the result of the continuous rewriting in place that word processors enable. Even a new-generation word processor like WriteBoard, with automatic versioning of every edit, cannot ensure that we produce a first draft without constant editing quite as well as a fountain pen. We scientists might well think as much about the history and provenance of our writing and data.

Yet Haldeman admits that, if he had to choose, he would surrender his bound notebooks and bottles of ink:

... although I love my pens and blank books with hobbyist zeal, if I had to choose between them and the computer there would be no contest. The pens would have to go, even though they're so familiar they're like part of my hand. The computer is part of my brain. It has reconfigured me.

We talk a lot about how the digital computer changes how we work and live. This passage expresses that idea as well as any I've seen and goes one step more. The computer changes how we think. The computer is part of my brain. It has reconfigured me.

Unlike so many others, Haldeman -- who has tinkered with computers in order to support his writing since the Apple II -- is not worried about this new state of the writer's world. This reconfiguration is simply another stage in the ongoing development of how humans think and work.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

December 05, 2009 2:01 PM

Some Things I Have Learned, along with Milton Glaser

I recently came across a link to graphic designer Milton Glaser's 2001 talk Ten Things I Have Learned. Several of his lessons struck close to home for me.

  • After spending time with a person, do you usually feel exhilarated or exhausted? If you always feel tired, then you have been poisoned. Avoid people who do this to you. I would add positive advice in the same vein: Try to surround yourself with people who give you energy, and try to be a person who energizes those around you.

  • Less is not necessarily more. That's a lie we tell ourselves too often when we face cuts. "Do more with less." In the short term, this can be a way to become more efficient. In the long term, it starves us and our organizations. I like Glaser's idea better: Just enough is more.

  • If you think you have achieved enlightenment, "then you have merely arrived at your limitation". I see this too often in academia and in industry. Glaser uses this example of the lesson that doubt is better than certainty, but it also relates to an earlier lesson in the talk: Style is not to be trusted. Styles come and go; integrity and substance remain vital no matter what the fashion is for expressing solutions.

This talk ends with a passage that brought to mind discussion in recent months among agile software developers and consultants about the idea of certifying agile practitioners:

Everyone interested in licensing our field might note that the reason licensing has been invented is to protect the public not designers or clients. "Do no harm" is an admonition to doctors concerning their relationship to their patients, not to their fellow practitioners or the drug companies.

Much of the discussion in the agile community about certification seems more about protecting the label "agile" from desecration than about protecting our clients. It may well be that some clients are being harmed when unscrupulous practitioners do a lazy or poor job of introducing agile methods, because they are being denied the benefits of a more responsive development process grounded in evidence gathered from continuous feedback. A lot of the concern, though, seems to be with the chilling effect that poorly-executed agile efforts have on the ability of honest and hard-working agile consultants and developers to peddle our services under that banner.

I don't know what the right answer to any of this is, but I like the last sentence of Glaser's talk:

If we were licensed, telling the truth might become more central to what we do.

Whether we are licensed or not, I think the answer will ultimately come back to a culture of honesty and building trust in relationships with our clients. So we can all practice Glaser's tenth piece of advice: Tell the truth.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

November 23, 2009 2:53 PM

Personality and Perfection

Ward Cunningham recently tweeted about his presentation at Ignite Portland last week. I enjoyed both his video and his slides.

Brian Marick has called Ward a "gentle humanist", which seems so apt. Ward's Ignite talk was about a personal transformation in his life, from driver to cyclist, but as is often the case he uncovers patterns and truths that transcend a single experience. I think that is why I always learn so much from him, whether he is talking about software or something else.

From this talk, we can learn something about change in habit, thinking, and behavior. Still, one nugget from the talk struck me as rather important for programmers practicing their craft:

Every bike has personality. Get to know lots of them. Don't search for perfection. Enjoy variety.

This is true about bikes and also true about programming languages. Each has a personality. When we know but one or two really well, we have missed out on much of what programming holds. When we approach a new language expecting perfection -- or, even worse, that it have the same strengths, weaknesses, and personality as one we already know -- we cripple our minds before we start.

When we get to know many languages personally, down to their personalities, we learn something important about "paradigms" and programming style: They are fluid concepts, not rigid categories. Labels like "OO" and "functional" are useful from some vantage points and exceedingly limiting from others. That is one of the truths underlying Anton van Straaten's koan about objects and closures.
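Van Straaten's koan makes the point memorably: objects are a poor man's closures, and closures are a poor man's objects. The equivalence is easy to make concrete. A sketch in Python (my choice here; the koan itself is told in terms of Scheme):

```python
# The same "counter" behavior, written twice.

# As an object: state lives in an instance field,
# and behavior is a method that reads and writes it.
class Counter:
    def __init__(self):
        self.count = 0

    def increment(self):
        self.count += 1
        return self.count

# As a closure: state lives in a captured local variable,
# and behavior is the function that closes over it.
def make_counter():
    count = 0
    def increment():
        nonlocal count
        count += 1
        return count
    return increment

obj = Counter()
fn = make_counter()
print(obj.increment(), obj.increment())  # 1 2
print(fn(), fn())                        # 1 2
```

A caller who only sees the behavior cannot tell which is which, and that is the point: "object" and "closure" name vantage points, not rigid categories.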

We should not let our own limitations limit how we learn and use our languages -- or our bikes.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 21, 2009 5:54 AM

Quotes of the Day

The day was yesterday.

I am large, I contain multitudes.

The to-do list is a time capsule, containing missives and pleas to your future selves. ... Why is it not trivially easy to carry out items on your own to-do list? And the answer is: Because the one writing the list, and the one carrying it out are two different people.

Now I understand the problem... my to-do list is a form of time travel.

Open to Multitudes

It's the kind of culture that can tolerate rap music and extreme sports that can also create space for guys like Page and Brin and Google. That's one of our hidden strengths.

This is from economist Paul Romer, as quoted by Tyler Cowen. I agree. We need to try out lots of ideas to find the great ones.

Going to an Extreme

I'm not interested in writing short stories. Anything that doesn't take years of your life and drive you to suicide hardly seems worth doing.

Cormac McCarthy must live on the edge. This is one of those romantic notions that has never appealed to me. I've never been so driven -- nor felt like I wanted to be.

A Counterproposal

6. MAKE MANY SKETCHES

Join the best sketches to produce others and improve them until the result is satisfactory.

To make sketches is a humble and unpretentious approach toward perfection.

... says composer Arnold Schonberg, as quoted at peripatetic axiom. This is more my style.

Speaking of Perfection

My perfect day is sitting in a room with some blank paper. That's heaven. That's gold and anything else is just a waste of time.

Again from Cormac McCarthy. Unlike McCarthy, I do not think that everything else is a waste of time. Yet I feel a kinship with his sense of a perfect day. To sit in a room, alone, with an open terminal. To write, whether prose or code. But especially code.


Posted by Eugene Wallingford | Permalink | Categories: General

November 15, 2009 8:02 PM

Knowledge Arbitrage

A couple of weeks back, Brian Foote tweeted:

Ward Cunningham: Pure Knowledge arbitrageurs will no longer gain by hoarding as knowledge increasingly becomes a plentiful commodity #oopsla

This reminds me of a "quought" of the day that I read a couple of years ago. Paraphrased, it asked marketers: What will you do when all of your competitors know all of the same things you do? Ward's message broadens the implication from marketers to any playing field on which knowledge drives success. If everyone has access to the same knowledge, how do you distinguish yourself? Your product? The future looks a bit more imposing when no one starts with any particular advantage in knowledge.

Ward's own contributions to the world -- the wiki and extreme programming among them -- give us a hint as to what this new future might look like. Hoarding is not the answer. Sharing and building together might be.

The history of the internet and the web tells us that the result of collaboration and open knowledge may well be a net win for all of us over a world in which knowledge is hoarded and exploited for gain in controlled bursts.

Part of the ideal of the academy has always been the creation and sharing of knowledge. But increasingly its business model has been exposed as depending on the sort of knowledge arbitrage that Ward warns against. Universities now compete in a world of knowledge more plentiful and open than ever before. What can they do when all of their customers have access to much of the same knowledge that they hope to disseminate? Taking a cue from Ward, universities probably need to be thinking hard about how they share knowledge, how they help students, professors, and industry build knowledge together, and how they add value in their unique way through academic inquiry.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

November 09, 2009 9:54 PM

Reality versus Simulation

A recent thread on the XP mailing list discussed the relative merits of using physical cards for story planning versus a program, even something as simple as a spreadsheet. Someone had asked, "why not use a program?", and lots of XP aficionados explained why not.

I mostly agree with the explanations, but one undercurrent in the discussion bothered me. It is best captured in this comment:

The software packages are simulations. The board and cards are the real thing.

I was immediately transported twenty years back, to a set of old arguments against artificial intelligence. They went something like this... If we write a program to simulate a rainstorm, we will not get wet; it is just a simulation. By the same token, we can write a program to simulate symbol processing the way we think people do it, but it's not real symbol processing; it is just a simulation. We can write a program to simulate human thought, but it's not real; it's just simulated thought. Just as a simulated rainstorm will not make us wet, simulated thought can't enlighten us. Only human thought is real.

That always raised my hackles. I understand the difference between a physical phenomenon like rain and a simulation of it. But symbol processing and thought are something different. They are physical in our brains, but they manifest themselves in our interactions with the exterior world, including other symbol processors and thinkers. Turing's insight in his seminal paper Computing Machinery and Intelligence was to separate the physical instantiation of intelligent behavior from the behavior itself. The essence of the behavior is its ability to communicate ideas to other agents. If a program can carry on such communication in a way indistinguishable from how humans communicate, then on what grounds are we to say that the simulation is any less real than the real thing?

That seems like a long way to go back for a connection, but when I read the above remark, from someone whose work I greatly respect, it, too, raised my hackles. Why would a software tool that supports an XP practice be "only" a simulation and current practice be the real thing?

The same person prefaced his conclusion above with this, which explains the reasoning behind it:

Every software package out there has to "simulate" some definite subset of these opportunities, and the more of them the package chooses to support the more complex to learn and operate it becomes. Whereas with a physical board and cards, the opportunities to represent useful information are just there, they don't need to be simulated.

The current way of doing things -- index cards and post-it notes on pegboards -- is a medium of expression. It is an old medium, familiar, comfortable, and well understood, but a medium nonetheless. So is a piece of software. Maybe we can't express as much in our program, or maybe it's not as convenient to say what we want to say. This disadvantage is about what we can say or say easily. It's not about reality.

The same person has the right idea elsewhere in his post:

Physical boards and cards afford a much larger world of opportunities for representing information about the work as it is getting done.

Ah... The physical medium fits better into how we work. It gives us the ability to easily represent information as the work is being done. This is about work flow, not reality.

Another poster gets it right, too:

It may seem counterintuitive for those of us who work with technology, but the physical cards and boards are simply more powerful, more expressive, and more useful than electronic storage. Maybe because it's not about storage but communication.

The physical medium is more expressive, which makes it more powerful. More power combined with greater convenience makes the physical medium more useful. This conclusion is about communication. It doesn't make the software tool less real, only less useful or effective.

You will find that communication is often the bottom line when we are talking about software development. The agile approaches emphasize communication and so occasionally reach what seems to be a counterintuitive result for a technical profession.

I agree with the XP posters about the use of physical cards and big, visible boards for displaying them. This physical medium encourages and enhances human communication in a way that most software does not -- at least for now. Perhaps we could create better software tools to support our work? Maybe computer systems will evolve to the point that a live display board will dynamically display our stories, tasks, and status in a way that meshes as nicely with human workflow and teamwork as physical displays do now. Indeed, this is probably possible now, though not as inexpensively or as conveniently as a stash of index cards, a cheap box of push pins, and some cork board.

I am open to a new possibility. Framing the issue as one of reality versus simulation seems to imply that it's not possible. I think that perspective limits us more than it helps us.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

October 30, 2009 4:31 PM

Writing to Learn, Book-Style

I know all about the idea of "writing to learn". It is one of the most valuable aspects of this blog for me. When I first got into academia, though, I was surprised to find how many books in the software world are written by people who are far from experts on the topic. Over the years, I have met several serial authors who pick a topic in conjunction with their publishers and go. Some of these folks write books that are successful and useful to people. Still the idea has always seemed odd.

In the last few months, I've seen several articles in which authors talk about how they set out to write a book on a topic they didn't know well or even much at all. Last summer, Alex Payne wrote this about writing the tapir book:

I took on the book in part to develop a mastery of Scala, and I've looked forward to learning something new every time I sit down to write, week after week. Though I understand more of the language than I did when I started, I still don't feel that I'm on the level of folks like David Pollak, Jorge Ortiz, Daniel Spiewak, and the rest of the Scala gurus who dove into the language well before Dean or I. Still, it's been an incredible learning experience ...

Then today I ran across Noel Rappin's essay about PragProWriMo:

I'm also completely confident in this statement -- if you are willing to learn new things, and learn them quickly, you don't need to be the lead maintainer and overlord to write a good technical book on a topic. (Though it does help tremendously to have a trusted super-expert as a technical reference.)

Pick something that you are genuinely curious about and that you want to understand really, really well. It's painful to write even a chapter about something that doesn't interest you.

This kind of writing to learn is still not a part of my mentality. I've certainly chosen to teach courses in order to learn -- to have to learn -- something I want to know, or know better. For example, I didn't know any PHP to speak of, so I gladly took on a 5-week course introducing PHP as a scripting language. But I have a respect for books, perhaps even a reverence, that makes the idea of publishing one on a subject I am not expert in unpalatable. I have too much respect for the people who might read it to waste their time.

I'm coming to learn that this probably places an unnecessary limit on myself. Articles like Payne's and Rappin's remind me that I can study something and become expert enough to write a book that is useful to others. Maybe it's time to set out on that path.

Getting people to take this step is one good reason to heed the call of Pragmatic Programmers Writing Month (PragProWriMo), which is patterned after the more generic National Novel Writing Month (NaNoWriMo). Writing is like anything else: we can develop a habit that helps us to produce material regularly, which is a first and necessary step to ever producing good material regularly. And if research results on forming habits are right, we probably need a couple of months of daily repetitions to form a habit we can rely on.

So, whether it's a book or blog you have in mind, get to writing.

(Oh, and you really should click through the link in Rappin's essay to Merlin Mann's Making the Clackity Noise for a provocative -- if salty -- essay on why you should write. From there, follow the link to Buffering, where you will find a video of drummer Sonny Payne playing an extended solo for Count Basie's orchestra. It is simply remarkable.)


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

October 22, 2009 4:00 PM

Local Boys Succeed in Gaming Industry

I went last night to see a talk by Aaron Schurman, co-founder and CEO of Phantom EFX. Phantom is a homegrown local company that makes video games. The talk told the story of their latest and most ambitious release, Darkest of Days, a first-person shooter game built around historic narratives and a time-travel hook.

Phantom got its start with casino games. They started from scratch, with no training in software development. Part of the team did have background in graphic design, which gave them a foundation to build on. In the last decade, they have become serious players in the market, with several top-selling titles.

I am not a "computer gamer" and rarely ever play the sort of games that are so popular with students these days. But as a computer scientist, I am interested in them as programs. Nearly every game these days requires artificial intelligence, both to play the game and, in character-based games, to provide realistic agents in the simulated world. My background in AI made me a natural local resource to the company when they were getting started. As a result, I have had the good fortune to be a long-time friend of the company.

Aaron's talk was like the game; it had something for almost everyone: history, creative writing, art, animation, media studies, and computer science. The CS is not just AI, of course. A game at this level of scale is a serious piece of software. The developers faced a number of computational constraints in filling a screen with a large number of realistic humans while maintaining the frame rate required for an acceptable video experience. There were also software development challenges, such as building for multiple platforms in sync and working with contractors distributed across the globe. There is a lot to be learned by conducting a retrospective of this project.

Aaron spoke a lot about the challenges they faced. His response was the sort you expect from people who succeed: Don't be dismayed. Do you think you are too small or too poor to compete with the big boys? Don't be dismayed. You can find a way, even if it means rolling your own gaming engine because the commercial alternatives are too expensive. Don't know how to do something? Don't be dismayed. You simply don't know yet. Work hard to learn. Everyone can do that.

The practical side of me is glad that we are so close to a company like this and have connections. We've recently begun exploring ways to place our students at Phantom EFX for internships. I love the idea of running an iPhone development class to port some of the company's games to that market. This is a great opportunity for the students, but also for professors!

The dreamer in me was inspired by this talk. I am always impressed when I meet people, especially former students, who have a vision to build something big. This sort of person accepts risks and works hard. The return on that investment can be huge, both monetarily and spiritually. I hope more of our students take stories like this to heart and realize that entrepreneurship offers an alternative career path when they have ideas and are willing to put their work hours toward something that they really care about.

At its bottom, this is the story of small-town Iowa guys staying in small-town Iowa and building a new tech company. Now they have Hollywood producers knocking on their doors, bidding to option their script and concept for a major motion picture. Not a bad way to make a living.


Posted by Eugene Wallingford | Permalink | Categories: General

October 15, 2009 5:22 PM

Conscience and Clarity

I've been working on a jumble of administrative duties all week long, with an eye toward the weekend. While cleaning up some old files, I ran across three items that struck me as somehow related, at least in the context of the last few days.

Listen to Your Conscience

Here's a great quote from an old article by John Gruber:

If you think your users would be turned off by an accurate description of something, that doesn't mean you should do it without telling them. It means you shouldn't be doing whatever it is you don't want to tell them about.

This advice applies to so many different circumstances. It's not bulletproof, but it's worth being the first line of thought whenever your conscience starts to gnaw at you.

Listen With Your Heart

And here's a passage on writing from the great Joni Mitchell:

You could write a song about some kind of emotional problem you are having, but it would not be a good song, in my eyes, until it went through a period of sensitivity to a moment of clarity. Without that moment of clarity to contribute to the song, it's just complaining.

This captures quite nicely one of the difficulties I have with blogging about being a department head: I rarely seem to have that moment of clarity. And I need those moments, even if I don't intend to blog about the experience.

Somebody Must Be Listening

One piece of nice news... I recently received a message saying that Knowing and Doing has been included in a list of the top 100 blogs by professors on an on-line learning web site. There are a lot of great blogs on that list, and it's an honor to be included among them. I follow a dozen or so of those blogs closely. One that some of my readers might not be familiar with is Marginal Revolution, which looks at the world through the lens of an economist.

If I could add only one blog to that list, right now it would be The Endeavour, John Cook's blog on software, math, and science. I learn a lot from the connections he makes.

In any case, it's good to know that readers find some measure of value here, too. I'll keep watching for the moments of clarity about CS, software development, teaching, running, and life that signal a worthwhile blog entry.


Posted by Eugene Wallingford | Permalink | Categories: General

October 13, 2009 9:31 AM

Living with Yesterday

After my long run yesterday, I was both sorer and more tired ('tireder'?) than after last Sunday's big week and fast long run. Why? I cut my mileage last week from 48 miles to 38, and my long run from 22 miles to 14. I pushed hard only during Wednesday's track workout. Shouldn't last week have felt easy, and shouldn't I be feeling relatively rested after an easy long run yesterday?

No, I shouldn't. The expectation that I should is a mental illusion, one that running long ago taught me to distrust. It's hard to predict how I will feel on any day, especially during training, but the best predictor isn't what I did this week, but last; not today, but yesterday.

Intellectually, this should not surprise us. The whole reason we train today is to be better -- faster, stronger, more durable -- tomorrow. My reading of the running literature says that it takes seven to ten days for the body to integrate the effects of a specific workout. It makes sense that the workout can be affecting our body in all sorts of ways during that period.

This is a good example of how running teaches us a lesson that is true in all parts of life:

We are what and who we are today because of what we did yesterday.

This is true of athletic training. It is true of learning and practice more generally. What we practice is what we become.

More remarkable than that this is true in my running is that I can know and write about this habit of mind as an intellectual idea without making an immediate connection to my running. I often find in writing this blog that I come back around to the same ideas, sometimes in a slightly different form and sometimes in much the same form as before. My mind seems to need that repetition before it can internalize these truths as universal.

When I say that I am living with yesterday, I am not saying that I can live anywhere but in this moment. That is all I have, really. But it is wise to be mindful that tomorrow will find me a product of what I do today.


Posted by Eugene Wallingford | Permalink | Categories: General, Running, Teaching and Learning

October 07, 2009 8:11 PM

Refactoring as Rewriting

Reader and occasional writer that I am, Michael Nielsen's Six Rules for Rewriting seemed familiar in an instant. I recognize their results in good writing, and even when I don't practice them successfully in my own writing I know they would often make it better.

Occasional programmer that I am, they immediately had me thinking... How well do they apply to refactoring? Programming is writing, and refactoring is one of our common forms of rewriting... So let's see.

First of all, let's acknowledge up front that a writer's rewriting is not identical to a programmer's refactoring. For one thing, the writer does not have automated tests to help her ensure that the rewrite doesn't break anything. It's not clear to me exactly what not breaking anything means for a writer, though I have a vague sense that it is meaningful for most writing.

Also, the term "refactoring" does not refer to any old rewrite of a code base. It has a technical meaning: to modify code without changing its essential functionality. There are rewrites of a code base that are not refactoring. I think that's true of writing in general, though, and I also think that Nielsen is clearly talking about rewrites that do not change the essential content or purpose of a text. His rules are about how to say the same things more effectively. That seems close enough to our technical sense of refactoring to make this exercise worth an effort.
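A minimal sketch of refactoring in this technical sense, using hypothetical Python of my own (the function and variable names are invented for illustration, not drawn from any code discussed here):

```python
# Before: the computation for a line item is buried inline.
def total_price(items):
    return sum(i["price"] * i["qty"] for i in items)

# After an "extract function" refactoring: the behavior is unchanged,
# but the essential idea now has a name of its own.
def line_total(item):
    return item["price"] * item["qty"]

def total_price_refactored(items):
    return sum(line_total(i) for i in items)

cart = [{"price": 2.0, "qty": 3}, {"price": 1.5, "qty": 2}]
print(total_price(cart), total_price_refactored(cart))  # 9.0 9.0
```

The two versions compute the same total; only the expression of the idea changes, which is the point.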

Striking

Every sentence should grab the reader and propel them forward.

Can we say that every line of code should grab the reader and propel her forward?! I certainly prefer to read programs in which every statement or expression tells me something important about what the program is and does. Some programming languages make this harder to do, with boilerplate and often more noise than signal.

Perhaps we could say that every line of code should propel the program forward, not get in the way of its functionality? This says more about the conciseness with which the programmer writes, and fits the spirit of Nielsen's rule nicely.

Every paragraph should contain a striking idea, originally expressed.

Can we say that every function or class should contain a striking idea, originally expressed? Functions and classes that don't express such an idea usually get in the reader's way. In programming, though, we often write "helpers", auxiliary functions or classes that assist another in expressing an essential, even striking, idea. The best helpers capture an idea of deep value, but it may be the nature of decomposition that we sometimes create ones that are striking only in the context of the larger system.

The most significant ideas should be distilled into the most potent sentences possible.

Yes! The most significant ideas in our programs should be distilled into the most potent code possible: expressions, statements, functions, classes, whatever the abstractions our language and style provide.

Style

Use the strongest appropriate verb.

Of course. Names matter. Use the strongest, most clearly named primitives and library functions possible. When we create new functions, give them strong, clear names. This rule applies to our nouns, too. Our variables and classes should carry strong names that clearly name their concept. No more "manager" or "process" nouns. They avoid naming the concept. What do those objects do?

This rule also applies more broadly to coding style. It seems to me that Tell, Don't Ask is about strength in our function calls.
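A small sketch of that contrast in hypothetical Python (the `Account` class and its names are my own invention for illustration):

```python
class Account:
    def __init__(self, balance):
        self.balance = balance

    # Tell, Don't Ask: the object enforces its own rule,
    # so callers can issue one strong, direct message.
    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount

# Ask style -- the caller interrogates state, and the logic leaks out:
#     if account.balance >= amount:
#         account.balance -= amount

# Tell style -- a single call carries the intent:
account = Account(100)
account.withdraw(30)
print(account.balance)  # 70
```

The "tell" version reads like a strong verb in a sentence; the "ask" version scatters the rule across every caller.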

Beware of nominalization.

In code, this guideline prescribes a straightforward idea: Don't make a class when a function will do. You Aren't Gonna Need It.
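In code, the contrast might look like this (a hypothetical Python sketch; the names are mine):

```python
# Nominalized: a class that exists only to wrap one behavior
# adds ceremony without adding value.
class GreetingFormatter:
    def __init__(self, name):
        self.name = name

    def format(self):
        return f"Hello, {self.name}!"

# A plain function with a strong verb says the same thing directly.
def greet(name):
    return f"Hello, {name}!"

print(GreetingFormatter("Ada").format())  # Hello, Ada!
print(greet("Ada"))                       # Hello, Ada!
```

If state or polymorphism never materializes, the function was all we needed.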

Meta

None of the above rules should be consciously applied while drafting material.

Anyone who writes a lot knows how paralyzing it can be to worry about writing good prose before getting words down onto paper, or into an emacs buffer. Often we don't know what to write until we write it; why try to make that something perfect before we know what it is?

This rule fits nicely with most lightweight approaches to programming. I even encourage novice programmers to write code this way, much to the chagrin of my more engineering-oriented colleagues. Don't be paralyzed by the blank screen. Write freely. Make something work, anything on the path to a solution, and only then worry about making it right and fast. Do the simplest thing that will work. Only after your code works do you rewrite to make it better.

Not all rewriting is refactoring, but all refactoring is rewriting. Write. Pass the test. Refactor.

Many people find that refactoring provides the most valuable use of design patterns, as a target toward which one moves the code. This is perhaps a more important use of patterns than initial design, at which time many of us tend to overdesign our programs. Joshua Kerievsky's Refactoring to Patterns book shows programmers how to do this safely and reliably. I wonder if there is any analogue to this book in the writing world, or if there even could be such a book?

I once wrote a post on writing in an agile style, and rewriting played a key role in that idea. Some authors like rewriting more than writing, and I think you can say the same thing of many, many programmers. Refactoring brings a different kind of joy, at getting something right that was before almost right -- which is, of course, another way of saying not yet right.

I recall once talking with a novelist over lunch about tools for writers. Even the most humble word processor has done so much to change how authors write and rewrite. One of the comments on Nielsen's entry asks whether new tools for writing have changed the way writers think. We might also ask whether new tools -- the ability to edit and rewrite so much more easily and with so much less technical effort -- have changed the product created by most writers. If not, could they?

New tools also change how we rewrite code. The refactoring browser has escaped the confines of the Smalltalk image and now graces IDEs for Java, C++, and C# programmers; indeed, refactoring tools exist for so many languages these days. Is that good or bad? Many of my colleagues lament that the ease of rewriting has led to an endemic sloppiness, to a rash of trial-and-error programming in which students keep making seemingly random changes to their code until something compiles. Back in the good old days, we had to think hard about our code before we carved it into clay tablets... It seems clear to me that making rewriting and refactoring easier is a huge win, even as it changes how we need to teach and practice writing.

In retrospect, a lot of Nielsen's rules generalize to dicta we programmers will accept eagerly. Eliminate boilerplate. Write concise, focused code. Use strong, direct, and clear language. Certainly when we abstract the tasks to a certain level, writing and rewriting really are much the same in text and code.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

September 27, 2009 11:19 AM

History Mournful and Glorious

While prepping for my software engineering course last summer, I was re-reading some old articles by Philip Greenspun on teaching, especially an SE course focused on building on-line communities. One of the talks he gives is called Online Communities. This talk builds on the notion that "online communities are at the heart of most successful applications of the Internet". Writing in 2006, he cites amazon.com, AOL, and eBay as examples, and the three years since have only strengthened his case. MySpace seems to have passed its peak yet remains an active community. I sit here connected with friends from grade school who have been flocking to Facebook in droves, and Twitter is now one of my primary sources for links to valuable professional articles and commentary.

As a university professor, the next two bullets in his outline evoke both sadness and hope:

  • the mournful history of applying technology to education: amplifying existing teachers
  • the beauty of online communities: expanding the number of teachers

Perhaps we existing faculty are limited by our background, education, or circumstances. Perhaps we simply choose the more comfortable path of doing what has been done in the past. Even those of us invested in doing things differently sometimes feel like strangers in a strange land.

The great hope of the internet and the web is that it lets many people teach who otherwise wouldn't have a convenient way to reach a mass audience except by textbooks. This is a threat to existing institutions but also perhaps an open door on a better world for all of us.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

September 19, 2009 9:09 PM

Quick Hits with an Undercurrent of Change

Yesterday evening, in between volleyball games, I had a chance to do some reading. I marked several one-liners to blog on. I planned a disconnected list of short notes, but after I started writing I realized that they revolve around a common theme: change.

Over the last few months, Kent Beck has been blogging about his experiences creating a new product and trying to promote a new way to think about his design. In his most recent piece, Turning Skills into Money, he talks about how difficult it can be to create change in software service companies, because the economic model under which they operate actually encourages them to have a large cohort of relatively inexperienced and undertrained workers.

The best line on that page, though, is a much-tweeted line from a comment by Niklas Bjørnerstedt:

A good team can learn a new domain much faster than a bad one can learn good practices.

I can't help thinking about the change we would like to create in our students through our software engineering course. Skills and good practices matter. We cannot overemphasize the importance of proficiency, driven by curiosity and a desire to get better.

Then I ran across Jason Fried's The Next Generation Bends Over, a salty and angry lament about the sale of Mint to Intuit. My favorite line, with one symbolic editorial substitution:

Is that the best the next generation can do? Become part of the old generation? How about kicking the $%^& out of the old guys? What ever happened to that?

I experimented with Mint and liked it, though I never convinced myself to go all the way with it. I have tried Quicken, too. It seemed at the same time too little and too much for me, so I've been rolling my own. But I love the idea of Mint and hope to see the idea survive. As the industry leader, Intuit has the leverage to accelerate the change in how people manage their finances, compared to the smaller upstart it purchased.

For those of us who use these products and services, the nature of the risk has just changed. The risk with the small guy is that it might fold up before it spreads the change widely enough to take root. The risk with the big power is that it doesn't really get it and wastes an opportunity to create change (and wealth). I suspect that Intuit gets it and so hold out hope.

Still... I love the feistiness that Fried shows. People with big ideas need not settle. I've been trying to encourage the young people with whom I work, students and recent alumni, to shoot for the moon, whether in business or in grad school.

This story meshed nicely with Paul Graham's Post-Medium Publishing, in which Graham joins in the discussion of what it will be like for creators no longer constrained by the printed page and the firms that have controlled publication in the past. The money line was:

... the really interesting question is not what will happen to existing forms, but what new forms will appear.

Change will happen. It is natural that we all want to think about our esteemed institutions and what the change means for them. But the real excitement lies in what will grow up to replace them. That's where the wealth lies, too. That's true for every discipline that traffics in knowledge and ideas, including our universities.

Finally, Mark Guzdial ruminates on what changes CS education. He concludes:

My first pass analysis suggests that, to make change in CS, invent a language or tool at a well-known institution. Textbooks or curricula rarely make change, and it's really hard to get attention when you're not at a "name" institution.

I think I'll have more to say about this article later, but I certainly know what Mark must be feeling. In addition to his analysis of tools and textbooks and pedagogies, he has his own experience creating a new way to teach computing to non-majors and majors alike. He and his team have developed a promising idea, built the infrastructure to support it, and run experiments to show how well it works. Yet... The CS ed world looks much like it always has, as people keep doing what they've always been doing, for as many reasons as you can imagine. And inertia works against even those with the advantages Mark enumerates. Education is a remarkably conservative place, even our universities.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Software Development, Teaching and Learning

September 09, 2009 10:04 PM

Reviewing a Career Studying Camouflage

Camouflage Conference poster

A few years ago I blogged when my university colleague Roy Behrens won a faculty excellence award in his home College of Humanities and Fine Arts. That entry, Teaching as Subversive Inactivity, taught me a lot about teaching, though I don't yet practice it very well. Later, I blogged about A Day with Camouflage Scholars, when I had the opportunity to talk about how a technique of computer science, steganography, related to the idea of camouflage as practiced in art and the military. Behrens is an internationally recognized expert on camouflage who organized an amazing one-day international conference on the subject here at my humble institution. To connect with these scholars, even for a day, was a great thrill. Finally, I blogged about Feats of Association when Behrens gave a mesmerizing talk illustrating "that the human mind is a connection-making machine, an almost unwilling creator of ideas that grow out of the stimuli it encounters."

As you can probably tell, I am a big fan of Behrens and his work. Today, I had a new chance to hear him speak, as he gave a talk associated with his winning another award, this time the university's Distinguished Scholar Award. After hearing this talk, no one could doubt that he is a worthy recipient, whose omnivorous and overarching interest in camouflage reflects a style of learning and investigation that we could all emulate. Today's talk was titled "Unearthing Art and Camouflage" and subtitled "my research on the fence between art and science". It is a fence that more of us should try to work on.

The talk wove together threads from Roy's study of the history and practice of camouflage with bits of his own autobiography. It's a style I enjoyed in Kurt Vonnegut's Palm Sunday and have appreciated at least since my freshman year in college, when in an honors colloquium at Ball State University I was exposed to the idea of history from the point of view of the individual. As someone who likes connections, I'm usually interested in how accomplished people come to do what they do and how they make the connections that end up shaping or even defining their work.

Behrens was in the first generation of his family to attend college. He came from a small Iowa town to study here at UNI, where he first did research in the basement of the same Rod Library where I get my millions. He held his first faculty position here, despite not having a Ph.D. or the terminal degree of the discipline, an M.F.A. After leaving UNI, he earned an M.A. from the Rhode Island School of Design. But with a little lucky timing and a publication record that merited consideration, he found his way into academia.

From where did his interest in camouflage come? He was never interested in the military, though he served as a sergeant in the Vietnam-era Marine Corps. His interest lay in art, but he didn't enjoy the sort of art in which subjective tastes and fashion drove practice and criticism. Instead, he was interested in what was "objective, universal, and enduring" and as such was drawn to design and architecture. He and I share an interest in the latter; I began my undergraduate study as an architecture major. A college professor offered him a chance to do undergraduate research, and his result was a paper titled "Perception in the Visual Arts", in which he first examined the relationship between the art we make and the science that studies how we perceive it. This paper was later published in a major art education journal.

That project marked his first foray into perceptual psychology. Behrens mentioned a particular book that made an impression on him, Aspects of Form, edited by Lancelot Law Whyte. It contained essays on the "primacy of pattern" by scholars in both the arts and the sciences. Readers of this blog know of my deep interest in patterns, especially in software but in all domains. (They also know that I'm a library junkie and won't be surprised to know that I've already borrowed a copy of Whyte's book.)

Behrens noted that it was a short step from "How do people see?" to "How are people prevented from seeing?" Thus began what has been forty years of research on camouflage. He studies not only the artistic side of camouflage but also its history and the science that seeks to understand it. I was surprised to find that as a RISD graduate student he already intended to write a book on the topic. At the time, he contacted Rudolf Arnheim, who was then a perceptual psychologist in New York, with a breathless request for information and guidance. Nothing came of that request, I think, but in 1990 or so Behrens began a fulfilling correspondence with Arnheim that lasted until his death in 2007. After Arnheim passed away, Behrens asked his family to send all of his photos so that Behrens could make copies, digitize them, and then return the originals to the family. They agreed, and the result is a complete digital archive of photographs from Arnheim's long professional life. This reminded me of Grady Booch's interest in preservation, both of the works of Dijkstra and of the great software architectures of past and present.

While he was at RISD, Behrens did not know that the school library had 455 original "dazzle" camouflage designs in its collection and so missed out on the opportunity to study them. His ignorance of these works was not a matter of poor scholarship, though; the library didn't realize their significance and so had them uncataloged on a shelf somewhere. In 2007, his graduate alma mater contacted him with news of the items, and he has now begun to study them, forty years later.

As a grad student, Behrens became interested in the analogical link between (perceptual) figure-ground diagrams and (conceptual) Venn diagrams. He mentioned another book that helped him make this connection, Community and Privacy, by Serge Chermayeff and Christopher Alexander, whose diagrams of cities and relationships were Venn diagrams. This story brings to light yet another incidental connection between Behrens's work and mine. Alexander is, of course, the intellectual forebear of the software patterns movement, through his later books Notes On The Synthesis Of Form, The Timeless Way Of Building, A Pattern Language, and The Oregon Experiment.

UNI hired Behrens in 1972 into a temporary position that became permanent. He earned tenure and, fearing the lack of adventure that can come from settling down too soon, immediately left for the University of Wisconsin-Milwaukee. He worked there ten years and earned his tenure anew. It was at UW-M where he finally wrote the book he had begun planning in grad school. Looking back now, he is embarrassed by it and encouraged us not to read it!

At this point in the talk, Behrens told us a little about his area of scholarship. He opened with a meta-note about research in the era of the world wide web and Google. There are many classic papers and papers that scholars should know about. Most of them are not yet on-line, but one can at least find annotated bibliographies and other references to them. He pointed us to one of his own works, Art and Camouflage: An Annotated Bibliography, as an example of what is now available to all on the web.

Awareness of a paper is crucial, because it turns out that often we can find it in print -- even in the periodical archives of our own libraries! These papers are treasures unexplored, waiting to be rediscovered by today's students and researchers.

Camouflage consists of two primary types. The first is high similarity, as typified by figure-ground blending in the arts and mimicry in nature. This is the best known type of camouflage and the type most commonly seen in popular culture.

The second is high difference, or what is often called figure disruption. This sort of camouflage was one of the important lessons of World War I. We can't make a ship invisible, because the background against which it is viewed changes constantly. A British artist named Norman Wilkinson had the insight to reframe the question: We are not trying to hide a ship; we are trying to prevent the ship from being hit by a torpedo!

(Redefining one problem in terms of another is a standard technique in computer science. I remember when I first encountered it as such, in a graduate course on computational theory. All I had to do was find a mapping from a problem to, say, 3-SAT, and -- voilà! -- I knew a lot about it. What a powerful idea.)

This insight gave birth to dazzle camouflage, in which the goal came to be to break an image into incoherent or imperceptible parts. To protect a ship, the disruption need not be permanent; it needed only to slow the attackers sufficiently that they were unable to target it, predict its course, and launch a relatively slow torpedo at it with any success.

a Gabon viper, which illustrates coincident disruption

Behrens offered that there is a third kind of camouflage, coincident disruption, that is different enough to warrant its own category. Coincident disruption mixes the other two types, both blending into the background and disrupting the viewer's perception. He suggested that this may well be the most common form of camouflage found in nature using the Gabon viper, pictured here, as one of his examples of natural coincident disruption.

Most of Behrens' work is on modern camouflage, in the 20th century, but study in the area goes back farther. In particular, camouflage was discussed in connection to Darwin's idea of natural selection. Artist Abbott Thayer was a preeminent voice on camouflage in the 19th century who thought and wrote on both blending and disruption as forms in nature. Thayer also recommended that the military use both forms of camouflage in combat, a notion that generated great controversy.

In World War I, the French ultimately employed 3,000 artists as "camoufleurs". The British and Americans followed suit on a smaller scale. Behrens gave a detailed history of military camouflage, most of which was driven by artists and assisted by a smaller number of scientists. He finds World War II's contributions less interesting but is excited by recent work by biologists, especially in the UK, who have demonstrated renewed interest in natural camouflage. They are using empirical methods and computer modeling as ways to examine and evaluate Thayer's ideas from over a hundred years ago. Computational modeling in the arts and sciences -- who knew?

Toward the end of his talk, Behrens told several stories from the "academic twilight zone", where unexpected connections fall into the scholar's lap. He called these the "unsung delights of researching". These are stories best told first hand, but they involved a spooky occurrence of Shelbyville, Tennessee, on a pencil he bought for a quarter from a vending machine, having the niece and nephew of Abbott Thayer in attendance at a talk he gave in 1987, and buying a farm in Dysart, Iowa, in 1992 only then to learn that Everett Warner, whom he had studied, was born in Vinton, Iowa -- 14 miles away. In the course of studying a topic for forty years, the strangest of coincidences will occur. We see these patterns whether we like to or not.

Behrens's closing remarks included one note that highlights the changes in the world of academic scholarship that have occurred since he embarked on his study of camouflage forty years ago. He admitted that he is a big fan of Wikipedia and has been an active contributor on pages dealing with the people and topics of camouflage. Social media and web sites have fundamentally changed how we build and share knowledge, and increasingly they are being used to change how we do research itself -- consider the Open Science and Polymath projects.

Today's talk was, indeed, the highlight of my week. Not only did I learn more about Behrens and his work, but I also ended up with a couple of books to read (the aforementioned Whyte book and Kimon Nicolaïdes's The Natural Way to Draw), as well as a couple of ideas about what it would mean for software patterns to hide something. A good way to spend an hour.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

August 07, 2009 2:18 PM

A Loosely-Connected Friday Miscellany

An Addition to My News Aggregator

Thanks to John Cook, I came across the blog of Dan Meyer, a high school math teacher. Cook pointed to an entry with a video of Meyer speaking pecha kucha-style at OSCON. One of the important messages for teachers conveyed in this five minutes is Be less helpful. Learning happens more often when people think and do than when they follow orders in a well-defined script.

While browsing his archive I came across this personal revelation about the value of the time he was spending on his class outside of the business day:

I realize now that the return on that investment of thirty minutes of my personal time isn't the promise of more personal time later. ... Rather it's the promise of easier and more satisfying work time now.

Time saved later is a bonus. If you depend on that return, you will often be disappointed, and that feeds the emotional grind that is teaching. Kinda like running in the middle. I think it also applies more than we first realize to reuse and development speed in software.

Learning and Doing

One of the underlying themes in Meyer's writing seems to be the same idea in this line from Gerd Binnig, which I found at Physics Quote of Day:

Doing physics is much more enjoyable than just learning it. Maybe 'doing it' is the right way of learning ....

Programming can be a lot more fun than learning to program, at least the way we often try to teach it. I'm glad that so many people are working on ways to teach it better. In one sense, the path to better seems clear.

Knowing and Doing

One of the reasons I named my blog "Knowing and Doing" was that I wanted to explore the connection between learning, knowing, and doing. Having committed to that name so many years ago, I decided to stake its claim at Posterous, which I learned about via Jake Good. Given some technical issues with using NanoBlogger, at least an old version of it, I've again been giving some thought to upgrading or changing platforms. Like Jake, I'm always tempted to roll my own, but...

I don't know if I'll do much or anything more with Knowing and Doing at Posterous, but it's there if I decide that it looks promising.

A Poignant Convergence

Finally, a little levity laced with truth. Several people have written to say they liked the name of my recent entry, Sometimes, Students Have an Itch to Scratch. On a whim, I typed it into Translation Party, which alternately translates a phrase from English into Japanese and back until it reaches equilibrium. In only six steps, my catchphrase settles onto:

Sometimes I fear for the students.

Knowing how few students will try to scratch their own itches with their new-found power as a programmer, and how few of them will be given a chance to do so in their courses on the way to learning something valuable, I chuckled. Then I took a few moments to mourn.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Software Development, Teaching and Learning

August 06, 2009 11:59 AM

Woody Allen Is On Line 1

An excerpt from an interview at Gaping Void:

Some days, the work is tedious, labour-intensive and as repetitive as a production line in a factory. ... The key is having a good infrastructure. ...

But none of it works without discipline. Early on in my career, I was told that success demanded one thing above all others: turning up. Turning up every bloody day, regardless of everything.

This was said by artist Hazel Dooney, but it could just as well have been said by a programmer -- or a university professor. One thing I love about the agile software world is its willingness to build new kinds of tools to support the work of programmers. Isn't it ironic? Valuing people over tools makes having the right tools even more important.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

August 01, 2009 7:09 AM

Casting Shadows

I have been reading David Ogilvy's Confessions of an Advertising Man and finding it quite enjoyable. It is a slim volume, written charmingly in a style we don't see much anymore. It is about not only advertising but also leading a team, creating and guiding an organization, and running a business. There are elements of all these in my job as department head, and even as a faculty member. Many of Ogilvy's lessons won't surprise you; he recommends the old-fashioned virtues. Hard work. Quality. Fairness. Honesty. Integrity. High standards. Candor.

Ogilvy describes how to build and run a great agency, but at heart he is a great believer in the individual, especially when it comes to creative acts:

Some agencies pander to the craze for doing everything in committee. They boast about "teamwork" and decry the role of the individual. But no team can write an advertisement, and I doubt whether there is a single agency of any consequence which is not the lengthened shadow of one man.

I sometimes wonder whether greatness can be achieved by a group of competent or even above-average individuals, or if an outstanding individual is an essential ingredient. In an advertising agency, there are the somewhat distinct acts of creating campaigns and running the agency. Perhaps the latter is more amenable to being led by a team. But even when it comes to great works, I am aware that teams have produced excellent software. How much of that success can be attributed to the vision and leadership of one individual on the team, I don't know.

As I mentioned at the top of a recent entry, a university task force I chaired submitted its final report at the beginning of July. After working so long with this group, I am feeling a little seller's remorse. Did we do a good enough job? If acted upon, will our recommendations effect valuable change? Can they be acted upon effectively at a time of budget uncertainties? The report we wrote does not advocate revolutionary change, at least not on the surface. It is more about creating structures and practices that will support building trust and communication. In a community that has drifted in recent years and not always had visionary leadership, these are prerequisites to revolutionary changes. Still, I am left wondering what we might have done more or differently.

The report is most definitely the product of a committee. I suspect that several of the individuals in the group might well have been able to produce something as good or better by working solo, certainly something with sharper edges and sharper potential -- at higher risk. Others on the group could not have done so, but that was not the nature of their roles. In the end, the committee rounded off the sharp edges, searched for and found common ground. The result is not a least common denominator, but it is no longer revolutionary. If that sort of result is what you need, a committee is not your best agent.

Part of my own remorse comes back to Ogilvy's claim. Could I have led the group better? Could I have provided a higher vision and led the group to produce a more remarkable set of recommendations? Did I cast a long enough shadow?

~~~~

The shadows of summer are lengthening. One of the reasons that I have always liked living in the American Midwest is the changing of the seasons. Here we have not four seasons but eight, really, as the blending of summer into autumn, or of winter into spring, each has its own character. Academia, too, has its seasons, and they are part of what attracted me to professional life at a university. From the outside looking in, working in industry looked like it could become monotonous. But the university offers two semesters and a summer, each with a brand new start, a natural life, and a glorious end. Whatever monotony we experience happens at a much larger time scale, as these seasons come and go over the years.

Like the weather, academia has more than the obvious seasons we name. We have the break between semesters over Christmas and New Year's Day, a short period of change. When school ends in May, it is like the end of the year, and we have a period of changing over to a summer of activity that is for many of us much different than the academic year. And finally, we have the transition between summer and the new academic year. For me, that season begins about now, on the first day of August, as thoughts turn more and more to the upcoming year, the preparation of classes, and the return of students. This is a change that injects a lot of energy into our world and saves us from any monotony we might begin to feel.

So, as the long shadows of summer begin to fall, we prepare for the light of a new year. What sort of shadow will I cast?


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

July 14, 2009 1:06 PM

Is He Talking About News, or Classroom Content?

Seth Godin says:

People will not pay for by-the-book rewrites of news that belongs to all of us. People will not pay for yesterday's news, driven to our house, delivered a day late, static, without connection or comments or relevance. Why should we?

Universities may not be subject to the same threats as newspapers, due in some measure to

  • their ability to aggregate intellectual capital and research capacity,
  • their privileged status in so many disciplines as the granters of required credentials, and
  • frankly, the lack of maturity, initiative, and discipline of their primary clientele.

But Godin's quote ought to cause a few university professors considerable uneasiness. In the many years since I began attending college as an undergrad, I have seen courses at every level and at every stop that fall under the terms of this rebuke.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

July 14, 2009 10:28 AM

They Say These Things Come in Threes...

After writing that two UNI CS grads had recently defended their doctoral dissertations, I heard about the possibility of a third. Turns out it was more than a possibility... Last Friday, Chris Johnson defended and has since submitted the final version of his dissertation to the University of Tennessee. His work is in the area of scientific visualization, with a focus on computation-intensive simulations. For the last few years, Chris has been working out of Ames, Iowa, and we may be lucky enough to have him remain close by.

The summer bonanza grows. Congratulations, Chris!


Posted by Eugene Wallingford | Permalink | Categories: General

July 11, 2009 10:57 AM

Former Students Crossing the Divide

What a summer for UNI CS alumni in academia! In the last few weeks, Andrew Drenner and Ryan Dixon both defended their Ph.D. dissertations, at the University of Minnesota and UC-Santa Barbara, respectively. Andrew is currently working with a robotics start-up spun off from his research lab, and Ryan is enjoying a short break before starting full-time at Apple next month.

I had the great fortune to work with Andrew and Ryan throughout their undergraduate years, in several courses and projects each. Some students are different from the rest, and these guys distinguished themselves immediately not only by their creativity and diligence but also by their tremendous curiosity. When a person with deep curiosity also has the desire to work hard to find answers, stand back. It is neat to see them both expanding what we know about topics they were working on as undergrads. Indeed, Ryan's project at Apple is very much in the spirit of his undergrad research project, which I was honored to supervise.

Congratulations, gentlemen! Many of your friends and family may think that this means you are no longer students. But you are really joining a new fraternity of students. We are honored to have you among us.


Posted by Eugene Wallingford | Permalink | Categories: General

July 09, 2009 3:52 PM

Five Years On

Sketchbooks are not about being a good artist,
they're about being a good thinker.
-- Jason Santa Maria

Five years ago today, I started this blog as a sort of sketchbook for words and ideas. I didn't know just what to expect, so I'm not surprised that it hasn't turned out as I might have guessed. Thinking out loud and writing things down can be like that. Trying to explain to myself and anyone who would listen what was happening as I lived life as a computer scientist and teacher has been a lot of fun.

a shot of me running the Chicago marathon

In the beginning, I was preparing to teach a course on agile software development and planning to run my second marathon. These topics grew together in my mind, almost symbiotically, and the result was a lot of connections. The connections were made firmer by writing about them. They also gave me my first steady readership, as occasionally someone would share a link with a friend.

Things have changed since 2004. Blogging was an established practice in a certain core demographic but just ready to break out among the masses. Now, many of the bloggers whose work I cherished reading back then don't write as much as they used to. Newer tools such as Twitter give people a way to share links and aphorisms, and many people seem to live in the Twittersphere now. Fortunately, a lot of people still take the time to share their ideas in longer form.

Even though I go through stretches where I don't write much, my blog has become an almost essential element of how I go about life now. Yesterday's entry is a great example of me writing to synthesize experience in a way I might not otherwise. I had a few thoughts running around my head. They were mostly unrelated but somehow... they wanted to be connected. So I started writing, and ended up somewhere I may not have taken the time to go if I hadn't had to write complete sentences and say things in a way my friends would understand. That is good for me. For you readers? I hope so. A few of you keep coming back.

Five years down the road, I am no longer surprised by how computer science, writing, and running flow together. First, they are all a part of who I am right now, and our minds love to make connections. But then there is something common to all activities that challenge us. With the right spirit, we find that they drive us to seek excellence, and the pursuit of excellence -- whether we are Roger Federer, reaching the highest of heights, or Lance Armstrong, striving to reach those heights yet again, or just a simple CS professor trying to reach his own local max -- is a singular experience.

Last week, I ran across a quote from Natalie Goldberg on Scott Smith's blog. I first mentioned Goldberg during my first month as a blogger. This quote ties running to writing to habit:

If you run regularly, you train your mind to cut through or ignore your resistance. You just do it. And in the middle of the run, you love it. When you come to the end, you never want to stop. And you stop, hungry for the next time. That's how writing is, too.

The more I blog, the more I want to write. And, in the face of some troubles over the last year, I wake up hungry to run.

A few years ago, I read a passage from Brian Marick that I tucked away for July 9, 2009:

I've often said that I dread the day when I look back on the me of five years ago without finding his naivete and misconceptions faintly ridiculous. When that day comes, I'll know I've become an impediment to progress.

a forest

Just last month, Brian quick-blogged on the same theme: continuing to grow enough that the me of five years ago looks naive, or stepping away from the stage. After five years blogging, my feeling on this is mixed. I look back and see some naivete, yes, but I often see some great stuff. "I thought that?" Sometimes I'm disappointed that a great idea from back then hasn't become more ingrained in my practices of today, but then I remember that it's a lot easier to think an idea than to live it. I do see progress, though. I also see new themes emerging in my thoughts and writing, which is a different sort of progress altogether.

I do take seriously that you are reading this and that you may even make an effort to come back to read more later. I am privileged to have had so many interactions with readers over these five years. Even when you don't send comments and links, I know you are there, spending a little of your precious time here.

So I think I'll stay on this stage a while longer. I am just a guy trying to evolve, and writing helps me along the way.


Posted by Eugene Wallingford | Permalink | Categories: General

July 06, 2009 3:26 PM

Cleaning Data Off My Desk

As I mentioned last time, this week I am getting back to some regular work after mostly wrapping up a big project, including cleaning off my desk. It is cluttered with a lot of loose paper that the Digital Age had promised to eliminate. Some is my own fault, paper copies of notes and agendas I should probably find a way not to print. Old habits die hard.

But I also have a lot of paper sent to me as department head. Print-outs; old-style print-outs from a mainframe. The only thing missing from a 1980s flashback is the green bar paper.

Some of these print-outs are actually quite interesting. One set is of grade distribution reports produced by the registrar's office, which show how many students earned As, Bs, and so on in each course we offered this spring and for each instructor who taught a course in our department. This sort of data can be used to understand enrollment figures and maybe even performance in later courses. Some upper administrators have suggested using this data in anonymous form as a subtle form of peer pressure, so that profs who are outliers within a course might self-correct their own distributions. I'm not ready to think about going there yet, but the raw data seems useful, and interesting in its own right.

I might want to do more with the data. This is the first time I recall receiving this, but in the fall it would be interesting to cross-reference the grade distributions by course and instructor. Do the students who start intro CS in the fall tend to earn different grades than those who start in the spring? Are there trends we can see over falls, springs, or whole years? My colleagues and I have sometimes wondered aloud about such things, but having a concrete example of the data in hand has opened new possibilities in my mind. (A typical user am I...)

As a programmer, I have the ability to do such analyses with relatively straightforward scripts, but I can't. The data is closed. I don't receive actual data from the registrar's office; I receive a print-out of one view of the data, determined by people in that office. Sadly, this data is mostly closed even to them, because they are working with an ancient mainframe database system for which there is no support and a diminishing amount of corporate memory here on campus. The university is in the process of implementing a new student information system, which should help solve some of these problems. I don't imagine that people across campus will have much access to this data, though. That's not the usual M.O. for universities.
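If that data ever did arrive in an open, tabular form, the cross-referencing I have in mind really would be a short script. Here is a sketch in Python, with an invented CSV layout (one row per course section, columns for the term and the count of each letter grade -- the column names are hypothetical, not anything the registrar actually provides):

```python
import csv
from collections import defaultdict

def mean_gpa_by_term(path):
    """Compute the mean GPA awarded in each term, from a CSV with
    hypothetical columns: term, A, B, C, D, F (counts of each grade)."""
    points = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}
    totals = defaultdict(lambda: [0.0, 0])  # term -> [grade points, students]
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            term = row["term"]                # e.g., "Fall 2008"
            for grade, value in points.items():
                n = int(row[grade])           # students earning this grade
                totals[term][0] += value * n
                totals[term][1] += n
    return {term: gp / n for term, (gp, n) in totals.items() if n}
```

With something like this in hand, comparing the fall and spring intro sections, or spotting trends over the years, is a matter of a few more lines over the returned dictionary.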

Course enrollment and grade data aren't the only ones we could benefit from opening up a bit. As a part of the big project I just wrapped up, the task force I was on collected a massive amount of data about expenditures on campus. This data is accessible to many administrators on campus, but only through a web interface that constrains interaction pretty tightly. Now that we have collected the data, processed almost all of it by hand (the roughness of the data made automated processing an unattractive alternative), and tabulated it for analysis, we are starting to receive requests for our spreadsheets from others on campus. These folks all have access to the data, just not in the cleaned-up, organized format into which we massaged it. I expressed frustration with our financial system in a mini-rant a few years ago, and other users feel similar limitations.

For me, having enrollment and grade data would be so cool. We could convert data into information that we could then use to inform scheduling, teaching assignments, and the like. Universities are inherently information-based institutions, but we don't always put our own understanding of the world into practice very well. Constrained resources and intellectual inertia slow us down or stop us altogether.

Hence my wistful hope while reading Tim Bray's "Hello-World" for Open Data. Vancouver has a great idea:

  • Publish the data in a usable form.
  • License it in a way that turns people loose to do whatever they want, but doesn't create unreasonable liability risk for the city.
  • See what happens. ...

Would anyone on campus take advantage? Maybe, maybe not. I can imagine some interesting mash-ups using only university data, let alone linking to external data. But this isn't likely to happen. GPA data and instructor data are closely guarded by departments and instructors, and throwing light on them would upset enough people that any benefits would probably be shouted down. But perhaps some subset of the data the university maintains, suitably anonymized, could be opened up. If nothing else, transparency sometimes helps to promote trust.

I should probably do this myself, at the department level, with data related to schedule, budget, and so on. I occasionally share the spreadsheets I build with the faculty, so they can see the information I use to make decisions. This spring, we even discussed opening up the historic formula used in the department to allocate our version of merit pay.

(What a system that is -- so complicated that I've feared making more than small editorial changes to it in my time as head. I keep hoping to find the time and energy to build something meaningful from scratch, but that never happens. And it turns out that most faculty are happy with what we have now, perhaps for "the devil you know" reasons.)

I doubt even the CS faculty in my department would care to have open data of this form. We are a small crew, and they are busy with the business of teaching and research. It is my job to serve them by taking as much of this thinking off their plates as I can. Then again, who knows for sure until we try? If the cost of sharing can be made low enough, I'll have no reason not to share. And whether anyone uses the data might not even be the real point. Habits change when we change them, when we take the time to create new ones to replace the old ones. This would be a good habit for me to have.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Managing and Leading

June 26, 2009 4:01 PM

The Why of X

Where did the title of my previous entry come from? Two more quick hits tell a story.

Factoid of the Day

On a walk the other night, my daughter asked why we called variables x. She is reviewing some math this summer in preparation to study algebra this fall. All I could say was, "I don't know."

Before I had a chance to look into the reason, one explanation fell into my lap. I was reading an article called The Shakespeare of Iran, which I ran across in a tweet somewhere. And there was an answer: the great Omar Khayyam.

Omar was the first Persian mathematician to call the unknown factor of an equation (i.e., the x) shiy (meaning thing or something in Arabic). This word was transliterated to Spanish during the Middle Ages as xay, and, from there, it became popular among European mathematicians to call the unknown factor either xay, or more usually by its abbreviated form, x, which is the reason that unknown factors are usually represented by an x.

However, I can't confirm that Khayyam was first. Both Wikipedia and another source also report the Arabic language connection, and the latter mentions Khayyam, but not specifically as the source. That author also notes that "xenos" is the Greek word for "unknown" and so could be the root. But I also haven't found a reference for this use of x that predates Khayyam, either. So maybe.

My daughter and I ended up with as much of a history lesson as a mathematical terminology lesson. I like that.

Quote of the Day

Yesterday afternoon, the same daughter was listening in on a conversation between me and a colleague about doing math and science, teaching math and science, and how poorly we do it. After we mentioned K-12 education and how students learn to think of science and math as "hard" and "for the brains", she joined the conversation with:

Don't ask teachers, 'Why?' They don't know, and they act like it's not important.

I was floored.

She is right, of course. Even our elementary school children notice this phenomenon, drawing on their own experiences with teachers who diminish or dismiss the very questions we want our children to ask. Why? is the question that makes science and math what they are.

Maybe the teacher knows the answer and doesn't want to take the time to answer it. Maybe she knows the answer but doesn't know how to answer it in a way that a 4th- or 6th- or 8th-grader can understand. Maybe he really doesn't know the answer -- a condition I fear happens all too often. No matter; the damage is done when the teacher doesn't answer, and the child figures the teacher doesn't know. Science and math are so hard that the teacher doesn't get it either! Better move on to something else. Sigh.

This problem doesn't occur only in elementary school or high school. How often do college professors send the same signal? And how often do college professors not know why?

Sometimes, truth hits me in the face when I least expect it. My daughters keep on teaching me.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

June 25, 2009 9:48 PM

X of the Day

Quick hits, for different values of x, of course, but also different values of "the day" I encountered them. I'm slow, and busier than I'd like.

Tweet of the Day

Courtesy of Glenn Vanderburg:

Poor programmers will move heaven and earth to do the wrong thing. Weak tools can't limit the damage they'll do.

Vanderburg is likely talking about professional programmers. I have experienced this truth when working with students. At first, it surprised me when students learning OOP would contort their code into the strangest configurations to avoid using the OO techniques they were learning. Why use a class? A fifty- or hundred-line method will do nicely.

Then, students learning functional programming would seek out arcane language features and workarounds found on the Internet to avoid trying out the functional patterns they had used in class. What could have been ten lines of transparent Scheme code in two mutually recursive functions became fifteen or more lines of the most painfully tortured C code wrapped in a thin veil of Scheme.
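The students' Scheme isn't shown here, but the shape of the mutual recursion I mean is easy to suggest. Rendered in Python for concreteness, the textbook example is a pair of functions that each hand a smaller version of the problem to the other:

```python
# Two mutually recursive functions: each defers to the other on a
# smaller input, bottoming out at zero. Assumes n is a non-negative
# integer. (An illustrative stand-in, not the students' actual code.)
def is_even(n):
    return True if n == 0 else is_odd(n - 1)

def is_odd(n):
    return False if n == 0 else is_even(n - 1)
```

Each call shrinks n by one and flips control to the other function. The tortured alternative is to reinvent this control flow by hand with flags, counters, and loops.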

I've seen this phenomenon in other contexts, too, like when students take an elective course called Agile Software Development and go out of their way to do "the wrong thing". Why bother with those unit tests? We don't really need to try pair programming, do we? Refactor -- what's that?

This feature of programmers and learners has made me think harder about how to help them see the value in just trying the techniques they are supposed to learn. I don't succeed as often as I'd like.

Comic of the Day

Hammock dwellers, unite!

2009-06-23 Wizard of Id on professors

If only. If only. When does summer break start?


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

June 24, 2009 8:13 AM

Brains, Patterns, and Persistence

I like to solve the Celebrity Cipher in my daily paper. Each puzzle is a mixed alphabet substitution cipher on a quote by someone -- a "celebrity", loosely considered -- followed by the speaker's name, sometimes prefixed with a title or short description. Lately I've been challenging myself to solve the puzzle in my head, without writing any letters down, even once I'm sure of them. Crazy, I know, but this makes the easier puzzles more challenging now that I have gotten pretty good at solving them with pen in hand.
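For the curious, the mechanics of such a cipher take only a few lines to express. A sketch in Python, with a made-up 26-letter key standing in for the puzzle's unknown mixed alphabet:

```python
import string

# A mixed-alphabet substitution cipher: each plaintext letter maps to
# the letter at the same position in a scrambled 26-letter key.
# This key is invented for illustration; the puzzle's key is unknown.
KEY = "QWERTYUIOPASDFGHJKLZXCVBNM"

ENCODE = str.maketrans(string.ascii_uppercase, KEY)
DECODE = str.maketrans(KEY, string.ascii_uppercase)

def encipher(text):
    return text.upper().translate(ENCODE)

def decipher(text):
    return text.upper().translate(DECODE)
```

Solving the puzzle is the inverse problem: recovering the decoding table from the ciphertext alone, using letter frequencies and word shapes. That is exactly the kind of pattern matching at work in what follows.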

(Spoiler alert... If you like to do this puzzle, too, and have not yet solved the June 22 cipher, turn away now. I am about to give the answer away!)

Yesterday I was working on a puzzle, and this was the speaker phrase:

IWHNN TOXFZRXNYHO NXKJHSSA YXOYXEBUHO

I had looked at the quote itself for a couple of minutes and so was operating on an initial hypothesis that YWH was the word the. I stared at the speaker for a while... IWHNN would be IheNN. Double letters to end the third word, which is probably the first name. N could be s, or maybe l. s... That would be the first letter of the first name.

And then I saw it, in whole cloth:

Chess grandmaster Savielly Tartakower

Please don't think less of me. I'm not a freak. Really.

a picture of Savielly Tartakower

How very strange. I have no special mental powers. I do have some experience solving these puzzles, of course, but this phrase is unusual both in the prefix phrase and in the obscurity of the speaker. Yes, I once played a lot of chess and did know of Tartakower, a French-Polish player of the early 20th century. But how did I see this answer?

The human brain amazes me almost every day with its ability to find, recognize, and impose patterns on the world. Practice and exposure to lots and lots of data is one of the ways it learns these patterns. That is part of how I am able to solve these ciphers most days -- experience makes patterns appear to me, unbidden by conscious thought. There may be other paths to mastery, but I know of no other reliable substitute for practice.

What about the rest of the puzzle? From the letter pairs in the speaker phrase, I was able to reconstruct the quote itself with little effort:

Victory goes to the player who makes the next-to-last mistake.

Ah, an old familiar line. If we follow this quote to its logical conclusion, it offers good advice for much of life. You never know which mistake will be the next-to-last, or the last. Keep playing to win. If you learn from your mistakes, you'll start to make fewer, which increases the probability that your opponent will make the last mistake of the game.

Even in non-adversarial situations, or situations in which there is no obvious single adversary, this is a good mindset to have. People who embrace failure persist. They get better, but perhaps more importantly they simply survive. You have to be in the game when your opportunity comes -- or when your opponent makes the ultimate mistake.

Like so many great lines, Tartakower's is not 100% accurate in all cases. As an accomplished chess player, he certainly knew that the best players can lose without ever making an obvious mistake. Some of my favorite games of all time are analyzed in My 60 Memorable Games, by Bobby Fischer himself. It includes games in which the conquered player never made the move that lost. Instead, the loser accreted small disadvantages, or drifted off theme, and suddenly the position was unfavorable. But looking back, Fischer could find no obvious improvement. Growing up, this fascinated me -- the loser had to make a mistake, right? The winner had to make a killer move... Perhaps not.

Even still, the spirit of Tartakower's advice holds. Play in this moment. You never know which mistake will be the next-to-last, or the last. Keep playing.

At this time of year, when I look back over the past twelve months of performing tasks that do not come naturally to me, and looking ahead to next year's vision and duties, this advice gives me comfort.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

June 13, 2009 7:16 PM

Agile Moments While Reading the Newspaper

The first: Our local paper carries a parenting advice column by John Rosemond, an advocate of traditional parenting. In Wednesday's column, a parent asked how to handle a child who refuses to eat his dinner. Rosemond responded that the parents should calmly, firmly, and persistently expect the child to eat the meal -- even if it meant that the child went hungry that night by refusing.

[Little Johnny] will survive this ordeal -- it may take several weeks from start to finish -- with significantly lower self-esteem and a significantly more liberal palate, meaning that he will be a much happier child.

If you know Rosemond, you'll recognize this advice.

I couldn't help thinking about what happens when we adults learn a new programming style (object-oriented or functional programming), a new programming technique (test-driven development, pair programming), or even a new tool that changes our work flow (say, SVN or JUnit). Calm, firm, persistent self-discipline or coaching are often the path to success. In many ways, Rosemond's advice works more easily with 3- or 5-year-olds than with college students or adults, because the adults have the option of leaving the room. Then again, the coach or teacher has less motivation to ensure the change sticks -- that's up to the learner.

I also couldn't help thinking how often college students and adults behave like 3- and 5-year-olds.

The second: Our paper also carries a medical advice column by a Dr. Gott, an older doctor who harkens back to an older day of doctor-patient relations. (There is a pattern here.) In Wednesday's column, the good doctor said about a particular diagnosis:

There is no laboratory or X-ray test to confirm or rule out the condition.

My first thought was, well, then how do we know it exists at all? This is a natural reaction for a scientist -- or pragmatist -- to have. I think this means that we don't currently have a laboratory or X-ray test for the presence or absence of this condition. Or there may be another kind of test that will tell us whether the condition exists, such as a stress test or an MRI.

Without any test, how can we know that something is? We may find out after it kills the host -- but then we would need a post-mortem test. While the patient lives, there could be a treatment regimen that works reliably in the face of the symptoms. This could provide the evidence we need to say that a particular something was present. But if the treatment fails, can we rule out the condition? Not usually, because there are other reasons a treatment might fail.

We face a similar situation in software with bugs. When we can't reproduce a bug, at least not reliably, we have a hard time fixing it. Whether we know the problem exists depends on which side of the software we live on... If I am the user who encounters the problem, I know it exists. If I'm the developer, then maybe I don't. It's easy for me as developer to assume that there is something wrong with the user, not my lovingly handcrafted code. When the program involves threading or a complex web of interactions among several systems, we are more inclined to recognize that a problem exists -- but which problem? And where? Oh, to have a test... I can only think of two software examples of reliable treatment regimens that may tell us something was wrong: rebooting the machine and reinstalling the program. (Hey to Microsoft.) But those are such heavy-handed treatments that they can't give us much evidence about a specific bug.

There is, of course, the old saying of TDD wags: Code without a test doesn't exist. Scoff at that if you want, but it is a very nice guideline to live by.
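The TDD guideline can be made concrete with a tiny sketch (Python, with hypothetical names and a made-up bug report): the test is written first and fails, which is the only evidence we have that the behavior does not yet exist; once the code makes it pass, the test stands as ongoing evidence that it does.

```python
# Hypothetical bug report: "discounts over 100% produce negative prices."
# Writing the test first gives the bug a concrete, checkable existence.

def apply_discount(price, percent):
    """Return price after a percentage discount, never below zero."""
    discounted = price * (1 - percent / 100)
    return max(discounted, 0.0)

def test_discount_never_goes_negative():
    # This assertion fails against naive code that omits the max();
    # once it passes, it is evidence the bug is really gone.
    assert apply_discount(10.0, 150) == 0.0

def test_ordinary_discount():
    assert apply_discount(10.0, 25) == 7.5

test_discount_never_goes_negative()
test_ordinary_discount()
```

The names and numbers here are illustrative only; the point is the order of events, not the arithmetic.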

To close, here are my favorite new phrases from stuff I've been reading:

Expect to see these jewels used in an article sometime soon.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

June 11, 2009 8:24 PM

Revolution Out There -- and Maybe In Here

(Warning: This is longer than my usual entry.)

In recent weeks I have found myself reading with a perverse fascination some of the abundant articles about the future of newspapers and journalism. Clay Shirky's Newspapers and Thinking the Unthinkable has deservedly received mentions in most of them. His essay reminds us, among other things, that revolutions change the rules that define our world. This means that living through a revolution is uncomfortable for most people -- and dangerous to the people most invested in the old order. The ultimate source of the peril is lack of imagination; we are so defined by the rules that we forget they are not universal laws but human constructs.

I'm not usually the sort of person attracted to train wrecks, but that's how I feel about the quandary facing the newspaper industry. Many people in and out of the industry like to blame the internet and web for the problem, but it is more complicated than that. Yes, the explosion of information technology has played a role in creating difficulties for traditional media, but as much as it causes the problems, I think it exposes problems that were already there. Newspapers battle forces from all sides, not the least of which is the decline -- or death? -- of advertising, which may soon be known as a phenomenon most peculiar to the 20th century. The web has helped expose this problem, with metrics that show just how little web ads affect reader behavior. It has also simply given people alternatives to media that were already fading. Newspapers aren't alone.

This afternoon, I read Xark's The Newspaper Suicide Pact and was finally struck by another perverse thought, a fear because it hits closer to my home. What if universities are next? Are we already in a decline that will become apparent only later to those of us who are on the inside?

Indications of the danger are all around. As in the newspaper industry, money is at the root of many problems. The cost of tuition has been rising much faster than inflation for a quarter of a century. At my university, it has more than doubled in the 2000s. Our costs, many self-imposed, rise at the same time that state funding for its universities falls. For many years, students offset the gap by borrowing the difference. This solution is bumping into a new reality now, with the pool of money available for student loans shrinking and the precipitous decline in housing equity for many eroding borrowing ability. Some may see this as a good thing, as our students have seen a rapid growth in indebtedness at graduation, outpacing salaries in even the best-paying fields. Last week, many people around here were agog at a report that my state's university grads incur more student loan debt than any other state's. (We're #1!)

Like newspapers, universities now operate in a world where plentiful information is available on-line. Sometimes it is free, and other times it is much less expensive than the cost of taking a course on the subject. Literate, disciplined people can create a decent education for themselves on-line. Perhaps universities serve primarily the middle and lower tier of students, who haven't the initiative or discipline to do it on their own?

I have no numbers to support these rash thoughts, though journalists and others in the newspaper industry do have ample evidence for fear. University enrollments depend mostly on the demographics of their main audience: population growth, economics, and culture. Students also come for a social purpose. But I think the main driver for many students to matriculate is industry's de facto use of the college degree as the entry credential to the workplace. In times of alternatives and tight money, universities benefit from industry's having outsourced the credentialing function to them.

The university's situation resembles the newspaper's in other ways, too. We offer a similar defense of why the world needs us: in addition to creating knowledge, we sort it, we package it for presentation, and we validate its authenticity and authority. If students start educating themselves using resources freely or cheaply available outside the university, how will we know that they are learning the right stuff? Don't get most academics started on the topic of for-profits like Kaplan University and the University of Phoenix; they are the university's whipping boy. The news industry has one, too: bloggers.

Newspaper publishers talk a lot these days about requiring readers to pay for content. In a certain sense, that is what students do: pay universities for content. Now, though, the web gives everyone access to on-line lectures, open-source lecture notes, the full text of books, technical articles, and ... the list goes on. Why should they pay?

Too many publishers argue that their content is better, more professional, and so stand behind "the reasonable idea that people should have to pay for the professionally produced content they consume". Shirky calls this a "post-rational demand", one that asks readers to behave in a way "intended to restore media companies to the profitability ordained to them by God Almighty" -- despite living in a world where such behaviors are as foreign as living in log cabins and riding horses for transportation. Is the university's self-justification as irrational? Is it becoming more irrational every year?

Some newspapers decide to charge for content as a way to prop up their traditional revenue stream, print subscriptions. Evidence suggests that this not only doesn't work (people inclined to drop their print subscriptions won't be deterred by pay walls) but that it is counter-productive: the loss of on-line visitors causes a decline in web advertising revenue that is much greater than the on-line reader revenue earned. Again, this is pure speculation, but I suspect that if universities try to charge for their on-line content they will see similar results.

The right reason to charge for on-line content is to create a new revenue stream, one that couldn't exist in the realm of print. This is where creative thinking will help to build an economically viable "new media". This is likely the right path for universities, too. My oldest but often most creative-thinking colleague has been suggesting this as a path for my school to consider for a few years. My department is working on one niche offering now: on-line courses aimed at a specific audience that might well take them elsewhere if we don't offer them, and who then have a smoother transition into full university admission later. We have other possibilities in mind, in particular as part of a graduate program that already attracts a large number of people who work full time in other cities.

But then again, there are schools like Harvard, MIT, and Stanford with open course initiatives, placing material on-line for free. How can a mid-sized, non-research public university compete with that content, in that market? How will such schools even maintain their traditional revenue streams if costs continue to rise and high quality on-line material is readily available?

In a middle of a revolution, no one knows the right answers, and there is great value in trying different ideas. Most any school can start with the obvious: lectures on-line, increased use of collaboration tools such as wikis and chats and blogs -- and Twitter and Facebook, and whatever comes next. These tools help us to connect with students, to make knowledge real, to participate in the learning. Some of the obvious paths may be part of the solution. Perhaps all of them are wrong. But as Shirky and others tell us, we need to try all sorts of experiments until we find the right solution. We are not likely to find it by looking at what we have always done. The rules are changing. The reactions of many in the academy tell a sad story. They are dismissive, or simply disinterested. That sounds a lot like the newspapers, too. Maybe people are simply scared and so hole up in the bunker constructed out of comfortable experience.

Like newspapers, some institutions of higher education are positioned to survive a revolution. Small, focused liberal arts colleges and technical universities cater to specific audiences with specific curricula. Of course, the "unique nationals" (schools such as Harvard, MIT, and Stanford) and public research universities with national brands (schools such as Cal-Berkeley and Michigan) sit well. Other research schools do, too, because their mission goes beyond the teaching of undergraduates. Then again, many of those schools are built on an economic model that some academics think is untenable in the long run. (I wrote about that article last month, in another context.)

The schools most in danger are the middle tier of so-called teaching universities and low-grade research schools. How will they compete with the surviving traditional powers or the wealth of information and knowledge available on-line? This is one reason I embrace our president's goal of going from good to great -- focusing our major efforts on a few things that we do really well, perhaps better than anyone, nurturing those areas with resources and attention, and then building our institution's mission and strategy around this powerful core. There is no guarantee that this approach will succeed, but it is perhaps the only path that offers a reasonable chance to schools like ours. We do have one competitive advantage over many of our competitors: enough research and size to offer students a rich learning environment and a wide range of courses of study, but small enough to offer a personal touch otherwise available only at much smaller schools. This is the same major asset that schools like us have always had. When we find ourselves competing in a new arena and under different conditions, this asset must manifest itself in new forms -- but it must remain the core around which we build.

One of the collateral industries built around universities, textbook publishing, has been facing this problem in much the same way as newspapers for a while now. The web created a marketplace with less friction, which has made it harder for them to make the return on investment to which they had grown accustomed. As textbook prices rise, students look for alternatives. Of course, students always have: using old editions, using library copies, sharing. Those are the old strategies -- I used them in school. But today's students have more options. They can buy from overseas dealers. They can make low-cost copies much more readily. Many of my students have begun to bypass the assigned texts altogether and rely on free sources available on-line. Compassionate faculty look for ways to help students, too. They support old editions. They post lecture notes and course materials on-line. They even write their own textbooks and post them on-line. Here the textbook publishers cross paths with the newspapers. The web reduces entry costs to the point that almost anyone can enter and compete. And publishers shouldn't kid themselves; some of these on-line texts are really good books.

When I think about the case of computer science in particular, I really wonder. I see the wealth of wonderful information available on line. Free textbooks. Whole courses taught or recorded. Yes, blogs. Open-source software communities. User communities built around specific technologies. Academics and practitioners writing marvelous material and giving it away. I wonder, as many do about journalists, whether academics will be able to continue in this way if the university structure on which they build their careers changes or disappears? What experiments will find the successful models of tomorrow's schools?

Were I graduating from high school today, would I need a university education to prepare for a career in the software industry? Sure, most self-educated students would have gaps in their learning, but don't today's university graduates? And are the gaps in the self-educated's preparation as costly as 4+ years paying tuition and taking out loans? What if I worked the same 12, 14, or 16 hours a day (or more) reading, studying, writing, contributing to an open-source project, interacting on-line? Would I be able to marshal the initiative or discipline necessary to do this?

In my time teaching, I have encountered a few students capable of doing this, if they had wanted or needed to. A couple have gone to school and mostly gotten by that way anyway, working on the side, developing careers or their own start-up companies. Their real focus was on their own education, not on the details of any course we set before them.

Don't get me wrong. I believe in the mission of my school and of universities more generally. I believe that there is value in an on-campus experience, an immersion in a community constructed for the primary purpose of exploring ideas, learning and doing together. When else will students have an opportunity to focus full-time on learning across the spectrum of human knowledge, growing as a person and as a future professional? This is probably the best of what we offer: a learning community, focused on ideas broad and deep. We have research labs, teams competing in cyberdefense and programming contests. The whole is greater than the sum of parts, both in the major and in liberal education.

But for how many students is this the college experience now, even when they live on campus? For many the focus is not on learning but on drinking, social life, video games... That's long been the case to some extent, but the economic model is changing. Is it cost-effective for today's students, who sometimes find themselves working 30 or more hours a week to pay for tuition and lifestyle, trying to take a full load of classes at the same time? How do we make the great value of a university education attractive in a new world? How do we make it a value?

And how long will universities be uniquely positioned to offer this value? Newspapers used to be uniquely positioned to offer a value no one else could. That has changed, and most in the industry didn't see it coming (or did, and averted their eyes rather than face the brutal facts).

I'd like also to say that expertise distinguishes the university from its on-line competition. That has been true in the past and remains true today, for the most part. But in a discipline like computer science, with a large professional component that attracts most of its students, where grads will enter software development or networking... there is an awesome amount of expertise out in the world. More and more of those talented people are now sharing what they know on-line.

There is good news. Some people still believe in the value of a university education. Many students, and especially their parents, still believe. During the summer we do freshman orientation twice a week, with an occasional transfer student orientation thrown into the mix. People come to us eagerly, willing to spend out of their want or to take on massive debts to buy what we sell. Some come for jobs, but most still have at least a little of the idealism of education. When I think about their act in light of all that is going on in the world, I am humbled. We owe them something as valuable as what they surrender. We owe them an experience befitting the ideal. This humbles me, but it invigorates and scares me, too.

This article is probably more dark fantasy than reality. Still, I wonder how much of what I believe I really should believe, because it's right, and how much is merely a product of my lack of imagination. I am certain that I'm living in the middle of a revolution. I don't know how well I see or understand it. I am also certain of this: I don't want someone to be writing this speech about universities in a few years with me in its clueless intended audience.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

June 05, 2009 3:25 PM

Paying for Value or Paying for Time

Brian Marick tweeted about his mini-blog post Pay me until you're done, which got me to thinking. The idea is something like this: Many agile consultants work in an agile way, attacking the highest-value issue they can in a given situation. If the value of the issues to work on decreases with time, there will come a point at which the consultant's weekly stipend exceeds the value of the work he is doing. Maybe the client should stop buying services at that point.

My first thought was, "Yes, but." (I am far too prone to that!)

First, the "yes": In the general case of consulting, as opposed to contract work, the consultant's run will end as his marginal effect on the company approaches 0. Marick is being honest about his value. At some point, the value of his marginal contribution will fall below the price he is charging that week. Why not have the client end the arrangement at that point, or at least have the option to? This is a nice twist on our usual thinking.

Now for the "but". As I tweeted back, this feels a bit like Zeno's Paradox. Marick the consultant covers not half the distance from start to finish each week, but the most valuable piece of ground remaining. With each week, he covers increasingly less valuable distance. So our consultant, cast in the role of Achilles, concedes the race and says, okay, so stop paying me.

This sounds noble, but remember: Achilles would win the race. We unwind Zeno's Paradox when we realize that the sum of an infinite series can be a finite number -- and that number may be just small enough for Achilles to catch the tortoise. This works only for infinite series that behave in a particular way.
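The arithmetic behind unwinding the paradox is just a convergent geometric series; a quick sketch of the idea (Python, illustrative numbers only):

```python
# Achilles covers half the remaining distance each step: 1/2 + 1/4 + ...
# The infinite series sums to a finite 1, so he does finish the course.

def partial_sum(n):
    """Sum the first n terms of the series 1/2 + 1/4 + 1/8 + ..."""
    return sum(1 / 2**k for k in range(1, n + 1))

# Each partial sum stays below 1, but the gap shrinks geometrically,
# so the total distance covered is finite even with infinitely many terms.
for n in (1, 5, 10, 20):
    print(n, partial_sum(n))
```

The consultant's weekly values behave the same way only if they decay in a particular fashion, which is exactly the qualification raised below.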

Crazy, I know, but this is how the qualification of the "yes" arose in my mind. Maybe, the consultant helps to create a change in his client that changes the nature of the series of tasks he is working on. New ideas might create new or qualitatively different tasks to do. The change created may change the value of an existing task, or reorder the priorities of the remaining tasks. If the nature of the series changes, it may cause the value of the series to change, too. If so, then the client may well want to keep the consultant around, but doing something different than the original set of issues would have called for.

Another thought: Assume that the conditions that Marick described do hold. Should the compensation model be revised? He seems to be assuming that the consultant charges the same amount for each week of work, with the value of the tasks performed early being greater than that amount and the value of the tasks performed later being less than that amount. If that is true, then early on the consultant is bringing in substantially more value than he costs. If the client pulls the plug as soon as the value proposition turns in its favor, then the consultant ends up receiving less than the original contract called for yet providing more than average value for the time period. If the consultant thinks that is fair, great. What if not? Perhaps the consultant should charge more in the early weeks, when he is providing more value, than in later weeks? Or maybe the client could pay a fee to "buy out" the rest of the contract? (I'm not a professional consultant, so take that into account when evaluating my ideas about consultant compensation...)
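The break-even point in that thought experiment is easy to compute with made-up numbers (all figures below are hypothetical, chosen only to illustrate the stopping rule):

```python
# Weekly value delivered declines geometrically; the fee stays flat.
# The client's naive stopping rule: quit the first week value < fee.

def break_even_week(initial_value, decay, weekly_fee):
    """First week in which the value delivered drops below the flat fee."""
    week, value = 1, initial_value
    while value >= weekly_fee:
        week += 1
        value *= decay
    return week

# E.g., $20k of value in week 1, decaying 30% per week, against a $6k fee:
# weekly values run 20000, 14000, 9800, 6860, 4802 -> stop in week 5.
print(break_even_week(20000, 0.7, 6000))
```

Up to that week the client has captured far more value than fees paid, which is what makes the fairness question interesting.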

And another thought: Does this apply to what happens when a professor teaches a class? In a way, I think it does. When I introduce a new area to students, it may well be the case that the biggest return on the time we spend (and the biggest bang for the students' tuition dollars) happens in the first weeks. If the course is successful, then most students will become increasingly self-sufficient in the area as the semester goes on. This is more likely the case for upper-division courses than for freshmen. What would it be like for a student to decide to opt out of the course at the point where she feels like she has stopped receiving fair value for the time being spent? Learning isn't the same as a business transaction, but this does have an appealing feel to it.

The university model for courses doesn't support Marick's opt-out well. The best students in a course often reach a point where they are self-sufficient or nearly so, and they are "stuck". The "but" in our teaching model is that we teach an audience larger than one, and the students can be at quite different levels in background and understanding. Only the best students reach a point where opting out would make sense; the rest need more (and a few need a lot more -- more than one semester can offer!).

The good news is that the unevenness imposed by our course model doesn't hurt most of those best students. They are usually the ones who are able to make value out of their time in the class and with the professor regardless of what is happening in the classroom. They not only survive the latency, but thrive by veering off in their own direction, asking good questions and doing their own programming, reading, thinking outside of class. This way of thinking about the learning "transaction" of a course may help to explain another class of students. We all know students who are quite bright but end up struggling through academic courses and programs. Perhaps these students, despite their intelligence and aptitude for the discipline, don't have the skills or aptitude to make value out of the latency between the point they stop receiving net value and the end of the course. This inability creates a problem for them (among them, boredom and low grades). Some instructors are better able to recognize this situation and address it through one-on-one engagement. Some would like to help but are in a context that limits them. It's hard to find time for a lot of one-on-one instruction when you teach three large sections and are trying to do research and are expected to meet all of the other expectations of a university prof.

Sorry for the digression from Marick's thought experiment, which is intriguing in its own setting. But I have learned a lot from applying agile development ideas to my running. I have found places where the new twist helps me and others where the analogy fails. I can't shake the urge to do the same on occasion with how we teach.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading, Software Development, Teaching and Learning

May 30, 2009 11:15 PM

How To Be Invincible

Everyone is trying to accomplish something big,
not realizing that life is made up of little things.
-- Frank A. Clark

Instead of trying to be the best, simply do your best.

Trying to be the best can turn into an ego trap: "I am better than you." In fact, the goal of being the best is often driven by ego. If it doesn't work out, this goal can become a source of finding fault and tearing oneself down. "I am not good enough." I should probably say "when", rather than "if". When your goal is to be the best, there always seems to be someone out there who does some task better. The result is like a cruel joke: trying to be the best may make you feel like you are never good enough.

In a more prosaic sense, trying to be the best can provide a convenient excuse for being mediocre. When you realize that you'll never be as good as a particular someone, it's easy to say, "Well, why bother trying to be the best? I can spend my time doing something else." This is a big problem when we decide to compare ourselves to the best of the best -- Lebron James, Haile Gebreselassie, or Ward Cunningham. Who among us can measure up to those masters? But it's also a problem when we compare ourselves to that one person in the office who seems to get and do everything right. Another cruel joke: trying to be the best ultimately gives us an excuse not to try to get better.

Doing your best is something that you can do any time or any place. You can succeed, no matter who else is involved. As time goes by, you are likely going to get better, as you develop your instincts. This means that every time you do your best you'll be in a different state, which adds a freshness to every new task you take on. Even more, I think that there is something about doing our best that causes us to want to get better; we are energized by the moment and realize that what we are doing now isn't the best we could do.

I've never met Lebron James or Haile Gebreselassie, but I've had the good fortune to meet and work with Ward Cunningham. He is a very bright guy, but he seems mostly to be a person who cares about other people and who has a strong drive to do interesting work -- and to get better. It's good to see that the folks we consider the best are... human. I've met enough runners, programmers, computer scientists, and chessplayers who are a lot better than I, and most of them are simply trying to do their best. That's how they got to be so good.

Some of you may say this is a distinction without a difference, but I have found that the subtle change in mindset that occurs when I shift my sights from trying to be the best to trying to do my best can have a huge effect on my attitude and my happiness. That is worth a lot. Again, though, there's more. The change in mindset also affects how I approach my work, and ultimately my effectiveness. Perhaps that's the final lesson, not a cruel joke at all: Doing your best is a better path to being better -- and maybe even the best -- than trying to be the best.

(This entry is a riff on a passage from David Allen's Ready for Anything, from which I take the entry's title. Allen's approach to getting things done really does sync well with agile approaches to software development.)


Posted by Eugene Wallingford | Permalink | Categories: General

May 25, 2009 9:48 PM

Is There a Statute of Limitations for Blogging?

I had a few free minutes tonight with no big project at the front of my mind, so I decided to clean up my blog-ideas folder. Maybe one of the ideas would grab my imagination and I would write. But this is what grabbed my attention instead, a line in my main ideas.txt file:

(leftovers from last year's SIGCSE -- do them!?)

You have heard of code smells. This is a blog smell.

I have two entries still in the hopper from SIGCSE 2008 listed in my conference report table of contents: "Rediscovering the Passion and Beauty", on ways to share our passion and awe with others, and "Recreating the Passion and Beauty", because maybe it's just not that much fun any more. Both come from a panel discussion on the afternoon of Day 2, and both still seem worth writing, even after fourteen months.

The question in the note to myself in the ideas file lets a little reality under the curtain... Will I ever write them? As conference report, they probably don't offer much, and the second entry has been preempted a bit by Eric Roberts giving a similar talk in other venues, and posting his slides on the web. But timeliness of the conference report isn't the only reason I write; the primary reason is to think about the ideas. The writing both creates the thinking and records it for later consideration. In this regard, they still hold my interest. Not all old ideas do.

When I first started this blog, I never realized how much my blogging would exhibit the phenomenon I call the Stack of Ideas. Sometimes an entry is a planned work, but more often I write what needs to be written based on where I am in my work. Hot ideas will push ideas that recently seemed hot onto the back burner. Going to a conference only makes the problem worse. The sessions follow one after another, and each one tends to stir me up so much as to push even the previous session way back in my mind. I have subfolders for hot ideas and merely recent ideas, and I do pull topics from them -- "hot" serving up ideas more reliably than "recent".

This is one risk of having more ideas than time. Of course, ideas are like most everything else: a lot of them are bunk. I suspect that many of my ideas are bunk and that the Stack of Ideas does me and my readers the Darwinian service of pushing the worst down, down, down out of consciousness. When I look back at most of the ideas that haven't made the cut yet, they feel stale. Are they just old, or were they not good enough? It's hard to say. Like other Darwinian processes, this one probably isn't optimal. Occasionally a good idea may lose out only because it wasn't fit for the particular mental environment in which it found itself. But all in all, the process seems to get things mostly right. I just hope the good ideas come back around sometime later. I think the best ones do.

This is one of the reasons that academics can benefit from keeping a blog. A lot of ideas are bunk. Maybe the ones that don't get written shouldn't be written. For the ideas that make the cut, writing this sort of short essay is a great way to think them through, make them come to life in words that anyone can read, and then let them loose into the world. Blog readers are great reviewers, and they help with the good and bad ideas in equal measure. What a wonderful opportunity blogging offers: an anytime, anyplace community of ideas. Most of us had little access to such a community even ten years ago.

I must say this, though. Blogging is of more value to me than just as a technical device. It can also offer an ego boost. There is nothing quite like having someone I met several years ago at SIGCSE or OOPSLA tell me how much they enjoy reading my blog. Or to have someone I've never met come up to me and say that they stumbled across my blog and find it useful. Or to receive e-mail saying, "I am a regular reader and thought you might enjoy this..." What a joy!

Will those old SIGCSE 2008 entries ever see the light of day? I think so, but the Stack of Ideas will have its say.


Posted by Eugene Wallingford | Permalink | Categories: General

May 08, 2009 6:31 AM

The Annual Book March

There is a scene in the television show "Two and a Half Men" in which neurotic Alan has a major meltdown in a bookstore. He decides to use some newly-found free time to better himself through reading the classics. He grabs some books from one shelf, say, Greek drama, and then from another, and another, picking up speed as he realizes that there aren't enough hours in a day or a lifetime to read all that is available. This pushes him over the edge, he makes a huge scene, and his brother is embarrassed in front of the whole store.

I know that feeling this time of year. When I check books out from the university library, the due date is always next May, at the end of finals week for spring semester. Over the year, I run into books I'd like to read, new and old, in every conceivable place: e-mail, blogs, tweets, newspapers, ... With no particular constraint other than a finite amount of shelf space -- and floor space, and space at home -- I check them out.

Now is the season of returning. I gather up all the books on my shelves, and on my floors, and in my home. For most of my years here, I have renewed them. Surely I will read them this summer, when time is less rare, or next year, on a trip or a break. At the beginning of the last couple of Mays, though, I have been trying to be more honest with myself and return books that have fallen so far down the list as to be unlikely reads. Some are far enough from my main areas of interest or work that they are crowded out by more relevant books. Others are in my area of interest but trumped by something newer or more on-point.

Now, as I walk to the library, arms full, to return one or two or six, I often feel like poor, neurotic Alan. So many books, so little time! How can I do anything but fall farther and farther behind with each passing day? Every book I return is like a little surrender.

I am not quite as neurotic as Alan; at least I've never melted down in front of the book drop for all my students to see. I recognize reality. Still, it is hard to return almost any book unread.

I've had better habits this year, enforcing on myself first a strict policy of returning two books for every new one I checked out, then backsliding to an even one-for-one swap. As a result, I have far fewer books to return or renew. Still, this week I have surrendered Knuth's Selected Papers on Analysis of Algorithms, David Berlinski's The Advent of the Algorithm, and Jerry Weissman's Presenting to Win. Worry not; others will take their place, both old (Northcote Parkinson, Parkinson's Law) and new: The Passionate Programmer and Practical Programming. The last of these promises an intro to programming for the 21st century, and I am eager to see how well they carry off the idea.

So, in the end, even if something changed radically to make the life of a professor less attractive, I agree with Learning Curves on the real reason I will never give up my job: the library.


Posted by Eugene Wallingford | Permalink | Categories: General

April 09, 2009 7:48 PM

Musings on Software, Programming, and Art

My in-flight and bedtime reading for my ChiliPLoP trip was William Stafford's Writing the Australian Crawl, a book on reading and especially writing poetry, and how these relate to Life. Stafford's musings are crashing into my professional work on the trip, about solving problems and writing programs. The collisions give birth to disjointed thoughts about software, programming, and art. Let's see what putting them into words does to them, and to me.

Intention endangers creation.

An intentional person is too effective to be a good guide in the tentative act of creating.

I often think of programming as art. I've certainly read code that felt poetic to me, such as McCarthy's formulation of Lisp in Lisp (which I discussed way back in an entry on the unity of data and program). But most of the programs we write are intentional: we desire to implement a specific functionality. That isn't the sort of creation that most artists do, or strive to do. If we have a particular artifact in mind, are we really "creating"?

Stafford might think not, and many software people would say "No! We are an engineering discipline, not an artistic one." Thinking as "artists", we are undisciplined; we create bad software: software that breaks, software that doesn't serve its intended purpose, software that is bad internally, software that is hard to maintain and modify.

Yet many people I know who program know better... They feel something akin to artistry and creation.

How can we impress both sides of this vision on people, especially students who are just starting out? When we tell only one side of the story, we mislead.

Art is an interaction between object and beholder.

Can programs be art? Can a computer system be art? Yes. Even many people inclined to say 'no' will admit, perhaps grudgingly, that the iPod and the iPhone are objects of art, or at least have elements of artistry in them. I began writing some of these notes on the plane, and all around me I see iPods and iPhones serving people's needs, improving their lives. They have changed us. Who would ever have thought that people would be willing to watch full-length cinematic films on a 2" screen? Our youth, whose experiences are most shaped by the new world of media and technology, take for granted this limitation, as a natural side effect of experiencing music and film and cartoons everywhere.

Yet iPods aren't only about delivering music, and iPhones aren't just ways to talk to our friends. People who own them love the feel of these devices in their hands, and in our lives. They are not just engineered artifacts, created only to meet a purely functional need. They do more, and they are more.

Intention endangers creation.

Art reflects and amplifies experience. We programmers often look for inspirations to write programs by being alert to our personal experience and by recognizing disconnects, things that interrupt our wholeness.

Robert Schumann said, "To send light into the darkness of men's hearts -- such is the duty of the artist." Artists deal in truth, though not in the direct, assertional sense we often associate with mathematical or scientific truth. But they must deal in truth if they are to shine light into the darkness of our hearts.

Engineering is sometimes defined as using scientific knowledge and physical resources to create artifacts that achieve a goal or meet a need. Poets use words, not "physical resources", but also shapes and sounds. Their poems meet a need, though perhaps not a narrowly defined one, or even one we realize we had until it was met in the poem. Generously, we might think of poets as playing a role somewhat akin to the engineer.

How about engineers playing a role somewhat akin to the artist? Do engineers and programmers "send light into the darkness of men's hearts"? I've read a lot of Smalltalk code in my life that seemed to fill a dark place in my mind, and my soul, and perhaps even my heart. And some engineered artifacts do, indeed, satisfy a need that we didn't even know we had until we experienced them. And in such cases it is usually experience in the broadest sense, not the mechanics of saving a file or deleting our e-mail. Design, well done, satisfies needs users didn't know they had. This applies as well to the programs we write as to any other artifact that we design with intention.

I have more to write about this, but at this time I feel a strong urge to say "Yes".


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

April 06, 2009 2:05 AM

The Hard Part

This is the idea behind biction:

(The hard part of writing isn't the writing; it's the thinking.)
-- William Zinsser

s/writing/programming/*

This line comes from Zinsser's recent article, Visions and Revisions, in which he describes the writing and rewriting of On Writing Well over the course of thirty years. I read On Writing Well a decade or so ago, in one of its earlier editions. It is my favorite book on the craft of writing.


Posted by Eugene Wallingford | Permalink | Categories: General

March 29, 2009 11:39 AM

Looking Forward to Time Working

In real life there is no such thing as algebra.

-- Fran Lebowitz

At this time next week, I will be on my way to ChiliPLoP for a working session. Readers here know how much I enjoy my annual sojourn to this working conference, but this year I look forward to it with special fervor.

First, my day job the last few months -- the last year, really -- has been heavier than usual with administrative activities: IT task force, program review, budget concerns. These are all important tasks, with large potential effects on my university, my department, and our curriculum and faculty. But they are not computer science, and I need to do some computer science.

Second, I am still in a state of hopeful optimism that my year-long Running Winter is coming to an end. I put in five runs this week and reached 20 miles for the first time since October. The week culminated this morning in a chilly, hilly 8 miles on a fresh dusting of snow and under a crystal clear blue sky. ChiliPLoP is my favorite place to run away from home. I never leave Carefree without being inspired, unless I am sick and unable to run. Even if I manage only two short runs around town, which is what I figure is in store, I think that the location will do a little more magic for me.

Our hot topic group will be working at the intersection of computer science and other disciplines, stepping a bit farther from mainstream CS than it has in recent years. We all see the need to seek something more transformative than incremental, and I'd like to put into practice some of the mindset I've been exploring in my blog the last year or so.

The other group will again be led by Dave West and Dick Gabriel, and they, too, are thinking about how we might re-imagine computer science and software development around Peter Naur's notion of programming as theory building. Ironically, I mentioned that work recently in a context that crosses into my hot topic's focus. This could lead to some interesting dinner conversation.

Both hot topics' work will have implications for how we present programming, software development, and computer science to others, whether CS students or professionals in other disciplines. Michael Berman (who recently launched his new blog) sent a comment on my Sweating the Small Stuff that we need to keep in mind whenever we want people to learn how to do something:

I think that's an essential observation, and one that needs to be designed into the curriculum. Most people don't learn something until they need it. So trying to get students to learn syntax by teaching them syntax and having them solve toy problems doesn't teach them syntax. It's a mistake to think that there's something wrong with the students or the intro class -- the problem is in the curriculum design.

I learned algebra when I took trig, and trig when I took calculus, and I learned calculus in my physics class and later in queueing theory and probability. (I never really learned queueing theory.)

One of the great hopes of teaching computation to physicists, economists, sociologists, and anyone else is that they have real problems to solve and so might learn the tool they need to solve them. Might -- because we need to tell them a story that compels them to want to solve them with computation. Putting programming into the context of building theories in an applied discipline is a first step.

(Then we need to figure out the context and curriculum that helps CS students learn to program without getting angry...)


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

March 24, 2009 3:45 PM

Meta-Blog: Follow-Up to My Adele Goldberg Entry

When I was first asked to consider writing a blog piece for the Ada Lovelace Day challenge, I wasn't sure I wanted to. I don't usually blog with any particular agenda; I just write whatever is in my mind at the time, itching to get out. This was surely a topic I have thought and written about before, and it's one that I have worked on with people at my university and across the state. I think it is in the best interest of computer science to be sure that we are not missing out on great minds who might be self-selecting away from the discipline for the wrong reasons. So I said yes.

Soon afterwards, ACM announced Barbara Liskov as the winner of the Turing Award. I had written about Fran Allen when she won the Turing Award, and here was another female researcher in programming languages whose work I have long admired. I think the Liskov Substitution Principle is one of the great ideas in software development, a crucial feature of object-oriented programming, of any kind of programming, really. I make a variant of the LSP the centerpiece of my undergraduate courses on OOP. But Liskov has done more -- CLU and encapsulation, Thor and object-oriented databases, the idea of Byzantine fault tolerance in distributed computing, ... It was a perfect fit for the challenge.
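For readers who haven't met the principle: the LSP says that an object of a subclass must be usable anywhere an object of its superclass is expected, without surprising the code that uses it. The classic rectangle/square trap shows how easy it is to violate. This is my own quick sketch in Python, not an example from Liskov:

```python
# LSP: a subclass instance must honor the contracts that callers
# rely on when they program against the superclass.

class Rectangle:
    def __init__(self, width, height):
        self.width = width
        self.height = height

    def set_width(self, w):
        self.width = w

    def area(self):
        return self.width * self.height


class Square(Rectangle):
    """A tempting subclass -- "a square is a rectangle" -- that
    breaks substitutability."""
    def __init__(self, side):
        super().__init__(side, side)

    def set_width(self, w):
        # A square must keep its sides equal, so this silently
        # changes the height, too.
        self.width = self.height = w


def stretch(rect):
    # Written against Rectangle's contract: widening a rectangle
    # should leave its height alone.
    rect.set_width(rect.width * 2)
    return rect.area()

print(stretch(Rectangle(2, 3)))  # 12, as the contract promises
print(stretch(Square(2)))        # 16, not the 8 a caller would expect
```

The point isn't that Square is badly written in isolation; it's that code written against Rectangle can no longer trust it. That is the idea I try to make central for my OOP students.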

But my first thought, Adele Goldberg, would not leave me. That thought grew out of my long love affair with Smalltalk, to which she contributed, and out of a memory I have from my second OOPSLA Educators' Symposium, where she gave a talk on learning environments, programming, and language. Goldberg isn't a typical academic Ph.D.; she is versatile, having worked in technical research, applications, and business. She has made technical contributions and contributions to teaching and learning. She helped found companies. In the end, that's the piece I wanted to write.

So, if my entry on Goldberg sounds stilted or awkward, please cut me a little slack. I don't write on assigned topics much any more, at least not in my blog. I should probably have set aside more time to write that entry, but I wrote it much as I might write any other entry. If nothing else, I hope you can find value in the link to her Personal Dynamic Media article, which I was so happy to find on-line.

At this point, one other person has written about Goldberg for the Lovelace Day challenge. That entry has links to a couple of videos, including one of Adele demonstrating a WIMP interface using an early implementation of Smalltalk. A nice piece of history. Mark Guzial mentions Adele in his Lovelace Day essay, but he wrote about three women closer to home. One of his subjects is Janet Kolodner, who did groundbreaking research on case-based reasoning that was essential to my own graduate work. I'm a fan!


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

March 06, 2009 8:01 PM

Coming Up For Air

I have spent nearly every working minute this week sitting in front of this laptop, preparing a bunch of documents for an "academic program assessment" that is being done campus-wide at my university. Unfortunately, that makes this week Strike Two.

Last October: no OOPSLA for me.

This week: no SIGCSE for me.

The next pitch arrives at the plate in about a month... Will there be no ChiliPLoP for me?

That would be an inglorious Strike Three indeed. It would break my equivalent of DiMaggio's streak: I have never missed a ChiliPLoP. But budget rescissions, out-of-state travel restrictions, and work, work, work are conspiring against me. I intend to make my best effort. Say a little prayer.

I hope that you can survive my missing SIGCSE, as it will mean no reports from the front. Of course, you will notice two missing links on my 2008 report, so I do have some material in the bullpen!

Missing SIGCSE was tougher than usual, because this year I was to have been part of the New Teaching Faculty Roundtable on the day before the conference opened. I was looking forward to sharing what little wisdom I have gained in all my years teaching -- and to stealing as many good ideas as I could from the other panelists. Seeing all of the firepower on the roster of mentors, I have no doubts that the roundtable was a great success for the attendees. I hope SIGCSE offers the roundtable again next year.

Part of my working day today was spent in Cedar Rapids, an hour south of here. Some of you may recall that Cedar Rapids was devastated by flooding last summer, when much of the eastern part of the state was under 500-year flood waters. I was surprised and saddened to see that so much of the downtown area still suffers the ill effects of the water. The public library is still closed while undergoing repair. But I was heartened to see a vibrant city rebuilding itself. A branch library has been opened at a mall on the edge of town, and it was buzzing with activity.

You know, having a library in the mall can be a good thing. It is perhaps more a part of some people's lives than a dedicated building in the city, and it serves as a nice counterpoint to the consumption and noise outside the door. Besides, I had easy access to excellent wireless service out in the mall area even before the library opened, and easy access to all the food I might want whenever I needed to take a break. Alas, I really did spend nearly every working minute this week sitting in front of this laptop, so I worked my way right up to dinner time and a welcome drive home.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

February 21, 2009 7:10 PM

Hope for Troubled Economic Times

And we decided to innovate our way through this downturn, so that we would be further ahead of our competitors when things turn up.

"This downturn" was the dot.com bust. The speaker was Steve Jobs. The innovations were the iPod and iTunes. Seems to have worked out fine.

My agile friends are positioned well to innovate through our current downturn, as are the start-ups that other friends and former students run. It is something of a cliché but true nonetheless. Recessions can be a good time for people and organizations that are able -- and willing -- to adapt. They can be an opportunity as much as a challenge.

I hope that the faculty and staff of my university can approach these troubled budget times with such an attitude. In five years, we could be doing a much better job for our students, our state, and our respective academic disciplines.


Posted by Eugene Wallingford | Permalink | Categories: General

February 17, 2009 3:53 PM

Posts of the Day

Tweet of the Day

Haskell is a human-readable program compression format.
-- Michael Feathers

Maybe we should write a generator that produces Haskell.

Non-Tech Blog of the Day

Earlier in my career I worked hard to attract attention. ... The problem with this approach is that eventually it all burns down to ashes and no one knows a thing more about software development than they did before.
-- Kent Beck

Seek truth. You will learn to focus your life outside your own identity, and it makes finding out you're wrong not only acceptable, but desirable.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

February 05, 2009 4:24 PM

So Little Computer Science...

... so many meetings and budget discussion. A week in which I do no computer science is as bad as a week in which I do not run.

I did play hooky while attending a budget meeting yesterday. I took one of our department laptops so that I could upgrade Ruby, install Bluecloth, and play with a new toy. But that's bit-pushing, not computer science.

Why all the budget talk? Working at a public university offers some shelter from changing economic winds and tends to level changes out over time. But when the entire economy goes down, so do state revenues. My university is looking at a 9% cut in its base budget for next year. That magnitude of change means making some hard choices. Such change creates uneasy times among the faculty, and more work planning for changes and contingencies among department heads.

There is some consolation in being on the front line, knowing that I can shield the faculty from much of the indecision. I also have opportunities to argue against short-sighted responses to the budget cuts and to suggest responses that will help the university in the long term. There is nothing like a sudden lack of revenue to help you focus on what really matters!

Still, I'd rather be writing a program or working on a tough CS problem.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

February 04, 2009 7:44 AM

Remembering the Answer a Few Days Late

I need a better memory.

Last time, I wrote about being surprised by an interview request. But more than a year ago I read about a similar problem and one solution:

As a science journalist, I can tell you the best thing to do, as an academic getting interviewed and wanted to guide the interview somewhat, is to have analogies cocked, locked and loaded.... [R]eporters go nuts for pre-thought-out analogies/explanations because it's quotable material, and could in fact be the center of the article.... So cranking them out before you speak with someone is a great way to maintain some control of what reporters quote you on.

As in so many things, preparation pays off.

Of course, this isn't quite the same problem. Talking about one's own research or teaching is different than talking about department business or someone else's project. But that is one of the responsibilities that comes with chairing the department -- speaking about the wider interests of the department.

The bigger issue here is how to convert what I read into learning. The passage above stuck out enough that I filed it away for eighteen months. But it doesn't do me any good sitting in a text file somewhere.


Posted by Eugene Wallingford | Permalink | Categories: General

January 30, 2009 3:10 PM

Pop Interview!

The phone rings.

"Hi, I'm [local radio personality]. I'd like to interview you about the grant your department received from State Farm."

"Um, sure." Quick -- compose yourself.

"So, what is this grant all about?"

A short game of Twenty Questions ensued. This was a first for me: a cold call from a radio station requesting an interview. Fortunately the interview was conducted off-line; my answers were recorded and will be used to produce a finished piece later.

I have done phone interviews before, some of which I have discussed here. But those were arranged in advance, so I had time to prepare specific comments and to get into the right frame of mind. to answer questions in that context. Also, my previous interviews have always been for my own personal work, which I know at a different level than I know my department work. Even though I wrote the grant proposal in question, it was collective work, not mine, and that shows in how well I feel the project.

A quick word about the grant... State Farm Insurance is based a few hours' drive from here and hires many of our best software engineering students into its systems development division. Through its foundation, State Farm supports universities with grants to support educational work. A few years ago, one of these grants helped us to build our first computational cluster and begin using it in our bioinformatics program, and to support a number of computational science projects on campus. The fact that an insurance company would fund this kind of work shows that it has a long-term view of education, which we at the university appreciate.

We recently received a new grant to purchase two quad-socket, quad-core servers and integrate their use into our architecture, systems, and programming courses. The world is going multi-core, and we would like to give our students some of the experiences they will need to contribute.

Anyway, I now have a new set of skills to work on: the pop interview. Or I at least need to develop a mind quick enough to say, "Hey, can I call you back in five?"


Posted by Eugene Wallingford | Permalink | Categories: General

January 22, 2009 4:05 PM

A Story-Telling Pattern from the Summit

At the Rebooting Computing Summit, one exercise called for us to interview each other and then report back to the group about the person we interviewed. The reports my partner and I gave, coupled with some self-reported experiences later in the day, reminded me of a pattern I've experienced in other contexts. Here is a rough first draft. Let me know what you think.

Second-Hand Story

When we need to know a person's story, our first tendency is often to ask him to tell us. After all, he knows it best, because he lived it. He has had a chance to reflect on it, to reconsider decisions, and to evaluate what the story "means".

This approach can disappoint us. Sometimes, the person is too close to the experience and attaches accidental emotions and details. Sometimes, even though he has had a chance to reflect on his experience, he hasn't reflected enough -- or perhaps not at all! Telling the story may be the first time he has thought about some of those experiences in a long time. While trying to tell the story and summarize its meaning at the same time, the storyteller may reach for an easily-found answer. The result can be trite, convenient, or self-protective. Maybe the person is simply too close to an experience to see its true meaning.

Therefore, ask the person to tell his story to someone else, focusing on "just the facts". Then, ask the interviewer to tell the story, perhaps in summary form. Let the interviewer and the listeners look for patterns and themes.

The interviewer has distance and an opportunity to listen objectively. She is less likely to impose well-rehearsed personal baggage over the story.

The result can still be trite. If the listener does not listen carefully, or is too eager to stereotype the story, then the resulting story may well be worse than the original, because it is not only sanitized but sanitized by someone without intimate connection to it.

It can be refreshing to hear someone else tell your own story, to draw conclusions, to summarize what is most important. A good listener can pick up on essential details and remove the shroud of humility or disappointment that too often obscures your own view. You can learn something about yourself!

This technique depends on two people's ability to tell a story, not one. The original story-teller must be open, honest, and willing to describe situations more than evaluate them. (A little evaluation is unavoidable and also useful. The listener learns something about the story-teller from that part of the story, too.) The interviewer must be a careful listener and have a similar enough background to be able to put the story into context and form reasonable abstractions about it.

Examples. I found the interviewer's reports at the Rebooting Computing summit to be insightful, including the ones that Joe Carthy and I gave on one another. Hearing someone else "tell my story" let me hear it more objectively than if I had told it myself. Occasionally I felt like chiming in to correct or add something, but I'm not sure that anything I said could have done a better job introducing myself to the rest of the group. Something Joe said during his interview of me made me think more about just how my non-CS profs helped lead me into CS, something I had never thought much about before that.

Later that day, we heard several self-reported stories, and those stories -- told by the same people who had reported on others earlier -- sounded flat and trite. I kept thinking, "What's the real story?" Maybe someone else could have told it better!

Related Ideas. I am reminded of several techniques I learned as an architecture major while studying Betty Edwards's Drawing on the Right Side of the Brain:

  • drawing a chair by looking at the negative space around it
  • drawing a picture of Igor Stravinsky upside down, in order to trick the visualization mechanism in our minds that jumps right to stereotypes
  • drawing a paper wad, which matched no stereotyped form already in our memories

This pattern and these exercises are all examples of techniques for indirect learning. This is perhaps the first time I realized just how well indirect learning can work in a social setting.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

January 21, 2009 7:55 AM

Rebooting Computing Summit -- This and That

As always, my report on the Rebooting Computing Summit left out some of the random thoughts and events that made my trip. Here are a few.

•   When I was growing up I learned that the prefixes "Mc" and "O'" indicated "son of" when used in names such as McDonald and O'Herlihy. I always wondered who the original ancestors were -- the Donalds and Herlihys. (Even then, I was concerned with the base case...) One of my favorite grad school profs was named Bill McCarthy, but I had never met a Carthy. Now I have... One of my table mates at the summit was Joe Carthy of Dublin! Joe shared some valuable insights on teaching computing.

•   In my report, I wrote of my vision for the future of computing, in which children will routinely walk to the computer and write a program.... "Walk to the computer" -- that is so 1990s! Today's children carry their technology in their hands.

•   During one of his messages, Peter Denning showed the familiar quote, "Insanity is doing the same thing over again, expecting different results," as a motivation to change. But I think there is more to it than that. I was reminded of a recent Frazz comic, in which the precocious Caulfield pointed out that the world is always changing, so it is also insanity to do the same thing over again, expecting the same results. The world is changing around computing and computing education. There is no particular reason to think that doing the same old things better will get us anywhere useful.

•   At one point, Alan Kay said that part of what is wrong with computing is that too many of us "fool around", rather than working to change the world. This, he said, is a feature of a popular culture, not a serious one. First, we had real guitar. Then came air guitar. And now we have Guitar Hero. He is, of course, right, and I have written occasionally of being shamed at coming up short when measured against his vision.

Later that evening, my roommate Robert Duvall discussed whether Guitar Hero might have some positives, by motivating some of the people who play it to learn to play a real guitar. I don't have a good feel for the culture around Guitar Hero, so I'll have to wait and see. New technologies often interact with younger generations in ways that we old folks can't predict. (My prurient side wants to say that Guitar Hero can't be all bad if it gives us Heidi Klum playing air guitar in her privates.)

•   A Creative Interlude

On the second day of the summit, each table was asked to communicate to the rest of the groups its vision of the future of CS. The facilitators encouraged us to express our vision creatively, via role play or some other non-bullet list medium. One group did a neat job on this, with one ham performer playing the central role in a number of vignettes showing where the computing of tomorrow will have taken us.

This is the sort of exercise for which I am ill-equipped to excel alone, but I am able to do all right if I am in a group. My table decided to gang-write a song -- doggerel, really. With Christmas still close in our memory, we chose the tune to the familiar carol "Angels We Have Heard On High", in part, I think, for its soaring "Gloria"s. The result was "Everyone Now Loves CS".

Our original plan was for Susan Horwitz to sing our creation, as she does this sort of thing in many of her classes and so is used to the attention. A few of us toyed with the idea of playing air guitar in the background, but I'm glad we opted not to; the juxtaposition of our performance with Alan Kay's remarks later that afternoon would have been unfortunate indeed! About five minutes before the performance Susan informed us that we would be singing as a group. So we did. My students should not expect a reprise.

My conference history now includes singing and acting. I don't imagine that dance is in my future, but you never know.


Posted by Eugene Wallingford | Permalink | Categories: General

January 02, 2009 10:42 PM

Fly on the Wall

A student wrote to tell me that I had been Reddited again, this time for my entry reflecting on this semester's final exam. I had fun reading the comments. Occasionally, I found myself thinking about how a poster had misread my entry. Even a couple of other posters commented as much. But I stopped myself and remembered what I had learned from writers' workshops at PLoP: they had read my words, which must stand on their own. Reading over a thread that discusses something I've written feels a little bit like a writers' workshop. As an author, I never know how others might interpret what I have written until I hear or read their interpretation in their own words. Interposing clarifying words is a tempting but dangerous trap.

Blog comments, whether on the author's site or on a community site such as Reddit, do tend to drift far afield from the original article. That is different from a PLoP workshop, in which the focus should remain on the work being discussed. In the case of my exam entry, the drift was quite interesting, as people discussed accumulator variables (yes, I was commenting on how students tend to overuse them; they are a wonderful technique when used appropriately) and recursion (yes, it is hard for imperative thinkers to learn, but there are techniques to help...). Well worth the read. But I could also see that sometimes a subthread comes to resemble an exchange in the children's game Telephone. Unless every commenter has read the original article -- in this case, mine -- the discussion tends to drift monotonically away from the content of the original as it loses touch with each successive post. Frankly, that's all right, too. I just hope that I am not held accountable for what someone at the end of the chain says I wrote...
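For anyone who followed that accumulator subthread, the contrast is easy to show in a few lines. This is a quick sketch of my own, in Python rather than the Scheme from class, and not code from the exam:

```python
# Structural recursion: the answer is assembled on the way back up
# from the recursive calls. Often the clearest form for beginners.
def total(items):
    if not items:
        return 0
    return items[0] + total(items[1:])

# Accumulator-passing style: the partial answer is threaded through
# the calls. A wonderful technique when you need it -- it makes the
# procedure tail-recursive -- but students often reach for it even
# when the structural version above is simpler.
def total_acc(items, acc=0):
    if not items:
        return acc
    return total_acc(items[1:], acc + items[0])

print(total([1, 2, 3, 4]))      # 10
print(total_acc([1, 2, 3, 4]))  # 10
```

Both compute the same result; the complaint in my exam entry was only about writing the second form reflexively when the first says what you mean.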


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

December 11, 2008 7:37 AM

Movin' Out, Twyla Tharp, and Inspiration

a scene from the Broadway musical Movin' Out

Last month my wife and I had the good fortune to see a Broadway touring company perform the Tony Award-winning Movin' Out, a musical created by Twyla Tharp from the music of Billy Joel. I've already mentioned that I am a big fan of Billy Joel, so the chance to listen to his songs for two hours was an easy sell. Some of you may recall that I also wrote an entry way back called Start with a Box that was inspired by a wonderful chapter from Twyla Tharp's The Creative Habit. So even if I knew nothing else about Tharp, Movin' Out would have piqued my interest.

This post isn't about the show, but my quick review is: Wow. The musicians were very good -- not imitating Joel, but performing his music in a way that felt authentic and alive. (Yes, I sang along, silently to myself. My wife said she saw my lips moving!) Tharp managed somehow to tell a compelling story by stitching together a set of unrelated songs written over the long course of Joel's career. I know all of these songs quite well, and occasionally found myself thinking, "But that's not what this song means...". Yet I didn't mind; I was hearing the songs from within the story. And I loved the dance itself -- it was classical even when modern, not abstract like Merce Cunningham's Patterns in Space and Sound. My wife knows dance well, and she was impressed that the male dancers in this show were actually doing classical ballet. (In many performances, the men are more props than dancers, doing lifts and otherwise giving the female leads a foil for their moves.)

Now I see that Merlin Mann is gushing over Tharp and The Creative Habit. Whatever else I can say, Mann is a great source of links... He points us to a YouTube video of Tharp talking about "failing well", as well as the first chapter of her book available on line. Now you can read a bit to see if you want to bother with the whole book. I echo Mann's caveat: we both liked the first chapter, but we liked the rest of the book more.

Since my post three years ago on The Creative Habit, I've been meaning to return to some of the other cool ideas that Tharp writes about in this book. Seeing Movin' Out caused me to dig out my notes from that summer, and seeing Mann's posts has awakened my desire to write some of the posts I have in mind. The ideas I learned in this book relate well to how I write software, teach, and learn.

Here is a teaser that may connect with agile software developers and comfort students preparing for final exams:

The routine is as much a part of the creative process as the lightning bolt of inspiration, maybe more. And this routine is available to everyone.

Oddly, this quote brings to mind an analogy to sports. Basketball coaches often tell players not to rely on having a great shooting night in order to contribute to the team. Shooting is like inspiration; it comes and it goes, a gift of capricious gods. Defense, on the other hand, is always within the control of the player. It is grunt work, made up of effort, attention, and hustle. Every player can contribute on defense every night of the week.

For me, that's one of the key points in this message from Tharp: control what you can control. Build habits within which you work. Regular routines -- weekly, daily, even hourly -- are the scaffolding that keeps you focused on making something. What's better, everyone can create and follow a routine.

While I sit and wait for the lightning bolt of inspiration to strike, I am not producing code, inspired or otherwise. Works of inspiration happen while I am working. Working as a matter of routine increases the chances that I will be producing something when the gods smile on me with inspiration. And if they don't... I will still be producing something.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

November 30, 2008 10:04 PM

Disconnected Thoughts to End the Month

... and a week of time away from work and worries.

There is still something special about an early morning run on fresh snow. The world seems new.

November has been a bad month for running, with my health at its lowest ebb since June, but even one three-mile jog brings back a good feeling.

I can build a set of books for our home finances based on the data I have at hand. I do not have to limit myself to the way accountants define transactions. Luca Pacioli was a smart guy who did a good thing, but our tools are better today than they were in 1494. Programs change things.

S-expressions really are a dandy data format. They make so many things straightforward. Some programmers may not like the parens, but simple list delimiters are all I need. Besides, Scheme's (read) does all the dirty work parsing my input.
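A tiny illustration of what I mean (the ledger-entry layout below is made up for the example, not my actual file format): when the data is an s-expression, `(read)` is the entire parser.

```scheme
;; Sketch: parsing an s-expression "record" with (read).
;; The entry format here -- (date account amount) -- is hypothetical,
;; invented just to show the idea.
(define sample "((2008 11 30) checking -42.50)")

;; Read one entry from a string port, just as we would from a file port.
(define entry (read (open-input-string sample)))

(car entry)    ; the date, the list (2008 11 30)
(cadr entry)   ; the account, the symbol checking
(caddr entry)  ; the amount, the number -42.5
```

No tokenizer, no grammar, no parser generator -- the list delimiters do all the work, which is why a homegrown format like this stays so cheap to maintain.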

After a week's rest, I imagine something like one of those signs from God:

That "sustainable pace" thing...
I meant that.

-- The Agile Alliance

I'd put Kent or Ward's name in there, but that's a lot of pressure for any man. And they might not appreciate my sense of humor.

The Biblical story of creation in six days (small steps) with feedback ("And he saw that it was good.") and a day of rest convinces me that God is an agile developer.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

November 17, 2008 8:58 PM

Doing It Wrong Fast

Just this weekend I learned about Ashleigh Brilliant, a cartoonist and epigrammist. From the little I've seen in a couple of days, his cartoons remind me of Hugh MacLeod's business-card doodles at Gaping Void -- only with a 1930s graphic style and language that is more likely SFW.

This Brilliant cartoon, #3053, made it into my Agile Development Hall of Fame on first sight:

Doing it wrong fast... is at least better than doing it wrong slowly

Doing it wrong fast means that we have a chance to learn sooner, and so have a chance to be less wrong than yesterday.

This lesson was fresh in my mind over the weekend from a small dose of programming. I was working on the financial software I've decided to write for myself, which has me exploring a few corners of PLT Scheme that I don't use daily. As a result, I've been failing more frequently than usual. The best failures come rat-a-tat-tat, because they are often followed closely by an a-ha! Sunday, just before I learned of Brilliant's work, I felt one of those great releases when, for the first time, my code gave me an answer I did not already know and had not wanted to compute by hand: our family net worth. At that moment I was excited to verify the result manually (I'll need that test later!) and enjoy all the wrong work that had brought me to this point. Brilliant is onto something.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

November 12, 2008 6:57 AM

That's a Wrap

I have posted all of my reports from the 2008 SECANT workshop. In sum, SECANT is a worthwhile community-building effort. It brings together such a mix: academia and industry, different disciplines, and different kinds of schools, from large Big Ten and other R-I universities down to small liberal arts colleges. This is one of the reasons why I love OOPSLA, but this venue provides a smaller, more intimate setting. (Of course, while SECANT lies at the intersection of computing -- and especially programming -- and science, OOPSLA's domain is really Everything Programming, which is even better.)

The workshop was again a great source of ideas and inspiration for me. This seems like a good use of a relatively small amount of money by NSF. The onus is now on us participants... Will we do the work to grow the community? To develop courses and materials? To transform our institutions and disciplines? A tall order.

As for being done with my reports, I feel a small measure of pride. Sure, last year, I posted my final workshop report only five days after the workshop ended, and this year I'm at eleven days. But my report on SIGCSE -- from March -- is still incomplete, with two entries on top: a general description of a panel on bringing the joy and wonder back to CS, and a more detailed report on one of the presentations from that panel, by Eric Roberts.

Is there a statute of limitations on blog entries? Has my coupon to post on that panel session expired? If I were Kent Beck, I'd probably call this long delay a "blog smell" and write a pattern!

For me, blogging suffers from a stack-of-ideas phenomenon. I have ideas, and they get pushed onto the to-blog list. Sometimes, I have more ideas than time to write, and some ideas get pushed deep in the stack before I get a chance to write them up. Time passes... And then I look back at the list of ideas, and most feel stale, or at least no longer have their original hold on my mind. I currently have three levels of "blog ideas" folders, and each one contains a bunch of ideas that I remember wanting to write now -- but which now I feel no desire to write. Sounds like it's time for a little rm -r *.*

Going to a conference only makes the stack-of-ideas problem worse. The sessions follow one upon another, and each one tends to stir me up so much that I push even the previous session way back in my mind. That's one advantage of a 1.5-day workshop over a several-day conference like SIGCSE or OOPSLA: the scale does not overflow my small brain.

Do readers care about any of this? Is SIGCSE stale for them? Perhaps, and I figure anyone who was wondering what went on in Portland has likely found the information elsewhere, and in any case moved on. But the topic of the unwritten entry may not be stale yet, so hope remains.

To return to the beginning of this blog, on the end of my SECANT reports: I hope you get as much from reading them as I did writing them.


Posted by Eugene Wallingford | Permalink | Categories: General

October 31, 2008 10:52 AM

SECANT This and That

[A transcript of the SECANT 2008 workshop: Table of Contents]

As always, at this workshop I have heard lots of one-liners and one-off comments that will stick with me after the event ends. This entry is a place for me to think about them by writing, and to share them in case they click with you, too.

The buzzword of this year's workshop: infiltration. Frontal curricular assaults often fail, so people here are looking for ways to sneak new ideas into courses and programs. An incremental approach creates problems of its own, but agile software proponents understand its value.

College profs like to roll their own, but high-school teachers are great adapters. (And adopters.)

Chris Hoffman, while describing his background: "When you reach my age, the question becomes, 'What haven't you done?' Or maybe, 'What have you done well?'"

Lenny Pitt: "With Python, we couldn't get very far. Well, we could get as far as we wanted, but students couldn't get very far." Beautiful. Imagine how far students will get with Java or Ada or C++.

Rubin Landau: "multidisciplinary != interdisciplinary". Yes! Ideas that transform a space do more than bring several disciplines into the same room. The discipline is new.

It's important to keep in mind the relationship between modeling and computing. We can do modeling without computing. But analytical models aren't feasible for all problems, and increasingly the problems we are interested in fall into this set.

Finally let me re-run an old rant by linking to the original episode. People, when you are second or third or sixth, you look really foolish.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

October 29, 2008 9:11 PM

Information, Dystopia, and a Hook

On my drive to Purdue today, I listened to the first 3/4 of Caleb Carr's novel, "Killing Time". This is not a genre I read or listen to often, so it's hard for me to gauge the book's quality. If you are inclined, you can read reviews on-line. At this point, I would say that it is not a very good book, but it delivered fine escapism for a car ride on a day when I needed a break more than deep thought. But it did get me to thinking about... computer science. The vignette that sets up the novel's plot is based on a typical use case for Photoshop, or a homework assignment in a media computation CS1 course.

Carr describes a world controlled by "information barons", a term intended to raise the specter of the 19th century's rail barons and their control of wealth and commerce. The central feature of his world in 2023 is deception -- the manipulation of information, whether digital or physical, to control what people think and feel. The novel's opening involves the role a doctored video plays in a presidential assassination, and later episodes include doctored photos, characters manufactured via the data planted on the internet, the encryption of data on disk, and real-time surveillance of encrypted communication.

If students are at all interested in this kind of story, whether for the science fiction, the intrigue, or the social implications of digital media and their malleability, then we have a great way to engage them in computing that matters. It's CSI for the computer age.

Carr seems to have an agenda on the social issues, and as is often the case, such an agenda interferes with the development of the story. His characters are largely cut-outs in service of the message. Carr paints a dystopian view striking for its unremitting focus on the negatives of digital media and the sciences' increasing understanding of the world at a molecular level. The book seems unaware that biology and chemistry are helping us to understand diseases, create new drugs, and design new therapies, or that computation and digital information create new possibilities in every discipline and part of life. Perhaps it is more accurate to say that Carr starts with these promises as his backdrop and chooses to paint a world in which everything that could go wrong has. That makes for an interesting story but ultimately an unsatisfying thought experiment. For escapism, that may be okay.

After my previous entry, I couldn't help but wonder whether I would have the patience to read this book. I have to think not. How many pages? 274 pages -- almost slender compared to Perec's book. Still, I'm glad I'm listening and not reading.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

October 29, 2008 9:16 AM

Clearing the Mind for a Trip

I leave today to attend the second SECANT workshop at Purdue. This is the sort of trip I like: close enough that I can drive, which bypasses all the headaches and inconveniences of flight, but far enough away that it is a break from home. My conference load has been light since April, and I can use a little break from the office. Besides, the intersection of computer science and the other sciences is an area of deep interest, and the workshop group is a diverse one. It's a bit odd to look forward to six hours on the road, but driving, listening to a book or to music, and thinking are welcome pursuits.

As I was checking out of the office, I felt compelled to make two public confessions. Here they are.

First, I recently ran across another recommendation for Georges Perec's novel, Life: A User's Manual. This was the third reputable recommendation I'd seen, and as is my general rule, after the third I usually add it to my shelf of books to read. As I was leaving campus, I stopped by the library to pick it up for the trip. I found it in the stacks and stopped. It's a big book -- 500 pages. It's also known for its depth and complexity. I returned the book to its place on the shelf and left empty-handed. I've written before of my preference for shorter books and especially like wonderful little books that are full of wisdom. But these days time and energy are precious enough resources that I have to look at a complex, 500-page book with a wary eye. It will make good reading some other day. I'm not proud to admit it, but my attention span isn't up to the task right now.

Second, on an even more frivolous note, there is at the time of this writing no Diet Mountain Dew in my office. I drank the last one yesterday afternoon while giving a quiz and taking care of pre-trip odds and ends. This is noteworthy in my mind only because of its rarity. I do not remember the last time the cupboard was bare. I'm not a caffeine hound like some programmers, but I don't drink coffee and admit some weakness for a tasty diet beverage while working.

I'll close with a less frivolous comment, something of a pattern I've been noticing in my life. Many months ago, I wrote a post on moving our household financial books from paper ledgers and journals into the twentieth century. I fiddled with Quicken for a while but found it too limiting; my system is a cross between naive home user and professional bookkeeping. Then I toyed with the idea of using a spreadsheet tool like Numbers to create a cascaded set of journals and ledgers. Yet at every turn I was thinking that I'd want to implement this or that behavior, which would strain the limits of typical spreadsheets. Then I came to my computer scientist's senses: When in doubt, write a program. I'd rather spend my time that way anyway, and the result is just what I want it to be. No settling. This pattern is, of course, no news at all to most of you, who roll your own blogging software and homework submission systems, even content management systems and book publishing systems, to scratch your own itches. It's not news to me, either, though sometimes my mind comes back to the power slowly. The financial software will grow slowly, but that's how I like it.

As a friend and former student recently wrote, "If only there were more time..."

Off to Purdue.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

October 24, 2008 12:10 PM

I've Been Reddited

I don't know if "reddited" is a word like "slashdotted" yet, but I can say that yesterday's post, No One Programs Any More, has reached #1 on Reddit's programming channel. This topic seems to strike a chord with a lot of people, both in the software business and in other technology pursuits. Here are my favorite comments so far:

I can't think of a single skill I've learned that has had more of an impact on my life than a semi-advanced grasp of programming.

This person received some grief for ranking learning how to program ahead of, say, learning how to eat, but I know just what the commenter means. Learning to program changes one's mind in the same way that learning to read and write does. Another commenter agreed:

It's amazing how after a year of programming at university, I began to perceive the world around me differently. The way I saw things and thought about them changed significantly. It was almost as if I could feel the change.

Even at my advanced age, I love when that feeling strikes. Beginning to understand call/cc felt like that (though I don't think I fully grok it yet).

My favorite comment is a bit of advice I recommend for you all:

I will not argue with a man named Eugene.

Reddit readers really are smart!


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

October 16, 2008 11:56 PM

Odds and Ends from Recent Reading and Writing

I'm on the road to a recruiting event in Des Moines. The event is for girls who are interested in math and science. For me, the real treat is a chance to meet Mae Jemison, the first woman of color to travel in space, on the space shuttle Endeavour in 1992. She's surely going to do a better job selling math and science to these students than I could! (Note after the talk: She did. Perhaps the best way to summarize her message is, "We have choices to make.")

A few short items have been asking me to write them:

• At the risk of living too public a life where my students can see, I will say that the personality of my current class of students is not one that gives me a lot of energy. They are either still wary or simply uninterested. This happens every once in a while, and I'll try to find a way to create more energy in the room. In any case, it's nice at least to have a student or two who are not like this.

• Kevin Rutherford has been working on a little software tool called reek, a smell detector for Ruby code.

That is what I would like to be doing right now, with either Ruby or Scheme being fine as a source language. Every time I teach programming languages I get the itch to dive deeply back into the refactoring pool. This is the primary drawback of administrative work and the primary indicator that I am probably not suited for a career in administration.

Short of working on such a cool project, blogging about interesting ideas is the next best thing.

• But consider this advice on writing:

If you have so many ideas, prove it to the world and start blogging. There is nothing like a blog to help you realize you have nothing new to say.

That post is really about why not to write a book. For many people, writing a book is a way to gain or demonstrate authority. Several of my friends and family have asked when I plan to write a book, and for at least a few their desire for me is grounded in the great respect they have for the value of a book. But I think that the author of the post is correct that writing a book is an outdated way to gain authority.

The world still needs great books such as, well, Refactoring, and one day I may sit down to write one. But I have to have something to say that should best be said in a book.

Perhaps we should take this author's advice with caution. She wrote a book and markets it with a blog!

• That piece also contains the following passage:

... self-respect comes from having some sort of vision for one's life and heading in that direction. And there is no one who can give you that vision -- you have to give it to yourself, and before you can feel like you have direction, you have to feel lost -- and lost is okay.

Long-time readers of this blog know that getting lost is not only okay but also demonstrates and exercises the imagination. Sometimes we get lost inside code just so that we can learn a new code base intimately.

• Finally, Seth Godin offers an unusual way to get things done:

Assemble your team (it might be just you) ... and focus like your hair is on fire. ... Do nothing except finish the project.

I need a Friday or a Monday at the office to try this out on a couple of major department projects. I was already planning a big sprint this weekend on a particularly persistent home project, and now I have a more provocative way to rev my engine.


Posted by Eugene Wallingford | Permalink | Categories: General

October 15, 2008 7:52 AM

Social Networks and the Changing Relationship Between Students and Faculty

One of my most senior colleagues has recently become enamored of Facebook. One of his college buddies started using it to share pictures, so my colleague created an account. Within minutes, he had a friend request -- from a student in one of his classes. And they kept coming... He now has dozens of friends, mostly undergrads at our school but also a few former students and current colleagues.

Earlier this week, he stopped me in the hall to report that during his class the previous hour, a student in the class had posted a message on his own Facebook page saying something to the effect, "I can't keep my eyes open. I have to go to sleep!" How does the prof know? Because they are Facebook friends, of course.

Did the student think twice about posting such a message during class? I doubt it. Was he so blinded by fatigue or boredom that he forgot the prof is his friend and so would see the message? I doubt it. Is he at all concerned in retrospect, or even just a little sheepish? I doubt it. This is standard operating procedure for a college set that opens the blinds on its life, day by day and moment by moment.

We live in a new world. Our students live much more public lives than most of us did, and today's network technology knocks down the wall that separates Them from Us.

This can be a good thing. My colleague keeps his Facebook page open in the evenings, where his students can engage him in chat about course material and assignments. He figures that his office hours are now limited only by the time he spends in front of a monitor. Immediate interaction can make a huge difference to a student who is struggling with a database problem or a C syntax error. The prof does not mind this as an encroachment on his time or freedom; he can close the browser window and draw the blinds on office hours anytime he wants, and besides, he's hacking or reading on-line most of the time anyway!

I'm uncertain what the potential downsides of this new openness might be. There's always a risk that students can become too close to their professors, so a prof needs to take care to maintain some semblance of a professional connection. But the demystification of professors is probably a good thing, done right, because it enables connections and creates an environment more conducive to learning. I suppose one downside might be that students develop a sense of entitlement to Anytime, Anywhere access, and professors who can't or don't provide it could be viewed negatively. This could poison the learning environment on both sides of the window. But it's also not a new potential problem. Just ask students about the instructors who are never in their offices for face-to-face meetings or who never answer e-mail.

I've not had experience with this transformation due to Facebook. I do have a page, created originally for much the same reason as my colleague's. I do have a small number of friends, including undergrads, former students, current colleagues, a grade-school buddy, and even my 60+ aunt. But I use Facebook sparingly, usually for a specific task, and rarely have my page open. I don't track the comments on my "wall", and I don't generally post on others'. It has been useful in one particular case, though, reconnecting me with a former student whose work I have mentioned here. That has been a real pleasure. (FYI, the link to his old site seems to be broken now.)

However, I do have limited experience with the newly transparent wall between me and my students, through blogs. It started when a few students -- not many -- found my blog and began to read it. Then I found the blogs of a few recent students and, increasingly, current students. I don't have a lot of time to read any blogs these days, but when I do read, I read some of theirs. Blogs are not quite as immediate as the Twitter-like chatter to be found in Facebook, but they are a surprisingly candid look into my students' lives and minds. Struggles they have with a particular class or instructor; personal trials at home; illness and financial woes -- all are common topics in the student blogs I read. So, too, are there joys and excitement and breakthroughs. Their posts enlighten me and humble me. Sometimes I feel as if I am privy to far too much, but mostly I think that the personal connection enriches my relationship both with individual students and with the collective student body. What I read certainly can keep me on a better path as I play the role of instructor or guide.

And, yes, I realize that there is a chance that the system can be gamed. Am I being played by a devious student? It's possible, but honestly, I don't think it's a big issue. The same students who will post in full view of their instructor that they want to sleep through class without shame or compunction are the ones who are blogging. There is a cultural ethic at play, a code by which these students live. I feel confident in assuming that their posts are authentic, absent evidence to the contrary for any given blogger.

(That said, I appreciate when students write entries that praise a course or a professor. Most current students are circumspect enough not to name names, but there is always the possibility that they refer to my course. That hope can psyche me up some days.)

To be fair, we have to admit that the same possibility for gaming the system arises when professors blog. I suppose that I can say anything here in an effort to manipulate my students' perceptions or feelings. I might also post something like this, which reflects my take on a group of students, and risk affecting my relationship with those students. One of my close friends sent me e-mail soon after that post to raise just that concern.

For the same reasons I give the benefit of the doubt to student bloggers, I give myself the benefit of the doubt, and the same to the students who read this blog. To be honest, writing even the few entries I manage to write these days takes a lot of time and psychic energy. I have too little of either resource to spend them disingenuously. There is a certain ethic to blogging, and most of us who write do so for more important purposes than trying to manipulate a few students' perceptions. Likewise, I trust the students who read this blog to approach it with a mindset of understanding something about computer science and just maybe to get a little sense of what makes their Dear Old Professor tick.

I know that is the main reason I write -- to figure out how I tick, and maybe learn a few useful nuggets of wisdom along the way. Knowing that I do so in a world much more transparent than the one I inhabited as a CS student years ago is part of the attraction.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

October 07, 2008 5:49 AM

Databases and the Box

Last time I mentioned a Supreme Court justice's thoughts on how universal access to legal case data changes the research task associated with the practice of the law. Justice Roberts's comments brought to mind two thoughts, one related to the law and one not.

As a graduate student, I worked on the representation and manipulation of legal arguments. This required me to spend some time reading legal journals for two different purposes. First, I needed to review the literature on applying computers to legal tasks, and in particular how to represent knowledge of statutes and cases. Second, I needed to find, read, and code cases for the knowledge base of my program. I'm not that old, but I'm old enough that my research preceded the Internet Age's access to legal cases. I went to the campus library to check out thick volumes of the Harvard Law Review and other legal collections and journals. These books became my companions for several months, as I lay on the floor of my study and pored over them.

When I could not find a resource I needed on campus, I rode my bike to the Michigan State Law Library in downtown Lansing to use law reviews in its collection. I was not allowed to take these home, so I worked through them one at a time in carrels there. I was quite an anomalous sight there, in T-shirt and shorts with a bike helmet at my side!

I loved that time, reading and learning. I never considered studying the law as a profession, but this work was a wonderful education in a fascinating domain where computing can be applied. My enjoyment of the reading almost certainly extended my research time in grad school by a couple of months.

The second thought was of the changes in chess brought about by the application of simple database technology. I've written about chess before, but not about computing applications to it. Of course, the remarkable advances in chess-playing computers that came to a head in Hitech and Deep Thought have now reached the desktop in the form of cheap and amazingly strong programs. This has affected chess in so many ways, from eliminating the possibility of adjournments in most tournaments to providing super-strong partners for every player who wants to play, day or night. The Internet does the same, though now we are never sure if we are playing against a person or a person sitting next to a PC running Fritz.

But my thoughts turned to the same effect Justice Roberts talked about, the changes created by opening databases on how players learn, study, and stay abreast of opening theory. If you have never played tournament chess, you may not be aware of how much knowledge of chess openings has been recorded. Go to a big-box bookstore like Amazon or Barnes and Noble or Borders and browse the library of chess titles. (You can do that on-line now, of course!) You will see encyclopedias of openings like, well, the Encyclopedia of Chess Openings; books on classes of openings, such as systems for defending against king pawn openings; and books upon books about individual openings, from the most popular Ruy Lopez and Sicilian Defense to niche openings like my favorites, Petroff's Defense and the Center Game.

In the olden days of the 1980s, players bought books on their objects of study and pored over them with the same vigor as legal theorists studying law review articles. We hunted down games featuring our openings so that we could play through them to see if there was a novelty worth learning or if someone had finally solved an open problem in a popular variation. I still have a binder full of games with Petroff's Defense, cataloged using my own system, variation by variation with notes by famous players and my own humble notes from unusual games. My goal was to know this opening so well that I could always get a comfortable game out of the opening, against even stronger players, and to occasionally get a winning position early against a player not as well versed in the Petroff as I.

Talk about a misspent youth.

Chessplayers these days have the same dream, but they rarely spend hours with their heads buried inside opening books. These days, it is possible to subscribe to a database service that puts at our fingertips, via a computer keyboard, every game played with any opening -- anywhere in the recorded chess world, as recently as the latest update a week ago. What is the study of chess openings like now? I don't know, having grown up in the older era and not having kept up with chess study in many years. Perhaps Justice Roberts feels a little like this these days. Clerks do a lot of his research, and when he needs to do his own sleuthing, those old law reviews feel warm and inviting.

I do know this. Opening databases have so changed chess practice, from grandmasters down to patzers like me, that the latest issue of Chess Life, the magazine of U.S. Chess, includes a review of the most recent revision of Modern Chess Openings -- the opening bible on which most players in the West once relied as the foundation of broad study -- whose primary premise is this: What role does MCO play in a world where the computer database is king? What is the use of this venerable text?

From our gamerooms to our courtrooms, applications of even the most straightforward computing technology have changed the world. And we haven't even begun to talk about programs.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

September 01, 2008 2:31 PM

B.B. King, CS Wannabe

The Parade magazine insert to my Sunday paper contained an interview with legendary bluesman B.B. King that included this snippet:

There's never a day goes by that I don't miss having graduated and gone to college. If I went now, I would major in computers and minor in music. I have a laptop with me all the time, so it's my tutor and my buddy.

CS and music are, of course, a great combination. Over the years, I've had a number of strong and interesting students whose backgrounds included heavy doses of music, from alt rock to marching band to classical. But B.B. King is from a different universe. Maybe I can get this quote plastered on the walls of all the local haunts for musicians.

I wonder what B.B. calls his laptop?


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

August 22, 2008 4:00 PM

Lawyers Read My Blog

Not really. But they do protect their marks.

A couple of years ago, I received a polite request from the lawyers of a well-known retired cartoonist, asking that I not use one of his cartoons. Today, I received a polite request from the lawyers of a well-known business author and speaker, asking:

Please make sure to include [a statement acknowledging our registered trademark] on each page that our trademarked term appears. Additionally, we respectfully request that every time you use our mark in the body of your work of commentary that you capitalize the first letter of each word in the mark and directly follow the mark with the ® symbol so that it reads as "... ®"

Google does change the landscape for many, many things. This is a good thing; it reduces friction in the market and the law.

One result for me is that I now know that the ® symbol is denoted by entity number 174 (&#174;) or entity name reg (&reg;). I've used © occasionally, but rarely ®.
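
A quick sanity check in Python's standard html module shows that the two forms name the same character:

```python
import html

# Two equivalent ways to write the registered-trademark sign in HTML:
numbered = html.unescape("&#174;")
named = html.unescape("&reg;")
assert numbered == named == "\u00ae"   # the ® character
```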

That said, I'm not too keen on having to capitalize two common words every time I use them in an article. I think I either need to write those articles without using the code phrase, or simply stop quoting books that are likely to trademark simple phrases. The latter rules out most thin business trade books, especially on management and marketing. That's not much of a loss, I suppose.


Posted by Eugene Wallingford | Permalink | Categories: General

August 15, 2008 2:35 PM

Less, Sooner

Fall semester is just around the corner. Students will begin to arrive on campus next week, and classes start a week from Monday. I haven't been able to spend much time on my class yet and am looking forward to next week, when I can.

What I have been doing is clearing a backlog of to-dos from the summer and handling standing tasks that come with the start of a new semester and especially a new academic year. This means managing several different to-do lists, juggling priorities, and generally trying to get things done.

As I look at this mound of things to do I can't help being reminded of something Jeff Patton blogged a month or so ago: two secrets of success in software development, courtesy of agile methods pioneer Jim Highsmith: start sooner, and do less.

Time ain't some magical quantity that I can conjure out of the air. It is finite, fixed, and flowing relentlessly by. If I can't seem to get done on time, I need to start sooner. If I can't seem to get it all done, I need to do less. Nifty procedures and good tools can help only so much.

I need to keep this in mind every day of the year.

Oh, and to you students out there: You may not be able to do less work in my class, but you can start sooner. You may have said so yourself at the end of last semester. Heck, you may even want to do more, like read the book...


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

July 28, 2008 3:44 PM

Revolution, Then Evolution

I recently started reading The Art of Possibility, by Roz and Ben Zander, and it brought to mind a pattern I have seen many times in literature and in life. Early on, the Zanders explain that this book is "not about making incremental changes that lead to new ways of doing things based on old beliefs". It is "geared toward causing a total shift of posture [and] perceptions"; it is "about transforming your entire world".

That's big talk, but the Zanders are not alone in this message. When talking to companies about creating new products, reaching customers, and running a business, Guy Kawasaki uses the mantra Revolution, Then Evolution. Don't try to get better at what you are doing now, because you aren't always doing the right things. But also don't worry about trying to be perfect at doing something new, because you probably won't be. Transform your company or your product first, then work to get better.

This pattern works in part because people need to be inspired. The novelty of a transformation may be just what your customers or teammates need to rally their energies, when "just" trying to get better will make them weary.

It also works despite running contrary to our fixation these days with "evolving". Sometimes, you can't get there from here. You need a mutation, a change, a transformation. After the transformation, you may not be as good as you would like for a while, because you are learning how to see the world differently and how to react to new stimuli. That is when evolution becomes useful again, only now moving you toward a higher peak than was available in the old place.

I have seen examples of this pattern in the software world. Writing software patterns was a revolution for many companies and many practitioners. The act of making explicit knowledge that had been known only implicitly, or the act of sharing internal knowledge with others and growing a richer set of patterns, requires a new mindset for most of us. Then we find out we are not very good, so we work to get better, and soon we are operating in a world that we may not have been able even to imagine before.

Adopting agile development, especially a practice-laden approach such as XP, is for many developers a Revolution, Then Evolution experience. So are major lifestyle changes such as running.

Many of you will recognize an old computational problem that is related to this idea: hill climbing. Programs that do local search sometimes get stuck at a local maximum. A better solution exists somewhere else in the search space, but the search algorithm makes it impossible for the program to get out of the neighborhood of the local max. One heuristic for breaking out of this circumstance is occasionally to make a random jump somewhere else in the search space, and see where hill climbing leads. If it leads to a better max, stay there, else jump back to the starting point.
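
In code, the heuristic looks something like this. This is only a toy sketch, maximizing a one-dimensional function; the landscape, step size, and restart count are mine, chosen for illustration:

```python
import random

def hill_climb(f, x, step=0.1, iters=1000):
    """Greedy local search: move to a random neighbor only if it improves f."""
    for _ in range(iters):
        candidate = x + random.uniform(-step, step)
        if f(candidate) > f(x):
            x = candidate
    return x

def random_restart(f, lo, hi, restarts=20):
    """The 'random jump' heuristic: climb from several random starting
    points and keep the best peak found."""
    best = None
    for _ in range(restarts):
        peak = hill_climb(f, random.uniform(lo, hi))
        if best is None or f(peak) > f(best):
            best = peak
    return best

# A toy landscape: a local peak at x = -3 (height 0) and the
# true peak at x = 3 (height 1).
def two_peaks(x):
    return -(x + 3) ** 2 if x < 0 else 1 - (x - 3) ** 2

random.seed(0)
best = random_restart(two_peaks, -10, 10)
```

A climber started on the left side of this landscape gets stuck at the lower peak; the random restarts are what let the search escape it.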

In AI and computer science more generally, it is usually easier to peek somewhere else, try for a while, and pop back if it doesn't work out. Most individuals are reluctant to make a major life change that may need to be undone later. We are, for the most part, beings living in serial time. But it can be done. (I sometimes envy the freer spirits in this world who seem built for this sort of experimentation.) It's even more difficult to cause a tentative radical transformation within an organization or team. Such a change disorients the people involved and strains their bonds, which means that you had better well mean it when you decide to transform the team they belong to. This is a major obstacle to Revolution, Then Evolution, and one reason that within organizations it almost always requires a strong leader who has earned everyone's trust, or at least their respect.

As a writer of patterns, I struggle with how to express the context and problem for this pattern. The context seems to be "life", though there are certainly some assumptions lurking underneath. Perhaps this idea matters only when we are seeking a goal or have some metric for the quality of life. The problem seems to be that we are not getting better, despite an effort to get better. Sometimes, we are just bored and need a change.

Right now, the best I can say from my own experience is that Revolution, Then Evolution applies when it has been a while since I made long-term progress, when I keep finding myself revisiting the same terrain again and again without getting better. This is a sign that I have plateaued or found a local maximum. That is when it is time to look for a higher local max elsewhere -- to transform myself in some way, and then begin again the task of getting better by taking small steps.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

July 09, 2008 1:57 PM

Interlude

[Update: Added a link to my interview at Confessions of a Science Librarian.]

Some months, I go through stretches when I write a lot. I started this month with a few substantive posts and a few light posts in the span of a week. Back in November 2007, I wrote twice as many posts as the typical month and more than any month since my first few months blogging. That month, I posted entries eleven days in a row, driven by a burst of thoughts from time spent at a workshop on science and computer science. This month, I had the fortune to read some good articles and the chance to skip real work, think, and write. Sometimes, the mood to write takes hold.

I have had an idea for a long time to write an entry that was motivated by reading George Orwell's essay Why I Write, but never seem to get to it. I'm not getting to it today, either. But it came to mind again for two reasons. First, I spent the morning giving a written interview to John Dupuis, who blogs at Confessions of a Science Librarian. John is a reader of my blog and asked me to share some of my ideas with his readers. I was honored to be asked, and so spent some time this morning reflecting on my blog, what and why I write. Second, today is the fourth anniversary of my first blog post.

Responding to John's questions is more writing than I do on most days. I don't have enough energy left to write a substantive post yet today, but I'm still in a reflective frame of mind about why I write.

Do I really need to blog? Someone has already said what I want to say. In that stretch of posts last November, I cited Norvig, Minsky, and Laurel, among others, talking about the same topics I was writing about. Some reasons I can think of are:

  • My experiences are different, so maybe I have something to add, however small that might be.
  • My posts link to all that great work, and someone who reads my blog may find an article they didn't know about and read it. Or they may remember it from the past, feel guilty at not having read it before, and finally go off to read it.
  • If nothing else, I write to learn, and to make myself happy. Some days, that's more than enough reason for me.

There are certainly other self-interested reasons to write. There is noble self-interest:

Share your knowledge. It's a way to achieve immortality.
-- the 14th Dalai Lama

And there is the short-term self-interest. I get to make connections in my own mind. Sometimes I am privileged to see my influence on former students, when they respond to something I've written. And then there is the lazy blog, where some reader knows or has something I don't and shares. At times, these two advantages come together, as when former student Ryan Dixon brought me a surprise gift last winter.

Year Five begins today, even if still without comments.


Posted by Eugene Wallingford | Permalink | Categories: General

July 07, 2008 1:58 PM

Patterns in My Writing

While reading this morning I came across a link to this essay. College students should read it, because it points out many of the common anti-patterns in the essays that we professors see -- even in papers written for computer science courses.

Of course, if you read this blog, you know that my writing is a poster child for linguistic diffidence, and pat expressions are part of my stock in trade. It's sad to know that these anti-patterns make up so much of my word count.

This web page also introduced me to Roberts's book Patterns in English. With that title, I must check it out. I needed a better reason to stop by the library than merely to return books I have finished. Now I have one.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

July 05, 2008 9:57 PM

Wedding Season

On my way into a store this afternoon to buy some milk, I ran into an old friend. He moved to town a decade or so ago and taught art at the university for five years before moving on to private practice. As we reminisced about his time on the faculty, we talked about how much we both like working with students. He mentioned that he recently attended his 34th wedding of a former student.

Thirty-four weddings from five years of teaching. I've been teaching for sixteen years and have been invited to only a handful of weddings -- three or four.

Either art students are a different lot from CS students, or I am doing something wrong...


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

July 04, 2008 8:37 AM

Science, Education, and Independent Thought

I wrote about a recent CS curricular discussion, which started with a blog posting by Mark Guzdial. Reading the comments to Guzdial's post is worth the time, as you'll find a couple of lengthy remarks by Alan Kay. As always, Kay challenges even computer science faculty to think beyond the boundaries of our discipline to the role what our students learn from us plays in a democratic world.

One of Kay's comments caught my attention for connections to a couple of things I've written about in recent years. First, consider this:

I posit that this is still the main issue in America. "Skilled children" is too low a threshold for our system of government: we need "educated adults". ... I think the principle is clear and simple: there are thresholds that have to be achieved before one can enter various conversations and processes. "Air guitar and attitude" won't do.

Science is a pretty good model (and it was used by the framers of the US). It is a two level system. The first level has to admit any and all ideas for consideration (to avoid dogma and becoming just another belief system). But the dues for "free and open" are that science has built the strongest system of critical thinking in human history to make the next level threshold for "worthy ideas" as high as possible. This really works.

This echoes the split mind of a scientist: willing to experiment with the widest set of ideas we can imagine, then setting the highest standard we can imagine for accepting the idea as true. As Kay goes on to say, this approach is embedded in the fabric of the American mentality for free society and government. This is yet another good reason for all students to learn and appreciate modern science; it's not just about science.

Next, consider this passage that follows soon after:

"Air guitar" is a metaphor for choosing too tiny a subset of a process and fooling oneself that it is the whole thing. ... You say "needs" and I agree, but you are using it to mean the same as "wants", and it is simply not the case that education should necessarily adapt to the "wants" of students. This is where the confusion of education and marketing enters. The marketeers are trying to understand "wants" (and even inject more) and cater to them for a price; real educators are interested in "needs" and are trying to fulfill these needs. Marketeers are not trying to change but to achieve fit; educators are trying to change those they work with. Introducing marketing ideas into educational processes is a disaster in the making.

I've written occasionally about ideas from marketing, from the value of telling the right story to the creation of new programs. I believe those things and think that we in academia can learn a lot from marketers with the right ideas. Further, I don't think that any of this is in conflict with what Kay says here. He and I agree that we should not change our curriculum to cater solely to the perceptions and base desires of our clientele, whether students, industry, or even government. My appeal to marketing for inspiration lies in finding better ways to communicate what we do and offer and in making sure that what we do and offer are in alignment with the long-term viability of the culture. The best companies are in business for the long haul and must stay aligned with the changing needs of the world.

Further, as I am certain Kay will agree based on many of the things he has said about Apple of the 1980s, the very best companies create and sell products that their customers didn't even know they wanted. We in academia might learn something from the Apples of our world about how to provide the liberal and professional education that our students need but don't realize they need. The same goes for convincing state legislatures and industry when they view too short a horizon for what we do.

Like Kay, I want to give my students "real computing" and "real education".

I think it is fitting and proper to talk about these issues on Independence Day in the United States. We depend on education to preserve the democratic system in which we live and the values by which we live. But there's more. Education -- including, perhaps especially, science -- creates freedom in the student. The mind becomes free to think greater thoughts and accomplish greater deeds when it has been empowered with our best ideas. Science is one.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

June 30, 2008 12:26 PM

Not Reading in the Wild

My recent entry on student evaluations brought to mind a few other items on not reading that I've encountered recently.

  • Michael Mitzenmacher writes about the accessibility of scientific references, whether on-line or in obscure journals. How much effort should an author have to make to track down related work to cite? Presumably, Mitzenmacher holds authors responsible for reading works about which they know, but it seems a short step from asking whether authors must make extra effort to find related work to asking, as Bayard did, whether authors must make extra effort even to read related work. (And, if you have ever seen academic papers in computer science, you know that many of them do require extra effort to read!)
  • Steve Yegge never learned to read sheet music and has survived by virtue of a prodigious memory. But he tells us that this is a bad thing:

    Having a good memory is a serious impediment to understanding. It lets you cheat your way through life.

    So, Montaigne and I need not worry. Had we better memories, we might be skating through life as easily as Yegge.

  • Comedian Pat Dixon has a schtick in which he gives unbiased movie reviews. How does he maintain his objectivity in a sea of personal opinion? He doesn't watch the movies! Wouldn't watching affect his reviews? Brilliant, and often quite funny.

I hope it's clear that at least this last example is not serious at all.


Posted by Eugene Wallingford | Permalink | Categories: General

June 18, 2008 3:51 PM

William James and Focus

I've long been a fan of William James, and once wrote briefly about the connection between James's pragmatism and my doctoral work on knowledge-based systems. I was delighted yesterday to run across this quote from James's The Principles of Psychology, courtesy of 43 Folders and Linda Stone:

[Attention] is the taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought. ... It implies withdrawal from some things in order to deal effectively with others....

Prone as I am to agile moments, this message from James struck me in an interesting way. First of all, I occasionally battle the issue that Stone writes about, the to-do list that grows no matter how productive I seem to be on a given day. (And on lazy June days, well, all bets are off.) James tells me that part of my problem isn't a shortage of time, but a lack of will to focus. I need to make better, more conscious choices about what tasks to add to the list. Kent Beck is fond of saying something to the effect that you may have too many things to do and too little time, but you ultimately have control over only one side of the equation. James would tell us the same thing.

My mind also made a connection from this quote to the agile software and test-driven development practice of working on small stories, on taking small steps. If I pick up a card with a single, atomic, well-defined feature to be added to my program, I am able to focus. What is the shortest step I can take and make this feature part of my code? No distractions, no Zerstreutheit. Though I have an idea in mind toward where my program is evolving, for this moment I attend to one small feature and make it work. Focus. James would be proud.

I think it's ironic in a way that one of the more effective ways to reach the state of flow is to decompose a task into the smallest of tasks and focus on them one at a time. The mind gets into a rhythm of red bar-green bar: select task, write code, refactor, and soon it is deep in its own world. I would like to be more effective at doing this in my non-programming duties. Perhaps if I keep James and his quote in mind, I can be.

This idea applies for me in other areas, in particular in running and training for particular events. Focusing each day on a particular goal -- intervals, Long Slow Distance, hill strength, and so on -- helps the mind to push aside its concerns with other parts of the game and attend to a particular kind of improvement. There is a great sense of relaxation in running super-hard repeats when the problem I've been having is, say, picking up pace late in a run. (I'd love to run super-hard repeats again some day soon, but I'm not there yet.)


Posted by Eugene Wallingford | Permalink | Categories: General, Running, Software Development

June 16, 2008 1:38 PM

A Picture of My Blog

I saw a link to Wordle on Exploration Through Example and decided to try it out on Knowing and Doing. Wordle generates a tag cloud for any text you paste in. I pasted in the content of my blog since the beginning of 2008, and it produced this lovely image:

a tag cloud for Knowing and Doing posts since 01/2008

That looks like a pretty good capsule of what I've been writing about lately. I was a bit surprised at the size of "students", but I probably shouldn't have been. "Programming", "work", "time", "ideas", "read", and "computer"/"CS"/"computing" hit the mark.

Besides, I had fun writing the script to pull plain text for my posts from their source form. It was a nice break from a day dealing with lab reservations and end-of-year budget issues.
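
I don't still have that script handy, but the heart of the idea is simple: strip the markup, count the words, and hand the frequencies to the cloud generator. A minimal sketch (the sample text and stopword list here are invented for illustration):

```python
import re
from collections import Counter

def word_counts(html_text, stopwords=frozenset({"the", "a", "and", "of", "to", "in"})):
    """Strip HTML tags, lowercase the text, and count the remaining words."""
    text = re.sub(r"<[^>]+>", " ", html_text)     # crude tag stripper
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(w for w in words if w not in stopwords)

sample = "<p>Students read <em>programming</em> books. Students write programs.</p>"
top = word_counts(sample).most_common(2)
```

Run on a few months of posts, a counter like this would make "students" loom large for exactly the reason it surprised me in the cloud.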


Posted by Eugene Wallingford | Permalink | Categories: General

June 12, 2008 9:53 PM

The Subject of My Writing

In recent days, I have written about not reading books and the relationship of these ideas to writing, from my enjoyment of Pierre Bayard's How to Talk About Books You Haven't Read. A couple of readers have responded with comments about how important reading is. Don't worry -- much of what Bayard and I are saying here is a joke. But it is also true, when looked at with one's head held tilted just so, and that's part of what made the book interesting to me. For you software guys, think about Extreme Programming -- an idea taken to its limits, to see what the limits can teach us. You can be sure that I am not telling you not to read every line of every novel and short story by Kurt Vonnegut! (I certainly have, some of them many, many times, and I enjoyed every minute.) Neither is Bayard, though it may seem so sometimes.

In my entries inspired by the book, it seems as if I am talking about myself an awful lot. Or consider my latest article, on parsing in CS courses. I read an article by Martin Fowler and ended up writing about my course and my opinions of CS courses. My guess is that most folks out there are more interested in Fowler's ideas than mine, yet I write.

This is another source of occasional guilt... Shouldn't this blog be about great ideas? When I write about, say, Bayard's book, shouldn't the entry be about Bayard's book? Or at least about Bayard?

Bayard helps me to answer these questions. Let's switch from Montaigne, the focus of my last entry on this topic, to Wilde. The lead quote of Bayard's Chapter 12 was the first passage of the book to seize my attention as I thumbed through it:

Speaking About Yourself

(in which we conclude, along with Oscar Wilde,
that the appropriate time span for reading a book
is ten minutes, after which you risk forgetting
that the encounter is primarily a pretext
for writing your autobiography)

My experience writing this blog biases me toward shouting out, "Amen, Brother Bayard!" But, if it is true that all of my writing is a pretext for writing my autobiography, then it is all the more remarkable that I have any readers at all. Certainly you all have figured this out by now.

Bayard claims -- and Wilde agrees -- that it cannot be any other way. You may find more interesting people writing about themselves and read what they write, but you'll still be reading about the writer. (This is cold consolation for someone like me, who knows myself to be not particularly interesting!)

Bayard explores Wilde's writing on this very subject, in particular his The Critic as Artist (HB++). Bayard begins his discussion with the surface connection of Wilde offering strident support for the idea of not reading. Wilde says that, in addition to making lists of books to read and lists of books worth re-reading, we should also make lists of books not to read. Indeed, a teacher or critic would do an essential service for the world by dissuading people from wasting their time reading the wrong books. Not reading of this sort is a "power acquired by specialists, a particular ability to grasp what is essential".

Bayard then moves on to a deeper connection. Wilde asserts in his typical fashion that the distinction between creating a work of art and critiquing a work of art is artificial. First, the artist, when creating, necessarily exercises her critical faculty in the "spirit of choice" and the "subtle tact of omission"; without this faculty no one can create art, at least not art worth considering. This is an idea that most people are willing to accept, especially those creative people who have some awareness of how they create.

But what of the critic? Many people consider critics to be parasites who at best report what we can experience ourselves and at worst detract from our experience with their self-indulgent contributions.

Not Wilde:

Criticism is itself an art. And just as artistic creation implies the working of the critical faculty, and, indeed, without it cannot be said to exist at all, so Criticism is really creative in the highest sense of the word. Criticism is, in fact, both creative and independent.

This means that a blogger who primarily comments on the work of others can herself be making art, creating new value. By choosing carefully ideas to discuss, subtly omitting what does not matter, the critic creates a new work potentially worthy of consideration in its own right. (Suddenly, the idea of a mashup comes to mind.)

The idea of critic as an independent creator is key. Wilde says:

The critic occupies the same relation to the work of art he criticises as the artist does to the visible world of form and colour, or the unseen world of passion and thought. He does not even require for the perfection of his art the finest materials. Anything will serve his purpose.

...

To an artist so creative as the critic, what does subject-matter signify? No more and no less than it does to the novelist and the painter. Like them, he can find his motives everywhere. Treatment is the test. There is nothing that has not in it suggestion or challenge.

Bayard summarizes other comments from Wilde in this way:

The work being critiqued can be totally lacking in interest, then, without impairing the critical exercise, since the work is there only as a pretext.

But how can this be?? Because ultimately, the writer writes about himself. Freed from the idea that writing about something else is about that something, the writer is able to use the something as a trigger, a cue to write about the ideas that lie in his own mind. (Please read the first paragraph of the linked entry, if nothing else. Talk about not reading!)

As Wilde says,

That is what the highest criticism really is, the record of one's own soul.

Again, Bayard summarizes neatly:

Reflection on the self ... is the primary justification for critical activity, and this alone can elevate criticism to the level of art.

As I read this chapter, I felt as if Bayard and Wilde were speaking directly to me and my own doubts as a blogger who likes to write about works I read, performances I see, and experiences as I have. It is a blogger's manifesto! Knowing and Doing feels personal to me because it is. Those works, performances, and experiences stimulate me to write, and that's okay. It is the nature of creativity to be sparked by something Other and to use that spark to express something that lies within the Self. Reading about Montaigne and his fear of forgetting what he had written was a trigger for me to write something I'd long been thinking. So I did.

I can take some consolation: This blog may not be worth reading, but not because I choose to connect what I read, see, hear, and feel to myself. It can be unworthy only to the extent that what is inside me is uninteresting.

By the way, I have just talked quite a bit about "The Critic as Artist", though I have never read it. I have only read the passages quoted by Bayard, and Bayard's commentary on it. I intend to read the original -- and begin forgetting it -- soon.

~~~~~

These three entries on Bayard's delightful little text cover a lot of ground in the neighborhood of guilt. We often feel shame at not having read something, or at not having grown from it. When we write for others, it is easy to become too concerned with getting things right, with being perfect, with putting on appearances. But consider this final quote from Bayard:

Truth destined for others is less important than truthfulness to ourselves, something attainable only by those who free themselves from the obligation to seem cultivated, which tyrannizes us from within and prevents us from being ourselves.

Long ago, near the beginning of this blog, I quoted Epictetus's The Enchiridion, via the movie Serendipity, of all places. That quote has a lot in common with what Bayard says here. Freeing ourselves from the obligation to seem cultivated -- being content to be thought foolish and stupid -- allows us to grow and to create. Epictetus even refers to keeping our "faculty of choice in a state conformable to nature", just as Wilde stresses the role of the critical faculty in creating a work of art when we write.

Helping readers see this truth, and releasing them from the obligation to appear knowing, is the ultimate source of the value of How to Talk About Books You Haven't Read. Perhaps Bayard will be proud that I mark it FB++.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

June 05, 2008 3:49 PM

Not Reading, and Writing

In my last entry, I talked about Pierre Bayard's How to Talk About Books You Haven't Read, which I have, indeed, read. Bayard started with the notion that no one should feel shame about not having read a book, even when we are called upon to talk about it. He eventually reached a much more important idea: that by freeing ourselves from this and other fears we have about books and learning, we open ourselves to create art of our own. This entry looks at the bigger idea.

The issues that Bayard discusses throughout the book touch me in several personal and professional ways. I am a university professor, and as a teacher I am occasionally asked by students to talk about books and papers. I've read many of these, but not all; when I have read a work, though, I may well have forgotten a lot of it. In either case, I can find myself expected to talk intelligently about a work I don't know well. Not surprisingly, students show considerable deference to their teachers, in part because they expect a level of authority. That's pressure. Personally, I sometimes hang with an interesting, literate, well-read crowd. They've all read a lot of cool stuff; why haven't I? They don't ask me that, of course, but I ask myself.

Bayard assures us "don't worry!", explains why not, and tells us how to handle several common situations in which we will find ourselves. That's the idea -- partly humorous, partly serious -- behind the book.

But there is more to the book, both humor and truth, that connected with me. Consider:

Reading is not just acquainting ourselves with a text or acquiring knowledge; it is also, from its first moments, an inevitable process of forgetting.

Until I started writing this blog, I did not have a good sense of how bad my memory is for what I have read. I've never had a high level of reading comprehension. Those standardized tests we all took in grade school showed me to be a slow reader with only moderate comprehension, especially when compared to my performance in school. One of the best outcomes for me of writing this blog has been to preserve some of what I read, especially the part that seems noteworthy at the time, before I start to forget it.

The chapter that contains the sentence quoted above begins with this subtitle:

(in which, along with Montaigne, we raise
the question of whether a book you have
read and completely forgotten, and which
you have even forgotten you have read,
is still a book you have read)

Montaigne writes with fear about his forgetfulness, the loss of any memory of having read a book. Does that still count? In one sense, yes. I've held Ringworld in my hands and taken in the words on each page. But in most ways I am today indistinguishable from a person who has never read the book, because I don't remember much more than the picture on the cover. Bayard explores this and other ways in which the idea of "to read" is ambiguous and bases his advice on the results.

How about any of the many, many technical computer science books I've read? The same fate. There is one solace to be had when we consider books that teach us how to do something. The knowledge we gain from reading technical material can become part of our active skill base, so that even after we have forgotten "knowledge that" the content of a compiler text is true, we can still have "knowledge how" to do something.

But reading is not the end of forgetting. Montaigne was an essayist. What about writing? Montaigne expects his loss to extend to his own work:

It is no great wonder if my book follows the fate of other books, and if my memory lets go of what I write as of what I read, and of what I give as of what I receive.

Forgetting what I have written is a sensation I share with Montaigne. Occasionally, I go back and re-read an old entry in this blog, or a month of entries, and am amazed. Some times, I am amazed that I wrote such drivel. Other times, I am amazed that I had such a good idea and managed to express it well. And, yes, I am often amazed to be reminded I have read something I've forgotten all about. In the best of these cases, the entry includes a quotation or, even better, a link to the work. This allows me to read it again, if I desire. I usually do.

That is good news. We can hold at bay the forgetting of what we read by re-reading. But there is another risk in forgetting: writing the same thing again. Bayard reports Montaigne's fear:

Incapable of remembering what he has written, Montaigne finds himself confronted with the fear of all those losing their memory: repeating yourself without realizing it....

Loss of memory creates an ambiguity in the writer's mind. It's common for me when writing a blog entry to have a sense of deja vu, that I've written the same thing before. Sometimes my mind is playing tricks on me, due to the rich set of links in my brain, but sometimes I am repeating myself and have forgotten. The fear of forgetting what we have written is heightened by the fear that what we write is unremarkable. We may remember the idea that stands out, but how are we to remember the nondescript? I often feel as Montaigne did:

Now I am bringing in here nothing newly learned. These are common ideas; having perhaps thought of them a hundred times, I am afraid I have already set them down.

I feel this almost no matter what I write. Surely my thoughts are common; what value is there in writing them down for others to read? That's why it was good for me to realize at the very beginning that I had to think that I was writing for myself. Only then would I find the courage to write at all and maybe benefit someone else. When confronted by a sense that I am writing the same ideas again, I just have to be careful. And when I do repeat myself, I must hope that I do it better the second time, or at least differently enough, to add something that makes the new piece worth a reader's time.

The danger in forgetting what I have written is not only in writing again. What about when a reader asks me about something I have written? Montaigne faced this fear, too, as Bayard writes:

But fear of repeating himself is not the only embarrassing consequence of forgetting his own books. Another is that Montaigne does not even recognize his own texts when they are quoted in his presence, leaving him to speak about texts he hasn't read even though he has written them.

That is at least two layers of not reading more than most of us expect to encounter in any situation! But the circumstance is quite real. When someone sends me e-mail asking about something I've forgotten I wrote, I have the luxury of time to re-read (there it is again!) before I respond. My correspondent is likely none the wiser. But what if they ask me in person? I am right where Bayard says I will be, left to respond in many of the ways he describes.

By writing about what I read and think about, there is a great risk that people will expect me to be changed by the experience! I did not do myself any favors when I chose a name for my blog, because I create an expectation about both knowing and doing. I certainly hope that I am changed by my experience reading and writing, but I know that often I have not changed, at least sufficiently. I still give lame assignments. I'm not that much better as a teacher at helping students learn more effectively. My shortcoming is all the more obvious when students and former students read my blog and are able to compare their experiences in my classes with my aspirations.

This is actually more a source of guilt for me than being thought not to have read a book. I know I am not as good as all the things I've read might lead one to believe, or even as good as what I've written (which sets a much lower bar!). If I am not growing, what is the point?

Of course, I probably am changing, in small increments beneath the scale of my perception. At least I hope so. Bayard doesn't say this in so many words, but I think it is implicit in how he approaches reading and not reading. For him, there is no distinction between the two:

We do not retain in memory complete books identical to the books remembered by everyone else, but rather fragments surviving from partial readings, frequently fused together and recast by our private fantasies.

This is a central theme for Bayard, and for me as well. He talks at length about the different ways our inner conception of books and libraries affects the act of reading any book. I often wonder how much of what I report about a book or paper I read is a product of me imposing my own view on what the writer has said -- not what is there, not what the author intended, but something that distorts the writer's meaning. How much of what I am writing about Bayard's book reflects accurately the book he wrote?

Bayard would be unconcerned. On his view, I could no more avoid imposing my inner book on the outer one than I could be Bayard himself. No one can avoid distortion (in an objectivist sense) or imposition of self (in a subjectivist sense). What distinguishes the master is how forcefully and creatively one does so. Private fantasy, indeed!

To conceive of reading as loss ... rather than as gain is a psychological resource essential to anyone seeking effective strategies for surviving awkward literary confrontations.

Can I admit that I have forgotten something I've read or written? Certainly; I do it frequently. The key is to talk about the work as who I am in the present. I don't even need to explicitly acknowledge the loss, because the loss is a given. But I must talk without embarrassment and without any pretension that I remember exactly what I said or thought then. The human mind works in a certain way, and I must accept that state of affairs and get down to the business of learning -- and creating -- today.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

June 04, 2008 2:07 PM

Not Reading Books

I have another million to my credit, and it was a marvelous little surprise.

Popular culture is full of all sorts of literary references with which you and I are supposed to be familiar. Every year brings another one or two. The Paradox of Choice. The Tipping Point. The Wisdom of Crowds. Well-read people are expected, well, to have read the books, too. How else can we expect to keep up with our friends when they discuss these books, or to use the central wisdoms they contain in knowing ways?

I have a confession. I have read only two or three chapters of The Wisdom of Crowds. I have read only an excerpt from The Tipping Point that appeared in the New Yorker or some other literary magazine. And while I've seen a Google talk by Barry Schwartz on-line, I may not have read anything more than a short precis of the work. Of course, I have learned a lot about them from my friends, and by reading about them in various other contexts. But, strictly speaking, I have not read any of them.

To be honest, I feel no shame about this state of affairs. There are so, so many books to read, and these just have not seemed important enough to displace others from my list. And in the case of The Wisdom of Crowds, I found that one or two chapters told me pretty much all I needed to understand the Big Idea it contained. Much as Seth Godin has said about many popular business books, many books in the popular canon can be boiled down to much shorter works in their essence, with the rest being there for elaboration or academic gravitas.

cover of How to Talk About Books You Haven't Read

For airplane reading on my trip to the workshop at Google, I took Pierre Bayard's How to Talk About Books You Haven't Read. Bayard's thesis is that neither I nor anyone else should feel shame about not having read any given book, even if we feel a strong compulsion to comment, speak, or write about it. In not reading and talking anyway, we are part of a grand intellectual tradition and are, in fact, acting out of necessity. There are simply too many books to read.

This problem arises even in the most narrow technical situation. When I wrote my doctoral dissertation, I surely cited works with which I was familiar but which I had not "read", or, having read them, had only skimmed them for specific details. I recall feeling a little bit uneasy; what if some part of the book or dissertation that I had not studied deeply said something surprising or wrong? But I knew a lot about these works in context: from other people's analyses, from other works by the same author, and even from having discussed the work with the author him- or herself. But in an important way, I was talking about a work I "had not read".

How I could cite the work anyway and still feel I was being intellectually honest gets to one of the central themes of Bayard's book: the relationships between ideas are often more important than the ideas themselves. To understand a work in the context of the universal library means more than just to know the details of the work, and the details themselves are often so affected by conditions outside of the text that they are less reliable than the bigger picture anyway.

First, let me assure you. Bayard wrote this book with a wink in his eye. At times, he speaks with a cavalier sarcasm. He also repeats himself in places; occasional paragraphs sound as if they have been lifted verbatim from previous chapters.

Second, this book fits Seth Godin's analysis of popular business books pretty well. Two or three chapters were sufficient to express the basic idea of this book. But such a slim product would have missed something important. How to Talk About Books You Haven't Read started as a joke, perhaps over small talk at a cocktail party, but as Bayard expanded on the idea he ended up with an irreverent take on reading, thinking, and understanding that carries a lot more truth than I might first have imagined. Readers of this blog who are software patterns aficionados might think of Big Ball of Mud in order to understand just what I mean: antipattern as pattern, when looked at from a different angle.

This book covers a variety of books that deal in some way with not reading books but talking about them. Along the way, Bayard explores an even wider variety of ideas. Many of these sound silly, even wrong, at first, and he uses this to weave a lit-crit tale that is perfect parody. But as I read, I kept saying, "Yeah, but... in a way, this really is true."

For example, Bayard posits that reading too much can cause someone to lose perspective in the world of ideas and to lose one's originality. In a certain way, the reader subordinates himself to the writer, and so reading too much means always subordinating to another rather than creating ideas oneself. We could read this as encouragement not to read (much), which would miss his joke. But there is another level at which he is dead-on right. I knew quite a few graduate students who learned this firsthand when they got into a master's program and found that they preferred to immerse themselves in the research of others than to do creative research of their own. And there are many blogs which do a fine job reporting on other people's work but which never seem to offer much new. (I struggle with that danger each time I write in this blog.)

Not reading does not mean that we cannot have an opinion. My friends and I are examples of this. Students are notorious for this, and Bayard, a lit professor, discusses the case of students in class at some length. But I was most taken by his discussion of Laura Bohannan's experience telling the story of Hamlet to the Tiv people of West Africa. As she told the story, the Tiv elders interpreted the story for her, correcting her -- and Western culture, and Shakespeare -- along the way. One of the interpretations was a heterodoxy that has a small but significant following among Shakespeare scholars. The chief even ventured to predict how the story ended, and did a pretty good job. Bayard used this as evidence that not reading a book may actually leave our eyes open to new possibilities. Bohannan's story is available on-line, and you really should read it -- it is delightful.

Bayard talks about so many different angles on our relationship with books and stories about them, including

  • how we as listeners tend to defer to a speaker, and thus allow a non-reader to successfully talk about a book to us, and
  • how some readers are masters at imposing their own views on even books they have read, which should give us all license to impose ourselves on the books we haven't read.

One chapter focuses on our encounters with writers, and the ticklish situations they create for the non-reader and for the writer. In another, Bayard deals with the relationship among professors, students, and books. It made me think about how students interpret the things I say in class, whether about our readings or the technical material we are learning. Both of these chapters figure in a second entry I'm writing about this book, as well as chapters on the works of Montaigne and Wilde.

One chapter uses as evidence the campus novels of David Lodge, of whom I am a big fan. I've never blogged about them, but I did use the cover of one of his books to illustrate a blog entry. Yet another draws on Bill Murray's classic Groundhog Day, an odd twist in which actually reading books enters into Bayard's story and supports his thesis. I have recommended this film before and gladly do so again.

As in so many situations, our fear of appearing weak or unknowledgable is what prevents us from talking freely about a book we haven't read, or even to admit that we have not read it. But this same fear is also responsible for discouraging us from talking about books we have read and about ideas we have considered. This is ultimately the culprit that Bayard hopes to undermine:

But our anxiety in the face of the Other's knowledge is an obstacle to all genuine creativity about books. The idea that the Other has read everything, and thus is better informed than us, reduces creativity to a mere stopgap that non-readers might resort to in a pinch. In truth, readers and non-readers alike are caught up in an endless process of inventing books, whether we like it or not, and the real question is not how to escape that process, but how to increase its dynamism and its range.

Bayard's book is worth reading just for his excerpts of other books and for his pointer to the story about the Tiv. I should probably feel guilt at not having read this book yet when so many others have, but I'm just happy to have read it now.

While reading on the plane coming home, I glanced across the aisle and noticed another passenger reading Larry Niven's Ringworld. I smiled and thought "FB++", in Bayard's rating system. I could talk about it nonetheless.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

May 31, 2008 1:21 AM

Google Impressions

I have already mentioned a couple of my first impressions of being a guest on the Google campus:

  • good, plentiful, and diverse food and drink for employees and guests alike, and
  • good, plentiful, and diverse power cables built right into the meeting tables.

Here are a few other things I noticed.

Calling it the "Google campus" is just right. It looks and feels like a college campus. Dining service, gym facilities, a small goodies store, laundry, sand volleyball courts... and lots of employees who look college-aged because they recently were.

Everywhere we walked outdoors, we saw numerous blue bicycles. They are free for the use of employees, presumably to move between buildings. But there appeared to be bike trails across the road where the bikes could be used for recreation, too.

The quad area between Buildings 40 and 43 had a dinosaur skeleton with pink flamingos in its mouth. Either someone forgot to tell the dinosaur "don't be evil", or the dinosaur has volunteered to serve as aviary for kitsch.

The same area included a neat little vegetable garden. How's that for eating local? (Maybe the dinosaur just wanted to fit in.)

As we entered Building 43 for breakfast, we were greeted with a rolling display of search terms that Google was processing, presumably in real time. I wondered if we were seeing a filtered list, but we did see a "paris hilton" in there somewhere.

The dining rooms served Google-branded ice cream sandwiches, IT's IT, "a San Francisco tradition since 1928". In typical Google fashion, the tasty treat (I verified its tastiness empirically with a trial of size N=2) has been improved, into "a natural, locally sourced, trans-fat-free rendition of their excellent treat". So there.

I don't usually comment on my experience in the restroom, but... The men's rooms at Google do more than simply provide relief; they also provide opportunities for professional development. Testing on the Toilet consists of flyers over the urinal with stories and questions about software testing. (But what's a "C.L.", as in "one conceptual change per C.L."?) I cannot confirm that female engineers at Google have the same opportunities to learn while taking the requisite breaks from their work.

I earlier commented that we visitors had to stay within sight of a Google employee. After a few more hours on campus, it became clear that security is a major industry at Google. Security guards were everywhere. My fellow guests and I couldn't decide whether they were guarding against intellectual property theft by brazen Microsoft or Yahoo! employees or souvenir theft by Google groupies. But I did decide that the Google security force far outnumbers the police force in my metro area.

All in all, an interesting and enjoyable experience.


Posted by Eugene Wallingford | Permalink | Categories: General

May 31, 2008 12:33 AM

K-12 Road Show Summit, Day Two

The second half of the workshop opened with one of the best sessions of the event, the presentation "What Research Tells Us About Best Practices for Recruiting Girls into Computing" by Lecia Barker, a senior research scientist at the National Center for Women and IT. This was great stuff, empirical data on what girls and boys think and prefer. I'll be spending some time looking into Barker's summary and citations later. Some of the items she suggested confirm common sense, such as not implying that you need to be a genius to succeed in computing; you only need to be capable, like anything else. I wonder if we realize how often our actions and examples implicitly say "CS is difficult" to interested young people. We can also use implicit cues to connect with the interests of our audience, such as applications that involve animals or the health sciences, or images of women serving in positions of leadership.

Other suggestions were newer to me. For example, evidence shows that Latina girls differ more from white and African-American girls than white and African-American girls differ from each other. This is good to know for my school, which is in the Iowa metro area with the highest percentage of African-Americans and a burgeoning Latina population. She also suggested that middle-school girls and high-school girls have different interests and preferences, so outreach activities should be tailored to the audience. We need to appeal to girls now, not to who they will be in three years. We want them to be making choices now that lead to a career path.

A second Five-Minute Madness session had less new information for me. I thought most about funding for outreach activities, such as ongoing support for an undergraduate outreach assistant whom we have hired for next year using a one-time grant from the university's co-op office. I had never considered applying for a special projects grant from the ACM for outreach, and the idea of applying to Avon was even more shocking!

The last two sessions were aimed at helping people get a start on designing an outreach project. First, the whole group brainstormed ideas for target audiences and goals, and then the newbies in the room designed a few slides for an outreach presentation with guidance from the more experienced people. Second, the two groups split, with the newbies working more on design and the experienced folks discussing the biggest challenges they face and ways to overcome them.

These sessions again made clear that I need to "think bigger". One, outreach need not aim only at schools; we can engage kids through libraries, 4-H (which has broadened its mission to include technology teams), the FFA, Boys and Girls Clubs, and the YMCA and YWCA. Some schools report interesting results from working with minority girls through mother/daughter groups at community centers. Sometimes, the daughters end up encouraging the moms to think bigger themselves and seek education for more challenging and interesting careers. Two, we have a lot more support from upper administration and from CS faculty at my school than most outreach groups have at their schools. This means that we could be more aggressive in our efforts. I think we will next year.

The workshop ended with a presentation by Gabe Cohen, the project manager for Google Apps. This was the only sales pitch we received from Google in the time we were here (other than being treated and fed well), and it lasted only fifteen minutes. Cohen showed a couple of new-ish features of the free Apps suite, including spreadsheets with built-in support for web-based form input. He closed hurriedly with a spin through the new AppEngine, which debuted to the public on Wednesday. It looks cool, but do I have time?

The workshop was well-done and worth the trip. The main point I take away is to be more aggressive on several fronts, especially in seeking funding opportunities. Several companies we work with have funded outreach activities at other schools, and our state legislative and executive branches have begun to take this issue seriously from the standpoint of economic development. I also need to find ways to leverage faculty interest in doing outreach and interest from our administration in both STEM education initiatives and community service and outreach.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

May 30, 2008 7:23 PM

K-12 Road Show Summit, Day One

The workshop has ended. Google was a great host, from beginning to end. They began offering food and drinks almost immediately, and we never hungered or thirsted for long. That part of the trip made Google feel like the young person's haven it is. Wherever we went, the meeting tables included recessed power and ethernet cables for every kind of laptop imaginable, including my new Mac. (Macbook Pros were everywhere we went at Google.) But we also learned right away that visitors must stay within bounds. No wandering around was allowed; we had to remain within sight of a Googler. And we were told not to take any photos on the grounds or in the buildings.

The workshop was presented live from within Google Docs, which allowed the leaders and presenters to display from a common tool and to add content as we went along. The participants didn't have access to the doc, but we were given it as a PDF file -- on the smallest flash drive I've ever owned. It's a 1GB stick with the dimensions of the delete key on my laptop (including height).

The introduction to the workshop consisted of a linked-list game in which each person introduced the person to his left, followed by remarks from Maggie Johnson, the Learning and Development Director at Google Engineering, and Chris Stephenson, the executive director of ACM's Computer Science Teachers Association. The game ran a bit long, but it let everyone see how many different kinds of people were in the room, including a lot of non-CS faculty who lead outreach activities for some of the bigger CS departments. Chris expressed happiness that K-12, community colleges, and universities were beginning to work together on the CS pipeline. Outreach is necessary, but it can also be joyful. (This brought to mind her panel statement at SIGCSE, in a session I still haven't written up...)

Next up was Liz Adams reporting on her survey of people and places who are doing road shows or thinking about it. She has amassed a lot of raw data, which is probably most useful as a source of ideas. During her talk, someone asked: does anyone know if what they are doing is working? This led to a good discussion of assessment and just what you can learn. The goals of these road shows are many. When we meet with students, are we recruiting for our own school? Or are we trying to recruit for the discipline, getting more kids to consider CS as a possible major? Are we working to reach more girls and underrepresented groups, or do we seek a rising tide? Perhaps we are doing service for the economy of our community, region, or state? The general answer is 'yes' to all of these things, which makes measuring success all the more difficult. While it's comforting to shoot wide, this may not be the most effective strategy for achieving any goal at all!

One idea I took away from this session was to ask students to complete a short post-event evaluation. I view most of our outreach activities these days as efforts to broaden interest in computer science generally, and to broaden students' views of the usefulness and attractiveness of computing even more generally. So I'd like to ask students about their perceptions of computing after we work with them. Comparing these answers to ones gathered before the activity would be even better. My department already asks students declaring CS majors to complete a short survey, and I plan to ensure it includes a question that will allow us to see whether our outreach activities have had any effect on the new students we see.

Then came a session called Five-Minute Madness, in which three people from existing outreach programs answered several questions in round-robin fashion, spending five minutes altogether on each. I heard a few useful nuggets here:

  • Simply asking a young student "What will you be when you grow up?" and then talking about what we do can be a powerful motivator for some kids.

  • Guidance counselors in the high schools are seriously misinformed about computing. No surprise there. But they often don't have access to the right information, or the time to look for it. The outreach program at UNC-Charlotte has produced a packet of information specifically for school counselors, and they visit with the counselors on any school visit they can.

  • Reaching the right teacher in a high school can be a challenge. It is hard to find "CS teachers" because so few states certify that specialty. Don't send letters addressed to the "computing teacher"; they will probably end up in the trash can!

  • We have to be creative in talking to people in the Department of Education, as well as making sure we time our mailings and offerings carefully. Know the state's rules about curriculum, testing, and the like.

  • It's a relationship. Treat initial contacts with a teacher like a first date. Take the time to connect with the teacher and cultivate something that can last. One panelist said, "We HS teachers need a little romance." If we do things right, these teachers can become our biggest advocates and do a lot of "recruiting" for us through their everyday, long-term relationship with the students.

Dinner in one of the Google cafeterias was just like dinner in one of my university's residence halls, only with more diverse fare. A remarkable number of employees were there. Ah, to be young again.

Our first day closed with people from five existing programs telling us about their road shows. My main thought throughout this session was that these people spend a lot of time talking to -- at -- the kids. I wonder how effective this is with high school students and imagine that as the audience gets younger, this approach becomes even less effective. That said, I saw a lot of good slides with information that we can use to do some things. The presenters have developed a lot of good material.

Off to bed. Traveling west makes for long, productive days, but it also makes me ready to sleep!


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

May 23, 2008 3:17 PM

The Split Mind of a Scientist

Someone pointed me toward a video of a talk given at Google by John Medina on his new book Brain Rules. I enjoyed the talk and will have to track down a copy of the book. Early on, he explains that the way we have designed our schools and workplaces produces the worst possible environments in which to learn and work. But my favorite passage came near the end, in response to the question, "Do you believe in magic?"

Hopefully I'm a nice guy, but I'm a really grumpy scientist, and in the end, I'm a reductionist. So if you can show me, [I'll believe it]. As a scientist, I have to be grumpy about everything and be able to be willing to believe anything. ... If you care what you believe, you should never be in the investigative fields -- ever. You can't care what you believe; you just have to care what's out there. And when you do that, your bandwidth is as wide as that sounds, and the rigor ... has to be as narrow as the biggest bigot you've ever seen. Both are resident in a scientist's mind at the same time.

Yes. Unfortunately, public discourse seems to include an unusually high number of scientists who are very good at the "being grumpy about everything" part and not so good at the "being able to be willing to believe anything" part. Notice that Medina said "be able to be willing to believe", not "be willing to believe". I think that some people are less able to be willing to believe something they don't already believe, which makes them not especially good candidates to be scientists.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

May 20, 2008 12:47 PM

Cognitive Surplus and the Future of Programming

the sitcom All in the Family

I grew up on the sitcoms of the 1970s and 1980s. As kids, we watched almost everything in reruns, whether from the '60s or the '70s, but I enjoyed so many of them. By the time I got to college, I had well-thought-out ideas on why The Dick Van Dyke Show remains one of the best sitcoms ever, why WKRP in Cincinnati was underrated for its quality, and why All in the Family was _the_ best sitcom ever. I still hold all these biases in my heart. Of course, I didn't limit myself to sitcoms; I also loved light-action dramas, especially The Rockford Files.

Little did I know then that my TV viewing was soaking up a cognitive surplus in a time of social transition, or that it had anything in common with gin pushcarts in the streets of London at the onset of the Industrial Revolution.

Clay Shirky has published a wonderful little essay, Gin, Television, and Social Surplus, that taught me these things and put much of what we see happening on the web into the context of a changing social, cultural, and economic order. Shirky contends that, as our economy and technology evolve, a "cognitive surplus" is created. Energy that used to be spent on activities required in the old way is now freed for other purposes. But society doesn't know what to do with this surplus immediately, and so there is a transition period during which the surplus is dissipated in (we hope) harmless ways.

My generation, and perhaps my parents', was part of this transition. We consumed media content produced by others. Some denigrate that era as one of mindless consumption, but I think we should not be so harsh. Shows like All in the Family and, yes, WKRP in Cincinnati often tackled issues on the fault lines of our culture and gave people a different way to be exposed to new ideas. Even more frivolous shows such as The Dick Van Dyke Show and The Rockford Files helped people relax and enjoy, and this was especially useful for those who were unprepared for the expectations of a new world.

We are now seeing the advent of the new order in which people are not relegated to consuming from the media channels of others but are empowered to create and share their own content. Much attention is given by Shirky and many, many others to the traditional media such as audio and video, and these are surely where the new generation has had its first great opportunities to shape its world. As Shirky says:

Here's something four-year-olds know: A screen that ships without a mouse ships broken. Here's something four-year-olds know: Media that's targeted at you but doesn't include you may not be worth sitting still for.

But as I've been writing about here, let's not forget the next step: the power to create and shape the media themselves via programming. When people can write programs, they are not relegated even to using the media they have been given but are empowered to create new media, and thus to express and share ideas that may otherwise have been limited to the abstraction of words. Flickr and YouTube didn't drop from the sky; people with ideas created new channels of dissemination. The same is true of tools like Photoshop and technologies such as wikis: they are ideas turned into reality through code.

Do read Shirky's article, if you haven't already. It has me thinking about the challenge we academics face in reaching this new generation and engaging them in the power that is now available to them. Until we understand this world better, I think that we will do well to offer young people lots of options -- different ways to connect, and different paths to follow into futures that they are creating.

One thing we can learn from the democratized landscape of the web, I think, is that we are not offering one audience many choices; we are offering many audiences the one or two choices each that they need to get on board. We can do this through programming courses aimed at different audiences and through interdisciplinary major and minor programs that embed the power of computing in the context of problems and issues that matter to our students.

Let's keep around the good old CS majors as well, for those students who want to go deep creating the technology that others are using to create media and content -- just as we can use the new technologies and media channels to keep great old sitcoms available for geezers like me.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

May 16, 2008 2:55 PM

The Seductiveness of Job Security

A former student recently mentioned a tough choice he faces. He has a great job at a Big Company here in the Midwest. The company loves him and wants him to stay for the long term. He likes the job, the company, and the community in which he lives. But this isn't the sort of job he originally had hoped for upon graduation.

Now a position of just the sort he was originally looking for is available to him in a sunny paradise. He says, "I have quite a decision to make.... it's hard to convince myself to leave the secure confines of [Big Company]. Now I see why their turnover rate is so low."

I had a hard time offering any advice. When I was growing up, my dad worked for Ford Motor Company in an assembly plant, and he faced insecurity about the continuance of his job several times. I don't know how much this experience affected my outlook on jobs, but in any case my personality is one that tends to value security over big risk/big gain opportunities.

Now I hold a job with greater job security than anyone who works for a big corporation. An older colleague is fond of saying, "Real men don't accept tenure." I first heard him say that when I was in grad school, and I remember not getting it at all. What's not to like about tenure?

After a decade with tenure, I understand better now what he means. I always thought that the security provided by having tenure would promote taking risks, even if only of the intellectual sort. But too much security is just as likely to stunt growth and inhibit taking risks. I sometimes have to make a conscious effort to push myself out of my comfort zone. Intellectually, I feel free to try new things, but pushing myself out of a comfortable nest here into a new environment -- well, that's another matter. What are the opportunity costs in that?

I love what Paul Graham says about young CS students and grads having the ability to take entrepreneurial risk, and how taking those risks may well be the safer choice in the long run. It's kind of like investing in stocks instead of bonds, I think. I encourage all of my students to give entrepreneurship a thought, and I encourage even more the ones whom I think have a significant chance to do something big. There is probably a bit of wistfulness in my encouragement, not having done that myself, but I don't think I'm simply projecting my own feelings. I really do believe that taking some employment risk, especially while young, is good for many CS grads.

But when faced with a concrete case -- a particular student having to make a particular decision -- I don't feel quite so cocksure in saying "go for it with abandon". This is not abstract theory; his job and home and fiancee are all in play. He will have to make this decision on his own, and I'd hate to push him toward something that isn't right for him from my cushy, secure seat in the tower. I feel a need to stay abstract in my advice and leave him to sort things out. Fortunately, he is a bright, level-headed guy, and I'm sure he'll do fine whichever way he chooses. I wish him luck.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

May 15, 2008 4:30 PM

Being Part of a Group

Surround yourself with smart, competent people, and you will find ideas in the air. One of the compelling thoughts in the Gladwell article that prompted this entry is this:

A scientific genius is not a person who does what no one else can do; he or she is someone who does what it takes many others to do.

For those of us who are not geniuses, the lesson is that we can still accomplish great things -- if we take part in the right sort of collaboration and remain curious, inquisitive, and open to big ideas. I think this applies not only to inventions but also to ideas for start-ups and insights for class projects.

(So go to class. You'll find people there.)

But being in a group is not a path to easy accomplishment, as people who have tried to write a book in a group know:

Talking about a "group-book" is a lot of fun. Actually putting one together, maybe less fun.

The ongoing ChiliPLoP working group of which I am a member is another datapoint for Mitzenmacher's claim. Doing more than brainstorming ideas in a group takes all the same effort, coordination, and individual and collective responsibility as any other sort of work.

(As an aside, I love Stigler's Law as quoted in the Gladwell article linked above! Self-reference can be a joy, especially with the twist engendered by this one.)


Posted by Eugene Wallingford | Permalink | Categories: General

May 13, 2008 9:15 AM

Solid and Relevant

I notice a common rhetorical device in many academic arguments. It goes like this. One person makes a claim and offers some evidence. Often, the claim involves doing something new or seeing something in a new way. The next person rebuts the argument with a claim that the old way of doing or seeing things is more "fundamental" -- it is the foundation on which other ways of doing and seeing are built. Oftentimes, the rebuttal comes with no particular supporting evidence, with the claimant relying on many in the discussion to accept the claim prima facie. We might call this The Fundamental Imperative.

This device is standard issue in the CS curriculum discussions about object-oriented programming and structured programming in first-year courses. I recently noticed its use on the SIGCSE mailing list, in a discussion of what mathematics courses should be required as part of a CS major. After several folks observed that calculus was being de-emphasized in some CS majors, in favor of more discrete mathematics, one frequent poster declared:

(In a word, computer science is no longer to be considered a hard science.)

If we know [the applicants'] school well we may decide to treat them as having solid and relevant math backgrounds, but we will no longer automatically make that assumption.

Often, the conversation ends there; folks don't want to argue against what is accepted as basic, fundamental, good, and true. But someone in this thread had the courage to call out the emperor:

If you want good physicists, then hire people who have calculus. If you want good computer scientists, then hire people who have discrete structures, theory of computation, and program verification.

I don't believe that people who are doing computer science are not doing "hard science" just because it is not physics. The world is bigger than that.

...

You say "solid and relevant" when you really should be saying "relevant". The math that CS majors take is solid. It may not be immediately relevant to problems [at your company]. That doesn't mean it is not "solid" or "hard science".

I sent this poster a private "thank you". For some reason, people who drop The Fundamental Imperative into an argument seem to think that it is true absolutely, regardless of context. Sure, there may be students who would benefit from learning to program using a "back to the basics" approach, and there may be CS students for whom calculus will be an essential skill in their professional toolkits. But that's probably not true of all students, and it may well be that the world has changed enough that most students would benefit from different preparation.

"The Fundamental Imperative" is a nice formal name for this technique, but I tend to think of it as "if it was good enough for me...", because so often it comes down to old fogies like me projecting our experience onto the future. Both parties in such discussions would do well not to fall victim to their own storytelling.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

May 12, 2008 12:24 PM

Narrative Fallacy on My Mind

In his recent bestseller The Black Swan: The Impact of the Highly Improbable, Nassim Nicholas Taleb uses the term narrative fallacy to describe man's penchant for creating a story after the fact, perhaps subconsciously, in order to explain why something happened -- to impute a cause for an event we did not expect. This fallacy derives from our habit of imposing patterns on data. Many view this as a weakness, but I think it is a strength as well. It is good when we use it to communicate ideas and to push us into backing up our stories with empirical investigation. It is bad when we let our stories become unexamined truth and when we use the stories to take actions that are not warranted or well-founded.

Of late, I've been thinking of the narrative fallacy in its broadest sense, telling ourselves stories that justify what we see or want to see. My entry on a response to the Onward! submission by my ChiliPLoP group was one trigger. Those of us who believe strongly that we could and perhaps should be doing something different in computer science education construct stories about what is wrong and what could be better; we're like anyone else. That one OOPSLA reviewer shed a critical light on our story, questioning its foundation. That is good! It forces us to re-examine our story, to consider to what extent it is narrative fallacy and to what extent it matches reality. In the best case, we now know more about how to tell the story better and what evidence might be useful in persuading others. In the worst, we may learn that our story is a crock. But that's a pretty good worst case, because it gets us back on the path to truth, if indeed we have fallen off.

A second trigger was finding a reference in Mark Guzdial's blog to a short piece on universal programming literacy at Ken Perlin's blog. "Universal programming literacy" is Perlin's term for something I've discussed here occasionally over the last year, the idea that all people might want or need to write computer programs. Perlin agrees but uses this article to consider whether it's a good idea to pursue the possibility that all children learn to program. It's wise to consider the soundness of your own ideas every once in a while. While Perlin may not be able to construct as challenging a counterargument as our OOPSLA reviewer did, he at least is able to begin exploring the truth of his axioms and the soundness of his own arguments. And the beauty of blogging is that readers can comment, which opens the door to other thinkers who might not be entirely sympathetic to the arguments. (I know...)

It is essential to expose our ideas to the light of scrutiny. It is perhaps even more important to expose the stories we construct subconsciously to explain the world around us, because they are most prone to being self-serving or simply convenient screens to protect our psyches. Once we have exposed the story, we must adopt a stance of skepticism and really listen to what we hear. This is the mindset of the scientist, but it can be hard to take on when our cherished beliefs are on the line.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Patterns

April 24, 2008 6:56 AM

On the Small Doses Pattern

The Small Doses pattern I wrote up in my previous entry was triggered almost exclusively by the story I heard from Carl Page. The trigger lives on in the text that runs from "Often times, the value of Small Doses..." to the end, and in the paragraph beginning "There is value in distributing...". The story was light and humorous, just the sort of story that will stick with a person for twenty or more years.

As I finally wrote the pattern, it grew. That happens all the time when I write. It grew both in size and in seriousness. At first I resisted getting too serious, but increasingly I realized that the more serious kernel of truth needed telling. So I gave it a shot.

The result of this change in tone and scope means that the pattern you read is not yet ready for prime time. Rather than wait until it was ready, though, I decided to let the pattern be a self-illustration. I have put it out now, in its rough form. It is rough both in completeness and in quality. Perhaps my readers will help me improve. Perhaps I will have time and inspiration soon to tackle the next version.

In my fantasies, I have time to write more patterns in a Graduate Student pattern language (code name: Chrysalis), even a complete language, and cross-reference it with other pattern languages such as XP. Fantasies are what they are.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

April 16, 2008 4:07 PM

Right On Time

Last night, I attended a Billy Joel concert. I last saw him perform live a decade or so ago. Billy was a decade older, and I was a decade older. He looked it, and I'm sure I do, too.

But when he started to play the piano, it could have been 1998 in the arena. Or 1988. Or 1978. The music flowing from his hands and his dancing feet filled me. Throughout the night I was 19 again, then 14, 10, and 25. I was lying on my parents' living room floor; sitting in the hand-me-down recliner that filled my college dorm room; dancing in Market Square Arena with an old girlfriend. I was rebellious teen, wistful adult, and mesmerized child.

There are moments when time seems more illusion than reality. Last night I felt like Billy Pilgrim, living two-plus hours unstuck in time.

Oh, and the music. There are not many artists who can, in the course of an evening, give you so many different kinds of music. From the pounding rock of "You May Be Right" to the gentle, plaintive "She's Always A Woman", and everything between. The Latin rhythms of "Don't Ask Me Why" extended with an intro of Beethoven's "Ode to Joy", and a "Root Beer Rag" worthy of Joplin.

Last night, my daughters aged 15 and 11 attended the concert with me. Music lives on, and time folds back on itself yet again.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

March 13, 2008 8:06 PM

SIGCSE Day 1 -- This and That

[A transcript of the SIGCSE 2008 conference: Table of Contents]

This sort of entry usually comes after I write up the various conference sessions and have leftovers that didn't quite fit in an article. That may still happen, but I already have some sense of what will go where and have these items as miscellaneous observations.

First of all, I tried an experiment today. I did not blog in real-time. I used -- gasp! -- the antiquated technology of pen and paper to take notes during the sessions. On one or two occasions, I whipped open the laptop to do a quick Google search for a PhD dissertation or a book, but I steadfastly held back from the urge to type. I took notes on paper, but I couldn't fall into "writing" -- crafting sentences, then forming paragraphs, editing, ... All I could do was jot, and because I write slowly I had to be pickier about what I recorded. One result is that I paid more attention to the speakers, and less to a train of thought in my head. Another is that I'll have to write up the blog posts off-line, and that will take time!

As I looked through the conference program last night, I found myself putting on my department head hat, looking for sessions that would serve my department in the roles I now find myself in more often: CS1 for scientists, educational policy in CS, and the like. But when I got to the site and found myself having to choose between Door A and Door B... I found myself drifting into the room where Stuart Reges was talking about a cool question that seems to pick out good CS students, and into the Nifty Assignments session. Whatever my job title may be, I am a programmer and CS teacher. (More on both of those sessions in coming entries...)

Now, for a couple of non-CS, non-teaching observations.

  • I am amazed at how many healthy adults will walk out of their way, past a perfectly good set of stairs, to ride up an escalator. Sigh.
  • Staying at the discount motel over three miles away and without a car, I am relying on public buses. I have quickly learned that bus schedules are suggestions, not contracts. Deal with it. And in Portland, that often means: deal with it in the rain.
  • Schoolchildren here take standard mass transit buses to school. I never knew such a place existed.

There is so much for me to learn.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

March 12, 2008 4:32 PM

On the Roads Back in Portland

SIGCSE 2008 logo

With the exception of my annual visit to Carefree for ChiliPLoP, I don't often get a chance to return to a city for another conference. This year brings a pleasant return to Portland for SIGCSE 2008. OOPSLA'06 was in Portland, and I wrote up a little bit about running in Portland as part of my first visit to town. Because I was on the conference planning committee that year, I made three trips to the city, stayed in the same hotel three times, and ran several of the same routes three times. The convention center is right in town, which makes it hard to get to any nice parks to run, but Portland has a 3-mile loop alongside the Willamette River that provides a decent run.

This time, I am on my own dime and trying to save a little money by staying at a budget motel about 3.5 miles from the convention center. That meant figuring out bus routes and bus stops for the ride between the two -- no small feat for a guy who has never lived in a place where public transportation is common! It also meant planning some new runs, including a route back to the waterfront.

I arrived in town early enough yesterday to figure out the buses (I think) and still have time for an exploratory run. I ran toward the river, and then toward the convention center, until I knew the lay of the land well enough. The result was 4.5 miles of urban running in neighborhoods I'd never seen. This morning, I used what I learned to get to the river, where I ran my first lap through the Governor Tom McCall Waterfront Park and the Eastbank Esplanade since October 2006. I ended up with about 8 miles under my belt, and a strong desire to return Saturday evening for three laps and a 14-miler -- which would be my longest run since the Marine Corps Marathon. Let's see how I feel in a couple of days...

The rest of this week I am at SIGCSE, and I'm looking forward to seeing old friends and colleagues and to talking CS for a few days. Then on Sunday, four of us fly to Phoenix for ChiliPLoP and some intense work. This is a long time to be away from home and to miss my family, but the ideas should keep me busy.


Posted by Eugene Wallingford | Permalink | Categories: General, Running

February 28, 2008 7:10 PM

The Complement of Schadenfreude

Does it have a name?

Of course, Schadenfreude itself doesn't really have a name in English. It is a German word that means roughly delight in another person's misfortune. (However, I see that Wikipedia offers one, the 300+-year-old, apparently abandoned "epicaricacy".)

Last semester, a colleague described what struck me as the complement of Schadenfreude. He reported that one of our close friends, a retired professor here, expressed a strong distaste for faculty who succeed in publishing academic papers. This matters to my colleague because he is one of those publishing faculty. Our friend came to the university in a different era, when we were a teacher's college without any pretension to being a comprehensive university. The new faculty who publish and talk about their research, she said, are "just showing off". Their success caused her pain, even if they didn't brag about it.

This is not the opposite of Schadenfreude. That is happiness in another's good fortune, which Wikipedia tells us matches the Buddhist concept of mudita. What our friend feels inverts both the emotion and the trigger.

I don't think that her condition corresponds to envy. When someone is envious, they want what someone else has. Our friend doesn't want what the others have; she is saddened, almost angered, that others have it. No one should.

The closest concept I can think of is "sour grapes", a metaphor from one of Aesop's beloved fables. But in this story, the fox does want the grapes, and professes to despise them only when he can't reach them. I believe that our friend really doesn't want the success of research; she earnestly believes that our mission is to teach, not publish, and that energy spent doing research is energy misspent. And that makes her feel bad.

When my colleague told me his story, I joked that the name for this condition should be freudenschade. I proposed this even though I know a little German and know how non-sensical it is. But it seemed fun. Sadly, I wasn't the first person to coin the word... Google tells me that at least one other person has. You may be tempted to say that I feel freudenschade that someone else coined the term "freudenschade" first, but I don't. What I feel is envy!

The particular story that led to my discussion is almost beside the point. I'm on a mission that has moved beyond it. I am not aware of a German word for the complement of Schadenfreude. Nor am I aware of an English word for it. Is there a word for it anywhere, in English, German, or some other language?

I'm curious... Perhaps the Lazyweb can help me.


Posted by Eugene Wallingford | Permalink | Categories: General

February 24, 2008 12:48 PM

Getting Lost

While catching up on some work at the office yesterday -- a rare Saturday indeed -- I listened to Peter Turchi's OOPSLA 2007 keynote address, available from the conference podcast page. Turchi is a writer with whom conference chair Richard Gabriel studied while pursuing his MFA at Warren Wilson College. I would not put this talk in the same class as Robert Hass's OOPSLA 2005 keynote, but perhaps that has more to do with my listening to an audio recording of it and not being there in the moment. Still, I found it to be worth listening to, as Turchi encouraged us to "get lost" when we want to create. We usually think of getting lost as something that happens to us when we are trying to get somewhere else. That makes getting lost something we wish wouldn't happen at all. But when we get lost in a new land inside our minds, we discover something new that we could not have seen before, at least not in the same way.

As I listened, I heard three ideas that captured much of the essence of Turchi's keynote. First was that we should strive to avoid preconception. This can be tough to do, because ultimately it means that we must work without knowing what is good or bad! The notions of good and bad are themselves preconceptions. They are valuable to scientists and engineers as they polish up a solution, but they often are impediments to discovering or creating a solution in the first place.

Second was the warning that a failure to get lost is a failure of imagination. When we work deeply in an area for a while, we sometimes feel as if we can't see anything new and creative because we know and understand the landscape so well. We have become "experts", which isn't always as dandy a status as it may seem. It limits what we see. In such times, we need to step off the easy path and exercise our imaginations in a new way. What must I do in order to see something new?

This leads to the third theme I pulled from Turchi's talk: getting lost takes work and preparation. When we get stuck, we have to work to imagine our way out of the rut. For the creative person, though, it's about more than getting out of a rut. The creative person needs to get lost in a new place all the time, in order to see something new. For many of us, getting lost may seem like something that just happens, but the person who wants to be lost has to prepare to start.

Turchi mentioned Robert Louis Stevenson as someone with a particular appreciation for "the happy accident that planning can produce". But artists are not the only folks who benefit from these happy accidents or who should work to produce the conditions in which they can occur. Scientific research operates on a similar plane. I am reminded again of Robert Root-Bernstein's ideas for actively engaging the unexpected. Writers can't leave getting lost to chance, and neither can scientists.

Turchi comes from the world of writing, not the world of science. Do his ideas apply to the computer scientist's form of writing, programming? I think so. A couple of years ago, I described a structured form of getting lost called air-drop programming, which adventurous programmers use to learn a legacy code base. One can use the same idea to learn a new framework or API, or even to learn a new programming language. Cut all ties to the familiar, jump right in, and see what you learn!

What about teaching? Yes. A colleague stopped by my office late last week to describe a great day of class in which he had covered almost none of what he had planned. A student had asked a question whose answer led to another, and then another, and pretty soon the class was deep in a discussion that was as valuable as, or more valuable than, the planned activities. My colleague couldn't have planned this unexpectedly good discussion, but his and the class's work put them in a position where it could happen. Of course, unexpected exploration takes time... When will they cover all the material of the course? I suspect the students will be just fine as they make adjustments downstream this semester.

What about running? Well, of course. The topic of air-drop programming came up during a conversation about a general tourist pattern for learning a new town. Running in a new town is a great way to learn the lay of the land. Sometimes I have to work not to remember landmarks along the way, so that I can see new things on my way back to the hotel. As I wrote after a glorious morning run at ChiliPLoP three years ago, sometimes you run to get from Point A to Point B; sometimes, you should just run. That applies to your hometown, too. I once read about an elite women's runner who recommended being dropped off far from your usual running routes and working your way back home through unfamiliar streets and terrain. I've done something like this myself, though not often enough, and it is a great way to revitalize my running whenever the trails start to look like the same old same old.

It seems that getting lost is a universal pattern, which made it a perfect topic for an OOPSLA keynote talk.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Patterns, Running, Software Development, Teaching and Learning

February 20, 2008 2:55 PM

You Know You're Doing Important Work...

... when Charlie Eppes invokes your research area on Numb3rs. In the episode I saw last Friday, the team used a recommender system, among other snazzy techie glitz, to track down a Robin Hood who was robbing from the dishonestly rich and giving to the poor through a collection of charities. A colleague of mine does work in recommender systems and collaborative filtering, so I thought of him immediately. His kind of work has entered the vernacular now.

I don't recall the Numb3rs crew ever referring to knowledge-based systems or task-specific architectures, which was my area in the old days. Nor do I remember any references to design patterns or to programming language topics, which is where I have spent my time in the last decade or so. Should I feel left out?

But Charlie and Amita did use the idea of steganography in an episode two years ago, to find a pornographic image hidden inside an ordinary image. I have given talks on steganography on campus occasionally in the last couple of years. The first time was at a conference on camouflage, and most recently I spoke to a graphic design class, earlier this month. (My next engagement is at UNI's Saturday Science Showcase, a public outreach lecture series my college runs in the spring.) So I feel like at least some of my intellectual work has been validated.

Coincidentally, I usually bill my talks on this topic as "Numb3rs Meets The Da Vinci Code: Information Masquerading as Art", and one of the demonstrations I do is to hide an image of the Numb3rs guys in a digitized version of the Mona Lisa. The talk is a lot of fun for me, but I wonder if college kids these days pay much attention to network television, let alone da Vinci's art.
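The idea behind that demo is easy to sketch. The most common classroom version of image steganography is least-significant-bit (LSB) embedding: overwrite the lowest bit of each cover pixel with one bit of the hidden data, so that no pixel value changes by more than 1. What follows is an illustrative sketch, not the code from my talk; the hide/reveal names and the toy pixel list are invented for the example, and a real demo would pull pixel values from an imaging library.

```python
# A minimal sketch of least-significant-bit (LSB) steganography on
# grayscale pixel values (0-255). The toy pixel list stands in for
# real image data.

def hide(cover, secret_bytes):
    """Embed secret_bytes into the low bits of the cover pixels."""
    # Unpack the secret into bits, most significant bit of each byte first.
    bits = [(byte >> i) & 1 for byte in secret_bytes for i in range(7, -1, -1)]
    if len(bits) > len(cover):
        raise ValueError("cover image too small for this secret")
    stego = list(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit   # clear the low bit, set it to the secret bit
    return stego

def reveal(stego, n_bytes):
    """Read n_bytes back out of the low bits of the stego pixels."""
    out = []
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte = (byte << 1) | (stego[b * 8 + i] & 1)
        out.append(byte)
    return bytes(out)

cover = [200, 201, 198, 197, 203, 202, 199, 196] * 6   # 48 gray pixels
stego = hide(cover, b"Lisa")                            # 4 bytes = 32 bits
assert reveal(stego, 4) == b"Lisa"
assert all(abs(a - b) <= 1 for a, b in zip(cover, stego))
```

Because only the low bit of each pixel changes, the stego image is visually indistinguishable from the cover, yet the hidden bytes come back out cleanly by reading the low bits in order.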

Lest you think that only we nth-tier researchers care to have our areas trumpeted in the pop world, even the great ones can draw such pleasure. Last spring, Grady Booch gave a keynote address at SIGCSE. As a part of his opening, he played for us a clip from a TV show that had brightened his day, because it mentioned, among other snazzy techie glitz, the Unified Modeling Language he had helped to create. Oh, and that video clip came from... Numb3rs!


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

January 24, 2008 4:18 PM

The Stars Trust Me

My horoscope says so:

Thursday, January 24

Scorpio (October 24-November 22) -- You are smart enough to realize meeting force with force will only result in non-productive developments. To your credit, you will turn volatile matters around with wisdom, consideration, and gentleness.

Now, I may not really be smart enough, or wise enough, or even gentle enough. But on days like today it is good to hear such advice. Managing a team, a faculty, or a class involves a lot of relationships and a lot of personalities. Using wisdom, consideration, and gentleness is usually a more effective way to deal with unexpected conflicts than responding in kind or with brute force.

Some days, my horoscope fits my situation perfectly. Today is one. But I promise not to turn to the zodiac for future blogging inspiration, unless it delivers a similarly general piece of advice.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

January 23, 2008 11:15 AM

MetaBlog: Good News, No News

One piece of good news from the past week: My permalinks should work now! Our college web server is once again behaving as it should, which means that http://www.cs.uni.edu/~wallingf/blog/ will not redirect to a http://cns2.uni.edu/ URL. This means that my permalinks, which are in the www.cs.uni.edu domain, will once again work. This makes me happy, and I hope that it makes it easier for folks to link directly to articles that they discuss in their own blogs. There may still be a problem with the category pages, but the sysadmins should have that fixed soon.

Now for that Bloglines issue... I haven't had much luck getting help from the Bloglines team, but I'll keep trying.


Posted by Eugene Wallingford | Permalink | Categories: General

December 18, 2007 4:40 PM

Notes on a SECANT Workshop: Table of Contents

[Nothing new here for regular readers... This post implements an idea that I saw on Brian Marick's blog and liked: a table of contents for a set of conference posts coupled with cross-links as notes at the top of each entry. I have done a table of contents before, for OOPSLA 2005 -- though, sadly, not for 2004 or 2006 -- but I like the addition of the link back from entries to the index. This may help readers follow my entries, especially when they are out of order, and it may help me when I occasionally want to link back to the workshop as a unit.]

This set of entries records my experiences at the SECANT 2007 workshop November 17-18, hosted by the Purdue Department of Computer Science.

Primary entries:

  • Workshop Intro: Teaching Science and Computing
    -- on building a community
  • Workshop 1: Creating a Dialogue Between Science and CS
    -- How can we help scientists and CS folks work together?
  • Workshop 2: Exception Gnomes, Garbage Collection Fairies, and Problems
    -- on a hodgepodge of sessions around the intersection of science ed and computing
  • Workshop 3: The Next Generation
    -- what scientists are doing out in the world and how computer scientists are helping them
  • Workshop 4: Programming Scientists
    -- Should scientists learn to program? And, if so, how?
  • Workshop 5: Wrap-Up
    -- on how to cause change and disseminate results

Ancillary entries:

The next few items on the newsfeed will be these entries, updated with the "transcript" cross-link. [Done]


Posted by Eugene Wallingford | Permalink | Categories: General

December 18, 2007 2:12 PM

Post-Semester This and That

Now that things have wound down for the semester, I hope to do some mental clean-up and some CS. As much as I enjoyed the SECANT workshop last month (blogged in detail ending here), travel that late in a semester compresses the rest of the term into an uncomfortably small box. That said, going to conferences and workshops is essential:

Wherever you work, most of the smart people are somewhere else.

I saw that quote attributed to Bill Joy in an article by Tim Bray. Joy was speaking of decentralization and the web, but it applies to the pre-web network that makes up any scholarly discipline. Even with the web, it's good to get out of the tower every so often and participate in an old-fashioned conversation.

One part of the semester clean-up will be assessing the state of my department head backlog. Most days, I add more things to my to-do list than I am able to cross off. Some of them are must-dos, details; others are ideas, dreams. By the end of the semester, I have to be honest that many of the latter won't be done soon, if ever. I don't do a hard delete of most of these items; I just push them onto a "possibilities" list that can grow as large as it likes without affecting my mental hygiene.

I recently told my dean that, after two and a half years as head, I had almost come to peace with what I have taken to calling "time management by burying". He just smiled and said that the favorite part of his semester is doing just that, looking at his backlog and saying to himself, "Well, guess I'm not going to do that" as he deleted it from the list for good. Maybe I should be more ruthless myself. Or maybe that works better if you are a dean...

I've been following the story of the University of Michigan hiring West Virginia University's head football coach. Whatever one thinks of the situation -- and I think it brings shame to both Michigan and its new coach -- there was a very pragmatic piece of advice to be learned about managing people from one Pittsburgh Post-Gazette sports article about it. Says Bob Reynolds, former chief operating officer of Fidelity Investments:

I've been the COO of a 45,000-person company. When somebody's producing, you ask, 'What can I do for you to make your life better?' Not 'What can I do to make your life more miserable?'

That's a good thought for an academic department head to start each day with. And maybe a CS instructor, too.


Posted by Eugene Wallingford | Permalink | Categories: General

December 17, 2007 5:02 PM

An Unexpected Opportunity

I had to drive to Des Moines for a luncheon today. Four hours driving, round-trip, for a 1.25-hour lunch -- the things I do for my employer! The purpose of the trip was university outreach: I was asked to represent the university at a lunch meeting of the Greater Des Moines Committee, in place of our president and dean.

The luncheon was valuable for making connections to the movers and shakers in the capital city, and for talking to business leaders about computer science enrollments, math and science in the K-12 schools, and IT policy for the state. The lunch speaker, Ted Crosbie, the chief technology officer of Iowa, gave a good talk on economic development and the future of the state's technology efforts.

But was it all worth four hours on the road? Probably so, but I will give a firm Yes, for an unexpected reason.

A couple of minutes after I took my seat for lunch, former Iowa Governor Terry Branstad (1983-1999) sat down at our table. He struck up a nice conversation. Then, a couple of minutes later, former Iowa Governor Robert Ray (1969-1983) joined us. Very cool. I was impressed at how involved and informed these retired public officials remain in the affairs of the state, especially in economic development. The latter is, of course, something of great importance to my department and its students, as well as the university as a whole.

Then on the drive home, I saw a bald eagle soar majestically over a small riverbed. A thing of beauty.


Posted by Eugene Wallingford | Permalink | Categories: General

November 22, 2007 6:16 PM

For the Fruits of This Creation

On this and every day:

For the harvests of the Spirit, Thanks be to God;
For the good we all inherit, Thanks be to God;
For the wonders that astound us,
For the truths that will confound us,
Most of all that love has found us, Thanks be to God.

(Lyric by Fred Pratt Green, copyright 1970. Sung to a traditional Welsh melody.)

Among so many things, I'm thankful for the chance to write here and to have people read what I write.

Happy Thanksgiving.


Posted by Eugene Wallingford | Permalink | Categories: General

November 20, 2007 4:30 PM

Workshop 5: Wrap-Up

[A transcript of the SECANT 2007 workshop: Table of Contents]

The last bit of the SECANT workshop focused on how to build a community at this intersection of CS and science. The group had a wide-ranging discussion which I won't try to report here. Most of it was pretty routine and would not be of interest to someone who didn't attend. But there were a couple of points that I'll comment on.

On how to cause change.     At one point the discussion turned philosophical, as folks considered more generally how one can create change in a larger community. Should the group try to convince other faculty of the value of these ideas first, and then involve them in the change? Should the group create great materials and courses first and then use them to convince other faculty? In my experience, these don't work all that well. You can attract a few people who are already predisposed to the idea, or who are open to change because they do not have their own ideas to drive into the future. But folks who are predisposed against the idea will remain so, and resist, and folks who are indifferent will be hard to move simply because of inertia. If it ain't broke, don't fix it.

Others expressed these misgivings. Ruth Chabay suggested that perhaps the best way to move the science community toward computational science is by producing students who can use computation effectively. Those students will use computation to solve problems. They will learn more deeply. This will catch the eye of other instructors. As a result, these folks will see an opportunity to change how they teach, say, physics. We wouldn't have to push them to change; they would pull change in. Her analogy was to the use of desktop calculators in math, chemistry, and physics classes in the 1970s and 1980s. Such a guerilla approach to change might work, if one could create a computational science course good enough to change students and attractive enough to draw students to take it. This is no small order, but it is probably easier than trying to move a stodgy academic establishment with brute force.

On technology for dissemination.     Man, does the world change fast. Folks talked about Facebook and Twitter as the primary avenues for reaching students. Blogs and wikis were almost an afterthought. Among our students, e-mail is nearly dead, only 20 years or so after it began to enter the undergraduate mainstream. I get older faster than the calendar says because the world is changing faster than the days are passing.

Miscellaneous.     Purdue has a beautiful new computer science building, the sort of building that only a large research school can have. What we might do with a building at an appropriate scale for our department! An extra treat for me was a chance to visit a student lounge in the building that is named for the parents of a net acquaintance of mine, after he and his extended family made a donation to the building fund. Very cool.

I might trade my department's physical space for Purdue CS's building, but I would not trade my campus for theirs. It's mostly buildings and pavement, with huge amounts of auto traffic in addition to the foot traffic. Our campus is smaller, greener, and prettier. Being large has its ups and its downs.

Thanks to a recommendation of the workshop's local organizer, I was able to enjoy some time running on campus. Within a few minutes I found my way to some trails that head out into more serene places. A nice way to close the two days.

All in all, the workshop was well worth my time. I'll share some of the ideas among my science colleagues at UNI and see what else we can do in our own department.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

November 15, 2007 9:13 PM

Making Time to Do What You Love

Earlier this week, I read The Geomblog's A day in the life..., in which Suresh listed what he did on Monday. Research did not appear on the list.

I felt immediate and intense empathy. On Monday, I had spent all morning on our college's Preview Day, on which high school students who are considering studying CS at my university visit campus with their parents. It is a major recruiting effort in our college. I spent the early morning preparing my discussion with them and the rest of the morning visiting with them. The afternoon was full of administrative details, computer labs and registration and prospective grad students. On Tuesday, when I read the blog entry, I had taught compilers -- an oasis of CS in the midst of my weeks -- and done more administration: graduate assistantships, advising, equipment purchases, and a backlog of correspondence. Precious little CS in two days, and no research or other scholarly activity.

Alas, that is all too typical. Attending an NSF workshop this week is a wonderful chance to think about computer science, its application in the sciences, and how to teach it. Not research, but computer science. I only wish I had a week or five after it ends to carry to fruition some of the ideas swirling around my mind! I will have an opportunity to work more on some of these ideas when I return to the office, as a part of my department's curricular efforts, but that work will be spread over many weeks and months.

That is not the sort of intense, concentrated work that I and many other academics prefer to do. Academics are bred for their ability to focus on a single problem and work intensely on it for long periods of time. Then come academic positions that can spread us quite thin. An administrative position takes that to another level.

Today at the workshop, I felt a desire to bow down before an academic who understands all this and is ready to take matters into his own hands. Some folks were discussing the shortcomings of the current Mac OS X version of VPython, the installation of which requires X11, Xcode, and Fink. Bruce Sherwood is one of the folks in charge of VPython. He apologized for the state of the Mac port and explained that the team needs a Mac guru to build a native port. They are looking for one, but such folks are scarce. Never fear, though... If they can't find someone soon, Sherwood said,

I'm retiring so that I can work on this.

Now that is commitment to a project. We should all have as much moxie! What do you say, Suresh?


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

November 12, 2007 7:27 AM

Notes to My Bloglines Readers

My apologies to the 130-odd of you who read this blog via Bloglines. A couple of you have alerted me to a technical issue with links disappearing from my posts when you read Knowing and Doing through the Bloglines interface. The problem is intermittent, which makes it frustrating for you all and harder for me to track down.

I've validated my RSS feed at http://feedvalidator.org and looked for some clues in the HTML source. No luck. At this point, I have asked the folks at Bloglines to see if they can find something in my feed that interacts badly with their software. I'll keep you posted.


Posted by Eugene Wallingford | Permalink | Categories: General

November 07, 2007 7:45 AM

Magic Books

Last Saturday morning, I opened a book at random, just to fill some time, and ended up writing a blog entry on electronic communities. It was as if the book were magic... I opened to a page, read a couple of sentences, and was launched on what seemed like the perfect path for that morning. That experience echoed one of the things Vonnegut himself has often said: there is something special about books.

This is one reason that I don't worry about getting dumber by reading books, because for me books have always served up magic.

I remember reading just that back in high school, in Richard Bach's Illusions:

I noticed something strange about the book. "The pages don't have numbers on them, Don."

"No," he said. "You just open it and whatever you need most is there."

"A magic book!"

These days, I often have just this experience on the web, as I read blogs and follow links off to unexpected places. An academic book or conference proceedings can do the same. Bach would have said, "But of course."

"No you can do it with any book. You can do it with an old newspaper, if you read carefully enough. Haven't you done that, hold some problem in your mind, then open any book handy and see what it tells you?"

I do that sometimes, but I'm just as likely to catch a little magic when my mind is fallow, and I grab a paper off one of my many stacks for a lunch jaunt. Holding a particular problem in my mind sometimes puts too much pressure on whatever might happen.

Indeed, this comes back to the theme of the article I wrote on Saturday morning. On one hand there are traditional media and traditional communities, and on the other are newfangled electronic media and electronic communities. The traditional experiences often seem to hold some special magic for us. But the magic is not in any particular technology; it is in the intersection between ideas out there and our inner lives.

When I feel something special in the asynchronicity of a book's magic, and think that the predetermination of an RSS feed makes it less spontaneous, that just reflects my experience, maybe my lack of imagination. If I look back honestly, I know that I have stumbled across old papers and old blog posts and old web pages that served up magic to me in much the same way that books have done. And, like electronic communities, the digital world of information creates new possibilities for us. A book can be magic for me only if I have a copy handy. On the web, every article is just a click away. That's a powerful new sort of magic.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 06, 2007 6:53 AM

Lack of Confidence and Teamwork

Over on one of the mailing lists I browse -- maverick software development -- there has been a lot of talk about how a lack of trust is one of the primary dysfunctions of teams. The discussion started as a discussion of Patrick Lencioni's The Five Dysfunctions of a Team but has taken on its own life based on the experiences of the members of the list.

One writer there made the bold claim that all team dysfunctions are rooted in a lack of trust. Others, such as fear of conflict and lack of commitment to shared goals, grow solely from a lack of trust among team members and leaders. This is, in fact, what Lencioni claims in his book, that a lack of trust creates an environment in which people fear conflict, which ensures a lack of commitment and ultimately an avoidance of accountability, ending in an inattention to the results produced by the team.

The writer who made this claim asked list members for specific counterexamples. I don't know if I can do that, but I will say that it's amazing what a lack of confidence can do to an individual's outlook and performance, and ultimately on his or her ability to contribute positively as a team member.

When a person lacks confidence in his ability, he will be inclined to interpret every contingent signal in a different way than it was intended. This interpretation is often extreme, and very often wrong. This creates an impediment to performance and to interaction.

I see it in students all the time. A lack of confidence makes it hard to learn! If I don't trust what I know or can do, then every new idea looks scary. How can I understand this if I don't understand the more fundamental material? I don't want to ask this question, because the teacher, or my classmates, will see how little I know. There's no sense in trying this; I'll just fail.

This is, I think, a problem in CS classes between female and male students. Male students seem more likely than females to bluff their way through a course, pretending they understand something more deeply than they do. This gives everyone a distorted image of the overall understanding of the class, and leaves many female students thinking that they are alone in not "getting it". One of the best benefits of teaching a CS class via discussion rather than lecture is that over time the bluffers are eventually outed by the facts. I still recall one of our female students telling me in the middle of one of my courses taught in this way that she finally saw that no one else had any better grasp on the material than she did and that, all things considered, she was doing pretty well. Yes!

I see the effects of lack of confidence in my faculty colleagues, too. This usually shows up in a narrow context, where the person doesn't know a particular area of computing very well, or lacks experience in a certain forum, and as a result shies away from interacting in venues that rely on this topic. I also see this spill over into other interactions, where a lack of confidence in one area sets the tone for fear of conflict (which might expose an ignorance) and disengagement from the team.

I see it in myself, as instructor on some occasions and as a faculty member on others. Whenever possible I use a lack of confidence in my understanding of a topic as a spur to learn more and get better. But in the rush of days this ideal outlook often falls victim to rapidly receding minutes.

A personal lack of confidence has been most salient to me in my role as a department head. This was a position for which I had received no direct training, and grousing about the performance of other heads offers only the flimsiest foundation for doing a better job. I've been sensitized to nearly every interaction I have. Was that a slight, or standard operating procedure? Should I worry that my colleague is displeased with something I've done, or was that just healthy feedback? Am I doing a good enough job, or are the faculty simply tolerating me? As in so many other contexts, these thoughts snowball until they are large enough to blot everything else out of one's sight.

The claimant on the mailing list might say that trust is the real issue here. If the student trusts his teacher, or the faculty member trusts his teammates, or the department head trusts his faculty, either they would not lack confidence or would not let it affect their reactions. But that is precisely the point: they are reactions, from deep within. I think we feel our lack of confidence prior to processing the emotion and acting on trust. Lack of confidence is probably not more fundamental than lack of trust, but I think they are often orthogonal to one another.

How does one get over a lack of confidence? The simplest way is to learn what we need to know, to improve our skills. In the mean time, a positive attitude -- perhaps enabled by a sense of trust in our teammates and situation -- can do wonders. Institutionally, we can have, or work to get, support from above. A faculty member who trusts that she has room to grow in the presence of her colleagues and head, or a new department head who trusts that he has room to grow in the presence of his dean, will be able to survive a lack of confidence while in the process of learning. I've seen new deans and heads cultivate that sort of trust by acting cautiously at the outset of their tenure, so as not to lose trust before the relationship is on firm ground.

In the context of software development, the set of tasks for which someone is responsible is often more crisply delineated than the set of tasks for a student or manager. In one way, that is good news. If your lack of confidence stems from not knowing how Spring or continuation passing style works, you can learn it! But it's not too simple, as there are time constraints and team relationships to navigate along the way.

Ultimately, a mindset of detachment is perhaps the best tool a person who lacks confidence can have. Unfortunately, I do not think that detachment and lack of confidence are as common a package as we might hope. Fortunately, one can cultivate a sense of detachment over time, which makes recurring doubts about one's capabilities easier to manage.

If only it were as easy to do these things as it is to say them!


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading, Software Development

November 03, 2007 4:47 PM

Electronic Communities and Dancing Animals

I volunteered to help with a local 5K/10K race this morning. When I arrived at my spot along the course, I had half an hour to fill before the race began, and 45 minutes or so before the first runners would reach me. At first I considered taking a short nap but feared I'd sleep too long. Not much help to the runners in that! So I picked up Kurt Vonnegut's A Man Without a Country, which was in my front seat on its way back to the library. (I wrote a recent article motivated by something else I read in this last book of Vonnegut's.)

I opened the book to Page 61, and my eyes fell immediately to:

Electronic communities build nothing. You end up with nothing. We are dancing animals.

This passage follows a wonderful story about how Kurt mails his manuscripts, daily coming into contact with international voices and a flamboyant postal employee on whom he has a crush. I've heard this sentiment before, in many different contexts and from many different people, but fundamentally I disagree with the claim. Let me tell you about two stories of this sort that stick in my mind, and my reactions at the time.

A decade or so ago, the famed philosopher and AI critic Hubert Dreyfus came to our campus to deliver a lecture as part of an endowed lecture series in the humanities. Had I been blogging at that time, I surely would have written a long review of this talk! Instead, all I have is a notebook on my bookshelf full of pages and pages of notes. (Perhaps one of these days...) Dreyfus claimed that the Internet was leading to a disintegration of society by creating barriers to people connecting in the real world. Electronic communication was supplanting face-to-face communication but giving us only an illusion of a real connection; in fact, we were isolating ourselves from one another.

In the question-and-answer session that followed, I offered a counterargument. Back in the mid-1980s I became quite involved in several Usenet newsgroups, both for research and entertainment. In the basketball and football newsgroups, I found intelligent, informed, well-rounded people with whom to discuss sports at a deeper level than I could with anyone in my local physical world. These groups became an important part of my day. But as the number of people with Internet access exploded, especially on college campuses, the signal-to-noise ratio in the newsgroups fell precipitously. Eventually, a core group of the old posters moved much of discussion off-group to a private mailing list, and ultimately I was invited to join them.

This mailing list continues to this day, taking on and losing members as lives change and opportunities arise. We still discuss sports and politics, pop culture and world affairs. It is a community as real to me as most others, and I consider some of the folks there to be good friends whom I'm lucky to have come to know. Members of the basketball group get together in person annually for the first two rounds of the NCAA tournament, and wherever we travel for business or pleasure we are likely to be in the neighborhood of a friend we can join for a meal and a little face-to-face communication. Like any real community, there are folks in the group whom I like a lot and others with whom I've made little or no personal connection. On-line we have good moments and disagreements and occasional hurt feelings, like any other community of people.

The second story I remember most is from Vonnegut himself, when he, too, visited my campus back when. At one of the sessions I attended, someone asked him about the fate of books in the Digital Age. Vonnegut was certain that books would continue on in much their current form, because there was something special about the feel of a book in one's hands, the touch of paper on the skin, the smell of the print and binding. Even then I recall disagreeing with this -- not because I don't also feel something special in the feel of a book in my hands or the touch of the paper on my skin. A book is an artifact of history, an invention of technology. Technology changes, and no matter how personally we experience a particular technology's outward appearance, it is more likely to be different in a few years than to be the same.

My Usenet newsgroup story seems to contradict Dreyfus's thesis, but he held that, because we took it upon ourselves to meet in person, my story actually supported it. To me that seemed too convenient a way for him to dismiss the key point: our sports list is essentially an electronic community, one whose primary existence is virtual. Were the Internet to disappear tomorrow, some of the personal connections we've made would live on, but the community would die.

And keep in mind that I am an old guy... Today's youth grow up in a very different world of technology than we did. One of the specific sessions I regret missing by missing OOPSLA was the keynote by Jim Purbrick and Mark Lentczner on Second Life, a new sort of virtual world that may well revolutionize the idea of electronic community not only for personal interaction but for professional, corporate, and economic interaction as well. As an example, OOPSLA itself had an island in Second Life as a way to promote interaction among attendees before and during the conference.

The trend in the world these days is toward more electronic interaction, not less, and new kinds that support wider channels of communication and richer texture in the interchange. There are risks in this trend, to be sure. Who among us hasn't heard the already classic joke about the guy who needs a first life before he can have a Second Life? But I think that this trend is just another step in the evolution of human community. We'll find ways to minimize the risks while maximizing the benefits. The next generation will be better prepared for this task than old fogies like me.

All that said, I am sympathetic to the sentiment that Vonnegut expressed in the passage quoted above, because I think underlying the sentiment is the core of a truth about being human. He expresses his take on that truth in the book, too, for as I turned the page of the book I read:

We are dancing animals. How beautiful it is to get up and go out and do something. We are here on Earth to fart around. Don't let anybody tell you any different.

I know this beauty, and I'm sure you do. We are physical beings. The ability and desire to make and share ideas distinguish us from the rest of the world, but still we are dancing animals. There seems in us an innate need to do, not just think, to move and see and touch and smell and hear. Perhaps this innate trait is why I love to run.

But I am also aware that some folks can't run, or for whatever reason cannot sense our physical world in the same way. Yet many who can't still try to go out and do. At my marathon last weekend, I saw men who had lost use of their legs -- or lost their legs altogether -- making their way over 26.2 tough miles in wheelchairs. The long uphill stretches at the beginning of the course made their success seem impossible, because every time they released their wheels to grab for the next pull forward they lost a little ground. Yet they persevered. These runners' desire to achieve in the face of challenge made my own difficulties seem small.

I suspect that these runners' desire to complete the marathon had as much to do with a sense of loss as with their innate nature as physical beings. And I think that this accounts for Vonnegut's and others' sentiment about the insufficiency of electronic communities: a sense of loss as they watch the world around them evolve quickly into something very different from the world in which they grew.

Living in the physical world is clearly an important part of being human. But it seems to be neither necessary nor sufficient as a condition.

Like Vonnegut, I grew up in a world of books. To me, there is still something special about the feel of a book in my hands, the touch of paper on my skin, the smell of the print and binding of a new book the first time I open it. But these are not necessary parts of the world; they are artifacts of history. The sensual feel of a book will change, and humanity will survive, perhaps none the worse for it.

I can't say that face-to-face communities are merely an artifact of history, soon to pass, but I see no reason to believe that the electronic communities we build now -- we do build them, and they do seem to last, at least on the short time scale we have for judging them -- cannot augment our face-to-face communities in valuable ways. I think that they will allow us to create forms of community that were not available to us before, and thus enrich human experience, not diminish it. While we are indeed dancing animals, as Vonnegut describes us, we are also playing animals and creative animals and thinking animals. And, at our core, we are connection-making animals, between ideas and between people. Anything that helps us to make more, different, and better connections has a good chance of surviving in some form as we move into the future. Whether dinosaurs like Vonnegut or I can survive there, I don't know!


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

October 19, 2007 4:42 PM

Working Hard, Losing Ground

Caution: Sisyphus at Work

Some days I read a paper or two and feel like I've lost ground. I just have more to read, think, and do. Of course, this phenomenon is universal... As Design Observer tells us, reading makes us more ignorant:

According to the Mexican critic Gabriel Zaid, writing in "So Many Books: Reading and Publishing in an Age of Abundance", ... "If a person read a book a day, he would be neglecting to read 4,000 others... and his ignorance would grow 4,000 times faster than his knowledge."

Don't read a book today! Now there is a slogan modern man can get behind. It seems that a few college students have already signed on.

My hope for most days is just the opposite. Here is a nice graphic slogan for this hope, courtesy of Brian Marick:

to be less wrong than yesterday

But it's hard to feel that way some days. The universe of knowing and doing is large. The best antidote to Sisyphean despair is to set a few measurable goals that one can reach with a reasonable short-term effort. Each step can give a bit of satisfaction, and -- if you take enough such steps -- you can end up someplace new. A lot like writing code.


Posted by Eugene Wallingford | Permalink | Categories: General

October 06, 2007 8:16 PM

Today I Wrote a Program

Today I wrote a program, just for fun. I wrote a solution to the classic WordLadder game, which is a common nifty assignment used in the introductory Data Structures course. I had never assigned it in one of my courses and had never had any other reason to solve it. But my daughter came home yesterday with a math assignment that included a few of these problems, such as converting "heart" to "spade", and in the course of talking with her I ended up doing a few of the WordLadder problems on my own. I'm a hopeless puzzle junkie.

Some days, an almost irrational desire to write a program comes over me, and last night's fun made me think, "I wonder how I might do this in code?" So I used a few spare minutes throughout today to implement one of my ideas from last night -- a simple breadth-first search that finds all of the shortest solutions in a particular dictionary.
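The post doesn't show the program itself, so here is a minimal Python sketch of the idea described above: a breadth-first search that collects every shortest ladder between two words in a dictionary. The function name, the wildcard-bucket trick for finding neighbors, and the tiny sample word list are all mine, not the original code:

```python
from collections import defaultdict

def word_ladders(start, goal, dictionary):
    """Return all shortest ladders from start to goal.

    Each step changes exactly one letter, and every intermediate
    word must appear in the dictionary.
    """
    words = {w for w in dictionary if len(w) == len(start)}
    words |= {start, goal}

    # Bucket words by one-wildcard patterns: "h*art" -> {"heart", ...}.
    # Two words are neighbors iff they share a bucket.
    buckets = defaultdict(set)
    for w in words:
        for i in range(len(w)):
            buckets[w[:i] + "*" + w[i + 1:]].add(w)

    def neighbors(w):
        for i in range(len(w)):
            for n in buckets[w[:i] + "*" + w[i + 1:]]:
                if n != w:
                    yield n

    # BFS one layer at a time, keeping every shortest path to each word.
    paths = {start: [[start]]}   # word -> list of shortest paths to it
    frontier = {start}
    while frontier and goal not in paths:
        next_paths = defaultdict(list)
        for w in frontier:
            for n in neighbors(w):
                if n not in paths:   # first reached in this layer
                    next_paths[n].extend(p + [n] for p in paths[w])
        paths.update(next_paths)
        frontier = set(next_paths)
    return paths.get(goal, [])
```

Because every path to a newly discovered word is recorded in the layer where that word first appears, the search returns all shortest ladders at once, not just one. On the classic example, `word_ladders("hit", "cog", ["hot", "dot", "dog", "lot", "log", "cog"])` finds both five-word ladders.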

A few of those spare minutes came at the public library, while the same daughter was participating in a writers' workshop for youth. As I listened in the background to their discussion of a couple of poems written by kids in the workshop, I thought to myself, "I'm writing here, too." But then it occurred to me that the kids in the workshop wouldn't call what I was doing "writing". Nor would their workshop leader or most people that we call "writers". Nor would most computer scientists, not without the rest of the phrase: "writing a program".

Granted, I wasn't writing a poem. But I was exploring an idea that had come into my mind, one that drove forward. I wasn't sure what sort of program I would end up with, and arrived at the answer only after having gone down a couple of expected paths and found them wanting. My stanzas, er, subprocedures, developed over time. One grew and shrank, changed name, and ultimately became much simpler and clearer than what I had in mind when I started.

I was telling a story as much as I was solving a problem. When I finished, I had a program that communicates to my daughter an idea I described only sketchily last night. The names of my variables and procedures tell the story, even without looking at too much of their detail. I was writing as a way to think, to find out what I really thought last night.

Today I wrote a program, and it was fun.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

October 03, 2007 5:24 PM

Walk the Wall, Seeger

Foley breaks down Mayo

There is a great scene toward the end of one of my favorite movies, An Officer and a Gentleman. The self-centered and childlike protagonist, Zach Mayo, has been broken down by Drill Instructor Foley. He is now maturing under Foley's tough hand. The basic training cohort is running the obstacle course for the last time. Mayo is en route to a course record, and his classmates are urging him on. But as he passes one of his classmates on the course, he suddenly stops. Casey Seeger has been struggling with the wall for the whole movie, and it looks like she still isn't going to make it. But if she doesn't, she won't graduate. Mayo sets aside his record and stands with Seeger, cheering her and coaching her over the wall. Ultimately, she makes it over -- barely -- and the whole class gathers to cheer as Mayo and Seeger finish the run together. This is one of the triumphant scenes of the film.

I thought of this scene while running mile repeats on the track this morning. Three young women in the ROTC program were on the track, with two helping the third run sprints. The two ran alongside their friend, coaxing her and helping her continue when she clearly wanted to stop. If I recall correctly from my sister's time in ROTC, morning PT (physical training) is a big challenge for many student candidates and, as in An Officer and a Gentleman, they must meet certain fitness thresholds in order to proceed with the program -- even if they are in non-combat roles, such as nurses.

It was refreshing to see that sort of teamwork, and friendship, among students on the track.

It is great when this happens in one of our classes. But when it does, it is generally an informal process that grows among students who were already friends when they came to class. It is not a part of our school culture, especially in computer science.

Some places, it is part of the culture. A professor here recently related a story from his time teaching in Taiwan. In his courses there, the students in the class identified a leader, and then they worked together to make sure that everyone in the class succeeded. This was something that students expected of themselves, not something the faculty required.

I have seen this sort of collectivism imposed from above by CS professors, particularly in project courses that require teamwork. In my experience, it rarely works well when foisted on students. The better students resent having their grade tied to a weaker student's, or a lazier one's. (Hey, it's all about the grade, right?) The weaker students resent being made someone else's burden. Maybe this is a symptom of the Rugged Individualism that defines the West, but working collectively is generally just not part of our culture.

And I understand how the students feel. When I found myself in situations like this as a student, I played along, because I did what my instructors asked me to do. And I could be helpful. But I don't think it ever felt natural to me; it was an external requirement.

Recently I found myself discussing pair programming in CS1 with a former student who now teaches for us. He is considering pairing students in the lab portion of his non-majors course. Even after a decade, he remembers (fondly, I think) working with a different student each week in my CS1 lab. But the lab constituted only a quarter of the course grade, and the lab exercises did not require long-term commitment to helping the weakest members of the class succeed. Even still, I had students express dissatisfaction at "wasting their time".

This is one of the things that I like about agile software methods: they promote a culture of unity and of teamwork. Pair programming is one practice that supports this culture, but so are collective ownership, continuous integration, and a coding standard. Some students and programmers, including some of the best, balk at being forced onto a "team". Whatever the psychological, social, and political issues, and whatever my personal preferences as a programmer, there seems something attractive about a team working together to get better, both as a team and as individuals.

I wish the young women I saw this morning well. I hope they succeed, as a team and as individuals. They can make it over the wall.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

October 02, 2007 6:58 AM

The Right (Kind of) Stuff

As you seek the three great virtues of a programmer, you seek to cultivate...

... the kind of laziness that makes you want to minimize future effort by investing effort today, to maximize your productivity and performance over the long haul, not the kind that leads you to avoid essential work or makes you want to cut corners.

... the kind of impatience that encourages you to work harder, not the kind of impatience that steals your spirit when you hit a wall or makes you want to cut corners.

... the kind of hubris that makes you think that you can do it, to trust yourself, not the kind of hubris that makes you think you don't have to listen to the problem, your code, or other people -- or the kind that makes you want to cut corners.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

September 30, 2007 11:16 AM

Unexpected Fun Cleaning out My Closet

The last week or so I've been trying to steal a few minutes each day to clean up the closet in my home work area. One of the big jobs has been to get rid of several years of journals and proceedings that built up from 1998 to 2002, when it seems I had time only to skim my incoming periodicals.

I seem genetically unable to simply throw these into a recycling bin; instead, I sit on the floor and thumb through each, looking at least at the table of contents to see if there is anything I still want to read. Most of the day-to-day concerns in 2000 are of no particular interest now. But I do like to look at the letters to the editor in Communications of the ACM, IEEE Computer, and IEEE Spectrum, and some of the standing columns in SIGPLAN Notices, especially on Forth and on parsing. Out of every ten periodicals or so, I would guess I have saved a single paper or article for later reading.

One of the unexpected joys has been stumbling upon all of the IEEE Spectrum issues. It's one of the few general engineering journals I've ever received, and besides, it has the bimonthly Reflections column by Robert Lucky, which I rediscovered accidentally earlier this month. I had forgotten that in the off-months of Reflections, Spectrum runs a column called Technically Speaking, which I also enjoy quite a bit. According to its by-line, this column is "a commentary on technical culture and the use and misuse of technical language". I love words and learning about their origin and evolution, and this column used to feed my habit.

Most months, Technically Speaking includes a sidebar called "Worth repeating", which presents a quote of interest. Here are a couple that struck me as I've gone through my old stash.

From April 2000:

Engineering, like poetry, is an attempt to approach perfection. And engineers, like poets, are seldom completely satisfied with their creations.... However, while poets can go back to a particular poem hundreds of times between its first publication and its final version in their collected works, engineers can seldom make major revision in a completed structure. But an engineer can certainly learn from his mistakes.

This is from Henry Petroski, in To Engineer is Human. The process of discovery in which an engineer creates a new something is similar to the poet's process of discovery. Both lead to a first version by way of tinkering and revision. As Petroski notes, though, when engineers who build bridges and other singular structures publish their first version, it is their last version. But I think that smaller products which are mass produced often can be improved over time, in new versions. And software is different... Not only can we grow a product through a conscious process of refactoring, revision, and rewriting from scratch, but after we publish Version 1.0 we can continue to evolve the product behind its interface -- even while it is alive, servicing users. Software is a new sort of medium, whose malleability makes cleaving too closely to the engineering mindset misleading. (Of course, software developers should still learn from their mistakes!)

From June 2000:

You cannot have good science without having good science fans. Today science fans are people who are only interested in the results of science. They are not interested in a good play in science as a football fan is interested in a good play in football. We are not going to be able to have an excellent scientific effort unless the man in the street appreciates science.

This is reminiscent of an ongoing theme in this blog and in the larger computer science community. It continues to be a theme in all of science as well. How do we reform -- re-form -- our education system so that most kids at least appreciate what science is and means? Setting our goal as high as creating fans who are as into science as they are into football or NASCAR would be ambitious indeed!

Oh, and don't think that this ongoing theme in the computer science and general scientific world is a new one. The quote above is from Edward Teller, taken off the dust jacket of a book named Rays: Visible and Invisible, published in 1958. The more things change, the more they stay the same. Perhaps it should comfort us that the problem we face is at least half a century old. We shouldn't feel guilty that we cannot solve it over night.

And finally, from August 2000:

To the outsider, science often seems to be a frightful jumble of facts with very little that looks human and inspiring about it. To the working scientist, it is so full of interest and so fascinating that he can only pity the layman.

I think the key here is to make more people insiders. This is what Alan Kay urges us to do -- he's been saying this for thirty years. The best way to share the thrill is to help people to do what we do, not (just) tell them stories.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

September 28, 2007 8:42 AM

Invent. Tell the Story.

I recently mentioned again Seth Godin's All Marketers are Liars in the context of teachers as liars. One last mention -- this time, for researchers and students.

As I read pages 29 and 30 of the book, I was struck by how much Godin's advice for marketers matches my experience as a researcher, first as a graduate student, then as a young faculty member, and now as a grizzled veteran. Consider:

There are only two things that separate success from failure in most organizations today:
  1. Invent stuff worth talking about.
  2. Tell stories about what you've invented.

That is the life of the academic researcher: invent cool stuff, and talk about the inventions. Some of my best professors were people who invented cool stuff and loved to talk about their inventions. They relished being in the lab, creating, and then crafting a story that shared their excitement. As a student, undergrad and grad alike, I was drawn to these profs, even when they worked in areas that didn't interest me much. When they did -- wow.

Many people get into research because we want to do #1, and #2 is just part of the deal. Whether the young researcher wants to or not, telling the stories is essential. It is how we spread our ideas and get the feedback that helps us to improve them. But on a more mercenary level it's also how we get folks interested in offering us tenure-track positions, and then offering us tenure.

Over the course of my career, I have come to realize how many people go into research because they want to do #2. As strange as it might sound, getting a Ph.D. is one of the more attractive routes to becoming a professional story-teller, because it is the de facto credential for teaching at universities. Sometimes these folks continue to invent cool stuff to talk about. But some ultimately fall away from the research game. They want to tell stories, but without the external pressure to do #1. Maybe they lose the drive to invent, or never really had it in the first place. These folks often become great teachers, too, whether as instructors at research schools or as faculty at so-called "teaching universities". Many of those folks still have a passion for something like #1, but it tends toward learning about the new stuff that others create, synthesizing it, and preparing it for a wider audience. Then they tell the stories to their students and to the general public.

As I've written before, CS needs its own popular story teller, working outside the classroom, to share the thrill... I don't think that has to be an active researcher -- think about the remarkable effect that Martin Gardner had on the world by sharing real math with us in ways that made us want to do mathematics -- and even computer science! But having someone who continues to invent be that person would work just fine. Thank you, Mr. Feynman.

So, to my grad students and to graduate students everywhere, this is my advice to you: Invent stuff worth talking about, and then tell stories about what you've invented.

But this advice is not just for graduate students. Consider this passage from Godin, which I also endorse wholeheartedly:

On a personal level, your resume should be about inventing remarkable things and telling stories that register--not about how good you are at meeting specs. Organizations that are going to be around tomorrow will be those that stop spending all their time dealing with the day-to-day crises of shipping stuff out the door or reacting to emergencies. Instead the new way of marketing will separate winners from losers.

This is where the excitement and future of computer science in industry lie, too. Students who can (only) meet specs are plentiful and not always all that valuable. The real value comes in creating and integrating ideas. This is advice that I've been sharing with entrepreneurially-minded students for a while, and I think as time goes by it will apply to more and more students. Paul Graham has spent a lot of time spreading this message, in articles such as What You'll Wish You'd Known, and I've written about Graham's message here as well. The future belongs to people who are asking questions, not people who can deliver answers to other peoples' questions.

So, this advice is not just for students. It is for everyone.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

August 30, 2007 5:57 PM

It's Not Them; It's Me

I recently realized something.

In books with academic settings, one often sees images of the professor, deep in thought, strolling along the tree-lined walks of campus. Students bustle about on the way between classes. The professor walks along, carefree, absorbed in whatever interesting problem has his or her mind. (All too often, it's a him.) Even if he is running late, has a meeting to attend or a class to lead, he hurries not. He is a professor and leads a life of his own design, even if administrators and students try to impinge on his time. Whatever deep thought occupies his mind comes first. So peaceful.

Movies show us these images, too. So peaceful.

I've never been like that. My campus setting looks much like the ones described in books and movies (though lately ours has looked more like a construction zone than an old-Ivy plat), but I always seem to be in hurry. Can't be late for class, or late for that meeting. Too much to do.

I've often asked myself, when will it be like in the books and movies?

My realization: The problem isn't with my campus or even my university. It's me.

The images in the books and movies are different because the prof ambling peacefully along isn't me. It's Professor Kingsfield. Many of these characters are clichés even when done well, but in any case they are different from me.

The only way for me to live out those images is to modify my own behavior or outlook. Peace comes from inside, not out there. But I don't think I am in need of a change... I'm not restless or dissatisfied; I'm just busy being me, solving problems and thinking about the latest something to cross my path.

So maybe what I need to change is my expectation -- the expectation that I can or even should be like the fictional people I see in those scenes. I suspect that having unrealistic expectations is the cause of as much disharmony as having the "wrong outlook". The outlook isn't always wrong. Sometimes it's just me.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

August 24, 2007 12:20 PM

You Want Security?

Here is security, which comes along with my new favorite error message:

Your password must be at least 18770 characters and cannot repeat any of your previous 30689 passwords. Please type a different password. Type a password that meets these requirements in both text boxes.

Oh, yeah -- be sure to type it twice.

I leave it to the TheoryCS guys at Ernie's 3D Pancakes, Computational Complexity, and The Geomblog to give us the mathematical picture on how secure an 18,770-character password is, and what the implications are for installing SP1, before which you could get by with a passcode of a mere 17,145 characters.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

August 08, 2007 8:57 PM

Seen in the IND Airport

Sponsored outlets in the walkways of the concourses:

You and your laptop may sigh with relief now.

and

Sharing the outlet is good business karma.

The sponsor in this case is Chase, The Bank Formerly Known as Chase Manhattan and later as JP Morgan Chase. Each outlet plate has a blue Chase banner running down the wall from about eye level right down to the pair of outlets. The banners caught my eye, so I guess they worked. Eventually the gimmick will wear out its novelty -- perhaps it already has for other flyers, or elsewhere in the country; I don't fly often -- but I thought it was cute. Funny how changes in technology have made something as mundane as an open outlet so valuable!

Oh, and thanks to cashing in some very old, expiring frequent flyer miles, I flew first class for the first time in a long time, from Indianapolis to John Wayne/Orange County. It wasn't quite like the Seinfeld episode in which Jerry and Elaine experience the different sides of traveling first class and coach, but it was very, very nice. A good addition to my vacation.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

July 26, 2007 1:21 PM

Agile Themes: Honesty and The Prime Directive

My last post looked at the relationship between honesty and blocking, motivated by a recent thread on the XP discussion list. In another thread, I encountered Dale Emery's message on The Prime Directive, and that got me to thinking about being honest with myself about my own behavior, and how to get better.

If you read much in the agile world, you'll run across the phrase "Prime Directive" a lot. I'm not a Trekkie, though I have enjoyed the several movies and TV series, but the first thing I think of when I hear the phrase is James T. Kirk. That's not what the agile folks are talking about... even if that directive raises interesting questions for a software person introducing agile methods to an organization!

If you google "prime directive agile", the first link is to Bob Martin's The Prime Directive of Agile Development, which is: Never be blocked. This is an ironic choice of words, given what I discussed in my previous post, but Martin is using an analogy from billiards, not football: An agile developer "makes sure that the shot he is taking sets up the next shot he expects to take.... A good agile developer never takes a step that stops his progress, or the progress of others." This is a useful notion, I think, but again not what most agilists mean when they speak of the Prime Directive.

They are referring instead to Norm Kerth's use of the phrase in the realm of project retrospectives, in which teams learn from the results of a recently-completed project in order to become a better team for future projects. Here is the Prime Directive for retrospectives, according to Norm:

The prime directive says:

Regardless of what we discover, we understand and truly believe that everyone did the best job they could, given what they knew at the time, their skills and abilities, the resources available, and the situation at hand.

At the end of a project everyone knows so much more. Naturally we will discover decisions and actions we wish we could do over. This is wisdom to be celebrated, not judgement used to embarrass.

This directive creates an environment in which people can examine past actions and results without fear of blame or reprisal. Instead the whole team can find ways to improve. When we look back at behavior and results in this context, we can be honest -- with our teammates and with ourselves. It's hard to improve oneself without facing the brutal facts that define our world and our person.

Emery's article focuses on the power of the phrase "given what they knew at the time". He does not view it as a built-in excuse -- well, I didn't know any better, so... -- but rather as a challenge to identify and adjust the givens that limit us.

I apply The Prime Directive to my personal work by saying, "I did the best I could, given..." then fill in the givens. Then I set to work removing or relaxing the limiting conditions so that I perform better in the future. Usually, the most important conditions are the conditions within me, the conditions that I created.... If I created those conditions (and I did), then they are the conditions I can most directly improve.

Well said. Being honest with myself isn't easy, nor is following through on what I learn when I am. I take this as a personal challenge for the upcoming year.

(By the way, I strongly recommend Norm Kerth's book on retrospectives, as well as his pattern language on the transition from software analysis to design, Caterpillar's Fate. Norm is an engaging speaker and doer who celebrates the human element in whatever he touches. I reported on a talk he gave at PLoP 2004 on myth and patterns back in the early days of this blog.)


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

July 26, 2007 1:08 PM

Agile Themes: Honesty and Blocking

I recently wrote about a long-running thread on the XP discussion list about defining 'agile'. Another theme I've noticed across several threads is honesty. This entry and the one that follows look at two facets of the theme.

In one thread that seems to be dying down, the list has discussed the ethics of "blocking", a term that Scott Ambler borrowed from (American) football to describe teams that create the facade of following the official software development methodology while behind the scenes doing what they think is best to deliver the software. Scott wrote about this behavior, in which some members of the team protect the agile process by Running Interference for the rest of the team, in a 2003 Software Development article.

Is it right to do this? As developers, do we want to live our lives doing one thing and saying that we do another? I'm leery of any prescription that requires me to lie, yet I see shades of gray here. I don't think that my employer or our client is better served by actually following a process that is likely to fail to deliver the software as promised. Or, if my team is capable of delivering the software reasonably using the official methodology, then why do I need to lie in order to use an agile process? For me, programming in an agile way is a lot more fun, so there is that, but then maybe I need to find a place that will let me do that -- or start my own.

As I mentioned last time, I have not been able to follow the list discussion 100%, and I can't recall if Kent Beck ever chimed in. But I can imagine what he might say, given the substance and tone of his postings the last few years. If you have to lie -- even if we give it an innocuous name like "blocking"; even if we view it as a last resort -- then something is wrong, and you should think long and hard about how to make it right. Agile developers value people over processes, and honesty is one way we demonstrate that we value people.

George Dinwiddie has a blog entry that considered a more pragmatic potential problem with blocking. We may be getting the job done in the short term, but blocking is shortsighted and may hurt the agile cause in the long run. If we give the appearance of succeeding via the official route, our employer and customer are likely to conclude that the official route is a good one -- and that will make it even harder to introduce agile practices into the workplace. There is a practical value in telling the truth, even if it requires us to take small steps. After all, agile developers ought to appreciate the power of small steps.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

July 25, 2007 7:45 AM

Miscellaneous Blogging Thoughts

... at the end of a long day.

  1. I must be an old hand at blogging now. I let Knowing and Doing's third anniversary pass without comment. And I let my 500th post -- and my 512th! -- go by with no fanfare.
  2. I continue to be amazed by Google and the blogosphere. While preparing to visit my old hometown for my upcoming high school reunion, I googled "Scott Merrell ARC" in hopes of finding out if the named Mr. Merrell still owned and operated ARC Sheet Metal, where I worked as a part-time sheet metal apprentice and installer of ductwork during several high school summers. Scott had met me at local chess tournaments and took me on for a job for which I had no training or particular native talent. He patiently taught me a few skills and endured a few mistakes. I thought it might be nice to stop in to see Scott after twenty years, introduce him to my wife and daughters, and maybe give him one more shot in his pet Lasker's variation against my Petrov's Defense. It was unlikely that a small local sheet metal shop would have a web page, but it cost me nothing to try.

    I found only one relevant link -- the first link on the results page, of course -- but it was not for the shop. Instead it included a blog entry written by a friend of Scott's son, which quoted the full text of the son's eulogy for his father. My good friend and former boss died this past March after a long battle with lung disease. (In addition to being a chess hound and a professional sheet metal man, he smoked far too much.) The eulogy almost brought me to tears as it reminisced about the decent man I, too, remembered fondly and respected so. I have no simple way to contact Scott's son to thank him for sharing his eulogy, but I did leave a comment on the blog.

    Not many years ago, the idea that I could have learned about Scott's passing in this way and read the eulogy would have been unthinkable. The connection was indirect, impersonal in some ways, but deeply personal. For all its shortcomings, our technology makes the world a better place to live.

  3. I don't write a personal blog like the one that quoted Scott's eulogy, this entry and a few others notwithstanding. Dave Winer expressed one of the powerful reasons for writing a blog in his essay The unedited voice of a person. A blog like mine provides an outlet for thinking out loud, developing professional ideas in front of an audience, and sharing the small insights that would likely never appear in a refereed publication in a journal or conference. Writing without an editor creates a little fear, but soon the fear is counterbalanced by the freedom that comes from not having to carry someone else's reputation into the written word. By having readers, I do feel the weight of expectation, as I don't want to waste the valuable time of others. But the voice here can be mine, and only mine.
  4. Besides, I like Winer's explanation for why comments are not the be-all, end-all of a blog. I've always told myself and anyone who asked why I don't have comments that I would add them soon, but I have remained too lazy or busy to set them up. The lightweight, shell-based blogging tool I use, an old version of Nanoblogger, doesn't support comments out of the box, and in fact seems to require magical incantations to make a third-party add-on work with it. And I don't have the time or inclination to write my own just now.

    But I don't actually mind not having comments. I sometimes miss the interactivity that comments would enable, but managing comments and combatting comment spam takes time, time that I would rather spend reading and blogging.

  5. Last summer, Brad DeLong wrote a fun but more academic essay, The Invisible College, for the Chronicle of Higher Education Review that describes well why I like to blog. Even without comments enabled, I receive e-mail from smart, interesting people all over the world who have read something I wrote here, discussing some point I made, offering alternatives, and suggesting new ideas and resources to me. My academic work benefits from this invisible college. With any luck, some of my ideas might reach the eyes of non-computer scientists and start a conversation outside the confines of my discipline.

    Oh, and he's spot on about that procrastinating thing.

Back to paradise.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

July 20, 2007 7:38 PM

A Reunion with Reunion

My 25th high school reunion is next month. (I can just hear the pencils at work as students, current and former, figure out just how old I am.) So I took this opportunity to re-read Alan Lightman's novel Reunion, which is about a college professor's 30th college reunion. I first read this book when it came out several years ago, but the theme was more timely this time around.

I first learned about Lightman, a physicist-turned-novelist whose fact and fiction both rest on a physics foundation, from an endnote in David Bodanis's E=mc2, which referred me to Einstein's Dreams. This was an unusual book, only a couple of dozen short chapters, consisting of a few fictional vignettes of Einstein's thinking and discussion with Michele Besso as he reconceptualized time for his theory of relativity, interspersed among twenty or so fictional dreams that Einstein might have had about worlds in which time behaves differently than it does in our world. For example, in one world, time passes faster when one is at higher altitudes; in another, one occasionally gets stuck to a single place in time; in yet another, time moves backward.

I found this book delightful, both creative and wonderfully written. The conversations between Einstein and Besso sounded authentic to this non-physicist, and the dream chapters were both "whimsical" and "provocative" (words I borrow from a literary review of the book) -- what would it be like if different neighborhoods lived in different decades or even centuries? Lightman writes as a poet, spare with words and description, precise in detail. Yet the book had a serious undercurrent, as it exposed some of the questions that physicists have raised about the nature of time, and how time interacts with human experience.

Later I found Reunion. It's more of a traditional human story, and I expect that some of my friends would derogate it as "chick lit". But I disagree. First, it's a man's story: a 52-year-old man keenly aware that time has passed beyond his dreams; a 22-year-old man alive with promise unaware that he is reaching branches in time that can never be passed again. And while its structure is that of a traditional novel, the underlying current is one of time's ambiguity: looking back, looking forward, standing still. Lightman even resorts in the shortest of passages to a common device which in other authors' hands is cliché, but which in his seems almost matter of fact. It's not science fiction because it sticks close to the way a real person might feel in this world, where time seems to move monotonically forward but in which our lives are a complex mishmash of present and past, future and never-was.

I enjoyed Reunion again and, though it's a bit of downer, it hasn't diminished my anticipation of stepping back in time to see people who were once my friends, and who because of how time works in my mind will always be my friends, to reminisce about back-when and since-then, and what-now. Time's linearity will show through, of course, in the graying of hair and the onset of wrinkles...


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

July 18, 2007 4:20 PM

Mathematics as "Social Construct"

Many folks like to make analogies between mathematics and art, or computer science and creative disciplines. But there are important ways in which these analogies come up short. Last time, I wrote about Reuben Hersh's view of how to teach math right. The larger message of the article, though, was Hersh's view of mathematics as a social construct, a creation of our culture much as law, religion, and money are.

One of the neat things about Edge is that it not only gives interviews with thinkers but also asks thinkers from other disciplines to comment on the interviews. In the same issue as Hersh's interview is a response by Stanislas Dehaene, a mathematician-turned-neuroscientist who has studied the cognition of reading and number. He agrees that a view of number as a Platonic ideal is untenable, but then draws on his knowledge of cognitive science to remind us that math differs from art and religion as social constructs in two crucial ways: universality and effectiveness. First, there are some mathematical universals to which all cultures have converged, and for which we can construct arguments sufficient to convince any person:

If the Pope is invited to give a lecture in Tokyo and attempts to convert the locals to the Christian concept of God as Trinity, I doubt that he'll convince the audience -- Trinity just can't be "proven" from first principles. But as a mathematician you can go to any place in the world and, given enough time, you can convince anyone that 3 is a prime number, or that the 3rd decimal of Pi is a 1, or that Fermat's last theorem is true.

I suspect that some cynics might argue that this is true precisely because we define mathematics as an internally consistent set of definitions and rules -- as a constructed system. Yet I myself am sympathetic to claims of the universality of mathematics beyond social construction.

Second, mathematics seems particularly effective as the language of science. Dehaene quotes Einstein, "How is it possible that mathematics, a product of human thought that is independent of experience, fits so excellently the objects of physical reality?" Again, a cynic might claim that much of mathematics has been defined for the express purpose of describing our empirical observations. But that really begs the question. What are the patterns common to math and science that make this convergence convenient, even possible?

Dehaene's explanation for universality and effectiveness rests in evolutionary biology -- and patterns:

... mathematical objects are universal and effective, first, because our biological brains have evolved to progressive internalize universal regularities of the external world ..., and second, because our cultural mathematical constructions have also evolved to fit the physical world. If mathematicians throughout the world converge on the same set of mathematical truths, it is because they all have a similar cerebral organization that (1) lets them categorize the world into similar objects ..., and (2) forces them to find over and over again the same solutions to the same problems ....

The world and our brains together drive us to recognize the patterns that exist in the world. I am reminded of a principle that I think I first learned from Patrick Henry Winston in his text Artificial Intelligence, called The Principle of Convergent Intelligence:

The world manifests constraints and regularities. If an agent is to exhibit intelligence, then it must exploit these constraints and regularities, no matter the nature of its physical make-up.

The close compatibility of math and science marveled at by Einstein and Dehaene reminds me of another of Winston's principles, Winston's Principle of Parallel Evolution:

The longer two situations have been evolving in the same way, the more likely they are to continue to evolve in the same way.

(If you never had the pleasure of studying AI from Winston's text, now in its third edition, then you missed the joy of his many idiosyncratic principles. They are idiosyncratic in that you'll read them nowhere else, certainly not under the names he gives them. But they express truths he wants you to learn. They must be somewhat effective, if I remember some from my 1986 grad course and from teaching out of his text in the early- to mid-1990s. I am sure that most experts consider the text outdated -- the third edition came out in 1992 -- but it still has a lot to offer the AI dreamer.)

So, math is more than "just" a mental construct because it expresses regularities and constraints that exist in the real world. I suppose that this leaves us with another question: do (or can) law and religion do the same, or do they necessarily lie outside the physical world? I know that some software patterns folks will point us to Christopher Alexander's arguments on the objectivity of art; perhaps our art expresses regularities and constraints that exist in the real world, too, only farther from immediate human experience.

These are fun questions to ponder, but they may not tell us much about how to do better mathematics or how to make software better. For those of us who make analogies between math (or computer science) and the arts, we are probably wise to remember that math and science reflect patterns in our world, at least more directly with our immediate experience than some of our other pursuits.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 05, 2007 3:59 PM

Language Science

My previous entry discussed the scientific study of language. It occurred to me that the scientists who object to computing as science might also object to the study of language as "real" science. Certainly, the biological and neurological foundation of language seems to meet the criterion that a phenomenon occur in the natural world in order to be in the realm of science. But what of syntax and semantics? Even if we computer scientists speak of "natural language", do they occur naturally in the world?

According to Chomsky, the answer is 'yes': People are born with a universal grammar implemented in their brains, which makes it possible for us to learn what otherwise might not be learnable in a tractable way. This is a claim that is open to empirical study, at least in principle, though the story of the Pirahã shows how difficult it is to support or falsify claims in this area. (And according to the article I cited last time, there are some who are concerned with how engaged Chomsky himself is in empirical verification these days.)

But isn't natural language man-made? We grow and evolve our vocabulary with some degree of intention. Syntax changes over time, too, and at least part of that change is intentional.

Not everyone thinks of language as a designed artifact. Consider this quote from an essay at Design Observer:

According to a new book by linguist David Harrison, "Languages can package knowledge in radically different ways, thus facilitating different ways of conceptualizing, naming, and discussing the world." If languages package information, can they be considered design objects?

I'm a computer scientist, so my first thought was, "Well, duh." Had I written a separate blog entry on that piece, I would have titled it "Yet Another Example of Why Non-Computer Scientists Should Study CS". Maybe, though, I am betrayed by living in a world of artificial -- designed -- languages. (Well, if you can call a language like Perl "designed".)

I'm not a linguist, so it is hard for me to say to what extent primitive languages such as Pirahã, especially pre-literate ones, are designed and to what extent they flow out of the machinery of the human brain.

Interestingly, a few years ago we had a multidisciplinary language science seminar at my university. You can see the web page from our last full semester of activity. It was a diverse crew, ranging from folks interested in the biological and neurological side of language, to a cognitive psychologist, to linguists, on up to teachers of modern languages and literature profs interested in the use of language in poetry and prose. And there was one person interested in artificial languages and their interplay with natural language -- me. The group is dead now, but I found it quite valuable both as a teacher and as someone interested in programming languages. I miss our biweekly sessions.


Posted by Eugene Wallingford | Permalink | Categories: General

July 04, 2007 9:19 PM

Recursion, Natural Language, and Culture

M.C. Escher, 'Hands'

It's not often that one can be reading a popular magazine, even one aimed at an educated audience, and run across a serious discussion of recursion. Thanks to my friend Joe Bergin for pointing me to The Interpreter, a recent article in The New Yorker by Reporter at Large John Colapinto. The article tells the story of the Pirahã, a native tribe in Brazil with a most peculiar culture and a correspondingly unusual language. You see, while we often observe recursion in nature, one of the places we expect to see it is in natural language -- in the embedding of sentence-like structures within other sentences. But the Pirahã don't use recursion in their language, because their world view makes abstract structure meaningless.

Though recursion plays a critical role in Colapinto's article, it is not really about recursion; it is about a possible crack in Chomsky's universal grammar hypothesis about language, and some of the personalities and technical issues involved. Dan Everett is a linguist who has been working with the Pirahã since the 1970s. He wrote his doctoral dissertation on how the Pirahã language fit into the Chomskyan framework, but upon further study and a new insight now "believes that Pirahã undermines Noam Chomsky's idea of a universal grammar." As you might imagine, Chomsky and his disciples disagree.

What little I learned about the Pirahã language makes me wonder what it must be like to learn it -- or try to. On the one hand, it's a small language, with only eight consonants and three vowels. But that's just the beginning of its simplicity:

The Pirahã, Everett wrote, have no numbers, no fixed color terms, no perfect tense, no deep memory, no tradition of art or drawing, and no words for 'all', 'each', 'every', 'most', or 'few' -- terms of quantification believed by some linguists to be among the common building blocks of human cognition. Everett's most explosive claim, however, was that Pirahã displays no evidence of recursion, a linguistic operation that consists of inserting one phrase inside another of the same type..."

This language makes Scheme look like Ada! Of course, Scheme is built on recursion, and Everett's claim that the Pirahã don't use it -- can't, culturally -- is what rankles many linguists the most. Chomsky has built the most widely accepted model of language understanding on the premise that "To come to know a human language would be an extraordinary intellectual achievement for a creature not specifically designed to accomplish this task." And at the center of this model is "the capacity to generate unlimited meaning by placing one thought inside another", what Chomsky calls "the infinite use of finite means", after the nineteenth-century German linguist Wilhelm von Humboldt.

According to Everett, however, the Pirahã do not use recursion to insert phrases one inside another. Instead, they state thoughts in discrete units. When I asked Everett if the Pirahã could say, in their language, "I saw the dog that was down by the river get bitten by a snake", he said, "No. They would have to say, 'I saw the dog. The dog was at the beach. A snake bit the dog.'" Everett explained that because the Pirahã accept as real only that which they observe, their speech consists only of direct assertions ("The dog was at the beach."), and he maintains that embedded clauses ("that was down by the river") are not assertions but supporting, quantifying, or qualifying information -- in other words, abstractions.

The notion of recursion as abstraction is natural to us programmers, because inductive definitions are by their nature abstractions over the sets they describe. But I had never before thought of recursion as a form of qualification. When presented in the form of an English sentence such as "I saw the dog that was down by the river get bitten by a snake", it makes perfect sense. I'll need to think about whether it makes sense in a useful way for my programs.
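The contrast Everett describes maps neatly onto a data structure. Here is a small sketch of the idea in Python -- the `Sentence` representation and the `flatten` helper are my own invention for illustration, not anything from the article: an embedded clause is a sentence nested inside another sentence, and "speaking without recursion" amounts to flattening that tree into a sequence of discrete assertions.

```python
# A sentence is a subject, a predicate, and optionally an embedded
# clause qualifying the subject -- a recursive structure.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Sentence:
    subject: str
    predicate: str
    embedded: Optional["Sentence"] = None  # clause qualifying the subject

def flatten(s: Sentence) -> list[str]:
    """Restate a nested sentence as discrete, standalone assertions,
    the way Everett says the Piraha speak: no clause inside a clause."""
    assertions = [f"{s.subject} {s.predicate}."]
    if s.embedded is not None:
        # State the qualifying clause first, as its own assertion.
        assertions = flatten(s.embedded) + assertions
    return assertions

nested = Sentence("the dog", "was bitten by a snake",
                  embedded=Sentence("the dog", "was down by the river"))

print(flatten(nested))
# Each element stands on its own; the nesting is gone.
```

There is a pleasant irony here: `flatten` itself is recursive, because the nested representation is inductively defined -- which is just the point about recursion as abstraction made above.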

Here is one more extended passage from the article, which discusses an idea from Herb Simon that appears in the latest edition of the Simon book I mentioned in my last entry:

In his article, Everett argued that recursion is primarily a cognitive, not a linguistic, trait. He cited an influential 1962 article, "The Architecture of Complexity," by Herbert Simon, a Nobel Prize-winning economist, cognitive psychologist, and computer scientist, who asserted that embedding entities within like entities (in a recursive tree structure of the type central to Chomskyan linguistics) is simply how people naturally organize information. ... Simon argues that this is essential to the way humans organize information and is found in all human intelligence systems. "If Simon is correct, there doesn't need to be any specific linguistic principle for this because it's just general cognition." Or, as Everett sometimes likes to put it: "The ability to put thoughts inside other thoughts is just the way humans are, because we're smarter than other species." Everett says that the Pirahã have this cognitive trait but that it is absent from their syntax because of cultural constraints.

This seems to be a crux in Everett's disagreement with the Chomsky school: Is it sufficient -- even possible -- for the Pirahã to have recursion as a cognitive trait but not as a linguistic trait? For many armchair linguists, the idea that language and thought go hand in hand is almost an axiom. I can certainly think recursively even when my programming language doesn't let me speak recursively. Maybe the Pirahã have an ability to organize their understanding of the world using nested structures (as Simon says they must) without having the syntactic tools for conceiving such structures linguistically (as Everett says they cannot).

I found this to be a neat article for more reasons than just its references to recursion. Here are a few other ideas that occurred to me as I read.

Science and Faith Experience

At UNICAMP (State Univ. of Campinas in Brazil), in the fall of 1978, Everett discovered Chomsky's theories. "For me, it was another conversion experience," he said.

Everett's first conversion experience happened when he became a Christian in the late 1960s, after meeting his wife-to-be. It was this first conversion that led him to learn linguistics in the first place and work with the Pirahã under the auspices of the Summer Institute of Linguistics, an evangelical organization. He eventually fell away from his faith but remained a linguist.

Some scientists might balk at Everett likening his discovery of Chomsky to a religious conversion, but I think he is right on the mark. I know what it's like as a scholar to come upon a new model for viewing the world and feeling as if I am seeing a new world entirely. In grad school, for me it was the generic task theory of Chandrasekaran, which changed how I viewed knowledge systems and foreshadowed my later move into the area of software patterns.

It was interesting to read, even briefly, the perspective of someone who had undergone both a religious conversion and a scientific conversion -- and fallen out of both, as his personal experiences created doubts for which his faiths had no answers.

Science as Objective

Obvious, right? No. Everett has reinterpreted data from his doctoral dissertation now that he has shaken the hold of his Chomskyan conversion. Defenders of Chomsky's theory say that Everett's current conclusions are in error, but he now says that

Chomsky's theory necessarily colored his [original] data-gathering and analysis. "'Descriptive work' apart from theory does not exist. We ask the questions that our theories tell us to ask."

Yes. When you want to build generic task models of intelligent behavior, you see the outlines of generic tasks wherever you look. You can tell yourself to remain skeptical, and to use an objective eye, but the mind has its own eye.

Science is a descriptive exercise, and how we think shapes what we see and how we describe. Do you see objects or higher-order procedures when you look at a problem to describe or when you conceive a solution? Our brains are remarkable pattern machines and can fall into the spell of a pattern easily. This is true even in a benign or helpful sense, such as what I experienced after reading an article by Bruce Schneier and seeing his ideas in so many places for a week or so. My first post in that thread is here, and the theme spread throughout this blog for at least two weeks thereafter.

Intellectually Intimidating Characters

Everett occupied an office next to Chomsky's; he found the famed professor brilliant but withering. "Whenever you try out a theory on someone, there's always some question that you hope they won't ask," Everett said. "That was always the first thing Chomsky would ask."

That is not a fun feeling, and not the best way for a great mind to help other minds grow -- unless used sparingly and skillfully. I've been lucky that most of the intensely bright people I've met have had more respect and politeness -- and skill -- to help me come along on the journey, rather than to torch me with their brilliance at every opportunity.

Culture Driving Language

One of the key lessons we see from the Pirahã is that culture is a powerful force, especially a culture so long isolated from the world and now so closely held. But you can see this phenomenon even in relatively short-term educational and professional habits such as programming styles. I see it when I teach OO to imperative programmers, and when I teach functional programming to imperative OO programmers. (In a functional programming course, the procedural and OO programmers realize just how similar their imperative roots are!) Their culture has trained them not to use the muscles in their minds that rely on the new concepts. But those muscles are there; we just need to exercise them, and build them up so they are as strong as the well-practiced muscles.
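The difference in trained habits is easy to see in even a tiny task. As a hedged illustration of my own (not an example from any particular class), here is the same computation written first the way an imperative programmer's muscles want to write it, then the way a functional programmer's do:

```python
# The same task -- the sum of the squares of the even numbers --
# written in two styles.

def sum_even_squares_imperative(numbers):
    # Imperative habit: step through the data, mutating an accumulator.
    total = 0
    for n in numbers:
        if n % 2 == 0:
            total += n * n
    return total

def sum_even_squares_functional(numbers):
    # Functional habit: describe the result as filtering and mapping,
    # with no mutation at all.
    return sum(n * n for n in numbers if n % 2 == 0)

assert sum_even_squares_imperative(range(10)) == 120
assert sum_even_squares_functional(range(10)) == 120
```

Both are correct, and each feels "natural" only after the corresponding mental muscles have been exercised -- which is the point about culture and habit made above.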

What Is Really Universal?

Hollywood blockbusters, apparently:

That evening, Everett invited the Pirahã to come to his home to watch a movie: Peter Jackson's remake of "King Kong". (Everett had discovered that the tribe loves movies that feature animals.) After nightfall, to the grinding sound of the generator, a crowd of thirty or so Pirahã assembled on benches and on the wooden floor of Everett's [house]. Everett had made popcorn, which he distributed in a large bowl. Then he started the movie, clicking ahead to the scene in which Naomi Watts, reprising Fay Wray's role, is offered as a sacrifice by the tribal people of an unspecified South Seas island. The Pirahã shouted with delight, fear, laughter, and surprise -- and when Kong himself arrived, smashing through the palm trees, pandemonium ensued. Small children, who had been sitting close to the screen, jumped up and scurried into their mothers' laps; the adults laughed and yelled at the screen.

The Pirahã enjoy movies even when the technological setting is outside their direct experience -- and for them, what is outside their direct experience seems outside their imagination. The story reaches home. From their comments, the Pirahã seemed to understand King Kong in much the way we did, and they picked up on cultural clues that did fit into their experience. A good story can do that.

Eugene sez: The Interpreter is worth a read.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

June 29, 2007 11:39 PM

Research, Prestige, and an Undergraduate Education

Philip Greenspun recently posted a provocative blog entry called Why do high school kids keep signing up to be undergrads at research universities? If you've never read any of Philip's stuff, this might seem like an odd and perhaps even naive piece. His claim is pretty straightforward: "Research universities do not bother to disguise the fact that promotion, status, salary, and tenure for faculty are all based on research accomplishments," so why don't our brightest, most ambitious high school students figure out that these institutions aren't really about teaching undergraduates? This claim might seem odd considering that Philip himself went to MIT and now teaches as an adjunct prof there. But he has an established track record of writing about how schools like Harvard, MIT, the Ivies, and their ilk could do a better job of educating undergrads, and at a lower cost.

My thoughts on this issue are mixed, though at a certain level I agree with his premise. More on how I agree below.

As an undergraduate, I went to a so-called regional university, one that grants Ph.D.s in many fields but which is not typical of the big research schools Philip considers. I chose the school for its relatively strong architecture school, which ranked in the top 15 or 20 programs nationally despite being at a school that overall catered largely to a regional student population. There I was part of a good honors college and was able to work closely with published scholars in a way that seems unlikely at a Research U. However, I eventually changed my major and studied computer science and accounting. The accounting program had a good reputation, but its computer science department was average at best. It had a standard curriculum, and I was a good enough student and had enough good profs that I was able to receive a decent education and to have my mind opened to the excitement of doing computer science as an academic career. But when I arrived at grad school I was probably behind most of my peers in terms of academic preparation.

I went to a research school for my graduate study, though not one in the top tier of CS schools. It was at that time, I think, making an effort to broaden, deepen, and strengthen its CS program (something I think it has done). The department gave me great financial support and opportunities to teach several courses and do research with a couple of different groups. The undergrad students I taught and TAed sometimes commented that they felt like they were getting a better deal out of my courses than they got out of other courses at the university, but I was often surprised by how committed some of the very best researchers in the department were to their undergrad courses. Some of the more ambitious undergrads worked in labs with the grad students and got to know the research profs pretty well. At least one of those students is now a tenured prof in a strong CS program down south.

Now I teach at a so-called comprehensive university, one of those medium-sized state schools that offers neither the prestige of the big research school nor the prestige of an elite liberal arts school. We are in a no-man's land in other ways as well -- our faculty are expected to do research, but our teaching expectations and resources place an upper bound on what most faculty can do; our admissions standards grant access to a wider variety of students, but such folks tend to require a more active, more personal teaching effort.

What Greenspun says holds the essence of truth in a couple of ways. The first is that a lot of our best students think that they can only get a good education at one of the big research schools. That is almost certainly not true. The variation in quality among the programs at the less elite schools is greater, which requires students and their parents to be perhaps more careful in selecting programs. It also requires the schools themselves to do a better job communicating where their quality programs lie, because otherwise people won't know.

But a university such as mine can assemble a faculty that is current in the discipline, does research that contributes value (even basic knowledge), and cares enough about its mission to teach to devote serious energy to the classroom. I don't think that a comprehensive's teaching mission in any way speaks ill of a research school faculty's desire to teach well but, as Greenspun points out, those faculty face strong institutional pressure to excel in other areas. The comprehensive school's lower admission standards mean that weaker students have a chance that they couldn't get elsewhere. Its faculty's orientation means that stronger students have a chance to excel in collaboration with faculty who combine interest and perhaps talent in both teaching and research.

If the MITs and Harvards don't excel in teaching undergrads, what value do they offer to bright, ambitious high school students? Commenters on the article answered in a way that sometimes struck me as cynical or mercenary, but I finally realized that perhaps they were simply being practical. Going to Research U. or Ivy C. buys you connections. For example:

Seems pretty plain that he's not looking to buy the educational experience, he's looking to buy the peers and the prestige of the university.

And in my experience of what school is good for, he's making the right decision.

You wanna learn? Set up a book budget and talk your way into or build your own facilities to play with the subject you're interested in. Lectures are a lousy way to learn anyway.

But you don't go to college to learn, you go to college to make the friends who are going to be on a similar arc as you go through your own career, and to build your reputation by association....

And:

You will meet and make friends with rich kids with good manners who will provide critical angel funding and business connections for your startups.

Who cares if the undergrad instruction is subpar? Students admitted to these schools are strong academically and likely capable of fending for themselves when it comes to content. What these students really need is a frat brother who will soon be an investment banker in a major NYC brokerage.

It's really unfair to focus on this side of the connection connection. As many commenters also pointed out, these schools attract lots of smart people, from undergrads to grad students to research staff to faculty. And the assiduous undergrad gets to hang around with them, learning from them all. Paul Graham would say that these folks make a great pool of candidates to be partners in the start-up that will make you wealthy. And if a strong undergrad can fend for him- or herself, why not do it at Harvard or MIT, in a more intellectual climate? Good points.

But Greenspun offers one potential obstacle, one that seems to grow each year: price. Is the education an undergrad receives at an Ivy League or research school, intellectual and business connections included, really worth $200,000? In one of his own comments, he writes:

Economists who've studied the question of whether or not an Ivy League education is worth it generally have concluded that students who were accepted to Ivy League schools and chose not to attend (saving money by going to a state university, for example) ended up with the same lifetime income. Being the kind of person who gets admitted to Harvard has a lot of economic value. Attending Harvard turned out not to have any economic value.

I'm guessing, though, that most of these students went to a state research university, not to a comprehensive. I'd be curious to see how the few students who did opt for the less prestigious but more teaching-oriented school fared. I'm guessing that most still managed to excel in their careers and amass comparable wealth -- at least wealth enough to live comfortably.

I'm not sure Greenspun thinks that everyone should agree with his answer so much as that they should at least be asking themselves the question, and not just assuming that prestige trumps educational experience.

This whole discussion leads me to want to borrow a phrase from Richard Gabriel that he applies to talent and performance as a writer. The perceived quality of your undergraduate institution does not determine how good you can get, only how fast you can get good.

I read Greenspun's article just as I was finishing reading the book Teaching at the People's University, by Bruce Henderson. This book describes the history and culture of the state comprehensive universities, paying special attention to the competing forces that on the one hand push their faculty to teach and serve an academically diverse student body and on the other expect research and the other trappings of the more prestigious research schools. Having taught at a comprehensive for fifteen years now, I can't say that the book has taught me much I didn't already know about the conflicting culture of these schools, but it paints a reasonably accurate picture of what the culture is like. It can be a difficult environment in which to balance the desire to pursue basic research that has a significant effect in the world and the desire to teach a broad variety of students well.

There is no doubt that many of the students who enroll in this sort of school are served well, because otherwise they would have little opportunity to receive a solid university education; the major research schools and elite liberal arts schools wouldn't admit them. That's a noble motivation and it provides a valuable service to the state, but what about the better students who choose a comprehensive? And what of the aspirations of faculty who are trained in a research-school environment to value their careers by the intellectual contribution they make to their discipline? Henderson does a nice job laying these issues out for people to consider explicitly, rather than to back into them when their expectations are unmet. This is not unlike what Greenspun does in his blog entry, laying an important question on the line that too often goes unasked until the answer is too late to matter.

All this said, I'm not sure that Greenspun was thinking of the comprehensives at all when he wrote his article. The only school he mentions as an alternative to MIT, Harvard, and the other Ivies is the Olin College of Engineering, which is a much different sort of institution than a mid-level state school. I wonder whether he would suggest that his young relative attend one of the many teacher-oriented schools in his home state of Massachusetts?

After having experienced two or three different kinds of university, would I choose a different path for myself in retrospect? This sort of guessing game is always difficult to play, because I have experienced them all under different conditions, and they have all shaped me in different ways. I sometimes think of the undergraduates who worked in our research lab while I was in grad school; they certainly had broader and deeper intellectual experiences than I had as an undergraduate. But as a first-generation university attendee I grew quite a bit as an undergraduate and had a lot of fun doing it. Had I been destined for a high-flying academic research career, I think I would have had one. Some of my undergrad friends have done well on that path. My ambition, goals, and inclinations are well suited for where I've landed; that's the best explanation for why I've landed here. Would my effect on the world have been greater had I started at a Harvard? That's hard to say, but I see lots of opportunities to contribute to the world from this perch. Would I be happier, or a better citizen, or a better father and husband? Unlikely.

I wish Greenspun's young relative luck in his academic career. And I hope that I can prepare my daughters to choose paths that allow them to grow and learn and contribute.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

June 27, 2007 2:50 PM

Hobnobbing with Legislators

As a department head, I am occasionally invited to attend an event as a "university leader". This morning I had the chance to attend a breakfast reception thrown by the university for our six local state legislators. They had all been part of a strong funding year for state universities, and this meeting was a chance for us to say "thank you" and to tell them all some of the things we are doing. This may not sound like all that much fun to some of you; it's certainly unlike a morning spent cutting code. But I find this sort of meeting to be a good way to put a face on our programs to the people who hold our purse strings, and I admit to enjoying the experience of being an "insider".

I found our delegation to consist of good people who had done their homework and who have good intentions regarding higher education. Two or three of them seem to be well-connected in the legislature and so able to exercise some leadership. One in particular has the look, bearing, speaking ability, and mind that bode well should he decide to seek higher elected office.

I can always tell when I am in the presence of folks who have to market the university or themselves, as nearly every person in the room must. I hear sound bites about "windows of opportunity" and "dynamic personalities in the leadership". My favorite sound bite of the morning bears directly on a computer science department: "The jobs of the future haven't been invented yet."

This post involves computing in an even more immediate way. Upon seeing my name tag, two legislators volunteered that the toughest course they took in college was their computer programming class, and the course in which they received their lowest grades (a B in Cobol and a C in Pascal, for what it's worth). These admissions came in separate conversations, completely independent from one another. The way they spoke of their experiences let me know that the feeling is still visceral for them. I'm not sure that this is the sort of impression we want to make on the folks who pay our bills! Fortunately, they both spoke in good nature and let us know that they understand how important strong CS programs are for the economic development of our region and state. So I left the meeting with a good feeling.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

June 22, 2007 4:32 PM

XUnit Test Patterns and the Duplex Book

Several folks have already recommended Gerard Meszaros's new book, xUnit Test Patterns. I was fortunate to have a chance to review early drafts of Gerard's pattern language on the web and then at PLoP 2004, where Gerard and I were in a writers' workshop together. By that time I felt I knew a little about writing tests and using JUnit, but reading Gerard's papers that fall taught me just how much more there was for me to learn. I learned a lot that month and can only hope that my participation in the workshop helped Gerard a small fraction as much as his book has helped me. I strongly echo Michael Feathers's recommendation: "XUnit Patterns is a great all around reference." (The same can be said for Michael's book, though my involvement reviewing early versions of it was not nearly as deep.)

As I grow older, I have a growing preference for short books. Maybe I am getting lazy, or maybe I've come to realize that most of the reasons for which I read don't require 400 or 600 pages. Gerard's book weighs in at a hefty 883 pages -- what gives? Well, as Martin Fowler writes in his post Duplex Book, XUnit Test Patterns is really more than one book. Martin says two, but I think of it as really three:

  • a 181-page narrative that teaches us about automated tests, how to write them, and how to refactor them,
  • a 91-page catalog of smells you will find in test code, and
  • an approximately 500-page catalog of the patterns of test automation. These patterns reference one another in a tight network, and so might be considered a pattern language.

So in a book like this, I have the best of two worlds: a relatively short, concise, well-written story that shows me the landscape of automated unit testing and gets me started writing tests, plus a complete reference book to which I can turn as I need to learn a particular technique in greater detail. I can read the story straight through and then jump into and out of the catalogs as needed. The only downside is the actual weight of the book... It's no pocket reference! But that's a price I am happy to pay.

One of my longstanding goals has been to write an introductory programming textbook, say for CS1, in the duplex style. I'm thinking something like the dual The Timeless Way of Building/A Pattern Language, only shorter and less mystical. I had always hoped to be the first to do this, to demonstrate what I think is a better future for instructional books. But at this increasingly late date, I'd be happy if anyone could succeed with the idea.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

June 20, 2007 1:20 PM

More Dissatisfaction with Math and Science Education

Another coincidence in time... The day after I post a note on Alan Kay's thoughts on teaching math and science to kids, I run across (via physics blogger and fellow basketball crazy Chad Orzel) Sean Carroll's lament about a particularly striking example of what Kay wants to avoid.

Carroll's article points one step further to his source, Eli Lansey's The sad state of science education, which describes a physics club's visit to a local elementary school to do cool demos. The fifth graders loved the demos and were curious and engaged; the sixth graders were uninterested and going through the motions of school. From this one data point, Carroll and Lansey hypothesize that there might be a connection between this bit flip and what passed for science instruction at the school. Be sure to visit Lansey's article if only to see the pictures of the posters these kids made showing their "scientific procedure" on a particular project. It's really sad, and it goes on in schools everywhere. I've seen similar examples in our local schools, and I've also noticed this odd change in stance toward science -- and loss in curiosity -- that seems to happen to students around fifth or sixth grade. Especially among the girls in my daughters' classes. (My older daughter seemed to go through a similar transition about that time but also seems to have rediscovered her interest in the last year as an eighth grader. My hope abounds...)

Let's hope that the students' loss of interest isn't the result of some unavoidable developmental process and does follow primarily from non-science or anti-science educational practices. If it's the latter, then the sort of things that Alan Kay's group are doing can help.

I haven't written about it here yet, but Iowa's public universities have been charged by the state Board of Regents with making a fundamental change in how we teach science and math in the K-12 school system. My university, which is the home of the state's primary education college, is leading the charge, in collaboration with our bigger R-1 sisters. I'll write more later as the project develops, but for now I can point you to web page that outlines the initiative. Education reform is often sought, often started, and rarely consummated to anyone's satisfaction. We hope that this can be different. I'd feel a lot more confident if these folks would take work like Kay's as its starting point. I fear that too much business-as-usual will doom this exercise.

As I type this, I realize that I will have to get more involved if I want what computer scientists are doing to have any chance of being in the conversation. More to do, but a good use of time and energy.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

June 12, 2007 1:38 PM

Trying to Learn from All Critics

Without having comments enabled on my blog, I miss out on most of the feedback that readers might like to give. It seems like a bigger deal to send an e-mail message with comments. Fortunately for me, a few readers go out of their way to send me comments. Unfortunately for the rest of my readers, those comments don't make it back into the blog the way on-line comments do, and so we all miss out on the sort of conversation that a blog can generate. It's time to upgrade my blogging software, I think...

Alistair Cockburn recently sent a comment on my entry But Raise Your Hand First that I must share (with Alistair's permission, of course):

Contrary to Weinberg, I use the exact opposite evaluation of a critic's comments: I assume that anybody, however naive and unschooled, has a valid opinion. No matter what they say, how outrageous, how seemingly ill-founded, someone thought it true, and therefore it is my job to examine it from every presupposition, to discover how to improve the <whatever it is>. I couldn't imagine reducing valid criticism to only those who have what I choose to call "credentials". Just among other things, the <whatever it is> improves a lot faster using my test for validity.

This raises an important point. I suspect that Weinberg developed his advice while thinking about one's inner critics, that four-year-old inside our heads. When he expressed it as applying to outer critics, he may well still have been in the mode of protecting the writer from prior censorship. But that's not what he said.

I agree with Alistair's idea that we should be open to learning from everyone, which was part of the reason I suggested that students not use this as an opportunity to dismiss critique from professors. When students are receiving more criticism than they are used to, it's too easy to fall into the trap of blaming the messenger rather than considering how to improve. I think that most of us, in most situations, are much better served by adopting the stance, "What can I learn from this?" Alistair said it better.

But in the border cases I think that Alistair's position places a heavy and probably unreasonable burden on the writer: "... my job to examine it from every presupposition, to discover how to improve the <whatever it is>." That is a big order. Some criticism is ill-founded, or given with ill will. When it is, the writer is better off to turn her attention to more constructive pursuits. The goal is to make the work better and to become a better writer. Critics who don't start in good faith or who lie too far from the target audience in level of understanding may not be able to help much.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

June 04, 2007 8:22 AM

A Blog Entry From Before I Had a Blog #2

[ UPDATE: I have corrected the quote of Alistair Cockburn that leads below. I'm sure Kent gave the right quote in his talk, and my notes were in error. The correct quote makes more sense in context. Thanks, Alistair. ]

Back in March 2006, I posted notes on an OOPSLA 2003 invited talk by David Ungar. While plowing through some old files last week, I found notes from another old OOPSLA invited talk, this from 2002: Kent Beck's "The Metaphor Metaphor". I've always appreciated Kent's person-centered view of software development, and I remember enjoying this talk. These notes are really a collection of snippets that deal with how language matters in how we think about our projects.

Kent Beck

The Metaphor Metaphor, by Kent Beck

November 6, 2002

"Embellishment is the pitfall of the methodologist." (Alistair Cockburn)

You gain experience. You are asked for advice. You give advice. They ask for more. Eventually, you reach the end of your experience. You run out of advice to give. But you don't run out of people asking you for advice. So, you reach...

Stupid ideas are important. How else will you know that the clever ideas are clever? Don't be afraid of stupid ideas.

A trope is an expression whose meaning is not intended to be derived from the literal interpretation of its words. There are many kinds of trope:

  • irony: spoken such that the opposite is true
  • paralipsis: speaking to a subject by saying that you won't
  • hyperbole: excessive exaggeration
  • pun: using word sounds to create ambiguity
  • metonymy: refer to the whole by referring to a part or a role
  • simile: explicit comparison
  • analogy: simile with connections among the parts
  • metaphor: the linkages plus the concomitant understanding that results

Think about how much of our communication is tropic. Is this a sign that our words and tools are insufficient for communication, or a sign that communication is really hard? (Kent thinks both.)

A key to the value of metaphor is the play between is and is not. How a metaphor holds and how it doesn't both tell us something valuable.

Metaphors run deep in computing. An example: "This is a memory cell containing a 1 or a 0." All four underlined phrases are metaphorical!

Kent's college roommate used to say, "Everything is an interpreter."

Some metaphors mislead. "war on terrorism" is a bad metaphor. "war on disease (e.g., cancer)" is a bad metaphor. Perhaps "terrorism is a disease" is a better metaphor!?

Lakoff's Grounding Hypothesis states: All metaphors ground in physical reality and experience. [Kent gave an example using arithmetic and number lines, relating to an experiment with children, but my notes are incomplete.]

We made Hot Draw "before there were computers". This meant that doing graphics "took forever". Boy was that fun! One cool thing about graphics programming: your mistakes look so interesting!

Hot Draw's metaphors: DRAWING +

  • FIGURE
  • TOOL
  • HANDLE

A lot of good design is waiting productively.

Regarding this quote, Kent told a story about duplicating code -- copy-and-paste with changes to two lines -- and not removing it. That's completely different from copying and pasting code with changes to two lines and not removing. [This is, I think, a nod to the old AI koan (listed first here) about toggling the on/off switch of a hung computer to make it work...]

Kent's final recommendations:

  • Be aware of computing's metaphors -- and your own!
  • If the Grounding Hypothesis is correct, then more physical activity makes better programmers. (Or at least ones with more interesting things to talk about.)

[end of excerpt]

That last recommendation reflects a truth that people often forget: Well-rounded people bring all sorts of positives, obvious and less so, to programming. And I love the quote about design as "productive waiting".

As with any of my conference reports, the ideas presented belong to Kent unless stated otherwise, but any mistakes are mine. With a five-year-old memory of the talk, mistakes in the details are probably unavoidable...


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

June 01, 2007 3:16 PM

More on Structured Text

My entry formatting text for readability elicited some interesting responses. A couple of folks pointed to Sun's language in development, Fortress, which is something of an example going in the other direction: it is a programming language that will be presentable in multiple forms, including a more human mathematics display. Indeed, Fortress code uses a notation that mathematicians will find familiar.

I especially enjoyed a message from Zach Beane, who recently read William Manchester's biography of Winston Churchill. Churchill wrote the notes for his speeches using a non-standard, structured form. While he may not have used syntactic structure as his primary mechanism, he did use syntactic structure as part of making his text easier to scan during delivery. Zach offered a few examples from the Library of Congress's on-line exhibit Churchill and the Great Republic, including Churchill's Speech to the Virginia General Assembly, March 8, 1946. My favorite example is this page of speaking notes for Churchill's radio broadcast to the United States, on October 16, 1938:

speaking notes for Churchill's broadcast to the United States, October 16, 1938

Thanks to Zach and all who responded with pointers!


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

May 30, 2007 7:01 AM

Weinberg on Writing

Whenever asked to recommend "must read" books, especially on computing, I always end up listing at least one book by Gerald Weinberg -- usually his The Psychology of Computer Programming. He has written a number of other classic books, on topics ranging from problem solving and consulting to teamwork and leadership. Now in a new stage of his career, Weinberg has moved from technical consulting to more general writing, including science fiction novels. He's also blogging, both on writing and on consulting.

I feel a connection to his blogs these days because they match a theme in my own reading and writing lately: telling stories as a way to teach. Even when Weinberg was writing his great non-fiction books -- The Psychology of Computer Programming, of course, but also An Introduction to General Systems Thinking, The Secrets of Consulting, and Becoming a Technical Leader -- he was telling stories. He claims that didn't realize that right away (emphasis added):

I'd like to say that I immediately recognized that reading fiction is another kind of simulation, but I'm not that insightful. Only gradually did I come to realize that a great deal of the popularity of my non-fiction books (and the books of a few others, like Tom DeMarco) is in the stories. They make for lighter reading, and some people object to them, but overall, those of us who use stories manage to communicate lots of hard stuff. Why? Because a good story takes the reader into a trance where s/he can "experience" events just as they can in a teaching simulation.

One of my favorite undergraduate textbooks was DeMarco's Structured Analysis and System Specification, and one of the reasons I liked it so was that it was a great book to read: no wasted words, no flashy graphics, just a well told technical story with simple, incisive drawings. Like Weinberg, I'm not sure I appreciated why I liked the book so much then, but when I kept wanting to re-read it in later years I knew that there was something different going on.

But "just" telling stories is different from teaching in an important way. Fiction and creative writers are usually told not to "have a point". Having one generally leads to stories that seem trite or forced. A story with a point can feel like a bludgeon to the reader's sensibility. A point can come out of a good story -- indeed I think that this is unavoidable with the best stories and the best story-tellers -- but it should rarely be put in.

Teachers differ from other story tellers in this regard. They are telling stories precisely because they have a point. Usually, there is something specific that we want others to learn!

(This isn't always true. For example, when I teach upper-division project courses, I want students to learn how to design big systems. In those courses, much of what students learn isn't specific content but habits of thought. For this purpose, "stories without a point" are important, because they leave the learner more freedom to make sense of their own experiences.)

But most of the time, teachers do have a point to make. How should the teacher as story-teller deal with this difference? Weinberg faces it, too, because even with his fiction, he is writing to teach. Here is what he says:

"If you want to send a message, go to Western Union." ...

It was good advice for novelists, script writers, children's writers, and actors, but not for me. My whole purpose in writing is to send messages.... I would have to take this advice as a caution, rather than a prohibition. I would have to make my messages interesting, embedding them in compelling incidents that would be worth reading even if you didn't care about the messages they contained.

For teachers, I think that the key to the effective story is context: placing the point to be learned into a web of ideas that the student understands. A good story helps the student see why the idea matters and why the student should change how she thinks or behaves. In effect, the teacher plays the role of a motivational speaker, but not the cheerleading, rah-rah sort. Students know when they are being manipulated. They appreciate authenticity even in their stories.

Weinberg's blogs make for light but valuable reading. Having learned so much from his books over the years, I enjoy following his thinking in this conversational medium, and I find myself still learning.

But, in the end, why tell stories at all? I believe the Hopi deserve the last word:

"The one who tells the stories rules the world."

Well, at least they have a better chance of reaching their students, and maybe improving their student evaluations.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

May 27, 2007 4:54 PM

Waiting on the World to Change

me and all my friends
we're all misunderstood
they say we stand for nothing and
there's no way we ever could
now we see everything that's going wrong
with the world and those who lead it
we just feel like we don't have the means
to rise above and beat it


so we keep waiting
waiting on the world to change
-- John Mayer

I'm glad to know that Mr. Mayer and his friends care about the world they live in, but I'd like to suggest a different strategy than waiting.

The world changes when people change it.

So...

If you are a software developer waiting for a better working environment, where you feel confident moving forward and have fun delivering value to your customer: Write a test. Do the simplest thing that will make it pass. Refactor. Then do it again.
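For the software developer, that cycle can be made concrete in a few lines. Here is a minimal sketch of one red-green-refactor loop in Python; the function `total_price` and its behavior are hypothetical, chosen only to illustrate the rhythm described above, not taken from any particular project:

```python
# 1. Write a test. (Run it first and watch it fail: total_price
#    doesn't exist yet. That failing run is the "red" step.)
def test_total_price():
    assert total_price([]) == 0
    assert total_price([("tea", 3), ("mug", 7)]) == 10

# 2. Do the simplest thing that will make it pass ("green").
def total_price(items):
    return sum(price for _name, price in items)

# 3. Refactor: clean up names and structure while the test
#    stays green, then repeat the cycle with the next test.
test_total_price()
print("tests pass")
```

The point is less the code than the cadence: each pass through the loop is small enough that you always know whether the last change helped or hurt.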

If you are an instructor waiting for a better classroom environment, where your students are engaged and you have fun working with them on the material they are learning: Pick one session of one class you teach. Eliminate the slides on any topic. Replace them with an interactive exercise. Try it in class, and make the exercise better based on the feedback.

If you are like any of us waiting for better health, for a life in which you wake up ready for the day and feel better throughout: Go for a walk. If you are already walking, throw in a block of jogging. If you are already jogging, throw in a little pick-up where you push your limits for 50 or 100m.

This isn't magic. The world will probably push back. Your tools may not support you; students may resist leaving their cocoon; your body will probably be a little sore tomorrow morning. Changes aren't usually carefree. So stick with it through the initial resistance. That's the closest thing to magic there is.

Look for other people who are trying to change their worlds. Talk to them. You'll learn from them, and they'll learn from you.

Make that change.


Posted by Eugene Wallingford | Permalink | Categories: General

May 24, 2007 7:48 AM

Formatting Text for Readability

Technology Changes, Humans Don't

Gaping Void ran the cartoon at the right last weekend, which is interesting, given that several of my recent entries have dealt with a similar theme. Technology may change, but humans -- at least our hard-wiring -- don't. We should take into account how humans operate when we work with them, whether in security, software development, or teaching.

In another coincidence, I recently came across a very cool paper, Visual-Syntactic Text Formatting: A New Method to Enhance Online Reading. We programmers spend an awful lot of time talking about indenting source code: how to do it, why to do it, tools for doing it, and so on. Languages such as Python require a particular sort of indentation. Languages such as Scheme and Common Lisp depend greatly on indentation; the programming community has developed standards that nearly everyone follows and, by doing so, programmers can understand code whose preponderance of parentheses would otherwise blind them.

But the Walker paper is the first time I have ever read about applying this idea to text. Here is an example. This:

When in the Course of human events, it becomes necessary
for one people to dissolve the political bands which have
connected them with another, and to assume among the powers
of the earth, the separate and equal station to which the
Laws of Nature...

might become:

When in the Course
        of human events,
    it becomes necessary
        for one people
          to dissolve the political bands
            which have
              connected them with another,
          and to assume
              among the powers
                of the earth,
            the separate and equal station
              to which
                the Laws of Nature
...
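A transformation like the one above can be sketched as a toy. To be clear, this is NOT the actual visual-syntactic algorithm from the Walker paper, which relies on a real syntactic parser; this sketch just breaks text at commas and indents each successive phrase a little more, to show the flavor of the idea.

```python
# Toy sketch of cascading-indent formatting. Assumption: phrases
# are delimited by commas, which is far cruder than true
# visual-syntactic parsing -- purely illustrative.

def cascade(text, step=4, max_indent=24):
    phrases = [p.strip() for p in text.split(",") if p.strip()]
    lines = []
    indent = 0
    for phrase in phrases:
        lines.append(" " * indent + phrase)
        indent = min(indent + step, max_indent)
    return "\n".join(lines)

print(cascade("When in the Course of human events, "
              "it becomes necessary for one people, "
              "to dissolve the political bands"))
```

Each phrase steps further right, capped so long sentences don't march off the page.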

Cognitively, this may make great sense, if our minds can process and understand text better when it is presented structurally. The way we present text today isn't much different in format than when we first started to write thousands of years ago, and perhaps it's time for a change. We shouldn't feel compelled to stay with a technology for purely historical reasons when the world and our understanding of it have advanced. (Like the world of accounting has with double-entry bookkeeping.)

For those of you who are still leery of such a change, whether for historical reasons, aesthetic reasons, or other personal reasons... First of all, you are in good company. I was once at a small session with Kurt Vonnegut, and he spoke eloquently of how the book as we know it now would never disappear, because there was nothing like the feel of paper on your fingertips, the smell of a book when you open its fresh pages for the first time. I did not believe him then, and I don't think even he believed that deep in his heart; it is nothing more than projecting our own experiences and preferences onto a future that will surely change. But I know just how he felt, and I see my daughters' generation already experiencing the world in a much richer, technology-mediated way than Vonnegut or I have.

Second, don't worry. Even if what Walker and his colleagues describe becomes a standard, I expect a structured presentation to simply be one view on the document out of many possible views. As an old fogey, I might prefer to read my text in the old paragraph-structured way, but I can imagine that having a syntactically-structured view would make it much easier to scan a document and find something more easily. Once I find the passage of interest, I could toggle back to a paragraph-structured view and read to my heart's content. And who knows? I might prefer reading text that is structured differently, if only I have the chance.

Such toggling between views is possible because of... computer science! The same theory and techniques that make it possible to do this at all make it possible to do it however you like. Indeed, I'll be teaching many of the necessary techniques this fall, as a part of building the front end of a compiler. The beauty of this science is that we are no longer limited by someone else's preferences, or by someone else's technology. As I often mention here, this is one of the great joys of being a computer scientist: you can create your own tools.
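Here is a tiny sketch of that decoupling: one parsed structure, two renderings. The miniature "parse tree" format and renderer names below are invented for illustration, not from any real system.

```python
# One parsed structure, two views -- a sketch of why a parser (a
# compiler front end) decouples content from presentation. Each
# node is (text, list-of-child-nodes); the format is made up.

def as_paragraph(node):
    """Render the tree as ordinary running text."""
    text, children = node
    return " ".join([text] + [as_paragraph(c) for c in children])

def _outline_lines(node, depth=0):
    text, children = node
    lines = ["    " * depth + text]
    for c in children:
        lines.extend(_outline_lines(c, depth + 1))
    return lines

def as_outline(node):
    """Render the same tree as an indented, phrase-per-line view."""
    return "\n".join(_outline_lines(node))

doc = ("When in the Course of human events",
       [("it becomes necessary", []),
        ("for one people to dissolve the political bands", [])])

print(as_paragraph(doc))
print(as_outline(doc))
```

Toggling views is just choosing a different renderer over the same tree; the content never changes.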

We can now see this technology making its way out to the general public. I can see the MySpace generation switching to new ways of reading text immediately. If it makes us better readers, and more prolific readers, then we will have a new lens on an old medium. Computer science is a medium-maker.

Of course, this particular project is just a proposal and in the early stages of research. Whether it is worth pursuing in its current form, or at all, depends on further study. But I'm glad someone is studying this. The idea questions assumptions and historical accident, and it uses what we have learned from cognitive science and medical science to suggest a new way to do something fundamental. As I said, very cool.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

May 20, 2007 3:14 PM

Good and Bad Use

Recently I wrote about persuasion and teaching, in light of what we know about how humans perceive and react to risk and new information. But isn't marketing inherently evil, in being motivated by the seller's self-interest and not the buyer's, and thus incompatible with a teacher/student relationship? No.

First of all, we can use an idea associated with a "bad" use to achieve something good. Brian Marick points out that the motivating forces of XP are built in large part on peer pressure:

Some of XP's practices help with discipline. Pair programming turns what could be a solitary vice into a social act: you and your pair have to look at each other and acknowledge that you're about to cheat. Peer pressure comes into play, as it does because of collective code ownership. Someone will notice the missing tests someday, and they might know it was your fault.

This isn't unusual. A lot of social organizations provide a form of positive peer pressure to help individuals become better, and to create a group that adds value to the world. Alcoholics Anonymous is an example for people tempted to do something they know will hurt them; groups of runners training for a marathon rely on one another for the push they need to train on days they don't feel like it and to exert the extra effort they need to improve. Peer pressure isn't a bad thing; it just depends on who you choose for your peers.

Returning to the marketing world, reader Kevin Greer sent me a short story on something he learned from an IBM sales trainee:

The best sales guy that I ever worked with once told me that when he received sales training from IBM, he was told to make sure that he always repeated the key points six times. I always thought that six times was overkill but I guess IBM must know what they're talking about. A salesman is someone whose income is directly tied to their ability to effectively "educate" their audience.

What we learn here is not anything to do with the salesman's motive, but with the technique. It is grounded in experience. Teachers have heard this advice in a different adage about how to structure a talk: "Tell them what you are about to tell them. Then tell them. Then tell them what you have just told them." Like Kevin, I felt this was overkill when I first heard it, and I still rarely follow the advice. But I do know from experience how valuable it can be for me, and in the meantime I've learned that the way the brain works makes it almost necessary.

While I'm still not a salesman at heart, I've come to see how "selling" an idea in class isn't a bad idea. Steve Pavlina describes what he calls marketing from your conscience. His point ought not seem radical: "marketing can be done much more effectively when it's fully aligned (i.e., congruent) with one's conscience."

Good teaching is not about delusion but about conscience. It is sad that we are all supposed to believe the cynical interpretation of selling, advertising, and marketing. Even in the tech world we certainly have plenty of salient reasons to be cynical. We've all observed near-religious zealotry in promoting a particular programming language, or a programming style, or a development methodology. When we see folks shamelessly shilling the latest silver bullet as a way to boost their consulting income, they stand out in our minds and give us a bad taste for promotion. (Do you recognize this as a form of the availability heuristic?)

But.

I have to overcome my confirmation bias, other heuristic biases that limit my thinking, and my own self-interest in order to get students and customers to gain the knowledge that will help them; to try new languages, programming styles, and development practices that can improve their lives. What they do with these is up to them, but I have a responsibility to expose them to these ideas, to help them appreciate them, to empower them to make informed choices in their professional (and personal!) lives. I can't control how people will use the new ideas they learn with me, or if they will use them at all, but if I help them also to learn how to make informed choices later, then I've done about the best I can do. And not teaching them anything isn't a better alternative.

I became a researcher and scholar because I love knowledge and what it means for people and the world. How could I not want to use my understanding of how people learn and think to help them learn and think better, more satisfyingly?


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

May 05, 2007 1:32 PM

Q School, Taking Exams, and Learning Languages

My previous entry has been a slowly-dawning realization. On Thursday, I felt another trigger in a similar vein. I heard an interview with John Feinstein, who was promoting his new book on Q school. "Q School" is how professional golfers qualify for the PGA tour. This isn't just a little tournament. The humor and drama in some of these stories surprised me. By nearly everyone's standard, all of these players are incredibly good, much, much better than you and especially me. What separates those who make it to the Tour from those who don't?

Feinstein said that professional golfers acknowledge there are five levels of quality in the world of golf:

  1. hitting balls well on the range
  2. playing a round of golf well
  3. playing well in a tournament
  4. playing well in a tournament under pressure, with a chance to win
  5. playing well in a major under pressure, with a chance to win

What separates those who make it from those who don't is the gap between levels 3 and 4. Then, once folks qualify for the tour, the gap between levels 4 and 5 becomes a hurdle for players who want to be "players", not just the journeymen who make a good living but remain in the shadows of the best.

I see the same thing in the world of professional tennis. On a smaller stage, I've experienced these levels as a competitor -- playing chess, competing in various academic venues, and doing academic work.

What does it take to make a step up to the next level? Hard work. Physical work or, in chess and academia, intellectual work. But mostly, it is mental. For most of the guys at Q school, and for many professional golfers and tennis players, the steps from 3 to 4 and from 4 to 5 are more about the mind than the body, more about concentration than skill.

Feinstein related a Tiger Woods statistic that points to this difference in concentration. On the PGA Tour in 2005, Tiger faced 485 putts of 5 feet or less. The typical PGA Tour pro misses an average of 1 such putt each week. Tiger missed 0. The entire season.

Zero is not about physical skill. Zero is about concentration.

This sort of concentration, the icy calm of the great ones, probably demoralizes many other players on the tour, especially the guys trying to make the move from level 4 to level 5. It might well infuriate that poor guy simply trying to qualify for the tour. He may be doing everything he possibly can to improve his skills, to improve his mental approach. Sometimes, it just isn't enough.

Why did this strike me as relevant to my day job? I listened to the Feinstein interview on the morning I gave my final exam for the semester.

Students understand course material at different levels. That is part of what grades are all about. Many students perform at different levels, on assignments and on exams. At exam time, and especially at final exam time, many students place great hope in the idea that they really do get it, but that they just aren't able to demonstrate it on the exam.

There may be guys in Q School harboring similar hope, but reality for them is simple: if you don't demonstrate, you don't advance.

It's true that exam performance level is often not the best predictor of other kinds of performance in the world, and some students far exceed their academic performance level when they reach industry. But most students' hopes in this regard are misplaced. They would be much better off putting their energy into getting better. Whether they really are better than their exam performance or are in need of better exam performance skills, getting better will serve them well.

But it's more than just exams. There are different levels of understanding in everything we learn, and sometimes we settle for less than we can achieve. That's what my last entry was trying to say -- there is a need to graduate from the level at which one requires external reference to a level at which one has internalized essential knowledge and can bring it to bear when needed.

I am sure someone will point me to Bloom's taxonomy, and it is certainly relevant. But that taxonomy always seems so high-falutin' to me. I'm thinking of something closer to earth, something more concrete in terms of how we learn and use programming languages. For example, there might be five levels of performance with a programming language feature:

  1. recognize an idea in code
  2. program with the idea, using external references
  3. program with the idea, without external reference, but requiring time to "reinvent" the idea
  4. program with the idea, fluently
  5. program with the idea, fluently and under pressure

I don't know if there are five levels here, or if these are the right levels, but they seem a reasonable first cut for Concourse C at Detroit Metro. (This weekend brings the OOPSLA 2007 spring planning meeting in one of North America's great international cities, Montreal.) But this idea of levels has been rolling around my mind for a while now, and this interview has brought it to the top of my stack, so maybe I'll have something more interesting to say soon.

The next step is to think about how all this matters to my students, and to me as an instructor. Knowing about the levels, what should students do? How might I feed this back into how I teach my course and how I evaluate students?

For now, my only advice to students is to do what I hope to do in a week or so: relax for a few minutes at the advent of summer!


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

April 27, 2007 6:04 PM

Welcome to a New Century

While at Iowa State to hear Donald Norman speak at the HCI forum a couple of days ago, I spent a few minutes wandering past faculty offices. I love to read what faculty post on and around their doors -- cartoons, quotes, articles, flyers, posters, you name it. Arts and humanities offices are often more interesting than science faculty offices, at least in a lateral-thinking way, but I enjoy them all.

At ISU, one relatively new assistant prof had posted the student evaluations from his developmental robotics course. Most were quite positive and so make for good PR in attracting students, but he posted even the suggestions for improvement.

My favorite student quote?

It's nice to take a CS course that wasn't designed in the '70s.

Spot on. I wonder just how archaic most computer science courses must seem to students who were born in the late 1980s. Gotta teach those fundamentals!


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

April 26, 2007 7:05 PM

Don Norman on Cantankerous Cars

Yesterday afternoon I played professional hooky and rode with another professor and a few students to Ames, Iowa, to attend the fourth HCI Forum, Designing Interaction 2007, sponsored by Iowa State's Virtual Reality Applications Center. This year, the forum kicks off a three-day Emerging Technologies Conference that features several big-name speakers and a lot of the HCI research at ISU.

Donald Norman

I took an afternoon "off" to hear the keynote by Donald Norman, titled "Cautious Cars and Cantankerous Kitchens". It continues the story Norman began telling years ago, from his must-read The Design of Everyday Things to his in-progress The Design of Future Things.

"Let me start with a story." The story is about a time when he is driving, feeling fine, and his wife feels unsafe. He tries to explain to her why everything is okay.

"New story." Same set-up, but now it's not his wife reacting to an unsafe feeling, but his car itself. He pays attention.

Why does he trust his car more than he trusts his wife? He thinks it's because, with his wife, conversation is possible. So he wants to talk. Besides, he feels in control. When conversation is not possible, and the power lies elsewhere, he acquiesces. But does he "trust"? In Norman's mind, I think the answer is 'no'.

Control is important, and not always in the way we think. Who has the most power in a negotiation? Often (Norman said always), it is the person with the least power. Don't send your CEO; send a line worker. Why? No matter how convincing the other side's arguments are, the weakest participant may well have to say, "Sorry, I have my orders." Or at least "I'll have to check with my boss".

It's common these days to speak of smart artifacts -- smart cars, houses, and so on. But the intelligence does not reside in the artifact. It resides in the head of the designer.

And when you use the artifact, the designer is not there with you. The designer would be able to handle unexpected events, even by tweaking the artifact, but the artifact itself can't.

"There are two things about unexpected events... They are unexpected. And they always happen."

Throughout his talk, Norman compared driving a car to riding a horse, driving a horse and carriage, and then to riding a bike. The key to how well these analogies work or not lies in the three different levels of engagement that a human has: visceral, behavioral, and reflective. Visceral is biological, hard-coded in our brains, and so largely common to all people. It recognizes safe and dangerous situations. Behavioral refers to skills and "compiled" knowledge, knowledge that feels like instinct because it is so ingrained. Reflective is just that, our ability to step outside of a situation and consider it rationally. There are times for reflective engagement, but hurtling around a mountain curve at breakneck speed is not one of them.

Norman suggested that a good way to think of designing intelligent systems is to think of a new kind of entity: (human + machine). The (car + driver) system provides all three levels of engagement, with the car providing the visceral intelligence and the human providing the behavioral and reflective intelligences. Cars can usually measure most of what makes our situations safe or dangerous better than we can, because our visceral intelligence evolved under very different circumstances than the ones we now live in. But the car cannot provide the other levels of intelligence, which we have evolved as much more general mechanisms.

Norman described several advances in automobile technology that are in the labs or even available on the road: cars with adaptive cruise control; a Lexus that brakes when its on-board camera senses that the driver isn't paying attention; a car that follows lanes automatically; a car that parks automatically, both parallel and head-in. Some of these sound like good ideas, but...

In Norman's old model of users and tasks, he spoke of the gulfs of evaluation and execution. In his thinking these days, he speaks of the knowledge gap between human and machine, especially as we more and more think of machines as intelligent.

The problem, in Norman's view, is that machines automate the easy parts of a task, and they fail us when things get hard and we most need them. He illustrated his idea with a slide titled "Good Morning, Silicon Valley" that read, in part, "... at the very moment you enter a high-speed crisis, when a little help might come in handy, the system says, 'Here, you take it.'"

Those of us who used to work on expert systems and later knowledge-based systems recognize this as the brittleness problem. Expert systems were expert in their narrow niche only. When a system reached the boundary of its knowledge, its performance went from expert to horrible immediately. This differed from human experts and even humans who were not experts, whose performances tended to degrade more gracefully.

My mind wandered during the next bit of the talk... Discussion included ad hoc networks of cars on the road, flocking behavior, cooperative behavior, and swarms of cars cooperatively drafting. Then he discussed a few examples of automation failures. The first few were real, but the last two were fiction -- but things he thinks may be coming, in one form or another:

  • I swipe my credit card to make a purchase at the store. The machine responds, "Transaction Refused. You Have Enough Shoes."
  • A news headline: "Motorist Trapped in Roundabout for 14 Hours". If you drive a car that follows lanes and overrules your attempts to change... (April Fool's!)

Norman then came to another topic familiar to anyone who has done AI research or thought about AI for very long. The real problem here is shared assumptions, what we sometimes now call "common ground". Common ground in human-to-human communication is remarkably good, at least when the people come from cultures that share something in common. Common ground in machine-to-machine communication is also good, sometimes great, because it is designed. Much of what we design follows a well-defined protocol that makes explicit the channel of communication. Some protocols even admit a certain amount of fuzziness and negotiation, again with some prescribed bounds.

But there is little common ground in communication between human and machine. Human knowledge is so much richer, deeper, and interconnected than what we are yet able to provide our computer programs. So humans who wish to communicate with machines must follow rigid conventions, made explicit in language grammars, menu structures, and the like. And we aren't very good at following those kinds of rules.

Norman believes that the problem lies in the "middle ground". We design systems in which machines do most or a significant part of a task and in which humans handle the tough cases. This creates expectation and capability gaps. His solution: let the machine do all of a task -- or nothing. Anti-lock brakes were one of his examples. But what counts as a complete task? It seems to me that this solution is hard to implement in practice, because it's hard to draw a boundary around what is a "whole task".

Norman told a short story about visiting Delft, a city of many bicycles. As he and his guide were coming to the city square, which is filled with bicycles, many moving fast, his guide advised him, "Don't try to help them." By this, he meant not to slow down or speed up to avoid a bike, not to guess the cyclist's intention or ability. Just cross the street.

Isn't this dangerous? Not as dangerous as the alternative! The cyclist has already seen you and planned how to get through without injuring you or him. If you do something unexpected, you are likely to cause an accident! Act in the standard way so that the cyclist can solve the problem. He will.

This story led into Norman's finale, in which he argued that automation should be:

  • predictable
  • self-explaining
  • optional
  • assistive

The Delft story illustrated that the less flexible, less powerful party should be the more predictable party in an interaction. Machines are still less flexible than humans and so should be as predictable as possible. The computer should act in the standard way so that the human user can solve the problem. She will.

Norman illustrated self-explaining with a personal performance of the beeping back-up signal that most trucks have these days. Ever have anyone explain what the frequency of the beeps means? Ever read the manual? I don't think so.

The last item on the list -- assistive -- comes back to what Norman has been preaching forever and what many folks who see AI as impossible (or at least not far enough along) have also been saying for decades: Machines should be designed to assist humans in doing their jobs, not to do the job for them. If you believe that AI is possible, then someone has to do the research to bring it along. Norman probably disagrees that this will ever work, but he would at least say not to turn immature technology into commercial products and standards now. Wait until they are ready.

All's I know is... I could really have used a car that was smarter than its driver on Tuesday morning, when I forgot to raise my still-down garage door before putting the car into reverse! (Even 20+ years of habit sometimes fails, even under predictable conditions.)


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

April 25, 2007 8:14 AM

No More Complaints

... about being too busy to do my job well. Monday night, I attended my university's senior recognition banquet for intercollegiate athletes. One of the academic award winners is on the track and field team.

He is a math major. In his first semester as a freshman, he came in and took three junior/senior level math courses. In later semesters, he took as many as five or six math courses. Some were master's level courses, because there were not enough undergraduate courses to keep him busy.

Track and field is unusual among intercollegiate sports in having competitive seasons in both the fall and the spring. Yet in the spring of his junior year, this young man took 24 credit hours -- 8 courses, including 5 math courses. This spring, he is taking 24 credit hours. I forget how many math courses are in the mix, but the number has gone down; he has exhausted the department's undergraduate curriculum and taken most of its graduate courses.

His GPA is nearly 4.0.

Let's not forget that he is an athlete, a pole vaulter, and so has practice and training nearly every day. And he's not just a member of the practice squad, having fun but saving his energy for his schoolwork. He is a 5-time conference champion and a 4-time All-American.

Oh, and he is a pretty good programmer, too, who took several CS courses his freshman year. That year he was a member of our department's programming team, which placed in the regional competition. He toyed with double majoring in CS, but there are only so many hours in a day, you know.

This young man has been busy, but he has excelled both on the field and in the classroom -- and I do mean "excelled", not the watered-down sense of the word as we too often use it these days.

Actually, attending the athletes' recognition banquet would open the eyes of most university faculty, who have very little sense of just how impressive these young men and women are. In the news we mostly hear about athletic exploits or about misbehavior. You don't hear about the lady soccer player carrying a 3.9 GPA in biomedical science, or the wrestler who double majors in humanities and philosophy, or the women's tennis team made up of players from all across the globe, studying in a second language (some just learning English) and earning a team GPA of 3.59. These student-athletes are the typical case at my university, not the exception.

If you seek excellence, do your best to be among others who seek excellence. Don't limit yourself to one sort of person, especially to people who do what you do. Inspiration can come from people working in all arenas, and you may well learn something from someone who thinks about the world in a different way. And that includes young people, even pole vaulters.


Posted by Eugene Wallingford | Permalink | Categories: General

April 12, 2007 8:16 AM

Kurt Vonnegut Has Come Unstuck in Time

Be careful what you pretend to be
because you are what you pretend to be.

Kurt Vonnegut

Sometimes, the universe speaks to us and catches us unaware.

Yesterday, I attended a workshop, about which I will have more to say later today. Toward the end, I saw a quote that struck me as an expression of this blog's purpose, and almost an unknowing source for the name of this blog:

Learning is about ... connecting teaching and knowing to action.

Connecting knowing to doing. That's what this blog is all about.

But long time readers know that "Knowing and Doing" almost wasn't the name of my blog. I considered several alternatives. Back in November 2004, I wrote about some of the alternatives. Most of the serious candidates came from Kurt Vonnegut, my favorite author. Indeed, that post wasn't primarily about the name of my blog but about Vonnegut himself, who was celebrating his 82nd birthday.

Here we are, trapped in the amber of the moment.
There is no why.

And then I wake up this morning to find the world atwitter with news of Vonnegut's passing yesterday. I'm not sure that anyone noticed, but Vonnegut died on a notable unbirthday, five months from the day of his birth. I think that Vonnegut would have liked that, as a great cosmic coincidence and as a connection to Lewis Carroll, a writer whose sense of unreality often matched Vonnegut's own. More than most, Kurt was in tune with just how much of what happens in this world is coincidence and happenstance. He wrote in part to encourage us not to put too much stock in our control over a very complex universe.

Busy, busy, busy.

Many people, critics included, considered Vonnegut a pessimist, an unhappy man writing dark humor as a personal therapy. But Vonnegut was not a pessimist. He was at his core one of the world's great optimists, an idealist who believed deeply in the irrepressible goodness of man. He once wrote that "Robin Hood" and the New Testament were the most revolutionary books of all time because they showed us a world in which people loved one another and looked out for the less fortunate. He wrote to remind us that people are lonely and that we have it in our own power to solve our own loneliness and the loneliness of our neighbors -- by loving one another, and building communities in which we all have the support we need to live.

Live by the foma that make you
brave and kind and healthy and happy.

I had the good fortune to see Kurt Vonnegut speak at the Wharton Center in East Lansing when I was a graduate student at Michigan State. I had the greater good fortune to see him speak when he visited UNI in the late 1990s. Then I saw his public talk, but I also sat in on a talk he gave to a foreign language class, on writing and translation. I also was able to sit in on his intimate meeting with the school's English Club, where he sat patiently in a small crowded room and told stories, answered questions, and generally fascinated awestruck fans, whether college students or old fogies like me. I am forever in the debt of the former student who let me know about those side events and made sure that I could be there with the students.

Sometimes the pool-pah
exceeds the power of humans to comment.

On aging, Vonnegut once said, "When Hemingway killed himself he put a period at the end of his life; old age is more like a semicolon." But I often think of Vonnegut staring down death and God himself in the form of old Bokonon, the shadow protagonist of his classic Cat's Cradle:

If I were a younger man, I would write a history of human stupidity; and I would climb to the top of Mount McCabe and lie down on my back with my history for a pillow; and I would take from the ground some of the blue-white poison that makes statues of men; and I would make a statue of myself, lying on my back, grinning horribly, and thumbing my nose at You Know Who.

The blue-white poison was, of course, Ice Nine. These days, that is the name of my rotisserie baseball team. I've used Vonnegut's words as names many times. Back at Ball State, my College Bowl team was named Slaughterhouse Five. (With our alternate, we were five.)

Kurt Vonnegut was without question my favorite writer. I spent teenage years reading Slaughterhouse Five and Cat's Cradle, Welcome to the Monkey House and Slapstick, The Sirens of Titan and Breakfast of Champions and Player Piano, the wonderfully touching God Bless You, Mr. Rosewater and the haunting Mother Night. Later I came to love Jailbird and Galapagos, Deadeye Dick and Hocus Pocus and especially Bluebeard. I reveled in his autobiographical collages, too, Wampeters, Foma, and Granfalloons, Palm Sunday, Fates Worse Than Death, and Timequake. His works affected me as much or more than those of any of the classic writers feted by university professors and critics.

The world is a lesser place today. But I am happy for the words he left us.

Tiger gotta hunt.
Bird gotta fly.
Man gotta sit and wonder why, why, why.

Tiger gotta sleep.
Bird gotta land.
Man gotta tell himself he understand.

If you've never read any Vonnegut, try it sometime. Start with Slaughterhouse Five or Cat's Cradle, both novels, or Welcome to the Monkey House, a collection of his short stories. Some of his short stories are simply stellar. If you like Cat's Cradle, check out my tabulation of The Books of Bokonon, which is proof that a grown man can still be smitten with a really good book.

And, yes, I still lust after naming my blog The Euphio Question.

Rest in peace, Kurt.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

April 10, 2007 7:54 PM

Incendiary Humor Considered Harmful?

For a good laugh, take a look at Jeff Overbey's "considered harmful" considered harmful web page. He writes:

I'm not entirely sure why, but I searched ACM and IEEE for all papers with "Considered Harmful" in the title. The length of this list should substantiate my claim that that phrase should be banned from the literature.

And he lists them all. The diversity of areas in computing where people have riffed on Dijkstra's famous screed against go-to statements is pretty broad. The papers range from computer graphics (Bishop et al.) to software engineering (de Champeaux), from the use of comments (Beckman) to web services (Khare et al.) and human-centered design (Donald Norman!). Guy Steele has two entries on the list, one from his classic lambda series of papers and the other on arithmetic shifting, of all things.

A lot of the "considered harmful" papers deal with low-level programming constructs, like go-to, =, if-then-else, and the like. People doing deep and abstract work in computing and software development can still have deeply-held opinions about the lowest-level issues in programming -- and hold them so strongly that they feel obligated to make their case publicly.

There is even a paper on the list that uses the device in a circular reference: "'Cloning Considered Harmful' Considered Harmful", by Kapser and Godfrey. This idea is taken to its natural endpoint by Eric Meyer in his probably-should-be-a-classic essay "Considered Harmful" Essays Considered Harmful. While Meyer deserves credit for the accuracy of his title, I can't help thinking he'd have scored more style points from the judges for the pithier "Considered Harmful" Considered Harmful.

Of course, that would invite the obvious rejoinder "'Considered Harmful' Considered Harmful" Considered Harmful, and where would that leave us?

Meyer's essay makes a reasonable point:

It is not uncommon, in the context of academic debates over computer science and Web standards topics, to see the publication of one or more "considered harmful" essays. These essays have existed in some form for more than three decades now, and it has become obvious that their time has passed. Because "considered harmful" essays are, by their nature, so incendiary, they are counter-productive both in terms of encouraging open and intelligent debate, and in gathering support for the view they promote. In other words, "considered harmful" essays cause more harm than they do good.

I think that many authors adopt the naming device as an attempt to use humor to take the sharp edge off what is intended as an incendiary argument, or at least a direct challenge to what is perceived as an orthodoxy that no one thinks to challenge any more.

Apparently, the CS education community is more prone than most to making this sort of challenge. CS educators are indeed almost religious in their zeal for particular approaches, and the conservatism of academic CS is deeply entrenched. Looking at Overbey's list, I identify at least nine "considered harmful" papers on CS education topics, especially on the teaching of intro CS courses:

  • Westfall, "'Hello, World' Considered Harmful"
  • Rosenberg and Koelling, "I/O Considered Harmful..."
  • Martin, "Toy Projects Considered Harmful"
  • Johnson, "C in the First Course Considered Harmful"
  • Schneider, "Compiler Textbook Bibliographies Considered Harmful"
  • Hitchner et al., "Programming Early Considered Harmful"
  • Buck and Stucki, "Design Early Considered Harmful"
  • Kay, "Bandwagons Considered Harmful..." (in curriculum development)
  • Hu, "Dataless Objects Considered Harmful"
I've read far too many of these... And there may well be other intro CS papers on the list that I don't recognize just from their names.

Some of the papers on the CS ed list are even in direct opposition to one another! Consider "Programming Early Considered Harmful" and "Design Early Considered Harmful". If we can't do programming early, and we can't do design early, what can we do? Certainly not structured programming; that's on the bigger list twice.

This tells you something about the differences that arise in CS education, as well as the community's sense of humor. It may also say something about our level of creativity! (Just joking... I know some of these folks and know them to be quite creative.)


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

April 06, 2007 6:20 PM

Two, to Close the Week

... to the Sixth

My dad turned 64 yesterday. That's a nice round number in the computing world, though he might not appreciate me pointing that out. It's hard for me to imagine him, or me, any different than we were when I was a little boy growing up at home. It's also hard for me to imagine that someday soon my daughter might be thinking the same about the two of us. Perhaps I need a bigger imagination.

... Months

This is the time of the academic year when folks seeking jobs at other institutions, in particular administrative promotions, begin to learn of their good fortune and to plan to depart. Several of my colleagues at the university will be moving on to new challenges after this academic year.

In a meeting this week, one such colleague said something that needed to be said, but which most people wouldn't say. It was on one of those topics that seems off limits, for political or personal reasons, and so it usually just hangs in the air like Muzak.

Upon hearing the statement, another colleague joked, "Two months. You have two months to speak the truth. Two months to be a truth teller."

It occurred to me then that this must be quite a liberating feeling -- to be able to speak truths that otherwise will go unspoken. Almost immediately on the heels of this thought, it occurred to me just how sad it is that such truths go unspoken. And that I am also unwilling to speak them. Perhaps I need greater courage, or more skill.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading, Personal

April 04, 2007 5:57 PM

Science Superstars for an Unscientific Audience

Somewhere, something incredible is waiting to be known.
-- Carl Sagan

Some time ago I remember reading John Scalzi's On Carl Sagan, a nostalgic piece of hero worship for perhaps the most famous popularizer of science in the second half of the 20th century. Having written a piece or two of my own of hero worship, I empathize with Scalzi's nostalgia. He reminisces about what it was like to be an 11-year-old astronomer wanna-be, watching Sagan on TV, "talk[ing] with celebrity fluidity about what was going on in the universe. He was the people's scientist."

Scalzi isn't a scientist, but he has insight into the importance of someone like Sagan to science:

... Getting science in front of people in a way they can understand -- without speaking down to them -- is the way to get people to support science, and to understand that science is neither beyond their comprehension nor hostile to their beliefs. There need to be scientists and popularizers of good science who are of good will, who have patience and humor, and who are willing to sit with those who are skeptical or unknowing of science and show how science is already speaking their language. Sagan knew how to do this; he was uncommonly good at it.

We should be excited to talk about our work, and to seek ways to help others understand the beauty in what we do, and the value of what we do to humanity. But patience and humor are too often in short supply.

I thought of Scalzi's piece when I ran across a link to the recently retired Lance Fortnow's blog entry on Yisroel Brumer's Newsweek My Turn column Let's Not Crowd Me, I'm Only a Scientist. Seconding Brumer's comments, Fortnow laments that the theoretical computer scientist seems at a disadvantage in trying to be Sagan-like:

Much as I get excited about the P versus NP problem and its great importance to all science and society, trying to express these ideas to uninterested laypeople always seems to end up with "Should I buy an Apple or a Windows machine?"

(Ooh, ooh! Mr. Kotter, Mr. Kotter! I know the answer to that one!)

I wonder if Carl Sagan ever felt like that. Somehow I doubt it. Maybe it's an unfair envy, but astronomy and physics seem more visceral, more romantic to the general public. We in computing certainly have our romantic sub-disciplines. When I did AI research, I could always find an interested audience! People were fascinated by the prospect of AI, or disturbed by it, and both groups wanted to talk about it. But as I began to do work in more inward-looking areas, such as object-oriented programming or agile software development, I felt more like Brumer felt as a scientist:

Just a few years ago, I was a graduate student in chemical physics, working on obscure problems involving terms like quantum mechanics, supercooled liquids and statistical thermodynamics. The work I was doing was fascinating, and I could have explained the basic concepts with ease. Sure, people would sometimes ask about my work in the same way they say "How are you?" when you pass them in the hall, but no one, other than the occasional fellow scientist, would actually want to know. No one wanted to hear about a boring old scientist doing boring old science.

So I know the feeling reported by Brumer and echoed by Fortnow. My casual conversation occurs not at cocktail parties (they aren't my style) but at 8th-grade girls' basketball games, and in the hall outside dance and choir practices. Many university colleagues don't ask about what I do at all, at least once they know I'm in CS. Most assume that computers are abstract and hard and beyond them. When conversation does turn to computers, it usually turns to e-mail clients or ISPs. If I can't diagnose some Windows machine's seemingly random travails, I get quizzical looks. I can't tell if they think I am a fraud or an idiot. Isn't that what computer scientists know, what they do?

I really can't blame them. We in computing don't tell our story all that well. (I have a distinct sense of deja vu right now, as I have blogged on this topic several times before.) The non-CS public doesn't know what we in CS do because the public story of computing is mostly non-existent. Their impressions are formed by bad experiences using computers and learning how to program.

I take on some personal responsibility as well. When my students don't get something, I have to examine what I am doing to see whether the problem is with how I am teaching. In this case, maybe I just need to be more interesting! At least I should be better prepared to talk about computing with a non-technical audience.

(By the way, I do know how to fix that Windows computer.)

But I think that Brumer and Fortnow are talking about something bigger. Most people aren't all that interested in science these days. They are interested in the end-user technology -- just ask them to show you the cool features on their new cell phones -- but not so much in the science that underlies the technology. Were folks in prior times more curious about science? Has our "audience" changed?

Again, we should think about where else responsibility for such change may lie. Certainly our science has changed over time. It is often more abstract than it was in the past, farther removed from the user's experience. When you drop too many layers of abstraction between the science and the human experience, the natural response of the non-scientist is to view the science as magic, impenetrable by the ordinary person. Or maybe it's just that the tools folks use are so commonplace that they pay the tools no mind. Do us old geezers think much about the technology that underlies pencils and the making of paper?

The other side of this issue is that Brumer found, after leaving his scientific post for a public policy position, that he is now something of a star among his friends and acquaintances. They want to know what he thinks about policy questions, about the future. Ironic, huh? Scientists and technologists create the future, but people want to talk to wonks about it. They must figure that a non-scientist has a better chance of communicating clearly with them. Either they don't fear that something will be lost in the translation via the wonk, or they decide that the risk is worth taking, whatever the cost of that.

This is the bigger issue: understanding and appreciation of science by the non-scientist, the curiosity that the ordinary person brings to the conversation. When I taught my university's capstone course, aimed at all students as their culminating liberal-arts core "experience", I was dismayed by the lack of interest among students about the technological issues that face them and their nation. But it seems sometimes that even CS students don't want to go deeper than the surface of their tools. This is consistent with a general lack of interest in how the world works, and the role that science and engineering play in defining today's world. Many, many people are talking and writing about this, because a scientifically "illiterate" person cannot make informed decisions in the public arena. And we all live with the results.

I guess we need our Carl Sagan. I don't think it's in me, at least not by default. People like Bernard Chazelle and Jeannette Wing are making an effort to step out and engage the broader community on its own terms. I wish them luck in reaching Sagan's level and will continue to do my part on a local scale.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

March 30, 2007 6:51 PM

A Hint for Idealess Web Entrepreneurs

I'm still catching up on blog reading and just ran across this from Marc Hedlund:

One of my favorite business model suggestions for entrepreneurs is, find an old UNIX command that hasn't yet been implemented on the web, and fix that. talk and finger became ICQ, LISTSERV became Yahoo! Groups, ls became (the original) Yahoo!, find and grep became Google, rn became Bloglines, pine became Gmail, mount is becoming S3, and bash is becoming Yahoo! Pipes. I didn't get until tonight that Twitter is wall for the web.

Show of hands -- how many of you have used every one of those Unix commands? The rise of Linux means that my students don't necessarily think of me as a dinosaur for having used all of them!

I wonder when rsync will be the next big thing on the web. Or has someone already done that one, too?
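For readers who never lived at a Unix prompt, here is a minimal, illustrative sketch of two of the commands named above. The "/tmp/unix_web_demo" path and the sample file are my own inventions for the demo:

```shell
# ls: list a directory -- the original Yahoo! was a hand-curated "ls" of the web.
# grep: search file contents -- Google is "grep" at planetary scale.
mkdir -p /tmp/unix_web_demo
echo "hello web" > /tmp/unix_web_demo/page.txt
ls /tmp/unix_web_demo                     # prints: page.txt
grep -l web /tmp/unix_web_demo/*.txt      # prints the matching files: /tmp/unix_web_demo/page.txt
```

The leap from these one-machine tools to their web descendants is mostly a leap in scale and audience, which is Hedlund's point.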

Then Noel Welsh points out a common thread:

The real lesson, I think, is that the basics of human nature are pretty constant. A lot of the examples above are about giving people a way to talk. It's not a novel idea, it's just the manifestation that changes.

Alan Kay is right -- perhaps the greatest impact of computing will ultimately be as a new medium of communication, not as computation per se. Just this week an old friend from OOPSLA and SIGCSE dropped me a line after stumbling upon Knowing and Doing via a Google search for an Alan Kay quote. He wrote, "Your blog illustrates the unique and personal nature of the medium..." And I'm a pretty pedestrian blogger way out on the long tail of the blogosphere.

This isn't to say that computation qua computation isn't exceedingly important. I have a colleague who continually reminds us young whippersnappers about the value added by scientific applications of computing, and he's quite right. But it's interesting to watch the evolution of the web as a communication channel, and as our discipline lays the foundation for a new way to speak we make possible the sort of paradigm shift that Kay foretells. And this paradigm shift will put the lie to the software world's idea that moving from C to C++ is a "paradigm shift". To reach Kay's goal, though, we need to make the leap from social software to everyman-as-programmer, though that sort of programming may look nothing like what we call programming today.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

March 26, 2007 8:14 PM

The End of a Good Blog

Between travels and work and home life, I've fallen way behind in reading my favorite blogs. I fired up NetNewsWire Lite this afternoon in a stray moment just before heading home and checked my Academic CS channel. When I saw the blog title "The End", I figured that complexity theorist Lance Fortnow had written about the passing of John Backus. Sadly, though, he has called an end to his blog, Computational Complexity. Lance is one of the best CS bloggers around, and he has taught me a lot about the theory of computation and the world of theoretical computer scientists. Theory was one of my favorite areas to study in graduate school, but I don't have time to keep up on its conferences, issues, and researchers full time. These days I rely on the several good blogs in this space to keep me abreast. With Computational Complexity's demise, I'll have one less source to turn to. Besides, like the best bloggers, Lance was a writer worth reading regardless of his topic.

I know how he must feel, though... His blog is 4.5 years and 958 entries old, while mine is not yet 3 years old and still shy of 500 posts. There are days and weeks where time is so scarce that not writing becomes easy. Not writing becomes a habit, and pretty soon I almost have to force myself to write. So far, whenever I get back to writing regularly, the urge to write re-exerts itself and all is well with Knowing and Doing again. Fortunately, I still feel the need to write as I learn. But I can imagine a day when the light will remain dim, and writing out of obligation will not seem right.

Fortunately, we all still have good academic CS blogs to read, among my favorites being the theory blogs Ernie's 3D Pancakes and The Geomblog. But I'll miss reading Lance's stuff.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

March 25, 2007 9:35 AM

Another Way Life is Like Running

Seven miles on heavy legs this morning had these passages on my mind:

To me it was a relief just to realize it might be ok to be discontented. The idea that a successful person should be happy has thousands of years of momentum behind it. If I was any good, why didn't I have the easy confidence winners are supposed to have? But that, I now believe, is like a runner asking "If I'm such a good athlete, why do I feel so tired?" Good runners still get tired; they just get tired at higher speeds. ...

If you feel exhausted, it's not necessarily because there's something wrong with you. Maybe you're just running fast.

As a runner trying to work back into shape, I appreciate this reminder. As a person trying to be a good husband and father, I appreciate it. As a student of computing and programming who faces an unending flow of new knowledge to be gained, I appreciate it. As a department head who can't seem to get off the treadmill of daily grind, I appreciate it.

You should read the whole piece in which these words appear, another essay from the wisdom of Paul Graham. Don't worry; it's not about running.

Oh, and as for seven miles on heavy legs this morning: The sun is out, the temperature is up, and spring rains fell overnight. However slow I felt, the run was a glorious reminder that spring has returned.


Posted by Eugene Wallingford | Permalink | Categories: General

February 12, 2007 8:15 PM

3 Out Of 5 Ain't Bad

... with apologies to Meat Loaf.

Last week, I accepted the Week of Science Challenge. How did I do? On sheer numbers, not so well. I did manage to blog on CS topics on three of the five business days. That's not a full week of posts, but it is more than I have written recently, especially straight CS content.

On content, I'm also not sure I did so well. The first post, on programming patterns across paradigms, is a good introduction to an issue of longstanding interest to me, and it caused me to think about Felleisen's important paper. The second post, on the beautiful unity of program and data, was not what I had hoped it would be. I think I overreached, trying to bring too many different ideas into a single short piece. It either needs to be a lot longer, or it needs a sharper focus. At least I can say that I learned something while writing the essay, so it was valuable to me. But it needs another draft, or three, before it's in a final state. Finally, my last piece of the week turned out okay for a blog entry. Computer Science -- at least a big part of the varied discipline that falls under this umbrella -- is science. We in the field need to do a better job helping people to know this.

Accepting the challenge served me well by forcing me to write. That sort of constraint can be good. Because I had to write something, even when not borne away by a flash of inspiration, I had to tackle topics that required extra effort in the moment, and to write quickly enough to get 'er done that day. These constraints, too, can boost creativity, and help build the habit of writing where it has fallen soft in the face of too many other claims on time. In some ways, writing those essays felt like writing essay exams in college!

I think that I would probably have wanted to write about all of these ideas at some point later, but without the outside drive to write now I would probably have deferred them until a better time, until I was "ready". But would I ever? It's easy for me to wait so long that the idea is no longer fresh enough for me to write. An interesting Writing Down The Bones-like exercise might be for me to grab an old ideas file and use a random-number generator to pick out one topic every day for a week or so -- and then just write it.

As for the pieces produced this week, I can imagine writing more complete versions of the last two some day, when time, an inspiration, or a need hits me.

As I forced myself to look for ideas every day, I noticed my senses were heightened. For example, one evening last week I listened to an Opening Move podcast with Scott Rosenberg, author of Dreaming in Code. This book is about Mitch Kapor's years-long project to build the ultimate PIM, Chandler. During the interview, Rosenberg comments that software is "thought stuff", not subject to the laws of physics. As we implement new ideas, users -- including ourselves -- keep asking for more. My first thought was, I should update my piece on CS as science. CS helps to ask and to answer fundamental questions about what we could reasonably ask for, how much is too much, and what we will have to give up to get certain features or functionality. What are the limits to what we can ask for? Algorithms, theory, and experiment all play a role.

Maybe in my next draft.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

February 02, 2007 5:57 PM

Week of Science Challenge, Computer Science-Style

I don't consider myself a "science blogger", because to me computer science is much more than just science. But I do think of Knowing and Doing as a place for me to write about the intellectual elements of computer science and software development, and there is a wonderful, intriguing, important scientific discipline at the core of all that we do. So when I ran across the Week of Science Challenge in a variety of places, I considered playing along. Doing so would mean postponing a couple of topics I've been thinking about writing about, and it would mean not writing about the teaching side of what I do for the next few days. Sometimes, when I come out of class, that is all I want to do! But taking the challenge might also serve as a good way to motivate myself to write on some "real" computer science issues for a while again. And that would force me to read and think about this part of my world a bit more. Given the hectic schedule I face on and off campus in the next four weeks, that would be a refreshing intellectual digression -- and a challenge to my time management skills.

I have decided to pick up the gauntlet and take the challenge. I don't think I can promise a post every day during February 5-11, or that my posts will be considered "science blogs" by all of the natural scientists who consider computer science and software development as technology or engineering disciplines. But let's see where the challenge leads. Watch for some real CS here in the next week...


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

January 31, 2007 8:42 PM

Searching for a College Sysadmin

In my last post, I commented on academic searches. One of the articles I linked to gives some pretty good advice to prospective faculty applying for positions at smaller schools. They aren't fallback positions; they are different kinds of positions altogether. In the spirit of that post, I thought I'd share some of my recent experiences with other kinds of academic search.

This year, I am involved in a couple of high-profile searches on my campus, both of which matter very much to the faculty in my department. The first is for the chief sysadmin in the College of Natural Sciences, and the second is for the Assistant Vice President of information technology for the university. Both matter to us for the same reasons, though in different doses. In this post, I'll talk a bit about the sysadmin search.

Being a medium-sized "teaching university", we do not maintain and manage most of the basic computational resources that we use on a day-to-day basis. In the early years, before the rest of the college and university were deeply into technology, we did. But as our needs grew, and as other departments came to need computer labs and e-mail and the web, the college took on more and more of the systems burden. These days, the college sysadmin team implements and supports the network and server infrastructure for the college, manages the general college labs that our students use, and supports CS faculty labs and computer equipment. Not having the research money to maintain all of our department resources, this has worked out well enough for the last decade or so, with only occasional control issues arising. (One current one involves our web server, which for the last few weeks has been rewriting URLs in such a way as to break my blog permalinks.)
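As an aside for the curious, permalink breakage of that sort usually comes from a server-wide rewrite rule. A hypothetical Apache mod_rewrite fragment (the paths and rule here are invented for illustration, not our server's actual configuration) shows how easily it happens:

```
# Hypothetical Apache config, for illustration only.
# A blanket rule like this redirects every .html request into a new CMS,
# which silently breaks any old permalink that pointed at the .html files:
RewriteEngine On
RewriteRule ^blog/(.*)\.html$ /new-cms/index.php?page=$1 [R=302,L]
```

A narrower rule, or explicit redirects for the old URLs, would preserve the permalinks.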

So, the college's lead sysadmin has a challenging job to do. There are a lot of technical tasks to do, plus managing student staff and providing user support. The diversity of issues that arise in the college is yet another challenge, ranging across a CS department, a math department, an industrial technology department, and the traditional natural sciences. In some ways, CS demands less personal support from the college, but we do have specific technical needs, sometimes outliers from the rest of the crowd, that we rely on heavily.

I'm chairing the college sysadmin search. In the process, I've come to appreciate just how hard it is to find candidates with sufficient technical skills and experience, sufficient "people" skills to work with a diverse audience of users, and sufficient managerial experience. Add to that a salary that is almost certainly lower than market value in private industry, and the task is even tougher. Even with these challenges, we have a promising pool of finalists and hope to begin on-campus interviews soon.

Here are some things that I've learned in the last few years and am trying to put into practice for this search:

  • Technical skills matter. Period. You can't hire a sysadmin with marginal or insufficient skills and hope that they can grow into the job. Technology changes so fast that most such folks never have the chance to catch up. They just fall farther and farther behind. Strong Unix and Windows background; experience setting up and configuring servers; broad experience with software -- all are essential. Don't settle for less lightly.
  • Communication skills matter. A college-level sysadmin must be able to work with advanced and novice users alike, train students, and mentor full-time assistants. They also have to be able to explain technical choices and trends to department heads and deans, so that these administrators can make informed decisions about equipment and budgets. This latter is more important than many folks realize. Being a sysadmin at this level isn't just about hacking Ruby scripts to automate mechanical tasks; it also contains an element of helping others set vision.
  • But do not emphasize communication skills to the detriment of technical skills. Of course, you can't emphasize technical skills to the detriment of communication skills either, but at my university and in our current environment, favoring tech skills to the expense of communication skills just hasn't been a problem. Folks are more inclined to go with someone who has people skills and is agreeable, even if they lack the skills necessary to do the job well. Don't do it. You'll always pay in the long run.
  • On the search process itself: Fairness matters. Once you have published a job description, stick to it. It can be really tempting to say "I like this candidate's application..." and then start looking away from the ways in which the candidate falls short of the requirements and preferences that you had defined for the position at the outset of the search. This is a sure way to let your hidden biases get in the way of hiring the right candidate. The job description is your specification. If it is well thought out, you're almost certainly better off letting it guide the committee's actions.

    But I think it's also wrong to skirt the job description for another reason: It's unfair to other candidates who were honest about not meeting the requirements of the position and so didn't apply at all. If you are willing to consider folks in your pool who do not meet the specs, then you might really want to consider some of those who self-selected out of the pool through a sense of honor. And then there is the practical problem of having to relax the requirements fairly for all of the applicants in the pool, not just the one who caught your attention for some other reason.

    Maybe I'm too strict on this matter, but this sort of fairness is a sticky point for me. I try to manage my classroom similarly. If I feel a need to repeatedly relax a particular rule, then I probably need to change the rule, not relax it. Otherwise, exceptions tend to favor that certain group of students who are bold or needy enough to ask for exceptions, at the expense of the students who live by the rules quietly. (Can you guess which sort of student I most likely was?)

  • Leadership matters. Once a sysadmin can put the ship into seaworthy condition and have things running as they should, it's awfully nice to have someone in place who can think bigger thoughts and help guide the department or college. This isn't crucial, as faculty and other administrators are already charged with this task, but why waste the expertise that your sysadmin brings to the table, especially if they are inclined to help lead? Allowing them to do so helps the college and lets the sysadmin grow professionally.

We'll see how successful the college is at finding the right sort of person, with what bit of influence I have been able to exert.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

January 30, 2007 2:06 PM

Academic Searches

Physics blogger Chad Orzel has written a couple of articles on the process of searching for new faculty, including his recent tips for applicants to schools like his. I haven't been part of a CS faculty search in a couple of years, after what seemed an interminable decade of trying and failing to fill an open slot or two. Our last two searches resulted in three hires, and we couldn't be much happier with the results. Our department is much better now than it was before.

I attribute our success as much to good fortune as to any particular actions we took. When I first became involved in job searches, I thought that a good search committee could reliably produce a good result. Write the correct job description, do the right sort of applicant screening and reference checking, and interview the finalists in the correct way, and out would pop the Ideal Candidate. After years of trying to do the job better, I came to understand how naive I had been. The process of searching for and hiring a member of the faculty is inherently risky, and the best one can do is hope to hire a good person while avoiding any major disasters. It's much easier to watch for the red flags that indicate a higher-than-usual level of risk than it is to recognize the signals that ensure a winner, but even on that side of the equation the committee must have the patience not to rush ahead in the presence of red flags.

Faculty searches are tougher than most, because the outcomes on which success will be measured are forecasts based on a small set of data points from a world that is not all that representative of the world in which new faculty will operate. We do our best and then act on faith in a person. Sometimes we just guess wrong. When things don't turn out right, both the new faculty member and the hiring department have suffered a loss. The split may be amicable, but neither party is better as a result.

For some staff positions, I think a good search committee can deliver a good result with a much higher probability, if only because the parameters of the position are easier to delineate and evaluate. Besides, there usually isn't the messy specter of a tenure decision complicating most such hires. But there's still an element of risk involved.

As a department head, I've come to appreciate that another challenging kind of search is for administrators and staff who affect the longer-term strategies of the department. I'll write a bit about this kind of search next time.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

January 10, 2007 7:06 PM

Two Quick Notes

1. When I ran the native OS X spellchecker on my last entry, it balked at "Spolsky", and suggested "Splotchy" instead. I don't know why I feel compelled to say so, but there. At least that's funnier than replacing Wozniak with wooziness.

2. You may notice that the links in the preceding paragraph are broken right now, and that all of my permalinks have been for a week or so, I guess. By broken, I mean that it doesn't take you to the 11/20/04 entry titled "Strange Blogging Occurrences" as it should. (At least it takes you to November 2004!) Our college web server underwent some migration and reconfiguration recently, and redirection of URLs seems to be a problem yet. I hope that the problem is fixed soon. In the meantime, my apologies for any inconvenience as you read.


Posted by Eugene Wallingford | Permalink | Categories: General

January 10, 2007 6:55 PM

Blogging When the Gifts Are Good

Maybe if I had big companies bribing me to blog I would make more time in these busy days?

The one sort of freebie that faculty tend to receive are examination copies of textbooks. Some publishers, especially the smaller ones and the highest-scale one, tend not to send unsolicited copies and expect you to return requested exam copies if you don't adopt the text. But some of the major CS-oriented publishers are quite generous. Sometimes you have to ask, either on-line, through a rep, or at a conference, but sometimes unsolicited copies flow freely. The most recent impetus was the advent of the Java intro and data structures texts. I long ago lost count of how many such texts I've examined, both requested and unsolicited.

As Spolsky points out, there is an ethical question at play. I think I've maintained a healthy relationship with the textbook reps I've known, and I know some of them quite well. A couple of my local reps have been serving my university for a long time, and some of the reps that work conferences such as OOPSLA and SIGCSE have been on that circuit for nearly as long as I. As long as I approach texts as potential adoptions, I can accept a copy for examination. As the text moves away from my core teaching areas, I begin to feel guilty about taking a book. Sometimes I feel a bit guilty even when a book lies right in my area of teaching -- if I really want the book. Somehow, getting something I really want for free seems wrong, even if there is a legitimate professional reason. Must have something to do with how I was raised.

Of course, working at a relatively small school, I don't have much to offer publishers even if I adopt a textbook as a result of some gift or other freebie. If I taught 500-person sections at Mega State, then maybe... but my 35-person sections, even as an annuity over several years, don't amount to much revenue for anyone. Nor have I blogged many book reviews (or book crushes), and the size of my readership hardly makes me a target of Massive Consolidated Publishing.

The one way that I could enrich myself in a meager way from all the unsolicited books is to sell them to one of the many book re-sellers that now troll our halls and inundate us with e-mail. I would never do that, because I want neither to gain financially from the books nor to encourage the book reseller business. If I do anything other than keep the book, I give it to a needy undergrad or grad student looking to do some extracurricular work. Someone can gain from the book that way. Perhaps I should return the book to the publisher, but I don't feel much of an incentive to spend my time undoing something that the publisher did on its own. My time is scarce enough as it is.

I do have one book review that I plan to do sometime soon, on Chris Pine's Learn To Program. This is intended as an intro book -- the first? -- using Ruby. When I hesitantly sent Andy Hunt of the Pragmatic Bookshelf an e-mail asking for an exam copy, he sent me one immediately. We haven't had a chance to offer a Ruby-based intro programming course yet, so I couldn't adopt the book at the time. But it's interesting enough to talk about in public. If I do blog on it, I will be fair and as objective as possible. I owe you -- and myself -- that much. And I think Andy and Chris would want that, too.

In closing, I will recommend a piece of software, and the recommendation will expose me as an OS X guy. This probably eliminates whatever small chance I ever had of receiving, like Joel Spolsky, an offer to blog about a complimentary "loaded Ferrari 1000 courtesy of Windows Vista and AMD". That is a risk I am willing to take.

I send a lot of e-mail. I send a lot of e-mail with attachments. I cannot count all the times I have sent a message that said "Attached is ..." or "See the attachment" -- and yet forgot to attach the file. Until a couple of months ago, this was an increasingly frequent and frustrating behavior of mine.

Then I discovered James Eagan's free Attachment Scanner Plugin for Mail.app. Problem solved. It recognizes every variation of "*attach*" that I've ever used, and every time it reminds me to attach the file if I haven't already. Paradise.

The plug-in solved a major problem for me. It is free. And to top that, Eagan wrote a fine tutorial on how he decoded Mail.app's private plug-in interface and wrote his plug-in. That's more than I could have asked for!
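The core idea behind such a plug-in is simple enough to sketch in a few lines of Ruby. This is just my own hypothetical illustration of the general approach -- not Eagan's actual code, and all of the names here are mine:

```ruby
# Warn when a message body talks about attachments but nothing is attached.
# The pattern captures the common "*attach*" variations.
ATTACH_WORDS = /\battach(?:ed|ment|ments|ing)?\b/i

def missing_attachment?(body, attachments)
  body.match?(ATTACH_WORDS) && attachments.empty?
end

# A quick check before hitting Send:
draft = "Attached is the syllabus for spring."
puts "Did you forget the attachment?" if missing_attachment?(draft, [])
```

The real plug-in hooks into Mail.app's send action, of course; the sketch only shows the pattern-matching heart of the trick.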

Mr. Eagan did not pay me to say that.


Posted by Eugene Wallingford | Permalink | Categories: General

January 04, 2007 3:49 PM

A Tentative First Post of the Year

I was thinking that break time would be a great time to write a bit, both here and in my day job. But as Lance Fortnow writes, the academic world isn't nearly as quiet over the Christmas holidays as it used to be. I spent some of the time between Christmas Day and New Year's Day in my office working on some forward-dated Call for Proposal web pages for OOPSLA 2007 and some other time catching up on administrative and bookkeeping tasks. I also spent some time with my wife and my daughters, enjoying family time that is so easily squeezed out during the school year -- as much by their schedules as mine! Top it off with a whirlwind trip to Michigan and Indiana to visit extended family over the New Year's weekend, and suddenly I'm back in the office thinking about summer teaching schedules, graduate assistant assignments, and campus IT policies. I'm itching to write but trying to get a lot done, both here and at home, before the new semester begins in earnest.

I've noticed a pattern in my blogging. Whenever I go a few days without writing, I find that my first post is likely to be more personal than professional. It's almost as if I need to clear out my system a bit, priming the pump for what is to come. With a semester teaching what is probably my favorite course, Programming Languages, there will surely be plenty to say. And I have entries on the way dealing with people as diverse as Stephen King and Tiger Woods, and topics ranging from science and liberal education to Scheme to a review of my year in running shoes.

I'm not the sort of person who makes New Year's resolutions. It was never a family tradition, and it never has appealed to me all that much as someone who believes in continuous feedback and improvement (even if I don't often practice what I preach). But I did see a quote this morning, in an inspirational e-mail message from writer Matthew Kelly, that gave me a little resolution buzz:

Preparation and anticipation play a powerful role in our lives; let's stop robbing ourselves of these gifts by doing everything at the last minute.

If I can take even a cautious step in this direction, I think I'll enjoy 2007 more than 2006.


Posted by Eugene Wallingford | Permalink | Categories: General

December 27, 2006 10:58 AM

Holiday Filmfest

It's not a holiday tradition for me to watch movies, but this holiday weekend I had a chance to see several -- not counting seasonal fare such as The Santa Clause. Only one was new, but as my brother is fond of saying, "Hey, they were new to me."

Last night I finally got to see Proof, the only one of the movies with a professional connection. I'm not a mathematician, but I can relate to the manic periods of productivity interleaved with periods of little or no progress. As Robert, the mad genius played by Anthony Hopkins, and his daughter Catherine, played surprisingly well by Gwyneth Paltrow, both say at different points in the movie, the key to those down times is to keep plugging away, trying something new, attacking the problem from different angles, chipping away at the problem even when the effort seems fruitless. You want to be there, ready, when the next burst of progress arrives.

The movie also brought to mind G. H. Hardy's wonderful A Mathematician's Apology, which I referenced briefly a year or so ago. There is a persistent mythology that mathematicians make their greatest contributions to the world at an early age, after which their creative powers decline. I've generally heard 35 as this threshold age, but in Proof the mathematicians are much less generous -- Catherine at 27 could already be past her prime. While I do believe that scholars lose a certain kind of stamina they need for making great breakthroughs as they grow older, I don't buy into this Conjecture of Inevitable Decline at a young age -- and certainly not 27! Certainly the history of computer science is replete with examples of the great ones making seminal contributions well past 35 -- Knuth and Simon come immediately to mind. We do ourselves and our community a disservice when we make excuses for lack of productivity and creativity as we age. That said, not ever having been a Great One, I can only try to imagine the feeling that Hardy describes, and that must have accentuated Robert's madness, as they fall away from their peak.

My weekend began with a trip to see the new release Eragon live in the theater with my daughter. She took me to this movie as a Christmas present, and it made for a very nice dad-and-daughter outing indeed. Growing up, I was more into science fiction than fantasy -- Asimov, Clarke, Heinlein -- but my daughter has gravitated to fantasy. She has already read the book on which this movie is based, and its sequel, so she could fill me in whenever I had a question. If I were a literary critic, I would call Eragon almost wholly derivative of prior works, but then most fantasy is. As a moviegoer, I call Eragon an entertaining show. And Sienna Guillory as Arya -- wow.

Sandwiched between the previous two was a Christmas evening viewing of Dear Frankie, a gem of a movie that I'd never heard of before my daughters pulled it off the video store shelf. Jack McElhone delivers an affecting performance as Frankie, a deaf 9-1/2 year old who longs to see his father. Frankie believes his father to be at sea after leaving their family when Frankie was but an infant. In truth, Frankie's mom, Lizzie, played by the enchanting Emily Mortimer, has been on the run from the father all those years, moving frequently about Scotland with her son and her mother. Lizzie has been responding to Frankie's letters to his dad for years, unwilling to tell him the truth. Rather than tell you the whole story, or give away too much, let me give you a better piece of advice: watch this movie, and soon. It's one of the best I've seen in years.

That's my turn playing Gene Siskel for a while.


Posted by Eugene Wallingford | Permalink | Categories: General

December 20, 2006 5:16 PM

The Long Tail as Software Designer

Almost every day I am reminded of how the way the world works is changing all around us. For example, Philip Windley writes of the day, coming soon, when MP3 players are like pens -- "everywhere, given away, easily abandoned, even disposable". (I almost wrote "ballpoint pens", but that would betray my ever more apparent state of being a dinosaur.)

Then I learned about My Dream App, which lies at the convergence of American Idol, the web, and the independent software world:

My Dream App is a grand experiment to see what happens when you combine the expertise of some of the best talents in the software and tech world with great ideas and feedback from everyone else.

Like Idol, My Dream App had open tryouts to get into the pool of contestants and then judging by a panel of experts as the apps were winnowed through a series of rounds. However, My Dream App's panel of experts puts Idol's to shame! At the end, the viewing public selected winners by voting on-line. At stake was not a recording contract but a contract to have the idea implemented by a crack team of Mac developers, with royalties for life. The coding team has some great developers, including one of the guys behind SubEthaEdit.

Simon (Cowell) says: thumbs down!

Steve Wozniak

While I'm not all that enamored by most of the apps that finished at the top of the pile, with perhaps one exception, I think that this is a great idea. It exploits the power of a large number of people to brainstorm ideas and then allows them to participate in a selection process that is guided by informed folks who can provide a more focused perspective. And the allure of having Woz act as Simon Cowell must surely have attracted a few people to take their shot at submitting an idea. "That's the most pathetic feature I've seen since Bill Atkinson wanted to prevent users from specifying their own desktop patterns on the original Mac."

This is a new variation in the space of idea generation that I have written about before. On one end of a continuum is the great solo creator like Steve Jobs, who seems to have an innate sense of what is good; at the other end is Howard Moskowitz, who produces an insanely large set of possibilities, including strange ones that we think no one might like, and then lets people discover what they like. My Dream App is more in the Moskowitz vein, but with a twist -- let everyone with an internet connection build your set of possibilities for you, and then let the crowd work with informed guides to winnow the set. The ubiquity of the web makes possible a collaborative process that would have been unwieldy at best in earlier times.

I wonder how long it will be before a mainstream producer -- say, an automobile manufacturer -- uses this sort of approach to design a mainstream product. Just imagine... the 2009 Long Tail coupe, original idea by an insurance executive in Hawarden, Iowa, refined by the masses under the watchful eye of Lee Iacocca. Many auto manufacturers do worse on their own. When harnessed in the right sort of process, the wisdom of the crowd is a powerful force.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

December 16, 2006 2:25 PM

The Prospect of Writing a Textbook

Writing a book is an adventure. To begin with, it is
a toy and an amusement; then it becomes a mistress, and
then it becomes a master, and then a tyrant. The last
phase is that just as you are about to be reconciled to
your servitude, you kill the monster, and fling him out
to the public.

-- Winston Churchill

So maybe I think of myself as a writer, at least part of the time. Why haven't I written a book -- especially a textbook -- yet? My wife often asks when my book will be ready. She would like to see all the work I've done, both solo and with my many friends and colleagues, become more visible in the world. My parents ask occasionally, too, as do some other colleagues. For many folks, a book is "the" way to make one's ideas usable by others. Papers, conference presentations, and a blog aren't enough.

To be fair my wife and many others who ask, I have been working for several years with colleagues on new ways to think about teaching first-year CS courses, motivated by real problems and pattern languages. We have progressed in spurts and sputters, as we learn more about what we are trying to do and as we try to make time to do the hard work of developing something new. I have learned a lot from this collaboration, but we haven't produced a book yet.

I suppose that I just haven't found the project that makes me write a book yet. A book project requires a huge amount of work, and I suppose that I need to really want to write a book to take on the yoke.

Other than my doctoral dissertation ("Conceptual Retrieval from Case Memory Based on Problem-Solving Roles") and several long-ish papers such as a master's thesis ("Temporal Logic and its Use in the Symbolic Verification of Hardware" -- written in nroff, no less!), I have never written a large work. But I occasionally sense what it must be like to write a textbook, a monograph, or even a novel. When I am at my most productive producing ideas and words, I see common triggers and common threads that tie ideas together across time and topic. When I am blogging, these connections manifest themselves as links to past entries and to other work I've done. (Due to the nature of blogging, these links are always backwards, even though my notes often remind me to foreshadow something I intend to write about later and then to link back to the current piece.) However, I know that when I have blogged on a topic I've only done the easy part. Even when I take the time to turn a stream-of-consciousness idea into a reasonably thoughtful piece for my blog, the hard work would come next: honing the words, creating a larger, coherent whole, making something that communicates a larger set of ideas to a reader who wants more than to drop occasionally into a blog to hear about a new idea. I don't think I fear this work, though I do have a healthy respect for it; I just haven't found the One Thing that makes me think the payoff would be worth the hard work.

The closest thing to a textbook that I have written are the lecture notes for my Programming Languages course. They are relatively complete and have been class-tested over several years. But these are largely a derivative work, drawing heavily on the text Essentials of Programming Languages and less heavily on a dozen other sources and inspirations. Over time, they have evolved away from dependence on those sources, but I still feel that my notes are more repackaging than original work. Furthermore, while I like these notes very much, I don't think there is a commercial market that justifies turning them into a real textbook, with end-of-the-chapter summaries and exercises and all that. They serve their purpose quite well -- at least well enough -- and that's enough for me.

What about the personal and university identity to be gained by writing a text? Reader Mike Holmes pointed me to a passage on the admissions web site of one of our sister institutions boasting that their "professors actually write the textbooks they and professors at other colleges use". That's a strong marketing point for a school. My university likes to distinguish itself from many larger universities by the fact that our tenured faculty are the ones teaching the classes your students will take; how much better if those professors had written the text! Well, as Mike points out, many of us have had courses with author-professors who were average or worse in the classroom. And if a textbook has few or no external adoptions -- as so, so many CS texts do -- then the students at Author's U. would probably have been better off had the author devoted her textbook-polishing efforts to improving the course.

Maybe this is all just a rationalization of my lack of ambition or creative depth. But I don't think so. I think I'll know when a book I'm meant to write comes along.

Could my work on this blog eventually lead to a book? Another reader once suggested as much. Perhaps. It is certainly good to have an outlet for my ideas, a place where they go from abstractions to prose that might be valuable to someone. The blog is also an encouragement to write regularly, which should help me become a better writer if nothing else. Then, if a book I'm meant to write comes along, I should be prepared to write it.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

December 12, 2006 6:00 PM

Am I a Writer, Too?

[Written on a plane between Detroit and Minneapolis yesterday, returning from an OOPSLA 2007 planning meeting in Montreal.]

I have at various times in the life of this blog proclaimed myself a runner and a programmer. (The latter post has drawn comments from several readers, including an out-of-context "Me, too" from Joe Bergin in a Montreal restaurant Friday night!) My many posts on teaching my courses leave little doubt that I also think of myself as a teacher.

My flight home today leads me to want to say "I am a writer", too. Over the last year or more, I have gone through occasional droughts posting here, and even when I have written I have not always felt particularly inspired. But stimulate me -- send me to a conference, or to a meeting with interesting people; give me a good book; turn me loose in a new course to teach -- and the flood gates open.

A few moments ago, the flight attendant told us to turn off all portable electronic devices, including laptop computers. So here I sit, scribbling this message on two small Northwest Airlines napkins received as part of the beverage service earlier. Two napkins, filling both sides and now writing up the margin. The words won't stop.

Flying home led me to think about what I had learned about my job as communications chair, and suddenly all sorts of ideas for new articles began to flow, and ideas for how to grow, merge, and finish up ideas that have been sitting inchoate in my "blogs to write" folder for days and weeks. Some of the ideas were triggered by reading I'd done on the plane; others, from conversations over the weekend; yet others, from who knows where.

Let's just say that my hopper is hopping and should serve well in the coming days as I take time to blog.

Can I be a runner, a programmer, a teacher, and a writer -- and a husband and a father -- all at once? Yes. I certainly admire many of my colleagues and friends who seem to be so many things, so well. Perhaps such breadth is just a part of being human. We are certainly lucky to live in a time and place where we can be so many people at once.


Posted by Eugene Wallingford | Permalink | Categories: General

December 11, 2006 7:29 PM

Learning about My Communications Job ...

... on one last December morning in Montreal, where I am on the job.

I awoke for an early morning run to snow, the wet fluffy snow of temperatures near 0° C. In the downtown area, all that remained were wet streets, but as I jogged toward Mont Royal the snow was still falling and the 6-8% grade up Rue Peel was covered in an inch or so. I love to run in falling snow, and my first snow run of the year is often special. I enjoyed this one as much as usual.

Just past half way on my out-and-back route, I missed a turn that seemed obvious yesterday. Perhaps it was the snow-covered street signs, or my snow-covered glasses. The result was 17 minutes or so of backtracking and retracing my steps, and a planned 8-miler turned into a 10-miler that lasted into Monday morning rush hour traffic. The 6-8% grade down a snowy Rue Peel was, oh, let's say more challenging than the run up. I survived with no spills but a few minutes of my heart pounding a bit more than usual.

I hope that the OOPSLA 2007 wiki, which we hope to have up and running any day, will be a place for OOPSLA-bound runners to share advice on routes and warnings of things to watch for. We usually launch the conference wiki at or just a few weeks before the conference, but I think having it running all year long will offer a chance for potential conference attendees and other altruistic souls to build community around issues related both to the conference and to our personal pursuits in Montreal. This is a part our vision for the communications component of our conference organization.

As I checked out later in the morning, I received a more expensive surprise. It turns out that the complimentary shuttle from the Hyatt to the Montreal central bus station, which I thought ran on a regular and frequent schedule, requires a one-hour advance registration. There was no information about this in the packet I received from our logistics company, nor at the L'Aerobus station, nor in the hotel itself. This inconvenience followed the general sense of disorientation and complexity I felt on the night I arrived and strengthened my desire to incorporate a useful Local Travel page into the conference web site. The more we can do to ensure that all of the ancillary things associated with conference travel go well, the more likely we are to create an awesome OOPSLA 2007 experience for attendees. Yet more conference communications! Yet another element to my position on the conference committee.

Before I began to dig into this position during OOPSLA 2006, I assumed that most of my activities would focus on the conference content: calls for submissions, the advance program, the program on the web site, the final program, and the much-missed Program-at-a-Glance that makes the days of the conference easy to follow and plan. But I have come to understand that communications is much more than simply organizing the program for presentation in paper and bits. In fact, I'd say that that task is merely one example of a larger purpose. It is really about eliminating the friction that naturally comes at all stages of participating in OOPSLA. It is about serving the informational needs of submitters and attendees.

I have come to admire the underlying sense of duty that runs throughout our conference committee. General chair Dick Gabriel has his finger on the pulse of both ends of the spectrum of tasks that faces the committee: those little details that seem to matter only when something goes wrong, such as web-site navigation and hotel reservations, and the big picture of moving the conference forward for which he is so well-known among the OOPSLA committee, such as Onward! and the Essays track. He certainly recognizes the financial implications of falling down on the little services, which affect both the current year's attendance and possibly future years' attendance, but I don't think that this is what motivates him. In any case, just now I am short on the kind of big ideas that can move the conference in a new direction, so I am hoping that my attention to the details of communicating well with OOPSLA participants can help the rest of the committee put on a winning conference.

And I am not saying all these nice things just because Dick graciously picked up part of a dinner tab that the conference budget could not cover last night at Montreal's Globe restaurant, attended by the committee members who had stayed an extra night in order to do more committee business and scout the area. The Globe offered a fine menu extravagant in seafood and a mix of French and North American cuisine, at prices that left this small-town Midwestern boy in a state of awe. The waitstaff was remarkably attractive and, um, shall we say, enticingly well-dressed. While I probably won't dine at this establishment on my future trips to Montreal, I can cherish the memory of this visit's delights.

Finally, during dinner conversation, I learned of the next big thing that will rock the software world, which will explode out of OOPSLA 2007 as so many revolutionary ideas have sprung from past OOPSLAs: ribosome-oriented computing. Keep your eyes glued to Google; this has the potential to make an international superstar out of postmodern software prophet Robert Biddle.


Posted by Eugene Wallingford | Permalink | Categories: General, Running

December 10, 2006 1:48 PM

Needs, Courage, and Respect

Two comments from the human side of our world have caught my eye in the last little bit:

On needs, courage, and respect

Dale Emery relates these concepts concisely:

A big part of courage is remembering that my needs matter. A big part of respect is remembering that the other person's needs matter.

In many ways, it's easier for me to integrate the latter into my daily life than the former. I usually feel guilty for placing my needs too high in the pecking order. Perhaps I place too much concern on the risk of being self-centered. (And that's probably because I already have that weakness!)

On badmouthing one's competitor

A lot of folks have linked to Tom Peters' article on "loving thine enemy", but my take on Peters' article best matches Jason Yip's take:

My take on this is that I don't want to be someone who badmouths other people for my own self-interest. And that's enough.

Well said. I'm certainly sympathetic to the pragmatists' philosophical stance, which would summarize Peters' position as saying that working to hurt your competitors "doesn't work". But one ought not need an economic or altruistic reason not to speak ill of others. It is simply wrong.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

December 09, 2006 1:08 PM

Hello from Montreal

OOPSLA 2007 logo - embrace the beaver!

Bienvenue de Montréal! My previous post was written on the plane to Quebec for the December OOPSLA'07 planning meeting. This year, I am communications chair for the conference, which means that I have a diverse set of responsibilities defined around "getting the word out". (Don't worry; I won't turn this blog into a conference shill.) This includes the design of the advance and final programs, as well as helping to shape the content and organization of the web site. I'm part of a team consisting of a graphic artist and a person focused on advertising, which is good, given my lack of skills in the former and lack of feel for the latter.

My big challenge this year is to find a way to communicate the rich diversity of our program -- and it is richer and more diverse than any computing conference I know of -- to the people who should be at OOPSLA next year. We have all been talking about this for a couple of years, but now it's my job. How do I help attendees, especially newcomers, navigate their way through the conference? Someone this morning likened OOPSLA to Paris, a wild circus of ideas and events that can educate and enthuse and enlighten and entertain most anyone who cares about programs and the art of programming. It's a provocative analogy that I'll have to explore. For example, how do we help potential attendees find us as a "vacation spot" and plan their trip?

As when I attended SugarLoafPLoP, I find the change in local language to be surprisingly disorienting -- even in a place where almost everyone speaks English as well as I. This exposes my parochial upbringing and my rather limited travel experiences as a professional. It probably also says a lot about a self-insularity that I need to break out of. EuroPLoP and ECOOP are calling me!

Given that I'll spend a total of a couple of weeks over the next year among the Quebecois, I think that I should try to learn a little French beyond "oui", "merci", and "non parlez-vous Francais". When I went to Brazil, I set the too-ambitious goal of learning some conversational Portuguese. This time, I will set the less imposing and more achievable goal of learning some vocabulary and a few key phrases. (My colleague Steve Metsker says that he sets the goal of learning 50 words in the local tongue whenever he travels to a new land.) At least I can learn enough to show Montreal residents that I respect their bilingual culture.

the view of Montreal from Mont Royal

I am not ready to write a new installment of "Running on the Road" yet, but I am looking forward to starting my research. After a long few weeks at the office, three hard days in a row on the track, and a long day of travel, this morning found me resting peacefully. The great news about our location in downtown Montreal is proximity to the running trails along the Fleuve Saint-Laurent and in the wonderful Parc du Mont-Royal. I plan to run tomorrow's 12-miler on the trails of Mont Royal -- not climbing the mountain, but on a trail that circles near the base. Then Monday I'll try the trail along the St. Lawrence. My May visit will give me more opportunities to explore the possibilities.


Posted by Eugene Wallingford | Permalink | Categories: General, Running

November 22, 2006 6:30 PM

An Old Book Day

After over a year in the Big Office Downstairs and now the Office with a View, I am finally unpacking all the boxes from my moves and hanging pictures on the walls. Last year it didn't seem to make much sense to unpack, with the impending move to our new home, and since then I've always seemed to have more pressing things to do. But my wife and daughters finally tired of the bare walls and boxes on the floor, so I decided to use this day-off-while-still-on-duty to make a big pass at the task. It has gone well but would have gone faster if only I could go through boxes of old books without looking at them.

Old books ought to be easy to toss without a thought, right? I mean, who needs a manual for a Forth-79 interpreter that ran on a 1980 Apple ][ ? A 1971 compiler textbook by David Gries (not even mine -- a colleague's hand-me-down)? Who wants to thumb through Sowa's Conceptual Structures, Raphael's The Thinking Computer, Winograd and Flores's Understanding Computers and Cognition? Guilty as charged.

And while I may not be an active AI researcher anymore, I still love the field and its ideas and classic texts. I spent most of my time today browsing Raphael, Minsky, Schank's Dynamic Memory, Weld and de Kleer's Readings in Qualitative Reasoning about Physical Systems, the notes from a workshop on functional reasoning at AAAI 1994 (one of the last AI conferences I attended). These books brought back memories, of research group meetings on Wednesday afternoons where I cut my teeth as a participant in the academic discussion, of dissertation research, and of becoming a scholar in my own right. There were also programming books to be unpacked -- too many Lisp books to mention, including first and second editions of The Little LISPer, and a cool old book on computer chess from 1982 that is so out of date now as to be hardly worth thumbing through. But I was rapt. These books brought back memories of implementing the software foundation for my advisor's newly established research lab -- and reimplementing it again and again as we moved on to ever better platforms for our needs. (Does anyone remember PCL?) Eventually, we moved away from Lisp altogether and into a strange language that no one seemed to know much about... Smalltalk. And so I came to learn OOP many years before it came into vogue via C++ and Java.

Some of these books are classics, books I could never toss out. Haugeland's Mind Design, Schank, Minsky, Raphael, The Little LISPer. Others hold value only in memory of time and place, and how they were with me when I learned AI and computer science.

I tossed a few books (among them the Gries compiler book) and kept a few more. I told my daughter I was being ruthless, but in reality I was far softer than I could have been. That's okay... I have shelf space to spare yet, at least in this office, and the prospect of my next move is far enough off that I am willing to hold onto that old computer chess book just in case I ever want to play through games by Belle and an early version of Cray Blitz, or steal a little code for a program of my own.

I wonder what our grandchildren and great-grandchildren will think of our quaint fetish for books. For me, as I close up shop for a long weekend devoted to the American holiday of Thanksgiving, I know that I am thankful for all the many books I've had the pleasure to hold and read and fall asleep with, and thankful for all the wonderful people who took the time to write them.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 18, 2006 9:17 AM

No Shortcuts

I overheard a conversation in the locker room yesterday that saddened me. Two university students were chatting about their lives and work-outs. In the course of discussing their rather spotty exercise routines, one of them said that he was planning to start using creatine as a way to develop a leaner "look". Creatine is a naturally-occurring compound that some folks use as a nutritional supplement.

Maybe I'm a fuddy-duddy, but I'm a little leery about using supplements to enhance my physical appearance and performance. It also may just be a matter of where to draw the line; I am willing to take multivitamins and zinc supplements for my immune system. The casual use of creatine by regular guys, though, seems like something different: an attempted shortcut.

There aren't all that many shortcuts to getting better in this world. Regular exercise and a good diet will help you develop a leaner body and the ability to perform better athletically. The guys I overheard knew that they could achieve the results they needed by exercising and cutting back on their beer consumption, but they wanted to reach their goal without having to make the changes needed to get there in the usual way.

The exercise-and-diet route also has other positive effects on one's body and mind, such as increased stamina and better sleep. Taking a supplement may let you target a specific goal, but the healthier approach improves your whole person.

Then there's the question of whether taking a supplement actually achieves the promised effect...

These thoughts about no shortcuts reminded me of something I read on Bob Martin's blog a few weeks ago, called WadingThroughCode. There Bob cautioned against the natural inclination to avoid the hard work of slogging through other people's programs. We all figure sometimes that we can learn more just by writing our own code, but Bob tells us that reading other people's code is an essential part of a complete learning regimen. "Get your wading boots on."

I've become sensitized to this notion over the last few years as I've noticed an increasing tendency among some of even my best students to not want to put in the effort to read their textbooks. "I've tried, and I just don't get it. So I just study your lecture notes." As good as my lecture notes might be, they are no substitute for the text. And the student would grow by making the extra effort it takes to read a technical book.

There are no shortcuts.


Posted by Eugene Wallingford | Permalink | Categories: General, Running, Teaching and Learning

November 16, 2006 5:28 PM

"No, We're Laughing with You"

It seemed an innocent enough question to ask my CS 1 class.

"Do you all know how to make a .zip file?"

My students laughed at me. As near as I could tell, it was unanimous.

For a brief second I felt old. But it wasn't that long ago that students in my courses had to be shown how to zip up a directory, so perhaps their reaction is testimony more to the inexorable march of technology than to my impending decrepitude.

At least most of them seemed interested when I offered to show them how to make a .jar file en route to creating a double-clickable app from their slideshow program.

I may be a dinosaur, but I'm not completely useless to them.

Yet.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

October 19, 2006 6:26 PM

Misconceptions about Blogs

When I first started writing this blog, several colleagues rolled their eyes. Another blog no one will read; another blogger wasting his time.

They probably equated all blogging with the confessional, "what I ate for breakfast" diary-like journal that takes up most of the blogspace. I'm not sure exactly what I expected Knowing and Doing to be like back then, but I never intended to write that sort of blog and made great effort to write only something that seemed worth my time to think about -- and any potential reader's time to think about, too. Sometimes my entries were lighthearted, but even then they related to something of some value to my professional life. The one exception is my running category, which is mostly "just about me". But even then I often found myself writing about the intersection of my thoughts on running and my thoughts on, say, software development practice.

My colleagues' eye rolling came to mind again yesterday when I heard one of my university colleagues from the humanities give a talk on the use of video essays in his courses. A video essay is just what it says, an essay in the form of a short film. He tried to help us to understand where the video essay sits in the continuum of all film works, as it injects the creator into the process the way a written essay does but unlike a video documentary, which should present a voice independent of the filmmaker. In the course of his explanation, he raised blogs as the written equivalent of something that is too personal, too much about the creator and not enough about some idea with which the creator is engaging. To him, blogs were -- by and large -- trivia.

Writing my blog doesn't feel trivial, and I hope that the folks who take the time to read it don't find the content trivial. Were I writing trivia, I could and would write far more often, as I could dump whatever frustrations or uneasiness or even joy I was feeling at the end of each day right into my keyboard. Even when I write a short entry, I want it to be about something, reflect at least that I have done some serious thinking about that something, and use words and language that I've shown at least a modicum of effort to polish for public consumption.

Nearly all of the blogs I read have these features, usually more of each than mine. I'm honored to read the professional and personal ruminations of interesting minds as they try to learn something new, or teach the rest of us something that they have recently learned.

As the speaker told us more about the art of video essay, it occurred to me that the word "blog" is used to describe at least two different classes of on-line writing, and that the kind of blog I enjoy -- and aspire to write -- is not what most folks think of when they hear "blog". This sort of blog is really an essay, and the bloggers in question are essayists, much in the spirit of Michel de Montaigne himself. The web makes it possible for us to share our essays more quickly and more broadly than older media, and it allows us to show more of our work-in-progress than is feasible in a print-dominated world. But these blogs are essays.

So, next week, when I make my annual pilgrimage to the conference currently known as OOPSLA, don't expect me to "blog it"; instead, I'll be writing essays.

Speaking of OOPSLA, I'll be in Portland, Oregon, from Saturday, October 21, through Friday, October 27. If you will be there, too, I'd love to meet up over a break and chat. Drop me a line.

Oh, and to close the loop on my colleague's talk... He showed some of his students' video essays, and they were remarkable. Anyone who doubts that today's students think deeply about anything would have to reconsider this stance after seeing some of these works. Could I use video essay in a CS course? Perhaps not most, but that thought probably says more about my imagination than the possibilities inherent in the medium. I haven't thought deeply enough about this to say any more!


Posted by Eugene Wallingford | Permalink | Categories: General

October 07, 2006 10:34 AM

The Measure of All Things

The truth belongs to everyone,
but error is ours alone.

-- Ken Alder, The Measure of All Things

On my trip to the Twin Cities last weekend, I had the good fortune to listen to Ken Alder's entertaining The Measure of All Things on tape. Alder tells the story of the Meridian Project, revolutionary France's effort to define the meter -- the Base du Systeme Metrique, the foundation of the metric system -- in terms of the distance between the North Pole and the equator. Before happening upon this book, I knew nothing of this project or the scientists involved, and less about the political history of the era than I should have known.

In the Meridian Project, two of the finest astronomers of the day set out to measure the line of longitude that runs from Dunkirk on the northern coast of France to Barcelona on the northeast corner of Spain. They used the technique of triangulation, wherein they measured all of the angles in a sequence of adjacent triangles running the length of the line and then used the length of a single side to compute the length of the target line. I was surprised by both the quality of the tools and techniques available to 18th century scientists and the fortitude with which they overcame the practical obstacles that stood in their way. Those of us who do science in the 21st century -- and all of us, who enjoy the benefits of science and technology every day -- really do owe a great debt to the men and women who laid our scientific foundation.
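
The arithmetic behind triangulation is simple enough to sketch. In this hypothetical fragment (the angles and the 10 km baseline are invented for illustration, not Delambre's actual data), the law of sines turns one measured side and two measured angles into the remaining sides, and each computed side then serves as the baseline of the next triangle in the chain:

```python
import math

def solve_triangle(baseline, angle_a, angle_b):
    """Given one side and the two angles adjacent to it (in degrees),
    return the other two sides via the law of sines."""
    angle_c = 180.0 - angle_a - angle_b               # angles sum to 180
    ratio = baseline / math.sin(math.radians(angle_c))
    side_a = ratio * math.sin(math.radians(angle_a))  # side opposite angle_a
    side_b = ratio * math.sin(math.radians(angle_b))  # side opposite angle_b
    return side_a, side_b

# One side measured directly on the ground; after that, only angle
# observations are needed -- each computed side becomes the next baseline.
side = 10.0  # km, the single measured baseline (illustrative)
for angles in [(60.0, 60.0), (50.0, 70.0), (45.0, 85.0)]:
    side, _ = solve_triangle(side, *angles)
```

This chaining is also why a surveying error anywhere along the line mattered so much: every later side inherits it.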

This was the Golden Age of geodesy, the art of measuring the Earth. The Meridian Project captured public and political interest. Scientists made trips to places as remote as Peru and Lappland in an effort to draw a more complete picture of the size and shape of the Earth. We are in an age of tremendous growth of computing; what is our signature project? What can capture -- or recapture -- the public's real interest? We in the sciences talk a lot about the Human Genome Project, but I don't think that this will ever have universal appeal outside the sciences. Digital media are now woven inextricably into our lives, but so deeply that few people think twice about them as anything special any more.

Perhaps the key technical point in the Meridian Project story involves error. Mechain and Delambre, the protagonists of the story, used a repeating circle to take their angle measurements. This tool was designed to help users reduce small observational errors by taking repeated observations, amalgamating the results, and computing the actual value from the amalgamation. This made small values that were otherwise imperceptible to the human observer appear manifold, where they were observable as a group. (Anyone who has tried to time a lightning-fast computer operation is familiar with the CS analog of this technique: write a loop to do the operation a million times, and then divide by a million. Values that would otherwise show up as a 0 on your timer are now computable!)
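
The timing trick in the parenthetical can be sketched in a few lines (the operation being timed here is just a placeholder):

```python
import time

def time_per_call(op, n=1_000_000):
    """Time an operation too fast to measure once: run it n times and
    divide. The aggregate is measurable even when a single call falls
    below the clock's resolution."""
    start = time.perf_counter()
    for _ in range(n):
        op()
    return (time.perf_counter() - start) / n

# How long does a small arithmetic expression take?
per_call = time_per_call(lambda: 3 * 7 + 1, n=100_000)
```

Like the repeating circle, the loop amplifies something too small to observe directly until it shows up on the instrument.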

In the course of his measurements in the north of Spain, Mechain encountered a discrepancy between two readings -- and panicked. He didn't want anyone to know about the error, lest it reflect badly on his skill, so he conducted an elaborate cover-up, one that did not alter the ultimate calculation of the length of the meter. Both Delambre and Alder marvel at Mechain's artistry in doctoring his data.

However, if Mechain had understood that there are different kinds of error, he might not have worried so much about the discrepancy in his data, or the potential effect on his reputation for exactitude. For, while the repeating circle's multiple readings helped to increase the precision of his results, they did nothing to increase their accuracy, and in fact made his results less accurate. A technique can produce internal consistency without veridicality, and vice versa.
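
The distinction is easy to make concrete with a small simulation (the numbers here are invented): an instrument with a systematic bias produces tightly clustered readings, so averaging many of them is highly precise, yet the average never approaches the true value.

```python
import random
import statistics

random.seed(1)
true_value = 10.0
bias = 0.5    # systematic error: no amount of repetition removes it
noise = 0.05  # random error: repetition averages it away

readings = [true_value + bias + random.gauss(0, noise)
            for _ in range(10_000)]

mean = statistics.mean(readings)     # precise: tiny spread of readings...
spread = statistics.stdev(readings)  # ...but inaccurate: off by ~bias
```

Mechain's repeating circle attacked the `noise` term; nothing in the instrument or the protocol could see the `bias` term.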

We live on a fallen planet,
and there is no way back to Eden.

-- Jean-Baptiste-Joseph Delambre

In writing up the results of the Meridian Project for publication, Delambre came to appreciate Mechain's dilemma and was quite generous in his treatment of Mechain's error, by keeping the cover-up out of plain sight. Delambre never did anything to impugn the integrity of the study, but by writing with care he was able to preserve Mechain's contribution to the project, and his public reputation. Delambre's work in the decades following the project is a fine example of a scientist acting honorably, with respect for both truth and his fellow man.

At the time of the Meridian Project, many scientists thought that error could be handled in a purely rational way, through perfection of tools and techniques. Today we have formalized much of our way of handling error in a social process within the community of scientists. Peer review and open discussion of results are central components in this process. Consider Andrew Wiles's proposed proof of Fermat's Last Theorem. In many ways, our open scientific culture owes much to the democratic revolutions in America and Europe around the time of this project.

Another interesting thread running through Alder's book is the story of how definitions and standards are adopted. The US has been discussing the metric system since it was merely a proposal in the French Academy of Science, in our pre-revolutionary days. Indeed, the U.S. Constitution grants Congress the authority to establish national standards of weights and measures, as essential to interstate commerce and the smooth functioning of a national economy. But codifying the metric system ahead of its widespread adoption is in some ways antithetical to democracy, whose primacy was much on the minds of our Founding Fathers. As Benjamin Franklin asked of John Quincy Adams, "Shall we shape the people to the law, or the law to the people?" I see this conflict in many elements of my professional life, from trying to set departmental policy as department head, to defining curriculum as a faculty member, to selecting and refining a software methodology as a developer!

Men will always prefer
a worse way of knowing to
a better way of learning.

-- Jean-Jacques Rousseau

Alder writes that, "Science likes to think itself the one human endeavor free of idolatry." But his story, like any complete recounting of a real scientific project, points out the many ways in which scientists and their processes elevate some ideas to the status of dogma, both locally within our own programs and globally within the paradigm that dominates science of our time. We have to be on the look-out for these blind spots in our vision. Often, they hide errors in our thinking; sometimes, they hide opportunities to advance our science in a big way.

You're not the only one who's made mistakes
But they're the only thing that you can truly call your own

-- Billy Joel, You're Only Human (Second Wind)

The Meridian Project began in an attempt to anchor "the measure of all things" -- the meter -- to something unassailable in the external world, the size of the Earth. But along the way they helped us to learn the many ways in which the external world is imperfect to this end: the design of our tools, the use of these tools, the settling on approximations in the absence of complete knowledge, the refining of definitions in the face of new knowledge and technology, an adoption of standards that is driven by political and social needs, .... Ultimately, the project only reinforced what Protagoras had taught 2500 years ago, that man is the measure of all things.


Posted by Eugene Wallingford | Permalink | Categories: General

October 04, 2006 5:45 PM

Hope with Thin Envelopes

I am working on a longer entry about a fine book I listened to this weekend, but various work duties -- including preparing for the rapidly approaching OOPSLA conference -- have kept me busier than I planned. I did run across a bit of news today that will perhaps raise the spirits of high school and college students everywhere who did not get into their dream schools. This from a wide-ranging bits-of-news column by Leah Garchik at the San Francisco Chronicle:

P.S.: A bit of information in Tuesday's story about Andrew Fire of Stanford University, winner of a Nobel Prize for medicine, seems deserving of underlining: Stanford turned down Fire when he applied for undergraduate study there. This revelation is a gift to every high school senior who ever received a thin envelope instead of a fat one.

(Of course, Fire did have the good fortune to study at Berkeley and MIT...)

Worth noting, too, is that Fire studied mathematics as an undergrad, and that his quantitative background probably played an important role in the thinking that led to his Nobel-winning work. Whenever I encounter high school or college students who are interested in other sciences these days, I tell them that studying computer science or math too will almost certainly make them better scientists than only studying a science content area.

I also tell them that computer science is a pretty good content area in its own right!


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

September 29, 2006 5:36 PM

Installing a New President

Today, my university installed its new president. The installation was high ceremony, with faculty dressed in full academic regalia, a distinguished platform committee, and an appearance by one of Iowa's U.S. senators, who is a UNI alumnus. (Alas, business on Capitol Hill changed his plans at the last minute and limited him to a video address.) As much as universities have changed in the last thousand years or so, we still hold onto some of our oldest traditions and ceremonies. That's a good thing, I think. It creates a sense of purpose and history, tying us to our forebears in the quest to understand the universe.

Ben Allen, President of UNI

I am excited by our new president. He comes to us from a major research school, a sister institution with which my department has dealt recently. Everyone I've spoken to from his old university, whether staff or administrator, regular faculty or research superstar, has praised his intellect, his ability, and his character in the most glowing of terms. We need a leader of high intellect, ability, and character as we face the many changes that confront universities, especially public universities, these days.

President Allen's installation address gives those of us in the sciences reason to anticipate the future. He seems to understand clearly the role that science, math, and technology will play in the world in the coming century, and he seems to understand just as well that a comprehensive university such as ours must play an essential role in educating both capable citizens and professionals prepared to work in a technological world. And his vision takes in more than just the aims of education; he sees the importance of participating in developing our state's economy and of leading in areas where our strengths meet the greatest needs of the public. These include the training of capable teachers and -- perhaps surprising to folks who think of the Great Midwest as only a homogeneous population of old European ancestry -- the integration and development of modern immigrant communities.

I feel great hope that the new president of my medium-sized public institution in the heartland of America believes that we can and must become world leaders in the areas of our strengths. As he is fond of saying, good is the enemy of great, and we must ask and answer difficult questions in order to discern where we will excel -- and then carry out a plan to do so. Even greater is my hope in his belief that science, math, and technology are among the areas we must come to lead.

Now my department faces a big challenge. At what can we be great? What specialty can we develop that will be world class?


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

September 18, 2006 6:24 PM

A New Entry for my Vocabulary

In my last entry I quoted a passage from Paul Graham's essay Copy What You Like. Graham encourages students of all sorts to develop the taste to recognize good stuff in their domain, and then to learn by copying it.

My friend Richard Gabriel is both computer scientist and poet. He has often talked of how writers copy from one another -- not just style, but sentences and phrases and even the words that catch our ears, that sound just right when used in the right place.

In that spirit I will steal, er, copy a word from Graham.

Most of the words that our discipline has added to the broader lexicon are hideous abominations, jargon used to replace already useful words. The next time I hear someone use "interface" as a verb in place of the older, humbler, and perfectly fine "interact", well, I don't know what I'll do. But it won't be pleasant. In this vein, I did recently hear an unusual word choice from a graduate student who had just moved here from Russia. Instead of "interaction", she used "intercourse". It sounded charming and had a subtly different connotation, but these days in the U.S. I suspect that most folks would look askance at you for this word choice.

But in "Copy...", Graham put a CS jargon word to good use in ordinary conversation:

It was so clearly a choice of doing good work xor being an insider that I was forced to see the distinction.

Standard English doesn't have a good word with the meaning of "xor"; "or" admits the same confusion in regular conversation that it does in logic. But sometimes we really want to express an 'exclusive or', and "xor" is perfect.
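
For readers without the logic background, the difference is easy to demonstrate; in Python, for example, `!=` (or `^` on booleans) plays the role of xor:

```python
def xor(a: bool, b: bool) -> bool:
    """Exclusive or: true when exactly one of a, b is true."""
    return a != b

# Graham's sentence: good work xor insider status --
# you may have either one, but not both.
assert xor(True, False) and xor(False, True)  # exactly one: true
assert not xor(True, True)                    # both true: false, unlike "or"
assert not xor(False, False)
```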

Now I'm on the look-out for an opportunity to drop this word into a conversation!


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

September 14, 2006 10:13 AM

And Now We Return...

Ten days without blogging means that I have been busy. The work busy-ness has been two-pronged: teaching CS1 for the first time in a decade, and launching the department's academic program review, which the university requires every seven years. Off campus, in addition to family, the last two weeks have been my two biggest training weeks as I get ready for my upcoming marathon.

Oftentimes, a conference is the "break" I need to get back to writing. (Next month, OOPSLA will give me my annual booster shot of inspiration and great ideas!) Today, it is a retreat of all the department heads in my college. We are discussing some of the major issues that face science and technology departments these days: fundraising, recruitment, and strategic planning. So I'm thinking about these issues right now, and I'll probably write down some of those thoughts here.


Posted by Eugene Wallingford | Permalink | Categories: General

September 03, 2006 8:39 PM

Crisis of Confidence

But here you are in the ninth
Two men out and three men on
Nowhere to look but inside
Where we all respond to
Pressure

-- Billy Joel

I had an abrupt crisis of confidence while running this morning. My legs had just started to feel the miles. Even though I've had good times in training the last few weeks, I pictured myself in the marathon, at that moment when my legs start hurting and I realize that there are still 4 or 6 or 8 or 10 miles, when my resolve is at its lowest and I simply have to gut it out if I want to finish the race strong and meet my goal -- and just then I wondered, maybe I'm just not tough enough mentally to overcome. The prospect of those remaining 4 or 6 or 8 or 10 miles suddenly seemed very lonely.

The crisis was short-lived. Pretty soon I was thinking the other seemingly random thoughts that tend to fill a 3-hour run. But my earlier thoughts hung around my head like an echo, with the lyrics and uneven melody of Billy Joel's "Pressure" as accompaniment.

In that short period, I found myself wishing, almost counterintuitively, for a tough run, or even a bad stretch of training. Last summer went pretty smoothly, and look where that got me. This year started with hamstring problems and so my training started slower and a bit tougher than usual, but lately things have been going pretty well. When I'm on the course in the Twin Cities and my resolve bottoms out, will I have what it takes to gut it out?

In that short period, I found myself thinking of my friend Greg who is training to run Twin Cities with me. He lives in Arkansas, where summer is brutal on a marathon runner. Constant heat and unbearable humidity add to the hilly terrain to make every long run a chore. Greg's work schedule makes training even tougher, as he almost has to run in the middle of the night if he wants to get his miles in. As a result, he is worried that he won't be ready for the marathon. My counterintuitive thought was, maybe he'll be better prepared than I am for handling that "moment of truth" during the race; he'll have faced hard, painful runs all summer long.

How is that for my egocentrism and feeling sorry for myself? I guess a really long run on a rainy, dreary day can do that to the best of us. Once I moved on to my next thoughts, the idea that Greg somehow benefits from his current suffering seemed foolish, just the sort of foolishness that someone who has had it easy sometimes indulges in. The bottom line is that I have to find the resolve when I need it. All the rest is just excuses.

Don't ask for help
You're all alone
You'll have to answer to your own
Pressure

-- Billy Joel


Posted by Eugene Wallingford | Permalink | Categories: General, Running

September 01, 2006 5:39 PM

Entering a Long Weekend

We've been in class for only two weeks, but I am ready for the long Labor Day weekend. The steady onslaught into the department office of beginning-of-the-year events has slowed, and we will soon be into the steady state of the academic semester. Of course, that includes preparing the spring course schedule, handling a tenure case, promoting new programs, writing a newsletter to alumni and other friends of the department, and many other activities, so the steady state isn't a slower pace, just a steadier one.

I'm looking forward to reading this weekend. I've fallen woefully behind of reading my favorite bloggers and essayists. I hope to remedy that while checking out some U.S. Open tennis on television. I did get a chance to read a little bit today and ran across a couple of neat ideas that hit home during a busy week of classes and department duties.

The creation of art is not the fulfillment of a need but the creation of a need.

-- Louis Kahn

I've written on this topic before as it relates to design, but Kahn's line struck me today in the context of education. As much as we educators need to be pragmatic enough to think about how we serve a clientele of sorts, it is good to remember that a university education done well creates a need for it in the mind of the learner that didn't exist before. Even education in a technical area like computer science.

Then there was this short post on Belief by Seth Godin:

People don't believe what you tell them.

They rarely believe what you show them.

They often believe what their friends tell them.

They always believe what they tell themselves.

If we want to affect how students act and think, then we can't just tell them good stories or show them cool stuff. We have to get them to tell themselves the right stories.

More reading will be good. I'll also have a chance to do some relaxed thinking about my CS 1 course, as we move into real programming -- foreach loops! But I have some home duties to take care of as well, and I don't want to be this guy, even if I know in my heart that it is easy to be him. Any work I do this weekend will be firmly ensconced in the life of my family. I'll just do my homework at the dining room table with my daughters...


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

August 24, 2006 5:34 PM

Playing the Big Points Well

This week I've had the opportunity to meet with several colleagues from around my college, as part of some committee work I had volunteered for. This sort of extra work is often just that -- extra work -- but sometimes it is an opportunity to be reminded that I have some impressive colleagues. Sometimes, I even learn something new or am prompted to think about an idea that hadn't occurred to me before.

One of my colleagues spoke of how important it is to get the science faculty working more closely with one another at this time. He couched his ideas in historical terms. Science in the early 20th century was quite interdisciplinary, but as the disciplines matured within the dominant paradigms they became more and more specialized. The second half of the century was marked by great specialization, even within the various disciplines themselves. Computer science grew out of mathematics, largely, as a specialty, and over the course of the century it became quite specialized itself. Even artificial intelligence, the area in which I did my research, became almost balkanized as a set of communities that didn't communicate much. But the sciences seem to have come back to an era of interdisciplinary work, and CS is participating in that, too. Bioinformatics, computational chemistry, physics students rediscovering computer programming for their own research -- all are indications that we have entered the new era, and CS is a fundamental player in helping scientists redefine what they do and how they do it.

Another colleague spoke eloquently of why we need to work hard to convince young people to enter the sciences at the university level. He said something to the effect that "Society does not need a lot of scientists, but the ones it does need, it needs very much -- and it needs them to be very good!" That really stuck with me. In an era when university funding may become tied to business performance, we have to be ready to argue the importance of departments with small numbers of majors, even if they aren't compensating with massive gen-ed credit hours.

Finally, a third colleague spoke of the "rhythm" of an administrator's professional life. Administrators often seek out their new positions because they have a set of skills well-suited to lead, or even a vision of where they want to help their colleagues go. But minutiae often dominate the daily life of the administrator. Opportunities to lead, to exercise vision, to think "big" come along as fleeting moments in the day. What a joy they are -- but you have to be alert, watching for them to arise, and then act with some intensity to make them fruitful.

For some reason, this reminded me of how sports and other competitive activities work. In particular, I recall a comment Justine Henin-Hardenne made at Wimbledon this year, after her semifinal win, I think. She spoke of how tennis is a long string of mostly ordinary points, with an occasional moment of opportunity to change the direction of the match. She had won that day, she thought, because she had recognized and played those big points better than her opponent. I remember that feeling from playing tennis as a youth, usually on the losing end!, and from playing chess, where my results were sometimes better. And now, after a year as an administrator, I know what my colleague meant. But I'm not sure I had quite thought of it in these terms before.

Sometimes, you can learn something interesting when doing routine committee work. I guess I just have to be alert, watching for them to arise, and then act with some intensity to make them fruitful.

(And of course I'm not only an administrator... I'm having fun with my first week of CS1 and will write more as the week winds down and I have a chance to digest what I'm thinking.)


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Managing and Leading

August 21, 2006 5:01 PM

Flush with Expectation for the New Year

Students are back on campus, and classes are back in session. Though the department office moved into its new building in March, and the rest of the faculty moved in in May, this was our first chance to use our classrooms and meeting spaces "in anger". So, in addition to the usual first-day questions that walk into the department office, we got to track down ethernet ports and cables, transparency pens, and all the details of living in our space. The result was a busy day, and a good one in the end. Some of the questions were unusual, too -- for instance, I was asked to find a student a substitution course for Keyboarding 101, which s/he had failed at a previous university. Sadly for the student but good for us, we do not offer courses in keyboarding!

The emcee at our university convocation used the title phrase above when speaking with some faculty yesterday. (He is a theater prof, with a flair for the dramatic.) It captures how most faculty, staff, and students feel this time of year. This is our New Year. It is the time for fresh opportunities, new year's resolutions, and clean desks. Most folks are rested and ready to dig back into the teaching business of the university. Every course is still perfect, the ideal held in our minds all summer. Students are still eager to find their classrooms and figure out what their courses and professors will be like.

In a couple of weeks, the ideal will be shattered: homework will hang over our heads to be done and graded... we'll have botched a lecture or three... the desk will be a mess... and some folks will already be looking ahead to our Baby New Year in January. But not today.

My class starts tomorrow, and I too am flush with the expectation of a new year. I teach CS I this fall for the first time in a decade, and I'm using a new approach for our department. I'm excited by the prospect of 24 students in class tomorrow. I may adapt some ideas from my talk on steganography last spring as my opening to the course, which could be fun -- but it's also a bit of a risk, the sort of risk that can quickly lead to "wait 'til next year...". I've also followed up on a promise to equip my course web site with a newsfeed, by using the same blog engine I use to publish this blog. One of our new adjuncts is blogging his senior-level Networking web site, too, which gives us an interesting experiment this semester into how many of our students are using newsreaders at this point in time, if any.

Now, off to get ready to actually teach the course...


Posted by Eugene Wallingford | Permalink | Categories: General

August 03, 2006 3:57 PM

Pre-Cease-and-Desist E-Mail

My blog has received occasional interest from unexpected places, such as my local newspaper and the author of a book on marketing. These contacts have always been pleasant surprises. Yesterday evening, I received e-mail on a different sort of interest, one which is less pleasant.

In a blog entry on June 14, 2005, I did what I sometimes do: use an image or cartoon from another source that illustrated a point I was trying to make. In this case, it was a cartoon from a well-known series that expressed humorously how I felt when accepting the position as head of my department. I wanted to give the creator credit for his work, so I mentioned it by name and linked to the author's web site.

Somehow, the syndicator of the series came to know about my entry. (The fact that my use of the cartoon shows up as #40 on a Google Images search for the cartoon makes the discovery not all that surprising!) Last night, I received a polite letter from the syndicator asking that I remove the image, along with a playful though perhaps overwrought message from the author himself. I updated that blog entry as soon as I could to remove the offending image.

As I said, I occasionally use images from other web sites to illustrate my blog, though I have not sought official permission to use all of them. I've always tried to stay within the bounds of fair use, and I have always linked to the web page where I found the image, so that the creator would get full credit for his or her intellectual property along with any attendant traffic that may follow from my readers. In a "free software" sense, I was just trying to participate in the exchange of ideas.

I may well have been within the legal bounds of fair use in this case. Not being a lawyer, I would have to spend a little time, if not money, to obtain a legal opinion to that effect. In either direction, I do not think it's worth my time or money to find out.

I have great respect for intellectual property rights and do not wish to infringe on them. I recall being rather annoyed by the cavalier attitude taken by the founder of Wikipedia on its use of copyrighted images and images otherwise controlled by others. Whatever one feels about free software and open source, one rarely has the right to require another person to share intellectual property. Similarly, the idea of downloading copyrighted music and video without paying disturbs me. I may think that someone should distribute something openly, but that is the copyright holder's prerogative.

So: I will be careful from now on to use only those images for which I have permission to do so.


Posted by Eugene Wallingford | Permalink | Categories: General

July 21, 2006 5:15 PM

The End of a Different Year

I am coming to the end of my first year as department head, which officially began on August 1 last year. In that time, I have written less about management than I had expected. Why? One ingredient is that I blog using my public identity. I don't often mention UNI or the University of Northern Iowa in my entries, but the blog is located in my departmental web space, and I provide plenty of links off to other UNI pages.

For much of what I've thought to write in the last year, I would have had a hard time writing in an anonymous way; too often, it would have been obvious that my comments were about particular individuals -- folks that some of my readers know, or could know, and folks who themselves might stumble upon my blog. In a couple of CS curriculum entries such as this one, I have referred to a nameless colleague who would surely recognize himself (and our current and former students can probably identify him, too) -- but those were posts about CS content, not discussions of my role as head.

Why would that be a problem? I think that I am more prone to write these days on negative experiences that occur as an administrator, manager, or dare I say leader than I am to write on positive events. This is the same sort of motivation I have for writing software patterns, which grow out of negative experiences encountered while programming and designing. But in those cases I also have a solution that resolves the forces. As an administrator, I am mostly still looking for solutions, feeling my way through the job and trying to learn general lessons. And that gets us to the real reason I haven't written much on this topic: I'm not there yet. I don't think I've learned enough to speak with any amount of authority here, and I don't have the confidence to ruminate yet.

As I wrote back in May, I thought I might have more time to write in the summer, at least more regularly. But instead of fewer distractions, I think I've experienced more. I've actually had less concentrated time to write. In that same entry, I related a quote from Henri Nouwen that "... the interruptions were my real work." The idea behind this quote is that we must come to peace with the distractions and view them as opportunities to serve. In a rational sense, I understand this, but I am not there yet. Sometimes, interruptions just seem like interruptions. I can go home at the end of a day of interruptions feeling like I did something useful, but I also usually feel unfulfilled at not having accomplished my other goals for the day.

Sometimes, the interruption really does turn out to be my real task, one that displaces other plans in a meaningful way. The best example of that this summer has been a long-tabled proposal for our department to offer a B.S. in Software Engineering. I definitely have plenty to write about this in the next couple of weeks, but it won't be my colleagues here who might be unhappy with what I say; it will be colleagues at our sister state institutions.

This week has been filled with nothing but distractions of the former type. For most of the week, I have felt like a preemptible processor thrashing on a steady flow of interrupts. On the other hand, I did manage to tie up some loose ends from old projects, which should make August a more enjoyable month.

In that May entry, I also quoted Chad Fowler's vignette about "fighting the traffic". I have come to realize that, unlike Fowler's cabbie, I don't love to fight the traffic. At least not as my primary job. I know that it is an essential part of this job, so I am looking for middle ground. That's a meta-goal of mine for the coming year. I'd also like to refine how I respond to interrupts and how I schedule and manage my time. This year will tell me a lot about whether I should consider this as a long-term professional opportunity.

But now I am ready for a little vacation. I haven't taken any break other than at Christmas since long weekends in May and July of 2005. While I am not despairing for my job, I am reminded of a quote I used when starting as head last August 1:

Oft expectation fails, and most oft there
Where most it promises; and oft it hits
Where hope is coldest, and despair most fits.

-- William Shakespeare
All's Well That Ends Well (II, i, 145-147)

Earlier this summer, I had a fantasy of a running vacation: Drive somewhere mostly isolated, and find a home base in a little motel. Get up every morning and run from 5:30 until 9:00 or so, anywhere from 12 to 15 miles. Come back, shower, eat, and nap until noon. Then spend the day reading, writing, and goofing off. Mmmm.

But I also have a strong desire to spend some time with my wife and daughters, and so that's what I'll do: hang around home. I'll do some "home"work, relax and read. I'll probably also spend a little relaxed time corralling my work to-do list, to prepare for returning to the office. That will actually feel good. I'll even do some disciplined reading of work e-mail -- but I don't want to be this guy:

Ph.D. comic about working on the beach

If I have to go an extreme, I'd rather be Delbert T. Quimby, a guy from an editorial cartoon by Ed Stein back in 1995. I can't find this on-line and don't have a scanned version of the cartoon, so you'll have to settle for my description. The scene is a cozy den, with a wall full of books. In the soft spotlight at the middle of the room is our protagonist, comfortably dressed, book in lap, glass of wine on the side table. The caption reads, "Eccentric Internet holdout Delbert T. Quimby passes yet another day non-digitally."

Back in 1995, such a maneuver was likely viewed as an act of rebellion; today, it would be viewed by many as just plain nuts. But, you know, it sounds really good right now. So who knows?

(My students probably consider me to be a Delbert T. Quimby already, not for the wardrobe and dumpy physique but for this neo-Luddite fact: I still have only a dial-up Internet connection at home!)

Oh, and I will definitely run a lot of miles next week -- 52 or so -- and blog a bit.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

July 14, 2006 5:29 PM

Growing a Tech Industry Instead of Corn

One of the enjoyable outreach activities I've been involved with as department head this year has been the state of Iowa's Information Technology Council. A few years back, the Iowa Department of Economic Development commissioned the Battelle corporation to study the prospects for growing the state's economy in the 21st century. They focused on three areas: bioscience, advanced manufacturing, and information technology. The first probably sounds reasonable to most people, given Iowa's reputation as an agriculture state, but what of the other two? It turns out that Iowa is much more of a manufacturing state than many people realize. Part of this relates back to agriculture. While John Deere is headquartered in Moline, Illinois, most of its factories are in Iowa. We also have manufacturers such as Rockwell Collins and Maytag (though that company has been purchased by Whirlpool and will close most or all of its Iowa locations soon).

But information technology? Des Moines is home to several major financial services companies or their regional centers, such as Principal Financial Group and Wells Fargo. Cedar Rapids has a few such firms as well, along with other companies with a computing focus such as NCR Pearson and ACT.

IDED created the IT Council to guide the state in implementing the Information Technology Strategic Roadmap developed by Battelle as a result of its studies. (You can see the report at this IDED web page.) The council consists of representatives from most of Iowa's big IT firms and many of the medium-sized and small IT firms that have grown up throughout the state. Each of the three state universities has a representative on the council, as does the community college system and the consortium of Iowa's many, many private colleges and universities. I am UNI's representative.

The council has been meeting for only one year, and we have spent most of our time really understanding the report and mapping out some ideas to act on in the coming year. One of the big issues is, of course, how Iowa can encourage IT professionals to make the state their home, to work at existing companies and to create innovative start-ups that will fuel economic growth in the sector. Another part of the challenge is to encourage Iowa students to study computer science, math, and other science and engineering disciplines -- and then to stay in Iowa, rather than taking attractive job offers from the Twin Cities, Chicago, Kansas City, and many other places with already-burgeoning IT sectors.

To hear Paul Graham tell it, we are running a fool's errand. Iowa doesn't seem to be a place where nerds and the exceedingly rich want to live. Indeed, Iowa is one of those red states that he dismisses out of hand:

Conversely, a town that gets praised for being "solid" or representing "traditional values" may be a fine place to live, but it's never going to succeed as a startup hub. The 2004 presidential election ... conveniently supplied us with a county-by-county map of such places. [6]

Actually, as I look at this map, Iowa is much more purple than red, so maybe we have a chance! I do think that a resourceful people that is willing to look forward can guide its destiny. And the homes of our three state universities -- Iowa City, Ames, and Cedar Falls -- bear the hallmark of most university towns: attracting and accepting more odd ideas than the surrounding environment tends to accept. But Iowans are definitely stolid Midwestern US stock, and it's not a state with grand variation in geography or history or culture. We have to bank on solidity as a strength and hope that some nerds might like to raise their families in a place with nice bike trails and parks, a place where you can let your kids play in the neighborhood without fearing the worst.

We also don't have a truly great university, certainly not of the caliber Graham expects. Iowa and Iowa State are solid universities, with very strong programs in some areas. UNI is routinely praised for its efficiency and for its ability to deliver a solid education to its students. (Solid -- there's that word again!) But none of the schools has a top-ten CS program, and UNI has not historically been a center of research.

I've sometimes wondered why Urbana-Champaign in Illinois hasn't developed a higher-profile tech center. UIUC has a top-notch CS program and produces a lot of Ph.D., M.S., and B.S. graduates every year. Eric Sink has blogged for a few years about the joys of starting an independent software company amid the farmland of eastern Illinois. But then there is that solid, traditional-values, boring reputation to overcome. Chicago is only a few hours' drive away, but Chicago just isn't a place nerds want to be near.

So Iowa is fighting an uphill battle, at least by most people's reckoning. I think that's okay, because I think the battle is still winnable -- perhaps not on the level of the original Silicon Valley but at least on the scale needed to invigorate Iowa's economy. And while reputation can be an obstacle, it also means that competitors may not be paying enough attention. The first step is to produce more tech-savvy graduates, especially ones with an entrepreneurial bent, and then convince them to stay home. Those are steps we can take.

One thing that has surprised me about my work with the IT Council is that Iowa is much better off on another of Graham's measures than I ever realized, or than most people in this state know. We have a fair amount of venture capital and angel funding waiting for the right projects to fund. This is a mixture of old money derived from stodgy old companies like Deere and new money from the 1990s. We need to find a way to connect this money to entrepreneurs who are ready to launch start-ups, and to educate folks with commercializable ideas on how to make their ideas attractive to the folks with the money.

Here at UNI, we are blessed to have an existence proof that it is possible to grow a tech start-up right here in my own backyard: TEAM Technologies, which among its many endeavors operates the premier data center in the middle part of the USA. A boring, solid location with few people, little crime, and no coastal weather turns out to be a good thing when you want to store and serve data safely! TEAM is headed up by a UNI alumnus -- another great strength for our department as we look for ways to expand our role in the economic development of the state.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Managing and Leading, Software Development

July 09, 2006 3:40 PM

Another Year of Blogging in the Books

I haven't written an entry about my blog in a while, but I hope you'll indulge me today. This is the 377th entry in my blog. I posted the first on July 9 two years ago.

This second year has seen considerably less activity than the first, in which I posted 225 entries. But I've still managed a dozen or so entries per month in 2005-2006 -- though the content of my postings looks a bit different in the second year as well. In Year 1, my writing seemed driven by thoughts of agile software development and especially connections I was making between agile ideas and my training for the Des Moines Marathon. My initial readership came largely from folks interested in agile development, and sometimes those interested in how I was trying to teach those ideas in a senior-level seminar.

Near the end of my first year I took on a three-year assignment as head of my department. I had thought that this would result in frequent writing about management and leadership, as I tried to figure out how to do them. But most of my entries have been about topics at the intersection of my headship and my teaching, the future of computer science at the university and the teaching of introductory CS courses. Why I have not written more frequently about the administrative and management sides of my job is worthy of its own entry in the future, but the short answer is that I have not yet managed to consolidate my learning in this area.

OOPSLA was again a primary source of inspiration, as was SIGCSE. And I still have a lot to say about my running, even if only the wild thoughts that pop into my head while deep in a 16-miler.

I think the entry from this year that elicited the most response was my report on a lecture by Thomas Friedman. In retrospect, I'm not sure if I have a favorite post from the year, though I recall articles on negative splits in learning, Robert Hass's OOPSLA keynote on creativity, and talks by Marcia Bjornerud on popular science writing and my friend Roy Behrens on teaching as "subversive inactivity" with fondness. (Does a particular article or theme stand out in your mind?)

After all this time, I still haven't followed through with allowing comments or posting links to some of my favorite blogs. The comments are problematic, given the low-tech blogging tool I use, NanoBlogger. With some time and patience, they are doable, but the opportunity cost of that time seems inordinately high. But I may move from NanoBlogger soon, for a variety of technical reasons (long, meaningless entry names and slowness processing a blog with 300+ entries among them), so who knows. I can add a blogroll of sorts with minimal effort, and I can only plead inordinate laziness as my excuse. Soon.

On my one-year anniversary, I wrote a brief reflection and wondered what a second year of Knowing and Doing would bring. I love the quotes I used there, from Annie Dillard's "The Writing Life" and Glenway Wescott's "The Pilgrim Hawk". They remain true for me today and express something of why I will continue to write here.

Thanks to all you who read my ramblings. Thanks, too, to all of you who send me short notes when a post strikes a chord with you, or when you have something to share. You've taught me much. I hope to make the time you spend reading in the coming year worth your while.


Posted by Eugene Wallingford | Permalink | Categories: General

July 03, 2006 4:55 PM

Humility and Revolution

Three quotes for what amounts in most American workplaces to the middle of a long holiday weekend. The first two remind me to approach my teaching and administrative duties with humility. The third reminds me of why we in America celebrate this holiday weekend at all.

... from Gerald Weinberg:

When I write a book or essay, or teach a course, I have one fundamental measure of failure, which I call Weinberg's Target:

After exposure to my work, does the audience care less about the subject than they did before?

If the answer is Yes, I've failed. If the answer is No, I've succeeded, and I'm happy for it. Perhaps you consider my goal too modest. Perhaps you aspire to something greater, like making the student learn something, or even love the subject. Oh, I'm not dismayed by such fine outcomes, but I don't think it's a reasonable goal to expect them.

We can do much worse than communicate some information without dampening our audience's natural enthusiasm.

... from Steve Yegge:

If you don't know whether you're a bad manager, then you're a bad manager. It's the default state, the start-state, for managers everywhere. So just assume you're bad, and start working to get better at it. ... Look for things you're doing wrong. Look for ways to improve. If you're not looking, you're probably not going to find them.

Steve's essay doesn't have much in the way of concrete suggestions for how to be a good manager, but this advice is enough to keep most of us busy for a while.

... finally, from the dome of the Jefferson Memorial, via Uncertain Principles:

I have sworn upon the altar of God eternal hostility against every form of tyranny over the mind of man.

A mixture of humility and boldness befitting revolution, of thought and government.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Managing and Leading

June 22, 2006 3:08 PM

The H Number

I ran across the concept of an "h number" again over at Computational Complexity. In case you've never heard of this number, an author has an h number of h if h of her Np papers have ≥ h citations each, and the rest of her (Np - h) papers have ≤ h citations each.

It's a fun little idea with a serious idea behind it: Simply counting publications or the maximum number of citations to an author's paper can give a misleading picture of a scientist's contribution. The h number aims to give a better indication of an author's cumulative effect and relevance.
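For concreteness, here is one way to compute an h number from a list of per-paper citation counts. This is just a sketch in Python, and the citation counts in the example are made up for illustration:

```python
def h_number(citations):
    """Return the largest h such that h papers have at least h citations each.

    Sort the counts in descending order; the h number is the last
    position i at which the i-th paper still has at least i citations.
    """
    counts = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# A hypothetical author with seven papers:
print(h_number([22, 10, 8, 6, 5, 3, 1]))  # prints 5
```

Note that the hard part in practice isn't this computation; it's assembling an accurate list of citation counts in the first place.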

Of course, as Lance points out, the h number can mislead, too. This number is dependent on the research community, as some communities tend to publish more or less, and cite more or less frequently, than others. It can reward a "clique" of authors who generously cite each other's work. Older authors have written more papers and so will tend to be cited more often than younger authors. Still, it does give us different information than raw counts, and it has the enjoyability of a good baseball statistic.

Now someone has written an h number calculator that uses Google Scholar to track down papers for a specific researcher and then compute the researcher's index. (Of course, this introduces yet another sort of problem... How accurate is Scholar? And do self-citations count?)

I love a good statistic and am prone to vanity surf, so I had to go compute my h number:

The h-number of Eugene Wallingford is 5 (max citations = 22)

You can put that into perspective by checking out some folks with much larger numbers. (Seventy?) I'm just surprised that I have a paper with 22 citations.

I also liked one of the comments to Lance's post. It suggests another potentially useful index -- (h * maxC)/1000, where maxC is the number of citations to the author's most cited paper -- which seems to combine breadth of contribution with depth. For the baseball fans among you, this number reminds me of OPS, which adds on-base percentage to slugging percentage. The analogy even feels right. h, like on-base percentage, reflects how the performer contributes broadly to the community (team); maxC, like slugging percentage, reflects the raw "power" of the author (batter).
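The commenter's suggested index is easy to sketch alongside the h number. This is an illustrative implementation of the formula (h * maxC) / 1000 as described above; the function name is my own, and the sample data is invented to match this entry's numbers.

```python
def combined_index(citations):
    """The commenter's suggested index: (h * maxC) / 1000,
    combining breadth (h) with depth (citations to the best paper)."""
    counts = sorted(citations, reverse=True)
    # counts is sorted descending, so c >= i holds for exactly
    # the first h positions -- counting them yields h
    h = sum(1 for i, c in enumerate(counts, start=1) if c >= i)
    max_c = counts[0] if counts else 0
    return (h * max_c) / 1000

print(combined_index([22, 10, 8, 6, 5, 5, 3, 1, 0]))  # → 0.11
```

With my own h of 5 and maxC of 22, my "OPS" works out to a modest 0.11.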

The commenter then considers a philosophical question:

Lastly, it is not so clear that a person who has published a thousand little theorems is truly a worse scientist than one who has tackled two large conjectures. You don't agree? Paul Erdos was accused of this for most of his life, yet for the last two decades of his life it became very clear that many of those "little theorems" were gateways to entire areas of research.

Alan Kay doesn't publish a huge number of papers, but his work has certainly had a great effect on computing over the last forty years.

Baseball has lots of different statistics for comparing the performance of players and teams. Having a large set of tools can both be fun and give a more complete picture of the world.

I suppose that I should get back to work beefing up my h number, or at least doing something administrative...


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

June 20, 2006 2:28 PM

Two Motifs for the Day

... courtesy of quite different triggers.

"The plan is more important than the ego."

I finally got back on the track last week for some faster running. Not fast, just faster than I've been able to manage the last couple of months. This Sunday I run a half-marathon, so I didn't want to run a speed workout this week, but I did want to get back into the habit of a weekly trek to the track, so I decided to go this morning for eight miles at my conservative half-marathon goal pace, 8:00 minutes/mile.

Everything went great for a couple of miles, when a college student joined me on the track. Then one of my personal weaknesses came forward: I wanted to run past him. He wasn't running all that fast, and a couple of laps of fast stuff would have put him away. But it may also have cost me, either later in this run or, worse, during my race, when I discovered that I'd burned up my legs too much this week.

Fortunately, I summoned up some uncharacteristic patience. Fulfilling my plan for this morning was more important than stroking my ego for a couple of minutes. What else would passing this guy have done for me? It wouldn't have proven anything to me or to him (or his cute girlfriend). In fact, my ego is better stroked by sticking to the plan and having the extra strength in my legs for Sunday morning.

In the end, things went well. I ran 7.5 miles pretty much on pace -- 7:55 or 7:56 per mile -- and then let myself kick home for a fast half mile to end. Can I do 13.1 miles at that pace this weekend? We'll see.

"It changes your life, the pursuit of truth."

I heard Ben Bradlee, former editor of the Washington Post, say this last night in an interview with Jim Lehrer. Bradlee is a throwback to a different era, and his comments were an interesting mix of principle and pragmatism. But this particular sentence stopped me in my tracks. It expresses a truth much bigger than journalism, and the scientist in me felt suddenly in the presence of a compatriot.

The pursuit of truth does change your life. It moves the focus off of oneself and out into the world. It makes hypotheticals and counterfactuals a natural part of one's being. It makes finding out you're wrong not only acceptable but desirable, because then you are closer to something you couldn't see before. It helps you to separate yourself -- your ego -- from your hypothesis about the world, which depersonalizes many interactions with other people and with the world. Note that it doesn't erase you or your ego; it simply helps you think of the world independent from them.

I'm sure every scientist knows just what Bradlee meant.


Posted by Eugene Wallingford | Permalink | Categories: General, Running

June 19, 2006 2:57 PM

Recruiting a Company to the Area

I had a new experience this morning and learned a new piece of vocabulary to boot. The regional chambers of commerce are recruiting a "software" company to the area, and they asked me and the head of career services here to participate in the initial contact meeting with the company's founder and CFO. I was expecting a software development company, but it turns out that the company does CRM consulting for large corporations, working as a partner with the corporation that produces the particular software platform.

First, in case you aren't familiar with the term, CRM stands for "customer relationship management". It is the customer-facing side of a corporation's technical infrastructure, as contrasted to ERP -- "enterprise resource planning" -- on the back end. As far as I know, Siebel, recently purchased by Oracle, and SAP are the big CRM software companies in the US. With the push on everywhere to squeeze the last penny out of every part of the business, I expect that companies like these, and their consulting partners, are likely to do well in the near future. In any case, our local economy doesn't participate in this part of the software-and-services ecosystem right now, so attracting a firm of this sort would open up a new local opportunity for our students and strengthen the IT infrastructure of the region.

"Selling" our CS department to this company turned out to be pretty easy. They have had great success with IT students from the Midwest and are eager to locate in a place that produces that kind of graduate. I had figured that they might be looking for particular skills or experiences in our students, but beyond knowing either Java or C++, and having access to a course in databases, they asked for nothing. This was refreshing, considering that some companies seem to want university programs to do job training for them. These folks want good students that they can train. That we can do.

Not having participated in many recruiting meetings of this sort before, I was prepared to give the standard pitch: We have great students; they learn new skills with ease; they become model employees; etc. But the company founder short-circuited a lot of that by reminding me that almost every college and university says the same thing. I adapted my pitch to focus on harder facts, such as enrollments, graduation rates, and curriculum. My best chance to "sell" came when answering their questions, because it was only then that we got a good sense of what they were looking for.

Not being a big-time consumer or salesman, I have to remind myself that the things the other guys are saying are meant to sell them and so need to be examined with a critical eye. These folks seemed pretty straightforward, though they did make some claims about the salaries our graduates earn that seemed calculated to enhance their position in negotiating with the cities. But again, I was surprised -- pleasantly -- to find that this company does not seek financial support until after it has its operation in place and has reached an initial employment goal. Rather than trying to extort incentives out of the city upfront, they contribute first. That seems like both a great way to do business and a great way to sell your company to the locals.

During the meeting, it occurred to me just how hard it is to "sell" the quality of life of our area. Just as every university says that it produces great students, every town, city, and metro area touts the fine quality of life enjoyed by its residents. If we think we offer more or better -- and in many ways, I think we do -- how do you get that across in a three-hour meeting or a 10-minute DVD? I lived here for many years before I fully appreciated our recreational trail system, which doubles quite nicely as a commuting mechanism for those who are so inclined. (Now that I spend 7 or 8 hours a week running on our roads and trails, I appreciate them!)

This was the first meeting, but things will move fast. For the next month or so, both sides of the deal will perform their due diligence, and if things work out a deal will be in place by fall. I expect that the university is done with its part, and so the next I hear -- if anything -- will be a public announcement of the development. Like the Halting Problem, no answer doesn't mean that the answer is 'no', though the longer I wait for an answer the less likely that the answer will be 'yes'.

Oh, the new vocabulary: value proposition. Not being tuned in to the latest marketing terminology, I don't think I'd ever heard this phrase before today, but our founder used it several times. He was otherwise light on jargon, at least on jargon that a CS guy would find unusual, so that was okay. Google tells me that "a value proposition is a clear statement of the tangible results a customer gets from using your products or services". The founder spoke of the company's value proposition to the community, to the city and state, and to our graduates. He was clear on what he thinks his company offers all three of these groups -- also a good way to sell yourself.

Three-hour business meetings are not usually at the top of my list of Best Ways to Spend a Beautiful 80-degree Day, but this was pleasurable. I still have a lot to learn about the world our students work in.


Posted by Eugene Wallingford | Permalink | Categories: General

June 11, 2006 2:05 PM

Pleasantly Surprising Interconnections

The most recent issue of the Ballast Quarterly Review, on which I've commented before, came out a month or so ago. I had set it aside for the right time to read and only came back around to it yesterday. Once again, I am pleasantly surprised by the interconnectedness of the world.

In this issue, editor Roy Behrens reviews John Willats's book Making Sense Of Children's Drawings. (The review is available on-line at Leonardo On-Line.) Some researchers have claimed that children draw what they know and that adults draw what they see, and that what we adults think we see interferes with our ability to create authentic art. Willats presents evidence that young children draw what they see, too, but that at that stage of neural development they see in an object-centered manner, not a viewer-centered manner. It is this subjectivity of perspective that accounts for the freedom children have in creating, not their bypassing of vision.

The surprising connection for me came in the form of David Marr. A vision researcher at MIT, Marr had proposed the notion that we "see by processing phenomena in two very distinct ways", which he termed viewer-centered and object-centered. Our visual system gathers data in a viewer-centered way and then computes from that data more objective descriptions from which we can reason.

Where's the connection to computer science and my experience? Marr also wrote one of the seminal papers in my development as an artificial intelligence researcher, his "Artificial Intelligence: A Personal View". You can find this paper as Chapter 4 in John Haugeland's well-known collection Mind Design and on-line as a PDF at Elsevier.

In this paper, Marr suggested that the human brain may permit "no general theories except ones so unspecific as to have only descriptive and not predictive powers". This is, of course, not a pleasant prospect for a scientist who wishes to understand the mind, as it limits the advance of science as a method. To the extent that the human mind is our best existence proof of intelligence, such a limitation would also impinge on the field of artificial intelligence.

I was greatly influenced by Marr's response to this possibility. He argued strongly that we should not settle for incomplete theories at the implementation level of intelligence, such as neural network theory, and should instead strive to develop theories that operate at the computational and algorithmic levels. A theory at the computational level captures the insight into the nature of the information processing problem being addressed, and a theory at the algorithmic level captures insight into the different forms that solutions to this information processing problem can take. Marr's argument served as an inspiration for the work of the knowledge-based systems lab in which I did my graduate work, founded on the earlier work on the generic task model of Chandrasekaran.

Though I don't do research in that area any more, Marr's ideas still guide how I think about problems, solutions, and implementations. What a refreshing reminder of Marr to encounter in light reading over the weekend.

Behrens was likely motivated to review Willats's book for the potential effect that his theories might have on the "day-to-day practice of teaching art". As you might guess, I am now left to wonder what the implications might be for teaching children and adults to write programs. Direct visual perception has less to do with the programs an adult writes, given the cultural context and levels of abstraction that our minds impose on problems, but children may be able to connect more closely with the programs they write if we place them in environments that get out of the way of their object-centered view of the world.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

May 31, 2006 1:07 PM

Market-Driven Academia

In this era of rising costs, heightened competition, and falling public support for education, universities are becoming more and more driven by public relations. I encountered a perfect example at a meeting this morning.

One of the offices on campus was demoing a beta version of a new on-line tool that will help the university interact with students and community colleges. They plan to go 'live' with the tool later this summer. Right now, they are introducing the tool to faculty on campus, both to get feedback to improve the program and to build awareness of the tool among potential stakeholders.

Further, before going live, these folks plan to visit community colleges around the state to teach them about the tool, to build awareness among another key group of stakeholders. Then they'll present the tool to students here at the university and elsewhere.

One of their reasons for the concerted effort to spread the word so broadly was very pragmatic: After university public relations sends out press releases on this new tool, they expect the press to immediately ask community college folks what they think of about the tool's effect on them and their students. And the university wants these folks to be able to respond to press queries in an informed -- and, hopefully, positive -- way. Good words in the press make for good public relations.

The fact that the university is making an effort to educate potential users and stakeholders is not unusual; I'd expect the same for any new tool. What struck me was the deliberate effort to move the education stage so early in the process, as a part of the PR initiative. And the campaign of enlightenment won't be limited to people directly affected by the tool and its use; the university also plans to demo the tool to key legislators in the state and in the districts served by the community colleges. University/community college relations are a hot political issue these days, and the university wants fair attention given to its efforts to meet the desires of the folks who hold the purse strings, and the folks who elect those folks.

The PR campaign goes farther than just educating stakeholders. The unit responsible for this tool is already working on trademarking the name and logo of the software, to solidify their use in PR and to prevent unscrupulous competitors from swooping in after the launch and stealing intellectual property. (That flowery prose is mine, not the university's.)

I can't say that I blame the university for working so hard to shape its image before the legislature and the public at large. Perception is important. With so many entities competing for state appropriations, the university needs to sell itself better. Some might say that public agencies aren't competing, but they are. Within any given political culture, public funding is limited, so choices have to be made.

So long as the university doesn't subvert its purpose, or do things it wouldn't otherwise do for the sake of publicity, playing the PR game seems an unavoidable reality these days.

Bioinformatics poster

I've already written about my department's effort to market a new program in bioinformatics. I view our efforts as a reasonable attempt to make information available to the people who need it in order to make informed choices about careers and educational opportunities. We will be moving forward this summer to let more students and teachers know about our program. For now, you can see a couple of pieces of literature developed by the university's PR department to help us, an 11"x17" poster (right) and an 8-1/2"x8-1/2" bifold brochure (PDF).


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

May 25, 2006 10:12 AM

Dumbing Down Recipes

As a part of seeing my wife and daughters off to Italy, I cooked a few special meals for them on Sunday and Monday. One friend suggested that I needn't have bothered, because they will encounter much better food in Italy, but I think that an unusual meal prepared by Dad is still a nice treat -- and my wife loved not having to think about any meals for the last couple of days of packing and preparing. Besides, I'm not too shabby in the kitchen.

I like to cook. I'm not an accomplished chef, or anything of the sort, just an amateur who likes to work in the kitchen and try new things.

While blanching asparagus for my last and finest effort of the weekend, I remembered an article that ran in our local paper last March under the headline Cookbooks simplify terms as kitchen skills dwindle. It discusses the dumbing down of cookbooks over the last couple of decades because Americans no longer know the common vocabulary of the kitchen. These days, recipes tend not to use words like "blanch", "dredge", or even "saute", "fold", and "braise", for fear that the casual reader won't have any idea what they mean. Cookbooks that buck the trend must provide detailed glossaries that explain what used to be standard techniques.

In some ways, this is merely a cultural change. People generally don't spend as much time cooking full meals or from scratch these days, and women in particular are less likely than their mothers to carry forward the previous generation's traditional culinary knowledge. That may not be a good or bad thing, just a difference borne out of technology and society. The article even implicates the digital computer, claiming that because kids grow up with computers these days they expect everything, even their cooking, to be fast. Who knew that our computers were partly responsible for the dumbing down of America's kitchen?

I sometimes think about connections between cooking and programming, and between recipes and programs. Most folks execute recipes rather than create them, so we may not be able to learn much about learning to program from learning to cook. But the dumbing down of cooking vocabulary is a neat example of how programs work. When a recipe says to "fold" an ingredient into a mixture, it's similar to making a procedure call. Describing this process using different terms does not change the process, only the primitives used to describe the process. This focus on process, description, and abstraction is something that we computer scientists know and think a lot about.
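The analogy can be made concrete in a few lines of code. This is a playful, hypothetical sketch of my own (the function and its steps are invented for illustration): "fold" names a fixed sequence of lower-level actions, just as a procedure call hides its body behind a single name.

```python
def fold(ingredient, mixture):
    """'Fold' is one word standing for a whole sequence of steps,
    the way a procedure name stands for its body."""
    return [
        f"pour the {ingredient} on top of the {mixture}",
        "cut down through the center with a spatula",
        f"sweep along the bottom and turn the {mixture} over",
        "rotate the bowl a quarter turn and repeat gently",
    ]

# A recipe that says "fold in the egg whites" is making this call:
for step in fold("egg whites", "batter"):
    print(step)
```

A cookbook that spells out all four steps every time is, in effect, inlining the procedure body at every call site; the dumbed-down recipe trades a vocabulary of abstractions for longer, flatter text.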

In a more general teaching vein, I chuckled in my empathy for this cookbook editor:

"Thirty years ago, a recipe would say, 'Add two eggs,'" said Bonnie Slotnick, a longtime cookbook editor and owner of a rare-cookbook shop in New York's Greenwich Village. "In the '80s, that was changed to 'beat two eggs until lightly mixed.' By the '90s, you had to write, 'In a small bowl, using a fork, beat two eggs,'" she said. "We joke that the next step will be, 'Using your right hand, pick up a fork and...' "

Students probably feel that way about programming, but I sometimes feel that way about my students...

... which brings me back to my day job. I have reason to think about such issues as I prepare to teach CS 1 for the first time in a decade or so. Selecting a textbook is a particular challenge. How much will students read? What kinds of things will they read? How well can they read? That seems like an odd question to ask of college freshmen, but I do wonder about the technical reading ability of the average student who has questionable background in math and science but wants to "program computer games" or "work with computers". Colleagues complain about what they see as a dumbing down of textbooks, which grow in size, with more and more elaborate examples, while in many ways expecting less. Is this sort of text what students really need? In the end, what I think they really need are a good language reference and lots of good examples to follow, both in the practice of programming and in the programs themselves. It's our job to teach them how to read a language reference and programs.

My selection of a CS 1 textbook is complicated by the particular politics of the first year curriculum in my department. I need something that feels traditional enough not to alienate faculty who are skeptical of OO, but true enough to OO that Java doesn't feel like an unnecessary burden to students.

blanching vegetables

Postscript: Many recipes require that vegetables be blanched -- scalded in boiling water a short time -- before being added to a dish. Blanching stops the enzyme action, which allows them to stay crisp and retain their color and flavor. Here is a simple how-to for blanching. I didn't learn this from my mom or any of the cooks in my family (including my dad); I learned it the old-fashioned way: I ran into the term in a recipe, I wanted to know what it meant, so I looked it up in the cookbook. If we could give our programming students the right sort of reference for looking up the terms and ideas they encounter, we would be doing well. Of course, some students of programming will be like some students of cooking and try to fake it. I don't recommend faking the blanching of asparagus -- it's a temperamental vegetable!


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

May 24, 2006 2:09 PM

One Big Expenditure

map of italy

As I mentioned yesterday, as father to two active daughters, I sometimes make price-conscious decisions in other areas. The expenses of fatherhood are on my mind right now, because I have a large credit card bill in store... On Monday of this week, my wife and daughters left for a two-week trip to Italy. They'll be staying with friends who are stationed at Aviano Air Base, in northern Italy not too far from Venice. The friends will serve as local hosts and tour guides, which makes such a big trip less daunting.

This will be a great experience for the girls. They will be missing the last two weeks of classes at school, but they'll learn far more on the trip than they would during two weeks in school (especially, sadly, the last two weeks of school, when things seem to wind down a bit too fast for my tastes). As Andy Hunt wrote a while back, travel expands the brain. I am so glad that they have this opportunity, and a bit envious.

What about me and my opportunity? Bad timing. This is the end of the academic and fiscal years at school, and both present me with things that have to be done in the near-term. Besides, I didn't have the most productive spring and so owe my department some work that I'd promised earlier. On top of all that, I've been tired, run down, and injured since the end of February or so, and I just need time to recover. As much as I miss my wife and girls already, I am glad to have some quiet time to rest and relax my mind a bit.

We thought about delaying the trip until later in the summer and traveling as a family but, to tie back to my previous post yet again, the cost of airfare rose dramatically as we got into traditional summer travel dates. We also started to run into conflicts with other summer activities. In the end, we decided that the opportunity was too good for Mary and the girls to pass up... So now I look forward to frequent e-mail, an occasional phone call, and when they return a thorough review from the journals that they are all keeping. (No, we didn't set them up to blog their trip, though they'll be taking plenty of digital photos!)

It's too bad we couldn't make a later trip fit our schedules and budget; Italy would have been a great change of scenery for me, too. I've not yet been to Europe, and I know I'd love to see so much of it. The end of next month would have been perfect: I could have attended ITiCSE -- in Bologna, no less! Alas, the airline tickets would have been $1400 or more each, and the idea that the price of my ticket would have been tax-deductible just wasn't attractive enough.


Posted by Eugene Wallingford | Permalink | Categories: General

May 23, 2006 4:23 PM

Quality is Only One Good

It seems that folks have been discussing a flap between Google and Yahoo about choice in the search engine market. "Choice" has become a loaded term in several political contexts over the last couple of decades, and I suppose that this context is political in its own way.

I don't have much to say today about the Google, Yahoo, Microsoft situation, but something Jeremy Zawodny said recently struck a chord:

First off, I agree that companies should compete based on quality. But Microsoft and McDonald's are both shining examples of how that's not necessarily the way it works when "the market" is involved in the decision making. Price and convenience tend to trump quality.

Far be it from me to defend Microsoft and McDonald's, neither of whose products I use with any frequency. But... it seems Jeremy is saying that companies compete based only on quality, whereas the market introduces unfortunate forces such as price and convenience into the mix.

Why should companies compete based only on quality?

Quality is only one good. Other features have value, too. I don't think the market introduces issues such as price and convenience into the equation so much as exposes what people really value.

I think that I value quality, but I also value other things in this world. In particular, I am often a price-sensitive consumer. With two daughters to raise, I sometimes have to make choices based on both quality and price, if I want to save money for other purchases I want to make.

In the search arena, if Yahoo! or some other company creates the absolute best, highest-quality search engine, but it costs a pretty penny, I may choose a "lower-quality" provider simply to conserve my scarce resources for something else that I value. And I may not suffer at all for my choice; good enough is often good enough.

We face this kind of choice in software development, of course, and folks like Richard Gabriel have written about the phenomenon's effect in the software world. Agile methodologies encourage us not to build the Perfect Solution if we aren't gonna need it. I suppose that the choice between "do the right thing" and "do the simplest thing" is still an open question in the software world, with many folks not enamored with the agile approach to things, but I think in the long run "do the simplest thing" will win out -- and produce both the most software and the best software.

This is all basic economics: we have to make choices in a world of scarce resources and conflicting values.

As something of a disclaimer, while I can pinch pennies with the best of them, I'm not a "least common denominator" kind of guy. I don't eat a lot of fast food, at McDonald's or elsewhere, because I value some things more than the convenience and immediate price savings over some of the alternatives. I'm writing this blog entry on a computer made by a company that has built its reputation on the idea that it makes better products. Users of these products seem prouder than most to be using the better tools. And those of us who use these products pay a small premium to do so. When I buy a new computer, I take quality and price into account, along with a whole host of other factors, including convenience and the intangibles of my experience using the product.

I value quality, but I value many other things, too.

(Oh, and if Jeremy didn't mean what I thought he meant, I apologize for dragging him into this. His blog entry was simply the trigger for this piece. For more on triggers in writing, start with this piece by Richard Gabriel. I also recommend the Hugo book that Richard cites.)


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

May 18, 2006 3:59 PM

Summer Means Fewer Distractions

Several of my friends and colleagues have commented on the end of the academic year, which is different in some ways for me now that I am doing administrative duties as well as teaching. I've been known to do a dance of joy at the end of a tough three-course teaching semester. Now my duties spread into the summer, but it's still nice to have a different rhythm to my days and weeks.

I am reminded of this little story by Chad Fowler, called Fight The Traffic:

I got in a cab last night heading from Washington D.C. to the Dulles airport. I was a block from the White House and traffic was stopped behind a crowd that was pushing its way in to see the President.

The old Ethiopian cab driver suddenly kicked the taxi into gear and zipped around a line of cars, edging us five cars closer to freedom.

"I hate traffic", he grumbled.

"You picked the wrong job, then, didn't you?"

"No! I love my job. My job is to fight the traffic."

In some ways, department heads are like cabbies. If you don't like to fight the traffic, then you probably won't like the job.

I spent a lot of time this year multitasking. There is a meeting this afternoon that I simply must attend? Grab the laptop and try to get a little light work done in the back. When in the office, I'd be working on some task with frequent context switches out for whatever phone call or walk-in visitor arrived. Soon I learned the myth of multitasking, illustrated by this photo from 43 Folders:

the myth of multitasking, courtesy Merlin Mann

It's a mirage, tempting as it may be. Context switching has a measurable deleterious effect on my performance.

Perhaps one day I can reach a state of Zen mind in which I can live this recommendation from Ron Rolheiser:

Henri Nouwen once wrote "I used to get upset about all the interruptions to my work until one day I realized that the interruptions were my real work."

Purely earthly accidents often do make us responsible for what is divine, and they conscript us to our real work.

I do realize that at least part of my job is to fight the traffic, to make the lives of the faculty and students better in the process. But on too many days it all just feels like an unending distraction. Part of my task now is to learn how to manage this flow of needs and wants better, and another part is to learn what part of this flow really is my real work.

So my answer to all my friends and colleagues is, I'm still doing the dance of joy at semester's end, but for a different reason. I'm looking forward to a couple of months in which to collect my thoughts, work at a steadier pace, internalize a few lessons from the year, figure out how to get better -- and to hack a little Ruby or Scheme or Java, when the mood strikes me!


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

May 09, 2006 9:19 AM

A Weekend in Portland

OOPSLA 2006 logo

I was in Portland this weekend for the spring meeting of the OOPSLA 2006 conference committee. This is the meeting where we assemble the program for the conference, from technical papers to lightning talks to invited and keynote talks, from Onward! to the Educators' Symposium to DesignFest, from workshops to my area of responsibility this year, tutorials. It looks like we will have 57 tutorials this year, covering a range of topics in the OOP community and out in the industrial software development community. It's a tough job to assemble the elements of such a program, which ranges over five days and encompasses affiliated events like GPCE and, for the first time ever this year, PLoP. Trying to schedule events in such a way as to minimize the number of times conference attendees say, "Rats! There are two things I want to see right now. In the session before lunch, there were none!" is a challenge in itself. I suppose that, in some way, we'd be happy if every session created a conflict for attendees, but I'm not sure the attendees would like it so much!

As I've done in the past when chairing the 2004 and 2005 Educators' Symposia, I owe a great debt to my program committee of seven OOPSLA veterans. They did most of the heavy lifting in reading and evaluating all of the submissions we received. I had to make some tough calls at the end, but their input made that doable.

Some highlights from the weekend:

PDX, the Portland International Airport, has free wireless -- with better coverage than promised. Hurray!

Why is Onward! a must-see? So that you can "cool your pinkies in the mud of the future". Maybe it's a must-see because the chairs of Onward! are the kind of people who say such things. I'm especially looking forward to Onward! films, which I had to step out of last year.

I am not sure that my students are ready for a web page about me that looks like this pictorial biography of past OOPSLA chair Douglas Schmidt. My favorite is this take on the old cartoon standard about the evolution of man:

the evolution of a programmer

You may recall me commenting on a particular sign I saw while running in Portland back at the fall meeting. Doesn't it always rain in Portland? Maybe so: it rained again both nights and mornings I was there this time. It was nice enough when I arrived Friday evening, if cool, and the sun had poked through the clouds Monday afternoon -- as we left.

At least it didn't rain on me while I ran. Unfortunately, I was only able to run my first morning in town. It was my first 12-miler in eight weeks or so, and felt pretty good. But, just as I did the second day at SIGCSE and the second day at ChiliPLoP, I came down with some sort of respiratory thing that sapped all of my energy. So I took today off, and probably will tomorrow, too, just to get back to normal. I have a feeling that I won't be bragging about my mileage this year like I did at the end of 2005... I'm beginning to wonder about the pattern and what I can do to make air travel workable again for me. I won't have a chance to test any hypothesis I develop until October, when I go to PLoP and OOPSLA.

Finally, on a less frivolous note, we spent a few minutes on Monday morning to plan a memorial for John Vlissides, whose passing I memorialized last winter. We want this memorial to be a celebration of all the ways John touched the lives of everyone he met. In an e-mail conversation last week, 2006 Educators' Symposium chair Rick Mercer pointed out a picture of John and me that I didn't know about from John's wiki, courtesy of Dragos Manolescu.

John Vlissides and Eugene Wallingford at the 2001 post-OOPSLA Hillside meeting

I remember that moment clearly. It has an OOPSLA connection, too, because it was taken at the annual fall meeting of the Hillside Group, which traditionally takes place the evening and morning after OOPSLA. (I've missed the last two, because the night OOPSLA ends is the traditional celebration dinner for the conference committee, and I've been eager to get home to see my family after a week on the road.)

John and I were part of a break-out group at that Hillside meeting on the topic of how to create a larger world in which to publish some of the work coming out of the PLoPs. Most academic conferences are places to publish novel work, and most pattern work is by definition not all that new -- it documents patterns we see in many existing code bases. The pattern as a literary work and teaching tool is itself novel, but that's just not what academic conference program committees are looking for.

Anyway, John and I were brainstorming. I don't remember what we produced in that session, but my clueless expression indicates that at that particular moment John was the one producing. That is not too surprising, yet he made me feel like an equal partner in the work. I guess I didn't hurt his impression of me too much. When he became general chair of OOPSLA 2004, he asked me to get involved with the conference committee for the first time, as his Educators' Symposium chair. Good memories. Thanks for the pointer to the photo, Rick. And thanks, Dragos, for taking the photo and sharing it.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

April 30, 2006 12:04 PM

Process on My Mind

Jesus Christ Superstar

My family and I watched the Tim Rice/Andrew Lloyd Webber rock opera "Jesus Christ Superstar" this weekend. Both of my daughters are into the theater, having performed a few times and seen most of the local children's theater's productions over the last many years. My older daughter vaguely remembers a stage production we saw a few years ago at an excellent local playhouse and wanted to see the show again. Our local library had two versions, the original 1973 movie and a 2000 London theatrical performance staged for television. Fortunately, we all like the music, so watching the same show on back-to-back nights was just fine.

Watching two versions so close in time really made the differences in tone, characterization, and staging stand out in great relief. The newer version took a European viewpoint, with the Romans as fascist/Nazi-like overlords and the common people seeking a revolution. The older version focused more on the personal struggles of the main characters -- Jesus, Mary, and especially Judas -- as they tried to come to grips with all that was happening around them.

For some reason, this brought to mind a short blog entry called Process as theatre written by Laurent Bossavit nearly two years ago. Laurent considers the differences between Extreme Programming as described in Kent Beck's original book and as practiced by Kent and others since, and compares them to the script of a play like "Hamlet". The script stays the same, but each staging makes its own work of art. The two videos I watched this weekend were at the same time both the same play and very different plays. (I was proud when my younger daughter recognized this and was able to express the two sides.)

Folks who feel compelled to follow every letter of every rule of a methodology often find themselves burning out and becoming disillusioned. Or, even when they are able to keep the faith, they find it difficult to bring others into the process, because those folks don't feel any need to be so limited.

On the other hand, we've all seen performances that take too many liberties with a script or story -- and instinctively feel that something is wrong. Similarly, we can't take too many liberties with XP or other methodologies before we are no longer working within their spirit. In XP, if we give up too many restrictions, we find that some of the remaining practices lose their effectiveness without the balancing effects of what we've removed.

As in so many things, striking the right balance between all or nothing is the key. But we start from a healthier place when we realize that a development process consists of both script and production, fixed and dynamic elements working together to create a whole.

I had forgotten that Laurent's blog entry refers to the book Artful Making. In an interesting confluence, I just this week asked our library to purchase a copy of this book so that I can read it over the summer. Now I'm even more eager.

Josh Mostel as King Herod

Oh, and on the two versions of "Superstar": call me an old fogey, but I still love the 1973 movie. Larry Marshall as Simon Zealotes gives an awesome performance in his highlighted scene, and Josh Mostel delivers one of the all-time great comedic song-and-dance performances as King Herod. "Get out of my life!"


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

April 20, 2006 6:54 PM

Artistic Diversions from Artistic Work

This
blog
speaks in
rhythms of
numbers in patterns.
Words in patterns entice the mind.

That bit of doggerel is my first Fib -- a poem!

Each poem is six lines, twenty syllables. You should see a familiar pattern.

The syllables of a Fib follow the Fibonacci sequence. And, for my own amusement, the last three paragraphs extend the pattern.

Don't I have something better to do, like prepare slides for my talk on steganography at this weekend's Camouflage Conference at UNI? Of course! But sometimes the mind chooses its own direction. (Count the syllables!)
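The syllable pattern behind a Fib can be sketched in a few lines of Python. This is just a hypothetical illustration of the counting rule described above; the function names are my own, and real syllable counting for arbitrary text would need a dictionary or heuristic not shown here.

```python
def fib_counts(n=6):
    """Return the per-line syllable counts of an n-line Fib poem:
    the Fibonacci sequence starting 1, 1, 2, 3, 5, 8, ..."""
    counts = []
    a, b = 1, 1
    for _ in range(n):
        counts.append(a)
        a, b = b, a + b
    return counts

def is_fib_poem(syllables_per_line):
    """A poem qualifies if its per-line syllable counts follow the sequence."""
    return syllables_per_line == fib_counts(len(syllables_per_line))

print(fib_counts())                       # [1, 1, 2, 3, 5, 8]
print(sum(fib_counts()))                  # 20 -- six lines, twenty syllables
print(is_fib_poem([1, 1, 2, 3, 5, 8]))    # True
```

Extending `fib_counts` past six lines gives the 8-, 13-, and 21-syllable counts of the paragraphs that follow the poem.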


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

March 27, 2006 6:41 PM

Busy with a Move...

I haven't had a chance to write in over a week, but a busy week it has been. My last post foreshadowed the move of my department office to a new home, in an old building on campus that has been renovated into an "innovative teaching and technology center". That name actually sets a higher standard than the building can live up to, but it will be a good home for us.

Last week, we began moving the department office to its new home, to make space for a computer lab. I suppose that we didn't technically have to move the department head's office yet, because the old office isn't needed for anything else yet. But anyone who has ever been at a university knows that heads, deans, and the like depend on their administrative staff too much to be separated from them. In this era of continuous electronic communication, I could probably survive better than most administrators from the past, but it would make an already challenging job all the more difficult to be two buildings away from the department secretary. So I moved, too, if only the working set of my office for now.

Not that I didn't have good reason to want to move. The university's provost, after taking a tour of the new building during construction, made a point of telling me that my office may have the best view on campus. Here is a digital snapshot looking out my window toward the southwest on a recent sunny day:

SW view from my ITTC office window

Here's a lower-quality picture of the view toward the northwest. The campanile serves as a nice reference point:

NW view from my ITTC office window

I don't have the highest-quality digital camera, but I've done the best I can to reproduce my view. The glare off my windows makes it tough to get good shots from all angles. But, quoting Sidra from Seinfeld, I can say that these images "are real, and they are spectacular". Either karma is looking out for me, or I have simply lucked into one of the best offices on campus. I should probably be nervous that my provost, my dean, and nearly every other administrator on campus know about my office and view, but I've been too busy with the business of the department to worry.

Of course, the construction in the rest of the building is still being completed, and there are warts to fix. For example: When I came into the office last Friday morning, the temperature in my office was 63 degrees Fahrenheit; this morning, it was 83 degrees Fahrenheit. After over eleven hours with the windows open and the door open to permit circulation through the department office, I have the temp down to 78 degrees. But I'm sure that this is the sort of kink we can work out over the next few months. (Let's hope so. With windows and western exposure, the summer could prove quite warm otherwise!)

Computer science happens for me these days only in fits and spurts, mostly short ones when I decide to find time to read by downgrading some other task's priority. This summer is my next real chance to do real computing. I've been learning a lot this year, but I hope to put that learning to good use in the future by managing times and tasks more carefully.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

February 13, 2006 6:09 PM

Doing What You Love, University Edition

Several folks have commented already on Paul Graham's How to Do What You Love. As always, this essay is chock full of quotable quotes. Ernie's 3D Pancakes highlights one paragraph on pain and graduate school. My grad school experience was mostly enjoyable, but I know for certain that I was doing part of it wrong.

Again as always, Graham expresses a couple of very good ideas in an easy-going style that sometimes makes the ideas seem easier to live than they are. I am thinking about how I as a parent can help my daughters choose a path that fulfills them rather than the world's goals for them, or mine.

Among the quotable quotes that hit close to home for me were these. First, on prestige as siren:

Prestige is especially dangerous to the ambitious. If you want to make ambitious people waste their time on errands, the way to do it is to bait the hook with prestige. That's the recipe for getting people to give talks, write forewords, serve on committees, be department heads, and so on. It might be a good rule simply to avoid any prestigious task. If it didn't suck, they wouldn't have had to make it prestigious.

Ouch. But in my defense I can say that in the previous fourteen years my department had beaten all of the prestige out of being our head. When I came around to considering applying for the job, it looked considerably less prestigious than the ordinary headship. I applied for the job precisely because it needed to be done well, and I thought I was the right person to do it. I accepted the job with a shorter-than-usual review window and no delusions of grandeur.

Then, on prematurely settling on a career goal:

Don't decide too soon. Kids who know early what they want to do seem impressive, as if they got the answer to some math question before the other kids. They have an answer, certainly, but odds are it's wrong.

From the time I was seven years old until the time I went to college, I knew that I wanted to be an architect -- the regular kind that designs houses and other buildings, not the crazy enterprise integration kind. My premature optimization mostly didn't hurt me. When some people realize that they had been wrong all that time, they are ashamed or afraid to tell everyone and so stay on the wrong path. During my first year in architecture school, when I realized that as much as I liked architecture it probably wasn't the career for me, I was fortunate enough to feel comfortable changing courses of study right away. It was a sea change for me mentally, but once it happened in my mind I knew that I could tell folks.

I somehow knew that computer science was where I should go. Again, I was fortunate not to have skewed my high school preparation in a way that made the right path hard to join; I had taken a broad set of courses that prepared me well for almost any college major, including as much math and science as I could get.

One way that my fixation on architecture may have hurt me was in my choice of university. I optimized school selection locally by picking a university with a strong architecture program. When I decided to switch to a CS major, I ended up in a program not as strong. I certainly could have gone to a university that prepared me better for CS grad school. One piece of advice that I'll give my daughters is to choose a school that gives you many options. Even if you never change majors, having plenty of strong programs will mean a richer ecosystem of ideas in which to swim. (I already give this advice to students interested in CS grad school, though there are different trade-offs to be made for graduate study.)

That said, I do not regret sticking with my alma mater, which gave me a very good education and exposed me to a lot of new ideas and people. Most of undergraduate education is what the student makes of it; it's only at the boundaries of high ambition where attending a particular school matters all that much.

Nor would I have traded my time in architecture school for a quicker start in CS. I learned a lot there that still affects how I think about systems, design, and education. More importantly, it was important for me to try the one thing I thought I would love before moving on to something else. Making such decisions on purely intellectual grounds is a recipe for regret.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

February 02, 2006 4:21 PM

Mac OS X Spell Checker Trivia

Before posting my last piece on Java trivia, I ran Mac OS X's spell checker on my article, and it flagged my very first line:

Via another episode of College Kids Say the Darnedest Things,

It suggested that I use Damnedest instead. This demonstrates several things:

  • My spell checker must not know me very well.
  • My spell checker doesn't know much about Art Linkletter, or Bill Cosby's TV biography.
  • My spell checker doesn't know about Mark Twain's advice:
    Substitute "damn" every time you're inclined to write "very"; your editor will delete it and the writing will be just as it should be.

So that's your Mac OS X spell checker trivia for the day.

Speaking of the day... Happy Groundhog Day! I like many of Bill Murray's movies, and even wrote an agile software development fantasy about one of them, but Groundhog Day is my favorite. Cue it up.


Posted by Eugene Wallingford | Permalink | Categories: General

January 06, 2006 4:02 PM

... But You Doesn't Have to Call Me Lefschetz

In the last two days, I have run across references to John von Neumann twice.

First, I was reading The Geomblog yesterday and found this:

It reminds me of a quote attributed to John von Neumann:

In mathematics you don't understand things. You just get used to them.

I've had that feeling in computer science... A few months ago I described something similar, but in that case I did come to understand the course material; it only seemed as if I never would. My "just get used to it" experiences came in an area right up Suresh's alley: Computational Complexity. I loved that class, but I always felt like I was swimming in the dark -- even as I did well enough in the course.

Then today I was cleaning out a folder of miscellaneous notes and found a clipping from some long-forgotten article.

In Princeton's Fine Hall, someone once posted a "Scale of Obviousness":

  • If Wedderburn says it's obvious, everybody in the room has seen it ten minutes ago.

  • If Bohnenblust says it's obvious, it's obvious.

  • If Bochner says it's obvious, you can figure it out in half an hour.

  • If von Neumann says it's obvious, you can prove it in three months if you are a genius.

  • If Lefschetz says it's obvious, it's wrong.

I'll venture to say that students at every institution occasionally make such lists and discuss them with their friends, even if they are too polite to post them in public. That's good for the egos of us faculty members. In our fantasies, we are all von Neumanns. In reality, most of us are Bohnenblusts at best and more likely Wedderburns. And we all have our Lefschetz days.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

December 29, 2005 5:35 AM

You Have to Write the Program

As we close out 2005, an article in MSU Today reminds us why scientists have to run experiments, not just sit in their easy chairs theorizing. A group of physicists was working on a new mix of quarks and gluons. Their theory predicted that they were creating a plasma with few or no interactions between the particles. When they ran their experiments, they instead "created a new state of matter, a nearly perfect fluid in which the quarks and gluons interact strongly." Gary Westfall, the lead MSU researcher on the project, said,

What is so incredibly exciting about this discovery is that what we found turned out to be totally different from what we thought we would find. ... But it shows that you cannot just rely on making theories alone. In the end, you have to build the machine and do the experiment. What you learn is often more beautiful than the most vivid imagination.

Folks who don't write computer programs often say that computers only do what we tell them to do, implying somehow that their mechanistic nature makes them uninteresting. Anyone who writes programs, though, has had the sort of experience that Gary Westfall describes. What we learn by writing a program and watching it is often more beautiful than anything we could ever have imagined.

I love this game.

Best wishes to you all for a 2006 full of vivid imagination and beautiful programs.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

December 23, 2005 10:19 AM

On the Popularity of Chess

My only non-professional blog category is on running. That reflects one of my primary personal interests in the time since I started blogging a year and a half ago. If I had started blogging twenty-five years ago, the extra category would have been on chess.

Play it again, Sam.

From grade school into college, but especially in high school, I played a lot of chess, more than anyone I knew. I read chess books, I played chess variants, and I surrounded myself with the miscellania of the chess world. Chess didn't seem especially popular in the late 1970s and early 1980s, but we were still in the midst of the Fischer boom, which had created a resurgence in the popularity of chess in America. The headiness of the Fischer boom years eventually passed. Independently, I got busy at college with computer science (and girls) and had less and less time to play. But I still love the game.

A recent article in the New York Times talks about the further decline of American chess. The article's author, Jennifer Shahade, is a former U.S. women's champion and one of only a few young native-born Americans to have accomplished much in the world of chess over the last decade. There are many great minds who still play chess in the U.S. when they are young, but they are pulled toward more attractive -- and lucrative -- endeavors as they get older. Shahade points to poker, which has undergone a massive boom in popularity over the last decade, as a source of possible ideas for saving chess from its decline.

Her suggestions are reasonable goals accompanied by simple strategies for reaching them. The chess world needs to offer ways for adults to learn the game effectively on-line and to promote the sporting, competitive element of chess. (And I can support Shahade's claim that a long tournament game of chess is much more tiring than many physical activities.) But ultimately the key is finding a way to make chess seem cool and exciting again. A breakthrough on the world stage by a player like Hikaru Nakamura could turn the trick, but it's hard to engineer that sort of event.

Others interested in promoting chess have adopted more, um, salacious methods. Consider the World Chess Beauty Contest, reported in another NYT article on the same day as Shahade's. The WCBC tries to draw people -- well, at least teenage boys -- to chess by focusing on the many beautiful young women chess players around the world. When you are looking at pictures of these young ladies, just don't forget this: most of them are really strong players who can easily defeat the vast majority of chessplayers in the world. But, for the most part, they are not competitive with the very best women players in the world, let alone the top men.

Carmen Kass plays chess

Still, the thought that supermodel Carmen Kass is an avid chess player, became president of the Estonian Chess Federation last year, and is dating German grandmaster Eric Lobron makes me secretly happy. (The above picture is from the second NYT article and shows Kass playing speed chess with Indian super-grandmaster Viswanathan Anand, the world's #2 player.)

Shahade talks about how we could heighten interest in chess tournaments by making them more thrilling, more immediate. Chess tournaments are usually arranged as round-robin or Swiss system affairs, neither of which tends to create do-or-die situations that heighten in intensity as the tournament progresses. In contrast, consider U.S. college basketball's March Madness -- and then imagine what it would be like as a round-robin. Boring -- and much less variable in its outcome.

We all love the mere chance that a Princeton or a UNI will come out of nowhere to upset a Duke or an Indiana, even if it doesn't happen very often. But in chess, the chance of a much lower-ranked player upsetting a better player is quite small. The standard deviation on performance at the highest levels of chess is remarkably small. When you try to cross more than one level, forget it. For example, the chance that I could beat Gary Kasparov, or even earn a draw against him, is essentially zero.

pawn and move odds

My proposal to increase the competitiveness of games among players of different skill levels comes from the 19th century: odds. Odds in chess are akin to handicaps in golf. For example, I might offer a weaker player "pawn odds" by removing my king's bishop's pawn before commencing play. In that case, I would probably play the white pieces; if I gave pawn odds and played black, then I would be giving "pawn and move" odds. (Moving first is a big advantage in chess.)

Back in the 1800s, it was common for even the best players in the world to take odds from better players. America's first great chess champion, Paul Morphy, made his reputation by beating most of America's best players, and many of Europe's best players, at "pawn and move" odds.

standard chess clock

Since the advent of the chess clock, another way to handicap a chess game is to give time odds. I spent many an evening as a teenager playing speed chess with Indianapolis masters who gave me the advantage of their playing in 1.5 minutes against my 5 minutes. Even at those odds, I lost more quarters than I won for a long time... But I felt like I had a chance in every game we played, despite the fact that those guys were much better than I was.

My experience offering odds has been less successful. When I've tried to offer time odds to students and former students, they balked or outright refused. To them, playing at an advantage seemed unsporting. But the result has generally been one-sided games and, before long, one or both of us loses interest. I've never tried to give piece odds to these folks, because material seems more real to them than time and consequently the odds would seem even less sporting.

Odds chess isn't the complete answer to making top-level chess more attractive, though it might have its place in novelty tournaments. But giving odds could make coffeehouse chess, casual games wherever, and local tournaments more interesting for more players -- and thus offer a route to increased popularity.

This whole discussion raises another, more fundamental question. Should we even care about the popularity of chess? The conventional wisdom is yes; chess is a fun way for kids to learn to concentrate, to think strategically, to learn and deploy patterns, and so on. There is some evidence that children who play chess realize benefits from these skills in school, especially in math. But in today's world there are many more challenging and realistic games than there used to be, and maybe those games -- or learning to play a musical instrument, or learning to program a computer -- are better uses for our young brainpower. As a lover of the game, though, I still harbor a romantic notion that chess is worth saving.

One thing is for certain, though. Poker is a step backwards intellectually. It may be a lot of fun and require many useful skills, but it is much shallower than chess, or even more challenging card games, such as bridge.

The article on Jennifer Shahade that I link to above ends with a paragraph that sums up both the challenge in making chess more popular and a reason why it is worth the effort to do so:

"People sometimes ask me if chess is fun," Jennifer says. "'Fun' is not the word I'd use. Tournament chess is not relaxing. It's stressful, even if you win. The game demands total concentration. If your mind wanders for a moment, with one bad move you can throw away everything you've painstakingly built up."

Modern society doesn't seem to value things that aren't always fun and light, at least not as much as it could. But we could do our children a favor if we helped them learn to concentrate so deeply, to confront a challenge and painstakingly work toward a solution.

Maybe then math and science -- and computer programming -- wouldn't seem unusually or unexpectedly hard. And maybe then more students would have the mental skills needed to appreciate the beauty, power, and, yes, fun in work that challenges them. Like computer science.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

November 28, 2005 12:01 PM

The Passing of a Friend

When I first attended PLoP, I was a rank novice with patterns. But like most everyone else, I had read Design Patterns -- or at least put it on my bookshelf, like everyone else -- and was just a bit in awe of the Gang of Four. Pretty soon I learned that Ralph Johnson and John Vlissides were two of the nicest and most helpful people around. I also met Erich Gamma a time or two and found him to be a good guy, though I never interacted all that much with him. I've never had the pleasure of meeting Richard Helm.

John Vlissides

Within a couple of years, John asked me to chair PLoP 2000. I like to have a decent understanding of something like this before I get started, and John sat with me to patiently answer questions and teach me some software patterns history. He also gave me a lot of encouragement. The conference turned out well.

Then a couple of years later, John approached me with another request: to chair the Educators Symposium at OOPSLA 2004. Again, I had been attending OOPSLA and the Educators Symposium for a few years, even helping on a few symposium committees, but I had never considered taking on the chair, which seemed like much more work and responsibility. Again, John offered me a lot of encouragement and offered to help me in any way he could in his capacity as conference chair -- including giving me his complimentary registration to OOPSLA 2003, so that I could attend the 2004 kick-off planning meeting and begin working with my colleagues to assemble the next year's program. 2003 was a tight money year for my department and me, and John's kindness made it possible for me to attend.

The 2004 Educators Symposium went pretty well, I think we can say. I've certainly talked about it enough here, beginning with this entry. I owed much of its success to John's encouragement and support. When I floated the idea of asking Alan Kay to keynote at the symposium, John said, "Dream big. I'll bet you can do it." Then, when Alan won the Turing Award, John worked to bring the Turing Award lecture to OOPSLA -- but all the while protecting my "coup" at having persuaded Alan to speak at OOPSLA 2004 in the first place. I'd've been happy to have Alan speak at OOPSLA under any circumstances, but I appreciated how John just assumed that I deserved some attention for my efforts and that, under his care, I would receive it.

John was always like that. He was a quiet leader, a person who treated each person with dignity and respect and who, as a result, could make things happen through the team he built.

He was also always a very good writer and scholar. The articles that ended up in his Pattern Hatching went a long way toward making design patterns more accessible to folks who had been a bit swamped by the GoF book. They also taught us a bit about how a good programmer thinks when he writes code.

I am filled with a deep sadness to know that John died at his home on Thanksgiving day, after more than a year and a half battling a brain tumor. He battled silently, with strength and courage derived from his abiding faith in God. My prayers are with his family, especially his wife and children.

As someone said in an announcement of John's death, the essence of John's greatness lay not in his technical accomplishments but in his humanity. He was a friend, a teacher, a colleague, and a mentor to everyone with whom he came into contact. He made everyone around him a better scholar and a better person.

Though I never did anything to deserve it, John was a friend and a mentor to me. I will miss him.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

November 23, 2005 1:46 PM

This and That, from the Home Front

The daily grind of the department office has descended upon me the last couple of weeks, which, with the exception of two enjoyable talks (described here and here), has left me with little time to think in the way that academics are sometimes privileged to do. Now comes a short break full of time at home with family.

Here are a few things that have crossed my path of late:

  • Belated "Happy Birthday" to GIMP, which turned 10 on Monday, November 21, 2005. There is a lot of great open-source software out there, much of which is older than GIMP, but there's something special to me about this open-source program for image manipulation. Most of the pros use Photoshop, but GIMP is an outstanding program for a non-trivial task that shows how far an open-source community can take us. Check out the original GIMP announcement over at Google Groups.

  • Then there is this recently renamed oldie but goodie on grounded proofs. My daughters are at the ages where they can appreciate the beauty of math, but their grade-school courses can do only so much. Teaching them bits and pieces of math and science at home, on top of their regular work, is fun but challenging.

    The great thing about explaining something to a non-expert is that you have to actually understand the topic.

    Content and method both matter. Don't let either the education college folks or the "cover all the material" lecturers from the disciplines tell you otherwise.

  • Very cool: an on-line version of John Von Neumann's Theory of Self-Reproducing Automata.

  • Finally, something my students can appreciate as well as I:

    If schedule is more important than accuracy, then I can always be on time.

    Courtesy of Uncle Bob, though I disagree with his assumption that double-entry bookkeeping is an essential practice of modern accounting. (I do not disagree with the point he makes about test-driven development!) Then again, most accountants hold double-entry bookkeeping in nearly religious esteem, and I've had to disagree with them, too. But one of my closest advisors as a graduate student, Bill McCarthy, is an accountant with whom I can agree on this issue!


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Software Development, Teaching and Learning

November 15, 2005 8:51 PM

Popularizing Science through Writing and Teaching

I have an interest in writing, both in general as a means for communication and in particular as it relates to the process of programming. So I headed over to the Earth Science department yesterday for a talk on popular science writing called "Words, Maps, Rocks: One Geologist's Path". The speaker was Marcia Bjornerud of Lawrence University, who recently published the popular geology book Reading the Rocks: The Autobiography of the Earth. The Earth Science faculty is using Reading the Rocks as reader in one of their courses, and they asked Dr. Bjornerud to speak on how she came to be a geologist and a popularizer of science.

Bjornerud took a roundabout way into science. As a child, she had no desire to be a scientist. Her first loves were words and maps. She loved the history of words, tracing the etymology of cool words back to their origin in European languages, then Latin or Greek, and ultimately back to the source of their roots. The history of a word was like a map through time, and the word itself was this rich structure of now and then. She also loved real maps and delighted in the political, geographical, and temporal markings that populated them. Bjornerud told an engaging story about a day in grade school when snow created a vacation day. She remembers studying the time zones on the map and learning that at least two places had no official time zone: Antarctica and Svalborg, Norway.

These reminiscences probably strike a chord in many scientists. I know that I have spent many hours poring over maps, just looking at cities and open spaces and geopolitical divisions, populations and latitudes and relative sizes. I remember passing time in an undergraduate marketing class by studying a large wall map of the US and first realizing just how much bigger Iowa (a state I had never visited but would one day call home) was than my home state of Indiana (the smallest state west of the Appalachian Mountains!). I especially love looking at maps of the same place over time, say, a map of the US in 1500, 1650, 1750, 1800, and so on. Cities grow and die; population moves inexorably into the available space, occasionally slowing at natural impediments but eventually triumphing. And words -- well, words were why I was at this talk in the first place.

Bjornerud loved math in high school and took physics at the suggestion of friends who pointed out that the calculus had been invented in large part in order to create modern physics. She loved the math but hated the physics course; it was taught by someone with no training in the area who acknowledged his own inability to teach the course well.

It wasn't until she took an introductory college geology course that science clicked for her. At first she was drawn to the words: esker, alluvium, pahoehoe, ... But soon she felt drawn to what the words name. Those concepts were interesting in their own right, and told their own story of the earth. She was hooked.

We scientists can often relate to this story. It may apply to us; some of us were drawn to scientific ideas young. But we certainly see it in our friends and family members and people we meet. They are interested in nature, in how the world works, but they "don't like science". Why? Where do our schools go wrong? Where do we as scientists go wrong? The schools are a big issue, but I will claim that we as scientists contribute to the problem by not doing a good job at all of communicating to the public why we are in science. We don't share the thrill of doing science.

A few years ago, Bjornerud decided to devote some of her professional energy to systematic public outreach, from teaching Elderhostel classes to working with grade schoolers, from writing short essays for consumption by the lay public to her book, which tells the story of the earth through its geological record.

To write for the public, scientists usually have to choose a plot device to make technical ideas accessible to non-scientists. (We agile software developers might think of this as the much-maligned metaphor from XP.)

Bjornerud used two themes to organize her book. The central theme is "rocks as text", reading rocks like manuscripts to reveal the hidden history of the earth. More specifically, she treats a rock as a palimpsest, a parchment on which a text was written and then scraped off, to be written on again. What a wonderful literary metaphor! It can captivate readers in a day when the intrigue of forensic science permeates popular culture.

Her second theme, polarities, aims more at the micro-structure of her presentation. She had an ulterior motive: to challenge the modern tendency to see dichotomy everywhere. The world is really a tangled mix of competing concepts in tension. Among the polarities Bjornerud explores are innovation versus conservation (sound familiar?) and strength versus weakness.

Related to this motive is a desire -- a need -- to instill in the media and the public at large an appetite for subtlety. People need to know that they can and sometimes must hold two competing ideas in their minds simultaneously. Science is a halting journey toward always-tentative conclusions.

These themes transfer well to the world of software. The tension between competing forces is a central theme driving the literature of software patterns. Focusing on a dichotomy usually leads to a sub-optimal program; a pattern that resolves the dichotomy can improve it. And the notion of "program as text" is a recurring idea. I've written occasionally about the value in having students read programs as they learn to write them, and I'm certainly not the first person to suggest this. For example, Owen Astrachan once wrote quite a bit on apprenticeship learning through reading master code (see, for example, this SIGCSE paper). Recently, Grady Booch blogged On Writing, in which he suggested "a technical course in selected readings of software source code".

Bjornerud talked a bit about the process of writing, revising, finding a publisher, and marketing a book. Only one idea stood out for me here... Her publisher proposed a book cover that used a photo of the Grand Canyon. But Bjornerud didn't want Grand Canyon on her cover; the Grand Canyon is a visual cliche, particularly in the world of rocks. And a visual cliche detracts from the wonder of doing geology; readers tune out when they see yet another picture of the Canyon. We are all taught to avoid linguistic cliches like the plague, but how many of us think about cliches in our other media? This seemed like an important insight.

Which programs are the cliches of software education? "Hello, World", certainly, but it is so cliche that it has crossed over into the realm of essential kitsch. Even folks pitching über-modern Ruby show us puts "Hello, World." Bank account. Sigh, but it's so convenient; I used it today in a lecture on closures in Scheme. In the intro Java world, Ball World is the new cliche. These trite examples provide a comfortable way to share a new idea, but they also risk losing readers whose minds switch off when they see yet another boring example they've seen before.
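For readers who have never actually met the cliché, the bank-account example is usually a small closure over a balance. Here is a minimal sketch of the idea in Python (the lecture itself used Scheme; this is only an illustration of the concept, not the actual classroom code):

```python
# The bank-account cliche as a closure: make_account captures
# `balance` in its enclosing scope, giving each account private state.
def make_account(balance):
    def withdraw(amount):
        nonlocal balance
        if amount > balance:
            return "Insufficient funds"
        balance -= amount
        return balance
    return withdraw

acct = make_account(100)
print(acct(30))   # 70
print(acct(80))   # Insufficient funds
```

Trite, perhaps, but it does show what closures buy you: the returned function remembers `balance` long after `make_account` has returned.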

In the question-and-answer session that followed the talk, Bjornerud offered some partial explanations for where we go wrong teaching science in school. Many of us start with the premise that science is inherently interesting, so what's the problem?

  • Many science teachers don't like or even know science. They have never really done science and felt its beauty in their bones.

    This is one reason that, all other things being equal, an active scholar in a discipline will make a better teacher than someone else. It's also one of the reasons I favor schools of education that require majors in the content area to be taught (Michigan State) or that at least teach the science education program out of the content discipline's college (math and science education at UNI).

  • We tend to explain the magic away in a pedantic way. We should let students discover ideas! If we tell students "this is all there is to it", we hide the beauty we ourselves see.

  • Bjornerud stressed the need for us to help students make a personal connection between science and their lives. She even admitted that we might help our students make a spiritual connection to science.

  • Finally, she suggested that we consider the "aesthetic" of our classrooms. A science room should be a good place to be, a fun place to engage ideas. I think we can take this one step further, to the aesthetic of our instructional materials -- our code, our lecture notes, our handouts and slides.

The thought I had as I left the lecture is that too often we don't teach science; we teach about science. At that point, science becomes a list of facts and names, not the ideas that underlie them. (We can probably say the same thing about history and literature in school, too.)

Finally, we talked a bit about learning. Can children learn about science? Certainly! Children learn by repetition, by seeing ideas over and over again at increasing degrees of subtlety as their cognitive maturity and knowledge level grow. Alan Kay has often said the same thing about children and language. He uses this idea as a motivation for a programming language like Smalltalk, which enables the learner to work in the same language as masters and grow in understanding while unfolding more of the language as she goes. His group's work on eToys seeks to extend the analogy to even younger children.

Most college students and professionals learn in this way, too. See the Spiral pedagogical pattern for an example of this idea. Bjornerud tentatively offered that any topic -- even string theory! -- can be learned at almost any level. There may be some limits to what we can teach young children, and even college students, based on their level of cognitive development, their ability to handle abstractions. But for most topics most of the time -- and certainly for the basic ideas of science and math -- we can introduce even children to the topic in a way they can appreciate. We just have to find the right way to pitch the idea.

This reminds me, too, of Owen Astrachan and his work on apprenticeship mentioned above. Owen has since backed off a bit from his claim that students should read master code, but not from the idea of reading code itself. When he tried his apprenticeship through reading master code, he found that students generally didn't "get it". The problem was that they didn't yet have the tools to appreciate the code's structures, its conventions and its exceptions, its patterns. They need to read code that is closer to their own level of programming. Students need to grow into an appreciation of master code.

Talks like this end up touching on many disparate issues. But a common thread runs through Bjornerud's message. Science is exciting, and we scientists have a responsibility to share this with the world. We must do so in how we teach our students, and in how we teach the teachers of our children. We must do so by writing for the public, engaging current issues and helping the citizenry to understand how science and technology are changing the world in which we live, and by helping others who write for the public to appreciate the subtleties of science and to share the same through their writing.

I concur. But it's a tall order for a busy scientist and academic. We have to choose to make time to meet this responsibility, or we won't. For me, one of my primary distractions is my own curiosity -- the very thing that makes us scientists in the first place drives us to push farther and deeper, to devote our energies to the science and not to the popularizing of it. Perhaps we are doomed to G. H. Hardy's conclusion in his wonderful yet sad A Mathematician's Apology: Only after a great mind has outlived its ability to contribute to the state of our collective knowledge can -- should? will? -- it turn to explaining. (If you haven't read this book, do so soon! It's a quick read, small and compact, and it really is both wonderful and sad.)

But I do not think we are so doomed. Good scientists can do both. It's a matter of priorities and choice.

And, as in all things, writing matters. Writing well can succeed where other writing fails.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns, Software Development, Teaching and Learning

November 09, 2005 6:54 PM

More Visibility from the Blog

Back in March, I was contacted by my local paper for an article on local bloggers. That was, I think, the first time that someone outside my expected audience had contacted me about my blog.

Last week, I was contacted by a gentleman named Alex Gofman, who is CTO for Moskowitz Jacobs Inc. and is writing a book on the marketing techniques of his company's founder, Howard Moskowitz. If you have been reading this blog for long, you may remember a nearly year-old article I wrote entitled What Does the iPod have in Common with Prego Spaghetti Sauce?, in which I discussed some ideas on design, style, and creativity. My thoughts there were launched by articles I'd read from Paul Graham and Malcolm Gladwell. The Gladwell piece had quoted Moskowitz, and I quoted Gladwell quoting Moskowitz.

Mr. Gofman apparently had googled on Moskowitz's name and come across my blog as a result. He was intrigued by the connections I made between the technique used to revive Prego and the design ideas of Steve Jobs, Paul Graham, agile software methods, and Art and Fear. He contacted me by e-mail to see if I was willing to chat with him at greater depth on these ideas, and we had a nice 45-minute conversation this morning.

It was an interesting experience talking about an essay I wrote a year ago. First of all, I had to go back and read the piece myself. The ideas I wrote about then have been internalized, but I couldn't remember anything particular I'd said then. Then, during the interview, Mr. Gofman asked me about an earlier blog entry I'd written on the rooster story from Art and Fear, and I had to scroll down to remember that piece!

Our conversation explored the edges of my thoughts, where one can find seeming inconsistencies. For example, the artist in the rooster story did many iterations but showed only his final product. That differs from what Graham and XP suggest; is it an improvement or a step backward? Can a great designer like Jobs create a new and masterful design out of whole cloth, or does he need to go through a phase of generating prototypes to develop the idea?

In the years since the Prego experience reported by Gladwell, Moskowitz has apparently gone away from using trained testers and toward many iterations with real folks. He still believes strongly in generating many ideas -- 50, not 5 -- as a means to explore the search space of possible products. Mr. Gofman referred to their technique as "adaptive experimentation". In spirit, it still sounds a lot like what XP and other agile methods encourage.

I am reluctant to say that something can't happen. I can imagine a visionary in the mold of Jobs whose sense of style, taste, and the market enable him to see new ideas for products that help people to feel desires they didn't know they had. (And not in the superficial impulse sense that folks associate with modern marketing.) But I wouldn't want to stake my future or my company on me or most anyone I know being able to do that.

The advantage of the agile methods, of the techniques promoted in Art and Fear, is that they give mere mortals such as me a chance to create good products. Small steps, continuous feedback from the user, and constant refactoring make it possible for me to try working software out and learn from my customers what they really want. I may not be able to conceive the iPod, but I can try 45 kinds of sauce to see which one strikes the subconscious fancy of a spaghetti eater.

This approach to creating has at least two other benefits. First, it allows me to get better at what I do. Through practice, I hone my skills and learn my tools. Through sheer dint of repetition and coming into contact with many, many creations, I develop a sense of what is good, good enough, and bad. Second, just by volume I increase my chances of creating a masterpiece every now and then. No one may have seen all of my scratch work, but you can be sure that I will show off my occasional masterpiece. (I'm still waiting to create one...)

We should keep in mind that even visionary designers like Jobs fail, too -- whether by creating a product ahead of its time, market- or technology-wise, or by simply being wrong. The key to a guy like Jobs is that he keeps coming back, having learned from his experience and trying again.

I see this mentality as essential to my work as a programmer, as a teacher, and now as an administrator. My best bet is to try many things, trust my "customer" (whether user, student, or faculty colleague) enough to let them see my work, and try to get better as I go on.

In part as a result of our conversation this morning, Mr. Gofman -- who is a software developer trained as a computer engineer -- decided to propose adding a chapter to his book dealing with software development as a domain for adaptive experimentation. I learned that he is an XP aficionado who understands it well enough to know that it has limits. This chapter could be an interesting short work on agile methods from a different angle. I look forward to seeing what may result.

As Mr. Gofman and I chatted this morning, I kept thinking about how fear and creativity had come up a few times at OOPSLA this year, for example, here and here. But I didn't have a good enough reason to tell him, "You should read every article on my blog." :-) In any case, I wish him luck. If you happen to read the book, be on the look out for a quote from yours truly.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

November 08, 2005 3:00 PM

An Index to the OOPSLA Diaries

I have now published the last of my entries intended to describe the goings-on at OOPSLA 2005. As you can see from both the number and the length of entries I wrote, the conference provided a lot of worthwhile events and stimulated a fair amount of thinking. Given the number of entries I wrote, and the fact that I wrote about single days over several different entries and perhaps several weeks, I thought that some readers might appreciate a better-organized index into my notes. Here it is.

Of course, many other folks have blogged on the OOPSLA'05 experience, and my own notes are necessarily limited by my ability to be in only one place at a time and my own limited insight. I suggest that you read far and wide to get a more complete picture. First stop is the OOPSLA 2005 wiki. Follow the link to "Blogs following OOPSLA" and the conference broadsheet, the Post-Obvious Double Dispatch. In particular, be sure to check out Brian Foote's excellent color commentary, especially his insightful take on the software devolution in evidence at this year's conference.

Now, for the index:

Day 1

Day 2

Day 3

Day 4

Day 5

This and That

I hope that this helps folks navigate my various meanderings on what was a very satisfying OOPSLA.

Finally, thanks to all of you who have sent me notes to comment on these postings. I appreciate the details you provide and the questions you ask...

Now, get ready for OOPSLA 2006.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Patterns, Software Development, Teaching and Learning

November 04, 2005 5:16 PM

Simplicity and Humility in Start-Ups

Two of Paul Graham's latest essays, Ideas for Startups and What I Did This Summer, echo ideas about simplicity, exploration, and conventional wisdom that Ward Cunningham talked about in his Educators' Symposium keynote address. Of course, Graham speaks in the context of launching a start-up company, but I don't think he sees all that much difference between that and other forms of exploration and creation.

Ideas for Startups focuses first on a problem that many people face when trying to come up with a Big Idea: they try to come up with a Big Idea. Instead, Graham suggests asking a question...

Treating a startup idea as a question changes what you're looking for. If an idea is a blueprint, it has to be right. But if it's a question, it can be wrong, so long as it's wrong in a way that leads to more ideas.

Humility. Simplicity. Learn something along the way. This is just the message that Ward shared.

Later, he speaks of how to "do" simplicity...

Simplicity takes effort -- genius, even. ... It seems that, for the average engineer, more options just means more rope to hang yourself.

In this regard, Ward offers more hope to the rest of us. Generating a truly great something may require genius; maybe not. But in any case, ordinary people can act in a way that biases their own choices toward simplicity and humility, and in doing so learn a lot and create good programs (or whatever). That's what things like CRC cards, patterns, and XP are all about. I know a lot of people think that XP requires Superhero programmers to succeed, but I know lots of ordinary folks who benefit from the agile mindset. Take small steps, make simple choices where possible, and get better as you go.

I am not sure just how pessimistic Graham really is here about achieving simplicity, but his writing often leaves us thinking he is. I prefer to read his stuff as "Yeah, I can do that. How can I do that?" and take good lessons from what he writes.

Then, in "What I Did This Summer", Graham relates his first experience with the Summer Founders Program, bankrolling a bunch of bright, high-energy, ambitious young developers. Some of the lessons his proteges learned this summer are examples of what Ward told us. For example, on learning by doing:

Another group was worried when they realized they had to rewrite their software from scratch. I told them it would be a bad sign if they didn't. The main function of your initial version is to be rewritten.

This is an old saw, one I'm surprised that the SFP needed to learn. Then again, we often know something intellectually but, until we experience it, it's not our own yet.

But I really liked the paragraphs that came next:

That's why we advise groups to ignore issues like scalability, internationalization, and heavy-duty security at first. I can imagine an advocate of "best practices" saying these ought to be considered from the start. And he'd be right, except that they interfere with the primary function of software in a startup: to be a vehicle for experimenting with its own design. Having to retrofit internationalization or scalability is a pain, certainly. The only bigger pain is not needing to, because your initial version was too big and rigid to evolve into something users wanted.

I suspect this is another reason startups beat big companies. Startups can be irresponsible and release version 1s that are light enough to evolve. In big companies, all the pressure is in the direction of over-engineering.

Ward spoke with great feeling about being willing to settle for the incomplete, about ignoring some things you probably shouldn't ignore, about disobeying conventional wisdom -- all in the service of keeping things simple and being able to see patterns and ideas that are obscured by the rules. We are conditioned to think of these behaviors as irresponsible, but they may in fact point us in the most likely direction of success.

----

I also found one really neat idea to think about from reading these essays that is independent of Ward Cunningham. Graham was talking about doodling and what its intellectual equivalent might be, because doodling is such a productive way for visual artists to let their minds wander. Technical innovators need to let their minds wander so that they can stumble upon newly synthesized ideas, where a common frame of reference is applied to some inappropriate data. This can be the source of analogies that help us to see newly.

Out of this discussion, his programmer's mind created a programming analogy:

That's what a metaphor is: a function applied to an argument of the wrong type.

What a neat idea, both as a general characterization of metaphor and also as a potential source of ideas for programming languages. What might a program gain from the ability to reason about a function applied to invalid arguments? On its face, that's an almost meaningless question, but that's part of Graham's -- and Ward's -- point. What may come from such a thought? I want to think more.
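Graham's quip can even be made literal in a dynamically typed language. As a toy illustration (my example, not Graham's): a function written with numbers in mind, applied to a string, yields a result that reads like a metaphor:

```python
# double was written with arithmetic in mind, but Python's * happily
# accepts a string -- the "wrong type" -- and the operation takes on
# an analogous meaning: repetition instead of numeric doubling.
def double(x):
    return x * 2

print(double(21))    # the literal reading: 42
print(double("ha"))  # the metaphorical reading: 'haha'
```

The "wrong" argument doesn't produce nonsense; it produces a parallel meaning in a different domain, which is exactly what a metaphor does.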

I don't know that I have any more time than Graham to spend on this particular daydream...


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Software Development

November 02, 2005 12:28 PM

OOPSLA This and That 4: Inside Stories

I'm finally breaking away from the work that stacked up while I was at OOPSLA and so took the luxury of catching up on some blog reading. Some of that reading turned to OOPSLA itself!

Thanks to Brian Foote for his compliment on my OOPSLA posts. I'd never thought to characterize my conference reporting style as "play by play", but Brian, as usual, turns the apt phrase. I am pleased to play this role. If only I can manage to be the Bob Costas of conference reporting, I'll be proud. Of course, I'll be on the look-out for solid color commentators to complement my work.

Though I'm not in his league, Martin Fowler can do my color commentary any time. Martin wrote a nice summary of OOPSLA, with much pithier summaries of several key ideas than my fuller-featured posts managed. One of the great things about blogs as a medium these days is that we all get a chance to see conferences from many different points of view, including the points of view of Big Names in the field. If you missed the conference, these reports are irreplaceable, but they also offer something special to those of us who attended -- they enrich our own experiences.

In Martin's summary, I found the pointer to Brian Foote's post mentioned above, which also contains the full text of his introduction for Martin's invited talk. Apparently, Brian himself was not allowed to deliver the address, so Ralph Johnson read it. Ralph did a nice job trimming the text down, and he delivered his lines credibly. But now you can read the unabridged version for yourself!

And now from the "Watching the Sausage Being Made" department... I can tell you that such introductions are often post-modern works of art, assembled from ideas bantered across a table crammed with pastries, coffee cups, laptop computers, power cords, and power strips. Ubiquitous internet access and Google have opened this process to the full world of pastiche. I especially enjoyed playing an ever-so-small part in James Noble's creation of his introduction to Mary Beth Rosson's talk. Here's the story.

Mary Beth chaired OOPSLA the year it was in Minneapolis. Now, Minneapolis was home to iconic American television character Mary Richards, a spunky, optimistic go-getter. Apparently Dick Gabriel riffed on the similarity between the Mary Tyler Moore character and Mary Beth's own cheerful disposition. He coached Mary Beth on how to make MTM's signature cap toss from the TV show's opening theme, figuring she could do it as she took the stage for the grand opening of the conference. I wasn't there, but from what I can gather Mary Beth never quite mastered the cap toss in the time available. It occurs to me now that I don't know if she tried the toss after all or not!

Back to OOPSLA 2005... As James contemplated out loud his introduction for Mary Beth's talk later that morning, someone suggested an inside joke to play off the Minneapolis experience. Ultimately, this led to James wanting to incorporate the lyrics of the theme song into his intro. During the conversation, I was at my iBook, eavesdropping while checking e-mail. I quickly found the lyrics on-line and passed them on to James, who made them his own in a fine theatric reading.

I was happy just to help. And I hope Mary Beth can forgive me. :-)


Posted by Eugene Wallingford | Permalink | Categories: General

November 01, 2005 4:19 PM

Sprinting Through To-Dos

I obviously enjoyed my time at OOPSLA last week and am already looking forward to OOPSLA 2006 in Portland. But conference travel -- especially a trip that lasts most of a week -- is tiring, and upon returning to the office I usually have a lot of work backed up. The trade-off is two days of hyper-activity in which so much gets done.

The day before I leave for a trip, I usually have a lot to do. Traveling to OOPSLA this year was the best-case scenario. I did my last classes and meetings for the week on Thursday, and I flew to San Diego on Saturday. That left me a sweet Friday. All I had left for the week was to do all the stuff that needs to be done for class and the department before I disappeared from the office for a week: recommendation letters, department web pages, programming assignments for my course, e-mail messages to faculty and deans and various other university types. The to-do list looks insurmountable, and to the uninitiated it might seem to presage a painful day in the making. But there is no time to spare, no room for procrastination. You Just Do It. This rush to prepare to be gone serves as a useful impetus to get all of these things done, and now. For me, the experience is not painful, but exhilarating. Look at that to-do list shrink! That item's been on the list for weeks. Gone! Aah.

Of course, I was in the office from 7:00 AM until 7:00 PM, but I enjoyed most of it.

The second typical rush day for me is the day I travel home. I have all this time in airports and on planes, plus the energy I absorbed from intelligent, energetic colleagues at the conference. I know that I'll be back in the office in a day or two, but I am ready to do all the things that I've been jotting down as possibilities throughout the conference. I don't know anyone around me, so there are few distractions. No Internet in most places, so my attention is focused on my local work. Again a lot happens in the course of a long, tiring day, but the result satisfies.

The day traveling to the conference is not typically a good day for a dash. First, I'm excited about the conference, not about getting things done. Second, I tend to be tired from the dash the day before. Usually, my time traveling to the conference is spent reading, maybe preparing for the conference itself or maybe just relaxing with a short novel.

Why does so much get done during my sprint days? Why haven't I already done all the stuff that I have to rush through on those days? Sometimes, it is being so busy with day-to-day work that some tasks get pushed aside. Other times, it is garden-variety procrastination. Still others, it is the non-writing equivalent of writer's block, just waiting for some inspiration. But inspiration comes to those already involved in the work. And, as we found during our afternoon writing exercises at the Extravagaria workshop, sprints of this sort can jolt creativity, through time pressure and loss of one's conscious mind in the details of doing the work.

I have always referred to these days as sprints, even before I was captive to running metaphors. Over at 43 Folders, they are called dashes, sometimes even mad dashes. Madness is an apt emotion, as these days often feel like hours of falling downhill. But I am getting work done.

So, I continue to like the conferences I frequent, OOPSLA, PLoP, and ChiliPLoP among them, for the life they give me. They offer an unexpected side effect in sprint days. A few sprints like this every year are good for my system.


Posted by Eugene Wallingford | Permalink | Categories: General

October 26, 2005 5:23 PM

On Being Yourself

From my bedside reading this week:

In an age where there is much talk of "being yourself" I reserve to myself the right to forget about being myself, since in any case there is very little chance of my being anybody else. Rather it seems to me that when one is intent on "being himself" he runs the risk of impersonating a shadow.

-- Thomas Merton, The True Solitude


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

October 25, 2005 8:43 PM

OOPSLA This and That 3: Geek Jargon

Get a bunch of technology folks together for any length of time and they are bound to coin some interesting words, or use ones they've coined previously, either out of habit or to impress their friends. The Extravagaria gang was no exception.

Example 1: When someone asked how many of us were left-handed, Dick Gabriel said that he was partially ambidextrous, to which Guy Steele volunteered that he was ambimoustrous. I like.

Example 2: At lunch, Guy Steele asked us if we ever intentionally got lost in a town, perhaps a town new to us, so that we had to learn the place in order to get back to a place we knew. Several people nodded vigorous agreement, and John Dougan noted that he and his colleagues use a similar technique to learn a new legacy code base. They call this air-drop programming. This is a colorful analogy for a common pattern among software developers. Sometimes the best way to learn a new framework or programming language is to parachute behind enemy lines, surrender connection to any safety nets outside, and fight our way out. Or better, not fight, but methodically conquer the new terrain.

But the biggest source of neologisms at the workshop was our speaking stick. At a previous Extravagaria workshop, Dick used a moose as the communal speaking stick, in honor of Vancouver as the host city. (Of course, there are probably as many moose in San Diego as in Vancouver, but you know, the Great White North and all.) He had planned to bring the moose to this workshop but left it at home accidentally. So he went to a gift shop and bought a San Diego-themed palm tree to use in its place. The veterans of the workshop dubbed it "the moose" out of one year's worth of tradition, and from there we milked the moose terminology with abandon.

Some of my favorites from the day:

  • on the moose -- back in session; on the clock
  • moose cycles -- a measure of the speed of communication around the group, signified in the passing of the moose
  • virtual moose -- speaking without the moose, but with implicit group permission (We didn't always follow our own rules!)
  • "Moose! Moose!" -- "Me next!"

Even computer professionals, even distinguished computing academics, surrender to the silliness of a good game. Perhaps we take joy in binding objects to names and growing systems of names more than most.

I suppose that I should be careful reporting this, because my students will surely hold it over my head at just the right -- or wrong -- moment!


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

October 19, 2005 10:11 AM

OOPSLA Day 1: Writing Exercises at Extravagaria

I am nobody:
A red sinking autumn sun
Took my name away.

-- Richard Wright

As I noted before, I nearly blew off Sunday, after a long and tiring two days before. As you might have gathered from that same entry, I am happy that I did not. The reasons should be obvious enough: cool ideas happen for me only when I am engaged with ideas, and the people and interactions at Extravagaria were a source of inspiration that has remained alive with me throughout the rest of the conference.

In the afternoon of the workshop, we did two group exercises to explore issues in creativity -- one in the realm of writing poetry, and one in the realm of software design.

Gang-Writing Haiku

Haiku is a simple poetic form that most of us learn as schoolchildren. It is generally more involved than we learn in school, with specific expectations on the content of the poems, but at its simplest it is a form of three lines, consisting of 5, 7, and 5 syllables, respectively.

If I understood correctly, a tanka is a poem constructed by following a haiku with a couplet in which each line is 7 syllables. We can go a step further yet, by connecting a sequence of tankas into a renga. John called the renga the "stretch limo" of haiku. Apparently, the Japanese have a traditional drinking game that requires each person to write a verse of a growing renga in turn, taking a drink with each verse. The poems may degenerate, but the evening is probably not a complete loss...
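Since this is a blog about programming, I can't resist rendering the form in code. Here is a tiny sketch of my own (not from the workshop), with syllable counts supplied by hand, since counting syllables in English text is its own hard problem:

```python
def renga_pattern(n):
    """Per-line syllable counts for a renga of n linked stanzas:
    each stanza is a 5-7-5 haiku followed by a 7-7 couplet."""
    return [5, 7, 5, 7, 7] * n

def is_renga(syllables):
    """Does a list of per-line syllable counts fit the renga form?"""
    n, extra = divmod(len(syllables), 5)
    return n > 0 and extra == 0 and syllables == renga_pattern(n)
```

A lone haiku, `[5, 7, 5]`, fails the test; add the couplet and it passes as a renga of one stanza.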

Our first exercise after lunch was a variation of this drinking game, only without the drinking. We replaced the adult beverages with two features intended to encourage and test our creativity. First, we were given one minute or less to write each verse. Second, when we passed the growing poem on to the next writer, we folded it over so that the person could see only the verse we had just written.

Rather than start from scratch, John seeded our efforts with haiku written by the accomplished American novelist Richard Wright. In the last eighteen months of his life, Wright became obsessed with haiku, writing dozens a day. Many of these works were published in a collection after his death. John gave each person a haiku from this collection. One of them, my favorite, appears at the top of this essay.

Then we were off, gang-writing poetry. My group consisted of Brian Foote, Joe Yoder, Danny Dig, Guy Steele, and me. Each of us started with a Wright haiku, wrote a couplet in response to it, folded Wright's stanza under, and passed the extended poem on to continue the cycle. After a few minutes, we had five renga. (And yet we were sober, though the quality of the poetry may not have reflected that. :-)

The second step of the exercise was to select our favorite, the one we thought had the highest quality. My group opted for a two-pass process. Each of us cast a vote for our two favorites, and the group then deliberated over the top two vote-getters. We then had the opportunity to revise our selection before sharing it with the larger group. (We didn't.) Then each of the groups read its best product to the whole group.

My group selected the near-renga we called Syzygy Matters (link to follow) as our best. This was not one of my two favorites, but it was certainly in line with my choices. One poem I voted for received only my vote, but I won't concede that it wasn't one of our best. I call it Seasons Cease.

Afterwards, we discussed the process and the role creativity played.

  • Most of us tried to build on the preceding stanza, rather than undo it.

  • This exercise resembles a common technique in improvisational theater. There, the group goes through rounds of one sentence per person, building on the preceding sentences. Sometimes, the participants cycle through these conjunctions in order: "Yes, and...", "No, and...", "Yes, but...", and "No, but...".

  • Time pressure matters.

  • Personally, I noticed that by moving so fast that I had no chance to clear my mind completely, a theme developed in my mind that carried over from renga to renga. So my stanzas were shaped both by the stanza I was handed and by the stanza I wrote in the previous round.

  • Guy was optimistic about the process but pessimistic about the products. The experience lowered his expectations for the prospects for groups writing software by global emergence from local rules.

  • We all had a reluctance to revise our selected poems. The group censored itself, perhaps out of fear of offending whoever had written the lines. (So much for Common Code Ownership.) Someone suggested that we might try some similar ideas for the revision process. Pass all the poems we generated to another group, which would choose the best of the litter. Then we pass the poem on to a third group, which is charged with revising the poem to make it better. This would eliminate the group censorship effect mentioned above, and it would also eliminate the possibility that our selection process was biased by personal triggers and fondness.

  • Someone joked that we should cut the first stanza, the one written by Wright!, because it didn't fit the style of the rest of the stanzas. Joke aside, this is often a good idea. Often, we need to let go of the triggers that initially caused us to write. That can be true in our code, as well. Sometimes a class that appears early in a program ultimately outlives its utility, its responsibilities distributed across other more vital objects. We shouldn't be afraid of cutting the class, but sometimes we hold an inordinate attachment to the idea of the class.

  • To some, this exercise felt more like a white-board design session than a coding exercise. We placed a high threshold on revisions, as we often do for group brainstorm designs.

  • Someone else compared this to design by committee, and to the strategy of separating the coding team from the QA team.

Later, we discussed how, in technical writing and other non-fiction, our goal is to make the words we use match the truth as much as possible, but sometimes an exaggeration can convey truth even better. Is such an exaggeration "more true" than the reality, by conveying better the feel of a situation than pure facts would have? Dick used the re-entry scene from Apollo 13 as an example.

(Aside: This led to a side discussion of how watching a movie without listening to its soundtrack is usually a very different experience. Indeed, most directors these days use the music as an essential story-telling device. What if life were like that? Dick offered that perhaps we are embarking on a new era in which the personal MP3 player does just that, adding a soundtrack to our lives for our own personal consumption.)

A good story tells the truth better than the truth itself. This is true in mathematical proofs, where the proof tells a story quite different from the actual process by which a new finding is reached. It is true of papers on software system designs, of software patterns. This is yet another way in which software and computer science are like poetry and Mark Twain.

A Team Experiment with Software Design

The second exercise of the afternoon asked four "teams" -- three of size four, and the fourth being Guy Steele alone -- to design a program that could generate interesting Sudoku puzzles. Halfway through our hour, two teams cross-pollinated in a Gabriel-driven episode of crossover.

I don't have quite as much to say about this exercise. It was fun thinking about Sudoku, a puzzle I've started playing a bit in the last few weeks. It was fun watching Sudoku naifs wrap their minds around the possibilities of the game. It was especially fun to watch a truly keen mind describe how he attacked and solved a tough problem. (I saved Guy's handwritten draft of his algorithm. I may try to implement it later. I feel like a rock star groupie...)

The debrief of this exercise focused on whether this process felt creative in the sense that writing haiku did, or was it more like the algorithm design exercise one might solve on a grad school exam, taken from Knuth. Guy pointed out that these are not disjoint propositions.

What feels creative is solving something we don't yet understand -- creativity lies in exploring what we do not understand, yet. For example, writing a Sudoku solver would have involved little or no creativity for most of us, because it would be so similar to backtracking programs we have written before, say, to solve the 8-queens puzzle.
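The family resemblance among such backtracking programs is easy to see in code. As a minimal sketch of my own (not something we wrote at the workshop), here is the classic 8-queens counter; a Sudoku solver follows the same place-check-recurse shape:

```python
def queens(n, cols=()):
    """Count solutions to the n-queens puzzle by backtracking.

    cols[r] is the column of the queen already placed in row r.
    We try each column in the next row, keeping only placements
    that attack no earlier queen, and recurse."""
    if len(cols) == n:
        return 1                      # all n rows filled: one solution
    total = 0
    for c in range(n):
        # safe if no earlier queen shares this column or a diagonal
        if all(c != pc and abs(c - pc) != len(cols) - r
               for r, pc in enumerate(cols)):
            total += queens(n, cols + (c,))
    return total
```

`queens(8)` returns 92, the well-known solution count; swap the column-and-diagonal test for a row/column/box test and the same skeleton solves Sudoku.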

These exercises aren't representative of literary creativity in several significant ways. Most writers work solo, rather than in groups. Creators may work under pressure, but not often in 1-minute compressions. But sprints of this sort can help jolt creativity, and they can expose us to models of work, models we can adopt and adapt.

One thing seems certain: Change begets creativity. Robert Hass spoke of the constant and the variable, and how -- while both are essential to creativity -- it is change and difficulty that are usually the proximate causes of the creative act. That's why cross-pollination of teams (e.g., pair programmers) works, and why we should switch tools and environments every so often, to jog the mind to open itself to creating something new.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

October 18, 2005 5:20 PM

OOPSLA This and That

Some miscellaneous thoughts I have had over the last few days...

  • My student Sergei is posting pictures from OOPSLA at http://www.lordofthewebs.com/oopsla/. I'm in there...

  • I made a revealing typo in my entry on the software development apprenticeship demo from yesterday's Educators' Symposium. I was writing about CS professors and the idea of a studio-based curriculum. Here is the final quote:

    For example, I think that the biggest adjustment most professors need to make in order to move to the sort of studio approach advocated by West and Rostal is from highly-scripted lectures and controlled instructional episodes to extemporaneous lecturing in response to student needs in real-time.

    In my first draft, I said highly-scriptured, not highly-scripted. This, I believe, was a Freudian slip that exposes the religious fervor with which we professors regard our lectures.

  • I keep hearing educated people say the word 'processes' with a long second "e" -- processEs -- rather than the soft-e schwa sound -- processes -- that I regard as correct. The long-e is a phonetic characteristic of plurals of words that end in -is, for example, 'emphasis' and 'emphases'. But 'process' doesn't end in -is... So it's just 'process-us'.

    An interesting convergence: I link to Wikipedia on the concept of schwa above, and Wikipedia creator Jimmy Wales's (mis)pronunciation this morning was the proverbial final straw for me that led to this rant.

    After having this grate in my ears for years, I finally checked the pronunciation in the dictionary. It seems that my so-called mispronunciation is listed as the 3rd pronunciation for the word. I do not know whether this means that this pronunciation is correct as an alternate, or if our dictionaries are yet again acceding to the downward swirl of linguistic evolution. I know, I know -- language is alive, blah, blah, blah. That doesn't mean that we have to give up on perfectly good words, definitions, and pronunciations without a fight! Besides -- three acceptable pronunciations? That seems excessive.

    Maybe I should just get over it. Or maybe not.

  • For the second time in two days, someone has just said that Maxwell's equations constitute the most beautiful set of equations one can fit on a single page of text. Today it was Gerry Sussman. Yesterday, it was Ward Cunningham. Last year, it was Alan Kay. I really need to go study these equations so that I can appreciate their deep beauty as well as these folks. Any suggested reading? (Should I be embarrassed by my need to study this now?)

  • And speaking of Alan Kay last year, please permit me a little rant deja vu: Turn off the cell phones, people!


Posted by Eugene Wallingford | Permalink | Categories: General

    October 14, 2005 6:11 PM

    Rescued by Google

    Okay, so I know some people don't like Google. They are getting big and more ambitious. Some folks even have Orwellian nightmares about Google. (If that link fails, try this one.) But, boy, can Google be helpful.

    Take today, for instance. I was scp'ing some files from my desktop machine to the department server, into my web space. Through one part sloppiness and one part not understanding how scp handles sub-directories, I managed to overwrite my home page with a different index.html.

    What to do now? I don't keep a current back-up of that web space, because the college backs it up regularly. But recovering back-up files is slow, it's Friday morning, I'm leaving for OOPSLA at sunrise tomorrow, and I don't have time for this.

    What to do?

    I google myself. Following the first hit doesn't help, because it goes to the live page. But clicking on the Cached link takes me to Google's cached copy of my index. The only difference between it and the Real Thing is that they have bolded the search terms Eugene and Wallingford. Within seconds, my web site is as good as new.

    Maybe I should be concerned that Google has such an extensive body of data. We as a society need to be vigilant when it comes to privacy in this age of aggregation and big search tools and indexes of God, the universe, and everything. We need to be especially vigilant about civil rights in an age when our governments could conceivably gain access to such data. But the web and Google have changed how we think about data storage and retrieval, search and research. These tools open doors to collective goods we could hardly imagine before. Let's be vigilant, but let's look for paths forward, not paths backward.

    Another use of Google data that I am enjoying of late is gVisit, a web-based tool for tracking visitors to web sites. I use a bare-bones blogging client, NanoBlogger, which doesn't come with fancy primitive features like comments and hit counters. (At least the version I use didn't; there are more recent releases.) But gVisit lets me get a sense of at least where people have been reading my blog. Whip up a little Javascript, and I can see the last N unique cities from which people have read Knowing and Doing, where I choose N. I love seeing that someone from Indonesia or Kazakhstan or Finland has read my blog. I also love seeing names of all the US cities in which readers live. Maybe it's voyeurism, but it reminds me that people really do read.

    No, I haven't tried Google Reader yet. I'm still pretty happy with NetNewsWire Lite, and then there's always the latest version of Safari...


    Posted by Eugene Wallingford | Permalink | Categories: Computing, General

    September 27, 2005 7:29 PM

    Learning by Dint of Experience

    While writing my last article, I encountered one of those strange cognitive moments. I was in the process of writing the trite phrase "through sheer dint of repetition" when I had a sudden urge to use 'dent' in place of 'dint' -- even though I know deep inside that 'dint' is correct.

    What to do? I used what for many folks is now the standard Spell Checker for Tough Cases: Google. Googling dent of repetition found 4 matches; googling dint of repetition found 470. This is certainly not conclusive evidence; maybe everyone else is as clueless as I. But it was enough evidence to help me go with my instinct in the face of a temporary brain cramp.

    Of course, our growing experience with the World Wide Web and other forms of collaborative technologies is that the group is often smarter than the individual. The wisdom of crowds and all that. It's probably no accident that I link "wisdom of crowds" to Amazon.com, either.

    To further confirm my decision to stick with 'dint', I spent some time at Merriam-Webster On-Line, where I learned that 'dint' and 'dent' share a common etymology. It's funny what I can learn when I sit down to write.


    Posted by Eugene Wallingford | Permalink | Categories: Computing, General

    September 23, 2005 7:43 PM

    Proof Becky Hirta isn't Doug Schmidt

    ... is right here.

    I'm not saying that Doug doesn't have a social life; for all I know, he's a wild and crazy guy. But he has the well-earned reputation of answering e-mail at all hours of the day. I've sent him e-mail at midnight, only to have an answer hit my box within minutes. His grad students all tell me they've had the same experience, only at even more nocturnal hours. They are in awe. I'm a mix of envious and relieved.

    I don't know why this was the first thought that popped into my head when I read Becky's post just now. Maybe it has something to do with the fact that I'm sitting in my office at 7:30 PM on a Friday evening. (Shh. Don't tell my students.)


    Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

    August 15, 2005 7:58 PM

    International Exposure in my Hometown

    This evening I had the pleasure of attending a reception here as a part of Senator Charles Grassley's Ambassadors Tour. Every two years, Senator Grassley brings a delegation of ambassadors and embassy representatives for a tour of the state of Iowa. This year's delegation consisted of representatives of over 70 countries. They will spend five days in Iowa, visiting various Iowa businesses, in hopes of creating opportunities for international collaboration -- especially business connections. Senator Grassley is a UNI alumnus and brings his tour to our campus every other time or so.

    I had the opportunity of chatting with representatives from four continents, though I spent most of my time with delegates from the Republic of Congo, Russia, and Taiwan. Having "computer science" on my name tag seemed to attract folks' attention. Some countries seek to build up their computing infrastructure in order to participate more fully in the information economy. Others seek to develop connections to utilize existing computing industries. Still others found computers to be a familiar way to start a conversation, even if they weren't so interested in building up new computing-related connections with UNI.

    My conversations this evening remind me of just how much we all have in common. When Americans think of the Congo, most probably don't think about colleges and businesses trying to do the same things they do here at home. The news we hear tends to be of extraordinary events, especially natural and man-made disasters. I had a chance to give my condolences to the senior diplomat from Cyprus for the recent plane crash in Greece that killed over one hundred of his compatriots. Computing notwithstanding, we all live in very much the same world.

    BTW, Senator Grassley is a good guy, and he has been good to his alma mater while serving in the legislature. Obligatory running trivia: Senator Grassley is an active runner, even in his 70s. He remains the only U.S. legislator whom I know I've defeated in a race, a 5K in a nearby rural town several years ago. I usually, um, neglect to tell anyone that he was just about to turn 70 at the time! (If I've defeated anyone else with a national profile, it was surely in the 2003 Chicago Marathon. That was a big crowd of runners! Then again, I was behind many of them.)


    Posted by Eugene Wallingford | Permalink | Categories: General

    August 12, 2005 12:06 PM

    Early Lessons on the Job

    After a couple of weeks in the Big Office Downstairs, I've been reminded how important some elements of human psychology are to group dynamics:

    • People want to be valued and respected for what they do and know.

    • When a person interprets any disagreement as a sign of disrespect, communication becomes difficult, if not impossible. (When two such people come into contact -- watch out.)

    A close friend of mine is police chief of a small town back in my home state. When I was visiting him a couple of weeks ago, he shared some of his experience as an administrator. He jokingly said that his job consists of three roles: manager of budgets and schedules, leader among his staff and community, and daycare provider. We both knew what he meant. Neither of us really thinks of our administrative jobs in such a condescending way, but it is clear that working with people is at least as important as the "paper pushing" elements of the job, and in some situations can dominate.

    That said, the paper-pushing side of things creates challenges, too. I am already finding the constant flow of information -- new things to do, new items for the calendar, new ideas to try out -- to be overpowering. In response, Remind and VoodooPad have become my friends. (As has the Remind widget.)

    Remind is a plain text Unix calendar tool. I considered using iCal for a while, but my preference for plain text data and low-frills interfaces pushed me toward Remind. After only a week, I was quite happy. I can add new entries to my databases with any editor, and I can script other applications to add them for me. The one thing I'd like is a better way to view weekly or monthly calendars. By default, Remind prints out days in fixed-width columns that result in the squashing and breaking of words. Of course, that's the beauty of a plain-text tool: If I want a different interface to my data, it is straightforward to write one. (I feel a student project coming on...)
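    For readers who haven't seen Remind, here is roughly what entries in a reminders file look like. This is a sketch from memory, and the entries themselves are made up, so treat the details as approximate:

```
REM 25 Dec MSG Christmas
REM Mon AT 9:00 MSG Weekly faculty meeting
REM 15 Aug 2005 MSG Budget report due to the dean
```

    Running something like remind -c ~/.reminders produces the fixed-width monthly calendar I complain about above, which is exactly the part a little custom formatting code could replace.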

    VoodooPad is just way cool. I tried OmniOutliner for a few months after first accepting my new position, when I was busily creating list of ideas and to-dos. I liked it all right, but it never felt natural to me. VoodooPad makes it easy to create links to new pages, using either CamelCase or command-L on any text selection, so I get the effect of collapsible sub-lists in wiki form. The program is also integrated with other OS X apps and services, such as Address Book and Mail, so I get free linking to my other on-line data and can launch an e-mail message with a single click. In one tool, I have a note taker, a wiki, a to-do list manager, and a decent text editor. There's a reason that VoodooPad is one of MacZealot's top 10 shareware apps of 2005.

    Using great tools means that I can eventually focus my energy on the Big Picture. I say "eventually" because, right now, mastering some details is the Big Picture.


    Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

    August 04, 2005 9:13 AM

    Keep Moving

    My term as department head began on Monday. I feel a bit overwhelmed with all I'd like to do right now: plan for the long term, plan for the short term, learn more about our budget, create a list of faculty committees to assemble, meet individually with more faculty members about their goals and interests for the year, sift through all the notes I took in various meetings this summer, .... I have a natural desire to do it all at once, to be up to speed and at full capacity right away.

    I also feel a nagging, almost subconscious apprehension that I won't do as well as I have imagined I might. In my application and interview for the position, I made some strong claims about openness, transparency, respect, fairness, and leadership. Now the promises made in the safety of no responsibility meet the reality of responsibility.

    Oft expectation fails, and most oft there
    Where most it promises; and oft it hits
    Where hope is coldest, and despair most fits.

    -- William Shakespeare
    All's Well That Ends Well (II, i, 145-147)

    I think that my best course of action in the face of seemingly overwhelming possibilities and a fear of failure is familiar to many of you: approach the task in an agile fashion. Take small, measurable steps. Communicate with my team, and let them contribute their many ideas and talents. Get feedback wherever possible, and use it to improve both the process and content of my work.

    I've been thinking about how I might adapt ideas from Scrum and XP as explicit practices in the administrative side of my job. My thoughts are ill-formed at this point, so they are ready neither for implementation nor description just yet. But I think a big visible chart is in the offing.

    The key is to keep moving, to be making progress in concrete ways.

    When George Heilmeier of Bellcore received the Founders Award from the National Academy of Engineering, he related the tale of his first discovery in liquid crystal technology. When he told Vladimir Zworykin that he had "stumbled upon" his discovery, Zworykin replied "... to stumble, one must be moving."

    Sadly, the moving that is most occupying my mind and time these days is moving my office. In 13 years here and another 10 before that as a student of CS, I have collected a lot of books. And a lot of papers. And a lot of software -- on 5-¼" floppy, 3-½" floppy, zip disks, and CDs. In the depths of a filing cabinet, I found shrink-wrapped copies of Microsoft Windows 95 and Office and Visual Studio. And cables for 1992 Mac Quadras. On top of most all of this was a layer of dust that betrays my lack of cleaning over the last few years. I hope to have things set up in my new office by the end of the week so that I can get down to the real business of leading my department.


    Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

    July 25, 2005 8:58 AM

    Playing With Our Toys

    Merlin Mann says this about running and running shoes:

    My concern is that there's a big difference between buying new running shoes and actually hitting the road every morning. Big difference. One is really fun and relaxing while the other requires a lot of hard work, diligence, and sacrifice.

    He is so right. Have you bought running shoes lately? Ten or so manufacturers. Ten or so models per manufacturer. Prices into the $100s for an increasing number of models. Going for a run each morning is much easier.

    Oh wait, that's not what he meant. So I disagree with his analogy after all. But I do agree with the real point of his essay, which is better expressed in this analogy:

    a claw hammer

    You can buy a successively more costly and high-quality series of claw hammers until you've reached the top of the line, but until you learn how to use them skillfully, you're going to keep making ugly bird houses.

    We can easily be so caught up in the excitement and fun of tinkering with new tools that we never get any real work done. This separates the contenders from the pretenders in every area of life. It is true of productivity tool contenders and pretenders. It is true of runners and non-runners. It is also true of programmers and the folks who would rather tinker with their bash set-up and their .emacs file than actually cut code.

    We all know people like this. Heck, most of us are people like this, at least some of the time in some parts of our lives. Starting -- even finishing -- is much easier than living in Ordinary Time. But that's where life is.

    Some people run as a means to an end -- to lose weight, to "get in shape", to meet cute girls. For them, running may remain onerous forever, because it's just a chore aimed at reaching an external goal. Some people run for the sake of running. For them, getting up most mornings, pulling on the shoes, and hitting the road are a joy, not a chore. They are a part of the good life.

    The same is true of productivity tool contenders and pretenders. When playing with the latest tool is more fun than getting things done, the mind and heart are in the wrong place. The same is true of programmers and tinkerers. When playing with Linux and .vimrc are more fun than writing programs, being a programmer isn't really for you.

    When we find ourselves drifting into tinkerdom, sometimes all we need is to work on our habits. We really do want to write code, run five miles, get things done. But the distractions of the world have become a part of our routine and need to be put aside. Changing habits is hard...

    New Balance 766 shoes, blue and gold trim

    As for the running versus running shoes analogy, I really do find choosing running shoes more stressful than it could be. Going into a shoe store or browsing through a running shoe catalog is like going to K-Mart and stepping into the shampoo aisle -- instant product overload. New Balance, Asics, Saucony, Nike, Brooks, Mizuno, .... Stability shoes, motion control shoes, racing flats, .... I just want a pair of shoes! I do my best to avoid all the indecision by sticking with the shoe that has served me well in the past, New Balance 766. Even still, that model number seems to change every year or two... (When my local shoe store stopped carrying 766s, I bought a couple of pairs of Asics GT-2099s that worked out pretty well.)

    Now that I have the habit, running is relatively easy, fun, and relaxing. I miss it on my off-day.

    Merlin closes with sound advice for choosing tools and changing habits:

    Making improvements means change and often pain along the way. It's hard to get better, and good tools like these can definitely ease the journey. I guess I'm proposing you try to understand yourself at least as well as the widget you're hoping will turn things around.

    When you know yourself better, you will know your motivation better. That makes choices easier, sometimes obvious.

    I am reminded of a conversation from October 2003, when I ran my first marathon. Several of us here in town were going to run Chicago as a team, which was more symbolic than practical -- we were all at different levels and speeds, so we'd run the race itself solo. One of our group had dropped out of training a month earlier. At a party, three of us were chatting about our training when the guy who had dropped out said that, while he had wanted to do the marathon, he just didn't have time to train -- work and family and outside activities and nagging little injuries had pulled him away.

    After this guy left the conversation, the third guy -- our most experienced runner -- turned to me and said, "If you want to run..." He paused almost imperceptibly. "You run."

    I took great comfort in that. I was still a relative beginner, and that statement stayed in my mind for all those days when I might wonder what I wanted to do. Maybe I should go buy a new pair of running shoes... No, let's just hit the road.


    Posted by Eugene Wallingford | Permalink | Categories: General, Running

    July 23, 2005 4:45 PM

    Dog Days of Summer

    Busy, busy, busy.

    As I mentioned in my anniversary post, the more interesting thoughts I have, the more I tend to blog. The last few weeks have been more about the clutter of meetings and preparing for some new tasks than about interesting thoughts. That's a little sad, but true. I did manage to spend a little more time at home with my wife this week, while my daughters were away at camp. That isn't sad at all.

    OOPSLA 2005 logo

    I have been working a bit on the Educators' Symposium for OOPSLA 2005. My program committee and I are working on a panel session to close the symposium, one that we hope will spark the minds of attendees as they head out into the conference proper. The rough theme draws on what some of us see as a sea change in computing. Without a corresponding change in CS education, we may doom ourselves to a future in which biologists, economists, chemists, political scientists, and most everyone else teach courses that involve computers and modeling and simulation -- and we will teach only theory and perceived esoterica to a small but hardy minority of students. Maybe that is where CS education should go, but if so I'd rather go there because we intend to, not because we all behave as if we were Chance the Gardener from Being There. During our discussion of this panel, members of my program committee directed me to two classics I had not read in a while, Edsger Dijkstra's On the Cruelty of Really Teaching Computer Science and Tony Hoare's 1980 Turing Award lecture, The Emperor's Old Clothes. Reading these gems again will likely get my mind moving.

    An unusual note regarding the Educators' Symposium... For many years now, OOPSLA's primary sponsor -- ACM's Special Interest Group on Programming Languages -- has offered scholarships for educators to attend the conference and the Educators' Symposium. A few years ago, when OOP was especially hot, the symposium offered in the neighborhood of fifty scholarships, and the number of applicants was larger. This year, we have received only nineteen applications for scholarships. Is OOP now so mainstream that educators don't feel they need to learn any more about it or how to teach it? Did I not advertise the availability of scholarships widely enough? As an OOP educator with some experience, I can honestly say that I have a lot yet to learn about OOP and how to teach it effectively. I think we are only scratching the surface of what is possible. I wonder why more educators haven't taken advantage of the opportunity to apply for a great deal to come to OOPSLA. If nothing else, a few days in San Diego is worth the time of applying!

    I have had my opportunity to encounter some interesting CS thoughts the last few weeks, through meetings with grad students. But I've had a couple of weeks off from those as well. Maybe that's just as well... my mind may have been wandering a bit. Perhaps that would explain why one of my M.S. students sent me this comic:

    Ph.D. Comics, 05/28/05 -- Meeting of the Minds


    Posted by Eugene Wallingford | Permalink | Categories: Computing, General

    July 18, 2005 11:32 AM

    Lessons from 13 Books

    I've recently run across recommendations for Leonard Koren's Wabi-Sabi: for Artists, Designers, Poets & Philosophers in several different places, so I thought I'd give it a read. My local libraries don't have it, so I'm still waiting. While looking, though, I saw another book by Koren, called 13 Books : (Notes on the Design, Construction & Marketing of my Last...). The title was intriguing, so I looked for it in the stacks. The back cover kept my attention, so I decided to read it this weekend. It contained the opening sentences of the book:

    Authorship unavoidably implies a certain degree of expertise about the subject you are writing on. This has always troubled me because, although I have written numerous books on various subjects, I've never really considered myself an expert about anything. Recently, however, I had an encouraging realization. Surely I must know more about the making of the 13 books

    ... that he has written than anyone else! So he wrote this book, which consists of a discussion of each of his previous works, including his source of inspiration for the work, the greatest difficulty he faced in producing it, and one enduring lesson he learned from the experience.

    (This book ties back nicely to two previous entries here. First, it adds to my league-leading total for being the first reader of a book in my university library. Second, it was a gift to the library by Roy Behrens, a design professor here whose Ballast Quarterly Review I mentioned a few months ago.)

    13 Books is clearly the product of a graphic designer who likes to explore the interplay between text and graphic elements and who likes to make atypical books. It's laid out in a style that may distract some readers. But, within the self-referential narrative, I found some of Koren's insights to be valuable beyond his experience, in terms of software, creativity, and writing.

    On software

    Projects ultimately develop their own identity, at which point the creator has a limited role in determining their shape. Koren learned this when he felt compelled to include a person in one of his books, despite the fact that he didn't like the guy, because it was essential to the integrity of the project. I feel something similar when writing programs in a test-code-refactor rhythm. Whether I like a particular class or construct, sometimes I'm compelled to create or retain it. The code is telling me something about its own identity.

    Just from its brief appearance in this book, I can see how the idea of wabi-sabi found an eager audience with software developers and especially XPers. Koren defines wabi-sabi as "a beauty of things imperfect, impermanent, and incomplete... a beauty of things modest and humble..." In the face of changing requirements and user preferences, we must recognize that our software is ever-changing. If our sense of beauty is bound up in its final state, then we are destined to design software in a way that aims at a perfect end -- only to see the code break down when the world around it changes. We need a more dynamic sense of beauty, one that recognizes beauty in the work-in-progress, in the system that needs a few more features to be truly useful, in the program whose refactoring is part of its essence.

    Later in the book, Koren laments that making paper books is "retrograde" to his tech friends. He then says, "And the concept of wabi-sabi, the stated antithesis of digital this and digital that, was, by extrapolation, of negligible cultural relevance." I see no reason that wabi-sabi stands in opposition to digital creations. I sense it in my programs.

    Finally, here is my favorite quote from the book that is unwittingly about software:

    The problem with bad craftsmanship is that it needlessly distracts from the purity of your communication; it draws away energy and attention; it raises questions in the reader's mind that shouldn't be there.

    Koren writes of font, layout, covers, and bindings. But he could just as easily be writing of variable names, the indentation of code, comments, ...

    On creativity and learning

    At least one of the thirteen books was inspired by Koren's rummaging through his old files, aimlessly looking at photos. We've seen this advice before, even more compellingly, in Twyla Tharp's "start with a box". (That reminds me: I've been meaning to write up a more complete essay on that book...)

    Taking on projects for reasons of perceived marketability or laziness may sometimes make sense, but not if your goal is to learn:

    The ideas for both books came too quickly and easily, and there was no subsequent concept of development. In my future books I would need to challenge myself more.

    In building software systems, in learning new languages, in adopting new techniques -- the challenge is where you grow.

    In retrospect, Koren categorized his sources of inspiration for his books. The split is instructive: 40% were the next obvious step in a process, 30% came from hard work, and 30% were the result of "epiphanies from out of the blue". This means that fully two-thirds of his books resulted from the work of being a creator, not from a lightning bolt. Relying on flashes of inspiration is a recipe for slow progress -- probably no progress at all, because I believe that those flashes ultimately flow from the mind made ready by work.

    On writing and publishing

    Koren is a graphic designer for whom books are the preferred medium. Throughout his career, he has often been dissatisfied with the power imbalance between creators and publishers. He is constantly on the look-out for a new way to publish. For many, the web has opened new avenues for publishing books, articles, and software with little or no interference from a publisher. The real-time connectedness of the web has even made possible new modes of publication such as the blog, with conversations as a medium for creating and sharing ideas in a new way. Blogs are often characterized as being ephemeral and light, but I think that we'll all be referring to Martin Fowler's essays and the Pragmatic Programmers' articles on their blogs for years to come.

    While Koren may remain a bookmaker, and despite his comments against digital technology as creative medium, I think his jangling, cross-linked, quick-hit style would play well on a web site. It might be interesting to see him produce an on-line work that marries the two. Heck, it's been done with PowerPoint.

    As someone who has been reflecting on a year of writing this blog, I certainly recognize the truth in this statement:

    A book need be grand neither in scale nor subject matter to find an audience.

    Waiting until I have something grand to say is a sure way to paralyze myself.

    Finally, Koren offered this as the enduring lesson he learned in producing his book Useful Ideas from Japan:

    Reducing topical information to abbreviated humorous tidbits is a road to popular cultural resonance.

    It seems that Koren has the spirit of a blogger after all.


    Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

    July 15, 2005 4:41 PM

    Think Big!

    A couple of quotes about thinking big -- one serious, and one just perfect for the end of a Friday:

    Think Global, Act Local

    Grady Booch writes about the much-talked-about 125 hard questions that science faces, posed by the American Association for the Advancement of Science on the occasion of its 125th anniversary:

    Science advances daily through the cumulative efforts of millions of individuals who investigate the edges of existing knowledge, but these sort of questions help direct the global trajectory of all that local work.

    At the end of his entry, Grady asks a great question: What are the grand challenges facing us in computer science and software development, the equivalent of Hilbert's problems in mathematics?

    When Is Enough Enough?

    Elizabeth Keogh writes about reality for many readers:

    If you think you're going to finish reading all those books you bought, you need more books.

    I'm a book borrower more than a book buyer, but if I substitute "borrowed" for "bought", this quote fits me. Every time I read a book, I seem to check out three more from the library. I recently finished Glenway Wescott's "The Pilgrim Hawk", and I'm about to finish James Surowiecki's "The Wisdom of Crowds". Next up: Leonard Koren's "13 Books". (I'm a slow reader and can't keep up with all the recommendations I receive from friends and colleagues and blogs and ...!)


    Posted by Eugene Wallingford | Permalink | Categories: Computing, General

    July 09, 2005 8:01 AM

    Reflecting on a Year of Knowing and Doing

    Why do you never find anything written about that
    idiosyncratic thought you advert to, about your
    fascination with something no one else understands?
    Because it is up to you.

    -- Annie Dillard, "The Writing Life"

    Today is the one-year anniversary of Knowing and Doing. When I first started, I was not certain that I had anything to say, at least anything that anyone with a real life would want to take time out from their lives to read. I decided, though, that the writing would be its own reward; that, in taking the time to put words to the thoughts in my head, I would have better, more complete thoughts.

    So I wrote. I wrote down thoughts I had as I read interesting books, as I chaired conferences, as I wrote programs, as I taught classes, and as I ran. Somehow, they seemed worth writing down to me.

    My 100th post came on November 20. My 200th post came in the midst of agile moments on May 31. There is little doubt that my most-read post so far has been this report on Alan Kay's talks at OOPSLA 2004. I've seen it referenced and linked to more often than any other article I wrote; I received more individual e-mail responses to it than any other as well.

    That article is a great example of what blogs can do for both the writer and the world of readers. Alan's talks inspired me, and I wanted to record the source of that inspiration for my own long-term benefit. So I blogged it. Writing it all down and organizing it into a coherent report enriched my own experience. But by blogging it, I helped share a small part of the experience with many folks who were unable to come to OOPSLA and hear the talks for themselves. I only hope that my article helped Alan to inspire some of those readers, who found my article as regular readers of Knowing and Doing, or who stumbled upon it via a Google search. It is important to put Alan's ideas before as many people as possible, so that they will find a home in the minds that can make his vision a reality. Perhaps someday, I will be one of the folks who helps to make Alan's vision a reality, but perhaps I have an even greater chance of affecting the world by writing about what I see, hear, and learn, and helping greater minds than mine run with ideas.

    When I first started writing my blog, I asked a net acquaintance who blogs how to get people to read my stuff. What a lame question that was. It turns out that there isn't much of a recipe for creating readership other than "write stuff people want to read". I just started writing, and a small readership, word of mouth, and Google did the rest. I don't have any idea how many people read my blog regularly, because I've never tried to capture that data, or retrieve it from web server logs. I do know that I receive occasional e-mail from folks who have read an article and taken the extra initiative to drop me a line. I'm surprised by how good that feels, even today.

    The best way to write stuff people want to read is to write stuff that matters to you. If an idea matters to you, then it matters. Getting over the fear that no one will care frees you to get down to work.

    Jorn Barger, considered by some the coiner of the term 'weblog', once wrote, "The more interesting your life becomes, the less you post... and vice versa." Perhaps he was talking about the sort of blog that only reports the daily trivia of life, in which case more blogging reflects more triviality. But my experience as a blogger is just the opposite: the more interesting thoughts I have, the more I blog. I write more when I am reading good books, going to good conferences, discussing provocative ideas with smart friends and colleagues, and doing challenging work. The times I am blocked or uninspired are the times when I am not doing much interesting in my life. In that sense, Knowing and Doing serves as a barometer for the quality of my own intellectual experiences. When the mercury drops too low, I need to shake things up. Sometimes, it's as simple as taking friends' advice to read a couple of old articles ( [1] and [2]) and seeing what a master has to teach me today.

    Upon which I reminded myself that on the whole,
    throughout life as a whole, the appetites which
    do not arise until we have resolved to eat,
    which cannot be comprehended until we have eaten,
    are the noblest....

    -- Glenway Wescott, "The Pilgrim Hawk"

    Last July 9, I said, Welcome to my blog. This July 9, I say "Thank you". Let's see what a second year of Knowing and Doing will be.


    Posted by Eugene Wallingford | Permalink | Categories: General

    July 08, 2005 2:35 PM

    Breaking in a New iBook

    Daring Fireball recently ran a piece on several issues in the Apple world, including the recent streamlining of the iPod line:

    This emphasis on a simplified product lineup has been a hallmark of the Jobs 2.0 Administration. For the most part, given a budget and a use case, it's pretty easy to decide which Mac or which iPod to buy. (The hardest call to make, in my opinion, is between the iBooks and 12" PowerBook.)

    I agree with John's assessment of the last tough choice -- I recently agonized over the iBook versus PowerBook choice, focusing for budget reasons on the crossover point from iBook to PowerBook. In the end, I think it was more pride than anything else keeping me on the fence. I love my old G3 clamshell PowerBook and like the look of the titanium PowerBooks. But given my real needs and budget, the iBook was the right choice. One potential benefit of going with the simpler, lower-cost alternative for now is that it postpones a more substantial purchase until the shift to Intel-based processors is complete. My next PowerBook can be one from The Next Generation.

    My new iBook arrived last week. I've been having great fun playing with OS X 10.4... My old PowerBook is still running Jaguar, so I have been missing out on the many cool things available only on Panther. I'm only just now scratching the surface of Dashboard and Expose and the like, but they feel great. My only minor regret at this point is going with the 30GB drive. I'm already down to 13 gig free, and I haven't done much more than basic set up and my basic set of apps. In reality, that's still plenty of space for me. I don't store lots of video and music on my laptop yet, and my data will fit comfortably in 10 gig. If I do need more space, I can just pick up an external drive.

    While setting up this machine, it really struck me how much of my Mac experience now is bundled up with Unix. In the old days, I set up Macs by dragging StuffIt archives around and creating folders; I spent a few minutes with control panels, but not all that much. Setting up OS X, I spend almost all of my time in a terminal shell, with occasional forays out to System Preferences. This machine switch may be more Unix-heavy than usual, because I've decided to follow OS X from tcsh to bash. Rewriting config files and hacking scripts is fun but time-consuming.
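    Much of that rewriting is mechanical translation from tcsh syntax to bash syntax. Here is a minimal sketch of the kind of thing involved; the specific paths, aliases, and prompt are illustrative examples, not lines from my actual config files:

    ```shell
    # tcsh:  setenv PATH ${PATH}:/usr/local/bin
    # bash equivalent -- environment variables use export:
    export PATH="$PATH:/usr/local/bin"

    # tcsh:  alias ll 'ls -l'
    # bash aliases take an equals sign:
    alias ll='ls -l'

    # tcsh:  set prompt = '%n@%m:%~%# '
    # bash builds the prompt from backslash escapes instead:
    export PS1='\u@\h:\w\$ '
    ```

    The other wrinkle is that the settings live in different files: tcsh reads ~/.tcshrc, while bash reads ~/.bash_profile or ~/.bashrc, so the lines have to move homes as well as change syntax.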

    Of course, this change pales next to the switch I made when I went to grad school. As an undergrad, I became a rather accomplished VMS hacker on an old cluster of DEC Vaxes. When I got to my graduate program, there wasn't a Vax to be seen. Windows machines were everywhere, but the main currency was Unix, so I set out to master it.

    Another thing that struck me this week is how much of my on-line identity is bundled up in my Unix username. "I am wallingf." That has been my username since my first Unix account in the fall of 1986, and I've kept it on all subsequent Unix machines and whenever possible elsewhere. At least I know I'm not the only one who feels this way. Last year as we prepared for the Extravagria workshop at OOPSLA 2004, Dick Gabriel wrote that rpg is the

    Login name for RichardGabriel. I have used this login since 1973 and resent ISPs and organizations that don't allow me to use it.

    Anyway, my iBook now knows me as wallingf. I guess I should give her a name, too.


    Posted by Eugene Wallingford | Permalink | Categories: Computing, General

    June 16, 2005 3:51 PM

    Looking for Tools to Manage Information Overload

    Now that I am taking on more and different sorts of administrative tasks, I'm beginning to feel the load of managing a large number of to-do items of various urgency, complexity, and duration. I know that there are "productivity apps" and "personal information managers" out there aimed at just this sort of problem, but I tend to be a low-overhead, plain-text kind of guy. So I'm exploring some lightweight tools that I can use to document, organize, and use all the information that is rushing over me these days. Right now, I'm looking at some simple wiki-like tools.

    One tool I like a lot after a little experimentation is VoodooPad, a notepad that acts like a wiki. As a text editor, it feels just like TextEdit or NotePad, except that wiki names automatically create new pages and link to them. But it also supports lots of other operations, such as export to HTML (to publish pages to a server) and scripting (to add functionality for common actions). VoodooPad costs $24.95, though I've been exploring both free options: using full VoodooPad with a limit of 15 pages per document, and using VoodooPad Lite with unlimited pages but no scripting and limited export.

    Oh, sorry for you non-Mac folks. VoodooPad is an OS X-only app.

    I'm also looking at TiddlyWiki, a cool little wiki that comes as a single HTML page. It's written in HTML, CSS, and JavaScript and so runs everywhere. To start, you just save the TiddlyWiki home page to your disk. Load a copy into your web browser, and all of the content is editable right in the browser. It uses several formatting conventions common to other wikis for its text, but the result is immediately live content.

    At this point, I am perhaps more enamored of the idea of TiddlyWiki than its usefulness to me. I really want my productivity app to be as simple as a text editor and support plain-text operations as much as possible (if only via export). But what a neat idea this is!

    Finally, and lowest tech of all, I am finding many different ways to grow and use a Hipster PDA to meet my information management needs. However much I love and use my laptop, there are times when pen and paper are my preferred solution. With custom products like the do-it-yourself Hipster PDA planner, I can be up and running in minutes -- with no power adapter required.


    Posted by Eugene Wallingford | Permalink | Categories: General

    June 09, 2005 10:33 AM

    The IRS to the Rescue!

    How's that for an unlikely title?

    In the Bible, Jesus takes some heat for befriending sinners and tax collectors. I'm not sure that the modern view of the tax collector is much different from the common one in Jesus's time. We don't have to deal with individuals overcollecting and skimming the difference, but we do have to deal with the IRS, known in the public mind mostly for a bloated, byzantine tax code and bad phone advice.

    But I have a good story to tell about the IRS.

    It seems that I made a mistake on my federal tax return this year, one that cost me $2000. The IRS noticed, corrected the error, and increased my refund accordingly.

    You see, I am one of those dinosaurs who still does his own taxes by hand -- pencil and paper, with folders of documents. I'm well organized and have some training in accounting, and I still enjoy the annual ritual of filling out the tax forms.

    In over twenty years, I do not think I have made any but the most trivial errors on a tax return. I check and double-check my work to be sure it's right before I submit. There may have been cases where I was not aggressive enough in claiming a deduction, or maybe too aggressive (though that's less likely). But the numbers I submitted were pretty much the right ones.

    This year, though, I forgot to claim my Child Tax Credit, on Line 51 of Form 1040. I correctly recorded my daughters' information on Page 1, but somehow wrote in $0 for credit, when it should have been $2000. That reduced my refund by, you guessed it, $2000. I don't honestly remember how I made this mistake, whether on my work or in transcribing the final answers. But it was there in black and white.

    I am a bit embarrassed to admit this to readers who may now question my reliability on matters of great professional import. But in the interest of fairness, I want to give credit where credit is due. Most of us take the time to complain when the world treats us ill, but we usually forget to take the time to rejoice, at least publicly, when the world treats us well. The result is that an organization like the IRS, which has the thankless job of collecting our hard-earned money to support the workings of the republic, ends up with a thankless reputation, too. But today I can say "thank you" for a job well done.

    I'm glad that the IRS did its job well and rescued me from a costly oversight. Apple is probably happy, too, because the larger refund means that I can now afford a higher-end PowerBook than I was originally planning to buy!


    Posted by Eugene Wallingford | Permalink | Categories: General

    May 31, 2005 1:56 PM

    A Weekend in the Caves

    The family inside Mammoth Cave

    My family's weekend at the caves was a great success, both for family and relaxation. This is a completely personal entry, so feel free to move on if you are looking for professional content...

    South-central Kentucky in the US is one of the most cavernous terrains in the world. My wife, my two daughters, my mom, and four nieces and nephews met there for a long weekend of vacation. In our two days in the area, we visited three different cave systems, all within five miles of one another.

    First up was the magnet that drew us to the area, Mammoth Cave National Park. At over 350 miles, Mammoth Cave is the longest known cave system in the world. On our first day together, we took the relatively easy hour-long "discovery tour" of Mammoth Cave, followed by a few hours of hiking the national park trails. The discovery tour introduces visitors to the history and geology of the cave via a gentle walk.

    Next time, I'll sign up for the longer, more strenuous Frozen Niagara tour. If you are a real spelunker, or want to be one, you can take specific tours that explore deeper and less accessible portions of the cave.

    Leaving Hidden River Cave

    On our second day, we visited two other caves in the area. Hidden River Cave has the second-largest cave opening in the world -- only Carlsbad Caverns' is larger. It is also one of only two river cave tours in the US. The river is small but steady, with crystal water. You enter the cave by descending through an old sinkhole located right on Main Street in the town of Horse Cave. I was amazed to find this cave site while running through town that morning -- it is stunning. The picture at the right shows the view as members of my tour were leaving the cave; I couldn't do justice to the mouth of the cave from above.

    The cave site also hosts a museum that is worth an hour or so. On both the cave tour and museum visit, you learn that Hidden River Cave is one of the great environmental reclamation successes of the last half century. This cave was a popular tourist attraction from 1916 through 1943, when it had to close due to pollution. The residents of the region had been disposing of their garbage and sewage by throwing it all into the many sinkholes that pockmark the area. These sinkholes feed the underground river that flows through the Hidden River Cave. By the mid-1980s, the cave was such a polluted mess that the town above nearly died. The clean-up has been remarkable. Through education, folks stopped the dumping, and Mother Nature repaired herself. The river itself is clean now, and the cave is clean and pleasant.

    We ended our caving with a visit to Diamond Caverns, which is the best "formation cave" in the area. A formation cave is distinguished by the quantity and quality of its stalactites and stalagmites, the features most folks think of when they think of caves. Diamond Caverns' formations have spectacular shapes and colors. On this tour, I learned about The Cave Wars waged in the first decades of the 1900s by the owners of the commercial cave tours in the Mammoth Cave region. The owners tried to increase their own profits by damaging the other caves. As the most beautiful cave in the region, Diamond Caverns was a frequent target, and it suffered extensive damage to some of its chambers. Even still, it was worth a visit.

    For the runners among you: I did manage to work in a short long run on Sunday, an 11-mile out-and-back jaunt between our hotel in Cave City and the eastern edge of Horse Cave. The two towns are connected by old U.S. 31, a two-lane highway. The motorists I encountered were not malicious, but they didn't seem to think they should change their behavior to account for a runner in their midst. Fortunately, I ran 5:30-7:00 AM, and the road had sidewalks and grassy shoulders.

    This was the first break I'd taken from work since at least March, and my mind enjoyed it. Now, it's back to work -- with slack built into my schedule for summer.


    Posted by Eugene Wallingford | Permalink | Categories: General, Running

    May 27, 2005 10:21 AM

    Time for a Little Slack

    I feel a strong desire to write a substantive essay on something of great import, but at the same time I feel that I have nothing of great value to say today. I suspect that this is my mind and brain telling me to take a break, after a busy spring semester and all of the busy-ness that has followed. I'm due for some time away from the office. Fortunately, that is what's about to happen.

    In the US, this is Memorial Day weekend. My family and I will be using the long weekend to tour the caves at Mammoth Cave National Park in Kentucky. Mammoth Cave and its neighbors are the longest known system of caves in the world. I lived the first half of my life only a couple hours drive from this park but have never visited. My 12- and 9-year-old daughters will be able to say that they spelunked there, though we'll have to drive 9-10 hours to do it.

    This won't be my first spelunking adventure. While in Arizona the week before ChiliPLoP, I went with a few friends for a short cave tour just outside of Tucson. We were all in town to watch NCAA tournament games. By the time we reached the cave, I had already run 9 miles and played 90 minutes of basketball that morning. At least I should be better rested for Mammoth Cave, which will entail some more strenuous climbing.

    My mind can use a break. It's time for a little slack. (Or should I link to one of the long-running threads on the XP discussion list instead?)


    Posted by Eugene Wallingford | Permalink | Categories: General

    May 25, 2005 1:37 PM

    Waiting

    Vaclav Havel

    Last weekend, while my daughter was doing a final practice for her Suzuki Book I recital, I picked Vaclav Havel's The Art of the Impossible: Politics as Morality in Practice off the piano teacher's bookshelf for a little reading. This is a collection of speeches and short essays that Havel wrote in the first half of the 1990s about his role as dissident, reformer, and president of the Czech Republic. He is, of course, famous as a poet, and his writing and speaking have a poet's flair.

    I ended up spending most of my time with Havel's speech to the Academy of Humanities and Political Sciences in Paris on October 27, 1992. (I just noticed the date -- that's my birthday!) This speech discussed the different forms of waiting.

    Vladimir and Estragon in Waiting for Godot

    The first kind is sometimes characterized as waiting for Godot, after the absurdist play by Samuel Beckett. In this form, people wait for some sort of universal salvation. They have no real hope that life will get better, so they hold tightly to an irrational illusion of hope. Havel says that, for much of the 20th century, residents of the former communist world waited for Godot.

    At the opposite end of the waiting spectrum lies patience. Havel describes patience as waiting out of principle -- doing the right thing because it is right, not out of any expectation of immediate satisfaction. In this sense, patience is "waiting as a state of hope, not as an expression of hopelessness". Havel believes that the dissidents who ultimately toppled the communist regimes behind the Iron Curtain practiced this sort of waiting.

    When the curtain fell and the people of, say, Czechoslovakia took their first unsteady steps into the light of the western world, folks practicing the different forms of waiting encountered distinctive problems. Those who had been waiting for Godot felt adrift in a complex world unlike anything they had known or expected. They had to learn how to hope and to be free again.

    You might think that the patient dissidents would have adjusted better, but they faced an unexpected problem. They had hoped for and imagined a free world around them, but when they became free things didn't change fast enough. Havel relates his own struggles at being too idealistic and now impatient with the rate at which the Czech and Slovak republics assumed the mantle of democratic responsibility. Like many revolutionaries, he was criticized as out of his element in the new world: most effective in the role of dissident but ineffective in the role of democratic leader.

    What struck me most about this essay came next. Havel recognized the problem: He had waited patiently as a dissident because he had no control over how anyone but himself behaved. Now that the huge impediment of an authoritarian regime had been surmounted, he found that he had become impatient for all the details of a democratic system to fall into place. He no longer waited well.

    In short, I thought time belonged to me. ...

    The world, Being, and history have their own time. We can, of course, enter that time in a creative way, but none of us has it entirely in his hands. The world and Being do not heed the commands of the technologist or the technocrat....

    In his own transition from dissident to democratic leader, Havel learned again that he had to wait patiently as the world takes its "own meandering course". He asserts that the "postmodern politician" must learn waiting as patience -- a waiting founded on a deep respect for the world and its sense of time. Read:

    His actions cannot derive from impersonal analysis; they must come out of a personal point of view, which cannot be based on a sense of superiority but must spring from humility.

    When the world changed, even in the way for which he had been working, Havel had to learn again how to be patient.

    I think that the art of waiting is something that has to be learned. We must patiently plant the seeds and water the ground well, and give the plants exactly the amount of time they need to mature.

    Just as we cannot fool a plant, we cannot fool history.

    I think that 'waiting patiently as the world takes its own meandering course' translates into showing respect for people and the rate at which they can assimilate new ideas and change their behavior.

    Perhaps this speech affected me as it did because I am now thinking about leading my department. I certainly do not face a situation quite as extreme as Havel did when the communist regime fell in Czechoslovakia, yet I am in a situation where people do not trust the future as much as I'd like, and I need to find a way to help my department move in that direction. As Havel reminds me, I cannot move the department myself; I can only patiently plant the seeds of trust, water the ground well, and give the plants the time they need to grow.

    The value of this sort of waiting is not limited to the world of administration. Good instructors need to wait patiently, working with students to create the atmosphere in which they can grow and then giving them time and opportunity to do so.

    I also think that this sort of waiting holds great value in the world of software development. Agile methods are often characterized by folks in the traditional software engineering world as impatient in their desire to get to code sooner. But I think the opposite is true -- the agile methods are all about patience: waiting to write a piece of code until you really know what it should do, and waiting to design the whole program until you understand the world well enough to do it right. In this sense, traditional software engineering is the impatient approach, telling us to presumptuously design grand solutions to force the world to follow our senses of direction and time. The worlds in which most programs live are too complex for such hubris.

    I cannot resist closing with one last quote from the rich language of Havel himself:

    If we are certain that the planting was good and that we are watering regularly, we have no reason to be impatient. It is enough to realize that our waiting has meaning.

    Waiting that has meaning because it grows out of hope and not hopelessness, from faith and not despair, from humility toward the time of the world and not out of fear of its sublime tranquility, is accompanied not by boredom but by suspense. This kind of waiting is more than just waiting.

    It is life. Life as the joyous involvement in the miracle of Being.

    That sounds like a poet speaking, but it could be a programmer. And maybe the world would be a better place if all administrators thought this way. :-)


    Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading, Software Development, Teaching and Learning

    May 18, 2005 2:36 PM

    A New Set of Responsibilities

    As mentioned last week, I recently applied for the position of department head in my home department. After interviews scattered over the final three days of last week and a weekend of wa-a-a-iting (cue Tom Petty), the Dean called me yesterday to offer me the job. I start on August 1.

    I will continue to teach one class each term, for a total of three a year. So I expect to keep blogging about teaching and learning and software development on a regular basis. But because I blog mostly to examine what I am doing and learning, I expect that I'll be posting a new thread of messages over the next three years to examine the administrative and leadership side of my job. To this end, I am creating a new category, Managing and Leading, for this thread.

    (Those of you who come here to read about running, fear not! That thread will continue as usual. I've run 74 miles in the last 11 days, so I have plenty of raw material!)

    Some of my first entries in this new category will likely deal with what I've been reading about working with people, leadership, and technical administration. I think I've been subconsciously preparing for this role for a few years. But I still have a lot to learn.

    My official start date is August 1, but I'll be putting significant energy into the job before then. As I said, I have a lot to learn. Busy, busy, busy.


    Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

    May 10, 2005 4:07 PM

    Taking a Chance

    A while back, I mentioned my intention to apply for a job here at the university. The position is head of my department, and last week I did apply. The search is internal, which means that we are considering only current members of the faculty as candidates, and is for a three-year appointment. When I decided to apply, I didn't realize how tense I'd feel at the prospect of interviewing with my colleagues! I think this is the good sort of tension, though, the kind that lets me know that I really do care about the job. It's good to take risks of this sort every once in a while.

    The application process required a fair bit of writing -- most notably a statement of administrative philosophy -- that kept me busy last week. This week, I am preparing a short presentation to give to the faculty, on philosophy and goals for the department. The process has reminded me of two things:

    • It is good to read broadly, especially material that comes highly recommended by people I respect. These books have exposed me to a plethora of ideas I would never have had on my own.

    • When I write, I find out what I really think. As teachers and mentors, we often tell students this, and we write so much in our own discipline that the idea becomes commonplace. But when I stepped out of my academic niche to write about how I would try to lead my department, I found myself in uncharted waters -- and learned a lot about myself in the writing.

    Sometime later this summer, I may write up a more general version of the ideas that went into my application and share them here. In the meantime, wish me luck!


    Posted by Eugene Wallingford | Permalink | Categories: General

    May 01, 2005 6:02 PM

    Planning for OOPSLA 2005

    OOPSLA 2005 logo

    Today and tomorrow, I am in San Diego for the OOPSLA 2005 spring planning meeting. I have the honor of chairing the Educators Symposium again this year and so am on the conference committee that makes OOPSLA happen. As usual, I'm impressed by the energy and talent of the people behind the scenes of OOPSLA. These folks are in the trenches scheduling talks and panels, tutorials and workshops, while simultaneously thinking about the future of software and where the conference can and should go in the future.

    the view from the OOPSLA 2005 conference site

    San Diego will be a great location for OOPSLA. We are at the Town and Country Resort, about 15 miles from downtown. The resort is more than just a hotel; the property includes several buildings of convention space, meeting rooms, and restaurants, not to mention the pools and outdoor gathering spaces. San Diego's temperate weather makes outdoor gatherings a real possibility. On our site tour earlier, a couple of us joked about holding software demonstrations poolside -- or even on the water, the "floating demo". We may as well surrender to the inevitable temptations that accompany meeting space adjacent to an outdoor pool.

    Last night, I had the pleasure of catching up with a current friend, a former student who now calls San Diego home. Kris picked me up, gave me a short driving tour of the area, and then took me off for dinner with another former UNI student, a Waterloo native, and another Iowan. We talked sports, much to the chagrin of the ladies, and current campus goings-on. Dinner was at Quiigs Bar and Grill (5091 Santa Monica Ave) -- I had a wonderful grilled prawns dish.

    This morning, I checked out the running options from the hotel. I did an easy 10+-miler, heading west along Friars Road toward the beach. I never reached the beach but I did find an unexpected bonus: down near a marina off Sea World Drive, I came upon the start location for the Spring Sprint Triathlon and Biathlon. This is a little tri, a ¼-mile swim, a 9-mile bike, and a 3-mile run -- just within my reach. One of these days...

    Dick Gabriel and Ralph Johnson have a lot of neat ideas in the works for OOPSLA this year, including a track for essays, collocated symposia on wiki and Dylan, and the evolving Onward! track, which will debut a film festival. If you want to know what software people will be talking about in earnest three or four years from now, make sure to attend OOPSLA this year!


    Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

    April 22, 2005 3:17 PM

    Friday Fun

    Some fun for a Friday afternoon late in the semester.

    Get Perpendicular

    Hitachi is promoting a new method of data storage called "Perpendicular Recording", in which bits are stacked vertically in order to save real estate on the disk. The idea itself is interesting, but even better is the promotional cartoon the company is using to teach us about it. You can tell that some animators working these days were influenced by Schoolhouse Rock, can't you?

    I Answered the Email, But I Didn't Inhale

    This from Clive Thompson:

    According to a new study commissioned by Hewlett Packard -- and conducted by psychologists at King's College in London -- extensive use of email and instant messaging can drop your IQ by 10 per cent. In comparison, smoking pot regularly dents your IQ by only 4 per cent.

    The first one is always free...

    (The explanation for this anomaly apparently involves the unsuccessful attempt to multitask, which must lead to some sort of thrashing.)

    Need a Cluestick?

    Okay, this is just silly. But I chuckled.

    The users I know are so clueless, that if they were dipped in clue musk and dropped in the middle of a pack of horny clues, on clue prom night during clue happy hour, they still couldn't get a clue.

    ".. dropped in the middle of a pack of horny clues ..." And you thought I was having a bad day.

    (And I would never say this about my students. No--really.)


    Posted by Eugene Wallingford | Permalink | Categories: General

    April 22, 2005 9:35 AM

    Are the Gods Telling Me Something?

    Do you ever feel as if the gods are telling you to change your plans?

    I came to campus today planning to "go public" about my intention to apply for a job opening here.

    First, I walk to my office in a steady rain, and my umbrella deconstructs spontaneously. When I get to my office to make a phone call, I realize that I left my food and drinks for the day back in the car. After I make my call, I head back to the car, taking my laptop with me so that I can head over to the library for some work. Still pouring down rain. On my way to the library, the metal clasp on the shoulder strap of my soft leather laptop bag breaks, and my laptop plummets to the sidewalk pavement.

    At this point, I'm starting to wonder.

    I get to the library. The laptop survived the fall. I survived the rain. I fire up an episode of Bob and Tom to bring me some laughs. I do some work.

    The gods have quieted down, or I am in denial.

    UPDATE: Mixed Messages

    When I went to leave my library office to go to lunch, I found that I had left my keys in the door outside for over three hours. The good news is that no enterprising student had decided to make them his own.

    And, though it may sound like I'm having a bad day, I don't really feel like that. Maybe that is a sign in its own right...


    Posted by Eugene Wallingford | Permalink | Categories: General

    April 16, 2005 5:51 PM

    Leading the League in...

    Scott Hastings, a journeyman NBA player from a decade ago, used to joke that, no matter how good Michael Jordan, Isiah Thomas, and Magic Johnson were, he himself always led the league in millions.

    Huh? Consider Hastings' typical line in the daily box score. He would get into the game at the very end, play a minute or two, shoot 0-0 from the field and 0-0 from the free-throw line, grab 0 rebounds, and have 0 assists. So his stat line always read something like this:

    1 0 0 0 0 0 0

    And there's his million. Michael, Zeke, and Magic could only dream of such a stat line!

    I claim bragging rights locally for another, perhaps less dubious, statistical feat, this one in the university library:

    I almost certainly lead the UNI community in most times checking a book out for the first time.

    I can't tell you how many times I've gone over to pick up a book only to find it in mint condition, not a blemish to be seen, with no stamps on the Date Due slip. This experience always gives me a little buzz, a sense that I am on the frontier. False pride, I know, but mostly a harmless diversion during a busy day.

    Just as Hastings owed his good fortune to the coaches, who put him into the game for garbage minutes each night, I owe my good fortune to colleagues like Rich Pattis and the many bloggers I read, who suggest hot and important books to me. By having well-read and deeply interesting friends and colleagues, I come into contact with books and ideas I'd otherwise only stumble across much later. I'm often surprised to find these books already on the library shelves -- a testimony to the good work done by our bibliographers, both technical and general.

    Now, I just need a catchy name for this feat, so that I can impress unsuspecting friends and colleagues with a passing remark. Any suggestions?

    (In a similar vein, on Friday I checked out a book that had been on the shelves unread since 1971! A first-time reader, but only in the last two generations.)


    Posted by Eugene Wallingford | Permalink | Categories: General

    April 04, 2005 3:58 PM

    Agile Methods in the Dining Room

    I've written about using agile software principles in my running. Now Brian Marick applies the idea of Big Visible Charts as a motivational technique for losing a few pounds.

    The notion that practices from agile software development work outside of software should not surprise us too much. Agile practices emphasize individuals and interactions, doing things rather than talking about things, collaboration and communication, and openness to change. They reflect patterns of organization and interaction that are much bigger than the software world. (This reminds me of an idea that was hot a few years ago: design patterns occur in the real world, too.)

    Oh, and good luck, Brian! I know the feeling. Two years ago, I had broken the 190-pound barrier and, despite recreational jogging, felt soft and out of shape. By exercising more and eating less (okay, exercising a lot more and eating a lot less), I returned to the healthier and happier 160-pound range. I'll keep an eye on your big visible chart.


    Posted by Eugene Wallingford | Permalink | Categories: General, Patterns, Running, Software Development

    March 23, 2005 4:37 PM

    Hill Climbing and Dead Ends

    A couple of things that occurred to me while running through the low mountain desert north of Carefree, Arizona, this morning:

    A dead end isn't always a bad thing.

    Sometimes, running to the dead end has its own benefits, even if only the joy of the running.

    Running up big hills is worth the effort.

    Of course, running up hills makes you stronger. It makes you stronger when running on level ground. It makes little hills seem like not so much. It teaches you that running up other big hills is doable.

    Perhaps as important, though, is this: The view from the top of the hill can be stunning. You see the world differently up there.

    Sometimes, you run to get from Point A to Point B, to accomplish a goal. Sometimes, you should just run.

    These are true about computer programming, too, and many other things.


    Posted by Eugene Wallingford | Permalink | Categories: General, Running

    March 07, 2005 3:11 PM

    Made the News

    As I mentioned last week, a feature writer for the local newspaper recently interviewed me for an article on blogging. The article appears in today's paper, and the on-line version is already live. The article is a bit light, given how much we talked, but it probably reflects the fact that there are a lot more soap opera-like blogs out there than technical ones. And the technical ones will have much less appeal to most of the Courier's readership.

    The good news is that I don't sound like an idiot. If anything, I sound like a long-winded academic. (The sarcastic reader can decide which it's better to be.) The author quotes me accurately enough, though my second quote is missing its ending, I think.

    I wonder if I'll see any hits from new readers in the Cedar Valley as a result... We have a few software developers in the area, so some may even find my blog interesting.


    Posted by Eugene Wallingford | Permalink | Categories: General

    March 02, 2005 10:10 AM

    Making News

    I was just interviewed by a feature writer for the local newspaper for an article on blogging. We talked about a wide range of issues, among them why I read blogs, why I write my blog, when and how I write, some of the technical issues of RSS, how I got started, and how others can get started reading and writing.

    We discussed the fact that some people think blogs are going to change the world of journalism, while others just don't get the whole blogging thing. I had to admit that I don't see much appeal in the sort of stream-of-consciousness confessional blogging that one tends to find at places like LiveJournal -- even if I occasionally post more personal items myself. I don't think that blogs are going to replace conventional journalism, and to the extent that conventional journalism changes in the next few years I think blogging will just be one instance of a larger cultural phenomenon at the root of the change.

    But blogging does lower the barrier for people who think they have something to say to reach out and say it. I likened such blogs to the articles that two UNI professors write for the Sunday issue of the local paper. Their articles provide them with a way to write about issues in their technical domain (popular culture and economics, respectively) for a wider and less technical audience. My blog doesn't reach that broad an audience -- yet :-) -- but I do reach a wider audience than my courses and published articles can reach. And I am able to write articles about topics and their intersection that would be difficult or impossible to publish in an academic journal. For me, the real value in reading and writing blogs lies in the intellectual community it creates. Reflective professionals and otherwise interesting people share what they learn as they learn, and we all grow richer in the process. Sometimes a stray idea that would make great dinner conversation makes for a great blog article -- and folks who would never have a chance to dine together can have a conversation.

    The interview was fun for me, in much the same way as writing this blog can be: it caused me to formulate succinct answers to questions just under the surface of my consciousness. I hope that my answers make sense to the newspaper's readership. I also confess to a small bit of hope that I don't sound like a nutcase in print...

    The reporter was well-prepared for the interview. She had a broad set of questions to ask and did a good job of following my answers onto other interesting questions. Besides, she complimented me for being an articulate interview, so she must have good taste and a keen mind!

    Update: The feature will run in next Monday's issue, March 7. With any luck it will hit the on-line version of the paper, too.


    Posted by Eugene Wallingford | Permalink | Categories: General

    February 13, 2005 5:04 PM

    A Place for My Stuff

    Gus Mueller writes about his stuff folder:

    I've got a folder on my desktop named "stuff". It's a little over 25 gigs, and it currently has 154,262 files in it. I have no idea what exactly is in there. Random one-off projects, pics from the camera, various music files.

    I wonder, am I the only person with this situation? Should I just trash it, or should I at least try and go through it? I don't know. Do I really need anything in there?

    Oh my goodness. 154,262 files!?! But I can assure you that Gus isn't the only one.

    I have two stuff/ folders, one on my desktop machine and one on my laptop. The volume of Gus's stuff puts my folders to shame, though. They total only 750 MB. They are full of stuffed apps I want to try out when I get a few free minutes, articles I'd like to read, .jar files I think I might be able to use someday... Of course, someday never comes, and I just keep dropping new stuff in there.

    At some point, my stuff/ folder reaches a certain size at which the chance I have of remembering that a file is in there reaches effectively 0%. What good does it do me then? Even worse, I feel guilty for not using the stuff, and guilty for not deleting it.

    Every so often, I throw out all or most of my current stuff/ folder in a fit of good sense, but here I am again.

    For now I am buoyed by schadenfreude: at least I'm not *that* bad. Thanks, Gus. :-)


    Posted by Eugene Wallingford | Permalink | Categories: General

    February 09, 2005 11:10 AM

    Some Wednesday Potluck

    I've been so busy that writing for the blog has taken a back seat lately. But I have run across plenty of cool quotes and links recently, and some are begging to be shared.

    Why Be a Scientist?

    Jules Henri Poincare said...

    The scientist does not study nature because it is useful. He studies it because he delights in it, and he delights in it because it is beautiful.

    ... as quoted by Arthur Evans and Charles Bellamy in their book An Inordinate Fondness for Beetles.

    How to Get Better

    When asked what advice he would give young musicians, Pat Metheny said:

    I have one kind of stock response that I use, which I feel is really good. And it's "always be the worst guy in every band you're in." If you're the best guy there, you need to be in a different band. And I think that works for almost everything that's out there as well.

    I remember when I first learned this lesson as a high school chessplayer. Hang out with the best players you can find, and learn from them.

    (I ran across this at Chris Morris's cLabs wiki, which has some interesting stuff on software development. I'll have to read more!)

    All Change is Stressful

    Perryn Fowler reminds us:

    All change is stressful - even if we know it's good change. ...

    Those of us who attempt to act as agents of change, whether within other organisations or within our own, could do well to remember it.

    Perryn writes in the context of introducing agile software methods into organizations, but every educator should keep this in mind, too. We are also agents of change.


    Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

    February 03, 2005 5:55 AM

    An Unexpected Accolade

    Following James Tauber's lead, I went vanity surfing at Technorati and found a pleasant surprise: One of my readers, Giulio Piancastelli, proclaimed Knowing and Doing to be his Weblog of the Year for 2004. I'm honored that Giulio would single this blog out in such a way. It's humbling to know that readers find something valuable here. I may not leave listeners inexplicably grinning like idiots, but maybe I am serving the needs of actual readers. What a nice way to end my day. Thanks, Giulio.


    Posted by Eugene Wallingford | Permalink | Categories: General

    January 13, 2005 5:50 AM

    I Knew It!

    My first musical recommendation here has a computer programming twist. Eternal Flame is the first song I've ever heard that talks about both assembly language programming and Lisp programming. Even if you don't like folk music all that much, listen for a little while... It will confirm what you've always known deep in your heart: God programs in Lisp.

    You can find other songs with technology themes at The Virtual Filksing.

    (Via Jefferson Provost. Jefferson, if you ever record a rock version of Eternal Flame, please let me know!)


    Posted by Eugene Wallingford | Permalink | Categories: Computing, General

    January 12, 2005 11:02 AM

    Humble on My Mind

    I've been busy with the first week of classes and some duties with conferences. But I wanted to share a couple of ideas that have been rolling around my mind lately.

    From Ward Cunningham

    There are a lot of ideas, but really good ideas are hard to snatch because they look so humble when you first see them. You expect the solution to be beautifully complex and a good solution is pretty plain until you see how that plainness plays in a complicated way to make something better than you could get in your head all at once.

    "Humble ideas". Beauty often lies in how simple things interact with one another.

    (I heard this in an interview Ward gave to TheServerSide.NET. You should watch it for yourself. Lots of great stuff there.)

    From Kent Beck

    Learning to listen empathically is the simplest thing I have control over that might change how others respond to me.

    This statement hits me square in a personal weakness. I can be a good listener, but I'm not reliably a good listener. And inconsistency has an especially deleterious effect on how people come to feel about a person.

    (I saw this quote on the XP discussion list.)


    Posted by Eugene Wallingford | Permalink | Categories: General

    December 21, 2004 1:22 PM

    Emerson's Prescience

    Last summer, I read E.L. Doctorow's dandy little essay book, Reporting the Universe. The book is a memoir of Doctorow's life as a writer, from heading off to Kenyon College to his approach to writing novels and academic prose. However, it opens with an essay on Ralph Waldo Emerson that I liked so much I had to jot down lots of great quotes. While cleaning up some files today, I ran across this selection:

    Emerson's idea of the writer goes right to the heart of the American metaphysic. He is saying we don't have the answers yet. It is a pragmatic thing to say. He knows he is at a culminant point in literary history, where the right of authorship has devolved from gods and their prophets and their priests to everyone. ... A true democracy endows itself with a multiplicity of voices and assures the creation of a self-revising consensual reality that inches forward over the generations to a dream of truth.

    Maybe I'm sensitized by recent rumination, but that sounds very much like the state of the world these days, with the explosion of the blogosphere as a universal medium for publishing. The right of authorship has devolved from a select few to a much larger population, and the rich interactions among blogs and authors foster a consensual reality of sorts.

    Brian Marick has written a bit about Emerson and how his pragmatic epistemology seems to be in sync with agile practices. I certainly recommend Doctorow's essay to interested readers. If you like to read what writers have to say about writing, as I do, then I can also recommend the entire book.


    Posted by Eugene Wallingford | Permalink | Categories: General

    December 17, 2004 1:12 PM

    When Blogs Do More Than Steal Time

    I often read about how blogging will change the world in some significant way. For example, some folks claim that blogs will revolutionize journalism, creating an alternative medium that empowers the masses and devalues the money-driven media through which most of the world sees its world. Certainly, the blogosphere offers a remarkable level of distributivity and immediacy of feedback; see the Chronicle of Higher Education's Scholars Who Blog, which chronicles this phenomenon.

    As I mentioned last time, I'm not often a great judge of the effect a new idea like blogging will have on the future. I'm skeptical of claims of revolutionary effect, if only because I respect the Power Law. But occasionally I get a glimpse of how blogging is changing the world in small ways, and I have a sense that something non-trivial is going on.

    I had one such glimpse this morning, when I took to reading a blog written by one of my students. First of all, that seems like a big change in the academic order: a student publishes his thoughts on a regular basis, and his professors can read them. Chuck's blog is a mostly personal take on life, but he is the kind of guy who experiences his academic life deeply, too, so academics show up on occasion. It's good for teachers to be reminded that students can and sometimes do think deeply about what they do in class.

    Second change: apparently he reads my blog, too. Students read plenty of formal material that their instructors write, but blogs open a new door on the instructor's mind. My blog isn't one of those confessional, LiveJournal-style diaries, but I do blog less formally and about less formal thoughts than I ordinarily write in academic material. Besides, a student reading my blog gets to see that I have interests beyond computer science, and even a little whimsy. It's good for students to be reminded occasionally that teachers are people, too.

    Third, and this is what struck me most forcefully while reading this morning, these blogs make possible a new channel of learning for both students and teachers. Chuck blogged at some length about a program that he wrote for a data structures assignment. In the course of thinking through the merits of his implementation relative to another student's, he had an epiphany about how to write more efficient multi-way selection statements -- and "noticed that no one is trying particularly hard to teach me" about writing efficient code.

    This sort of discovery happens rarely enough for students, and when it does happen it's likely to evanesce for lack of opportunity to take root in a conversation. Yet here I am privy to this discovery, six weeks after it happened. It would have been nice to talk about it when it happened, but I wasn't there. But through the blog I was able to respond to some of the points in the entry by e-mail. That I can have this peek into a student's mind (in this case, my own) and maybe carry on a conversation about an idea of importance to both of us -- that is a remarkable consequence of the blogosphere.

    I'm old enough to remember when Usenet newsgroups were the place to be. Indeed, I have a token prize from our Linux users group commemorating my claim to the oldest Google-archived Usenet post among our local crew. New communities, academic and personal, grew up in the Usenet news culture. (I still participate in an e-mail community spun off from rec.sport.basketball, and we gather once a year in person to watch NCAA tournament games.) So the ability of the Internet to support community building long predates the blog. But the culture of blogging -- personal, frequent posts sharing ideas on any topic; comments and trackbacks; the weaving of individual writers into a complex network of publication -- adds something new. And those personal reflections sometimes evolve into something more over the course of an entry, as in Chuck's programming reflection example.

    I do hope that there isn't some sort of Heisenberg thing going on here, though. I'd hate to think that students would be less willing to write honestly if they know their professors might be reading. (Feeling some pressure to write fairly and thoughtfully is okay. The world doesn't need any more whiny ranting blogs.) I know that, when I blog, at the back of my mind is the thought that my students might read what I'm about to say. So far, I haven't exercised any prior restraint on myself, at least any more than any thoughtful writer must exercise. But students are in a different position in the power structure than I am, so they may feel differently.

    Some people may worry about the fact that blogs lower or erase barriers of formality between students and professors, but I think they can help us get back to the sort of education that a university should offer -- a Church of Reason, to quote Robert Pirsig:

    [The real University is] a state of mind which is regenerated throughout the centuries by a body of people who traditionally carry the title of professor, but even that title is not part of the real University. The real University is nothing less than the continuing body of reason itself.

    The university is to be a place where ideas are created, evaluated, and applied, a place of dialogue among students and teachers. If the blogosphere becomes a place where such dialogue can occur with less friction -- and where others outside the walls of the church building itself can also join in the conversation, then the blogosphere may become a very powerful medium in our world after all. Maybe even revolutionary.


    Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

    December 06, 2004 8:14 AM

    Learning via the Blogosphere

    This is why I love the blogosphere so much. Somehow, I stumble across a link to Leonardo, an open-source blogging and wiki engine written in Python. I follow the link and start reading the blog of Leonardo's author, James Tauber. It's a well-written and thoughtful set of articles on an interesting mix of topics, including Python, extreme programming, mathematics, linguistics, New Testament Greek, music theory and composition, record producing and engineering, filmmaking, and general relativity. For example, my reading there has taught me some of the mathematics that underlies recent work on proving the Poincaré Conjecture.

    But the topic that attracted my greatest attention is the confluence of personal information management, digital lifestyle aggregation, wiki, blogging, comments and trackbacks, and information hosting. I've only recently begun to learn more deeply about the issue of aggregation and its role in information sharing. This blog argues for an especially tight intellectual connection among all of these technologies and cultures. For example, Tauber argues that wiki entries are essentially the same as blog trackbacks, and that trackbacks could be used to share information about software projects among bosses and teammates, using RSS feeds, and to integrate requests with one's PIM. But I'm not limited to reading Tauber's ideas, as he links to other blogs and web pages that present alternative viewpoints on this topic.

    Following all these threads will take time, but that I can at all is a tribute to the blogosphere. Certainly, all of this could have been done in the olden days of the web, and indeed many people were publishing their diverse ideas about diverse topics back then. But the advent of RSS feeds and blogging software and wikis has made the conversation much richer, with more power in the hands of both readers and writers. Furthermore, the blogging culture encourages folks to prepare their ideas sooner for public consumption, to link ideas in a way that enables scientific inquiry, to begin a conversation rather than just publish a tract.

    The world of ideas is alive and well.


    Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

    November 30, 2004 5:48 PM

    Milestones and Nostalgia

    November 2004 will enter the books as my least-blogged month since starting Knowing and Doing back in July. I'm not all that surprised, given that:

    • it is the last full month of my semester,
    • it follows a month in which I was on the road every weekend and for a full week at OOPSLA, and
    • it contains a 5-day week for Thanksgiving.
    Each of these would cut into my time for blogging on its own, and together they are a death knell to any free moments.

    When I started Knowing and Doing, I knew that regular blogging would require strict discipline and a little luck. Nothing much has changed since then. I still find the act of writing these articles of great value to me personally, whether anyone reads them or not. That some folks do find them useful has been both gratifying and educational for me. I learn a lot from the e-mails I receive from Faithful Readers.

    I reached the 100-post plateau ten days ago with this puff piece. I hope that most of my posts are of more value than that! In any case, reaching 1000 entries will be a real accomplishment. At my current pace, that will take me three more years...

    While on this nostalgic kick, I offer these links as some obligatory content for you. Both harken back to yesteryear:

    • Brian Foote waxes poetic in his inimitable way about watching undergrads partake in the rite of initiation that is a major operating systems project. He points out that one thing has changed this experience significantly since he and I went through it thirty and (can it be?) twenty years ago, respectively: unit tests.

    • John O'Conner has a nice little piece on generalizing the lesson we all learned -- some of us long ago, it seems -- about magic numbers in our code. Now if only we turned all those constants into methods à la the Default Value Method pattern...
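    The idea behind the Default Value Method pattern mentioned above can be sketched briefly. Rather than scattering a magic number (or even a named constant) through client code, you expose the default through a method that subclasses can override. The class names and the timeout scenario below are hypothetical, chosen only to illustrate the idea:

    ```python
    class Connection:
        """Illustrative sketch: the default lives in one overridable method."""

        def default_timeout(self):
            # Default Value Method: one obvious place to look for -- and
            # override -- the default, instead of a literal 30 scattered
            # through the code.
            return 30

        def __init__(self, timeout=None):
            # Callers may supply a timeout; otherwise we ask the method.
            self.timeout = timeout if timeout is not None else self.default_timeout()

    class PatientConnection(Connection):
        def default_timeout(self):
            # A subclass changes the default without touching any clients.
            return 120
    ```

    With this shape, `Connection().timeout` is 30, `PatientConnection().timeout` is 120, and an explicit argument still wins -- the benefit over a bare constant is that the default is both documented and overridable in one place.
    
    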


    Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

    November 20, 2004 1:33 PM

    Strange Blogging Occurrences

    Two strange things happened while preparing my last blog entry. First, my editor flagged "Wozniak" as a misspelled word and offered wooziness as a correction. Clearly, my spell checker has been reading about Woz's fascination with Segways. :-)

    Then, for a span of at least ten minutes, http://www.amazon.com/ returned HTTP/1.1 503 Service Unavailable to all requests. I wonder how often that happens at Amazon or Google? Maybe I'm just sensitive to down time right now, after having been reminded of my dependence on my server by 36+ hours of server downtime earlier this week, which made the world think that my blog and e-mail address had disappeared...


    Posted by Eugene Wallingford | Permalink | Categories: General

    November 20, 2004 1:19 PM

    Dancing Naked in the Mind Field

    I recently finished reading Kary Mullis's Dancing Naked in the Mind Field. Mullis is the sort of guy people call a "character", the sort of guy whom my college friends would have called a "weird dude". But he's a weird dude who just happened to win a Nobel Prize in chemistry, for discovering PCR (polymerase chain reaction), a technique for finding and replicating an arbitrary sequence of nucleotides on a strand of DNA.

    The book consists of sixteen disconnected chapters that talk about various parts of Mullis's life as free spirit, biochemist, and celebrity scientist. I enjoyed his many chapters on the joys of doing science. He writes of discovering chemistry and electricity as a child, and he writes of the May evening drive up California's Highway 128 on which he had the brainstorm that led to PCR.

    In one chapter, Mullis tells us about making chemicals with a friend as a high school student, first in the commercial lab of a family friend and then in a homemade lab he and his friend built. Always the entrepreneur, Mullis hatched a scheme to make and sell chemicals that no one else was selling. In doing so, he learned why no one else was doing it: the fabrication process was dangerous and wasteful. But he learned a lot.

    A neat line: When Mullis and his buddy took their first batch of nitrosobenzene to their family friend, he was "pleased to the point of adopting us both as his children forever. Chemists get emotional about other chemists because of the language they have in common and the burns on their hands." I know this feeling well from working with student programmers. But the burns on our hands are all metaphorical; they consist in the dangling pointers we've all chased, in the data files we've created and overwritten, in the failed attempts to make a language say something it cannot.

    This sort of precociousness has long been a hallmark of young computer programmers. From Jobs and Wozniak, Gates and Allen, all the way to all the local ISPs operating out of rural garages across the country, the history of computing is full of kids who have set out to follow their curiosities and changed the world. The advent of the Internet and World Wide Web opened the doors to even more people. I only wish that I had the entrepreneurial spirit that accompanies their curiosity. Maybe I would have changed the world, too?

    In another chapter, Mullis describes how he came to know that no titans of thought were "minding the store", overseeing the world of science with firm, guiding hands. The science world is just a bunch of mortals doing their own things, with no distinguished wisdom for knowing today or the future. He contrasts the naive, somewhat flaky article he wrote in college, which was published in the journal Nature, with his paper describing PCR and its implications, which Nature -- along with all the other highest-ranking journals -- later rejected.

    I especially enjoyed the chapters that comment on the nature of science in the modern world. Mullis gives his views on how having to seek external grants distorts the scientific process, from the choosing of projects to the "selling" of results in a politically-correct culture. He argues that science has changed, and so should how we do science, but what people do doesn't change all that fast. He gives as an example something most high school graduates will remember, if only faintly: Avogadro's number. Computations using 6.02 x 10^23 molecules (did I remember correctly, Mr. Smith?) used to be essential to the conduct of chemistry, when chemists had to work with relatively large masses of substance. But now chemists work with dozens of molecules, or 2, or 1. What's the point of doing calculations 23 orders of magnitude larger?

    Computing has its own historic remnants that affect how we think about programming and programs long after the world changed underneath them. Social change is slow, though, and the university is no exception. As long as we are able to discuss controversial ideas and offer alternatives, we have some hope of making progress.

    Perhaps my favorite chapter deals with how science and math are the result of humans trying to extend their limited senses. In the beginning, humans knew the world only from their natural senses, among which Mullis counts the traditional five plus the senses of falling and time. He argues that our sense modalities developed around the physical needs of the species. For example, our sense of hearing grew to hear sounds in the range that we can make, thus supporting the development of language; our sense of sight came to see the colors we needed to see and in the light conditions available to prehistoric man. As humans progressed intellectually, we derived science as a way to see, hear, and otherwise sense things we could not perceive naturally. Mathematics grew as a way for us to describe these newly-perceived phenomena.

    For Mullis, this is a natural progression. However, over time, science has increasingly moved away from the original range of our natural senses, to increasingly small objects (quarks, anyone?) and increasingly large objects (galaxies and universes). Mathematics has followed a similar path toward abstraction. The result has been science and math increasingly divorced from the lives and understanding of non-scientists. We have moved away from "human-scale" science, from things we can apprehend naturally, to the physics of the very small and very large. Mullis suggests that we return most of our energy -- and most of our funding, 90% or so -- to things that can matter to everyday people as they live everyday lives. He includes in this category the sort of biochemistry he does, of course, for its potential to affect human life directly and dramatically. But he also suggests that we seek a better understanding of asteroids and comets so that we can prevent the next major impact, like the ones in prehistoric times that caused mass extinctions of species. Are we any better prepared than the dinosaurs were for a major asteroid impact, even if we are able to predict its coming years in advance? This all seems a bit crazy, but then that's Mullis. Thinking way-out thoughts can lead to change, if the ideas gain traction.

    Unfortunately, Dancing ... includes some chapters that are so unusual that they may turn some readers off. You will find plenty about drug use, alien abduction, and out-of-body experiences (that were, seemingly, not the result of drug use). Mullis clearly dances naked in the mind field and is not at all constrained by the rationalism that dominates science and technology these days. As a result, he ends up believing some odd juxtapositions at once. If you are put off by such stuff, skip these chapters; you'll not miss anything of "scientific substance". You may miss out on wondering just how a Nobel Prize-winning scientist can think so many strange thoughts, though. And, who knows, you may miss out on the next big thing.

    Mullis's analysis is not always all that deep, and he has biases like any interesting person. But he writes about interesting ideas, which can serve as a trigger for his reader to do the same thing. I rate Dancing Naked in the Mind Field worth a read.


    Posted by Eugene Wallingford | Permalink | Categories: General

    November 11, 2004 5:17 PM

    Knowing and Doing Has Come Unstuck in Time

    Kurt Vonnegut

    Today is Kurt Vonnegut's birthday. I've been reading Vonnegut since high school, before I even knew that, like me, he was a native of the uniquely Midwestern big city of Indianapolis. Some folks have one author they can always turn to when they want to remind themselves of their humanity, and Vonnegut is that author for me. I even spent a few days one summer a few years back (or misspent, depending on your perspective) tabulating The Books of Bokonon, the phony religion that Vonnegut created in his novel Cat's Cradle. Of all the hundreds or thousands of pages that I have created for the web, this one page generates more and more consistent feedback from readers around the world. Vonnegut readers are a kindred lot.

    Billy Pilgrim, Kilgore Trout, Eliot Rosewater, Rabo Karabekian -- all are among my favorite characters in literature. Vonnegut has never been a haute auteur of the sort that attracts "serious" literary attention, but he can create as fully human a character as anyone I've read.

    Now, someone's 82nd birthday is hardly the sort of landmark that ordinarily calls for a big celebration. (Well, inasmuch as an 82nd birthday isn't grounds enough to celebrate!) But Vonnegut has a particular connection to my blog: I very nearly named my blog for one of his stories.

    I don't know about most other bloggers, but I spent considerable mental energy trying to find just the right name. Names are important. I wanted a name that I could live with a long time, one that would send the right message to potential readers. I ended up choosing "Knowing and Doing" in order to send a relatively straightforward message to readers, and to fit in with the mold of other blogs.

    But the names of three Vonnegut stories made the final cut: "Now It Can Be Told", "Tomorrow and Tomorrow and Tomorrow", and "The Euphio Question". I like them all. "Now It Can Be Told" and "Tomorrow and Tomorrow and Tomorrow" even sound like blog names. The one I like best, though, was the "The Euphio Question", but I ultimately decided that it was just too indirect to suit me. Why did I consider it?

    The story includes a description of a musical device created by Dr. Bockman, which leaves listeners "inexplicably grinning like idiots."

    I guess that, deep down, I hoped to have a similar effect on all my readers. But that seems a pretty high bar to set for one's self before ever writing a single entry, so I settled for a name that sounds as pretentious but doesn't promise rapture to my readers. :-)


    Posted by Eugene Wallingford | Permalink | Categories: General

    November 01, 2004 8:15 AM

    Collaboration and Passion

    As many of you know, I turned forty last week. If you are the last to know, I apologize. You must not know Robert Duvall.

    If I should have suffered through a mid-life crisis by now, I am sorry to disappoint you. To be honest, I hope that I have not yet reached the middle of a long and productive life.

    I am not now in the midst of a crisis, but recent events bring such thoughts to mind. I spent last week at OOPSLA amidst the intellectual, professional and personal passion of folks like Brian Marick, David Ungar, and Alan Kay. Then, on the flight home, I finally got around to reading Malcolm Gladwell's article on group think, which describes the passion that often infuses groups of creative minds working in fertile intimacy.

    I certainly crave that sort of passion but often find it lacking in my daily life.

    Mid-life crises may well happen when people realize that they've lost their passion. Perhaps they come to fear that they've lost their capacity to feel passionately. The steady drip-drip-drip of real life has a way of wearing down our sharp edges, leaving us just tiredly waiting for tomorrow.

    One way to combat this erosion is to surround yourself with the right people.

    Gladwell reports (from the work of such folks as Randall Collins and Jenny Uglow) and OOPSLA reminds of the power -- and critical need -- of groups for nurturing passion and driving greatness. Many people, especially Americans, subscribe to the myth of the solitary genius, the lone pioneer. But history shows that nearly all of the great advances attributed to individuals grew out of remarkable circles of creative people pushing each other, driving and feeding off of each other's passion.

    For academics, large universities have an advantage over smaller schools like UNI when it comes to gathering the critical mass of the right people in the right place at the right time. Academic centers like Boston and technological centers like Silicon Valley offer the same possibilities. But such groups can form and grow over space and time, too, especially in this era of easy travel and electronic communication. For me, in the last decade the software community that grew out of the Hillside Group has played that role. So have amorphous groups of creative and ambitious software developers and educators in the OOPSLA and SIGCSE communities. These groups intersect in interesting ways, with enough folks outside the core to inject new ideas occasionally.

    Unconsciously, I drew my program committee for the recent OOPSLA Educators Symposium from these groups. Their ideas and passion helped me to shepherd the symposium to success.

    But sometimes the passion of distributed groups wilts in the heat of a long semester. At OOPSLA, a friend shared his recent bout with this malaise, and I know the feeling well. But it's good to know that making and maintaining connections with good people -- which I have been fortunate to do throughout my career -- is one way to keep passion within my reach. I need to work to develop and maintain relationships if I wish to develop and maintain passion.


    Posted by Eugene Wallingford | Permalink | Categories: General

    October 30, 2004 11:33 AM

    My First Blog Rant

    Okay, I don't think I've ever blogged just to vent my frustration with people in this world. The blogosphere is full of blogs that are nothing but. They wear thin really quickly. But if you ask my wife, she'll tell you that I save my rants for her weary ears.

    Here is my first rant blog:

    People, turn your #$!@%#^@*&$#*^@# cell phones off when you enter a public space.

    I have had students' cell phones go off in my class, and I've been patient and generous in my assessment of the situation. They meant to turn them off, surely, or thought they had. I've heard them in theaters and in churches and at public talks. Each time, I wonder about the rudeness or inflated sense of importance that makes people think that they can impose on everyone else around them.

    The final straws fell last week at OOPSLA when a cell phone went off during Alan Kay's keynote address -- and then twice during his Turing Award lecture! How rude, self-important, or utterly stupid do you have to be to let your phone be the second to ring during a historic lecture?

    And what's worse, two of those people took the calls before leaving the room. Amazing.

    I know that this rant is cliché these days. Everyone complains about cell phone users. I've been muttering about them under my breath for a while myself. But each man has his breaking point, and I'm at mine. The people whose phones interrupted Alan Kay's talks last week are lucky that the only thing broken was my temper, and not their phones.

    Okay, I feel better now.


    Posted by Eugene Wallingford | Permalink | Categories: General

    October 28, 2004 7:30 PM

    OOPSLA Is Over

    My access to wireless at OOPSLA is coming to a close. I have a few ideas burning in my mind to be written about, but they will have to wait until I have time while traveling home tomorrow. For now, let me just say good-bye to the conference.

    At the beginning of his talk, which I wrote about earlier, Ward said: OOPSLA is about programming on the edge, doing practical things better than most.

    In a panel on the second day, Ralph said that, in some ways, objects were just the excuse for OOPSLA, not the reason. The real reason was hanging around with people doing interesting things, especially people interested in talking about and studying real programs.

    OOPSLA is a unique conference for the sort of people it attracts. I had the honor of lunching with David Ungar today, and he asked everyone, "What has inspired you this week? What do you want to rush home and work on?" The best thing you can say about a conference is that it does inspire you to hurry home and explore some new problem or approach, or write a new program. OOPSLA does that for me every year. I'll take time to write down tomorrow how OOPSLA has inspired me, but for now I can at least say that examples play a big role in my answer. Well, at least one of my answers.


    Posted by Eugene Wallingford | Permalink | Categories: General

    October 28, 2004 1:26 PM

    Requiem for a Power Adapter

    So, why didn't I post my last blog entry when written? I broke my G3 Powerbook's power adapter. As I was putting the laptop away after the OOPSLA 2005 planning BoF, I stretched the adapter end of the cord a bit too far, and the internal wire cable snapped.

    Were I living in the 21st century and using any of the more recent PowerBook or iBook models, I would only have had to worry about charging up for the flight home. There are 1436 people at OOPSLA this year, and there must be 1000 Mac laptops here. With the maturity of Mac OS X, Apple has really made inroads into the technical community, at least some of the more advanced technical folks. If you can have a fully-featured Unix OS, a wonderful Apple interface on top, and out-of-the-box wireless, why not pay a few extra bucks and get the Mac?

    But no, I am still using my 1999-era PowerBook, and its power adapter is incompatible with all more recent models. And I've only seen two other people using one here. Many folks I know prefer the stylish and sexy little silver disk of the older PowerBooks, with its compact cord winding. That is no consolation when a man has a single battery holding only a 40% charge, facing a day of conference and day of travel, and with a suddenly apparent reliance on his laptop, felt deeply within his bones.

    So I retired to my room last night fretting about the future. In order to preserve the remaining juice in my PowerBook, I resorted to an ancient technology -- the phonebook -- in hope of finding an authorized Apple dealer within walking distance of my hotel, in a secondary hope that the dealer will have in stock a power adapter for my dinosaur.

    The gods smiled on me. Perhaps it was a birthday gift for me. I found an entry for the Mac Station, a mile or so from the hotel. So I planned my morning around a jaunt in that direction.

    After several days of beautiful sunshine, I awoke to a day of rain. I put in an 8.5-mile jog around Stanley Park, grabbed some breakfast, threw on a jacket, and headed out into the mist. The twenty-minute walk was worth it, as I saw the edge of Vancouver's Chinese district, some of its banking district, and its impressive six-story public library. I arrived at 9:02 AM -- and found that the store opens at 10:00 AM. The gods may be friendly, but they have a sense of humor, too.

    So I headed back to the library to fill the time reading Jutta Eckstein's new book. But the library opens at 10:00 AM, too, so I bought a soda and sat in a small cafe for a while.

    Back to the Mac Station. I arrive right at 10:00 to find them opening the front gate. Inside -- and joy. They don't carry an OEM power adapter for my model, but they do have a third-party product in stock. The price is a bit steep for my tastes ($130 Canadian), but an addict will pay the price for his next fix. We tried it out in the store -- I'm overly cautious sometimes -- and it worked just fine. Relief.

    I'm back at OOPSLA now, in an Onward! session of outlandish wishes. I'm on wireless. All is right with the world, at least for a few minutes.

    My new power adapter isn't as pretty as my old one, and I'll miss that. I know that Windows and Linux folks are used to pedestrian, merely functional, boring at best and ugly at worst components, but I'll miss my silver disk.


    Posted by Eugene Wallingford | Permalink | Categories: General

    October 26, 2004 11:08 PM

    OOPSLA, Day 1

    The first day of OOPSLA has seen highs and lows, though the highs weren't as high as I'd expected and the lows were lower.

    Rick Rashid gave the keynote address to open the conference. A good keynote inspires a conference's participants to think interesting thoughts, to respond to some challenge, or to interpret what follows in a particular context. Sometimes, a keynote is spectacular in its scope or challenge. Christopher Alexander did that at OOPSLA a few years back, even though I didn't like the talk as much as many. Keynoters don't have to be spectacular, but they should at least provide a theme for the conference that threads through all the conversations that make it up, as Brian Marick explained last summer.

    Sadly, Rashid's talk didn't do either. He opened with 20 minutes or so of promising ideas, about how the future of computing lies in pervasive computing, with its attendant need for situating people and their computers in place and time. The interesting part of this discussion focused on Microsoft's serving up of geographical information and providing a way for programmers to integrate such information into apps. Check out especially http://terraserver.microsoft.com/, and also http://skyserver.sdss.org/, http://skyquery.net/, and http://gpgpu.com/.

    But soon after, the talk lost its steam and ran out of the ideas that are the fuel of a good keynote. Rashid announced the upcoming web release of VisualStudio.Net 2005 and then called up a project engineer to demo a piece of the new tool. If the tool had offered something genuinely new, that might have been fodder for a keynote, but the content was new only to the MSDN crew and pedestrian otherwise. After the demo, the talk never returned to intellectual currency, and it ended with a much smaller audience than it began with.

    Robert Biddle and James Noble opened the Onward! track with their "Notes on Notes on Postmodernism" (NoNoPoMo), a self-referential peek back at their groundbreaking 2002 "Notes on Postmodernism" in the inaugural Onward! Robert and James do wonderful theater -- as entertaining as anything you will find at any computing conference -- that points out an essential truth: We don't need to have one overarching unifying story to guide computing; lots of little stories, told by smaller communities to guide their work, can be enough. We can build software in this way, too. They passionately declare that, contrary to the fiction our own industry has created and nursed for 35 years, there is no software crisis; software development has actually been a prolific and wide-reaching success. We should admit that and move on to do our next good work.

    After lunch came Ward Cunningham's talk on systems of names. I have been looking forward to this talk for a couple of months, and Ward didn't let me down. He told a story, which is what Ward does. The story wove together the many ideas and contributions from his career -- including objects, CRC cards, patterns, wikis, and XP -- and drew out a theme that captures Ward's magic. It's not magic, though; it's an outlook on life, an attitude:

    Be Receptive to Discovery

    Ward even drew out some sub-themes to help us adopt this attitude. I was going to comment on each, but my words don't add to what Ward said, so:

    • Use what you know.
    • Feel it work.
    • Share the experience.
    • Wait for insight.
    • Refactor to include it.

    My last event from the day's main schedule was Brian Marick's talk on software methodology as ontology. If you read Brian's blog, you know that he has a peculiar philosophical attitude toward software development. This attitude probably follows a bit from Brian's having been an English major in school, but I suspect that it's mostly just who he is. Brian takes a unique perspective on software development, one which I find both enlightening and challenging. I'm a sucker for this kind of stuff.

    Today's talk started with Ralph Waldo Emerson (who will almost certainly show up in my blog some day soon, for different reasons), whose fundamentally optimistic outlook on the world and human nature so differs from the software world's fear of change and complexity and failure; moved on to ontology as world view; and finally applied Imre Lakatos's view on the progress of science to the idea of software development methodology. I will save the real discussion of Brian's talk for a post later this week, to give it its due. Suffice it to say that the talk lived up to my expectations.

    The rest of the day has been special events. At dinner time Alan Kay gave his Turing Award lecture. I will also blog on Alan's talks later this week when I can give them full attention, but for now I will say this: Alan's keynote to the Educators' Symposium was the better talk. Perhaps it was nerves, or a smaller time window, or just the effects of giving very similar talks on consecutive days. But we really lucked out with our 90+-minute talk and 30-minute Q-n-A session.

    Finally, as I write this, I'm sitting in on the GoF 10th Anniversary Commemorative event. Solveig Haugland, author of the hilarious spoof Dating Design Patterns, is leading a fun session on her book, replete with skits about the untold story of the Gang of Four. So far, it's mentioned Trojan Proxies, Encapsulated Big Fat Openings, Half Bad Boy Plus Protocol, and leather magazines. I had the opportunity to meet Solveig before the talk, and she is a lot of fun. The session is a nice way to end the day.


    Posted by Eugene Wallingford | Permalink | Categories: General

    October 26, 2004 11:00 AM

    The First Day of the Rest of OOPSLA

    With the Educators' Symposium over, I now get to enjoy the rest of OOPSLA worry-free. The conference proper begins today, with some great speakers and panels on tap. As is now usual, the Onward! track will get most of my attention between invited talks and networking.

    Vancouver really is a nice conference city. The view to the right is one third of the panorama outside of my 15th floor hotel room. This morning I ran about eight miles, into the city to the south and then around False Creek, an inlet from the bay to our west.

    I do like water. I'm not one of those folks who feels an irresistible urge to live on the sea, but I love to be near it. (Though after my run, I did come back and pop Son of a Son of a Sailor into the CD player...) The sounds of waves lapping against the shoreline, birds flocking overhead, and the gentle swoosh of a morning crew team out for a training run of their own certainly bring a sense of peace to a run.

    That said, it's easy to forget that not everything about the coast is light and beauty. Port cities are, at their edges, industrial places: big steel ships and beyond-human scale concrete piers. Here in the heart of Vancouver things aren't so bad, with the coastline dominated by marinas and restaurants, but even still I ran through a couple of industrial areas that didn't look or smell all that wonderful. I'm glad when the real world occasionally reminds me not to surrender to my weakness for romance.

    Well, off to Rick Rashid's conference keynote on the future of programming. Rick's not a programming languages guy, but he has written a little code... He developed the Mach operating system, which sits at the core of the operating system running on my laptop right now!


    Posted by Eugene Wallingford | Permalink | Categories: General, Running

    October 22, 2004 4:23 PM

    Heading to Vancouver

    I'm off to Vancouver for OOPSLA. The conference will be exciting, with two talks by Alan Kay, another by Ward Cunningham, events on Eclipse and Squeak, and the culmination of a year of my work in the Educators' Symposium. When you take on one of these professional service tasks, it seems pretty innocuous, but after months of thinking and planning and organizing, and working with the dedicated people who invested their time and energy through my program committee, the event takes on a more personal tone. I have big hopes for a symposium that helps the community take a step in the direction of re-forming CS education.

    I love Vancouver as a conference town and a running location. I'm looking forward to circling Stanley Park again and to exploring some new parts of town. You can be certain to receive another episode in my Running on the Road series soon. (See the latest one here.)

    We'll have wireless access at the conference. I plan to blog daily once I arrive.


    Posted by Eugene Wallingford | Permalink | Categories: General

    October 08, 2004 6:02 AM

    A Busy Month

    Today begins a busy stretch for me.

    This afternoon, I leave for a weekend reunion at Ball State University, my alma mater. I'm part of a scholarship program that has quadrennial get-togethers, and 2004 is our year. This is Homecoming Weekend in Muncie, and there will be a lot happening. Several old friends and their families will be there, and my brother is coming up to spend the weekend with me. Amazingly enough, I never attended a Ball State homecoming football game, even as a student. This seems like a good weekend for a first time. I also plan two easy runs as I prepare for...

    Next weekend, I run the Des Moines Marathon. My training has gone well, with only a couple of hiccups that I survived. Now, I'm tapering. This will be my second marathon after running Chicago 2003, and I am in much better shape -- physically and mentally -- for this one. But I am humble in the face of the 26.2 miles and know that I will have to be careful and lucky to meet my race goals.

    The next weekend, I take off for a week in Vancouver for OOPSLA 2004, where I am chairing this year's Educators' Symposium. The highlight of the 2004 conference is Alan Kay's Turing Award lecture. Alan is also giving the keynote address at the Educators' Symposium, and I am quite psyched.

    But all this travel makes for a busy month. I'd better get home to spend some time with my family before I hit the road!


    Posted by Eugene Wallingford | Permalink | Categories: General

    October 01, 2004 2:04 PM

    Proofs from THE BOOK

    I just finished reading Proofs from THE BOOK, by Martin Aigner and Günter Ziegler. The title comes from Paul Erdős, who "liked to talk about The Book, in which God maintains the perfect proofs for mathematical theorems". Mathematicians don't have to believe in God, said Erdős, but they have to believe in the book. Proofs from THE BOOK collects some of the proofs Erdős liked best and some others in the same spirit: clever or elegant proofs that reflect some interesting insight into a problem.

    I am a computer scientist, not a mathematician, but many of these proofs made me smile. My favorite sections were on number theory and combinatorics. Some of the theorems on prime and irrational numbers were quite nice.

    My favorite proof from The Book applied the pigeonhole principle in a surprising way. The claim:

    Suppose we are given n integers a_1, ..., a_n, which need not be distinct. Then there is always a set of consecutive numbers a_{j+1}, ..., a_k whose sum is a multiple of n.

    This doesn't seem obvious to me at all. But consider the sequences N = {0, a_1, a_1+a_2, a_1+a_2+a_3, ..., a_1+a_2+...+a_n} and R = {0, 1, 2, ..., n-1}. Now consider the function f that maps each sum s in N to (s mod n) in R. Now, |N| = n+1 and |R| = n, so by the pigeonhole principle we know that there must be two sums from N, a_1+...+a_j and a_1+...+a_k (j < k), mapped to the same value in R. (a_1+...+a_j may actually be 0, the first value in N, but that doesn't affect our result.)

    Then the sum of the consecutive numbers in between,

    a_{j+1} + ... + a_k = (a_1 + ... + a_k) - (a_1 + ... + a_j),

    must have a remainder of 0 when divided by n -- subtract the smaller sum from the larger, and you get the claim! QED. Beautiful.
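    The proof is constructive, so it translates directly into a short program. Here is a Python sketch (mine, not from the book) that finds such a run of consecutive numbers by recording the first index at which each prefix-sum residue appears:

```python
def consecutive_multiple(a):
    """Return (j, k) such that sum(a[j:k]) is a multiple of n = len(a).

    The n+1 prefix sums take only n possible residues mod n, so two
    must collide (pigeonhole); the slice between them is the answer.
    """
    n = len(a)
    seen = {0: 0}              # residue of prefix sum -> first index seen
    total = 0
    for k in range(1, n + 1):
        total += a[k - 1]
        r = total % n
        if r in seen:
            return seen[r], k  # sum(a[seen[r]:k]) % n == 0
        seen[r] = k
    # Unreachable: the pigeonhole principle guarantees a collision.
```

    For a = [3, 7, 5, 2], the function returns (1, 3): the run 7 + 5 = 12 is a multiple of n = 4.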

    Eugene sez: Check out Proofs from THE BOOK.

    P.S. It might be fun to create a similar book for proofs related specifically to computer science. Proofs from THE BOOK has some proofs on counting and one proof on information theory, but most of the book focuses on mathematics more broadly.


    Posted by Eugene Wallingford | Permalink | Categories: Computing, General

    September 23, 2004 5:09 PM

    Meta-Blog

    A few items about the blog itself...

    Comments

    A few folks have asked why Knowing and Doing doesn't support comments. I've been thinking about adding comments since July. Several blogs I read don't support comments, and a few that have supported comments in the past are disabling them.

    I have two reasons. One is comment spam. Many bloggers (e.g., Michael Nielsen) find that keeping their comment sections neat and tidy is difficult in these days of spambots. I don't have time to monitor comments and eliminate the ones that waste space or (worse) offend readers.

    The second comes down to another form of laziness. I'm using NanoBlogger to power my blog, and adding comments requires a small but non-trivial amount of work. I tried it once, couldn't get it to work right away, and so dropped the idea for a while. Then school happened, and I've just been too busy to get back to it.

    To a certain part of the blogging community, not supporting comments is a huge faux pas. To these folks, comments are an essential part of the blogging experience. I hope that I have not lost many potential readers for this reason, but it's a risk I have to take until I have more time to mess with the software and monitor the comments.

    On the other hand, I've always thought the prospect of seeing "Comments(0)" at the bottom of every entry would be rather depressing, so maybe this is a case of "what you don't know can't hurt you". :-) The fact that I receive occasional responses lets me know that someone is reading. (And I even had my first "spotted at" reference today!)

    Categories

    I recently changed the name of the Elementary Patterns category to just plain Patterns. I realized that all of my patterns posts thus far had been more general, and I didn't want to mislead folks. I expect that Elementary Patterns will return as a category some day soon, when I have more material specific to that issue to write on.

    Category Feeds

    I wish I had them. According to the doc, NanoBlogger supports them, and I have the config flag set to 'on', but they don't seem to be there. One of these days, I'll fix it.

    Personal Blogging

    So far, my blog has been almost exclusively about matters of professional interest, with one broad exception: Running. I don't expect that I'll begin blogging in a confessional or stream-of-consciousness mode any time soon, because those sorts of posts can go off course into self-indulgence pretty quickly, and I don't trust myself. I'll keep posting on running because (1) some folks have expressed interest, (2) sometimes those posts interact with professional threads, such as agile software development, and (3) I like it. Hey, even I can indulge myself some of the time.

    That said, I must admit that when I blog on running, it almost feels like a day off. There's a lot less pressure on me to get those posts "right".

    Wrap-Up

    That's all for now. I am surprised that I've been able to keep up a steady pace blogging after the academic year started. It takes time, and with two new preps plus all of my regular duties, time isn't exactly in surplus. But I enjoy the sort of thinking I have to do when I write for a public audience, and so I've made time.

    But pretty soon I have to grade some assignments, or the students will revolt. And I do have some other writing to do... so don't be surprised to see some of that material show up here!


    Posted by Eugene Wallingford | Permalink | Categories: General

    September 18, 2004 12:02 PM

    This and That

    I am enjoying a weekend at home without a lot of work to do. After having been at PLoP for five days last week, a chance to be with my family and to regroup is welcome. Tomorrow is my longest training run for the Des Moines Marathon -- 24 miles. My 22-miler last Sunday at Allerton Park went well, so I'm hopeful for a good run tomorrow.

    Here are some programs that I've been experimenting with lately...

    Markdown is a simple plain text formatting syntax *and* a program that translates this syntax into HTML. I like to work in plain text, so I write all of my own HTML by hand. But when I am trying to whip up lecture notes for my courses, I find typing all the HTML formatting stuff a drag on my productivity. Markdown gives me the ability to format my notes in an email-like way that can be posted and read as plaintext, if necessary, and then translated into HTML. It doesn't do all of HTML, but it covers almost everything that I usually do.

    Here are a couple of Smalltalks implemented in Java: Talks2 and Tim Budd's Little Smalltalk, SmallWorld. Talks2 builds on Squeak and so has more stuff in it, but SmallWorld is, well, small, and thus has source code that is more accessible to students.

    And here are a couple of fun little sites:

    • CSS Zen Garden, a demonstration of the power of CSS design for web sites, along with oodles of style sheets for download. I don't do much CSS, but some of these pages are pretty.

    • Segmation, a company that makes cool image segmentation software. The web site's draw is its Color by Number toy, where you can paint by the numbers using your mouse and screen. The kids can have fun with this one.


    Posted by Eugene Wallingford | Permalink | Categories: General

    September 17, 2004 4:54 PM

    Money as Technical Contribution

    A technical person who works at a university recently lamented:

    Our administrators want to be involved in technical decisions, but they don't understand the technology. So they buy stuff.

    In his view, these managers think that selecting the software everyone uses makes them relevant. It's about power.

    I have noticed this tendency in administrators as well, but I think that we can find a more charitable interpretation. These folks really do want to contribute value to the organization, but their lack of deep technical understanding leaves them with only one tool available to them, money. (Ironic that this is so, in these days of deep cuts in academia.) Unfortunately, this often leaves the university supporting and using commercial software -- sometimes rather expensive software -- when free, open source, and better software would serve as well.

    If we believe the more charitable interpretation, then we need to do a better job helping administrators understand technology, software, and the values they embody. It also means getting involved in the hiring of administrators, to help bring in folks who either understand already or who are keen on learning. Both of these require the gritty sort of committee work and meetings that many academics run away from, me included. In a big organization, it is sometimes hard for grassroots involvement to have a big effect on hiring and promotion. But the effort is almost certainly worth it.


    Posted by Eugene Wallingford | Permalink | Categories: General

    September 16, 2004 4:01 PM

    Paying Attention to the Details

    At PLoP last week Gerard Meszaros said something that caught my ear:

    risk = probability × consequence

    Why waste energy minimizing a risk whose consequence is too low to be worth the effort?

    This idea came up later at the conference when Ward talked about the convention-busting assumption of wiki, but it is of course a central tenet in the agile methods. Too often, when faced with an undesirable potential result, we focus too quickly on the event's likelihood, or on its effects. But risk arises in the interplay between the two, not in either factor alone. If an event is likely to happen but has only a small negative effect, or if it has a major effect but is unlikely to occur, then our risk is mitigated by the second factor. Recognizing this can help us avoid the pitfall of running from a potential event for the wrong reasons.
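    A quick sketch in Python makes the interplay concrete (the probabilities and costs here are made up purely for illustration):

```python
# Meszaros's formula: risk = probability x consequence.
def risk(probability, consequence):
    return probability * consequence

# A likely event with a small cost can carry far less risk than a
# rare event with a large cost -- and vice versa.
frequent_but_cheap = risk(0.9, 10)      # 9.0
rare_but_costly    = risk(0.01, 5000)   # 50.0
```

    Focusing on probability alone would have us fret about the first event, but the second carries more than five times the risk.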

    Recognizing this relationship can also help us to take control of the problem. In XP and the other agile methods, we accept that change is highly likely, so we work to minimize the consequence of change. We do that by maintaining a comprehensive suite of tests to help us verify that changes to the system don't break something unexpectedly; and, when they do, we use the tests to find and fix the problem spots. We minimize the consequence of change by using refactoring tools that help us to change the structure of our programs when design requirements call for something different. We minimize the consequence of change by working on short iterations, continuously integrating our code, and releasing versions frequently, because these disciplines ensure that we get feedback from our tools and client frequently.

    Learning to pay attention to all the variables in a situation is more general than just assessing risk. In a recent message to the XP mailing list, Kent Beck said that he tries to help his customers to think in terms of return, not just value or cost:

    I have another goal for early estimation, which is to encourage the business-decision-makers to focus on return instead of just value. If the choice is between the Yugo and the Ferrari, I'll take the Ferrari every time. If I have $6000 and I know the price of the two cars, my thinking is more realistic.

    A system's value is a function of many variables, including its features, its cost, and the environment in which it must operate.

    We can become more accurate, more efficient decision makers by paying attention to what really matters, and not being distracted by our biases and initial reactions. Often, these biases were learned in other times, other environments. Fortunately, I think that this is something that we can learn to do, by consciously developing new habits of thought. It takes discipline and patience to form new habits, but the payoff is often worth the effort.


    Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

    September 15, 2004 11:33 AM

    Ward on the Wiki of the Future

    Last time I wrote about Norm Kerth's Saturday session at the recently-ended PLoP 2004. Norm's topic was myth, especially the Hero's Journey myth.

    Ward Cunningham led the second session of the day, beginning with some of his own history. As many of you know, Ward is best known for taking ideas and turning them into programs, or ways of making programs better. He spoke about "wielding the power of programming", to be able to make a computer do what is important to you. If you can think of an idea, you can write a program to bring it about.

    But programmers can also empower other people to do the same. Alan Kay's great vision going back to his grad school days is to empower people with a medium for creating and expressing thoughts. Ward pointed out that the first program to empower a large group of non-programmers was the spreadsheet. The web has opened many new doors. He called it "a faulty system that delivers so much value that we ignore its fault".

    Ward's wiki also empowers people. It is an example of social software, software that doesn't make sense to be used by one person. The value is in the people that use it together. These days, social software dominates our landscape: Amazon, eBay, and a multitude of web-based on-line communities are just a few examples. Wiki works best when people seek common ground; it is perhaps best thought of as a medium for making and refining arguments, carrying on a conversation that progresses toward a shared understanding.

    This dynamic is interesting, because wiki was predicated in part on the notion of taking a hard problem and acting as if it weren't a problem at all. For wiki, that problem is malevolent editing, users who come to a site for the purpose of deleting pages or defacing ideas. Wiki doesn't guard against this problem, yet, surprisingly, for the most part this just isn't a problem. The social processes of a community discourage malevolent behavior, and when someone violates the community's trust we find that the system heals itself through users themselves repairing the damage. A more subtle form of this is in the flip side of wiki as medium for seeking common ground: so-called "edit wars", in which posters take rigid positions and then snipe at one another on wiki pages that grow increasingly long and tedious. Yet community pressure usually stems the war, and volunteers clean up the mess.

    Ward's latest thoughts on wiki focus on two questions, one technical and one social, but both aimed at a common end.

    First, how can we link wikis together in a way that benefits them all? When there was just one wiki, every reference matched a page on the same server, or a new page was created. But now there are dozens (hundreds?) of public wikis on the web, and this leads to an artificial disjunction in the sharing of information. For example, if I make a link to AgileSoftwareDevelopment in a post to one wiki, the only page to which I can refer is one on the same server -- even if someone has posted a valuable page of that name on another wiki. How could we manage automatic links across multiple wikis, multiple servers?

    Second, how can wiki help information to grow around the world? Ward spoke of wiki's role as a storytelling device, with stories spreading, being retold, changed, and improved, across geographic and linguistic boundaries, and maybe coming back around to the originators with a trail of where the story has been and how it has changed. Think of the children's game "telephone", but without the accumulation of accidental changes, only intentional ones. Could my server connect to other servers that have been the source of stories that interested me before, to find out what's new there? Can my wiki gain information while forgetting or ignoring the stuff that isn't so good?

    Some of these ideas exist today at different levels of human and program control. Some wikis have sister sites, implemented now with crosslinking via naming convention. But could such crosslinking be done automatically by the wikis themselves? For instance, I could tell my wiki to check Ward's nightly, looking for crossreferenced names and linking, perhaps even linking to several wikis for the same name.

    In the world of blogging, we have the blogroll. Many bloggers put links to their favorite bloggers on their own blog, which serves as a way to say "I find these sites useful; perhaps you will, too." I've found many of the blogs I like to read by beginning at blogs by Brian Marick and Martin Fowler, and following the blogroll trail. This is an effective human implementation of the spreading of useful servers, and much of the blogging culture itself is predicated on the notion of sharing stories -- linking to an interesting story and then expanding on it.

    Ward's discussion of automating this process brought to mind the idea of "recommender systems", which examine a user's preferences, find a subset of the community whose collective preferences correlate well with the user's, and then use that correlation to recommend content that the user hasn't seen yet. (One of my colleagues, Ben Schafer, does research in this area.) Maybe a collection of wikis could do something similar? The algorithms are simple enough; the real issue seems to be tracking and recording user preferences in a meaningful way. Existing recommender systems generally require the user to do a lot of the work in setting up preferences. But I have heard Ralph Johnson tell about an undergraduate project he directed in which preferences were extracted from Mac users' iTunes playlists.
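    For readers who haven't met them, the core idea of a user-based recommender fits in a few lines. This Python sketch uses toy data with made-up names; real systems differ mainly in scale and in how they gather preferences:

```python
from math import sqrt

# Ratings on a 1-5 scale; the users and items are invented for illustration.
ratings = {
    "me":    {"wiki": 5, "blogs": 4},
    "alice": {"wiki": 5, "blogs": 4, "zines": 5},
    "bob":   {"wiki": 1, "blogs": 2, "radio": 5},
}

def similarity(a, b):
    """Cosine similarity over the items both users have rated."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    dot = sum(a[i] * b[i] for i in common)
    return dot / (sqrt(sum(a[i] ** 2 for i in common)) *
                  sqrt(sum(b[i] ** 2 for i in common)))

def recommend(user):
    """Suggest the best-correlated neighbor's unseen items, top-rated first."""
    mine = ratings[user]
    best = max((u for u in ratings if u != user),
               key=lambda u: similarity(mine, ratings[u]))
    unseen = {i: r for i, r in ratings[best].items() if i not in mine}
    return sorted(unseen, key=unseen.get, reverse=True)
```

    Here recommend("me") suggests "zines", because alice's ratings correlate with mine better than bob's do.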

    I must admit that I was a bit worried when I first heard Ward talk about having a wiki sift through its content and activity to determine who the most valuable contributors are. Who needs even narrower bandwidth for many potential posters to add useful content? But then I thought about all the studies of how power laws accurately model idea exchange in the blogosphere, and I realized that programmed heuristics might actually increase the level of democratization rather than diminish it. Even old AI guys like me sometimes react with a kneejerk when a new application of technology enters our frame of reference.

    The Saturday sessions at PLoP created an avalanche of thoughts in my mind. I don't have enough time to think them through now or act on them, but I'll keep at it. Did I say before how much I like PLoP?


    Posted by Eugene Wallingford | Permalink | Categories: Computing, General

    September 10, 2004 11:03 PM

    And Now Starring...

    I was in a skit tonight. I don't think I've had a role in a skit not created by one of my daughters since 1985 or so, when I teamed with one of my best friends and the cutest girl I went to college with on a skit in our Business Law class.

    The skit was put on by Mary Lynn Manns and Linda Rising to introduce their forthcoming book, Fearless Change: Patterns for Introducing New Ideas. They wrote a short play as a vehicle for presenting a subset of their pattern language at XP 2004, which they reprised here. I had two lines as an Early Adopter, a careful, judicious decision maker who needs to see concrete results before adopting a technology, but who is basically open to change. Wearing a construction paper hat, I pulled off my lines without even looking at my script. Look for the video at a Mr. Movies near you.

    My favorite part of the skit that didn't involve me was this Dilbert comic that opened the show.

    PLoP makes you do crazy things.

    Our writers' workshop was excellent. Ralph Johnson and I had two papers on our elementary patterns work at ChiliPLoP. They are the first concrete steps toward a CS1 textbook driven by patterns and an agile approach from the beginning. The papers were well-received, and we gathered a lot of feedback on what worked well and what could stand some improvement. Now we're talking about how to give the writing project a boost of energy. I'll post links to the papers soon.


    Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

    September 04, 2004 4:37 PM

    I've Been Toogled

    ASCII art seems to be going through a renaissance these days. This quintessentially '70s art form has attracted a cult following among today's youth. An example of this phenomenon is the fun little search-engine-and-art-generator Toogle. Toogle feeds your query to Google Images and then generates an ASCII art version of the #1 image it finds using the characters of the query as its alphabet. The images look pretty good; it even reproduces colors well.

    True to form, I have Toogled myself. The result is shown above. Not the first image I would choose, but who am I to doubt Google Images?

    Now I can tell everyone that I really am a man of letters.


    Posted by Eugene Wallingford | Permalink | Categories: General

    August 30, 2004 10:54 AM

    Serendipity and False Pride

    Yesterday I blogged about a new Rule of Three for the patterns community, taken from Gerald Weinberg's The Secrets of Consulting. Weinberg motivated the rule with a story of how his false pride in being thought smart -- by his students! -- led to ineffective thinking.

    The story of false pride reminded me of one of my favorite scenes in the movie Serendipity, and then of one of my favorite classical quotes.

    In the movie, Jonathan (John Cusack) throws all sensibility to the wind in an effort to find the woman he fell in love with one afternoon many years ago. His search threatens his upcoming wedding with a beautiful woman and makes everyone think he's nuts. But his best friend, Dean (Jeremy Piven), sees the search and its attendant risk as something more.

    Dean is married, but his marriage is in trouble. He and his wife have let their problems go on so long that now both are too proud to be the one to make the first move to fix them. When Jonathan wonders out loud if he has gone nuts and should just go home and marry his lovely fiancee, Dean tells him that his search has been an inspiration to work with his wife to repair their relationship. In support of his admiration for Jonathan, he recited a quote from a college humanities course that they shared: "If you want to improve, be content to be thought foolish and stupid..."

    That scene and quote so struck me that, the next day, I had to track down the source. As is usually the case, Google helped me find just what I wanted:

    If you want to improve, be content to be thought foolish and stupid with regard to external things. Don't wish to be thought to know anything; and even if you appear to be somebody important to others, distrust yourself. For, it is difficult to both keep your faculty of choice in a state conformable to nature, and at the same time acquire external things. But while you are careful about the one, you must of necessity neglect the other.

    ... and led me to the source, The Enchiridion, by Epictetus.

    This quote is a recurring source of encouragement to me. My natural tendency is to want to guard my reputation by appearing to have everything under control, by not asking questions when I have something more to learn, by not venturing to share my ideas. Before I started this blog, I worried that people would find what I said shallow or uninteresting. But then I decided to draw my inspiration from Serendipity's Jonathan and step forward.

    Weinberg's book teaches the same lesson throughout: A consultant will live a better life and help their clients more if only they drop their false pride and admit that they don't know all there is, that they can't answer every question.

    And if you like romantic comedies but haven't seen Serendipity yet, then by all means check it out soon.


    Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

    August 17, 2004 2:07 PM

    August 13 -- My Talk on Test-Driven Development

    On the last day of SugarLoafPLoP 2004, I gave my test-driven development tutorial as the last event on the main program, just before the closing ceremony. I was pretty tired but brought as much energy as I could to it. The audience was tired, too, and it showed on their faces, but most folks were attentive and a couple asked interesting questions.

    One person asked about the role of traditional testing skills, such as finding equivalence classes on inputs, in TDD. These skills are still essential to writing a complete set of tests. Brian Marick and his colleagues in "agile testing" have written a lot about how testers work with agile projects. One of the great values of agile software development is that most everyone on your team can develop some level of expertise at writing tests, and can use whatever knowledge they learn about testing.
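    To make the idea concrete, here is a hypothetical illustration (the function and its weight bands are invented for this sketch, not taken from the talk): partition the input domain into equivalence classes, then write one representative test per class.

```python
import unittest

def shipping_cost(weight_kg):
    """Hypothetical function: flat shipping rates by weight band."""
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    if weight_kg <= 1:
        return 5.00
    if weight_kg <= 10:
        return 12.50
    return 30.00

class ShippingCostTest(unittest.TestCase):
    # One test per equivalence class of the input domain:
    # invalid, light, medium, heavy.
    def test_invalid_weight(self):
        self.assertRaises(ValueError, shipping_cost, 0)

    def test_light_package(self):
        self.assertEqual(shipping_cost(0.5), 5.00)

    def test_medium_package(self):
        self.assertEqual(shipping_cost(5), 12.50)

    def test_heavy_package(self):
        self.assertEqual(shipping_cost(25), 30.00)
```

    Choosing one value from each class keeps the test suite small while still exercising every behavioral region of the function, which is exactly the skill traditional testers bring to a TDD project.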

    Someone in industry asked whether TDD increases the quality of code but at the cost of longer development times. I answered that many believe TDD doesn't increase net development time, because this approach includes some testing time and because the increase in code quality means many fewer bugs to fix downstream. I could not point to any controlled experiments that confirm this, such as the ones Laurie Williams has conducted on pair programming. If you know of any such studies, I would love to hear from you. I think this is an area ripe with possibilities.

    All in all, folks were skeptical, which is no surprise from an audience with a serious bent toward traditional software engineering practice. TDD and other agile practices are as disorienting to many folks as finding myself in the Sao Paulo airport was to me. Perhaps I helped them to see at least that TDD isn't irresponsible, that it can be a foundation for sound software development.

    This day turned into one like last Sunday -- after a half day of conference, Rossana Andrade took me and Paulo Masiero on a short sightseeing and souvenir-shopping trip around Fortaleza. Then she and her husband Richard took me to a cool Brazilian pizza place for dinner, and finally they took me to the airport a few hours before my 11:10 PM flight to Rio de Janeiro, the first leg of my journey home. The day became Saturday with no fanfare, just a long flight with a layover in Recife to exchange passengers and arrival in an empty and quite English-free Rio de Janeiro airport.

    I must say thanks to my hosts in Brazil, Paulo and Rossana. They took wonderful care of me, fed me lots of authentic food, told me all about their cities and country, chauffeured me around, and translated everything from pizza menus to billboards for me. Indeed, all the folks at the conference were wonderful hosts and colleagues. I can heartily recommend SugarLoafPLoP to anyone interested in participating in a patterns conference.


    Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

    August 17, 2004 1:54 PM

    August 12 -- Eu Sobrevivi El Insano

    [ Update at the end... ]

    The writers workshops at SugarLoafPLoP have gone well so far. To moderate a workshop is hard work, because the moderator has to understand each paper pretty deeply, which requires hard work studying the paper. Then, he has to guide the workshop, keeping it on focus and asking the right leading questions when the discussion slows. I have a lot to learn yet about being a truly good moderator.

    The coolest part of this day wasn't my 100-minute run at sunrise over the grounds of the Aquaville Resort, but our afternoon at Beach Park, a water park near the resort. The conference organizers set aside a three-hour block to decompress from the work we'd been doing by going to the park. This place has a wide variety of water slides -- and the scariest ones I've ever seen! The web site touts the newest, Kalafrio, and it was both fun and scary. But the scariest of all is named "el Insano", and the name fits. It is a 41m tall slide, with a drop that is darn close to vertical. A few of the college guys wanted to try it and, when they found out I was the only "old guy" around who wanted to give it a go, they took me with them. Click the link for a picture.

    That first moment is the scariest. Just after you go over the edge, you are airborne for a second, out of contact with the bottom of the slide. Then gravity does its job and accelerates you to a remarkable speed. When you hit the bottom, you enter a curve that brings you into a tunnel parallel to the ground, where water hits you with remarkable force. I think that is a deceleration mechanism, because otherwise they'd need an airport runway-length chute at the bottom. It all happened so fast that I hardly had time to be afraid. I remember my heart racing at the initial drop, and then the sensation of falling, and then the water in the tunnel -- but then it was over in what seemed like an instant. I may have screamed, but my heart was so loud in my ears that I couldn't have heard it.

    Thanks to Joe Yoder, I have a T-shirt to show all that "Eu sobrevivi ... El Insano". (That's "I survived", in Portuguese.)

    And to prove it was no fluke, I did it again later!

    [ And now, thanks to Sergei Golitsinski, we have the only known picture of me descending the Great Wet Way. ]


    Posted by Eugene Wallingford | Permalink | Categories: General

    August 17, 2004 1:41 PM

    August 11 -- My Talk on Writing Patterns

    I finally gave my talk on writing patterns and pattern languages this morning. It went well enough, I suppose, but I broke many of the suggestions I made in the paper: too few examples, too abstract. Sigh. How can I manage so often to know what to do yet not do it? This talk will be better the next time I give it.

    The best question at this session was about trying to write patterns that "live forever". I used Alexander's "Light on Two Sides of Every Room" as an example, and this prompted someone to point out that even the best patterns seem to become stale after a certain period of time. People wouldn't want to have two windows on two sides of their rooms if they lived in a dirty part of Sao Paulo, so Alexander's pattern is already dated; and, if Alexander's patterns suffer this fate, how can we mortals hope to write software patterns that live forever?

    My answer was two-fold:

    • The reason why "Light on Two Sides of Every Room" doesn't work in certain parts of big cities is that it is out of context there -- the patterns that must be present before "Light" applies aren't there. If we wish to apply Alexander's pattern language in a form of diagnosis, a la The Oregon Experiment, we would have to start with patterns far upstream of this one. Indeed, Alexander would probably urge us to start from scratch somewhere else and make a more livable space to begin with!
    • In a discipline as young as building software, we can't expect that we will always understand things as well as we may in the future. So we should write patterns that document what works according to our current understanding of the world and, if we come to understand more or better five years hence, then update the language. Our pattern languages are always works in progress, as the software community discovers its communal knowledge of what gives software the Quality.

    That's my understanding today. If I learn something to make me change my mind tomorrow, I'll post an update. :-)


    Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

    August 17, 2004 1:33 PM

    August 10 -- Language and Fruit

    This is my first trip overseas, and I had not adequately anticipated how I would feel being where language separates me from the world around me. Not understanding airport announcements and signs left me in a state of constant uncertainty. (I even managed to leave my checked bag at baggage claim in Sao Paulo yesterday, so I lived out of my carry-on for the second straight day. That resulted partly from not understanding the language and partly from not understanding how customs overseas work.)

    Language can make us lose confidence in other ways, too. Technical jargon can turn a paper or class session into an intimidating experience. Given that I was in Brazil for a conference on how to write more effectively, this fact stood out to me from my experiences moving around the country, even with a native Brazilian often at my side to help me. I hope that I am able to keep this feeling in my mind this semester as I prepare lectures and talks for my students.

    I was to give my first talk to open the conference today, on writing patterns and pattern languages, but it was first postponed from 3:00 PM to 6:00 PM and then finally to 8:00 AM Tuesday morning. Paulo Borba, Rossana Andrade, and I spent the morning in Fortaleza on the campus of the Federal University of Ceara (UFC), where Rossana teaches. One of Rossana's students defended his master's thesis, and Paulo was on the thesis committee. When the defense ran later than scheduled and we spent more time than expected over lunch, we ended up arriving at the Aquaville Resort outside of Fortaleza after the time the conference was to begin. So we, as the chairs of the conference, postponed the start by half an hour! Things worked this way all week -- the schedule seemed more a helpful suggestion than a rigid expectation. The Brazilian folks seemed comfortable with this from the start. I adapted to this rhythm pretty quickly myself.

    Brazil has a lot of different fruits that we never see up here. Many have juice that is enjoyable only after sweetening, and the tastes of many are less bold than their more famous cousins, but they do add a new twist to the Brazilian diet. I especially like caja juice!


    Posted by Eugene Wallingford | Permalink | Categories: General

    August 17, 2004 1:25 PM

    August 9 -- My First talk in Brazil

    Well, I made it to Brazil. Yesterday was a day that came and went with no end. I ran 18 miles in Bradenton before the sun rose, visited with parents until after lunch, and then went to the airport for an overnight flight that brought me to Recife at lunch time Monday.

    The change from English to Portuguese on the plane from Miami to Sao Paulo made the newness of my surroundings obvious. In Sao Paulo, I went through the dreaded American immigration line. The Brazilian government strives to treat each nation's citizens as that nation treats Brazilian citizens and, with the procedures in place here since 9/11, that means long lines, fewer handling stations, photographs, and fingerprints for Americans entering Brazil. I spent over two and a half hours of a three-hour layover in Sao Paulo going through the immigration line. And I use the word "line" with some hesitation. The South Americans and Europeans in the crowd certainly didn't feel limited by any idea of the linear.

    My first stop was the Federal University of Pernambuco (UFPE), in Recife. My SugarLoafPLoP co-program chair, Paulo Borba, teaches there, and he asked me to give a talk to his department. I debuted the test-driven development talk that I planned to give at the conference. It went well, despite my not having slept more than an hour in the previous 34 hours and our running so late after lunch that on arrival I walked straight into the classroom and gave my talk. The audience was mostly graduate students, many of whom write software in industry. I'd forgotten what it felt like to be in a room with a bunch of grad students trying to fit whatever talk they hear into the context of their own research. I spent considerable time discussing the relation of TDD and refactoring to aspect-oriented programming, JML, and code coverage tools such as Clover. This dry run led me to make a couple of improvements to the talk before delivering it on Friday to the conference audience.

    I was energized by the foment! But then I was ready to crash.


    Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

    August 16, 2004 1:23 PM

    August 7 -- En Route to Florida

    My trip to Florida went well.

    I love to fly out of the Quad Cities Airport. It is small but not too small. The TSA personnel there work quickly and efficiently. AirTran, the discount airline that's the reason I occasionally fly out of the QCs, is fast and efficient, too.

    And now I have another reason to like this airport: It offers free wireless! The service is through MediaCom. It, too, was fast and reliable. It was nice to be able to quickly scan e-mail and check a web site for the slides I was polishing.

    Ubiquitous wireless is the future, but the future is more and more with us today. The shopping mall in Cedar Falls and a couple of local eateries now provide wireless for their customers. How will this change our world? Checking mail and web surfing are the uses most of us make of it now, but they won't be where the real effect lies.

    I find that I'm sometimes more productive in airports and on airplanes than at home. Airports are, in a strange way, less distracting than the other places I work. I don't know anyone around me, the scenery is impersonal and unremarkable, and I'm away from everything but my laptop and my thoughts. All I really need is an extra battery for the laptop, or good enough fortune to find an open and easily accessible outlet in the airport. But even sitting on the concourse floor, under a telephone to be near an outlet, is strangely freeing -- the words flow. Perhaps it's only the change of scenery. I got as much work done yesterday as I had all week in the office.

    Tomorrow, I head to Brazil. I'm a bit on edge.


    Posted by Eugene Wallingford | Permalink | Categories: General

    August 16, 2004 1:20 PM

    Blogging from SugarLoafPLoP

    I had planned to blog while traveling to Florida and Brazil, but I didn't have convenient Internet access for most of the trip. So I will post my entries one at a time now that I'm back, with titles that give the day the entry was written. I look forward to getting back to blogging on some content this week.


    Posted by Eugene Wallingford | Permalink | Categories: General

    August 06, 2004 8:44 AM

    Leaving for SugarLoafPLoP

    I am off to Florida for a couple of days on my way to Brazil for SugarLoafPLoP 2004. I give a talk in Recife on Monday and conference talks on Tuesday and Friday. In between I'm leading a writers workshop and serving as a writing mentor to a new author. I'll be busy! The conference hotel offers Internet access, so I'll try to blog about the goings-on.

    My talks aren't finished yet, so I will also be busy on the plane today and then on the flights to Brazil. Good thing my Tampa-Recife flights net out at 23 hours...

    Running update: I ran my best 6x1200m speed work-out this morning, bringing every repeat in at 4 seconds under target except the last, which I ran 14s faster than my goal time. So, after four slow runs recovering from my nine-day lay-off, I may be back on track! Let's hope that I can find the time and places I need to run while traveling.


    Posted by Eugene Wallingford | Permalink | Categories: General, Running

    August 04, 2004 12:35 PM

    Recent Articles of Interest

    I'm busy today working on conference chair duties, but I wanted to share a couple of ideas I ran across while reading yesterday:

    Laziness, Agility, and the Web

    Ben Hyde points to this story of the web at work.

    In December of 2002, I uploaded a screen-captured table .... I couldn't be bothered to convert it into HTML. Eighteen months on, Adrian Furby did just that. This shows there's some "can I have some more"'s law of the lazyweb or something, and that you should optimise for laziness and early public whining instead of planning ahead.

    I've experienced this on a local scale, with my students. Often, when I post something to a course web page that leaves a natural blank to be filled, a student will do the job -- especially if it allows them to show that they know something about a programming language or a tool.

    There is something agile in this "Can I Have Some More?"'s Law. Instead of waiting to post an idea until it is 100% ready, get something useful out for people to see. The community can often provide useful feedback that improves the idea, and some may even benefit from your incomplete idea now.

    One nice thing about the blog culture is that it lowers the barrier to sharing incomplete ideas and getting feedback from a wider set of readers.

    Tool-Making and Progress

    This article provides a nice reminder of how human progress depends on the creation of better tools. That should make computer scientists both feel good about our place in the world and remember the responsibility we bear. We are first and foremost tool builders, and the rest of the world depends on what we do to do what they do better.

    We also build tools for ourselves. One of the things that has always attracted me to certain software communities -- Lisp, Smalltalk, OOP, agile software -- is the liveliness with which they write and share programs to improve the lives of the people in them. This is true of many other software communities, too. Ruby and Perl come to mind. Perhaps this desire to build better tools for ourselves is one of the hallmarks of software people?


    Posted by Eugene Wallingford | Permalink | Categories: General

    July 29, 2004 8:15 AM

    Upcoming TDD Tutorial

    Today I am working on an invited tutorial that I will be giving at the Fourth Latin American Conference on the Pattern Languages of Programs, affectionately known as SugarLoafPLoP 2004. The conference is August 10 through 13 in Porto das Dunas, a resort town in the north of Brazil.

    I am also serving as program co-chair for the conference. Being a program chair for a PLoP conference can be quite time consuming, but this has not been too bad. My Brazilian co-chair, Paulo Borba, has done a lot of work, especially with submissions written in Portuguese, and the conference chair, Rossana Andrade, has shouldered most of the responsibility for the conference schedule. This has left me to focus on the writers workshop program for authors working in English and with handling basic inquiries from authors and presenters.

    My tutorial is on test-driven development (TDD). I plan first to give some background on extreme programming (XP) and TDD's place in it, and then to introduce the practices, supporting tools, and "theory" of TDD through a pair programming demonstration. Lecturing on how to do something always seems like a bad idea when you can do it and talk about it at the same time. One of the things I've been thinking a lot about is how practicing TDD can both help and hinder the evolution of good design, and how other practices such as refactoring work with TDD to even better effect. I hope for my talk to convey the "rhythm" of TDD and how it changes one's mindset about writing programs.
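    For readers unfamiliar with that rhythm, here is a minimal sketch of one cycle in Python's unittest -- a hypothetical illustration, not the tutorial's actual demonstration code: write one failing test, write just enough code to pass it, then refactor and repeat.

```python
import unittest

# Step 1 (red): write a small failing test first. This test is
# written before Game exists, so it defines what Game must do.
class BowlingGameTest(unittest.TestCase):
    def test_gutter_game(self):
        game = Game()
        for _ in range(20):
            game.roll(0)
        self.assertEqual(game.score(), 0)

# Step 2 (green): write the simplest code that makes the test pass.
class Game:
    def __init__(self):
        self._rolls = []

    def roll(self, pins):
        self._rolls.append(pins)

    def score(self):
        return sum(self._rolls)

# Step 3 (refactor): clean up any duplication, then loop back to
# step 1 with the next test (say, a game of all ones).
```

    The point of the demonstration format is that this loop is easier to feel than to describe: each test is a small, concrete commitment, and the design grows one verified step at a time.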

    On my way to SugarLoafPLoP, I am giving a talk to the Centro de Informática at the Universidade Federal de Pernambuco in Recife. I think that this talk will also be on TDD, with a focus on interaction-based testing with mock objects. I've been learning about this idea from a recent article by Martin Fowler. I could always still talk on elementary patterns, which is the work closest to my heart, but it seems more convenient to mine the same thread of material that's occupying my mind most deeply right now.
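    The idea behind interaction-based testing is to verify the messages an object sends to its collaborators, rather than inspecting state afterward. Here is a sketch echoing the order-and-warehouse example from Fowler's article, written in modern Python with unittest.mock (which, of course, postdates this post):

```python
from unittest.mock import Mock

class Order:
    def __init__(self, item, quantity):
        self.item = item
        self.quantity = quantity
        self.filled = False

    def fill(self, warehouse):
        # The interaction under test: Order should ask the warehouse
        # about inventory before removing any.
        if warehouse.has_inventory(self.item, self.quantity):
            warehouse.remove(self.item, self.quantity)
            self.filled = True

# The mock stands in for a real warehouse and records every message
# sent to it, so the test can assert on the conversation itself.
warehouse = Mock()
warehouse.has_inventory.return_value = True

order = Order("widget", 50)
order.fill(warehouse)

warehouse.has_inventory.assert_called_once_with("widget", 50)
warehouse.remove.assert_called_once_with("widget", 50)
assert order.filled
```

    The state-based alternative would build a real (or stub) warehouse and check its inventory count after the fact; the mock-based version instead pins down the protocol between the two objects -- which is why this style pairs so naturally with the "it's the messages" view of OOP.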

    This is my first trip outside of North America, so I'm looking forward to several new experiences! Now, back to those talks...


    Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

    July 26, 2004 1:11 PM

    Tough Choices

    Today I've been working on OOPSLA 2004. If you use or study object-oriented techniques and have never been to OOPSLA, then you should definitely try to make it to Vancouver in October. OOPSLA is an electric conference, and Vancouver is a great conference town.

    I'm chair of this year's Educators Symposium, which promises to be a lot of fun. Alan Kay -- who this year has won the Turing Award, the Draper Prize, and the Kyoto Prize -- is giving our keynote address. We have a great line-up of papers and activities, too. Most of my work as chair is done, as the program is mostly set. With the help of my program committee, I have a few panels and activities to finalize yet. The last big task I face is the one I'm doing now: choosing among the many deserving applicants for Educator Scholarships.

    OOPSLA is a conference of SIGPLAN. Each year, SIGPLAN generously supports a number of scholarships to OOPSLA for educators, so that these folks can learn about the latest in OO technology and use that to improve the teaching of OOP in our colleges and universities. I think that the educators add to the conference, too, by bringing teaching ideas that help trainers and by bringing an excitement to the professional population.

    In OOPSLA's salad days of the late 1990s, SIGPLAN was able to offer a large scholarship fund. In recent years, OOPSLA has not been as profitable. Fortunately, SIGPLAN continues to support the scholarships, but with an understandably smaller pot of money.

    Low supply makes awarding the scholarships even harder than usual. The demand for scholarships hasn't gone down, and I find that nearly all of the applicants are deserving. This problem is tough enough, but it is complicated by a couple of other factors:

    • Through my own work in the OOPSLA, patterns, and SIGCSE communities, I know many of the applicants, or their colleagues.
    • Several of my program committee members have applied. That's tough both because of the previous bullet and because it removes them from my pool of reviewers.

    The program committee has implemented a couple of ways to make the selection process more objective. But no set of rules can shield me from the selection process, as ultimately I have to sign off on the awards.

    As one program committee member told me in e-mail, "That's why they pay you the big bucks." I laughed then, but I don't feel like laughing right now.


    Posted by Eugene Wallingford | Permalink | Categories: General

    July 25, 2004 3:28 PM

    An End to My Innocence

    Someone stole the stereo out of my car this morning before church -- while it was parked in our own garage. My wife and daughter had gone out earlier in the morning to deliver newspapers, and they left the garage door for our van open because we'd soon be heading out again. We live in Cedar Falls, Iowa, a small university town known for being friendly and crime-free in a state known for being friendly and crime-free. It doesn't feel that way right now.

    I've never been a victim of a theft and, as small as this is, it's unsettling. Fortunately, the thief (1) knew what he was doing and didn't really damage the dashboard and (2) only wanted the stereo and so didn't take anything else from the garage or car. And my wife and daughters were not bothered. So I'll count my blessings.

    We've always been perhaps a bit too trusting in leaving the garage door up or the side door unlocked, but that will change. I don't think it's an overreaction to start locking doors more conscientiously. But I wish I didn't feel that I had to.

    Not having been a crime victim before, I don't think I ever really got it when other crime victims spoke of feeling "violated". I read that phrase in the newspaper almost daily and was unmoved. Sometimes, in my mind, I probably wondered if those folks weren't just a little too sensitive, maybe even whiners. I've learned the lesson now. As I noted above, the crime I've experienced is relatively tame and wholly impersonal. It compares not at all to many other crimes against one's person. I will be more compassionate in my response to others' misfortunes from now on.

    On a less emotional note, this theft has reminded me just how dependent we are now on electronics. In removing my stereo, the thief disabled some part of my Taurus's electronic system, resetting a little computer somewhere. As a result, the speedometer doesn't work and I have no way to control the fans or air conditioner. Man, it gets hot in a car fast, even in Iowa! Let's roll the windows down then -- wait, we can't do that either. They have only electronic controls. And then, when I get home, I go to lock the car doors with the automatic switch on my keychain, but it doesn't work either. Sigh.

    At least I can lock and unlock the doors manually. I'll be careful to lock them wherever I go now.


    Posted by Eugene Wallingford | Permalink | Categories: General

    July 24, 2004 4:26 PM

    Writing a Novel in the Agile Way

    This weekend, I re-read Jon Hassler's My Staggerford Journal. Hassler is a novelist of small-town Minnesota life, and My Staggerford Journal is the diary-like story of the writing of his first novel, Staggerford, on a sabbatical from Brainerd Community College. I first read it in the months before my own sabbatical of Fall 2002, in hopes of some energy and inspiration. The results of my sabbatical disappointed me, but this journal did not. I heartily recommend his other novels to readers who like the stories of small-town Midwesterners coming to grips with the changes of life.

    One paragraph jumped out at me from Hassler's description of what it was like to give birth to the novel he'd wanted -- needed -- to write for twenty years:

    I enjoy working on a second draft better than a first. If I had my choice I would write nothing but second (or later) drafts. But to get to that sometimes pedantic, sometimes exhilarating stage of perfection, polishing, filling in holes, rechanneling streams, etc., one has to struggle through the frightening first draft, create the damn thing through to the end, live it night and day and not know where it's going, or if you do know where it's going, then you don't know if you have the skill or stamina to get it there. It won't get there on its own.

    Those feelings sound familiar to this programmer.

    Hassler's discussions of rewriting brought to mind redesign and refactoring. Of course, Hassler wasn't just refactoring his novel. In the second and third drafts, he made substantive changes to the story being told and to the effect it elicits from his readers. But much of his rewriting sounded like refactoring: changing sentences here and there, even rewriting whole chapters to bring out the real story that the earlier version meant to tell. Hassler certainly writes of the experience as one who was "listening to the code".

    The pain of writing the first draft sounds like a rather un-agile way to develop a novel: creating a whole without knowing where he or the story are going, living in a constant state of uncertainty and discomfort. I have known that feeling as a programmer, and I try to teach my students how to avoid it -- indeed, that it is okay to avoid it.

    I wonder what it would be like to write a novel using an "agile" method? Can we create art in quite that way? I'm not an artist, but somehow I think painting may work that way more than writing a novel.

    Or maybe novelists already move in an agile way, with the first draft being reworked in bits and pieces as they go, and later revisions just continuing the evolution of the work. Maybe what distinguishes Hassler's first draft from his later drafts is more in his mind than in the work?


    Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

    July 22, 2004 10:07 AM

    Ego Run Wild

    Like many folks, I occasionally "vanity surf" on my name over at Google. Yesterday, I ran across this article on vanity searching for just one's first name. Of course, I couldn't resist...

    The first reference to me comes at link 72. So much for pumping up my ego.

    Looking a little closer, many references are to the city of Eugene, Oregon, and organizations in its penumbra, such as its airport, Chamber of Commerce, and newspapers. If we narrow our focus down to individuals, I come in 24th. Most of the folks ahead of me are well-known writers, artists, and astronauts. I was surprised at the number of computer scientists named Eugene -- three or four folks ahead of me on the list are computer scientists, including the department chair at the University of Toronto and the best known Eugene in CS, Gene Spafford.

    If I ever make contributions as important as Gene Spafford's, I can afford to worry about my Google ranking. Until then, I should just get back to work.


    Posted by Eugene Wallingford | Permalink | Categories: General

    July 17, 2004 2:40 PM

    One Week and Counting...

    Well, I made it through a successful first week of blogging. I am not certain that I can maintain a post-a-day pace, but I think I can find a comfortable groove. Certainly my entries this week have been more like essays than just blogging, and doing that daily is untenable -- especially after school starts for the fall. But I hope that you find the articles useful. I have found writing them to be useful to me!

    Cool link of the day: Yesterday, Clive Thompson blogged on The Shape of a Song, a web app that creates images from midi files by finding patterns in the song and representing them with arcs. Not only is the app -- a Java applet -- available on-line, but the artist has a large repertoire of songs already available, extendible by his user community. Enjoy!


    Posted by Eugene Wallingford | Permalink | Categories: General

    July 09, 2004 12:37 PM

    Joining the Fray

    Welcome to my blog. I have been enjoying many different blogs for a couple of years, most notably from the agile software and OOP communities. All the while, I've been thinking I should give it a try myself.

    The hard drive of every computer I've ever had is littered with little snippets and observations that I would have liked to make to someone, anyone, at the time. As many of you do, I often write to clarify what's happening in my mind, and to learn something new. The idea of a weblog opens new avenues for sharing such thoughts, and for learning both from the writing and from whomever may read it.

    In this blog, I'll chronicle ideas from my professional life as an academic, teacher, and software developer. I have a strong interest in how people make things, in particular computer programs. One of the topics I most hope to write about is the elementary patterns of programs and how they enter into the process of learning how to build software.

    I'm also a human being (shh... don't tell my students!) and will share some of my thoughts as I go through life these days as a runner, a student of piano, a father and husband, and a regular guy in a fun but complex world.

    I hope you find this blog as useful as I hope it will be to me.


    Posted by Eugene Wallingford | Permalink | Categories: General