Does it have a name?
Of course, Schadenfreude itself doesn't really have an English equivalent. It is a German word that means, roughly, delight in another person's misfortune. (However, I see that Wikipedia offers one, the 300+-year-old, apparently abandoned "epicaricacy".)
Last semester, a colleague described what struck me as the complement of Schadenfreude. He reported that one of our close friends, a retired professor here, expressed strong unhappiness, even distaste, for faculty who succeed in publishing academic papers. This matters to him because he is one of those folks. His friend came to the university in a different era, when we were a teachers college without any pretension to being a comprehensive university. The new faculty who publish and talk about their research, she said, are "just showing off". Their success causes her pain, even when they don't brag about it.
This is not the opposite of Schadenfreude. That is happiness in another's good fortune, which Wikipedia tells us matches the Buddhist concept of mudita. What our friend feels inverts both the emotion and the trigger.
I don't think that her condition corresponds to envy. When someone is envious, they want what someone else has. Our friend doesn't want what the others have; she is saddened, almost angered, that others have it. No one should.
The closest concept I can think of is "sour grapes", a metaphor from one of Aesop's beloved fables. But in this story, the fox does want the grapes, and professes to despise them only when he can't reach them. I believe that our friend really doesn't want the success of research; she earnestly believes that our mission is to teach, not publish, and that energy spent doing research is energy misspent. And that makes her feel bad.
When my colleague told me his story, I joked that the name for this condition should be freudenschade. I proposed this even though I know a little German and know how nonsensical it is. But it seemed fun. Sadly, I wasn't the first person to coin the word... Google tells me that at least one other person has. You may be tempted to say that I feel freudenschade that someone else coined the term "freudenschade" first, but I don't. What I feel is envy!
The particular story that led to my discussion is almost beside the point. I'm on a mission that has moved beyond it. I am not aware of a German word for the complement of Schadenfreude. Nor am I aware of an English word for it. Is there a word for it anywhere, in English, German, or some other language?
I'm curious... Perhaps the Lazyweb can help me.
In case you think me odd in my recent interest in the idea of computer science for all students, even non-majors, check out an interview with Andries van Dam in the current issue of The Chronicle of Higher Education on-line:
Q: What do you hope to bring to computer-science education?
A: We'll try to figure out "computing in the broad sense" -- not just computer-science education, but computing education in other fields as well. What should high-school students know about computation? What should college students know about computation? I think these are all questions we're going to ask.
van Dam is a CS professor at Brown University and the current chair of the Computing Research Association's education committee. I look forward to seeing what the CRA can help the discipline accomplish in this space.
Do keep in mind that when I say things like "computer science for all students", I mean this for some yet-undetermined value of "computer science". I certainly don't think that current CS curricula or even intro courses are suited for helping all university or high school students learn the power of computing. (Heck, I'm not even sure that most of our intro courses are the best way to teach our majors.)
That's one of the concerns that I have with the proposed Computing and the Arts major at Yale that I mentioned last time. It's not at all clear to me that a combination of courses from the existing CS and art majors is what is really needed to educate a new audience of intellectuals or professionals empowered to use computation in a new way. Then again, I do not know what such a new major or its courses might look like, so this experiment may be a good way to get started. But the faculty there -- and here, and on the CRA education committee, and everywhere else -- should be on the look-out for how we can best prepare an educated populace, one that is computation-savvy for a world in which computation is everywhere.
I've written so much about scientists as programmers, I'm a little disappointed that I haven't made time to write more about artists as programmers in response to Ira Greenberg's visit last month. The reason is probably two-pronged. First, I have less frequent interaction with artists, especially artists with this sort of mentality. This month, I did give a talk with an artistic connection to an audience of art students, but even that wasn't enough to prime the pump. That can be attributed to the second prong, which is teaching five-week courses in languages I have never taught before -- one of which, PHP, I've never even done much programming in. I've been busy preparing course materials and learning.
Before I lose all track of the artists-as-programmers thread for now, let me say a few things that I still have in waiting.
Processing is really just a Java IDE. I don't mean that in a dismissive sense; it's very useful and provides some neat tools to hide the details of Java -- including classes and the dread "public static void main" -- from programmers who don't care. But there is not all that much to it in a technical sense, which means that CS folks don't need to obsess about whether they are using it or not.
For example, you can do many of the same things in JES, the Jython environment created for Guzdial's media computation stuff. When I taught media computation using Erickson and Guzdial's Java materials, I had my students implement an interpreter for the simplest of graphics languages and then asked them to show off their program with a piece of art produced with the language. One result was the image to the right, produced by freshman Keller McBride using a program processed by his own interpreter.
During his talk, Greenberg mentioned that he had a different take on the idea of web "usability". Later I commented that I was glad he had said that, because I found that his website was a little bit funky. His response was interesting in a way that meshes with some of the things I occasionally say about computing as a new paradigm for expressing ideas. (This is not an original idea, of course; Alan Kay has been trying to help us understand this for forty years.)
Greenberg doesn't see computation only as an extension of the engineering metaphor that has defined computing in the age of electronics; he sees it as the "dawn of a new age". When we think of computation in the engineering context, issues such as usability and ergonomics become a natural focus. But in this new age, computing can mean and be something different:
Where I want my toaster to "disappear" and simply render perfectly cooked bread, I don't want that same experience when I compute--especially since I don't often have an initial goal/purpose.
He mentioned, too, that his ideas are not completely settled in this area, but I don't think that anyone has a complete handle on what the new age of computing really means. It sounds as if his ideas are as well formed as most anyone's, and I'm excited when I hear what non-CS people think in this space.
Finally, when it comes to teaching art and computer science together, some schools are already working in that direction. For example, faculty at Yale recently announced that they are putting together a major in computing and the arts. I am not sure what to think about their proposal, which aims to be "rigorous" by requiring students to take existing courses in the arts and computer science. There are no courses created especially for the major. That is probably a good idea for some audiences, but what about artists who don't want a full CS-specific experience? Do they need the same technical depth as your average CS student? Somehow, I don't think so. A new kind of discipline may well require a new kind of major. But it's neat that someone is taking steps in this direction. We will probably learn something useful from their experience.
While catching up on some work at the office yesterday -- a rare Saturday indeed -- I listened to Peter Turchi's OOPSLA 2007 keynote address, available from the conference podcast page. Turchi is a writer with whom conference chair Richard Gabriel studied while pursuing his MFA at Warren Wilson College. I would not put this talk in the same class as Robert Hass's OOPSLA 2005 keynote, but perhaps that has more to do with my listening to an audio recording of it and not being there in the moment. Still, I found it worth listening to, as Turchi encouraged us to "get lost" when we want to create. We usually think of getting lost as something that happens to us when we are trying to get somewhere else. That makes getting lost something we wish wouldn't happen at all. But when we get lost in a new land inside our minds, we discover something new that we could not have seen before, at least not in the same way.
As I listened, I heard three ideas that captured much of the essence of Turchi's keynote. First was that we should strive to avoid preconception. This can be tough to do, because ultimately it means that we must work without knowing what is good or bad! The notions of good and bad are themselves preconceptions. They are valuable to scientists and engineers as they polish up a solution, but they often are impediments to discovering or creating a solution in the first place.
Second was the warning that a failure to get lost is a failure of imagination. When we work deeply in an area for a while, we sometimes feel as if we can't see anything new and creative because we know and understand the landscape so well. We have become "experts", which isn't always as dandy a status as it may seem. It limits what we see. In such times, we need to step off the easy path and exercise our imaginations in a new way. What must I do in order to see something new?
This leads to the third theme I pulled from Turchi's talk: getting lost takes work and preparation. When we get stuck, we have to work to imagine our way out of the rut. For the creative person, though, it's about more than getting out of a rut. The creative person needs to get lost in a new place all the time, in order to see something new. For many of us, getting lost may seem like something that just happens, but the person who wants to get lost has to prepare for it.
Turchi mentioned Robert Louis Stevenson as someone with a particular appreciation for "the happy accident that planning can produce". But artists are not the only folks who benefit from these happy accidents or who should work to produce the conditions in which they can occur. Scientific research operates on a similar plane. I am reminded again of Robert Root-Bernstein's ideas for actively engaging the unexpected. Writers can't leave getting lost to chance, and neither can scientists.
Turchi comes from the world of writing, not the world of science. Do his ideas apply to the computer scientist's form of writing, programming? I think so. A couple of years ago, I described a structured form of getting lost called air-drop programming, which adventurous programmers use to learn a legacy code base. One can use the same idea to learn a new framework or API, or even to learn a new programming language. Cut all ties to the familiar, jump right in, and see what you learn!
What about teaching? Yes. A colleague stopped by my office late last week to describe a great day of class in which he had covered almost none of what he had planned. A student had asked a question whose answer led to another, and then another, and pretty soon the class was deep in a discussion that was as valuable as, or more valuable than, the planned activities. My colleague couldn't have planned this unexpectedly good discussion, but his and the class's work put them in a position where it could happen. Of course, unexpected exploration takes time... When will they cover all the material of the course? I suspect the students will be just fine as they make adjustments downstream this semester.
What about running? Well, of course. The topic of air-drop programming came up during a conversation about a general tourist pattern for learning a new town. Running in a new town is a great way to learn the lay of the land. Sometimes I have to work not to remember landmarks along the way, so that I can see new things on my way back to the hotel. As I wrote after a glorious morning run at ChiliPLoP three years ago, sometimes you run to get from Point A to Point B; sometimes, you should just run. That applies to your hometown, too. I once read about an elite women's runner who recommended being dropped off far from your usual running routes and working your way back home through unfamiliar streets and terrain. I've done something like this myself, though not often enough, and it is a great way to revitalize my running whenever the trails start to look like the same old same old.
It seems that getting lost is a universal pattern, which made it a perfect topic for an OOPSLA keynote talk.
My dean recently distributed copies of "Behind Door No. 2: A New Paradigm for Undergraduate Science Education", from the 2006 annual report of Research Corporation. It says many of the things we have all heard about revitalizing science education, summarizing some of the challenges and ideas that people have tried. The report speaks in terms of the traditional sciences, but most of what it says applies well to computer science.
I don't think I learned all that much new from this report, but it was nice to see a relatively concise summary of these issues. What I enjoyed most were some of the examples and quotes from respected science researchers, such as physics Nobel laureate Carl Wieman. One of the challenges that universities face in re-forming how they teach CS, math, and science is that research faculty are often resistant to changing how they teach or think about their classrooms. (Remember, we have material to cover.) These faculty are often tenured full professors who wield significant power in the department over curriculum and program content.
At a comprehensive university such as mine, the problem can be accentuated by the fact that even the research faculty teach a full load of undergraduate courses! At the bigger research schools, there are often faculty and instructors who focus almost entirely on undergraduate instruction and especially the courses in the undergraduate core and for non-majors. The research faculty, who may not place too much confidence in "all that educational mumbo-jumbo", don't have as much contact with undergrads and non-majors.
I also enjoyed some of the passages that close the article. First, Bruce Alberts suggests that we in the universities worry about the beam in our own eye:
I used to blame all the K-12 people for everything, but I think we [in higher education] need to take a lot of responsibility. ... K-12 teachers who teach science learned it first from science courses in college. You really want to be able to start with school teachers who already understand good science teaching, ...
Leon Lederman points to the central role that science plays in the modern world:
Once upon a time the knowledge of Latin and Greek was essential to being educated, but that's no longer true. Everywhere you look in modern society in the 21st century, science plays a role that's crucial. It's hard to think of any policy decision on the national level that doesn't have some important scientific criteria that should weigh in on the decisions you make.
He probably wasn't thinking of computer science, but when I think such thoughts I surely am.
Finally, Dudley Herschbach reminds us that the need for better science education is grounded in more than just the need for economic development. We owe our students and citizens more:
So often the science education issue is put in terms of workforce needs and competitiveness. Of course, that's a factor. But for me it's even more fundamental. How can you have a democracy if you don't have literacy? Without scientific literacy, citizens don't know what to believe.... It is so sad that in the world's richest country, a country that prides itself on being a leader in science and technology, we have a large fraction of the population that might as well live in the 19th, 18th or 17th century. They aren't getting to live in the 21st century except in the superficial way of benefiting from all the gadgets. But they don't have any sense of the human adventure...
That is an interesting stance: much of our population doesn't live in the 21st century, because they don't understand the science that defines their world.
Yesterday, I represented our department at a recruitment open house on campus. One mom pulled her high-school senior over to the table where computer science and chemistry stood and asked him, "Have you considered one of these majors?" He said, "I don't like science." Too many students graduate high school feeling that way, and it is a tragedy. It's bad for the future of technology; it's bad for the future of our economy. And they are missing out on the world they live in. I tried to share the thrill, but I don't think I'll see him in class next fall.
How is this for a headline?
2004 Olympian DAN BROWNE Sets His Sights on Eugene
I'm making a dash for the finish line... Can I hold him off?
This has been my first week without running since the middle of last year, due to a little flu or cold bug I've picked up. It's been an icy enough week that I miss the run less than I might, but I am itching to hit the road. I hold some hope for tomorrow morning.
... though I can't in good conscience say, "for I know not what I do."
1. Write a script named simple-interest.php that defines a function to compute the simple interest on an amount, given the amount, annual interest rate, and number of months. Your script should apply the function to its command-line arguments, which are those same values.
The rest of my PHP class's first homework is more reasonable, with a couple of problems repeated from the bash class's first assignment as a way for students to get a sense of trade-offs between shell programming and scripting in a more general-purpose language.
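For what it's worth, here is a minimal bash sketch of the same simple-interest exercise, my own version rather than the actual assignment, just to make the trade-off concrete. The script name and function name are my own choices; bash has only integer arithmetic, so the sketch delegates the math to awk.

```shell
#!/usr/bin/env bash
# simple-interest.sh: a hypothetical bash counterpart to the PHP exercise.
# usage: simple-interest.sh AMOUNT ANNUAL_RATE_PERCENT MONTHS

simple_interest() {
    local amount=$1 rate=$2 months=$3
    # interest = amount * (rate / 100) * (months / 12)
    # bash arithmetic is integer-only, so delegate to awk for the math
    awk -v a="$amount" -v r="$rate" -v m="$months" \
        'BEGIN { printf "%.2f\n", a * (r / 100) * (m / 12) }'
}

simple_interest "$1" "$2" "$3"
```

Even this tiny problem surfaces a trade-off students can see for themselves: the PHP version gets real numbers for free, while the shell version has to borrow them from another tool.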
Still, I had to squeeze my eyes shut tight to hit the key that published this assignment. I keep telling myself that this is just an ice-breaking assignment for students who have never written any PHP before, or learned how to access command-line arguments, or convert strings to integers. That such a simple, context-free task is a nice way for them to succeed on their first effort. That our future assignments will be engaging, challenging, worthwhile. But... Ick.
The first time I teach a course, there always seem to be clunkers like this. Starting from scratch, relying on textbooks for inspiration, and working under time pressure all work against my goal of making everything students do in the class worth their time and energy. I suppose that problems such as this one are my opportunities to improve next time out.
My despair notwithstanding, I suspect that many students are happy enough to have at least one problem that is a gift, however uninteresting it may be. Maybe I can find solace in that while I'm working on exercises for my next problem set.
... when Charlie Eppes invokes your research area on Numb3rs. In the episode I saw last Friday, the team used a recommender system, among other snazzy techie glitz, to track down a Robin Hood who was robbing from the dishonestly rich and giving to the poor through a collection of charities. A colleague of mine does work in recommender systems and collaborative filtering, so I thought of him immediately. His kind of work has entered the vernacular now.
I don't recall the Numb3rs crew ever referring to knowledge-based systems or task-specific architectures, which was my area in the old days. Nor do I remember any references to design patterns or to programming language topics, which is where I have spent my time in the last decade or so. Should I feel left out?
But Charlie and Amita did use the idea of steganography in an episode two years ago, to find a pornographic image hidden inside an ordinary image. I have given talks on steganography on campus occasionally in the last couple of years. The first time was at a conference on camouflage, and most recently I spoke to a graphic design class, earlier this month. (My next engagement is at UNI's Saturday Science Showcase, a public outreach lecture series my college runs in the spring.) So I feel like at least some of my intellectual work has been validated.
Coincidentally, I usually bill my talks on this topic as "Numb3rs Meets The Da Vinci Code: Information Masquerading as Art", and one of the demonstrations I do is to hide an image of Numb3rs guys in a digitized version of the Mona Lisa. The talk is a lot of fun for me, but I wonder if college kids these days pay much attention to network television, let alone da Vinci's art.
Lest you think that only we nth-tier researchers care to have our areas trumpeted in the pop world, even the great ones can draw such pleasure. Last spring, Grady Booch gave a keynote address at SIGCSE. As a part of his opening, he played for us a clip from a TV show that had brightened his day, because it mentioned, among other snazzy techie glitz, the Unified Modeling Language he had helped to create. Oh, and that video clip came from... Numb3rs!
Are all the open jobs in computing that we keep hearing about going unfilled?
Actually -- they're not. Companies do fill those jobs. They fill them with less expensive workers, without computing degrees, and then train them to program.
Mark Guzdial is concerned that some American CEOs and legislators are unconcerned -- "So? Where's the problem?" -- and wonders how we make the case that degrees in CS matter.
I wonder if the US would be better off if we addressed a shortage of medical doctors by starting with less expensive workers, without medical degrees, and then trained them to practice medicine. We currently do face a shortage of medical professionals willing to practice in rural and underprivileged areas.
The analogy is not a perfect one, of course. A fair amount of the software we produce in the world is life-critical, but a lot is not. But I'm not sure whether we want to live in a world where our financial, commercial, communication, educational, and entertainment systems depend on software to run, and that software is written by folks with a shallow understanding of software and computing more generally.
Maybe an analogy to the law or education is more on-point. For example, would the US be better off if we addressed a shortage of lawyers or teachers by starting with less expensive workers, without degrees in those areas, and then trained them? A shortage of lawyers -- ha! But there is indeed a critical shortage of teachers in many disciplines looming in the near future, especially in math and science. This might lead to an interesting conversation, because many folks advocate loosening the restrictions on professional training for folks who teach in our K-12 classrooms.
I do not mean to say that folks who are trained "on the job" to write software necessarily have a shallow understanding of software or programming. Much valuable learning occurs on the job, and there are many folks who believe strongly in a craftsmanship approach to developing developers. My friend and colleague Ken Auer built his company on the model of software apprenticeship. I think that our university system should adopt more of a project-based and apprenticeship-based approach to educating software developers. But I wonder about the practicality of a system that develops all of its programmers on the job. Maybe my view is colored by self-preservation, but I think there is an important role for university computing education.
Speaking of practicality, perhaps the best way to speak to the CEOs and legislators who doubt the value of academic CS degrees is in their language of supply and productivity. First, decentralized apprenticeship programs are probably how people really became programmers, but they operate on a small scale. A university program is able to operate on a slightly larger scale, producing more folks who are ready for apprenticeship in industry sooner than industry can grow them from scratch. Second, the best-prepared folks coming out of university programs are much more productive than the folks being retrained, at least while the brightest trainees catch up. That lack of productivity is at best an opportunity cost, and at worst an invitation for another company to eat your lunch.
Of course, I also think that in the future more and more programmers will be scientists and engineers who have learned how to program. I'm inclined to think that these folks and the software world will be better off being educated by folks with a deep understanding of computing. Artists, too. And not only for immediately obvious economic reasons.
I don't usually play meme games on my blog, but as I am winding down for the week I ran across this one on Brian Marick's blog: grab the nearest book, open it to page 123, go to the 5th sentence, and type up the three sentences beginning there.
With my mind worn out from a week in which I caught something worse than a meme, I fell prey and swung my arm around. The nearest book was Beautiful Code. Technically, I suppose that a stack of PHP textbooks is a couple of inches closer to me, but does anyone really want to know what is on Page 123 of any of them?
Here is the output:
The resultant index (which was called iSrc in FilterMethodCS) might be outside the bounds of the array. The following code loads an integer 0 on the stack and branches if iSrc is less than 0, effectively popping both operands from the stack. This is a partial equivalent of the if statement conditional in line 19 of Example 8-2:
Okay, that may not be much more interesting than what a PHP book might have to say, at least out of context. I'm a compiler junkie, though, and I was happy to find a compiler-style chapter in the book. So I turned to the beginning of the chapter, which turns out to be "On-the-Fly Code Generation for Image Processing" by Charles Petzold. I must admit that this sounds pretty interesting to me. The chapter opens with something that may be of interest to others, too:
Among the pearls of wisdom and wackiness chronicled in Steve Levy's classic "Hackers: Heroes of the Computer Revolution" (Doubleday), my favorite is this one by Bill Gosper, who once said, "Data is just a dumb kind of programming."
Petzold then goes on to discuss the interplay between code and data, which is something I've written about as one of the big ideas computer science has taught the world.
What a nice way for me to end the week. Now I have a new something to read over the weekend. Of course, I should probably spend most of my time with those PHP textbooks; that language is iteration 2 in my course this semester. But I've avoided "real work" for a lot less in the past.
That is the title of a blog post that I planned to write five or six weeks ago. Here it is over a month later, and the course just ended. Well, it ended, and it begins again on Tuesday. So now I am thinking agile thoughts as I think back over my course, and still thinking agile thoughts as I prepare for the course. Let me explain.
810:151 is a different sort of course for us. We try to expose students to several different programming languages in the course of their undergraduate study. Even so, it is common for students to graduate thinking, "I wish I'd learned X." Sometimes X is a relatively new language, such as Scala or Groovy. Sometimes it's a language that is now more mainstream but has not yet made it into one of our courses, such as Ruby. Sometimes it is even a language we do emphasize in a course, such as Scheme, but in a course they didn't have an opportunity to take. We always have told students who express this wish that they should be well-equipped to learn a new language on their own, and they are. But...
While taking a full load of courses and working part-time (or taking a part-time load of courses and working full-time), it is often hard for students to set aside time to learn something completely optional. People talk about "the real world" as if it is tougher than being in school, but students face a lot of competing demands for their time. Besides, isn't it often more fun to learn something from an expert who can help you learn the tricks of the trade faster and miss a few of the potholes that lie along the way?
I sometimes think of this course, which we call "Topics in Programming Languages", as a "make time" course for students who want to learn a new language, or perhaps a broader topic related to language, but who want or need the incentive that a regular course, assigned readings, and graded work provides. The support provided by the prof's guidance also is a good safety net for the less seasoned and less confident. For these folks, one of the desired outcomes is for them to realize, hey, I really can learn a language on my own.
We usually offer each section of 810:151 as a 1-credit course. The content reason is that each course has the relatively straightforward purpose of teaching a single language, without a lot of fluff. The practical reason is that we can offer three 1-credit courses in place of a single 3-credit course. Rather than meet one hour per week for the entire semester, each course can meet 3 hours per week for 5 weeks. This works nicely for students who want to take all three, as together they look and feel like a regular course. It also works nicely for students who choose to take only one or two of the courses, as they need not commit an entire semester's worth of attention to them.
This is my first semester assigned (by me) to teach this odd three-headed course. The topics this semester are Unix shell programming in bash, PHP, and Ruby.
I've been thinking of the three courses as three 5-week iterations. Though the topics of the three courses are different, they share a lot in terms of being focused on learning a language in five weeks. How much material can I cover in a course? How can students best use their time? How can I best evaluate their work and provide feedback? Teaching three iterations of a similar course in one semester is so much better for me when it comes to taking what I learn and trying to improve the next offering. With an ordinary course taught once a year, I would have to wait until next fall to begin implementing improvements; with an ordinary course in a three-course rotation, I would have to wait until Fall 2009!
I opted to dispense with all examinations and evaluate students solely in terms of the bash scripts they wrote. The goal of the course is for students to learn how to program in bash, so that is where I wanted the students' attention to be. One side effect of this decision is that the course is not really over yet; students will work on their final problem set in the coming week, and I'll have to grade it next Friday. The problem sets have consisted mostly of small-ish scripts that exercise the features of bash as we encounter them. We did have one larger task that students solved in three parts over the course of the semester, a processor for a Markdown-like mark-up language that produces HTML. This project scratched one of my own itches, as I like to use simple text-based, e-mail-friendly mark-up, and now I have a simple bash script that does the job!
One thing I did not do this semester that I thought I might, and which perhaps I should have, is to work through a non-trivial shell script or two with the students. I had thought that the fifth week would be devoted to examining and extending larger scripts, but I kept uncovering more techniques and ideas that I wanted them to see. Perhaps I could use a real script as a primary source for learning the many features of bash, instead of building their skills from the bottom up. That is how many of them have come to know what little they know about shell scripting: by confronting a non-trivial script for building or configuring an open-source application that interests them. To be honest, though, I think that the bottom-up style we used this semester may prepare them better for digging into a more complex script than starting with a large program would. This is one of the issues I hope to gain some insight into from student feedback on the course.
Making these "short iterations" more interesting is the fact that some students will be in all three of the iterations, but there will be a significant turnover in the class rosters. The client base evolves, but there should be enough overlap that I can get some comparative feedback as I try to implement improvements.
I tried to follow a few other agile principles as I started teaching this new prep. I tend to start each new course with a template from my past courses, from the way I organize sessions and lecture notes to the look-and-feel of the web site. This semester, I tried to maintain a YAGNI mindset: start as simple as I can, and add new elements only as I use them -- not when I think I need them tomorrow. By and large I have succeeded in this regard. My web site is bare-bones in comparison to my past sites, and lecture notes are plain text records of in-class activities and short messages to remind me and the students of what we discussed. I saved a lot of time not trying to produce attractive and complete lecture notes in HTML. Maybe some day, but this time around I just didn't need them.
One agile practice that I didn't think to encourage soon enough was unit testing. Shame on me. Some students suffered far more than I did from this oversight. Many did a substandard job of testing their scripts, in part, I think, because they were biting off too much of a task at once. Unix pipelines are almost perfectly suited to unit testing, as one can test the heck out of each stage in isolation, growing the pipeline one stage at a time until the task is solved. The fact that each component reads from stdin and writes to stdout means that a new stage can be tested independently of the stages that precede it, before we add it to the end of the pipeline.
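To make the idea concrete, here is a hypothetical pipeline grown one tested stage at a time. The task (finding the three most common words in some text) and the function names are mine, invented for illustration; the point is that each stage is an ordinary stdin-to-stdout filter, so each can be exercised on hand-made input before being snapped onto the end:

```shell
# A hypothetical example of growing a pipeline stage by stage.
# Goal: report the three most common words in the input.

# Stage 1: one word per line. Test it alone: printf 'a b\n' | words
words() { tr -s ' ' '\n'; }

# Stage 2: count occurrences of each word.
# Test it alone on hand-made input: printf 'a\na\nb\n' | counts
counts() { sort | uniq -c; }

# Stage 3: keep the three biggest counts.
top3() { sort -rn | head -3; }

# Only after each stage checks out do we compose the full pipeline.
top_words() { words | counts | top3; }
```

Usage would be `top_words < essay.txt` for any text file `essay.txt`.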
For whatever reason, it didn't occur to me that there would exist a shUnit. It's too late for me to use it this semester, but I'll be sure to put phpUnit to good use in the next five weeks. And I've always known that I would use a unit testing framework such as this one for Ruby. Heck, we may even roll our own as we learn the language!
I've really enjoyed teaching a course with the Unix philosophy at the forefront of our minds. Simple components, the universal interface of plain text streams, and a mandate to make tools work together -- the result is an amazingly "agile" programming environment. The best way to help students see the value of agile practices is to let them live in an environment where that is natural, and let them feel the difference from the programming environments in which they find themselves at other times. I just hope that my course did the mindset justice.
The tool-builder philosophy that pervaded this course reminded me of this passage from Jason Marshall's Something to Say:
There's an old saying, "A good craftsman never blames his tools." Many people take this to mean "Don't make excuses," or even, "Real men don't whine when their tools break." But I take it to mean, "A good craftsperson does not abide inferior tools."
A good craftsman never blames his tools, because if his tools are blameworthy, he finds better tools. I associate this idea more directly with the pragmatic programmers than with the agile community, but it seems woven into the fabric of the agile approaches. The Agile Manifesto declares that we value "individuals and interactions over processes and tools", but I do not take this to mean that tools don't matter. I think it means that we should use tools (and processes) that empower us to focus our energy on people and interactions. We should let programs do what they do best so that we programmers can do what we do best. That's why we have unit testing frameworks, refactoring tools, automatic build tools, and the like. It's also why Unix is far more human-affirming than it is sometimes given credit for.
As I told students to close the lecture notes for this course: Don't settle for inferior tools. You are a computer programmer. Make the tools that make you better.
Leave it to George Costanza. In the episode of Seinfeld titled The Masseuse, George finally has a great relationship with a wonderful woman. Inexplicably, she likes everything about him. Yet all he can think about is Jerry's current girlfriend, a masseuse who can't stand George. Rather than turn his attention to his own loving partner, he makes such a strident effort to get the masseuse to like him that he drives her even further away -- and loses his own girl, who can't understand George's obsession. But it's really quite simple: George wants everyone to like him.
I understand that not everyone will like me. But deep inside it's easy to lose sight of that fact in the course of daily interactions. When I became department head, one of my goals was to treat everyone fairly, to be open and honest so that each member of the faculty could trust that I was giving him or her a fair hearing and doing the best I could to help him or her succeed under whatever conditions we found ourselves operating.
That's where George's problem tries to sneak in the door. What if I do treat everyone fairly and am open and honest; what if I do all I can so that each faculty member can trust me and my intentions -- and still someone is unhappy with me? What then?
Trying to do what George tried to do is a recipe for disaster. As hard as it is sometimes, all I can do is what I can do. I should -- must -- act in a trustworthy manner, but I cannot make people like what I do, or like me. That is part of the territory. For me, though, the occasional encounter with this truth sucks a lot of psychic energy out of me.
This is the second semester of my third year as head, which means that I am undergoing a performance evaluation. I suppose the good news is that the dean feels comfortable enough with how I've done to do the review at all, rather than look for a new person for the next three-year appointment. He is using an assessment instrument developed by the IDEA Center at Kansas State. The faculty were asked to judge my performance on a number of tasks that are part of a head's job, such as "Guides the development of sound procedures for assessing faculty performance" and "Stimulates or rejuvenates faculty vitality/enthusiasm". My only role in the process was to rank each of the tasks in terms of their importance to the job.
I look at the review as both summative and formative. The summative side of the review is to determine how well I've done so far and whether I should get to keep doing it. The formative side is to give me feedback I can use to improve for the future. As you might guess from my fondness for so-called agile software development practices, I am much more interested in the formative role of the assessment. I know that my performance has not been ideal -- indeed, it's not even been close! -- but I also know that I can get better. Feedback from my colleagues and dean will help.
Though I was not asked to assess my performance on these issues, I do have a sense of my job performance. I have been only marginal in managing day-to-day affairs. That task requires a certain kind of focus and energy that I've had to develop on the job. I've also had to learn how to respond effectively in the face of a steady barrage of data, information, and requests. I have also been only marginal in "leadership" tasks, the ones that require I take initiative to create new opportunities for faculty and students to excel. This is an area where I have had a lot of ideas and discussed possibilities with the faculty, but finding time to move many of these ideas forward has been difficult.
In an area of particular importance to our department given its history, I have done a reasonable job of communicating information to the faculty, treating individual faculty fairly, and encouraging conversation. I recognized these tasks as primary challenges when I accepted my appointment and, while I had hoped to do better, I've done well so far to keep this dynamic front and center.
The results of the faculty survey are in; they arrived in my mailbox yesterday. I decided not to read the results right away... I have been a little under the weather and wanted to preserve my mental energy for work. The last session of my 5-week bash scripting course meets today, and I would rather be focused on wrapping up the class than on the data from my evaluation. I can tell myself not to fall victim to George's masseuse problem, but sometimes that is more easily done with conscious choices about how and when to engage relationships.
This afternoon, I'll look at the data, see what they can help me learn, and think about the future.
Today I spent the morning meeting with prospective CS majors and their parents. These prospective majors were high school seniors visiting campus as part of deciding whether to come to my university. Such mornings are exhausting for me, because I'm not a natural glad-hander. Still, it is a lot of fun talking to folks about computer science and our programs. We end up spending a lot of time with relatively few folks, but I think of it like the retail politics of Iowa's caucus campaign: the word-of-mouth marketing is more valuable than any one or two majors we might attract, and these days, every new major counts.
Twice today I was surprised by a question that more high school students and parents could ask, but don't:
Me: So, you're interested in computer science?
Student: I think so, but I don't really know what computer science is.
Parent: Can you tell us what computer science is and what computer scientists do?
I'm glad that kids are now interested enough in CS to ask. In recent years, most have simply bypassed us for majors they understood already.
My answer was different each time but consistent in theme to what I talk about here. It occurs to me that "What Is Computer Science?" could make a good blog entry, and that writing it in concise form would probably be good practice for encounters such as the ones I had today.
I recently came across a SIGCSE paper from 1991 called Integrating Writing into Computer Science Courses, by Linda Hutz Pesante, who at the time was affiliated with the Software Engineering Institute at Carnegie Mellon. This paper describes both content and technique for teaching writing within a CS program, a topic that cycles back into the CS education community's radar every few years. (CS academics know that it is important even during trough periods, but their attention is on some other, also often cyclic, attention-getter.)
What caught my attention about Pesante's paper is that she tries to help software engineers apply their engineering expertise to the task of writing technical prose. One of her other publications, a video, even has the enticing title, Applying Software Engineering Skills to Writing. I so often think about applying ideas from other disciplines to programming that the thought of applying ideas from software development to another discipline sounded like a new twist.
Pesante's advice on how to teach writing reflects common practice in teaching software development.
Given Pesante's affiliation with the SEI, her suggestions for what to teach about writing made a bigger impression on me. The software engineering community certainly embraces a broad range of development "methodologies" and styles but, underlying its acceptance even of iterative methods, there seems to be a strong emphasis on planning and "getting things right" the first time.
Her content advice relies on the notion that "writing and software development have something in common", from user analysis through the writing itself to evaluation. As such, a novice writer can probably learn a lot from how programmers write code. Programmers like to be creative and explore when they write, too, but they also know that thinking about the development process can add structure to a potentially unbounded activity. They use tools to help them manage their growing base of documents and to track revisions over time. That part of the engineer's mindset can come in handy for writers. For the programmer who already has that mindset, applying it to the perhaps scary task of writing prose can put the inexperienced writer at ease.
Pesante enumerates a few other key content points:
The middle two of these especially feel more agile than the typical software engineering discussion. I think that the agile software community's emphasis on short iterations with frequent releases of working software to the client also matches quite nicely the last of the bullets. It's all too easy to do a good job of analysis and planning, produce a piece of software that is correct by the standards of the analysis and plan, and find that it does not meet the user's needs effectively. With user feedback every few weeks, the development team has many more opportunities to ensure that software stays on a trajectory toward effectiveness.
Most people readily accept the idea that creative writing is iterative, non-linear, and exploratory. But I have heard many technical writers and other writers of non-creative prose say that their writing also has these features -- that they often do not know what they have to say, or what their ultimate product will be, until they write it. My experience as a writer, however limited, supports the notion that almost all writing is exploratory, even when writing something so pedestrian as a help sheet for students using an IDE or other piece of software.
As a result, I am quite open to the notion that programming -- what many view as the creation of an "engineered" artifact -- is also iterative, non-linear, and exploratory. This is true, I think, not only for what we actually call "exploratory programming" but also for more ordinary programming tasks, where both the client and I think we have a pretty good handle on what the resulting program should do and look like. Often, over the course of writing an ever-larger program, our individual and shared understanding of the program evolves. Certainly my idea of the internal shape of the program -- the structure, the components, the code itself -- changes as the program grows. So I think that there is a lot to be learned going both directions in the writing:programming metaphor.
This half-hour interview with Pragmatic Dave Thomas starts off interesting enough to listen to, and about halfway through it becomes even more compelling, especially for academics. Interviewer Jim Coplien asks what advice Thomas would give to CS academics. Thomas has long been an industry trainer, and a good one. (I learned some Ruby from him and fellow Prag Andy Hunt at an OOPSLA workshop in 2001.) But he has not been a CS academic since leaving graduate school a couple of decades ago. Still, his answer is marvelous:
... one thing I would say that you have to be very careful of, if you are an academic, is that you are dealing with a very delicate product in your students, and ultimately when a student gets into the industry it is not how well they can analyze a particular function or the depth of knowledge in this particular architecture, it is their passion that drives them forward. And as an academic I think you have a responsibility not to squash that passion, I think you have to find ways of nurturing it.
I can't instill passion in someone, but I can kill someone's passion. Worse, I diminish someone's passion in small steps, in how I speak about the discipline, what I expect of them. When writing comments is more valuable than writing code, I dampen passion. When the form of a program matters more than the substance, I dampen passion.
Unfortunately, I think that our K-12 system kills the passion of many students. This is not a criticism of teachers, many of whom do wonderful, inspiring jobs under less than ideal conditions. The problem is more a product of the structure of our schools and our classrooms. At the university, we'd like to think that we begin to restore passion, and we do have more opportunities to do so. But we need to be honest with ourselves and stamp out the spirit-killing parts of our courses, curricula, and degree programs. I cannot instill passion, but I can stop killing passion. And I can help it grow.
Thomas didn't have much in the way of concrete advice for how to nurture passion, but he did say that teachers need to motivate what they teach and what they expect students to do. Yes; context matters. He also suggested that we encourage students to be well-rounded and try to attract well-rounded folks to the discipline. Yes; the more interesting ideas we have in our heads and in our classrooms, the better we can learn, and the better we can program.
As an aside, Thomas talks some about how he recently took up learning to play the piano, on the occasion of turning fifty. Early in this decade, I, too, began to study piano as an adult. In the year or so before I began writing this blog, I had let myself become too busy to practice and so fell away. I hope to make time to return to my study some day, for all the reasons that Thomas mentions and more.
... about teaching, and about software development.
The February 2008 issue of Smithsonian Magazine contains an article called Being Funny, by comedian, writer, and actor Steve Martin, that has received a fair amount of discussion on-line already. When I read it this weekend, I was surprised by how similar some of the lessons Martin learned as he grew into a professional comedian are to lessons that software developers and teachers learn. I enjoyed being reminded of them.
I gave myself a rule [for dealing with the audience]. Never let them know I was bombing: this is funny, you just haven't gotten it yet.
This is about not showing doubt. Now, I think it's essential for an instructor to be honest -- something I wrote about a while back, in the context of comedy as well. So I don't mean that I as teacher should try to bluff my way through something I don't know or know but have botched. Martin is talking about the audience getting it, and the doubt that enters my mind when a classroom of students seem not to. I experience this occasionally when I teach a course like Programming Languages to a new group of students. Some classes don't warm to me or the material in quite the same way as others, but I know that the material I'm teaching and the basic approach I am using are sound. When this semester's crowd takes a while to catch on -- or if they are openly skeptical of the material or approach -- it's essential that I remain confident. Sure, I'll make adjustments to the presentation to account for my current students' needs, but I should remain steadfast: This is good stuff; they'll get it soon.
I'm not sure this applies as well for software developers. Often, when my users don't get it yet and I feel compelled to bull on through, I have gone beyond the requirements, at least as my users understand them. In those cases, it's usually better to say to myself "You aren't gonna need it" and simplify.
Everything was learned in practice, and the lonely road, with no critical eyes watching, was the place to dig up my boldest, or dumbest, ideas and put them onstage.
This is hard to do as a teacher. I don't get to go on the road to a hundred musty nightclubs and try out my new lecture on continuation-passing style, as Martin did with his bits. He was practicing on small stages in order to prepare for the big ones, such as The Tonight Show. It's next to impossible for me to try a lecture out on a representative audience that isn't my regular audience: my students. I can practice my research talks before a small local audience before taking them to a conference, but the number of repetitions available to me is rather small even in that scenario. For class sessions, I pretty much have to try them out live, see what happens, and feed the results back into my next performance.
Fortunately, I'm not often trying radically new ideas out in a lecture, so a few live repetitions may suffice. I have tried teaching methods that are quite different from the norm for me and for my students, such as a Software Systems course or gen-ed capstone course with no lecture and all in-class exercises. In those scenarios, I had to follow the advice discussed above: This is going to work; they just haven't gotten it yet...
This piece of advice applies perfectly to programming. The lonely road is my office or my room at home, where I can try out every new idea that occurs to me by writing a program (or ten) and seeing how it works. No critical eyes but my own, which I turn off. Bold ideas, dumb ideas -- I find out which are which through practice.
The consistent work enhanced my act. I learned a lesson: it was easy to be great. Every entertainer has a night when everything is clicking. These nights are accidental and statistical: like lucky cards in poker, you can count on them occurring over time. What was hard was to be good, consistently good, night after night, no matter what the circumstances.
This is so true of teaching that I have nothing more to say. Read Martin's line again.
I think this is also true of programming, and writing more generally. Every so often, a great idea comes to mind; a great turn of phrase; a cool little hack that solves the problem at hand. To be a good programmer, you need to practice, so that each time you sit down to code you can produce something of value. That kind of skill is earned by practice, and, I think, attainable by everyone.
On one of my appearances [on The Tonight Show], after [Johnny Carson] had done a solid impression of Goofy the cartoon dog, he leaned over to me during a commercial and whispered prophetically, "You'll use everything you ever knew."
When working with students, I find myself borrowing from every good teacher I've ever had, and drawing on any experience I can recall. I've borrowed standards and attitudes from one of my favorite undergrad profs, who taught me the value of meeting deadlines. I've used phrases and jokes spoken by my high school chemistry teacher, who showed us that studying a difficult subject could be fun, with the right mindset and a supportive group of classmates. Whatever works, works. Use it. Adapt it.
Likewise, this advice is true when programming. In the last few years, the notion of scrapheap programming has become quite popular. In this style, a programmer looks for old pieces of code that do part of the job at hand, adapts them, and glues them together to get the job done. But this is how all writers and programmers work, drawing on all of the old bits of code rolling around their heads. In addition to practice, you can improve as a programmer by exposing yourself to as many different programs as possible. That way, you will see the data structures, the idioms, the design patterns, and, yes, the snippets of code that you will use twenty years from now when the circumstance is right. That may seem goofy, but it's not.
I believed it was important to be funny now, while the audience was watching, but it was also important to be funny later, when the audience was home and thinking about it.
As a teacher, I would like for what my students see, hear, and do in class today to make an impression today. That is what makes the material memorable and, besides, it's a lot more fun for both of us that way than the alternative. But perhaps more important, I would like for the ideas to make a lasting impression. When the student is home thinking about the course, or an assignment, or computer science in general, I want them to realize how much depth, or breadth, the idea or technique has. Today's memorability can be ephemeral. When the idea sticks later, it can affect how the student programs forever.
The hard part is trusting when I don't see students getting it in real-time. Martin says that he learned not to worry if a gag got no response from his audience, "as long as I believed it had enough strangeness to linger". Strangeness may not be what I hope for in my class sessions, but I know what he means.
As a programmer, I think this advice applies, too. When I was an undergrad, I was once on a team building a large system as part of our senior project course. One of our teammates loved to write code that was clever, that would catch our eye on first read and make us recognize his creativity and skill. But we soon learned that much of this code was inscrutable a day or two later, which made modifying and maintaining his modules impossible. Design and code that make a module easy to read and understand a few weeks later were what the team needed. Sometimes, the cleverness of a solution shone through then, too, but it impressed us all the more when it had "staying power".
Steve Martin is a wacky comedian, not a programmer or teacher per se. Does his advice apply only in that context? I don't think so. Comedians are writers and creators, and many of the traits that make them successful apply to other tasks that require creating and communicating.
Readers of this blog know that programming is one of the topics I most like to write about. In recent months I've had something of a "programming for everyone" theme, with programming as a medium of expression, as a way to create new forms, ideas, and solutions. But programming is also for computer scientists, as it is the primary mode for communicating their ideas.
To the non-CS folks reading this, that may seem odd. Isn't CS about programming? Most non-CS folks seem to take it as a given that computer science is, but these days it is de rigueur for us in the discipline to talk about "computing" and how much bigger it is than "just programming".
To some extent, I am guilty of this myself. I often use the term "computing" generically in this blog to refer to the fuzzy union of computer science, software engineering, applications, and related disciplines. This term allows me to talk about issues in their broadest context without limiting my focus to any one sub-discipline. But it also lets me be fuzzy in my writing, by not requiring that I commit.
Sometimes, that breadth can give the impression that I think programming is but a small part of the discipline. But most of my writing comes back to programming, and when I teach a CS course, programming is always central. When I teach introductory computer science, programming is a way for us to explore ideas. When I teach compilers, it all comes down to the project. My students learn Programming Languages and Paradigms by writing code in a new style and then using what they learn to explore basic ideas about language in code. When I taught AI for eight or ten straight years, as many of the big ideas as possible found their way into lab exercises. Even when I taught one course that had no programming component -- an amalgam of design, HCI, and professional responsibility called Software Systems -- I had students read code: simple implementations of model-view-controller, written in Java, C++, Common Lisp, or Ada!
I love to program, and I hope students leave my courses seeing that programming is a medium for expressing and creating ideas -- as well as creating "business solutions", which will be the professional focus for most of them. Then again, the best business solutions are themselves ideas that need to be discovered, explored, and evolved. Programming is the perfect medium to do that.
So when I ran across The Nerd Factor is Huge, via Chuck Hoffman at Nothing Happens, I found myself to be part of the choir, shouting out "Amen, brother!" every so often. The article is a paean to programming, in a blog "dedicated to the glory of software programming". It claims that programming needs an academic home, a discipline focused on understanding it better and teaching people how to do it. (And that discipline should have its own conference!)
In Yegge-like fashion, the author uses expressive prose to make big claims, controversial claims. I agree with many of them, and feel a little in harmony even with the ones I wouldn't likely stake my professional reputation on.
I agree with the central thesis of this article. However, separating programming as a discipline from HCI and some of the other "non-programming ghettos" of CS creates a challenge for university educators. Most students come to us not only for enlightenment and great ideas but also for professional preparation. With several distinct disciplines involved, we need to help students put them all together to prepare them to be well-rounded pros.
How should we encourage more kids -- girls and boys alike -- to study computer science? "Nerd Factor" is right: don't shy away from programming; teach it, and sooner rather than later. Show them "how easy it is to create something. Because that is what programming is all about: making things." And making things is the key. Being able to program gives anyone the power to turn ideas into reality.
One of my early memories of OOPSLA comes from a panel discussion. I don't recall the reason for the panel, but several big names were talking about what we should call people who create software. There were probably some folks who supported the term "software engineer", because there always are. Kent Beck spoke heresy: "I'm just a programmer." I hope I muttered a little "Amen, brother" under my breath that day, too.
I wasn't expecting to hear John Maeda's name during the What is a Tree? talk, because I didn't know that researchers in Maeda's lab had created the language Processing. But hearing his name brought to mind something that has been in the back of my mind for a couple of months, since the close of my first theater experience. I had blogged about a few observations I had made about the processes of acting in and directing a play. The former were mostly introspective, and the latter were mostly external, as I watched our director coalesce what seemed like a mess into a half-way decent show. Some of these connections involved similarities I noticed between producing a play and creating software.
I made notes of a few more ideas that I hadn't mentioned yet, including:
I'm still wondering if those last two have any useful analogue in software development...
Since the show ended, I have occasionally tried to discern the value in the analogy between producing a play and creating software -- indeed, if there is any. That's where the connection to Maeda comes in. Last summer, I read the slender Laws of Simplicity, a collection of essays from Maeda's blog of the same name. The book suggests ten ways that we can design simpler systems and products. I must not have been in the right place to read the book just then, because I didn't get as much out of it as I had hoped. But one part of the book stuck with me.
For a metaphor to engage us deeply, Maeda wrote, it is essential that it relate, translate, and surprise. As I recall now, this means that the metaphor must relate the elements of the two things, that it must translate foreign elements from one of the things to the other, and that the result of this translation should surprise -- it should make us see or understand the other thing in a new way, give us insight.
There is a danger in finding analogies everywhere we look by making superficial connections. I am perhaps more prone to this risk than many other folks. That may be why I liked Maeda's relate/translate/surprise triad so much. Since reading it, I have used it as an external checkpoint for any new analogy that I want to make. If I can explain how the metaphor relates the two things, translates disparate elements, and surprises me, then I have some reason to think that the metaphor offers value -- at least more reason than just saying, "Hey, look at this cool new thing I noticed!"
To this point, I have not found the "surprise" in the theater experience that teaches me something new about how to think about making software. This doesn't mean that there is no value in the analogy, only that I haven't found it yet. By remaining skeptical a little while longer, I decrease the probability that I try to draw an inappropriate conclusion from the relationship.
Of course, just because I haven't yet found the surprise in the analogy doesn't mean that I did not find value in the experience that led me to it. A rich web of experiences is valuable in its own right, and enjoyable. It also provides the source material for learning.