While in St. Louis for SIGCSE, I got out for three runs. Most of my stay was at a Motel 6 just off I-270 on the north side of town, at Dunn Road. That means I can't give advice that will help most of you, who are unlikely to have my family reasons for staying there -- or my idiosyncratic desire for taking economy travel to an extreme on some trips. Let's just say that I was pleased to find an attractive middle-class neighborhood just behind the motel, one with lots of trees and curvy roads. It was perfect for a couple of easy 5-milers before my drive downtown to the conference.
I spent my last night at the conference hotel, the Renaissance Grand, which is on the corner of 8th and Washington. This hotel is a half mile or so from St. Louis's signature landmark, the Gateway Arch. After a 5-minute jog toward the arch I reached Gateway Park. The arch sits in the middle of the park, with paved walks curving out from the arch to the corners of the park.
I ran the perimeter of the park, except the southwest corner, which I replaced with a path into the arch and one of those arcing roads out to the corner. The result was a loop of a bit more than 1.5 miles. I ran that loop seven times which, combined with the jogs to and from the hotel, gave me something in the neighborhood of a 12-mile jog.
Oh, and I must tell you that this is no ordinary loop. The Gateway Arch sits atop a ridge, and the park borders the Mississippi River. The result is a significant change in elevation. The west side of the park has three massive sets of concrete stairs. My loop included the northernmost staircase, which I ran up on each pass. (See the box on the map below.)
The climb took me 30-35 seconds and ended with rubbery legs. The rest of the loop had a few small rises and a couple of declines down to the waterfront road. This made for a good hill workout!
The next time I stay in St. Louis, I hope to have some opportunities for a few different runs. St. Louis has some other park running routes available as you get away from the crowded old city district. Downtown, I'd like to run Chestnut and Market Streets, which frame a long stretch of greenery leading west from Gateway Park out toward St. Louis University. Along that route I'll also pass near the Savvis Center -- perfect for a visit to Arch Madness some year. (Go, Panthers!)
The focal event of Day 3 at SIGCSE is the 8:30 AM special session, a panel titled Resolved: "Objects Early" Has Failed. It captures a sense of unrest rippling through the SIGCSE community: unrest at the use of Java in the first year, unrest at the use of OOP in the first year, unrest at changes that have occurred in CS education over the last decade or more. Much of this unrest conflates unhappiness with Java and unhappiness with OOP, but the feelings are real, and they are felt by good instructors. At the same time, almost as many people are uneasy that, after five or more years, so few intro courses seem to really do OOP, or at least do it well. These folks are beginning to wonder whether objects will ever take a place at the center of CS education, or if we are destined to teach Pascal programming forever in the language du jour. This debate gives people a chance to hear some bright folks on both sides of the issue share their thoughts and answer questions from one another and the audience.
The panel is structured as a light debate among four distinguished CS educators:
Our fearless leader is Owen "Moderator with a Law" Astrachan. The fact that the best humor in the introductions was aimed at the pro-guys may reveal Owen's bias, or maybe Stuart and Eliot are just easier targets...
Eliot opened by defining "objects early" as a course that focuses on using and writing classes before covering control structures and algorithm development. He made an analogy to new math: students learn "concepts" but can't do basic operations well. Most of Eliot's comments focused on faculty's inability or unwillingness to change and on the unsuitability of the object-oriented approach for the programs students write at the CS1 level. The remark about faculty reminded me of a comment Owen made several years ago in a workshop, perhaps in an Educators Symposium at OOPSLA, that our problem here is no longer legacy software but legacy faculty.
Michael followed quite humorously with a send-up of this debate as really many debates: objects early, polymorphism early, interfaces early, GUIs early, events early, concurrency early... If we teach all of these in Week 1, then teaching CS 1 will be quite nice; Week 2 is the course wrap-up lecture! The question really is, "What comes last?" Michael tells us that objects haven't failed us; we have failed objects. Most of us aren't doing objects yet! And we should start. Michael closed with a cute little PowerPoint demo showing a procedurally-oriented instructor teaching objects not by moving toward objects, but by reaching for objects. When you reach too far without moving, you fall down. No surprise there!
Stuart returned the debate to the pro side. He sounded like someone who had broken free of a cult. He said that, once, he was a true believer. He drank the kool-aid and developed a CS 1 course in which students discussed objects on Day 2. He even presented a popular paper on the topic at SIGCSE 2000, Conservatively Radical Java in CS1. But over time he found that, while his good students succeeded in his new course, the middle tier of students struggled with the "object concept". He is willing to entertain the idea that the problem isn't strictly with objects-first but with the overhead of OOP in Java, but pragmatic forces and a herd mentality make Java the language of choice for most schools these days, so we need an approach that works in Java. Stuart lamented that his students weren't getting practice at decomposing problems into parts or implementing complete programs on their own. Students seem to derive great pleasure from writing a complete, if small, program to solve a problem. This works in the procedural style, where a 50- to 100-line program can do something. Stuart asserted that this doesn't work with an OO style, at least in Java. Students have to hook their programs in with a large set of classes, but that necessitates programming to a fixed API. The result just isn't the same kind of practice students used to get when we taught procedural programming in, um, Pascal. Stuart likens this to learning to paint versus learning to paint-by-the-numbers. OOP is, to Stuart, paint-by-the-numbers -- and it is ultimately unsatisfying.
Stuart's contribution to the panel's humorous content was to claim that the SIGCSE community was grieving the death of "objects early". Michael, Kim, and their brethren are in the first stage of grief, denial. Some objects-early folks are already in the stage of anger, and they direct their anger at teachers of computer science, who obviously haven't worked hard enough to learn OO if they can't succeed at teaching it. Others have moved on to the stage of bargaining: if only we form a Java Task Force or construct the right environment or scaffolding, we can make this work. But Stuart likened such dickering with the devil to cosmetic surgery, the sort gone horribly wrong. When you have to do that much work to make the idea succeed, you aren't doing cosmetic surgery; you are putting your idea on life support. A few years ago, Stuart reached the fourth stage of grief, depression, in which he harbored thoughts that he was alone in his doubts, that perhaps he should just retire from the business. But, hurray, Stuart finally reached the stage of acceptance. He decided to go back to the future, returning to the halcyon days of the 1980s, of simple examples and simple programming constructs, of control structures and data structures and algorithm design. At last, Stuart is free.
Kim closed the position statement portion of the panel by admitting that it is hard work for instructors who are new to OO to learn the style, and for others to build graphics and event-driven libraries to support instruction. But the work is worth the effort. And we shouldn't fret about using "non-standard libraries", because that is how OO programming really works. Stuart followed up with a question: Graphics seems to be the killer app of OOP; name two other kinds of examples that we can use. Kim suggested that the key is not graphics themselves but the visualization and immediate feedback they afford, and pointed to BlueJ as an environment that provides these features for most any good object.
In his closing statement for the con side, Michael offered a quote attributed to Planck:
A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.
Stuart's closing statement for the pro side was more serious. It returned to the structured programming comparison. It was hard to make the switch to structured programming in CS education. Everyone was comfortable with BASIC; now they had to buy a Pascal compiler for their machines; a compiler might not even exist for the school's machines; .... But the trajectory of change was different. It worked, in the sense that people got it and felt it was an improvement over the old way -- and it worked faster than the switch to OOP has. Maybe the grieving is premature. Perhaps objects-early hasn't failed yet -- but it hasn't succeeded yet, either. According to Stuart, that should worry us.
The folks on the panel seemed to find common ground in the idea that objects-early has neither succeeded nor failed yet. They also seemed to agree that there are many reasonable ways to teach objects early. And most everyone seemed to agree that instructors should use a language that works best for the style of programming they teach. Maybe Java isn't that language.
In the Q-n-A session that followed, Michael made an interesting observation: We are now living through the lasting damage of the decision by many schools to adopt C++ in CS 1 over a decade ago. When Java came along, it looked really good as a teaching language only because we were teaching C++ at the time. But now we see that it's not good enough. We need a language suitable for teaching, in this case, teaching OOP to beginners. (Kim reminded us of Michael's own Blue language, which he presented at the OOPSLA'97 workshop on Resources for Early Object Design Education.)
I think that this comment shows an astute understanding of recent CS education history. Back when I first joined the CS faculty here, I supported the move from Pascal to C++ in CS 1. I remember some folks at SIGCSE arguing against C++ as too complex, too unsettled, to teach to freshmen. I didn't believe that at the time, and experience ultimately showed that it's really hard to teach C++ well in CS 1. But the real damage in everyone doing C++ early wasn't in the doing itself, because some folks succeeded, and the folks who didn't like the results could switch to something else. The real damage was in creating an atmosphere in which even a complex language such as Java looks good as a teaching language, an environment in which we seem compelled by external forces to teach an industrial language early in our curricula. Perhaps we slid down a slippery slope.
My favorite question from the crowd came from Max Hailperin. He asked, "Which of procedural programming and OOP is more like the thinking we do in computer science when we aren't programming?" The implication is that the answer to this question may give us a reason for preferring one over the other for first-year CS, even make the effort needed to switch approaches a net win over the course of the CS curriculum. I loved this question and think it could be the basis of a fun and interesting panel in its own right. I suspect that, on Max's implicit criterion, the mathematical and theoretical sides of computing may make procedural programming the preferred option. Algorithms and theory don't seem to have objects in them in the same way that objects populate an OO program. But what about databases and graphics and AI and HCI? In these "applications" objects make for a compelling way to think about problems and solutions. I'll have to give this more thought.
After the panel ended, Robert Duvall commented that the panel had taught him that the so-called killer examples workshops that have taken place at the last few OOPSLAs have failed. Not that the idea -- having instructors share their best examples for teaching various OO concepts -- is a bad one. But the implementation has tended to favor glitzy, complicated examples. What we need are simple examples that teach an important idea with minimum overhead and minimum distraction to students. I'm certainly not criticizing these workshops or the folks who organize them, nor do I think Robert is; they are our friends. But the workshops have not yet had the effect that we had all hoped for them.
Another thing that struck me about this panel was Stuart's relative calmness, almost seriousness. He seems at peace with his "back to the '80s" show and treats this debate as almost not a joking matter any more. His demeanor says something about the importance of this issue to him.
My feeling on all of this was best captured by a cool graphic that Owen posted sometime near the middle of the session:
The second attempt to fly Langley's Aerodrome on December 8, 1903, also ended up in failure. After this attempt, Langley gave up his attempts to fly a heavier-than-air aircraft.
(Thanks to the wonders of ubiquitous wireless, I was able in real time to find Owen's source at http://www.centennialofflight.gov/.)
I don't begrudge folks like Stuart and Eliot finding their comfort point with objects later. They, and more importantly their students, are best served that way. But I hope that the Michael Köllings and Kim Bruces and Joe Bergins of the world continue to seek the magic of object-oriented flight for CS 1. It would be a shame to give up on December 8 with the solution just around the corner.
This morning's keynote was given by last year's recipient of SIGCSE's award for outstanding contributions to CS education, Mordechai (Moti) Ben-Ari. Dr. Ben-Ari was scheduled to speak last year, but illness prevented him from going to Norfolk. His talk was titled "The Concorde Doesn't Fly Anymore" and touched on a theme related to yesterday's talk, though I'm not certain he realized it.
Dr. Ben-Ari gave us a little history lesson, taking us back to the time of the US moon landing. He asserted that this was the most impressive achievement in the history of technology and reminded us that the project depended on some impressive computing -- a program that was delivered six months in advance of the mission, which used only 2K of writable memory.
Then he asked the audience to date some important "firsts" in computing history, such as the year the first e-mail was sent. I guessed 1973, but he gave 1971 as the right answer. (Not too bad a guess, if I do say so myself.) In the end, all of the firsts dated to the period 1970-1975 -- just after that moon landing. So much innovation in such a short time span. Ben-Ari wondered: how much truly new have we done since then? In true SIGCSE fashion, he made good fun of Java, a language cobbled together out of ideas discovered and explored in the '60s and '70s, among them Simula, Smalltalk, Pascal, and even C (whose contribution was "cryptic syntax").
The theme of the talk was "We in computing are not doing revolutionary stuff anymore, but that's okay." Engineering disciplines move beyond glitz as they mature. Valuable and essential engineering disciplines no longer break much new ground, but they have steady, sturdy curricula. He seemed to say that we in computing should accept that we have entered this stage of development and turn CS into a mature educational discipline.
His chief example was mechanical engineering. He contrasted the volume of mathematics, science, and settled engineering theory and concepts required by the ME program at his university with the same requirements in the typical CS program. Seven math courses instead of four; five science courses instead of two; a dozen courses in engineering foundations instead of three or four in computing. Yet we in CS feel a need to appear "relevant", teaching new material and new concepts and new languages and new APIs. No one, he said, complains that mechanical engineering students learn about pulleys and inclined planes -- 300-year-old technology! -- in their early courses, but try to teach 30-year-old Pascal in a CS program and prepare for cries of outrage from students, parents, and industry.
In this observation, he's right, of course. We taught Ada in our first-year courses for a few years in the late 1990s and faced a constant barrage of questions from parents and students as to why, and what good would it be to them and their offspring.
In the larger scheme of things, though, is he right? It seems that Dr. Ben-Ari harkens back to the good old days when we could teach simple little Pascal in our first course. He's not alone in this nostalgia here at SIGCSE... Pascal has a hold on the heartstrings of many CS educators. It was a landmark in the history of CS education: a single language captured the zeitgeist and technology of computing all at once, in a simple package that appealed to instructors looking for something better and to students, who could wrap their heads around the ideas and constructs of Pascal in their early courses.
A large part of Pascal's success as a teaching language lay in how well it supplanted languages such as Fortran (too limited) and PL/I (too big and complex) in the academic niche. I think PL/I is a good language to remember in this context. To me, Java is the PL/I of the late 1990s and early 2000s: a language that aspires to do much, a language well-suited to a large class of programs that need to be written in industry today, and a language that is probably too big and complex to serve all our needs as the first language our students see in computer science.
But that was just the point of Kim Bruce's talk yesterday. It is our job to build the right layer of abstraction at which to teach introductory CS, and Java makes a reasonable base for doing this. At OOPSLA last year, Alan Kay encouraged us to aspire to more, but I think that he was as disturbed by the nature of CS instruction as by Java itself. If we could build an eToys-like system on top of Java, then Alan would likely be quite happy. (He would probably still drop in a barb by asking, "But why would you want to do that when so many better choices exist?" :-)
In the Java world, many projects -- BlueJ, Bruce's ObjectDraw, JPT, Karel J. Robot, and many others -- are aimed in this direction. They may or may not succeed, but each offers an approach to focusing on the essential ideas while hiding the details of the industrial-strength language underneath. And Ben-Ari might be happy that Karel J. Robot's pedagogical lineage traces back to 1981 and the Era of Pascal!
As I was writing the last few paragraphs, Robert Duvall sat down and reminded me that we live in a different world than the one Pascal entered. Many of our incoming students arrive on campus with deep experience playing with and hacking Linux. Many arrive with experience building web sites and writing configuration scripts. Some even come in with experience contributing to open-source software projects. What sort of major should we offer such students? They may not know all that they need to know about computer science to jump to upper-division courses, but surely "Hello, World" in Pascal or C or Java is not what they need -- or want. And as much as we wax poetic about university and ideas, the world is more complicated than that. What students want matters, at least as it determines the desire they have to learn what we have to teach them.
Ben-Ari addressed this point in a fashion, asserting that we spend a lot of time trying to make CS easy, but that we should be trying to make it harder for some students, so they will be prepared to be good scientists and engineers. Perhaps so, but if we construct our programs in this way we may find that we aren't the ones educating tomorrow's software developers. The computing world really is a complex mixture of ideas and external forces these days.
I do quibble with one claim made in the talk, in the realm of history. Ben-Ari said, or at least implied, that the computing sciences and the Internet were less of a disruption to the world than the introduction of the telegraph. While I do believe that there is great value in remembering that we are not a one-of-a-kind occurrence in the history of technology -- as the breathless hype of the Internet boom screamed -- I think that we lack proper perspective for judging the impact of computing just yet. Certainly the telegraph changed the time scale of communication by orders of magnitude, a change that the Internet only accelerates. But computing affects so many elements of human life. And, as Alan Kay is fond of pointing out, its potential is far greater as a communication medium than we can even appreciate at this point in our history. That's why Kay reminds us of an even earlier technological revolution: the printing press. That is the level of change to which we in computing should aspire, fundamentally redefining how we talk about and do almost everything we do.
Ben-Ari's thesis resembles Kay's in its call for simplicity, but it differs in many other ways. Are we yet a mature discipline? Depending on how we answer this question, the future of computer science -- and computer science education -- should take radically different paths.
After a busy week or two at home, I am now on the road at SIGCSE'05, the primary conference on computer science education. On Saturday evening, I am co-leading with Joe Bergin The Polymorphism Challenge, a workshop in which participants will learn more about dynamic polymorphism by taking it to extremes on some common problems. But for the next few days I get to attend sessions and catch up with colleagues, in hopes of learning a few new tricks myself.
This morning, I attended two sessions. The first was the keynote address by Kim Bruce, this year's recipient of SIGCSE's award for outstanding contributions to CS education. Kim has been involved in a number of CS education projects, and his keynote talk showed how his most recent project makes it possible to teach introductory computing topics such as assignment and recursion in a fundamentally different way. His main point: We should use abstraction to make complex ideas more concrete.
Alan Kay said something similar in his OOPSLA talks last fall. As a medium, computer programs give us new ways to make complex ideas concrete and manipulable, ways that before were impractical or even impossible. We computer science teachers need to seek more effective ways to use our own discipline's power to bridge the gap between complex ideas and the learner's mind.
In any event, Kim used two quotes about abstraction that I liked a lot. The first, I knew:
Fools ignore complexity; pragmatists suffer it; experts avoid it; geniuses remove it. ... Simplicity does not precede complexity, but follows it.
-- Alan Perlis
The second was new but is surely well known to many of you:
A good teacher knows the right lies to tell.
Using abstractions requires telling a certain kind of lie -- and also creates the subproblem of figuring out how and when to tell the fuller truth.
The second session was a report from the ACM Education Board's Java Task Force, which is designing a set of packages for the express purpose of using Java in CS 1. In a sense, these packages are yet another way to abstract away the complexities and idiosyncrasies of Java so that beginning students can see the essential truths of computing.
There isn't much new in this work, and we can all quibble with some of the group's decisions, but the resulting product could be useful in unifying the software base used by different institutions to teach introductory CS -- if folks choose to use it. I have to admit, though, that several times during the presentation I thought to myself, "Well, of course. That's what we've been saying for years!" And once I even found myself saying, "That's just wrong." I'm sometimes surprised by how little effect mainstream and academic OOP have had on the CS education community. (Perhaps it's ironic that I'm writing this while listening in on a session about the importance of the history of computing for teaching computer science.)
I hope that my Algorithms course doesn't have this kind of reputation among our students!
Teaching Algorithms is a challenge different from my other courses, which tend to be oriented toward programming and languages. It requires a mixture of design and analysis, and the abstractness of the design combines with the theory in the analysis to put many students off. I try to counter this tendency by opening most class sessions with a game or puzzle that students can play. I hope that this will relax students, let them ease into the session, and help them make connections from abstract ideas to concrete experiences they had while playing.
Some of my favorite puzzles and games require analysis that stretches even the best students. For example, in the last few sessions, we've taken a few minutes each day to consider a puzzle that David Ginat has called Election. This is my formulation:
An election is held. For each office, we are given the number of candidates, c, and a text file containing a list of v votes. Each vote consists of the candidate's number on the ballot, from 1 to c. v and c are so large that we can't use naive, brute-force algorithms to process the input.
Design an algorithm to determine if there is a candidate with a majority of the votes. Minimize the space consumed by the algorithm, while making as few passes through the list of votes as possible.
We have whittled complexity down to O(log c) space and two passes through the votes. Next class period, we will see if we can do even better!
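For readers who want to play along at home, the bound we reached matches a classic two-pass scheme, the Boyer-Moore majority vote algorithm. Here is a sketch in Python; the function name is my own, and I treat the vote file as a sequence we can read twice. I'm not claiming this is the solution we built in class, and it doesn't spoil the "even better" question:

```python
def majority_candidate(votes):
    """Return the candidate with a strict majority of the votes,
    or None if no candidate has one. Two passes over the votes;
    extra space is one candidate number and one counter."""
    # Pass 1 (pairing-off): keep a tentative candidate and a count.
    # A vote for the candidate increments the count; a vote for anyone
    # else cancels one of the candidate's votes. When the count hits
    # zero, adopt the next vote's candidate. A true majority cannot be
    # fully cancelled, so it always survives as the tentative candidate.
    candidate, count = None, 0
    for v in votes:
        if count == 0:
            candidate, count = v, 1
        elif v == candidate:
            count += 1
        else:
            count -= 1

    # Pass 2: the survivor is only a *possible* majority; verify it.
    total = matches = 0
    for v in votes:
        total += 1
        if v == candidate:
            matches += 1
    return candidate if total > 0 and matches * 2 > total else None
```

The space is O(log c + log v) bits -- just enough to hold one candidate number and one counter -- which is where the O(log c) in our classroom analysis comes from.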
I figure that students should have an opportunity to touch greatness regularly. How else will they become excited by the beauty and possibilities of an area that to many of them looks like "just math"?
When I taught a course on Agile Software Development last semester, I fell into a habit of beginning the first class each week with a segment called "Agile Moments". Think of Saturday Night Live's old Deep Thoughts, by Jack Handey, only on a more serious plane. I'd gather up one or two interesting ideas I'd encountered in the previous week from posts to the XP mailing list, blogs I'd read, or projects I was working on. Then I'd pull out a provocative or entertaining quote and use that to launch into a discussion of the idea with my students. This seemed like a good way to share topical information with my students while showing them that they could enter into real conversations about how to write programs. It also reminded my students that I am always reading what others are writing about agile software, and that they could, too.
Anyway, I've been so busy planning for ChiliPLoP 2005, for which I am program chair and a Hot Topic leader, and SIGCSE 2005, at which I am co-leading a workshop with Joe Bergin, that I haven't been writing a lot here. But I do find myself having Agile Moments, so let me share them with you.
Programming and Testing
Patrick Logan has a nice article on speaking in objects, which suggests that programming is best thought of as a dialog with the computer. The pithy quote that stands out in this article is one he attributes to Ward Cunningham:
It's all talk until the tests run.
We all know that's true.
Grady Booch is spending a lot of time in his considerable library as he works on his Handbook of Software Architecture. Yesterday he wrote about one of his favorite books, John Gall's Systemantics. Based on Grady's recommendation, I just picked this book up from our library. Among other things, Gall argues that successful large systems invariably come from successful small systems. This reminds me of agile development's emphasis on piecemeal growth and evolutionary design through tests and refactoring. One of Gall's quotes captures one of the primary motivations of the agile approaches:
Bad design can rarely be overcome by more design, whether good or bad.
This points out the big risk of Big Design Up Front: If we get it wrong, we likely have sunk our project. It is hard to recover from a bad design.
Of course, in true Agile Moments fashion, I also encountered a cautionary tale about evolutionary design. Martin Fowler warns of the risks of Abundant Mutation, which can occur with large teams when many sub-teams attack a common issue in different ways, or with smaller teams when a stream of newcomers continually join a project in mid-stream and take unfinished code in different directions. Martin reminds us that "evolutionary design requires attention, skill, and leadership".
Blaine Buxton recently posted a quote worthy of being an agile slogan:
Code removed is code earned.
Refactoring may end up shrinking the size of your system, but it is a net addition in the quality of your system. You earn the ability to add the next requirement more easily, as well as the ability to understand the program more easily.
Blaine's quote reminded me of my favorite quote about the joys of refactoring, which showed up in a blog entry I wrote long ago:
The only thing better than a 1000-line of code weekend is a minus 1000-line of code weekend.
-- Brian Foote
Brian could make a fine living as a writer of slogans and jingles!
Gus Mueller writes about his stuff folder:
I've got a folder on my desktop named "stuff". It's a little over 25 gigs, and it currently has 154,262 files in it. I have no idea what exactly is in there. Random one-off projects, pics from the camera, various music files.
I wonder, am I the only person with this situation? Should I just trash it, or should I at least try and go through it? I don't know. Do I really need anything in there?
Oh my goodness. 154,262 files!?! But I can assure you that Gus isn't the only one.
I have two stuff/ folders, one on my desktop machine and one on my laptop. The volume of Gus's stuff puts my folders to shame, though. They total only 750 MB. They are full of stuffed apps I want to try out when I get a few free minutes, articles I'd like to read, .jar files I think I might be able to use someday... Of course, someday never comes, and I just keep dropping new stuff in there.
At some point, my stuff/ folder reaches a certain size at which the chance I have of remembering that a file is in there reaches effectively 0%. What good does it do me then? Even worse, I feel guilty for not using the stuff, and guilty for not deleting it.
Every so often, I throw out all or most of my current stuff/ folder in a fit of good sense, but here I am again.
For now I am buoyed by schadenfreude: at least I'm not *that* bad. Thanks, Gus. :-)
One of my favorite TV shows back when I was a little guy in the 1970s was The Paper Chase. It was a show about a group of law students, well, being law students. It was based on a 1973 movie of the same name. It's not the sort of show that tends to hang around the TV schedule very long; it lasted only a year or two on network television. Even so, I fell in love with the quiet drama of university life while watching this show.
Whether you remember the show or not, you may well know its most famous character, Professor Charles W. Kingsfield, Jr., played with panache by the eminent John Houseman. Professor Kingsfield was in many ways the archetype for the brilliant scholar and demanding yet effective university teacher that one so often sees on film. Growing up, I always hoped that I might encounter such a professor in my studies, certain that he would mold me into the great thinker I thought I should be. In all my years of study, I came across only one teacher who could remind me of Kingsfield: William Brown, an old IBM guy who was on the faculty at Ball State University until his retirement. Many professors demanded much of us, and some were brilliant scholars, but only Dr. Brown had the style that made you feel honor in the fear that you might not meet his standard. I had plenty of great professors at Ball State, but none like Dr. Brown.
Why this reminiscence on a Friday afternoon 20 or 25 years later? I thought of John Houseman and a particular episode of The Paper Chase yesterday afternoon. The episode revolved around a particularly laborious assignment that Kingsfield had given his Contracts class. (Kingsfield made contract law seem like the most lively intellectual topic of all!) The assignment required students to answer one hundred questions about the law of contracts. Some of these questions were relatively routine, while others demanded research into hard-to-find articles and journals from way back. Of course, students worked together in study groups and so shared the load across four or five people, but even so the assignment was essentially impossible to complete in the time allotted.
While sharing their despair, our protagonists -- Mr. Ha-a-a-rt and his study group -- stumbled upon a plan: why not share the load with other study groups, too? Soon, all the study groups were working out trades. They couldn't trade answers one for one, because some groups had worked on the hardest questions first, so the answers they offered were more valuable than those of a group that had cherry-picked the easy ones first. By the end of the episode, the groups had drawn up contracts to guide the exchange of information, executed the deals, and submitted a complete set of answers to Kingsfield.
As the students submitted the assignments, some worried that they had violated the spirit of individual work expected of them in the classroom. But Hart realized that they had, in fact, fulfilled Kingsfield's plan for them. Only by negotiating contracts and exchanging information could they conceivably complete the assignment. In the process, they learned a lot about the law of contracts from the questions they answered, but they had learned even more about the law of contracts by living it. Kingsfield, the master, had done it again.
So, why this episode? Yesterday I received an e-mail message from one of our grad students, who has read some of my recent entries about helping students to learn new practices. (The most recent strand of this conversation started with this entry.) He proposed:
... I wonder if you have considered approaches to teaching that might include assigning projects or problems that are darned-near unsolvable using the methods you see the students using that you wish to change? ... you can't really force anyone to open up or see change if they don't feel there is anything fundamentally wrong with what they are doing now.
This is an important observation. A big part of how I try to design homework assignments involves finding problems that maximize the value of the skills students have learned in class. Not all assignments, of course; some assignments should offer students an opportunity to practice new ideas in a relatively stress-free context, and others should give them a chance to apply their new knowledge to problems that excite them and are methodology-neutral.
But assigning problems where new practices really shine through is essential, too. Kingsfield's approach is a form of Extreme Exercise, where students can succeed only if they follow the program laid out in class.
In university courses on programming and software development, this is a bigger challenge than in a law school. It is Conventional Wisdom that the best programmers are ten or more times more productive than their weakest colleagues. This disparity is perhaps nowhere wider than in the first-year CS classroom, where we have students everywhere along the continuum from extensive extracurricular experience and understanding to just-barely-hanging-on in a major that is more abstract and demanding than previously realized. Kingsfield's approach works better with a more homogeneous student body -- and in an academic environment where the mentality is "sink or swim", and students who fail to get it are encouraged to pursue another profession.
I still like the idea behind the idea, though, and think I should try to find an exercise that makes, say, test-driven design and refactoring the only real way to proceed. I've certainly done a lot of thinking along these lines in trying to teach functional programming in my Programming Languages course!
Practical matters of educational psychology confound even our best efforts, though. For example, my reader went on to say:
If a teacher can come up with scenarios that actually ... require the benefits of one approach over the other, I suspect that all but the most stubborn would be quick to convert.
In my years as a teacher, I've been repeatedly surprised by the fact that no, they won't. Some will change their majors. Others will cling to old ways at the cost of precious grade points, even in the face of repeated lack of success as a programmer. And, of course, those very best students will find a way to make it work using their existing skills, even at the cost of precious grade points. Really, though, why do we penalize students so much for succeeding in a way we didn't anticipate?
With the image of Professor Kingsfield firmly planted in my mind, I will think more about how the right project could help students by making change the only reasonable option. I could do worse than to be loved for having the wisdom of Kingsfield.
And, as always, thanks to readers for their e-mail comments. They are an invaluable part of this blog for me!
I recently saw a statement somewhere, maybe on the XP mailing list, to the effect that XP doesn't prevent people who will fail from failing; it helps people who will succeed to succeed more comfortably or more efficiently.
I think that, in one sense, this is true. No software methodology can do magic. People who for whatever reason are unprepared to succeed are unlikely to succeed. And I believe that good developers can do even better when they work in an agile fashion.
But in another sense I think that XP and the other agile methods can help developers get better at what they do. If a person makes a commitment to improve, then by adopting XP's disciplines he can become a better developer.
This all came to the front of my mind yesterday as I read a short essay over at 43 Folders on systems for improving oneself. Merlin's comments on self-improvement systems caused a light bulb to go off: XP is a self-help system! Consider...
Agile software methods draw on a desire to get better by paying attention to what the process and code tell us and then feeding that back into the system -- using what we learn to change how we do what we do.
Practices such as continuous unit testing provide the feedback. The rhythm of the test-code-refactor cycle accentuates the salience of feedback, making it possible for the developer to make small improvements to the program over and over and over again. The agile methods also encourage using feedback from the process to fine-tune the process to the needs of the team, the client, and the project.
Improvement doesn't happen by magic. The practices support acquiring information and feeding it back into the code.
A person is more likely to stick with a system if it is simple enough to perform regularly and encourages small corrections.
Merlin proposes that all successful self-improvement systems embody...
... a few basic and seemingly immutable principles:
- action almost always trumps inaction
- planning is crucial, even if you don't follow a given plan
- things are easier to do when you understand why you're doing them
- your brain likes it when you make things as simple as possible
That sure sounds like an agile approach to software development. How about this:
... the idea basically stays the same: listen critically, reflect honestly, and be circumspect about choosing the parts that comport with your needs, values, and personal history. Above all, remember that the secret code isn't hiding ... -- the secret is to watch your progress and just keep putting one foot in front of the other. Keep remembering to think, and stay focused on achieving modest improvements in whatever you want to change. Small changes stick.
Any software developer who wants to get better could do much worse than to adopt this mindset.
I've been so busy that writing for the blog has taken a back seat lately. But I have run across plenty of cool quotes and links recently, and some are begging to be shared.
Why Be a Scientist?
Jules Henri Poincare said...
The scientist does not study nature because it is useful. He studies it because he delights in it, and he delights in it because it is beautiful.
... as quoted by Arthur Evans and Charles Bellamy in their book An Inordinate Fondness for Beetles.
How to Get Better
When asked what advice he would give young musicians, Pat Metheny said:
I have one kind of stock response that I use, which I feel is really good. And it's "always be the worst guy in every band you're in." If you're the best guy there, you need to be in a different band. And I think that works for almost everything that's out there as well.
I remember when I first learned this lesson as a high school chess player. Hang out with the best players you can find, and learn from them.
(I ran across this at Chris Morris's cLabs wiki, which has some interesting stuff on software development. I'll have to read more!)
All Change is Stressful
Perryn Fowler reminds us:
All change is stressful - even if we know it's good change. ...
Those of us who attempt to act as agents of change, whether within other organisations or within our own, could do well to remember it.
Perryn writes in the context of introducing agile software methods into organizations, but every educator should keep this in mind, too. We are also agents of change.
It's easy to make students jump through hoops in a course. What's hard is convincing them that jumping through those hoops after the course is over really will make their lives better. The best way I've found so far is to bring in experienced programmers who are doing exciting things, and have them say, "Comments, version control, test-driven development..."
Earlier in the same entry, he suggests that XP succeeds not because of its particular practices, but rather...
... that what really matters is deciding that you want to be a better programmer. If you make a sincere commitment to that, then exactly how you get there is a detail.
That's spot on with what I said in my last message. Learning happens when a person opens himself to change. That openness makes it possible for the learner to make the commitment to a new behavior. With that commitment, even small changes in practice can grow to large changes in capability. And I certainly concur with Gregg's advice to bring in outsiders who are doing cool things. Some students reach a level of internal motivation in that way that they will never reach through being asked to change on their own.
I realized yesterday that some of my students are approaching the course with an attitude of changing as little as possible: figure out how to do the assignments without becoming a different kind of programmer, or person. That makes learning a new set of practices, new habits, nearly impossible. They may not be doing it consciously, but I can see it in their behavior. That attitude has a place and time, but in the classroom -- where learning is the presumed goal -- it is an impediment.
Lecturing on some dry course content and giving exams full of objective questions would be a lot easier...
The temperature here has risen to unseasonably high levels the last week or so. That means that I am able to run outdoors again. And I love it -- fresh air and open space are where the great running is. I live where the American Midwest meets its Great Plains, so by most folks' standard the land here is flat. But I live near a river, and we do have gently rising and falling terrain, which makes every run more interesting and more challenging than any track can offer.
One thing I notice whenever I am able to break away from track running is an increase in the variability of my pace. When I run on a short indoor track, I usually find myself running relatively steady lap times, drawn into a rhythm by the short, repetitive environment.
Another thing I notice is that I tend to run faster than I'd like, even on days I'd rather take it easy. One good result of this is that I get faster, but the negative side effect is that I end up more tired all week long. That affects the other parts of my life, like my teaching and my time with my family.
You might think that a couple of seconds per lap -- say, 52-second laps instead of 54 -- wouldn't make that much difference. That's less than 4%, right? But a small difference in effort can have a big effect on the body. That small difference compounds at every lap, much like interest in a bank account. What feels comfortable in the moment can be much less so after the fact, when that compounded difference makes itself apparent. There can be value in such stresses ("the only way to get faster is to run faster"), but there are also risks: a depressed immune system, increased susceptibility to injury, and the tiredness I mentioned earlier.
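For the arithmetically inclined, here is a back-of-the-envelope sketch of how that small per-lap difference compounds. The track length and run distance are assumptions for illustration, not measurements from an actual workout:

```python
def total_seconds(lap_seconds, laps=40):
    """Total time at a steady lap pace (40 laps of a 200m track, roughly 5 miles)."""
    return lap_seconds * laps

easy = total_seconds(54)   # 2160 seconds at the easier pace
hard = total_seconds(52)   # 2080 seconds at the faster pace

# Less than a 4% difference per lap, yet 80 extra seconds of harder
# effort have accumulated by the end of the run.
print(easy - hard)   # 80
```

A two-second difference feels like nothing on lap one; it is the accumulation over forty laps that the body notices.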
Most runners learn early to respect small changes and to use them wisely. They learn to mix relatively easy runs and even off days in with their harder runs as a way to protect the body from overuse. Folks who train for a marathon are usually told never to increase their weekly mileage by more than 10% in a given week, and to drop back every second or third week in order to let the body adjust to endurance stress.
At first, the 10% Rule seems like an inordinate restriction. "At this rate, it will take me forever to get ready for the marathon!" Well, not forever, but it will take a while. Most people don't have any real choice, though. The human body isn't tuned to accepting huge changes in endurance very quickly.
But there is hope, in the bank account analogy above. You may have heard of the Rule of 72, an old heuristic from finance that tells us roughly how quickly a balance can double. If a bank account draws 5% interest a year, then the balance will double in roughly 72/5 ≈ 14 years. At 10% interest, it will double in about seven. This is only a heuristic, but its estimates are pretty close to the real numbers.
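You can check the heuristic against the exact answer with a few lines of Python. The function names here are my own, but the math is the standard compound-growth formula:

```python
import math

def rule_of_72(rate_percent):
    """Heuristic estimate of periods needed to double at a compound rate."""
    return 72 / rate_percent

def exact_doubling_time(rate_percent):
    """Exact periods to double under compound growth: log(2) / log(1 + r)."""
    return math.log(2) / math.log(1 + rate_percent / 100)

# At 5%, the heuristic says 14.4 periods; the exact answer is about 14.2.
# At 10%, the heuristic says 7.2; the exact answer is about 7.27.
```

Not bad for a rule you can apply in your head mid-run.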
Applied to our running, the Rule of 72 reminds us that if we increase our mileage 10% a week, then we can double our mileage in only seven weeks! Throw in a couple of adjustment weeks, and still we can double in 10 weeks or less. And that's at a safe rate of increase that will feel comfortable to most people and protect their bodies from undue risks at the same time. Think about it: Even if you can only jog three miles at a time right now, you could be ready to finish a marathon in roughly 30 weeks! (Most training plans for beginners can get you there faster, so this is really just an upper bound...)
What does this all have to do with software development? Well, I have been thinking about how to encourage students, especially those in my first-year course, to adopt new habits, such as test-driven design and refactoring. I had hoped that, by introducing these ideas early in their curriculum, they wouldn't yet be too set in their ways, with old habits deeply ingrained. But even as second-semester programmers, many of them seem deeply wedded to how they program now. Of course, programmers and students are people, too, so they bring with them cognitive habits from other courses and other subjects, and these habits interact with new habits we'd like them to learn. (Think deeply about the problem. Write the entire program from scratch. Type it in. Compile it. Run it. Submit it. Repeat.)
How can I help them adopt new practices? The XP mailing list discusses this problem all the time, with occasional new ideas and frequent reminders that people don't change easily. Keith Ray recently posted a short essay with links to some experience reports on incremental versus wholesale adoption of XP. I've been focusing on incremental change for the most part, due to the limits of my control over students' motivation and behavior.
The 10% Rule is an incremental strategy. The Rule of 72 shows that such small changes can add up to large effects quickly.
If a student spends 10 minutes refactoring on the first day, and then adds 10% each subsequent day, he could double his refactoring time in a week! Pretty soon, refactoring will feel natural, a part of the test-code-refactor rhythm, and he won't need to watch the clock any more.
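The arithmetic behind that claim is easy to verify. Here is a small sketch, using the starting time and growth rate from the example above:

```python
def days_to_double(start_minutes=10, daily_growth=0.10):
    """Count days of 10%-per-day growth needed to double a daily practice time."""
    minutes, days = start_minutes, 0
    while minutes < 2 * start_minutes:
        minutes *= 1 + daily_growth
        days += 1
    return days

print(days_to_double())   # 8
```

Eight days to double -- just over the 72/10 ≈ 7 periods the Rule of 72 predicts, and close enough to "a week" for a blog entry.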
I'm not sure how to use this approach with testing. So far, I've just started with small exercises and made them a bit larger as time passed, so that the number of tests needed has grown slowly. But I know that many still write their tests after they think they are done with the assignment. I shouldn't complain -- at least they have tests now, whereas before they had none. And the tests support refactoring. But I'd like to help them see the value in writing the tests sooner, even first.
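To make the contrast concrete, here is a toy illustration of the test-first rhythm. The function and its behavior are invented for this example, not taken from any course assignment; the point is the ordering, with the test existing before the code it exercises:

```python
# Step 1: write the test first. It fails until the code below exists.
def test_reverse_words():
    assert reverse_words("hello world") == "world hello"
    assert reverse_words("") == ""

# Step 2: only now write just enough code to make the test pass.
def reverse_words(sentence):
    return " ".join(reversed(sentence.split()))

test_reverse_words()   # passes silently once the code is in place
```

Writing the test first forces the student to decide what the code should do before deciding how it will do it -- which is exactly the habit I'd like them to build.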
Together, the 10% Rule and the Rule of 72 can result in big gains when the developer commits to a new practice in a disciplined way. Without commitment, change may well never happen. A runner who doesn't run enough miles, steadily building stamina and strength, isn't likely to make it to the level of a marathon. That discipline is essential. The 10% Rule offers a safe and steady path forward, counting on the Rule of 72 to accumulate effects more quickly than you might realize.
Following James Tauber's lead, I went vanity surfing at Technorati and found a pleasant surprise: One of my readers, Giulio Piancastelli, proclaimed Knowing and Doing to be his Weblog of the Year for 2004. I'm honored that Giulio would single this blog out in such a way. It's humbling to know that readers find something valuable here. I may not leave listeners inexplicably grinning like idiots, but maybe I am serving the needs of actual readers. What a nice way to end my day. Thanks, Giulio.
Recently I wrote about teacher Al Cullum, motivated by Rich Pattis's recommendation of the recent PBS movie about his teaching career and ideas. I finally finished Cullum's Push Back the Desks, the 1967 Citation Press book that introduced his teaching style to a wide audience. It's a light and easy read, but I kept slowing down to jot down the ideas that Cullum's stories sparked in my mind.
The book is organized around chapters that present one of Cullum's exercises for "pushing back the desks" and creating an active world of learning for his elementary school students. To help folks in all subject areas, he describes how his techniques can be applied across the curriculum. Some are not too surprising, such as writing a newspaper about a topic or putting on class plays. These are standard techniques in schools today.
I recognized a couple of Cullum's techniques as pedagogical patterns sometimes used in college CS classrooms:
Others of Cullum's techniques sounded new to me. For example, he had his students choose a U.S. president, write his inauguration speech based on what he actually did as president, and deliver the speech to the class, which acted as the Congress by voting thumbs up or thumbs down on the president's agenda. He also described Geography Launchings, the Poetry Pot, the Longfellow Lab, and the Newspaper Quiz (interesting for being a race mostly against oneself).
The technique I am most likely to use this semester is the Pulitzer Prize. At the beginning of the school year, Cullum announced that students could enter one of their written works in a variety of genres (essay, poem, and so on) into an award competition à la the Pulitzer Prizes. Participation was fully voluntary, and students could enter any work they wished, even one they wrote outside of class. He would work with students to help them improve their work throughout the year, to the degree they requested. At the end of the year, Cullum selected zero or more winners in each genre, based on the number and quality of the entrants. This might make a nice way to encourage students in a programming class to go beyond an assignment and its time constraints to craft a fine program. The genres could be things like Assigned Program, Freelance Program, Test Class, GUI Program, Text-based Program, and so on.
Another neat idea for a CS course is the Renoir Room, in which students found, studied, and discussed the works of the famous painter. How about a CS course built around the great works of one of our great artists: classical artists such as Knuth or McCarthy, or even postmodern artists such as Larry Wall?
The real joy in this book is not the techniques themselves but rather the spirit Cullum brings to his classroom.
The fulcrum idea in Cullum's philosophy is a touch of greatness. Teach by exposing students to greatness, letting them respond to it, and then learn out of their own motivation to live in the presence of greatness. And Cullum doesn't mean "artificial greatness" created "at grade level" for students. He means Shakespeare, Longfellow, Chaucer, Dickinson, and even Big Ideas from math and science. (Do you note a recurring theme?)
Here are some of the quotes that I wanted to remember as inspiration. The emphasis is mine.
I have found that children are interested in two things -- doing and doing now. Children are looking for the unexpected, not the safe; children are looking for noise and laughter, not quiet; children are looking for the heroic, not the common. 
Sadly, our K-12 educational system tends to beat this energy out of students long before they get to us. But I don't think the fire has been extinguished, only masked. With some encouragement, and evidence that the instructor is serious about having real fun and learning in the classroom, most of my students seem to open up nicely. I've been most successful doing this in my algorithms course and my now-defunct user interface/professional responsibility course.
When I first began teaching, there was Al Cullum the teacher and Al Cullum the person. I soon discovered that this split personality was not a healthy one for the children or for me. I realized I had better bury Al Cullum the teacher and present Al Cullum the person, or else the school year would become monotonous months of trivia. I began to share with my students my moments of joy, my moments of love, my moments of scholarship, and even my uncertainties. 
That last sentence reminds me of a quote I read on someone else's blog page, about how the honest teacher presents his students with an honest picture of him or herself, as a scholar who doesn't know much but who searches for understanding. In my experience, students respond to this sort of honesty with openness and honesty of their own. Exposing the "real person" to students requires a certain kind of confidence in a teacher, but I think that such confidence is a habit that can be developed with practice. Don't wait to become confident. Be confident.
Many times as a beginning teacher I used to say to my classes:
"I insist you write complete sentences!"
"How can you possibly forget to put in the period?"
"This composition is too short!"
"This composition is too long!"
... One day I heard the echoes of my admonitions, and in an embarrassed fashion I asked myself, "What have you written lately?"
Programming teachers must write programs. It gives them the context within which to teach, and to learn with their students. Otherwise, instruction becomes nothing but surface information from the textbook du jour, with no reality for students. Plus, how can we stay alive as teachers of programming if we don't write programs?
Writing our own programs also reminds us what's hard for our students, so that we can better help them overcome the real obstacles they face rather than the artificial ones we create in our minds and in our theories.
Good schools introduce students to as many new worlds as possible. 
A love of reading is developed through students and teachers sharing what they have read. 
We can help our students develop a love of reading programs in the same way. I think that a love of writing programs grows in a similar fashion.
Once a teacher loses the feeling of doing something for the first time, it is time for the teacher to change grades, schools, or professions.
I've always asked to teach different courses every few semesters, for just this reason. If I teach any course for too long, it becomes stale, because I become stale. When I must, as is the case with our introductory OOP course, then I have to throw myself a change-up occasionally: change language, or program themes, or examples, or development style. Almost anything to keep the course fresh. (This semester, I have yet to reuse a line from my voluminous OOP course notes. I feel more alive (and on edge!) each day than I've felt in a long while.)
I haven't heard of a student who died from being challenged too much, but I've heard of many who wasted away from boredom. 
This may be true in spirit, but students can be challenged too much. They need to have enough background, both in content and style, before they can rise to meet challenging problems. Hitting them too hard too soon is a recipe for revolt or, worse, desertion. (Those are especially unattractive outcomes at a time of falling enrollments nationwide.)
There are two aspects of every classroom -- the students and the teacher! Both need to be touched by greatness. Students seek the mystery and magic of school that was there the first day they entered the hallowed halls. Give them the magic again ... 
Be yourself. Dare to experiment. Be touched by greatness yourself. Live in the creative act, too.
Many eyes were moist; I knew mine were partly because of Longfellow's poetic gifts and partly because of one hundred and one students who had confirmed my opinion that they had the ability and nobility to accept a touch of greatness. Their silence at the end of the poem ... told me that school life need not be routine, dull, or one long series of learning basic skills. We teachers must reach the hearts of children before they are impressed with our basic skills. 
Wow. I've known that feeling rarely, but it is magical. An algorithms session when a game or puzzle opens students' minds and hearts and they sense a greatness in the solution. An OOP session when a group of students clicks with, say, the Decorator pattern. Or when a programming languages class clicks with the idea of higher-order functions. A rare and beautiful feeling.
I have a chance to reach this elusive state only when I create a suitable environment, in which a great idea is front and center and the student is motivated and challenged. A lot of it is luck, but as they say, luck favors the prepared.
[M]essy classrooms are perfectly natural if something is happening in that room. If dreams of greatness are to be fulfilled ..., I don't see how teachers can avoid having messy classrooms. Sometimes a neat classroom is a bore. 
By this Cullum means the elementary classroom: paper scraps, glue, paint, easels, costumes, cloth scraps. But the college classroom should be messy, too, at least for a while: an intellectual mess, as ideas are being formed and re-formed, extended and applied. A class gang-writing a program à la XP will be "messy" for a few days, until the product takes shape and the program is refactored and the code makes us find and use a big idea (say, an interface) that resolves the mess.
While reading this book, my wife (a former elementary school teacher) and I both commented that this book poses a serious challenge for our elementary schools as they are right now. This approach requires real knowledge and love of the subject area. Someone teaching science only because the school needs a 5-6 science teacher will have a hard time sharing an abiding love for science and the scientific method. This approach also requires great confidence in one's knowledge and teaching skills.
But great teachers exist. We've all had them. And someone who sincerely wants to have a great classroom can develop the right habits for getting better.
Cullum's philosophy can be just as tough to apply in a college classroom. A lecture section of 200 students. Picking up a new class in a new area, one that extends an instructor beyond the core of his or her expertise, because the old instructor retired or moved. But it can work. I eagerly sought out our programming languages courses and our algorithms course as a means for me to "go deep" and cultivate my love and knowledge of these areas. I could never have taught them if I hadn't wanted to touch their greatness, because I would have bored myself -- and my students -- to death, and killed everyone's spirit in the process. By no means am I a great teacher in these areas yet, but I do think I'm on the right path.
A touch of greatness. I think that's what conferences like OOPSLA do most for me: let me re-connect with the greatness of what I do and think about. Now, how can I make that feeling available to my students...