As promised back in May, I am offering a prize for the best failed idea of the semester in my programming languages course. We are now about halfway through the second unit of the course, on recursive programming techniques. Students finally have tools powerful enough to solve interesting problems, but also tools powerful enough to go spectacularly astray. I want to encourage students to trust their ideas enough to fail, because that is how they will learn to trust their ideas enough to succeed. Rather than fear those inevitable failures, we will have fun with them.
Feel free to check out the page describing the contest and how to participate. As always, I welcome feedback from my readers.
In particular, I am still looking for a catchy, evocative, culturally relevant, funny name for the prize. If this were the late 1980s or early 1990s, I might name it for the film Ishtar, and everyone would know what I meant. For now, I have opted for a more recent high-profile film flop -- a spectacular loser on the balance sheet -- but it doesn't have the same pop for me. I'm not as hip as I used to be and can use some help.
Programming language researchers have the principles, tools, algorithms and abstractions to solve all kinds of problems, in all areas of computer science. However, identifying and evaluating new problems, particularly those that lie outside the typical core PL problems we all know and love, can be a significant challenge. Hence, the goal of this workshop is to identify and discuss problems that do not often show up in our top conferences, but where programming language researchers can make a substantial impact. The hope is that by holding such a forum and associating it directly with a top conference like POPL, we can slowly start to increase the diversity of problems that are studied by PL researchers and that by doing so we will increase the impact that our community has on the world.
I remember when I first started running across papers written by physicists on networks in social science and open-source software. Why were physicists writing these papers? They seemed out of their league. In fact, though, they were something else: curious, and equipped with good tools for studying the problems. Good for them -- and good for the rest of us, too, as they contributed to our understanding of how the world works.
Computer science has even better tools and methods for studying all manner of problems and systems, especially the more dynamic systems. Our ability to reify the language of any domain and then write programs to implement the semantics of the domain is one step up from the models that most mathematicians and physicists bring to the table.
Sometimes we forget the power we have in language. As Tim Ottinger tweeted today, "I think people have lost the idea that OO builds the language you implement in, as well as the product." We forget to use our own ability to create and use language even in an approach built on the premise! And of course we can go farther when we build architectures and virtual machines for domain-specific languages, rather than living inside a relatively restrictive model like Java.
The organizers of Off the Beaten Track remind us to think about the wealth of "principles, tools, algorithms, and abstractions" we possess and can bring to bear on problems far beyond the narrow technical area of programming languages research, from the natural sciences to art and music, from economics and the law to linguistics and education. They even acknowledge that we don't always appreciate the diversity in our own research field and so encourage submissions on "unusual compilers" and "underrepresented programming languages".
The last sentence in the passage above expresses an important ultimate goal: to increase the impact the programming languages community has on the world. I heartily support this goal and suggest that it is an important one not only for programming languages researchers. It is essential that many more of us in computer science look off the beaten track for ways to apply what we have learned to problems far beyond our own borders. If we start focusing on problems that matter to other people, problems that matter, we might just solve them.
My favorite line in the Off the Beaten Track home page is the last item in its bullet list of potential topic areas: Surprise us. Indeed. Surprise us.
I mentioned last time that I've been spending some time with Norvig's work on Strachey's checkers program in CPL. This is fun stuff that can be used in my programming languages course. But it isn't the only new stuff I've been learning. When you work with students on research projects and independent studies, opportunities to learn await at every turn.
A grad student is taking the undergrad programming languages course and so has to do some extra projects to earn his grad credit. He is a lover of Ruby and has been looking at a couple of Scheme interpreters implemented in Ruby, Heist and Bus-Scheme. I'm not sure where this will lead yet, but that is part of the exercise. The undergrad who faced the "refactor or rewrite?" decision a few weeks ago teaches me something new every week, not only through his experiences writing a language processor but also about his program's source and target languages, Photoshop and HTML/CSS.
And as if that weren't enough, someone tweets that Avdi Grimm is sharing his code and notes as he implements Smalltalk Best Practice Patterns in Ruby. Awesome. This Avdi guy is rapidly becoming one of my heroes.
All of these projects are good news. One of the great advantages of working at a university is working with students and learning along with them. Right now, I have a lot on my plate. It's daunting but fun.
Yesterday morning, I was grading the first quiz from my programming languages course and was so surprised by the responses to the first short-answer question that I tweeted in faux despair:
Wow. We discussed a particular something every day in class for three weeks. Student quiz answers give no evidence of this.
Colleagues around the globe consoled me and commiserated. But I forgot that I am also followed by several of my students, and their reaction was more like... panic. Even though I soon followed up with a tweet saying that their Scheme code made me happy, they were alarmed about that first tweet.
It's a new world. I never used to grade with my students watching or to think out loud while I was grading. Twitter changes that, unless I change me and stop using Twitter. On balance, I think I'm still better off. When I got to class, students all had smiles on their faces, some more nervous than others. We chatted. I did my best to calm them. We made a good start on the day with them all focused on the course.
We have reached the end of Week 5, one-third of the way through the course. Despite the alarm I set off in students' minds, they have performed on par with students in recent offerings over the first three homeworks and the first quiz. At this point, I am more concerned with my performance than theirs. After class yesterday, I was keenly aware that the pace of the session was out of sync with the learning curve of the material. The places where I've been slowing down aren't always the best places to slow down, and the places where I've been speeding up (whether intentionally or unintentionally) aren't always the best places to speed up. A chat with one student that afternoon cemented my impression.
Even with years of experience, teaching is hard to get right. One shortcoming of teaching a course only every third semester is that the turnaround time on improvements is so long. What I need to do is use my realization to improve the rest of this offering, first of all this unit on recursive programming.
I spent some time early this week digging into Peter Norvig's Prescient but Not Perfect, a reconsideration of Christopher Strachey's 1966 Sci Am article and in particular Strachey's CPL program to play checkers. Norvig did his usual wonderful job with the code. It's hard to find a CPL compiler these days, and has been since about 1980, so he wrote a CPL-to-Python translator, encoded and debugged Strachey's original program, and published the checkers program and a literate-program essay that explains his work.
This is, of course, a great topic for a programming languages course. Norvig exemplifies the attitude I encourage in my students on Day 1: if you need a language processor, write one. It's just another program. I am not sure yet when I will bring this topic into my course; perhaps when we first talk in detail about interpreters, or perhaps when we talk about parsing and parser generators. (Norvig uses YAPPS, a Python parser generator, to convert a representation of CPL's grammar into a CPL parser written in Python.)
There are some days when I wish I had designed all of my course sessions to be 60 minutes instead of 75, so that we had more luft for opportunistic topics like this one. Or that I could design a free day into the course every 2-3 weeks for the same purpose. Alas, the CS curriculum depends on this course to expose students to a number of important ideas and practices, and the learning curve for some of the material is non-trivial. I'll do my best to provide at least cursory coverage of Norvig's article and program. I hope that a few students will be drawn to his approach to the world -- the computer scientist's mindset.
If nothing else, working through his paper and code excites me, and that will leak over into the rest of my work.
Earlier this month, the New York Times ran a long article exploring the topic of technology in K-12 classrooms, in particular the lack of evidence that the mad rush to create the "classroom of the future" is having any effect on student learning. Standardized test scores are stagnant in most places, and in schools showing improvements, research has not been able to separate the effect of using technology from the effect of extra teacher training.
We should not be surprised. It is unlikely that simply using a new technology will have any effect on student learning. If we teach the same way and have students do the same things, then we should expect student learning to be about the same whether they are writing on paper, using a typewriter, or typing on a computer keyboard. There are certainly some very cool things one can do with, say, Keynote, and I think having those features available can augment a student's experience. But I doubt that those features can have a substantial effect on learning. Technology is like a small booster rocket for students who are already learning a lot and like training wheels for those who are not.
As I read that article, one fact screamed out at me. Computers are being used in classrooms everywhere for almost everything. Everything, that is, except the one thing that makes them useful at all: computer programming.
After reading the Times piece, Clive Thompson pulled the critical question out of it and asked, What can computers teach that textbooks and paper can't? Mark Guzdial has written thoughtfully on this topic in the past as well. Thompson offers two answers: teaching complexity and seeing patterns. (His third answer is a meta-answer, more effective communication between teacher and student.) We can improve both teaching complexity and seeing patterns by using the right software, but -- thankfully! -- Thompson points out that we can do even better if we teach kids even a little computer programming.
Writing a little code is a great vehicle for exploring a complex problem and trying to create and communicate understanding. Using or writing a program to visualize data and relationships among them is, too.
Of course, I don't have hard evidence for these claims, either. But it is human nature to want to tinker, to hack, to create. If I am going to go out on a limb without data, I'd rather do it with students creating tools that help them understand their world than with students mostly consuming media using canned tools. And I'm convinced that we can expand our students' minds more effectively by showing them how to program than by teaching most of what we teach in K-12. Programming can be tedious, like many learning tasks, and students need to learn how to work through tedium to deeper understanding. But programming offers rewards in a way and on a scale that, say, the odd problems at the end of a chapter in an algebra textbook can never do by themselves.
Mark Surman wrote a very nice blog entry this week, Mozilla as teacher, expressing a corporate vision for educating the web-using public that puts technology in context:
... people who make stuff on the internet are better creators and better online citizens if they know at least a little bit about the web's basic building blocks.
As I've written before, we do future teachers, journalists, artists, filmmakers, scientists, citizens, and curious kids a disservice if we do not teach them a little bit of code. Without this knowledge, they face unnecessary limits on their ability to write, create, and think. They deserve the ability to tinker, to hack, to trick out their digital worlds. The rest of us often benefit when they do, because some of the things they create make all of our lives better.
(And increasingly, the digital world intersects with the physical world in surprising ways!)
I will go a step further than Surman's claim. I think that people who create and who are engaged with the world they inhabit have an opportunity to be better citizens, period. They will be more able, more willing participants in civic life when they understand more clearly their connections to and dependence on the wider society around them. By giving them the tools they need to think more deeply and to create more broadly, education can enable them to participate in the economy of the world and improve all our lots.
I don't know if K-12 standardized test scores would get better if we taught more students programming, but I do think there would be benefits.
As I often am, I am drawn back to the vision Alan Kay has for education. We can use technology -- computers -- to teach different content in different ways, but ultimately it all comes back to new and better ways to think in a new medium. Until we make the choice to cross over into that new land, we can spend all the money we want on technology in the classroom and not change much at all.
We have begun to write code in my programming languages course. The last couple of sessions we have been writing higher-order procedures, and next time we begin a three-week unit learning to write recursive programs following the structural recursion patterns in my small pattern language Roundabout.
One of the challenges for students is to learn how to use the patterns without feeling a need to mimic my style perfectly. Some students, especially the better ones, will chafe if I try to make them write code exactly as I do. They are already good programmers in other styles, and they can become good functional programmers without aping me. Many will see the patterns as a restriction on how they think, though in fact the patterns are a source of great freedom. They don't force you to write code in a particular way; they give you tools for thinking about problems as you program.
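The structural recursion idea can be illustrated with a small sketch. This is in Python rather than the Scheme used in the course, and the functions are illustrative examples of the general technique, not drawn from Roundabout itself: the shape of each function follows the shape of the data it consumes.

```python
# Structural recursion over a list: one case in the code for each case
# in the data. Illustrative examples only, not taken from Roundabout.

def list_length(lst):
    # Base case mirrors the empty list; recursive case mirrors head + tail.
    if not lst:
        return 0
    return 1 + list_length(lst[1:])

def squares(lst):
    # The same pattern, building a new list element by element.
    if not lst:
        return []
    return [lst[0] ** 2] + squares(lst[1:])
```

Once a student internalizes the pattern, writing such a function becomes a matter of filling in two blanks: what to return for the empty case, and how to combine the head with the recursive result.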
Again, there is something for us to learn from our writing brethren. Consider a writing course like the one Roger Rosenblatt describes in his book, Unless It Moves the Human Heart: The Craft and Art of Writing, which I have referred to several times, most recently in The Summer Smalltalk Taught Me OOP. No student in Rosenblatt's course wants him to expect them to leave the course writing just like he does. They are in the course to learn elements of craft, to share and critique work, and to get advice from someone with extensive experience as a writer. Rosenblatt is aware of this, too:
Wordsworth quoted Coleridge as saying that every poet must create the taste by which he is relished. The same is true of teachers. I really don't want my students to write as I do, but I want them to think about writing as I do. In them I am consciously creating a certain taste for what I believe constitutes skillful and effective writing.
The course is as much about learning how to think about writing as it is about learning how to write. That's what a good pattern language can do for us: help us learn to think about a class of problems or a class of solutions.
I think this happens whether a teacher intends it consciously or not. Students learn how to think and do by observing their teachers thinking and doing. A programming course is usually better if the teacher designs the course deliberately, with careful consideration of the patterns to demonstrate and the order in which students experience them in problems and solutions.
In the end, I want my students to think about writing recursive programs as I do, because experience as both a programmer and as a teacher tells me that this way of thinking will help them become good functional programmers as soon as possible. But I do not want them to write exactly as I do; they need to find their own style, their own taste.
This is yet another example of the tremendous power teachers wield every time they set foot in a classroom. As a student, I was always most fond of the teachers who wielded it carefully and deliberately. So many of them live on in me in how I think. As a teacher, I have come to respect this power in my own hands and try to wield it respectfully with my students.
P.S. For what it's worth, Coleridge is one of my favorite poets!
Elisabeth Hendrickson recently posted a good short piece, Testing is a Whole Team Activity. She gives examples of some of the comments she hears frequently which indicate a misunderstanding about the relationship between coding and testing. My favorite example was the first:
"We're implementing stories up to the last minute, so we can never finish testing within the sprint."
Maybe I like this one so much because I hear it from students so often, especially on code that they find challenging and are having a hard time finishing.
"If we took the time to test our code, we would not get done on time."
"What evidence do you have that your code works for the example cases?"
"Well, none, really, but..."
"Then how do you know you are done?"
"'done'?" You keep using that word. I do not think it means what you think it means.
For a few years, I have been saving many .xls files as plain text. This is handy when I want to program against the data more often than I want to use spreadsheet tools. It also makes the data more accessible across apps and platforms, which is good now and in the future. While doing this, I have come across a technique that I find useful more generally. Maybe you will, too.
People use spreadsheets for two purposes: structuring data and presenting data. Both Excel and Apple's Numbers offer as much or more functionality for presenting information as they do for manipulating it. For me, the presentation stuff often gets in the way of organizing and manipulating, both by cluttering the UI with commands I don't need to know about and by adding formatting information to my data. The result is a UI more complicated than I need and data files much larger than I need them to be.
When I run into one of those bloated files, I sometimes take a round trip: I export the data to a plain-text format such as CSV, and then open that text file back up in the spreadsheet app.
The result is a clean data file, with the data and its basic structure, but nothing more. No text formatting, no variably spaced rows or columns, and no presentation widgets. When I do work with the data in the spreadsheet app, it's unadorned, just as I like it.
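Once the data lives in plain text, programming against it needs nothing beyond the standard library. A small sketch, with hypothetical course-enrollment data standing in for a real export:

```python
import csv
import io

# Hypothetical CSV export of a small spreadsheet.
raw = """course,section,enrolled
CS1,1,28
CS1,2,31
CS2,1,24
"""

# Sum a column without ever opening a spreadsheet app.
total = sum(int(row["enrolled"]) for row in csv.DictReader(io.StringIO(raw)))
print(total)
```

This kind of one-off script is exactly what the presentation-heavy .xls format tends to get in the way of.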
In Liberating and future-proofing your research data, J.B. Deaton describes a recent tedious effort "to liberate several gigabytes of data from a directory of SigmaPlot files". Deaton works in Python, so he had to go through the laborious process of installing an evaluation copy of SigmaPlot, stepping through each SigmaPlot file workbook by workbook, exporting each to Excel, and then converting the files from Excel to CSV or some other format he could process in Python.
Of course, this was time spent not doing research.
All of this would have been a moot point if the data had been stored as CSV or plain text. I can open and process data stored in CSV on any operating system with a large number of tools, for free. And I am confident that in 10 years' time, I will be able to do the same.
This is a problem we face when we need to work with old data, as Deaton is doing. It's a problem we face when working with current data, too. I wrote recently about how doing a basic task of my job, such as scheduling courses each semester, in a spreadsheet gets in the way of getting the job done as well as I might.
Had Deaton not taken the time to liberate his data, things could have been worse in the long run. Not only would the data have been unavailable to his current project, but it may well have fallen into disuse forever and eventually disappeared.
Kari Kraus wrote this week about the problem of data disappearing. One problem is the evolution of media:
When you saved that unpublished manuscript on [a 5-1/4" floppy disk], you figured it would be accessible forever. But when was the last time you saw a floppy drive?
Well, not a 5-1/4". I do have a 3-1/2" USB floppy drive at home and another in my office. But Kraus has a point. Most of the people creating data aren't CS professionals or techno-geeks. And though I do have floppy drives, I never use them for my own data. Over the years, I've been careful to migrate my data, from floppy drives to zip drives, from CDs to large, replicated hard drives. Eventually it may live somewhere in the cloud, and I will certainly have to move it to the next new thing in hardware sometime in the future.
Deaton's problem wasn't hardware, though, and Kraus points out the bigger problem: custom-encoded data from application software:
If you don't have a copy of WordPerfect 2 around, you're out of luck.
The professional data that I have lost over the years hasn't been "lost" lost. The problem has always been with software. I, too, have occasionally wished I had a legacy copy of WordPerfect lying around. My wife and I created a lot of files in pre-Windows WordPerfect back in the late 1980s, and I continued to use Mac versions of WP through the 1990s. As I moved to newer apps, I converted most of the files over, but every once in a while I still run across an old file in .wp format. At this point, it is rarely anything important enough to devote the time Deaton spent on his conversion experience. I choose to let that data die.
Fortunately, not all of my data from that era was encoded. I wrote most of my grad school papers in nroff. That's also how I created our wedding invitations.
This is a risk we run as more of our world moves from paper to digital, even when it's just entertainment. Fortunately, for the last 5 years or so, I've been storing more and more of my data in plain text or, short of that, rich text. Like Deaton, I am pretty confident that I will be able to read and process that data 10 years hence. And, whenever possible, I have followed an open-file-formats-only policy with my colleagues.
Rather than having to liberate data in the future, it is wiser to let it live free from birth. That reduces friction now and later. Deaton offers a set of preferences that can help you keep your data as free as possible:
- Open source beats closed source.
- Ubiquitous beats niche software.
- Automation/scripting beats manual processes.
- Plain text beats binaries.
- READMEs in every project directory.
That third bullet is good advice even if you are not a computer scientist. Deaton isn't. But you don't have to be a computer scientist to reap the benefits of a little programming!
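As a concrete instance of that third bullet, here is a sketch of the kind of script Deaton's advice points toward: converting a tab-separated export to CSV automatically rather than by hand. The file names and data are hypothetical.

```python
import csv
import os
import tempfile

def tsv_to_csv(src_path, dst_path):
    # Re-encode one tab-separated file as comma-separated, row by row.
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        for row in csv.reader(src, delimiter="\t"):
            writer.writerow(row)

# Demo on a throwaway file; point it at a real directory of exports instead.
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "data.txt")
    with open(src, "w") as f:
        f.write("course\tenrolled\nCS1\t28\n")
    dst = os.path.join(d, "data.csv")
    tsv_to_csv(src, dst)
    with open(dst) as f:
        result = f.read()

print(result)
```

A loop around tsv_to_csv over a directory of exports replaces an afternoon of manual converting, and it can be rerun any time the source files change.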
Or, Throw 'em Away Until You Get It Right
Just before classes started, one of my undergrad research students stopped by to talk about a choice he was weighing. He is writing a tool that takes as input a Photoshop document and produces as output a faithful, understandable HTML rendition of the design and content. He has written a lot of code in Python over the last few months, using an existing library of tools for working with Photoshop docs. Now that he understands the problem well and has figured out what a good solution looks like, he is dissatisfied with his existing code. He thinks he can create a better, simpler solution by writing his own Photoshop-processing tools.
The choice is: refactor the code incrementally until he has replaced all the library code, or start from scratch?
My student's dilemma took me back over twenty years, to a time when I faced the same choice, when I too was first learning my craft.
I was one of my doctoral advisor's first graduate students. In AI, this means that we had a lot of infrastructure to build in support of the research we all wanted to do. We worked in knowledge-based systems, and in addition to doing research in the lab we also wanted to deliver working systems to our collaborators on and off campus. Building tools for regular people meant having reasonable interfaces, preferably with GUI capabilities. This created an extra problem, because our collaborators used PCs, Macs, and Unix workstations. My advisor was a Mac guy. At the time, I was a Unix man in the process of coming to love Macs. In that era, there was no Java and precious few tools for writing cross-platform GUIs.
Our lab spent a couple of years chasing the ideal solution: a way to write a program and run it on any platform, giving users the same experience on all three. The lab had settled more or less on PCL, a portable Common Lisp. It wasn't ideal, but we grad students -- who were spending a lot of time implementing libraries and frameworks -- were ready to stop building infrastructure and start working full-time on our own code.
Then my advisor discovered Smalltalk.
The language included graphics classes in its base image, which offered the promise of write-once-run-anywhere apps for clients. And it was object-oriented, which matched the conceptual model of the software we wanted to build to a T. I had just spent several months trying to master CLOS, Common Lisp's powerful object system, but Smalltalk looked like just what we wanted. So we made the move -- and told our advisor that this would be the last move. Smalltalk would be our home.
I learned the basics of the language by working through every tutorial I could get my hands on, first Digitalk Smalltalk, then ObjectWorks. They were pretty good. Then I wrote some toy programs of my own, to show I was ready to create on my own.
So I started writing my first Smalltalk system: a parser and interpreter for a domain-specific AI language with a built-in inference engine, a simple graphical and table-driven tool for writing programs in these languages, and graphical front end for running systems.
There was a lot going on there, at least for a green graduate student only recently called up to the big leagues. But I had done my homework. I was ready. I was sure of it. I was cocky.
It turned out that I didn't really understand the role that data should play in an OO program. My program soon became a tangle of data dependencies, designed before I understood my solution all that well, and the tangle made the code increasingly turgid.
So I threw it away, rm -r'ed it into nothingness. I started from scratch, sure that Version 2 would be perfect.
I crashed again.
The program was better this time around, but it turned out that I didn't really understand how objects should interact in a large OO program. My program soon became a tangle of objects and wrappers and adapters, whose behavior I could not follow in even the simplest scenarios. The tangle made the code increasingly murky.
So I threw it away -- rm -r again -- and started from scratch. Surely Version 3, based on several weeks of coding and learning, would be just what we wanted.
I crashed yet again. This time, the landing was more gentle, because I really was making progress. But as I coded my third system, I began to see ways to structure the program that would make the code easier to grow as I added features, and easier to change as I got better at design. I was just beginning to glimpse the Land of Design Patterns. But I always seemed to learn each lesson one day too late to use it.
My program was moving forward, creakily, but I just knew it could be better. I did not like the idea of maintaining this code for several years, as we modified apps fielded with our collaborators and as we used it as part of the foundation for the lab's vision.
So I threw it away and wrote my system from scratch one last time. The result was not a perfect program, but one that I could live with and be proud of. It only took me four major iterations and several months of programming.
Looking back, I faced the same decision my student faced recently with his system. Refactor or start over? He has the advantage of having written a better first program than I had, yet he made the sound decision to rewrite.
Sometimes, refactoring really is the better approach. You can keep the system running while slowly corralling data dependencies, spurious object interactions, and suboptimal design. Had I been a more experienced programmer, I might well have chosen to refactor from Version 3 to Version 4 of my program. But I wasn't. Besides, I had neither a suite of unit tests nor access to automated refactoring tools. Refactoring without either of these makes the process scarier and more dangerous than it needs to be.
Maybe refactoring is the better approach most or all of the time. I've read all about how the Great Rewrite is one of those Things You Should Never Do.
But then, there is an axiom from Turing Award winner Fred Brooks that applies better to my circumstance of writing the initial implementation of a program: "... plan to throw one away; you will, anyhow". I find Brooks's advice most useful when I am learning a lot while I am programming. For me, that is one context, at least, in which starting from scratch is a big win: when my understanding is changing rapidly, whether of domain, problem, or tools. In those cases, I am usually incapable of refactoring fast enough to keep up with my learning. Starting over offers a faster path to a better program than refactoring.
On that first big Smalltalk project of mine, I was learning so much, so fast. Smalltalk was teaching me object-oriented programming, through my trial and error and through my experience with the rest of the Smalltalk class hierarchy. I had never written a language interpreter or other system of such scale before, and I was learning lessons about modularity and language processing. I was eager to build a new and improved system as quickly as I could.
In such cases, there is nothing like the sound of OS X's shredder. Freedom. No artificial constraints from what has suddenly become legacy code. No limits from my past ignorance. A fresh start. New energy!
This is something we programmers can learn from the experience of other writers, if we are willing. In Unless It Moves the Human Heart: The Craft and Art of Writing, Roger Rosenblatt tells us that Edgar Doctorow ...
... had written 150 pages of The Book of Daniel before he'd realized he had chosen the wrong way to tell the story. ... So one morning Edgar tossed out the 150 pages and started all over.... I wanted the class to understand that Edgar was happy to start from scratch, because he had picked the wrong door the first time.
Sometimes, the best thing a programmer can admit is that he or she has picked the wrong door and walked down the wrong path.
But Brooks warns of a danger programmers face on second efforts: the Second System Effect. As Scott Rosenberg writes in Code Reads #1: The Mythical Man-Month:
Brooks noted that an engineer or team will often make all the compromises necessary to ship their first product, then founder on the second. Throughout project number one, they stored up pet features and extras that they couldn't squeeze into the original product, and told themselves, "We'll do it right next time." This "second system" is "the most dangerous a man ever designs."
I never shipped the first version of my program, so perhaps I eluded this pitfall out of luck. Still, I was cocky when I wrote Version 1, and then I was cocky when I wrote Version 2. But both versions humbled me, humbled me hard. I was a good programmer, maybe, but I wasn't good enough. I had a lot to learn, and I wanted to learn it all.
So it was relatively easy to start over on #3 and #4. I was learning, and I had the luxury of time. Ah, to be a grad student again!
In the end, I wrote a program that I could release to users and colleagues with pride. Along the way, Smalltalk taught me a lot about OOP, and writing the program taught me a lot about expert system shells. It was time well spent.
I returned graded Homework 1 today at the end of my Programming Languages class. I'm still getting to know faces and names in the group, so I paid attention to each person as he took his assignment from me. Occasionally, I glanced at the grade written atop the paper as I was handing it back. One grade caught my eye, and I made eye contact with the student. The grade was 10/20. Or it could have been 12/20, or 8/20, no matter. The point is the same: not an auspicious start to the semester.
Earlier, during class, I had noticed that this student was not taking notes. Sometimes, that's the sign of a really good student. He looked confident. Halfway through class, I gave a small exercise for students to work on, box-and-pointer diagrams for Scheme pairs and lists. Still nothing from the student. Perhaps his head was roiling in furious thought, but there was no visible evidence of thought. A calm sea.
When I saw the grade on his assignment, I leapt reflexively to a conclusion. This guy will be an early casualty in the course. He may drop early. He may limp along, scoring 50%, plus or minus a few points, all semester, and end up with a D or an F or, if he is truly fortunate, the C- he needs to count it toward his program of study.
That may seem like a harsh sort of profiling, but I've seen this situation many times, especially in this course. If you don't take Scheme seriously, if you don't take functional programming seriously, ideas and patterns and challenges can snowball quickly. I have a friend at another university who speaks of a colleague as telling his students on Day One of every course, There are only two ways to take my course: seriously, and again. That's how I feel about most of my courses, but especially this one. We can have a lot of fun, when students are present and engaged, seeking mastery. If not, well, it can be a long semester.
Don't think that my reaction means I will shirk my duty and let this student fail. When students struggle early, I try to engage them one-on-one, to see if we can figure out what the impediment is and how we might get them over it. Such efforts succeed too rarely for my tastes. Sadly, the behavior I see in class and on early homework is usually a window into their approach to the course.
So, I was sad for the student. I'll do what I can, but I felt guilty, like a swami looking into his crystal ball and seeing clearly a future that already is.
After class, though, I was sitting in my office and realized that all is not lost. I remembered that there is a second type of student who can start the course with this performance profile. That sort of student is rarer, but common enough to give me hope.
I did not actually think of a second set of students. I thought of a particular student from a recent graduating class. He started this course slowly. The profile wasn't quite the same, because this student took notes in class. Still, he seemed generally disengaged, and his homework scores were mediocre at best. Then, he did poorly on Quiz 1.
But at that point I noticed a change in his demeanor. He looked alive in class. He asked questions after class. He gave evidence of having read the assigned readings and of starting sooner on programs. His next homework was much better. His Quiz 2 was a complete reversal of the first.
This is the second kind of student: poor performance on the early homeworks and quizzes is a wake-up call. The student realizes, "I can't do what I've always done". He sees the initial grades in the course not as a predictor of an immutable future but as a trigger to work differently, to work harder. The student changes his behavior, and as a result changes his results.
The particular student I remembered went on to excel in Programming Languages. He grasped many of the course's deeper concepts. Next he took the compilers course and was part of a strong team.
Sometimes, students are caught off guard by the foreign feel of the ideas in Scheme and functional programming, by the steep learning curve. The ones in the second group, the ones who usually succeed in the course after all, pay attention to early data and feed them back into their behavior.
So, there is hope for the student whose grade I glimpsed this afternoon. That strengthens my resolve to hang in there, offer help, and hope that I can reach him soon enough. I hope he is a Group 2 guy, or can be coaxed into crossing the border from Group 1 to Group 2.
The future is malleable, not frozen.
In two recent entries [ 1 | 2 ], I mentioned that I had recently been reading Roger Rosenblatt's Unless It Moves the Human Heart: The Craft and Art of Writing. Many of you indulge me my fascination with writers talking about writing, and I often see parallels between what writers of code and writers of prose and poetry do. That Rosenblatt also connects writing to teaching, another significant theme of my blog, only makes this book more stimulating to me.
"Unless it moves the human heart" is the sort of thing writers say about their calling, but not something many programmers say. (The book title quotes Aussie poet A. D. Hope.) It is clearly at the heart of Rosenblatt's views of writing and teaching. But in his closing chapter, Rosenblatt includes a letter written to his students as a postscript on his course that speaks to a desire most programmers have for their life's work: usefulness. To be great, he says, your writing must be useful to the world. The fiction writer's sense of utility may differ from the programmer's, but at one level the two share an honorable motive.
This paragraph grabbed me as advice as important for us programmers as it is for creative writers. (Which software people do you think of as you read it?)
How can you know what is useful to the world? The world will not tell you. The world will merely let you know what it wants, which changes from moment to moment, and is nearly always cockeyed. You cannot allow yourself to be directed by its tastes. When a writer wonders, "Will it sell?" he is lost, not because he is looking to make an extra buck or two, but rather because, by dint of asking the question in the first place, he has oriented himself to the expectations of others. The world is not a focus group. The world is an appetite waiting to be defined. The greatest love you can show it is to create what it needs, which means you must know that yourself.
What a brilliant sentence: The world is an appetite waiting to be defined. I don't think Ward Cunningham went around asking people if they needed wiki. He built it and gave it to them, and when they saw it, their appetite took form. It is indeed a great form of love to create what the world needs, whether the people know it yet or not.
(I imagine that at least a few of you were thinking of Steve Jobs and the vision that gave us the Mac, iTunes, and the iPad. I was too, though Ward has always been my hero when it comes to making useful things I had not anticipated.)
It seems fitting on this Labor Day weekend for us to think about all the people who make the world we live in and keep it running. Increasingly, those people are using -- and writing -- software to give us useful things.
I just gave my older daughter a tearful final kiss and hug and left her in the care of the small liberal arts college that will be her home for most of the next four years. I have tried to prepare myself for this moment over the last year, weeks, and days. Nothing could have prepared me for how I feel at this moment.
College faculty and administrators like to speak these days about the "transformative experience" that college will be for their students. After all these years of my wife and me doing everything we knew how to help our daughter grow into the poised, creative, curious, engaging, delightful young woman she has become, it's hard for me to imagine how much more she can grow. Yet we know she will. As she returns to us on breaks and summer vacations (we hope!), I expect not to meet a new person, but the same Sarah we have come to know and respect and love all these years. She will surely know herself better than she does now, and that will open a new side of her to us.
I am eager to watch her become ever more who she is and who she wants to be. I am eager to get to know her more, again, and still. Her future excites me.
But at this moment, I hurt as only a father or mother can.
In her invocation at today's convocation, the college chaplain prayed that the students of the Class of 2015 find "clarity of purpose". I like that phrase. Clarity of purpose can serve as a capable foundation for all these students will do. In many ways, they begin their lives anew today.
But not every young person in that quad today is just a member of the Class of 2015. One of them is my daughter, with whom I have spent so much time for the last eighteen years, teaching her and being taught by her in turn. I have a hard time imagining what those years would have been like without her -- without her boundless energy, without her love of life and books and people, without her smile and hugs. Or without her patient tutelage of a young man who occasionally lacks clarity of purpose in some things but whose sense of duty to her has never wavered.
I love you, sweetie. You are ready to spread your wings and fly. I wish you every good thing in this world and beyond. But I'm not sure I'm ready to say good-bye just yet. I hope you can teach me that, and so much more.