My compiler students are getting to the point where they should be deep in writing a parser for their language. Walking back from lunch, I was thinking about some very simple things they could do to make their lives -- and project -- better.
1. Start.

Yes, start. If you read the literature of the agile software development world, or even of the life-hacker world, people talk about the great power that comes just from taking the first step. I've always loved the old Goethe quote about the power of committing to a course of action. But isn't this all cliché?
It is so easy to put off tracing your language grammar, or building those FIRST and FOLLOW sets, or attacking what you know will be a massive parsing table. It is so easy to be afraid of writing the first line of code because you aren't sure what the whole app will look like.
Take the first step, however small and however scary. I'm always amazed how much more motivated I feel once I break the seal on a big task and have some real feedback from my client or my compiler.
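For the FIRST-set task mentioned above, the first step really can be a couple dozen lines of code. Here is a minimal sketch of the standard fixed-point computation of FIRST sets in Python. To be clear about what's assumed: the toy grammar, the convention that uppercase strings are nonterminals, and the use of "" for epsilon are all my own choices for illustration, not part of any course project.

```python
# A minimal sketch of computing FIRST sets for a toy grammar.
# Convention (assumed, not standard): keys of the dict are
# nonterminals; any symbol not in the dict is a terminal;
# "" denotes epsilon (the empty string).

grammar = {
    "E":  [["T", "E2"]],
    "E2": [["+", "T", "E2"], [""]],   # E' -> + T E' | epsilon
    "T":  [["id"], ["(", "E", ")"]],
}

def first_sets(grammar):
    first = {nt: set() for nt in grammar}
    changed = True
    while changed:                     # iterate to a fixed point
        changed = False
        for nt, productions in grammar.items():
            for prod in productions:
                for sym in prod:
                    if sym == "":      # epsilon production
                        if "" not in first[nt]:
                            first[nt].add("")
                            changed = True
                        break
                    if sym not in grammar:   # terminal: add it, stop
                        if sym not in first[nt]:
                            first[nt].add(sym)
                            changed = True
                        break
                    # nonterminal: add its FIRST set minus epsilon
                    before = len(first[nt])
                    first[nt] |= first[sym] - {""}
                    changed = changed or len(first[nt]) != before
                    if "" not in first[sym]:  # not nullable: stop
                        break
                else:
                    # every symbol in the production was nullable
                    if "" not in first[nt]:
                        first[nt].add("")
                        changed = True
    return first
```

Twenty minutes with something like this, and the "massive parsing table" starts to look a lot less scary.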
2. Always have live code.
We did a lot of things wrong during the 2.5 years of pre-launch Gmail development, but one thing we did very right was to always have live code. ...
Of course none of the code from my prototype ever made it near the real product (thankfully), but that code did something that fancy arguments couldn't do (at least not my fancy arguments): it showed that the idea and the product had real potential.
Your code can tell which ideas are good ones and which are bad ones. It can teach you about the app you are building. It can help you learn what your user wants.
I hear this all the time from students: "We have a pretty good handle on this, but no code yet." Sounds good, but... Live code can convince your professor that you really have done something. It can also help you ask better questions, and it can be submitted on the due date. Don't underestimate the value in that.
As Buchheit says from the Gmail experience, spend less time talking and more time prototyping. You may not be Google, but you can realize the same benefits as those guys. And with version control you don't have to worry about taking the wrong step; you can always back up.
3. Don't forget what you know.
Okay, I have to admit that this did not occur to me on my walk home from lunch. This afternoon, a former student and local entrepreneur gave a department seminar on web app security. He twice mentioned that many of the people he hires have learned many useful skills in object-oriented design and software engineering, using systems languages such as Java and Ada. When they get to his shop, they are programming in a scripting language such as PHP. "And they throw away all they know!" They stop using the OOP principles and patterns they have learned. They stop documenting code and testing. It's as if scripting occurs in a different universe.
As he pointed out after the talk, all of those skills and techniques and practices matter just as much -- no, more -- when using a language with many power tools, few boundaries, and access to all of his and his clients' data and filesystem.
When building a compiler in class, or any other large-scale team project in a capstone course, all of those skills and techniques and practices matter, too, sometimes for the first time in a student's career. This is likely the largest and most sophisticated program they have ever written. It is the first time they have ever had to depend on one or two or five other students to get done, to understand and work with others' code, and to turn their own code over for the understanding and use of their teammates.
There is a reason that you are learning all this stuff. It's for the project you are working on right now.
Confluence... The Education column in the February 2009 issue of Communications of the ACM, Human Computing Skills: Rethinking the K-12 Experience, champions computational thinking in lieu of programming:
Through the years, despite our best efforts to articulate that CS is more than "just programming," the misconception that the two are equivalent remains. This equation continues to project a narrow and misleading image of our discipline -- and directly impacts the character and number of students we attract.
I remain sympathetic to this concern. Many people, including lost potential majors, think that CS == programming. I don't know any computer scientists who think that is true. I'd like for people to understand what CS is, and for potential majors who end up not wanting to program for a living to know that there is room for them in our discipline. But pushing programming aside altogether is the wrong way to do that, and will do more harm than good -- even for non-computer scientists.
It seems to me that the authors of this column conflate CS with programming at some level, because they equate writing a program with "scholarly work" in computer science:
While being educated implies proficiency in basic language and quantitative skills, it does not imply knowledge of or the ability to carry out scholarly English and mathematics. Indeed, for those students interested in pursuing higher-level English and mathematics, there exist milestone courses to help make the critical intellectual leaps necessary to shift from the development of useful skills to the academic study of these subjects. Analogously, we believe the same dichotomy exists between CT, as a skill, and computer science as an academic subject. Our thesis is this: Programming is to CS what proof construction is to mathematics and what literary analysis is to English.
In my mind, it is a big -- and invalid -- step from saying "CT and CS are different" to saying that programming is fundamentally the domain of CS scholars. I doubt that many professional software developers will agree with a claim that they are academic computer scientists!
I am familiar with Peter Naur's Programming as Theory Building, which Alistair Cockburn brought to the attention of the software development world in his book, Agile Software Development. I'm a big fan of this article and am receptive to the analogy; I think it gives us an interesting way to look at professional software development.
But I think there is more to it than what Naur has to say. Programming is writing.
Back to the ACM column. It's certainly true that, at least for many areas of CS, "The shift to the study of CS as an academic subject cannot ... be achieved without intense immersion in crafting programs." In that sense, Naur's thesis is a good fit. But consider the analogy to English. We all write in a less formal, less intense way long before we undertake literary analysis or even intense immersion in composition courses. We do so as a means of communicating our ideas, and most of us succeed quite well doing so without advanced formal training in composition.
How do we reach that level? We start young and build our skills slowly through our K-12 education. We write every year in school, starting with sentences and growing into larger and larger works as we go.
I recall that in my junior year English class we focused on the paragraph, a small unit of writing. We had written our first term papers the year before, in our sophomore English course. At the time, this seemed to me like a huge step backward, but I now recognize this as part of the Spiral pattern. The previous year, we had written larger works, and now we stepped back to develop further our skills in the small after seeing how important they were in the large.
This is part of what we miss in computing: the K-8 or K-12 preparation (and practice) that we all get as writers, done in the small and across many other learning contexts.
Likewise, I disagree that proof is solely the province of mathematics scholars:
Just as math students come to proofs after 12 or more years of experience with basic math, ...
In my education, we wrote our first proofs in geometry -- as sophomores, the same year we wrote our first term papers.
I do think one idea from the article and from the CT movement merits more thought:
... programming should begin for all students only after they have had substantial practice acting and thinking as computational agents.
Practice is good! Over the years, I have learned from CS colleagues many effective ways to introduce students, whether at the university or earlier, to ideas such as sorting algorithms, parallelism, and object-oriented programming by role play and other active techniques -- through the learner acting as a computational agent. This is an area in which the Computational Thinking community can contribute real value. Projects such as CS Unplugged have already developed some wonderful ways to introduce CT to young people.
Just as we grow into more mature writers and mathematical problem solvers throughout our school years, we should grow into more mature computational thinkers as we develop. I just don't want us to hold programming out of the mix artificially. Instead, let's look for ways to introduce programming naturally where it helps students understand ideas better. Let's create languages and build tools to make this work for students.
As I write this, I am struck by the different noun phrases we are using in this conversation. We speak of "writers", not "linguistic thinkers". People learn to speak and write, to communicate their ideas. What is it that we are learning to do when we become "computational thinkers"? Astrachan's plea for "computational doing" takes on an even more XXXXX tone.
Alan Kay's dream for Smalltalk has always been that children could learn to program and grow smoothly into great ideas, just as children learn to read and write English and grow smoothly into the language and great ideas of, say, Shakespeare. This is a critical need in computer science. The How to Design Programs crowd have shown us some of the things we might do to accomplish this: language levels, tool support, thinking support, and pedagogical methods.
While deep knowledge of programming is not essential to understanding all basic computer science, some knowledge of programming adds so very much even to our basic ideas.
And we decided to innovate our way through this downturn, so that we would be further ahead of our competitors when things turn up.
"This downturn" was the dot.com bust. The speaker was Steve Jobs. The innovations were the iPod and iTunes. Seems to have worked out fine.
My agile friends are positioned well to innovate through our current downturn, as are the start-ups that other friends and former students run. It is something of a cliché but true nonetheless. Recessions can be a good time for people and organizations that are able -- and willing -- to adapt. They can be an opportunity as much as a challenge.
I hope that the faculty and staff of my university can approach these troubled budget times with such an attitude. In five years, we could be doing a much better job for our students, our state, and our respective academic disciplines.
I've heard from a few of you about my previous post. People have strong feelings in both directions. If you haven't seen it already, check out Mark Guzdial's commentary on this topic. Mark explores a bit further what it means to understand algorithms and data structures without executing programs, and perhaps without writing them. I'm glad that he is willing to stake out a strong position on this issue.
Those of you who receive inroads, SIGCSE's periodical, should watch for a short article by Owen Astrachan in the next issue, called "Cogito Ergo Hack". Owen hits the target spot-on: without what he calls "computational doing", we miss a fantastic opportunity to help people understand computational ideas at a deeper level by seeing them embodied in something they themselves create. Computational doing might involve a lot of different activities, but programming is one of the essential activities.
We need as many people as possible, and especially clear thinkers and writers like Mark and Owen, to ask the questions and encourage others to think about what being a computational thinker means. Besides, catchy phrases like "computational doing" and "Cogito Ergo Hack" are likely to capture the attention of more people than my pedestrian prose!
Tweet of the Day
Haskell is a human-readable program compression format.
-- Michael Feathers
Maybe we should write a generator that produces Haskell.
Non-Tech Blog of the Day
Earlier in my career I worked hard to attract attention. ... The problem with this approach is that eventually it all burns down to ashes and no one knows a thing more about software development than they did before.
-- Kent Beck
Seek truth. You will learn to focus your life outside your own identity, and it makes finding out you're wrong not only acceptable, but desirable.
Last week, I read a paper on changes in how statistics is taught. In the last few decades, more schools have begun to teach stats conceptually, so that the general university graduate might be able to reason effectively about events, variation, and conditions. This contrasts with the older style, in which it was taught as a course for mathematicians, with the focus on formulas and mastery of underlying theorems. The authors said that the new style emphasized statistical thinking, rather than traditional statistics.
For some reason, this brought to mind the buzz around "computational thinking" in the CS world. I have to be honest: I don't know exactly what people mean when they talk about computational thinking. I think the idea is similar to what the stats guys are talking about: using the ideas and methods discovered in computer science to reason about processes in the real world. I certainly agree that most every college graduate could benefit from this, and that popularizing these notions might do wonders for helping students to understand why CS is important and worth considering as a major and career.
But when I look at the work that passes under the CT banner, I have a hard time distinguishing computational thinking from what I would call a publicly-accessible view of computer science. Maybe that's all it is: an attempt to offer a coherent view of CS for the general public, in a way that all could begin to think computationally.
The one thing that stands out in all the papers and presentations about CT I've seen is this: no programming. Perhaps the motivation for leaving programming out of the picture is that people find it scary and hard, so omitting it makes for a more palatable public view. Perhaps some people think that programming isn't an essential part of computational thinking. If it's the former, I'm willing to cut them some slack. If it's the latter, I disagree. But that's not surprising to readers here.
While thinking on this, I came across this analogy: computational thinking with no programming is like statistical thinking without any mathematics. That seems wrong. We may well want stats courses aimed at the general populace to emphasize application and judgment, but I don't think we want students to see statistics devoid of any calculation. When we reason about means and variance, we should probably have some idea how these terms are grounded in arithmetic that people understand and can do.
When I tried my analogy out on a colleague, he balked. We don't need much math to reason effectively in a "statistical" way, he said; that was precisely the problem we had before. Is he overreacting? How well can people understand the ideas of mean and standard deviation without knowing how to compute them? How little math can they know and still reason effectively? He offered as an example the idea of a square root. We can understand what a square root is and what it means without knowing how to calculate one by hand. Nearly every calculator has a button for the square root, and most students' calculators these days have buttons for the mean -- and maybe the variance; I'll have to look at my high school daughter's low-end model to see.
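For what it's worth, the arithmetic behind those calculator buttons is small. Here is a tiny Python sketch of the computations themselves -- the kind of grounding the paragraph argues students should have, even if a button normally does the work. The sample scores are made up for illustration.

```python
# Mean, (population) variance, and standard deviation by hand.

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    # average squared distance from the mean
    return sum((x - m) ** 2 for x in xs) / len(xs)

def std_dev(xs):
    # standard deviation is just the square root of the variance
    return variance(xs) ** 0.5

scores = [70, 80, 90, 100]
print(mean(scores))      # 85.0
print(variance(scores))  # 125.0
```

Nothing here is beyond middle-school arithmetic, which is rather the point: the concepts can be grounded in calculation people understand and can do.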
For the most part, my colleague feels similarly about programming for everyone. His concern with CT is not eliminating programming but what would be taught in lieu of programming. Many of the articles we have seen on CT seem to want to replace programming with definitions and abstractions that are used by professional computer scientists. The effect is to replace teaching a programming language with teaching a relatively formal "computational thinking" language. In his mind, we should replace programming with computational skills and ideas that are useful for people involved in everyday tasks. He fears that, if we teach CT as if the audience is a group of budding computer scientists, we will make the same mistake that mathematics often has: teaching for the specialists and finding out that "everyone else is rotten at it".
The stats teaching paper I read last week says all the right things. I should look at one of the newer textbooks to see how well they carry it out, and how much of the old math skills they still teach and require.
I'm still left thinking about how well people can think computationally without learning at least a little programming. To the extent that we can eliminate programming, how much of what is left is unique to computer science? How far can we take the distinction between formula and algorithm without seeing some code? And what is lost when we do? There is something awfully powerful about watching a program in action, and being able to talk about and see the difference between dynamic behavior and static description.
Two can be as bad as one
It's the loneliest number since the number one
-- Three Dog Night
Two is the number of times I have run since Tuesday afternoon. It is also the number of times I ran between Tuesday and December 21, and twice as many times as I ran in the entire month of January.
I was so excited when I ran three miles on January 30 that I nearly blogged One, because it had been one month since my previous run, and it was my only run in January. (I'll talk about that December 30 run in an upcoming Running Year in Review.)
Good news: The two runs this week have not led to the symptoms that knocked out most of November and December. That's especially good news given that months of medical testing have failed to find their cause. At this point, I have the most accurate snapshot of my body and its health ever: blood tests, metabolic tests, stress tests, sleep tests, gastrointestinal scopes, an MRI of my head, an ultrasound of my internal organs, and finally a marrow biopsy. The tests came back negative, which is positive -- except for finding a cause. I'm ready for the symptoms to disappear and become a mystery of the past.
So two runs in a week with good health feels great. I ran only three miles each time, slow but not so slow that I am setting negative PRs. My legs feel good, ready for more. I'll wait until Sunday or so before trying that.
Ever the optimist, earlier this month I signed up to run a spring race. My running buddy has wanted to run the Indy 500 Mini Marathon for a few years. I'm an Indianapolis native who grew up amid the mystique of the Indy 500 but who didn't run while I lived there. I was easily sold.
The race -- America's largest half marathon and one of the five or six largest races of any distance -- is May 2, so I have my work cut out for me. At this point I don't envision myself "racing" this one. Instead I'll use it as a goal to drive my return to running this spring, and then on race day as a symbolic return to the real thing. Running for fun can be fun, and it will let me know whether I have what it takes to get back into serious mileage and some real racing later. Maybe even an autumn marathon.
A student reader sent me a message to say that he didn't click with my talk of 'embracing' failure. He prefers to think in terms of pushing through failure to reach success. So I found it interesting when I ran across a blog entry with a similar theme (via The Third Bit) on the same day:
I'm writing this in order to talk about failure and coping.... I've got the failure down. So what about coping? ... failure is really information. You don't fail and therefore become a failure. You fail and in so doing you learn and gain more understanding.
... I've come to understand that if I'm not making mistakes it means I'm not trying hard enough, and I'm not pushing myself far enough.
In this article, David Humphrey recounts the details of a vexing experience trying to fix a bug. Humphrey uses the bug -- unresolved by the end of the article -- to explain how he feels about failing repeatedly. He isn't broken; he is empowered by the information he has captured. Whether we call that embracing failure, both the inevitability of failing and the knowledge we gain by failing, or coping, or gathering information, it seems to be an important trait of people who enjoy computing.
If you'd like to read more on Humphrey's ideas after he fixed his bug, check out his follow-up article.
A while back, I clipped this quote from a university publication, figuring it would decorate a blog entry some day:
The thing about a liberal arts education ... is it prepares you to fail successfully and learn from that failure. ... You will all fail. That's OK.
-- Jim Linahon
Linahon is an alumnus of our school who works in the music industry. He gave a talk on campus for students aspiring to careers in the industry, the theme of which was, "Learn to fail. It happens."
More recently, I ran across this as the solution to a word puzzle in our local paper:
You've got to jump off cliffs all the time and build your wings on the way down.
-- Ray Bradbury
Bradbury was one of my favorite authors when I was growing up. (The Martian Chronicles mesmerized me!) This quote goes farther than Linahon's: what other people call failure is learning to fly. Do not fear.
A comment made at the Rebooting Computing summit about "embracing failure" brought these quotes back to mind, along with an e-mail message Rich Pattis wrote sometime last year. Rich talked about how hard our discipline must feel to beginners, because it is a discipline learned almost wholly by failure. Learning to program can seem like nothing more than an uninterrupted sequence of failures: syntax errors, logic errors, boundary cases, ugly interfaces, .... I'm not a novice any more, but I still feel the constant drip of failure whenever I work on a piece of code I don't already understand well.
The thing is, I kinda like that feeling -- the challenge of scaling a mountain of code. My friends who program can at least say that they don't mind it, and the best among them seem to thrive in such an environment. I think that's part of what separates programmers from less disturbed people.
Then again, repeated failure is a part of learning many things. Learning to play a musical instrument or a sport requires repeated failure for most people. Hitting a serve in tennis, or a free throw on the hardcourt, or a curve ball in baseball -- the only way to learn is by doing it over and over, failing over and over, until the mind and body come together in a memory that makes success a repeatable process. This seems to be an accepted part of athletics, even among the duffers who play only for fun. How many people in America are on a golf course this very day, playing the game poorly but hoping -- and working -- to get better?
Why don't we feel the same way about academics, and about computer programming in particular? Some small number seem to, maybe the 2% that Knuth said are capable of getting it.
I have heard some people say that in sports we have created mechanisms for "meaningful failure". I'm not sure exactly what that means, but I suspect that if we could build tools for students, and set problems before them, that give them a sense of meaningful failure, we'd probably not scare off so many people from our early courses. I suspect that this is part of what some people mean when they say we should make our courses more fun, though thinking in terms of meaningful failure might give us a better start on the issue than simple mantras about games and robots.
I don't think just equating programming to sports is enough. Mitch Wand sent a message to the PLT Scheme mailing list this weekend on a metaphor he has been using to help students want to stick to the design recipes of How to Design Programs:
In martial arts, the first thing you learn is to do simple motions very precisely. Ditto for ballet, where the first thing you learn is the five positions.
Once those are committed to muscle memory, you can go on to combinations and variations.
Same deal for programming via HtDP: first practice using the templates until you can do it without thinking. Then you can go on to combinations and variations.
I like the analogy and have used a similar idea with students in the past. But my experience is that this only works for students who want to learn to program -- or martial arts or ballet, for that matter. If you start with people who want to go through the process of learning, then lots of things can work. The teacher just needs to motivate the student every once in a while to stick with the dream. But it's already their dream.
Maybe the problem is that people want to play golf or practice the martial arts -- for whatever social, business, or masochistic reasons -- but that most people don't want to learn to program. Then our problem comes back to a constant theme on this blog: putting a feeling of power in people's hands when we show them programming, so they want to endure the pain.
One last quote, in case you are ever looking for a literary way to motivate students to take on tough challenges rather than little ones that acquiesce easily and make us feel good sooner:
What we choose to fight is so tiny!
What fights us is so great!
When we win it's with small things,
and the triumph itself makes us small.
Winning does not tempt that man.
This is how he grows: by being defeated, decisively,
by constantly greater beings.
This comes from Rainer Maria Rilke's The Man Watching. What a marvelous image: growing strong by being beaten -- decisively, no less -- by ever greater opponents. I'm sure you professional programmers who have been tackling functional programming, continuations, Scheme, Haskell, and Erlang these last few years know just the feeling Rilke describes, deep in the marrow of your bones.
... so many meetings and budget discussions. A week in which I do no computer science is as bad as a week in which I do not run.
I did play hooky while attending a budget meeting yesterday. I took one of our department laptops so that I could upgrade Ruby, install BlueCloth, and play with a new toy. But that's bit-pushing, not computer science.
Why all the budget talk? Working at a public university offers some shelter from changing economic winds and tends to level changes out over time. But when the entire economy goes down, so do state revenues. My university is looking at a 9% cut in its base budget for next year. That magnitude of change means making some hard choices. Such change creates uneasy times among the faculty, and more work planning for changes and contingencies among department heads.
There is some consolation in being on the front line, knowing that I can shield the faculty from much of the indecision. I also have opportunities to argue against short-sighted responses to the budget cuts and to suggest responses that will help the university in the long term. There is nothing like a sudden lack of revenue to help you focus on what really matters!
Still, I'd rather be writing a program or working on a tough CS problem.
I need a better memory.
As a science journalist, I can tell you the best thing to do, as an academic getting interviewed and wanting to guide the interview somewhat, is to have analogies cocked, locked and loaded.... [R]eporters go nuts for pre-thought-out analogies/explanations because it's quotable material, and could in fact be the center of the article.... So cranking them out before you speak with someone is a great way to maintain some control of what reporters quote you on.
As in so many things, preparation pays off.
Of course, this isn't quite the same problem. Talking about one's own research or teaching is different than talking about department business or someone else's project. But that is one of the responsibilities that comes with chairing the department -- speaking about the wider interests of the department.
The bigger issue here is how to convert what I read into learning. The passage above stuck out enough that I filed it away for eighteen months. But it doesn't do me any good sitting in a text file somewhere.