Someone tweeted a link to Philip Greenspun's M.S. thesis yesterday. This is how you grab your reader's attention:
A revolution in earthmoving, a $100 billion industry, can be achieved with three components: the GPS location system, sensors and computers in earthmoving vehicles, and SITE CONTROLLER, a central computer system that maintains design data and directs operations. The first two components are widely available; I built SITE CONTROLLER to complete the triangle and describe it here.
Now I have to read the rest of the thesis.
You could do worse than use Greenspun's first two sentences as a template for your next abstract:
A revolution in <major industry or research area> can be achieved with <n> components: <component-1>, <component-2>, ... and <component-n>. The first <n-1> components are widely available. I built <program name> to meet the final need and describe it here.
I am adding this template to my toolbox of writing patterns, alongside Kent Beck's four-sentence abstract (scroll down to Kent's name), which generalizes the idea of one startling sentence that arrests the reader. I also like good advice on how to write concise, incisive thesis statements, such as that in Matt Might's Advice for PhD Thesis Proposals and Olin Shivers's classic Dissertation Advice.
As with any template or pattern, overuse can turn a good idea into a cliché. If readers see the same cookie-cutter format too often, it begins to look stale, and they lose interest. So play with variations on the essential theme: I have solved an important problem. This is my solution.
If you don't have a great abstract, try again. Think hard about your own work. Why is this problem important? What is the big win from my solution? That's a key piece of advice in Might's advice for graduate students: state clearly and unambiguously what you intend to achieve.
Indeed, approaching your research in a "test-driven" way makes a lot of sense. Before embarking on a project, try to write the startling abstract that will open the paper or dissertation you write when you have succeeded. If you can't identify the problem as truly important, then why start at all? Maybe you should pick something more valuable to work on, something that matters enough that you can write a startling abstract for the result. That's a key piece of advice shared by Richard Hamming in his You and Your Research.
And whatever you do, don't oversell a minor problem or a weak solution with an abstract that promises too much. Readers will be disappointed at best and angry at worst. If you oversell even a little bit too many times, you will become like the boy who cried wolf. No one will believe your startling claim even when it's on the mark.
Greenspun's startling abstract ends as strongly as it begins. Of course, it helps if you can close with a legitimate appeal to ameliorating poverty around the world:
This area is exciting because so much of the infrastructure is in place. A small effort by computer scientists could cut the cost of earthmoving in half, enabling poor countries to build roads and rich countries to clean up hazardous waste.
I'm not sure adding another automated refactoring to Eclipse or creating another database library can quite rise to the level of empowering the world's poor. But then, you may have a different audience.
This Fortune Management article describes a technique Jeff Bezos uses in meetings of his executive team: everyone begins by "quietly absorbing ... six-page printed memos in total silence for as long as 30 minutes".
There is a good reason, Bezos knows, for an emphasis on reading and the written word:
There is no way to write a six-page, narratively structured memo and not have clear thinking.
This is certainly true for programming, that unique form of writing that drives the digital world. To write a well-structured, six-page computer program to perform a task, you have to be able to think clearly about your topic.
Alas, the converse is not true, at least not without learning some specific skills and practicing a lot. But then again, that makes it just like writing narratives.
My Programming Languages students this semester are learning that, for functional programming in Scheme, the size limit is somewhat south of six pages. More along the lines of six lines.
That's a good thing if your goal is clear thinking. Work hard, clarify your thoughts, and produce a small function that moves you closer to your goal. It's a bad thing if your goal is to get done quickly.
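To make the six-line scale concrete, here is a sketch of the sort of small function I have in mind; the name and task are my own invention for illustration, not taken from the course:

```scheme
;; A hypothetical six-line example: sum the squares of the numbers
;; in a flat list, recursing on the structure of the list.
(define (sum-of-squares lst)
  (if (null? lst)
      0
      (+ (* (car lst) (car lst))
         (sum-of-squares (cdr lst)))))
```

Each such function does one small thing clearly, which is the point: the size limit forces the clear thinking.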
Thelonious Monk was a cool cat on the piano, but I think he could feel at home as a programmer. For example:
The _inside_ of the tune is the part that makes the _outside_ sound good.
Monk would understand that the design of your code matters as much as the design of your program's user interface. That is certainly true for developers who will have to maintain and modify the code over time. But it is also true for your program's users. It's hard for a program to be well designed on the outside, however pretty, when it is poorly designed on the inside.
Don't play _everything_ (or every time); let some things go by. Some music is just _imagined_. What you _don't_ play can be more important than what you _do_ play.
Some of the most effective software design happens in the negative space around software components. Alan Kay's original notions for designing objects stressed the messages that pass between objects more than the objects themselves. When we unfocus our eyes a bit and look at our system as a whole, the parts you don't design can come into focus.
And like Monk's missing notes, the code you don't write can be as important as the code you do, or more. The You Aren't Gonna Need It mindset tells us not to solve problems that don't exist yet. Live in the current spec. The result will be a minimal system, in terms of code size, with maximal effect.
You've got to dig it to _dig_ it, you dig?
A lot of people don't dig XP. But that's because they don't _dig_ it, you dig? Sometimes it takes surrendering old habits and thought processes all the way, pulling on a whole new way of approaching music or software, and letting it seep into your being for a while before you can really dig it. Some people begin skeptical but come to dig it after immersion.
This is true for a lot of practices that seem unusual or awkward, not just XP. As Alan Kay is also fond of saying, "Don't dip your toe in the water. Get wet."
PHOTO. Thelonious Monk, circa 1947 by William P. Gottlieb. Source: Wikipedia.
Ray Bradbury, in a Comic-Con 2010 interview:
Don't think about things, just do them.
Don't predict them, just make them.
This goes a bit farther than Kay's "The best way to predict the future is to invent it". In particular, I think Kay is okay with thinking about things.
Text and audio excerpts of the Bradbury interview are available on-line at Brain Pickings.
In a reminiscence on his experience as a student, John Cook writes:
I enjoyed learning about it as a student and I enjoyed teaching it later. (Or more accurately, I enjoyed being exposed to it as a student and really learning it later when I had to teach it.)
It is a commonplace for anyone who has taught that we learn a lot more about any topic when we teach it -- even a topic in which we are acknowledged experts. Between organizing material for instruction and interacting with people as they learn, we learn an awful lot ourselves.
There is, however, a hidden gem in John's comment that is not so commonly talked about: "I enjoyed being exposed to it as a student...".
As teachers, we do well to remember that our students aren't really learning something when they take our courses, especially when the course is their first encounter with the material. We are merely exposing them to the topic, giving them the lay of the land and a little vocabulary. The course is an opportunity to engage with the material, perhaps again. If we don't keep this in mind, we may deceive ourselves with unrealistic expectations about what students will know and be able to do at the end of the course.
A second advantage of remembering this truth is that we may be on guard to create opportunities to deepen their exposure, through homework and projects and readings that pull them in. It is through our students' own efforts that learning takes place, and that our courses succeed.
Graham Lee makes an ironic observation in Does the history of making software exist?:
"[S]oftware engineering" ... was introduced to suggest a professionalism beyond the craft discipline that went before it, only to become a symbol of lumbering lethargy among adherents of the craft discipline that came after it.
It's funny how terms evolve and communities develop sometimes.
There are a lot of valuable lessons to be learned from the discipline of software engineering. As a mindset, it can shape how we build systems with good results. Taken too far, it can become a mindset that stifles and overloads the process of making software.
As a university professor, I have to walk a fine line, exposing students to the valuable lessons without turning the creation of software into a lethargic, lumbering process. My courses tend to look different from similar courses taught by software engineering profs. I presume that they feel different to students.
As a programmer, I walk a fine line, too, trying to learn valuable lessons from wherever I can. Often that's from the software engineering community. But I don't want to fall into a mindset where the process becomes more important than the result.
Last week, one of my Programming Languages students sent me a note saying that his homework solution worked correctly but that he was bothered by some duplicated code.
I was so happy.
Any student who has me for class for very long hears a lot about the dangers of duplication for maintaining code, and also that duplication is often a sign of poor design. Whenever I teach OOP or functional programming, we learn ways to design code that satisfy the DRY principle and ways to eliminate duplication via refactoring when it does sneak in.
I sent the student an answer, along with hearty congratulations for recognizing the duplication and wanting to eliminate it. My advice follows.
When I sat down to blog the solution, I had a sense of deja vu... Hadn't I written this up before? Indeed I had, a couple of years ago: Increasing Duplication to Eliminate Duplication. Even in the small world of my own teaching, it seems there is nothing new under the sun.
Still, there was a slightly different feel to the way I talked about this in class later that day. The question had come earlier in the semester this time, so the code involved was even simpler. Instead of processing a vector or a nested list of symbols, we were processing a flat list of symbols. And, instead of applying an arbitrary test to the list items, we were simply counting occurrences of a particular symbol, s.
The duplication occurred in the recursive case, where the procedure handles a pair:
(if (eq? s (car los))
    (+ 1 (count s (cdr los)))    ; <---
    (count s (cdr los)))         ; <---
Then we make the two sub-cases more parallel:
(if (eq? s (car los))
    (+ 1 (count s (cdr los)))    ; <---
    (+ 0 (count s (cdr los))))   ; <---
And then use distributivity to push the choice down a level:
(+ (if (eq? s (car los)) 1 0)
   (count s (cdr los)))          ; <--- just once!
This time, I made a point of showing the students that not only does this solution eliminate the duplication, it also more closely follows the guideline to follow the shape of the data:
When defining a program to process an inductively-defined data type, the structure of the program should follow the structure of the data.
This guideline helps many programmers begin to write recursive programs in a functional style, rather than an imperative style.
Note that in the first code snippet above, the if expression is choosing between two different solutions, depending on whether we see the symbol s in the first part of the pair or not. That's imperative thinking.
But look at the list-of-symbols data type:
<list-of-symbols> ::= () | (<symbol> . <list-of-symbols>)
How many occurrences of s are in a pair? Obviously, the number of s's found in the car of the list plus the number of s's found in the cdr of the list. If we design our solution to match the code to the data type, then the addition operation should be at the top from the beginning:
(+ ; number of s's found in the car
   ; number of s's found in the cdr
   )
If we define the answer for the problem in terms of the data type, we never create the duplication-by-if in the first place. We think about solving the subproblems for the car and the cdr, fill in the blanks, and arrive immediately at the refactored code snippet above.
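Putting the pieces together, the complete procedure is short. This sketch uses the procedure name count and the parameter names s and los from the snippets above, and assumes the empty list as the base case:

```scheme
;; Count occurrences of symbol s in a flat list of symbols los.
;; The structure of the program follows the structure of the data:
;; a list is either empty or a pair of a symbol and another list.
(define (count s los)
  (if (null? los)
      0                               ; the empty list: no occurrences
      (+ (if (eq? s (car los)) 1 0)   ; s's found in the car
         (count s (cdr los)))))       ; s's found in the cdr

;; (count 'a '(b a c a)) evaluates to 2
```

The recursive call appears exactly once, so the duplication never arises.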
I have been trying to help my students begin to "think functionally" sooner this semester. There is a lot of room for improvement yet in my approach. I'm glad this student asked his question so early in the semester, as it gave me another chance to model "follow the data" thinking. In any case, his thinking was on the right track.
Here are three articles, all different, but with a connection to the future of education.
• Matthew Howell, Teaching Programming
What is the ideal class size? Over the year, I've taught classes that ranged in size from a single person to as many as ten. Through that experience, I've settled on five as my ideal.
Anyone who has taught intro programming in a high school or university is probably thinking, um, yeah, that would be great! I once taught an intermediate programming section with fifty or so people, though most of my programming courses have ranged from fifteen to thirty-five students. All other things being equal, smaller is better. Helping people learn to write and make things almost always benefits from one-on-one time and time for small groups to critique design together.
Class size is, of course, one of the key problems we face in education these days, both K-12 and university. For a lot of teaching, n = 5 is just about perfect. For upper-division project courses, I prefer four groups of four students, for a total of sixteen. But even at that size, the costs incurred by a university offering sections of that size are rising a lot faster than its revenues.
With MOOCs all the rage, Howell is teaching at the other end of the spectrum. I expect the future of teaching to see a lot of activity at both scales. Those of us teaching in the middle face bleaker prospects.
• Mike Caulfield, B. F. Skinner on Teaching Machines (1954)
Caulfield links to this video of B.F. Skinner describing a study on the optimal conditions for self-instruction using "teaching machines" in 1954. Caulfield points out that, while these days people like to look down on Skinner's behaviorist view of learning, he understood education better than many of his critics, and that others are unwittingly re-inventing many of his ideas.
[Skinner] understands that it is not the *machine* that teaches, but the person that writes the teaching program. And he is better informed than almost the entire current educational press pool in that he states clearly that a "teaching machine" is really just a new kind of textbook. It's what a textbook looks like in an age where we write programs instead of paragraphs.
That's a great crystallizing line by Caulfield: A "teaching machine" is what a textbook looks like in an age where we write programs instead of paragraphs.
Caulfield reminds us that Skinner said these things in 1954 and cautions us to stop asking "Why will this work?" about on-line education. That question presupposes that it will. Instead, he suggests we ask ourselves, "Why will this work this time around?" What has changed since 1954, or even 1994, that makes it possible this time?
This is a rightly skeptical stance. But it is wise to be asking the question, rather than presupposing -- as so many educators these days do -- that this is just another recursion of the "technology revolution" that never quite seems to revolutionize education after all.
• Clayton Christensen in Why Apple, Tesla, VCs, academia may die
Christensen didn't write this piece, but reporter Cromwell Schubarth quotes him heavily throughout on how disruption may be coming to several companies and industries of interest to his Silicon Valley readership.
First, Christensen reminds young entrepreneurs that disruption usually comes from below, not from above:
If a newcomer thinks it can win by competing at the high end, "the incumbents will always kill you".
If they come in at the bottom of the market and offer something that at first is not as good, the legacy companies won't feel threatened until too late, after the newcomers have gained a foothold in the market.
We see this happening in higher education now. Yet most of my colleagues here on the faculty and in administration are taking the position that leaves legacy institutions most vulnerable to overthrow from below. "Coursera [or whoever] can't possibly do what we do", they say. "Let's keep doing what we do best, only better." That will work, until it doesn't.
But now online learning brings to higher education this technological core, and people who are very complacent are in deep trouble. The fact that everybody was trying to move upmarket and make their university better and better and better drove prices of education up to where they are today.
We all want to get better. It's a natural desire. My university understands that its so-called core competency lies in the niche between the research university and the liberal arts college, so we want to optimize in that space. As we seek to improve, we aspire to be, in our own way, like the best schools in their niches. As Christensen pointed out in The Innovator's Dilemma, this is precisely the trend that kills an institution when it meets a disruptive technology.
Later in the article, Christensen talks about how many schools are getting involved in online learning, sometimes investing significant resources, but almost always in service of the existing business model. Yet other business models are being born, models that newcomers are willing -- and sometimes forced -- to adopt.
One or more of these new models may be capable of toppling even the most successful institutions. Christensen describes one such candidate, a just-in-time education model in which students learn something, go off to use it, and then come back only when they need to learn what they need to know in order to take their next steps.
This sort of "learn and use", on-the-job learning, whether online or in person, is a very different way of doing things from school as we know it. It is not especially compatible with the way most universities are organized to educate people. It is, however, plenty compatible with on-line delivery and thus offers newcomers to the market the pebble they may use to bring down the university.
The massive open on-line course is one form the newcomers are taking. The smaller, more intimate offering enabled by the likes of SkillShare is another. It may well be impossible for legacy institutions caught in the middle to fend off challenges from both directions.
As Caulfield suggests, though, we should be skeptical. We have seen claims about technology upending schools before. But we should adopt the healthy skepticism of the scientist, not the reactionary skepticism of the complacent or the scared. The technological playing field has changed. What didn't work in 1954 or 1974 or 1994 may well work this time.
Will it? Christensen thinks so:
Fifteen years from now more than half of the universities will be in bankruptcy, including the state schools. In the end, I am excited to see that happen.
I fear that universities like mine are at the greatest risk of disruption, should the wave that Christensen predicts come. I don't know many university faculty who are excited to see it happen. I just hope they aren't too surprised if it does.
After class today, a few of us were discussing the market for functional programmers. Talk turned to Clojure and Scala. A student who claims to understand monads said:
To understand monad tutorials, you really have to understand monads first.
Priceless. The topic of today's class was mutual recursion. I think we are missing a base case here.
I don't know whether this is a problem with monads, a problem with the writers of monad tutorials, or a problem with the rest of us. If it is true, then it seems a lot of people are unclear on the purpose of a tutorial.
After tweeting a jewel from Clay Shirky's latest article, I read the counterpoint article by Maria Bustillos, Venture Capital's Massive, Terrible Idea For The Future Of College, and an earlier piece by Darryl Tippens in the Chronicle of Higher Education, Technology Has Its Place: Behind a Caring Teacher. I share these writers' love of education and am sympathetic to many of their concerns about the future of the traditional university. In the end, though, I think that economic and technological forces will unbundle university education whether we like it or not, and our best response is not simply to lament the loss. It is to figure out how best to preserve the values of education in the future, not to mention its value to future citizens.
While reading Bustillos and Tippens, I thought about how the lab sciences (such as physics, biology, and chemistry) are in many ways at a bigger disadvantage in an on-line world than disciplines that traffic primarily in ideas, such as the humanities. Lab exercises are essential to learning science, but they require equipment, consumable supplies, and dedicated space that are not typically available to us in our homes. Some experiments are dangerous enough that we don't want students trying them without the supervision of trained personnel.
Humanities courses have it relatively easier. Face-to-face conversation is, of course, a huge part of the educational experience there. But the sharing of ideas, and the negotiation of shared understanding, can be conducted in a number of ways that are amenable to on-line communication. Reading and writing have long played a central role in the growth of knowledge, alongside teaching in a classroom and conversation with like-minded individuals in close personal settings.
I soon realized something. Bustillos and Tippens, like so many others, seem to assume that collaboration and meaningful interaction cannot happen on-line.
Bustillos puts it harshly, and inaccurately:
MOOCs are an essentially authoritarian structure; a one-way process in which the student is a passive recipient required to do nothing except "learn."
Tippens expresses the sentiment in a more uplifting way, quoting Andrew Delbanco:
Learning is a collaborative rather than a solitary process.
On-line education does not have to be passive any more than a classroom has to be passive. Nor must it be solitary; being alone a lot of the time does not always mean working alone.
A few faculty in my department have begun to create on-line versions of their courses. In these initial efforts, interaction among students and teacher has been paramount. Chat rooms, e-mail, wikis, and discussion boards all provide avenues for students to interact with the instructor and among themselves. We are still working at a small scale and primarily with students on-campus, so we have had the safety valve of face-to-face office hours available. Yet students often prefer to interact off hours, after work or over the weekend, and so the on-line channels prove to be most popular.
Those in the software world have seen how collaboration can flourish on-line. A lot of the code that makes our world go is developed and maintained by large, distributed communities whose only opportunities to collaborate are on-line. These developers may be solitary in the sense that they work in a different room from their compatriots, but they are not solitary in the sense of being lonesome, desolate, or secluded. They interact as a matter of course. Dave Humphrey has been using this model and its supporting technology as part of his teaching at Seneca College for a few years now. It's exciting.
My own experience with on-line interaction goes back to the 1980s, when I went to graduate school and discovered Usenet. Over the next few years, I made many good friends, some of whom I see more often in-person than I see most friends from my high school and college years. Some, I have never met in person, yet I consider them good friends. Usenet enabled me to interact with people on matters of purely personal interest, such as basketball and chess, but also on matters of academic value.
In particular, I was able to discuss AI, my area of study, with researchers from around the world. I learned a lot from them, and those forums gave me a chance to sharpen my ability to express ideas. The time scale was between the immediate conversation of the classroom and the glacial exchange of conference and journal papers. These on-line conversations gave me time to reflect before responding, while still receiving feedback in a timely fashion. They were invaluable.
Young people today grow up in a world of on-line interaction. Most of their interactions on-line are not deep, to be sure, but some are. And more could be, if someone could show them the way. That's the educator's job. The key is that these youth know that on-line technology allows them to be active, to create, and to learn. Telling them that on-line learning must be passive or solitary will fall on deaf ears.
Over twenty years of teaching university courses has taught me how important face-to-face interaction with students can be. How well experiments in on-line education address the need for interpersonal communication will go a long way to determining whether they succeed as education. But assuming that collaboration and meaningful interaction cannot happen on-line is surely a losing proposition.
Computational Thinking Division. From Jon Udell, another lesson that programming and computing teach us which can be useful out in the world:
Focus on understanding why the program is doing what it's doing, rather than why it's not doing what you wanted it to.
This isn't the default approach of everyone. Most of my students have to learn this lesson as a part of learning how to program. But it can be helpful outside of programming, in particular by influencing how we interact with people. As Udell says, it can be helpful to focus on understanding why one's spouse or child or friend is doing what she is doing, rather than on why she isn't doing what you want.
Motivational Division. From the Portland Ballet, of all places, several truths about being a professional dancer that generalize beyond the studio, including:
There's a lot you don't know.
There may not be a tomorrow.
There's a lot you can't control.
You will never feel 100% ready.
So get to work, even if it means reading the book and writing the code for the fourth time. That is where the fun and happiness are. All you can affect, you affect by the work you do.
Mac Chauvinism Division. From Matt Gemmell, this advice on a particular piece of software:
There's even a Windows version, so you can also use it before you've had sufficient success to afford a decent computer.
But with enough work and a little luck, you can afford better next time.
Mitch Daniels, the new president of Purdue University, says this about shared governance in An Open Letter to the People of Purdue, his initial address to the university community:
I subscribe entirely to the concept that major decisions about the university and its future should be made under conditions of maximum practical inclusiveness and consultation. The faculty must have the strongest single voice in these deliberations, but students and staff should also be heard whenever their interests are implicated. I will work hard to see that all viewpoints are fairly heard and considered on big calls, including the prioritization of university budgetary investments, and endeavor to avoid surprises even on minor matters to the extent possible.
Shared governance implies shared accountability. It is neither equitable nor workable to demand shared governing power but declare that cost control or substandard performance in any part of Purdue is someone else's problem. We cannot improve low on-time completion rates and maximize student success if no one is willing to modify his schedule, workload, or method of teaching.
Participation in governance also requires the willingness to make choices. "More for everyone" or "Everyone gets the same" are stances of default, inconsistent with the obligations of leadership.
I love the phrase, inconsistent with the obligations of leadership.
Daniels recently left the governor's house in Indiana for the president's house at Purdue. His initial address is balanced, open, and forward-looking. It is respectful of what universities do and forthright about the need to recognize changes in the world around us, and to change in response.
My university is hiring a new president, too. Our Board of Regents will announce its selection tomorrow. It is probably too much to ask that we hire a new president with the kind of vision and leadership that Daniels brings to West Lafayette. I do hope that we find someone up to the task of leading a university in a new century.
From this Paris Review interview with novelist William Faulkner:
Some people say they can't understand your writing, even after they read it two or three times. What approach would you suggest for them?
Read it four times.
The first three times through the book are sunk cost. At this moment, you don't understand. What should you do? Read it again.
I'm not suggesting you keep doing the same failing things over and over. (You know what Einstein said about insanity.) If you read the full interview, you'll see that Faulkner isn't suggesting that, either. He's suggesting you get back to work.
Studying computer science is different from reading literature. We can approach our study perhaps more analytically than the novel reader. And we can write code. As an instructor, I try to have a stable of ideas that students can try when they are having trouble grasping a new concept or understanding a reading.
One thing that doesn't work very well is being passive. Often, students come to my office and say, "I don't get it." They don't bring much to the session. But the best learning is not passive; it's active. Do something. Something new, or just more.
Faulkner is quite matter-of-fact about creating and reading literature. If it isn't right, work to make it better. Technique? Method? Sure, whatever you need. Just do the work.
This may seem like silly advice. Aren't we all working hard enough already? Not all of us, and not all the time. I sometimes find that when I'm struggling most, I've stopped working hard. I get used to understanding things quickly, and then suddenly I don't. Time to read it again.
I empathize with many of my students. College is a shock to them. Things came easily in high school, and suddenly they don't. These students mean well but seem genuinely confused about what they should do next. "Why don't I understand this already?"
Sometimes our impatience is born from such experience. But as Bill Evans reminds us, some problems are too big to conquer immediately. He suggests that we accept this up front and enjoy the whole trip. That's good advice.
Faulkner shrugs his shoulders and tells us to get back to work.
PHOTO. William Faulkner, dressed for work. Source: The Centered Librarian.