As always, at this workshop I have heard lots of one-liners and one-off comments that will stick with me after the event ends. This entry is a place for me to think about them by writing, and to share them in case they click with you, too.
The buzzword of this year's workshop: infiltration. Frontal curricular assaults often fail, so people here are looking for ways to sneak new ideas into courses and programs. An incremental approach creates problems of its own, but agile software proponents understand its value.
College profs like to roll their own, but high-school teachers are great adapters. (And adopters.)
Chris Hoffman, while describing his background: "When you reach my age, the question becomes, 'What haven't you done?' Or maybe, 'What have you done well?'"
Lenny Pitt: "With Python, we couldn't get very far. Well, we could get as far as we wanted, but students couldn't get very far." Beautiful. Imagine how far students will get with Java or Ada or C++.
Rubin Landau: "multidisciplinary != interdisciplinary". Yes! Ideas that transform a space do more than bring several disciplines into the same room. The discipline is new.
It's important to keep in mind the relationship between modeling and computing. We can do modeling without computing. But analytical models aren't feasible for all problems, and increasingly the problems we are interested in fall into this set.
Finally let me re-run an old rant by linking to the original episode. People, when you are second or third or sixth, you look really foolish.
As much as computation is now changing biology, it has already changed physics. Last year's workshop had a full complement of physicists and astronomers. In their minds, it is already clear that physicists must program -- even students learning intro physics. The question is, what problems do they face in bringing more computation to physics education? This panel session shared some physicists' experience in the trenches. Bruce Sherwood, the panel chair, set the stage: We used to be able to describe physics as theory, experiment, and the interplay between the two. This is no longer true, and it hasn't been for a while. Physics is now theory, experiment, simulation, and the interplay among the three! Yet this truth is not reflected in the undergraduate physics curriculum -- even at so-called "respectable schools".
Rubin Landau described a systemic approach, a Computational Physics major he designed and implemented at Oregon State. He was motivated by what he saw as a turning inward of physics, efforts to cover all of the history of physics in the undergrad curriculum, with a focus on mathematics from the 19th century, and not looking outward to how physics is done today. (This CS educator felt immediate empathy for Landau's plight.) He noted his own embarrassment: computational physicists at major physics conferences who refuse to discuss their algorithms or the verification of their programs. This is simply not part of the culture of physics.
Students learn by doing, so projects are key to this Computational Physics curriculum. Students use a "compiled language", which is Landau's way to distinguish programming in a CS-style language from Mathematica and Maple. For him, the key is to separate the program from the engine; students need to see the program as an idea. Two languages are better than one, as that gives students a chance to generalize the issues at play in using computation for modeling.
The OSU experience is that the political issues in changing the curriculum are much tougher to solve than the academic issues: the need for budget, the resistance of senior faculty, the reluctance of junior faculty to risk tenure, and so on.
Landau closed by saying that, for physics-minded students, using computation in physics and then taking a CS course seems to work best. He likened this to the use of Feynman diagrams in grad school: students learn to calculate with them, and then learn the field theory behind them the next year. His undergrads have several "A-ha!" moments throughout CS1. I suspect that this approach would work for a lot of CS students, too, if we can get them to use computation. Media computation is one avenue I've seen work with some.
Next up was Robert Swendsen, from Carnegie-Mellon. In the old days, physicists wrote programs because they did not know how to solve a problem analytically. Now, they compute to solve problems that no one knows how to solve analytically. (Mental note: It also lets them ask new questions.) The common problem many of us face: we tend to teach the course we took -- something of a recursion problem. (Mental note: Where is the base case? Aristotle, I suppose.)
Swendsen identified a few other challenges. Students are used to looking at equations, even if they don't get as much from them as we do, but they have no experience looking at and reasoning about data. They struggle even with low-level issues such as accuracy in terms of the number of significant digits. Further, many students do not think that computational physics is "real" physics. To them, physics == equations.
This is a cultural expectation across the sciences, a product of the last few centuries of practice. Nor is it limited to students; people out in the world think of science as equations. Perhaps they pick this notion up in their high-school courses, or even in their college courses. I think that faculty in and out of the sciences share this misperception as well. The one exception is probably biology, which may account for part of its popularity as a major -- no math! no equations! I couldn't help but think of Bernard Chazelle's efforts to popularize the notion that the algorithm is the idiom of modern science.
Listening to Swendsen, I also had an overriding sense of deja vu, back to when CS faculty across the country were trying to introduce OO thinking into the first-year CS curriculum. Curriculum change must share some essential commonalities due to human nature.
Physicist Mark Haugan focused on a particular problem he sees: a lack of continuity across courses in the physics curriculum with respect to computation. Students may use computation in one course and then see no follow-through in their next courses. In his mind, students need to learn that computation is a medium for expressing ideas -- a theme regular readers of this blog will recognize. Mathematical equations are one medium, and programs are another. I think the key is that we need to discuss and work with problems where computation matters -- think Astrachan's Law -- problems for which the lack of computation would limit our ability to understand and solve the problem. This, too, echoes the OO experience in computer science education. We still face the issue that other courses and other professors will do things in a more traditional way. This is another theme common to both SECANT workshops: we need to help students feel so empowered by computation that they use it unbidden in their future courses.
The Q-n-A session contained a wonderful thread on the idea of physics as a liberal art. One person reported a comment made by a student who had taken a computational physics course and then read a newspaper article on climate modeling:
Wow. Now I know what that means.
I can think of no higher "student learning outcome" we in computer science can have for our general education and introductory programming courses: Wow. Now I know what that means.
There are many educated people who don't know what "computer model" means. They don't understand what is reported in the news. There are many educated people reporting the news who don't understand the news they are reporting.
That's not right.
The next session of the workshop was a panel of university faculty working in the health sciences, talking about how they use computation in their disciplines and what the key issues are. Panel chair Raj Acharya, from Penn State's Computer Science and Engineering department, opened with the bon mot "all science is computer science", a reference to a 2001 New York Times piece that I have been using for the last few years when speaking to prospective students, their parents, and other faculty. By itself, this statement sounds flip, but it is true in many ways. The telescope astronomers use today is as much a computational instrument as a mechanical one. Many of the most interesting advances in biology these days are really bioinformatics.
The dawn of big data is changing what we do in CS, but it's having an even bigger effect in some other sciences by creating a new way to do science. Modeling is a nascent research method based in computation: propose a model, test it against the data, and iterate. Data mining is an essential step in this new process: all of the data goes into a box, and the box has to make sense of the data. This swaps two steps in the traditional scientific method... Instead of forming a hypothesis and then testing it by collecting data, a scientist can mine a large collection of data to find candidate hypotheses, and then confirm with more traditional bench science and by checking models against other and larger data sets.
Tony Hazbun, who works in the School of Pharmacy at Purdue, talked about work in systems biology. He identified four key ideas that biologists need to learn from computer science, which echoed a talk from last year's workshop:
Hazbun made one provocative claim that I think hits the heart of why this sort of science is important. We mine data sets to see patterns that we probably would not have seen otherwise. This approach is more objective than traditional science, in which the hypotheses we test are the ones we create out of our own experience. That is a much more personal approach -- and thus more subjective. Data mining helps us to step outside our own experience.
Next up was Daisuke Kihara, a Purdue bioinformatician who was educated in Japan. He talked about the difficulties he has had building a research group of graduate students. The main problem is that biology students have few or no skills in mathematics and programming, and CS students know little or no biology. In the US, he said, education is often too discipline-specific, with not enough breadth, which limits the kind of cross-fertilization needed by researchers in bioinformatics. My university created an undergraduate major in Bioinformatics three years ago in an effort to bridge this gap, in part because biotechnology is an industry targeted for economic development in our state.
(My mind wandered a bit as I thought about Kihara's claim about US education. If he is right, then perhaps the US grew strong technically and academically during a time when the major advances came within specific disciplines. Now that the most important advances are coming in multidisciplinary areas, we may well need to change our approach, or lose our lead. I've been concerned about this for a year or so, because I have seen the problem of specializing too soon creeping down into our high schools. But then I wondered, is Kihara's claim true? Computer science has a history grounded in applications that motivate our advances; I think it's a relatively recent phenomenon that we spend most of our time looking inward.)
In addition to technical skills and domain knowledge, scientists of the future need the elusive "problem-solving skills" we all talk about and hope to develop in our courses. Haixu Tang, from the Informatics program at Indiana contrasted the mentality of what he called information technology and scientific computing:
These distinctions reflect a cultural divide that makes integrating CS into science disciplines tough. In Tang's experience, domain knowledge is not the primary hurdle, but he has found it easier to teach computer scientists biology than to teach biologists computer science.
Tang also described the shift in scientific method that computing enables. In traditional biology, scientists work from hypothesis to data to knowledge, with a cycle from data back to hypothesis. In genome science, science can proceed from data to hypothesis to knowledge, with a cycle from hypothesis back to data. The shift is from hypothesis-driven science to data-driven science. Simulation has joined theory and statistics in the methodological toolbox.
In the Q-n-A session that followed the panel, someone expressed concern with data-driven research. Too many people don't go back to do the experiments needed to confirm hypotheses found via data mining or to verify their data by independent means. The result is bad science. Olga Vitek, a statistical bioinformatician, replied that the key is developing skill in experimental design. Some researchers in this new world are learning the hard way.
The last speaker was Peter Waddell, a comparative biologist who is working to reconstruct the tree of life based on genome sequences. One example he offered was that the genome record shows primates' closest relatives to be... tree lemurs and shrews! This process is going slowly but gaining speed. He told a great story about shotgun sequencing, BLAST, and the challenges in aligning and matching sequences. I couldn't follow it, because I am a computer scientist who needs to learn more biology.
When Waddell began to talk about some of the computing challenges he and his colleagues face, I could follow the details much better. They are working with a sparse matrix that will have between 10^2 and 10^3 rows and between 10^2 and 10^9 (!!) columns. The row and column sums will differ, but he needs to generate random matrices having the same row and column sums as the original matrix. In his estimation, students almost need to have a triple major in CS, math, and stats, with lots of biology and maybe a little chemistry thrown in, in order to contribute to this kind of research. The next best thing is cross-fertilization. His favorite places to work have been where all of the faculty lunch together, where they are able to share ideas and learn to speak each other's languages.
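Generating such matrices is a nice little algorithmic problem in its own right. For the special case of a 0-1 matrix (think presence/absence data), a standard trick is the "checkerboard swap": find a 2x2 submatrix with 1s on one diagonal and 0s on the other, then swap the diagonals. Here is a minimal Python sketch of that idea -- my own illustration, not Waddell's code, and the swap count is an arbitrary choice:

```python
import random

def randomize_binary_matrix(m, swaps=10000, seed=None):
    """Return a shuffled copy of a 0-1 matrix with the same
    row sums and column sums, via repeated checkerboard swaps."""
    rng = random.Random(seed)
    m = [row[:] for row in m]          # work on a copy
    nrows, ncols = len(m), len(m[0])
    for _ in range(swaps):
        r1, r2 = rng.sample(range(nrows), 2)
        c1, c2 = rng.sample(range(ncols), 2)
        # A swap is legal only on a checkerboard: 1s on one diagonal,
        # 0s on the other. Swapping the diagonals changes the matrix
        # but leaves every row and column sum intact.
        if m[r1][c1] == m[r2][c2] == 1 and m[r1][c2] == m[r2][c1] == 0:
            m[r1][c1] = m[r2][c2] = 0
            m[r1][c2] = m[r2][c1] = 1
    return m
```

At 10^9 columns the real problem demands far cleverer sparse data structures, of course, but the invariant being preserved is the same.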
This remark led to another question, because it "raised the hobgoblin of multidisciplinary research": an undergraduate needs seven years of study in order to prepare for a research career -- and that is only for the best students. Average undergrads will need more, and even that might not be enough. What can we do? One idea: redesign the whole curriculum to be interdisciplinary, with problems, mathematics, computational thinking, and research methods taught and reinforced everywhere. Graduating students will not be as well-versed in any one area, but perhaps they will be better at solving problems across the boundaries of any single discipline.
This isn't just a problem for multidisciplinary science preparation. We face the same problem in computer science itself, where the software development side of our discipline requires a variety of skills that are often best learned in context. The integrated curriculum suggestion made here makes me think of the integrated apprenticeship-style curriculum that ChiliPLoP produced this year.
To open the workshop, the SECANT faculty at Purdue described an experimental course they taught last spring, Introduction to Computational Thinking. It was designed by a multi-disciplinary team from physics, chemistry, biology, and computer science for students from across the sciences.
The first thing that jumped out to me from this talk was that the faculty first designed the projects that they wanted students to do, and then figured out what students would need to know in order to do the projects. This is not a new idea (few ideas are), but while many people talk about doing this, I don't see as many actually doing it. It's always interesting to see how the idea works in practice. Owen Astrachan would be proud.
The second was the focus on visualization of results as essential to science and as a powerful attractor for students. It is not yet lunch time on Day 1, but I have heard enough already to say that visualization will be a key theme of the workshop. That's not too surprising, because visualization was also a recurring theme in last year's workshop. Again, though, I am glad to be reminded of just how important this issue is outside the walls of the Computer Science building. It should affect how we prepare students for careers applying CS in the world.
The four projects in the Purdue course's first offering were:
This looks like a broad set of problems, the sort of interdisciplinary science that the core natural sciences share and which we computer scientists often miss out on. For CS students to take this course, they will need to know a little about the several sciences. That would be good for them, too.
Teaching CS principles to non-CS students required the CS faculty to take an approach unlike what they are used to. They took advantage of Python's strengths as a high-level, dynamic scripting language to use powerful primitives, plentiful libraries, and existing tools for visualizing results. (They also had to deal with its weaknesses, not the least of which for them was the delayed feedback about program correctness that students encounter in a dynamically-typed language.) They delayed teaching the sort of software engineering principles that we CS guys love to teach early. Instead, they tried to introduce abstractions only on a need-to-know basis.
Each project raised particular issues that allowed the students to engage with principles of computing. Audio manipulation exposed the idea of binary representation, and percolation introduced recursion, which exposed the notion of the call stack. Other times, the mechanics of writing and running programs exposed underlying computing issues. For example, when a program ran slower than students expected on the basis of previous programs, they got to learn about the difference in performance between primitive operations and user-defined functions.
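A recursive flood fill is the natural way to ask whether a grid percolates, and it makes the call stack impossible to ignore: run it on a big enough grid and the stack overflows. This is my reconstruction of the kind of exercise described, not the course's actual code:

```python
def percolates(grid):
    """Does a path of open sites (1s) connect the top row to the bottom?"""
    rows, cols = len(grid), len(grid[0])
    seen = set()

    def fill(r, c):
        # Every recursive call pushes a frame onto the call stack;
        # a large grid can overflow it -- itself a teachable moment.
        if not (0 <= r < rows and 0 <= c < cols):
            return
        if grid[r][c] == 0 or (r, c) in seen:
            return
        seen.add((r, c))
        fill(r + 1, c)
        fill(r - 1, c)
        fill(r, c + 1)
        fill(r, c - 1)

    for c in range(cols):
        fill(0, c)
    return any((rows - 1, c) in seen for c in range(cols))
```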
The panelists reported lessons from their first experience that will inform their offering next spring:
One of the open questions they are considering is, do they need or want to offer different sections of this course for different majors? This is a question many of us are facing. Having a more homogeneous student base would allow the use of different kinds of problem and more disciplinary depth. But narrowing the problem set would lose the insight available across disciplines. At a school like mine, we also risk spreading the student base so thin that we are unable to offer the courses at all.
Somewhere in this talk, speaker Susanne Hambrusch, the workshop organizer and leader, said something that made me think about what in my mind is the key to bringing computation to the other disciplines most naturally: We need to leave students thinking, "This helps me answer questions in my discipline -- better, or faster, or ...". This echoed something that Ruth Chabay said at the end of last year's workshop. Students who see the value of computation and can use computation effectively will use computation to solve their own problems. That should be one of the primary goals of any course in computing we teach for students outside of CS.
This set of entries records my experiences at the 2008 SECANT workshop, October 30-31, hosted by the Department of Computer Science at Purdue University.
On my drive to Purdue today, I listened to the first 3/4 of Caleb Carr's novel, "Killing Time". This is not a genre I read or listen to often, so it's hard for me to gauge the book's quality. If you are inclined, you can read reviews on-line. At this point, I would say that it is not a very good book, but it delivered fine escapism for a car ride on a day when I needed a break more than deep thought. But it did get me to thinking about... computer science. The vignette that sets up the novel's plot is based on a typical use case for Photoshop, or a homework assignment in a media computation CS1 course.
Carr describes a world controlled by "information barons", a term intended to raise the specter of the 19th century's rail barons and their control of wealth and commerce. The central feature of his world in 2023 is deception -- the manipulation of information, whether digital or physical, to control what people think and feel. The novel's opening involves the role a doctored video plays in a presidential assassination, and later episodes include doctored photos, characters manufactured via the data planted on the internet, the encryption of data on disk, and real-time surveillance of encrypted communication.
If students are at all interested in this kind of story, whether for the science fiction, the intrigue, or the social implications of digital media and their malleability, then we have a great way to engage them in computing that matters. It's CSI for the computer age.
Carr seems to have an agenda on the social issues, and as is often the case, such an agenda interferes with the development of the story. His characters are largely cut-outs in service of the message. Carr paints a dystopian view striking for its unremitting focus on the negatives of digital media and science's increasing understanding of the world at a molecular level. The book seems unaware that biology and chemistry are helping us to understand diseases, create new drugs, and design new therapies, or that computation and digital information create new possibilities in every discipline and part of life. Perhaps it is more accurate to say that Carr starts with these promises as his backdrop and chooses to paint a world in which everything that could go wrong has. That makes for an interesting story but ultimately an unsatisfying thought experiment. For escapism, that may be okay.
After my previous entry, I couldn't help but wonder whether I would have the patience to read this book. I have to think not. How many pages? 274 pages -- almost slender compared to Perec's book. Still, I'm glad I'm listening and not reading.
I leave today to attend the second SECANT workshop at Purdue. This is the sort of trip I like: close enough that I can drive, which bypasses all the headaches and inconveniences of flight, but far enough away that it is a break from home. My conference load has been light since April, and I can use a little break from the office. Besides, the intersection of computer science and the other sciences is an area of deep interest, and the workshop group is a diverse one. It's a bit odd to look forward to six hours on the road, but driving, listening to a book or to music, and thinking are welcome pursuits.
As I was checking out of the office, I felt compelled to make two public confessions. Here they are.
First, I recently ran across another recommendation for Georges Perec's novel, Life: A User's Manual. This was the third reputable recommendation I'd seen, and as is my general rule, after the third I usually add it to my shelf of books to read. As I was leaving campus, I stopped by the library to pick it up for the trip. I found it in the stacks and stopped. It's a big book -- 500 pages. It's also known for its depth and complexity. I returned the book to its place on the shelf and left empty-handed. I've written before of my preference for shorter books and especially like wonderful little books that are full of wisdom. But these days time and energy are precious enough resources that I have to look at a complex, 500-page book with a wary eye. It will make good reading some other day. I'm not proud to admit it, but my attention span isn't up to the task right now.
Second, on an even more frivolous note, there is at the time of this writing no Diet Mountain Dew in my office. I drank the last one yesterday afternoon while giving a quiz and taking care of pre-trip odds and ends. This is noteworthy in my mind only because of its rarity. I do not remember the last time the cupboard was bare. I'm not a caffeine hound like some programmers, but I don't drink coffee and admit some weakness for a tasty diet beverage while working.
I'll close with a less frivolous comment, something of a pattern I've been noticing in my life. Many months ago, I wrote a post on moving our household financial books from paper ledgers and journals into the twentieth century. I fiddled with Quicken for a while but found it too limiting; my system is a cross between naive home user and professional bookkeeping. Then I toyed with the idea of using a spreadsheet tool like Numbers to create a cascaded set of journals and ledgers. Yet at every turn I was thinking that I'd want to implement this or that behavior, which would strain the limits of typical spreadsheets. Then I came to my computer scientist's senses: When in doubt, write a program. I'd rather spend my time that way anyway, and the result is just what I want it to be. No settling. This pattern is, of course, no news at all to most of you, who roll your own blogging software and homework submission systems, even content management systems and book publishing systems, to scratch your own itches. It's not news to me, either, though sometimes my mind comes back to the power slowly. The financial software will grow slowly, but that's how I like it.
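For the curious, the core of such a roll-your-own bookkeeping program is tiny: a journal entry is a set of debits and credits that must balance, and posting an entry updates the account balances in a ledger. A sketch in Python, with hypothetical account names -- an illustration of the idea, not my actual system:

```python
from collections import defaultdict

def post(ledger, entries):
    """Post a balanced journal entry to the ledger.
    entries is a list of (account, amount) pairs, in cents:
    debits positive, credits negative."""
    assert sum(amount for _, amount in entries) == 0, "entry must balance"
    for account, amount in entries:
        ledger[account] += amount

ledger = defaultdict(int)            # account -> running balance
post(ledger, [("groceries", 4500), ("checking", -4500)])
post(ledger, [("checking", 250000), ("salary", -250000)])
# every posted entry keeps the books balanced by construction
```

Everything else -- cascading ledgers, reports, the behaviors a spreadsheet would strain to express -- grows naturally from this kernel.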
As a friend and former student recently wrote, "If only there were more time..."
Off to Purdue.
There are two big 40th anniversary events coming up for those of us in computer science. On November 5, the Computer History Museum is hosting the 40th Anniversary of the Dynabook, with Alan Kay, Charles Thacker, and Mary Lou Jepsen. Then on December 9, Stanford is hosting Engelbart and the Dawn of Interactive Computing: SRI's Revolutionary 1968 Demo. Much of the last 40 years of technology has been an evolution toward the ideas embodied in Kay's FLEX machine and Engelbart's mouse-controlled, real-time, interactive, networked computer. These ideas showed us what was possible. In addition to technological vision, Kay and Engelbart also shared a greater goal: "to use computing to augment society's collective intellect and ability to solve the complex issues of our time".
I expect that 40 is going to be a common number in computer science celebrations in the next few years.
I don't know if "reddited" is a word like "slashdotted" yet, but I can say that yesterday's post, No One Programs Any More, has reached #1 on Reddit's programming channel. This topic seems to strike a chord with a lot of people, both in the software business and in other technology pursuits. Here are my favorite comments so far:
I can't think of a single skill I've learned that has had more of an impact on my life than a semi-advanced grasp of programming.
This person received some grief for ranking learning how to program ahead of, say, learning how to eat, but I know just what the commenter means. Learning to program changes one's mind in the same way that learning to read and write does. Another commenter agreed:
It's amazing how after a year of programming at university, I began to perceive the world around me differently. The way I saw things and thought about them changed significantly. It was almost as if I could feel the change.
Even at my advanced age, I love when that feeling strikes. Beginning to understand call/cc felt like that (though I don't think I fully grok it yet).
My favorite comment is a bit of advice I recommend for you all:
I will not argue with a man named Eugene.
Reddit readers really are smart!
One of my colleagues in the Math department sent me some e-mail today:
I am a constant advocate for our (math) majors receiving some sort of 'computer-programming experience' before they graduate.
Of course not all of my colleagues are as enthusiastic about this... In fact, at a recent meeting, someone stated: "No one programs any more." This was the basis of their argument against requiring a programming course... and it turns out that several people believe this statement.
He asked for my reaction to their stance.
This request comes as I prepare to attend the second SECANT workshop at Purdue next week. Last fall I wrote several articles about the inaugural workshop for this NSF-funded project. The NSF must think that programming and, more generally, computer science are important beyond the walls of the CS building, because it has funded projects like SECANT, the goal of which is to:
... bring together computer scientists and natural scientists who recognize that computing has become indispensable to scientific inquiry and is set to permeate science in a transformative manner.
Most of the attendees last year, and many on the roster for this year, are scientists: physicists, biologists, chemists, and astronomers. They all program in some form, because CS has redefined how they do science. Some of them are developing programming-based curricula in their disciplines so that future grads are better prepared for their careers.
In the time since I joined the faculty here, many departments have dropped the computer programming requirement from their majors. Part of the reason is probably that the intro programming courses were not meeting their students' needs, and our department needs to take responsibility for that. But a big part of the reason is that many faculty across campus believe as the Math faculty do, that their students don't need to learn computer programming anymore. Not too surprisingly, I disagree.
We have started to see some movement in the other direction. The Physics department now requires an introductory programming course because so many physicists need to know how to write and modify simulation programs that serve as their experiments. One result has been a steady stream of students in our intro C course, which focuses on scientific applications. Another is an ongoing research relationship among a member of the Physics faculty, a member of the CS faculty, and undergraduates from both departments that has produced several papers (with undergrad co-authors) and occasional award recognition. None of this research is possible without physics students being able to program complex molecular system simulations.
Scientists are not the only non-CS people who need to program -- or want to. People working in finance and other areas of business program, even if only in the form of complex spreadsheets, which are constraint propagation programs. Even further afield, artists are beginning to use computational media to create art and to explore concepts of form and color in a new way.
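To see why a spreadsheet really is a program, it helps to strip the idea down to its core: cells hold either values or formulas, and a formula is recomputed from its inputs on demand, so a change to one cell propagates to every cell that depends on it. A toy sketch in Python -- my own illustration of the concept, not how any particular spreadsheet is built:

```python
class Sheet:
    """A toy spreadsheet: a cell is a number or a formula (a function
    of the sheet). Reads recompute formulas on demand, so changes
    propagate automatically through dependent cells."""
    def __init__(self):
        self.cells = {}

    def __setitem__(self, name, value):
        self.cells[name] = value

    def __getitem__(self, name):
        v = self.cells[name]
        return v(self) if callable(v) else v

sheet = Sheet()
sheet["price"] = 100.0
sheet["qty"] = 3
sheet["total"] = lambda s: s["price"] * s["qty"]
sheet["qty"] = 5                # change an input...
sheet["total"]                  # ...and the dependent cell reads 500.0
```

The business analyst who builds a complex spreadsheet is declaring exactly this kind of dependency network, whether she thinks of it as programming or not.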
Saying all this, I can understand how mathematicians who work at a distance from computational applications might think that programming is passé. They have little experience with code themselves, and then they read vague articles in the newspapers about off-shoring and the demise of programming. Even computer scientists who work with scientists know "surprisingly little about how scientists develop and use software in their research", which is why some of them are conducting sponsored research to survey scientists on how they use computers.
But surely mathematicians are aware of the computational work in number theory that requires a nearly global network of computers to perform massive calculations, for instance, to find large prime numbers. One might dismiss such work as "merely" applications, not real math, but these applications are testing mathematical theorems about numbers in ways we could only have dreamed of in past times.
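Even the primality testing at the heart of such searches is a lovely marriage of number theory and programming. A probabilistic test such as Miller-Rabin lets a program vet enormous candidates quickly; here is a standard sketch in Python (the trial count is an arbitrary choice, and real prime hunts use far more machinery than this):

```python
import random

def is_probable_prime(n, trials=20):
    """Miller-Rabin: declares composites with certainty, primes with
    overwhelming probability (error at most 4**-trials)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):     # quick trial division
        if n % p == 0:
            return n == p
    # write n - 1 as d * 2**s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for _ in range(trials):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)               # fast modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False               # a witnesses that n is composite
    return True
```

A theorem of Fermat's era, executed billions of times a day to secure web traffic: that is mathematics done with programs.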
Math profs at mid-sized universities are not the only ones with the impression that programming is disappearing or less important than it used to be. Mark Guzdial recently wrote that some think programming isn't essential even for computer scientists:
I was at a meeting a couple weeks ago where an anecdote was related that speaks to this concern. A high-ranking NSF official made the argument that programming is not a critical skill for computer scientists. "Google doesn't want smart programmers! They want smart people!" A Google executive was in the audience and responded, "No, we want people who program."
I'm glad that Google knows better what Google needs than this particular high-ranking NSF official, and I realize that said official may only have meant that the smart people Google hires can become good programmers. But I do think that this story indicates the breadth of the misunderstanding people have about the role programming plays in the world today.
Perhaps the math profs here who said that "no one programs any more" were speaking only of math graduates from this university. But even that very limited claim is false. I suggested to my friend that they should probably survey their own alumni. I know several of them who program for a living. And some of them came back after graduation to learn how.
In recent weeks, the financial markets of the world have entered "interesting times". There is a great story to tell here about the role that computational models have played in the financial situation we face, but I have been more intrigued by another connection to computing, more specifically to software development. It turns out that these bad times for the economy are a good time to "be agile".
Paul Graham writes that this is a good time to start a start-up, in part because a start-up can be more nimble and consume fewer resources than a big software shop. A new product can grow in small steps in a market where resources are limited. Tim Bray expands on that idea in his post, A Good Time for Agility. It may be difficult to get major projects and corresponding big budgets approved in tough times, because most execs will be focused on cost containment and surviving to the quarterly report. But...
The classic Agile approach, where you pick two or three key features, spec'em out with test suites that involve the business side, build'em and test'em, and then think about maybe going back for the next two or three, well, that's starting to look awfully attractive.
Small steps and small up-front expense draw less attention than BDUF and multi-month budgets. And if they lead to concrete, measurable improvements, they have a greater chance of sticking. They might even lead to important new software.
The third example I read recently came in a posting to the XP mailing list, the link to which I seem to have lost. The gist was straightforward: The writer worked in the software arm of a major financial institution. Having previously adopted agile practices enabled his shop to shift direction on short notice in response to the market crash. They were not in the middle of a major project aimed at a specific market but in the middle of ongoing creation of loan products. When the market for their usual products deteriorated, they were able to begin delivering software to a new market relatively quickly. This did not require a new major project, only a twist on their current trajectory.
This shouldn't surprise us. Agile approaches allow us to manage risk and change at finer levels of granularity, and in a time of major change the massive dinosaurs will be at a disadvantage against more nimble species.
Not all news is so rosy. Bureaucracy can still dominate an environment. Last Friday, an alumnus of our department gave a talk for our current students on how not to stink in industry. His advice was uniformly practical, with many of his technical points reminiscent of The Pragmatic Programmer. But in response to a question about XP and agile practices, his comments were not so positive. He and his team have not yet figured out how to do planning for XP projects, so they are left with tool-specific XP practices such as pair programming and testing early and often. I think that I can help him get a grip on XP-style planning, and I offered to do so, but I think his problem goes deeper, to something he has little control over: his team's customers expect big project plans and fixed-price "contracts".
This is not a new problem. I was fortunate to visit RoleModel Software back when Ken Auer was first building it, and one topic of discussion within the company was how to educate clients about a new way of planning and budgeting for projects, and how to shift the culture when all of its competitors were doing the same old thing. His potential customers had one way of managing the risk they faced, and that was to do things the usual way, even if that led to software that was off target, over budget, and over time. I don't know much more about the issue than this and need to see if anyone has written of positive experiences with it in industry.
My former student works for a government agency, where the bureaucracy is perhaps harder to move because it is entrenched by law and administrative rule rather than shaped by market forces. I feel for him as my department continues to work on outcomes assessment. University mandates are my job's version of "the customer demands a big project plan". (Outcomes assessment is an academic form of unit testing for a curriculum.) As we try to enact an outcomes assessment plan in small steps, we face a requirement to produce a BDUF plan by the end of this year. It's hard to figure out what will work best for us if we have to predict what is best up front. Some will tell us that the academic world understands outcomes assessment well enough to design a suitable plan from scratch, but many of us in the trenches will disagree. It's certainly possible to design from scratch a plan that looks familiar to other people, but who knows if that is what will help this department and this faculty steer its programs most effectively?
Complete operational plans of this sort often end up being as useful as many of their software design counterparts. Worse, mandates for such plans also tend to be counterproductive, because when the last big plan fails, and when administration doesn't follow through by holding departments accountable for fixing the plans, faculty learn to mistrust both the mandates and the plans. That is how the link in the previous paragraph can be to a post nearly two years old, yet my department still does not have an effective assessment plan in place: the faculty have a hard time justifying spending the time and energy to take on such a big project if it is likely to fail or if not developing a plan has no consequences.
I am hoping that we can use the most recent mandate as an opportunity to begin growing an assessment regimen that will serve us well over time. I believe in unit tests and continuous feedback. I'm also willing to argue to the administration that an "incomplete" but implementable plan is better than a complete plan with no follow-through.
As the software guys are saying, this is a good time to be agile.
I'm on the road to a recruiting event in Des Moines. The event is for girls who are interested in math and science. For me, the real treat is a chance to meet Mae Jemison, the first woman of color to travel in space, on the space shuttle Endeavour in 1992. She's surely going to do a better job selling math and science to these students than I could! (Note after the talk: She did. Perhaps the best way to summarize her message is, "We have choices to make.")
A few short items have been asking me to write them:
• At the risk of living too public a life where my students can see, I will say that the personality of my current class of students is not one that gives me a lot of energy. They are either still wary or simply uninterested. This happens every once in a while, and I'll try to find a way to create more energy in the room. In any case, it's nice at least to have a student or two who are not like this.
• Kevin Rutherford has been working on a little software tool called reek, a smell detector for Ruby code.
That is what I would like to be doing right now, with either Ruby or Scheme being fine as a source language. Every time I teach programming languages I get the itch to dive deeply back into the refactoring pool. This is the primary drawback of administrative work and the primary indicator that I am probably not suited for a career in administration.
Short of working on such a cool project, blogging about interesting ideas is the next best thing.
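The idea behind a smell detector like reek can be sketched in a few lines: parse source code into a syntax tree, walk the tree, and flag structures that exceed some threshold. The sketch below is a toy analogue in Python rather than Ruby; the smell names and thresholds are my own arbitrary choices, not reek's actual rules or API.

```python
import ast

# A toy code-smell detector in the spirit of reek, sketched in Python.
# It flags two simple smells: long methods and long parameter lists.
# Thresholds and smell names are invented for illustration.

LONG_METHOD_LINES = 10
LONG_PARAMETER_LIST = 4

def find_smells(source):
    smells = []
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            # Measure the function's extent in source lines.
            length = node.end_lineno - node.lineno + 1
            if length > LONG_METHOD_LINES:
                smells.append((node.name, "LongMethod"))
            if len(node.args.args) > LONG_PARAMETER_LIST:
                smells.append((node.name, "LongParameterList"))
    return smells

sample = """
def greet(first, middle, last, title, suffix):
    return title + first + middle + last + suffix
"""
print(find_smells(sample))   # -> [('greet', 'LongParameterList')]
```

A real tool like reek detects far subtler smells, but the architecture is the same: static analysis over a parse tree, reporting matches against a catalog of smell patterns.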
• But consider this advice on writing:
If you have so many ideas, prove it to the world and start blogging. There is nothing like a blog to help you realize you have nothing new to say.
That post is really about why not to write a book. For many people, writing a book is a way to gain or demonstrate authority. Several of my friends and family have asked when I plan to write a book, and for at least a few, their desire for me is grounded in the great respect they have for the value of a book. But I think that the author of the post is correct that writing a book is an outdated way to gain authority.
The world still needs great books such as, well, Refactoring, and one day I may sit down to write one. But first I have to have something to say that is best said in a book.
Perhaps we should take this author's advice with caution. She wrote a book and markets it with a blog!
• That piece also contains the following passage:
... self-respect comes from having some sort of vision for one's life and heading in that direction. And there is no one who can give you that vision -- you have to give it to yourself, and before you can feel like you have direction, you have to feel lost -- and lost is okay.
Long-time readers of this blog know that getting lost is not only okay but also a way to demonstrate and exercise the imagination. Sometimes we get lost inside code just so that we can learn a new code base intimately.
• Finally, Seth Godin offers an unusual way to get things done:
Assemble your team (it might be just you) ... and focus like your hair is on fire. ... Do nothing except finish the project.
I need a Friday or a Monday at the office to try this out on a couple of major department projects. I was already planning a big sprint this weekend on a particularly persistent home project, and now I have a more provocative way to rev my engine.
One of my most senior colleagues has recently become enamored of Facebook. One of his college buddies started using it to share pictures, so my colleague created an account. Within minutes, he had a friend request -- from a student in one of his classes. And they kept coming... He now has dozens of friends, mostly undergrads at our school but also a few former students and current colleagues.
Earlier this week, he stopped me in the hall to report that during his class the previous hour, a student in the class had posted a message on his own Facebook page saying something to the effect, "I can't keep my eyes open. I have to go to sleep!" How does the prof know? Because they are Facebook friends, of course.
Did the student think twice about posting such a message during class? I doubt it. Was he so blinded by fatigue or boredom that he forgot the prof is his friend and so would see the message? I doubt it. Is he at all concerned in retrospect, or even just a little sheepish? I doubt it. This is standard operating procedure for a college set that opens the blinds on its life, day by day and moment by moment.
We live in a new world. Our students live much more public lives than most of us did, and today's network technology knocks down the wall that separates Them from Us.
This can be a good thing. My colleague keeps his Facebook page open in the evenings, where his students can engage him in chat about course material and assignments. He figures that his office hours are now limited only by the time he spends in front of a monitor. Immediate interaction can make a huge difference to a student who is struggling with a database problem or a C syntax error. The prof does not mind this as an encroachment on his time or freedom; he can close the browser window and draw the blinds on office hours anytime he wants, and besides, he's hacking or reading on-line most of the time anyway!
I'm uncertain what the potential downsides of this new openness might be. There's always a risk that students can become too close to their professors, so a prof needs to take care to maintain some semblance of a professional connection. But the demystification of professors is probably a good thing, done right, because it enables connections and creates an environment more conducive to learning. I suppose one downside might be that students develop a sense of entitlement to Anytime, Anywhere access, and professors who can't or don't provide it could be viewed negatively. This could poison the learning environment on both sides of the window. But it's also not a new potential problem. Just ask students about the instructors who are never in their offices for face-to-face meetings or who never answer e-mail.
I've not had experience with this transformation due to Facebook. I do have a page, created originally for much the same reason as my colleague's. I do have a small number of friends, including undergrads, former students, current colleagues, a grade-school buddy, and even my 60-something aunt. But I use Facebook sparingly, usually for a specific task, and rarely have my page open. I don't track the comments on my "wall", and I don't generally post on others'. It has been useful in one particular case, though, reconnecting me with a former student whose work I have mentioned here. That has been a real pleasure. (FYI, the link to his old site seems to be broken now.)
However, I do have limited experience with the newly transparent wall between me and my students, through blogs. It started when a few students -- not many -- found my blog and began to read it. Then I found the blogs of a few recent students and, increasingly, current students. I don't have a lot of time to read any blogs these days, but when I do read, I read some of theirs. Blogs are not quite as immediate as the Twitter-like chatter to be found in Facebook, but they are a surprisingly candid look into my students' lives and minds. Struggles they have with a particular class or instructor; personal trials at home; illness and financial woes -- all are common topics in the student blogs I read. So, too, are their joys and excitement and breakthroughs. Their posts enlighten me and humble me. Sometimes I feel as if I am privy to far too much, but mostly I think that the personal connection enriches my relationship both with individual students and with the collective student body. What I read certainly can keep me on a better path as I play the role of instructor or guide.
And, yes, I realize that there is a chance that the system can be gamed. Am I being played by a devious student? It's possible, but honestly, I don't think it's a big issue. The same students who will post in full view of their instructor that they want to sleep through class without shame or compunction are the ones who are blogging. There is a cultural ethic at play, a code by which these students live. I feel confident in assuming that their posts are authentic, absent evidence to the contrary for any given blogger.
(That said, I appreciate when students write entries that praise a course or a professor. Most current students are circumspect enough not to name names, but there is always the possibility that they refer to my course. That hope can psyche me up some days.)
To be fair, we have to admit that the same possibility for gaming the system arises when professors blog. I suppose that I can say anything here in an effort to manipulate my students' perceptions or feelings. I might also post something like this, which reflects my take on a group of students, and risk affecting my relationship with those students. One of my close friends sent me e-mail soon after that post to raise just that concern.
For the same reasons I give the benefit of the doubt to student bloggers, I give myself the benefit of the doubt, and the same to the students who read this blog. To be honest, writing even the few entries I manage to write these days takes a lot of time and psychic energy. I have too little of either resource to spend them disingenuously. There is a certain ethic to blogging, and most of us who write do so for more important purposes than trying to manipulate a few students' perceptions. Likewise, I trust the students who read this blog to approach it with a mindset of understanding something about computer science and just maybe to get a little sense of what makes their Dear Old Professor tick.
I know that is the main reason I write -- to figure out how I tick, and maybe learn a few useful nuggets of wisdom along the way. Knowing that I do so in a world much more transparent than the one I inhabited as a CS student years ago is part of the attraction.
Some days, things go well, beyond expectation. Enjoy them! Today was one for me.
I've been thinking a lot about how students learn a new style of programming or a language that is quite different from their experience. Every class has its own personality, which includes interaction style, interest in Big Ideas, and curiosity. Last night it occurred to me that another important part of that personality is trust.
I was grading a quiz and suddenly felt a powerful personal connection to Gunnery Sergeant Foley from one of my favorite movies, An Officer and a Gentleman. There is a scene halfway through the film when he catches the protagonist, Zack Mayo, running an illegal contraband operation out of his barracks. The soldiers are in their room one afternoon when Foley walks in and declaims, "In every class, there's always one guy who thinks he's smarter than me. In this class, that's you, Mayo." He then dislodges a ceiling tile to reveal Mayo's stash of contraband and lets everyone know the jig is up.
Beyond the occasional irrational desire I have to be Lou Gossett breaking the spirits of cocky kids and building them back up from scratch, while grading solutions to a particular exam problem I couldn't help but think, "In every class, there's always one guy who thinks he's smarter than me..." Some of the students seemed to be going out of their way not to use the technique we had learned in class, which resulted in their writing complex, often incorrect code. More practically for them, they ended up writing more code than they needed, which spent extra time they didn't have the luxury of spending. I felt bad for them grade-wise, but also a little sad that they seemed to have missed out on the beautiful idea behind the programming pattern they were not using.
(Don't worry, class. This irrational desire of mine is fleeting. I don't want your DOR. Quite the contrary; I am looking for ways to help you succeed!)
Sometimes, I wonder if the problem is that students don't really trust me. Why should they? Sure, I'm the teacher, but they feel pretty good about their programming skills, and the patterns I show them may be different and complex enough that they'd rather trust their own skills than my claim that, say, mutual recursion makes life better. They'll learn that with enough experience, and then they may realize that they can trust me after all.
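The claim that mutual recursion makes life better can be made concrete with a small example. When the data itself is mutually recursive -- a tree is either an atom or a list of trees -- the cleanest code mirrors that shape with two functions that call each other. This sketch is in Python for illustration, though the idiom comes from the Scheme tradition; the function names are my own.

```python
# Mutual recursion that mirrors mutually recursive data.
# A "tree" is either an atom or a list of trees, so one function
# handles a single tree and another handles a list of trees;
# each calls the other, just as the data definition suggests.

def count_atoms(tree):
    if isinstance(tree, list):
        return count_atoms_in_list(tree)
    return 1                      # an atom counts as one

def count_atoms_in_list(trees):
    if not trees:
        return 0
    # First tree in the list, plus the rest of the list.
    return count_atoms(trees[0]) + count_atoms_in_list(trees[1:])

print(count_atoms(["a", ["b", ["c", "d"]], "e"]))   # -> 5
```

Students who trust their own ad hoc loops instead often end up re-deriving this structure the hard way, with flags and nested conditionals that obscure the simple shape of the data.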
In many ways, though, a bigger part of the problem may be a failure of storytelling. On my side are the stories I tell to engage students in an idea and its use. To paraphrase Merlin Mann paraphrasing Cliff Atkinson, I need to tell a story that makes the students feel like a character with a problem they care about and then show how our new way of solving their problem -- their problem -- makes them winners in the end. I think I do a better job of this now than I did ten years ago in this course, but I always wonder how I can do better.
On their side is, perhaps, a failure of their own storytelling -- not just about bugs, as Guzdial writes, but about the problem domain itself, the data types at play, and the kind of problem they are solving. I suspect writing code over nested symbolic lists that represent programs is so different from the students' experience that many of them have a hard time getting a real sense of what is going on. As long as the domain and task remain completely abstract in the mind, the problems look almost like random markings on the page. Where to start? That disorientation may account for their not starting at what seems to me the obvious place.
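One way to make "programs as nested lists" concrete is a tiny evaluator: a program is data, and evaluating it is structural recursion over that data. The sketch below uses nested Python lists to stand in for the s-expressions students manipulate in Scheme; the representation and function name are hypothetical, chosen only for illustration.

```python
# A tiny evaluator over nested lists that represent programs.
# ["+", 1, ["*", 2, 3]] stands in for the s-expression (+ 1 (* 2 3)).

def evaluate(expr):
    if not isinstance(expr, list):       # a number evaluates to itself
        return expr
    op, *args = expr
    values = [evaluate(a) for a in args]  # structural recursion on subtrees
    if op == "+":
        return sum(values)
    if op == "*":
        result = 1
        for v in values:
            result *= v
        return result
    raise ValueError("unknown operator: " + str(op))

print(evaluate(["+", 1, ["*", 2, 3]]))   # -> 7
```

Seeing the nested list as a tree, and the evaluator as a walk over that tree, is exactly the sense of the domain that disoriented students lack.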
As a teacher, failures in their storytelling become failures in my storytelling. I need to reconsider how I communicate the "big picture" behind my course. Asking students to create their own examples is one micro-step in this direction. But I also need to think about the macro-level -- something like XP's notion of metaphor. That practice has proved to be a stumbling block for XP, and I expect that it will remain a challenge for me.
Last time I mentioned a Supreme Court justice's thoughts on how universal access to legal case data changes the research task associated with the practice of the law. Justice Roberts's comments brought to mind two thoughts, one related to the law and one not.
As a graduate student, I worked on the representation and manipulation of legal arguments. This required me to spend some time reading legal journals for two different purposes. First, I needed to review the literature on applying computers to legal tasks, and in particular on how to represent knowledge of statutes and cases. Second, I needed to find, read, and code cases for the knowledge base of my program. I'm not that old, but I'm old enough that my research preceded the Internet Age's access to legal cases. I went to the campus library to check out thick volumes of the Harvard Law Review and other legal collections and journals. These books became my companions for several months, as I lay on the floor of my study and pored over them.
When I could not find a resource I needed on campus, I rode my bike to the Michigan State Law Library in downtown Lansing to use law reviews in its collection. I was not allowed to take these home, so I worked through them one at a time in carrels there. I was quite an anomalous sight there, in T-shirt and shorts with a bike helmet at my side!
I loved that time, reading and learning. I never considered studying the law as a profession, but this work was a wonderful education in a fascinating domain where computing can be applied. My enjoyment of the reading almost certainly extended my research time in grad school by a couple of months.
The second thought was of the changes in chess brought about by the application of simple database technology. I've written about chess before, but not about computing applications to it. Of course, the remarkable advances in chess-playing computers that came to a head in Hitech and Deep Thought have now reached the desktop in the form of cheap and amazingly strong programs. This has affected chess in so many ways, from eliminating the possibility of adjournments in most tournaments to providing super-strong partners for every player who wants to play, day or night. The Internet does the same, though now we are never sure if we are playing against a person or a person sitting next to a PC running Fritz.
But my thoughts turned to the same effect Justice Roberts talked about, the changes created by opening databases on how players learn, study, and stay abreast of opening theory. If you have never played tournament chess, you may not be aware of how much knowledge of chess openings has been recorded. Go to a big-box bookstore like Amazon or Barnes and Noble or Borders and browse the library of chess titles. (You can do that on-line now, of course!) You will see encyclopedias of openings like, well, the Encyclopedia of Chess Openings; books on classes of openings, such as systems for defending against king pawn openings; and books upon books about individual openings, from the most popular Ruy Lopez and Sicilian Defense to niche openings like my favorites, Petroff's Defense and the Center Game.
In the olden days of the 1980s, players bought books on their objects of study and pored over them with the same vigor as legal theorists studying law review articles. We hunted down games featuring our openings so that we could play through them to see if there was a novelty worth learning or if someone had finally solved an open problem in a popular variation. I still have a binder full of games with Petroff's Defense, cataloged using my own system, variation by variation with notes by famous players and my own humble notes from unusual games. My goal was to know this opening so well that I could always get a comfortable game out of the opening, against even stronger players, and to occasionally get a winning position early against a player not as well versed in the Petroff as I.
Talk about a misspent youth.
Chessplayers these days have the same dream, but they rarely spend hours with their heads buried inside opening books. These days, it is possible to subscribe to a database service that puts at our fingertips, via a computer keyboard, every game played with any opening -- anywhere in the recorded chess world, as recently as the latest update a week ago. What is the study of chess openings like now? I don't know, having grown up in the older era and not having kept up with chess study in many years. Perhaps Justice Roberts feels a little like this these days. Clerks do a lot of his research, and when he needs to do his own sleuthing, those old law reviews feel warm and inviting.
I do know this. Opening databases have so changed chess practice, from grandmasters down to patzers like me, that the latest issue of Chess Life, the magazine of U.S. Chess, includes a review of the most recent revision of Modern Chess Openings -- the opening bible on which most players in the West once relied as the foundation of broad study -- whose primary premise is this: What role does MCO play in a world where computer database is king? What is the use of this venerable text?
From our gamerooms to our courtrooms, applications of even the most straightforward computing technology have changed the world. And we haven't even begun to talk about programs.
The Chief Justice of the U.S. Supreme Court, John Roberts, spoke last week at Drake University, which merited an article in our local paper. Roberts spoke on the history of technology in the law, and in particular on how the internet is changing in fundamental ways how the law is practiced. He likened the change to that created by the printing press, an analogy I use whenever I speak with parents and prospective CS majors.
The detective work that was important and rewarding when I was starting out is now almost ... irrelevant.
I wonder if this will have an effect on the kind of students who undertake study of the law, or the kind of lawyers who succeed in the profession. I don't imagine that it will affect the attractiveness of the law for a while, because I doubt that a desire to spend countless hours poring through legal journals is the primary motivator for most law students. Prestige and money are certainly more prominent, as is a desire to "make a difference". But who performs best may well change, as the circumstances under which lawyers work change. This sort of transformation is almost unavoidable when a new medium redefines even part of a discipline.
Roberts is perhaps concerned about this part of the change himself. Technology makes information more accessible, which means skill in finding it is no longer as valuable. How about skill at manipulating it? Being able to find information more readily can liberate practitioners, but only if they know what to do with it.
There's a lot of value in thinking outside the box. But the key word is "thinking". ... You cannot think effectively outside the box if you don't know where the box is.
I love that sentence! It's a nice complement to a phrase of Twyla Tharp's that I wrote about over three years ago: Before you can think out of the box, you have to start with a box. Tharp and Roberts are speaking of different boxes, and both are so right about both boxes.
I had one of those agile moments on Wednesday. A colleague stopped by my office to share his good feeling. He had just come from a CS 2 lab. "I love it whenever I design a lab in which students work in pairs. There is such life in the lab!" He went on to explain the interactions within pairs but also across pairs; one group would hear what another was thinking or doing, and would ask about it. So much learning was in the air.
This reminded me of the old joke... Patient: "Doctor, it hurts when I do this." (Demonstrates.) "Can you help me?" Doctor: "Sure. Don't do that."
Of course, it reminded me of the negative space around the joke. Patient: "Doctor, life is great when I do this." (Demonstrates.) "Can you help me?" Doctor: "Sure. Do more of that."
"But..." But. We faculty are creatures of habit, both in knowing and in doing. We just know we can't teach all of our material with students working in pairs, so we don't. I think we can, even when I don't follow my own advice. (Doctor, heal thyself!) We design the labs, so if we want students to work in pairs, we can have them work in pairs.
I've had one or two successful experiences with pair programming all the time in closed labs. Back when we taught CS1 and CS2 in C++, in the mid-1990s, and I was teaching our first-year courses a lot, I designed all of my labs for students working in pairs. I wish I could say I had been visionary, but my motivation was extrinsic: I had 25-30 students in class and 15 computers in the lab. Students worked with different partners every week, in pseudo-random assignments of my own devising.
My C++-based courses probably weren't very good -- I was relatively new to teaching, and we were using C++ after all -- and the paired programming in our lab sessions may have been one of the saving graces: students shared their perplexity and helped each other learn. When they worked on outside programming assignments for the course, they could call on a sturdy network of friends they had built in lab sessions. Without the pairs, I fear that our course would have worked well for very few students.
If something works well, let's try to understand the context in which it works, and then do it more often in those contexts. That's an agile sentiment, whether we apply it to pair programming or not. Whether we apply it at the university or in industry. Whether we apply it to software development or any other practice in which we find ourselves engaged.