Here are two signs that you may face a tough 20-miler:
... and that doesn't even take into account the fact that it's a 20-miler!
There isn't much I can do to change the weather, and when it stays hot and humid through the night, there isn't much relief to be had. What I can do is run slowly and take plenty of fluids. Choosing a shaded route helps, too.
My soreness is under some control, though I depend on my harder workouts to get faster for my race. This week, however, I overdid both of my fast workouts, partly out of ignorance and partly out of overeagerness. (Stick to the plan!)
This morning, I ran slowly, but "on plan" -- ≤ 70 seconds over marathon goal pace, which is within the 60-90 seconds over marathon goal pace recommended by the coach who designed the plan I'm following. The route I ran offered shade for much of the last 8 miles, which certainly helped as the temperature rose into the mid-80s. I also drank a lot of fluid -- at least when compared to my usual practice. I consumed 19 oz. of the sports drink that will be available on the race course this fall. Last year's experience suggests that my body may not absorb the liquid I take on the course all that well, especially this brand. That almost certainly affected my hydration during the marathon last year. This year I hope to train my body to take more liquid, and this kind of liquid in particular. If nothing else, I need to get used to living with the flavor for almost four hours!
The first two hours today felt good, much better than the first half of last week's 18-miler. But I began to feel the effects of the weather and fatigue in the last hour. Then I ran a bit faster for my 18th mile (8:15), just to see if I had anything left. I did -- but then not for the last two miles. This is an indulgence I need to give up. (Stick to the plan!)
Last year at this time I ran a great 20-miler on a rails-to-trails trail in Muncie, Indiana. That run felt much better than today's, and I ran much faster. I was in better shape last year after a perfect spring 2005. The timing of my hard workouts and the weather probably contributed, too. But running too fast on my long runs last year also probably hindered my speed training last year and may have led me to peak too soon, both physically and mentally. So I am not too worried about running slower on my first 20-mile run of 2006.
For now, I am happy to have my first pair of big weeks (50 and 52 miles) under my belt. Now I look forward to a consolidation week -- 48 miles, with a long run of only 14 or so. Such periods, which I schedule every three weeks, let the body recover from recent increases in hard work and mileage and adapt to these new stresses. Then I face my next big climb: weeks of 56 and 58 miles, with long runs of 22 and 24. I think of that pair as akin to the Tour de France entering the Alps for a couple of massive climbs...
That was my standard answer to a particular family of questions for many years. The prototypical question in this family was, "Do you think you'll ever want to be a department head?" And I always said, "July 27 at 2:00 PM".
If you are a faculty member or even a graduate student, you probably know just what I meant. Summer is a great time to be a faculty member. We get to read, write code, play with new ideas -- all without having to worry about preparing for and meeting classes, attending department meetings, or otherwise doing much with the day-to-day business of the university. Even better, we can read, write, and play most anywhere we want, most anytime we want. We can become absorbed in a program and work all day, and never have to worry about a meeting or a class interrupting the flow. We can work from home or from the road, but we don't have to work at our offices unless we want to. No one expects us to be there much.
Department heads, though, are administrators. The daily business of the university goes on, and the heads have to keep tabs on it. There are salary letters to be written, budgets to be closed, memos to write for the dean and provost, and inquiries to be answered. The university pays their salary during the summer, and the university expects a return on its investment.
So, there we are, 2:00 PM on a beautiful Thursday, July 27th. As a faculty member, I could be almost anywhere, doing almost anything, learning something new. As a department head, I would be in the office, dressed well enough to meet the public if necessary, "working".
"July 27 at 2:00 PM" meant "I don't think so, and here's why...".
The great irony is that I am now finishing up my first year as department head, and it is nearly 2:00 PM on Thursday, July 27. It's overcast outside, not sunny, and frightfully muggy. Where am I? In my home study. What am I doing? Reading from the stack of papers that has built up over the last few months, and writing a quick entry for this blog. Not much different than any other summer in the past 15 years.
However, I stand by my metaphorical answer. All other things being equal, summer life as a faculty member is freer and open to more possibilities than summer life as a department head. It requires some adjustment.
My reading today has been from an old-fashioned pile of stuff, the papers and journal articles that I run into during busy days and set aside for later. I don't know about you, but my eyes are always bigger than my stomach when it comes to the list of things I want to read. So I print them out and wait for a free moment. Eventually the pile of papers exceeds any realistic amount of time I have to read and the amount of space I have to store them. Today, I made a few free moments to do triage, tossing papers that I know I'll never get to and, every so often, stopping to read something that sounds good right now. A very few papers earn a spot in the new, streamlined pile of stuff, in an almost certainly fantastic hope that some day I'll get to them.
Here are two passages I read today that made the effort worthwhile. First, a quote from Uncle Bob Martin, from a 2004 message to the extreme programming mailing list, in the thread "Designing before doing":
> "But how can you do anything without designing it first?!" ...
You can't. You always have to design something before you build it.
The question is: "How much do I have to design before I build?"
The answer is: "Just enough so that what I build gives me better insight into the design of the next step."
Seen this way, the act of building is *part* of the act of design, and the original question inverts itself: "How can you design something without verifying your design decisions by implementing them?"
Well said. Thank you, Uncle Bob.
Finally, a bit of humor from Eric Raymond's classic FAQ, How to Become a Hacker:
Q: I'm having problems with my Windows software. Will you help me?
A: Yes. Go to a DOS prompt and type "format c:". Any problems you are experiencing will cease within a few minutes.
Enjoy your July 27!
Summer is a great time to read old papers, though this summer has been too light on free reading time for my tastes. This morning, I read David Gries's 1974 SIGCSE paper What Should We Teach in an Introductory Programming Course?, which came up in a SIGCSE mailing list discussion earlier this year. Gries has long been involved in the effort to improve computer science instruction, from at least his 1973 PL/I-based intro text with Conway, An Introduction to Programming up to his recent multimedia effort Program Live!, with his son, Paul. In part because his work tends to be more formalist than my own, I usually learn a lot from his writing.
This paper focuses on two of what Gries thinks are the three essential aspects of an intro course: how to solve problems and how to describe solutions algorithmically. His discussion of general problem solving was the reason the paper came up in the SIGCSE discussion, which considered how best to teach students "general problem-solving skills". Gries notes that in disciplines such as philosophy one studies problem-solving methods as philosophical stances, but not as processes to try in practice, and that in subjects such as mathematics one learns how to solve specific classes of problems by dint of repetition. By exposing students to several classes of problem, teachers hope that they generalize their learning into higher-order skills. This is true in many disciplines.
But we in computer science face a more challenging problem. We strive to teach our students how to write programs that solve any sort of problem, whether from business or science or education. In order to program well across all the disciplines, our students must learn more general problem-solving skills.
That said, I do not think that we can teach general problem-solving skills in our intro course, because I do not think the brain works that way. A lot of evidence from cognitive psychology shows that humans learn and tend to operate in specific contexts, and expertise transfers across boundaries only with great attention and effort. The real question is not how we can teach general problem-solving skills in our intro course, but rather how we can help students develop general problem-solving skills over the course of our curriculum. My current belief is that we should teach programming in context, while opportunistically pointing out the patterns that students see and will see again later. Pick a context that students care about, so that they will have plenty of their own motivation to succeed. Then do it again, and again. By making explicit the more general patterns that students see, I think that we can do better than most CS and math curricula currently do. We would not be teaching general problem-solving skills in any course so much as creating an environment in which students can maximize their likelihood of "getting it". Our courses should not be weed-out courses, because I think most students can get it -- and we need them to, whether as CS majors or as non-majors prepared to participate in a technological society.
When Gries looked at what people were teaching in intro CS courses back then, he found them doing what many of our courses do now: describing a set of tools available in a programming language, showing students a few examples, and then sending them off to write programs. Not much different than our math brethren teaching differentiation or integration. Gries exposes the folly in this approach with a little analogy:
Suppose you attend a course in cabinet making. The instructor briefly shows you a saw, a plane, a hammer, and a few other tools, letting you use each one for a few minutes. He next shows you a beautifully-finished cabinet. Finally, he tells you to design and build your own cabinet and bring him the finished product in a few weeks.
You would think he was crazy!
(For some reason, this passage reminded me of a witty student evaluation that Rich Pattis shared in his SIGCSE keynote address.)
Gries offers a few general principles from the likes of Descartes, Polya, and Dijkstra and suggests that we might teach them to students, that they might use the principles to help themselves organize their thoughts in the midst of writing a complex program. I suspect that their greatest value is in helping instructors organize their thoughts and keep the ultimate goal of the course in mind. For example, while reading this section, I was struck by Polya's fourth phase of solving problems: "Look back". We instructors must remember to give our students both the opportunity to look back at what they've done and think about their process and product, and the time to consolidate their learning. So often, we are driven by the need to cover more, more, more material, and the result is a treadmill from which students fall at the end of the course, exhausted and not quite sure what all just happened.
Gries then offers a language for describing algorithms, very much in sync with the Structured Programming movement of the 1970s. My primary reaction to this discussion was "where are the higher-level patterns?" If all we teach students are basic statements, procedure definitions and calls, and control structures, we are operating only barely above the level of the programming language -- and thus leaving students to discover on their own fundamental patterns like Guarded Linear Search.
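For readers who haven't run across the pattern, here is a minimal sketch of Guarded Linear Search (names and return convention are my own, not Gries's): the loop guard checks the bound before examining an element, which is exactly the kind of idiom novices rediscover painfully on their own if we never name it for them.

```python
def guarded_linear_search(items, target):
    # The guard "i < len(items)" is checked *before* "items[i] != target",
    # so we never index past the end of the list when the target is absent.
    i = 0
    while i < len(items) and items[i] != target:
        i += 1
    # i is either the index of the first match or len(items) if there is none.
    return i if i < len(items) else -1

print(guarded_linear_search([3, 1, 4], 4))   # -> 2
print(guarded_linear_search([3, 1, 4], 9))   # -> -1
```

The point of teaching the pattern, rather than just loops, is that the student learns why the two tests must appear in that order.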
What language should we teach in the intro course? Gries attacks this question with gusto. As I've written before, language sniping is a guilty pleasure of mine, and Gries's comments are delightful. Indeed, this whole paper is written in a saucy style not often seen in academic papers, then and especially now. I wonder if most SIGCSE referees would let such style pass these days?
First, Gries reminds us that in an intro the programming language is only a vehicle for teaching the ideas we think are important. It should be as close as possible to the way we want students to think about programs, but also "simple, elegant, and modular, so that features and concepts not yet taught won't get in the way." (As a smug language weenie, I am already thinking Smalltalk or Scheme...) But we usually teach the wrong language:
The language taught is often influenced by people outside the computer science profession, even though their opinions are not educated enough to deserve recognition.
At the time he wrote the paper, Gries would like to have taught Pascal, BLISS, or a variant of Algol, but found that most departments taught Fortran, BASIC, or PL/I. Gries minces no words. On Fortran:
Fortran is out of date and shouldn't be used unless there is absolutely nothing else available. If this is the case, use it under protest and constantly bombard the manufacturers or other authorities with complaints, suggesting they make available a more contemporary language.
I learned Fortran in my CS 1 course back in 1983!
[It] should never have come into existence. When it was contemplated, its designers should have done their research to see what programming and programming languages are all about before plunging in.
I learned Basic as my first programming language in high school, in 1980!
But then Gries expresses one of the tenets of his approach that I disagree with:
I have doubts about teaching students to think "on-line"; algorithms should be designed and written slowly and quietly at one's desk. Only when assured of correctness is it time to go to the computer and test the algorithm on-line.
First, notice the older sense of "on-line". And then ask yourself: Is this how you program, or how you want to program? I know I'm not smart enough to get any but the most trivial programs correct without testing them on-line.
Of the choices realistically available, Gries decided that PL/I was the best alternative and so wrote his text with Conway. I used the 1979 version of this text as the PL/I reference in my data structures course back in 1983. (It was the language companion to the language-independent Standish text I so loved.)
Even though Gries grudgingly adopted PL/I, he wasn't happy:
What's wrong with PL/I? Its syntax is enough to offend anyone who has studied English grammar; its data structure facilities ... could have been less clumsy ...; it is not modular ...; its astonishment factor is much too high (e.g., what is 25 + 1/3 ?); ... and so on.
But with the right subset of the language, Gries felt he could teach structured programming effectively. That is just the sort of compromise that C++ advocates made in the 1990s. I willingly accepted C++ as a CS 1 language myself, though it didn't take long for me to realize that this was a mistake. By comparison, PL/I was a relatively nice language for novices.
The last section of Gries's paper turns to the topic of program documentation. His central tenet will sound familiar to agile programmers of the new century:
Program documentation should be written while the program is being written, if not before, and should be used by the programmer in proving correctness and in checking his program out.
This is a fine argument for test-driven development! This is a common theme among formalist computer scientists, and one I've written about with regard to Edsger Dijkstra. The place where the agile folks diverge from folks like Gries and Dijkstra is in their strong conviction that we should use code -- executable tests -- to document the intended behavior of the system. If the documentation is so valuable, why not write it in a form that supports automated application and continuous feedback? Sitting quietly at one's desk and writing an outline of the intended algorithm by candlelight seems not only quaint but also sub-optimal.
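To make the contrast concrete, here is a hypothetical example of documentation-as-code (the function and its tests are my own illustration, not anything from Gries or the agile literature): instead of outlining an algorithm in prose at one's desk, we state its intended behavior as executable assertions that can be re-run for continuous feedback.

```python
def insertion_sort(xs):
    # Build the result by inserting each element into its sorted position.
    result = []
    for x in xs:
        i = 0
        while i < len(result) and result[i] <= x:
            i += 1
        result.insert(i, x)
    return result

# The tests *are* the documentation: they state the contract,
# and unlike a prose outline, they complain when the code drifts.
assert insertion_sort([]) == []
assert insertion_sort([3, 1, 2]) == [1, 2, 3]
assert insertion_sort([5, 5, 1]) == [1, 5, 5]
```

Written before the function, these assertions play the role Gries assigns to documentation: they guide the implementation and check it, but mechanically rather than by hand.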
The outline form that Gries recommends is certainly more palatable than other alternatives, such as flowcharts. I think that the modern rendition of this approach is Matthias Felleisen's design recipe approach, described most completely in How to Design Programs. I have great respect for this work and am always looking for ways to use its ideas to improve how I teach.
Gries concludes his paper with good advice on "practicing what you teach" for any instructor:
The students will easily sense whether you believe in what you tell them, and whether you yourself practice what you teach.
He wrote this at a time when many CS instructors needed to be retrained to teach the new orthodoxy of structured programming. It has been equally true over the last ten years or so, with the move to object-oriented programming and then agile approaches. One of the reasons I dislike textbooks these days is that too often I get the sense that the authors don't really believe what they are saying or, if they do, that the belief is more a skin they've put on than a deep grokking of the ideas. Gries advises ways in which to deepen one's understanding, including the surprisingly surprising "write several programs, both large and small, using the tools and techniques advocated". Why should this be surprising to anyone? I don't know, but I wonder how many folks who now teach an OO intro course have ever written and lived inside a large object-oriented program.
The end of this paper supports a claim I made about academic conservatism a couple of years ago, and brings us back to Dijkstra again. First he expresses hope:
You would think that the University, where one searches for truth and knowledge, would be the place for innovative thinking, for people are tuned to new and better ideas.
... and then he quotes Dan McCracken, who had surveyed academics about their intro courses:
"Nobody would claim that Fortran is ideal for anything, from teachability, to understandability of finished programs, to extensibility. Yet it is being used by a whopping 70% of the students covered by the survey, and the consensus among the university people is that nothing is going to change much anytime soon."
Does this sound like educators who are committed to teaching concepts, to teaching people what they need to know to prepare for the future?
As noted above, my alma mater was still teaching Fortran in CS 1 into the 1980s. Gries is hard on his colleagues, as I have been at times, but the truth is that changing how one programs and teaches is hard to do. And he and I have been guilty of making PL/I-like compromises, too, as we try to introduce our own ideas. The lesson here is not one of blame but one of continually reminding ourselves of what matters and trying to make those things happen in our courses.
Reading this paper was a nice diversion from the other duties I've been facing lately, and a nice way to start preparing more for my fall CS 1 course.
I am coming to the end of my first year as department head, which officially began on August 1 last year. In that time, I have written less about management than I had expected. Why? One ingredient is that I blog using my public identity. I don't often mention UNI or the University of Northern Iowa in my entries, but the blog is located in my departmental web space, and I provide plenty of links off to other UNI pages.
For much of what I've thought to write in the last year, I would have had a hard time writing in an anonymous way; too often, it would have been obvious that my comments were about particular individuals -- folks that some of my readers know, or could know, and folks who themselves might stumble upon my blog. In a couple of CS curriculum entries such as this one, I have referred to a nameless colleague who would surely recognize himself (and our current and former students can probably identify him, too) -- but those were posts about CS content, not discussions of my role as head.
Why would that be a problem? I think that I am more prone to write these days on negative experiences that occur as an administrator, manager, or dare I say leader than I am to write on positive events. This is the same sort of motivation I have for writing software patterns, which grow out of negative experiences encountered while programming and designing. But in those cases I also have a solution that resolves the forces. As an administrator, I am mostly still looking for solutions, feeling my way through the job and trying to learn general lessons. And that gets us to the real reason I haven't written much on this topic: I'm not there yet. I don't think I've learned enough to speak with any amount of authority here, and I don't have the confidence to ruminate yet.
As I wrote back in May, I thought I might have more time to write in the summer, at least more regularly. But instead of fewer distractions, I think I've experienced more. I've actually had less concentrated time to write. In that same entry, I related a quote from Henri Nouwen that "... the interruptions were my real work." The idea behind this quote is that we must come to peace with the distractions and view them as opportunities to serve. In a rational sense, I understand this, but I am not there yet. Sometimes, interruptions just seem like interruptions. I can go home at the end of a day of interruptions feeling like I did something useful, but I also usually feel unfulfilled at not having accomplished my other goals for the day.
Sometimes, the interruption really does turn out to be my real task, one that displaces other plans in a meaningful way. The best example of that this summer has been a long-tabled proposal for our department to offer a B.S. in Software Engineering. I definitely have plenty to write about this in the next couple of weeks, but it won't be my colleagues here who might be unhappy with what I say; it will be colleagues at our sister state institutions.
This week has been filled with nothing but distractions of the former type. For most of the week, I have felt like a preemptible processor thrashing on a steady flow of interrupts. On the other hand, I did manage to tie up some loose ends from old projects, which should make August a more enjoyable month.
In that May entry, I also quoted Chad Fowler's vignette about "fighting the traffic". I have come to realize that, unlike Fowler's cabbie, I don't love to fight the traffic. At least not as my primary job. I know that it is an essential part of this job, so I am looking for middle ground. That's a meta-goal of mine for the coming year. I'd also like to refine how I respond to interrupts and how I schedule and manage my time. This year will tell me a lot about whether I should consider this as a long-term professional opportunity.
But now I am ready for a little vacation. I haven't taken any break other than at Christmas since the long weekends in May and July of 2005. While I am not despairing for my job, I am reminded of a quote I used when starting as head last August 1:
Oft expectation fails, and most oft there
Where most it promises; and oft it hits
Where hope is coldest, and despair most fits.
-- William Shakespeare
All's Well That Ends Well (II, i, 145-147)
Earlier this summer, I had a fantasy of a running vacation: Drive somewhere mostly isolated, and find a home base in a little motel. Get up every morning and run from 5:30 until 9:00 or so, anywhere from 12 to 15 miles. Come back, shower, eat, and nap until noon. Then spend the day reading, writing, and goofing off. Mmmm.
But I also have a strong desire to spend some time with my wife and daughters, and so that's what I'll do: hang around home. I'll do some "home"work, relax and read. I'll probably also spend a little relaxed time corralling my work to-do list, to prepare for returning to the office. That will actually feel good. I'll even do some disciplined reading of work e-mail -- but I don't want to be this guy:
If I have to go an extreme, I'd rather be Delbert T. Quimby, a guy from an editorial cartoon by Ed Stein back in 1995. I can't find this on-line and don't have a scanned version of the cartoon, so you'll have to settle for my description. The scene is a cozy den, with a wall full of books. In the soft spotlight at the middle of the room is our protagonist, comfortably dressed, book in lap, glass of wine on the side table. The caption reads, "Eccentric Internet holdout Delbert T. Quimby passes yet another day non-digitally."
Back in 1995, such a maneuver was likely viewed as an act of rebellion; today, it would be viewed by many as just plain nuts. But, you know, it sounds really good right now. So who knows?
(My students probably consider me to be a Delbert T. Quimby already, not for the wardrobe and dumpy physique but for this neo-Luddite fact: I still have only a dial-up Internet connection at home!)
Oh, and I will definitely run a lot of miles next week -- 52 or so -- and blog a bit.
Today I've been cleaning out an old-fashioned sort of current stuff folder: a box of papers that had gathered on my desk last year and made the move en masse when we moved the department office to its new building. It is truly amazing in this day of digital technology and on-line communication that so much paper still flows among the various units of a large organization. Most of the paper in the box I attacked today has hit either the recycle bin or the shredder; the rest, mostly related to enrollment and budget, has been safely filed away, though I'm not sure how often it will be used.
Occasional pages contain highlighted quotes that struck me as worthy of keeping at some point last year. Not all seem as worthy today, but a couple ring as true now -- or truer -- than they did then.
One is "Midlife in Academe", an essay by Jeffrey Nesteruk in the July 8, 2005, issue of the Chronicle of Higher Education. Nesteruk recounts the enlightenment that came over him as he reached middle age, a tenured full prof who has reached most of his external professional goals. But this passage expresses a feeling I have after a year as department head:
... I have to be careful how I handle the loss of my professional innocence. I know the personalities in my academic world. I know the political divides, I know the battles that can be won and those that can't. But there's a risk in knowing too well your world's limits. I don't want that knowledge to constrain my imagination.
Like teaching and writing, administering a department and trying to lead it require imagination. It's important not to get boxed in by the geography you've observed so far.
Then again, maybe I should be running for the hills away from this job. From the International Association for Pattern Recognition's IAPR Newsletter, date unknown:
Administration is abhorred by all right-thinking academics and best to be avoided whenever possible. When done badly it causes grief; when done well it attracts even more to be done.
I've already lived both sides of that equation, and I expect the coming year to be more of the same.
One of the enjoyable outreach activities I've been involved with as department head this year has been the state of Iowa's Information Technology Council. A few years back, the Iowa Department of Economic Development commissioned the Battelle corporation to study the prospects for growing the state's economy in the 21st century. They focused on three areas: bioscience, advanced manufacturing, and information technology. The first probably sounds reasonable to most people, given Iowa's reputation as an agriculture state, but what of the other two? It turns out that Iowa is much more of a manufacturing state than many people realize. Part of this relates back to agriculture. While John Deere is headquartered in Moline, Illinois, most of its factories are in Iowa. We also have manufacturers such as Rockwell Collins and Maytag (though that company has been purchased by Whirlpool and will close most or all of its Iowa locations soon).
But information technology? Des Moines is home to several major financial services companies or their regional centers, such as Principal Financial Group and Wells Fargo. Cedar Rapids has a few such firms, too, along with other companies with a computing focus such as NCR Pearson and ACT.
IDED created the IT Council to guide the state in implementing the Information Technology Strategic Roadmap developed by Battelle as a result of its studies. (You can see the report at this IDED web page.) The council consists of representatives from most of Iowa's big IT firms and many of the medium-sized and small IT firms that have grown up throughout the state. Each of the three state universities has a representative on the council, as does the community college system and the consortium of Iowa's many, many private colleges and universities. I am UNI's representative.
The council has been meeting for only one year, and we have spent most of our time really understanding the report and mapping out some ideas to act on in the coming year. One of the big issues is, of course, how Iowa can encourage IT professionals to make the state their home, to work at existing companies and to create innovative start-ups that will fuel economic growth in the sector. Another part of the challenge is to encourage Iowa students to study computer science, math, and other science and engineering disciplines -- and then to stay in Iowa, rather than taking attractive job offers from the Twin Cities, Chicago, Kansas City, and many other places with already-burgeoning IT sectors.
To hear Paul Graham tell it, we are running a fool's errand. Iowa doesn't seem to be a place where nerds and the exceedingly rich want to live. Indeed, Iowa is one of those red states that he dismisses out of hand:
Conversely, a town that gets praised for being "solid" or representing "traditional values" may be a fine place to live, but it's never going to succeed as a startup hub. The 2004 presidential election ... conveniently supplied us with a county-by-county map of such places. 
Actually, as I look at this map, Iowa is much more purple than red, so maybe we have a chance! I do think that a resourceful people that is willing to look forward can guide its destiny. And the homes of our three state universities -- Iowa City, Ames, and Cedar Falls -- bear the hallmark of most university towns: attracting and accepting more odd ideas than the surrounding environment tends to accept. But Iowans are definitely stolid Midwestern US stock, and it's not a state with grand variation in geography or history or culture. We have to bank on solidity as a strength and hope that some nerds might like to raise their families in a place with nice bike trails and parks, a place where you can let your kids play in the neighborhood without fearing the worst.
We also don't have a truly great university, certainly not of the caliber Graham expects. Iowa and Iowa State are solid universities, with very strong programs in some areas. UNI is routinely praised for its efficiency and for its ability to deliver a solid education to its students. (Solid -- there's that word again!) But none of the schools has a top-ten CS program, and UNI has not historically been a center of research.
I've sometimes wondered why Urbana-Champaign in Illinois hasn't developed a higher-profile tech center. UIUC has a top-notch CS program and produces a lot of Ph.D., M.S., and B.S. graduates every year. Eric Sink has blogged for a few years about the joys of starting an independent software company amid the farmland of eastern Illinois. But then there is that solid, traditional-values, boring reputation to overcome. Chicago is only a few hours' drive away, but Chicago just isn't a place nerds want to be near.
So Iowa is fighting an uphill battle, at least by most people's reckoning. I think that's okay, because I think the battle is still winnable -- perhaps not on the level of the original Silicon Valley but at least on the scale needed to invigorate Iowa's economy. And while reputation can be an obstacle, it also means that competitors may not be paying enough attention. The first step is to produce more tech-savvy graduates, especially ones with an entrepreneurial bent, and then convince them to stay home. Those are steps we can take.
One thing that has surprised me about my work with the IT Council is that Iowa is much better off on another of Graham's measures than I ever realized, or than most people in this state know. We have a fair amount of venture capital and angel funding waiting for the right projects to fund. This is a mixture of old money derived from stodgy old companies like Deere and new money from the 1990s. We need to find a way to connect this money to entrepreneurs who are ready to launch start-ups, and to educate folks with commercializable ideas on how to make their ideas attractive to the folks with the money.
Here at UNI, we are blessed to have an existence proof that it is possible to grow a tech start-up right here in my own backyard: TEAM Technologies, which among its many endeavors operates the premier data center in the middle part of the USA. A boring, solid location with few people, little crime, and no coastal weather turns out to be a good thing when you want to store and serve data safely! TEAM is headed up by a UNI alumnus -- another great strength for our department as we look for ways to expand our role in the economic development of the state.
This morning I was thinking about a game show from my youth, To Tell the Truth (not the 1953 version!). In this game, "a person of some notoriety and two impostors try to match wits with a panel of four celebrities". The goal of the celebrity panel was to identify the 'real person' and ferret out the impostors.
Why would a thirty-year-old game show come to mind on a hazy July morning? The track was doing its best to determine if I was an impostor or the real McCoy. I think I passed my first test of the training season.
The Twin Cities Marathon takes place on October 1, so I have just under 12 weeks to prepare. Last week I bumped my weekly mileage up into the low- to mid-40s, and this week I began a 12-week "can't fail" training plan by running coach Bob Williams that I found in an old Runner's World magazine.
Now generally, I'm no better at sticking to someone else's training plan than I am at sticking to someone else's textbook; I like to tinker with the plan, to make it conform to my schedule and a bit to my expectations. This year isn't much different, with one exception: I intend to follow Williams's speed workouts as closely as I can. His plan calls for one workout of short, fast repeats each week or so, and one workout of longer repeats every other week or so, with tempo runs in the alternate weeks. Today, I ran speed workout #1, 5x800m repeats at roughly 5K pace. I managed 3:13-3:15 per half-mile on this fine day, but only after I spent my warm-up miles wondering if I had what it took to run repeats today.
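Out of curiosity, those splits are easy to translate into an equivalent 5K time by straight linear scaling. A quick back-of-the-envelope sketch (the helper functions are my own, and the scaling ignores the fatigue that creeps in over a full 5K):

```python
def pace(seconds_per_rep, rep_meters, target_meters):
    """Scale a repeat time linearly to another distance (ignores fatigue)."""
    return seconds_per_rep * target_meters / rep_meters

def fmt(seconds):
    """Format a time in seconds as m:ss, truncating fractional seconds."""
    minutes, secs = divmod(int(seconds), 60)
    return f"{minutes}:{secs:02d}"

# Today's repeats: 800m in 3:13 to 3:15 (193 to 195 seconds)
print(fmt(pace(193, 800, 5000)), "to", fmt(pace(195, 800, 5000)))
# -> 20:06 to 20:18
```

So "roughly 5K pace" here means a hypothetical 5K in a bit over twenty minutes, which is the point of the workout: short enough reps that the pace is sustainable five times over.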
The plan I selected for this season is unusual in another way, one that doesn't demonstrate the sort of humility that has cropped up in my recent writing: I opted for the "advanced" program. Many training plans for 5Ks, halfs, and marathons come in three flavors: beginner, intermediate, and advanced. I have always gone with the intermediate plan, but this year I was honest with myself; by the coaches' own criteria, I am ready for the advanced plan. For example, Williams's plan says "Typically this person has completed at least three marathons and has run consistently for three years or more." Check and check. Note that this precondition says nothing about the runner's speed or goal time; it depends only on the person's ability to work up to a sufficient mileage and handle speed workouts at a runner-specific pace. I'm ready for that.
One thing I like about running, which has popped up from time to time in this blog, is the accountability it exacts from us. It's impossible to be an impostor in this game. Eventually, the road or track finds you out. It lets your body know that you've been found out, and your body tells your mind. The feedback comes immediately, in the form of aches and pains and fatigue, and over the long haul, in the form of persistent fatigue and injury. So we do end up facing the need for humility after all, but also a challenge to stretch and grow.
For one day at least I am the real guy. Let's see if I am found to be an impostor by 11:30 AM or so on October 1.
That's one of the reasons for my "I don't know" policy -- an answer of "I don't know" on any homework or exam question is worth 25% partial credit. A blank response doesn't count; to get the partial credit, you must explicitly acknowledge your ignorance. (The other reason, of course, is that it cuts way back on random nonsense maybe-I'll-get-pity-credit-for-stumbling-on-the-right-keywords answers, which makes grading much easier.)
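The arithmetic of the policy is trivial, but it can be stated as a rule. A toy sketch, purely illustrative -- the function and its shape are mine, and a real gradebook is messier:

```python
def score(answer: str, correct: bool, points: float) -> float:
    """Score one question under the 'I don't know' policy:
    full credit for a correct answer, 25% partial credit for an
    explicit 'I don't know', nothing for a blank or a wrong bluff."""
    text = answer.strip().lower()
    if not text:                      # a blank response doesn't count
        return 0.0
    if text == "i don't know":        # explicit admission of ignorance
        return 0.25 * points
    return points if correct else 0.0
```

The interesting case is the last line: a bluff that happens to be wrong earns exactly what a blank does, which is what removes the incentive to dump keywords.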
Excellent idea... Anyone who has ever graded exams with open-ended questions knows just what Ernie is talking about. Some students will write anything down to bluff their way to a few points, and the time the grader spends seeing through the smoke (or not!) is much better spent on almost anything else. The more open-ended the problem or question, the more likely that the student's tangential dump will look just enough like a real answer that it requires extra attention.
Before I would use this strategy, I think I would add a phrase to the required answer: "... but I will by Friday." This transforms the answer from a mere admission into a promise to learn, turning what can be a dispiriting experience -- a complete blank on an exam question -- into a chance to get better.
Of course, with that phrase, I would reserve the right to ask the student for the answer later, by e-mail. This adds an element of accountability to the equation and might encourage students to take their admission more seriously. (With the right set of students, especially in a junior/senior course, I might want the right to ask for the answer in class!)
I'm guessing that the students in my fall course would probably prefer that I not browse my stuff folders, if this is what happens when I do.
I haven't written an entry about my blog in a while, but I hope you'll indulge me today. This is the 377th entry in my blog. I posted the first on July 9 two years ago.
This second year has seen considerably less activity than the first, in which I posted 225 entries. But I've still managed a dozen or so entries per month in 2005-2006 -- though the content of my postings looks a bit different in the second year as well. In Year 1, my writing seemed driven by thoughts of agile software development and especially connections I was making between agile ideas and my training for the Des Moines Marathon. My initial readership came largely from folks interested in agile development, and sometimes those interested in how I was trying to teach those ideas in a senior-level seminar.
Near the end of my first year I took on a three-year assignment as head of my department. I had thought that this would result in frequent writing about management and leadership, as I tried to figure out how to do them. But most of my entries have been about topics at the intersection of my headship and my teaching: the future of computer science at the university and the teaching of introductory CS courses. Why I have not written more frequently about the administrative and management sides of my job is worthy of its own entry in the future, but the short answer is that I have not yet managed to consolidate my learning in this area.
I think the entry from this year that elicited the most response was my report on a lecture by Thomas Friedman. In retrospect, I'm not sure if I have a favorite post from the year, though I recall articles on negative splits in learning, Robert Hass's OOPSLA keynote on creativity, and talks by Marcia Bjornerud on popular science writing and my friend Roy Behrens on teaching as "subversive inactivity" with fondness. (Does a particular article or theme stand out in your mind?)
After all this time, I still haven't followed through with allowing comments or posting links to some of my favorite blogs. The comments are problematic, given the low-tech blogging tool I use, NanoBlogger. With some time and patience, they are doable, but the opportunity cost of that time seems inordinately high. But I may move from NanoBlogger soon, for a variety of technical reasons (long, meaningless entry names and slowness in processing a blog with 300+ entries among them), so who knows. I can add a blogroll of sorts with minimal effort, and I can only plead inordinate laziness as my excuse. Soon.
On my one-year anniversary, I wrote a brief reflection and wondered what a second year of Knowing and Doing would bring. I love the quotes I used there, from Annie Dillard's "The Writing Life" and Glenway Wescott's "The Pilgrim Hawk". They remain true for me today and express something of why I will continue to write here.
Thanks to all you who read my ramblings. Thanks, too, to all of you who send me short notes when a post strikes a chord with you, or when you have something to share. You've taught me much. I hope to make the time you spend reading in the coming year worth your while.
Back in January, I wrote about making things worse in the introductory course, which had been triggered by an article at Uncertain Principles. The world is circling back on itself, because I find myself eager to write about how we drive students away from our intro courses, again triggered by another article at Uncertain Principles. This time, Chad's article was itself triggered by another physicist's article on driving students away. Perhaps the length of the chain grows by one each time it comes around...
According to these physicists, one of the problems with the intro course in physics is that it is too much like high school physics, which bores the better students to the point that they lose interest. We don't face that issue in CS much these days, at least in my neck of the woods, because so few of our freshmen enter the program with any formal CS or programming education. I'm not a fan of the approaches suggested to keep the attention of these well-prepared students (a byzantine homework policy, lots of quizzes) because I think that repeating material isn't the real problem. And these approaches make the real problem worse.
The real problem they describe is one with which we are familiar: students "lose sight of the fun and sense of wonder that are at the heart of the most successful scientific careers". The intro physics course...
... covers 100+ years of physics in one year. We rarely spend more than a lecture on a single topic; there is little time for fun. And if we want to make room for something like that we usually have to squeeze out some other topic. Whoosh!
Chad says that this problem also affects better students disproportionately, because they "have the preparation to be able to handle something more interesting, if we could hold their attention".
I think most students can handle something more interesting. They all deserve something more interesting than we usually give them, too.
And I don't think that the answer involves "more content". Whenever I talk to scientists about the challenges of teaching, the conversation always seems to turn to how much content we have to deliver. This attitude seems wrongheaded to me when taken very far. It's especially dangerous in an introductory course, where novices can easily drown in syntax and detail -- and lose sight of what it is like to be a scientist, or an engineer. Pouring on more content, even when the audience is honors students, almost always results in suboptimal learning, because the course tends to become focused on data rather than ideas.
In closing, I did enjoy seeing that academic physicists are now experimenting with courses about something more than the formulas of physics. One of the commenters on the Uncertain Principles article notes that he is tweaking a new course design around the question, "How old is the universe?" He also mentions one of the obstacles to making this kind of change: students actually expect a memorization-driven course, because that's what they've learned from their past experiences. This is a problem that really does affect better students differently, because they have mastered the old way of doing things! As a result, some of them will resent a new kind of course. My experience, though, is that you just have to stick to your approach through some rough patches early; nearly all of these students will eventually come around and appreciate the idea- and practice-driven approach even more once they adapt to the change. Remember, adaptation to change takes time, even for those eager to change...
Three quotes for what amounts in most American workplaces to the middle of a long holiday weekend. The first two remind me to approach my teaching and administrative duties with humility. The third reminds me of why we in America celebrate this holiday weekend at all.
... from Gerald Weinberg:
When I write a book or essay, or teach a course, I have one fundamental measure of failure, which I call Weinberg's Target:
After exposure to my work, does the audience care less about the subject than they did before?
If the answer is Yes, I've failed. If the answer is No, I've succeeded, and I'm happy for it. Perhaps you consider my goal too modest. Perhaps you aspire to something greater, like making the student learn something, or even love the subject. Oh, I'm not dismayed by such fine outcomes, but I don't think it's a reasonable goal to expect them.
We can do much worse than communicate some information without dampening our audience's natural enthusiasm.
... from Steve Yegge:
If you don't know whether you're a bad manager, then you're a bad manager. It's the default state, the start-state, for managers everywhere. So just assume you're bad, and start working to get better at it. ... Look for things you're doing wrong. Look for ways to improve. If you're not looking, you're probably not going to find them.
Steve's essay doesn't have much in the way of concrete suggestions for how to be a good manager, but this advice is enough to keep most of us busy for a while.
I have sworn upon the altar of God eternal hostility against every form of tyranny over the mind of man.
A mixture of humility and boldness befitting revolution, of thought and government.