Today is my birthday. I won't tell you how old I am, and I promise not to pull a Gwyneth Paltrow. Trust me; no one wants that.
I do not have a birthday essay planned, or any deep reflections on the passing of time. Birthdays have never been times of deep reflection for me. As I grow older, they have not yet started to mean anything different to me, even as I realize that fewer and fewer of them remain. I mostly think of them as a time to relax, think on the good things in my life, and get on with living it.
It's just as well that I have no birthday essay planned. My friend Daniel Steinberg wrote about his birthday a few weeks ago, and I cannot improve on what he said. Daniel turned 65 this year, a milestone in a culture that emphasizes youth. Daniel, though, is still learning and teaching us what he learns with his books and presentations. He notices now that the other attendees at the conferences where he speaks are much younger than he is:
I'm the same age or older than their parents.
I'm sure they see me as old.
For the most part, it is much worse in my head than in theirs.
I know this feeling. I work with college students every day. I'm older than all their parents now, and probably not that much younger than a few grandparents. I'm sure they see me as old, but how much of that is in my own mind?
I have a few years before I reach sixty-five, but it's close enough that I've been told I should start thinking about retirement and investment accounts and a different kind of life. But I'm of the same mind as Daniel: "Retire to do what? Travel? Do things I'm interested in? I do all that now." The life of an academic, at least one fortunate enough to have found a steady, secure position, is good: we read, write, and teach the thing we love. As long as I can do these things well enough and enjoy them, I will.
2024 has been, in some ways, an eventful year. My dad died unexpectedly this summer. A couple of months later I learned that one of my college mentors died around the same time. Those were times of reflection for me. On a more uplifting note, my wife and I got to visit our daughters in Boston this spring and spend time with our favorite people in the world. Later we took a 48-hour break mid-summer to visit Minnesota and ride the full length of the Sakatah Singing Hills State Trail and back in one day. Riding 85 miles gives you plenty of time to enjoy the beauty of the world. (That's a blog post I've been meaning to write for three months...)
This is a milestone birthday for me, too, as counted by the rest of the world. However, a few years ago, I started enumerating my birthdays in hex (base 16), a shrewd move for a computer scientist. That means I am still comfortably in my 0x30s, young enough to keep doing the things I love to do.
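For readers who want to check the arithmetic (without my giving away the actual number), here is a quick sketch in Python; the loop simply confirms how forgiving hex counting is:

    # Every age from 48 through 63 still reads as "0x3-something".
    for age in range(48, 64):
        assert hex(age).startswith("0x3")
    print(hex(48), hex(63))   # 0x30 0x3f -- the whole span of the 0x30s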
Andrej Karpathy loves his calculator, and I like his post about how he loves his calculator. I was planning to quote the beginning of one paragraph as a teaser, but I could not decide where to clip the passage. Each sentence is worth seeing. I hope he does not mind that I show you the entire paragraph with encouragement to read the entire post:
Let's put this in perspective to the technology we increasingly accept as normal. The calculator requires no internet connection to set up. It won't ask for bluetooth permissions. It doesn't want to know your precise location. You won't be prompted to create an account and you don't need to log in. It does not download updates every other week. You're not going to be asked over and over to create and upgrade your subscription to the Calculator+ version that also calculates sine and cosine. It won't try to awkwardly become a platform. It doesn't need your credit card on file. It doesn't ask to track your usage to improve the product. It doesn't interrupt you randomly asking you to review it or send feedback. It does not harvest your information, for it to be sold later on sketchy data markets, or for it to be leaked on the dark web on the next data breach. It does not automatically subscribe you to the monthly newsletter. It does not notify you every time the Terms of Service change. It won't break when the servers go down. The computation you perform on this device is perfectly private, secure, constrained fully to the device, and no running record of it is maintained or logged anywhere. The calculator is a fully self-contained arithmetic plugin for your brain. It works today and it would work a thousand years ago. You paid for it and now it is yours. It has no other master. It just does the thing. It is perfect.
You paid for it, and now it's yours.
My favorite pieces of software and favorite creators of software embody this ideal, at least as much of it as they can given the constraints of the modern tech world. Audio Hijack from Rogue Amoeba and Acorn from Flying Meat come to mind.
That said, I loved calculators long before the web existed at all. Young Eugene had many hours of pleasure noodling on a calculator, playing with numbers. I first learned about rounding errors and limits by typing in a number and repeatedly hitting the square root key until the display showed a 1. I computed batting averages and winning percentages for my favorite teams and players. Eventually I was computing chess ratings by hand, until I sensed what a computer program could do for me. That would become the first program I ever wrote out of passion.
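For anyone who wants to relive that square-root experiment without digging out a calculator, here is a small Python sketch; the starting value and the eight-digit "display" are arbitrary stand-ins:

    import math

    # Start with any positive number and keep pressing the square root key.
    # On a display with limited precision, the value eventually rounds to 1.
    x = 75.0
    presses = 0
    while round(x, 8) != 1.0:      # eight digits, standing in for a calculator display
        x = math.sqrt(x)
        presses += 1
    print(presses, x)              # roughly thirty presses before the display reads 1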
Even after learning to program, I never really lost the joy of tinkering with an old handheld calculator. It just does its thing. It is perfect.
First from French writer André Gide, in Le Traité du Narcisse (1892):
Everything has been said before. But since nobody listens we have to keep going back and beginning all over again.
Every teacher knows how Gide feels.
Then from Bruce Springsteen, in a New Yorker interview I've lost track of:
This music has not been heard at this moment, in this place, by these faces. That's why we go out there.
Springsteen said that when asked how he could still sing "Born to Run" with so much energy after decades of performing. I think something similar to myself on many teaching days. I love the thing I am teaching and, even though it's old hat to me, it's new to my students, and I want them to love it, too.
I've always been a sucker for quotes that aren't about teaching, or programming, but could be. Making that sort of connection can motivate me on those days when I need a little pick-up.
But I also realize that making connections to other disciplines is a big part of how we teach or write programs at all. Metaphors are everywhere in both. Consider this line from Zach Tellman in a recent issue of the newsletter about his book in progress, "Explaining Software Design":
The queue is a useful metaphor because it makes us ask useful questions.
We create analogies and metaphors because they help us ask useful questions. They help us see our own objects of study, or our own activities, in a new way.
I guess this is my way of saying that I'm okay with my irrational fondness for applying quotes out of context to my own world. Doing so occasionally helps me ask better questions. Even when it doesn't, I get to smile.
Really. Lea Verou mentioned in this blog post that she hopes to write another blog post soon explaining...
How I used web technologies instead of LaTeX to write my PhD thesis (and print it to PDF for submission), with 11ty plus several open source plugins, many of which I wrote, an ecosystem I hope to one day free more people from the tyranny of LaTeX (which was amazing in the 70s, but its ergonomics are now showing their age).
I wrote my Ph.D. dissertation in WordPerfect (yes, I am old). The idea of writing a document as large and complex as a Ph.D. dissertation in HTML and CSS is both intimidating and intriguing, a challenge worth tackling — if only I hadn't already been facing the challenge of completing my research and, you know, writing a dissertation. For me, trying to write a simple midterm or final exam for one of my courses using only HTML and CSS usually results in me banging my head against a surprising number of walls followed by an hour or so of making tedious small changes to the stylesheet to get everything Just Right.
Verou had the advantage of deep expertise in web technologies and having done her research on the same. Even so, I am impressed and may take a peek under the hood of her website to see some of her best tricks.
This reflection is a convenient way for me to preface the fact that I am teaching web development again this fall. As I wrote in an earnest appeal for assistance last spring, this is an introductory course for non-majors, covering HTML, CSS, and a little interactivity using JavaScript. The course went reasonably well the first time around, though my coverage of JavaScript fell flat with most everyone in the class, especially the non-majors. I'm still working on ways to do better when we reach JS in a month or so.
After only a few weeks, though, I have been impressed with my students' eagerness to put their new web knowledge to use. On the first real homework assignment, I asked students to create a web page using a bunch of the HTML elements we had learned. I suggested possible pages they might create but left it open for them to create any page they wanted. As you might imagine, most students like this kind of freedom.
A couple of students wrote fan pages for their favorite musical artists (*). For me, that was shades of 1995. Another student used his assignment to create an instructional guide for the dog walker he and his significant other were hiring. He apologized for creating a page he didn't think I'd care to read, but I was very happy, and told him so.
I remember how much fun I had learning to write HTML back in the early days of the web and then writing all sorts of pages about things that I cared about. That's what knowing how to write can do for you, and knowing how to write for the web means being able to share what you write with anyone who cares to read. My students are using what they're learning to write about things they care about.
That makes me happy. And I think that Verou would be proud, too.
(*) Taylor Swift really is popular... A couple of my students were surprised and a little impressed that I like Swift, too. My daughters and I have bonded over music since they were toddlers, and they love to share their musical interests with Dad. Swift is one of many artists they've turned me on to over the years.
One of the downsides of being department head is that much of the daily work I have to do is less interesting to me than studying computer science or writing programs. It's important work, sure, at least most of it, but it's not why I became a computer scientist.
Today, I ended up not doing most of the administrative work I had planned for the day. On the exercise bike this morning, I read the recent Quanta article about a new way to estimate the number of distinct items in a list. My mind craved computing, so I spent some time implementing the algorithm. I still have a bug somewhere, but I'm close.
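For the curious, here is a rough sketch in Python of the distinct-elements estimator as I understand it from the Quanta article (the CVM algorithm). This is an illustrative version, not the code I wrote that day, and the buffer threshold is arbitrary:

    import random

    def estimate_distinct(stream, threshold=100):
        """Estimate the number of distinct items in a stream using O(threshold) memory."""
        p = 1.0          # current sampling probability
        buf = set()      # sampled items, each currently kept with probability p
        for x in stream:
            buf.discard(x)                     # forget any earlier decision about x
            if random.random() < p:
                buf.add(x)
            if len(buf) == threshold:          # buffer full: thin it out
                buf = {y for y in buf if random.random() < 0.5}
                p /= 2
        return len(buf) / p

    data = [random.randrange(500) for _ in range(10_000)]   # about 500 distinct values
    print(estimate_distinct(data), len(set(data)))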
Any day I get to write code for fun I call a good day. This is important work, too, because it keeps me fresh. I can justify the time practically, because the exercise also gives me raw material for my courses and for research with my undergrads. But the real value is in keeping me alive as a computer scientist.
The good feeling from writing code today is heightened knowing that tomorrow's planned administrative work can't be postponed and is not fun, for tomorrow I must finish up writing annual salary letters. I am fortunate that writing these letters is about as stress-free as the task can be, because the faculty in my department are all very good and are doing good work. Even so... Salary letters? Ugh.
Thus I take extra pleasure in today's absorbing programming session.
I realized something about myself as a programmer while on a bike ride with my wife this afternoon.
Even though I prefer to write code in small steps and to write tests before (or in parallel with) the code, I have a weakness: If you give me the algorithm for a task in full upfront, and I grok it, I will occasionally implement the entire algorithm upfront, too. Then I end up debugging the program one error at a time, like a caveman.
Today, I succumbed to this tendency without even thinking about it. I am human.
Sometimes, speculative fiction seems eerily on the mark:
Montag turned and looked at his wife, who sat in the middle of the parlor talking to an announcer, who in turn was talking to her. "Mrs. Montag," he was saying. This, that, and the other. "Mrs. Montag--" Something else and still another. The converter attachment, which had cost them one hundred dollars, automatically supplied her name whenever the announcer addressed his anonymous audience, leaving a blank where the proper syllables could be filled in. A special spot-wavex-scrambler also caused his televised image, in the area immediately about his lips, to mouth the vowels and consonants beautifully. He was a friend, no doubt of it, a good friend. "Mrs. Montag--now look right here."
"Spot-wavex-scrambler" is a great phrase. Someone should make it a product name.
That is a paragraph from Ray Bradbury's Fahrenheit 451. I was not far into the book before its description of technology used to entertain — distract, occupy, sedate — the population began to seem eerily familiar. It's not what we have now, and there hasn't been anything especially AI-like in the story yet, except perhaps the sinister robot dog at the fire station. But the entertainment tech hits close to the mark. Mildred wears earbuds all the time, listening to her shows or just to white noise.
The timeline isn't perfect, either ("We've started and won two atomic wars since 2022!"), but the timing isn't all that far off. Almost everyone these days is living with a sense of disruption from the events of the last decade or so, including wars, which is in rhythm with the story. The fictional government, I presume, makes people happy by surrounding them, literally, with video and audio entertainment 24/7 — all the better not to think about what's really happening out in the world.
Reading this is eerie for me in another way. I read a lot of Ray Bradbury when I was growing up, and for a long time I thought I had read Fahrenheit 451. But then I wasn't so sure, because I couldn't bring to mind any memory around reading it, let alone any memory of the content. (The latter is common for many books I read in high school.) On my last trip to the library, I checked out a copy in order to fill either the gap in my memory or the gap in my reading.
It's a prescient book. I see why it remains a common text in high school and college lit courses. I look forward to the rest of the story.
This morning, I finished reading How a Script Doctor Found His Own Voice, about screenwriter Scott Frank. Late in the piece, there's a bit on "how difficult it can be to remain relevant as a screenwriter as you age". Frank took caution from the experience of one of his mentors, director Sydney Pollack:
After decades of success making such movies as "Three Days of the Condor" and "Out of Africa", Pollack had "a way of working," Frank said. "And it stopped working." Suddenly, Pollack was out of step. Frank urged him to do "something different, something small, something that's not a love story where they end up together." He even tried to get Pollack to direct his thriller "The Lookout". But Pollack couldn't change. To Frank, the lesson was clear: you can't "just double down on what you used to do." The only way to remain vital is to take chances.
That called to mind something I read earlier in the week, a short blog post by Jessamyn West, on how the intersection of an LLM chatbot tool and a newsletter called "The Soul of a New Machine" made her laugh. She closes with a paragraph that felt familiar:
I'm now what folks might consider later-career. I'm faffing about with this newfangled technological stuff knowing both that it's a big deal and also that I only sort of care about it (at my peril? perhaps.) ....
I, too, am late in my career. As an academic computer scientist, "newfangled technological stuff" is my line of work, but... I can't think of many things less interesting for me to do than figuring out how to prompt an LLM to write code or text for me. My lack of enthusiasm may portend the sort of irrelevance that befell Pollack, but I hope not. Unlike Pollack, I feel no need to double down on what I've always done, and indeed am open to something new. So I'll keep poking around, enjoying what I enjoy, and hope to find a path more like the one Frank followed: taking a different kind of chance.
~~~~~
Postscript: If you think this post seems like an aftershock to a post from the turn of the year, you are not alone. Still searching.
Whatever you think of this post, though, I heartily recommend the New Yorker article on Scott Frank, which was engaging throughout and full of interesting bits on writing, filmmaking, and careers.
In her Conversation with Tyler, scholar Katherine Rundell said something important about the books we give our children:
Children's novels tend to teach the large, uncompromising truths that we hope exist. Things like love will matter, kindness will matter, equality is possible. I think that we express them as truths to children when what they really are are hopes.
This passage immediately brought to mind Marick's Law: In software, anything of the form "X's Law" is better understood by replacing the word "Law" with "Fervent Desire". (More on this law below.)
Though they comment on different worlds, these two ideas are very much in sync. In software and so many other domains, we coin laws that are really expressions of our aspirations. This is no less true in how we interact with young people.
We usually think that our job is to teach children the universal truths we have discovered about the world, but what we really teach them is our view of how the world can or should be. We can do that by our example. We can also do that with good books.
But aren't the universal truths in our children's literature true? Sometimes, perhaps, but not all of them are true all of the time, or for all people. When we tell stories, we are describing the world we want for our children, and giving them the hope, and perhaps the gumption, to make our truths truer than we ourselves have been able to.
I found myself reading lots of children's books and YA fiction when my daughters were young: to them, and with them, and on their recommendation. Some of them affected me enough that I quoted them in blog posts. There are so many good books for our youth in the library: honest, relevant to their experiences, aspirational, exemplary. I concur with Rundell's suggestion that adults should read children's fiction occasionally, both for pleasure and "for the unabashed politics of idealism that they have".
More on Marick's Law and Me
I remember posting Marick's Law on this blog in October 2015, when I wanted to share a link to it with Mike Feathers. Brian Marick had tweeted the law in 2009, but a link to a tweet didn't feel right, not at a time when the idealism of the open web was still alive. In my post, I said "This law is too important to be left vulnerable to the vagaries of an internet service, so let's give it a permanent home".
In 2015, the idea that Twitter would take a weird turn, change its name to X, and become a place many of my colleagues don't want to visit anymore seemed far-fetched. Fortunately, Brian's tweet is still there and, at least for now, publicly viewable via redirect. Even so, given the events of the last couple of years, I'm glad I trusted my instincts and gave the law a more permanent home on Knowing and Doing. (Will this blog outlive Twitter?)
The funny thing, though, is that that wasn't its first appearance here. I found the 2015 URL for use in this post by searching for the word "fervent" in my Software category. That search also brought up a Posts of the Day post from April 2009 — the day after Brian tweeted the law. I don't remember that post now, and I guess I didn't remember it in 2015 either.
Sometimes, "Great minds think alike" doesn't require two different people. With a little forgetfulness, they can be Past Me and Current Me.
I was listening to some music from the 1970s yesterday morning while doing some academic bookkeeping. As happens occasionally, the lyrics of one of the songs jerked me out of my bureaucratic trance by echoing my subconscious:
I need to be three men in one
To get my job done
I need a thirty hour day
Two jobs with double pay
I need a man to go to work
A man to stay at home
That's William Bell in his 1977 R&B crossover hit "Tryin' to Love Two" [ YouTube ].
I love only one, truly, but... University work has been unusually busy the last couple of weeks, and now we enter April, which is always a hyperactive month on campus. Add to that regular life — tax season and plans for May travel and wanting to spend time with the one I love — and I empathize with Bell wanting to be two or three people all at once. A doppelgänger to attend all my extra meetings would certainly be welcome some days!
At times like this, though, it's good to remember how lucky I am that this is the biggest predicament I face. So: hello, April.
At some point last week, I found myself pointed to this short YouTube video of Jerry Seinfeld talking with Howard Stern about work habits. Seinfeld told Stern that he was essentially always thinking about making comedy. Whatever situation he found himself in, even with family and friends, he was thinking about how he could mine it for new material. Stern told him that sounded like torture. Jerry said, yes, it was, but...
Your blessing in life is when you find the torture you're comfortable with.
This is something I talk about with students a lot.
Sometimes it's a current student who is worried that CS isn't for them because too often the work seems hard, or boring. Shouldn't it be easy, or at least fun?
Sometimes it's a prospective student, maybe a high school student on a university visit or a college student thinking about changing their major. They worry that they haven't found an area of study that makes them happy all the time. Other people tell them, "If you love what you do, you'll never work a day in your life." Why can't I find that?
I tell them all that I love what I do -- studying, teaching, and writing about computer science -- and even so, some days feel like work.
I don't use torture as an analogy the way Seinfeld does, but I certainly know what he means. Instead, I usually think of this phenomenon in terms of drudgery: all the grunt work that comes with setting up tools, and fiddling with test cases, and formatting documentation, and ... the list goes on. Sometimes we can automate one bit of drudgery, but around the corner awaits another.
And yet we persist. We have found the drudgery we are comfortable with, the grunt work we are willing to do so that we can be part of the thing it serves: creating something new, or understanding one little corner of the world better.
I experienced the disconnect between the torture I was comfortable with and the torture that drove me away during my first year in college. As I've mentioned here a few times, most recently in my post on Niklaus Wirth, from an early age I had wanted to become an architect (the kind who design houses and other buildings, not software). I spent years reading about architecture and learning about the profession. I even took two drafting courses in high school, including one in which we designed a house and did a full set of plans, with cross-sections of walls and eaves.
Then I got to college and found two things. One, I still liked architecture in the same way as I always had. Two, I most assuredly did not enjoy the kind of grunt work that architecture students had to do, nor did I relish the torture that came with not seeing a path to a solution for a thorny design problem.
That was so different from the feeling I had writing BASIC programs. I would gladly bang my head on the wall for hours to get the tiniest detail just the way I wanted it, either in the code or in the output. When the torture ended, the resulting program made all the pain worth it. Then I'd tackle a new problem, and it started again.
Many of the students I talk with don't yet know this feeling. Even so, it comforts some of them to know that they don't have to find The One Perfect Major that makes all their boredom go away.
However, a few others understand immediately. They are often the ones who learned to play a musical instrument or who ran cross country. The pianists remember all the boring finger exercises they had to do; the runners remember all the wind sprints and all the long, boring miles they ran to build their base. These students stuck with the boredom and worked through the pain because they wanted to get to the other side, where satisfaction and joy are.
Like Seinfeld, I am lucky that I found the torture I am comfortable with. It has made this life a good one. I hope everyone finds theirs.
In a recent post on Computational Complexity, Bill Gasarch wrote up the solution to a fun little dice problem he had posed previously. Check it out. After showing the solution, he answered some meta-questions. I liked this one:
How did I find this question, and its answer, at random? I intentionally went to the math library, turned my cell phone off, and browsed some back issues of the journal Discrete Mathematics. I would read the table of contents and decide what article sounded interesting, read enough to see if I really wanted to read that article. I then SAT DOWN AND READ THE ARTICLES, taking some notes on them.
He points out that turning off his cell phone isn't the secret to his method.
It's allowing yourself the freedom to NOT work on a paper for the next ... conference and just read math for FUN without thinking in terms of writing a paper.
Slack of this sort used to be one of the great attractions of the academic life. I'm not sure it is as much a part of the deal as it once was. The pace of the university seems faster these days. Many of the younger faculty I follow out in the world seem always to be hustling for the next conference acceptance or grant proposal. They seem truly joyous when an afternoon turns into a serendipitous session of debugging or reading.
Gasarch's advice is wise, if you can follow it: Set aside time to explore, and then do it.
It's not always easy fun; reading some articles is work. But that's the kind of fun many of us signed up for when we went into academia.
~~~~~
I haven't made enough time to explore recently, but I did get to re-read an old paper unexpectedly. A student came to me to discuss possible undergrad research projects. He had recently been noodling around, implementing his own neural network simulator. I've never been much of a neural net person, but that reminded me of this paper on PushForth, a concatenative language in the spirit of Forth and Joy designed as part of an evolutionary programming project. Genetic programming has always interested me, and concatenative languages seem like a perfect fit...
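For readers who have not met a concatenative language, here is a toy stack-based evaluator in Python that shows the flavor of Forth and Joy. It illustrates only the general idea, not PushForth itself:

    def run(program, stack=None):
        """Evaluate a program: a sequence of literals and words applied to a stack."""
        stack = [] if stack is None else stack
        ops = {
            "+":   lambda s: s.append(s.pop() + s.pop()),
            "*":   lambda s: s.append(s.pop() * s.pop()),
            "dup": lambda s: s.append(s[-1]),
        }
        for word in program:
            if word in ops:
                ops[word](stack)
            else:
                stack.append(word)   # anything else is a literal, pushed as-is
        return stack

    print(run([3, "dup", "*", 4, "+"]))   # [13]: square 3, then add 4

Programs in this style are flat sequences of words, which is part of why stack languages appeal to evolutionary programming folks: crossover and mutation can splice them almost anywhere.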
I found the paper in a research folder and made time to re-read it for fun. This is not the kind of fun Gasarch is talking about, as it had potential use for a project, but I enjoyed digging into the topic again nonetheless.
The student looked at the paper and liked the idea, too, so we embarked on a little project -- not quite serendipity, but a project I hadn't planned to work on at the turn of the new year. I'll take it!
My social media feed this week has included many notes and tributes on the passing of Niklaus Wirth, including his obituary from ETH Zurich, where he was a professor. Wirth was, of course, a Turing Award winner for his foundational work designing a sequence of programming languages.
Wirth's death reminded me of END DO, my post on the passing of John Backus, and before that a post on the passing of Kenneth Iverson. I have many fond memories related to Wirth as well.
Pascal
Pascal was, I think, the fifth programming language I learned. After that, my language-learning history starts to speed up and blur. (I do think APL and Lisp came soon after.)
I learned BASIC first, as a junior in high school. This ultimately changed the trajectory of my life, as it planted the seeds for me to abandon a lifelong dream to be an architect.
Then at university, I learned Fortran in CS 1, PL/I in Data Structures (you want pointers!), and IBM 360/370 assembly language in a two-quarter sequence that also included JCL. Each of these languages expanded my mind a little.
Pascal was the first language I learned "on my own". The fall of my junior year, I took my first course in algorithms. On Day 1, the professor announced that the department had decided to switch to Pascal in the intro course, so that's what we would use in this course.
"Um, prof, that's what the new CS majors are learning. We know Fortran and PL/I." He smiled, shrugged, and turned to the chalkboard. Class began.
After class, several of us headed immediately to the university library, checked out one Pascal book each, and headed back to the dorms to read. Later that week, we were all using Pascal to implement whatever classical algorithm we learned first in that course. Everything was fine.
I've always treasured that experience, even if it was a little scary for a week or so. And don't worry: That professor turned out to be a good guy with whom I took several courses. He was a fellow chess player and ended up being the advisor on my senior project: a program to run the Swiss-system pairings commonly used in chess tournaments. I wrote that program in... Pascal. Up to that point, it was the largest and most complex program I had ever written solo. I still have the code.
The first course I taught as a tenure-track prof was my university's version of CS 1 -- using Pascal.
Fond memories all. I miss the language.
Wirth sightings in this blog
I did a quick search and found that Wirth has made an occasional appearance in this blog over the years.
• January 2006: Just a Course in Compilers
This was written at the beginning of my second offering of our compiler course, which I have taught and written about many times since. I had considered using as our textbook Wirth's Compiler Construction, a thin volume that builds a compiler for a subset of Wirth's Oberon programming language over the course of sixteen short chapters. It's a "just the facts and code" approach that appeals to me most days.
I didn't adopt the book for several reasons, not least of which that at the time Amazon showed only four copies available, starting at $274.70 each. With two decades of experience teaching the course now, I don't think I could ever really use this book with my undergrads, but it was a fun exercise for me to work through. It helped me think about compilers and my course.
Note: A PDF of Compiler Construction has been posted on the web for many years, but every time I link to it, the link ultimately disappears. I decided to mirror the files locally, so that the link will last as long as this post lasts: [ Chapters 1-8 | Chapters 9-16 ]
• September 2007: Hype, or Disseminating Results?
... in which I quote Wirth's thoughts on why Pascal spread widely in the world but Modula and Oberon didn't. The passage comes from a short historical paper he wrote called "Pascal and its Successors". It's worth a read.
• April 2012: Intermediate Representations and Life Beyond the Compiler
This post mentions how Wirth's P-code IR ultimately lived on in the MIPS compiler suite, long after the compiler that first implemented it.
• July 2016: Oberon: GoogleMaps as Desktop UI
... which notes that the Oberon spec defines the language's desktop as "an infinitely large two-dimensional space on which windows ... can be arranged".
• November 2017: Thousand-Year Software
This is my last post mentioning Wirth before today's. It refers to the same 1999 SIGPLAN Notices article that tells the P-code story discussed in my April 2012 post.
I repeat myself. Some stories remain evergreen in my mind.
The Title of This Post
I titled my post on the passing of John Backus END DO in homage to his intimate connection to Fortran. I wanted to do something similar for Wirth.
Pascal has a distinctive sequence to end a program: "end.". It seems a fitting way to remember the life of the person who created the language and gave the world so many programming experiences.
In his year-end wrap-up, Greg Wilson writes:
I want to find something to learn that excites me. A new musical instrument is out because of my hand; I've thought about reviving my French, picking up some Spanish, diving into Postgres or machine learning (yeah, yeah, I know, don't hate me), but none of them are making my heart race.
What he said. I want to find something to learn that excites me.
I just spent six months immersed in learning more about HTML, CSS, and JavaScript so that I could work with novice web developers. Picking up that project was one part personal choice and one part professional necessity. It worked out well. I really enjoyed studying the web development world and learned some powerful new tools. I will continue to use them as time and energy permit.
But I can't say that I am excited enough by the topic to keep going in this area. Right now, I am still burned out from the semester on a learning treadmill. I have a followup post to my early reactions about the course's JavaScript unit in the hopper, waiting for a little desire to finish it.
What now? There are parallels between my state and Wilson's.
Unlike Wilson, I do not play a musical instrument. I did, however, learn a little basic piano twenty-five years ago when I was a Suzuki piano parent with my daughters. We still have our piano, and I harbor dreams of picking it back up and going farther some day. Right now doesn't seem to be that day.
I have several other possibilities on the back burner, particularly in the area of data analytics. I've been intrigued by the work on data-centric computing in education being done by Kathi Fisler and Shriram Krishnamurthi at Brown. I will also be reading a couple of their papers on program design and plan composition in the coming weeks as I prepare for my programming languages course this spring. Fisler and Krishnamurthi are coming at these topics from the side of CS education, but the topics are also related to my grad-school work in AI. Maybe these papers will ignite a spark.
Winter break is coming to an end soon. Like others, I'm thinking about 2024. Let's see what the coming weeks bring.
Any man can call time out, but no man
can say how long the time out will be.
-- Books of Bokonon
I realized early last week that it had been a while since I blogged. June was a morass of administrative work, mostly summer orientation. Over the month, I had made notes for several potential posts, on my web dev course, on the latest book I was reading, but never found -- made -- time to write a full post. I figured this would be a light month, only a couple of short posts, if only I could squeeze another one in by Friday.
Then I saw that the date of my most recent post was May 26, with the request for ideas about the web course coming a week before.
I no longer trust my sense of time.
This blog has certainly become much quieter over the years, due in part to the kind and amount of work I do and in part to choices I make outside of work. I may even have gone a month between posts a few fallow times in the past. But June 2023 became my first calendar month with zero posts.
It's somewhat surprising that a summer month would be the first to shut me out. Summer is a time of no classes to teach, fewer student and faculty issues to deal with, and fewer distinct job duties. This occurrence is a testament to how much orientation occupies many of my summer days, and how at other times I just want to be AFK.
A real post or two are on their way, I promise -- a promise to myself, as well as to any of you who are missing my posts in your newsreader. In the meantime...
On the web dev course: thanks to everyone who sent thoughts! There were a few unanimous, or near unanimous, suggestions, such as to have students use VS Code. I am now learning it myself, and getting used to an IDE that autocompletes pairs such as "". My main prep activity up to this point has been watching David Humphrey's videos for WEB 222. I have been learning a little HTML and JavaScript and a lot of CSS and how these tools work together on the modern web. I'm also learning how to teach these topics, while thinking about the differences between my student audience and David's.
On the latest book: I'm currently reading Shop Class as Soulcraft, by Matthew Crawford. It came out in 2010 and, though several people recommended it to me then, I had never gotten around to it. This book is prompting so many ideas and thoughts that I'm constantly jotting down notes and thinking about how these ideas might affect my teaching and my practice as a programmer. I have a few short posts in mind based on the book, if only I can commit time to flesh them out. Here are two passages, one short and one long, from my notes.
Fixing things may be a cure for narcissism.
Countless times since that day, a more experienced mechanic has pointed out to me something that was right in front of my face, but which I lacked the knowledge to see. It is an uncanny experience; the raw sensual data reaching my eye before and after are the same, but without the pertinent framework of meaning, the features in question are invisible. Once they have been pointed out, it seems impossible that I should not have seen them before.
Both strike a chord for me as I learn an area I know only the surface of. Learning changes us.
A short passage from Innocence, by Penelope Fitzgerald:
In 1927, when they moved me from Ustica to Milan, I was allowed to plant a few seeds of chicory, and when they came up I had to decide whether to follow Rousseau and leave them to grow by the light of nature, or whether to interfere in the name of knowledge and authority. What I wanted was a decent head of chicory. It's useless to be doctrinaire in such circumstances.
Sometimes, you just want a good head of chicory -- or a working program. Don't let philosophical ruminations get in the way. There will be time for reflection and evaluation later.
A few years ago, I picked up Fitzgerald's short novel The Bookshop while browsing the stacks at the public library. I enjoyed it despite the fact that (or perhaps because) it ended in a way that didn't create a false sense of satisfaction. Since then I have had Fitzgerald on my list of authors to explore more. I've read the first fifty pages or so of Innocence and quite like it.
I just started reading Joshua Kendall's The Man Who Made Lists, a story about Peter Mark Roget. Long before compiling his namesake thesaurus, Roget was a medical doctor with a local practice. After a family tragedy, though, he returned to teaching and became a science writer:
In the 1820s and 1830s, Roget would publish three hundred thousand words in the Encyclopaedia Britannica and also several lengthy review articles for the Society for the Diffusion of Useful Knowledge, the organization affiliated with the new University of London, which sought to enable the British working class to educate itself.
What a noble goal, enabling the working class to educate itself. And what a cool name: The Society for the Diffusion of Useful Knowledge!
For many years, my university has provided a series of talks for retirees, on topics from various departments on campus. This is a fine public service, though without the grand vision -- or the wonderful name -- of the Society for the Diffusion of Useful Knowledge. I suspect that most universities depend too much on tuition, and are too focused on keeping costs down, these days to mount an ambitious effort to enable the working class to educate itself.
Mental illness ran in Roget's family. Kendall wonders if Roget's "lifelong desire to bring order to the world" -- through his lecturing, his writing, and ultimately his thesaurus, which attempted to classify every word and concept -- may have "insulated him from his turbulent emotions" and helped him stave off the depression that afflicted several of his family members.
Academics often have an obsessive connection with the disciplines they practice and study. Certainly that sort of focus can be bad for a person when taken too far. (Is it possible for an obsession not to go too far?) For me, though, the focus of studying something deeply, organizing its parts, and trying to communicate it to others through my courses and writing has always felt like a gift. The activity has healing properties all its own.
In any case, the name "The Society for the Diffusion of Useful Knowledge" made me smile. Reading has the power to heal, too.
I'm not sure what to think of the fact that my bank says it received my money NaN years ago:
At least NaN hasn't shown up as my account balance yet! I suppose that if it were the result of an overflow, at least I'd know what it's like to be fabulously wealthy.
(For my non-technical readers, NaN stands for "Not a Number" and is used in computing to represent a value that is undefined or not representable. You may be able to imagine why seeing this in a bank transaction would be disconcerting to a programmer!)
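A short sketch in Python shows why NaN is such an odd value to meet in the wild; the bank's page was presumably running JavaScript, but the behavior is the same in any language using IEEE 754 floating point:

    import math

    nan = float("nan")        # one way a computation signals "undefined"
    print(nan == nan)         # False: NaN is not equal to anything, not even itself
    print(nan + 1_000_000)    # nan: arithmetic involving NaN stays NaN
    print(math.isnan(nan))    # True: the only reliable way to test for it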
Sometimes, I run across a sentence I wish I had written. Here are several paragraphs by Dan Bouk I would be proud to have written.
Museums offer a place to practice looking for and acknowledging beauty. This is, mostly, why I visit them.
As I wander from room to room, a pose diverts me, a glance attracts me, or a flash of color draws my eye. And then I look, and look, and look, and then move on.
Outside the museum, I find that this training sticks. I wander from subway car to platform, from park to city street, and a pose diverts me, a glance attracts me, or a flash of color draws my eye. People of no particular beauty reveal themselves to be beautiful. It feels as though I never left the museum, and now everything, all around me, is art.
This way of seeing persists, sometimes for days on end. It resonates with and reinforces my political commitment to the equal value of each of my neighbors. It vibrates with my belief in the divine spark, the image of God, that animates every person.
-- Dan Bouk, in On Walking to the Museum, Musing on Beauty and Safety
In a recent blog post, Why Grace Matters (for Software Development), Avdi Grimm tells the story of how he came to name his training site "Graceful.Dev". Check it out. This passage resolves into the answer:
You know, the word "grace" is interesting, because it has two different meanings. On the one hand, it means beauty in lines or in motion. But if you were raised with a religious background anything like mine, you know that grace is also something that saves you.
And in that moment on the dance floor, I realized that these two meanings of grace are really one and the same thing. Because grace is something that makes space for you to screw up, and then turns it into something beautiful.
I don't think I was raised in the same religious tradition as Avdi, but I was raised in a tradition that valued deeply the notion of grace. Grace manifest in sacrament was a powerful notion to me, one of the religious ideas I found most compelling as I was growing up.
That's probably why Avdi's realization strikes close to home for me. I carry the idea of grace into other parts of my life as part of my cultural DNA. His connection of grace to software feels right. "Grace makes space for you to screw up, and then turns it into something beautiful." -- I imagine that many programmers know this feeling, in a non-religious way, if only vaguely.
Robin Sloan speculates that large language models like ChatGPT have gone through a phase change in what they can accomplish.
AI at this moment feels like a mash-up of programming and biology. The programming part is obvious; the biology part becomes apparent when you see AI engineers probing their own creations the way scientists might probe a mouse in a lab.
Like so many people, I find my social media and blog feeds filled with ChatGPT and LLMs and DALL-E and ... speculation about what these tools mean for (1) the production of text and code, and (2) learning to write and program. A lot of that speculation is tinged with fear.
I admire Sloan's effort to be constructive in his approach to the uncertainty:
I've found it helpful, these past few years, to frame my anxieties and dissatisfactions as questions. For example, fed up with the state of social media, I asked: what do I want from the internet, anyway?
It turns out I had an answer to that question.
Where the GPT-alikes are concerned, a question that's emerging for me is:
What could I do with a universal function — a tool for turning just about any X into just about any Y with plain language instructions?
I admit that I am reacting to these developments slowly compared to many people. That's my style in most things: I am more likely to under-react to a change than to over-react, especially at the onset of the change. In this case, there is no chance of immediate peril, so waiting to see what happens as people use these tools seems like a reasonable response. I haven't made any effort to use these tools actively (or even been moved to), so any speculating I do would be uninformed by personal experience.
Instead, I read as people whose work I respect experiment with these tools and try to make sense of them. Occasionally, I draw a small, tentative conclusion, such as that prompting these generators is a lot like prompting students. After a few months of reading and a little reflection, I still think the biggest risk we face is probably that we tend to change the world around us to accommodate our technologies. If we put these tools to work for us in ways that enhance what we do, then the accommodation will pay off. If not, then we may, as Daniel Steinberg wrote in one of his newsletters, stop asking the questions we want to ask and start asking only the questions these tools can answer.
Professionally, I think most about the effect that ChatGPT and its ilk will have on programming and CS education. In these regards, I've been paying special attention to reports from David Humphrey, such as this blog post on his university's attempt to grapple with the widespread availability of these tools. David has approached OpenAI with an open mind and written thoughtfully about the promise and the risk. For example, he has written a lot of code with an LLM assistant and found it improving his ability both to write code and to think about problems. Advanced CS students can benefit from this kind of assistance, too, but David wonders how such tools might interfere with students first learning to program.
What do we educators want from generative programming tools anyway? What do I as a programmer and educator want from them?
So, at this point, my personal interaction with the phase change that Sloan describes has been mostly passive: I read about what others are doing and think about the results of their exploration. Perhaps this post is a guilty conscience asserting that I should be doing more. Really, though, I think of it more as an active form of inaction: an effort to collect some of my thinking as I slowly respond to the changes that are coming. Perhaps some day soon I will feel moved to use these tools as I write code of my own. For now, though, I am content to watch from the sidelines. You can learn a lot just by watching.
Poet Marvin Bell, in his contribution to the collection Writers on Writing:
The future belongs to the helpless. I am often presented that irresistible question asked by the beginning poet: "Do you think I am any good?" I have learned to reply with a question: "If I say no, are you going to quit?" Because life offers any of us many excuses to quit. If you are going to quit now, you are almost certainly going to quit later. But I have concluded that writers are people who you cannot stop from writing. They are helpless to stop it.
Reading that passage brought to mind Ted Gioia's recent essay on musicians who can't seem to retire. Even after accomplishing much, these artists seem never to want to stop doing their thing.
Just before starting Writers on Writing, I finished Kurt Vonnegut's Sucker's Portfolio, a slim 2013 volume of six stories and one essay not previously published. The book ends with an eighth piece: a short story unfinished at the time of Vonnegut's death. The story ends mid-sentence and, according to the book's editor, at the top of an unfinished typewritten page. In his mid-80s, Vonnegut was creating stories to the end.
I wouldn't mind if, when it's my time to go, folks find my laptop open to some fun little programming project I was working on for myself. Programming and writing are not everything there is to my life, but they bring me a measure of joy and satisfaction.
~~~~~
This week was a wonderful confluence of reading the Bell, Gioia, and Vonnegut pieces around the same time. So many connections... not least of which is that Bell and Vonnegut both taught at the Iowa Writers' Workshop.
There's also an odd connection between Vonnegut and the Gioia essay. Gioia used a quip attributed to the Roman epigrammatist Martial:
Fortune gives too much to many, but enough to none.
That reminded me of a story Vonnegut told occasionally in his public talks. He and fellow author Joseph Heller were at a party hosted by a billionaire. Vonnegut asked Heller, "How does it make you feel to know that guy made more money yesterday than Catch-22 has made in all the years since it was published?" Heller answered, "I have something he'll never have: the knowledge that I have enough."
There's one final connection here, involving me. Marvin Bell was the keynote speaker at Camouflage: Art, Science & Popular Culture, an international conference organized by graphic design prof Roy Behrens at my university and held in April 2006. Participants really did come from all around the world, mostly artists or designers of some sort. Bell read a new poem of his and then spoke of:
the ways in which poetry is like camouflage, how it uses a common vocabulary but requires a second look in order to see what is there.

I gave a talk at the conference called NUMB3RS Meets The DaVinci Code: Information Masquerading as Art. (That title was more timely in 2006 than 2023...) I presented steganography as a computational form of camouflage: not quite traditional concealment, not quite dazzle, but a form of dispersion uniquely available in the digital world. I recall that audience reaction to the talk was better than I feared when I proposed it to Roy. The computer science topic meshed nicely with the rest of the conference lineup, and the artists and writers who saw the talk seemed to appreciate the analogy. Anyway, lots of connections this week.
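For readers curious what steganography looks like as computation, here is a toy least-significant-bit sketch in Python. It is the classic textbook technique, offered only to illustrate the idea of information dispersed inside other data, not a reconstruction of the talk:

    def hide(cover: bytes, message: bytes) -> bytes:
        """Hide one message bit in the low-order bit of each cover byte."""
        bits = [(byte >> i) & 1 for byte in message for i in range(8)]
        assert len(bits) <= len(cover), "cover needs one byte per message bit"
        stego = bytearray(cover)
        for i, bit in enumerate(bits):
            stego[i] = (stego[i] & 0xFE) | bit   # overwrite the low-order bit
        return bytes(stego)

    def reveal(stego: bytes, length: int) -> bytes:
        """Recover a length-byte message from the low-order bits."""
        bits = [b & 1 for b in stego[:length * 8]]
        return bytes(sum(bits[i * 8 + j] << j for j in range(8)) for i in range(length))

    cover = bytes(range(200, 248))        # 48 cover bytes: room for a 6-byte message
    secret = b"dazzle"
    assert reveal(hide(cover, secret), len(secret)) == secret

The carrier changes by at most one unit per byte, so to a casual observer it looks unchanged: the message is dispersed rather than locked away.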
Catching up on articles in my newsreader, I ran across Commands I Use by @gvwilson. That sounded like fun, and I was game:
$ history | awk '{print $2}' | sort | uniq -c | sort -nr > commands.txt
The first four items on my list are essentially the same as Wilson's, and there are a lot of other similarities, too. I don't think this is surprising, given how Unix works and how much sense git makes for software developers to use.
It's interesting to see that I use rm and /bin/rm in roughly even measure. I would have guessed that I used the guarded command in higher proportion.
At the bottom of the tally are a few items I don't use often, or don't generally launch from the command line:
... and a bunch of typos, including:
That was fun! Thanks to Greg for the prompt.
Bruce Springsteen, on why he puts on such an intense physical show:
So the display of exuberance is critical. "For an adult, the world is constantly trying to clamp down on itself," he says. "Routine, responsibility, decay of institutions, corruption: this is all the world closing in. Music, when it's really great, pries that shit back open and lets people back in, it lets light in, and air in, and energy in, and sends people home with that and sends me back to the hotel with it. People carry that with them sometimes for a very long period of time."
This passage is from a 2012 profile of the Boss, We Are Alive: Bruce Springsteen at Sixty-Two. A good read throughout.
Another comment from earlier in the piece has been rumbling around my head since I read it. Many older acts, especially those of Springsteen's vintage, have become essentially "their own cover bands", playing the oldies on repeat for nostalgic fans. The Boss, though, "refuses to be a mercenary curator of his past" and continually evolves as an artist. That's an inspiration I need right now.
This morning on the exercise bike, I read a big chunk of Daniel Gross and Tyler Talk Talent, from the Conversations with Tyler series. The focus of this conversation is how to identify talent, as prelude to the release of their book on that topic.
The bit I've read so far has been like most Conversations with Tyler: interesting ideas with Cowen occasionally offering an offbeat idea seemingly for the sake of being offbeat. For example, if the person he is interviewing has read Shakespeare, he might say,
"Well, my hypothesis is that in Romeo and Juliet, Romeo and Juliet don't actually love each other at all. Does the play still make sense?" Just see what they have to say. It's a way of testing their second-order understanding of situations, diversity of characters.
This is a bit much for my taste, but the motivating idea behind talking to people about drama or literature is sound:
It's not something they can prepare for. They can't really fake it. If they don't understand the topic, well, you can switch to something else. But if you can't find anything they can understand, you figure, well, maybe they don't have that much depth or understanding of other people's characters.
It seems to me that this style of interviewing runs a risk of not being equitable to all candidates, and at the very least places high demands on both the interviewee and the interviewer. That said, Gross summarizes the potential value of talking to people about movies, music, and other popular culture in interviews:
I think that works because you can learn a lot from what someone says -- they're not likely to make up a story -- but it's also fun, and it is a common thing many people share, even in this era of HBO and Netflix.
This exchange reminded me of perhaps my favorite interview of all time, one in which I occupied the hot seat.
I was a senior in high school, hoping to study architecture at Ball State University. (Actual architecture... the software thing would come later.) I was a pretty good student, so I applied for Ball State's Whitinger Scholarship, one of the university's top awards. My initial application resulted in me being invited to campus for a personal interview. First, I sat to write an essay over the course of an hour, or perhaps half an hour. To be honest, I don't remember many details from that part of the day, only sitting in a room by myself for a while with a blue book and writing away. I wrote a lot of essays in those days.
Then I met with Dr. Warren Vander Hill, the director of the Honors College, for an interview. I'd had a few experiences on college campuses in the previous couple of years, but I still felt a little out of my element. Though I came from a home that valued reading and learning, my family background was not academic.
On a shelf behind Dr. Vander Hill, I noticed a picture of him in a Hope College basketball jersey, dribbling during a college game. I casually asked him about it and learned that he had played Division III varsity ball as an undergrad. I just now searched online in hopes of confirming my memory and learned that he is still #8 on the list of Hope's all-time career scoring leaders. I don't recall him slipping that fact into our chat... (Back then, he would have been #2!)
Anyway, we started talking basketball. Somehow, the conversation turned to Oscar Robertson, one of the NBA's all-time great players. He starred at Indianapolis's all-black Crispus Attucks High School and led the school to a state championship in 1955. From there, we talked about a lot of things -- the integration of college athletics, the civil rights movement, the state of the world in 1982 -- but it all seemed to revolve around basketball.
The time flew. Suddenly, the interview period was over, and I headed home. I enjoyed the conversation quite a bit, but on the hour drive, I wondered if I'd squandered my chances at the scholarship by using my interview time to talk sports. A few weeks later, though, I received a letter saying that I had been selected as one of the recipients.
That was the beginning of four very good years for me. Maybe I can trace some of that fortune to a conversation about sports. I certainly owe a debt to the skill of the person who interviewed me.
I got to know Dr. Vander Hill better over the next four years and slowly realized that he had probably known exactly what he was doing in that interview. He had found a common interest we shared and used it to start a conversation that opened up into bigger ideas. I couldn't have prepared answers for this conversation. He could see that I wasn't making up a story, that I was genuinely interested in the issues we were discussing and was, perhaps, genuinely interesting. The interview was a lot of fun, for both of us, I think, and he learned a lot about me from just talking.
I learned a lot from Dr. Vander Hill over the years, though what I learned from him that day took a few years to sink in.
I'm trying to get back into the habit of writing here more regularly. In the early days of my blog, I posted quick snippets every so often. Here's a set to start 2023.
• Falsework
From A Bridge Over a River Never Crossed:
Funnily enough, traditional arch bridges were built by first having a wood framing on which to lay all the stones in a solid arch (YouTube). That wood framing is called falsework, and is necessary until the arch is complete and can stand on its own. Only then is the falsework taken away. Without it, no such bridge would be left standing. That temporary structure, even if no trace is left of it at the end, is nevertheless critical to getting a functional bridge.
Programmers sometimes write a function or an object that helps them build something else that they couldn't easily have built otherwise, then delete the bridge code after they have written the code they really wanted. A big step in the development of a student programmer is when they do this for the first time, and feel in their bones why it was necessary and good.
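To make the idea concrete, here is a small hypothetical sketch in Python. The function names and the sliding-window problem are invented purely for illustration; the point is the brute-force oracle, a piece of falsework that exists only until the function we really want earns our trust, and then gets deleted:

    # A hypothetical bit of falsework: a slow, obviously correct oracle used
    # only while building the function we actually want. Once the fast version
    # earns our trust, the oracle (and this little harness) gets deleted.

    import random

    def brute_force_max_window(xs, k):
        # Max sum over every window of size k, the slow but obvious way.
        return max(sum(xs[i:i+k]) for i in range(len(xs) - k + 1))

    def fast_max_window(xs, k):
        # The code we really want: a sliding window, computed in one pass.
        window = sum(xs[:k])
        best = window
        for i in range(k, len(xs)):
            window += xs[i] - xs[i - k]
            best = max(best, window)
        return best

    if __name__ == "__main__":
        for _ in range(1000):
            xs = [random.randint(-10, 10) for _ in range(random.randint(5, 30))]
            k = random.randint(1, len(xs))
            assert fast_max_window(xs, k) == brute_force_max_window(xs, k)
        print("the falsework has done its job; time to tear it down")

The oracle never ships. Like the wood framing under an arch, its only job is to hold things up while the real structure takes shape.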
• Repair as part of the history of an object
From The Art of Imperfection and its link back to a post on making repair visible, I learned about Kintsugi, a practice in Japanese art...
that treats breakage and repair as part of the history of an object, rather than something to disguise.
I follow this pattern around my home, at least on occasion. I often repair my backpack, satchel, or clothing and leave evidence of the repair visible. My family thinks it's odd, but they figure it's just me.
Do I do this in code? I don't think so. I tend to like clean code, with no distractions for future readers. The closest thing to Kintsugi I can think of now is a comment that mentions where some bit of code came from, especially when the current code wasn't intuitive to me at the time. Perhaps my memory is failing me, though. I'll be on the watch for this practice as I program.
• "It is good to watch the master."
I've been reading a rundown of the top 128 tennis players of the last hundred years, including this one about Pancho Gonzalez, one of the great players of the 1940s, '50s, and '60s. He was forty years old when the Open Era of tennis began in 1968, well past his prime. Even so, he could still compete with the best players in the game.
Even his opponents could appreciate the legend in their midst. Denmark's Torben Ulrich lost to him in five sets at the 1969 US Open. "Pancho gives great happiness," he said. "It is good to watch the master."
The old masters give me great happiness, too. With any luck, I can give a little happiness to my students now and then.
I've been catching up on some items in my newsreader that went unread last summer while I rode my bike outdoors rather than inside. This passage from a blog post by Fred Wilson at AVC touched on a personal habit I've been working on:
I can't imagine an effective exec team that isn't in person together at least once a month.
I sometimes fall into a habit of saying or thinking "I can't imagine...". I'm trying to break that habit.
I don't mean to pick on Wilson, whose short posts I enjoy for insight into the world of venture capital. "I can't imagine" is a common trope in both spoken and written English. Some writers use it as a rhetorical device, not as a literal expression. Maybe he meant it that way, too.
For a while now, though, I've been trying to catch myself whenever I say or think "I can't imagine...". Usually my mind is simply being lazy, or too quick to judge how other people think or act.
It turns out that I usually can imagine, if I try. Trying to imagine how that thinking or behavior makes sense helps me see what other people might be thinking, what their assumptions or first principles are. Even when I end up remaining firm in my own way of thinking, trying to imagine usually puts me in a better position to work with the other person, or explain my own reasoning to them more effectively.
Trying to imagine can also give me insight into the limits of my own thinking. What assumptions am I making that lead me to have confidence in my position? Are those assumptions true? If yes, when might they not be true? If no, how do I need to update my thinking to align with reality?
When I hear someone say, "I can't imagine..." I often think of Russell and Norvig's textbook Artificial Intelligence: A Modern Approach, which I used for many years in class [1]. At the end of one of the early chapters, I think, they mention critics of artificial intelligence who can't imagine the field of AI ever accomplishing a particular goal. They respond cheekily, to the effect that this says less about AI than it says about the critics' lack of imagination. I don't think I'd ever seen a textbook dunk on anyone before, and as a young prof and open-minded AI researcher, I very much enjoyed that line [2].
Instead of saying "I can't imagine...", I am trying to imagine. I'm usually better off for the effort.
~~~~
[1] The Russell and Norvig text first came out in 1995. I wonder if the subtitle "A Modern Approach" is still accurate... Maybe theirs is now a classical approach!
[2] I'll have to track that passage down when I am back in my regular office and have access to my books. (We are in temporary digs this fall due to construction.) I wonder if AI has accomplished the criticized goal in the time since Russell and Norvig published their book. AI has reached heights in recent years that many critics in 1995 could not imagine. I certainly didn't imagine a computer program defeating a human expert at Go in my lifetime, let alone learning to do so almost from scratch! (I wrote about AlphaGo and its intersection with my ideas about AI a few times over the years: [ 01/2016 | 03/2016 | 05/2017 | 05/2018 ].)
From Robin Sloan's newsletter:
There was a book I wanted very badly to write; a book I had been making notes toward for nearly ten years. (In my database, the earliest one is dated December 13, 2013.) I had not, however, set down a single word of prose. Of course I hadn't! Many of you will recognize this feeling: your "best" ideas are the ones you are most reluctant to realize, because the instant you begin, they will drop out of the smooth hyperspace of abstraction, apparate right into the asteroid field of real work.
I can't really say that there is a book I want very badly to write. In the early 2000s I worked with several colleagues on elementary patterns, and we brainstormed writing an intro CS textbook built atop a pattern language. Posts from the early days of this blog discuss some of this work from ChiliPLoP, I think. I'm not sure that such a textbook could ever have worked in practice, but I think writing it would have been a worthwhile experience anyway, for personal growth. But writing such a book takes a level of commitment that I wasn't able to make.
That experience is one of the reasons I have so much respect for people who do write books.
While I do not have a book for which I've been making notes in recent years, I do recognize the feeling Sloan describes. It applies to blog posts and other small-scale writing. It also applies to new courses one might create, or old courses one might reorganize and teach in a very different way.
I've been fortunate to be able to create and re-create many courses over my career. I also have some ideas that sit in the back of my mind because I'm a little concerned about the commitment they will require, the time and the energy, the political wrangling. I'm also aware that the minute I begin to work on them, they will no longer be perfect abstractions in my mind; they will collide with reality and require compromises and real work.
(Today I learned the word "apparate". I'm not sure how I feel about it yet.)
I've been pretty quiet on Twitter lately. One reason is that my daily schedule has been so different for the last six or eight weeks: I've been going for bike rides with my wife at the end of the work day, which means I'm most likely to be reading Twitter late in the day. By then, many of the threads I see have played themselves out. Maybe I should jump in anyway? Even after more than a decade, I'm not sure I know how to Twitter properly.
Here are a few Twitter replies that no one asked for and that I chose not to send at the time.
• When people say, "That's the wrong question to ask", what they often seem to mean -- and should almost always say -- is, "That's not the question I would have asked."
• No, I will not send you a Google Calendar invite. I don't use Google Calendar. I don't even put every event into the calendaring system I *do* use.
• Yes, I will send you a Zoom link.
• COVID did not break me for working from home. Before the pandemic, I almost never worked at home during the regular work day. As a result, doing so felt strange when the pandemic hit us all so quickly. But I came first to appreciate and then to enjoy it, for many of the same reasons others enjoy it. (And I don't even have a long or onerous commute to campus!) Now, I try to work from home one day a week when schedules allow.
• COVID also did not break me for appreciating a quiet and relatively empty campus. Summer is still a great time to work on campus, when the pace is relaxed and most of the students who are on campus are doing research. Then again, so is fall, when students return to the university, and spring, when the sun returns to the world. The takeaway: It's usually a great time to be on campus.
I realize that some of these replies in absentia are effectively subtweets at a distance. All the more reason to post them here, where everyone who reads them has chosen to visit my blog, rather than in a Twitter thread filled with folks who wouldn't know me from Adam. They didn't ask for my snark.
I do stand by the first bullet as a general observation. Most of us -- me included! -- would do better to read everyone else's tweets and blog posts as generously as possible.
First, a relatively small-scale dread. From Jeff Jarvis in What Is Happening to TV?
I dread subscribing to Apple TV+, Disney+, Discovery+, ESPN+, and all the other pluses for fear of what it will take to cancel them.
I have not seen a lot of popular TV shows and movies in the last decade or two because I don't want to deal with the hassle of unsubscribing from some service. I have a list of movies to keep an eye out for in other places, should they ever appear, or to watch at their original homes, should my desire to see them ever outgrow my preference to avoid certain annoyances.
Next, a larger-scale source of hope, courtesy of Neel Krishnaswami in The Golden Age of PL Research:
One minor fact about separation logic. John C. Reynolds invented separation logic when he was 65. At the time that most people start thinking about retirement, he was making yet another giant contribution to the whole field!
I'm not thinking about retirement at all yet, but I am past my early days as a fresh, energetic, new assistant prof. It's good to be reminded every once in a while that the work we do at all stages of our careers can matter. I didn't make giant contributions when I was younger, and I'm not likely to make a giant contribution in the future. But I should strive to keep doing work that matters. Perhaps a small contribution remains to be made.
~~~~
This isn't much of a blog post, I know. I figure if I can get back into the habit of writing small thoughts down, perhaps I can get back to blogging more regularly. It's all about the habit. Wish me luck.
I did not intend for August to be radio silence on my blog and Twitter page. The summer just caught up with me, and my brain took care of itself, I guess, by turning off for a bit.
One bit of newness for the month was setting up a new MacBook Air. I finally placed my order on July 24. It was scheduled to arrive the week of August 10-17 but magically appeared on our doorstep on July 29. I've been meaning to write about the experience of setting up a new Mac laptop after working for seven years on a trusty MacBook Pro, but that post has been a victim of the August slowdown. I can say this: I pulled out the old MacBook Pro to watch Netflix on Saturday evening... and it felt *so* heavy. How quickly we adjust to new conditions and forget how lucky we were before.
Another pleasure in August was meeting up with Daniel Steinberg over Zoom. I remember back near the beginning of the pandemic Daniel said something on Twitter about getting together for a virtual coffee with friends and colleagues he could no longer visit. After far too long, I contacted him to set up a chat. We had a lot of catching up to do and ended up discussing teaching, writing, programming, and our families. It was one of my best hours for the month!
My wife and I took advantage of the last week before school started by going on a couple of hikes. We visited Backbone State Park for the first time and spent an entire day walking and enjoying scenery that most people don't associate with Iowa. The image at the top of this post comes from the park's namesake trail, which showcases some of the dolomite limestone cliffs left over from before the last glaciers. Here's another shot, of an entrance to a cave carved out by icy water that still flows beneath the surface:
Closer to home, we took a long morning to walk through Hartman Reserve, a county preserve. Walking for a couple of hours as the sun rises and watching the trees and wildlife come to light is a great way to shake some rust off the mind before school starts.
I had a tough time getting ready mentally for the idea of a new school year. This summer's work offered more burnout than refreshment. As the final week before classes wound down, I had to get serious about class prep -- and it freed me up a bit. Writing code, thinking about CS, and getting back into the classroom with students still energize me. This fall is my compilers course. I'm giving myself permission to make only a few targeted changes in the course plan this time around. I'm hoping that this lets me build some energy and momentum throughout the semester. I'll need that in order to be there for the students.
From the closing pages of The Orchid Thief, which I mentioned in my previous post:
"The thing about computers," Laroche said, "the thing that I like is that I'm immersed in it but it's not a living thing that's going to leave or die or something. I like having the minimum number of living things to worry about in my life."
Actually, I have two comments.
If Laroche had gotten into open source software, he might have found himself with the opposite problem: software that won't die. Programmers sometimes think, "I know, I'll design and implement my own programming language!" Veterans of the programming languages community always seem to advise: think twice. If you put something out there, other people will use it, and now you are stuck maintaining a package forever. The same can be said for open source software more generally. Oh, and did I mention it would be really great if you added this feature?
I like having plants in my home and office. They give me joy every day. They also tend to live a lot longer than some of my code. The hardy orchid featured above bloomed like clockwork twice a year for me for five and a half years. Eventually it needed more space than the pot in my office could give, so it's gone now. But I'm glad to have enjoyed it for all those years.
Chad Orzel, getting ready to begin his 22nd year as a faculty member at Union College, muses:
It's really hard to see myself in the "grizzled veteran" class of faculty, though realistically, I'm very much one of the old folks these days. I am to a new faculty member starting this year as someone hired in 1980 would've been to me when I started, and just typing that out makes me want to crumble into dust.
I'm not the sort who likes to one-up another blogger, but... I can top this, and crumble into a bigger, or at least older, pile of dust.
In May, I finished my 30th year as a faculty member. I am as old to a 2022 hire as someone hired in 1962 would have been to me. Being in computer science, rather than physics or another of the disciplines older than CS, this is an even bigger gap culturally than it first appears. The first Computer Science department in the US was created in 1962. In 1992, my colleagues who started in the 1970s seemed pretty firmly in the old guard, and the one CS faculty member from the 1960s had just retired, opening the line into which I was hired.
Indeed, our Department of Computer Science only came into existence in 1992. Prior to that, the CS faculty had been offering CS degrees for a little over a decade as part of a combined department with Mathematics. (Our department even has a few distinguished alums who graduated pre-1981, with CS degrees that are actually Math degrees with a "computation emphasis".) A new department head and I were hired for the department's first year as a standalone entity, and then we hired two more faculty the next year to flesh out our offerings.
So, yeah. I know what Chad means when he says "just typing that out makes me want to crumble into dust", and then some.
On the other hand, it's kind of cool to see how far computer science has come as an academic discipline in the last thirty years. It's also cool that I am still excited to learn new things and to work with students as they learn them, too.
I first saw Billy Joel perform live in 1983, with a college roommate and our girlfriends. It was my first pop/rock concert, and I fancied myself the biggest Billy Joel fan in the world. The show was like magic to a kid who had been listening to Billy's music on vinyl, and the radio, for years.
Since then, I've seen him more times than I can remember, most recently in 2008. My teenaged daughters went with me to that one, so it was magic for more reasons than one. I've even seen a touring Broadway show built around his music. So, yeah, I'm still a fan.
On Saturday morning, I drove to Elkhart, Indiana, to meet up with three friends from college to go see Billy perform outdoors at Notre Dame Stadium. We bought our tickets in October 2019, pre-COVID, expecting to see the show in the summer of 2020. After two years of postponement, Billy, the venue, and the fans were ready to go. Six hours is a long way to drive to see a two- or three-hour show, especially knowing that I had to drive six hours back the next morning. I'm not a college student any more!
You may be right; I may be crazy. But I would drive six hours again to see Billy. Even at 73, he puts on a great show. I hope I have that kind of energy -- and the desire to still do my professional thing -- when I reach that age. (I don't expect that 50,000 students will pay to see me do it, let alone drive six hours.) For this show, I had the bonus of being able to visit with good friends, one of whom I've known since grade school, after too long a time.
I went all fanboy in my short post about the 2008 concert, so I won't bore you again with my hyperbole. I'll just say that Billy performed "She's Always A Woman" and "Don't Ask Me Why" again, along with a bunch of the old favorites and a few covers: I enjoyed his impromptu version of "Don't Let the Sun Go Down on Me", bobbles and all. He played piano for one of his band members, Mike DelGuidice, who sang "Nessun Dorma". And the biggest ovation of the night may have gone to Crystal Taliafero, a multi-talented member of Billy's group, for her version of "Dancing in the Streets" during the extended pause in "The River of Dreams".
This concert crowd was the most people I've been around in a long time... I figured a show in an outdoor stadium was safe enough, with precautions. (I was one of the few folks who wore a mask in the interior concourse and restrooms.) Maybe life is getting back to normal.
If this was my last time seeing Billy Joel perform live, it was a worthy final performance. Who knows, though. I thought 2008 might be my last live show.
This past weekend, it was supposed to rain Saturday evening into Sunday, so I woke up with uncertainty about my usual Sunday morning bike ride. My exercise bike broke down a few weeks back, so riding outdoors was my only option. I decided before I went to bed on Saturday night that, if it was dry when I woke up, I would ride a couple of miles to a small lake in town and ride laps in whatever time I could squeeze in between rain showers.
The rain in the forecast turned out to be a false alarm, so I had more time to ride than I had planned. I ended up riding the 2.3 miles to the lake, fifteen 1.2-mile laps around it, and the 2.3 miles back home. Fifteen mile-plus laps may seem crazy to you, but it was the quickest and most predictable adjustment I could make in the face of the suddenly available time. It was like a speed workout on the track from my running days. Though shorter than my usual Sunday ride, it was an unexpected gift of exercise on what turned out to be a beautiful morning.
A couple of laps into the ride, the hill on the far end of the loop began to look foreboding. Thirteen laps to go... Thirteen more times up an extended incline (well, at least what passes for one in east central Iowa).
After a few more laps, my mindset had changed. Six down. This feels good. Let's do nine more!
I had found the rhythm of doing repeats.
I used to do track repeats when training for marathons and always liked them. (One of my earliest blog entries sang the praises of short iterations and frequent feedback on the track.) I felt again the hit of endorphins every time I completed one loop around the lake. My body got into the rhythm. Another one, another one. My mind doesn't switch off under these conditions, but it does shift into a different mode. I'm thinking, but only in the moment of the current lap. Then there's one more to do.
I wonder if this is one of the reasons some programmers like programming with stories of a limited size, or under the constraints of test-driven design. Both provide opportunities for frequent feedback and frequent learning. They also provide a hit of endorphins every time you make a new test pass, or see the light go green after a small refactoring.
My willingness to do laps, at least in service of a higher goal, may border on the unfathomable. One Sunday many years ago, when I was still running, we had huge thunderstorms all morning and all afternoon. I was in the middle of marathon training and needed a 20-miler that day to stay on my program. So I went to the university gym -- the one mentioned in the blog post linked above, with 9.2 laps to a mile -- and ran 184 laps. "Are you nuts?" I loved it! The short iterations and frequent feedback dropped me in to a fugue-like rhythm. It was easy to track my pace, never running too fast or too slow. It was easy to make adjustments when I noticed something off-plan. In between moments checking my time, I watched people, I breathed, I cleared my mind. I ran. All things considered, it was a good day.
Sunday morning's fifteen laps were workaday in comparison. At the end, I wished I had more time to ride. I felt strong enough. Another five laps would have been fun. That hill wasn't going to get me. And I liked the rhythm.
Like many people, I carry a small notebook most everywhere I go. It is not a designer's sketchbook or engineer's notebook; it is intended primarily for capturing information and ideas, à la Getting Things Done, before I forget them. Most of the notes end up being transferred to one of my org-mode todo lists, to my calendar, or to a topical file for a specific class or project. Write an item in the notebook, transfer it to the appropriate bin, and cross it off in the notebook.
I just filled the last line of my most recent notebook, a Field Notes classic that I picked up as schwag at Strange Loop a few years ago. Most of the notebook is crossed out, a sign of successful capture and transfer. As I thumbed back through it, I saw an occasional phrase or line that never made it into a more permanent home. That is pretty normal for my notebooks. At this point, I usually recycle the used notebook and consign untracked items to lost memories.
For some reason, this time I decided to copy all of the untracked items down and savor the randomness of my mind. Who knows, maybe I'll use one of these notes some day.
The Feds
basic soul math
I want to be #0
routine, ritual
gallery.stkate.edu
M. Dockery
www.wastetrac.org/spring-drop-off-event
Crimes of the Art
What the Puck
Massachusetts ombudsman
I hope it's still funny...
chessable.com
art gallery
ena @ tubi
In Da Club (50 Cent)
Gide 25; 28 May : 1
HFOSS project
April 4-5: Franklin documentary
Mary Chapin Carpenter
"Silent Parade" by Keigo Higashino
www.pbs.org -- search Storm Lake
"Hello, Transcriber" by Hannah Morrissey
Dear Crazy Future Eugene
I recognize most of these, though I don't remember the reason I wrote all of them down. For whatever reason, they never reached an actionable status. Some books and links sound interesting in the moment, but by the time I get around to transcribing them elsewhere, I'm no longer interested enough to commit to reading, watching, or thinking about them further. Sometimes, something pops into my mind, or I see something, and I write it down. Better safe than sorry...
That last one -- Dear Crazy Future Eugene -- ends up in a lot of my notebooks. It's a phrase that has irrational appeal to me. Maybe it is destined to be the title of my next blog.
There were also three multiple-line notes that were abandoned:
poem > reality
words > fact
a model is not identical
I vaguely recall writing this down, but I forget what prompted it. I vaguely agree with the sentiment even now, though I'd be hard-pressed to say exactly what it means.
Scribble pages that separate notes from full presentation
(solutions to exercises)
This note is from several months ago, but it is timely. Just this week, a student in my class asked me to post my class notes before the session rather than after. I don't do this currently in large part because my sessions are a tight interleaving of exercises that the students do in class, discussion of possible solutions, and use of those ideas to develop the next item for discussion. I think that Scribble, an authoring system that comes with Racket, offers a way for me to build pages I can publish in before-and-after form, or at least in an outline form that would help students take notes. I just never get around to trying the idea out. I think the real reason is that I like to tinker with my notes right up to class time... Even so, the idea is appealing. It is already in my planning notes for all of my classes, but I keep thinking about it and writing it down as a trigger.
generate scanner+parser? expand analysis,
codegen (2 stages w/ IR -- simple exps, RTS, full)
optimization! would allow bigger source language?
This is evidence that I'm often thinking about my compiler course and ways to revamp it. This idea is also already in the system. But I keep prompting myself to think about it again.
Anyway, that was a fun way to reflect on the vagaries of my mind. Now, on to my next notebook: a small pocket-sized spiral notebook I picked up for a quarter in the school supplies section of a big box store a while back. My friend Joe Bergin used to always have one of these in his shirt pocket. I haven't used a spiral-bound notebook for years but thought I'd channel Joe for a couple of months. Maybe he will inspire me to think some big thoughts.
It's been a while since I read a non-technical article and made as many notes as I did this morning on this Paris Review interview with Billy Collins. Collins was poet laureate of the U.S. in the early 2000s. I recall reading his collection, Sailing Alone Around the Room, at PLoP in 2002 or 2003. Walking the grounds at Allerton with a poem in your mind changes your eyes and ears. Had I been blogging by then, I probably would have commented on the experience, and maybe one or two of the poems, in a post.
As I read this interview, I encountered a dozen or so passages that made me think about things I do, things I've thought, and even things I've never thought. Here are a few.
I'd like to get something straightened out at the beginning: I write with a Uni-Ball Onyx Micropoint on nine-by-seven bound notebooks made by a Canadian company called Blueline. After I do a few drafts, I type up the poem on a Macintosh G3 and then send it out the door.
Uni-Ball Micropoint pens are my preferred writing implement as well, though I don't write enough on paper any more to make buying a particular pen much worth the effort. Unfortunately, just yesterday my last Uni-Ball Micro wrote its last line. Will I order more? It's a race between preference and sloth.
I type up most of the things I write these days on a 2015-era MacBook Pro, often connected to a Magic Keyboard. With the advent of the M1 MacBook Pros, I'm tempted to buy a new laptop, but this one serves me so well... I am nothing if not loyal.
The pen is an instrument of discovery rather than just a recording implement. If you write a letter of resignation or something with an agenda, you're simply using a pen to record what you have thought out. In a poem, the pen is more like a flashlight, a Geiger counter, or one of those metal detectors that people walk around beaches with. You're trying to discover something that you don't know exists, maybe something of value.
Programming may be like writing in many ways, but the search for something to say isn't usually one of them. Most of us sit down to write a program to do something, not to discover some unexpected outcome. However, while I may know what my program will do when I get done, I don't always know what that program will look like, or how it will accomplish its task. This state of uncertainty probably accounts for my preference in programming languages over the years. Smalltalk, Ruby, and Racket have always felt more like flashlights or Geiger counters than tape recorders. They help me find the program I need more readily than Java or C or Python.
I love William Matthews's idea--he says that revision is not cleaning up after the party; revision is the party!
Refactoring is not cleaning up after the party; refactoring is the party! Yes.
... nothing precedes a poem but silence, and nothing follows a poem but silence. A poem is an interruption of silence, whereas prose is a continuation of noise.
I don't know why this passage grabbed me. Perhaps it's just the imagery of the phrases "interruption of silence" and "continuation of noise". I won't be surprised if my subconscious connects this to programming somehow, but I ought to be suspicious of the imposition. Our brains love to make connections.
She's this girl in high school who broke my heart, and I'm hoping that she'll read my poems one day and feel bad about what she did.
This is the sort of sentence I'm a sucker for, but it has no real connection to my life. Though high school was a weird and wonderful time for me, as it was for so many, I don't think anything I've ever done since has been motivated in this way. Collins actually goes on to say the same thing about his own work. Readers are people with no vested interest. We have to engage them.
Another example of that is my interest in bridge columns. I don't play bridge. I have no idea how to play bridge, but I always read Alan Truscott's bridge column in the Times. I advise students to do the same unless, of course, they play bridge. You find language like, South won with dummy's ace, cashed the club ace and ruffed a diamond. There's always drama to it: Her thirteen imps failed by a trick. There's obviously lots at stake, but I have no idea what he's talking about. It's pure language. It's a jargon I'm exterior to, and I love reading it because I don't know what the context is, and I'm just enjoying the language and the drama, almost like when you hear two people arguing through a wall, and the wall is thick enough so you can't make out what they're saying, though you can follow the tone.
I feel seen. Back when we took the local daily paper, I always read the bridge column by Charles Goren, which ran on the page with the crossword, crypto cipher, and other puzzles. I've never played bridge; most of what I know about the game comes from reading Matthew Ginsberg's papers about building AI programs to bid and play. Like Collins, I think I was merely enjoying the sound of the language, a jargon that sounds serious and silly at the same time.
Yeats summarizes this whole thing in "Adam's Curse" when he writes: "A line will take us hours maybe, / Yet if it does not seem a moment's thought / Our stitching and unstitching has been naught."
I'm not a poet, and my unit of writing is rarely the line, but I know a feeling something like this in writing lecture notes for my students. Most of the worst writing consists of paragraphs and sections I have not spent enough time on. Most of the best sounds natural, a clean distillation of deep understanding. But those paragraphs and sections are the result of years of evolution. That's the time scale on which some of my courses grow, because no course ever gets my full attention in any semester.
When I finish a set of notes, I usually feel like the stitching and unstitching have not yet reached their desired end. Some of the text "seems a moment's thought", but much is still uneven or awkward. Whatever the state of the notes, though, I have to move on to the next task: grading a homework assignment, preparing the next class session, or -- worst of all -- performing the administrivia that props up the modern university. More evolution awaits.
~~~~
This was a good read for a Sunday morning on the exercise bike, well recommended. The line on revision alone was worth the time; I expect it will be a stock tool in my arsenal for years to come.
When I was in high school, the sponsor of our Math Club, Mr. Harpring, liked to give books as prizes and honors for various achievements. One time, he gave me Women in Mathematics, by Lynn Osen. It introduced me to Émilie du Châtelet, Sophie Germain, Emmy Noether, and a number of other accomplished women in the field. I also learned about some cool math ideas.
Another time, I received The Unexpected Hanging and Other Mathematical Diversions, a collection of Martin Gardner's columns from Scientific American. One of the chapters was about Hexapawn, a simple game played with chess pawns on a 3x3 board. The chapter described an analog computer that learned how to play a perfect game of Hexapawn. I was amazed.
I played a lot of chess in high school and was already interested in computer chess programs. Now I began to wonder what it would be like to write a program that could learn to play chess... I suspect that Gardner's chapter planted one of the seeds that grew into my study of computer science in college. (It took a couple of years, though. From the time I was eight years old, I had wanted to be an architect, and that's where my mind was focused.)
As I wrote those words, it occurred to me that I may have written about the Gardner book before. Indeed I have, in a 2013 post on building the Hexapawn machine. Some experiences stay with you.
They also intersect with the rest of the world. This week, I read Jeff Atwood's recent post about his project to bring the 1973 book BASIC Computer Games into the 21st century. This book contains the source code of BASIC programs for 101 simple games. The earliest editions of this book used a version of BASIC before it included the GOSUB command, so there are no subroutines in any of the programs! Atwood started the project as a way to bring the programs in this book to a new audience, using modern languages and idioms.
You may wonder why he and other programmers feel fondly enough about BASIC Computer Games to reimplement its programs in Java or Ruby. They feel about these books the way I felt about The Unexpected Hanging. Books were the Github of the day, only in analog form. Many people in the 1970s and 1980s got their start in computing by typing these programs, character for character, into their computers.
I was not one of those people. My only access to a computer was at the high school, where I took a BASIC programming class my junior year. I had never seen a book like BASIC Computer Games, so I wrote all my programs from scratch. As mentioned in an old OOPSLA post from 2005, the first program I wrote out of passion was a program to implement a ratings system for our chess club. Elo ratings were a great application for a math student and beginning programmer.
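Looking back, the core of an Elo system is small enough to sketch in a few lines. Here is a rough Python version of the standard update rule, with a hypothetical K-factor of 32; it is not the program I wrote back then, just the formula at its heart:

    # A rough sketch of the standard Elo update rule (not my old BASIC program).
    # K = 32 is a hypothetical choice; clubs tune it to taste.

    def elo_expected(rating_a, rating_b):
        # Expected score for player A against player B, between 0 and 1.
        return 1 / (1 + 10 ** ((rating_b - rating_a) / 400))

    def elo_update(rating_a, rating_b, score_a, k=32):
        # score_a is 1 for a win, 0.5 for a draw, 0 for a loss.
        expected_a = elo_expected(rating_a, rating_b)
        new_a = rating_a + k * (score_a - expected_a)
        new_b = rating_b + k * ((1 - score_a) - (1 - expected_a))
        return round(new_a), round(new_b)

    # Example: a 1500-rated player upsets a 1700-rated player.
    print(elo_update(1500, 1700, 1))   # roughly (1524, 1676)

The whole system is just bookkeeping around that update: record each game, adjust both ratings, repeat. No wonder it appealed to a math-minded beginner.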
Anyway, I went to the project's Github site to check out what was available and maybe play a game or two. And there it was: Hexapawn! Someone has already completed the port to Python, so I grabbed it and played a few games. The text interface is right out of 1973, as promised. But the program learns, also as promised, and eventually plays a perfect game. Playing it brings back memories of playing my matchbox computer from high school. I wonder now if I should write my own program that learns Hexapawn faster, hook it up with the program from the book, and let them duke it out.
Atwood's post brought to mind pleasant memories at a time when pleasant memories are especially welcome. So many experiences create who we are today, yet some seem to have made an outsized contribution. Learning BASIC and reading Martin Gardner's articles are two of those for me.
Reading that blog post and thinking about Hexapawn also reminded me of Mr. Harpring and the effect he had on me as a student of math and as a person. The effects of a teacher in high school or grade school can be subtle and easy to lose track of over time. But they can also be real and deep, and easy not to appreciate fully when we are living them. I wish I could thank Mr. Harpring again for the books he gave me, and for the gift of seeing a teacher love math.
A couple of weeks back, I saw an article in which Malcolm Gladwell noted that he did not know The Triggering Town, a slim book of essays by poet Richard Hugo. I was fortunate to hear about Hugo many years ago from software guru Richard Gabriel, who is also a working poet. It had been fifteen years or more since I'd read The Triggering Town, so I stopped into the library on my way home one day and picked it up. I enjoyed it the second time around as much as the first.
I frequently make notes of passages to save. Here are five from this reading.
Actually, the hard work you do on one poem is put in on all poems. The hard work on the first poem is responsible for the sudden ease of the second. If you just sit around waiting for the easy ones, nothing will come. Get to work.
That advice works for budding software developers, too.
Emotional honesty is a rare thing in the academic world or anywhere else for that matter, and nothing is more prized by good students.
Emotion plays a much smaller role in programming than in writing poetry. Teaching, though, is deeply personal, even in a technical discipline. All students value emotional honesty, and profs who struggle to be open usually struggle to make connections with their students.
Side note: Teachers, like policemen, firemen, and service personnel, should be able to retire after twenty years with full pension. Our risks may be different, but they are real. In twenty years most teachers have given their best.
This is a teacher speaking, so take the recommendation with caution. But more than twenty years into this game, I know exactly what Hugo means.
Whatever, by now, I was old enough to know explanations are usually wrong. We never quite understand and we can't quite explain.
Yet we keep trying. Humans are an optimistic animal, which is one of the reasons we find them so endearing.
... at least for me, what does turn me on lies in a region of myself that could not be changed by the nature of my employment. But it seems important (to me even gratifying) that the same region lies untouched and unchanged in a lot of people, and in my innocent way I wonder if it is reason for hope. Hope for what? I don't know. Maybe hope that humanity will always survive civilization.
This paragraph comes on the last page of the book and expresses one of the core tenets of Hugo's view of poetry and poets. He fought in World War 2 as a young man, then worked in a Boeing factory for 15-20 years, and then became an English professor at a university. No matter the day job, he was always a poet. I have never been a poet, but I know quite well the region of which he speaks.
Also: I love the sentence, "Maybe hope that humanity will always survive civilization."
The week after Strange Loop has been a blur of catching up with all the work I didn't do while attending the conference, or at least trying. That is actually good news for my virtual conference: despite attending Strange Loop from the comfort of my basement, I managed not to get sucked into the vortex of regular business going on here.
A few closing thoughts on the conference:
• Speaking of "the comfort of my basement", here is what my Strange Loop conference room looked like:
The big screen is a 29" ultra-wide LG monitor that I bought last year on the blog recommendation of Robert Talbert, which has easily been my best tech purchase of the pandemic. On that screen you'll see vi.to, the streaming platform used by Strange Loop, running in Safari. To its right, I have emacs open on a file of notes and occasionally an evolving blog draft. There is a second Safari window open below emacs, for links picked up from the talks and the conference Slack channels.
On the MacBook Pro to the left, I am running Slack, another emacs shell for miscellaneous items, and a PDF of the conference schedule, marked up with the two talks I'm considering in each time slot.
That set-up served me well. I can imagine using it again in the future.
• Attending virtually has its downsides, but also its upsides. Saturday morning, one attendee wrote in the Slack #virtual-attendees channel:
Virtual FTW! Attending today from a campsite in upstate New York and enjoying the fall morning air
I was not camping, but I experienced my own virtual victories at lunch time, when I was able to go for a walk with my wife on our favorite walking trails.
• I didn't experience many technical glitches at the conference. There were some serious AV issues in the room during Friday's second slot. Being virtual, I was able to jump easily into and out of the room, checking in on another talk while they debugged on-site. In another talk, we virtual attendees missed out on seeing the presenter's slides. The speaker's words turned out to be enough for me to follow. Finally, Will Byrd's closing keynote seemed to drop its feed a few times, requiring viewers to refresh their browsers occasionally. I don't have any previous virtual conferences to compare to, but this all seemed pretty minor. In general, the video and audio feeds were solid and of high fidelity.
• One final note, not related to The Virtual Experience. Like many conferences, Strange Loop has so many good talks that I usually have to choose among two or three talks I want to see in each slot. This year, I kept track of alt-Strange Loop, the schedule of talks I didn't attend but really wanted to. Comparing this list to the list of talks I did attend gives a representative account of the choices I faced. It also would make for a solid conference experience in its own right:
There is a tie for the honor of "talk I most wanted to see but didn't": Wallhermfechtel on creating more inclusive tech spaces and Marcia on crypto steganography. I'll be watching these videos on YouTube some time soon!
As I mentioned in Day 1's post, this year I tried to force myself out of my usual zone, to attend a wider range of talks. Both lists of talks reflect this mix. At heart I am an academic with a fondness for programming languages. The tech talks generally lit me up more. Even so, I was inspired by some of the talks focused on community and the use of technology for the common good. I think I used my two days wisely.
That is all. Strange Loop sometimes gives me the sort of inspiration overdose that Molly Mielke laments in this tweet. This year, though, Strange Loop 2021 gave me something I needed after eighteen months of pandemic (and even more months of growing bureaucracy in my day job): a jolt of energy, and a few thoughts for the future.
I am usually tired on the second day of a conference, and today was no exception. But the day started and ended with talks that kept my brain alive.
• "Poems in an Accidental Language" by Kate Compton -- Okay, so that was a Strange Loop keynote. When the video goes live on YouTube, watch it. I may blog more about the talk later, but for now know only that it included:
• Quantum computing is one of those technical areas I know very little about, maybe the equivalent of a 30-minute pitch talk. I've never been super-interested, but some of my students are. So I attended "Practical Quantum Computing Today" to see what's up these days. I'm still not interested in putting much of my time into quantum computing, but now I'm better informed.
• Before my lunch walk, I attended a non-technical talk on "tech-enabled crisis response". Emma Ferguson and Colin Schimmelfing reported on their experience doing something I'd like to be able to do: spin up a short-lived project to meet a critical need, using mostly free or open-source tools. For three months early in the COVID pandemic, their project helped deliver ~950,000 protective masks from 7,000 donors to 6,000 healthcare workers. They didn't invent new tech; they used existing tools and occasionally wrote some code to connect such tools.
My favorite quote from the talk came when Ferguson related the team's realization that they had grown too big for the default limits on Google Sheets and Gmail. "We thought, 'Let's just pay Google.' We tried. We tried. But we couldn't figure it out." So they built their own tool. It is good to be a programmer.
• After lunch, Will Crichton live-coded a simple API in Rust, using traits (Rust's version of interfaces) and aggressive types. He delivered almost his entire talk within emacs, including an ASCII art opening slide. It almost felt like I was back in grad school!
• In "Remote Workstations for Discerning Artists", Michelle Brenner from Netflix described the company's cloud-based infrastructure for the workstations used by the company's artists and project managers. This is one of those areas that is simply outside my experience, so I learned a bit. At the top level, though, the story is familiar: the scale of Netflix's goals requires enabling artists to work wherever they are, whenever they are; the pandemic accelerated a process that was already underway.
• Eric Gade gave another talk in the long tradition of Alan Kay and a bigger vision for computing. "Authorship Environments: In Search of the 'Personal' in Personal Computing" started by deconstructing Steve Jobs's "bicycle for the mind" metaphor (he's not a fan of what most people take as the meaning) and then moved on to the idea of personal computing as literacy: a new level at which to interrogate ideas, including one's own.
This talk included several inspirational quotes. My favorite was from Adele Goldberg:
There's all these layers in everything we do... We have to learn how to peel.
(I have long admired Goldberg and her work. See this post from Ada Lovelace Day 2009 for a few of my thoughts.)
As with most talks in this genre, I left feeling like there is so much more to be done, but frustrated at not knowing how to do it. We still haven't found a way to reach a wide audience with the empowering idea that there is more to computing than typing into a Google doc or clicking in a web browser.
• The closing keynote was delivered by Will Byrd. "Strange Dreams of Stranger Loops" took Douglas Hofstadter's Gödel, Escher, Bach as its inspiration, fitting both for the conference and for Byrd's longstanding explorations of relational programming. His focus today: generating quines in mini-Kanren, and discussing how quines enable us to think about programs, interpreters, and the strange loops at the heart of GEB.
As with the opening keynote, I may blog more about this talk later. For now I give you two fun items:
Strange Loop 2021 has ended. I "hit the road" by walking upstairs to make dinner with my wife.
On this first day of my first virtual conference, I saw a number of Strange Loop-y talks: several on programming languages and compilers, a couple by dancers, and a meta-talk speculating on the future of conferences.
• I'm not a security guy or a cloud guy, so the opening keynote "Why Security is the Biggest Benefit of Using the Cloud" by AJ Yawn gave me a chance to hear what people in this space think and talk about. Cool trivia: Yawn played a dozen college basketball games for Leonard Hamilton at Florida State. Ankle injuries derailed his college hoops experience, and now he's a computer security professional.
• Richard Marmorstein's talk, "Artisanal, Machine-Generated API Libraries" was right on topic with my compiler course this semester. My students would benefit from seeing how software can manipulate AST nodes when generating target code.
Marmorstein uttered two of the best lines of the day:
I've been working with students all week trying to help them see how an object in their compiler such as a token can help the compiler do its job -- and make the code simpler to boot. Learning software design is hard.
• I learned a bit about the Nim programming language from Aditya Siram. As you might imagine, a language designed at the nexus of Modula/Oberon, Python, and Lisp appeals to me!
• A second compiler-oriented talk, by Richard Feldman, demonstrated how opportunistic in-place mutation, a static optimization, can help a pure functional program outperform imperative code.
• After the talk "Dancing With Myself", an audience member complimented Mariel Pettee on "nailing the Strange Loop talk". The congratulations were spot-on. She hit the technical mark by describing the use of two machine learning techniques, variational autoencoders and graph neural networks. She hit the aesthetic mark by showing how computer models can learn and generate choreography. When the video for this talk goes live, you should watch.
Pettee closed with the expansive sort of idea that makes Strange Loop a must-attend conference. Dance has no universal language for "writing" choreography, and video captures only a single instance or implementation of a dance, not necessarily the full intent of the choreographer. Pettee had expected her projects to show how machine learning can support invention and co-creation, but now she sees how work like this might provide a means of documentation. Very cool. Perhaps CS can help to create a new kind of language for describing dance and movement.
• I attended Laurel Lawson's "Equitable Experiential Access: Audio Description" to learn more about ways in which videos and other media can provide a fuller, more equitable experience to everyone. Equity and inclusion have become focal points for so much of what we do at my university, and they apply directly to my work creating web-based materials for students. I have a lot to learn. I think one of my next steps will be to experience some of my web pages (session notes, assignments, resource pages) solely through a screen reader.
• Like all human activities, traditional in-person conferences offer value and extract costs. Crista Lopes used her keynote closing Day 1 to take a sober look at the changes in their value and their costs in the face of technological advances over the last thirty years.
If we are honest with ourselves, virtual conferences are already able to deliver most of the value of in-person conferences (and, in some ways, provide more value), at much lower cost. The technology of going virtual is the easy part. The biggest challenges are social.
~~~~~
A few closing thoughts as Day 1 closes.
As Crista said, "Taking paid breaks in nice places never gets old." My many trips to OOPSLA and PLoP provided me with many wonderful physical experiences. Being in the same place with my colleagues and friends was always a wonderful social experience. I like driving to St. Louis and going to Strange Loop in person; sitting in my basement doesn't feel the same.
With time, perhaps my expectations will change.
It turns out, though, that "virtual Strange Loop" is a lot like "in-person Strange Loop" in one essential way: several cool new ideas arrive every hour. I'll be back for Day Two.
After a couple of years away, I am attending Strange Loop. 2018 seems so long ago now...
Last Wednesday morning, I hopped in my car and headed south to Strange Loop 2018. It had been a few years since I'd listened to Zen and the Art of Motorcycle Maintenance on a conference drive, so I popped it into the tape deck (!) once I got out of town and fell into the story. My top-level goal while listening to Zen was similar to my top-level goal for attending Strange Loop this year: to experience it at a high level; not to get bogged down in so many details that I lost sight of the bigger messages. Even so, though, a few quotes stuck in my mind from the drive down. The first is an old friend, one of my favorite lines from all of literature:
Assembly of Japanese bicycle require great peace of mind.
The other was the intellectual breakthrough that unified Phaedrus's philosophy:
Quality is not an object; it is an event.
This idea has been on my mind in recent months. It seemed a fitting theme, too, for Strange Loop.
There will be no Drive South in 2021. For a variety of reasons, I decided to attend the conference virtually. The persistence of COVID is certainly one of the big reasons. Alex and the crew at Strange Loop are taking all the precautions one could hope for to mitigate risk, but even so I will feel more comfortable online this year than in rooms full of people from across the country. I look forward to attending in person again soon.
Trying to experience the conference at a high level is again one of my meta-level goals for attending. The program contains so many ideas that are new to me; I think I'll benefit most by opening myself to areas I know little or nothing about and seeing where the talks lead me.
This year, I have a new meta-level goal: to see what it is like to attend a conference virtually. Strange Loop is using Vito as its hub for streaming video and conference rooms and Slack as its online community. This will be my first virtual conference, and I am curious to see how it feels. With concerns such as climate change, public health, and equity becoming more prominent as conference-organizing committees make their plans, I suspect that we will be running more and more of our conferences virtually in the future, especially in CS. I'm curious to see how much progress has been made in the last eighteen months and how much room we have to grow.
This topic is even on the program! Tomorrow's lineup concludes with Crista Lopes speaking on the future of conferences. She's been thinking about and helping to implement conferences in new ways for a few years, so I look forward to hearing what she has to say.
Whatever the current state of virtual conferences, I fully expect that this conference will be a worthy exemplar. It always is.
So, I'm off to Strange Loop for a couple of days. I'll be in my basement.
Monday, I spent a full day at the office. It was my most normal day on campus in 16 months. The university held a visit day for prospective students and their parents, with a large gathering followed by break-out sessions with individual departments. In the afternoon, I interviewed a candidate for a secretary position in our department. Both were fully in person, with no extra distance built into the process.
Scattered among those events were typical online stuff that department heads do. A first-year student I worked with at orientation last month wanted to add marching band to his schedule, so I helped by moving several courses around for him. I evaluated courses from another university for someone who is thinking about transferring here, and then answered several questions from the student in email.
There was also one "new normal" sort of thing: a Zoom meeting with a colleague about possible exchange programs in China and Kosovo. We've learned a lot about working online over the last two years. One valuable lesson is that some meetings are much better done online. When I was a few minutes late, my colleague sat comfortably in her office, working away happily. On a 90-degree day, not having to walk across campus to a meeting is a win.
One sorta new normal thing for me: I'm still wearing a mask while indoors with people and maintaining a bit more distance than usual. We have one month before fall classes begin, fully in person, with no extra distance built into the process. After sixteen months of being careful and staying healthy, this seems like a goofy time to lower my guard more than necessary and catch the virus. (Yes, I'm fully vaccinated.)
You may notice what I didn't mention about Monday: no teaching, no computer science. The same happened Tuesday, which included another campus visit day, another Zoom meeting, and plenty of online file shuffling. One of the side effects of the last couple of years is an expansion of administrative work in the summers. Working with students and recruiting are important as we look to bounce back from low enrollments last year.
The good news is that there is still time left this summer. I did some CS last week, designing a new source language for my compiler course this fall. It was the most fun I've had all summer. This week I'm going to write some code and prep more for class. With any luck, I'll write about that soon, and not go more than two months between posts again!
On teaching, via Robert Talbert:
Look at the course you teach most often. If you had the power to remove one significant topic from that course, what would it be, and why?
I have a high degree of autonomy in most of the courses I teach, so power isn't the limiting factor for me. Time is a challenge to making big changes, of course. Gumption is probably what I need most right now. Summer is a great time for me to think about this, both for my compiler course this fall and programming languages next spring.
On research, via Kris Micinski:
i remember back to Dana Scott's lecture on the history of the lambda calculus where he says, "If Turing were alive today, I don't know what he'd be doing, but it wouldn't be recursive function theory." I think about that a lot.
Now I am, too. Seriously. I'm no Turing, but I have a few years left and some energy to put into something that matters. Doing so will require some gumption to make other changes in my work life first. I am reaching a tipping point.
A couple of weeks ago, a former student emailed me after many years. Felix immigrated to the US from the Sudan back in the 1990s and wound up at my university, where he studied computer science. While in our program, he took a course or two with me, and I supervised his undergrad research project. He graduated and got busy with life, and we lost touch.
He emailed to let me know that he was about to defend his Ph.D. dissertation, titled "Efficient Reconstruction and Proofreading of Neural Circuits", at Harvard. After graduating from UNI, he programmed at DreamWorks Interactive and EA Sports, before going to grad school and working to "unpack neuroscience datasets that are almost too massive to wrap one's mind around". He defended his dissertation successfully this week.
Congratulations, Dr. Gonda!
Felix wrote initially to ask permission to acknowledge me in his dissertation and defense. As I told him, it is an honor to be remembered so fondly after so many years. People often talk about how teachers affect their students' futures in ways that are often hard to see. This is one of those moments for me. Arriving at the end of what has been a challenging semester in the classroom for me, Felix's note boosted my spirit and energizes me a bit going into the summer.
If you'd like to learn more about Felix and his research, here is his personal webpage. The Harvard School of Engineering also has a nice profile of Felix that shows you what a neat person he is.
This is a short story about a student finding something helpful in class and making my day, preceded by a long-ish back story.
In my programming languages course yesterday, I did a session on optimization. It's a topic of some importance, and students are usually interested in what it means for an interpreter or compiler to "optimize" code. I like to show students a concrete example that demonstrates the value of an optimization. Given where we are in the course and the curriculum, though, it would be difficult to do that with a full-featured language such as Python or Java, or even Racket. On the other end of the spectrum, the little languages they have been implementing and using all semester are too simple to benefit from meaningful optimization.
I found a sweet spot in between these extremes with BF. (Language alert!) I suppose it is more accurate to say that Eli Bendersky found the sweet spot, and I found Bendersky's work. Back in 2017, he wrote a series of blog posts on how to write just-in-time compilers, using BF as his playground. The first article in that series inspired me to implement something similar in Python and to adapt it for use with my students.
BF is well-suited for my purposes. It is a very simple language, consisting of only eight low-level operators. It is possible to write a small interpreter for BF that students with only a background in data structures can understand. Even so, the language is Turing complete, which means that we can write interesting and arbitrarily complex programs.
The low-level simplicity of BF combines with its Turing completeness to create programs that are horribly inefficient if they are interpreted in a naive manner. There are many simple ways to optimize BF programs, including creating a jump table to speed up loops and parsing runs of identical opcodes (moves, increments, and decrements) as more efficient higher-level operators. Even better, the code to implement these optimizations is also understandable to a student with only data structures and a little background in programming languages.
My session is built around a pair of interpreters, one written in a naive fashion and the other implementing an optimization. This semester, we preprocessed BF programs to compute a table that makes jumping to the beginning or end of a loop an O(1) operation just like BF's other six primitives. The speed-up on big BF programs, such as factoring large numbers or computing a Mandelbrot set, is impressive.
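For readers who want something concrete, here is a minimal sketch in Python of the jump-table idea. The names and details are mine, not the exact code from my class session or from Bendersky's posts: a preprocessing pass pairs each '[' with its matching ']' once, so the interpreter never scans for a loop partner at run time.

    def build_jump_table(program):
        # Map the index of each '[' to its matching ']' and vice versa,
        # so that a loop jump becomes a single table lookup: O(1).
        jumps = {}
        stack = []
        for pos, op in enumerate(program):
            if op == '[':
                stack.append(pos)
            elif op == ']':
                start = stack.pop()
                jumps[start] = pos
                jumps[pos] = start
        return jumps

    def run(program, data_size=30000):
        # A naive BF interpreter, except that '[' and ']' use the
        # precomputed jump table instead of scanning for their partner.
        jumps = build_jump_table(program)
        data = [0] * data_size
        dp = 0      # data pointer
        ip = 0      # instruction pointer
        output = []
        while ip < len(program):
            op = program[ip]
            if op == '>':
                dp += 1
            elif op == '<':
                dp -= 1
            elif op == '+':
                data[dp] = (data[dp] + 1) % 256
            elif op == '-':
                data[dp] = (data[dp] - 1) % 256
            elif op == '.':
                output.append(chr(data[dp]))
            elif op == ',':
                pass  # input elided in this sketch
            elif op == '[' and data[dp] == 0:
                ip = jumps[ip]
            elif op == ']' and data[dp] != 0:
                ip = jumps[ip]
            ip += 1
        return ''.join(output)

A tiny test: run('++++++++[>++++++++<-]>+.') prints 'A'. The same table-lookup trick is what makes the big factoring and Mandelbrot programs tolerable to run.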
Now to the story.
At the end of class, I talk a bit about esoteric languages more broadly as a way for programmers to test the boundaries of programming language design, or simply to have fun. I get to tell students a story about a four-hour flight back from OOPSLA one year during which I decided to roll a quick interpreter for Ook in Scheme. (What can I say; programming is fun.)
To illustrate some of the fun and show that programmers can be artists, too, I demo programs in the language Piet, which is named for the Dutch abstract painter Piet Mondrian. He created paintings that look like this:
That is not a Mondrian, but it is a legal program in the Piet language. It prints 'Piet'. Here is another legal Piet program:
It prints "Hello, World". Here's another:
That program reads an integer from standard input, determines whether it is prime or not, and prints 'Y' or 'N'. Finally, how about this:
If you are a certain age, you may notice something special about this image: It is made up exclusively of Tetris pieces. The program prints... "Tetris". Programming truly is an art!
One of my students was inspired. While reviewing the session notes, he searched for more information about Piet online and found this interactive editor. He then used it to create a Piet program in honor of a friend of his who passed away earlier this semester. It prints the Xbox gamertag of his late friend. In his email to me, he said that writing this program was therapeutic.
I'm not sure one of my class sessions has ever had a more important outcome. I'm also not sure that I have ever been happier to receive email from a student.
This has been a tough year for most everyone, and especially for students who are struggling with isolation and countermeasures against a nasty virus. I'm so glad that programming gave one student a little solace, at least for an evening. I'm also glad he shared his story with me.
A couple of months ago, I missed a major anniversary and almost didn't notice. Not so today, on an even more important personal landmark.
Ten years ago today, I ran for the last time.
According to my running log, it was an ordinary March morning, cold, damp, but no ice on the ground. I ran one of my favorite routes, an 8.25-mile loop I had been running for fifteen years. It was the first route longer than 5.5 miles I ever designed and ran, beginning and ending at the first house we owned in town. It passed only a few hundred feet from the house we moved to in 2008, so I adapted it and kept running it. It consisted of small neighborhood streets, some urban trail, and a 2/3-mile passage through a wooded area near our new house. I ran this route on lots of Wednesdays in marathon training and lots of Fridays in the off-season when I was running purely for fun. March 4, 2011, was such a day.
There was nothing remarkable about my run that day. I had been coming down with a cold, so my time was unremarkable, too. I recorded my time when I got home and figured I'd run twelve miles on Sunday, as I usually did at this time of year.
Unfortunately, the cold turned worse, and suddenly I was as sick as I had been in a long time. I can't remember if I ever went to the doctor, but this one knocked me down hard for a week and a half. Just as I was ready to start running again, I felt a twinge in my right knee heading out for a walk with my wife. I became a runner, interrupted. I didn't know it at the time, but I would not be running again.
Running had become a big part of my life over the previous 10 or 15 years, and it was a bit of a shock not to be able to enjoy the highs and lows of miles on the road. But we humans are resilient creatures, and I eventually adjusted to the new normal. I occasionally still dream about running, which is, to be honest, glorious. But mostly I get by walking with my wife, riding my bike, and trying to stay fit with other kinds of workouts. Nothing feels like running, though, and nothing has ever made me as fit. As much as I like to bike and walk, I have never thought of myself as a walker or a cyclist. Maybe one day I will.
I think, though, that I will always think of myself as a runner. However, my right knee no longer agrees with me, and I am rational enough to weigh benefits and costs and make the right choice. So I don't run.
Ten years on, that still feels a little odd. As with so many things in life, no one waved a checkered flag at the end of my last run. I didn't know it was my last run until six weeks later, so I ended up grieving a loss that had, in a way, already happened.
I realize that, in the grand scheme of things, this is a minor loss. I've been fortunate my entire life, and if not running is the worst thing that ever happens to me, I will have lived an insanely fortunate life. Still, I miss it and probably always will.
A few weeks ago, a Scott Galloway video clip made the rounds. In it, Galloway was saying something about "finding your passion" that many people have been saying for a long time, only in that style that makes Galloway so entertaining. Here's a great bit of practical advice on the same topic from tech guru Kevin Kelly:
Following your bliss is a recipe for paralysis if you don't know what you are passionate about. A better motto for most youth is "master something, anything". Through mastery of one thing, you can drift towards extensions of that mastery that bring you more joy, and eventually discover where your bliss is.
My first joking thought when I read this was, "Well, maybe not anything..." I mean, I can think of lots of things that don't seem worth mastering, like playing video games. But then I read about professional gamers making hundreds of thousands of dollars a year, so who am I to say? Find something you are good at, and get really good at it. As Galloway says, like Chris Rock before him, it's best to become good at something that other people will pay you for. But mastery of anything opens doors that passion can only bang on.
The key to the "master something, anything" mantra is the next sentence of Kelly's advice. When we master something, our expertise creates opportunities. We can move up or down the hierarchy of activities built from that mastery, or to related domains. That is where we are most likely to find the life that brings us joy. Even better, we will find it in a place where our mastery helps us get through the inevitable drudge work and over the inevitable obstacles that will pop up in our way. I love to program, but some days debugging is a slog, and other days I butt up against thorny problems beyond my control. The good news is that I have skills to get through those days, and I like what I'm doing enough to push on through to the more frequent moments and days of bliss.
Passion is wonderful if you have it, but it's hard to conjure up on its own. Mastering a skill, or a set of skills, is something every one of us can do, and by doing it we can find our way to something that makes us happy.
... my daughter gave to me:
We celebrated Part 2 of our Zoom Family Christmas this morning. A package from one of our daughters arrived in the mail after Part 1 on Christmas Day, so we reprised our celebration during today's weekly call.
My daughter does not read my blog, at least not regularly, but she did search around there for evidence that I might already own these titles. Finding none, she ventured the long-distance gift. It was received with much joy.
I've known about I Am a Strange Loop for over a decade but have never read it. Somehow, Surfaces and Essences flew under my radar entirely. A book that is new to me!
These books will provide me many hours of enjoyment. Like Hofstadter's other books, they will probably bend my brain a bit and perhaps spark some welcome new activity.
Hofstadter appears in this blog most prominently in a set of entries I wrote after he visited my university in 2012:
I did mention I Am a Strange Loop in a later entry after all, a reflection on Alan Turing, representation, and universal machines. I'm glad that entry did not undermine my daughter's gift!
The physicist Leo Szilard once announced to his friend Hans Bethe that he was thinking of keeping a diary: "I don't intend to publish it; I am merely going to record the facts for the information of God." "Don't you think God knows the facts?" Bethe asked. "Yes," said Szilard. "He knows the facts, but He does not know this version of the facts."
I began 2021 by starting to read Disturbing the Universe, Freeman Dyson's autobiographical attempt to explain to people who are not scientists what the human situation looks like to someone who is a scientist. The above passage opens the author's preface.
Szilard's motive seems like a pretty good reason to write a blog: to record one's own version of the facts, for oneself and for the information of God. Unlike Szilard, we have an alternative in between publishing and not publishing. A blog is available for anyone to read, at almost no cost, but ultimately it is for the author, and maybe for God.
I've been using the long break between fall and spring semesters to strengthen my blogging muscle and redevelop my blogging habit. I hope to continue to write more regularly again in the coming year.
Dyson's book is a departure from my recent reading. During the tough fall semester, I found myself drawn to fiction, reading Franny and Zooey by J. D. Salinger, The Bell Jar by Sylvia Plath, The Lucky Ones by Rachel Cusk, and The Great Gatsby by F. Scott Fitzgerald, with occasional pages from André Gide's diary in the downtime between books.
I've written about my interactions with Cusk before [ Outline, Transit, Kudos ], so one of her novels is no surprise here, but what's with those classics from sixty years ago or more? These stories, told by deft and observant writers, seemed to soothe me. They took the edge off of the long days. Perhaps I could have seen a run of classic books coming... In the frustrating summer run-up to fall, I read Thomas Mann's Death in Venice and Ursula Le Guin's The Lathe of Heaven.
For some reason, yesterday I felt the urge to finally pick up Dyson's autobiography, which had been on my shelf for a few months. A couple of years ago, I read most of Dyson's memoir, Maker of Patterns, and found him an amiable and thoughtful writer. I even wrote a short post on one of his stories, in which Thomas Mann plays a key role. At the time, I said, "I've never read The Magic Mountain, or any Mann, for that matter. I will correct that soon. However, Mann will have to wait until I finish Dyson...". 2020 may have been a challenge in many ways, but it gave me at least two things: I read my first Mann (Death in Venice is much more approachable than The Magic Mountain...), and it returned me to Dyson.
Let's see where 2021 takes us.
(This is an almost entirely personal entry. If that's not your thing, feel free to move on.)
Last Monday morning, I sat down and wrote a blog entry. It was no big deal, just an observation from some reading I've been doing. A personal note, a throwaway. I'm doing the same this morning.
I'd forgotten how good that can feel, which tells you something about my summer and fall.
Last spring, I wrote that I would be teaching a new course in the fall. I was pretty excited for the change of pace, even if it meant not teaching my compiler course this year, and was already thinking ahead to course content, possible modes of delivery in the face of Covid-19, and textbooks.
Then came summer.
Some of my usual department head tasks, like handling orientation sessions for incoming first-year students, were on the calendar, but the need to conduct them remotely expanded what used to be a few hours of work each week to a few hours each day. (It must have been even worse for the university staff who organize and run orientation!) Uncertainty due to the pandemic and the indecisiveness of upper administration created new work, such as a seemingly endless discussion of fall schedule, class sizes, and room re-allocation.
One of the effects of all this work was that, when August rolled around, I was not much better prepared to teach my class than I had been in May when I solicited everyone's advice.
Once fall semester started, my common refrain in conversations with friends was, "It feels like I'm on a treadmill." As soon as I finished preparing a week's in-class session, I had to prepare the week's online activity. Then there were homework assignments to write, and grade. Or I had to write an exam, or meet with students to discuss questions or difficulties, made all the more difficult by the stress the pandemic placed on them. I never felt like I could take a day or hour off, and when I did, class was still there in my mind, reminding me of all I had to do before another week began and the cycle began again.
That went on for fourteen weeks. I didn't feel out of sorts so much as simply always busy. It would be over soon enough, my rational mind told me.
When the semester ended at Thanksgiving, the treadmill of new work disappeared and all that was left was the grading. I did not rush that work, letting it spread over most of the week and a half I had before grades were due. I figured that it was time to decompress a bit.
After grades were in and I had time to get back to normal, I noticed some odd things happening. First of all I was sleeping a lot: generous naps most days, and a couple of Fridays where I was in bed for ten hours of rest (followed, predictably, by a nap later in the day). I'm no insomniac by nature, but this was much more sleep than I usually take, or need.
My workout data told a tale of change, too. My elliptical and bike performances had been steadily showing the small improvements of increased capability through May or so. They leveled off into the summer months, when I was able to ride outside more with my wife. Then fall started, and my performance levels declined steadily into November. The numbers started to bounce back in December, and I feel as strong as I've felt in a long while.
I guess fall semester hit me harder than I realized.
In most ways, I feel like I'm back to normal now. I guess we will find out next week, when my attention turns to spring semester, both as department head and instructor. At least I get to teach programming languages, where I have a deep collection of session materials in hand and years of thinking and practice to buoy me up. Even with continued uncertainty due to the pandemic, I'm in pretty good shape.
Another effect of the summer and fall was that my already too-infrequent blogging slowed to a trickle. The fate of most blogs is a lack of drama. For me, blogging tends to flag when I am not reading or doing interesting work. The gradual expansion of administrative duties over the last few years has certainly taken its toll. But teaching a new course usually energizes me and leads to more regular writing here. That didn't happen this fall.
With the semester now over, I have a better sense of the stress I must have been feeling. It affected my sleep, my workouts, and my teaching. It's no surprise that it affected my writing, too.
One of my goals for the coming year is to seek the sort of conscious, intentional awakening of the senses that Gide alludes to in the passage quoted in that blog post. I'm also going to pay better attention to the signs that the treadmill is moving too fast. Running faster isn't always the solution.
I make it a point never to believe anything just because it's widely known to be so. -- Bill James
A few years ago, a friend was downsizing and sent me his collection of The Bill James Abstract from the 1980s. Every so often I'll pick one up and start reading. This week, I picked up the 1984 issue.
It's baseball, so I enjoy it a lot. My #1 team, the Cincinnati Reds, were in the summer doldrums for most of the 1980s but on their way to a surprising 1990 World Series win. My #2 team, the Detroit Tigers, won it all in 1984, with a performance so dominating that it seemed almost preordained. It's fun to reminisce about those days.
It's even more fascinating to watch the development of the scientific method in a new discipline.
Somewhere near the beginning of the 1984 abstract, James announces the stance that underlies all his work: never believe anything just because everyone else says it's true. Scattered through the book are elaborations of this philosophy. He recognizes that understanding the world of baseball will require time, patience, and revision:
In many cases, I have no clear evidence on the issue, no way of answering the question. ... I guess what I'm saying is that if we start trying to answer these questions now, we'll be able to answer them in a few years. An unfortunate side effect is that I'm probably going to get some of the answers wrong now; not only some of the answers but some of the questions.
Being wrong is par for the course for scientists; perhaps James felt some consolation in that this made him like Charles Darwin. The goal isn't to be right today. It is to be less wrong than yesterday. I love that James tells us that, early in his exploration, even some of the questions he is asking are likely the wrong questions. He will know better after he has collected some data.
James applies his skepticism and meticulous analysis to everything in the game: which players contribute the most offense or defense to the team, and how; how pitching styles affect win probabilities; how managers approach the game. Some things are learned quickly but are rejected by the mainstream. By 1984, for example, James and people like him knew that, on average, sacrifice bunts and most attempts to steal a base reduced the number of runs a team scores, which means that most of them hurt the team more than they help. But many baseball people continued to use them too often tactically and even to build teams around them strategically.
At the time of this issue, James had already developed models for several phenomena in the game, refined them as evidence from new seasons came in, and expanded his analysis into new areas. At each step, he channels his inner scientist: look at some part of the world, think about why it might work the way it does, develop a theory and a mathematical model, test the theory with further observations, and revise. James also loves to share his theories and results with the rest of us.
There is nothing new here, of course. Sabermetrics is everywhere in baseball now, and data analytics have spread to most sports. By now, many people have seen Moneyball (a good movie) or read the Michael Lewis book on which it was based (even better). Even so, it really is cool to read what are in effect diaries recording what James is thinking as he learns how to apply the scientific method to baseball. His work helped move an entire industry into the modern world. The writing reflects the curiosity, careful thinking, and humility that so often lead to the goal of the scientific mind:
Yesterday, I mentioned Barney Stinson's legendary rules for running a marathon. Truly dedicated readers of this blog may remember that I first quoted Barney's rules many moons ago when beginning to run again after a short layoff.
Back then, I blogged a fair bit about running, and especially about the mental connections I made among training, running, teaching, and writing software. Alas, my running life ended rather abruptly when the ultimate effects of an old tennis injury became apparent over a decade later.
October this year brought the tenth anniversary of my final marathon, a beautiful morning spent on the flat streets and trails of Des Moines. Such a momentous anniversary deserved a celebration, or at least a recollection in my blog. However, in the blur of my fall semester, it blew right by me, much like the fastest runners on the course that morning.
I did not completely forget the anniversary. Several months ago, it came to mind, and at the time I made a mental note to write a retrospective blog post. Few readers would care enough to read such a post, but this would be for me, a way to remember so many fond moments on the road and to mourn again, ever so briefly, something I lost.
I should have set an alert in my calendar app.
If this were a wedding anniversary, a spouse might have reminded me, but this was all on me. (I say "might" there because my own spouse is not the sort to valorize anniversaries and birthdays.) Like so much blogging not done in the last year or so, the post I intended to write turned out to be vaporware.
Here, on a beautiful sunny morning seven weeks later, I don't feel much need to write about that last marathon again. Running is no longer a part of my day, my life, though occasionally it is still part of my dreams at night. Some days, I miss it more than others. Perhaps the next time I feel that longing, I'll sit down immediately and write, rather than plan to write some months on. That's the best way for me to blog: to try to capture a thought or feeling in the moment and make sense of it then.
On a completely unrelated note: While looking in my blog's image library for the picture posted above, I realized that the date of my last marathon, 10-20-2010, is a palindrome of sorts: 10 20 20 10. I love to play with digits and feel no compulsion to hew closely to dictionary definitions when it comes to the patterns I see. Numbers offer small joys in most moments, if we let them.
From this enlightening article that was being passed around a while back:
Talking posed a challenge for me. While my Mandarin was strong for someone who had grown up in the US, I wasn't fluent enough to express myself in the way I wanted. This had some benefits: I had to think before I spoke. I was more measured. I was a better listener. But it was also frustrating, as though I'd turned into a person who was meek and slow on the uptake. It made me think twice about the Chinese speakers at work or school in the US whom I'd judged as passive or retiring. Perhaps they were also funny, assertive, flirtatious, and profane in their native tongue, as I am in mine.
When people in the US talk about the benefits of learning a second language, they rarely, if ever, mention the empathy one can develop for others who speak and work in a second language. Maybe that's because so few of us Americans learn a foreign language well enough to reach this level of enlightenment.
I myself learned just enough German in school to marvel at the accomplishment of exchange students studying here in their second language, knowing that I was nowhere near ready to live and study in a German-speaking land. Marvel, though, is not quite as valuable in this context as empathy.
People often look at the difference between the highest-rated male chess player in a group and the highest-rated female chess player in the same group and conclude that there is a difference between the abilities of men and women to play chess, despite the fact that there are usually many, many more men in the group than women. But that's not even good evidence that there is an achievement gap. From What Gender Gap in Chess?:
It's really quite simple. Let's say I have two groups, A and B. Group A has 10 people, group B has 2. Each of the 12 people gets randomly assigned a number between 1 and 100 (with replacement). Then I use the highest number in Group A as the score for Group A and the highest number in Group B as the score for Group B. On average, Group A will score 91.4 and Group B 67.2. The only difference between Groups A and B is the number of people. The larger group has more shots at a high score, so will on average get a higher score. The fair way to compare these unequally sized groups is by comparing their means (averages), not their top values. Of course, in this example, that would be 50 for both groups -- no difference!
I love this paragraph. It's succinct and uses only the simplest ideas from probability and statistics. It's the sort of statistics that I would hope our university students learn in their general education stats course. While learning a little math, students can also learn about an application that helps us understand something important in the world.
The experiment described is also simple enough for beginning programmers to code up. Over the years, I've used problems like this with intro programming students in Pascal, Java, and Python, and with students learning Scheme or Racket who need some problems to practice on. I don't know whether learning science supports my goal, but I hope that this sort of problem (with suitable discussion) can do double duty for learners: learn a little programming, and learn something important about the world.
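For what it's worth, here is a quick sketch of that experiment in Python; this is my own toy code, not the article's, and the averages it reports should land near the 91.4 and 67.2 cited above, with some random wobble.

    import random

    def best_of_group(size):
        # Give each member a random score from 1 to 100 and keep the top one.
        return max(random.randint(1, 100) for _ in range(size))

    def average_best(size, trials=100_000):
        # Estimate the expected top score for a group of the given size.
        return sum(best_of_group(size) for _ in range(trials)) / trials

    print(average_best(10))   # tends toward roughly 91.4
    print(average_best(2))    # tends toward roughly 67.2

Comparing the group means instead of the group maxima gives roughly 50 for both groups, which is exactly the comparison the article recommends.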
With educational opportunities like this available to us, we really should be able to turn out graduates who have a decent understanding of why so many of our naive conclusions about the world are wrong. Are we putting these opportunities to good use?
This.
Was the time I spent writing my RSS scripts more than the time I would now spend thinking about the "best" RSS aggregator and reader? Doesn't matter. I enjoyed writing the scripts. I learned new things and got satisfaction out of seeing them run correctly. I get nothing like that out of comparing apps and services.
I concur so strongly not only because he writes about RSS, which I'm on record as supporting and using. I enjoy rolling my own simple software in almost any domain. Simple has a lot of advantages. Under my control has a lot of advantages. But the biggest advantage echoes what Dr. Drang says: Programming is often more fun than the alternative uses of my time.
I program because I like to, and because I can.
As a financial writer for Forbes and the Wall Street Journal, Jason Zweig has received a lot of letters from readers, many of them forcefully suggesting that his columns could be better. In this interview, he speaks calmly about processing negative feedback:
... when you get negative feedback, you have to sort it. You can't just take all negative feedback and throw it in the "I'm not reading this" bucket. You have to go through it. And you have to say: "Is this person, who says I'm wrong, right or wrong?" Because if the person says you're wrong, and is wrong, then how does that hurt you? But if the person who says you're wrong is right, it's devastating to you if you don't listen.
It's not about winning. It's about learning.
I know profs who refuse to read their student assessments because it's too hard emotionally to deal with negative feedback. I understand the temptation... There are semesters when thirty-nine reviews are positive, yet the one negative review lodges itself in my brain and won't let go. Even after decades of teaching, it can be hard to shake off those comments immediately. And when there are many comments that are "constructive" or just plain negative, well, reading the assessments can really demoralize.
But as Zweig says, closing myself off to the feedback is ultimately a losing proposition. Sometimes I assess a comment and decide that it's off the mark, or the result of a singular event or experience and therefore isn't worth sweating over. But what about when the reviewer is right? Or when there's a kernel of truth in an otherwise unnecessarily personal comment? Ignoring the truth doesn't do me any good. I want to get better.
I did not receive student assessments this spring. When the university moved to remote instruction suddenly, the administration and faculty agreed to suspend assessments for the semester, with the idea that teaching and learning would both be a bit bumpier than usual under the extreme conditions. Just before the last week of the term, they agreed to bring optional assessments back purely for the prof's personal use, but by then I had decided to pass. Some of my students provided some helpful feedback, including constructive criticism, all on their own.
I'll actually miss reading my assessments this month, if not the sudden spike in my blood pressure that sometimes accompanies them. Students are usually helpful and surprisingly generous in their evaluations, and I still usually learn a lot from the positive ones and the negative ones alike.
As I closed down my remote class session yesterday, I felt a familiar feeling... That session can be better! I've been using variations of this session, slowly improving it, for a few years now, and I always leave the classroom thinking, "Wait 'til next time." I'm eager to improve it now and iterate, trying it again tomorrow. Alas, tomorrow is another day, with another class session all its own. Next time is next year.
I feel this way about most of the sessions in most of my courses. Yesterday, it occurred to me that this must be what Phil Connors feels like in Groundhog Day.
Phil wakes up every day in the same place and time as yesterday. Part way through the film, he decides to start improving himself. Yet the next morning, there he is again, in the same place and time as yesterday, a little better but still flawed, in need of improvement.
Next spring, when I sit down to prep for this session, it will be like hitting that alarm clock and hearing Sonny and Cher all over again.
I told my wife about my revelation and my longing: If only I could teach this session 10,000 times, I'd finally get it right. You know what she said?
"Think how your students must feel. If they could do that session 10,000 times, they'd feel like they really got it, too."
My wife is wise. My students and I are in this together, getting a little better each day, we hope, but rarely feeling like we've figured all that much out. I'll keep plugging away, Phil Connors as CS prof. "Okay, campers, rise and shine..." Hopefully, today I'll be less wrong than yesterday. I wish my students the same.
Who knows, one of these days, maybe I'll leave a session and feel as Phil does in the last scene of the film, when he wakes up next to his colleague Rita. "Do you know what today is? Today is tomorrow. It happened. You're here." I'm not holding my breath, though.
Early in this Paris Review interview, Ray Bradbury says, "A conglomerate heap of trash, that's what I am." I smiled, because that's what I feel like sometimes, both culturally and academically. Later he confessed something that sealed my sense of kinship with him:
I am a librarian. I discovered me in the library. I went to find me in the library. Before I fell in love with libraries, I was just a six-year-old boy. The library fueled all of my curiosities, from dinosaurs to ancient Egypt.
I was a library kid, too. I owned a few books, but I looked forward to every chance we had to go to the library. My grade school had books in every classroom, and my teachers shared their personal books with those of us who so clearly loved to read. Eventually my mom took me and my siblings to the Marion County public library to get a library card, and the world of books available seemed limitless. When I got to high school, I spent free time before and after classes wandering the stacks, discovering science fiction, Vonnegut and Kafka and Voltaire, science and history. The school librarian got used to finding me in the aisles at times. She became as much a friend as any high school teacher could. So many of my friends have shelves and shelves of books; they talk about their addiction to Amazon and independent bookstores. But almost all of the books I have at home fit in a single bookshelf (at right). One of them is Bradbury's The Martian Chronicles, which I discovered in high school.
I do have a small chess library on another shelf across the room and a few sports books, most from childhood, piled nearby. I tried to get rid of the sports books once, in a fit of Marie Kondo-esque de-cluttering, but I just couldn't. Even I have an attachment to the books I own. Having so few, perhaps my attraction is even stronger than it might otherwise be, subject to some cosmic inverse square law of bibliophilia.
At my office, I do have two walls full of books, mostly textbooks accumulated over my years as a professor. When I retire, though, I'll keep only one bookcase full of those -- a few undergrad CS texts, yes, but mostly books I purchased because they meant something to me. Gödel, Escher, Bach. Metamagical Themas. Models of My Life. A few books about AI. These are books that helped me find me.
After high school, I was fortunate to spend a decade in college as an undergraduate and grad student. I would not trade those years for anything; I learned a lot, made friends with whom I remain close, and grew up. Bradbury, though, continued his life as an autodidact, going to the public library three nights a week for a decade, until he got married.
So I graduated from the library, when I was twenty-seven. I discovered that the library is the real school.
Even though I spent a decade as a student in college and now am a university prof, the library remains my second home. I rarely buy books to this day; I don't remember my last purchase. The university library is next to my office building, and I make frequent trips over in the afternoons. They give me a break from work and a chance to pick up my next read. I usually spend a lot more time there than necessary, wandering the stacks and exploring. I guess I'm still a library kid.
First the good news: after three more sessions, I am less despondent than I was after Week Two. I have taken my own advice from Week One and lowered expectations. After teaching for so many years and developing a decent sense of my strengths and weaknesses in the classroom, this move took me out of my usual groove. It was easy to forget in the rush of the moment not to expect perfection, and not being able to interact with students in the same way created different emotions about the class sessions. Now that I have my balance back, things feel a bit more normal.
Part of what changed things for me was watching the videos I made of our class sessions. I quickly realized that these sessions are no worse than my usual classes! It may be harder for students to pay attention to the video screen for seventy-five minutes in the same way they might pay attention in the classroom, but my actual presentation isn't all that different. That was comforting, even as I saw that the videos aren't perfect.
Another thing that comforted me: the problems with my Zoom sessions are largely the same as the problems with my classroom sessions. I can fall into the habit of talking too much and too long unless I carefully design exercises and opportunities for students to take charge. The reduced interaction channel magnifies this problem slightly, but it doesn't create any new problems in principle. This, too, was comforting.
For example, I notice that some in-class exercises work better than others. I've always known this from my in-person course sessions, but our limited interaction bandwidth really exposes problems that are at the wrong level for where the students are at the moment (for me, usually too difficult, though occasionally too easy). I am also remembering the value of the right hint at the right moment and the value of students interacting and sharing with one another. Improving on these elements of my remote course should result in corresponding improvements when we move back to campus.
I have noticed one new problem: I tend to lose track of time more easily when working with the class in Zoom, which leads me to run short on time at the end of the period. In the classroom, I glance at a big analog clock on the wall at the back of the room and use that to manage my time. My laptop has a digital clock in the corner, but it doesn't seem to help me as much. I think this is a function of two parameters: First, the clock on my computer is less obtrusive, so I don't look at it as often. Second, it is a digital clock. I feel the geometry of analog time viscerally in a way that I don't with digital time. Maybe I'm just old, or maybe we all experience analog clocks in a more physical way.
I do think that watching my lectures can help me improve my teaching. After Week One, I wondered, "In what ways can going online, even for only a month and a half, improve my course and materials?" How might this experience make me a better teacher or lead to better online materials? I have often heard advice that I should record my lectures so that I could watch them with an experienced colleague, with an eye to identifying strengths to build on and weaknesses to improve on. Even without a colleague to help, these few weeks of recording give me a library of sessions I can use for self-diagnosis and improvement.
Maybe this experience will have a few positives to counterbalance its obvious negatives.
This passage from Remembering the LAN recalls an earlier time that feels familiar:
My father, a general practitioner, used this infrastructure of cheap 286s, 386s, and 486s (with three expensive laser printers) to write the medical record software for the business. It was used by a dozen doctors, a nurse, and receptionist. ...
The business story is even more astonishing. Here is a non-programming professional, who was able to build the software to run their small business in between shifts at their day job using skills learned from a book.
I wonder how many hobbyist programmers and side-hustle programmers of this sort there are today. Does programming attract people the way it did in the '70s or '80s? Life is so much easier than typing programs out of Byte or designing your own BASIC interpreter from scratch. So many great projects out on Github and the rest of the web to clone, mimic, adapt. I occasionally hear a student talking about their own projects in this way, but it's rare.
As Crawshaw points out toward the end of his post, the world in which we program now is much more complex. It takes a lot more gumption to get started with projects that feel modern:
So much of programming today is busywork, or playing defense against a raging internet. You can do so much more, but the activation energy required to start writing fun collaborative software is so much higher you end up using some half-baked SaaS instead.
I am not a great example of this phenomenon -- Crawshaw and his dad did much more -- but even today I like to roll my own, just for me. I use a simple accounting system I've been slowly evolving for a decade, and I've cobbled together bits and pieces of my own tax software, not an integrated system, just what I need to scratch an itch each year. Then there are all the short programs and scripts I write for work to make Spreadsheet City more habitable. But I have multiple CS degrees and a lot of years of experience. I'm not a doctor who decides to implement what his or her office needs.
I suspect there are more people today like Crawshaw's father than I hear about. I wish it were more of a culture that we cultivated for everyone. Not everyone wants to bake their own bread, but people who get the itch ought to feel like the world is theirs to explore.
Earlier in the week I read this article by Jason Fried and circled these sentences:
Ultimately this major upheaval is an opportunity. This is a chance for your company, your teams, and individuals to learn a new skill. Working remotely is a skill.
After two weeks of the great COVID-19 school-from-home adventure, I very much believe that teaching remotely is a skill -- one I do not yet have.
Last week I shared a few thoughts about my first week teaching online. I've stopped thinking of this as "teaching online", though, because my course was not designed as an online course. It was pushed online, like so many courses everywhere, in a state of emergency. The result is a course that has been optimized over many years for face-to-face synchronous interaction being taken by students mostly asynchronously, without much face-to-face interaction.
My primary emotion after my second week teaching remotely is disappointment. Switching to this new mode of instruction was simple but not easy, at least not easy to do well. It was simple because I already have so much textual material available online, including detailed lecture notes. For students who can read the class notes, do exercises and homework problems, ask a few questions, and learn on their own, things seem to be going fine so far. (I'll know more as more work comes in for evaluation.) But approximately half of my students need more, and I have not figured out yet how best to serve them well.
I've now hosted four class sessions via Zoom for students who were available at class time and interested or motivated enough to show up. With the exception of one student, they all keep their video turned off, which offers me little or no visual feedback. Everyone keeps their audio turned off except when speaking, which is great for reducing the distraction of noises from everybody's homes and keyboards. The result, though, is an eerie silence that leaves me feeling as if I'm talking to myself in a big empty room. As I told my students on Thursday, it's a bit unnerving.
With so little perceptual stimulus, time seems to pass quickly, at least for me. It's easy to talk for far too long. I'm spared a bit by the fact that my classes intersperse short exposition by me with interactive work: I set them a problem, they work for a while, and then we debrief. This sort of session, though, requires the students to be engaged and up to date with their reading and homework. That's hard for me to expect of them under the best of conditions, let alone now when they are dispersed and learning to cope with new distractions.
After a few hours of trying to present material online, I very much believe that this activity requires skill and experience, and I have little of either at this point. I have a lot of work to do. I hope to make Fried proud and use this as an opportunity to learn new skills.
I had expected by this point to have created more short videos that I could use to augment my lecture notes, for students who have no choice but to work on the course whenever they are free. Time has been in short supply, though, with everything on campus changing all at once. Perhaps if I can make a few more videos and flip the course a bit more, I will both serve those students better and find a path toward using our old class time better for the students who show up then and deserve a positive learning experience.
At the level of nuts and bolts, I have already begun to learn some of the details of Panopto, Zoom, and our e-learning system. I like learning new tools, though the complications of learning them all at once and trying to use them at the same time make me feel like a tech newbie. I guess that never changes.
The good news is that other parts of the remote work experience are going better, sometimes even well. Many administrative meetings work fine on Zoom, because they are mostly about people sharing information and reporting out. Most don't really need to be meetings anyway, and participating via Zoom is an improvement over gathering in a big room. As one administrator said at the end of one meeting recently, "This was the first council meeting online and maybe the shortest council meeting ever." I call that a success.
I saw Robin Sloan's An App Can Be a Home-Cooked Meal floating around Twitter a few days back. It really is quite good; give it a read if you haven't already. This passage captures a lot of the essay's spirit in only a few words:
The exhortation "learn to code!" has its foundations in market value. "Learn to code" is suggested as a way up, a way out. "Learn to code" offers economic leverage, a squirt of power. "Learn to code" goes on your resume.
But let's substitute a different phrase: "learn to cook." People don't only learn to cook so they can become chefs. Some do! But far more people learn to cook so they can eat better, or more affordably, or in a specific way. Or because they want to carry on a tradition. Sometimes they learn just because they're bored! Or even because -- get this -- they love spending time with the person who's teaching them.
Sloan expresses better than I ever have an idea that I blog about every so often. Why should people learn to program? Certainly it offers a path to economic gain, and that's why a lot of students study computer science in college, whether as a major, a minor, or a high-leverage class or two. There is nothing wrong with that. It is for many a way up, a way out.
But for some of us, there is more than money in programming. It gives you a certain power over the data and tools you use. I write here occasionally about how a small script or a relatively small program makes my life so much easier, and I feel bad for colleagues who are stuck doing drudge work that I jump past. Occasionally I'll try to share my code, to lighten someone else's burden, but most of the time there is such a mismatch between the worlds we live in that they are happier to keep plugging along. I can't say that I blame them. Still, if only they could program and use tools that enabled them to improve their work environments...
But... There is more still. From the early days of this blog, I've been open with you all:
Here's the thing. I like to write code.
One of the things that students like about my classes is that I love what I do, and they are welcome to join me on the journey. Just today a student in my Programming Languages course drifted back to my office with me after class, where we ended up talking for half an hour and sketching code on a whiteboard as we deconstructed a vocabulary choice he made on our latest homework assignment. I could sense this student's own love of programming, and it raised my spirits. It makes me more excited for the rest of the semester.
I've had people come up to me at conferences to say that the reason they read my blog is that they like to see someone enjoying programming as much as they do. Many of them share links with their students as if to say, "See, we are not alone." I look forward to days when I will be able to write in this vein more often.
Sloan reminds us that programming can be -- is -- more than a line on a resume. It is something that everyone can do, and want to do, for a lot of different reasons. It would be great if programming "were marbled deeply into domesticity and comfort, nerdiness and curiosity, health and love" in the way that cooking is. That is what makes Computing for All really worth doing.
Maybe people don't tell stories only to make sense of the world, but rather sometimes to deceive themselves?
It was an interesting idea, I said, that the narrative impulse might spring from the desire to avoid guilt, rather than from the need -- as was generally assumed -- to connect things together in a meaningful way; that it was a strategy calculated, in other words, to disburden ourselves of responsibility.
This is from Kudos, by Rachel Cusk. Kudos is the third book in an unconventional trilogy, following Outline and Transit. I blogged on a passage from Transit last semester, about making something that is part of who you are.
I have wanted to recommend Cusk and these books, but I do not feel up to the task of describing how or why I think so highly of them. They are unorthodox narratives about narrative. To me, Cusk is a mesmerizing story-teller who intertwines stories about people and their lives with the fabric of story-telling itself. She seems to value the stories we tell about ourselves, and yet to see through them, to some overarching truth.
As for my own narrative impulse, I think of myself as writing posts for this blog in order to make connections among the many things I learn -- or at least that is what I tell myself. Cusk has me taking seriously the idea that some of the stories I tell may come from somewhere else.
Over the holiday, I realized something about myself.
A simple test exists for determining when I have reached the enlightenment of the Buddha: I will be able to fly without tension or complaint over delayed flights, airline tickets and policies, and the TSA.
The good news: I survived another trip. Special thanks to Felipe at United Airlines customer service for doing his best to help with a delayed flight on Christmas Eve, to Nic R. at the Cedar Rapids Airport for solving our problem, and to Joy, the flight attendant on the first leg of our journey, for a soothing excursion.
I did more than survive, though. Once in Boston, my wife and I had a wonderful time visiting our daughters for the week between Christmas Eve and New Year's Eve! I wrote once about my older daughter leaving for college. Both are graduated now, making their way out in the world as adults. This was our first holiday away from our home -- what used to be their home -- and in their new homes. I enjoyed spending time with them on their turf and on their terms. We talked about their futures, but also routine life: recipes, movies, and life in the city. On the last day, my older daughter and I made candles. It was an unexpected joy.
For us, a trip to Boston includes visits to a museum whenever possible. This trip included three: the Harvard art museums, the Isabella Stewart Gardner, and the MFA. My daughters showed me paintings they have discovered on their own visits, and I shared with them ones that I like. I usually discover a work or two on each visit that grab my eye for the first time, or again in a new way. On my first visit to the Harvard museum, one painting really grabbed me: "Leander's Tower on the Bosporus", by Sanford Robinson Gifford.
I've been to the MFA a few times and have a few favorites. It has a large collection of work by John Singer Sargent, a Bostonian who had a long relationship with the museum. This time, his painting "The Master and His Pupils" drew me in as it had not before:
Our evenings often brought movies. We saw a new studio release, Star Wars: The Rise of Skywalker, in the theater; a made-for-BBC movie about Agatha Christie; and a movie that has flown under my radar for twenty years, GalaxyQuest. I enjoyed all three! The Star Wars film has its flaws, but it was a fun and appropriate end to a series that has spanned most of my life. We were lucky to stumble upon the Christie film while scrolling Netflix; it felt a lot like her mystery novels. I was surprised by how much I enjoyed GalaxyQuest. So much fun! How had I not seen it before?
The trip ended as it began, with an unexpected delay that stretched an already long travel day. Our time on the ground in Chicago at least offered a consolation designed for a programmer: a computer malfunction that echoed our delay. Javascript for the travel-weary.
Again, though, there was good news: lots of time to walk with my wife, which was a good way to spend the day, and a good way to end our trip.
Now, back to working on my enlightenment...
Earlier this year, many people were passing around this article about "re-learning how to be yourself online". Near the end the author reaches a set of questions that are motivating him:
Here I was retreating from the web because I thought my online presence was unimportant and inconsequential. Meanwhile, a foreign power was using its resources to pretend to be someone like me to try to influence someone like me. What kind of influence does that mean I really have? What kind of influence does that mean each of us has? And who fills that vacuum if we fail to fill it ourselves?
Given the size of my following on Twitter and the size of my blog's readership, it's easy for me to think my online presence is unimportant or inconsequential. When time gets tight and work crushes me down or other interests call, feeling that my writing is inconsequential can be all it takes to keep me from writing. I have to remind myself that that has never been why I tweet or blog.
With so much of modern life happening online, sometimes it can feel as if online writing has much higher stakes than it really does, especially for someone with my limited audience. It's worth reminding ourselves that tweeting and blogging can be simple reflections of who we are, nothing more and nothing less. The stakes don't have to be high, and our influence can be small. That's okay. Writing has its own benefits, and connecting with readers, however few there might be, is a bonus.
I am unlikely ever to write much about politics here, so my influence is unlikely to fill a space coveted by foreign powers. Whatever influence I may have will come from being myself. I need to overcome the inertia of busy days and make time to write.
Actually, that is not quite right. Much as I mention in my first blog entry ever, linked above, I have amassed a folder of ideas for blog entries. I also have a single org-mode file containing entries to write. Some, like the item that became this post, consist only of a quoted passage or some other trigger and a single idea waiting to be expanded. The most depressing items in the file are a seemingly endless collection of partially-written posts, some nearly finished, that never quite crossed the finish line.
So: I need to overcome inertia and make time to finish. That is easier said than done some days. Writing takes time, but finishing often takes a surprising amount of time. Finishing also means putting the words out in the world for others to read. That feels risky for many different reasons, and our minds can trick us into thinking we are better off saving the file somewhere and never finishing. But I like to think and write, and connecting with readers, however few there might be, is a bonus.
Oh, and as we in America enter a long weekend dedicated to gratitude, I thank all of you who are still reading my blog after all these years. I appreciate that you spend even a few of your scarce minutes reading what I write. Indeed, I marvel at it. I hope you find it time well spent.
I remember first learning as a student that some infinities are bigger than others. For some sets of numbers, it was easy to see how. The set of integers is infinite, and the set of real numbers is infinite, and it seemed immediately clear that there are fewer integers than reals. Demonstrations and proofs of the fact were cool, but I already knew what they showed me.
Other relationships between infinities were not so easy to grok. Consider: There are an infinite number of points on a sheet of paper. There are an infinite number of points on a wall. These infinities are equal to one another. But how? Mathematician Yuri Manin demonstrates how:
I explained this to my grandson, that there are as many points in a sheet of paper as there are on the wall of the room. "Take the sheet of paper, and hold it so that it blocks your view of the wall completely. The paper hides the wall from your sight. Now if a beam of light comes out of every point on the wall and lands in your eye, it must pass through the sheet of paper. Each point on the wall corresponds to a point on the sheet of paper, so there must be the same number of each."
I remember reading that explanation in school and feeling both amazed and enlightened. What sorcery is this? So simple, so beautiful. Informal proofs of this sort made me want to learn more mathematics.
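For readers who want to see the correspondence written down, here is a minimal formal sketch of the idea behind Manin's argument, in my notation rather than his:

    % Place the eye at a point O, with the sheet of paper held between O and
    % the wall so that it blocks the wall completely.
    \[
      p \colon \mathrm{Wall} \to \mathrm{Paper}, \qquad
      p(w) \;=\; \text{the unique point where the segment } \overline{Ow}
                 \text{ meets the paper}.
    \]
    % Distinct wall points lie on distinct rays through O, so p is one-to-one;
    % every point of the paper lies on a ray that continues on to some point
    % of the wall, so p is onto. A bijection exists, so the two sets of points
    % have the same cardinality.

The formalism adds nothing to the image of the paper blocking the wall, of course; it only gives the correspondence a name.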
Manin told the story quoted above in an interview a decade or so ago with Mikhail Gelfand, We Do Not Choose Mathematics as Our Profession, It Chooses Us. It was a good read throughout and reminded me again how I came to enjoy math.
The narrator in Rachel Cusk's "Transit" relates a story told to her by Pavel, the Polish builder who is helping to renovate her flat. Pavel left Poland for London to make money after falling out with his father, a builder for whom he worked. The event that prompted his departure was a reaction to a reaction. Pavel had designed and built a home for his family. After finishing, he showed it to his father. His father didn't like it, and said so. Pavel chose to leave at that moment.
'All my life,' he said, 'he criticise. He criticise my work, my idea, he say he don't like the way I talk -- even he criticise my wife and my children. But when he criticise my house' -- Pavel pursed his lips in a smile -- 'then I think, okay, is enough.'
I generally try to separate myself from the code and prose I write. Such distance is good for the soul, which does not need to be buffeted by criticism, whether external or internal, of the things I've created. It is also good for the work itself, which is free to be changed without being anchored to my identity.
Fortunately, I came out of home and school with a decent sense that I could be proud of the things I create without conflating the work with who I am. Participating in writers' workshops at PLoP conferences early in my career taught me some new tools for hearing feedback objectively and focusing on the work. Those same tools help me to give feedback better. I use them in an effort to help my students develop as people, writers and programmers independent of the code and prose they write.
Sometimes, though, we make things that are expressions of ourselves. They carry part of us in their words, in what they say to the world and how they say it. Pavel's house is such a creation. He made everything: the floors, the doors, and the roof; even the beds his children slept in. His father had criticized his work, his ideas, his family before. But criticizing the house he had dreamed and built -- that was enough. Cusk doesn't give the reader a sense that this criticism was a last straw; it was, in a very real way, the only straw that mattered.
I think there are people in this world who would like just once in their lives to make something that is so much a part of who they are that they feel about it as Pavel does his house. They wish to do so despite, or perhaps because of, the sharp line it would draw through the center of life.
In a YC Female Founder Story, Danielle Morrill gives a wise answer to an old question:
Q: What do you wish someone had told you when you were 15?
I think people were telling me a lot of helpful things when I was 15 but it was very hard to listen.
This may seem more like a wry observation than a useful bit of wisdom. The fifteen-year-olds of today are no more likely to listen to us than we were to listen to adults when we were fifteen. But that presumes young people have more to learn than the rest of us. I'm a lot older than 15, and I still have plenty to learn.
Morrill's answer is a reminder to me to listen more carefully to what people are telling me now. Even now that can be hard, with all the noise out there and with my own ego getting in my way. Setting up my attention systems to identify valuable signals more reliably can help me learn faster and make me a lot more productive. It can also help future-me not want to look back wistfully so often, wishing someone had told me now what I know then.
I hope to eventually write up a reflection on my first Dagstuhl seminar, but for now I have a short story about how I encountered a new idea three times in ten days, purely by coincidence. Actually, the idea is over one hundred fifty years old but, as my brother often says, "Hey, it's new to me."
On the second day of Dagstuhl, Mark Guzdial presented a poster showing several inspirations for his current thinking about task-specific programming languages. In addition to displaying screenshots of two cool software tools, the poster included a picture of an old mechanical device that looked both familiar and strange. Telegraphy had been invented in the early 1840s, and telegraph operators needed some way to type messages. But how? The QWERTY keyboard was not created for the typewriter until the early 1870s, and no other such devices were in common use yet. To meet the need, Royal Earl House adapted a portion of a piano keyboard to create the input device for the "printing telegraph", or teleprinter. The photo on Mark's poster looked similar to the one on the Wikipedia page for the teleprinter.
There was a need for a keyboard thirty years before anyone designed a standard typing interface, so telegraphers adapted an existing tool to fit their needs. What if we are in that same thirty-year gap in the design of programming languages? This has been one of Mark's inspirations as he works with non-computer scientists on task-specific programming languages. I had never seen an 1870s teleprinter before and thought its keyboard to be a rather ingenious way to solve a very specific problem with a tool borrowed from another domain.
When Dagstuhl ended, my wife and I spent another ten days in Europe on a much-needed vacation. Our first stop was Paris, and on our first full day there we visited the museum of the Conservatoire National des Arts et Métiers. As we moved into the more recent exhibits of the museum, what should I see but...
... a Hughes teleprinter with piano-style keyboard, circa 1875. Déjà vu! I snapped a photo, even though the device was behind glass, and planned to share it with Mark when I got home.
We concluded our vacation with a few days in Martinici, Montenegro, the hometown of a department colleague and his wife. They still have a lot of family in the old country and spend their summers there working and relaxing. On our last day in this beautiful country, we visited its national historical museum, which is part of the National Museum of Montenegro in the royal capital of Cetinje. One of the country's most influential princes was a collector of modern technology, and many of his artifacts are in the museum -- including:
This full-desk teleprinter was close enough to touch and examine up close. (I didn't touch!) The piano keyboard on the device shows the wear of heavy use, which brings to mind each of my laptops' keyboards after a couple of years. Again, I snapped a photo, this time in fading light, and made a note to pass it on.
In ten days, I went from never having heard much about a "printing telegraph" to seeing a photo of one, hearing how it is an inspiration for research in programming language design, and then seeing two such devices that had been used in the 19th-century heyday of telegraphy. It was an unexpected intersection of my professional and personal lives. I must say, though, that having heard Mark's story made the museum pieces leap into my attention in a way that they might not have otherwise. The coincidence added a spark to each encounter.
A few months back, Mark Guzdial began to ponder a new research question:
I did some literature searches, and found a highly relevant paper: "Task specific programming languages as a first programming language." And the lead author is... me. I wrote this paper with Allison Elliott Tew and Mike McCracken, and published it in 1997. I honestly completely forgot that I had written this paper 22 years ago. Guzdial-past knew things that Guzdial-present does not.
I know this feeling too well. It seems that whenever I look back at an old blog post, especially from the early years, I am surprised to have already thought something, and usually to have thought it better and more deeply than I'm thinking it now! Perhaps this says something about the quality of my thinking now, or the quality of my blogging then. Or maybe it's simply an artifact of time and memory. In any case, stumbling across a link to an ancient blog entry often leads to a few moments of pleasure after an initial bit of disorientation.
On a related note, the fifteenth anniversary of my first blog post passed while I was at Dagstuhl earlier this month. For the first few years, I regularly wrote twelve to twenty posts a month. Then for a few years I settled into a pattern of ten to twelve monthly. Since early 2017, though, I've been in the single digits, with fewer substantial entries. I'm not giving Eugene-2025 much material to look back on.
With a new academic year soon upon us, I hope to write a bit more frequently and a bit more in depth about my programming, my teaching, and my encounters with computer science and the world. I think that will be good for me in many ways. Sometimes, knowing that I will write something encourages me to engage more deeply than I might otherwise. Nearly every time, the writing helps me to make better sense of the encounter. That's one way to make Eugene-Present a little smarter.
As always, I hope that whoever is still reading here finds it worth their time, too.
In You Are Here, Ben Hunt writes:
You know what I miss most about the world before Amazon? I miss going to the library and looking up a book in the card catalog, searching the stacks for the book in question, and then losing myself in the experience of discovery AROUND the book I was originally searching for. It's one of the best feelings in the world, and I'm not sure that my children have ever felt it. I haven't felt it in at least 20 years.
My daughters, now in their mid-20s, have felt it. We were a library family, not a bookstore family or an Amazon family. Beginning as soon as they could follow picture books, we spent countless hours at the public library in our town and the one in the neighboring city. We took the girls to Story Time and to other activities, but mostly we went to read and wander and select a big stack of books to take home. The books we took home never lasted as long as we thought they would, so back we'd go.
I still wander the stacks myself, both at the university library and, less often these days, the local public libraries. I always start with a few books in mind, recommendations gathered from friends and articles I've read, but I usually bring home an unexpected bounty. Every year I find a real surprise or two, books I love but would never have known about if I hadn't let myself browse. Even when I don't find anything surprising to take home, it's worth the time I spend just wandering.
Writing a little code often makes my day better. So does going to the library. Walking among books, starting with a goal and then aimlessly browsing, calms me on days I need calming and invigorates me on days when my energy is down. Some days, it does both at the same time. Hunt is right: It's one of the best feelings in the world. I hope that whatever else modern technology does for our children, it gives them something to rival this feeling.
At one point in the novel "Outline", by Rachel Cusk, a middle-aged man relates a conversation that he had with his elderly mother, in which she says:
I could weep just to think that I'll never see you again as you were at the age of six -- I would give anything, she said, to meet that six-year-old one more time.
This made me think of two photographs I keep on the wall at my office, of my grown daughters when they were young. In one, my older daughter is four; in the other, my younger daughter is two. Every once in a while, my wife asks why I don't replace them with something newer. My answer is always the same: They are my two favorite pictures in the world. When my daughters were young, they seemed to be infinite bundles of wonder: always curious, discovering things and ideas everywhere they went, making connections. They were restless in a good way, joyful, and happy. We can be all of these things as we grow into adulthood, but I experienced them so much differently as a father, watching my girls live them.
I love the people my daughters are now, and are becoming, and cherish my relationship with them. Yet a small part of me, like the old woman in Cusk's story, would love to meet those little girls again. When I see one of my daughters these days, she is both that little girl, grown up, and not that little girl, a new person shaped by her world and by her own choices. The photographs on my wall keep alive memories not just of a time but also of specific people.
As I thought about Cusk's story, it occurred to me that the idea of "her and not her" does not apply only to my daughters, or to my wife, old pictures of whom I enjoy with similar intensity. I am me and not me.
I'm both the little guy who loved to read encyclopedias and shoot baskets every day, and not him. I'm not the same guy who walked into high school in a new city excited about the possibilities it offered and nervous about how I would fit in, yet I grew out of him. I am at once the person who started college as an architecture major -- who from the time he was eight years old had wanted to be an architect -- and not him. I'm not the same person who defended a Ph.D. dissertation half a life ago, but who I am owes a lot to him. I am both the man my wife married and not, being now the man that man has become.
And, yes, the father of those little girls pictured on my wall: me and not me. This is true in how they saw me then and how they see me now.
I'm not sure how thinking about this distinction will affect future me. I hope that it will help me to appreciate everyone in my life, especially my daughters and my wife, a bit more for who they are and who they have been. Maybe it will even help me be more generous to 2019 me.
From David Lebovitz:
The best way to repair your knives is not to damage them in the first place.
I think one can probably replace "knives" with the name of any tool and have a good piece of advice. It may even apply to software.
Over the years, like many of you, I have gone through phases in which I was enamored with productivity pr0n. I also have an interest in good pens and notebooks, though not nearly to the level of some of my friends. Lately for me, though, it has been cooking that has captured my attention. My twice-weekly adventures in the kitchen are so conspicuous among my family that now, whenever my daughters go to cool places like India and Europe, they bring me back native spices as gifts.
... which accounts for why I might be quoting a blog on knives. I've become so aware of the utility of kitchen knives, and their feel in my hand, that I'm reading about them and thinking about making a purchase or two.
Every programmer knows that a good tool can make all the difference in how we feel when we work and in the quality of what we create. That's true in the kitchen, too.
This morning, while riding the exercise bike, I read two items within twenty minutes or so that formed a nice juxtaposition for our age. First came The Cost of Distraction, an old blog post by L.M. Sacasas that reconsiders Kurt Vonnegut's classic story, "Harrison Bergeron" (*). In the story, it is 2081, and the Handicapper General of the United States ensures equality across the land by offsetting any advantages any individual has over the rest of the citizenry. In particular, those of above-average intelligence are required to wear little earpieces that periodically emit high-pitched sounds to obliterate any thoughts in progress. The mentally- and physically-gifted Harrison rebels, to an ugly end.
Soon after came Ian Bogost's Apple's AirPods Are an Omen, an article from last year that explores the cultural changes that are likely to ensue as more and more people wear AirPods and their ilk. ("Apple's most successful products have always done far more than just make money, even if they've raked in a lot of it....") AirPods free the wearer in so many ways, but they also bind us to ubiquitous distraction. Will we ever have a free moment to think deeply when our phones and laptops now reside in our heads?
As Sacasas says near the end of his post,
In the world of 2081 imagined by Vonnegut, the distracting technology is ruthlessly imposed by a government agency. We, however, have more or less happily assimilated ourselves to a way of life that provides us with regular and constant distraction. We have done so because we tend to see our tools as enhancements.
Who needs a Handicapper General when we all walk down to the nearest Apple Store or Best Buy and pop distraction devices into our own ears?
Don't get me wrong. I'm a computer scientist, and I love to program. I also love the productivity my digital tools provide me, as well as the pleasure and comfort they afford. I'm not opposed to AirPods, and I may be tempted to get a pair someday. But there's a reason I don't carry a smart phone and that the only iPod I've ever owned is a 1GB first-gen Shuffle. Downtime is valuable, too.
(*) By now, even occasional readers know that I'm a big Vonnegut fan who wrote a short eulogy on the occasion of his death, nearly named this blog after one of his short stories, and returns to him frequently.
This morning I read Tyler Cowen's conversation with Paul Romer. At one point, Romer talks about being introduced to C.S. Peirce, who had deep insights into "abstraction and how we use abstraction to communicate" (a topic Romer and Cowen discuss earlier in the interview). Romer is clearly enamored with Peirce's work, but he's also fascinated by the fact that, after a long career thinking about a set of topics, he could stumble upon a trove of ideas that he didn't even know existed:
... one of the joys of reading -- that's not a novel -- but one of the joys of reading, and to me slightly frightening thing, is that there's so much out there, and that a hundred years later, you can discover somebody who has so many things to say that can be helpful for somebody like me trying to understand, how do we use abstraction? How do we communicate clearly?
But the joy of scholarship -- I think it's a joy of maybe any life in the modern world -- that through reading, we can get access to the thoughts of another person, and then you can sample from the thoughts that are most relevant to you or that are the most powerful in some sense.
This process, he says, is the foundation for how we transmit knowledge within a culture and across time. It's how we grow and share our understanding of the world. This is a source of great joy for scholars and, really, for anyone who can read. It's why so many people love books.
Romer's interest in Peirce calls to mind my own fascination with his work. As Romer notes, Peirce had a "much more sophisticated sense about how science proceeds than the positivist sort of machine that people describe". I discovered Peirce through an epistemology course in graduate school. His pragmatic view of knowledge, along with William James's views, greatly influenced how I thought about knowledge. That, in turn, redefined the trajectory by which I approached my research in knowledge-based systems and AI. Peirce and James helped me make sense of how people use knowledge, and how computer programs might.
So I feel a great kinship with Romer in his discovery of Peirce, and the joy he finds in scholarship.
In yesterday's post, I mentioned re-reading Richard Hamming's 1986 talk, You and Your Research. Hamming himself found it useful to manage his own behavior in order to overcome his personal faults, in service of his goal to do great work. I have faults, too, and need occasional reminders to approach my work more intentionally.
I've been at low ebb recently with my own creative work, so there is plenty of low-hanging fruit to be picked after this read. In the short term, I plan to...
I'm also our department head, an administrative role that diverts much of my attention and energy from doing computer science. Hamming doesn't dismiss "management" outright, as so many scientists do. That's heartening, because organizations need good leaders to help create the conditions in which scientists do great work. He even explains why a capable scientist might reasonably choose to become a manager: "The day your vision, what you think needs to be done, is bigger than what you can do single-handedly, then you have to move toward management."
When I became head, I had some ideas about our department that I wanted to help implement from a leadership position. Do I still have such ideas that I need to drive forward? If so, then I need to focus my administrative work on those goals. If not, then I need to think about next steps.
I like this passage from John Urschel Goes Pro, about the former NFL player who is pursuing a Ph.D. in math:
The world thinks mathematicians are people for whom math is easy. That's wrong. Sure, some kids, like Urschel, have little trouble with school math. But everyone who starts down the road to creating really new mathematics finds out what Urschel did: It's a struggle. A prickly, sometimes lonely struggle whose rewards are uncertain and a long time coming. Mathematicians are the people who love that struggle.
It's cliché to tell kids to "find their passion". That always seems to me like an awful lot of pressure to put on young adults, let alone teenagers. I meet with potential CS majors frequently, both college students and high school students. Most haven't found their passion yet, and as a result many wonder if there is something wrong with them. I do my best to assure them that, no, there is nothing wrong with them. It's an unreasonable expectation placed on them by a world that, usually with good intentions, is trying to encourage them.
I don't think there is anything I'd rather be than a computer scientist, but I did not walk a straight path to being one. Some choices early on were easy: I like biology as a body of knowledge, but I never liked studying biology. That seemed a decent sign that maybe biology wasn't for me. (High-school me didn't understand that there might be a difference between school biology and being a biologist...) But other choices took time and a little self-awareness.
From the time I was eight years old or so, I wanted to be an architect. I read about architecture; I sent away for professional materials from the American Institute of Architects; I took courses in architectural drafting at my high school. (There was an unexpected benefit to taking those courses: I got to meet a lot of people who were not part of my usual academic crowd.) Then I went off to college to study architecture... and found that, while I liked many things about the field, I didn't really like to do the grunt work that is part of the architecture student's life, and when the assigned projects got more challenging, I didn't really enjoy working on them.
But I had enjoyed working on the hard projects I'd encountered in my programming class back in high school. They were challenges I wanted to overcome. I changed my major and dove into college CS courses, which were full of hard problems -- but hard problems that I wanted to solve. I didn't mind being frustrated for an entire semester one year, working in assembly language and JCL, because I wanted to solve the puzzles.
Maybe this is what people mean when they tell us to "find our passion", but that phrase seems pretty abstract to me. Maybe instead we should encourage people to find the hard problems they like to work on. Which problems do you want to keep working on, even when they turn out to be harder than you expected? Which kinds of frustration do you enjoy, or at least are willing to endure while you figure things out? Answers to these very practical questions might help you find a place where you can build an interesting and rewarding life.
I realize that "Find your passion" makes for a more compelling motivational poster than "What hard problems do you enjoy working on?" (and even that's a lot better than "What kind of pain are you willing to endure?"), but it might give some people a more realistic way to approach finding their life's work.
For my convenience and yours, here are all of my Strange Loop 2018 posts:
... and a few parting thoughts of the non-technical variety:
I think that's all from Strange Loop 2018. It was fun.
Friday was a long day, but a good one. The talks I saw were a bit more diverse than on Day One: a couple on language design (though even one of those covered a lot more ground than that), one on AI, one on organizations and work-life, and one on theory:
• "All the Languages Together", by Amal Ahmed, discussed a problem that occurs in multi-language systems: when code written in one language invalidates the guarantees made by code written in the other. Most languages are not designed with this sort of interoperability baked in, and their FFI escape hatches make anything possible within foreign code. As a potential solution, Ahmed offered principled escape hatches designed with specific language features in mind. The proposed technique seems like it could be a lot of work, but the research is in its early stages, so we will learn more as she and her students implement the idea.
This talk is yet another example of how so many of our challenges in software engineering are a result of programming language design. It's good to see more language designers taking issues like these seriously, but we have a long way to go.
• I really liked Ashley Williams's talk on the evolution of async in Javascript and Rust. This kind of talk is right up my alley... Williams invoked philosophy, morality, and cognitive science as she reviewed how two different language communities incorporated asynchronous primitives into their languages. Programming languages are designed, to be sure, but they are also the result of "contingent turns of history" (à la Foucault). Even though this turned out to be more of a talk about the Rust community than I had expected, I enjoyed every minute. Besides, how can you not like a speaker who says, "Yes, sometimes I'll dress up as a crab to teach."?
(My students should not expect a change in my wardrobe any time soon...)
• I also enjoyed "For AI, by AI", by Connor Walsh. The talk's subtitle, "Freedom & Evolution of the Algopoetic Avant-Garde", was a bit disorienting, as was its cold open, but the off-kilter structure of the talk was easy enough to discern once Walsh got going: first, a historical review of humans making computers write poetry, followed by a look at something I didn't know existed... a community of algorithmic poets — programs — that write, review, and curate poetry without human intervention. It's a new thing, of Walsh's creation, that looks pretty cool to someone who became drunk on the promise of AI many years ago.
I saw two other talks the second day:
I wish I had more to say about the last talk but, with commitments at home, the long drive beckoned. So, I departed early, sadly, hopped in my car, headed west, and joined the mass exodus that is St. Louis traffic on a Friday afternoon. After getting past the main crush, I was able to relax a bit with the rest of Zen and the Art of Motorcycle Maintenance.
Even a short day at Strange Loop is a big win. This was the tenth Strange Loop, and I think I've been to five, or at least that's what my blog seems to tell me. It is awesome to have a conference like this in Middle America. We who live here benefit from the opportunities it affords us, and maybe folks in the rest of the world get a chance to see that not all great computing ideas and technology happen on the coasts of the US.
When is Strange Loop 2019?
This talk, the first of the afternoon on Day 1, opened with a familiar image: René Magritte's "this is not a pipe" painting, next to a picture of an actual pipe from some e-commerce site. Throughout the talk, speaker David Schmüdde returned to the distinction between thing and referent as he looked at the phenomenon of software users who used -- misused -- software to do something other than intended by the designer. The things they did were, or became, art.
First, a disclaimer: David is a former student of mine, now a friend, and one of my favorite people in the world. I still have in my music carousel a CD labeled "Schmudde Music!!" that he made for me just before he graduated and headed off to a master's program in music at Northwestern.
I often say in my conference reports that I can't do a talk justice in a blog entry, but it's even more true of a talk such as this one. Schmüdde demonstrated multiple works of art, both static and dynamic, which created a vibe that loses most of its zing when linearized in text. So I'll limit myself here to a few stray observations and impressions from the talk, hoping that you'll be intrigued enough to watch the video when it's posted.
Art is a technological endeavor. Rembrandt and hip hop don't exist without advances in art-making technology.
Misuse can be a form of creative experimentation. Check out Jodi, a website created in 1995 and still available. In the browser, it seems to be a work of ASCII art, but show the page source... (That's a lot harder these days than it was in 1995.) Now that is ASCII art.
Schmüdde talked about another work from the same era, entitled Rain. It used slowness -- of the network, of the browser -- as a feature. Old HTML (or was it a bug in an old version of Netscape Navigator?) allowed one HEAD tag in a file with multiple BODY tags. The artist created such a document; its bodies, loaded in sequence, gave the appearance of rain falling in the browser. Misusing the tools under the conditions of the day enabled the artist to create an animation before animated GIFs, Flash, and other forms of animation existed.
The talk continued with examples and demos of other forms of software misuse, which could:
Accidental misuse is life. We expect it. Intentional misuse is, or can be, art. It can surprise us.
What does art preservation look like for these works? The original hardware and software systems often are obsolete or, more likely, gone. To me, this is one of the great things about computers: we can simulate just about anything. Digital art preservation becomes a matter of simulating the systems or interactions that existed at the time the art was created. We are back to Magritte's pipe... This is not a work of art; it is a pointer to a work of art.
It is, of course, harder to recreate the experience of the art from the time it was created, but isn't this true of all art? Each of us experiences a work of art anew each time we encounter it. Our experience is never the same as the experience of those who were present when the work was first unveiled. It's often not even the same experience we ourselves had yesterday.
Schmüdde closed with a gentle plea to the technologists in the room to allow more art into their process. This is a new talk, and he was a little concerned about his ending. He may find a less abrupt way to end in the future, but to be honest, I thought what he did this time worked well enough for the day.
Even taking my friendship with the speaker into account, this was the talk of the conference for me. It blended software, users, technology, ideas, programming, art, the making of things, and exploring software at its margins. These ideas may appear at the margin, but they often lie at the core of the work. And even when they don't, they surprise us or delight us or make us think.
This talk was a solid example of what makes Strange Loop a great conference every year. There were a couple of other talks this year that gave me a similar vibe, for example, Hannah Davis's "Generating Music..." talk on Day 1 and Ashley Williams's "A Tale of Two asyncs" talk on Day 2. The conference delivers top-notch technical content but also invites speakers who use technology, and explore its development, in ways that go beyond what you find in most CS classrooms.
For me, Day One of the conference ended better than most: over a beer with David at Flannery's with good conversation, both about ideas from his talk and about old times, families, and the future. A good day.
Last Wednesday morning, I hopped in my car and headed south to Strange Loop 2018. It had been a few years since I'd listened to Zen and the Art of Motorcycle Maintenance on a conference drive, so I popped it into the tapedeck (!) once I got out of town and fell into the story. My top-level goal while listening to Zen was similar to my top-level goal for attending Strange Loop this year: to experience it at a high level; not to get bogged down in so many details that I lost sight of the bigger messages. Even so, though, a few quotes stuck in my mind from the drive down. The first is an old friend, one of my favorite lines from all of literature:
Assembly of Japanese bicycle require great peace of mind.
The other was the intellectual breakthrough that unified Phaedrus's philosophy:
Quality is not an object; it is an event.

This idea has been on my mind in recent months. It seemed a fitting theme, too, for Strange Loop.
On the first day of the conference, I saw mostly a mixture of compiler talks and art talks, including:
• @mraleph's "Six Years of Dart", in which he reminisced on the evolution of the language, its ecosystem, and its JIT. I took at least one cool idea from this talk. When he compared the performance of two JITs, he gave a histogram comparing their relative performances, rather than an average improvement. A new system often does better on some programs and worse on others. An average not only loses information; it may mislead. (A toy illustration follows this list.)
• Jason Dagit's "Your Secrets are Safe with Julia", about a system that explores the use of homomorphic encryption to compile secure programs. In this context, the key element of security is privacy. As Dagit pointed out, "trust is not transitive", which is especially important when it comes to sharing a person's health data.
• I just loved Hannah Davis's talk on "Generating Music From Emotion". She taught me about data sonification and its various forms. She also demonstrated some of her attempts to tease multiple dimensions of human emotion out of large datasets and to use these dimensions to generate music that reflects the data's meaning. Very cool stuff. She also showed the short video Dragon Baby, which made me laugh out loud.
• I also really enjoyed "Hackett: A Metaprogrammable Haskell", by Alexis King. I've read about this project on the Racket mailing list for a few years and have long admired King's ability in posts there to present complex ideas clearly and logically. This talk did a great job of explaining that Haskell deserves a powerful macro system like Racket's, that Racket's macro system deserves a powerful type system like Haskell's, and that integrating the two is more challenging than simply adding a stage to the compiler pipeline.
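Here is the toy illustration promised above, for the point about averages in the Dart talk. The numbers are invented, not data from the talk; they only show how a mean speedup can look like a clear win even when half the programs regress.

    # Toy illustration (invented numbers): per-benchmark speedups of a
    # hypothetical "new" JIT relative to an "old" one. Values > 1.0 are
    # faster, values < 1.0 are slower.
    speedups = [2.5, 1.8, 1.4, 1.1, 1.0, 0.9, 0.8, 0.7, 0.6, 0.5]

    mean = sum(speedups) / len(speedups)
    print(f"average speedup: {mean:.2f}x")   # reads as a 13% win overall

    # A crude text histogram shows the real shape of the data.
    buckets = {"<1.0x (slower)": 0, "1.0-1.5x": 0, ">1.5x": 0}
    for s in speedups:
        if s < 1.0:
            buckets["<1.0x (slower)"] += 1
        elif s <= 1.5:
            buckets["1.0-1.5x"] += 1
        else:
            buckets[">1.5x"] += 1
    for label, count in buckets.items():
        print(f"{label:>15} | {'#' * count}")

The single number says "13% faster"; the histogram shows that five of the ten programs got slower.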
I saw two other talks the first day:
I had almost forgotten how many different kinds of cool ideas I can encounter in a single day at Strange Loop. Thursday was a perfect reminder.
In an interview at The Great Discontent, designer John Gall is asked, "What kind of legacy do you hope to leave?" He replies:
I have no idea; it's not something I think about. It's the thing one has the least control over. I just hope that my kids will have nice things to say about me.
I admire this answer.
No one is likely to ask me about my legacy; I'm just an ordinary guy. But it has always seemed strange when people -- presidents, artists, writers, film stars -- are asked this question. The idea that we can or should craft our own legacy like a marketing brand seems venal. We should do things because they matter, because they are worth doing, because they make the world better, or at least better than it would be without us. It also seems like a waste of time. The simple fact is that most of us won't be remembered long beyond our deaths, and only then by close family members and friends. Even presidents, artists, writers, and film stars are mostly forgotten.
To the extent that anyone will have a legacy, it will be decided in the future by others. As Gall notes, we don't have much control over how that will turn out. History is full of people whose place in the public memory turned out much differently than anyone might have guessed at the time.
When I am concerned that I'm not using my time well, it's not because I am thinking of my legacy. It's because I know that time is a precious and limited resource and I feel guilty for wasting it.
About the most any of us can hope is that our actions in this life leave a little seed of improvement in the world after we are gone. Maybe my daughters and former students and friends can make the world better in part because of something in the way I lived. If that's what people mean by their legacy, great, but it's likely to be a pretty nebulous effect. Not many of us can be Einstein or Shakespeare.
All that said, I do hope my daughters have good things to say about me, now and after I'm gone. I love them, and like them a lot. I want to make their lives happier. Being remembered well by them might also indicate that I put my time on Earth to good use.
There are two spiritual dangers in not owning a farm. One is the danger of supposing that breakfast comes from the grocer, and the other that heat comes from the furnace.
The remedy for the first, according to Aldo Leopold, is to grow a garden, preferably in a place without the temptation and distraction of a grocery store. The remedy for the second is to "lay a split of good oak on the andirons" and let it warm your body "while a February blizzard tosses the trees outside".
I ran across Leopold's A Sand County Almanac in the local nature center late this summer. After thumbing through the pages during a break in a day-long meeting indoors, I added it to my long list of books to read. My reading list is actually a stack, so there was some hope that I might get to it soon -- and some danger that it would be buried before I did.
Then an old high school friend, propagating a meme on Facebook, posted a picture of the book and wrote that it had changed his life, changed how he looked at the world. That caught my attention, so I anchored it atop my stack and checked a copy out of the university library.
It now serves as a quiet read for this city boy on a dark and rainy three-day weekend. There are no February blizzards here yet, of course, but autumn storms have lingered for days. In an important sense, I'm not a "city boy", as my big-city friends will tell me, but I've lived my life mostly sheltered from the reality of growing my own food and heating my home by a wonderful and complex economy of specialized labor that benefits us all. It's good to be reminded sometimes of that good fortune, and also to luxuriate in the idea of experiencing a different kind of life, even if only for a while.
If you don't sit facing the window, you could be in any town.
I read that line this morning in Maybe the Cumberland Gap just swallows you whole, where it is a bittersweet observation of the similarities among so many dying towns across Appalachia. It's a really good read, mostly sad but a little hopeful, that applies beyond one region or even one country.
My mind is self-centered, though, and immediately reframed the sentence in a way that cast light on my good fortune.
I just downloaded a couple of papers on return-oriented programming so that I can begin working with an undergraduate on an ambitious research project. I have a homework assignment to grade sitting in my class folder, the first of the semester. This weekend, I'll begin to revise a couple of lectures for my compiler course, on NFAs and DFAs and scanning text. As always, there is a pile of department work to do on my desk and in my mind.
I live in Cedar Falls, Iowa, but if I don't sit facing the window, I could be in Ames or Iowa City, East Lansing or Durham, Boston or Berkeley. And I like the view out of my office window very much, thank you, so I don't even want to trade.
Heading into a three-day weekend, I realize again how fortunate I am. Do I put my good fortune to good enough use?
I just finished David Mamet's Three Uses of the Knife, a wide-ranging short book with the subtitle: "on the nature and purpose of drama". It is an extended essay on how we create and experience drama -- and how these are, in the case of great drama, the same journey.
Even though the book is only eighty or so pages, Mamet characterizes drama in so many ways that you'll have to either assemble a definition yourself or accept the ambiguity. Among them, he says that the job of drama and art is to "delight" us and that "the cleansing lesson of the drama is, at its highest, the worthlessness of reason."
Mamet clearly believes that drama is central to other parts of life. Here's a cynical example, about politics:
The vote is our ticket to the drama, and the politician's quest to eradicate "fill in the blank", is no different from the promise of the superstar of the summer movie to subdue the villain -- both promise us diversion for the price of a ticket and a suspension of disbelief.
As a reader, I found myself using the book's points to ruminate about other parts of life, too. Consider the first line of the second essay:
The problems of the second half are not the problems of the first half.
Mamet uses this to launch into a consideration of the second act of a drama, which he holds equally to be a consideration of writing the second act of a drama. But with fall semester almost upon us, my thoughts jumped immediately to teaching a class. The problems of teaching the second half of a class are quite different from the problems of teaching the first half. The start of a course requires the instructor to lay the foundation of a topic while often convincing students that they are capable of learning it. By midterm, the problems include maintaining the students' interest as their energy flags and the work of the semester begins to overwhelm them. The instructor's energy -- my energy -- begins to flag, too, which echoes Mamet's claim that the journey of the creator and the audience are often substantially the same.
A theme throughout the book is how people immerse themselves in story, suspending their disbelief, even creating story when they need it to soothe their unease. Late in the book, he connects this theme to religious experience as well. Here's one example:
In suspending their disbelief -- in suspending their reason, if you will -- for a moment, the viewers [of a magic show] were rewarded. They committed an act of faith, or of submission. And like those who rise refreshed from prayers, their prayers were answered. For the purpose of the prayer was not, finally, to bring about intercession in the material world, but to lay down, for the time of the prayer, one's confusion and rage and sorrow at one's own powerlessness.
This all makes the book sound pretty serious. It's a quick read, though, and Mamet writes with humor, too. It feels light even as it seems to be a philosophical work.
The following paragraph wasn't intended as humorous but made me, a computer scientist, chuckle:
The human mind cannot create a progression of random numbers. Years ago computer programs were created to do so; recently it has been discovered that they were flawed -- the numbers were not truly random. Our intelligence was incapable of creating a random progression and therefore of programming a computer to do so.
This reminded me of a comment that my cognitive psychology prof left on the back of an essay I wrote in class. He wrote something to the effect of, "This paper gets several of the particulars incorrect, but then that wasn't the point. It tells the right story well." That's how I felt about this paragraph: it is wrong on a couple of particulars, but it advances the important story Mamet is telling ... about the human propensity to tell stories, and especially to create order out of our experiences.
Oh, and thanks to Anna Gát for bringing the book to my attention, in a tweet to Michael Nielsen. Gát has been one of my favorite new follows on Twitter in the last few months. She seems to read a variety of cool stuff and tweet about it. I like that.
I spent a couple of hours this morning at a roundtable discussion listening to area tech employers talk about their work and their companies' needs. It was pretty enjoyable (well, except perhaps for the CEO who too frequently prefaced his remarks with "What the education system needs to understand is ..."). To a company, they all place a lot of value on the projects that job candidates have done. Their comments reminded me of an old MAA blog post in which a recent grad said:
During the fall of my junior year, I applied for an internship at Red Ventures, a data analytics and technology company just outside Charlotte. Throughout the rigorous interview process, it wasn't my GPA that stood out. I stood out among the applicants, in part, because I was able to discuss multiple projects I had taken ownership of and was extremely passionate about.
I encourage this mentality in my students, though I think "passionate about" is too strong a condition (not to mention cliché). Students should have a few projects that they are interested in, or proud of, or maybe just completed.
Most of the students taking my compiler course this fall won't be applying for a compiler job when they graduate, but they will have written a compiler as part of a team. They will have met a spec, collaborated on code, and delivered a working product. That is evidence of skill, to be sure, but also of hard work and persistence. It's a significant accomplishment.
The students who take our intelligent systems course or our real-time embedded systems course will be able to say the same thing. Some students will also be able to point to code they wrote for a club or personal projects. The key is to build things, care about them, and "deliver", whatever that means in the context of that particular project.
I made note of one new piece of advice to give our students, offered by a former student I mentioned in a blog post many years ago who is now head of a local development team for mobile game developer Jam City: Keep all the code you write. It can be a GitHub repo, as many people now recommend, but it doesn't have to be. A simple zip file organized by courses and projects can be enough. Such a portfolio can show prospective employers what you've done, how you've grown, and how much you care about the things you make. It can say a lot.
You might even want to keep all that code for Future You. I'm old enough that it was difficult to keep digital copies of all the code I wrote in college. I have a few programs from my undergrad days and a few more from grad school, which have migrated across storage media as time passed, but I am missing much of my class work as a young undergrad and all of the code I wrote in high school. I sometimes wish I could look back at some of that code...
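For the terminally concrete, here is a minimal sketch in Python of bundling such a portfolio into a single archive. The folder layout is hypothetical, just one way to organize by course and project:

    # A minimal sketch: zip up a coursework folder organized by course and
    # project. The "coursework" path and course name below are made up.
    import shutil
    from pathlib import Path

    portfolio = Path.home() / "coursework"   # e.g., coursework/compilers/project3/
    archive = shutil.make_archive("my-portfolio", "zip", root_dir=portfolio)
    print("wrote", archive)

A GitHub repository does the same job with history attached, but even this much is enough to show Future You, or a prospective employer, what you have built.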
This weekend, my family headed to St. Paul, Minnesota, to celebrate my younger daughter's graduation from college. The day served as a bookend to the day, seven years ago this fall, when I dropped my older daughter off at college. Two beginnings, two endings -- and the beginnings they created.
Graduation day was emotional for me in a different way. It is a wonderful gift to hear faculty and friends say such good things about your child. This is a young woman I've always known to be special, and now we know some of the ways the rest of the world appreciates her. They appreciate some of the same things I appreciate, but they also know her in ways I do not and so can appreciate her in ways I don't always have access to. Another gift.
Off into the world she goes to do her thing. To be honest, though, she's been out in the world for a long time doing her thing and making it a better place. It's one of the things I admire so in her, and in her big sister. I enjoy admiring my daughters as much as I do.
With both daughters out of college, I will miss the time we've spent visiting their college campuses. I tried to savor this weekend more knowing as I do how much I missed my older daughter's campus after she graduated. Of course, now I'll get to visit them in places like Boston and Minneapolis and get to know these cities better, through their eyes. Yet another gift.
It's been a tough semester. On top of the usual business, there have been a couple of extra stresses. First, I've been preparing for the departure of a very good friend, who is leaving the university and the area for family and personal reasons. Second, a good friend and department colleague took an unexpected leave that turned into a resignation. Both departures cast a distant pall over my workdays. This week, though, has offered a few positive notes to offset the sadness.
Everyone seems to complain about email these days, and I certainly have been receiving and sending more than usual this semester, as our students and I adjust to the change in our faculty. But sometimes an email message makes my day better. Exhibit 1, a message from a student dealing with a specific issue:
Thank you for your quick and helpful response!
Things don't look so complicated or hopeless now.
Exhibit 2, a message from a student who has been taming the bureaucracy that arises whenever two university systems collide:
I would like to thank you dearly for your prompt and thorough responses to my numerous emails. Every time I come to you with a question, I feel as though I am receiving the amount of respect and attention that I wish to be given.
Compliments like these make it a lot easier to muster the energy to deal with the next batch of email coming in.
There has also been good news on the student front. I received email from a rep at a company in Madison, Wisconsin, where one of our alumni works. They are looking for developers to work in a functional programming environment and are having a hard time filling the positions locally, despite the presence of a large and excellent university in town. Our alum is doing well enough that the company would like to hire more from our department, which is doing a pretty good job, too.
Finally, today I spoke in person with two students who had great news about their futures. One has accepted an offer to join the Northwestern U. doctoral program and work in the lab of Kenneth Forbus. I studied Forbus's work on qualitative reasoning and analogical reasoning as a part of my own Ph.D. work and learned a lot from him. This is a fantastic opportunity. The other student has accepted an internship to work at PlayStation this summer, working on the team that develops the compilers for its game engines. He told me, "I talked a lot about the project I did in your course last semester during my interview, and I assume that's part of the reason I got an offer." I have to admit, that made me smile.
I had both of these students in my intro class a few years back. They would have succeeded no matter who taught their intro course, or the compiler course, for that matter, so I can't take any credit for their success. But they are outstanding young men, and I have had the pleasure of getting to know them over the last four years. News of the next steps in their careers makes me feel good, too.
I think I have enough energy to make it to the end of the semester now.
Theoretical physicist Marcelo Gleiser, in The More We Know, the More Mystery There Is:
But even if we did [bring the four fundamental forces together in a common framework], and it's a big if right now, this "unified theory" would be limited. For how could we be certain that a more powerful accelerator or dark matter detector wouldn't find evidence of new forces and particles that are not part of the current unification? We can't. So, dreamers of a final theory need to recalibrate their expectations and, perhaps, learn a bit of epistemology. To understand how we know is essential to understand how much we can know.
People are often surprised to hear that, in all my years of school, my favorite course was probably PHL 440 Epistemology, which I took in grad school as a cognate to my CS courses. I certainly enjoyed the CS courses I took as a grad student, and as an undergrad, too, but my study of AI was enhanced significantly by courses in epistemology and cognitive psychology. The prof for PHL 440, Dr. Rich Hall, became a close advisor to my graduate work and a member of my dissertation committee. Dr. Hall introduced me to the work of Stephen Toulmin, whose model of argument influenced my work immensely.
I still have the primary volume of readings that Dr. Hall assigned in the course. Looking back now, I'd forgotten how many of W.V.O. Quine's papers we'd read... but I enjoyed them all. The course challenged most of my assumptions about what it means "to know". As I came to appreciate different views of what knowledge might be and how we come by it, my expectations of human behavior -- and my expectations for what AI could be -- changed. As Gleiser suggests, to understand how we know is essential to understanding what we can know, and how much.
Gleiser's epistemology meshes pretty well with my pragmatic view of science: it is descriptive, within a particular framework and necessarily limited by experience. This view may be why I gravitated to the pragmatists in my epistemology course (Peirce, James, Rorty), or perhaps the pragmatists persuaded me better than the others.
In any case, the Gleiser interview is a delightful and interesting read throughout. His humble view of science may get you thinking about epistemology, too.
... and, yes, that's the person for whom a quine in programming is named. Thanks to Douglas Hofstadter for coining the term and for giving us programming nuts a puzzle to solve in every new language we learn.
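If you haven't run into the term: a quine is a program that reproduces its own source code. Here is the folklore Scheme version -- my example, not anything from Hofstadter or Quine -- which evaluates to an exact copy of the expression you feed it:

    ;; evaluating this expression yields the expression itself
    ((lambda (x) (list x (list 'quote x)))
     '(lambda (x) (list x (list 'quote x))))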
Seven years ago, I went out for my last run. I didn't know at the time that it would be my last run. A month or so later, I noted that I had been sick for a couple of weeks and then sore for a couple of weeks. After another four weeks, I reported that my knee wasn't going to get better in a way that would enable me to run regularly again. That was it.
My knee is better now in most important ways, though. A simple fix wasn't possible, but a more involved surgery was successful. Today, I walk a lot, especially with my wife, ride a bike a lot, again especially with my wife, and otherwise live a normal physical life. The repaired knee is not as mobile or responsive as my other knee but, all things considered, life is pretty good.
Even so, I miss running. A couple of years ago, I wrote that even five years on, I still dreamed about running occasionally. I'll be up early some morning, see a sunrise, and think, "This would make for a great run." Sometimes, when I go out after a snowfall, I'll remember what it was like to be the first person running on fresh snow out on the trails, under ice- or snow-covered branches. I miss that feeling, and so many others. I still enjoy sunrises and new snow, of course, but that enjoyment has long been tangled up with the feel of running: the pumping lungs, the long strides, the steady flow of scenery. Walking and biking have never given me the same feeling.
My orthopedic surgeon was worried that I would be like a lot of former runners and not stay "former", but I've been pretty well-behaved. In seven years I have rarely broken into even the slowest of trots, to cross a street or hurry to class. The doctor explained to me the effects of running on my reconstructed knee, the risk profile associated with contact sports, and what contact would likely mean for the future of the knee. As emotional as I can seem about running, I'm much too rational to throw caution out the door for the brief thrill of a run. So I don't run.
Even so, I often think back to the time I was rehabilitating my knee after surgery. Our athletic department has a therapy pool with an underwater treadmill, and my therapist had me use it to test my endurance and knee motion. The buoyancy of the water takes enough pressure off the legs that the impact on the knee doesn't damage the joint. I think I can achieve the same effect in the ocean, so the next time I get to a coast, I may try an underwater run. And I dream of getting rich enough to install one of those therapy pools in my house. I may not be a runner anymore, but I'm adaptable and perfectly willing to enjoy the benefits of technology.
In Material as Metaphor, the artist Anni Albers talks about how she came to choose media in which she worked:
How do we choose our specific material, our means of communication? "Accidentally". Something speaks to us, a sound, a touch, hardness or softness, it catches us and asks us to be formed. We are finding our language, and as we go along we learn to obey their rules and their limits. We have to obey, and adjust to those demands. Ideas flow from it to us and though we feel to be the creator we are involved in a dialogue with our medium. The more subtly we are tuned to our medium, the more inventive our actions will become. Not listening to it ends in failure.
This expresses much the way I feel about different programming languages and styles. I can like them all, and sometimes do! I go through phases when one style speaks to me more than another, or when one language seems to be in sync with how I am thinking. When that happens, I find myself wanting to learn its rules, to conform so that I can reach a point where I feel creative enough to solve interesting problems in the language.
If I find myself not liking a language, it's usually because I'm not listening to it; I'm fighting back. When I first tried to learn Haskell, I refused to bend to its style of functional programming. I had worked hard to grok FP in Scheme, and I was so proud of my hard-won understanding that I wanted to impose it on the new language. Eventually, I retreated for a while, returned more humbly, and finally came to appreciate Haskell, if not master it deeply.
My experience with Smalltalk went differently. One summer I listened to what it was telling me, slowly and patiently, throwing code away and starting over several times on an application I was trying to build. This didn't feel like a struggle so much as a several-month tutoring session. By the end, I felt ideas flowing through me. I think that's the kind of dialogue Albers is referring to.
If I want to master a new programming language, I have to be willing to obey its limits and to learn how to use its strengths as leverage. This can be a conscious choice. It's frustrating when that doesn't seem to be enough.
I wish I could always will myself into the right frame of mind to learn a new way of thinking. Albers reminds us that often a language speaks to us first. Sometimes, I just have to walk away and wait until the time is right.
This morning I read an interview with Steven Soderbergh, which is mostly about his latest project, the app/series, "Mosaic". A few weeks ago, "Mosaic" was released as an app in advance of its debut on HBO. Actually, that's not quite right. It is an app, which will also be released later in series form, and then only because Soderbergh needed money to finish the app version of the film. In several places, he talks about being a director in ways that made me think of being a university professor these days.
One was in terms of technology. There are moments in the "Mosaic" app when it offers the viewer an opportunity to digress and read a document, to flash back, or to flash forward. The interviewer is intrigued by the notion that a filmmaker would be willing to distract the viewer in this way, sending texts and pushing notifications that might disrupt the experience. Soderbergh responded:
My attitude was, "Look, we've gotten used to watching TV now with three scrolling lines of information at the bottom of the screen all the time. People do not view that stuff the same way that they would have viewed it 20 years ago."
... To not acknowledge that when you're watching something on your phone or iPad that there are other things going on around you is to be in denial. My attitude is, "Well, if they're going to be distracted by something, let it be me!"
I'm beginning to wonder if this wouldn't be a healthier attitude for us to have as university instructors. Maybe I should create an app that is my course and let students experience the material using what is, for them, a native mode of interaction? Eventually they'll have to sit down and do the hard work of solving problems and writing code, but they could come to that work in a different way. There is a lot of value in our traditional modes of teaching and learning, but maybe flowing into our students' daily experience with push notifications and teaser posts would reach them in a different way.
Alas, I doubt that HBO will front me any money to make my app, so I'll have to seek other sources of financing.
On a more personal plane, I was struck by something that Soderbergh said about the power directors have over the people they work with:
What's also interesting, given the environment we're in right now, is that I typically spend the last quarter of whatever talk I'm giving [to future filmmakers] discussing personal character, how to behave, and why there should be some accepted standard of behavior when you interact with people and how you treat people. Particularly when you're in a position like that of a director, which is an incredibly powerful situation to be in, pregnant with all kinds of opportunity to be abusive.
... if you're in a position of power, you can look at somebody sideways and destroy their week, you know? You need to be sensitive to the kind of power that a director has on a set.
It took me years as a teacher to realize the effect that an offhand remark could have on a student. I could be lecturing in class, or chatting with someone in my office, and say something about the course, or about how I work, or about how students work or think. This sentence, a small part of a larger story, might not mean all that much to me, and yet I would learn later that it affected how the student felt about himself or herself for a week, or for the rest of the course, or even longer. This effect can be positive or negative, of course, depending on the nature of the remark. As Soderbergh says, it's worth thinking about how you behave when you interact with people, especially when you're in a position of relative authority, in particular as a teacher working with young people.
This applies to our time as parents and spouses, too. Some of my most regrettable memories over the years are of moments in which I made an offhand remark, thoughtlessly careless, that cut deep into the heart of my wife or one of my daughters. Years later, they rarely remember the moment or the remark, but I'm sad for the pain I caused in that moment and for any lingering effect it may have. The memory is hard for me to shake. I have to hope that the good things I have said and done during our time together meant as much. I can also try to do better now. The same holds true for my time working with students.
In this conversation, the interviewer asked Maurice Sendak, then age eighty-three, how long he could work in one stretch. The illustrator said that two hours was a long stretch for him.
Because I'm older, I get tired and nap. I love napping. Working and napping and reading and seeing my students. They're awfully nice. They're young and they're hopeful.
I'm not quite eighty-three, but I agree with every sentence in Sendak's answer. I could do worse than be as productive and as cantankerous for as long as he was.
Yesterday, I wrote an entire blog entry that I rm'ed before posting. I had read a decent article on a certain flavor of primitive obsession and design alternatives, which ended with what I thought was a misleading view of object-oriented programming. My blog entry set out to right this wrong.
In a rare moment of self-restraint, I resisted the urge to correct someone who was wrong on the internet. There is no sense in subjecting you to that. Instead, I'll just say that I like both OOP and functional programming, and use both regularly. I remain, in my soul, object-oriented.
On a more positive note, this morning I read an old article that made me smile, Why I'm Productive in Clojure. I don't use Clojure, but this short piece brought to mind many of the same feelings in me, but about Smalltalk. Interestingly, the sources of my feelings are similar to the author's: the right amount of syntax, facilities for meta-programming, interactive development. The article gave me a feeling that is the opposite of schadenfreude: pleasure from the pleasure of others. Some Buddhists call this mudita. I felt mudita after reading this blog entry. Rock on, Clojure dude.
(photo: from Forbes St., Pittsburgh, PA)
I had a couple of hours yesterday between the end of the CS education summit and my shuttle to the airport. Rather than sit in front of a computer for two more hours, I decided to take advantage of my location, wander over to the Carnegie Mellon campus, and take a leisurely walk through the Gates Center for Computer Science. I'm glad I did.
At the beginning of my tour, I was literally walking in circles, from the ground-level entrance shown in its Wikipedia photo up to where the CS offices seem to begin, up on the fourth floor. This is one of those buildings that looks odd from the outside and is quite confusing on the inside, at least to the uninitiated. But everyone inside seemed to feel at home, so maybe it works.
It didn't take long before my mind was flooded by memories of my time as a doctoral student. Michigan State's CS program isn't as big as CMU's, but everywhere I looked I saw familiar images: students sitting in their labs or in their offices, one or two or six at a time, hacking code on big monitors, talking shop, or relaxing. The modern world was on display, too, with students lounging in comfy chairs or sitting in a little coffee shop, laptops open and earbuds in place. That was never my experience as a student, but I know it now as a faculty member.
I love to wander academic halls, in any department, really, and read what is posted on office doors and hallway walls. At CMU, I encountered the names of several people whose work I know and admire. They came from many generations... David Touretzky, whose Lisp textbook taught me a few things about programming. Jean Yang, whose work on programming languages I find cool. (I wish I were going to SPLASH later this month...) Finally, I stumbled across the office of Manuel Blum, the 1995 Turing Award winner. There were a couple of posters outside his door showing the work of his students on problems of cryptography and privacy, and on the door itself were several comic strips. The punchline of one read, "I'll retire when it stops being fun." On this, even I am in sync with a Turing Award winner.
Everywhere I turned, something caught my eye. A pointer to the Newell/Simon bridge... Newell and Simon, the team, were like the Pied Piper to me when I began my study of AI. A 40- or 50-page printout showing two old researchers (Newell and Simon?) playing chess. Plaques in recognition of big donations that had paid for classrooms, labs, and auditoria, made by Famous People who were either students or faculty in the school.
CMU is quite different from my school, of course, but there are many other schools that give off a similar vibe. I can see why people want to be at an R-1, even if they aspire to be teachers more than research faculty. There is so much going on. People, labs, sub-disciplines, and interdisciplinary projects. Informal talks, department seminars, and outside speakers. Always something going on. Ideas. Energy.
On the ride to the airport later in the day, I sat in some slow, heavy traffic going one direction and saw slower, heavier traffic going in the other. As much as I enjoyed the visit, I was glad to be heading home.
This graph illustrates one of the problems that afflicts me as a writer. Too often, I don't have the confidence (or gumption) to start writing until I reach the X. By that time in the learning cycle, downhill momentum is significant. It's easier not to write, either because I figure what I have to say is old news or because my mind has moved on to another topic.
I am thankful that other people share their learning at the top of the curve.
~~~~
Sarah Perry created the above image for one of her many fine essays. I came upon it in David Chapman's Ignorant, Irrelevant, and Inscrutable. The blue X is mine.
I just finished reading Tyler Cowen's recent interview with historian Jill Lepore. When Cowen asks Lepore about E.B. White's classic Stuart Little, Lepore launches into a story that illustrates quite nicely what it's like to be a scholar.
First, she notes that she was writing a review of a history of children's literature and kept coming across throwaway lines of the sort "Stuart Little, published in 1945, was of course banned." This triggered the scholar's impulse:
And there's no footnote, no explanation, no nothing.
At the time, one of my kids was six, and he was reading Stuart Little, we were reading at night together, and I was like, "Wait, the story about the mouse who drives the little car and rides a sailboat across the pond in Central Park, that was a banned book? What do I not know about 1945 or this book? What am I missing?"
These last two sentences embody the scholar's orientation. "What don't I know about these two things I think I know well?"
And I was shocked. I really was shocked. And I was staggered that these histories of children's literature couldn't even identify the story. I got really interested in that question, and I did what I do when I get a little too curious about something, is I become obsessive about finding out everything that could possibly be found out.
Next comes obsession. Lepore then tells a short version of the story that became her investigative article for The New Yorker, which she wrote because, as she says, sometimes "[I] just fall into a hole in the ground, and I can't get out until I have gotten to the very, very bottom of it."
Finally, three transcript pages later, Lepore says:
It was one of the most fun research benders I've ever been on.
It ends in fun.
You may be a scholar if you have this pattern. To me, one of the biggest downsides of becoming department head is having less time to fall down some unexpected hole and follow its questions until I reach the bottom. I miss that freedom.
I recently ran across a newspaper column about triple-doubles, a secondary statistic in basketball that tracks when a player reaches "double figures" in three of the major primary stats. The column included this short passage about the great Oscar Robertson:
You probably know that Robertson averaged triple-doubles for an entire season, 1961-62. But did you know that he averaged triple-doubles over the cumulative first five seasons of his NBA career, from 1960-61 through '64-65? In that stretch Robertson averaged 30.3 points, 10.4 rebounds, and 10.6 assists.
This is indeed an amazing accomplishment. But it was not news to me.
I grew up in Indiana, a hotbed of basketball, with the sort of close personal attachment to Robertson that only a young sports fan can have. Like me, the "Big O" was from Indianapolis. He had led Crispus Attucks High School to two straight state championships in the mid-1950s and helped lead the University of Cincinnati to national prominence in the early 1960s. When I first became aware of basketball as a young boy, Robertson was deep into a stellar pro career. He was one of my early sports idols.
Later, Robertson and his legacy played an unexpected role in my life. When I interviewed for the scholarship that would pay my way through college, I found that the interviewer, the dean of the Honors College, was a former Division III basketball player from Michigan who had gone on to earn a PhD in history from the University of Maryland. Our conversation quickly turned to basketball and our mutual admiration for the Big O, but it was not all sports talk. Robertson's career as a black player in the 1950s and '60s launched us into a discussion of urban segregation, race relations, and the role of sport in culture.
After the interview, I wondered if it had been wise tactically to talk so much about basketball. I guess the conversation went well enough, though.
Folks today can have Michael Jordan and LeBron. They are undeniably great players, but Oscar Robertson will always be my standard for all-around play -- and a touchpoint in my own life.
As I got ready for class yesterday morning, I decided to refactor a piece of code. No big deal, right? It turned out to be a bigger deal than I expected. That's part of the fun of programming.
The function in question is a lexical addresser for a little language we use as a specimen in my Programming Languages course. My students had been working on a design, and it was time for us to build a solution as a group. Looking at my code from the previous semester, I thought that changing the order of two cases would make for a better story in class. The cases are independent, so I swapped them and ran my tests.
The change broke my code. It turns out that the old "else" clause had been serving as a convenient catch-all and was only working correctly due to an error in another function. Swapping the cases exposed the error.
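To give a sense of the shape of such code -- what follows is a minimal, hypothetical sketch, not the solution from my course -- a lexical addresser for a tiny subset of Scheme is typically a cond with one clause per kind of expression, plus an else clause that sweeps up applications. Swapping two independent clauses should be harmless; the trouble starts when the else clause quietly absorbs expressions it was never meant to handle.

    ;; Hypothetical sketch: replace each variable reference with its
    ;; lexical address (var : depth position). Handles only variables,
    ;; lambda expressions, and applications.
    (define (lexical-address exp env)
      (cond ((symbol? exp)                      ; clause 1: variable reference
             (address-of exp env 0))
            ((eq? (car exp) 'lambda)            ; clause 2: (lambda (vars ...) body)
             (list 'lambda (cadr exp)
                   (lexical-address (caddr exp) (cons (cadr exp) env))))
            (else                               ; catch-all: application
             (map (lambda (e) (lexical-address e env)) exp))))

    (define (address-of var env depth)
      (cond ((null? env) (list 'free var))
            ((memq var (car env))
             (list var ': depth (position-of var (car env))))
            (else (address-of var (cdr env) (+ depth 1)))))

    (define (position-of var vars)
      (if (eq? var (car vars))
          0
          (+ 1 (position-of var (cdr vars)))))

    ;; > (lexical-address '(lambda (x) (lambda (y) (f (x y)))) '())
    ;; (lambda (x) (lambda (y) ((free f) ((x : 1 0) (y : 0 0)))))

In a sketch like this, clauses 1 and 2 really are independent, so reordering them changes nothing. An else clause that happens to paper over a bug elsewhere is another matter.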
Ordinarily, this wouldn't be a big deal, either. I would simply fix the code and give my students a correct solution. Unfortunately, I had less than an hour before class, so I now found myself in a scramble to find the bug, fix it, and make the changes to my lecture notes that had motivated the refactor in the first place. Making changes like this under time pressure is rarely a good idea... I was tempted to revert to the previous version, teach class, and make the change after class. But I am a programmer, dogged and often foolhardy, so I pressed on. With a few minutes to spare, I closed the editor on my lecture notes and synced the files to my teaching machine. I was tired and still had a little nervous energy coursing through me, but I felt great. That's part of the fun of programming.
I will say this: Boy, was I glad to have my test suite! It was incomplete, of course, because I found an error in my program. But the tests I did have helped me to know that my bug fix had not broken something else unexpectedly. The error I found led to several new tests that make the test suite stronger.
This experience was fresh in my mind this morning when I read "Physics Was Paradise", an interview with Melissa Franklin, a distinguished experimental particle physicist at Harvard. At one point, Franklin mentioned taking her first physics course in high school. The interviewer asked if physics immediately stood out as something she would dedicate her life to. Franklin responded:
Physics is interesting, but it didn't all of a sudden grab me because introductory physics doesn't automatically grab people. At that time, I was still interested in being a writer or a philosopher.
I took my first programming class in high school and, while I liked it very much, it did not cause me to change my longstanding intention to major in architecture. After starting in the architecture program, I began to sense that, while I liked architecture and had much to learn from it, computer science was where my future lay. Maybe somewhere deep in my mind was the memory of an experience like the one I had yesterday, battling a piece of code and coming out with a sense of accomplishment and a desire to do battle again. I didn't feel the same way when working on problems in my architecture courses.
Intro CS, like intro physics, doesn't always snatch people away from their goals and dreams. But if you enjoy the fun of programming, eventually it sneaks up on you.
Almost a decade ago, I last blogged about my library book version of Scott Hastings's million: leading the league, so to speak, in being the first person to check a particular book out of my university library. Since my first post on the subject twelve years ago, I have apparently mellowed... I'm no longer willing to claim that I lead the league in this stat. Other patrons certainly read more books than I do, and if they keep their eyes open for new acquisitions, they almost surely top me. But even the humbler me can't help but notice that many of the books I check out, whether CS books or literature, seem to be in mint condition. They are being read for the first time, by me.
These days, identifying millions is a more fallible task. The library long ago terminated the practice of stamping a book's due date on a slip inside the front cover. That's all handled electronically now (those infernal computers!), so only the library knows when a book was last checked out. On the long list of cultural experiences lost to changing culture and advancing technology, this one is rather minor, but it's one that a frequent user of libraries might notice and feel wistful about. I sometimes do. I always liked seeing the long list of due dates in the books I borrowed; it bestowed on me a sense of kinship with the readers who came before. In the case of all those Asimov and Vonnegut books I re-read, one of the readers who came before was me!
I do think I have another million to my credit, though. A couple of weeks ago, I went over to the library to look for some book -- a novel or literary criticism, I can't remember. I looked it over and decided not to check it out after all. I was in no hurry to get back to the office, so I indulged in a few minutes of wandering the stacks to see if anything struck my fancy. I hadn't done that in a while, and I missed it.
I ended up holding Cycling, the 2003 second novel of Greg Garrett. It was purchased long enough ago to have a Due Date slip glued inside the front cover, but the slip was empty. The cover, pages, and corners gave every appearance of being brand new. Seeing no evidence to the contrary, I stake my claim. When you consider changes in library acquisitions, changes in my reading habits, and the uncertainty today of knowing which books have never been read, how many more times will I have the chance?
"Cycling" was a pretty good read, too, for what that's worth. It was another serendipitous find while wandering the stacks.
A few quick notes on my previous post about the effect of ubiquitous information on knowing and doing.
~~~~
The post reminded a reader of something that Guy Steele said at DanFest, a 2004 festschrift in honor of Daniel Friedman's 60th birthday. As part of his keynote address, Steele read from an email message he wrote in 1978:
Sussman did me a very big favor yesterday -- he let me flounder around trying to build a certain LISP interpreter, and when I had made and fixed a critical bug he then told me that he had made (and fixed) the same mistake in Conniver. I learned a lot from that bug.
Isn't that marvelous? "I learned a lot from that bug."
Thanks to this reader for pointing me to a video of Steele's DanFest talk. You can watch this specific passage at the 12:08 mark, but really: You now have a link to an hour-long talk by Guy Steele that is titled "Dan Friedman--Cool Ideas". Watch the entire thing!
~~~~
If all you care about is doing -- getting something done -- then ubiquitous information is an amazing asset. I use Google and StackOverflow answers quite a bit myself, mostly to navigate the edges of languages that I don't use all the time. Without these resources, I would be less productive.
~~~~
Long-time readers may have read the story about how I almost named this blog something else. ("The Euphio Question" still sets my heart aflutter.) Ultimately I chose a title that emphasized the two sides of what I do as both a programmer and a teacher. The intersection of knowing and doing is where learning takes place. Separating knowing from doing creates problems.
In a post late last year, I riffed on some ideas I had as I read Learn by Painting, a New Yorker article about an experiment in university education in which everyone made art as a part of their studies.
That article included a line that expressed an interesting take on my blog's title: "Knowing and doing are two sides of the same activity, which is adapting to our environment."
That's a cool thought, but a rather pedestrian sentence. The article includes another, more poetic line that fits in nicely with the theme of the last couple of days:
Knowing is better than not knowing, but knowing without doing is as good as not knowing.
If I ever adopt a new tagline for my blog, it may well be this sentence. It is not strictly true, at least in a universal sense, but it's solid advice nonetheless.
Since sometime last summer, I have been posting capsule reviews of the movies I watch on Facebook. They are reactions, really, not reviews -- three bullet points, or a couple of sentences, that express how I felt during and after watching. These updates have amused many of my friends, because I'm often watching movies that they watched one or five or ten years ago. As many movies as I have watched, I'm still just catching up. The selections vary from old standards ("White Christmas"), to recent classics ("American Hustle") and cult classics ("Firefly: The Series"), to guilty pleasures ("The Replacements").
A couple of months ago, Tyler Cowen blogged about choosing movies and wrote this about reading reviews before watching a movie:
I use movie criticism in the following way: I read just enough to decide if I want to see the movie, and then no more. I also try to forget what I have read. But before a second viewing of a film, I try to read as much as possible about it.
I have a similar approach on my first viewing. If I already know I want to watch a particular movie, I read nothing about it. If not, I read the bare minimum needed to make the decision. Then I do my best to forget it all. I don't want to be reminded about what I already knew about the movie or what I just read about it. I want to watch with as clean a mind as possible.
Yet I can watch certain movies many, many times and enjoy them immensely every time. Sometimes I read a lot about a movie, both background and criticism, mostly because I love pop culture and I love hearing about how creators create. However many times I watch though, I try to watch each time with a beginner's mind. This used to drive my wife crazy: "You've seen this movie a dozen times; what do you mean you don't want to think about what happens next?". One of my gifts seems to be an ability to suspend memory and belief. When watching movies, that's usually how I like it best.
In the Paris Review's The Art of Fiction No. 183, the interviewer asks Tobias Wolff how he balances writing with university teaching. Wolff figures that teaching is a pretty good deal:
When I think about the kinds of jobs I've had and the ways I've lived, and still managed to get work done--my God, teaching in a university looks like easy street. I like talking about books, and I like encountering other smart, passionate readers, and feeling the friction of their thoughts against mine. Teaching forces me to articulate what otherwise would remain inchoate in my thoughts about what I read. I find that valuable, to bring things to a boil.
That reflects how I feel, too, as someone who loves to do computer science and write programs. As a teacher, I get to talk about cool ideas every day with my students, to share what I learn as I write software, and to learn from them as they ask the questions I've stopped asking myself. And they pay me. It's a great deal, perhaps the optimal point in the sort of balance that Derek Sivers recommends.
Wolff immediately followed those sentences with a caution that also strikes close to home:
But if I teach too much it begins to weigh on me--I lose my work. I can't afford to do that anymore, so I keep a fairly light teaching schedule.
One has to balance creative work with the other parts of life that feed the work. Professors at research universities, such as Wolff at Stanford, have different points of equilibrium available to them than profs at teaching universities, where course loads are heavier and usually harder to reduce.
I only teach one course a semester, which really does help me to focus creative energies around a smaller set of ideas than a heavier load does. Of course, I also have the administrative duties of a department head. They suffocate time and energy in a much less productive way than teaching does. (That's the subject of another post.)
Why can't Wolff afford to teach too many courses anymore? I suspect the answer is time. When you reach a certain age, you realize that time is no longer an ally. There are only so many years left, and Wolff probably feels the need to write more urgently. This sensation has been seeping into my mind lately, too, though I fear perhaps a bit too slowly.
~~~~
(I previously quoted Wolff from the same interview in a recent entry about writers who give advice that reminds us that there is no right way to write all programs. A lot of readers seemed to like that one.)
I'm not a New Year's resolution person, but I did make a change recently that moved me out of my comfort zone. Here's a quick version of the story.
I'm a hierarchical guy, like a lot of computer scientists, I imagine. That helps me manage a lot of complexity, but sometimes it also consumes more personal time than I'd like.
I'm also a POP mail guy. For many years, Eudora was my client of choice. A while back, I switched to Mail.app on OS X. In both, I had an elaborate filing system in which research mail was kept in a separate folder from teaching mail, which was kept in a separate folder from personal mail, which was kept in a separate folder from .... There were a dozen or so top-level folders, each having sub-folders.
Soon after I became department head a decade or so ago, I began to experience the downsides of this approach as much as the upsides. Some messages wanted to live in two folders, but I had to choose one. Even when the choice was easy, I found myself spending too many minutes each week filing away messages I would likely never think of again.
For years now, my browser- and cloud-loving friends have been extolling to me the value of leaving all my mail on the server, hitting 'archive' when I wanted to move a message out of my inbox, and then using the mail client's search feature to find messages when I need them later. I'm not likely to become a cloud email person any time soon, but the cost in time and mental energy of filing messages hierarchically finally became annoying enough that I decided to move into the search era.
January 1 was the day.
But I wasn't ready to go all the way. (Change is hard!) I'd still like to have a gross separation of personal mail from professional mail, and gross separation among email related to teaching, research, professional work, and university administration. If Mail.app had tags or labels, I might use them, but it doesn't. At this point, I have five targeted archive folders, one for each of those kinds of mail.
I still have three other small hierarchies. The first is where I keep folders for other courses I have taught or plan to teach. I like the idea of keeping course questions and materials easy to find. The second is for hot topics I am working on as department head. For instance, we are currently doing a lot of work on outcomes assessment, and it's helpful to have all those messages in a separate bin. When a topic is no longer hot, I'll transfer its messages to the department archive. The third is a set of two or three small to-do boxes. Again, it's helpful to an organizer like me to have such messages in a separate bin so that I can find and respond to them quickly; eventually those messages will move to the appropriate flat archive.
Yes, there is still a lot going on here, but it's a big change for me. So far, so good. I've not felt any urges to create subfolders yet, and I've used search to find things when I've needed them. After I become habituated to this new way of living, perhaps I'll feel daring enough to go even flatter.
Let's not talk about folders in my file system, though. Hierarchy reigns supreme there, as it always has.
In My Writing Education: A Time Line, George Saunders recounts stories of his interactions with writing teachers over the years, first in the creative writing program at Syracuse and later as a writer and teacher himself. Along the way, he shows us some of the ways that our best teachers move us.
Here, the teacher gets a bad review:
Doug gets an unkind review. We are worried. Will one of us dopily bring it up in workshop? We don't. Doug does. Right off the bat. He wants to talk about it, because he feels there might be something in it for us. The talk he gives us is beautiful, honest, courageous, totally generous. He shows us where the reviewer was wrong -- but also where the reviewer might have gotten it right. Doug talks about the importance of being able to extract the useful bits from even a hurtful review: this is important, because it will make the next book better. He talks about the fact that it was hard for him to get up this morning after that review and write, but that he did it anyway. He's in it for the long haul, we can see.
I know some faculty who basically ignore student assessments of their teaching. They paid attention for a while at the beginning of their careers, but it hurt too much, so they stopped. Most of the good teachers I know, though, approach their student assessments the way that Doug approaches his bad review: they look for the truths in the reviews, take those truths seriously, and use them to get better. Yes, a bad set of assessments hurts. But if you are in it for the long haul, you get back to work.
Here, the teacher gives a bad review:
What Doug does for me in this meeting is respect me, by declining to hyperbolize my crap thesis. I don't remember what he said about it, but what he did not say was, you know: "Amazing, you did a great job, this is publishable, you rocked our world with this! Loved the elephant." There's this theory that self-esteem has to do with getting confirmation from the outside world that our perceptions are fundamentally accurate. What Doug does at this meeting is increase my self-esteem by confirming that my perception of the work I'd been doing is fundamentally accurate. The work I've been doing is bad. Or, worse: it's blah. This is uplifting -- liberating, even -- to have my unspoken opinion of my work confirmed. I don't have to pretend bad is good. This frees me to leave it behind and move on and try to do something better. The main thing I feel: respected.
Sometimes, students make their best effort but come up short. They deserve the respect of an honest review. Honest doesn't have to be harsh; there is a difference between being honest and being a jerk. Sometimes, students don't make their best effort, and they deserve the respect of an honest review, too. Again, being honest doesn't mean being harsh. In my experience, most students appreciate an honest, objective review of their work. They almost always know when they are coming up short, or when they aren't working hard enough. When a teacher confirms that knowledge, they are freed -- or motivated in a new way -- to move forward.
Here, the teacher reads student work:
I am teaching at Syracuse myself now. Toby, Arthur Flowers, and I are reading that year's admissions materials. Toby reads every page of every story in every application, even the ones we are almost certainly rejecting, and never fails to find a nice moment, even when it occurs on the last page of the last story of a doomed application. "Remember that beautiful description of a sailboat on around page 29 of the third piece?" he'll say. And Arthur and I will say: "Uh, yeah ... that was ... a really cool sailboat." Toby has a kind of photographic memory re stories, and such a love for the form that goodness, no matter where it's found or what it's surrounded by, seems to excite his enthusiasm. Again, that same lesson: good teaching is grounded in generosity of spirit.
It has taken me a long time as a teacher to learn to have Toby's mindset when reading student work, and I'm still learning. Over the last few years, I've noticed myself trying more deliberately to find the nice moments in students' programs, even the bad ones, and to tell students about them. That doesn't mean being dishonest about the quality of the overall program. But nice moments are worth celebrating, wherever they are found. Sometimes, those are precisely the elements students need to hear about, because they are the building blocks for getting better.
Finally, here is the teacher talking about his own craft:
During the Q&A someone asks what Toby would do if he couldn't be a writer.
A long, perplexed pause.
"I would be very sad", he finally says.
I like teaching computer science, but what has enabled me to stay in the classroom for so many years and given me the stamina to get better at teaching is that I like doing computer science. I like to program. I like to solve problems. I like to find abstractions and look for ways to solve other problems. There are many things I could do if I were not a computer scientist, but knowing what I know now, I would be a little sad.
Last week, I read a blog entry by Ben Thompson that said Influence lives at intersections. Thompson was echoing a comment about Daniel Kahneman's career: "Intellectual influence is the ability to dissolve disciplinary boundaries." These were timely references for my week.
On Friday night, I had the pleasure of attending the Heritage Honours Awards, an annual awards dinner hosted by my university's alumni association. One of our alumni, Wade Arnold, received the Young Alumni Award for demonstrated success early in a career. I mentioned Wade in a blog entry several years ago, when he and I spoke together at a seminar on interactive digital technologies. That day, Wade talked about intersections:
It is difficult to be the best at any one thing, but if you are very good at two or three or five, then you can be the best in a particular market niche. The power of the intersection.
Wade built his company, Banno, by becoming very good at several things, including functional programming, computing infrastructure, web development, mobile development, and financial technology. He was foresightful and lucky enough to develop this combination of strengths before most other people did. Most important, though, he worked really hard to build his company: a company that people wanted to work with, and a company that people wanted to work for. As a result, he was able to grow a successful start-up in a small university town in the middle of the country.
It's been a delight for me to know Wade all these years and watch him do his thing. I'll bet he has some interesting ideas in store for the future.
The dinner also provided me with some unexpected feelings. Several times over the course of the evening, someone said, "Dr. Wallingford -- I feel like I know you." I had the pleasure of meeting Wade's parents, who said kind things about my influence on their son. Even his nine-year-old son said, "My dad was talking about you in the car on the drive over." No one was confused about whom we were there to honor Friday night, about who had done the considerable work to build himself into an admirable man and founder. That was all Wade. But my experience that night is a small reminder to all you teachers out there: you do have an effect on people. It was certainly a welcome reminder for me at the end of a trying semester.
I saw a passage attributed to Søren Kierkegaard that I might translate as:
The life of humanity could very well be conceived as a speech in which different people represented the various parts of speech [...]. How many people are merely adjectives, interjections, conjunctions, adverbs; how few are nouns, verbs; how many are copula?
This is a natural thing to ponder around my birthday. It's not a bad thing to ask myself more often: Which part of speech will I be today?
According to Darwin himself, in his autobiography:
I have no great quickness of apprehension or wit which is so remarkable in some clever men, for instance, Huxley. I am therefore a poor critic: a paper or book, when first read, generally excites my admiration, and it is only after considerable reflection that I perceive the weak points.
If you read my blog, you know this about me. Either you enjoy my occasionally uncritical admiration, or at least you tolerate it.
I ended up with an unexpected couple of hours free yesterday afternoon, and I decided to clean up several piles of old papers on the floor of my running room. Back when I ran marathons, I was an information hound. I wrote notes, collected maps, and clipped articles on training plans, strength training, stretching and exercise, diet and nutrition -- anything I thought I could use to get better. I'm sure this surprises many of you.
There was a lot of dust to dig through, but the work was full of happy reminiscences. It's magical how a few pieces of paper can activate our memories. The happy memories leave a sadness in their wake, though, from the time my life as a runner ended. That's when the piles stopped growing. I stopped collecting material, because I wasn't running anymore.
Fortunately, the sadness of loss didn't drown out the happy memories. Instead, I started thinking about the future, which is really now. These thoughts are long past due.
Looking back through my running logs reminded me of the pattern of my life as a marathoner. There was an ebb and a flow to the year. I trained for my first half marathon. Then I trained for my first full marathon. I ran lightly for a few weeks as my body recovered. Winter and spring saw regular runs, but a break for mind and body alike: no big plans, just enjoying the road. Then came the end of spring, and it started all over again: training for big races. These years were filled with variety in my running, variety in my goals.
The last few years have been different. I recovered from a couple of operations, eventually taking up the elliptical machine and returning to my bike for fitness. However, I have never become a cyclist in spirit the way I became a runner. I've been exercising lots, staying fit and healthy, but I miss the rhythm of running and training for marathons. In comparison, my exercise since then leaves me bored and uninspired.
Diving into those piles of paper yesterday started me thinking: what are the next goals? I'll be working on that as we slide into winter, looking forward to what next spring might bring.
Last month, Seth Godin posted a short entry about reputation. It brought back memories of my first couple of years as department head. I was excited and wanted to do a good job, so I took on a lot. Pretty soon I was in a position of having promised more than I could deliver. Some of the shortfall resulted from the heavy load itself; there are only so many hours in a week. Some resulted from promising things I couldn't deliver, because I didn't understand the external constraints I faced.
When you don't deliver, explanations sound like excuses.
If I were giving advice to myself at the time I became head (already an adult who should have known better...), I would tell him to heed Godin's advice: help people learn what they can expect from you. Explicit attention to expectations can pay off not only by seeding your reputation but also by setting parameters for yourself. Then live up to that standard.
If you teach people to expect little, perhaps unintentionally, they will -- even on the occasions when you do better. And after you get better, if you do, it takes a long time to undo the expectations you created early on.
Live the life you've taught people to expect from you -- but first be careful what you teach them to expect.
W.H. Auden, in A Certain World, on the idea of The Two Cultures:
Of course, there is only one. Of course, the natural sciences are just as "humane" as letters. There are, however, two languages, the spoken verbal language of literature, and the written sign language of mathematics, which is the language of science. This puts the scientist at a great advantage, for, since like all of us he has learned to read and write, he can understand a poem or a novel, whereas there are very few men of letters who can understand a scientific paper once they come to the mathematical parts.
When I was a boy, we were taught the literary languages, like Latin and Greek, extremely well, but mathematics atrociously badly. Beginning with the multiplication table, we learned a series of operations by rote which, if remembered correctly, gave the "right" answer, but about any basic principles, like the concept of number, we were told nothing. Typical of the teaching methods then in vogue is the mnemonic which I had to learn:
Minus times Minus equals Plus:
The reason for this we need not discuss.
Sadly, we still teach young people that it's okay if math and science are too hard to master. They grow into adults who feel a chasm between "arts and letters" and "math and science". But as Auden notes rightly, there is no chasm; there is mostly just another language to learn and appreciate.
(It may be some consolation to Auden that we've reached a point where most scientists have to work to understand papers written by scientists in other disciplines. They are written in highly specialized languages.)
In my experience, it is more acceptable for a humanities person to say "I'm not a science person" or "I don't like math" than for a scientist to say something similar about literature, art, or music. The latter person is thought, silently, to be a Philistine; the former, an educated person with a specialty.
I've often wondered if this experience suffers from observation bias or association bias. It may well. I certainly know artists and writers who have mastered both languages and who remain intensely curious about questions that span the supposed chasm between their specialties and mine. I'm interested in those questions, too.
Even with this asymmetry, the presumed chasm between cultures creates low expectations for us scientists. Whenever my friends in the humanities find out that I've read all of Kafka's novels and short stories; that Rosencrantz and Guildenstern Are Dead is my favorite play, or that I even have a favorite play; that I really enjoyed the work of choreographer Merce Cunningham; that my office bookshelf includes the complete works of William Shakespeare and a volume of William Blake's poetry -- I love the romantics! -- most seem genuinely surprised. "You're a computer scientist, right?" (Yes, I like Asimov, Heinlein, Clarke, and Bradbury, too.)
Auden attributes his illiteracy in the language of mathematics and science to bad education. The good news is that we can reduce, if not eliminate, the language gap by teaching both languages well. This is a challenge for both parents and schools and will take time. Change is hard, especially when it involves the ways we talk about the world.
I agree with W.H. Auden:
Who on earth invented the silly convention that it is boring or impolite to talk shop? Nothing is more interesting to listen to, especially if the shop is not one's own.
My wife and I went on a forty-mile bike ride this morning, a fundraiser for the Cedar Valley Bicycle Collective, which visited three local farms. At those stops, I had the great fortune to listen to folks on all three farms talk shop. We learned about making ice cream and woodcarving at Three Pines Farm. We learned about selecting, growing, and picking apples -- and the damage hail and bugs can do -- at Blueridge Orchard. And the owner of the Fitkin Popcorn Farm talked about the popcorn business. He showed us the machines they use to sort the corn out of the field, first by size and then by density. He also talked about planting fields, harvesting the corn, and selling the product nationally. I even learned that we can pop the corn while it's still on the ears! (This will happen in my house very soon.)
I love to listen to people talk shop. In unguarded moments, they speak honestly about something they love and know deeply. They let us in on what it is like for them to work in their corner of the world. However big I try to make my world, there is so much more out there to learn.
The Auden passage is from his book A Certain World, a collage of poems, quotes, and short pieces from other writers with occasional comments of his own. Auden would have been an eclectic blogger! This book feels like a Tumblr blog, without all the pictures and 'likes'. Some of the passages are out of date, but they let us peek in on the mind of an accomplished poet. A little like good shop talk.
There is a scene in The Big Bang Theory where Sheldon laments that, without realizing it, he had allowed his girl/friend to alter his personality. Leonard responds, "Well, you didn't really have a 'personality'. You just had some shows you liked."
This scene came to mind when I read a passage from Kenneth Goldsmith's Uncreative Writing earlier this week:
I don't think there's a stable or essential "me". I am an amalgamation of many things: books I've read, movies I've seen, television shows I've watched, conversations I've had, songs I've sung, lovers I've loved. In fact, I'm a creation of so many people and so many ideas, to the point where I feel I've actually had few original thoughts and ideas; to think that what I consider to be "mine" was "original" would be blindingly egotistical.
It is occasionally daunting when I realize how much I am a product of the works, people, and ideas I've encountered. How can I add anything new? But when I surrender to the fact that I can't, it frees me to write and do things that I like. What I make may not be new, but it can still be useful or valuable, even if only to me.
I wonder what it's like for kids to grow up in a self-consciously mash-up culture. My daughters have grown up in a world where technology and communication have given everyone the ability to mix and modify other work so easily. It's a big part of the entertainment they consume.
Mash-up culture must feel hugely empowering in some moments and hugely terrifying in others. How can anyone find his or her own voice, or say something that matters? Maybe they have a better sense than I did growing up that nothing is really new and that what really matters is chasing your interests, exploring the new lands you enter, and sharing what you find. That's certainly been the source of my biggest accomplishments and deepest satisfactions.
(I ran across the passage from Goldsmith on Austin Kleon's blog.)
Kevin Kelly, in Amish Hackers:
One Amish-man told me that the problem with phones, pagers, and PDAs (yes he knew about them) was that "you got messages rather than conversations". That's about as an accurate summation of our times as any. Henry, his long white beard contrasting with his young bright eyes told me, "If I had a TV, I'd watch it." What could be simpler?
Unlike some younger Amish, I still do not carry a smart phone. I do own a cell but use it only when traveling. If our home phone disappeared overnight, it would likely take several days before my wife or I even noticed.
I also own a television, a now-déclassé 32" flat screen. Henry is right: having a TV, I find myself watching it on occasion. I enjoy it but have to guard vigilantly against falling into a hypnotic trance. It turns out that I form certain habits quite easily.
I've been reading a bunch of the essays on David Chapman's Meaningness website lately, after seeing a link to one on Twitter. (Thanks, @kaledic.) This morning I read How To Think Real Good, about one of Chapman's abandoned projects: a book of advice for how to think and solve problems. He may never write this book as he once imagined it, but I'm glad he wrote this essay about the idea.
First of all, it was a fun read, at least for me. Chapman is a former AI researcher, and some of the stories he tells remind me of things I experienced when I was in AI. We were even in school at about the same time, though in different parts of the country and different kinds of program. His work was much more important than mine, but I think at some fundamental level most people in AI share common dreams and goals. It was fun to listen as Chapman reminisced about knowledge and AI.
He also introduced me to the dandy portmanteau anvilicious. I keep learning new words! There are so many good ones, and people make up the ones that don't exist already.
My enjoyment was heightened by the fact that the essay stimulated the parts of my brain that like to think about thinking. Chapman includes a few of the heuristics that he intended to include in his book, along with anecdotes that illustrate or motivate them. Here are three:
All problem formulations are "false", because they abstract away details of reality.
Solve a simplified version of the problem first. If you can't do even that, you're in trouble.
Probability theory is sometimes an excellent way of dealing with uncertainty, but it's not the only way, and sometimes it's a terrible way.
He elaborates on the last of these, pointing out that probability theory tends to collapse many different kinds of uncertainty into a single value. This does not work all that well in practice, because different kinds of uncertainty often need to be handled in very different ways.
Chapman has a lot to say about probability. This essay was prompted by what he sees as an over-reliance of the rationalist community on a pop version of Bayesianism as its foundation for reasoning. But as an old AI researcher, he knows that an idea can sound good and fail in practice for all sorts of reasons. He has also seen how a computer program can make clear exactly what does and doesn't work.
Artificial intelligence has always played a useful role as a reality check on ideas about mind, knowledge, reasoning, and thought. More generally, anyone who writes computer programs knows this, too. You can make ambiguous claims with English sentences, but to write a program you really have to have a precise idea. When you don't have a precise idea, your program itself is a precise formulation of something. Figuring out what that is can be a way of figuring out what you were really thinking about in the first place.
This is one of the most important lessons college students learn from their intro CS courses. It's an experience that can benefit all students, not just CS majors.
Chapman also includes a few heuristics for approaching the problem of thinking, basically ways to put yourself in a position to become a better thinker. Two of my favorites are:
Try to figure out how people smarter than you think.
Find a teacher who is willing to go meta and explain how a field works, instead of lecturing you on its subject matter.
This really is good advice. Subject matter is much easier to come by than a deep understanding of how the discipline works, especially in these days of the web.
The word meta appears frequently throughout this essay. (I love that the essay is posted on the metablog/ portion of his site!) Chapman's project is thinking about thinking, a step up the ladder of abstraction from "simply" thinking. An AI program must reason; an AI researcher must reason about how to reason.
This is the great siren of artificial intelligence, the source of its power and also its weaknesses: Anything you can do, I can do meta.
I think this gets at why I enjoyed this essay so much. AI is ultimately the discipline of applied epistemology, and most of us who are lured into AI's arms share an interest in what it means to speak of knowledge. If we really understand knowledge, then we ought to be able to write a computer program that implements that understanding. And if we do, how can we say that our computer program isn't doing essentially the same thing that makes us humans intelligent?
As much as I love computer science and programming, my favorite course in graduate school was an epistemology course I took with Prof. Rich Hall. It drove straight to the core curiosity that impelled me to study AI in the first place. In the first week of the course, Prof. Hall laid out the notion of justified true belief, and from there I was hooked.
A lot of AI starts with a naive feeling of this sort, whether explicitly stated or not. Doing AI research brings that feeling into contact with reality. Then things get serious. It's all enormously stimulating.
Ultimately Chapman left the field, disillusioned by what he saw as a fundamental limitation that AI's bag of tricks could never resolve. Even so, the questions that led him to AI still motivate him and his current work, which is good for all of us, I think.
This essay brought back a lot of pleasant memories for me. Even though I, too, am no longer in AI, the questions that led me to the field still motivate me and my interests in program design, programming languages, software development, and CS education. It is hard to escape the questions of what it means to think and how we can do it better. These remain central problems of what it means to be human.
Jack Levine, on painting as a realist in the 1950s, a time of abstract expressionism and art as social commentary:
The difficulty is for me to be affirmative. I'm a little inhibited, as you have noticed, by not being against any of these people. The spirit of denunciation is more in the spirit of our time: sensation brought to an extreme.
Levine might just as well have been talking about today's social and political climate. Especially if he had had a Facebook or Twitter account.
~~~~
(This passage comes from Conversations with Artists. These entries also draw passages from it: [ 07/19 | 07/27 | 07/31 ]. This is my last entry drawn from the book, at least for now.)
A great analogy from Frank Cottrell:
Think of it, he says, the sun pours down its energy onto the surface of the planet for millennia. The leaves soak up the energy. The trees fall and turn to coal. Coal is solid sunlight, the stored memory of millions of uninhabited summers. Then one day, in Coalbrookdale, someone opens a hole in the ground and all that stored energy comes pouring out and is consumed in furnaces, engines, motors.
When we -- teachers, parents, carers, friends -- read to our children, I believe that's what we're doing. Laying down strata of fuel, fuel studded with fossils and treasures. If we ask for anything back, we burn it off too soon.
My wife and I surely did a lot of things wrong as we raised our daughters, but I think we did at least two things right: we read to them all the time, and we talked to them like we talk to everyone else. Their ability to speak and reason and imagine grew out of those simple, respectful acts.
Teaching at a university creates an upside-down dynamic by comparison, especially in a discipline many think of as being about jobs. It is the students and parents who are more likely to focus on the utility of knowledge. Students sometimes ask, "When will we use this in industry?" With the cost of tuition and the uncertainty of the times, I understand their concern. Even so, there are times I would like to say "I don't know" or, in my weaker moments, the seemingly disrespectful "I don't care". Something more important should be happening here. We are creating fuel for a lifetime.
(The Cottrell article takes an unusual route to an interesting idea. It was worth a read.)
Yesterday, I wrote me some Java. It was fun.
A few days ago, I started wondering if there was something unique I could send my younger daughter for her birthday today. My daughters and I were all born in presidential election years, which is a neat little coincidence. This year's election is special for the birthday girl: it is her first opportunity to vote for the president. She has participated in the process throughout, which has seen both America's most vibrant campaign for a progressive candidate in at least forty years and the first nomination of a woman by a major party. Both of these are important to her.
In the spirit of programming and presidential politics, I decided to write a computer program to convert images into the style of Shepard Fairey's iconic Obama "Hope" poster and then use it to create a few images for her.
I dusted off Dr. Java and fired up some code I wrote when I taught media computation in our intro course many years ago. It had been a long time since I had written any Java at all, but it came back just like riding a bike. More than a decade of writing code in a language burns some pretty deep grooves in the mind.
I found RGB values to simulate the four colors in Fairey's poster in an old message to the mediacomp mailing list:
    Color darkBlue  = new Color(0, 51, 76);
    Color lightBlue = new Color(112, 150, 158);
    Color red       = new Color(217, 26, 33);
    Color yellow    = new Color(252, 227, 166);
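The heart of such a program is a posterize filter: compute a brightness value for each pixel and bucket it into one of those four colors, darkest to lightest. Here is a minimal sketch of that idea using only the standard Java image classes rather than the mediacomp library; the brightness thresholds are illustrative guesses for this post, not the values from my experiments.

    import java.awt.Color;
    import java.awt.image.BufferedImage;
    import java.io.File;
    import javax.imageio.ImageIO;

    public class PosterizeSketch {
        static final Color DARK_BLUE  = new Color(0, 51, 76);
        static final Color LIGHT_BLUE = new Color(112, 150, 158);
        static final Color RED        = new Color(217, 26, 33);
        static final Color YELLOW     = new Color(252, 227, 166);

        // Bucket a brightness value (0-255) into the four poster colors,
        // darkest to lightest. The thresholds here are illustrative guesses.
        static Color posterColor(int brightness) {
            if (brightness < 64)  return DARK_BLUE;
            if (brightness < 128) return RED;
            if (brightness < 192) return LIGHT_BLUE;
            return YELLOW;
        }

        public static void main(String[] args) throws Exception {
            BufferedImage source = ImageIO.read(new File(args[0]));
            BufferedImage poster = new BufferedImage(source.getWidth(),
                                       source.getHeight(), BufferedImage.TYPE_INT_RGB);
            for (int y = 0; y < source.getHeight(); y++) {
                for (int x = 0; x < source.getWidth(); x++) {
                    Color c = new Color(source.getRGB(x, y));
                    // Average the channels as a rough measure of brightness.
                    int brightness = (c.getRed() + c.getGreen() + c.getBlue()) / 3;
                    poster.setRGB(x, y, posterColor(brightness).getRGB());
                }
            }
            ImageIO.write(poster, "png", new File(args[1]));
        }
    }

Most of the interesting play is in choosing the thresholds and deciding how to measure brightness, which is where the experimentation came in.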
Then came some experimentation...
I liked the outputs of this third effort quite a bit, at least for the photos I gave it as input. Two of them worked out especially well. With a little doctoring in Photoshop, they would have an even more coherent feel to them, like an artist might produce with a keener eye. Pretty good results for a few fun minutes of programming.
Now, let's hope my daughter likes them. I don't think she's ever received a computer-generated present before, at least not generated by a program her dad wrote!
The images I created were gifts to her, so I'll not share them here. But if you've read this far, you deserve a little something, so I give you these:
Now that is change we can all believe in.
Brent Simmons has recently suggested that Swift would be better if it were more dynamic. Some readers have interpreted his comments as an unwillingness to learn new things. In Oldie Complains About the Old Old Ways, Simmons explains that new things don't bother him; he simply hopes that we don't lose access to what we learned in the previous generation of improvements. The entry is short and worth reading in its entirety, but the last sentence of this particular paragraph deserves to be etched in stone:
It seemed like magic, then. I later came to understand how it worked, and then it just seemed like brilliance. (Brilliance is better than magic, because you get to learn it.)
This gets close to the heart of why I love being a computer scientist.
So many of the computer programs I use every day seem like magic. This might seem odd coming from a computer scientist, who has learned how to program and who knows many of the principles that make complex software possible. Yet that complexity takes many forms, and even a familiar program can seem like magic when I'm not thinking about the details under its hood.
As a computer scientist, I get to study the techniques that make these programs work. Sometimes, I even get to look inside the new program I am using, to see the algorithms and data structures that bring to life the experience that feels like magic.
Looking under the hood reminds me that it's not really magic. It isn't always brilliance either, though. Sometimes, it's a very cool idea I've never seen or thought about before. Other times, it's merely a bunch of regular ideas, competently executed, woven together in a way that gives an illusion of magic. Regular ideas, competently executed, have their own kind of beauty.
After I study a program, I know the ideas and techniques that make it work. I can use them to make my own programs.
This fall, I will again teach a course in compiler construction. I will tell a group of juniors and seniors, in complete honesty, that every time I compile and execute a program, the compiler feels like magic to me. But I know it's not. By the end of the semester, they will know what I mean; it won't feel like magic to them any more, either. They will have learned how their compilers work. And that is even better than the magic, which will never go away completely.
After the course, they will be able to use the ideas and techniques they learn to write their own programs. Those programs will probably feel like magic to the people who use them, too.
Henry Miller, in "The Books in My Life" (1969):
Every day of his life the common man makes use of what men in other ages would have deemed miraculous means. In the range of invention, if not in powers of invention, the man of today is nearer to being a god than at any time in history. (So we like to believe!) Yet never was he less godlike. He accepts and utilizes the miraculous gifts of science unquestioningly; he is without wonder, without awe, reverence, zest, vitality, or joy. He draws no conclusions from the past, has no peace or satisfaction in the present, and is utterly unconcerned about the future. He is marking time.
It's curious to me that this was written around the same time as Stewart Brand's clarion call that we are as gods. The zeitgeist of the 1960s, perhaps.
"The Books in My Life" really has been an unexpected gift. As I noted back in November, I picked it up on a lark after reading a Paris Review interview with Miller, and have been reading it off and on since. Even though he writes mostly of books and authors I know little about, his personal reflections and writing style click with me. Occasionally, I pick up one of the books he discusses, ost recently Richard Jefferies's The Story of My Heart.
When other parts of the world seem out of sync, picking up the right book can change everything.
So, a commodity chess program is now giving odds of a pawn and a move to a world top-ten player -- and winning?
The state of computer chess certainly has changed since the fall of 1979, when I borrowed Mike Jeffers's Chess Challenger 7 and played it over and over and over. I was a rank novice, really just getting my start as a player, yet after a week or so I was able to celebrate my first win over the machine, at level 3. You know what they say about practice...
My mom stopped by our study room several times during that week, trying to get me to stop playing. It turns out that she and my dad had bought me a Chess Challenger 7 for Christmas, and she didn't want me to tire of my present before I had even unwrapped it. She didn't know just how not tired I would get of that computer. I wore it out.
When I graduated with my Ph.D., my parents bought me a Chess Champion 2150L, branded with the name of world champion Garry Kasparov. The 2150 in the computer's name was a rough indication that it played expert-level chess, much better than my CC7 and much better than me. I could beat it occasionally in a slow game, but in speed chess it pounded me mercilessly. I no longer had the time or inclination to play all night, every night, in an effort to get better, so it forever remained my master.
Now US champ Hikaru Nakamura and world champ Magnus Carlsen know how I feel. The days of any human defeating even the programs you can buy at Radio Shack have long passed.
Odds of two pawns and a move against grandmasters, and a pawn and a move against the best players in the world? Times have changed.
Michael Fogus, in the latest issue of Read-Eval-Print-λove, writes:
The book in question was Thinking Forth by Leo Brodie (Brodie 1987) and upon reading it I immediately put it into my own "personal pantheon" of influential programming books (along with SICP, AMOP, Object-Oriented Software Construction, Smalltalk Best Practice Patterns, and Programmers Guide to the 1802).
Mr. Fogus has good taste. Programmers Guide to the 1802 is new to me. I guess I need to read it.
The other five books, though, are in my own pantheon of influential programming books. Some readers may be unfamiliar with these books or their acronyms, or unaware that so many of them are available free online. Here are a few links and details:
There is one book on my own list that Fogus did not mention: Paradigms of Artificial Intelligence Programming, by Peter Norvig. It holds perhaps the top position in my personal pantheon. Subtitled "Case Studies in Common Lisp", this book teaches Common Lisp, AI programming, software engineering, and a host of other topics in a classical case studies fashion. When you finish working through this book, you are not only a better programmer; you also have working versions of a dozen classic AI programs and a couple of language interpreters.
Reading Fogus's paragraph of λove for Thinking Forth brought to mind how I felt when I discovered PAIP as a young assistant professor. I once wrote a short blog entry praising it. May these paragraphs stand as a greater testimony of my affection.
I've learned a lot from other books over the years, both books that would fit well on this list (in particular, A Programming Language by Kenneth Iverson) and others that belong on a different list (say, Gödel, Escher, Bach -- an almost incomparable book). But I treasure certain programming books in a very personal way.
This weekend has been a normal one at home, a little online and a little off, but last weekend I went offline for most of three and a half days to visit my older daughter in Boston. She has been in Jamaica Plain for eight months and had plenty of sights to show me. I hadn't spent much time in Boston since AAAI 1990 and, except for the walk across the Charles River from my MIT dorm room, had forgotten most of the details of that trip. Now that I blog, I can preserve my memories for 2042 me.
Offline. Being offline for most of three and a half days was a treat. I had my laptop on for only a couple of hours in the Chicago airport, when I used a long layover to grade one of my students' programming assignments. Thereafter, I left it in its bag, turned off. It was great to be present to my daughter and the world for a while without feeling the need to check mail or tinker with work.
Goal. At the airport, I saw several members of the Fort Lauderdale Strikers, a North American Soccer League team. I'm guessing they were passing through ORD en route to a match with the Minnesota United. From the score of the game, I think my weekend went better than theirs.
Confluence. Saturday morning, we went out for brunch at the Center Street Cafe in Jamaica Plain. We arrived a few minutes after opening. Seating is limited, so we waited outside in line.
There was one party ahead of us, a young couple. The young woman kept looking at my jacket and finally said, "Did you go to UNI?" When I told her that I teach there and that my daughter is from Cedar Falls, she told us that she is from Des Moines. The older guy behind me in line heard us discussing Iowa, asked where we were from, said that he is from Council Bluffs, and recalled that a good friend of his had gone to UNI. We all marveled at the coincidence. Our new Council Bluffs friend wondered what it was that attracted Iowans to Boston; I silently wondered if Boston depended on an influx of Iowa talent to stay fresh.
The food was excellent, too.
Walking. After brunch, my daughter and I spent several hours walking in the Arnold Arboretum and the Forest Hills Cemetery. The arboretum was not yet in bloom but still had plenty of neat things to see, as well as a prodigious hill to climb. The cemetery is full of impressive monuments and interesting sculpture.
For some reason, I got it in my head that I wanted to see e.e. cummings's cemetery marker. I did not know that it is famously difficult to find. Alas, after several hours on foot at the arboretum and cemetery in the sun, my brain was not up to the task of finding it. Now I have extra motivation to return to JP. I'll have a picture next time.
The Arts. We decided to spend Sunday afternoon at the Museum of Fine Arts, but I could have spent a week there. Our first stop was the special exhibit called Megacities, by artists from cities that have, in the last few decades, grown to populations in excess of 20 million people. These artists are responding to what this growth means for the people, their way of life, and the cities themselves. The old architecture student in me was drawn especially to two spaces created to evoke the cities that existed before the growth:
Sarah wanted to be sure to see a painting she likes, of a big storm in a valley, and otherwise was open to explore. It turned out to be Albert Bierstadt's wonderful Storm in the Mountains. I expressed interest in the impressionists, so we made sure to swing through those galleries, too. The Pissarros took more of my attention than in the past, and the Monets lived up to my expectations. We spent several minutes examining several of his Rouen Cathedral canvases and several of his Morning on the Seine works up close, then walked to the opposite corner of the room to experience them from a distance. It was hard to leave.
A New Favorite. The MFA has an extensive collection of John Singer Sargent's work, about which I had much to learn. I left the weekend with a new painting among my favorites, Sargent's "An Artist in His Studio":
Up close, the detail in the bedding grabs the eye: "Surely, never were tumbled white sheets so painted before." The artist at work.
Coincidence. My reading for the trip was The Book of Tea, Okakura Kakuzo's short book on the Japanese tea ceremony and its intimate connection to art, culture, and philosophy. Until I reached the biographical essay at the back of this 1956 edition of the book, I did not know that Okakura had a connection to the MFA in Boston:
The wholesale destruction of a nation's cultural heritage [in the late 19th century] aroused to action a small group of Japanese artists and men of letters and a handful of foreigners who seemed more concerned about the fate of Japanese art than were many native hotheads. The nucleus of this movement emerged from the Imperial University in Tokyo, with Professors Morse and Fenollosa in the lead, and with Kano Hogai, of the ancient family of artists, to act as historic instructor. Fenollosa urged his wealthy friend, William Sturgis Bigelow, to buy up whatever of value was tossed on a careless market; this was to become the core of the great Oriental collection of the Boston Museum. Okakura Kakuzo and Baron Kuki were the most energetic Japanese workers in this group.
Iowans and Japanese intersecting with Boston. The Oriental collection is definitely on the itinerary for my next visit.
Much More. We packed the weekend from morning until night, beginning with a workout at my daughter's gym and ending each night with a film. In addition to the places I've mentioned, we visited the aquarium, the North End, Boston Common, and the Public Garden, and fit in another brunch at Vee Vee and a dinner at Bella Luna. It was a weekend well-spent.
A Modern Man. I even joined the 21st century on this trip. I sent a text for no purpose other than to say 'hello'. My daughter and I streamed movies from Netflix. And I relied on my cell phone alarm to awaken to catch a cab at 5:00 AM. A weekend well-spent, indeed.
PLT Rising. One last bit of new knowledge: Northeastern University is but two short subway stops from where I got off for my visit. This means that my next visit to Jamaica Plain, should there be one, will include a visit to see the home of the PLT group there. If nothing else, I can take snapshots of the labs and offices where so much cool Racket work is done. Then maybe I could write the excursion off as a business trip.
Getting Older
In Fun With Aging, "Dean Dad" Matt Reed pontificates on reaching a Certain Age.
When Mom was the age I am now, I was in grad school. That can't possibly be right, but that's what the math says.
When my mom was the age I am now, I was already in my second year as an assistant professor, a husband, and father to a two-year-old daughter. Wow.
Getting old: what a strange thing to happen to a little boy.
That said, I am one up on Reed: I know one of Justin Bieber's recent songs and quite like it.
An Interesting Juxtaposition
Earlier this week, I read The Real Reason Middle America Should Be Angry, about St. Louis's fall from national prominence. This morning, I read The Refragmentation, Paul Graham's essay on the dissolution of the 20th century's corporate and cultural order abetted, perhaps accelerated, by computation.
Both tell a story of the rise and fall of corporations across the 20th century. Their conclusions diverge widely, though, especially on the value of government policies that affect scale. I suspect there are elements of truth in both arguments. In any case, they make interesting bookends to the week.
A Network of Links
Finally, as I tweeted yesterday, a colleague told me that he was going to search my blog. He had managed to forget where his own blog lives, and he remembered that I linked to it once.
At first, I chuckled at this situation as a comment on his forgetfulness, and ruefully as a comment on the passing of the age of the blog. But later I realized that this is as much a comment on the wonderfulness of blogging culture, in which links are life and, as long as the network is alive, conversation can be revived.
I hope he blogs again.
This morning I read three pieces with some connection to universities and learning. Each had one passage that made me smart off silently as I pedaled.
From The Humanities: What's The Big Idea?:
Boyarin describes his own research as not merely interdisciplinary but "deeply post-disciplinary." (He jokes that when he first came to Berkeley, his dream was to be 5 percent in 20 departments.)
Good luck getting tenure that way, dude.
"Deeply post-disciplinary" is a great bit of new academic jargon. Universities are very much organized by discipline. Figuring out how to support scholars who work outside the lines is a perpetual challenge, one that we really should address at scale if we want to enable universities to evolve.
From this article on Bernie Sanders's free college plan:
Big-picture principles are important, but implementation is important, too.
Hey, maybe he just needs a programmer.
Implementing big abstractions is hard enough when the substance is technical. When you throw in social systems and politics, implementing any idea that deviates very far from standard practice becomes almost impossible. Big Ball of Mud, indeed.
From Yours, Isaac Asimov: A Life in Letters:
Being taught is the intellectual analog of being loved.
I'll remind my students of this tomorrow when I give them Exam 3, on syntactic abstraction. "I just called to say 'I love you'."
Asimov is right. When I think back on all my years in school, I feel great affection for so many of my teachers, and I recall feeling their affection for me. Knowledge is not only power, says Asimov; it is happiness. When people help me learn, they offer me new ways to be happy.
(The Foundation Trilogy makes me happy, too.)
(Image courtesy of the American Go Association.)
In Why AlphaGo Matters, Ben Kamphaus writes:
AlphaGo recognises strong board positions by first recognizing visual features in the board. It's connecting movements to shapes it detects. Now, we can't see inside AlphaGo unless DeepMind decides they want to share some of the visualizations of its intermediate representations. I hope they do, as I bet they'd offer a lot of insight into both the game of Go and how AlphaGo specifically is reasoning about it.
I'm not sure seeing visualizations of AlphaGo's intermediate representations would offer much insight into either the game of Go or how AlphaGo reasons about it, but I would love to find out.
One of the things that drew me to AI when I was in high school and college was the idea that computer programs might be able to help us understand the world better. At the most prosaic level, I thought this might happen in what we had to learn in order to write an intelligent program, and in how we structured the code that we wrote. At a more interesting level, I thought that we might have a new kind of intelligence with which to interact, and this interaction would help us to learn more about the domain of the program's expertise.
Alas, computer chess advanced mostly by making computers that were even faster at applying the sort of knowledge we already have. In other domains, neural networks and then statistical approaches led to machines capable of competent or expert performance, but their advances were opaque. The programs might shed light on how to engineer systems, but the systems themselves didn't have much to say to us about their domains of expertise or competence.
Intelligent programs, but no conversation. Even when we play thousands of games against a chess computer, the opponent seems otherworldly, with no new principles emerging. Perhaps new principles are there, but we cannot see them. Unfortunately, chess computers cannot explain their reasoning to us; they cannot teach us. The result is much less interesting to me than my original dreams for AI.
Perhaps we are reaching a point now where programs such as AlphaGo can display the sort of holistic, integrated intelligence that enables them to teach us something about the game -- even if only by playing games with us. If it turns out that neural nets, which are essentially black boxes to us, are the only way to achieve AI that can work with us at a cognitive level, I will be chagrined. And most pleasantly surprised.
Marvin Minsky, one of the founders of AI, died this week. His book Semantic Information Processing made a big impression on me when I read it in grad school, and his paper Why Programming is a Good Medium for Expressing Poorly Understood and Sloppily-Formulated Ideas remains one of my favorite classic AI essays. The list of his students contains many of the great names from decades of computer science; several of them -- Daniel Bobrow, Bertram Raphael, Eugene Charniak, Patrick Henry Winston, Gerald Jay Sussman, Benjamin Kuipers, and Luc Steels -- influenced my work. Winston wrote one of my favorite AI textbooks ever, one that captured the spirit of Minsky's interest in cognitive AI.
It seems fitting that Minsky left us the same week that Google published the paper Mastering the Game of Go with Deep Neural Networks and Tree Search, which describes the work that led to AlphaGo, a program strong enough to beat an expert human Go player. (This brief article describes the accomplishment and program at a higher level.) One of the key techniques at the heart of AlphaGo is neural networks, an area Minsky pioneered in his mid-1950s doctoral dissertation and continued to work in throughout his career.
In 1969, he and Seymour Papert published a book, Perceptrons, which showed the limitations of a very simple kind of neural network. Stories about the book's claims were quickly exaggerated as they spread to people who had never read the book, and the resulting pessimism stifled neural network research for more than a decade. It is a great irony that, in the week he died, one of the most startling applications of neural networks to AI was announced.
Researchers like Minsky amazed me when I was young, and I am more amazed by them and their lifelong accomplishments as I grow older. If you'd like to learn more, check out Stephen Wolfram's personal farewell to Minsky. It gives you a peek into the wide-ranging mind that made Minsky such a force in AI for so long.
From How to Disagree:
Once disagreement starts to be seen as utterly normal, and agreement the rare and beautiful exception, we can stop being so surprised and therefore so passionately annoyed when we meet with someone who doesn't see eye-to-eye with us.
Sometimes, this attitude comes naturally to me. Other times, though, I have to work hard to make it my default stance. Things usually go better for me when I succeed.
This tweet has been making the rounds again the last few days. It pokes good fun at the modern propensity to overuse the phrase 'exponential growth', especially in situations that aren't exponential at all. This usage has even invaded the everyday speech of many of my scientist friends, and I'm probably guilty more than I'd like to admit.
In The Day I Became a Millionaire, David Heinemeier Hansson avoids this error when commenting on something he's learned about wealth and happiness:
The best things in life are free. The second best things are very, very expensive. -- Coco Chanel

While the quote above rings true, I'd add that the difference between the best things and the second best things is far, far greater than the difference between the second best things and the twentieth best things. It's not a linear scale.
I started to title this post "A Power Law of Wealth and Happiness" before realizing that I was falling into a similar trap common among computer scientists and software developers these days: calling every function with a steep end and a long tail "a power law". DHH does not claim that the relationship between cost and value is exponential, let alone that it follows a power law. I reined in my hyperbole just in time. "A Non-Linear Truth ..." may not carry quite the same weight as power-law academic-speak, but it sounds just fine.
By the way, I agree with DHH's sentiment. I'm not a millionaire, but most of the things that contribute to my happiness would scarcely be improved by another zero or two in my bank account. A little luck at birth afforded me almost all of what I need in life, as it has many other people. The rest is an expectations game that is hard to win by accumulating more.
I smiled a big smile when I read this passage in an interview with Victoria Gould, a British actor and mathematician:
And just as it did when she was at school, maths still brings Victoria relief and reassurance. "When teaching or acting becomes stressful, I retreat to maths a lot for its calmness and its patterns. I'll quite often, in a stressful time, go off and do a bit of linear algebra or some trigonometric identities. They're hugely calming for me." Maths as stress relief? "Absolutely, it works every time!"
It reminded me of a former colleague, a mathematician who now works at Ohio University. He used to say that he had pads and pencils scattered on tables and counters throughout his house, because "I never know when I'll get the urge to do some math."
Last night, I came home after a couple of days of catching up on department work and grading. Finally, it was time to relax for the holiday. What did I do first? I wrote a fun little program in Python to reverse an integer, using only arithmetic operators. Then I watched a movie with my wife. Both relaxed me.
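If you are curious, the trick in such a program is to peel digits off one end with the remainder and integer-division operators and build the reversed number up on the other. A minimal sketch of the idea, though not necessarily the exact program I wrote, looks like this:

    def reverse_integer(n):
        """Reverse the digits of a non-negative integer using only arithmetic."""
        result = 0
        while n > 0:
            result = result * 10 + n % 10   # append the last digit of n
            n = n // 10                     # drop that digit from n
        return result

    print(reverse_integer(2024))   # prints 4202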
I was fortunate as a child to find solace in fiddling with numbers and patterns. Setting up a system of equations and solving it algebraically was fun. I could while away many minutes playing with the square root key on my calculator, trying to see how long it would take me to drive a number to 1.
Then in high school I discovered programming, my ultimate retreat.
On this day, I am thankful for many people and many things, of course. But Gould's comments remind me that I am also thankful for the privilege of knowing how to program, and for the way it allows me to escape into a world away from stress and distraction. This is a gift.
Novelist Henry Miller lamented one of his greatest vices, recommending books and authors too enthusiastically, but ultimately decided that he would not apologize for it:
However, this vice of mine, as I see it, is a harmless one compared with those of political fanatics, military humbugs, vice crusaders, and other detestable types. In broadcasting to the world my admiration and affection, my gratitude and reverence, ... I fail to see that I am doing any serious harm. I may be guilty of indiscretion, I may be regarded as a naïve dolt, I may be criticized justly or unjustly for my taste, or lack of it; I may be guilty, in the high sense, of "tampering" with the destiny of others; I may be writing myself down as one more "propagandist", but -- how am I injuring anyone? I am no longer a young man. I am, to be exact, fifty-eight years of age. (Je me nomme Louis Salavin.) Instead of growing more dispassionate about books, I find the contrary is taking place.
I'm a few years younger than Messrs. Miller and Salavin, but I share this vice of Miller's, as well as his conclusion. When you reach a certain age, you realize that admiration, affection, gratitude, and reverence, especially for a favorite book or author, are all to be cherished. You want to share them with everyone you meet.
Even so, I try to rein in my vice in the same way Miller himself knew he ought in his soberer moments, by having a lighter touch when I recommend. Broadcasting one's admiration and affection too enthusiastically often has the opposite effect to the one intended. The recipients either take the recommendation on its face and read with such high expectations that they will surely be disappointed, or they instinctively (if subconsciously) react with such skepticism that they read with an eye toward deflating the recommendation.
I will say that I have been enjoying The Books In My Life, from which the above passage comes. I've never read any of Miller's novels, only a Paris Review interview with him. This book about the books that shaped him has been a pleasant introduction to Miller's erudite and deeply personal style. Alas, the occasional doses of French are lost on me without the help of Google Translate.
StrangeLoop 2015 starts tomorrow, and after a year's hiatus, I'm back. The pre-conference workshops were today, and I wish I could have been here in time for the Future of Programming workshop. Alas, I have a day job and had to teach class before hitting the road. My students knew I was eager to get away and bid me a quick goodbye as soon as we wrapped up our discussion of table-driven parsing. (They may also have been eager to finish up the scanners for their compiler project...)
As always, the conference line-up consists of strong speakers and intriguing talks throughout. Tomorrow, I'm looking forward to talks by Philip Wadler and Gary Bernhardt. Wadler is Wadler, and if anyone can shed new light in 2015 on the 'types versus unit tests' conflagration and make it fun, it's probably Bernhardt.
On Saturday, my attention is homed in on David Nolen's and Michael Bernstein's A History of Programming Languages for 2 Voices. I've been a big fan of their respective work for years, swooning on Twitter and reading their blogs and papers, and now I can see them in person. I doubt I'll be able to get close, though; they'll probably be swamped by groupies. Immediately after that talk, Matthias Felleisen is giving a talk on Racket's big-bang, showing how we can use pure functional programming to teach algebra to middle school students and fold the network into the programming language.
Saturday was to begin with a keynote by Kathy Sierra, whom I last saw many years ago at OOPSLA. I'm sad that she won't be able to attend after all, but I know that Camille Fournier's talk about hopelessness and confidence in distributed systems design will be an excellent lead-off talk for the day.
I do plan one change for this StrangeLoop: my laptop will stay in its shoulder bag during all of the talks. I'm going old school, with pen and a notebook in hand. My mind listens differently when I write notes by hand, and I have to be more frugal in the notes I take. I'm also hoping to feel a little less stress. No need to blog in real time. No need to google every paper the speakers mention. No temptation to check email and do a little work. StrangeLoop will have my full attention.
The last time I came to StrangeLoop, I read Raymond Queneau's charming and occasionally disorienting "Exercises in Style", in preparation for Crista Lopes's talk about her exercises in programming style. Neither the book nor talk disappointed. This year, I am reading The Little Prince -- for the first time, if you can believe it. I wonder if any of this year's talks draw their inspiration from Saint-Exupéry? At StrangeLoop, you can never rule that kind of connection out.
A couple of months back, someone posted a link to an interview with guitarist Steve Vai, to share its great story about how Vai came to work with Frank Zappa. I liked the entire piece, including the first paragraph, which sets the scene on how Vai got into music in the first place:
Steve Vai: I recall when I was very, very young I was always tremendously excited whenever I was listening to the radio or records. Even back then a peculiar thing happened that still happens to me today. When I listen to music I can't focus on anything else. When there's wallpaper music on the radio it's not a problem but if a good song comes on it's difficult for me to carry on a conversation or multitask. It's always odd to me when I'm listening to something or playing something for somebody and they're having a discussion in the middle of a piece of music [laughs].
I have this pattern. When a song connects with me, I want to listen; not read or talk, simply listen. And, yes, sometimes it's "just" a pop song. For a while, whenever "Shut Up and Dance" by Walk the Moon came on the radio, it had my full attention. Ah, who am I kidding? It still has that effect on me.
Also, I love Vai's phrase "wallpaper music". I often work with music on in the background, and some music I like knows how to stay there. For me, that's a useful role for songs to play. Working in an environment with some ambient noise is much better for me than working in complete silence, and music makes better ambient noise for me than life in a Starbucks.
When I was growing up, I noticed that occasionally a good song would come on the air, and my level of concentration managed to hold it at bay. When I realized that I had missed the song, I was disappointed. Invariably in those cases, I had been solving a math problem or writing a computer program. That must have been a little bit like the way Vai felt about music: I wanted to know how to do that, so I put my mind into figuring out how. I was lucky to find a career in which I can do that most of the time.
Oh, and by the way, Steve Vai can really play.
I recently discovered that the students at my university have a chess club, so I stopped over yesterday to play a couple of games. In the first, my opponent played Philidor's Defense. In the second, I played Petrov's Defense. For a moment, I felt as if we had drifted in time to a Parisian cafe, circa 1770.
Then I looked up and saw a bank of TV screens surrounded by students who were drinking lattes and using cell phones to scroll through photos. I was back from my reverie.
I am committed to being wrong bigger and more often in 2015. Yet I am mindful of Avdi Grimm's admonition:
... failure isn't always that informative. You can learn a thousand different ways to fail and never learn a single way to succeed.
To fail for failure's sake is foolish and wasteful. In writing, the awful stuff you write when you start isn't usually valuable in itself, but rather for what you learn from studying and practicing. In science, failing isn't usually valuable in itself, but rather for what you learn when you prove an idea wrong. The scientist's mindset has a built-in correction for dealing with failure: every surprising result prompts a new attempt to understand why and build a better model.
As Grimm says, be sure you know what purpose your failure will serve. Sometimes, taking bigger risks intellectually can help us get off a plateau in our thinking, or even a local maximum. The failure pays off when we pay attention to the outcome and find a better hill to climb.
... from Book 6 of The Meditations, courtesy of George Berridge:
You are not compelled to form any opinion about this matter before you, nor to disturb your peace of mind at all. Things in themselves have no power to extort a verdict from you.
This seems especially sound advice in this era, full of devices that enable other people to bombard our minds with matters they find Very Important Indeed. Maintain your peace of mind until you encounter a thing that your own mind knows to be important.
In Mathematics, Live: A Conversation with Laura DeMarco and Amie Wilkinson, Amie Wilkinson recounts the pivotal moment when she knew she wanted to be a mathematician. Insecure about her abilities in mathematics, unsure about what she wanted to do for a career, and with no encouragement, she hadn't applied to grad school. So:
I came back home to Chicago, and I got a job as an actuary. I enjoyed my work, but I started to feel like there was a hole in my existence. There was something missing. I realized that suddenly my universe had become finite. Anything I had to learn for this job, I could learn eventually. I could easily see the limits of this job, and I realized that with math there were so many things I could imagine that I would never know. That's why I wanted to go back and do math. I love that feeling of this infinite horizon.
After having written software for an insurance company during the summers before and after my senior year in college, I knew all too well the "hole in my existence" that Wilkinson talks about, the shrinking universe of many industry jobs. I was deeply interested in the ideas I had found in Gödel, Escher, Bach, and in the idea of creating an intelligent machine. There seemed no room for those ideas in the corporate world I saw.
I'm not sure when the thought of graduate school first occurred to me, though. My family was blue collar, and I didn't have much exposure to academia until I got to Ball State University. Most of my friends went out to get jobs, just like Wilkinson. I recall applying for a few jobs myself, but I never took the job search all that seriously.
At least some of the credit belongs to one of my CS professors, Dr. William Brown. Dr. Brown was an old IBM guy who seemed to know so much about how to make computers do things, from the lowest-level details of IBM System/360 assembly language and JCL up to the software engineering principles needed to write systems software. When I asked him about graduate school, he talked to me about how to select a school and a Ph.D. advisor. He also talked about the strengths and weaknesses of my preparation, and let me know that even though I had some work to do, I would be able to succeed.
These days, I am lucky even to have such conversations with my students.
For Wilkinson, DeMarco, and me, academia was a natural next step in our pursuit of the infinite horizon. But I now know that we are fortunate to work in disciplines where a lot of the interesting questions are being asked and answered by people working in "the industry". I watch with admiration as many of my colleagues do amazing things while working for companies large and small. Computer science offers so many opportunities to explore the unknown.
Reading Wilkinson's recollection brought a flood of memories to mind. I'm sure I wasn't alone in smiling at her nod to finite worlds and infinite horizons. We have a lot to be thankful for.
Yesterday, I read three passages about being wrong. First, this from a blog entry about Charles Darwin's "fantastically wrong" idea for how natural selection works:
Being wildly wrong is perfectly healthy in science, because when someone comes along to prove that you're wrong, that's progress. Somewhat embarrassing progress for the person being corrected, sure, but progress nonetheless.
Then, P.G. Wodehouse shared in his Paris Review interview that it's not all Wooster and Jeeves:
... the trouble is when you start writing, you write awful stuff.
And finally, from a touching reflection on his novelist father, this delicious sentence by Colum McCann:
He didn't see this as a failure so much as an adventure in limitations.
My basic orientation as a person is one of small steps, small progress, trying to be a little less wrong than yesterday. However, such a mindset can lead to a conservatism that inhibits changes in direction. One goal I have for 2015 is to take bigger risks intellectually, to stretch my thinking more than I have lately. I'll trust Wodehouse that when I start, I may well be awful. I'll recall Darwin's example that it's okay to be wildly wrong, because then someone will prove me wrong (maybe even me), and that will be progress. And if, like McCann's father, I can treat being wrong as merely an adventure in my limitations, perhaps fear and conservatism won't hold me back from new questions worth asking.
Mark Guzdial blogged this morning about the challenge of turning business teachers into CS teachers. Where is the passion? he asks.
These days, I wince every time I hear the word 'passion'. We apply it to so many things. We expect teachers to have passion for the courses they teach, students to have passion for the courses they take, and graduates to have passion for the jobs they do and the careers they build.
Passion is a heavy burden. In particular, I've seen it paralyze otherwise well-adjusted college students who think they need to try another major, because they don't feel a passion for the one they are currently studying. They don't realize that often passion comes later, after they master something, do it for a while, and come to appreciate it in ways they could never have imagined before. I'm sure some of these students become alumni who are discontent with their careers, because they don't feel passion.
I think requiring all CS teachers to have a passion for CS sets the bar too high. It's an unrealistic expectation of prospective teachers and of the programs that prepare them.
We can survive without passionate teachers. We should set our sights on more realistic and relevant goals:
Curiosity is so much more important than passion for most people in most contexts. If you are curious, you will like encountering new ideas and learning new skills. That enjoyment will carry you a long way. It may even help you find your passion.
Perhaps we should set similarly realistic goals for our students, too. If they are curious, professional, and competent, they will most likely be successful -- and content, if not happy. We could all do worse.
... is determined by the moments when something happens.
In the end, a person doesn't view his [or her] life as merely the average of its moments -- which, after all, is mostly nothing much, plus some sleep. Life is meaningful because it is a story, and a story's arc is determined by the moments when something happens.
So writes Atul Gawande in Being Mortal: Medicine and What Matters in the End. When I am deep in a semester, preparing a course and doing all the things that a department head must do, both big and small, the pace of life reaches a point where my mind is prone to go into cruise control. That's when I need to remind myself not to let my story become a stretch of uninterrupted white noise. I have to consciously step out of the blur and make something worthwhile -- and memorable -- happen.
I was in St. Paul this weekend to visit my younger daughter for the first time since she started college six weeks ago. (It is hard to believe I dropped my older daughter off at school for the first time three years ago.)
Friday night, my daughter treated us to a performance by Buckets and Tap Shoes. The group blended dance, music, and seductive showmanship to create a show that kept me bouncing in rhythm for two hours. The dance and music were heavy on tap and drums, which let the performers play with rhythms. They also interacted with the crowd throughout the show, which set them up for an encore finale with twenty or more audience members on stage drumming, dancing, and generally into the rhythm. Impressive.
We like her school, and the Highland Park neighborhood is quite nice, but the Twin Cities are too big and busy for my tastes. I'm glad to live where I live.
I saw a commercial recently for one of those on-line schools no one has ever heard of. In it, a non-traditional student with substantial job experience said, "At [On-Line U.], I can take classes I control."
I understand a desire for control, especially given the circumstances in which so many people go to university now. Late twenties or older, a family, a house, bills to pay. Under such conditions, school becomes a mercenary activity: get in, get a few useful skills and a credential, get out. Maximize ROI; minimize expenses.
In comparison, my years studying at a university were a luxury. I went to college straight out of high school, in a time with low tuition and even reasonable room and board. I was lucky to have a generous scholarship that defrayed my costs. But even my friends without scholarships seemed more relaxed about paying for school than students these days. It wasn't because Mom and Dad were picking up the tab, either; most of my friends paid their own way.
The cost was reasonable and as a result, perhaps, students of my era didn't feel quite the same need to control all of their classes. That is just as well, because we didn't have much control, nor much bargaining power to change how our professors worked.
What a fortunate powerlessness that was, though. In most courses, I encountered creative, intelligent professors. Once a year or so, I would walk into a course with rather pedestrian goals only to find that the professor had something different in mind, something unimagined, something wonderful. If naive, twenty-year-old Eugene had had control of all his courses, he would likely have missed out on a few experiences that changed his life.
What a great luxury it was to surrender control for eleven weeks and be surprised by new knowledge, ideas, and possibilities -- and by the professors who made the effort to take me there.
I know I was lucky in a lot of ways, and for that I am thankful. I hope that our inability or unwillingness to keep public universities affordable doesn't have as an unintended casualty the wonderful surprises that can happen in our courses.
Just this week I learned that Jon Sticklen, my PhD advisor, has moved to Michigan Tech to chair its Department of Engineering Fundamentals. As I recall, Michigan Tech focuses much of its effort on undergraduate engineering education. This makes it a good fit for Jon, who has been working on projects in engineering education at Michigan State for a number of years now, with some success. I wish him and them well.
By the way, if you can handle a strong winter, then Tech can be a great place to live. The upper peninsula of Michigan is stunning!
In her diary, Woolf once secured in words a state of mind that has waylaid me recently.
Still if one is Prometheus, if the rock is hard and the gadflies pungent, gratitude, affection, none of the nobler feelings have sway. And so this August is wasted.
And yet hope remains. August is but half past.
~~~~~
(From an entry dated Tuesday, August 18, 1921, in A Writer's Diary.)
In this New York Times article on James Baldwin's ninetieth birthday, scholar Henry Louis Gates laments:
On one hand, he's on a U.S. postage stamp; on the other hand, he's not in the Common Core.
I'm not qualified to comment on Baldwin and his place in the Common Core. In the last few months, I read several articles about and including Baldwin, and from those I have come to appreciate better his role in twentieth-century literature. But I also empathize with anyone trying to create a list of things that every American should learn in school.
What struck me in Gates's comment was the reference to the postage stamp. I'm old enough to have grown up in a world where the postage stamp held a position of singular importance in our culture. It enabled communication at a distance, whether geographical or personal. Stamps were a staple of daily life.
In such a world, appearing on a stamp was an honor. It indicated a widespread acknowledgment of a person's (or organization's, or event's) cultural impact. In this sense, the Postal Service's decision to include James Baldwin on a stamp was a sign of his importance to our culture, and a way to honor his contributions to our literature.
Alas, this would have been a much more significant and visible honor in the 1980s or even the 1990s. In the span of the last decade or so, the postage stamp has gone from relevant and essential to archaic.
When I was a boy, I collected stamps. It was a fun hobby. I still have my collection, even if it's many years out of date now. Back then, stamp collecting was a popular activity with a vibrant community of hobbyists. For all I know, that's still true. There's certainly still a vibrant market for some stamps!
But these days, whenever I use a new stamp, I feel as if I'm holding an anachronism in my hands. Computing technology played a central role in the obsolescence of the stamp, at least for personal and social communication.
Sometimes people say that we in CS need to do a better job helping potential majors see the ways in which our discipline can be used to effect change in the world. We never have to look far to find examples. If a young person wants to be able to participate in how our culture changes in the future, they can hardly do better than to know a little computer science.
A blog can be many things.
It can be an essay, a place to work out what I think, in the act of writing.
It can be a lecture, a place to teach something, however big or small, in my own way.
It can be a memoir, a place to tell stories about my life, maybe with a connection to someone else's story.
It can be a book review or a conference review, a place to tell others about something I've read or seen that they might like, too. Or not.
It can be an open letter, a place to share news, good or bad, in a broadcast that reaches many.
It can be a call for help, a request for help from anyone who receives the message and has the time and energy to respond.
It can be a riff on someone else's post. I'm not a jazz musician, but I like to quote the melodies in other people's writing. Some blog posts are my solos.
It can be a place to make connections, to think about how things are similar and different, and maybe learn something in the process.
A blog is all of these, and more.
A blog can also be a time machine. In this mode, I am the reader. My blog reminds me who I was at another time.
This effect often begins with a practical question. When I taught agile software development this summer, I looked back to when I taught it last. What had I learned then but forgotten since? How might I do a better job this time around?
When I visit blog posts from the past, though, something else can happen. I sometimes find myself reading on. The words mesmerize me and pull me forward on the page, but back in time. It is not that the words are so good that I can't stop reading. It's that they remind me who I was back then. A different person wrote those words. A different person, yet me. It's quite a feeling.
A blog can combine any number of writing forms. I am not equally good at writing in all of these forms, or even passably good in any of them. But they are me. Dave Winer has long said that a blog is the unedited voice of a person. This blog is the unedited voice of me.
When I wrote my first blog post ten years ago today, I wasn't sure if anyone wanted to hear my voice. Over the years, I've had the good fortune to interact with many readers, so I know someone is listening. That still amazes me. I'm glad that something you read here is worth the visit.
Back in those early days, I wondered if it even mattered whether anyone else would read. The blog as essay and as time machine are valuable enough on their own to make writing worth the effort to me. But I'll be honest: it helps a lot knowing that other people are reading. Even when you don't send comments by e-mail, I know you are there. Thank you for your time.
I don't write as often as I did in the beginning. But I still have things to say, so I'll keep writing.
Earlier this week, I was looking for inspiration for an exam problem in my algorithms course. I started thumbing through a text I briefly considered considering for adoption this semester. (I ended up opting for no text without considering any of them very deeply.)
The first problem I read was written with a political spin, at the expense of one of the US political parties. I was aghast. I tweeted:
This textbook uses an end-of-chapter exercise to express an opinion about a political party. I will no longer consider it for adoption.
I don't care which party is made light of or demeaned. I'm no longer interested. I don't want a political point unrelated to my course to interfere with my students' learning -- or with the relationship we are building.
In general, I don't want students to think of me in a partisan way, whether the topic is politics, religion, or anything else outside the scope of computer science. It's not my job as a CS instructor to make my students uncomfortable.
That isn't to say that I want students to think of me as bland or one-dimensional. They know me to be a sports fan and a chess player. They know I like to read books and to ride my bike. I even let them know that I'm a Billy Joel fan. All of these give me color and, while they may disagree with my tastes, none of these are likely to create distance between us.
Nor do I want them to think I have no views on important public issues of the day. I have political opinions and actually like to discuss the state of the country and world. Those discussions simply don't belong in a course on algorithms or compiler construction. In the context of a course, politics and religion are two of many unnecessary distractions.
In the end, I did use the offensive problem, but only as inspiration. I wrote a more generic problem, the sort you expect to see in an algorithms course. It included all the relevant features of the original. My problem gave the students a chance to apply what they have learned in class without any noise. The only way this problem could offend students was by forcing them to demonstrate that they are not yet ready to solve such a problem. Alas, that is an offense that every exam risks giving.
... and yes, I still owe you a write-up on possible changes to the undergraduate algorithms canon, promised in the entry linked above. I have not forgotten! Designing a new course for spring semester is a time-constrained operation.
Rand's Inbox Reboot didn't do much for me on the process and tool side of things, perhaps other than rousing a sense of familiarity. Been there. The part that stood out for me was when he talked about the ultimate effect of not being in control of the deluge:
As a leader, you define your reputation all the time. You'd like to think that you could choose the moments that define your reputation, but you don't. They are always watching and learning. They are always updating their model regarding who you are and how you lead with each observable action, no matter how large or small.
He was speaking of technical management, so I immediately thought about my time as department head and how true this passage is.
But it is also true -- crucially true -- of the relationship teachers have with their students. There are few individual moments that define how a teacher is seen by his or her students. Students are always watching and learning. They infer things about the discipline and about the teacher from every action, from every interaction. What they learn in all the little moments comes to define you in their minds.
Of course, this is even more true of parents and children. To the extent I have any regrets as a parent, it is that I sometimes overestimated the effect of the Big Moments and underestimated the effect of all those Little Moments, the ones that flow by without pomp or recognition. They probably shaped my daughters, and my daughters' perception of me, more than any particular action I took.
Start with a good set of values, then create a way of living and working that puts these values at the forefront of everything you do. Your colleagues, your students, and your children are always watching.
For a variety of reasons, the following passage came to mind today. It is from a letter that Jonathan Schoenberg wrote as part of the "Dear Me, On My First Day of Advertising" series on The Egotist forum:
You got into this business by accident, and by the generosity of people who could have easily been less generous with their time. Please don't forget it.
It's good for me to remind myself frequently of this. I hope I can be as generous with my time to my students and colleagues as so many of my professors and colleagues were with theirs. Even when it means explaining nested for-loops again.
Or: Irrational Exuberance When Programming
My wife and daughter laughed at me yesterday.
A few years ago, I blogged about implementing Farey sequences in Klein, a language for which my students at the time were writing a compiler. Klein was a minimal functional language with few control structures, few data types, and few built-in operations. Computing rational approximations using Farey's algorithm was a challenge in Klein that I likened to "integer assembly programming".
I clearly had a lot of fun with that challenge, especially when I had the chance to watch my program run using my students' compilers.
This semester, I am again teaching the compiler course, and my students are writing a compiler for a new version of Klein.
Last week, while helping my daughter with a little calculus, I ran across a fun new problem to solve in Klein:
There are two stations on opposite sides of a river. The river is 3 miles wide, and the stations are 5 miles apart along the river. We need to lay pipe between the stations. Pipe laid on land costs $2.00/foot, and pipe laid across the river costs $4.00/foot. What is the minimum cost of the project?
This is the sort of optimization problem one often encounters in calculus textbooks. The student gets to construct a couple of functions, differentiate one, and find a maximum or minimum by setting f' to 0 and solving.
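Under one reading of the setup -- run the pipe along the bank, then cut diagonally across the river -- the calculus here is quick. Measuring x in feet back along the bank from the point directly across from the second station, the cost is

    C(x) = 2 * (26400 - x) + 4 * sqrt(x^2 + 15840^2)

Setting C'(x) = -2 + 4x / sqrt(x^2 + 15840^2) to 0 gives 2x = sqrt(x^2 + 15840^2), so x = 15840 / sqrt(3), about 9,145 feet, for a total cost of roughly $107,700.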
Solving this problem in Klein creates some challenges of its own. Among them are that ideally it involves real numbers, which Klein doesn't support, and that it requires a square root function, which Klein doesn't have. But these obstacles are surmountable. We already have tools for computing roots using Newton's method in our collection of test programs. Over a 3mi-by-5mi grid, an epsilon of a few feet approximates square roots reasonably well.
My daughter's task was to use the derivative of the cost function but, after talking about the problem with her, I was interested more in "visualizing" the curve to see how the cost drops as one moves in from either end and eventually bottoms out for a particular length of pipe on land.
So I wrote a Klein program that "brute-forces" the minimum. It loops over all possible values in feet for land pipe and compares the cost at each value to the previous value. It's easy to fake such a loop with a recursive function call.
The programmer's challenge in writing this program is that Klein has no local variables other than function parameters. So I had to use helper functions to simulate caching values in temporary variables. This allowed me to give a name to a value, which makes the code more readable, but most importantly it allowed me to avoid having to recompute expensive values in what was already a computationally-expensive program.
This approach creates another, even bigger challenge for my students, the compiler writers. My Klein program is naturally tail recursive, but tail call elimination was left as an optional optimization in our class project. With activation records for all the tail calls stored on the stack, the compiled program needs a lot of run-time memory -- far more than is available on our default target machine.
How many frames do we need? Well, we need to compute the cost at every foot along the 5-mile stretch between the stations (5 miles x 5280 feet/mile), for a total of 26,400 data points. There will, of course, be other activation records while computing the last value in the loop.
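If you would like to play along without a Klein compiler handy, here is a rough Python sketch of the same brute-force search. It is my reconstruction under the reading of the problem above, not the Klein program itself, and names like isqrt_newton and cost are mine:

RIVER = 3 * 5280     # river width, in feet
ALONG = 5 * 5280     # distance between the stations along the river, in feet
LAND_COST = 2        # dollars per foot of pipe on land
WATER_COST = 4       # dollars per foot of pipe across the river

def isqrt_newton(n):
    # Integer square root by Newton's method; Klein has no sqrt, and
    # an answer within a foot or two is plenty good here.
    x = n
    while True:
        nxt = (x + n // x) // 2
        if abs(nxt - x) <= 1:
            return nxt
        x = nxt

def cost(x):
    # Total cost when the pipe leaves the bank x feet before the point
    # directly across the river from the second station.
    land = ALONG - x
    water = isqrt_newton(x * x + RIVER * RIVER)
    return LAND_COST * land + WATER_COST * water

# The Klein program fakes this loop with a tail-recursive function;
# a plain Python loop avoids piling up activation records.
best = min(range(ALONG + 1), key=cost)
print(best, cost(best))

The search should land within a few feet of the calculus answer above, which is a nice sanity check on both the math and the loop.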
Will I be able to see the answer generated by my program using my students' compilers? Only if one or more of the teams optimized tail calls away. We'll see soon enough.
So, I spent an hour or so writing Klein code and tinkering with it yesterday afternoon. I was so excited by the time I finished that I ran upstairs to tell my wife and daughter all about it: my excitement at having written the code, and the challenge it sets for my students' compilers, and how we could compute reasonable approximations of square roots of large integers even without real numbers, and how I implemented Newton's method in lieu of a sqrt, and...
That's when my wife and daughter laughed at me.
That's okay. I am a programmer. I am still excited, and I'd do it again.
When someone asked Benjamin Franklin why he had declined to seek a patent for his famous stove, he said:
I declined it from a principle which has ever weighed with me on such occasions, that as we enjoy great advantages from the inventions of others, we should be glad of an opportunity to serve others by any invention of ours.
This seems a fitting sentiment to recall as I look forward to a few days of break with my family for Thanksgiving. I know I have a lot to be thankful for, not the least of which are the inventions of so many others that confer great advantage on me. This week, I give thanks for these creations, and for the creators who shared them with me.
... you wake groggily at 5:30 on a Sunday morning. You lie in bed, half awake, as your mind begins designing a new class session for your compiler course. You never go back to sleep.
Before you rise, you have a new reading assignment, an opening exercise asking your students to write a short assembly language program, and two larger in-class exercises aimed at helping them make a good start on their compiler's run-time system.
This is a thorny topic. It's been bothering you. Now, you have a plan.
[My notes on StrangeLoop 2013: Table of Contents]
Six good talks a day is about my limit. Seven for sure. Each creates so much mental activity that my brain soon loses the ability to absorb more. Then, I need a walk.
~~~~
After Jenny Finkel's talk on machine learning, someone asked if Prismatic's system had learned any features or weights that she found surprising. I thought her answer was interesting. I paraphrase: "No. As a scientist, you should understand why the system is the way that it is, or find the bug if it shouldn't be that way."
In a way, this missed the point. I'm guessing the questioner was looking to hear about a case that required them to dig in because the answer was correct but they didn't know why yet, or incorrect and the bug wasn't obvious. But Finkel's answer shows how matter-of-fact scientists can be about what they find. The world is as it is, and scientists try to figure out why. That's all.
~~~~
The most popular corporate swag this year was stickers to adorn one's laptop case. I don't put stickers on my gear, but I like looking at other people's stickers. My favorites were the ones that did more than simply display the company name. Among them were asynchrony:
-- which is a company name but also a fun word in its own right -- and data-driven:
-- by O'Reilly. I also like the bound, graph-paper notebooks that O'Reilly hands out. Classy.
~~~~
In a previous miscellany I mentioned Double Multitasking Guy. Not me, not this time. I carried no phone, as usual, and this time I left my laptop back in the hotel room. Not having any networked technology in hand creates a different experience, if not a better one.
Foremost, having no laptop affects my blogging. I can't take notes as quickly, or as voluminously. One of the upsides of this is that it's harder for me to distract myself by writing complete sentences or fact-checking vocabulary and URLs. Quick, what is the key idea here? What do I need to look up? What do I need to learn next?
~~~~
With video recording now standard at tech conferences, and with StrangeLoop releasing its videos so quickly now, a full blow-by-blow report of each talk becomes somewhat less useful. Some people find summary reports helpful, though, because they don't want to watch the full talks or don't have the time to do so. Short reports let these folks keep up with the state of the world. Others are looking for some indication of whether they want to invest the time to watch.
For me, the reports serve another useful purpose. They let me do a little light analysis and share my personal impressions of what I hear and learn. Fortunately, that sort of blog entry still finds an audience.
A trip to my alma mater for a reunion this weekend brings to mind these words from Roger Ebert:
There is a part of me that will forever want to be walking under autumn leaves, carrying a briefcase containing the works of Shakespeare and Yeats and a portable chess set. I will pass an old tree under which once on a summer night I lay on the grass with a fragrant young woman and we quoted e.e. cummings back and forth.
I was more likely carrying Keats than Yeats and quoting Voltaire than cummings, but the feeling's the same. There is something about the age as we enter adulthood that becomes permanent in us, more so than any other time. I'm old enough to know that these memories can't hurt a thing.
I sent one daughter off to college a couple of years ago and will send another next year. In this experience, I feel more like the wistful father who penned My Dear Son. Indeed, "every age has its gifts for the man who is willing to work for them and use them temperately".
Attributions. The Ebert passage comes from his review of the film Liberal Arts. (Ebert calls it "an almost unreasonable pleasure"; I agree.) John Mellencamp wrote the line "old enough to know...".
On my first day as a faculty member at the university, twenty years ago, the department secretary sent me to Public Safety to pick up my office and building keys. "Hi, I'm Eugene Wallingford," I told the person behind the window, "I'm here to pick up my keys." She smiled, welcomed me, and handed them to me -- no questions asked.
Back at the department, I commented to one of my new colleagues that this seemed odd. No one asked to see an ID or any form of authorization. They just handed me keys giving me access to a lot of cool stuff. My colleague shrugged. There has never been a problem here with unauthorized people masquerading as new faculty members and picking up keys. Until there is a problem, isn't it nice living in a place where trust works?
Things have changed. These days, we don't order keys for faculty; we "request building access". This phrase is more accurate than a reference to keys, because it includes activating the faculty ID to open electronically-controlled doors. And we don't simply plug a new computer into an ethernet jack and let faculty start working; to get on the wireless network, we have to wait for the Active Directory server to sync with the HR system, which updates only after electronic approval of a Personnel Authorization Form that sets up the employee's payroll record. I leave that as a run-on phrase, because that's what living it feels like.
The paperwork needed to get a new faculty member up and running these days reminds me just how simple life was back in 1992. Of course, it's not really "paperwork" any more.
The quest for comeuppance is a misallocation of personal resources. -- Tyler Cowen
Far too often, my reaction to events in the world around me is to focus on other people not following rules, and the unfairness that results. It's usually not my business, and even when it is, it's a foolish waste of mental energy. Cowen expresses this truth nicely in neutral, non-judgmental language. That may help me develop a more productive mental habit.
What we have today is a wonderful bike with training wheels on. Nobody knows they are on, so nobody is trying to take them off. -- Alan Kay, paraphrased from The MIT/Brown Vannevar Bush Symposium
Kay is riffing off Douglas Engelbart's tricycle analogy, mentioned last time. As a computer scientist, and particularly one fortunate enough to have been exposed to the work of Ivan Sutherland, Engelbart, Kay and the Xerox PARC team, and so many others, I should be more keenly conscious that we are coasting along with training wheels on. I settle for limited languages and limited tools. Even sadder, when computer scientists and software developers settle for training wheels, we tend to limit everyone else's experience, too. So my apathy has consequences.
I'll try to allocate my personal resources more wisely.
I start with a seemingly random set of sentences to blog about and, in the process of writing about them, find that perhaps they aren't so random after all.
An Era of Sharing Our Stuff
Property isn't theft; property is an inefficient distribution of resources.
This assertion comes from an interesting article on "economies of scale as a service", in reaction to a Paul Graham tweet:
Will ownership turn out to be largely a hack people resorted to before they had the infrastructure to manage sharing properly?
Open-source software, the Creative Commons, crowdsourcing. The times they are a-changin'.
An Era of Observing Ourselves
If the last century was marked by the ability to observe the interactions of physical matter -- think of technologies like x-ray and radar -- this century is going to be defined by the ability to observe people through the data they share.
... from The Data Made Me Do It.
I'm not too keen on being "observed" via data by every company in the world, even as I understand the value it can bring the company and even me. But I like very much the idea that I can observe myself more easily and more productively. For years, I collected and studied data about my running and used what I learned to train and race better. Programmers are able to do this better now than ever before. You can learn a lot just by watching.
An Era of Thinking Like a Scientist
... which leads to this line attributed to John C. Reynolds, an influential computer scientist who passed away recently:
Well, we know less than we did before, but more of what we know is actually true.
It's surprising how easy it is to know stuff when we don't have any evidence at all. Observing the world methodically, building models, and comparing them to what we observe in the future helps us know less of the wrong stuff and more of the right stuff.
Not everyone need be a scientist, but we'd all be better off if more of us thought like a scientist more often.
John Burroughs, in "The Exhilarations of the Road" (1895):
[The walker] is not isolated, but one with things, with the farms and industries on either hand. The vital, universal currents play through him. He knows the ground is alive; he feels the pulses of the wind, and reads the mute language of things. His sympathies are all aroused; his senses are continually reporting messages to his mind. Wind, frost, rain, heat, cold, are something to him. He is not merely a spectator of the panorama of nature, but a participator in it. He experiences the country he passes through--tastes it, feels it, absorbs it; the traveller in his fine carriage sees it merely.
Knee surgery ended my avocation as a runner. I used to walk a lot, too, but these days I walk even more than I used to. For more than a year, I have walked to and from work almost every day, even through the Iowa winter. As both runner and walker, I recognize the exhilaration Burroughs describes. I find that I appreciate the elements rather than curse them. Wind and frost, rain and snow, heat and cold all matter. Why complain about a driving rain? The world is alive around me.
(I found Burroughs's passage in Solvitur Ambulando, a discourse on the virtues of walking in the spirit of Thoreau. I love the title as well as the essay.)
A friend put a copy of GameInformer magazine in my box yesterday with a pointer to an interview with the Great and Powerful Woz, Steve Wozniak. It's a short interview, only two pages, but it reminded me just how many cool things Wozniak (and so many others) did in the mid-1970s. It also reminded me of my younger days, coming into contact with the idea of games and machine learning for the first time.
Woz described how, after seeing Pong in a video arcade, he went home and built his own Pong game out of twenty-eight $1 chips. Steve Jobs took the game to Atari, where he encountered Nolan Bushnell, who had an idea for a single-player version of Pong. Thus did Woz design Breakout, a game with an especially apt name. It helped define Apple Computer.
The thought of building a computer game out of chips still amazes me. I was never a hardware guy growing up. I never had access to computer chips or that culture, and I had little inclination to fiddle with electronics, save for a few attempts to take apart radios and put them back together. When I designed things as a kid, they were houses or office buildings. I was going to be an architect. But Woz's story reminded me of one experience that foreshadowed my career as a computer scientist.
One year in school, I won a math contest. First prize was a copy of The Unexpected Hanging and Other Mathematical Diversions, a collection of Martin Gardner's columns from Scientific American. Chapter 8 was called "A Matchbox Game-Learning Machine". It described Hexapawn, a game played on a 3x3 board with chess pawns. The game was no more complex than Tic Tac Toe, but it was new. And I loved board games.
Gardner's article had more in store for me, though, than simply another game to study. He described how to create a "computer" -- a system of matchboxes -- that learns how to play the game! Here's how:
You make one box for each possible board position. In the box, you put different colored marbles corresponding to the moves that can be played in the position. Then you play a bunch of games against the matchbox computer. When it is the computer's turn to move, you pick up the box for that board position, shake it, and see which marble lands in the lower-right corner of the box. That's the computer's move.
When the game is over, the computer gets feedback. If it won the game, then put all the marbles back in their boxes. If it lost, punish it by keeping the marble responsible for its last move; put all the rest back in their boxes. Gardner claimed that by following this strategy, the matchbox computer would learn to play a perfect game in something under fifty moves.
This can't possibly work, can it? So I built it. And it did learn. I was happy, and amazed.
I remember experimenting a bit. Maybe a move wasn't always a loser? So I seeded the computer with more than one marble for each candidate move, so that the computer could overcome bad luck. Hexapawn is so simple that this wasn't necessary -- losing moves are losing moves -- but the computer still learned to play a perfect game, just a bit slower than before.
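If you ever want to repeat the experiment in software rather than matchboxes, the learning scheme fits in a few lines. Below is a minimal Python sketch of the same punish-on-loss idea -- not Gardner's construction, and with a tiny one-pile Nim (take one or two sticks; whoever takes the last stick wins) standing in for Hexapawn to keep things short. The multiple marbles per move mirror the seeding described above, and all the names are mine.

import random

PILE = 7   # starting number of sticks

# One "matchbox" per position the machine can face, seeded with three
# marbles per legal move so a single stroke of bad luck can't empty a box.
boxes = {n: [m for m in (1, 2) if m <= n for _ in range(3)]
         for n in range(1, PILE + 1)}

def machine_move(pile, history):
    # Shake the box for this position and see which marble falls out.
    move = random.choice(boxes[pile])
    history.append((pile, move))
    return move

def play_one_game():
    # Machine versus a random opponent. Returns (machine_won, history).
    pile, history = PILE, []
    while True:
        pile -= machine_move(pile, history)
        if pile == 0:
            return True, history      # machine took the last stick
        pile -= random.choice([m for m in (1, 2) if m <= pile])
        if pile == 0:
            return False, history     # opponent took the last stick

def train(games=500):
    for _ in range(games):
        won, history = play_one_game()
        if not won:
            # Punish the machine: confiscate the marble behind its last
            # move, unless it is the only marble left in that box.
            pile, move = history[-1]
            if len(boxes[pile]) > 1:
                boxes[pile].remove(move)

train()
for n in sorted(boxes):
    print(n, sorted(boxes[n]))

Run it for a few hundred games and the boxes for the endgame positions shed their losing marbles first. If I remember the article right, Gardner's matchbox machine also backs punishment up the game tree when a box runs empty; this sketch sidesteps that by never confiscating a box's last marble.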
This is one of the earliest experiences I remember that started me down the road of studying artificial intelligence. Reading copious amounts of science fiction pushed me in that direction, too, but this was different. I had made something, and it learned. I was hooked.
So, I wasn't a hardware kid, but I had a hardware experience. It just wasn't digital hardware. But my inclination was always more toward ideas than gadgets. My interests quickly turned to writing programs, which made it so much easier to tinker with variations and to try brand-new ideas.
(Not so quickly, though, that I turned away from my dream of being an architect. The time I spent in college studying architecture turned out to be valuable in many ways.)
Wozniak was a hardware guy, but he quickly saw the potential of software. "Games were not yet software, and [the rise of the microprocessor] triggered in my mind: microprocessors can actually program games." He called the BASIC interpreter he wrote "Game BASIC". Ever the engineer, he designed the Apple II with integrated hardware and software so that programmers could write cool games.
I don't have a lot in common with Steve Wozniak, but one thing we share is the fun we have playing games. And, in very different ways, we once made computers that changed our lives.
~~~~
The GameInformer interview is on-line for subscribers only, but there is a cool video of Wozniak playing Tetris -- and talking about George H.W. Bush and Mikhail Gorbachev!
My workout at dawn this New Year's Day brought me the following passage, from Dave Winer:
... it gets ridiculous near the end, time runs so fast, it's December just after it's January and then of course it's January again, until there's no more time.
Time goes by quickly, whether we use it well or not. We may as well use the time we have to its fullest, until there's no more time.
I spent part of my morning playing at the intersection of today and yesterday. My wife gave me a USB-equipped turntable for Christmas, which will be handy for converting some music I have on vinyl LPs to digital format. With so much music streaming on the internet these days, it's hard to believe that there is anything not available on CD or iTunes. Even when a remastered CD is available, though, sometimes it's more fun to spin the vinyl record, convert it to something more "modern", and fiddle with the resulting file to get just the right sound.
May the blink of an eye that is 2013 bring you manifold opportunities to build things and tear them down, and may you have a lot of fun along the way.
I am thankful for human beings' capacity to waste time.
We waste it in the most creative ways. My life is immeasurably better because other people have wasted time and created art and literature. Even much of the science and technology I enjoy came from people noodling around in their free time. The universe has blessed me, and us.
~~~~
At my house, Thanksgiving lasts the whole weekend. I don't mind writing a Thanksgiving blog the day after, even though the rest of the world has already moved on to Black Friday and the next season on the calendar. My family is, I suppose, wasting time.
This note of gratitude was prompted by reading a recent joint interview with Brian Eno and Ha-Joon Chang, oddities in their respective disciplines of music and economics. I am thankful for oddities such as Eno and Chang, who add to the world in ways that I cannot. I am also thankful that I live in a world that provides me access to so much wonderful information with such ease. I feel a deep sense of obligation to use my time in a way that repays these gifts I have been given.
In high school, I worked after school doing light custodial work for a local parochial school for a couple of years. One summer, a retired guy volunteered to lead me and a couple of other kids doing maintenance projects at the school and church.
One afternoon, he found me trying to loosen the lid on a paint can using one of my building keys. Yeah, that was stupid. He looked at me as if I were an alien, got a screwdriver, and opened the can.
Later that summer, I overheard him talking to the secretary. She asked how I was working out, and he said something to the effect of "nice kid, but he has no common sense".
That stung. He was right, of course, but no one likes to be thought of as not capable, or not very smart. Especially someone who likes to think of himself as someone who knows stuff.
I still remember that eavesdropped conversation after all these years. I knew just what he meant at the time, and I still do. For many years I wondered, what was wrong with me?
It's true that I didn't have much common sense as a handyman back then. To be honest, I probably still don't. I didn't have much experience doing such projects before I took that job. It's not something I learned from my dad. I'd never seen a bent key before, at least not a sturdy house key or car key, and I guess it didn't occur to me that one could bend.
The A student in me wondered why I hadn't deduced the error of my ways from first principles. As with the story of Zog, it was immediately obvious as soon as it was pointed out to me. Explanation-based learning is for real.
Over time, though, I have learned to cut myself some slack. Voltaire was right: Common sense is not so common. These days, people often say that to mean there are far too many people like me who don't have the sense to come in out of the rain. But, as the folks at Wikipedia recognize, that sentence can mean something else even more important. Common sense isn't shared whenever people have not had the same experiences, or have not learned it some other way.
Maybe there are still some things that most of us can count on as common, by virtue of living in a shared culture. But I think we generally overestimate how much of any given person's knowledge is like that. With an increasingly diverse culture, common experience and common cultural exposure are even harder to come by.
That gentleman and secretary probably forgot about their conversation within minutes, but the memory of his comment still stings a little. I don't think I'd erase the memory, though, even if I could. Every so often, it reminds me not to expect my students to have too much common sense about programs or proofs or programming languages or computers.
Maybe they just haven't had the right experiences yet. It's my job to help them learn.
Duncan Davidson tweeted a check of Google search suggestions for his name. Now, Duncan's a well known guy in a certain corner of the technical world, but he didn't even crack the top six:
A spell-corrected "dunkin donuts" did, though. Ha.
Hey, why not try "Eugene"? Until 1970, it was a much more common first name for boys than "Duncan":
I'll bet you didn't know that Eugene had ever been so popular! Alas, since 1930, it has been in steady decline. Duncan, on the other hand, has been on the rise since 1970 and passed Eugene in 2000. I haven't grabbed 2010 data to update my Name Surfer app, but I'm sure the recent trajectories have continued.
So:
Eugene, Oregon, is my Duncan Hines.
I don't make my first appearance in the top ten until we get to "eugene wal":
One more letter vaults me near the top of the list:
Finally, we add an 'i' and push the pesky Eugene Wallace off his throne. Indeed, I begin to dominate:
Even so, my blog barely nudges past Mr. Wallace. I am so unpopular that Google would rather believe users have misspelled his name than believe they are looking for my Twitter page.
I think I'd better start working on my legacy.
Last night I attended my daughter's high school orchestra concert. (She plays violin.) Early on I found myself watching the conductor rather than the performers. He was really into the performance, as many conductors are. He's a good teacher and gets pretty good sound out of a few dozen teenagers. Surely he must be proud of their performance, and at least a little proud of his own.
Maybe it's just the end of another academic year, but my next thought was, "This concert will be over in an hour." My mind flashed to a movie from the 1990s, Mr. Holland's Opus. What does the conductor feel like when it's over? Is there a sense of emptiness? What does he think about, knowing that he'll be doing this all again next year, just as he did last year? The faces will change, and maybe the musical selections, but the rest will be eerily familiar.
Then it occurred to me: This is the plight of every teacher. It is mine.
Sometimes I envy people who make things for a living. They create something that people see and use. In the case of software, they may have the opportunity to grow their handiwork, to sustain it. It's tangible. It lasts, at least for a while.
Teachers live in a different world. I think about my own situation, teaching one class a semester, usually in our upper division. Every couple of years, I see a new group of students. I have each of them in class once or twice, maybe even a few times. Then May comes, and they graduate.
To the extent that I create anything, it resides in someone else. In this way, being a teacher is less like being a writer or a creator and more like being a gardener. We help prepare others to make and do.
Like gardeners, we plant seeds. Some fall on good soil and flourish. Some fall on rocks and die. Sometimes, you don't even know which is which; you find out only years later. I have been surprised in both ways, more often pleasantly than not.
Sure, we build things, too. We CS profs write software. We university profs build research programs. These are tangible products, and they last for a while.
(We administrators create documents and spreadsheets. Let's not go there.)
But these products are not our primary task, at least not at schools like mine. It is teaching. We help students exercise their minds and grow their skills. If we are lucky, we change them in ways that go beyond our particular disciplines.
There is a different sort of rhythm to being a teacher than to being a maker. You need to be able to delay gratification, while enjoying the connections you make with students and ideas. That's one reason it's not so easy for just anyone to be a teacher, at least not for an entire career. My daughter's orchestra teacher seems to have that rhythm. I have been finding this rhythm over the course of my career, without quite losing the desire also to make things I can touch and use and share.
March 4, 2011, was a Friday like any other. I had fallen into a comfortable weekly routine: easy 5-milers on Tuesday and Thursday, a medium 7-miler on Wednesday, a hard 8 miles on the indoor track on Friday, and 8 miles on Sunday. February had been snowy and cold, which made my preferred 12-mile long run on Sundays a bit too much. In place of those miles, I ran a little faster on Fridays and looked forward to the coming spring thaw.
The morning was crisp and the roads clearer than they had been, so I decided to run outdoors. I picked out my favorite 8-mile route, an old friend I had first met when we lived on the other side of town. It passed by our new house, too, and so made the move with us.
It was an excellent run. Footing on hills and in curves is the big challenge of running outdoors in winter, so I didn't worry about pace. I breathed in the cold air, took in the old sights, and felt my body reach equilibrium with the elements.
I did not know at the time, but this would be my last run.
A flu that had been going around hit me later that day, and I was in bed sick for a few days. Then one morning, as I was starting to feel better, I woke up with a sore knee. No big deal, I thought; I'll take advantage of the extra day to be sure I've really licked that flu.
The flu left, but the knee pain did not. It got worse. I eventually went to see a specialist, who gave me the bad news, operated, gave me some more bad news, and operated a second time.
Since then, I have been rehabbing, slowly adding time and effort to my work-outs. But I have not run.
Two days ago, another first Friday of March, 52 weeks later, I had my best post-running workout yet. I still have far to go. The knee is still a little swollen, and it stiffens up after the shortest bouts of inactivity. It feels funny. But I see light.
There are days when I still feel that old urge to lace up my Asics GT-2100s and take off. I expect that summer will bring plenty more of those days. The long road to who-knows-where stretches out before me as always, but I won't be exploring it on the run.
This morning, I did something I hadn't done since Sunday, February 27. I did a long workout. That day, it was a 12-mile run in single-digit temperatures, a bright, sunny morning. This day, it was a 1-hour ride on an exercise bike in my basement. It was again a bright, sunny morning, but I was shielded from the winter cold.
It felt good.
I've been riding an exercise bike daily since mid-November or so, working my way up from the 10-15 minutes I rode occasionally in earlier physical therapy sessions, first to 30 minutes and now to 45 minutes at a time. Today, for some reason, my mind said, just keep riding.
Over the last couple of months, I have begun exercising my knee more often and for longer, as I rehab from knee surgery last summer. In addition to the exercise bike, I have been walking a lot. Most days now, I walk 4-5 miles, usually in the evening with my family. It's not running, but it's moving, and it feels good to move -- and burn a few calories. After seven months of inactivity, I had gained twenty pounds and lost my lovely figure. I'm working on both those problems now.
The last two months of 2011 offered an experience that turned my memory inside out: my therapist had me run in our athletic department's Hydroworx pool. Put simply, this is a treadmill on the floor of a pool, which can be lowered to any level. Air jets blow water at the runner to create more resistance. Running in the water blunts impact on all the joints, including the knee, so my surgeon recommended it as part of my therapy.
The first time we speeded the treadmill from walk to gentle run, I was in ecstasy. My body fell into that comfortable rhythm that runners know and love. My heart raced. At first, my mind was empty, but soon it flooded with memories of runs past. I had not felt like a runner since last March. Yet there I was, a runner again.
That feeling was bittersweet, though. I knew that I could run in the pool only for a couple of months, as part of my therapy. Once my knee regained a certain level of strength and balance, pool sessions for therapy would end. And so they did. When I make my first Internet million, perhaps I'll build such a pool at my house, but for now I am back to walking and biking.
I haven't been writing about my knee, or about not running, for a lot of small reasons. This isn't a confessional blog, and I doubt many readers are interested in hearing me go on and on about my feelings. I also haven't found myself making connections between my rehab and my teaching or my software development, as I did when I was running. My experiences have been nothing unique, mostly what musical artist John Mellencamp calls "ditch digging": just doing what little I have to do each day. There is certainly something to be learned in this experience, but at this point I have nothing special to say.
Still, riding for a full hour today, feeling a little like I did on all those Sunday long runs, reminded me of something worth remembering. When we do the little work day to day, we build something bigger. It takes patience. Another shovel of dirt.
I just gave my older daughter a tearful final kiss and hug and left her in the care of the small liberal arts college that will be her home for most of the next four years. I have tried to prepare myself for this moment over the last year, weeks, and days. Nothing could have prepared me for how I feel at this moment.
College faculty and administrators like to speak these days about the "transformative experience" that college will be for their students. After all these years of my wife and me doing everything we knew how to do to help our daughter grow into the poised, creative, curious, engaging, delightful young woman she has become, it's hard for me to imagine how much more she can grow. Yet we know she will. As she returns to us on breaks and summer vacations (we hope!), I expect not to meet a new person, but the same Sarah we have come to know and respect and love all these years. She will surely know herself better than she does now, and that will open a new side of her to us.
I am eager to watch her become ever more who she is and who she wants to be. I am eager to get to know her more, again, and still. Her future excites me.
But at this moment, I hurt as only a father or mother can.
In her invocation at today's convocation, the college chaplain prayed that the students of the Class of 2015 find "clarity of purpose". I like that phrase. Clarity of purpose can serve as a capable foundation for all these students will do. In many ways, they begin their lives anew today.
But not every young person in that quad today is just a member of the Class of 2015. One of them is my daughter, with whom I have spent so much time for the last eighteen years, teaching her and being taught by her in turn. I have a hard time imagining what those years would have been like without her -- without her boundless energy, without her love of life and books and people, without her smile and hugs. Or without her patient tutelage of a young man who occasionally lacks clarity of purpose in some things but whose sense of duty to her has never wavered.
I love you, sweetie. You are ready to spread your wings and fly. I wish you every good thing in this world and beyond. But I'm not sure I'm ready to say good-bye just yet. I hope you can teach me that, and so much more.
Another of the interviews I've read recently was The Rolling Stone's 1994 interview with Steve Jobs, when he was still at NeXT. This interview starts slowly but gets better as it goes on. The best parts are about people, not about technology. Consider this, on the source of Jobs's optimism:
Do you still have as much faith in technology today as you did when you started out 20 years ago?
Oh, sure. It's not a faith in technology. It's faith in people.
Explain that.
Technology is nothing. What's important is that you have a faith in people, that they're basically good and smart, and if you give them tools, they'll do wonderful things with them. It's not the tools that you have faith in -- tools are just tools. They work, or they don't work. It's people you have faith in or not.
I think this is a basic attitude held by many CS teachers, about both technology and the most important set of people we work with: students. Give them tools, and they will do wonderful things with them. Expose them to ideas -- intellectual tools -- and they will do wonderful things. This mentality drives me forward in much the same way that Jobs's optimism about the people he wants to use Apple's tools drives him.
I also think that this is an essential attitude when you work as part of a software development team. You can have all the cool build, test, development, and debugging tools money can buy, but in the end you are trusting people, not technology.
Then, on people from a different angle:
Are you uncomfortable with your status as a celebrity in Silicon Valley?
I think of it as my well-known twin brother. It's not me. Because otherwise, you go crazy. You read some negative article some idiot writes about you -- you just can't take it too personally. But then that teaches you not to take the really great ones too personally either. People like symbols, and they write about symbols.
I don't have to deal with celebrity status in Silicon Valley or anywhere else. I do get to read reviews of my work, though. Every three years, the faculty of my department evaluate my performance as part of the dean's review of my work and his decision to consider me for another term. I went through my second such review last winter. And, of course, frequent readers here have seen my comments on student assessments, which we do at the end of each semester. I wrote about assessments of my spring Intelligent Systems course back in May. Despite my twice-annual therapy sessions in the form of blog entries, I have a pretty good handle on these reviews, both intellectually and emotionally. Yet there is something visceral about reading even one negative comment that never quite goes away. Guys like Jobs probably do their best not to read newspaper articles and unsolicited third-party evals.
I'll have to try the twin brother gambit next semester. My favorite lesson from Jobs's answer, though, is the second part: While you learn to steel yourself against bad reviews, you learn not to take the really great ones too personally, either. Outliers is outliers. As Kipling said, all people should count with you, but none too much. The key in these evaluations is to gather information and use it to improve your performance. And that most always comes out of the middle of the curve. Treating raves and rants alike with equanimity keeps you humble and sane.
Ultimately, I think one's stance toward what others say comes back to the critical element in the first passage from Jobs: trust. If you trust people, then you can train yourself to accept reviews as a source of valuable information. If you don't, then the best you can do is ignore the feedback you receive; the worst is that you'll damage your psyche every time you read them. I'm fortunate to work in a department where I can trust. And, like Jobs, I have a surprising faith in my students' fairness and honesty. It took a few years to develop that trust and, once I did, teaching came to feel much safer.
A Teacher Learns from Coaches -- Run to the Roar
For what is genius, I ask you,
but the capacity to be obsessed?
One thing about recovering from knee surgery: it gives you lots of time to read. In between bouts of pain, therapy, and sleep, I have been reading newspapers, magazines, and a few books lying around the house, including the rest of a Dave Barry book and the excellent but flawed law school memoir One L. Right now I am enjoying immensely re-reading Isaac Asimov's Foundation trilogy. (Psychohistory -- what a great computational challenge!)
The most unusual book I've read thus far has been Run to the Roar. This is not great literature, but it is an enjoyable read. It draws its power to attract readers from perhaps the most remarkable sports streak in history at any level: the men's squash team at Trinity College has won thirteen consecutive national titles and has not lost a single match during the streak. The thread that ties the book together as a story is the tale of the eleventh national championship match, in 2009 against Princeton University. This match is considered by many to be the greatest collegiate squash match of all time, won 5-4 by Trinity in see-saw fashion. Six of the nine matches went the full five games, with three of those requiring comebacks from 0-2 down, and only one match ended in straights. Two great teams, eighteen great players, battling to the last point. In the end, Trinity survived as much as it won.
I didn't know much about squash before the match, though I used to play a little racquetball. Fortunately, the book told me enough about the game and its history that I could begin to appreciate the drama of the match and the impressive nature of the streak. Unbelievable story, really.
But the book is also about its co-author, Trinity coach Paul Assaiante, both his life and his coaching methods. The book's subtitle is "Coaching to Overcome Fear", which captures in one phrase Assaiante's approach as well as any could. He works to help his players play in the moment, with no thoughts of external concerns weighing on their minds; enjoying the game they love and the privilege they have to test their preparation and efforts on the court in battle.
Assaiante views himself as a teacher, which makes what he says and the way he says it interesting to the teacher in me. There were many passages that struck a chord with me, whether as an "I have that pattern" moment or as an idea that I might try in my classroom. In the end, I saved two passages for more thought.
The first is the passage that leads off this entry. Assaiante attributed it to Steven Millhauser. I had never heard the quote, so I googled it. I learned that Millhauser is an award-winning author. Most hits took me to pages with the quote as the start of a longer passage:
For what is genius, I ask you, but the capacity to be obsessed? Every normal child has that capacity; we have all been geniuses, you and I; but sooner or later it is beaten out of us, the glory fades, and by the age of seven most of us are nothing but wretched little adults.
What a marvelous pair of sentences. It's easy to see why the sentiment means so much to Assaiante. His players are obsessive in their training and their playing. Their coach is obsessive in his preparation and his coaching. (The subtitle of one of the better on-line articles about Trinity's streak begins "Led by an obsessive coach...".)
My favorite story of his coaching obsessiveness was how he strives to make each practice different -- different lengths, different drills, different times of day, different level of intensity, and so on. He talks of spending hours to get each practice ready for the team, ready to achieve a specific goal in the course of a season aimed at the national championship.
Indeed, Assaiante is seemingly obsessive in all parts of his life; the book relates how he conquered several personal and coaching challenges through prolonged, intense efforts to learn and master new domains. One of the sad side stories of Run to the Roar explores whether Assaiante's obsessiveness with coaching squash contributed to the considerable personal problems plaguing his oldest child.
Most really good programmers are obsessive, too -- the positive, compulsive, almost uncontrollable drive that sticks with a thorny problem until it is solved, that tracks a pernicious bug until it is squashed. Programming rewards that sort of single-mindedness, elevating it to desirable status.
I see that drive in students. Some have survived the adults and schools that seem almost aimed at killing children's curiosity and obsessiveness. My very best students have maintained their curiosity and obsessiveness and channeled them positively into creative careers and vocations.
The best teachers are obsessive, too. The colleagues I admire most for their ability to lead young minds are some of the most obsessive people I know. They, too, seem to have channeled their obsessiveness well, enabling them to lead well-adjusted lives with happy, well-adjusted spouses and children -- even as they spend hours poring over new APIs, designing and solving new assignments for their students, and studying student code to find the key thing missing from their lectures, and then making those lectures better.
(As an aside, the Millhauser quote comes from his novel, "Edwin Mullhouse: The Life and Death of an American Writer 1943-1954 by Jeffrey Cartwright", a book purportedly written by a seven-year-old. I read a couple of reviews such as this one, and I'm not sure whether I'll give it a read or not. I am certainly intrigued.)
The second passage I saved from Assaiante's book comes from Jack Barnaby, Harvard's legendary squash and tennis coach:
The greatest limitation found in teachers is a tendency for them to teach the game the way they play it. This should be avoided. A new player may be quite differently gifted, and the teacher's personal game may be in many ways inappropriate to the pupil's talents. A good teacher assesses the mental and physical gifts of his pupil and tries to adapt to them. There is no one best way to play the game.
(I think this comes from Barnaby's out-of-print Winning Squash Racquets, but I haven't confirmed it.)
One of the hardest lessons for me to learn as a young teacher was not to expect every student to think, design, or code like me. For years I struggled and probably slowed a lot of my students' learning, as they either failed to adapt to my approach or fought me. Ironically, the ones most likely to succeed in spite of me were the obsessive ones, who managed to figure things out on their own by sheer effort!
Eventually I realized that being more flexible wasn't dumbing down my course but recognizing what Barnaby knew: students may have their own abilities and skills that are quite different from mine. My job is to help them maximize their abilities as best I can, not make them imitate me. Sometimes that means helping them to change, perhaps helping them recognize the need to change, but never simply to force them into the cookie cutter of what works well for me.
Sometimes I envy coaches, who usually work with a small cadre of student-athletes for an entire year, with most or all of them studying under the coach for four years. This gives the coach time to "assess the mental and physical gifts of his pupils and try to adapt to them". I teach courses that range from 10 to 40 students in size, two a year, and my colleagues teach six sections a year. We are lucky to see some students show up multiple times over the course of their time at the university, but it is only with a select few that I have the time and energy to work individually at that level. So I try to assess the collective gifts, interests, and abilities of each class and adapt how and what I teach them as best as I am able.
In the end, I enjoyed all the threads running through Run to the Roar. I'm still intrigued by the central lesson of learning to "run to the roar", to confront our fears and see how feeble what we fear often turns out to be. I think that a lot of college students are driven by fear more than we realize -- by fear of failing in a tough major, fear of disappointing their parents, fear of not understanding, or appearing unintelligent, or not finding a career that will fulfill and sustain them. I have encountered a few students over the years in whom I came to see the fear holding them back, and on at least some occasions was able to help them face those fears more courageously, or at least I hope so.
Having read this book, I hope this fall to be more sensitive to this potential obstacle to learning and enjoyment in my class, and to be more adaptable in trying to get over, through, or around it.
After two bouts of bad news about my knee -- first the diagnosis and then the ineffectiveness of simpler fixes -- I have received good news, all things considered. A week ago Monday, I underwent a relatively new form of partial knee replacement called makoplasty. The surgeon thought the operation went very well. Good news at last!
Now I am in a second round of recovery and rehabilitation, with therapy much like the first round: lots of non-weight-bearing motion to loosen the joint and to strengthen the muscles in the joint and the rest of the leg. There is certainly more pain than last time, but it's not so bad. I did have an adverse reaction to the medication prescribed for the pain, which has been uncomfortable and slowed my recovery, but I think I have to be happy with where I am and cautiously optimistic about where I can be in a few weeks and months. While that almost certainly will not include running, I should be able to return to an active life.
An experience from the surgery reminded me that, while I may be able to become active again, I won't be a youngster anymore. This procedure required spending one night in the hospital, so that they could monitor my vitals and my incision closely for a few hours. On the overnight shift, I had a college-aged nurse's aide who helped me several times. She called me, "Honey". Twice. Each time, I felt twice my age, rather than twice hers.
Still, I look forward to the progress that a little hard work can provide toward feeling young again.
Yesterday, I read Esther Derby's recent post, Promoting Double Loop Learning in Retrospectives, which discusses ways to improve the value of our project retrospectives. Many people who don't do project retrospectives will still find Derby's article useful, because it's really about examining how we think and expanding possibilities.
One of the questions she uses to jump start deeper reflection is:
What would have to be true for [a particular practice] to work?
This is indeed a good question to ask when we are trying to make qualitative changes in our workplaces and organizations, for the reasons Derby explains. But it is also useful more generally as a communication tool.
I have a bad personal habit. When someone says something that doesn't immediately make sense to me, my first thought is sometimes, "That doesn't make sense." (Notice the two words I dropped...) Even worse, I sometimes say it out loud. That doesn't usually go over very well with the person I'm talking to.
Sometime back in the '90s, I read, in a book on personal communication, about a technique for overcoming this disrespectful tendency, which reflects a default attitude. The technique is to train yourself to think a different first thought:
What would have to be true in order for that statement to be true?
Rather than assume that what the person says is false, assume that it is true and figure out how it could be true. This accords my partner the respect he or she deserves and causes me to think about the world outside my own point of view. What I found in practice, whether with my wife or with a professional colleague, was that what they had said was true -- from their perspective. Sometimes we were starting from different sets of assumptions. Sometimes we perceived the world differently. Sometimes I was wrong! By pausing before reacting and going on the defensive (or, worse, the offensive), I found that I was saving myself from looking silly, rash, or mean.
And yes, sometimes, my partner was wrong. But now my focus was not on proving him wrong but on addressing the underlying cause of his misconception. That led to a very different sort of conversation.
So, this technique is not an exercise in fantasy. It is an exercise in more accurate perception. Sometimes, what would have to be true in the world actually is true. I just hadn't noticed. In other cases, what would have to be true in the world is how the other person perceives the world. This is an immensely useful thing to know, and it helps me to respond both more respectfully and more effectively. Rather than try to prove the statement false in some clinical way, I am better served by taking one of those two paths.
I am still not very good at this, and occasionally I slip back into old habits. But the technique has helped me to be a better husband as well as a better colleague, department head, and teacher.
Speaking as a teacher: It is simply amazing how different interactions with students can be when, after a student says something that seems to indicate they just don't get it, I ask myself, "What would have to be true in order for that statement to be true?" I have learned a lot about student misconceptions and about the inaccuracy of the signals I send students in my lectures and conversations just by stepping back and thinking, "What would have to be true..."
Sometimes, our imaginations are too small for our own good, and we need a little boost to see the world as it really is. This technique gives us one.
Many people are talking about Conan O'Brien's recent commencement address at Dartmouth, in which he delivered vintage Conan stand-up for fifteen minutes and a thoughtful, encouraging, and wise message about failure. We talk about the virtues of failure in many contexts, including start-ups, agile software development, and learning. O'Brien reminds us that failure hurts. It makes us question our dreams and ourselves. But out of the loss can come conviction, creation, and re-creation. Indeed, failing to achieve the ideals we set for ourselves is what ends up making us who we are. Your dream will change. That's okay.
If you haven't seen this speech, check it out. It really is quite good, both entertaining and educational. If you are not particularly a fan of O'Brien's stand-up, you can skip to 15:40 or even 16:15 to get to the important message at its heart.
I've been thinking about failure and liberal arts colleges in New England in recent days, as my daughter prepares to head off for the latter with a little fear of the former. So this talk meant a lot to me. She isn't sure yet what she wants to major in or do for a living. This has been tough, because she has felt subtle pressure from a lot of people that she should have a big dream, or at least have a specific goal to work toward. But she likes so many things and isn't ready to specialize yet.
So she went looking for a liberal arts college. Then she heard a lot about unemployed English grads, students who lack practical job skills, and 20-somethings with crushing loan debts and no prospect of paying them off. That's where the fear comes in...
But I think people are making a fallacious connection between undergraduate education and professional prospects. First of all, a student can go to school with a particular job path in mind, amass huge debt, and enter a profession that doesn't pay well enough to pay it off. I saw news articles in the last year that talked about problems some grads have faced with degrees in social work and counseling psychology. There is nothing wrong with these degrees per se, but the combination of low median pay and debt amassed even at public schools can be deadly.
Second, and perhaps more important, many people seem to misunderstand the nature of a liberal education. They think it means studying only "soft" academic disciplines in the humanities, such as literature, history, and philosophy. Maybe that is what most people mean by the term, but I think about it more broadly as the freedom to read and study widely. Liberal arts majors are not limited to studying only in the humanities. They can study literature and also economics, chemistry, and international relations. They can study languages and also political science and a little math; history and also graphic design. They could even learn a little computer programming.
The sciences are part of a liberal education. I think CS can be, too. And the small size of many liberal arts majors gives students the freedom to sample broadly across the spectrum of human knowledge and skills.
The danger of a liberal arts education is that some students and professors take it as license to study only in the humanities. But the ultimate value of a liberal arts education lies not in that narrow circle, as valuable and rewarding as it can be in its own right. The value lies in intersections: the ability to understand them, the ability to recognize them, and the ability to work in them. It is most desirable to learn something about a lot of different things, even real problems and real solutions in the modern world. Put together with a few key skills, the combination is powerful.
Just as it's important not to be too narrowly trained, it's important not to be too narrowly "liberally educated".
So I've encouraged my daughter not to worry about her lack of narrow focus just yet. She has a lot to learn yet, most importantly about the challenging problems that will vex humanity in the coming century. Many of them lie at the intersection of several disciplines, and solving them will be the responsibility of well-prepared minds.
After I wrote my previous post, I downloaded the PDF version of the June issue of Chess Life to check out its quality and readability. Lo and behold, the first letter to the editor said:
I would like to seek readers' help in solving my dilemma about Chess Life magazine. Since 1972 -- almost 40 years! -- I have saved every copy of Chess Life. I treasure these magazines, of course, and I want to keep the "streak" going until I pass away. However, I also recognize that mailing magazines is costly for USCF, and that it is much less "green" than reading Chess Life online. So, dear readers, what should I do? Keep the 40-year streak going despite the cost and environmental impact, or get with the times and read Chess Life online?
The motive force behind my dilemma is less base than wanting to maintain a streak, and yet more selfish than wanting to save the earth. Still, I chuckled at the unexpected conjunction served up by the universe.
While writing that post, I read back over the much older post I linked to, Electronic Communities and Dancing Animals. It contains an extended passage that I reread and thought about for a while:
I know this beauty, and I'm sure you do. We are physical beings. The ability and desire to make and share ideas distinguish us from the rest of the world, but still we are dancing animals. There seems in us an innate need to do, not just think, to move and see and touch and smell and hear. Perhaps this innate trait is why I love to run. But I am also aware that some folks can't run, or for whatever reason cannot sense our physical world in the same way. Yet many who can't still try to go out and do. At my marathon last weekend, I saw men who had lost use of their legs -- or lost their legs altogether -- making their way over 26.2 tough miles in wheelchairs. The long uphill stretches at the beginning of the course made their success seem impossible, because every time they released their wheels to grab for the next pull forward they lost a little ground. Yet they persevered. These runners' desire to achieve in the face of challenge made my own difficulties seem small.
I suspect that these runners' desire to complete the marathon had as much to do with a sense of loss as with their innate nature as physical beings. And I think that this accounts for Vonnegut's and others' sentiment about the insufficiency of electronic communities: a sense of loss as they watch the world around evolve quickly into something very different from the world in which they grew.
Living in the physical world is clearly an important part of being human. But it seems to be neither necessary nor sufficient as a condition.
Another timely conjunction. I am heartened now by the stout spirit of the runners I saw in DC. I am also reminded anew of how small my own loss is when compared to theirs, and how much more noble its source. Fortunately, I seem to have reached a state of acceptance relatively quickly, enough so that I don't feel much envy or sadness when I see runners pass by my house on the bike trail that leads to many of my favorite routes. Still, at heart, I am a dancing animal.
I remember the day I received my first issue of Chess Life & Review magazine. It was the summer of 1979, in late June or early July. I had won a membership in the U.S. Chess Federation as part of a local goodwill tournament, by virtue of beating my good buddy and only competition for the junior prize. My victory entitled me to the membership as part of $20 in loot, which also includes a portable set I use to this day and a notation book that records my games over a period of five or ten years.
That first issue arrived while I was at summer school. The cover heralded the upcoming U.S. Open championship, but inside the story of Montreal 1979, a super-GM tournament, captivated me with the games of familiar names (Karpov, Tal, Larsen, and Spassky) and new heroes (Portisch, Hübner, Hort, and Ljubojevic). A feature article reviewed films of the 1940s that featured chess and introduced me to Humphrey Bogart's love of and skill at the game I loved to play. Bogart: the man's man, the tough-guy leading man at whose feet women swooned. Bogart! The authors of the magazine's regular columns became my fast friends, and for years thereafter I looked forward monthly to Andy Soltis's fun little stories, which always seemed to teach me something, and Larry Evans's Q-n-A column, which always seemed to entertain.
I was smitten, as perhaps only a young bookish kid can be.
Though I haven't played tournament chess regularly in three decades, I have remained an occasional player, a passionate programmer, and a lovestruck fan. And I've maintained my membership in the USCF, which entitles me to a monthly issue of Chess Life. Though life as husband, father, and professor leaves me little time for the game I once played so much, every month I anticipate the arrival of my new issue, replete with new names and new games, tournament reports and feature articles, and regular columns that include Andy Soltis's "Chess to Enjoy". Hurray!
... which is all prelude to my current dilemma, a psychological condition that reveals me to be a man of my time and not a man of the future, or even the present. It's time to renew my USCF membership, and I am torn: do I opt for the membership that provides only on-line access to Chess Life?
For the last few years, ever since we moved into a new house and I came face to face with just how much stuff I have, I've been in the process of cutting back. Even before then, I had made some society membership choices based in part on how little I need more piles of paper taking up space in my house and attention in my mind. This is the 21st century, right? I am a computer scientist, who deals daily in digital materials, who has disk space beyond his wildest dreams, whose students have effortlessly grown into a digital world that makes magazines seem like quaint compendia of the past. Right?
Yet I waffle. I can save roughly $7 a year by going paperless, which is a trifle, I know, but a prudent choice nonetheless. Right?
Undoubtedly, my CL&R-turned-CL collection takes up space. If I stem the tide of incoming issues, I can circumscribe the space needed to store my archive and devote future space to more worthy application. Perhaps I could even convert some of the archive into digital form and recoup space already spent?
This move would free up space, but if I am honest, it would not free up all my attention. My magazines will join my music collection in the river of bits flowing into my future, being copied along from storage device to storage device, from medium to medium, and from software application to software application. I've lived through several generations of storage media, beginning in earnest with 5-1/4" floppies, and I'm sure I'll live through several more.
And what of changing formats? The text files that have followed me from college remain readable, for the most part, but not everything survives. For every few files I've converted from WordPerfect for DOS I have surely lost a file or two. Occasionally I run across one and ask myself, is it worth my time to try to open it and convert it to something more modern? I am sad to say that too often the answer is, well, no. This never happens to my books and magazines and pamphlets from that time. I choose to keep or to discard, and if I have it, I can read it. Where will PDF be in 50 years?
I am also just old enough that I somewhat cherish having a life that is separate from my digital existence. When I have the chance to play chess these days, I still prefer to pull out a board and set up the pieces. The feel of the ivory or plastic or wood in my hands is part of the experience -- not essential to the experience, I suppose, in a cosmic sense, but a huge ingredient in my personal experience. I have been playing chess on computers since 1980 or so, which isn't much later than when I began playing the game in earnest in grade school, so I know that feeling, too. But feeling the pieces in my hand, poring over My 60 Memorable Games (another lifelong treasure from the booty that first brought me Chess Life) line by line in search of Bobby Fischer's magic... these are a part of the game for me.
Ultimately, that's where my renewal dilemma lies, too. My memories of checking the mailbox every day at that time of the month, eager to find the next issue of the magazine. The smell of the ink as I thumbed through the pages, peeking ahead at the delights that awaited me. The feel of the pages as I turned to the next column or article or advertisement. The joy of picking out an old issue, grabbing that magnetic portable set from 30-odd years ago, and settling into a comfortable chair for an evening of reminiscence and future-making. All are a part of what chess has been for me. A cache of PDF files, $22 over three years, and a little closet space hardly seem sufficient consideration.
Alas, we are all creatures of our own times, I no less than any man. Even though I know better, I find myself pulled backward in time just as much as Kurt Vonnegut, who occasionally waxed poetic about the future of the printed book. Both Vonnegut and I realize that the future may well exceed our imaginations, but our presents retain the gifts of days past.
When we last visited this tale, I had learned that my right knee suffers from a condition known as OCD and that my life as a distance runner was likely over. Depending on the size of the lesion and the state of the bone tissue, there are several potential reparative and restorative procedures that my surgeon could perform. But running was almost certainly out of the question.
After doing some research, we decided to do arthroscopic surgery to try to repair the lesion. My surgeon hoped that he would be able to do microfracture surgery or, if the lesion were a little bigger, perhaps the OATS procedure, which transplants good cartilage to the lesion for regrowth. If the lesion were too large for either of these procedures, there was one more option, the first step of a newer technique known as CARTICEL. The expected procedures, microfracture or OATS, would require a recovery period of six to eight weeks, during which I would not be allowed to put weight on the knee but would be doing a lot of motion therapy to stimulate blood flow and tissue growth.
I went in for arthroscopy last Wednesday, May 25. It had been thirty years since I had undergone surgery, to repair the rotator cuff in my left shoulder, and this experience was quite different. Medical technology has come a long way in thirty years. We did the operation at an outpatient surgery center, which was much more comfortable than the typical hospital. I was in and out in about four hours, despite being placed under general anesthesia. I went to sleep and woke up comfortably and even recall some of the conversations I had with nurses as I left the post-op room. The surgeon spoke to my wife after surgery, but I was still out cold. That evening, I was home resting comfortably.
The surgery was one of those good news/bad news things. The good news was that my recovery would be faster than we had planned. The bad news was why: the surgeon was not able to do either microfracture or OATS, because the damage to my joint is more extensive than we thought. It looks to be more degenerative than the result of a specific trauma, which fits how it presented better than the typical cases. So, instead he removed some loose cartilage, including one large piece, and cleaned up cartilage on both sides of the joint.
For the last week, I have been doing physical therapy, using lots of non-weight-bearing motion to loosen the joint and to strengthen other muscles in the leg, so that they can take pressure off the knee when we return it to full use.
Yesterday I went in for my post-op appointment with the surgeon, to gauge the state of recovery and to discuss next steps. He showed me pictures of the inside of my knee from the scope and explained why he could not do the procedures he had planned. The reasons came down to two. First, the lesion is wider and deeper than we had hoped, and microfracture and OATS only work on shallow wounds of a few centimeters at most. Second, there is also damage on the tibia across from the lesion on the femur. This is known as a "kissing lesion" and means that any new tissue growth at the bad spot on the femur would be damaged whenever I walked and the knee joint closed.
The next thing for us to try is a partial knee replacement, in which he cleans up the damaged area and fills the lesion with a piece of something. Basically, the options are again two. One is called osteochondral allograft, which uses a bone and tissue plug taken from a cadaver. The second is to use a synthetic implant made of plastic and metal. The surgeon suggested that I may be a candidate for makoplasty, which uses computer visualization to help create the implant and an interactive robotic arm to place it in the lesion and attach it to the femur. That sounds incredibly cool. I have to be sure not to let my fascination with the technology unduly influence my decision!
At this point, my wife and I have some research to do, to decide what, if anything, we want to do next. I am on the young side for even a partial knee replacement, but medical advances are improving the longevity of the procedures' effectiveness. My surgeon is sensitive to the fact that, as a relatively young guy, I probably want to live a more active lifestyle than an unrepaired joint is likely to allow. It is a big step for me, whatever we choose.
In any case, the surgeon says I need to continue working diligently on physical therapy, to build up the muscles both in the knee and, more importantly, all the other muscles and joints in the leg. If I don't do more surgery, these muscles are essential to supporting the damaged knee; if I do opt for more surgery, these muscles need to be as strong as possible to support the knee during recovery and rehabilitation. So, off to therapy I go.
If any of my running friends are still reading, I can add this: given both the size and character of my lesion and the way it presented, the surgeon is unable to say to what extent my heavy mileage affected the condition. Clearly, heavy mileage delivers a lot of repeated trauma to our knee joints. But with no previous pain or disruption to my running, it seems almost as likely that my running delayed the onset of the bone necrosis as that it caused it. I seem simply to have been unlucky genetically in this one regard.
We professors usually write glowingly of our students. Writing bad things about students in public seems like a bad idea. Besides, we mean the good things we say. By and large students are great people to be around. We get to learn with them and watch them grow. Yet...
Yesterday, I tweeted out of an emotion I occasionally feel when I read my student evaluations after a semester ends: even if n-1 students say positive things and offer constructive suggestions for improvement, my mind focuses on the one student who was unhappy and complained unhelpfully. It's just the ego at work; objectively, every instructor realizes that whenever a bunch of students gather in one place, it is likely that a few will be unhappy. It's unrealistic -- foolish, really -- to think that everyone should like you.
Fortunately, after a few minutes (or hours, or days, if you haven't yet trained your mind in this discipline), the feeling passes and you move forward, learning from the assessment and improving the course.
Occasionally, the negative comments are not a random event. In this case, I'm pretty sure I know who was unhappy. This student had felt similarly in previous semesters. He or she is just not a fan of mine.
If we are all honest with ourselves and each other, we have to admit that the same is true for us professors. Occasionally, we encounter a student who rubs us the wrong way. It is rare, perhaps surprisingly so, but every few years I encounter a student of whom I am not a big fan. Sometimes the feeling is mutual, but not always. Occasionally, I have students who don't like me much but whom I like well enough, or students who rub me the wrong way but seem to like me fine. The good news is that, even in these circumstances, students and professors alike do a pretty good job of working together professionally. For me, it's a point of professional pride not to let how I feel about any student, positive or negative, affect my courses.
I almost titled this post "Difficult Students", but that struck me as wrong. From the student's perspective, this is about difficult instructors. And it's not really about students and instructors at all, at least most of the time. Other students enjoy my courses even when one does not; other faculty like and enjoy the students who aren't my favorites. It's about relationships, one-on-one.
And, as I wrote in the George Costanza post linked above, this is to be expected. We are all human.
~~~~
In response to my tweet, David Humphrey shared this comic to help ease the psychological trauma of even one negative student:
(If you prefer an analgesic with a harder edge, I offer you Gaping Void's take on the matter.)
Late last week, Michael Nielsen tweeted:
"The most successful people are those who are good at Plan B." -- James Yorke
This is one of my personal challenges. I am a pretty good Plan A person. Historically, though, I am a mediocre Plan B person. This is true of creating Plan B, but more importantly of recognizing and accepting the need for Plan B.
Great athletes are good at Plan B. My favorite Plan B from the sporting world was executed by Muhammad Ali in the Rumble in the Jungle, his heavyweight title fight against George Foreman in October 1974. Ali was regarded by most at that time as the best boxer in the world, but in Foreman he encountered a puncher of immense power. At the end of Round 1, Ali realized that his initial plan of attacking Foreman was unlikely to succeed, because Foreman was also a quick fighter who had begun to figure out Ali's moves. So Ali changed plans, taking on greater short-term risk by allowing Foreman to hit him as much as he wanted, so long as the blows were not the kind likely to end the fight immediately. Over the next few rounds, Foreman began to wear down, unaccustomed to throwing so many punches for so many rounds against an opponent who did not weaken. Eventually, Ali found his opening, attacked, and ended the fight in Round 8.
This fight is burned in my mind for the all-time great Plan B moment: Ali sitting on his stool between the first and second rounds, eyes as wide and white as platters. I do not ever recall seeing fear in Muhammad Ali's eyes at any other time in his career, before or after this fight. He believed that Foreman could knock him out. But rather than succumb to the fear, he gathered himself, recalculated, and fought a different fight. Plan B. The Greatest indeed.
Crazy software developer that I am, I see seeds of Plan B thinking in agile approaches. Keep Plan A simple, so that you don't overcommit. Accept Plan B as a matter of course, refactoring in each cycle to build what you learn from writing the code back into the program. React to your pair's ideas and to changes in the requirements with aplomb.
There is good news: We can learn how to be better at Plan B. It takes effort and discipline, just as changing any of our habits does. For me, it is worth the effort.
~~~~
If you would like to learn more about the Rumble in the Jungle, I strongly recommend the documentary film When We Were Kings, which tells the story of this fight and how it came to be. Excellent sport, excellent art, and you can see Ali's Plan B moment with your own eyes.
I just read this passage from The Rhythm of Life, by Matthew Kelly:
You never can get enough of what you don't really need. Fulfillment comes not from having more and more of everything forever into oblivion. Fulfillment comes from having what you need.
Kelly is talking about how we live our lives. However, I could not help but think of You Aren't Gonna Need It and agile software development.
From there, Kelly takes a moral turn, but even then I hear the agile voice within:
The whole world is chasing illegitimate wants with reckless abandon. We use all of our time, effort, and energy in the pursuit of our illegitimate wants, hypnotized by the lie that our illegitimate wants are the key to our happiness. At the same time, the gentle voice within us is constantly calling out to us, trying to encourage us not to ignore the wisdom we already possess.
There is a lot to be said for learning to be content with implementing the features we are working on right now, not features we think are coming in the future. Perhaps if we can learn to be content in life we can also learn to be content in code.
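If I were to sketch that contentment in code, it might look something like this hypothetical Python fragment (the report-formatting task and the function names are invented for illustration, not taken from any real project): implement the feature we need right now, and resist the speculative options we only imagine needing someday.

# A hypothetical illustration of "you aren't gonna need it".

# Speculative version: options nobody has asked for yet.
def format_report_speculative(items, currency="USD", locale=None,
                              export_format="txt", compress=False):
    # Every unused parameter is code to test, document, and maintain.
    raise NotImplementedError("imagined requirements, never implemented")

# Content version: exactly the feature the current story needs.
def format_report(items):
    return "\n".join(f"{name}: {total:.2f}" for name, total in items)

print(format_report([("widgets", 12.5), ("gadgets", 3.0)]))
# widgets: 12.50
# gadgets: 3.00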
Last week, I thought out loud about the university's relationship with its students, which may be different from what some are thinking it is. I just ran across an interview with Tim O'Reilly from last week about the future of his industry. His industry is different from what some are thinking it is, too:
At O'Reilly the way we think about our business is that we're not a publisher; we're not a conference producer; we're a company that helps change the world by spreading the knowledge of innovators.
My university could do worse than simply stealing O'Reilly's mission statement. There is more to our mission, of course. Research universities, at least, have historically also been about creating knowledge; universities such as mine have been as much about integrating knowledge as creating it. Public universities also serve their states in various ways, not the least of which is preparing educated citizens for participation in public life. It's hard to serve all these roles, and ultimately it all comes down to students and learning.
Mission statements and strategic plans have a bad reputation among faculty, and for good reason. They tend to be diluted corporate statements, trying to say everything and, as a result, saying nothing much. But thinking hard about the real core of the university's mission might help us evolve and survive, rather than becoming the next dinosaur.
Our mission certainly isn't about classrooms, packaged courses, and labs filled with general purpose computers. Those are implementation details, built from the technology of a particular era. Just as the book and newspaper are undergoing changes in their basic form as technology evolves, so too should the university experience. Some of the technology we use now belongs to a dying era.
That's what makes the O'Reilly statement quoted above so important. He has always recognized that his business is not defined by the technology of the time. Some people are afraid of changing technology because they do see themselves and their businesses as defined by their implementations. As technology evolves, O'Reilly is comfortable evolving his business model along with it, without abandoning what the company is really about.
Part of what made diagnosing my knee injury challenging is that the injury has not presented normally. Normally, this condition follows an obvious trauma. I did not suffer one. Normally, the symptoms include occasional locking of the joint and occasionally feeling as if the joint is going to give out. I have not experienced either. Normally, there is more pain than I seem to be having.
The doctors were surprised by this unusual presentation, but it didn't worry them much. They are used to the fact that there is no normal.
The human body is a complex machine, and people subject their bodies to a complex set of stimuli and conditions. As a result, the body responds in an unbelievable number of ways. What we think of as the "normal" path of most diseases, injuries, and processes is a composite of many examples. Each symptom or observation has some likelihood of occurring, but it is common for a particular case to look quite unusual.
This is something we learn when we study statistical methods. It's possible that no number in a set is equal to the average of all the numbers in a set. It's possible that no member in a set is normal in the sense of sharing all the features that are common to most members.
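A small, hypothetical sketch in Python makes both points concrete (the numbers and the symptom names are invented, not data from my own case): the average of a set need not belong to the set, and no single member need show every feature that a majority of the members share.

# Point 1: the mean of a set may not be a member of the set.
data = [2, 3, 3, 8]
mean = sum(data) / len(data)
print(mean, mean in data)          # 4.0 False

# Point 2: no member may match the "typical" profile.
cases = [
    {"locking": True,  "giving_way": True,  "pain": False},
    {"locking": True,  "giving_way": False, "pain": True},
    {"locking": False, "giving_way": True,  "pain": True},
]
symptoms = ("locking", "giving_way", "pain")
# Every symptom occurs in a majority of the cases...
majority = [s for s in symptoms if sum(c[s] for c in cases) > len(cases) / 2]
print(majority)                    # ['locking', 'giving_way', 'pain']
# ...yet no single case shows all of the majority symptoms.
print(any(all(c[s] for s in majority) for c in cases))     # False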
A large software system is a complex machine, and people subject software to a complex set of stimuli and conditions. As a result, the software responds in a surprising number of ways. When we think of this from the perspective of people as users, we realize just how important designing for usability, reliability, and robustness are.
Programmers are people who interact with software, too, and we subject our programs to a wide-ranging set of demands. When we think about "there is no normal" from this perspective, we better understand why it is so challenging to debug, extend, and maintain programs.
Our programs may not be as complex as the human body, and we try to design them rather than let them evolve unguided. But I think it's still useful to program with a mindset that there is no normal. That way, like my doctor, we can handle cases that seem unusual with aplomb.
I received some bad news from the doctor yesterday. It looks like my running career is over.
As I wrote earlier this month, I haven't run since March 4, when I came down with the flu. As that was ending, my right knee started to hurt and swell, though I was not aware of any injury or trauma that might have caused the symptoms.
In the few weeks since that entry, the pain and swelling have decreased but not disappeared. We finally had an MRI done on Thursday so that an orthopedic surgeon could examine me yesterday.
The diagnosis: Osteochondritis dissecans (OCD). This is a condition in which articular cartilage in a joint and the bone to which it attaches crack or pull away from the rest of the bone. OCD occurs when the subchondral bone is deprived of blood. As near as I can tell from my reading thus far, the cause of the blood deprivation itself is less well understood.
Depending on the size of the lesion and the state of the bone tissue, there are several potential reparative and restorative steps that my surgeon can take. Unfortunately, even with the best outcomes, the new tissue is more fragile than the original tissue and usually is not able to withstand high-impact activity over a long period.
That's where we get to the bad news. I almost surely cannot run any more.
After the doctor told me his diagnosis and showed me the MRI, he said, "You're taking this awfully well." To be honest, as the doctor and I talked about this, it felt as if we were discussing someone else. I'm not the sort of person who tends to show a lot of emotion in such situations anyway, but in this case the source of my dispassion was easy enough to see. In an instant, I was jolted from trying to get better to never getting better. On top of that jolt, it wasn't all that long ago that I went from running 30+ peaceful miles a week to not running at all. I was stunned.
On the walk back to my office, my conscious and subconscious minds began to process the news, trying to make sense of it. I have had a lot of thoughts since then. My first was that there was still a small chance that the lesion wouldn't look so bad under the scope, that it could heal and that I would be back on the road soon. I chuckled when I realized that I had already entered the first stage of grief: denial. That small chance does exist, but it is not a rational one on which to plan my future. The expected value of this condition is much closer to long-term problems with the knee than to "yeah, I'm running again". I chuckled because my mind was so predictable.
Not being able to run is a serious lifestyle change for someone who has run 13,000 miles in the last eight years. It also means that I will have to make other changes to my lifestyle. My modest hope is that eventually I will still be able to take walks with my wife. In the grand scheme, I would probably miss those more than I miss the running.
I'm also going to have to change my diet. As a runner, I have been able to consume a lot of calories, and it will be impossible to burn that many calories without running or other high-impact exercise. This may actually be a good thing for my health, as strange as that may sound. Burning 4000 extra calories a week covers up a multitude of eating sins. I'll have to do the right thing now. Maybe this is what people mean when they see a misfortune as an opportunity?
I've had only two really sad thoughts since hearing the news. First, I wish I had known that my last run was going to be my last run. Perhaps I could have savored it a bit more. As it is, it's already receding into the fogginess of my memory. Second, I wonder what this will mean for my sense of identity. For the last decade, I have been a runner. Being a runner was part of who I was. That is probably gone now.
Of course, when I put this into context, it's not all that bad. There could have been much worse diagnoses with much more dire consequences for my future. Much worse events could have happened to me or a loved one. As one of my followers on Twitter recently put it, this is a First World Problem: a relatively healthy, economically and politically secure white male won't be able to spend hours each week indulging himself in frivolous athletic activity for purely personal gain. I think I'll survive.
Still, it's a shock, one that I may not get used to for a while. I'm not a runner any more.
I'm fortunate to have good relationships with a number of former students, many of whom I've mentioned here over the years. Some are now close friends. To others, I am still their teacher, and we interact in much the same way as we did in the classroom. They keep me apprised of their careers and occasionally ask for advice.
I'm honored to think that a few former students might think of me as their mentor.
Steve Blank recently captured the difference between teacher and mentor as well as I've seen. Reflecting back to the beginning of his career, he considered what made his relationships with his mentors different from the relationship he had with his teachers, and different from the relationship his own students have with him now. It came down to this:
I was giving as good as I was getting. While I was learning from them -- and their years of experience and expertise -- what I was giving back to them was equally important. I was bringing fresh insights to their data. It wasn't that I was just more up to date on current technology, markets or trends; it was that I was able to recognize patterns and bring new perspectives to what these very smart people already knew. In hindsight, mentorship is a synergistic relationship.
In many ways, it's easier for a teacher to remain a teacher to his former students than to become a mentor. The teacher still feels a sense of authority and shares his wisdom when asked. The role played by both teacher and student remains essentially the same, and so the relationship doesn't need to change. It also doesn't get to grow.
There is nothing wrong with this sort of relationship, nothing at all. I enjoy being a teacher to some of my once and future students. But there is a depth to a mentorship that makes it special and harder to come by. A mentor gets to learn just as much as he teaches. The connection between mentor and young colleague does not feel as formal as the teacher/learner relationship one has in a classroom. It really is the engagement of two colleagues at different stages in their careers, sharing and learning together.
Blank's advice is sound. If what you need is a coach or a teacher, then try to find one of those. Seek a mentor when you need something more, and when you are ready and willing to contribute to the relationship.
As I said, it's an honor when a former student thinks of me as a mentor, because that means not only do they value my knowledge, expertise, and counsel but also they are willing to share their knowledge, expertise, and experience with me.
Have you ever confused a dream with life?
I haven't blogged about running in a long while. Then again, I haven't run in a while now.
If you'd rather not hear me whine about that, you should skip the rest of this post!
I was getting through winter well enough, running 30+ miles each week. The coldest weather wasn't even stopping me; long, cold runs were the norm on Sunday mornings.
On March 4, I ran a good track workout, but as the day wore on I knew I wasn't feeling well. What ensued was the worst flu or something that I can remember. It knocked me out for two full weeks, through SIGCSE and spring break. I made it to Dallas for the conference, but my paucity of SIGCSE-themed posts was certainly a product of just how bad I felt.
Just as I was ready to start running again, my right knee became the problem. I don't recall suffering any particular injury to it at the time. I woke up one day with a stiff knee. Within a couple days I felt pain while walking, and soon it was swollen and stiffer.
It's now been two weeks more. I'm still hobbling around, knee wrapped tightly to immobilize it. The swelling and pain have decreased, and I hope that means I am on a trajectory to normal function. I have a second doctor's appointment this coming week. With any luck, an MRI will be able to tell us what is going on in there.
While I don't recall suffering any particular injury to the knee recently, I suspect that this is related to an injury I do remember. In 1999, I was at ChiliPLoP. We were playing doubles tennis after a long day working on elementary patterns. About an hour in, I was back and my partner was at the net. One of our opponents hit a bunny that floated enticingly over the net on my side of the court. I called my partner off and ran in for what would be an impressive smash. My partner must not have heard me. He ran along the net, apparently with the same idea in mind. As I stretched out to strike the ball, he struck me -- solidly on the inside of my knee, which buckled outwards.
For the rest of ChiliPLoP, I was pretty well immobile. After I returned home, I went to a highly respected orthopedic surgeon. He suggested a conservative plan, letting the knee heal and then strengthening it with targeted exercise. If all seemed well, we would skip surgery and see what happened.
After rest and exercise, the knee seemed fine, so we let it be. And so it was for twelve years. And those were not twelve years of a sedentary lifestyle, either. Since 2003, I have kept close records of my running, and by my tally I have run about 13,000 miles. In all that time, I've never had any knee pain, and scant few days of running lost to injury. This injury could be the result of cumulative wear and tear, but I really would have expected to see symptoms of the wearing down over time.
I do hope that the doctor can figure out what's going on. With any luck, it's an aberration, and I'll be back on the trail soon!
(Writing this reminds me that I still haven't posted my Running Year in Review for 2010. I have started it a time or two and need only to finish. You'd think that not running would give me plenty of time...)
The latest newsletter from my graduate alma mater mentioned that one of my favorite profs is retiring. I always considered myself fortunate to have studied AI with him in my first term of grad school and later to have worked as a TA with him when he taught the department's intro course. I learned a lot from both experiences. The AI course taught me a lot about how to be a scientist. The intro course taught me a lot about how to be a teacher.
I often hear about how faculty at research schools care only about their research and therefore make for bad teachers in the undergraduate classroom. There are certainly instances of this stereotype, but I think it is not generally true. Active researchers can be bad teachers, but then again so can faculty who aren't active in research. Working as this prof's TA, I saw that even good researchers can be very good undergraduate teachers. He cared about his students, cared about their learning, and prepared his classes carefully. Those are important ingredients for good teaching no matter who is doing it.
I dropped him a quick e-mail to thank him for all his guidance while I was in grad school and to wish him well in retirement. In his response, he expressed a sentiment many teachers will recognize:
I'm sure you have been in the university business long enough to realize what a great job we have. Working with students such as you has been very rewarding.
I have, indeed, been in the university business long enough to realize what a great job we professors have. Over the years, I've had the good fortune to work with some amazing students, both undergrad and grad. That joy comes as part of working with a lot of wonderful young people along their path to professional careers and meaningful lives.
When you add the opportunity to work with students and the opportunity to explore interesting ideas and follow where they might lead us, you get a nearly unbeatable combination. It's good for me to step back every once in a while and remember that.
It's been a good start to the semester, though too busy to write much.
My Intelligent Systems course shows promise, with four teams of students taking on some interesting projects: recognizing faces, recognizing sign language, classifying transactions in a big-data finance context, and a traditional knowledge-based system to do planning. I am supervising a heavier-than-usual set of six individual research projects, including a grad student working on progressive chess and undergrads working on projects such as a Photoshop-to-HTML translator, an interpreter for a homoiconic OO language, and pure functional data structures. This all means that I have a lot to learn this semester!
I'm also still thinking about the future, as the dean's 3-year review of my work as department head proceeds. Yesterday I watched the video of Steve Jobs's commencement address at Stanford. This time around, his story about the freeing power of death grabbed special attention. Jobs gets up each day and asks himself, "If this is your last day on Earth, will you be happy doing what you are doing today?" If the answer is "no" too many days in a row, then he knows he needs to make a change.
That's a bracing form of daily ritual. When it comes to this level of self-honesty, on most days I feel more like Howard W. Campbell, Jr. than Steve Jobs. I think I also finally grok this aphorism, a favorite saying of a friend: "Real men don't accept tenure". It can be an unhealthy form of academic immortality.
The question I ask myself more often than not these days is, "Are you programming?" Let me substitute "programming" for "writing" in a passage by Norman Fischer quoted at What I've Learned So Far:
... programming is a sort of absolute bottom line. "Are you programming?" If the answer is yes, then no matter what else is going on, your life -- and all of life -- is basically OK. You are who you are supposed to be, and your existence makes sense. If the answer is no, then you are not doing well, your relationships and basic well-being are in jeopardy, and the rest of the world is dark and problematic.
A day without writing code is like, you know, night. (With apologies to Steve Martin.)
When you run a road race, you usually receive some token, usually a ribbon or a medal. The longer the race, the more likely you are to receive a medal, but even shorter distances these days often come with a medal. For marathons and half-marathons, the race medal is often a Very Big Deal, both for race sponsors and the runners.
Here is an example, from the 2004 Des Moines Marathon:
Local color plus attitude -- this is a great design!
For many runners, the race medal is an important memento. I appreciate this feeling and understand the desire to keep and display the symbol of their achievement. This feeling is perhaps strongest in first-time and one-time marathoners, who rightly see their race as the culmination of a much longer journey.
After a while, though, these mementos begin to pile up. I have run seven marathons, many half-marathons, and many more shorter races, and the result was a box full of ribbons, medals, and medallions. Over the last year or so, I have been working to reduce clutter in my house and mind, which has led me to ask myself some tough questions about the role of keepsakes. In the grand scheme of things, a shoebox of race medals is no big deal, but it was really just one manifestation of my habit of stockpiling memories: ticket stubs; programs from plays, recitals, and school programs; newspaper and magazine clippings; and, yes, race memorabilia. The list goes on. I wanted to make a change: to keep fewer physical keepsakes and to work harder to preserve the memories themselves.
Could I really give up my race medals? If so, could I just throw them away?
Walking around the race expo for the 2010 Des Moines Marathon, I discovered a better way. There I learned about Medals4Mettle, a non-profit founded in my hometown of Indianapolis that collects marathon finisher's medals and distributes them to people who have "demonstrated similar mettle" by dealing with disease, handicaps, and other challenges:
As marathoners run through the streets, large crowds cheer the runners for their effort. Medals4Mettle lets these runners, healthy enough to compete in such an event, return the cheers to those who have supported them.
Why should my medals gather dust in a box in my basement when they could cheer up a child facing a real challenge? I finish these races through a combination of great luck in birth and in life. Put in context, my accomplishments are small.
The Iowa chapter of Medals4Mettle seems to be primarily the work of one man, Jason Lawry, a Des Moines runner. I was touched by his commitment, took his card, and ran my race.
Over Christmas break, I pulled the trigger and donated my medals to Jason's group. First, I snapped digital photos of every marathon and half-marathon medal in my box. Not being a great photographer or the owner of an awesome camera, this took a while, but it allowed me to spend some time saying good-bye to my medals. In the end, I showed a small bit of weakness and kept three. The 2003 Chicago Marathon was my first and so holds a special place with me:
I ran the 2009 500 Festival Mini-Marathon in my hometown after two years of unexplained illness that brought racing and most running to a halt:
Finally, there is something about running the 2007 Marine Corps Marathon that will always stick with me. Sharing the course with our nation's military, both veterans and active duty, inspired me. Receiving this medal from uniformed Marines makes it special:
Keeping three mementoes with particular significance seemed okay, though they might well mean more to someone else. Perhaps I'll donate these medals later, as memories fade or as new ones take on greater significance. I was proud to drop off the rest of them at a local runner's store for collection by Mr. Lawry.
And, no, I did not keep that cool 2004 Des Moines medal with the gangsta Grant Wood theme. I hope it brings a smile to the face of a person who can use the smile.
Last evening, my wife and I attended the wedding of one of my former graduate students. He is from India, and his new wife is from Russia. This was a small ceremony and party, with only a few of their closest friends. They will have big celebrations in their homelands next year to commemorate the marriage with extended family. This celebration consisted of a simple civil ceremony, beautiful wedding vows, and lots of fun courtesy of their multicultural friends.
I was honored that he asked me to attend, and more honored that he considers me a close enough friend to invite to such a small and intimate affair. He arrived at my university as a student in one of my classes. Soon, he chose to work with me as his thesis advisor. In the years since he graduated, we have become colleagues and friends, talking first about our careers then about our lives. I consider him to be a friend of the highest order.
The academic world sometimes delivers great gifts to us. There was no particular reason for this talented, hard-working, generous, and loyal young man to arrive on our doorstep, among all the many schools he could have chosen. When he did, I received the good fortune of working with him academically for two years and then the good fortune of a great friend thereafter.
One of the traditions the groom and bride celebrated at their wedding was that anyone who wanted to give a toast could do so. Nearly everyone in attendance who knew either of them did so, and the groom himself spoke several times. About half way through the dinner and social time, my friend addressed the group, answering the often-asked question: how does a young man from Hyderabad end up at a regional university in the middle of America's cornfields? He told a story about the e-mail he and I exchanged as he looked for an American grad school. Though we did not know it at the time, that was the beginning of our friendship.
This story honored me, far beyond the wedding invitation, and humbled me. It was an unexpected gift as the snow began to fall at the outset of the holiday weekend.
I wish Shri and Katya a lifetime of love and their own deep friendship. A friend is an eternal gift.
This is my last day at the office before Christmas, and my mind has turned to 2011 already. I'll do some work next week but hope to spend most of my week celebrating the holiday with my families. Besides, this is the time when a professor's mind naturally turns to spring semester courses.
I have another reason to think about the future. This is the last year of my second three-year term as department head, so the dean is conducting a review of my performance. Implicit in these reviews is that it's a natural time for the dean to decide whether to continue with the current head or go in a different direction.
The head also has a decision to make. There is a lot I like about being a department head, and a lot that I can do to help the faculty and students that I could not do otherwise. But it changes the shape and tenor of every day. At research schools, chairs often count the days of their terms from Day One. For a guy who is a researcher, programmer, or teacher at heart, there is precious little time to think about and do those things.
I head into the break thinking about the future. I have a lot of people's words rolling around in my head. Last week, Seth Godin said:
If someone asks you ["What are you working on?"], are you excited to tell them the answer? I hope so. If not, you're wasting away.
Last year, Chad Fowler asked:
But I think a good first step is to ask yourself the question: "What would I rather be doing right now?" And then, "Why is that not what I'm doing?"
... and Derek Sivers exclaimed:
No more yes. It's either HELL YEAH! or no.
... and Hugh MacLeod doodled: Life is too short not to do something that matters.
The recurring theme is: Take control of life. Don't drift. Don't let others determine your life. Godin goes an extra step and reminds us that we can apply this advice without changing our careers or job titles:
No matter what your job is, no matter where you work, there's a way to create a project ... where the excitement is palpable.... Hurry, go do that.
Plenty for my mind, conscious and subconscious, to mull over.
On Thursday, my students presented their Klein compilers. Several of the groups struggled with code generation, which is a common experience in a compiler course. There are a lot of reasons, most prominently that it's a difficult task. (Some students' problems were exacerbated by not reading the textbook...)
Still, all four groups managed to get something working for a subset of the language. They worked really hard, sometimes trying crazy ideas, all in an effort to make it work.
Over the years, I have noticed that some students have this attribute: they find a way to get things done. Whatever constraints they face, even under sub-optimal conditions they create for themselves, they find a way to solve the problem or make the program meet the spec. I'm surprised how often some students get things done despite not really understanding what they are doing! (Of course, sometimes, you just gotta know stuff.)
This reminds me of a conversation I had at Clemson University back in 1994 or 1995. I was attending an NSF workshop on closed labs. We were attending the midweek social event that seems de rigueur at weeklong workshops, chatting with some Clemson CS profs who had joined us for the evening and some AP CS graders who were also stationed at Clemson for the week. The AP folks were talking about grading programs, the sort our students write in AP CS, CS1, and CS2.
One Clemson prof was surprised by how much weight the CS1 profs give to style, documentation, and presentation, relative to correctness. He said that he taught CS1 differently. Programming is hard enough, he said, that if you can find students who can write code, you should do whatever you can to encourage and develop them. We can teach style, presentation, and documentation standards to those students. Trying to teach more advanced programming skills to people who produce nice-looking programs but don't seem to "get it" is much, much harder.
He was expressing a preference for students who get stuff done.
In practice, students who major in CS come from all across the spectrum. As a professor, I would like for my courses and our academic programs to help develop the "gets things done" attribute in our students, wherever they start along that spectrum. This requires that we help them grow not only in knowledge but also in work habits. Perhaps most important is to help students develop a certain attitude toward problems, a default way of confronting the challenges they invariably encounter. Attitudes do not change easily, but they can evolve slowly over time. We profs can set a good example in how we confront challenges in class. We can also create conditions that favor a resilient approach to tough problems.
It was good for me to end the semester -- and the 2010 calendar year -- seeing that, whether by nature or nurture, some of our CS majors manage to get stuff done. That bodes well for success when they leave here.
... you get to the code generation part of the compiler course you are teaching. You realize that you have forgotten the toy assembly language from the textbook for its toy virtual machine, so you have to relearn it. You think, "It's gonna be fun to write programs in this language again."
Assembly language? A toy assembly language? Really?
I just like to program. Besides, after programming in Klein this semester, a language I have described as an integer assembly language, moving to a very RISC-like assembly language doesn't seem like that big a step down. Assembly language can be fun, though I don't think I'd want to program in it for a living!
I watched A League of Their Own with my wife and daughters tonight.
It's supposed to be hard. If it wasn't hard, everybody would do it. The hard... is what makes it great. -- Coach Jimmy Dugan
I'm thankful for the hard that makes so many of my experiences great. I'm also thankful to live in a world where my daughters have as many opportunities as they do to find and do whatever makes them whole.
I've spent considerable time this morning cleaning out the folder on my desktop where I keep stuff. In one of the dozens of notes files I've created over the last year or so, I found this unattributed quote:
In 1961, the scholar and cultural critic George Steiner argued in a controversial book, "The Death of Tragedy", that theatrical tragedies had begun their steady decline with the rise of rationalism and the Enlightenment in the 17th century. The Greek notion of tragedy was premised on man's inability to control his fate in the face of capricious, often brutal gods. But the point of a tragedy was to dramatize man's ability to make choices whatever his uncontrollable end. The emphasis was not on the death -- what the gods usually had in store -- but on what the hero died for: the state, love, his own dignity. Did he die nobly? Or with shame? For a worthy cause? Or pitiless self-interest? Tragedies, then, were ultimately "an investigation into the possibilities of human freedom", as Walter Kerr put it in "Tragedy and Comedy" (1967).
I like this passage now as much as I must have when I typed it up from some book I was reading. (I'm surprised I did not write down the source!) It reminds me that I face and make choices every day that reveal who I am. Indeed, the choices I make create who I am. That message feels especially important to me today.
And yes, I know there are better tools for keeping notes than dozens of text files thrown into nearly as many folders. I take notes using a couple of them as well. Sometimes I lack the self-discipline I need to lead an ordered life!
Playwright Arthur Miller is often quoted as saying:
Man must shape his tools lest they shape him.
I read this again yesterday, in the online book Focus, while listening to faculty at a highly-ranked local school talk about the value of a liberal arts education. The quote reminds me about one of the reasons I so like being a computer scientist. I can shape my tools. If I need a new tool, or even a new kind of tool, I can make it.
Our languages are tools, too. We can shape them, grow them, change them. We can create new ones. (Thanks, Matz.)
Via the power of the Internet, I am continuously surrounded by colleagues smarter and more motivated than I am, doing the same. I've been enjoying watching Brian Marick tweet about his thoughts and decision making as he implements Midje in Clojure. His ongoing dialog reminds me that I do not have to settle.
A few recent entries have given rise to interesting responses from readers. Here are two.
Fat Arrows
Relationships, Not Characters talked about how the most important part of design often lies in the space between the modules we create, whether objects or functions, not the modules themselves. After reading this, John Cook reminded me about an article by Thomas Guest, Distorted Software. Near the end of that piece, which talks about design diagrams, Guest suggests that the arrows in application diagrams should be larger, so that they would be proportional to the time their components take to develop. Cook says:
We typically draw big boxes and little arrows in software diagrams. But most of the work is in the arrows! We should draw fat arrows and little boxes.
I'm not sure that would make our OO class diagrams better, but it might help us to think more accurately!
My Kid Could Do That
Ideas, Execution, and Technical Achievement wistfully admitted that knowing how to build Facebook or Twitter isn't enough to become a billionaire. You have to think to do it. David Schmüdde mentioned this entry in his recent My Kid Could Do That, which starts:
One of my favorite artists is Mark Rothko. Many reject his work thinking that they're missing some genius, or offended that others see something in his work that they don't. I don't look for genius because genuine genius is a rare commodity that is only understood in hindsight and reflection. The beauty of Rothko's work is, of course, its simplicity.
That paragraph connects with one of the key points of my entry: Genius is rare, and in most ways irrelevant to what really matters. Many people have ideas; many people have skills. Great things happen when someone brings these ingredients together and does something.
Later, he writes:
The real story with Rothko is not the painting. It's what happens with the painting when it is placed in a museum, in front of people at a specific place in the world, at a specific time.
In a comment on this post, I thanked Dave, and not just because he discusses my personal reminiscence. I love art but am a novice when it comes to understanding much of it. My family and I saw an elaborate Rothko exhibit at the Smithsonian this summer. It was my first trip to the Smithsonian complex -- a wonderful two days -- and my first extended exposure to Rothko's work. I didn't reject his art, but I did leave the exhibit puzzled. What's the big deal?, I wondered. Now I have a new context in which to think about that question and Rothko's art. I didn't expect the new context to come from a connection a reader made to my post on tech start-up ideas that change the world!
I am glad to know that thinkers like Schmüdde are able to make connections like these. I should note that he is a professional artist (both visual and aural), a teacher, and a recovering computer scientist -- and a former student of mine. Opportunities to make connections arise when worlds collide.
Four or five years ago, my best buddy on campus and I were having lunch at our favorite Chinese buffet. He looked up between bites of General Tsao's and asked, "Why didn't you and I sit down five years ago and write Facebook?"
You see, he is an awesome programmer and has worked with me enough to know that I do all right myself. At various times, both of us have implemented bits and pieces of the technology that makes up Facebook. It doesn't look like all that big a deal.
I answered, "Because we didn't think of it."
The technical details may or may not have been a big deal. Once implemented, they look straightforward. In any case, though, the real reason was that it never occurred to us to write Facebook. We were techies who got along nicely with the tools available to us in 1999 or 2000, such as e-mail, wiki, and the web. If we needed to improve our experience, we did so by improving our tools. Driven by one of his own itches, Troy had done his M.S. research with me as his advisor, writing a Bayesian filter to detect spam. But neither of us thought about supplanting e-mail with a new medium.
We had the technical skills we needed to write Facebook. We just didn't have the idea of Facebook. Turns out, that matters.
That lunch conversation comes into my mind every so often. It came back yesterday when I read Philip Greenspun's blog entry on The Social Network. Greenspun wrote one of my favorite books, Philip and Alex's Guide to Web Publishing, which appeared in 1998 and which describes in great detail (and with beautiful photos) how to implement web community software. When his students ask how he feels about Zuckerberg getting rich without creating anything "new", Greenspun gives a wonderfully respectful and dispassionate answer: "I didn't envision every element of Facebook." Then he explains what he means.
Technically, Greenspun was positioned as well as or better than my buddy and I to write Facebook. But he didn't have the idea, either, at least not to the same level as Zuckerberg. Having the core of an idea is one thing. Developing it to the point that it becomes a platform that changes the world in which it lives is another. Turns out, that matters, too.
I like Lawrence Lessig's most succinct summation of what makes Zuckerberg writing Facebook a notable achievement: He did it. He didn't just have an idea, or talk about it, or dream about it. He implemented it.
That's what great hackers do.
Watch this short video to hear Zuckerberg himself say why he built it. His answer is also central to the hacker ethic: Because he wanted to.
(Also read through to the end of Lessig's article for a key point that many people miss when they think about the success and achievement of things like Facebook and Twitter and Napster: The real story is not the invention.)
Zuckerberg may or may not be a genius; I don't know, and I don't care. That is a word that modern culture throws around far too carelessly. I will say this: I don't think that creating Facebook is in itself sufficient evidence for concluding so. A lot of people have cool ideas. A more select group of people write the code to make their ideas come alive. Those people are hackers. Zuckerberg is clearly a great hacker.
I'm not a big Facebook user, but it has been on my mind more than usual the last couple of days. Yesterday was my birthday, and I was overwhelmed by all the messages I received from Facebook friends wishing me a happy day. They came from all corners of the country; from old grade-school friends I haven't seen in over thirty years; from high school and college friends; from professional friends and acquaintances. These people all took the time to type a few words of encouragement to someone hundreds of miles away in the middle of the Midwest. I felt privileged and a little humbled.
Clearly, this tool has made the world a different place. The power of the social network lies in the people. The technology merely enables the communication. That's a significant accomplishment, even if most of the effects are beyond what the creator imagined. That's the power of a good idea.
~~~~
All those years ago, my buddy and I talked about how little technical innovation there was in Facebook. Greenspun's answer reminds us that there was some. I think there is another element to consider, something that was a driving force at Strange Loop: big data. The most impressive technical achievement of Facebook and smaller web platforms such as Twitter is the scale at which they operate. They long ago outgrew naive implementations and have had to offer uninterrupted service in the face of massive numbers of users and exponential growth. Solving the problems associated with operating at such a scale is an ongoing technical challenge and a laudable achievement in its own right.
Some miscellaneous thoughts after a couple of days in the mix...
Pertaining to Knowing and Doing
** Within the recurring theme of big data, I still have a few things to study: MongoDB, CouchDB, and FlockDB. I also learned about Pig, a language I'd never heard of before. I think I need to learn it.
** I need to be sure that my students learn about the idea of multimethods. Clojure has brought them back into mainstream discussion.
** Kevin Weil, who spoke about NoSQL at Twitter, told us that his background is math and physics. Not CS. Yet another big-time developer who came to programming from another domain because they had real problems to solve.
Pertaining to the Conference
** The conference served soda all day long, from 8:00 in the morning to the end of the day. Hurray! My only suggestion: add Diet Mountain Dew to the selections.
** The conference venues consist of two rooms in a hotel, two rooms in a small arts building, and the auditorium of the Pageant Theater. The restrooms are all small. During breaks, the line for the men's room was, um, long. The women in attendance came and went without concern. This is exactly opposite of what one typically sees out in public. The women of Strange Loop have their revenge!
** This is the first time I have ever attended a conference with two laptop batteries. And I saw that it was good. Now, I just have to find out why every couple of weeks my keyboard and trackpad freeze up and effectively lock me out. Please, let it not be a failing motherboard...
Pertaining to Nothing But Me
** Like every conference, Strange Loop fills the silence between sessions with a music loop. The music the last two days has been aimed at its audience, which is mostly younger and mostly hipper than I am. I really enjoyed it. I even found a song that will enter my own rotation, "Hey, Julie" by Fountains of Wayne. You can, of course, listen to it on YouTube. I'll have to check out more Fountains of Wayne later.
** On Twitter, I follow a relatively small number of people, mostly professional colleagues who share interesting ideas and links. I also follow a few current and former students. Rounding out the set are a couple connections I made with techies through others, back when Twitter was small. I find that I enjoy their tweets even though I don't know them, or perhaps because I don't.
On Thursday, it occurred to me: Maybe it would be fun to follow some arbitrary incredibly popular person. During one of the sessions, we learned that Lady Gaga has about 6,500,000 followers, surpassing Ashton Kutcher's six million. I wonder what it would be like to have their tweets flowing in a stream with those of Brian Marick and Kevlin Henney, Kent Beck and Michael Feathers.
I'm in St. Louis now for Strange Loop, looking at the program and planning my schedule for the next two days. The abundant options nearly paralyze me... There are so many things I don't know, and so many chances to learn. But there are a limited number of time slots in any day, so the chances overlap.
I had planned to check in at the conference and then eat at The Pasta House, a local pasta chain that my family discovered when we were here in March. (I am carbo loading for the second half of my travels this week.) But after I got to the motel, I was tired from the drive and did not relish getting into my car to battle the traffic again. So I walked down the block to Bartolino's Osteria, a more upscale Italian restaurant. I was not disappointed; the petto di pollo modiga was exquisite. I'll hit the Pasta House tomorrow.
When I visit big cities, I immediately confront the fact that I am, or have become, a small-town guy. Evening traffic in St. Louis overwhelms my senses and saps my energy. I enjoy conferences and vacations in big cities, but when they end I am happy to return home.
That said, I understand some of the advantages to be found in large cities. Over the last few weeks, many people have posted this YouTube video of Steven Johnson introducing his book, "Where Good Ideas Come From". Megan McArdle's review of the book points out one of the advantages that rises out of all that traffic: lots of people mean lots of interactions:
... the adjacent possible explains why cities foster much more innovation than small towns: Cities abound with serendipitous connections. Industries, he says, may tend to cluster for the same reason. A lone company in the middle of nowhere has only the mental resources of its employees to fall back on. When there are hundreds of companies around, with workers more likely to change jobs, ideas can cross-fertilize.
This is one of the most powerful motivations for companies and state and local governments in Iowa to work together to grow a more robust IT industry. Much of the focus has been on Des Moines, the state capital and easily the largest metro area in the state, and on the Cedar Rapids/Iowa City corridor, which connects our second largest metro area with our biggest research university. Those areas are both home to our biggest IT companies and also home to a lot of people.
The best IT companies and divisions in those regions are already quite strong, but they will be made stronger by more competition, because that competition will bring more, and more diverse, people into the mix. These people will have more, and more diverse, ideas, and the larger system will create more opportunities for these ideas to bounce off one another. Occasionally, they'll conjoin to make something special.
The challenge of the adjacent possible makes me even more impressed by start-ups in my small town. People like Wade Arnold at T8 Webware are working hard to build creative programming and design shops in a city without many people. They rely on creating their own connections, at places like Strange Loop all across the country. In many ways, Wade has to think of his company as an incubator for ideas and a cultivator of people. Whereas companies in Des Moines can seek a middle ground -- large enough to support the adjacent possible but small enough to be comfortable -- companies like T8 must create the adjacent possible in any way they can.
I've been carrying this tune in my mind for the last couple of days, courtesy of singer-songwriter and fellow Indianapolis native John Hiatt:
So whatever your hands find to do
You must do with all your heart
There are thoughts enough
To blow men's minds and tear great worlds apart...
Don't ask what you are not doing
Because your voice cannot command
In time we will move mountains
And it will come through your hands
One of my deepest hopes as a parent is that I can help my daughters carry this message with them throughout their lives.
I also figure I'll be doing all right as a teacher if my students take this message with them when they graduate, whether or not they remember anything particular about design patterns or lambda.
The June 2010 issue of Communications of the ACM included An Interview with Ed Feigenbaum, who is sometimes called the father of expert systems. Feigenbaum was always an ardent promoter of AI, and time doesn't seem to have made him less brash. The interview closes with the question, "Why is AI important?" The father of expert systems pulls no punches:
There are certain major mysteries that are magnificent open questions of the greatest import. Some of the things computer scientists study are not. If you're studying the structure of databases -- well, sorry to say, that's not one of the big magnificent questions.
I agree, though occasionally I find installing and configuring Rails and MySQL on my MacBook Pro to be one of the great mysteries of life. Feigenbaum is thinking about the questions that gave rise to the field of artificial intelligence more than fifty years ago:
I'm talking about mysteries like the initiation and development of life. Equally mysterious is the emergence of intelligence. Stephen Hawking once asked, "Why does the universe even bother to exist?" You can ask the same question about intelligence. Why does intelligence even bother to exist?
That is the sort of question that captivates a high school student with an imagination bigger than his own understanding of the world. Some of those young people are motivated by a desire to create an "ultra-intelligent computer", as Feigenbaum puts it. Others are motivated more by the second prize on which AI founders set their eyes:
... a very complete model of how the human mind works. I don't mean the human brain, I mean the mind: the symbolic processing system.
That's the goal that drew the starry-eyed HS student who became the author of this blog into computer science.
Feigenbaum closes his answer with one of the more bodacious claims you'll find in any issue of Communications:
In my view the science that we call AI, maybe better called computational intelligence, is the manifest destiny of computer science.
There are, of course, many areas of computer science worthy of devoting one's professional life to. Over the years I have become deeply interested in questions related to language, expressiveness, and even the science or literacy that is programming. But it is hard for me to shake the feeling, still deep in my bones, that the larger question of understanding what we mean by "mind" is the ultimate goal of all that we do.
A friend sent me an e-mail message last night that said, among other things, "So, you've been recruiting." I began my response with, "It's a hard habit to break." Immediately, this song was in my ear. I had an irrational desire to link to it, or mention it, or at least go play it. But I doubt Rick cared to hear it, or even hear about my sudden obsession with it, and I had too much to do to take the time to surf over to YouTube and look it up.
Something like this happens to me nearly every time I sit down to write, especially when I blog. The desire to pack my entries with a dense network of links is strong. Most of those links are useful, giving readers an opportunity to explore the context of my ideas or to dig deeper into a particular idea. But every so often, I want to link to a pop song or movie reference whose connection to my entry is meaningful only to me.
YouTube did this to me. So did Hulu and Wikipedia and Google and Twitter, and the rest of the web.
What an amazing resource. What a joy to be able to meet an unexpected need or desire.
What a complete distraction.
It is hard for someone who remembers the world pre-web to overstate how amazing the resource is. These days, we are far more likely to be surprised not to find what we want than the other way around. Another friend expressed faux distress this morning when he couldn't find a video clip on-line of an old AT&T television commercial from the 70s or 80s with a Viking calling home to Mom. Shocking! The interwebs had failed him. Or Google.
Still there are days when I wonder how much having ubiquitous information at my fingertips has changed me for the worse, too. The costs of distraction are often subtle, an undertow on the flow of conscious thought. Did I really need to think about Chicago while writing e-mail about ChiliPLoP? The Internet didn't invent distraction, but it did make a cheap, universal commodity out of it.
Ultimately, this all comes back to my own weakness, the peculiar way in which my biology and experience have wired my mind for making connections, whether useful or useless. That doesn't mean the Internet isn't an enabler.
I am in a codependent relationship with the web. And we all know that a codependent relationship can be a hard habit to break.
I learned today that my colleague Mark Jacobson created a blog last spring, mostly as a commonplace book of quotes on which he was reflecting. While looking at his few posts, I came across this passage from Rainer Maria Rilke in his inaugural entry:
If your daily life seems poor, do not blame it; blame yourself that you are not poet enough to call forth its riches; for the Creator, there is no poverty.
There were a couple of times this week when I really needed to hear this, and reading it today was good fortune. It's also a great passage for computer programmers, who need never accept the grayness of their tools because they have the powers of a Creator.
I have long been a fan of Rilke's imagery and have quoted him before. I remember reading my first Rilke poem, in my high school German IV course. Our wonderful teacher, Frau Griffith, had the heart of a poet and asked those of us who had survived into the fourth year to read as much original German literature as we could, in order that we might come to understand more fully what it means to be German. (She was a deeply emotional woman who had escaped Nazi Germany on a train in the dead of night, just ahead of the local police.) I came to love Rilke and to be mesmerized by Kafka, whose work I have read in translation many times since. His short fiction is often surprising.
I do not remember just which Rilke poem we read first. I only remember that it showed me German could be beautiful. My German IV classmates and I were often teased by classmates who studied French and Spanish. They praised the fluidity of their languages and mocked the turgid prose of ours. We mocked them back for studying easier languages, but secretly we admitted that German sounded and looked uglier. Rilke showed us that we were wrong, that German syllables could flow as mellifluously as any other. Revelation!
Later Frau introduced us to the popular music of Udo Jürgens, and we were hooked.
Recently, I ran across a reference to Goethe's "The Holy Longing". I tracked it down in English and immediately understood its appeal. But the translation feels so clunky... The original German has a rhythm that is hard to capture. Read:
Keine Ferne macht dich schwierig,
Kommst geflogen und gebannt,
Und zuletzt, des Lichts begierig,
Bist du Schmetterling verbrannt.
That's not quite Rilke to my ears, but it feels right.
"Tell me you're not playing chess," my colleague said quizzically.
But I was. My newest grad student and I were sitting in my office playing a quick couple of games of progressive chess, in which I've long been interested. In progressive chess, white makes one move, then black makes two; white makes three moves, then black makes four. The game proceeds in this fashion until one of the players delivers checkmate or until the game ends in any other traditional way.
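For anyone who has never seen the game, here is a tiny sketch of how the turns escalate. (This is just the rule above made concrete in Python, not anything from our project.)

    # A toy illustration of how moves escalate in progressive chess:
    # White plays 1 move, Black plays 2, White plays 3, and so on.
    def progressive_turns(turns=6):
        """Yield (player, number of moves) for the first few turns."""
        for n in range(1, turns + 1):
            yield ("White" if n % 2 == 1 else "Black"), n

    for player, moves in progressive_turns():
        print(f"{player} plays {moves} move(s)")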
This may seem like a relatively simple change to the rules of the game, but the result is something that almost doesn't feel like chess. The values of the pieces change radically, as do the value of space and the meaning of protection. That's why we needed to play a couple of games: to acquaint my student with how different it is from the classical chess I know and love and have played since I was a child.
For his master's project, the grad student wanted to do something in the general area of game-playing and AI, and we both wanted to work on a problem that is relatively untouched, where a few cool discoveries are still accessible to mortals. Chess, the fruit fly of AI from the 1950s into the 1970s, long ago left the realm where newcomers could make much of a contribution. Chess isn't solved in the technical sense, as checkers is, but the best programs now outplay even the best humans. To improve on the state of the art requires specialty hardware or exquisitely honed software.
Progressive chess, on the other hand, has a funky feel to it and looks wide open. We are not yet aware of much work that has been done on it, either in game theory or automation. My student is just beginning his search of the literature and will know soon how much has been done and what problems have been solved, if any.
That is why we were playing chess in my office on a Wednesday afternoon, so that we could discuss some of the ways in which we will have to think differently about this problem as we explore solutions. Static evaluation of positions is most assuredly different from what works in classical chess, and I suspect that the best ways to search the state space will be quite different, too. After playing only a few games, my student proposed a new way to do search to capitalize on progressive chess's increasingly long sequences of moves by one player. I'm looking forward to exploring it further, giving it a try in code, and finding other new ideas!
I may not be an AI researcher first any more, but this project excites me. You never know what you will discover until you wander away from known territory, and this problem offers us a lot of unknowns.
And I'll get to say, "Yes, we are playing chess," every once in a while, too.
Andrew Gelman writes about a competition offered by Kaggle to find a better rating system for chess. The Elo system has been used for the last 40+ years with reasonable success. In the era of big data, powerful ubiquitous computers, and advanced statistical methods, it turns out that we can create a rating system that more accurately predicts players' performance on games in the near future. Very cool. I'm still enough of a chess geek that I want to know just when Capablanca surpassed Lasker and how much better Fischer was than his competition in the 1972 challenger's matches. I've always had an irrational preference for ridiculously precise values.
Even as we find systems that perform better, I find myself still attached to Elo. I'm sure part of it is that I grew up with Elo ratings as a player, and read Elo's The Rating of Chess Players, Past and Present as a teen.
But there's more. I've also written programs to implement the rating system, including the first program I ever wrote out of passion. Writing the code to assign initial ratings to a pool of players based on the outcomes of games played among them required me to do something I didn't even know was possible at the time: start a process that wasn't guaranteed to stop. I learned about the idea of successive approximations and how my program would have to settle for values that fit the data well enough. This was my first encounter with epsilon, and my first non-trivial use of recursion. Yes, I could have written a loop, but the algorithm seemed so clear written recursively. Such experiences stick with a person.
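If you are curious what that first program might have looked like, here is a rough sketch of the idea in Python. It is not my original code; the formula is just the common linear performance estimate, each update is damped so the estimates settle rather than oscillate, and the number of passes is capped so that, unlike my teenage version, it is guaranteed to stop.

    # Rating an unrated pool by successive approximation: each pass
    # re-estimates a player's rating as the average of the opponents'
    # ratings plus 400 * (wins - losses) / games, until nothing moves
    # by more than epsilon.
    def estimate_ratings(games, start=1500.0, epsilon=0.5, max_rounds=100):
        """games is a list of (winner, loser) pairs among unrated players."""
        players = {p for game in games for p in game}

        def refine(current, rounds):
            new = {}
            for p in players:
                opponents, score = [], 0
                for winner, loser in games:
                    if p == winner:
                        opponents.append(loser); score += 1
                    elif p == loser:
                        opponents.append(winner); score -= 1
                target = (sum(current[o] for o in opponents) / len(opponents)
                          + 400.0 * score / len(opponents))
                new[p] = (current[p] + target) / 2   # damped update
            if rounds == 0 or all(abs(new[p] - current[p]) < epsilon for p in players):
                return new
            return refine(new, rounds - 1)   # successive approximation, recursively

        return refine({p: start for p in players}, max_rounds)

    # estimate_ratings([("alice", "bob"), ("bob", "carol"), ("alice", "carol")])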
There is still more, though, beyond my personal preferences and experiences. Compared to most of the alternatives that do a better job objectively, the Elo system is simple. The probability curve is simple enough for anyone to understand, and the update process is basic arithmetic. Even better, there is a simple linear approximation of the curve that made it possible for a bunch of high school kids with no interest in math to update ratings based on games played at the club. We posted a small table of expected values based on rating differences at the front of the room and maintained the ratings on index cards. (This is a different sort of index-card computing than I wrote about long ago.) There may have been more accurate systems we could have run, but the math behind this one was so simple, and the ratings were more than good enough for our purposes. I am guessing that the Elo system is more than good enough for most people's purposes.
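For readers who have never seen it, the arithmetic really is simple. Here is a sketch in Python of the standard expected-score curve and update, along with one common straight-line approximation of the curve, the kind of thing our table at the front of the club room encoded. The constants are the usual textbook ones, not any federation's official rules.

    # The standard Elo expected score and update, plus a common linear
    # approximation of the curve (fine for modest rating differences).
    def expected_score(rating, opponent):
        return 1.0 / (1.0 + 10 ** ((opponent - rating) / 400.0))

    def expected_score_linear(rating, opponent):
        # straight-line stand-in for the curve, clipped to [0, 1]
        return min(1.0, max(0.0, 0.5 + (rating - opponent) / 800.0))

    def update(rating, opponent, score, k=32):
        """score is 1 for a win, 0.5 for a draw, 0 for a loss."""
        return rating + k * (score - expected_score(rating, opponent))

    # A 1500 player who beats a 1600 player gains about 20 points:
    # update(1500, 1600, 1) is roughly 1520.5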
Simple and good enough is a strong combination. Perhaps the Elo system will turn out to be the Newtonian physics of ratings. We know there are better, more accurate models, and we use them whenever we need something very accurate. Otherwise, we stick with the old model and get along just fine almost all the time.
This morning, a buddy of mine said something like this as part of a group e-mail discussion:
For me, the Moody Blues are a perfect artist for iTunes. Obviously, "Days of Future Passed" is a classic, but everything else is pretty much in the "one good song on an album" category.
This struck me as a reflection of an interesting way in which iTunes has re-defined how we think about music. There are a lot of great albums that everyone should own, but even for fans, most artists produce only a song or two worth keeping over the long haul. iTunes makes it possible to cherry-pick individual songs in a way that relegates albums to an afterthought. A singer or band has achieved something notable if people want to buy the whole album.
That's not the only standard measure I encountered in that discussion.
After the same guy said, "The perfect Moody Blues disc collection is a 2-CD collection with the entirety of 'Days of Future Passed' and whatever else you can fit", another buddy agreed and went further (again paraphrased):
"Days of Future Passed" is just over half an 80-minute CD, and then I grabbed another 8 or 9 songs. That worked out right for me.Even though CDs are semi-obsolete in this context, they still serve a purpose, as a sort of threshold for asking the question "How much music from this band do I really want to rip?"
When I was growing up, the standard was the 90-minute cassette tape. Whenever I created a collection for a band from a set of albums I did not want to own, I faced two limits: forty-five minutes on a side, and ninety minutes total. Those constraints caused me many moments of uncertainty as I tried to cull my list of songs into two lists that fit. Those moments were fun, though, too, because I spent a lot of time thinking about the songs on the bubble, listening and re-listening until I could make a comfortable choice. Some kids love that kind of thing.
Then, for a couple of decades the standard was the compact disc. CDs offered better quality with no halfway cut, but only about eighty minutes of space. I had to make choices.
When digital music leapt from the CD to the hard drive, something strange happened. Suddenly we were talking about gigabytes. And small numbers of gigabytes didn't last long. From 4- and 8-gigabyte devices we quickly jumped to iPods with a standard capacity of 160GB. That's several tens of thousands of songs! People might fill their iPods with movies, but most people won't ever need to fill them with the music they listen to on any regular basis. If they do, they always have the hard drive on the computer they sync the iPod with. Can you say "one terabyte", boys and girls?
The computer drives we use for music got so large so fast that they are no longer useful as the arbitrary limit on our collections. In the long run, that may well be a good thing, but as someone who has lived on both sides of the chasm, I feel a little sadness. The arbitrary limits imposed by LPs, cassettes, and CDs caused us to be selective and sometimes even creative. This is the same thing we hear from programmers who had to write code for machines with 128K of memory and 8 MHz processors. Constraints are a source of creativity and freedom.
It's funny how the move to digital music has created one new standard of comparison via the iTunes store and destroyed another via effectively infinite hard drives. We never know quite how we and our world will change in response to the things we build. That's part of the fun, I think.
I spent the weekend in southwestern Ohio at Hueston Woods State Park lodge with a bunch of friends from my undergrad days. This group is the union of two intersecting groups of friends. I'm a member of only one but was good friends with the two main folks in the intersection. After over twenty years, with close contact every few years, we remain bonded by experiences we shared -- and created -- all those years ago.
The drives to and from the gathering were more eventful than usual. I was stopped by a train at the same railroad crossing going both directions. On the way there, a semi driver intentionally ran me off the road while I was passing him on the right. I don't usually do that, but he had been driving in the left lane for quite a while, and none too fast. Perhaps I upset him, but I'm not sure how. Then, on the way back, I drove through one of the worst rainstorms I've encountered in a long while. It was scarier than most because it hit while I was on a five-lane interstate full of traffic in Indianapolis. The drivers of my hometown impressed me by slowing down, using their hazard lights, and cooperating. That was a nice counterpoint to my experience two days earlier.
Long ago, my mom gave me the New Testament of the Bible on cassette tape. (I said it was long ago!) When we moved to a new house last year, I came across the set again and have had it in a pile of stuff to handle ever since. I was in an unusual mood last week while packing for the trip and threw the set in the car. On the way to Ohio, I listened to the Gospel of Matthew. I don't think I have ever heard or read an entire gospel in one sitting before. After hearing Matthew, I could only think, "This is a hard teaching." (That is a line from another gospel, by John, the words and imagery of which have always intrigued me more than the other gospels.)
When I arrived on Friday, I found that the lodge did offer internet service to the rooms, but at an additional cost. That made it easier for me to do what I intended, which was to spend the weekend off-line and mostly away from the keyboard. I enjoyed the break. I filled my time with two runs (more on them soon) and long talks with friends and their families.
Ironically, conversation late on Saturday night turned to computers. The two guys I was talking with are lawyers, one for the Air Force at Wright Patterson Air Force Base and one for a U.S. district court in northern Indiana. Both lamented the increasing pace of work expected by their clients. "I blame computers," said one of the guys.
In the old days, documents were prepared, duplicated, and mailed by hand. The result was slow turnaround times, so people came to expect slow turnaround. Computers in the home and office, the Internet, and digital databases have made it possible to prepare and communicate documents almost instantly. This has contributed to two problems they see in their professional work. First, the ease of copy-and-paste has made it even easier to create documents that are bloated or off-point. This can be used to mislead, but in their experience the more pernicious problem is lack of thoughtfulness and understanding.
Second, the increased speed of communication has led to a change in peoples' expectations about response. "I e-mailed you the brief this morning. Have you resolved the issue this morning?" There is increasing pressure to speed up the work cycle and respond faster. Fortunately, both report that these pressures come only from outside. Neither the military brass nor the circuit court judges push them or their staff to work faster, and in fact encourage them to work with prudence and care. But the pressure on their own staff from their clients grows.
Many people lash out and blame "computers" for whatever ills of society trouble them. These guys are bright, well-read, and thoughtful, and I found their concerns about our legal system to be well thought out. They are deeply concerned by what the changes mean for the cost and equitability of the justice the system can deliver. The problem, of course, is not with the computers themselves but with how we use them, and perhaps with how they change us. For me as a computer scientist, that conversation was a reminder that writing a program does not always solve our problems, and sometimes it creates new ones. The social and cultural environments in which programs operate are harder to understand and control than our programs. People problems can be much harder to solve than technical problems. Often, when we solve technical problems, we need to be prepared for unexpected effects on how people work and think.
As department head, I teach only one course each semester, not the three that is standard for our faculty. The past academic year has been a bit unusual, though, with a load that kept me busy, busy, busy. In the fall, I taught Software Engineering, a course I had never taught before. My spring load included not only Programming Languages but also ten weeks of teaching Cobol two hours a week. I've taught Cobol many times before, but not for over fifteen years. Then, to top it all off, I taught my first May term course, an agile software dev course that met two hours every day for four weeks.
All of this was fun for the teacher in me, but no one turned down the spigot of administrative work pouring into my office. As a result, the year felt something like a treadmill. Then I spent a week digging out of a pile of undone work, eight days on vacation, and another week plus digging out of a new pile of undone work. With the delivery of faculty salary letters, I am ready to begin summer.
Fall classes begin in eight weeks -- a blink of an eye.
... and a fine one, spent with family, enjoying the world. I recall a passage from Josef Albers:
calm down
what happens
happens mostly
without you
Thanks to David Schmüdde for reminding me of Albers's quiet aside.
A friend and colleague sent me this:
So I had a very strange dream last night. I almost never remember dreams, so this was worth bringing up. You were in grad school ..., getting a second PhD (I think in psychology). I was there visiting you. You were single for some reason. Anyway, while I was there, you were accused of murdering another graduate student. The big evidence that they had of this was a video recording of you and your rock band (you were the lead singer) rehearsing for a gig. The other graduate student (Chinese girl) had been in the band, and suddenly she was totally missing from the video and your microphone was stained red. You and I had gotten a copy of the recording and were running from investigators. We finally got to a video editing lab on campus and were trying to figure out what was going on. We found a spot where the recording had clearly been edited. We were in the midst of finding the original recording when I woke up.
Boy would I like to have someone who interprets dreams take a whack at this one.
After I finish off my second Ph.D., I'm sure I'll be able to help with that.
It is perhaps sad that I am more interesting in my dreams than in real life, and sadder that I am more interesting in other people's dreams than in my own.
It's good to know that my ability to analyze video as data may well help me clear my good name!
First, Chuck Hoffman tweeted, The life of a code monkey is frequently depressingly futile.
I had had a long week, filled with the sort of activities that can make a programmer pine for days as a code monkey, and I replied, Life in many roles is frequently depressingly futile. Thoreau was right.
The ever-timely Brian Foote reminded me:
Sometimes utility feels like futility, but someone's gotta do it.
Thanks, Brian. I needed to hear that.
I remember hearing an interview with musician John Mellencamp many years ago in which he talked about making the movie Falling from Grace. The interviewer was waxing on about the creative process and how different making movies was from making records, and Mellencamp said something to the effect of, "A lot of it is just ditch digging: one more shovel of dirt." Mellencamp knew about that sort of manual labor because he had done it, digging ditches and stringing wire for a telephone company before making it as an artist. And he's right: an awful lot of every kind of work is moving one more shovel of dirt. It's not romantic, but it gets the job done.
A house burned in my neighborhood tonight. I do not know yet the extent of the damage, but the fight was protracted. My first hope is that no one was hurt, neither residents of the house nor the men and women who battled the blaze.
Such a tragedy puts my family's recent loss into perspective. No matter how valuable our data, when a hard drive fails, no one dies. Even without a backup, life goes on. Even without a backup, there is a chance of recovery. We can run utilities that come with our OS. We can run wonderful programs that cost little money. Specialists can pull the platters from the drive and attempt to read the data raw.
Things lost in a fire are lost forever.
If we follow a few simple and well-known rules, we can have a backup: a bit-for-bit copy of our data, all our digital stuff, indistinguishable from the original. In principle and in practice, we can encounter failures and lose nothing. In the material world, we cannot make a copy of everything we own. Yes, we can make copies of important documents, and we can store some of our stuff somewhere else. But we don't live in a bizarro Steven Wright world where we possess an identical copy of every book, every piece of clothing, every memento.
In the digital world, we can make copies that preserve our world.
So, I type this with a different outlook. The world reminds me that there are things worse than a lost disk drive. I hope that my daughters -- who lost the most in our failure -- can feel this way, too. We are well on the way to resuming our digital lives, buoyed by technology that will help us not to suffer such a loss again.
That said, it's worth keeping in mind Jamie Zawinski's cautionary words, "the universe tends toward maximum irony", and stay alert.
The universe tends toward maximum irony.
Don't push it.
-- JWZ
I have had Jamie Zawinski's public service announcement on backups sitting on my desk since last fall. I usually keep my laptop and my office machine pretty well in sync, so I pretty much always have a live back-up. But some files live outside the usual safety zone, such as temporary audio files on my desktop, which also contains one or two folders of stuff. I knew I needed to be more systematic and complete in safeguarding myself from disk failure, so I printed Zawinski's warning and resolved to Do the Right Thing.
Last week, I read John Gruber's ode to backups and disk recovery. This article offers a different prescription but the same message. You must be 100% backed up, including even the files that you are editing now in the minutes or hours before the next backup. Drives fail. Be prepared.
Once again, I was energized to Do the Right Thing. I got out a couple of external drives that I had picked up for a good price recently. The plan was to implement a stable, complete backup process this coming weekend.
The universe tends toward maximum irony. Don't push it.
If the universe were punishing me for insufficient respect for its power, you would think that the hard drive in either my laptop or my office machine would have failed. But both chug along just fine. Indeed, I still have never had a hard drive fail in any of my personal or work computers.
It turns out that the universe's sense of irony is much bigger than my machines.
On Sunday evening, the hard drive in our family iMac failed. I rarely use this machine and store nothing of consequence there. Yet this is a much bigger deal.
My wife lost a cache of e-mail, an address book, and a few files. She isn't a big techie, so she didn't have a lot to lose there. We can reassemble the contact information at little cost, and she'll probably use this as a chance to make a clean break from Eudora and POP mail and move to IMAP and mail in the cloud. In the end, it might be a net wash.
My teenaged daughters are a different story. They are from a new generation and live a digital life. They have written a large number of papers, stories, and poems, all of which were on this machine. They have done numerous projects for schools and extracurricular activities. They have created artwork using various digital tools. They have taken photos. All on this machine, and now all gone.
I cannot describe how I felt when I first realized what had happened, or how I feel now, two days later. I am the lead techie in our house, the computer science professor who knows better and preaches better, the husband and father who should be taking care of what matters to his family. This is my fault. Not that the hard drive failed, because drives fail. It is my fault that we don't have a reliable, complete backup of all the wonderful work my daughters have created.
Fortunately, not all is lost. At various times, we have copied files to sundry external drives and servers for a variety of reasons. I sometimes copy poetry and stories and papers that I especially like onto my own machines, for easy access. The result is a scattering of files here and there, across a half dozen machines. I will spend the next few days reassembling what we have as best I can. But it will not be all, and it will not be enough.
The universe maximized its irony this time around by getting me twice. First, I was gonna do it, but didn't.
That was just the head fake. I was not thinking much at all about our home machine. That is where the irony came squarely to rest.
Shut up. I know things. You will listen to me. Do it anyway.
Trust Zawinski, Gruber, and every other sane computer user. Trust me.
Do it. Run; don't walk. Whether your plan uses custom tools or a lowly cron job running rsync, do it now. Whether or not you go as far as using a service such as Dropbox to maintain working files, set up an automatic, complete, and bootable backup of your hard drives.
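If it helps to have something concrete, here is a bare-bones sketch of the kind of script you might put behind a cron job. The paths are placeholders, it assumes the rsync tool is installed, and it is no substitute for a complete, bootable clone -- but it is a start, which is more than I had.

    # Mirror one directory to an external drive using rsync.
    # Paths are hypothetical; adjust for your own machine.
    import subprocess
    from datetime import date

    SOURCE = "/Users/home/"          # hypothetical directory to protect
    DEST = "/Volumes/Backup/home/"   # hypothetical external drive

    def backup():
        subprocess.run(["rsync", "-a", "--delete", SOURCE, DEST], check=True)
        print("backup completed", date.today())

    if __name__ == "__main__":
        backup()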
I know I can't be alone. There must be others like me out there. Maybe you used to maintain automatic and complete system backups and for whatever reason fell out of the habit. Maybe you have never done it but know it's the right thing to do. Maybe, for whatever reason, you have never thought about a hard drive failing. You've been lucky so far and don't even know that your luck might change at any moment.
Do it now, before dinner, before breakfast. Do it before someone you love loses valuable possessions they care deeply about.
I will say this: my daughters have been unbelievable through all this. Based on what happened Sunday night, I certainly don't deserve their trust or their faith. Now it's time to give them what they deserve.
[A transcript of the SIGCSE 2010 conference: Table of Contents]
Day 2 brought three sessions worth their own blog entries, but it was also a busy day meeting with colleagues. So those entries will have to wait until I have a few free minutes. For now, here are a few miscellaneous observations from conference life.
On Wednesday, I checked in at the table for attendees who had pre-registered for the conference. I told the volunteer my name, and he handed me my bag: conference badge, tickets to the reception and Saturday luncheon, and proceedings on CD -- all of which cost me in the neighborhood of $150. No one asked for identification. I thought, what a trusting registration process.
This reminded me of picking up my office and building keys on my first day at my current job. The same story: "Hi, I'm Eugene", and they said, "Here are your keys." When I suggested to a colleague that this was perhaps too trusting, he scoffed. Isn't it better to work at a place where people trust you, at least until we have a problem with people who violate that trust? I could not dispute that.
The Milwaukee Bucks are playing at home tonight. At OOPSLA, some of my Canadian and Minnesotan colleagues and I have a tradition of attending a hockey game whenever we are in an NHL town. I'm as big a basketball fan as they are hockey fans, so maybe I should check out an NBA game at SIGCSE? The cheapest seat in the house is $40 or so and is far from the court. I would go if I had a posse to go with, but otherwise it's a rather expensive way to spend a night alone watching a game.
SIGCSE without my buddy Robert Duvall feels strange and lonely. But he has better things to do this week: he is a proud and happy new daddy. Congratulations, Robert!
While I was writing this entry, the spellchecker on my Mac flagged www.cs.indiana.edu and suggested I replace it with www.cs.iadiana.edu. Um, I know my home state of Indiana is part of flyover country to most Americans, but in what universe is iadiana an improvement?
People, listen to me: problem-solve is not a verb. It is not a word at all. Just say solve problems. It works just fine. Trust me.
While casting about Roy Behrens's blog recently, I came across a couple of entries that connected with my own experience. In one, Behrens discusses Arthur Koestler and his ideas about creativity. I enjoyed the entire essay, but one of its vignettes touched a special chord with me:
In 1971, as a graduate student at the Rhode Island School of Design, I finished a book manuscript in which I talked about art and design in relation to Koestler's ideas. I mailed the manuscript to his London home address, half expecting that it would be returned unopened. To my surprise, not only did he read it, he replied with a wonderfully generous note, accompanied by a jacket blurb.
My immediate reaction was "Wow!", followed almost imperceptibly by "I could never do such a thing." But then my unconscious called my bluff and reminded me that I had once done just such a thing.
Back in 2004, I chaired the Educators' Symposium at OOPSLA. As I first wrote back then, Alan Kay gave the keynote address at the Symposium. He also gave a talk at the main conference, his official Turing Award lecture. The Educators' Symposium was better, in large part because we gave Kay the time he needed to say what he wanted to say.
2004 was an eventful year for Kay, as he won not only the Turing Award but also the Draper Prize and Kyoto Prize. You might guess that Kay had agreed to give his Turing address at OOPSLA, given his seminal influence on OOP and the conference, and then consented to speak a second time to the educators.
But his first commitment to speak was to the Educators' Symposium. Why? At least in part because I called him on the phone and asked.
Why would an associate professor at a medium-sized regional public university dare to call the most recent Turing Award winner on the phone and ask him to speak at an event on the undercard of a conference? Your answer is probably as good as mine. I'll say one part boldness, one part hope, and one part naivete.
All I know is that I did call, hoping to leave a message with his secretary and hoping that he would later consider my request. Imagine my surprise when his secretary said, "He's across the hall just now; let me get him." My heart began to beat in triple time. He came to the phone, said hello, and we talked.
For me, it was a marvelous conversation, forty-five minutes chatting with a seminal thinker in my discipline, of whose work I am an unabashed fan. We discussed ideas that we share about computer science, computer science education, and universities. I was so caught up in our chat that I didn't consider just how lucky I was until we said our goodbyes. I hung up, and the improbability of what had just happened soaked in.
Why would someone of Kay's stature agree to speak at a second-tier event before he had even been contacted to speak at the main event? Even more, why would he share so much time talking to me? There are plenty of reasons. The first that comes to mind is most important: many of the most accomplished people in computer science are generous beyond my ken. This is true in most disciplines, I am sure, but I have experienced it firsthand many times in CS. I think Kay genuinely wanted to help us. He was certainly willing to talk to me at some length about my hopes for the symposium and the role he could play.
I doubt that this was enough to attract him, though. The conference venue being Vancouver helped a lot; Kay loves Vancouver. The opportunity also to deliver his Turing Award lecture at OOPSLA surely helped, too. But I think the second major reason was his longstanding interest in education. Kay has spent much of his career working toward a more authentic kind of education for our children, and he has particular concerns with the state of CS education in our universities. He probably saw the Educators' Symposium as an opportunity to incite revolution among teachers on the front-line, to encourage CS educators to seek a higher purpose than merely teaching the language du jour and exposing students to a kind of computing calcified since the 1970s. I certainly made that opportunity a part of my pitch.
For whatever reason, I called, and Kay graciously agreed to speak. The result was a most excellent keynote address at the symposium. Sadly, his talk did not incite a revolt. It did plant seeds in the minds of at least a few of us, so there is hope yet. Kay's encouragement, both in conversation and in his talk, inspires me to this day.
Behrens expressed his own exhilaration "to be encouraged by an author whose books [he] had once been required to read". I am in awe not only that Behrens had the courage to send his manuscript to Koestler but also that he and Koestler continued to correspond by post for over a decade. My correspondence with Kay since 2004 has been only occasional, but even that is more than I could have hoped for as an undergrad, when I first heard of Smalltalk, or as a grad student, when I first felt the power of Kay's vision by living inside a Smalltalk image for months at a time.
I have long hesitated to tell this story in public, for fear that crazed readers of my blog would deluge his phone line with innumerable requests to speak at conferences, workshops, and private parties. (You know who you are...) Please don't do that. But for a few moments once, I felt compelled to make that call. I was fortunate. I was also a recipient of Kay's generosity. I'm glad I did something I would ordinarily never do.
Time to blog has been scarce, with the beginning of an unusual semester. I am teaching two courses instead of one, and administrative surprises seem to be arriving daily, both inside the department and out. Teaching gives me energy, but most days I leave for home feeling a little humbler than when I started -- or a little less satisfied with the state of affairs.
Perhaps this is why a particular passage from an entry on urban planning policy at The Urbanophile keeps coming to mind. It offers a lesson for urban policy based on the author's reading of Dietrich Dörner's The Logic of Failure (a new addition to my must-read list):
The first [lesson] is simply to approach urban policy and urban planning with humility and rich understanding of the limits of what we can accomplish. This I think is desperately needed. There are so many policies out there that are promoted with almost messianic zeal by their advocates.
One person's messianic zeal, unfettered from reality, is a dangerous force. It can wear out even a resolute team; when coupled with normal human frailty, the results can destroy opportunities for progress.
Another passage from the same blog has had a more personal hold on me of late:
People with talent, with big dreams and ambitions, want to live in a place where the civic aspiration matches their personal aspirations.
Sense of place and sense of self are hard to separate. This is true for cities -- the great ones capitalize on the coalescence of individual and communal aspiration -- and for academic departments.
Last spring, a colleague commented that he didn't think our department spent enough time trying to be great. This made me sad, but it struck me as true. At the time, I wasn't sure how to respond.
All groups have their internal politics. Some political situations are short-lived; others are persistent, endemic. We are no different, and maybe even above average. (Someone has to be!) Political struggles take time and energy. They steal focus.
I think everyone in our group desires to be great. Unfortunately, that's the easy part. For a group to achieve greatness, individuals must work together in a common direction. In our group, it is hard to build consensus on a shared vision. I don't pretend that greatness will come easily once we share a vision, but it's hard to get anywhere unless everyone is trying to go to the same place -- or at least is using the same criteria for progress.
As for me, in my role as department head, I have not always found -- or created -- the will, the energy, or the tools I need to help us move confidently in the direction of greatness. So, at times, we seem to settle, working locally but not globally.
This train of thought reminds me of a couple of comments James Shore made about stumbling through mediocrity in the context of agile software development:
The emphasis [in the software world] has shifted from "be great" to "be Agile." And that's too bad. As much as I like it, there's really no point in Agile for the sake of Agile.
The point is to be great, or perhaps more accurately, to do great things. Agile approaches are a path, not a destination.
I want to work with people who want to be great. People who aren't satisfied just fitting in. People who are willing to take risks, rock the boat, and change their environment to maximize their productivity, throughput, and value.
One of the things that has surprised me most about group dynamics since I joined a faculty, and perhaps more so since I've been in the position of head, is the enormous role that fear plays in how individuals work and interact with one another. It takes courage to take risks, to rock the boat, and to change the environment in which we live and work. It takes courage to be honest. It takes courage to take an action that may make a colleague or supervisor unhappy.
Without courage, especially at key moments, opportunities pass, sometimes before they are even recognized.
I have experienced this in how I interact with others, and occasionally I observe it in how colleagues interact with me and with others. I never thought that this would be a major obstacle on my path to greatness, or my department's.
(For what it's worth, Shore's second passage also describes the kind of students I like to work with. If it is hard for experienced adults to have this sort of gumption, imagine how much tougher an expectation it is to have of young people who are just learning how to step out into the world. Fortunately, as teachers, we have an opportunity to help students grow in this way.)
I've been enjoying time away from the office, classes, and even programming for the last week or so. After a long semester, spending time with my wife and daughters is just right.
It also gives me a chance to clean up my home office. What I am doing today is effectively refactoring: improving the structure of my stuff without adding any new capabilities. After this round of refactoring, I'll be ready to bring some new furniture in and do a couple of things I've been wanting to do since we moved in last December.
I won't strain the metaphor any further, but I must say that my work day is a perfect illustration of this tweet by former student, musician, and software pro Chuck Hoffman:
"don't have time to refactor now" leads to "everything takes way more time because the code is confusing." The time gets spent either way.
True of code. True of papers piled high on a desktop or stacked in the corner of the room. In either world, you can pay now, or pay more later.
Our town was hit with a blizzard over the last couple of days. Not only did it close the local schools, it even shut down my university -- a powerful storm, indeed.
I thought I might treat the day off as 'found time', and hack a little code I've been thinking about...
I feel a kinship with [Cormac McCarthy's] sense of a perfect day. To sit in a room, alone, with an open terminal. To write, whether prose or code, but especially code. (11/21/09)
... but I never wrote a line of code. Instead, I shoveled snow (a lot of snow). I wrote Christmas cards in the kitchen while my daughters baked cookies for their teachers. We listened to Christmas music and made chili and laughed.
Unlike McCarthy, I do not think that everything other than writing is a waste of time. Today was a perfect day.
Can there be two kinds of perfect day? Can there be different kinds of perfect? Indeed, there are multitudes. The sky is always a perfect sky, even as it changes from moment to moment.
We live in a world of partial order. There is no total ordering on experience.
OR: for all p, passed(p)
----
Last week saw the passing of computer scientist Amir Pnueli. Even though Pnueli received the Turing Award, I do not have the impression that many computer scientists know much about his work. That is a shame. Pnueli helped to invent an important new sub-discipline of computing:
Pnueli received ACM's A. M. Turing Award in 1996 for introducing temporal logic, a formal technique for specifying and reasoning about the behavior of systems over time, to computer science. In particular, the citation lauded his landmark 1977 paper, "The Temporal Logic of Programs," as a milestone in the area of reasoning about the dynamic behavior of systems.
I was fortunate to read "The Temporal Logic of Programs" early in my time as a graduate student. When I started at Michigan State, most of its AI research was done in the world-class Pattern Recognition and Image Recognition lab. That kind of AI didn't appeal to me much, and I soon found myself drawn to the Design Automation Research Group, which was working on ways to derive hardware designs from specs and to prove assertions about the behavior of systems from their designs. This was a neat application area for logic, modeling, and reasoning about design. I began to work under Anthony Wojcik, applying the idea of modal logics to reasoning about hardware design. That's where I encountered the work of Pnueli, which was still relatively young and full of promise.
Classical propositional logic allows us to reason about the truth and falsehood of assertions. It assumes that the world is determinate and static: each assertion must be either true or false, and the truth value of an assertion never changes. Modal logic enables us to express and reason about contingent assertions. In a modal logic, one can assert "John might be in the room" to demonstrate the possibility of John's presence, regardless of whether he is or is not in the room. If John were known to be out of the country, one could assert "John cannot be in the room" to denote that it is necessarily true that he is not in the room. Modal logic is sometimes referred to as the logic of possibility and necessity.
These notions of contingency are formalized in the modal operators ◇p, "possibly p," and □p, "necessarily p." Much like the propositional operators "and" and "or", each of ◇ and □ can be used to express the other in combination with ¬, because necessity is really nothing more than possibility "turned inside out". The fundamental identities of modal logic embody this relationship: □p ≡ ¬◇¬p and ◇p ≡ ¬□¬p.
Modal logic extends the operator set of classical logic to permit contingency. All the basic relationships of classical logic are also present in modal logic. ◇ and □ are not themselves truth functions but quantifiers over possible states of a contingent world.
When you begin to play around with modal operators, you start to discover some fun little relationships. Here are a few I remember enjoying:
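These hold in any normal modal logic (standard identities, shown here as representative examples):

    □(p → q) → (□p → □q)
    ◇(p ∨ q) ≡ ◇p ∨ ◇q
    □(p ∧ q) ≡ □p ∧ □q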
The last of those is an example of a distributive property for modal operators. Part of my master's research was to derive or discover other properties that would be useful in our design verification tasks.
The notion of contingency can be interpreted in many ways. Temporal logic interprets the operators of modal logic as reasoning over time. □p becomes "always p" or "henceforth p," and ◇p becomes "sometimes p" or "eventually p." When we use temporal logic to reason over circuits, we typically think in terms of "henceforth" and "eventually." The states of the world represent discrete points in time at which one can determine the truth value of individual propositions. One need not assume that time is discrete by its nature, only that we can evaluate the truth value of an assertion at distinct points in time. The fundamental identities of modal logic hold in this temporal logic as well.
In temporal logic, we often define other operators that have specific meanings related to time. Among the more useful temporal logical connectives are:
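Two of the most common, as they are usually defined in linear temporal logic (standard definitions, given here as representative examples):

    ○p      "next p": p holds in the next state
    p U q   "p until q": p holds in every state until one in which q holds, and q does eventually hold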
My master's research focused specifically on applications of interval temporal logic, a refinement of temporal logic that treats sequences of points in time as the basic units of reasoning. Interval logics consider possible states of the world from a higher level. They are especially useful for computer science applications, because hardware and software behavior can often be expressed in terms of nested time intervals or sequences of intervals. For example, the change in the state of a flip-flop can be characterized by the interval of time between the instant that its input changes and the instant at which its output reflects the changed input.
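To give a rough flavor of such a specification -- written here with just the henceforth and eventually operators rather than interval notation, and with hypothetical propositions in-changed and out-updated standing for the two events -- one might write:

    □(in-changed → ◇ out-updated)

that is, whenever the flip-flop's input changes, its output eventually reflects the change. Interval logic goes further and lets us reason about the stretch of time between those two instants as a single unit.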
Though I ultimately moved into the brand-new AI/KBS Lab for my doctoral work, I have the fondest memories of my work with Wojcik and the DARG team. It resulted in my master's paper, "Temporal Logic and its Use in the Symbolic Verification of Hardware", from which the above description is adapted. While Pnueli's passing was a loss for the computer science community, it inspired me to go back to that twenty-year-old paper and reminisce about the research a younger version of myself did. In retrospect, it was a pretty good piece of work. Had I continued to work on symbolic verification, it might have produced an interesting result or two.
Postscript. When I first read of Pnueli's passing, I didn't figure I had a copy of my master's paper. After twenty years of moving files from machine to machine, OS to OS, and external medium to medium, I figured it would have been lost in the ether. Yet I found both a hardcopy in my filing cabinet and an electronic version on disk. I wrote the paper in nroff format on an old Sparc workstation. nroff provided built-in character sequences for all of the special symbols I needed when writing about modal logic, and they worked perfectly -- unlike HTML, whose codes I've been struggling with for this entry. Wonderful! I'll have to see whether I can generate a PDF document from the old nroff source. I am sure you all would love to read it.
If you come here to read only about computer science, software development, or teaching, then this entry probably isn't for you.
On Saturday, I attended the wedding of a family friend, the son of my closest friend from college. Some weddings inspire me, and this one did. I've been feeling a little jaded lately, and it was refreshing to see two wonderful young people, well-adjusted and good-hearted, starting a new chapter of life together.
During the minister's remarks to the bride and groom, I found myself thinking about love, and about big moments and little moments.
We often speak of one person loving another so much that he would lay down his life for her. That is a grand sort of love indeed. Many of our most romantic ideas about love come back to this kind of great personal sacrifice. It occurred to me that this is love in the big moment.
But how many of us are ever in a position where we must or even can demonstrate love in this way?
Nearly all of our chances to demonstrate love come in the nondescript moments that bathe us every day. These are not the big moments we dream of. We dream about big challenges, but the biggest challenge is to make small choices that demonstrate our love all the time. It is so easy for me to be selfish in the little desires that I act to satisfy daily. The real sacrifice is to surrender ourselves in those moments, to act in a way that puts another person, the one we love, at the front, to place her needs and wants ahead of our own.
When relationships falter, it is rarely because one person missed an opportunity to lay his life down -- literally. Much more often, it is a result of small choices we make, of small sacrifices we could have made but didn't. I think that is one of the great sources of confusion for people who find themselves at the end of a relationship. They may well still be willing to lay down their lives for the loved one; what more could the other person want? It's hard to recognize all those little opportunities to sacrifice as they come by. How important they are.
The minister closed his remarks Saturday with a wish for the new couple that, at the end of long, happy lives together, they will be able to say, "I would choose you again." I make this wish for them, too. But I think one of the best ways to prepare for that distant moment is to wake up each day, say "I choose you" in the present tense, and then strive to live the little moments of that day well.
Artificial. Tyler Cowen writes about a new arena of battle for the Turing Test:
I wonder when the average quality of spam comment will exceed the average quality of a non-spam comment.
This is not the noblest goal for AI, but it may be one for which the economic incentive to succeed drives someone to work hard enough to do so.
Oh So Real. I have written periodically over the last sixteen months about being sick with an unnamed and undiagnosed malady. At times, I was sick enough that I was unable to run for stretches of several weeks. When I tried to run, I was able to run only slowly and only for short distances. What's worse, the symptoms always returned; sometimes they worsened. The inability of my doctors to uncover a cause worried me. The inability to run frustrated and disappointed me.
Yesterday I read an essay by a runner about the need to run through a battle with cancer:
I knew, though, if I was going to survive, I'd have to keep running. I knew it instinctively. It was as though running was as essential as breathing.
Jenny's essay is at turns poetic and clinical, harshly realistic and hopelessly romantic. It puts my own struggles into a much larger context and makes them seem smaller. Yet in my bones I can understand what she means: "... that is why I love running: nothing makes me feel more alive. I hope I can run forever."
"Only one, but it's always the right one."
-- Jose Raoul Capablanca,
when asked how many moves ahead
he looked while playing chess
When I was in high school, I played a lot of chess. That's also when I first learned about computer programming. I almost immediately was tantalized by the idea of writing a program to play chess. At the time, this was still a novelty. Chess programs were getting better, but they couldn't compete with the best humans yet, and I played just well enough to know how hard it was to play the game really well. Like so many people of that era, I thought that playing chess was a perfect paradigm of intelligence. It looked like such a wonderful challenge to the budding programmer in me.
I never wrote a program that played chess well, yet my programming life often crossed paths with the game. My first program written out of passion was a program to implement a ratings system for our chess club. Later, in college, I wrote a program to perform the Swiss system commonly used to run chess tournaments as a part of my senior project. This was a pretty straightforward program, really, but it taught me a lot about data structures, algorithms, and how to model problems.
Though I never wrote a great chess-playing program, that was the problem that mesmerized me and ultimately drew me to artificial intelligence and a major in computer science.
In a practical sense, chess has been "solved", but not in the way that most of us who loved AI as kids had hoped. Rather than reasoning symbolically about positions and plans, attacks and counterattacks, Deep Blue, Fritz, and all of today's programs win by deep search. This is a strategy that works well for serial digital computers but not so well for the human mind.
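As a minimal sketch of what "deep search" means, here is plain fixed-depth minimax in Scheme. The helpers legal-moves, make-move, and evaluate are hypothetical stand-ins for a real engine's move generator and evaluation function, not code from any actual program:

    (define (minimax position depth maximizing?)
      ;; Search the game tree to a fixed depth and back up a score.
      (if (or (zero? depth) (null? (legal-moves position)))
          (evaluate position)                        ; static score at the horizon
          (let loop ((moves (legal-moves position))
                     (best  (if maximizing? -inf.0 +inf.0)))
            (if (null? moves)
                best
                (let ((score (minimax (make-move position (car moves))
                                      (- depth 1)
                                      (not maximizing?))))
                  (loop (cdr moves)
                        (if maximizing? (max best score) (min best score))))))))

Real programs add alpha-beta pruning, transposition tables, and careful move ordering, but the heart of the approach is this brute enumeration of positions -- a strategy that suits a fast serial machine far better than it suits a human mind.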
To some, the computer's approach seems uncivilized even today, but those of us who love AI ought to be neither surprised nor chagrined. We have long claimed that intelligence can arise from any suitable architecture. We should be happy to learn how it arises most naturally for machines with fast processors and large memories. Deep Blue's approach may not help us to understand how we humans manage to play the game well in the face of its complexity and depth, but it turns out that this is another question entirely.
Reading David Mechner's All Systems Go last week brought back a flood of memories. The Eastern game of Go stands alone these days among the great two-person board games, unconquered by the onslaught of raw machine power. The game's complexity is enormous, with a branching factor at each ply so high that search-based programs soon drown in a flood of positions. As such, Go encourages programmers to dream the Big Dream of implementing a deliberative, symbolic reasoner in order to create a program that plays the game well. Through hubris and well-meaning naivete, AI researchers have promised huge advances throughout the years, only to see ambitious predictions go unfulfilled in the face of unexpected complexity. Well-defined problems such as chess turned out to be complex enough that programs reasoning like humans were unable to succeed. Ill-defined problems involving human language and the interconnected network of implicit knowledge that humans seem to navigate so easily -- well, they are even more resistant to our solutions.
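To put rough numbers on that flood, using the commonly quoted average branching factors -- roughly 35 legal moves per position in chess and roughly 250 in Go, estimates rather than figures from Mechner's article:

    35^6  ≈ 1.8 × 10^9     positions in a six-ply chess lookahead
    250^6 ≈ 2.4 × 10^14    positions in the same six plies of Go

Those five extra orders of magnitude are the difference between a search a machine can finish in a reasonable time and one it cannot.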
Then, when we write programs to play games like chess well, many people -- including some AI researchers -- move the goal line. Schaeffer et al. solved checkers with Chinook, but many say that its use of fast search and big endgame databases is unfair. Chess remains unsolved in the formal sense, but even inexpensive programs available on the mass market play far, far better than all but a handful of humans in the world. The best programs play better than the best humans.
Not so with Go. Mechner writes:
Go is too complex, too subtle a game to yield to exhaustive, relatively simpleminded computer searches. To master it, a program has to think more like a person.
And then:
Go sends investigators back to the basics--to the study of learning, of knowledge representation, of pattern recognition and strategic planning. It forces us to find new models for how we think, or at least for how we think we think.
Ah, the dream lives!
Even so, I am nervous when I read Mechner talking about the subtlety of Go, the depth of its strategy, and the impossibility of playing it well by search and power. The histories of AI and CS have demonstrated repeatedly that what we think difficult often turns out to be straightforward for the right sort of program, and that what we think easy often turns out to be achingly difficult to implement. What Mechner calls 'subtle' about Go may well just be a name for our ignorance, for our lack of understanding today. It might be wise for Go aficionados to remain humble... Man's hubris survives only until the gods see fit to smash it.
We humans draw on the challenge of great problems to inspire us to study, work, and create. Lance Fortnow wrote recently about the mystique of the open problem. He expresses the essence of one of the obstacles we in CS face in trying to excite the current generation of students about our discipline: "It was much more interesting to go to the moon in the 60's than it is today." P versus NP may excite a small group of us, but when kids walk around with iPhones in their pockets and play on-line games more realistic than my real life, it is hard to find the equivalent of the moon mission to excite students with the prospect of computer science. Isn't all of computing like Fermat's last theorem: "Nothing left to dream there"?
For old fogies like me, there is still a lot of passion and excitement in the challenge of a game like Go. Some days, I get the urge to ditch my serious work -- work that matters to people in the world -- return to my roots, and write a program to play Go. Don't tell me it can't be done.
The economics blog Marginal Revolution has an occasional series of posts called "Markets in Everything", in which the writers report examples of markets at work in various aspects of everyday life. I've considered doing something similar here with computing, as a way to document some concrete examples of computational thinking and -- gasp! -- computer programs playing a role in how we live, work, and play. Perhaps this will be a start.
Courtesy of Wicked Teacher of the West, I came across this story about NBA player Shane Battier, who stands out in an unusual way: by not standing out with his stats. A parallel theme of the story is how the NBA's Houston Rockets are using data and computer analysis in an effort to maximize their chances of victory. The connection to Battier is that the traditional statistics we associate with basketball -- points, rebounds, assists, blocked shots, and the like -- do not reflect his value. The Rockets think that Battier contributes far more to their chance of winning than his stat line shows.
The Rockets collect more detailed data about players and game situations, and Battier is able to use it to maximize his value. He has developed great instincts for the game, but he is an empiricist at heart:
The numbers either refute my thinking or support my thinking, and when there's any question, I trust the numbers. The numbers don't lie.
For an Indiana boy like myself, nothing could be more exciting than knowing that the Houston Rockets employ a head of basketball analytics. This sort of data analysis has long been popular among geeks who follow baseball, a game of discrete events in which the work of Bill James and like-minded statistician-fans of the American Pastime finds a natural home. I grew up a huge baseball fan and, like all boys my age, lived and died on the stats of my favorite players. But Indiana is basketball country, and basketball is my first and truest love. Combining hoops with computer science -- could there be a better job? There is at least one guy living the dream, in Houston.
I have written about the importance of solving real problems in CS courses, and many people are working to redefine introductory CS to put the concepts and skills we teach into context. Common themes include bioinformatics, economics, and media computation. Basketball may not be as important as sequencing the human genome, but it is real and it matters to enough people to support a major entertainment industry. If I were willing to satisfy my own guilty pleasures, I would design a CS 1 course around Hoosier hysteria. Even if I don't, it's comforting to know that some people are beginning to use computer science to understand the game better.
Earlier today I listened to a TED talk by Tony Robbins. One snippet stood out. Here is a paraphrase, in part to clean up the language (because that's how I roll):
If I ask you whether you like variety, you'll say yes. Baloney. You like surprises you want. The others, you call problems.
Some people are better than others at accepting the surprises that they don't want. Perhaps that is why Robbins's anecdote reminded me of a story I read last summer in a book by John G. Miller called QBQ! The Question Behind the Question. The story, perhaps fictional, tells of a father and young daughter out for a fun plane ride one day, with dad behind the controls. When the plane's engine dies unexpectedly, dad turns to his daughter and says, calmly, "I'm going to need to fly the plane differently."
I don't generally make New Year's resolutions, but when I am next tempted, I'll probably think again of this story. I want to be that guy, and I'm not.
----
(Quick book review: QBQ! is pretty standard for this genre of business self-help lit. It says a lot of things we all should already know, and probably do. But there are days when some of us need a reminder or a little pep talk. This book is full of short pep talks. It's a quick read and good enough at its task, as long as you remember that unless you change your behavior, books like these are nothing but empty calories. A bit like software design methodologies.)
My expectations for the 500 Festival Mini-Marathon were rather low. I've been battling subpar health for a year, so my mileage has been down. I've gone through a few dry stretches of six to eight weeks without running much or at all. I've been running again for the last ten weeks or so, but I've managed only to reach the mid-twenties of miles in any given week. My body just isn't ready for running many miles, let alone racing them.
My running buddy, Greg, and I arrived in downtown Indianapolis half an hour before the start time of the race. It was overcast and cool -- around 47 degrees -- with the slightest of breezes. I cast my lot with the possibility that we'd not run in the rain and left my cap in my checked bag, but I did throw on my thinnest pair of gloves. A good choice.
When you run with 35,000 other runners, the start of a race is always a little crowded. After the official start of the race, Greg and I shuffled along for six and a half minutes before we reached the starting line. From that point, we ran in tight traffic for only a third of a mile or so before we could move unencumbered. I was surprised by how quickly that moment came. I was also surprised at the pace of our first couple of miles. Even with the shuffling start we clocked a 9:08 for Mile 1, and then we did Mile 2 in 8:37. "I won't be able to keep this up for much longer," I said, "so don't feel bad about leaving me behind." But I didn't feel as if I were pressing, so I hung steady.
Talking as we ran helped me stay steady. I have gone to races with Greg and other friends before, but I have never actually run with them. We spend time together before and after the race, but during we find our own strides and run our own races. This time, we actually ran together. The miles clicked off. 8:33. 8:32. Can this be? 8:42. Ah, a little slower.
Then we reached the famed Indianapolis Motor Speedway, home to the 500-mile race that gives its name to the race I am running. Race cars navigate this brick and asphalt oval in 40 seconds, but thousands of runners staked their claims in anywhere from twelve minutes to over an hour. We saw the 6-, 7-, and 8-mile markers inside the track, along with the 10K split and the halfway point. 8:46. 8:49. 8:44. Slower, yet hanging steady.
I felt a slight tug in my left calf just before the 9-mile marker. I did not mention it out loud, because I did not want to make it real. We kept talking, and I kept moving. 8:46. 8:29. What? 8:29?? The tenth mile was our fastest yet. I felt good -- not "just getting started" strong, but "I can keep doing this" strong. I thought of Barney Stinson's advice and just kept running.
I took a last sip of fuel just past the 10-mile mark. 8:22. Greg and I decided that we would let ourselves really run the last mile if we still felt good. We must have. We clipped off miles 12 and 13 in 16:16. Then came that last mad rush to the finish line. 1:52:25. I have never been so happy to run my second worst time ever. This was 8-10 minutes faster than I imagined I could run, and I finished strong, thinking I could do a little more if I had to. (Not another half, of course -- I am nowhere near marathon shape!)
Talking throughout the race definitely helped me. It provided a distraction from the fact that we were running hard, that the miles were piling up behind us. I never had a chance for my mind to tell me I couldn't do what I was doing, because it didn't have a chance to focus on the distance. Our focus was on the running, on the moment. We took stock of each mile as a single mile and then took on the current mile. In an odd way, it was a most conscious race.
The only ill effect I have this morning is a barely sore left hamstring that gave its all for those last two miles, and a minor headache. In all other ways I feel good and look forward to hitting the trails tomorrow morning with another challenge in mind.
The weekend itself was not an uninterrupted sequence of best case scenarios... As I pulled out of the parking garage after picking up my race packet in downtown Indianapolis, my car began gushing coolant. Was there any irony in the fact that I was at that moment listening to Zen and the Art of Motorcycle Maintenance? I am not one of those guys who tinkers with his own engine, but I know enough to know that you can't go far without coolant.
Still, I did not face a worst case scenario. I called the friend with whom I was to dine that night, and he came to get me. He arranged for a tow, and while I ran on Saturday morning a professional who knows his way around under the hood fixed the problem -- a faulty reservoir -- for only a couple of hundred dollars. Given the circumstances, I could hardly have asked for a better resolution.
Race day, May 2, was one year to the day since my last 100% healthy workout... I do not think I am yet 100% healthy again, and I did not finish the half marathon with Ernie Banks's immortal words on my lips ("Let's play two!"). But I have to say: Great day.
You are scanning a list of upcoming lectures on campus.
You see the title "Media Manipulation".
You get excited! Your thoughts turn to image rotations and filters, audio normalization and compression formats.
You read on to see the subtitle: "You, Me, and Them (the 'media' isn't what it used to be)" and realize that the talk isn't about CS; it's about communications and journalism.
You are disappointed.
----
(I'd probably enjoy this talk anyway... The topic is important, and I know and like the speaker. But still. To be honest, in recent weeks I have been less concerned with the media manipulating me than with the people in the media not doing the research they need to ensure their stories are accurate.)
David Patterson wrote a Viewpoint column for the March 2009 issue of Communications on advising graduate students, paired with a column by Jeffrey Ullman. One piece of Patterson's advice applies to more than advising grad students: "You're a role model; act like one.":
I am struck from parenting two now-grown sons that it's not what you say but what you do that has lasting impact. I bet this lesson applies to your academic progeny. Hence, I am conscious that students are always watching what I do, and try to act in ways that I'd like them to emulate later. For example, my joy of being a professor is obvious to everyone I interact with, whereas I hear that some colleagues at competing universities complain to their students how hectic their lives are.
I often worry about the message I send students in this regard. My life is more hectic and less fun with computer science since I became department head, and I imagine that most of the negative vibe I may give off is more about administration than academia. One time that I am especially careful about the image I project is when I meet with high school students who are prospective CS majors and their parents. Most of those encounters are scheduled in advance, and I can treat them almost like performances. But my interactions with current students on a daily basis? I'm probably hit-and-miss.
The idea that people will infer more from your deeds than your words is not new and does apply widely, to advisors, teachers, decision makers -- everyone, really. Anyone who has been a parent knows what Patterson means about having raised his sons. Long ago I marked this passage from Matthew Kelly's Building Better Families:
If you ask parents if they want their children to grow up to live passionate and purposeful lives they will say, "Absolutely!" But how many parents are living passionate and purposeful lives? Not so many.
Our example can set a negative tone or a positive tone. The best way to give children a zest for life is to live with zest and share your zest with them.
This applies to our students in class and in the research lab, too. My favorite passage in this regard comes not from Patterson's viewpoint but from The Wednesday Wars, which I quoted once before:
It's got to be hard to be a teacher all the time and not jump into a pool of clear water and come up laughing and snorting with water up your nose.
Through all my years in school, my best teachers jumped into the pool all the time and came up laughing and snorting with water up their noses. They wrote prose and code. They read about new ideas and wanted to try them out in the lab. Their excitement was palpable. Fun was part of the life, and that's what I wanted.
I hope I can embody a little of that excitement and fun as a faculty member to our students, as a father to my daughters. But some days, that is more of a challenge than others.
I spent most of the last four days of last week -- and nights -- digging out of the result of having lived 17 years in one house, as a moderate pack rat living with three major pack rats.
Remember that whole moving agile thing? Fuhgeddaboudit. The idea worked well for a few weeks. But then we encountered a problem everyone knows to avoid: the developers were the same people as the customers. When we got somewhere between 60% and 80% of our stuff moved, we reached something akin to a software prototype that offers most of the desired features. At that point, the customers went inexplicably AWOL. They were happy enough with the 80% solution.
Then came a long period of no measurable progress, no external motivation, and no Big Visible Chart to keep the developers honest.
Finally came the week of the closing on the sale of the old house. We were exceedingly lucky to have found buyers the first day the house was on the market and to have them want to close in a brisk five weeks. Hurray! ... except for the part about moving the rest of our stuff. We found ourselves in horrible crunch mode. The last 20% took 80% of the total move time. I worked around the clock for four days, stopping for classes and essential meetings. In the end, we just made it -- I, dead tired, with a bad cold, and a lot of stuff in boxes.
Why the title? After 17 years of saving things we "might need some day", we know the answer. We never did. We had boxes. Packing material. Textbooks, class notes. YAGNI. Really. I went from sentimental fool having a hard time tossing anything to pitching favored gifts and keepsakes like a disinterested pro. A few possessions rose above the newly-elevated threshold for what to keep, but not many. If I don't have a specific plan for using something in the next few weeks, it is gone. I have enough boxes and portfolios and a pile of notebooks and pads sufficient to outfit a medium-sized government agency. If I don't have a specific scenario for reminiscing over some memento, it too is gone. Sportsmanship trophies from 2nd-grade basketball leagues? I don't think so.
I ain't gonna need it. I trust that now. This is the best lesson for living more simply than I've ever received.
I have spent nearly every working minute this week sitting in front of this laptop, preparing a bunch of documents for an "academic program assessment" that is being done campus-wide at my university. Unfortunately, that makes this week Strike Two.
Last October: no OOPSLA for me.
This week: no SIGCSE for me.
The next pitch arrives at the plate in about a month... Will there be no ChiliPLoP for me?
That would be an inglorious Strike Three indeed. It would break my equivalent of DiMaggio's streak: I have never missed a ChiliPLoP. But budget rescissions, out-of-state travel restrictions, and work, work, work are conspiring against me. I intend to make my best effort. Say a little prayer.
I hope that you can survive my missing SIGCSE, as it will mean no reports from the front. Of course, you will notice two missing links on my 2008 report, so I do have some material in the bullpen!
Missing SIGCSE was tougher than usual, because this year I was to have been part of the New Teaching Faculty Roundtable on the day before the conference opened. I was looking forward to sharing what little wisdom I have gained in all my years teaching -- and to stealing as many good ideas as I could from the other panelists. Seeing all of the firepower on the roster of mentors, I have no doubts that the roundtable was a great success for the attendees. I hope SIGCSE offers the roundtable again next year.
Part of my working day today was spent in Cedar Rapids, an hour south of here. Some of you may recall that Cedar Rapids was devastated by flooding last summer, when much of the eastern part of the state was under 500-year flood waters. I was surprised and saddened to see that so much of the downtown area still suffers the ill effects of the water. The public library is still closed while undergoing repair. But I was heartened to see a vibrant city rebuilding itself. A branch library has been opened at a mall on the edge of town, and it was buzzing with activity.
You know, having a library in the mall can be a good thing. It is perhaps more a part of some people's lives than a dedicated building in the city, and it serves as a nice counterpoint to the consumption and noise outside the door. Besides, I had easy access to excellent wireless service out in the mall area even before the library opened, and easy access to all the food I might want whenever I needed to take a break. Alas, I really did spend nearly every working minute this week sitting in front of this laptop, so I worked my way right up to dinner time and a welcome drive home.
Many of us know this feeling:
When gods die, they die hard. It's not like they fade away, or grow old, or fall asleep. They die in fire and pain, and when they come out of you, they leave your guts burned. It hurts more than anything you can talk about. And maybe worst of all is, you're not sure if there will ever be another god to fill their place. You don't want the fire to go out inside you twice.
I'm tempted to say that the urge to share this came from a couple of experiences I had at the Rebooting Computing summit, but this passage goes beyond the emotions I felt there.
(This paragraph comes from a "young adult" novel called The Wednesday Wars, by Gary D. Schmidt. I did not realize how many good books there are out there for young readers until my daughters started reading them and recommending them to me. If there is a book that can make Shakespeare more relevant to the life of a seventh-grader or more attractive for a teen to read than this one -- all the while being funny and on the mark for its audience -- I'd love to read it.)
This has been an unusual year in several ways. This month ends it fittingly, in an unusual way: my fewest ever postings to this blog in any calendar month since its inception. I have not blogged much this month for a couple of reasons. The first is that I have tried to make my break time a break, and so have stayed away from my computer more than usual.
The second is less sedentary. My family bought a new house this month. We made an initial offer in October, worked through a lot of details and last-stage construction issues in November, and closed in early December. The last few weeks have been a combination of finishing fall term, trying to rest a bit, and moving a car- or minivan-load at a time. Moving over the course of several weeks is how my wife and I planned to do it. Baby steps is an interesting way to move, as we grow into each space a bit at a time, with time to think before being buried in boxes labeled "downstairs bedroom". I am still enjoying it and seeing the advantages of it (not the least of which is time to throw out all of the stuff we don't want to move!), but I think it is starting to tire my family. They would like to be "moved". Come to think of it, so would I. It's about time to bring this iteration to a close.
Happy New Year to all.
My daughters received a new game from their mom for Christmas. It's called Apples to Apples. Each round, one of the players draws a card with an adjective on it. The rest of the players choose noun cards from their hands that match the adjective. The judge chooses one of the nouns as the best match, and the player who played it wins that round. The objective of the game is to win the most rounds.
I could tell you many things more about the game and how it's played in my family, but there is really only one thing to say:
I stink at this game.
If I am in a three-person game, I finish third. Four players? Fourth. You name the number of players, and I can tell you where I'll finish. Last.
It doesn't seem to matter with whom I play. Recently, I've been playing with my wife and daughters. Last night, my wife's brother joined us. My wife and I have played this game before, with friends from my office. The result is always the same.
My weakness may be heightened by the lobbying that can be part of the game. Players are allowed to try to sell their answers to the judge. I'm not a good salesman and besides don't really like to sell. But that doesn't account for my losing. If we play in silence, I lose.
It's not that I'm bad at all word games. I like many word games and generally do well playing them. If nothing else, I get better after I play a game for a while, by figuring out something about the strategy of the game and the players with whom I play. But in this game, the harder I try to play well, the worse I seem to do.
This must be how students feel in class sometimes. There is some consolation -- that I might become more empathetic as a result of feeling this way -- but, to be honest, it's just a bad feeling.
Last month my wife and I had the good fortune to see a Broadway touring company perform the Tony Award-winning Movin' Out, a musical created by Twyla Tharp from the music of Billy Joel. I've already mentioned that I am a big fan of Billy Joel, so the chance to listen to his songs for two hours was an easy sell. Some of you may recall that I also wrote an entry way back called Start with a Box that was inspired by a wonderful chapter from Twyla Tharp's The Creative Habit. So even if I knew nothing else about Tharp, Movin' Out would have piqued my interest.
This post isn't about the show, but my quick review is: Wow. The musicians were very good -- not imitating Joel, but performing his music in a way that felt authentic and alive. (Yes, I sang along, silently to myself. My wife said she saw my lips moving!) Tharp managed somehow to tell a compelling story by stitching together a set of unrelated songs written over the long course of Joel's career. I know all of these songs quite well, and occasionally found myself thinking, "But that's not what this song means...". Yet I didn't mind; I was hearing from within the story. And I loved the dance itself -- it was classical even when modern, not abstract like Merce Cunningham's Patterns in Space and Sound. My wife knows dance well, and she was impressed that the male dancers in this show were actually doing classical ballet. (In many performances, the men are more props than dancers, doing lifts and otherwise giving the female leads a foil for their moves.)
Now I see that Merlin Mann is gushing over Tharp and The Creative Habit. Whatever else I can say, Mann is a great source of links... He points us to a YouTube video of Tharp talking about "failing well", as well as the first chapter of her book available on line. Now you can read a bit to see if you want to bother with the whole book. I echo Mann's caveat: we both liked the first chapter, but we liked the rest of the book more.
Since my post three years ago on The Creative Habit, I've been meaning to return to some of the other cool ideas that Tharp writes about in this book. Seeing Movin' Out caused me to dig out my notes from that summer, and seeing Mann's posts has awakened my desire to write some of the posts I have in mind. The ideas I learned in this book relate well to how I write software, teach, and learn.
Here is a teaser that may connect with agile software developers and comfort students preparing for final exams:
The routine is as much a part of the creative process as the lightning bolt of inspiration, maybe more. And this routine is available to everyone.
Oddly, this quote brings to mind an analogy to sports. Basketball coaches often tell players not to rely on having a great shooting night in order to contribute to the team. Shooting is like inspiration; it comes and it goes, a gift of capricious gods. Defense, on the other hand, is always within the control of the player. It is grunt work, made up of effort, attention, and hustle. Every player can contribute on defense every night of the week.
For me, that's one of the key points in this message from Tharp: control what you can control. Build habits within which you work. Regular routines -- weekly, daily, even hourly -- are the scaffolding that keeps you focused on making something. What's better, everyone can create and follow a routine.
While I sit and wait for the lightning bolt of inspiration to strike, I am not producing code, inspired or otherwise. Works of inspiration happen while I am working. Working as a matter of routine increases the chances that I will be producing something when the gods smile on me with inspiration. And if they don't... I will still be producing something.
... and a week of time away from work and worries.
There is still something special about an early morning run on fresh snow. The world seems new.
November has been a bad month for running, with my health at its lowest ebb since June, but even one three-mile jog brings back a good feeling.
I can build a set of books for our home finances based on the data I have at hand. I do not have to limit myself to the way accountants define transactions. Luca Pacioli was a smart guy who did a good thing, but our tools are better today than they were in 1494. Programs change things.
S-expressions really are a dandy data format. They make so many things straightforward. Some programmers may not like the parens, but simple list delimiters are all I need. Besides, Scheme's (read) does all the dirty work parsing my input.
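Here is a minimal sketch of what I mean, with a hypothetical transaction format rather than my actual books:

    ;; Each transaction is just an s-expression in a plain text file, e.g.
    ;;   (transaction (date 2008 11 28) (amount -42.50)
    ;;                (from "checking") (to "groceries"))
    ;; Scheme's reader turns the whole file into list structure for free.
    (define (read-transactions filename)
      (call-with-input-file filename
        (lambda (port)
          (let loop ((txn (read port)) (txns '()))
            (if (eof-object? txn)
                (reverse txns)
                (loop (read port) (cons txn txns)))))))

Once the transactions are in memory as lists, posting them to journals and ledgers is little more than a fold.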
After a week's rest, I imagine something like one of those signs from God:
That "sustainable pace" thing...
I meant that.
-- The Agile Alliance
I'd put Kent or Ward's name in there, but that's a lot of pressure for any man. And they might not appreciate my sense of humor.
The Biblical story of creation in six days (small steps) with feedback ("And he saw that it was good.") and a day of rest convinces me that God is an agile developer.
I leave today to attend the second SECANT workshop at Purdue. This is the sort of trip I like: close enough that I can drive, which bypasses all the headaches and inconveniences of flight, but far enough away that it is a break from home. My conference load has been light since April, and I can use a little break from the office. Besides, the intersection of computer science and the other sciences is an area of deep interest, and the workshop group is a diverse one. It's a bit odd to look forward to six hours on the road, but driving, listening to a book or to music, and thinking are welcome pursuits.
As I was checking out of the office, I felt compelled to make two public confessions. Here they are.
First, I recently ran across another recommendation for Georges Perec's novel, Life: A User's Manual. This was the third reputable recommendation I'd seen, and as is my general rule, after the third I usually add it to my shelf of books to read. As I was leaving campus, I stopped by the library to pick it up for the trip. I found it in the stacks and stopped. It's a big book -- 500 pages. It's also known for its depth and complexity. I returned the book to its place on the shelf and left empty-handed. I've written before of my preference for shorter books and especially like wonderful little books that are full of wisdom. But these days time and energy are precious enough resources that I have to look at a complex, 500-page book with a wary eye. It will make good reading some other day. I'm not proud to admit it, but my attention span isn't up to the task right now.
Second, on an even more frivolous note, there is at the time of this writing no Diet Mountain Dew in my office. I drank the last one yesterday afternoon while giving a quiz and taking care of pre-trip odds and ends. This is noteworthy in my mind only because of its rarity. I do not remember the last time the cupboard was bare. I'm not a caffeine hound like some programmers, but I don't drink coffee and admit some weakness for a tasty diet beverage while working.
I'll close with a less frivolous comment, something of a pattern I've been noticing in my life. Many months ago, I wrote a post on moving our household financial books from paper ledgers and journals into the twentieth century. I fiddled with Quicken for a while but found it too limiting; my system is a cross between naive home user and professional bookkeeping. Then I toyed with the idea of using a spreadsheet tool like Numbers to create a cascaded set of journals and ledgers. Yet at every turn I was thinking that I'd want to implement this or that behavior, which would strain the limits of typical spreadsheets. Then I came to my computer scientist's senses: When in doubt, write a program. I'd rather spend my time that way anyway, and the result is just what I want it to be. No settling. This pattern is, of course, no news at all to most of you, who roll your own blogging software and homework submission systems, even content management systems and book publishing systems, to scratch your own itches. It's not news to me, either, though sometimes my mind comes back to the power slowly. The financial software will grow slowly, but that's how I like it.
As a friend and former student recently wrote, "If only there were more time..."
Off to Purdue.
For the last five years, mid-August has meant more than getting ready for fall semester. It has meant 50+ mile weeks. It has meant once or twice weekly track workouts. It has meant hours each week running, before dawn in cool, moist air; in newly-risen sunlight; in the rain.
A month ago I was still figuring things out about my running, hoping to get well all the while. Unfortunately, I haven't gotten well. At times, I have gotten better, but never well, with progress punctuated every couple of weeks by a return of the same symptoms that have dogged me since May 2. I've been running since early June, because I wasn't getting better anyway. The last month or two, I have managed between 24 and 29 miles each week, with one 31.5-mile week that left me tired for a week afterwards. Many people think that 25-30 mile weeks are awesome, but for me they aren't, and all the while I'm looking to get better.
My doctor is baffled. He has run every test he can imagine, and all he and his nurses can say is, man, you are healthy. That's good news! ... except for the part of not being well.
We'll keep looking, and I'll keep plodding along. But I really miss the summer of running I didn't have. August isn't quite the same.
On my way into a store this afternoon to buy some milk, I ran into an old friend. He moved to town a decade or so ago and taught art at the university for five years before moving on to private practice. As we reminisced about his time on the faculty, we talked about how much we both like working with students. He mentioned that he recently attended his 34th wedding of a former student.
Thirty-four weddings from five years of teaching. I've been teaching for sixteen years and have been invited to only a handful of weddings -- three or four.
Either art students are a different lot from CS students, or I am doing something wrong...
I have been home alone with my daughters for the last few days. It's always neat to have the chance to talk with the girls over an extended time, because they do say some amazing things.
Over lunch a few days ago, we were talking about classes at school. I had been telling them about Lockhart's Lament, which I had just started reading. (More on that later.) Our conversation turned to art class, and how some teachers don't seem to understand what it means to create art. One teacher always told the students to keep things simple, because then they would get done faster. My younger daughter couldn't hide her exasperation. "She doesn't understand. It's not about fastest; it's about masterpieces." That bit of wisdom reminded me why I know she will do good things in her life.
My puffed-up pride didn't last long. The next day, in the course of a conversation I've already forgotten otherwise, she told me, "It's not that you're old, Dad; you're just too old." That's too much truth for me.
Sometimes the girls like my relatively advanced age. Over the weekend, they pulled my old Trivial Pursuit game off the shelf, and we had several hours of fun. I love that they've already read enough in their short lives to be able to play the game pretty well, and that they enjoy spending an evening or two trading questions and answers with their old dad.
Here I am, sitting in the office on a quiet Friday afternoon, looking forward to the weekend and writing up a short blog entry on a couple of cool software patterns. In another window is playing the WWDC 2008 keynote talk.
Suddenly, my ears hear the soothing voice of Steve Jobs say:
Imagine you are a university professor and you are teaching a class in how to write iPhone apps.
Writing code. An iPhone. Teaching class. Sweet Dreams (are made of this).
Back to work.
In recent days, I have written about not reading books and the relationship of these ideas to writing, from my enjoyment of Pierre Bayard's How to Talk About Books You Haven't Read. A couple of readers have responded with comments about how important reading is. Don't worry -- much of what Bayard and I are saying here is a joke. But it is also true, when looked at with one's head held tilted just so, and that's part of what made the book interesting to me. For you software guys, think about Extreme Programming -- an idea taken to its limits, to see what the limits can teach us. You can be sure that I am not telling you not to read every line of every novel and short story by Kurt Vonnegut! (I certainly have, some many, many times, and I enjoyed every minute.) Neither is Bayard, though it may seem so sometimes.
In my entries inspired by the book, it seems as if I am talking about myself an awful lot. Or consider my latest article, on parsing in CS courses. I read an article by Martin Fowler and ended up writing about my course and my opinions of CS courses. My guess is that most folks out there are more interested in Fowler's ideas than mine, yet I write.
This is another source of occasional guilt... Shouldn't this blog be about great ideas? When I write about, say, Bayard's book, shouldn't the entry be about Bayard's book? Or at least about Bayard?
Bayard helps me to answer these questions. Let's switch from Montaigne, the focus of my last entry on this topic, to Wilde. The lead quote of Bayard's Chapter 12 was the first passage of the book to seize my attention as I thumbed through it:
My experience writing this blog biases me toward shouting out, "Amen, Brother Bayard!" But, if it is true that all of my writing is a pretext for writing my autobiography, then it is all the more remarkable that I have any readers at all. Certainly you all have figured this out by now.
Bayard claims -- and Wilde agrees -- that it cannot be any other way. You may find more interesting people writing about themselves and read what they write, but you'll still be reading about the writer. (This is cold consolation for someone like me, who knows myself to be not particularly interesting!)
Bayard explores Wilde's writing on this very subject, in particular his The Critic as Artist (HB++). Bayard begins his discussion with the surface connection of Wilde offering strident support for the idea of not reading. Wilde says that, in addition to making lists of books to read and lists of books worth re-reading, we should also make lists of books not to read. Indeed, a teacher or critic would do an essential service for the world by dissuading people from wasting their time reading the wrong books. Not reading of this sort is a "power acquired by specialists, a particular ability to grasp what is essential".
Bayard then moves on to a deeper connection. Wilde asserts in his typical fashion that the distinction between creating a work of art and critiquing a work of art is artificial. First, the artist, when creating, necessarily exercises her critical faculty in the "spirit of choice" and the "subtle tact of omission"; without this faculty no one can create art, at least not art worth considering. This is an idea that most people are willing to accept, especially those creative people who have some awareness of how they create.
But what of the critic? Many people consider critics to be parasites who at best report what we can experience ourselves and at worst detract from our experience with their self-indulgent contributions.
Not Wilde:
Criticism is itself an art. And just as artistic creation implies the working of the critical faculty, and, indeed, without it cannot be said to exist at all, so Criticism is really creative in the highest sense of the word. Criticism is, in fact, both creative and independent.
This means that a blogger who primarily comments on the work of others can herself be making art, creating new value. By choosing carefully ideas to discuss, subtly omitting what does not matter, the critic creates a new work potentially worthy of consideration in its own right. (Suddenly, the idea of a mashup comes to mind.)
The idea of critic as an independent creator is key. Wilde says:
The critic occupies the same relation to the work of art he criticises as the artist does to the visible world of form and colour, or the unseen world of passion and thought. He does not even require for the perfection of his art the finest materials. Anything will serve his purpose....
To an artist so creative as the critic, what does subject-matter signify? No more and no less than it does to the novelist and the painter. Like them, he can find his motives everywhere. Treatment is the test. There is nothing that has not in it suggestion or challenge.
Bayard summarizes other comments from Wilde in this way:
The work being critiqued can be totally lacking in interest, then, without impairing the critical exercise, since the work is there only as a pretext.
But how can this be?? Because ultimately, the writer writes about himself. Freed from the idea that writing about something else is about that something, the writer is able to use the something as a trigger, a cue to write about the ideas that lie in his own mind. (Please read the first paragraph of the linked entry, if nothing else. Talk about not reading!)
As Wilde says,
That is what the highest criticism really is, the record of one's own soul.
Again, Bayard summarizes neatly:
Reflection on the self ... is the primary justification for critical activity, and this alone can elevate criticism to the level of art.
As I read this chapter, I felt as if Bayard and Wilde were speaking directly to me and my own doubts as a blogger who likes to write about works I read, performances I see, and experiences I have. It is a blogger's manifesto! Knowing and Doing feels personal to me because it is. Those works, performances, and experiences stimulate me to write, and that's okay. It is the nature of creativity to be sparked by something Other and to use that spark to express something that lies within the Self. Reading about Montaigne and his fear of forgetting what he had written was a trigger for me to write something I'd long been thinking. So I did.
I can take some consolation: This blog may not be worth reading, but not because I choose to connect what I read, see, hear, and feel to myself. It can be unworthy only to the extent that what is inside me is uninteresting.
By the way, I have just talked quite a bit about "The Critic as Artist", though I have never read it. I have only read the passages quoted by Bayard, and Bayard's commentary on it. I intend to read the original -- and begin forgetting it -- soon.
~~~~~
These three entries on Bayard's delightful little text cover a lot of ground in the neighborhood of guilt. We often feel shame at not having read something, or at not having grown from it. When we write for others, it is easy to become too concerned with getting things right, with being perfect, with putting on appearances. But consider this final quote from Bayard:
Truth destined for others is less important than truthfulness to ourselves, something attainable only by those who free themselves from the obligation to seem cultivated, which tyrannizes us from within and prevents us from being ourselves.
Long ago, near the beginning of this blog, I quoted Epictetus's The Enchiridion, via the movie Serendipity, of all places. That quote has a lot in common with what Bayard says here. Freeing ourselves from the obligation to seem cultivated -- being content to be thought foolish and stupid -- allows us to grow and to create. Epictetus even refers to keeping our "faculty of choice in a state conformable to nature", just as Wilde stresses the role of the critical faculty in creating a work of art when we write.
Helping readers see this truth, and releasing them from the obligation to appear knowing, is the ultimate source of the value of How to Talk About Books You Haven't Read. Perhaps Bayard will be proud that I mark it FB++.
Why is it that I feel compelled to write about getting a new Macbook Pro? Lots of people have one by now. But for a computer guy like me, a new laptop is one part professional tool and one part toy, a new user experience that shapes how I live.
Unlike my last laptop purchase, I splurged and bought an entry-level Macbook Pro. The 15" screen seems so much bigger than my iBook's 12" screen, because it is. The actual screen size is 13-1/8"x8-1/4" versus 9-3/4"x7-1/4", which is about 50% larger. One motivation for buying the iBook last time was having a smaller machine to use while flying. That worked out as planned, but even when I travel a lot I don't travel all that much. I'll have a chance to see how well the new machine travels next week when I visit Google.
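(The arithmetic, for the curious: 13-1/8" x 8-1/4" is about 108 square inches, versus about 71 square inches for 9-3/4" x 7-1/4". That's actually a bit more than 50% more screen area, but close enough.)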
Migrating files and configuration was much simpler this time. The Pro comes with a 200GB drive, rather than the 30GB(!) drive the 2005 iBook shipped with. Of course, this experience only accentuates that I am old. I think of that 30GB drive as horribly restrictive, yet not that many years ago I would have felt like a king with one. The new machine's drive is close enough to my office machine's 240-gig drive that I was able to mirror all of my files. That said, I was a bit surprised to find that the advertised "200GB Serial ATA" drive has an actual capacity of 186.31GB...
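(No mystery there: drive makers count a gigabyte as 10^9 bytes, while the operating system reports binary gigabytes of 2^30 bytes each, and 200 x 10^9 / 2^30 comes out to roughly 186.3. Nothing is actually missing.)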
It's a good idea for me to get a new machine every once in a while, and not just for the new technology, which is itself a wonderful advantage. I'm a creature of habit, more so than most people I know, and my brain benefits from being pulled out of its rut. My fingers must learn a new keyboard. I have to dig out a new bag to carry it in, because my OOPSLA 2005 bag isn't wide enough. The Leopard interface is just different enough to open my eyes to tasks that are now carried out subconsciously on the older machines.
Whenever I get a new machine and face the task of despoiling the pristine, out-of-the-box set-up of my system with my own files, I feel the urge to eliminate clutter. A big part of this is always clearing out my stuff/ folder -- currently at 13,604 files and 1.46 GB on disk. (My stuff/ folder is full of folders, so I just took a break to write a quick Ruby script to count the files for me.) But this time I also paid close attention to /Applications/personal, where I store nearly all of the Mac applications I install on my machine. The only exceptions are major-league apps such as iWork and Adobe Creative Suite.
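The script isn't worth keeping, but for the curious it amounted to something along these lines -- a minimal sketch, with the path and the output format left to taste:

    require 'find'

    count, bytes = 0, 0
    Find.find(File.expand_path('~/stuff')) do |path|
      next unless File.file?(path)      # skip directories and anything odd
      count += 1
      bytes += File.size(path)
    end
    printf("%d files, %.2f GB\n", count, bytes / 1024.0**3)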
/Applications/personal on my desktop machine contains 59 apps total, including four "classic" (pre-OS X) programs. I also have two folders of apps on trial in the stuff/ folder, totaling another 27.
Hello. My name is Eugene. I am an application junkie.
Whenever I read about a cool app in a blog or an e-mail or a magazine, I go "Ooh!" and download it. I delete many of these; for the 86 on my machine, I've probably tried and deleted several multiples more. But often they find their way into a folder somewhere because I just know that I'll use them soon. But usually I don't. I don't use Paparazzi or Keyboard Cleaner, or PsyncX or WordService. They are all fine programs, I am sure, but they just never broke into my workflow. Same for Growl and AquamacsEmacs.
So this time, I decided to transfer only programs that I recall using as a part of my work or play. Right now, the new Macbook has 20 apps, ranging from workhorses such as NetNewsWire and VoodooPad to programming tools such as PLT Scheme and Scratch down to fun little utilities such as LittleSecrets and PagePacker -- and one game so far, SudokuCompanion. Let's see what I miss from the big stash, if anything...
And don't get me started on the widgets I installed back when Dashboard seemed so very cool. I almost never use a single one of them. None have made it across the divide yet.
My Macbook Pro now knows me as wallingf. Perhaps I should give her a name, too. It's personal.
A former student recently mentioned a tough choice he faces. He has a great job at a Big Company here in the Midwest. The company loves him and wants him to stay for the long term. He likes the job, the company, and the community in which he lives. But this isn't the sort of job he originally had hoped for upon graduation.
Now a position of just the sort he was originally looking for is available to him in a sunny paradise. He says, "I have quite a decision to make.... it's hard to convince myself to leave the secure confines of [Big Company]. Now I see why their turnover rate is so low."
I had a hard time offering any advice. When I was growing up, my dad worked for Ford Motor Company in an assembly plant, and he faced insecurity about the continuance of his job several times. I don't know how much this experience affected my outlook on jobs, but in any case my personality is one that tends to value security over big risk/big gain opportunities.
Now I hold a job with greater job security than anyone who works for a big corporation. An older colleague is fond of saying Real men don't accept tenure. I first heard him say that when I was in grad school, and I remember not getting it at all. What's not to like about tenure?
After a decade with tenure, I understand better now what he means. I always thought that the security provided by having tenure would promote taking risks, even if only of the intellectual sort. But too much security is just as likely to stunt growth and inhibit taking risks. I sometimes have to make a conscious effort to push myself out of my comfort zone. Intellectually, I feel free to try new things, but pushing myself out of a comfortable nest here into a new environment -- well, that's another matter. What are the opportunity costs in that?
I love what Paul Graham says about young CS students and grads having the ability to take entrepreneurial risk, and how taking those risks may well be the safer choice in the long run. It's kind of like investing in stocks instead of bonds, I think. I encourage all of my students to give entrepreneurship a thought, and I encourage even more the ones whom I think have a significant chance to do something big. There is probably a bit of wistfulness in my encouragement, not having done that myself, but I don't think I'm simply projecting my own feelings. I really do believe that taking some employment risk, especially while young, is good for many CS grads.
But when faced with a concrete case -- a particular student having to make a particular decision -- I don't feel quite so cocksure in saying "go for it with abandon". This is not abstract theory; his job and home and fiancee are all in play. He will have to make this decision on his own, and I'd hate to push him toward something that isn't right for him from my cushy, secure seat in the tower. I feel a need to stay abstract in my advice and leave him to sort things out. Fortunately, he is a bright, level-headed guy, and I'm sure he'll do fine whichever way he chooses. I wish him luck.
Last night, I attended a Billy Joel concert. I last saw him perform live a decade or so ago. Billy was a decade older, and I was a decade older. He looked it, and I'm sure I do, too.
But when he started to play the piano, it could have been 1998 in the arena. Or 1988. Or 1978. The music flowing from his hands and his dancing feet filled me. Throughout the night I was 19 again, then 14, 10, and 25. I was lying on my parents' living room floor; sitting in the hand-me-down recliner that filled my college dorm room; dancing in Market Square Arena with an old girlfriend. I was rebellious teen, wistful adult, and mesmerized child.
There are moments when time seems more illusion than reality. Last night I felt like Billy Pilgrim, living two-plus hours unstuck in time.
Oh, and the music. There are not many artists who can, in the course of an evening, give you so many different kinds of music. From the pounding rock of "You May Be Right" to the gentle, plaintive "She's Always A Woman", and everything between. The Latin rhythms of "Don't Ask Me Why" extended with an intro of Beethoven's "Ode to Joy", and a "Root Beer Rag" worthy of Joplin.
Last night, my daughters aged 15 and 11 attended the concert with me. Music lives on, and time folds back on itself yet again.
[A transcript of the SIGCSE 2008 conference: Table of Contents]
This sort of entry usually comes after I write up the various conference sessions and have leftovers that didn't quite fit in an article. That may still happen, but I already have some sense of what will go where and have these items as miscellaneous observations.
First of all, I tried an experiment today. I did not blog in real-time. I used -- gasp! -- the antiquated technology of pen and paper to take notes during the sessions. On one or two occasions, I whipped open the laptop to do a quick Google search for a PhD dissertation or a book, but I steadfastly held back from the urge to type. I took notes on paper, but I couldn't fall into "writing" -- crafting sentences, then forming paragraphs, editing, ... All I could do was jot, and because I write slowly I had to be pickier about what I recorded. One result is that I paid more attention to the speakers, and less to a train of thought in my head. Another is that I'll have to write up the blog posts off-line, and that will take time!
As I looked through the conference program last night, I found myself putting on my department head hat, looking for sessions that would serve my department in the roles I now find myself in more often: CS1 for scientists, educational policy in CS, and the like. But when I got to the site and found myself having to choose between Door A and Door B... I found myself drifting into the room where Stuart Reges was talking about a cool question that seems to pick out good CS students, and then into the session on nifty assignments. Whatever my job title may be, I am a programmer and CS teacher. (More on both of those sessions in coming entries...)
Now, for a couple of non-CS, non-teaching observations.
There is so much for me to learn.
Last night, at dinner with my family, I casually mentioned this YouTube video in which Barack Obama answers a question from a Google interviewer about how to sort a million 32-bit integers. Obama gets a good laugh when he says that "the bubble sort would be the wrong way to go". My family knows that I enjoy pointing out pop references to CS, so I figured they'd take this one in stride and move on.
But they didn't. Instead, they asked questions. What is "bubble sort"? Why do they call it that? As I described the idea, they followed with more questions and ideas of their own. I told them that bubble sort was the first sorting algorithm I ever learned, programming in BASIC as a junior in high school. My wife mentioned something like the selection sort, so I told them a bit about selection sort and insertion sort, and how they are considered "better" than bubble sort.
Why? they asked. That led us to Big-Oh notation and O(n²) versus O(n log n), and why the latter is better. We talked about how we can characterize an algorithm by its running time as proportional to n² or n log n, times some constant factor k, and the role that k plays in complicating our comparisons. I mentioned that O(n²) and a big k are part of the reason that bubble sort is considered bad, and that's what made the answer in the video correct -- and also why I am pretty sure that Obama did not understand any of the reasoning behind his answer, which is what made his deadpan confidence worth a chuckle.
(If you would like to learn more about bubble sort and have a chuckle of your own, read Owen Astrachan's Bubble Sort: An Archaeological Algorithmic Analysis (PDF), available from his web site.)
As the conversation wound down, we talked about how we ourselves sort things, and I got to mention my favorite sorting algorithm for day-to-day tasks, mergesort.
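For readers who want a bit more than the dinner-table version, here is roughly what we were comparing -- textbook sketches of bubble sort and mergesort, written for clarity rather than speed:

    def bubble_sort(a)
      a = a.dup
      loop do
        swapped = false
        (a.length - 1).times do |i|
          if a[i] > a[i + 1]
            a[i], a[i + 1] = a[i + 1], a[i]   # larger values "bubble" toward the end
            swapped = true
          end
        end
        break unless swapped                  # a full pass with no swaps means we are done
      end
      a
    end

    def merge_sort(a)
      return a if a.length <= 1
      mid   = a.length / 2
      left  = merge_sort(a[0...mid])
      right = merge_sort(a[mid..-1])
      merged = []
      merged << (left.first <= right.first ? left.shift : right.shift) until left.empty? || right.empty?
      merged + left + right
    end

    p bubble_sort([5, 1, 4, 2, 3])   # => [1, 2, 3, 4, 5]
    p merge_sort([5, 1, 4, 2, 3])    # => [1, 2, 3, 4, 5]

The repeated passes over nearly the whole array in the first are where the n² comes from; the repeated halving in the second is where the log n comes from.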
I suspect that my younger daughter enjoyed this conversation mostly for hearing daddy the computer scientist answer questions, but my wife and freshman daughter seemed to have grokked some of what we talked about. Honest -- this wasn't just me prattling on unprovoked. It was fun, yet strange. Maybe conversations like this one can help my daughters have a sense of the many kinds of things that computer scientists think about. Even if it was just bubble sort.
Leave it to George Costanza. In the episode of Seinfeld titled The Masseuse, George finally has a great relationship with a wonderful woman. Inexplicably, she likes everything about him. Yet all he can think about is Jerry's current girlfriend, a masseuse who can't stand George. Rather than turn his attention to his own loving partner, he makes such a strident effort to get the masseuse to like him that he drives her even further away -- and loses his own girl, who can't understand George's obsession. But it's really quite simple: George wants everyone to like him.
I understand that not everyone will like me. But deep inside it's easy to lose sight of that fact in the course of daily interactions. When I became department head, one of my goals was to treat everyone fairly, to be open and honest so that each member of the faculty could trust that I was giving him or her a fair hearing and doing the best I could to help him or her succeed under whatever conditions we found ourselves operating.
That's where George's problem tries to sneak in the door. What if I do treat everyone fairly and am open and honest; what if I do all I can so that each faculty member can trust me and my intentions -- and still someone is unhappy with me? What then?
Trying to do what George tried to do is a recipe for disaster. As hard as it is sometimes, all I can do is what I can do. I should -- must -- act in a trustworthy manner, but I cannot make people like what I do, or like me. That is part of the territory. For me, though, the occasional encounter with this truth sucks a lot of psychic energy out of me.
This is the second semester of my third year as head, which means that I am undergoing a performance evaluation. I suppose the good news is that the dean feels comfortable enough with how I've done to do the review at all, rather than look for a new person for the next three-year appointment. He is using an assessment instrument developed by the IDEA Center at Kansas State. The faculty were asked to judge my performance on a number of tasks that are part of a head's job, such as "Guides the development of sound procedures for assessing faculty performance" and "Stimulates or rejuvenates faculty vitality/enthusiasm". My only role in the process was to rank each of the tasks in terms of their importance to the job.
I look at the review as both summative and formative. The summative side of the review is to determine how well I've done so far and whether I should get to keep doing it. The formative side is to give me feedback I can use to improve for the future. As you might guess from my fondness for so-called agile software development practices, I am much more interested in the formative role of the assessment. I know that my performance has not been ideal -- indeed, it's not even been close! -- but I also know that I can get better. Feedback from my colleagues and dean will help.
Though I was not asked to assess my performance on these issues, I do have a sense of my job performance. I have been only marginal in managing day-to-day affairs. That task requires a certain kind of focus and energy that I've had to develop on the job. I've also had to learn how to respond effectively in the face of a steady barrage of data, information, and requests. I have also been only marginal in "leadership" tasks, the ones that require I take initiative to create new opportunities for faculty and students to excel. This is an area where I have had a lot of ideas and discussed possibilities with the faculty, but finding time to move many of these ideas forward has been difficult.
In an area of particular importance to our department given its history, I have done a reasonable job of communicating information to the faculty, treating individual faculty fairly, and encouraging conversation. I recognized these tasks as primary challenges when I accepted my appointment and, while I had hoped to do better, I've done well so far to keep this dynamic front and center.
The results of the faculty survey are in; they arrived in my mailbox yesterday. I decided not to read the results right away... I have been a little under the weather and wanted to preserve my mental energy for work. The last session of my 5-week bash scripting course meets today, and I would rather be focused on wrapping up the class than on the data from my evaluation. I can tell myself not to fall victim to George's masseuse problem, but sometimes that is more easily done with conscious choices about how and when to engage relationships.
This afternoon, I'll look at the data, see what they can help me learn, and think about the future.
I wasn't expecting to hear John Maeda's name during the What is a Tree? talk, because I didn't know that researchers in Maeda's lab had created the language Processing. But hearing his name brought to mind something that has been in the back of my mind for a couple of months, since the close of my first theater experience. I had blogged about a few observations my mind had made about the processes of acting in and directing a play. The former were mostly introspective, and the latter were mostly external, as I watched our director coalesce what seemed like a mess into a half-way decent show. Some of these connections involved similarities I noticed between producing a play and creating software.
I made notes of a few more ideas that I hadn't mentioned yet, including:
I'm still wondering if those last two have any useful analogue in software development...
Since the show ended, I have occasionally tried to discern the value in the analogy between producing a play and creating software -- indeed, if there is any. That's where the connection to Maeda comes in. Last summer, I read the slender Laws of Simplicity, a collection of essays from Maeda's blog of the same name. The book suggests ten ways that we can design simpler systems and products. I must not have been in the right place to read the book just then, because I didn't get as much out of it as I had hoped. But one part of the book stuck with me.
For a metaphor to engage us deeply, Maeda wrote, it is essential that it relate, translate, and surprise. As I recall now, this means that the metaphor must relate the elements of the two things, that it must translate foreign elements from one of the things to the other, and that the result of this translation should surprise -- it should make us see or understand the other thing in a new way, give us insight.
There is a danger in finding analogies everywhere we look by making superficial connections. I am perhaps more prone to this risk than many other folks. That may be why I liked Maeda's relate/translate/surprise triad so much. Since reading it, I have used it as an external checkpoint for any new analogy that I want to make. If I can explain how the metaphor relates the two things, translates disparate elements, and surprises me, then I have some reason to think that the metaphor offers value -- at least more reason than just saying, "Hey, look at this cool new thing I noticed!"
To this point, I have not found the "surprise" in the theater experience that teaches me something new about how to think about making software. This doesn't mean that there is no value in the analogy, only that I haven't found it yet. By remaining skeptical a little while longer, I decrease the probability that I try to draw an inappropriate conclusion from the relationship.
Of course, just because I haven't yet found the surprise in the analogy doesn't mean that I did not find value in the experience that led me to it. A rich web of experiences is valuable in its own right, and enjoyable. It also provides the source material for learning.
Yesterday was the sort of day that makes my CS friends and colleagues ask if I am crazy for being department head. It was the third day of classes this semester. A dozen students came by, for advising on course selection, for help switching sections, and the like. I produced a schedule mapping graduate assistants to open lab hours, ran it past the GAs and the faculty, and then distributed it. The phone rang repeatedly, with calls from other offices on campus and from off-campus folks asking questions about scholarship application deadlines.
Every time I started a new train of thought, an interrupt occurred. Context switch, new process, and return. Each task was, individually, just fine. In fact, I enjoy talking to students, new and returning, and helping them make choices about their studies. But little or no computer science happened.
Today was my teaching day, so I got to spend plenty of time thinking about shell scripts. That's not Computer Science, but it's computer science, and as a hacker I loved it. Of course, yesterday's interrupt-fest cut into my prep time enough that I didn't feel as prepared for class as I like to be. But I got to think about software tools, writing code, duplication, abstraction -- many of the things that make me a happy computer scientist.
Tomorrow I travel to Des Moines to help select the winners of the 2008 Prometheus Awards, the Academy Awards of IT in my state. Four hours on the road. A great outreach activity, an important role for my university, and conversation with some sharp, interesting people who are involved in my discipline's industry -- but little or no computer science.
The weekend will be here soon.
I have always liked the week before classes start for a new semester. There is a freshness to a new classroom of students, a new group of minds, a new set of lectures and assignments. Of course, most of these aren't really new. Many of my students this semester will have had me for class before, and most semesters I teach a course I've taught before, reusing at least some of the materials and ideas from previous offerings. Yet the combination is fresh, and there is a sense of possibility. I liked this feeling as a student, and I like it as a prof. It is one of the main reasons that I have always preferred a quarter system to a semester system: more new beginnings.
Since becoming department head, the joy is muted somewhat. For one thing, I teach one course instead of three, and instead of taking five. Another is that this first week is full of administrivia. There are graduate assistantship assignments to prepare, lab schedules to produce, last-chance registration sessions to run. Paperwork to be completed. These aren't the sort of tasks that can be easily automated or delegated or shoved aside. So they capture mindshare -- and time.
This week I have had two other admin-related items on my to-do list. First is an all-day faculty retreat my department is having later this week. The faculty actually chose to get together for what is in effect an extended meeting, to discuss the sort of issues that can't be discussed very easily during periodic meetings during the semester, which are both too short for deep discussion and too much dominated by short-term demands and deadlines. As strange as it sounds, I am looking forward to the opportunity to talk with my colleagues about the future of our department and about some concrete next actions we can take to move in the desired direction. There is always a chance that retreats like this can fall flat, and I bear some responsibility in trying to avoid that outcome, but as a group I think we can chart a strong course. One good side effect is that we will go off campus for a day and get away from the same old buildings and rooms that will fill our senses for much of the next sixteen weeks.
Second is the dean's announcement of my third-year review. Department heads here are reviewed periodically, typically every five years. I came into this position after a couple of less-than-ideal experiences for most of the faculty, so I am on a 3-year term. This will be similar to the traditional end-of-the-term student evaluations, only done by faculty of an administrator. In some ways, faculty can be much sharper critics than students. They have a lot of experience and a lot of expectations about how a department should be run. They are less likely to "be polite" out of habits learned as a child. I've been a faculty member and do recall how picky I was at times. And this evaluation will drag out for longer than a few minutes at the end of one class period, so I have many opportunities to take a big risk inadvertently. I'm not likely to pander, though; that's not my style.
I'm not all that worried. The summative part of the evaluation -- the part that judges how well I have done the job I'm assigned to do -- is an essential part of the dean determining whether he would like me to continue. While it's rarely fun to receive criticism, it's part of life. I care what the faculty think about my performance so far, flawed as we all know it's been. Their feedback will play a large role in my determining whether I would like to continue in this capacity. The formative part of the evaluation -- the part that gives me feedback on how I can do my job better -- is actually something I look forward to. Participating in writers' workshops at PLoP long ago helped me to appreciate the value of suggestions for improvement. Sometimes they merely confirm what we already suspect, and that is valuable. Other times they communicate a possible incremental improvement, and that is valuable. At other times still they open doors that we did not even know were available, and that is really valuable.
I just hope that this isn't the sort of finding that comes out of the evaluation. Though I suppose that that would be valuable in its own way!
Some students do listen to what we say in class.
Back when I taught Artificial Intelligence every year, I used to relate a story from Russell and Norvig when talking about the role knowledge plays in how an agent can learn. Here is the quote that was my inspiration, from Pages 687-688 of their 2nd edition:
Sometimes one leaps to general conclusions after only one observation. Gary Larson once drew a cartoon in which a bespectacled caveman, Zog, is roasting his lizard on the end of a pointed stick. He is watched by an amazed crowd of his less intellectual contemporaries, who have been using their bare hands to hold their victuals over the fire. This enlightening experience is enough to convince the watchers of a general principle of painless cooking.
I continued to use this story long after I had moved on from this textbook, because it is a wonderful example of explanation-based learning.
Unfortunately, Russell and Norvig did not include the cartoon, and I couldn't find it anywhere. So I just told the story and then said to the class -- every class of AI students to go through my university over a ten-year stretch -- that I hoped to find it some day.
As of yesterday, I have it, thanks to a former student. Ryan heard me on that day in his AI course and never forgot. He looked for that cartoon in many of the ways I have over the years, by googling and by thumbing through Gary Larson collections in the book stores. Not too long ago, he found it via a mix of the two methods and tracked it down in print. Yesterday, on one of his annual visits (he's a local), he brought me a gift-wrapped copy. And I was happy!
Sadly, I still can't show it to you or to any of my former students who read my blog. Sorry. I once posted another Gary Larson cartoon in a blog entry, with a link to the author's web site, only to eventually receive a pre-cease-and-desist e-mail asking me to pull the cartoon from the entry. I'll not play with that fire again. This is almost another illustration of the playful message of the very cartoon in question: learning not to stick one's hand into the flame from a single example. But not quite -- it's really an example of learning from negative feedback.
Thanks to Ryan nonetheless, for remembering an old prof's story from many years ago and for thinking of him during this Christmas season! Both the book and the remembering make excellent gifts.
My first run as an actor has ended without a Broadway call. Nonetheless I consider it to have been successful enough. My character didn't cause any major interruptions in the flow of our three performances, and I even got us back on track a time or two. Performing in front of a crowd -- especially a crowd that contained personal friends -- was different enough from giving a lecture or speaking in public that it wracked a few nerves. But getting a laugh from a real audience was also different enough from getting a laugh in a lecture that the buzz could feed the rest of the performance.
My first post on this topic recorded some thoughts I had had on the relationship between developing software and directing a play. In those thoughts, the director or producer is cast as the software developer, or vice versa. In the last couple of weeks, my thoughts turned more often to my role as performer. Here are a few:
After a while, experience helps push self-consciousness into the background. It was even possible to get into a flow where the self disappeared for a moment. I think I need more experience in character to have more experiences like that! But those moments were special.
Most of the relationships I noticed between acting in the play and building software were really patterns of good teams. In every scene I depended upon the presence and performance of others -- and they depended on me. Being a good teammate mattered both on stage (while performing) and off (when preparing and when taking and giving feedback). "The key to acting," said our director, "is listening to other people." Funny how that is the key to so many things.
As I look back on this (first?) experience being a player in a stage production, I think that there is a lot to this notion that developing software is like producing a play -- and that producing a play is like developing software. The two media are so different, but they are both malleable, and both ultimately depend on their audiences (users).
Over the course of two weeks or so, the director did a lot of what I call refactoring. For example, he found the equivalent of duplicated code -- lines and even larger parts of scenes that don't move the story forward, given how the rest of the play is being staged. Removing duplicated stuff frees up stage real estate and time for making other additions and changes. He also aggressively sought and deleted dead space -- moments when no one was on stage (say, in the transition between scenes) or no action was taking place (say, when lighting changed). Dead space kills the energy of the show and distracts the viewer. Dead space is a little like dead code and over-designed code -- code that isn't contributing to the application. Cut it.
Every night after rehearsals, and even after shows, the director "ran notes" with us. This was a time after each "iteration" dedicated to debugging and refactoring. That's good practice in software.
One other connection jumped out to me yesterday. After we closed the show, I was chatting with Scott Smith, a local filmmaker who is the real-life husband of the woman who played my wife in the show. We were discussing how filmmaking has changed in the last decade or so. In the not-so-old days folks in video were strongly encouraged to become specialists in one of the stages: writing, directing, shooting, editing, and so on. Now, with the wide availability of relatively inexpensive equipment and digital tools, and economic pressures to deliver more complete services, even veterans such as Smith find themselves developing skills across the board, becoming not jacks-of-all-trades and masters of none, but rather strong in all phases of the game.
I immediately thought of extreme programming's rapid development cycle that requires programmers to be not only writers of code but also writers of stories and tests, to be able to interact with clients and to grow designs and architectures. It's hard to be a master of all trades, but the sort of move we have seen in software and in filmmaking from specialist to generalist encourages a deep competence in all areas. Too often I have heard folks say "I am a generalist" as a way to explain their lack of expertise in any one area. But the new generalist is competent across the board, perhaps expert in multiple areas, and able to contribute meaningfully to the whole lifecycle.
One last idea. Just before our final show, the director gave us our daily pep talk. He said that some performers view the last show as an occasion to do something wacky -- to misplace someone's prop, or deliver a crazy line not from the script, or to affect some voice or mannerism on stage. That sounds like fun, he said, but remember: For the audience out in the seats today, this is the first show. They deserve to see the best version of the show that we can give. For some reason, I thought of software developers and users. Maybe my mind was just hyperactive at that moment when we were about to create our illusion. Maybe not.
Classes are over. Next week, we do the semiannual ritual of finals week, which keeps many students on edge while at the same time releasing most of the tension in faculty. The tension for my compiler students will soon end, as the submission deadline is 39 minutes away as I type this sentence.
The compiler course has been a success in several ways, especially in the most important: students succeeded in writing a compiler. Two teams submitted their completed programs earlier this week -- early! -- and a couple of others have completed the project since. These compilers work from beginning to end, generating assembly language code that runs on a simple simulated machine. Some of the language design decisions contributed to this level of success, so I feel good. (And I already know several ways to do better next time!)
I've actually wasted far too much time this week writing programs in our toy functional language, just because I enjoy watching them run under the power of my students' compilers.
More unthinkable: There is a greater-than-0% chance that at least one team will implement tail call optimization before our final exam period next week. They don't have an exam to study for in my course -- the project is the purpose we are together -- so maybe...
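For anyone who hasn't met the idea: a call is a tail call when it is the very last thing a function does, so a compiler that optimizes tail calls can reuse the caller's stack frame, in effect turning the recursion into a loop. Our toy functional language isn't worth reproducing here, but a Ruby-flavored sketch of the idea looks like this:

    # The recursive call in fact_acc is in tail position: nothing remains to be
    # done after it returns, so a compiler with tail call optimization can emit
    # a jump back to the top of the function instead of a call.
    def fact_acc(n, acc = 1)
      return acc if n <= 1
      fact_acc(n - 1, n * acc)     # tail call: no work remains in this frame
    end

    # With tail call optimization the recursion above runs in constant stack
    # space, exactly like this loop. Without it, each step burns a stack frame.
    def fact_loop(n)
      acc = 1
      while n > 1
        acc *= n
        n -= 1
      end
      acc
    end

    p fact_acc(10)    # => 3628800
    p fact_loop(10)   # => 3628800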
In lieu of an exam, we will debrief the project -- nothing as formal as a retrospective, but an opportunity to demo programs, discuss their design, and talk a bit about the experience of writing such a large, non-trivial program. I have never found or made the time to do this sort of studio work during the semester in the compilers course, as I have in my other senior project courses. This is perhaps another way for me to improve this course next time around.
The end of week n is a good place to be. This weekend holds a few non-academic challenges for me: a snowy 5K with little hope for the planned PR and my first performances in the theater. Tonight is opening night... which feels as much like a scary final exam as anything I've done in a long time. My students may have a small smile in their hearts just now.
Context You are in an Interactive Performance, perhaps a play, using Scripted Dialogue.
Problem The performer speaking before you delivers a line incorrectly. The new line does not change the substance of the play, but it interrupts the linguistic flow.
Example Instead of saying "until the first of the year", the performer says "for the rest of the year".
Forces You know your lines and want to deliver them correctly.
The author wrote the dialogue with a purpose in mind.
Delivering the line as you memorized it is the safest way for you to proceed, and also the safest route back on track.
BUT... Delivering the scripted line will call attention to the error. This may disconcert your partner. It will also break the mood for the audience.
So: Adapt your line to the setup. Respond in a way that is seamless to the audience, retains the key message of the story, and gets the dialogue back on track.
That is, catch what you are thrown.
Example Change your line to say "for the rest of the year?" instead of "until the first of the year?"
Related Patterns If the performer speaking before you misses a line entirely, or gets off the track of the Scripted Dialogue, deliver a Redirecting Line.
----
Postscript: This category of my blog is intended for software patterns and discussion thereof, but this is a pattern I just learned and felt a strong desire to write up. I may well try to write Redirecting Line and maybe even the higher-level Scripted Dialogue and Interactive Performance patterns, if the mood strikes me and the time is available. I never thought of a pattern language of performance when I signed on for this gig... And just so you know, I was the performer who mis-delivered his line in the example given above, where I first encountered Catch What You're Thrown.
In a stunning departure from my ordinary behavior, I have taken an acting role in a play. My daughters were recently cast in a production of Barbara Robinson's classic children's story The Best Christmas Pageant Ever, being put on by a local church. The director is well known in our area as an actor and as the long-time director of a tremendous local children's theater, and he has just returned to the area as youth director of said church. He is also the virtual training partner to whom I have referred a few times in my entries on marathon preparation.
This play is mostly about kids and ladies, and plenty of folks auditioned for those roles. But when the one guy who auditioned for the part of the father dropped out, the production was left with a big hole. My daughters joked that I should fill in; it would be fun. My running partner-as-director assured me that I could handle what is really a small supporting role, even though I have no acting experience to speak of. After some hemming and hawing, I decided to give it a go. A compressed rehearsal schedule and a relaxed venue were enough to lower my fears, and the chance to work with my daughters -- who love to perform and who are getting pretty good at it -- was enough to convince me to take a risk.
So, in a few weeks, I will appear on stage as father Bob Bradley, immortalized by veteran character actor Jackson Davies in a made-for-TV film starring Loretta Swit.
Fortunately, my role in the play is a bit larger than the dad's role in the movie. Davies played a small, straight part, and I get to go for a laugh or two. The dad, though, also gets to deliver a key passage in the story, what I call my "Linus moment", in analogy to A Charlie Brown Christmas. My lines are neither as extensive nor quite as poignant as the spotlighted soliloquy of Linus's Biblical passage, but still it is a pivotal moment. How is that for pressure on a first-time actor with no discernible natural skill? May I rise to the challenge!
I'm still not sure what to expect. I figure in the worst case we have a little fun. In the best case, perhaps learning a bit about how to deliver a line and mug for the audience will improve my "stage presence" as a teacher and as a public speaker. I usually live my life on a rather narrow path, so stretching my boundaries is almost certainly a good thing.
Recently I mentioned the big pharmaceutical company Eli Lilly in an entry on the next generation of scientists, because one of its scientists spoke at the SECANT workshop I was attending. I have some roundabout personal connections to Lilly. It is based in my hometown.
When I was in high school and had moved to a small town in the next county, I used to go with some adult friends to play chess at the Eli Lilly Chess Club, which was the only old-style corporate chess club of its kind that I knew of. (Clubs like it used to exist in many big cities in the 19th and early 20th centuries. I don't know how common they are these days. The Internet has nearly killed face-to-face chess.) I recall quite a few Monday nights losing quarters for hours while playing local masters at speed chess, at 1:30-vs-5:00 odds!
Coincidentally, my high school hometown was also home to a Lilly Research Laboratories facility, which does work on vaccines, toxins, and agricultural concerns. Parents of several friends worked there, in a variety of capacities. When I was in college, I went out on a couple of dates with a girl from back home. Her father was a research scientist at Lilly in Greenfield. (A quick google search on his name even uncovers a link to one of his papers.) He is the sort of scientist that Kumar, our SECANT presenter, works with at Lilly. Interesting connection.
But I can go one step further and bring this even closer to my professional life these days. My friend's last name was Gries. It turns out that her father, Christian Gries, is brother to none other than distinguished computer scientist David Gries. I've mentioned Gries a few times in this blog and even wrote an extended review of one of his classic papers.
I don't think I was alert enough at the time to be sufficiently impressed that Karen's uncle was such a famous computer scientist. In any case, hero worship is hardly the basis for a long-term romantic relationship. Maybe she was wise enough to know that dating a future academic was a bad idea...
I recently realized something.
In books with academic settings, one often sees images of the professor, deep in thought, strolling along the tree-lined walks of campus. Students bustle about on the way between classes. The professor walks along, carefree, absorbed in whatever interesting problem holds his or her attention. (All too often, it's a him.) Even if he is running late, has a meeting to attend or a class to lead, he hurries not. He is a professor and leads a life of his own design, even if administrators and students try to impinge on his time. Whatever deep thought occupies his mind comes first. So peaceful.
Movies show us these images, too. So peaceful.
I've never been like that. My campus setting looks much like the ones described in books and movies (though lately ours has looked more like a construction zone than an old-Ivy plat), but I always seem to be in a hurry. Can't be late for class, or late for that meeting. Too much to do.
I've often asked myself: when will it be like it is in the books and movies?
My realization: The problem isn't with my campus or even my university. It's me.
The images in the books and movies are different because the prof ambling peacefully along isn't me. It's Professor Kingsfield. Many of these characters are clichés even when done well, but in any case they are different from me.
The only way for me to live out those images is to modify my own behavior or outlook. Peace comes from inside, not out there. But I don't think I am in need of a change... I'm not restless or dissatisfied; I'm just busy being me, solving problems and thinking about the latest something to cross my path.
So maybe what I need to change is my expectation -- the expectation that I can or even should be like the fictional people I see in those scenes. I suspect that having unrealistic expectations is the cause of as much disharmony as having the "wrong outlook". The outlook isn't always wrong. Sometimes it's just me.
Sponsored outlets in the walkways of the concourses.
The sponsor in this case is Chase, The Bank Formerly Known as Chase Manhattan and later as JP Morgan Chase. Each outlet plate has a blue Chase banner running down the wall from about eye level right down to the pair of outlets. The banners caught my eye, so I guess they worked. Eventually the gimmick will wear out its novelty -- perhaps it already has for other flyers, or elsewhere in the country; I don't fly often -- but I thought it was cute. Funny how changes in technology have made something as mundane as an open outlet so valuable!
Oh, and thanks to cashing in some very old, expiring frequent flyer miles, I flew first class for the first time in a long time, from Indianapolis to John Wayne/Orange County. It wasn't quite like the Seinfeld episode in which Jerry and Elaine experience the different sides of traveling first class and coach, but it was very, very nice. A good addition to my vacation.
... at the end of a long day.
I found only one relevant link -- the first link on the results page, of course -- but it was not for the shop. Instead it included a blog entry written by a friend of Scott's son, which quoted the full text of the son's eulogy for his father. My good friend and former boss died this past March after a long battle with lung disease. (In addition to being a chess hound and a professional sheet metal man, he smoked far too much.) The eulogy almost brought me to tears as it reminisced about the decent man I, too, remembered fondly and respected so. I have no simple way to contact Scott's son to thank him for sharing his eulogy, but I did leave a comment on the blog.
Not many years ago, the idea that I could have learned about Scott's passing in this way and read the eulogy would have been unthinkable. The connection was indirect, impersonal in some ways, but deeply personal. For all its shortcomings, our technology makes the world a better place to live.
But I don't actually mind not having comments. I sometimes miss the interactivity that comments would enable, but managing comments and combatting comment spam takes time, time that I would rather spend reading and blogging.
Oh, and he's spot on about that procrastinating thing.
Back to paradise.
Two current events have me thinking about AI, one good and one sad.
First, after reporting last week that checkers has been solved by Jonathan Schaeffer's team at the University of Alberta, this week I can look forward to the Man vs. Machine Poker Challenge at AAAI'07. The computer protagonist in this event, Polaris, also hails from Alberta and Schaeffer's poker group. In this event, which gets under way shortly in Vancouver, Polaris will play a duplicate match against two elite human pros, Phil Laak and Ali Eslami. Laak and Eslami will play opposite sides of the same deal against Polaris, in an attempt to eliminate the luck of the draw from the result.
I don't know much about computer card-playing. Back when I was teaching AI in the mid-1990s, I used Matthew Ginsberg's text, and from his research learned a bit about programs that play bridge. Of course, bridge players tend to view their game as a more intellectual task than poker (and as more complex than, say, chess), whereas poker introduces the human element of bluffing. It will be fun seeing how a "purely rational" being like Polaris bluffs and responds to bluffs in this match. If poker is anything at all like chess, I figure that the program's dispassionate stance will help it respond to bluffs in a powerful way. Making bluffs seems a different animal altogether.
I wish I could be in Vancouver to see the matches. Back in 1996 I was fortunate to be at AAAI'96 in Philadelphia for the first Kasparov-Deep Blue match. The human champ won a close match that year before losing to Deep Blue the next. We could tell from Kasparov's demeanor and behavior during this match, as well as from his public statements, that he was concerned that humans retain their superiority over machines. Emotion and mental intimidation were always a part of his chess.
On the contrary, former World Series of Poker champion Laak seems unconcerned at the prospect that Polaris might beat him in this match, or soon; indeed, he seems to enjoy the challenge and understand the computational disadvantage that we humans face in these endeavors. That's a healthier attitude, both long term and for playing his match this week. But I appreciated Kasparov's energy during that 1996 match, as it gave us demonstrative cues about his state of mind. I'll never forget the time he made a winning move and sat back smugly to put his wristwatch back on. Whenever Garry put his watch back on, we knew that he thought he was done with the hard work of winning the game.
The second story is sadder. Donald Michie, a pioneer in machine learning, has died. Unlike many of the other founders of my first love in computing, I never had any particular connection to Michie or his work, though I knew his name well from the series of volumes on machine learning that he compiled and edited, as they are staples of most university libraries. But then I read in his linked Times On-Line article:
In 1960 he built Menace, the Matchbox Educable Noughts and Crosses Engine, a game-playing machine consisting of 300 matchboxes and a collection of glass beads of different colours.
We Americans know Noughts and Crosses as tic-tac-toe. It turns out that Michie's game-playing machine -- one that needed a human CPU and peripherals in order to run -- was the inspiration for an article by Martin Gardner, which I read as a sophomore or junior in high school. This article was one of my first introductions to machine learning and fueled the initial flame of my love for AI. I even built Gardner's variant on Michie's machine, a set of matchboxes to play Hexapawn and watched it learn to play a perfect game. It was no Chinook or Deep Blue, but it made this teenager's mind marvel at the possibilities of machine intelligence.
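If you have never seen how such a machine learns, here is a minimal sketch of the bead mechanism in Ruby. It is not Michie's exact design -- his bead counts differed, and the game here is a tiny four-stone Nim rather than tic-tac-toe or Hexapawn, so that it fits in a few lines -- but the reinforce-on-win, punish-on-loss idea is the same:

    # One "matchbox" of beads per position. A bead is drawn at random to pick a
    # move; beads are added along a winning path and removed along a losing one.
    # The game: 4 stones, take 1 or 2 per turn, taking the last stone wins.
    # The learner moves first against a random opponent.
    boxes = Hash.new { |h, state| h[state] = { 1 => 3, 2 => 3 } }

    def pick(beads)
      total = beads.values.sum
      r = rand(total)
      beads.each { |move, count| return move if (r -= count) < 0 }
    end

    1000.times do
      stones, history, learner_won = 4, [], false
      loop do
        # learner's turn: draw a bead from the box for the current position
        legal = boxes[stones].select { |move, _| move <= stones }
        move  = pick(legal)
        history << [stones, move]
        stones -= move
        if stones.zero?
          learner_won = true
          break
        end
        # random opponent's turn
        stones -= rand(1..[2, stones].min)
        break if stones.zero?             # opponent took the last stone
      end
      # reinforcement: reward the moves on a winning path, punish a losing one
      history.each do |state, move|
        if learner_won
          boxes[state][move] += 3
        else
          boxes[state][move] = [boxes[state][move] - 1, 1].max   # never empty a box
        end
      end
    end

    p boxes[4]   # after training, "take 1" (the winning opening) should dominate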
So, I did have a more direct connection to Michie, and had simply forgotten! RIP, Dr. Michie.
My 25th high school reunion is next month. (I can just hear the pencils at work as students, current and former, figure out just how old I am.) So I took this opportunity to re-read Alan Lightman's novel Reunion, which is about a college professor's 30th college reunion. I first read this book when it came out several years ago, but the theme was more timely this time around.
I first learned about Lightman, a physicist-turned-novelist whose fact and fiction both rest on a physics foundation, from an endnote in David Bodanis's E=mc2, which referred me to Einstein's Dreams. This was an unusual book, only a couple of dozen short chapters, consisting of a few fictional vignettes of Einstein's thinking and discussions with Michele Besso as he reconceptualized time for his theory of relativity, interspersed among twenty or so fictional dreams that Einstein might have had about worlds in which time behaves differently than it does in our world. For example, in one world, time passes faster when one is at higher altitudes; in another, one occasionally gets stuck at a single place in time; in yet another, time moves backward.
I found this book delightful, both creative and wonderfully written. The conversations between Einstein and Besso sounded authentic to this non-physicist, and the dream chapters were both "whimsical" and "provocative" (words I borrow from a literary review of the book) -- what would it be like if different neighborhoods lived in different decades or even centuries? Lightman writes as a poet, spare with words and description, precise in detail. Yet the book had a serious undercurrent, as it exposed some of the questions that physicists have raised about the nature of time, and how time interacts with human experience.
Later I found Reunion. It's more of a traditional human story, and I expect that some of my friends would derogate it as "chick lit". But I disagree. First, it's a man's story: a 52-year-old man keenly aware that time has passed beyond his dreams; a 22-year-old man alive with promise, unaware that he is reaching branches in time that can never be passed again. And while its structure is that of a traditional novel, the underlying current is one of time's ambiguity: looking back, looking forward, standing still. Lightman even resorts in the shortest of passages to a common device which in other authors' hands is cliché, but which in his seems almost matter of fact. It's not science fiction because it sticks close to the way a real person might feel in this world, where time seems to move monotonically forward but in which our lives are a complex mishmash of present and past, future and never-was.
I enjoyed Reunion again and, though it's a bit of a downer, it hasn't diminished my anticipation of stepping back in time to see people who were once my friends, and who, because of how time works in my mind, will always be my friends, to reminisce about back-when and since-then, and what-now. Time's linearity will show through, of course, in the graying of hair and the onset of wrinkles...
A former student recently wrote:
I am periodically reminded of a saying that is usually applied to fathers but fits teachers well -- when you are young it's amazing how little they know, but they get much smarter as I get older.
For a teacher, this sort of unsolicited comment is remarkably gratifying. It is also humbling. What I do matters. I have to stay on top of my game.
Philip Greenspun recently posted a provocative blog entry called Why do high school kids keep signing up to be undergrads at research universities? If you've never read any of Philip's stuff, this might seem like an odd and perhaps even naive piece. His claim is pretty straightforward: "Research universities do not bother to disguise the fact that promotion, status, salary, and tenure for faculty are all based on research accomplishments," so why don't our brightest, most ambitious high school students figure out that these institutions aren't really about teaching undergraduates? This claim might seem odd considering that Philip himself went to MIT and now teaches as an adjunct prof there. But he has an established track record of writing about how schools like Harvard, MIT, the Ivies, and their ilk could do a better job of educating undergrads, and at a lower cost.
My thoughts on this issue are mixed, though at a certain level I agree with his premise. More on how I agree below.
As an undergraduate, I went to a so-called regional university, one that grants Ph.D.s in many fields but which is not typical of the big research schools Philip considers. I chose the school for its relatively strong architecture program, which ranked in the top 15 or 20 programs nationally despite being at a school that overall catered largely to a regional student population. There I was part of a good honors college and was able to work closely with published scholars in a way that seems unlikely at a Research U. However, I eventually changed my major and studied computer science and accounting. The accounting program had a good reputation, but its computer science department was average at best. It had a standard curriculum, and I was a good enough student and had enough good profs that I was able to receive a decent education and to have my mind opened to the excitement of doing computer science as an academic career. But when I arrived at grad school I was probably behind most of my peers in terms of academic preparation.
I went to a research school for my graduate study, though not one in the top tier of CS schools. It was at that time, I think, making an effort to broaden, deepen, and strengthen its CS program (something I think it has done). The department gave me great financial support and opportunities to teach several courses and do research with a couple of different groups. The undergrad students I taught and TAed sometimes commented that they felt like they were getting a better deal out of my courses than they got out of other courses at the university, but I was often surprised by how committed some of the very best researchers in the department were to their undergrad courses. Some of the more ambitious undergrads worked in labs with the grad students and got to know the research profs pretty well. At least one of those students is now a tenured prof in a strong CS program down south.
Now I teach at a so-called comprehensive university, one of those medium-sized state schools that offers neither the prestige of the big research school nor the prestige of an elite liberal arts school. We are in a no-man's land in other ways as well -- our faculty are expected to do research, but our teaching expectations and resources place an upper bound on what most faculty can do; our admissions standards grant access to a wider variety of students, but such folks tend to require a more active, more personal teaching effort.
What Greenspun says holds the essence of truth in a couple of ways. The first is that a lot of our best students think that they can only get a good education at one of the big research schools. That is almost certainly not true. The variation in quality among the programs at the less elite schools is greater, which requires students and their parents to be perhaps more careful in selecting programs. It also requires the schools themselves to do a better job communicating where their quality programs lie, because otherwise people won't know.
But a university such as mine can assemble a faculty that is current in the discipline, does research that contributes value (even basic knowledge), and cares enough about its teaching mission to devote serious energy to the classroom. I don't think that a comprehensive's teaching mission in any way speaks ill of a research school faculty's desire to teach well but, as Greenspun points out, those faculty face strong institutional pressure to excel in other areas. The comprehensive school's lower admission standards mean that weaker students have a chance that they couldn't get elsewhere. Its faculty's orientation means that stronger students have a chance to excel in collaboration with faculty who combine interest and perhaps talent in both teaching and research.
If the MITs and Harvards don't excel in teaching undergrads, what value do they offer to bright, ambitious high school students? Commenters on the article answered in a way that sometimes struck me as cynical or mercenary, but I finally realized that perhaps they were simply being practical. Going to Research U. or Ivy C. buys you connections. For example:
Seems pretty plain that he's not looking to buy the educational experience, he's looking to buy the peers and the prestige of the university. And in my experience of what school is good for, he's making the right decision.
You wanna learn? Set up a book budget and talk your way into or build your own facilities to play with the subject you're interested in. Lectures are a lousy way to learn anyway.
But you don't go to college to learn, you go to college to make the friends who are going to be on a similar arc as you go through your own career, and to build your reputation by association....
And:
You will meet and make friends with rich kids with good manners who will provide critical angel funding and business connections for your startups.
Who cares if the undergrad instruction is subpar? Students admitted to these schools are strong academically and likely capable of fending for themselves when it comes to content. What these students really need is a frat brother who will soon be an investment banker in a major NYC brokerage.
It's really unfair to focus on this side of the connection connection. As many commenters also pointed out, these schools attract lots of smart people, from undergrads to grad students to research staff to faculty. And the assiduous undergrad gets to hang around with them, learning from them all. Paul Graham would say that these folks make a great pool of candidates to be partners in the start-up that will make you wealthy. And if a strong undergrad can fend for him- or herself, why not do it at Harvard or MIT, in a more intellectual climate? Good points.
But Greenspun offers one potential obstacle, one that seems to grow each year: price. Is the education an undergrad receives at an Ivy League or research school, intellectual and business connections included, really worth $200,000? In one of his own comments, he writes:
Economists who've studied the question of whether or not an Ivy League education is worth it generally have concluded that students who were accepted to Ivy League schools and chose not to attend (saving money by going to a state university, for example) ended up with the same lifetime income. Being the kind of person who gets admitted to Harvard has a lot of economic value. Attending Harvard turned out not to have any economic value.
I'm guessing, though, that most of these students went to a state research university, not to a comprehensive. I'd be curious to see how the few students who did opt for the less prestigious but more teaching-oriented school fared. I'm guessing that most still managed to excel in their careers and amass comparable wealth -- at least wealth enough to live comfortably.
I'm not sure Greenspun thinks that everyone should agree with his answer so much as that they should at least be asking themselves the question, and not just assuming that prestige trumps educational experience.
This whole discussion leads me to want to borrow a phrase from Richard Gabriel that he applies to talent and performance as a writer. The perceived quality of your undergraduate institution does not determine how good you can get, only how fast you can get good.
I read Greenspun's article just as I was finishing reading the book Teaching at the People's University, by Bruce Henderson. This book describes the history and culture of the state comprehensive universities, paying special attention to the competing forces that on the one hand push their faculty to teach and serve an academically diverse student body and on the other expect research and the other trappings of the more prestigious research schools. Having taught at a comprehensive for fifteen years now, I can't say that the book has taught me much I didn't already know about the conflicting culture of these schools, but it paints a reasonably accurate picture of what the culture is like. It can be a difficult environment in which to balance the desire to pursue basic research that has a significant effect in the world and the desire to teach a broad variety of students well.
There is no doubt that many of the students who enroll in this sort of school are served well, because otherwise they would have little opportunity to receive a solid university education; the major research schools and elite liberal arts schools wouldn't admit them. That's a noble motivation and it provides a valuable service to the state, but what about the better students who choose a comprehensive? And what of the aspirations of faculty who are trained in a research-school environment to value their careers by the intellectual contribution they make to their discipline? Henderson does a nice job laying these issues out for people to consider explicitly, rather than to back into them when their expectations are unmet. This is not unlike what Greenspun does in his blog entry, laying an important question on the line that too often goes unasked until the answer is too late to matter.
All this said, I'm not sure that Greenspun was thinking of the comprehensives at all when he wrote his article. The only school he mentions as an alternative to MIT, Harvard, and the other Ivies is the Olin College of Engineering, which is a much different sort of institution than a mid-level state school. I wonder whether he would suggest that his young relative attend one of the many teacher-oriented schools in his home state of Massachusetts?
After having experienced two or three different kinds of university, would I choose a different path for myself in retrospect? This sort of guessing game is always difficult to play, because I have experienced them all under different conditions, and they have all shaped me in different ways. I sometimes think of the undergraduates who worked in our research lab while I was in grad school; they certainly had broader and deeper intellectual experiences than I had as an undergraduate. But as a first-generation university attendee I grew quite a bit as an undergraduate and had a lot of fun doing it. Had I been destined for a high-flying academic research career, I think I would have had one. Some of my undergrad friends have done well on that path. My ambition, goals, and inclinations are well suited for where I've landed; that's the best explanation for why I've landed here. Would my effect on the world have been greater had I started at a Harvard? That's hard to say, but I see lots of opportunities to contribute to the world from this perch. Would I be happier, or a better citizen, or a better father and husband? Unlikely.
I wish Greenspun's young relative luck in his academic career. And I hope that I can prepare my daughters to choose paths that allow them to grow and learn and contribute.
I don't run into Basic and Cobol all that often these days, but lately they seem to pop up all over. Once recently I even ran into them together in an article by Tim Bray on trends in programming language publishing:
Are there any here that might go away? The only one that feels threatened at all is VB, wounded perhaps fatally in the ungraceful transition to .NET. I suppose it's unlikely that many people would pick VB for significant new applications. Perhaps it's the closest to being this millennium's COBOL; still being used a whole lot, but not creatively.
Those are harsh words, but I suppose it's true that Cobol is no longer used "creatively". But we still hear a strong call for Cobol instruction from industry, both from companies that typically recruit our students and from companies in the larger region -- Minneapolis, Kansas City, etc. -- that have learned that we have a Cobol course on the books. Even with industry involvement, there is effectively no student demand for the course. Whether VB is traveling the same path, I don't know. Right now, there is still decent demand for VB from students and industry.
Yesterday, I ran into both languages again, in a cool way... A reader and former student pointed out that I had "hit the big leagues" when my recent post on Alan Kay started scoring points at programming.reddit.com. When I went there for a vanity stroke, I ran into something even better, a Sudoku solver written in Cobol! Programmers are a rare and wonderful breed. Thanks to Bill Price for sharing it with us. [1]
While looking for a Cobol compiler for my Intel Mac [2], I ran instead into Chipmunk Basic, "an old-fashioned Basic interpreter" for Mac OS. This brings back great memories, especially in light of my upcoming 25th high school reunion. (I learned Basic as a junior, in the fall of 1980.) Chipmunk Basic doesn't seem to handle my old graphics-enabled programs, but it runs most of the programs my students wrote back in the early 1990s. Nice.
I've been considering a Basic-like language as a possible source language for my compiler students this fall. I first began having such thoughts when I read a special section on lightweight languages in a 2005 issue of Dr. Dobb's Journal and found Tom Pitman's article The Return of Tiny Basic. Basic has certain limitations for teaching compilers, but it would be simple enough to tackle in full within a semester. It might also be nice for historical reasons, to expose today's students to something that opened the door to so many CS students for so many years.
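To give a feel for just how small such a language is, here is a rough sketch of a recursive-descent parser for the expression part of a Tiny-Basic-like language. The grammar and token scheme are my own simplification for illustration, not Pitman's Tiny Basic itself, but the full statement grammar is not much larger.

    # A sketch of recursive-descent parsing for Tiny-Basic-style expressions
    # (my own simplified grammar, for illustration only):
    #
    #   expr   ::= term (("+" | "-") term)*
    #   term   ::= factor (("*" | "/") factor)*
    #   factor ::= NUMBER | VARIABLE | "(" expr ")"
    import re

    TOKEN = re.compile(r"\s*(?:(\d+)|([A-Z])|(.))")

    def tokenize(src):
        # Numbers, single-letter variables (classic Basic style), and operators.
        for num, var, op in TOKEN.findall(src):
            if num: yield ("NUM", int(num))
            elif var: yield ("VAR", var)
            elif op.strip(): yield ("OP", op)

    class Parser:
        def __init__(self, src):
            self.toks = list(tokenize(src))
            self.pos = 0

        def peek(self):
            return self.toks[self.pos] if self.pos < len(self.toks) else ("EOF", None)

        def advance(self):
            tok = self.peek()
            self.pos += 1
            return tok

        def expr(self):
            node = self.term()
            while self.peek() in (("OP", "+"), ("OP", "-")):
                op = self.advance()[1]
                node = (op, node, self.term())
            return node

        def term(self):
            node = self.factor()
            while self.peek() in (("OP", "*"), ("OP", "/")):
                op = self.advance()[1]
                node = (op, node, self.factor())
            return node

        def factor(self):
            kind, val = self.advance()
            if kind in ("NUM", "VAR"):
                return val
            if (kind, val) == ("OP", "("):
                node = self.expr()
                self.advance()               # consume the closing ")"
                return node
            raise SyntaxError("unexpected token: %r" % (val,))

    # Parser("A * (B + 3)").expr()  =>  ('*', 'A', ('+', 'B', 3))

Statements like LET, PRINT, IF, and GOTO each add only a method or two of the same shape, which is part of what makes a Basic-like source language feel tractable within a single semester.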
----
[1] I spent a few minutes poking around Mr. Price's website. In some sort of cosmic coincidence, it seems that Mr. Price took his undergraduate degree at the university where I teach (he's an Iowa native) and is an avid chess player -- not to mention a computer programmer! That's a lot of intersection with my life.
[2] I couldn't find a binary for a Mac OS X Cobol, only sources for OpenCOBOL. Building this requires building some extension packages that don't compile without a bunch of tinkering, and I ran out of time. If anyone knows of a decent binary package somewhere, please drop me a line.
Recently I wrote about the availability heuristic and how it may affect student behavior. Schneier tells us that this is often a useful rule of thumb, and it has served us well evolutionarily. But our changing world may be eroding its value, perhaps even making it dangerous in some situations:
But in modern society, we get a lot of sensory input from the media. That screws up availability, vividness, and salience, and means that heuristics that are based on our senses start to fail. When people were living in primitive tribes, if the idea of getting eaten by a saber-toothed tiger was more available than the idea of getting trampled by a mammoth, it was reasonable to believe that--for the people in the particular place they happened to be living--it was more likely they'd get eaten by a saber-toothed tiger than get trampled by a mammoth. But now that we get our information from television, newspapers, and the Internet, that's not necessarily the case. What we read about, what becomes vivid to us, might be something rare and spectacular. It might be something fictional: a movie or a television show. It might be a marketing message, either commercial or political. And remember, visual media are more vivid than print media. The availability heuristic is less reliable, because the vivid memories we're drawing upon aren't relevant to our real situation.
I sometimes wonder if my omnivorous blogging and promiscuous referencing of many different sources create a situation in which my readers attribute brilliance to me that rightly belongs to my sources.
A little part of my ego thinks that this would be okay. (You didn't read that here.)
However, if you finish the Schneier paragraph I quoted above, you will see that just the opposite is probably true:
And even worse, people tend not to remember where they heard something--they just remember the content. So even if, at the time they're exposed to a message they don't find the source credible, eventually their memory of the source of the information degrades and they're just left with the message itself.
So you'll remember the ideas I toss out, but you'll eventually forget that you read them here. And so you will not be able to blame me if it turns out to be nonsense...
Maybe you'd better not read my blog after all.
Over the last couple of months, I've been collecting some good lines and links to the articles that contain them. Some of these may show up someday in something I write, but it seems a shame to have them lie fallow in a text file until then. Besides, my blog often serves as my commonplace book these days. All of these pieces are worth reading for more than the quote.
If the code cannot express itself, then a comment might be acceptable. If the code does not express itself, the code should be fixed.
-- Tim Ottinger, Comments Again
In a concurrent world, imperative is the wrong default!
-- Tim Sweeney of Epic Games, The Next Mainstream Programming Language: A Game Developer's Perspective, an invited talk at ACM POPL'06 (full slides in PDF)
When you are tempted to encode data structure in a variable name (e.g. Hungarian notation), you need to create an object that hides that structure and exposes behavior.
-- Uncle Bob Martin, The Hungarian Abhorrence Principle
Lisp... if you don't like the syntax, write your own.
-- Gordon Weakliem, Hashed Thoughts, on simple syntax for complex data structures
Pairing is a practice that has (IIRC) at least five different benefits. If you can't pair, then you need to find somewhere else in the process to put those benefits.
-- John Roth, on the XP mailing list
Fumbling with the gear is the telltale sign that I'm out of practice with my craft. ... And day by day, the enjoyment of the craft is replaced by the tedium of work.
-- Mike Clark, Practice
So when you get rejected by investors, don't think "we suck," but instead ask "do we suck?" Rejection is a question, not an answer.
-- Paul Graham, The Hacker's Guide to Investors
Practice. Question rejection.
Be careful what you pretend to be
because you are what you pretend to be.
Sometimes, the universe speaks to us and catches us unaware.
Yesterday, I attended a workshop, about which I will have more to say later today. Toward the end, I saw a quote that struck me as an expression of this blog's purpose, and almost an unknowing source for the name of this blog:
Learning is about ... connecting teaching and knowing to action.
Connecting knowing to doing. That's what this blog is all about.
But long time readers know that "Knowing and Doing" almost wasn't the name of my blog. I considered several alternatives. Back in November 2004, I wrote about some of the alternatives. Most of the serious candidates came from Kurt Vonnegut, my favorite author. Indeed, that post wasn't primarily about the name of my blog but about Vonnegut himself, who was celebrating his 82nd birthday.
Here we are, trapped in the amber of the moment.
There is no why.
And then I wake up this morning to find the world atwitter with news of Vonnegut's passing yesterday. I'm not sure that anyone noticed, but Vonnegut died on a notable unbirthday, five months from the day of his birth. I think that Vonnegut would have liked that, as a great cosmic coincidence and as a connection to Lewis Carroll, a writer whose sense of unreality often matched Vonnegut's own. More than most, Kurt was in tune with just how much of what happens in this world is coincidence and happenstance. He wrote in part to encourage us not to put too much stock in our control over a very complex universe.
Busy, busy, busy.
Many people, critics included, considered Vonnegut a pessimist, an unhappy man writing dark humor as a personal therapy. But Vonnegut was not a pessimist. He was at his core one of the world's great optimists, an idealist who believed deeply in the irrepressible goodness of man. He once wrote that "Robin Hood" and the New Testament were the most revolutionary books of all time because they showed us a world in which people loved one another and looked out for the less fortunate. He wrote to remind us that people are lonely and that we have it in our own power to solve our own loneliness and the loneliness of our neighbors -- by loving one another, and building communities in which we all have the support we need to live.
Live by the foma that make you
brave and kind and healthy and happy.
I had the good fortune to see Kurt Vonnegut speak at the Wharton Center in East Lansing when I was a graduate student at Michigan State. I had the greater good fortune to see him speak when he visited UNI in the late 1990s. Then I saw his public talk, but I also sat in on a talk he gave to a foreign language class, on writing and translation. I also was able to sit in on his intimate meeting with the school's English Club, where he sat patiently in a small crowded room and told stories, answered questions, and generally fascinated awestruck fans, whether college students or old fogies like me. I am forever in the debt of the former student who let me know about those side events and made sure that I could be there with the students.
Sometimes the pool-pah
exceeds the power of humans to comment.
On aging, Vonnegut once said, "When Hemingway killed himself he put a period at the end of his life; old age is more like a semicolon." But I often think of Vonnegut staring down death and God himself in the form of old Bokonon, the shadow protagonist of his classic Cat's Cradle:
If I were a younger man, I would write a history of human stupidity; and I would climb to the top of Mount McCabe and lie down on my back with my history for a pillow; and I would take from the ground some of the blue-white poison that makes statues of men; and I would make a statue of myself, lying on my back, grinning horribly, and thumbing my nose at You Know Who.
The blue-white poison was, of course, Ice Nine. These days, that is the name of my rotisserie baseball team. I've used Vonnegut's words as names many times. Back at Ball State, my College Bowl team was named Slaughterhouse Five. (With our alternate, we were five.)
Kurt Vonnegut was without question my favorite writer. I spent teenage years reading Slaughterhouse Five and Cat's Cradle, Welcome to the Monkey House and Slapstick, The Sirens of Titan and Breakfast of Champions and Player Piano, the wonderfully touching God Bless You, Mr. Rosewater and the haunting Mother Night. Later I came to love Jailbird and Galapagos, Deadeye Dick and Hocus Pocus and especially Bluebeard. I reveled in his autobiographical collages, too, Wampeters, Foma, and Granfalloons, Palm Sunday, Fates Worse Than Death, and Timequake. His works affected me as much or more than those of any of the classic writers feted by university professors and critics.
The world is a lesser place today. But I am happy for the words he left us.
Tiger gotta hunt.
Bird gotta fly.
Man gotta sit and wonder why, why, why.
Tiger gotta sleep.
Bird gotta land.
Man gotta tell himself he understand.
If you've never read any Vonnegut, try it sometime. Start with Slaughterhouse Five or Cat's Cradle, both novels, or Welcome to the Monkey House, a collection of his short stories. Some of his short stories are simply stellar. If you like Cat's Cradle, check out my tabulation of The Books of Bokonon, which is proof that a grown man can still be smitten with a really good book.
And, yes, I still lust after naming my blog The Euphio Question.
Rest in peace, Kurt.
... to the Sixth
My dad turned 64 yesterday. That's a nice round number in the computing world, though he might not appreciate me pointing that out. It's hard for me to imagine him, or me, any different than we were when I was a little boy growing up at home. It's also hard for me to imagine that someday soon my daughter might be thinking the same about the two of us. Perhaps I need a bigger imagination.
... Months
This is the time of the academic year when folks seeking jobs at other institutions, in particular administrative promotions, begin to learn of their good fortune and to plan to depart. Several of my colleagues at the university will be moving on to new challenges after this academic year.
In a meeting this week, one such colleague said something that needed to be said, but which most people wouldn't say. It was on one of those topics that seems off limits, for political or personal reasons, and so it usually just hangs in the air like Muzak.
Upon hearing the statement, another colleague joked, "Two months. You have two months to speak the truth. Two months to be a truth teller."
It occurred to me then that this must be quite a liberating feeling -- to be able to speak truths that otherwise will go unspoken. Almost immediately on the heels of this thought, it occurred to me just how sad it is that such truths go unspoken. And that I am also unwilling to speak them. Perhaps I need greater courage, or more skill.
I honestly feel like my best work is still ahead of me.
I'm just not sure I can catch up to it.
I owe this gem, which pretty much sums up how I have felt all week, to comedian Drew Hastings, courtesy of the often bawdy but, to my tastes, always funny Bob and Tom Show. Hastings is nearly a decade older than I, but I think we all have this sense sooner or later. Let's hope it passes!
I owe you some computing content, so here is an interview with Fran Allen, who recently received the 2006 Turing Award. She challenges us to recruit women more effectively ("Could the problem be us?") and to help our programming languages and compilers catch up with advances in supercomputing ("Only the bold should apply!").
My roommate and I are staying in the Radisson Riverfront, which is right across the river from downtown Cincinnati at the nexus of the Ohio River and I-75. The view from our 14th-floor room is good.
So there I was, settling in after a long day at the conference. About 10 PM, the phone rings. Odd. The caller asks for me. Odd again. It's the front desk. The local police have asked all guests to come down to the lobby.
Apparently, someone had shot a bullet into a 12th-floor room. From somewhere outside.
Odd.
Down we went to the lobby. We heard lots of curious smalltalk, and no small amount of impatience at being inconvenienced by this interruption. Some commented that we hardly seemed safer all herded into one place, in front of wide glass lobby windows open to the interstate exit.
After 25 minutes or so, the chief of the Covington Police Department called us together. "Welcome to Covington!" he offered in good humor. He explained what had happened, what the police had done to investigate, and that they now believed us to be safe. He thanked us for our patience and wished us a good visit. I can imagine that he is a good sort of person to have as a police chief -- a big part of the job is communicating with people who are in varying states of distress.
The rest of the night passed without event.
Let's see if OOPSLA can top this.
Heard on my drive to SIGCSE today:
These experiences have caused him to think very hard about what he is doing and where he is going. And the result of all this thinking is that he now understands he doesn't know what he is doing or where he is going.
This quote is about Ray Porter, a character in Steve Martin's novella Shopgirl, to which I listened on my drive to SIGCSE today. While I am in most ways nothing at all like the divorced, 50-something Porter, I can certainly appreciate his sudden need to think very hard and his sudden realization that he is essentially clueless. Over the course of my career, I had grown to feel comfortable in my roles as an academic, as teacher and scholar. When I switched into the Big Office Downstairs, I just assumed that things would proceed as usual.
But after a year and a half as department head, I experience occasional moments of "midterm crisis", in which I think that I don't really know what I'm doing or where I'm going. I often have a pretty good 50,000-foot view of what I want, but down in the trenches I usually feel a little disoriented. With experience, things have gotten better, but a year filled with academic program review, two time-consuming searches, and a curriculum revision have sapped my reservoirs of creativity and energy.
At least now I think I understand that I don't know what I am doing or where I am going. You know what they say about acceptance being the first step toward recovery!
By the way, I do recommend Shopgirl. I have read the book and then listened to it several times on tape while driving to PLoPs, SIGCSEs, and ICFPs. For some reason, Martin's writing holds my attention, and the story is sweet enough that I can get over constantly wondering, "Why does she put up with this?" Martin is a surprisingly good writer; though he is never going to win a Nobel Prize, he can spin a decent short yarn. My first exposure to his literary work was Picasso at the Lapin Agile, a stage play about a fantasy meeting between Picasso and Einstein at a Paris bar around the turn of the century.
Oh, and as for Shopgirl -- I haven't seen the movie yet! I'm glad that I read the book first, and then heard it read, before seeing the film version. Just knowing that Martin and the glorious Claire Danes play the leading roles has changed my experience of the book...
In the beginning, I blogged on the danger of false pride and quoted the Stoic philosopher Epictetus. Now I have encountered another passage from Epictetus, at the mind hacker site Hacks:
When you are going about any action, remind yourself what nature the action is. If you are going to bathe, picture to yourself the things which usually happen in the bath: some people splash the water, some push, some use abusive language, and others steal. Thus you will more safely go about this action if you say to yourself, "I will now go bathe, and keep my own mind in a state conformable to nature." And in the same manner with regard to every other action. For thus, if any hindrance arises in bathing, you will have it ready to say, "It was not only to bathe that I desired, but to keep my mind in a state conformable to nature; and I will not keep it if I am bothered at things that happen."
The notion of a "state conformable to nature" was central to his injunction against false pride, and here it remains the thoughtful person's goal, this time in the face of all that can go wrong in the course of living. This quote also resonates with me, because, just as I am inclined toward a false pride, I have a predisposition toward letting small yet expectable hindrances interfere with my frame of mind. Perhaps predisposition is the wrong word; perhaps it's just a bad habit.
As is often the case for me, a second or third positive reference is all I need to commit to reading more. In the coming weeks, I plan to read the Discourses of Epictetus. We even have a copy on our bookshelf at home. (My wife's more classical education proves useful to me again!)
As I prepared to leave for ChiliPLoP 2007, I looked to the trip as almost a vacation. While I will be working steadily through the 76 hours or so in Arizona, both on elementary patterns and on various and sundry school duties, I will be off the treadmill that the last few days have been. I've hardly had time to breathe since Wednesday morning:
Wednesday: Prepare for and sit in on meetings all day, including a faculty meeting. Scramble to make last-minute changes to our fall semester schedule before the secretary goes on vacation. Exercise for an hour with my older daughter in her ballet class. I was in the class, and did all of the "floor barre" work that the rest of the class did. (At least that was fun time with Sarah!) For an old guy, I did all right. Sprint home to pack a quick overnight bag. Drive two hours to Des Moines. Crash in motel bed.
Thursday: Attend 7:00 AM breakfast at the State Capitol, sponsored by the Department of Economic Development to encourage state legislators to fully fund the department's budget request for 2007-2008. Mingle with legislators, and visit with IT bigwigs from most of the major players in the state. After two hours, drive two hours back home. Prepare for class. Meet my Programming Languages class. Attend meeting. Give dinner talk to local Kiwanis club on my department's efforts to participate in state economic development through curriculum and research projects that partner with regional companies. Crash in my own bed.
Friday: Arrive at office by 6:30 AM to do some leftover work. Coach team of local eighth graders who are preparing for a math competition. After several months, the competition is here -- tomorrow. Spend four hours visiting with prospective students and their parents. Scramble all afternoon to tie up loose ends in the office. Take one daughter to orchestra practice and then to play rehearsal. Pack for ChiliPLoP.
Saturday: Take eighth graders to math competition at 8:00 AM. Fill time and watch until 1:30. Take one daughter to play rehearsal. Kiss other daughter, already at rehearsal, good-bye for a few days. Head home, kiss wife good-bye, load bags in car, and hit road for 3+ hour drive to Minneapolis. (Take short nap along the way.) Grab dinner with friend. Crash on friend's couch.
Sunday: Rise at 5 AM for ride to airport. Encounter usual delays. Board plane at 6:30 AM for 7:00 AM flight. Then ... sit for two hours before take-off as plane requires computer system maintenance. I don't usually think of a two-hour in-plane delay as a respite, but this one was. I took a nap!
We are now in the air, approaching Phoenix. ChiliPLoP, as busy as it always is, made busier by some necessary work from back home, will seem like a break.
This sort of week may be no big deal to my consultant friends, or maybe even to my academic friends with big outreach portfolios. But it's new to me. While each activity of the week offered value, I found myself looking forward to the TSA and the plane. Extra time on the plane didn't bother me.
Landing will be better. Work on computer science with some of my favorite colleagues will be better.
A run in the sun will be better.