August 01, 2024 5:52 PM

Lots of our code is smarter than we are

I read one of Tim Bray's recent blog posts this morning and really liked this passage, quoted from a post of his written four years ago:

The observation that computer programmers can build executable abstractions that work but they then have trouble understanding is not new and not surprising. Lots of our code is smarter than we are.

I know this happens to me. I wonder how often something similar happens to my students as they learn techniques and abstractions?

Imagine: They assemble several small pieces of code, each of which they understand individually at least a little, into a bigger program that mostly does what they want — but they have trouble understanding the program as a whole. Further, the size and complexity of the program as a whole makes them begin to doubt their understanding of the individual pieces.

That seems a little far-fetched on its face, but it would explain behavior I see in class all the time. It's a shame when a student who seems to be doing fine begins to lose confidence as the size and complexity of their programs grow. I try to address these growing pains as best I can, but so much of their learning happens away from me, back in their rooms writing code. A few ask for help and stay on track. The ones who don't, struggle and sometimes stop having fun.

If only I could help them see that this is, in many ways, the natural order for even the best programmers: We build abstractions that work but which we have trouble understanding.

People like Tim Bray struggle with this, too. You'll be all right.

~~~~~

Wow. I let the entire month of July, and the last ten days of June, pass without posting. That's another calendar month with a (0) next to it in the blog's archives. My long absence included Knowing and Doing's 20th birthday; the blog debuted on July 9, 2004. We should have had a party!

That means July 2024 was the first month of my twenty-first year blogging. Putting up a bagel is not an auspicious start...

I have a few short posts in mind for the coming weeks. Let's see if I can follow through. I like to write here.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

June 05, 2024 2:58 PM

Quick Links from Reading

I have posted or re-posted several links on Mastodon recently, after having been mostly silent for a few weeks. Whenever I end up posting a flurry of items, I find myself thinking, "I may want to use these somewhere later..." and realizing that I don't want to rely on Mastodon search to find them. So here are a few, for the record (and for the pleasure of those readers who don't follow my social media postings with the tenacity they deserve).

Onelinerizer (https://github.com/csvoss/onelinerizer) converts any Python 2 script into a single line of code.

Favorite line from the documentation:
"Never pass up an opportunity to use the Y combinator."

In my programming languages course, we learn how local variables are a syntactic abstraction. If nothing else, I can use this tool to demonstrate translations of simple Python code into their equivalent function applications.
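
For instance, here is a small sketch of my own (in Python 3, and not the tool's actual output) showing the idea: a local variable introduced by assignment is just a lambda parameter in disguise.

    # The usual way: a local variable bound by assignment.
    def hypotenuse(a, b):
        total = a * a + b * b            # local binding
        return total ** 0.5

    # The same computation with the binding rewritten as a function
    # application: the "variable" is now just a parameter.
    def hypotenuse_applied(a, b):
        return (lambda total: total ** 0.5)(a * a + b * b)

    print(hypotenuse(3, 4))          # 5.0
    print(hypotenuse_applied(3, 4))  # 5.0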

~~~~~

This is from a short blog post on receiving an AI-generated email message from a friend:

The effort, the clumsiness, and the time invested are where humanity is stored.
-- https://mrgan.com/ai-email-from-a-friend/

Then again, of course I would like this sentiment. I still send actual physical cards to many friends and members of my family each year at Christmas, with (short) personalized handwritten notes. I am not always the best of friends, but occasionally I invest a little effort, clumsiness, and time in saying "I'm thinking of you."

~~~~~

It's surveillance wine in safety bottles.
-- https://mastodon.world/@Mer__edith/112535616774247450

I haven't studied the GDPR provision that Whitaker is commenting on, so I have no comment on it directly. But I love this phrase.

~~~~

"But stay lucid, even during office hours."

That's a tough ask some days.

I was pointed in the direction of this Camus quote by a tweet from @DylanoA4. The image he posted ends with this quote, but the passage beginning with it is even more fitting as the set-up for a prof's joke about office hours. Here that is, courtesy of a post on goodreads:

But stay lucid, even during office hours. As soon as we are alone in its presence, strive after the nakedness into which the world rejects us. But above all, in order to be, never try to seem.

Every prof knows the rejection of empty office hours. "Where are all the students who need my help, or my sage advice on life and the burning topics of the day?" we cry, palms upturned. But don't count on using that time for any other task either... Students will arrive when you least expect them. Stay lucid, and strive after the nakedness into which the world rejects us.

All academics become Stoics, or spend their lives raging against the night.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

May 05, 2024 8:31 AM

Reading "Fahrenheit 451" in 2024

Sometimes, speculative fiction seems eerily on the mark:

Montag turned and looked at his wife, who sat in the middle of the parlor talking to an announcer, who in turn was talking to her. "Mrs. Montag," he was saying. This, that, and the other. "Mrs. Montag--" Something else and still another. The converter attachment, which had cost them one hundred dollars, automatically supplied her name whenever the announcer addressed his anonymous audience, leaving a blank where the proper syllables could be filled in. A special spot-wavex-scrambler also caused his televised image, in the area immediately about his lips, to mouth the vowels and consonants beautifully. He was a friend, no doubt of it, a good friend. "Mrs. Montag--now look right here."

"Spot-wavex-scrambler" is a great phrase. Someone should make it a product name.

That is a paragraph from Ray Bradbury's Fahrenheit 451. I was not far into the book before its description of technology used to entertain — distract, occupy, sedate — the population began to seem eerily familiar. It's not what we have now, and there hasn't been anything especially AI-like in the story yet, except perhaps the sinister robot dog at the fire station. But the entertainment tech hits close to the mark. Mildred wears earbuds all the time, listening to her shows or just to white noise.

The timeline isn't perfect, either ("We've started and won two atomic wars since 2022!"), but the timing isn't all that far off. Almost everyone these days is living with a sense of disruption from the events of the last decade or so, including wars, which is in rhythm with the story. The fictional government, I presume, makes people happy by surrounding them, literally, with video and audio entertainment 24/7 — all the better not to think about what's really happening out in the world.

Reading this is eerie for me in another way. I read a lot of Ray Bradbury when I was growing up, and for a long time I thought I had read Fahrenheit 451. But then I wasn't so sure, because I couldn't bring to mind any memory around reading it, let alone any memory of the content. (The latter is common for many books I read in high school.) On my last trip to the library, I checked out a copy in order to fill either the gap in my memory or the gap in my reading.

It's a prescient book. I see why it remains a common text in high school and college lit courses. I look forward to the rest of the story.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

April 28, 2024 12:22 PM

On Remaining Relevant as We Age

This morning, I finished reading How a Script Doctor Found His Own Voice, about screenwriter Scott Frank. Late in the piece, there's a bit on "how difficult it can be to remain relevant as a screenwriter as you age". Frank took caution from the experience of one of his mentors, director Sydney Pollack:

After decades of success making such movies as "Three Days of the Condor" and "Out of Africa", Pollack had "a way of working," Frank said. "And it stopped working." Suddenly, Pollack was out of step. Frank urged him to do "something different, something small, something that's not a love story where they end up together." He even tried to get Pollack to direct his thriller "The Lookout". But Pollack couldn't change. To Frank, the lesson was clear: you can't "just double down on what you used to do." The only way to remain vital is to take chances.

That called to mind something I read earlier in the week, a short blog post by Jessamyn West, on how the intersection of an LLM chatbot tool and a newsletter called "The Soul of a New Machine" made her laugh. She closes with a paragraph that felt familiar:

I'm now what folks might consider later-career. I'm faffing about with this newfangled technological stuff knowing both that it's a big deal and also that I only sort of care about it (at my peril? perhaps.) ....

I, too, am late in my career. As an academic computer scientist, "newfangled technological stuff" is my line of work, but... I can't think of many things less interesting for me to do than figuring out how to prompt an LLM to write code or text for me. My lack of enthusiasm may portend the sort of irrelevance that befell Pollack, but I hope not. Unlike Pollack, I feel no need to double down on what I've always done, and indeed am open to something new. So I'll keep poking around, enjoying what I enjoy, and hope to find a path more like the one Frank followed: taking a different kind of chance.

~~~~~

Postscript: If you think this post seems like an aftershock to a post from the turn of the year, you are not alone. Still searching.

Whatever you think of this post, though, I heartily recommend the New Yorker article on Scott Frank, which was engaging throughout and full of interesting bits on writing, filmmaking, and careers.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

April 21, 2024 12:41 PM

The Truths We Express To Children Are Really Our Hopes

In her Conversation with Tyler, scholar Katherine Rundell said something important about the books we give our children:

Children's novels tend to teach the large, uncompromising truths that we hope exist. Things like love will matter, kindness will matter, equality is possible. I think that we express them as truths to children when what they really are are hopes.

This passage immediately brought to mind Marick's Law: In software, anything of the form "X's Law" is better understood by replacing the word "Law" with "Fervent Desire". (More on this law below.)

While they comment on different worlds, these two ideas are very much in sync. In software and so many other domains, we coin laws that are really much more expressions of our aspirations. This is no less true in how we interact with young people.

We usually think that our job is to teach children the universal truths we have discovered about the world, but what we really teach them is our view of how the world can or should be. We can do that by our example. We can also do that with good books.

But aren't the universal truths in our children's literature true? Sometimes, perhaps, but not all of them are true all of the time, or for all people. When we tell stories, we are describing the world we want for our children, and giving them the hope, and perhaps the gumption, to make our truths truer than we ourselves have been able to.

I found myself reading lots of children's books and YA fiction when my daughters were young: to them, and with them, and on their recommendation. Some of them affected me enough that I quoted them in blog posts. There are so many good books for our youth in the library: honest, relevant to their experiences, aspirational, exemplary. I concur with Rundell's suggestion that adults should read children's fiction occasionally, both for pleasure and "for the unabashed politics of idealism that they have".

More on Marick's Law and Me

I remember posting Marick's Law on this blog in October 2015, when I wanted to share a link to it with Mike Feathers. Brian had tweeted the law in 2009, but a link to a tweet didn't feel right, not at a time when the idealism of the open web was still alive. In my post, I said "This law is too important to be left vulnerable to the vagaries of an internet service, so let's give it a permanent home".

In 2015, the idea that Twitter would take a weird turn, change its name to X, and become a place many of my colleagues don't want to visit anymore seemed far-fetched. Fortunately, Brian's tweet is still there and, at least for now, publicly viewable via redirect. Even so, given the events of the last couple of years, I'm glad I trusted my instincts and gave the law a more permanent home on Knowing and Doing. (Will this blog outlive Twitter?)

The funny thing, though, is that that wasn't its first appearance here. I found the 2015 URL for use in this post by searching for the word "fervent" in my Software category. That search also brought up a Posts of the Day post from April 2009 — the day after Brian tweeted the law. I don't remember that post now, and I guess I didn't remember it in 2015 either.

Sometimes, "Great minds think alike" doesn't require two different people. With a little forgetfulness, they can be Past Me and Current Me.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Software Development

March 14, 2024 12:37 PM

Gene Expression

Someone sent me this image, from a slide deck they ran across somewhere:

A slide labeled 'Gene Expression'. The main image is a casual shot of actor Gene Wilder, labeled 'One Gene'. There are three side images of Wilder as iconic characters he played in 'Willy Wonka & the Chocolate Factory', 'Young Frankenstein', and 'Blazing Saddles'. There are arrows from the main image to the three side images, labeled 'Many Outcomes'.

I don't know what to do with it other than to say this:

As a person named 'Eugene' and an admirer of Mr. Wilder's work, I smile every time I see it. That's a clever way to reinforce the idea of gene expression by analogy, using actors and roles.

When I teach OOP and FP, I'm always looking for simple analogies like this from the non-programming world to reinforce ideas that we are learning about in class. My OOP repertoire is pretty deep. As I teach functional programming each spring, I'm still looking for new FP analogies all the time.

~~~~~

Note: I don't know the original source of this image. If you know who created the slide, please let me know via email, Mastodon, or Twitter (all linked in the sidebar). I would love to credit the creator.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

January 27, 2024 7:10 PM

Today in "It's not the objects; it's the messages"

Alan Kay is fond of saying that object-oriented programming is not about the objects; it's about the messages. He also looks to the biological world for models of how to think about and write computer programs.

This morning I read two things on the exercise bike that brought these ideas to mind, one from the animal kingdom and one from the human sphere.

First was a surprising little article on how an invasive ant species is making it harder for Kenyan lions to hunt zebras, with elephants playing a pivotal role in the story, too. One of the scientists behind the study said:

"We often talk about conservation in the context of species. But it's the interactions which are the glue that holds the entire system together."

It's not just the animals. It's the interactions.

Then came @jessitron reflecting on what it means to be "the best":

And then I remembered: people are people through other people. Meaning comes from between us, not within us.

It's not just the people. It's the interactions.

Both articles highlighted that we are usually better served by thinking about interactions within systems, and not simply the components of the system. That way lies a more reliable approach to building robust software. Alan Kay is probably somewhere nodding his head.

The ideas in Jessitron's piece fit nicely into the software analogy, but they mean even more in the world of people that she is reflecting on. It's easy for each of us to fall into the habit of walking around the world as an I and never quite feeling whole. Wholeness comes from connection to others. I occasionally have to remind myself to step back and see my day in terms of the students and faculty I've interacted with, whom I have helped and who have helped me.

It's not (just) the people. It's the interactions.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

January 21, 2024 8:28 AM

A Few Thoughts on How Criticism Affects People

The same idea popped up in three settings this week: a conversation with a colleague about student assessments, a book I am reading about women writers, and a blog post I read on the exercise bike one morning.

The blog post is by Ben Orlin at Math With Bad Drawings from a few months ago, about an occasional topic of this blog: being less wrong each day [ for example, 1 and 2 ]. This sentence hit close enough to home that I saved it for later.

We struggle to tolerate censure, even the censure of idiots. Our social instrument is strung so tight, the least disturbance leaves us resonating for days.

Perhaps this struck a chord because I'm currently reading A Room of One's Own, by Virginia Woolf. In one early chapter, Woolf considers the many reasons that few women wrote poetry, fiction, or even non-fiction before the 19th century. One is that they had so little time and energy free to do so. Another is that they didn't have space to work alone, a room of one's own. But even women who had those things had to face a third obstacle: criticism from men and women alike that women couldn't, or shouldn't, write.

Why not shrug off the criticism and soldier on? Woolf discusses just how hard that is for anyone to do. Even many of our greatest writers, including Tennyson and Keats, obsessed over every unkind word said about them or their work. Woolf concludes:

Literature is strewn with the wreckage of men who have minded beyond reason the opinions of others.

Orlin's post, titled Err, and err, and err again; but less, and less, and less, makes an analogy between the advance of scientific knowledge and an infinite series in mathematics. Any finite sum in the series is "wrong", but if we add one more term, it is less wrong than the previous sum. Every new term takes us closer to the perfect answer.
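
A tiny worked example, of my own choosing rather than Orlin's: the partial sums of the familiar series for e. Every sum is wrong, but each is less wrong than the one before.

    import math

    # Partial sums of 1 + 1/1! + 1/2! + ... converge to e.
    # Each sum is "wrong", but less wrong than the previous one.
    total, term = 0.0, 1.0
    for n in range(8):
        total += term
        error = abs(math.e - total)
        print(f"after {n + 1} terms: {total:.6f}   error: {error:.6f}")
        term /= n + 1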

a black and white portrait of a bearded man
Source: Wikipedia, public domain

He then goes on to wonder whether the same is, or could be, true of our moral development. His inspiration is American psychologist and philosopher William James. I have mentioned James as an inspiration myself a few times in this blog, most explicitly in Pragmatism and the Scientific Spirit, where I quote him as saying that consciousness is "not a thing or a place, but a process".

Orlin connects his passage on how humans receive criticism to James's personal practice of trying to listen only to the judgment of ever more noble critics, even if we have to imagine them into being:

"All progress in the social Self," James says, "is the substitution of higher tribunals for lower."

If we hold ourselves to a higher, more noble standard, we can grow. When we reach the next plateau, we look for the next higher standard to shoot for. This is an optimistic strategy for living life: we are always imperfect, but we aspire to grow in knowledge and moral development by becoming a little less wrong each step of the way. To do so, we try to focus our attention on the opinions of those whose standard draws us higher.

Reading James almost always leaves my spirit lighter. After Orlin's post, I feel a need to read The Principles of Psychology in full.

These two threads on how people respond to criticism came together when I chatted with a colleague this week about criticism from students. Each semester, we receive student assessments of our courses, which include multiple-choice ratings as well as written comments. The numbers can be a jolt, but their effect is nothing like that of the written comments. Invariably, at least one student writes a negative response, often an unkind or ungenerous one.

I told my colleague that this is a recurring theme for almost every faculty member I have known: Twenty-nine students can say "this was a good course, and I really like the professor", but when one student writes something negative... that is the only comment we can think about.

The one bitter student in your assessments is probably not the ever more noble critic that James encourages you to focus on. But, yeah. Professors, like all people, are strung pretty tight when it comes to censure.

Fortunately, talking to others about the experience seems to help. And it may also remind us to be aware of how students respond to the things we say and do.

Anyway, I recommend both the Orlin blog post and Woolf's A Room of One's Own. The former is a quick read. The latter is a bit longer but a smooth read. Woolf writes well, and once my mind got on the book's wavelength, I found myself engaged deeply in her argument.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

July 04, 2023 11:55 AM

Time Out

Any man can call time out, but no man
can say how long the time out will be.
-- Books of Bokonon

I realized early last week that it had been a while since I blogged. June was a morass of administrative work, mostly summer orientation. Over the month, I had made notes for several potential posts, on my web dev course, on the latest book I was reading, but never found -- made -- time to write a full post. I figured this would be a light month, only a couple of short posts, if only I could squeeze another one in by Friday.

Then I saw that the date of my most recent post was May 26, with the request for ideas about the web course coming a week before.

I no longer trust my sense of time.

This blog has certainly become much quieter over the years, due in part to the kind and amount of work I do and in part to choices I make outside of work. I may even have gone a month between posts a few fallow times in the past. But June 2023 became my first calendar month with zero posts.

It's somewhat surprising that a summer month would be the first to shut me out. Summer is a time of no classes to teach, fewer student and faculty issues to deal with, and fewer distinct job duties. This occurrence is a testament to how much orientation occupies many of my summer days, and how at other times I just want to be AFK.

A real post or two are on their way, I promise -- a promise to myself, as well as to any of you who are missing my posts in your newsreader. In the meantime...

On the web dev course: thanks to everyone who sent thoughts! There were a few unanimous, or near unanimous, suggestions, such as to have students use VS Code. I am now learning it myself, and getting used to an IDE that autocompletes paired characters such as quotation marks. My main prep activity up to this point has been watching David Humphrey's videos for WEB 222. I have been learning a little HTML and JavaScript and a lot of CSS and how these tools work together on the modern web. I'm also learning how to teach these topics, while thinking about the differences between my student audience and David's.

On the latest book: I'm currently reading Shop Class as Soulcraft, by Matthew Crawford. It came out in 2010 and, though several people recommended it to me then, I had never gotten around to it. This book is prompting so many ideas and thoughts that I'm constantly jotting down notes and thinking about how these ideas might affect my teaching and my practice as a programmer. I have a few short posts in mind based on the book, if only I commit time to flesh them out. Here are two passages, one short and one long, from my notes.

Fixing things may be a cure for narcissism.

Countless times since that day, a more experienced mechanic has pointed out to me something that was right in front of my face, but which I lacked the knowledge to see. It is an uncanny experience; the raw sensual data reaching my eye before and after are the same, but without the pertinent framework of meaning, the features in question are invisible. Once they have been pointed out, it seems impossible that I should not have seen them before.

Both strike a chord for me as I learn an area I know only the surface of. Learning changes us.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

May 26, 2023 12:37 PM

It's usually counterproductive to be doctrinaire

A short passage from Innocence, by Penelope Fitzgerald:

In 1927, when they moved me from Ustica to Milan, I was allowed to plant a few seeds of chicory, and when they came up I had to decide whether to follow Rousseau and leave them to grow by the light of nature, or whether to interfere in the name of knowledge and authority. What I wanted was a decent head of chicory. It's useless to be doctrinaire in such circumstances.

Sometimes, you just want a good head of chicory -- or a working program. Don't let philosophical ruminations get in the way. There will be time for reflection and evaluation later.

A few years ago, I picked up Fitzgerald's short novel The Bookshop while browsing the stacks at the public library. I enjoyed it despite the fact that (or perhaps because) it ended in a way that didn't create a false sense of satisfaction. Since then I have had Fitzgerald on my list of authors to explore more. I've read the first fifty pages or so of Innocence and quite like it.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

May 07, 2023 8:36 AM

"The Society for the Diffusion of Useful Knowledge"

I just started reading Joshua Kendall's The Man Who Made Lists, a story about Peter Mark Roget. Long before compiling his namesake thesaurus, Roget was a medical doctor with a local practice. After a family tragedy, though, he returned to teaching and became a science writer:

In the 1820s and 1830s, Roget would publish three hundred thousand words in the Encyclopaedia Brittanica and also several lengthy review articles for the Society for the Diffusion of Useful Knowledge, the organization affiliated with the new University of London, which sought to enable the British working class to educate itself.

What a noble goal, enabling the working class to educate itself. And what a cool name: The Society for the Diffusion of Useful Knowledge!

For many years, my university has provided a series of talks for retirees, on topics from various departments on campus. This is a fine public service, though without the grand vision -- or the wonderful name -- of the Society for the Diffusion of Useful Knowledge. I suspect that most universities depend too much on tuition and on cutting costs these days to mount an ambitious effort to enable the working class to educate itself.

Mental illness ran in Roget's family. Kendall wonders if Roget's "lifelong desire to bring order to the world" -- through his lecturing, his writing, and ultimately his thesaurus, which attempted to classify every word and concept -- may have "insulated him from his turbulent emotions" and helped him stave off the depression that afflicted several of his family members.

Academics often have an obsessive connection with the disciplines they practice and study. Certainly that sort of focus can be bad for a person when taken too far. (Is it possible for an obsession not to go too far?) For me, though, the focus of studying something deeply, organizing its parts, and trying to communicate it to others through my courses and writing has always felt like a gift. The activity has healing properties all its own.

In any case, the name "The Society for the Diffusion of Useful Knowledge" made me smile. Reading has the power to heal, too.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

April 26, 2023 12:15 PM

Cultivating a Way of Seeing

Sometimes, I run across a sentence I wish I had written. Here are several paragraphs by Dan Bouk I would be proud to have written.

Museums offer a place to practice looking for and acknowledging beauty. This is, mostly, why I visit them.

As I wander from room to room, a pose diverts me, a glance attracts me, or a flash of color draws my eye. And then I look, and look, and look, and then move on.

Outside the museum, I find that this training sticks. I wander from subway car to platform, from park to city street, and a pose diverts me, a glance attracts me, or a flash of color draws my eye. People of no particular beauty reveal themselves to be beautiful. It feels as though I never left the museum, and now everything, all around me, is art.

This way of seeing persists, sometimes for days on end. It resonates with and reinforces my political commitment to the equal value of each of my neighbors. It vibrates with my belief in the divine spark, the image of God, that animates every person.

-- Dan Bouk, in On Walking to the Museum, Musing on Beauty and Safety


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

April 09, 2023 8:24 AM

It Was Just Revision

There are several revised approaches to "what's the deal with the ring?" presented in "The History of The Lord of the Rings", and, as you read through the drafts, the material just ... slowly gets better! Bit by bit, the familiar angles emerge. There seems not to have been any magic moment: no electric thought in the bathtub, circa 1931, that sent Tolkien rushing to find a pen.

It was just revision.

Then:

... if Tolkien can find his way to the One Ring in the middle of the fifth draft, so can I, and so can you.

-- Robin Sloan, How The Ring Got Good


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

March 31, 2023 3:57 PM

"I Just Need a Programmer, er, Writer"

This line from Chuck Wendig's post on AI tools and writing:

Hell, it's the thing every writer has heard from some jabroni who tells you, "I got this great idea, you write it, we'll split the money 50/50, boom."

... brought to mind one of my most-read blog posts ever, "I Just Need a Programmer":

As head of the Department of Computer Science at my university, I often receive e-mail and phone calls from people with The Next Great Idea. The phone calls can be quite entertaining! The caller is an eager entrepreneur, drunk on their idea to revolutionize the web, to replace Google, to top Facebook, or to change the face of business as we know it. ...

They just need a programmer. ...

The opening of that piece sounds a little harsh more than a decade later, but the basic premise holds. And, as Wendig notes, it holds beyond the software world. I even once wrote a short follow-up when accomplished TV writer Ken Levine commented on his blog about the same phenomenon in screenwriting.

Some misconceptions are evergreen.

Adding AI to the mix adds a new twist. I do think human execution in telling stories will still matter, though. I'm not yet convinced that the AI tools have the depth of network to replace human creativity.

However, maybe tools such as ChatGPT can be the programmer people need. A lot of folks are putting these tools to good use creating prototypes, and people who know how to program are using them effectively as accelerators. Execution will still matter, but these programs may be useful contributors on the path to a product.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

February 26, 2023 8:57 AM

"If I say no, are you going to quit?"

Poet Marvin Bell, in his contribution to the collection Writers on Writing:

The future belongs to the helpless. I am often presented that irresistible question asked by the beginning poet: "Do you think I am any good?" I have learned to reply with a question: "If I say no, are you going to quit?" Because life offers any of us many excuses to quit. If you are going to quit now, you are almost certainly going to quit later. But I have concluded that writers are people who you cannot stop from writing. They are helpless to stop it.

Reading that passage brought to mind Ted Gioia's recent essay on musicians who can't seem to retire. Even after accomplishing much, these artists seem never to want to stop doing their thing.

Just before starting Writers on Writing, I finished Kurt Vonnegut's Sucker's Portfolio, a slim 2013 volume of six stories and one essay not previously published. The book ends with an eighth piece: a short story unfinished at the time of Vonnegut's death. The story ends mid-sentence and, according to the book's editor, at the top of an unfinished typewritten page. In his mid-80s, Vonnegut was creating stories to the end.

I wouldn't mind if, when it's my time to go, folks find my laptop open to some fun little programming project I was working on for myself. Programming and writing are not everything there is to my life, but they bring me a measure of joy and satisfaction.

~~~~~

This week was a wonderful confluence of reading the Bell, Gioia, and Vonnegut pieces around the same time. So many connections... not least of which is that Bell and Vonnegut both taught at the Iowa Writers' Workshop.

There's also an odd connection between Vonnegut and the Gioia essay. Gioia used a quip attributed to the Roman epigrammist Martial:

Fortune gives too much to many, but enough to none.

That reminded me of a story Vonnegut told occasionally in his public talks. He and fellow author Joseph Heller were at a party hosted by a billionaire. Vonnegut asked Heller, "How does it make you feel to know that guy made more money yesterday than Catch-22 has made in all the years since it was published?" Heller answered, "I have something he'll never have: the knowledge that I have enough."

There's one final connection here, involving me. Marvin Bell was the keynote speaker at Camouflage: Art, Science & Popular Culture, an international conference organized by graphic design prof Roy Behrens at my university and held in April 2006. Participants really did come from all around the world, mostly artists or designers of some sort. Bell read a new poem of his and then spoke of:

the ways in which poetry is like camouflage, how it uses a common vocabulary but requires a second look in order to see what is there.

I gave a talk at the conference called NUMB3RS Meets The DaVinci Code: Information Masquerading as Art. (That title was more timely in 2006 than 2023...) I presented steganography as a computational form of camouflage: not quite traditional concealment, not quite dazzle, but a form of dispersion uniquely available in the digital world. I recall that audience reaction to the talk was better than I feared when I proposed it to Roy. The computer science topic meshed nicely with the rest of the conference lineup, and the artists and writers who saw the talk seemed to appreciate the analogy. Anyway, lots of connections this week.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

February 13, 2023 10:34 AM

The Exuberance of Bruce Springsteen in Concert

Bruce Springsteen, on why he puts on such an intense physical show:

So the display of exuberance is critical. "For an adult, the world is constantly trying to clamp down on itself," he says. "Routine, responsibility, decay of institutions, corruption: this is all the world closing in. Music, when it's really great, pries that shit back open and lets people back in, it lets light in, and air in, and energy in, and sends people home with that and sends me back to the hotel with it. People carry that with them sometimes for a very long period of time."

This passage is from a 2012 profile of the Boss, We Are Alive: Bruce Springsteen at Sixty-Two. A good read throughout.

Another comment from earlier in the piece has been rumbling around my head since I read it. Many older acts, especially those of Springsteen's vintage, have become essentially "their own cover bands", playing the oldies on repeat for nostalgic fans. The Boss, though, "refuses to be a mercenary curator of his past" and continually evolves as an artist. That's an inspiration I need right now.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

December 22, 2022 1:21 PM

The Ability to Share Partial Results Accelerated Modern Science

This passage is from Lewis Thomas's The Lives of a Cell, in the essay "On Societies as Organisms":

The system of communications used in science should provide a neat, workable model for studying mechanisms of information-building in human society. Ziman, in a recent "Nature" essay, points out, "the invention of a mechanism for the systematic publication of fragments of scientific work may well have been the key event in the history of modern science." He continues:
A regular journal carries from one research worker to another the various ... observations which are of common interest. ... A typical scientific paper has never pretended to be more than another little piece in a larger jigsaw -- not significant in itself but as an element in a grander scheme. The technique of soliciting many modest contributions to the store of human knowledge has been the secret of Western science since the seventeenth century, for it achieves a corporate, collective power that is far greater than any one individual can exert [italics mine].

In the 21st century, sites like arXiv lowered the barrier to publishing and reading the work of other scientists further. So did blogs, where scientists could post even smaller, fresher fragments of knowledge. Blogs also democratized science, by enabling scientists to explain results for a wider audience and at greater length than journals allow. Then came social media sites like Twitter, which made it even easier for laypeople and scientists in other disciplines to watch -- and participate in -- the conversation.

I realize that this blog post quotes an essay that quotes another essay. But I would never have seen the Ziman passage without reading Lewis Thomas. Perhaps you would not have seen the Thomas passage without reading this post? When I was in college, the primary way I learned about things I didn't read myself was by hearing about them from classmates. That mode of sharing puts a high premium on having the right kind of friends. Now, blogs and social media extend our reach. They help us share ideas and inspirations, as well as helping us to collaborate on science.

~~~~

I first mentioned The Lives of a Cell a couple of weeks ago, in If only ants watched Netflix.... This post may not be the last to cite the book. I find something quotable and worth further thought every few pages.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns, Teaching and Learning

November 27, 2022 9:38 AM

I Toot From the Command Line, Therefore I Am

Like so many people, I have been checking out new social media options in the face of Twitter's upheaval. None are ideal, but for now I have focused most of my attention on Mastodon, a federation of servers implemented using the ActivityPub protocol. Mastodon has an open API, which makes it attractive to programmers. I've had an account there for a few years (I like to grab username wallingf whenever a new service comes out) but, like so many people, hadn't really used it. Now feels more like the time.

On Friday, I spent a few minutes writing a small script that posts to my Mastodon account from the command line. I occasionally find that sort of thing useful, so the script has practical value. Really, though, I just wanted to play a bit in code and take a look at Mastodon's API.

Several people in my feed posted, boosted, and retweeted a link to this DEV Community article, which walks readers through the process of posting a status update using curl or Python. Everything worked exactly as advertised, with one small change: the Developers link that used to be in the bottom left corner of one's Mastodon home page is now a Development link on the Preferences page.

I've read a lot in the last few weeks about how the culture of Mastodon is different from the culture of Twitter. I'm trying to take seriously the different culture. One concrete example is the use of content warnings or spoiler alerts to hide content behind a brief phrase or tag. This seems like a really valuable practice, useful in a number of different contexts. At the very least, it feels like the Subject: line on an email message or a Usenet News post. So I looked up how to post content warnings with my command-line script. It was dead simple, all done in a few minutes.

There may be efficiency problems under the hood with how Mastodon requests work, or so I've read. The public interface seems well done, though.

I went with Python for my script, rather than curl. That fits better with most of my coding these days. It also makes it easier to grow the script later, if I want. bash is great for a few lines, but I don't like to live inside bash for very long. On any code longer than a few lines, I want to use a programming language. At a couple of dozen lines, my script was already long enough to merit a real language. I went mostly YAGNI this time around. There are no classes, just a sequence of statements to build the http request from some constants (server name, authorization token) and command-line args (the post, the content warning). I did factor the server name and authorization token out of the longer strings and include an option to write the post via stdin. I want the flexibility of writing longer toots now, and I don't like magic constants. If I ever need to change servers or tokens, I never have to look past the first few lines of the file.
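
The heart of the script is just one http request. Here is a minimal sketch of the idea, not my actual script; the server name and token below are placeholders, and the endpoint and field names come from Mastodon's public API for posting statuses.

    import sys
    import requests

    SERVER = "https://example.social"   # placeholder, not my real server
    TOKEN = "YOUR-ACCESS-TOKEN"         # from the Development page in Preferences

    def toot(text, spoiler=None):
        """Post a status, optionally hidden behind a content warning."""
        data = {"status": text}
        if spoiler:
            data["spoiler_text"] = spoiler    # Mastodon's content-warning field
        response = requests.post(f"{SERVER}/api/v1/statuses",
                                 headers={"Authorization": f"Bearer {TOKEN}"},
                                 data=data)
        response.raise_for_status()
        return response.json()["url"]

    if __name__ == "__main__":
        # usage: toot.py "the post itself" ["an optional content warning"]
        # with no arguments, read the post from stdin
        text = sys.argv[1] if len(sys.argv) > 1 else sys.stdin.read()
        cw = sys.argv[2] if len(sys.argv) > 2 else None
        print(toot(text, cw))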

As I briefly imagined morphing the small but growing script into a Toot class, I recalled a project I gave my Intermediate Computing students back in 2009 or so: implement the barebones framework of a Twitter-like application. That felt cutting edge back then, and most of the students really liked putting their new OO design and programming skills to use in a program that seemed to matter. It was good fun, and a great playground for so many of the ideas they had learned that semester.

All in all, this was a great way to spend a few minutes on a free afternoon. The API was simple to use, and the result is a usable new command. I probably should've been grading or doing some admin work, but profs need a break, too. I'm thankful to enjoy little programming projects so much.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

November 23, 2022 1:27 PM

I Can't Imagine...

I've been catching up on some items in my newsreader that went unread last summer while I rode my bike outdoors rather than inside. This passage from a blog post by Fred Wilson at AVC touched on a personal habit I've been working on:

I can't imagine an effective exec team that isn't in person together at least once a month.

I sometimes fall into a habit of saying or thinking "I can't imagine...". I'm trying to break that habit.

I don't mean to pick on Wilson, whose short posts I enjoy for insight into the world of venture capital. "I can't imagine" is a common trope in both spoken and written English. Some writers use it as a rhetorical device, not as a literal expression. Maybe he meant it that way, too.

For a while now, though, I've been trying to catch myself whenever I say or think "I can't imagine...". Usually my mind is simply being lazy, or too quick to judge how other people think or act.

It turns out that I usually can imagine, if I try. Trying to imagine how that thinking or behavior makes sense helps me see what other people might be thinking, what their assumptions or first principles are. Even when I end up remaining firm in my own way of thinking, trying to imagine usually puts me in a better position to work with the other person, or explain my own reasoning to them more effectively.

Trying to imagine can also give me insight into the limits of my own thinking. What assumptions am I making that lead me to have confidence in my position? Are those assumptions true? If yes, when might they not be true? If no, how do I need to update my thinking to align with reality?

When I hear someone say, "I can't imagine..." I often think of Russell and Norvig's textbook Artificial Intelligence: A Modern Approach, which I used for many years in class [1]. At the end of one of the early chapters, I think, they mention critics of artificial intelligence who can't imagine the field of AI ever accomplishing a particular goal. They respond cheekily, to the effect that this says less about AI than it says about the critics' lack of imagination. I don't think I'd ever seen a textbook dunk on anyone before, and as a young prof and open-minded AI researcher, I very much enjoyed that line [2].

Instead of saying "I can't imagine...", I am trying to imagine. I'm usually better off for the effort.

~~~~

[1] The Russell and Norvig text first came out in 1995. I wonder if the subtitle "A Modern Approach" is still accurate... Maybe theirs is now a classical approach!

[2] I'll have to track that passage down when I am back in my regular office and have access to my books. (We are in temporary digs this fall due to construction.) I wonder if AI has accomplished the criticized goal in the time since Russell and Norvig published their book. AI has reached heights in recent years that many critics in 1995 could not imagine. I certainly didn't imagine a computer program defeating a human expert at Go in my lifetime, let alone learning to do so almost from scratch! (I wrote about AlphaGo and its intersection with my ideas about AI a few times over the years: [ 01/2016 | 03/2016 | 05/2017 | 05/2018 ].)


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

October 02, 2022 9:13 AM

Twitter Replies That No One Asked For

I've been pretty quiet on Twitter lately. One reason is that my daily schedule has been so different for the last six or eight weeks: I've been going for bike rides with my wife at the end of the work day, which means I'm most likely to be reading Twitter late in the day. By then, many of the threads I see have played themselves out. Maybe I should jump in anyway? Even after more than a decade, I'm not sure I know how to Twitter properly.

Here are a few Twitter replies that no one asked for and that I chose not to send at the time.

• When people say, "That's the wrong question to ask", what they often seem to mean -- and should almost always say -- is, "That's not the question I would have asked."

• No, I will not send you a Google Calendar invite. I don't use Google Calendar. I don't even put every event into the calendaring system I *do* use.

• Yes, I will send you a Zoom link.

• COVID did not break me for working from home. Before the pandemic, I almost never worked at home during the regular work day. As a result, doing so felt strange when the pandemic hit us all so quickly. But I came first to appreciate and then to enjoy it, for many of the same reasons others enjoy it. (And I don't even have a long or onerous commute to campus!) Now, I try to work from home one day a week when schedules allow.

• COVID also did not break me for appreciating a quiet and relatively empty campus. Summer is still a great time to work on campus, when the pace is relaxed and most of the students who are on campus are doing research. Then again, so is fall, when students return to the university, and spring, when the sun returns to the world. The takeaway: It's usually a great time to be on campus.

I realize that some of these replies in absentia are effectively subtweets at a distance. All the more reason to post them here, where everyone who reads them has chosen to visit my blog, rather than in a Twitter thread filled with folks who wouldn't know me from Adam. They didn't ask for my snark.

I do stand by the first bullet as a general observation. Most of us -- me included! -- would do better to read everyone else's tweets and blog posts as generously as possible.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

September 18, 2022 9:37 AM

Dread and Hope

First, a relatively small-scale dread. From Jeff Jarvis in What Is Happening to TV?

I dread subscribing to Apple TV+, Disney+, Discovery+, ESPN+, and all the other pluses for fear of what it will take to cancel them.

I have not seen a lot of popular TV shows and movies in the last decade or two because I don't want to deal with the hassle of unsubscribing from some service. I have a list of movies to keep an eye out for in other places, should they ever appear, or to watch at their original homes, should my desire to see them ever outgrow my preference to avoid certain annoyances.

Next, a larger-scale source of hope, courtesy of Neel Krishnaswami in The Golden Age of PL Research:

One minor fact about separation logic. John C. Reynolds invented separation logic when he was 65. At the time that most people start thinking about retirement, he was making yet another giant contribution to the whole field!

I'm not thinking about retirement at all yet, but I am past my early days as a fresh, energetic, new assistant prof. It's good to be reminded every once in a while that the work we do at all stages of our careers can matter. I didn't make giant contributions when I was younger, and I'm not likely to make a giant contribution in the future. But I should strive to keep doing work that matters. Perhaps a small contribution remains to be made.

~~~~

This isn't much of a blog post, I know. I figure if I can get back into the habit of writing small thoughts down, perhaps I can get back to blogging more regularly. It's all about the habit. Wish me luck.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

August 29, 2022 4:44 PM

Radio Silence

a photo of a dolomite outcropping in Backbone State Park, Iowa

I did not intend for August to be radio silence on my blog and Twitter page. The summer just caught up with me, and my brain took care of itself, I guess, by turning off for a bit.

One bit of newness for the month was setting up a new Macbook Air. I finally placed my order on July 24. It was scheduled to arrive the week of August 10-17 but magically appeared on our doorstep on July 29. I've been meaning to write about the experience of setting up a new Mac laptop after working for seven years on a trusty Macbook Pro, but that post has been a victim of the August slowdown. I can say this: I pulled out the old Macbook Pro to watch Netflix on Saturday evening... and it felt *so* heavy. How quickly we adjust to new conditions and forget how lucky we were before.

Another pleasure in August was meeting up with Daniel Steinberg over Zoom. I remember back near the beginning of the pandemic Daniel said something on Twitter about getting together for a virtual coffee with friends and colleagues he could no longer visit. After far too long, I contacted him to set up a chat. We had a lot of catching up to do and ended up discussing teaching, writing, programming, and our families. It was one of my best hours for the month!

My wife and I took advantage of the last week before school started by going on a couple of hikes. We visited Backbone State Park for the first time and spent an entire day walking and enjoying scenery that most people don't associate with Iowa. The image at the top of this post comes from the park's namesake trail, which showcases some of the dolomite limestone cliffs left over from before the last glaciers. Here's another shot, of an entrance to a cave carved out by icy water that still flows beneath the surface:

a photo of the entrance to a dolomite cave in Backbone State Park, Iowa

Closer to home, we took a long morning to walk through Hartman Reserve, a county preserve. Walking for a couple of hours as the sun rises and watching the trees and wildlife come to light is a great way to shake some rust off the mind before school starts.

I had a tough time getting ready mentally for the idea of a new school year. This summer's work offered more burnout than refreshment. As the final week before classes wound down, I had to get serious about class prep -- and it freed me up a bit. Writing code, thinking about CS, and getting back into the classroom with students still energize me. This fall is my compilers course. I'm giving myself permission to make only a few targeted changes in the course plan this time around. I'm hoping that this lets me build some energy and momentum throughout the semester. I'll need that in order to be there for the students.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

August 15, 2022 12:49 PM

No Comment

a picture of the orchid in my office from April 2021

From the closing pages of The Orchid Thief, which I mentioned in my previous post:

"The thing about computers," Laroche said, "the thing that I like is that I'm immersed in it but it's not a living thing that's going to leave or die or something. I like having the minimum number of living things to worry about in my life."

Actually, I have two comments.

If Laroche had gotten into open source software, he might have found himself with the opposite problem: software that won't die. Programmers sometimes think, "I know, I'll design and implement my own programming language!" Veterans of the programming languages community always seem to advise: think twice. If you put something out there, other people will use it, and now you are stuck maintaining a package forever. The same can be said for open source software more generally. Oh, and did I mention it would be really great if you added this feature?

I like having plants in my home and office. They give me joy every day. They also tend to live a lot longer than some of my code. The hardy orchid featured above bloomed like clockwork twice a year for me for five and a half years. Eventually it needed more space than the pot in my office could give, so it's gone now. But I'm glad to have enjoyed it for all those years.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Software Development

July 31, 2022 8:54 AM

Caring about something whittles the world down to a more manageable size

In The Orchid Thief, there is a passage where author Susan Orlean describes a drive across south Florida on her way to a state preserve, where she'll be meeting an orchid hunter. She ends the passage this way:

The land was marble-smooth and it rolled without a pucker to the horizon. My eyes grazed across the green band of ground and the blue bowl of sky and then lingered on a dead tire, a bird in flight, an old fence, a rusted barrel. Hardly any cars came toward me, and I saw no one in the rearview mirror the entire time. I passed so many vacant acres and looked past them to so many more vacant acres and looked ahead and behind at the empty road and up at the empty sky; the sheer bigness of the world made me feel lonely to the bone. The world is so huge that people are always getting lost in it. There are too many ideas and things and people, too many directions to go. I was starting to believe that the reason it matters to care passionately about something is that it whittles the world down to a more manageable size. It makes the world seem not huge and empty but full of possibility. If I had been an orchid hunter I wouldn't have seen this space as sad-making and vacant--I think I would have seen it as acres of opportunity where the things I loved were waiting to be found.

John Laroche, the orchid hunter at the center of The Orchid Thief, comes off as obsessive, but I think many of us know that condition. We have found an idea or a question or a problem that grabs our attention, and we work on it for years. Sometimes, we'll follow a lead so far down a tunnel that it feels a lot like the swamps Laroche braves in search of the ghost orchid.

Even a field like computer science is big enough that it can feel imposing if a person doesn't have a specific something to focus their attention and energy on. That something doesn't have to be forever... Just as Laroche had cycled through a half-dozen obsessions before turning his energy to orchids, a computer scientist can work deeply in an area for a while and then move on to something else. Sometimes, there is a natural evolution in the problems one focuses on, while other times people choose to move into a completely different sub-area. I see a lot of people moving into machine learning these days, exploring how it can change the sub-field they used to focus exclusively on.

As a prof, I am fortunate to be able to work with young adults as they take their first steps in computer science. I get to watch many of them find a question they want to answer, a problem they want to work on for a few years, or an area they want to explore in depth until they master it. It's also sad, in a way, to work with a student who never quite finds something that sparks their imagination. A career in software, or anything, really, can look as huge and empty as Orlean's drive through south Florida if someone doesn't care deeply about something. When they do, the world seems not huge and empty, but full of possibility.

I'm about halfway through The Orchid Thief and am quite enjoying it.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

June 28, 2022 4:12 PM

You May Be Right

Billy Joel performing 'We Didn't Start the Fire' at Notre Dame Stadium, June 25, 2022

I first saw Billy Joel perform live in 1983, with a college roommate and our girlfriends. It was my first pop/rock concert, and I fancied myself the biggest Billy Joel fan in the world. The show was like magic to a kid who had been listening to Billy's music on vinyl, and the radio, for years.

Since then, I've seen him more times than I can remember, most recently in 2008. My teenaged daughters went with me to that one, so it was magic for more reasons than one. I've even seen a touring Broadway show built around his music. So, yeah, I'm still a fan.

On Saturday morning, I drove to Elkhart, Indiana, to meet up with three friends from college to go see Billy perform outdoors at Notre Dame Stadium. We bought our tickets in October 2019, pre-COVID, expecting to see the show in the summer of 2020. After two years of postponement, Billy, the venue, and the fans were ready to go. Six hours is a long way to drive to see a two- or three-hour show, especially knowing that I had to drive six hours back the next morning. I'm not a college student any more!

You may be right; I may be crazy. But I would drive six hours again to see Billy. Even at 73, he puts on a great show. I hope I have that kind of energy -- and the desire to still do my professional thing -- when I reach that age. (I don't expect that 50,000 students will pay to see me do it, let alone drive six hours.) For this show, I had the bonus of being able to visit with good friends, one of whom I've known since grade school, after too long a time.

I went all fanboy in my short post about the 2008 concert, so I won't bore you again with my hyperbole. I'll just say that Billy performed "She's Always A Woman" and "Don't Ask Me Why" again, along with a bunch of the old favorites and a few covers: I enjoyed his impromptu version of "Don't Let the Sun Go Down on Me", bobbles and all. He played piano for one of his band members, Mike DelGuidice, who sang "Nessun Dorma". And the biggest ovation of the night may have gone to Crystal Taliafero, a multi-talented member of Billy's group, for her version of "Dancing in the Streets" during the extended pause in "The River of Dreams".

This concert crowd was the most people I've been around in a long time... I figured a show in an outdoor stadium was safe enough, with precautions. (I was one of the few folks who wore a mask in the interior concourse and restrooms.) Maybe life is getting back to normal.

If this was my last time seeing Billy Joel perform live, it was a worthy final performance. Who knows, though. I thought 2008 might be my last live show.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

June 14, 2022 2:48 PM

A Two Cultures Theory of Meetings

snow falling on a redwood cabin

Courtesy of Chad Orzel's blog:

This ended up reminding me of the Two Cultures theory of meetings that I heard (second-hand) from a former Dean (a Classics professor, for the record). This was prompted by her noticing that the scientists and engineers always seemed grumpy and impatient about having meetings during work hours, where folks from the non-STEM fields were more cheerful. She realized that this was largely because the STEM folks tended to do research in their labs and offices on campus, during the day, so having a meeting was directly taking them away from productive time. For folks on the more literary side of academia, the actual scholarly work of reading and writing was mostly done elsewhere— at home, in coffee shops, at archives— and at times outside the normal academic workday— in the evening, during the summer, etc. As a result, they tended to only come to campus for classes, meetings, and socialization, and the latter two tended to blend together.

Now I'm thinking back over my years as a faculty member and department head. I've been attending meetings mostly with administrators for so long now that my experience is blunted: the days of science department heads differ less from the days of arts and humanities department heads than the days of the corresponding faculty differ from one another. Most admins seem reconciled to their meetings, if ruefully.

Being a computer scientist affects my experience, too. Most of our faculty are software people who can read and write code from anywhere. In this regard, we are perhaps more like arts and humanities folks than other scientists are. When I think back on my interactions with CS colleagues, the ones least likely to want to meet at any old time are (1) people who do work with hardware in their labs and (2) people doing the most serious research. The second group tend to guard their creative time more carefully in all respects.

The other thing coloring my experience is... me. I am frequently grumpy and impatient about having meetings at all, during regular work hours or not, because so many of them come up on the wrong side of the cost/benefit ledger. A lot of university meetings happen only because they are supposed to happen. Many of my colleagues are congenial about this and manage to find ways to put the time to good use for them and, presumably, many other participants. I'd generally like to get back to work on more pressing, or interesting, matters.

But that is getting a bit far afield from the basic observation of a Two Cultures-style split, which is founded, I think, on the notion that the meetings in question are essential or at least important enough to hold. In that narrower context, I think Chad's colleague may be on to something.

~~~~~

Photo by Nikola Johnny Mirkovic on Unsplash.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

June 08, 2022 1:51 PM

Be a Long-Term Optimist and a Short-Term Realist

Before I went to bed last night, I stumbled across actor Robert De Niro speaking with Stephen Colbert on The Late Show. De Niro is, of course, an Oscar winner with fifty years working in films. I love to hear experts talk about what they do, so I stayed up a few extra minutes.

I think Colbert had just asked De Niro to give advice to actors who were starting out today, because De Niro was demurring: he didn't like to give advice, and everyone's circumstances are different. But then he said that, when he himself was starting out, he went on lots of auditions but always assumed that he wasn't going to get the job. There were so many ways not to get a job, so there was no reason to get his hopes up.

Colbert related that anecdote to his own experience getting started in show business. He said that whenever he had an acting job, he felt great, and whenever he didn't have a job, pessimism set in: he felt like he was never going to work again. De Niro immediately said, "oh, I never felt that way". He always felt like he was going to make it. He just had to keep going on auditions.

There was a smile on Colbert's face. He seemed to have trouble squaring De Niro's attitude toward auditions with his claimed confidence about eventual success. Colbert moved on with his interview.

It occurred to me that the combination of attitudes expressed by De Niro is a healthy, almost necessary, way to approach big goals. In the short term, accept that each step is uncertain and unlikely to pay off. Don't let those failures get you down; they are the price of admission. For the long term, though, believe deeply that you will succeed. That's the spirit you need to keep taking steps, trying new things when old things don't seem to work, and hanging around long enough for success to happen.

De Niro's short descriptions of his own experiences revealed how both sides of his demeanor contributed to him ultimately making it. He never knew what casting agents, directors, and producers were looking for, so he was willing to read every part in several different ways. Even though he didn't expect to get the job, maybe one of those people would remember him and mention him to a friend in the business, and maybe that connection would pay off. All he could do was audition.

The self-assurance De Niro seemed to feel almost naturally reminded me of things that Viktor Frankl and John McCain said about how they survived their years in wartime captivity. Somehow, they were able to maintain a confidence that they would eventually be free again. In the end, they were lucky to survive, but their belief that they would survive had given them a strength to persevere through much worse treatment than simply being rejected for a part in a movie. That perseverance helped them stay alive and take actions that would leave them in a position to be lucky.

I realize that the story De Niro tells, like those of Frankl and McCain, is potentially suspect due to survivor bias. We don't get to hear from people who believed that they would make it as actors but never did. Even so, their attitude seems like a pragmatic one to implement, if we can manage it: be a long-term optimist and a short-term realist. Do everything we can to hang around long enough for fortune to find us.

Like De Niro, I am not much one to give advice. In the haze of waking up and going back to sleep last night, though, I think his attitude gives us a useful model to follow.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

May 30, 2022 8:32 AM

I Have Written That Code

Last month, I picked up a copy of The Writing Life by Annie Dillard at the library. It's one of those books that everyone seems to quote, and I had never read it. I was pleased to find it is a slim volume.

It didn't take long to see one of the often-quoted passages, on the page before the first chapter:

No one expects the days to be gods. -- Emerson

Then, about a third of the way in, came the sentences for which everyone knows Dillard:

How we spend our days is, of course, how we spend our lives. What we do with this hour, and that one, is what we are doing.

Dillard's portrayal of the writing life describes some of the mystery that we non-writers imagine, but mostly it depicts the ordinariness of daily grind and the extended focus that looks like obsession to those of us on the outside.

Occasionally, her stories touched on my experience as a writer of programs. Consider this paragraph:

Every year the aspiring photographer brought a stack of his best prints to an old, honored photographer, seeking his judgment. Every year the old man studied the prints and painstakingly ordered them into two piles, bad and good. Every year that man moved a certain landscape print into the bad stack. At length he turned to the young man: "You submit this same landscape every year, and every year I put it on the bad stack. Why do you like it so much?" The young photographer said, "Because I had to climb a mountain to get it."

I have written that code. I bang my head against some problem for days or weeks. Eventually, I find a solution. Sometimes it's homely code that gets the job done; usually it seems more elegant than it is, in relief against the work that went into discovering it. Over time, I realize that I need to change it, or delete it altogether, in order to make progress on the system in which it resides. But... the mountain.

It's a freeing moment when I get over the fixation and make the change the code needs. I'll always have the mountain, but my program needs to move in a different direction.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

April 04, 2022 5:45 PM

Leftover Notes

Like many people, I carry a small notebook most everywhere I go. It is not a designer's sketchbook or an engineer's notebook; it is intended primarily for capturing information and ideas, à la Getting Things Done, before I forget them. Most of the notes end up being transferred to one of my org-mode todo lists, to my calendar, or to a topical file for a specific class or project. Write an item in the notebook, transfer it to the appropriate bin, and cross it off in the notebook.

I just filled the last line of my most recent notebook, a Field Notes classic that I picked up as schwag at Strange Loop a few years ago. Most of the notebook is crossed out, a sign of successful capture and transfer. As I thumbed back through it, I saw an occasional phrase or line that never made it into a more permanent home. That is pretty normal for my notebooks. At this point, I usually recycle the used notebook and consign untracked items to lost memories.

For some reason, this time I decided to copy down all of the untracked items and savor the randomness of my mind. Who knows, maybe I'll use one of these notes some day.

The Feds

basic soul math

I want to be #0

routine, ritual

gallery.stkate.edu

M. Dockery

www.wastetrac.org/spring-drop-off-event

Crimes of the Art

What the Puck

Massachusetts ombudsman

I hope it's still funny...

chessable.com

art gallery

ena @ tubi

In Da Club (50 Cent)

Gide 25; 28 May : 1

HFOSS project

April 4-5: Franklin documentary

Mary Chapin Carpenter

"Silent Parade" by Keigo Higashino

www.pbs.org -- search Storm Lake

"Hello, Transcriber" by Hannah Morrissey

Dear Crazy Future Eugene

I recognize most of these, though I don't remember the reason I wrote all of them down. For whatever reason, they never reached an actionable status. Some books and links sound interesting in the moment, but by the time I get around to transcribing them elsewhere, I'm no longer interested enough to commit to reading, watching, or thinking about them further. Sometimes, something pops into my mind, or I see something, and I write it down. Better safe than sorry...

That last one -- Dear Crazy Future Eugene -- ends up in a lot of my notebooks. It's a phrase that has irrational appeal to me. Maybe it is destined to be the title of my next blog.

There were also three multiple-line notes that were abandoned:

poem > reality
words > fact
a model is not identical

I vaguely recall writing this down, but I forget what prompted it. I vaguely agree with the sentiment even now, though I'd be hard-pressed to say exactly what it means.

Scribble pages that separate notes from full presentation
(solutions to exercises)

This note is from several months ago, but it is timely. Just this week, a student in my class asked me to post my class notes before the session rather than after. I don't do this currently in large part because my sessions are a tight interleaving of exercises that the students do in class, discussion of possible solutions, and use of those ideas to develop the next item for discussion. I think that Scribble, an authoring system that comes with Racket, offers a way for me to build pages I can publish in before-and-after form, or at least in an outline form that would help students take notes. I just never get around to trying the idea out. I think the real reason is that I like to tinker with my notes right up to class time... Even so, the idea is appealing. It is already in my planning notes for all of my classes, but I keep thinking about it and writing it down as a trigger.

generate scanner+parser? expand analysis,
codegen (2 stages w/ IR -- simple exps, RTS, full)
optimization! would allow bigger source language?

This is evidence that I'm often thinking about my compiler course and ways to revamp it. This idea is also already in the system. But I keep prompting myself to think about it again.

Anyway, that was a fun way to reflect on the vagaries of my mind. Now, on to my next notebook: a small pocket-sized spiral notebook I picked up for a quarter in the school supplies section of a big box store a while back. My friend Joe Bergin used to always have one of these in his shirt pocket. I haven't used a spiral-bound notebook for years but thought I'd channel Joe for a couple of months. Maybe he will inspire me to think some big thoughts.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

February 13, 2022 12:32 PM

A Morning with Billy Collins

It's been a while since I read a non-technical article and made as many notes as I did this morning on this Paris Review interview with Billy Collins. Collins was poet laureate of the U.S. in the early 2000s. I recall reading his collection, Sailing Alone Around the Room, at PLoP in 2002 or 2003. Walking the grounds at Allerton with a poem in mind changes one's eyes and ears. Had I been blogging by then, I probably would have commented on the experience, and maybe one or two of the poems, in a post.

As I read this interview, I encountered a dozen or so passages that made me think about things I do, things I've thought, and even things I've never thought. Here are a few.

I'd like to get something straightened out at the beginning: I write with a Uni-Ball Onyx Micropoint on nine-by-seven bound notebooks made by a Canadian company called Blueline. After I do a few drafts, I type up the poem on a Macintosh G3 and then send it out the door.

Uni-Ball Micropoint pens are my preferred writing implement as well, though I don't write enough on paper any more to make buying a particular pen much worth the effort. Unfortunately, just yesterday my last Uni-Ball Micro wrote its last line. Will I order more? It's a race between preference and sloth.

I type up most of the things I write these days on a 2015-era MacBook Pro, often connected to a Magic Keyboard. With the advent of the M1 MacBook Pros, I'm tempted to buy a new laptop, but this one serves me so well... I am nothing if not loyal.

The pen is an instrument of discovery rather than just a recording implement. If you write a letter of resignation or something with an agenda, you're simply using a pen to record what you have thought out. In a poem, the pen is more like a flashlight, a Geiger counter, or one of those metal detectors that people walk around beaches with. You're trying to discover something that you don't know exists, maybe something of value.

Programming may be like writing in many ways, but the search for something to say isn't usually one of them. Most of us sit down to write a program to do something, not to discover some unexpected outcome. However, while I may know what my program will do when I get done, I don't always know what that program will look like, or how it will accomplish its task. This state of uncertainty probably accounts for my preference in programming languages over the years. Smalltalk, Ruby, and Racket have always felt more like flashlights or Geiger counters than tape recorders. They help me find the program I need more readily than Java or C or Python.

I love William Matthews's idea--he says that revision is not cleaning up after the party; revision is the party!

Refactoring is not cleaning up after the party; refactoring is the party! Yes.

... nothing precedes a poem but silence, and nothing follows a poem but silence. A poem is an interruption of silence, whereas prose is a continuation of noise.

I don't know why this passage grabbed me. Perhaps it's just the imagery of the phrases "interruption of silence" and "continuation of noise". I won't be surprised if my subconscious connects this to programming somehow, but I ought to be suspicious of the imposition. Our brains love to make connections.

She's this girl in high school who broke my heart, and I'm hoping that she'll read my poems one day and feel bad about what she did.

This is the sort of sentence I'm a sucker for, but it has no real connection to my life. Though high school was a weird and wonderful time for me, as it was for so many, I don't think anything I've ever done since has been motivated in this way. Collins actually goes on to say the same thing about his own work. Readers are people with no vested interest. We have to engage them.

Another example of that is my interest in bridge columns. I don't play bridge. I have no idea how to play bridge, but I always read Alan Truscott's bridge column in the Times. I advise students to do the same unless, of course, they play bridge. You find language like, South won with dummy's ace, cashed the club ace and ruffed a diamond. There's always drama to it: Her thirteen imps failed by a trick. There's obviously lots at stake, but I have no idea what he's talking about. It's pure language. It's a jargon I'm exterior to, and I love reading it because I don't know what the context is, and I'm just enjoying the language and the drama, almost like when you hear two people arguing through a wall, and the wall is thick enough so you can't make out what they're saying, though you can follow the tone.

I feel seen. Back when we took the local daily paper, I always read the bridge column by Charles Goren, which ran on the page with the crossword, crypto cipher, and other puzzles. I've never played bridge; most of what I know about the game comes from reading Matthew Ginsberg's papers about building AI programs to bid and play. Like Collins, I think I was merely enjoying the sound of the language, a jargon that sounds serious and silly at the same time.

Yeats summarizes this whole thing in "Adam's Curse" when he writes: "A line will take us hours maybe, / Yet if it does not seem a moment's thought / Our stitching and unstitching has been naught."

I'm not a poet, and my unit of writing is rarely the line, but I know a feeling something like this in writing lecture notes for my students. Most of the worst writing consists of paragraphs and sections I have not spent enough time on. Most of the best sounds natural, a clean distillation of deep understanding. But those paragraphs and sections are the result of years of evolution. That's the time scale on which some of my courses grow, because no course ever gets my full attention in any semester.

When I finish a set of notes, I usually feel like the stitching and unstitching have not yet reached their desired end. Some of the text "seems a moment's thought", but much is still uneven or awkward. Whatever the state of the notes, though, I have to move on to the next task: grading a homework assignment, preparing the next class session, or -- worst of all -- performing the administrivia that props up the modern university. More evolution awaits.

~~~~~

This was a good read for a Sunday morning on the exercise bike, well recommended. The line on revision alone was worth the time; I expect it will be a stock tool in my arsenal for years to come.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Software Development, Teaching and Learning

November 22, 2021 2:23 PM

Quick Hits

It's been another one of those months when I think about blogging a lot but never set aside time to write. Rather than wait for the time to finish a piece I'm writing, about the process of writing a demo code generator for my compiler students, I thought I'd drop a few tidbits now, just for fun. Maybe that will break the ice for writing this holiday week.

• Two possible titles for my next blog: Dear Crazy Future Eugene and Eugene Wallingford's Descent Into Madness. (Hey to Sheldon Cooper.)

• A nice quote from one of my daughters' alumni magazines: A biology major who is now an executive at a nonprofit agency was asked about the value of having majored in science.

When science is taught the right way, she said, "it is relevant in just about every situation".

Everyone can benefit from thinking like a scientist, and feeling comfortable with that mode of thinking. (Hey to Chad Orzel and Eureka: Discovering Your Inner Scientist.)

• Dan Wang on the US's ability to be a manufacturer of batteries:

Batteries are hard to ship and tend to be developed for particular automakers. So they're made close to the site of auto assembly. The US could be a big battery maker if only it built the charging network and offered subsidies on the scale of Europe and China, it's not hard.

The worlds of manufacturing and big industry are different in fundamental ways from software. I learn a lot from Wang's deep dives into process knowledge and investment. A lot of his ideas apply to software, too.


Posted by Eugene Wallingford | Permalink | Categories: General

October 10, 2021 1:53 PM

Strange Loop 3: This and That

The week after Strange Loop has been a blur of catching up with all the work I didn't do while attending the conference, or at least trying. That is actually good news for my virtual conference: despite attending Strange Loop from the comfort of my basement, I managed not to get sucked into the vortex of regular business going on here.

A few closing thoughts on the conference:

• Speaking of "the comfort of my basement", here is what my Strange Loop conference room looked like:

my Strange Loop 2021 home set-up, with laptop on the left, 29-inch monitor in the center, and a beverage to the right

The big screen is a 29" ultra-wide LG monitor that I bought last year on the blog recommendation of Robert Talbert, which has easily been my best tech purchase of the pandemic. On that screen you'll see vi.to, the streaming platform used by Strange Loop, running in Safari. To its right, I have emacs open on a file of notes and occasionally an evolving blog draft. There is a second Safari window open below emacs, for links picked up from the talks and the conference Slack channels.

On the MacBook Pro to the left, I am running Slack, another emacs shell for miscellaneous items, and a PDF of the conference schedule, marked up with the two talks I'm considering in each time slot.

That set-up served me well. I can imagine using it again in the future.

• Attending virtually has its downsides, but also its upsides. Saturday morning, one attendee wrote in the Slack #virtual-attendees channel:

Virtual FTW! Attending today from a campsite in upstate New York and enjoying the fall morning air

I was not camping, but I experienced my own virtual victories at lunch time, when I was able to go for a walk with my wife on our favorite walking trails.

• I didn't experience many technical glitches at the conference. There were some serious AV issues in the room during Friday's second slot. Being virtual, I was able to jump easily into and out of the room, checking in on another talk while they debugged on-site. In another talk, we virtual attendees missed out on seeing the presenter's slides. The speaker's words turned out to be enough for me to follow. Finally, Will Byrd's closing keynote seemed to drop its feed a few times, requiring viewers to refresh their browsers occasionally. I don't have any previous virtual conferences to compare to, but this all seemed pretty minor. In general, the video and audio feeds were solid and of high fidelity.

• One final note, not related to The Virtual Experience. Like many conferences, Strange Loop has so many good talks that I usually have to choose among two or three talks I want to see in each slot. This year, I kept track of alt-Strange Loop, the schedule of talks I didn't attend but really wanted to. Comparing this list to the list of talks I did attend gives a representative account of the choices I faced. It also would make for a solid conference experience in its own right:

  • FRI 02 -- Whoops! I Rewrote it in Rust (Brian Martin)
  • FRI 03 -- Keeping Your Open Source Project Accessible to All (Treva Williams)
  • FRI 04 -- Impacting Global Policy by Understanding Litter Data (Sean Doherty)
  • FRI 05 -- Morel, A Functional Query Language (Julian Hyde)
  • FRI 06 -- Software for Court Appointed Special Advocates (Linda Goldstein)
  • SAT 02 -- Asami: Turn your JSON into a Graph in 2 Lines (Paula Gearon)
  • SAT 03 -- Pictures Of You, Pictures Of Me, Crypto Steganography (Sean Marcia)
  • SAT 04 -- Carbon Footprint Aware Software Development (Tejas Chopra)
  • SAT 05 -- How Flutter Can Change the Future of Urban Communities (Edward Thornton)
  • SAT 06 -- Creating More Inclusive Tech Spaces: Paths Forward (Amy Wallhermfechtel)

There is a tie for the honor of "talk I most wanted to see but didn't": Wallhermfechtel on creating more inclusive tech spaces and Marcia on crypto steganography. I'll be watching these videos on YouTube some time soon!

As I mentioned in Day 1's post, this year I tried to force myself out of my usual zone, to attend a wider range of talks. Both lists of talks reflect this mix. At heart I am an academic with a fondness for programming languages. The tech talks generally lit me up more. Even so, I was inspired by some of the talks focused on community and the use of technology for the common good. I think I used my two days wisely.

That is all. Strange Loop sometimes gives me the sort of inspiration overdose that Molly Mielke laments in this tweet. This year, though, Strange Loop 2021 gave me something I needed after eighteen months of pandemic (and even more months of growing bureaucracy in my day job): a jolt of energy, and a few thoughts for the future.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

October 01, 2021 5:46 PM

Strange Loop 1: Day One

On this first day of my first virtual conference, I saw a number of Strange Loop-y talks: several on programming languages and compilers, a couple by dancers, and a meta-talk speculating on the future of conferences.

• I'm not a security guy or a cloud guy, so the opening keynote "Why Security is the Biggest Benefit of Using the Cloud" by AJ Yawn gave me a chance to hear what people in this space think and talk about. Cool trivia: Yawn played a dozen college basketball games for Leonard Hamilton at Florida State. Ankle injuries derailed his college hoops experience, and now he's a computer security professional.

• Richard Marmorstein's talk, "Artisanal, Machine-Generated API Libraries" was right on topic with my compiler course this semester. My students would benefit from seeing how software can manipulate AST nodes when generating target code.

Marmorstein uttered two of the best lines of the day:

  • "I could tell you a lot about Stripe, but all you need to know is Stripe has an API."
  • "Are your data structures working for you?"

I've been working with students all week trying to help them see how an object in their compiler, such as a token, can help the compiler do its job -- and make the code simpler to boot. Learning software design is hard. (I sketch the kind of thing I mean just after this list of talks.)

• I learned a bit about the Nim programming language from Aditya Siram. As you might imagine, a language designed at the nexus of Modula/Oberon, Python, and Lisp appeals to me!

• A second compiler-oriented talk, by Richard Feldman, demonstrated how opportunistic in-place mutation, a static optimization, can help a pure functional program outperform imperative code.

• After the talk "Dancing With Myself", an audience member complimented Mariel Pettee on "nailing the Strange Loop talk". The congratulations were spot-on. She hit the technical mark by describing the use of two machine learning techniques, variational autoencoders and graph neural networks. She hit the aesthetic mark by showing how computer models can learn and generate choreography. When the video for this talk goes live, you should watch.

Pettee closed with the expansive sort of idea that makes Strange Loop a must-attend conference. Dance has no universal language for "writing" choreography, and video captures only a single instance or implementation of a dance, not necessarily the full intent of the choreographer. Pettee had expected her projects to show how machine learning can support invention and co-creation, but now she sees how work like this might provide a means of documentation. Very cool. Perhaps CS can help to create a new kind of language for describing dance and movement.

• I attended Laurel Lawson's "Equitable Experiential Access: Audio Description" to learn more about ways in which videos and other media can provide a fuller, more equitable experience to everyone. Equity and inclusion have become focal points for so much of what we do at my university, and they apply directly to my work creating web-based materials for students. I have a lot to learn. I think one of my next steps will be to experience some of my web pages (session notes, assignments, resource pages) solely through a screen reader.

• Like all human activities, traditional in-person conferences offer value and extract costs. Crista Lopes used her keynote closing Day 1 to take a sober look at the changes in their value and their costs in the face of technological advances over the last thirty years.

If we are honest with ourselves, virtual conferences are already able to deliver most of the value of in-person conferences (and, in some ways, provide more value), at much lower cost. The technology of going virtual is the easy part. The biggest challenges are social.
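
Back to Marmorstein's question -- "Are your data structures working for you?" -- here is the kind of minimal sketch I have in mind for my students. It is a toy, not code from our course compiler: the node classes and the stack-machine instructions are made up for this example. The point is the shape of the design: when each AST node is an object, code generation becomes a small method on every node rather than one sprawling function full of special cases.

    class Num:
        def __init__(self, value):
            self.value = value

        def gen(self):
            # a literal compiles to a single push of its value
            return [f"push {self.value}"]

    class Add:
        def __init__(self, left, right):
            self.left = left
            self.right = right

        def gen(self):
            # each operand emits its own code, then we emit the operator
            return self.left.gen() + self.right.gen() + ["add"]

    # (2 + 3) + 4  ==>  push 2, push 3, add, push 4, add
    tree = Add(Add(Num(2), Num(3)), Num(4))
    print("\n".join(tree.gen()))

The code generator shrinks to a single call on the root of the tree, and each node carries just enough knowledge to do its part.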

~~~~~

A few closing thoughts as Day 1 closes.

As Crista said, "Taking paid breaks in nice places never gets old." My many trips to OOPSLA and PLoP provided me with many wonderful physical experiences. Being in the same place with my colleagues and friends was always a wonderful social experience. I like driving to St. Louis and going to Strange Loop in person; sitting in my basement doesn't feel the same.

With time, perhaps my expectations will change.

It turns out, though, that "virtual Strange Loop" is a lot like "in-person Strange Loop" in one essential way: several cool new ideas arrive every hour. I'll be back for Day Two.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

September 30, 2021 4:42 PM

Off to Strange Loop

the Strange Loop splash screen from the main hall, 2018

After a couple of years away, I am attending Strange Loop. 2018 seems so long ago now...

Last Wednesday morning, I hopped in my car and headed south to Strange Loop 2018. It had been a few years since I'd listened to Zen and the Art of Motorcycle Maintenance on a conference drive, so I popped it into the tape deck (!) once I got out of town and fell into the story. My top-level goal while listening to Zen was similar to my top-level goal for attending Strange Loop this year: to experience it at a high level; not to get bogged down in so many details that I lost sight of the bigger messages. Even so, though, a few quotes stuck in my mind from the drive down. The first is an old friend, one of my favorite lines from all of literature:

Assembly of Japanese bicycle require great peace of mind.

The other was the intellectual breakthrough that unified Phaedrus's philosophy:

Quality is not an object; it is an event.

This idea has been on my mind in recent months. It seemed a fitting theme, too, for Strange Loop.

There will be no Drive South in 2021. For a variety of reasons, I decided to attend the conference virtually. The persistence of COVID is certainly one of the big reasons. Alex and the crew at Strange Loop are taking all the precautions one could hope for to mitigate risk, but even so I will feel more comfortable online this year than in rooms full of people from across the country. I look forward to attending in person again soon.

Trying to experience the conference at a high level is again one of my meta-level goals for attending. The program contains so many ideas that are new to me; I think I'll benefit most by opening myself to areas I know little or nothing about and seeing where the talks lead me.

This year, I have a new meta-level goal: to see what it is like to attend a conference virtually. Strange Loop is using Vito as its hub for streaming video and conference rooms and Slack as its online community. This will be my first virtual conference, and I am curious to see how it feels. With concerns such as climate change, public health, and equity becoming more prominent as conference-organizing committees make their plans, I suspect that we will be running more and more of our conferences virtually in the future, especially in CS. I'm curious to see how much progress has been made in the last eighteen months and how much room we have to grow.

This topic is even on the program! Tomorrow's lineup concludes with Crista Lopes speaking on the future of conferences. She's been thinking about and helping to implement conferences in new ways for a few years, so I look forward to hearing what she has to say.

Whatever the current state of virtual conferences, I fully expect that this conference will be a worthy exemplar. It always is.

So, I'm off to Strange Loop for a couple of days. I'll be in my basement.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

August 29, 2021 10:19 AM

Launching the Compiler Project with New Uncertainties

We will be forming project teams in my course this week, and students will begin work in earnest on Friday. Or so thinks the prof, who releases the first assignment on Thursday... I can dream.

I noticed one change this year when I surveyed students about their preferences for forming teams. In an ordinary year, most students submit at least one or two names of others in the class with whom they'd like to work; some already have formed the teams they want to work in. A few indicate someone they'd rather not work with, usually based on experiences in previous courses. This helps me help them form teams with a mix of new and familiar, with some hedge against expected difficulties. It's never perfect, but most years we end up with a decent set of teams and project experiences.

This year, though, students barely offered any suggestions for forming teams. Most students expressed no preference for whom they want to work with, and no one indicated someone they don't want to work with.

At first, this seemed strange to me, but then I realized that it is likely an effect of three semesters distorted by COVID-19. With one semester forced online and into isolation, a second semester with universal masking, no extracurricular activities, and no social life, and a third semester with continued masking and continued encouragement not to gather, these students have had almost no opportunity to get to know one another!

This isolation eliminates one of the great advantages of a residential university, both personally and professionally. I made so many friends in college, some of whom I'm still close to, and spent time with them whenever I wasn't studying (which, admittedly, was a lot). But it also affects the classroom, where students build bonds over semesters of taking courses together in various configurations. Those bonds carry over into a project course such as mine, where they lubricate the wheels of teams who have to work together more closely than before. They at least begin the project knowing each other a bit and sharing a few academic experiences.

Several students in my class this semester said, "I have no friends in this class" or even "I don't know any other CS majors". That is sad. It also raises the stakes for the compiler project, which may be their only chance to make acquaintances in their major before they graduate. I feel a lot more responsibility as I begin to group students into teams this semester, even as I know that I have less information available than ever before for doing a credible job.

I'm going to keep all this in mind as the semester unfolds and pay closer attention to how students and teams seem to be doing. Perhaps this course can not only help them have a satisfying and educational experience building a big piece of software, but also help them form some of the personal bonds that add grace notes to their undergrad years.

~~~~~

On an unrelated note, I received word a couple of weeks ago that this blog had been selected by Feedspot as one of the Top 20 Computer Science Blogs on the web. It's always nice to be recognized in this way. Given how little I've blogged over the last couple of years, it is rather generous to include me on this list! The list includes a number of top-quality blogs, several of which I read religiously, and most of which post entries with admirable regularity. It remains a goal of mine to return to writing here more regularly. Perhaps two entries within a week, light as they are, offer hope.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

May 06, 2021 3:19 PM

Sometimes You Have To Just Start Talking

I have been enjoying a few of James Propp's essays recently. Last month he wrote about the creation of zero. In Who Needs Zero, he writes:

But in mathematics, premature attempts to reach philosophical clarity can get in the way of progress both at the individual level and at the cultural level. Sometimes you have to just start talking before you understand what you're talking about.

This reminded me of a passage by Iris Murdoch in Metaphysics as a Guide to Morals, which I encountered in one of Robin Sloan's newsletters:

The achievement of coherence is itself ambiguous. Coherence is not necessarily good, and one must question its cost. Better sometimes to remain confused.

My brain seems hardwired to seek out and create abstractions. Perhaps it's just a deeply ingrained habit. Even so, I am a pragmatist at heart. As Propp says, "Zero is as zero does."

Allowing oneself to remain confused, to forge ahead without having reached clarity yet, is essential to doing research, or to learning anything at all, really.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

February 28, 2021 9:47 AM

Find Your Passion? Master Something.

A few weeks ago, a Scott Galloway video clip made the rounds. In it, Galloway was saying something about "finding your passion" that many people have been saying for a long time, only in that style that makes Galloway so entertaining. Here's a great bit of practical advice on the same topic from tech guru Kevin Kelly:

Following your bliss is a recipe for paralysis if you don't know what you are passionate about. A better motto for most youth is "master something, anything". Through mastery of one thing, you can drift towards extensions of that mastery that bring you more joy, and eventually discover where your bliss is.

My first joking thought when I read this was, "Well, maybe not anything..." I mean, I can think of lots of things that don't seem worth mastering, like playing video games. But then I read about professional gamers making hundreds of thousands of dollars a year, so who am I to say? Find something you are good at, and get really good at it. As Galloway says, like Chris Rock before him, it's best to become good at something that other people will pay you for. But mastery of anything opens doors that passion can only bang on.

The key to the "master something, anything" mantra is the next sentence of Kelly's advice. When we master something, our expertise creates opportunities. We can move up or down the hierarchy of activities built from that mastery, or to related domains. That is where we are most likely to find the life that brings us joy. Even better, we will find it in a place where our mastery helps us get through the inevitable drudge work and over the inevitable obstacles that will pop in our way. I love to program, but some days debugging is a slog, and other days I butt up against thorny problems beyond my control. The good news is that I have skills to get through those days, and I like what I'm doing enough to push on through to the more frequent moments and days of bliss.

Passion is wonderful if you have it, but it's hard to conjure up on its own. Mastering a skill, or a set of skills, is something every one of us can do, and by doing it we can find our way to something that makes us happy.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

February 27, 2021 11:12 AM

All The Words

In a Paris Review interview, Fran Lebowitz joked about the challenge of writing:

Every time I sit at my desk, I look at my dictionary, a Webster's Second Unabridged with nine million words in it and think, All the words I need are in there; they're just in the wrong order.

Unfortunately, thinks this computer scientist, writing is a computationally more intense task than simply putting the words in the right order. We have to sample with replacement.
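
A rough back-of-the-envelope in Python makes the point. The nine million words are Lebowitz's; the thousand-word essay is a figure I made up just to have something to compute with:

    from math import lgamma, log, log10

    vocab = 9_000_000   # words in the Webster's Second, per Lebowitz
    length = 1_000      # words in a short essay (a made-up figure)

    # Putting a fixed bag of `length` words in the right order: length! arrangements.
    digits_in_orderings = lgamma(length + 1) / log(10)

    # Writing, where every slot may draw any word from the whole dictionary,
    # with replacement: vocab ** length possible drafts.
    digits_in_drafts = length * log10(vocab)

    print(f"orderings: about 10^{digits_in_orderings:.0f}")
    print(f"drafts:    about 10^{digits_in_drafts:.0f}")

Roughly 10^2568 orderings versus 10^6954 drafts. The right order is the easy part; the haystack we actually search is much, much bigger.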

Computational complexity is the reason we can't have nice things.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

January 05, 2021 3:33 PM

Today's Reading

Lots of good stuff on the exercise bike this morning...

Henry Rollins on making things because he must:

I'm a shipbuilder. I don't want to sail in them. I want you to sail in them. I'm just happy that they leave the harbor so I can have an empty workplace.

Deirdre Connolly on the wonder of human achievement:

We, ridiculous apes with big brains and the ability to cooperate, can listen to the universe shake from billions of light years away, because we learned math and engineering. Wild.

Sonya Mann on our ultimate task:

Our labor is the same as it ever was. Your job is to pioneer a resilient node in the network of civilization -- to dodge the punches, roll with the ones that you can't, and live to fight another day. That's what our ancestors did for us and it's what we'll do for those who come next: hold the line, explore when there's surplus, stay steady, and go down swinging when we have to.

Henry Rollins also said:

"What would a writer do in this situation?" I don't know, man. Ask one. And don't tell me what he said; I'm busy.

Back to work.


Posted by Eugene Wallingford | Permalink | Categories: General

January 03, 2021 5:08 PM

On the Tenth Day of Christmas...

... my daughter gave to me:

Christmas gifts from Sarah!

We celebrated Part 2 of our Zoom Family Christmas this morning. A package from one of our daughters arrived in the mail after Part 1 on Christmas Day, so we reprised our celebration during today's weekly call.

My daughter does not read my blog, at least not regularly, but she did search around there for evidence that I might already own these titles. Finding none, she ventured the long-distance gift. It was received with much joy.

I've known about I Am a Strange Loop for over a decade but have never read it. Somehow, Surfaces and Essences flew under my radar entirely. A book that is new to me!

These books will provide me many hours of enjoyment. Like Hofstadter's other books, they will probably bend my brain a bit and perhaps spark some welcome new activity.

~~~~~

Hofstadter appears in this blog most prominently in a set of entries I wrote after he visited my university in 2012.

I did mention I Am a Strange Loop in a later entry after all, a reflection on Alan Turing, representation, and universal machines. I'm glad that entry did not undermine my daughter's gift!


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

January 01, 2021 12:01 PM

This Version of the Facts

The physicist Leo Szilard once announced to his friend Hans Bethe that he was thinking of keeping a diary: "I don't intend to publish it; I am merely going to record the facts for the information of God." "Don't you think God knows the facts?" Bethe asked. "Yes," said Szilard. "He knows the facts, but He does not know this version of the facts."

I began 2021 by starting to read Disturbing the Universe, Freeman Dyson's autobiographical attempt to explain to people who are not scientists what the human situation looks like to someone who is a scientist. The above passage opens the author's preface.

Szilard's motive seems like a pretty good reason to write a blog: to record one's own version of the facts, for oneself and for the information of God. Unlike Szilard, we have an alternative in between publishing and not publishing. A blog is available for anyone to read, at almost no cost, but ultimately it is for the author, and maybe for God.

I've been using the long break between fall and spring semesters to strengthen my blogging muscle and redevelop my blogging habit. I hope to continue to write more regularly again in the coming year.

Dyson's book is a departure from my recent reading. During the tough fall semester, I found myself drawn to fiction, reading Franny and Zooey by J. D. Salinger, The Bell Jar by Sylvia Plath, The Lucky Ones by Rachel Cusk, and The Great Gatsby by F. Scott Fitzgerald, with occasional pages from André Gide's diary in the downtime between books.

I've written about my interactions with Cusk before [ Outline, Transit, Kudos ], so one of her novels is no surprise here, but what's with those classics from sixty years ago or more? These stories, told by deft and observant writers, seemed to soothe me. They took the edge off of the long days. Perhaps I could have seen a run of classic books coming... In the frustrating summer run-up to fall, I read Thomas Mann's Death in Venice and Ursula Le Guin's The Lathe of Heaven.

For some reason, yesterday I felt the urge to finally pick up Dyson's autobiography, which had been on my shelf for a few months. A couple of years ago, I read most of Dyson's memoir, Maker of Patterns, and found him an amiable and thoughtful writer. I even wrote a short post on one of his stories, in which Thomas Mann plays a key role. At the time, I said, "I've never read The Magic Mountain, or any Mann, for that matter. I will correct that soon. However, Mann will have to wait until I finish Dyson...". 2020 may have been a challenge in many ways, but it gave me at least two things: I read my first Mann (Death in Venice is much more approachable than The Magic Mountain...), and it returned me to Dyson.

Let's see where 2021 takes us.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

December 27, 2020 10:10 AM

What Paul McCartney Can Teach Us About Software

From Sixty-Four Reasons to Celebrate Paul McCartney, this bit of wisdom that will sound familiar to programmers:

On one of the tapes of studio chatter at Abbey Road you can hear McCartney saying, of something they're working on, "It's complicated now. If we can get it simpler, and then complicate it where it needs to be complicated..."

People talk a lot about making software as simple as possible. The truth is, software sometimes has to be complicated. Some programs perform complex tasks. More importantly, programs these days often interact in complex environments with a lot of dissimilar, distributed components. We cannot avoid complexity.

As McCartney knows about music, the key is to make things as simple as can be and introduce complexity only where it is essential. Programmers face three challenges in this regard:

  • learning how to simplify code,
  • learning how to add complexity in a minimal, contained fashion, and
  • learning how to recognize the subtle boundary between essential simplicity and essential complexity.

I almost said that new programmers face those challenges, but after many years of programming, I feel like I'm still learning how to do all three of these things. I suspect other experienced programmers are still learning, too.

On an unrelated note, another passage in this article spoke to me personally as a programmer. While discussing McCartney's propensity to try new things and to release everything, good and bad, it refers to some of the songs on his most recent album (at that time) as enthusiastically executed misjudgments. I empathize with McCartney. My hard drive is littered with enthusiastically executed misjudgments. And I've never written the software equivalent of "Hey Jude".

McCartney just released a new album this month at the age of 78. The third album in a trilogy conceived and begun in 1970, it has already gone to #1 in three countries. He continues to write, record, and release, and collaborates frequently with today's artists. I can only hope to be enthusiastically producing software, and in tune with the modern tech world, when I am his age.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

December 21, 2020 8:30 AM

Watching the Scientific Method Take Off in Baseball

I make it a point never to believe anything
just because it's widely known to be so.
-- Bill James

A few years ago, a friend was downsizing and sent me his collection of The Bill James Abstract from the 1980s. Every so often I'll pick one up and start reading. This week, I picked up the 1984 issue.

my stack of The Bill James Abstract, courtesy of Charlie Richter

It's baseball, so I enjoy it a lot. My #1 team, the Cincinnati Reds, were in the summer doldrums for most of the 1980s but on their way to a surprising 1990 World Series win. My #2 team, the Detroit Tigers, won it all in 1984, with a performance so dominating that it seemed almost preordained. It's fun to reminisce about those days.

It's even more fascinating to watch the development of the scientific method in a new discipline.

Somewhere near the beginning of the 1984 abstract, James announces the stance that underlies all his work: never believe anything just because everyone else says it's true. Scattered through the book are elaborations of this philosophy. He recognizes that understanding the world of baseball will require time, patience, and revision:

In many cases, I have no clear evidence on the issue, no way of answering the question. ... I guess what I'm saying is that if we start trying to answer these questions now, we'll be able to answer them in a few years. An unfortunate side effect is that I'm probably going to get some of the answers wrong now; not only some of the answers but some of the questions.

Being wrong is par for the course for scientists; perhaps James felt some consolation in that this made him like Charles Darwin. The goal isn't to be right today. It is to be less wrong than yesterday. I love that James tells us that, early in his exploration, even some of the questions he is asking are likely the wrong questions. He will know better after he has collected some data.

James applies his skepticism and meticulous analysis to everything in the game: which players contribute the most offense or defense to the team, and how; how pitching styles affect win probabilities; how managers approach the game. Some things are learned quickly but are rejected by the mainstream. By 1984, for example, James and people like him knew that, on average, sacrifice bunts and most attempts to steal a base reduced the number of runs a team scores, which means that most of them hurt the team more than they help. But many baseball people continued to use them too often tactically and even to build teams around them strategically.

At the time of this issue, James had already developed models for several phenomena in the game, refined them as evidence from new seasons came in, and expanded his analysis into new areas. At each step, he channels his inner scientist: look at some part of the world, think about why it might work the way it does, develop a theory and a mathematical model, test the theory with further observations, and revise. James also loves to share his theories and results with the rest of us.

There is nothing new here, of course. Sabermetrics is everywhere in baseball now, and data analytics have spread to most sports. By now, many people have seen Moneyball (a good movie) or read the Michael Lewis book on which it was based (even better). Even so, it really is cool to read what are in effect diaries recording what James is thinking as he learns how to apply the scientific method to baseball. His work helped move an entire industry into the modern world. The writing reflects the curiosity, careful thinking, and humility that so often lead to the goal of the scientific mind:

to be less wrong than yesterday


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

December 10, 2020 3:36 PM

The Fate of Most Blogs

... was described perfectly by André Gide on August 8, 1891, long before the digital computer:

More than a month of blanks. Talking of myself bores me. A diary is useful during conscious, intentional, and painful spiritual evolutions. Then you want to know where you stand. But anything I should say now would be harpings on myself. An intimate diary is interesting especially when it records the awakening of ideas; or the awakening of the senses at puberty; or else when you feel yourself to be dying.

There is no longer any drama taking place in me; there is now nothing but a lot of ideas stirred up. There is no need to write myself down on paper.

Of course, Gide kept writing for many years after that moment of doubt. The Journals of André Gide are an entertaining read. I feel seen, as the kids say these days.


Posted by Eugene Wallingford | Permalink | Categories: General

December 08, 2020 2:06 PM

There Is No Step Two, But There Is A Step Three

In a not-too-distant post, Daniel Steinberg offered two lessons from his experience knitting:

So lesson one is to start and lesson two is to keep going.

This reminded me of Barney Stinson's rules for running a marathon (10s video):

Here's how you run a marathon.

Step 1: Start running.

<pause>

Oh, yeah -- there's no Step 2.

Daniel offers more lessons, though, including Lesson Three: Ask for help. After running the New York Marathon with no training, Barney learned this lesson the hard way. Several hours after the marathon, he found that he no longer had control of his legs, got stuck on the subway because he could not stand up on his own, and had to call the gang for help.

I benefit a lot from reading Daniel's blog posts, and Barney probably could have, too. We're all better off now that Daniel is writing for his blogs and newsletters regularly again. They are full of good stories, interesting links, and plenty of software wisdom.


Posted by Eugene Wallingford | Permalink | Categories: General, Running

July 16, 2020 10:47 AM

Dreaming in Git

I recently read a Five Books interview about the best books on philosophical wonder. One of the books recommended by philosopher Eric Schwitzgebel was Diaspora, a science fiction novel by Greg Egan I've never read. The story unfolds in a world where people are able to destroy their physical bodies to upload themselves into computers. Unsurprisingly, this leads to some fascinating philosophical possibilities:

Well, for one thing you could duplicate yourself. You could back yourself up. Multiple times.

And then have divergent lives, as it were, in parallel but diverging.

Yes, and then there'd be the question, "do you want to merge back together with the person you diverged from?"

Egan wrote Diaspora before the heyday of distributed version control, before darcs and mercurial and git. With distributed VCS, a person could checkout a new personality, or change branches and be a different person every day. We could run diffs to figure out what makes one version of a self so different from another. If things start going too wrong, we could always revert to an earlier version of ourselves and try again. And all of this could happen with copies of the software -- ourselves -- running in parallel somewhere in the world.

And then there's Git. Imagine writing such a story now, with Git's complex model of versioning and prodigious set of commands and flags. Not only could people branch and merge, checkout and diff... A person could try something new without ever committing changes to the repository. We'd have to figure out what it means to push origin or reset --hard HEAD. We'd be able to rewrite history by rebasing, amending, and squashing. A Git guru can surely explain why we'd need to --force-with-lease or --unset-upstream, but even I can imagine the delightful possibilities of git stash in my personal improvement plan.

Perhaps the final complication in our novel would involve a merge so complex that we need a third-party diff tool to help us put our desired self back together. Alas, a Python library or Ruby gem required by the tool has gone stale and breaks an upgrade. Our hero must find a solution somewhere in her tree of blobs, or be doomed to live a forever splintered life.

If you ever see a book named Dreaming in Git or Bug Report on an airport bookstore's shelves, take a look. Perhaps I will have written the first of my Git fantasies.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Software Development

July 07, 2020 2:42 PM

Spurious Precision

When I pestered Conway for more details regarding the seminal Moscow meeting that inspired his triumphant half-day of discovery, he begged off. He was loath to add any "spurious precision", as he came to refer to his embellishments, advertent or accidental. "My memory. My memory is a liar," he said. "It's a good liar. It deceives even me."

I love the phrase "spurious precision". It's a great name for something I see in the world -- and, all too often, in my own mind. I should be as careful with my own memory as Conway tried to be in this instance.

(From a 2015 profile in The Guardian.)


Posted by Eugene Wallingford | Permalink | Categories: General

July 01, 2020 3:19 PM

Feeling Unstuck Amid the Pandemic

Rands recently wrote about his work-from-home routine. I love the idea of walking around a large wooded yard while doing audio meetings... One of his reasons for feeling so at ease struck a chord with me:

Everyone desperately wants to return to normality. I am a professional optimist, but we are not returning to normal. Ever. This is a different forever situation, and the sooner we realize that and start to plan accordingly, the sooner we will feel unstuck.

I have written or spoken a variation of this advice so many times over my fifteen years as department head, most often in the context of state funding and our university budget.

Almost every year for my first decade as head, we faced a flat or reduced budget, and every time several university colleagues expressed a desire to ride the storm out: make temporary changes to how we operate and wait for our budgets to return to normal. This was usually accompanied by a wistful desire that we could somehow persuade legislators of our deep, abiding value and thus convince them to allocate more dollars to the university or, failing that, that new legislators in some future legislature would have different priorities.

Needless to say, the good old days never returned, and our budget remained on a downward slide that began in the late 1990s. This particular form of optimism was really avoidance of reality, and it led to many people living in a state of disappointment and discomfort for years. Fortunately, over the last five or ten years, most everyone has come to realize that what we have now is normal and has begun to plan accordingly. It is psychologically powerful to accept reality and begin acting with agency.

As for the changes brought on by the pandemic, I must admit that I am undecided about how much of what has changed over the last few months will be the normal way of the university going forward.

My department colleagues and I have been discussing how the need for separation among students in the classroom affects how we teach. Our campus doesn't have enough big rooms for everyone to move each class into a room with twice the capacity, so most of us are looking at ways to teach hybrid classes, with only half of our students in the classroom with us on any given day. This makes most of us sad and even a little depressed: how can we teach our courses as well as we always have in the past when new constraints don't allow us to do what we have optimized our teaching to do?

I have started thinking of the coming year in terms of hill climbing, an old idea from AI. After years of hard work and practice, most of us are at a local maximum in our teaching. The pandemic has disoriented us by dropping us at a random point in the environment. The downside of change in position is that we are no longer at our locally-optimal point for teaching our courses. The upside is that we get to search again under new conditions. Perhaps we can find a new local maximum, perhaps even one higher than our old max. If not, at least we have conducted a valuable experiment under trying conditions and can use what we learn going forward.
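
For the curious, here is the idea as code: a minimal random-restart hill climber in Python. The one-dimensional landscape and the neighbor function are toys I made up for illustration, not a model of teaching.

    import math, random

    def hill_climb(quality, start, neighbors, max_steps=1000):
        """Climb from a starting point until no neighbor is better."""
        current = start
        for _ in range(max_steps):
            candidate = max(neighbors(current), key=quality)
            if quality(candidate) <= quality(current):
                break                   # stuck at a local maximum
            current = candidate
        return current

    # A toy one-dimensional landscape with two peaks, one higher than the other.
    quality   = lambda x: x * math.sin(x)
    neighbors = lambda x: [x - 0.05, x + 0.05]

    # Being dropped at a random point is just a new starting position;
    # searching again from there may find a higher peak than the old one.
    starts = [random.uniform(0.0, 10.0) for _ in range(5)]
    best = max((hill_climb(quality, s, neighbors) for s in starts), key=quality)
    print(round(best, 2), round(quality(best), 2))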

This analogy helps me approach my new course with more positive energy. A couple of my colleagues tell me it has helped them, too.

As many others have noted, the COVID-19 crisis has accelerated a few changes that were already taking place in our universities, in particular in the use of digital technology to engage students and to replace older processes. Of the other changes we've seen, some will certainly stick, but I'm not sure anyone really knows which ones. Part of the key to living with the uncertainty is not to tie ourselves too closely to what we did before.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading, Teaching and Learning

May 22, 2020 3:34 PM

What Good Can Come From All This?

Jerry Seinfeld:

"What am I really sick of?" is where innovation begins.

Steve Wozniak:

For a lot of entrepreneurs, they see something and they say, "I have to have this," and that will start them building their own.

Morgan Housel:

Necessity is the mother of invention, so our willingness to solve problems is about to surge.

A lot of people are facing a lot of different stresses right now, with the prospect that many of those stresses will continue on into the foreseeable future. For instance, I know a lot of CS faculty who are looking at online instruction and remote learning much more carefully now that they may be doing it again in the fall. Many of us have some things to learn, and some real problems need to be solved.

"What am I really sick of?" can turn the dial up on our willingness to solve problems that have been lingering in the background for a while. Let's hope that some good can come from the disruption.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

April 08, 2020 2:42 PM

Three Quotes on Human Behavior

2019, Robin Sloan:

On the internet, if you stop speaking: you disappear. And, by corollary: on the internet, you only notice the people who are speaking nonstop.

Some of the people speaking nonstop are the ones I wish would disappear for a while.

~~~~~

1947, from Italy's Response to the Coronavirus:

Published in 1947, The Plague has often been read as an allegory, a book that is really about the occupation of France, say, or the human condition. But it's also a very good book about plagues, and about how people react to them -- a whole category of human behavior that we have forgotten.

A good book is good on multiple levels.

~~~~~

1628, William Harvey, "On the Motion of the Heart and Blood in Animals":

Doctrine, once sown, strikes deep its root, and respect for antiquity influences all men. Still the die is cast, and my trust is in my love of the truth and the candour of cultivated minds.

I don't know why, but the phrase "the candor of cultivated minds" really stuck with me when I read it this week.


Posted by Eugene Wallingford | Permalink | Categories: General

April 06, 2020 1:57 PM

Arithmetic is Fundamental

From a September 2009 edition of Scientific American, in a research report titled "Animals by the Numbers":

Recent studies, however, have uncovered new instances of a counting skill in different species, suggesting that mathematical abilities could be more fundamental in biology than previously thought. Under certain conditions, monkeys could sometimes outperform college students.

Having watched college students attempt to convert base 10 to base 2 using a standard algorithm, I am not surprised.

One animal, rewarded with Kool-Aid, was 10 to 20 percent less accurate than college students but beat them in reaction time. "The monkeys didn't mind missing every once in a while," Cantlon recounts. "It wants to get past the mistake and on to the next problem where it can get more Kool-Aid, whereas college students can't shake their worry over guessing wrong."

Well, that changes things a bit. Our education system trains a willingness to fail out of our students. Animals face different kinds of social pressure.

That said, 10-20 percent less accurate is only a letter grade or two on many grading scales. Not too bad for our monkey friends, and they get some Kool-Aid to boot.

My wife was helping someone clean out their house and brought home a bunch of old Scientific Americans. I've had a good time browsing through the articles and seeing what people were thinking and saying a decade ago. The September 2009 issue was about the origins of ideas and products, including the mind. Fun reading.


Posted by Eugene Wallingford | Permalink | Categories: General

March 15, 2020 9:35 AM

Things I've Been Reading

This was a weird week. It started with preparations for spring break and an eye on the news. It turned almost immediately into preparations for at least two weeks of online courses and a campus on partial hiatus. Of course, we don't know how the COVID-19 outbreak will develop over the next three weeks, so we may be facing the remaining seven weeks of spring semester online, with students at a distance.

Here are three pieces that helped me get through the week.

Even If You Believe

From When Bloom Filters Don't Bloom:

Advanced data structures are very interesting, but beware. Modern computers require cache-optimized algorithms. When working with large datasets that do not fit in L3, prefer optimizing for a reduced number of loads over optimizing the amount of memory used.

I've always liked the Bloom filter. It seems such an elegant idea. But then I've never used one in a setting where performance mattered. It still surprises me how well current architectures and compilers optimize performance for us in ways that our own efforts can only frustrate. The article is also worth reading for its link to a nice visualization of the interplay among the parameters of a Bloom Filter. That will make a good project in a future class.
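
For anyone who hasn't played with one, here is a minimal sketch in Python, using the standard formulas that relate the filter's parameters: expected items n, target false-positive rate p, bit-array size m, and number of hash functions k. The double-hashing scheme is my own quick-and-dirty choice, not anything from the article.

    import math, hashlib

    class BloomFilter:
        def __init__(self, n, p):
            # Standard sizing: m bits and k hash functions give roughly
            # a false-positive rate of p after n insertions.
            self.m = math.ceil(-n * math.log(p) / (math.log(2) ** 2))
            self.k = max(1, round((self.m / n) * math.log(2)))
            self.bits = bytearray((self.m + 7) // 8)

        def _positions(self, item):
            # Double hashing: derive k bit positions from two base digests.
            data = str(item).encode()
            h1 = int(hashlib.md5(data).hexdigest(), 16)
            h2 = int(hashlib.sha1(data).hexdigest(), 16)
            return [(h1 + i * h2) % self.m for i in range(self.k)]

        def add(self, item):
            for pos in self._positions(item):
                self.bits[pos // 8] |= 1 << (pos % 8)

        def __contains__(self, item):
            return all(self.bits[pos // 8] & (1 << (pos % 8))
                       for pos in self._positions(item))

    bf = BloomFilter(n=1000, p=0.01)      # about 9,600 bits and 7 hash functions
    bf.add("bloom")
    print("bloom" in bf, "filter" in bf)  # True False (almost certainly)

Playing with n and p in a sketch like this is a poor man's version of the visualization linked in the article.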

Even If You Don't Believe

From one of Tyler Cowen's long interviews:

Niels Bohr had a horseshoe at his country house across the entrance door, a superstitious item, and a friend asked him, "Why do you have it there? Aren't you a scientist? Do you believe in it?" You know what was Bohr's answer? "Of course I don't believe in it, but I have it there because I was told that it works, even if you don't believe in it."

You don't have to believe in good luck to have good luck.

You Gotta Believe

From Larry Tesler's annotated manual for the PUB document compiler:

In 1970, I became disillusioned with the slow pace of artificial intelligence research.

The commentary on the manual is like a mini-memoir. Tesler writes that he went back to the Stanford AI lab in the spring of 1971. John McCarthy sent him to work with Les Earnest, the lab's chief administrator, who had an idea for a "document compiler", à la RUNOFF, for technical manuals. Tesler had bigger ideas, but he implemented PUB as a learning exercise. Soon PUB had users, who identified shortcomings that were in sync with Tesler's own ideas.

The solution I favored was what we would now call a WYSIWYG interactive text editing and page layout system. I felt that, if the effect of any change was immediately apparent, users would feel more in control. I soon left Stanford to pursue my dream at Xerox PARC (1973-80) and Apple Computer (1980-1997).

Thus began the shift to desktop publishing. And here I sit, in 2020, editing this post using emacs.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

February 10, 2020 2:37 PM

Some Things I Read Recently

Campaign Security is a Wood Chipper for Your Hopes and Dreams

Practical campaign security is a wood chipper for your hopes and dreams. It sits at the intersection of 19 kinds of status quo, each more odious than the last. You have to accept the fact that computers are broken, software is terrible, campaign finance is evil, the political parties are inept, the DCCC exists, politics is full of parasites, tech companies are run by arrogant man-children, and so on.

This piece from last year has some good advice, plenty of sarcastic humor from Maciej, and one remark that was especially timely for the past week:

You will fare especially badly if you have written an app to fix politics. Put the app away and never speak of it again.

Know the Difference Between Neurosis and Process

In a conversation between Tom Waits and Elvis Costello from the late 1980s, Waits talks about tinkering too long with a song:

TOM: "You have to know the difference between neurosis and actual process, 'cause if you're left with it in your hands for too long, you may unravel everything. You may end up with absolutely nothing."

In software, when we keep code in our hands for too long, we usually end up with an over-engineered, over-abstracted boat anchor. Let the tests tell you when you are done, then stop.

Sometimes, Work is Work

People say, "if you love what you do you'll never work a day in your life." I think good work can be painful--I think sometimes it feels exactly like work.

Some weeks more than others. Trust me. That's okay. You can still love what you do.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Software Development

January 26, 2020 10:23 AM

The Narrative Impulse

Maybe people don't tell stories only to make sense of the world, but rather sometimes to deceive themselves?

It was an interesting idea, I said, that the narrative impulse might spring from the desire to avoid guilt, rather than from the need -- as was generally assumed -- to connect things together in a meaningful way; that it was a strategy calculated, in other words, to disburden ourselves of responsibility.

This is from Kudos, by Rachel Cusk. Kudos is the third book in an unconventional trilogy, following Outline and Transit. I blogged on a passage from Transit last semester, about making something that is part of who you are.

I have wanted to recommend Cusk and these books, but I do not feel up to the task of describing how or why I think so highly of them. They are unorthodox narratives about narrative. To me, Cusk is a mesmerizing story-teller who intertwines stories about people and their lives with the fabric of story-telling itself. She seems to value the stories we tell about ourselves, and yet see through them, to some overarching truth.

As for my own narrative impulse, I think of myself as writing posts for this blog in order to make connections among the many things I learn -- or at least that is what I tell myself. Cusk has me taking seriously the idea that some of the stories I tell may come from somewhere else.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

January 06, 2020 3:13 PM

A Writing Game

I recently started reading posts in the archives of Jason Zweig's blog. He writes about finance for a living but blogs more widely, including quite a bit about writing itself. An article called On Writing Better: Sharpening Your Tools challenges writers to look at each word they write as "an alien object":

As the great Viennese journalist Karl Kraus wrote, "The closer one looks at a word, the farther away it moves." Your goal should be to treat every word you write as an alien object: You should be able to look at it and say, What is that doing here? Why did I use that word instead of a better one? What am I trying to say here? How can I get to where I'm going if I use such stale and lifeless words?

My mind immediately turned this into a writing game, an exercise that puts the idea into practice. Take any piece of writing.

  1. Choose a random word in the document.
  2. Change the word -- or delete it! -- in a way that improves the text.
  3. Go to 1.

Play the game for a fixed number of rounds or for a fixed period of time. A devilish alternative is to play until you get so frustrated with your writing that you can't continue. You could then judge your maturity as a writer by how long you can play in good spirits.

We could even automate the mechanics of the game by writing a program that chooses a random word in a document for us. Every time we save the document after a change, it jumps to a new word.
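
The mechanics are only a few lines of Python; the filename is whatever draft you happen to be working on:

    import random, re, sys

    def random_word(filename):
        """Pick one random word from a document, for the writing game."""
        with open(filename) as f:
            words = re.findall(r"[A-Za-z']+", f.read())
        return random.choice(words) if words else None

    if __name__ == "__main__":
        print(random_word(sys.argv[1]))   # e.g., python writing_game.py draft.txt

Wiring it to run on every save is an editor-specific detail, an after-save hook in Emacs, say, or a simple file watcher.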

As with most first ideas, this one can probably be improved. Perhaps we should bias word selection toward words whose replacement or deletion is most likely to improve our writing. Changing "the" or "to" doesn't offer the same payoff as changing a lazy verb or deleting an abstract adverb. Or does it? I have a lot of room to improve as a writer; maybe fixing some "the"s and "to"s is exactly what I need to do. The Three Bears pattern suggests that we might learn something by tackling the extreme form of the challenge and seeing where it leads us.

Changing or deleting a single word can improve a piece of text, but there is bigger payoff available, if we consider the selected word in context. The best way to eliminate many vague nouns is to turn them back into verbs, where they act with vigor. To do that, we will have to change the structure of the sentence, and maybe the surrounding sentences. That forces us to think even more deeply about the text than changing a lone word. It also creates more words for us to fix in following rounds!

I like programming challenges of this sort. A writing challenge that constrains me in arbitrary ways might be just what I need to take time more often to improve my work. It might help me identify and break some bad habits along the way. Maybe I'll give this a try and report back. If you try it, please let me know the results!

And no, I did not play the game with this post. It can surely be improved.

Postscript. After drafting this post, I came across another article by Zweig that proposes just such a challenge for the narrower case of abstract adverbs:

The only way to see if a word is indispensable is to eliminate it and see whether you miss it. Try this exercise yourself:
  • Take any sentence containing "actually" or "literally" or any other abstract adverb, written by anyone ever.
  • Delete that adverb.
  • See if the sentence loses one iota of force or meaning.
  • I'd be amazed if it does (if so, please let me know).

We can specialize the writing game to focus on adverbs, another part of speech, or almost any writing weakness. The possibilities...


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns, Teaching and Learning

November 25, 2019 6:06 PM

Demonstrating That Two Infinities Are Equal

I remember first learning as a student that some infinities are bigger than others. For some sets of numbers, it was easy to see how. The set of integers is infinite, and the set of real numbers is infinite, and it seemed immediately clear that there are fewer integers than reals. Demonstrations and proofs of the fact were cool, but I already knew what they showed me.

Other relationships between infinities were not so easy to grok. Consider: There are infinitely many points on a sheet of paper. There are infinitely many points on a wall. These infinities are equal to one another. But how? Mathematician Yuri Manin demonstrates how:

I explained this to my grandson, that there are as many points in a sheet of paper as there are on the wall of the room. "Take the sheet of paper, and hold it so that it blocks your view of the wall completely. The paper hides the wall from your sight. Now if a beam of light comes out of every point on the wall and lands in your eye, it must pass through the sheet of paper. Each point on the wall corresponds to a point on the sheet of paper, so there must be the same number of each."

I remember reading that explanation in school and feeling both amazed and enlightened. What sorcery is this? So simple, so beautiful. Informal proofs of this sort made me want to learn more mathematics.
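
For the record, Manin's picture is just central projection, and it pins down a bijection in one line. Model the eye as a single point O (a simplification, obviously), with the sheet of paper P held between O and the wall W:

    \[
      f : W \to P, \qquad f(w) = O + t_w\,(w - O), \qquad t_w \in (0,1) \text{ chosen so that } f(w) \in P
    \]

Distinct points on the wall lie on distinct rays through O, so f is one-to-one; because the paper blocks the wall completely, every point of the paper lies on a ray from O that continues on to some point of the wall, so f is onto. A bijection between the two sets means they have the same number of points, just as Manin says.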

Manin told the story quoted above in an interview a decade or so ago with Mikhail Gelfand, We Do Not Choose Mathematics as Our Profession, It Chooses Us. It was a good read throughout and reminded me again how I came to enjoy math.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

October 30, 2019 3:30 PM

A Few Ideas from Economist Peter Bernstein

I found all kinds of wisdom in this interview with economist Peter Bernstein. It was originally published in 2004 and then updated online a couple of years ago. A lot of the wisdom sounds familiar, as most general wisdom does, but occasionally Bernstein offers a twist. For instance, I like this passage:

I make no excuses or apologies for changing my mind. The world around me changes, for one thing, but also I am continuously learning. I have never finished my education and probably never will.... I'm always telling myself, "I must sit down and explain why I said this, and why I was wrong."

People often speak of the virtue of changing our minds, but Bernstein goes further: he feels a need to explain both the reason he thought what he did and the reason he was wrong. That sort of post-mortem can be immensely helpful to the rest of us as we try to learn, and the humility of explaining the error keeps us all better grounded.

I found quotable passages on almost every page. One quoted Leibniz, which I paraphrased as:

von Leibniz told Bernoulli that nature works in patterns, but "only for the most part". The other part -- the unpredictable part -- tends to be where the action is.

Poking around the fringes of a model that is pretty good or a pattern of thought that only occasionally fails us often brings surprising opportunities for advancement.

Many of Bernstein's ideas were framed specifically as about investing, of course, such as:

The riskiest moment is when you're right. That's when you're in the most trouble, because you tend to overstay the good decisions.

and:

Diversification is not only a survival strategy but also an aggressive strategy, because the next windfall might come from a surprising place.

These ideas are powerful outside the financial world, too, though. Investing too much importance in a productive research area can be risky because it becomes easy to stay there too long after the world starts to move away. Diversifying our programming language skills and toolsets might look like a conservative strategy that limits rapid advance in a research niche right now, but it also equips us to adapt more quickly when the next big idea happens somewhere we don't expect.

Anyway, the interview is a good long-but-quick read. There's plenty more to consider, in particular his application of Pascal's wager to general decision making. Give it a read if it sounds interesting.


Posted by Eugene Wallingford | Permalink | Categories: General

October 27, 2019 10:23 AM

Making Something That Is Part Of Who You Are

The narrator in Rachel Cusk's "Transit" relates a story told to her by Pavel, the Polish builder who is helping to renovate her flat. Pavel left Poland for London to make money after falling out with his father, a builder for whom he worked. The event that prompted his departure was a reaction to a reaction. Pavel had designed and built a home for his family. After finishing, he showed it to his father. His father didn't like it, and said so. Pavel chose to leave at that moment.

'All my life,' he said, 'he criticise. He criticise my work, my idea, he say he don't like the way I talk -- even he criticise my wife and my children. But when he criticise my house' -- Pavel pursed his lips in a smile -- 'then I think, okay, is enough.'

I generally try to separate myself from the code and prose I write. Such distance is good for the soul, which does not need to be buffeted by criticism, whether external or internal, of the things I've created. It is also good for the work itself, which is free to be changed without being anchored to my identity.

Fortunately, I came out of home and school with a decent sense that I could be proud of the things I create without conflating the work with who I am. Participating in writers' workshops at PLoP conferences early in my career taught me some new tools for hearing feedback objectively and focusing on the work. Those same tools help me to give feedback better. I use them in an effort to help my students develop as people, writers and programmers independent of the code and prose they write.

Sometimes, though, we make things that are expressions of ourselves. They carry part of us in their words, in what they say to the world and how they say it. Pavel's house is such a creation. He made everything: the floors, the doors, and the roof; even the beds his children slept in. His father had criticized his work, his ideas, his family before. But criticizing the house he had dreamed and built -- that was enough. Cusk doesn't give the reader a sense that this criticism was a last straw; it was, in a very real way, the only straw that mattered.

I think there are people in this world who would like just once in their lives to make something that is so much a part of who they are that they feel about it as Pavel does his house. They wish to do so despite, or perhaps because of, the sharp line it would draw through the center of life.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

September 14, 2019 2:56 PM

Listen Now

In a YC Female Founder Story, Danielle Morrill gives a wise answer to an old question:

Q: What do you wish someone had told you when you were 15?
I think people were telling me a lot of helpful things when I was 15 but it was very hard to listen.

This may seem more like a wry observation than a useful bit of wisdom. The fifteen-year-olds of today are no more likely to listen to us than we were to listen to adults when we were fifteen. But that presumes young people have more to learn than the rest of us. I'm a lot older than 15, and I still have plenty to learn.

Morrill's answer is a reminder to me to listen more carefully to what people are telling me now. Even now that can be hard, with all the noise out there and with my own ego getting in my way. Setting up my attention systems to identify valuable signals more reliably can help me learn faster and make me a lot more productive. It can also help future-me not want to look back wistfully so often, wishing someone had told me now what I know then.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

August 02, 2019 2:48 PM

Programming is an Infinite Construction Kit

As he so often did, Marvin Minsky loved to tell us about the beauty of programming. Kids love to play with construction sets like Legos, TinkerToys, and Erector sets. Programming provides an infinite construction kit: you never run out of parts!

In the linked essay, which was published as a preface to a 1986 book about Logo, Minsky tells several stories. One of the stories relates that once, as a small child, he built a large tower out of TinkerToys. The grownups who saw it were "terribly impressed". He inferred from their reaction that:

some adults just can't understand how you can build whatever you want, so long as you don't run out of sticks and spools.

Kids get it, though. Why do so many of us grow out of this simple understanding as we get older? Whatever its cause, this gap between children's imaginations and the imaginations of adults around them creates a new sort of problem when we give the children a programming language such as Logo or Scratch. Many kids take to these languages just as they do to Legos and TinkerToys: they're off to the races making things, limited only by their expansive imaginations. The memory on today's computers is so large that children never run out of raw material for writing programs. But adults often don't possess the vocabulary for talking with the children about their creations!

... many adults just don't have words to talk about such things -- and maybe, no procedures in their heads to help them think of them. They just do not know what to think when little kids converse about "representations" and "simulations" and "recursive procedures". Be tolerant. Adults have enough problems of their own.

Minsky thinks there are a few key ideas that everyone should know about computation. He highlights two:

Computer programs are societies. Making a big computer program is putting together little programs.

Any computer can be programmed to do anything that any other computer can do--or that any other kind of "society of processes" can do.

He explains the second using ideas pioneered by Alan Turing and long championed in the popular sphere by Douglas Hofstadter. Check out this blog post, which reflects on a talk Hofstadter gave at my university celebrating the Turing centennial.

The inability of even educated adults to appreciate computing is a symptom of a more general problem. As Minsky says toward the end of his essay, "People who don't appreciate how simple things can grow into entire worlds are missing something important." If you don't understand how simple things can grow into complex systems, it's hard to understand much at all about modern science, including how quantum mechanics accounts for what we see in the world and even how evolution works.

You can usually do well by reading Minsky; this essay is a fine example of that. It comes linked to an afterword written by Alan Kay, another computer scientist with a lot to say about both the beauty of computing and its essential role in a modern understanding of the world. Check both out.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

July 30, 2019 3:27 PM

"Eugene-Past Knew Things That Eugene-Present Does Not"

A few months back, Mark Guzdial began to ponder a new research question:

I did some literature searches, and found a highly relevant paper: "Task specific programming languages as a first programming language." And the lead author is... me. I wrote this paper with Allison Elliott Tew and Mike McCracken, and published it in 1997. I honestly completely forgot that I had written this paper 22 years ago. Guzdial-past knew things that Guzdial-present does not.

I know this feeling too well. It seems that whenever I look back at an old blog post, especially from the early years, I am surprised to have already thought something, and usually to have thought it better and more deeply than I'm thinking it now! Perhaps this says something about the quality of my thinking now, or the quality of my blogging then. Or maybe it's simply an artifact of time and memory. In any case, stumbling across a link to an ancient blog entry often leads to a few moments of pleasure after an initial bit of disorientation.

On a related note, the fifteenth anniversary of my first blog post passed while I was at Dagstuhl earlier this month. For the first few years, I regularly wrote twelve to twenty posts a month. Then for a few years I settled into a pattern of ten to twelve monthly. Since early 2017, though, I've been in the single digits, with fewer substantial entries. I'm not giving Eugene-2025 much material to look back on.

With a new academic year soon upon us, I hope to write a bit more frequently and a bit more in depth about my programming, my teaching, and my encounters with computer science and the world. I think that will be good for me in many ways. Sometimes, knowing that I will write something encourages me to engage more deeply than I might otherwise. Nearly every time, the writing helps me to make better sense of the encounter. That's one way to make Eugene-Present a little smarter.

As always, I hope that whoever is still reading here finds it worth their time, too.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

July 05, 2019 12:40 PM

A Very Good Reason to Leave Your Home and Move to a New Country

He applied to switch his major from mathematics to computer science, but the authorities forbade it. "That is what tipped me to accept the idea that perhaps Russia is not the best place for me," he says. "When they wouldn't allow me to study computer science."

-- Sergey Aleynikov, as told to Michael Lewis and reported in Chapter 5 of Flash Boys.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 01, 2019 11:59 AM

Wandering the Stacks

In You Are Here, Ben Hunt writes:

You know what I miss most about the world before Amazon? I miss going to the library and looking up a book in the card catalog, searching the stacks for the book in question, and then losing myself in the experience of discovery AROUND the book I was originally searching for. It's one of the best feelings in the world, and I'm not sure that my children have ever felt it. I haven't felt it in at least 20 years.

My daughters, now in their mid-20s, have felt it. We were a library family, not a bookstore family or an Amazon family. Beginning as soon as they could follow picture books, we spent countless hours at the public library in our town and the one in the neighboring city. We took the girls to Story Time and to other activities, but mostly we went to read and wander and select a big stack of books to take home. The books we took home never lasted as long as we thought they would, so back we'd go.

I still wander the stacks myself, both at the university library and, less often these days, the local public libraries. I always start with a few books in mind, recommendations gathered from friends and articles I've read, but I usually bring home an unexpected bounty. Every year I find a real surprise or two, books I love but would never have known about if I hadn't let myself browse. Even when I don't find anything surprising to take home, it's worth the time I spend just wandering.

Writing a little code often makes my day better. So does going to the library. Walking among books, starting with a goal and then aimlessly browsing, calms me on days I need calming and invigorates me on days when my energy is down. Some days, it does both at the same time. Hunt is right: It's one of the best feelings in the world. I hope that whatever else modern technology does for our children, it gives them something to rival this feeling.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

June 28, 2019 3:39 PM

Another Peter Principle-Like Observation

Raganwald tweeted:

If you design a language for people who have a talent for managing accidental complexity, you'll beget more and more accidental complexity over time.
Someone who can manage accidental complexity will always take on more if it makes them more productive.

This reminded me of a blog post from last November in which I half-jokingly coined The Peter Principle of Software Growth:

Software grows until it exceeds our capacity to understand it.

In the case of Raganwald's tweet, languages that enable us to handle accidental complexity well lead to more accidental complexity, because the people who use them will be more ambitious -- until they reach their saturation point. Both of these observations about software resemble the original Peter Principle, in which people who succeed are promoted until they reach a point at which they can't, or don't, succeed.

I am happy to dub Raganwald's observation "The Peter Principle of Accidental Complexity", but after three examples, I begin to recognize a pattern... Is there a general name for this phenomenon, in which successful actors advance or evolve naturally until they reach a point at which they can't, or don't, succeed?

If you have any ideas, please email me or respond on Twitter.

In a playful mood at the end of a strange and hectic week, I am now wondering whether there is a Peter Principle of Peter Principles.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

June 18, 2019 3:09 PM

Notations, Representations, and Names

In The Power of Simple Representations, Keith Devlin takes on a quote attributed to the mathematician Gauss: "What we need are notions, not notations."

While most mathematicians would agree that Gauss was correct in pointing out that concepts, not symbol manipulation, are at the heart of mathematics, his words do have to be properly interpreted. While a notation does not matter, a representation can make a huge difference.

Spot on. Devlin's opening made me think of that short video of Richard Feynman that everyone always shares, on the difference between knowing the name of something and knowing something. I've seen people mis-interpret Feynman's words in both directions. The people who share this video sometimes seem to imply that names don't matter. Others dismiss the idea as nonsense: how can you not know the names of things and claim to know anything?

Devlin's distinction makes clear the sense in which Feynman is right. Names are like notations. The specific names we use don't really matter and could be changed, if we all agreed. But the "if we all agreed" part is crucial. Names do matter as a part of a larger model, a representation of the world that relates different ideas. Names are an index into the model. We need to know them so that we can speak with others, read their literature, and learn from them.

This brings to mind an article with a specific example of the importance of using the correct name: Through the Looking Glass, or ... This is the Red Pill, by Ben Hunt at Epsilon Theory:

I'm a big believer in calling things by their proper names. Why? Because if you make the mistake of conflating instability with volatility, and then you try to hedge your portfolio today with volatility "protection" ...., you are throwing your money away.

Calling a problem by the wrong name might lead you to the wrong remedy.

Feynman isn't telling us that names don't matter. He's telling us that knowing only names isn't valuable. Names are not useful outside the web of knowledge in which they mean something. As long as we interpret his words properly, they teach us something useful.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

April 28, 2019 10:37 AM

The Smart Already Know They Are Lucky

Writes Matthew Butterick:

As someone who had a good run in the tech world, I buy the theory that the main reason successful tech founders start another company is to find out if they were smart or merely lucky the first time. Of course, the smart already know they were also lucky, so further evidence is unnecessary. It's only the lucky who want proof they were smart.

From a previous update to The Billionaire's Typewriter, recently updated again. I'm not sure this is the main reason that most successful tech founders start another company -- I suspect that many are simply ambitious and driven -- but I do believe that most successful people are lucky many times over, and that the self-aware among them know it.


Posted by Eugene Wallingford | Permalink | Categories: General

March 31, 2019 4:07 PM

Writing Advice to the Aspiring Kurt Vonnegut

In the fall of 1945, Kurt Vonnegut was serving out the last few months of his military commitment after returning home from Dresden. During the day, he did paperwork in the secretarial pool, and at night he wrote stories in the hopes of making a living as a writer when he left the service. One day his wife, Jane, sent four of his stories to one of those agents who used to advertise in magazines and promise to help frustrated writers get into the business. Her cover letter touted Kurt's desire, ambition, and potential.

The agent wrote back with clear-eyed advice for an aspiring professional writer:

You say you think that Kurt is a potential Chekhov. To this I fervently reply "Heaven Save Him!" This is a very revealing statement. I'm glad you made it. I hope the virus has not become so entrenched that it can't be driven out of his system. I recognize the symptoms of a widely prevailing ailment.... Read Chekhov and enjoy him, yes, and all of the other great and inspiring ones, but don't encourage Kurt, or anybody else, to try to write like them. If you want to sell in the current market, you have got to write "current literature". I warmly applaud Kurt's desire to "say something" that will have some influence, however small, that will do something to help uplift humanity. Every writer worth a hoot has ambition. But don't think that it can't be done in terms of current fiction.... So then, what it adds up to or boils down to is this: you have got to master the current technique if you want acceptance for anything, good or drivel, in the current market. The "message to humanity" is a by-product: it always has been.... If you want to make a living writing you will first of all write to entertain, to divert, to amuse. And that in itself is a noble aim.

What a generous response. I don't know if he responded this way to everyone who contacted him, or if he saw something special in Jane Vonnegut's letter. But this doesn't feel like a generic form letter.

It's easy to idealize classic works of art and the writers, poets, and playwrights who created them. We forget sometimes that they were writing for an audience in their own time, sometimes a popular one, and that most often they were using the styles and techniques that connected with the people. Shakespeare and Mozart -- and Chekhov -- made great art and pushed boundaries, but they did so in their "current market". They entertained and amused those who saw performances of their works. And that's more than just okay; it, too, is a noble aim.

I found this story early in Charles Shields's And So It Goes. Shields met Vonnegut in the last year of his life and received his blessing to write the definitive biography of his life. It's not a perfect book, but it's easy to read and contains a boatload of information. I'm not sure why I'm just now getting around to reading it.


Posted by Eugene Wallingford | Permalink | Categories: General

February 28, 2019 4:29 PM

Ubiquitous Distraction

This morning, while riding the exercise bike, I read two items within twenty minutes or so that formed a nice juxtaposition for our age. First came The Cost of Distraction, an old blog post by L.M. Sacasas that reconsiders Kurt Vonnegut's classic story, "Harrison Bergeron" (*). In the story, it is 2081, and the Handicapper General of the United States ensures equality across the land by offsetting any advantages any individual has over the rest of the citizenry. In particular, those of above-average intelligence are required to wear little earpieces that periodically emit high-pitched sounds to obliterate any thoughts in progress. The mentally- and physically-gifted Harrison rebels, to an ugly end.

Soon after came Ian Bogost's Apple's AirPods Are an Omen, an article from last year that explores the cultural changes that are likely to ensue as more and more people wear AirPods and their ilk. ("Apple's most successful products have always done far more than just make money, even if they've raked in a lot of it....") AirPods free the wearer in so many ways, but they also bind us to ubiquitous distraction. Will we ever have a free moment to think deeply when our phones and laptops now reside in our heads?

As Sacasas says near the end of his post,

In the world of 2081 imagined by Vonnegut, the distracting technology is ruthlessly imposed by a government agency. We, however, have more or less happily assimilated ourselves to a way of life that provides us with regular and constant distraction. We have done so because we tend to see our tools as enhancements.

Who needs a Handicapper General when we all walk down to the nearest Apple Store or Best Buy and pop distraction devices into our own ears?

Don't get me wrong. I'm a computer scientist, and I love to program. I also love the productivity my digital tools provide me, as well as the pleasure and comfort they afford. I'm not opposed to AirPods, and I may be tempted to get a pair someday. But there's a reason I don't carry a smart phone and that the only iPod I've ever owned is a 1GB first-gen Shuffle. Downtime is valuable, too.

(*) By now, even occasional readers know that I'm a big Vonnegut fan who wrote a short eulogy on the occasion of his death, nearly named this blog after one of his short stories, and returns to him frequently.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

January 29, 2019 1:46 PM

Dependencies and Customizable Books

Shriram Krishnamurthi, in Books as Software:

I have said that a book is a collection of components. I have concrete evidence that some of my users specifically excerpt sections that suit their purpose. ...

I forecast that one day, rich document formats like PDF will recognize this reality and permit precisely such specifications. Then, when a user selects a group of desired chapters to generate a thinner volume, the software will automatically evaluate constraints and include all dependencies. To enable this we will even need "program" analyses that help us find all the dependencies, using textual concordances as a starting point and the index as an auxiliary data structure.

I am one of the users Krishnamurthi speaks of, who has excerpted sections from his Programming Languages: Application and Interpretation to suit the purposes of my course. Though I've not written a book, I do post, use, adapt, and reuse detailed lecture notes for my courses, and as a result I have seen both sides of the divide he discusses. I occasionally change the order of topics in a course, or add a unit, or drop a unit. An unseen bit of work is to account for the dependencies among concepts, examples, problems, and code in the affected sections, but also in the new whole. My life is simpler than that of book writers, who have to deal at least in part with rich document formats: I do everything in a small, old-style subset of HTML, which means I can use simple text-based tools for manipulating everything. But dependencies? Yeesh.

Maybe I need to write a big makefile for my course notes. Alas, that would not help me manage dependencies in the way I'd like, or in the way Krishnamurthi forecasts. As such, it would probably make things worse. I suppose that I could create the tool I need.
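
A first cut at the tool would not be hard, though. My notes are plain HTML, so even a short script that follows the href links between session files and computes the closure of everything a selected session depends on would be a start. A sketch in Python, with made-up filenames and the assumption that all of the notes live in one directory:

    import re, sys
    from pathlib import Path

    LINK = re.compile(r'href="([^"#]+\.html)"')

    def dependencies(page, seen=None):
        """Transitively collect the local pages that a page links to."""
        seen = seen if seen is not None else set()
        if page in seen or not Path(page).exists():
            return seen
        seen.add(page)
        for target in LINK.findall(Path(page).read_text()):
            dependencies(target, seen)
        return seen

    if __name__ == "__main__":
        for page in sorted(dependencies(sys.argv[1])):   # e.g., session12.html
            print(page)

Textual links are only a crude stand-in for the conceptual dependencies Krishnamurthi has in mind, but they would catch many of the omissions that bite me when I reorder a course.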


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

December 31, 2018 1:44 PM

Preserve Process Knowledge

This weekend I read the beginning of Dan Wang's How Technology Grows. One of the themes he presses is that when a country loses its manufacturing base, it also loses its manufacturing knowledge base. This in turn damages the economy's ability to innovate in manufacturing, even on the IT front. He concludes:

It can't be an accident that the countries with the healthiest communities of engineering practice are also in the lead in designing tools for the sector. They're able to embed knowledge into new tools, because they never lost the process knowledge in the first place.
Let's try to preserve process knowledge.

I have seen what happens within an academic department or a university IT unit when it loses process knowledge it once had. Sometimes, the world has changed in a way that makes the knowledge no longer valuable, and the loss is simply part of the organization's natural evolution. But other times the change that precipitated the move away from expertise is temporary or illusory, and the group suddenly finds itself unable to adapt to other changes in the environment.

The portion of the article I read covered a lot of ground. For example, one reason that a manufacturing base matters so much is that services industries have inherent limits, summarized in:

[The] services sector [has] big problems: a lot of it is winner-take-all, and much of the rest is zero-sum.

This longer quote ends a section in which Wang compares the economies of manufacturing-focused Germany and the IT-focused United States:

The US and Germany are innovative in different ways, and they each have big flaws. I hope they fix these flaws. I believe that we can have a country in which wealth is primarily created by new economic activity, instead of by inheritance; which builds new housing stock, instead of permitting current residents to veto construction; which has a government willing to think hard about new projects that it should initiate, instead of letting the budget run on autopilot. I don't think that we should have to choose between industry and the internet; we can have a country that has both a vibrant industrial sector and a thriving internet sector.

This paragraph is a good example of the paper's sub-title, "a restatement of definite optimism". Wang writes clearly and discusses a number of issues relevant to IT as the base for a nation's economy. How Technology Grows is an interesting read.


Posted by Eugene Wallingford | Permalink | Categories: General

December 26, 2018 2:44 PM

It's Okay To Say, "I Don't Know." Even Nobel Laureates Do It.

I ran across two great examples of humility by Nobel Prize-winning economists in recent conversations with Tyler Cowen. When asked, "Should China and Japan move to romanized script?", Paul Romer said:

I basically don't know the answer to that question. But I'll use that as a way to talk about something else ...

Romer could have speculated or pontificated; instead, he acknowledged that he didn't know the answer and pivoted the conversation to a related topic he had thought about (reforming spelling in English, for which he offered an interesting computational solution). By shifting the topic, Romer added value to the conversation without pretending that any answer he could give to the original question would have more value than as speculation.

A couple of months ago, Cowen sat with Paul Krugman. When asked whether he would consider a "single land tax" as a way to encourage a more active and more equitable economy, Krugman responded:

I just haven't done my homework on that.

... and left it there. To his credit, Cowen did not press for an uninformed answer; he moved on to another question.

I love the attitude that Krugman and Romer adopt and really like Krugman's specific answer, which echoed his response to another question earlier in the conversation. We need more people answering questions this way, more often and in more circumstances.

Such restraint is probably even more important in the case of Nobel laureates. If Romer and Krugman choose to speculate on a topic, a lot of people will pay attention, even if it is a topic they know little about. We might learn something from their speculations, but we might also forget that they are only uninformed speculation.

I think what I like best about these answers is the example that Romer and Krugman set for the rest of us: It's okay to say, "I don't know." If you have not done the homework needed to offer an informed answer, it's often best to say so and move on to something you're better prepared to discuss.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

December 24, 2018 2:55 PM

Using a Text Auto-Formatter to Enhance Human Communication

More consonance with Paul Romer, via his conversation with Tyler Cowen: They were discussing how much harder it is to learn to read English than other languages, due to its confusing orthography and in particular the mismatch between sounds and their spellings. We could adopt a more rational way to spell words, but it's hard to change the orthography of a language spoken by a large, scattered population. Romer offered a computational solution:

It would be a trivial translation problem to let some people write in one spelling form, others in the other because it would be word-for-word translation. I could write you an email in rationalized spelling, and I could put it through the plug-in so you get it in traditional spelling. This idea that it's impossible to change spelling I think is wrong. It's just, it's hard, and we should -- if we want to consider this -- we should think carefully about the mechanisms.

This sounds similar to a common problem and solution in the software development world. Programmers working in teams often disagree about the orthography of code, not the spelling so much as its layout, the use of whitespace, and the placement of punctuation. Being programmers, we often address this problem computationally. Team members can stylize their code any way they see fit but, when they check it into the common repository, they run it through a language formatter. Often, these formatters are built into our IDEs. Nowadays, some languages even come with a built-in formatting tool, such as Go with its gofmt.

Romer's email plug-in would play a similar role in human-to-human communication, enabling writers to use different spelling systems concurrently. This would make it possible to introduce a more rational way to spell words without having to migrate everyone to the new system all at once. There are still challenges to making such a big change, but they could be handled in an evolutionary way.
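
The plug-in Romer imagines really is a small program, provided someone maintains the word-for-word table. Here is a sketch in Python; the three sample spellings are invented just to show the shape of the thing, and a real plug-in would also have to preserve capitalization:

    import re

    # A tiny stand-in for a real rationalized-spelling dictionary.
    TO_TRADITIONAL = {"thru": "through", "tho": "though", "enuf": "enough"}

    def translate(text, table):
        """Swap each word for its spelling in the other system, leaving
        punctuation, spacing, and unknown words untouched."""
        def swap(match):
            word = match.group(0)
            return table.get(word.lower(), word)
        return re.sub(r"[A-Za-z']+", swap, text)

    print(translate("Even tho it looks odd, thru traffic is fine.", TO_TRADITIONAL))
    # -> Even though it looks odd, through traffic is fine.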

Maybe Romer's study of Python is turning him into a computationalist! Certainly, being a programmer can help a person recognize the possibility of a computational solution.

Add this idea to his recent discovery of C.S. Peirce, and I am feeling some intellectual kinship to Romer, at least as much as an ordinary CS prof can feel kinship to a Nobel Prize-winning economist. Then, to top it all off, he lists Slaughterhouse-Five as one of his two favorite novels. Long-time readers know I'm a big Vonnegut fan and nearly named this blog for one of his short stories. Between Peirce and Vonnegut, I can at least say that Romer and I share some of the same reading interests. I like his tastes.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 26, 2018 2:04 PM

Self-Help from Hamming

In yesterday's post, I mentioned re-reading Richard Hamming's 1986 talk, You and Your Research. Hamming himself found it useful to manage his own behavior in order to overcome his personal faults, in service of his goal to do great work. I have faults, too, and need occasional reminders to approach my work more intentionally.

I've been at low ebb recently with my own creative work, so there is plenty of low-hanging fruit to be picked after this read. In the short term, I plan to...

  • focus my reading and programming time on material that contributes to specific research and teaching problems I'm working on. In particular, as Hamming says, "you need to keep up more to find out what the problems are than ... to find the solutions" -- then get to work actually solving problems.

  • attend seminars in other departments regularly next semester, especially in our science departments. This action works in the opposite direction from the first bullet, as it broadens my vision beyond my own work. Its benefit is in providing a cross-fertilization of ideas and giving me more chances to converse with smart people outside my area who are solving interesting problems.

I'm also our department head, an administrative role that diverts much of my attention and energy from doing computer science. Hamming doesn't dismiss "management" outright, as so many scientists do. That's heartening, because organizations need good leaders to help create the conditions in which scientists do great work. He even explains why a capable scientist might reasonably choose to become a manager: "The day your vision, what you think needs to be done, is bigger than what you can do single-handedly, then you have to move toward management."

When I became head, I had some ideas about our department that I wanted to help implement from a leadership position. Do I still have such ideas that I need to drive forward? If so, then I need to focus my administrative work on those goals. If not, then I need to think about next steps.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading, Personal

November 17, 2018 4:00 PM

Superior Ideas

In February 1943, while physicist Freeman Dyson was an undergrad at Cambridge, an American friend sent him a copy of Kurt Gödel's "The Consistency of the Continuum Hypothesis". Dyson wrote home about it to his parents:

I have been reading the immortal work (it is only sixty pages long) alternately with The Magic Mountain and find it hard to say which one is better. Mann of course writes better English (or rather the translator does); on the other hand the superiority of the ideas in Gödel just about makes up for that.

Imagine that, only five years later, Dyson would be "drinking tea with Gödel at his home in Princeton". Of course, after having taken classes with the likes of Hardy and Dirac, Dyson was well-prepared. He seems to have found himself surrounded by superior ideas much of his life and, despite his modesty, added a few himself.

I've never read The Magic Mountain, or any Mann, for that matter. I will correct that soon. However, Mann will have to wait until I finish Dyson's Maker of Patterns, in which I found this passage. It is a quite readable memoir that interleaves letters Dyson wrote to his family over the course of thirty-some years with explanatory text and historical asides. I'm glad I picked it up.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

October 21, 2018 9:53 AM

Find the Hard Work You're Willing to Do

I like this passage from John Urschel Goes Pro, about the former NFL player who is pursuing a Ph.D. in math:

The world thinks mathematicians are people for whom math is easy. That's wrong. Sure, some kids, like Urschel, have little trouble with school math. But everyone who starts down the road to creating really new mathematics finds out what Urschel did: It's a struggle. A prickly, sometimes lonely struggle whose rewards are uncertain and a long time coming. Mathematicians are the people who love that struggle.

It's cliché to tell kids to "find their passion". That always seems to me like an awful lot of pressure to put on young adults, let alone teenagers. I meet with potential CS majors frequently, both college students and high school students. Most haven't found their passion yet, and as a result many wonder if there is something wrong with them. I do my best to assure them that, no, there is nothing wrong with them. It's an unreasonable expectation placed on them by a world that, usually with good intentions, is trying to encourage them.

I don't think there is anything I'd rather be than a computer scientist, but I did not walk a straight path to being one. Some choices early on were easy: I like biology as a body of knowledge, but I never liked studying biology. That seemed a decent sign that maybe biology wasn't for me. (High-school me didn't understand that there might be a difference between school biology and being a biologist...) But other choices took time and a little self-awareness.

From the time I was eight years old or so, I wanted to be an architect. I read about architecture; I sent away for professional materials from the American Institute of Architects; I took courses in architectural drafting at my high school. (There was an unexpected benefit to taking those courses: I got to meet a lot of people who were not part of my usual academic crowd.) Then I went off to college to study architecture... and found that, while I liked many things about the field, I didn't really like to do the grunt work that is part of the architecture student's life, and when the assigned projects got more challenging, I didn't really enjoy working on them.

But I had enjoyed working on the hard projects I'd encountered in my programming class back in high school. They were challenges I wanted to overcome. I changed my major and dove into college CS courses, which were full of hard problems -- but hard problems that I wanted to solve. I didn't mind being frustrated for an entire semester one year, working in assembly language and JCL, because I wanted to solve the puzzles.

Maybe this is what people mean when they tell us to "find our passion", but that phrase seems pretty abstract to me. Maybe instead we should encourage people to find the hard problems they like to work on. Which problems do you want to keep working on, even when they turn out to be harder than you expected? Which kinds of frustration do you enjoy, or at least are willing to endure while you figure things out? Answers to these very practical questions might help you find a place where you can build an interesting and rewarding life.

I realize that "Find your passion" makes for a more compelling motivational poster than "What hard problems do you enjoy working on?" (and even that's a lot better than "What kind of pain are you willing to endure?"), but it might give some people a more realistic way to approach finding their life's work.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

October 04, 2018 4:46 PM

Strange Loop 6: Index + This and That

the view from the Stifel Theater

For my convenience and yours, here are all of my Strange Loop 2018 posts:

... and a few parting thoughts of the non-technical variety:

  • All the images used in these posts are photos I took at the conference. They are licensed CC Attribution-ShareAlike 3.0 Unported.

  • On Day One, Jason Dagit kept saying H.E., for "homomorphic encryption". For a while I was confused, because my brain kept hearing A.G.

  • I left my laptop in the hotel room this year, in order to engage more with the talks and the people than with a web browser. I'm glad I did: I enjoyed the talks more. I also took fewer and more focused notes. That made blogging easier and quicker.

  • I also decided not to acquire swag as greedily as usual, and I did a pretty good job of holding back... except for this beautiful hard-bound Jane Street notebook with graphed pages:
    swag from Jane Street Capital
    "Enter the Monad." Very nice. They must be doing well.

  • I left St. Louis with a lot of plastic. The Stifel Theater, the conference's main venue, does not recycle plastic. Like many conference goers, I went through a fair number of water and soda bottles. I hate to see all that plastic go into the landfill and, having driven down, I did not have to contribute. Twice a day, I took whatever bottles I had emptied, and whatever other bottles I found lying around, back to my car and threw them in the trunk. When I got home, they went straight into the recycling bin. Yet another advantage to driving over flying.

I think that's all from Strange Loop 2018. It was fun.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

October 02, 2018 4:04 PM

Strange Loop 5: Day Two

the video screen announcing Philip Wadler's talk

Friday was a long day, but a good one. The talks I saw were a bit more diverse than on Day One: a couple on language design (though even one of those covered a lot more ground than that), one on AI, one on organizations and work-life, and one on theory:

• "All the Languages Together", by Amal Ahmed, discussed a problem that occurs in multi-language systems: when code written in one language invalidates the guarantees made by code written in the other. Most languages are not designed with this sort of interoperability baked in, and their FFI escape hatches make anything possible within foreign code. As a potential solution, Ahmed offered principled escape hatches designed with specific language features in mind. The proposed technique seems like it could be a lot of work, but the research is in its early stages, so we will learn more as she and her students implement the idea.

This talk is yet another example of how so many of our challenges in software engineering are a result of programming language design. It's good to see more language designers taking issues like these seriously, but we have a long way to go.

• I really liked Ashley Williams's talk on the evolution of async in JavaScript and Rust. This kind of talk is right up my alley... Williams invoked philosophy, morality, and cognitive science as she reviewed how two different language communities incorporated asynchronous primitives into their languages. Programming languages are designed, to be sure, but they are also the result of "contingent turns of history" (à la Foucault). Even though this turned out to be more of a talk about the Rust community than I had expected, I enjoyed every minute. Besides, how can you not like a speaker who says, "Yes, sometimes I'll dress up as a crab to teach."?

(My students should not expect a change in my wardrobe any time soon...)

• I also enjoyed "For AI, by AI", by Connor Walsh. The talk's subtitle, "Freedom & Evolution of the Algopoetic Avant-Garde", was a bit disorienting, as was its cold open, but the off-kilter structure of the talk was easy enough to discern once Walsh got going: first, a historical review of humans making computers write poetry, followed by a look at something I didn't know existed... a community of algorithmic poets — programs — that write, review, and curate poetry without human intervention. It's a new thing, of Walsh's creation, that looks pretty cool to someone who became drunk on the promise of AI many years ago.

I saw two other talks the second day:

  • the after-lunch address by Philip Wadler, "Categories for the Working Hacker", which I wrote about separately
  • Rachel Krol's Some Things May Never Get Fixed, about how organizations work and how developers can thrive despite how they work

I wish I had more to say about the last talk but, with commitments at home, the long drive beckoned. So, I departed early, sadly, hopped in my car, headed west, and joined the mass exodus that is St. Louis traffic on a Friday afternoon. After getting past the main crush, I was able to relax a bit with the rest of Zen and the Art of Motorcycle Maintenance.

Even a short day at Strange Loop is a big win. This was the tenth Strange Loop, and I think I've been to five, or at least that's what my blog seems to tell me. It is awesome to have a conference like this in Middle America. We who live here benefit from the opportunities it affords us, and maybe folks in the rest of the world get a chance to see that not all great computing ideas and technology happen on the coasts of the US.

When is Strange Loop 2019?


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

September 29, 2018 6:19 PM

Strange Loop 1: Day One

the Strange Loop splash screen from the main hall

Last Wednesday morning, I hopped in my car and headed south to Strange Loop 2018. It had been a few years since I'd listened to Zen and the Art of Motorcycle Maintenance on a conference drive, so I popped it into the tapedeck (!) once I got out of town and fell into the story. My top-level goal while listening to Zen was similar to my top-level goal for attending Strange Loop this year: to experience it at a high level, not to get bogged down in so many details that I lost sight of the bigger messages. Even so, a few quotes stuck in my mind from the drive down. The first is an old friend, one of my favorite lines from all of literature:

Assembly of Japanese bicycle require great peace of mind.

The other was the intellectual breakthrough that unified Phaedrus's philosophy:

Quality is not an object; it is an event.

This idea has been on my mind in recent months. It seemed a fitting theme, too, for Strange Loop.

On the first day of the conference, I saw mostly a mixture of compiler talks and art talks, including:

• @mraleph's "Six Years of Dart", in which he reminisced on the evolution of the language, its ecosystem, and its JIT. I took at least one cool idea from this talk. When he compared the performance of two JITs, he gave a histogram comparing their relative performances, rather than an average improvement. A new system often does better on some programs and worse on others. An average not only loses information; it may mislead. (There is a small sketch of this idea just after the list below.)

• Jason Dagit's "Your Secrets are Safe with Julia", about a system that explores the use of homomorphic encryption to compile secure programs. In this context, the key element of security is privacy. As Dagit pointed out, "trust is not transitive", which is especially important when it comes to sharing a person's health data.

• I just loved Hannah Davis's talk on "Generating Music From Emotion". She taught me about data sonification and its various forms. She also demonstrated some of her attempts to tease multiple dimensions of human emotion out of large datasets and to use these dimensions to generate music that reflects the data's meaning. Very cool stuff. She also showed the short video Dragon Baby, which made me laugh out loud.

• I also really enjoyed "Hackett: A Metaprogrammable Haskell", by Alexis King. I've read about this project on the Racket mailing list for a few years and have long admired King's ability in posts there to present complex ideas clearly and logically. This talk did a great job of explaining that Haskell deserves a powerful macro system like Racket's, that Racket's macro system deserves a powerful type system like Haskell's, and that integrating the two is more challenging than simply adding a stage to the compiler pipeline.
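As an aside on that point about averages: here is a tiny sketch of my own, with invented numbers, showing how a distribution of per-program time ratios can tell a different story than the mean does.

    import statistics

    # Invented per-benchmark time ratios (new JIT / old JIT); below 1.0 means faster.
    ratios = [0.6, 0.7, 0.8, 0.9, 1.0, 1.1, 1.3, 1.6, 2.1]

    # The mean alone says the new JIT is a bit slower overall...
    print("mean ratio:", round(statistics.mean(ratios), 2))    # 1.12

    # ... but a crude histogram shows it is faster on a third of the programs
    # and much slower on a few, which is the more useful story.
    buckets = {"faster (< 0.9)": 0, "about the same": 0, "slower (> 1.1)": 0}
    for r in ratios:
        if r < 0.9:
            buckets["faster (< 0.9)"] += 1
        elif r > 1.1:
            buckets["slower (> 1.1)"] += 1
        else:
            buckets["about the same"] += 1

    for label, count in buckets.items():
        print(label.ljust(15), "#" * count)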

I saw two other talks the first day:

  • the opening keynote address by Simon Peyton Jones, "Shaping Our Children's Education in Computing" [ link ]
  • David Schmüdde, "Misuser" [ link ]

My thoughts on these talks are more extensive and warrant short entries of their own, to follow.

I had almost forgotten how many different kinds of cool ideas I can encounter in a single day at Strange Loop. Thursday was a perfect reminder.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

September 13, 2018 3:50 PM

Legacy

In an interview at The Great Discontent, designer John Gall is asked, "What kind of legacy do you hope to leave?" He replies:

I have no idea; it's not something I think about. It's the thing one has the least control over. I just hope that my kids will have nice things to say about me.

I admire this answer.

No one is likely to ask me about my legacy; I'm just an ordinary guy. But it has always seemed strange when people -- presidents, artists, writers, film stars -- are asked this question. The idea that we can or should craft our own legacy like a marketing brand seems venal. We should do things because they matter, because they are worth doing, because they make the world better, or at least better than it would be without us. It also seems like a waste of time. The simple fact is that most of us won't be remembered long beyond our deaths, and only then by close family members and friends. Even presidents, artists, writers, and film stars are mostly forgotten.

To the extent that anyone will have a legacy, it will be decided in the future by others. As Gall notes, we don't have much control over how that will turn out. History is full of people whose place in the public memory turned out much differently than anyone might have guessed at the time.

When I am concerned that I'm not using my time well, it's not because I am thinking of my legacy. It's because I know that time is a precious and limited resource and I feel guilty for wasting it.

About the most any of us can hope is that our actions in this life leave a little seed of improvement in the world after we are gone. Maybe my daughters and former students and friends can make the world better in part because of something in the way I lived. If that's what people mean by their legacy, great, but it's likely to be a pretty nebulous effect. Not many of us can be Einstein or Shakespeare.

All that said, I do hope my daughters have good things to say about me, now and after I'm gone. I love them, and like them a lot. I want to make their lives happier. Being remembered well by them might also indicate that I put my time on Earth to good use.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

September 03, 2018 7:24 AM

Lay a Split of Good Oak on the Andirons

There are two spiritual dangers in not owning a farm. One is the danger of supposing that breakfast comes from the grocer, and the other that heat comes from the furnace.

The remedy for the first, according to Aldo Leopold, is to grow a garden, preferably in a place without the temptation and distraction of a grocery store. The remedy for the second is to "lay a split of good oak on the andirons" and let it warm your body "while a February blizzard tosses the trees outside".

I ran across Leopold's A Sand County Almanac in the local nature center late this summer. After thumbing through the pages during a break in a day-long meeting indoors, I added it to my long list of books to read. My reading list is actually a stack, so there was some hope that I might get to it soon -- and some danger that it would be buried before I did.

Then an old high school friend, propagating a meme on Facebook, posted a picture of the book and wrote that it had changed his life, changed how he looked at the world. That caught my attention, so I anchored it atop my stack and checked a copy out of the university library.

It now serves as a quiet read for this city boy on a dark and rainy three-day weekend. There are no February blizzards here yet, of course, but autumn storms have lingered for days. In an important sense, I'm not a "city boy", as my big-city friends will tell me, but I've lived my life mostly sheltered from the reality of growing my own food and heating my home by a wonderful and complex economy of specialized labor that benefits us all. It's good to be reminded sometimes of that good fortune, and also to luxuriate in the idea of experiencing a different kind of life, even if only for a while.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

August 09, 2018 1:03 PM

Gerald Weinberg Has Passed Away

I just read on the old Agile/XP mailing list that Jerry Weinberg passed away on Tuesday, August 7. The message hailed Weinberg as "one of the finest thinkers on computer software development". I, like many, was a big fan of his work.

My first encounter with Weinberg came in the mid-1990s when someone recommended The Psychology of Computer Programming to me. It was already over twenty years old, but it captivated me. It augmented years of experience in the trenches developing computer software with a deep understanding of psychology and anthropology and the firm but gentle mindset of a gifted teacher. I still refer back to it after all these years. Whenever I open it up to a random page, I learn something new again. If you've never read it, check it out now. You can buy the ebook -- along with many of Weinberg's books -- online through LeanPub.

After the first book, I was hooked. I never had the opportunity to attend one of Weinberg's workshops, but colleagues lavished them with praise. I should have made more of an effort to attend one. My memory is foggy now, but I do think I exchanged email messages with him once back in the late 1990s. I'll have to see if I can dig them up in one of my mail archives.

Fifteen years ago or so, I picked up a copy of Introduction to General Systems Thinking tossed out by a retiring colleague, and it became the first in a small collection of Weinberg books now on my shelf. As older colleagues retire in the coming years, I would be happy to salvage more titles and extend my collection. It won't be worth much on the open market, but perhaps I'll be able to share my love of Weinberg's work with students and younger colleagues. Books make great gifts, and more so a book by Gerald Weinberg.

Perhaps I'll share them with my non-CS friends and family, too. A couple of summers back, my wife saw a copy of Are Your Lights On?, a book Weinberg co-wrote with Donald Gause, sitting on the floor of my study at home. She read it and liked it a lot. "You get to read books like that for your work?" Yes.

I just read Weinberg's final blog entry earlier this week. He wasn't a prolific blogger, but he wrote a post every week or ten days, usually about consulting, managing, and career development. His final post touched on something that we professors experience at least occasionally: students sometimes solve the problems we set before them better than we expected, or better than we ourselves can do. He reminded people not to be defensive, even if it's hard, and to see the situation as an opportunity to learn:

When I was a little boy, my father challenged me to learn something new every day before allowing myself to go to bed. Learning new things all the time is perhaps the most important behavior in my life. It's certainly the most important behavior in our profession.

Weinberg was teaching us to the end, with grace and gratitude. I will miss him.

Oh, and one last personal note: I didn't know until after he passed that we shared the same birthday, a few years apart. A meaningless coincidence, of course, but it made me smile.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

August 07, 2018 3:04 PM

Too Bad Richard Feynman Didn't Have a Blog

There is a chapter in "Surely You're Joking, Mr. Feynman" about Feynman's work with biologists over summers and sabbaticals at Princeton and Cal Tech. He used a sabbatical year to work in a colleague's lab on bacteriophages, ribosomes, and RNA. After describing how he had ruined a potentially "fantastic and vital discovery" through sloppiness, he writes:

The other work on the phage I never wrote up -- Edgar kept asking me to write it up, but I never got around to it. That's the trouble with not being in your own field: You don't take it seriously.

I did write something informally on it. I sent it to Edgar, who laughed when he read it. It wasn't in the standard form that biologists use -- first, procedures, and so forth. I spent a lot of time explaining things that all the biologists knew. Edgar made a shortened version, but I couldn't understand it. I don't think they ever published it. I never published it directly.

Too bad Feynman didn't have a blog. I'll bet I could have learned something from his write-up. Not being a biologist, I generally can use some explanation intended for a lay reader, and Feynman's relaxed style might pull me through a biology paper. (Of all the sciences, biology is usually the biggest chore for me to learn.)

These days, scientists can post their informal writings on their blogs with little or no fuss. Standard form and formal style are for journals and conferences. Blog readers prefer relaxed writing and, for the most part, whatever form works best for the writer in order to get the ideas out to the world.

Imagine what a trove of stories Feynman could have told on his blog! He did tell them, of course, but in books like "Surely You're Joking, Mr. Feynman". But not everyone is going to write books, or have books written for them, so I'm glad to have the blogs of scientists, economists, and writers from many disciplines in my newsreader. For those who want something more formal before, or instead of, taking on the journal grind, we have arXiv.org. What a time to be alive.

Of course, when you read on in the chapter, you learn that James Watson (of Watson & Crick fame) heard about Feynman's work, thought it was interesting, invited Feynman to give a seminar talk at Harvard, and then went into the lab with him to conduct an experiment that very same week. I guess it all worked out for Feynman in the end.


Posted by Eugene Wallingford | Permalink | Categories: General

August 05, 2018 10:21 AM

Three Uses of the Knife

I just finished David Mamet's Three Uses of the Knife, a wide-ranging short book with the subtitle: "on the nature and purpose of drama". It is an extended essay on how we create and experience drama -- and how these are, in the case of great drama, the same journey.

Even though the book is only eighty or so pages, Mamet characterizes drama in so many ways that you'll have to either assemble a definition yourself or accept the ambiguity. Among them, he says that the job of drama and art is to "delight" us and that "the cleansing lesson of the drama is, at its highest, the worthlessness of reason."

Mamet clearly believes that drama is central to other parts of life. Here's a cynical example, about politics:

The vote is our ticket to the drama, and the politician's quest to eradicate "fill in the blank", is no different from the promise of the superstar of the summer movie to subdue the villain -- both promise us diversion for the price of a ticket and a suspension of disbelief.

As a reader, I found myself using the book's points to ruminate about other parts of life, too. Consider the first line of the second essay:

The problems of the second half are not the problems of the first half.

Mamet uses this to launch into a consideration of the second act of a drama, which he holds equally to be a consideration of writing the second act of a drama. But with fall semester almost upon us, my thoughts jumped immediately to teaching a class. The problems of teaching the second half of a class are quite different from the problems of teaching the first half. The start of a course requires the instructor to lay the foundation of a topic while often convincing students that they are capable of learning it. By midterm, the problems include maintaining the students' interest as their energy flags and the work of the semester begins to overwhelm them. The instructor's energy -- my energy -- begins to flag, too, which echoes Mamet's claim that the journey of the creator and the audience are often substantially the same.

A theme throughout the book is how people immerse themselves in story, suspending their disbelief, even creating story when they need it to soothe their unease. Late in the book, he connects this theme to religious experience as well. Here's one example:

In suspending their disbelief -- in suspending their reason, if you will -- for a moment, the viewers [of a magic show] were rewarded. They committed an act of faith, or of submission. And like those who rise refreshed from prayers, their prayers were answered. For the purpose of the prayer was not, finally, to bring about intercession in the material world, but to lay down, for the time of the prayer, one's confusion and rage and sorrow at one's own powerlessness.

This all makes the book sound pretty serious. It's a quick read, though, and Mamet writes with humor, too. It feels light even as it seems to be a philosophical work.

The following paragraph wasn't intended as humorous but made me, a computer scientist, chuckle:

The human mind cannot create a progression of random numbers. Years ago computer programs were created to do so; recently it has been discovered that they were flawed -- the numbers were not truly random. Our intelligence was incapable of creating a random progression and therefore of programming a computer to do so.

This reminded me of a comment that my cognitive psychology prof left on the back of an essay I wrote in class. He wrote something to the effect of, "This paper gets several of the particulars incorrect, but then that wasn't the point. It tells the right story well." That's how I felt about this paragraph: it is wrong on a couple of important facts, but it advances the important story Mamet is telling ... about the human propensity to tell stories, and especially to create order out of our experiences.

Oh, and thanks to Anna Gát for bringing the book to my attention, in a tweet to Michael Nielsen. Gát has been one of my favorite new follows on Twitter in the last few months. She seems to read a variety of cool stuff and tweet about it. I like that.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

July 17, 2018 2:32 PM

Get Attached to Solving Problems for People

In Getting Critiqued, Adam Morse reflects on his evolution from art student to web designer, and how that changed his relationship with users and critiques. Artists create things in which they are, at some level, invested. Their process matters. As a result, critiques, however well-intentioned, feel personal. The work isn't about a user; it's about you. But...

... design is different. As a designer, I don't matter. My work doesn't matter. Nothing I make matters in the context of my process. It's all about the people you are building for. You're just trying to solve problems for people. Once you realize this, it's the most liberating thing.

Now, criticism isn't really about you as artist. It's about how well the design meets the needs of the user. With that in mind, the artist can put some distance between himself or herself and the work, and think about the users. That's probably what the users are paying for anyway.

I've never been a designer, but I was fortunate to learn how better to separate myself from my work by participating in the software patterns community and its writers' workshop format. From the workshops, I came to appreciate the value of providing positive and constructive feedback in a supportive way. But I also learned to let critiques from others be about my writing and not about me. The ethos of writers' workshops is one of shared commitment to growth and so creates as supportive a framework as possible in which to deliver suggestions. Now, even when I'm not in such a conspicuously supportive environment, I am better able to detach myself from my work. It's never easy, but it's easier. This mindset can wear off a bit over time, so I find an occasional inoculation via PLoP or another supportive setting to be useful.

Morse offers another source of reminder: the designs we create for the web -- and for most software, too -- are not likely to last forever. So...

Don't fall in love with borders, gradients, a shade of blue, text on blurred photos, fancy animations, a certain typeface, flash, or music that autoplays. Just get attached to solving problems for people.

That last sentence is pretty good advice for programmers and designers alike. If we detach ourselves from our specific work output a bit and instead attach ourselves to solving problems for other people, we'll be able to handle their critiques more calmly. As a result, we are also likely to do better work.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

June 03, 2018 10:50 AM

Two Thoughts on Teaching

... from my morning reading.

First, a sentence from Bryan Caplan, about one of his influences, philosopher Michael Huemer:

I think what's great about this book, and really all of Mike's work, is he always tries to start off with premises that make sense to people who don't already agree, and then try to get somewhere.

I value people who take the time to construct arguments in this way. It's surprisingly rare in academic discourse and public discourse. Teachers usually learn pretty quickly, though, that the most effective way to teach is to start where your students are: recognize the state of their knowledge and respect their current beliefs. I try to remind myself of this principle regularly during a course, or I'm likely to go off track.

Second, the closing exchange from a 1987 interview with Stanley Kubrick. Kubrick has been talking about how the critics' views of his films tend to evolve over time. The interviewer wrapped up the conversation with:

Well, you don't make it easy on viewers or critics. You create strong feelings, but you won't give us any easy answers.

That's because I don't have any easy answers.

That seems like a pretty good aspiration to have for teaching, that people can say it creates strong feelings but doesn't give any easy answers. Much of teaching is simpler than this, of course, especially in a field such as computer science. A closure is something that we can understand as it is, as is, say, an algorithm for parsing a stream of tokens. But after you learn a few concepts and start trying to build or understand a complex system, easy answers are much harder to come by. Even so, I do hope that students leave my courses with strong feelings about their craft. Those feelings may not match my own, and they'll surely still be evolving, but they will be a product of the student engaging with some big ideas and trying them out on challenging problems.

Maybe if I keep reading interesting articles on the exercise bike and making connections to my craft, I can get this computer science thing down better.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

June 01, 2018 3:05 PM

Prepare to Appreciate the Solution

This post isn't really about chess, though it might seem at first to be.

In The Reviled Art, chess grandmaster Stuart Rachels says that most grandmasters don't like composed chess problems because they are too difficult. It's easy to imagine why average chessplayers find problems too difficult: they aren't all that great at chess. But why grandmasters? Rachels contends that problems are hard for tournament players because they are counterintuitive: the solutions contradict the intuitions developed by players whose chess skill is developed and sharpened over the board.

Rachels then says:

Most problems stump me too, so I conceive of the time I spend looking at them as time spent preparing to appreciate their solutions -- not as time spent trying to solve them.

I love this attitude. If I view time spent banging my head against a puzzle or a hard problem as "trying to solve the problem", then not solving the problem might feel like failure. If I view that time as "preparing to appreciate the solution", then I can feel as if my time was well spent even if I don't solve it -- as long as I can appreciate the beauty or depth or originality of the solution.

This attitude is helpful outside of chess. Maybe I'm trying to solve a hard programming problem or trying to understand a challenging area of programming language theory that is new to me. I don't always solve the problem on my own or completely understand the new area without outside help or lots of time reading and thinking. But I often do appreciate the solution once I see it. All the time I spent working on the problem prepared me for that moment.

I often wish that more of my students would adopt Rachels's attitude. I frequently pose a problem for them to work on for a few minutes before we look at a solution, or several candidates, as a group. All too often some students look at the problem, think it's too difficult, and then just sit there waiting for me to show them the answer. This approach often results in them feeling two kinds of failure: they didn't solve the problem, and they don't even appreciate the solution when they see it. They haven't put in the work thinking about it that prepares their minds to really get the solution. Maybe I can do more to help students realize that the work is worth the effort even if they don't think they can solve the problem. Send me your suggestions!

Rachels's point about the counterintuitiveness of composed chess problems indicates another way in which trying to solve unorthodox problems can be worthwhile. Sometimes our intuitions let us down because they are too narrow, or even wrong. Trying to solve an unorthodox problem can help us broaden our thinking. My experience with chess compositions is that most of the ideas I need to solve them will not be helpful in over-the-board play; those kinds of positions simply don't occur in real games. But a few themes do apply, and practicing with them helps me learn how to play better in game situations. If nothing else, working on unorthodox problems reminds me to look outside the constraints of my intuitions sometimes when a problem in real life seems too hard.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

May 09, 2018 4:02 PM

Middles

In an old blog post promoting his book on timing, Daniel Pink writes:

... Connie Gersick's research has shown that group projects rarely progress in a steady, linear way. Instead, at the beginning of a project, groups do very little. Then at a certain moment, they experience a sudden burst of activity and finally get going. When is that moment? The temporal midpoint. Give a team 34 days, they get started in earnest on day 17. Give a team 11 days, they really get going on day 6. In addition, there's other research showing that being behind at the midpoint--in NBA games and in experimental settings--can boost performance in the second half.

So we need to recognize midpoints and try to use them as a spark rather than a slump.

I wonder if this research suggests that we should favor shorter projects over longer ones. If most of us start going full force only at the middle of our projects, perhaps we should make the middle of our projects come sooner.

I'll admit that I have a fondness for short over long: short iterations over long iterations in software development, quarters over semesters in educational settings, short books (especially non-fiction) over long books. Shorter cycles seem to lead to higher productivity, because I spend more time working and less time ramping up and winding down. That seems to be true for my students and faculty colleagues, too.

In the paragraph that follows the quoted passage, Pink points inadvertently to another feature of short projects that I appreciate: more frequent beginnings and endings. He talks about the poignancy of endings, which adds meaning to the experience. On the other end of the cycle are beginnings, which create a sense of newness and energy. I always look forward to the beginning of a new semester or a new project for the energy it brings me.

Agile software developers know that, on top of these reasons, short projects offer another potent advantage: more opportunities to take stock of what we have learned and feed that learning back into what we do.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

May 04, 2018 1:25 PM

No Venom Here

Ken Perlin liked Ready Player One at the theater and then went off to read some reviews:

Many critics seem incensed, indignant, left sputtering in outrage at the very idea of a Spielberg film that is simply fun, a pop confection designed mainly to entertain and delight.
Perhaps some of it is their feeling of horror that modern pop culture might be something worthy of celebrating, simply for the sake of celebrating a phenomenon that many people find delightful. But why the extreme degree of venom?

I am an unashamed fan of pop culture: music, TV, movies, and all the rest. My biggest complaint these days is that I can't keep up with all the good stuff being created... (It helps that I'm not as big a fan of superhero movies as most people.) Critics can claim to serve as the gatekeepers of culture if they want, but I'll enjoy "Shut Up and Dance" [ YouTube ] all the same.


Posted by Eugene Wallingford | Permalink | Categories: General

March 22, 2018 4:05 PM

Finally, Some Good News

It's been a tough semester. On top of the usual business, there have been a couple of extra stresses. First, I've been preparing for the departure of a very good friend, who is leaving the university and the area for family and personal reasons. Second, a good friend and department colleague took an unexpected leave that turned into a resignation. Both departures cast a distant pall over my workdays. This week, though, has offered a few positive notes to offset the sadness.

Everyone seems to complain about email these days, and I certainly have been receiving and sending more than usual this semester, as our students and I adjust to the change in our faculty. But sometimes an email message makes my day better. Exhibit 1, a message from a student dealing with a specific issue:

Thank you for your quick and helpful response!
Things don't look so complicated or hopeless now.

Exhibit 2, a message from a student who has been taming the bureaucracy that arises whenever two university systems collide:

I would like to thank you dearly for your prompt and thorough responses to my numerous emails. Every time I come to you with a question, I feel as though I am receiving the amount of respect and attention that I wish to be given.

Compliments like these make it a lot easier to muster the energy to deal with the next batch of email coming in.

There has also been good news on the student front. I received email from a rep at a company in Madison, Wisconsin, where one of our alumni works. They are looking for developers to work in a functional programming environment and are having a hard time filling the positions locally, despite the presence of a large and excellent university in town. Our alum is doing well enough that the company would like to hire more from our department, which is doing a pretty good job, too.

Finally, today I spoke in person with two students who had great news about their futures. One has accepted an offer to join the Northwestern U. doctoral program and work in the lab of Kenneth Forbus. I studied Forbus's work on qualitative reasoning and analogical reasoning as a part of my own Ph.D. work and learned a lot from him. This is a fantastic opportunity. The other student has accepted an internship to work at PlayStation this summer, working on the team that develops the compilers for its game engines. He told me, "I talked a lot about the project I did in your course last semester during my interview, and I assume that's part of the reason I got an offer." I have to admit, that made me smile.

I had both of these students in my intro class a few years back. They would have succeeded no matter who taught their intro course, or the compiler course, for that matter, so I can't take any credit for their success. But they are outstanding young men, and I have had the pleasure of getting to know them over the last four years. News of the next steps in their careers makes me feel good, too.

I think I have enough energy to make it to the end of the semester now.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

March 12, 2018 3:43 PM

Technology is a Place Where We Live

Yesterday morning I read The Good Room, a talk Frank Chimero gave last month. Early on in the talk, Chimero says:

Let me start by stating something obvious: in the last decade, technology has transformed from a tool that we use to a place where we live.

This sentence jumped off the page both for the content of the assertion and for the decade time frame with which he bounds it. In the fall of 2003, I taught a capstone course for non-majors that is part of my university's liberal arts core. The course, titled "Environment, Technology, and Society", brings students from all majors on campus together in a course near the end of their studies, to apply their general education and various disciplinary expertises to problems of some currency in the world. As you might guess from the title, the course focuses on problems at the intersection of the natural environment, technology, and people.

My offering of the course put a twist on the usual course content. We focused on the man-made environment we all live in, which even by 2003 had begun to include spaces carved out on the internet and web. The only textbook for the course was Donald Norman's The Design of Everyday Things, which I think every university graduate should have read. The topics for the course, though, had a decided IT flavor: the effect of the Internet on everyday life, e-commerce, spam, intellectual property, software warranties, sociable robots, AI in law and medicine, privacy, and free software. We closed with a discussion of what an educated citizen of the 21st century ought to know about the online world in which they would live in order to prosper as individuals and as a society.

The change in topic didn't excite everyone. A few came to the course looking forward to a comfortable "save the environment" vibe and were resistant to considering technology they didn't understand. But most were taking the course with no intellectual investment at all, as a required general education course they didn't care about and just needed to check off the list. In a strange way, their resignation enabled them to engage with the new ideas and actually ask some interesting questions about their future.

Looking back now after fifteen years, the course design looks pretty good. I should probably offer to teach it again, updated appropriately, of course, and see where young people of 2018 see themselves in the technological world. As Chimero argues in his talk, we need to do a better job building the places we want to live in -- and that we want our children to live in. Privacy, online peer pressure, and bullying all turned out differently than I expected in 2003. Our young people are worse off for those differences, though I think most have learned ways to live online in spite of the bad neighborhoods. Maybe they can help us build better places to live.

Chimero's talk is educational, entertaining, and quotable throughout. I tweeted one quote: "How does a city wish to be? Look to the library. A library is the gift a city gives to itself." There were many other lines I marked for myself, including:

  • Penn Station "resembles what Kafka would write about if he had the chance to see a derelict shopping mall." (I'm a big Kafka fan.)
  • "The wrong roads are being paved in an increasingly automated culture that values ease."
Check the talk out for yourself.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

March 06, 2018 4:11 PM

A Good Course in Epistemology

Theoretical physicist Marcelo Gleiser, in The More We Know, the More Mystery There Is:

But even if we did [bring the four fundamental forces together in a common framework], and it's a big if right now, this "unified theory" would be limited. For how could we be certain that a more powerful accelerator or dark matter detector wouldn't find evidence of new forces and particles that are not part of the current unification? We can't. So, dreamers of a final theory need to recalibrate their expectations and, perhaps, learn a bit of epistemology. To understand how we know is essential to understand how much we can know.

People are often surprised to hear that, in all my years of school, my favorite course was probably PHL 440 Epistemology, which I took in grad school as a cognate to my CS courses. I certainly enjoyed the CS courses I took as a grad student, and as an undergrad, too, but my study of AI was enhanced significantly by courses in epistemology and cognitive psychology. The prof for PHL 440, Dr. Rich Hall, became a close advisor to my graduate work and a member of my dissertation committee. Dr. Hall introduced me to the work of Stephen Toulmin, whose model of argument influenced my work immensely.

I still have the primary volume of readings that Dr. Hall assigned in the course. Looking back now, I'd forgotten how many of W.V.O. Quine's papers we'd read... but I enjoyed them all. The course challenged most of my assumptions about what it means "to know". As I came to appreciate different views of what knowledge might be and how we come by it, my expectations of human behavior -- and my expectations for what AI could be -- changed. As Gleiser suggests, to understand how we know is essential to understanding what we can know, and how much.

Gleiser's epistemology meshes pretty well with my pragmatic view of science: it is descriptive, within a particular framework and necessarily limited by experience. This view may be why I gravitated to the pragmatists in my epistemology course (Peirce, James, Rorty), or perhaps the pragmatists persuaded me better than the others.

In any case, the Gleiser interview is a delightful and interesting read throughout. His humble view of science may get you thinking about epistemology, too.

... and, yes, that's the person for whom a quine in programming is named. Thanks to Douglas Hofstadter for coining the term and for giving us programming nuts a puzzle to solve in every new language we learn.
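For anyone who has not met the puzzle: a quine is a program that prints its own source code, exactly. Here is one classic form in Python -- my sketch, not from Hofstadter:

    # A Python quine: running this file prints these three lines, comment included.
    s = '# A Python quine: running this file prints these three lines, comment included.\ns = %r\nprint(s %% s)'
    print(s % s)

Part of the fun is that the same trick has to be rediscovered, in slightly different clothes, in every new language you learn.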


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Patterns, Personal

January 22, 2018 3:50 PM

Same Footage, Different Film

In In the Blink of an Eye, Walter Murch tells the story of human and chimpanzee DNA, about how the DNA itself is substantially the same and how the sequencing, which we understand less well, creates different beings during the development of the newborn. He concludes by bringing the analogy back to film editing:

My point is that the information in the DNA can be seen as uncut film and the mysterious sequencing as the editor. You could sit in one room with a pile of dailies and another editor could sit in the next room with exactly the same footage and both of you could make different films out of the same material.

This struck me as quite the opposite of what programmers do. When given a new problem and a large language in which to solve it, two programmers can choose substantially different source material and yet end up telling the same story. Functional and OO programmers, say, may decompose the problem in a different way and rely on different language features to build their solutions, but in the end both programs will solve the same problem and meet the same user need. Like the chimp and the human, though, the resulting programs may be better adapted for living in different environments.
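To make the contrast concrete, here is a toy example of my own (the task and the names are invented): the same small problem told two ways in Python, once as a filter-and-fold over plain data and once as a small object that owns both the data and the behavior.

    from dataclasses import dataclass

    prices = [3.50, 12.00, 7.25, 20.00]

    # A functional telling: filter and fold over plain data.
    functional_total = sum(p for p in prices if p > 5.00)

    # An object-oriented telling: a small class bundles the data with the behavior.
    @dataclass
    class Cart:
        prices: list

        def total_over(self, threshold: float) -> float:
            return sum(p for p in self.prices if p > threshold)

    oo_total = Cart(prices).total_over(5.00)

    # Different source material, same story: both report 39.25.
    assert functional_total == oo_total == 39.25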


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

January 17, 2018 3:51 PM

Footnotes

While discussing the effective use of discontinuities in film, both motion within a context versus change of context, Walter Murch tells a story about... bees:

A beehive can apparently be moved two inches each night without disorienting the bees the next morning. Surprisingly, if it is moved two miles, the bees also have no problem: They are forced by the total displacement of their environment to re-orient their sense of direction, which they can do easily enough. But if the hive is moved two yards, the bees become fatally confused. The environment does not seem different to them, so they do not re-orient themselves, and as a result, they will not recognize their own hive when they return from foraging, hovering instead in the empty space where the hive used to be, while the hive itself sits just two yards away.

This is fascinating, as well as being a really cool analogy for the choices movie editors face when telling a story on film. Either change so little that viewers recognize the motion as natural, or change enough that they re-orient their perspective. Don't stop in the middle.

What is even cooler to me is that this story appears in a footnote.

One of the things I've been loving about In the Blink of an Eye is how Murch uses footnotes to teach. In many books, footnotes contain minutia or references to literature I'll never read, so I skip them. But Murch uses them to tell stories that elaborate on or deepen his main point but which would, if included in the text, interrupt the flow of the story he has constructed. They add to the narrative without being essential.

I've already learned a couple of cool things from his footnotes, and I'm not even a quarter of the way into the book. (I've been taking time to mull over what I read...) Another example: while discussing the value of discontinuity as a story-telling device, Murch adds a footnote that connects this practice to the visual discontinuity found in ancient Egyptian painting. I never knew before why the perspective in those drawings was so unusual. Now I do!

My fondness for Murch's footnotes may stem from something more than their informative nature. When writing up lecture notes for my students, I like to include asides, digressions, and links to optional readings that expand on the main arc of the story. I'd like for them to realize that what they are learning is part of a world bigger than our course, that the ideas are often deeper and have wider implications than they might realize. And sometimes I just like to entertain with a connection. Not all students care about this material, but for the ones who do, I hope they get something out of them. Students who don't care can do what I do in other books: skip 'em.

This book gives me a higher goal to shoot for when including such asides in my notes: elaborate without being essential; entice without disrupting.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

January 07, 2018 10:25 AM

95:1

This morning, I read the first few pages of In the Blink of an Eye, an essay on film editing by Walter Murch. He starts by talking about his work on Apocalypse Now, which took well over a year in large part because of the massive amount of film Coppola shot: 1,250,000 linear feet, enough for 230 hours of running time. The movie ended up being about two hours and twenty-five minutes, so Murch and his colleagues culled 95 minutes of footage for every minute that made it into the final product. A more typical project, Murch says, has a ratio of 20:1.

Even at 20:1, Murch's story puts into clearer light the amount of raw material I create when designing a typical session for one of my courses. The typical session mixes exposition, examples, student exercises, and (less than I'd like to admit) discussion. Now, whenever I feel like a session comes up short of my goal, I will think back on Murch's 20:1 ratio and realize how much harder I might work to produce enough material to assemble a good session. If I want one of my sessions to be an Apocalypse Now, maybe I'll need to shoot higher.

This motivation comes at a favorable time. Yesterday I had a burst of what felt like inspiration for a new first day to my Programming Languages course. At the end of the brainstorm came what is now the working version of my opening line in the course: "In the beginning, there was assembly language." Let's see if I have enough inspiration -- and make enough time -- to turn the idea into what I hope it can be: a session that fuels my students' imagination for a semester's journey through Racket, functional programming, and examining language ideas with interpreters.

I do hope, though, that the journey itself does not bring to mind Apocalypse Now.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

January 05, 2018 1:27 PM

Change of Terms

I received a Change of Terms message yesterday from one of my mutual fund companies, which included this unexpected note:

Direction to buy or sell Vanguard funds must be placed online or verbally and may no longer be submitted in writing.

I haven't mailed Vanguard or any other financial services company a paper form or a paper check in years, but still. When I was growing up, I never would have imagined that I would see the day when you could not mail a letter to a company in order to conduct financial business. Busy, busy, busy.

In the academic world, this is the time for another type of change of terms, as we prepare to launch our spring semester on Monday. The temperatures in my part of the country the last two weeks make the name of the semester a cruel joke, but the hope of spring lives.

For me, the transition is from my compiler course to my programming languages course. Compilers went as well this fall as it has gone in a long time; I really wish I had blogged about it more. I can only hope that Programming Languages goes as well. I've been reading about some ways I might improve the course pedagogically. That will require me to change some old habits, but trying to do so is part of the fun of teaching. I intend to blog about my experiences with the new ideas. As I said, the hope of spring lives.

In any case, I get to write Racket code all semester, so at least I have that going for me, which is nice.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

December 28, 2017 8:46 AM

You Have to Learn That It's All Beautiful

In this interview with Adam Grant, Walter Isaacson talks about some of the things he learned while writing biographies of Benjamin Franklin, Albert Einstein, Steve Jobs, and Leonardo da Vinci. A common theme is that all four were curious and interested in a wide range of topics. Toward the end of the interview, Isaacson says:

We of humanities backgrounds are always doing the lecture, like, "We need to put the 'A' in 'STEM', and you've got to learn the arts and the humanities." And you get big applause when you talk about the importance of that.

But we also have to meet halfway and learn the beauty of math. Because people tell me, "I can't believe somebody doesn't know the difference between Mozart and Haydn, or the difference between Lear and Macbeth." And I say, "Yeah, but do you know the difference between a resistor and a transistor? Do you know the difference between an integral and a differential equation?" They go, "Oh no, I don't do math, I don't do science." I say, "Yeah, but you know what, an integral equation is just as beautiful as a brush stroke on the Mona Lisa." You've got to learn that they're all beautiful.

Appreciating that beauty made Leonardo a better artist and Jobs a better technologist. I would like for the students who graduate from our CS program to know some literature, history, and art and appreciate their beauty. I'd also like for the students who graduate from our university with degrees in literature, history, art, and especially education to have some knowledge of calculus, the Turing machine, and recombinant DNA, and appreciate their beauty.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

December 21, 2017 2:42 PM

A Writer with a Fondness for Tech

I've not read either of Helen DeWitt's novels, but this interview from 2011 makes her sound like a technophile. When struggling to write, she finds inspiration in her tools:

What is to be done?

Well, there are all sorts of technical problems to address. So I go into Illustrator and spend hours grappling with the pen tool. Or I open up the statistical graphics package R and start setting up plots. Or (purists will be appalled) I start playing around with charts in Excel.

... suddenly I discover a brilliant graphic solution to a problem I've been grappling with for years! How to display poker hands graphically in a way that sets a series of strong hands next to the slightly better hands that win.

Other times she feels the need for a prop, à la Olivier:

I may have a vague idea about a character -- he is learning Japanese at an early age, say. But I don't know how to make this work formally, I don't know what to do with the narrative. I then buy some software that lets me input Japanese within my word-processing program. I start playing around, I come up with bits of Japanese. And suddenly I see that I can make visible the development of the character just by using a succession of kanji! I don't cut out text -- I have eliminated the need for 20 pages of text just by using this software.

Then she drops a hint about a work in progress, along with a familiar name:

Stolen Luck is a book about poker using Tuftean information design to give readers a feel for both the game and the mathematics.

DeWitt sounds like my kind of person. I wonder if I would like her novels. Maybe I'll try Lightning Rods first; it sounds like an easier read than The Last Samurai.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 24, 2017 12:30 PM

Thousand-Year Software

I recently read an old conversation between Neil Gaiman and Kazuo Ishiguro that started out as a discussion of genre but covered a lot of ground, including how stories mutate over time and how the time scale of stories is so much larger than that of human lives. Here are a few of the passages about stories and time:

NG   Stories are long-lived organisms. They're bigger and older than we are.

NG   You sit there reading Pepys, and just for a minute, you kind of get to be 350, 400 years older than you are.

KI   There's an interesting emotional tension that comes because of the mismatch of lifespans in your work, because an event that might be tragic for one of us may not be so for the long-lived being.

KI   I'm often asked what my attitude is to film, theatrical, radio adaptations of my novels. It's very nice to have my story go out there, and if it's in a different form, I want the thing to mutate slightly. I don't want it to be an exact translation of my novel. I want it to be slightly different, because in a very vain kind of way, as a storyteller, I want my story to become like public property, so that it gains the status where people feel they can actually change it around and use it to express different things.

This last comment by Ishiguro made me think of open-source software. It can be adapted by anyone for almost any use. When we fork a repo and adapt it, how often does it grow into something new and considerably different? I often tell my compiler students about the long, mutated life of P-code, a story related by Chris Clark in a 1999 SIGPLAN Notices article:

P-code is an example [compiler intermediate representation] that took on a life of its own. It was invented by Niklaus Wirth as the IL for the ETH Pascal compiler. Many variants of that compiler arose [Ne179], including the UCSD Pascal compiler that was used at Stanford to define an optimizer [Cho83]. Chow's compiler evolved into the MIPS compiler suite, which was the basis for one of the DEC C compilers -- acc. That compiler did not parse the same language nor use any code from the ETH compiler, but the IL survived.

That's not software really, but a language processed by several generations of software. What are other great examples of software and languages that mutated and evolved?

We have no history with 100-year-old software yet, of course, let alone 300- or 1000-year-old software. Will we ever? Software is connected to the technology of a given time in ways that stories are not. Maybe, though, an idea that is embodied in a piece of software today could mutate and live on in new software or new technology many decades from now? The internet is a system of hardware and software that is already evolving into new forms. Will the world wide web continue to have life in a mutated form many years hence?

The Gaiman/Ishiguro conversation turned out to be more than I expected when I first found it. Good stuff. Oh, and as I wrap up this post, this passage resonates with me:

NG   I know that when I create a story, I never know what's going to work. Sometimes I will do something that I think was just a bit of fun, and people will love it and it catches fire, and sometimes I will work very hard on something that I think people will love, and it just fades: it never quite finds its people.

Been there, done that, my friend. This pretty well describes my experience blogging and tweeting all these years, and even writing for my students. I am a less reliable predictor of what will connect with readers than my big ego would ever have guessed.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 05, 2017 9:42 AM

One Way I'm Like Maurice Sendak

In this conversation, the interviewer asked Maurice Sendak, then age eighty-three, how long he could work in one stretch. The illustrator said that two hours was a long stretch for him.

Because I'm older, I get tired and nap. I love napping. Working and napping and reading and seeing my students. They're awfully nice. They're young and they're hopeful.

I'm not quite eighty-three, but I agree with every sentence in Sendak's answer. I could do worse than be as productive and as cantankerous for as long as he was.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

September 19, 2017 2:52 PM

Still Skeptical About Tweetstorms

The last couple of months have been the sparsest extended stretch on my blog since I began writing here in 2004. I have missed the feeling of writing, and I've wanted to write, but I guess I never wanted it enough to set aside time to do the work. (There may be a deeper reason, the idea of which merits more thinking.) It's also a testament to the power of habit in my life: when I'm in the habit of writing, I write; when I fall out of the habit, I don't. During my unintended break from blogging, I've remained as active as usual on Twitter. But I haven't done much long-form writing other than lecture notes for my compiler class.

And that includes writing tweetstorms.

I'm one of those people who occasionally snarks on Twitter about tweetstorms. They always seem like a poor substitute for a blog entry or an essay. While I've probably written my last snarky tweet about tweetstorms, I remain skeptical of the form.

That said, my curiosity was aroused when Brian Marick, a writer and programmer whose work I always enjoy, tweeted yesterday:

[Note re: "write a blog post". I think the tweetstorm is different lit'ry form, and I like exploring it.]

I would love for Brian or anyone else to be able to demonstrate value in a tweetstorm that distinguishes it from equivalent writing in other forms. I've read many tweetstorms that I've enjoyed, including the epic Eric Garland disquisition considered by many to be the archetype of the genre. But in the end, every tweetstorm looks like either a bullet-point presentation that could be delivered in PowerPoint, or something that could stand on its own as an essay, if only the sentences were, you know, assembled into paragraphs.

I am sympathetic to the idea that there may be a new literary form lurking here. Like any constraint, the 140-character limit on tweets causes writers to be creative in a new way. Chaining a sequence of similarly constrained statements together as a meaningful whole requires a certain skill, and writers who master the style can pull me through to the end, almost despite myself. But I would read through to the end of a blog entry written as skillfully, and I wouldn't have to do the assembly of the work in my head as I go.

Perhaps the value lies in Twitter as an interaction mechanism. Twitter makes it easy to respond to and discuss the elements of a tweetstorm at the level of the individual tweet. That's handy, but it can also be distracting. Not every Twitter client manages threading as well as it could. It's also not a new feature of the web; any blogging platform can provide paragraph-level linking as a primitive, and discussion forums are built on modular commentary and linking. Maybe tweetstorms are popular precisely because Twitter is a popular medium of the day. They are the path of least resistance.

That leads to what may be the real reason that people explore the form: Twitter lowers the barrier to entry into blogging to almost nothing. Install an app, or point a web browser to your homepage, and you have a blogging platform. But that doesn't make the tweetstorm a new literary form of any particular merit. It's simply a chunking mechanism enforced by the nature of a limited interface. Is there anything more to it than that?

I'm an open-minded person, so when I say I'm skeptical about something, I really am open to changing my mind. When someone I respect says that there may be something to the idea, I know I should pay attention. I'll follow Brian's experiment and otherwise keep my mind open. I'm not expecting to undergo a conversion, but I'm genuinely curious about the possibilities.


Posted by Eugene Wallingford | Permalink | Categories: General

July 27, 2017 1:36 PM

How can we help students overcome "naturalness bias"?

In Leadership as a Performing Art, Ed Batista discusses, among other things, a "naturalness bias" that humans have when evaluating one another. Naturalness is "a preference for abilities and talents that we perceive as innate over those that appear to derive from effort and experience". Even when people express a preference for hard work and experience, they tend to judge more positively people who seem to be operating on natural skill and talent. As Batista notes, this bias affects not only how we evaluate others but also how we evaluate ourselves.

As I read this article, I could not help but think about how students who are new to programming and to computer science often react to their own struggles in an introductory CS course. These thoughts reached a crescendo when I came to these words:

One commonly-held perspective is that our authentic self is something that exists fully formed within us, and we discover its nature through experiences that feel more (or less) natural to us. We equate authenticity with comfort, and so if something makes us feel uncomfortable or self-conscious, then it is de facto inauthentic, which means we need not persist at it (or are relieved of our responsibility to try). But an alternative view is that our authentic self is something that we create over time, and we play an active role in its development through experiences that may feel uncomfortable or unnatural, particularly at first. As INSEAD professor of organizational behavior Herminia Ibarra wrote in The Authenticity Paradox in 2015,

Because going against our natural inclinations can make us feel like impostors, we tend to latch on to authenticity as an excuse for sticking with what's comfortable... By viewing ourselves as works-in-progress and evolving our professional identities through trial and error, we can develop a personal style that feels right to us and suits our organizations' changing needs. That takes courage, because learning, by definition, starts with unnatural and often superficial behaviors that can make us feel calculating instead of genuine and spontaneous. But the only way to avoid being pigeonholed and ultimately become better leaders is to do the things that a rigidly authentic sense of self would keep us from doing.

So many CS students and even computing professionals report suffering from impostor syndrome, sometimes precisely because they compare their internal struggles to learn with what appears to be the natural ability of their colleagues. But, as Ibarra says, learning, by definition, starts with the unnatural. To be uncomfortable is, in one sense, to be in a position to learn.

How might we teachers of computer science help our students overcome the naturalness bias they unwittingly apply when evaluating their own work and progress? We need strategies to help students see that CS is something we do, not something we are. You can feel uncomfortable and still be authentic.

This distinction is at the foundation of Batista's advice to leaders and, I think, at the foundation of good advice to students. When students can distinguish between their behavior and their identity, they are able to manage more effectively the expectations they have of their own work.

I hope to put what I learned in this article to good use both for my students and myself. It might help me be more honest with -- and more generous to -- myself when evaluating my performance as a teacher and an administrator, and more deliberate in how I try to get better.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading, Teaching and Learning

June 23, 2017 2:04 PM

No Summer Job? Learn How to Program

The article Why Aren't American Teenagers Working Anymore? comments on a general trend I have observed locally over the last few years: most high school students don't have summer jobs any more. At first, you might think that rising college tuition would provide an incentive to work, but the effect is almost the opposite:

"Teen earnings are low and pay little toward the costs of college," the BLS noted this year. The federal minimum wage is $7.25 an hour. Elite private universities charge tuition of more than $50,000.

Even in-state tuition at a public university has grown large enough to put it out of reach of the typical summer job. Eventually, there is almost no point in working a low-paying job; you'll have to borrow a significant amount anyway.

These days, students have another alternative that might pay off better in the long run anyway. With a little gumption and free resources available on the web, many students can learn to program, build websites, and make mobile apps. Time spent not on a job but on developing skills that are in high demand and pay well might be time well spent.

Even as a computer scientist, though, I'm traditional enough to be a little uneasy with this idea. Don't young people benefit from summer jobs in ways other than a paycheck? The authors of this article offer the conventional thinking:

A summer job can help teenagers grow up as it expands their experience beyond school and home. Working teens learn how to manage money, deal with bosses, and get along with co-workers of all ages.

You know what, though... A student working on an open-source project can also learn how to deal with people in positions of relative authority and how to get along with collaborators of all ages. They might even get to interact with people from other cultures and make a lasting contribution to something important.

Maybe instead of worrying about teenagers getting summer jobs we should introduce them to programming and open-source software.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

June 15, 2017 2:14 PM

The Melancholy Department Head

In The Melancholy Dean, Matt Reed notes that, while most management books speak at the level of the CEO or a founder, most managers work further down the chain of command.

Most managers are closer to deans than to presidents. They're in the middle. ... it's not unusual that they find themselves tasked with carrying out policies with which they personally disagree. When success in a position relies largely on "soft power", having to carry out positions with which you personally disagree can be a real strain.

Obviously, if the disagreements become too large or frequent, the right move is to step out of the role. But that's the exception. More commonly, there's a vague sense of "I wouldn't have done it that way" that falls well short of a crisis of conscience, but can be enough to sap motivation. That's especially true when budgets are tightening and adverse decisions are made for you.

I have seen this happen to deans, but I also know the feeling myself. Here, department heads are administrators, and formally they depend upon the dean and provost for their positions. As public universities have to face falling state appropriations, increasing regulatory requirements, and increased competition for students, they often find themselves operating with more of a corporate mentality than the hallowed halls of academia we all dream of from yesteryear. Even with good and open leaders making decisions in upper administration (which I have been fortunate to have in my time as an administrator), more agency lives outside the department, more of the department head's time is spent carrying out activities defined elsewhere, and fewer strategic decisions are made by the head and faculty within the department.

It does wear on a person. These days, academic middle managers of all sorts have to cultivate the motivation they need to carry on. The good news is, through it all, we are helping students, and helping faculty help students. Knowing that, and doing at least a little programming every day, helps me relieve whatever strain I might feel. Even so, I could use more closure most days of the week.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

June 10, 2017 10:28 AM

98% of the Web in One Sentence

Via Pinboard's creator, the always entertaining Maciej Cegłowski:

Pinboard is not much more than a thin wrapper around some carefully tuned database queries.

You are ready to make your millions. Now all you need is an idea.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

June 06, 2017 2:39 PM

Using Programs and Data Analysis to Improve Writing, World Bank Edition

Last week I read a tweet that linked to an article by Paul Romer. He is an economist currently working at the World Bank, on leave from his chair at NYU. Romer writes well, so I found myself digging deeper and reading a couple of his blog articles. One of them, Writing, struck a chord with me both as a writer and as a computer scientist.

Consider:

The quality of written prose should be higher in documents that will have many readers.

This is true of code, too. If a piece of code will be read many times, whether by one person or several, then each minute spent making it shorter and clearer improves reading comprehension every single time. That's even more important in code than in text, because so often we read code in order to change it. We need to understand it at an even deeper level to ensure that our changes have the intended effect. Time spent making code better repays itself many times over.
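To make the point concrete, here is a small, invented example -- mine, not Romer's, with hypothetical names and data. Both functions below compute the same total, but the second can be understood at a glance each time a reader returns to it.

    # Two equivalent Python functions; the second is what a few minutes of
    # clarifying buys. The example and its names are hypothetical.
    def calc(d):
        t = 0
        for k in d:
            if d[k][1] > 0:
                t = t + d[k][0] * d[k][1]
        return t

    def total_order_value(orders):
        """Sum price * quantity over the line items actually ordered."""
        return sum(price * quantity
                   for price, quantity in orders.values()
                   if quantity > 0)

    # Both return 25.0 for the same data.
    orders = {"book": (20.00, 1), "pen": (2.50, 2), "gift": (99.00, 0)}
    print(calc(orders), total_order_value(orders))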

Romer caused a bit of a ruckus when he arrived at the World Bank by insisting, to some of his colleagues' displeasure, that everyone in his division write clearer, more concise reports. His goal was admirable: He wanted more people to be able to read and understand these reports, because they deal with policies that matter to the public.

He also wanted people to trust what the World Bank was saying, by making it easier for readers to see whether a claim was true or false. His article looks at two different examples that make a claim about the relationship between education spending and GDP per capita. He concludes his analysis of the examples with:

In short, no one can say that the author of the second claim wrote something that is false because no one knows what the second claim means.

In science, writing clearly builds trust. This trust is essential for communicating results to the public, of course, because members of the public do not generally possess the scientific knowledge they need to assess the truth of a claim directly. But it is also essential for communicating results to other scientists, who must understand the claims at a deeper level in order to support, falsify, and extend them.

In the second half of the article, Romer links to a study of the language used in the World Bank's yearly reports. It looks at patterns such as the frequency of the word "and" in the reports and the ratio of nouns to verbs. (See this Financial Times article for a fun little counterargument on the use of "and".)
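The simplest of those measurements takes only a few lines of code to sketch. Here is a minimal, hypothetical example in Python that computes the frequency of "and" in a plain-text report; the filename is my own placeholder, and a noun-to-verb ratio would additionally require a part-of-speech tagger such as NLTK's.

    # A rough sketch of the simplest measurement in the study: how often
    # does the word "and" appear in a report? 'report.txt' is a placeholder.
    import re
    from collections import Counter

    with open("report.txt") as f:
        words = re.findall(r"[a-z']+", f.read().lower())

    counts = Counter(words)
    print(f"total words: {len(words)}")
    print(f"'and' frequency: {counts['and'] / len(words):.2%}")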

Romer wants this sort of analysis to be easier to do, so that it can be used more easily to check and improve the World Bank's reports. After looking at some other patterns of possible interest, he closes with this:

To experiment with something like this, researchers in the Bank should be able to spin up a server in the cloud, download some open-source software and start experimenting, all within minutes.

Wonderful: a call for storing data in easy-to-access forms and a call for using (and writing) programs to analyze text, all in the name not of advancing economics technically but of improving its ability to communicate its results. Computing becomes a tool integrated into the process of the World Bank doing its designated job. We need more leaders in more disciplines thinking this way. Fortunately, we hear reports of such folks more often these days.

Alas, data and programs were not used in this way when Romer arrived at the World Bank:

When I arrived, this was not possible because people in ITS did not trust people from DEC and, reading between the lines, were tired of dismissive arrogance that people from DEC displayed.

One way to create more trust is to communicate better. Not being dismissively arrogant is, too, though calling that sort of behavior out may be what got Romer in so much hot water with the administrators and economists at the World Bank in the first place.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Software Development

May 31, 2017 2:28 PM

Porting Programs, Refactoring, and Language Translation

In his commonplace book A Certain World, W.H. Auden quotes C.S. Lewis on the controversial nature of translation:

[T]ranslation, by its very nature, is a continuous implicit commentary. It can become less tendentious only by becoming less of a translation.

Lewis was merely acknowledging a truth about language: Translators must have a point of view, and often that point of view will be controversial.

I once saw Kurt Vonnegut speak with a foreign language class here many years ago. One of the students asked him what he thought about the quality of the translations done for his books. Vonnegut laughed and said that his books were so peculiar and so steeped in Americana that translating one was akin to writing a new book. He said that his translators deserved all the royalties from the books they created by translating him. They had to write brand new works.

These memories came to mind again recently while I was reading Tyler Cowen's conversation with Jhumpa Lahiri, especially when Lahiri said this:

At one point I was talking about this idea, in antiquity: in Latin, the word for "translator" is "interpreter". I teach translation now, and I talk a lot to my students about translation being the most intimate form of reading and how there was the time when translating and interpreting and analyzing were all one thing.

As my mind usually does, it began to think about computer programs.

Like many programmers, I often find myself porting a program from one language to another. This is clearly translation but, as Vonnegut and Lahiri tell us, it is also a form of interpretation. To port a piece of code, I have to understand its meaning and express that meaning in a new language. That language has its own constructs, idioms, patterns, and set of community practices and expectations. To port a program, one must have a point of view, so the process can be, to use Lewis's word, tendentious.

I often refactor code, too, both my own programs and programs written by others. This, too, is a form of translation, even though it leaves the new code written in the same language as the original. Refactoring is necessarily an opinionated act, and thus tendentious.

Occasionally, I refactor a program in order to learn what it does and how it does it. In those cases, I'm not judging the original code as anything but ill-suited to my current state of knowledge. Even so, when I get done, I usually like my version better, if only a little bit. It expresses what I learned in the process of rewriting the code.

It has always been hard for me to port a program without refactoring it, and now I understand why. Both activities are a kind of translation, and translation is by its nature an activity that requires a point of view.

This fall, I will again teach our "Translation of Programming Languages" course. Writing a compiler requires one to become intimate not only with specific programs, the behavior of which the compiler must preserve, but also the language itself. At the end of the project, my students know the grammar, syntax, and semantics of our source language in a close, almost personal way. The target language, too. I don't mind if my students develop a strong point of view, even a controversial one, along the way. (I'm actually disappointed if the stronger students do not!) That's a part of writing new software, too.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns, Software Development, Teaching and Learning

May 21, 2017 10:07 AM

Computer Programs Have Much to Learn, and Much to Teach Us

In his recent interview with Tyler Cowen, Garry Kasparov talks about AI, chess, politics, and the future of creativity. In one of the more intriguing passages, he explains that building databases for chess endgames has demonstrated how little we understand about the game and offers insight into how we know that chess-playing computer programs -- now so far beyond humans that even the world champion can only score occasionally against commodity programs -- still have a long way to improve.

He gives as an example a particular position with a king, two rooks, and a knight on one side versus a king and two rooks on the other. Through the retrograde analysis used to construct endgame databases, we know that, with ideal play by both sides, the stronger side can force checkmate in 490 moves. Yes, 490. Kasparov says:

Now, I can tell you that -- even being a very decent player -- for the first 400 moves, I could hardly understand why these pieces moved around like a dance. It's endless dance around the board. You don't see any pattern, trust me. No pattern, because they move from one side to another.

At certain points I saw, "Oh, but white's position has deteriorated. It was better 50 moves before." The question is -- and this is a big question -- if there are certain positions in these endgames, like seven-piece endgames, that take, by the best play of both sides, 500 moves to win the game, what does it tell us about the quality of the game that we play, which is an average 50 moves? [...]

Maybe with machines, we can actually move our knowledge much further, and we can understand how to play decent games at much greater lengths.

But there's more. Do chess-playing computer programs, so much superior to even the best human players, understand these endgames either? I don't mean "understand" in the human sense, but only in the sense of being able to play games of that quality. Kasparov moves on to his analysis of games between the best programs:

I think you can confirm my observations that there's something strange in these games. First of all, they are longer, of course. They are much longer because machines don't make the same mistakes [we do] so they could play 70, 80 moves, 100 moves. [That is] way, way below what we expect from perfect chess.

That tells us that [the] machines are not perfect. Most of those games are decided by one of the machines suddenly. Can I call it losing patience? Because you're in a position that is roughly even. [...] The pieces are all over, and then suddenly one machine makes a, you may call, human mistake. Suddenly it loses patience, and it tries to break up without a good reason behind it.

That also tells us [...] that machines also have, you may call it, psychology, the pattern and the decision-making. If you understand this pattern, we can make certain predictions.

Kasparov is heartened by this, and it's part of the reason that he is not as pessimistic about the near-term prospects of AI as some well-known scientists and engineers are. Even with so-called deep learning, our programs are only beginning to scratch the surface of complexity in the universe. There is no particular reason to think that the opaque systems evolved to drive our cars and fly our drones will be any more perfect in their domains than our game-playing programs, and we have strong evidence from the domain of games that programs are still far from perfect.

On a more optimistic note, advances in AI give us an opportunity to use programs to help us understand the world better and to improve our own judgment. Kasparov sees this in chess, in the big gaps between the best human play, the best computer play, and perfect play in even relatively simple positions; I wrote wistfully about this last year, prompted by AlphaGo's breakthrough. But the opportunity is much more valuable when we move beyond playing games, as Cowen alluded to in an aside during Kasparov's explanation: Imagine how bad our politics will look in comparison to computer programs that do it well! We have much to learn.

As always, this episode of Conversations with Tyler was interesting and evocative throughout. If you are a chess player, there is a special bonus. The transcript includes a pointer to Kasparov's Immortal Game against Veselin Topalov at Wijk aan Zee in 1999, along with a discussion of some of Kasparov's thoughts on the game beginning with the pivotal move 24. Rxd4. This game, an object of uncommon beauty, will stand as an eternal reminder why, even in the face of advancing AI, it will always matter that people play and compete and create.

~~~~

If you enjoyed this entry, you might also like Old Dreams Live On. It looks more foresightful now that AlphaGo has arrived.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

April 15, 2017 10:39 AM

Science Seeks Regularity

A week or so ago I tweeted that Carver Mead was blowing my mind: an electron a mile long! I read about that idea in this Spectator interview that covers both Mead's personal life and his professional work in engineering. Well worth a read.

Mead is not satisfied with the current state of physics and biology, or at least with the incomplete theories that we seem to have accepted in lieu of a more coherent conceptual understanding of how the world works. Ultimately, he sides with Einstein in his belief that there is a more coherent explanation:

I think Einstein was being a scientist in the truest sense in his response to the Copenhagen interpretation. He said that none of us would be scientists if deep down we didn't believe there is a set of regularities in the operation of physical law. That is a matter of faith. It is not something anybody has proven, but none of us would be scientists if we didn't have that faith.

Like Einstein, Mead believes that unpredictability at the lowest levels of a system does not imply intrinsic uncertainty. We need a different view that brings regularities to the forefront of our theories.

I also like this line from near the end of the interview:

People don't even know where to put the decimal point.

Mead says this as part of his assurance that artificial intelligence is nowhere near the level of what even the fruit fly can do, let alone the human brain. A lot has happened in AI in the fifteen years since this interview; a computer program even beats our best players in Go now. Still, there is so much that we don't understand and cannot replicate.

I wonder if Mead's "decimal point" aphorism also might apply, metaphorically, to his view of the areas of science in which we have settled for, or are currently stuck with, unsatisfying theories. Our mathematical models cover a lot of ground, decimal point-wise, but there is still a simpler, more coherent picture to see. Maybe, though, that is the engineer in Mead showing through.


Posted by Eugene Wallingford | Permalink | Categories: General

April 02, 2017 12:02 PM

Reading an Interview with John McPhee Again, for the First Time

"This weekend I enjoyed Peter Hessler's interview of McPhee in The Paris Review, John McPhee, The Art of Nonfiction No. 3."

That's a direct quote from this blog. Don't remember it? I don't blame you; neither do I. I do remember blogging about McPhee back when, but as I read the same Paris Review piece again last Sunday and this, I had no recollection of reading it before, no sense of déjà vu at all.

Sometimes having a memory like mine is a blessing: I occasionally get to read something for the first time again. If you read my blog, then you get to read my first impressions for a second time.

I like this story that McPhee told about Bob Bingham, his editor at The New Yorker:

Bingham had been a writer-reporter at The Reporter magazine. So he comes to work at The New Yorker, to be a fact editor. Within the first two years there, he goes out to lunch with his old high-school friend Gore Vidal. And Gore says, What are you doing as an editor, Bobby? What happened to Bob Bingham the writer? And Bingham says, Well, I decided that I would rather be a first-rate editor than a second-rate writer. And Gore Vidal draws himself up and says, And what is wrong with a second-rate writer?

I can just hear the faux indignation in Vidal's voice.

McPhee talked a bit about his struggle over several years to write a series of books on geology, which had grown out of an idea for a one-shot "Talk of the Town" entry. The interviewer asked him if he ever thought about abandoning the topic and moving on to something he might enjoy more. McPhee said:

The funny thing is that you get to a certain point and you can't quit. Because I always worried: if you quit, you'll quit again. The only way out was to go forward, to learn your way and write your way out of it.

I know that feeling. Sometimes, I really do need to quit something and move on, but I always wonder whether quitting this time will make it easier to do next time. Because sometimes, I need to stick it out and, as McPhee says, learn my way out of the difficulty. I have no easy answers for knowing when quitting is the right thing to do.

Toward the end of the interview, the conversation turned to the course McPhee teaches at Princeton, once called "the literature of fact". The university first asked him to teach on short notice, over the Christmas break in 1974, and he accepted immediately. Not everyone thought it was a good idea:

One of my dear friends, an English teacher at Deerfield, told me: Do not do this. He said, Teachers are a dime a dozen -- writers aren't. But my guess is that I've been more productive as a writer since I started teaching than I would have been if I hadn't taught. In the overall crop rotation, it's a complementary job: I'm looking at other people's writing, and the pressure's not on me to do it myself. But then I go back quite fresh.

I know a lot of academics who feel this way. Then again, it's a lot easier to stay fresh in one's creative work if one has McPhee's teaching schedule, rather than a full load of courses:

My schedule is that I teach six months out of thirty-six, and good Lord, that leaves a lot of time for writing, right?

Indeed it does. Indeed it does.

On this reading of the interview, I marked only two passages that I wrote about last time. One came soon after the above response, on how interacting with students is its own reward. The other was a great line about the difference between mastering technique and having something to say: You demonstrated you know how to saddle a horse. Now go find the horse.

That said, I unconsciously channeled this line from McPhee just yesterday:

Writing teaches writing.

We had a recruitment event on campus, and I was meeting with a dozen or so prospective students and their entourages. We were talking about our curriculum, and I said a few words about our senior project courses. Students generally like these courses, even though they find them difficult. The students have never had to write a big program over the course of several months, and it's harder than it looks. The people who hire our graduates like these courses, too, because they know that these courses are places where students really begin to learn to program.

In the course of my remarks, I said something to the effect, "You can learn a lot about programming in classes where you study languages and techniques and theory, but ultimately you learn to write software by writing software. That's what the project courses are all about." There were a couple of experienced programmers in the audience, and they were all nodding their heads. They know McPhee is right.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

March 10, 2017 2:51 PM

Reading Is A Profoundly Creative Act

This comes from Laura Miller, a book reviewer and essayist for Slate, in a Poets & Writers interview:

I also believe that reading is a profoundly creative act, that every act of reading is a collaboration between author and reader. I don't understand why more people aren't interested in this alchemy. It's such an act of grace to give someone else ten or fifteen hours out of your own irreplaceable life, and allow their voice, thoughts, and imaginings into your head.

I think this is true of all reading, whether fiction or nonfiction, literary or technical. I often hear CS profs tell their students to read "actively" by trying code out in an interpreter, asking continually what the author means, and otherwise engaging with the material. Students who do so get the chance to experience what Miller describes: turning over a few hours of their irreplaceable lives to someone who understands a topic well, allowing that person's voice, thoughts, and imaginings into their heads, and coming out on the other end of the experience with new thoughts -- and maybe even a new mind.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

February 12, 2017 11:11 AM

Howard Marks on Investing -- and Software Development

Howard Marks is an investor and co-founder of Oaktree Capital Management. He has a big following in the financial community for his views on markets and investing, which often stray from orthodoxy, and for his straightforward writing and speaking style. He's a lot like Warren Buffett, with less public notoriety.

This week I read Marks's latest memo [ PDF ] to Oaktree's investors, which focuses on expert opinion and forecasting. This memo made me think a lot about software development. Whenever Marks talks about experts predicting how the market would change and how investors should act, I thought of programming. His comments sound like the wisdom of an agile software developer.

Consider what he learned from the events of 2016:

  1. First, no one really knows what events are going to transpire.
  2. And second, no one knows what the market's reaction to those events will be.

Investors who got out of the market for the last couple of months of 2016, based on predictions about what would happen, missed a great run-up in value.

If a programmer cannot predict what will happen in the future, or how stakeholders will respond to these changes, then planning in too much detail is at best an inefficient use of time and energy. At worst it is a way to lock yourself into code that you really need to change but can't.

Or consider these thoughts on surprises (the emphasis in the original):

It's the surprises no one can anticipate that would move markets most if they were to happen. But (a) most people can't imagine them and (b) most of the time they don't happen. That's why they're called surprises.

To Marks, this means that investors should not try to get cute, predict the future, and outsmart the market. The best they can do is solid technical analysis of individual companies and invest based on observable facts about value and value creation.

To me, this means that we programmers shouldn't try to prepare for surprises by designing for them in our software. Usually, the best we can do is to implement simple, clean code that does just what it does and no more. The only prediction we can make about the future is that we may well have to change our code. Creating clean interfaces and hiding implementation choices enable us to write code that is as straightforward as possible to change when the unimaginable happens, or even the imaginable.
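As a small illustration of that last idea -- my own sketch, not anything from Marks's memo, and with invented names -- hiding an implementation choice behind a narrow interface keeps an unanticipated change local:

    # A minimal sketch of hiding an implementation choice. Callers depend
    # only on save() and find(); if a surprise forces us to swap the plain
    # dict for a database, only this class changes. Names are illustrative.
    class OrderStore:
        def __init__(self):
            self._orders = {}          # today's choice: an in-memory dict

        def save(self, order_id, order):
            self._orders[order_id] = order

        def find(self, order_id):
            return self._orders.get(order_id)

    store = OrderStore()
    store.save(42, {"item": "book", "qty": 1})
    print(store.find(42))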

Marks closes this memo with five quotes about forecasting from a collection he has been building for forty years. I like this line from former GE executive Ian Wilson, which expresses the conundrum that every designer faces:

No amount of sophistication is going to allay the fact that all of your knowledge is about the past and all your decisions are about the future.

It isn't really all that strange that the wisdom of an investor like Marks might be of great value to a programmer. Investors and programmers both have to choose today how to use a finite resource in a way that maximizes value now and in the future. Both have to make these choices based on knowledge gleaned from the past. Both are generally most successful when the future looks like the past.

A big challenge for investors and programmers alike is to find ways to use their experience of the past in a way that maximizes value across a number of possible futures, both the ones we can anticipate and the ones we can't.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

February 10, 2017 3:55 PM

Follow-Up on Learning by Doing and Ubiquitous Information

A few quick notes on my previous post about the effect of ubiquitous information on knowing and doing.

~~~~

The post reminded a reader of something that Guy Steele said at DanFest, a 2004 festschrift in honor of Daniel Friedman's 60th birthday. As part of his keynote address, Steele read from an email message he wrote in 1978:

Sussman did me a very big favor yesterday -- he let me flounder around trying to build a certain LISP interpreter, and when I had made and fixed a critical bug he then told me that he had made (and fixed) the same mistake in Conniver. I learned a lot from that bug.

Isn't that marvelous? "I learned a lot from that bug."

Thanks to this reader for pointing me to a video of Steele's DanFest talk. You can watch this specific passage at the 12:08 mark, but really: You now have a link to an hour-long talk by Guy Steele that is titled "Dan Friedman--Cool Ideas". Watch the entire thing!

~~~~

If all you care about is doing -- getting something done -- then ubiquitous information is an amazing asset. I use Google and StackOverflow answers quite a bit myself, mostly to navigate the edges of languages that I don't use all the time. Without these resources, I would be less productive.

~~~~

Long-time readers may have read the story about how I almost named this blog something else. ("The Euphio Question" still sets my heart aflutter.) Ultimately I chose a title that emphasized the two sides of what I do as both a programmer and a teacher. The intersection of knowing and doing is where learning takes place. Separating knowing from doing creates problems.

In a post late last year, I riffed on some ideas I had as I read Learn by Painting, a New Yorker article about an experiment in university education in which everyone made art as a part of their studies.

That article included a line that expressed an interesting take on my blog's title: "Knowing and doing are two sides of the same activity, which is adapting to our environment."

That's a cool thought, but a rather pedestrian sentence. The article includes another, more poetic line that fits in nicely with the theme of the last couple of days:

Knowing is better than not knowing, but knowing without doing is as good as not knowing.

If I ever adopt a new tagline for my blog, it may well be this sentence. It is not strictly true, at least in a universal sense, but it's solid advice nonetheless.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

January 28, 2017 8:10 AM

Curiosity on the Chessboard

I found a great story from Lubomir Kavalek in his recent column, Chess Champions and Their Queens. Many years ago, Kavalek was talking with Berry Withuis, a Dutch journalist, about Rashid Nezhmetdinov, who had played two brilliant queen sacrifices in the span of five years. The conversation reminded Withuis of a question he once asked of grandmaster David Bronstein:

"Is the Queen stronger than two light pieces?"

(The bishop and knight are minor, or "light", pieces.)

The former challenger for the world title took the question seriously. "I don't know," he said. "But I will tell you later."

That evening Bronstein played a simultaneous exhibition in Amsterdam and whenever he could, he sacrificed his Queen for two minor pieces. "Now I know," he told Withuis afterwards. "The Queen is stronger."

How is that for an empirical mind? Most chessplayers would have immediately answered "yes" to Withuis's question. But Bronstein -- one of the greatest players never to be world champion and author of perhaps the best book of tournament analysis in history -- didn't know for sure. So he ran an experiment!

We should all be so curious. And humble.

I wondered for a while if Bronstein could have improved his experiment by channeling Kent Beck's Three Bears pattern. (I'm a big fan of this learning technique and mention it occasionally here, most recently last summer.) This would require him to play many games from the other side of the sacrifice as well, with a queen against his opponents' two minor pieces. Then I realized that he would have a hard time convincing any of his opponents to sacrifice their queens so readily! This may be the sort of experiment that you can only conduct from one side, though in the era of chess computers we could perhaps find, or configure, willing collaborators.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns, Teaching and Learning

January 26, 2017 3:37 PM

Another "I'm Successful Because I Was Lucky" Admission

This one from essayist and cartoonist Tim Kreider, in an AdviceToWriters interview:

What's your advice to new writers?
I first have to say that whatever moderate success I may have achieved has been so much a result of dumb luck that I feel fraudulent presuming to offer any advice to young writers, as if I did any of this on purpose or according to plan.

I appreciate humble advice.

The advice Kreider goes on to give aspiring writers is mostly obvious, as he says up front that it will be, but he also shares a quote from Thoreau that I like:

Read the best books first, or you may not have a chance to read them at all.

As someone who most days runs out of time to read as much computer science as I want, I value this reminder.


Posted by Eugene Wallingford | Permalink | Categories: General

January 06, 2017 4:29 PM

Moving a Flatter Email Archive

I'm not a New Year's resolution person, but I did make a change recently that moved me out of my comfort zone. Here's a quick version of the story.

I'm a hierarchical guy, like a lot of computer scientists, I imagine. That helps me manage a lot of complexity, but sometimes it also consumes more personal time than I'd like.

I'm also a POP mail guy. For many years, Eudora was my client of choice. A while back, I switched to Mail.app on OS X. In both, I had an elaborate filing system in which research mail was kept in a separate folder from teaching mail, which was kept in a separate folder from personal mail, which was kept in a separate folder from .... There were a dozen or so top-level folders, each having sub-folders.

Soon after I became department head a decade or so ago, I began to experience the downsides of this approach as much as the upsides. Some messages wanted to live in two folders, but I had to choose one. Even when the choice was easy, I found myself spending too many minutes each week filing away messages I would likely never think of again.

For years now, my browser- and cloud-loving friends have been extolling to me the value of leaving all my mail on the server, hitting 'archive' when I wanted to move a message out of my inbox, and then using the mail client's search feature to find messages when I need them later. I'm not likely to become a cloud email person any time soon, but the cost in time and mental energy of filing messages hierarchically finally became annoying enough that I decided to move into the search era.

January 1 was the day.

But I wasn't ready to go all the way. (Change is hard!) I'd still like to have a gross separation of personal mail from professional mail, and gross separation among email related to teaching, research, professional work, and university administration. If Mail.app had tags or labels, I might use them, but it doesn't. At this point, I have six targeted archive folders:

  • department business (I'm chair)
  • university business
  • TPLoP business (I'm a regional editor)
  • correspondence with my wife and daughters
  • other personal correspondence
  • personal finance and commerce
I also keep a folder for the course I am currently teaching, a folder for bulk mail and unavoidable mailing lists, and a folder for everything else. Everything else includes messages from mailing lists I choose to be on, such as the Racket users listserv and personal lists. None of these has subfolders.

I still have three other small hierarchies. The first is where I keep folders for other courses I have taught or plan to teach. I like the idea of keeping course questions and materials easy to find. The second is for hot topics I am working on as department head. For instance, we are currently doing a lot of work on outcomes assessment, and it's helpful to have all those messages in a separate bin. When a topic is no longer hot, I'll transfer its messages to the department archive. The third is a set of two or three small to-do boxes. Again, it's helpful to an organizer like me to have such messages in a separate bin so that I can find and respond to them quickly; eventually those messages will move to the appropriate flat archive.

Yes, there is still a lot going on here, but it's a big change for me. So far, so good. I've not felt any urges to create subfolders yet, and I've used search to find things when I've needed them. After I become habituated to this new way of living, perhaps I'll feel daring enough to go even flatter.

Let's not talk about folders in my file system, though. Hierarchy reigns supreme there, as it always has.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

December 28, 2016 1:01 PM

Unclear on the Concept, or How Ken Griffey, Jr., is like James Monroe

Today I read an article on faithless electors, those members of the Electoral College who over the years did not vote for the candidate to whom they were pledged. One story from 1820 made me think of baseball's Hall of Fame!

William Plumer, Sr. was pledged to vote for Democratic-Republican candidate James Monroe. Instead, he cast his vote for John Quincy Adams, also of the Democratic-Republican Party, although Adams was not a candidate in the 1820 election.

Supposedly, Plumer did not feel that the Electoral College should unanimously elect any president other than George Washington.

There are many Hall of Fame voters each year who practice Mr. Plumer's ostentatious electoral purity. They refuse to vote for any player on his first ballot, preserving "the legacy of their predecessors", none of whom -- neither Cobb nor Ruth, Mays nor Aaron -- were elected unanimously.

(Some leave a great player off the ballot for a more admirable reason: to use the vote to support a player they believe deserves entry but who does not receive many other votes. They hope to give such a player more time to attract a sufficient body of voters. I cut these voters a lot more slack than I cut the Plumers of the world.)

It was a silly idea in the case of President Monroe, whose unanimous election would have done nothing to diminish Washington's greatness or legacy, and it's a silly idea in the case of baseball greats like Ken Griffey, Junior.


Posted by Eugene Wallingford | Permalink | Categories: General

December 19, 2016 3:04 PM

Higher Education Has Become A Buyer's Market

... as last week's Friday Fragments reminds us.

Much of higher education is based on the premise of a seller's market. In a seller's market, the institution can decide the terms on which it will accept students. At the very elite, exclusive places, that's still largely true. Swarthmore turns away far more than it admits, and it does so on its own terms. But most of us aren't Swarthmore.

The effects of this change are numerous. It's hard to set prices, let alone correlate price and quality. University administrations are full of people confused by the shifting market. They are also full of people frantic at the thought of a drop in enrollment or retention. There are easy ways to keep these numbers up, of course, but most folks aren't willing to pay the associated price.

Interesting times, indeed.


Posted by Eugene Wallingford | Permalink | Categories: General

October 30, 2016 9:25 AM

Which Part of Speech Am I?

I saw a passage attributed to Søren Kierkegaard that I might translate as:

The life of humanity could very well be conceived as a speech in which different people represented the various parts of speech [...]. How many people are merely adjectives, interjections, conjunctions, adverbs; how few are nouns, verbs; how many are copula?

This is a natural thing to ponder around my birthday. It's not a bad thing to ask myself more often: Which part of speech will I be today?


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

October 22, 2016 2:00 PM

Competence and Creating Conditions that Minimize Mistakes

I enjoyed this interview with Atul Gawande by Ezra Klein. When talking about making mistakes, Gawande notes that humans have enough knowledge to cut way down on errors in many disciplines, but we do not always use that knowledge effectively. Mistakes come naturally from the environments in which we work:

We're all set up for failure under the conditions of complexity.

Mistakes are often more a matter of discipline and attention to detail than a matter of knowledge or understanding. Klein captures the essence of Gawande's lesson in one of his questions:

We have this idea that competence is not making mistakes and getting everything right. [But really...] Competence is knowing you will make mistakes and setting up a context that will help reduce the possibility of error but also help deal with the aftermath of error.

In my experience, this is a hard lesson for computer science students to grok. It's okay to make mistakes, but create conditions where you make as few as possible and in which you can recognize and deal with the mistakes as quickly as possible. High-discipline practices such as test-first and pair programming, version control, and automated builds make a lot more sense when you see them from this perspective.
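
For example, in a test-first workflow the test exists before the code it exercises, so a mistake announces itself the moment it is made. Here is a minimal sketch in Python; the function and its tests are invented for illustration, not taken from Gawande or Klein:

    # A tiny test-first example: the tests state the expected behavior,
    # and running them catches mistakes as soon as they appear.

    def mean(values):
        # Return the arithmetic mean of a non-empty list of numbers.
        return sum(values) / len(values)

    def test_mean_of_simple_list():
        assert mean([2, 4, 6]) == 4

    def test_mean_of_single_value():
        assert mean([10]) == 10

    if __name__ == "__main__":
        test_mean_of_simple_list()
        test_mean_of_single_value()
        print("All tests pass.")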


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Software Development, Teaching and Learning

October 15, 2016 10:47 AM

A View of Self-Driving Cars from 1956

A friend and fellow classic science fiction fan told me that one of his favorite books as a teenager was Robert Heinlein's The Door into Summer. I've read a lot of Heinlein but not this one, so I picked it up at the library.

Early in the book, protagonist Daniel B. Davis needed to make the most of the next twenty-four hours. He located his car, dropped some money into the "parking attendant", set course, and relaxed as the car headed out into traffic:

Or tried to relax. Los Angeles traffic was too fast and too slashingly murderous for me to be really happy under automatic control; I wanted to redesign their whole installation--it was not a really modern "fail safe". By the time we were west of Western Avenue and could go back on manual control I was edgy and wanted a drink.

This scene is set in December 1970; Heinlein wrote it in 1956. He may have missed by forty-five years or more the year in which self-driving cars would be common technology, but I think he got the feeling right. People like to be in control of their actions, especially when dropped into hectic conditions they can't control. Heinlein's character is an engineer, so naturally he thinks he could design a better one. None of my programmer friends are like this, of course.

It's also interesting to note that automatic control was required in the heaviest traffic. Once he got into a calmer setting, Davis could go back to driving himself. The system allows the human to drive only when he isn't a danger to other people, or even himself!

Today, it is commonplace to think that the biggest challenges of the move to self-driving cars are cultural, not technological: getting people to accept that the cars can drive themselves more safely than humans can drive them, and getting people to give up control. It's neat to see that Heinlein recognized this sixty years ago.


Posted by Eugene Wallingford | Permalink | Categories: General

October 06, 2016 2:46 PM

Computers Shouldn't Need a Restart Button (Memories of Minix)

An oldie but goodie from Andrew Tanenbaum:

Actually, MINIX 3 and my research generally is **NOT** about microkernels. It is about building highly reliable, self-healing, operating systems. I will consider the job finished when no manufacturer anywhere makes a PC with a reset button. TVs don't have reset buttons. Stereos don't have reset buttons. Cars don't have reset buttons. They are full of software but don't need them. Computers need reset buttons because their software crashes a lot. I know that computer software is different from car software, but users just want them both to work and don't want lectures why they should expect cars to work and computers not to work. I want to build an operating system whose mean time to failure is much longer than the lifetime of the computer so the average user never experiences a crash.

I remember loving MINIX 1 (it was just called Minix then, of course) when I first learned it in grad school. I did not have any Unix experience coming out of my undergrad and had only begun to feel comfortable with BSD Unix in my first few graduate courses. Then I was assigned to teach the Operating Systems course, working with one of the CS faculty. He taught me a lot, but so did Tanenbaum -- through Minix. That is one of the first times I came to really understand that the systems we use (the OS, the compiler, the DBMS) were just programs that I could tinker with, modify, and even write.

Operating systems is not my area, and I have no expertise for evaluating the whole microkernel versus monolith debate. But I applaud researchers like Tanenbaum who are trying to create general computer systems that don't need to be rebooted. I'm a user, too.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

October 02, 2016 10:03 AM

Tom Wolfe on Writer's Block

In the Paris Review's The Art of Fiction No. 123, Tom Wolfe tells how he learned about writer's block. Wolfe was working at Esquire magazine, and his first editor, Byron Dobell, had assigned him to write an article about car customizers. After doing all his research, he was totally blocked.

I now know what writer's block is. It's the fear you cannot do what you've announced to someone else you can do, or else the fear that it isn't worth doing. That's a rarer form. In this case I suddenly realized I'd never written a magazine article before and I just felt I couldn't do it. Well, Dobell somehow shamed me into writing down the notes that I had taken in my reporting on the car customizers so that some competent writer could convert them into a magazine piece. I sat down one night and started writing a memorandum to him as fast as I could, just to get the ordeal over with. It became very much like a letter that you would write to a friend in which you're not thinking about style, you're just pouring it all out, and I churned it out all night long, forty typewritten, triple-spaced pages. I turned it in in the morning to Byron at Esquire, and then I went home to sleep.

Later that day, Dobell called him to say that they were deleting the "Dear Byron" at the top of the memo and running the piece.

Most of us need more editing than that after we write anything, but... No matter; first you have to write something. Even if it's the product of a rushed all-nighter, just to get an obligation off our table.

When I write, and especially when I program, my reluctance to start usually grows out of a different sort of fear: the fear that I won't be able to stop, or want to. Even simple programming tasks can become deep holes into which we fall. I like that feeling, but I don't have enough control of my work schedule most days to be able to risk disappearing like that. What I could use is an extra dose of audacity or impetuosity. Or maybe a boss like Byron Dobell.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

September 28, 2016 3:16 PM

Language, and What It's Like To Be A Bat

My recent post about the two languages resonated in my mind with an article I finished reading the day I wrote the post: Two Heads, about the philosophers Paul and Pat Churchland. The Churchlands have been on a forty-year quest to change the language we use to describe our minds, from popular terms based in intuition and introspection to terms based in the language of neuroscience. Changing language is hard under any circumstances, and it is made harder when the science they need is still in its infancy. Besides, maybe more traditional philosophers are right and we need our traditional vocabulary to make sense of what it feels like to be human?

The New Yorker article closes with these paragraphs, which sound as if they were part of a proposal for a science fiction novel:

Sometimes Paul likes to imagine a world in which language has disappeared altogether. We know that the two hemispheres of the brain can function separately but communicate silently through the corpus callosum, he reasons. Presumably, it will be possible, someday, for two separate brains to be linked artificially in a similar way and to exchange thoughts infinitely faster and more clearly than they can now through the muddled, custom-clotted, serially processed medium of speech. He already talks about himself and Pat as two hemispheres of the same brain. Who knows, he thinks, maybe in his children's lifetime this sort of talk will not be just a metaphor.

If, someday, two brains could be joined, what would be the result? A two-selved mutant like Joe-Jim, really just a drastic version of Siamese twins, or something subtler, like one brain only more so, the pathways from one set of neurons to another fusing over time into complex and unprecedented arrangements? Would it work only with similar brains, already sympathetic, or, at least, both human? Or might a human someday be joined to an animal, blending together two forms of thinking as well as two heads? If so, a philosopher might after all come to know what it is like to be a bat, although, since bats can't speak, perhaps he would be able only to sense its batness without being able to describe it.

(Joe-Jim is a character from a science fiction novel, Robert Heinlein's Orphans of the Sky.)

What a fascinating bit of speculation! Can anyone really wonder why kids are drawn to science fiction?

Let me add my own speculation to the mix: If we do ever find a way to figure out what it's like to be a bat, people will find a way to describe what it's like to be a bat. They will create the language they need. Making language is what we do.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Patterns

September 26, 2016 2:54 PM

A Lesson from Tolkien about Commitment and Ignorance

Elrond has just addressed the Fellowship of the Ring, reminding them that only the Ring-Bearer is charged with completing the task ahead of them. The others "go with him as free companions", to assist in whatever ways they are able. He enters into an exchange with Gimli:

"The further you go, the less easy it will be to withdraw; yet no oath or bond is laid on to go further than you will. For you do not yet know the strength of your hearts, and you cannot foresee what each may meet upon the road."

"Faithless is he who says farewell when the road darkens," said Gimli.

"Maybe," said Elrond, "but let him not vow to walk in the dark who has not seen the nightfall."

"Yet sworn word may strengthen the quaking heart," said Gimli.

"Or break it," said Elrond. "Look not too far ahead!"

This is a tension we all live: the desire to make unconditional promises about the future to our lovers and compatriots despite not even knowing what is possible in that future. I love Elrond's response, "Let him not vow to walk in the dark who has not seen the nightfall." Members of the Fellowship found that their future contained evil beyond their comprehension and temptations beyond their imagination.

Our challenge is to constantly balance this tension: to live with the confidence of Gimli, but tempered by the pragmatic awareness of our ignorance that Elrond offers. Sometimes, commitment gives us the strength to continue on in the face of fear. Sometimes, though, there is no shame in turning back.


Posted by Eugene Wallingford | Permalink | Categories: General

September 25, 2016 9:40 AM

There Is Only One Culture, But Two Languages

W.H. Auden, in A Certain World, on the idea of The Two Cultures:

Of course, there is only one. Of course, the natural sciences are just as "humane" as letters. There are, however, two languages, the spoken verbal language of literature, and the written sign language of mathematics, which is the language of science. This puts the scientist at a great advantage, for, since like all of us he has learned to read and write, he can understand a poem or a novel, whereas there are very few men of letters who can understand a scientific paper once they come to the mathematical parts.

When I was a boy, we were taught the literary languages, like Latin and Greek, extremely well, but mathematics atrociously badly. Beginning with the multiplication table, we learned a series of operations by rote which, if remembered correctly, gave the "right" answer, but about any basic principles, like the concept of number, we were told nothing. Typical of the teaching methods then in vogue is the mnemonic which I had to learn.
Minus times Minus equals Plus:
The reason for this we need not discuss.

Sadly, we still teach young people that it's okay if math and science are too hard to master. They grow into adults who feel a chasm between "arts and letters" and "math and science". But as Auden notes rightly, there is no chasm; there is mostly just another language to learn and appreciate.

(It may be some consolation to Auden that we've reached a point where most scientists have to work to understand papers written by scientists in other disciplines. They are written in highly specialized languages.)

In my experience, it is more acceptable for a humanities person to say "I'm not a science person" or "I don't like math" than for a scientist to say something similar about literature, art, or music. The latter person is thought, silently, to be a Philistine; the former, an educated person with a specialty.

I've often wondered if this experience suffers from observation bias or association bias. It may well. I certainly know artists and writers who have mastered both languages and who remain intensely curious about questions that span the supposed chasm between their specialties and mine. I'm interested in those questions, too.

Even with this asymmetry, the presumed chasm between cultures creates low expectations for us scientists. Whenever my friends in the humanities find out that I've read all of Kafka's novels and short stories; that Rosencrantz and Guildenstern Are Dead is my favorite play, or that I even have a favorite play; that I really enjoyed the work of choreographer Merce Cunningham; that my office bookshelf includes the complete works of William Shakespeare and a volume of William Blake's poetry -- I love the romantics! -- most seem genuinely surprised. "You're a computer scientist, right?" (Yes, I like Asimov, Heinlein, Clarke, and Bradbury, too.)

Auden attributes his illiteracy in the language of mathematics and science to bad education. The good news is that we can reduce, if not eliminate, the language gap by teaching both languages well. This is a challenge for both parents and schools and will take time. Change is hard, especially when it involves the ways we talk about the world.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

September 22, 2016 3:03 PM

RSS > Email

Newsletters delivered by email seem all the rage these days. I subscribe to only two or three. Every once in a while, the bulk mailer used by these folks gets blacklisted by some spam filtering service used by our mail server, the mail server respects the blacklist, and I don't receive my newsletter. We've whitelisted the senders of two particular newsletters, but even so I occasionally don't receive the message.

 the standard icon for RSS subscription, via Wikipedia

This is one reason I still love RSS. My newsreader is in control of the exchange. Once authors post their articles and update their feeds, my newsreader can see them. I hit refresh, and the articles appear. RSS is not perfect; occasionally a blog updates its feed and I see a bunch of old articles in my reader. But I've been following some bloggers for well over a decade, and it has served us all well.
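
The exchange really is that simple. Here is a rough sketch of what a newsreader does at its core, using the third-party feedparser library; the feed URL below is only a placeholder, not a real address:

    import feedparser  # third-party library: pip install feedparser

    # Fetch a feed and list its current entries. Any RSS or Atom feed URL
    # would work here; this one is a stand-in.
    feed = feedparser.parse("https://example.com/feed.xml")

    print(feed.feed.get("title", "(untitled feed)"))
    for entry in feed.entries:
        print("-", entry.get("title", "(untitled)"), entry.get("link", ""))

A real newsreader adds bookkeeping -- remembering which entries it has already shown -- but the heart of it is just polling the feed and reading what's there.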

Do not expect me to hit you up for your email address anytime soon. I understand some of the reasons for going the newsletter route, but I think I'll keep publishing on my blog with a newsfeed for a while. That said, I love to hear from readers. Send me email any time, or tweet me at @wallingf.


Posted by Eugene Wallingford | Permalink | Categories: General

September 18, 2016 3:49 PM

Talking Shop

a photo of the Blueridge Orchard, visited by cyclists on the Cedar Valley Farm Ride

I agree with W.H. Auden:

Who on earth invented the silly convention that it is boring or impolite to talk shop? Nothing is more interesting to listen to, especially if the shop is not one's own.

My wife went on a forty-mile bike ride this morning, a fundraiser for the Cedar Valley Bicycle Collective, which visited three local farms. At those stops, I had the great fortune to listen to folks on all three farms talk shop. We learned about making ice cream and woodcarving at Three Pines Farm. We learned about selecting, growing, and picking apples -- and the damage hail and bugs can do -- at Blueridge Orchard. And the owner of the Fitkin Popcorn Farm talked about the popcorn business. He showed us the machines they use to sort the corn out of the field, first by size and then by density. He also talked about planting fields, harvesting the corn, and selling the product nationally. I even learned that we can pop the corn while it's still on the ears! (This will happen in my house very soon.)

I love to listen to people talk shop. In unguarded moments, they speak honestly about something they love and know deeply. They let us in on what it is like for them to work in their corner of the world. However big I try to make my world, there is so much more out there to learn.

The Auden passage is from his book A Certain World, a collage of poems, quotes, and short pieces from other writers with occasional comments of his own. Auden would have been an eclectic blogger! This book feels like a Tumblr blog, without all the pictures and 'likes'. Some of the passages are out of date, but they let us peek in on the mind of an accomplished poet. A little like good shop talk.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

September 16, 2016 12:41 PM

We Are All Mashups

Sheldon reminds Leonard that he never really had a personality, on The Big Bang Theory

There is a scene in The Big Bang Theory where Sheldon laments that, without realizing it, he had allowed his girl/friend to alter his personality. Leonard responds, "Well, you didn't really have a 'personality'. You just had some shows you liked."

This scene came to mind when I read a passage from Kenneth Goldsmith's Uncreative Writing earlier this week:

I don't think there's a stable or essential "me". I am an amalgamation of many things: books I've read, movies I've seen, television shows I've watched, conversations I've had, songs I've sung, lovers I've loved. In fact, I'm a creation of so many people and so many ideas, to the point where I feel I've actually had few original thoughts and ideas; to think that what I consider to be "mine" was "original" would be blindingly egotistical.

It is occasionally daunting when I realize how much I am a product of the works, people, and ideas I've encountered. How can I add anything new? But when I surrender to the fact that I can't, it frees me to write and do things that I like. What I make may not be new, but it can still be useful or valuable, even if only to me.

I wonder what it's like for kids to grow up in a self-consciously mash-up culture. My daughters have grown up in a world where technology and communication have given everyone the ability to mix and modify other work so easily. It's a big part of the entertainment they consume.

Mash-up culture must feel hugely empowering in some moments and hugely terrifying in others. How can anyone find his or her own voice, or say something that matters? Maybe they have a better sense than I did growing up that nothing is really new and that what really matters is chasing your interests, exploring the new lands you enter, and sharing what you find. That's certainly been the source of my biggest accomplishments and deepest satisfactions.

(I ran across the passage from Goldsmith on Austin Kleon's blog.)


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

September 10, 2016 10:41 AM

Messages Rather Than Conversations

Kevin Kelly, in Amish Hackers:

One Amish-man told me that the problem with phones, pagers, and PDAs (yes he knew about them) was that "you got messages rather than conversations". That's about as an accurate summation of our times as any. Henry, his long white beard contrasting with his young bright eyes told me, "If I had a TV, I'd watch it." What could be simpler?

Unlike some younger Amish, I still do not carry a smart phone. I do own a cell but use it only when traveling. If our home phone disappeared overnight, it would likely take several days before my wife or I even noticed.

I also own a television, a now-déclassé 32" flat screen. Henry is right: having a TV, I find myself watching it on occasion. I enjoy it but have to guard vigilantly against falling into a hypnotic trance. It turns out that I form certain habits quite easily.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

September 06, 2016 2:44 PM

"Inception" and the Simulation Argument

If Carroll's deconstruction of the simulation argument is right, then the more trouble we have explaining consciousness, the more that should push us to believe we're in a ground-level simulation. There's probably a higher-level version of physics in which consciousness makes sense. Our own consciousness is probably being run in a world that operates on that higher-level law. And we're stuck in a low-resolution world whose physics doesn't allow consciousness -- because if we weren't, we'd just keep recursing further until we were.

-- Scott Alexander, The View From Ground Level

two characters from the film 'Inception' walking in a dream world where space folds back on itself

In the latest installment of "You Haven't Seen That Yet?", I watched the film Inception yesterday. There was only one person watching, but still the film gets two hearty thumbs-up. All those Ellen Pages waking up, one after the other...

Over the last few years, I've heard many references to the idea from physics that we are living in a simulation, that our universe is a simulation created by beings in another universe. It seems that some physicists think and talk about this a lot, which seems odd to me. Empiricism can't help us much to unravel the problem; arguments pro and con come down to the sort of logical arguments favored by mathematicians and philosophers, abstracted away from observation of the physical world. It's a fun little puzzle, though. The computer science questions are pretty interesting, too.

Ideas like this are old hat to those of us who read a lot of science fiction growing up, in particular Philip K. Dick. Dick's stories were often predicated on suspending some fundamental attribute of reality, or our perception of it, and seeing what happened to our lives and experiences. Now that I have seen Memento (a long-time favorite of mine) and Inception, I'm pretty happy. What Philip K. Dick was with the written word to kids of my generation, Christopher Nolan is on film to a younger generation. I'm glad I've been able to experience both.

~~~~

The photo above comes from Matt Goldberg's review of Inception. It shows Arthur, the character played by Joseph Gordon-Levitt, battling with a "projection" in three-dimensional space that folds back on itself. Such folding is possible in dream worlds and is an important element in designing dreams that enable all the cool mind games that are central to the film.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

September 03, 2016 4:49 PM

The Innate Boundary

If you love a demanding task, one that requires both discipline and talent -- shooting hoops, playing drums, writing code -- you eventually discover an innate boundary: you can apprehend real virtuosity, especially as it's used to best you, but you can't quite incorporate it. You will never be more than almost-great.

-- Tad Friend, in Squash for the Midlife Slump.

Still, you get to love. That's a good thing.


Posted by Eugene Wallingford | Permalink | Categories: General

August 22, 2016 4:18 PM

A New Way to Debug Our Election Systems

In The Security of Our Election Systems, Bruce Schneier says that we no longer have time to sound the alarm about security flaws in our election systems and hope that government and manufacturers will take action. Instead...

We must ignore the machine manufacturers' spurious claims of security, create tiger teams to test the machines' and systems' resistance to attack, drastically increase their cyber-defenses and take them offline if we can't guarantee their security online.

How about this:

The students in my department love to compete in cyberdefense competitions (CDCs), in which they are charged with setting up various systems and then defending them against attack from experts for some period, say, twenty-four hours. Such competitions are growing in popularity across the country.

Maybe we should run a CDC with the tables turned. Manufacturers would be required to set up their systems and to run the full set of services they promise when they sell the systems to government agencies. Students across the US would then be given a window of twenty-four hours or more to try to crack the systems, with the manufacturers or even our election agencies trying to keep their systems up and running securely. Any vulnerabilities that the students find would be made public, enabling the manufacturers to fix them and the state agencies to create and set up new controls.

Great idea or crazy fantasy?


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

August 14, 2016 10:55 AM

Hemingway on Teachers, While Teaching

Ernest Hemingway sitting on a dock next to his boat, Pilar, in the 1930s

Early in Arnold Samuelson's With Hemingway: A Year in Key West and Cuba, Papa is giving an impromptu lecture about writing to two aspiring young writers. (He does that a lot in the book, whenever the men are out sailing and fishing.) This particular lecture was prompted by what he thought was bad advice in a book by a successful pulp fiction author on how to get started as a writer. An earlier session had focused on the shortcomings of going to college to learn how to become a writer.

Toward the end of his discourse, Hemingway tells the young writers to do daily writing exercises and generously offers to read their work, giving feedback on how to get better. This offer elicits a few more remarks about the idea of college writing professors:

"They ought to have me teach some of those college classes. I could teach them something. Most professors of English composition can tell the students what's wrong with their pieces but they don't know how to make them good, because, if they knew that, they'd be writers themselves and they wouldn't have to teach."

"What do you think of the life of a professor?"

"All right for a person who is vain and likes to have the adulation of his students. Well, now, do you fellows think you can remember everything Professor Hemingway has told you? Are you prepared for a written examination on the lecture?"

Teaching computer science must be different from teaching fiction writing. I have been teaching for quite a few years now and have never received any adulation. Then again, though, I've never experienced much derision either. My students seem to develop a narrower set of emotions. Some seem to like me quite a bit and look for chances to take another course with me. Other students are... indifferent. To them, I'm just the guy standing in the way of them getting to somewhere else they want to be.

Hemingway's "have to teach" dig is cliché. Perhaps the Ernest Hemingways and Scott Fitzgeralds of the world should be devoting all of their time to writing, but there have a been any number of excellent authors who have supplemented their incomes and filled the down time between creative bursts by helping other writers find a path for themselves. Samuelson's book itself is a testament to how much Papa loved to share his wisdom and to help newcomers find their footing in a tough business. During all those hours at sea, Hemingway was teaching.

Still, I understand what Hemingway means when he speaks of the difference between knowing that something is bad and knowing how to make something good. One of the biggest challenges I faced in my early years as a professor was figuring out how to go beyond pointing out errors and weaknesses in my students' code to giving them concrete advice on how to design and write good programs. I'm still learning how to do that.

I'm lucky that I like to write programs myself. Writing code and learning new styles and languages is the only way to stay sharp. Perhaps if I were really good, I'd leave academia and build systems for Google or some hot start-up, as Hemingway would have it. I'm certainly under no illusion that I can simulate that kind of experience working at a university. But I do think a person can both do and teach, and that the best teachers are ones who take both seriously. In computer science, it is a constant challenge to keep up with students who are pushing ahead into a world that keeps changing.

~~~~

The photo above comes from the John F. Kennedy Presidential Library and Museum. It shows Hemingway sitting on a dock next to his boat, Pilar, sometime in the 1930s. The conversation quoted above took place on the Pilar in 1934.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

August 11, 2016 10:49 AM

To Founders in Search of Patience and Low Costs

Nils Pihl, CEO at Traintracks.io, writes about the benefits of launching the start-up in Beijing:

It took two years of hard work and late nights at the whiteboard to build a prototype of something we knew we could be proud of -- and what Silicon Valley investor would agree to fund something that would take two years to release? Not only that, but it would have cost us roughly 6 times as much money to develop it in Silicon Valley -- for no immediate benefit.

If moving to Beijing is not an option for you, fear not. You do not have to travel that far to find patient investors, great programmers, and low cost of living. Try Des Moines. Or St. Louis. Or Indianapolis. Or, if you must live in a Major World City, try Chicago. Even my small city can offer a good starting point, though programmers are not as plentiful as we might like.

The US Midwest has a lot of advantages for founders, but none of the smog you'll find in Beijing and much shorter commutes than you will find in all the places people tell you you have to go.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

August 04, 2016 11:51 AM

The Spirit Of Our Time: Sensation Brought To An Extreme

Jack Levine, on painting as a realist in the 1950s, a time of abstract expressionism and art as social commentary:

The difficulty is for me to be affirmative. I'm a little inhibited, as you have noticed, by not being against any of these people. The spirit of denunciation is more in the spirit of our time: sensation brought to an extreme.

Levine might just as well have been talking about today's social and political climate. Especially if he had had a Facebook or Twitter account.

~~~~

(This passage comes from Conversations with Artists. These entries also draw passages from it: [ 07/19 | 07/27 | 07/31 ]. This is my last entry drawn from the book, at least for now.)


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

August 03, 2016 1:56 PM

Programming: Don't Knock It Till You Try It

We have a fair number of students on campus outside of CS who want to become web designers, but few of them think they should learn to program. Some give it a try when one of our communications profs tells them how exciting and liberating it can be. In general, though, it's a hard sell. Programming sounds boring to them, full of low-level details better left to techies over in computer science.

This issue pervades the web design community. In The Bomb in the Garden, Matthew Butterick does a great job of explaining why the web as a design medium is worth saving, and pointing to ways in which programming can release the creativity we need to keep it alive.

Which brings me to my next topic--what should designers know about programming?

And I know that some of you will think this is beating a dead horse. But when we talk about restoring creativity to the web, and expanding possibilities, we can't avoid the fact that just like the web is a typographic medium, it's also a programmable medium.

And I'm a designer who actually does a lot of programming in my work. So I read the other 322,000 comments about this on the web. I still think there's a simple and non-dogmatic answer, which is this:

You don't have to learn programming, but don't knock it till you try it.

It's fun for me when one of the web design students majoring in another department takes his or her first programming class and is sparked by the possibilities that writing a program opens up. And we in CS are happy to help them go deeper into the magic.
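
To give a flavor of what sparks them: even a few lines of code can generate markup that would be tedious to write by hand. This is only a toy sketch in Python; the colors and file name are made up for the example:

    # Generate a page of color swatches -- repetitive markup no one
    # wants to type by hand. Open swatches.html in a browser to see it.

    colors = ["#1b9e77", "#d95f02", "#7570b3", "#e7298a", "#66a61e"]

    rows = "\n".join(
        f'<div style="background:{c};padding:1em;font-family:sans-serif">{c}</div>'
        for c in colors
    )

    with open("swatches.html", "w") as f:
        f.write(f"<!DOCTYPE html><html><body>{rows}</body></html>")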

Butterick speaks truth when he says he's a designer who does a lot of programming in his work. Check out Pollen, the publishing system he created to write web-based books. Pollen's documentation says that it "helps authors make functional and beautiful digital books". That's true. It's a very nice package.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 31, 2016 10:19 AM

"I Live In Air Filled With Images..."

Leonard Baskin waved his arms around his head:

I tell you honestly that I do not live in air. I live in air filled with images, waiting, waiting. And they are mad at me because I don't make them. This is not a fantasy. It is real, I assure you.

I know a few programmers who feel the same way about code. I have periods of such immediacy myself.

This is one of those double-edged phenomena, though. Many people would like to find some goal or activity that so enlivens their world, but they also do not want it to drive them to endless distraction. Fortunately, when we get deep into creating a new something, the swirl goes away for a while.

(This passage comes from Conversations with Artists, which I have now quoted a few times. I promise not to type the entire book into my blog.)


Posted by Eugene Wallingford | Permalink | Categories: General

July 13, 2016 11:19 AM

A Student Asks About Pursuing Research Projects

Faculty in my department are seeking students to work on upcoming research projects. I've sent a couple of messages to our student mailing list this week with project details. One of my advisees, a bright guy with a good mind and several interests, sent me a question about applying. His question got to the heart of a concern many students have, so I responded to the entire student list. I thought I'd share the exchange as an open letter to all students out there who are hesitant about pursuing an opportunity.

The student wrote something close to this:

Both professors' projects seem like great opportunities, but I don't feel even remotely qualified for either of them. I imagine many students feel like this. The projects both seem like they'd entail a really advanced set of skills -- especially needing mathematics -- but they also require students with at least two semesters left of school. Should I bother contacting them? I don't want to jump the gun and rule myself out.

Many students "self-select out" -- choose not to pursue an opportunity -- because they don't feel qualified. That's too bad. You would be surprised how often the profs would be able to find a way to include a student who are interested in their work. Sometimes, they work around a skill the student doesn't have by finding a piece of the project he or she can contribute to. More often, though, they help the student begin to learn the skill they need. We learn many things best by doing them.

Time constraints can be a real issue. One semester is not enough time to contribute much to some projects. A grant may run for a year and thus work best with a student who will be around for two or more semesters. Even so, the prof may be able to find a way to include you. They like what they do and like to work with other people who do, too.

My advice is to take a chance. Contact the professor. Stop in to talk with him or her about your interest, your skills, and your constraints. The worst case scenario is that you get to know the professor a little better while finding out that this project is not a good fit for you. Another possible outcome, though, is that you find a connection that leads to something fruitful. You may be surprised!

~~~~

Postscript. One student has stopped in already this morning to thank me for the encouragement and to say that he is going to contact one of the profs. Don't let a little uncertainty stand in the way of pursuing something you like.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

May 30, 2016 8:43 AM

"The One Form of Poverty That Should Be Shunned"

In her essay "The Importance of Being Scared", poet Wislawa Szymborska talked about the fairy tales of Hans Christian Andersen, which were often scary in ways that are out of sync with modern sensibilities. Of Andersen, she wrote:

He didn't believe that you should try to be good because it pays (as today's moral tales insistently advertise, though it doesn't necessarily turn out that way in real life), but because evil stems from intellectual and emotional stuntedness and is the one form of poverty that should be shunned.

I love that phrase: "the one form of poverty that should be shunned", as well as Andersen's prescription.

I need to read more Szymborska. She is often quite funny. This book review quotes a passage from the review of a book on caves that made me smile:

The first to discover caves were of course those animals who could find their way in the dark. Cavemen, who had already lost this gift, couldn't venture too far into their caves. They had to stick to the edges. It's not that they didn't have the nerve, they just didn't have flashlights.

That joke hits close to home in my work as a programmer and teacher, and even as department head. Sometimes, I don't need more nerve. I need a flashlight.


Posted by Eugene Wallingford | Permalink | Categories: General

May 15, 2016 9:36 AM

An Interview about Encryption

A local high school student emailed me last week to say that he was writing a research paper about encryption and the current conversation going on regarding its role in privacy and law enforcement. He asked if I would be willing to answer a few interview questions, so that he could have a few expert quotes for his paper. I'm always glad when our local schools look to the university for expertise, and I love to help young people, so I said yes.

I have never written anything here about my take on encryption, Edward Snowden, or the FBI case against Apple, so I figured I'd post my answers. Keep in mind that my expertise is in computer science. I am not a lawyer, a political scientist, or a philosopher. But I am an informed citizen who knows a little about how computers work. What follows is a lightly edited version of the answers I sent the student.

  1. Do you use encryption? If so, what do you use?

    Yes. I encrypt several disk images that hold sensitive financial data. I use encrypted files to hold passwords and links to sensitive data. My work laptop is encrypted to protect university-related data. And, like everyone else, I happily use https: when it encrypts data that travels between me and my bank and other financial institutions on the web.

  2. In light of the recent news on groups like ISIS using encryption, and the Apple v. Department of Justice, do you support legislation that eliminates or weakens powerful encryption?

    I oppose any legislation that weakens strong encryption for ordinary citizens. Any effort to weaken encryption so that the government can access data in times of need weakens encryption for all people at all times and against all intruders.

  3. Do you think the general good of encryption (protection of data and security of users) outweighs or justifies its usage when compared to the harmful aspects of it (being used by terrorists groups or criminals)?

    I do. Encryption is one of the great gifts that computer science has given humanity: the ability to be secure in one's own thoughts, possessions, and communication. Any tool as powerful as this one can be misused, or used for evil ends.

    Encryption doesn't protect us from only the U.S. government acting in good faith. It protects people from criminals who want to steal our identities and our possessions. It protects people from the U.S. government acting in bad faith. And it protects people from other governments, including governments that terrorize their own people. If I were a citizen of a repressive regime in the Middle East, Africa, Southeast Asia, or anywhere else, I would want the ability to communicate without intrusion from my government.

    Those of us who are lucky to live in safer, more secure circumstances owe this gift to the people who are not so lucky. And weakening it for anyone weakens it for everyone.

  4. What is your response to someone who justifies government suppression of encryption with phrases like "What are you hiding?" or "I have nothing to hide."?

    I think that most people believe in privacy even when they have nothing to hide. As a nation, we do not allow police to enter our homes at any time for any reason. Most people lock their doors at night. Most people pull their window shades down when they are bathing or changing clothes. Most people do not have intimate relations in public view. We value privacy for many reasons, not just when we have something illegal to hide.

    We do allow the police to enter our homes when executing a search warrant, after the authorities have demonstrated a well-founded reason to believe it contains material evidence in an investigation. Why not allow the authorities to enter our digital devices under similar circumstances? There are two reasons.

    First, as I mentioned above, weakening encryption so that the government can access data in times of legitimate need weakens encryption for everyone all the time and makes them vulnerable against all intruders, including bad actors. It is simply not possible to create entry points only for legitimate government uses. If the government suppresses encryption in order to assist law enforcement, there will be disastrous unintended side effects to essential privacy of our data.

    Second, our digital devices are different than our homes and other personal property. We live in our homes and drive our cars, but our phones, laptops, and other digital devices contain fundamental elements of our identity. For many, they contain the entirety of our financial and personal information. They also contain programs that enact common behaviors and would enable law enforcement to recreate past activity not stored on the device. These devices play a much bigger role in our lives than a house.

  5. In 2013 Edward Snowden leaked documents detailing surveillance programs that overstepped boundaries spying on citizens. Do you think Snowden became "a necessary evil" to protect citizens that were unaware of surveillance programs?

    Initially, I was unsympathetic to Snowden's attempt to evade detainment by the authorities. The more I learned about the programs that Snowden had uncovered, the more I came to see that his leak was an essential act of whistleblowing. The American people deserve to know what their government is doing. Indeed, citizens cannot direct their government if they do not know what their elected officials and government agencies are doing.

  6. In 2013 to now, the number of users that are encrypting their data has significantly risen. Do you think that Snowden's whistleblowing was the action responsible for a massive rise in Americans using encryption?

    I don't know. I would need to see some data. Encryption is a default in more software and on more devices now. I also don't know what the trend line for user encryption looked like before his release of documents.

  7. Despite recent revelations on surveillance, millions of users still don't voluntarily use encryption. Do you believe it is fear of being labeled a criminal or the idea that encryption is unpatriotic or makes them an evil person?

    I don't know. I expect that there are a number of bigger reasons, including apathy and ignorance.


  8. Encryption by default on devices like iPhones, where the device is encrypted while locked with a passcode, is becoming the norm. Do you support the usage of default encryption and believe it protects users who aren't computer savvy?

    I like encryption by default on my devices. It comes with risks: if I lose my password, I lose access to my own data. I think that users should be informed that encryption is turned on by default, so that they can make informed choices.

  9. Should default encryption become required by law or distributed by the government to protect citizens from foreign governments or hackers?

    I think that we should encourage people to encrypt their data. At this point, I am skeptical of laws that would require it. I am not a legal scholar and do not know that the government has the authority to require it. I also don't know if that is really what most Americans want. We need to have a public conversation about this.

  10. Do you think other foreign countries are catching up or have caught up to the United States in terms of technical prowess? Should we be concerned?

    People in many countries have astonishing technical prowess. Certainly individual criminals and other governments are putting that prowess to use. I am concerned, which is one reason I encrypt my own data and encourage others to do so. I hope that the U.S. government and other American government agencies are using encryption in an effort to protect us. This is one reason I oppose the government mandating weakness in encryption mechanisms for its own purposes.

  11. The United States government disclosed that it was hacked and millions of employees' information was compromised. Target suffered a breach that resulted in credit card information being stolen. Should organizations and companies be legally responsible for breaches like these? What reparations should they make?

    I am not a lawyer, but... Corporations and government agencies should take all reasonable precautions to protect the sensitive data they store about their customers and citizens. I suspect that corporations are already subject to civil suit for damages caused by data breaches, but that places burdens on people to recover damages for losses due to breached data. This is another area where we as a people need to have a deeper conversation so that we can decide to what extent we want to institute safeguards into the law.

  12. Should the US begin hacking into other countries' infrastructures and businesses to potentially damage those countries in the future or steal trade secrets, similar to what China has done to us?

    I am not a lawyer or military expert, but... In general, I do not like the idea of our government conducting warfare on other peoples and other governments when we are not in a state of war. The U.S. should set a positive moral example of how a nation and a people should behave.

  13. Should the US be allowed to force companies and corporations to create backdoors for the government? What do you believe would be the fallout from such an event?

    No. See the third paragraph of my answer to #4.

As I re-read my answers, I realize that, even though I have thought a lot about some of these issues over the years, I have a lot more thinking to do. One of my takeaways from the interview is that the American people need to think about these issues and have public conversations in order to create good public policy and to elect officials who can effectively steward the government in a digital world. In order for this to happen, we need to teach everyone enough math and computer science that they can participate effectively in these discussions and in their own governance. This has big implications for our schools and science journalism.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

May 07, 2016 10:55 AM

Without Wonder, Without Awe

Henry Miller, in "The Books in My Life" (1969):

Every day of his life the common man makes use of what men in other ages would have deemed miraculous means. In the range of invention, if not in powers of invention, the man of today is nearer to being a god than at any time in history. (So we like to believe!) Yet never was he less godlike. He accepts and utilizes the miraculous gifts of science unquestioningly; he is without wonder, without awe, reverence, zest, vitality, or joy. He draws no conclusions from the past, has no peace or satisfaction in the present, and is utterly unconcerned about the future. He is marking time.

It's curious to me that this was written around the same time as Stewart Brand's clarion call that we are as gods. The zeitgeist of the 1960s, perhaps.

"The Books in My Life" really has been an unexpected gift. As I noted back in November, I picked it up on a lark after reading a Paris Review interview with Miller, and have been reading it off and on since. Even though he writes mostly of books and authors I know little about, his personal reflections and writing style click with me. Occasionally, I pick up one of the books he discusses, ost recently Richard Jefferies's The Story of My Heart.

When other parts of the world seem out of sync, picking up the right book can change everything.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

May 05, 2016 1:45 PM

Philosopher-Programmer

In her 1942 book Philosophy in a New Key, philosopher Susanne Langer wrote:

A question is really an ambiguous proposition; the answer is its determination.

This sounds like something a Prolog programmer might say in a philosophical moment. Langer even understood how tough it can be to write effective Prolog queries:

The way a question is asked limits and disposes the ways in which any answer to it -- right or wrong -- may be given.

Try sticking a cut somewhere and see what happens...

It wouldn't be too surprising if a logical philosopher reminded me of Prolog, but Langer's specialties were consciousness and aesthetics. Now that I think about it, though, this connection makes sense, too.

Prolog can be a lot of fun, though logic programming always felt more limiting to me than most other styles. I've been fiddling again with Joy, a language created by a philosopher, but every so often I think I should earmark some time to revisit Prolog.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Software Development

April 28, 2016 4:12 PM

A Homeric Take on the Power of Programming

By participating in history instead of standing by to watch we shall at least be able to enjoy the present. ... You should read your Homer. Gods who manipulate the course of destiny are no more likely to achieve their private ambitions than are men who suffer the slings and arrows of outrageous fortune; but gods have much more fun!

If we are to be thwarted in our ambitions, let us at least enjoy the striving. Writing code is one way to strive bigger.

~~~~~

From We Are As Gods, the iconic piece in The Whole Earth Catalog that gave us the line, "We are as gods and might as well get good at it" -- misremembered, but improved in the misremembering.


Posted by Eugene Wallingford | Permalink | Categories: General

April 27, 2016 4:35 PM

"I Had No Need Of That Hypothesis"

Joshua Brown closes his blog Simple vs Complex with this delightful little story:

1802: Emperor Napoleon sits in state at the Chateau de Malmaison, ready to receive the mathematical physicist Pierre Laplace and his just-completed Celestial Mechanics. In this book, Laplace has explained the formation of the solar system for the first time and has modeled exactly how the planets and stars work. For all his brutality and battlefield expedience, Napoleon is a sophisticate and an enthusiast of the arts and sciences. He is intellectually curious.

"Tell me, Monsieur Laplace, how did the solar system come about?"

"A chain of natural causes would account for the construction and preservation of the celestial system," Laplace explains.

"But you don't mention God or his intervention even once, as Newton did?"

"I had no need of that hypothesis."
One hundred years earlier, Sir Isaac Newton had created a celestial model of his own. In it, he surmised that the planetary orbits were out of control and not stable, and that a God was needed to explain their course. Laplace went further than Newton, showing "it works without that, too."

Whatever one's position on faith in a supernatural deity, Laplace models precisely the attitude that scientists must bring to their work. Let's explain every phenomenon with the fewest and simplest hypotheses.


Posted by Eugene Wallingford | Permalink | Categories: General

April 25, 2016 1:26 PM

"So Little of the Great to Conceal"

In a recent post, Clive Thompson quotes a short passage from Carlo Rovelli's Seven Brief Lessons on Physics in which Rovelli notes that genius hesitates when it comes upon great ideas. Einstein introduced quantum theory with "It seems to me...", and Darwin demurred even in his own notebooks on natural selection with "I think...". Thompson writes:

It's not a bad litmus test for the people around us in everyday life. The ones who are proposing genuinely startling and creative ideas are liable to be ... careful about it. It's the ones with small ideas who are shouting them from the rooftops.

These thoughts brought to mind a wonderful passage from Okakura Kakuzo's The Book of Tea:

Perhaps we reveal ourselves too much in small things because we have so little of the great to conceal.

Those who encounter a great idea are most willing to let their uncertainty show. Those who express no uncertainty often have no greatness to conceal.

Earlier in the book, Okakura writes another line that I see quoted often:

Those who cannot feel the littleness of great things in themselves are apt to overlook the greatness of little things in others.

This passage takes on a different flavor for me when considered in the light of Rovelli's observation.


Posted by Eugene Wallingford | Permalink | Categories: General

April 22, 2016 12:03 PM

Universities, Cities, and Start-Ups

If I were the city of Des Moines, I'd be thinking about Paul Graham's advice on how to make Pittsburgh a startup hub. Des Moines doesn't have a Carnegie Mellon, but it is reasonably close to two major research universities and has a livable downtown. While Des Moines is not likely to become a major startup hub, it could create the sort of culture needed to sustain a healthy ecosystem for new companies. Such an ecosystem would strengthen its already solid, if unspectacular, IT industry.

Regarding the universities' role in this process, Graham says:

Being that kind of talent magnet is the most important contribution universities can make toward making their city a startup hub. In fact it is practically the only contribution they can make.

But wait, shouldn't universities be setting up programs with words like "innovation" and "entrepreneurship" in their names? No, they should not. These kind of things almost always turn out to be disappointments. They're pursuing the wrong targets. The way to get innovation is not to aim for innovation but to aim for something more specific, like better batteries or better 3D printing. And the way to learn about entrepreneurship is to do it, which you can't in school.

Our university has an entrepreneurship program. I like a lot of what they do for students, but I worry about it becoming about entrepreneurship more than students starting companies. Academics are great at creating programs to talk about stuff, and a lot of what I see our students do is reading about entrepreneurship and studying what other entrepreneurs have done and are doing. I'm reminded of an online Q-n-A with Elon Musk's ex-wife. She said that one thing Elon was not doing was sitting around thinking about what other entrepreneurs were doing.

As in so many things, I am also reminded of an aphorism from Kent Beck: "Do stuff, or talk about stuff, but don't talk about doing stuff." An entrepreneur does things. The best thing a university can do is to help students learn what they need to solve hard problems and then get out of their way.


Posted by Eugene Wallingford | Permalink | Categories: General

April 11, 2016 2:53 PM

A Tax Form is Really a Program

I finally got around to preparing my federal tax return this weekend. As I wrote a decade ago, I'm one of those dinosaurs who still does taxes by hand, using pencil and paper. Most of this work involves gathering data from various sources and entering numbers on a two-page Form 1040. My family's finances are relatively simple, I'm reasonably well organized, and I still enjoy the annual ritual of filling out the forms.

For supporting forms such as Schedules A and B, which enumerate itemized deductions and interest and dividend income, I reach into my books. My current accounting system consists of a small set of Python programs that I've been developing over the last few years. I keep all data in plain text files. These files are amenable to grep and simple Python programs, which I use to create lists and tally numbers to enter into forms. I actually enjoy the process and, unlike some people, enjoy reflecting once each year about how I support "we, the people" in carrying out our business. I also reflect on the Rube Goldberg device that is US federal tax code.
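
To give a flavor of what that looks like, here is a small sketch. The file name and the one-line-per-transaction format are invented for illustration; they are not my actual books.

# Hypothetical ledger, one transaction per line:
#   2015-03-14  dividends  ACME  123.45
total = 0.00
with open('ledger.txt') as ledger:
    for entry in ledger:
        date, category, payee, amount = entry.split()
        if category == 'dividends':
            total += float(amount)
print('dividends: {:10.2f}'.format(total))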

However, every year there is one task that annoys me: computing the actual tax I owe. I don't mind paying the tax, or the amount I owe. But I always forget how annoying the Qualified Dividends and Capital Gain Tax Worksheet is. In case you've never seen it, or your mind has erased its pain from your memory in an act of self-defense, here it is:

Qualified Dividends and Capital Gain Tax Worksheet--Line 44

It may not seem so bad at this moment, but look at that logic. It's a long sequence of "Enter the smaller of line X or line Y" and "Add lines Z and W" instructions, interrupted by an occasional reference to an entry on another form or a case statement to select a constant based on your filing status. By the time I get to this logic puzzle each year, I am starting to tire and just want to be done. So I plow through this mess by hand, and I start making mistakes.

This year I made a mistake in the middle of the form, comparing the wrong numbers when instructed to choose the smaller. I realized my mistake when I got to a line where the error resulted in a number that made no sense. (Fortunately, I was still alert enough to notice that much!) I started to go back and refigure from the line with the error, when suddenly sanity kicked in.

This worksheet is a program written in English, being executed by a tired, error-prone computer: me. I don't have to put up with this; I'm a programmer. So I turned the worksheet into a Python program.

This is what the Qualified Dividends and Capital Gain Tax Worksheet for Line 44 of Form 1040 (Page 44 of the 2015 instruction book) could be, if we weren't still distributing everything as dead PDF:

line   = [None] * 28

line[ 0] = 0.00 # unused
line[ 1] = XXXX # 1040 line 43
line[ 2] = XXXX # 1040 line 9b
line[ 3] = XXXX # 1040 line 13
line[ 4] = line[ 2] + line[ 3]
line[ 5] = XXXX # 4952 line 4g
line[ 6] = line[ 4] - line[ 5]
line[ 7] = line[ 1] - line[ 6]
line[ 8] = XXXX # from worksheet
line[ 9] = min(line[ 1],line[ 8])
line[10] = min(line[ 7],line[ 9])
line[11] = line[9] - line[10]
line[12] = min(line[ 1],line[ 6])
line[13] = line[11]
line[14] = line[12] - line[13]
line[15] = XXXX # from worksheet
line[16] = min(line[ 1],line[15])
line[17] = line[ 7] + line[11]
line[18] = line[16] - line[17]
line[19] = min(line[14],line[18])
line[20] = 0.15 * line[19]
line[21] = line[11] + line[19]
line[22] = line[12] - line[21]
line[23] = 0.20 * line[22]
line[24] = XXXX # from tax table
line[25] = line[20] + line[23] + line[24]
line[26] = XXXX # from tax table
line[27] = min(line[25],line[26])

i = 0
for l in line:
    print('{:>2} {:10.2f}'.format(i, l))
    i += 1

This is a quick-and-dirty first cut, just good enough for what I needed this weekend. It requires some user input, as I have to manually enter values from other forms, from the case statements, and from the tax table. Several of these steps could be automated, with only a bit more effort or a couple of input statements. It's also not technically correct, because my smaller-of tests don't guard for a minimum of 0. Maybe I'll add those checks soon, or next year if I need them.
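
If I do add them, the change is small. Something like this sketch would do; the helper name is mine and is not in the script above:

# The worksheet's "if zero or less, enter -0-" steps could be wrapped
# in a tiny helper so no entry dips below zero.
def at_least_zero(amount):
    return max(0.00, amount)

# for example:
#   line[ 7] = at_least_zero(line[ 1] - line[ 6])
#   line[18] = at_least_zero(line[16] - line[17])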

Wouldn't it be nice, though, if our tax code were written as computer code, or if we could at least download worksheets and various forms as simple programs? I know I can buy commercial software to do this, but I shouldn't have to. There is a bigger idea at play here, and a principle. Computers enable so much more than sharing PDF documents and images. They can change how we write many ideas down, and how we think. Most days, we barely scratch the surface of what is possible.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

April 07, 2016 3:47 PM

Notes from Today's Reading

Getting Older

In Fun With Aging, "Dean Dad" Matt Reed pontificates on reaching a Certain Age.

When Mom was the age I am now, I was in grad school. That can't possibly be right, but that's what the math says.

When my mom was the age I am now, I was already in my second year as an assistant professor, a husband, and father to a two-year-old daughter. Wow.

Getting old: what a strange thing to happen to a little boy.

That said, I am one up on Reed: I know one of Justin Bieber's recent songs and quite like it.

An Interesting Juxtaposition

Earlier this week, I read The Real Reason Middle America Should Be Angry, about St. Louis's fall from national prominence. This morning, I read The Refragmentation, Paul Graham's essay on the dissolution of the 20th century's corporate and cultural order abetted, perhaps accelerated, by computation.

Both tell a story of the rise and fall of corporations across the 20th century. Their conclusions diverge widely, though, especially on the value of government policies that affect scale. I suspect there are elements of truth in both arguments. In any case, they make interesting bookends to the week.

A Network of Links

Finally, as I tweeted yesterday, a colleague told me that he was going to search my blog. He had managed to forget where his own blog lives, and he remembered that I linked to it once.

At first, I chuckled at this situation as a comment on his forgetfulness, and ruefully as a comment on the passing of the age of the blog. But later I realized that this is as much a comment on the wonderfulness of blogging culture, in which links are life and, as long as the network is alive, conversation can be revived.

I hope he blogs again.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

April 05, 2016 4:06 PM

Umberto Eco and the Ineffable Power of Books

In What Unread Books Can Teach Us Oliver Burkeman relates this story about novelist and scholar Umberto Eco:

While researching his own PhD, Eco recalls, he got deeply stuck, and one day happened to buy a book by an obscure 19th-century abbot, mainly because he liked the binding. Idly paging through it, he found, in a throwaway line, a stunning idea that led him to a breakthrough. Who'd have predicted it? Except that, years later, when a friend asked to see the passage in question, he climbed a ladder to a high bookshelf, located the book... and the line wasn't there. Stimulated by the abbot's words, it seems, he'd come up with it himself. You never know where good ideas will come from, even when they come from you.

A person can learn something from a book he or she has read, even if the book doesn't contain what the person learned. This is a much steadier path to knowledge than resting in the comfort that all information is available at the touch of a search engine.

A person's anti-library helps to make manifest what one does not yet know. As Eco reminds us, humility is an essential ingredient in this prescription.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

March 30, 2016 3:21 PM

Quick Hits at the University

This morning I read three pieces with some connection to universities and learning. Each had one passage that made me smart off silently as I pedaled.

From The Humanities: What's The Big Idea?:

Boyarin describes his own research as not merely interdisciplinary but "deeply post-disciplinary." (He jokes that when he first came to Berkeley, his dream was to be 5 percent in 20 departments.)

Good luck getting tenure that way, dude.

"Deeply post-disciplinary" is a great bit of new academic jargon. Universities are very much organized by discipline. Figuring out how to support scholars who work outside the lines is a perpetual challenge, one that we really should address at scale if we want to enable universities to evolve.

From this article on Bernie Sanders's free college plan:

Big-picture principles are important, but implementation is important, too.

Hey, maybe he just needs a programmer.

Implementing big abstractions is hard enough when the substance is technical. When you throw in social systems and politics, implementing any idea that deviates very far from standard practice becomes almost impossible. Big Ball of Mud, indeed.

From Yours, Isaac Asimov: A Life in Letters:

Being taught is the intellectual analog of being loved.

I'll remind my students of this tomorrow when I give them Exam 3, on syntactic abstraction. "I just called to say 'I love you'."

Asimov is right. When I think back on all my years in school, I feel great affection for so many of my teachers, and I recall feeling their affection for me. Knowledge is not only power, says Asimov; it is happiness. When people help me learn, they offer me new ways to be happy.

(The Foundation Trilogy makes me happy, too.)


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

March 16, 2016 2:33 PM

Doing the Obvious Things Well

The San Antonio Spurs perennially challenge for the championship of the National Basketball Association. Like most great NBA teams, they have several excellent players. However, the foundation of their success isn't the high-flying sort of offense that people often associate with professional basketball, but rather a meticulous defense, the sort of defense people usually associate with defensive specialists and hard-working journeymen.

What's the secret? This article tells us there is no magic:

It's easy to think some form of incomprehensible genius is responsible for the subtle components of an elite defense, but in all reality, doing the obvious thing and doing it well (which is the hard part) is often all it takes.

This is true of so many things in life. Figure out what you need to do to excel, and then practice it -- both in preparation for the act and in the act itself.

It's not very exciting, but grunt work and attention to detail are usually the primary components of excellence. A little luck helps, of course; the Spurs were able to draft all-time great Tim Duncan as David Robinson's career was winding down. But even that luck is tinged with an unexciting message... What makes Duncan great isn't flashy style and supernatural skills. It's mostly doing the obvious things and doing them well.


Posted by Eugene Wallingford | Permalink | Categories: General

February 12, 2016 3:34 PM

Computing Everywhere: Detecting Gravitational Waves

a linearly-polarized gravitational wave
Wikimedia Commons (CC BY-SA 3.0 US)

This week the world is excitedly digesting news that the interferometer at LIGO has detected gravitational waves being emitted by the merger of two black holes. Gravitational waves were predicted by Einstein one hundred years ago in his theory of General Relativity. Over the course of the last century, physicists have amassed plenty of indirect evidence that such waves exist, but this is the first time they have detected them directly.

The physics world is understandably quite excited by this discovery. We all should be! This is another amazing moment in science: Build a model. Make a falsifiable prediction. Wait for 100 years to have the prediction confirmed. Wow.

We in computer science can be excited, too, for the role that computation played in the discovery. As physicist Sabine Hossenfelder writes in her explanation of the gravitational wave story:

Interestingly, even though it was long known that black hole mergers would emit gravitational waves, it wasn't until computing power had increased sufficiently that precise predictions became possible. ... General Relativity, though often praised for its beauty, does leave you with one nasty set of equations that in most cases cannot be solved analytically and computer simulations become necessary.

As with so many cool advances in the world these days, whether in the sciences or the social sciences, computational modeling and simulation were instrumental in helping to confirm the existence of Einstein's gravitational waves.

So, fellow computer scientists, celebrate a little. Then, help a young person you know to see why they might want to study CS, alone or in combination with some other discipline. Computing is one of the fundamental tools we need these days in order to contribute to the great tableau of human knowledge. Even Einstein can use a little computational help now and then.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

January 17, 2016 10:07 AM

The Reluctant Mr. Darwin

Yesterday, I finished reading The Reluctant Mr. Darwin, a short biography of Charles Darwin by David Quammen published in 2006. It covers Darwin's life from the time he returns from his voyage on the HMS Beagle to his death in 1882, with a short digression to discuss Alfred Russel Wallace's early voyages and independent development of ideas on evolution and its mechanisms.

Before reading this book, I knew the basics of Darwin's theories but nothing about his life and very little about the milieu in which he worked and developed his theories. After reading, I have a better appreciation for the caution with which Darwin seemed to have worked, and the care he took to record detailed observations and to support his ideas with evidence from both nature and breeding. I also have a sense of how Wallace's work related to and affected Darwin's work. I could almost feel Darwin's apprehension upon receiving Wallace's letter from southeast Asia, outlining ideas Darwin had been developing, refining, and postponing for twenty years.

The Reluctant Mr. Darwin is a literary essay, not scholarly history. I enjoyed reading it. The book is at its best when talking about Darwin's life and work as a scientist, his attitudes and his work habits. The writing is clear, direct, and entertaining. When talking about Darwin's theories themselves, however, and especially about their effect in the world and culturally, the book comes across as too earnest and a bit too breathless for my taste. But this is a minor quibble. It's a worthwhile read.


Posted by Eugene Wallingford | Permalink | Categories: General

January 11, 2016 10:51 AM

Some Writing by Administrators Isn't Bad; It's Just Different

Jim Garland is a physicist who eventually became president of Miami University of Ohio. In Bad Writing by Administrators, Rachel Toor asked Garland how his writing evolved as he moved up the administrative hierarchy. His response included:

Truthfully, I did my deepest thinking as a beginning assistant professor, writing obscure papers on the quantum-mechanical properties of solids at liquid-helium temperatures. Over the years, I became shallower and broader, and by the time I left academe, I was worrying about the seating arrangement of donors in the president's football box.

I have experienced this even in my short step into the department head's office. Some of the writing I do as head is better than my writing before: clear, succinct, and qualified precisely. It is written for a different audience, though, and within a much different political context. My colleagues compliment me occasionally for having written a simple, straightforward note that says something they've been struggling to articulate.

Other times, my thinking is more muddled, and that shows through in what I write. When I try to fix the writing before I fix my thinking, I produce bad writing.

Some writing by administrators really is bad, but a lot of it is simply broader and shallower than what we write as academics. The broader it becomes, the less interesting the content is to the academics still living inside of us. Yet our target audience often can appreciate the value of that less interesting writing when it serves its purpose.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

December 26, 2015 2:12 PM

Moments of Alarm, Written Word Edition

In The Art of Fiction No. 156, George Plimpton asked William Styron, "Are you worried about the future of the written word?" Styron said, "Not really." But he did share the sort of moment that causes him alarm:

Not long ago I received in the mail a doctoral thesis entitled "Sophie's Choice: A Jungian Perspective", which I sat down to read. It was quite a long document. In the first paragraph it said, In this thesis my point of reference throughout will be the Alan J. Pakula movie of Sophie's Choice. There was a footnote, which I swear to you said, Where the movie is obscure I will refer to William Styron's novel for clarification.

Good thing there was an original source to consult.


Posted by Eugene Wallingford | Permalink | Categories: General

December 12, 2015 3:04 PM

Agreement: The Rare and Beautiful Exception

From How to Disagree:

Once disagreement starts to be seen as utterly normal, and agreement the rare and beautiful exception, we can stop being so surprised and therefore so passionately annoyed when we meet with someone who doesn't see eye-to-eye with us.

Sometimes, this attitude comes naturally to me. Other times, though, I have to work hard to make it my default stance. Things usually go better for me when I succeed.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

November 25, 2015 10:50 AM

It Started with a Tweet

Bret Victor's much-heralded What Can a Technologist Do About Climate Change? begins:

This started with a tweet. I'm embarrassed how often that happens.

Why be embarrassed? I am occasionally embarrassed when I tweet snarky and mildly regrettable things, but only because they are snarky and regrettable. However, having a thought, writing it down, and thinking some more is a perfectly honorable way to start writing an essay. Writing something down on Twitter has the advantage of sharing the idea with one's followers, which creates the possibility of getting feedback on the idea from smart, thoughtful people.

Sharing idle thoughts with the world can add value to them. They aren't always so idle.


Posted by Eugene Wallingford | Permalink | Categories: General

November 19, 2015 2:45 PM

Hope for the Mature Researcher

In A Primer on Graph Isomorphism, Lance Fortnow puts László Babai's new algorithm for the graph isomorphism problem into context. To close, he writes:

Also we think of theory as a young person's game, most of the big breakthroughs coming from researchers early in their careers. Babai is 65, having just won the Knuth Prize for his lifetime work on interactive proofs, group algorithms and communication complexity. Babai uses his extensive knowledge of combinatorics and group theory to get his algorithm. No young researcher could have had the knowledge base or maturity to be able to put the pieces together the way that Babai did.

We often hear that research, especially research aimed at solving our deepest problems, is a young person's game. Great work takes a lot of stamina. It often requires a single-minded focus that comes naturally to a young person but which is a luxury unavailable to someone with a wider set of obligations beyond work. Babai's recent breakthrough reminds us that other forces are at play, that age and broad experience can be advantages, too.

This passage serves as a nice counterweight to Garrison Keillor's "The slow rate of learning..." line, quoted in my previous post. Sometimes, slow and steady are what it takes to get a big job done.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 17, 2015 3:32 PM

Choice Passages from Recent Reads

Eric Schmidt, in an interview for Blitzscaling:

Every great project has started with a graduate student and an assistant professor looking for tenure.

~~~~~

Sir Peter Swinnerton-Dyer, quoted in an article about John Horton Conway:

I'm not sure that I can describe how charisma happens. It just is or isn't. And with most mathematicians it markedly isn't.

~~~~~

Alan Jacobs, in his 79 Theses on Technology For Disputation, as recorded by Chad Wellmon:

Everyone should sometimes write by hand, to recall what it's like to have second thoughts before the first ones are completely recorded.

~~~~~

Garrison Keillor, in his novel "Love Me":

We puritans overdramatize these things. We want there to be lions stalking us, whereas it's only some old coyote. Not utter degradation; just poor choices.

Truth, resignation, and freedom, all in one brief passage. I also like this one:

The slow rate of learning is discouraging to the older man, but thank God for illumination at whatever hour.

Yes, indeed.


Posted by Eugene Wallingford | Permalink | Categories: General

November 08, 2015 9:37 AM

Enthusiastic Recommendation Is Not A Vice

Novelist Henry Miller lamented one of his greatest vices, recommending books and authors too enthusiastically, but ultimately decided that he would not apologize for it:

However, this vice of mine, as I see it, is a harmless one compared with those of political fanatics, military humbugs, vice crusaders, and other detestable types. In broadcasting to the world my admiration and affection, my gratitude and reverence, ... I fail to see that I am doing any serious harm. I may be guilty of indiscretion, I may be regarded as a naïve dolt, I may be criticized justly or unjustly for my taste, or lack of it; I may be guilty, in the high sense, of "tampering" with the destiny of others; I may be writing myself down as one more "propagandist", but -- how am I injuring anyone? I am no longer a young man. I am, to be exact, fifty-eight years of age. (Je me nomme Louis Salavin.) Instead of growing more dispassionate about books, I find the contrary is taking place.

I'm a few years younger than Messrs. Miller and Salavin, but I share this vice of Miller's, as well as his conclusion. When you reach a certain age, you realize that admiration, affection, gratitude, and reverence, especially for a favorite book or author, are all to be cherished. You want to share them with everyone you meet.

Even so, I try to rein in my vice in the same way Miller himself knew he ought in his soberer moments, by having a lighter touch when I recommend. Broadcasting one's admiration and affection too enthusiastically often has the opposite effect to the one intended. The recipients either take the recommendation on its face and read with such high expectations that they will surely be disappointed, or they instinctively (if subconsciously) react with such skepticism that they read with an eye toward deflating the recommendation.

I will say that I have been enjoying The Books In My Life, from which the above passage comes. I've never read any of Miller's novels, only a Paris Review interview with him. This book about the books that shaped him has been a pleasant introduction to Miller's erudite and deeply personal style. Alas, the occasional doses of French are lost on me without the help of Google Translate.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

November 03, 2015 4:10 PM

Academic Computer Science is Not University IT

The Department of Biology operates a greenhouse so that its faculty can cultivate and study a wide range of plants. The greenhouse offers students a chance to see, smell, and touch real, living plants that they may never have encountered before. With the greenhouse as an ecosystem, students get to learn about the relationships among species and a little about how species evolve. The ecological setting in which the plants are grown provides the context needed for faculty to demonstrate realistically how organisms are connected within environments.

Faced with budget cuts, the university has decided that it is no longer cost-effective to have biology staff operate the greenhouse. We already have a grounds and landscaping unit as part of the physical plant, and its staff has expertise for working with a variety of plants as a part of managing lawns and gardens. To save money, the administration is centralizing all plant management services in grounds and landscaping. If the folks in Biology need anything done in or for the greenhouse, they call a designated contact person. They will have to streamline the offerings in the greenhouse, based on university-wide decisions about what kind of plants we can afford to support.

~~~~

The Department of Art has a number of faculty who specialize in drawing, both pencil and ink, and in painting. All students who major in art take two courses in drawing as part of the foundations sequence, and many studio art majors take painting. Both media help students learn to see and teach them about how their materials interact with their vision and affect the shape of their creative works.

Faced with budget cuts, the university has decided that it is no longer cost-effective to have the art faculty select and buy their own pencils, ink, and paints. We already have a couple of units on campus who purchase and use these materials. Operation and Maintenance does a wide variety of carpentry projects that include painting. All campus staff use pencils and ink pens, so Business Operations has purchasing agreements with several office supplies wholesalers. These agreements ensure that university staff can stock a range of pencils, pens, and paints at the best possible price.

When one of the drawing faculty calls over for a particular set of soft graphite pencils, ranging in hardness from 9B to H, she is told that the university has standardized on a set with a smaller range. Standardization allows us to buy in bulk and to save management overhead. "At least they aren't all No. 2 pencils," thinks the art prof.

When one of the painting faculty calls over to Facilities for a particular set of acrylic paints, the warehouse manager says, "Sure, just let me know what colors you need and we'll buy them. We have a great contract with Sherwin Williams." The prof isn't sure where he'll put all the one-gallon cans, though.

~~~~

... just kidding. No university would ever do that, right? Biologists run their own greenhouses and labs, and art faculty select, buy, and manage specialty materials in their studios. Yet academic Computer Science departments often work under nearly identical circumstances, because computers are part of the university's IT infrastructure.

Every few years at academic institutions, the budget and management pendulum swings toward centralization of IT services, as a way to achieve economies of scale and save money. Then, a few years later, it swings back toward decentralization, as a way to provide better and finer-grained services to individual departments. Too often, the services provided to CS faculty and students are forced to go along for the ride.

My university is going through one of its periodic recentralizations, at the orders of the Board of Regents. Every time we centralize, we have to have the same conversations about how Computer Science fits into the picture, because most non-CS people ultimately see our use of computers and software as fundamentally the same as, say, the English department's or the Psychology department's. However interesting those departments' use of technology is (and in this day, most faculty and students use technology in interesting ways, regardless of discipline), it is not the same thing as what Computer Science does.

Academic computing has never been limited to computer scientists, of course. Many mathematicians and physicists rely on a very different sort of computing than the folks who use it only for library-style research, writing, and presentation. So do faculty in a few other disciplines. Just as Biology and Art need specialized laboratories and materials, so do those departments that are working at the edge of computing require specialized laboratories and materials. Computer Science is simply the discipline that is farthest out along this curve.

The same thing goes for support staff as for equipment. Few administrators would think of "centralizing" the lab technician and supplies manager for Biology or Chemistry into a non-academic unit on campus, or ask academic departments to depend on a non-academic unit to provide discipline-specific services that are critical to the departments' mission. Lab technicians and equipment managers need to be hired by the departments (or the college) that need them and serve the departments directly. So, too, do certain departments need to have system administrators and lab managers who work for them to meet the specialized needs of academic computing, serving the department or college directly.

Hardware and software are a computer scientist's greenhouse and artistic media. They are our library and our telescopes, our tallgrass prairie preserves and our mass spectrometers. It is essential that university administrations think of -- and provide for -- Computer Science and other computation-laden departments as academic disciplines first, and not just as consumers of generic IT services. Doing so requires, among other things, leaving control of essential hardware, software, and policies for their use within the academic departments.

~~~~~

Disclaimer. The vignettes above were written by me. I am very much neither a biologist nor a studio artist. If any of the details clash with reality, please see them as creative liberties taken by the author to serve a theme.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

October 30, 2015 4:35 PM

Taking Courses Broad and Wide

Nearly nine years ago, digital strategist Russell Davies visited the University of Oregon to work with students and faculty in the advertising program and wrote a blog entry about his stint there. Among his reflections on what the students should be doing and learning, he wrote:

We're heading for a multi-disciplinary world and that butts right up against a university business model. If I were preparing myself for my job right now I'd do classes in film editing, poetry, statistics, anthropology, business administration, copyright law, psychology, drama, the history of art, design, coffee appreciation, and a thousand other things. Colleges don't want you doing that, that destroys all their efficiencies, but it's what they're going to have to work out.

I give similar advice to prospective students of computer science: If they intend to take their CS degrees out into the world and make things for people, they will want to know a little bit about many different things. To maximize the possibilities of their careers, they need a strong foundation in CS and an understanding of all the things that shape how software and software-enhanced gadgets are imagined, made, marketed, sold, and used.

Just this morning, a parent of a visiting high school student said, after hearing about all the computer science that students learn in our programs, "So, our son should probably drop his plans to minor in Spanish?" They got a lot more than a "no" out of me. I talked about the opportunities to engage with the growing population of Spanish-speaking Americans, even here in Iowa; the opportunities available to work for companies with international divisions; and how learning a foreign language can help students study and learn programming languages differently. I was even able to throw in a bit about grammars and the role they play in my compiler course this semester.

I think the student will continue with his dream to study Spanish.

I don't think that the omnivorous course of study that Davies outlines is at odds with the "efficiencies" of a university at all. It fits pretty well with a liberal arts education, which even our B.S. students have time for. But it does call for some thinking ahead, planning to take courses from across campus that aren't already on some department's list of requirements. A good advisor can help with that.

I'm guessing that computer science students and "creatives" are not the only ones who will benefit from seeking a multi-disciplinary education these days. Davies is right. All university graduates will live in a multi-disciplinary world. It's okay for them (and their parents) to be thinking about careers when they are in school. But they should prepare for a world in which general knowledge and competencies buoy up their disciplinary knowledge and help them adapt over time.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

October 22, 2015 4:22 PM

Aramaic, the Intermediate Language of the Ancient World

My compiler course is making the transition from the front end to the back end. Our attention is on static analysis of abstract syntax trees and will soon turn to other intermediate representations.

In the compiler world, an "intermediate representation" or intermediate language is a notation used as a stepping stone between the abstract syntax tree and the machine language that is ultimately produced. Such a stepping stone allows the compiler to take smaller steps in the translation process and makes it easier to improve the code before getting down into the details of machine language.
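
To make that concrete, here is a toy sketch of my own, not code from the course: it lowers a small expression tree, written as nested tuples, into a list of three-address instructions, one operation per line. Each instruction is simple enough to analyze and improve before we worry about real machine instructions.

import itertools

temps = itertools.count(1)

def lower(ast, code):
    """Return the name holding ast's value, appending IR instructions to code."""
    if isinstance(ast, tuple):                 # an operation: (op, left, right)
        op, left, right = ast
        l = lower(left, code)
        r = lower(right, code)
        temp = 't{}'.format(next(temps))
        code.append('{} := {} {} {}'.format(temp, l, op, r))
        return temp
    return str(ast)                            # a variable name or a constant

code = []
code.append('a := {}'.format(lower(('+', 'b', ('*', 'c', 2)), code)))
print('\n'.join(code))     # t1 := c * 2
                           # t2 := b + t1
                           # a := t2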

We sometimes see intermediate languages in the "real world", too. They tend to arise as a result of cultural and geopolitical forces and, while they usually serve different purposes in human affairs than in compiler affairs, they still tend to be practical stepping stones to another language.

Consider the case of Darius I, whose Persian armies conquered most of the Middle East around 500 BC. As John McWhorter writes in The Atlantic, at the time of Darius's conquest,

... Aramaic was so well-entrenched that it seemed natural to maintain it as the new empire's official language, instead of using Persian. For King Darius, Persian was for coins and magnificent rock-face inscriptions. Day-to-day administration was in Aramaic, which he likely didn't even know himself. He would dictate a letter in Persian and a scribe would translate it into Aramaic. Then, upon delivery, another scribe would translate the letter from Aramaic into the local language. This was standard practice for correspondence in all the languages of the empire.

For sixty years, many compiler writers have dreamed of a universal intermediate language that would ease the creation of compilers for new languages and new machines, to no avail. But for several hundred years, Aramaic was the intermediate representation of choice for a big part of the Western world! Alas, Greek and Arabic later came along to supplant Aramaic, which now seems to be on a path to extinction.

This all sounds a lot like the world of programming, in which languages come and go as we develop new technologies. Sometimes a language, human or computer, takes root for a while as the result of historical or technical forces. Then a new regime or a new culture rises, or an existing culture gains in influence, and a different language comes to dominate.

McWhorter suggests that English may have risen to prominence at just the right moment in history to entrench itself as the world's intermediate language for a good long run. We'll see. Human languages and computer languages may operate on different timescales, but history treats them much the same.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

October 15, 2015 8:18 AM

Perfection Is Not A Pre-Requisite To Accomplishing Something Impressive

In Not Your Typical Role Model, mathematician Hannah Fry tells us some of what she learned about Ada Lovelace, "the 19th century programmer", while making a film about her. Not all of it was complimentary. She concludes:

Ada was very, very far from perfect, but perfection is not a pre-requisite to accomplishing something impressive. Our science role models shouldn't always be there to celebrate the unachievable.

A lot of accomplished men of science were far from perfect role models, too. In the past, we've often been guilty of covering up bad behavior to protect our heroes. These days, we sometimes rush to judge them. Neither inclination is healthy.

By historical standards, it sounds like Lovelace's imperfections were all too ordinary. She was human, like us all. Lovelace thought some amazing things and wrote them down for us. Let's celebrate that.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

September 22, 2015 2:57 PM

"Good Character" as an Instance of Postel's Law

Mike Feathers draws an analogy I'd never thought of before in The Universality of Postel's Law: what we think of as "good character" can be thought of as an application of Postel's Law to ordinary human relations.

Societies often have the notion of 'good character'. We can attempt all sorts of definitions but at its core, isn't good character just having tolerance for the foibles of others and being a person people can count on? Accepting wider variation at input and producing less variation at output? In systems terms that puts more work on the people who have that quality -- they have to have enough control to avoid 'going off' on people when others 'go off on them', but they get the benefit of being someone people want to connect with. I argue that those same dynamics occur in physical systems and software systems that have the Postel property.

These days, most people talk about Postel's Law as a social law, and criticisms of it even in software design refer to it as creating moral hazards for designers. But Postel coined this "principle of robustness" as a way to talk about implementing TCP, and most references I see to it now relate to HTML and web browsers. I think it's pretty cool when a software design principle applies more broadly in the design world, or can even be useful for understanding human behavior far removed from computing. That's the sign of a valuable pattern -- or anti-pattern.
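
The software version of the principle is easy to illustrate. Here is a toy sketch of my own, not anything from Postel's specs: a pair of functions that are liberal in the date formats they accept but conservative, emitting only ISO 8601, in what they send.

from datetime import datetime

def read_date(text):
    """Be liberal in what you accept: tolerate several common formats."""
    for fmt in ('%Y-%m-%d', '%m/%d/%Y', '%B %d, %Y'):
        try:
            return datetime.strptime(text.strip(), fmt)
        except ValueError:
            pass
    raise ValueError('unrecognized date: ' + text)

def write_date(d):
    """Be conservative in what you send: always emit ISO 8601."""
    return d.strftime('%Y-%m-%d')

print(write_date(read_date(' July 9, 2004 ')))   # 2004-07-09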


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Patterns, Software Development

September 19, 2015 11:56 AM

Software Gets Easier to Consume Faster Than It Gets Easier to Make

In What Is the Business of Literature?, Richard Nash tells a story about how the ideas underlying writing, books, and publishing have evolved over the centuries, shaped by the desires of both creators and merchants. One of the key points is that technological innovation has generally had a far greater effect on the ability to consume literature than on the ability to create it.

But books are just one example of this phenomenon. It is, in fact, a pattern:

For the most part, however, the technical and business-model innovations in literature were one-sided, far better at supplying the means to read a book than to write one. ...

... This was by no means unique to books. The world has also become better at allowing people to buy a desk than to make a desk. In fact, from medieval to modern times, it has become easier to buy food than to make it; to buy clothes than to make them; to obtain legal advice than to know the law; to receive medical care than to actually stitch a wound.

One of the neat things about the last twenty years has been the relatively rapid increase in the ability for ordinary people to write and disseminate creative works. But an imbalance remains.

Over a shorter time scale, this one-sidedness has been true of software as well. The fifty or sixty years of the Software Era have given us seismic changes in the availability, ubiquity, and backgrounding of software. People often overuse the word 'revolution', but these changes really have had an immense effect in how and when almost everyone uses software in their lives.

Yet creating software remains relatively difficult. The evolution of our tools for writing programs hasn't kept pace with the evolution in platforms for using them. Neither has the growth in our knowledge of how to make great software.

There is, of course, a movement these days to teach more people how to program and to support other people who want to learn on their own. I think it's wonderful to open doors so that more people have the opportunity to make things. I'm curious to see if the current momentum bears fruit or is merely a fad in a world that goes through fashions faster than we can comprehend them. It's easier still to toss out a fashion that turns out to require a fair bit of work.

Writing software is still a challenge. Our technologies have not changed that fact. But this is also true, as Nash reminds us, of writing books, making furniture, and a host of other creative activities. He also reminds us that there is hope:

What we see again and again in our society is that people do not need to be encouraged to create, only that businesses want methods by which they can minimize the risk of investing in the creation.

The urge to make things is there. Give people the resources they need -- tools, knowledge, and, most of all, time -- and they will create. Maybe one of the new programmers can help us make better tools for making software, or lead us to new knowledge.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Patterns, Software Development

September 11, 2015 3:55 PM

Search, Abstractions, and Big Epistemological Questions

Andy Soltis is an American grandmaster who writes a monthly column for Chess Life called "Chess to Enjoy". He has also written several good books, both recreational and educational. In his August 2015 column, Soltis talks about a couple of odd ways in which computers interact with humans in the chess world, ways that raise bigger questions about teaching and the nature of knowledge.

As most people know, computer programs -- even commodity programs one can buy at the store -- now play chess better than the best human players. Less than twenty years ago, Deep Blue first defeated world champion Garry Kasparov in a single game. A year later, Deep Blue defeated Kasparov in a closely contested six-game match. By 2005, computers were crushing Top Ten players with regularity. These days, world champion Magnus Carlsen is no match for his chess computer.

a position in which humans see the win, but computers don't

Yet there are still moments where humans shine through. Soltis opens with a story in which two GMs were playing a game the computers thought Black was winning, when suddenly Black resigned. Surprised journalists asked the winner, GM Vassily Ivanchuk, what had happened. It was easy, he said: it only looked like Black was winning. Well beyond the computers' search limits, it was White that had a textbook win.

How could the human players see this? Were they searching deeper than the computers? No. They understood the position at a higher level, using abstractions such as "being in the square" and passed pawns splitting a King like "pants". (We chessplayers are an odd lot.)

When you can define 'flexibility' in 12 bits,
it will go into the program.

Attempts to program computers to play chess using such abstract ideas did not work all that well. Concepts like king safety and piece activity proved difficult to implement in code, but eventually found their way into the programs. More abstract concepts like "flexibility", "initiative", and "harmony" have proven all but impossible to implement. Chess programs got better -- quickly -- when two things happened: (1) programmers began to focus on search, implementing metrics that could be applied rapidly to millions of positions, and (2) computer chips got much, much faster.
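
The machine's side of that story fits in a few lines of code. This is a toy sketch of my own, not any real engine: a negamax search over an explicit game tree whose leaves carry nothing but a cheap, material-style score. Notice that "flexibility" and "harmony" appear nowhere.

def negamax(node):
    """Score a position for the player to move. A leaf is a static evaluation
       (say, a material count) from that player's point of view; an interior
       node is a list of positions reachable in one move."""
    if isinstance(node, (int, float)):
        return node
    return max(-negamax(child) for child in node)   # best move = best negated reply

# Two of our candidate moves, each answered by two opponent replies.
# Leaf scores are from our point of view, since it is our turn again there.
tree = [[-1, 2],    # move A: the opponent will steer toward the -1 leaf
        [ 0, 1]]    # move B: the opponent will steer toward the 0 leaf
print(negamax(tree))    # 0 -- with best play, move B holds the balance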

Pawn Structure Chess, by Andy Soltis

The result is that chess programs can beat us by seeing farther down the tree of possibilities than we do. They make moves that surprise us, puzzle us, and even offend our sense of beauty: "Fischer or Tal would have played this move; it is much more elegant." But they win, easily -- except when they don't. Then we explain why, using ideas that express an understanding of the game that even the best chessplaying computers don't seem to have.

This points out one of the odd ways computers relate to us in the world of chess. Chess computers crush us all, including grandmasters, using moves we wouldn't make and many of us do not understand. But good chessplayers do understand why moves are good or bad, once they figure it out. As Soltis says:

And we can put the explanation in words. This is why chess teaching is changing in the computer age. A good coach has to be a good translator. His students can get their machine to tell them the best move in any position, but they need words to make sense of it.

Teaching computer science at the university is affected by a similar phenomenon. My students can find on the web code samples to solve any problem they have, but they don't always understand them. This problem existed in the age of the book, too, but the web makes available so much material, often undifferentiated and unexplained, so, so quickly.

The inverse of computers making good moves we don't understand brings with it another oddity, one that plays to a different side of our egos. When a chess computer loses -- gasp! -- or fails to understand why a human-selected move is better than the moves it recommends, we explain it using words that make sense of the human move. These are, of course, the same words and concepts that fail us most of the time when we are looking for a move to beat the infernal machine. Confirmation bias lives on.

Soltis doesn't stop here, though. He realizes that this strange split raises a deeper question:

Maybe it's one that only philosophers care about, but I'll ask it anyway:

Are concepts like "flexibility" real? Or are they just artificial constructs, created by and suitable only for feeble, carbon-based minds?

(Philosophers are not the only ones who care. I do. But then, the epistemology course I took in grad school remains one of my two favorite courses ever. The second was cognitive psychology.)

Aristotle

We can implement some of our ideas about chess in programs, and those ideas have helped us create machines we can no longer defeat over the board. But maybe some of our concepts are simply fictions, "just so" stories we tell ourselves when we feel the need to understand something we really don't. I don't think so, but the pragmatist in me keeps pushing for better evidence.

Back when I did research in artificial intelligence, I always chafed at the idea of neural networks. They seemed to be a fine model of how our brains worked at the lowest level, but the results they gave did not satisfy me. I couldn't ask them "why?" and receive an answer at the conceptual level at which we humans seem to live. I could not have a conversation with them in words that helped me understand their solutions, or their failures.

Now we live in a world of "deep learning", in which Google Translate can do a dandy job of translating a foreign phrase for me but never tell me why it is right, or explain the subtleties of choosing one word instead of another. Add more data, and it translates even better. But I still want the sort of explanation that Ivanchuk gave about his win or the sort of story Soltis can tell about why a computer program only drew a game because it saddled itself with inflexible pawn structure.

Perhaps we have reached the limits of my rationality. More likely, though, is that we will keep pushing forward, bringing more human concepts and abstractions within the bounds of what programs can represent, do, and say. Researchers like Douglas Hofstadter continue the search, and I'm glad. There are still plenty of important questions to ask about the nature of knowledge, and computer science is right in the middle of asking and answering them.

~~~~

IMAGE 1. The critical position in Ivanchuk-Jobava, Wijk aan Zee 2015, the game to which Soltis refers in his story. Source: Chess Life, August 2015, Page 17.

IMAGE 2. The cover of Andy Soltis's classic Pawn Structure Chess. Source: the book's page at Amazon.com.

IMAGE 3. A bust of Aristotle, who confronted Plato's ideas about the nature of ideals. Source: Classical Wisdom Weekly.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

August 31, 2015 4:13 PM

Catch-22: Faculty and University Administration

I agree with Timothy Burke that the evolution of university administration is shaped in part by the unintended consequences of faculty behavior:

I think some of my colleagues across the country are potentially contributing to the creation of the distanced, professionalized, managerial administrations that they say that they despise, and they're doing it in part through half-voiced expectations about what an ideal administrator might be like.

This passage comes from Performing the Role, in which Burke discusses some of the fall-out from a botched faculty hiring at the University of Illinois last year. Even if you don't know much about the Salaita case, you may find Burke's piece worth reading. It captures pretty well how universities seem to be shifting toward a professionalized administrative class and the ways in which this shift clashes -- and meshes -- with faculty expectations and behavior.

This line, in particular, sums up a surprising amount of my experience as a department head for the last decade:

I think we somehow expect that administrative leaders should be unfailingly polite, deferential, patient, and solicitous when we're the ones talking with them and bold, confrontational, and aggressive when they're talking to anyone else.

The next one has affected me less directly, but I see it in the expectations across campus all the time:

We seem to expect administrative leaders to escape structural traps that we cannot imagine a way to escape from.

Burke ends the paragraph containing those sentences with a summary that many administrators can appreciate: "There's a lot of Catch-22 going on here."

Burke is always thoughtful, and thought-provoking, on matters of academia and culture. If those topics interest you, his blog is often worth reading.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

August 25, 2015 1:57 PM

The Art of Not Reading

The beginning of a new semester brings with it a crush of new things to read, write, and do, which means it's a good time to remember this advice from Arthur Schopenhauer:

Hence, in regard to our subject, the art of not reading is highly important. This consists in not taking a book into one's hand merely because it is interesting the great public at the time -- such as political or religious pamphlets, novels, poetry, and the like, which make a noise and reach perhaps several editions in their first and last years of existence. Remember rather that the man who writes for fools always finds a large public: and only read for a limited and definite time exclusively the works of great minds, those who surpass other men of all times and countries, and whom the voice of fame points to as such. These alone really educate and instruct.

"The man who writes for fools always finds a large public." You do not have to be part of it. Time is limited. Read something that matters.

The good news for me is that there is a lot of writing about compilers by great minds. This is, of course, also the bad news. Part of my job is to help my students navigate the preponderance of worthwhile readings.

Reading in my role as department head is an altogether different matter...

~~~~

The passage above is from On Books and Reading, which is available via Project Gutenberg, a wonderful source of many great works.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

August 19, 2015 4:07 PM

Working Too Much Means Never Having to Say "No"

Among the reasons David Heinemeier Hansson gives in his advice to Fire the Workaholics is that working too much is a sign of bad judgment:

If all you do is work, your value judgements are unlikely to be sound. Making good calls on "is it worth it?" is absolutely critical to great work. Missing out on life in general to put more hours in at the office screams "misguided values".

I agree, in two ways. First, as DHH says, working too much is itself a general indicator that your judgment is out of whack. Second is the more specific case:

For workaholics, doing more work always looks like a reasonable option. As a result, when you are trying to decide, "Should I make this or not?", you never have to choose not to make the something in question -- even when not making it is the right thing to do. That sort of indifferent decision making can be death in any creative endeavor.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

August 06, 2015 10:22 AM

Not So Different

Trevor Blackwell on The Lessons of Viaweb:

[Scott Kirsner]: What was the biggest challenge you faced with Viaweb?

[Trevor Blackwell]: Focusing every day on the few things that mattered and not getting distracted by the hundreds of things that didn't.

Maybe the life of a department head isn't all that different from the life of an entrepreneur after all. Well, except for the $49 million.


Posted by Eugene Wallingford | Permalink | Categories: General

July 30, 2015 2:45 PM

The Word Came First

James Somers's article You're Probably Using the Wrong Dictionary describes well how a good dictionary can change your life. In comparing a definition from Webster's 1913 Revised Unabridged Dictionary with a definition from the New Oxford Dictionary, which he offers as an exemplar of the pedestrian dictionaries we use today, he reminds us that words are elusive and their definitions only approximations:

Notice, too, how much less certain the Webster definition seems about itself, even though it's more complete -- as if to remind you that the word came first, that the word isn't defined by its definition here, in this humble dictionary, that definitions grasp, tentatively, at words, but that what words really are is this haze and halo of associations and evocations, a little networked cloud of uses and contexts.

Such poetry is not wasted on words; it is not, to use his own example from the essay, fustian. Words deserve this beauty, and a good dictionary.

There is also a more general reminder just beneath the surface here. In so many ways, more knowledge makes us less certain, not more, and more circumspect, not less. It is hard to make sharp distinctions within a complex web of ideas when you know a little about the web.

I strongly second Somers's recommendation of John McPhee's work, which I blogged about indirectly a few years ago. I also strongly second his recommendation of Webster's 1913 Revised Unabridged Dictionary. I learned about it from another blog article years ago and have been using it ever since. It's one of the first things I install whenever I set up a new computer.


Posted by Eugene Wallingford | Permalink | Categories: General

July 26, 2015 10:03 AM

A Couple of Passages on Disintermediation

"Disintermediation" is just a fancy word for getting other people out of the space between the people who create things and the people who read or listen to those things.

1. In What If Authors Were Paid Every Time Someone Turned a Page?, Peter Wayner writes:

One latter-day Medici posted a review of my (short) book on Amazon complaining that even 99 cents was too expensive for what was just a "blog post". I've often wondered if he was writing that comment in a Starbucks, sipping a $6 cup of coffee that took two minutes to prepare.

Even in the flatter world of ebooks, Amazon has the power to shape the interactions of creators and consumers and to influence strongly who makes money and what kind of books we read.

2. Late last year, Steve Albini spoke on the surprisingly sturdy state of the music industry:

So there's no reason to insist that other obsolete bureaux and offices of the lapsed era be brought along into the new one. The music industry has shrunk. In shrinking it has rung out the middle, leaving the bands and the audiences to work out their relationship from the ends. I see this as both healthy and exciting. If we've learned anything over the past 30 years it's that left to its own devices bands and their audiences can get along fine: the bands can figure out how to get their music out in front of an audience and the audience will figure out how to reward them.

Most of the authors and bands who aren't making a lot of money these days weren't making a lot of money -- or any money at all -- in the old days, either. They had few effective ways to distribute their writings or their music.

Yes, there are still people in between bands and their fans, and writers and their readers, but Albini reminds us how much things have improved for creators and audiences alike. I especially like his takedown of the common lament, "We need to figure out how to make this work for everyone." That sentence has always struck me as the reactionary sentiment of middlemen who no longer control the space between creators and audiences and thus no longer get their cut of the transaction.

I still think often about what this means for universities. We need to figure out how to make this internet thing work for everyone...


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 24, 2015 2:07 PM

Sentences of the Day

Three sentences stood out from the pages of my morning reading. The first two form an interesting dual around power and responsibility.

The Power to Name Things

Among the many privileges of the center, for example, is the power to name things, one of the greatest powers of all.

Costica Bradatan writes this in Change Comes From the Margins, a piece on social change. We programmers know quite well the power of good names, and thus the privilege we have in being able to create them and the responsibility we have to do that well.

The Avoidance of Power as Irresponsibility

Everyone's sure that speech acts and cultural work have power but no one wants to use power in a sustained way to create and make, because to have power persistently, in even a small measure, is to surrender the ability to shine a virtuous light on one's own perfected exclusion from power.

This sentence comes from the heart of Timothy Burke's All Grasshoppers, No Ants, his piece on one of the conditions he thinks ails our society as a whole. Burke's essay is almost an elaboration of Teddy Roosevelt's well-known dismissal of critics, but with an insightful expression of how and why rootless critics damage society as a whole.

Our Impotence in the Face of Depression

Our theories about mental health are often little better than Phlogiston and Ether for the mind.

Quinn Norton gives us this sentence in Descent, a personally-revealing piece about her ongoing struggle with depression. Like many of you, I have watched friends and loved ones fight this battle, which demonstrates all too readily the huge personal costs of civilization's being in such an early stage of understanding this disease, its causes, and its effective treatment.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 21, 2015 3:02 PM

'Send' Is The Universal Verb

In the mid-1980s, Ray Ozzie left IBM with the idea of creating an all-in-one software platform for business collaboration, based on his experience using the group messaging system in the seminal computer-assisted instruction system Plato. Ozzie's idea eventually became Lotus Notes. This platform lives on today in an IBM product, but it never had the effect that Ozzie envisioned for it.

In Office, Messaging, and Verbs, Benedict Evans tells us that Ozzie's idea is alive and well and finally taking over the world -- in the form of Facebook:

But today, Facebook's platform on the desktop is pretty much Ray Ozzie's vision built all over again but for consumers instead of enterprise and for cat pictures instead of sales forecasts -- a combination of messaging with embedded applications and many different data types and views for different tasks.

"Office, Messaging, and Verbs" is an engaging essay about how collaborative work and the tools we use to do it co-evolve, changing each other in turn. You need a keyboard to do the task at hand... But is the task at hand your job, or is it merely the way you do your job today? The answer depends on where you are on the arc of evolution.

Alas, most days I need to create or consume a spreadsheet or two. Spreadsheets are not my job, but they are the way people in universities and most other corporate entities do too many of their jobs these days. So, like Jack Lemmon in The Apartment, I compute my cell's function and pass it along to the next person in line.

I'm ready for us to evolve further down the curve.

~~~~

Note: I added the Oxford comma to Evans's original title. I never apologize for inserting an Oxford comma.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 20, 2015 2:59 PM

Rethinking Accounting Software and Interfaces in the 1980s

In Magic Ink: Information Software and the Graphical Interface, Bret Victor reminds us that the dominant style of user interface today was created long before today's computers:

First, our current UI paradigm was invented in a different technological era. The initial Macintosh, for example, had no network, no mass storage, and little inter-program communication. Thus, it knew little of its environment beyond the date and time, and memory was too precious to record significant history. Interaction was all it had, so that's what its designers used. And because the computer didn't have much to inform anyone of, most of the software at the time was manipulation software -- magic versions of the typewriter, easel, and ledger-book. Twenty years and an internet explosion later, software has much more to say, but an inadequate language with which to say it.

William McCarthy, creator of the REA model of accounting

Victor's mention of the accounting ledger brings to mind the work being done since the early 1980s by Bill McCarthy, an accounting professor at Michigan State. McCarthy is motivated by a similar set of circumstances. The techniques by which we do financial accounting were created long before computers came along, and the constraints that made them necessary no longer exist. But he is looking deeper than simply the interaction style of accounting software; he is interested in upending the underlying model of accounting data.

McCarthy proposed the resources, events, agents (REA) model -- essentially an application of database theory from CS -- as an alternative to traditional accounting systems. REA takes advantage of databases and other computing ideas to create a more accurate model of a business and its activity. It eliminates many of the artifacts of double-entry bookkeeping, including debits, credits, and placeholder accounts such as accounts receivable and payable, because they can be generated in real time from more fine-grained source data. An REA model of a business enables a much wider range of decision support than the traditional accounting model while still allowing the firm to produce all the artifacts of traditional accounting as a side effect.
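
To make the idea concrete, here is a minimal sketch, in Python, of what an REA-style event store might look like. The class names and the receivables query are my own illustration, not McCarthy's notation; the point is only that a traditional artifact such as accounts receivable can be derived on demand from fine-grained event records rather than maintained as an account.

    # A toy REA-style store: economic events involving resources and agents are
    # recorded directly, and traditional accounting artifacts are derived on demand.
    # All names here (Agent, EconomicEvent, receivables) are illustrative only.

    from dataclasses import dataclass

    @dataclass
    class Agent:
        name: str

    @dataclass
    class EconomicEvent:
        kind: str            # e.g., "sale" or "cash-receipt"
        resource: str        # e.g., "widgets" or "cash"
        amount: float        # value of the resource transferred
        counterparty: Agent  # the outside agent on the other side of the event

    def receivables(events):
        """Derive an accounts-receivable balance from raw events:
        value of sales not yet offset by cash receipts, per customer."""
        balance = {}
        for e in events:
            if e.kind == "sale":
                balance[e.counterparty.name] = balance.get(e.counterparty.name, 0) + e.amount
            elif e.kind == "cash-receipt":
                balance[e.counterparty.name] = balance.get(e.counterparty.name, 0) - e.amount
        return {name: amt for name, amt in balance.items() if amt > 0}

    acme = Agent("Acme Co.")
    events = [
        EconomicEvent("sale", "widgets", 500.0, acme),
        EconomicEvent("cash-receipt", "cash", 200.0, acme),
    ]
    print(receivables(events))   # {'Acme Co.': 300.0}

In a sketch like this, the "account" exists only as a query over the recorded events, which is the spirit of generating the traditional artifacts from finer-grained source data.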

(I had the good fortune to work with McCarthy during my graduate studies and even helped author a conference paper on the development of expert systems from REA models. He also served on my dissertation committee.)

In the early years, many academic accountants reacted with skepticism to the idea of REA. They feared losing the integrity of the traditional accounting model, which carried a concomitant risk to the trust placed by the public in audited financial statements. Most of these concerns were operational, not theoretical. However, a few people viewed REA as somehow dissing the system that had served the profession so well for so long.

Victor includes a footnote in Magic Ink that anticipates a similar concern from interaction designers to his proposals:

Make no mistake, I revere GUI pioneers such as Alan Kay and Bill Atkinson, but they were inventing rules for a different game. Today, their windows and menus are like buggy whips on a car. (Although Alan Kay clearly foresaw today's technological environment, even in the mid-'70s. See "A Simple Vision of the Future" in his fascinating Early History of Smalltalk (1993).)

"They were inventing rules for a different game." This sentence echoes how I have always felt about Luca Pacioli, the inventor of double-entry bookkeeping. It was a remarkable technology that helped to enable the growth of modern commerce by creating a transparent system of accounting that could be trusted by insiders and outsiders alike. But he was inventing rules for a different game -- 500 years ago. Half a century dwarfs the forty or fifty year life of windows, icons, menus, and pointing and clicking.

I sometimes wonder what might have happened if I had pursued McCarthy's line of work more deeply. It dovetails quite nicely with software patterns and would have been well-positioned for the more recent re-thinking of financial support software in the era of ubiquitous mobile computing. So many interesting paths...


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 07, 2015 2:09 PM

Echoes: Symbols and Stories

Kevin Lawler, in Invention:

Steve Jobs gets credit for a lot of things he didn't do. Jobs himself said it best: "People like symbols, so I'm the symbol of certain things." Sometimes that means using Jobs as a stand-in for the many designers who work at Apple. Jobs usually makes for a good story. We like narratives, and we can build several entertaining ones around Jobs. Telling stories lets us gloss over other people by attributing their work to one person.

Peter Sheridan Dodds, in Homo Narrativus and the Trouble with Fame:

These two traits -- our compulsion to tell stories and our bias towards the individual -- conspire to ruin our intuitive understanding of fame.

The symbols we create and the stories we tell intertwine. Knowing that we are biased cognitively to treat them in a particular way puts us in a better position to overcome the bias.


Posted by Eugene Wallingford | Permalink | Categories: General

July 06, 2015 4:48 PM

Echoes: Aligning Expectations with Reality

Adam Bosworth, in Say No:

Have you ever been busy for an entire week and felt like you got nothing meaningful done? Either your expectations are off or your priorities are.

Brent Simmons, in Love:

If there's a way out of despair, it's in changing our expectations.

Good advice from two people who have been in the trenches for a while.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

July 03, 2015 12:50 PM

Good Music Uses Your Whole Mind

A couple of months back, someone posted a link to an interview with guitarist Steve Vai, to share its great story about how Vai came to work with Frank Zappa. I liked the entire piece, including the first paragraph, which sets the scene on how Vai got into music in the first place:

Steve Vai: I recall when I was very, very young I was always tremendously excited whenever I was listening to the radio or records. Even back then a peculiar thing happened that still happens to me today. When I listen to music I can't focus on anything else. When there's wallpaper music on the radio it's not a problem but if a good song comes on it's difficult for me to carry on a conversation or multitask. It's always odd to me when I'm listening to something or playing something for somebody and they're having a discussion in the middle of a piece of music [laughs].

I have this pattern. When a song connects with me, I want to listen; not read or talk, simply listen. And, yes, sometimes it's "just" a pop song. For a while, whenever "Shut Up and Dance" by Walk the Moon came on the radio, it had my full attention. Ah, who am I kidding? It still has that effect on me.

Also, I love Vai's phrase "wallpaper music". I often work with music on in the background, and some music I like knows how to stay there. For me, that's a useful role for songs to play. Working in an environment with some ambient noise is much better for me than working in complete silence, and music makes better ambient noise for me than life in a Starbucks.

When I was growing up, I noticed that occasionally a good song would come on the air, and my level of concentration managed to hold it at bay. When I realized that I had missed the song, I was disappointed. Invariably in those cases, I had been solving a math problem or writing a computer program. That must have been a little bit like the way Vai felt about music: I wanted to know how to do that, so I put my mind into figuring out how. I was lucky to find a career in which I can do that most of the time.

Oh, and by the way, Steve Vai can really play.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

June 22, 2015 3:27 PM

Strategy Under Time Constraints

an old analog chess clock

In Proving Too Much, Scott Alexander writes this about a rhetorical strategy that most people disapprove of:

Because here is a fundamental principle of the Dark Arts -- you don't need an argument that can't be disproven, only an argument that can't be disproven in the amount of time your opponent has available.

This is dark art in the world of ideas, where truth is more important than winning an argument. But it is a valuable strategy in games like chess, which are often played under time constraint. In competition, winning sometimes matters more than beauty or truth.

Suppose that my opponent has only a few minutes or seconds left on the clock. Suppose also that it's my move and that I have two possible moves to make. One is objectively better, in that it leads to the better expected outcome for me in theory, but it is easy for my opponent to find good responses. The other move is weaker, perhaps even allowing my opponent to get an advantage over me, but it would be hard for her to find the right path in the time available.

In this case, I may actually want to play the weaker move, because it maximizes my chance of winning in the circumstances of the game. My opponent has to use extra time to untangle the complexity of the position, and even if she finds the right move, there may not be enough time left to execute the plan. This approach is more volatile for me than playing the safer move, as it increases my risk of losing at the same time that it increases my chances of prevailing. But on balance, I am better off.

This may seem like a crazy strategy, but anyone who has played a lot of speed chess knows its value. Long-time world champion Emanuel Lasker was reputed to have employed a similar strategy, sometimes playing the move that would most unsettle the particular opponent he was playing that day, rather than the absolute best move. (Wikipedia says, though, that this reputation may have been undeserved.)

There are chessplayers who would object to this strategy as much as people object to its use in argumentation. There is truth in chess, too, and most chessplayers deeply appreciate making beautiful moves and playing beautiful games. Some grandmasters have sought beautiful combinations to their own detriment. For example, Mikhail Tal may have been able to retain or regain his world title if not for a propensity to seek complication in search of beauty. He gave us many brilliancies as a result, but he also lost just often enough to keep him on the fringes of the world championship.

Much of the time, though, we chessplayers are trying to win the game, and practicing the dark arts is occasionally the best way to do so. That may mean making a move that confounds the opponent just long enough to win the game.


Posted by Eugene Wallingford | Permalink | Categories: General

June 09, 2015 2:48 PM

I'm Behind on Blogging About My Courses...

... so much so, that I may never catch up. The last year and a half have been crazy, and I simply have not set aside enough time to blog. A big part of the time crunch was teaching three heavy preps in 2014: algorithms, agile software development, and our intro course. It is fitting, then, that blogging about my courses has suffered most of all -- even though, in the moment, I often have plenty to say. Offhand, I can think of several posts for which I once had big plans and for which I still have drafts or outlines sitting in my ideas/ folder:

  • readers' thoughts on teaching algorithms in 2014, along with changes I made to my course. Short version: The old canon still covers most of the important bases.
  • reflections on teaching agile dev again after four years. Short version: The best learning still happens in the trenches working with the students, who occasionally perplex me and often amaze me.
  • reflections on teaching Python in the intro course for the first time. Short version: On balance, there are many positives, but wow, there is a lot of language there, and way too many resources.
  • a lament on teaching programming languages principles when the students don't seem to connect with the material. Surprise ending: Some students enjoyed the course more than I realized.

Thoughts on teaching Python stand out as especially trenchant even many months later. The intro course is so important, because it creates habits and mindsets in students that often long outlive the course. Teaching a large, powerful, popular programming language to beginners in the era of Google, Bing, and DuckDuckGo is a Sisyphean task. No matter how we try to guide the students' introduction to language features, the Almighty Search Engine sits ever at the ready, delivering size and complexity when they really need simple answers. Maybe we need language levels à la the HtDP folks.

Alas, my backlog is so deep that I doubt I will ever have time to cover much of it. Life goes on, and new ideas pop up every day. Perhaps I can make time for the posts outlined above.

Right now, my excitement comes from the prospect of teaching my compilers course again for the first time in two years. The standard material still provides a solid foundation for students who are heading off into the world of software development. But in the time since I last taught the course, some neat things have happened in the compiler world that will make the course better, if only by putting the old stuff into a more modern context. Consider announcements just this week about Swift, in particular that the source code is being open-sourced and the run-time ported to Linux. The moment these two things happen, the language instantly becomes of greater interest to more of my students. Its openness also makes it more suitable as content for a university course.

So, there will be plenty to blog about, even if I leave my backlog untouched. That's a good thing.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

June 07, 2015 9:26 AM

Agile Moments, Ernest Hemingway Edition

I couldn't help thinking of big visible charts when I read this paragraph in The Paris Review's interview with Ernest Hemingway:

[Hemingway] keeps track of his daily progress -- "so as not to kid myself" -- on a large chart made out of the side of a cardboard packing case and set up against the wall under the nose of a mounted gazelle head. The numbers on the chart showing the daily output of words differ from 450, 575, 462, 1250, back to 512, the higher figures on days [he] puts in extra work so he won't feel guilty spending the following day fishing on the Gulf Stream.

He uses the chart to keep himself honest. Even our greatest writers can delude themselves into thinking they are making enough progress when they aren't. All the more so for those of us who are still learning, whether how to run a marathon, how to write prose, or how to make software. When a group of people are working together, a chart can help the individuals maintain a common, and honest, understanding of how the team is doing.

Oh, and notice Hemingway's technology: the side of a cardboard packing case. No fancy dashboard for this writer who is known for his direct, unadorned style. If you think you need a digital dashboard with toggles, flashing lights, and subviews, you are doing it wrong. The point of the chart is to keep you honest, not give you another thing to do when you are not doing what you should be doing.

There is another lesson in this passage beyond the chart, about sustainable pace. Most of the numbers are in the ballpark of 500 (average: 499 3/4!), except for one day when he put in a double day. Perhaps 500 words a day is a pace that Hemingway finds productive over time. Yet he allows himself an occasional bit of overtime -- for something important, like time away from his writing desk, out on the water. Many of us programmers need to be reminded every so often that getting away from our work is valuable, and worth an occasional 0 on the big visible chart. It's also a more human motivation for overtime than the mad rush to a release date.

A few pages later in the interview, we read Hemingway repeating a common adage among writers that also echoes nicely against the agile practices:

You read what you have written and, as you always stop when you know what is going to happen next, you go on from there.

Hemingway stops each day at a point where the story will pull him forward the next morning. In this, XP devotees can recognize the habit of ending each day with a broken test. In the morning, or whenever we next fire up our editors, the broken test tells us exactly where to begin and gives us a concrete goal. By the time the test passes, our minds are ready to move on to something new.
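
For readers who have not tried the practice, here is a minimal sketch of what stopping on a broken test can look like, in Python with a pytest-style test. The function and test names are invented for illustration; the only point is the bookmark that the failing assertion leaves behind.

    # End of the day: write the test for the next bit of behavior and leave it failing.
    # Tomorrow, running the test suite points straight at where to resume work.

    def next_word(text, start):
        pass   # deliberately unwritten -- tomorrow's first task

    def test_next_word_skips_leading_spaces():
        # This assertion fails today; making it pass is where the story picks up.
        assert next_word("  hello world", 0) == ("hello", 7)

The details don't matter; the broken test is the equivalent of Hemingway stopping mid-scene, when he already knows what happens next.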

Agility is useful when fighting bulls. Apparently, it helps when writing novels, too.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

June 04, 2015 2:33 PM

If the Web is the Medium, What is the Message?

How's this for a first draft:

History may only be a list of surprises, but you sure as heck don't want to lose the list.

That's part of the message in Bret Victor's second 'Web of Alexandria' post. He puts it in starker terms:

To forget the past is to destroy the future. This is where Dark Ages come from.

Those two posts followed a sobering observation:

60% of my fav links from 10 yrs ago are 404. I wonder if Library of Congress expects 60% of their collection to go up in smoke every decade.

But it's worse than that, Victor tells us in his follow-up. As his tweet notes, the web has turned out to be unreliable as a publication medium. We publish items because we want them to persist in the public record, but they rarely persist for very long. However, the web has turned out to be a pernicious conversational medium as well. We want certain items shared on the web to be ephemeral, yet often those items are the ones that last forever. At one time, this may have seemed like only an annoyance, but now we know it to be dangerous.

The problem isn't that the web is a bad medium. In one sense, the web isn't really a medium at all; it's an infrastructure that enables us to create new kinds of media with historically uncharacteristic ease. The problem is that we are using web-based media for many different purposes, without understanding how each medium determines "the social and temporal scope of its messages".

The same day I read Victor's blog post, I saw this old Vonnegut quote fly by on Twitter:

History is merely a list of surprises. ... It can only prepare us to be surprised yet again.

Alas, on the web, history appears to be a list of cat pictures and Tumblr memes, with all the important surprises deleted when the author changed internet service providers.

In a grand cosmic coincidence, on the same day I read Victor's blog post and saw the Vonnegut quote fly by, I also read a passage from Marshall McLuhan in a Farnam Street post. It ends:

The modern world abridges all historical times as readily as it reduces space. Everywhere and every age have become here and now. History has been abolished by our new media.

The internet certainly amplifies the scale of McLuhan's worry, but the web has created a unique form of erasure. I'm sure McLuhan would join Victor in etching an item on history's list of surprises:

Protect the past.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

June 02, 2015 1:46 PM

"I Just Need a Programmer", Screenplay Edition

Noted TV writer, director, producer, and blogger Ken Levine takes on a frequently-asked question in the latest edition of his "Friday Questions" feature:

I have a great idea for a movie, but I'm not a writer, I'm not in show biz, and I don't live in New York or LA. What do I do with this great idea? (And I'm sure you've never heard this question before, right?)

Levine is gentle in response:

This question does come up frequently. I wish I had a more optimistic answer. But the truth is execution is more valued than ideas. ...

Is there any domain where this isn't true? Yet professionals in every domain seem to receive this question all the time. I certainly receive the "I just need a programmer..." phone call or e-mail every month. If I went to cocktail parties, maybe I'd hear it at them, too.

The bigger the gap between idea and product, the more valuable execution is, relatively speaking, than having the idea. For many app ideas, executing the idea is not all that far beyond the reach of many people. Learn a little Objective-C, and away you go. In three or four years, you'll be set! By comparison, writing a screenplay that anyone in Hollywood will look at (let alone turn into a blockbuster film) seems like Mount Everest.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

June 01, 2015 2:21 PM

I'd Like to Be Bored for a While

When asked if he has ever been bored, Italo Calvino responded:

Yes, in my childhood. But it must be pointed out that childhood boredom is a special kind of boredom. It is a boredom full of dreams, a sort of projection into another place, into another reality. In adulthood boredom is made of repetition, it is the continuation of something from which we are no longer expecting any surprise. And I -- would that I had time to get bored today!

Children are better at boredom than adults are, because we let them be. We should let adults be good at boredom every once in a while, too.

(Passage from this Paris Review interview, which I quoted a couple of times several weeks ago.)


Posted by Eugene Wallingford | Permalink | Categories: General

May 26, 2015 2:59 PM

If You Want to Help Students, You May Want to Help Faculty

In The Missing Middle, Matt Reed recommends a way for donors to have a real effect at teaching universities: pay for conference travel.

I've mentioned before that the next philanthropist who wants to make a massive difference in the performance of teaching-intensive public colleges -- whether community colleges or the smaller four-years -- could do it by underwriting conference travel. Right now, most colleges are lucky to send one or two people to most conferences. When an entire team attends the same presentation, it's much easier to get what chemists call activation energy. I've seen it personally.

This echoes a conversation the department heads in my college had earlier this month. Ongoing professional development is important for faculty, both in their research and their teaching. Faculty who are struggling in the classroom need more help than others, but even good teachers need to have their batteries charged every once in a while.

There tends to be more support for faculty development in their research than in their teaching, even at so-called teaching universities. Even so, professional development in research is often a natural side effect of external funding for research, and faculty at these universities don't always conduct research at a scale that is competitive for external funding.

And faculty development in their teaching? There aren't many sources to support this other than the university budget itself. Given the current state of funding for public universities, which is likely the new normal, these funds are being squeezed out of the budget, if they were ever present at all.

Professors need to stay current in their profession, and many need to address weaknesses over the course of their careers. When universities neglect faculty development, the faculty suffer, and so do their students. Often, the best way to help students is to help the faculty.

All that said, I am not holding my breath that dollars will be coming in from donors any time soon. People love to help students directly, but indirect support for students and support for other parts of the university are notoriously hard sells.


Posted by Eugene Wallingford | Permalink | Categories: General

May 09, 2015 9:28 AM

A Few Thoughts on Graduation Day

Today is graduation day for the Class of 2015 at my university. CS students head out into the world, most with a job in hand or nearly so, ready to apply their hard-earned knowledge and skills to all variety of problems. It's an exciting time for them.

This week also brought two other events that have me thinking about the world in which my students will live and the ways in which we have prepared them. First, on Thursday, the Technology Association of Iowa organized a #TechTownHall on campus, where the discussion centered on creating and retaining a pool of educated people to participate in, and help grow, the local tech sector. I'm a little concerned that the TAI blog says that "A major topic was curriculum and preparing students to provide immediate value to technology employers upon graduation." That's not what universities do best. But then, that is often what employers want and need.

Second, over the last two mornings, I read James Fallows's classic The Case Against Credentialism, from the archives of The Atlantic. Fallows gives a detailed account of the "professionalization" of many lines of work in the US and the role that credentials, most prominently university degrees, have played in the movement. He concludes that our current approach is biased heavily toward evaluating the "inputs" to the system, such as early success in school and other demonstrations of talent while young, rather than assessing the outputs, namely, how well people actually perform after earning their credentials.

Two passages toward the end stood out for me. In one, Fallows wonders if our professionalized society creates the wrong kind of incentives for young people:

An entrepreneurial society is like a game of draw poker; you take a lot of chances, because you're rarely dealt a pat hand and you never know exactly what you have to beat. A professionalized society is more like blackjack, and getting a degree is like being dealt nineteen. You could try for more, but why?

Keep in mind that this article appeared in 1985. Entrepreneurship has taken a much bigger share of the public conversation since then, especially in the tech world. Still, most students graduating from college these days are likely thinking of ways to convert their nineteens into steady careers, not ways to risk it all on the next Amazon or Uber.

Then this quote from "Steven Ballmer, a twenty-nine-year-old vice-president of Microsoft", on how the company looked for new employees:

We go to colleges not so much because we give a damn about the credential but because it's hard to find other places where you have large concentrations of smart people and somebody will arrange the interviews for you. But we also have a lot of walk-on talent. We're looking for programming talent, and the degree is in no way, shape, or form very important. We ask them to send us a program they've written that they're proud of. One of our superstars here is a guy who literally walked in off the street. We talked him out of going to college and he's been here ever since.

Who would have guessed in 1985 the visibility and impact that Ballmer would have over the next twenty years? Microsoft has since evolved from the entrepreneurial upstart to the staid behemoth, and now is trying to reposition itself as an important player in the new world of start-ups and mobile technology.

Attentive readers of this blog may recall that I fantasize occasionally about throwing off the shackles of the modern university, which grow more restrictive every year as the university takes on more of the attributes of corporate and government bureaucracy. In one of my fantasies, I organize a new kind of preparatory school for prospective software developers, one with a more modern view of learning to program but also an attention to developing the whole person. That might not satisfy corporate America's need for credentials, but it may well prepare students better for a world that needs poker players as much as it needs blackjack players. But where would the students come from?

So, on a cloudy graduation day, I think about Fallows's suggestion that more focused vocational training is what many grads need, about the real value of a liberal university education to both students and society, and about how we can best prepare CS students to participate in the world. It is a world that needs not only their technical skills but also their understanding of what tech can and cannot do. As a society, we need them to take a prominent role in civic and political discourse.

One final note on the Fallows piece. It is a bit long, dragging a bit in the middle like a college research paper, but opens and closes strongly. With a little skimming through parts of less interest, it is worth a read. Thanks to Brian Marick for the recommendation.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

April 30, 2015 6:00 PM

Software is a Means of Communication, Just Like a Research Paper

I can't let my previous post be my only comment on Software in Scientific Research. Hinsen's bigger point is worth a post of its own.

Software is a means of communication, just like papers or textbooks.

... much like the math that appears in a paper or a textbook -- except that, done properly, a computer program runs and provides a dynamic demonstration of an idea.

The main questions asked about scientific software [qua software] are "What does it do?" and "How efficient is it?" When considering software as a means of communication, we would ask questions such as "Is it well-written, clear, elegant?", "How general is the formulation?", or "Can I use it as the basis for developing new science?".

This shift requires a different level of understanding of programs and programming than many scientists (and other people who do not program for a living) have. But it is a shift that needs to take place, so we should do all we can to help scientists and others become more fluent. (Hey to Software Carpentry and like-minded efforts.)

We take for granted that all researchers are responsible for being able to produce and, more importantly, understand the other essential parts of scientific communication:

We actually accept as normal that the scientific contents of software, i.e., the models implemented by it, are understandable only to software specialists, meaning that for the majority of users, the software is just a black box. Could you imagine this for a paper? "This paper is very obscure, but the people who wrote it are very smart, so let's trust them and base our research on their conclusions." Did you ever hear such a claim? Not me.

This is a big part of the challenge we face in getting faculty across the university to see the vital role that computing should play in modern education -- as well as the roles it should not play. The same is true in the broader culture. We'll see if efforts such as code.org can make a dent in this challenge.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

April 26, 2015 9:55 AM

Yesterday's Questions Can Have Different Answers Today

I wrote on Twitter Thursday [ 1 | 2 ] that I end up modifying my lecture notes every semester, no matter how well done they were the last time I taught the course. From one semester to the next, I find that I am more likely to change the introductions, transitions, and conclusions of a session than the body. The intros, transitions, and conclusions help to situate the material in a given place and time: the context of this semester and this set of students. The content, once refined, tends to stabilize, though occasionally I feel a need to present even it in a different way, to fit the current semester.

Novelist Italo Calvino knew this feeling as well, when he was preparing to be interviewed:

Rarely does an interviewer ask questions you did not expect. I have given a lot of interviews and I have concluded that the questions always look alike. I could always give the same answers. But I believe I have to change my answers because with each interview something has changed either inside myself or in the world. An answer that was right the first time may not be right again the second.

This echoes my experience preparing for lecture. The answer that was right the last time does not seem right again this time. Sometimes, I have changed. With any luck, I have learned new things since the last time I taught the course, and that makes for a better story. Sometimes, the world has changed: a new programming language such as Clojure or Scala has burst onto the scene, or a new trend in industry such as mobile app development has made a different set of issues relevant to the course. I need to tell a different story that acknowledges -- and takes advantage of -- these changes.

Something else always changes for a teacher, too: the students. It's certainly true the students in the class are different every time I teach a course. But sometimes, the group is so different from past groups that the old examples, stories, and answers just don't seem to work. Such has been the case for me this semester. I've had to work quite a bit to understand how my students think and incorporate that into my class sessions and homework assignments. This is part of the fun and challenge of being a teacher.

We have to be careful not to take this analogy too far. Teaching computer science is different from an author giving an interview about his or her life. For one thing, there is a more formal sense of objective truth in the content of, say, a programming language course. An object is still a closure; a closure is still an object that other code can interact with over time. These answers tend to stay the same over time. But even as a course communicates the same fundamental truths from semester to semester, the stories we need to tell about these truths will change.
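
For readers who haven't seen that particular truth drawn out, here is a small sketch in Python. The counter is the standard textbook example, not anything specific to my course; it only shows the sense in which an object and a closure are interchangeable ways to package state with behavior.

    # Two implementations of a counter "object": one as a class instance,
    # one as a closure over a local variable. Client code interacts with
    # each in the same way: ask it to act, and it remembers its state.

    class CounterObject:
        def __init__(self):
            self.count = 0
        def increment(self):
            self.count += 1
            return self.count

    def make_counter():
        count = 0
        def increment():
            nonlocal count
            count += 1
            return count
        return increment

    obj = CounterObject()
    fun = make_counter()
    print(obj.increment(), obj.increment())   # 1 2
    print(fun(), fun())                       # 1 2

The underlying truth stays put from semester to semester; the language, the example, and the story wrapped around it are what change.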

Ever the fantastic writer, Calvino saw in his interview experience the shape of a new story, a meta-story of sorts:

This could be the basis of a book. I am given a list of questions, always the same; every chapter would contain the answers I would give at different times. ... The changes would then become the itinerary, the story that the protagonist lives. Perhaps in this way I could discover some truths about myself.

This is one of the things I like about teaching. I often discover truths about myself, and occasionally transform myself.

~~~~

The passages quoted above come from The Art of Fiction No. 130, Italo Calvino in The Paris Review. It's not the usual Paris Review interview, as Calvino died before the interviewer was done. Instead, it is a pastiche of four different sources. It's a great read nonetheless.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

April 09, 2015 3:26 PM

Two Factors for Succeeding at Research, or Investing

Think differently, of course. But be humble. These attitudes go hand-in-hand.

To make money in the markets, you have to think independently and be humble. You have to be an independent thinker because you can't make money agreeing with the consensus view, which is already embedded in the price. Yet whenever you're betting against the consensus there's a significant probability you're going to be wrong, so you have to be humble.

This applies equally well to doing research. You can't make substantial progress with the conventional wisdom, because it defines and limits the scope of the solution. So think differently. But when you leave the safety of conventional wisdom, you find yourself working in an immense space of ideas. There is a significant chance that you will be wrong a lot. So be humble.

(The quote is from Ray Dalio, as given in Learn or Die: Using Science to Build a Leading-Edge Learning Organization.)


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

March 13, 2015 3:07 PM

Two Forms of Irrelevance

When companies become irrelevant to consumers.
From The Power of Marginal, by Paul Graham:

The big media companies shouldn't worry that people will post their copyrighted material on YouTube. They should worry that people will post their own stuff on YouTube, and audiences will watch that instead.

You mean Grey's Anatomy is still on the air? (Or, as today's teenagers say, "Grey's what?")

When people become irrelevant to intelligent machines.
From Outing A.I.: Beyond the Turing Test, by Benjamin Bratton:

I argue that we should abandon the conceit that a "true" Artificial Intelligence must care deeply about humanity -- us specifically -- as its focus and motivation. Perhaps what we really fear, even more than a Big Machine that wants to kill us, is one that sees us as irrelevant. Worse than being seen as an enemy is not being seen at all.

Our new computer overlords indeed. This calls for a different sort of preparation than studying lists of presidents and state capitals.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

March 10, 2015 4:45 PM

Learning to Program is a Loser's Game

After a long break from playing chess, I recently played a few games at the local club. Playing a couple of games in each of the last two weeks has reminded me that I am very rusty. I've only made two horrible blunders in four games, but I have made many small mistakes, the kind of errors that accumulate over time and make a position hard to defend, even untenable. Having played better in years past, I find these inaccuracies irksome.

Still, I managed to win all four games. As I've watched games at the club, I've noticed that most games are won by the player who makes the second-to-last blunder. Most of the players are novices, and they trade mistakes: one player leaves his queen en prise; later, his opponent launches an underprepared attack that loses a rook; then the first player trades pieces and leaves himself with a terrible pawn structure -- and so on, the players trading weak or bad moves until the position is lost for one of them.

My secret thus far has been one part luck, one part simple strategy: winning by not losing.

This experience reminded me of a paper called The Loser's Game, which in 1975 suggested that it was no longer possible for a fund manager to beat market averages over time because most of the information needed to do well was available to everyone. To outperform the market average, a fund manager has to profit from mistakes made by other managers, sufficiently often and by a sufficient margin to sustain a long-term advantage. Charles Ellis, the author, contrasts this with the bull markets of the 1960s. Then, managers made profits based on the specific winning investments they made; in the future, though, the best a manager could hope for was not to make the mistakes that other investors would profit from. Fund management had transformed from being a Winner's Game to a Loser's Game.

the cover of Extraordinary Tennis for the Ordinary Tennis Player

Ellis drew his inspiration from another world, too. Simon Ramo had pointed out the differences between a Winner's Game and a Loser's Game in Extraordinary Tennis for the Ordinary Tennis Player. Professional tennis players, Ramo said, win based on the positive actions they take: unreturnable shots down the baseline, passing shots out of the reach of a player at the net, service aces, and so on. We duffers try to emulate our heroes and fail... We hit our deep shots just beyond the baseline, our passing shots just wide of the sideline, and our killer serves into the net. It turns out that mediocre players win based on the errors they don't make. They keep the ball in play, and eventually their opponents make a mistake and lose the point.

Ramo saw that tennis pros are playing a Winner's Game, and average players are playing a Loser's Game. These are fundamentally different games, which reward different mindsets and different strategies. Ellis saw the same thing in the investing world, but as part of a structural shift: what had once been a Winner's Game was now a Loser's Game, to the consternation of fund managers whose mindset is finding the stocks that will earn them big returns. The safer play now, Ellis says, is to minimize mistakes. (This is good news for us amateur investors!)

This is the same phenomenon I've been seeing at the chess club recently. The novices there are still playing a Loser's Game, where the greatest reward comes to those who make the fewest and smallest mistakes. That's not very exciting, especially for someone who fancies herself to be Adolf Anderssen or Mikhail Tal in search of an immortal game. The best way to win is to stay alive, making moves that are as sound as possible, and wait for the swashbuckler across the board from you to lose the game.

What does this have to do with learning to program? I think that, in many respects, learning to program is a Loser's Game. Even a seemingly beginner-friendly programming language such as Python has an exacting syntax compared to what beginners are used to. The semantics seem foreign, even opaque. It is easy to make a small mistake that chokes the compiler, which then spews an error message that overwhelms the new programmer. The student struggles to fix the error, only to find another error waiting somewhere else in the code. Or he introduces a new error while eliminating the old one, which makes even debugging seem scary. Over time, this can dishearten even the heartiest beginner.

What is the best way to succeed? As in all Loser's Games, the key is to make fewer mistakes: follow examples closely, pay careful attention to syntactic details, and otherwise not stray too far from what you are reading about and using in class. Another path to success is to make the mistakes smaller and less intimidating: take small steps, test the code frequently, and grow solutions rather than write them all at once. It is no accident that the latter sounds like XP and other agile methods; they help to guard us from the Loser's Game and enable us to make better moves.
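
To make "small steps" concrete, here is a sketch of how a beginner might grow a tiny function rather than write it all at once. The exercise is hypothetical; the point is the rhythm of adding one small behavior at a time and checking it immediately.

    # Growing a word-count function in small, checked steps.
    # Step 1: handle the empty string.   -> should return 0
    # Step 2: handle a single word.      -> split the text and count the pieces
    # Step 3: handle extra spaces.       -> split() with no argument already does this
    # Each step is small enough that any mistake is easy to see and easy to undo.

    def word_count(text):
        return len(text.split())

    assert word_count("") == 0
    assert word_count("hello") == 1
    assert word_count("  two   words  ") == 2
    print("all steps pass")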

Just as playing the Loser's Game in tennis or investing calls for a different mindset, so, too does learning to program. Some beginners seem to grok programming quickly and move on to designing and coding brilliantly, but most of us have to settle in for a period of discipline and growth. It may not be exciting to follow examples closely when we want to forge ahead quickly to big ideas, but the alternative is to take big shots and let the compiler win all the battles.

Unlike tennis and Ellis's view of stock investing, programming offers us hope: Nearly all of us can make the transition from the Loser's Game to the Winner's Game. We are not destined to forever play it safe. With practice and time, we can develop the discipline and skills necessary to making bold, winning moves. We just have to be patient and put time and energy into the process of becoming less mistake-prone. By adopting the mindset needed to succeed in a Loser's Game, we can eventually play the Winner's Game.

I'm not too sure about the phrases "Loser's Game" and "Winner's Game", but I think that this analogy can help novice programmers. I'm thinking of ways that I can use it to help my students survive until they can succeed.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

February 16, 2015 4:15 PM

An Example of Science in Action

Here is another interesting piece from The New Yorker, this time on an example of science in action. Jon Krakauer is the author of Into the Wild, about adventurer Chris McCandless. Eighteen months ago, he published a claim that McCandless had likely died as a result of eating the seeds of Hedysarum alpinum, known as wild potato. Krakauer's theory was based on lab analysis of seeds from the plant showing that it contained a particular toxic alkaloid. A critic of the claim, Tom Clausen, suggested that Krakauer's theory would be credible only after being subjected to more thorough testing and published in a peer-reviewed journal.

So that's what Krakauer did. He worked with the same lab to do more thorough testing and found that his toxic alkaloid theory didn't hold up after all. Instead, detailed analysis showed that Hedysarum alpinum contains an amino acid that acts as an antimetabolite and for which toxicity in animals has been well documented. This work went through peer review and is being published next month in a scientific journal.

That's how science works. If a claim is challenged by other scientists, it is subjected to further tests. When those tests undermine the claim, it is withdrawn. Often, though, the same tests that undermine one hypothesis can point us in the direction of another and give us the information we need to construct a better theory.

A cautionary lesson from science also jumps out of this article, though. While searching the scientific literature for studies as part of the re-analysis of Hedysarum alpinum, he found a paper that pointed him in the direction of toxic non-protein amino acids. Krakauer writes:

I had missed this article in my earlier searches because I had been looking for a toxic alkaloid instead of a toxic amino acid. Clausen had been looking for a toxic alkaloid as well, when he and Edward Treadwell reported, in a peer-reviewed paper published in the journal Ethnobotany Research & Applications, that "no chemical basis for toxicity could be found" in H. alpinum seeds.

Clausen's team had been looking specifically for alkaloids, but then concluded more generally that "no chemical basis for toxicity could be found". This claim is broader than their results can support. Only the narrower claim that they could find no chemical basis for alkaloid toxicity seems warranted by the evidence. That is probably the conclusion Clausen's team should have drawn. Our conclusions should be as narrow as possible, given the data.

Anyway, Krakauer has written a fascinating article, accessible even to a non-biologist like me. Check it out.


Posted by Eugene Wallingford | Permalink | Categories: General

February 06, 2015 3:11 PM

What It Feels Like To Do Research

In one sentence:

Unless you tackle a problem that's already solved, which is boring, or one whose solution is clear from the beginning, mostly you are stuck.

This is from Alec Wilkinson's The Pursuit of Beauty, about mathematician Yitang Zhang, who worked a decade on the problem of bounded gaps between prime numbers. As another researcher says in the article,

When you try to prove a theorem, you can almost be totally lost to knowing exactly where you want to go. Often, when you find your way, it happens in a moment, then you live to do it again.

Programmers get used to never feeling normal, but tackling the twin prime problem is on a different level altogether. The same is true for any deep open question in math or computing.

I strongly recommend Wilkinson's article. It describes what life for untenured mathematicians is like, and how a single researcher can manage to solve an important problem.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

February 05, 2015 3:57 PM

If You Want to Become a Better Writer...

... write for undergraduates. Why?

Last fall, Steven Pinker took a stab at explaining why academics stink at writing. He hypothesizes that cognitive science and human psychology explain much of the problem. Experts often find it difficult to imagine that others do not know what experts know, which Pinker calls the curse of knowledge. They work around the limitations of short-term memory by packaging ideas into bigger and more abstract units, often called chunking. Finally, they tend to think about the things they understand well in terms of how they use them, not in terms of what they look like, a transition called functional fixity.

Toward the end of the article, Pinker summarizes:

The curse of knowledge, in combination with chunking and functional fixity, helps make sense of the paradox that classic style is difficult to master. What could be so hard about pretending to open your eyes and hold up your end of a conversation? The reason it's harder than it sounds is that if you are enough of an expert in a topic to have something to say about it, you have probably come to think about it in abstract chunks and functional labels that are now second nature to you but are still unfamiliar to your readers--and you are the last one to realize it.

Most academics aren't trying to write bad prose. They simply don't have enough practice writing good prose.

When Calvin explained to Hobbes, "With a little practice, writing can be an intimidating and impenetrable fog," he got it backward. Fog comes easily to writers; it's the clarity that requires practice. The naïve realism and breezy conversation in classic style are deceptive, an artifice constructed through effort and skill.

Wanting to write better is not sufficient. Exorcising the curse requires writers to learn new skills and to practice. One of the best ways to see if the effort is paying off is to get feedback: show the work to real readers and see if they can follow it.

That's where undergraduates come in. If you want to become a better writer or a better speaker, teach undergraduates regularly. They are about as far removed as you can get from an expert while still having an interest in the topic and some inclination to learn more about it.

When I write lecture notes for my undergrads, I have to eliminate as much jargon as possible. I have to work hard to put topics into the best order for learners, not for colleagues who are also expert in the area. I have to find stories to illuminate ideas, and examples to illustrate them. When I miss the intended mark on any of these attempts, my students will let me know, either through their questions or through their inability to perform as I expected. And then I try again.

My lecture notes are far from perfect, but they are always much better after a few iterations teaching a course than they are the first time I teach it. The weakest parts tend to be for material I'm adding to the course for the first time; the best parts tend to be revisions of existing material. These facts are no surprise to any writer or presenter, of course. Repetition and effort are how we make things better.

Even if you do not consider yourself a teacher by trade, if you want to improve your ability to communicate science, teach undergrads. Write lecture notes and explanations. Present to live students and monitor lab sessions. The students will reward you with vigorous feedback. Besides, they are good people to get to know.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

January 31, 2015 11:51 AM

Failure with a Purpose

I am committed to being wrong bigger and more often in 2015. Yet I am mindful of Avdi Grimm's admonition:

... failure isn't always that informative. You can learn a thousand different ways to fail and never learn a single way to succeed.

To fail for failure's sake is foolish and wasteful. In writing, the awful stuff you write when you start isn't usually valuable in itself, but rather for what you learn from studying and practicing. In science, failing isn't usually valuable in itself, but rather for what you learn when you prove an idea wrong. The scientist's mindset has a built-in correction for dealing with failure: every surprising result prompts a new attempt to understand why and build a better model.

As Grimm says, be sure you know what purpose your failure will serve. Sometimes, taking bigger risks intellectually can help us get off a plateau in our thinking, or even a local maximum. The failure pays off when we pay attention to the outcome and find a better hill to climb.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

January 29, 2015 4:27 PM

A Reminder from Marcus Aurelius

... from Book 6 of The Meditations, courtesy of George Berridge:

Item 52 from Book 6 of The Meditations, by Marcus Aurelius

You are not compelled to form any opinion about this matter before you, nor to disturb your peace of mind at all. Things in themselves have no power to extort a verdict from you.

This seems especially sound advice in this era, full of devices that enable other people to bombard our minds with matters they find Very Important Indeed. Maintain your peace of mind until you encounter a thing that your own mind knows to be important.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

January 19, 2015 2:14 PM

Beginners, Experts, and Possibilities

Last Thursday, John Cook tweeted:

Contrary to the Zen proverb, there may be more possibilities in the expert's mind than in the beginner's.

This summed up nicely one of the themes in my Programming Languages course that afternoon. Some students come into the course knowing essentially only one language, say Python or Ada. Others come knowing several languages well, including their intro language, Java, C, and maybe a language they learned on the job, such as Javascript or Scala.

Which group do you think has a larger view of what a programming language can be? The more knowledgeable, to be sure. This is especially true when their experience includes languages from different styles: procedural, object-oriented, functional, and so on.

Previous knowledge affects expectations. Students coming directly out of their first year courses are more likely to imagine that all languages are similar to what they already know. Nothing in their experience contradicts that idea.

Does this mean that the Zen notion of beginner's mind is wrongheaded? Not at all. I think an important distinction can be made between analysis and synthesis. In a context where we analyze languages, broad experience is more valuable than lack of experience, because we are able to bring to our seeing a wider range of possibilities. That's certainly my experience working with students over the years.

However, in a context where we create languages, broad experience can be an impediment. When we have seen many different languages, it can be difficult to create something that looks much different from the languages we've already seen. Something in our minds seems to pull us toward an existing language that already solves the constraint we are struggling with. Someone else has already solved this problem; their solution is probably best.

This is also my experience working with students over the years. My freshmen will almost always come up with a fresher language design than my seniors. The freshmen don't know much about languages yet, and so their minds are relatively unconstrained. (Fantastically so, sometimes.) The seniors often seem to end up with something that is superficially new but, at its core, thoroughly predictable.

The value of "Zen mind, beginner's mind" also follows a bit from the distinction between expertise and experience. Experts typically reach a level of where they solve problem using heuristics to solve problems. There patterns and shortcuts are efficient, but they also tend to be "compiled" and not all that open to critical examination. We create best when we are able to modify, rearrange, and discard, and that's harder to do when our default mode of thinking is in pre-compiled units.

It should not bother us that useful adages and proverbs contradict one another. The world is complex. As Bokononists say, Busy, busy, busy.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

January 09, 2015 3:40 PM

Computer Science Everywhere, Military Edition

Military Operations Orders are programs that are executed by units. Code re-use and other software engineering principles applied regularly to these.

An alumnus of my department, a CS major-turned-military officer, wrote those lines in an e-mail responding to my recent post, A Little CS Would Help a Lot of College Grads. Contrary to what many people might imagine, he has found what he learned in computer science to be quite useful to him as an Army captain. And he wasn't even a programmer:

One of the biggest skills I had over my peers was organizing information. I wasn't writing code, but I was handling lots of data and designing systems for that data. Organizing information in a way that was easy to present to my superiors was a breeze and having all the supporting data easily accessible came naturally to me.

Skills and principles from software engineering and project development apply to systems other than software. They also provide a vocabulary for talking about ideas that non-programmers encounter every day:

I did introduce my units to the terms border cases, special cases, and layers of abstraction. I cracked a smile every time I heard those terms used in a meeting.

Excel may not be a "real programming language", but knowing the ways in which it is a language can make managers of people and resources more effective at what they do.

For more about how a CS background has been useful to this officer, check out CS Degree to Army Officer, a blog entry that expands on his experiences.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

January 01, 2015 11:29 AM

Being Wrong in 2015

Yesterday, I read three passages about being wrong. First, this from a blog entry about Charles Darwin's "fantastically wrong" idea for how natural selection works:

Being wildly wrong is perfectly healthy in science, because when someone comes along to prove that you're wrong, that's progress. Somewhat embarrassing progress for the person being corrected, sure, but progress nonetheless.

Then, P.G. Wodehouse shared in his Paris Review interview that it's not all Wooster and Jeeves:

... the trouble is when you start writing, you write awful stuff.

And finally, from a touching reflection on his novelist father, this delicious sentence by Colum McCann:

He didn't see this as a failure so much as an adventure in limitations.

My basic orientation as a person is one of small steps, small progress, trying to be a little less wrong than yesterday. However, such a mindset can lead to a conservatism that inhibits changes in direction. One goal I have for 2015 is to take bigger risks intellectually, to stretch my thinking more than I have lately. I'll trust Wodehouse that when I start, I may well be awful. I'll recall Darwin's example that it's okay to be wildly wrong, because then someone will prove me wrong (maybe even me), and that will be progress. And if, like McCann's father, I can treat being wrong as merely an adventure in my limitations, perhaps fear and conservatism won't hold me back from new questions worth asking.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

December 26, 2014 8:32 AM

Editing and the Illusion of Thought

Martin Amis, in The Paris Review, The Art of Fiction No. 151:

By the way, it's all nonsense about how wonderful computers are because you can shift things around. Nothing compares with the fluidity of longhand. You shift things around without shifting them around--in that you merely indicate a possibility while your original thought is still there. The trouble with a computer is that what you come out with has no memory, no provenance, no history--the little cursor, or whatever it's called, that wobbles around the middle of the screen falsely gives you the impression that you're thinking. Even when you're not.

My immediate reaction was that Mr. Amis needs version control, but there is something more here.

When writing with pencil and paper, we work on an artifact that embodies the changes it has gone through. We see the marks and erasures; we see the sentence where it once was at the same time we see the arrow telling us where it now belongs. When writing in a word processor, our work appears complete, even timeless, though we know it isn't. Mark-up mode lets us see some of the document's evolution, but the changes feel more distant from our minds. They live out there.

I empathize with writers like Amis, whose experience predates the computer. Longhand feels different. Teasing out what was valuable, even essential, in previous experience and what was merely the limitation of our tools is one of the great challenges of any time. How do we make new tools that are worth the change, that enable us to do more and better?


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

December 14, 2014 9:38 AM

Social Media, Baseball, and Addiction

In a recent interview with Rolling Stone, rock star Geddy Lee revealed himself as a fan of baseball, but not of social media:

Geddy Lee isn't a big fan of social media. "I sometimes look on Twitter to follow baseball transactions," he says. "But that's it. I'm also not on Facebook or anything. I see it as an addiction and I have enough addictions. God knows I pick up my phone enough to check baseball scores."

As a baseball fan without a smart phone, I am in no rush to judge. I don't need more addictions, either.

The recently-concluded winter baseball meetings likely kept Lee as busy following transactions as they kept me, with several big trades and free agent signings. My Reds and Tigers both made moves that affect expectations for 2015.

Pitchers and catchers report in a little over two months. Lee and I will be checking scores again soon enough.


Posted by Eugene Wallingford | Permalink | Categories: General, Photos

December 02, 2014 2:53 PM

Other People's Best Interests

Yesterday I read:

It's hard for me to figure out people voting against their own self-interests.

I'm not linking to the source, because it wouldn't be fair to single the speaker out, especially when so many other things in the article are spot-on. Besides, I hear many different people express this sentiment from time to time, people of various political backgrounds and cultural experiences. It seems a natural human reaction when things don't turn out the way we think they should.

Here is something I've learned from teaching and from working with teams doing research and writing software:

If you find yourself often thinking that people aren't acting in their own self-interests, maybe you don't know what their interests are.

It certainly may be true that people are not acting in what you think is their own self-interest. But it's rather presumptuous to think that you know other people's best interests better than they do.

Whenever I find myself in this position, I have some work to do. I need to get to know my students, or my colleagues, or my fellow citizens, better. In cases where it's really true, and I have strong reason to think they aren't acting in their own best interest, I have an opportunity to help them learn. This kind of conversation calls for great care, though, because often we are dealing with people's identities and most deeply-held beliefs.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading, Teaching and Learning

November 25, 2014 1:43 PM

Concrete Play Trumps All

Areschenko-Johannessen, Bundesliga 2006-2007

One of the lessons taught by the computer is that concrete play trumps all.

This comment appeared in the review of a book of chess analysis [ paywalled ]. The reviewer is taking the author to task for talking about the positional factors that give one player "a stable advantage" in a particular position, when a commercially-available chess program shows the other player can equalize easily, and perhaps even gain an advantage.

It is also a fitting comment on our relationship with computers these days more generally. In areas such as search and language translation, Google helped us see that conventional wisdom can often be upended by a lot of data and many processors. In AI, statistical techniques and neural networks solve problems in ways that models of human cognition cannot. Everywhere we turn, it seems, big data and powerful computers are helping us to redefine our understanding of the world.

We humans need not lose all hope, though. There is still room for building models of the world and using them to reason, just as there is room for human analysis of chess games. In chess, computer analysis is pushing grandmasters to think differently about the game. The result is a different kind of understanding for the more ordinary of us, too. We just have to be careful to check our abstract understanding against computer analysis. Concrete play trumps all, and it tests our hypotheses. That's good science, and good thinking.

~~~~

(The chess position is from Areschenko-Johannessen 2006-2007, used as an example in Chess Training for Post-Beginners by Yaroslav Srokovski and cited in John Hartmann's review of the book in the November 2014 issue of Chess Life.)


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 23, 2014 8:50 AM

Supply, Demand, and K-12 CS

When I meet with prospective students and their parents, we often end up discussing why most high schools don't teach computer science. I tell them that, when I started as a new prof here, about a quarter of incoming freshmen had taken a year of programming in high school, and many other students had had the opportunity to do so. My colleagues and I figured that this percentage would go way up, so we began to think about how we might structure our first-year courses when most or all students already knew how to program.

However, the percentage of incoming students with programming experience didn't go up. It went way down. These days, about 10% of our freshmen know how to program when they start our intro course. Many of those learned what they know on their own. What happened, today's parents ask?

A lot of things happened, including the dot-com bubble, a drop in the supply of available teachers, a narrowing of the high school curriculum in many districts, and the introduction of high-stakes testing. I'm not sure how much each contributed to the change, or whether other factors may have played a bigger role. Whatever the causes, the result is that our intro course still expects no previous programming experience.

Yesterday, I saw a post by a K-12 teacher on the Racket users mailing list that illustrates the powerful pull of economics. He is leaving teaching for the software development industry, though reluctantly. "The thing I will miss the most," he says, "is the enjoyment I get out of seeing youngsters' brains come to life." He also loves seeing them succeed in the careers that knowing how to program makes possible. But in that success lies the seed of his own career change:

Speaking of my students working in the field, I simply grew too tired of hearing about their salaries which, with a couple of years experience, was typically twice what I was earning with 25+ years of experience. Ultimately that just became too much to take.

He notes that college professors probably know the feeling, too. The pull must be much stronger on him and his colleagues, though; college CS professors are generally paid much better than K-12 teachers. A love of teaching can go only so far. At one level, we should probably be surprised that anyone who knows how to program well enough to teach thirteen- or seventeen-year-olds to do it stays in the schools. If not surprised, we should at least be deeply appreciative of the people who do.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

November 11, 2014 7:53 AM

The Internet Era in One Sentence

I just love this:

When a 14-year-old kid can blow up your business in his spare time, not because he hates you but because he loves you, then you have a problem.

Clay Shirky attributes it to Gordy Thompson, who managed internet services at the New York Times in the early 1990s. Back then, it was insightful prognostication; today, it serves as an epitaph for many an old business model.

Are 14-year-old kids making YouTube videos to replace me yet?


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 01, 2014 3:27 PM

Passion is a Heavy Burden

Mark Guzdial blogged this morning about the challenge of turning business teachers into CS teachers. Where is the passion? he asks.

These days, I wince every time I hear the word 'passion'. We apply it to so many things. We expect teachers to have passion for the courses they teach, students to have passion for the courses they take, and graduates to have passion for the jobs they do and the careers they build.

Passion is a heavy burden. In particular, I've seen it paralyze otherwise well-adjusted college students who think they need to try another major, because they don't feel a passion for the one they are currently studying. They don't realize that often passion comes later, after they master something, do it for a while, and come to appreciate it in ways they could never imagine before. I'm sure some of these students become alumni who are discontent with their careers, because they don't feel passion.

I think requiring all CS teachers to have a passion for CS sets the bar too high. It's an unrealistic expectation of prospective teachers and of the programs that prepare them.

We can survive without passionate teachers. We should set our sights on more realistic and relevant goals:

  • Teachers should be curious. They should have a desire to learn new things.
  • Teachers should be professional. They should have a desire to do their jobs well.
  • Teachers should be competent. They should be capable of doing their jobs well.

Curiosity is so much more important than passion for most people in most contexts. If you are curious, you will like encountering new ideas and learning new skills. That enjoyment will carry you a long way. It may even help you find your passion.

Perhaps we should set similarly realistic goals for our students, too. If they are curious, professional, and competent, they will most likely be successful -- and content, if not happy. We could all do worse.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

October 23, 2014 4:21 PM

A Quick Word on the Yik Yak Controversy

There has been some controversy on my campus recently about a slew of hurtful posts made on the social media application Yik Yak. The following is something I wrote for my intro CS students, with minor changes.

Computing is often in the news, but we don't talk much about current events in class. That's not the focus of this course, and we have plenty to do...

But the recent news story in the Northern Iowan about Yik Yak has been on my mind. Yik Yak is a social media app that lets people make comments anonymously and vote on other people's comments. This kind of app has many possible uses, some of which are positive. Many people live under conditions where they need to be able to communicate anonymously.

Unfortunately, some people in the UNI area have been using it to post hurtful comments about various groups. This behavior is simply mean.

Yik Yak is a social app, so the controversy is about people and how they behave. In this regard, my reaction has been similar to so many others' reactions. I am always sad to be reminded that people actually think such things, and sadder to know that they feel compelled to say them out loud. To do so anonymously is an act of cowardice.

But this controversy is also about what we do, because Yik Yak is a program. We call it an "app", but that's just the term du jour. It is a computer program. Programmers wrote it.

We could have an interesting discussion about apps like this: their uses, the good and bad they enable, how to grow and guide communities of users, and so on. I do not use Yik Yak and am not a member of its community. I don't know much beyond what has been reported about it in the media. However, I have been part of Internet-based communities since I was in college, and they all seem to have a lot in common with one another. So this situation feels quite familiar to me.

I am not going to lecture a group of young people about the ways they communicate and congregate on-line. Let me just say this.

When you learn to program, you inherit power to affect the world. You can make things, programs and apps and services that real people use. You can choose to use your power to do good things, to make the world better. Or you can not choose to. Not choosing may mean creating something whose effects you did not consider, or whose community behaves in ways you did not intend.

Please take your power seriously. Think about the effects of what you do when you write a program. Choose wisely.


Posted by Eugene Wallingford | Permalink | Categories: General

October 21, 2014 3:13 PM

Ernest Hemingway, Programmer

In The Wave in the Mind, a collection of talks and essays, Ursula Le Guin describes Ernest Hemingway as "someone who did Man right". She also gives us insight into Hemingway's preferences in programming languages. Anyone who has read Hemingway knows that he loved short sentences. Le Guin tells us more:

Ernest Hemingway would have died rather than have syntax. Or semicolons.

So, Java and C are out. Python would fit. Or maybe Lisp. All the greats know Lisp.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

September 24, 2014 3:54 PM

Is It Really That Hard?

This morning, I tweeted:

Pretty sure I could build a git-based curriculum management system in two weeks that would be miles better than anything on the market now.

Yes, I know that it is easy to have ideas, and that carrying an idea through to a product is the real challenge. At least I don't just need a programmer...

My tweet was the result of temporary madness provoked by yet another round of listening to non-CS colleagues talk about one of the pieces of software we use on campus. It is a commercial product purchased for one task only, to help us manage the cycle of updating the university catalog. Alas, in its current state, it can handle only one catalog at a time. This is, of course, inconvenient. There are always at least two catalogs: the one in effect at this moment, and the one in progress of being updated. That doesn't even take into account all of the old catalogs still in effect for the students who entered the university when they were The Catalog.

Yes, we need version control. Either the current software does not provide it, or that feature is turned off.

The madness arises because of the deep internal conflict that occurs within me when I'm drawn into such conversations. Everyone assumes that programs "can't do this", or that the programmers who wrote our product were mean or incompetent. I could try to convince them otherwise by explaining the idea of version control. But their experience with commercial software is so uniformly bad that they have a hard time imagining I'm telling the truth. Either I misunderstand the problem, or I am telling them a white lie.

The alternative is to shake my head, agree with them implicitly, and keep thinking about how to teach my intro students how to design simple programs.

I'm convinced that a suitable web front-end to a git back end could do 98% of what we need, which is about 53% more than either of our last two commercial solutions has done for us.
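
To make the idea concrete, here is a minimal sketch of what I mean, in Python, assuming only that the git command-line tool is installed. The layout (one branch per catalog year) and all of the names are mine, invented for illustration; a real product would still need the web front end, permissions, and a review workflow.

    # A toy sketch only: one git branch per catalog year, the plain git CLI
    # doing the work underneath. Function names here are hypothetical.
    import subprocess
    from pathlib import Path

    REPO = Path("catalog-repo")

    def git(*args):
        # Run a git command inside the catalog repository.
        subprocess.run(["git", "-C", str(REPO), *args], check=True)

    def init_repo():
        REPO.mkdir(exist_ok=True)
        subprocess.run(["git", "init", str(REPO)], check=True)
        git("commit", "--allow-empty", "-m", "empty catalog")

    def save_page(year, name, text, message):
        # Record one catalog page on the branch for the given catalog year.
        git("checkout", "-B", f"catalog-{year}")
        page = REPO / f"{name}.txt"
        page.write_text(text)
        git("add", page.name)
        git("commit", "-m", message)

    if __name__ == "__main__":
        init_repo()
        save_page(2014, "computer-science", "CS major: 2014 requirements...", "draft CS section")
        save_page(2015, "computer-science", "CS major: 2015 revisions...", "start next year's update")

Every catalog year lives on its own branch, every edit is a commit, and the full history comes along for free. That is most of the version control we keep wishing our commercial product had.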

Maybe it's time for me to take a leave of absence, put together a small team of programmers, and do this. Yes, I would need a team. I know my limitations, and besides, working with a few friends would be a lot more fun. The current tools in this space leave a lot of room for improvement. Built well and marketed well, this product would make enough money from satisfaction-starved universities to reward everyone on the team well enough for all to retire comfortably.

Maybe not. But the idea is free for the taking. All I ask is that if you build it, give me a shout-out on your website. Oh, and cut my university a good deal when we buy your software to replace whatever product we are grumbling about when you reach market.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

September 15, 2014 4:22 PM

It's All Just Keystrokes

Laurie Penny describes one effect of so many strands of modern life converging into the use of a single device:

That girl typing alone at the internet café might be finishing off her novel. Or she might be breaking up with her boyfriend. Or breaking into a bank. Unless you can see her screen, you can't know for sure. It's all just keystrokes.

Some of it is writing in ways we have always written; some of it is writing in ways only recently imagined. Some of it is writing for a computer. A lot of it is writing.

(Excerpt from Why I Write.)


Posted by Eugene Wallingford | Permalink | Categories: General

September 12, 2014 1:49 PM

The Suffocating Gerbils Problem

I had never heard of the "suffocating gerbils" problem until I ran across this comment in a Lambda the Ultimate thread on mixing declarative and imperative approaches to GUI design. Peter Van Roy explained the problem this way:

A space rocket, like the Saturn V, is a complex piece of engineering with many layered subsystems, each of which is often pushed to the limits. Each subsystem depends on some others. Suppose that subsystem A depends on subsystem B. If A uses B in a way that was not intended by B's designers, even though formally B's specification is being followed by A, then we have a suffocating gerbils problem. The mental image is that B is implemented by a bunch of gerbils running to exhaustion in their hoops. A is pushing them to do too much.

I first came to appreciate the interrelated and overlapping functionality of engineered subsystems in graduate school, when I helped a fellow student build a software model of the fuel and motive systems of an F-18 fighter plane. It was quite a challenge for our modeling language, because the functions and behaviors of the systems were intertwined and did not follow obviously from the specification of components and connections. This challenge motivated the project. McDonnell Douglas was trying to understand the systems in a new way, in order to better monitor performance and diagnose failures. (I'm not sure how the project turned out...)

We suffocate gerbils at the university sometimes, too. Some functions depend on tenure-track faculty teaching occasional overloads, or the hiring of temporary faculty as adjuncts. When money is good, all is well. As budgets tighten, we find ourselves putting demands on these subsystems to meet other essential functions, such as advising, recruiting, and external engagement. It's hard to anticipate looming problems before they arrive in full failure; everything is being done according to specification.

Now there's a mental image: faculty gerbils running to exhaustion.

If you are looking for something new to read, check out some of Van Roy's work. His Concepts, Techniques, and Models of Computer Programming offers all kinds of cool ideas about programming language design and use. I happily second the sentiment of this tweet:

Note to self: read all Peter Van Roy's LtU comments in chronological order and build the things that don't exist yet: http://lambda-the-ultimate.org/user/288/track?from=120&sort=asc&order=last%20post

There are probably a few PhD dissertations lurking in those comments.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

August 19, 2014 1:49 PM

The Universal Justification

Because we need it to tell better stories.

Ethan Zuckerman says that this is the reason people are addicted to big data, quoting Maciej Ceglowski's wonderful The Internet with a Human Face. But if you look deep enough, this is the reason that most of us do so many of the things we do. We want to tell better stories.

As I teach our intro course this fall, I am going to ask myself occasionally, "How does what we are learning today help my students tell a better story?" I'm curious to see how that changes the way I think about the things we do in class.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

July 28, 2014 1:00 PM

Sometimes, You Have To Speak Up For Yourself

Wisdom from the TV:

"Whatever happened to humility, isn't that a virtue or something?"

"One of the highest. People in power are always saying so."

It is worth noting that one of the antonyms of "humble" is "privileged".

~~~~

This passage apparently occurs in an episode of Orange Is The New Black. I've never seen the show, but the exchange is quoted in this discussion of the show.

I just realized how odd it is to refer to Orange Is The New Black as a TV show. It is a Netflix original series, which shows up on your TV only if you route your Internet viewing through that old box. Alas, 30- and 60-minute serialized shows have always been "TV" to me. I'm caught in the slipstream as our dominant entertainment media change forms.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

July 10, 2014 3:08 PM

The Passing of the Postage Stamp

In this New York Times article on James Baldwin's ninetieth birthday, scholar Henry Louis Gates laments:

On one hand, he's on a U.S. postage stamp; on the other hand, he's not in the Common Core.

I'm not qualified to comment on Baldwin and his place in the Common Core. In the last few months, I read several articles about and including Baldwin, and from those I have come to appreciate better his role in twentieth-century literature. But I also empathize with anyone trying to create a list of things that every American should learn in school.

What struck me in Gates's comment was the reference to the postage stamp. I'm old enough to have grown up in a world where the postage stamp held a position of singular importance in our culture. It enabled communication at a distance, whether geographical or personal. Stamps were a staple of daily life.

In such a world, appearing on a stamp was an honor. It indicated a widespread acknowledgment of a person's (or organization's, or event's) cultural impact. In this sense, the Postal Service's decision to include James Baldwin on a stamp was a sign of his importance to our culture, and a way to honor his contributions to our literature.

Alas, this would have been a much more significant and visible honor in the 1980s or even the 1990s. In the span of the last decade or so, the postage stamp has gone from relevant and essential to archaic.

When I was a boy, I collected stamps. It was a fun hobby. I still have my collection, even if it's many years out of date now. Back then, stamp collecting was a popular activity with a vibrant community of hobbyists. For all I know, that's still true. There's certainly still a vibrant market for some stamps!

But these days, whenever I use a new stamp, I feel as if I'm holding an anachronism in my hands. Computing technology played a central role in the obsolescence of the stamp, at least for personal and social communication.

Sometimes people say that we in CS need to do a better job helping potential majors see the ways in which our discipline can be used to effect change in the world. We never have to look far to find examples. If a young person wants to be able to participate in how our culture changes in the future, they can hardly do better than to know a little computer science.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

July 09, 2014 12:35 PM

Why I Blog, Ten Years On

A blog can be many things.

It can be an essay, a place to work out what I think, in the act of writing.

It can be a lecture, a place to teach something, however big or small, in my own way.

It can be memoir, a place to tell stories about my life, maybe with a connection to someone else's story.

It can be a book review or a conference review, a place to tell others about something I've read or seen that they might like, too. Or not.

It can be an open letter, a place to share news, good or bad, in a broadcast that reaches many.

It can be a call for help, a request for help from anyone who receives the message and has the time and energy to respond.

It can be a riff on someone else's post. I'm not a jazz musician, but I like to quote the melodies in other people's writing. Some blog posts are my solos.

It can be a place to make connections, to think about how things are similar and different, and maybe learn something in the process.

A blog is all of these, and more.

A blog can also be a time machine. In this mode, I am the reader. My blog reminds me who I was at another time.

This effect often begins with a practical question. When I taught agile software development this summer, I looked back to when I taught it last. What had I learned then but forgotten since? How might I do a better job this time around?

When I visit blog posts from the past, though, something else can happen. I sometimes find myself reading on. The words mesmerize me and pull me forward on the page, but back in time. It is not that the words are so good that I can't stop reading. It's that they remind me who I was back then. A different person wrote those words. A different person, yet me. It's quite a feeling.

A blog can combine any number of writing forms. I am not equally good writing in all of these forms, or even passably good in any of them. But they are me. Dave Winer has long said that a blog is the unedited voice of a person. This blog is the unedited voice of me.

When I wrote my first blog post ten years ago today, I wasn't sure if anyone wanted to hear my voice. Over the years, I've had the good fortune to interact with many readers, so I know someone is listening. That still amazes me. I'm glad that something you read here is worth the visit.

Back in those early days, I wondered if it even mattered whether anyone else would read. The blog as essay and as time machine are valuable enough on their own to make writing worth the effort to me. But I'll be honest: it helps a lot knowing that other people are reading. Even when you don't send comments by e-mail, I know you are there. Thank you for your time.

I don't write as often as I did in the beginning. But I still have things to say, so I'll keep writing.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

June 26, 2014 11:12 AM

Debunking Christensen?

A lot of people I know have been discussing the recent New Yorker article "debunking" Clayton Christensen's theory of disruptive innovation. I'm withholding judgment, because that usually is the right thing for me to do when discussing theories about systems we don't understand well and critiques of such theories. The best way to find out the answer is to wait for more data.

That said, we have seen this before in the space of economics and business management. A few years back, the book Good to Great by James Collins became quite popular on my campus, because our new president, an economist by training, was a proponent of its view of how companies had gone from being merely steady producers to being stars in their markets. He hoped that we could use some of its prescriptions to help transform our university from a decent public comprehensive into a better, stronger institution.

But in recent years we have seen critiques of Collins's theory. The problem: some of the companies that Collins touts in the book have fallen on hard times and been unable to sustain their greatness. (As I said, more data usually settles all scores.) Good to Great's prescriptions weren't enough for companies to sustain greatness; maybe they were not sufficient, or even necessary, for achieving (short-term) market dominance.

This has long been a weakness of the business management literature. When I was an undergrad double majoring in CS and accounting, I read a lot of case studies about successful companies, and my professors tried to help us draw out truths that would help any company succeed. Neither the authors of the case studies nor the professors seemed aware that we were suffering from a base case of survivor bias. Sure, that set of strategies worked for Coca Cola. Did other companies use the same strategies and fail? If so, why? Maybe Coca Cola just got lucky. We didn't really know.

My takeaway from reading most business books of this sort is that they tell great stories. They give us post hoc explanations of complex systems that fit the data at hand, but they don't have much in the way of predictive power. Buying into such theories wholesale as a plan for the future is rarely a good idea.

These books can still be useful to people who read them as inspirational stories and a source of ideas to try. For example, I found Collins's idea of "getting the right people on the bus" to be helpful when I was first starting as department head. I took a broad view of the book and learned some things.

And that said, I have speculated many times here about the future of universities and even mentioned Christensen's idea of disruption a couple of times [ 1 | 2 ]. Have I been acting on a bad theory?

I think the positive reaction to the New Yorker article is really a reaction to the many people who have been using the idea of disruptive innovation as a bludgeon in the university space, especially with regard to MOOCs. Christensen himself has sometimes been guilty of speaking rather confidently about particular ways to disrupt universities. After a period of groupthink in which people know without evidence that MOOCs will topple the existing university model, many of my colleagues are simply happy to have someone speak up on their side of the debate.

The current way that universities do business faces a number of big challenges as the balance of revenue streams and costs shifts. Perhaps universities as we know them now will ultimately be disrupted. This does not mean that any technology we throw at the problem will be the disruptive force that topples them. As Mark Guzdial wrote recently,

Moving education onto MOOCs just to be disruptive isn't valuable.

That's the most important point to take away from the piece in the New Yorker: disruptors ultimately have to provide value in the market. We don't know yet if MOOCs or any other current technology experiment in education can do that. We likely won't know until after it starts to happen. That's one of the important points to take away from so much of the business management literature. Good descriptive theories often don't make good prescriptive theories.

The risk people inside universities run is falling into a groupthink of their own, in which something very like the status quo is the future of higher education. My colleagues tend to speak in more measured tones than some of the revolutionaries espousing on-line courses and MOOCs, but their words carry an unmistakable message: "What we do is essential. The way we do it has stood the test of time. No one can replace us." Some of my colleagues admit ruefully that perhaps something can replace the university as it is, but that we will all be worse off as a result.

That's dangerous thinking, too. Over the years, plenty of people who have said, "No one can do what we do as well as we do" have been proven wrong.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

May 05, 2014 4:35 PM

Motivated by Teaching Undergrads

Recently, a gentleman named Seth Roberts passed away. I didn't know Roberts and was not familiar with his work. However, several people I respect commented on his life and career, so I took a look at one colleague's reminiscence. Roberts was an interesting fellow who didn't do things the usual way for a research academic. This passage stood out:

Seth's academic career was unusual. He shot through college and graduate school to a tenure-track job at a top university, then continued to do publication-quality research for several years until receiving tenure. At that point he was not a superstar but I think he was still considered a respected member of the mainstream academic community. But during the years that followed, Seth lost interest in that thread of research (you can see this by looking at the dates of most of his highly-cited papers). He told me once that his shift was motivated by teaching introductory undergraduate psychology: the students, he said, were interested in things that would affect their lives, and, compared to that, the kind of research that leads to a productive academic career did not seem so appealing.

That last sentence explains, I think, why so many computer science faculty at schools that are not research-intensive end up falling away from traditional research and publishing. When you come into contact with a lot of undergrads, you may well find yourself caring more deeply about things that will affect their lives in a more direct way. Pushing deeper down a narrow theoretical path, or developing a novel framework for file system management that most people will never use, may not seem like the best way to use your time.

My interests have certainly shifted over the years. I found myself interested in software development, in particular tools and practices that students can use to make software more reliably and teaching practices that would help students learn more effectively. Fortunately, I've always loved programming qua programming, and this has allowed me to teach different programming styles with an eye on how learning them will help my students become better programmers. Heck, I was even able to stick with it long enough that functional programming became popular in industry! I've also been lucky that my interest in languages and compilers has been of interest to students and employers over the last few years.

In any event, I can certainly understand how Roberts diverged from the ordained path and turned his interest to other things. One challenge for leaving the ordained path is to retain the mindset of a scientist, seeking out opportunities to evaluate ideas and to disseminate the ones that appear to hold up. You don't need to publish in the best journals to disseminate good ideas widely. That may not even be the best route.

Another challenge is to find a community of like-minded people in which to work. An open, inquisitive community is a place to find new ideas, a place to try ideas out before investing too much in a doomed one, and a place to find the colleagues most of us need to stay sane while exploring what interests us. The software and CS worlds have helped create the technology that makes it possible to grow such communities in new ways, and our own technology now supports some amazing communities of software and CS people. It is a good time to be an academic or developer.

I've enjoyed reading about Roberts' career and learning about what seems to have been one of academia's unique individuals. And I certainly understand how teaching introductory undergrads might motivate a different worldview for an academic. It's good to be reminded that it's okay to care about the things that will affect the lives of our students now rather than later.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

April 27, 2014 7:20 PM

Knowing and Doing in the Wild, Antifragile Edition

a passage from Taleb's 'Antifragile' that mentions knowing and doing

Reader Aaron Friel was reading Taleb's Antifragile and came across a passage that brought to mind this blog. Because of "modernity's connectivity, and the newfound invisibility of causal chains", Taleb says, ...

The intellectual today is vastly more powerful and dangerous than before. The "knowledge world" causes separation of knowing and doing (within the same person) and leads to the fragility of society.

He wondered if this passage was the source of the title of my blog. Knowing and Doing predates Taleb's book by nearly a decade, so it wasn't the source. But the idea expressed in this passage was certainly central to how the blog got its name. I hoped to examine the relationship between knowing and doing, and in particular the danger of separating them in the classroom or in the software studio. So, I'm happy to have someone make a connection to this passage.

Even so, I still lust after naming my blog The Euphio Question. RIP, Mr. Vonnegut.


Posted by Eugene Wallingford | Permalink | Categories: General

April 22, 2014 2:56 PM

Not Writing At All Leads To Nothing

In a recent interview, novelist and journalist Anna Quindlen was asked if she ever has writer's block. Her answer:

Some days I fear writing dreadfully, but I do it anyway. I've discovered that sometimes writing badly can eventually lead to something better. Not writing at all leads to nothing.

I deal with CS students all the time who are paralyzed by starting on a programming assignment, for fear of doing it wrong. All that gets them is never done. My job in those cases is less likely to involve teaching them something new they need to do the assignment than to involve helping them get past the fear. A teacher sometimes has to be a psychologist.

I'd like to think that, at my advanced age and experience, I am beyond such fears myself. But occasionally they are there. Sometimes, I just have to force myself to write that first simple test, watch it fail, and ask myself, "What now?" As code happens, it may be good, or it may be bad, but it's not an empty file. Refactoring helps me make it better as I go along. I can always delete it all and start over, but by then I know more than I did at the outset, and I usually am ready to plow ahead.
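
For what it's worth, that first simple test can be almost embarrassingly small. Here is the kind of thing I mean, in Python's unittest; the function and its intended behavior are made up for the example, and the test fails by design -- that's the point.

    # The first simple test for a function that doesn't exist yet.
    # word_count() is a hypothetical assignment; the failing test comes first.
    import unittest

    def word_count(text):
        pass   # not written yet

    class TestWordCount(unittest.TestCase):
        def test_simple_sentence(self):
            self.assertEqual(word_count("a simple first test"), 4)

    if __name__ == "__main__":
        unittest.main()

Run it, watch it fail, and the empty file is no longer empty. The only question left is "What now?"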


Posted by Eugene Wallingford | Permalink | Categories: General

April 09, 2014 3:26 PM

Programming Everywhere, Vox Edition

In a report on the launch of Vox Media, we learn that the line between software developers and journalists at Vox is blurred, as writers and reporters work together "to build the tools they require".

"It is thrilling as a journalist being able to envision a tool and having it become a real thing," Mr. Topolsky said. "And it is rare."

It will be less rare in the future. Programming will become a natural part of more and more people's toolboxes.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

March 12, 2014 3:55 PM

Not Content With Content

Last week, the Chronicle of Higher Ed ran an article on a new joint major at Stanford combining computer science and the humanities.

[Students] might compose music or write a short story and translate those works, through code, into something they can share on the web.

"For students it seems perfectly natural to have an interest in coding," [the program's director] said. "In one sense these fields might feel like they're far apart, but they're getting closer and closer."

The program works in both directions, by also engaging CS students in the societal issues created by ubiquitous networks and computing power.

We are doing something similar at my university. A few years ago, several departments began to collaborate on a multidisciplinary program called Interactive Digital Studies which went live in 2012. In the IDS program, students complete a common core of courses from the Communication Studies department and then take "bundles" of coursework involving digital technology from at least two different disciplines. These areas of emphasis enable students to explore the interaction of computing with various topics in media, the humanities, and culture.

Like Stanford's new major, most of the coursework is designed to work at the intersection of disciplines, rather than pursuing disciplines independently, "in parallel".

The initial version of the computation bundle consists of an odd mix of application tools and opportunities to write programs. Now that the program is in place, we are finding that students and faculty alike desire more depth of understanding about programming and development. We are in the process of re-designing the bundle to prepare students to work in a world where so many ideas become web sites or apps, and in which data analytics plays an important role in understanding what people do.

Both our IDS program and Stanford's new major focus on something that we are seeing increasingly at universities these days: the intersections of digital technology and other disciplines, in particular the humanities. Computational tools make it possible for everyone to create more kinds of things, but only if people learn how to use new tools and think about their work in new ways.

Consider this passage by Jim O'Loughlin, a UNI English professor, from a recent position statement on the "digital turn" of the humanities:

We are increasingly unlikely to find writers who only provide content when the tools for photography, videography and digital design can all be found on our laptops or even on our phones. It is not simply that writers will need to do more. Writers will want to do more, because with a modest amount of effort they can be their own designers, photographers, publishers or even programmers.

Writers don't have to settle for producing "content" and then relying heavily on others to help bring the content to an audience. New tools enable writers to take greater control of putting their ideas before an audience. But...

... only if we [writers] are willing to think seriously not only about our ideas but about what tools we can use to bring our ideas to an audience.

More tools are within the reach of more people now than ever before. Computing makes that possible, not only for writers, but also for musicians and teachers and social scientists.

Going further, computer programming makes it possible to modify existing tools and to create new tools when the old ones are not sufficient. Writers, musicians, teachers, and social scientists may not want to program at that level, but they can participate in the process.

The critical link is preparation. This digital turn empowers only those who are prepared to think in new ways and to wield a new set of tools. Programs like our IDS major and Stanford's new joint major are among the many efforts hoping to spread the opportunities available now to a larger and more diverse set of people.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

March 11, 2014 4:52 PM

Change The Battle From Arguments To Tests

In his recent article on the future of the news business, Marc Andreessen has a great passage in his section on ways for the journalism industry to move forward:

Experimentation: You may not have all the right answers up front, but running many experiments changes the battle for the right way forward from arguments to tests. You get data, which leads to correctness and ultimately finding the right answers.

I love that clause: "running many experiments changes the battle for the right way forward from arguments to tests".

While programming, it's easy to get caught up in what we know about the code we have just written and assume that this somehow empowers us to declare sweeping truths about what to do next.

When students are first learning to program, they often fall into this trap -- despite the fact that they don't know much at all. From other courses, though, they are used to thinking for a bit, drawing some conclusions, and then expressing strongly-held opinions. Why not do it with their code, too?

No matter who we are, whenever we do this, sometimes we are right, and sometimes, we are wrong. Why leave it to chance? Run a simple little experiment. Write a snippet of code that implements our idea, and run it. See what happens.

Programs let us test our ideas, even the ideas we have about the program we are writing. Why settle for abstract assertions when we can do better? In the end, even well-reasoned assertions are so much hot air. I learned this from Ward Cunningham: It's all talk until the tests run.
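
Here is a small example of what I mean, in Python. (The example is mine, not Andreessen's or Cunningham's.) Students will argue for ten minutes about whether slicing a list copies it or aliases it; the experiment below settles the argument in seconds.

    # Settle an argument about Python semantics with a test, not a debate.
    original = [1, 2, 3]
    candidate = original[:]      # claim under test: slicing makes a new list
    candidate[0] = 99

    assert original == [1, 2, 3], "slicing aliased the list after all"
    print("slicing copies:", original, candidate)

Five lines of code, one run, and the argument is over.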


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

February 25, 2014 3:31 PM

Abraham Lincoln on Reading the Comment Section

From Abraham Lincoln's last public address:

As a general rule, I abstain from reading the reports of attacks upon myself, wishing not to be provoked by that to which I cannot properly offer an answer.

These remarks came two days after Robert E. Lee surrendered at Appomattox Court House. Lincoln was facing abuse from the North and the South, and from within his party and without.

The great ones speak truths that outlive their times.


Posted by Eugene Wallingford | Permalink | Categories: General

February 22, 2014 2:05 PM

MOOCs: Have No Fear! -- Or Should We?

The Grumpy Economist has taught a MOOC and says in his analysis of MOOCs:

The grumpy response to moocs: When Gutenberg invented moveable type, universities reacted in horror. "They'll just read the textbook. Nobody will come to lectures anymore!" It didn't happen. Why should we worry now?

The calming effect of his rather long entry is mitigated by other predictions, such as:

However, no question about it, the deadly boring hour and a half lecture in a hall with 100 people by a mediocre professor teaching utterly standard material is just dead, RIP. And universities and classes which offer nothing more to their campus students will indeed be pressed.

In downplaying the potential effects of MOOCs, Cochrane seems mostly to be speaking about research schools and more prestigious liberal arts schools. Education is but one of the "goods" being sold by such schools; prestige and connections are often the primary benefits sought by students there.

I usually feel a little odd when I read comments on teaching from people who teach mostly graduate students and mostly at big R-1 schools. I'm not sure their experience of teaching is quite the same as the experience of most university professors. Consequently, I'm suspicious of the prescriptions and predictions they make for higher education, because our personal experiences affect our view of the world.

That said, Cochrane's blog spends a lot of time talking about the nuts and bolts of creating MOOCs, and his comments on fixed and marginal costs are on the mark. (He may be grumpy, but he is an economist!) And a few of his remarks about teaching apply just as well to undergrads at a state teaching university as they do to U. of Chicago's doctoral program in economics. One that stood out:

Most of my skill as a classroom teacher comes from the fact that I know most of the wrong answers as well as the right ones.

All discussions of MOOCs ultimately include the question of revenue. Cochrane reminds us that universities...

... are, in the end, nonprofit institutions that give away what used to be called knowledge and is now called intellectual property.

The question now, though, is how schools can afford to give away knowledge as state support for public schools declines sharply and relative cost structure makes it hard for public and private schools alike to offer education at a price reasonable for their respective target audiences. The R-1s face a future just as challenging as the rest of us; how can they afford to support researchers who spend most of their time creating knowledge, not teaching it to students?

MOOCs are a weird wrench thrown into this mix. They seem to taketh away as much as they giveth. Interesting times.


Posted by Eugene Wallingford | Permalink | Categories: General

February 05, 2014 4:08 PM

Eccentric Internet Holdout Delbert T. Quimby

Back in a July 2006 entry, I mentioned a 1995 editorial cartoon by Ed Stein, then of the Rocky Mountain News. The cartoon featured "eccentric Internet holdout Delbert T. Quimby", contentedly passing another day non-digitally, reading a book in his den and drinking a glass of wine. It's always been a favorite of mine.

The cartoon had at least one other big fan. He looked for it on the web but had no luck finding it. When he googled the quote, though, there my blog entry was. Recently, his wife uncovered a newspaper clipping of the cartoon, and he remembered the link to my blog post. In an act of unprovoked kindness, he sent me a scan of the cartoon. So, 7+ years later, here it is:

Eccentric Internet holdout Delbert T. Quimby passes yet another day non-digitally.

The web really is an amazing place. Thanks, Duncan.

In 1995, being an Internet holdout was not quite as radical as it would be today. I'm guessing that most of the holdouts in 2014 are Of A Certain Age, remembering a simpler time when information was harder to come by. To avoid the Internet and the web entirely these days is to miss out on a lot of life.

Even so, I am eccentric enough still to appreciate time off-line, a good book in my hand and a beverage at my side. Like my digital devices, I need to recharge every now and then.

(Right now, I am re-reading David Lodge's Small World. It's fun to watch academics made good sport of.)


Posted by Eugene Wallingford | Permalink | Categories: General

February 03, 2014 4:07 PM

Remembering Generosity

For a variety of reasons, the following passage came to mind today. It is from a letter that Jonathan Schoenberg wrote as part of the "Dear Me, On My First Day of Advertising" series on The Egotist forum:

You got into this business by accident, and by the generosity of people who could have easily been less generous with their time. Please don't forget it.

It's good for me to remind myself frequently of this. I hope I can be as generous with my time to my students and colleagues as so many of my professors and colleagues were with theirs. Even when it means explaining nested for-loops again.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

January 27, 2014 3:29 PM

An Example of the Difference Between Scientists and Humanists

Earlier today, I tweeted a link to The origin of consciousness in the breakdown of the bicameral mind, in which Erik Weijers discusses an unusual theory about the origin of consciousness developed by Julian Jaynes:

[U]ntil a few thousand years ago human beings did not 'view themselves'. They did not have the ability: they had no introspection and no concept of 'self' that they could reflect upon. In other words: they had no subjective consciousness. Jaynes calls their mental world the bicameral mind.

It sounds odd, I know, but I found Jaynes's hypothesis to be a fascinating extrapolation of human history. Read more of Weijers's review if you might be interested.

A number of people who saw my tweet expressed interest in the article or a similar fascination with Jaynes's idea. Two people mentioned the book in which Jaynes presented his hypothesis. I responded that I would now have to dive into the book and learn more. How could I resist the opportunity?

Two of the comments that followed illustrate nicely the differing perspectives of the scientist and the humanist. First, Chris said:

My uncle always loved that book; I should read it, since I suspect serious fundamental evidentiary problems with his thesis.

And then Liz said:

It's good! I come from a humanities angle, so I read it as a thought experiment & human narrative.

The scientist thinks almost immediately of evidence and how well supported the hypothesis might be. The humanist thinks of the hypothesis first as a human narrative, and perhaps only then as a narrow scientific claim. Both perspectives are valuable; they simply highlight different forms of the claim.

From what I've seen on Twitter, I think that Chris and Liz are like me and most of the people I know: a little bit scientist, a little bit humanist -- interested in both the story and the argument. All that differs sometimes is the point from which we launch our investigations.


Posted by Eugene Wallingford | Permalink | Categories: General

January 27, 2014 11:39 AM

The Polymath as Intellectual Polygamist

Carl Djerassi, quoted in The Last Days of the Polymath:

Nowadays people [who] are called polymaths are dabblers -- are dabblers in many different areas. I aspire to be an intellectual polygamist. And I deliberately use that metaphor to provoke with its sexual allusion and to point out the real difference to me between polygamy and promiscuity.

On this view, a dilettante is merely promiscuous, making no real commitment to any love interest. A polymath has many great loves, and loves them all deeply, if not equally.

We tend to look down on dilettantes, but they can perform a useful service. Sometimes, making a connection between two ideas at the right time and in the right place can help spur someone else to "go deep" with the idea. Even when that doesn't happen, dabbling can bring great personal joy and provide more substantial entertainment than a lot of pop culture.

Academics are among the people these days with a well-defined social opportunity to explore at least two areas deeply and seriously: their chosen discipline and teaching. This is perhaps the most compelling reason to desire a life in academia. It even offers a freedom to branch out into new areas later in one's career that is not so easily available to people who work in industry.

These days, it's hard to be a polymath even inside one's own discipline. To know all sub-areas of computer science, say, as well as the experts in those sub-areas is a daunting challenge. I think back to the effort my fellow students and I put in over the years that enabled us to take the Ph.D. qualifying exams in CS. I did quite well across the board, but even then I didn't understand operating systems or programming languages as well as experts in those areas. Many years later, despite continued reading and programming, the gap has only grown.

I share the vague sense of loss, expressed by the author of the article linked to above, of a time when one human could master multiple areas of discourse and make fundamental advances to several. We are certainly better off for collectively understanding the world so much better, but the result is a blow to a certain sort of individual mind and spirit.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

January 26, 2014 3:05 PM

One Reason We Need Computer Programs

Code bridges the gap between theory and data. From A few thoughts on code review of scientific code:

... there is a gulf of unknown size between the theory and the data. Code is what bridges that gap, and specifies how edge cases, weird features of the data, and unknown unknowns are handled or ignored.

I learned this lesson the hard way as a novice programmer. Other activities, such as writing and doing math, exhibit the same characteristic, but it wasn't until I started learning to program that the gap between theory and data really challenged me.
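A tiny, hypothetical example of the gap (mine, not one from Brown's post): the theory of an average is a one-line formula, but code that meets real data has to decide what to do about missing values and empty input.

    def mean(values):
        """Average a list of measurements, skipping missing entries (None)."""
        observed = [v for v in values if v is not None]
        if not observed:
            # The formula never mentions this case; the code must decide.
            raise ValueError("no observations to average")
        return sum(observed) / len(observed)

    print(mean([2.0, None, 4.0]))   # 3.0 -- the missing reading is ignored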

Since learning to program myself, I have observed hundreds of CS students encounter this gap. To their credit, they usually buckle down, work hard, and close the gap. Of course, we have to close the gap for every new problem we try to solve. The challenge doesn't go away; it simply becomes more manageable as we become better programmers.

In the passage above, Titus Brown is talking to his fellow scientists in biology and chemistry. I imagine that they encounter the gap between theory and data in a new and visceral way when they move into computational science. Programming has that power to change how we think.

There is an element of this, too, in how techies and non-techies alike sometimes lose track of how hard it is to create a successful start up. You need an idea, you need a programmer, and you need a lot of hard work to bridge the gap between idea and executed idea.

Whether doing science or starting a company, the code teaches us a lot about our theory. The code makes our theory better.

As Ward Cunningham is fond of saying, it's all talk until the tests run.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

December 18, 2013 3:31 PM

Favorite Passages from Today's Reading

From The End of the Facebook Era:

This is why social networks [like Google+] are struggling even more than Facebook to get a foothold in the future of social networking. They are betting on last year's fashion -- they're fighting Facebook for the last available room on the Titanic when they should be looking at all of the other ships leaving the marina.

A lot of people and organizations in this world are fighting over the last available room on their sector's version of the Titanic. Universities may well be among them. Who is leaving the marina?

From We Need to Talk About TED:

Astrophysics run on the model of American Idol is a recipe for civilizational disaster.

...

TED's version [of deep technocultural shift] has too much faith in technology, and not nearly enough commitment to technology. It is placebo technoradicalism, toying with risk so as to re-affirm the comfortable.

I like TED talks as much as the next person, but I often wonder how much change they cause in the world, as opposed to serving merely as chic entertainment for the comfortable First World set.


Posted by Eugene Wallingford | Permalink | Categories: General

December 17, 2013 3:32 PM

Always Have At Least Two Alternatives

Paraphrasing Kent Beck:

Whenever I write a new piece of code, I like to have at least two alternatives in mind. That way, I know I am not doing the worst thing possible.

I heard Kent say something like this at OOPSLA in the late 1990s. This is advice I give often to students and colleagues, but I've never had a URL that I could point them to.

It's tempting for programmers to start implementing the first good idea that comes to mind. It's especially tempting for novices, who sometimes seem surprised that they have even one good idea. Where would a second one come from?

More experienced students and programmers sometimes trust their skill and experience a little too easily. That first idea seems so good, and I'm a good programmer... Famous last words. Reality eventually catches up with us and helps us become more humble.

Some students are afraid: afraid they won't get done if they waste time considering alternatives, or afraid that they will choose wrong anyway. Such students need more confidence, the kind born out of small successes.

I think the most likely explanation for why beginners don't already seek alternatives is quite simple. They have not developed the design habit. Kent's advice can be a good start.

One pithy statement is often enough of a reminder for more experienced programmers. By itself, though, it probably isn't enough for beginners. But it can be an important first step for students -- and others -- who are in the habit of doing the first thing that pops into their heads.

Do note that this advice is consistent with XP's counsel to do the simplest thing that could possibly work. "Simplest" is a superlative. Grammatically, that suggests having at least three options from which to choose!
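For what it's worth, here is a small, hypothetical illustration of what holding two alternatives in mind can look like -- my example, not Kent's:

    # Task: count how often each word appears in a document.

    # Alternative 1: an explicit loop over a plain dictionary.
    def word_counts_loop(words):
        counts = {}
        for w in words:
            counts[w] = counts.get(w, 0) + 1
        return counts

    # Alternative 2: lean on the standard library.
    from collections import Counter

    def word_counts_library(words):
        return Counter(words)

Neither is the worst thing possible, and for a task this small the choice hardly matters. The point is the habit: having both in mind turns the first idea into a decision rather than a default.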


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

December 03, 2013 3:17 PM

The Workaday Byproducts of Striving for Higher Goals

Should we set audacious goals? In his piece about the Snowfall experiment, David Sleight says yes, and not simply for the immediate end:

The benefits go beyond the plainly obvious. You need good R&D for the same reason you need a good space program. It doesn't just get you to the Moon. It gives you things like memory foam, scratch-resistant lenses, and Dustbusters. It gets you the workaday byproducts of striving for higher goals.

I showed that last sentence a little Twitter love, because it's something people often forget to consider, both when they are working in the trenches and when they are selecting projects to work on. An ambitious project may have a higher risk of failure than something more mundane, but it also has a higher chance of producing unexpected value in the form of new tools and improved process.

This is also something that university curricula don't do well. We tend to design learning experiences that fit neatly into a fifteen-week semester, with predictable gains for our students. That sort of progress is important, of course, but it misses out on opportunities for students to produce their own workaday byproducts. And that's an important experience for students to have.

It also gives a bad example of what learning should feel like, and what it should do for us. Students generally learn what we teach them, or what we make easiest for them to learn. If we always set before them tasks of known, easily-understood dimensions, then they will have to learn after leaving us that the world doesn't usually work like that.

This is one of the reasons I am such a fan of project-based computer science education, as in the traditional compiler course. A compiler is an audacious enough goal for most students that they get to discover their own personal memory foam.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

November 26, 2013 1:38 PM

Saying Thanks, and Giving Back

When someone asked Benjamin Franklin why he had declined to seek a patent for his famous stove, he said:

I declined it from a principle which has ever weighed with me on such occasions, that as we enjoy great advantages from the inventions of others, we should be glad of an opportunity to serve others by any invention of ours.

This seems a fitting sentiment to recall as I look forward to a few days of break with my family for Thanksgiving. I know I have a lot to be thankful for, not the least of which are the inventions of so many others that confer great advantage on me. This week, I give thanks for these creations, and for the creators who shared them with me.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

November 21, 2013 3:06 PM

Agile Thoughts, Healthcare.gov Edition

Clay Shirky explains the cultural attitudes that underlie Healthcare.gov's problems in his recent essay on the gulf between planning and reality. The danger of this gulf exists in any organization, whether business or government, but especially in large organizations. As the number of levels grows between the most powerful decision makers and the workers in the trenches, there is an increasing risk of developing "a culture that prefers deluding the boss over delivering bad news".

But this is also a story of the danger inherent in so-called Big Design Up Front, especially for a new kind of product. Shirky oversimplifies this as the waterfall method, but the basic idea is the same:

By putting the most serious planning at the beginning, with subsequent work derived from the plan, the waterfall method amounts to a pledge by all parties not to learn anything while doing the actual work.

You may learn something, of course; you just aren't allowed to let it change what you build, or how.

Instead, waterfall insists that the participants will understand best how things should work before accumulating any real-world experience, and that planners will always know more than workers.

If the planners believe this, or they allow the workers to think they believe this, then workers will naturally avoid telling their managers what they have learned. In the best case, they don't want to waste anyone's time if sharing the information will have no effect. In the worst case, they might fear the results of sharing what they have learned. No one likes to admit that they can't get the assigned task done, however unrealistic it is.

As Shirky notes, many people believe that a difficult launch of Healthcare.gov was unavoidable, because political and practical factors prevented developers from testing parts of the project as they went along and adjusting their actions in response. Shirky hits this one out of the park:

That observation illustrates the gulf between planning and reality in political circles. It is hard for policy people to imagine that Healthcare.gov could have had a phased rollout, even while it is having one.

You can learn from feedback earlier, or you can learn from feedback later. Pretending that you can avoid problems you already know exist never works.

One of the things I like about agile approaches to software development is that they encourage us not to delude ourselves, or our clients. Or our bosses.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading, Software Development

November 14, 2013 2:55 PM

Toward A New Data Science Culture in Academia

Fernando Perez has a nice write-up, An Ambitious Experiment in Data Science, describing a well-funded new project in which teams at UC Berkeley, the University of Washington, and NYU will collaborate to "change the culture of universities to create a data science culture". A lot of people have been quoting Perez's entry for its colorful assessment of academic incentives and reward structures. I like this piece for the way Perez defines and outlines the problem, in terms of both data science across disciplines and academic culture in general.

For example:

Most scientists are taught to treat computation as an afterthought. Similarly, most methodologists are taught to treat applications as an afterthought.

"Methodologists" here includes computer scientists, who are often more interested in new data structures, algorithms, and protocols than in the applications they serve.

This "mirror" disconnect is a problem for a reason many people already understand well:

Computation and data skills are all of a sudden everybody's problem.

(Here are a few past entries of mine that talk about how programming and the nebulous "computational thinking" have spread far and wide: 1 | 2 | 3 | 4.)

Perez rightly points out that the open-source software world, while imperfect, often embodies the principles of science and scientific collaboration better than the academy does. It will be interesting to see how well this data science project can inject OSS attitudes into big research universities.

He is concerned because, as I have noted before, universities are, as a whole, a conservative lot. Perez says this in a much more entertaining way:

There are few organizations more proud of their traditions and more resistant to change than universities (churches and armies might be worse, but that's about it).

I think he gives churches and armies more credit than they deserve.

The good news is that experiments of the sort being conducted in the Berkeley/UW/NYU project are springing up on a smaller scale around the world. There is some hope for big change in academic culture if a lot of different people at a lot of different institutions experiment, learn, and create small changes that can grow together as they bump into one another.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 09, 2013 12:25 PM

An Unusual Day

My university is hosting an on-campus day today to recruit high school students and transfer students. On a day like this, I usually visit with one or two potential majors and chat with one or two others who might be interested in a CS or programming class. All are usually men.

Today was unusual.

Eight people visited the department to learn about the major.

I spoke with three people who intend to major in other areas, such as accounting and physics, and want to take a minor in CS.

I spoke with a current English major here who is set to graduate in May but is now thinking about employability and considering picking up a second degree in CS.

I spoke with three female students who are interested in CS. These include the English major and a student who has taken several advanced math courses at a good private school nearby, really likes them, and is thinking of combining math and CS in a major here.

The third is a high school freshman who has taken all the tech courses available at her school, helps the tech teacher with the school's computers, and wants to learn more. She told me, "I just think it would be cool to write programs and make things happen."

Some recruiting days are better than others. This is one.


Posted by Eugene Wallingford | Permalink | Categories: General

October 30, 2013 11:41 AM

Discipline Can Be Structural As Well As Personal

There is a great insight in an old post by Brian Marick, Discipline and Skill, which I re-read this week. The topic sentence asserts:

Discipline can be a personal virtue, but it must also be structural.

Extreme Programming illustrates this claim. It draws its greatest power from the structural discipline it creates for developers. Marick goes on:

For example, one of the reasons to program in pairs is that two people are less likely to skip a test than one is. Removing code ownership makes it more likely someone within glaring distance will see that you didn't leave code as clean as you should have. The business's absolute insistence on getting working -- really working -- software at frequent intervals makes the pain of sloppiness strike home next month instead of next year, stiffening the resolve to do the right thing today.

XP consists of a lot of relatively simple actions, but simple actions can be hard to perform, especially consistently and especially in opposition to deeply ingrained habits. XP practices work together to create structural discipline that helps developers "do the right thing".

We see the use of social media playing a similar role these days. Consider diet. People who are trying to lose weight or exercise more have to do some pretty simple things. Unfortunately, those things are not easy to do consistently, and they are opposed by deep personal and cultural habits. In order to address this, digital tool providers like FitBit make it easy for users to sync their data to a social media account and share with others.

This is a form of social discipline, supported by tools and practices that give structure to the actions people want to take. Just like XP. Many behaviors in life work this way.

(Of course, I'm already on record as saying that XP is a self-help system. I have even fantasized about XP's relationship to self-help in the cinema.)


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

October 12, 2013 11:27 AM

StrangeLoop: This and That, Volume 3

[My notes on StrangeLoop 2013: Table of Contents]

Six good talks a day is about my limit. Seven for sure. Each creates so much mental activity that my brain soon loses the ability to absorb more. Then, I need a walk.

~~~~

After Jenny Finkel's talk on machine learning, someone asked if Prismatic's system had learned any features or weights that she found surprising. I thought her answer was interesting. I paraphrase: "No. As a scientist, you should understand why the system is the way that it is, or find the bug if it shouldn't be that way."

In a way, this missed the point. I'm guessing the questioner was looking to hear about a case that required them to dig in because the answer was correct but they didn't know why yet, or incorrect and the bug wasn't obvious. But Finkel's answer shows how matter-of-fact scientists can be about what they find. The world is as it is, and scientists try to figure out why. That's all.

~~~~

The most popular corporate swag this year was stickers to adorn one's laptop case. I don't put stickers on my gear, but I like looking at other people's stickers. My favorites were the ones that did more than simply display the company name. Among them were asynchrony:

asynchrony laptop sticker

-- which is a company name but also a fun word in its own right -- and data-driven:

O'Reilly laptop sticker

-- by O'Reilly. I also like the bound, graph-paper notebooks that O'Reilly hands out. Classy.

~~~~

In a previous miscellany I mentioned Double Multitasking Guy. Not me, not this time. I carried no phone, as usual, and this time I left my laptop back in the hotel room. Not having any networked technology in hand creates a different experience, if not a better one.

Foremost, having no laptop affects my blogging. I can't take notes as quickly, or as voluminously. One of the upsides of this is that it's harder for me to distract myself by writing complete sentences or fact-checking vocabulary and URLs. Quick, what is the key idea here? What do I need to look up? What do I need to learn next?

~~~~

With video recording now standard at tech conferences, and with StrangeLoop releasing its videos so quickly now, a full blow-by-blow report of each talk becomes somewhat less useful. Some people find summary reports helpful, though, because they don't want to watch the full talks or don't have the time to do so. Short reports let these folks keep tabs on the state of the world. Others are looking for some indication of whether they want to invest the time to watch.

For me, the reports serve another useful purpose. They let me do a little light analysis and share my personal impressions of what I hear and learn. Fortunately, that sort of blog entry still finds an audience.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

September 28, 2013 12:17 PM

StrangeLoop: This and That, Volume 2

[My notes on StrangeLoop 2013: Table of Contents]

I am at a really good talk and look around the room. So many people are staring at their phones, scrolling away. So many others are staring at their laptops, typing away. The guy next to me: doing both at the same time. Kudos, sir. But you may have missed the point.

~~~~

Conference talks are a great source of homework problems. Sometimes, the talk presents a good problem directly. Other times, watching the talk sets my subconscious mind in motion, and it creates something useful. My students thank you. I thank you.

~~~~

Jenny Finkel talked about the difference between two kinds of recommenders: explorers, who forage for new content, and exploiters, who want to see what's already popular. The former discovers cool new things occasionally but fails occasionally, too. The latter is satisfied most of the time but rarely surprised. As a conference-goer, I felt this distinction at play in my own head this year. When selecting the next talk to attend, I have to take a few risks if I ever hope to find something unexpected. But when I fail, a small regret tugs at me.
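For readers who would like the explore/exploit distinction in code, here is a rough sketch of an epsilon-greedy chooser -- an illustration of the general idea, not anything Prismatic actually runs:

    import random

    def choose_talk(talks, ratings, epsilon=0.2):
        """Usually pick the best-rated talk so far (exploit);
           occasionally pick one at random (explore)."""
        if random.random() < epsilon:
            return random.choice(talks)                        # explore: take a risk
        return max(talks, key=lambda t: ratings.get(t, 0))     # exploit: stick with a known winner

    ratings = {"compilers": 9, "machine learning": 8}
    print(choose_talk(["compilers", "machine learning", "data compression"], ratings))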

~~~~

We heard a lot of confident female voices on the StrangeLoop stages this year. Some of these speakers have advanced academic degrees, or at least experience in grad school.

~~~~

The best advice I received on Day 1 perhaps came not from a talk but from the building:

The 'Do not Climb on Bears' sign on a Peabody statue

"Please do not climb on bears." That sounds like a good idea most everywhere, most all the time.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

September 23, 2013 4:22 PM

StrangeLoop: This and That, Volume 1

[My notes on StrangeLoop 2013: Table of Contents]

the Peabody Opera House's Broadway series poster

I'm working on a post about the compiler talks I attended, but in the meantime here are a few stray thoughts, mostly from Day 1.

The Peabody Opera House really is a nice place to hold a conference of this size. If StrangeLoop were to get much larger, it might not fit.

I really don't like the word "architected".

The talks were scheduled pretty well. Only once in two days did I find myself really wanting to go to two talks at the same time. And only once did I hear myself thinking, "I don't want to hear any of these...".

My only real regret from Day 1 was missing Scott Vokes's talk on data compression. I enjoyed the talk I went to well enough, but I think I would have enjoyed this one more.

What a glorious time to be a programming language theory weenie. Industry practitioners are going to conferences and attending talks on dependent types, continuations, macros, immutable data structures, and functional reactive programming.

Moon Hooch? Interesting name, interesting sound.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

August 29, 2013 4:31 PM

Asimov Sees 2014, Through Clear Eyes and Foggy

Isaac Asimov, circa 1991

A couple of years ago, I wrote Psychohistory, Economics, and AI, in which I mentioned Isaac Asimov and one way that he had influenced me. I never read Asimov or any other science fiction expecting to find accurate predictions of future. What drew me in was the romance of the stories, dreaming "what if?" for a particular set of conditions. Ultimately, I was more interested in the relationships among people under different technological conditions than I was in the technology itself. Asimov was especially good at creating conditions that generated compelling human questions.

Some of the scenarios I read in Asimov's SF turned out to be wildly wrong. The world today is already more different from the 1950s than the world of the Foundation, set thousands of years in the future. Others seem eerily on the mark. Fortunately, accuracy is not the standard by which most of us judge good science fiction.

But what of speculation about the near future? A colleague recently sent me a link to Visit to the World's Fair of 2014, an article Asimov wrote in 1964 speculating about the world fifty years hence. As I read it, I was struck by just how far off he was in some ways, and by how close he was in others. I'll let you read the story for yourself. Here are a few selected passages that jumped out at me.

General Electric at the 2014 World's Fair will be showing 3-D movies of its "Robot of the Future," neat and streamlined, its cleaning appliances built in and performing all tasks briskly. (There will be a three-hour wait in line to see the film, for some things never change.)

3-D movies are now common. Housecleaning robots are not. And while some crazed fans will stand in line for many hours to see the latest comic-book blockbuster, going to a theater to see a movie has become a much less important part of the culture. People stream movies into their homes and into their hands. My daughter teases me for caring about the time any TV show or movie starts. "It's on Hulu, Dad." If it's not on Hulu or Netflix or the open web, does it even exist?

Any number of simultaneous conversations between earth and moon can be handled by modulated laser beams, which are easy to manipulate in space. On earth, however, laser beams will have to be led through plastic pipes, to avoid material and atmospheric interference. Engineers will still be playing with that problem in 2014.

There is no one on the moon with whom to converse. Sigh. The rest of this passage sounds like fiber optics. Our world is rapidly becoming wireless. If your device can't connect to the world wireless web, does it even exist?

In many ways, the details of technology are actually harder to predict correctly than the social, political, economic implications of technological change. Consider:

Not all the world's population will enjoy the gadgety world of the future to the full. A larger portion than today will be deprived and although they may be better off, materially, than today, they will be further behind when compared with the advanced portions of the world. They will have moved backward, relatively.

Spot on.

When my colleague sent me the link, he said, "The last couple of paragraphs are especially relevant." They mention computer programming and a couple of its effects on the world. In this regard, Asimov's predictions meet with only partial success.

The world of A.D. 2014 will have few routine jobs that cannot be done better by some machine than by any human being. Mankind will therefore have become largely a race of machine tenders. Schools will have to be oriented in this direction. ... All the high-school students will be taught the fundamentals of computer technology, will become proficient in binary arithmetic and will be trained to perfection in the use of the computer languages that will have developed out of those like the contemporary "Fortran" (from "formula translation").

The first part of this paragraph is becoming truer every day. Many people husband computers and other machines as they do tasks we used to do ourselves. The second part is, um, not true. Relatively few people learn to program at all, let alone master a programming language. And how many people understand this t-shirt without first receiving an impromptu lecture on the street?

Again, though, Asimov is perhaps closer on what technological change means for people than on which particular technological changes occur. In the next paragraph he says:

Even so, mankind will suffer badly from the disease of boredom, a disease spreading more widely each year and growing in intensity. This will have serious mental, emotional and sociological consequences, and I dare say that psychiatry will be far and away the most important medical specialty in 2014. The lucky few who can be involved in creative work of any sort will be the true elite of mankind, for they alone will do more than serve a machine.

This is still speculation, but it is already more true than most of us would prefer. How much truer will it be in a few years?

My daughters will live most of their lives post-2014. That worries the old fogey in me a bit. But it excites me more. I suspect that the next generation will figure the future out better than mine, or the ones before mine, can predict it.

~~~~

PHOTO. Isaac Asimov, circa 1991. Britannica Online for Kids. Web. 2013 August 29. http://kids.britannica.com/comptons/art-136777.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

August 28, 2013 3:07 PM

Risks and the Entrepreneurs Who Take Them

Someone on the SIGCSE mailing list posted a link to an article in the Atlantic that explores a correlation between entrepreneurship, teenaged delinquency, and white male privilege. The article starts with

It does not strike me as a coincidence that a career path best suited for mild high school delinquents ends up full of white men.

and concludes with

To be successful at running your own company, you need a personality type that society is a lot more forgiving of if you're white.

The sender of the link was curious what educational implications these findings have, if any, for how we treat academic integrity in the classroom. That's an interesting question, though my personal tendency to follow rules and not rock the boat has always made me more sensitive to the behavior of students who employ the aphorism "ask for forgiveness, not permission" a little too cavalierly for my taste.

My first reaction to the claims of this article was tied to how I think about the kinds of risks that entrepreneurs take.

When most people in the start-up world talk about taking risks, they are talking about the risk of failure and, to a lesser extent, the risk of being unconventional, not the risk of being caught doing something wrong. In my personal experience, the only delinquent behavior our entrepreneurial former students could be accused of is not doing their homework as regularly as they should. Time spent learning for their business is time not spent on my course. But that's not delinquent behavior; it's curiosity focused somewhere other than my classroom.

It's not surprising, though, that teens who were willing to take legal risks are more likely to take business risks, and (sadly) legal risks in their businesses. Maybe I've simply been lucky to have worked with students and other entrepreneurs of high character.

Of course, there is almost certainly a white male privilege associated with the risk of failure, too. White males are often better positioned financially and socially than women or minorities to start over when a company fails. It's also easier to be unconventional and stand out from the crowd when you don't already stand out from the crowd due to your race or gender. That probably accounts for the preponderance of highly-educated white men in start-ups better than a greater willingness to partake in "aggressive, illicit, risk-taking activities".


Posted by Eugene Wallingford | Permalink | Categories: General

July 24, 2013 11:44 AM

Headline: "Dinosaurs Object to Meteor's Presence"

Don't try to sell a meteor to a dinosaur...

Nate Silver recently announced that he is leaving the New York Times for ESPN. Margaret Sullivan offers some observations on the departure and some insight into how political writers at the Times viewed Silver and his work:

... Nate disrupted the traditional model of how to cover politics.

His entire probability-based way of looking at politics ran against the kind of political journalism that The Times specializes in: polling, the horse race, campaign coverage, analysis based on campaign-trail observation, and opinion writing. ...

His approach was to work against the narrative of politics. ...

A number of traditional and well-respected Times journalists disliked his work. The first time I wrote about him I suggested that print readers should have the same access to his writing that online readers were getting. I was surprised to quickly hear by e-mail from three high-profile Times political journalists, criticizing him and his work. ...

Maybe Silver decided to acquiesce to Hugh MacLeod's advice. Maybe he just got a better deal.

The world changes, whether we like it or not. The New York Times and its journalists probably have the reputation, the expertise, and the strong base they need to survive the ongoing changes in journalism, with or without Silver. Other journalists don't have the luxury of being so cavalier.

I don't know any more about attitudes inside the New York Times than what I see reported in the press, but Sullivan's article made me think of one of Anil Dash's ten rules of the internet:

When a company or industry is facing changes to its business due to technology, it will argue against the need for change based on the moral importance of its work, rather than trying to understand the social underpinnings.

I imagine that a lot of people at the Times are indeed trying to understand the social underpinnings of the changes occurring in the media and trying to respond in useful ways. But that doesn't mean that everyone on the inside is, or even that the most influential and high-profile people in the trenches are. And that adds an internal social challenge to the external technological challenge.

Alas, we see much the same dynamic playing out in universities across the country, including my own. Some dinosaurs have been around for a long time. Others are near the beginning of their careers. The internal social challenges are every bit as formidable as the external economic and technological ones.


Posted by Eugene Wallingford | Permalink | Categories: General

July 23, 2013 9:46 AM

Some Meta-Tweeting Silliness

my previous tweet
missed an opportunity,
should have been haiku

(for @fogus)


Posted by Eugene Wallingford | Permalink | Categories: General

July 15, 2013 2:41 PM

Version Control for Writers and Publishers

Mandy Brown again, this time on writing tools without memory:

I've written of the web's short-term memory before; what Manguel trips on here is that such forgetting is by design. We designed tools to forget, sometimes intentionally so, but often simply out of carelessness. And we are just as capable of designing systems that remember: the word processor of today may admit no archive, but what of the one we build next?

This is one of those places where the software world has a tool waiting to reach a wider audience: the version control system. Programmers using version control can retrieve previous states of their code all the way back to its creation. The granularity of the versions is limited only by the frequency with which they "commit" the code to the repository.

The widespread adoption of version control and the existence of public histories at places such as GitHub have even given rise to a whole new kind of empirical software engineering, in which we mine a large number of repositories in order to understand better the behavior of developers in actual practice. Before, we had to contrive experiments, with no assurance that devs behaved the same way under artificial conditions.

Word processors these days usually have an auto-backup feature to save work as the writer types text. Version control could be built into such a feature, giving the writer access to many previous versions without the need to commit changes explicitly. But the better solution would be to help writers learn the value of version control and develop the habits of committing changes at meaningful intervals.

Digital version control offers several advantages over the writer's (and programmer's) old-style history of print-outs of previous versions, marked-up copy, and notebooks. An obvious one is space. A more important one is the ability to search and compare old versions more easily. We programmers benefit greatly from a tool as simple as diff, which can tell us the textual differences between two files. I use diff on non-code text all the time and imagine that professional writers could use it to better effect than I.
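As a small sketch of what this can look like for prose, Python's standard difflib module does the same job as diff; the file names below are hypothetical:

    import difflib

    with open("chapter1_monday.txt") as f:
        draft_v1 = f.read().splitlines()
    with open("chapter1_friday.txt") as f:
        draft_v2 = f.read().splitlines()

    # Print which lines were added, removed, or changed between the two drafts.
    for line in difflib.unified_diff(draft_v1, draft_v2,
                                     fromfile="Monday's draft",
                                     tofile="Friday's draft",
                                     lineterm=""):
        print(line)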

The use of version control by programmers leads to profound changes in the practice of programming. I suspect that the same would be true for writers and publishers, too.

Most version control systems these days work much better with plain text than with the binary data stored by most word processing programs. As discussed in my previous post, there are already good reasons for writers to move to plain text and explicit mark-up schemes. Version control and text analysis tools such as diff add another layer of benefit. Simple mark-up systems like Markdown don't even impose much burden on the writer, resembling as they do how so many of us used to prepare text in the days of the typewriter.

Some non-programmers are already using version control for their digital research. Check out William Turkel's How To for doing research with digital sources. Others, such as The Programming Historian and A Companion to Digital Humanities, don't seem to mention it. But these documents refer mostly to programs for working with text. The next step is to encourage adoption of version control for writers doing their own thing: writing.

Then again, it has taken a long time for version control to gain such widespread acceptance even among programmers, and it's not yet universal. So maybe adoption among writers will take a long time, too.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 11, 2013 2:57 PM

Talking to the New University President about Computer Science

Our university recently hired a new president. Yesterday, he and the provost came to a meeting of the department heads in humanities, arts, and sciences, so that he could learn a little about the college. The dean asked each head to introduce his or her department in one minute or less.

I came in under a minute, as instructed. Rather than read a litany of numbers that he can read in university reports, I focused on two high-level points:

  • Major enrollment has recovered nicely since the deep trough after the dot.com bust and is now steady. We have near-100% placement, but local and state industry could hire far more graduates.
  • For the last few years we have also been working to reach more non-majors, which is a group we under-serve relative to most other schools. This should be an important part of the university's focus on STEM and STEM teacher education.

I closed with a connection to current events:

We think that all university graduates should understand what 'metadata' is and what computer programs can do with it -- enough so that they can understand the current stories about the NSA and be able to make informed decisions as a citizen.

I hoped that this would be provocative and memorable. The statement elicited laughs and head nods all around. The president commented on the Snowden case, asked me where I thought he would land, and made an analogy to The Man Without a Country. I pointed out that everyone wants to talk about Snowden, including the media, but that's not even the most important part of the story. Stories about people are usually of more interest than stories about computer programs and fundamental questions about constitutional rights.

I am not sure how many people believe that computer science -- or at least the foundations of computing in the modern world -- is a necessary part of a university education these days. Some schools have computing or technology requirements, and there is plenty of press for the "learn to code" meme, even beyond the CS world. But I wonder how many US university graduates in 2013 understand enough computing (or math) to understand this clever article and apply that understanding to the world they live in right now.

Our new president seemed to understand. That could bode well for our department and university in the coming years.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 08, 2013 1:05 PM

A Random Thought about the Metadata and Government Surveillance

In a recent mischievous mood, I decided it might be fun to see the following.

The next whistleblower with access to all the metadata that the US government is storing on its citizens assembles a broad list of names: Republican and Democrat; legislative, executive, and judicial branches; public officials and private citizens. The only qualification for getting on the list is that the person has uttered any variation of the remarkably clueless statement, "If you aren't doing anything wrong, then you have nothing to hide."

The whistleblower then mines the metadata and, for each person on this list, publishes a brief that demonstrates just how much someone with that data can conclude -- or insinuate -- about a person.

If they haven't done anything wrong, then they don't have anything to worry about. Right?


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

June 10, 2013 2:41 PM

Unique in Exactly the Same Way

Ah, the idyllic setting of my youth:

When people refer to "higher education" in this country, they are talking about two systems. One is élite. It's made up of selective schools that people can apply to -- schools like Harvard, and also like U.C. Santa Cruz, Northeastern, Penn State, and Kenyon. All these institutions turn most applicants away, and all pursue a common, if vague, notion of what universities are meant to strive for. When colleges appear in movies, they are verdant, tree-draped quadrangles set amid Georgian or Gothic (or Georgian-Gothic) buildings. When brochures from these schools arrive in the mail, they often look the same. Chances are, you'll find a Byronic young man reading "Cartesian Meditations" on a bench beneath an elm tree, or perhaps his romantic cousin, the New England boy of fall, a tousle-haired chap with a knapsack slung back on one shoulder. He is walking with a lovely, earnest young woman who apparently likes scarves, and probably Shelley. They are smiling. Everyone is smiling. The professors, who are wearing friendly, Rick Moranis-style glasses, smile, though they're hard at work at a large table with an eager student, sharing a splayed book and gesturing as if weighing two big, wholesome orbs of fruit. Universities are special places, we believe: gardens where chosen people escape their normal lives to cultivate the Life of the Mind.

I went to a less selective school than the ones mentioned here, but the vague ideal of higher education was the same. I recognized myself, vaguely, in the passage about the tousle-haired chap with a knapsack, though on a Midwestern campus. I certainly pined after a few lovely, earnest young women with a fondness for scarves and the Romantic poets in my day. These days, I have become the friendly, glasses-wearing, always-smiling prof in the recruiting photo.

The descriptions of movie scenes and brochures, scarves and Shelley and approachable professors, reminded me most of something my daughter told me as she waded through recruiting literature from so many schools a few years ago, "Every school is unique, dad, in exactly the same way." When the high school juniors see through the marketing facade of your pitch, you are in trouble.

That unique-in-the-same-way character of colleges and university pitches is a symptom of what lies at the heart of the coming "disruption" of what we all think of as higher education. The traditional ways for a school to distinguish itself from its peers, and even from schools it thinks of as lesser rivals, are becoming less effective. I originally wrote "disappearing", but they are now ubiquitous, as every school paints the same picture, stresses the same positive attributes, and tries not to talk too much about the negatives they and their peers face. Too many schools chasing too few tuition-paying customers accelerates the process.

Trying to protect the ideal of higher education is a noble effort now being conducted in the face of a rapidly changing landscape. However, the next sentence of the recent New Yorker article Laptop U, from which the passage quoted above comes, reminds us:

But that is not the kind of higher education most Americans know. ...

It is the other sort of higher education that will likely be the more important battleground on which higher ed is disrupted by technology.

We are certainly beginning to have such conversations at my school, and we are starting to hear rumblings from outside. My college's dean and our new university president recently visited the Fortune 100 titan that dominates local industry. One of the executives there gave them several documents they've been reading there, including "Laptop U" and the IPPR report mentioned in it, "An Avalanche is Coming: Higher Education and the Revolution Ahead".

It's comforting to know your industry partners value you enough to want to help you survive a coming revolution. It's also hard to ignore the revolution when your partners begin to take for granted that it will happen.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

June 07, 2013 1:53 PM

Sentences to Ponder

Henry Rollins:

When one beast dumps you, summon the guts to find another. If it tries to kill you, the party has definitely started. Otherwise, life is a slow retirement.

Rollins is talking about why he's not making music anymore, but his observation applies to other professions. We all know programmers who are riding out the long tail of an intellectual challenge that died long ago. College professors, too.

I have to imagine that this is a sad life. It certainly leaves a lot of promise unfulfilled.

If you think you have a handle on the beast, then the beast has probably moved on. Find a new beast with which to do battle.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

June 04, 2013 2:43 PM

A Simple Confession

My Unix-toting brethren may revoke my CS card for saying this, but I really do like to install programs this way:

    Installing ApplicationX

    1. Open the disk image
    2. Drag ApplicationX to your Applications folder
    3. Eject the disk image

The app loses points if I really have to drag it to the Applications folder. The Desktop should do.

I understand the value in ./configure and ./make and setting paths and... but it sure is nice when I don't have to use them.


Posted by Eugene Wallingford | Permalink | Categories: General

May 31, 2013 1:44 PM

Quotes of the Week, in Four Dimensions

Engineering.

Michael Bernstein, in A Generation Ago, A Thoroughly Modern Sampling:

The AI Memos are an extremely fertile ground for modern research. While it's true that what this group of pioneers thought was impossible then may be possible now, it's even clearer that some things we think are impossible now have been possible all along.

When I was in grad school, we read a lot of new and recent research papers. But the most amazing, most educational, and most inspiring stuff I read was old. That's often true today as well.

Science.

Financial Agile tweets:

"If it disagrees with experiment, it's wrong". Classic.

... with a link to The Scientific Method with Feynman, which has a wonderful ten-minute video of the physicist explaining how science works. Among its important points is that guessing is a huge part of science. It's just that scientists have a way of telling which guesses are right and which are wrong.

Teaching.

James Boyk, in Six Words:

Like others of superlative gifts, he seemed to think the less gifted could do as well as he, if only they knew a few powerful specifics that could readily be conveyed. Sometimes he was right!

"He" is Leonid Hambro, who played with Victor Borge and P. D. Q. Bach but was also well-known as a teacher and composer. Among my best teachers have been some extraordinarily gifted people. I'm thankful for the time they tried to convey their insights to the likes of me.

Art.

Amanda Palmer, in a conference talk:

We can only connect the dots that we collect.

Palmer uses this sentence to explain in part why all art is about the artist, but it means something more general, too. You can build, guess, and teach only with the raw materials that you assemble in your mind and your world. So collect lots of dots. In this more prosaic sense, Palmer's sentence applies not only to art but also to engineering, science, and teaching.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

May 26, 2013 9:45 AM

Programming Magic and Business Skeuomorphism

Designer Craig Mod offers Marco Arment's The Magazine as an exemplar of Subcompact Publishing in the digital age -- "No cruft, all substance. A shadow on the wall." -- a minimal disruptor that capitalizes on the digital medium without tying itself down with the strictures of twentieth-century hardcopy technology.

After detailing the advantages of Arment's approach, Mod points out the primary disadvantage: you have to be able to write an iOS application. Which leads to this gem:

The fact that Marco -- a programmer -- launched one of the most 'digitally indigenous' contemporary tablet publications is indicative of two things:
  1. Programmers are today's magicians. In many industries this is obvious, but it's now becoming more obvious in publishing. Marco was able to make The Magazine happen quickly because he saw that Newsstand was underutilized and understood its capabilities. He knew this because he's a programmer. Newsstand wasn't announced at a publishing conference. It was announced at the WWDC.
  2. The publishing ecosystem is now primed for complete disruption.

If you are a non-programmer with ideas, don't think "I just need a programmer"; instead think, "I need a technical co-founder." A lot of people think of programming as Other, as a separate world from what they do. Entrepreneurs such as Arment, and armies of young kids writing video games and apps for their friends, know instead that it is a tool they can use to explore their interests.

Mod offers a nice analogy from the design world to explain why entrenched industry leaders and even prospective entrepreneurs tend to fall into the trap of mimicking old technology in their new technologies: business skeuomorphism.

For example, designers "bring the mechanical camera shutter sound to digital cameras because it feels good" to users. In a similar way, a business can transfer a decision made under the constraints of one medium or market into a new medium or market in which the constraints no longer apply. Under new constraints, and with new opportunities, the decision is no longer a good one, let alone necessary or optimal.

As usual, I am thinking about how these ideas relate to the disruption of university education. In universities, as in the publishing industry, business skeuomorphism is rampant. What is the equivalent of the Honda N360 in education? Is it Udacity or Coursera? Enstitute? Or something simpler?


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

May 17, 2013 3:26 PM

Pirates and Tenure

I recently read The Sketchbook of Susan Kare, "the Artist Who Gave Computing a Human Face", which referred to the Apple legend of the Pirate Flag:

[Kare's] skull-and-crossbones design would come in handy when Jobs issued one of his infamous motivational koans to the Mac team: "It's better to be a pirate than join the Navy."

For some reason, that line brought to mind a favorite saying of one of my friends, Sid Kitchel:

Real men don't accept tenure.

If by some chance they do accept tenure, they should at least never move into administration, even temporarily. It's a bad perch from which to be a pirate.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

April 23, 2013 4:16 PM

"Something Bigger Than Me"

In this interview with The Setup, Patric King talks about his current work:

Right now, my focus is bringing well-designed marketing to industries I believe in, to help them develop more financing. ... It is not glamorous, but it is the right thing to do. Designing pretty things is nice, but it's time for me to do something bigger than me.

Curly says, 'one thing... just one thing'

That's a pretty good position to be in: bringing value to a company or industry you believe in. Sometimes, we find such positions by virtue of the career path we choose. Those of us who teach as a part of our jobs are lucky in this regard.

Other times, we have to make a conscious decision to seek positions of this sort, or create the company we want to be in. That's what King has done. His skill set gives him more latitude than many people have. Those of us who can create software have more freedom than most other people, too. What an opportunity.

King's ellipsis is filled with the work that matters to him. As much as possible, when the time is right, we all should find the work that replaces our own ellipses with something that really matters to us, and to the world.


Posted by Eugene Wallingford | Permalink | Categories: General

April 14, 2013 6:25 PM

Scientists Being Scientists

Watson and Crick announced their discovery of the double helix structure of DNA in Molecular Structure of Nucleic Acids, a marvel of concise science writing. It has been widely extolled for how much information it packs into a single page, including the wonderfully understated line, "It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material."

As I read this paper again recently, though, this passage stood out:

The previously published X-ray data on deoxyribose nucleic acid are insufficient for a rigorous test of our structure. So far as we can tell, it is roughly compatible with the experimental data, but it must be regarded as unproved until it has been checked against more exact results.

They are unpretentious sentences. They do nothing special, stating simply that more and better data are needed to test their hypothesis. This is not a time for hyperbole. It is a time to get back to the lab.

Just scientists being scientists.


Posted by Eugene Wallingford | Permalink | Categories: General

April 10, 2013 4:03 PM

Minor Events in the Revolution at Universities

This morning I ran across several articles that had me thinking yet again about the revolution I see happening in the universities (*).

First, there was this recent piece in the New York Times about software that grades essays. Such software is probably essential for MOOCs in many disciplines, but it would also be useful in large lecture sections of traditional courses at many universities. The software isn't perfect, and skeptics abound. But the creator of the EdX software discussed in the article says:

This is machine learning and there is a long way to go, but it's good enough and the upside is huge.

It's good enough, and the upside is huge. Entrenched players scoff. Classic disruption at work.

Then there was this piece from the Nieman Journalism Lab about an online Dutch news company that wants readers to subscribe to individual journalists. Is this really news in 2013? I read a lot of technical and non-technical material these days via RSS feeds from individual journalists and bloggers. Of course, that's not the model yet for traditional newspapers and magazines.

... but that's the news business. What about the revolution in universities? The Nieman Lab piece reminded me of an old article in Vanity Fair about Politico, a news site founded by a small group of well-known political journalists who left their traditional employers to start the company. They all had strong "personal brands" and journalistic credentials. Their readers followed them to their new medium. Which got me to thinking...

What would happen if the top 10% of the teachers at Stanford or Harvard or Williams College just walked out to start their own university?

Of course, in the time since that article was published, we have seen something akin to this, with the spin-off of companies like Coursera and Udacity. However, these new education companies are partnering with traditional universities and building off the brands of their partners. At this point in time, the brand of a great school still trumps the individual brands of most all its faculty. But one can imagine a bolder break from tradition.

What happens when technology gives a platform to a new kind of teacher who bypasses the academic mainstream to create and grow a personal brand? What happens when this new kind of teacher bands together with a few like-minded renegades to use the same technology to scale up to the size of a traditional university, or more?

That will never happen, or so many of us in the academy are saying. This sort of thinking is what makes the Dutch news company mentioned above seem like such a novelty in the world of journalism. Many journalists and media companies, though, now recognize the change that has happened around them.

Which leads to a final piece I read this morning, a short blog entry by Dave Winer about Ezra Klein's epiphany on how blogging and journalism are now part of a single fabric. Winer says:

It's tragic that it took a smart guy like Klein so long to understand such a basic structural truth about how news, his own profession, has been working for the last 15 years.

I hope we aren't saying the same thing about the majority of university professors fifteen or twenty years from now. As we see in computers that grade essays, sometimes a new idea is good enough, and the upside is huge. More and more people will experiment with good-enough ideas, and even ideas that aren't good enough yet, and as they do the chance of someone riding the upside of the wave to something really different increases. I don't think MOOCs are a long-term answer to any particular educational problem now or in the future, but they are one of the laboratories in which these experiments can be played out.

I also hope that fifteen or twenty years from now someone isn't saying about skeptical university professors what Winer says so colorfully about journalists skeptical of the revolution that has redefined their discipline while they worked in it:

The arrogance is impressive, but they're still wrong.

~~~~

(*).   Nearly four years later, Revolution Out There -- and Maybe In Here remains one of my most visited blog entries, and one that elicits more reader comments than most. I think it struck a chord.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

April 09, 2013 3:16 PM

Writing a Book Is Like Flying A Spaceship

I've always liked this quote from the preface of Pragmatic Ajax, by Gehtland, Galbraith, and Almaer:

Writing a book is a lot like (we imagine) flying a spaceship too close to a black hole. One second you're thinking "Hey, there's something interesting over there," and a picosecond later, everything you know and love has been sucked inside and crushed.

Programming can be like that, too, in a good way. Just be sure to exit the black hole on the other side.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

March 30, 2013 8:43 AM

"It's a Good Course, But..."

Earlier this week I joined several other department heads to eat lunch with a bunch of high school teachers who were on campus for the Physics Olympics. The teachers were talking shop about the physics courses at their schools, and eventually the conversation turned to AP Physics. One of the teachers said, "It's a good course, but..."

A lot of these teachers would rather not offer AP Physics at all. One teacher described how in earlier days they were able to teach an advanced physics course of their own design. They had freedom to adapt to the interest of their students and to try out new ideas they encountered at conferences. Even though the advanced physics course had first-year physics as a prerequisite, they had plenty of students interested and able to take the second course.

The introduction of AP Physics created some problems. It's a good course, they all agreed, but it is yet another AP course for their students to take, and yet another AP exam for the students to prepare for. Most students can't or don't want to take all the AP courses, due to the heavier workload and often grueling pace. So in the end, they lose potential students who choose not to take the physics class.

Several of these teachers tried to make this case to heads of their divisions or to their principals, but to no avail.

This makes me sad. I'd like to see as many students taking science and math courses in high school as possible, and creating unnecessary bottlenecks hurts that effort.

There is a lot of cultural pressure these days to accelerate the work that HS students do. K-12 school districts and their administrators see the PR boon of offering more, and more advanced courses. State legislators are creating incentives for students to earn college credit while in high school, and funding for schools can reflect that. Parents love the idea of their children getting a head start on college, both because it might save money down the line and because they earn some vicarious pleasure in the achievement of their children.

On top of all this, the students themselves often face a lot of peer pressure from their friends and other fellow students to be doing and achieving more. I've seen that dynamic at work as my daughters have gone through high school.

Universities don't seem as keen about AP as they used to, but they send a mixed message to parents and students. On the one hand, many schools give weight in their admission decisions to the number of AP courses completed. This is especially true with more elite schools, which use this measure as a way to demonstrate their selectivity. Yet many of those same schools are reluctant to give full credit to students who pass the AP exam, at least as major credit, and require students to take their intro course anyway.

This reluctance is well-founded. We don't see any students who have taken AP Computer Science, so I can't comment on that exam, but I've talked with several Math faculty here about their experiences with calculus. They say that AP Calculus teaches a lot of good material, but the rush to cover the required content often leaves students with weak algebra skills. They manage to succeed in the course despite these weaknesses, but when they reach more advanced university courses -- even Calc II -- these weaknesses come back to haunt them.

As a parent of current and recent high school students, I have observed the student experience. AP courses try to prepare students for the "college experience" and as a result cover a lot of material. The students see them as grueling experiences, even when they enjoy the course content.

That concerns me a bit. For students who know they want to be math or science majors, these courses are welcome challenges. For the rest of the students, who take the courses primarily to earn college credit or to explore the topic, these courses are so grueling that they dampen the fun of learning.

Call me old-fashioned, but I think of high school as a time to learn about a lot of different things, to sample broadly from all areas of study. Sure, students should build up the skills necessary to function in the workplace and go to college, but the emphasis should be on creating a broadly educated citizen, not training a miniature college student. I'd rather students get excited about learning physics, or math, or computer science, so that they will want to dive deeper when they get to college.

A more relaxed, more flexible calculus class or physics course might attract more students than a grueling AP course. This is particularly important at a time when everyone is trying to increase interest in STEM majors.

My daughters have had a lot of great teachers, both in and out of their AP courses. I wish some of those teachers had had more freedom to spark student interest in the topic, rather than student and teacher alike facing the added pressure of taking the AP exam, earning college credits, and affecting college admission decisions.

It's a good course, but feel the thrill first.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

February 28, 2013 2:52 PM

The Power of a Good Abstract

Someone tweeted a link to Philip Greenspun's M.S. thesis yesterday. This is how you grab your reader's attention:

A revolution in earthmoving, a $100 billion industry, can be achieved with three components: the GPS location system, sensors and computers in earthmoving vehicles, and SITE CONTROLLER, a central computer system that maintains design data and directs operations. The first two components are widely available; I built SITE CONTROLLER to complete the triangle and describe it here.

Now I have to read the rest of the thesis.

You could do worse than use Greenspun's first two sentences as a template for your next abstract:

A revolution in <major industry or research area> can be achieved with <n> components: <component-1>, <component-2>, ... and <component-n>. The first <n-1> components are widely available. I built <program name> to meet the final need and describe it here.

I am adding this template to my toolbox of writing patterns, alongside Kent Beck's four-sentence abstract (scroll down to Kent's name), which generalizes the idea of one startling sentence that arrests the reader. I also like good advice on how to write concise, incisive thesis statements, such as that in Matt Might's Advice for PhD Thesis Proposals and Olin Shivers's classic Dissertation Advice.

As with any template or pattern, overuse can turn a good idea into a cliché. If readers repeatedly see the same cookie-cutter format, it begins to look stale, and they lose interest. So play with variations on the essential theme: I have solved an important problem. This is my solution.

If you don't have a great abstract, try again. Think hard about your own work. Why is this problem important? What is the big win from my solution? That's a key piece of Might's advice for graduate students: state clearly and unambiguously what you intend to achieve.

Indeed, approaching your research in a "test-driven" way makes a lot of sense. Before embarking on a project, try to write the startling abstract that will open the paper or dissertation you write when you have succeeded. If you can't identify the problem as truly important, then why start at all? Maybe you should pick something more valuable to work on, something that matters enough that you can write a startling abstract for the result. That's a key piece of advice shared by Richard Hamming in his You and Your Research.

And whatever you do, don't oversell a minor problem or a weak solution with an abstract that promises too much. Readers will be disappointed at best and angry at worst. If you oversell even a little bit too many times, you will become like the boy who cried wolf. No one will believe your startling claim even when it's on the mark.

Greenspun's startling abstract ends as strongly as it begins. Of course, it helps if you can close with a legitimate appeal to ameliorating poverty around the world:

This area is exciting because so much of the infrastructure is in place. A small effort by computer scientists could cut the cost of earthmoving in half, enabling poor countries to build roads and rich countries to clean up hazardous waste.

I'm not sure adding another automated refactoring to Eclipse or creating another database library can quite rise to the level of empowering the world's poor. But then, you may have a different audience.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns, Teaching and Learning

February 17, 2013 12:16 PM

The Disruption of Education: B.F. Skinner, MOOCs, and SkillShare

Here are three articles, all different, but with a connection to the future of education.

•   Matthew Howell, Teaching Programming

Howell is a software developer who decided to start teaching programming on the side. He offers an on-line course through SkillShare that introduces non-programmers to the basic concepts of computer programming, illustrated using Javascript running in a browser. This article describes some of his reasons for teaching the course and shares a few things he has learned. One was:

What is the ideal class size? Over the year, I've taught classes that ranged in size from a single person to as many as ten. Through that experience, I've settled on five as my ideal.

Anyone who has taught intro programming in a high school or university is probably thinking, um, yeah, that would be great! I once taught an intermediate programming section with fifty or so people, though most of my programming courses have ranged from fifteen to thirty-five students. All other things being equal, smaller is better. Helping people learn to write and make things almost always benefits from one-on-one time and time for small groups to critique design together.

Class size is, of course, one of the key problems we face in education these days, both K-12 and university. For a lot of teaching, n = 5 is just about perfect. For upper-division project courses, I prefer four groups of four students, for a total of sixteen. But even at that size, the costs incurred by a university offering sections of that size are rising a lot faster than its revenues.

With MOOCs all the rage, Howell is teaching at the other end of the spectrum. I expect the future of teaching to see a lot of activity at both scales. Those of us teaching in the middle face bleaker prospects.

•   Mike Caulfield, B. F. Skinner on Teaching Machines (1954)

Caulfield links to this video of B.F. Skinner describing a study on the optimal conditions for self-instruction using "teaching machines" in 1954. Caulfield points out that, while these days people like to look down on Skinner's behaviorist view of learning, he understood education better than many of his critics, and that others are unwittingly re-inventing many of his ideas.

For example:

[Skinner] understands that it is not the *machine* that teaches, but the person that writes the teaching program. And he is better informed than almost the entire current educational press pool in that he states clearly that a "teaching machine" is really just a new kind of textbook. It's what a textbook looks like in an age where we write programs instead of paragraphs.

That's a great crystallizing line by Caulfield: A "teaching machine" is what a textbook looks like in an age where we write programs instead of paragraphs.

Caulfield reminds us that Skinner said these things in 1954 and cautions us to stop asking "Why will this work?" about on-line education. That question presupposes that it will. Instead, he suggests we ask ourselves, "Why will this work this time around?" What has changed since 1954, or even 1994, that makes it possible this time?

This is a rightly skeptical stance. But it is wise to be asking the question, rather than presupposing -- as so many educators these days do -- that this is just another recursion of the "technology revolution" that never quite seems to revolutionize education after all.

•   Clayton Christensen in Why Apple, Tesla, VCs, academia may die

Christensen didn't write this piece, but reporter Cromwell Schubarth quotes him heavily throughout on how disruption may be coming to several companies and industries of interest to his Silicon Valley readership.

First, Christensen reminds young entrepreneurs that disruption usually comes from below, not from above:

If a newcomer thinks it can win by competing at the high end, "the incumbents will always kill you".

If they come in at the bottom of the market and offer something that at first is not as good, the legacy companies won't feel threatened until too late, after the newcomers have gained a foothold in the market.

We see this happening in higher education now. Yet most of my colleagues here on the faculty and in administration are taking the position that leaves legacy institutions most vulnerable to overthrow from below. "Coursera [or whoever] can't possibly do what we do", they say. "Let's keep doing what we do best, only better." That will work, until it doesn't.

Says Christensen:

But now online learning brings to higher education this technological core, and people who are very complacent are in deep trouble. The fact that everybody was trying to move upmarket and make their university better and better and better drove prices of education up to where they are today.

We all want to get better. It's a natural desire. My university understands that its so-called core competency lies in the niche between the research university and the liberal arts college, so we want to optimize in that space. As we seek to improve, we aspire to be, in our own way, like the best schools in their niches. As Christensen pointed out in The Innovator's Dilemma, this is precisely the trend that kills an institution when it meets a disruptive technology.

Later in the article, Christensen talks about how many schools are getting involved in online learning, sometimes investing significant resources, but almost always in service of the existing business model. Yet other business models are being born, models that newcomers are willing -- and sometimes forced -- to adopt.

One or more of these new models may be capable of toppling even the most successful institutions. Christensen describes one such candidate, a just-in-time education model in which students learn something, go off to use it, and then come back only when they need to learn what they need to know in order to take their next steps.

This sort of "learn and use", on-the-job learning, whether online or in person, is a very different way of doing things from school as we know it. It is not especially compatible with the way most universities are organized to educate people. It is, however, plenty compatible with on-line delivery and thus offers newcomers to the market the pebble they may use to bring down the university.

~~~~

The massively open on-line course is one form the newcomers are taking. The smaller, more intimate offering enabled by the likes of SkillShare is another. It may well be impossible for legacy institutions caught in the middle to fend off challenges from both directions.

As Caulfield suggests, though, we should be skeptical. We have seen claims about technology upending schools before. But we should adopt the healthy skepticism of the scientist, not the reactionary skepticism of the complacent or the scared. The technological playing field has changed. What didn't work in 1954 or 1974 or 1994 may well work this time.

Will it? Christensen thinks so:

Fifteen years from now more than half of the universities will be in bankruptcy, including the state schools. In the end, I am excited to see that happen.

I fear that universities like mine are at the greatest risk of disruption, should the wave that Christensen predicts come. I don't know many university faculty who are excited to see it happen. I just hope they aren't too surprised if it does.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

February 07, 2013 5:01 PM

Quotes of the Day

Computational Thinking Division. From Jon Udell, another lesson that programming and computing teach us which can be useful out in the world:

Focus on understanding why the program is doing what it's doing, rather than why it's not doing what you wanted it to.

This isn't the default approach of everyone. Most of my students have to learn this lesson as a part of learning how to program. But it can be helpful outside of programming, in particular by influencing how we interact with people. As Udell says, it can be helpful to focus on understanding why one's spouse or child or friend is doing what she is doing, rather than on why she isn't doing what you want.

Motivational Division. From the Portland Ballet, of all places, several truths about being a professional dancer that generalize beyond the studio, including:

There's a lot you don't know.
There may not be a tomorrow.
There's a lot you can't control.
You will never feel 100% ready.

So get to work, even if it means reading the book and writing the code for the fourth time. That is where the fun and happiness are. All you can affect, you affect by the work you do.

Mac Chauvinism Division. From Matt Gemmell, this advice on a particular piece of software:

There's even a Windows version, so you can also use it before you've had sufficient success to afford a decent computer.

But with enough work and a little luck, you can afford better next time.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Managing and Leading

February 06, 2013 10:06 AM

Shared Governance and the 21st Century University

Mitch Daniels, the new president of Purdue University, says this about shared governance in An Open Letter to the People of Purdue, his initial address to the university community:

I subscribe entirely to the concept that major decisions about the university and its future should be made under conditions of maximum practical inclusiveness and consultation. The faculty must have the strongest single voice in these deliberations, but students and staff should also be heard whenever their interests are implicated. I will work hard to see that all viewpoints are fairly heard and considered on big calls, including the prioritization of university budgetary investments, and endeavor to avoid surprises even on minor matters to the extent possible.

Shared governance implies shared accountability. It is neither equitable or workable to demand shared governing power but declare that cost control or substandard performance in any part of Purdue is someone else's problem. We cannot improve low on-time completion rates and maximize student success if no one is willing to modify his schedule, workload, or method of teaching.

Participation in governance also requires the willingness to make choices. "More for everyone" or "Everyone gets the same" are stances of default, inconsistent with the obligations of leadership.

I love the phrase, inconsistent with the obligations of leadership.

Daniels recently left the governor's house in Indiana for the president's house at Purdue. His initial address is balanced, open, and forward-looking. It is respectful of what universities do and forthright about the need to recognize changes in the world around us, and to change in response.

My university is hiring a new president, too. Our Board of Regents will announce its selection tomorrow. It is probably too much to ask that we hire a new president with the kind of vision and leadership that Daniels brings to West Lafayette. I do hope that we find someone up to the task of leading a university in a new century.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

February 03, 2013 11:10 AM

Faulkner Teaches How to Study

novelist William Faulkner, dressed for work

From this Paris Review interview with novelist William Faulkner:

INTERVIEWER

Some people say they can't understand your writing, even after they read it two or three times. What approach would you suggest for them?

FAULKNER

Read it four times.

The first three times through the book are sunk cost. At this moment, you don't understand. What should you do? Read it again.

I'm not suggesting you keep doing the same failing things over and over. (You know what Einstein said about insanity.) If you read the full interview, you'll see that Faulkner isn't suggesting that, either. We're suggesting you get back to work.

Studying computer science is different from reading literature. We can approach our study perhaps more analytically than the novel reader. And we can write code. As an instructor, I try to have a stable of ideas that students can try when they are having trouble grasping a new concept or understanding a reading, such as:

  • Assemble a list of specific questions to ask your prof.
  • Talk to a buddy who seems to understand what you don't.
  • Type in the code from the paper character by character, thinking about it as you do.
  • Draw a picture.
  • Try to explain the parts you do understand to another student.
  • Focus on one paragraph, and work backward from there to the ideas it presumes you already know.
  • Write your own program.

One thing that doesn't work very well is being passive. Often, students come to my office and say, "I don't get it." They don't bring much to the session. But the best learning is not passive; it's active. Do something. Something new, or just more.

Faulkner is quite matter-of-fact about creating and reading literature. If it isn't right, work to make it better. Technique? Method? Sure, whatever you need. Just do the work.

This may seem like silly advice. Aren't we all working hard enough already? Not all of us, and not all the time. I sometimes find that when I'm struggling most, I've stopped working hard. I get used to understanding things quickly, and then suddenly I don't. Time to read it again.

I empathize with many of my students. College is a shock to them. Things came easily in high school, and suddenly they don't. These students mean well but seem genuinely confused about what they should do next. "Why don't I understand this already?"

Sometimes our impatience is born from such experience. But as Bill Evans reminds us, some problems are too big to conquer immediately. He suggests that we accept this up front and enjoy the whole trip. That's good advice.

Faulkner shrugs his shoulders and tells us to get back to work.

~~~~

PHOTO. William Faulkner, dressed for work. Source: The Centered Librarian.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

January 26, 2013 5:52 PM

Computing Everywhere: Indirection

Alice: The hardest word you'll ever be asked to spell is "ichdericious".

Bob: Yikes. Which word?

A few of us have had fun with the quotations in English and Scheme over the last few days, but this idea is bigger than symbols as data values in programs or even words and strings in natural language. They are examples of a key element of computational thinking, indirection, which occurs in real life all the time.

A few years ago, my city built a new water park. To account for the influx of young children in the area, the city dropped the speed limit in the vicinity of the pool from 35 MPH to 25 MPH. The speed limit in that area had been 35 MPH for a long time, and many drivers had a hard time adjusting to the change. So the city put up a new traffic sign a hundred yards up the road, to warn drivers of the coming change. It looks like this one:

traffic sign: 40 MPH speed limit ahead

The white image in the middle of this sign is a quoted version of what drivers see down the road, the usual:

traffic sign: 40 MPH speed limit

Now, many people slow down to the new speed limit well in advance, often before reaching even the warning sign. Maybe they are being safe. Then again, maybe they are confusing a sign about a speed limit sign with the speed limit sign itself.

If so, they have missed a level of indirection.

I won't claim that computer scientists are great drivers, but I will say that we get used to dealing with indirection as a matter of course. A variable holds a value. A pointer holds the address of a location, which holds a value. A URL refers to a web page. The list goes on.
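
To make those layers concrete, here is a minimal sketch in Python, using a dictionary as a stand-in for an environment of variables. The names are mine, invented for illustration, not drawn from any particular program:

    # A tiny "environment" mapping variable names to values.
    env = {"speed_limit": 25}

    # One level of indirection: a name that must be looked up to reach the value.
    name = "speed_limit"
    print(env[name])                  # 25

    # Two levels: a reference to the name -- a sign about the sign.
    warning = {"refers_to": "speed_limit"}
    print(env[warning["refers_to"]])  # still 25, reached through two hops

Reading the value 25 directly involves no indirection at all; each extra hop adds a level, just as the warning sign up the road adds a level above the speed limit sign itself.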

Indirection is a fundamental element in the fabric of computation. As computation becomes an integral part of nearly everyone's daily life, there is a lot to be gained by more people understanding the idea of indirection and recognizing opportunities to put it to work to mutual benefit.

Over the last few years, Jon Udell has been making a valiant attempt to bring this issue to the attention of computer scientists and non-computer scientists alike. He often starts with the idea of a hyperlink in a web page, or the URL to which it is tied, as a form of computing indirection that everyone already groks. But his goal is to capitalize on this understanding to sneak the communication strategy of pass by reference into people's mental models.

As Udell says, most people use hyperlinks every day but don't use them as well as they might, because the distinction between "pass by value" and "pass by reference" is not a part of their usual mental machinery:

The real problem, I think, is that if you're a newspaper editor, or a city official, or a citizen, pass-by-reference just isn't part of your mental toolkit. We teach the principle of indirection to programmers. But until recently there was no obvious need to teach it to everybody else, so we don't.

He has made the community calendar his working example of pass by reference, and his crusade:

In the case of calendar events, you're passing by value when you send copies of your data to event sites in email, or when you log into an events site and recopy data that you've already written down for yourself and published on your own site.

You're passing by reference when you publish the URL of your calendar feed and invite people and services to subscribe to your feed at that URL.
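
Here is a minimal sketch of that difference in Python, with made-up event data and a hypothetical feed URL, just to show the shape of the two strategies:

    # Pass by value: each events site receives its own copy of the data.
    event = {"title": "Community Concert", "time": "19:00"}
    site_copy = dict(event)

    event["time"] = "18:30"      # the organizer corrects the time...
    print(site_copy["time"])     # 19:00 -- the copy never learns of the change

    # Pass by reference: subscribers keep the feed's address, not the data.
    calendar_feed = {"events": [event]}
    feed_url = "https://example.org/calendar.ics"   # hypothetical address to share

    def current_time(feed):
        # Re-reading the feed on each visit always yields the latest data.
        return feed["events"][0]["time"]

    print(current_time(calendar_feed))   # 18:30

The copy goes stale the moment the original changes; the subscription stays current because every reader follows the same reference back to the source.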

"Pass by reference rather than by value" is one of Udell's seven ways to think like the web, his take on how to describe computational thinking in a world of distributed, network media. That essay is a good start on an essential module in any course that wants to prepare people to live in a digital world. Without these skills, how can we hope to make the best use of technology when it involves two levels of indirection, as shared citations and marginalia do?

Quotation in Scheme and pass-by-reference are different issues, but they are related in a fundamental way to the concept of indirection. We need to arm more people with this concept than just CS students learning how programming languages work.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

January 25, 2013 4:47 PM

More on Real-World Examples of Quotation

My rumination on real-world examples of quotation to use with my students learning Scheme sparked the imaginations of several readers. Not too surprisingly, they came up with better examples than my own... For example, musician and software developer Chuck Hoffman suggested:

A song, he sang.
"A song", he sang.

The meaning of these is clearly different depending on whether we treat a song as a variable or as a literal.

My favorite example came from long-time friend Joe Bergin:

"Lincoln" has seven letters.
Lincoln has seven letters.

Very nice. Joe beat me with my own example!

As Chuck wrote, song titles create an interesting challenge, whether someone is singing a certain song or singing in a way defined by the words that happen to also be the song's title. I have certainly found it hard to find words that are both part of a title or a reference and flow seamlessly in a sentence.

This turns out to be a fun form of word play, independent of its use as a teaching example. Feel free to send me your favorites.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

January 20, 2013 10:28 AM

Scored Discussions

My wife has been on a long-term substitute teaching assignment for the last few weeks. Yesterday, I ran across the following rubric used by one of the middle school teachers there to grade "scored discussions". The class reads a book, which they discuss as a group. Students are evaluated by their contribution to the discussion, including their observable behavior.

Productive behavior
  • Uses positive body language and eye contact (5)
  • Makes a relevant comment (1)
  • Offers supporting evidence (2)
  • Uses an analogy (3)
  • Asks a clarifying question (2)
  • Listens actively -- rephrases comment before responding (3)
  • Uses good speaking skills -- clear speech, loud enough, not too fast (2)

Nonproductive behavior

  • Not paying attention (-2)
  • Interrupting (-3)
  • Irrelevant comment (-2)
  • Monopolizing (-3)

Most adults, including faculty, should be glad that their behavior is not graded according to this standard. I daresay that many of us would leave meetings with a negative score more often than we would like to admit.

I think I'll use this rubric to monitor my own behavior at the next meeting on my calendar.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

December 31, 2012 8:22 AM

Building Things and Breaking Things Down

As I look toward 2013, I've been thinking about Alan Kay's view of CS as science [ link ]:

I believe that the only kind of science computing can be is like the science of bridge building. Somebody has to build the bridges and other people have to tear them down and make better theories, and you have to keep on building bridges.

In 2013, what will I build? What will I break down, understand, and help others to understand better?

One building project I have in mind is an interactive text. One analysis project I have in mind involves functional design patterns.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Patterns

December 29, 2012 8:47 AM

Beautiful Sentences

Matthew Ward, in the Translator's Note to "The Stranger" (Vintage Books, 1988):

I have also attempted to venture further into the letter of Camus's novel, to capture what he said and how he said it, not what he meant. In theory, the latter should take care of itself.

This approach works pretty well for most authors and most books, I imagine.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

December 12, 2012 4:18 PM

Be a Driver, Not a Passenger

Some people say that programming isn't for everyone, just as knowing how to tinker under the hood of one's car isn't for everyone. Some people design and build cars; other people fix them; and the rest of us use them as high-level tools.

Douglas Rushkoff explains why this analogy is wrong:

Programming a computer is not like being the mechanic of an automobile. We're not looking at the difference between a mechanic and a driver, but between a driver and a passenger. If you don't know how to drive the car, you are forever dependent on your driver to take you where you want to go. You're even dependent on that driver to tell you when a place exists.

This is CS Education week, "a highly distributed celebration of the impact of computing and the need for computer science education". As a part of the festivities, Rushkoff was scheduled to address members of Congress and their staffers today about "the value of digital literacy". The passage quoted above is one of ten points he planned to make in his address.

As good as the other nine points are -- and several are very good -- I think the distinction between driver and passenger is the key, the essential idea for folks to understand about computing. If you can't program, you are not a driver; you are a passenger on someone else's trip. They get to decide where you go. You may want to invent a new place entirely, but you don't have the tools of invention. Worse yet, you may not even have the tools you need to imagine the new place. The world is as it is presented to you.

Don't just go along for the ride. Drive.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

December 09, 2012 5:12 PM

Just Build Things

The advantage of knowing how to program is that you can. The danger of knowing how to program is that you will want to.

From Paul Graham's How to Get Startup Ideas:

Knowing how to hack also means that when you have ideas, you'll be able to implement them. That's not absolutely necessary..., but it's an advantage. It's a big advantage, when you're considering an idea ..., if instead of merely thinking, "That's an interesting idea," you can think instead, "That's an interesting idea. I'll try building an initial version tonight."

Writing programs, like any sort of fleshing out of big ideas, is hard work. But what's the alternative? Not being able to program, in which case you'll just need a programmer.

If you can program, what should you do?

[D]on't take any extra classes, and just build things. ... But don't feel like you have to build things that will become startups. That's premature optimization. Just build things.

Even the professor in me has to admit this is true. You will learn a lot of valuable theory, tools, and practices in class. But when a big idea comes to mind, you need to build it.

As Graham says, perhaps the best way that universities can help students start startups is to find ways to "leave them alone in the right way".

Of course, programming skills are not all you need. You'll probably need to be able to understand and learn from users:

When you find an unmet need that isn't your own, it may be somewhat blurry at first. The person who needs something may not know exactly what they need. In that case I often recommend that founders act like consultants -- that they do what they'd do if they'd been retained to solve the problems of this one user.

That's when those social science courses can come in handy.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

November 23, 2012 9:34 AM

In the Spirit of the Weekend

I am thankful for human beings' capacity to waste time.

We waste it in the most creative ways. My life is immeasurably better because other people have wasted time and created art and literature. Even much of the science and technology I enjoy came from people noodling around in their free time. The universe has blessed me, and us.

~~~~

At my house, Thanksgiving lasts the whole weekend. I don't mind writing a Thanksgiving blog the day after, even though the rest of the world has already moved on to Black Friday and the next season on the calendar. My family is, I suppose, wasting time.

This note of gratitude was prompted by reading a recent joint interview with Brian Eno and Ha-Joon Chang, oddities in their respective disciplines of music and economics. I am thankful for oddities such as Eno and Chang, who add to the world in ways that I cannot. I am also thankful that I live in a world that provides me access to so much wonderful information with such ease. I feel a deep sense of obligation to use my time in a way that repays these gifts I have been given.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

November 20, 2012 12:20 PM

The Paper Was Rejected, But Do Readers Care?

The research paper I discussed in a recent blog entry on student use of a new kind of textbook has not been published yet. It was rejected by ICER 2012, a CS education conference, for what are surely good reasons from the reviewers' perspective. The paper neither describes the results of an experiment nor puts the evaluation in the context of previous work. As the first study of this sort, though, that would be difficult to do.

That said, I did not hesitate to read the paper and try to put its findings to use. The authors have a solid reputation for doing good work, and I trust them to have done reasonable work and to have written about it honestly. Were there substantial flaws with the study or the paper, I trusted myself to take them into account as I interpreted and used the results.

I realize that this sort of thing happens every day, and has for a long time: academics reading technical reports and informal papers to learn from the work of their colleagues. But given the state of publishing these days, both academic and non-academic, I couldn't help but think about how the dissemination of information is changing.

Guzdial's blog is a perfect example. He has developed a solid reputation as a researcher and as an interpreter of other people's work. Now, nearly every day, we can all read his thoughts about his work, the work of others, and the state of the world. Whether the work is published in a journal or conference or not, it will reach an eager audience. He probably still needs to publish in traditional venues occasionally in order to please his employer and to maintain a certain stature, but I suspect that he no longer depends upon that sort of publication in the way researchers did ten or thirty years ago.

True, Guzdial developed his reputation in part by publishing in journals and conferences, and they can still play that role for new researchers who are just developing their reputations. But there are other ways for the community to discover new work and recognize the quality of researchers and writers. Likewise, journals and conferences still can play a role in archiving work for posterity. But as the internet and web reach more and more people, and as we learn to do a better job of archiving what we publish there, that role will begin to fade.

The gates really are coming down.


Posted by Eugene Wallingford | Permalink | Categories: General

October 11, 2012 3:21 PM

Writing Advice for Me

I'm not a big fan of Top Ten lists on the web, unless they come from fellow Hoosier David Letterman. But I do like Number 9 on this list of writing tips:

Exclude all words that just don't add anything. This was the very best piece of advice I read when I first started blogging. Carefully re-read posts that you have written and try to remove all the extraneous words that add little or nothing.

This advice strikes a chord in me because I struggle to follow it, even when I am writing about it.


Posted by Eugene Wallingford | Permalink | Categories: General

October 01, 2012 7:40 AM

StrangeLoop 9: This and That

the Peabody Opera House

Every conference leaves me with unattached ideas floating around after I write up all my entries. StrangeLoop was no different. Were I a master of Twitter, one who live-posted throughout the conference, many of these might have been masterful tweets. Instead, they are bullets in a truly miscellaneous blog entry.

~~~~

The conference was at the Peabody Opera House (right), an 80-year-old landmark in downtown St. Louis. It shares a large city block with the ScottTrade Center, home of the NHL Blues, and a large parking garage ideally positioned for a conference goer staying elsewhere. The main hall was perfect for plenary sessions, and four side rooms fit the parallel talks nicely.

~~~~

When I arrived at 8:30 AM on Monday, the morning refreshment table contained, in addition to the perfunctory coffee, Diet Mountain Dew in handy 12-ounce bottles. Soda was available all day. This made me happy.

Sadly, the kitchen ran out of Diet Dew before Tuesday morning. Such is life. I still applaud the conference for meeting the preferences of its non-coffee drinkers.

~~~~

During the Akka talk, I saw some code on a slide that made me mutter Ack! under my breath. That made me chuckle.

~~~~

"Man, there are a lot of Macs and iPads in this room."
-- me, at every conference session

~~~~

the St. Louis Arch, down the street from the Opera House

On Monday, I saw @fogus across the room in his Manfred von Thun jersey. I bow to you, sir. Joy is one of my favorites.

After seeing @fogus's jersey tweet, I actually ordered one for myself. Unfortunately, it didn't arrive in time for the conference. A nice coincidence: Robert Floyd spent most of his career at Stanford, whose mascot is... the Cardinal. (The color, not the bird.)

~~~~

During Matthew Flatt's talk, I couldn't help but think Alan Kay would be proud. This is programming taken to the extreme. Kay always said that Smalltalk didn't need an operating system; just hook those primitives directly to the underlying metal. Racket might be able to serve as its own OS, too.

~~~~

I skipped a few talks. During lunch each day, I went outside to walk. That's good for my knee as well as my head. Then I skipped one talk that I wanted to see at the end of each day, so that I could hit the exercise bike and pool. The web will surely provide me reports of both (The Database as a Value and The State of JavaScript). Sometimes, fresh air and exercise are worth the sacrifice.

~~~~

my StrangeLoop 2012 conference badge

I turned my laptop off for the last two talks of the conference that I attended. I don't think that the result was being able to think more or better, but I definitely did think differently. Global connections seemed to surface more quickly, whereas typing notes seemed to keep me focused on local connections.

~~~~

Wednesday morning, as I hit the road for home, I ran into rush hour traffic driving toward downtown St. Louis. It took us 41 minutes to travel 12 miles. As much as I love St. Louis and this conference, I was glad to be heading home to a less crowded place.

~~~~

Even though I took walks at lunch, I was able to sneak into the lunch talks late. Tuesday's talk on Plato (OOP) and Aristotle (FP) brought a wistful smile. I spent a couple of years in grad school drawing inspiration for our lab's approach to knowledge-based systems from the pragmatists, in contrast to the traditional logical views of much of the AI world.

That talk contained two of my favorite sentences from the conference:

Computer scientists are applied metaphysicists.

And:

We have the most exciting job in the history of philosophy.

Indeed. We can encode, implement, and experiment with every model of the world we create. It is good to be the king.

This seems like a nice way to close my StrangeLoop posts for now. Now, back to work.


Posted by Eugene Wallingford | Permalink | Categories: General

September 19, 2012 4:57 PM

Don't Stop The Car

I'm not a Pomodoro guy, but this advice from The Timer Knows Best applies more generally:

Last month I was teaching my wife to drive [a manual transmission car], and it's amazing how easy stick shifting is if the car is already moving.... However, when the car is stopped and you need to get into 1st gear, it's extremely difficult. [So many things can go wrong:] too little gas, too much clutch, etc. ...

The same is true with the work day. Once you get going, you want to avoid coming to a standstill and having to get yourself moving again.

As I make the move from runner to cyclist, I have learned how much easier it is to keep moving on a bike than it is to start moving.

This is true of programming, too. Test-driven development helps us get started by encouraging us to focus on one new piece of functionality to implement. Keep it small, make it work, and move on to another small step. Pretty soon you are moving, and you are on your way.

Another technique many programmers use is to code a failing test just before stopping for the day. The failing test focuses you quickly the next morning and recruits your own memory to help recreate the feeling of motion. It's like a way to leave the car running in second gear.
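
As a sketch of that trick, using pytest and a hypothetical cart module that stands in for the day's work, the last few minutes before quitting might leave behind something like this:

    # test_cart.py -- written just before stopping for the day.
    import pytest
    from cart import ShoppingCart    # hypothetical module from today's work

    def test_total_sums_items():                 # already passing
        cart = ShoppingCart()
        cart.add("apple", 0.50)
        assert cart.total() == pytest.approx(0.50)

    def test_total_applies_discount_code():      # deliberately failing: tomorrow's first task
        cart = ShoppingCart()
        cart.add("apple", 0.50)
        cart.apply_discount("SAVE10")
        assert cart.total() == pytest.approx(0.45)

The next morning, running the suite points straight at apply_discount, and you are moving again before you have time to stall.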

I'm trying to help my students, who are mostly still learning how to write code, learn how to get started when they program. Many of them seem repeatedly to find themselves sitting still, grinding their gears and trying to figure out how to write the next bit of code and get it running. Ultimately, the answer may come down to the same thing we learn when we learn to drive a stick: practice, practice, practice, and eventually you get the feel of how the gearshift works.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

August 31, 2012 3:22 PM

Two Weeks Along the Road to OOP

The month has flown by, preparing for and now teaching our "intermediate computing" course. Add to that a strange and unusual set of administrative issues, and I've found no time to blog. I did, however, manage to post what has become my most-retweeted tweet ever:

I wish I had enough money to run Oracle instead of Postgres. I'd still run Postgres, but I'd have a lot of cash.

That's an adaptation of a tweet originated by @petdance and retweeted my way by @logosity. I polished it up, sent it off, and -- it took off for the sky. It's been fun watching its ebb and flow, as it reaches new sub-networks of people. From this experience I must learn at least one lesson: a lot of people are tired of sending money to Oracle.

The first two weeks of my course have led the students a few small steps toward object-oriented programming. I am letting the course evolve, with a few guiding ideas but no hard-and-fast plan. I'll write about the course's structure after I have a better view of it. For now, I can summarize the first four class sessions:

  1. Run a simple "memo pad" app, trying to identify behavior (functions) and state (persistent data). Discuss how different groupings of the functions and data might help us to localize change.
  2. Look at the code for the app. Discuss the organization of the functions and data. See a couple of basic design patterns, in particular the separation of model and view.
  3. Study the code in greater detail, with a focus on the high-level structure of an OO program in Java.
  4. Study the code in greater detail, with a focus on the lower-level structure of classes and methods in Java.

The reason we can spend so much time talking about a simple program is that students come to the course without (necessarily) knowing any Java. Most come with knowledge of Python or Ada, and their experiences with such different languages create an interesting space in which to encounter Java. Our goal this semester is for students to learn their second language as much as possible, rather than having me "teach" it to them. I'm trying to expose them to a little more of the language each day, as we learn about design in parallel. This approach works reasonably well with Scheme and functional programming in a programming languages course. I'll have to see how well it works for Java and OOP, and adjust accordingly.

Next week we will begin to create things: classes, then small systems of classes. Homework 1 has them implementing a simple array-based class to an interface. It will be our first experience with polymorphic objects, though I plan to save that jargon for later in the course.

Finally, this is the new world of education: my students are sending me links to on-line sites and videos that have helped them learn programming. They want me to check them out and share them with the other students. Today I received a link to The New Boston, which has among its 2500+ videos eighty-seven beginning Java and fifty-nine intermediate Java titles. Perhaps we'll come to a time when I can out-source all instruction on specific languages and focus class time on higher-level issues of design and programming...


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

August 09, 2012 1:36 PM

Sentences to Ponder

In Why Read?, Mark Edmundson writes:

A language, Wittgenstein thought, is a way of life. A new language, whether we learn it from a historian, a poet, a painter, or a composer of music, is potentially a new way to live.

Or from a programmer.

In computing, we sometimes speak of Perlis languages, after one of Alan Perlis's best-known epigrams: A language that doesn't affect the way you think about programming is not worth knowing. A programming language can change how we think about our craft. I hope to change how my students think about programming this fall, when I teach them an object-oriented language.

But for those of us who spend our days and nights turning ideas into programs, a way of thinking is akin to a way of life. That is why the wider scope of Wittgenstein's assertion strikes me as so appropriate for programmers.

Of course, I also think that programmers should follow Edmundson's advice and learn new languages from historians, writers, and artists. Learning new ways to think and live isn't just for humanities majors.

(By the way, I'm enjoying reading Why Read? so far. I read Edmundson's Teacher many years ago and recommend it highly.)


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

July 23, 2012 3:14 PM

Letting Go of Old Strengths

Ward Cunningham commented on what it's like to be "an old guy who's still a programmer" in his recent Dr. Dobb's interview:

A lot of people think that you can't be old and be good, and that's not true. You just have to be willing to let go of the strengths that you had a year ago and get some new strengths this year. Because it does change fast, and if you're not willing to do that, then you're not really able to be a programmer.

That made me think of the last comment I made in my posts on JRubyConf:

There is a lot of stuff I don't know. I won't run out of things to read and learn and do for a long, long, time.

This is an ongoing theme in the life of a programmer, in the life of a teacher, and in the life of an academic: the choice we make each day between keeping up and settling down. Keeping up is a lot more fun, but it's work. If you aren't comfortable giving up what you were awesome at yesterday, it's even more painful. I've mostly been lucky enough to enjoy learning new stuff more than knowing the old stuff. May you be so lucky.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

July 20, 2012 3:39 PM

A Philosopher of Imitation

Ian Bogost, in The Great Pretender: Turing as a Philosopher of Imitation, writes:

Intelligence -- whatever it is, the thing that goes on inside a human or a machine -- is less interesting and productive a topic of conversation than the effects of such a process, the experience it creates in observers and interlocutors.

This is a very nice one-sentence summary of Turing's thesis in Computing Machinery and Intelligence. I wrote a bit about Turing's ideas on machine intelligence a few months back, but the key idea in Bogost's essay relates more closely to my discussion in Turing's ideas on representation and universal machines.

In this centennial year of his birth, we can hardly go wrong in considering again and again the depth of Turing's contributions. Bogost uses a lovely turn of phrase in his title: a philosopher of imitation. What may sound like a slight or a trifle is, in fact, the highest of compliments. Turing made that thinkable.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 18, 2012 2:31 PM

Names, Values, and The Battle of Bull Run

the cover of 'Encyclopedia Brown Finds the Clues'

Author Donald Sobol died Monday. I know him best from his long-running series, Encyclopedia Brown. Like many kids of my day, I loved these stories. I couldn't get enough. Each book consisted of ten or so short mysteries solved by Encyclopedia or Sally Kimball, his de facto partner in the Brown Detective Agency. I wanted to be Encyclopedia.

The stories were brain teasers. Solving them required knowledge and, more important, careful observation and logical deduction. I learned to pay close attention while reading Encyclopedia Brown, otherwise I had no hope of solving the crime before Encyclopedia revealed the solution. In many ways, these stories prepared me for a career in math and science. They certainly were a lot of fun.

One of the stories I remember best after all these years is "The Case of the Civil War Sword", from the very first Encyclopedia Brown book. I'm not the only person who found it memorable; Rob Bricken ranks it #9 among the ten most difficult Encyclopedia Brown mysteries. The solution to this case turned on the fact that one battle had two different names. Northerners often named battles for nearby bodies of water or prominent natural features, while Southerners named them for the nearest town or prominent man-made features. So, the First Battle of Bull Run and the First Battle of Manassas were the same event.

This case taught me a bit of historical trivia and opened my mind to the idea that naming things from the Civil War was not trivial at all.

This story taught me more than history, though. As a young boy, it stood out as an example of something I surely already knew: names aren't unique. The same value can have different names. In a way, Encyclopedia Brown taught me one of my first lessons about computer science.

~~~~

IMAGE: the cover of Encyclopedia Brown Finds the Clues, 1966. Source: Topless Robot.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

July 16, 2012 3:02 PM

Refactoring Everywhere: In Code and In Text

Charlie Stross is a sci-fi writer. Some of my friends have recommended his fiction, but I've not read any. In Writing a novel in Scrivener: lessons learned, he, well, describes what he has learned writing novels using Scrivener, an app for writers well known in the Mac OS X world.

I've used it before on several novels, notably ones where the plot got so gnarly and tangled up that I badly needed a tool for refactoring plot strands, but the novel I've finished, "Neptune's Brood", is the first one that was written from start to finish in Scrivener...

... It doesn't completely replace the word processor in my workflow, but it relegates it to a markup and proofing tool rather than being a central element of the process of creating a book. And that's about as major a change as the author's job has undergone since WYSIWYG word processing came along in the late 80s....

My suspicion is that if this sort of tool spreads, the long-term result may be better structured novels with fewer dangling plot threads and internal inconsistencies. But time will tell.

Stross's lessons don't all revolve around refactoring, but being able to manage and manipulate the structure of the evolving novel seems central to his satisfaction.

I've read a lot of novels that seemed like they could have used a little refactoring. I always figured it was just me.

The experience of writing anything in long form can probably be improved by a good refactoring tool. I know I find myself doing some pretty large refactorings when I'm working on the set of lecture notes for a course.

Programmers and computer scientists have the advantage of being more comfortable writing text as code, using tools such as LaTeX and Scribble, or homegrown systems. My sense, though, is that fewer programmers use tools like this, at least at full power, than might benefit from doing so.

Like Stross, I have a predisposition against using tools with proprietary data formats. I've never lost data stored in plaintext to version creep or application obsolescence. I do use apps such as VoodooPad for specific tasks, though I am keenly aware of the exit strategy (export to text or RTFD) and the pain trade-off at exit (the more VoodooPad docs I create, the more docs I have to remember to export before losing access to the app). One of the things I like most about MacJournal is that it's nothing but a veneer over a set of Unix directories and RTF documents. The flip side is that it can't do for me nearly what Scrivener can do.

Thinking about a prose writing tool that supports refactoring raises an obvious question: what sort of refactoring operations might it provide automatically? Some of the standard code refactorings might have natural analogues in writing, such as Extract Chapter or Inline Digression.
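As a rough illustration, here is what an Extract Chapter operation might look like over a toy document model, sketched in Java. Nothing here is Scrivener's API; the model and the names are my own invention.

    import java.util.ArrayList;
    import java.util.List;

    class ProseRefactoring {
        // Move paragraphs [from, to) out of the manuscript into a new chapter,
        // leaving a cross-reference behind -- the prose analogue of Extract Method.
        // The manuscript must be a mutable list, such as an ArrayList.
        static List<String> extractChapter(List<String> manuscript, int from, int to, String title) {
            List<String> chapter = new ArrayList<>(manuscript.subList(from, to));
            manuscript.subList(from, to).clear();                 // remove the moved paragraphs
            manuscript.add(from, "[See chapter: " + title + "]"); // point readers to the new home
            return chapter;
        }
    }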

Thinking about automated support for refactoring raises another obvious question, the importance of which is surely as clear to novelists as to software developers: Where are the unit tests? How will we know we haven't broken the story?

I'm not being facetious. The biggest fear I have when I refactor a module of a course I teach is that I will break something somewhere down the line in the course. Your advice is welcome!


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

July 14, 2012 11:01 AM

"Most Happiness Comes From Friction"

Last time, I mentioned again the value in having students learn broadly across the sciences and humanities, including computer science. This is a challenge going in both directions. Most students like to concentrate on one area, for a lot of different reasons. Computer science looks intimidating to students in other majors, perhaps especially to the humanities-inclined.

There is hope. Earlier this year, the Harvard Magazine ran The Frisson of Friction, an essay by Sarah Zhang, a non-CS student who decided to take CS 50, Harvard's intro to computer science. Zhang tells the story of chasing down a thorny, semicolon-induced bug in a program (an extension for Google's Chrome browser) on the eve of her 21st birthday. Eventually, she succeeded. In retrospect, she writes:

Plenty of people could have coded the same extension more elegantly and in less time. I will never be as good a programmer as -- to set the standard absurdly high -- Mark Zuckerberg. But accomplishments can be measured in terms relative to ourselves, rather than to others. Rather than sticking to what we're already good at as the surest path to résumé-worthy achievements, we should see the value in novel challenges. How else will we discover possibilities that lie just beyond the visible horizon?

... Even the best birthday cake is no substitute for the deep satisfaction of accomplishing what we had previously deemed impossible -- whether it's writing a program or writing a play.

The essay addresses some of the issues that keep students from seeking out novel challenges, such as fear of low grades and fear of looking foolish. At places like Harvard, students who are used to succeeding find themselves boxed in by their friends' expectations, and their own, but those feelings are familiar to students at any school. Then you have advisors who subtly discourage venturing too far from the comfortable, out of their own unfamiliarity and fear. This is a social issue as big as any pedagogical challenge we face in trying to make introductory computer science more accessible to more people.

With work, we can help students feel the deep satisfaction that Zhang experienced. Overcoming challenges often leads to that feeling. She quotes a passage about programmers in Silicon Valley, who thrive on such challenges: "Most happiness probably comes from friction." Much satisfaction and happiness come out of the friction inherent in making things. Writing prose and writing programs share this characteristic.

Sharing the deep satisfaction of computer science is a problem with many facets. Those of us who know the satisfaction know it's a problem worth solving.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

July 13, 2012 12:02 PM

How Science -- and Computing -- Are Changing History

While reading a recent Harvard Magazine article about Eric Mazur's peer instruction technique in physics teaching, I ran across a link to an older paper that fascinated me even more! Who Killed the Men of England? tells several stories of research at the intersection of history, archaeology, genomics, evolution, demography, and simulation, such as the conquest of Roman England by the Anglo Saxons.

Not only in this instance, but across entire fields of inquiry, the traditional boundaries between history and prehistory have been melting away as the study of the human past based on the written record increasingly incorporates the material record of the natural and physical sciences. Recognizing this shift, and seeking to establish fruitful collaborations, a group of Harvard and MIT scholars have begun working together as part of a new initiative for the study of the human past. Organized by [professor of medieval history Michael] McCormick, who studies the fall of the Roman empire, the aim is to bring together researchers from the physical, life, and computer sciences and the humanities to explore the kinds of new data that will advance our understanding of human history.

... The study of the human past, in other words, has entered a new phase in which science has begun to tell stories that were once the sole domain of humanists.

I love history as much as computing and was mesmerized by these stories of how scientists reading the "material record" of the world are adding to our knowledge of the human past.

However, this is more than simply a one-way path of information flowing from scientists to humanists. The scientific data and models themselves are underconstrained. The historians, cultural anthropologists, and demographers are able to provide context to the data and models and so extract even more meaning from them. This is a true collaboration. Very cool.

The rise of science is erasing boundaries between the disciplines that we all studied in school. Scholars are able to define new disciplines, such as "the study of the human past", mentioned in the passage above. These disciplines are organized with a greater focus on what is being studied than on how we are studying it.

We are also blurring the line between history and pre-history. It used to be that history required a written record, but that is no longer a hard limit. Science can read nature's record. Computer scientists can build models using genomic data and migration data that suggest possible paths of change when the written and scientific record are incomplete. These ideas become part of the raw material that humanists use to construct a coherent story of the past.

This change in how we are able to study the world highlights the importance of a broad education, something I've written about a few times recently [ 1 | 2 | 3 ] and not so recently. This sort of scholarship is best done by people who are good at several things, or at least curious and interested enough in several things to get to know them intimately. As I wrote in Failure and the Liberal Arts, it's important both not to be too narrowly trained and not to be too narrowly "liberally educated".

Even at a place like Harvard, this can leave scholars in a quandary:

McCormick is fired with enthusiasm for the future of his discipline. "It is exciting. I jump up every morning. But it is also challenging. Division and department boundaries are real. Even with a generally supportive attitude, it is difficult [to raise funds, to admit students who are excellent in more than one discipline, and so on]. ..."

So I will continue to tell computer science students to take courses from all over the university, not just from CS and math. This is one point of influence I have as a professor, advisor, and department head. And I will continue to look for ways to encourage non-CS students to take CS courses and students outside the sciences to study science, including CS. As that paragraph ends:

"... This is a whole new way of studying the past. It is a unique intellectual opportunity and practically all the pieces are in place. This should happen here--it will happen, whether we are part of it or not."

"Here" doesn't have to be Harvard. There is a lot of work to be done.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

June 30, 2012 10:52 AM

"What Were Alleged to be Ideas"

James Webb Young begins his book A Technique for Producing Ideas with a prefatory note:

The subject is properly one which belongs to the professional psychologist, which I am not. This treatment of it, therefore, can have value only as an expression of the personal experience of one who has had to earn his living by producing what were alleged to be ideas.

With a little tweaking, such as occasionally substituting a different profession for psychologist, this would make a nice disclaimer for many of my blog entries.

Come to think of it, with a little tweaking, this could serve as the basis of a disclaimer for about 98% of the web.

Thanks to David Schmüdde for a pointer to Young's delightful little book.


Posted by Eugene Wallingford | Permalink | Categories: General

June 26, 2012 4:23 PM

Adventures in Advising

Student brings me a proposed schedule for next semester.

Me: "Are you happy with this schedule?"

Student: "If I weren't, why would I have made it?"

All I can think is, "Boy, are you gonna have fun as a programmer."


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

June 06, 2012 3:33 PM

Advice, Platitudes, and Reasonable Decisions

I recently listened to a short clip from Seth Godin's book "The Dip". In it, he quotes Vince Lombardi as saying, "winners never quit, and quitters never win", and then says something to the effect of:

Winners quit all the time. They just quit the right stuff at the right time.

This reminded me of my recent Good Ideas Aren't Always Enough, in which I talk briefly about Ward Cunningham's experience trying to create a universal mark-up language for wiki.

How did Ward know it was the right time to stop pushing for a universal mark-up? Perhaps success was right around the corner. Maybe he just needed a better argument, or a better example, or a better mark-up language.

Inherent in this sort of lesson is a generic variation of the Halting Problem. You can't be sure that an effort will fail until it fails. But the process may never fail explicitly, simply churning on forever. What then?

That's one of the problems with giving advice of the sort my entry gave, or of the sort that Godin gives in his book. The advice itself is empty, because the opposite advice is also true. You only know which advice is right in any given context after the fact -- if ever.

How did Ward know? I'm guessing a combination of:

  • knowledge about the problem,
  • experience with this problem and others like it,
  • relationship with the community of people involved,
  • and... a little luck.

And someone may come along some day with a better argument, or a better example, or a better mark-up language, and succeed. We won't know until it happens.

Maybe such advice is nothing more than platitude. Without any context, it isn't all that helpful, except as motivation to persevere in the face of a challenge (if you want to push on) or consolation in the face of a setback (if you want to focus your energy elsewhere). Still, I think it's useful to know that other people -- accomplished people -- have faced the same choice. Both outcomes are possible. Knowing that, we can use our knowledge, experience, and relationships to make choices that make sense in our current circumstances and live with the outcomes.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

June 01, 2012 4:39 PM

Good Ideas Aren't Always Enough

Ward Cunningham

In his recent Dr. Dobb's interview, Ward Cunningham talked about the wiki community's efforts to create a universal mark-up language. Despite the many advantages of a common language, the idea never took hold. Ward's post-mortem:

So the only thing I can conclude is that as nice as having a universal or portable mark-up would be, it's not nice enough to cause people to give up what they're working on when they work on their wiki.

This is an important lesson to learn, whatever your discipline or your community. It's especially important if you hope to be an agent of change. Good ideas aren't always enough to induce change, even in a community of people working together in an explicit effort to create better ideas. There needs to be enough energy to overcome the natural inertia associated with any set of practices.

Ward's next sentence embodies even more wisdom:

I accept that as the state of nature and don't worry about it too much anymore.

Denial locks you up. Either you continue in vain to push the rejected idea, or you waste precious time and energy lamenting the perceived injustice of the failure.

Acceptance frees you to move on to your project in peace.


Posted by Eugene Wallingford | Permalink | Categories: General

May 31, 2012 3:17 PM

A Department Head's Fantasy

(I recently finished re-reading Straight Man, the 1997 novel by Richard Russo. This fantasy comes straight out of the book.)

Hank Devereaux, beleaguered chair of the English department, has been called in to meet with Dickie Pope, campus CEO. He arrives at Pope's office just as the CEO is wrapping up a meeting with chief of security Lou Steinmetz and another man. Pope says, "Hank, why don't you go on in and make yourself comfortable. I want to walk these fellas to the door." We join Devereaux's narration:

When I go over to Dickie's high windows to take in the view, I'm in time to see the three men emerge below, where they continue their conversation on the steps.... Lou's campus security cruiser is parked at the curb, and the three men stroll toward it. They're seeing Lou off, I presume, .... But when they get to the cruiser, to my surprise, all three men climb into the front seat and drive off. If this is a joke on me, I can't help but admire it. In fact, I make a mental note to employ a version of it myself, soon. Maybe, if I'm to be fired today, I'll convene some sort of emergency meeting, inviting Gracie, and Paul Rourke, and Finny, and Orshee, and one or two other pebbles from my shoe. I'll call the meeting to order, then step outside on some pretext or other, and simply go home. Get Rachel [my secretary] to time them and report back to me on how long it takes them to figure it out. Maybe even get some sort of pool going.

My relationship with my colleagues is nothing like Devereaux's. Unlike him, I like my colleagues. Unlike his colleagues, mine have always treated me with collegiality and respect. I have no reason to wish them ill will or discomfort.

Still. It is a great joke. And I imagine that there are a lot of deans and department chairs and VPs out there who harbor dark fantasies of this sort all the time, especially during those inevitable stretches of politics that plague universities. Even the most optimistic among us can be worn down by the steady drip-drip-drip of dysfunction. There have certainly been days this year when I've gone home at the end of a long week with a sense of doom and a desire for recompense.

Fortunately, an occasional fantasy is usually all I need to deflate the doom and get back to business. That is the voyeuristic allure of novels like Straight Man for me.

But there may come a day when I can't resist temptation. If you see me walking on campus wearing a Groucho Marx nose and glasses, all bets are off.


Posted by Eugene Wallingford | Permalink | Categories: General, Managing and Leading

May 22, 2012 7:53 PM

A Few Days at JRubyConf

It's been fourteen months since I last attended a conference. I decided to celebrate the end of the year, the end of my compiler course, and the prospect of writing a little code this summer by attending JRubyConf 2012. I've programmed a fair amount in Ruby but have only recently begun to play with JRuby, an implementation of Ruby in Java which runs atop the JVM. There are some nice advantages to this, including the ability to use Java graphics with Ruby models and the ability to do real concurrency. It also offers me a nice combination for the summer. I will be teaching our sophomore-level intermediate computing course this fall, which focuses in large part on OO design and Java implementation, so JRuby will let me program in Ruby while doing a little class prep at the same time.

the Stone Arch Bridge in Minneapolis

Conference organizer Nick Sieger opened the event with the obligatory welcome remarks. He said that he thinks the overriding theme of JRubyConf is being a bridge. This is perhaps a natural effect of Minneapolis, a city of many bridges, as the hometown of JRuby, its lead devs, and the conference. The image above is of the Stone Arch Bridge, as seen from the ninth level of the famed Guthrie Center, the conference venue. (The yellow tint is from the window itself.)

The goal for the conference is to be a bridge connecting people to technologies. But it also aims to be a bridge among people, promoting what Sieger called "a more sensitive way of doing business". Emblematic of this goal were its Sunday workshop, a Kids CodeCamp, and its Monday workshop, Railsbridge. This is my first open-source conference, and when I look around I see the issue that so many people talk about. Of 150 or so attendees, there must be fewer than one dozen women and fewer than five African-Americans. The computing world certainly has room to make more and better connections into the world.

My next few entries will cover some of the things I learn at the conference. I start with a smile on my face, because the conference organizers gave me a cookie when I checked in this morning:

the sugar cookie JRubyConf gave me at check-in

That seems like a nice way to say 'hello' to a newcomer.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

May 11, 2012 2:31 PM

Get Busy; Time is Short

After an award-winning author had criticized popular literature, Stephen King responded with advice that is a useful reminder to us all:

Get busy. You have a short life span. You need to stop this crap about sitting there and talking about what we do, and actually do it. Because God gave you some talent, but he also gave you a certain number of years.

You don't have to be an award-winning author to waste precious time commenting on other people's work. Anyone with a web browser can fill his or her day talking about stuff, and not actually making stuff. For academics, it is a professional hazard. We need to balance the analytic and the creative. We learn by studying others' work and writing about it, but we also need to make time to make.

(The passage above comes from Stephen King, The Art of Fiction No. 189, in the wonderful on-line archive of interviews from the Paris Review.)


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

May 08, 2012 3:22 PM

Quality and Quantity, Thoroughbred Edition

I'll Have Another was not highly sought after as a yearling, when he was purchased for the relatively small sum of $11,000.

On Saturday, I'll Have Another rallied down the stretch to win the 2012 Kentucky Derby, passing Bodemeister, one of the race favorites that had led impressively from the gate. Afterward, a television commentator asked the horse's trainer, "What did you and the owner see in the horse way back that made you want to buy it?" The trainer's answer was unusually honest. He said something to the effect of:

We buy a lot of horses. Some work out, and some don't. There is a lot of luck involved. You do the right things and see what happens.

This is as good an example as I've heard in a while of the relationship between quantity and quality, which my memory often connects with stories from the book Art and Fear. People are way too fond of mythologizing successes and then romanticizing the processes that lead to them. In most vocations and most avocations, the best way to succeed is to do the right things, to work hard, be unlucky a lot, and occasionally get lucky.

This mindset does not diminish the value of hard work and good practices. No, it exalts their value. What it diminishes is our sense of control over outcomes in a complex world. Do your best and you will get better. Just keep in mind that we often have a lot less control over success and failure than our mythology tends to tell us.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

April 21, 2012 3:57 PM

A Conflict Between Fashion and the Unfashionable

Passage of the day, courtesy of Dave Winer:

They have started incubators in every major city on the planet. Unfortunately it hasn't been stylish to learn how to program for a number of years, so there aren't that many programmers available to hire. And it takes years to get really good at this stuff.

Hey, they just need a programmer. Or fifty.

While we teach CS students to program, we need to cultivate an entrepreneurial spirit, too. What an opportunity awaits someone with ideas and the ability to carry them out.


Posted by Eugene Wallingford | Permalink | Categories: General

April 20, 2012 3:14 PM

Better Than Everyone, University Edition

Seth Godin recently mentioned something that Clay Shirky has said about the television industry: Forty years ago, you only had to be better than two other shows. Now you have to be better than everybody.

At the same time that technology makes it easier for people to put their creations in front of potential viewers, it makes it harder for established players to retain control over market share. As Godin summarized, "... with a million choices, each show earns the attention it gets in every single moment".

I've mused here periodically about how these same technological changes will ultimately affect universities. It seems that many people agree that education, even higher ed, is "ripe for disruption". Startups such as Boundless are beginning to take their shot at what seems an obvious market, the intersection of education and the beleaguered publishing industry: textbooks.

Though on-line education has been growing now for years, I haven't written anything about it. For one thing, I don't know what I really think of it yet. As much as I think out loud when I blog, I usually at least have a well-formed thought or two. When it comes to on-line education, my brain is still mostly full of mush.

Not long ago, the threat of on-line education to the way traditional universities operate did not seem imminent. That is, I think, starting to change. When the primary on-line players were non-traditional alternatives such as the University of Phoenix, it seemed easy enough to sell the benefits of the brick-and-ivy campus-based education to people. But as these schools slowly build a track record -- and an alumni base -- they will become a common enough part of the popular landscape that they become an acceptable alternative to many people. And as the cost of brick-and-ivy education rises, it becomes harder and harder to sell people on its value.

Of course, we now see a burgeoning in the number of on-line offerings from established universities. Big-name schools like MIT and Harvard have made full courses, and even suites of courses, available on-line. One of my more experienced colleagues began to get antsy when this process picked up speed a few years ago. Who wouldn't prefer MIT's artificial intelligence course over ours? These courses weren't yet available for credit, which left us with hope. We offer our course as part of a coherent program of study that leads to a credential that students and employers value. But in time...

... that would change. And it has. Udacity has spun itself off from Stanford and is setting its sights on a full on-line curriculum. A recent Computer World article talks about MITx, a similar program growing out of MIT. These programs are still being created and will likely offer a different sort of credential than the universities that gave birth to them, at least at the start. Is there still hope?

Less and less. As the article reports, other established universities are now offering full CS programs on-line. The University of Illinois at Springfield started in 2006 and now has more computer science students enrolled in its on-line undergrad and M.S. programs (171 and 146, respectively) than their on-campus counterparts (121 and 129). In June, Oregon State will begin offering a CS degree program on-line.

The natural reaction of many schools is to join in the rush. Schools like mine are putting more financial and faculty resources into the creation of on-line courses and programs, because "that's where the future lies".

I think, though, that Shirky's anecdote about the TV industry serves as an important cautionary tale. The caution has two prongs.

First, you have to adapt. When a disruptive technology comes along, you have to respond. You may think that you are good enough or dominant enough to survive the wave, but you probably aren't. Giants that retain their position atop a local maximum when a new technology redefines an industry quickly change from giants to dinosaurs.

Adapting isn't easy. Clayton Christensen and his colleagues have documented how difficult it is for a company that is very good at something and delivering value in its market to change course. Even with foresight and a vision, it is difficult to overcome inertia and external forces that push a company to stay on the same track.

Second, technology lowers barriers for producers and consumers alike. It's no longer enough to be the best teaching university in your state or neighborhood. Now you have to be better than everybody. If you are a computer science department, that seems an insurmountable task. Maybe you can be better than Illinois-Springfield (and maybe not!), but how can you be better than Stanford, MIT, and Harvard?

Before joining the rush to offer programs on-line, you might want to have an idea of what it is that you will be the best at, and for whom. With degrees from Illinois-Springfield, Oregon State, Udacity, Stanford, MIT, and Harvard only a few clicks away, you will have to earn the attention -- and tuition -- you receive from every single student.

But don't dally. It's lonely as the dominant player in a market that no longer exists.


Posted by Eugene Wallingford | Permalink | Categories: General

April 09, 2012 2:53 PM

Should I Change My Major?

Recruiting brochures for academic departments often list the kinds of jobs that students get when they graduate. Brochures for CS departments tend to list jobs such as "computer programmer", "system administrator", "software engineer", and "systems analyst". More ambitious lists include "CS professor" and "entrepreneur". I've been promoting entrepreneurship as a path for our CS grads for a few years now.

This morning, I was browsing the tables at one of my college's preview day sessions and came across my new all-time favorite job title for graduates. If you major in philosophy at my university, it turns out that one of the possible future job opportunities awaiting you is...

Bishop or Pope

Learning to program gives you superhuman strength, but I'm not sure a CS major can give you a direct line to God. I say, "Go for it."


Posted by Eugene Wallingford | Permalink | Categories: General

April 06, 2012 4:29 PM

A Reflection on Alan Turing, Representation, and Universal Machines

Douglas Hofstadter speaking at UNI

The day after Douglas Hofstadter spoke here on assertions, proofs, and Gödel's theorem, he gave a second public lecture hosted by the philosophy department. Ahead of time, we knew only that Hofstadter would reflect on Turing during his centennial. I went in expecting more on the Turing test, or perhaps a popular talk on Turing's proof of The Halting Problem. Instead, he riffed on Chapter 17 from I Am a Strange Loop.

In the end, we are self-perceiving, self-inventing, locked-in mirages that are little miracles of self-reference.

Turing, he said, is another peak in the landscape occupied by Tarski and Gödel, whose work he had discussed the night before. (As a computer scientist, I wanted to add to this set contemporaries such as Alonzo Church and Claude Shannon.) Hofstadter mentioned Turing's seminal paper about the Entscheidungsproblem but wanted to focus instead on the model of computation for which he is known, usually referred to by the name "Turing machine". In particular, he asked us to consider a key distinction that Turing made when talking about his model: that between dedicated and universal machines.

A dedicated machine performs one task. Human history is replete with dedicated machines, whether simple, like the wheel, or complex, such as a typewriter. We can use these tools with different ends in mind, but the basic work is fixed in their substance and structure.

The 21st-century cell phone is, in contrast, a universal machine. It can take pictures, record audio, and -- yes -- even be used as a phone. But it can also do other things for us, if we but go to the app store and download another program.

Hofstadter shared a few of his early personal experiences with programs enabling line printers to perform tasks for which they had not been specifically designed. He recalled seeing a two-dimensional graph plotted by "printing" mostly blank lines that contained a single *. Text had been turned into graphics. Taking the idea further, someone used the computer to print a large number of cards which, when given to members of the crowd at a football game, could be used to create a massive two-dimensional message visible from afar. Even further, someone used a very specific layout of the characters available on the line printer to produce a print-out that appeared from the other side of the room to be a black-and-white photograph of Raquel Welch. Text had been turned into image.

People saw each of these displays as images by virtue of our eyes and mind interpreting a specific configuration of characters in a certain way. We can take that idea down a level into the computer itself. Consider this transformation of bits:

0000 0000 0110 1011 → 0110 1011 0000 0000

A computer engineer might see this as a "left shift" of 8 bits. A computer programmer might see it as multiplying the number on the left by 256. A graphic designer might see us moving color from one pixel to another. A typesetter may see one letter being changed into another. What one sees depends on how one interprets what the data represent and what the process means.

Alan Turing was the first to express clearly the idea that a machine can do them all.

"Aren't those really binary numbers?", someone asked. "Isn't that real, and everything else interpretation?" Hofstadter said that this is a tempting perspective, but we need to keep in mind that they aren't numbers at all. They are, in most computers, pulses of electricity, or the states of electronic components, that we interpret as 0s and 1s.

After we have settled on interpreting those pulses or states as 0s and 1s, we then interpret configurations of 0s and 1s to mean something else, such as decimal numbers, colors, or characters. This second level of interpretation exposes the flaw in popular claims that computers can "only" process 0s and 1s. Computers can deal with numbers, colors, or characters -- anything that can be represented in any way -- when we interpret not only what the data mean but also what the process means.
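A few lines of Java make the multiple readings concrete. (Java is my own choice of illustration here, not anything from the talk.)

    int bits = 0b0000_0000_0110_1011;  // the pattern on the left of the arrow above
    int shifted = bits << 8;           // the engineer's reading: shift left by 8 bits
    int scaled  = bits * 256;          // the programmer's reading: multiply by 256
    char letter = (char) bits;         // the typesetter's reading: the character 'k'
    // shifted == scaled: two interpretations, one and the same resulting bit pattern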

(In the course of talking about representations, he threw in a cool numeric example: Given an integer N, factor it as 2^a * 3^b * 5^c * 7^d ... and use [a.b.c.d. ...] to stand for N. I see a programming assignment or two lying in wait.)
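One possible version of that assignment, sketched in Java. The fixed prime table and the method names are my own choices for this sketch, not part of the talk:

    // Encode a short list of non-negative integers a, b, c, ... as the single
    // number 2^a * 3^b * 5^c * ..., then recover the list by repeated division.
    static final int[] PRIMES = {2, 3, 5, 7, 11, 13};

    static long encode(int[] exponents) {
        long n = 1;
        for (int i = 0; i < exponents.length; i++)
            for (int e = 0; e < exponents[i]; e++)
                n *= PRIMES[i];
        return n;
    }

    static int[] decode(long n) {
        int[] exponents = new int[PRIMES.length];
        for (int i = 0; i < PRIMES.length; i++)
            while (n % PRIMES[i] == 0) {
                n /= PRIMES[i];
                exponents[i]++;
            }
        return exponents;
    }

For example, encode(new int[] {3, 1}) is 24, and decode(24) gives back {3, 1, 0, 0, 0, 0}.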

The dual ideas of representation and interpretation take us into a new dimension. The Principia Mathematica describes a set of axioms and formal rules for reasoning about numeric structures. Gödel saw that it could be viewed at a higher level, as a system in its own right -- as a structure of integers. Thus the Principia can talk about itself. It is, in a sense, universal.

This is the launching point for Turing's greatest insight. In I Am a Strange Loop, Hofstadter writes:

Inspired by Gödel's mapping of PM into itself, Alan Turing realized that the critical threshold for this kind of computational universality comes exactly at the point where a machine is flexible enough to read and correctly interpret a set of data that describes its own structure. At this crucial juncture, a machine can, in principle, explicitly watch how it does any particular task, step by step. Turing realized that a machine that has this critical level of flexibility can imitate any other machine, no matter how complex the latter is. In other words, there is nothing more flexible than a universal machine. Universality is as far as you can go!

Alan Turing

Thus Turing was the first person to recognize the idea of a universal machine, circa 1935-1936: that a Turing machine can be given, as input, data that encodes its own instructions. This is the beginning of perhaps the biggest of the Big Ideas of computer science: the duality of data and program.

We should all be glad he didn't patent this idea.

Turing didn't stop there, of course, as I wrote in my recent entry on the Turing test. He recognized that humans are remarkably capable and efficient representational machines.

Hofstadter illustrates this with the idea of "hub", a three-letter word that embodies an enormous amount of experience and knowledge, chunked in numerous ways and accreted slowly over time. The concept is assembled in our minds out of our experiences. It is a representation. Bound up in that representation is an understanding of ourselves as actors in certain kinds of interactions, such as booking a flight on an airplane.

It is this facility with representations that distinguishes us humans from dogs and other animals. They don't seem capable of seeing themselves or others as representations. Human beings, though, naturally take other people's representations into their own. This results in a range of familiarities and verisimilitude. We "absorb" some people so well that we feel we know them intimately. This is what we mean when we say that someone is "in our soul". We use the word 'soul' not in a religious sense; we are referring to our essence.

Viewed this way, we are all distributed beings. We are "out there", in other people, as well as "in here", in ourselves. We've all had dreams of the sort Hofstadter used as an example, a dream in which his deceased father appeared, seemingly as real as he ever had been while alive. I myself recently dreamt that I was running, and the experience of myself was as real as anything I feel when I'm awake. Because we are universal machines, we are able to process the representations we hold of ourselves and of others and create sensations that feel just like the ones we have when we interact in the world.

It is this sense that we are self-representation machines that gives rise to the title of his book, "I am a strange loop". In Hofstadter's view, our identity is a representation of self that we construct, like any other representation.

This idea underlies the importance of the Turing test. It takes more than "just syntax" to pass the test. Indeed, syntax is itself more than "just" syntax! We quickly recurse into the dimension of representation, of models, and a need for self-reference that makes our syntactic rules more than "just" rules.

Indeed, as self-representation machines, we are able to have a sense of our own smallness within the larger system. This can be scary, but also good. It makes life seem precious, so we feel a need to contribute to the world, to matter somehow.

Whenever I teach our AI course, I encounter students who are, for religious or philosophical reasons, deeply averse to the idea of an intelligent machine, or even of scientific explanations of who we are. When I think about identity in terms of self-representation, I can't help but feel that, at an important level, it does not matter. God or not, I am in awe of who we are and how we got to here.

So, we owe Alan Turing a great debt. Building on the work of philosophers, mathematicians, and logicians, Turing gave us the essential insight of the universal machine, on which modern computing is built. He also gave us a new vocabulary with which to think about our identity and how we understand the world. I hope you can appreciate why celebrating his centennial is worthwhile.

~~~~

IMAGE 1: a photo of Douglas Hofstadter speaking at UNI, March 7, 2012. Source: Kevin C. O'Kane.

IMAGE 2: the Alan Turing centenary celebration. Source: 2012 The Alan Turing Year.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

April 04, 2012 4:39 PM

Computational Search Answers an Important Question

Update: Well, this is embarrassing. Apparently, Mat and I were the victims of a prank by the folks at ChessBase. You'd think that, after more than twenty-five years on the internet, I would be more circumspect at this time of year. Rather than delete the post, I will leave it here for the sake of posterity. If nothing else, my students can get a chuckle from their professor getting caught red-faced.

I stand behind my discussion of solving games, my recommendation of Rybka, and my praise for My 60 Memorable Games (my favorite chess book of all time). I also still marvel at the chess mind of Bobby Fischer.

~~~~

Thanks to reader Mat Roberts for pointing me to this interview with programmer Vasik Rajlich, which describes a recent computational result of his: one of the most famous openings in chess, the King's Gambit, is a forced draw.

Games are, of course, a fertile testbed for computing research, including AI and parallel computation. Many researchers make it one of their goals to "solve" a game, that is, to show that, with best play by both players, a game has a particular outcome. Games with long histories and large communities of players naturally attract a lot of interest, and solving one of them is usually considered a valuable achievement.

For us in CS, interest grows with the complexity of the game. Solving Connect Four was cool, but solving Othello on a full-sized board would be cooler. Almost five years ago, I blogged about what I still consider the most impressive result in this domain: the solving of checkers by Jonathan Schaeffer and his team at the University of Alberta.
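To make "solving" concrete, here is a toy solver in Java for single-pile Nim (take one, two, or three stones per turn; whoever takes the last stone wins). The game is my own trivial stand-in -- it bears no resemblance to checkers or chess -- but the shape of the computation, exhaustive search under best play by both sides, is the same.

    // Returns true if the player to move can force a win with best play by both sides.
    static boolean playerToMoveWins(int stones) {
        if (stones == 0)
            return false;                          // no stones left: the previous player took the last one
        for (int take = 1; take <= Math.min(3, stones); take++)
            if (!playerToMoveWins(stones - take))  // a move that leaves the opponent lost...
                return true;                       // ...wins for the player to move
        return false;                              // every move hands the opponent a win
    }

For this toy game the answer has a clean pattern (the player to move loses exactly when the pile size is a multiple of four); for checkers, Schaeffer's team needed years of computation and enormous endgame databases to reach an analogous verdict.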

the King's Gambit

The chess result is more limited. Rajlich, an International Master of chess and the programmer of the chess engine Rybka, has shown results only for games that begin 1.e4 e5 2.f4 exf4. If White plays 3.Nf3 -- the most common next move -- then Black can win with 3... d6. 3.Bc4 also loses. Only one move for White can force a draw, the uncommon 3.Be2. Keep in mind that these results all assume best play by both players from there on out. White can win, lose, or draw in all variations if either player plays a sub-optimal move.

I say "only" when describing this result because it leaves a lot of chess unsolved, all games starting with some other sequence of moves. Yet the accomplishment is still quite impressive! The King's Gambit is one of the oldest and most storied opening sequences in all of chess, and it remains popular to this day among players at every level of skill.

Besides, consider the computational resources that Rajlich had to use to solve even the King's Gambit:

... a cluster of computers, currently around 300 cores [created by Lukas Cimiotti, hooked up to] a massively parallel cluster of IBM POWER 7 Servers provided by David Slate, senior manager of IBM's Semantic Analysis and Integration department -- 2,880 cores at 4.25 GHz, 16 terabytes of RAM, very similar to the hardware used by IBM's Watson in winning the TV show "Jeopardy". The IBM servers ran a port of the latest version of Rybka, and computation was split across the two clusters, with the Cimiotti cluster distributing the search to the IBM hardware.

Oh, and this setup had to run for over four months to solve the opening. I call that impressive. If you want something less computationally intensive yet still able to beat you, me, and everybody we know at chess, you can buy Rybka, a chess engine available commercially. (An older version is available for free!)

What effect will this result have on human play? Not much, practically speaking. Our brains aren't big enough or fast enough to compute all the possible paths, so human players will continue to play the opening, create new ideas, and explore the action in real time over the board. Maybe players with the Black pieces will be more likely to play one of the known winning moves now, but results will remain uneven between White and Black. The opening leads to complicated positions.

the cover of Bobby Fischer's 'My 60 Memorable Games'

If, like some people, you worry that results such as this one somehow diminish us as human beings, take a look again at the computational resources that were required to solve this sliver of one game, the merest sliver of human life, and then consider: This is not the first time that someone has claimed that the King's Gambit is busted. In 1961, an eighteen-year-old U.S. chess champion named Bobby Fischer published an article claiming that 1.e4 e5 2.f4 exf4 3.Nf3 was a forced loss. His prescription? 3... d6. Now we know for sure. Like so many advances in AI, this one leaves me marveling at the power of the human mind.

Well, at least Bobby Fischer's mind.

~~~~

IMAGE 1: The King's Gambit. Source: Wikimedia Commons.

IMAGE 2: a photograph of the cover of my copy of My 60 Memorable Games by Bobby Fischer. Bobby analyzes a King's Gambit or two in this classic collection of games.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

March 30, 2012 5:22 PM

A Reflection on Alan Turing, the Turing Test, and Machine Intelligence

Alan Turing

In 1950, Alan Turing published a paper that launched the discipline of artificial intelligence, Computing Machinery and Intelligence. If you have not read this paper, go and do so. Now. 2012 is the centennial of Turing's birth, and you owe yourself a read of this seminal paper as part of the celebration. It is a wonderful work from a wonderful mind.

This paper gave us the Imitation Game, an attempt to replace the question of whether a computer could be intelligent with something more concrete: a probing dialogue. The Imitation Game became the Turing Test, now a staple of modern culture and the inspiration for contests and analogies and speculation. After reading the paper, you will understand something that many people do not: Turing is not describing a way for us to tell the difference between human intelligence and machine intelligence. He is telling us that the distinction is not as important as we seem to think. Indeed, I think he is telling us that there is no distinction at all.

I mentioned in an entry a few years ago that I always have my undergrad AI students read Turing's paper and discuss the implications of what we now call the Turing Test. Students would often get hung up on religious objections or, as noted in that entry, a deep and a-rational belief in "gut instinct". A few ended up putting their heads in the sand, as Turing knew they might, because they simply didn't want to confront the implication of intelligences other than our own. And yet they were in an AI course, learning techniques that enable us to write "intelligent" programs. Even students with the most diehard objections wanted to write programs that could learn from experience.

Douglas Hofstadter, who visited campus this month, has encountered another response to the Turing Test that surprised him. On his second day here, in honor of the Turing centenary, Hofstadter offered a seminar on some ideas related to the Turing Test. He quoted two snippets of hypothetical man-machine dialogue from Turing's seminal paper in his classic Gödel, Escher, Bach. Over the years, he occasionally runs into philosophers who think the Turing Test is shallow, trivial to pass with trickery and "mere syntax". Some are concerned that it explores "only behavior". Is behavior all there is? they ask.

As a computer programmer, the idea that the Turing test explores only behavior never bothered me. Certainly, a computer program is a static construct and, however complex it is, we can read and understand it. (Students who take my programming languages course learn that even another program can read and process programs in a helpful way.) This was not a problem for Hofstadter either, growing up as he did in a physicist's household. Indeed, he found Turing's formulation of the Imitation Game to be deep and brilliant. Many of us who are drawn to AI feel the same. "If I could write a program capable of playing the Imitation Game," we think, "I will have done something remarkable."

One of Hofstadter's primary goals in writing GEB was to make a compelling case for Turing's vision.

Douglas Hofstadter

Those of us who attended the Turing seminar read a section from Chapter 13 of Le Ton beau de Marot, a more recent book by Hofstadter in which he explores many of the same ideas about words, concepts, meaning, and machine intelligence as GEB, in the context of translating text from one language to another. Hofstadter said the focus in this book is on the subtlety of words and the ideas they embody, and what that means for translation. Of course, these are some of the issues that underlie Turing's use of dialogue as sufficient for us to understand what it means to be intelligent.

In the seminar, he shared with us some of his efforts to translate a modern French poem into faithful English. His source poem had itself been translated from older French into modern French by a French poet friend of his. I enjoyed hearing him talk about "the forces" that pushed him toward and away from particular words and phrases. Le Ton beau de Marot uses creative dialogues of the sort seen in GEB, this time between the Ace Mechanical Translator (his fictional computer program) and a Dull Rigid Human. Notice the initials of his raconteurs! They are an homage to Turing. The human translator, Douglas R. Hofstadter himself, is cast in the role of AMT, which shares its initials with Alan M. Turing, the man who started this conversation over sixty years ago.

Like Hofstadter, I have often encountered people who object to the Turing test. Many of my AI colleagues are comfortable with a behavioral test for intelligence but dislike that Turing considers only linguistic behavior. I am comfortable with linguistic behavior because it captures what is for me the most important feature of intelligence: the ability to express and discuss ideas.

Others object that it sets too low a bar for AI, because it is agnostic on method. What if a program "passes the test", and when we look inside the box we don't understand what we see? Or worse, we do understand what we see and are unimpressed? I think that this is beside the point. Not to say that we shouldn't want to understand. If we found such a program, I think that we would make it an overriding goal to figure out how it works. But how an entity manages to be "intelligent" is a different question from whether it is intelligent. That is precisely Turing's point!

I agree with Brian Christian, who won the prize for being "The Most Human Human" in a competition based on Turing's now-famous test. In an interview with The Paris Review, he said,

Some see the history of AI as a dehumanizing narrative; I see it as much the reverse.

Turing does not diminish what it is to be human when he suggests that a computer might be able to carry on a rich conversation about something meaningful. Neither do AI researchers, or teenagers like me, who dreamed of figuring out just what it is that makes it possible for humans to do what we do. We ask the question precisely because we are amazed. Christian again:

We build these things in our own image, leveraging all the understanding of ourselves we have, and then we get to see where they fall short. That gap always has something new to teach us about who we are.

As in science itself, every time we push back the curtain, we find another layer of amazement -- and more questions.

I agree with Hofstadter. If a computer could do what it does in Turing's dialogues, then no one could rightly say that it wasn't "intelligent", whatever that might mean. Turing was right.

~~~~

PHOTOGRAPH 1: the Alan Turing centenary celebration. Source: 2012 The Alan Turing Year.

PHOTOGRAPH 2: Douglas Hofstadter in Bologna, Italy, 2002. Source: Wikimedia Commons.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General

March 27, 2012 4:53 PM

Faculty Workload and the Cost of Universities

This morning, @tonybibbs tweeted me a link to a Washington Post piece called Do college professors work hard enough?, wondering what I might think.

Author David Levy calls for "reforms for outmoded employment policies that overcompensate faculty for inefficient teaching schedules". Once, he says, faculty were generally underpaid relative to comparably educated professionals; now senior faculty at most state universities earn salaries roughly in line with comparable professionals.

Not changed, however, are the accommodations designed to compensate for low pay in earlier times. Though faculty salaries now mirror those of most upper-middle-class Americans working 40 hours for 50 weeks, they continue to pay for teaching time of nine to 15 hours per week for 30 weeks, making possible a month-long winter break, a week off in the spring and a summer vacation from mid-May until September.

My initial impressions after a quick read this morning were

  1. Yes, some faculty work too little.
  2. Most faculty work more than he seems to think.
  3. Changing #1 is hard.

After a second read, that is still my impression. Let me expand.

Before beginning, let me note that Levy mentions three kinds of institutions: research universities, teaching universities, and community colleges. I myself can't offer informed comment on community college faculty. I have spent my professional career as a faculty member and department head at a teaching university. I also spent six years in grad school at an R-1 institution and have many colleagues and friends who work at research schools. Finally, I am in Computer Science, not a more stable discipline. These are the experiences on which I draw.

First, #2. Levy seems willing to grant that faculty at research institutions work longer hours, or if not, at least that the work they do is so valuable as to earn high pay. I agree. Levy seems unwilling to grant similar effort or importance to what faculty at teaching universities do. He thinks himself generous in allowing that the latter might spend as much time in prep as in class and concludes that "the notion that faculty in teaching institutions work a 40-hour week is a myth".

At my school, data on faculty workload have routinely shown that on average faculty work more than fifty hours per week. When I was full-time faculty, my numbers were generally closer to sixty. (As a department head for the last few years, I have generally worked more.) These numbers are self-reported, but I have good reason to trust them, having observed what faculty in my department do.

If we aren't meeting an R-1 school's stringent requirements for research, publication, and grant writing, what are we doing? We actually do spend more hours per week working outside the classroom than inside it. We are preparing new course materials, meeting with students in office hours and the lab, and experimenting with new programming languages and technologies that can improve our courses (or make some of our course content obsolete). We advise undergrads and supervise their research projects. Many departments have small grad programs, which bring with them some of the duties that R-1 profs face.

We also do scholarship. Most teaching schools do expect some research and publication, though clearly not at the level expected by the R-1s. Teaching schools are also somewhat broader in the venues for publication that they accept, allowing teaching conferences such as SIGCSE or formal workshops like the SPLASH (née OOPSLA) Educators' Symposium. With those caveats, publishing a paper or more per year is not an unusual scholarship expectation at schools like mine.

During the summer, faculty at teaching universities are often doing research, writing, or preparing and teaching workshops for which they are paid little, if anything. Such faculty may have time for more vacation than other professionals, but I don't think many of them are sailing the Caribbean for the 20+ weeks that Levy presumes they have free.

Levy does mention service to the institution in the form of committee work. Large organizations do not run themselves. From what I remember of my time in grad school, most of my professors devoted relatively little time to committees. They were busy doing research and leading their teams. The university must have had paid staff doing a lot of the grunt work to keep the institution moving. At a school like mine, many faculty carry heavy service loads. Perhaps we could streamline the bureaucracy to eliminate some of this work, or hire staff to do it, but it really does consume a non-trivial amount of some faculty members' time and energy.

After offering these counterpoints -- which I understand may be seen as self-serving, given where I work -- what of #1? It is certainly the case that some university faculty work too little. Expectations for productivity in research and other scholarship have often been soft in the past, and only now are many schools coming to grips with the full cost of faculty productivity.

Recently, my school has begun to confront a long-term decline in real funding from the state, realizing that it cannot continue to raise tuition to make up for the gap. One administrative initiative asked department heads and faculty to examine scholarly productivity of faculty and assign professors who have not produced enough papers, grant proposals, or other scholarly results over a five-year period to teach an extra course. There were some problems in how administrators launched and communicated this initiative, but the idea is a reasonable one. If faculty are allocated time for scholarship but aren't doing much, then they can use that time to teach a course.

The reaction of most faculty was skepticism and concern. (This was true of department heads as well, because most of us think of ourselves as faculty temporarily playing an administrator's role.)

That brings us to #3. Changing a culture is hard. It creates uncertainty. When expectations have been implicit, it is hard to make them explicit in a way that allows enforcement while at the same time recognizing the value in what most faculty have been doing. The very word "enforcement" runs counter to the academic culture, in which faculty are left free to study and create in ways that improve their students' education and in which it is presumed faculty are behaving honorably.

In this sense, Levy's article hits on an issue that faces universities and the people who pay for them: taxpayers, students and parents who pay tuition, and granting agencies. I agree with Levy that addressing this issue is essential as universities come to live in a world with different cost structures and different social contracts. He seems to understand that change will be hard. However, I'm not sure he has an accurate view of what faculty at teaching universities are already doing.


Posted by Eugene Wallingford | Permalink | Categories: General

February 29, 2012 4:40 PM

From Mass Producing Rule Followers to Educating Creators

The bottom is not a good place to be, even if you're capable of getting there.

Seth Godin's latest manifesto, Stop Stealing Dreams, calls for a change to the way we educate our children. I've written some about how changes in technology and culture will likely disrupt universities, but Godin bases his manifesto on a simpler premise: we have to change what we achieve through education because what we need has changed. Historically, he claims, our K-12 system has excelled at one task: "churning out kids who are stuck looking for jobs where the boss tells them exactly what to do".

As negatively as that is phrased, it may well have been a reasonable goal for a new universal system of compulsory education in the first half of the 1900s. But times have changed, technology has changed, our economy has changed, and our needs have changed. Besides, universal education is a reality now, not a dream, so perhaps we should set our sights higher.

I only began to read Godin's book this afternoon. I'm curious to see how well the ideas in it apply to university education. The role of our universities has changed over time, too, including rapid growth in the number of people continuing their education after high school. The number and variety of public universities grew through the 1960s and 1970s in part to meet the new demand.

Yet, at its root, undergraduate education is, for most students, a continuation of the same model they experienced K-12: follow a prescribed program of study, attend classes, do assignments, pass tests, and follow rules. A few students avail themselves of something better as undergrads, but it's really not until grad school that most people have a chance to participate fully in the exploration for and creation of knowledge. And that is the result of self-selection: those most interested in such an education seek it out. Alas, many undergrads seem hardly prepared to begin driving their own educations, let alone interested.

That is one of the challenges university professors face. From my experience as a student and a father of students, I know that many HS teachers are working hard to open their students' minds to bigger ideas, too -- when they have the chance, that is, amid the factory-style mass production system that dominates many high schools today.

As I sat down to write this, it occurred to me that learning to program is a great avenue toward becoming a creator and an innovator. Sadly, most CS programs seem satisfied to keep doing the same old thing: to churn out people who are good at doing what they are told. I think many university professors, myself included, could do better by keeping this risk in mind. Every day as I enter the classroom, I should ask myself what today's session will do for my students: kill a dream, or empower it?

While working out this morning, my iPod served up John Hiatt's song, "Everybody Went Low" (available on YouTube). The juxtaposition of "going low" in the song and Godin's warning about striving for the bottom created an interesting mash-up in my brain. As Hiatt sings, when you are at the bottom, there is:

Nothing there to live up to
There's nothing further down
Turn it off or turn around

Big systems with lots of moving parts are hard to turn around. I hope we can do it before we get too low.


Posted by Eugene Wallingford | Permalink | Categories: General

February 25, 2012 3:04 PM

I Did the Reading, and Watched the Video

David Foster Wallace, 2006

It seems that I've been running across David Foster Wallace everywhere for the last few months. I am currently reading his collection A Supposedly Fun Thing I'll Never Do Again. I picked it up for a tennis-turned-philosophy essay titled, improbably, "Tennis Player Michael Joyce's Professional Artistry as a Paradigm of Certain Stuff about Choice, Freedom, Discipline, Joy, Grotesquerie, and Human Completeness". (You know I am a tennis fan.) On the way to reading that piece, I got hooked on the essay about filmmaker David Lynch. I am not a fan of Wallace's fiction, but his literary non-fiction arrests me.

This morning, I am listening to a lengthy uncut interview with Wallace from 2003, courtesy of fogus. In it, Wallace comes across just as he does in his written work: smart, well-read, and deeply thoughtful. He also seems so remarkably pleasant -- not the sort of thing I usually think of as a default trait in celebrities. His pleasantness feels very familiar to me as a fellow Midwesterner.

The video also offers occasionally haunting images, in his mannerisms but especially in his eyes. His obvious discomfort makes me uncomfortable as I watch. It occurs to me that perhaps I feel this way only because I know how his life ended, but I don't think that's true.

The interview contains many thought-provoking responses and interchanges. One particular phrase will stay with me for a while. Wallace mentions the fondness Americans have for the freedom to choose, the freedom to satisfy our desires. He reminds us that inherent in such freedom is a grave risk: a "peculiar kind of slavery", in which we feel we must satisfy our desires, must act on our impulses. Where is the freedom in that prison?

There is also a simple line that appealed to the teacher in me: "It takes skill and education to get good enough at reading or listening to be able to derive pleasure from it." This is one of the challenges that faces teachers everywhere. Many things require skill and education -- and time -- in order for students to be able to derive satisfaction and even pleasure from them. Computer programming is one.

I recommend this interview to anyone interested in modern culture, especially American culture.

As I listened, I was reminded of this exchange from a short blog entry by Seth Godin from last year:

A guy asked his friend, the writer David Foster Wallace,

"Say, Dave, how'd y'get t'be so dang smart?"

His answer:

"I did the reading."

Wallace clearly did the reading.

~~~~

PHOTOGRAPH: David Foster Wallace at the Hammer Museum in Los Angeles, January 2006. Source: Wikimedia Commons.


Posted by Eugene Wallingford | Permalink | Categories: General

February 06, 2012 6:26 PM

Shopping Blog Entries to a Wider Audience

Over the last couple of years, our university relations department has been trying to promote more actively the university's role in the public sphere. One element of this effort is pushing faculty work and professional commentary out into wider circulation. For example, before and after the recent presidential caucuses in Iowa, they helped connect local political science profs with media who were looking for professional commentary from in the trenches.

Well, they have now discovered my blog and are interested in shopping several pieces of general interest to more traditional media outlets, such as national newspapers and professional magazines. Their first effort involves a piece I wrote about liberal education last month, which builds on two related pieces, here and here. I'm in the process of putting it into a form suitable for standalone publication. This includes polishing up some of the language, as well as removing my reliance on links to other articles -- links being one of the great wins of the networked world.

Another big win of the networked world is the ease with which we can get feedback and make our ideas and our writing better. If you have any suggestions for how I might improve the presentation of the ideas in these pieces, or even the ideas themselves, please let me know. As always, I appreciate your thoughts and willingness to discuss them with me.

When I mentioned this situation in passing on Twitter recently, a former student asked whether my blog's being on the university's radar would cause me to write differently. The fact is that I have always tried to respect my university, my colleagues, and my students when I write, and to keep their privacy and integrity in mind. This naturally results in some level of self-censorship. Still, I have always tried to write openly and honestly about what I think and learn.

You can rest assured. This blog remains mine alone and will continue to speak in my voice. I will write as openly and honestly as ever. That is the only way that the things I write could ever be of much interest to readers such as you, let alone to me.


Posted by Eugene Wallingford | Permalink | Categories: General

January 25, 2012 3:45 PM

Pragmatism and the Scientific Spirit

the philosopher William James

Last week, I found myself reading The Most Entertaining Philosopher, about William James. It was good fun. I have always liked James. I liked the work of his colleagues in pragmatism, C.S. Peirce and John Dewey, too, but I always liked James more. For all the weaknesses of his formulation of pragmatism, he always seemed so much more human to me than Peirce, who did the heavy theoretical lifting to create pragmatism as a formal philosophy. And he always seemed a lot more fun than Dewey.

I wrote an entry a few years ago called The Academic Future of Agile Methods, which described the connection between pragmatism and my earlier work in AI, as well as agile software development. I still consider myself a pragmatist, though it's tough to explain just what that means. The pragmatic stance is too often confounded with a self-serving view of the world, a "whatever works is true" philosophy. Whatever works... for me. James's references to the "cash value" of truth didn't help. (James himself tried to undo the phrase's ill effects, but it has stuck. Even in the 1800s, it seems, a good sound bite was better than the truth.)

As John Banville, the author of the NY Times book review piece, says, "It is far easier to act in the spirit of pragmatism than to describe what it is." He then gives "perhaps the most concise and elegant definition" of pragmatism, by philosopher C. I. Lewis. It is a definition that captures the spirit of pragmatism as well as any few lines can:

Pragmatism could be characterized as the doctrine that all problems are at bottom problems of conduct, that all judgments are, implicitly, judgments of value, and that, as there can be ultimately no valid distinction of theoretical and practical, so there can be no final separation of questions of truth of any kind from questions of the justifiable ends of action.

This is what drew me to pragmatism while doing work in knowledge-based systems, as a reaction to the prevailing view of logical AI that seemed based in idealist and realist epistemologies. It is also what seems to me to distinguish agile approaches to software development from the more common views of software engineering. I applaud people who are trying to create an overarching model for software development, a capital-t Theory, but I'm skeptical. The agile mindset is, or at least can be, pragmatic. I view software development in much the way James viewed consciousness: "not a thing or a place, but a process".

As I read again about James and his approach, I remember my first encounters with pragmatism and thinking: Pragmatism is science; other forms of epistemology are mathematics.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

January 12, 2012 3:38 PM

At Least I'm Not Alone

One of the things I love about reading professional blogs and Twitter feeds is reassuring myself that I am not crazy in many of my compulsions and obsessions.

On the exercise bike yesterday morning, I read Matt Might's End artificial scarcities to increase productivity. Many years ago I saw my colleague, friend, and hero Joe Bergin do something that I now do faithfully: always carry with me a pad of paper, small enough to fit comfortably in most any pocket, and a good pen. When your life is writing code, writing lectures, writing blog entries, you often want to write at the oddest of times. Now I am always ready to jot down any idea that comes into my head as soon as it does. I may throw it away later as a hare-brained scheme, but I prefer that to losing an idea for lack of a notepad.

Our house has pens, pencils, and usually paper in nearly every room. I have them in every bag I carry and in most coats I wear. The kind of pen matters some; I hate splotching and bleeding through. I have a fondness for a particular older model of Uniball pens, but I'm not obsessed with them. I do have a box of them in my desk at home, and every once in a while I'll pull one out to replace a pen that has run dry. They feel right in my hand.

Like Might, I have MacBook Pro power adapters in every room in which I work, as well as one in my travel bag. The cost of having three or four adapters has been well worth the peace of mind. I even have a back-up battery or two on hand most of the time. (My Pro is one of the older ones with the removable battery.) I like to have one each in my home and school offices, where I do most of my work and from which most excursions begin.

On the bike this morning, I read Rands in Repose's bag pr0n essay from last month. Loved it! Like Lopp and many other geeks, I have at times obsessed over my bag. Back in high school I carried an attache case my parents gave me for Christmas. (Yes, I was that guy.) Since college and grad school, I've gone through several styles of bag, including freebies given out at conferences and a couple of nice ones my wife gave me as gifts. A few have provided what I desire: compactness, with a few compartments but not too many.

One of my favorites was from SIGCSE in the late 1990s. I still have it, though it shows its age and wear. Another is a bag I got at one of the PLoP conferences in the early part of the previous decade. It was perfect for an iBook, but is too small for my Pro. I still have it, too, waiting for a time when it will fit my needs again. Both were products of the days of really good conference swag. My current bag is a simple leather case that my wife gave me. It's been serving me well for a couple of years.

Each person has his or her particular point of obsession. Mine is the way the shoulder strap attaches to the body of the bag. So many good bags have died too soon when the metallic clasp holding strap to body broke, or the clasp worked loose, or the fabric piece wore through.

Strange but true: One of my all-time favorite bags was a $5 blue vinyl diaper bag that my wife bought at a garage sale in the early 1990s. No one knew it was a diaper bag, or so I think; at a glance it was rather innocuous. This bag was especially useful at a time when I traveled a lot, attending 4-6 conferences a year and doing more personal travel than I do these days. The changing pad served as a great sleeve to protect my laptop (first a G3 clamshell, then an iBook). The side compartments designed to hold two baby bottles were great for bottles of water or soda. This was especially handy for a long day flying -- back when we could do such crazy things as carry drinks with us. This bag also passed Rands' airport security line test. It allowed for easy in-and-out of the laptop, and then rolled nicely on its side for going through x-ray. I still think about returning to this bag some day.

I'm sure that this sort of obsessiveness is a positive trait for programmers. So many of us have it, it must be.


Posted by Eugene Wallingford | Permalink | Categories: General

December 30, 2011 11:05 AM

Pretending

Kurt Vonnegut never hesitated to tell his readers the morals of his stories. The frontispiece of his novel Mother Night states its moral upfront:

We are what we pretend to be, so we must be careful about what we pretend to be.

Pretending is a core thread that runs through all of Vonnegut's work. I recognized this as a teenager, and perhaps it is what drew me to his books and stories. As a junior in high school, I wrote my major research paper in English class on the role fantasy played in the lives of Vonnegut's characters. (My teachers usually resisted my efforts to write about authors such as Kafka, Vonnegut, and Asimov, because they weren't part of "the canon". They always relented, eventually, and I got to spend more time thinking about works I loved.)

I first used this sentence about pretending in my eulogy for Vonnegut, which includes a couple of other passages on similar themes. Several of those are from Bokononism, the religion created in his novel Cat's Cradle as a way to help the natives of the island of San Lorenzo endure their otherwise unbearable lives. Bokononism had such an effect on me that I spent part of one summer many years ago transcribing The Books of Bokonon onto the web. (In these more modern times, I share Bokonon's wisdom via Twitter.)

Pretending is not just a way to overcome pain and suffering. Even for Vonnegut, play and pretense are the ways we construct the sane, moral, kind world in which we want to live. Pretending is, at its root, a necessary component in how we train our minds and bodies to think and act as we want them to. Over the years, I've written many times on this blog about the formation of habits of mind and body, whether as a computer scientist, a student, or a distance runner.

Many people quote Aristotle as the paradigm of this truth:

We are what we repeatedly do. Excellence, then, is not an act, but a habit.

I like this passage but prefer another of his, which I once quoted in a short piece, What Remains Is What Will Matter:

Excellence is an art won by training and habituation. We do not act rightly because we have virtue or excellence, but rather we have those because we have acted rightly.

This idea came charging into my mind this morning as I read an interview with Seth Godin. He and his interviewers are discussing the steady stream of rejection that most entrepreneurs face, and how some people seem able to fight through it to succeed. What if a person's natural "thermostat" predisposes them to fold in the face of rejection? Godin says:

I think we can reset our inclinations. I'm certain that pretending we can is better than admitting we can't.

Vonnegut and Aristotle would be proud. We are what we pretend to be. If we wish to be virtuous, then we must act rightly. If we wish to be the sort of person who responds to rejection by working harder and succeeding, then we must work harder. We become the person we pretend to be.

As children, we think pretending is about being someone we aren't. And there is great fun in that. As teenagers, sometimes we feel a need to pretend, because we have so little control over our world and even over our changing selves. As adults, we tend to want to put pretending away as child's play. But this obscures a truth that Vonnegut and Aristotle are trying to teach us:

Pretending is just as much about being who we are as about being who we aren't.

As you and I consider the coming of a new year and what we might resolve to do better or differently in the coming twelve months that will make a real difference in our lives, I suggest we take a page out of Vonnegut's playbook.

Think about the kind of world you want to live in, then live as if it exists.

Think about the kind of person you want to be, then live as if you are that person.


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

December 19, 2011 4:49 PM

"I Love The Stuff You Never See"

I occasionally read and hear people give advice about how to find a career, vocation, or avocation that someone will enjoy and succeed in. There is a lot of talk about passion, which is understandable. Surely, we will enjoy things we are passionate about, and perhaps then we want to put in the hours required to succeed. Still, "finding your passion" seems a little abstract, especially for someone who is struggling to find one.

This weekend, I read A Man, A Ball, A Hoop, A Bench (and an Alleged Thread)... Teller!. It's a story about the magician Teller, one half of the wonderful team Penn & Teller, and his years-long pursuit of a particular illusion. While discussing his work habits, Teller said something deceptively simple:

I love the stuff you never see.

I knew immediately just what he meant.

I can say this about teaching. I love the hours spent creating examples, writing sample code, improving it, writing and rewriting lecture notes, and creating and solving homework assignments. When a course doesn't go as I had planned, I like figuring out why and trying to fix it. Students see the finished product, not the hours spent creating it. I enjoy both.

I don't necessarily enjoy all of the behind-the-scenes work. I don't really enjoy grading. But my enjoyment of the preparation and my enjoyment of the class itself -- the teaching equivalent of "the performance" -- carries me through.

I can also say the same thing about programming. I love to fiddle with source code, organizing and rewriting it until it's all just so. I love to factor out repetition and discover abstractions. I enjoy tweaking interfaces, both the interfaces inside my code and the interfaces my code's users see. I love that sudden moment of pleasure when a program runs for the first time. Users see the finished product, not the hours spent creating it. I enjoy both.

Again, I don't necessarily enjoy everything that I have to do behind the scenes. I don't enjoy twiddling with configuration files, especially at the interface to the OS. Unlike many of my friends, I don't always enjoy installing and uninstalling all the libraries I need to make everything work with the current version of the OS and interpreter. But that time seems small compared to the time I spend living inside the code, and that carries me through.

In many ways, I think that Teller's simple declaration is a much better predictor of what you will enjoy in a career or avocation than other, fancier advice you'll