February 21, 2018 3:38 PM

Computer Programs Aren't Pure Abstractions. They Live in the World.

Guile Scheme guru Andy Wingo recently wrote a post about langsec, the idea that we can bake system security into our programs by using languages that support proof of correctness. Compilers can then be tools for enforcing security. Wingo is a big fan of the langsec approach but, in light of the Spectre and Meltdown vulnerabilities, is pessimistic that it really matters anymore. If bad actors can exploit the hardware that executes our programs, then proving that the code is secure doesn't do much good.
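
To make the tension concrete, here is a minimal sketch of my own, not from Wingo's post. By the semantics of the language, this lookup can never read outside the vector; a langsec-style proof could certify as much:

    #lang racket

    ;; A vector whose contents we never want to leak.
    (define secrets (vector 'a 'b 'c))

    ;; The bounds check runs before the access, so the semantics of
    ;; the language guarantee this never reads outside the vector.
    (define (safe-ref v i)
      (if (and (exact-nonnegative-integer? i) (< i (vector-length v)))
          (vector-ref v i)
          'out-of-range))

    (safe-ref secrets 1)   ; => 'b
    (safe-ref secrets 99)  ; => 'out-of-range

Spectre-class attacks operate below that level of abstraction: the processor may speculatively perform the read before the bounds check resolves, leaving traces in the cache that an attacker can measure. No proof about the program text rules that out.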

I've read a few blog posts and tweets that say Wingo is too pessimistic, that efforts to make our languages produce more secure code will still pay off. I think my favorite such remark, though, is a comment on Wingo's post itself, by Thomas Dullien:

I think this is too dark a post, but it shows a useful shock: Computer Science likes to live in proximity to pure mathematics, but it lives between EE and mathematics. And neglecting the EE side is dangerous - which not only Spectre showed, but which should have been obvious at the latest when Rowhammer hit.
There's actual physics happening, and we need to be aware of it.

It's easy for academics, and even for programmers who work atop an endless stack of frameworks, to start thinking of programs as pure abstractions. But computer programs, unlike mathematical proofs, come into contact with real, live hardware. It's good to be reminded occasionally that computer science isn't math; it lives somewhere between math and engineering. That position is a strength in so many ways, but it has downsides, too. We should keep them in mind.


Posted by Eugene Wallingford | Permalink | Categories: Computing

February 16, 2018 2:54 PM

Old Ideas and New Words

In this Los Angeles Review of Books interview, novelist Jenny Offill says:

I was reading a poet from the Tang dynasty... One of his lines from, I don't know, page 812, was "No new feelings". When I read that I laughed out loud. People have been writing about the same things since the invention of the written word. The only originality comes from the language itself.

After a week revising lecture notes and rewriting a recruiting talk intended for high school students and their parents, I know just what Offill and that Tang poet mean. I sometimes feel the same way about code.


Posted by Eugene Wallingford | Permalink | Categories: Computing, Software Development

February 12, 2018 4:03 PM

How Do We Choose The Programming Languages We Love?

In Material as Metaphor, the artist Anni Albers talks about how she came to choose media in which she worked:

How do we choose our specific material, our means of communication? "Accidentally". Something speaks to us, a sound, a touch, hardness or softness, it catches us and asks us to be formed. We are finding our language, and as we go along we learn to obey their rules and their limits. We have to obey, and adjust to those demands. Ideas flow from it to us and though we feel to be the creator we are involved in a dialogue with our medium. The more subtly we are tuned to our medium, the more inventive our actions will become. Not listening to it ends in failure.

This expresses much the way I feel about different programming languages and styles. I can like them all, and sometimes do! I go through phases when one style speaks to me more than another, or when one language seems to be in sync with how I am thinking. When that happens, I find myself wanting to learn its rules, to conform so that I can reach a point where I feel creative enough to solve interesting problems in the language.

If I find myself not liking a language, it's usually because I'm not listening to it; I'm fighting back. When I first tried to learn Haskell, I refused to bend to its style of functional programming. I had worked hard to grok FP in Scheme, and I was so proud of my hard-won understanding that I wanted to impose it on the new language. Eventually, I retreated for a while, returned more humbly, and finally came to appreciate Haskell, if not master it deeply.

My experience with Smalltalk went differently. One summer I listened to what it was telling me, slowly and patiently, throwing code away and starting over several times on an application I was trying to build. This didn't feel like a struggle so much as a several-month tutoring session. By the end, I felt ideas flowing through me. I think that's the kind of dialogue Albers is referring to.

If I want to master a new programming language, I have to be willing to obey its limits and to learn how to use its strengths as leverage. This can be a conscious choice. It's frustrating when that doesn't seem to be enough.

I wish I could always will myself into the right frame of mind to learn a new way of thinking. Albers reminds us that often a language speaks to us first. Sometimes, I just have to walk away and wait until the time is right.


Posted by Eugene Wallingford | Permalink | Categories: Personal, Software Development, Teaching and Learning

January 22, 2018 3:50 PM

Same Footage, Different Film

In In the Blink of an Eye, Walter Murch tells the story of human and chimpanzee DNA: how the DNA itself is substantially the same, and how the sequencing, which we understand less well, creates different beings during the development of the newborn. He concludes by bringing the analogy back to film editing:

My point is that the information in the DNA can be seen as uncut film and the mysterious sequencing as the editor. You could sit in one room with a pile of dailies and another editor could sit in the next room with exactly the same footage and both of you could make different films out of the same material.

This struck me as quite the opposite of what programmers do. When given a new problem and a large language in which to solve it, two programmers can choose substantially different source material and yet end up telling the same story. Functional and OO programmers, say, may decompose the problem in a different way and rely on different language features to build their solutions, but in the end both programs will solve the same problem and meet the same user need. Like the chimp and the human, though, the resulting programs may be better adapted for living in different environments.
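
A hypothetical example of what I mean, in Racket, invented for illustration. Both halves below compute the same areas; they just carve up the problem differently:

    #lang racket
    (require racket/math)  ; for pi and sqr

    ;; Functional decomposition: data as structs, behavior gathered
    ;; into one function that dispatches on the kind of data.
    (struct circle (radius))
    (struct square (side))

    (define (area shape)
      (cond [(circle? shape) (* pi (sqr (circle-radius shape)))]
            [(square? shape) (sqr (square-side shape))]))

    ;; Object-oriented decomposition: behavior lives with each kind of data.
    (define circle%
      (class object%
        (init-field radius)
        (define/public (area) (* pi (sqr radius)))
        (super-new)))

    (define square%
      (class object%
        (init-field side)
        (define/public (area) (sqr side))
        (super-new)))

    ;; Different source material, same story:
    (area (circle 2.0))                     ; => 12.566...
    (send (new circle% [radius 2.0]) area)  ; => 12.566...

The functional version groups behavior by operation; the object-oriented version groups it by kind of data. Same user need, same story, different source material.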


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development

January 17, 2018 3:51 PM

Footnotes

While discussing the effective use of discontinuities in film, the difference between motion within a context and a change of context, Walter Murch tells a story about... bees:

A beehive can apparently be moved two inches each night without disorienting the bees the next morning. Surprisingly, if it is moved two miles, the bees also have no problem: They are forced by the total displacement of their environment to re-orient their sense of direction, which they can do easily enough. But if the hive is moved two yards, the bees become fatally confused. The environment does not seem different to them, so they do not re-orient themselves, and as a result, they will not recognize their own hive when they return from foraging, hovering instead in the empty space where the hive used to be, while the hive itself sits just two yards away.

This is fascinating, as well as being a really cool analogy for the choices movie editors face when telling a story on film. Either change so little that viewers recognize the motion as natural, or change enough that they re-orient their perspective. Don't stop in the middle.

What is even cooler to me is that this story appears in a footnote.

One of the things I've been loving about In the Blink of an Eye is how Murch uses footnotes to teach. In many books, footnotes contain minutiae or references to literature I'll never read, so I skip them. But Murch uses them to tell stories that elaborate on or deepen his main point but which would, if included in the text, interrupt the flow of the story he has constructed. They add to the narrative without being essential.

I've already learned a couple of cool things from his footnotes, and I'm not even a quarter of the way into the book. (I've been taking time to mull over what I read...) Another example: while discussing the value of discontinuity as a story-telling device, Murch adds a footnote that connects this practice to the visual discontinuity found in ancient Egyptian painting. I never knew before why the perspective in those drawings was so unusual. Now I do!

My fondness for Murch's footnotes may stem from something more than their informative nature. When writing up lecture notes for my students, I like to include asides, digressions, and links to optional readings that expand on the main arc of the story. I'd like for them to realize that what they are learning is part of a world bigger than our course, that the ideas are often deeper and have wider implications than they might realize. And sometimes I just like to entertain with a connection. Not all students care about this material, but for the ones who do, I hope the asides give them something extra. Students who don't care can do what I do in other books: skip 'em.

This book gives me a higher goal to shoot for when including such asides in my notes: elaborate without being essential; entice without disrupting.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

January 15, 2018 9:22 AM

The Cut

Walter Murch, in In the Blink of an Eye:

A vast amount of preparation, really, to arrive at the innocuously brief moment of decisive act: the cut -- the moment of transition from one shot to the next -- something that, appropriately enough, should look almost self-evidently simple and effortless, if it is even noticed at all.

This can apply to software development, I think, but I haven't thought about that yet. I read the passage at the beginning of a new semester, when my mind is filled with another offering of my programming languages course. So I've been thinking about how this quote works in the context of my course: the overall structure of the course as well as the structure of individual class sessions. The course consists of three major units, connected by a thread, with the third having its own substructure. Each session is a more cohesive story with its own microstructure: the flow of a lecture, the sequence of exercises students do, the sequence of examples that students see. Moments of transition are everywhere, at multiple scales.

When a session goes badly, or not as well as I'd hoped, I am quite aware of the cuts that did not work. They seem awkward, or ill-motivated, or jarring. The students notice some of these, too, but they don't always let me know of their disorientation right away. That's one benefit of building frequent exercises and review questions into a class session: at least I have a chance of finding out sooner when something isn't working the way I'd planned.

Reading Murch has given me a new vocabulary for thinking about transitions visually. In particular, I've been thinking about two basic types of transition:

  • one that signals motion within a context
  • one that signals a change of context

These are a natural part of any writer's job, but I've found it helpful to think about them more explicitly as I worked on class this week.

For example, I've been trying to think more often about how one kind of cut can be mistaken for the other and how that might affect students. What happens when what I intend as a small move within a context seems so disconnected for students that they think I've changed contexts? What happens when what I intend as a big shift to a new topic sounds to students like the WAH-WAH-WAH of Charlie Brown's teacher? I can always erect massive signposts to signal transitions of various kinds, but that can be just as jarring to readers or listeners as unannounced cuts. It is also inelegant, because it fails to respect their ability to construct their own understanding of what they are learning.

Trying on Murch's perspective has not been a panacea. The first session of the course, the one with a new opening story, went well, though it needs a few more iterations to become good. My second session went less well. I tried to rearrange a session that already worked well, and my thinking about transitions was too self-conscious. The result was a synthesis of two threads that didn't quite work, leaving me feeling a bit jumbled myself by connections that were incomplete and jumps that seemed abrupt. Fortunately, I think I managed to recognize this soon enough in class that I was able to tell a more coherent story than my outline prepared me to tell. The outline needs a lot more work.

In the longer run, though, thinking about transitions more carefully should help me do a better job leading students in a fruitful direction. I'll keep at it.


Posted by Eugene Wallingford | Permalink | Categories: Teaching and Learning

January 14, 2018 9:24 AM

Acceleration

This was posted on the Racket mailing list recently:

"The Little Schemer" starts slow for people who have programmed before, but seeing that I am only half-way through and already gained some interesting knowledge from it, one should not underestimate the acceleration in this book.

The Little Schemer is the only textbook I assign in my Programming Languages course. These students usually have only a little experience: often three semesters, two in Python and one in Java; sometimes just the two in Python. A few of the students who work in the way the authors intend have an A-ha! experience while reading it. Or maybe they are just lucky... Other students have only a WTF? experience.

Still, I assign the book, with hope. It's relatively inexpensive, so it's worth the chance that a few students will use it to grok recursion, along with a way of thinking about writing functions that they haven't seen in courses or textbooks before. The book accelerates from the most basic ideas of programming to "interesting" knowledge in a relatively short number of pages. Students who buy in to the premise, hang on for the ride, and practice the ideas in their own code soon find that they, too, have accelerated as programmers.
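
For readers who haven't seen the book, here is a small sketch of its question-and-recur style, using member?, one of the book's early definitions, reconstructed here from memory in Racket rather than the book's Scheme:

    #lang racket

    ;; Ask about the empty case, ask about the first element,
    ;; then recur on the rest of the list.
    (define (member? a lat)
      (cond [(null? lat) #f]
            [(eq? a (car lat)) #t]
            [else (member? a (cdr lat))]))

    (member? 'tea '(coffee tea or milk))    ; => #t
    (member? 'juice '(coffee tea or milk))  ; => #f

Those same few questions, asked over and over, carry the reader from member? all the way to the little interpreter the book builds in its final chapter.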


Posted by Eugene Wallingford | Permalink | Categories: Computing, Teaching and Learning

January 07, 2018 10:25 AM

95:1

This morning, I read the first few pages of In the Blink of an Eye, an essay on film editing by Walter Murch. He starts by talking about his work on Apocalypse Now, which took well over a year in large part because of the massive amount of film Coppola shot: 1,250,000 linear feet, enough for 230 hours of running time. The movie ended up being about two hours and twenty-five minutes, so Murch and his colleagues culled 95 minutes of footage for every minute that made it into the final product. A more typical project, Murch says, has a ratio of 20:1.
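
(To check the arithmetic: 230 hours of footage is 13,800 minutes; the final cut of two hours and twenty-five minutes is 145 minutes; and 13,800 / 145 is roughly 95.)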

Even at 20:1, Murch's story puts into clearer light the amount of raw material I create when designing a typical session for one of my courses. The typical session mixes exposition, examples, student exercises, and (less than I'd like to admit) discussion. Now, whenever I feel like a session comes up short of my goal, I will think back on Murch's 20:1 ratio and realize how much harder I might work to produce enough material to assemble a good session. If I want one of my sessions to be an Apocalypse Now, maybe I'll need to shoot higher.

This motivation comes at a favorable time. Yesterday I had a burst of what felt like inspiration for a new first day to my Programming Languages course. At the end of the brainstorm came what is now the working version of my opening line in the course: "In the beginning, there was assembly language." Let's see if I have enough inspiration -- and make enough time -- to turn the idea into what I hope it can be: a session that fuels my students' imagination for a semester's journey through Racket, functional programming, and examining language ideas with interpreters.

I do hope, though, that the journey itself does not bring to mind Apocalypse Now.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

January 05, 2018 1:27 PM

Change of Terms

I received a Change of Terms message yesterday from one of my mutual fund companies, which included this unexpected note:

Direction to buy or sell Vanguard funds must be placed online or verbally and may no longer be submitted in writing.

I haven't mailed Vanguard or any other financial services company a paper form or a paper check in years, but still. When I was growing up, I never would have imagined that I would see the day when you could not mail a letter to a company in order to conduct financial business. Busy, busy, busy.

In the academic world, this is the time for another type of change of terms, as we prepare to launch our spring semester on Monday. The temperatures in my part of the country the last two weeks make the name of the semester a cruel joke, but the hope of spring lives.

For me, the transition is from my compiler course to my programming languages course. Compilers went as well this fall as it has gone in a long time; I really wish I had blogged about it more. I can only hope that Programming Languages goes as well. I've been reading about some ways I might improve the course pedagogically. That will require me to change some old habits, but trying to do so is part of the fun of teaching. I intend to blog about my experiences with the new ideas. As I said, the hope of spring lives.

In any case, I get to write Racket code all semester, so at least I have that going for me, which is nice.


Posted by Eugene Wallingford | Permalink | Categories: General, Teaching and Learning

December 28, 2017 8:46 AM

You Have to Learn That It's All Beautiful

In this interview with Adam Grant, Walter Isaacson talks about some of the things he learned while writing biographies of Benjamin Franklin, Albert Einstein, Steve Jobs, and Leonardo da Vinci. A common theme is that all four were curious and interested in a wide range of topics. Toward the end of the interview, Isaacson says:

We of humanities backgrounds are always doing the lecture, like, "We need to put the 'A' in 'STEM', and you've got to learn the arts and the humanities." And you get big applause when you talk about the importance of that.
But we also have to meet halfway and learn the beauty of math. Because people tell me, "I can't believe somebody doesn't know the difference between Mozart and Haydn, or the difference between Lear and Macbeth." And I say, "Yeah, but do you know the difference between a resistor and a transistor? Do you know the difference between an integral and a differential equation?" They go, "Oh no, I don't do math, I don't do science." I say, "Yeah, but you know what, an integral equation is just as beautiful as a brush stroke on the Mona Lisa." You've got to learn that they're all beautiful.

Appreciating that beauty made Leonardo a better artist and Jobs a better technologist. I would like for the students who graduate from our CS program to know some literature, history, and art and appreciate their beauty. I'd also like for the students who graduate from our university with degrees in literature, history, art, and especially education to have some knowledge of calculus, the Turing machine, and recombinant DNA, and appreciate their beauty.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning