February 18, 2020 4:00 PM

Programming Feels Like Home

I saw Robin Sloan's An App Can Be a Home-Cooked Meal floating around Twitter a few days back. It really is quite good; give it a read if you haven't already. This passage captures a lot of the essay's spirit in only a few words:

The exhortation "learn to code!" has its foundations in market value. "Learn to code" is suggested as a way up, a way out. "Learn to code" offers economic leverage, a squirt of power. "Learn to code" goes on your resume.
But let's substitute a different phrase: "learn to cook." People don't only learn to cook so they can become chefs. Some do! But far more people learn to cook so they can eat better, or more affordably, or in a specific way. Or because they want to carry on a tradition. Sometimes they learn just because they're bored! Or even because -- get this -- they love spending time with the person who's teaching them.

Sloan expresses better than I ever have an idea that I blog about every so often. Why should people learn to program? Certainly it offers a path to economic gain, and that's why a lot of students study computer science in college, whether as a major, a minor, or a high-leverage class or two. There is nothing wrong with that. It is for many a way up, a way out.

But for some of us, there is more than money in programming. It gives you a certain power over the data and tools you use. I write here occasionally about how a small script or a relatively small program makes my life so much easier, and I feel bad for colleagues who are stuck doing drudge work that I jump past. Occasionally I'll try to share my code, to lighten someone else's burden, but most of the time there is such a mismatch between the worlds we live in that they are happier to keep plugging along. I can't say that I blame them. Still, if only they could program and used tools that enabled them to improve their work environments...

But... There is more still. From the early days of this blog, I've been open with you all:

Here's the thing. I like to write code.

One of the things that students like about my classes is that I love what I do, and they are welcome to join me on the journey. Just today a student in my Programming Languages course drifted back to my office with me after class, where we ended up talking for half an hour and sketching code on a whiteboard as we deconstructed a vocabulary choice he made on our latest homework assignment. I could sense this student's own love of programming, and it raised my spirits. It makes me more excited for the rest of the semester.

I've had people come up to me at conferences to say that the reason they read my blog is because they like to see someone enjoying programming as much as they do. Many of them share links with their students as if to say, "See, we are not alone." I look forward to days when I will be able to write in this vein more often.

Sloan reminds us that programming can be -- is -- more than a line on a resume. It is something that everyone can do, and want to do, for a lot of different reasons. It would be great if programming "were marbled deeply into domesticity and comfort, nerdiness and curiosity, health and love" in the way that cooking is. That is what makes Computing for All really worth doing.


Posted by Eugene Wallingford | Permalink | Categories: Computing, Personal, Software Development, Teaching and Learning

February 10, 2020 2:37 PM

Some Things I Read Recently

Campaign Security is a Wood Chipper for Your Hopes and Dreams

Practical campaign security is a wood chipper for your hopes and dreams. It sits at the intersection of 19 kinds of status quo, each more odious than the last. You have to accept the fact that computers are broken, software is terrible, campaign finance is evil, the political parties are inept, the DCCC exists, politics is full of parasites, tech companies are run by arrogant man-children, and so on.

This piece from last year has some good advice, plenty of sarcastic humor from Maciej, and one remark that was especially timely for the past week:

You will fare especially badly if you have written an app to fix politics. Put the app away and never speak of it again.

Know the Difference Between Neurosis and Process

In a conversation between Tom Waits and Elvis Costello from the late 1980s, Waits talks about tinkering too long with a song:

TOM: "You have to know the difference between neurosis and actual process, 'cause if you're left with it in your hands for too long, you may unravel everything. You may end up with absolutely nothing."

In software, when we keep code in our hands for too long, we usually end up with an over-engineered, over-abstracted boat anchor. Let the tests tell you when you are done, then stop.

Sometimes, Work is Work

People say, "if you love what you do you'll never work a day in your life." I think good work can be painful--I think sometimes it feels exactly like work.

Some weeks more than others. Trust me. That's okay. You can still love what you do.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Software Development

January 26, 2020 10:23 AM

The Narrative Impulse

Maybe people don't tell stories only to make sense of the world, but rather sometimes to deceive themselves?

It was an interesting idea, I said, that the narrative impulse might spring from the desire to avoid guilt, rather than from the need -- as was generally assumed -- to connect things together in a meaningful way; that it was a strategy calculated, in other words, to disburden ourselves of responsibility.

This is from Kudos, by Rachel Cusk. Kudos is the third book in an unconventional trilogy, following Outline and Transit. I blogged on a passage from Transit last semester, about making something that is part of who you are.

I have wanted to recommend Cusk and these books, but I do not feel up to the task of describing how or why I think so highly of them. They are unorthodox narratives about narrative. To me, Cusk is a mesmerizing story-teller who intertwines stories about people and their lives with the fabric of story-telling itself. She seems to value the stories we tell about ourselves, and yet see through them, to some overarching truth.

As for my own narrative impulse, I think of myself as writing posts for this blog in order to make connections among the many things I learn -- or at least that is what I tell myself. Cusk has me taking seriously the idea that some of the stories I tell may come from somewhere else.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

January 22, 2020 3:54 PM

The Roots of TDD -- from 1957

In 1957, Dan McCracken published Digital Computer Programming, perhaps the first book on the new art of programming. His book shows that the roots of extreme programming run deep. In this passage, McCracken encourages both the writing of tests before the writing of code and the involvement of the customer in the software development process:

The first attack on the checkout problem may be made before coding is begun. In order to fully ascertain the accuracy of the answers, it is necessary to have a hand-calculated check case with which to compare the answers which will later be calculated by the machine. This means that stored program machines are never used for a true one-shot problem. There must always be an element of iteration to make it pay. The hand calculations can be done at any point during programming. Frequently, however, computers are operated by computing experts to prepare the problems as a service for engineers or scientists. In these cases it is highly desirable that the "customer" prepare the check case, largely because logical errors and misunderstandings between the programmer and customer may be pointed out by such procedure. If the customer is to prepare the test solution, it is best for him to start well in advance of actual checkout, since for any sizable problem it will take several days or weeks to calculate the test.

I don't have a copy of this book, but I've read a couple of other early books by McCracken, including one of his Fortran books for engineers and scientists. He was a good writer and teacher.

I had the great fortune to meet Dan at an NSF workshop in Clemson, South Carolina, back in the mid-1990s. We spent many hours in the evening talking shop and watching basketball on TV. (Dan was cheering his New York Knicks on in the NBA finals, and he was happy to learn that I had been a Knicks and Walt Frazier fan in the 1970s.) He was a pioneer of programming and programming education who was willing to share his experience with a young CS prof who was trying to figure out how to teach. We kept in touch by email thereafter. It was an honor to call him a friend.

You can find the above quotation in A History of Test-Driven Development (TDD), as Told in Quotes, by Rob Myers. That post includes several good quotes that Myers had to cut from his upcoming book on TDD. "Of course. How else could you program?"


Posted by Eugene Wallingford | Permalink | Categories: Computing, Software Development, Teaching and Learning

January 10, 2020 2:25 PM

Questions

I recently read an interview with documentary filmmaker Errol Morris, who has unorthodox opinions about documentaries and how to make them. In particular, he prefers to build his films out of extended interviews with a single subject. These interviews give him all the source material he needs, because they aren't about questions and answers. They are about stories:

First of all, I think all questions are more or less rhetorical questions. No one wants their questions answered. They just want to state their question. And, in answering the question, the person never wants to answer the question. They just want to talk.

Morris isn't asking questions; he is stating them. His subjects are not answering questions; they are simply talking.

(Think about this the next time you're listening to an interview with a politician or candidate for office...)

At first, I was attracted to the sentiment in this paragraph. Then I became disillusioned with what I took to be its cynicism. Now, though, after a week or so, I am again enamored with its insight. How many of the questions I ask of software clients and professional colleagues are really statements of a position? How many of their answers are disconnected from the essential element of my questions? Even when these responses are disconnected, they communicate a lot to me, if only I listen. My clients and colleagues are often telling me exactly what they want me to know.

This dynamic is present surprisingly often when I work with students at the university, too. I need to listen carefully when students don't seem to be answering my question. Sometimes it's because they have misinterpreted the question, and I need to ask differently. Sometimes it's because they are telling me what they want me to know, irrespective of the question.

And when my questions aren't really questions, but statements or some other speech act... well, I know I have some work to do.

In case you find Morris's view on interviews cynical and would prefer to ponder the new year with greater hope, I'll leave you with a more ambiguous quote about questions:

There are years that ask questions, and years that answer.

That's from Their Eyes Were Watching God, by Zora Neale Hurston. In hindsight, it may be true or false for any given year. As a way to frame the coming months, though, it may be useful.

I hope that 2020 brings you the answers you seek, or the questions.


Posted by Eugene Wallingford | Permalink | Categories: Software Development, Teaching and Learning

January 06, 2020 3:13 PM

A Writing Game

I recently started reading posts in the archives of Jason Zweig's blog. He writes about finance for a living but blogs more widely, including quite a bit about writing itself. An article called On Writing Better: Sharpening Your Tools challenges writers to look at each word they write as "an alien object":

As the great Viennese journalist Karl Kraus wrote, "The closer one looks at a word, the farther away it moves." Your goal should be to treat every word you write as an alien object: You should be able to look at it and say, What is that doing here? Why did I use that word instead of a better one? What am I trying to say here? How can I get to where I'm going if I use such stale and lifeless words?

My mind immediately turned this into a writing game, an exercise that puts the idea into practice. Take any piece of writing.

  1. Choose a random word in the document.
  2. Change the word -- or delete it! -- in a way that improves the text.
  3. Go to 1.

Play the game for a fixed number of rounds or for a fixed period of time. A devilish alternative is to play until you get so frustrated with your writing that you can't continue. You could then judge your maturity as a writer by how long you can play in good spirits.

We could even automate the mechanics of the game by writing a program that chooses a random word in a document for us. Every time we save the document after a change, it jumps to a new word.
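
Here's a minimal sketch of such a helper in Python; the file name and the crude notion of a "word" are my own assumptions, not a finished tool:

    import random
    import re

    def random_word(path):
        # Pick one word at random from the document for the writer to examine.
        text = open(path, encoding="utf-8").read()
        words = re.findall(r"[A-Za-z']+", text)   # a crude notion of "word"
        return random.choice(words)

    print(random_word("draft.txt"))   # e.g., a draft of this post

An editor plugin could do the jumping to the chosen word for us, but even this little script is enough to play the game by hand.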

As with most first ideas, this one can probably be improved. Perhaps we should bias word selection toward words whose replacement or deletion is most likely to improve our writing. Changing "the" or "to" doesn't offer the same payoff as changing a lazy verb or deleting an abstract adverb. Or does it? I have a lot of room to improve as a writer; maybe fixing some "the"s and "to"s is exactly what I need to do. The Three Bears pattern suggests that we might learn something by tackling the extreme form of the challenge and seeing where it leads us.

Changing or deleting a single word can improve a piece of text, but there is a bigger payoff available if we consider the selected word in context. The best way to eliminate many vague nouns is to turn them back into verbs, where they act with vigor. To do that, we will have to change the structure of the sentence, and maybe the surrounding sentences. That forces us to think even more deeply about the text than changing a lone word. It also creates more words for us to fix in following rounds!

I like programming challenges of this sort. A writing challenge that constrains me in arbitrary ways might be just what I need to take time more often to improve my work. It might help me identify and break some bad habits along the way. Maybe I'll give this a try and report back. If you try it, please let me know the results!

And no, I did not play the game with this post. It can surely be improved.

Postscript. After drafting this post, I came across another article by Zweig that proposes just such a challenge for the narrower case of abstract adverbs:

The only way to see if a word is indispensable is to eliminate it and see whether you miss it. Try this exercise yourself:
  • Take any sentence containing "actually" or "literally" or any other abstract adverb, written by anyone ever.
  • Delete that adverb.
  • See if the sentence loses one iota of force or meaning.
  • I'd be amazed if it does (if so, please let me know).

We can specialize the writing game to focus on adverbs, another part of speech, or almost any writing weakness. The possibilities...
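
For the adverb version, a tiny sketch along these lines would do; the list of suspects is only a starting point of my own choosing:

    import re

    SUSPECTS = {"actually", "literally", "basically", "really", "very"}

    def flag_adverbs(text):
        # Return the suspect adverbs found in the text, candidates for deletion.
        return [w for w in re.findall(r"[A-Za-z]+", text.lower()) if w in SUSPECTS]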


Posted by Eugene Wallingford | Permalink | Categories: General, Patterns, Teaching and Learning

January 02, 2020 4:37 PM

The Test

Over the holiday, I realized something about myself.

A simple test exists for determining when I have reached the enlightenment of the Buddha: I will be able to fly without tension or complaint over delayed flights, airline tickets and policies, and the TSA.

The good news: I survived another trip. Special thanks to Felipe at United Airlines customer service for doing his best to help with a delayed flight on Christmas Eve, to Nic R. at the Cedar Rapids Airport for solving our problem, and to Joy, the flight attendant on the first leg of our journey, for a soothing excursion.

I did more than survive, though. Once in Boston, my wife and I had a wonderful time visiting our daughters for the week between Christmas Eve and New Year's Eve! I wrote once about my older daughter leaving for college. Both are graduated now, making their way out in the world as adults. This was our first holiday away from our home -- what used to be their home -- and in their new homes. I enjoyed spending time with them on their turf and on their terms. We talked about their futures, but also routine life: recipes, movies, and life in the city. On the last day, my older daughter and I made candles. It was an unexpected joy.

For us, a trip to Boston includes visits to a museum whenever possible. This trip included three: the Harvard art museums, the Isabella Stewart Gardner, and the MFA. My daughters showed me paintings they have discovered on their own visits, and I shared with them ones that I like. I usually discover a work or two on each visit that grab my eye for the first time, or again in a new way. On my first visit to the Harvard museum, one painting really grabbed me: "Leander's Tower on the Bosporus", by Sanford Robinson Gifford.

Leander's Tower on the Bosporus, by Sanford Robinson Gifford

I've been to the MFA a few times and have a few favorites. It has a large collection of work by John Singer Sargent, a Bostonian who had a long relationship with the museum. This time, his painting "The Master and His Pupils" drew me in as it had not before:

The Master and His Pupils, by John Singer Sargent

Our evenings often brought movies. We saw a new studio release, Star Wars: The Rise of Skywalker, in the theater; a made-for-BBC movie about Agatha Christie; and a movie that had flown under my radar for twenty years, Galaxy Quest. I enjoyed all three! The Star Wars film has its flaws, but it was a fun and appropriate end to a series that has spanned most of my life. We were lucky to stumble upon the Christie film while scrolling Netflix; it felt a lot like her mystery novels. I was surprised by how much I enjoyed Galaxy Quest. So much fun! How had I not seen it before?

The trip ended as it began, with an unexpected delay that stretched an already long travel day. Our time on the ground in Chicago at least offered the consolation of a computer malfunction that echoed our delay, designed for a programmer: Javascript for the travel-weary.

a '@KevlinHenney' from Chicago O'Hare

Again, though, there was good news: lots of time to walk with my wife, which was a good way to spend the day, and a good way to end our trip.

Now, back to working on my enlightenment...


Posted by Eugene Wallingford | Permalink | Categories: Personal

December 26, 2019 1:55 PM

An Update on the First Use of the Term "Programming Language"

This tweet and this blog entry on the first use of the term "programming language" evoked responses from readers with some new history and some prior occurrences.

Doug Moen pointed me to the 1956 Fortran manual from IBM, Chapter 2 of which opens with:

Any programming language must provide for expressing numerical constants and variable quantities.

I was aware of the Fortran manual, which I link to in the notes for my compiler course, and its use of the term. But I had been linking to a document dated October 1957, and the file at fortran.com is dated October 15, 1956. That beats the January 1957 Newell and Shaw paper by a few months.

As Moen said in his email, "there must be earlier references, but it's hard to find original documents that are early enough."

The oldest candidate I have seen comes from @matt_dz. His tweet links to this 1976 Stanford tech report, "The Early Development of Programming Languages", co-authored by Knuth. On Page 26, it refers to work done by Arthur W. Burks in 1950:

In 1950, Burks sketched a so-called "Intermediate Programming Language" which was to be the step one notch above the Internal Program Language.

Unfortunately, though, this report's only passage from Burks refers to the new language as "Intermediate PL", which obscures the meaning of the 'P'. Furthermore, the title of Burks's paper uses "program" in the language's name:

Arthur W. Burks, "An intermediate program language as an aid in program synthesis", Engineering Research Institute, Report for Burroughs Adding Machine Company (Ann Arbor, Mich.: Univ. of Michigan, 1951), ii+15 pp.

The use of "program language" in this title is consistent with the terminology in Burks's previous work on an "Internal Program Language", to which Knuth et al. also refer.

Following up on the Stanford tech report, Douglas Moen found the book Mathematical Methods in Program Development, edited by Manfred Broy and Birgit Schieder. It includes a paper that attempts "to identify the first 'programming language', and the first use of that term". Here's a passage from Page 219, via Google Books:

There is not yet an indication of the term 'programming languages'. But around 1950 the term 'program' comes into wide use: 'The logic of programming electronic digital computers' (A. W. Burks 1950), 'Programme organization and initial orders for the EDSAC' (D. J. Wheeler 1950), 'A program composition technique' (H. B. Curry 1950), 'La conception du programme' (Corrado Bohm 1952), and finally 'Logical or non-mathematical programmes' (C. S. Strachey 1952).

And then, on Page 224, it comments specifically on Burks's work:

A. W. Burks ('An intermediate program language as an aid in program synthesis', 1951) was among the first to use the term program(ming) language.

The parenthetical in that phrase -- "the first to use the term program(ming) language" -- leads me to wonder if Burks may use "program language" rather than "programming language" in his 1951 paper.

Is it possible that Knuth et al. retrofitted the use of "programming language" onto Burks's language? Their report documents the early development of PL ideas, not the history of the term itself. The authors may have used a term that was in common parlance by 1976 even if Burks had not. I'd really like to find an original copy of Burks's 1951 ERI report to see if he ever uses "programming language" when talking about his Intermediate PL. Maybe after the holiday break...

In any case, the use of "program language" by Burks and others circa 1950 seems to be the bridge between use of the terms "program" and "language" independently and the use of "programming language" that soon became standard. If Burks and his group never used the new term for their intermediate PL, it's likely that someone else did between 1951 and the release of the 1956 Fortran manual.

There is so much to learn. I'm glad that Crista Lopes tweeted her question on Sunday and that so many others have contributed to the answer!


Posted by Eugene Wallingford | Permalink | Categories: Computing

December 23, 2019 10:23 AM

The First Use of the Term "Programming Language"?

Yesterday, Crista Lopes asked a history question on Twitter:

Hey, CS History Twitter: I just read Iverson's preface of his 1962 book carefully, and suddenly this occurred to me: did he coin the term "programming language"? Was that the first time a programming language was called "programming language"?

In a follow-up, she noted that McCarthy's CACM paper on LISP from roughly the same time called Lisp a 'programming system', not a programming language.

I had a vague recollection from my grad school days that Newell and Simon might have used the term. I looked up IPL, the Information Processing Language they created in the mid-1950s with Shaw. IPL pioneered the notion of list processing, though at the level of assembly language. I first learned of it while devouring Newell and Simon's early work on AI and reading everything I could find about programs such as the General Problem Solver and Logic Theorist.

That Wikipedia page has a link to this unexpected cache of documents on IPL from Newell, Simon, and Shaw's days at RAND. The oldest of these is a January 1957 paper by Newell and Shaw, Programming the Logic Theory Machine, presented at the Western Joint Computer Conference (WJCC) the next month. It details their efforts to build computer systems to perform symbolic reasoning, as well as the language they used to code their programs.

There it is on Page 5: a section titled "Requirements for the Programming Language". They even define what they mean by programming language:

We can transform these statements about the general nature of the program of LT into a set of requirements for a programming language. By a programming language we mean a set of symbols and conventions that allows a programmer to specify to the computer what processes he wants carried out.

Other than the gendered language, that definition works pretty well even today.

The fact that Newell and Shaw defined "programming language" in this paper indicates that the term probably was not in widespread use at the time. The WJCC was a major computing conference of the day. The researchers and engineers who attended it would likely be familiar with common jargon of the industry.

Reading papers about IPL is an education across a range of ideas in computing. Researchers at the dawn of computing had to contend with -- and invent -- concepts at multiple levels of abstraction and figure out how to implement them on machines with limited size and speed. What a treat these papers are.

I love to read original papers from the beginning of our discipline, and I love to learn about the history of words. A few of my students do, too. One student stopped in after the last day of my compilers course this semester to thank me for telling stories about the history of compilers occasionally. Next semester, I teach our Programming Languages and Paradigms course again, and this little story might add a touch of color to our first days together.

All this said, I am neither a historian of computer science nor a lexicographer. If you know of an earlier occurrence of the term "programming language" than Newell and Shaw's from January 1957, I would love to hear from you by email or on Twitter.


Posted by Eugene Wallingford | Permalink | Categories: Computing

December 20, 2019 1:45 PM

More Adventures in Constrained Programming: Elo Predictions

I like tennis. The Tennis Abstract blog helps me keep up with the game and indulge my love of sports stats at the same time. An entry earlier this month gave a gentle introduction to Elo ratings as they are used for professional tennis:

One of the main purposes of any rating system is to predict the outcome of matches--something that Elo does better than most others, including the ATP and WTA rankings. The only input necessary to make a prediction is the difference between two players' ratings, which you can then plug into the following formula:
1 - (1 / (1 + (10 ^ (difference / 400))))

This formula always makes me smile. The first computer program I ever wrote because I really wanted to was a program to compute Elo ratings for my high school chess club. Over the years I've come back to Elo ratings occasionally whenever I had an itch to dabble in a new language or even an old favorite. It's like a personal kata of variable scope.
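
For reference, here is the quoted formula as a small Python function, a straightforward transcription with parameter names of my own choosing:

    def elo_win_probability(rating_a, rating_b):
        # Probability that player A beats player B under the Elo model.
        diff = rating_a - rating_b
        return 1 - (1 / (1 + 10 ** (diff / 400)))

    print(elo_win_probability(2100, 1900))   # a 200-point favorite wins about 76% of the time

In a language with floating-point numbers and built-in exponentiation, that really is the whole prediction.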

I read the Tennis Abstract piece this week as my students were finishing up their compilers for the semester and as I was beginning to think of break. Playful me wondered how I might implement the prediction formula in my students' source language. It is a simple functional language with only two data types, integers and booleans; it has no loops, no local variables, no assignment statements, and no sequences. In another old post, I referred to this sort of language as akin to an integer assembly language. And, heaven help me, I love to program in integer assembly language.

To compute even this simple formula in Klein, I need to think in terms of fractions. The only division operator performs integer division, so 1/x gives 0 for any x greater than 1. I also need to think carefully about how to implement the exponentiation 10 ^ (difference / 400). The difference between two players' ratings is usually less than 400 and, in any case, almost never divisible by 400. So my program will have to take an arbitrary root of 10.

Which root? Well, I can use our gcd() function (implemented using Euclid's algorithm, of course) to reduce diff/400 to its lowest terms, n/d, and then compute the dth root of 10^n. Now, how to take the dth root of an integer for an arbitrary integer d?
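
Here is a quick sketch of that first step, in Python rather than Klein, with names of my own choosing (the Klein version, like the rest of the class library, has to be written recursively, since the language has no loops):

    def gcd(a, b):
        # Euclid's algorithm, written recursively as a Klein program would be.
        return a if b == 0 else gcd(b, a % b)

    def reduce_exponent(difference, base=400):
        # Reduce difference/400 to lowest terms n/d, so that
        # 10 ** (difference/400) becomes the d-th root of 10 ** n.
        g = gcd(abs(difference), base)
        return difference // g, base // g

    print(reduce_exponent(240))   # (3, 5): a 240-point gap needs the 5th root of 10**3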

Fortunately, my students and I have written code like this in various integer assembly languages over the years. For instance, we have a SQRT function that uses binary search to home in on the integer closest to the square root of a given integer. Even better, one semester a student implemented a square root program that uses Newton's method:

   x_{n+1} = x_n - f(x_n) / f'(x_n)

That's just what I need! I can create a more general version of the function that uses Newton's method to compute an arbitrary root of an arbitrary base. Rather than work with floating-point numbers, I will implement the function to take its guess as a fraction, represented as two integers: a numerator and a denominator.
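
Here is a rough sketch of that idea, again in Python rather than Klein; the loops, the integer starting guess, and the fixed number of steps are my simplifications of what a recursive Klein program would do:

    from math import gcd

    def nth_root(base, d, steps=5):
        # Approximate the d-th root of base with Newton's method on f(x) = x^d - base,
        # keeping the estimate as a fraction num/den of plain integers.
        guess = 1
        while guess ** d < base:          # smallest integer at or above the root
            guess += 1
        num, den = guess, 1
        for _ in range(steps):
            # x' = ((d-1)*x^d + base) / (d * x^(d-1)), written over a common denominator
            new_num = (d - 1) * num ** d + base * den ** d
            new_den = d * den * num ** (d - 1)
            g = gcd(new_num, new_den)     # keep the fraction in lowest terms
            num, den = new_num // g, new_den // g
        return num, den

    num, den = nth_root(1000, 5)          # 10 ** (3/5), for a 240-point rating gap
    print(num / den)                      # roughly 3.981

Translated into Klein, both loops become recursive functions that pass num and den along as arguments.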

This may seem like a lot of work, but that's what working in such a simple programming language is like. If I want my students' compilers to produce assembly language that predicts the result of a professional tennis match, I have to do the work.

This morning, I read a review of Francis Su's new popular math book, Mathematics for Human Flourishing. It reminds us that math isn't about rules and formulas:

Real math is a quest driven by curiosity and wonder. It requires creativity, aesthetic sensibilities, a penchant for mystery, and courage in the face of the unknown.

Writing my Elo rating program in Klein doesn't involve much mystery, and it requires no courage at all. It does, however, require some creativity to program under the severe constraints of a very simple language. And it's very much true that my little programming diversion is driven by curiosity and wonder. It's fun to explore ideas in a small space using limited tools. What will I find along the way? I'll surely make design choices that reflect my personal aesthetic sensibilities as well as the pragmatic sensibilities of a virtual machine that knows only integers and booleans.

As I've said before, I love to program.


Posted by Eugene Wallingford | Permalink | Categories: Computing, Software Development, Teaching and Learning