September 20, 2018 4:44 PM

Special Numbers in a Simple Language

This fall I am again teaching our course in compiler development. Working in teams of two or three, students will implement from scratch a complete compiler for a simple functional language that consists of little more than integers, booleans, an if statement, and recursive functions. Such a language isn't suitable for much, but it works great for writing programs that do simple arithmetic and number theory. In the past, I likened it to an integer assembly language. This semester, my students are compiling a Pascal-like language of this sort that we call Flair.

If you've read my blog much in the fall over the last decade or so, you may recall that I love to write code in the languages for which my students write their compilers. It makes the language seem more real to them and to me, gives us all more opportunities to master the language, and gives us interesting test cases for their scanners, parsers, type checkers, and code generators. In recent years I've blogged about some of my explorations in these languages, including programs to compute Farey numbers and excellent numbers, as well as trying to solve one of my daughter's AP calculus problems.

When I run into a problem, I usually get an itch to write a program, and in the fall I want to write it in my students' language.

Yesterday, I began writing my first new Flair program of the semester after running across this tweet from James Tanton, which starts:

N is "special" if, in binary, N has a 1s and b 0s and a & b are each factors of N (so non-zero).

So, 10 is special because:

  • In binary, 10 is 1010.
  • 1010 contains two 1s and two 0s.
  • Two is a factor of 10.

9 is not special because its binary rep also contains two 1s and two 0s, but two is not a factor of 9. 3 is not special because its binary rep has no 0s at all.

My first thought upon seeing this tweet was, "I can write a Flair program to determine if a number is special." And that is what I started to do.

Flair doesn't have loops, so I usually start every new program by mapping out the functions I will need simply to implement the definition. This keeps me from spending much time simulating loops that I don't actually need. I ended up writing headers and default bodies for three utility functions:

  • convert a decimal number to binary
  • count the number of times a particular digit occurs in a number
  • determine if a number x divides evenly into a number n

With these helpers, I was ready to apply the definition of specialness:

    return divides(count(1, to_binary(n)), n)
       and divides(count(0, to_binary(n)), n)

Calling to_binary twice on the same argument is wasteful, but Flair doesn't have local variables, either. So I added one more helper to implement the design pattern "Function Call as Variable Assignment", apply_definition:

    function apply_definition(binary_n : integer, n : integer) : boolean

and called it from the program's main:

    return apply_definition(to_binary(n), n)

This is only the beginning. I still have a lot of work to do to implement to_binary, count and divides, using recursive function calls to simulate loops. This is another essential design pattern in Flair-like languages.
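
To give the flavor of the pattern, here is a rough sketch of the whole design in Python -- my students haven't delivered a Flair compiler yet, so Python will have to stand in. The function names come from the design above, but the bodies are only one plausible way to fill them in, written recursively the way Flair requires. One assumption worth flagging: to_binary represents the binary form of n as a decimal integer whose digits are all 0s and 1s, which keeps everything in an integers-only world.

    # A sketch in Python, not Flair: one plausible shape for the helpers.
    # Every function is recursive, because Flair offers no loops.

    def to_binary(n):
        # Represent binary n as a decimal integer of 0s and 1s,
        # e.g., to_binary(10) == 1010.
        if n == 0:
            return 0
        return 10 * to_binary(n // 2) + n % 2

    def count(digit, n):
        # Count how many times digit occurs among the decimal digits of n.
        if n == 0:
            return 0
        return count(digit, n // 10) + (1 if n % 10 == digit else 0)

    def divides(x, n):
        # Does x divide evenly into n? A zero count is ruled out,
        # per the tweet's "(so non-zero)".
        return x != 0 and n % x == 0

    def apply_definition(binary_n, n):
        # "Function Call as Variable Assignment": the caller computes
        # to_binary(n) once and binds it to the parameter binary_n.
        return divides(count(1, binary_n), n) and divides(count(0, binary_n), n)

    def is_special(n):
        return apply_definition(to_binary(n), n)

With these definitions, is_special(10) returns True, while is_special(9) and is_special(3) return False, matching the examples above.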

As I prepared to discuss my new program in class today, I found a bug: my divides test was checking for factors of binary_n, not the decimal n. I also renamed a function and one of its parameters. Explaining my programs to students, a generalization of rubber duck debugging, often helps me see ways to make a program better. That's one of the reasons I like to teach.

Today I asked my students to please write me a Flair compiler so that I can run my program. The course is officially underway.


Posted by Eugene Wallingford | Permalink | Categories: Computing, Patterns, Teaching and Learning

September 13, 2018 3:50 PM

Legacy

In an interview at The Great Discontent, designer John Gall is asked, "What kind of legacy do you hope to leave?" He replies:

I have no idea; it's not something I think about. It's the thing one has the least control over. I just hope that my kids will have nice things to say about me.

I admire this answer.

No one is likely to ask me about my legacy; I'm just an ordinary guy. But it has always seemed strange when people -- presidents, artists, writers, film stars -- are asked this question. The idea that we can or should craft our own legacy like a marketing brand seems venal. We should do things because they matter, because they are worth doing, because they make the world better, or at least better than it would be without us. It also seems like a waste of time. The simple fact is that most of us won't be remembered long beyond our deaths, and then only by close family members and friends. Even presidents, artists, writers, and film stars are mostly forgotten.

To the extent that anyone will have a legacy, it will be decided in the future by others. As Gall notes, we don't have much control over how that will turn out. History is full of people whose place in the public memory turned out much differently than anyone might have guessed at the time.

When I am concerned that I'm not using my time well, it's not because I am thinking of my legacy. It's because I know that time is a precious and limited resource and I feel guilty for wasting it.

About the most any of us can hope is that our actions in this life leave a little seed of improvement in the world after we are gone. Maybe my daughters and former students and friends can make the world better in part because of something in the way I lived. If that's what people mean by their legacy, great, but it's likely to be a pretty nebulous effect. Not many of us can be Einstein or Shakespeare.

All that said, I do hope my daughters have good things to say about me, now and after I'm gone. I love them, and like them a lot. I want to make their lives happier. Being remembered well by them might also indicate that I put my time on Earth to good use.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

September 05, 2018 3:58 PM

Learning by Copying the Textbook

Or: How to Learn Physics, Professional Golfer Edition

Bryson DeChambeau is a professional golfer, in the news recently for consecutive wins in the FedExCup playoff series. But he can also claim an unusual distinction as a student of physics:

In high school, he rewrote his physics textbook.

DeChambeau borrowed the textbook from the library and wrote down everything from the 180-page book into a three-ring binder. He explains: "My parents could have bought one for me, but they had done so much for me in golf that I didn't want to bother them in asking for a $200 book. ... By writing it down myself I was able to understand things on a whole comprehensive level."

I imagine that copying texts word-for-word was a more common learning strategy back when books were harder to come by, and perhaps it will become more common again as textbook prices rise and rise. There is certainly something to be said for it. Writing by hand takes time, and all the while our brains can absorb terms, make connections among concepts, and process the material into long-term memory. Zed Shaw argues for this as a great way to learn computer programming, implementing it as a pedagogical strategy in his "Learn <x> the Hard Way" series of books. (See Learn Python the Hard Way as an example.)

I don't think I've ever copied a textbook word-for-word, and I never copied computer programs from "Byte" magazine, but I do have similar experiences in note taking. I took elaborate notes all through high school, college, and grad school. In grad school, I usually rewrote all of my class notes -- by hand; no home PC -- as I reviewed them in the day or two after class. My clean, rewritten notes had other benefits, too. In a graduate graph algorithms course, they drew the attention of a classmate who became one of my best friends, and they helped catch the eye of the course's professor, who asked me to consider joining his research group. (I was tempted... Graph algorithms was one of my favorite courses and research areas!)

I'm not sure many students these days benefit from this low-tech strategy. Most students who take detailed notes in my course seem to type rather than write, which, if what I've read is correct, has fewer cognitive advantages. But at least those students are engaging with the material consciously. So few students seem to take detailed notes at all these days, and that's a shame. Without notes, it is harder to review ideas, to remember what they found challenging or puzzling in the moment, and to rehearse what they encounter in class into their long-term memories. Then again, maybe I'm just having a "kids these days" moment.

Anyway, I applaud DeChambeau for saving his parents a few dollars and for the achievement of copying an entire physics text. He even realized, perhaps after the fact, that it was an excellent learning strategy.

(The above passage is from The 11 Most Unusual Things About Bryson DeChambeau. He sounds like an interesting guy.)


Posted by Eugene Wallingford | Permalink | Categories: Computing, Teaching and Learning

September 03, 2018 7:24 AM

Lay a Split of Good Oak on the Andirons

There are two spiritual dangers in not owning a farm. One is the danger of supposing that breakfast comes from the grocer, and the other that heat comes from the furnace.

The remedy for the first, according to Aldo Leopold, is to grow a garden, preferably in a place without the temptation and distraction of a grocery store. The remedy for the second is to "lay a split of good oak on the andirons" and let it warm your body "while a February blizzard tosses the trees outside".

I ran across Leopold's A Sand County Almanac in the local nature center late this summer. After thumbing through the pages during a break in a day-long meeting indoors, I added it to my long list of books to read. My reading list is actually a stack, so there was some hope that I might get to it soon -- and some danger that it would be buried before I did.

Then an old high school friend, propagating a meme on Facebook, posted a picture of the book and wrote that it had changed his life, changed how he looked at the world. That caught my attention, so I anchored it atop my stack and checked a copy out of the university library.

It now serves as a quiet read for this city boy on a dark and rainy three-day weekend. There are no February blizzards here yet, of course, but autumn storms have lingered for days. In an important sense, I'm not a "city boy", as my big-city friends will tell me, but I've lived my life mostly sheltered from the reality of growing my own food and heating my home by a wonderful and complex economy of specialized labor that benefits us all. It's good to be reminded sometimes of that good fortune, and also to luxuriate in the idea of experiencing a different kind of life, even if only for a while.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

August 31, 2018 3:06 PM

Reflection on a Friday

If you don't sit facing the window, you could be in any town.

I read that line this morning in Maybe the Cumberland Gap just swallows you whole, where it is a bittersweet observation of the similarities among so many dying towns across Appalachia. It's a really good read, mostly sad but a little hopeful, that applies beyond one region or even one country.

My mind is self-centered, though, and immediately reframed the sentence in a way that cast light on my good fortune.

I just downloaded a couple of papers on return-oriented programming so that I can begin working with an undergraduate on an ambitious research project. I have a homework assignment to grade sitting in my class folder, the first of the semester. This weekend, I'll begin to revise a couple of lectures for my compiler course, on NFAs and DFAs and scanning text. As always, there is a pile of department work to do on my desk and in my mind.

I live in Cedar Falls, Iowa, but if I don't sit facing the window, I could be in Ames or Iowa City, East Lansing or Durham, Boston or Berkeley. And I like the view out of my office window very much, thank you, so I don't even want to trade.

Heading into a three-day weekend, I realize again how fortunate I am. Do I put my good fortune to good enough use?


Posted by Eugene Wallingford | Permalink | Categories: Computing, Personal, Teaching and Learning

August 17, 2018 2:19 PM

LangSec and My Courses for the Year

As a way to get into the right frame of mind for the new semester and the next iteration of my compiler course, I read Michael Hicks's Software Security is a Programming Languages Issue this morning. Hicks incorporates software security into his courses on the principles of programming languages, with two lectures on security before having students study and use Rust. The article has links to lecture slides and supporting material, which makes it a post worth bookmarking.

I started thinking about adding LangSec to my course late in the spring semester, as I brainstormed topics that might spice the rest of the course up for both me and my students. However, time was short, so I stuck with a couple of standalone sessions on topics outside the main outline: optimization and concatenative languages. They worked fine but left me with an itch for something new.

I think I'll use the course Hicks and his colleagues teach as a starting point for figuring out how I might add to next spring's course. Students are interested in security, it's undoubtedly an essential issue for today's grads, and it is a great way to demonstrate how the design of programming languages is more than just the syntax of a loop or the lambda calculus.

Hicks's discussion of Rust also connects with my fall course. Two years ago, an advanced undergrad used Rust as the implementation language for his compiler. He didn't know the language but wanted to pair it with Haskell in his toolbox. The first few weeks of the project were a struggle as he wrestled with mastering ownership and figuring out some new programming patterns. Eventually he hit a nice groove and produced a working compiler with only a couple of small holes.

I was surprised how easy it was for me to install the tools I needed to compile, test, and explore his code. That experience increased my interest in learning the language, too. Adding it to my spring course would give me the last big push I need to buckle down.

This summer has been a blur of administrative stuff, expected and unexpected. The fall semester brings the respite of work I really enjoy: teaching compilers and writing some code. Hurray!


Posted by Eugene Wallingford | Permalink | Categories: Computing, Teaching and Learning

August 09, 2018 1:03 PM

Gerald Weinberg Has Passed Away

I just read on the old Agile/XP mailing list that Jerry Weinberg passed away on Tuesday, August 7. The message hailed Weinberg as "one of the finest thinkers on computer software development". I, like many, was a big fan of his work.

My first encounter with Weinberg came in the mid-1990s when someone recommended The Psychology of Computer Programming to me. It was already over twenty years old, but it captivated me. It augmented years of experience in the trenches developing computer software with a deep understanding of psychology and anthropology and the firm but gentle mindset of a gifted teacher. I still refer back to it after all these years. Whenever I open it up to a random page, I learn something new again. If you've never read it, check it out now. You can buy the ebook -- along with many of Weinberg's books -- online through LeanPub.

After the first book, I was hooked. I never had the opportunity to attend one of Weinberg's workshops, but colleagues lavished them with praise. I should have made more of an effort to attend one. My memory is foggy now, but I do think I exchanged email messages with him once back in the late 1990s. I'll have to see if I can dig them up in one of my mail archives.

Fifteen years ago or so, I picked up a copy of Introduction to General Systems Thinking tossed out by a retiring colleague, and it became the first in a small collection of Weinberg books now on my shelf. As older colleagues retire in the coming years, I would be happy to salvage more titles and extend my collection. It won't be worth much on the open market, but perhaps I'll be able to share my love of Weinberg's work with students and younger colleagues. Books make great gifts, and more so a book by Gerald Weinberg.

Perhaps I'll share them with my non-CS friends and family, too. A couple of summers back, my wife saw a copy of Are Your Lights On?, a book Weinberg co-wrote with Donald Gause, sitting on the floor of my study at home. She read it and liked it a lot. "You get to read books like that for your work?" Yes.

I just read Weinberg's final blog entry earlier this week. He wasn't a prolific blogger, but he wrote a post every week or ten days, usually about consulting, managing, and career development. His final post touched on something that we professors experience at least occasionally: students sometimes solve the problems we set before them better than we expected, or better than we ourselves can do. He reminded people not to be defensive, even if it's hard, and to see the situation as an opportunity to learn:

When I was a little boy, my father challenged me to learn something new every day before allowing myself to go to bed. Learning new things all the time is perhaps the most important behavior in my life. It's certainly the most important behavior in our profession.

Weinberg was teaching us to the end, with grace and gratitude. I will miss him.

Oh, and one last personal note: I didn't know until after he passed that we shared the same birthday, a few years apart. A meaningless coincidence, of course, but it made me smile.


Posted by Eugene Wallingford | Permalink | Categories: General, Software Development, Teaching and Learning

August 07, 2018 3:04 PM

Too Bad Richard Feynman Didn't Have a Blog

There is a chapter in "Surely You're Joking, Mr. Feynman" about Feynman's work with biologists over summers and sabbaticals at Princeton and Caltech. He used a sabbatical year to work in a colleague's lab on bacteriophages, ribosomes, and RNA. After describing how he had ruined a potentially "fantastic and vital discovery" through sloppiness, he writes:

The other work on the phage I never wrote up -- Edgar kept asking me to write it up, but I never got around to it. That's the trouble with not being in your own field: You don't take it seriously.
I did write something informally on it. I sent it to Edgar, who laughed when he read it. It wasn't in the standard form that biologists use -- first, procedures, and so forth. I spent a lot of time explaining things that all the biologists knew. Edgar made a shortened version, but I couldn't understand it. I don't think they ever published it. I never published it directly.

Too bad Feynman didn't have a blog. I'll bet I could have learned something from his write-up. Not being a biologist, I generally can use some explanation intended for a lay reader, and Feynman's relaxed style might pull me through a biology paper. (Of all the sciences, biology is usually the biggest chore for me to learn.)

These days, scientists can post their informal writings on their blogs with little or no fuss. Standard form and formal style are for journals and conferences. Blog readers prefer relaxed writing and, for the most part, whatever form works best for the writer in order to get the ideas out to the world.

Imagine what a trove of stories Feynman could have told on his blog! He did tell them, of course, but in books like "Surely You're Joking, Mr. Feynman". But not everyone is going to write books, or have books written for them, so I'm glad to have the blogs of scientists, economists, and writers from many disciplines in my newsreader. For those who want something more formal before, or instead of, taking on the journal grind, we have arXiv.org. What a time to be alive.

Of course, when you read on in the chapter, you learn that James Watson (of Watson & Crick fame) heard about Feynman's work, thought it was interesting, invited Feynman to give a seminar talk at Harvard, and then went into the lab with him to conduct an experiment that very same week. I guess it all worked out for Feynman in the end.


Posted by Eugene Wallingford | Permalink | Categories: General

August 05, 2018 10:21 AM

Three Uses of the Knife

I just finished David Mamet's Three Uses of the Knife, a wide-ranging short book with the subtitle: "on the nature and purpose of drama". It is an extended essay on how we create and experience drama -- and how these are, in the case of great drama, the same journey.

Even though the book is only eighty or so pages, Mamet characterizes drama in so many ways that you'll have to either assemble a definition yourself or accept the ambiguity. Among them, he says that the job of drama and art is to "delight" us and that "the cleansing lesson of the drama is, at its highest, the worthlessness of reason."

Mamet clearly believes that drama is central to other parts of life. Here's a cynical example, about politics:

The vote is our ticket to the drama, and the politician's quest to eradicate "fill in the blank", is no different from the promise of the superstar of the summer movie to subdue the villain -- both promise us diversion for the price of a ticket and a suspension of disbelief.

As reader, I found myself using the book's points to ruminate about other parts of life, too. Consider the first line of the second essay:

The problems of the second half are not the problems of the first half.

Mamet uses this to launch into a consideration of the second act of a drama, which he holds equally to be a consideration of writing the second act of a drama. But with fall semester almost upon us, my thoughts jumped immediately to teaching a class. The problems of teaching the second half of a class are quite different from the problems of teaching the first half. The start of a course requires the instructor to lay the foundation of a topic while often convincing students that they are capable of learning it. By midterm, the problems include maintaining the students' interest as their energy flags and the work of the semester begins to overwhelm them. The instructor's energy -- my energy -- begins to flag, too, which echoes Mamet's claim that the journey of the creator and the audience are often substantially the same.

A theme throughout the book is how people immerse themselves in story, suspending their disbelief, even creating story when they need it to soothe their unease. Late in the book, he connects this theme to religious experience as well. Here's one example:

In suspending their disbelief -- in suspending their reason, if you will -- for a moment, the viewers [of a magic show] were rewarded. They committed an act of faith, or of submission. And like those who rise refreshed from prayers, their prayers were answered. For the purpose of the prayer was not, finally, to bring about intercession in the material world, but to lay down, for the time of the prayer, one's confusion and rage and sorrow at one's own powerlessness.

This all makes the book sound pretty serious. It's a quick read, though, and Mamet writes with humor, too. It feels light even as it seems to be a philosophical work.

The following paragraph wasn't intended as humorous but made me, a computer scientist, chuckle:

The human mind cannot create a progression of random numbers. Years ago computer programs were created to do so; recently it has been discovered that they were flawed -- the numbers were not truly random. Our intelligence was incapable of creating a random progression and therefore of programming a computer to do so.

This reminded me of a comment that my cognitive psychology prof left on the back of an essay I wrote in class. He wrote something to the effect, "This paper gets several of the particulars incorrect, but then that wasn't the point. It tells the right story well." That's how I felt about this paragraph: it is wrong on a couple of important facts, but it advances the important story Mamet is telling ... about the human propensity to tell stories, and especially to create order out of our experiences.

Oh, and thanks to Anna Gát for bringing the book to my attention, in a tweet to Michael Nielsen. Gát has been one of my favorite new follows on Twitter in the last few months. She seems to read a variety of cool stuff and tweet about it. I like that.


Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Personal

July 31, 2018 4:23 PM

Software Projects, Potential Employers, and Memories

I spent a couple of hours this morning at a roundtable discussion listening to area tech employers talk about their work and their companies' needs. It was pretty enjoyable (well, except perhaps for the CEO who too frequently prefaced his remarks with "What the education system needs to understand is ..."). To a company, they all place a lot of value on the projects that job candidates have done. Their comments reminded me of an old MAA blog post in which a recent grad said:

During the fall of my junior year, I applied for an internship at Red Ventures, a data analytics and technology company just outside Charlotte. Throughout the rigorous interview process, it wasn't my GPA that stood out. I stood out among the applicants, in part, because I was able to discuss multiple projects I had taken ownership of and was extremely passionate about.

I encourage this mentality in my students, though I think "passionate about" is too strong a condition (not to mention cliché). Students should have a few projects that they are interested in, or proud of, or maybe just completed.

Most of the students taking my compiler course this fall won't be applying for a compiler job when they graduate, but they will have written a compiler as part of a team. They will have met a spec, collaborated on code, and delivered a working product. That is evidence of skill, to be sure, but also of hard work and persistence. It's a significant accomplishment.

The students who take our intelligent systems course or our real-time embedded systems course will be able to say the same thing. Some students will also be able to point to code they wrote for club or personal projects. The key is to build things, care about them, and "deliver", whatever that means in the context of that particular project.

I made note of one new piece of advice to give our students, offered by a former student I mentioned in a blog post many years ago who is now head of a local development team for mobile game developer Jam City: Keep all the code you write. It can be a GitHub repo, as many people now recommend, but it doesn't have to be. A simple zip file organized by courses and projects can be enough. Such a portfolio can show prospective employers what you've done, how you've grown, and how much you care about the things you make. It can say a lot.

You might even want to keep all that code for Future You. I'm old enough that it was difficult to keep digital copies of all the code I wrote in college. I have a few programs from my undergrad days and a few more from grad school, which have migrated across storage media as time passed, but I am missing much of my class work as a young undergrad and all of the code I wrote in high school. I sometimes wish I could look back at some of that code...


Posted by Eugene Wallingford | Permalink | Categories: Personal, Software Development, Teaching and Learning