November 10, 2019 11:06 AM

Three of the Hundred Falsehoods CS Students Believe

Jan Schaumann recently posted a list of one hundred Falsehoods CS Students (Still) Believe Upon Graduating. There is much good fun here, especially for a prof who tries to help CS students get ready for the world, and a fair amount of truth, too. I will limit my brief comments to three items that have been on my mind recently, even before reading this list.

18. 'Email' and 'Gmail' are synonymous.

CS grads are users, too, and their use of Gmail, and systems modeled after it, contributes to the truths of modern email: top-posting all the time, with never a thought of trimming anything. Two-line messages sitting atop icebergs of text that will never be read again, only stored in the seemingly infinite space given to us for free.

Of course, some of our grads end up in corporate IT, managing email as merely one tool in a suite of lowest-common-denominator tools for corporate communication. The idea of email as a stream of text that can, for the most part, be read as such is gone -- let alone the idea that a mail stream can be processed by programs such as procmail to great benefit.
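
For anyone who has never seen procmail, here is a minimal sketch of the kind of recipe I mean; the mailing list address and folder name are hypothetical, just for illustration:

    # file anything from a (hypothetical) mailing list into its own
    # folder, instead of letting it pile up on top of the inbox
    :0:
    * ^List-Id:.*dev-list\.example\.org
    lists/dev-list

A few dozen lines like this turn a raw mail stream into a set of organized queues -- no web filter UI required.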

I realize that most users don't ask for anything more than a simple Gmail filter to manage their mail experience, but I really wish it were easier for more users with programming skills to put those skills to good use. Alas, that does not fit into the corporate IT model, and not even the CS grads running many of these IT operations realize or care what is possible.

38. Employers care about which courses they took.

It's the time of year when students register for spring semester courses, so I've been meeting with a lot of students. (Twice as many as usual, covering for a colleague on sabbatical.) It's interesting to encounter students on both ends of the continuum between not caring at all what courses they take and caring a bit too much. The former are so incurious I wonder how they fell into the major at all. The latter are often more curious but sometimes are captive to the idea that they must, must, must take a specific course, even if it meets at a time they can't attend or is full by the time they register.

I do my best to help them get into these courses, either this spring or in a later semester, but I also try to do a little teaching along the way. Students will learn useful and important things in just about every course they take, if they want to, and taking any particular course does not have to be either the beginning or the end of their learning of that topic. And if the reason they think they must take a particular course is that future employers will care, they are going to be surprised. Most of the employers who interview our students are looking for well-rounded CS grads who have a solid foundation in the discipline and who can learn new things as needed.

90. Two people with a CS degree will have a very similar background and shared experience/knowledge.

This falsehood operates in a similar space to #38, but at the global level I reached at the end of my previous paragraph. Even students who take most of the same courses together will usually end their four years in the program with very different knowledge and experiences. Students connect with different things in each course, and these idiosyncratic memories build on one another in subsequent courses. They participate in different extracurricular activities and work different part-time jobs, both of which shape and augment what they learn in class.

In the course of advising students over two, three, or four years, I try to help them see that their studies and other experiences are helping them to become interesting people who know more than they realize and who are individuals, different in some respects from all their classmates. They will be able to present themselves to future employers in ways that distinguish them from everyone else. That's often the key to getting the job they desire now, or perhaps one they didn't even realize they were preparing for while exploring new ideas and building their skillsets.


Posted by Eugene Wallingford | Permalink | Categories: Computing, Teaching and Learning

October 30, 2019 3:30 PM

A Few Ideas from Economist Peter Bernstein

I found all kinds of wisdom in this interview with economist Peter Bernstein. It was originally published in 2004 and then updated online a couple of years ago. A lot of the wisdom sounds familiar, as most general wisdom does, but occasionally Bernstein offers a twist. For instance, I like this passage:

I make no excuses or apologies for changing my mind. The world around me changes, for one thing, but also I am continuously learning. I have never finished my education and probably never will.... I'm always telling myself, "I must sit down and explain why I said this, and why I was wrong."

People often speak of the virtue of changing our minds, but Bernstein goes further: he feels a need to explain both the reason he thought what he did and the reason he was wrong. That sort of post-mortem can be immensely helpful to the rest of us as we try to learn, and the humility of explaining the error keeps us all better grounded.

I found quotable passages on almost every page. One of them quoted Leibniz, which I paraphrase as:

Leibniz told Bernoulli that nature works in patterns, but "only for the most part". The other part -- the unpredictable part -- tends to be where the action is.

Poking around the fringes of a model that is pretty good or a pattern of thought that only occasionally fails us often brings surprising opportunities for advancement.

Many of Bernstein's ideas were framed specifically in terms of investing, of course, such as:

The riskiest moment is when you're right. That's when you're in the most trouble, because you tend to overstay the good decisions.

and:

Diversification is not only a survival strategy but also an aggressive strategy, because the next windfall might come from a surprising place.

These ideas are powerful outside the financial world, too. Investing too much importance in a productive research area can be risky, because it becomes easy to stay there too long after the world starts to move away. Diversifying our programming language skills and toolsets might look like a conservative strategy that limits rapid advance in a research niche right now, but it also equips us to adapt more quickly when the next big idea happens somewhere we don't expect.

Anyway, the interview is a good long-but-quick read. There's plenty more to consider, in particular his application of Pascal's wager to general decision making. Give it a read if it sounds interesting.


Posted by Eugene Wallingford | Permalink | Categories: General

October 27, 2019 10:23 AM

Making Something That Is Part Of Who You Are

The narrator in Rachel Cusk's "Transit" relates a story told to her by Pavel, the Polish builder who is helping to renovate her flat. Pavel left Poland for London to make money after falling out with his father, a builder for whom he worked. The event that prompted his departure was a reaction to a reaction. Pavel had designed and built a home for his family. After finishing, he showed it to his father. His father didn't like it, and said so. Pavel chose to leave at that moment.

'All my life,' he said, 'he criticise. He criticise my work, my idea, he say he don't like the way I talk -- even he criticise my wife and my children. But when he criticise my house' -- Pavel pursed his lips in a smile -- 'then I think, okay, is enough.'

I generally try to separate myself from the code and prose I write. Such distance is good for the soul, which does not need to be buffeted by criticism, whether external or internal, of the things I've created. It is also good for the work itself, which is free to be changed without being anchored to my identity.

Fortunately, I came out of home and school with a decent sense that I could be proud of the things I create without conflating the work with who I am. Participating in writers' workshops at PLoP conferences early in my career taught me some new tools for hearing feedback objectively and focusing on the work. Those same tools help me to give feedback better. I use them in an effort to help my students develop as people, writers, and programmers, independent of the code and prose they write.

Sometimes, though, we make things that are expressions of ourselves. They carry part of us in their words, in what they say to the world and how they say it. Pavel's house is such a creation. He made everything: the floors, the doors, and the roof; even the beds his children slept in. His father had criticized his work, his ideas, his family before. But criticizing the house he had dreamed and built -- that was enough. Cusk doesn't give the reader a sense that this criticism was a last straw; it was, in a very real way, the only straw that mattered.

I think there are people in this world who would like just once in their lives to make something that is so much a part of who they are that they feel about it as Pavel does his house. They wish to do so despite, or perhaps because of, the sharp line it would draw through the center of life.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

October 25, 2019 3:55 PM

Enjoyment Bias in Programming

Earlier this week, I read this snippet about the benefits of "enjoyment bias" in Morgan Housel's latest blog post:

2. Enjoyment bias: An inefficient investing strategy that you enjoy will outperform an efficient one that feels like work because anything that feels like work will eventually be abandoned.
Getting anything to work requires giving it an appropriate amount of time. Giving it time requires not getting bored or burning out. Not getting bored or burning out requires that you love what you're doing, because that's when the hard parts become acceptable.

The programmer in me immediately thought, "I have this pattern." My guess is that this bias applies to a lot of things outside of investing. In software development, the choices of development methodology and programming language often benefit from enjoyment bias.

In programming as in investing, we can take this too far and hurt ourselves, our teams, and our users. Anything can be overdone. But, in general, we are more likely to stick with the hard work of building software when we enjoy the way we are building it and the tools we are using. Don't let others shame you away from what works for you.

This bias actually reminded me of a short bit from one of Paul Graham's essays on, of all things, procrastination:

I think the way to "solve" the problem of procrastination is to let delight pull you instead of making a to-do list push you. Work on an ambitious project you really enjoy, and sail as close to the wind as you can, and you'll leave the right things undone.

Delight can keep you happily working when the going gets rough, and it can pull you toward work when a lack of delight would leave you killing time on stuff that doesn't matter.

(By the way, I think that several other biases described by Housel are also useful in programming. Consider the value of reasonable ignorance, number three on his list....)


Posted by Eugene Wallingford | Permalink | Categories: Patterns, Software Development

September 14, 2019 2:56 PM

Listen Now

In a YC Female Founder Story, Danielle Morrill gives a wise answer to an old question:

Q: What do you wish someone had told you when you were 15?
I think people were telling me a lot of helpful things when I was 15 but it was very hard to listen.

This may seem more like a wry observation than a useful bit of wisdom. The fifteen-year-olds of today are no more likely to listen to us than we were to listen to adults when we were fifteen. But that presumes young people have more to learn than the rest of us. I'm a lot older than 15, and I still have plenty to learn.

Morrill's answer is a reminder to me to listen more carefully to what people are telling me now. Even now that can be hard, with all the noise out there and with my own ego getting in my way. Setting up my attention systems to identify valuable signals more reliably can help me learn faster and make me a lot more productive. It can also help future-me not want to look back wistfully so often, wishing someone had told me now what I will know then.


Posted by Eugene Wallingford | Permalink | Categories: General, Personal

September 13, 2019 3:12 PM

How a Government Boondoggle Paved the Way for the Expansion of Computing

In an old interview at Alphachatterbox, economist Brad DeLong adds another programming tale to the annals of unintended consequences:

So the Sage Air Defense system, which never produced a single usable line of software running on any piece of hardware -- we spent more on the Sage Air Defense System than we did on the entire Manhattan Project. And it was in one sense the ultimate government Defense Department boondoggle. But on the other hand it trained a whole generation of computer programmers at a time when very little else was useful that computer programmers could exercise their skills on.
And by the time the 1960s rolled around we not only ... the fact that Sage had almost worked provided say American Airlines with the idea that maybe they should do a computer-driven reservations system for their air travel, which I think was the next big Manhattan Project-scale computer programming project.
And as that moved on the computer programmers began finding more and more things to do, especially after IBM developed its System 360.
And we were off and running.

As DeLong says earlier in the conversation, this development upended IBM president Thomas Watson's alleged claim that there was "a use for maybe five computers in the world". This famous quote is almost certainly an urban legend, but Watson would not have been as off-base as people claim even if he had said it. In the 1950s, there was not yet a widespread need for what computers did, precisely because most people did not yet understand how computing could change the landscape of every activity. Training a slew of programmers for a project that ultimately failed had the unexpected consequence of creating the intellectual and creative capital necessary to begin exploring the ubiquitous applications of computing. Money unexpectedly well spent.


Posted by Eugene Wallingford | Permalink | Categories: Computing, Software Development

September 12, 2019 3:57 PM

Pain and Shame

Today's lecture notes for my course include a link to @KentBeck's article on Prune, which I still enjoy.

The line that merits its link in today's session is:

We wrote an ugly, fragile state machine for our typeahead, which quickly became a source of pain and shame.

My students will likely soon experience those emotions about the state machines they are building for the lexers in their semester-long compiler project. I reassure them: these emotions are normal for programmers.
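
For readers outside the course, here is a minimal sketch in Python of the kind of hand-rolled state machine a lexer starts out as. The token names and character classes are hypothetical, not taken from Prune or from my students' compilers:

    # a tiny state-machine lexer: the current character determines which
    # state we enter, and each state consumes input until its token ends
    def lex(source):
        tokens = []
        i, n = 0, len(source)
        while i < n:
            ch = source[i]
            if ch.isspace():                  # skip whitespace between tokens
                i += 1
            elif ch.isdigit():                # state: scanning a number
                start = i
                while i < n and source[i].isdigit():
                    i += 1
                tokens.append(("NUMBER", source[start:i]))
            elif ch.isalpha():                # state: scanning an identifier
                start = i
                while i < n and source[i].isalnum():
                    i += 1
                tokens.append(("IDENT", source[start:i]))
            elif ch in "+-*/()":              # single-character operators
                tokens.append(("OP", ch))
                i += 1
            else:
                raise ValueError(f"unexpected character {ch!r} at {i}")
        return tokens

    # lex("x1 + 42") => [('IDENT', 'x1'), ('OP', '+'), ('NUMBER', '42')]

Ugly and fragile is about right: every new token type adds another arm to the dispatch, and the edge cases multiply from there.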


Posted by Eugene Wallingford | Permalink | Categories: Computing, Software Development

August 30, 2019 4:26 PM

Unknown Knowns and Explanation-Based Learning

Like me, you probably see references to this classic quote from Donald Rumsfeld all the time:

There are known knowns; there are things we know we know. We also know there are known unknowns; that is to say, we know there are some things we do not know. But there are also unknown unknowns -- the ones we don't know we don't know.

I recently ran across it again in an old Epsilon Theory post that uses it to frame the difference between decision making under risk (the known unknowns) and decision making under uncertainty (the unknown unknowns). It's a good read.

Seeing the passage again for the umpteenth time, it occurred to me that no one ever seems to talk about the fourth quadrant in that grid: the unknown knowns. A quick web search turns up a few articles such as this one, which consider unknown knowns from the perspective of others in a community: maybe there are other people who know something that you do not. But my curiosity was focused on the first-person perspective that Rumsfeld was implying. As a knower, what does it mean for something to be an unknown known?

My first thought was that this combination might not be all that useful in the real world, such as the investing context that Ben Hunt writes about in Epsilon Theory. Perhaps it doesn't make any sense to think about things you don't know that you know.

As a student of AI, though, I suddenly made an odd connection ... to explanation-based learning. As I described in a blog post twelve years ago:

Back when I taught Artificial Intelligence every year, I used to relate a story from Russell and Norvig when talking about the role knowledge plays in how an agent can learn. Here is the quote that was my inspiration, from Pages 687-688 of their 2nd edition:

Sometimes one leaps to general conclusions after only one observation. Gary Larson once drew a cartoon in which a bespectacled caveman, Zog, is roasting his lizard on the end of a pointed stick. He is watched by an amazed crowd of his less intellectual contemporaries, who have been using their bare hands to hold their victuals over the fire. This enlightening experience is enough to convince the watchers of a general principle of painless cooking.

I continued to use this story long after I had moved on from this textbook, because it is a wonderful example of explanation-based learning.

In a mathematical sense, explanation-based learning isn't learning at all. The new fact that the program learns follows directly from other facts and inference rules already in its database. In EBL, the program constructs a proof of a new fact and adds the fact to its database, so that it is ready-at-hand the next time it needs it. The program has compiled a new fact, but in principle it doesn't know anything more than it did before, because it could always have deduced that fact from things it already knows.
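
Here is a toy sketch in Python of the caching step just described. The facts, rule format, and predicate names are hypothetical, chosen only to make the mechanics concrete:

    # a tiny fact base and one inference rule; variables are "?x"-style strings
    facts = {("parent", "zog", "og"), ("parent", "og", "ug")}
    rules = [
        ((("parent", "?x", "?y"), ("parent", "?y", "?z")),   # premises
         ("grandparent", "?x", "?z")),                       # conclusion
    ]

    def match(pattern, fact, bindings):
        # unify a pattern with a ground fact; return extended bindings or None
        if len(pattern) != len(fact):
            return None
        b = dict(bindings)
        for p, f in zip(pattern, fact):
            if p.startswith("?"):
                if b.setdefault(p, f) != f:
                    return None
            elif p != f:
                return None
        return b

    def prove(goal):
        # chain through the rules; on success, compile the new fact into the
        # fact base so we never pay for this inference again
        if goal in facts:
            return True
        for premises, conclusion in rules:
            candidates = [{}]
            for premise in premises:          # satisfy each premise in turn
                candidates = [b2 for b in candidates for f in facts
                              if (b2 := match(premise, f, b)) is not None]
            for b in candidates:
                if tuple(b.get(t, t) for t in conclusion) == goal:
                    facts.add(goal)           # the explanation-based step
                    return True
        return False

    print(prove(("grandparent", "zog", "ug")))   # True: deduced the slow way
    print(prove(("grandparent", "zog", "ug")))   # True: instant lookup this time

Real EBL systems store a generalized version of the explanation rather than the single instance, but the effect is the same: the second call returns immediately, no proof required.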

As I read the Epsilon Theory article, it struck me that EBL helps a learner to surface unknown knowns by using specific experiences as triggers to combine knowledge it already has into a piece of knowledge that is usable immediately, without having to repeat the (perhaps costly) chain of inference ever again. Deducing deep truths every time you need them can indeed be quite costly, as anyone who has ever looked at the complexity of search in logical inference systems can tell you.

When I begin to think about unknown knowns in this way, perhaps it does make sense in some real-world scenarios to think about things you don't know you know. If I can figure it all out, maybe I can finally make my fortune in the stock market.


Posted by Eugene Wallingford | Permalink | Categories: Computing, Patterns, Teaching and Learning

August 25, 2019 10:00 AM

Learn the Basics, Struggle a Bit, Then Ask Questions

Earlier this week, there was a meme on Twitter where people gave one-line advice to young students as they stepped onto a college campus as first-years, to help them enjoy and benefit from their college years. I didn't have anything clever or new to say, so I didn't join in, but something I read this morning triggered a bit of old wisdom that I wish more students would try to live out. In tweet-size form, it might be: "Learn the basics, struggle a bit, then ask questions." Here's the blog-size version.

In Tyler Cowen's conversation with Google economist Hal Varian, Cowen asks about a piece of advice Varian had once given to graduate students: "Don't look at the literature too soon." Is that still good advice, and why? Yes, Varian replied...

VARIAN: Because if you look at the literature, you'll see this completely worked-out problem, and you'll be captured by that person's viewpoint. Whereas, if you flounder around a little bit yourself, who knows? You might come across a completely different phenomenon. Now, you do have to look at the literature. I want to emphasize that. But it's a good idea to wrestle with a problem a little bit on your own before you adopt the standard viewpoint.

Grad students are often trying to create new knowledge, so it's best for them not to lock themselves into existing ways of thinking too soon. Thus: Don't look at the literature too soon.

I work mostly with undergrads, who study in a different context than grad students. But I think that the core of Varian's advice works well for undergrads, too: Start by learning a few basic ideas in class. Then try to solve problems. Then ask questions.

Undergrads are usually trying to master foundational material, not create new knowledge, so it's tempting to want to jump straight to answers. But it's still valuable to approach the task of learning as a process of building one's own understanding of problems before seeking answers. Banging on a bunch of problems helps us to build instincts about what the important issues are and to explore the fuzzy perimeter between the basics and the open questions that will vex us after we master them. That happens best when we don't see a solution right away, when what we learned in class doesn't seem to point us directly to a solution and we have to find our own way.

But do ask questions! A common theme among students who struggle in my courses is the belief that they just have to work harder or longer on a problem. Too many times I've had a student tell me "I spent an hour on each of the five homework problems." Really? My goal is for each problem to take 15 minutes or less. After half an hour, or maybe a second attempt the next day, chances are you are missing something small but important. Ask a question; maybe a little nudge can put you on the right track. Sometimes, your question will help me realize that it's the problem which is flawed and needs a tweak!

Back at the beginning of the process, too strong a belief in the ability to figure things out on one's own creates a different sort of breakdown in the learning process: It can be tempting to skip over what you read in your textbook and what you learn in class, and start trying to solve problems. "It's basic material, right? I'll figure it out." You might, but that's taking the idea to an unhealthy level. There's a difference between seeking answers too soon and trying to solve problems without the basic tools you need. Trust your profs a little bit... In class, they are trying to give you the basic tools you need to solve interesting problems.

There's nothing new here. But let's be honest; there isn't much new to be found in ways to learn. Even in the digital age, the basic tenets remain true. That's why I extol curiosity and persistence and why I'd rather be Mr. Miyagi than an answer machine. Learning will be uncomfortable. The trick is to find a way to balance the curiosity with the discomfort, the not-knowing with the receiving of answers and moving on. I wish I had great advice for how to find that balance, but I think people ultimately have to do that for themselves. We benefit by being part of a community of learners, but we each learn in our own ways and on our own time.

Actually, writing up this post has led me to a goal for myself as a teacher this year, one that may also be good advice for my fellow teachers: Be more explicit about my expectations of students. This is true both at the micro-level of, say, how much time to spend on homework problems before seeking help, and at the macro-level of how to approach learning. If I want students to do something, I should at least remove the barriers between what they are thinking they should do and what I would like for them to do.

So there's some advice for students and some advice for teachers. Let's enjoy the new year and learn together.


Posted by Eugene Wallingford | Permalink | Categories: Teaching and Learning

August 08, 2019 2:42 PM

Encountering an Old Idea Three Times in Ten Days

I hope to eventually write up a reflection on my first Dagstuhl seminar, but for now I have a short story about how I encountered a new idea three times in ten days, purely by coincidence. Actually, the idea is over one hundred fifty years old, but as my brother often says, "Hey, it's new to me."

On the second day of Dagstuhl, Mark Guzdial presented a poster showing several inspirations for his current thinking about task-specific programming languages. In addition to displaying screenshots of two cool software tools, the poster included a picture of an old mechanical device that looked both familiar and strange. Telegraphy had been invented in the early 1840s, and telegraph operators needed some way to type messages. But how? The QWERTY keyboard was not created for the typewriter until the early 1870s, and no other such devices were in common use yet. To meet the need, Royal Earl House adapted a portion of a piano keyboard to create the input device for the "printing telegraph", or teleprinter. The photo on Mark's poster looked similar to the one on the Wikipedia page for the teleprinter.

There was a need for a keyboard thirty years before anyone designed a standard typing interface, so telegraphers adapted an existing tool to fit their needs. What if we are in that same thirty-year gap in the design of programming languages? This has been one of Mark's inspirations as he works with non-computer scientists on task-specific programming languages. I had never seen an 1870s teleprinter before and thought its keyboard to be a rather ingenious way to solve a very specific problem with a tool borrowed from another domain.

When Dagstuhl ended, my wife and I spent another ten days in Europe on a much-needed vacation. Our first stop was Paris, and on our first full day there we visited the museum of the Conservatoire National des Arts et Métiers. As we moved into the more recent exhibits of the museum, what should I see but...

a Hughes teleprinter with piano-style keyboard, circa 1875, in the CNAM museum, Paris

... a Hughes teleprinter with piano-style keyboard, circa 1875. Déjà vu! I snapped a photo, even though the device was behind glass, and planned to share it with Mark when I got home.

We concluded our vacation with a few days in Martinici, Montenegro, the hometown of a department colleague and his wife. They still have a lot of family in the old country and spend their summers there working and relaxing. On our last day in this beautiful country, we visited its national historical museum, which is part of the National Museum of Montenegro in the royal capital of Cetinje. One of the country's most influential princes was a collector of modern technology, and many of his artifacts are in the museum -- including:

a teleprinter with piano-style keyboard in the Historical Museum of Montenegro, Cetinje

This full-desk teleprinter was close enough to touch and examine up close. (I didn't touch!) The piano keyboard on the device shows the wear of heavy use, which brings to mind each of my laptops' keyboards after a couple of years. Again, I snapped a photo, this time in fading light, and made a note to pass it on.

In ten days, I went from never having heard much about a "printing telegraph" to seeing a photo of one, hearing how it is an inspiration for research in programming language design, and then seeing two such devices that had been used in the 19th-century heyday of telegraphy. It was an unexpected intersection of my professional and personal lives. I must say, though, that having heard Mark's story made the museum pieces leap into my attention in a way that they might not have otherwise. The coincidence added a spark to each encounter.


Posted by Eugene Wallingford | Permalink | Categories: Computing, Personal