April 17, 2018 4:25 PM

The Tension Among Motivation, Real Problems, and Hard Work

Christiaan Huygens's first pendulum clock

In today's edition of "Sounds great, but...", I point you to Learn calculus like Huygens, a blog entry that relates some interesting history about the relationship between Gottfried Leibniz and Christiaan Huygens, "the greatest mathematician in the generation before Newton and Leibniz". It's a cool story: a genius of the old generation learns a paradigm-shifting new discipline via correspondence with a genius of the new generation who is in the process of mapping the discipline.

The "Sounds great, but..." part comes near the end of the article, when the author extrapolates from Huygens's attitude to what is wrong with math education these days. It seems that Huygens wanted to see connections between the crazy new operations he was learning, including second derivatives, and the real world. Without seeing these connections, he wasn't as motivated to put in the effort to learn them.

The author then asserts:

The point is not that mathematics needs to be applied. It is that it needs to be motivated. We don't study nature because we refuse to admit value in abstract mathematics. We study nature because she has repeatedly proven herself to have excellent mathematical taste, which is more than can be said for the run-of-the-mill mathematicians who have to invent technical pseudo-problems because they can't solve any real ones.

Yikes, that got ugly fast. And it gets uglier, with the author eventually worrying that we alienate present-day Huygenses with a mass of boring problems that are disconnected from reality.

I actually love the heart of that paragraph: We don't study nature because we refuse to admit value in abstract mathematics. We study nature because she has repeatedly proven herself to have excellent mathematical taste.... This is a reasonable claim, and almost poetic. But the idea that pseudo-problems invented by run-of-the-mill mathematicians are the reason students today aren't motivated to learn calculus or other advanced mathematics seems like a massive overreach.

I'm sympathetic to the author's position. I watched my daughters slog through AP Calculus, solving many abstract problems and many applied problems that had only a thin veneer of reality wrapped around them. As someone who enjoyed puzzles for puzzles' sake, I had enjoyed all of my calculus courses, but it seemed as if my daughters and many of their classmates never felt the sort of motivation that Huygens craved and Leibniz delivered.

I also see many computer science students slog through courses in which they learn to program, apply computational theory to problems, and study the intricate workings of software and hardware systems. Abstract problems are a fine way to learn how to program, but they don't always motivate students to put in a lot of work on challenging material. Real problems can be too unruly for many settings, though, so simplified, abstract problems are common.

But fixing this problem is not as easy as saying "learn calculus like Huygens: solve real problems!". There are a number of impediments to this being a straightforward solution in practice.

One is the need for domain knowledge. Few, if any, of the students sitting in today's calculus classes have much in common with Huygens, a brilliant natural scientist and inventor who had spent his life investigating hard problems. He brought a wealth of knowledge to his study of mathematics. I'm guessing that Leibniz didn't have to search long to find applications with which Huygens was already familiar and whose solutions he cared about.

Maybe in the old days all math students were learning a lot of science at the same time as they learned math, but that is not always so now. In order to motivate students with real problems, you need real problems from many domains, in hopes of hitting all students' backgrounds and interests. Even then, you may not cover them all. And, even if you do, you need lots of problems for them to practice on.

I think about these problems every day from the perspective of a computer science prof, and I think there are a lot of parallels between motivating math students and motivating CS students. How do I give my students problems from domains they both know something about and are curious enough to learn more about? How do I do that in a room with thirty-five students with as many different backgrounds? How do I do that in the amount of time I have to develop and extend my course?

Switching to a computer science perspective brings to mind a second impediment to the "solve real problems" mantra. CS education research offers some evidence that using context-laden problems, even from familiar contexts, can make it more difficult for students to solve programming problems. The authors of the linked paper say:

Our results suggest that any advantage conveyed by a familiar context is dominated by other factors, such as the complexity of terminology used in the description, the length of the problem description, and the availability of examples. This suggests that educators should focus on simplicity of language and the development of examples, rather than seeking contexts that may aid in understanding problems.

Using familiar problems to learn new techniques may help motivate students initially, but that may come at other costs. Complexity and confusion can be demotivating.

So, "learn calculus like Huygens" sounds great, but it's not quite so easy to implement in practice. After many years designing and teaching courses, I have a lot of sympathy for the writers of calculus and intro programming textbooks. I also don't think it gets much easier as students advance through the curriculum. Some students are motivated no matter what the instructor does; others need help. The tension between motivation and the hard work needed to master new techniques is always there. Claims that the tension is easy to resolve are usually too glib to be helpful.

The Huygens-Leibniz tale really is a cool story, though. You might enjoy it.

(The image above is a sketch of Christiaan Huygens's first pendulum clock, from 1657. Source: Wikipedia.)

Posted by Eugene Wallingford | Permalink | Categories: Teaching and Learning

April 06, 2018 3:19 PM

Maps and Abstractions

I've been reading my way through Frank Chimero's talks online and ran across a great bit on maps and interaction design in What Screens Want. One of the paragraphs made me think about the abstractions that show up in CS courses:

When I realized that, a little light went off in my head: a map's biases do service to one need, but distort everything else. Meaning, they misinform and confuse those with different needs.

CS courses are full of abstractions and models of complex systems. We use examples, often simplified, to expose or emphasize a single facet of a system, as a way to help students cut through the complexity. For example, compilers and full-strength interpreters are complicated programs, so we start with simple interpreters operating over simple languages. Students get their feet wet without drowning in detail.
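To make the example concrete, here is the sort of starter interpreter I have in mind, sketched in Python over a language of nothing but numbers and binary arithmetic. (The representation and names here are my own illustration, not a prescription.)

```python
# A tiny interpreter for a language of numbers and binary arithmetic.
# Expressions are nested tuples: ("+", 1, ("*", 2, 3)) means 1 + (2 * 3).

OPS = {"+": lambda a, b: a + b,
       "-": lambda a, b: a - b,
       "*": lambda a, b: a * b,
       "/": lambda a, b: a / b}

def evaluate(expr):
    if isinstance(expr, (int, float)):   # a number evaluates to itself
        return expr
    op, left, right = expr               # otherwise: (operator, operand, operand)
    return OPS[op](evaluate(left), evaluate(right))

print(evaluate(("+", 1, ("*", 2, 3))))   # → 7
```

Everything hard about a real interpreter -- parsing, variables and environments, error handling -- has been abstracted away. That is exactly the point, and exactly the risk.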

In the service of trying not to overwhelm students, though, we run the risk of distorting how they think about the parts we left out. Worse, we sometimes distort even their thinking about the part we're focusing on, because they don't see its connections to the more complete picture. There is an art to identifying abstractions, creating examples, and sequencing instruction. Done well, we can minimize the distortions and help students come to understand the whole with small steps and incremental increases in size and complexity.

At least that's what I think on my good days. There are days and even entire semesters when things don't seem to progress as smoothly as I hope or as smoothly as past experience has led me to expect. On those days, I feel like I'm doing violence to an idea when I create an abstraction or adopt a simplifying assumption. Students don't seem to be grokking the terrain, so we change the map. We try different problems or work through more examples. It's hard to find the balance sometimes between adding enough to help and not adding so much as to overwhelm.

The best teachers I've encountered know how to approach this challenge. More importantly, they seem to enjoy the challenge. I'm guessing that teachers who don't enjoy it must be frustrated a lot. I enjoy it, and even so there are times when this challenge frustrates me.

Posted by Eugene Wallingford | Permalink | Categories: Computing, Teaching and Learning

March 29, 2018 3:05 PM

Heresy in the Battle Between OOP and FP

For years now, I've been listening to many people -- smart, accomplished people -- feverishly proclaim that functional programming is here to right the wrongs of object-oriented programming. For many years before that, I heard many people -- smart, accomplished people -- feverishly proclaim that object-oriented programming was superior to functional programming, an academic toy, for building real software.

Alas, I don't have a home in the battle between OOP and FP. I like and program in both styles. So it's nice whenever I come across something like Alan Kay's recent post on Quora, in response to the question, "Why is functional programming seen as the opposite of OOP rather than an addition to it?" He closes with a paragraph I could take on as my credo:

So: both OOP and functional computation can be completely compatible (and should be!). There is no reason to munge state in objects, and there is no reason to invent "monads" in FP. We just have to realize that "computers are simulators" and figure out what to simulate.

As in many things, Kay encourages us to go beyond today's pop culture of programming to create a computational medium that incorporates big ideas from the beginning of our discipline. While we work on those ideas, I'll continue to write programs in both styles, and to enjoy them both. With any luck, I'll bounce between mindsets long enough that I eventually attain enlightenment, like the venerable master Qc Na. (See the koan at the bottom of that link.)

Oh: Kay really closes his post with

I will be giving a talk on these ideas in July in Amsterdam (at the "CurryOn" conference).

If that's not a reason to go to Amsterdam for a few days, I don't know what is. Some of the other speakers look pretty good, too.

Posted by Eugene Wallingford | Permalink | Categories: Computing, Software Development

March 23, 2018 3:45 PM

A New Way to Design Programming Languages?

Greg Wilson wrote a short blog post recently about why JavaScript isn't suitable for teaching Data Carpentry-style workshops. In closing, he suggests an unusual way to design programming languages:

What I do know is that the world would be a better place if language designers adopted tutorial-driven design: write the lessons that introduce newcomers to the language, then implement the features those tutorials require.

That's a different sort of TDD than I'm used to...

This is the sort of idea that causes me to do a double or triple take. At first, it has an appealing ring to it, when considering how difficult it is to teach most programming languages to novices. Then I think a bit and decide that it sounds crazy because, really, are we going to hamstring our languages by focusing on the struggles of beginners? But then it sits in my mind for a while and I start to wonder if we couldn't grow a decent language this way. It's almost like using the old TDD to implement the new TDD.

The PLT Scheme folks have designed a set of teaching languages that enable beginners to grow into an industrial-strength language. That design project seems to have worked from the outside in, with a target language in mind while designing the teaching languages. Maybe Wilson's idea of starting at the beginning isn't so crazy after all.

Posted by Eugene Wallingford | Permalink | Categories: Software Development, Teaching and Learning

March 22, 2018 4:05 PM

Finally, Some Good News

It's been a tough semester. On top of the usual business, there have been a couple of extra stresses. First, I've been preparing for the departure of a very good friend, who is leaving the university and the area for family and personal reasons. Second, a good friend and department colleague took an unexpected leave that turned into a resignation. Both departures cast a pall over my workdays. This week, though, has offered a few positive notes to offset the sadness.

Everyone seems to complain about email these days, and I certainly have been receiving and sending more than usual this semester, as our students and I adjust to the change in our faculty. But sometimes an email message makes my day better. Exhibit 1, a message from a student dealing with a specific issue:

Thank you for your quick and helpful response!
Things don't look so complicated or hopeless now.

Exhibit 2, a message from a student who has been taming the bureaucracy that arises whenever two university systems collide:

I would like to thank you dearly for your prompt and thorough responses to my numerous emails. Every time I come to you with a question, I feel as though I am receiving the amount of respect and attention that I wish to be given.

Compliments like these make it a lot easier to muster the energy to deal with the next batch of email coming in.

There has also been good news on the student front. I received email from a rep at a company in Madison, Wisconsin, where one of our alumni works. They are looking for developers to work in a functional programming environment and are having a hard time filling the positions locally, despite the presence of a large and excellent university in town. Our alum is doing well enough that the company would like to hire more from our department, which is doing a pretty good job, too.

Finally, today I spoke in person with two students who had great news about their futures. One has accepted an offer to join the Northwestern U. doctoral program and work in the lab of Kenneth Forbus. I studied Forbus's work on qualitative reasoning and analogical reasoning as a part of my own Ph.D. work and learned a lot from him. This is a fantastic opportunity. The other student has accepted an internship to work at PlayStation this summer, working on the team that develops the compilers for its game engines. He told me, "I talked a lot about the project I did in your course last semester during my interview, and I assume that's part of the reason I got an offer." I have to admit, that made me smile.

I had both of these students in my intro class a few years back. They would have succeeded no matter who taught their intro course, or the compiler course, for that matter, so I can't take any credit for their success. But they are outstanding young men, whom I have had the pleasure of getting to know over the last four years. News of the next steps in their careers makes me feel good, too.

I think I have enough energy to make it to the end of the semester now.

Posted by Eugene Wallingford | Permalink | Categories: General, Personal, Teaching and Learning

March 12, 2018 3:43 PM

Technology is a Place Where We Live

Yesterday morning I read The Good Room, a talk Frank Chimero gave last month. Early on in the talk, Chimero says:

Let me start by stating something obvious: in the last decade, technology has transformed from a tool that we use to a place where we live.

This sentence jumped off the page both for the content of the assertion and for the decade time frame with which he bounds it. In the fall of 2003, I taught a capstone course for non-majors that is part of my university's liberal arts core. The course, titled "Environment, Technology, and Society", brings students from all majors on campus together in a course near the end of their studies, to apply their general education and various disciplinary expertises to problems of some currency in the world. As you might guess from the title, the course focuses on problems at the intersection of the natural environment, technology, and people.

My offering of the course put a twist on the usual course content. We focused on the man-made environment we all live in, which even by 2003 had begun to include spaces carved out on the internet and web. The only textbook for the course was Donald Norman's The Design of Everyday Things, which I think every university graduate should have read. The topics for the course, though, had a decided IT flavor: the effect of the Internet on everyday life, e-commerce, spam, intellectual property, software warranties, sociable robots, AI in law and medicine, privacy, and free software. We closed with a discussion of what an educated citizen of the 21st century ought to know about the online world in which they would live in order to prosper as individuals and as a society.

The change in topic didn't excite everyone. A few came to the course looking forward to a comfortable "save the environment" vibe and were resistant to considering technology they didn't understand. But most were taking the course with no intellectual investment at all, as a required general education course they didn't care about and just needed to check off the list. In a strange way, their resignation enabled them to engage with the new ideas and actually ask some interesting questions about their future.

Looking back now after fifteen years, the course design looks pretty good. I should probably offer to teach it again, updated appropriately, of course, and see where young people of 2018 see themselves in the technological world. As Chimero argues in his talk, we need to do a better job building the places we want to live in -- and that we want our children to live in. Privacy, online peer pressure, and bullying all turned out differently than I expected in 2003. Our young people are worse off for those differences, though I think most have learned ways to live online in spite of the bad neighborhoods. Maybe they can help us build better places to live.

Chimero's talk is educational, entertaining, and quotable throughout. I tweeted one quote: "How does a city wish to be? Look to the library. A library is the gift a city gives to itself." There were many other lines I marked for myself, including:

  • Penn Station "resembles what Kafka would write about if he had the chance to see a derelict shopping mall." (I'm a big Kafka fan.)
  • "The wrong roads are being paved in an increasingly automated culture that values ease."

Check the talk out for yourself.

Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Teaching and Learning

March 06, 2018 4:11 PM

A Good Course in Epistemology

Theoretical physicist Marcelo Gleiser, in The More We Know, the More Mystery There Is:

But even if we did [bring the four fundamental forces together in a common framework], and it's a big if right now, this "unified theory" would be limited. For how could we be certain that a more powerful accelerator or dark matter detector wouldn't find evidence of new forces and particles that are not part of the current unification? We can't. So, dreamers of a final theory need to recalibrate their expectations and, perhaps, learn a bit of epistemology. To understand how we know is essential to understand how much we can know.
the table of contents from PHL 440's readings

People are often surprised to hear that, in all my years of school, my favorite course was probably PHL 440 Epistemology, which I took in grad school as a cognate to my CS courses. I certainly enjoyed the CS courses I took as a grad student, and as an undergrad, too, but my study of AI was enhanced significantly by courses in epistemology and cognitive psychology. The prof for PHL 440, Dr. Rich Hall, became a close advisor to my graduate work and a member of my dissertation committee. Dr. Hall introduced me to the work of Stephen Toulmin, whose model of argument influenced my work immensely.

I still have the primary volume of readings that Dr. Hall assigned in the course. Looking back now, I'd forgotten how many of W.V.O. Quine's papers we'd read... but I enjoyed them all. The course challenged most of my assumptions about what it means "to know". As I came to appreciate different views of what knowledge might be and how we come by it, my expectations of human behavior -- and my expectations for what AI could be -- changed. As Gleiser suggests, to understand how we know is essential to understanding what we can know, and how much.

Gleiser's epistemology meshes pretty well with my pragmatic view of science: it is descriptive, within a particular framework and necessarily limited by experience. This view may be why I gravitated to the pragmatists in my epistemology course (Peirce, James, Rorty), or perhaps the pragmatists persuaded me better than the others.

In any case, the Gleiser interview is a delightful and interesting read throughout. His humble view of science may get you thinking about epistemology, too.

... and, yes, that's the person for whom a quine in programming is named. Thanks to Douglas Hofstadter for coining the term and for giving us programming nuts a puzzle to solve in every new language we learn.

Posted by Eugene Wallingford | Permalink | Categories: Computing, General, Patterns, Personal

March 04, 2018 11:07 AM

A Seven-Year Itch

Seven years ago, I went out for my last run. I didn't know at the time that it would be my last run. A month or so later, I noted that I had been sick for a couple of weeks and then sore for a couple of weeks. After another four weeks, I reported that my knee wasn't going to get better in a way that would enable me to run regularly again. That was it.

My knee is better now in most important ways, though. A simple fix wasn't possible, but a more involved surgery was successful. Today, I walk a lot, especially with my wife, ride a bike a lot, again especially with my wife, and otherwise live a normal physical life. The repaired knee is not as mobile or responsive as my other knee but, all things considered, life is pretty good.

Even so, I miss running. A couple of years ago, I wrote that even five years on, I still dreamed about running occasionally. I'll be up early some morning, see a sunrise, and think, "This would make for a great run." Sometimes, when I go out after a snowfall, I'll remember what it was like to be the first person running on fresh snow out on the trails, under ice- or snow-covered branches. I miss that feeling, and so many others. I still enjoy sunrises and new snow, of course, but that enjoyment has long been tangled up with the feel of running: the pumping lungs, the long strides, the steady flow of scenery. Walking and biking have never given me the same feeling.

My orthopedic surgeon was worried that I would be like a lot of former runners and not stay "former", but I've been pretty well-behaved. In seven years I have rarely broken into even the slowest of trots, to cross a street or hurry to class. The doctor explained to me the effects of running on my reconstructed knee, the risk profile associated with contact sports, and what contact would likely mean for the future of the knee. As emotional as I can seem about running, I'm much too rational to throw caution out the door for a brief thrill of running. So I don't run.

Even so, I often think back to the time I was rehabilitating my knee after surgery. Our athletic department has a therapy pool with an underwater treadmill, and my therapist had me use it to test my endurance and knee motion. The buoyancy of the water takes enough pressure off the legs that the impact on the knee doesn't damage the joint. I think I can achieve the same effect in the ocean, so the next time I get to a coast, I may try an underwater run. And I dream of getting rich enough to install one of those therapy pools in my house. I may not be a runner anymore, but I'm adaptable and perfectly willing to enjoy the benefits of technology.

Posted by Eugene Wallingford | Permalink | Categories: Personal, Running

February 26, 2018 3:55 PM

Racket Love

Racket -- "A Programmable Programming Language" -- is the cover story for next month's Communications of the ACM. The new issue is already featured on the magazine's home page, including a short video in which Matthias Felleisen explains the idea of code as more than a machine artifact.

My love of Racket is no surprise to readers of this blog. Still one of my favorite old posts here is The Racket Way, a write-up of my notes from Matthew Flatt's talk of the same name at StrangeLoop 2012. As I said in that post, this was a deceptively impressive talk. I think that's especially fitting, because Racket is a deceptively impressive language.

One last little bit of love from a recent message to the Racket users mailing list... Stewart Mackenzie describes his feelings about the seamless interweaving of Racket and Typed Racket via a #lang directive:

So far my dive into Racket has been positive. It's magical how I can switch from untyped Racket to typed Racket simply by changing #lang. Banging out my thoughts in a beautiful lisp 1, wave a finger, then finger crack to type check. Just sublime.

That's what you get when your programming language is as programmable as your application.

Posted by Eugene Wallingford | Permalink | Categories: Computing, Software Development

February 21, 2018 3:38 PM

Computer Programs Aren't Pure Abstractions. They Live in the World.

Guile Scheme guru Andy Wingo recently wrote a post about langsec, the idea that we can bake system security into our programs by using languages that support proof of correctness. Compilers can then be tools for enforcing security. Wingo is a big fan of the langsec approach but, in light of the Spectre and Meltdown vulnerabilities, is pessimistic that it really matters anymore. If bad actors can exploit the hardware that executes our programs, then proving that the code is secure doesn't do much good.

I've read a few blog posts and tweets that say Wingo is too pessimistic, that efforts to make our languages produce more secure code will still pay off. I think my favorite such remark, though, is a comment on Wingo's post itself, by Thomas Dullien:

I think this is too dark a post, but it shows a useful shock: Computer Science likes to live in proximity to pure mathematics, but it lives between EE and mathematics. And neglecting the EE side is dangerous - which not only Spectre showed, but which should have been obvious at the latest when Rowhammer hit.
There's actual physics happening, and we need to be aware of it.

It's easy for academics, and even programmers who work atop an endless stack of frameworks, to start thinking of programs as pure abstractions. But computer programs, unlike mathematical proofs, come into contact with real, live hardware. It's good to be reminded sometimes that computer science isn't math; it lives somewhere between math and engineering. That is good in so many ways, but it also has its downsides. We should keep that in mind.

Posted by Eugene Wallingford | Permalink | Categories: Computing