April 29, 2019 2:42 PM

The Path to Nothing

Dick Gabriel writes, in Lessons From The Science of Nothing At All:

Nevertheless, the spreadsheet was something never seen before. A chart indicating the 64 greatest events in accounting and business history contains VisiCalc.

This reminds me of a line from The Tao of Pooh:

Take the path to Nothing, and go Nowhere until you reach it.

A lot of research is like this, but even more so in computer science, where the things we produce are generally made out of nothing. Often, like VisiCalc, they aren't really like anything we've ever seen or used before either.


Posted by Eugene Wallingford | Permalink | Categories: Computing, Software Development

April 28, 2019 10:37 AM

The Smart Already Know They Are Lucky

Writes Matthew Butterick:

As someone who had a good run in the tech world, I buy the theory that the main reason successful tech founders start another company is to find out if they were smart or merely lucky the first time. Of course, the smart already know they were also lucky, so further evidence is unnecessary. It's only the lucky who want proof they were smart.

This comes from a previous update to The Billionaire's Typewriter, an essay Butterick recently updated again. I'm not sure this is the main reason that most successful tech founders start another company -- I suspect that many are simply ambitious and driven -- but I do believe that most successful people are lucky many times over, and that the self-aware among them know it.


Posted by Eugene Wallingford | Permalink | Categories: General

April 16, 2019 3:40 PM

The Importance of Giving Credit in Context

From James Propp's Prof. Engel's Marvelously Improbable Machines:

Chip-firing has been rediscovered independently in three different academic communities: mathematics, physics, and computer science. However, its original discovery by Engel is in the field of math education, and I strongly feel that Engel deserves credit for having been the first to slide chips around following these sorts of rules. This isn't just for Engel's sake as an individual; it's also for the sake of the kind of work that Engel did, blending his expertise in mathematics with his experience in the classroom. We often think of mathematical sophistication as something that leads practitioners to create concepts that can only be understood by experts, but at the highest reaches of mathematical research, there's a love of clarity that sees the pinnacle of sophistication as being the achievement of hard-won simplicity in settings where before there was only complexity.

First of all, Petri nets! I encountered Petri nets for the first time in a computer architecture course, probably as a master's student, and they immediately became my favorite part of the course. I was never much into hardware and architecture, but Petri nets showed me a connection back to graph theory, which I loved. Later, I studied how to apply temporal logic to modeling hardware and found another way to appreciate my architecture courses.
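Chip-firing, like a Petri net, is a token game on a graph, which may be why it felt so familiar. Propp describes Engel's machines in detail; here is only a minimal sketch in Python of the plain deterministic chip-firing game with absorbing "sink" nodes, a simple relative of Engel's probabilistic abacus. The three-node graph and the chip counts are invented for illustration.

    def fire_until_stable(graph, chips, sinks, max_steps=10_000):
        """graph: node -> list of neighbors; chips: node -> chip count; sinks never fire."""
        for _ in range(max_steps):
            unstable = [v for v in graph
                        if v not in sinks and chips[v] >= len(graph[v])]
            if not unstable:
                return chips                  # stable: no node can fire
            v = unstable[0]                   # firing order doesn't change the final state
            chips[v] -= len(graph[v])         # the node gives one chip to each neighbor
            for w in graph[v]:
                chips[w] += 1
        raise RuntimeError("did not stabilize")

    # A three-node path with both endpoints absorbing, five chips in the middle:
    graph = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
    print(fire_until_stable(graph, {"a": 0, "b": 5, "c": 0}, sinks={"a", "c"}))
    # -> {'a': 2, 'b': 1, 'c': 2}

A node fires when it holds at least as many chips as it has neighbors, and a classic result is that the final configuration is independent of the order in which you fire unstable nodes.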

But I really love the point that Propp makes in this paragraph and the section it opens. Most people think of research and teaching as different sorts of activities, but the kind of thinking one does in one often crosses over into the other. The sophistication that researchers have and use helps us make sense of complex ideas and, at its best, helps us communicate that understanding to a wide audience, not just to researchers at the same level of sophistication. The focus that teachers put on communicating challenging ideas to relative novices can encourage us to seek new formulations for a complex idea and ways to construct more complex ideas out of the new formulations. Sometimes, that can lead to an insight we can use in research.

In recent years, my research has benefited a couple of times from trying to explain and demonstrate concatenative programming, as in Forth and Joy, to my undergraduate students. These haven't been breakthroughs of the sort that Engel made with his probability machines, but they've certainly helped me grasp in new ways ideas I'd been struggling with.
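For readers who haven't seen concatenative style: in a language like Joy, a program is a sequence of words that each transform a stack, and concatenating two programs composes them. Here is a toy sketch of the idea in Python, not real Forth or Joy; the words push, dup, add, and mul are made up for the demo.

    def push(n):
        return lambda s: s + [n]          # literal: push n onto the stack

    dup = lambda s: s + [s[-1]]           # Joy/Forth's dup
    add = lambda s: s[:-2] + [s[-2] + s[-1]]
    mul = lambda s: s[:-2] + [s[-2] * s[-1]]

    def run(program, stack=None):
        stack = [] if stack is None else stack
        for word in program:              # a program is just a list of words
            stack = word(stack)
        return stack

    square = [dup, mul]                   # a new word, defined by concatenation
    print(run([push(3), push(4)] + square + [add]))   # 3 + 4*4 -> [19]

The definition of square shows the appeal: new words are built by gluing existing ones together, with no variables to name.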

Propp argues convincingly that it's important that we tell stories like Engel's and recognize that his breakthrough came as a result of his work in the classroom. This might encourage more researchers to engage as deeply with their teaching as with their research. Everyone will benefit.

Do you know any examples similar to the one Propp relates, but in the field of computer science? If so, I would love to hear about them. Drop me a line via email or Twitter.

Oh, and if you like Petri nets, probability, or fun stories about teaching, do read Propp's entire piece. It's good fun and quite informative.


Posted by Eugene Wallingford | Permalink | Categories: Computing, Teaching and Learning

April 08, 2019 11:55 AM

TDD is a Means for Delaying Intuition

In one of his "Conversations with Tyler", Tyler Cowen talks with Daniel Kahneman about intuition and its relationship to thinking fast and slow. According to Kahneman, evidence supports his position that most people have not had the experience necessary to develop intuition that is good enough for solving problems directly. So he thinks that most people, including most so-called experts, should slow down.

So I think delaying intuition is a very good idea. Delaying intuition until the facts are in, at hand, and looking at dimensions of the problem separately and independently is a better use of information.
The problem with intuition is that it forms very quickly, so that you need to have special procedures in place to control it except in those rare cases...
...
Break the decision up. It's not so much a matter of time because you don't want people to get paralyzed by analysis. But it's a matter of planning how you're going to make the decision, and making it in stages, and not acting without an intuitive certainty that you are doing the right thing. But just delay it until all the information is available.

This is one of the things that I find most helpful about test-driven design when I practice it faithfully. It's pretty easy for me to think that I know everything I need to implement a program after I've read the spec and thought about it for a few minutes. I mean, I've written a lot of code over the years... If my intuition tells me where to go, I can start right away and have the whole path ahead of me in my mind.

But how often do I really have evidence that my intuitive plan is the correct one? If it isn't, I'll spend a bunch of time and write a bunch of code before finding out. What's worse, all that code I've written usually ends up feeling like a constraint within which I have to work as I try to dig myself out of the mess.

Writing one test at a time and implementing just the code I need to pass it is a way to force my intuitive mind to slow down. It helps me think about the actual problem I'm solving, rather than the abstraction my expert brain infers from the spec. The program grows slowly alongside my understanding and points me in the direction of the next small step to take.
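To make the rhythm concrete, here is a made-up example of a single TDD step using Python's standard unittest module. The tests written so far pin down only part of the problem, and the function does just enough to pass them.

    import unittest

    def leading_digit(n):
        # Just enough code to pass the tests so far; a future test for
        # negative numbers would force the next small step.
        while n >= 10:
            n //= 10
        return n

    class TestLeadingDigit(unittest.TestCase):
        def test_single_digit(self):
            self.assertEqual(leading_digit(7), 7)

        def test_multiple_digits(self):
            self.assertEqual(leading_digit(7042), 7)

    if __name__ == "__main__":
        unittest.main()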

TDD is a procedure I can put in place to help me control my intuition until the facts are in, and it encourages me to look at different dimensions of the problem independently as I write the code to solve them.


Posted by Eugene Wallingford | Permalink | Categories: Patterns, Software Development

April 04, 2019 4:40 PM

A Non-Software Example of Preparatory Refactoring

When I refactor my code, I most often think in terms of what Martin Fowler calls preparatory refactoring: if it's difficult to add a new feature to my program, I refactor the existing code into a state where the feature fits naturally somewhere, then I add the feature. As is often the case, Kent Beck captured this notion memorably in a tweet-sized aphorism. I was delighted to see that Martin's blog post, which I remember fondly, cites the same tweet!
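Beck's aphorism, roughly "for each desired change, make the change easy (warning: this may be hard), then make the easy change", translates directly into code. Here is a small illustrative sketch in Python; the report-printing example is invented, not taken from Fowler's post.

    # Before: the output format is hard-coded, so adding CSV output
    # would mean threading a conditional through the loop.
    def print_report(rows):
        for name, total in rows:
            print(f"{name}: {total}")

    # Step 1, the preparatory refactoring: separate formatting from
    # iteration, behavior unchanged. (This definition replaces the one above.)
    def print_report(rows, format_row=lambda name, total: f"{name}: {total}"):
        for name, total in rows:
            print(format_row(name, total))

    # Step 2, the now-easy change: CSV output is just another formatter.
    def csv_row(name, total):
        return f"{name},{total}"

    print_report([("widgets", 12), ("gadgets", 5)], format_row=csv_row)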

Ideas from programming are usually instances of more general ideas from the broader world, specialized to a world of bits and computation. Back when I taught our object-oriented programming course every semester, I often referred my students to a web site that offered real-life examples of the various design patterns we were learning. I remember an example of Decorator that showed how we can embellish a painting with a matte or a frame, and the site's use of the U.S. Constitution's specification of the President to illustrate the idea of a Singleton. I can't find that site on the web just now, but there's almost surely a local copy buried in one of my course websites from way back.
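The painting example maps directly back onto code. A quick sketch of how those embellishments look as Decorators, with class names invented for the demo:

    class Painting:
        def show(self):
            return "painting"

    class Matted:
        def __init__(self, inner):
            self.inner = inner            # wraps anything with a show() method
        def show(self):
            return f"matted({self.inner.show()})"

    class Framed:
        def __init__(self, inner):
            self.inner = inner
        def show(self):
            return f"framed({self.inner.show()})"

    # Decorators stack in any order, preserving the Painting interface:
    print(Framed(Matted(Painting())).show())   # framed(matted(painting))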

The idea of refactoring is useful outside the world of software, too. Just yesterday, my dean suggested what I consider to be a preparatory refactoring.

A committee on campus is charged with proposing ways that the university can restructure its colleges and departments. With the exception of a merger of two colleges a few years ago, we have had essentially the same academic structure for my entire time here. In those years, disciplines have changed and budgets have changed, so now the administration is thinking that the university might be more efficient or more productive with a different structure. Seen from this perspective, restructuring is largely a reaction to change that has already happened.

My dean suggested another way to approach the task. Think, he said, of the new academic programs that you'd like to create in the future. We may not have money available right now to create a new major or to organize a new research center, but one day we might. What university structure would make adding this program go more smoothly and work better once in place? Which departments would the new program want to work with? Which administrative structures already in place would minimize unnecessary overhead of the new program? As much as possible, he suggested, let's try to create a new academic structure that suits the future programs we'd like to build. That will reduce friction later, which is good: Administrative friction often grinds new academic ideas to a halt before they get off the ground.

In programming terms, this is quite a bit different from the sort of refactoring I prefer. I try to practice YAGNI and refactor only for the specific feature that I want to add right now, not taking into account imagined future features I may never need. In terms of academic structure, though, this sort of up-front design makes a lot of sense. Academic structures are much harder to change than software; getting a few things right now may make future changes at the program level much easier to implement.

Thinking about academic restructuring this way has another positive side effect: it might entice faculty to be more engaged, because what we do now matters to the future we would like to build. It's not merely a reaction to past changes.

My dean is suggesting that we build academic structures now that will make the changes we want to make later, when the resources and the requisite will exist, easier to implement. Building those structures now may take more work than simply responding to past changes, but it will be worth the effort when we are ready to create new programs. I think Kent and Martin might be proud.


Posted by Eugene Wallingford | Permalink | Categories: Managing and Leading, Patterns, Software Development