June 21, 2019 2:35 PM

Computing Everywhere, Sea Mammal Edition

In The Narluga Is a Strange Beluga-Narwhal Hybrid, Ed Yong tells the story of a narluga, the offspring of a beluga father and a narwhal mother:

Most of its DNA was a half-and-half mix between the two species, but its mitochondrial DNA -- a secondary set that animals inherit only from their mothers -- was entirely narwhal.

This strange hybrid had a mouth and teeth unlike either of its parents, the product of an unexpected DNA computation:

It's as if someone took the program for creating a narwhal tusk and ran it in a beluga's mouth.

The analogy to software doesn't end there, though...

There's something faintly magical about that. This fluky merger between two species ended up with a mouth that doesn't normally exist in nature but still found a way of using it. It lived neither like a beluga nor a narwhal, but it lived nonetheless.

Fluky and abnormal; a one-off, yet it adapts and survives. That sounds like a lot of the software I've used over the years and, if I'm honest, like some of the software I've written, too.

That said, nature is amazing.

Posted by Eugene Wallingford | Permalink | Categories: Computing

June 20, 2019 3:51 PM

Implementing a "Read Lines" Operator in Joy

I wasn't getting any work done today on my to-do list, so I decided to write some code.

One of my learning exercises to open the Summer of Joy is to solve the term frequency problem from Crista Lopes's Exercises in Programming Style. Joy is a little like Scheme: it has a lot of cool operations, especially higher-order operators, but it doesn't have much in the way of practical tools for basic tasks like I/O. To compute term frequencies on an arbitrary file, I need to read the file onto Joy's stack.

I played around with Joy's low-level I/O operators for a while and built a new operator called readfile, which expects the pathname for an input file on top of the stack:

    DEFINE readfile ==
        (* 1 *)  [] swap "r" fopen
        (* 2 *)  [feof not] [fgets swap swonsd] while
        (* 3 *)  fclose.

The first line leaves an empty list and an input stream object on the stack. Line 2 reads lines from the file and conses them onto the list until it reaches EOF, leaving a list of lines under the input stream object on the stack. The last line closes the stream and pops it from the stack.

This may not seem like a big deal, but I was beaming when I got it working. First of all, this is my first while in Joy, which requires two quoted programs. Second, and more meaningful to me, the loop body not only works in terms of the dip idiom I mentioned in my previous post, it even uses the higher-order swonsd operator to implement the idiom. I must have felt like this the first time I mapped an anonymous lambda over a list in Scheme.

readfile leaves a list of lines on the stack. Unfortunately, the list is in reverse order: the last line of the file is the front of the list. Besides, given that Joy is a stack-based language, I think I'd like to have the lines on the stack itself. So I noodled around some more and implemented the operator pushlist:

    DEFINE pushlist ==
        (* 1 *)  [ null not ] [ uncons ] while
        (* 2 *)  pop.

Look at me... I get one loop working, so I write another. The loop on Line 1 iterates over a list, repeatedly taking (head . tail) and pushing head and tail onto the stack in that order. Line 2 pops the empty list after the loop terminates. The result is a stack with the lines from the file in order, first line on top:

    line-n ... line-3 line-2 line-1
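In Python terms (again a sketch of the idea for explanation, not the Joy itself), pushlist walks the list by repeatedly unconsing it and pushing each head:

```python
def pushlist(lst, stack):
    """Rough analogue of the Joy pushlist operator: repeatedly uncons,
    pushing each head onto the stack; the empty list is dropped at the
    end (Joy's pop)."""
    while lst:                          # [ null not ] ... while
        head, lst = lst[0], lst[1:]     # uncons
        stack.append(head)              # push the head
    return stack

# Given the reversed list that readfile leaves behind, the file's first
# line ends up on top of the stack (the end of the Python list here).
```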

Put readfile and pushlist together:

    DEFINE fileToStack == readfile pushlist.

and you get fileToStack, something like Python's readlines() function, but in the spirit of Joy: the file's lines are on the stack, ready to be processed.

I'll admit that I'm pleased with myself, but I suspect that this code can be improved. Joy has a lot of dandy higher-order operators. There is probably a better way to implement pushlist and maybe even readfile. I won't be surprised if there is a more idiomatic way to implement them, one that lets the two operations plug together with less rework. And I may find that I don't want to leave bare lines of text on the stack after all and would prefer a list of lines. Learning whether I can improve the code, and how, is a task for another day.

My next job for solving the term frequency problem is to split the lines into individual words, canonicalize them, and filter out stop words. Right now, all I know is that I have two more functions in my toolbox, I learned a little Joy, and writing some code made my day better.

Posted by Eugene Wallingford | Permalink | Categories: Computing, Software Development, Teaching and Learning

June 18, 2019 3:09 PM

Notations, Representations, and Names

In The Power of Simple Representations, Keith Devlin takes on a quote attributed to the mathematician Gauss: "What we need are notions, not notations."

While most mathematicians would agree that Gauss was correct in pointing out that concepts, not symbol manipulation, are at the heart of mathematics, his words do have to be properly interpreted. While a notation does not matter, a representation can make a huge difference.

Spot on. Devlin's opening made me think of that short video of Richard Feynman that everyone always shares, on the difference between knowing the name of something and knowing something. I've seen people misinterpret Feynman's words in both directions. The people who share this video sometimes seem to imply that names don't matter. Others dismiss the idea as nonsense: how can you not know the names of things and claim to know anything?

Devlin's distinction makes clear the sense in which Feynman is right. Names are like notations. The specific names we use don't really matter and could be changed, if we all agreed. But the "if we all agreed" part is crucial. Names do matter as a part of a larger model, a representation of the world that relates different ideas. Names are an index into the model. We need to know them so that we can speak with others, read their literature, and learn from them.

This brings to mind an article with a specific example of the importance of using the correct name: Through the Looking Glass, or ... This is the Red Pill, by Ben Hunt at Epsilon Theory:

I'm a big believer in calling things by their proper names. Why? Because if you make the mistake of conflating instability with volatility, and then you try to hedge your portfolio today with volatility "protection" ...., you are throwing your money away.

Calling a problem by the wrong name might lead you to the wrong remedy.

Feynman isn't telling us that names don't matter. He's telling us that knowing only names isn't valuable. Names are not useful outside the web of knowledge in which they mean something. As long as we interpret his words properly, they teach us something useful.

Posted by Eugene Wallingford | Permalink | Categories: General, Patterns

June 11, 2019 3:04 PM

Summer of Joy

"Elementary" ideas are really hard & need to be revisited
& explored & re-revisited at all levels of mathematical
sophistication. Doing so actually moves math forward.

-- James Tanton

Three summers ago, I spent a couple of weeks re-familiarizing myself with the concatenative programming language Joy and trying to go a little deeper with the style. I even wrote a few blog entries, including a few quick lessons I learned in my first week with the language. Several of those lessons hold up, but please don't look at the code linked there; it is the raw code of a beginner who doesn't yet get the idioms of the style or the language. Then other duties at work and home pulled me away, and I never made the time to get back to my studies.

[image: my Summer of Joy folder]

This summer, I am returning to the language, and I have dubbed it the Summer of Joy. I can't devote the entire summer to concatenative programming, but I'm making a conscious effort to spend a couple of days each week in real study and practice. After only one week, I have created enough forward momentum that I think about problems and solutions at random times of the day, such as while walking home or making dinner. I think that's a good sign.

An even better sign is that I'm starting to grok some of the idioms of the style. Joy is different from other concatenative languages like Forth and Factor, but it shares the mindset of using stack operators effectively to shape the data a program uses. I'm finally starting to think in terms of dip, an operator that enables a program to manipulate data just below the top of the stack. As a result, a lot of my code is getting smaller and beginning to look like idiomatic Joy. When I really master dip and begin to think in terms of other "dipping" operators, I'll know I'm really on my way.
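For the curious, here is how I picture dip working on an explicit stack, sketched in Python. This is my analogy for explanation, not Joy's actual implementation:

```python
def dip(stack, program):
    """X [P] dip: set aside the top item X, run the quoted program P
    on the rest of the stack, then push X back on top."""
    x = stack.pop()     # remove the top item
    program(stack)      # run P on what remains
    stack.append(x)     # restore the saved item on top

# Example: add 10 to the item just below the top of the stack.
s = [3, 99]
dip(s, lambda st: st.append(st.pop() + 10))
# s is now [13, 99]: the 3 became 13, and the 99 is back on top
```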

One of my goals for the summer is to write a Joy compiler from scratch that I can use as a demonstration in my fall compiler course. Right now, though, I'm still in Joy user mode and am getting the itch for a different sort of language tool... As my Joy skills get better, I find myself refactoring short programs I've written in the past. How can I be sure that I'm not breaking the code? I need unit tests!

So my first bit of tool building is to implement a simple JoyUnit. As a tentative step in this direction, I created the simplest possible version of RackUnit's check-equal? function:

    DEFINE check-equal == [i] dip i =.

This operator takes two quoted programs (a test expression and an expected result), executes them, and compares the results. For example, this test exercises a square function:

    [ 2 square ] [ 4 ] check-equal.
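In rough Python terms (an analogue I'm using for explanation, not Joy's semantics), the quoted programs act like zero-argument thunks that check-equal executes and compares:

```python
def check_equal(test, expected):
    """Rough analogue of the Joy check-equal operator: execute both
    quoted programs (here, thunks) and compare their results."""
    return test() == expected()

def square(n):
    return n * n

# Mirrors the Joy test: [ 2 square ] [ 4 ] check-equal.
passed = check_equal(lambda: square(2), lambda: 4)
```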

This is, of course, only the beginning. Next I'll add a message to display when a test fails, so that I can tell at a glance which tests have failed. Eventually I'll want my JoyUnit to support tests as objects that can be organized into suites, so that their results can be tallied, inspected, and reported on. But for now, YAGNI. With even a few simple functions like this one, I am able to run tests and keep my code clean. That's a good feeling.

To top it all off, implementing JoyUnit will force me to practice writing Joy and push me to extend my understanding while growing the set of programming tools I have at my disposal. That's another good feeling, and one that might help me keep my momentum as a busy summer moves on.

Posted by Eugene Wallingford | Permalink | Categories: Computing, Software Development, Teaching and Learning

June 10, 2019 2:04 PM

Teach, That's My Advice

In Tyler Cowen's conversation with poet Elisa New, he closes with one of his standard questions:

COWEN: Last question. You meet an 18-year-old, and this person wants in some way to be a future version of you, Elisa New, and asks you for advice. What advice do you give them?
NEW: Teach.
COWEN: Teach.
NEW: Yes, teach the young, and yes, that's the advice. Because what teaching is, is learning to converse with others. It's to experience a topic as it grows richer and richer under the attentions of a community. That's what a classroom that really works is. It's a community that's ever rewarding.

New's justification for teaching has two parts. The first struck me as central to the task of becoming a poet, or a writer of any sort: learning to converse -- to express and exchange ideas -- with others. To converse is to use words and to experience their effects, both as speaker and listener. Over my years in the classroom, I've come to appreciate this benefit of teaching. It's made me a better teacher and, if not a better writer, at least a writer more aware of the different ways in which I can express my ideas.

New's second justification captures well the central value of teaching to an academic. To teach is to experience a topic as it grows richer under the attention of a community. What a wonderful phrase!

Some people think that teaching will steal time from their work as a literary scholar, historian, or scientist. But teaching helps us see deeper into our discipline by urging us to examine it over and over from new vantage points. Every new semester and every new student creates a new conversation for me, and these conversations remind me that there is even more to a topic than I think. Just when I believe I've mastered something, working with students helps me see something new, in a way different from what I might find through my own study.

This exposes one of the advantages of working in a graduate program or in an ongoing research lab: building a community that has some continuity over time. Teaching at an undergraduate institution means that not as many of my students will be able to work with me and one another on the same topic over time. Even so, follow-up courses and undergrad research projects do allow us to create overlapping communities with a lifespan longer than a single semester. It simply requires a different mindset than working in a big research lab.

So I heartily echo Professor New: teach, that's my advice.

Posted by Eugene Wallingford | Permalink | Categories: Teaching and Learning

May 22, 2019 1:45 PM

The Futility of Software

A thought from <antirez> in an essay on the struggles of an open source maintainer (paraphrased a bit):

Sometimes I believe that writing software, while great, will never be huge like writing a book that will survive for centuries. Not because software is not as great per se, but because as a side effect it is also useful... and will be replaced when something more useful is around.

We write most software with a particular use in mind, so it is really only fair to compare it to non-fiction books, which also have a relatively short shelf life. To be fair, though, not many fiction books survive for centuries, either. Language and fashion doom them almost as surely as evolving technology dooms most software to fade away within a generation, and a short generation at that.

Still, I won't be surprised if the DNA of Smalltalk-80 or some early Lisp implementation lives on deep in a system that developers use in the 22nd century.

Posted by Eugene Wallingford | Permalink | Categories: Software Development

May 19, 2019 10:48 AM

Me and Not Me

At one point in the novel "Outline", by Rachel Cusk, a middle-aged man relates a conversation that he had with his elderly mother, in which she says:

I could weep just to think that I'll never see you again as you were at the age of six -- I would give anything, she said, to meet that six-year-old one more time.

This made me think of two photographs I keep on the wall at my office, of my grown daughters when they were young. In one, my older daughter is four; in the other, my younger daughter is two. Every once in a while, my wife asks why I don't replace them with something newer. My answer is always the same: They are my two favorite pictures in the world. When my daughters were young, they seemed to be infinite bundles of wonder: always curious, discovering things and ideas everywhere they went, making connections. They were restless in a good way, joyful, and happy. We can be all of these things as we grow into adulthood, but I experienced them so much differently as a father, watching my girls live them.

I love the people my daughters are now, and are becoming, and cherish my relationship with them. Yet, like the old woman in Cusk's story, there is a small part of me that would love to meet those little girls again. When I see one of my daughters these days, she is both that little girl, grown up, and not that little girl, a new person shaped by her world and by her own choices. The photographs on my wall keep alive memories not just of a time but also of specific people.

As I thought about Cusk's story, it occurred to me that the idea of "her and not her" does not apply only to my daughters, or to my wife, old pictures of whom I enjoy with similar intensity. I am me and not me.

I'm both the little guy who loved to read encyclopedias and shoot baskets every day, and not him. I'm not the same guy who walked into high school in a new city excited about the possibilities it offered and nervous about how I would fit in, yet I grew out of him. I am at once the person who started college as an architecture major -- who from the time he was eight years old had wanted to be an architect -- and not him. I'm not the same person who defended a Ph.D. dissertation half a life ago, but who I am owes a lot to him. I am both the man my wife married and not, being now the man that man has become.

And, yes, the father of those little girls pictured on my wall: me and not me. This is true in how they saw me then and how they see me now.

I'm not sure how thinking about this distinction will affect future me. I hope that it will help me to appreciate everyone in my life, especially my daughters and my wife, a bit more for who they are and who they have been. Maybe it will even help me be more generous to 2019 me.

Posted by Eugene Wallingford | Permalink | Categories: Patterns, Personal

May 07, 2019 11:15 AM

A PL Design Challenge from Alan Kay

In an answer on Quora from earlier this year:

There are several modern APL-like languages today -- such as J and K -- but I would criticize them as being too much like the classic APL. It is possible to extract what is really great from APL and use it in new language designs without being so tied to the past. This would be a great project for some grad students of today: what does the APL perspective mean today, and what kind of great programming language could be inspired by it?

The APL perspective seemed even more radical twenty years ago, before MapReduce became a thing and before functional programming ascended. When I was an undergrad, though, it seemed otherworldly: setting up a structure, passing it through a sequence of operators that changed its shape, and then passing it through a sequence of operators that folded up a result. We knew we weren't programming in Fortran anymore.

I'm still fascinated by APL, but I haven't done a lot with it in the intervening years. These days I'm still thinking about concatenative programming in languages like Forth, Factor, and Joy, a project I reinitiated (and last blogged about) three summers ago. Most concatenative languages work with an implicit stack, which gives them a very different feel from APL's dataflow style. I can imagine, though, that working in the concision and abstraction of concatenative languages for a while will spark my interest in diving back into APL-style programming some day.

Kay's full answer is worth a read if only for the story in which he connects Iverson's APL notation, and its effect on how we understand computer systems, to the evolution of Maxwell's equations. Over the years, I've heard Kay talk about McCarthy's Lisp interpreter as akin to Maxwell's equations, too. In some ways, the analogy works even better with APL, though it seems that the lessons of Lisp have had a larger historical effect to date.

Perhaps that will change? Alas, as Kay says in the paragraph that precedes his challenge:

As always, time has moved on. Programming language ideas move much slower, and programmers move almost not at all.

Kay often comes off as pessimistic, but after all the computing history he has lived through (and created!), he has earned whatever pessimism he feels. As usual, reading one of his essays makes me want to buckle down and do something that would make him proud.

Posted by Eugene Wallingford | Permalink | Categories: Computing, Software Development

April 29, 2019 2:42 PM

The Path to Nothing

Dick Gabriel writes, in Lessons From The Science of Nothing At All:

Nevertheless, the spreadsheet was something never seen before. A chart indicating the 64 greatest events in accounting and business history contains VisiCalc.

This reminds me of a line from The Tao of Pooh:

Take the path to Nothing, and go Nowhere until you reach it.

A lot of research is like this, but even more so in computer science, where the things we produce are generally made out of nothing. Often, like VisiCalc, they aren't really like anything we've ever seen or used before either.

Posted by Eugene Wallingford | Permalink | Categories: Computing, Software Development

April 28, 2019 10:37 AM

The Smart Already Know They Are Lucky

Writes Matthew Butterick:

As someone who had a good run in the tech world, I buy the theory that the main reason successful tech founders start another company is to find out if they were smart or merely lucky the first time. Of course, the smart already know they were also lucky, so further evidence is unnecessary. It's only the lucky who want proof they were smart.

From a previous update to The Billionaire's Typewriter, recently updated again. I'm not sure this is the main reason that most successful tech founders start another company -- I suspect that many are simply ambitious and driven -- but I do believe that most successful people are lucky many times over, and that the self-aware among them know it.

Posted by Eugene Wallingford | Permalink | Categories: General