January 28, 2015 3:38 PM
The Relationship Between Coding and Literacy
Many people have been discussing Chris Granger's recent essay Coding is not the New Literacy, and most seem to approve of his argument. Reading it brought to my mind this sentence from Alan Kay in VPRI Memo M-2007-007a, The Real Computer Revolution Hasn't Happened Yet:
Literacy is not just being able to read and write, but being able to deal fluently with the kind of ideas that are important enough to write about and discuss.
Literacy requires both the low-level skills of reading and writing and the higher-order capacity for using them on important ideas.
That is one thing that makes me uneasy about Granger's argument. It is true that teaching people only low-level coding skills won't empower them if they don't know how to use those skills fluently to build models that matter. But neither will teaching them how to build models without giving them access to the programming skills they need to express their ideas beyond what some tool gives them.
We sometimes do a better job introducing programming to kids, because we use tools that allow students to build models they care about and can understand. In the VPRI memo, Kay describes experiences teaching elementary school students to use eToys to model physical phenomena. In the end, they learn physics and the key ideas underlying calculus. But they also learn the fundamentals of programming, in an environment that opens up into Squeak, a flavor of Smalltalk.
I've seen teachers introduce students to Scratch in a similar way. Scratch is a drag-and-drop programming environment, but it really is an open-ended and lightweight modeling tool. Students can learn low-level coding skills and higher-level thinking skills in tandem.
That is the key to making Granger's idea work in the best way possible. We need to teach people how to think about and build models in a way that naturally evolves into programming. I am reminded of another quote from Alan Kay that I heard back in the 1990s. He reminded us that kindergarteners learn and use the same language that Shakespeare used. It is possible for their fluency in the language to grow to the point where they can comprehend some of the greatest literature ever created -- and, if they possess some of Shakespeare's genius, to write their own great literature. English starts small for children, and as they grow, it grows with them. We should aspire to do the same thing for programming.
Granger reminds us that literacy is really about composition and comprehension. But it doesn't do much good to teach people how to solidify their thoughts so that they can be written if they don't know how to write. You can't teach composition until your students know basic reading and writing.
Maybe we can find a way to teach people how to think in terms of models and how to implement models in programs at the same time, in a language system that grows along with their understanding. Granger's latest project, Eve, may be a step in that direction. There are plenty of steps left for us to take in the direction of languages like Scratch, too.
January 23, 2015 3:35 PM
Agile Design and Jazz
Anyone interested in thinking about how programmers can design software without mapping its structure out in advance should read Ted Gioia's Jazz: The Aesthetics of Imperfection, which appeared in the Winter 1987 issue of The Hudson Review (Volume 39, Number 4, pages 585-600). It explores in some depth the ways in which jazz, which relies heavily on spur-of-the-moment improvisation and thus embraces imperfection, can still produce musical structure worthy of the term "art".
Gioia's contrast of producing musical form via the blueprint method and the retrospective method will resonate with anyone who has grown a large software system via small additions to, and refactorings of, an evolving code base. This paragraph brings to mind the idea of selecting a first test to pass at random:
Some may feel that the blueprint method is the only method by which an artist can adhere to form. But I believe this judgment to be quite wrong. We can imagine the artist beginning his work with an almost random maneuver, and then adapting his later moves to this initial gambit. For example, the musical improviser may begin his solo with a descending five-note phrase and then see, as he proceeds, that he can use this same five-note phrase in other contexts in the course of his improvisation.
Software is different from jazz performance in at least one way that makes the notion of retrospective form even more compelling. In jazz, the most valuable currency is live performance, and each performance is a new creation. We see something similar in the code kata, where each implementation starts from scratch. But unlike jazz, software can evolve over time. When we nurture the same code base over time, we can both incorporate new features and eliminate imperfections from previous iterations. In this way, software developers can create retrospective designs that both benefit from improvisation and reach stable architectures.
Another interesting connection crossed my mind as I read about the role the recording technology played in the development of jazz. With the invention of the phonograph:
... for the first time sounds could be recorded with the same precision that books achieved in recording words. Few realize how important the existence of the phonograph was to the development of improvised music. Hitherto, the only method of preserving musical ideas was through notation, and here the cumbersome task of writing down parts made any significant preservation of improvisations unfeasible. But with the development of the phonograph, improvised music could take root and develop; improvising musicians who lived thousands of miles apart could keep track of each other's development, and even influence each other without ever having met.
Software has long been a written form, but in the last two decades we have seen an explosion of ways in which programs could be recorded and shared with others. The internet and the web enabled tools such as SourceForge and GitHub, which in turn enabled the growth of communities dedicated to the creation and nurturing of open source software. The software so nurtured has often been the product of many people, created through thousands of improvisations by programmers living in all corners of the world. New programmers come to these repositories, the record stores of our world, and are able to learn from masters by studying their moves and their creations. They are then able to make their own contributions to existing projects, and to create new projects of their own.
As Gioia says of jazz, this is not to make the absurd claim that agile software did not exist before it was recorded and shared in this way, but the web and the public repository have had profound impacts on the way software is created. The retrospective form espoused by agile software design methods, the jazz of our industry, has been one valuable result.
Check out Gioia's article. It repaid my investment with worthy connections. If nothing else, it taught me a lot about jazz and music criticism.
January 19, 2015 2:14 PM
Beginners, Experts, and Possibilities
Last Thursday, John Cook tweeted:
Contrary to the Zen proverb, there may be more possibilities in the expert's mind than in the beginner's.
Which group do you think has a larger view of what a programming language can be? The more knowledgeable, to be sure. This is especially true when their experience includes languages from different styles: procedural, object-oriented, functional, and so on.
Previous knowledge affects expectations. Students coming directly out of their first year courses are more likely to imagine that all languages are similar to what they already know. Nothing in their experience contradicts that idea.
Does this mean that the Zen notion of beginner's mind is wrongheaded? Not at all. I think an important distinction can be made between analysis and synthesis. In a context where we analyze languages, broad experience is more valuable than lack of experience, because we are able to bring to our seeing a wider range of possibilities. That's certainly my experience working with students over the years.
However, in a context where we create languages, broad experience can be an impediment. When we have seen many different languages, it can be difficult to create something that looks much different from the languages we've already seen. Something in our minds seems to pull us toward an existing language that already solves the constraint we are struggling with. Someone else has already solved this problem; their solution is probably best.
This is also my experience working with students over the years. My freshmen will almost always come up with a fresher language design than my seniors. The freshmen don't know much about languages yet, and so their minds are relatively unconstrained. (Fantastically so, sometimes.) The seniors often seem to end up with something that is superficially new but, at its core, thoroughly predictable.
The value of "Zen mind, beginner's mind" also follows a bit from the distinction between expertise and experience. Experts typically reach a level where they solve problems using heuristics. These patterns and shortcuts are efficient, but they also tend to be "compiled" and not all that open to critical examination. We create best when we are able to modify, rearrange, and discard, and that's harder to do when our default mode of thinking is in pre-compiled units.
It should not bother us that useful adages and proverbs contradict one another. The world is complex. As Bokononists say, Busy, busy, busy.
January 18, 2015 10:26 AM
The Infinite Horizon
In Mathematics, Live: A Conversation with Laura DeMarco and Amie Wilkinson, Amie Wilkinson recounts the pivotal moment when she knew she wanted to be a mathematician. Insecure about her abilities in mathematics, unsure about what she wanted to do for a career, and with no encouragement, she hadn't applied to grad school. So:
I came back home to Chicago, and I got a job as an actuary. I enjoyed my work, but I started to feel like there was a hole in my existence. There was something missing. I realized that suddenly my universe had become finite. Anything I had to learn for this job, I could learn eventually. I could easily see the limits of this job, and I realized that with math there were so many things I could imagine that I would never know. That's why I wanted to go back and do math. I love that feeling of this infinite horizon.
After having written software for an insurance company during the summers before and after my senior year in college, I knew all too well the "hole in my existence" that Wilkinson talks about, the shrinking universe of many industry jobs. I was deeply interested in the ideas I had found in Gödel, Escher, Bach, and in the idea of creating an intelligent machine. There seemed no room for those ideas in the corporate world I saw.
I'm not sure when the thought of graduate school first occurred to me, though. My family was blue collar, and I didn't have much exposure to academia until I got to Ball State University. Most of my friends went out to get jobs, just like Wilkinson. I recall applying for a few jobs myself, but I never took the job search all that seriously.
At least some of the credit belongs to one of my CS professors, Dr. William Brown. Dr. Brown was an old IBM guy who seemed to know so much about how to make computers do things, from the lowest-level details of IBM System/360 assembly language and JCL up to the software engineering principles needed to write systems software. When I asked him about graduate school, he talked to me about how to select a school and a Ph.D. advisor. He also talked about the strengths and weaknesses of my preparation, and let me know that even though I had some work to do, I would be able to succeed.
These days, I am lucky even to have such conversations with my students.
For Wilkinson, DeMarco and me, academia was a natural next step in our pursuit of the infinite horizon. But I now know that we are fortunate to work in disciplines where a lot of the interesting questions are being asked and answered by people working in "the industry". I watch with admiration as many of my colleagues do amazing things while working for companies large and small. Computer science offers so many opportunities to explore the unknown.
Reading Wilkinson's recollection brought a flood of memories to mind. I'm sure I wasn't alone in smiling at her nod to finite worlds and infinite horizons. We have a lot to be thankful for.
January 16, 2015 2:59 PM
Programming Language As Artistic Medium
Says Ramsey Nasser:
I have always been fascinated by esolangs. They are such an amazing intersection of technical and formal rigor on one hand and nerdy inside humor on the other. The fact that they are not just ideas, but *actual working languages* is incredible. It's something that could only exist in a field as malleable and accessible as code. NASA engineers cannot build a space station as a joke.
Because we can create programming languages as a joke, or for any other reason, a programming language can be both message and medium.
Esolang is enthusiast shorthand for esoteric programming language. I'm not an enthusiast on par with many, but I've written a few Ook! interpreters and played around with others. Piet is the most visually appealing of the esoteric languages I've encountered. The image to the right is a "Hello, World" program written in Piet, courtesy of the Wikimedia Commons.
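For the curious, the whole of Ook! fits in a small table: it is Brainfuck with each of the eight commands spelled as a pair of "Ook" tokens. Here is a quick sketch of an interpreter in Python -- my own illustration, not one of the interpreters mentioned above -- that just translates the token pairs and runs the result:

```python
# The standard Ook!-to-Brainfuck command table.
OOK_TO_BF = {
    ("Ook.", "Ook?"): ">",  # move pointer right
    ("Ook?", "Ook."): "<",  # move pointer left
    ("Ook.", "Ook."): "+",  # increment current cell
    ("Ook!", "Ook!"): "-",  # decrement current cell
    ("Ook!", "Ook."): ".",  # output current cell as a character
    ("Ook.", "Ook!"): ",",  # read a character into the current cell
    ("Ook!", "Ook?"): "[",  # jump past matching ] if cell is zero
    ("Ook?", "Ook!"): "]",  # jump back to matching [ if cell is nonzero
}

def ook_to_brainfuck(source):
    tokens = source.split()
    return "".join(OOK_TO_BF[pair] for pair in zip(tokens[::2], tokens[1::2]))

def run_brainfuck(code, tape_size=30000):
    tape, ptr, pc, out = [0] * tape_size, 0, 0, []
    # Precompute matching bracket positions for the two jump commands.
    stack, jumps = [], {}
    for i, c in enumerate(code):
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    while pc < len(code):
        c = code[pc]
        if c == ">": ptr += 1
        elif c == "<": ptr -= 1
        elif c == "+": tape[ptr] = (tape[ptr] + 1) % 256
        elif c == "-": tape[ptr] = (tape[ptr] - 1) % 256
        elif c == ".": out.append(chr(tape[ptr]))
        elif c == "[" and tape[ptr] == 0: pc = jumps[pc]
        elif c == "]" and tape[ptr] != 0: pc = jumps[pc]
        pc += 1
    return "".join(out)
```

Calling run_brainfuck(ook_to_brainfuck(program)) executes an Ook! program; the joke, of course, is that something this small really is a complete working language.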
Recently I have been reading more about the work of Nasser, a computer scientist and artist formerly at the Eyebeam Art + Technology Center. In 2010, he created the Zajal programming language as his MFA thesis project at the Parsons School of Design. Zajal was inspired by Processing and runs on top of Ruby. A couple of years ago, he received widespread coverage for Qalb, a language with Arabic script characters and a Scheme-like syntax. Zajal enables programmers to write programs with beautiful output; Qalb enables programmers to write programs that are themselves quite beautiful.
I wouldn't call Zajal or Qalb esoteric programming languages. They are, in an important way, quite serious, exploring the boundary between "creative vision" and software. As he says at the close of the interview quoted above, we now live in a world in which "code runs constantly in our pockets":
Code is a driving element of culture and politics, which means that code that is difficult to reason about or inaccessible makes for a culture and politics that are difficult to reason about and inaccessible. The conversation about programming languages has never been more human than it is now, and I believe this kind of work will only become more so as software spreads.
As someone who teaches computer science students to think more deeply about programming languages, I would love to see more and different kinds of people entering the conversation.
January 12, 2015 10:26 AM
WTF Problems and Answers for Questions Unasked
Dan Meyer quotes Scott Farrand in WTF Math Problems:
Anything that makes students ask the question that you plan to answer in the lesson is good, because answering questions that haven't been asked is inherently uninteresting.
My challenge this semester: getting students to ask questions about the programming languages they use and how they work. I myself have many questions about languages! My experience teaching our intro course last semester reminded me that what interests me (and textbook authors) doesn't always interest my students.
If you have any WTF? problems for a programming languages course, please share.
January 09, 2015 3:40 PM
Computer Science Everywhere, Military Edition
Military Operations Orders are programs that are executed by units. Code re-use and other software engineering principles applied regularly to these.
An alumnus of my department, a CS major-turned-military officer, wrote those lines in an e-mail responding to my recent post, A Little CS Would Help a Lot of College Grads. Contrary to what many people might imagine, he has found what he learned in computer science to be quite useful to him as an Army captain. And he wasn't even a programmer:
One of the biggest skills I had over my peers was organizing information. I wasn't writing code, but I was handling lots of data and designing systems for that data. Organizing information in a way that was easy to present to my superiors was a breeze and having all the supporting data easily accessible came naturally to me.
Skills and principles from software engineering and project development apply to systems other than software. They also provide a vocabulary for talking about ideas that non-programmers encounter every day:
I did introduce my units to the terms border cases, special cases, and layers of abstraction. I cracked a smile every time I heard those terms used in a meeting.
Excel may not be a "real programming language", but knowing the ways in which it is a language can make managers of people and resources more effective at what they do.
For more about how a CS background has been useful to this officer, check out CS Degree to Army Officer, a blog entry that expands on his experiences.
January 01, 2015 11:29 AM
Being Wrong in 2015
Yesterday, I read three passages about being wrong. First, this from a blog entry about Charles Darwin's "fantastically wrong" idea for how natural selection works:
Being wildly wrong is perfectly healthy in science, because when someone comes along to prove that you're wrong, that's progress. Somewhat embarrassing progress for the person being corrected, sure, but progress nonetheless.
Then, P.G. Wodehouse shared in his Paris Review interview that it's not all Wooster and Jeeves:
... the trouble is when you start writing, you write awful stuff.
And finally, from a touching reflection on his novelist father, this delicious sentence by Colum McCann:
He didn't see this as a failure so much as an adventure in limitations.
My basic orientation as a person is one of small steps, small progress, trying to be a little less wrong than yesterday. However, such a mindset can lead to a conservatism that inhibits changes in direction. One goal I have for 2015 is to take bigger risks intellectually, to stretch my thinking more than I have lately. I'll trust Wodehouse that when I start, I may well be awful. I'll recall Darwin's example that it's okay to be wildly wrong, because then someone will prove me wrong (maybe even me), and that will be progress. And if, like McCann's father, I can treat being wrong as merely an adventure in my limitations, perhaps fear and conservatism won't hold me back from new questions worth asking.
December 31, 2014 10:15 AM
Reinventing Education by Reinventing Explanation
One of the more important essays I read in 2014 was Michael Nielsen's Reinventing Explanation. In it, Nielsen explores how we might design media that help us explain scientific ideas better than we are able with our existing tools.
... it's worth taking non-traditional media seriously not just as a vehicle for popularization or education, which is how they are often viewed, but as an opportunity for explanations which can be, in important ways, deeper.
This essay struck me deeply. Nielsen wants us to consider how we might take what we have learned using non-traditional media to popularize and educate and use it to think about how to explain more deeply. I think that learning how to use non-traditional media to explain more deeply will help us change the way we teach and learn.
In too many cases, new technologies are used merely as substitutes for old technology. The web has led to an explosion of instructional video aimed at all levels of learners. No matter how valuable these videos are, most merely replace reading a textbook or a paper. But computational technology enables us to change the task at hand and even redefine what we do. Alan Kay has been telling this story for decades, pointing us to the work of Ivan Sutherland and many others from the early days of computing.
Nielsen points to Bret Victor as an example of someone trying to develop tools that redefine how we think. As Victor himself says, he is following in the grand tradition of Kay, Sutherland, et al. Victor's An Ill-Advised Personal Note about "Media for Thinking the Unthinkable" is an especially direct telling of his story.
Vi Hart is another. Consider her recent Parable of the Polygons, created with Nicky Case, which explains dynamically how local choices can create systemic bias. This simulation uses computation to help people think differently about an idea they might not understand as viscerally from a traditional explanation. Hart has a long body of work using visualization to explain differently, and the introduction of computing extends the depth of her approach.
Over the last few weeks, I have felt myself being pulled by Nielsen's essay and the example of people such as Victor and Hart to think more about how we might design media that help us to teach and explain scientific ideas more deeply. Reinventing explanation might help us reinvent education in a way that actually matters. I don't have a research agenda yet, but looking again at Victor's work is a start.
December 29, 2014 3:27 PM
Exceptions Should Be Exceptional
Exceptions signal something outside the expected bounds of behavior of the code in question. But if you're running some checks on outside input, this is because you expect some messages to fail -- and if a failure is expected behavior, then you shouldn't be using exceptions.
That is a snippet from Replacing Throwing Exceptions with Notification in Validations, a refactoring Martin Fowler published earlier this month. The refactoring is based on an extraordinarily useful piece of software design advice: exceptions should be unexpected. If something is expected, it's not exceptional. Make your software say so. A notification mechanism can carry as much information about system behavior as exceptions and generally provides superior cohesion and division of labor.
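To make the contrast concrete, here is a minimal sketch in Python of the kind of notification mechanism Fowler describes. The class and function names are my own illustration, not taken from his article: instead of raising an exception at the first expected failure, the validator collects every problem into a notification object and lets the caller decide what to do.

```python
class Notification:
    """Collects validation failures rather than raising on the first one."""
    def __init__(self):
        self.errors = []

    def add_error(self, message):
        self.errors.append(message)

    def has_errors(self):
        return len(self.errors) > 0

def validate_booking(date, number_of_seats):
    # Failures here are expected behavior, so we record them
    # in a notification instead of throwing an exception.
    note = Notification()
    if date is None:
        note.add_error("date is missing")
    if number_of_seats is not None and number_of_seats < 1:
        note.add_error("number of seats must be positive")
    return note

note = validate_booking(None, 0)
# The caller sees the full list of problems at once:
# note.errors == ["date is missing", "number of seats must be positive"]
```

The division of labor is cleaner than with exceptions: the validator reports everything it finds, and the caller chooses how to respond to the whole set.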
Over the last few years, I've come to see that this is really a tenet of good system design more generally. A couple of examples from my university experience:
- If your curriculum depends on frequent student requests to enable programs of study that faculty accept as reasonable, then you should probably modify the curriculum to allow what is reasonable. Not only are you gumming up the larger system with unnecessary paperwork, you are likely disadvantaging students who aren't savvy or cheeky enough to disregard the rules.
- If the way you pay for instruction and equipment doesn't match the stated budget, then you should change the budget to reflect reality. If you don't control the writing of the budget, then you should find ways to communicate reality to the budget writers whenever possible. Sometimes, you can work around the given budget to accomplish what you really need for a long time. But over time the system will evolve in response to other external forces, and you reach a point where the budget in no way reflects reality. A sudden change in funding can put you in a state of real crisis. Few people will be able to understand why.
People sometimes tell me that I am naive to think complex systems like a university or even a curriculum should reflect reality closely. Programming has taught me that we almost always benefit from keeping our design as clean, as understandable, and as truthful as we can. I am pragmatic enough to understand that there are exceptions even to this tenet, in life and in software. But exceptions should be exceptional.