TITLE: Advice on My Advice to a Prospective Web Developer
AUTHOR: Eugene Wallingford
DATE: August 04, 2009 1:45 PM
DESC:
-----
BODY:

Thanks to everyone who responded to my call for advice on what advice I should give to a student interested in studying at the university with the goal of becoming a web developer. People interpreted the article in different ways and so offered many different kinds of suggestions. In general, most confirmed the gist of my advice: learn a broad set of technical skills from across the spectrum of computer science, because that prepares one to contribute to the biggest part of the web space, or study design, because that's the part the techies tend to do poorly.

A couple of readers filled me in on the many different kinds of web development programs being offered by community colleges and technical institutes. We at the university could never compete in this market, at least not as a university.

Mike Holmes wrote a bit about the confusion people have about computer science, with a tip of the hat to Douglas Adams. This confusion plays a role in prospective students' indecision about pursuing CS in college. People go through phases in which they think of the computer as replacing an existing technology or medium: calculator, typewriter and printing press, sketchpad, stereo, television. Most of us in computer science seem to do one of two things: latch onto the current craze, or stand aloof from the latest trend and stick with theory. The former underestimates what computing can be and do, while the latter is so general that we appear not to care about what people want or need. It is tough to balance these forces.

In some twittering around my request, Wade Arnold tweeted about the technical side of the issue:
@wallingf Learn Java for 4 years to really know one language well. Then they will pick up php, ruby, or python for domain specific speed
The claim is that by learning a single language really well, a person really learns how to program. After that, she can learn other languages and fill in the gaps, both language-wise and domain-wise.

This advice runs counter to what many, many people say, myself included: students should learn lots of different languages and programming styles in order to really learn how to program. I think Wade agrees with that over the long term of a programmer's career. What's unusual in his advice is the idea that a student could or should spend all four years of undergrad study mastering one language before branching out. A lot of CS profs will dismiss this idea out of hand; indeed, one of the constant complaints one hears in certain CS ed circles is that too many schools have "gone Java" to the exclusion of all other languages, to the lasting detriment of their students.

My department's curriculum has, since before I arrived, required students to study a single language for their entire first year, in an effort to help them learn one language well enough that they learn how to program before moving on to new languages and styles. When that language was, say, Pascal, students could learn pretty much the whole language and get a lot of practice using it. C is simple enough for that purpose, too, I suppose, but C++, Ada, and Java aren't. If we want students to master those languages at a comparable level, we might well need four years. I think that says more about the languages we use than about students' ability to learn enough programming in a year to be ready to generalize and fill in the gaps with new languages and styles.

This entry has gotten longer than I expected, yet I have more to say. I'll write more in the days to come.

-----