TITLE: Workshop 2: Exception Gnomes, Garbage Collection Fairies, and Problems
AUTHOR: Eugene Wallingford
DATE: November 16, 2007 10:52 AM
DESC:
-----
BODY:
[A transcript of the
SECANT 2007 workshop:
Table of Contents]
Thursday afternoon at the
NSF workshop
saw a hodgepodge of sessions around the intersection of science
ed and computing.
The first session was on computing courses for science majors.
Owen Astrachan
described some of the courses being taught at Duke, including
his course using genomics as the source of
problems
to learn programming. He entertained us all with pictures of
scientists and programmers, in part to demonstrate how many of
the people who matter in the domains where real problems live
are not computer geeks. The problems that matter in
the world are not the ones that tend to excite CS profs...
Unbelievable but true!
Not everyone knows about the Towers of Hanoi.
... or cares.
John Zelle
described curriculum initiatives at Wartburg College to bring
computing to science students. Wartburg has taken several
small steps along the way:
- a more friendly CS1 course
- an introductory course in computational science
- integration of computing into the physics curriculum
- a CS senior project course that collaborates with the sciences
- (coming) a revision of the calculus sequence
At first, Zelle called it a "CS1 course friendlier to scientists",
but then he backed up to the more general statement. The idea
of needing a friendlier intro course even for our majors is
something many of us in CS have been talking about for a while,
and something I
wrote about a while back.
I was also interested in hearing about Wartburg's senior projects.
More recently, I wrote about
project-based computer science education.
Senior project courses are a great idea, and one that CS faculty
can buy into at many schools. That makes it a good first step
to perhaps changing the whole CS program, if a faculty were so
inclined. The success of such project-centered courses is just
about the only way to convince some faculty that a project focus
is a good idea in most, if not all, CS courses.
Wartburg's computational science course covers many of the traditional
topics, including modeling, differential equations, numerical
methods, and data visualization. It also covers the basics of
parallel programming, which is less common in such a course.
Zelle argued that every computational scientist should know
a bit about parallel programming, given the pragmatics of
computing over massive data sets.
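As an illustration (my sketch, not anything from Zelle's course),
here is about the smallest taste of that kind of parallelism a
student might get: farming a per-record computation out across
cores with Python's standard multiprocessing module.

    from multiprocessing import Pool

    def mean_square(sample):
        # A stand-in for some per-record scientific computation.
        return sum(x * x for x in sample) / len(sample)

    if __name__ == "__main__":
        # Pretend this is a massive collection of measurements.
        data = [[float(i + j) for j in range(100)] for i in range(5000)]
        with Pool() as pool:    # one worker per CPU core by default
            results = pool.map(mean_square, data)
        print(len(results), results[0])

The point is less the library than the habit of mind: the
per-record computation stays a plain function, and only the
dispatch changes.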
The second session of the afternoon dealt with issues of
programming "in the small" versus "in the large". It seemed
like a bit of a hodgepodge itself. The most entertaining of
these talks was by
Dennis Brylow
of Marquette, called "Object Dis-Oriented". He said
that his charge was to "take a principled stand that will
generate controversy". Some in the room found this to be a
novelty, but for Owen and me, and anyone in the SIGCSE crowd,
it was another in a long line of anti-"objects first" screeds:
Students can't learn how to decompose into methods until they
know what goes into a method; students can't learn to program
top-down, because then "it's all magic boxes".
I reported on a similarly
entertaining panel
at SIGCSE a couple of years ago. Brylow did give us something
new, a phrase for the annals of anti-OO snake oil: students
who learn OOP first see their programs as
... full of exception gnomes and garbage collection fairies.
Owen asked the natural reductio ad absurdum question: Why not
teach logic gates first, then? The answer from the choir around the room
was, good point, we have to choose a level, but that level is
below objects -- focused on "the fundamentals". Sigh.
Abacus-early,
anyone?
Brylow also offered a list of what he thinks we should teach
first, which contains some important ideas: a small language,
a heavy focus on data representation, functional
decomposition, and the fundamental capabilities of machine
computation.
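To give a rough flavor of that procedures-first style (my example,
not Brylow's), here is a tiny program built by functional
decomposition over plain data, with no objects in sight:

    def fahrenheit_to_celsius(f):
        # Each function does one small, nameable thing...
        return (f - 32) * 5.0 / 9.0

    def average(values):
        return sum(values) / len(values)

    def report(readings_f):
        # ...and the top-level procedure composes them.
        celsius = [fahrenheit_to_celsius(f) for f in readings_f]
        print("mean temperature: %.1f C" % average(celsius))

    report([68.0, 72.5, 70.1])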
Brylow's list tied well into the round-table discussion that followed,
on what computational concepts science students should learn. I
didn't get a coherent picture from this discussion, but one
part stood out to me. Bruce Sherwood said that many scientists
view analytical solution as privileged over simulation, because
it is exact. He then pointed out that in some domains the
situation is being turned on its head: a faithful discrete
simulation is a more real depiction of the world than
the closed-form analytical solution -- which is, in fact, only
an approximation created at a time when our tools were more
limited. The best quote of this session came from John Zelle:
Continuity is a hack!
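A toy example of my own, not one from the discussion, may make
the point concrete: the closed-form position of a falling ball
is "exact" only in an idealized world, while a discrete,
time-stepped simulation happily includes air resistance, for
which there is no tidy formula.

    G, DRAG, DT = 9.8, 0.1, 0.001

    def closed_form(t, v0=20.0):
        # The textbook "exact" answer; exact only with no air resistance.
        return v0 * t - 0.5 * G * t * t

    def simulate(t_end, v0=20.0):
        # Euler time-stepping with quadratic drag, which has no
        # simple closed-form solution.
        y, v, t = 0.0, v0, 0.0
        while t < t_end:
            v -= (G + DRAG * v * abs(v)) * DT
            y += v * DT
            t += DT
        return y

    print(closed_form(2.0), simulate(2.0))

With the drag set to zero, the simulation converges on the formula
as the time step shrinks; with drag on, the simulation is the more
faithful account of the world.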
The day closed with another hodgepodge session on the role
of data visualization.
Ruth Chabay
spoke about visualizing models, which in physics are
as important as -- more important than!? -- data.
Michael Coen
gave a "great quote"-laden presentation that on the question
of whether computer science is the servant of science or the
queen of the sciences, a lá
Gauss on math
before him.
Chris Hoffman
gave a concise but convincing motivational talk:
- There is great power in visualizing data.
- With power comes risk, the risk of misleading.
- Visualization can be tremendously effective.
- Techniques for visual data analysis must account for the
coming data deluge. (He gave some great examples...)
- The challenges of massive data are coming in all
of the sciences.
When Coen's and Hoffman's slides become available on-line, I
will point to them. They would be worth a glance.
Hodgepodge sessions and hodgepodge days are okay. Sometimes
we don't know where the best connections will come from...
-----