Workshop 1: Creating a Dialogue Between Science and CS
Eugene Wallingford
November 15, 2007 8:34 PM

[A transcript of the SECANT 2007 workshop: Table of Contents]

The main session this morning was on creating a dialogue between science and CS. There seems to be a consensus among scientists and computer scientists alike that the typical introductory computer science course is not what other science students need, but what do they need? (Then again, many of us in CS think that the typical introductory computer science course is not what our computer science students need!)

Bruce Sherwood, a physics professor at North Carolina State, addressed the question, "Why computation in physics?" He said that this was one prong in an effort to admit to physics students "that the 20th century happened". Apparently, this is not common enough in physics. (Nor in computer science!) To be authentic to modern practice, even intro physics must show theory, experiment, and computation. Physicists have to build computational models, because many of their theories are too complex or have no analytical solution, at least not a complete one.

What excited me most is that Sherwood sees computation as a means for communicating the fundamental principles of physics, even its reductionist nature. He gave as an example the time evolution of the Newtonian synthesis. A closed-form solution shows students the idea only at a global level. With a computer simulation, students can see change happen over time. Even more, the simulation can be used to demonstrate that the theory supports open-ended prediction of future behavior. Students never really see this when playing with analytical equations. In Sherwood's words, without computation, you lose the core of Newtonian mechanics! He even argued that physics students should learn to program. Why? More on science students and programming in a separate entry.

Useful links from his talk include comPADRE, a part of the National Science Digital Library for educational resources in physics and astronomy, and VPython, dubbed by supporters as "3D Programming for Ordinary Mortals".
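Sherwood's point about open-ended prediction is easy to make concrete. Here is a minimal sketch of my own (not code from his talk, and in plain Python rather than the VPython he champions): it steps Newton's second law forward in small time increments for a falling object, the kind of loop his simulations are built on.

```python
# Time-stepping model of Newtonian motion: repeatedly apply F = ma over
# small time steps to predict future behavior. (My own illustrative
# sketch; the numbers are arbitrary.)

def time_to_fall(y0, v0=0.0, dt=0.01, g=9.8):
    """Drop an object from height y0 (m); step until it reaches the ground."""
    y, v, t = y0, v0, 0.0
    while y > 0:
        v -= g * dt      # update velocity from the net force (gravity)
        y += v * dt      # update position from the new velocity
        t += dt
    return t

# Closed form predicts sqrt(2 * 20 / 9.8), roughly 2.02 seconds.
print(round(time_to_fall(20.0), 2))
```

Nothing in the loop bakes in a closed-form endpoint; it just keeps applying the force law step by step, which is exactly the open-ended prediction of future behavior that an analytical equation hides from students.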
I must admit that the few demos and programs I saw today were way impressive.

The second speaker was Noah Diffenbaugh, a professor in earth and atmospheric sciences at Purdue. He views himself as a modeler dependent on computing. In the last year or so, he has collected 55 terabytes of data as part of his work. All of his experiments are numerical simulations. He cannot control the conditions of the system he studies, so he models the system and runs experiments on the model. He has no alternative. Diffenbaugh claims that anyone who wants a career in his discipline must be able to do computing -- as a consumer of tools and a builder of models. He goes farther, calling himself a black sheep in his discipline for thinking that learning computing is critical to the intellectual development of scientists and non-scientists alike.

When most scientists talk of computation, they talk about a tool -- their tool -- and why it should be learned. They do not talk about principles of computing or the intellectual process one practices when writing a program. This concerns Diffenbaugh, who thinks that scientists must understand the principles of computing on which they depend, and that non-scientists must understand them, too, in order to understand the work of scientists. Of course, scientists are not the only ones who fixate on their computational tools to the detriment of discussing ideas. CS faculty do it, too, when they discuss CS1 in terms of the languages we teach. What's worse, though, is that some of us in CS do talk about principles of computing and intellectual process -- but only as the sheep's clothing that sneaks our favorite languages and tools and programming ideas into the course.

The session did include some computer scientists. Kevin Wayne of Princeton described an interdisciplinary "first course in computer science for the next generation of scientists and engineers".
In his view, both computer science students and students of science and engineering are shortchanged when they do not study the other discipline. One of his colleagues (Sedgewick?) argues that there should be a common core in math, science, and computation for all science and engineering students, including CS. What do scientists want in such a course? Wayne and his colleagues asked and found that they wanted the course to cover simulation, data analysis, the scientific method, and transferable programming skills (C, Perl, Matlab). That list isn't too surprising, even the fourth item. That is a demand that CS folks hear from other CS faculty and from industry all the time.

The course they have built covers the scientific method and a modern programming model built on top of Java. It is infused with scientific examples throughout. This includes not only examples from the hard sciences, such as sequence alignment, but also cool examples from CS, such as Google's PageRank scheme. In the course, students use real data and so experience the sensitivity to initial conditions in the models they build. He showed examples from financial engineering and political science, including the construction of a red/blue map of the US by percentage of the vote won by each candidate in each state. Data of this sort is available at the National Atlas of the US, a data source I've already added to my list.

The fourth talk of the session was on the developing emphasis on modeling at Oberlin College, across all the sciences. I did not take as many notes on this talk, but I did grab one last link from the morning, to the Oberlin Center for Computation and Modeling. Occam -- a great acronym.

My main takeaway points from this session came from the talks by the scientists, perhaps because I know relatively less about what scientists think about and want from computer science. I found the examples they offered fascinating and their perspectives on computing to be surprisingly supportive.
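The PageRank example is a nice illustration of why CS examples belong in such a course: the core idea fits in a few lines. Here is my own sketch of the power-iteration idea behind PageRank -- not code from Wayne's course -- with a three-page link graph and a damping factor of 0.85 assumed for the example.

```python
# Power-iteration sketch of the idea behind Google's PageRank (my own
# illustration; the link graph and damping factor are assumptions).

def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}       # start with equal rank
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            share = damping * rank[p] / len(outs)
            for q in outs:
                new[q] += share              # pass rank along each link
        rank = new
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # C is linked from both A and B
```

Repeated application of the same simple update converges to a fixed point -- the same step-and-iterate pattern the scientists' simulations use, which is part of what makes it a good crossover example.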
If these folks are at all representative, the dialogue between science and CS is ripe for development.