TITLE: Thinking and Doing in the Digital Age AUTHOR: Eugene Wallingford DATE: May 02, 2011 3:52 PM DESC: ----- BODY: Last week, someone I follow tweeted this link in order to share this passage:
You will be newbie forever. Get good at the beginner mode, learning new programs, asking dumb questions, making stupid mistakes, soliciting help, and helping others with what you learn (the best way to learn yourself).
That blog entry is about the inexorable change of technology in the modern world and how, if we want to succeed in that world, we need a mindset that accommodates change. We might even argue that we need a mindset that welcomes or seeks out change. To me, this is one of the more compelling reasons for us to broaden the common definition of the liberal arts to include computing and other digital forms of communication. As much as I like the quoted passage, I like a couple of others as much or more. Consider:
Understanding how a technology works is not necessary to use it well. We don't understand how biology works, but we still use wood well.
As we introduce computing and other digital media to more people, we need to balance teaching people how to use new ideas and techniques with teaching the underlying implementations. Some tools change how we work without us knowing how they work, or needing to know. It's easy for people like me to get so excited about, say, programming that we exaggerate its importance. Not everyone needs to program all the time. Then again, consider this:
The proper response to a stupid technology is to make a better one yourself, just as the proper response to a stupid idea is not to outlaw it but to replace it with a better idea.
In the digital world, as in the physical world, we are not limited by our tools. We can change how our tools work, through configuration files and scripts. We can make our own tools. Finally, an aphorism that captures the difference between how today's youth think about technology and how people my age often think (emphasis added):
Nobody has any idea of what a new invention will really be good for. To evaluate, don't think; try.
This has always been true of inventions. I doubt many people appreciated just how different the world would be after the creation of the automobile or the transistor. But with digital tools, the cost of trying things out has been driven so low, relative to the cost of trying things in the physical world, that it is effectively zero. In so many situations now, the net value of trying things exceeds the net value of thinking. I know that sounds strange, and I certainly don't mean to say that we should all just stop thinking. That's the sort of misinterpretation too many people made of the tenets of extreme programming. But the simple fact is that thinking too much means waiting too long. While you are thinking -- waiting to start -- someone else is trying, learning faster, and doing things that matter. I love this quote from Elisabeth Hendrickson, who reminded herself of the wisdom of "try; don't think" when creating her latest product:
... empirical evidence trumps speculation. Every. Single. Time.
The scientific method has been teaching us the value of empiricism over pure thought for a long time. In the digital world, that value is even more pronounced. -----