Computer Science: Is It Really a Science, and What's It a Science About? A computing visionary and leader of the movement to define and elucidate the "great principles of computing," Peter J. Denning is a professor at the Naval Postgraduate School in Monterey, California. He is a former president of the ACM.
UBIQUITY: How much success do you think you’ve had advocating that computing is a science?
DENNING: I find little argument with the claim that computing is engineering, but skepticism toward the claim that computing is science. In the past few years there has been a sea change on the science claim. The skeptics are coming around. Part of the reason is that scientists in other fields, particularly biology and quantum physics, have declared that information processes occur naturally in their fields. (…)
When I began to call myself a computer scientist in 1967, I encountered many people who reacted with the question, “What is the science in computers?” It was not easy to answer that question, and little in my academic training helped. That question has always nagged me. (…)
But we’ve brought forward numerous other principles such as the uncertainty principle for selection among near-simultaneous signals (first discovered in computer architecture), the bottleneck principle (networks), the locality principle (operating systems), the two-phase locking principle (databases), and the hierarchical aggregation principle (systems design). There are many other principles besides these. (…)
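The locality principle mentioned above says that a running program concentrates its memory references on a small "working set" of pages for stretches of time, which is why small caches work at all. As a minimal sketch (my illustration, not part of the interview), the following compares an LRU cache's hit rate on a reference stream with locality against a uniformly random stream; the page counts and working-set size are arbitrary choices for the demonstration:

```python
# Illustration of the locality principle: an LRU cache serves most
# references when the program concentrates on a small working set,
# and very few when references are spread uniformly.
from collections import OrderedDict
import random

def lru_hit_rate(refs, capacity):
    """Fraction of references served from an LRU cache of given capacity."""
    cache = OrderedDict()
    hits = 0
    for page in refs:
        if page in cache:
            hits += 1
            cache.move_to_end(page)      # mark as most recently used
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict least recently used
            cache[page] = True
    return hits / len(refs)

random.seed(0)
pages = 100
# Local stream: long runs drawn from a small, slowly drifting working set.
local = []
base = 0
for _ in range(200):
    ws = [(base + i) % pages for i in range(5)]   # 5-page working set
    local.extend(random.choice(ws) for _ in range(50))
    base += 1                                      # working set drifts
# Uniform stream of the same length: no locality at all.
uniform = [random.randrange(pages) for _ in range(len(local))]

print(lru_hit_rate(local, 10), lru_hit_rate(uniform, 10))
```

With a cache of 10 pages out of 100, the local stream's hit rate is far higher than the uniform stream's, which hovers near the cache's share of the page pool.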
The cynics say that any field that calls itself a science can’t be a science. The serious critics invoke additional criteria including non-obvious implications of principles and falsifiability of hypotheses. I think we can answer these. The most difficult objection has been that computing can’t be a science because science deals with natural phenomena, whereas computers are manmade. They say that, at best, computing is a science of the artificial, not a real science. (…)
In 2004 I sat down and carefully checked how computing does or does not satisfy all the accepted criteria of being a science. These criteria include an organized body of knowledge, a track record of non-obvious discoveries, an experimental method to test hypotheses, and an openness to any hypothesis being falsified. They also include secondary distinctions such as interplay between science and art, and between basic and applied. I saw that we could check off every one of the accepted criteria. (…)
Even so, I still found resistance to the conclusion that computer science is a science. The single remaining objection was that, down at the bottom, science deals with natural things. I quipped that people are found in nature and people build computers. No, that doesn’t cut it, came the reply, science deals with things that existed before people came along. (…)
The claim is that, because computers are artifacts, all the things we study about computers are artificial. The purists say that a true science deals with natural, not artificial, entities. They say that computer science is about programming, chips, networks, and other technologies, all of which are human inventions. These technologies won’t necessarily be around in a generation or two. How can their study amount to a science? (…)
It struck me that if people in other fields think we are mainly programmers and technologists, we have been grossly unsuccessful at communicating the full breadth and depth of everything that we have done. (…)
It seemed to me that if we are a science, we should be able to explain our fundamental principles. I began to ask what they are. I discovered that neither my colleagues nor I could say. I grew up with the definition that computer science is the study of phenomena surrounding computers. This definition puts the computer at the center. It holds that computation is what computers do. In 2001, when Biology Nobel Laureate David Baltimore said that cellular mechanisms are natural computational means to read DNA and construct new living cells, I saw that our definition, and the thinking behind it, is backwards. Computation is the principle, the computer is simply the tool. (…)
Our older definition is like saying that biology is the study of phenomena surrounding the microscope, or astronomy is the study of phenomena surrounding the telescope. How odd this sounds! It sounds odd because we recognize the microscope and the telescope as tools to study life and the universe. So it is with computation. Suppose that information processes already exist in nature, and the computer is our tool for studying them? David Baltimore is one of the first scientists to say that computation occurs naturally; many have since followed.
UBIQUITY: So your project is really about developing a new language for discussing computing?
DENNING: Yes. That’s a nice way of putting it. Over the years, we have evolved a language for computing that sounds as if we think we’re all about programming and computing technology. I noticed that the natural sciences go to great lengths to emphasize their fundamental principles. Technologies come and go but the principles change only occasionally. About ten years ago I started to look seriously at what a principles-oriented language for computing might look like. (…) A breakthrough happened when we saw that we had to think of computation as the principle and the computer as the tool. That realization allowed us finally to construct an effective list of principles. (…)
People from other fields are saying they have discovered information processes in their deepest structures and that collaboration with computing is essential to them. We’re getting better at convincing people that there is something much deeper to computing than simply programming, abstractions, chips, and networks. (…)
UBIQUITY: What will Ubiquity readers find when they follow the link we give them, which is http://cs.gmu.edu/cne/pjd/GP?
DENNING: They will find a complete description of the Great Principles of Computing Project. In addition to the project overview, readers will find a taxonomy of principles in seven categories, narrative overviews of each category, a set of top-level principles for each category, detailed expansions of each principle, an analysis of new uses of the new body of knowledge, a discussion of a Great Principles Library, links to partner projects, and answers to frequently asked questions (FAQ).
UBIQUITY: Let’s quickly run down these project components. What are the seven categories?
DENNING: The seven categories are our first major contribution. They are computation, communication, coordination, recollection, automation, evaluation, and design. Each category is a group of related principles concerning a functional area of computing. They are not technologies. No category is itself a principle; each is simply a grouping of principles. I’ve compared the seven categories to windows in a seven-sided building that contains all computing knowledge….