
Vinton G. Cerf: "The Value of Investment by the U.S. Government Cannot Be Overstated"

Written testimony for the U.S. Senate Committee on Commerce, Science, and Transportation hearing "The Federal Research Portfolio: Capitalizing on Investments in R&D," held on July 17, 2014

On Thursday, July 17, four science experts served as witnesses at the U.S. Senate Committee on Commerce, Science, and Transportation hearing, “The Federal Research Portfolio: Capitalizing on Investments in R&D.” The hearing considered the federal government’s role in research and development (R&D), and the nation’s STEM education and outreach initiatives.

Attendees in the Capitol hearing room were Mariette DiChristina, editor in chief and senior vice president of Scientific American; Vinton G. Cerf, computer scientist, Google’s Internet Evangelist and one of the fathers of the Internet; Neal F. Lane, former director of the White House Office of Science and Technology Policy; and Stephen E. Fienberg, professor of statistics and social science at Carnegie Mellon University.

Recognizing the need for long-term investments in science and technology, Congress passed the America COMPETES Acts of 2007 and 2010 to significantly increase federal R&D budgets, to promote STEM (science, technology, engineering and mathematics) education and to support the innovation necessary for economic growth.


Below is the full text of the written testimony by Vinton G. Cerf.

Chairman Rockefeller, Ranking Member Thune, Members of the Committee, distinguished panelists and guests, I am honored and pleased to have this opportunity to participate in a hearing on a topic about which I am passionate and to which I am committed: basic research. There is no substitute for deep understanding of natural and artificial phenomena, especially when our national and global wellbeing depends on our ability to model and make predictions regarding them. It would be hard to overstate the benefits that have been realized from investment by the US Government and American industry in research.

I am sure every member of this committee is well aware of the fundamental scientific paradigm: Theories are developed to explain observations or to speculate on how and why things might work. Experiments are undertaken to validate or refute the predictions of the theory. Theories are revised based on experimental results.

Basic and Applied Research
While the primary focus of attention in this panel is on basic research, I feel compelled to observe that basic and applied research go hand in hand, informing and stimulating each other in a never-ending Yin and Yang of partnership. In some ways, applied research is a form of validation because the success (or failure) of the application may reinforce or contradict the theoretically predicted results and the underlying theory. Basic research tries to understand; applied research tries to do; often one must pursue both in the effort to uncover new knowledge.

I would like to use the Internet as an example of applied research to make several points. The Internet was first conceived by Bob Kahn in late 1972. He and I worked together on the idea during 1973, publishing the first paper on its design in May 1974. It was launched operationally on January 1, 1983. Sponsored by the US Defense Advanced Research Projects Agency (DARPA), the Internet drew strong motivation from its earlier and highly successful ARPANET and later Packet Radio and Packet Satellite projects. The Packet Satellite project also drew, in part, on the results of another project called ALOHAnet that had been sponsored by the U.S. Air Force Office of Aerospace Research (SRMA) and DARPA. NSF was a major contributor to the Internet’s development and dramatic expansion in the academic community with its NSFNET project that linked NSF’s supercomputers with the research community. The Department of Energy’s ESNET and the NASA Science Internet (NSINET) added to the energy driving this development.

First, successful applied research projects like the Internet may take a long time to mature. It was ten years from the conception to the deployment of the system and required persistent funding and advocacy during and after that period, to say nothing of the research and experimentation that preceded it.

Second, while primarily an engineering and applied research project, the system did then and continues now to turn up new theoretical and analytical challenges. We are still evolving theories and models of the behavior of this complex, growing and evolving system as we measure, observe and analyze its performance. The applications of the Internet continue to drive research aimed at understanding and improving its operation or in inventing something better.

Third, serendipity has played a significant role in the evolution of the Internet’s functionality and the applications it supports. Networked electronic mail emerged as a major but unplanned application on the ARPANET. The World Wide Web (WWW) was initially conceived in 1989 to support sharing of research papers in particle physics at the European Organization for Nuclear Research (CERN). It spread rapidly on the Internet after the introduction of the Mosaic browser by the NSF-funded National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign in 1993 and the creation of the Netscape Communications Corporation in 1994. The WWW has become the most widely used application on the Internet. Though the WWW was conceived for a particular application, its generality, and that of the underlying Internet, has created the conditions for a cornucopia of new uses that continue to be invented daily.

Research Takes Time
Validation of basic research may also take a long time. The notion of inflation in the early universe still awaits satisfactory confirmation. Postulated by Alan Guth (among others) around 1980, the theory received apparent support this year from measurements taken by the BICEP2 experiment, but there is significant debate about the interpretation of those measurements. While the community awaits further corroborating or refuting experimental validation, it is important to recognize that the means to gather potentially validating experimental data took more than 30 years to reach maturity. A similar observation can be made for the recent discovery of a Higgs boson by the Large Hadron Collider teams at CERN. Peter Higgs and his colleagues postulated the existence of this fundamental particle and its associated field around 1964, but it has taken 50 years for the experimental capacity needed to test this theory to reach the point where such tests could be undertaken.

It’s Risky: There are No Guarantees
It is worth pausing for a moment to appreciate that research, by its very nature, cannot always guarantee results. Moreover, sometimes the results may come in the form of surprises. A canonical example is the discovery by Alexander Fleming, in 1928, that Penicillium mold produces an antibiotic. He was reacting to an unexplained observation in some petri dishes he happened to notice. It was not until 13 years later, in 1941, that the active compound we call penicillin was isolated. The best scientists are the ones who are alert to anomalies and seek to understand them. Nobel Prizes don’t go to scientists who ignore anomalies. They go to the scientists who see unexpected results, say, “Huh? That’s funny!” and try to find out what is behind an unanticipated observation.

Humility is called for in this space. One hears the term “Laws of Physics” as if punishment awaits anyone or anything that dares to break them. And, yet, we know these so-called laws may be only approximations of reality – limited by the accuracy of our measurement tools and experimental capacity to validate their predictions. Every scientist must be prepared to cast aside or revise a pet theory if measurement and observation contradict it.

Perhaps more important is the ability to sustain high-risk, high-payoff research. American industry can afford to take some risk, but sustainable businesses are rarely in a position to invest in very long-term research. Venture capital, while historically willing to take considerable risk, is looking for near-term payoffs. The ability to take sustained, long-term risk for potential long-term benefit falls largely to the government. The United States has benefited from underwriting this kind of research, as exemplified by the research programs of the National Science Foundation (NSF), the Defense Advanced Research Projects Agency, the National Institutes of Health and the National Institute of Standards and Technology, among many other US Government-supported research programs.

In this area, the US Congress and the committees focused on scientific research and development have the greatest roles to play. Consistent and increasing support for basic and applied research and advanced development has been the source of most major advances in science and technology in the past 70 years. The American economy has been the envy of the world, in large part because of this consistent cycle of long-term research and its application to near-term products and services.

The Importance of Failure
Failure is the handmaiden of wisdom in the scientific world. When we make predictions or build systems based on our theoretical models, we must be prepared for and learn from our failures. Understanding the reason for failure is sometimes even more important than positive results since it may pave the way for far deeper understanding and more precise models of reality. In the scientific enterprise, the freedom to take risk and accept the potential of failure makes the difference between merely incremental refinement and breakthroughs that open new vistas of understanding.

In the late 1800s it was thought that the Newtonian model of the universe was complete and that we merely needed to measure the physical constants more accurately to be able to make unequivocal predictions. In 1905, Einstein’s four papers on the photoelectric effect, Brownian motion, special relativity and mass-energy equivalence (E = mc²) shattered the complacency of early 20th-century physics. He showed that purely Newtonian notions were inadequate to explain measured observations. He compounded his impact in 1915 with the publication of his monumentally important field equations of general relativity.

Research into the nature of the atom led to the development of quantum field theory beginning in the 1920s. Efforts to reconcile its extremely counterintuitive but extraordinarily accurate predictions with Einstein’s geometric theory of space-time have not borne demonstrable fruit. The irony of all this is that we now believe that the physics of the very small is extremely relevant to the study of the universe at large, because the early universe at the moment of the so-called Big Bang was so small, dense and hot that quantum models appear to have dominated its behavior. Einstein’s geometric theory simply breaks down under these conditions and provides no predictions of testable use.

If we have learned anything over the course of the past hundred years, it is that we know less than we once thought we knew about the world around us. For scientists, this only means that the territory yet to be explored is simply larger than ever and that discovery awaits us at every turn.

The Role of Computing
Richard Hamming, the legendary numerical analyst, famously observed: “The purpose of computing is insight, not numbers.” Computers, computation, networking and information sharing have become essential parts of the research landscape over the past 50 years. The World Wide Web and the search engines that have evolved around it have improved our ability to share and discover information and potential research partners on a global scale. New disciplines have emerged, such as computational biology, computational chemistry and computational physics. We use increasingly detailed and accurate models to make predictions that we can test in the laboratory. The 2013 Nobel Prize in Chemistry went to three NSF-funded researchers for their models of molecular processes. From the Scientific American blog: “… this year’s prize in chemistry has been awarded to Martin Karplus, Michael Levitt and Arieh Warshel for their development of ‘multiscale methods for complex systems’. More simply put, these three chemists have been recognized for their development and application of methods to simulate the behavior of molecules at various scales, from single molecules to proteins.”

There is a renaissance in the application of computing to research, partly driven by the vast increase in computational power and memory found in combinations of cloud and supercomputing. “Big data” has become a mantra, but it is fair to say that our ability to absorb, analyze and visualize vast quantities of measured or computed data has improved dramatically in the last few decades. Thanks to these capabilities, we can use ever finer-grained models and improve the accuracy and timeliness of predictions. Computational biology may lead to breakthroughs in our ability to understand genetics, epigenetics, the proteome and the importance of the flora in our digestive systems. With this knowledge, we will help people live longer, healthier and more productive lives. Our ability to understand global phenomena will benefit from this computational renaissance.

I would be remiss not to mention the Internet of Things that is fast upon us. The networking of the common devices that surround and pervade our society is rapidly becoming reality. From household appliances to office equipment, from industrial manufacturing to utilities, from transportation vehicles to personal monitoring equipment, we will live in an increasingly networked world. We will be surrounded by software. It is vital that we learn to design safety and security into these systems and to understand and be able to predict their aggregate behavior. This trend, too, illustrates the promise and the peril of our modern world. Cyber-security and cyber-safety must accompany our increasing use of computers, programmable devices and networks if we are to receive net benefit from these developments.

Nanomaterials
Adjacent to, and actually contributing to, computational capacity we find nanotechnology of increasing importance and value. Materials not found in nature have properties that defy intuition (e.g., invisibility and superconductivity). Graphene, sheets of carbon atoms arrayed in a one-atom-thick, hexagonal, “chicken wire” lattice, has unexpected potential for replacing silicon in transistors, for filtering impurities from water, and for conducting heat and superconducting electricity. Carbon is becoming both the bête noire and the deus ex machina of our civilization, depending on whether it is in the form of carbon dioxide, hydrocarbon fuels or carbon nanotubes!

In the Interest and Pursuit of Science and its Application
It is widely and correctly appreciated that science, technology, engineering and mathematics (STEM) form the basis for improving upon and making use of our understanding of how the phenomena of our world work. While there is persistent controversy regarding the supply of STEM-trained workers, there can be little doubt that there is an increasing demand in the workforce for these skills.

As a recent president of the Association for Computing Machinery (ACM) and a member of the Google staff, I have been a strong proponent of the proposition that computer science should be a required part of the K-12 curriculum. Every student should have some exposure to the concept of programming, not only because it promotes logical thinking but also because it is important for everyone to understand and appreciate the potential weaknesses in all software-controlled systems. Computer science should be treated on a par with biology, chemistry, physics and mathematics in K-12 and undergraduate curricula, not simply as an elective that bears no STEM credit.

The maker movement is perhaps one of the most important emerging phenomena in modern culture. The rediscovery of the joy and satisfaction of making things is contributing to a rebirth of American interest in small-scale manufacturing and pride of workmanship. The development of so-called 3D printers has accelerated this phenomenon. NSF is strongly involved in these initiatives. Coupled with research programs in advanced manufacturing, stimulated in part by versions of the America COMPETES Act (P.L. 110-69 of 2007 and P.L. 111-358 of 2010), advanced manufacturing and the maker movement have the potential to recapture American initiative and interest in a space that had historically moved offshore.

Voluntary programs such as Dean Kamen’s FIRST Robotics competitions are representative of a wave of such initiatives that have the potential to rekindle the natural STEM interests of America’s youth.

It is sometimes said that we are all born natural scientists but that our educational system sometimes manages to erode this natural curiosity with poorly constructed curricular content and style of presentation. Computers and networks may have a role to play here as well.

An early foray into the Massive Open Online Course (MOOC) space was undertaken by two of my Google colleagues, Sebastian Thrun and Peter Norvig. They proposed to teach an online course in artificial intelligence, in cooperation with Stanford University. Expecting, at most, 500 people to sign up, they were stunned to find that 160,000 people had applied to take the class. Critics pointed out that only 23,000 completed the course – but I defy you to provide an example of any teacher of computer science who has taught that many students in the course of a career, let alone in one class!

The early success of MOOCs has generated justifiable excitement and the formation of for-profit and non-profit efforts in this space. With classes serving tens of thousands of students at a time, the economics of MOOCs are dramatic and compelling. A class of 100,000 students, paying $10 each, generates $1 million in revenue! Plainly, scaling is the key leveraging factor. While absolutely not a panacea, the ability to deliver high-quality content and individualized learning in appropriate educational areas has transformative potential for an educational system that has not changed much in the last 200 years.
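
To make the scaling arithmetic concrete, the following is a minimal sketch in Python of the economics described above. The $10 fee and the enrollment figures come from the testimony; the fixed production cost is a hypothetical number introduced purely for illustration.

# Minimal sketch of MOOC economics: revenue scales linearly with enrollment,
# while a (hypothetical) fixed cost of producing the course is amortized over
# ever more students.

FEE_PER_STUDENT = 10             # illustrative $10 fee cited in the testimony
FIXED_PRODUCTION_COST = 250_000  # assumed one-time production cost (illustrative only)

def mooc_economics(enrollment: int) -> tuple[int, float]:
    """Return (total revenue, fixed cost amortized per student) for a class size."""
    revenue = enrollment * FEE_PER_STUDENT
    cost_per_student = FIXED_PRODUCTION_COST / enrollment
    return revenue, cost_per_student

if __name__ == "__main__":
    for enrollment in (500, 23_000, 100_000, 160_000):
        revenue, cost_per_student = mooc_economics(enrollment)
        print(f"{enrollment:>7,} students: ${revenue:>9,} revenue, "
              f"${cost_per_student:>9,.2f} fixed cost per student")

At 100,000 students the sketch reproduces the $1 million revenue figure cited above, and the per-student share of any fixed production cost becomes very small, which is the leverage the testimony points to.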

Conclusion
In my opinion, support for basic and applied research is fundamentally justifiable based not only on the civil and economic benefits it has conferred but also on the ground-level understanding that basic research is high risk but has a high potential payoff. Only the Government has the capacity to sustain this kind of effort. The National Science Foundation was founded by Congress in 1950. Over the past 60+ years, NSF has successfully supported the scientific research enterprise through widely solicited proposals, a well tested peer review system, dedicated and well-qualified program managers and strongly motivated and highly effective leadership.

As a member of the National Science Board, I have learned that successful scientific endeavors supported by NSF rely on a partnership among the research community, the National Science Foundation staff, leadership and board, and the members of the House and Senate who are equally committed to basic and applied research. Vannevar Bush got it exactly right in his landmark report, Science, The Endless Frontier. Science is an endless frontier. The more we learn, the more we know we don’t know, and the more we must dedicate ourselves to learning and knowing more.

Vinton G. Cerf is the chairman of the Marconi Society and serves as vice president and chief internet evangelist for Google. Cerf is widely known as one of the creators of the internet.
