Transforming Defense Basic Research Strategy

 

AUGUSTUS W. FOUNTAIN III


From Parameters, Winter 2004-05, pp. 40-54.


The US armed forces currently enjoy an unprecedented level of technological superiority across the full spectrum of military threats. These advances were primarily funded through US government and Department of Defense support of basic science and technology throughout the 50 years of relative peace experienced during the Cold War. A long-term investment in research has allowed the military to field key enabling technologies such as radar, jet engines, nuclear weapons, night vision, precision-guided munitions, stealth, the Global Positioning System, unmanned air vehicles, and information management systems that have dramatically changed warfare. Technological superiority will continue to be a cornerstone of our national military strategy.1 Today’s technological edge allows us to dominate the broad spectrum of conflict and to win with relatively few casualties; maintaining that edge has become a key component of the vision to transform the US joint forces, which relies on developing and fielding high-technology weapons that enable a smaller force to be more effective.2

The catalyst that created today’s generation of technological advances was a post-World War II decision to create a huge national engine of public science. The blueprints of this engine were drafted in a report to President Truman by Vannevar Bush, the Director of the Office of Scientific Research and Development. The foundation of Dr. Bush’s plan was to fund investigator-initiated projects, largely conducted in academic laboratories, by civilians independent of the military establishment.3 Under this construct, universities would do the fundamental research work—the “R” in R&D. Government laboratories and arsenals would then take some of that research and, with the cooperation of industry, develop it into military technologies. The vision Bush proposed clearly recognized that the applications developed from basic research often appeared many years after the work was initiated, and that there might be no clear benefit realized from much of this work.

In the 50 years since the end of World War II, changes have occurred that might call for a major adjustment in our strategy for defense funding of scientific research. The two most important are the end of the Cold War and the emergence of a global technological marketplace.4 Public funding of basic research for the Department of Defense during the Cold War was successful because it minimized risk by taking maximum advantage of long-term research projects that produced rather mature technologies for development. The Global Positioning System (GPS) is an example of a technology that has given US forces an incredible advantage on the modern battlefield. Research on satellites and a global positioning system began in 1946, after the publication of an article on geostationary orbits by physicist Arthur C. Clarke, more widely known for writing 2001: A Space Odyssey. The first GPS satellite was launched in 1978, with the full 24-satellite constellation completed on 9 March 1994.5 In a way, our science and technology capability has acted as an additional form of deterrence against our adversaries. However, in today’s fast-paced and dynamic environment, the Department of Defense cannot afford 48 years to research, develop, and deploy critical technologies to the warfighter. Many critical defense technologies are now readily available in the global marketplace, which means advanced technology is available to adversaries and allies alike. This makes the in-house development of new capabilities ever more important.
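The 48-year figure follows directly from the dates cited above; a minimal arithmetic sketch (dates from the text, formatting mine):

```python
# Rough check on the GPS timeline cited in the text.
research_began = 1946          # satellite and positioning research begins
constellation_complete = 1994  # full 24-satellite constellation completed

print(f"Concept to full constellation: {constellation_complete - research_began} years")  # 48
```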

The Department of Defense is relying on an investment in science and technology to provide the foundation for transformational joint warfighting capabilities. However, the DOD has maintained the same basic research infrastructure and funding policies that were developed for the Cold War. In order to stay ahead of adversaries with access to technologies available in the global marketplace, the DOD needs to shorten the time-frame from concept to fielding. The public funding of defense basic research in universities is too cumbersome, slow, and focused on the wrong goals to adequately develop the technology needed for fighting the Global War on Terror or to deliver to the Future Force of 2020. Thus, here is the question posed by this article: “Is the Department of Defense basic science research strategy capable of developing the technology necessary to enable key elements of the US military’s transformation?”

The DOD Science and Technology Process

The purpose of DOD research is to ensure that our warfighters have “superior and affordable technology to support their missions and to provide revolutionary capabilities.”6 The DOD Science and Technology (S&T) program is coordinated and focused through a series of five documents: the Defense S&T Strategy, the Defense Technology Area Plan, the Defense Technology Objectives document, the Joint Warfighting S&T Plan, and the Basic Research Plan. These documents, as well as supporting individual S&T master plans of the military services and Defense agencies, guide the annual preparation of the DOD budget and program objective memorandums (POMs). The first four documents are updated quadrennially, with the fifth being updated biennially.

The Defense S&T Strategy establishes high-priority investment areas and then implements those goals by assigning a service or agency the lead for a given research area. This process is called “Reliance” and allows the Defense Department to combine resources and reduce redundancy. The Reliance process includes research efforts from the separate services, the Ballistic Missile Defense Organization, the Defense Threat Reduction Agency, the Defense Advanced Research Projects Agency, the Office of the Deputy Under Secretary of Defense for Advanced Systems and Concepts, and the Joint Staff.7

The Defense Technology Area Plan documents the focus, content, and principal objectives of the overall DOD science and technology efforts. This plan outlines the investment strategy for applied research and advanced technology development in 12 key technologies critical to the DOD, but organized along service lines. Additionally the plan details the nearly 200 Defense Technology Objectives, which are the fundamental building blocks of the Defense S&T program. These objectives form the basis of the Defense S&T Reliance process by assigning key research objectives and specific technology advancements to each of the participating services and agencies.8

The Joint Warfighting S&T Plan (JWSTP) is similar to the Defense Technology Area Plan. However, it ensures joint efforts are achieved throughout the applied research and advanced technology development arenas. This document outlines the Joint Warfighting Capability Objectives, which are similar in principle to the Defense Technology Objectives, but their primary purpose is to ensure that the S&T program supports future joint warfighting capabilities. The Joint Requirements Oversight Council has endorsed the planning process and methodology of the JWSTP. Together, the JWSTP and the Defense Technology Area Plan ensure that the near- and mid-term needs of the joint warfighter are properly balanced and supported in the S&T planning, programming, budgeting, and assessment activities of the Department of Defense.9 While the technical areas outlined in the two plans are different, active participation by the service laboratories, the defense agencies, and the warfighters provides the requirements that drive the basic research areas. These requirements are evaluated in service S&T program reviews and in the Technology Area Reviews and Assessments of the Deputy Under Secretary of Defense (S&T).

Figure 1. Science and Technology Planning Process.

In the Technology Area Reviews and Assessments, representatives from academia, government, and industry evaluate programs based on their completeness, balance, relevance, and transition plans, and thus avoid unnecessary duplication with other DOD programs. The Technology Area Reviews and Assessments also compare the programs to guidance from the Director of Defense Research & Engineering in the Office of the Secretary of Defense, the Defense S&T Strategy, the Joint Warfighting S&T Plan, the Defense Technology Area Plans, and the Basic Research Plan. Particular emphasis is placed on the responsiveness of programs to the Defense Technology Objectives, which state what technology advancements are to be developed and demonstrated, by what fiscal year, for what specific benefit, solving what technical barrier, and for which service. As shown in Figure 1, the Science and Technology Planning Process is primarily used for the purpose of developing the program objective memorandums.10 One criticism of this process is that there are no effective criteria for evaluating these programs’ ability to fulfill joint warfighting requirements.11 There simply is no mechanism in place to evaluate whether funded research is actually fulfilling joint warfighting requirements until a technology is being fielded.

Defense Basic Research

Basic research is primarily concerned with the discovery of new fundamental knowledge and the expansion of understanding in a given area. Defense basic research is therefore primarily concerned with the discovery and development of fundamental knowledge and understanding to enable future technologies that benefit national defense capabilities. The character of defense basic research is thus distinguishable from other similar research more by the researcher and his or her motivation than by the actual research conducted.12 The Basic Research Plan presents the DOD objectives and investment strategy for DOD-sponsored basic research performed by universities, industry, and service laboratories. The plan supports the long-term research needs of the Department of Defense, presented in each of 12 technical disciplines: atmospheric and space sciences, materials science, biological sciences, mathematics, chemistry, mechanics, cognitive and neural science, ocean sciences, computer science, physics, electronics, and terrestrial sciences. While it is often difficult to delineate the boundary between basic research and applied research, basic research should enable many potential future applications and uses, whereas applied research seeks to fill gaps in knowledge toward a particular application.

Defense research is managed mainly by or through the Army Research Office, the Office of Naval Research, the Air Force Office of Scientific Research, and the Defense Advanced Research Projects Agency. Oversight of the entire basic research program is the responsibility of the Director for Basic Sciences in the Office of the Deputy Under Secretary of Defense for Laboratories and Basic Sciences, located in the Office of the Director of Defense Research and Engineering.13 While the DOD research, development, test, and evaluation (RDT&E) budget appropriation for FY03 was $57.0 billion, the amount budgeted for basic research was $1.417 billion, only 2.49 percent of the RDT&E total.14 This amount has remained nearly constant since 1985.15
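The budget share cited above follows directly from the two figures given; a minimal sketch of the arithmetic (FY03 figures from the text, formatting mine):

```python
# Illustrative check of the FY03 budget share cited in the text.
rdt_e_total_billions = 57.0      # FY03 RDT&E appropriation
basic_research_billions = 1.417  # FY03 basic research budget

share = basic_research_billions / rdt_e_total_billions
print(f"Basic research share of RDT&E: {share:.2%}")  # roughly 2.49%
```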

One can question whether this investment in basic research is being made wisely. Nearly 54 percent of this funding goes to universities with no direct accountability for fulfilling the requirements outlined in the Defense Technology Area Plan. Instead of being spent in an effort to meet the technological needs of the warfighter, much of this funding goes toward more altruistic goals, such as establishing collaborative research linking university professors and students with military laboratories; strengthening academic programs in science, mathematics, and engineering; encouraging students to pursue degrees and careers in science; providing equipment, scholarships, and work/study opportunities; helping universities improve their capacity to perform research of interest to DOD; and training students in scientific disciplines.16 However, according to Dr. Joseph Rocchio, Director of the Sensors and Electron Devices Directorate of the Army Research Laboratory, this funding is crucial in order to “buy access” to the smartest minds and get them interested in helping the Defense Department solve important problems.17

Within academia, the peer review of proposals has long assured the matching of funding to researchers with the best ideas. Defense basic research is also carried out in a similar competitive process, by having individual researchers or research consortia submit proposals to receive funding in the form of research awards, education grants, equipment grants, and technical assistance grants. The Multidisciplinary University Research Initiative (MURI) program is the principal means of obtaining DOD funding for basic research. While peer review goes a long way toward ensuring quality in the allocation of funds from federal agencies to individual research projects, it normally occurs at the start of the funding stream, with few subsequent checks on the quality of the research outputs.

If basic research were a business, the efficient allocation of resources would be a relatively straightforward matter. Resources would go toward the efforts that demonstrated the highest productivity, as calculated by some output metric. But measuring research outputs and the productivity of basic research is highly problematic; it has proved a troublesome issue for businesses as well.18 Basic research cannot easily be made deterministic, so it is often difficult to know if a project will be successful or proceed in the originally proposed direction. Presently there is no widely accepted way for the federal government, in conjunction with the scientific community, to make priority decisions about the allocation of resources in and across scientific disciplines.19 While metrics such as the number and quality of peer-reviewed publications, citations, graduate students, research awards, and the level of external funding are indicators of a vibrant research program, they do not necessarily show how the needs of the warfighter are being met. Without meaningful and practical output measures, the system of peer-reviewed individual research grants and institutional grants simply invests in the infrastructure and salaries necessary for researchers to do their work. The scientific work that proceeds from these investments should therefore meet some metric to ensure that the joint warfighting capabilities of the future are being developed. Without some individual or institutional accountability of university researchers to the Technology Area Reviews and Assessments process, the allocation of funds through peer-reviewed grants will not meet all the needs of our defense basic research program. This is evidenced by the fact that from FY97 to FY02, 181 MURI projects were funded, and none of them has transitioned technology to the warfighting force.20

A Cooperative Research and Development Agreement (CRADA) is another way industry and universities partner with DOD to conduct specific R&D activities. Any state or local government, commercial firm, public or private foundation, or non-profit organization can enter into a CRADA with the Department of Defense. These agreements are not considered procurement contracts, grants, or cooperative agreements. A CRADA is a written agreement between one or more DOD laboratories or technical activities and one or more non-federal entities. The parties entering into a CRADA primarily exchange intellectual property, expertise, and data. However, they may also exchange the use of personnel, services, materials, equipment, and facilities. The DOD also can provide personnel, facilities, equipment, or other resources, with or without reimbursement. Non-federal partners can provide funds, people, services, facilities, equipment, or other resources. DOD participants can accept funding from a CRADA partner to perform research or development that benefits the partner, but no DOD funds can flow to the CRADA partner. The rights to inventions and other intellectual property are flexible and are negotiated as a part of the agreement.21

An additional issue is the practice of congressional earmarking. Public funding for defense basic research often becomes a political football due to the large institutional and regional economic stakes. In a recent survey, the National Academy of Sciences highlighted the dramatic growth in the number and size of earmarks for academic research. Over the past decade, congressional earmarks for academic institutions to conduct defense basic research increased in value from tens of millions to hundreds of millions of dollars.22 Examples are the six recent congressionally directed medical research programs signed into law by President Bush as part of the FY 2004 Defense Appropriations Act. These programs earmark nearly $273 million for research in the fields of breast cancer, prostate cancer, neurofibromatosis, ovarian cancer, leukemia, and tuberous sclerosis.23 While these programs pursue worthwhile goals, none of them serves the Defense Technology Objectives or the Joint Warfighting Capability Objectives, and in no way do they serve the warfighter. In this regard the practice of congressional earmarking is the least productive use of research funds. Congressionally earmarked funds generally place narrow constituent interests over scientific merit. The promise or threat to remove funding is often subsequently used to influence or change the character of a project. Additionally, these efforts often bypass the primary mechanism for allocating federal basic research funds—the competitive, peer-review process. Without a means of determining merit or need, congressional earmarking for defense basic research further removes the researcher from any obligation to meet the technological needs of the joint force. Since congressional earmarks will no doubt continue, it is the responsibility of policymakers to ensure that necessary investments in defense basic research and institutional grants proceed on the basis of scientific merit and in the larger context of national needs and priorities.

While there is a need for public investments in university infrastructure and large-scale projects, the nature and size of defense research make the funding of universities inappropriate. Federal obligations for basic research from the Department of Defense are much smaller than those of the National Institutes of Health, the Department of Energy, the National Science Foundation, or NASA.

During the 1970s, industry recognized that university-centric research was too cumbersome and transformed its research efforts into something called “industrial-strength basic research.”24 In this construct, research is pursued within large interdisciplinary teams with impressive infrastructure support. In a recent interview, James C. McGroddy, who retired from IBM in 1996 as senior vice president for research, stated that “industry can gain great benefits from research if it’s managed right.” Research, he argued, “cannot be performed in a monastery on a hill.” When research is properly managed, “it attracts the best people, it moves basic science to invention to new technologies, which garner key patents, and the company also gains key insights into the future.”25 Teams working in a single corporate setting, with powerful capital tools and objective-driven management, have demonstrated that they can tackle big projects, often more successfully than distinguished but dispersed academic consortia. Industrial-strength fundamental research in biotechnology has been the most recent proving ground of this type of research and has generated revolutionary changes in short periods of time.26 Yet this concept is nothing new—it is similar to the concept of the Manhattan Project, which created the atomic bomb while making great strides in the field of high-energy physics.

DOD Laboratories

Vannevar Bush’s vision of publicly funded research was primarily designed to maintain the high level of scientific and intellectual capital created during World War II and apply it toward “practical purposes.” An educated work force and universities staffed with capable researchers would create a scientific strategic reserve, allowing the nation to surge in times of future war. However, Bush also recognized that the technological margin of success enjoyed by the Allies during the war was dangerously thin and that there was a continuing need for research to support national security. He felt that this research would best be orchestrated through “a civilian-controlled organization with close liaison with the Army and Navy, but with funds direct from Congress.”27 In addition to conducting research on its own, an organization such as this would be necessary to evaluate new technical opportunities regardless of their source, since some breakthroughs are bound to occur elsewhere. Today this “organization” is realized through the 700 laboratories and research centers known as the Federated Laboratory System.

Over the past 30 years, there have been a hundred major studies on the health of the government science and technology laboratories. Each of these reports has endorsed the requirement for world-class in-house service laboratories and has stated that these service laboratories are an essential component of the warfighting machine of the United States. However, all of these studies state unequivocally that our defense laboratories have been left in a state of severe crisis. The two most recent studies of our service laboratories are particularly damning.28 These reports state that the service laboratories are so poorly funded and managed that “unless they receive help soon at the service, Office of the Secretary of Defense (OSD), and congressional levels they will no longer be able to recruit and retain the high quality, dedicated scientists and engineers required to perform the research necessary to preserve our military’s technological superiority.”29

John H. Hopps, Jr., the Deputy Director of Defense Research and Engineering and a Deputy Under Secretary of Defense, has stated that our “defense laboratories should have the same attributes as our transformed uniformed military forces.” While the DOD is transforming to build modular joint forces with the attributes of speed, agility, lethality, and knowledge, the service laboratories need to transform with the parallel attributes of “productivity; responsiveness and adaptability; relevance, programming, and execution; generation and application; and perpetuation of knowledge.” Hopps argues that this transformation should lead to a greater investment in breakthrough activities and increase the reach of the defense labs into university basic research programs.30

It is crucial that the focus on defense-unique technologies be continued. If the character of defense basic research is truly defined more by the motivation of the investigator, then this form of research is best accomplished in service laboratories and not in universities or industry. A report by the Naval Research Advisory Committee argues that industry will pursue only high-profit major weapon systems, but “the laboratories are crucial to address high-risk, low-volume Science and Technology (S&T) projects.”31 These projects are often not profitable enough for industry to take on, or are classified in nature and therefore avoided by universities. However, like the atomic clocks pursued by the US Naval Observatory that enabled the development of the Global Positioning System, they are critical to the successful fielding of defense-related enabling technologies. A vibrant system of service laboratories is needed not only to conduct research of its own, but also to provide in-house technical experts who can advise acquisition program managers on the technical feasibility and affordability of commercial off-the-shelf or proposed outsourced solutions.

In the Air Force’s “Science and Technology Workforce for the 21st Century” report, the senior steering group charged with investigating the health of the service laboratories outlined the ideal state of a defense science and technology laboratory. According to this report, an ideal defense laboratory is ultimately measured by outcomes that demonstrate its contributing value to its service. These outcomes are:

These outcomes simply cannot be duplicated within the construct of peer-reviewed research at a university.

Transforming Defense Basic Research

While the Defense Department struggles to transform its own research infrastructure and strategy, the National Institutes of Health (NIH) is attempting to do the same in order to make better use of its basic research budget, which in FY 2003 was nearly $13 billion. The National Academy of Sciences was recently commissioned by the National Institutes of Health to study and make recommendations on changes to its basic research funding strategy.33 While NIH research is primarily focused on the biomedical sciences, the agency’s funding strategy is similar to DOD’s. Like the Department of Defense, NIH relies heavily on peer-reviewed extramural and intramural research to solve problems requiring a discovery system of inquiry. Several of the recommendations made by the National Academy of Sciences study committee could also certainly apply to the Department of Defense.

The most fundamental recommendation, yet the most difficult to implement, is the establishment of a set of metrics to assess the technical and scientific output of each project. An additional recommendation is that project assessments should be made periodically by external, independent, peer-review panels and should include scientists from academia, government, and industry. This evaluation should include an assessment of benefit to “the field.”34

This sounds very similar to the Technology Area Reviews and Assessments process discussed earlier. Although that process does not evaluate the research itself, it establishes an advisory group for each Defense Technology Objective or Joint Warfighting Capability Objective to make the necessary evaluations of funded research. Each DOD advisory group provides the necessary expertise to the Under Secretary of Defense (Acquisition, Technology, and Logistics); the Director, Defense Research and Engineering; the Deputy Under Secretary of Defense (Science and Technology); the Director, Defense Advanced Research Projects Agency; and the military departments in order to develop a research investment strategy. All research in support of the Department of Defense receives some form of periodic review, generally biannually, from a panel formed by the awarding agency or DOD advisory group. Researchers also must submit annual progress reports on their funded projects. These project reviews are then used to prepare the agency and project reviews at the Technology Area Reviews and Assessments. In both forums the researchers report on the extent of their efforts in terms of the published metrics.

This brings us back to the question of which metrics should be used to measure the effectiveness of basic research. The Government Performance and Results Act of 1993 called for federal agencies to develop, by the end of fiscal year 1997, multi-year strategic plans and metrics for assessing progress toward agency goals.35 For research funding agencies like the Defense Advanced Research Projects Agency or the Army Research Office, these metrics include a list of papers submitted or published during the reporting period, demographic data (the number of scientists or students supported), a report of inventions, a description of any significant theoretical or experimental advances, and the amount of “technology transfer.” In this context, the Army Research Office defines technology transfer as “any specific interactions or developments which would constitute technology transfer of the research results. Examples include patents, initiation of a start-up company based on research results, interactions with industry/Army R&D Laboratories, or transfer of information which might impact the development of products.”36 The first four metrics are attractive to program managers and review panels because they are easy to enumerate and lend themselves to statistical analysis. While metrics such as these may indicate the size and health of a research program, they are essentially irrelevant to meeting the technology needs of the Department of Defense.
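To make these reporting categories concrete, here is a minimal, purely illustrative sketch of how an annual progress report might be structured around the metrics named above; the class and field names are my own invention, not the format of the Army Research Office’s actual reporting form:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AnnualProgressReport:
    """Illustrative container for the reporting metrics named in the text."""
    papers: List[str] = field(default_factory=list)        # papers submitted or published this period
    scientists_supported: int = 0                           # demographic data
    students_supported: int = 0
    inventions: List[str] = field(default_factory=list)     # report of inventions
    significant_advances: List[str] = field(default_factory=list)  # theoretical or experimental advances
    technology_transfer: List[str] = field(default_factory=list)   # patents, start-ups, industry/lab interactions

    def shows_transition(self) -> bool:
        """True only if something is recorded in the one category tied to fielding technology."""
        return bool(self.technology_transfer)
```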

The Office of Management and Budget (OMB) under the Clinton Administration and the current Bush Administration has tried to improve the management of basic research programs across the federal government by reinforcing or adopting best management practices rather than focusing on trying to predict the outcome of research. OMB has proposed using “Quality, Relevance, and Performance” as the guiding investment criteria for basic research programs. The intent of these initiatives is to bring more precise information related to program performance to bear on future resource allocation decisions. In order to measure the quality of a research program, agencies are required to periodically examine their projects for scientific and technical excellence by benchmarking them relative to other programs, other agencies, and other countries. To demonstrate relevance, research programs, including unsolicited programs, must identify and prioritize individual research goals and demonstrate the linkages back to national initiatives or overall relevant research goals. A program’s performance is then evaluated by setting and meeting a series of high-priority, multi-year research objectives.37 It is therefore essential that the Department of Defense require all its research programs to establish clear but flexible plans with well-defined milestones that are linked to specific Defense Technology Objectives or Joint Warfighting Capability Objectives.

The US Army recently has taken an approach to managing extramural research that differs from those discussed above. One of the Army’s main efforts has been to attract the best and brightest to work at solving the Army’s problems through the establishment of University Affiliated Research Centers and Collaborative Technology Alliances. There are currently four DOD-approved centers and five Collaborative Technology Alliances that are partnerships between academia, government, and industry. These University Affiliated Research Centers hope to combine the ability of universities to produce cutting-edge research, the expertise of industry to manufacture technology, and the knowledge of government scientists to guide the research efforts in a manner that meets the needs of the warfighter.38 The four centers encompass the areas of nanotechnology, advanced simulations, biotechnology, and electrodynamics, while the Collaborative Technology Alliances encompass the areas of advanced sensors, power and energy, advanced decision architectures, communications and networks, and robotics.

The financial commitment from the government for each University Affiliated Research Center is $50 million over five years, and for each Collaborative Technology Alliance is approximately $35 million over five years. Each of these programs uses some form of a Research Management Board with participation from other Army organizations, other services, and other government agencies. While the Collaborative Technology Alliances are managed by a senior Army Research Laboratory representative designated as the Collaborative Alliance Manager, the University Affiliated Research Centers are managed by the university partner. As an exception, the Institute for Soldier Nanotechnology, established at the Massachusetts Institute of Technology in 2003, has an Army Acquisition Corps liaison officer and several Army Research Laboratory researchers on campus. While the Army is thus actively leveraging the facilities and resources of academia and industry to support its own internal research efforts, these programs are too recent to determine their impact on future warfighting technologies.
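Taken together, the per-program figures cited above imply a combined government commitment on the order of $375 million over five years across the four centers and five alliances; the following is a minimal sketch of that arithmetic (per-program amounts from the text, the totals are an extrapolation):

```python
# Illustrative totals based on the per-program figures cited in the text.
uarc_each_millions = 50.0  # per University Affiliated Research Center, over five years
cta_each_millions = 35.0   # per Collaborative Technology Alliance, over five years (approximate)
num_uarcs, num_ctas = 4, 5

total = num_uarcs * uarc_each_millions + num_ctas * cta_each_millions
print(f"Approximate five-year commitment: ${total:.0f} million (~${total / 5:.0f} million per year)")
```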

Conclusions

In 1945 Vannevar Bush established a vision of publicly funded research in which he urged the scientists mobilized to fight World War II to turn their efforts toward solving “the needs and desires of man” once the fighting had ceased.39 As a result of implementing his vision, and largely with public funding, research universities in the United States have become the envy of the world, but they have done so at the expense of funding for our service laboratories.

However, Dr. Bush clearly recognized the continued need for focused research to support national security. With a basic research budget less than half that of the National Science Foundation and a mere fraction of the budget of the National Institutes of Health, the Department of Defense cannot afford both to pursue lofty science education goals and to satisfy the Defense Technology Objectives and Joint Warfighting Capability Objectives necessary to meet the needs of future warfighting. Additionally, no single approach to funding basic research will be able to satisfy the tremendous technology needs of the future force. A combination of closely managed extramural and intramural research efforts is needed to solve the immense technological challenges of the future. Setting broad priorities for basic research is the domain of policymakers in Congress and the Administration, but it should be the result of informed policy debate. The Department of Defense probably will continue to fund public universities in order to maintain a strong scientific research base, but it should recognize that the impact of such funding on providing capabilities to the warfighter is minimal without specific mechanisms to ensure overall quality.40

The new approaches of establishing collaborative venues and centers of excellence incorporating elements of the service laboratories, industry, and university researchers are the key to achieving a successful and rapid transition of scientific knowledge into fielded technology. Situating these centers in a university setting allows the scientific field to determine the quality of the research through the peer-review process, freeing the Department of Defense to focus on guiding the scope of the research in pursuit of developing defense-specific technologies. In light of OMB initiatives and the Government Performance and Results Act of 1993, the DOD should restrict research program metrics to those that are linked to well-defined milestones in support of Defense Technology Objectives or Joint Warfighting Capability Objectives. Not only will this allow program managers to monitor and assess the progress of the research, but it will also allow a program to be phased out once its stated ends are met, or eliminated if the research effort falls short of expectations.

The ability of the Defense Department to leverage research in our universities and industrial base is predicated on using government scientists to shape the basic research into key warfighting technologies. This assumption is valid only if we have strong DOD laboratories that attract world-class scientists. However, our defense laboratories are in a state of severe crisis. An approach worth considering is to eliminate or minimize the funding of basic research at universities in order to build world-class defense laboratory facilities using the Government-Owned/Contractor-Operated (GOCO) model used by both NASA and the Department of Energy. Laboratories like Sandia, Los Alamos, the Jet Propulsion Laboratory, and Lawrence Livermore are world-renowned for their contributions to the scientific field as well as to their respective agencies. In each of these laboratories, the agency has contracted a university to manage the facility and has made it accountable for research goals. Research is conducted by government personnel, university professors, graduate students, and contract personnel. To attract new research ideas, these agencies provide small travel grants for collaborative groups to use the facility with the assistance of permanent staff researchers.

The Department of Defense could follow the same approach with its service laboratories by contracting their management to universities or combining them into a Joint Research Laboratory under a single university’s management. Using this model, the Defense Department could have the best of both worlds by sponsoring research that is accountable for meeting stated Defense Technology Objectives and that also serves more altruistic goals such as encouraging students in scientific disciplines. At any rate, it is clear that the Deputy Under Secretary of Defense (Science and Technology) needs to take immediate action to reverse the funding and management trends at the service laboratories in order to recruit and retain the high-quality, dedicated scientists and engineers necessary to conduct and manage cutting-edge research.


NOTES

1. US Department of Defense, Quadrennial Defense Review Report (Washington: Department of Defense, 30 September 2001), p. 6.

2. US Department of Defense, Director of Defense Research & Engineering, Basic Research Plan (Washington: Department of Defense, February 2003), p. I-1.

3. Vannevar Bush, Science the Endless Frontier (40th Anniversary Edition) (Washington: National Science Foundation, 1990).

4. National Science Board, Government Funding of Scientific Research: A Working Paper of the National Science Board, NSB-97-186 (Washington: GPO, 1997).

5. “Global Positioning System (GPS),” http://samadhi.jpl.nasa.gov/msl/Programs/gps.html.

6. US Department of Defense, Deputy Under Secretary of Defense, Science and Technology, Defense Science and Technology Strategy (Washington: Department of Defense, May 2000), p. 1.


7. US Department of Defense, Deputy Under Secretary of Defense, Science and Technology, Defense Science and Technology: Reliance (Washington: Department of Defense, March 2001), p. vi.

8. US Department of Defense, Deputy Under Secretary of Defense, Science and Technology, Defense Technology Area Plan (Washington: Department of Defense, February 2003), p. ES-4.

9. US Department of Defense, Deputy Under Secretary of Defense, Science and Technology, Joint Warfighting Science and Technology Plan (Washington: Department of Defense, February 2003), p. I-4.

10. US Department of Defense, Defense Technology Area Plan, p. ES-2.

11. “Industry R&D Coalition Critique of February 2002 DDR&E S&T Plan,” letter from Dick Engwall, RLEngwall & Associates, Chairman of Industry R&D Coalition, to Robert Baker, Deputy Program Director, 5 August 2002, http://www.dodmantech.com/pubs/pubs.shtml.

12. US Department of Defense, Deputy Under Secretary of Defense, Science and Technology, Defense Basic Research Plan (Washington: Department of Defense, February 2003), p. III-1.

13. Ibid., p. III-4.

14. Ibid., p. IV-1.

15. National Science Foundation, Division of Science Resources Statistics, Survey of Federal Funds for Research and Development: Fiscal Years 2000, 2001, and 2002, http://www.nsf.gov/sbe/srs/nsf02321/sectc.htm.

16. US Department of Defense, Defense Basic Research Plan, p. III-2.

17. Dr. Joseph Rocchio, Director, Sensors and Electronic Devices Directorate, US Army Research Laboratory, interview by author, 30 December 2003, Adelphi, Md.

18. Committee for Economic Development, America’s Basic Research: Prosperity Through Discovery (New York: Committee for Economic Development, 1998), pp. 32-47.

19. National Research Council, Committee on Criteria for Federal Support of Research and Development, Allocating Federal Funds for Science and Technology (Washington: National Academy Press, 1995).

20. Dr. Larry C. Russell, Jr., “Re: MURI Projects: Technology Transition?” e-mail message to author, 19 December 2003.

21. US Department of Defense, Office of the Under Secretary of Defense (Acquisition, Technology, and Logistics), Managers Guide to Technology Transition in an Evolutionary Acquisition Environment, Defense Procurement and Acquisition Policy, Version 1.0 (Washington: Department of Defense, 31 January 2003), pp. 2-10 to 2-11.

22. National Research Council, Committee on Criteria for Federal Support of Research and Development.

23. US Department of Defense, Congressionally Directed Medical Research Programs, http://cdmrp.army.mil.

24. Donald Kennedy, “Industry and Academia in Transition,” Science, 21 November 2003, p. 1293.

25. Madeleine Jacobs, “Whither Long-Term Industry Research?: A Shift in Support for Basic Research in Industry Raises Questions about the U.S.’s Ability to Compete,” C&EN, 5 May 2003, pp. 37-39.

26. Kennedy.

27. Ibid.

28. US Air Force, Office of the Chief Scientist of the Air Force, “Science and Technology Workforce for the 21st Century,” Washington, D.C., July 1999; US Navy, Office of the Assistant Secretary of the Navy (Research, Development, and Acquisition), Naval Research Advisory Committee, “Science and Technology (S&T) Community in Crisis,” Washington, D.C., May 2002.

29. US Navy, “Science and Technology (S&T) Community in Crisis,” p. 2.

30. Jacobs.

31. US Navy, “Science and Technology (S&T) Community in Crisis,” p. 1.

32. Ibid., p. 18.

33. Institute of Medicine, National Research Council, Large-Scale Biomedical Science: Exploring Strategies for Future Research (Washington: The National Academy Press, 2003).

34. Ibid., p. 195.

35. US Congress, Government Performance Results Act of 1993, 5 January 1993, http://www.whitehouse.gov/omb/mgmt-gpra/gplaw2m.html.

36. US Army Research Laboratory, Army Research Office, “Reporting Instructions,” ARO Form 18, July 2003, http://www.arl.army.mil/aro/forms/arofm18/form18_2004.pdf.

37. National Academy of Sciences, Committee on Science, Engineering, and Public Policy, Evaluating Federal Research Programs: Research and the Government Performance and Results Act (Washington: National Academy Press, 1999).

38. John A. Parmentola, “University Affiliated Research Centers,” Army AL&T, November-December 2003, pp. 30-32.

39. Vannevar Bush, “As We May Think,” The Atlantic Monthly, July 1945, pp. 101-08.

40. Committee for Economic Development, America’s Basic Research: Prosperity Through Discovery, pp. 32-47.


Lieutenant Colonel Augustus Way Fountain III is an Academy Professor in the Department of Chemistry and Life Science at the US Military Academy. He is a graduate of Stetson University and received the M.S. and Ph.D. degrees in analytical chemistry from Florida State University. He is also a graduate of the US Army Command and General Staff College and the US Army War College. LTC Fountain served as the Battalion Chemical Officer to the 1st Battalion, 75th Ranger Regiment during Operation Just Cause in Panama and as the Regimental Chemical Officer to the 504th Parachute Infantry Regiment throughout the Persian Gulf War. He has conducted funded research for the Department of Energy, the Defense Advanced Research Projects Agency, and other agencies in the areas of remote optical sensing, UV-laser carbonization of polymer surfaces, and laser spectroscopy.

