"If the United States is to continue as the world leader in basic research, its scientists and engineers must have access to the most powerful computers."
Terascale Computing in the PACI Program
The Terascale Computing System continues a history of National Science Foundation support for high-performance computing that began with the Supercomputer Centers program established in 1985. The Partnerships for Advanced Computational Infrastructure (PACI) program replaced it in FY 1998. PACI adds emphasis on coupling computational and computer science in order to more effectively exploit the emerging capabilities of scalable-parallel systems, high-performance networking, and high-bandwidth, large-capacity mass-storage systems.
Within this environment, simulation and modeling for a vast array of scientific and engineering problems have led to truly revolutionary insights over the last decade, and there is every sign that progress is accelerating. Due to increased computer capability, computational science is experiencing a revolution in its ability to solve new research problems. The recent demonstration of computers with speeds of a teraflop (10^12 floating-point operations per second) or more has directed attention to important fundamental science and engineering problems that are not amenable to solution on current systems but would become accessible with computation in the terascale range.
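The scale jump implied by a teraflop system can be put in rough quantitative terms. The sketch below is a back-of-the-envelope illustration only; the operation count and the function are illustrative assumptions, not figures from this article:

```python
# Back-of-the-envelope runtime estimate: how long a fixed budget of
# floating-point operations takes at a given sustained speed.
# Purely illustrative; real codes rarely sustain peak rates.

def runtime_seconds(total_ops: float, flops: float) -> float:
    """Seconds needed to execute total_ops at a sustained rate of flops."""
    return total_ops / flops

GIGAFLOP = 1e9   # 10^9 floating-point operations per second
TERAFLOP = 1e12  # 10^12 floating-point operations per second

# A hypothetical simulation requiring 10^17 operations:
ops = 1e17
print(runtime_seconds(ops, GIGAFLOP) / 86400)  # days on a gigaflop system
print(runtime_seconds(ops, TERAFLOP) / 3600)   # hours at a sustained teraflop
```

Under these assumed numbers, a computation that would occupy a gigaflop-class machine for years finishes in about a day at a sustained teraflop, which is the sense in which previously intractable problems become accessible.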
The Planning Process
In early 1998, NSF began planning to upgrade the availability of very high-end computing resources for the academic science and engineering research community. As the NSF strategic plan states, part of our mission is investing in Ideas that provide a deep and broad fundamental science and engineering knowledge base. A principal means of enabling those Ideas is to make Tools available for wide access to state-of-the-art science and engineering infrastructure, e.g., support for user facilities in many fields. One of the most widely used research tools to emerge over the past decade is high-end computational capability; this broad use of computers across many different areas is referred to as "computational science." Consequently, such activities in computer and information science and engineering have received high priority in NSF planning and budget development.
A workshop at NSF in May 1998 (one of a series that examined various questions relating to terascale computing) identified numerous important computational applications that could take advantage of significantly increased computing power. This is true for traditional users of computing in physics, chemistry (the 1998 Nobel Prize in Chemistry was awarded for computational chemistry), geosciences, and engineering, as well as in disciplines such as biology, where computing is still emerging as a critical new tool. New application domains such as economics, sociology, and even history are ripe for exploration. These points were reinforced at a joint NSF/DOE National Workshop on Advanced Scientific Computation hosted at the National Academy of Sciences in July 1998.
The PITAC Report
The President's Information Technology Advisory Committee (PITAC) also considered High-End Computing during its deliberations about the current state and future directions of the Nation's Information Technology. PITAC's final report, Information Technology Research: Investing in Our Future, released on Feb. 24, 1999, found that:
The report goes on to state that:
At the National Science Foundation, we expect to see fascinating new science and engineering as a result of the TCS I machine at PSC. The system will be fully integrated as a leading-edge computing resource in the fabric of the PACI program, and we have every confidence that PSC will splendidly fulfill its traditional role of supporting the very high-end users.