In Progress, 2005
Nanotechnology — a world of very small things (nano, Greek for dwarf) with big potential to affect our lives. Basically, nanotechnology means the ability to design materials at the scale of one to 100 nanometers. How small is this? Imagine a human hair sliced into 10,000 lengthwise slivers. The diameter of each would be about one nanometer — a billionth of a meter.
Starting in the 1980s, new abilities to visualize and synthesize materials at the scale of individual atoms opened up possibilities for many applications, including supersmall, superfast electronics and molecule-sized particles to deliver insulin. Because the atomic structure of molecules governs a material’s properties, nanotechnology creates the potential to craft materials atom-by-atom according to the properties — such as hardness, strength, electrical conductivity — desired. To take advantage of this potential, however, requires precise understanding of atom-to-atom interactions.
The ability to predict these interactions depends on quantum-theory computations, but even the simplest nanostructures involve thousands of atoms, imposing huge computational demands. One approach is highly efficient software called LSMS (locally self-consistent multiple scattering method). When implemented at PSC to take advantage of massively parallel processing on the Cray T3E, LSMS became the first research software to achieve sustained performance over one teraflop (a trillion calculations per second). For this accomplishment, a team from Oak Ridge National Lab, the National Energy Research Scientific Computing Center, the University of Bristol (UK) and PSC won the 1998 Gordon Bell Prize, given for the top achievement in high-performance computing.
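LSMS owes its efficiency to locality: each atom's electronic scattering is solved within a finite neighborhood of surrounding atoms rather than across the whole system, so the cost grows roughly linearly with atom count instead of as its cube. The rough comparison below is only an illustration of that scaling argument; the zone size and operation counts are assumed, illustrative numbers, not figures from this work.

```python
# Illustrative comparison of how quantum-calculation cost grows with
# system size. A conventional dense method scales roughly as N^3; a
# linear-scaling local approach scales roughly as N * M^3, where M is
# the (fixed) number of atoms in each local interaction zone.
# M = 50 is an assumed value for illustration only.

def cubic_cost(n_atoms):
    """Operation count for a dense O(N^3) method (arbitrary units)."""
    return n_atoms ** 3

def local_cost(n_atoms, zone_size=50):
    """Operation count for a linear-scaling local method (arbitrary units)."""
    return n_atoms * zone_size ** 3

for n in (1_000, 16_000):
    ratio = cubic_cost(n) / local_cost(n)
    print(f"{n:>6} atoms: dense {cubic_cost(n):.2e}, "
          f"local {local_cost(n):.2e}, ratio {ratio:,.0f}x")
```

At 16,000 atoms — the size of the iron-nanoparticle system described below — the cubic method's cost outruns the local method's by a factor in the thousands under these assumptions, which is why locality matters at the nanoscale.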
In recent work, PSC senior scientist Yang Wang and colleagues at Oak Ridge and Florida Atlantic University used LSMS and Big Ben, PSC’s Cray XT3, to calculate electronic and magnetic structure of an iron nanoparticle five nanometers in diameter. Composed of 4,409 atoms, the nanoparticle was embedded in an iron-aluminide (FeAl) matrix of 11,591 atoms — for a total of 16,000 atoms. A cross-sectional slice from the simulation shows charge distribution on the atoms (blue-positive to red-negative). In the nanoparticle itself, neutral Fe (green) is bounded by Fe atoms (yellow & light blue) that lose electrons. Other boundary Fe atoms gain electrons to become more negative (red). In the matrix, Al atoms (blue) lose electrons to Fe atoms (orange). These are the first quantum-based calculations of a physical system several nanometers in scale.
What would happen if many families in the largest U.S. cities received vouchers they could use to send their children to private school? This issue has been debated for many years, sometimes with considerable heat. There are many uncertainties. How might vouchers affect where families choose to live, thereby shifting populations, property values and the tax base of public-school funding? What would happen to the quality of public schools?
To bring better understanding to these and other uncertainties, Maria Ferreyra, assistant professor of economics at Carnegie Mellon’s Tepper School of Business, turned to supercomputing. She created a “general equilibrium” model — a kind of economic model that accounts for a broad range of factors affecting individual decisions. Ferreyra’s model went significantly beyond previous work in this area, incorporating a number of factors — such as idiosyncratic taste for location and school choice, household religious preference, and Catholic and non-Catholic private schools — that give a deeper connection to the reality of the school-choice process.
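The flavor of such a model can be sketched as a discrete choice: each household weighs school quality, housing cost, tuition, religious preference and its own idiosyncratic tastes across (location, school) options and picks the one with the highest utility. The toy below echoes the factors the article lists, but every number and the functional form are illustrative assumptions, not Ferreyra's actual specification.

```python
import random

# Toy discrete-choice sketch of one household picking a (location, school)
# option. Utility combines school quality, housing price, net tuition, a
# religious-match bonus, and an idiosyncratic taste shock. All values are
# illustrative assumptions, not Ferreyra's model.

# (name, school quality, housing price, tuition, Catholic school?)
OPTIONS = [
    ("city, public school",    0.5, 1.0, 0.0, False),
    ("city, Catholic school",  0.7, 1.0, 0.4, True),
    ("suburb, public school",  0.8, 1.5, 0.0, False),
    ("suburb, private school", 0.9, 1.5, 0.5, False),
]

def choose(household_is_catholic, voucher=0.0, shock_sd=0.1):
    """Return the option with the highest utility for this household."""
    best_name, best_u = None, float("-inf")
    for name, quality, price, tuition, catholic in OPTIONS:
        net_tuition = max(tuition - voucher, 0.0) if tuition > 0 else 0.0
        bonus = 0.2 if (catholic and household_is_catholic) else 0.0
        shock = random.gauss(0.0, shock_sd)  # idiosyncratic taste
        u = quality - 0.5 * price - net_tuition + bonus + shock
        if u > best_u:
            best_name, best_u = name, u
    return best_name

# With taste shocks switched off, a voucher shifts this Catholic household
# from a pricier suburb with good public schools to a cheaper city
# neighborhood with a private school.
print(choose(household_is_catholic=True, voucher=0.0, shock_sd=0.0))
# → suburb, public school
print(choose(household_is_catholic=True, voucher=0.4, shock_sd=0.0))
# → city, Catholic school
```

Even in this toy, the voucher changes both the school and the neighborhood chosen — the same coupling between school choice and household location that makes the full equilibrium model necessary.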
The first part of Ferreyra’s task was to develop a model without vouchers that would reproduce the reality of location and school choice in U.S. metropolitan areas. Using census data, she focused on New York, Chicago, Philadelphia, Detroit, Boston, St. Louis and Pittsburgh. Because the model needed to be run thousands of times, this was a supercomputing-scale project. She turned to PSC’s pool of 20 Intel Pentium 4s (2.4 GHz, 512 MB RAM). “The amount of computing involved was enormous,” says Ferreyra. “This required about three weeks of computing on the PSC Condor pool and would have taken well over a year on a single desktop machine. I couldn’t have done this without supercomputing.”
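Those thousands of model runs are independent of one another, which is what made the Condor pool a good fit: each machine evaluates the model for one candidate parameter vector, and the results are compared at the end. The sketch below shows that embarrassingly parallel pattern with a local process pool standing in for Condor; `model_fit` is a hypothetical stand-in for the real equilibrium computation, not Ferreyra's objective function.

```python
from concurrent.futures import ProcessPoolExecutor

# Embarrassingly parallel estimation sketch: farm out independent model
# evaluations, one per candidate parameter vector, then keep the best fit.
# Condor distributed the real runs across a pool of machines; a local
# process pool stands in here. model_fit is a hypothetical stand-in for
# the equilibrium computation.

def model_fit(params):
    """Pretend objective: distance between model output and data."""
    a, b = params
    return (a - 0.3) ** 2 + (b - 1.2) ** 2

def best_params(candidates):
    """Evaluate all candidates in parallel; return the best-fitting one."""
    with ProcessPoolExecutor() as pool:
        fits = list(pool.map(model_fit, candidates))
    return min(zip(fits, candidates))[1]

if __name__ == "__main__":
    # A coarse grid of candidate parameter vectors.
    grid = [(a / 10, b / 10) for a in range(11) for b in range(21)]
    print(best_params(grid))  # → (0.3, 1.2)
```

In the real estimation each evaluation is expensive — solving for an equilibrium across households, neighborhoods and schools — which is why three weeks on twenty machines replaced more than a year on one.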
After this process of “estimating” the model, Ferreyra simulated two different large-scale voucher programs in Chicago: one with universal vouchers and another with vouchers restricted to non-sectarian schools. This graphic of the Chicago area indicates boundaries between census tracts (fine lines) and school districts (colors). For the model estimation and predictions, Ferreyra aggregated school districts into pseudo-districts (thick black lines) with neighborhoods of roughly the same number of housing units. The model predictions were surprising in several respects. Because vouchers allow more latitude to choose schools independent of household location, many households locate in less expensive neighborhoods than they would without vouchers and send their children to private schools. There is much movement, and many private schools open in both the inner city and suburbs. Whether public schools improve depends on where they are located, with perhaps unanticipated improvement occurring in neighborhoods of relatively low-cost housing.