
N+N Workshop in High Performance Computing for Biomolecules and Materials

April 15-16, 2004
Holiday Inn Capitol, Washington, DC

>> See the talks that were given and a list of attendees.

Objectives

The workshop built on the outcomes of the previous workshop and consolidated existing links. It also reviewed the strengths and weaknesses of the research programs in which the participants are involved. Future research aims were identified and the possibilities of cross-over collaborations, both between the UK and US and between the Biomolecules and Materials research communities, were explored. More specifically, the workshop provided the opportunity to discuss current and future needs within these areas, especially with respect to existing HPC machines and data archives, Grid-based applications, and next-generation HPC hardware and software.

The objectives can thus be summarised as:

  • To review and compare ongoing programs of research, defining strengths, weaknesses and barriers.
  • To identify priority areas for future collaborations between members of the UK and US Biomolecules and Materials simulation communities.
  • To identify future requirements for HPC hardware and software.
  • To identify potential projects for international grid computing.

Summary of Final Session & Conclusions

The final session was essentially a round-table airing of views relating to issues affecting the HPC materials & bio communities. Broadly speaking, the issues raised fell into three areas: opportunities, needs, and funding mechanisms.

The following is a collection (without attribution) of the various suggestions, comments, & observations from the final session.

Opportunities

Education

Two clear action items emerged from the discussion. The first was that the NSF and EPSRC/BBSRC should perhaps re-examine their respective procedures for enabling genuine US-UK international collaborations. Based on the experiences of the present N+N participants, it seems that current practices are not as effective as they might be.

The second, and likely equally important, action item was the recommendation that the next N+N meeting should be a workshop for post-doctoral researchers (& graduate students?) plus younger faculty. Indeed, it would seem appropriate for the NSF & EPSRC/BBSRC to fund one workshop in the UK and another the following year in the US. One theme for such a workshop, which emerged from the present meeting, is HPC and the Nano-bio Interface.

The emphasis on younger researchers is important to ensure that the critically important area of HPC applications remains attractive to the next generation of researchers. In this regard, there was also considerable discussion of, and enthusiasm for, seeking joint US-UK educational opportunities for students. Ideas that were discussed included the NSF IGERT scheme, and domain-specific graduate degrees (Chemistry, Physics, Biochemistry, etc.) with HPC as a Master's or Diploma option. For example, the latter could combine courses offered in the UK with complementary courses offered in the US, with the combined set of courses required for the qualification. Critical mass is sometimes a problem for such programs; exchange of students may be one way to overcome it. Agencies from the UK and US could perhaps provide bursaries to facilitate the “overseas” parts of such international programs.

Other comments:

Dual-degree programs, e.g., PhD students who also earn an MSc in computer science, already exist at some US universities. If NSF is not the appropriate agency, others (e.g., DARPA, NASA) might be able to provide trans-national funding.

How do we bring new blood interested in the materials & bio fields into the HPC community? (This is another reason the next N+N meeting should be for younger people.)

DoE has a presence in both materials and biology; the national labs are important for big software projects.

Research

HPC in materials science has already been very important, and the need for ‘big iron’ will increase. Systems biology is a sleeping giant. The future will involve moving from materials and biomolecular systems to cells and beyond: towards a global cell modelling initiative and then beyond the cell, to the virtual heart, etc.

Needs

CPU architecture

Which aspects of machine architecture are going to be most important? One view: faster processors. Another: put more into relatively faster interconnects rather than into processors, to facilitate scaling for more modest-sized problems.

An efficient way of computing is for a group of ~15 people to have a dedicated ~200-CPU machine; one participant favoured the UK consortia model for this reason.

Sometimes there is too much emphasis on simulating really big systems; there is a need for better sampling on moderately sized problems.

Algorithm Development

Algorithms are critically important: they have been responsible for at least as much progress as improvements in machines. People need substantial computing time to develop and test algorithms, and they need to be assured of the availability of future computing resources to justify the effort.

Grids & Software management

Grid computing: the activation-energy barrier to adoption is currently too high.

Grid computing is still in its infancy; the agencies should be asked for ways in which collaborations might be facilitated.

BioSimGRID aims to establish collaboration tools for distribution to the community to help remove barriers. The intention is to store sufficient metadata to extend the shelf-life of the simulations themselves. The question was raised whether the data would in fact have a useful shelf life, and whether enough metadata would be stored to make it useful.
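
To make the metadata point concrete, the following minimal sketch (in Python) shows the kind of per-simulation record that might accompany an archived trajectory. The field names and the validation step are illustrative assumptions, not BioSimGRID's actual schema.

    import json

    # Hypothetical per-simulation metadata record of the kind discussed above.
    # Field names are illustrative assumptions, not BioSimGRID's actual schema.
    REQUIRED_FIELDS = {
        "code", "code_version", "force_field", "ensemble",
        "timestep_fs", "duration_ns", "system_description", "depositor",
    }

    record = {
        "code": "CHARMM",           # simulation package used
        "code_version": "c29b1",    # exact version, for reproducibility
        "force_field": "CHARMM22",  # parameter set
        "ensemble": "NPT",          # thermodynamic ensemble
        "timestep_fs": 2.0,         # integration timestep (femtoseconds)
        "duration_ns": 10.0,        # total simulated time (nanoseconds)
        "system_description": "64-lipid DPPC bilayer in water",
        "depositor": "someone@example.ac.uk",
    }

    # A deposit should fail loudly if required provenance is missing;
    # without it, the archived trajectory loses its shelf-life.
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"record lacks required metadata: {sorted(missing)}")

    print(json.dumps(record, indent=2))  # stored alongside the trajectory data

The point of such a record is that a simulation deposited today can still be interpreted, and re-used, years later without contacting the original author.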

Grid computing does have high potential. There was a strong suggestion that source codes should be freely available (the ‘Linux model’), though there was some debate about this.

Focus on more seamless software that lets different codes communicate; unstable middleware is a current problem. Collaboration on standards for data exchange is a possible tangible area of US-UK collaboration.

CHARMM is a model for reaching out and developing a community for software development. But note the considerable effort required to maintain such codes.

Miscellaneous

Multi-scale modelling is difficult for a small group to enter; collaborative efforts are needed (e.g., CHARMM, CPMD).

Issues of managing software

How can application science be brought back into the NSF supercomputers programme?

Funding Mechanisms for collaborative ventures

How do we build upon the potential synergy (beyond an MoU)?

How do we fund US-UK collaborations, and not just talk about them? To gain added value from this meeting beyond discussion, how do we fund integrated joint research projects? There should be provision for US nodes in UK consortia, and vice versa; NIH may already have a model for this. Another possible model would be to enmesh graduate training programs, e.g. the existing US/German model.

Mechanisms for genuine collaboration are needed, and applications need to drive developments; it is difficult to create overlap between distant scales, but essential to attempt it.

The main problem with collaborations is the current funding model, especially moving money from the UK to the US. The problems with funding models were recognised, along with the need for work in this area, although it will not be easy.

Developing interfaces, both between people and between codes, is important; even funding for short visits could be useful.

NIH is a mission-oriented agency, and could embrace the science of this meeting.

There is a case to be made for HPC contributing to the NIH mission at the materials/nano/biology interface covered in this meeting, and it would be useful to have a document making this case.
