Pittsburgh Supercomputing Center Collaboration with Harvard Pioneers a New Approach in Brain Study

Prodigious data-transmission, management and image processing made it possible to map the connections between individual brain cells identified according to function

PITTSBURGH, August 16, 2011 — “Untangling Neural Nets” said the big-print headline on the cover of Nature (March 10, 2011), with an image from work by scientists at Harvard and the Pittsburgh Supercomputing Center (PSC). A news comment in the same issue framed their research as “an exciting and pioneering approach . . .” by which the researchers “achieved a new feat . . . a way of directly studying the relationship of a neuron’s function to its connections.”

Nature - March 10 Issue Cover

The Harvard-PSC team’s paper culminates several years of work at Harvard as well as prodigious data transmission, management and image processing at PSC. The result is a notable first step toward a major goal of neuroscience: a wiring diagram of the brain. “We’ve just begun to scratch the surface,” says Clay Reid, professor of neurobiology at the Harvard Medical School and Center for Brain Science, who led the project, “but we’re moving toward a complete physiology and anatomy of cortical circuits.”

Their study relied on a series of innovations with advanced technology, beginning with experiments that identified individual brain cells — known as neurons — of a mouse as they respond to what the mouse is seeing. “The first part of this work,” says Reid, “is something we’ve been doing for five or six years — literally watching the brain see, with neurons reacting to very specific elements in the field of view.”

The next challenge involved capturing high-resolution images of the extremely small volume of brain involved — about the size of the period at the end of this sentence. Within this tiny patch of the visual cortex, the part of the brain that processes impulses from the retina, they had identified 100 neurons according to function. With very high-resolution imaging, they aimed to make sense from the tangled, tentacle-like mass of neural structure and to relate neural wiring to function. “The big challenge,” says Reid, “is tracing the connections between neurons.”

To get the high-resolution images needed, Reid and Ph.D. student Davi Bock (now head of a laboratory at Janelia Farm, the research campus of the Howard Hughes Medical Institute) and postdoctoral fellow Wei-Chung Lee developed a souped-up version of transmission electron microscopy (TEM). They prepared the brain tissue and — with a precision diamond knife called an “ultramicrotome” — cut ultra-thin slices (40 nanometers, the thickness of a few hundred atoms) of part of the volume that included 10 of the functionally identified neurons. With a specialized TEM camera array, they imaged these sections.

The imaging presented a massive data-processing task. In late 2007 a fortuitous meeting of minds (pun intended) occurred at a brain science conference. Discussions between Reid, Bock and bio-imaging expert Art Wetzel of PSC’s National Resource for Biomedical Supercomputing (NRBSC) opened the door to a way forward in handling the TEM data. By April 2009 Wetzel and his NRBSC colleague Greg Hood had tools in place at PSC to receive from Harvard, store and process more than a terabyte (a trillion bytes, equivalent to 1000 copies of the full set of Encyclopedia Britannica) of TEM data per day.

Over a six-month run, April to September, more than 110 terabytes of data, comprising more than three million TEM camera frames, flowed from Harvard to the NRBSC scientists at PSC. PSC network staff worked closely with Harvard to maximize bandwidth performance. “This was near the limits,” says Wetzel, “of what could be sent using commodity best-effort network service.”
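Some back-of-the-envelope arithmetic puts these transfer figures in perspective. A minimal sketch, assuming decimal terabytes (10^12 bytes) and a roughly 183-day run:

```python
# Rough sustained-bandwidth arithmetic for the transfer figures quoted above
# (a terabyte here means 10^12 bytes; a day is 86,400 seconds).
def sustained_mbps(bytes_total, seconds):
    """Average transfer rate in megabits per second."""
    return bytes_total * 8 / seconds / 1e6

per_day = sustained_mbps(1e12, 86_400)                # 1 TB per day, the peak capability
six_month_avg = sustained_mbps(110e12, 183 * 86_400)  # 110 TB over ~6 months

print(round(per_day, 1))        # ~92.6 Mbps sustained for 1 TB/day
print(round(six_month_avg, 1))  # ~55.7 Mbps averaged over the whole run
```

Rates near 100 Mbps sustained, around the clock for months, illustrate why Wetzel describes the transfer as near the limits of commodity best-effort networking of the time.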

Wetzel and Hood archived the image data using PSC’s file archival system. At the same time, with a workstation custom-built for this job, they began the task, part art and part science, of digitally stitching frames into sections (as many as 14,000 frames per section) and then stacking these quilt-like sections to recreate the imaged brain volume for 3D viewing and analysis on a computer screen. In 2010, after months of work at NRBSC, the Harvard team used this 3D model to manually trace the axons (output wires) from each of the functionally identified neurons to its junction (synapse) with a dendrite (input wire) of another neuron — and beyond, to the boundaries of the imaged volume.

Reid and his colleagues, in effect, crawled through the brain’s dense thicket, neuron to neuron, and mapped a small part of the visual cortex. “This gives us a new approach,” says Reid, “to answer the question, ‘How does the brain see?’ We can finally look at circuits in the brain in all of their complexity. How the mind works is one of the greatest mysteries in nature, and this presents a new and powerful way to explore that mystery.”

Quilt Patching & Slice Stacking

Research on the visual cortex over nearly a century has shown that it is organized into circuits according to visual function. Neurons that respond to vertical features in the field of view — trees or telephone poles, for instance — are interspersed in the mouse visual cortex with neurons that respond to horizontal features. The objective of this research is to reverse engineer the wiring in order to get at deeper understanding of how a circuit works, in particular to understand how neurons that respond to different features make connections in a local circuit. “To understand the cerebral cortex,” says Reid, “we’d like to go one circuit at a time. A circuit is roughly 10,000 neurons with tens of millions of connections between the neurons.”

Ten neurons, the researchers are acutely aware, is barely a start, but a start, and they’re busy planning an expanded study. “By historical standards, this was a large volume,” says Bock, for whom this work constituted his Ph.D. dissertation, “but it was barely big enough to contain some interesting cortical circuitry.”

“What we’ve done,” says Wetzel, referring to the paper in Nature, “is about 1/80th of the target volume for our next step, a cubic millimeter, large enough to encompass a circuit.” In preparation for the larger volume, he and Hood have begun upscaling their storage and processing capabilities to handle volumes 10 times larger, and expect to be prepared to handle data transmission at the scale of petabytes (1000 terabytes) in two to three years.

To get an idea of the quantity of information-capture involved, imagine the brain as a wedge of cheese. If each TEM-prepped section were a millimeter thick, roughly a thin slice of cheese (instead of 40 nanometers), and the lateral dimensions increased proportionately, the cheese slices would be larger than a basketball court. A cubic millimeter of brain will yield 25,000 of these basketball-court sized cheese slices.
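The numbers behind the cheese-wedge analogy follow directly from the section thickness and target volume given in the text. A quick sketch of the arithmetic:

```python
# Scaling arithmetic for the cheese-wedge analogy: 40 nm sections cut from
# a 1 cubic millimeter target volume (numbers taken from the text).
section_thickness_nm = 40
volume_side_mm = 1.0

# Number of 40 nm sections stacked through a 1 mm tall volume (1 mm = 1e6 nm).
n_sections = int(volume_side_mm * 1e6 / section_thickness_nm)

# Magnify each section so its thickness is 1 mm instead of 40 nm, and grow
# the lateral dimensions by the same factor.
scale = 1e6 / section_thickness_nm               # 25,000x magnification
scaled_side_m = (volume_side_mm / 1000) * scale  # lateral side length, meters

print(n_sections)     # 25000 sections per cubic millimeter
print(scaled_side_m)  # 25.0 m on a side
```

A 25-meter square is indeed larger than a basketball court (about 28 m by 15 m), and 25,000 such slices make up the cubic millimeter.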

For the automated stitching of frames and post-processing of the immense TEM data sets, Hood and Wetzel applied software methods they developed in earlier volumetric imaging, mainly with the roundworm (C. elegans), adapting them to the Harvard data and, in some cases, creating ad hoc methods. The latter included adjusting for distortions in the frames that occur as part of TEM imaging. “Some sections,” says Hood, “had pronounced shear distortion. Since this is very regular, we could mathematically compensate for it before addressing the irregular distortions.”

To stitch the individual frames, intentionally imaged with overlap, into a single mosaic, they use various search methods (including fast Fourier transform correlations) to match information in adjacent frames. This process, says Wetzel, matches frames both spatially and in intensity to produce a nearly seamless image of each section.
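The FFT-based matching mentioned above can be illustrated with phase correlation, a standard Fourier technique for recovering the offset between two overlapping images. This is a minimal sketch of that general idea, not the PSC pipeline itself, which also matches intensities and handles irregular distortions:

```python
import numpy as np

def frame_offset(a, b):
    """Estimate the integer (row, col) shift of frame b relative to frame a
    using phase correlation: normalize the cross-power spectrum so the
    inverse FFT produces a sharp peak at the displacement."""
    F = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint wrap around to negative shifts.
    return tuple(int(p - n) if p > n // 2 else int(p)
                 for p, n in zip(peak, corr.shape))

# Synthetic check: shift a random image by (5, -3) and recover the offset.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, (5, -3), axis=(0, 1))
print(frame_offset(img, shifted))  # (5, -3)
```

In practice the correlation is computed only over the intentionally overlapped margins of adjacent frames, which keeps the search fast even with thousands of frames per section.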

To map each section to its neighboring sections, they apply a “pair-wise” registration algorithm, compensating for deformations that inevitably occur when cutting tissue so thinly. They next construct a “spring model” of the entire stack of sections, with the pairwise registration maps guiding the placement of springs between adjacent sections. Finally, by letting this spring model relax, they obtain a 3D alignment of the stack and can produce a finished volume for viewing and analysis.
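The spring-model relaxation can be sketched in a deliberately simplified form. This toy version tracks only one lateral coordinate per section, with spring rest lengths set by the pairwise-registration offsets; the real model works on full 2D deformation maps, but the relaxation principle is the same:

```python
import numpy as np

def relax_stack(pairwise_offsets, n_iters=500, step=0.4):
    """Toy 1D spring-model relaxation. Section i has lateral position x[i];
    the spring between sections i and i+1 has rest length equal to the
    pairwise-registration offset. Gradient descent on the total spring
    energy lets the stack relax into a consistent alignment."""
    n = len(pairwise_offsets) + 1
    x = np.zeros(n)
    for _ in range(n_iters):
        force = np.zeros(n)
        for i, d in enumerate(pairwise_offsets):
            stretch = (x[i + 1] - x[i]) - d  # deviation from rest length
            force[i] += stretch              # spring pulls section i forward
            force[i + 1] -= stretch          # and section i+1 backward
        x += step * force
        x -= x[0]  # pin the first section to remove the free translation
    return x

# Three springs with rest lengths 2, -1, 3 between four sections:
print(relax_stack([2.0, -1.0, 3.0]))  # ~[0. 2. 1. 4.]
```

When the pairwise offsets are mutually consistent, as here, the relaxed positions are simply their running sum; with noisy or conflicting measurements from real tissue, the spring model distributes the disagreement smoothly across the whole stack instead of letting any one bad section break the alignment.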

Tip of the Iceberg

Along with demonstrating viability of a powerful approach to brain research, the Harvard researchers also produced new evidence tending to confirm earlier studies about “inhibitory neurons” — neurons that, rather than transmitting an excitatory electro-chemical pulse, suppress the activity of other neurons. Tracing the axon-to-dendrite connections within the imaged volume showed that the inhibitory neurons, which the researchers could identify by their structure, had no functional preference; connections to their dendrites arrive from the functionally identified neurons without regard to the visual response properties of the transmitting neuron. Understanding these relationships can be important, says Reid, because many neurological conditions, such as epilepsy, seem to be the result of neural inhibition gone awry.

Still, the accomplishment, in this case, lies less in the scientific finding than the proof of method, for which Reid gives large credit to the partnership with PSC. “The amount of data they took in and did this very precise alignment with,” he says, “was completely unprecedented. We couldn’t have done our science unless we had this team to wrestle the large dataset to the ground.”

As Wetzel and Hood prepare to handle more data, Reid and his colleagues are scaling up their TEM platform to generate much larger data sets. “This is just the tip of the iceberg,” he says of the published work. “Within ten years I’m convinced we’ll be imaging the activity of thousands of neurons in a living brain and tracing tens of thousands of connections between them.”

About NRBSC: http://www.nrbsc.org
The National Resource for Biomedical Supercomputing at PSC is supported in part through a grant from the NIH National Center for Research Resources.
