BRAIN 2025: A Scientific Vision

The Advisory Committee to the NIH Director (ACD) Working Group for the Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative published the BRAIN 2025 report on Thursday, June 5, 2014. Enthusiastically endorsed by the ACD, the BRAIN 2025 report articulated the scientific goals of the BRAIN Initiative and developed a multi-year scientific plan for achieving these goals, including timetables, milestones, and cost estimates. View the BRAIN 2025 report (PDF, 1,223 KB).

Roster

Ex Officio Members 
Kathy Hudson, PhD, NIH 
Geoffrey Ling, MD, PhD, DARPA 
Carlos Peña, PhD, FDA 
John Wingfield, PhD, NSF 
 
Executive Secretary 
Lyric Jorgenson, PhD, NIH 

Cornelia Bargmann, PhD (co-chair) 
The Rockefeller University    

William Newsome, PhD (co-chair) 
Stanford University           

David Anderson, PhD 
California Institute of Technology     

Emery Brown, MD, PhD 
Massachusetts Institute of Technology and  
Massachusetts General Hospital          

Karl Deisseroth, MD, PhD 
Stanford University       

John Donoghue, PhD 
Brown University 

Peter MacLeish, PhD 
Morehouse School of Medicine    

Kamil Ugurbil, PhD 
University of Minnesota           

Eve Marder, PhD 
Brandeis University 

Richard Normann, PhD 
University of Utah 

Joshua Sanes, PhD 
Harvard University 

Mark Schnitzer, PhD 
Stanford University 

Terrence Sejnowski, PhD 
Salk Institute for Biological Studies 

David Tank, PhD 
Princeton University 

Roger Tsien, PhD 
University of California, San Diego

Executive Summary

The human brain is the source of our thoughts, emotions, perceptions, actions, and memories; it confers on us the abilities that make us human, while simultaneously making each of us unique. Over recent years, neuroscience has advanced to the level that we can envision a comprehensive understanding of the brain in action, spanning molecules, cells, circuits, systems, and behavior. This vision, in turn, inspired The BRAIN Initiative®. On April 2, 2013, President Obama launched The BRAIN Initiative® to “accelerate the development and application of new technologies that will enable researchers to produce dynamic pictures of the brain that show how individual brain cells and complex neural circuits interact at the speed of thought.” In response to this Grand Challenge, the National Institutes of Health (NIH) convened a working group of the Advisory Committee to the Director, NIH, to develop a rigorous plan for achieving this scientific vision. This report presents the findings and recommendations of the working group, including the scientific background and rationale for The BRAIN Initiative® as a whole and for each of seven major goals articulated in the report. In addition, we include specific deliverables, timelines, and cost estimates for these goals as requested by the NIH Director.

The Brain Research through Advancing Innovative Neurotechnologies Initiative aims to accelerate the development and application of innovative technologies to produce a new, dynamic picture of the brain.

The charge from the President and from the NIH Director is bold and ambitious. The working group agreed that the best way to set this vision in motion is to accelerate technology development, as reflected in the name of The BRAIN Initiative®: “Brain Research through Advancing Innovative Neurotechnologies.” The focus is not on technology per se, but on the development and use of tools for acquiring fundamental insight about how the nervous system functions in health and disease. The initiative is only one part of the NIH’s investment in basic, translational, and clinical neuroscience, but the neurotechnology it produces should advance those other areas as well. To achieve these goals, we recommend that The BRAIN Initiative® develop over a ten-year period beginning in FY2016, with a primary focus on technology development in the first five years, shifting in the second five years to a primary focus on integrating technologies to make fundamental new discoveries about the brain. The distinction between these phases is not black and white, but rather is a matter of emphasis and opportunity. Discovery-based science will motivate technology development in the first phase, and further technology development will be needed as the focus shifts to discovery in later years. 

In considering these goals and the current state of neuroscience, the working group identified the analysis of circuits of interacting neurons as being particularly rich in opportunity, with potential for revolutionary advances. Truly understanding a circuit requires identifying and characterizing the component cells, defining their synaptic connections with one another, observing their dynamic patterns of activity as the circuit functions in vivo during behavior, and perturbing these patterns to test their significance. It also requires an understanding of the algorithms that govern information processing within a circuit and between interacting circuits in the brain as a whole. The analysis of circuits is not the only area of neuroscience worthy of attention, but advances in technology are driving a qualitative shift in what is possible, and focused progress in this area will benefit many other areas of neuroscience. 

With these considerations in mind, the working group consulted extensively with the scientific community to evaluate challenges and opportunities in the field. The following areas were identified as high priorities for The BRAIN Initiative®. These goals are developed in greater intellectual and practical detail in Sections II and III of this report.

  1. Discovering diversity: Identify and provide experimental access to the different brain cell types to determine their roles in health and disease. It is within reach to characterize all cell types in the nervous system, and to develop tools to record, mark, and manipulate these precisely defined neurons in the living brain. We envision an integrated, systematic census of neuronal and glial cell types, and new genetic and non-genetic tools to deliver genes, proteins, and chemicals to cells of interest in non-human animals and in humans. 

  2. Maps at multiple scales: Generate circuit diagrams that vary in resolution from synapses to the whole brain. It is increasingly possible to map connected neurons in local circuits and distributed brain systems, enabling an understanding of the relationship between neuronal structure and function. We envision improved technologies—faster, less expensive, scalable—for anatomic reconstruction of neural circuits at all scales, from non-invasive whole human brain imaging to dense reconstruction of synaptic inputs and outputs at the subcellular level. 

  3. The brain in action: Produce a dynamic picture of the functioning brain by developing and applying improved methods for large-scale monitoring of neural activity. We should seize the challenge of recording dynamic neuronal activity from complete neural networks, over long periods, in all areas of the brain. There are promising opportunities both for improving existing technologies and for developing entirely new technologies for neuronal recording, including methods based on electrodes, optics, molecular genetics, and nanoscience, and encompassing different facets of brain activity. 

  4. Demonstrating causality: Link brain activity to behavior with precise interventional tools that change neural circuit dynamics. By directly activating and inhibiting populations of neurons, neuroscience is progressing from observation to causation, and much more is possible. To enable the immense potential of circuit manipulation, a new generation of tools for optogenetics, chemogenetics, and biochemical and electromagnetic modulation should be developed for use in animals and eventually in human patients. 

  5. Identifying fundamental principles: Produce conceptual foundations for understanding the biological basis of mental processes through development of new theoretical and data analysis tools. Rigorous theory, modeling, and statistics are advancing our understanding of complex, nonlinear brain functions where human intuition fails. New kinds of data are accruing at increasing rates, mandating new methods of data analysis and interpretation. To enable progress in theory and data analysis, we must foster collaborations between experimentalists and scientists from statistics, physics, mathematics, engineering, and computer science. 

  6. Advancing human neuroscience: Develop innovative technologies to understand the human brain and treat its disorders; create and support integrated human brain research networks. Consenting humans who are undergoing diagnostic brain monitoring, or receiving neurotechnology for clinical applications, provide an extraordinary opportunity for scientific research. This setting enables research on human brain function, the mechanisms of human brain disorders, the effect of therapy, and the value of diagnostics. Meeting this opportunity requires closely integrated research teams performing according to the highest ethical standards of clinical care and research. New mechanisms are needed to maximize the collection of this priceless information and ensure that it benefits people with brain disorders. 

  7. From BRAIN Initiative to the brain: Integrate new technological and conceptual approaches produced in Goals #1-6 to discover how dynamic patterns of neural activity are transformed into cognition, emotion, perception, and action in health and disease. The most important outcome of The BRAIN Initiative® will be a comprehensive, mechanistic understanding of mental function that emerges from synergistic application of the new technologies and conceptual structures developed under The BRAIN Initiative®. 

The overarching vision of The BRAIN Initiative® is best captured by Goal #7—combining these approaches into a single, integrated science of cells, circuits, brain, and behavior. For example, immense value is added if recordings are conducted from identified cell types whose anatomical connections are established in the same study. Such an experiment is currently an exceptional tour de force; with new technology, it could become routine. In another example, neuronal populations recorded during complex behavior might be immediately retested with circuit manipulation techniques to determine their causal role in generating the behavior. Theory and modeling should be woven into successive stages of ongoing experiments, enabling bridges to be built from single cells to connectivity, population dynamics, and behavior. 

This synthetic approach will enable penetrating solutions to longstanding problems in brain function, but we also emphasize the likelihood of entirely new, unexpected discoveries that will result from the new technologies. In some sense, BRAIN Initiative scientists who apply the new activity-monitoring technology will be like Galileo looking into the heavens with the first optical telescope. Similarly, new perturbation tools and quantitative approaches are likely to yield extraordinary insights into the relationship between brain activity and mental functions. We expect to discover new forms of neural coding as exciting as the discovery of place cells, and new forms of neural dynamics that underlie neural computations. 

Over the course of our deliberations, specific themes emerged that should become core principles for the NIH BRAIN Initiative. 

  1. Pursue human studies and non-human models in parallel. The goal is to understand the human brain, but many methods and ideas will be developed first in animal models. Experiments should take advantage of the unique strengths of diverse species and experimental systems. 

  2. Cross boundaries in interdisciplinary collaborations. No single researcher or discovery will solve the brain’s mysteries. The most exciting approaches will bridge fields, linking experiment to theory, biology to engineering, tool development to experimental application, human neuroscience to non-human models, and more, in innovative ways. 

  3. Integrate spatial and temporal scales. A unified view of the brain will cross spatial and temporal levels, recognizing that the nervous system consists of interacting molecules, cells, and circuits across the entire body, and important functions can occur in milliseconds or minutes, or take a lifetime. 

  4. Establish platforms for sharing data. Public, integrated repositories for datasets and data analysis tools, with an emphasis on ready accessibility and effective central maintenance, will have immense value. 

  5. Validate and disseminate technology. New methods should be critically tested through iterative interaction between tool-makers and experimentalists. After validation, mechanisms must be developed to make new tools available to all. 

  6. Consider ethical implications of neuroscience research. BRAIN Initiative research may raise important issues about neural enhancement, data privacy, and appropriate use of brain data in law, education, and business. These important issues must be considered in a serious and sustained manner. BRAIN Initiative research should hew to the highest ethical standards for research with human subjects and with non-human animals under applicable federal and local laws. 

  7. Establish accountability to NIH, the taxpayer, and the basic, translational, and clinical neuroscience communities. The BRAIN Initiative® is extremely broad in interdisciplinary scope and will involve multiple partners both within and outside the NIH. Oversight mechanisms should be established to ensure that BRAIN funds are invested wisely for the ultimate benefit of the public and the scientific community.

To guide The BRAIN Initiative® and ensure that these goals and principles are evaluated and refreshed as appropriate, we recommend that a scientific advisory board be established, to be composed of scientists who are experts in the diverse fields relevant to the Initiative — neuroscience, molecular biology, the clinical sciences, the physical and quantitative sciences, and ethics. The rapid pace of technological and conceptual change in neuroscience almost ensures that some portions of this report will be obsolete within several years. A cohesive and rigorous scientific advisory board will be invaluable in responding to future challenges. 

As part of the planning process, the working group was asked to estimate the cost of The BRAIN Initiative®. While we did not conduct a detailed cost analysis, we considered the scope of the questions to be addressed by the initiative, and the cost of programs that have developed in related areas over recent years. Thus our budget estimates, while provisional, are informed by the costs of real neuroscience at this technological level. To vigorously advance the goals of The BRAIN Initiative® as stated above, we recommend an investment by the NIH that ramps up to $400 million/year over the next five years (FY16-20), and continues at $500 million/year subsequently (FY21-25). A sustained, decade-long commitment at this level will attract talented scientists from multiple fields to the interdisciplinary collaborations that are essential to The BRAIN Initiative® and its ambitious goals. 

Preamble

We stand on the verge of a great journey into the unknown—the interior terrain of thinking, feeling, perceiving, learning, deciding, and acting to achieve our goals—that is the special province of the human brain. These capacities are the essence of our minds and the aspects of being human that matter most to us. Remarkably, these powerful yet exquisitely nuanced capacities emerge from electrical and chemical interactions among roughly 100 billion nerve cells and glial cells that compose our brains. All human brains share basic anatomical circuits and synaptic interactions, but the precise patterns of connections and interactions are highly variable from person to person—and therein lies the source of the remarkable variation we see in human behavior, from the breathtaking dance of a ballerina, to the elegant craftsmanship of a master carpenter, to the shrewd judgment of an expert trader. Our brains make us who we are, enabling us to perceive beauty, teach our children, remember loved ones, react against injustice, learn from history, and imagine a different future. 

The human brain is simply astonishing—no less astonishing to those of us who have spent our careers studying its mysteries than to those new to thinking about the brain. President Obama, by creating The BRAIN Initiative®, has provided an unprecedented opportunity to solve those mysteries. The challenge is to map the circuits of the brain, measure the fluctuating patterns of electrical and chemical activity flowing within those circuits, and understand how their interplay creates our unique cognitive and behavioral capabilities. We should pursue this goal simultaneously in humans and in simpler nervous systems in which we can learn important lessons far more quickly. But our ultimate goal is to understand our own brains. 

Like the Apollo program, this challenging objective will require the development of an array of new technologies, drawing on scientists and engineers from a multitude of disciplines. These technologies will have to be integrated in an unprecedented manner to achieve the Initiative’s goals. We are at a unique moment in the history of neuroscience—a moment when technological innovation has created possibilities for discoveries that could, cumulatively, lead to a revolution in our understanding of the brain. The new technologies described in this report are already laying a foundation for exceptional progress, but more innovation is required. Spectacular opportunities for deeper understanding would be created by new molecular techniques to identify the specific connections between nerve cells that change when a new memory is formed or a new social situation encountered. Similarly, new physics and engineering methods for noninvasive measurement and tuning of activity in fine-scale human brain circuits would create a revolution in the understanding and treatment of disease. 

What will be gained by solving the mystery of the brain’s circuits and their activity across time and space? Understanding the brain is a riveting intellectual challenge in and of itself. But in the longer term, new treatments for devastating brain diseases are likely to emerge from a deeper understanding of brain circuits. For example, treatment of Parkinson’s disease has been greatly enhanced by circuit-level understanding of the brain’s motor systems. Our front-line treatment for Parkinson’s is the dopamine precursor drug, L-dopa, but its efficacy decreases over time while severe side effects increase. In response, teams of neurophysiologists, engineers, and physicians fused an understanding of the brain’s motor circuits with technological advances to create deep brain stimulation (DBS), which can restore motor circuit function in many Parkinson’s patients for up to several years. Current research into brain circuits for mood and emotion has the potential to advance psychiatry in similar ways. 

We believe this to be a moment in the science of the brain where our knowledge base, our new technical capabilities, and our dedicated and coordinated efforts can generate great leaps forward in just a few years or decades. Like other great leaps in the history of science—the development of atomic and nuclear physics, the unraveling of the genetic code—this one will change human society forever. Through deepened knowledge of how our brains actually work, we will understand ourselves differently, treat disease more incisively, educate our children more effectively, practice law and governance with greater insight, and develop more understanding of others whose brains have been molded in different circumstances. To achieve this vision, our nation must train and support a new generation of trans-disciplinary brain scientists and provide the resources needed to unleash their creative energies for the benefit of all. 

On a personal note, the members of this committee are grateful to President Obama and the NIH for the opportunity to embark on our own journey of discovery over the past year in preparing this report. We are indebted to numerous colleagues who participated in our four workshops in the summer of 2013, and in public feedback sessions following publication of our preliminary report in September of 2013. And we are grateful to our many colleagues who shared their insights in one-on-one conversations, arguing with us and educating us in the process. We also value the perspectives offered to us by patient advocacy groups and members of the lay public. This journey has already proved lively and enjoyable. We look forward to the next phase of discovery in The BRAIN Initiative®. 

Respectfully, the members of The BRAIN Initiative® Working Group

SECTION I. THE BRAIN INITIATIVE: VISION AND PHILOSOPHY

1. Mapping the Structure and Components of Circuits 

What classes of neurons and glia are involved in a given mental process or neural activity state? Which cells and brain regions contribute to a single percept or action, and how are they connected to each other? To answer these questions, we must define the cellular components of circuits, including their molecular properties and anatomical connections. This knowledge will tell us what the brain is made of at molecular, cellular, and structural levels; it will also provide a foundation for understanding how these properties change across the normal lifespan and in brain disorders. 

1a. Cell Type: A Starting Point for Intellectual and Technical Progress 

1a-i. A Census of Neuronal and Glial Cell Types 

The brain contains many classes of neurons and glia, but not infinitely many. Its neurons can be distinguished by their neurotransmitters, electrophysiological properties, morphology, connectivity, patterns of gene expression, and probably other functional properties. These properties are major determinants of system-wide neural activity patterns. Classifying neurons is a prerequisite to manipulating them in controlled ways, and to understanding how they change in brain disorders. Information about the types of glial cells, vascular cells, and immune cells associated with the nervous system may also increase our understanding of brain function in health and disease. Therefore, a valuable short-term goal for The BRAIN Initiative® is to generate a census of cell types within the brain. 

There is not yet a consensus on what a neuronal type is, since a variety of factors including experience, connectivity, and neuromodulators can diversify the molecular, electrical, and structural properties of initially similar neurons. In some cases, there may not even be sharp boundaries separating subtypes from each other, and cell phenotypes may change over time. Nonetheless, there is general agreement that types can be defined provisionally by invariant and generally intrinsic properties, and that this classification can provide a good starting point for a census. Thus, the census should begin with well-described large classes of neurons (e.g. excitatory pyramidal neurons of the cortex) and then proceed to finer categories within these classifications. This census would be taken with the knowledge that it will initially be incomplete, and will improve over iterations. A census of cell types is an important short-term goal for several reasons: 

1. An agreed-upon set of cells provides a frame of reference for studies in many labs, and possibly in different organisms, allowing cross-comparisons. For example, to the extent that neuronal cell types are conserved across species (itself an important question) we can ask whether there are differences in their numbers and ratios in the cortex of primates compared to rodents. 

2. An agreed-upon set of cells provides a foundation for further experiments, and shapes the problem going forward. For example, what genes are expressed in each of the different cell types? Do the sets of cells add up to 100%, suggesting that all neurons and glia are accounted for? Are there genetic elements such as Cre lines or viruses that provide experimental access to each cell type? 

3. This problem can be solved with readily achievable improvements to existing technology. 

4. This project has an initial endpoint, and the list itself will be a resource that can serve to organize subsequent BRAIN experiments and data analysis in a systematic way. 

We envision a “neuron-ontology” bioinformatic framework that might be analogous to the gene-ontology framework in molecular genetics.  The framework should include the potential to cross-reference information about homologous cell types from different animals, which could be valuable in the same way that information about homologous genes is valuable for linking studies across species. 
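
To make the idea of such a framework concrete, the sketch below shows, in Python, one minimal way the core records of a cell-type ontology might be represented, with hierarchical parent classes and cross-species homology links. The class names, fields, identifiers, and example entries are purely illustrative assumptions, not a proposed standard.

```python
# Minimal sketch of a cell-type ontology, analogous in spirit to the gene-ontology
# framework. All names, fields, identifiers, and example entries are illustrative
# assumptions only.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CellType:
    type_id: str                                    # stable identifier, e.g. "CT:0001"
    name: str                                       # human-readable label
    species: str                                    # species in which the type is described
    parent_id: Optional[str] = None                 # broader class this subtype belongs to
    properties: dict = field(default_factory=dict)  # markers, physiology, morphology, ...
    homologs: set = field(default_factory=set)      # putative homologous types in other species

class CellTypeOntology:
    def __init__(self):
        self.types = {}

    def add(self, cell_type):
        self.types[cell_type.type_id] = cell_type

    def link_homologs(self, id_a, id_b):
        """Record a putative cross-species homology between two cell types."""
        self.types[id_a].homologs.add(id_b)
        self.types[id_b].homologs.add(id_a)

    def lineage(self, type_id):
        """Walk from a fine subtype up through its broader parent classes."""
        chain = []
        while type_id is not None:
            node = self.types[type_id]
            chain.append(node.name)
            type_id = node.parent_id
        return chain

# Illustrative use: a broad class, a finer subtype, and a cross-species homology link.
onto = CellTypeOntology()
onto.add(CellType("CT:0001", "cortical excitatory pyramidal neuron", "mouse"))
onto.add(CellType("CT:0002", "layer 5 thick-tufted pyramidal neuron", "mouse",
                  parent_id="CT:0001", properties={"marker_gene": "example_only"}))
onto.add(CellType("CT:1002", "layer 5 thick-tufted pyramidal neuron", "human"))
onto.link_homologs("CT:0002", "CT:1002")
print(onto.lineage("CT:0002"))  # subtype traced up to its broad class
```

A record structure of this kind would let the census start with coarse, well-described classes and refine them over iterations, while cross-species links would serve the same role for cell types that homology annotations serve for genes.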

The atlas should expand to describe the detailed morphology and connectivity of each neuronal class, its activity under different conditions, and its response to perturbations, as these results emerge. It could grow to include information about cells from human patients and animal models of human disease, extending its reach and providing insight into pathological processes. 

The ultimate census of cell types would include all of the neurons and glia with molecular annotation at subcellular resolution: not just mRNA expression but ion channels, synaptic proteins, intracellular signaling pathways, and so on. This is beyond the reach of current technology, but stating the goal will provide impetus to technological development. The Allen Brain Atlas of RNA in situ hybridization, produced by the Allen Institute for Brain Science, is an example of the annotation of an anatomical database with molecular information. Similarly, array tomography can provide information about the location of specific proteins within cells based on antibody staining and optical imaging. 

A census and database of cell types might begin with the mouse, where many genetic tools have already been developed and substantial data exist on gene expression patterns. Over the longer term, the census could be extended to different animal species and to humans. 

1a-ii. Tools for Experimental Access to Defined Cell Types 

The ability to define, monitor, and manipulate a circuit requires experimental access to the individual cells and groups of cells within that circuit. Development of such tools will be facilitated by molecular analysis of cell types (section 1a-i), and should in turn facilitate progress in mapping neuronal connectivity (section 1b), understanding neuronal dynamics (section 2), and establishing function through causal neuronal manipulation (section 3). 

The past decade has seen the development of remarkable genetic tools including calcium indicators (e.g. GCaMP), optogenetic tools (e.g. Channelrhodopsin), synaptic monitors (e.g. SynaptopHluorin), chemogenetic tools (e.g. RASSLs/DREADDs), and a variety of tags that permit proteins to be visualized in vivo. By their nature, using these tools requires the ability to deliver a gene to a neuron or neurons of interest (‘genetic access’). Projects such as the Howard Hughes Medical Institute (HHMI) Drosophila project, the NIH Gene Expression Nervous System Atlas (GENSAT), and the Allen Brain Atlas have been, or are currently engaged in, developing genetic access to defined cell types in Drosophila and the mouse, but while these tools are nearing completion in the fly, they are not comprehensive in any other species. 

The current methods for neuron-specific gene delivery in the mouse are typically bipartite, with (1) a recombinase gene such as Cre and (2) an effector/sensor gene that is activated by the recombinase. These two elements are independently delivered as transgenes, through recombination into an endogenous locus or insertion into bacterial artificial chromosomes (BACs), or as stereotactically-injected viruses; the “intersection” between the two elements generates a more restricted expression pattern than either one alone. The potential value of such lines is high, but there is much room for improvement, and only a small number are in regular use in the current literature. 
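
The “intersection” described above is, at its core, a set operation: the effector is active only in cells that carry both elements. The toy sketch below illustrates that logic alone; it is a deliberately simplified cartoon, not a model of the underlying molecular biology, and the cell identifiers are invented for illustration.

```python
# Toy illustration of intersectional targeting logic: the effector/sensor is active
# only in cells that carry BOTH elements (e.g. a Cre driver and a Cre-dependent
# transgene). Cell identifiers here are invented purely for illustration.
cre_driver_cells = {"cell_01", "cell_02", "cell_03", "cell_07"}          # cells expressing the recombinase
effector_transgene_cells = {"cell_02", "cell_03", "cell_04", "cell_09"}  # cells carrying the Cre-dependent effector

# The effector is switched on only where the two expression patterns overlap,
# giving a more restricted pattern than either element alone.
expressing_cells = cre_driver_cells & effector_transgene_cells
print(sorted(expressing_cells))  # ['cell_02', 'cell_03']
```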

In addition, non-genetic methods could be used to deliver active agents to neurons of particular types, and would expand the range of possible experiments. Viruses or liposomes that contain pharmacological agents, proteins, or nanoparticles might be coated with antibodies that direct them to certain cell types. Providing reliable access to specific cell types in particular neural circuits or brain areas will accelerate all areas of modern neuroscience. 

A full exploration of methods for targeting genes, proteins, and chemicals to specific cell types is highly desirable. One question that should be considered during the initial stages of The BRAIN Initiative® is whether genome engineering by conventional transgenesis could be superseded by methods that are faster, cheaper, and more easily generalized across species. Mouse husbandry is slow and expensive, and generating the right multi-transgenic strains is a financial and temporal drag on the progress of neuroscience research. Furthermore, we wish to study other species as well. The BRAIN Initiative® should solicit new ideas for cell-type specific delivery of transgenes, perhaps based on viruses bearing small specific regulatory regions for intersectional cell-type definition via (for example) multiple recombinases; or viruses driving efficient, specific integration of exogenous genes into the genome through cutting-edge tools such as CRISPRs and TALENs; or antibody-targeted liposomes. Completely new ideas might emerge to address this problem. The most valuable ideas will be those with potential to solve the general problem for any species, in preference over those that would work for only one species at a time. The long-term vision is development of comprehensive, general suites of tools that target expression to a brain area of interest, disseminated for broad, effective use in neuroscience labs around the world. 

The next frontier would be gaining access to the human brain, which is more likely to involve transient delivery of RNA or a chemical than permanent genetic change, although viral vectors for human gene therapy are currently under exploration in the brain. Several pharmaceutical companies are developing tagged antibodies that cross the blood-brain barrier (e.g. via transferrin receptors), and these might be chemically or genetically engineered to include effectors or sensors of neuronal activity. 

In summary, it is within reach to characterize all cell types in the nervous system, and to develop tools to record, mark, and manipulate these precisely defined neurons in vivo. We envision an integrated, systematic census of neuronal and glial cell types, and new genetic and non-genetic tools to deliver genes, proteins, and chemicals to cells of interest. Priority should be given to methods that can be applied to many animal species and even to humans. 

1b. The Structural Map: Tracing Anatomical Circuits at Different Scales 

Rapid information flow across the brain is mediated by anatomical connections between cells, including local connections within a brain region and long-range connections into and out of that region. Defining circuit function requires knowledge of circuit structure. Three levels of anatomy should be considered: long-range, intermediate-range, and detailed connectivity. 

1b-i. Long-Range Connectivity 

Traditional neuroanatomy has focused on large-scale, long-range connections between different brain regions (e.g. the thalamocortical tract). In humans, long-range connections are being studied within the Human Connectome Project by noninvasive imaging methods. In animals, long-range connections are being pursued in detail using serial sectioning combined with modern dye-tracing techniques and genetic markers, with newly emerging whole-mount imaging and staining methods such as CLARITY and clearing techniques such as Scale and SeeDB poised to make an impact. Currently, most effort is being expended on the mouse model system, but these techniques can and should be extended to other species as well. The new whole-mount methods also appear promising for tracing tracts in human post-mortem tissue, and may provide an important high-resolution complement to noninvasive imaging methods. 

“Projectomes” of this kind are attainable within the next few years with current and emerging technology. The next steps are identifying gaps and completing studies of rodent, non-human primate, and human anatomical tracts at high quality, integrating the results across labs and institutions, and making the information broadly available to the community. Inclusion of other species for comparative purposes is highly desirable. Combining these datasets into a common bioinformatic framework, and registering these datasets with other streams of information describing the cell populations and projections of interest, such as molecular phenotype and activity patterns during behavior, will increase their depth and scientific utility. 

1b-ii. Intermediate-Scale Connectivity 

The next problem is mapping circuits at an intermediate scale. What long- and short-range projections make up a specific functional circuit, which may consist of only some of the cells in a particular brain region? For example, brain regions such as the hypothalamus consist of mixed cell populations, each of which has very different input and output connections that are not evident in the large-scale connectivity. Mapping these connections currently requires considerable time and effort. Progress in this area is attainable and should be vigorously pursued. 

There is considerable potential for improving the tools for studying intermediate-level circuitry. Trans-synaptic tracing of connections is highly desirable, but existing methods (lectins, dyes, and rabies-based viral tracers) are imperfect. For example, rabies tracers are largely limited to retrograde tracing, even though anterograde tracing is equally important for defining circuits. There is a concern that the present tracers may work on only a subset of cell types, and there is no fully accepted answer as to whether these tracers are strictly trans-synaptic or more generally trans-neuronal. 

Better methods for tracing circuits are critical to rapid progress throughout neuroscience, and would provide important structural constraints for interpreting virtually all functional studies. Better trans-synaptic tracers would be extremely valuable, and their development should be encouraged by The BRAIN Initiative®. These may be based on viruses or on different kinds of transgenic technologies; combining these methods with tract tracing or array tomography would increase their resolution. Other potential techniques are being explored, but none has yet matured: methods that use fluorescent proteins to label synapses (e.g. GRASP); or enzymatically-based detectors of trans-synaptic recognition (e.g. ID-PRIME), which have the enormous advantage of amplifying signals to enable robust detection. Truly transformative technologies could be encouraged from molecular biology or chemistry. The most attractive methods would be those applicable to many species, including humans. Methods that work in post-mortem brains would be particularly valuable for high resolution mapping of human brain circuits. Identifying intermediate-scale circuits should be a significant goal of The BRAIN Initiative®. 

1b-iii. Detailed Connectivity: Towards a Full Connectome 

Finally, there is the question of reconstruction of circuits at very high resolution through electron microscopy, which is widely considered to be the gold standard for circuit mapping. To date sparse reconstruction has been used to examine small numbers of neurons in a variety of systems, but dense reconstruction has been applied only to the very small animal C. elegans, or to small parts of the nervous systems of larger animals such as Drosophila; ongoing studies in mammals are extending this approach to the retina. The past few years have seen great strides in sectioning and image collection techniques, but even so, electron microscopy (EM) is prohibitively slow for large-scale studies. The bottleneck is data analysis, the painstaking and potentially error-prone process of tracing fibers and mapping synapses from one very thin section of a brain to the next across thousands of successive sections. 

The impact of dense EM reconstruction would be amplified tremendously if it were possible to increase throughput 100- or 1000-fold across all steps of the procedure, including segmentation and reconstruction as well as sectioning and data acquisition. Some promising improvements have been demonstrated, including automated capturing of serial sections for transmission EM, and serial block face scanning EM that maintains perfect 3-D registration during automated sectioning and data acquisition, but much remains to be done. 

Possible areas for incremental improvement include: 

  1. Improved methods for the histological preparation of neural tissue, especially large samples. Can we engage chemists and chemical engineers in the problem to bring fresh approaches to this century-old area of research? 

  2. Improved methods for automated tissue sectioning and imaging, although this area has already progressed greatly. 

Areas where progress is most needed are: 

  1. Improved software methods for segmenting and assembling the data. Technological advances in machine learning, artificial intelligence, and crowd-sourcing approaches to reconstruction could have a profound impact on the field (see the illustrative sketch following this list). 

  2. Improved methods for synapse identification, in particular the ability to assess the type of synapse (excitatory, inhibitory, modulatory, electrical) and estimate synaptic strength. Cell type-specific markers or molecular markers of subsets of synapses that are visible at the EM level could be very helpful in large-scale reconstructions. 
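
As one concrete illustration of the automated segmentation referred to in the first item above, the sketch below applies a generic boundary-map-plus-watershed pipeline to a synthetic 2-D image using scikit-image. It is only a toy stand-in for the learned, 3-D methods actually needed for EM volumes; the synthetic image, smoothing scale, and seed spacing are assumptions for illustration.

```python
# Toy 2-D segmentation sketch: boundary map -> seeds -> watershed.
# Real EM reconstruction pipelines use learned 3-D boundary predictions plus
# agglomeration and proofreading; this only illustrates the overall flow.
import numpy as np
from skimage.filters import sobel, gaussian
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

rng = np.random.default_rng(0)

# Synthetic "tissue" image: smooth blobs standing in for cell bodies and processes.
image = gaussian(rng.random((256, 256)), sigma=8)

# 1) Boundary-strength map (a trained classifier would normally produce this).
boundaries = sobel(image)

# 2) Seeds: local intensity maxima, one marker per putative object.
peaks = peak_local_max(image, min_distance=10)
markers = np.zeros(image.shape, dtype=int)
markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)

# 3) Watershed floods outward from the seeds, splitting along boundary ridges.
segments = watershed(boundaries, markers)

print("number of segments:", segments.max())
```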

EM is labor intensive, but it happens in stages. Sectioning for EM is relatively quick; scanning takes ten times as long; reconstructing is slower by orders of magnitude. If high quality scanned images were made available on the internet, individual users could spend their own time reconstructing areas of the brain of relevance to them, using software tools made available by the experts. Under this model, the laborious reconstruction task would be performed as-needed by a world-wide community of collaborators. The scope and impact of EM could be broadened beyond the relatively small group of expert labs by encouraging sharing of primary scanned images and reconstruction tools. There is no reason, in the modern era, for EM micrographs to be trapped in the lab that generated them. It would be exciting for The BRAIN Initiative® to generate the basic data resource (high quality micrographs) for a variety of brains and species, with entirely open access to the data. 

Truly innovative approaches to dense reconstruction should be encouraged, with a focus on the data analysis bottleneck and greatly improved throughput. A 100- or 1000-fold improvement should be held up as a serious goal. As with other approaches to wiring, registering these dense connectivity datasets with molecular phenotypes and activity patterns during behavior will vastly increase the scientific utility and interpretability of the data. Future decisions about whether, when, and how to scale up these anatomical approaches will depend on progress in the methods described above. 

Development of these new technologies should proceed hand-in-hand with application to important problems in neuroscience. In the best case, dense reconstruction could be performed after recordings of neuronal activity and behavior in the same animal. The larval zebrafish is a promising system for a full, dense reconstruction of a vertebrate nervous system. Smaller projects in the mammalian retina, hippocampus, or cortex could have a broad impact. The important point is that broad support for large-scale, dense connectomics will only appear when it begins to yield answers to specific scientific questions that could not have emerged by other means. 

In summary, it is increasingly possible to map connected neurons in local circuits and distributed brain systems, enabling an understanding of the relationship between neuronal structure and function. We envision improved technologies—faster, less expensive, scalable—for anatomic reconstruction of neural circuits at all scales, such as molecular markers for synapses, trans-synaptic tracers for identifying circuit inputs and outputs, and EM for detailed reconstruction. The effort would begin in animal models, but some mapping techniques may be applied to the human brain, providing for the first time cellular-level information complementary to the Human Connectome Project. 

2. Neuronal Dynamics: Recording Neuronal Activity Across Time and Space 

Understanding the electrical and chemical activity of neuronal circuits and systems is central to The BRAIN Initiative®. The challenge is that these circuits incorporate neuronal activity at a variety of spatial and temporal scales. At the spatial level, an ensemble of neurons associated with a given behavioral task may be concentrated in one brain region, but not all neurons in that region may be part of the ensemble, and other important neurons will reside in different regions. For example, a circuit for conditioned fear behavior might include subsets of neurons in the primary sensory cortex and thalamus (threat sensation), the hippocampus (memory formation), the amygdala (fear learning), the autonomic nervous system (physiological output), and the prefrontal cortex (top-down control of behavioral response to the threat), among many others. To systematically study brain mechanisms underlying a particular behavior or cognitive process, it is important to sample neuronal activity broadly across brain structures and record from many identified cell types. It is also critical to measure and analyze neuronal activity at multiple time scales that are relevant to behavior and cognition: fast (e.g. neuronal spikes), intermediate (e.g. short-term plasticity, recurrent excitation) and slow (e.g. global attentional and arousal states; neuromodulation). 

In an ideal world, a neuroscientist might or might not want to know the activity of every neuron in an animal under a given condition—this is a subject of debate—but there is general agreement that we need to measure neuronal activity with much more fidelity across much larger spatial and temporal scales than we are managing at the moment. In the vast majority of experiments, we observe only a tiny fraction of the activity in any neuronal circuit, and then under a very limited range of behavioral conditions. How can we best identify the spatial and temporal patterns of activity that underlie specific cognitive processes and behaviors? How will we know when we have recorded from enough neurons to understand a cognitive process or mental state? What methods are needed to record all relevant kinds of activity in all relevant brain regions? 

2a. What Neurons Should We Record? Identifying Dispersed Circuits 

As illustrated above, it is important to scan the brain to identify distributed circuits. In general, distributed circuits must be defined functionally, based on activity of the constituent neurons during a behavior or under specific experimental conditions. While structural maps lay the foundation for our understanding, even the highest level of anatomical resolution is not sufficient to define a circuit, because synapses vary in their strength and modulation. An additional complication is that individual neurons may participate in different functional circuits under different experimental conditions or behavioral tasks. Mapping dispersed and overlapping circuits can be aided by labeling neurons that are active during a specific window of time, which permits identification of functionally related cells that are spatially intermixed with other cells. 

Existing tools for identifying functional circuits on cellular scales must be improved, and development of novel tools strongly encouraged. The relationships between cells in circuits can be rigorously established by electrophysiological recordings in which one cell is stimulated and the other’s activity is recorded, sometimes even with many neurons (e.g. laser scanning photostimulation), but this is not easily accomplished over long distances. A variety of methods involving optogenetic tools may help in this effort. For example, virally-delivered opsins and fluorescent proteins will spread throughout cells and down axons, allowing anatomically-defined optogenetic control by light delivery at the axon terminal region. Exciting defined cells or axon terminals with light while recording from a single postsynaptic cell can define sources of functional input (channelrhodopsin-assisted circuit mapping). This method in its current form is not equally effective in all settings, however, and is not easily combined with large-scale recordings. 
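
Analysis of such photostimulation mapping experiments often reduces to comparing the evoked postsynaptic response against a pre-stimulus baseline at each stimulation site. The sketch below shows, with synthetic data, one simple way such an input map might be computed; the grid size, sampling rate, and response window are invented parameters, not values from any particular study or method in this report.

```python
# Sketch of building an input map from a photostimulation grid experiment:
# for each stimulation site, compare the postsynaptic response after the light
# pulse to the pre-stimulus baseline. All numbers here are synthetic/illustrative.
import numpy as np

rng = np.random.default_rng(1)

n_rows, n_cols = 8, 8          # stimulation grid over the presynaptic region
fs = 10_000                    # sampling rate of the postsynaptic recording (Hz)
t_stim = 0.1                   # light pulse onset within each trial (s)
trial_len = 0.3                # trial duration (s)
n_samples = int(trial_len * fs)

# Synthetic postsynaptic traces, one per grid site; a few sites get an evoked deflection.
traces = rng.normal(0.0, 1.0, size=(n_rows, n_cols, n_samples))
onset = int(t_stim * fs)
for r, c in [(2, 3), (2, 4), (5, 5)]:          # pretend these sites provide input
    traces[r, c, onset:onset + 200] -= 8.0      # inward (negative) evoked current

# Input map: mean response in a post-stimulus window minus mean baseline, per site.
baseline = traces[:, :, :onset].mean(axis=-1)
response = traces[:, :, onset:onset + int(0.05 * fs)].mean(axis=-1)
input_map = response - baseline                 # negative values = putative synaptic input

print(np.round(input_map, 1))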

One class of tools for circuit mapping is based on the expression of “immediate early genes” whose expression is up-regulated by sharp increases in neuronal activity. In its original form, this approach allows each animal to be examined only once, and the cells are dead and fixed by the time they are identified. In its modern form, immediate early gene expression can be coupled to reporters such as tamoxifen-regulated Cre recombinase, allowing permanent Cre-marking of neurons that were highly active at the time that tamoxifen was delivered or removed. While useful, the existing promoters are slow reporters with unpredictable regional and cell-type specificity, and their expression is only partially correlated with neuronal activity. In short, this is a tool to begin sketching a circuit for an entirely novel stimulus, but it is not sufficient to watch ongoing, more modest changes in activity. 

Improvements in methods to identify neurons in active circuits should be encouraged. The importance of these methods will be greatest for circuits that are distributed, or intermixed with other circuits, in a way that frustrates conventional anatomical tracing. Better time resolution is highly desirable. A new method uses phosphorylation of a ribosomal protein, S6, to label active cells while allowing their mRNA expression to be characterized; this method is applicable across mammals, not just to mice. Improved transcriptional reporters have been suggested that would require the coincidence of light activation and calcium entry to induce transcription, with light used to define the point at which activity is measured. There may be entirely new ways to solve this problem through biochemical or chemical reagents that mark active neurons. The ability to mark several circuits in series in the same animal would allow more sophisticated analysis, for example for within-animal comparisons of the effect of different behavioral states on neuronal responses. 

2b. How Many Neurons Should We Record? The Test Case is Complete Circuits 

Small model systems provide a test bed for asking how much “emergent information” arises from recording an entire brain or brain structure, and provide initial clues to the density of recordings needed to characterize functional circuits. In a first example from the 1980s, voltage-sensitive dyes revealed widespread activation of over 100 abdominal ganglion neurons in the mollusc Aplysia during gill withdrawal behavior. However, other experiments argued that experience-dependent changes in gill withdrawal and its regulation could be controlled by just a few of these neurons. These contrasting observations posed important questions about the relative roles of population activity and single-neuron function that are still not answered. To assess the value added by monitoring all neurons in a circuit, complete or near-complete recordings should be gathered from a few model systems under a variety of conditions. 

It is worth noting how far we still are from the goal of recording the activity of a complete circuit, except for a few invertebrate ganglia, even though recording methods have been scaled up in recent years. For example, large-scale neurophysiological approaches have allowed recordings from thousands of neurons in the vertebrate retina, but even so only the retinal ganglion cells have been recorded at scale, not the many nonspiking bipolar and amacrine cells that process information prior to optic nerve output. Amacrine cells illustrate an additional challenge to the concept of “complete” recordings; some have subcellularly compartmentalized voltage signals that would be overlooked if recordings were made only from the soma. 

A few test cases for large-scale, “complete” neuronal recordings would be of great interest, especially if gathered in close partnership with theory and behavioral analysis to provide context and interpretation. Genetically-encoded calcium indicators are already being used to image a large fraction of neurons in brains of the larval zebrafish and the nematode worm C. elegans, although not yet at speed in behaving animals. Complete recordings from neurons within well-defined mammalian brain areas are appealing, but will require new approaches. 

2c. How Should We Record? Advancing Recording Technology 

Currently, there are two important classes of methods for recording neuronal activity. Classically, electrophysiology with electrodes has been the workhorse of neuroscience. Microelectrode and macroelectrode recordings will continue to be important due to their high temporal resolution, their applicability to structures throughout the brain, and their appropriateness for human studies. More recently, optical methods for recording activity have been greatly improved, providing substantial opportunity for further advances. Both are important areas for development. 

2c-i. Electrode Arrays for Recording Voltage and Passing Current 

Micro- and macroelectrodes are widely used tools for recording neural voltage signals and stimulating neural tissue artificially via passage of electrical current. Understanding cognitive and behavioral processes mediated by distributed neural circuits will be greatly accelerated by the development of next-generation multi-electrode arrays that can record single cell activity simultaneously from large populations of neurons at multiple sites throughout the brain. New electrode architectures and accompanying methods for extracting the measured signals are being pursued in many laboratories. Here we highlight general problems whose solution would accelerate progress on many fronts, from basic circuit research in small animals to human clinical application. 

Penetrating electrode arrays: Much recent work has been directed at developing arrays of penetrating microelectrodes using integrated circuit technology, with substantial efforts underway to increase the number of electrode shanks and the number of recording contacts per shank. Next-generation electrode arrays should also have the capability of simultaneous stimulation and recording from individual electrodes on the array, since this capability can be critical for establishing functional relationships among recorded neurons. Advances should be sought to address four primary problems currently impeding progress. First is the physical design of electrode arrays with greatly increased numbers of shanks and contacts; different designs will be needed for the complementary goals of making simultaneous recordings from very large numbers of neurons at one site (e.g. a cortical column) versus more modest numbers of neurons at many dispersed sites. Second is a major divide between “passive” and “active” electrode design. One obstacle to true high-density recording (e.g. >1000 recording contacts) is physical management of the large number of wires that must be attached to the recording contacts. This problem can be reduced substantially by incorporating “active” circuitry into the array implants that filters, amplifies, multiplexes, and telemeters the primary signals close to the recording source. Solutions to these problems will require substantial engineering innovation in miniaturizing active electronics and reducing the power needs of the circuitry. Third is the continued development of hybrid electrode arrays that combine electrical and optical recording and stimulation capabilities, increasing experimental power relative to either method alone. Finally, particularly for human use, electrode arrays must be made more compatible with the target tissues in their mechanical compliance, cross-sectional area, lifetime in the implanted tissues, and immune tolerance. 
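
Much of the on-array processing described above (filtering, amplification, spike extraction) has an offline analogue that is standard in extracellular electrophysiology: band-pass filter the raw voltage into the spike band, then detect threshold crossings relative to a robust noise estimate. The sketch below illustrates that standard analysis on a synthetic trace; the sampling rate, band edges, and threshold multiplier are common choices assumed for illustration, not recommendations from this report.

```python
# Offline sketch of the spike-band processing that next-generation arrays aim to
# perform at scale: band-pass filter, estimate noise robustly, detect threshold crossings.
# Sampling rate, filter band, and threshold are common but assumed values.
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(2)

fs = 30_000                                   # samples per second
t = np.arange(0, 1.0, 1 / fs)                 # one second of data
raw = rng.normal(0.0, 10.0, t.size)           # broadband noise (microvolts)
raw += 50.0 * np.sin(2 * np.pi * 10 * t)      # slow local-field-like component
spike_starts = rng.choice(t.size - 40, size=30, replace=False)
for s in spike_starts:                        # add brief negative-going "spikes"
    raw[s:s + 30] -= 80.0 * np.hanning(30)

# 1) Band-pass into the spike band (about 300-6000 Hz here).
b, a = butter(3, [300, 6000], btype="bandpass", fs=fs)
spike_band = filtfilt(b, a, raw)

# 2) Robust noise estimate and threshold (median-based estimate of the noise sigma).
noise_sigma = np.median(np.abs(spike_band)) / 0.6745
threshold = -4.5 * noise_sigma

# 3) Detect negative threshold crossings as candidate spike events.
below = spike_band < threshold
crossings = np.flatnonzero(below[1:] & ~below[:-1]) + 1
print("candidate spike events:", crossings.size)
```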

Surface electrodes for recording average neural activity from cortical loci are an established technology with notable recent improvements: Small surface electrodes (20 micron diameter) at very high spatial densities (20 micron separation) have been created on thin, flexible parylene substrates (4 microns thick) that “conform” well to local curvature of the cortical surface. Such conformal arrays coupled to smart electronics would be particularly useful for research and device development (e.g. prosthetics) in humans. 

Further progress will come from supporting a diverse assortment of penetrating and surface electrode array designs, with emphasis placed on designs that 1) are smarter, smaller, and less power-hungry, 2) integrate multiple capabilities (e.g. electrical and optical; recording and stimulation), and 3) can be implemented at multiple scales, from small animals to humans. In the longer term, new materials and designs may prolong the useful lifetime of implanted arrays. Next-generation electrode array design is an area that might benefit substantially from interaction with private companies with expertise in integrated circuit chip design, miniaturization, wireless telemetry, and low-power applications. Some companies with relevant expertise have expressed interest in partnering with The BRAIN Initiative®. The main obstacle to such partnerships to date has been the lack of financial incentive due to the small market. 

2c-ii. Optical Sensors of Neuronal Activity 

The ability to monitor activity in large numbers of neurons has been accelerated over the past two decades by using optical methods and tools from chemistry and genetics. Optical sensors, whether chemical or genetic, have the potential to report sub-cellular dynamics in dendrites, spines, or axons; to probe non-electrical facets of neural dynamics such as the neurochemical and biochemical aspects of cells’ activities; and to sample cells densely within local microcircuits. The capacity for dense sampling holds particular promise for revealing collective activity modes in local microcircuits that might be missed with sparser recording methods. 

Genetic tools can also target cells by genetic type or connectivity, and maintain large-scale chronic recordings of identified cells or even individual synapses over weeks and months in live animals. Such large-scale chronic recordings are especially beneficial for long-term studies of learning and memory, circuit plasticity, development, animal models of brain disease and disorders, and the sustained effects of candidate therapeutics. 

Although the acceleration in optical sensor development is relatively recent, it has already had a great impact on the field. Presently, most in vivo optical recordings are studies of neuronal or glial calcium dynamics. Neuronal calcium tracks action potentials as well as presynaptic and postsynaptic calcium signals at synapses, providing important information about both input and output signals. However, its ability to report subthreshold or inhibitory signals is variable, and while existing indicators have achieved single-spike sensitivity in low firing rate regimes, they cannot yet follow spikes in fast-spiking neurons. 
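
The relationship between spikes and indicator fluorescence is often approximated, for analysis purposes, by a simple autoregressive model in which each spike adds a transient that decays exponentially. The sketch below inverts that toy model on synthetic data to recover approximate spike times; the frame rate, decay constant, and noise level are assumptions for illustration, and practical methods (constrained or non-negative deconvolution) are considerably more sophisticated.

```python
# Toy calcium-to-spikes sketch: simulate fluorescence as an AR(1) response to spikes,
# then invert it by differencing against the decay factor.
# Frame rate, decay time constant, and noise level are assumed, illustrative values.
import numpy as np

rng = np.random.default_rng(3)

frame_rate = 30.0                              # imaging frames per second
tau = 0.5                                      # indicator decay time constant (s)
gamma = np.exp(-1.0 / (frame_rate * tau))      # per-frame decay factor
n_frames = 600

# Forward model: sparse spikes drive an exponentially decaying calcium transient.
spikes = (rng.random(n_frames) < 0.02).astype(float)
calcium = np.zeros(n_frames)
for i in range(1, n_frames):
    calcium[i] = gamma * calcium[i - 1] + spikes[i]
fluorescence = calcium + rng.normal(0.0, 0.1, n_frames)

# Naive inversion: the spike estimate is whatever the AR(1) decay cannot explain.
residual = fluorescence[1:] - gamma * fluorescence[:-1]
estimated = np.clip(residual, 0.0, None)       # spike rates cannot be negative
detected = np.flatnonzero(estimated > 0.5) + 1

print("true spike frames:     ", np.flatnonzero(spikes)[:10])
print("detected spike frames: ", detected[:10])
```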

The future of this field is not just improving calcium sensors, but generating a broad suite of optical sensors. Voltage indicators are ripe for development: by following voltage, one could in principle follow spikes and subthreshold signals, including inhibition. Several genetically encoded voltage indicators have appeared, but they do not yet have the desired combination of signal strength and speed, and could benefit greatly from disciplined, iterative improvements. Improved voltage indicators may well be genetically encoded, but other approaches from chemistry and nanotechnology should also be considered. The experience of optimizing the calcium indicators should be directly applicable to improving voltage indicators. Indicators with ultra-low background emissions hold particular importance for reliable event detection and timing estimation. 

A major advance that could emerge from optical approaches is expanding the kinds of neuronal activity that are measured. For example, synaptic transmission is a rich area of research at the single neuron level, and could be accessible at the circuit level with better tools. Electrical synapses and their regulation are essentially invisible to most current recording methods. Direct measurements of released neurotransmitters at single-cell or single-synapse resolution are highly desirable: the probability of transmitter release at a synapse can vary 100-fold, and synapses also have properties such as depression and facilitation that shape signaling in real time. Existing methods for detecting transmitters such as voltammetry are useful but have limited spatial resolution. Direct measurement of released glutamate has recently been accomplished with a genetically-encoded sensor, offering potential improvement in both spatial and temporal resolution. Tools that allow direct measurements of other transmitters such as GABA, dopamine, serotonin, and neuropeptides would provide the needed view of synapses in action. 

In the longer term, additional signaling properties may be monitored. For example, neuromodulatory states can dramatically change properties such as signaling dynamics, excitability, and plasticity. Measuring the biochemical readouts of neuromodulatory states (e.g. cAMP) may provide views of the slow processing that circuits perform in parallel to rapid computations. Glia are increasingly recognized as important players in neuronal signaling and pathology; monitoring glial activity and metabolic coupling may shed unexpected light into information processing in the brain. Monitoring synapses at a large scale could define the codes for transferring information across neuronal circuits and systems. A goal of particular interest is a way to find the synapses in a circuit that change as a result of experience and learning. 

2c-iii. Integrated Optical Approaches: Neuroscience and Instrumentation 

Optical methods capture the central vision of The BRAIN Initiative®, that of integrating many approaches into a single experiment. Optical methods can be multiplexed to combine activity monitoring, manipulation, circuit reconstruction, and characterization of a single cell’s morphology and molecular constituents (or at least a subset of the above) simultaneously. Similarly, combining electrode recording with optical methods provides added value. For example, including optical reporters like dyes or fluorescent proteins can help establish the recorded cell’s identity and connectivity. 

All of neuroscience will benefit from a streamlined integration of optical technologies for large-scale recording, optogenetic manipulation, and circuit reconstruction that allows multi-faceted studies of identified cells and circuits in individual brains. This will encompass unified development of compatible optical hardware, genetic or chemical activity reporters, and optogenetic tools. Technology for optical studies of brain dynamics and circuitry, cell types, and molecular content should be progressively developed over the long-term to attain sufficient throughput for sophisticated studies of the differences between individual subjects, in animals and in humans. 

To reach their potential, optical methods should be viewed holistically. Wavelength ranges used for next-generation multi-color optical imaging and optogenetic control should ideally be tuned for mutual compatibility. Likewise, the capabilities and limitations of optical hardware should be taken into consideration when developing new sensor molecules, and vice versa, since the collective optical system is what ultimately should be optimized. For example, in the domain of optical sensors, much work is done at the surface of brain structures because imaging deep tissues remains a problem. Red or near infrared optical indicators would improve imaging depths in scattering tissues, but complementary strategies to solve this problem may be developed at the hardware-sensor interface, for example via nonlinear optical excitation using long wavelength illumination. 

Optical engineering and photonics are rapidly progressing fields; ongoing advances in optical hardware and computational optics are likely to be highly pertinent to The BRAIN Initiative®. Recent progress in miniaturized optics and CMOS image sensor chips for mobile phones has already yielded new capabilities for fluorescence imaging of neural activity in freely behaving animals. However, most emerging optical components will not have been tailored for neuroscience applications; systems engineering of new instrumentation using these components should pay careful heed to the unique needs of neuroscience experimentation. 

Great benefit could come from short-term and sustained efforts to develop new instrumentation to improve the speed, tissue volume, tissue depth, and number of brain regions that can be monitored in live animals. These advances might come in many forms, such as: new hardware for high-speed imaging; parallelized detection systems; progress in miniature optics; novel light sources; microscopes with capabilities for large-scale recordings; wireless or automated imaging instrumentation; next-generation optical needles for imaging deep tissues; flexible optoelectronics; holographic or light-field techniques for precise optical interrogations in all three spatial dimensions; CMOS image sensors of larger size, finer pixels or built-in capabilities for image processing and automated detection of neural activity; or optical systems with scalable architectures and automated analytics for imaging in multiple animals or brain areas concurrently. Many of these instruments might exhibit both optical recording and manipulation capabilities. 

Concurrent with the emergence of integrated optical approaches, it is essential to develop computational approaches for the analysis and management of the enormous data sets the optical techniques will yield (see also section 5). Calcium imaging studies in mice produce ~1 Gbit/s of data; anatomical datasets will readily grow to the ~10 petabyte scale and beyond. Sustained efforts will be necessary to develop sophisticated analytic tools for the analysis of these experiments. Policies and methods for data sharing will also need to be developed to fully exploit the value of these datasets (see section II.8d). 
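
As a rough, hedged illustration of the scale involved, the back-of-envelope sketch below takes the ~1 Gbit/s figure above at face value; the session length, number of rigs, and sessions per year are invented assumptions used only to show how quickly such recordings reach the petabyte range.

# Back-of-envelope storage estimate, taking the ~1 Gbit/s figure above at face
# value; session lengths and rig counts below are illustrative assumptions.
GBIT = 1e9 / 8            # bytes in one gigabit
rate_bytes_per_s = 1 * GBIT

session_s = 2 * 3600      # a two-hour imaging session (assumed)
rigs = 10                 # microscopes running in parallel (assumed)
sessions_per_year = 200   # per rig (assumed)

per_session_tb = rate_bytes_per_s * session_s / 1e12
per_year_pb = per_session_tb * rigs * sessions_per_year / 1000

print(f"one session  : {per_session_tb:.1f} TB")
print(f"one lab, year: {per_year_pb:.2f} PB")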

At a deeper level, the concepts of optical imaging should be considered across other modalities such as magnetic fields or ultrasound. The value of existing technologies for human neuroscience, such as functional Magnetic Resonance Imaging (fMRI) and magnetoencephalography (MEG), is immense; developing higher-resolution methods for human use is an aspiration heard across the field. A non-invasive or minimally invasive imaging modality with cellular resolution that could interrogate large portions of the mammalian brain would represent a major advance for both animal and human studies. Any such technology that was safely applicable in humans would revolutionize our understanding of human brain function. 

2c-iv. Nanotechnology and Unanticipated Innovations 

As devices move from the micro- to the nanoscale, properties emerge that may provide new opportunities to interrogate neurons. Silicon-based nanodevices are one such example. Microwires three microns in diameter that project from the surface of a conventional electrode can achieve intracellular access to cells plated over them; nanoposts less than one micron in diameter can create gigaohm seals with intracellular access for sustained periods of time. These devices have real promise; a high priority is to move their development from cell cultures to integrated neural systems, in slices or in vivo. 

In the intermediate or long-term, revolutionary new technologies may emerge, and The BRAIN Initiative® should encourage their exploration and development. Nanodiamonds, for example, are particles whose photochemical properties—fluorescent light emission—might be tailored to exhibit sufficient sensitivity to applied electric fields to serve as optical reporters of electrical activity. With respect to neuroscience applications, however, essentially all aspects of this proposed technology are untested, from deployment of the particles within the neuronal plasma membrane to measurement of the light signals. DNA- and RNA-based technologies have been suggested as indirect reporters of neuronal activity that could be decoded by sequence. 

These and other technologies should be encouraged and given “room to breathe” but held to a standard of progress: Emphasis should be placed on supporting methods and research teams that provide a logical, clear experimental pathway from an in vitro demonstration to a set of in vivo applications in increasingly complex neuronal systems. Nanoscale recording devices are potentially beneficial in terms of increased recording density, decreased tissue damage, and for long-term intracellular recording. However, they are all in very early stages, and moving this technology from in vitro to in vivo application is a significant challenge that will require sustained interaction between nanoscientists and neuroscientists. 

In summary, we should seize the challenge of recording dynamic neuronal activity from complete neural networks, over long periods, in all areas of the brain. There are promising opportunities both for improving existing technologies and for developing entirely new technologies for neuronal recording, including methods based on electrodes, optics, molecular genetics, and nanoscience, and encompassing different facets of brain activity, in animals and in some cases in humans. 

 

3. Manipulating Circuit Activity 

Observing natural patterns of neural activity generates hypotheses about their functional significance, but causal tests of such hypotheses require direct manipulation of the underlying neural activity patterns. In the 1950s, Penfield’s electrical stimulation experiments suggested that a memory or thought could be elicited by activating neurons in the underlying network. In the intervening years, electrical, chemical, and genetic methods for stimulating or inhibiting neurons have provided numerous insights. Currently, stimulating electrodes are being placed in human patients for spinal cord stimulation and deep brain stimulation (DBS), among other therapies. Despite these successes, current human stimulation methods lack precision and specificity, and could benefit from technological advances. In non-human neuroscience, a major recent advance in circuit manipulation has been the development of optogenetic tools based on light-activated channels and pumps. The combination of rapid activation, reliable effects, and genetic delivery of the optogenetic channels to specific cell types and brain regions has revolutionized modern neuroscience. Optogenetic tools for depolarizing and hyperpolarizing neurons have proved to be a general method for testing and generating hypotheses of brain function across systems, brain regions, and (non-human) species. 

The existing optogenetic tools, generally based on Channelrhodopsin (depolarizing) and Halorhodopsin or Archaerhodopsin (hyperpolarizing), have been transformative but are not perfect for all uses. They have been subjected to multiple rounds of genetic engineering to optimize them for different purposes, improvements that demonstrate the importance of iteration in tool development. Nonetheless, new advances could make them still more useful. In the present versions, the light-induced currents are generally small, and the blue/ultraviolet light that most of these tools prefer is toxic to biological tissue and does not penetrate deep into the brain. Most of the tools have absorption spectra that overlap with each other and with the genetically encoded sensors of neural activity; they would be more useful if the spectra were separate. As mentioned earlier, improvements in optical physics may provide benefits for existing classes of tools, for example by allowing stimulation in complex, rapidly evolving patterns that imitate measured natural patterns of circuit activity. These kinds of improvements are incremental, but their cumulative impact could be substantial because these tools are so widely used. 

There are broader possibilities for manipulating neuronal activity in vivo. Chemogenetic tools (such as RASSLs, DREADDs, and chemical-genetic switches for kinases and channels) are already a useful complement to optogenetics for long-term manipulation, and this is another area that will benefit from continued improvement. Entirely new tools could be developed based on magnetic stimulation, gases, infrared excitation, ultrasound, or organic or physical chemistry to allow access to neurons deep within the brain. Techniques of this sort could also allow independent access to multiple circuits, or independent tools for monitoring and manipulating neurons. Noninvasive and non-genetic approaches will be particularly important for human neuroscience. 

There is also substantial room for growth in modulating more subtle aspects of neuronal function, not just depolarization and hyperpolarization. Among the tools that could have enormous impact are tools for silencing or activating particular synapses; affecting neuropeptide release independently of neurotransmitter release; or activating or inhibiting second-messenger cascades in real time, including those that mediate growth factor and neuromodulatory signals. 

In summary, by directly activating and inhibiting populations of neurons, neuroscience is progressing from observation to causation, and much more is possible. To enable the immense potential of circuit manipulation, a new generation of tools for optogenetics, chemogenetics, and biochemical and electromagnetic modulation should be developed for use in animals and eventually in human patients. Emphasis should be placed on achieving modulation of circuits in patterns that mimic natural activity. 

4. The Importance of Behavior 

How can we discern the meaning of the complex, dynamic activity patterns in the brain? Neurologists often gain insight into a human brain disorder by observing a person’s behavior, supplemented by the person’s verbal reports. For example, by measuring the behavior of the patient H.M. on objective tests, and by interacting with him on numerous occasions, the neuropsychologist Brenda Milner was able to demonstrate that his damaged hippocampus prevented him from forming explicit memories of events, but not implicit memories or habits. In non-human animals, however, we must rely exclusively on behavioral observation and measurement to gain insight into their cognitive processes. To understand an animal’s perception, cognition, or emotion, we must start by observing its actions. 

Applying next-generation recording and manipulation tools in combination with insightful evaluation of behavior will become increasingly important to neuroscience. Behavioral metrics are most useful if they are objective and reliable, but they should also permit the study of rich behaviors appropriate to the species. Among existing behavioral methods, formal psychophysics has been powerful because of its use of detection tasks and choice (discrimination) tasks, which are standardized, quantifiable, and easily related to theoretical models such as signal detection theory. An alternative set of methods based on neuroethology examines freely moving animals in naturalistic environments, where the behaviors for which the animal has evolved can be expressed. These experiments are typically combined with small implanted recording devices on flexible tethers that allow free movement. A newer set of methods restrains the animal partially by holding its head fixed, but allows it to move on a tracking ball, giving it the perception of movement. These “fly on a ball” and “mouse on a ball” experiments can be combined with virtual reality environments in a closed-loop configuration, in which the animal’s behavioral choices result in apparent changes in its environment; importantly, they permit the simultaneous use of complex optical or electrophysiological recording systems. 

Advanced techniques for manipulating, tracking, and analyzing animal behavior will be crucial components of recording and optogenetic experiments. Since no two animals have identical brains, matching neuronal and behavioral dynamics will be best achieved by conducting behavioral experiments over long durations in a single individual. Given the capabilities for tracking individual cells over many weeks in the living brain, designs of behavioral assays should fully exploit these long-term capabilities to reveal how the brain supports learning and memory, how the brain is altered in disease states, and how it responds to therapeutic manipulations. Building this behavioral capability should be a priority for The BRAIN Initiative®. Ultimately, neuronal recordings and manipulations should intelligently scan as much as possible of the animal’s behavior and cognitive repertoire. 

One avenue for further growth is a more detailed understanding of behavioral dynamics. This, in turn, requires a framework for capturing animal behaviors in video or audio format, segmenting and classifying them, and determining their duration and the transitions between them. In an advance over methods requiring expert human observers, an emerging area combines neuroscience with machine vision and machine learning to automate analysis of high-dimensional behavioral data from video and auditory recordings. Subsequent behavioral segmentation and classification can be specified by the scientist (supervised learning) or detected entirely by a computer (unsupervised learning). This automated quantification of behavior has many appealing features that support its further development. It provides a high level of objectivity and consistency; it is labor-saving, enabling high-throughput and high-content analysis of behavior; and it has the potential to uncover new behavioral patterns that have been overlooked by human observers. 
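
As a hedged illustration of the unsupervised case, the sketch below clusters simulated per-frame behavioral features with k-means; the feature array (standing in for tracked limb positions and velocities), the number of motifs, and the use of scikit-learn are all assumptions rather than a description of any specific published pipeline.

# Hedged sketch: unsupervised segmentation of behavior from per-frame features.
# Assumes an upstream tracker has already turned video into a (frames x features)
# array (e.g., limb positions, velocities); here that array is simulated.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Simulate 3 latent "behavioral motifs", each a cluster in a 6-D feature space.
motifs = rng.normal(size=(3, 6))
labels_true = rng.integers(0, 3, size=5000)              # motif per frame
features = motifs[labels_true] + 0.2 * rng.normal(size=(5000, 6))

# Unsupervised case: let the algorithm discover the motifs.
labels_pred = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

# Summarize motif usage and dwell times (how long each bout lasts, in frames).
bout_lengths = np.diff(np.flatnonzero(np.r_[True, np.diff(labels_pred) != 0, True]))
print("frames per discovered motif:", np.bincount(labels_pred))
print("mean bout length (frames):  %.1f" % bout_lengths.mean())

In the supervised alternative mentioned above, the same feature array would instead be paired with expert-provided labels and fed to a classifier; the clustering step is simply replaced by supervised training.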

In summary, the clever use of virtual reality, machine learning, and miniaturized recording devices has the potential to dramatically increase our understanding of how neuronal activity underlies cognition and behavior. This path can be enabled by developing technologies to quantify and interpret animal behavior, at high temporal and spatial resolution, reliably, objectively, over long periods of time, under a broad set of conditions, and in combination with concurrent measurement and manipulation of neuronal activity. 

 

5. Theory, Modeling, and Statistics Will Be Essential to Understanding the Brain 

Large brain data sets are accumulating at an unprecedented rate that will accelerate over the next decade as The BRAIN Initiative® gathers momentum. The goal of brain theory is to turn this knowledge into understanding, but this is a formidable task. Brains—even small ones—are dauntingly complex: information flows in parallel through many different circuits at once; different components of a single functional circuit may be distributed across many brain structures and be spatially intermixed with the components of other circuits; feedback signals from higher levels constantly modulate the activity within any given circuit; and neuromodulatory chemicals can rapidly alter the effective wiring of any circuit. In complex systems of this nature, our intuitions about how the activity of individual components (e.g. atoms, genes, neurons) relate to the behavior of a larger assembly (e.g. macromolecules, cells, brains) often fail, sometimes miserably. Inevitably, we must turn to theory, simulation, and sophisticated quantitative analyses in our search to understand the underlying mechanisms that bridge spatial and temporal scales, linking components and their interactions to the dynamic behavior of the intact system. 

Theory, modeling and statistics play at least four key roles in our effort to understand brain dynamics and function. First, for complex, frequently counterintuitive systems like the brain, mathematical modeling and simulations can organize known data, assist in developing hypotheses about underlying mechanisms, make predictions, and thus assist in designing novel experiments to test the hypotheses. Second, confirmatory statistical analysis allows us to move in the inverse direction after data collection, using formal inference approaches to support or disprove a stated theory or hypothesis. Third, exploratory data-mining techniques can be used to detect interesting regularities in complex data even when information cannot yet be summarized with the aid of forward modeling or when prior hypotheses do not yet exist. Data-mining techniques are powerful, but genuine understanding of underlying mechanisms will typically require subsequent confirmatory analyses of the usual hypothesis-testing variety. Finally, formal theory seeks to infer general principles of brain function that unify large bodies of experimental observations, models, and simulation outcomes. The brain computes stably and reliably despite its construction from billions of elements that are noisy and constantly adapting and re-calibrating. Elucidation of the general principles underlying this remarkable ability will have a profound impact on neuroscience, as well as on engineering and computer science. 

5a. Combining Theory, Modeling, Statistics, and Experiments 

Theory and modeling have illuminated numerous areas of neuroscience in the past: the mechanisms of action potential generation (Hodgkin-Huxley), synaptic plasticity (Hebb), visual motion computation (Hassenstein-Reichardt), the efficiency of sensory codes (Barlow), the role of inference and priors in perception (Helmholtz), the role of dopaminergic systems in computing prediction errors for reinforcement learning (Schultz, Sutton, Dayan), and decision-making under uncertainty (Green and Swets, Luce). True partnerships between theorists and experimentalists will yield large dividends for almost every conceptual and experimental problem to be tackled under The BRAIN Initiative®. 

Modeling and theory developed through close collaborations of theorists and experimentalists are most likely to yield penetrating insight and drive creative experimental work. Data gathered by an experimentalist uninformed by theory, even excellent quality data, may not be the data that will generate the most definitive conclusions or greatest conceptual clarity. Similarly, theorists who participate actively in the acquisition of data are more likely to acquire a biological sense of the system, its reliability, and its limitations, increasing the likelihood that their theories reflect biological reality and make predictions that are feasible for experimental verification. 

Ideally, theorists and statisticians should be involved in experimental design and data acquisition, not just recruited at the step of data interpretation. The close working relationships we envision could be supported by grant opportunities that require the participation of a statistician or theorist in collaboration with an experimentalist; a highly successful model for this has been the joint NIH-National Science Foundation (NSF) Collaborative Research in Computational Neuroscience (CRCNS) program. 

We next highlight a few of the many areas that appear promising for the collaborative efforts of theorists and experimentalists under the goals of The BRAIN Initiative®. 

5a-i. New Statistical and Quantitative Approaches to New Kinds of Data 

The next generation of neural activity recordings will be different from previous ones. All signs point to a major increase in the quantity of neuronal recordings, but the quality of neuronal recordings will also change during The BRAIN Initiative®. Electrophysiological techniques are being complemented increasingly by optical recording methods, which will dictate different analytic approaches. The ability to identify specific cell types, and map this information onto activity maps, will provide an additional dimension to these rich data sets. The ability to record multiple forms of activity simultaneously — spiking, subthreshold activity, synchrony, neuromodulatory states — will increase as well. We will need new tools to analyze these complex datasets, as well as new tools and algorithms for data acquisition and interpretation: e.g., 1) different types of measurement will need to be fused to extract meaningful information, 2) the sheer amount of data will demand highly efficient algorithms, 3) in some cases, analyses will need to be done in real time, either because data volumes are too large for storage, or because the experiment is designed to adapt to the responses, or for applications such as neural prosthetics. 
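
To illustrate point (3) concretely, the sketch below shows one minimal pattern for real-time processing: a streaming detector that maintains only running statistics (Welford's online mean and variance), so that raw samples never need to be stored. The threshold, the synthetic signal, and the transient amplitude are assumptions, not a recommended analysis.

# Sketch of point (3): a streaming detector that keeps only running statistics,
# so raw samples never need to be stored.  Thresholds and signal are assumptions.
import numpy as np

class StreamingZScoreDetector:
    """Welford's online mean/variance plus a simple z-score threshold."""
    def __init__(self, z_threshold=5.0):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.z_threshold = z_threshold

    def update(self, x):
        # Update running mean/variance without storing past samples.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        std = (self.m2 / self.n) ** 0.5 if self.n > 1 else float("inf")
        return abs(x - self.mean) > self.z_threshold * std   # event flag

rng = np.random.default_rng(2)
detector = StreamingZScoreDetector()
events = 0
for i in range(100_000):                       # pretend this is a live stream
    # Inject an occasional large transient into otherwise Gaussian noise.
    sample = rng.normal() + (8.0 if (i > 0 and i % 10_000 == 0) else 0.0)
    events += detector.update(sample)
print("events flagged:", events)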

As new kinds of data become available through advances in molecular sensors and optical recording, equal effort must be expended to extract maximum insight from these novel and often complex data sets. Data analytic and theoretical problems are likely to emerge that we cannot anticipate at the present time. Resources should be available for experts from essential disciplines such as statistics, optimization, signal processing, and machine learning to develop new approaches to identifying and analyzing the relevant signals. 

5a-ii. Dimensionality and Dynamics in Large-Scale Population Recordings 

In large-scale population recordings like those envisioned under The BRAIN Initiative®, the issue of dimensionality is critical. Dimensionality reduction techniques detect correlated activity among subsets of the sampled neuronal population, identifying ensembles of neurons that might be functionally related to each other in interesting ways. In current recordings of tens-to-hundreds of neurons, the dimensionality of the data is typically much lower than the number of recorded neurons, encouraging the notion that different behavioral variables or neural circuitry constraints might be reflected in the activity of neurons in different ensembles. Low dimensionality is also interesting because it implies that one can capture the major sources of variation in a system by recording from a relatively small proportion of its neurons. Plainly, the issue of data dimensionality has substantial implications for what our experimental goals should be in large-scale population recordings. In a low-dimensional system, it would be far more important to sample neuronal activity strategically than to record from every neuron in the system. 

There is reason for caution, however. The number of apparent dimensions in the data may be artificially low if the behavioral task is too simple or if neuronal activity is not measured for sufficiently long periods of time. Thus it is essential not only to record more neurons, but also to increase behavioral and stimulus complexity in order to obtain richer, higher-dimensional data sets. A second problem is that many of these methods, such as independent component analysis, principal component analysis, support vector machines, and graphical models, are designed for problems in which the structure in the data is static. More sophisticated statistical methods exist to analyze time series, and should be further developed to analyze the highly nonlinear dynamic structure of most neuroscience data. 
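
A hedged numerical illustration of the low-dimensionality point: in the sketch below, 200 simulated neurons are driven by only three shared latent factors, and principal component analysis (computed via the singular value decomposition) shows that most of the variance lies in three dimensions. The simulated population, noise level, and factor count are assumptions; as the caution above notes, richer tasks may expose dimensions that such an analysis on simple data would miss.

# Hedged illustration of low-dimensional structure in a population recording:
# 200 simulated neurons driven by only 3 shared latent factors plus noise.
import numpy as np

rng = np.random.default_rng(3)
n_neurons, n_timepoints, n_latents = 200, 2000, 3

latents = rng.normal(size=(n_timepoints, n_latents))          # shared signals
loadings = rng.normal(size=(n_latents, n_neurons))            # per-neuron weights
activity = latents @ loadings + 0.5 * rng.normal(size=(n_timepoints, n_neurons))

# PCA via SVD on the mean-centered data.
centered = activity - activity.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
var_explained = s**2 / np.sum(s**2)

# The first 3 components capture most of the variance even though 200 neurons
# were "recorded" -- the kind of low dimensionality discussed in the text.
print("variance explained by first 5 PCs:", np.round(var_explained[:5], 3))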

The dynamics of neural activity in single cells, small circuits, and large populations are complex, and include nonstationary calibration, adaptation, and learning mechanisms that occur simultaneously and on different time scales. These are currently not well understood, and will require development of new theoretical and analysis tools based on control theory, information theory, and nonlinear dynamical systems. 

Resolving the theoretical issues associated with dimensionality and dynamics, and developing new techniques and complex behavioral paradigms that can release the potential power of large neural data sets, are important contributions that theory and modeling can make to The BRAIN Initiative®. 

5a-iii. Linking Activity Across Spatial and Temporal Scales 

One of the most remarkable properties of nervous systems is that the temporally-enduring behavior of an organism emerges from the collective action of molecules and cells operating on time scales many orders of magnitude shorter. At the cellular level, information is encoded in the patterns of action potentials generated by individual neurons, each enduring for roughly a millisecond. Yet a working memory may last tens of seconds, and a single purposeful behavior can extend for minutes or hours. 

The integration of information across temporal scales is a problem that will involve biochemical signaling pathways that outlast an electrical input, circuit properties such as ‘attractors’, in which a large population of interacting nerve cells achieves an enduring activity state, and experience-dependent changes in the strength of synaptic connections between neurons. Meaningful models of brain activity will need to incorporate events at all of these temporal scales. 
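
As a minimal, hedged illustration of how recurrent circuit dynamics can bridge time scales, the rate-model sketch below shows a single unit with a 10 ms intrinsic time constant that, when recurrent excitation is strong enough, converts a 50 ms input pulse into activity that persists for seconds, in the spirit of the 'attractor' mechanisms mentioned above. All parameters are illustrative assumptions, not fits to data.

# Minimal rate-model sketch (parameters are illustrative): a unit with a 10 ms
# time constant, driven by a brief 50 ms input, holds its activity for seconds
# when recurrent excitation is strong enough.
import numpy as np

dt, tau = 0.001, 0.010            # 1 ms steps, 10 ms membrane time constant
t = np.arange(0, 5, dt)           # simulate 5 seconds
stim = (t < 0.05).astype(float)   # 50 ms input pulse

def simulate(w_recurrent):
    r = np.zeros_like(t)
    for i in range(1, t.size):
        drive = w_recurrent * r[i - 1] + 2.0 * stim[i - 1]
        r[i] = r[i - 1] + dt / tau * (-r[i - 1] + np.tanh(drive))
    return r

weak, strong = simulate(0.5), simulate(1.5)
# With weak recurrence activity decays back to zero within tens of ms; with
# strong recurrence it settles into a self-sustaining 'attractor' state.
print("rate 3 s after the pulse, weak recurrence  : %.3f" % weak[int(3 / dt)])
print("rate 3 s after the pulse, strong recurrence: %.3f" % strong[int(3 / dt)])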

No less impressive is the extension of coordinated neural activity over large spatial scales within the brain. A purposeful behavior as simple as an eye movement can involve millions of neurons distributed across more than a dozen brain areas. How is widespread activity in the cortex, basal ganglia, thalamus, midbrain, and brainstem orchestrated to achieve a single behavioral goal? 

Theory, modeling, and biophysically realistic simulations will play a critical role in deepening our understanding of these and many similar phenomena. Theory can help us develop hypotheses and design the right experiments to ask how purposeful systems-level behavior emerges at extended spatial and temporal scales. 

5a-iv. Flexible Behavior and Decision-Making 

Much of our knowledge of brain function comes from experimental measurements from one or a few neurons in a single brain area. As we attempt to understand the remarkably complex behavioral and cognitive abilities exhibited by humans, we will need to consider the interactions within and between larger neural systems and brain areas. Complex actions are driven by simultaneous inputs from multiple sensory modalities, as well as internal states and memories that represent goals, constraints, and preferences. These actions are readily adapted to different environments and contexts, and they can be learned and refined with experience. In addition to humans, many animals also demonstrate behaviors that can be flexibly reshaped or adapted according to context or task requirements. For example, neurons in the frontal lobes of non-human primates are centrally involved in decisions, and have been shown to respond to both task-relevant and task-irrelevant sensory stimuli, along with signals related to behavioral choices. The representation of these attributes within neuronal ensembles can change markedly during the execution of any given task. We do not yet have a systematic theory of how information is encoded in the chemical and electrical activity of neurons, how it is fused to determine behavior on short time scales, and how it is used to adapt, refine, and learn behaviors on longer time scales. Finally, humans and perhaps some animals have the capacity for symbolic computation using language and in other domains as well; a brain-based theory of these higher functions is notably lacking. 

Coordinated work in theory, modeling and experiment will be required to understand the mechanisms of context-dependent information flow in the brain, which lies at the heart of flexible behaviors such as decision-making. 

In summary, rigorous theory, modeling, and statistics are advancing our understanding of complex, nonlinear brain functions where human intuition fails. New kinds of data are accruing at increasing rates, mandating new methods of data analysis and interpretation. To enable progress in theory and data analysis, we must foster collaborations between experimentalists and scientists from statistics, physics, mathematics, engineering, and computer science. 

 

6. Human Neuroscience and Neurotechnology 

A primary goal of The BRAIN Initiative® is to understand human brain function in a way that will translate new discoveries and technological advances into effective diagnosis, prevention, and treatment of human brain disorders. The study of human brain function faces major challenges because many experimental approaches applicable to laboratory animals cannot be deployed in humans. Nevertheless, direct study of the human brain is critical because of our unique cognitive abilities as well as the profound personal and societal consequences of human brain disorders. 

Improvements to existing technologies like magnetic resonance imaging (MRI) and positron emission tomography (PET) have revolutionized our ability to noninvasively study the structure, wiring, function, and chemistry of the human brain. Other important opportunities are emerging from the increasing number of humans who are undergoing diagnostic brain monitoring with recording or stimulating electrodes, or are receiving neurotechnological devices for therapeutic applications or investigational studies (e.g. DBS). Some of the most promising new opportunities involve combining these and other techniques to cross barriers of spatial and temporal scale that have impeded progress in the past. For example, the inability to measure activity and chemistry at the cellular level with noninvasive tools creates significant uncertainty about the functional meaning of some of the recorded signals. We can potentially address this problem by combining noninvasive brain imaging with higher resolution data obtained from diagnostic monitoring of humans or from implanted devices in humans. Insight is also emerging by combined measurement of noninvasive and cellular-level signals in animal models. Breaking these barriers of scale (Section 6a-ii) would yield substantial benefits for diagnosis and treatment of disease as well as for basic discovery about the human brain. These and other creative approaches for understanding human brain function should receive vigorous support under The BRAIN Initiative®. 

6a. Human Brain Imaging 

The last twenty years have seen explosive growth in the development and use of noninvasive brain mapping methods, predominantly MRI, complemented by MEG and electroencephalography (EEG), to investigate the human brain under normal and pathological conditions, and across the human lifespan. In the future, we anticipate significant progress in using these methods to measure the wiring diagram and functional activity of the human brain at multiple scales—neuronal ensembles, circuits, and larger scale networks (‘circuits of circuits’). In turn, these capabilities will allow us to visualize and understand circuit-level disruptions related to human brain diseases. Brain imaging techniques are also valuable for evaluating the effects of pharmacological treatments and non-invasive brain stimulation methods, or for validating other functional measurement methods like near-infrared spectroscopy. 

6a-i. MRI Approaches 

MRI techniques contribute extensively to human neuroscience in three broad ways: fMRI enables correlation of functional brain activity with cognition and behavior; diffusion-weighted MRI (DW MRI) provides estimates of the trajectories of long-distance pathways in the white matter; resting-state fMRI (rfMRI) enables us to deduce ‘functional connectivity’ between dispersed brain regions. All of these investigations will be accelerated greatly if the spatial and temporal sampling limitations of the magnetic resonance (MR) measurements can be reduced. The vast majority of current MR studies aim for whole-brain coverage and achieve spatial resolution of 2 mm (isotropic) or coarser. An 8 mm³ voxel from such an image contains a veritable world of smaller circuit components—more than 600,000 neurons and glial cells, many columnar ensembles, intermixed cortical layers, or (in the white matter) several fasciculated fiber bundles that may cross each other, fan out, or turn within that volume. 

The problem of relating MR signals to the underlying circuitry can be ameliorated somewhat by improved physical measurements. For example, in precisely targeted experiments using high field-strength magnets, hemodynamically based MR measurements have reached spatial scales below 1 mm—to the level of cortical columns and individual cortical laminae. Even at this scale, however, MR measurements reflect a complex combination, at the vascular level, of the electrochemical activity of many thousands of neurons and glia. Thus it is critically important to develop and experimentally validate theories of how MR signals arise from the underlying cellular-level activity, as we consider in the next section. 

6a-ii. Bridging Spatial Scales 

To link the integrative functional signals measured by MRI to cellular-resolution activity, two separate issues warrant intensive investigation. First, we must firmly establish how the electrical and chemical activity in different populations of excitatory and inhibitory neurons, glial cells, axons, and presynaptic terminals contribute to the local vascular response—the classic neuro-glio-vascular coupling problem. This knowledge is immediately relevant not only for interpretation of fMRI signals, but also for investigation of neurological and psychiatric diseases in which a disruption in neuroglial communication and/or deterioration of neurovascular coupling contributes to motor and cognitive decline. 

The second issue, which has received much less attention, concerns which aspects of the information encoded in the neural activity of the neuronal populations within an imaged voxel are captured in the hemodynamic response. By analogy to bridging scales in physics, the basic intellectual question is how the electrical and chemical activity of neurons, glia, and synapses in imaged voxels is integrated or averaged to generate a hemodynamic signal. These questions can be best answered by direct comparison of cellular-resolution population activity of neurons with hemodynamic response measurements, ideally in animals engaged in sophisticated behavior likely to invoke the full computational power of neural circuits. Additional approaches include the use of optogenetics, pharmacology, and mutant animals to perturb neuronal signals and directly observe the effects at the level of fMRI. These approaches offer the opportunity to bridge anatomical and physiological scales, going from cellular-resolution neuronal and glial activity to macroscopic circuits, networks, and ultimately behavior. 

DW MRI estimates the orientation of axonal fiber bundles, capitalizing on the fact that water diffuses most rapidly along the length of axons. By probing at many different orientations, DW MRI can estimate not only the dominant fiber orientation in each voxel, but also the orientation of crossing fiber bundles, which are very common in the white matter thicket. Tractography algorithms combine information across a succession of voxels to estimate the overall trajectory of long distance pathways. Some of the problems faced by DW MRI in charting anatomical connectivity in the human brain, such as crossing fibers, fiber fanning, and so forth, may be alleviated by higher spatial resolution and more sensitive imaging techniques. The benefits of improved resolution have already been demonstrated by results from the Human Connectome Project, but these achievements still fall short of what is needed. Our ability to infer structural connectivity patterns using DW MRI will be further improved by using anatomically informed priors that are based on accurate statistical models of the distribution of trajectories taken by axons and fiber bundles in white matter. Such information can be obtained from cutting-edge microscopy methods (e.g., optical coherence tomography, polarized light imaging, CLARITY) and used to improve the fidelity of DW MRI-based tractography methods, thus bridging microscopic and macroscopic scales at the anatomical level. It is critical that at each stage of the process of improvement, the overall validity of DW MRI-based tractography is demonstrated by direct comparison with anatomical tract tracing in animal models, including non-human primates. 
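
The core idea of streamline tractography can be sketched in a few lines: starting from a seed, repeatedly step along the local principal fiber direction. The toy example below uses an invented, smoothly curving orientation field and an arbitrary step size; real pipelines must additionally handle crossing fibers, stopping criteria, probabilistic sampling, and the anatomically informed priors discussed above.

# Toy sketch of the tractography idea: step a streamline through a field of
# per-voxel principal fiber directions.  The synthetic field (fibers curving in
# the x-y plane) and step size are assumptions; real pipelines handle crossing
# fibers, stopping criteria, and anatomical priors.
import numpy as np

def principal_direction(pos):
    """Synthetic orientation field: fibers follow concentric arcs."""
    x, y, _ = pos
    d = np.array([-y, x, 0.0])                 # tangent to a circle about origin
    return d / (np.linalg.norm(d) + 1e-9)

def track(seed, step=0.1, n_steps=300):
    pos, path = np.array(seed, float), [np.array(seed, float)]
    direction = principal_direction(pos)
    for _ in range(n_steps):
        new_dir = principal_direction(pos)
        if np.dot(new_dir, direction) < 0:      # keep a consistent orientation
            new_dir = -new_dir
        pos = pos + step * new_dir              # Euler step along the fiber
        direction = new_dir
        path.append(pos.copy())
    return np.array(path)

streamline = track(seed=(10.0, 0.0, 0.0))
radii = np.linalg.norm(streamline[:, :2], axis=1)
print("start radius %.1f, end radius %.1f (stays near 10 if the arc is followed)"
      % (radii[0], radii[-1]))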

Thus, specific opportunities for MRI technology under The BRAIN Initiative® include submillimeter spatial resolution descriptions of neuronal activity, functional and structural connectivity, and network analysis in the human brain through advances in instrumentation, data acquisition and analysis techniques, and theoretical modeling linking activity with behavior. 

6a-iii. Resting State fMRI and Brain Network Dynamics 

rfMRI enables inferences to be made about functionally connected networks that may be widely dispersed within the brain. It relies on the observation that functionally related areas that are co-activated during performance of a task also exhibit correlated spontaneous fluctuations when subjects are simply "resting" in the MR scanner. Many large-scale correlated temporal patterns, referred to as resting state networks (RSNs), have been identified in this manner. RSNs persist during sleep and under anesthesia, and are consistent across subjects and to some extent across species. Importantly, RSNs display some degree of correspondence with anatomical connections, but the correspondence is far from perfect. RSNs appear to reflect functional coordination operating across multiple synapses within a circuit, providing information about correlated activity that is difficult to infer from anatomical maps alone. 
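
A hedged toy example of the seed-based correlation analysis underlying such inferences is sketched below: two simulated groups of voxels each share a slow fluctuation, and correlating one seed voxel's time course with every other voxel recovers its network. The time series, noise level, and network sizes are assumptions, and real analyses involve preprocessing and nuisance-signal removal not shown here.

# Toy sketch of seed-based 'functional connectivity': correlate one seed voxel's
# spontaneous time course with every other voxel.  Data are simulated -- two
# groups of voxels each share a slow fluctuation, standing in for two RSNs.
import numpy as np

rng = np.random.default_rng(4)
n_timepoints = 600                          # e.g., 10 min at a 1 s TR (assumed)
net_a = rng.normal(size=n_timepoints)       # shared fluctuation, network A
net_b = rng.normal(size=n_timepoints)       # shared fluctuation, network B

voxels = np.vstack([net_a + 0.8 * rng.normal(size=n_timepoints) for _ in range(50)] +
                   [net_b + 0.8 * rng.normal(size=n_timepoints) for _ in range(50)])

seed = voxels[0]                            # a voxel belonging to network A
r = np.array([np.corrcoef(seed, v)[0, 1] for v in voxels])

# High within-network and near-zero between-network correlations are the kind
# of structure summarized as resting state networks in the text.
print("mean correlation with network-A voxels: %.2f" % r[:50].mean())
print("mean correlation with network-B voxels: %.2f" % r[50:].mean())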

Recent developments that significantly accelerate whole-brain functional imaging (e.g., whole-brain images at 2 mm isotropic resolution acquired in less than a second) have enabled major improvements in spatial and temporal sampling of resting state fluctuations, leading to major gains in statistical power to detect RSNs, increases in the number and granularity of network nodes, and an enhanced ability to analyze neural dynamics (e.g., detection of regions dynamically participating in different networks). These recent advances emphasize the importance of gains enabled by improving the spatial and temporal resolution of MRI data, and the need to push the technology further. 

Changes in RSNs have been implicated as possible biomarkers for functional classification in several cognitive disorders, and have enormous potential for further development in this area. Whether such rfMRI derived networks and/or network dynamics can inform us about individual differences in psychiatric disorders or guide individualized therapies is yet to be determined, but it is one of the significant potential payoffs of applying improved MRI methods and analysis techniques to be developed under The BRAIN Initiative®. 

6b. EEG and MEG 

EEG and MEG provide a unique capability for noninvasive analysis of human brain activity with high temporal resolution. Numerous studies have demonstrated the merits of EEG/MEG for detecting neural correlates of a broad range of human cognitive processes as well as brain disorders such as epilepsy. The simplicity and mobility of EEG monitoring systems have facilitated the study of human brain signals in naturalistic settings. 

EEG/MEG is limited in its spatial resolution. Localizing EEG signals to specific brain structures (source imaging) has benefitted significantly from the a priori anatomic constraints measured with structural MRI. This synergistic interaction highlights the value of combining data across imaging measurement modalities. Recent advances in source imaging have significantly improved localization of event-related brain activity in healthy human subjects and of interictal spikes in epilepsy patients. A significant challenge for the future is to develop advanced source imaging techniques that can map spontaneous brain activity, including RSNs in healthy subjects, as well as abnormal network connectivity associated with neurological or psychiatric disorders. 
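
For readers unfamiliar with source imaging, the sketch below illustrates the underlying linear inverse problem with a minimum-norm-style regularized estimate. The random lead field, noise level, and regularization parameter are assumptions; real lead fields are computed from MRI-based head models, which is precisely where the anatomical constraints mentioned above enter.

# Hedged sketch of the linear inverse problem behind EEG/MEG source imaging:
# sensors = leadfield @ sources + noise, solved with a minimum-norm-style
# regularized estimate.  The random lead field and noise level are assumptions;
# real lead fields come from anatomical MRI-based head models.
import numpy as np

rng = np.random.default_rng(5)
n_sensors, n_sources = 64, 500
leadfield = rng.normal(size=(n_sensors, n_sources))

true_sources = np.zeros(n_sources)
true_sources[[40, 41, 42]] = 1.0                      # a small active patch
measurements = leadfield @ true_sources + 0.05 * rng.normal(size=n_sensors)

# Minimum-norm estimate: x_hat = L^T (L L^T + lambda I)^{-1} y
lam = 1.0
gram = leadfield @ leadfield.T + lam * np.eye(n_sensors)
x_hat = leadfield.T @ np.linalg.solve(gram, measurements)

# The estimate is spatially blurred (many sources share the energy), which is
# one reason anatomical priors from MRI improve localization, as noted above.
top = np.argsort(np.abs(x_hat))[-5:]
print("largest estimated sources at indices:", sorted(top))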

An important opportunity lies in integrating high temporal resolution EEG (and MEG) source imaging with high spatial resolution fMRI. Significant progress has been made to leverage the complementary nature of EEG and fMRI, which can be performed simultaneously in an MRI scanner. Challenges remain in understanding the correlation between BOLD signals and electrophysiological events via neurovascular coupling, in developing technologies for high-fidelity EEG recordings during fMRI, and in using simultaneously acquired fMRI data to enhance the performance of EEG source imaging. It is noteworthy that MEG/EEG methods fail to record spiking activity (the output signals of most neurons); this scale is missing in non-invasive recordings. New methods for recording neuronal spiking externally would have a very large impact. 

6c. PET and Neurochemistry 

Assessment of dynamic neurochemical and other molecular events has been relatively neglected in recent years, in large measure because the techniques are difficult compared to fMRI. While MR spectroscopy offers a view of some important intrinsic molecules, it has not yet shown chemical specificity for neurotransmitter/receptor interactions. Optical and MR-based imaging methods offer significant potential for molecular imaging in animal models using exogenous probes, but the translation of these methodologies to humans is not straightforward. In the short- to mid-term, nuclear techniques, including single photon emission computed tomography (SPECT) and principally PET, provide the best means to translate studies of neurotransmitters, receptors, and neuromodulators to humans. 

Two principal challenges limit the role of these techniques today. The first is to exploit the potential for better use of existing PET tracers that target dozens of important neurotransmitter systems and their receptor subtypes. Within the libraries of compounds tested for therapeutic potency by the pharmaceutical industry lie hundreds of compounds awaiting evaluation of their potential as imaging agents. Public/private partnerships under The BRAIN Initiative® could unlock this potential treasure trove of compounds, not as therapeutic agents (for which they were originally evaluated), but as compounds for discovery of receptor function. Second, while the principle of using PET to evaluate changes in receptor occupancy secondary to pharmacological or cognitive stimulation has been demonstrated, the means for dynamic assessment of neurochemical-specific brain activation, analogous to fMRI localization of “activation”, is not yet in hand. 

Significant progress is possible, however, in both short and longer terms. True dynamic assessment of receptor occupancy and metabolism, at spatial resolution approaching today’s fMRI studies and temporal resolution of minutes, is a feasible mid-term goal with key receptor subtypes of the dopaminergic, serotonergic and glutamatergic neurotransmitter systems. In the longer term, the range of molecular targets and receptor subtypes amenable for study should steadily grow, tapping into the breadth of neurochemical expertise available through partnerships with pharmaceutical companies. 

In summary, there is a need to improve spatial resolution and/or temporal sampling of human brain imaging techniques, and develop a better understanding of cellular mechanisms underlying commonly measured human brain signals (fMRI, DW MRI, EEG, MEG, PET)—for example, by linking fMRI signals to cellular-resolution population activity of neurons and glia contained within the imaged voxel, or by linking DW MRI connectivity information to axonal anatomy. Understanding these links will permit more effective use of clinical tools for manipulating circuit activity, such as deep brain stimulation and transcranial magnetic stimulation (TMS). 

6d. Devices for Monitoring and Stimulating the Human Brain 

A new generation of medical devices for interfacing with the living human brain has been fueled by the merger of engineering advances with neuroscience discovery. Devices, some already in hand, are being used to monitor brain function, to diagnose and treat mood and movement disorders, and to restore sensory and motor functions lost following injury or disease. Thousands of humans are receiving these neurotechnologies in clinically approved or investigational applications. With their informed consent, these individuals provide an extraordinary opportunity for rigorous research on normal brain function, as well as on the effects of brain injury or disease. When these devices are coupled with non-invasive imaging, there is a real opportunity to bridge scales, from spatially limited cellular recordings to whole-brain functional imaging. 

The population of humans receiving recording or stimulating devices is large and growing. Most notably, DBS electrodes implanted in a specific basal ganglia circuit have helped to relieve more than a hundred thousand people of the rigidity, tremor, and slow movements of Parkinson’s disease. DBS is also widely and successfully employed in motor disorders such as dystonia and tremor, and there is reason to think that its potential use is much broader: promising results have appeared for DBS use in intractable depression, and it is being explored as a treatment for obsessive-compulsive disorder and even memory decline, which could have major public health implications. Another frontier is ‘closed-loop’ implanted systems in which data analysis is performed in real-time by a computer and used to generate future patterns of brain stimulation. For example, a sensor might detect an epileptic seizure in the early stages of its development and reduce or block it by stimulating the brain into quiescence. 
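
The closed-loop logic described above can be sketched very simply: monitor a running measure of signal power and trigger stimulation when it crosses a threshold. Everything in the example below (sampling rate, synthetic signal, window length, threshold) is an illustrative assumption; clinical devices use far more sophisticated detectors and safety logic.

# Hedged sketch of the closed-loop idea: monitor a running measure of signal
# power and trigger stimulation when it crosses a threshold.  The synthetic
# signal, window, and threshold are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(6)
fs = 250                                    # sample rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)                # one minute of "recording"
signal = rng.normal(size=t.size)
seizure = (t > 30) & (t < 40)               # simulated high-amplitude episode
signal[seizure] += 4 * np.sin(2 * np.pi * 6 * t[seizure])   # 6 Hz oscillation

window = fs                                 # 1 s sliding window
threshold = 4.0                             # power threshold (assumed)
stim_on = False
for i in range(window, t.size, window):
    power = np.mean(signal[i - window:i] ** 2)
    if power > threshold and not stim_on:
        stim_on = True                      # in a device, stimulation would start here
        print("stimulation triggered at t = %.0f s (window power %.1f)" % (t[i], power))
    elif power <= threshold:
        stim_on = False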

In the sensing domain, brain-computer interfaces, still early-stage investigational devices, can enable people with paralysis to use their own brain signals to control assistive devices like computers or robotic arms well enough to perform some activities of daily living. In these individuals, chronically implanted multielectrode sensors provide unprecedented high-resolution recording over years. Developments in sensing and stimulation technology promise a series of new devices that will increase the quality of life and independence of individuals limited by a wide range of brain injuries or disorders. Each person with a device who is willing to participate becomes a potential research participant who can yield valuable data about brain function. 
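
As a hedged illustration of the decoding step in such interfaces, the sketch below fits a ridge-regression decoder from simulated multielectrode firing rates to two-dimensional cursor velocity. The cosine-like tuning, channel count, and regularization strength are assumptions; this is a stand-in for, not a description of, any particular clinical decoder.

# Hedged sketch of the decoding step in a brain-computer interface: a ridge
# regression from binned firing rates to 2-D cursor velocity.  The synthetic
# tuning and regularization are assumptions, not any specific clinical method.
import numpy as np

rng = np.random.default_rng(7)
n_units, n_bins = 96, 4000                       # e.g., a 96-channel array (assumed)
velocity = rng.normal(size=(n_bins, 2))          # intended cursor velocity (x, y)

preferred = rng.normal(size=(2, n_units))        # each unit's preferred direction
rates = velocity @ preferred + rng.normal(size=(n_bins, n_units))  # noisy rates

# Fit the decoder on the first half of the session, test on the second half.
train, test = slice(0, 2000), slice(2000, 4000)
lam = 1.0
X, Y = rates[train], velocity[train]
weights = np.linalg.solve(X.T @ X + lam * np.eye(n_units), X.T @ Y)

pred = rates[test] @ weights
corr_x = np.corrcoef(pred[:, 0], velocity[test][:, 0])[0, 1]
corr_y = np.corrcoef(pred[:, 1], velocity[test][:, 1])[0, 1]
print("decoded-vs-intended velocity correlation: x=%.2f, y=%.2f" % (corr_x, corr_y))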

The availability of this large cohort of people with implanted technology opens the possibility not only to advance clinical care, but to carry out detailed studies that were barely conceivable a decade ago. During intraoperative mapping in epilepsy, researchers found neurons in the medial temporal lobe of a human patient that responded to pictures of individual actors or politicians — and also to the spelled-out name of the same person. These neurons provide a fascinating example of the encoding of categories or abstract concepts in the brain. Direct brain recordings during anesthesia have revealed characteristic transformations in brain activity as consciousness is lost. Chronic multielectrode array recordings from brain-computer interfaces in people with longstanding paralysis have shown how the motor cortex retains representations of the arm even years after a stroke, raising new questions about plasticity in the human brain. As these applications continue to expand, there will be an unprecedented opportunity to study human circuits, both by recording their activity and by modulating their activity. 

6e. Teams for Basic and Clinical Human Research 

Taking advantage of the scientific opportunities offered by neurotechnology developments and volunteer human patients is an exceptionally complex endeavor. Every opportunity should be maximized while maintaining the highest standards for research participant safety and protection. Meeting research goals and human research standards requires closely integrated research teams including clinicians, engineers, and scientists who work together to organize and carry out research of the highest integrity and rigor. Clinicians who use new neurotechnologies in human research should interact closely with the engineers, scientists and companies developing them, to ensure the rapid creation, validation, and dissemination of effective tools. The regulatory oversight of human research, as well as the close clinical relationship and potentially long-term commitment to participants, can place a special time and financial burden on investigators. In addition, storing and processing data in compliance with federal privacy protection laws are challenging. Because related clinical research activities can occur across many universities and medical centers, mechanisms to standardize and share precious data from human subjects are essential. Further, because research is often aimed at creating new medical devices for treatment of human disease or injury, the objectives of the research team must often be aligned with regulatory paths and industry standards needed to translate early stage testing into a commercially viable technology. These varied demands must be met while also adhering to the highest standards of scientific quality and design of preclinical studies. 

These standards can be and are being met by dedicated teams of collaborative researchers, but consideration should be given to reducing unnecessary bureaucratic hurdles in the academic setting. This difficult but extremely valuable area of research should be made as efficient as possible to maximize the efforts of both the researchers and the patients. Taking such steps could accelerate innovative research, ultimately driving down costs and providing better clinical devices and therapeutic outcomes. Because this kind of research is so valuable yet so complex, we must develop novel incentives and straightforward mechanisms to translate basic science advances into human pilot testing, while maintaining the ethical standards and regulatory procedures in place for human research. 

In summary, humans who are undergoing diagnostic brain monitoring or receiving neurotechnology for clinical applications provide an extraordinary opportunity for scientific research. This setting enables research on human brain function, the mechanisms of human brain disorders, the effect of therapy, and the value of diagnostics. Meeting this opportunity requires closely integrated research teams including clinicians, engineers, and scientists, all performing according to the highest ethical standards of clinical care and research. New mechanisms are needed to maximize the collection of this priceless information and ensure that it benefits people with brain disorders. 

6f. Human Neural Technology Development 

For human research, The BRAIN Initiative® should ultimately support two broad types of technology development: (1) research tools that allow us to better investigate brain structure and function, and (2) clinical tools that enable us to better diagnose, prevent, treat, and cure brain diseases, including technologies that can restore lost functions. Devices for use in humans need substantial improvement over existing technology: they need to be more reliable, stable, and long lasting, which will require better materials, biocompatibility, and features optimized for human use. We need electrode arrays with higher spatial resolution for recording and stimulation both within and across brain areas. In the medium- to long-term, new monitoring capabilities (acoustic, optical, chemical, etc.) should be incorporated into all implanted devices; when devices are implanted into human subjects, they should deliver the maximal scientific benefit consistent with health and safety of the participant. As detailed in section 2c-i, implantable devices must get smarter, smaller, and more energy efficient; they require wireless communication in compact packaging able to last for years in the body. Engineers and scientists must rise to the challenge of developing this next generation of neurotechnological devices. 

Both penetrating and surface sensors need to be substantially improved and tested for their full capabilities. Precision in placing sensors within identified circuits will require MRI compatibility, which will also provide an opportunity to advance basic knowledge that links MRI identified circuits with clinical outcomes and cell and circuit scale function. 

Potentially transformative technologies should also be entertained. For example, the use of optogenetic tools in humans is conceivable in the mid- to long-term. Initial safety studies of adeno-associated virus (AAV) vectors in human brains are encouraging, suggesting that viral delivery of therapeutic genes can be explored in the near future, with careful and comprehensive testing of viral delivery systems to evaluate long-term safety and efficacy. 

As previously mentioned, noninvasive tools for fine-resolution stimulation of the human brain would be transformative, potentially reducing or eliminating the need for invasive electrode implants. Present noninvasive stimulation techniques, including transcranial magnetic stimulation (TMS) and direct-current and slow alternating-current stimulation, are being explored for therapeutic effects. These techniques can activate centimeter-scale areas of brain for potential neurological and psychiatric applications; prefrontal TMS is already approved by the Food and Drug Administration (FDA) to treat depression. However, the spatial scale, duration, mechanism of action, and potency of their effects need to be better elucidated. A long-term goal of The BRAIN Initiative® should be to find ways to obtain high spatial and temporal resolution signal recording and stimulation from outside the head, perhaps through the use of minimally explored energy delivery techniques such as focused ultrasound or magnetic stimulation.  

7. Education 

New tools, whether they come in the form of equipment, molecular clones, or data analysis algorithms, should be disseminated to a wide scientific user base, along with the knowledge required to wield them. 

7a. Education and Training in Emerging and Interdisciplinary Methods 

New training mechanisms will be required to successfully deploy the tools, technologies, and methods developed under The BRAIN Initiative® to the neuroscience community. Funding training centers and personnel for teaching new techniques would require relatively modest funds and space, but would be a great benefit for the entire neuroscience community. Optogenetics has been successful in part because of organized mini-course training of faculty and students from around the world in the required surgeries and techniques, both in university settings and in course modules at Cold Spring Harbor and Woods Hole. Mini-courses in new technologies represent a way to bring an entire community of users up to a high level of understanding—and productivity—in a short period of time. They level the playing field between scientists at large institutions and those at smaller institutions who may not have the same resources. These teaching mechanisms have the added benefit of communicating the experimental standards and pitfalls that often trip up early users of new technologies, but they currently suffer from a lack of dedicated space and funding support in the traditional academic setting. 

Training in quantitative neuroscience should be an area of special focus for The BRAIN Initiative®. This includes teaching theory and statistics to biologists, and exposing physicists, engineers, and statisticians to neuroscience. Mechanisms include fellowships as well as short courses and workshops in neuroinformatics, statistics, and computational neuroscience. 

7b. Building Strength in Quantitative Neuroscience 

Attracting new investigators to neuroscience from the quantitative disciplines (physics, statistics, computer sciences, mathematics, and engineering), and training graduate students and postdoctoral fellows in quantitative neuroscience, should be high-priority goals for The BRAIN Initiative®. While forward-looking programs like the Sloan-Swartz Centers in Theoretical Neuroscience have attracted trainees from the quantitative disciplines and promoted their careers, this critical human asset remains too small and too tenuously established within neuroscience. Too many neuroscience departments remain skeptical of hiring faculty whose research does not focus primarily on experimental lab work, and too many statistics, physics, mathematics and engineering departments are hesitant to hire faculty who focus intensively on the nervous system. 

The field would benefit greatly from incentives for faculty recruitment at this critical interface. A major benefit of attracting new faculty from the quantitative disciplines to neuroscience would be their teaching of quantitative concepts and skills—theory, modeling, statistics, signal processing, and engineering in its many forms—with emphasis on real-world applications to neuroscience data sets, including those introduced to the public domain under BRAIN-sponsored research projects. Most students currently enter neuroscience graduate programs with little-to-no training in statistics, computer science, or mathematical modeling, and many receive little formal quantitative training during their graduate education. This must change. All areas of neuroscience, not just those of particular emphasis under The BRAIN Initiative®, will become increasingly dependent on quantitative perspectives and analyses in the future; training of students in quantitative reasoning, principles, and techniques must increase accordingly. 

In summary, progress would be dramatically accelerated by the rapid dissemination of skills across the community. To enable the broadest possible impact of newly developed methods, and their rigorous application, support should be provided for training—for example, summer courses and course modules in computational neuroscience, statistics, imaging, electrophysiology, and optogenetics—and for educating non-neuroscientists in neuroscience.  

8. Maximizing the Value of The BRAIN Initiative®: Core Principles 

The emphasis in this report is on posing questions, not dictating solutions. However, certain principles and approaches can maximize the intellectual value and long-term impact of all aspects of The BRAIN Initiative®, as enumerated below. 

8a. Pursue Human Studies and Non-Human Models In Parallel 

Our ultimate goal is to understand the human brain, and as stated above, human neuroscience should be a key element of The BRAIN Initiative®. However, both ethical principles and scientific feasibility will require many methods and ideas to be developed in non-human animal models, and only later applied to humans. With a few exceptions, we do not emphasize particular species, but instead encourage a diversity of approaches. The history of neuroscience teaches us that many different models should be enlisted for the unique advantages that they provide, and also that comparative approaches are very powerful in discovering biological principles. We expect The BRAIN Initiative® to include nonhuman primates such as rhesus macaques, because they are evolutionarily the closest animal model for humans, and this will be reflected in their behavioral and cognitive abilities, genetics, anatomy, and physiology. We expect the mouse to be the initial mammalian model for the use of genetic tools, supplemented by the rat, long appreciated for its behavior and neurophysiology, where genetic tools are also emerging. The transparent zebrafish larva should facilitate optical recording methods in the context of a simplified vertebrate neuroanatomy. Invertebrate animals with smaller nervous systems offer rapid experimental turnaround, rapid testing and validation of new tools, and the ease of genetics (for worms and flies) or of electrophysiology (for molluscs, crabs, and leeches) targeted to defined neurons; most neuroscientists have been surprised to see how many features of the brain and behavior are shared by vertebrates and invertebrates. 

Finally, the list of species above is not complete. It is important to realize how much has been gained from studying a wider variety of animal species, recognizing their special abilities and the perspective they provide on the brain. For example, other than humans, songbirds are the only animals in which vocal learning is instructed by a tutor. The richness of the behavioral repertoire in songbirds has led to remarkable insights into learning, motor control, and the importance of social context in behavior. Important insights into the brain have come from studies of many other creatures (barn owls, electric fish, chickens, bats, and more). The most significant technologies developed by The BRAIN Initiative® should facilitate experiments in these and other specialized animals, broadening the reach and scope of questions that can be asked about the brain. 

The fundamental principle is that experimental systems should be chosen based on their power to address the specific question at hand. Although the emphasis of The BRAIN Initiative® is on the whole brain, some technologies will require careful analysis in culture systems or slices before they can be used in intact animals or humans. The BRAIN Initiative® should not be dogmatic when faced with a compelling scientific argument for a different approach. 

8b. Cross Boundaries in Interdisciplinary Collaborations 

Surveying the landscape of neurotechnologies reveals some that are mature, some that are emerging and in need of iterated, disciplined improvement, and some that require re-imagination. It is critical that The BRAIN Initiative® boldly support the very best ideas addressing each need. This report describes the current state of the field, but transformative new ideas will emerge in the future that are not on today’s horizon. The BRAIN Initiative® must find a way to recognize such new ideas and let them flourish. Some innovative ideas will certainly fail, but this is not the time to play it safe. If the majority of proposals succeed in a predictable manner, we are not being adventurous enough. 

At this stage, it is senseless to choose a single funding mechanism or set of investigators for The BRAIN Initiative®; there must be exploration. Applications should be solicited widely, with open competition for resources. Some ideas will be initiated by individual investigators who see a new way forward. Other ideas will require larger teams of scientists, particularly in human neuroscience with its unique ethical and scientific challenges. 

A theme that emerged clearly from the working group’s workshops and discussions is the benefit to be gained by new scientific partnerships that cross traditional areas of expertise. This point was made in many specific contexts: 

The physicists and engineers who develop optical hardware should partner with the biologists and chemists who develop new molecular sensors. 

The tool-builders who design new molecules for sensing or regulating neurons should partner with neuroscientists who will rigorously examine their validity in neurons and brains. 

The theorists who develop models for understanding neuronal dynamics should partner with experimentalists, from initial experimental design to execution to interpretation. 

The clinicians and neuroscientists who develop sophisticated imaging methods in humans should partner with scientists working in animal models who can relate imaging signals to the underlying cellular mechanisms with great precision. 

Supporting collaborations across disciplines, with outstanding scientists who are intellectual equals, could light new fires in technology development. Such groups need not be at one institution to be effective — the quantitative and physical scientists might be at engineering schools, their neuroscientist partners might be at medical schools. Small collaborating groups of two or three investigators could open new doors in ways that no single investigator or conventional department would imagine; The BRAIN Initiative® should particularly stimulate this kind of partnership. 

8c. Integrate Spatial and Temporal Scales, and Accelerate All of Neuroscience  

As mandated by the charge to our working group, this report focuses on new research opportunities at a critical level of neuroscience investigation—that of circuits and systems. As described in detail in this report, however, circuits and systems cannot be understood incisively without reference to their underlying components—molecules, cells, and synapses. Neither can circuits and systems be understood without reference to the whole brain, the behavior of the organism, and how brain circuits are shaped by the unique experiences of the individual. The brain must be understood as a mosaic unity encompassing all of these levels. 

The particular focus of The BRAIN Initiative® represents only one important aspect of neuroscience, but one that can benefit many other areas. The funds devoted to The BRAIN Initiative® are a very small fraction of the NIH’s total investment in neuroscience and neurological disorders. To have maximal impact, The BRAIN Initiative® must remain focused, but the new knowledge and technology created under BRAIN must accelerate all other subdisciplines of neuroscience so that they also advance and flourish. 

Appropriately, a substantial fraction of the NIH’s investment in neuroscience is allocated to specific human brain disorders. The scientific plan laid out in this report has been composed with a specific eye toward The BRAIN Initiative’s eventual impact for humans—in translational neuroscience research, in medical practice to alleviate suffering (some technologies, such as brain imaging and stimulation, are already in widespread use in medicine), and in other areas such as education. In the nearer term, methods from the BRAIN Initiative can be applied in animal models of human brain disorders, seeking insight about fundamental disease mechanisms and asking about the sources of variation in neurological function across individuals. 

There are many points of intersection between other areas of basic neuroscience and The BRAIN Initiative® as well. A delineation of neuronal cell types and their patterns of gene expression should be a resource for cellular and molecular neuroscience. The census of cell types, and studies of their connectivity, should provide new tools to study central questions in developmental neuroscience. Technologies for real-time measurements of neuronal activity, neuromodulators, and synaptic connections are ideally suited for use in cellular and slice neurophysiology; indeed, many will be used there before they can be applied in whole animals. 

8d. Establish Platforms for Sharing Data 

The traditional way to exchange scientific information is through publications and books, but we have entered a new age of information that is not limited by the narrow bandwidth of journals. High-speed computing and massive storage capabilities have enabled collection of much larger datasets than was previously possible, and the Internet has enabled data sharing on a far wider scale. However, many datasets that are currently available are diverse, fragmented, and highly dynamic, which is to say unstable. Currently most of the raw data that go into published papers are not available outside the laboratory where they were collected. Inevitably, this leads to duplication of effort, inefficiency, and lost knowledge. 

Well-curated, public data platforms with common data standards, seamless user accessibility, and central maintenance would make it possible to preserve, compare, and reanalyze valuable data sets that have been collected at great expense. This would be of great benefit to neuroscience, just as the availability of public genomic and protein structure databases has transformed genetics and biochemistry. Analysis tools and user interfaces should be developed that can be run remotely, such as the Basic Local Alignment Search Tool (BLAST) program for sequence alignment in genomic databases. Creating and maintaining such data platforms would entail a major community effort to decide what data and metadata to include, what controls should govern the use of data, and how to support users. Valuable lessons and best practices can be learned from existing public datasets, which include the Allen Brain Atlas, the Mouse Connectome Project, the Open Connectome Project, the CRCNS data sharing project, ModelDB, and the Human Connectome Project, as well as datasets generated by the physics, astronomy, climate science, and technology communities. A first unifying attempt, the Neuroscience Information Framework (NIF) sponsored by NIH, provides a portal to track and coordinate multiple sites, but the myriad genetic, anatomical, physiological, behavioral and computational datasets are difficult to manage because of their heterogeneous nature. The NIH Big Data to Knowledge (BD2K) Initiative offers neuroscientists opportunities to develop new standards and approaches. 

Methods and software as well as data should be shared. Some neural simulators, such as GENESIS, NEURON, and MCell, are well-established, open source, and well documented, but the software for many models and simulations in published papers is undocumented or unavailable. The description of a model in a published paper is often insufficient to reproduce the simulations; it is essential that software be made available so that all models in published papers are reproducible. As data sets become larger and as new types of data become available, there is increasing need for public, validated methods for analyzing and presenting these data. As an example, microelectrode recordings often pick up spikes from several neurons that need to be separated into single units, a procedure known as “spike sorting”. A plethora of custom spike-sorting programs have been created by individual laboratories, but there are no widely accepted standards for rigorous spike sorting, and it can therefore be difficult to compare data precisely across laboratories. The community would benefit from common standards for spike sorting and for other common data analysis procedures. 
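
To make the gap concrete, the sketch below is a minimal, purely illustrative spike-sorting pipeline of the kind individual laboratories often implement in-house: band-pass filter an extracellular trace, detect threshold crossings, extract waveform snippets, and cluster them into putative single units. The function name, filter settings, threshold, and cluster count are arbitrary choices made for illustration, not a proposed standard.

```python
# Minimal, illustrative spike-sorting sketch (hypothetical parameters); real
# pipelines add artifact rejection, drift correction, and manual curation.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def sort_spikes(trace, fs=30000, n_units=3, snippet=32):
    # Band-pass filter to isolate the spike band (300-6000 Hz; arbitrary choice).
    b, a = butter(3, [300 / (fs / 2), 6000 / (fs / 2)], btype="band")
    x = filtfilt(b, a, trace)
    # Detect negative threshold crossings at 4x a robust noise estimate.
    thresh = -4 * np.median(np.abs(x)) / 0.6745
    idx = np.flatnonzero((x[1:] < thresh) & (x[:-1] >= thresh)) + 1
    idx = idx[(idx > snippet) & (idx < len(x) - snippet)]
    # Extract a short waveform snippet around each detected event.
    waves = np.stack([x[i - snippet // 2: i + snippet // 2] for i in idx])
    # Reduce waveforms to a few features and cluster into putative single units.
    feats = PCA(n_components=3).fit_transform(waves)
    labels = KMeans(n_clusters=n_units, n_init=10).fit_predict(feats)
    return idx, labels

# Usage on a synthetic trace with injected spike-like events
# (a real trace would come from a microelectrode recording).
rng = np.random.default_rng(0)
trace = rng.standard_normal(90000)
template = -10.0 * np.exp(-0.5 * (np.arange(-15, 16) / 3.0) ** 2)
for t in range(500, 89000, 300):
    trace[t - 15:t + 16] += template
spike_times, unit_labels = sort_spikes(trace)
print(len(spike_times), np.bincount(unit_labels))
```

The fact that every choice in this sketch (filter band, threshold, features, number of clusters) is discretionary is precisely why community standards and validated reference implementations are needed.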

New data platforms would also encourage changes in the culture of neuroscience to promote increased sharing of primary data and tools. We heard from many researchers about the value of sharing data, and their desire for stable, easily interconvertible data formats that could accelerate the field. Data and data analysis tools that emerge from The BRAIN Initiative® should be freely shared to the extent possible, no later than the date of their first publication and in some cases before that date. Some areas of neuroscience, such as human brain imaging (the Human Connectome Project; the International Neuroimaging Data-Sharing Initiative), are already sharing data on a large scale despite the enormous datasets involved. That said, extending this model to all fields is a difficult problem that cannot be solved in one step. Based on the history of data sharing in many fields of biology, the solution will come from the engagement of sophisticated, motivated members of the scientific community from the bottom up, not from a directive from above. 

To meet these goals, The BRAIN Initiative® will require infrastructure for integrating and sharing relevant datasets and data analysis methods. There is much that could be done in partnership with computer scientists and database experts to set standards for data formats and best practices for maintaining and disseminating heterogeneous data. The infrastructure for maintaining common databases will require dedicated resources, which may be provided by the NIH BD2K Initiative or the NIH Blueprint for Neuroscience Research to support The BRAIN Initiative®. 
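
As one concrete illustration of what common, self-describing data formats might look like in practice, the sketch below stores spike times together with the metadata needed to reinterpret them later, using the widely available HDF5 format via h5py. The field names and values are hypothetical and are not drawn from any existing community standard.

```python
# Minimal sketch of a self-describing dataset: raw arrays plus the metadata
# needed to reinterpret them. Field names here are invented, not a standard.
import h5py
import numpy as np

spike_times = np.sort(np.random.default_rng(0).uniform(0, 600.0, size=1000))  # seconds

with h5py.File("example_session.h5", "w") as f:
    f.attrs["species"] = "Mus musculus"
    f.attrs["brain_region"] = "hippocampus CA1"
    f.attrs["recording_date"] = "2014-06-05"
    f.attrs["lab"] = "example_lab"  # provenance
    dset = f.create_dataset("units/unit_001/spike_times", data=spike_times)
    dset.attrs["units"] = "seconds"
    dset.attrs["sorting_method"] = "threshold + PCA + k-means (illustrative)"

# Any laboratory reading the file recovers both the data and its context.
with h5py.File("example_session.h5", "r") as f:
    print(dict(f.attrs))
    print(f["units/unit_001/spike_times"].attrs["units"])
```

Real shared formats would additionally specify controlled vocabularies, richer provenance, and versioning, which is exactly the standards-setting work described above.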

8e. Validate and Disseminate Technology 

A primary goal of The BRAIN Initiative® will be to identify and support new technologies with the potential to substantially accelerate high-quality brain research. Technology development begins with innovation, but it is a continuing process. The first genetically encoded fluorescent calcium indicator, cameleon, was published in 1997; the current versions, such as GCaMP6, were published in 2013 after a focused and sustained effort over ~5 years at a total cost of ~$10 million. The basis of the method has not changed, but the breadth of its utility and applications has increased immensely. 

This example and others show that technologies become valuable after they have gone through the processes of validation in biological systems (with comparison to the current best practices), iteration (serial improvements in properties), application (to a variety of test systems), and dissemination (including education and training). The entire neuroscience community would benefit from support for accelerated technology development between the initial proof-of-principle and the mature system; incisive new research could be accelerated by decades. 

Following technology development, The BRAIN Initiative® should build an infrastructure for sharing relevant tools of all kinds, whether biological, chemical, or physical. Molecular biology tools and viruses are easily disseminated, although there is some associated cost that must be supported. Other tools in discussion from chemistry, nanoscience, and physics are not so easily sent in the mail. If The BRAIN Initiative® develops next-generation electrodes, nanotechnologies, or chemical probes, it should ensure that they can be synthesized, fabricated, or readily purchased by researchers, as appropriate. If The BRAIN Initiative® funds development of a next-generation microscope, it should ensure that private or publicly supported mechanisms make it available to a variety of users, not just the inventors. A pathway to dissemination should be expected for BRAIN-derived tools, whether that is commercialization, core facilities, or something else. Computational and statistical tools developed under The BRAIN Initiative® should also be supported and broadly available. Extremely complex technologies like next-generation high-field MRI instruments might need to be centralized, like the centralized X-ray beam lines used by structural biologists. Core facilities could be established with state-of-the-art technology, perhaps at a few universities or research institutes, but allowing use by researchers from across the country. Such core facilities could be attractive enough to the host institutions that they would co-invest in equipment and support personnel. 

In summary, a core principle of The BRAIN Initiative® is that new technologies and reagents should be made available across the community at the earliest possible time. This will require a thoughtful development of dissemination policies by the scientific community, as well as specialized support mechanisms, private-public partnerships, and training programs. 

8f. Consider Ethical Implications of Neuroscience Research 

The working group is pleased that the President has charged his Bioethics Commission with exploring the ethical issues associated with the conduct of neuroscience research, and also the ethical issues surrounding the application of neuroscience research findings in medicine and other settings. Many ethical and policy issues raised by The BRAIN Initiative® are not unique to neuroscience research, and thus we can learn from ongoing experiences in other fields. For example, the NIH BRAIN Working Group has endorsed data sharing, since advances in science are often catalyzed by collaborations and open access to data. However, our experiences with genetic data have shown that privacy concerns must be managed carefully to protect human research participants. Other issues such as defining what constitutes “enhancement” and how we obtain consent from potentially vulnerable populations have been debated widely across biomedical research, and will continue under The BRAIN Initiative®. 

Although brain research entails ethical issues that are common to other areas of biomedical science, it entails special ethical considerations as well. Because the brain gives rise to consciousness, our innermost thoughts and our most basic human needs, mechanistic studies of the brain have already resulted in new social and ethical questions. Can research on brain development be used to enhance cognitive development in our schools? Under what circumstances should mechanistic understanding of addiction and other neuropsychiatric disorders be used to judge accountability in our legal system? Can civil litigation involving damages for pain and suffering be informed by objective measurements of central pain states in the brain? Can studies of decision-making be legitimately used to tailor advertising campaigns and determine which products are more attractive to specific consumer bases? Brain research must proceed with sensitivity and wisdom. The working group looks forward to the deliberations of the Bioethics Commission, and to interacting with the group to establish a scientifically rigorous plan for The BRAIN Initiative® that is grounded in sound ethical policies. 

As is clear from the scientific issues reviewed in this report, developing a deep understanding of the brain is only possible through research on animals and informed, volunteer human subjects. Without question, research under The BRAIN Initiative® should adhere to the highest ethical standards for research with human subjects and with non-human animals, within the regulatory framework of the United States and host research institutions. 

8g. Maintain Accountability to NIH, the Taxpayer, and the Basic, Translational, and Clinical Neuroscience Communities 

The BRAIN Initiative® has the potential to advance human knowledge, to create a foundation of knowledge and methods appropriate for prevention, diagnosis, monitoring, and treatment of human brain disorders, and to stimulate new technologies that blossom in industry. A focused, sustained investment in The BRAIN Initiative® has the power to change our future. Nonetheless, the NIH and its leadership must choose among many opportunities in the larger context of scientific and public health needs. Like other investments made by government agencies, The BRAIN Initiative® should be evaluated regularly by scientists, patients, and the general public to ask whether the ongoing research plan represents the most effective use of NIH funds. 

Evaluation of research under The BRAIN Initiative® poses special challenges due to the interdisciplinary nature of the research and the need to align efforts with the existing NIH neuroscience research portfolio, such as the Blueprint for Neuroscience Research, a collaborative neuroscience effort that spans 15 of the 27 NIH Institutes and Centers. To exercise appropriate oversight, the evaluation mechanism should be similarly broad and interdisciplinary.  

9. Further Reading 

The resources listed below introduce some of the neurotechnologies and BRAIN Initiative-related resources that are described in the report. Most papers are reviews, although a few recent methods papers are included. This is not a comprehensive citation list. 

The BRAIN Initiative® 

Remarks by the President on the BRAIN Initiative and American Innovation  

NIH BRAIN Initiative

Insel TR, Landis SC, Collins FS (2013) Research priorities. The NIH BRAIN initiative. Science 340(6133):687-688. 

Mapping the Structure and Components of Circuits 

Cell Type 

Bernard A, Sorensen SA, Lein ES (2009) Shifting the paradigm: new approaches for characterizing and classifying neurons. Curr Opin Neurobiol 19(5):530-536. 

Lein ES et al (2007) Genome-wide atlas of gene expression in the adult mouse brain. Nature 445(7124):168-176. 

Experimental Access to Cell Types 

Gaj T, Gersbach CA, Barbas CF 3rd (2013) ZFN, TALEN, and CRISPR/Cas-based methods for genome engineering. Trends Biotechnol 31(7):397-405. 

Jenett A et al (2012) A GAL4-driver line resource for Drosophila neurobiology. Cell Rep 2:991-1001. 

Huang ZJ, Zeng H (2013). Genetic approaches to neural circuits in the mouse. Ann Rev Neurosci 36:183-215. 

Structural Maps 

Denk W, Briggman KL, Helmstaedter M (2012) Structural neurobiology: missing link to a mechanistic understanding of neural computation. Nat Rev Neurosci 13(5):351-8. 

Ginger M, Haberl M, Conzelmann KK, Schwarz MK, Frick A (2013) Revealing the secrets of neuronal circuits with recombinant rabies virus technology. Frontiers in Neural Circuits 7(2):1-15. 

Kleinfeld D et al (2011) Large-scale automated histology in the pursuit of connectomes. J Neurosci 31(45):16125-16138. 

Osten P, Margrie TW (2013) Mapping brain circuitry with a light microscope. Nat Methods 10:515-523. 

Human Connectome Project 

Mouse Connectome Project  

Clarity Resources 

Neuronal Dynamics: Recording Neuronal Activity Across Time and Space 

Recording from Complete Circuits 

Ahrens MB, Orger MB, Robson DN, Li JM, Keller PJ (2013) Whole-brain functional imaging at cellular resolution using light-sheet microscopy. Nat Methods 10(5):413-420. 

Alivisatos AP, Chun M, Church GM, Greenspan RJ, Roukes ML, Yuste R (2012) The brain activity map project and the challenge of functional connectomics. Neuron 74:970-974. 

Advancing Recording Technology (Electrophysiology) 

Buzsáki G (2004) Large-scale recording of neuronal ensembles. Nat Neurosci 7(5):446-451. 

Szuts TA et al (2011) A wireless multi-channel neural amplifier for freely moving animals. Nat Neurosci 14(2):263-270. 

Advancing Recording Technology (Optical sensors) 

Looger LL, Griesbeck O (2012) Genetically encoded neural activity indicators. Curr Opin Neurobiol 22(1):18-23. 

Peterka DS, Takahashi H, Yuste R (2011) Imaging voltage in neurons. Neuron 69:9-21. 

Integrated Optical Approaches: Neuroscience and Instrumentation 

Wilt BA, Burns LD, Wei Ho ET, Ghosh KK, Mukamel EA, Schnitzer MJ (2009) Advances in light microscopy for neuroscience. Annu Rev Neurosci 32:435-506. 

Nanotechnology and Unanticipated Innovations 

Alivisatos AP et al (2013) Nanotools for neuroscience and brain activity mapping. ACS Nano 7(3):1850-1866. 

Spira ME, Hai A (2013) Multi-electrode array technologies for neuroscience and cardiology. Nat Nanotechnol 8(2):83-94. 

Manipulating Circuit Activity 

Fenno L, Yizhar O, Deisseroth K (2011) The development and application of optogenetics. Annu Rev Neurosci 34:389-412. 

Farrell MS, Roth BL (2013) Pharmacosynthetics: Reimagining the pharmacogenetic approach. Brain Res 1511:6-20. 

Packer AM, Roska B, Hausser M (2013) Targeting neurons and photons for optogenetics. Nat Neurosci 16(7):805-815. 

The Importance of Behavior 

Dombeck DA, Reiser MB (2012) Real neuroscience in virtual worlds. Curr Opin Neurobiol 22(1):3-10. 

Kabra M, Robie AA, Rivera-Alba M, Branson S, Branson K (2013) JAABA: interactive machine learning for automatic annotation of animal behavior. Nat Methods 10(1):64-67. 

Theory, Modeling, and Statistics 

Ganguli S, Sompolinsky H (2012) Compressed sensing, sparsity, and dimensionality in neuronal information processing and data analysis. Annu Rev Neurosci 35:485-508. 

Marder E, Taylor AL (2011) Multiple models to capture the variability in biological neurons and networks. Nat Neurosci 14:133-138. 

Shenoy KV, Sahani M, Churchland MM (2013) Cortical control of arm movements: a dynamical systems perspective. Annu Rev Neurosci 36:337-359. 

Wang XJ (2013) The prefrontal cortex as a quintessential “cognitive-type” neural circuit: working memory and decision making. In: Principles of frontal lobe function, Second edition (Stuss DT, Knight RT, eds), pp. 226-248. New York: Oxford UP. 

Kass RE, Ventura V, Brown EN (2005) Statistical issues in the analysis of neuronal data. J Neurophysiol 94(1):8-25. 

Human Neuroscience and Neurotechnology 

Donoghue JP (2008) Bridging the brain to the world: a perspective on neural interface systems. Neuron 60(3):511-521. 

Fox MD, Halko MA, Eldaief MC, Pascual-Leone A (2012) Measuring and manipulating brain connectivity with resting state functional connectivity magnetic resonance imaging (fcMRI) and transcranial magnetic stimulation (TMS). Neuroimage 62:2232-2243. 

Holtzheimer PE, Mayberg HS (2011). Deep brain stimulation for psychiatric disorders. Annu Rev Neurosci 34:289-307. 

Lozano AM, Lipsman N (2013) Probing and regulating dysfunctional circuits using deep brain stimulation. Neuron 77(3):406-424. 

McNab JA et al (2013) The Human Connectome Project and beyond: Initial applications of 300 mT/m gradients. Neuroimage 80:234-245. 

Smith SM et al (2013) Resting-state fMRI in the Human Connectome Project. Neuroimage 80:144-168. 

Sotiropoulos SN et al (2013) Advances in diffusion MRI acquisition and processing in the Human Connectome Project. Neuroimage 80:125-143. 

Principles 

Data Platforms, Data Sharing, and Big Data 

Akil H, Martone ME, Van Essen DC (2011) Challenges and opportunities in mining neuroscience data. Science 331(6018):708-712. 

Berman F, Cerf V (2013) Science priorities. Who will pay for public access to research data? Science 341(6146):616-617. 

Ng L et al (2009) An anatomic gene expression atlas of the adult mouse brain. Nat Neurosci 12(3):356-362. 

Ethical Considerations 

Presidential Commission for the Study of Bioethical Issues — BRAIN Initiative 

SECTION III. IMPLEMENTATION: GOALS, DELIVERABLES, TIMELINES, AND COSTS

In its final report, the working group was charged with developing short-, medium-, and long-term goals for the NIH BRAIN Initiative, a set of timelines, milestones, and cost estimates, and proposals for scientific mechanisms. This project was the predominant focus of working group activities from September 2013 to May 2014, and is the subject of this Section. 

The release of the interim report in September 2013 was followed by a period of extended discussion between the working group and the broader scientific community. An open town hall was held at the Society for Neuroscience Meeting. The past, present, and future leadership of the Society for Neuroscience were consulted (see Appendix B). In addition, the presidents of clinical societies in neurology, psychiatry, neurosurgery, anesthesiology, neuroradiology and neuropsychology were consulted in personal discussions and conference calls, seeking their advice for the best approaches to the unsolved problems in their fields. The working group held a meeting with the leadership of the founding BRAIN Initiative partners from NSF, the Defense Advanced Research Projects Agency (DARPA), HHMI Janelia Farm Research Campus, the Allen Institute for Brain Science, and the FDA. Open solicitation of advice continued through the NIH BRAIN Initiative website. 

The working group believes that the vision of The BRAIN Initiative® expressed by President Obama and NIH Director Francis Collins will require a sustained ten- to twelve-year research program, described below. The goals and principles of the interim report are embraced by the final report, but in this section these goals are regrouped to more effectively project them forward over this longer timeframe. For the sake of this document, a ten-year program is described beginning in FY16, to allow adequate planning beyond FY14 (already underway) and FY15 (which begins in October 2014). 

The first five years of The BRAIN Initiative® (FY16-20) are envisioned primarily to support technology development and validation, with the creation of new methods followed by their maturation and integration in the service of important scientific questions. Technology development will continue throughout The BRAIN Initiative®, perhaps peaking around year 5 (FY20). Exploration of scientific questions can and will occur while new technologies are being validated, but the most serious scientific advances will take place after technologies have proven effective, when they can be pursued in a vigorous fashion. Thus beginning more slowly, but peaking in years 6-10 (FY21-25), collaborative groups funded by The BRAIN Initiative® will increasingly use these technologies to answer fundamental questions about the brain. 

Below we consider seven major scientific goals of The BRAIN Initiative® (Sections III.1-7) as well as infrastructure to support its core principles (Section III.8). In Section III.9 we estimate the overall cost of The BRAIN Initiative®. Within each of the seven major goals described below, short- and long-term milestones are suggested for measuring the overall progress of a particular BRAIN project. The proposed milestones are very ambitious, and we do not anticipate that all milestones will be achieved within the specified time periods. Indeed, future scientific discoveries and technological developments are likely to reveal higher priority opportunities for rapid progress, or show that some of the current milestones are less important for reasons that cannot be foreseen at present. The future leadership of The BRAIN Initiative® must judge the progress and continued relevance of any particular project based on overall scientific momentum and achievement in the context of a rapidly changing field. We conclude with examples of fundamental questions that should be addressed by The BRAIN Initiative® (Section III.10). 

Relationship to Private Efforts and International Projects 

The announcement of The BRAIN Initiative® by President Obama proposed partnerships between private and public organizations committed to brain research. Communication between the NIH, NSF, DARPA, and the FDA, communication between private and public partners in the US, and international communication are highly desirable at all stages of The BRAIN Initiative®. The brain is too complex a problem to be solved by any single organization. 

During the planning process, the working group consulted all partners in the original White House announcement and considered their research agendas carefully. For example, efforts to characterize cell types and connectivity in specific organisms and brain regions are ongoing at the Allen Institute for Brain Science1-3 and HHMI’s Janelia Farm Research Campus4,5. Inasmuch as the results of these private efforts are made available to the broader community, The BRAIN Initiative® will benefit from them. However, the HHMI and Allen Institute projects are focused on specific animal models and brain structures, and will not solve all problems of cell type and connectivity even if data are shared; the broad scope of The BRAIN Initiative® requires additional commitment to these areas, especially in the human brain. 

At a practical level, the private efforts have generated brain-related technology, reagents, organizational models, publicly accessible databases and search algorithms. NIH projects can leverage these efforts or partner with these groups to increase their cost-effectiveness. 

In 2013, the European Union announced the Human Brain Project, a €1 billion project with an emphasis on information computing technology infrastructure for neuroscience. The working group has been in contact with participants in the Human Brain Project, and meetings between representatives of the United States BRAIN Initiative and the Human Brain Project to discuss shared interests such as data platforms are planned. We expect useful interactions between The BRAIN Initiative® and the Human Brain Project to develop as opportunities arise.  

1. Discovering Diversity 

1a. Scientific Goal: Identify and provide experimental access to the different brain cell types to determine their roles in health and disease 

1b. Overall Objective 

The mammalian brain contains ~10⁸ (mouse) to ~10¹¹ (human) neurons. These neurons are not homogeneous, but consist of diverse subpopulations with genetically, anatomically, physiologically, and connectionally distinct properties. Defining these cell types and their functions in different brain regions, and providing methods for experimental access to them in a variety of animals and in humans is essential to generating a comprehensive understanding of neural circuit function. 

1c. Deliverables 

A census of neuronal and glial cell types in key brain regions using multiple analysis modalities (the “parts list”), and an intellectual framework for cell type classification. The modalities of interest include (but are not limited to) transcriptional/protein profiling, electrophysiological recording, cellular anatomy, and connectivity. This information is fundamental because it will provide knowledge that is essential to a deep understanding of neural coding and computation. 

Experimental access to defined neuronal and glial subpopulations, and tools for cell type-specific connectivity mapping, recording, and modulation. By “access,” we mean ways to target reporters, indicators, and effectors to a desired neuronal or glial subpopulation. In the short- to intermediate-term, this will likely involve genetic methods. This enabling technology will allow analysis of meso-scale connectivity, functional (causal) manipulations, and electrophysiological or optical recording of activity to be linked to each other at the level of a defined cell type. In humans, experimental access will provide a route to novel targeted therapies for neurological and psychiatric disorders. 

Among the scientific questions to be addressed by this goal: 

How many cell types exist in the brain, to a first approximation? 

Is there a basic organizational logic to cell type diversity throughout the brain? 

Do well-defined cell types shape neural circuit function to a greater extent in some brain regions than in others? 

What level of granularity of cell type definition is required for understanding the function of a given neural circuit? 

Can we target specific human cell types to develop new therapies for neurological and psychiatric disorders? 

1d. Rationale and Principles 

See also Section II.1A of this report. 

There is universal agreement that important phenotypic distinctions exist among different classes of neurons6-8. Glia, which outnumber neurons and play multiple roles in brain function9, are also heterogeneous10,11. A consensus definition and taxonomy of brain cell types has not yet been achieved. Nevertheless, objective classifiers can be built based on a principled approach combining electrophysiological, gene expression, and anatomical and connectional data8,12. It is likely that the best working definitions of natural cell types will emerge from empirical classifications based on functionally relevant phenotypic dimensions7. Working definitions will be updated continuously as more data are collected and deeper understanding emerges. 
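
As a toy illustration of the kind of empirical, data-driven classification described above, the sketch below clusters simulated cells described by a mixture of hypothetical electrophysiological and gene-expression features. The features, the data, and the number of clusters are all invented for illustration; this is not a proposed taxonomy or analysis standard.

```python
# Toy illustration of empirical cell-type classification from combined
# phenotypic measurements; the features and data are simulated, not real.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(1)

# Simulate 300 cells drawn from three hypothetical types, each described by
# a few electrophysiological properties and a handful of marker-gene levels.
centers = rng.normal(size=(3, 8)) * 3.0
cells = np.vstack([c + rng.normal(size=(100, 8)) for c in centers])

# Put all phenotypic dimensions on a common scale, reduce, and cluster.
z = StandardScaler().fit_transform(cells)
embedding = PCA(n_components=3).fit_transform(z)
labels = AgglomerativeClustering(n_clusters=3).fit_predict(embedding)

# In practice the number and granularity of clusters is itself an open
# scientific question (see text); here we simply report cluster sizes.
print(np.bincount(labels))
```

Hierarchical (agglomerative) clustering is shown because cell-type taxonomies are often naturally hierarchical, but the choice of algorithm, features, and granularity is precisely what the iterative, empirical process described above must settle.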

An important contribution to the conceptual definition of cell type will come from iterative interactions with theory and modeling (Section III.5). For example, theoretical considerations may specify the level of granularity of cell type identification that is necessary to understand the computations in a particular brain region, providing an initial guide for experimental analysis. In turn the level of cellular heterogeneity observed in a given brain region can constrain models and generate new predictions concerning circuit function or disease intervention. 

Challenges for The BRAIN Initiative®: 

Increasing the throughput, scale, and dimensionality of cellular phenotyping. The ability to systematically characterize different cell types across the brain, using multiple phenotypic criteria, is limited by the throughput, multiplexing capacity, and compatibility of existing experimental methods. New methods, such as multiplexed in situ hybridization13, array tomography for antibody staining14, single-cell multiplexed PCR, and single-cell RNA sequencing15, are being developed to address some of these limitations, and improvements in these methods will occur over the next several years. There is, nonetheless, a need to 1) expand the repertoire of reagents for cell type characterization, particularly in the antibody domain (emphasizing cross-species reactivity to humans); 2) increase the throughput and multiplexing capability of cellular phenotyping methods; 3) improve the compatibility of methods to quantitatively characterize cell type diversity along multiple phenotypic dimensions, at the single-cell level. 

Measuring dynamics of phenotypic marker expression. Another challenge is our current inability to measure the stability of molecular properties across time scales or conditions: standard methods for analyzing gene or protein expression in vivo can typically only be performed at a single time point. Some markers or phenotypic properties may be stably expressed in a given cell population across different conditions (“canonical markers”), while others may be dynamically expressed16; it is currently difficult to distinguish between the two. Dynamical methods using genetically encoded fluorescent reporters exist, but are limited to one or a few genes at a time17. Therefore, there is a need to develop new technology to measure the temporal dynamics, or conditional properties, of many phenotypic markers at a single-cell level in vivo. 

Adequate specificity of experimental access. The current approach to experimental access is to target the desired cell population based on the expression of one gene, or the intersecting expression of two or more genes, typically using recombinases such as Cre or Flp18-21. Anatomical selectivity, in mammals, is provided by stereotactic injection of viruses, often combining gene expression specificity with projections to or from an anatomical target22. Another approach is to begin with a molecularly-defined cell population, and then use transcriptional profiling23-25, activity or anatomical criteria to ask how many subtypes of cells it represents1,8,26. We do not yet know the number and complexity of phenotypic dimensions that will be required to distinguish different cell types as they emerge from the census. The more dimensions required, the greater the complexity of combinatorial methods needed for specific experimental access. This will require transformative new technologies to facilitate “intersectional” approaches in which multiple conditions are met simultaneously. One example would be a method to “trap” cell types based on their activity under particular conditions; such methods already exist27,28, but it would be ideal to restrict their time resolution to milliseconds or seconds rather than hours, and to combine such activity trapping with anatomical and gene expression selectivity. 

A challenge for human applications is that germline modification of the genome is unacceptable for ethical and practical reasons. Existing methods for accessing neurons based on their anatomical locations and projections29 invariably recruit a variety of cell types. Therefore, transformative new technologies will be required to solve this problem for human use. An example would be the development of virally based methods for genome editing in post-mitotic neurons in the context of therapeutic trials30. Another example would be delivery of genes or pharmaceutical agents to cells based on combinations of surface marker proteins, perhaps best targeted by a suite of antibodies, either as direct conjugates or as a means of pseudotyping enveloped recombinant viral vectors31. 

1e. Implementation 

We recommend that the initial stages of this project be focused on obtaining an inventory with molecular, anatomical, and electrophysiological descriptions of all of the cell types in several selected brain regions of organisms such as C. elegans, Drosophila, zebrafish, mouse, and non-human primate, as well as the development of tools for genetic access to all of these cells. These organisms and brain regions would be prioritized based on their interest to large communities of neuroscience researchers and their relevance to human disease. This strategy would identify challenges and opportunities for iterative tactical improvements in technology and process. Among multiple possible starting points, we suggest the following list of important areas in the mouse as an example. This set is intended to complement ongoing efforts in privately funded projects1-5. 

Example brain regions for initial analysis in the mouse: 

Retina. The retina is the region in which the most progress has been made in the characterization of different cell types, and in the generation of reagents to provide access to those cell types. It therefore has the greatest chance of being completed within a 3-5 year period, and could serve as a flagship project for The BRAIN Initiative®. It is relevant to the fields of vision, general sensory and signal processing, and to clinical issues including neurodegenerative diseases and vision disorders. 

Spinal cord. The spinal cord is another area in which a great deal of progress has been made in the identification of different cell types. It is important for understanding the control of locomotion and the function of central pattern generators, and to human neurological disorders such as paralysis, traumatic spinal injury, chronic pain, and motor neuron degenerative diseases such as amyotrophic lateral sclerosis (ALS). 

Hippocampus. This area is an intense focus of basic neuroscience research into learning, memory, and spatial navigation, and is important to human memory disorders such as Alzheimer’s disease. 

Striatum. This area is a focus of neuroscience research in movement control, reward, motivation, and decision-making. It is highly relevant to addiction and to movement disorders such as Parkinson’s disease and Huntington’s disease. 

Amygdala/Hypothalamus. These interconnected regions are a focus of basic neuroscience research into fear, anxiety, feeding, and social behaviors. They are of central concern for psychiatric disorders such as post-traumatic stress disorder (PTSD) and anxiety disorders, and for obesity and eating disorders. 

Prefrontal cortex. The cerebral cortex is highly developed in humans compared to other animals, and the prefrontal cortex in particular is associated with human decision-making, cognition, and emotional behaviors. Its functions are disrupted in schizophrenia and dementia. 

At later stages, as technology evolves to increase the throughput of descriptive analysis, the census of cell types should be expanded to additional brain regions and species, including humans. 

1f. Mechanisms 

The major goals of this project may be achieved using a variety of organizational or operational models ranging from independent laboratories, to distributed collaborative consortia, to regional laboratories or institutes. In any model, it is essential that data be collected in a standardized manner, so that the results obtained from the different brain regions are directly comparable and can be effectively integrated into a common database32. This is challenging because of the number of different experimental modalities required and the lack of standardization in many of them (especially electrophysiology). This is, therefore, a considerably more complex undertaking than the Human Genome Project, where different laboratories all used the same, standardized DNA sequencing technology. The best chance of addressing this concern is to emphasize early sharing of standardized, high-quality data across different laboratories. 

1g. Timelines and Milestones 

The following specific advances can be anticipated from this project: 

Fundamental knowledge - census of cell types, logic of cell type diversification in different brain regions. 

Open-access database of integrated information with computational search tools. 

Reagents for cell-type access, including both new approaches and improved and scaled-up versions of existing methods (e.g. viral vectors, Cre lines). 

Standardized methods for cell type characterization (transcriptional profiling, proteomics, electrophysiology, anatomy, connectional mapping, etc.). 

New methods to approach the cell-types problem in non-traditional organisms, including humans and other primates, and to reduce cost of experiments in non-human animals. 

Short-term goals (1-5 years) 

Arrive at consensus for phenotypic criteria used to define cell types. 

Complete a cell type census in two regions (perhaps retina, spinal cord), and generate a preliminary cell type census in at least four other regions of interest. 

Increase throughput and scale of descriptive cell type census by 10-fold over 5 years. 

Initiate cell type census for other organisms, including non-human primates and humans. 

Produce antibody reagents for cell-type identification, emphasizing cross-species reactivity (rodents, non-human primates, humans) and immunohistochemical applications. Cocktails of monoclonals, and immune reagents directed to cell-surface antigens, are of particular interest. 

Expand the set of reagents for genetic access to cell types in initial brain regions of interest, using existing Cre, Flp, Dre recombinase technology and emerging genome editing technology to facilitate intersectional strategies. 

Create or contribute to a publicly accessible database of cell types; develop associated software for searching/computing on the database. 

Develop new technologies for germline modification-independent approaches to cell type access across species, including non-human primates and humans. 

Develop new technologies for improved multiplexing, compatibility, and dynamics of cellular phenotyping. 

Long-term goals (6-10 years) 

Generate “first draft” cell-type census for all major brain regions in mouse. 

Provide specific genetic access to at least 200 different cell types in the mouse brain, and comprehensive access within specific regions of interest. 

Produce a cell type census for selected brain regions in non-human primates and humans. 

Achieve cell type-specific targeting of optical imaging and optogenetic perturbations in multiple mammalian species, including non-human primates. 

Achieve proof-of-principle cell type-specific targeting of therapeutic manipulations in humans.  

2. Maps at Multiple Scales 

2a. Scientific Goal: Generate circuit diagrams that vary in resolution from synapses to the whole brain 

2b. Overall Objective 

Anatomical connectivity is essential for understanding and predicting the functional signals that underlie cognition and behavior. In addition, anatomical circuit maps will provide new insight into how healthy brain circuits develop in early life and how circuit development and function go awry in psychiatric and neurological disorders. 

2c. Deliverables 

Improvements in the resolution and accuracy of imaging methods to enable complete and accurate mapping of neural circuit structure in human and animal brains. 

Integration of structural data across scales (from whole-brain to ultrastructural) and with other modalities (physiology, molecular biology, biochemistry). 

Discovery of anatomical differences between healthy and diseased brains. 

Among the scientific questions to be addressed by this goal: 

Are there circuits unique to the human brain that could help account for its amazing capabilities, such as language? 

Can macro-connectomic connectivity patterns serve as biomarkers for brain disorders? How widely can they be used for differential diagnosis of brain disorders, monitoring disease progression, and predicting or monitoring responses to therapy? 

Do some brain disorders result from anatomical "connectopathies" with stereotyped defects in neural circuitry? 

What changes in circuits accompany, and perhaps cause, age-related cognitive decline? 

Can we predict circuit function from wiring diagrams of connectivity? 

How different are connectivity patterns in genetically identical organisms in a species, ranging from isogenic flies and worms to identical twin humans? 

Can individual variations in connectivity be related to or even predict individual differences in brain function? 

2d. Rationale and Principles 

See also Sections II.1B and 6 of this report. 

The BRAIN Initiative® aims to develop technologies that will allow us to discover how patterns of activity in neural circuits account for mental processes and behavior. For much of the past century, few methods were available to map circuit function, so structural maps served as proxies for functional maps. The beautiful circuit diagrams of Ramon y Cajal epitomize both the power of this approach and its limitations. Over the past decade, new methods have emerged to map circuit activity directly and on a large scale. Nonetheless, anatomical connectivity remains essential for identifying the actual mechanism at work in the nervous system and making useful predictions about the development and function of neural circuits in health and disease. 

Structural (anatomical) connectivity can be mapped at several levels, which have been termed “macro-, meso-, and micro-scale.” 

The macro-connectome is the map of connectivity patterns across the entire brain at the level of centimeters to millimeters. This term is generally applied to analyses of the human brain, because few of the finer-scale methods used in experimental animals are currently applicable to humans. Macro-connectomic efforts are best suited for determining which brain regions are interconnected. 

The meso-connectome is intermediate in spatial resolution. It refers to the mapping of connections to and from brain areas at millimeter to micron scale. Such maps are sometimes called “projectomes.” New methods allow extension of these approaches to mapping projections of specific cell types and enhancing maps with measures of synaptic connectivity. Meso-connectomic efforts are underway for experimental animals32,33, but their application to humans is just beginning. 

The micro-connectome is a map at micron to nanometer resolution, which aspires to document the location of every neuron and/or synapse in a restricted brain region. It currently relies on ultrastructural methods that in principle can be applied to any tissue or species but have, in fact, so far been applied in a high-throughput fashion only to invertebrates and rodents. 

2d-i. Macro-Connectomic Methods 

Current human connectomic efforts rely on noninvasive MRI-based methods34-36. Diffusion-weighted (DW) MRI exploits the anisotropy of water diffusion along myelinated axons to map white matter tracts37. Resting-state functional MRI (rfMRI) measures correlations in spontaneous activity across areas in resting subjects. It provides an indirect measure of anatomical connectivity based on the assumption that correlations are greatest among connected regions38. 
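
To make the logic of correlation-based connectivity concrete, the minimal sketch below (an illustration only; the simulated signal model and region labels are assumptions, not data or methods from this report) builds toy resting-state time series for three regions and computes the pairwise correlation matrix that rfMRI analyses treat as a proxy for connectivity.

```python
import numpy as np

# Toy illustration of correlation-based "functional connectivity" (the rfMRI logic).
# The signal model below is an assumption for demonstration, not an rfMRI method.
rng = np.random.default_rng(0)
n_timepoints = 600                                  # assumed length of a resting scan
shared = rng.standard_normal(n_timepoints)          # fluctuation shared by "connected" regions

regions = {
    "A": shared + 0.5 * rng.standard_normal(n_timepoints),   # A and B share a driver
    "B": shared + 0.5 * rng.standard_normal(n_timepoints),
    "C": rng.standard_normal(n_timepoints),                   # C is independent
}

names = list(regions)
data = np.vstack([regions[n] for n in names])
fc = np.corrcoef(data)        # pairwise correlations: the toy "connectivity" matrix

for i, name_i in enumerate(names):
    for j, name_j in enumerate(names):
        if i < j:
            print(f"corr({name_i}, {name_j}) = {fc[i, j]:+.2f}")
# Regions driven by the shared fluctuation correlate strongly; the independent one does not.
# rfMRI treats such correlations as indirect evidence that regions are connected.
```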

These methods, which are the only ones currently applicable to living humans, have provided valuable information and are transforming both basic and clinical human neuroscience. They are, however, limited in several respects. First, their spatial resolution is relatively coarse: the smallest voxel that can be discriminated (>1 mm3) contains tens of thousands of neurons. Second, limited sensitivity leads to a high incidence of false negatives and positives. Third, while diffusion and resting state measures have been shown to reliably identify several known connections, they have not been adequately and comprehensively validated. Notably, current tractography methods are biased toward some brain regions, such as the caps of gyri, and away from others. In addition, the directionality of connections cannot be inferred from these methods. To overcome these limitations, higher resolution imaging methods, improved means for capturing relevant signals, and critical tests of current methods are needed. 

2d-ii. Meso-Connectomic Methods 

The best-developed meso-connectomic methods involve injection of a tracer into a small, well-mapped brain area, followed by enumeration of the areas and cell types that send projections to (retrograde tracers) or receive projections from (anterograde tracers) the labeled region. Tracers include fluorescent dyes and viral vectors39. Such maps are called “projectomes.” 

Projectomes can be enhanced in several ways. (a) Viral vectors can be targeted to specific cell types, using Cre-Lox technology, fractionating a single projectome into maps that reveal distinct connectivity of the many cell types that reside within a region40. (b) Multicolor methods can be used to resolve subsets of axons within a tract41-43. (c) Transsynaptic methods, generally using viral vectors, enable spread of a tracer from a neuron to other neurons that it synapses upon (anterograde) or that synapse on it (retrograde), revealing characteristic patterns, or ‘motifs’, of synaptic connectivity44. (d) Methods such as GRASP identify synapses made between labeled pre- and postsynaptic cells, revealing not only the identity of the connected cells but also the location of the synapses that connect them45,46. 

Meso-scale methods are valuable to BRAIN Initiative efforts for several reasons. First, they provide information needed to interpret activity-based maps, assuming that the anatomical and physiological datasets can be registered. Second, when used in conjunction with novel methods for clearing and visualizing tissue47-50, large blocks of tissue can be imaged, speeding analysis. Third, some of these methods can be used to map human tissue obtained at autopsy and from brain banks49. Fourth, these methods can be scaled up to permit analysis of multiple samples, allowing assessment of variability among individuals and comparison of normal and pathological tissue. Fifth, meso-connectomic efforts can be augmented to provide information about “chemical circuitry,” as exemplified by neuromodulatory signaling mechanisms, which can be detected molecularly with antibodies or genetic markers. 

On the other hand, substantial challenges remain. First, the fraction of all authentic connections that are reliably detected remains uncertain. Second, current projectomic methods are well suited to map long-range connectivity but poorly suited to map connectivity within small regions, because many cells are generally targeted at once and the density of interconnection is high. Third, methods based on viral vectors and transsynaptic tracers require increased sensitivity, resolution, and generality and decreased toxicity. Fourth, human brain bank resources are inadequate to meet new meso-connectomic opportunities, in terms of tissue availability, quality and curation. 

2d-iii. Micro-Connectomic Methods 

To date, micro-connectomes have been obtained exclusively by electron microscopy (EM), because only this method has the resolving power needed to map individual synapses and the finest-caliber neurites with certainty. The original connectome, that of C. elegans, was obtained by serial section transmission electron microscopy (TEM)51. This approach remains the gold standard but is technically arduous. Much attention has been focused recently on devising methods that are more amenable to automation of section preparation, such as serial block-face (SBF) and focused ion beam (FIB) scanning EM and automated tape ultramicrotomy (ATUM). Conventional and SBF microscopy have been used to generate large connectomic datasets over the past few years52-57. It should be noted that specimen preparation and fixation are still “dark arts” that could bear modernization and improvement. Technology development in this area should continue, but at this writing it remains unclear which of the currently available methods will be best suited for large-scale efforts. 

A major and serious challenge to this field, which could be addressed by The BRAIN Initiative®, is that advances in methods for sectioning and imaging have not been matched by advances in methods for image segmentation and reconstruction and, for conventional methods, section alignment. It takes orders of magnitude more time to generate reconstructions than to obtain the original data. Thus, the bottleneck now is data analysis rather than data acquisition58,59. Even for block-face methods in which no specimen alignment is needed, no existing computational segmentation algorithms are sufficiently accurate to replace human annotators. One rough estimate is that complete reconstruction of a single 1 mm cube of cortex (~0.2% of the whole brain) would take 10,000 person years59 and cost >$100 million. Ongoing improvements in computational methods and crowd-sourcing approaches are chipping away at the problem, but large-scale investment in micro-connectomics should be tied to substantial progress in this area. 
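
The scale of this bottleneck can be illustrated with back-of-envelope arithmetic. In the sketch below, the voxel size, bytes per voxel, and annotation throughput are assumed round numbers chosen only to reproduce the order of magnitude of the estimate cited above; they are not figures from this report.

```python
# Back-of-envelope arithmetic for why reconstruction, not imaging, is the bottleneck.
# All parameter values are assumed round numbers, not figures from this report.
voxel_nm = (4, 4, 30)        # assumed lateral resolution and section thickness (nm)
volume_mm3 = 1.0             # one cubic millimeter of cortex

nm_per_mm = 1e6
voxels = volume_mm3 * nm_per_mm**3 / (voxel_nm[0] * voxel_nm[1] * voxel_nm[2])
petabytes_raw = voxels / 1e15                  # assuming ~1 byte per voxel, uncompressed
print(f"Voxels to image: {voxels:.1e} (~{petabytes_raw:.0f} PB of raw data)")

# Human proofreading/annotation throughput dominates the cost.
annotation_um3_per_hour = 50.0     # assumed throughput, chosen to match the cited order of magnitude
person_hours = volume_mm3 * 1e9 / annotation_um3_per_hour   # 1 mm^3 = 1e9 um^3
person_years = person_hours / 2000.0                        # ~2,000 working hours per year
print(f"Manual annotation at that rate: ~{person_years:,.0f} person-years")
```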

Because micro-connectomic maps are so expensive in money and time, and because they can only be obtained for very small brains or for local circuits (microcircuits) within large brains, there is no consensus concerning their priority within The BRAIN Initiative®. There is no doubt that reference connectomes would be of great value59. On the other hand, it remains unclear how variable micro-connectomes will be among individuals within a species, or even over time in a single individual60,61. This variability would not be an insurmountable problem if it were feasible to routinely map micro-connectivity on the same individuals that have been analyzed physiologically, or to compare normal and pathological tissues in substantial numbers of individuals. This is not yet feasible, however. Thus it is especially important to focus now on developing technologies that will drive down the cost of connectomics. 

It is also worth considering alternative methods for generating micro-connectomes. A major advance over the past decade has been the invention of a suite of super-resolution light microscopic methods — STORM, PALM, STED, etc. — with a resolution far greater than that of diffraction-limited light microscopy62,63. Super-resolution methods allow use of multiple colors, superposition of molecular (immunohistochemical) labels on neurites and synapses, and even live imaging. Current resolution of these methods is in the range of 10-30 nanometers, which is nearly sufficient for micro-connectomic mapping in at least some tissues. Substantial improvement in super-resolution methods could provide an attractive alternative to ultrastructural approaches. 

2e. Implementation 

Methods for structural mapping at all levels remain inadequate. We therefore focus in the first five years on developing new techniques and improving promising ones. As methods improve, we anticipate that they will be increasingly useful for high-throughput comparison of circuits in normal and pathological tissue, both in experimental animals and in humans. 

2e-i. The Macro-Connectome 

Short-term goals (1-5 years) 

An urgent need is to determine the relationship of currently used proxies of connectivity in humans (DW MRI and rfMRI) to direct measures of synaptic connectivity as determined structurally or functionally. We recommend investment in imaginative studies aimed at that goal. For example, recent work suggests that focused ultrasound could be used for focal stimulation of small (~1 mm3) brain regions64-66. Coupling such stimulation with measurement of BOLD-based functional imaging signals (fMRI) could generate a connectivity map that could be compared with maps inferred from DW MRI and rfMRI. Other approaches to be pursued in parallel include direct validation of MRI measures in human or non-human primates using projectomic methods, with the goal of decreasing the regional biases of DW MRI anatomical imaging methods and increasing their sensitivity; studies in non-human primates suggest that classical tracing methods and DW MRI tractography only correlate well for the strongest 20% of connections. Combinations of imaging in vivo followed by projectomic analysis ex vivo in non-human primates might reveal new MRI “signatures” that could be applied to humans. 

A second impediment to progress is the limited resolution of MRI methods; improving resolution will facilitate studies of both connectivity (this section) and activity (next section) in human brains. Resolution limits are currently ~8 mm3 (e.g. 2x2x2 mm voxels) for rfMRI and ~2 mm3 (e.g. 1.25x1.25x1.25 mm voxels) for DW MRI using advanced 3 Tesla imaging platforms. We support efforts to improve resolution significantly, to reach voxel volumes of ~0.3 to 0.4 mm3. 
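
For readers converting between voxel edge length and voxel volume, the short worked example below does the arithmetic for the current and target resolutions discussed in this section; the cortical neuron density used to estimate neurons per voxel is an assumed round number for illustration only.

```python
# Worked arithmetic relating voxel edge length, voxel volume, and neurons per voxel.
# The cortical neuron density is an assumed round number used only for illustration.
neurons_per_mm3 = 50_000   # assumed order-of-magnitude density

for label, edge_mm in [("current rfMRI", 2.0),
                       ("current DW MRI", 1.25),
                       ("short-term target", 0.70),
                       ("long-term target", 0.46)]:
    volume_mm3 = edge_mm ** 3
    neurons = volume_mm3 * neurons_per_mm3
    print(f"{label:>17}: {edge_mm:.2f} mm voxels -> {volume_mm3:4.2f} mm^3 "
          f"(~{neurons:,.0f} neurons per voxel)")
# 2.00 mm -> 8.0 mm^3; 1.25 mm -> ~2 mm^3; ~0.70 mm -> ~0.34 mm^3; 0.46 mm -> ~0.1 mm^3
```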

Long-term goals (6-10 years) 

Resolution of ~0.4 mm3 for rfMRI and DW MRI will enable numerous new studies, but we should not be satisfied with this goal. Vigorous pursuit of new approaches holds promise for improvement in resolution to 0.1 mm3 or better in 6-10 years. This would provide the ability to visualize connectivity in the human brain at the level of layers and small ensembles such as cortical columns. 

MRI methods are already being used to detect individual differences among human brains that correlate with individual differences in brain function and genotype. As resolution and accuracy improve, new opportunities will arise for generating biomarkers for early diagnosis and providing mechanistic analysis of brain disorders. We support pilot projects to test the potential of these methods. 

2e-ii. The Meso-Connectome 

Short-term goals (1-5 years) 

The greatest need in this area is to improve the specificity and reliability of methods for projectome and cell-type-to-cell-type tracing. For example, transsynaptic tracing methods are inadequately validated; it is not clear whether iterative improvements will suffice or whether fundamentally new approaches will be needed. For viral vectors, decreasing toxicity and exploring methods for selective targeting to specific cell types in genetically inaccessible species, especially humans, are essential. Optical and computational methods for efficient, high-resolution collection of multidimensional datasets from large volumes, and registration of these datasets with cellular-resolution activity information, will need to be developed and improved. Emphasis should be placed on methods that can be applied to non-human primates in vivo, and to human autopsy or biopsy material. 

Long-term goals (6-10 years) 

Based on anticipated methodological advances in the first few years of The BRAIN Initiative®, we propose vigorous efforts to map projectomes with cell-level resolution and cell type-specificity in healthy and pathological tissue. Mapping could begin with projectomes of key brain regions in normal animals, and in animal models of human brain disorders (e.g. autism or Parkinson’s disease). As methods improve, it will be possible to (a) register connectivity with activity mapped in the same brains, (b) scale up mapping to compare structure and function in whole brains of normal animals and disease models, and (c) map projectomes in key areas of normal human brains obtained at autopsy, as well as brains of people with brain disorders. 

2e-iii. The Micro-Connectome 

Short-term goals (1-5 years) 

Tests of competing EM approaches — e.g., FIB, ATUM, SBF, TEMCA — are already underway, as are efforts to generate sample connectomes — e.g. of C. elegans larvae, fly optic lobe, and mouse retina. A major contribution of The BRAIN Initiative® should be to spur improvements in segmentation and reconstruction methods because these are the bottlenecks in micro-connectomic research. Machine learning, crowd-sourcing and other promising approaches could be pursued in parallel. Importantly, segmentation methods developed in this effort will be applicable to light as well as electron microscopic data sets. 

We also recommend exploring the possibility that super-resolution light microscopic methods could be improved to the point of providing micro-connectomic (nanometer-level) mapping of at least some circuits. 

Although segmentation algorithms will continue to mature, we anticipate that within a few years methods will be sufficiently developed to support collection of a few large data sets that will not only be immediately useful but also provide a way to explore how micro-connectomes can be used in conjunction with other types of data. We propose this challenge: Monitor the behavior of an individual zebrafish over a protracted period (at least several hours) while simultaneously recording activity from as many neurons as possible. Then, prepare the brain of that same zebrafish, submit it to volume EM or ultra-high resolution light microscopy with multiple molecular labels, and reconstruct many and perhaps all connections in its brain. 

Long-term goals (6-10 years) 

As segmentation methods improve further, it will be possible to complete sparse reconstructions of key areas in brains of normal animals and in selected animal models of human brain disorders. Likewise, we anticipate that initial segmentation of key areas within normal and pathological human brain will be feasible within 10 years. 

An aspirational goal, dependent on decreased cost and increased speed, will be to reconstruct key areas from multiple individuals, thereby mapping individual variations in connectivity, first in experimental animals and later in humans. This accomplishment would be transformational, because individual variations in neural connections and activity patterns underlie the remarkable behavioral and cognitive differences among individuals. 

2f. Mechanisms 

During the first few years, the emphasis will be on improving current technology and exploring the broadest possible space in search of new technologies. This work is best supported by investigator-initiated mechanisms. In later years, we anticipate that continued technology development will be accompanied by consensus on best practices for more focused efforts — for example, for mapping projectomes in healthy and pathological tissue. This work may require a larger investment, and consortia involving several sites working in coordination may be desirable. Very large-scale centers may eventually be required for large-scale connectomic efforts, but that will depend on the scientific problems to be addressed and the available technology. 

2g. Timelines and Milestones 

Short-term goals (1-5 years) 

Validate MRI-based methods for macro-connectomic mapping of connectivity in humans. 

Improve MRI resolution to ~0.3-0.4 mm3 (also see Section III.3). 

Develop a high-quality toolbox of methods for efficiently mapping and annotating projectomes in experimental animals, including non-human primates, as well as in human tissue blocks. 

Reduce the time needed to segment volume EM data sets by one hundred- to one thousand-fold. 

Reconstruct micro-connectomes of individual animals that have been studied physiologically and behaviorally—e.g. dense reconstruction of a zebrafish brain following activity imaging of the behaving animal. 

Develop new techniques for using electron and/or super-resolution light microscopy to integrate molecular signatures of cells and synapses with their nanoscale connectivity. 

Long-term Goals (6-10 years) 

Improve MRI resolution to better than 0.1 mm3 (see also Section III.3). 

Obtain high resolution, macro-scale connectomes of hundreds to thousands of normal and disordered human brains, with accuracy sufficient to provide individual maps under different conditions, in standardized formats that allow broad access and comparisons of these reference data with new data. Identify connectional structures underlying psychiatric and neurological disorders. 

Obtain projectomes in normal animals and disease models. 

Obtain projectomes in healthy and pathological human brain. 

Produce microscale reconstruction of key areas from multiple individuals in an animal model, following behavioral and physiological analysis, thereby relating individual variations in connectivity to functional differences.  

3. The Brain in Action 

3a. Scientific Goal: Produce a dynamic picture of the functioning brain by developing and applying improved methods for large-scale monitoring of neural activity 

3b. Overall Objective 

Large-scale monitoring of neural activity is at the heart of The BRAIN Initiative®, providing the means to map and characterize the changes in electrical and chemical signaling underlying mental processes. It will also serve as a foundation in translational applications involving brain monitoring and stimulation, including the restoration of lost or aberrant neural function. As such, it is a core technology for The BRAIN Initiative® and considerable effort should be made to develop both new and improved large-scale recording methods. Technology development should proceed with specific biological applications in mind that would, if successful, provide new information on the neural circuit basis of brain function, or provide new capabilities in therapeutics. Thus, the emphasis is on technology applicable to the study of the awake brain during quantifiable behavior, providing a basis for the interpretation of neural signals. 

3c. Deliverables 

New and improved electrodes for large-scale recording. Objectives include increased number and density of recorded neurons, access to more brain areas, increased reliability, and minimal invasiveness and tissue reaction. Longer-term objectives include practical devices incorporating nanowires or other nanofabricated structures for in vivo high-density extracellular or intracellular recording. 

New and improved optical sensors of neural activity, both electrical and chemical. Objectives include better fluorescent indicators, spectroscopic molecular signatures, or nanoparticle probes, preferably with cell-type specific targeting, for membrane voltage, neurotransmitter and neuromodulator concentrations, synaptic activity, and biochemical processes. 

New and improved instruments for optical monitoring of neural activity. Objectives include new and improved imaging methods for monitoring populations of neurons at high density within local brain circuits in any brain area, and, ultimately, across many brain regions simultaneously, during quantitatively measured behavior. Compatibility with perturbation technology (Section III.4) will be highly desirable. 

Development of improved technology for monitoring human brain activity, including extending the resolution of current-technology fMRI measurements to 0.1 mm3 (0.46 mm voxels) and new methods for making human fMRI measurements. Such new methods could involve current fMRI approaches implemented with new instrumentation and spatial encoding techniques to perform studies in natural settings. A long-term goal is a new technology with cellular spatio-temporal resolution that could interrogate large portions of the human brain. 

Scientific questions addressed by this goal: 

How is sensory information transformed into higher-order perception? 

How is short-term working memory encoded, maintained, and read out? 

What are the circuit mechanisms underlying decision-making? 

What fundamental logic and mechanisms mediate motor control? 

How do multiple brain areas communicate and work together as behavior and task demands change? 

How can we reliably detect multiple brain states that occur during wakefulness and sleep? What are the unique functions of these states? 

How do neuromodulatory signals remodel circuit dynamics and brain states? 

How are internal cognitive models of the world encoded, updated and accessed to make predictions and guide future actions? 

3d. Rationale and Principles 

See also Sections II.2 and II.6 of this report. 

A diverse set of new and improved methods for monitoring neural activity in the functioning brain will be the first deliverable anticipated from this project during the first 5 years. As the project develops and matures, we anticipate that these methods will be increasingly used to produce a second deliverable and a primary goal of The BRAIN Initiative®: the construction of a dynamic picture of brain function that integrates neuronal and circuit activity over multiple temporal and spatial scales. Accordingly, progress in later years will increasingly apply large-scale recording methods to biological questions, often in experiments in which recording is integrated with other methods such as cell type identification (Section III.1), anatomical circuit mapping (Section III.2), perturbation (Section III.4), behavior, and theoretical analysis (Section III.5). 

New recording technologies should provide the ability to map at unprecedented resolution and scale the electrical and chemical activity of populations of neurons in the awake brain during cognition, emotion, and behavior. These new data will provide the basis for a conceptual understanding of neural coding: how information relevant to the brain state, sensory stimuli, or other variables is encoded in this activity. Following the changes in neural activity over time—the neural dynamics—will provide key information for establishing the computational function of a neural circuit, and hypotheses about how the brain works will be subject to direct experimental observation. For example, are decisions represented by diverging sequences of neural activity that become progressively less similar over time (i.e. a bifurcation in neural dynamics)? Is the short-term memory trace of a face represented by a pattern of activity on a low-dimensional attractor manifold within the connected brain areas that process face information (the continuous attractor hypothesis)? 
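
As one illustration of what "diverging sequences of neural activity" might look like in analysis, the toy sketch below simulates population firing rates for two decision outcomes and projects them onto their leading principal components; the rate model, neuron count, and all parameters are assumptions made for demonstration, not an analysis prescribed by this report.

```python
import numpy as np

# Toy sketch of "diverging neural trajectories" for two simulated decision outcomes.
# The rate model, neuron count, and all parameters are assumptions for illustration only.
rng = np.random.default_rng(1)
n_neurons, n_time = 50, 100
t = np.linspace(0.0, 1.0, n_time)

common_mode = rng.standard_normal(n_neurons)    # dynamics shared by both conditions
decision_mode = rng.standard_normal(n_neurons)  # pattern that carries the choice signal

def population_rates(choice_sign):
    """Simulated population activity (neurons x time) for one choice."""
    shared = np.outer(common_mode, np.sin(2 * np.pi * t))   # condition-independent dynamics
    drift = choice_sign * np.outer(decision_mode, t)        # choice-dependent drift grows with time
    noise = 0.2 * rng.standard_normal((n_neurons, n_time))
    return shared + drift + noise

rates_a = population_rates(+1.0)
rates_b = population_rates(-1.0)

# Project both conditions onto the top two principal components of the pooled data.
pooled = np.hstack([rates_a, rates_b])
pooled -= pooled.mean(axis=1, keepdims=True)
u, s, vt = np.linalg.svd(pooled, full_matrices=False)
traj_a = u[:, :2].T @ rates_a     # 2 x n_time low-dimensional trajectory, choice A
traj_b = u[:, :2].T @ rates_b     # 2 x n_time low-dimensional trajectory, choice B

separation = np.linalg.norm(traj_a - traj_b, axis=0)
print(f"Trajectory separation, early vs. late: {separation[5]:.2f} -> {separation[-5]:.2f}")
# Growing separation over time is the kind of signature the text calls a bifurcation in
# neural dynamics; with real recordings the projection would come from measured rates.
```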

3e. Metrics for Progress in Recording Technology 

A diverse set of methods and instrumentation is used for detecting large-scale neural activity. Nevertheless, the technologies for large-scale neural recording can be evaluated using metrics that allow comparisons between technologies and provide quantitative means to define goals and milestones. For many of these metrics we expect transformational improvements (e.g. 10-100x or more), which provides the motivation and the basis for setting milestones for large-scale neural recording within The BRAIN Initiative®. 

There are often important engineering tradeoffs between the different metrics, and we do not expect that individual technologies will be able to simultaneously achieve optimum performance on all metrics. Rather, we expect a range of technologies will arise that possess complementary strengths and address different experimental challenges and gaps in our present knowledge of brain function. For example, although the number of neurons that can be concurrently recorded is an important metric, one must additionally understand the extent to which a candidate technology offers the capability to record from specified cell types, and to sample densely and in a volumetric fashion within basic micro-architectural units such as individual cortical columns (see Section III.7). 

The Metrics: 

Number of neurons recorded. For extracellular recording with the present generation of microelectrodes (tetrodes67,68/silicon probes69-71), or for current calcium imaging methods72-74 at high speed, the number of neurons recorded simultaneously typically ranges from tens to hundreds. There is a tradeoff in optical imaging between the number of neurons recorded and temporal resolution: at low temporal resolution, up to 100,000 neurons have been recorded in the transparent larval zebrafish75. Increasing the number of neurons recorded at a particular temporal resolution can be used as a metric for progress. 

Volumetric recording. Recording methods differ with respect to the geometrical arrangement of the individually recorded neurons. For example, an individual tetrode records from a cluster of closely spaced neurons near the tip, while the clusters recorded by each tetrode are typically at least several hundred microns apart. Single silicon probes can typically sample many points along the probe’s shaft, but have limited sampling orthogonal to the probe’s axis76. In optical imaging, imaged neurons typically have cell somata geometrically located within a limited two-dimensional image plane. In all cases, it would be desirable to develop methods in which any neuron within the tissue could be accessed by a given recording technology, enabling large-scale recordings of neural activity across a 3-D volume of tissue. Examples of approaches for 3-D recording include multi-contact silicon electrode arrays77 and multi-plane 3-D volumetric optical calcium imaging78. 

Density of recordings. Many regions of the brain have conserved, repeating motifs in their micro-circuitry that seem to represent basic micro-architectural and functional units. To understand the function and collective activity patterns of these units, such as cortical columns or cerebellar microzones, we need recording techniques that offer dense, ideally complete, sampling capabilities, across the volumes of tissue typical of the individual units (from fractions to multiple cubic millimeters). 

Temporal resolution. Action potentials can have a duration of less than 1 millisecond; synaptic events last 10-500 milliseconds; slow modulation currents and potentials develop over seconds to minutes. Recording methods differ in their ability to follow these different temporal scales. Faster time resolution for action potential measurement is a common goal of recording methods, but it is not an exclusive goal or benchmark, since there is much to be learned at the synaptic and neuromodulation time scales. 

Successful and routine access to any neural structure. A technology that is very informative may nevertheless have a low success rate (e.g. in vivo patch recordings in behaving animals), or may not be equally applicable to all brain areas or cell types. Cells in deep brain areas and cells that are small, or have low activity rates, can be challenging to access and identify with existing recording methods. Methods that provide routine access to any anatomical structure and neuron type in the brain, at single-cell resolution, would be highly valuable. 

Compatibility with stimulation of neural activity at cellular resolution. Experimental manipulation of neural activity is an important tool for testing causality in the characterization of neural circuits, and is increasingly used in therapeutics (see Section III.4). Ideally, the coding and dynamic properties of neurons in a circuit could be characterized with large-scale neural recording, and then a particular spatial and temporal pattern of stimulation, tailored to those neurons and the question or therapy at hand, could be applied to the system. These goals can be met by the development of compatible optical imaging and stimulation instrumentation and spectrally-separated optical sensors and effectors; by improved high-density electrodes with both recording and stimulation capability; or by electrode arrays for use with simultaneous optogenetic stimulation. 

Ability to anatomically identify the recorded cells. A mechanistic understanding of neural circuit function will be aided by combining large-scale recording with methods that provide information on the cell type, anatomy, and synaptic connectivity of the recorded neurons. This information can be obtained during experiments in real time by targeting genetically encoded sensors to specific cell types, or by optogenetic stimulation of genetically tagged neurons, or it can be obtained post-hoc in fixed tissue subjected to histological or connectomic analysis. 

Ability to identify and record from the same neurons over long periods of time. The development of methods that provide reliable cell identification between recording sessions and over longer and longer time intervals will provide new capabilities of particular importance to understanding how neural circuits change over the course of development, learning, or disease progression. 

Minimal tissue damage due to invasiveness of method, or chronic tissue reaction. These issues are critical when recordings are made in humans. Existing technology leaves much room for improvement in invasiveness and biostability, and considerable improvement is needed in minimizing tissue reactions. 

Resolution and fidelity of recording of activity with MRI techniques. Variants of fMRI have been shown to reach the spatial scales of cortical columns, even with some degree of cortical layer differentiation. These MRI approaches can provide information concerning the averaged activity of these ensembles and their dynamics over a large volume including the entire human brain. However, the sensitivity needed for these measurements often requires extensive data acquisition times. Significant gains in signal-to-noise and speed of MRI are needed to achieve robust detection of activity in sub-columnar clusters with better layer resolution and over large volumes. With improvements in instrumentation as well as image acquisition and analysis techniques, MRI should aim to reach volumetric resolution of 0.1 mm3 with high fidelity to neuronal signals triggering the MRI responses. Such a capability would complement information obtained through high-density electrical or optical recordings within volumes of tissue equivalent in size to MRI voxels (from fractions to multiple cubic millimeters). Although the main goal of fMRI work is human neuroscience, validation of these technologies will require work in non-human animals that combines single-cell recording methods with MRI method development. 

Multi-area population recording at cellular resolution. Many current large-scale recording technologies, such as microelectrode arrays and calcium imaging, are typically used to record from one brain area at a time. Questions about how information is conveyed from area to area in the brain, or how multiple areas work together for cognitive tasks like decision-making, would benefit from improved methods by which several or many separated areas can be sampled at the same time. 

Subthreshold and subcellular voltage signals. Extracellularly recorded spikes and field potentials are the signals currently employed in large-scale recordings with microelectrodes. However, dendritic activity and sub-threshold dynamics both hold substantial interest. New approaches such as nanowires79,80 and automated patch recordings81 offer opportunities to expand multi-neuron recording to direct measures of membrane potential, which would be enormously beneficial in characterizing how synaptic inputs and intrinsic cell dynamical properties such as oscillations and plateau potentials contribute to response properties. Similarly, new optical probes such as genetically encoded or nanoparticle-based optical voltage sensors82,83 offer prospects for dendritic recordings and may bring in vivo optical imaging of membrane voltage into practical reality. 

Chemical and biochemical signals of brain activity. Current methods for measuring neuropeptides and biogenic amines are insensitive and spatially crude; their activity is often inferred indirectly via measurements of electrical activity, leading to a substantial loss of information. Yet these systems are the targets of most psychiatric drugs and drugs of abuse, indicating that they have potent brain functions. Improved methods are needed for direct measurements of neurotransmitters and neuropeptides, the release of different classes of synaptic vesicles, and biochemical states within neurons. 

Effectiveness and efficiency of data collection. A major bottleneck of large-scale neural recording technologies is not the sensors per se, but how to get the recorded information from sensor to an external device for analysis or archiving. For example, advances in silicon multi-site electrodes for large-scale neural recording will require advances in on-chip multiplexing, filtering, digitization, and communication. Similarly, nanoparticle-based optical detectors of voltage will require methods to read out all probes independently and simultaneously84. 

Compatibility with interesting behavior. Large-scale recording methods often provide constraints on the type of behavior that can be studied due to requirements such as head restraint and tethers. Two approaches to reducing these constraints show promise. First, new behavioral paradigms such as virtual reality systems85 and voluntary head fixation systems86 provide a greater range of behaviors, such as navigation and decision making in partly-restrained animals. Second, wirelessly powered and telemetered miniaturized electrode arrays87 and optical imaging instrumentation can be used for head-mounted systems on freely behaving animals, including humans. 

3f. Implementation and Mechanisms 

The future development of new recording technologies will increasingly require the participation of scientists from physics, chemistry, molecular biology, electrical and neuroengineering, material sciences, and computer science. One rate-limiting factor for development of this type of technology is social: neuroscientists must be able to interact with scientists and engineers from the physical and information sciences, and these interactions must happen organically. Indeed, human MRI technology has been successfully developed in groups spanning engineers, physicists, signal processors, mathematicians and neuroscientists, both at the level of individual labs and in large consortia (e.g. the Human Connectome Project). More recently, the HHMI Janelia Farm Research Campus has hired experienced physicists, chemists, engineers, and computer scientists who work alongside neuroscientists, with incentives to develop new technologies for biology. Mechanisms must be provided that facilitate this kind of interaction within The BRAIN Initiative® at the level of a single university lab or center, or in a consortium. In addition, educational programs that allow physical and information scientists to learn biology would provide an entry point for new engineers and scientists from other disciplines to learn about the opportunities provided by this initiative. 

It is also highly desirable to involve industry partners who have experience in manufacturing products for use in humans, as their resources and expertise often complement those of academic researchers. Many devices potentially have enormous development and production costs. A single device for human clinical applications can take $100-200 million to become a commercial, FDA-approved product. The rate of tool or device innovation and development is linked to the amount of funding available. Forming industry partnerships is one way to mitigate the costs to the NIH. 

In general, new programs and funding strategies are needed to bring together people with varying expertise (e.g., physics, chemistry, molecular biology, electrical engineering, bioengineering, computer science) and industry organizations capable of commercializing new recording technologies. Small groups of collaborators can be very effective, but larger multi-million dollar projects may be required in some cases for engineering and technology development beyond what is normally possible in a basic research setting. Future BRAIN leadership could evaluate the Initiative’s technological progress in its early years, and the success or failure of larger scale efforts presently ongoing in the field, in order to gauge the importance of future large projects. All mechanisms should operate in a way that does not isolate potential developers and users of the technology from one another, but rather drives innovation through consortia and collaborations. 

3g. Timelines and Milestones 

3g-i. Microelectrode Recording 

Short-term goals (years 1—5) 

High-density penetrating silicon microelectrode arrays (100 micron spacing, 4x increase in density compared to existing arrays88), with reliability and recording quality appropriate for a standardized laboratory method. 

Silicon probes with multisite electrodes containing hundreds to thousands of 3-D distributed contacts, with on-probe electronics for filtering, amplification, multiplexing, and digitization, capable of recording and stimulating cells across all layers in a cortical column (a rough data-rate sketch at the end of this list illustrates why on-probe digitization matters). 

Flexible (electrocorticography/ECoG) surface electrode grids (wirelessly powered and reporting); mm and sub-mm scale electrode spacing, designed primarily for primates, including human applications. 

Proof of principle demonstrations, using in vitro preparations as a first step toward in vivo application, of novel electronic devices using nanowires or other nanofabricated structures for neural recording. 

Integrated technology, recording and perturbation: fully integrated, robust wireless duplex (two-way) electronics on-chip for remote powering, recording and stimulation with transmission distances of 1 meter. 

Integrated technology for human use: practical electrodes for deep brain population recording and stimulation at cellular resolution; methods to reduce chronic tissue reaction and improve the reliability, biostability, and long-term performance of chronic electrodes. 
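
As a rough illustration of why on-probe multiplexing and digitization appear among these milestones (referenced from the silicon-probe item above), the arithmetic below estimates the raw data rate of a high-channel-count probe; the channel count, sampling rate, and bit depth are assumed round numbers rather than specifications from this report.

```python
# Rough data-rate arithmetic for a high-channel-count extracellular probe.
# Channel count, sampling rate, and bit depth are assumed round numbers.
channels = 1_000          # "hundreds to thousands" of contacts (milestone above)
sample_rate_hz = 30_000   # assumed extracellular sampling rate
bits_per_sample = 16      # assumed ADC resolution

bits_per_second = channels * sample_rate_hz * bits_per_sample
megabytes_per_second = bits_per_second / 8 / 1e6
gigabytes_per_hour = megabytes_per_second * 3600 / 1000

print(f"{channels} channels at {sample_rate_hz // 1000} kHz, {bits_per_sample}-bit samples:")
print(f"  ~{megabytes_per_second:.0f} MB/s of raw data (~{gigabytes_per_hour:.0f} GB per hour)")
# Getting this off the probe is why on-chip multiplexing, filtering, digitization,
# and high-bandwidth or wireless links appear as explicit milestones above.
```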

Long-term goals (years 6-10) 

Practical access to any area in a non-human or human brain with minimum invasiveness. The implantable “brain button” for human use: a self-contained device that can be surgically implanted with no trans-cranial connections and that allows for recording and stimulation from large numbers of neurons. 

Practical devices for in vivo use incorporating nanowires or other nanofabricated structures; high-density intracellular recording in vivo. 

3g-ii. Optical Recording 

Short-term goals (years 1—5) 

Fast, sensitive genetically-encoded fluorescence voltage sensors capable of reliable action potential detection in neural populations in vivo. 

Genetically-encoded sensors for optical detection of neurotransmitters and neuromodulators. 

Sensors that report through a change in fluorescence a combination of physiological events, such as the “AND” function of neural activity and the time of a light pulse. This will allow circuits active at particular epochs of a behavior to be highlighted. 

Reliable, high-sensitivity optical subthreshold voltage measurements at cellular resolution for measuring synaptic currents and dendritic integration. 

Instrumentation platforms suitable for wide dissemination for recording from 10,000 to 100,000 neurons in behaving mammals. 

Instrumentation for population optical recording at cellular resolution from 5 different brain areas simultaneously. 

Integrated technology—recording, behavior, and theory: recording the activity from every (identified) neuron in the brains of transparent organisms (worm, fly larva) during natural behavior as test cases: can we really understand the full range of behaviors from network-level neural dynamics? (see also Sections III.5 and III.7) 

Integrated technology—recording, theory, and perturbation: simultaneous optical imaging and optogenetic stimulation of neural populations at cellular resolution; providing tailored optical stimulation (playback) to the same or different populations of neurons at cellular resolution. (see also Sections III.4 and III.7) 

Long-term goals (years 6-10) 

Instrumentation for 1-10 million neuron recordings in behaving mammals. 

Volumetric and deep brain imaging at millisecond resolution in non-transparent species; reduced invasiveness of optical access for imaging applications. 

High-speed voltage imaging in multiple brain areas of behaving rodents and primates. 

Integrated technology, recording and perturbation: Multiple independent spectral channels for both imaging and stimulation, for monitoring and manipulation of multiple cell types concurrently. 

3g-iii. Human Neuroimaging 

Short-term goals (years 1-5) 

Increased spatial resolution for fMRI (e.g. robust detection of signals at ~0.1 mm3 volumetric resolution across the entire human brain). 

Quantitative understanding of relationships between neuronal, glial, and metabolic signal dynamics (in combination with theory, Section III.5). 

fMRI deployed in combination with EEG and MEG in more “natural environments”, with improved analysis methods (Section III.5), allowing examination of a greater range of human behavior. 

Integration of experiment with theory: Detailed understanding of the problem of multi-scale averaging in specific neural systems (to determine the information content of the MRI signals that average activity across ensembles of individual cells that are contained in the MRI voxel). 

Long-term goals (years 6-10 and beyond) 

Innovative technologies that can significantly expand our ability to detect activity non-invasively in the human brain. This will require new, unknown mechanisms and techniques; no milestones or timelines are possible. A non-invasive or minimally invasive imaging modality with cellular spatio-temporal resolution that could interrogate large portions of the mammalian brain would represent a major advance for both animal and human studies. Any such technology that was safely applicable in humans would revolutionize our understanding of human brain function.  

4. Demonstrating Causality 

4a. Scientific Goal: Link brain activity to behavior by developing and applying precise interventional tools that change neural circuit dynamics 

4b. Overall Objective 

Precise circuit-level perturbation techniques can 1) determine the causal significance of neural activity patterns in cognition, emotion, perception and other processes, 2) probe the internal structure and dynamics of nervous systems, and 3) serve as a basis for new therapeutic interventions. Experimental control of circuit activity (or circuit “tuning”) is most useful when integrated with assessment of naturally occurring activity patterns, global and local wiring patterns, and molecular/genetic identity of the same cells that are controlled or perturbed. 

4c. Deliverables 

New and improved perturbation technologies suitable for controlling cells that have been specified by type, wiring, location, and other characteristics (see Section III.2). Perturbation technologies in this context could include tools for stimulation, inhibition, or modulation that mimic natural activity, and could span optical, chemical, electromagnetic, biochemical, and other modalities for delivery of control signals. 

Application of perturbation tools to behaving animals to understand the causal linkage between neural activity patterns and behavior, in the context of sophisticated and quantitative behavioral measurements. “Playing back” activity sequences at appropriate spatial and temporal resolution to test the sufficiency of candidate activity patterns in eliciting or suppressing normal or pathological behaviors. Inhibiting natural activity patterns to ask if they are necessary for normal or pathological behaviors. Perturbation is the critical test of any theory purporting to relate brain activity to cognition and behavior—does the theory make accurate predictions about cognitive or behavioral performance following precise manipulations of specific brain circuits? At the same time, improvements in quantitative analysis of behavior should increasingly move toward the precision of recording and perturbation methods. 

Application of perturbation tools to provide a mechanistic understanding of neural dynamics. Combined perturbation/recording/theory experiments that nudge systems out of their natural dynamics in simple and complex ways, to probe information flow and reveal the organizational logic of complex neuronal networks. 

Application of perturbation tools to humans to understand normal nervous system function and mechanisms, causes, and treatments of psychiatric and neurological disease. Clinical stimulation devices can be used experimentally to explore many aspects of human brain function. Precise interventions in the context of behavior can open the door to understanding pathological processes, screening for therapeutic agents, and identifying specific activity perturbations to restore, regulate, and repair dysfunctional circuits. 

Among the scientific questions to be addressed by this goal: 

How are measurable aspects of perception and behavior modulated by alteration of activity patterns in underlying neural populations? 

What alterations of these activity patterns give rise to maladaptive or pathological behavior? 

Are precise corrections of these activity patterns at the cellular level required to restore typical behavior, or are simpler shifts in regional or projection dynamics sufficient? 

How do shifts in the balance of activity among different brain regions, projections, or cell types affect circuit function? 

What is the causal role of spike rate, timing and synchrony relationships among neurons, projections, and brain regions, in circuit processing and behavior? 

Are there consistent neural activity “motifs” or patterns that perform core computations across different brain regions engaged in different tasks? 

Which activity patterns in specific neural circuits can promote or inhibit social behavior, allocation of behavioral energy to a task, or conscious awareness? 

Can therapeutic intervention be productively guided in a patient-specific way by considering brain structure or activity alongside a patient’s symptoms, and then adapting an activity intervention to that patient’s unique clinical situation? 

4d. Rationale and Principles 

See also Sections II.3, II.4, and II.6 of this report. 

Historically, perturbations such as anatomical lesions, behavioral pharmacology, and electrical microstimulation have revealed many important properties of the brain89-93, and new methods to tune neural circuits are increasingly powerful. Electrical stimulation continues to be a particular area of importance because it provides high temporal resolution and can be used in human subjects, either acutely or chronically, to probe or modulate brain function in clinical settings. Electrical stimulation does not typically provide single-cell or cell-type resolution, and even when electrodes are placed with millimeter-scale precision they can modulate much more distant cells by acting upon fibers of passage. Yet the simplicity and flexibility of electrodes will ensure ongoing use, and their utility should be improved. Specifically, electrode-based perturbative tools should be longer-lasting with enhanced biocompatibility, biostability, efficiency, compactness, spatial resolution and monitoring capabilities for closed-loop control. Stimulating electrodes should therefore be an area of BRAIN Initiative technology development and investigation. 

Optogenetics is a powerful new tool for modifying brain activity, in which targeted gene delivery provides cell-type and regional resolution, and targeted light delivery provides high temporal resolution. Currently, its use is limited by light scattering, typically requiring fiberoptics for most deep brain structures94-96. Needed improvements to optogenetic tools include narrower (light wavelength) action spectra, increased light-sensitivity, and tools with new kinds of ion conduction properties or other electrical or biochemical modulatory capabilities. Optogenetic approaches also need to develop further to enable not just cell type-resolution, but single-cell resolution, in systems as complex as behaving animals97. A major goal of The BRAIN Initiative® should therefore be the development of optogenetic tools with new kinetic, spectral, and effector function properties, and concomitant development of new light-delivery mechanisms that are operative in behaving animals and allow sparse, distributed, user-defined patterns of cell-resolution, millisecond-scale play-in/inhibition of dynamical motifs. 

If optogenetics represents a refinement of electrical stimulation, what is sometimes called chemogenetics98 is a refinement of pharmacology using genetically-encoded effector proteins. Regional and cell-type resolution are provided by effector gene delivery, as for optogenetics, while delivery of a drug (often systemically) triggers the perturbative manipulation. These tools are often limited by slow timescales for onset and offset of drug effects (up to hours to days, if delivered systemically). They are typically unsuited for generating specific patterns of neural spikes, and often induce sustained perturbations that can lead to circuit-level adaptations and compensations. Nonetheless, the global reach of chemical and chemogenetic tools, the ease of drug delivery, and the fact that they can easily be combined with optogenetics are advantages that will drive innovation. New chemogenetic tools with an expanded repertoire of effects and better-controlled kinetic properties should be a goal of The BRAIN Initiative®. 

A new generation of biological perturbation tools could be envisioned based on advances in natural products biochemistry and synthetic biology. Natural products can provide insight into cell and circuit level processes; much has been learned from the potent and specific cone snail venoms (conotoxins) and plant metabolites (capsaicin, opioids) that target pain pathways99,100. Notable advances in synthetic biology include nanobodies, single-domain antibody fragments from camelids that can be molecularly targeted to specific cell types and receptors to modulate their function or increase their sensitivity to drugs101. In another form of synthetic biology, the biotechnology industry is developing hybrid proteins that cross the blood-brain barrier by combining a shuttle protein such as transferrin with a “payload” targeted to the brain102. Although these tools lack temporal specificity, they are promising for therapeutic use in humans, and they have only begun to be exploited for neuroscience at the circuit and systems level. 

Finally, non-chemical/optical/electrical energy delivery modalities for perturbation are a speculative but worthy avenue of investigation. Studies in animals have demonstrated effects of magnetic, acoustic/ultrasound, and thermal perturbations on brain activity and behavior, and these should be explored further. In humans, one example is transcranial magnetic stimulation (TMS), which is an effective research tool and is also cleared for clinical treatment of depression. There are many possibilities, but most proposed mechanisms are lacking in speed, safety, efficacy or versatile single-component targetability; it is likely that mature versions of these methods will require chemical or genetic sensitizing agents to provide specificity. New ideas for perturbative intervention are needed and welcomed from diverse fields of engineering, chemistry, and physics. 

New perturbation tools should be optimized and validated based on the spatial and temporal specificity of targeting, compatibility with subsequent assessment of wiring and molecular phenotypes of controlled cells and synaptic partners, and compatibility with readouts of behavior and neural activity. Not all tools need be applicable to all systems (for example, some tools may not be suitable for primates or people), but general use across species is desirable, and even species-optimized approaches can be extremely informative and useful. 

To fully capitalize on the increased understanding of cell-types provided by The BRAIN Initiative® (Section III.1), newly developed perturbation tools must be readily targeted to cell types defined by multiple features such as genetics, wiring, spatial location, and activity history. For example, increasingly advanced intersectional genetic tools or genome targeting strategies97,103 suitable for invertebrates, fish, mice, rats, and primates will become an essential component of the perturbation toolbox and will need to be developed alongside these new control tools to ensure compatibility and synergy across BRAIN sub-initiatives. 

4e. Implementation 

We envision two broad phases for the development and application of new neural perturbation technologies: Phase 1 (years 1-5) should emphasize rapid improvement of recently invented perturbation techniques (e.g. optogenetics, chemogenetics), and invention of novel approaches, especially those that may be applied non- (or minimally) invasively to human and animal nervous systems. Refinement of existing electrical stimulation methods should be supported because of their unique advantages in human neuroscience and therapeutics. As always, technology development will proceed most effectively when linked iteratively to concrete applications in neuroscience research. 

Phase 2 (years 6-10) will adapt and apply perturbation techniques to increasingly large and complex systems, and will continue to support ongoing technology development and refinement. Phase 2 should emphasize integration of major categories of BRAIN-related investigation into how nervous system activity and behavior arise from circuitry, including simultaneous perturbation of activity, molecular/genetic and biophysical phenotyping of cells and synapses, mapping of wiring or connectivity patterns, observation of native activity patterns during behavior, quantification of behavioral output, and computational analysis of circuit activity patterns. 

The deepest understanding of nervous system function will be achieved to the extent that these individual types of investigation are integrated, or all carried out within the same circuit (Section III.7). These integrated methods should be applied to scientific questions about brain function; only through question-driven research will the strengths and limitations of each approach come into focus. 

Moving investigational and invasive methods toward the scale of the human brain will require certain tradeoffs. As The BRAIN Initiative® progresses, new knowledge about macro- and meso-scale circuitry (Section III.2) may raise possibilities for effective macro- and meso-scale perturbations, as opposed to demanding single-cell spatial resolution perturbations. Assemblies, dynamic temporal motifs, and larger projections or tracts may be recruitable in principled fashion using macro-resolution interventions, as has been the case in deep brain stimulation therapies for Parkinson’s disease. 

Advanced instrumentation will be essential in Phase 2. For example, at the simplest level, laser power will quickly become limiting for large-scale optical perturbation. New engineering, physical principles, and applied science tools will be required to guide perturbation energy to the targeted volume of brain tissue, including corrective or adaptive strategies to compensate for tissue distortion, scattering, and absorption. Collaborative efforts involving scientists and engineers across fields will be needed. 

Finally, theory and computational methods (Section III.5) will be an essential and tightly integrated part of all perturbation work. For example, in animal studies involving imaging of activity followed by perturbation, computational tools will be needed for detection of statistically significant dynamics, dimensionality reduction of resulting datasets, and real-time prediction of effective interventions based on this computational outcome. In clinical work, dynamical modeling of the effect of perturbations on the brains of human patients (based on functional connectivity and tractography) may predict optimal sites for intervention37,104,105, especially with increased causal understanding of neural circuits and behavior arising from animal studies96. 

Behavior and cognition. The activity of the brain is expressed in the behavior of an animal or person. Correspondingly, the tools used for analyzing behavior should have the accuracy, specificity, and diversity of the methods used to measure and manipulate the brain circuits that give rise to those behaviors. The improvement of existing methods and the development of novel methods for the quantitative, objective, and automated analysis of behavior are critical to The BRAIN Initiative®. Standard methods of formal psychophysics, principled approaches incorporating supervised or unsupervised computational classifiers for detailed analysis of natural behaviors in freely moving animals, closed-loop virtual reality environments, and other techniques for manipulating, tracking, and analyzing animal behavior, as well as internal brain states such as mood, emotion, or attention, must be developed in partnership with compatible long-term capabilities for neuronal recording and perturbation. The application of machine vision, machine learning, and other technologies for identifying behavioral motifs, activities, and actions is likely to be as important for BRAIN as the development of methods for capturing neural activity motifs106. Automated behavior quantification and classification is a step toward high-throughput, high-content analysis of behavior as a readout of brain activity. It may also have a significant impact on the development of new drugs for the treatment of neurological and psychiatric disorders in humans. 
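
As a purely illustrative sketch of automated behavioral classification (not a prescribed method), the fragment below clusters per-frame features of the kind a video-tracking pipeline might produce into putative behavioral motifs. It assumes Python with NumPy and scikit-learn, and all features, motif labels, and parameter values are synthetic placeholders.

    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(6)

    # Synthetic per-frame features that a video-tracking pipeline might produce:
    # locomotion speed, body elongation, and head angular velocity (illustrative).
    n_frames = 6000
    motif = rng.integers(0, 3, size=n_frames)           # hidden ground-truth motifs
    centers = np.array([[0.5, 1.0, 0.1],                # e.g. resting
                        [8.0, 1.6, 0.3],                # e.g. running
                        [1.0, 0.8, 2.5]])               # e.g. grooming/turning
    features = centers[motif] + rng.normal(scale=0.3, size=(n_frames, 3))

    # Standardize features and cluster frames into putative behavioral motifs.
    scaled = StandardScaler().fit_transform(features)
    labels = KMeans(n_clusters=3, n_init=10).fit_predict(scaled)

    # Summarize how much time the animal spends in each automatically found motif.
    counts = np.bincount(labels, minlength=3)
    print("frames per motif:", counts, "| total frames:", n_frames)

In practice, features would come from machine-vision keypoint tracking, and supervised classifiers or richer time-series models would often replace the simple clustering shown here.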

4f. Mechanisms 

Phase 1 (years 1-5) should include tools relevant to laboratory animals as well as to humans, should include appropriate engineering and computational justification, and should capture the greatest possible diversity of approaches in the early years. This phase should emphasize the diversification, development, and validation of targetable perturbation tools, as well as cell-type targeting strategies compatible with these tools and basic instrumentation for delivering the perturbation (control) information. It will be important to fund rigorous and quantitative comparison of different perturbation methods in multiple real-world systems and settings. 

Phase 2 (years 6-10) should increasingly address the practical realities and economic efficiencies of scale required to bring increasing numbers of cells and circuits under perturbative control using reliable and reproducible techniques. This phase is likely to involve both moderate-sized collaborations for smaller nervous systems and specific scientific problems, and a small number of larger consortia to meet the goal of maximally integrated, large (e.g. nonhuman primate and human) nervous systems. Funded projects at all scales should build capability for, or experimentally achieve, simultaneous integration of the major categories of investigation within the system of choice. 

4g. Timelines and Milestones 

Short-term goals (Phase 1, years 1-5) 

Identification of new, improved, and validated perturbation technologies: 

Spatial precision: cellular or at least cell-type resolution for experimental animals; identified circuit level resolution for humans. 

Temporal precision: improved timescales for onset and offset, matched to the question at hand—millisecond resolution is one ideal, and sustained perturbation is another. 

Diversity: tools with different activation properties (e.g. photoactivation spectra), so that multiple populations or cell types may be targeted in the same preparation. 

Scope: access to a broad and deep volume of neural tissue during behavior; at least hundreds to thousands of neurons over millimeters or more, with minimal temporal delay within an experiment. 

Compatibility of perturbation tools with tools for observation of native activity patterns, wiring, and molecular identity of the perturbed circuit components. 

Integrated evolution of quantitative, precise, high-content behavioral methods appropriate for freely-moving and restrained animals. 

Long-term goals (Phase 2, years 6-10) 

Development and application of perturbation technologies in behavior: targeting large intact primate or human systems in addition to smaller model systems, building on advanced instrumentation, computation, data management and analysis, and other technology developed during Phase 1. 

Within-experiment integration of perturbation techniques with other key BRAIN-sponsored technologies: cell type identification, anatomical circuit tracing, large scale recording of native activity patterns, precise quantification of behavior, and tests of specific theories of neural coding, computation and dynamics. 

Advance the scale of analysis by ~1 order of magnitude per year: 100 cells controlled independently along with multiple parallel data streams in the same preparation (imaging, wiring, annotation, behavior, and/or modeling) should be achievable in year 6, 1000 in year 7, 10,000 in year 8, and so on. Access to 1 million neurons could be held as an ambitious goal.  

5. Identifying Fundamental Principles 

5a. Scientific Goal: Produce conceptual foundations for understanding the biological basis of mental processes through development of new theoretical and data analysis tools 

5b. Overall Objective 

The overarching goal of theory, modeling, computation and statistics (hence TMCS) in neuroscience is to create an understanding of how the brain works—how information is encoded and processed by the dynamic activity of specific neural circuits, and how neural coding and processing lead to perception, emotion, cognition and behavior. Powerful new experimental technologies will produce large, complex data sets, but rigorous statistical analysis and theoretical insight will be essential for understanding what these data mean. Coherent lessons must be drawn not only from the analysis of single experiments but by integrating across experiments, scales and systems. Theoretical studies will allow us to check the rigor and robustness of new conceptualizations and to identify distinctive predictions of competing ideas to help direct further experiments. Neuroscience will mature to the extent that we discover basic principles of neural coding and computation that connect and predict the results of diverse experimental manipulations of brain and behavior. 

5c. Deliverables 

New techniques for analyzing the large, complex data sets to be produced under all goals of The BRAIN Initiative® (e.g. Sections III.1-4). Needed technical advances include methods for finding high-order structure in recording, anatomical and behavioral data sets (data exploration), methods for building models that can identify potential underlying mechanisms, and methods for rigorous hypothesis testing by fitting models to data. 

Multiscale approaches for integrating data obtained from different experimental techniques. BRAIN-sponsored data sets will cover spatial scales ranging from microns to meters, and time scales from milliseconds to minutes, hours or even the lifetime of an organism. New analytic and computational methods, as well as new theoretical frameworks, are required to understand how organism-level cognition and behavior emerge from signaling events at the molecular, cellular and circuit levels. 

Discovery of general principles of neural coding, computation, circuit dynamics, and plasticity. By synthesizing results from numerous experiments that explore different neural circuits at different levels, theoretical studies will uncover common themes and general principles. These principles will elucidate how neural circuits work, that is, how populations of neurons collectively support cognition. 

Making TMCS perspectives and techniques available to all neuroscientists. Faculty development, student training, and collaborative grants will generate increased quantitative rigor throughout the field. 

Among the scientific questions to be addressed by this goal: 

Circuit stability: How are circuits formed by genetic and experiential forces, and then maintained over the animal’s lifespan despite extensive experience-dependent plasticity? 

Neural Coding: What are the neural codes used by brains for sensory information processing, information transmission, and motor control? 

Neural Dynamics: How do interacting neurons in distributed circuits integrate and transform inputs on multiple time scales? 

Learning: How are neural dynamics changed by learning? What are the modulatory and plasticity mechanisms responsible for different forms of learning? 

Memory: What neuronal, synaptic, biochemical and circuit mechanisms support working memory and long-term memory? How are memories retrieved? 

Decisions and actions: How do neural circuits ‘read out’ the dynamics of multiple neural populations to guide behavior and cognition? 

Each of these fundamental problems is an example of the overarching goal of TMCS: understanding how the nervous system produces high-order, flexible behavior in the pursuit of the organism’s goals. 

5d. Rationale and Principles 

See also Sections II.5 and II.7 of this report. 

New technologies developed by The BRAIN Initiative® will produce opportunities for deeper understanding of the relationships between different forms of neural activity and the cognitive and behavioral functions of the brain. Exploiting this opportunity will require new statistical and computational tools to analyze data sets that are larger and more complex by orders of magnitude than those of the past, and the development of new models of neural circuits and brain function that can be tested experimentally. 

The BRAIN Initiative® should incorporate sophisticated statistical and computational methods into every stage of research. TMCS collaborators should be embedded in the planning and design of experiments, so that expensive, multidimensional data sets are optimal for analysis and interpretation. Well-designed and intelligently interpreted experiments can drive neuroscience to move beyond qualitative models to principled, quantitative analysis of nonlinear processes by skilled application of mathematical and computational methods, numerical simulations, and formal statistical analyses107. 

Over the past twenty years, theoretical and computational neuroscience have been stimulated by the recruitment of select physicists, engineers, mathematicians, computer scientists, and statisticians108-110, in part through visionary investment by the Sloan and Swartz Foundations. Nonetheless, the extent to which their expertise has permeated the experimental community is uneven. Many theorists and modelers would benefit from closer interactions with experimentalists, and many experimental groups would benefit enormously from direct access to this theoretical cohort. The BRAIN Initiative® should facilitate the entry of TMCS-trained researchers into neuroscience. 

At a more basic level, many experimental neuroscientists are not trained in quantitative methods they will need in the future. Development and dissemination of quantitative expertise within the neuroscience community should be pursued aggressively. 

5e. Implementation 

The BRAIN Initiative® will generate many kinds of data, including: 

Functional imaging: fMRI, PET, near infrared spectroscopy 

Neurophysiology: EEG, MEG, and local field potentials (LFP); single-cell and multiple cell spike trains 

Optical recordings: genetically-encoded or chemical reporters of voltage (including subthreshold voltage), calcium, neurotransmitters, synaptic activity, biochemical states 

Behavior: binary observations, processed video data, human psychophysical data 

Cell-level data: anatomy, connectivity, gene expression, biophysical properties 

Each of these data streams will benefit from advances in the quantitative analyses described below, but the greatest impact will come from combining analysis across these modalities—a relatively new endeavor for neuroscience. 

5e-i. New Techniques for Analyzing Large, Complex Data Sets 

Standardized Analysis Methods for Neural Spike Train Data. Spikes, or action potentials, are the primary means of information transfer over long distances within the nervous system. A series of spikes (or spike train) encodes the message that a given neuron transmits to its downstream targets. Analysis of neural spike trains has been an active area in neuroscience research, but many important questions remain unanswered and the scale of the problem will grow under The BRAIN Initiative®107,111. We must develop reliable, readily accessible solutions to the problems of spike sorting, information encoding, connectivity and information decoding that are crucial for confirmatory statistical analyses in data sets of the expected size of 100,000 to 1,000,000 simultaneously recorded neurons. 
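
To make the first of these problems concrete, the sketch below illustrates one elementary spike-sorting pipeline: threshold-based detection of spikes in a simulated extracellular trace, extraction of waveform snippets, and clustering of waveform features into putative units. It is a minimal sketch assuming Python with NumPy and scikit-learn; the spike templates, threshold, and cluster count are illustrative placeholders rather than recommended settings, and methods that scale to hundreds of thousands of channels will require far more sophisticated approaches.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    # Synthetic extracellular trace: Gaussian noise plus two spike shapes
    # inserted at random times (illustrative only).
    fs = 30000                                     # sampling rate (Hz)
    trace = rng.normal(0, 1, fs * 10)
    template_a = -8 * np.hanning(30)               # broad negative spike
    template_b = -10 * np.hanning(30) ** 3         # narrower, deeper spike
    for t in rng.integers(100, trace.size - 100, 200):
        trace[t:t + 30] += template_a if rng.random() < 0.5 else template_b

    # 1. Detect downward threshold crossings using a robust noise estimate.
    thresh = -4 * np.median(np.abs(trace)) / 0.6745
    crossings = np.flatnonzero((trace[1:] < thresh) & (trace[:-1] >= thresh))

    # 2. Extract a waveform snippet around each detected event.
    snippets = np.array([trace[c - 10:c + 20] for c in crossings
                         if 10 <= c < trace.size - 20])

    # 3. Reduce waveform dimensionality and cluster into putative units.
    features = PCA(n_components=3).fit_transform(snippets)
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(features)
    print("detected", len(snippets), "spikes in", labels.max() + 1, "clusters")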

New Methods for Statistical Modeling of Neuroscience Data. The BRAIN Initiative® will need to develop new statistical models for neuroscience data, both for exploratory analysis—whose purpose is to learn about potential structure in the data, to develop potential hypotheses, and to aid in the design of future definitive experiments112—and for confirmatory analysis—which entails definitive experiments to test explicit hypotheses, make inferences and eventually decisions107. These will need to be tailored to each kind of experiment within The BRAIN Initiative®, and especially to integrated experiments combining multiple technologies (below and Section III.7). 

Dimensionality Reduction. Typical BRAIN Initiative data sets, consisting of recordings or imaging of hundreds to many thousands of neurons, will be far too complex to analyze directly. Instead, dimensionality reduction methods must be applied to extract a set of significant independent signals from these data. New approaches must be developed that allow us to extract signals that faithfully reflect the full data set, but are compact enough for visualization and further analysis. The importance of this step in the data analysis pathway cannot be overstated: extracting the appropriate, meaningful signals and displaying them in illuminating ways is a key to developing insight into what neural circuits are doing and how they are doing it. Dimensionality reduction will provide principled answers to a series of questions about how much data should be gathered, at what level of detail, in a given experiment. For example, if one wishes to understand a specific cortical circuit, what fraction of neurons does one need to monitor to extract the necessary information about circuit performance? Is it appropriate to lump different cell types to obtain an overview of circuit dynamics? For inferring function from anatomical data, is it sufficient to know average patterns of convergence and divergence of synaptic connections, or is it necessary to capture the full range of the distributions and strengths of these connections? 
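
The sketch below gives a minimal, hedged illustration of this idea: principal component analysis applied to a synthetic population recording in which many neurons are driven by a few shared latent signals. It assumes Python with NumPy and scikit-learn; the data, dimensions, and component counts are placeholders chosen only to show the workflow of extracting a compact population trajectory.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)

    # Synthetic population activity: 500 neurons whose firing is driven by
    # 3 shared latent signals plus private noise (illustrative only).
    n_neurons, n_timepoints, n_latents = 500, 2000, 3
    latents = np.cumsum(rng.normal(size=(n_timepoints, n_latents)), axis=0)
    loading = rng.normal(size=(n_latents, n_neurons))
    rates = latents @ loading + rng.normal(scale=5.0, size=(n_timepoints, n_neurons))

    # PCA recovers a compact set of population-level signals.
    pca = PCA(n_components=10).fit(rates)
    explained = np.cumsum(pca.explained_variance_ratio_)
    print("variance captured by first 3 components:", round(explained[2], 3))

    # The low-dimensional trajectory summarizes circuit dynamics over time.
    trajectory = pca.transform(rates)[:, :3]       # shape (n_timepoints, 3)

Methods developed under The BRAIN Initiative® will need to go well beyond simple linear techniques of this kind, but the basic workflow of projecting high-dimensional activity onto a small number of meaningful signals is the same.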

Levels of Description. Dimensionality reduction techniques are equally important for biophysical models of information processing in single neurons and circuits. There is a tension between models that incorporate biological detail about (for example) the distribution and kind of ion channels and receptors on single neurons, and simplified, informative network models that capture the essential dynamics of the neurons in question 113-116. We must understand what level of detail provides the greatest insight for each question. 

Dynamic Systems Analysis. A dynamic model is a form of dimensionality reduction because it permits an entire time series, such as a sequence of action potentials, to be derived from a set of initial conditions of a dynamic system. In other words, dynamic modeling allows us to compress data consisting of measurements over many time points into a set of values at a single time point. Methods exist for extracting dynamic systems descriptions from data, but these must be extended and specialized for neuroscience applications. Advances in this area are expected to have a significant impact on our ability to understand BRAIN Initiative data. 
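
As one minimal illustration, the sketch below fits a linear dynamical model x[t+1] ≈ A x[t] to a synthetic low-dimensional neural trajectory by least squares; the estimated dynamics matrix plus an initial condition then summarizes the entire time series. It assumes Python with NumPy, and the rotation angle, noise level, and trajectory length are arbitrary placeholders; real neuroscience applications would require nonlinear, stochastic, and switching extensions of this basic idea.

    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic low-dimensional neural trajectory generated by a rotational
    # linear dynamical system x[t+1] = A_true @ x[t] + noise (illustrative only).
    theta = 0.1
    A_true = 0.98 * np.array([[np.cos(theta), -np.sin(theta)],
                              [np.sin(theta),  np.cos(theta)]])
    x = np.zeros((500, 2))
    x[0] = [1.0, 0.0]
    for t in range(499):
        x[t + 1] = A_true @ x[t] + rng.normal(scale=0.01, size=2)

    # Fit the dynamics matrix by least squares: X_next ≈ X_prev @ A^T.
    X_prev, X_next = x[:-1], x[1:]
    A_hat = np.linalg.lstsq(X_prev, X_next, rcond=None)[0].T

    # The fitted matrix plus an initial condition compactly describes the series.
    print("estimated dynamics matrix:\n", np.round(A_hat, 3))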

5e-ii. Multiscale Integration of Data from Different Experimental Techniques 

Fusion of Information From Highly Diverse Data-Streams; Linking Activity Across Spatial and Temporal Scales. The BRAIN Initiative® will produce novel data sets of enormous variety. For example, information on cell types, anatomical connectivity, and neurophysiological activity and effects of perturbation during behavior may be collected from the exact same neural tissue in some animal systems. Within a single experiment, spatial scales could range from microns (synapses) to millimeters (short-range circuits) to tens of millimeters (long-range circuits). In the same experiment, temporal scales could range from milliseconds (spikes) to seconds (attractor states) to minutes or longer (stable circuit changes related to learning). Identifying appropriate ways to integrate data from multiple experimental sources across scales is an unparalleled challenge and opportunity for TMCS. 

Understanding the Biophysical Basis of fMRI, EEG, and Other Neural Recording Technologies. For several recording technologies commonly used in neuroscience experiments—particularly though not exclusively in human neuroscience—the biophysics of how the recorded signals relate to activity in the underlying neural circuits is an unsolved question (e.g. EEG, MEG, LFP, fMRI). Collectively these techniques comprise our best current sources of information about neural processing in the human brain. Our statistical models of these critical signals and what they mean can be improved substantially through accurate understanding of the underlying biophysics; linking levels in this manner is an example of multi-scale analysis, an important general problem that must be addressed in The BRAIN Initiative®. 

Solving High-Dimensional Inverse Problems for EEG and MEG. In humans, high-density EEG and MEG recordings are easily obtained, but their interpretation is limited by the “inverse problem” of assigning sources to the recorded signals: many underlying patterns of brain activity can give rise to the same observed pattern of EEG or MEG signals at the scalp. EEG and high-resolution fMRI data are now being collected simultaneously in humans, providing an opening for substantive progress on the inverse problem. A solution would enable more incisive interpretation of human brain signals and their relation to ongoing cognitive processes. 
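
The sketch below illustrates, in deliberately simplified form, why the problem is hard and what one regularized solution looks like: a minimum-norm (ridge-regularized) estimate of many candidate sources from far fewer sensors. It assumes Python with NumPy, and the lead-field matrix is random rather than derived from a head model, so it shows only the linear algebra, not a realistic reconstruction.

    import numpy as np

    rng = np.random.default_rng(3)

    # Toy inverse problem: 64 scalp sensors, 2000 candidate cortical sources.
    # The lead field L maps source activity to sensor measurements (random here
    # for illustration; in practice it comes from a head/conductivity model).
    n_sensors, n_sources = 64, 2000
    L = rng.normal(size=(n_sensors, n_sources))

    # Simulate a sparse source pattern and the resulting sensor data.
    true_sources = np.zeros(n_sources)
    true_sources[rng.choice(n_sources, size=5, replace=False)] = 1.0
    measurements = L @ true_sources + rng.normal(scale=0.05, size=n_sensors)

    # Minimum-norm estimate: s_hat = L^T (L L^T + lambda I)^(-1) m.
    lam = 1e2
    gram = L @ L.T + lam * np.eye(n_sensors)
    s_hat = L.T @ np.linalg.solve(gram, measurements)
    print("indices of largest estimated sources:", np.argsort(-np.abs(s_hat))[:5])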

Functional Connectivity Analyses in Human Brain Imaging. Large-scale, simultaneous recordings of neural signals lead to hypotheses about the connectivity of large-scale circuits in the brain. As indicated in sections III.2 and III.3, for example, simultaneous fluctuations in resting state fMRI signals may allow reconstruction of functionally related networks of brain areas even though those areas are not directly connected anatomically. Considerable theoretical and statistical work must be done, however, to place these empirical observations on firm ground. Because macro-scale human brain networks are potentially important as biomarkers for psychiatric disease and for targeting therapies, this is a particularly consequential area for TMCS analyses in neuroscience. 
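
A minimal sketch of the starting point for such analyses is shown below: pairwise correlation of synthetic resting-state signals across regions, thresholded into a network. It assumes Python with NumPy; the signals, group structure, and threshold are placeholders, and the statistical issues raised above (significance, confounds, indirect connections) are exactly what this naive version leaves unaddressed.

    import numpy as np

    rng = np.random.default_rng(4)

    # Synthetic "resting state" time series for 90 brain regions: two groups of
    # regions share a common slow fluctuation (illustrative only).
    n_regions, n_timepoints = 90, 300
    shared_a = rng.normal(size=n_timepoints)
    shared_b = rng.normal(size=n_timepoints)
    signals = rng.normal(size=(n_regions, n_timepoints))
    signals[:45] += 0.8 * shared_a          # network A
    signals[45:] += 0.8 * shared_b          # network B

    # Functional connectivity: pairwise Pearson correlation between regions.
    fc = np.corrcoef(signals)

    # Threshold the matrix to define network edges (threshold is arbitrary here).
    edges = (np.abs(fc) > 0.3) & ~np.eye(n_regions, dtype=bool)
    print("edges within network A:", edges[:45, :45].sum() // 2)
    print("edges between networks:", edges[:45, 45:].sum())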

5e-iii. Identifying General Principles of Brain Function 

Theoretical Input on Experimental Design. Theorists should be involved at every stage in BRAIN Initiative experiments, from initial experimental design to the final extraction of new concepts. Theoretical neuroscience is about experimenting with ideas. Although a theoretical analysis can never determine whether a particular idea is actually implemented in a biological system—that is the role of the experiment—it can check the consistency, viability and robustness of hypotheses at a rate that is hundreds of times faster than experimentation, saving much time and expense. Furthermore, modeling can help identify the best experimental approaches for testing a particular hypothesis by revealing the features that are its unique signatures. Theorists will provide answers to the basic questions in experimental design such as: How many neurons should be recorded? How many trials are likely to be required? At what spatial and temporal scales should the system be studied, and with what resolution? Is the task being considered complex enough? Are the data analysis methods being proposed optimal? 

Uncovering Common Themes Across Different Systems. Theorists are expert at integrating data from multiple experiments, techniques and systems. By focusing on underlying mechanisms, models allow parallels between different systems to be identified despite differences in implementation. In this way, analogous operating principles in the olfactory system, the cerebellum, other cerebellar-like structures, and cortical circuits have been revealed. This integrative process should progress enormously with the data that emerge from the BRAIN Initiative®. 

Single-Trial Decoding Techniques. Decoding refers to a number of methods that can be used to “eavesdrop” on the nervous system by reading out the information contained in neuronal spike trains117. The BRAIN Initiative®, with its greatly expanded scale of neuronal recording, offers an exciting new opportunity—decoding information from single trial data. This is important because it provides a means to explore the internal states of the brain that are not reliably linked to externally observable events. Neural activity associated with observable sensory stimuli or motor responses can be identified in averaged data by temporally aligning trials on the observed events. This is far more difficult for activity associated with transient internal states that are only loosely associated (if at all) with external events. However, the statistical power of large-scale neuronal recordings, coupled with analysis methods such as state-space models or hidden Markov models, can identify and classify internal states on individual trials, opening many aspects of cognition to scientific investigation for the first time117. 
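
To indicate the flavor of such analyses, the sketch below applies the Viterbi algorithm of a two-state hidden Markov model to a single synthetic trial and labels each time point with its most probable internal state. It assumes Python with NumPy, and the observation model, transition probabilities, and state means are fixed, assumed values; in real applications these parameters would be learned from data and the observations would be high-dimensional population activity.

    import numpy as np

    rng = np.random.default_rng(5)

    # Single synthetic trial: a 1-D summary of population activity that switches
    # between two internal states with different mean levels (illustrative only).
    T = 400
    true_states = np.zeros(T, dtype=int)
    true_states[150:280] = 1                      # hidden "engaged" epoch
    means, sigma = np.array([0.0, 2.0]), 1.0
    obs = rng.normal(means[true_states], sigma)

    # Simple 2-state hidden Markov model with fixed, assumed parameters.
    log_trans = np.log(np.array([[0.98, 0.02],
                                 [0.02, 0.98]]))
    log_init = np.log(np.array([0.5, 0.5]))
    log_emit = -0.5 * ((obs[:, None] - means[None, :]) / sigma) ** 2  # up to a constant

    # Viterbi algorithm: most probable state sequence for this single trial.
    delta = log_init + log_emit[0]
    back = np.zeros((T, 2), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + log_trans      # previous state x next state
        back[t] = np.argmax(scores, axis=0)
        delta = scores[back[t], np.arange(2)] + log_emit[t]
    states = np.zeros(T, dtype=int)
    states[-1] = np.argmax(delta)
    for t in range(T - 2, -1, -1):
        states[t] = back[t + 1, states[t + 1]]

    print("fraction of time points correctly labeled:",
          round((states == true_states).mean(), 3))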

Human Brain Disorders. Another area in which experimental results, clinical observations, and theory could fruitfully be combined is human neurological and psychiatric disease. The reward prediction error model has been useful in reasoning about drug addiction, and serves as an example of a theory that links molecules, cells, and circuits to experience-dependent behaviors and clinical pathology. Building theories of this generality and explanatory power, linking different functional domains, should be a goal of The BRAIN Initiative®. Theories of emotional processing could provide insight into mood and anxiety disorders; theories of cognition could motivate new experimental investigations of schizophrenia; and theories of social cognition could provoke new ways of thinking about autism. 

5e-iv. Accelerate Adoption of TMCS Perspectives and Techniques in Neuroscience 

Given the coming increase in large, complex data sets in neuroscience and the increasing relevance of quantitative approaches to understanding these data sets, The BRAIN Initiative® should sponsor the development of quantitative expertise at all levels — faculty, postdoctoral and graduate student. 

Faculty: Provide incentives to biology and neuroscience departments so that theorists and statisticians are seen as important and integral faculty hires. Change the composition and culture of NIH study sections to recognize that theory is critical for the neuroscience of the future. 

Training: Ensure that all neuroscience postdocs and graduate students become proficient with basic statistical reasoning and methods, and are able to analyze data at an appropriate level of sophistication, for example by writing code. Encourage trainees to construct models as a way to generate ideas and hypotheses or to explore the logic of their thinking. 

5f. Mechanisms 

Combined theoretical and experimental research in neuroscience can be accommodated under several funding mechanisms. Multi-collaborator investigator-initiated grants are well suited to intense, iterative exchanges between experimental and theoretical groups addressing specific scientific problems together. An expansion or reimagining of the NIH-NSF CRCNS effort, which encourages close collaboration between TMCS investigators and experimentalists, would be of great value to The BRAIN Initiative®. The BRAIN Initiative® should develop study sections that value TMCS both in its own right and in collaboration with experimentalists; there is a widespread sense that this perspective is currently lacking. 

Experimental branches of The BRAIN Initiative® (see sections III.1-4, 6-7) envision the possibility of collaborative consortia or regional centers for developing technology and applying it to fundamental questions in neuroscience; it is critical that TMCS investigators be included in such consortia from their inception. 

The NIH Big Data To Knowledge (BD2K) initiative overlaps in some of its goals with this aspect of The BRAIN Initiative®, with the strongest overlap in developing methods for analysis of large, complex datasets. A close partnership between BD2K and The BRAIN Initiative® will accelerate the development of quantitative methods in this area. 

Accelerating the adoption of TMCS approaches within neuroscience departments and educational programs may require more creative mechanisms, as outlined in the specific recommendations below. Possibilities include administrative supplements to existing research project grants for collaborations with TMCS investigators, and junior faculty hiring incentives like those deployed by NIH under the American Recovery and Reinvestment Act of 2009 (ARRA) several years ago. A variety of mechanisms can be deployed to strengthen quantitative training among postdocs and graduate students. 

5g. Timelines and Milestones 

Short-term goals (1-5 years) 

Develop new techniques for analyzing large, complex data sets. 

Develop dynamic versions of principal component analysis, independent component analysis, graphical models, and compressed sensing that can be used to dynamically track structure in continuous data, point process data, and combinations of the two118. 

Develop dimensionality reduction techniques to determine, for example, how densely and under what behavioral conditions we must sample the electrical activity of large networks to understand their function; when, if ever, it is appropriate to “lump” cell types; what level of anatomical detail is most effective for connectomics analysis; and what biological details are most important for useful biophysical modeling of single neurons. 

Develop automated, dynamic techniques that would allow neuroscientists to take rapid, preliminary looks at their data prior to performing formal analyses. These methods may take advantage of graphical displays, deep learning techniques and graphical models119,120. 

Implement solutions to the spike sorting, information encoding, connectivity and information decoding problems that are needed for confirmatory statistical analyses of 100,000 to 1,000,000 simultaneously recorded neurons121. 

Develop real-time signal processing algorithms for each of the major types of neuroscience data listed in section 5e. 

Multiscale linkages 

Establish the biophysical sources of the major brain rhythms that are present in EEG and MEG recordings, and the more local sources that give rise to the LFP in different brain regions and different cortical layers122. 

Develop a formal statistical inference framework to conduct network connectivity analyses from different types of neuroscience data such as fMRI, EEG, LFP and multiple single neuron recordings123. 

Explore theoretical and statistical frameworks for fusing information across different experimental techniques and different temporal and spatial scales in neuroscience experiments. 

Develop computationally efficient solutions to high dimensional inverse problems, with particular attention to the interpretation of EEG and MEG data in humans. 

Develop theories and models of collective neuronal activity on spatial scales that span individual synapses, neurons, circuits, networks and systems; develop theories of dynamical activity that span timescales of synapses, action potentials, network activity (including attractors and persistent activity) and internal circuit states (including neuropeptides and neuromodulatory systems). 

Identifying General Principles 

Develop theoretical insights into how circuit dynamics depend on the properties of single neurons and their connections. Identify conditions for which insights from small circuits scale to larger circuits. Determine which general rules of circuit function depend on specific biological details of neuronal and synapse function. 

Higher-order flexible behavior: Develop systematic theories of how information is encoded in the chemical and electrical activity of neurons and glia, how these are used to determine behavior on short time scales, and how they are used to adapt, refine and learn behaviors on longer time scales108. 

Develop a detailed understanding of the circuit and plasticity mechanisms that support different forms of learning. 

Construct a mechanistic understanding of how motor acts are initiated, controlled, sequenced and terminated. 

Propose, study and validate mechanisms that allow information to be gated, switched and transmitted between specific brain regions. 

Develop methods for detecting and classifying internal brain states; relate these states downward to neuromodulatory mechanisms and upward to memory formation, motivation, and internal models. 

Decision-making: Relate cellular-level neuronal activity to basic cognitive processes underlying decisions, including dopamine systems, reward-prediction error, and planning124. 

Goal-directed behavior: Theories for how interactions within and between large neural systems and brain areas—encompassing inputs from multiple sensory modalities, internal states, memories, goals, constraints, and preferences—drive behavior in freely behaving animals including humans125. 

Accelerate the incorporation of TMCS perspectives and techniques in neuroscience departments and programs 

Deploy administrative supplements to existing experimental grants to support three to six month exploratory collaborations between TMCS and experimental laboratories. 

Develop incentives for departments to hire tenure-track TMCS assistant professors. 

Increase the quantitative capabilities of graduate students and postdocs in neuroscience training programs by tailoring individual and institutional training grants to include training in statistics and computational methods. 

Provide funding for summer courses in TMCS disciplines relevant to neuroscience, and for innovative new methods for conferring expertise. 

Long-term goals (6-10 years) 

Develop new techniques for analyzing large, complex data sets 

Integrate statistical and analytic approaches with models of neural circuits based on connectivity maps and cell types. 

Extend the solutions for spike sorting, encoding, connectivity, and decoding to data sets larger than 1,000,000 simultaneously recorded neurons, and integrate with connectomic data and other types of data. 

Develop principled methods for real-time feedback control experiments to manipulate and analyze neural circuits using novel perturbation and recording techniques. Include real-time applications to neural devices and prosthetics in humans. 

Multiscale linkages 

Establish a generic framework for fusing information across different experimental techniques, and different temporal and spatial scales in neuroscience experiments. 

Make computation of high dimensional inverse solutions from MRI, EEG and MEG recordings feasible in real-time. 

Identify the essential elements of widely distributed, time-varying neuronal processes by bridging between detailed realistic models and qualitative behavioral models. Define the principles governing the computation at each spatial and temporal scale that are important for understanding the behavior of the system as a whole. 

Identify general principles 

Establish theoretical approaches to understand the general principles that apply in both large and small circuits. Of particular interest will be theoretical studies that shed light on how circuits interact, which will eventually provide insight into how complex human cognition emerges from interacting brain circuits. 

Accelerate incorporation of TMCS perspectives and techniques in neuroscience 

Continue incentives for hiring TMCS faculty and dissemination of new computational methods for postdoctoral and graduate students through courses at home institutions, summer courses, web-based courses, and other mechanisms.  

6. Advancing Human Neuroscience 

6a. Scientific Goal: Develop innovative technologies to understand the human brain and treat its disorders; create and support integrated human brain research networks 

6b. Overall Objective 

Each goal of The BRAIN Initiative® has an explicit component addressing human brain research, and accordingly, earlier sections of this report address technologies to study human cell types, connectivity, large-scale activity, functional perturbation, and models of brain function. Beyond these topics, however, there are scientific, experimental, and ethical issues that are specific to human neuroscience, whether fundamental, translational, or clinical. Clinically approved investigational technologies, including devices that are surgically implanted into the brain, provide a unique research opportunity to investigate neural function by stimulation or recording at the resolution of cells and circuits. Research involving human subjects, however, comes with a special mandate to ensure that these rare and valuable data are collected according to rigorous scientific standards, curated carefully, and shared amongst the research community. Specialized teams of researchers must be assembled and supported to plan and carry out experimental studies in concert with effective clinical treatment programs. 

6c. Deliverables 

Integrated teams of clinicians, scientists, device engineers, patient-care specialists, regulatory specialists, and ethics specialists to take advantage of unique research opportunities offered by studies in which informed, consenting human subjects participate. Such teams may be assembled within a single university or medical center, or may comprise integrated consortia across multiple universities and medical centers, which could facilitate sharing of standardized data and training in this unique form of research. 

A streamlined path for developing, implementing and integrating innovative new technologies for human neuroscience research, through cooperation of clinical and academic research teams and private companies in a pre-competitive space. Technologies would include implantable devices with combined recording and stimulation capabilities that both advance clinical diagnostic or therapeutic applications and maximize their scientific research value. In the long term, advances in electrical, optical, acoustic, genetic and other modalities should be integrated into neurotechnologies for human clinical and research use. 

Among the scientific questions to be addressed by this goal: 

How does neural activity in specific circuits relate to the conscious experience of humans as they perform a cognitive or behavioral task? 

What neural mechanisms underlie the remarkable human ability to represent information symbolically (as in language) and then use that information in novel situations outside the context in which it was originally learned? 

What neural circuit dynamics enable mental mathematical calculations? 

What patterns of neural activity in which brain structures correspond to human emotional states? Can we treat emotional disorders by applying neuromodulation techniques to these structures and circuits? 

What patterns of neural activity in motor areas of the brain correspond to specific plans to move the eyes, hand or arm toward particular targets? Can we decode mental motor plans with sufficient speed and accuracy to control supple, effective prosthetic devices for paralyzed patients? 

6d. Rationale and Principles 

See also Section II.6 of this report. 

Certain questions about brain function can only be answered through studying humans. A few examples include language, higher-order symbolic mental operations, and individual-specific aspects of complex brain disorders like schizophrenia or traumatic brain injury. Studies of human brain activity present extraordinary opportunities for both clinical advances and research. For example, our understanding of how experience entrains memory has been enhanced by recordings in people undergoing monitoring prior to epilepsy surgery126. Similarly, as part of an effort to restore lost function after stroke, injury, or neurodegenerative disease, researchers have used activity recorded within the nervous system of paralyzed people to drive mental control of computer cursors and robotic arms127. However, even as human brain stimulation and recording increase in clinical settings, many opportunities presented by this valuable population of people are missed because there are no readily accessible mechanisms to preserve and share these data. 

In the realm of brain stimulation, there are large populations of people with implanted devices used for sensory replacement and neuromodulation applications. More than 200,000 people have cochlear implants. Studies of sound perception and language in these people, especially those who receive implants for congenital deafness, have led to insights into human cognition and have also improved cochlear implants128. Although they now represent a small group, people with stimulating electrodes in the eye to restore vision will provide a growing opportunity to study the human visual system and create better devices. More than 100,000 people have been implanted with DBS systems to treat motor disorders such as dystonia, essential tremor, and Parkinson’s disease. Stimulating systems are also being used in the spinal cord to treat chronic pain, and electrodes are being placed in many different brain regions for experimental treatment of disorders such as depression, obsessive-compulsive disorder, or cognitive decline. These individuals provide a unique setting for researchers to study circuit function, while also learning how stimulation can overcome movement, behavioral, and cognitive disorders. 

At the same time, the ability to record electrical activity at the cellular scale in humans is expanding, providing a unique opportunity to make essential cross-scale links between neuronal activity and more global signals from noninvasive imaging methods like fMRI. Both cellular-level and global signals can then be linked to human behavior, thought, and emotion. Most human studies at the cellular level are performed intraoperatively for brief times (acute studies) or intermittently for a few weeks in association with clinically indicated invasive mapping (subacute studies). More recently, with the advent of brain-computer interface research, chronically implanted electrode arrays have made it possible to study populations of neurons for durations longer than 5 years. The number of people engaged in research trials is likely to expand considerably in the next decade. 

In many cases, the individuals who have stimulation or recording devices are also treated with drugs, providing opportunities to advance an understanding of pharmacological mechanisms as well. 

The important opportunity to carry out research on human brain function while advancing the clinical capabilities of emerging neurotechnologies creates special issues for human research, including: 

Clinical support networks: A means to ensure support of fundamental human brain research in clinical settings and within clinical trials. 

Training: Ensuring understanding of the special requirements for human research. Training a new generation of human neuroscientists who are rigorous researchers, compassionate clinicians, creative engineers, and competent administrators of complex scientific teams. 

Data capture and sharing: A means to capture human data in standardized formats and to curate and share that data within the framework of protecting private information. 

Effective human neurotechnology: A means to advance the development of safe, but innovative technology suitable for research in human brains. 

Ethics: Strong ethical frameworks, review and oversight of human research. 

6e. Implementation 

6e-i. Clinical Support Networks 

Human research using invasive recordings or implanted devices can include fundamental neuroscience, translational research, and applied work to evaluate the safety and efficacy of new clinical technologies. Both investigational and approved medical devices enable basic research that can be performed during clinical diagnostic procedures (e.g. epilepsy monitoring), in clinical trial settings (e.g. implanted investigational devices), or in conjunction with clinically indicated therapies (e.g. deep brain stimulation). However, much of this exceptionally valuable human data is not captured, curated, or made available for research. A change of culture in neurosurgery and a change in support by NIH could dramatically expand our knowledge of human brain activity. It is disappointing that of the hundreds of people who experience brain stimulation and recording in surgery, only a handful are studied systematically. 

The range of issues associated with this research requires integrated research teams. A close working relationship between clinicians and investigators is needed so that rigorous experiments can be performed without compromising clinical care. The team should include engineers to keep equipment functional for the limited time available for research, and monitor the safe and effective operation of research level equipment. Efficient research requires clinical trial management by professionally trained individuals who maintain close interactions with institutional and federal regulatory bodies to ensure proper adherence to human research requirements, safety monitoring, and timely documentation and reporting. Because this research can provide a critical assessment of technology being used in humans, the team must have seamless mechanisms to communicate regularly with the FDA to provide ongoing feedback. 

Supporting research in device trials is particularly challenging. Pilot-stage clinical device trials performed in academic settings should be designed and supported to permit hypothesis-driven neuroscience while pursuing clinical safety and/or efficacy goals. In this context, fundamental neuroscience can only be pursued on a solid foundation of carefully designed, concurrently performed clinical research requiring participant recruitment, surgery, inpatient hospital stays, device assessment, clinical data collection, and safety and regulatory monitoring by specialists with specific training in these areas. Partnerships between research teams and industry are exceptionally valuable for technology development. Industry is well versed in the controlled design, process management, and regulatory compliance required of devices, while academic researchers have deep knowledge about the significance and meaning of the data. Industry input and collaboration in research teams will accelerate progress in translating technology to the clinical population and providing the process management necessary to generate devices that are maximally useful to the largest possible population. 

6e-ii. Human Data Capture, Curation, and Sharing 

Making data available to the research community is a general principle of The BRAIN Initiative® (Section III.8), but gathering, curating, and disseminating human data is complex. Even where human data exist, in many instances they are not available for research use. Straightforward changes in mechanisms, supported by the NIH, could change this reality. For example, intraoperative brain function mapping in neurosurgery should, where possible, be stored, fully annotated, and made available to researchers. Humans with implanted sensors or stimulators should, wherever possible, be studied systematically with formal collection of information. For example, when deep brain stimulation protocols are adjusted in Parkinsonian patients, systematic information should be collected on the effects on disease state, cognitive ability, and mood. Achieving data collection in a broad range of settings involving clinical brain recording and stimulation will have costs, which should be recognized and supported. Patient advocacy organizations should be mobilized to encourage participation of their member base in research studies. Patient groups are also a powerful force to encourage data collection and data sharing. 

The complexity and expense of experimental therapies mean that most neurotechnology device studies include just a small number of participants, and similar trials may be conducted at multiple sites. Wherever possible, the power of these studies should be increased by making an early investment in data standards and mechanisms to collect, store, and share data among different groups, beginning with pilot trials that test and evaluate different collaborative structures. One successful example is the International Epilepsy Electrophysiology Portal (IEEG), an NIH-funded collaborative initiative to share data, tools, and expertise among researchers; this group translates recordings made at different sites with different instruments into a common data framework to facilitate comparison and collaboration. Several other data collection initiatives are currently in operation; they need a mechanism to link them seamlessly, to make user communities aware of their existence and capabilities, and most importantly, to assess and communicate their effectiveness. 

Computational specialists, as described in Section III.5, will be needed to ensure proper collection and timely processing of human neuroscience data. Standardization challenges abound. For example, studies of seizure patients with subcranial sensors and stimulators vary by clinical specialties, referral patterns, the population size of the medical center, and the particular technology used to obtain the data. Humans also pose a particular challenge in that the participants almost always have an existing disease process (e.g., Parkinson’s, ALS) or injury (stroke, spinal cord damage) that must be considered in experiments related to ‘normal’ function. Clinical assessments of the nature and form of the disorder should be standardized to allow cross-patient and cross-laboratory comparisons and data pooling. It would be immensely valuable to identify ways to merge all (de-identified) clinical and research data collected from humans into a single research record (e.g. genetics, recordings, imaging data). 

6e-iii. Advancing Effective Technology for Humans 

When a device is implanted or used in a human, it is essential that the device functions effectively and safely and maximizes the quality and quantity of the data collected. FDA-approved deep brain stimulation electrodes, intraoperative recording and stimulating electrodes, and sub-acute surface grids and depth electrodes are commercially available. However, current technologies do not meet the clinical and research needs for high-density sampling and stimulation and broad spatial coverage, and they have not been proven reliable or stable over long durations (Section III.3). Special issues of materials development, device longevity, and safety must be considered before preclinical innovations from animal studies can be applied to humans; implant failures, short device lifetimes, and signal instability are unacceptable in human devices. Solving these problems will require a close working relationship among engineers, clinicians, and neuroscientists. 

Until now, most devices used in humans have been designed for a single goal—for example, either stimulation or recording. The most dramatic improvement that could be made to implanted devices would be to combine multiple measurement and manipulation capabilities in a single device. With technological miniaturization and cost reduction, it should be possible to build sophisticated new devices, collect data about their operation, and provide investigational access to the brain without compromising safety or efficacy. The newest devices combine DBS with sensors that allow EEG-scale measurements of the changes in brain activity that result from macro-scale stimulation. In combination with careful clinical assessment, such hybrid devices have great potential to help us understand exactly how stimulating currents interact with human brain tissue, why outcomes vary from patient to patient, and how we can achieve more consistent therapeutic results. New BRAIN-supported technologies created for research in animals (Sections III.3, III.4) will inspire next-generation devices for human brain monitoring and therapies using optical, acoustic, and magnetic modalities. 

6e-iv. Connecting to Brain Structure 

Human brain tissue is a precious, limited resource, whether from normal or diseased brains, from post-mortem tissues or biopsies. Brain banks are costly to maintain and require skilled technical oversight. Thoughtful improvements could increase the value of these resources. For example, it would be invaluable to examine human brain structure postmortem with meso-scale connectomic techniques in individuals who had been studied previously with macro-scale diffusion and fMRI techniques (Section III.2). 

NIH should consider how best to build infrastructure that provides an integrated record of functional and structural data from human brains. Movement in this direction is already happening, as NIH has recently funded the NeuroBioBank, a federated network of brain and tissue repositories that collects, evaluates, and stores human brains and makes them available to qualified researchers. 

6e-v. Human Research Ethics 

While there are well-established laws and procedures for human research, studies in humans that involve sensing or stimulating the brain require ongoing oversight. This need is best served by a continual dialog between the researchers and ethical oversight bodies. Ethics committees must include strong representation from members who are informed in clinical and basic neuroscience, and who are aware of the history of brain manipulations and recordings. Students and scientific staff must be trained in ethical human research; meetings dealing with human brain research must address ethical considerations. Ethicists should also be exposed to the unique nature and potential of human neurotechnology. 

6f. Mechanisms 

Research: NIH could lead a new scientific era in human brain research by supporting integrated teams of clinicians, scientists, device engineers, patient-care specialists, regulatory specialists, and ethics specialists to take advantage of unique research opportunities offered by brain stimulation and recording in humans with clinically indicated devices who have voluntarily given informed consent to participate in research. The NIH Clinical and Translational Science Awards (CTSAs) are one mechanism that could provide resources and incentives for these interdisciplinary studies. New mechanisms could be developed that promote increased engagement of neurosurgeons, neurologists, neuroradiologists, and anesthesiologists with scientists, and support the costs associated with performing research in the context of clinical activity. A clinical trial network for devices would be a valuable mechanism to coordinate trial design, data collection, sharing, and analysis. In addition, it could add to the efficiency of human device trials, accelerating both the accumulation of knowledge and the translation of devices to end-users. Engagement of humans themselves as voluntary, informed participants is of primary importance. 

Training: Human research requires specialized knowledge and training in the ability to work in interdisciplinary teams, knowledge of IRB and compliance processes, and attention to data security, data handling, and confidentiality. Research aimed at creating new human devices and drugs must take account of FDA processes and procedures, and ultimately must respond to policies for reimbursement (e.g. Centers for Medicare and Medicaid Services policies), because a product that cannot be paid for cannot be used. Ethics training is essential. NIH should support training programs for the entire enterprise of human research, both fundamental and applied (see also Section III.8). Ideally, this training could be incorporated into clinical trial networks to help ensure uniformity and compliance. 

6g. Timelines and Milestones 

Short-term goals (1-5 years) 

Technology: Establish a support path for developing innovative tools translatable to human applications (coordinated with neurotechnology advances in animals); pipeline to develop over years 3-5. 

Trial Networks: Establish pilot projects for collaborative human neuroscience trial networks, then expand the program within CTSAs or other entities to form large-scale clinical trial networks to facilitate basic research and device development. 

Training: Establish training grants for human research, ethics. 

Human data capture: Support human data sharing for electrophysiological and structural studies, and common platform development for electrophysiological and clinical data. 

Ethics: Neuroscience/ethics training programs, meetings and interactions to establish guidelines and principles for human neuroscience research. 

Long-term goals (6-12 years) 

Technology: Application of high-resolution recording and stimulation for human research and for a broad range of clinical applications; support ongoing pipeline of new technology innovation involving electrical, optical, acoustic and magnetic modalities. 

Trial Networks: Establish international collaborative networks based on the most successful within-United States model from years 1-5; generalize United States model to many major clinical research institutions. 

Training: Core networks provide training to additional researchers and physicians in human neuroscience research requirements; dissemination to other institutions. 

Human data capture: Routine capture of data in surgical settings and device use; curated human neurophysiological data available to research community. Integrate knowledge with animal data. 

Structure/Function: Correlation of human circuits, electrophysiology and anatomy. 

Ethics: All research carried out and monitored under agreed upon ethical principles for human neural interfaces.  

7. From BRAIN Initiative to the Brain 

7a. Scientific Goal: Integrating new technological and conceptual approaches produced in goals #1-6 to discover how dynamic patterns of neural activity are transformed into cognition, emotion, perception, and action in health and disease 

7b. Overall Objective 

The brain is the most complex biological system we know of in nature. Unlocking its mysteries will require a systematic effort that coordinates and focuses all technologies developed under The BRAIN Initiative®. To maximize their ability to answer critical questions about brain function, the new technologies must be combined in an integrated way and developed using principles of systems engineering. The overall objective is to develop and use these integrated platforms to provide a comprehensive, mechanistic understanding of neural circuits and systems. We must understand how circuits give rise to dynamic and integrated patterns of brain activity, and how these activity patterns contribute to normal and abnormal brain functions. Our expectation is that this approach will answer the deepest, most difficult questions about how the brain works, providing the basis for optimizing the health of the normal brain and inspiring novel therapies for brain disorders. 

7c. Deliverables 

Integrated methods and instrumentation that combine activity measurements, perturbation, behavioral analysis, cell type information, connection maps, and theory. Integrated experimentation requires careful coordination and coevolution of individual technologies, but will yield great benefits. When mature, these technologies should be disseminated widely in basic and clinical neuroscience settings. 

Application of fully integrated systems to discover brain processes underlying cognition, emotion, perception, decision-making, and memory; answers to fundamental questions about how the brain works, and starting points for new therapeutic capabilities. By analyzing neural coding and dynamics in relevant brain regions and cell populations, in the context of connectivity diagrams, behavioral analysis, and sophisticated theoretical and quantitative tools, we will acquire mechanistic and conceptual insight into the relationships between neural systems and mental functions in health and disease. 

7d. Rationale and Principles 

A guiding principle of The BRAIN Initiative® is that each new technology, be it sensors, connectomic analysis, neural recording or perturbation, should be developed with the goal of answering fundamental questions in brain function. In some cases, particularly in the early phases of the initiative, a given technology will be applied in isolation to provide important knowledge. For example, new methods of cell identification could produce an understanding of all cell types in the neocortex, or new connectomics methods could yield the projectome of the human brain. However, many of the most exciting and powerful applications will come from combining the new technologies into an integrated whole, a goal we have pointed to in each preceding section. 

Hand-in-hand with these new combinations of experimental methods will come integrated work from theory, modeling and statistics that provide rigor to observations, new methods of visualization and understanding of the data, and, most importantly, new conceptual frameworks for interpretation of the data (Section III.5). These new conceptual frameworks will generate testable hypotheses framed in terms of measurable parameters that in turn will stimulate new experimental work. When verified, these hypotheses and theories will provide the mechanistic understanding of neural circuit function that is the principal goal of BRAIN. 

Successful integration of neuroscience technologies in The BRAIN Initiative® should emphasize a systems engineering approach, in which one optimizes the collective performance of the final system, rather than optimizing individual components129. For example, a systems engineering approach to developing fluorescent indicators of neural activity should focus not only on the properties of the indicator molecule but also on parameters such as wavelength compatibility with other optical sensors, optogenetic probes, detectors or illumination sources, or robustness to fixation processes used in post mortem analyses. This would be important when combining targeted recording with post-hoc connectomic analysis. Similarly, designers of new lasers, lenses or detectors for deep tissue imaging should consider the capabilities of important molecular sensors and probes. 

A challenge in the design of integrated systems arises when multiple component technologies are advancing in parallel. Ongoing progress in lasers, microscopes, optogenetic probes and optical indicators should be coordinated across these domains. Coordination between teams working in complementary areas can help technologies co-evolve successfully. In general, integrated design often requires multiple domains of expertise working together, and for this reason, technology integration will require funding mechanisms that encourage collaborative teams of investigators. 

7e. Implementation and Mechanisms 

Starting in the early stages of The BRAIN Initiative®, and progressing steadily over its duration, we expect a growth in the integration and combination of new technologies. The combinations will first be pairwise (e.g. recording with stimulation), but will later occur with increasing sophistication as combinations of many or all of the technologies. We anticipate that integration will almost always involve concomitant development of bridge technologies that go beyond the development of each component technology. For example, performing connectomic analysis after large-scale recording requires accurate methods (e.g. fiducial marks) that co-register recorded cells with anatomically identified cells. Similarly, the combination of large-scale optical recording with patterned optical stimulation requires new forms of integrated instrumentation that can be used in behaving subjects, and the development of molecular sensors that allow independent targeting and spectral separation. Thus continued technology development will be intrinsically linked to application in the middle stages of The BRAIN Initiative® as combinations of technology begin to emerge. The application of integrative technologies to specific question-based projects will grow over the later years of The BRAIN Initiative®. 
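To make the idea of fiducial-based co-registration concrete, the short sketch below (an illustration added for this discussion, not a method prescribed by the working group) estimates an affine transform from fiducial marks visible in both a functional recording volume and a post-hoc anatomical volume, and uses it to map recorded cell positions into anatomical coordinates; all coordinates, function names, and the choice of a simple affine model are assumptions.

```python
# Purely illustrative sketch: least-squares estimation of an affine transform from
# matched fiducials, then mapping recorded cell centroids into anatomical coordinates.
# All coordinates and names below are hypothetical.
import numpy as np

def fit_affine(src_fiducials, dst_fiducials):
    """Fit a 3D affine transform (4x4 matrix) mapping src fiducials onto dst fiducials."""
    n = src_fiducials.shape[0]                                  # need >= 4 non-coplanar points
    src_h = np.hstack([src_fiducials, np.ones((n, 1))])         # homogeneous coordinates
    A, *_ = np.linalg.lstsq(src_h, dst_fiducials, rcond=None)   # solves src_h @ A ~= dst (A is 4x3)
    affine = np.eye(4)
    affine[:3, :] = A.T
    return affine

def map_points(affine, points):
    """Apply the affine transform to an (N, 3) array of cell centroids."""
    pts_h = np.hstack([points, np.ones((points.shape[0], 1))])
    return (pts_h @ affine.T)[:, :3]

# Corresponding fiducials estimated independently in each modality (hypothetical values).
functional_fids = np.array([[10.0, 5.0, 2.0], [40.0, 8.0, 3.0],
                            [15.0, 30.0, 2.5], [35.0, 28.0, 6.0]])
anatomical_fids = np.array([[102.0, 51.0, 20.0], [402.1, 80.5, 30.2],
                            [151.0, 301.3, 25.1], [352.2, 282.0, 61.0]])

affine = fit_affine(functional_fids, anatomical_fids)
recorded_cells = np.array([[20.0, 15.0, 3.0], [30.0, 22.0, 4.0]])
print(map_points(affine, recorded_cells))   # recorded cells in anatomical coordinates
```

In practice, nonlinear tissue deformation, registration error estimates, and validation against held-out fiducials would all need to be handled, which is precisely why such bridge technologies require dedicated development.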

Collaborative Consortia: We anticipate that integrated consortia combining several approaches will come together as groups of BRAIN Initiative researchers focus on answering specific questions in neuroscience. In an ideal consortium, we imagine expertise and integrated instrumentation in quantitative analysis of behavior, large scale neural recording, cell identification and targeting, connectomic and structural analysis, perturbation/stimulation technology, statistical analysis of high-dimensional data, and theoretical modeling and simulation. Groups of researchers with shared interests in the same neural circuit or animal model will benefit from integrated approaches that create standardized technology platforms for neuroscience experimentation. 

This breadth of expertise is difficult or impossible to achieve in a single lab, but can be developed in self-organized consortia. Although some of this work will happen through partnerships with foundations and private institutes that have broad in-house expertise (e.g. the mouse visual cortex project at the Allen Institute for Brain Science130, and fly work at HHMI Janelia Farm Research Campus53,131), mechanisms should be developed that encourage teams of investigators at academic research centers to organize around specific research projects. A team might be located at an individual institution, or at a set of labs distributed across multiple locations. 

Once integrated technologies are mature, appropriate mechanisms for maximizing their impact may include funding state-of-the-art instrumentation facilities, accessible to researchers from across the country (Section III.8b). 

7f. Timelines and Milestones 

Examples of integrated technologies (years 1-5) 

Genetic access to cell types combined with optical stimulation and optical or electrical recording. Recordings can be targeted toward specific cell types that have been genetically tagged according to their neurotransmitters, anatomical connections, or morphological characteristics. 

Genetic access combined with connectomics. For example, differential labeling of cell types combined with light microscope or EM connectomics will provide connectivity statistics of individual cell classes. 

Quantitative behavioral analysis integrated with large-scale recording and functional perturbation in the context of anatomically or genetically characterized cell types. The neural mechanisms underlying a sophisticated behavioral task can be cast in terms of circuit function: cell identities, inputs, projections, and neuromodulatory influences. 

Simultaneous optical large-scale recording and optical stimulation: replay of natural activity patterns first measured with recording techniques during controlled behavior. For example, this approach could test the idea that specific neural activity sequences encode elapsed time in episodic memory. 

Post-hoc connectomics of a neural circuit following large-scale recording. Recordings alone typically permit multiple mechanistic accounts of the observed activity patterns. Post-hoc reconstruction of the underlying circuits will provide essential information for identifying the actual mechanisms of neural coding and dynamics. For example, this combination of technologies can test the idea that persistent neural activity in short-term memory (like the memory of eye position in the oculomotor integrator) is produced by recurrent connectivity within the network (a toy simulation illustrating this idea follows these examples). 

Advanced ‘optical helmets’ for interrogating brain tissue in freely behaving animals, miniaturized to be worn during active behavior, integrating multiple color channels for recording and manipulating neural activity, advanced signal processing capabilities, and wireless connectivity. 
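To illustrate why post-hoc circuit reconstruction is so informative in the oculomotor-integrator example above, the toy simulation below (added for this discussion, with assumed line-attractor connectivity and arbitrary parameters, not experimental findings) shows how recurrent weights with a unit eigenvalue allow a brief input to leave persistent activity; only a measurement of the connectivity could reveal whether a recorded circuit actually uses such weights.

```python
# Toy model (illustrative only): a linear rate network whose recurrent weights form a
# "line attractor" -- one eigenvalue equal to 1 along direction u -- so that activity
# along u is integrated and then persists after the input ends, while all other
# directions decay with time constant tau. Parameters are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_steps, dt, tau = 50, 2000, 0.001, 0.02    # 2 s of simulated time, 1 ms steps

u = rng.normal(size=n_neurons)
u /= np.linalg.norm(u)
W = np.outer(u, u)                                     # W @ u = u: the integrating mode

rates = np.zeros(n_neurons)
trace = np.zeros(n_steps)
for t in range(n_steps):
    # Brief input pulse along u during the first 100 ms (an "eye-movement command").
    inp = 5.0 * u if t < 100 else np.zeros(n_neurons)
    rates += (dt / tau) * (-rates + W @ rates + inp)   # Euler step of tau * dr/dt = -r + W r + input
    trace[t] = rates @ u                               # population activity projected onto u

print("activity at 0.15 s:", round(trace[150], 3))     # elevated just after the pulse...
print("activity at 1.90 s:", round(trace[1900], 3))    # ...and still elevated ~1.8 s later
```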

Applying these methods to fundamental questions about the healthy brain and brain disorders (years 1-10) 

Application of combined technologies will be far more powerful than application of any new technology alone, and the real payoff of BRAIN will come as we apply these powerful integrated technologies to address fundamental problems of brain function in health and disease. Thus we envision relatively modest funding for integrated, collaborative experimentation in years 1-5 of The BRAIN Initiative® that will increase substantially in years 6-10. Examples of specific questions to be addressed are provided in the Conclusions (Section III.10) of this report.  

8. Supporting the Core Principles of The BRAIN Initiative® 

In addition to the scientific goals of The BRAIN Initiative®, the working group identified core principles that should apply to research conducted in this program. These are: 

Pursue human studies and non-human models in parallel. 

Cross boundaries in interdisciplinary collaborations. 

Integrate spatial and temporal scales. 

Establish platforms for preserving and sharing data. 

Validate and disseminate technology. 

Consider ethical implications of neuroscience research. 

Create mechanisms to ensure accountability to the NIH, the taxpayer, and the community of basic, translational, and clinical neuroscientists. 

Principles 1-3 are explicitly embedded within each individual goal described in this Report. For example, each of the scientific goals described in Sections III.1-III.5 and Section III.7 includes both research in humans and research in non-human animals (Principle 1) with the recognition that these two streams of research will complement, challenge, and complete each other. Similarly, interdisciplinarity and integration across scales (Principles 2 and 3) are included in all goals, and are at the heart of Section III.7. 

Technology validation (Principle 5) should also be embedded within each individual goal. In overview, The BRAIN Initiative® should support maturation of technologies that have already achieved proof of principle, but have promise for further development. Metrics of value during this phase of maturation and validation include effectiveness, robustness, practicality, applicability to a wide range of systems, and cost. Each new technology should be rigorously compared to other technologies using these metrics during the process of validation, with the recognition that these are not hard and fast rules. For example, in some cases a particular approach will have great value in a single experimental animal, and should be pursued even if it does not apply across many species. The process of validation should emphasize iterative communication between tool developers and experimentalists in biological systems. 

Aspects of Principles 4-7, however, are not easily subsumed within a particular goal and will require dedicated resources; we therefore discuss them here. An infrastructure for data sharing (Principle 4) will contribute to each goal of The BRAIN Initiative®, and should be developed with all goals in mind. Dissemination of new technology developed under The BRAIN Initiative® (Principle 5) will contribute to some goals more than others, but at this point we cannot foresee where its eventual costs will be concentrated. We therefore present a more general discussion of technology dissemination. Ethical concerns (Principle 6) are so central to human subjects research that they are discussed specifically in Section III.6; here we consider broader ethical issues that apply across The BRAIN Initiative®. Finally, accountability is a principle to which we all subscribe, but we do not always state it explicitly (Principle 7). It is an appropriate topic on which to end this report. 

8a. Principle: Create and support an infrastructure for preserving and sharing data 

8a-i. Overall Objective 

Accelerate scientific progress by establishing platforms for sharing data (Section II.8) and data analysis tools. Make expensive, hard-won data sets collected under The BRAIN Initiative® available to a large community of researchers. 

8a-ii. Deliverables 

Integrated repositories for datasets and data analysis tools, with an emphasis on user accessibility and expert curation. 

An infrastructure for open sharing of archival data and tools for data analysis that improves the reproducibility of published results, makes new analyses possible, and facilitates comparisons to data from future experiments. 

8a-iii. Rationale 

The BRAIN Initiative® will produce large datasets that should be designed and collected with the goal of widespread dissemination. Good examples of accessible, well-designed data sharing platforms in neuroscience include the Allen Brain Atlas and the Human Connectome Project. More broadly, the National Center for Biotechnology Information (NCBI) and the Protein DataBank (PDB) are gold-standard examples in genomics and structural biology, respectively. 

The BRAIN Initiative® will face substantial challenges in facilitating data sharing, including: 

BRAIN data will be heterogeneous in nature, including anatomical and connectomic data, physiological recordings of different kinds (e.g. electrical, optical, and fMRI obtained under different conditions in different formats) and genetic information; new kinds of databases will be needed. 

Experimental protocols and reagents vary widely between labs, presenting a challenge for data standardization. 

Data preparation, annotation, and metadata collection will have associated costs that require financial support. 

Anatomical names for brain areas and structures are often inconsistent between species and labs132, and will need translation into common frameworks. 

Privacy is a concern for sharing human data. 

Despite these difficulties, it is important that preparations for data sharing begin as soon as possible, since a major data platform for BRAIN is likely to take several years to plan and build. Indeed, experience in genomics suggests that the deposition of data in common data platforms will reveal which of the challenges listed above (or not yet considered) are most serious. 

These general guidelines should help to speed the development of data sharing for The BRAIN Initiative®: 

Standards for data formats, metadata, and nomenclature should be established through consensus involving all major stakeholders (a purely hypothetical sketch of such a metadata record appears after this list). 

Computer programs for data preprocessing and analysis should routinely be provided along with the data so that published results can be replicated and, if necessary, re-evaluated. 

Policies for depositing and disseminating data and programs should be established, including who has access and under what conditions. 

Data platforms should make it easy for researchers to standardize their data and make it public, and easy for other researchers to access, analyze, and visualize the data. 

Expert curation and other resources will be needed for data deposition, access, and updating. These will have associated costs that should be supported. 

Analysis tools should be developed to integrate across data sets and span different types of data. Like genome analysis tools, these should run on the servers that maintain the data, since some datasets will be too large to download. 
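As a purely hypothetical illustration of the first guideline above, the sketch below shows the kind of minimal, machine-readable metadata record a standardized platform might require alongside each deposited dataset. Every field name, value, and format choice here (including the mention of the NWB file format) is an assumption offered for discussion, not a proposed standard.

```python
# Hypothetical example of a standardized metadata record accompanying a deposited
# dataset; field names and values are illustrative assumptions, not an endorsed schema.
import json

example_record = {
    "dataset_id": "example-0001",                   # stable identifier so the data can be cited
    "modality": "two-photon calcium imaging",       # e.g. electrical, optical, fMRI, EM
    "species": "Mus musculus",
    "brain_region": "primary visual cortex",        # mapped onto a common anatomical nomenclature
    "subject_count": 4,
    "acquisition": {"frame_rate_hz": 30.0, "indicator": "GCaMP6f"},
    "behavior": {"task": "visual discrimination", "annotations": "per-trial event times"},
    "files": [{"path": "session01.nwb", "format": "NWB", "sha256": "..."}],
    "analysis_code": "https://example.org/analysis-repo",   # code deposited alongside the data
    "license": "CC-BY-4.0",
    "human_subjects": False,                        # would trigger privacy and consent requirements
}

print(json.dumps(example_record, indent=2))
```

Records of this general kind, agreed upon by the community, would make automated indexing, search, and cross-dataset analysis far easier than free-form documentation.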

Well-designed data platforms will directly benefit research in individual labs, providing an incentive for their use. Even without a BRAIN Initiative, the size and complexity of emerging datasets make it difficult for individual labs to analyze and back up their data. The first use of well-designed data platforms can therefore be to provide convenient solutions within the lab. At the next stage, the platforms will make it easier for labs to collaborate with other researchers and data scientists. By designing platforms with different levels of permission and agreements on data use, these in-lab resources can be seamlessly expanded into shared data resources at appropriate times and levels. 

The call for data sharing expressed here is a bottom-up mandate from much of the research community. For example, it was strongly voiced by the researchers who participated in our workshop on human neuroscience in the summer of 2013. This group has already benefited from the sharing of large brain imaging datasets generated by the Human Connectome Project, and sees its future potential. 

The question of which data are shared, when, and how credit is assigned, should be answered by NIH and BRAIN Initiative participants. Experience from other fields such as genomics suggests that data should be shared no later than the date of first publication; for resource projects, data should be shared on a regular (e.g. quarterly) schedule prior to publication. Community recognition can be conferred by citing deposited data; new journals are being founded to publish and enable citation of datasets. Neuroscience can learn not only from genomics but from other areas of science such as astronomy and particle physics where these issues have been resolved. Half of the articles published from the Hubble Space Telescope are based on well-curated archival data and come from researchers outside the original team that designed the data collection133.

In addition to the sharing of data per se, research in many labs could be accelerated through the development, validation, and dissemination of generally useful software tools for data analysis. Examples of high-impact software tools include BLAST, developed at the National Center for Biotechnology Information for comparing biological sequence information; ImageJ, developed at NIH, a successful open-source program for general image processing; and the Insight Segmentation and Registration Toolkit (ITK), developed with NIH support, an open-source image segmentation and registration program that facilitated the highly successful Visible Human Project. To maximize their utility, software tools must be designed with careful consideration of both scientific needs and long-term sustainability. Virtually all major goals of BRAIN Initiative research could benefit greatly from the development of robust, widely shared software tools. 

Mechanisms for sharing, preservation and analysis of data for The BRAIN Initiative® may be developed in partnership with other NIH programs such as the NCBI and the Big Data to Knowledge (BD2K) Initiative, and would also represent an opportunity for interaction with other organizations that have expertise in this area, such as the International Neuroinformatics Coordinating Facility, the Neuroscience Information Framework, and the Allen Institute for Brain Science. 

8a-iv. Timelines and Milestones 

1-3 years: Identify data types and establish standards for each data type. 

3-5 years: Create data sharing platforms for major data types. 

3-5 years: Support development of data analysis software tools that are well-validated and valuable to large segments of the community. 

5-10 years: Scale up databases, maintain software, curate data, support continued development of analysis tools. 

8b. Principle: Dissemination and training in new technologies 

8b-i. Overall Objective 

As discussed in Section II.8, it is a principle of The BRAIN Initiative® that newly developed technologies and reagents should be made broadly available to have the greatest possible reach and impact. This principle is most important for the complex integrated technologies (Section III.7), but applies to individual technologies as well. 

8b-ii. Deliverables 

Mechanisms to enable sharing and widespread dissemination of biological reagents, instruments, and computational tools developed under The BRAIN Initiative®, to maximize their impact across basic, translational, and clinical neuroscience. 

Practical courses in the use of new neuroscience technologies that allow researchers to apply them skillfully and rigorously. 

8b-iii. Rationale 

Dissemination of technology will have special needs and costs that are not included in the individual goals. We expect The BRAIN Initiative® to develop new instrumentation, possibly including new kinds of microscopes; new closed-loop electrical recording/stimulating systems for non-human animals and humans; entirely new instrumentation appropriate for (for example) use of nanotechnology in the brain; and new integrated instrument systems that enable studies of neuronal activity, cell type, connectivity, perturbation, and behavior within the same individual. We expect The BRAIN Initiative® to develop new biological reagents, possibly including genetically-modified strains of rodents, fish, invertebrates, and non-human primates; recombinant viruses targeted to different brain cell types in different species; genetically-encoded indicators of different forms of neural activity; and genetic tools for modulating neuronal activity. It is a principle that such reagents developed under The BRAIN Initiative® be accessible to the broadest possible community of scientists, who should also be trained in their use. 

8b-iv. Implementation and Mechanisms 

For biological reagents, appropriate mechanisms of sharing have been developed, but they may need to be expanded and further supported, for example through BRAIN Initiative support of virology centers and genetic stock centers. 

For instrumentation, advances can be promoted by thoughtful development of dissemination policies by the scientific community, as well as specialized support mechanisms and public-private partnerships. For example, for complex instrumentation, commercialization could be supported through the Small Business Innovation Research (SBIR) program or other partnerships. 

Some new BRAIN technologies may not be affordable by individual labs or small consortia, and unique issues will arise in these cases. The multiphoton microscopes, super-resolution microscopes, multielectrode arrays, multi-beam electron microscopes, and virtual-reality behavioral chambers already in use in neuroscience are expensive, and integrated systems will be more so. Once technologies are mature, appropriate mechanisms for maximizing their impact may include funding state-of-the-art instrumentation facilities, perhaps at a few locations at universities or research institutes, that could be used by researchers from across the country. Visiting scientists might be in residence for several weeks or months while developing a research goal. Host institutions, which would clearly benefit from proximity, could co-invest in equipment and support personnel. As one example, we anticipate that expensive, complex technologies such as next-generation high-field MRI instruments might need to be centralized, like the centralized X-ray beam lines used by structural biologists, but we emphasize that those technologies do not currently exist. 

Any planned BRAIN Initiative facilities should be evaluated for their impact on the broad scientific community, not small groups of researchers. Any facilities funded by The BRAIN Initiative® should have the clearly stated goal of democratic access and widespread availability of technology, not just in the host institution. Groups that are funded must be held accountable and willing to accept close oversight from the NIH. 

Training in new methods and their rigorous application should be a component of The BRAIN Initiative®. A relatively small investment in hands-on practical courses can provide immense added benefit in the reach of technologies. This could occur at traditional course sites like Cold Spring Harbor and Woods Hole, or at universities that invite outside researchers for training. 

8b-v. Timelines and Milestones 

Short term goals (1-5 years) 

Expand existing resources for sharing biological reagents to accommodate the projected needs of BRAIN Initiative research. 

Support commercialization and dissemination of new instruments through the SBIR Program or other partnerships. 

Support shared instrumentation for consortia of investigators within or across universities and research institutions. 

Training courses for researchers to master new research technologies. 

Long term goals (6-10 years) 

Continue to support resources for sharing of reagents and instrumentation; continue training courses for new technologies. 

With appropriate oversight and evaluation of need, create facilities for integrated instrumentation whose cost is beyond the scope of any single laboratory or most institutions. Ensure broad access to these centers from multiple universities and research institutions. 

8c. Principle: Consider ethical implications 

8c-i. Overall Objective 

A solid ethical framework is essential for ensuring that scientific research is of the utmost value to the public it intends to serve. Therefore, the research supported by and the knowledge generated through The BRAIN Initiative® should be regularly assessed for their ethical, legal, and societal implications. 

8c-ii. Deliverables 

Careful and broadly-based consideration of the unique ethical issues raised by human neuroscience research. Joint neuroscience/ethics meetings and training programs; resources for collecting and disseminating best practices. 

Vigorous dialogue among ethicists, educators, government and corporate representatives, patients and their advocates, lawyers, journalists, scientists and other concerned stakeholders about social and ethical issues raised by new knowledge and technologies generated under The BRAIN Initiative®. 

8c-iii. Rationale and Principles 

Ethical issues implicit in The BRAIN Initiative® fall under two broad, equally important categories: 1) ethical conduct of research, and 2) ethical and societal implications of new technologies and scientific discoveries. 

Ethical conduct of research. 

Society stands to benefit tremendously from neuroscience research, but it is imperative that this research is conducted in accordance with high ethical standards. Establishing ethical guidelines early makes it possible to address future hurdles and dilemmas in advance, preventing unnecessary costs and delays as well as adverse consequences. Many laws and oversight systems are already in place for research on human subjects, but studies involving monitoring and stimulating the human brain require particular sensitivity and ongoing oversight. Human participants are invaluable resources for neuroscience research, and protecting their interests is of the utmost importance. Informed consent and protection of privacy are critical components of human neuroscience research. BRAIN Initiative projects should strive for complete transparency about the risks and benefits of participation in these studies. 

The BRAIN Initiative® aims to understand how living brains work to generate cognitive functions and behavior. As a result, inevitably, many scientific studies must be conducted in living animals, not in tissue slices, cell cultures or computer simulations. This research, too, should adhere to rigorous ethical standards. Both our own sense of ethics and the legal and regulatory framework of scientific research in the United States require a strong commitment to performing such experiments with minimal suffering to animals, and a respect for animal cognition and life. 

Ethical and societal implications of new technologies and scientific discoveries. 

As suggested in Section II.8f, mysteries unlocked through The BRAIN Initiative®, and through neuroscience in general, are likely to change how we perceive ourselves as individuals and as members of society. Many of these discoveries will raise more questions than they answer. We may need to consider, as a society, how discoveries in the area of brain plasticity and cognitive development are used to maximize learning in the classroom, the validity of neuroscience measurements for judging intent or accountability in our legal system, the use of neuroscience insights to mount more persuasive advertising or public service campaigns, the issue of privacy of one’s own thoughts and mental processes in an age of increasingly sophisticated neural ‘decoding’ abilities, and many other questions134. Questions of this complexity will require insight and analysis from multiple perspectives, and should not be answered by neuroscientists alone. 

Many of these ethical considerations are not unique to the BRAIN Initiative or neuroscience. But researchers must remain cognizant of them at all stages of research and should have appropriate avenues for seeking guidance when applicable. 

8c-iv. Implementation and Mechanisms 

The President has charged his Bioethics Commission to take a broad view of ethical implications of neuroscience research that extend beyond The BRAIN Initiative®. The BRAIN working group endorses this action enthusiastically. The Bioethics Commission should engage neuroscience-related issues whenever they arise—under the present and future administrations—and the Commission should be able to count on the participation and support of neuroscientists when our expertise is needed. 

No single commission, however, can explore these ethical issues with the depth and diversity of perspectives that mirrors our society. A broader conversation is necessary. Stakeholders should be engaged through a variety of additional mechanisms, including academic research in bioethics, training programs for a broad array of practitioners and students in the medical professions, conferences targeted to audiences with different levels of scientific expertise, and media outreach. 

As summarized in Section III.6, oversight for neuroscience research with human volunteer subjects is best served by a continual dialog between the researchers and ethical oversight bodies. Ethics committees must include members who are informed in clinical and basic neuroscience, and who are aware of the history of brain manipulations and recordings. Students and scientific staff involved in the projects must be trained in ethical human research, and ethicists should be educated about the unique nature and potential of human neurotechnology. 

8c-v. Timelines and Milestones 

Goals (years 1-10) 

Joint neuroscience/ethics training programs and meetings to consider the unique issues raised by human neuroscience research, and to establish a shared vision for the ethical conduct of such research. 

Resources for collecting and disseminating best practices in the conduct of ethical scientific research, particularly for the conduct of clinical research. 

Support for data-driven research to inform ethical issues arising from BRAIN Initiative research, ideally with integrated activities between ethicists and neuroscientists. 

Opportunities for outreach activities focused on engaging government leaders, corporate leaders, journalists, patients and their advocates, educators, and legal practitioners in discussion of the social and ethical implications of neuroscience research. 

8d. Principle: Accountability to NIH, the taxpayer, and the basic, translational, and clinical neuroscience communities 

8d-i. Overall Objective 

Ensure that research conducted under The BRAIN Initiative® is targeted toward the scientific priorities established by the NIH, makes efficient use of taxpayer funds, and results in technical and conceptual advances that have a high impact on the pace and quality of neuroscience research. 

8d-ii. Rationale 

Accountability to the NIH and the taxpayer. 

The BRAIN Initiative® has enormous potential for solving persistent mysteries of brain function, spinning off technologies that seed new industries, and opening the door to new treatments for diseases and disorders of the nervous system. To ensure that the Initiative remains on track scientifically and is making efficient use of taxpayer dollars, the progress and accomplishments of The BRAIN Initiative® should be analyzed regularly by the NIH leadership, the NIH neuroscience leadership (directors of neuroscience-relevant institutes), a scientific advisory board (see below) and representatives of the public such as members of patient advocacy groups for brain disorders. This process will require active, high-level tracking of Initiative progress, over and above the typical annual evaluation of investigator-initiated research grants. 

Scientific accountability. While the proposed scientific goals for The BRAIN Initiative® are focused on enhancing our abilities to study and understand brain circuits, the ultimate vision is that this knowledge will transform all of basic, translational, and clinical neuroscience. To achieve this aim, NIH should encourage and facilitate the dissemination and utilization of BRAIN Initiative advances throughout the neuroscience enterprise, moving creatively to bring new tools and discoveries into areas of opportunity. 

Breakthroughs in modern science are typically achieved through cascading advances made by many researchers and clinicians over time, not by isolated groups at a single moment. In the short run, simply ensuring widespread access to new reagents, technologies, and data is a sure way to maximize impact and also to demonstrate accountability to the scientific community and the NIH. Many principles enunciated in this report will create a path for accountability. 

At intermediate timescales, high-quality data moves the field forward. The success of cooperative and open-access projects such as the Human Connectome Project or the Human Genome Project can be measured by the release and use of the data. Shared data are a product available to many researchers, not just those funded by The BRAIN Initiative®, and the external users can provide unbiased evaluation of data quality. If the data are valuable, the data sharing policy will increase support for The BRAIN Initiative® in the community, just as the deposition of DNA sequences into common databases led to increased grass-roots support for the Human Genome Project. Conversely, Balkanized data in inaccessible formats with reluctant sharing leads to resentment and diminished scientific return. Annual renewal of grant funds could be contingent on timely deposition of datasets into shared resources. Dissemination of powerful experimental tools and technologies to the community is another clear-cut demonstration of scientific benefit provided by The BRAIN Initiative®. The use of such tools outside the lab that developed them is the most potent indicator of their value. 

In the long run, scientific and medical advances will be the ultimate metrics of the success of The BRAIN Initiative®. These can and should be tracked by the NIH, with insight into the areas that have made progress, those that require additional support, and those that should be terminated. 

8d-iii. Implementation and Mechanisms 

Coordination and evaluation of research under The BRAIN Initiative® poses special challenges due to the rapidly evolving neuroscience landscape, the highly interdisciplinary nature of the research, and the need to leverage efforts with the existing NIH neuroscience research portfolio. Accountability, integration, and transparency can be greatly enhanced through the formation of a scientific advisory board, which would be composed of scientists who are experts in the many disciplines relevant to the Initiative and possess an understanding of the breadth of the NIH neuroscience research portfolio. A cohesive and rigorous scientific advisory board would: 

Ensure that the scientific vision of The BRAIN Initiative® is updated in response to new technological and conceptual advances that will be made over the course of the next 10 years. A paramount responsibility of this board will be to help identify and respond to these dynamic—and exciting—changes. 

Guide and facilitate the integration of the diverse disciplines under The BRAIN Initiative® as envisioned in this report. No single entity is accustomed to integrating neuroscience with engineering, physics, statistics, applied mathematics, chemistry, genetics, molecular biology, and the clinical sciences, and it will be crucial that communication across these disciplines is achieved and maintained. 

Provide cohesion across the NIH Institutes and Centers responsible for supporting neuroscience research in disease-specific areas. Ultimately, the intent is that advances under The BRAIN Initiative® will spur new breakthroughs in our understanding of these diseases and disorders, and BRAIN efforts must be maximized to achieve this aim.  

9. Cost Estimates 

The BRAIN Initiative® Will Require New and Distinct Funding of $300-500 Million per Year. 

As part of the planning process, the working group was asked to estimate a budget for The BRAIN Initiative®. While the scientific program of The BRAIN Initiative® has been informed by the professional expertise of the working group members and many other contributing scientists, we did not conduct a detailed analysis of costs. Instead, we considered the scope of the questions to be addressed by the Initiative, and the cost of programs that have developed in related areas over recent years. The cost estimates for The BRAIN Initiative® should thus be viewed as provisional, but informed by the costs of real neuroscience at this technological level. 

Each scientific goal of The BRAIN Initiative® is critical to its integrated purpose, but different goals require different kinds of investment and resources. Instrumentation and technology costs are expected to be particularly high in developing new recording methods (Goal 3) and improving anatomical methods (Goal 2). All research with human subjects requires close supervision and associated costs, whether it involves structural and functional imaging (Goals 2 and 3), perturbation (Goal 4), or intra- or peri-operative recording (Goal 6). Theory, computation, and statistics require relatively little equipment, but will require strong support for personnel (Goal 5). Infrastructure for data science and technology dissemination will provide added value to the entire Initiative (Section III.8). 

The first year of The BRAIN Initiative®, FY14, was seeded by a new $40 million commitment from the NIH; in the second year, FY15, NIH will contribute $100 million to new and continuing grants. The working group believes that the program presented in the preceding sections could ramp up to $400 million per year over the next five years (FY16-20), and continue at roughly $500 million per year for the last five years (FY21-25). In total this might represent around 5% of the budget for brain-related research at NIH. A possible trajectory of costs per fiscal year is diagrammed below. 

As described earlier in this Section, the first years will emphasize technology development and validation, with a larger emphasis on problem-driven neuroscience after FY20. 

Ramping up support for The BRAIN Initiative® over time is important to drive its success. A structure that builds up over five to seven years will encourage talented scientists to enter interdisciplinary collaborations to solve important, difficult problems, because they will see a long-term commitment of the NIH. The BRAIN Initiative® should be a catalyst that will drive outstanding young people to enter this area at their most creative career stage. 

We envision the following general mechanisms for supporting BRAIN Initiative research. This list is not meant to be prescriptive or comprehensive; note, for example, that detailed mechanisms for supporting theory and human neuroscience research are addressed within Sections III.5 and III.6. 

Collaborative technology development and validation grants with 2-5 Principal Investigators representing several areas (for example, tool developers and tool users, or optical microscopy and biological sensor development). 

Technology innovation grants for development of new, blue-sky technologies, with 1-2 Principal Investigators. 

Integrated scientific grants addressing a fundamental question such as perception, memory, cognition, or action, with 2-5 Principal Investigators representing different disciplinary approaches, including experiment and theory. 

Multisite grants to address complex integrated systems and primate or human neurotechnologies, with multiple collaborators across fields, including clinical researchers. 

Infrastructure support for BRAIN Initiative research: data infrastructure grants; practical training infrastructure in quantitative and experimental methods; and technology dissemination infrastructure for providing broad, democratic access to new neurotechnologies. 

Our cost estimates are extremely optimistic. Developing a single new stimulating or recording device for humans up through FDA approval might cost $100 million or $200 million, and this is not represented in our budget estimates. We believe that The BRAIN Initiative® can catalyze the first steps of this process, but that clinical advances can then best move forward through partnership with other clinical and industrial aspects of the research enterprise, which should shoulder some of the cost. We expect co-investment from government agencies, private foundations, international efforts, and industry in supporting BRAIN Initiative goals. 

These cost estimates assume that the budget for The BRAIN Initiative® will supplement, and not be taken from, existing NIH efforts in problem-based basic, translational, and clinical neuroscience. The majority of NIH investment in brain research is appropriately directed toward essential questions pertaining to health and disease, an investment that will remain as important in the future as it is now. For example, while we believe that BRAIN Initiative technologies will accelerate our understanding of circuits involved in learning and memory, and will suggest possible approaches for circuit-level interventions into memory disorders such as Alzheimer’s disease, we also strongly support the entire suite of ongoing NIH Alzheimer’s research that studies protein folding of the toxic beta-amyloid peptide, cell biology of affected neurons, human genetics of disease risk factors, development of new PET ligands for diagnosis, and clinical trials based on our molecular understanding. It would be inappropriate, even unethical, to place these targeted questions aside as circuit-level neuroscience is developed through BRAIN technologies. Thus our budget planning assumes continued strong support of the NIH commitment to neurological disorders and stroke, mental health, addiction, aging, and basic molecular, cellular, developmental and systems neuroscience which are not individually included in The BRAIN Initiative®.  

10. Concluding Remarks 

The Application of Integrated Technologies to Study Fundamental Questions in Neuroscience. 

Numerous long-standing problems in brain science will benefit dramatically from the integrated experimental approach made possible by The BRAIN Initiative®. It is difficult if not impossible to identify one set of problems that is more important than others—the brain functions as a whole, not as the sum of its parts. Nevertheless, we outline below five outstanding questions in understanding the brain in health and disease, and how technology and basic neuroscience developed under The BRAIN Initiative® will address these questions. 

Perception 

Perception mediates our sense of the external world and initiates much of our interaction with the world, providing an essential foundation for cognition and behavior. Our unified, subjective perception of the world emerges from coordinated electrochemical activity of billions of neurons in the brain. While the first levels of sensory processing have been much studied, The BRAIN Initiative® will make it possible to ask increasingly sophisticated questions about higher-level perception and multimodal integration. How do circuit dynamics perform computations on the information at each level of sensory processing? How is sensory information from multiple modalities (vision, audition, touch) integrated to generate a coherent percept of, for example, a specific person? The ability to monitor the dynamic activity patterns of large populations of neurons in areas of the brain specialized for sensory processing, the ability to relate those patterns to the underlying circuit structures, and the ability to measure the perceptual consequences of perturbing those patterns in precise ways will be revolutionary. A critical challenge will be to develop powerful theories of sensory coding, computation, and multi-modal synthesis of sensory information that build on known mechanisms to make testable predictions. Progress in this area will suggest strategies for treating sensory deficits such as blindness or macular degeneration, and strategies for understanding the perceptual distortions in brain disorders such as schizophrenia. 

Emotion and Motivation 

Our reactions to information from the external world are critically shaped by internal brain states such as emotion, motivation, and arousal. Perceptual recognition of a particular person may elicit a warm greeting or furtive avoidance depending upon one’s emotional state. Cell-type-specific access to neural systems in the hypothalamus and amygdala is uncovering intermixed neural circuits underlying emotional states such as fear and aggression, but at a deeper level, the integration of this internal information with external information is a mystery. How are internal states generated in the brain, at the level of neural circuits and brain chemistry? How do these states influence cognitive processes such as learning and memory, decision-making or attention? How are the different timescales of cognitive processes and emotional processes integrated? Understanding motivation and cognition is a multilevel process in which no one brain area will provide a full answer. The BRAIN Initiative® will provide more precise access to the subcortical circuits thought to have important roles in generating brain states, and to the cortical areas that are influenced by them and influence them in turn. It will provide tools to manipulate activity within those circuits and measure behavioral outcomes. An understanding of the neural basis of emotional and motivational states is critically important in understanding how genes, the environment, and experience interact in the brain to cause psychiatric disorders such as depression, anxiety, and post-traumatic stress disorder. 

Cognition 

Cognition lies at the core of our internal mental lives, comprising our thinking, planning, and understanding of the world. For example, for humans to carry out a conversation, understand each other’s needs and priorities, and work together as a group or team, remarkable feats of cognition are required involving social and emotional processing, models of the mental states of others, and predictions about the future over many timescales—often in constantly changing situations with many uncertainties. Basic forms of cognition can be studied in animals as well, including decision-making, planning over a range of timescales, navigating paths and performing spatial computations, and making predictions about future rewards. While past experiments have opened the door to mechanistic studies of cognition, The BRAIN Initiative® will introduce a new era. Cognition integrates the external world, internal emotional states, memories, goals, and social context, and thus the circuits underlying cognitive processing are likely to span the entire nervous system. More than any other area, perhaps, the study of cognition will benefit from the far-reaching and high-resolution anatomical, recording, and perturbation methods to be developed under The BRAIN Initiative®. Cognition is central to clinical conditions as well, since key symptoms of crippling human diseases (including degenerative brain disorders, schizophrenia, and autism) lie in the realm of cognition, as do some treatments. For example, the cognitive-behavioral therapies for depression and anxiety disorders are widely-used and effective presumably because cognition is heavily influenced by (and powerfully influences) emotion. The BRAIN Initiative® will provide tools for use both in humans and in animal models to understand cognitive disorders, the systemic effects of complex and varied genetic factors that affect the risk of these disorders, and the mechanistic effects of cognitive, pharmacological, and electrical therapies. 

Learning and Memory 

Our memories provide us with our sense of our lives and selves; they are our connection with the past and the future. Learning and memory are near-miraculous properties of the brain that permit it to encode, store, and recall any event, associating features from all of the senses, linking them into a sequence that incorporates recent and distant information, and infusing them with meaning. The hippocampus is one brain region critical to the formation of new episodic, explicit memories, but it is mysterious how the vast assortment of associations can be assembled or routed there. It is equally mysterious how that information is eventually transferred to other brain areas, as it is during long-term memory. The BRAIN Initiative® will provide the tools for understanding the hippocampus and the many other brain areas involved in learning and memory. These tools will help unravel the dynamic neuronal mechanisms that store information like a phone number for just a few seconds (perhaps persistent neural activity?), and the stable mechanisms that can store a memory for a lifetime (perhaps modifications to synaptic structures, but which, and where?). A better understanding of memory systems in the brain, and their circuit properties, is urgently needed for addressing learning disabilities in children and for addressing the tragedy of memory loss in Alzheimer’s disease and other adult dementias. 

Action 

Motor systems generate our sophisticated learned behaviors, our skills, our language, and many of our pleasures in life. Any impact we have on the world happens ultimately through action, yet the neural processes that generate action are mysterious. An apparently simple goal—picking up a coffee cup—involves dozens of muscles and multiple degrees of freedom about several joints. While computer programs can routinely best human chess grandmasters, robots cannot move with the suppleness of a child. Motor planning and execution is a whole-brain process, with essential contributions of scattered regions including the cerebral cortex, the basal ganglia, the pons, the brainstem, and the cerebellum, as well as the spinal cord. These areas have generally been studied in isolation, but they function together, and damage to any of these areas, for example from injury or stroke, results in motor deficits. Experimental and theoretical tools to be developed under The BRAIN Initiative® will enable comprehensive analyses of movement preparation and generation. Motor structures that are typically studied individually will be studied increasingly as coupled networks, resulting in a clearer picture of the signal transformations that occur between selection and execution of a movement. Perturbation tools should enable incisive tests of function for motor loops through the basal ganglia and cerebellum. These advances will be highly relevant to understanding movement disorders like dystonia and neurodegenerative diseases such as Parkinson’s and ALS. 

In summary, we see immense potential in BRAIN Initiative technology applied to compelling neuroscience questions. We see great opportunities in providing new tools to basic neuroscientists, translational researchers, neurologists, psychiatrists, radiologists, and neurosurgeons. At the same time, we recognize that these initial goals will be refined, that new goals may emerge, and that objectives and priorities are still crystallizing. We have attempted to represent the best collective scientific wisdom of the field, with the advice of our colleagues in neuroscience, medicine, psychology, biology, chemistry, and the quantitative sciences, and with the advice of patient advocates and the public. The BRAIN Initiative® is a challenge and an opportunity to solve a central mystery—how organized circuits of cells interact dynamically to produce behavior and cognition, the essence of our mental lives. The answers to that mystery will not come easily. But until we start, the progress we desire will always be distant. The time to start is now.  

11. References 

1 Zariwala, H. A. et al. Visual tuning properties of genetically identified layer 2/3 neuronal types in the primary visual cortex of cre-transgenic mice. Frontiers in systems neuroscience 4, 162, doi:10.3389/fnsys.2010.00162 (2011). 

2 Zeng, H. et al. Large-scale cellular-resolution gene profiling in human neocortex reveals species-specific molecular signatures. Cell 149, 483-496, doi:10.1016/j.cell.2012.02.052 (2012). 

3 Sorensen, S. A. et al. Correlated Gene Expression and Target Specificity Demonstrate Excitatory Projection Neuron Diversity. Cerebral cortex, doi:10.1093/cercor/bht243 (2013). 

4 Manning, L. et al. A resource for manipulating gene expression and analyzing cis-regulatory modules in the Drosophila CNS. Cell reports 2, 1002-1013, doi:10.1016/j.celrep.2012.09.009 (2012). 

5 Tuthill, J. C., Nern, A., Holtz, S. L., Rubin, G. M. & Reiser, M. B. Contributions of the 12 neuron classes in the fly lamina to motion vision. Neuron 79, 128-140, doi:10.1016/j.neuron.2013.05.024 (2013). 

6 Masland, R. H. The neuronal organization of the retina. Neuron 76, 266-280, doi:10.1016/j.neuron.2012.10.002 (2012). 

7 Masland, R. H. Neuronal cell types. Current biology : CB 14, R497-500, doi:10.1016/j.cub.2004.06.035 (2004). 

8 Nelson, S. B., Sugino, K. & Hempel, C. M. The problem of neuronal cell types: a physiological genomics approach. Trends in neurosciences 29, 339-345, doi:10.1016/j.tins.2006.05.004 (2006). 

9 Barres, B. A. The mystery and magic of glia: a perspective on their roles in health and disease. Neuron 60, 430-440, doi:10.1016/j.neuron.2008.10.013 (2008). 

10 Raff, M. C. Glial cell diversification in the rat optic nerve. Science 243, 1450-1455 (1989). 

11 Hochstim, C., Deneen, B., Lukaszewicz, A., Zhou, Q. & Anderson, D. J. Identification of positionally distinct astrocyte subtypes whose identities are specified by a homeodomain code. Cell 133, 510-522, doi:10.1016/j.cell.2008.02.046 (2008). 

12 Group, P. I. N. et al. Petilla terminology: nomenclature of features of GABAergic interneurons of the cerebral cortex. Nature reviews Neuroscience 9, 557-568, doi:10.1038/nrn2402 (2008). 

13 Lubeck, E. & Cai, L. Single-cell systems biology by super-resolution imaging and combinatorial labeling. Nature methods 9, 743-748, doi:10.1038/nmeth.2069 (2012). 

14 Micheva, K. D., O'Rourke, N., Busse, B. & Smith, S. J. Array tomography: immunostaining and antibody elution. Cold Spring Harbor protocols 2010, pdb.prot5525 (2010). 

15 Ramsköld, D. et al. Full-length mRNA-Seq from single-cell levels of RNA and individual circulating tumor cells. Nature biotechnology 30, 777-782, doi:10.1038/nbt.2282 (2012). 

16 Morgan, J. I., Cohen, D. R., Hempstead, J. L. & Curran, T. Mapping patterns of c-fos expression in the central nervous system after seizure. Science 237, 192-197 (1987). 

17 Parmar, M. & Li, M. Early specification of dopaminergic phenotype during ES cell differentiation. BMC developmental biology 7, 86, doi:10.1186/1471-213X-7-86 (2007). 

18 Dymecki, S. M., Ray, R. S. & Kim, J. C. Mapping cell fate and function using recombinase-based intersectional strategies. Methods in enzymology 477, 183-213, doi:10.1016/S0076-6879(10)77011-7 (2010). 

19 Nern, A., Pfeiffer, B. D., Svoboda, K. & Rubin, G. M. Multiple new site-specific recombinases for use in manipulating animal genomes. Proceedings of the National Academy of Sciences of the United States of America 108, 14198-14203, doi:10.1073/pnas.1111704108 (2011). 

20 Pfeiffer, B. D. et al. Refinement of tools for targeted gene expression in Drosophila. Genetics 186, 735-755, doi:10.1534/genetics.110.119917 (2010). 

21 Gerfen, C. R., Paletzki, R. & Heintz, N. GENSAT BAC cre-recombinase driver lines to study the functional organization of cerebral cortical and basal ganglia circuits. Neuron 80, 1368-1383, doi:10.1016/j.neuron.2013.10.016 (2013). 

22 Lima, S. Q., Hromadka, T., Znamenskiy, P. & Zador, A. M. PINP: a new method of tagging neuronal populations for identification during in vivo electrophysiological recording. PloS one 4, e6099, doi:10.1371/journal.pone.0006099 (2009). 

23 Gay, L. et al. Mouse TU tagging: a chemical/genetic intersectional method for purifying cell type-specific nascent RNA. Genes & development 27, 98-115, doi:10.1101/gad.205278.112 (2013). 

24 Heiman, M. et al. A translational profiling approach for the molecular characterization of CNS cell types. Cell 135, 738-748, doi:10.1016/j.cell.2008.10.028 (2008). 

25 Miller, M. R., Robinson, K. J., Cleary, M. D. & Doe, C. Q. TU-tagging: cell type-specific RNA isolation from intact complex tissues. Nature methods 6, 439-441, doi:10.1038/nmeth.1329 (2009). 

26 Siegert, S. et al. Genetic address book for retinal cell types. Nature neuroscience 12, 1197-1204, doi:10.1038/nn.2370 (2009). 

27 Guenthner, C. J., Miyamichi, K., Yang, H. H., Heller, H. C. & Luo, L. Permanent genetic access to transiently active neurons via TRAP: targeted recombination in active populations. Neuron 78, 773-784, doi:10.1016/j.neuron.2013.03.025 (2013). 

28 Knight, Z. A. et al. Molecular profiling of activated neurons by phosphorylated ribosome capture. Cell 151, 1126-1137, doi:10.1016/j.cell.2012.10.039 (2012). 

29 Tye, K. M. & Deisseroth, K. Optogenetic investigation of neural circuits underlying brain disease in animal models. Nature reviews Neuroscience 13, 251-266, doi:10.1038/nrn3171 (2012). 

30 Gaj, T., Gersbach, C. A. & Barbas, C. F., 3rd. ZFN, TALEN, and CRISPR/Cas-based methods for genome engineering. Trends in biotechnology 31, 397-405, doi:10.1016/j.tibtech.2013.04.004 (2013). 

31 Anliker, B. et al. Specific gene transfer to neurons, endothelial cells and hematopoietic progenitors with lentiviral vectors. Nature methods 7, 929-935, doi:10.1038/nmeth.1514 (2010). 

32 Oh, S. W. et al. A mesoscale connectome of the mouse brain. Nature, doi:10.1038/nature13186 (2014). 

33 Zingg, B. et al. Neural networks of the mouse neocortex. Cell 156, 1096-1111, doi:10.1016/j.cell.2014.02.023S0092-8674(14)00222-0 [pii] (2014). 

34 Van Essen, D. C. Cartography and connectomes. Neuron 80, 775-790, doi:10.1016/j.neuron.2013.10.027S0896-6273(13)00931-8 [pii] (2013). 

35 Craddock, R. C. et al. Imaging human connectomes at the macroscale. Nature methods 10, 524-539, doi:10.1038/nmeth.2482nmeth.2482 [pii] (2013). 

36 Sporns, O. The human connectome: origins and challenges. Neuroimage 80, 53-61, doi:10.1016/j.neuroimage.2013.03.023S1053-8119(13)00265-6 [pii] (2013). 

37 Sotiropoulos, S. N. et al. Advances in diffusion MRI acquisition and processing in the Human Connectome Project. Neuroimage 80, 125-143, doi:10.1016/j.neuroimage.2013.05.057S1053-8119(13)00551-X [pii] (2013). 

38 Smith, S. M. et al. Functional connectomics from resting-state fMRI. Trends Cogn Sci 17, 666-682, doi:10.1016/j.tics.2013.09.016S1364-6613(13)00220-9 [pii] (2013). 

39 Osten, P. & Margrie, T. W. Mapping brain circuitry with a light microscope. Nature methods 10, 515-523, doi:10.1038/nmeth.2477nmeth.2477 [pii] (2013). 

40 Huang, Z. J. & Zeng, H. Genetic approaches to neural circuits in the mouse. Annu Rev Neurosci 36, 183-215, doi:10.1146/annurev-neuro-062012-170307 (2013). 

41 Livet, J. et al. Transgenic strategies for combinatorial expression of fluorescent proteins in the nervous system. Nature 450, 56-62, doi:nature06293 [pii]10.1038/nature06293 (2007). 

42 Cai, D., Cohen, K. B., Luo, T., Lichtman, J. W. & Sanes, J. R. Improved tools for the Brainbow toolbox. Nature methods 10, 540-547 (2013). 

43 Loulier, K. et al. Multiplex cell and lineage tracking with combinatorial labels. Neuron 81, 505-520, doi:10.1016/j.neuron.2013.12.016S0896-6273(13)01177-X [pii] (2014). 

44 Osakada, F. et al. New rabies virus variants for monitoring and manipulating activity and gene expression in defined neural circuits. Neuron 71, 617-631, doi:10.1016/j.neuron.2011.07.005S0896-6273(11)00600-3 [pii] (2011). 

45 Feinberg, E. H. et al. GFP Reconstitution Across Synaptic Partners (GRASP) defines cell contacts and synapses in living nervous systems. Neuron 57, 353-363, doi:10.1016/j.neuron.2007.11.030S0896-6273(07)01020-3 [pii] (2008). 

46 Kim, J. et al. mGRASP enables mapping mammalian synaptic connectivity with light microscopy. Nature methods 9, 96-102, doi:10.1038/nmeth.1784nmeth.1784 [pii] (2012). 

47 Susaki, E. A. et al. Whole-brain imaging with single-cell resolution using chemical cocktails and computational analysis. Cell 157, 726-739, doi:10.1016/j.cell.2014.03.042S0092-8674(14)00418-8 [pii] (2014). 

48 Hama, H. et al. Scale: a chemical approach for fluorescence imaging and reconstruction of transparent mouse brain. Nature neuroscience 14, 1481-1488, doi:10.1038/nn.2928nn.2928 [pii] (2011). 

49 Chung, K. et al. Structural and molecular interrogation of intact biological systems. Nature 497, 332-337, doi:10.1038/nature12107nature12107 [pii] (2013). 

50 Ke, M. T., Fujimoto, S. & Imai, T. SeeDB: a simple and morphology-preserving optical clearing agent for neuronal circuit reconstruction. Nature neuroscience 16, 1154-1161, doi:10.1038/nn.3447nn.3447 [pii] (2013). 

51 White, J. G., Southgate, E., Thomson, J. N. & Brenner, S. The structure of the nervous system of the nematode Caenorhabditis elegans. Philos Trans R Soc Lond B Biol Sci 314, 1-340 (1986). 

52 Helmstaedter, M. et al. Connectomic reconstruction of the inner plexiform layer in the mouse retina. Nature 500, 168-174, doi:10.1038/nature12346nature12346 [pii] (2013). 

53 Takemura, S. Y. et al. A visual motion detection circuit suggested by Drosophila connectomics. Nature 500, 175-181, doi:10.1038/nature12450nature12450 [pii] (2013). 

54 Marc, R. E. et al. Retinal connectomics: towards complete, accurate networks. Prog Retin Eye Res 37, 141-162, doi:10.1016/j.preteyeres.2013.08.002S1350-9462(13)00055-4 [pii] (2013). 

55 Briggman, K. L., Helmstaedter, M. & Denk, W. Wiring specificity in the direction-selectivity circuit of the retina. Nature 471, 183-188, doi:10.1038/nature09818nature09818 [pii] (2011). 

56 Bock, D. D. et al. Network anatomy and in vivo physiology of visual cortical neurons. Nature 471, 177-182, doi:10.1038/nature09802nature09802 [pii] (2011). 

57 Anderson, J. R. et al. Exploring the retinal connectome. Mol Vis 17, 355-379, doi:41 [pii] (2011). 

58 Helmstaedter, M. & Mitra, P. P. Computational methods and challenges for large-scale circuit mapping. Curr Opin Neurobiol 22, 162-169, doi:10.1016/j.conb.2011.11.010S0959-4388(11)00213-3 [pii] (2012). 

59 Plaza, S. M., Scheffer, L. K. & Chklovskii, D. B. Toward large-scale connectome reconstructions. Curr Opin Neurobiol 25C, 201-210, doi:S0959-4388(14)00035-X [pii]10.1016/j.conb.2014.01.019 (2014). 

60 Lichtman, J. W. & Sanes, J. R. Ome sweet ome: what can the genome tell us about the connectome? Curr Opin Neurobiol 18, 346-353, doi:10.1016/j.conb.2008.08.010S0959-4388(08)00083-4 [pii] (2008). 

61 Morgan, J. L. & Lichtman, J. W. Why not connectomics? Nature methods 10, 494-500, doi:10.1038/nmeth.2480nmeth.2480 [pii] (2013). 

62 Huang, B., Bates, M. & Zhuang, X. Super-resolution fluorescence microscopy. Annu Rev Biochem 78, 993-1016, doi:10.1146/annurev.biochem.77.061906.092014 (2009). 

63 Maglione, M. & Sigrist, S. J. Seeing the forest tree by tree: super-resolution light microscopy meets the neurosciences. Nature neuroscience 16, 790-797, doi:10.1038/nn.3403nn.3403 [pii] (2013). 

64 Tufail, Y. et al. Transcranial pulsed ultrasound stimulates intact brain circuits. Neuron 66, 681-694, doi:10.1016/j.neuron.2010.05.008S0896-6273(10)00376-4 [pii] (2010). 

65 Menz, M. D., Oralkan, O., Khuri-Yakub, P. T. & Baccus, S. A. Precise neural stimulation in the retina using focused ultrasound. J Neurosci 33, 4550-4560, doi:10.1523/JNEUROSCI.3521-12.201333/10/4550 [pii] (2013). 

66 Legon, W. et al. Transcranial focused ultrasound modulates the activity of primary somatosensory cortex in humans. Nature neuroscience 17, 322-329, doi:10.1038/nn.3620nn.3620 [pii] (2014). 

67 McNaughton, B. L., O'Keefe, J. & Barnes, C. A. The stereotrode: a new technique for simultaneous isolation of several single units in the central nervous system from multiple unit records. J Neurosci Methods 8, 391-397 (1983). 

68 Gray, C. M., Maldonado, P. E., Wilson, M. & McNaughton, B. Tetrodes markedly improve the reliability and yield of multiple single-unit isolation from multi-unit recordings in cat striate cortex. J Neurosci Methods 63, 43-54 (1995). 

69 Ludwig, K. A., Uram, J. D., Yang, J., Martin, D. C. & Kipke, D. R. Chronic neural recordings using silicon microelectrode arrays electrochemically deposited with a poly(3,4-ethylenedioxythiophene) (PEDOT) film. J Neural Eng 3, 59-70, doi:S1741-2560(06)12912-2 [pii]10.1088/1741-2560/3/1/007 (2006). 

70 Vandecasteele, M. et al. Large-scale recording of neurons by movable silicon probes in behaving rodents. J Vis Exp, e3568, doi:10.3791/35683568 [pii] (2012). 

71 Wark, H. A. et al. A new high-density (25 electrodes/mm(2)) penetrating microelectrode array for recording and stimulating sub-millimeter neuroanatomical structures. J Neural Eng 10, 045003, doi:10.1088/1741-2560/10/4/045003 (2013). 

72 Dombeck, D. A., Khabbaz, A. N., Collman, F., Adelman, T. L. & Tank, D. W. Imaging large-scale neural activity with cellular resolution in awake, mobile mice. Neuron 56, 43-57, doi:S0896-6273(07)00614-9 [pii]10.1016/j.neuron.2007.08.003 (2007). 

73 Gobel, W. & Helmchen, F. In vivo calcium imaging of neural network function. Physiology (Bethesda) 22, 358-365, doi:22/6/358 [pii]10.1152/physiol.00032.2007 (2007). 

74 Ziv, Y. et al. Long-term dynamics of CA1 hippocampal place codes. Nature neuroscience 16, 264-266, doi:10.1038/nn.3329nn.3329 [pii] (2013). 

75 Ahrens, M. B. et al. Brain-wide neuronal dynamics during motor adaptation in zebrafish. Nature 485, 471-477, doi:10.1038/nature11057nature11057 [pii] (2012). 

76 Kipke, D. R., Vetter, R. J., Williams, J. C. & Hetke, J. F. Silicon-substrate intracortical microelectrode arrays for long-term recording of neuronal spike activity in cerebral cortex. IEEE Trans Neural Syst Rehabil Eng 11, 151-155, doi:10.1109/TNSRE.2003.814443 (2003). 

77 Hoogerwerf, A. C. & Wise, K. D. A three-dimensional microelectrode array for chronic neural recording. IEEE Trans Biomed Eng 41, 1136-1146, doi:10.1109/10.335862 (1994). 

78 Ohki, K., Chung, S., Ch'ng, Y. H., Kara, P. & Reid, R. C. Functional imaging with cellular resolution reveals precise micro-architecture in visual cortex. Nature 433, 597-603, doi:nature03274 [pii]10.1038/nature03274 (2005). 

79 Patolsky, F. et al. Detection, stimulation, and inhibition of neuronal signals with high-density nanowire transistor arrays. Science 313, 1100-1104, doi:313/5790/1100 [pii]10.1126/science.1128640 (2006). 

80 Robinson, J. T. et al. Vertical nanowire electrode arrays as a scalable platform for intracellular interfacing to neuronal circuits. Nat Nanotechnol 7, 180-184, doi:10.1038/nnano.2011.249nnano.2011.249 [pii] (2012). 

81 Kodandaramaiah, S. B., Franzesi, G. T., Chow, B. Y., Boyden, E. S. & Forest, C. R. Automated whole-cell patch-clamp electrophysiology of neurons in vivo. Nature methods 9, 585-587, doi:10.1038/nmeth.1993nmeth.1993 [pii] (2012). 

82 Kralj, J. M., Douglass, A. D., Hochbaum, D. R., Maclaurin, D. & Cohen, A. E. Optical recording of action potentials in mammalian neurons using a microbial rhodopsin. Nature methods 9, 90-95, doi:10.1038/nmeth.1782nmeth.1782 [pii] (2012). 

83 Cao, G. et al. Genetically targeted optical electrophysiology in intact neural circuits. Cell 154, 904-913, doi:10.1016/j.cell.2013.07.027S0092-8674(13)00898-2 [pii] (2013). 

84 Marshall, J. D. & Schnitzer, M. J. Optical strategies for sensing neuronal voltage using quantum dots and other semiconductor nanocrystals. ACS Nano 7, 4601-4609, doi:10.1021/nn401410k (2013). 

85 Harvey, C. D., Collman, F., Dombeck, D. A. & Tank, D. W. Intracellular dynamics of hippocampal place cells during virtual navigation. Nature 461, 941-946, doi:10.1038/nature08499nature08499 [pii] (2009). 

86 Scott, B. B., Brody, C. D. & Tank, D. W. Cellular resolution functional imaging in behaving rats using voluntary head restraint. Neuron 80, 371-384, doi:10.1016/j.neuron.2013.08.002S0896-6273(13)00712-5 [pii] (2013). 

87 Kim, S. et al. Integrated wireless neural interface based on the Utah electrode array. Biomed Microdevices 11, 453-466, doi:10.1007/s10544-008-9251-y (2009). 

88 Maynard, E. M., Nordhausen, C. T. & Normann, R. A. The Utah intracortical Electrode Array: a recording structure for potential brain-computer interfaces. Electroencephalogr Clin Neurophysiol 102, 228-239, doi:S0013469496951760 [pii] (1997). 

89 Olds, J. & Milner, P. Positive reinforcement produced by electrical stimulation of septal area and other regions of rat brain. J Comp Physiol Psychol 47, 419-427 (1954). 

90 Penfield, W. & Perot, P. The Brain's Record of Auditory and Visual Experience. A Final Summary and Discussion. Brain 86, 595-696 (1963). 

91 Corbett, D. & Wise, R. A. Intracranial self-stimulation in relation to the ascending dopaminergic systems of the midbrain: a moveable electrode mapping study. Brain Res 185, 1-15, doi:0006-8993(80)90666-6 [pii] (1980). 

92 Salzman, C. D., Britten, K. H. & Newsome, W. T. Cortical microstimulation influences perceptual judgements of motion direction. Nature 346, 174-177, doi:10.1038/346174a0 (1990). 

93 Holtzheimer, P. E. & Mayberg, H. S. Deep brain stimulation for psychiatric disorders. Annu Rev Neurosci 34, 289-307, doi:10.1146/annurev-neuro-061010-113638 (2011). 

94 Fenno, L., Yizhar, O. & Deisseroth, K. The development and application of optogenetics. Annu Rev Neurosci 34, 389-412, doi:10.1146/annurev-neuro-061010-113817 (2011). 

95 Packer, A. M., Roska, B. & Hausser, M. Targeting neurons and photons for optogenetics. Nature neuroscience 16, 805-815, doi:10.1038/nn.3427nn.3427 [pii] (2013). 

96 Deisseroth, K. Circuit dynamics of adaptive and maladaptive behaviour. Nature 505, 309-317, doi:10.1038/nature12982nature12982 [pii] (2014). 

97 Konermann, S. et al. Optical control of mammalian endogenous transcription and epigenetic states. Nature 500, 472-476, doi:10.1038/nature12466nature12466 [pii] (2013). 

98 Farrell, M. S. & Roth, B. L. Pharmacosynthetics: Reimagining the pharmacogenetic approach. Brain Res 1511, 6-20, doi:10.1016/j.brainres.2012.09.043S0006-8993(12)01604-6 [pii] (2013). 

99 Terlau, H. & Olivera, B. M. Conus venoms: a rich source of novel ion channel-targeted peptides. Physiol Rev 84, 41-68, doi:10.1152/physrev.00020.200384/1/41 [pii] (2004). 

100 Julius, D. & Basbaum, A. I. Molecular mechanisms of nociception. Nature 413, 203-210, doi:10.1038/3509301935093019 [pii] (2001). 

101 De Meyer, T., Muyldermans, S. & Depicker, A. Nanobody-based products as research and diagnostic tools. Trends in biotechnology 32, 263-270, doi:S0167-7799(14)00041-9 [pii]10.1016/j.tibtech.2014.03.001 (2014). 

102 Bell, R. D. & Ehlers, M. D. Breaching the blood-brain barrier for drug delivery. Neuron 81, 1-3, doi:10.1016/j.neuron.2013.12.023S0896-6273(13)01184-7 [pii] (2014). 

103 Cong, L. et al. Multiplex genome engineering using CRISPR/Cas systems. Science 339, 819-823, doi:10.1126/science.1231143science.1231143 [pii] (2013). 

104 McNab, J. A. et al. The Human Connectome Project and beyond: initial applications of 300 mT/m gradients. Neuroimage 80, 234-245, doi:10.1016/j.neuroimage.2013.05.074S1053-8119(13)00574-0 [pii] (2013). 

105 Smith, S. M. et al. Resting-state fMRI in the Human Connectome Project. Neuroimage 80, 144-168, doi:10.1016/j.neuroimage.2013.05.039S1053-8119(13)00533-8 [pii] (2013). 

106 Vogelstein, J. T. et al. Discovery of brainwide neural-behavioral maps via multiscale unsupervised structure learning. Science 344, 386-392, doi:10.1126/science.1250298science.1250298 [pii] (2014). 

107 Kass, R. E., Eden, U. T. & Brown, E. N. Analysis of Neural Data. (Springer, 2014). 

108 Rieke, F., Warland, D., de Ruyter van Steveninck, R. & Bialek, W. Spikes: Exploring the Neural Code. (MIT Press, 1999). 

109 Churchland, P. S., Sejnowski, T. J. The Computational Brain. (MIT Press, 1992). 

110 Dayan, P. & Abbott, L. F. Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. (MIT Press, 2001). 

111 Pillow, J. W. et al. Spatio-temporal correlations and visual signalling in a complete neuronal population. Nature 454, 995-999, doi:10.1038/nature07140nature07140 [pii] (2008). 

112 Tukey, J. W. Exploratory Data Analysis. (Addison-Wesley, 1977). 

113 Hodgkin, A. L. & Huxley, A. F. A quantitative description of membrane current and its application to conduction and excitation in nerve. J Physiol 117, 500-544 (1952). 

114 Nagumo, J. & Sato, S. On a response characteristic of a mathematical neuron model. Kybernetik 10, 155-164 (1972). 

115 Kepler, T. B., Abbott, L. F. & Marder, E. Reduction of conductance-based neuron models. Biol. Cybern. 66, 381-387 (1992). 

116 Pinsky, P. F. & Rinzel, J. Intrinsic and network rhythmogenesis in a reduced Traub model for CA3 neurons. Journal of computational neuroscience 1, 39-60 (1994). 

117 Brown, E. N., Frank, L. M., Tang, D., Quirk, M. C. & Wilson, M. A. A statistical paradigm for neural spike train decoding applied to position prediction from ensemble firing patterns of rat hippocampal place cells. J Neurosci 18, 7411-7425 (1998). 

118 Ganguli, S. & Sompolinsky, H. Compressed sensing, sparsity, and dimensionality in neuronal information processing and data analysis. Annu Rev Neurosci 35, 485-508, doi:10.1146/annurev-neuro-062111-150410 (2012). 

119 Bengio, Y., Courville, A., Vincent, P. Representation Learning: A Review and New Perspectives. arXiv:1206.5538 (2012). 

120 Koller, D. & Friedman, N. Probabilistic Graphical Models: Principles and Techniques. (MIT Press, 2009). 

121 Ba, D., Temereanca, S. & Brown, E. N. Algorithms for the analysis of ensemble neural spiking activity using simultaneous-event multivariate point-process models. Front Comput Neurosci 8, 6, doi:10.3389/fncom.2014.00006 (2014). 

122 Steriade, M., Gloor, P., Llinas, R. R., Lopes de Silva, F. H. & Mesulam, M. M. Report of IFCN Committee on Basic Mechanisms. Basic mechanisms of cerebral rhythmic activities. Electroencephalogr Clin Neurophysiol 76, 481-508 (1990). 

123 Vul, E., Harris, C., Winkielman, P. & Pashler, H. Puzzlingly High Correlations in fMRI Studies of Emotion, Personality, and Social Cognition. Perspect Psychol Sci 4, 274-290, doi:DOI 10.1111/j.1745-6924.2009.01125.x (2009). 

124 Sutton, R. S. & Barto, A. G. Reinforcement Learning: An Introduction. (MIT Press, 1998). 

125 Sejnowski, T. J., Poizner, H., Lynch, G., Gepshtein, S., Greenspan, R. Prospective Optimization, Proceedings of the IEEE. 102, 799-811 (2014). 

126 Viskontas, I. V., Quiroga, R. Q. & Fried, I. Human medial temporal lobe neurons respond preferentially to personally relevant images. Proceedings of the National Academy of Sciences of the United States of America 106, 21329-21334, doi:10.1073/pnas.09023191060902319106 [pii] (2009). 

127 Hochberg, L. R. et al. Reach and grasp by people with tetraplegia using a neurally controlled robotic arm. Nature 485, 372-375, doi:10.1038/nature11076nature11076 [pii] (2012). 

128 Svirsky, M. A., Robbins, A. M., Kirk, K. I., Pisoni, D. B. & Miyamoto, R. T. Language development in profoundly deaf children with cochlear implants. Psychol Sci 11, 153-158 (2000). 

129 Deisseroth, K. & Schnitzer, M. J. Engineering approaches to illuminating brain structure and dynamics. Neuron 80, 568-577, doi:10.1016/j.neuron.2013.10.032S0896-6273(13)00936-7 [pii] (2013). 

130 Koch, C. & Reid, R. C. Neuroscience: Observatories of the mind. Nature 483, 397-398, doi:10.1038/483397a483397a [pii] (2012). 

131 Jenett, A. et al. A GAL4-driver line resource for Drosophila neurobiology. Cell reports 2, 991-1001, doi:10.1016/j.celrep.2012.09.011S2211-1247(12)00292-6 [pii] (2012). 

132 Akil, H., Martone, M. E. & Van Essen, D. C. Challenges and opportunities in mining neuroscience data. Science 331, 708-712, doi:10.1126/science.1199305331/6018/708 [pii] (2011). 

133 Mountain, M. Commentary: Flattening the astronomy world. Physics Today 67, 8-10 (2014). 

134 Illes, J. Neuroethics: Defining the issues in theory, practice and policy. (Oxford University Press, 2005). 

APPENDIX A – HOW THE BRAIN INITIATIVE WILL ADVANCE CLINICAL RESEARCH

The burden of human brain disorders is sobering, and there is an urgent need for better approaches to treat and cure these conditions. As an example, over 5 million Americans currently suffer from Alzheimer’s disease. The cost of caring for these stricken individuals is over $200 billion per year, including $150 billion from Medicare and Medicaid. Great strides have been made in uncovering the genetic risk factors that predispose to Alzheimer’s disease, but in current medical practice we lack even a single effective drug for reversing its symptoms, much less curing it; we can only slow the rate of decline. 

Similarly, we lack effective treatments for many psychiatric disorders. An estimated 2.4 million Americans suffer from schizophrenia, a terrible, life-shortening mental illness for which existing medications are only partly effective. Despite 20 years of intense effort by the pharmaceutical industry, no fundamentally new drug for schizophrenia has been developed. 

The classical approach to brain disorders has been to focus on each disorder in detail, examining its symptoms, pathology, causes, and responses to interventions. In some cases, this focused approach has produced great successes; one example is the development of L-dopa as a treatment for Parkinson’s disease, which followed the realization that dopaminergic neurons degenerate and die in this disease. However, in many other cases, we do not understand the symptoms or causes, and even when we do, we have not been able to develop effective interventions. The early successes with Parkinson’s disease have been replicated too few times. 

One reason that there are not better drugs and treatments for brain disorders is that we do not know enough about the brain. Our fragmentary understanding of brain functions means that we are stumbling in the dark when we attempt to treat patients with brain disorders. Neuroscience provides a way to turn on the lights: a better understanding of all brain circuits and networks, and their relationships with one another, has the potential to shed light on disorders of brain development, function, and aging. Whether brain disorders result from injury, or from environmental and genetic interactions, they affect the same organ. Until we understand what the brain does and how it does it, we will lack the judgment to repair or assist it when it malfunctions. 

Even brain disorders that have molecular or cellular causes are ultimately expressed at the levels of circuits and networks. This is true both for neurological and for psychiatric disorders. For example, specific ion channel mutations can lead to seizure disorders (epilepsy), but these effects express themselves as a disruption of the ratio of excitation to inhibition that leads to widespread, aberrant neural activity. Rectifying the activity state may prove to be as important and useful as correcting the aberrant channels per se. Similarly, Parkinsonism begins with the degeneration of dopaminergic neurons, but this degeneration disrupts the entire motor-control circuits in which these neurons are embedded, and can, to some extent, be mitigated by targeting remote brain sites. Psychiatric disorders can also have molecular or environmental triggers, but their symptoms arise from defects in the circuits that control cognition, emotion, and motivation. Psychiatric disorders in particular have waxing and waning trajectories suggestive of abnormal network states. Electroconvulsive therapy is one of the fastest-acting and most effective treatments for depression; the current exploration of deep brain stimulation for depression represents the search for a more focal and rationally designed circuit-based intervention. A deep knowledge of brain circuits will provide a foundation for understanding all brain disorders from the perspective of whole-brain function, and this in turn will suggest new approaches to their treatment. 
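
To make the excitation/inhibition argument concrete, the following sketch (not drawn from the report; every parameter is illustrative) simulates a minimal two-population firing-rate model in which weakening the inhibitory weight onto excitatory cells shifts the network from stable, low firing to saturated, seizure-like activity. It is meant only to illustrate why restoring the activity state, and not only the mutated channel, is a plausible therapeutic target.

```python
import numpy as np

def gain(x):
    """Saturating gain function with an arbitrary threshold of 4 (illustrative only)."""
    return 1.0 / (1.0 + np.exp(-(x - 4.0)))

def simulate_ei_network(inhibition_scale=1.0, steps=2000, dt=0.001):
    """Toy two-population (E, I) firing-rate model; all weights are hypothetical."""
    w_ee, w_ei = 16.0, 12.0                      # excitatory weights onto E and I
    w_ie = 15.0 * inhibition_scale               # inhibitory weight onto E; scale < 1 weakens it
    w_ii = 3.0                                   # inhibitory weight onto I
    tau_e, tau_i = 0.02, 0.01                    # time constants in seconds
    r_e, r_i = 0.1, 0.1
    trace = []
    for _ in range(steps):
        drive_e = w_ee * r_e - w_ie * r_i + 1.5  # constant external drive to E
        drive_i = w_ei * r_e - w_ii * r_i
        r_e += dt / tau_e * (-r_e + gain(drive_e))
        r_i += dt / tau_i * (-r_i + gain(drive_i))
        trace.append(r_e)
    return np.array(trace)

healthy = simulate_ei_network(inhibition_scale=1.0)
disinhibited = simulate_ei_network(inhibition_scale=0.3)  # e.g., loss of inhibitory channel function
print(f"mean E-rate, intact inhibition:   {healthy[-500:].mean():.2f}")
print(f"mean E-rate, weakened inhibition: {disinhibited[-500:].mean():.2f}")
```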

The technologies to be developed by The BRAIN Initiative® will also motivate new clinical applications. The BRAIN Initiative® should develop ways to target particular cell types with chemical or genetic agents, providing precise and local control of therapeutic delivery. The BRAIN Initiative® should increase the power of structural and functional imaging and recording methods for the human brain. It should lead to a new generation of implantable devices. 

Neuromodulation to restore circuits in neurological and psychiatric disorders: The use of implantable devices as clinical interventions began with the electrical pacemakers that regularize aberrant rhythms of the heart 1. Implantable devices in the brain are the basis of deep brain stimulation (DBS), which treats brain disorders by modulating dysfunctional circuits with electrical current. Basic science studies that explained how clinical symptoms emerge from basal ganglia circuit imbalance led to the effective use of DBS to overcome rigidity, slowness, and tremor in Parkinsonian patients who could not use L-dopa 2, and DBS has also been successful in other movement disorders 3. Understanding the circuits involved in psychiatric disorders may help to develop effective DBS strategies for their treatment, as is already beginning for depression 4. It may even be possible to intervene in memory disorders such as Alzheimer’s disease, at least in their early stages, by stimulating the disabled circuits 5. More successful implementation of DBS should arise from the knowledge of connections and circuit function that emerges from BRAIN Initiative® research. 

Epilepsy prevention and intervention: Technologies developed under The BRAIN Initiative® should also be valuable in the treatment of epilepsy, where there is particular promise in devices that can both record and stimulate brain activity. Many patients with intractable seizures are treated by irreversible surgical ablations, but a reversible, tunable intervention would clearly be far more desirable. We can envision sensitive implanted measurement devices that detect an imminent seizure event and immediately modulate the circuit to restore normal function before the seizure occurs. Indeed, a first version of such a fully implantable system has been approved for human use by the FDA. A better understanding of the complex interactions of excitatory and inhibitory cell types in brain circuits should enable advances in these and other forms of seizure prevention. 
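
The closed-loop logic such a responsive device would implement, detect an abnormal pattern and then stimulate immediately, can be sketched as follows. The simulated signal, the band-power detector, and the threshold are hypothetical placeholders chosen for illustration, not the algorithm of any approved system.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                       # sampling rate in Hz (hypothetical)
t = np.arange(0, 20, 1 / fs)   # 20 s of simulated local field potential

# Background noise plus a large-amplitude oscillatory burst standing in for pre-seizure activity.
signal = rng.normal(0, 1, t.size)
burst = (t > 12) & (t < 14)
signal[burst] += 8 * np.sin(2 * np.pi * 6 * t[burst])

def moving_power(x, window):
    """Mean squared amplitude over a sliding window."""
    kernel = np.ones(window) / window
    return np.convolve(x ** 2, kernel, mode="same")

power = moving_power(signal, window=fs // 2)   # 0.5 s window
threshold = 5 * np.median(power)               # crude adaptive threshold

stimulating = False
for i, p in enumerate(power):
    if p > threshold and not stimulating:
        stimulating = True                     # placeholder for triggering a stimulation pulse train
        print(f"abnormal activity detected at t = {t[i]:.2f} s -> deliver stimulation")
    elif p < threshold and stimulating:
        stimulating = False
        print(f"activity back below threshold at t = {t[i]:.2f} s -> stop stimulation")
```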

Restoration of lost sensory functions: The cochlear prosthesis, implanted in more than 200,000 people, is an effective treatment for profound deafness 6 and a compelling model demonstrating how strong underlying science and medicine can lead to commercial devices for human use 7. Devices to restore vision for those with profound blindness are at an early stage of technological development, with one system recently approved for human use by the FDA. By understanding how the neurons in the retina and brain process streams of information from the visual world, it should be possible to devise intelligent retinal or cortical prostheses or other devices to restore sight. Active work is already underway on electrical stimulation of the retina 8, the optic nerve 9, the lateral geniculate nucleus 10, and the primary visual cortex 11. New technologies such as optogenetics may provide other ways to restore lost vision. 
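
One way to picture what an “intelligent” retinal prosthesis must do is as a translation from a camera image, through retina-like preprocessing, to a coarse pattern of per-electrode stimulation levels. The sketch below is purely illustrative; the center-surround filter, electrode grid size, and scaling are invented for the example.

```python
import numpy as np

def center_surround(image):
    """Very rough difference-of-local-means filter standing in for retinal preprocessing."""
    padded = np.pad(image, 1, mode="edge")
    local_mean = sum(
        padded[i:i + image.shape[0], j:j + image.shape[1]]
        for i in range(3) for j in range(3)
    ) / 9.0
    return 1.5 * image - local_mean

def to_electrode_grid(image, grid=(6, 10)):
    """Average the filtered image into one value per electrode, rescaled to 0..1 stimulation levels."""
    h, w = image.shape
    gh, gw = grid
    blocks = image[: h - h % gh, : w - w % gw].reshape(gh, h // gh, gw, w // gw)
    levels = blocks.mean(axis=(1, 3))
    levels -= levels.min()
    return levels / (levels.max() + 1e-9)

rng = np.random.default_rng(1)
camera_frame = rng.random((60, 100))             # stand-in for a grayscale camera frame
stim_pattern = to_electrode_grid(center_surround(camera_frame))
print(stim_pattern.round(2))                     # 6 x 10 array of per-electrode stimulation levels
```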

Brain and peripheral nerve interfaces to restore movement after paralysis or amputation: Brain-computer interfaces (BCIs) are a technological approach to restoring volitional control of external devices to patients paralyzed by injuries or strokes, which impact hundreds of thousands of Americans 11 with lifelong disability and a need for supportive care. BCIs have the aim of (a) accessing volitional intent, by recording from surgically implanted microelectrodes in the brain, (b) translating the ‘neural code’ representing intent into a specific command signal, and (c) coupling that command either to an assistive device such as a robotic arm or computer, or to electrodes directly implanted in peripheral nerves 12. BCIs have had a few promising successes, but the speed, reliability, and dexterity of control are still in their infancy. A better understanding of the brain’s code for targeted movement has the potential to make this experimental technology a practical reality. Another kind of brain-device interface is represented by aspirin-sized microelectrode arrays, implanted into the severed nerves of amputees, that can record volitional motor command signals and use them for closed-loop control of actuators in prototype prosthetic devices. New insights into movement circuits may help develop such devices to the level that prosthetic devices are eventually regarded as ‘self’. 
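
Step (b) of this pipeline, translating recorded activity into a command signal, is often prototyped as a regression from binned spike counts to intended movement. The sketch below fits a simple linear decoder to synthetic data; the channel count, tuning model, and decoder are assumptions for illustration rather than a description of any specific BCI system.

```python
import numpy as np

rng = np.random.default_rng(2)
n_neurons, n_bins = 96, 5000           # e.g., a 96-channel array and 100-ms bins (hypothetical)

# Synthetic "ground truth": 2-D intended cursor velocity drives Poisson spike counts.
intent = rng.normal(0, 1, (n_bins, 2))                     # (vx, vy) per time bin
tuning = rng.normal(0, 1, (2, n_neurons))                  # each neuron's preferred direction
rates = np.exp(0.3 * intent @ tuning + 1.0)                # log-linear tuning
spikes = rng.poisson(rates)                                # binned spike counts

# Fit a linear decoder by least squares on the first half, evaluate on the second half.
split = n_bins // 2
X_train = np.hstack([spikes[:split], np.ones((split, 1))])          # add a bias column
W, *_ = np.linalg.lstsq(X_train, intent[:split], rcond=None)

X_test = np.hstack([spikes[split:], np.ones((n_bins - split, 1))])
predicted = X_test @ W
corr = [np.corrcoef(predicted[:, k], intent[split:, k])[0, 1] for k in range(2)]
print(f"decoded-vs-intended velocity correlation: vx={corr[0]:.2f}, vy={corr[1]:.2f}")
```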

New brain imaging methods for diagnosis of brain disorders and evaluation of treatments: Human brain disorders are notorious for their complex symptoms and trajectories: it can be difficult even for a skilled clinician to assess the course of an illness or the signs that a treatment has succeeded. The tools developed under The BRAIN Initiative® provide approaches to understanding the circuit causes of these disorders as well as the effects of treatments. For example, certain patterns of brain activity, such as high activity of the subcallosal cingulate gyrus, have been associated with depressed states in patients, and can resolve when depression begins to lift. This marker can help to assess the effects of common therapeutic agents such as SSRIs (selective serotonin reuptake inhibitors), which can take weeks to work. Improvements in human brain imaging through The BRAIN Initiative® may extend this principle by allowing the discovery of biomarkers of different brain disorders. Moreover, improved imaging and recording methods may help address the heterogeneity of brain disorders by identifying brain activity changes associated with disease states in a single individual, creating a truly personalized brain medicine. Other BRAIN Initiative technologies may also be deployed for use in patients. For example, combining functional brain imaging with improved versions of perturbation technologies such as transcranial magnetic stimulation could be used to assess altered functional connectivity in patients with brain disorders, allowing more precise targeting of therapies to functionally compromised pathways. 
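
The altered functional connectivity such methods would assess is commonly operationalized as the correlation between regional activity time series. A minimal sketch on simulated data follows; the regions, coupling strengths, and scan length are invented for illustration and carry no clinical meaning.

```python
import numpy as np

rng = np.random.default_rng(3)
n_timepoints = 300                      # e.g., fMRI volumes in one scan (hypothetical)

def simulate_scan(coupling):
    """Two regional time series sharing a common signal with the given coupling strength."""
    shared = rng.normal(0, 1, n_timepoints)
    seed_region = shared + 0.5 * rng.normal(0, 1, n_timepoints)           # e.g., a stimulation target
    affected_region = coupling * shared + rng.normal(0, 1, n_timepoints)  # e.g., a symptom-related region
    return seed_region, affected_region

def functional_connectivity(a, b):
    """Pearson correlation between two regional time series."""
    return np.corrcoef(a, b)[0, 1]

pre_treatment = functional_connectivity(*simulate_scan(coupling=0.2))
post_treatment = functional_connectivity(*simulate_scan(coupling=0.9))
print(f"seed-to-region connectivity before treatment: {pre_treatment:.2f}")
print(f"seed-to-region connectivity after treatment:  {post_treatment:.2f}")
```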

New pharmaceutical treatments based on cellular and circuit targets: Most effective drugs target several molecules and many or all brain regions. Many drugs are effective only in an unpredictable subset of affected patients. Many drugs that could be successful in treating human brain disorders are discarded because of side effects that compromise their usefulness. Targeting pharmacological agents to a specific brain region or cell population may become possible when there are molecular markers for those regions or cells, one of the goals of The BRAIN Initiative®. For example, hybrid delivery systems in which a drug is coupled to an antibody that recognizes a particular cell population may effectively deliver the drug while minimizing its level in other locations. 

New approaches to linking genetic risk factors to brain function: The power of human genomics has led to the identification of many single-nucleotide polymorphisms (SNPs) that increase the risk of brain disorders such as schizophrenia, autism, bipolar disorder, and mental retardation. Most of these SNPs fall outside the coding regions of the genes, suggesting that they affect gene regulation, but the affected cell types and regions are unknown. As a result, the relationship between gene and disease processes is unclear. The BRAIN Initiative® will provide information about brain disorders that complements this genetic information. First, the identification of the cell types in the human brain, and their gene expression patterns, should help to identify the cellular site of action of some genes important in brain disorders. An elegant, perhaps extreme example of cell type specificity is the 20,000 or so neurons in the human hypothalamus that produce the neuropeptide orexin/hypocretin; the immune destruction of this tiny population of neurons leads to narcolepsy/cataplexy, a systemic human sleep disorder. At this point we do not know how many such specialized categories of neuronal cells remain to be discovered. At the other extreme, many genes near risk-associated SNPs are expressed very broadly in the brain, raising questions about how they predispose to selective cognitive and emotional disorders. Examining brain activity in patients bearing these SNPs can help pinpoint the affected brain systems, pointing investigation toward cell types in which the SNPs may be active. 
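
A concrete use of cell-type expression data in this setting is to ask which cell types most strongly express the genes implicated by risk-associated SNPs. The sketch below ranks cell types by a simple enrichment score; the gene names, cell types, and expression values are made-up placeholders rather than real data.

```python
import numpy as np

# Hypothetical cell-type x gene expression matrix (rows: cell types, columns: genes).
cell_types = ["cortical_pyramidal", "pv_interneuron", "astrocyte", "microglia"]
genes = ["GENE_A", "GENE_B", "GENE_C", "GENE_D", "GENE_E"]
expression = np.array([
    [8.0, 1.0, 0.5, 6.0, 0.2],
    [1.0, 7.5, 0.4, 5.5, 0.1],
    [0.3, 0.2, 6.0, 0.5, 0.2],
    [0.2, 0.1, 0.3, 0.4, 7.0],
])

# Genes located near disease-associated SNPs (placeholder list).
risk_genes = ["GENE_A", "GENE_D"]
risk_idx = [genes.index(g) for g in risk_genes]

# Score each cell type by its mean expression of the risk-gene set, relative to all genes.
risk_score = expression[:, risk_idx].mean(axis=1) / expression.mean(axis=1)
for cell_type, score in sorted(zip(cell_types, risk_score), key=lambda x: -x[1]):
    print(f"{cell_type:20s} enrichment score: {score:.2f}")
```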

A Call to Action

The path from science to therapeutic treatments is arduous and slow, a long battle. However, biomedical science can point to notable successes. Studies in the 1970s and 1980s during the US “War on Cancer” led to the identification of molecules that control cell division and growth, and then to the realization that these molecules were mutated in cancer cells. Only in the late 1990s were the first cancer drugs developed that could directly target those tumor-causing mutations; at this writing, they represent one of the fastest-growing areas of effective cancer treatment. In another case, when HIV/AIDS first appeared in the United States in the 1980s, life expectancy after diagnosis was measured in months. Scientific studies of HIV and the immune system since then, funded by federal investment through the NIH, have generated treatments that raise life expectancy after diagnosis to 30 years or more. The brain is the most complex system in the body, and moving from brain science to treatments will not be easy. We must realize that it may take 20 years for discoveries in basic neuroscience to lead to treatments and cures for brain disorders. It may take ten years to elucidate the functions of the brain to a degree that provides a clear path for our colleagues in medicine, engineering, and the biotechnology and pharmaceutical industries. It may take ten additional years for them to transform this knowledge into practical and effective medical advances. But the history of biomedical research, and of visionary support by the NIH, provides hope for treating brain disorders through a deeper understanding of the brain. 

References 

  1. Lillehei, C. W., Gott, V. L., Hodges, P. C., Jr., Long, D. M. & Bakken, E. E. Transistor pacemaker for treatment of complete atrioventricular dissociation. J Am Med Assoc 172, 2006-2010 (1960). 

  2. Farris, S. & Giroux, M. Retrospective review of factors leading to dissatisfaction with subthalamic nucleus deep brain stimulation during long-term management. Surg Neurol Int 4, 69, doi:10.4103/2152-7806.112612 (2013). 

  3. Panov, F. et al. Deep brain stimulation in DYT1 dystonia: a 10-year experience. Neurosurgery 73, 86-93; discussion 93, doi:10.1227/01.neu.0000429841.84083.c8 (2013). 

  4. Mayberg, H. S. et al. Deep brain stimulation for treatment-resistant depression. Neuron 45, 651-660, doi:10.1016/j.neuron.2005.02.014 (2005). 

  5. Alzheimer's Facts and Figures

  6. Mudry, A. & Mills, M. The early history of the cochlear implant: a retrospective. JAMA Otolaryngol Head Neck Surg 139, 446-453, doi:10.1001/jamaoto.2013.293 (2013). 

  7. Clark, G. The multi-channel cochlear implant and the relief of severe-to-profound deafness. Cochlear Implants Int 13, 69-85, doi:10.1179/1754762811Y.0000000019 (2012). 

  8. Caposecco, A., Hickson, L. & Pedley, K. Cochlear implant outcomes in adults and adolescents with early-onset hearing loss. Ear Hear 33, 209-220, doi:10.1097/AUD.0b013e31822eb16c (2012). 

  9. da Cruz, L. et al. The Argus II epiretinal prosthesis system allows letter and word reading and long-term function in patients with profound vision loss. The British journal of ophthalmology 97, 632-636, doi:10.1136/bjophthalmol-2012-301525 (2013). 

  10. Veraart, C., Wanet-Defalque, M. C., Gerard, B., Vanlierde, A. & Delbeke, J. Pattern recognition with the optic nerve visual prosthesis. Artif Organs 27, 996-1004 (2003). 

  11. Bourkiza, B., Vurro, M., Jeffries, A. & Pezaris, J. S. Visual acuity of simulated thalamic visual prostheses in normally sighted humans. PloS one 8, e73592, doi:10.1371/journal.pone.0073592 (2013). 

  12. Stroke Fact Sheets