Blue Brain
Amit Dhanvani
Shree Swaminarayana Naimisharanya College of Management & IT,
Sidsar Road, Bhavnagar.
dhanvaniamit@gmail.com
Infrastructure
Main components of the infrastructure
The Blue Brain workflow depends on a large-scale research infrastructure.
Workflows
Neuroscience
Data acquisition is the first step in the Blue Brain workflow and involves different levels of effort and standardization.
Neuroinformatics
Neuroinformatics is the second step in the Blue Brain workflow. The goal is to extract the maximum possible information from the data acquired in the previous step. To achieve this goal, the project has designed a set of prototype workflows supporting the acquisition, curation, databasing, post-processing and mining of data (protocols, experimental conditions, results) from Blue Brain experiments and from the literature.
One of the project's key strategies will be Predictive Reverse Engineering. Biological data at different omics levels display complex cross-level structures and dependencies, amenable to discovery by informatics-based tools. Together, these dependencies constrain the structure and functionality of neural circuits at many different levels. Predictive Reverse Engineering exploits these constraints to fill in gaps in the experimental data. It has already been applied successfully in several different areas: prediction of the spatial distribution of ion channels in 3D model neurons, prediction of neuronal firing properties from expression data for a selected set of ion channels, and prediction of synaptic connectivity from neuronal morphology. In future work, the project will extend Predictive Reverse Engineering to new domains, including the prediction of the transcriptome from limited expression data and the prediction of data for one species (e.g. humans) from data collected in other species (rat, mouse, cat, primates).
Virtual experiments
Goals
The ultimate goal of the Blue Brain Facility is
to enable neuroscientists to conduct virtual
experiments and exploratory studies testing
hypotheses, diagnostic tools and treatments.
This step in the Blue Brain workflow supports such studies by providing facilities for configuring the models, setting up virtual experiments, and running high-performance simulations, visualization and analysis, all based on a single, unifying data model.
Simulation environment
The Blue Brain Facility provides simulation through the Simulation Environment, based on a Blue Gene/P supercomputer. Any virtual experiment involves the simulation of biophysical processes at different levels of brain organization. In the Simulation Environment, these are represented by models integrating the data acquired and managed during the first two steps in the Blue Brain workflow with the mathematical abstractions developed in the third step. The Simulation Environment generates numerical solutions for these models.
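Generating numerical solutions for such models typically means integrating coupled differential equations such as the classic Hodgkin-Huxley membrane equations. The following sketch is not Blue Brain code; it is a minimal forward-Euler integration of a single Hodgkin-Huxley compartment with the standard squid-axon parameters, to illustrate what "numerical solution" means at the smallest scale:

```python
import math

# Standard Hodgkin-Huxley parameters (squid axon; units: mV, ms, uA/cm^2)
C_M = 1.0
G_NA, G_K, G_L = 120.0, 36.0, 0.3
E_NA, E_K, E_L = 50.0, -77.0, -54.387

def alpha_beta(v):
    """Voltage-dependent rate constants for the m, h, n gating variables."""
    am = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
    bm = 4.0 * math.exp(-(v + 65.0) / 18.0)
    ah = 0.07 * math.exp(-(v + 65.0) / 20.0)
    bh = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    an = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
    bn = 0.125 * math.exp(-(v + 65.0) / 80.0)
    return am, bm, ah, bh, an, bn

def simulate(i_ext=10.0, t_stop=50.0, dt=0.01):
    """Forward-Euler integration of one HH compartment; returns the voltage trace."""
    v = -65.0
    m, h, n = 0.0529, 0.5961, 0.3177   # steady-state gate values at rest
    trace = []
    for _ in range(int(t_stop / dt)):
        am, bm, ah, bh, an, bn = alpha_beta(v)
        i_na = G_NA * m**3 * h * (v - E_NA)
        i_k = G_K * n**4 * (v - E_K)
        i_l = G_L * (v - E_L)
        v += dt * (i_ext - i_na - i_k - i_l) / C_M
        m += dt * (am * (1.0 - m) - bm * m)
        h += dt * (ah * (1.0 - h) - bh * h)
        n += dt * (an * (1.0 - n) - bn * n)
        trace.append(v)
    return trace

trace = simulate()
print(f"peak membrane potential: {max(trace):.1f} mV")  # overshoots 0 mV during a spike
```

A sustained 10 uA/cm^2 stimulus drives repetitive firing; production simulators solve millions of such compartments in parallel with far more sophisticated integrators.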
Analytics
In silico studies require many different kinds of analytical data, ranging from the time stamp for a single event to EEG measurements covering the whole brain. The Analytics Environment leverages the Data Model (see below) to provide a reliable framework for the development of reusable, potentially interactive tools for analytics. The framework is designed to support co-hosting of simulations and analytics tools on the same machine.
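The simplest such analytical datum, the time stamp of a single event, is typically extracted by detecting threshold crossings in a voltage trace. A minimal sketch (illustrative only, not an Analytics Environment tool):

```python
def spike_times(trace, dt, threshold=0.0):
    """Return event time stamps (ms) where the voltage crosses threshold upward."""
    times = []
    for i in range(1, len(trace)):
        if trace[i - 1] < threshold <= trace[i]:
            times.append(i * dt)
    return times

# A toy voltage trace sampled at 1 ms containing two spikes
v = [-65, -64, -20, 30, 10, -40, -70, -30, 25, -10]
print(spike_times(v, dt=1.0))  # → [3.0, 8.0]
```

Real analytics tools of this kind operate on the shared data model, so the same spike-detection logic can run co-hosted with the simulation that produces the trace.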
Visualization
The Visualization Environment provides the
interface between the user, modeling tools
(Builders) and the simulation and analytics
environments, with which it shares a common
data model. The Visualization Environment
provides users with tools allowing them to
display and navigate the structural, transient
and analysis data generated by these tools and
environments. These tools include virtual instruments allowing users to display in silico brain structures in the same way they appear during experiments with biological samples.
Data model
Modeling and simulating the brain requires the
integration of structural and functional
information spanning multiple orders of
magnitude in space and time. The Blue Brain
Data Model, which evolves continuously,
provides a unifying spatial scaffolding for
mathematical models representing different
aspects of the brain on different spatial and
temporal scales or serving different purposes
(e.g. visualization vs. simulation).
2.2 Eye
Seeing is one of the most pleasing senses of the nervous system. This cherished action is conducted primarily by the lens, which magnifies a seen image; the vitreous disc, which bends and rotates the image against the retina; and the retina, which translates the image and light by a set of cells. The retina is at the back of the eyeball, where the rod and cone structures, along with other cells and tissues, convert the image into nerve impulses. These are transmitted along the optic nerve to the brain, where the image is kept for memory.
2.3 Tongue
A set of microscopic buds on the tongue divides everything we eat and drink into four kinds of taste: bitter, sour, salty, and sweet. These buds have taste pores, which convert the taste into a nerve impulse and send the impulse to the brain through a sensory nerve fiber. Upon receiving the message, our brain classifies the different kinds of taste. This is how we can relate the taste of one kind of food to another.
2.4 Ear
Once a sound wave has entered the eardrum, it travels to a large structure called the cochlea. In this snail-like structure, the sound waves are divided into pitches. The vibrations of the pitches in the cochlea are measured by the organ of Corti, which transmits the vibration information to a nerve that sends it to the brain for interpretation and memory.
Data acquisition
There are three main steps to building the virtual
brain: 1) data acquisition, 2) simulation, 3)
visualisation of results.
Simulation
NEURON
The primary software used by the BBP for neural
simulations is a package called NEURON. This was
developed starting in the 1990s by Michael Hines at
Yale University and John Moore at Duke University.
It is written in C, C++, and FORTRAN. The software
continues to be under active development and, as of July 2012, is at version 7.2. It is free and open-source software; both the code and the binaries are freely available on the website. Michael Hines
and the BBP team collaborated in 2005 to port the
package to the massively parallel Blue Gene
supercomputer.
Simulation speed
[Figure: the 12 patch-clamp, close-up view]
BBP-SDK
The BBP-SDK (Blue Brain Project - Software
Development Kit) is a set of software classes
(APIs) that allows researchers to utilize and
inspect models and simulations. The SDK is a C++
library wrapped in Java and Python.
Workflow
The simulation step involves synthesising virtual
cells using the algorithms that were found to describe
real neurons. The algorithms and parameters are
adjusted for the age, species, and disease stage of the
animal being simulated. Every single protein is
simulated, and there are about a billion of these in
one cell. First a network skeleton is built from all the
different kinds of synthesised neurons. Then the cells
are connected together according to the rules that
have been found experimentally. Finally the neurons
are functionalised and the simulation brought to life.
The patterns of emergent behaviour are viewed with
visualisation software.
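The skeleton-then-connect-then-functionalise sequence described above can be sketched as a toy workflow. This is illustrative pseudo-code, not BBP software; the cell types, counts, and connection probabilities below are hypothetical stand-ins for the experimentally derived rules:

```python
import random

random.seed(0)  # deterministic toy run

def build_skeleton(counts):
    """Step 1: instantiate neurons of each synthesised type (network skeleton)."""
    neurons = []
    for cell_type, n in counts.items():
        neurons += [{"type": cell_type, "inputs": []} for _ in range(n)]
    return neurons

def connect(neurons, rules):
    """Step 2: wire cells according to pairwise connection probabilities."""
    for pre in neurons:
        for post in neurons:
            if pre is post:
                continue
            p = rules.get((pre["type"], post["type"]), 0.0)
            if random.random() < p:
                post["inputs"].append(pre)
    return neurons

# Hypothetical cell counts and connection rules (not measured values)
counts = {"pyramidal": 8, "interneuron": 2}
rules = {("pyramidal", "pyramidal"): 0.1,
         ("pyramidal", "interneuron"): 0.3,
         ("interneuron", "pyramidal"): 0.4}

net = connect(build_skeleton(counts), rules)
print(len(net), "cells,", sum(len(c["inputs"]) for c in net), "connections")
```

Step 3, functionalisation, would then attach membrane dynamics to each cell so the wired network can be simulated.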
Visualisation of results
RTNeuron
RTNeuron is the primary application used by the
BBP for visualisation of neural simulations. The
software was developed internally by the BBP
team. It is written in C++ and OpenGL. RTNeuron is
ad-hoc software written specifically for neural
simulations, i.e. it is not generalisable to other types
of simulation. RTNeuron takes the output from Hodgkin-Huxley simulations in NEURON and renders them in 3D. This allows researchers to watch as action potentials propagate through a neuron and between neurons. The animations can be stopped, started and zoomed, thus letting researchers interact with the model. The visualisations are multi-scale: they can render individual neurons or a whole cortical column. The image right was rendered in RTNeuron.
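At its core, rendering a membrane-potential simulation means mapping each compartment's voltage to a colour on every frame, so a propagating action potential appears as a wave of colour. A minimal sketch of such a mapping (illustrative, not RTNeuron code):

```python
def voltage_to_rgb(v_mv, v_rest=-70.0, v_peak=40.0):
    """Map a membrane voltage (mV) to a blue-to-red colour ramp."""
    t = (v_mv - v_rest) / (v_peak - v_rest)
    t = max(0.0, min(1.0, t))                # clamp to [0, 1]
    return (int(255 * t), 0, int(255 * (1.0 - t)))

print(voltage_to_rgb(-70.0))  # resting potential → (0, 0, 255), pure blue
print(voltage_to_rgb(40.0))   # spike peak      → (255, 0, 0), pure red
```

A renderer applies this per compartment per time step, which is why visualising a whole cortical column is itself a substantial computation.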
Silicon Graphics
The computer is used by a number of different
research groups, not exclusively by the Blue Brain
Project. In mid-2012 the BBP was consuming about
20% of the compute time. The brain simulations generally run all day, one day per week (usually Thursdays). The rest of the week is used to prepare simulations and to analyse the resulting data. The supercomputer usage statistics and job history are publicly available online; look for the jobs labelled "C-BPP".
Commodity PC clusters
Clusters of commodity PCs have been used for
visualisation tasks with the RTNeuron software.
A research paper published by the BBP team in 2012 describes such a setup.
JuQUEEN
JuQUEEN is an IBM Blue Gene/Q supercomputer that was installed at the Jülich Research Center in
Germany in May 2012. It currently performs at 1.6
petaflops and was ranked the world's 8th fastest
supercomputer in June 2012. It's likely that this
machine will be used for BBP simulations starting in
2013, provided funding is granted via the Human
Brain Project.
In October 2012 the supercomputer is due to be
expanded with additional racks. It is not known
exactly how many racks or what the final processing
speed will be.
The JuQUEEN machine is also to be used by the JuBrain (Jülich Brain Model) research initiative.
This aims to develop a three-dimensional, realistic
model of the human brain. This is currently separate
from the Blue Brain Project but it will become part of
the Human Brain Project if the latter is chosen for EU
funding in late 2012.