Wednesday, October 23, 2013

Brain Data Workshop

Yesterday I was privileged to attend a workshop focused on the future of brain data. Specifically, we were interested in thinking about how to use and share brain data between research groups. This issue is especially relevant considering (a) how much interest the scientific and medical communities have in solving brain problems and (b) the sheer amount of raw data being generated by some of these experiments. One researcher described an imaging technique he uses to image nanoscopic slices of brain tissue in three dimensions. He estimates that a single 1 mm³ block of brain tissue requires well over 100 terabytes to image. Once his new higher-resolution system comes online, that number will nose past 1 petabyte. The era of Big Data is certainly upon us.
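As a quick sanity check on those numbers, here is a back-of-envelope calculation in Python. The voxel sizes (20 nm and 10 nm) and the one byte per voxel are assumptions I'm supplying for illustration, not figures from the workshop, but they land in the same ballpark:

    # Rough estimate of raw storage for imaging a 1 mm^3 block of brain
    # tissue at nanoscale resolution. Voxel sizes and bytes-per-voxel are
    # assumed for illustration, not taken from the workshop talk.

    MM_TO_NM = 1_000_000  # 1 mm = 10^6 nm

    def block_bytes(voxel_nm: float, bytes_per_voxel: int = 1) -> float:
        """Raw bytes to store a 1 mm^3 volume at the given isotropic voxel size."""
        voxels_per_edge = MM_TO_NM / voxel_nm
        return voxels_per_edge ** 3 * bytes_per_voxel

    for voxel_nm in (20, 10):  # hypothetical current vs. higher-resolution systems
        tb = block_bytes(voxel_nm) / 1e12
        print(f"{voxel_nm} nm voxels: ~{tb:,.0f} TB per mm^3")

At 20 nm voxels this works out to roughly 125 TB per cubic millimeter, and halving the voxel size to 10 nm multiplies the voxel count eightfold, to about a petabyte. So the scaling he described is entirely plausible.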

We also discussed how other disciplines like astronomy and particle physics have addressed the realities of sharing high-volume data that is increasingly expensive to produce. Largely, those fields have organized around solving Grand Challenges. Challenges serve the purpose of galvanizing the community around its most important, most pressing problems. Ancillary advantages include shared use of expensive equipment and centralized access to massive datasets. Perhaps the time has come for the neuroscience equivalent of the Large Hadron Collider project.

In any event, it was nice to have a seat at the table. I enjoy small meetings (fewer than 100 people) like this one, where people can just talk and debate and argue. Good stuff.
