
Tuesday, June 19, 2007

My brain is not a Christmas tree

There are few things that irk me more than the way in which fMRI studies are reported in the popular press. Invariably, the article refers to parts of the brain lighting up in a Christmas-tree-like fashion. There was no Christmas tree in my house when I was growing up. There certainly isn't one in my head. More importantly, this phraseology seems to imply that the brain regions in question were:
a) not active before the experimental manipulation which triggered the festive lighting, and
b) in some sense uniformly "activated" by the stimulation.
Even the revered journal Science has adopted this misleading nomenclature in its lay publications (don't you love ecclesiastic language used to describe academia? I'm a monk of science):
Ever flinch at the sight of an actor being punched in the face? The reason is that neurons in the brain light up when we watch others suffering.

No, no, no (and not just because the Mean Monkey doesn't care if you are sad).

The first implication (conveniently denoted (a) above) is completely wrong, but in a simple way. Neural activity occurs constantly throughout the brain. Urban legends would have you believe that only some small fraction of the brain is actively used. While it is true that many neurons only fire action potentials infrequently, information is carried in their silences as well as their action potentials. If a neuron spiked without pause, it would transmit no more information than if it were constantly inactive. Moreover, at any given moment, there are neurons active in every part of the brain. Even the parts that fail to convey their wishes for a Happy Holiday in fMRI pictures. If you close your eyes, neurons continue to fire in the primary visual cortex. The primary visual cortex is even active in blind people. fMRI measures relative increases and decreases in activity; the baseline is never zero.
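(If you want to see what "relative" means in practice, here is a minimal sketch of the standard percent-signal-change calculation on a toy BOLD time series; the numbers and variable names are made up for illustration.)

```python
import numpy as np

# Toy BOLD time series in arbitrary scanner units. Note that the baseline
# is nowhere near zero, even when the subject is "doing nothing."
rest = np.array([1003.0, 998.0, 1001.0, 1000.0])   # fixation blocks
task = np.array([1012.0, 1009.0, 1014.0, 1011.0])  # stimulus blocks

baseline = rest.mean()
percent_change = 100.0 * (task.mean() - baseline) / baseline
print(f"Percent signal change: {percent_change:.2f}%")  # on the order of 1%
```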

The second implication ((b), if you've been following along) is more pernicious, because the underlying reality is a bit more complicated than the whiz-bang notion of "lighting up." fMRI's blood-oxygen-level dependent (BOLD) signal measures the slightly counter-intuitive increase in oxygenated hemoglobin when the metabolic requirements of a brain area increase. (Presumably, this is triggered by a homeostatic mechanism which senses the increased oxygen consumption and dilates blood vessels accordingly. The brain is all tricky like that. Don't even ask how it manages to maintain a reasonable connection strength at each synapse in the face of constant potentiation and depotentiation.) This signal is best correlated with the local field potential (the low-frequency component of electrode recordings due to the average synaptic activity over a span of hundreds of micrometers), rather than the actual spiking activity of the neurons in the area. The upshot of this is that fMRI represents the inputs to a brain region, not the local activity.
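For the curious, the spikes-versus-LFP distinction is literally a matter of filtering: the LFP is the low-frequency band of the extracellular voltage, and spikes live in the high-frequency band. A minimal sketch (the cutoff frequencies are conventional ballpark values, not gospel):

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 30000.0  # sampling rate (Hz), typical for extracellular recordings

def butter_filter(x, cutoff, fs, btype, order=4):
    """Zero-phase Butterworth filter (low- or high-pass)."""
    b, a = butter(order, cutoff / (fs / 2.0), btype=btype)
    return filtfilt(b, a, x)

# Toy "recording": a slow 8 Hz synaptic oscillation buried in noise.
t = np.arange(0, 1.0, 1.0 / fs)
raw = np.sin(2 * np.pi * 8 * t) + 0.5 * np.random.randn(t.size)

lfp = butter_filter(raw, 300.0, fs, "low")      # < ~300 Hz: what BOLD tracks
spikes = butter_filter(raw, 300.0, fs, "high")  # > ~300 Hz: spiking activity
```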

That a metabolic measure signals input rather than output is reasonable from a biophysical perspective, since relatively few ions move across the axonal membrane in the process of transmitting an action potential. Most of the axon is covered by a lipid-rich myelin sheath which blocks the flow of ions and decreases the capacitance, allowing the action potential to be transmitted quickly between the gaps in the myelin (known as the nodes of Ranvier, which is also the name of a metal band of questionable quality). In contrast, neurons have giant dendritic trees which are subject to a constant barrage of neurotransmitters, most of which cause ion channels to open. When ion channels open, ions flow through them passively along their electrochemical gradient, reducing the strength of the gradient. Thus, when the amount of input increases, more energy needs to be expended to move the ions back against the gradients, hence the increased need for oxygen.

Now that I write this, I'm not entirely convinced by this justification of the coupling between input strength and oxygen utilization, since although the total ionic flow is much greater in the dendrites than the axon, it's still very small compared to the total ionic content of the neuron. You could cut out the ionic pumps and the cell would be fine for hours or days, or so I'm told, in which case there's no need to immediately increase the amount of available oxygen so the ionic pumps can be run in overdrive. However, it's possible that while the cell as a whole would not lose its overall negative charge were the ionic pumps shut off briefly, everything would go out of equilibrium in the dendritic tree. The branches of the dendritic tree are really small, so the total number of ions in a dendritic spine or branch is not very large. Even the relatively insignificant ionic flow due to synaptic activity may be enough to knock around the ionic concentrations in such small volumes.
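Since I'm hand-waving anyway, here is the back-of-envelope version of that last claim, with order-of-magnitude textbook numbers (all of them rough assumptions):

```python
# Can one synaptic event meaningfully perturb the ion concentrations
# inside a dendritic spine? Rough order-of-magnitude estimate.
AVOGADRO = 6.022e23

spine_volume = 1e-16      # liters; ~0.1 cubic micrometer spine head
na_concentration = 0.010  # molar; ~10 mM intracellular sodium

na_ions_in_spine = na_concentration * spine_volume * AVOGADRO
ions_per_epsp = 1e5       # ~10-20 pA of EPSC lasting a millisecond or two

print(f"Na+ ions in the spine:  {na_ions_in_spine:.0e}")  # ~6e5
print(f"Fractional change/EPSP: {ions_per_epsp / na_ions_in_spine:.0%}")
```

So a single synaptic event could plausibly shift the local sodium concentration by double-digit percentages in a small enough compartment, which at least makes the pumps-running-in-overdrive story credible at the scale of spines and thin dendrites.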

Anyway, my point is that the BOLD signal from fMRI measures input, not local activity. And it has absolutely atrocious spatial and temporal resolution. Something on the order of millimeters and seconds. But it makes pretty pictures and lets hack science journalists tell the doe-eyed public "this part of the brain is for when you feel sad and lonely; this part of the brain is for when you feel happy." The real action is in calcium imaging, which can track single spikes (almost) from hundreds of cells at a time (but only in layer 2/3 of the cortex of anaesthetised animals), and chronic multitetrode recordings (disclosure: my old lab used this technique; tetrodes are bundles of four electrodes, which allow the isolation of dozens of cells from a single such bundle through a variety of black-magic mathematical tricks), which can record from perhaps a hundred cells for days, weeks, or months at a time (depending upon the strength of your experimental mojo). But no one wants to see pictures of comatose cats with the backs of their heads lopped off, or rats running around with bundles of wires popping out of their skulls. And the experimental results, while useful and meaningful, rarely come with a five-second sound-bite. Half the time even specialists in the field aren't sure what the ultimate implication of a study is. So fMRI gets the publicity and a disproportionate share of the funding.
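(For the record, the "black-magic mathematical tricks" are mostly spike sorting: reduce each spike waveform, as seen on all four tetrode channels, to a few features, then cluster. A minimal sketch with made-up data, not my old lab's actual pipeline:)

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# One row per detected spike: the voltage snippet on each of the four
# tetrode channels, concatenated (4 channels x 32 samples here).
rng = np.random.default_rng(0)
waveforms = rng.standard_normal((1000, 4 * 32))  # stand-in for real data

# Spikes from the same neuron look similar on all four channels, so they
# form a cluster in feature space; each cluster is a putative single unit.
features = PCA(n_components=3).fit_transform(waveforms)
labels = KMeans(n_clusters=5, n_init=10).fit_predict(features)

for unit in np.unique(labels):
    print(f"putative unit {unit}: {(labels == unit).sum()} spikes")
```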

Which is dumb, because very little useful science has come out of fMRI. Glorified phrenology. One of the most striking known facts about the brain is that most of it looks the same. So far as anyone can tell, aside from a few small and probably insignificant differences, cortex is cortex, regardless of whether it's processing visual data or planning an arm movement or contemplating the secrets of the universe. In fact, you can rewire visual input to the auditory areas and everything works out just fine (von Melchner, Pallas, & Sur, 2000). In ferrets. Not fruit flies. Not frogs. Visual mammals like you and me.

This would seem to suggest that the same basic computation underlies most of what the brain is doing. Wouldn't it be nice to know what this computation is? Why would you waste your time attempting to pinpoint exactly how the computation is divided spatially, when all the evidence suggests that the computation is the same everywhere? People are strange...

Monday, June 18, 2007

The art of science

In the process of thinking deep thoughts, whiteboards can become miraculously covered with some rather strange designs. Think of whiteboard marker ink as the academic equivalent of the holy oil that sometimes accumulates on statues and portraits of the Virgin Mary. Please note that "You get a cookie" is an important technical concept. We have defined Z to be beauty.

Sunday, June 10, 2007

The cult of scientific celebrity

Science is, in essence, a purely rational discipline. While naive notions of hypothesis testing or theory falsification as the primary purpose of experimentation can be dismissed out of hand, the project of science is nonetheless inherently logical, as opposed to emotional, political, or spiritual. The proper measure of a theory is always its ability to model the observed world. Neither the personal ramifications of a theory nor its source has any impact on its truth. In this light, I have difficulty understanding the reverence heaped upon those who have achieved success in their scientific pursuits.

Nobel prize winners in particular are accorded almost god-like status. The walls of the atrium of my present lab are decorated with pictures of our collaborators, but also with pictures of notable scientists with whom we have no direct connection. Amongst these latter pictures are a few featuring the heads of the lab together with Nobel winners who happened to pass through Zurich and give a talk at the university or ETH. Recently, Roderick MacKinnon, who won the prize for determining the structure of the potassium channel using x-ray crystallography, deigned to grace our lab with his presence for a few hours and was given the royal treatment. Indeed, in announcing this visit, one of the lab heads said, "Rod MacKinnon will be coming to visit. I trust you all know who Rod MacKinnon is," and left it at that. No, I don't know who Rod MacKinnon is. While the result for which he received the prize is important, it constituted a page or two in my introductory neuroscience textbook. It is an important piece of background information regarding the biophysics of neurons, but it has absolutely no impact on my daily work. My research would be unaffected if the three-dimensional structure of all the neuronal ion channels was still unknown. I work in a laboratory focused on computational and theoretical neuroscience. Why should any of us know who Rod MacKinnon is? Nevertheless, we had a special tea to fete MacKinnon, and everyone gathered at his feet so that they could root about for any pearls of wisdom he might carelessly cast down. After making the obligatory graduate-student-pounce on the free food, I went back to my desk to get some real work done.

Perhaps my awe of the Nobel prize and those who have received its blessings was dulled by my years at MIT and Caltech, where you could sometimes bump into such holy personages while using a urinal. The sight of David Baltimore zipping around on his Segway like a doofus, crowned with a bicycle helmet, fails to arouse in me any worshipful feelings. The Nobel prize and other such awards are a valuable motivation for scientific achievement, but it is important to recognize that the sort of success they honor depends on luck as much as skill. The ranks of scientists at prominent universities are filled with researchers of the highest caliber who didn't happen to try the one long-shot technique that actually worked, or make just the right mistake when performing an experiment to reveal a wholly unexpected phenomenon. Just because the Nobel committee doesn't think a particular result is worthy of recognition this year does not make it less important than the finding which does happen to be honored.

Did I mention how much I like NIPS's double-blind review policy? I hope all journals adopt that model. Papers should be judged on their content, not their authors.

Friday, May 4, 2007

Neuroscience is hard... Let's go shopping!

Classic quote: "Thus, although neuroanatomical information will be central to understanding how the brain processes stimuli and forms representations, our current knowledge of neuroanatomy is sufficient to constrain neither the problem of binding nor its solution." - Adina L. Roskies, writing in the introduction to a special issue of Neuron on the binding problem.

I recently read (well, listened to, at any rate) Surely You're Joking, Mr. Feynman! and What Do You Care What Other People Think? (the further adventures of a curious character), both semi-autobiographical works by famed physicist Richard Feynman. Although these books are mostly about Feynman's extracurricular adventures, he does touch upon his philosophy of scientific pursuits. I was struck by what he described as the radical honesty good scientists must bring to their work. Feynman claims that it is not sufficient simply to present all of the details of your work; the good scientist must lay bare all of the potential flaws in their theories and experiments. He particularly warns against scientists lying to themselves and failing to recognize weaknesses in their work.

This idea initially intimidated me. I make a point of remaining convinced that I am on the cusp of a (or perhaps THE) great discovery regarding how the brain works. It makes going into lab more fun, it seems mostly harmless, and some days I'm even convinced that it's true. However, none of my work can stand up to this sort of merciless intellectual assault. Yes, there are some interesting ideas in there that bear a passing resemblance to the experimental data, but I can't quite jam my higher-level ideas into a neuron-by-neuron, receptor-by-receptor, and ion-by-ion map of the biophysics of the brain. I can't pretend to have read even 1% of the literature on these detailed phenomena. Moreover, it is a truism of neuroscience that for every paper there is an opposite (but not necessarily equal) paper claiming a mutually exclusive result. Creating a model which matches a set of consistent, correct data is hard. Creating a model which is compatible with an undifferentiated mass of mutually contradictory data, half of which is necessarily wrong, is by definition impossible.

I think the real answer is that the brain is a messier affair than particle physics. Cloud chambers are neat and sterile. The brain is squishy and amorphous, and you need to peer into it through the tiniest of tubes (generally, a bunch of solid wires). In this sense, neuroscience is much more akin to statistical mechanics than to particle physics. Asking how the brain works is like asking for a detailed description of the turbulence behind a large truck. It's possible to write down rules describing the interactions at the smallest scale, and it's possible to make some hand-wavy measurements of phenomena at the largest scales, but ne'er the twain shall meet.

Tuesday, April 17, 2007

I am a peer

Occasionally, even we lowly graduate students are called upon for the noblest of scientific activities: the peer review. I just finished reviewing a paper by a post-doc and two heavy-weight professors. The verdict? Utter garbage. Sloppy math, incomplete numerical analysis, and meaningless conclusions. I may not be able to publish papers myself, but I can at least try to keep everyone else from doing so. It's like I have my finger in a leaky dyke.
...
A very, very leaky dyke.
...
I'm drowning, here.
...
Does anyone have a pair of water-wings I could borrow?


Also: Pet Peeve #115 - People who fail to correct their statistics for multiple tests. If you have 100 data points, and you evaluate them all separately for significance, and you find that 5 of them are significant at p < 0.05, you have found exactly nothing. Bonferroni is your friend. Don't leave him standing out in the cold. Invite him in for a beer and some nachos.
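To see why, run the experiment on pure noise; with 100 tests at p < 0.05 you expect about five "significant" results even when nothing is there. A minimal sketch (simulated data, obviously):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_tests = 100

# 100 data sets of pure noise: the null hypothesis is true everywhere.
pvals = np.array([stats.ttest_1samp(rng.standard_normal(30), 0.0).pvalue
                  for _ in range(n_tests)])

naive = (pvals < 0.05).sum()                 # expect ~5 hits, all of them false
bonferroni = (pvals < 0.05 / n_tests).sum()  # divide alpha by the test count

print(f"Uncorrected 'significant' results: {naive}")
print(f"Bonferroni-corrected results:      {bonferroni}")
```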

Also also: The new Virgin Black album has finally hit the internets. I haven't had a chance to listen to it carefully yet, but repeated plays today while reading and such indicate that it is pretty sweet.

Pet Peeve #114

People who listen to you present an idea about which you have thought extensively and for which you have conducted a fairly thorough literature search and then say: "Have you read the paper by (authors XYZ) in which they discuss (idea ABC)? It sounds pretty similar to this" when they read the paper two years ago and barely remember the contents.

Possible responses, in order of frequency:

(a) No. I haven't read it. I'll look at it. [Proceed to look like an ill-informed doofus; look up the paper; realize it is TOTALLY DIFFERENT in ALL PERTINENT DETAILS. Write angry blog post]

(b) I read it, but it was a couple months ago. I don't remember any significant similarity, but I'll look at it again. [Proceed to look like an ill-informed doofus; look up the paper; realize it is TOTALLY DIFFERENT in ALL PERTINENT DETAILS. Write angry blog post]

(c) Yes, I read it yesterday. It is superficially similar, but TOTALLY DIFFERENT in ALL PERTINENT DETAILS. [Look like a doofus anyway. Write angry blog post]

(d) Oh shit. I lose. [Write angry blog post]

Not the same, jerk!

Teaching Assistance

I started TAing my first class (ever) today - statistical and dynamical models of brain functions. I think I should get off pretty light work-wise. The professor is writing the exercises, and there are only around ten people in the class, so I'm just responsible for grading assignments and running the recitation section. In a masterstroke of scheduling genius, the recitation is set for the hour immediately following the class, so no one has time to look at the homework or think about anything before their sole opportunity to ask for explanations, clarifications, and elaborations. I'd be more incensed by the inanity of this, except it means less work for me, and I have plenty of things to think about on my own. Of course, out of ten students, exactly 0 are female. Girls in computational neuroscience are like honest politicians: it's a great idea on paper, but somehow it doesn't seem to occur in the wild.

Speaking of papers, here is an excellent one: Parachute use to prevent death and major trauma related to gravitational challenge: systematic review of randomised controlled trials. I hear they're still looking for volunteers for a randomized controlled study of parachute efficacy. Regardless of the implication underlying the article, I suspect the authors still prefer drugs tested in double-blind, placebo-controlled studies to copper bleeding bowls and leeches. You may also be interested in reading about the relative willingness of men and women to accept the social and sexual advances of strangers. Unfortunately, this is an article about the paper; I can't find the paper itself for free.

Brooklyn accents can be deceiving

Back when I was an experimental neuroscientist, my adviser was a friendly enough guy, but he never seemed to think about anything other than science. Even when I was working almost eighty hours a week, he would often be in the lab before me, and would usually still be working when I left for the night. (Which was itself perfectly normal behavior compared to our post-doc, who would work for days at a time, literally, until sleep-deprivation-induced visual distortions prevented him from continuing. He would bring sunglasses when he came into lab because he knew that, by the time he left, the sun would definitely hurt his eyes. After one of these mammoth work sessions, he would go home and enter a semi-comatose state for almost 24 hours before repeating the entire cycle.) He was always available whenever I had a question (literally: 2am? No problem. Of course he's still in his office), but as soon as the technical matters were addressed, he'd go back to writing grants or papers or whatever else he was working on (can you undangle this preposition?). While it was pretty easy to set him on a rant about how great it will be to get "kick-ass data," or how some other lab overlooks important questions or uses inferior methodologies, there was a clear limit to his conversational range. I suppose his quirkiest feature was his penchant for gangsta rap. It was always a little disconcerting to walk into the lab and find Snoop Dogg playing while he was preparing electrodes.

My current adviser is amongst the more awesome people I've ever met. The other day we were chatting in his office about technical-type matters. I made some claim which (so far as I can tell) was correct, but which made my adviser feel intuitively uncomfortable. So he spends a couple minutes looking at it from various angles, at each turn surprised that everything remains self-consistent. This puts him in mind of the word "consistent," which apparently has its own theme song. Within a minute, King Crimson's Indiscipline is blaring from the surprisingly large speakers on his desk:

The more I look at it,
the more I like it.
I do think it's good.
The fact is
no matter how closely I study it,
no matter how I take it apart,
no matter how I break it down,
It remains consistent.
I wish you were here to see it.

I wish all professors listened to psychedelic 70's prog rock. Later in the evening, some other line of conversation prompted him to put on a recording of Richard Feynman telling stories of his life. Feynman, as you surely know, was one of the most brilliant people to walk the face of this planet. But when my adviser turns on the recording, my immediate reaction is: this can't be Feynman. This sounds just like the senile old man at the beginning of Sleep, on Godspeed You Black Emperor's Lift Your Skinny Fists Like Antennas To Heaven. Little did I realize that Feynman had a Brooklyn accent (Wikipedia, Apostle of Al Gore, informs me that Feynman was from Queens, but all other accounts refer to his "Brooklyn accent"), and after listening to GYBE one too many times, I've come to associate all Brooklyn accents with this one dementia-addled rant about Coney Island in the early 20th century. Frightening...

Brooding

Yesterday's post has me thinking about thinking in a rather coarse and physical way. A few months ago I was doing some more mathematical work and needed a whiteboard. There are few whiteboards in the office in which my desk is located, but the hallway outside of the office is covered with them. This is a counterintuitive arrangement, since people work in the office and could make use of whiteboards, whereas people rarely do much more than walk through the hallway. Some of the hallway whiteboards are covered with scribbles which have clearly not changed for many moons. Well, far be it from me to let social convention come between me and my whiteboard. I spent a week or two camped out in the hallway, crouched on the floor, staring moodily at the whiteboard. Apparently, my facial expression and body language reflect the depth of my thinking. While I was curled up in the fetal position on the side of the hallway, I was fairly regularly addressed by passers-by who wanted to know if I was OK. "Yes, yes. I'm just thinking. Please go away." Of course, the scribbles I put on the board during that period are still there, along with the note "Please erase me."