Monday, November 12, 2007

When acronyms are not

From a current Journal of Neuroscience abstract:
To identify genes downstream of BDNF that may play roles in psychiatric disorders, we examined a subset of BDNF-induced genes also regulated by 5-HT (serotonin), which includes the neuropeptide VGF (nonacronymic)


Neuropeptide VGF (nonacronymic)? WTF? As if crazy acronyms like BDNF (brain derived neurotrophic factor) weren't confusing and non-mnemonic enough. And who, given a rare opportunity to name something, squanders it on a random sequence of letters? Clearly, there's some in-joke here that I'm missing.

Monday, October 1, 2007

Improve your memory by pharmacologically disabling it when it is not needed

And you thought resveratrol was hot...

Paradoxical Facilitatory Effect of Low-Dose Alcohol Consumption on Memory Mediated by NMDA Receptors

Maggie L. Kalev-Zylinska and Matthew J. During

Epidemiological studies have suggested a negative correlation between alcohol intake and Alzheimer's disease. In vitro, ethanol negatively modulates NMDA receptor function. We hypothesized that chronic moderate alcohol intake leads to improved memory via adaptive responses in the expression of NMDA receptors and downstream signaling. We fed liquid diets containing no, moderate, or high amounts of ethanol to control and matched rats with hippocampal knock-down of the NR1 subunit. Rats with increased hippocampal NR1 expression were also generated to determine whether they had a phenotype similar to that of ethanol-fed animals. We found that moderate ethanol intake improved memory, increased NR1 expression, and changed some aspects of neurotrophin signaling. NR1 knock-down prevented ethanol's facilitatory effects, whereas hippocampal NR1 overexpression mimicked the effect of chronic low-dose ethanol intake on memory. In contrast, high-dose ethanol reduced neurogenesis, inhibited NR2B expression, and impaired visual memory. In conclusion, adaptive changes in hippocampal NMDA receptor expression may contribute to the positive effects of ethanol on cognition.

Sunday, September 30, 2007

Boris: Wandering through Plato's cave

Boris is something of a chameleon in the drone doom community. Indeed, they might well object to any label being put on their creative output. While they do have songs in which single notes are drawn out until it seems they must snap and percussion is dispensed with as an unnecessary adulterant of pure tones and feedback, they also frequently display distinctly punk sensibilities, and are comfortable within the psychedelic rock idiom. Their vocals in particular display an intensity which seems rooted in political or personal concerns, rather than the existential terror or universal hatred which drip from the ragged edges of most black metal rasps. Being a drone doom aficionado myself, I think Boris reaches their peak when they veer towards the abstract and leave out the more quotidian vocals.

Their collaborations with Merzbow deserve particular mention. Merzbow seems to provide a textured but emotionally neutral backdrop against which Boris' pure tones can shine out like jewels. I've sometimes compared Ulver's Nattens Madrigal to being awoken late at night by a ringing telephone and picking up the receiver only to find God Himself on the line. The connection is poor and the line full of static, but it is abundantly clear that this is not the all-forgiving God of the New Testament, nor even the vengeful but rationally minded God of the Old Testament. Rather, this is a being of wrath divorced from mortal notions of reason. Lovecraft's idiot flutist Azathoth blares into the line, barely constrained by the medium's bandpass filter, calling down an apocalypse which represents not moral judgment but the inevitable triumph of entropy. Just as Ulver conjures a deranged deity thrust into the modern world through the most banal tool of communication, Boris and Merzbow bring us into Plato's cave. The rough-hewn walls are solid, but devoid of intellectual or emotional presence. Against their mindless physicality dance pure ideas, freed from their earthly trappings by the stabilizing matrix of rock surrounding them. While the mental and the physical are inextricably wed, Boris and Merzbow draw the connection into a thin thread. The abstract and corporeal move independently, throwing each other into deeper contrast.

In their independent efforts, Boris provides this tension with the abstract through alternative routes. Mental and physical intertwine more tightly, but the result is a dance between yin and yang, rather than a uniform composite. Soaring notes reach out, only to be drawn under by the crash of cymbals and feedback. Their music often has an agitated energy which feels almost carnal, like the buzz of amphetamines, limbs vibrating and twitching of their own accord while the mind wanders elsewhere, only loosely coupled to the pumping pistons of the body. Other times, the music retreats into a contemplative fugue, acoustically fleshing out quiet corners of the world, speaking to lost moments spent alone, almost divorced from the self. In all cases, the result screams craftsmanship and quality.



Saturday, September 29, 2007

Computational horsepower: Natural vs artificial systems

An interesting exercise in dimensional analysis:

There are about 10^11 neurons in the brain, with about 10,000 synapses per neuron, yielding approximately 10^15 synapses total. Extracellular electrophysiology would have you believe that the average pyramidal (excitatory) neuron fires at about 10-20 Hz (inhibitory neurons fire even more rapidly), but there is a strong bias towards recording from more active neurons. It's hard to locate silent neurons with an extracellular electrode, since you can only determine that you are near a neuron when it fires. Intracellular electrodes may also be biased towards larger cells, since they are probably easier to spear (or clamp onto, depending upon technique), but almost all in vivo recordings are performed using extracellular electrodes. Arguments based upon metabolic rates suggest that the average firing rate is closer to 1 Hz, but I don't have the actual papers at my fingertips. I'll fish out the reference if challenged. The traditional leaky integrate-and-fire model of neural activity suggests that depolarization / shunting / hyperpolarization delivered to synapses throughout the dendritic tree and soma sum linearly in the soma (subject to a low-pass filter), and the neuron fires an action potential when its membrane potential exceeds some threshold. Polsky, Mel, and Schiller (2004), amongst other recent papers, imply that individual dendrites or dendritic compartments actually contain separate computational subunits, utilizing the nonlinear voltage-gated and NMDA channels to perform a thresholding operation similar to that normally ascribed to the axon hillock (where action potentials are initiated).
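
For the curious, here is a minimal sketch of the leaky integrate-and-fire model just described, in Python rather than anything I actually use in the lab; the constants (membrane time constant, threshold, input current) are made-up but physiologically plausible values chosen purely for illustration:

```python
import numpy as np

def lif_sim(input_current, dt=1e-3, tau=20e-3, v_rest=-70e-3,
            v_thresh=-55e-3, v_reset=-70e-3, r_m=1e8):
    """Minimal leaky integrate-and-fire neuron.

    Synaptic input is summed into a single current, low-pass filtered by the
    membrane time constant tau; a spike is emitted whenever the membrane
    potential crosses v_thresh, after which the potential is reset.
    """
    v = v_rest
    spike_times = []
    for step, i_syn in enumerate(input_current):
        # Leaky integration: dv/dt = (-(v - v_rest) + R*I) / tau
        v += dt * (-(v - v_rest) + r_m * i_syn) / tau
        if v >= v_thresh:              # threshold crossing (the "axon hillock")
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# One second of a constant 200 pA input produces a regular spike train.
print(lif_sim(np.full(1000, 2e-10)))
```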

Let's be generous and assume that each thresholding operation roughly corresponds to a single floating point operation. We can also reasonably estimate that there are 100 such thresholding sites in the dendritic tree, and they perform thresholding operations (i.e., floating point operations) at the same rate that their inputs fire. The human brain then runs at approximately 10^11 neurons * 10^2 thresholding sites / neuron * 1 Hz = 10^13 flops. Bounding our estimate on the other side by assuming that each synapse performs a floating point operation each time it receives a spike and that the average neuron fires at 10 Hz, we find that the brain performs at most about 10^15 synapses * 10 Hz = 10^16 flops. The world's fastest supercomputers run at about 100 teraflops, or 10^14 flops, whereas desktop computers can achieve about 10^10 flops. Furthermore, Moore's law implies that computer performance should double approximately every 24 months. This suggests that computers are very close to the computational capacity of the brain.
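
The arithmetic is simple enough to lay out explicitly; here is a quick sketch using the same order-of-magnitude guesses as above:

```python
# Order-of-magnitude bounds on the brain's processing rate, using the
# rough figures from the text (every number here is a ballpark guess).
neurons = 1e11
synapses_per_neuron = 1e4
dendritic_subunits_per_neuron = 1e2

low_rate_hz = 1    # metabolically constrained estimate of mean firing rate
high_rate_hz = 10  # extracellular-recording estimate

lower_bound = neurons * dendritic_subunits_per_neuron * low_rate_hz   # ~1e13 flops
upper_bound = neurons * synapses_per_neuron * high_rate_hz            # ~1e16 flops

print(f"lower bound: {lower_bound:.0e} flops")
print(f"upper bound: {upper_bound:.0e} flops")
```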

Of course, this doesn't imply that we can simulate such a brain. The biophysics underlying these abstract computations is orders of magnitude more complex than the computations themselves. The Blue Brain Project is currently struggling to simulate a single cortical column with a scant 10,000 neurons. To make efficient use of our available computational power to simulate the high-level activity of the brain, we would first need to know the basic algorithms the brain is implementing.

Friday, September 28, 2007

Top search keywords

For your amusement, here are some of the search terms which led people to this blog:

amps that really go to 11
how do you purge after a binge?
martini & rossi vermouth global market share
math nightmares
what to do post binge
wolves in the thrown room (the band is wolves in the throne room; the homophone leads to a slightly different mental image)

Seek and ye shall find.

Oliver Sacks, on mountains of amphetamine, mescaline, and cannabis

Stolen without remorse from a Wired interview with Oliver Sacks about his new book on music and the brain:

Hume wondered whether one can imagine a color that one has never encountered. One day in 1964, I constructed a sort of pharmacological mountain, and at its peak, I said, "I want to see indigo, now!" As if thrown by a paintbrush, a huge, trembling drop of purest indigo appeared on the wall — the color of heaven. For months after that, I kept looking for that color. It was like the lost chord.

Then I went to a concert at the Metropolitan Museum of Art. In the first half, they played the Monteverdi Vespers, and I was transported. I felt a river of music 400 years long running from Monteverdi's mind into mine. Wandering around during the interval, I saw some lapis lazuli snuffboxes that were that same wonderful indigo, and I thought, "Good, the color exists in the external world." But in the second half I got restless, and when I saw the snuffboxes again, they were no longer indigo — they were blue, mauve, pink. I've never seen that color since.

It took a mountain of amphetamine, mescaline, and cannabis to launch me into that space. But Monteverdi did it too.

Sunday, September 23, 2007

Apophenia, take 2

After some additional consideration, I think my Apophenia post was unfair. Humans, and indeed mammals in general, are ridiculously good at detecting and utilizing correlations. But only particular types of correlations which they have been evolutionarily prepared to expect and process. Consider that darling of experimental neuroscience, the rat. Rats will learn to associate a tone with a foot-shock in only a few trials, if the tone is brief, co-terminates with the foot-shock, and if the tone predicts the shock with high reliability. If the tone is more than a few seconds long before the shock occurs, if there is a substantial gap between the tone and the shock, or if the tone occurs often without a shock, the rat will not learn the relationship. Similarly, rats can learn that water with a distinctive taste is correlated with nausea after only a single pairing, even if the nausea occurs hours after the water is consumed, but they cannot learn that lights and sounds are correlated with nausea.

You might now quite reasonably be thinking that learning to associate tones and shocks is not very impressive, and that I promised quality correlation detection. The tone-shock combinations are unnatural and thus akin to the correlations in the Apophenia post. Rats rarely encounter electrified grids hooked up to speakers in the wild. The ability to associate foods with illness is no mean feat, given the time spans involved (although it can go awry - if you've ever eaten a distinctive-tasting food while sick and later vomited, you likely found that you had lost your taste for that food. I didn't care much for lobster for most of my youth after my father brought home a special treat one evening when I had an ear infection. Cancer patients undergoing chemotherapy are often left staring down the business end of this phenomenon). But the brain (and the cortex in particular) really comes into its own when processing complex instantaneous sensory stimuli.

Consider the same rat which couldn't remember that it was going to receive a shock after hearing a tone because we nefariously inserted a five-second delay between the two stimuli. If you stick that rat in a big pool of opaque water (they use some sort of latex beads, I think) with a small platform hidden just below the surface of the water, the rat will swim around at random until stumbling upon the platform, at which point it will immediately climb up and out of the water (this is called a Morris water maze. It's generally used to test spatial memory. Note that while they're pretty good swimmers, rats don't bathe recreationally). If you now pick the rat up, blindfold it, swing it around your head a few times to disorient it, and put it back in the pool at some random location, it will immediately swim back to the platform.

Amazing! After a single trial, the rat learned to associate the complex visual stimulus perceived at a particular location in the pool with the position of the platform. Even though this visual stimulus changes completely depending upon the direction in which the rat is facing (they don't have to approach the platform from the same direction each time). Even though the visual stimulus associated with other positions in the pool is virtually identical to that at the platform. Even though rats are not very visual animals (they rely primarily upon olfaction and tactile sensation (remember the whiskers!)). How the hell was the rat able to figure out that some subtle variation in the pattern of light falling on its retina indicates a nice dry spot to chill out, whereas almost identical patterns would leave it treading water until it drowned from exhaustion? (Of course, as the benevolent experimenter, you would rescue our friend the rat before it met its untimely demise in a kiddy-pool of milky water. Right? Right!?!?!)

Neither neuroscientists nor computer scientists have a convincing answer to this question. If you've ever tried to use voice-recognition software on a computer, you are familiar with how bad computers are at processing sensory input. The reason you've never even seen comparable computer vision software is that it's even worse. State-of-the-art algorithms can recognize perhaps dozens of different categories of objects, but they are much less nuanced in their discriminations than a rat. For instance, most such systems are baffled by object rotations and partial occlusion. The brain, in contrast, detects the necessary high-order correlations with such ease that you don't even realize how difficult the task is.

Saturday, September 22, 2007

Remember Clive Wearing

Every introductory course on neuroscience or psychology makes mention of H.M., a man whose medial temporal lobes (including the hippocampus and surrounding structures) were removed to treat drug-resistant epilepsy. After the operation, H.M. lost the ability to form new long-term memories, as well as most of his memory for the ten or so years before the surgery. (Ironically, H.M. has made a greater contribution to human knowledge than all but a handful of professional scientists, but he will never be able to appreciate the impact he's had.) H.M.'s case is canonical because it is the only instance where the connection between brain damage and subsequent amnesia is so clear. Indeed, no surgeon would have taken such drastic action had they known the effects, and no equivalent operation has been performed since (at least on humans). But H.M. is not alone in suffering from anterograde (can't form new memories) and graded retrograde (loss of memories of the recent past) amnesia. Korsakoff's amnesia is brought on by thiamine (vitamin B1) deficiency, primarily in alcoholics who derive a large percentage of their caloric intake from alcohol. The hippocampus can also be damaged by stroke, hypoxia, and infections.

One of the most acute known cases of amnesia is that of Clive Wearing (chronicled recently in this article by Oliver Sacks), whose temporal and frontal lobes were damaged by herpes encephalitis. Unlike H.M. and other amnesiacs, Mr. Wearing remembers nothing at all; his entire experience is restricted to the minute or two available through working memory (the short-term memory which underlies active thought processes, often believed to have a capacity of 7 +/- 2 "chunks"). Every time Mr. Wearing is distracted, he awakens to an entirely new and unfamiliar reality. For years, Mr. Wearing has kept a diary. Each entry contains the current time and a record of the profound realization that he is now, for the first time, alive and conscious. He then notices the previous entries. Pages of them. All making the same claim. All in his own familiar handwriting. All written by some unremembered stranger. He goes back, systematically crosses out these false entries, and underlines the current entry. The first true entry. He sets the diary down, glances out the window, and awakens for the first time.



Mr. Wearing has more to teach us than the dependency of memory formation and access on certain brain structures. His condition is not so different from our own. Consciousness is inextricably tied to the present moment. Our past and our future belong to other people. People who occupy the same body, in the same world, but who are tied together only by the hallucination of memory.



Apophenia

Humans are horrible at noticing unexpected connections between large numbers of variables. When I was living in LA, driving took on an added measure of excitement during a heavy rainfall. Not because of reduced visibility or the slickness of the road as months of impacted dust and oil were finally loosened and left to form a thin, low-viscosity film over the asphalt. Rain has those effects on roadways across the world. In New York or Boston, the dangers of driving in the rain are hammered into new motor vehicle operators before the first drop of precipitation taps their moving windshields. Southern Californian drivers, in contrast, are faced with rain so infrequently that the subject seems to be passed over in high school driver's ed. Many are slow to discover the relationship between rain and reduced traction on their own. Even the occasional skid while blasting through tight curves at 60 mph or faster doesn't seem to clue them in to a possible causal interaction. On the evening after a heavy rainfall, there is inevitably a car or two lying inoperable on the sides of the 110. The stories I heard about cars spinning a full 360 in the middle of the roadway were even more disturbing.

Another example: One summer in college, I was baffled by the seemingly unpredictable variations in my level of motivation when working in the afternoons. Some days I was hungry and tired and disengaged as the afternoon wore into the evening. Other days, I was consumed by my research and barely thought about dinner. I thought the difference might be due to some aspect of my diet or sleep pattern, but I couldn't find any consistent covariations. I even considered the possibility that I was going into ketosis (which of course is patently absurd). Only after the fact did I come to realize that the difference must have been due to the color of the coffee pot. I had always assumed that caffeine had relatively little effect on me and drank coffee because I liked the taste. So I didn't really pay attention to whether I poured my coffee from the brown (caffeinated) pot, or the orange (decaffeinated) pot. In retrospect, I was as naive as the LA drivers who refuse to ease up on the gas pedal during a downpour.

Which brings us to the present day. Up until last Tuesday, I was in a multi-week funk, feeling not just tired, but less conscious than usual. I was sleeping long and deeply, but it just wasn't doing me any good. Philosophers refer to entities that behave just like people but which aren't conscious as zombies. That's what I felt like, perhaps with the addition of a little homunculus sealed off behind one-way, sound-proof glass, allowed to watch the proceedings but unable to exert any control. Then on Tuesday, the funk magically lightened. I can't say I feel 100%, but it's certainly better. What brought on this descent into the voodoo nether-world? What precipitated my slow return? Here we are once again faced with a superflux of variables and no sign of a correlation. I'm sure there's a simple explanation, but I'm equally sure that I will never find it. My best guesses are:

a) After spending a week or two going all hard-core on the German learning, I gave up again. I can't remember the citation off the top of my head, but I'm pretty sure that REM sleep has been shown to be correlated with language learning. That is, immersion in a foreign language increases the amount of REM sleep, and the magnitude of this increase is correlated with the amount of learning. Perhaps my foolish attempts to learn the local language were saturating my REM sleep time, leaving me semi-sleep deprived despite eight hours in bed.

b) The gym finally reopened. It had been closed for the previous two weeks. For cleaning. In the US, the cleaning would have been done at night. Or one section of the facility would have been closed off at a time, leaving the rest functional and open. I can only assume that a phalanx of temporary employees was brought in, issued toothbrushes, and spent ten eight-hour days on their hands and knees, scouring every surface. Either that or they sealed off all the entrances and flooded the building with dilute hydrogen peroxide. When I was finally allowed back in Monday night, the gym was indeed clean. But it was clean before they shut down. Which is to say, I couldn't see that their extensive efforts made any difference. Regardless, I was finally able to lift heavy things and put them down again, after two weeks of just running in circles. This change in my exercise routines may have directly or indirectly affected my energy level.

In the end, I'm probably just an obsessive-compulsive hypochondriac, and it was all in my head to begin with. But the point remains that people are bad at detecting unexpected correlations.

Monday, September 17, 2007

Coffee machine: The legend continues

So I managed to get to the Migros on Saturday with a few minutes to spare before the 5pm closing time. I don't think my roommate bought my story that the coffee machine's great aunt was a phoenix, and that it was consumed in a burst of flame and born anew from the ashes. The slag of its predecessor in the sink might have tipped her off.

On Sunday, I set out to make my first cup of coffee with the gleaming new aluminum marvel. I even managed to read the instructions in German only to discover that, for the past year, I've been packing in the coffee grounds too tight and brewing the coffee over excessive heat. With my ground coffee loosely set in the filter and the flame turned down to medium, I awaited the black nectar that would soon gush from the top of the spout. Except that it didn't. Rather, it bubbled out of the side of the machine and pooled on the top of the oven. I did manage to extract enough juice to have my morning cup of coffee, but the residue in the bottom of the pot revealed a rift extending along perhaps a quarter of the bottom edge of the pot.

Made in Italy. Worthless. Apparently, Italian aluminum is very delicate and tears when subjected to too much force. Even when it is a couple of millimeters thick.

So I'm now on my third coffee machine in three days. Folk wisdom holds that this attempt should be charmed, but I'm not holding my breath.

Saturday, September 15, 2007

Swiss apartments don't have smoke detectors

One of the things you don't realize living in America is how new everything is. A country that is less than 250 years old, and which has been subject to constant rapid growth, necessarily has little in the way of old buildings. As a result, few current buildings have been grandfathered past modern safety regulations. As a case in point, I can't think of a single building I've visited in America which didn't have smoke detectors. In just about any kitchen in the country, were you to leave a two-part screw-together espresso machine on the burner after the coffee had finished sputtering out of the top, the discordant shriek of a smoke alarm would soon alert you to your folly.

In contrast, in Switzerland, were you to start a cup of espresso and retire to your room, the first signal to wake you from your obliviousness would be the scent of something burning slowly diffusing through the crack under the door from the billowing clouds of smoke out in the hallway. You might then rush into the hazy kitchen to find the top of the stove covered with a crust of long-since dried coffee, the knob on the top of the coffee machine melted into a Dali-like parody of itself, and the plastic handle simply gone. You would be forced to conclude that under the influence of extreme and prolonged heat, seemingly durable plastic handles can disintegrate outright.

Switzerland would then place you in an additional bind, as your hankering for coffee might have hit at around 4pm and all of the stores close for the weekend at 5pm, leaving you with 15 minutes to ride your skateboard (your bike having been in the repair shop for the past two weeks; quality work apparently takes a lot of time) to the store to buy a replacement. Mach schnell!!!

Sunday, September 9, 2007

All for naught

Lately, I've had a hankering for a martini. I already had a bottle of gin, so this weekend I went to the supermarket to purchase some olives and a bottle of vermouth. Dry vermouth. The olives were easy, as you would expect, and the alcohol counter had four different types of vermouth. I have generally assumed that vermouth comes in two different varieties: dry (for martinis) and sweet (for leaving on the store shelf). Swiss supermarkets apparently split the world along different lines. Their display distinguished between rot (red) and weiss (white) vermouth. Rather than plunge headlong into the unknown and wind up with something undrinkable, I confronted the salesperson, in German no less, and asked whether any of the bottles contained dry vermouth. After a minute or so of mutually semi-intelligible mumbling, she decided that she didn't know, and asked another clerk. Another minute of garbled German later, and a third clerk was paged to the alcohol counter. And then paged again after five minutes of standing around uselessly. After he managed to communicate his unfamiliarity with the distinction between sweet and dry vermouth, I just bought a bottle marked Martini & Rossi, white. I know that Martini & Rossi makes the canonical vermouth, and I've never put anything red in a martini, so by the process of elimination, this seemed like a reasonable choice. Little did I know that the Italian company labels their sweet vermouth "bianco," and that in a country where an order for a "martini" in a bar produces a glass full of vermouth, rather than a glass of gin over which a closed bottle of vermouth has been quickly passed, supermarkets don't even carry dry vermouth, at least of the Martini & Rossi brand. Why, I am forced to ask, do I waste any time trying to learn German when it proves to be of so little use in practice?

In other news, Terminator 2 in German sounds completely and totally wrong. Arnold's Austrian accent is well enough preserved, but carefully crafted pithy phrases like "I'll be back" just don't translate.

Saturday, September 8, 2007

Clashing subcultures

Last night, I went to a goth event at Dynamo, a "youth culture house." Dynamo has two spaces for holding music-based events: a large hall suitable for a few hundred people on the fifth floor, and what looks like a cave carved out of the basement, which can accommodate perhaps 75 people without risk of suffocation or trampling. The entire complex seems to be maintained by an ouroboros of angry teenagers and twenty-somethings, the youth at the head constantly consuming the jaded veterans. In the main basement room, the walls of rough-hewn stone curve up towards the ceiling like a large train tunnel accidentally lost in subterranean Zurich during the industrial revolution. Sophisticated modern club lights are bolted directly to the unfinished ceiling. They flash and twirl mindlessly, without regard to the tempo or character of the music. In the place of a fog machine, a box fan or two hum in the corners to reduce the risk of hypoxia. I hope that the smaller girls will serve as the canaries in this mine, but my manic dancing probably means that I will be the first to fall.

Dynamo has goth nights approximately twice per month, produced by various groups, and invariably confined to this basement. The building is set into the side of a steep river bank, so when walking to the dungeon, I pass by the upper floors. Last night, the walkways and stairs leading from the street down to the basement were covered with what looked like an international punk convention. In an affluent, mild-mannered place like Switzerland, you'd think that the youth would have relatively little to rebel against, but comfort and security often breed contempt. The Swiss equivalent of America's white middle-class suburban gangstas seems to be a carefully preened pseudo-punk, replete with faux-hawk, tight jeans, and Converse All-Stars. This forces the counter-culture to go even further. In 100 feet, I saw enough metal both wrapped around and pierced through the massed bodies to outfit a small hardware store. Dyed hair and mohawks were definitely the order of the day. Few articles of clothing had been spared intentional damage and reconstruction.

The grand irony of this brief saunter through punk never-never-land is that the punk aesthetic is not so different from goth culture. Both forms of music share common roots. And yet, there was an unmarked but blindingly obvious line dividing the punks from the goths. Even middle school cafeterias permit less rigidly demarcated social stratifications. When I left for the night at 3am, there were still clumps of punks milling around Dynamo. I imagine that their concert ended hours before, but since public transportation would not shake off its nightly torpor for another three hours, they were just smoking cigarettes while watching for the dawn. I saw only a single punk breach the line and wander into the basement, where it was warm and the beer was still flowing.

Friday, September 7, 2007

Every week is fashion week

I read the New York Times web site like holy writ. Frequently, coverage of some fashion show or another will make its way to the front page of the web site. I'm a sucker for anorexics, so I generally take the bait and click the link. Of course, being the sort of person who reads the entirety of every blurb in a museum (even if it's an art museum (even if the text just lists the artist, year of creation, and the ever-popular name "Untitled.")), I actually read the article that accompanies the pictures of expensively dressed waifs and wastrels. So far as I can tell, though, these articles are as free of content as the carefully coiffed heads of the models.

The world of fashion seems to be predicated on some notion of progress, where styles constantly change and evolve. Certainly, there is some basis for a claim of development in other visual art forms. The style of painting exhibits clear changes from decade to decade, and while I am no art critic, I can appreciate some of the evolving theoretical ideas which underlie the changes in appearance. These runway shows, in contrast, feature the same basic design year after year. Shirts are still shirts. Pants are still pants. The commentators will note that wool is a common trend this year. Last year there was very little wool. But seven years ago everything was wool. Does this make wool new or innovative?

What's more, I cannot for the life of me figure out whether anything motivates the cut of the sleeve of a dress or the choice of lining for a coat other than pure aesthetic preference divorced from deeper metaphor or meaning. I mean, a designer presents a line of coats which neither keep the wearer warm nor allow them a full range of vision. Fantastic. But why? Is this commentary on the position of women in society? Or the relationship of the individual to the technological and cultural artifacts with which we surround ourselves? Because it looks really weird and doesn't really function as clothing in the conventional sense. Even if you could manage to walk around in a high fashion outfit without tripping or running into a wall, your progress would be impeded by the baffled masses gawking at such alien tactics for covering your nakedness. If runway clothing is so impractical that it cannot be worn outside of a fashion show (hence the necessity of distinguishing "ready-to-wear" fashion from its more cumbersome brethren), then surely these aesthetic flourishes should serve some higher purpose. Please enlighten me.

Wednesday, September 5, 2007

Hot new antidepressant: LSD

On page 1218 of Kandel, Schwartz, and Jessell's canonical Principles of Neural Science (4th ed), a diagram describing the "Action of antidepressant and other drugs at serotonergic and noradrenergic synapses" contains the following annotation for the post-synaptic 5-HT receptor: (Antidepressant) "stimulation of 5-HT receptors as partial agonist (lysergic acid diethylamide)." And in the caption: "Lysergic acid diethylamide (LSD) acts as a partial agonist at postsynaptic serotonergic receptors in the central nervous system. A number of specific compounds are now candidates to act as receptor-blocking agents at various serotonergic synapses."

Jesus Christ! My understanding is that most psychedelic substances are agonists of the 5-HT2a receptor. In fact, I think I've read some papers which used 5-HT2a stimulation (in the rat aorta of all places) as a test of hallucinogenic activity. But I've never thought of these substances as anti-depressants, despite their common activity in the serotonin system. Of course, antidepressants (all of which increase the availability of serotonin in the synapse, at least initially) take weeks to have clinical effects despite their immediate impact on the serotonin system (well, at least SSRI's and the tricyclics have immediate effects), so it's unlikely that a substance that is washed from the body in a matter of hours would have an effect on depression due to its serotonergic activity.

Sunday, July 1, 2007

SUV drivers for a greener tomorrow

A week or two ago, my lab received a fresh shipment of printer paper. The boxes were labeled "bright white." The contents, made from recycled paper, were more of a mouldering yellow. They felt like newsprint, shriveled when exposed to even a drop of water, and reduced the apparent contrast of color figures. I'm all in favor of saving the earth, but the voluntary actions of individual consumers are all but irrelevant. Such gestures of self-deprivation are comparable in efficacy to wearing sackcloth and beating your back with a switch. In the face of billions of other individuals who don't personally choose to save the planet by denying themselves consumerist pleasures, any single person's purchasing decisions have a negligible effect. Moreover, the real damage to the planet can come from surprising places. If I read a document printed on recycled paper while eating an orange imported from Spain and grown with inorganic fertilizers and pesticides, how does the damage incurred in the process of producing and shipping that orange compare to that saved by using recycled paper rather than beautiful fresh white paper? My seat on a flight home from Zurich to New York City produces about 1.7 tons of carbon dioxide. How many reams of white paper would I have to use to place a similar burden on the environment?

This is, once again, a case of the tragedy of the commons. So long as our economic system remains structured so that it is in each individual's personal interest to live in an environmentally unsustainable manner, no amount of personal sacrifice will prevent human activity from fundamentally scarring the world in which we live. In fact, attempting to reduce your own impact may make the problem worse in the end, since it masks the full impact of current policies. Nonlinear systems can exhibit strange dynamics. If it is clear that the earth is currently spiraling towards destruction, it will be easier to convince politicians and populaces across the globe to implement broad policies to moderate human impact. A slow descent allows the nay-sayers to invoke the paradox of the heap. If any single year of wanton consumption only pushes us incrementally towards the brink, then we can safely wait until next year to implement strict controls on energy consumption, recycling, greenhouse gases, and the like. If there is any doubt regarding the severity of the problem, then the same people who deny the existence of global warming today will continue to bury their heads in the sand until environmental armageddon sweeps them from their feet. So! Drive an SUV for a greener planet today!

Saturday, June 30, 2007

A taxonomy of Swiss dance styles

I'm a pretty committed club-goer. Except under extenuating circumstances, I'm at X-TRA's More Than Mode every week, and I generally sojourn out to Abart or Dynamo when they're having a goth event. I've also spasmed in time to a beat at Buddha Bar, Garufa, Hive, Labitzke, Mascotte, Supermarket, and Tonight. One of the early manifestations of my culture shock after moving to Switzerland was my surprise at the striking difference between my habitual dancing style, honed in the clubs of Boston and LA, and that of the native Swiss. Although it's now clouded by a year of forgetfulness and conflicting experiences, I remember many of the black-clad masses in both Boston's and LA's goth clubs as practicing a consistent and distinctive dance style, featuring sharp and violent arm motions mostly from the elbow. Something on the order of this. Admittedly, it doesn't have the same effect when performed in your parents' basement, but you get the idea. In particular, there was one clique of corpse-painted Spanish-speaking guys decked out in leather and spikes at the ironically named Das Bunker whose dancing looked like a fight scene from a Kung Fu movie remade for the Dark Ages. I also recall a sizable contingent of club-goers who preferred a more emotive style of dance, perhaps philosophically akin to ballet. Their dancing seemed to express the emotional content of the music, sometimes going so far as to act out the lyrics. Regardless of the details, no one was afraid to move around, and half the fun of going out was watching everyone else.

The Swiss, in contrast, are more reserved, both physically and emotionally. A substantial fraction of the people on the dance floor just rock from side to side. The slightly more creative will take three steps forwards and three steps back. Sometimes with a reckless disregard for anyone who might have strayed into their path. I think this particular style may even be enshrined in a song, but it's in German (something like "drei Schritte vor und drei zuruck") and my google-fu is not up to the task. Occasionally (at establishments playing electronic rather than goth music), someone will throw their arms in the air as if they were gesticulating with pistols held sideways, gangsta-style. These people are not gangstas. And then there is the frightening menagerie of truly atrocious dancers. Like the guy who wheels around the dance floor like a fencer set free from the piste, all the while waving his arms like a conductor counting out 4/4 with a baton. Or the fat balding guy who violently rocks back and forth while smoking a pipe.

There are a few exceptions. One couple at X-TRA wears long skirts and has perfected a style heavy on pirouettes which pull them into elegant motion. Another couple has managed to develop an expressive and fluid technique that wouldn't seem out of place in a US club. Other than those four exceptional cases, and a few other competent dancers, I'm in a sea of people performing the sort of ur-dance known intrinsically to every five-year-old. The sort of dance executed by wall-flowers when told that they just need to move to the music. Maybe the difference is that American children of my generation were raised on a diet of American Bandstand and MTV. Michael Jackson's moonwalk permeated our lives as much as his music. City streets were filled with kids break-dancing on sheets of cardboard. Dancing was intrinsically understood to be as much a public performance as a form of personal expression. Then again, Swiss beer is almost uniformly bland, in contrast to the bolder traditions of many of its neighbors. Maybe Swiss dancing is similar.

Friday, June 29, 2007

SCOTUS can suck it

The Supreme Court has gone off the deep end. SCOTUS has recently come to the learned conclusion that students are not permitted to exercise their right to free speech if their message contains any reference to mind-altering substances. Quoting from a Times article to which I can't seem to generate a stable link,
"In light of the history of American public education,” Justice Thomas said, “it cannot seriously be suggested that the First Amendment ‘freedom of speech’ encompasses a student’s right to speak in public schools.”

That's right. The mission of American public education is to tell students what to think, to indoctrinate them with the prevailing beliefs of the day, rather than to teach them to use their own powers of rational thought. This meshes perfectly with the Bush administration's push for a unitary executive, with powers trumping those of the other branches of government. In both instances, America is being rendered vulnerable to a tyrannical majority which seeks to impose its values on the entirety of society. Long gone are the days of a pluralistic culture, embracing everyone's individual perspective and favoring none. Indeed, in yet another blow against an open society where ideas are freely and universally exchanged, the Supreme Court has deemed active desegregation through race-conscious school admission programs unconstitutional. And what better way to cement the control of those already in power than to repermit them to bias elections by saturating the media with advertisements immediately before balloting? The McCain-Feingold act has been rendered more porous than the legal arguments supporting the torture of "enemy combatants." With television, radio, print, and internet approaching nitrogen in their ubiquity, allowing the wealthy to suffocate the populace with a self-serving message will further drown out the voice of the common man. When the unitary executive does trample the constitutional guarantee of freedom from state-imposed religion, SCOTUS joins the cheering section and announces that the proletariat has no standing to challenge such abuses in court. And as a rancid cherry on top of this foul four-scoop sundae, the Supreme Court has ruled that manufacturers and distributors who forbid discounting and set minimum price floors do not necessarily violate the Sherman Antitrust Act. An independent judiciary is a fine thing, but only when it is committed to being (a) not stupid and (b) not evil. Our new chief justice and his conservative bloc seem to be failing at least one of these two key tests.

Monday, June 25, 2007

Pet Peeve #115

People who use "three-dimensional" graphs. Paper is two dimensional. There is no way to represent a general black-and-white three-dimensional object using a single black-and-white two-dimensional figure. Even if you use Matlab. Even if you use wireframes or shading. Even if you really, really want to.



You are encouraged to use color to create an artificial third dimension. Back when I was writing real-time spike analysis software for electrophysiology recordings (read: stick wires into rat brains; try to extract the signals of individual neurons from the muck), I tried using the red and green color channels to represent the two dimensions of our four-channel tetrodes which wouldn't fit on the (by definition) two-dimensional scatterplots. Note that the human eye contains three (3) distinct color receptors, so this strategy unambiguously encodes the desired information, although it can be a bit difficult to decipher visually. I thought it worked pretty well. My adviser thought it looked "unprofessional"; i.e. unlike the stupid commercial system we were sacrificing a year of our lives to replace. Meh.
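
For anyone who wants to try the same trick, here is a rough sketch of the idea in Python with matplotlib (not the original software, and the tetrode "data" is just random numbers): channels 1 and 2 provide the scatterplot axes, while channels 3 and 4 drive the red and green components of each point's color.

```python
import numpy as np
import matplotlib.pyplot as plt

# Fake tetrode data: peak spike amplitude on each of four channels.
rng = np.random.default_rng(0)
amps = rng.random((500, 4))

# Map channels 3 and 4 onto the red and green color components (blue stays 0),
# so all four dimensions appear in a single two-dimensional scatterplot.
colors = np.column_stack([amps[:, 2], amps[:, 3], np.zeros(len(amps))])

plt.scatter(amps[:, 0], amps[:, 1], c=colors, s=10)
plt.xlabel("channel 1 amplitude")
plt.ylabel("channel 2 amplitude")
plt.show()
```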

If you're some sort of crazy chemist (e.g. Gylbert, 1973 or Hackert and Jackobson, 1971) and want to use a stereogram, that's pretty swanky, but you'll need to depend upon the ability of your audience to freely rotate their eyeballs in their sockets. Maybe the format is standardized and they give out stereoscopes at chemistry conferences the way they used to give out those horrible red/blue cellophane glasses for 3-D movies. In any case, a single two-dimensional view of a three-dimensional object is necessarily ambiguous. Computers let you do all sorts of things you really ought to avoid. Powerpoint presentations, for instance. The 80's are dead. Get over it.

For the love of god and all that is holy, Earth's Hex: Or Printing in the Infernal Method is utterly sublime

For one reason or another, it's been a month or two since I've listened to any latter-day Earth. Sunn O))) has found its way onto my playlist regularly. I've experienced A Bureaucratic Desire for Revenge (Parts 1 and 2) within the past week. But Pentastar and Hex have been the subject of unconsidered neglect.

Friends, please allow me to warn you against such a thoughtless and ultimately self-destructive course of action. Listening to Hex as I type, I am all but overcome by the sumptuous textures layered one on top of another in this album. Earth is a group of traditionalists, and Hex is constructed using only the standard guitars and drums, but through a miracle of ingenious recording techniques, they coax lyrical and organic voices out of these commonplace tools. As their name would suggest, Earth eschews the ethereal; their instrumentation is not evocative of the angelic or the demonic. Rather, you can almost feel the dusty soil sliding through your fingers as the guitars peal, resonate, and sing. The mundane is the sublime. Earth elevates the coarsest, most visceral elements of physical reality to an exalted stature. Even a simple stone takes on the epic proportions of a grand monument to the shocking presence, the undeniable reality, of physical existence. These songs are hymns to the world at dawn, while the intentionality of the small scurrying creatures still sleeps, but the earth sits with open eyes, ever watchful.

Sunday, June 24, 2007

Utopias and artificial scarcity

Utopian literature is by and large a rather dull genre. Consider Edward Bellamy's Looking Backward, Aldous Huxley's Island, and Sir Thomas More's original vision. Many of the famous contributions were written before the failure of socialism demonstrated that human self-interest is more powerful than any community, and later writers seem to conveniently develop selective amnesia regarding the failure of efforts to implement collective societies. In a particularly egregious instance of such intentional blindness, the psychologist B. F. Skinner produced fantastical visions where an end-run around human nature created out of each sovereign individual a worker bee primarily dedicated to the good of the hive, a cog in the larger machine. A key feature of these political, social, and economic fictions is the state of plenty, or at least sufficiency, which arises when everyone works for the common good and takes no more than they need. Work days are uniformly short, but while on the job, workers are focused and productive. No one resents offering the sweat of their brow for the benefit of others, nor do any strive to lift themselves above their fellows.

While flesh and blood humans may never be able to achieve these ideals, it strikes me that certain features of these utopias are within our grasp. Many of the products of our present information economy are kept out of reach of most people only because of artificial scarcity, enforced through intellectual property laws. Consider a few examples: only a minute fraction of the cost of most medications is due to the expense of their manufacture or distribution. Aside from a small percentage of substances of biological origin, once the effective agent and synthesis technique are known, most pills can be produced for pennies. India has taken advantage of this fact by rewriting their patent laws to cover the manufacturing process rather than the final product. As a result, Indian companies can produce substantially discounted generic versions of HIV and other medications by subtly changing the process by which the substances are made. I'm no expert on world economics and trade law, but the article linked above claims that these generics are sold for one twentieth to one fiftieth of the American price. Pharmaceutical companies shout themselves hoarse asserting that the high cost of their wares reflects the expense and risks of research, but most of the basic research which underlies medical innovation is performed at tax-payer expense by university laboratories. It is true that the clinical trials needed to prove the safety and efficacy of new drugs are time-consuming and expensive, but it is hard to believe that they could not be performed by the public sector. The academic world has produced a stunning system for motivating intelligent and industrious individuals with a carrot that isn't made of dollar bills. Given the opportunity and the right incentives, these institutions could turn their considerable intellectual clout towards producing medications financed and owned by the public.

At a more mundane level, consider the designer fashion industry. A $200 or even $500 pair of limited-edition designer jeans is not made of substantially different materials than the $20 jeans from Old Navy. They are not the product of superior workmanship. In many cases, they are made in the same overseas sweatshops. And most importantly, in many places in east Asia (and their outposts in major American cities), you can obtain knock-offs of these designer jeans for pennies on the dollar.

Finally, I would be remiss if I failed to mention that the substantial output of the movie, television, music, software, and art industries could be available to all for no more than the cost of a broadband internet connection. While the arguments regarding intrinsically electronic media have been flogged until their backs are raw and bleeding, it is relevant to note that many of the works of art which sell for thousands or millions of dollars could be easily mass produced and made available for everyone's living room. Indeed, many of the most famous contemporary artists operate factories where journeyman artists render into canvas and paint the vision of their popularly anointed overseers. These same artistic apprentices could produce their paintings at less than stratospheric prices if less value was accorded to the proprietary signature of their masters. Once again, east Asia has beaten the western world to the punch.

I recognize that simply eliminating intellectual property rights would wreak havoc on the American and world economy. But I think it is high time we seriously considered the cost of artificially restricting the distribution of goods that people want and need. In all of the cases described above, the majority of the costs passed on to the public consist of the expense of redeveloping a product which is already available, and then convincing the public that they cannot do without the new version. How many different selective serotonin reuptake inhibitors, boot-cut hip-huggers, or commercially created boy bands do we really need? Are there no better uses to which those financial and human resources could be put? What benefit do we as consumers derive from the manufactured desire for artificially scarce goods distinguished only by their expensive advertising campaigns? Government, and the rights it supports, is at least in principle of the people, by the people, and for the people. If the present formulation of intellectual property rights is no longer serving our collective interest, we can and should change them.

Saturday, June 23, 2007

Walk softly, but carry a big knife

When I was an undergraduate, I lived in an anarchic commune (colloquially called a fraternity) with a collective cooking arrangement. In return for cooking with two or three other people once a week, you could benefit from the bounty of everyone else's culinary adventures throughout the rest of the week. However, as a group of left-leaning college students, we did not count punctuality and responsibility among our defining characteristics, and it was not unusual for people to show up late for their cooking team. One afternoon, I dragged myself down to the kitchen promptly and began cooking on time, only to find all of my comrades-in-arms detained by what were certainly more pressing engagements. Like checking email. I was thus already feeling a bit antisocial when the siren which served as the doorbell for the rear door started blaring. Generally, this indicated that a resident of the house had either been too absentminded to remember their key, or too lazy to take it out of their pocket. Rather than put down the bloody knife with which I had been cutting meat, I took it with me as I trudged down the hallway to the back door. When I grumpily opened the door, I was greeted not by a blithe housemate, but by one of our well-dressed, well-to-do next-door neighbors. He had some extra tickets to a baseball game which he could not attend, and wanted to know if we could make use of them. He obviously wasn't expecting the door to be answered by a knife-wielding maniac. Ever so slightly mortified, I thanked him for the tickets and retreated back to the safety of my socially insulated abode, where one wouldn't think twice about answering the door, knife in hand. Until after the fact, of course. I strongly suspect this neighbor never again darkened our doorstep.

Which brings us to the present day. My lab has a small but functional kitchen, in which I have taken to preparing my dinners rather than cut my nocturnal work-day short or shift my schedule towards daylight hours. Despite having an oven, a range, and a reasonable selection of pots and pans, it lacks some obvious necessities. Specifically, although the kitchen has communal non-stick cookware, the only non-metal stirring implement is a spatula. I refuse to prepare pasta sauce with a spatula. Even more incongruous, there is an entire drawer full of knives. Many of which are relatively new. None of which are sharp enough to cut a vegetable except for one bread knife. This state of affairs is untenable. So when I went grocery shopping today, I picked up a two-pack of wooden spoons, at least one of which I will donate to the kitchen's collection, and a 25-franc Victorinox Cook's Knife. So far as I am concerned, this is about twice as much as one should pay for a single kitchen knife. Victorinox makes Swiss army knives with five times as many blades for the same price. But this knife should make even the most recalcitrant carrot feel like butter.

I will not be donating this knife to the communal kitchen collection. This knife will live on my desk, where it will remain sharp and shiny for the duration of my PhD. Moreover, should anyone come to me with objections to one of my papers, I will have 19 centimeters of sharpened steel fury with which to drive home my point. So to speak.

Music search technique

For many moons, my main strategy for finding new music has been to go prowling through the internets, looking for groups with interesting names and reading the reviews on the Metal Archives. Back in the day, I turned to such inconsistent sources as Satan Stole My Teddybear and shoutcast streams. The reviews on SSMT by John Chedsey are evocative, often humorous, and generally reflect good taste, but the other reviewers are not of equal quality. Moreover, the small number of non-professional reviewers on SSMT limits their ability to cover the full breadth of the relatively obscure black and doom metal scenes. I owe a karmic debt to the internet radio station on which I first heard Opeth, catapulting me headlong into the world of dark metal, but the streams that do play the heavier incarnations of metal are generally broader in their scope than my musical interests, and skew towards orthodox bands rather than progressive fare. The Metal Archives, for all the inconsistency in the literary quality of its user-generated content, is stunningly complete and generally accurate.

Investigating bands based upon their names is surprisingly effective. Consider the following list of black metal bands with names beginning with 'forest,' taken from The Metal Archives:
Forest (Cze) - Black Metal
Forest (Pol) - Black Metal
Forest (Rus) - Black Metal
Forest Nocturne - Melodic Black Metal
Forest of Castles - Black Metal
Forest of Demons - Black Metal
Forest of Doom - Black Metal
Forest of Evil - Black Metal
Forest of Fog - Black Metal
Forest of Impaled - Black/Death Metal
Forest of Souls - Black Doom Metal
Forest of Triglav - Black Metal
Forest of Witchery - Black Metal

Of course, a band like Forest of Shadows does not make this list, because the Metal Archives classifies it as doom metal rather than black metal, nor does it include bands whose names include but do not begin with 'forest.' And don't forget the 50 black metal bands whose names begin with 'funeral.' These names are more formulaic than whatever bubble-gum pop hit is currently contaminating the airwaves. As a testament to the predictability of metal band appellations, you can algorithmically generate the names of your next dozen musical enterprises at the Metal Band Name Generator.
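
In fact, you barely need the generator: two short word lists and a random draw get you most of the way there. Here's a toy sketch in Python; the vocabulary is entirely my own invention, not whatever the actual Metal Band Name Generator uses.

```python
import random

# Invented word lists -- not the actual generator's vocabulary.
OPENERS = ["Forest", "Funeral", "Frost", "Throne", "Wolf", "Moon"]
CLOSERS = ["of Shadows", "of Fog", "of Sorrow", "Eclipse", "Winter", "Mist"]

def metal_band_name(rng=random):
    """Return one formulaic black metal band name."""
    return f"{rng.choice(OPENERS)} {rng.choice(CLOSERS)}"

if __name__ == "__main__":
    for _ in range(12):  # your next dozen musical enterprises
        print(metal_band_name())
```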

Recently, though, I've stumbled on a much more efficient technique for finding new music. I call this strategy Southern Lord. Consider what I hope we can agree are the three best drone doom bands in existence (in alphabetical order): Boris, Earth, and Sunn O))). Further consider the quality black metal groups Wolves in the Throne Room, Xasthur, and Nortt. All are presently or have in the past been distributed by Southern Lord. My present project is thus to go through the entire Southern Lord roster and sample the wares. The giant record labels may have outlived their usefulness, but there is still a need to winnow the wheat from the chaff of the ever more prolific global music scene, and Southern Lord seems to be doing a pretty good job.

Thursday, June 21, 2007

Wolves in the Throne Room - Self-titled demo

Wolves in the Throne Room is the realization of the promise of American black metal. Freed from the straitjacket of narrow-minded expectations which lovingly enfolds the Scandinavian scene, WitTR manages to be true, rather than trve. They have traded corpse paint and church burning for an organic, back-to-the-land ethos which approaches the real spirit of black metal, with its frequent folk influences. This is existentialist music: the songs scream in rage that the world is fundamentally flawed, that human effort is ultimately futile, but that no alternatives exist. They accept the impossibility of meaningful action, but answer with a cry of defiance and perseverance. In this simultaneous denial and acceptance of the absurdity of life lies the closest thing to redemption.

Their first demo is saddled with the sort of low fidelity production which defines early Darkthrone. Lesser bands may intentionally use a subtle background hiss, flat drums, and inaudible bass to build "atmosphere." On a Wolves in the Throne Room album, this lack of clarity merely obscures the stark beauty of the underlying music. Were it not for the almost total lack of bass, the furious drumming on this album would take on a crushing heaviness. The double-bass in particular is used to good effect, menacing rather than thin. Indeed, this entire album has a heavy intensity which is only accentuated by the counterpoint of the vocalists' raspy delivery.

While technically proficient, Wolves in the Throne Room's debut effort sometimes demonstrates a frustrating immaturity in the composition of the songs. There are transcendent passages where everything clicks, and the listener is carried away by soaring, haunting melodies. The band occasionally breaks into an unexpected but effective jazz-influenced solo format, with one guitar taking flight over a repeated melody. But there are intervals where tedious repetition is confused for atmosphere. The band is not as tight as in its later incarnations, and the listener is left with the distinct impression that a few more rehearsals would have yielded a superior final product. Certainly nothing compared to the debacle that was In The Woods...'s final live album, but the slips in synchrony and missed notes are irritating on a careful listen.

On the whole, this demo is a clear portent of the band's promise, and a worthy opus in its own right. Though unrefined, its successes outweigh its failures, and I've found myself playing it more often than not over the past few days. Wolves in the Throne Room's signature melodic structure is already well established. Aside from a few rough patches, the songs are generally engaging and often truly beautiful.

Tuesday, June 19, 2007

My brain is not a Christmas tree

There are few things that irk me more than the way in which fMRI studies are reported in the popular press. Invariably, the article refers to parts of the brain lighting up in a Christmas-tree-like fashion. There was no Christmas tree in my house when I was growing up. There certainly isn't one in my head. More importantly, this phraseology seems to imply that the brain regions in question were:
a) not active before the experimental manipulation which triggered the festive lighting, and
b) in some sense uniformly "activated" by the stimulation.
Even the revered journal Science has adopted this misleading nomenclature in its lay publications (don't you love ecclesiastic language used to describe academia? I'm a monk of science):
Ever flinch at the sight of an actor being punched in the face? The reason is that neurons in the brain light up when we watch others suffering.

No, no, no (and not just because the Mean Monkey doesn't care if you are sad).

The first implication (conveniently denoted (a) above) is completely wrong, but in a simple way. Neural activity occurs constantly throughout the brain. Urban legends would have you believe that only some small fraction of the brain is actively used. While it is true that many neurons only fire action potentials infrequently, information is carried in their silences as well as their action potentials. If a neuron spiked without pause, it would transmit no more information than if it were constantly inactive. Moreover, at any given moment, there are neurons active in every part of the brain. Even the parts that fail to convey their wishes for a Happy Holiday in fMRI pictures. If you close your eyes, neurons continue to fire in the primary visual cortex. The primary visual cortex is even active in blind people. fMRI measures relative increases and decreases in activity; the baseline is never zero.
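
If you want the silences argument in slightly more quantitative form, here is a minimal sketch in Python (the firing probabilities are made up for illustration): treating each time bin as a binary spike/no-spike symbol, the entropy per bin, which bounds how much information the train can carry, is maximal when spikes and silences are both used and zero when the neuron fires always or never.

```python
import math

def binary_entropy(p):
    """Shannon entropy (bits per time bin) of a neuron that spikes
    with probability p in each bin -- an upper bound on the
    information its spike train can carry."""
    if p in (0.0, 1.0):
        return 0.0  # always silent or always firing: no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Illustrative firing probabilities only.
for p in (0.0, 0.05, 0.5, 1.0):
    print(f"firing probability {p:.2f}: {binary_entropy(p):.3f} bits/bin")
```

A neuron that fires in every bin is exactly as uninformative as one that never fires at all; the bits live in the variability.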

The second implication ((b), if you've been following along) is more pernicious, because the underlying reality is a bit more complicated than the whiz-bang notion of "lighting up." fMRI's blood-oxygen-level dependent (BOLD) signal measures the slightly counter-intuitive increase in oxygenated hemoglobin when the metabolic requirements of a brain area increase. (Presumably, this is triggered by a homeostatic mechanism which senses the increased oxygen consumption and dilates blood vessels accordingly. The brain is all tricky like that. Don't even ask how it manages to maintain a reasonable connection strength at each synapse in the face of constant potentiation and depotentiation.) This signal is best correlated with the local field potential (the low-frequency component of electrode recordings due to the average synaptic activity over a span of hundreds of micrometers), rather than the actual spiking activity of the neurons in the area. The upshot of this is that fMRI represents the inputs to a brain region, not the local activity.

That a metabolic measure signals input rather than output is reasonable from a biophysical perspective, since relatively few ions move across the axonal membrane in the process of transmitting an action potential. Most of the axon is covered by a lipid-rich myelin sheath which blocks the flow of ions and decreases the capacitance, allowing the action potential to be transmitted quickly between the gaps in the myelin (known as the nodes of Ranvier, which is also the name of a metal band of questionable quality). In contrast, neurons have giant dendritic trees which are subject to a constant barrage of neurotransmitters, most of which cause ion channels to open. When ion channels open, ions flow through them passively along their electrochemical gradient, reducing the strength of the gradient. Thus, when the amount of input increases, more energy needs to be expended to move the ions back against the gradients, hence the increased need for oxygen.

Now that I write this, I'm not entirely convinced by this justification of the coupling between input strength and oxygen utilization, since although the total ionic flow is much greater in the dendrites than the axon, it's still very small compared to the total ionic content of the neuron. You could cut out the ionic pumps and the cell would be fine for hours or days, or so I'm told, in which case there's no need to immediately increase the amount of available oxygen so the ionic pumps can be run in overdrive. However, it's possible that while the cell as a whole would not lose its overall negative charge were the ionic pumps shut off briefly, everything would go out of equilibrium in the dendritic tree. The branches of the dendritic tree are really small, so the total number of ions in a dendritic spine or branch is not very large. Even the relatively insignificant ionic flow due to synaptic activity may be enough to knock around the ionic concentrations in such small volumes.
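
Here is a rough back-of-envelope version of that worry, in Python. Every number in it (spine volume, ionic concentration, synaptic current, event duration) is an order-of-magnitude value I'm assuming for the sake of argument, not a measurement.

```python
AVOGADRO = 6.022e23          # ions per mole
ELEMENTARY_CHARGE = 1.6e-19  # coulombs per monovalent ion

# Assumed, order-of-magnitude numbers:
spine_volume_liters = 1e-16       # ~0.1 cubic micrometer
ionic_concentration_molar = 0.15  # ~150 mM of monovalent ions
synaptic_current_amps = 10e-12    # ~10 pA synaptic current
event_duration_s = 1e-3           # ~1 ms

ions_in_spine = ionic_concentration_molar * spine_volume_liters * AVOGADRO
ions_moved = synaptic_current_amps * event_duration_s / ELEMENTARY_CHARGE

print(f"ions in the spine:    {ions_in_spine:.1e}")              # ~1e7
print(f"ions moved per event: {ions_moved:.1e}")                 # ~6e4
print(f"fraction per event:   {ions_moved / ions_in_spine:.1%}") # under 1%
```

Under those assumptions a single synaptic event shifts the spine's ionic content by something approaching a percent, so it doesn't take many events before the pumps have real work to do, even though the cell as a whole would barely notice.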

Anyway, my point is that the BOLD signal from fMRI measures input, not local activity. And it has absolutely atrocious spatial and temporal resolution. Something on the order of millimeters and seconds. But it makes pretty pictures and lets hack science journalists tell the doe-eyed public "this part of the brain is for when you feel sad and lonely; this part of the brain is for when you feel happy." The real action is in calcium imaging, which can track single spikes (almost) from hundreds of cells at a time (but only in layer 2/3 of the cortex of anaesthetised animals), and chronic multitetrode recordings (disclosure: my old lab used this technique; tetrodes are bundles of four electrodes, which allow the isolation of dozens of cells from a single such bundle through a variety of black-magic mathematical tricks), which can record from perhaps a hundred cells for days, weeks, or months at a time (depending upon the strength of your experimental mojo). But no one wants to see pictures of comatose cats with the backs of their heads lopped off, or rats running around with bundles of wires popping out of their skulls. And the experimental results, while useful and meaningful, rarely come with a five-second sound-bite. Half the time even specialists in the field aren't sure what the ultimate implication of a study is. So fMRI gets the publicity and a disproportionate share of the funding.

Which is dumb, because very little useful science has come out of fMRI. Glorified phrenology. One of the most striking known facts about the brain is that most of it looks the same. So far as anyone can tell, aside from a few small and probably insignificant differences, cortex is cortex, regardless of whether it's processing visual data or planning an arm movement or contemplating the secrets of the universe. In fact, you can rewire visual input to the auditory areas and everything works out just fine (von Melchner, Pallas, & Sur, 2000). In ferrets. Not fruit flies. Not frogs. Visual mammals like you and me.

This would seem to suggest that the same basic computation underlies most of what the brain is doing. Wouldn't it be nice to know what this computation is? Why would you waste your time attempting to pinpoint exactly how the computation is divided spatially, when all the evidence suggests that the computation is the same everywhere? People are strange...

Monday, June 18, 2007

The art of science

In the process of thinking deep thoughts, whiteboards can become miraculously covered with some rather strange designs. Think of whiteboard marker ink as the academic equivalent of the holy oil that sometimes accumulates on statues and portraits of the Virgin Mary. Please note that "You get a cookie" is an important technical concept. We have defined Z to be beauty.

Sunday, June 17, 2007

A little reductionism goes a long way

Back in freshman year of college, I took a course called Relativism, Reason, and Reality. I had been exposed to a little existentialism in high school, and I was sufficiently intrigued to consider a philosophy minor. Until I took R, R, and more R. Academic philosophy seems like nothing so much as the dumping ground for failed theoretical mathematicians. Rather than proving well-defined theorems with precise logic, most of the works we read aspired to rigor, but were ultimately undone by their dependence upon metaphor, allusion, and the inherent vagueness of colloquial language. If you can't completely define the assumptions and terms with which you are working, it is impossible to construct an irrefutable argument. There is no room for dissent in proper mathematics. While philosophers may argue about the nature of truth, and the most complex theorems may require years of consideration before they are finally accepted or rejected, mathematics is as close as the human mind can come to absolute certainty. So far as I can see, the greatest potential weakness in mathematics is the distinct possibility that all humans are inherently and consistently irrational, in which case true rational thought is forever beyond our grasp. Short of the mass failure of the human mind, however, mathematics seems beyond challenge.

Despite my dissatisfaction with the presentation of the material, we did discuss some interesting ideas in R, R, & R. Consider the following: surely, were a single atom of your body replaced by an equivalent atom, you would agree that your identity would remain uncompromised. The new atom and the old atom are indistinguishable, so even though the particles composing your body would be slightly altered, the pattern would remain entirely unchanged. Similarly, if you believe that you are nothing more than your body, then if 1%, or 10%, or 100% of the atoms in your body were instantaneously switched with identical atoms, the exchange should go entirely unnoticed and your identity should remain intact.

Now consider the case where your body is reconstructed somewhere other than its original location; say, ten feet away. The pattern of your body is the same. Your location hasn't changed much. I would hope that you would be willing to accept that the resulting person would still be you.

But what if your original body were not destroyed in the process? What if an exact, perfect duplicate were created ten feet away, but you were left standing exactly where you were before? Would you then allow yourself to be killed, knowing that a perfect duplicate would immediately take your place? Would this stranger standing next to you actually be you?

This may sound suspiciously like the sort of philosophical nonsense I was railing against a few paragraphs above, but consider the following: are you not effectively being replaced with an exact copy of yourself every instant? Does continuity in space and time really affect the core of the argument? To what extent can you legitimately claim that the you-of-right-now is the same as the you-of-five-minutes-ago? If death is just the cessation of this succession of you's, why should the you-of-right-now care that the you-of-50-years-from-now (or 5-minutes-from-now) will not exist? Why bother planning for the future or defining yourself in terms of past actions? In what sense can you be said to exist, if your existence is inextricably bound to a single instant?

Consider in this light Descartes' claim that "I think, therefore I am." Certainly, there are thoughts, but the thinker need not be consistent. At any given moment, you have access to memories of past thoughts, but such memories are but imperfect afterimages of the original thought, and you are bound to this previous thinker only by these echoes. Where then is the "I" that is doing the thinking?

Saturday, June 16, 2007

Market-based science

Yesterday, I ranted about faith-based science. Well, religion and capitalism make for strange bedfellows. The Financial Times is presently carrying an impassioned but thoroughly irrational diatribe by Czech president Vaclav Klaus against global warming and climate change science. In claiming that the scientific consensus on climate change is politically motivated, Klaus demands that
The scientists should help us and take into consideration the political effects of their scientific opinions. They have an obligation to declare their political and value assumptions and how much they have affected their selection and interpretation of scientific evidence.

As a potential solution to global climate change, Klaus proposes that
Instead of organising people from above, let us allow everyone to live as he wants

and
Instead of speaking about “the environment”, let us be attentive to it in our personal behaviour


First of all, as a scientist, I find the suggestion that scientific conclusions are generally tainted by political considerations gravely insulting. Any scientist worthy of the name is driven to discover underlying reality and firmly constrained by empirical observation, regardless of the political implications. Klaus' call to "resist the politicisation of science and oppose the term 'scientific consensus', which is always achieved only by a loud minority, never by a silent majority" amounts to a dismissal of science itself. The general agreement of the scientific community, achieved through the exchange of ideas in journals, conferences, and personal communications, allows humanity to arrive at the best possible approximation to the truth. Obviously, the true state of the world can never be known with certainty, and individuals will always be subject to personal bias, but the collective discussion of a group of rational, intelligent, informed individuals serves as a filter on inconsistent reasoning and political predisposition alike. Scientific consensus has given us our greatest paradigm shifts, from the heliocentric universe to evolution to quantum mechanics. While the gears of scientific consensus may turn slowly, the juggernaut rarely commits itself to the wrong path.

More frighteningly, though, Klaus completely ignores the inevitable march of unrestricted free-market capitalism to the tragedy of the commons. Indeed, Klaus counsels that "any suppression of freedom and democracy should be avoided." Such suppressions of freedom include antitrust legislation, consumer protection laws, workers' rights, and environmental protections. Klaus would have us believe that, left to their own devices, the global mega-corporations which drive the world economy would naturally become conscientious stewards of the larger world in which they exist. Certainly, no corporation would poison our rivers and oceans with toxic chemicals, because it is cheaper to dump them into public resources than to dispose of them safely. Clearly, no company would market unsafe products and cover up the inevitable accidents or illnesses. Even if human activity is inducing an increase in temperature, as the "scientific consensus" seems to believe, and given that such an increase would almost invariably have catastrophic consequences, it is nevertheless not in the interest of any single corporation to modify its behavior to address the problem. Governments exist precisely to protect the masses from the actions of the few. I don't know what the Czech Republic is doing these days, but if their leader so firmly believes that that government is best which governs not at all, then I don't see how they can avoid a swift descent into anarchy.

Friday, June 15, 2007

A modest proposal

I don't understand the general public reaction to the negotiated transfer of organs. For instance, from the linked article:
I think we’d reject as a matter of morality and equity that the prettiest people, the people with the best story, or the ones who can pay the most, should get access to this very scarce resource.

The prettiest and wealthiest people already receive preferential access to every other scarce resource. In particular, the wealthy can afford medical care vastly superior to that available to the poor. How is a new kidney so different from a new cancer drug?

Looking at the issue from another perspective, why is the outright, informed, deliberate sale of organs by a living person so unthinkable? People already sell their time, and often their health, through their jobs. Black lung, anyone? Experimental subjects are compensated for Phase I clinical trials, which assess the safety rather than the efficacy of a new drug. Unfortunately, I've been unable to find any sites advertising the rates for experimental subjects. Perhaps because of the aforementioned dubious ethics of allowing people to sell their bodies, very little information is publicly accessible, although you can request a quote for the value of your health.

If I were starving and homeless, I think an offer to barter a kidney for a year's worth of food and shelter would seem more than fair. Why should the government be able to tell me what I can and cannot do with my body? In this light, the selling of organs seems rather similar to prostitution or assisted suicide. On the surface, the government appears to be protecting people from exploitation (the poor, in the case of prostitution, and the godless heathens, in the case of assisted suicide), but I would argue that poverty itself is unjust in a land of plenty. How is it ethical to strip people of one of the tools with which they might free themselves from poverty?

The Swedish Chef goes to Washington

Now playing in the New York Times: Bork versus Bork? Bork, bork, bork!!!

I sometimes wonder whether the NY Times is intentionally using subtle humor, or whether the editors' shirts are so stuffed that the irony doesn't penetrate. For instance, Sam Brownback's views on evolution are either an epic piece of deadpan sarcasm on the level of Stephen Colbert's address at the 2006 White House Correspondents' Association dinner, or an absolutely appalling commentary on the state of scientific understanding and rationality in America. Consider the following quote:
Many questions raised by evolutionary theory — like whether man has a unique place in the world or is merely the chance product of random mutations — go beyond empirical science and are better addressed in the realm of philosophy or theology.

And this one:

It does not strike me as anti-science or anti-reason to question the philosophical presuppositions behind theories offered by scientists who, in excluding the possibility of design or purpose, venture far beyond their realm of empirical science.

And especially this one:
Man was not an accident and reflects an image and likeness unique in the created order. Those aspects of evolutionary theory compatible with this truth are a welcome addition to human knowledge. Aspects of these theories that undermine this truth, however, should be firmly rejected as an atheistic theology posing as science.

Now presenting (cue lights and music): Faith Based Science!!! First decide what you want to believe, then look for data supporting your pre-established conclusion, dismissing any contradictory evidence as merely an opposing religious claim. I would ask in amazement what sort of political policy such a decision-making framework would lead to, but I think we've been seeing the effects for the past six years. The real punch-line, however, is the following: this was not a haphazard statement sputtered out unprepared during a press conference or even a debate. This was an official, carefully honed policy statement deliberately submitted to a major news outlet. This is how Senator Brownback WANTS to be seen. At the very least, I'd like to imagine that the people at the Times editorial desk had a good chuckle before sending this to press.

Thursday, June 14, 2007

Inconsistent constituencies

I've been dancing at X-TRA's More than Mode event every week for about eight months now. Although the DJs constantly rotate, they all tend to play the same songs, and I've learned the words to most of the English ones just through repeated exposure while shaking my hips. The stasis can be a bit tedious at times, but the routine also has an air of comfort and familiarity. More importantly, I go to let the pounding bass become a metronome by which I can structure my actions and thoughts, not to discover new bands. I don't particularly like most of the stuff they play, regardless of variety. The verse-chorus-verse structure and simple, repeating melodies of club standards will never engage me intellectually, but more nuanced music is usually not very good for rhythmic gyrations.

Attending every week, I've come to recognize the regulars. Sometimes, they even acknowledge my existence, although in practice I tend to discourage such interactions. What surprises me, though, is that the crowd of regulars shifts over a time scale of perhaps three months. People who appeared without fail every single week in December and January have not graced the assembled black-clad masses with their presence in weeks. I can't decide whether these prodigal children have moved on to other musical genres, or have given up dancing entirely. Perhaps the appeal of the club lay as much in its social as its auditory atmosphere, and their interest waned as their ever-shifting social circles turned over and over. Maybe these people change identities the way other people change clothes: this month, they're goth; next month, they're gangstas. I'm thinking Raven in QC, although comic characters admittedly do not make the most reliable exemplars of actual human behavior... Or perhaps they like their lives spicy with variety, and going out to the same venue week after week grew stale.

I've never understood novelty for its own sake. I'm a creature of habit. I can more fully appreciate those things which I understand. Sensory learning is a reasonable metaphor. The first time you taste a dry martini or a very dark espresso, you're overwhelmed and almost choked by the most obvious flavors. With repeated exposure, you are able to discern the nuances layered on top of the more prominent tastes. The connoisseur experiences the same raw sensory stimuli in a completely different way than the dilettante. Although perhaps I should consider the possibility that my lack of appreciation for novelty stems from my relative lack of experience.

My amp goes to 11

I'm not a big fan of loud noise. For as long as I can remember, I've been a little phobic about hearing loss. When I go out clubbing or to a concert, I always wear earplugs (leaving a club and rejoining the world without ringing ears is a delightful experience). When listening to music by myself, I tend to keep the volume very low. During senior year of high school, with a license, a car, and a daily commute of at least half an hour, I would keep the music turned down so low that I was only able to follow the songs because I already knew the words and melody. Sometimes it takes me a few minutes to realize that an album has ended; black metal at low volumes can sound surprisingly similar to the ambient hiss of an empty room. But recently, I've been experimenting with turning the volume up a bit higher. It's remarkable how much more detail you can hear with good headphones and adequate volume. Just now, listening to Wolves in the Throne Room (who are presently touring and, according to Metal Archives, have a new album coming out), I heard an absolutely fantastic drum fill I had never noticed before. If I'm going to burn my sensory acuity on something, I think this is a worthy cause. Look at me, the intrepid risk-taker!

Did I mention that Wolves in the Throne Room is touring? I think I did. If you live on the West Coast, they will probably be passing through a nearby city very soon, potentially with Sunn O))) and Earth in tow. This is clearly the music event of the year. I obviously can't go, so you will need to enjoy it in my stead.

Wednesday, June 13, 2007

Binge and purge

I have a binging problem. I don't do alcohol binges or cocaine-fueled gambling binges or even relatively benign nitrous oxide binges. My problem is with cartoons. Mostly web comics and anime. When I first started reading Questionable Content, I spent at least one entire evening riveted to my desk, clicking through episode after episode. My experience was similar with Girly and Order of the Stick, even though their quality didn't really merit the single-minded fascination with which I devoured them. And there have been a number of anime series for which, while watching, I had to force myself to go to sleep because the sun was rising. There's something about the decadence of wasting an entire day doing something utterly worthless which I find strangely appealing.

The draw has been even stronger in recent days after finishing my NIPS paper. When I have a substantial goal with a well-defined deadline, I tend to push myself as hard as I can, all the while envisioning all of the pleasures I've deferred along the way. But when I finally finish, the freedom crashes over me like a tidal wave and drowns me, rather than carrying me aloft. In college, after finishing my finals, I generally curled up in a ball in my room for days at a time, leaving only to make use of the kitchen and the bathroom. After my Master's defence, I holed up in my parents' house for two weeks, mostly watching HBO movies. Similarly, for the past few days I've had trouble getting out of bed in the morning. I barely managed to buy groceries over the weekend, and I left lab yesterday perhaps seven hours after I arrived, including an hour-long nap and some mindless internet surfing.

I think the problem is that when you finish pushing a boulder to what appeared to be the top of a mountain, although the boulder may not exactly roll down the other side, you very clearly see that you've only reached a small plateau, and the mountain extends indefinitely. While small goals can be defined and achieved, one cannot extrapolate from the limited objectives of daily life to the general motivation for life itself. I am Sisyphus, but I don't imagine that Sisyphus is happy.

Sunday, June 10, 2007

The cult of scientific celebrity

Science is, in essence, a purely rational discipline. While naive notions of hypothesis testing or theory falsification as the primary purpose of experimentation can be dismissed out of hand, the project of science is nonetheless inherently logical, as opposed to emotional, political, or spiritual. The proper measure of a theory is always its ability to model the observed world. Neither the personal ramifications of the theory, nor its source, have any impact on its truth. In this light, I have difficulty understanding the reverence heaped upon those who have achieved success in their scientific pursuits.

Nobel prize winners in particular are accorded almost god-like status. The walls of the atrium of my present lab are decorated with pictures of our collaborators, but also with pictures of notable scientists with whom we have no direct connection. Amongst these latter pictures are a few featuring the heads of the lab together with Nobel winners who happened to pass through Zurich and give a talk at the university or ETH. Recently, Roderick MacKinnon, who won the prize for determining the structure of the potassium channel using x-ray crystallography, deigned to grace our lab with his presence for a few hours and was given the royal treatment. Indeed, in announcing this visit, one of the lab heads said, "Rod MacKinnon will be coming to visit. I trust you all know who Rod MacKinnon is," and left it at that. No, I don't know who Rod MacKinnon is. While the result for which he received the prize is important, it constituted a page or two in my introductory neuroscience textbook. It is an important piece of background information regarding the biophysics of neurons, but it has absolutely no impact on my daily work. My research would be unaffected if the three-dimensional structures of all the neuronal ion channels were still unknown. I work in a laboratory focused on computational and theoretical neuroscience. Why should any of us know who Rod MacKinnon is? Nevertheless, we had a special tea to fete MacKinnon, and everyone gathered around at his feet so that they could root about for any pearls of wisdom he might carelessly cast down. After making the obligatory graduate-student-pounce on the free food, I went back to my desk to get some real work done.

Perhaps my awe of the Nobel prize and those who have received its blessings was dulled by my years at MIT and Caltech, where you could sometimes bump into such holy personages while using a urinal. The sight of David Baltimore zipping around on his Segway like a doofus, crowned with a bicycle helmet, fails to arouse in me any worshipful feelings. The Nobel prize and other such awards are a valuable motivation for scientific achievement, but it is important to recognize that the sort of success they honor depends on luck as much as skill. The ranks of scientists at prominent universities are filled with researchers of the highest caliber who didn't happen to try the one long-shot technique that actually worked, or make just the right mistake when performing an experiment to reveal a wholly unexpected phenomenon. Just because the Nobel committee doesn't think a particular result is worthy of recognition this year does not make it less important than the finding which does happen to be honored.

Did I mention how much I like NIPS's double-blind review policy? I hope all journals adopt that model. Papers should be judged on their content, not their authors.