Saturday, June 30, 2007
A taxonomy of Swiss dance styles
I'm a pretty committed club-goer. Except under extenuating circumstances, I'm at X-TRA's More Than Mode every week, and I generally sojourn out to Abart or Dynamo when they're having a goth event. I've also spasmed in time to a beat at Buddha Bar, Garufa, Hive, Labitzke, Mascotte, Supermarket, and Tonight. One of the early manifestations of my culture shock after moving to Switzerland was my surprise at the striking difference between my habitual dancing style, honed in the clubs of Boston and LA, and that of the native Swiss. Although it's now clouded by a year of forgetfulness and conflicting experiences, I remember many of the black-clad masses in both Boston's and LA's goth clubs as practicing a consistent and distinctive dance style, featuring sharp and violent arm motions mostly from the elbow. Something on the order of this. Admittedly, it doesn't have the same effect when performed in your parents' basement, but you get the idea. In particular, there was one clique of corpse-painted Spanish-speaking guys decked out in leather and spikes at the ironically named Das Bunker whose dancing looked like a fight scene from a Kung Fu movie remade for the Dark Ages. I also recall a sizable contingent of club-goers who preferred a more emotive style of dance, perhaps philosophically akin to ballet. Their dancing seemed to express the emotional content of the music, sometimes going so far as to act out the lyrics. Regardless of the details, no one was afraid to move around, and half the fun of going out was watching everyone else.
The Swiss, in contrast, are more reserved, both physically and emotionally. A substantial fraction of the people on the dance floor just rock from side to side. The slightly more creative will take three steps forwards and three steps back. Sometimes with a reckless disregard for anyone who might have strayed into their path. I think this particular style may even be enshrined in a song, but it's in German (something like "drei Schritte vor und drei zurück") and my google-fu is not up to the task. Occasionally (at establishments playing electronic rather than goth music), someone will throw their arms in the air as if they were gesticulating with pistols held sideways, gangsta-style. These people are not gangstas. And then there is the frightening menagerie of truly atrocious dancers. Like the guy who wheels around the dance floor like a fencer set free from the piste, all the while waving his arms like a conductor counting out 4/4 with a baton. Or the fat balding guy who violently rocks back and forth while smoking a pipe.
There are a few exceptions. One couple at X-TRA wears long skirts and has perfected a style heavy on pirouettes which pull them into elegant motion. Another couple has managed to develop an expressive and fluid technique that wouldn't seem out of place in a US club. Other than those four exceptional cases, and a few other competent dancers, I'm in a sea of people performing the sort of ur-dance known intrinsically to every five-year-old. The sort of dance executed by wall-flowers when told that they just need to move to the music. Maybe the difference is that American children of my generation were raised on a diet of American Bandstand and MTV. Michael Jackson's moonwalk permeated our lives as much as his music. City streets were filled with kids break-dancing on sheets of cardboard. Dancing was intrinsically understood to be as much a public performance as a form of personal expression. Then again, Swiss beer is almost uniformly bland, in contrast to the bolder traditions of many of its neighbors. Maybe Swiss dancing is similar.
Friday, June 29, 2007
SCOTUS can suck it
The Supreme Court has gone off the deep end. SCOTUS has recently come to the learned conclusion that students are not permitted to exercise their right to free speech if their message contains any reference to mind-altering substances. Quoting from a Times article to which I can't seem to generate a stable link,
"In light of the history of American public education,” Justice Thomas said, “it cannot seriously be suggested that the First Amendment ‘freedom of speech’ encompasses a student’s right to speak in public schools.”
That's right. The mission of American public education is to tell students what to think, to indoctrinate them with the prevailing beliefs of the day, rather than to teach them to use their own powers of rational thought. This meshes perfectly with the Bush administration's push for a unitary executive, with powers trumping those of the other branches of government. In both instances, America is being rendered vulnerable to a tyrannical majority which seeks to impose its values on the entirety of society. Long gone are the days of a pluralistic culture, embracing everyone's individual perspective and favoring none. Indeed, in yet another blow against an open society where ideas are freely and universally exchanged, the Supreme Court has deemed active desegregation through race-conscious school admission programs unconstitutional. And what better way to cement the control of those already in power than to once again permit them to bias elections by saturating the media with advertisements immediately before balloting? The McCain-Feingold Act has been rendered more porous than the legal arguments supporting the torture of "enemy combatants." With television, radio, print, and internet approaching nitrogen in their ubiquity, allowing the wealthy to suffocate the populace with a self-serving message will further drown out the voice of the common man. When the unitary executive does trample the constitutional guarantee of freedom from state-imposed religion, SCOTUS joins the cheering section and announces that the proletariat has no standing to challenge such abuses in court. And as a rancid cherry on top of this foul four-scoop sundae, the Supreme Court has ruled that manufacturers and distributors who forbid discounting and set minimum price floors do not necessarily violate the Sherman Antitrust Act. An independent judiciary is a fine thing, but only when it is committed to being (a) not stupid and (b) not evil. Our new chief justice and his conservative bloc seem to be failing at least one of these two key tests.
Monday, June 25, 2007
Pet Peeve #115
People who use "three-dimensional" graphs. Paper is two-dimensional. There is no way to represent a general black-and-white three-dimensional object using a single black-and-white two-dimensional figure. Even if you use Matlab. Even if you use wireframes or shading. Even if you really, really want to.
You are encouraged to use color to create an artificial third dimension. Back when I was writing real-time spike analysis software for electrophysiology recordings (read: stick wires into rat brains; try to extract the signals of individual neurons from the muck), I tried using the red and green color channels to represent the two dimensions of our four-channel tetrodes which wouldn't fit on the (by definition) two-dimensional scatterplots. Note that the human eye contains three (3) distinct color receptors, so this strategy unambiguously encodes the desired information, although it can be a bit difficult to decipher visually. I thought it worked pretty well. My adviser thought it looked "unprofessional"; i.e. unlike the stupid commercial system we were sacrificing a year of our lives to replace. Meh.
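If you want to see the trick in action, here is a minimal sketch in matplotlib with fake data (a reconstruction of the idea only; the actual system was custom real-time code, not Python):

    # Sketch: plot tetrode channels 1 and 2 as x and y, and smuggle
    # channels 3 and 4 into the red and green color channels.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    # Fake peak amplitudes for 1000 spikes on the four tetrode wires.
    spikes = np.abs(rng.normal(size=(1000, 4)))

    def unit(x):
        # Rescale a column to [0, 1] so it can drive a color channel.
        return (x - x.min()) / (x.max() - x.min())

    colors = np.column_stack([unit(spikes[:, 2]),      # channel 3 -> red
                              unit(spikes[:, 3]),      # channel 4 -> green
                              np.zeros(len(spikes))])  # blue unused

    plt.scatter(spikes[:, 0], spikes[:, 1], c=colors, s=5)
    plt.xlabel("channel 1 peak amplitude")
    plt.ylabel("channel 2 peak amplitude")
    plt.show()

Each point's position carries two of the four dimensions, and its hue carries the other two, so clusters that overlap in position can still separate by color.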
If you're some sort of crazy chemist (e.g. Gylbert, 1973 or Hackert and Jackobson, 1971) and want to use a stereogram, that's pretty swanky, but you'll need to depend upon the ability of your audience to freely rotate their eyeballs in their sockets. Maybe the format is standardized and they give out stereoscopes at chemistry conferences the way they used to give out those horrible red/blue cellophane glasses for 3-D movies. In any case, a single two-dimensional view of a three-dimensional object is necessarily ambiguous. Computers let you do all sorts of things you really ought to avoid. PowerPoint presentations, for instance. The 80's are dead. Get over it.
For the love of god and all that is holy, Earth's Hex; Or Printing in the Infernal Method is utterly sublime
For one reason or another, it's been a month or two since I've listened to any latter-day Earth. Sunn O))) has found its way onto my playlist regularly. I've experienced A Bureaucratic Desire for Revenge (Parts 1 and 2) within the past week. But Pentastar and Hex have been the subject of unconsidered neglect.
Friends, please allow me to warn you against such a thoughtless and ultimately self-destructive course of action. Listening to Hex as I type, I am all but overcome by the sumptuous textures layered one on top of another in this album. Earth is a group of traditionalists, and Hex is constructed using only standard guitars and drums, but through a miracle of ingenious recording techniques, they coax lyrical and organic voices out of these commonplace tools. As their name would suggest, Earth eschews the ethereal; their instrumentation is not evocative of the angelic or the demonic. Rather, you can almost feel the dusty soil sliding through your fingers as the guitars peal, resonate, and sing. The mundane is the sublime. Earth elevates the coarsest, most visceral elements of physical reality to an exalted stature. Even a simple stone takes on the epic proportions of a grand monument to the shocking presence, the undeniable reality, of physical existence. These songs are hymns to the world at dawn, while the intentionality of the small scurrying creatures still sleeps, but the earth sits with open eyes, ever watchful.
Sunday, June 24, 2007
Utopias and artificial scarcity
Utopian literature is by and large a rather dull genre. Consider Edward Bellamy's Looking Backward, Aldous Huxley's Island, and Sir Thomas More's original vision. Many of the famous contributions were written before the failure of socialism demonstrated that human self-interest is more powerful than any community, and later writers seem to conveniently develop selective amnesia regarding the failure of efforts to implement collective societies. In a particularly egregious instance of such intentional blindness, the psychologist B. F. Skinner produced fantastical visions where an end-run around human nature turned each sovereign individual into a worker bee primarily dedicated to the good of the hive, a cog in the larger machine. A key feature of these political, social, and economic fictions is the state of plenty, or at least sufficiency, which arises when everyone works for the common good and takes no more than they need. Work days are uniformly short, but while on the job, workers are focused and productive. No one resents offering the sweat of their brow for the benefit of others, nor do any strive to lift themselves above their fellows.
While flesh and blood humans may never be able to achieve these ideals, it strikes me that certain features of these utopias are within our grasp. Many of the products of our present information economy are kept out of reach of most people only because of artificial scarcity, enforced through intellectual property laws. Consider a few examples: only a minute fraction of the cost of most medications is due to the expense of their manufacture or distribution. Aside from a small percentage of substances of biological origin, once the effective agent and synthesis technique are known, most pills can be produced for pennies. India has taken advantage of this fact by rewriting their patent laws to cover the manufacturing process rather than the final product. As a result, Indian companies can produce substantially discounted generic versions of HIV and other medications by subtly changing the process by which the substances are made. I'm no expert on world economics and trade law, but the article linked above claims that these generics are sold for one twentieth to one fiftieth of the American price. Pharmaceutical companies shout themselves hoarse asserting that the high cost of their wares reflects the expense and risks of research, but most of the basic research which underlies medical innovation is performed at taxpayer expense by university laboratories. It is true that the clinical trials needed to prove the safety and efficacy of new drugs are time consuming and expensive, but it is hard to believe that they could not be performed by the public sector. The academic world has produced a stunning system for motivating intelligent and industrious individuals with a carrot that isn't made of dollar bills. Given the opportunity and the right incentives, these institutions could turn their considerable intellectual clout towards producing medications financed and owned by the public.
At a more mundane level, consider the designer fashion industry. A $200 or even $500 pair of limited-edition designer jeans is not made of substantially different materials than the $20 jeans from Old Navy. They are not the product of superior workmanship. In many cases, they are made in the same overseas sweatshops. And most importantly, in many places in east Asia (and their outposts in major American cities), you can obtain knock-offs of these designer jeans for pennies on the dollar.
Finally, I would be remiss if I failed to mention that the substantial output of the movie, television, music, software, and art industries could be available to all for no more than the cost of a broadband internet connection. While the arguments regarding intrinsically electronic media have been flogged until their backs are raw and bleeding, it is relevant to note that many of the works of art which sell for thousands or millions of dollars could be easily mass produced and made available for everyone's living room. Indeed, many of the most famous contemporary artists operate factories where journeyman artists render into canvas and paint the vision of their popularly anointed overseers. These same artistic apprentices could produce their paintings at less than stratospheric prices if less value were accorded to the proprietary signature of their masters. Once again, east Asia has beaten the western world to the punch.
I recognize that simply eliminating intellectual property rights would wreak havoc on the American and world economy. But I think it is high time we seriously considered the cost of artificially restricting the distribution of goods that people want and need. In all of the cases described above, the majority of the costs passed on to the public consist of the expense of redeveloping a product which is already available, and then convincing the public that they cannot do without the new version. How many different selective serotonin reuptake inhibitors, boot-cut hip-huggers, or commercially created boy bands do we really need? Are there no better uses to which those financial and human resources could be put? What benefit do we as consumers derive from the manufactured desire for artificially scarce goods distinguished only by their expensive advertising campaigns? Government, and the rights it supports, is at least in principle of the people, by the people, and for the people. If the present formulation of intellectual property rights no longer serves our collective interest, we can and should change it.
Saturday, June 23, 2007
Walk softly, but carry a big knife
When I was an undergraduate, I lived in an anarchic commune (colloquially called a fraternity) with a collective cooking arrangement. In return for cooking with two or three other people once a week, you could benefit from the bounty of everyone else's culinary adventures throughout the rest of the week. However, as a group of left-leaning college students, we did not count punctuality and responsibility among our defining characteristics, and it was not unusual for people to show up late for their cooking team. One afternoon, I dragged myself down to the kitchen promptly and began cooking on time, only to find all of my comrades-in-arms detained by what were certainly more pressing engagements. Like checking email. I was thus already feeling a bit antisocial when the siren which served as the doorbell for the rear door started blaring. Generally, this indicated that a resident of the house had either been too absentminded to remember their key, or too lazy to take it out of their pocket. Rather than put down the bloody knife with which I had been cutting meat, I took it with me as I trudged down the hallway to the back door. When I grumpily opened the door, I was greeted not by a blithe housemate, but by one of our well-dressed, well-to-do next-door neighbors. He had some extra tickets to a baseball game which he could not attend, and wanted to know if we could make use of them. He obviously wasn't expecting the door to be answered by a knife-wielding maniac. Ever so slightly mortified, I thanked him for the tickets and retreated back to the safety of my socially insulated abode, where one wouldn't think twice about answering the door knife in hand. Until after the fact, of course. I strongly suspect this neighbor never again darkened our doorstep.
Which brings us to the present day. My lab has a small but functional kitchen, in which I have taken to preparing my dinners rather than cut my nocturnal work-day short or shift my schedule towards daylight hours. Despite having an oven, a range, and a reasonable selection of pots and pans, it lacks some obvious necessities. Specifically, although the kitchen has communal non-stick cookware, the only non-metal stirring implement is a spatula. I refuse to prepare pasta sauce with a spatula. Even more incongruous, there is an entire drawer full of knives. Many of which are relatively new. None of which are sharp enough to cut a vegetable, except for one bread knife. This state of affairs is untenable. So when I went grocery shopping today, I picked up a two-pack of wooden spoons, at least one of which I will donate to the kitchen's collection, and a 25-franc Victorinox cook's knife. So far as I am concerned, this is about twice as much as one should pay for a single kitchen knife. Victorinox makes Swiss army knives with five times as many blades for the same price. But this knife should make even the most recalcitrant carrot feel like butter.
I will not be donating this knife to the communal kitchen collection. This knife will live on my desk, where it will remain sharp and shiny for the duration of my PhD. Moreover, should anyone come to me with objections to one of my papers, I will have 19 centimeters of sharpened steel fury with which to drive home my point. So to speak.
Music search technique
For many moons, my main strategy for finding new music has been to go prowling through the internets, looking for groups with interesting names and reading the reviews on the Metal Archives. Back in the day, I turned to such inconsistent sources as Satan Stole My Teddybear and shoutcast streams. The reviews on SSMT by John Chedsey are evocative, often humorous, and generally reflect good taste, but the other reviewers are not of equal quality. Moreover, the small number of non-professional reviewers on SSMT limits their ability to cover the full breadth of the relatively obscure black and doom metal scenes. I owe a karmic debt to the internet radio station on which I first heard Opeth, catapulting me headlong into the world of dark metal, but the streams that do play the heavier incarnations of metal are generally broader in their scope than my musical interests, and skew towards orthodox bands rather than progressive fare. The Metal Archives, for all the inconsistency in the literary quality of its user-generated content, is stunningly complete and generally accurate.
Investigating bands based upon their names is surprisingly effective. Consider the following list of black metal bands with names beginning with 'forest,' taken from The Metal Archives:
Forest (Cze) - Black Metal
Forest (Pol) - Black Metal
Forest (Rus) - Black Metal
Forest Nocturne - Melodic Black Metal
Forest of Castles - Black Metal
Forest of Demons - Black Metal
Forest of Doom - Black Metal
Forest of Evil - Black Metal
Forest of Fog - Black Metal
Forest of Impaled - Black/Death Metal
Forest of Souls - Black Doom Metal
Forest of Triglav - Black Metal
Forest of Witchery - Black Metal
Of course, a band like Forest of Shadows does not make this list, because the Metal Archives classifies it as doom metal rather than black metal, nor does it include bands whose names include but do not begin with 'forest.' And don't forget the 50 black metal bands whose names begin with 'funeral.' These names are more formulaic than whatever bubble-gum pop hit is currently contaminating the airwaves. As a testament to the predictability of metal band appellations, you can algorithmically generate the names of your next dozen musical enterprises at the Metal Band Name Generator.
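In fact, the formula is so rigid that a few lines of Python get you most of the way there (the word lists below are my own invention, not the actual generator's):

    # Toy metal band name generator, in the spirit of the formula above.
    import random

    FIRST = ["Forest", "Funeral", "Winter", "Throne", "Blood", "Shadow"]
    SECOND = ["Doom", "Fog", "Sorrow", "Souls", "the Impaled", "Eternal Night"]

    def band_name():
        # Every name fits the venerable "<noun> of <darkness>" template.
        return "%s of %s" % (random.choice(FIRST), random.choice(SECOND))

    # The names of your next dozen musical enterprises:
    for _ in range(12):
        print(band_name())

Six words per list already yields 36 bands, which by my estimate is enough to fill the Metal Archives for another year or two.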
Recently, though, I've stumbled on a much more efficient technique for finding new music. I call this strategy Southern Lord. Consider what I hope we can agree are the three best drone doom bands in existence (in alphabetical order): Boris, Earth, and Sunn O))). Further consider the quality black metal groups Wolves in the Throne Room, Xasthur, and Nortt. All are presently or have in the past been distributed by Southern Lord. My present project is thus to go through the entire Southern Lord roster and sample the wares. The giant record labels may have outlived their usefulness, but there is still a need to winnow the wheat from the chaff of the ever more prolific world music scene, and Southern Lord seems to be doing a pretty good job.
Thursday, June 21, 2007
Wolves in the Throne Room - Self-titled demo
Wolves in the Throne Room is the realization of the promise of American black metal. Freed from the straitjacket of narrow-minded expectations which lovingly enfolds the Scandinavian scene, WitTR manages to be true, rather than trve. They have traded corpse paint and church burning for an organic, back-to-the-land ethos which approaches the real spirit of black metal, with its frequent folk influences. This is existentialist music: the songs scream in rage that the world is fundamentally flawed, that human effort is ultimately futile, but that no alternatives exist. They accept the impossibility of meaningful action, but answer with a cry of defiance and perseverance. In this simultaneous denial and acceptance of the absurdity of life lies the closest thing to redemption.
Their first demo is saddled with the sort of low fidelity production which defines early Darkthrone. Lesser bands may intentionally use a subtle background hiss, flat drums, and inaudible bass to build "atmosphere." On a Wolves in the Throne Room album, this lack of clarity merely obscures the stark beauty of the underlying music. Were it not for the almost total lack of bass, the furious drumming on this album would take on a crushing heaviness. The double-bass in particular is used to good effect, menacing rather than thin. Indeed, this entire album has a heavy intensity which is only accentuated by the counterpoint of the vocalists' raspy delivery.
While technically proficient, Wolves in the Throne Room's debut effort sometimes demonstrates a frustrating immaturity in the composition of the songs. There are transcendent passages where everything clicks, and the listener is carried away by soaring, haunting melodies. The band occasionally breaks into an unexpected but effective jazz-influenced solo format, with one guitar taking flight over a repeated melody. But there are intervals where tedious repetition is confused for atmosphere. The band is not as tight as in its later incarnations, and the listener is left with the distinct impression that a few more rehearsals would have yielded a superior final product. Certainly nothing compared to the debacle that was In The Woods...'s final live album, but the slips in synchrony and missed notes are irritating on a careful listen.
On the whole, this demo is a clear portent of the band's promise, and a worthy opus in its own right. Though unrefined, its successes outweigh its failures, and I've found myself playing it more often than not over the past few days. Wolves in the Throne Room's signature melodic structure is already well established. Aside from a few rough patches, the songs are generally engaging and often truly beautiful.
Tuesday, June 19, 2007
My brain is not a Christmas tree
There are few things that irk me more than the way in which fMRI studies are reported in the popular press. Invariably, the article refers to parts of the brain lighting up in a Christmas-tree-like fashion. There was no Christmas tree in my house when I was growing up. There certainly isn't one in my head. More importantly, this phraseology seems to imply that the brain regions in question were:
a) not active before the experimental manipulation which triggered the festive lighting, and
b) in some sense uniformly "activated" by the stimulation.
Even the revered journal Science has adopted this misleading nomenclature in its lay publications (don't you love ecclesiastic language used to describe academia? I'm a monk of science):
Ever flinch at the sight of an actor being punched in the face? The reason is that neurons in the brain light up when we watch others suffering.
No, no, no (and not just because the Mean Monkey doesn't care if you are sad).
The first implication (conveniently denoted (a) above) is completely wrong, but in a simple way. Neural activity occurs constantly throughout the brain. Urban legends would have you believe that only some small fraction of the brain is actively used. While it is true that many neurons only fire action potentials infrequently, information is carried in their silences as well as their action potentials. If a neuron spiked without pause, it would transmit no more information than if it were constantly inactive. Moreover, at any given moment, there are neurons active in every part of the brain. Even the parts that fail to convey their wishes for a Happy Holiday in fMRI pictures. If you close your eyes, neurons continue to fire in the primary visual cortex. The primary visual cortex is even active in blind people. fMRI measures relative increases and decreases in activity; the baseline is never zero.
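If you want the information-theoretic version of that argument, treat each time bin as a Bernoulli spike-or-silence event: the information per bin is bounded by the binary entropy of the firing probability, which vanishes at both extremes. A quick sketch (my illustration of the standard result):

    # Entropy of a Bernoulli spike train, in bits per time bin.
    # H(p) is zero for a neuron that never fires (p = 0) and equally
    # zero for one that fires without pause (p = 1); it peaks at p = 0.5.
    import math

    def binary_entropy(p):
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    for p in (0.0, 0.1, 0.5, 0.9, 1.0):
        print("p = %.1f  ->  H = %.3f bits" % (p, binary_entropy(p)))

A neuron that never shuts up is exactly as informative as one that never speaks.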
The second implication ((b), if you've been following along) is more pernicious, because the underlying reality is a bit more complicated than the whiz-bang notion of "lighting up." fMRI's blood-oxygen-level dependent (BOLD) signal measures the slightly counter-intuitive increase in oxygenated hemoglobin when the metabolic requirements of a brain area increase. (Presumably, this is triggered by a homeostatic mechanism which senses the increased oxygen consumption and dilates blood vessels accordingly. The brain is all tricky like that. Don't even ask how it manages to maintain a reasonable connection strength at each synapse in the face of constant potentiation and depotentiation.) This signal is best correlated with the local field potential (the low-frequency component of electrode recordings due to the average synaptic activity over a span of hundreds of micrometers), rather than the actual spiking activity of the neurons in the area. The upshot of this is that fMRI represents the inputs to a brain region, not the local activity.
That a metabolic measure signals input rather than output is reasonable from a biophysical perspective, since relatively few ions move across the axonal membrane in the process of transmitting an action potential. Most of the axon is covered by a lipid-rich myelin sheath which blocks the flow of ions and decreases the capacitance, allowing the action potential to be transmitted quickly between the gaps in the myelin (known as the nodes of Ranvier, which is also the name of a metal band of questionable quality). In contrast, neurons have giant dendritic trees which are subject to a constant barrage of neurotransmitters, most of which cause ion channels to open. When ion channels open, ions flow through them passively along their electrochemical gradient, reducing the strength of the gradient. Thus, when the amount of input increases, more energy needs to be expended to move the ions back against the gradients, hence the increased need for oxygen.
Now that I write this, I'm not entirely convinced by this justification of the coupling between input strength and oxygen utilization, since although the total ionic flow is much greater in the dendrites than the axon, it's still very small compared to the total ionic content of the neuron. You could cut out the ionic pumps and the cell would be fine for hours or days, or so I'm told, in which case there's no need to immediately increase the amount of available oxygen so the ionic pumps can be run in overdrive. However, it's possible that while the cell as a whole would not lose its overall negative charge were the ionic pumps shut off briefly, everything would go out of equilibrium in the dendritic tree. The branches of the dendritic tree are really small, so the total number of ions in a dendritic spine or branch is not very large. Even the relatively insignificant ionic flow due to synaptic activity may be enough to knock around the ionic concentrations in such small volumes.
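For the skeptical, here's the back-of-envelope arithmetic, using my own rough figures (a ~0.1 cubic micrometer spine, a 10 mM ionic species, and a single ~10 pA synaptic current lasting ~10 ms):

    # Can one synaptic event meaningfully perturb the ionic
    # concentrations in a dendritic spine? Rough numbers only.
    AVOGADRO = 6.022e23            # ions per mole
    ELEMENTARY_CHARGE = 1.602e-19  # coulombs per monovalent ion

    spine_volume_l = 0.1e-15       # ~0.1 cubic micrometers, in liters
    concentration_m = 0.010        # a 10 mM ionic species

    ions_in_spine = concentration_m * spine_volume_l * AVOGADRO

    current_a = 10e-12             # ~10 pA synaptic current
    duration_s = 10e-3             # flowing for ~10 ms
    ions_moved = current_a * duration_s / ELEMENTARY_CHARGE

    print("ions of that species in the spine: %.1e" % ions_in_spine)
    print("ions moved by one synaptic event:  %.1e" % ions_moved)
    # Both come out around 6e5: a single event can indeed shove the
    # local concentrations around in a volume that small.

So at least on these numbers, the hand-wave holds up: the flux from one synaptic event is the same order of magnitude as the spine's entire complement of that ion.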
Anyway, my point is that the BOLD signal from fMRI measures input, not local activity. And it has absolutely atrocious spatial and temporal resolution. Something on the order of millimeters and seconds. But it makes pretty pictures and lets hack science journalists tell the doe-eyed public "this part of the brain is for when you feel sad and lonely; this part of the brain is for when you feel happy." The real action is in calcium imaging, which can track single spikes (almost) from hundreds of cells at a time (but only in layer 2/3 of the cortex of anesthetized animals), and chronic multitetrode recordings (disclosure: my old lab used this technique; tetrodes are bundles of four electrodes, which allow the isolation of dozens of cells from a single such bundle through a variety of black-magic mathematical tricks), which can record from perhaps a hundred cells for days, weeks, or months at a time (depending upon the strength of your experimental mojo). But no one wants to see pictures of comatose cats with the backs of their heads lopped off, or rats running around with bundles of wires popping out of their skulls. And the experimental results, while useful and meaningful, rarely come with a five-second sound bite. Half the time even specialists in the field aren't sure what the ultimate implication of a study is. So fMRI gets the publicity and a disproportionate share of the funding.
Which is dumb, because very little useful science has come out of fMRI. Glorified phrenology. One of the most striking known facts about the brain is that most of it looks the same. So far as anyone can tell, aside from a few small and probably insignificant differences, cortex is cortex, regardless of whether it's processing visual data or planning an arm movement or contemplating the secrets of the universe. In fact, you can rewire visual input to the auditory areas and everything works out just fine (von Melchner, Pallas, & Sur, 2000). In ferrets. Not fruit flies. Not frogs. Visual mammals like you and me.
This would seem to suggest that the same basic computation underlies most of what the brain is doing. Wouldn't it be nice to know what this computation is? Why would you waste your time attempting to pinpoint exactly how the computation is divided spatially, when all the evidence suggests that the computation is the same everywhere? People are strange...
Monday, June 18, 2007
The art of science
In the process of thinking deep thoughts, whiteboards can become miraculously covered with some rather strange designs. Think of whiteboard marker ink as the academic equivalent of the holy oil that sometimes accumulates on statues and portraits of the Virgin Mary. Please note that "You get a cookie" is an important technical concept. We have defined Z to be beauty.
Sunday, June 17, 2007
A little reductionism goes a long way
Back in freshman year of college, I took a course called Relativism, Reason, and Reality. I had been exposed to a little existentialism in high school, and I was sufficiently intrigued to consider a philosophy minor. Until I took R, R, and more R. Academic philosophy seems like nothing so much as the dumping ground for failed theoretical mathematicians. Rather than proving well-defined theorems with precise logic, most of the works we read aspired to rigor, but were ultimately undone by their dependence upon metaphor, allusion, and the inherent vagueness of colloquial language. If you can't completely define the assumptions and terms with which you are working, it is impossible to construct an irrefutable argument. There is no room for dissent in proper mathematics. While philosophers may argue about the nature of truth, and the most complex theorems may require years of consideration before they are finally accepted or rejected, mathematics is as close as the human mind can come to absolute certainty. So far as I can see, the greatest potential weakness in mathematics is the distinct possibility that all humans are inherently and consistently irrational, in which case true rational thought is forever beyond our grasp. Short of the mass failure of the human mind, however, mathematics seems beyond challenge.
Despite my dissatisfaction with the presentation of the material, we did discuss some interesting ideas in R, R, & R. Consider the following: surely, were a single atom of your body replaced by an equivalent atom, you would agree that your identity would remain uncompromised. The new atom and the old atom are indistinguishable, so even though the particles composing your body would be slightly altered, the pattern would remain entirely unchanged. Similarly, if you believe that you are nothing more than your body, then if 1%, or 10%, or 100% of the atoms in your body were instantaneously switched with identical atoms, the exchange should go entirely unnoticed and your identity should remain intact.
Now consider the case where your body is reconstructed somewhere other than its original location; say, ten feet away. The pattern of your body is the same. Your location hasn't changed much. I would hope that you would be willing to accept that the resulting person would still be you.
But what if your original body were not destroyed in the process? What if an exact, perfect duplicate were created ten feet away, but you were left standing exactly where you were before. Would you then allow yourself to be killed, knowing that a perfect duplicate would immediately take your place? Would this stranger standing next to you actually be you?
This may sound suspiciously like the sort of philosophical nonsense I was railing against a few paragraphs above, but consider the following: are you not effectively being replaced with an exact copy of yourself every instant? Does continuity in space and time really affect the core of the argument? To what extent can you legitimately claim that the you-of-right-now is the same as the you-of-five-minutes-ago? If death is just the cessation of this succession of you's, why should the you-of-right-now care that the you-of-50-years-from-now (or 5-minutes-from-now) will not exist? Why bother planning for the future or defining yourself in terms of past actions? In what sense can you be said to exist, if your existence is inextricably bound to a single instant?
Consider in this light Descartes' claim that "I think, therefore I am." Certainly, there are thoughts, but the thinker need not be consistent. At any given moment, you have access to memories of past thoughts, but such memories are mere imperfect afterimages of the original thoughts, and you are bound to this previous thinker only by these echoes. Where then is the "I" that is doing the thinking?
Saturday, June 16, 2007
Market-based science
Yesterday, I ranted about faith-based science. Well, religion and capitalism make for strange bedfellows. The Financial Times is presently carrying an impassioned but thoroughly irrational diatribe by Czech president Vaclav Klaus against global warming and climate change science. In claiming that the scientific consensus on climate change is politically motivated, Klaus demands that
The scientists should help us and take into consideration the political effects of their scientific opinions. They have an obligation to declare their political and value assumptions and how much they have affected their selection and interpretation of scientific evidence.
As a potential solution to global climate change, Klaus proposes that
Instead of organising people from above, let us allow everyone to live as he wants
and
Instead of speaking about “the environment”, let us be attentive to it in our personal behaviour
First of all, as a scientist, I find the suggestion that scientific conclusions are generally tainted by political considerations gravely insulting. Any scientist worthy of the name is driven to discover underlying reality and firmly constrained by empirical observation, regardless of the political implications. Klaus' call to "resist the politicisation of science and oppose the term 'scientific consensus', which is always achieved only by a loud minority, never by a silent majority" amounts to a dismissal of science itself. The general agreement of the scientific community, achieved through the exchange of ideas in journals, conferences, and personal communications, allows humanity to arrive at the best possible approximation to the truth. Obviously, the true state of the world can never be known with certainty, and individuals will always be subject to personal bias, but the collective discussion of a group of rational, intelligent, informed individuals serves as a filter on inconsistent reasoning and political predisposition alike. Scientific consensus has given us our greatest paradigm shifts, from the heliocentric universe to evolution to quantum mechanics. While the gears of scientific consensus may turn slowly, the juggernaut rarely commits itself to the wrong path.
More frighteningly, though, Klaus completely ignores the inevitable march of unrestricted free-market capitalism to the tragedy of the commons. Indeed, Klaus counsels that "any suppression of freedom and democracy should be avoided." Such suppressions of freedom include antitrust legislation, consumer protection laws, workers' rights, and environmental protections. Klaus would have us believe that, left to their own devices, the global mega-corporations which drive the world economy would naturally become conscientious stewards of the larger world in which they exist. Certainly, no corporation would poison our rivers and oceans with toxic chemicals, because it is cheaper to dump them into public resources than to dispose of them safely. Clearly, no company would market unsafe products and cover up the inevitable accidents or illnesses. Even if human activity is inducing an increase in temperature, as the "scientific consensus" seems to believe, and given that such an increase would almost invariably have catastrophic consequences, it is nevertheless not in the interest of any single corporation to modify its behavior to address the problem. Governments exist precisely to protect the masses from the actions of the few. I don't know what the Czech Republic is doing these days, but if their leader so firmly believes that that government is best which governs not at all, then I don't see how they can avoid a swift descent into anarchy.
Friday, June 15, 2007
A modest proposal
I don't understand the general public's reaction to the negotiated transfer of organs. For instance, from the linked article:
I think we’d reject as a matter of morality and equity that the prettiest people, the people with the best story, or the ones who can pay the most, should get access to this very scarce resource.
The prettiest and wealthiest people already receive preferential access to every other scarce resource. In particular, the wealthy can afford medical care vastly superior to that available to the poor. How is a new kidney so different from a new cancer drug?
Looking at the issue from another perspective, why is the outright, informed, deliberate sale of organs by a living person so unthinkable? People already sell their time, and often their health, through their jobs. Black lung, anyone? Experimental subjects are compensated for Phase I clinical trials, which assess the safety rather than the efficacy of a new drug. Unfortunately, I've been unable to find any sites advertising the rates for experimental subjects. Perhaps because of the aforementioned dubious ethics of allowing people to sell their bodies, very little information is publicly accessible, although you can request a quote for the value of your health.
If I were starving and homeless, I think an offer to barter a kidney for a year's worth of food and shelter would seem more than fair. Why should the government be able to tell me what I can and cannot do with my body? In this light, the selling of organs seems rather similar to prostitution or assisted suicide. On the surface, the government appears to be protecting the poor (in the case of prostitution, and the godless heathens, in the case of assisted suicide) from exploitation, but I would argue that poverty itself is unjust in a land of plenty. How is it ethical to strip people of one of the tools with which they might free themselves from poverty?
The Swedish Chef goes to Washington
Now playing in the New York Times: Bork versus Bork? Bork, bork, bork!!!
I sometimes wonder whether the NY Times is intentionally using subtle humor, or whether the editors' shirts are so stuffed that the irony doesn't penetrate. For instance, Sam Brownback's views on evolution are either an epic piece of deadpan sarcasm on the level of Stephen Colbert's address at the 2006 White House Correspondents' Association dinner, or an absolutely appalling commentary on the state of scientific understanding and rationality in America. Consider the following quote:
Many questions raised by evolutionary theory — like whether man has a unique place in the world or is merely the chance product of random mutations — go beyond empirical science and are better addressed in the realm of philosophy or theology.
And this one:
It does not strike me as anti-science or anti-reason to question the philosophical presuppositions behind theories offered by scientists who, in excluding the possibility of design or purpose, venture far beyond their realm of empirical science.
And especially this one:
Man was not an accident and reflects an image and likeness unique in the created order. Those aspects of evolutionary theory compatible with this truth are a welcome addition to human knowledge. Aspects of these theories that undermine this truth, however, should be firmly rejected as an atheistic theology posing as science.
Now presenting (cue lights and music): Faith Based Science!!! First decide what you want to believe, then look for data supporting your pre-established conclusion, dismissing any contradictory evidence as merely an opposing religious claim. I would ask in amazement what sort of political policy such a decision-making framework would lead to, but I think we've been seeing the effects for the past six years. The real punch-line, however, is the following: this was not a haphazard statement sputtered out unprepared during a press conference or even a debate. This was an official, carefully honed policy statement deliberately submitted to a major news outlet. This is how Senator Brownback WANTS to be seen. At the very least, I'd like to imagine that the people at the Times editorial desk had a good chuckle before sending this to press.
Thursday, June 14, 2007
Inconsistent constituencies
I've been dancing at X-TRA's More than Mode event every week for about eight months now. Although the DJs constantly rotate, they all tend to play the same songs, and I've learned the words to most of the English ones just through repeated exposure while shaking my hips. The stasis can be a bit tedious at times, but the routine also has an air of comfort and familiarity. More importantly, I go to let the pounding bass become a metronome by which I can structure my actions and thoughts, not to discover new bands. I don't particularly like most of the stuff they play, regardless of variety. The verse-chorus-verse structure and simple, repeating melodies of club standards will never engage me intellectually, but more nuanced music is usually not very good for rhythmic gyrations.
Attending every week, I've come to recognize the regulars. Sometimes, they even acknowledge my existence, although in practice I tend to discourage such interactions. What surprises me, though, is that the crowd of regulars shifts over a time scale of perhaps three months. People who appeared without fail every single week in December and January have not graced the assembled black-clad masses with their presence in weeks. I can't decide whether these prodigal children have moved on to other musical genres, or have given up dancing entirely. Perhaps the appeal of the club lay as much in its social as its auditory atmosphere, and their interest waned as their ever-shifting social circles turned over and over. Maybe these people change identities the way other people change clothes: this month, they're goth; next month, they're gangstas. I'm thinking Raven in QC, although comic characters admittedly do not make the most reliable exemplars of actual human behavior... Or perhaps they like their lives spicy with variety, and going out to the same venue week after week grew stale.
I've never understood novelty for its own sake. I'm a creature of habit. I can more fully appreciate those things which I understand. Sensory learning is a reasonable metaphor. The first time you taste a dry martini or a very dark espresso, you're overwhelmed and almost choked by the most obvious flavors. With repeated exposure, you are able to discern the nuances layered on top of the more prominent tastes. The connoisseur experiences the same raw sensory stimuli in a completely different way than the dilettante. Although perhaps I should consider the possibility that my lack of appreciation for novelty stems from my relative lack of experience.
My amp goes to 11
I'm not a big fan of loud noise. For as long as I can remember, I've been a little phobic about hearing loss. When I go out clubbing or to a concert, I always wear earplugs (leaving a club and rejoining the world without ringing ears is a delightful experience). When listening to music by myself, I tend to keep the volume very low. During senior year of high school, with a license, a car, and a daily commute of at least half an hour, I would keep the music turned down so low that I was only able to follow the songs because I already knew the words and melody. Sometimes it takes me a few minutes to realize that an album has ended; black metal at low volumes can sound surprisingly similar to the ambient hiss of an empty room. But recently, I've been experimenting with turning the volume up a bit higher. It's remarkable how much more detail you can hear with good headphones and adequate volume. Just now, listening to Wolves in the Throne Room (who are presently touring and, according to Metal Archives, have a new album coming out), I heard an absolutely fantastic drum fill I had never noticed before. If I'm going to burn my sensory acuity on something, I think this is a worthy cause. Look at me, the intrepid risk-taker!
Did I mention that Wolves in the Throne Room is touring? I think I did. If you live on the West Coast, they will probably be passing through a nearby city very soon, potentially with Sunn O))) and Earth in tow. This is clearly the music event of the year. I obviously can't go, so you will need to enjoy it in my stead.
Wednesday, June 13, 2007
Binge and purge
I have a binging problem. I don't do alcohol binges or cocaine-fueled gambling binges or even relatively benign nitrous oxide binges. My problem is with cartoons. Mostly web comics and anime. When I first started reading Questionable Content, I spent at least one entire evening riveted to my desk, clicking through episode after episode. My experience was similar with Girly and Order of the Stick, even though their quality didn't really merit the single-minded fascination with which I devoured them. And there have been a number of anime series for which, while watching, I had to force myself to go to sleep because the sun was rising. There's something about the decadence of wasting an entire day doing something utterly worthless which I find strangely appealing.
The draw has been even stronger in recent days after finishing my NIPS paper. When I have a substantial goal with a well-defined deadline, I tend to push myself as hard as I can, all the while envisioning all of the pleasures I've deferred along the way. But when I finally finish, the freedom crashes over me like a tidal wave and drowns me, rather than carrying me aloft. In college, after finishing my finals, I generally curled up in a ball in my room for days at a time, leaving only to make use of the kitchen and the bathroom. After my Master's defence, I holed up in my parents' house for two weeks, mostly watching HBO movies. Similarly, for the past few days I've had trouble getting out of bed in the morning. I barely managed to buy groceries over the weekend, and I left lab yesterday perhaps seven hours after I arrived, including an hour-long nap and some mindless internet surfing.
I think the problem is that when you finish pushing a boulder to what appeared to be the top of a mountain, although the boulder may not exactly roll down the other side, you very clearly see that you've only reached a small plateau, and the mountain extends indefinitely. While small goals can be defined and achieved, one cannot extrapolate from the limited objectives of daily life to the general motivation for life itself. I am Sisyphus, but I don't imagine that Sisyphus is happy.
Sunday, June 10, 2007
The cult of scientific celebrity
Science is, in essence, a purely rational discipline. While naive notions of hypothesis testing or theory falsification as the primary purpose of experimentation can be dismissed out of hand, the project of science is nonetheless inherently logical, as opposed to emotional, political, or spiritual. The proper measure of a theory is always its ability to model the observed world. Neither the personal ramifications of the theory, nor its source, have any impact on its truth. In this light, I have difficulty understanding the reverence heaped upon those who have achieved success in their scientific pursuits.
Nobel prize winners in particular are accorded almost god-like status. The walls of the atrium of my present lab are decorated with pictures of our collaborators, but also with pictures of notable scientists with whom we have no direct connection. Amongst these latter pictures are a few featuring the heads of the lab together with Nobel winners who happened to pass through Zurich and give a talk at the university or ETH. Recently, Roderick MacKinnon, who won the prize for determining the structure of the potassium channel using x-ray crystallography, deigned to grace our lab with his presence for a few hours and was given the royal treatment. Indeed, in announcing this visit, one of the lab heads said, "Rod MacKinnon will be coming to visit. I trust you all know who Rod MacKinnon is," and left it at that. No, I don't know who Rod MacKinnon is. While the result for which he received the prize is important, it constituted a page or two in my introductory neuroscience textbook. It is an important piece of background information regarding the biophysics of neurons, but it has absolutely no impact on my daily work. My research would be unaffected if the three-dimensional structure of all the neuronal ion channels was still unknown. I work in a laboratory focused on computational and theoretical neuroscience. Why should any of us know who Rod MacKinnon is? Nevertheless, we had a special tea to fete MacKinnon, and everyone gathered around at his feet so that they could root about for any pearls of wisdom he might carelessly cast down. After making the obligatory graduate-student-pounce on the free food, I went back to my desk to get some real work done.
Perhaps my awe of the Nobel prize and those who have received its blessings was dulled by my years at MIT and Caltech, where you could sometimes bump into such holy personages while using a urinal. The sight of David Baltimore zipping around on his Segway like a doofus, crowned with a bicycle helmet, fails to arouse in me any worshipful feelings. The Nobel prize and other such awards are a valuable motivation for scientific achievement, but it is important to recognize that the sort of success they honor depends on luck as much as skill. The ranks of scientists at prominent universities are filled with researchers of the highest caliber who didn't happen to try the one long-shot technique that actually worked, or make just the right mistake when performing an experiment to reveal a wholly unexpected phenomenon. Just because the Nobel committee doesn't think a particular result is worthy of recognition this year does not make it less important than the finding which does happen to be honored.
Did I mention how much I like NIPS's double-blind review policy? I hope all journals adopt that model. Papers should be judged on their content, not their authors.
Wired for transcranial magnetic stimulation
Transcranial magnetic stimulation (TMS) is one of the most frightening tools used by neuroscientists in the course of their research into the function of the brain. The basic idea is that rapidly oscillating magnetic fields applied through the brain will induce currents in the neurons through which the magnetic field passes, altering their firing properties. It's sort of like magnetoencephalography in reverse. However, these currents are completely uncontrolled and have a spatial scale of centimeters, whereas a single neuron is a few micrometers in diameter. TMS is thus very similar to electroconvulsive therapy; the main difference is that the currents are entirely internal to the brain. Unsurprisingly, TMS can induce seizures, and I have yet to see convincing evidence that it does not do any focal damage to the stimulated area.
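For a sense of scale, here's a back-of-the-envelope Python sketch. The model is deliberately crude and entirely my own: assume a uniform field ramp through a circular loop of tissue, in which case Faraday's law gives an induced electric field of E = (r/2)(dB/dt) at radius r. The pulse parameters (roughly 2 T in roughly 100 microseconds) are ballpark figures for a typical stimulator, not the specs of any particular device.

```python
# order-of-magnitude estimate of the TMS-induced electric field,
# assuming (crudely) a uniform dB/dt through a circular loop of tissue;
# Faraday's law then gives E = (r / 2) * dB/dt at the loop's edge
dB_dt = 2.0 / 100e-6          # ~2 T rise in ~100 microseconds (typical pulse)
for r_cm in (0.1, 1.0, 2.0):  # loop radius in centimeters
    E = (r_cm / 100 / 2) * dB_dt
    print(f"r = {r_cm:4.1f} cm -> E ~ {E:6.0f} V/m")
```

That works out to fields on the order of 100 V/m, applied indiscriminately to every neuron within centimeters of the coil. "Completely uncontrolled" is putting it politely.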
Some reasonably reputable people use TMS to disable small areas of the brain and thus identify their function. For instance, they apply TMS to a specific region of the brain while a subject counts up from one. If an area related to short term memory or language is targeted, subjects will stop mid-count and lose their place. If the visual areas of the occipital lobe are instead subjected to the induced current, the subjects may report seeing flashes of light, but their count will continue undisturbed. From a scientific perspective, this is relatively reasonable. If you inject a whole bunch of noise into a robust and stable computational system, you would expect to disrupt the local computation without severely impacting more distant computational modules. Experiments like this need not assume that the stimulation has any particular effect on computation or learning, just that it scrambles the local state of the brain.
Less reputable people claim that TMS can be used to enhance cognitive function or treat psychiatric disorders. Wired seems to lap up these claims like a dog guzzling antifreeze: they're sweet going down, but ultimately toxic. I just stumbled on no fewer than three separate articles on this unproven and potentially dangerous technique. Of course, the really crazy reports come not from the popular media, but the academic fringe. Allan Snyder is particularly notable in this respect. Anyone who thinks that "the left fronto-temporal lobe" is a localized brain area or that eleven is a sufficient sample size for statistically significant conclusions in a single-blind protocol where the experimenter must subjectively evaluate the quality of hand-drawn images needs to have his head examined.
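To put a number on just how flimsy eleven subjects is, here's a quick power calculation in Python. To be clear about the assumptions: the "moderate" effect size (Cohen's d of 0.5) is my own choice purely for illustration, and I'm assuming a garden-variety two-sided paired t-test, not Snyder's actual statistics.

```python
import numpy as np
from scipy import stats

n = 11
d = 0.5          # assumed moderate effect size (Cohen's d) -- illustrative only
alpha = 0.05
df = n - 1

# power of a two-sided one-sample / paired t-test via the noncentral t
nc = d * np.sqrt(n)                      # noncentrality parameter
t_crit = stats.t.ppf(1 - alpha / 2, df)
power = stats.nct.sf(t_crit, df, nc) + stats.nct.cdf(-t_crit, df, nc)
print(f"power at n = {n}, d = {d}: {power:.2f}")   # roughly 0.3
```

Roughly 30% power: even if a moderate effect were real, a study of that size would miss it about two times out of three. And that's before you add an experimenter subjectively scoring the drawings.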
Saturday, June 9, 2007
Character class upgrade: Scientician became scientist
Yesterday I completed one of the rites of passage of aspiring scientists: I submitted my first journal article. Admittedly, the submission was to NIPS, which is a conference rather than a proper journal, but my eight page paper will be properly reviewed and, if accepted, published in a format more accessible than a Nature paper. Of course, it won't be searchable via Web of Science, but that's why god made Citeseer, and nothing escapes the all-seeing eye of Google Scholar.
Writing those eight pages was considerably more difficult and time consuming than I expected. I churned out text for my Master's thesis at twice or three times the rate at which this paper came together. I think the difference is that while my Master's thesis needed to be basically original and correct, I knew that no one would read it especially carefully. Slightly imprecise statements were forgivable, and less relevant pieces of data and reckless speculation were mixed freely with the essential, solid results. In the months since my candidacy exam, I've come to realize that significant sections of my thesis were in fact already known. Such weakness is not acceptable in a paper submission. I read my drafts with the same hypercritical eye I apply to every other paper I read: all statements are assumed to be not just false but stupid until proven otherwise.
In the end, I think I produced a pretty strong submission. The eight page maximum, which I originally saw as a burden because it implied a certain minimum amount of material, ended up constraining me in the opposite direction. In the last day or two, I spent hours deleting a word here and there or shrinking a figure to force everything into the allowed space. We'll see what the reviewers think. If they don't like it, someone is going to get their kneecaps kicked in. I'm not sure who exactly, because the reviews are double-blind (my name appears as A. Anonymous), but there will be at least one more cripple in the ranks of the world's theoretical neuroscientists and statistical learning theorists.
Speaking of which, I really like NIPS's double-blind review policy. While necessarily imperfect because everyone in the field has a reasonable idea of the particular topics on which each of their colleagues works, at least in principle the papers will be judged on their merit rather than the prominence of the authors. I find this policy especially just, seeing as how neither I nor my adviser is particularly well known in the NIPS community. In my case, the last sentence is obviously something of an understatement...
Finally, I think I officially received my Master's degree (in absentia) yesterday. I am now one small step closer to wallpapering a room entirely with diplomas. I'm afraid, though, that I'm going to be leaning very heavily on receiving fistfuls of honorary doctorates after the first two or three square feet.