Archive for: October, 2010

On bullshit

Oct 26 2010 Published by under From the Melodye Files

I've spent the last couple of days exploring The Guardian's secret philosophy / religion section and reading interviews from the Paris Review (and reading more and more Marquez; I can understand why Bolaño makes fun of him now, though it's hard not to find the man's scribblings adorable).  The last interview I read this afternoon was with Haruki Murakami, who changed my life at fourteen with Norwegian Wood (him and Eugenides; see: The Virgin Suicides).  There are two quotes I loved, in particular :

"In the 19th and early 20th centuries, writers offered the real thing; that was their task. In War and Peace Tolstoy describes the battleground so closely that the readers believe it’s the real thing. But I don’t. I’m not pretending it’s the real thing. We are living in a fake world; we are watching fake evening news. We are fighting a fake war. Our government is fake. But we find reality in this fake world. So our stories are the same; we are walking through fake scenes, but ourselves, as we walk through these scenes, are real. The situation is real, in the sense that it’s a commitment, it’s a true relationship. That’s what I want to write about."


"I like to write comic dialogue; it’s fun. But if my characters were all comic it would be boring. Those comic characters are a kind of stabilizer to my mind; a sense of humor is a very stable thing. You have to be cool to be humorous. When you’re serious, you could be unstable; that’s the problem with seriousness. But when you’re humorous, you’re stable. But you can’t fight the war smiling."

This came, of course, directly after reading a joyful interview with the comic writer P.G. Wodehouse, who must have been the merriest man alive. But still -- these words gave me some solace. "You can't fight the war smiling." Of course you can't, though you must try to delight in what you can.

And then there was this Don DeLillo ruby hidden in the midst of a beguiling David Mitchell interview :

"It is so much simpler to bury reality than it is to dispose of dreams."

I suppose, in the heady pursuit of science, it is impossible to bury reality.  In the academic world, there is the interminable reality of rejections; politics; ghastly ideas (and some ghastly people promoting them as well).  That strange, fake world.  But sometimes, for a moment, it's nice to forget it -- and to struggle on, imagining the rightly impossible to be very near.  Sometimes I think that perhaps what is so devastatingly wrong with the current state of academic psychology is the lack of wonderment among certain careerist academics; the obsession with protocol and publicity and status and not (rightly) with the meat and marrow of ideas, which might transform our world, which might bring us some much needed relief. When money and career are at stake, there are far too many too easily compromised.  And for what?

Should earnestness be a requirement in a good scientist?  Or should science teach us to be cynical?

(Link is to "On Bullshit," the Frankfurt essay.)

8 responses so far

What belongs in the public domain?

Oct 20 2010 Published by under Links Best Served Cold

Most science blogs link to other science blogs.  I get the feeling our readers are in good hands when it comes to getting their science-fill.  So here's what else I'm reading (or watching) right now:

Man Gets Revenge on Ex-Girlfriend on C-SPAN 2

What belongs in the public domain?  This video raises the question in a big way.  (It also happens to be hilarious).  See the Washington Post recap here.

Coming Out : On Gay Identity (A Video Series, Courtesy of BigThink)

There is something intensely voyeuristic about this, which makes it both compelling and avidly watchable.  I've seen quite a number of the videos on BigThink and am usually bored to tears within minutes (academics rambling on does not make for good viewing, typically).  But catch the brilliant at their most personal and it's something else entirely.  My favorite?  John Waters -- he comes at the end.

The Book Bench : Paul Muldoon takes on K$sha

Ever wanted to watch a vaunted Princeton lit professor take on pop's dirtiest star?  (In the vein of : Ali G. goes to Princeton)

Christine O'Donnell & Sarah Palin are proof that the more incendiary your beliefs, the better

A hysterical take-down of Christine O'Donnell by none other than the Guardian's AF Kennedy.  (A woman.)  Be sure to at least check out the picture and O'Donnell's tighttighttight smile!

"Uppity anti-masturbation campaigner, ex-witch and TV pundit Christine O'Donnell is both an embarrassing threat to established Republican interests and a woman with the stunned eyes and tighttighttight smile of a stranger to self-love. (She also presents an apparently intoxicating, Palinesque persona: part 80s hooker, part moron, part woman who may wake boys with garden shears for impure thinking.)"

Please see also Christine O'Donnell, constitutional scholar, on the separation of church and state (what's that again--?)

What is it like to be a woman in philosophy?

A collection of contributed horror stories.  Hat tip to Mr. Ritchie for sharing.

Deprecated language columnist wins fiction prize

A short, short post on Language Log by Geoffrey Pullum (a linguist whom I greatly respect and admire).  He always seems to be advocating for optimism.  "I like the diversity of humankind, and the complicated character of individual human beings. The surprises and the contradictions appeal to me."

One link that's been circulating that I really can't stand : "FCKH8 (Warning : You Will Be Offended)"

There's something truly pathetic about preaching to the converted while offending the on-the-fencers (who are you trying to win over here, anyway?).  One of the most powerful shorts I've seen in the last year was MIA's "Born Free."  I think that lyric testimony made me feel much more likely to give to a gay-rights campaign (or any other campaign to aid the marginalized or oppressed) than a flippant (adolescent) "f*ck you."  Anger can prove a powerful weapon, but wielding it is a delicate matter.

One response so far

The Knobe Effect

As an avid reader of Language Log, my interest was recently piqued by a commenter asking for a linguist's eye-view on the "Knobe Effect":

"Speaking of Joshua Knobe, has any linguist looked into the Knobe Effect? The questionnaire findings are always passed off as evidence for some special philosophical character inherent in certain concepts like intentionality or happiness. I'd be interested in a linguist's take. If I had to guess, I'd say the experimenters have merely found some (elegant and) subtle polysemic distinctions that some words have. As in, 'intend' could mean different things depending on whether the questionnaire-taker believes blameworthiness or praiseworthiness to be the salient question. Or 'happy' could mean 'glad' in one context but 'wholesome' in another, etc…"

Asking for an opinion, eh?  When do I not have an opinion?  (To be fair, it happens more often than you might expect.)

But of course, I do have an opinion on this, and it's not quite the same as the one articulated by Edge.  This post is a long one, so let me offer a teaser by saying that the questions at stake here are : What is experimental philosophy, and is it new?  How does the language we speak both encode and subsequently shape our moral understanding?  How can manipulating someone's linguistic expectations change their reasoning?  And what can we learn about all these questions by productively plumbing the archives of everyday speech?

Continue Reading »

24 responses so far

Or is that just what Chomskytron programmed you to say?

Oct 13 2010 Published by under From the Melodye Files

I can't even explain how happy this comic makes me.  The poverty of the digits?  Chomskytron?  And just look at my lips!  I'm a mad hot bot, apparently.

Anyway, not what this post is about!  (But so excellent, I had to include it).

I recently came by a fantastic little textbook written by Larry Trask, an acclaimed (and outspoken) American linguist who specialized in the study of Basque, and was known to occasionally rage against Chomsky in The Guardian.  The subject of his text?  Historical linguistics.  A subject that, to be fair, I haven't read much about since I was an impressionable young teenager, discovering Merritt Ruhlen and protohuman language in the musty (dusty) stacks of the Glendale Public Library.  ("This sounds like historical fiction..." I remember thinking.)  In any case, the Trask text has proved a wonderful refresher and I highly recommend it if you can find it on Amazon; it appears to be selling for under $5!

Continue Reading »

6 responses so far

We've been drawn!

Frequent commenter and excellent artist Joseph Hewitt has immortalized us in colored pencil. Apparently I play the role of "pseudoscientific dogmatist." wut?

(click to see the full page)

While we're at it, here are a few more links to enjoy:

From the NY Times:

When a 12-year-old’s mother asks him “How many times do I have to tell you to stop?” he will understand that the answer, if any is required, had better not include a number.

But that insight requires a sophisticated understanding of ironic language that develops long after fluent speech. At what age do children begin to sense the meaning of such a question, and to what degree can they respond appropriately to other kinds of irony?

In laboratory research on the subject, children demonstrate almost no comprehension of ironic speech before they are 6 years old, and little before they are 10 or 11. When asked, younger children generally interpret rhetorical questions as literal, deliberate exaggeration as a mistake and sarcasm as a lie.

Also from the NYT:

Child Protective Services investigated more than three million cases of suspected child abuse in 2007, but a new study suggests that the investigations did little or nothing to improve the lives of those children.

Feel free to share other interesting links in the comments.

2 responses so far

Concepts : A Thing or An Act?

Oct 08 2010 Published by under From the Melodye Files

There is a temptation to see a concept as a static ‘thing’ somehow stored in the mind. One might imagine, for instance, that our concept of the word ‘hammer’ consists of a visual memory of a hammer or series of hammers (or a ‘prototype’ or ‘schema’ of a hammer, whatever that should be).  That we can imagine how this might be so is not a good reason to adopt the idea.  Indeed, it would be far better to discard this notion altogether.  If we take language to be a skill, like tennis or painting, we quickly see how the idea breaks apart at the seams.  For one does not have a static ‘concept’ of how one paints a landscape, or a fixed ‘concept’ of how one approaches a serve; rather, one learns, over time, the toss of the ball, the arc of the back, the gentle shifting of weight, the flex of the wrist; and in all of this, there is the demand of sustained practice and coordination, the reproof and rebuke of time, and, as ever, a great number of processes – physiological and mental – that contribute to the execution of the act (which is, almost certainly, imprecise).  If a concept is a skill too, then it is a learned process; an active engagement; one of a suite of ways of representing the world.

On our confusion with words (and our idea that a word 'stands' for a thing), Wittgenstein said :

"This [confusion] is connected with the conception of naming as... an occult process.  Naming appears as a queer connection of a word with an object. --[But] you really [only] get such a queer connexion when the philosopher tries to bring out the relation between name and thing by staring at an object in front of him and repeating a name or even the word "this" innumerable times.  For philosophical problems arise when language goes on holiday.  ...We can [of course] say the word "this" to an object, or as it were, address the object as "this"--[but this] is a queer use of the word, which doubtless only occurs in doing philosophy."

10 responses so far

Hilarity did not ensue

Oct 07 2010 Published by under Links Best Served Cold

There's this disturbing (and simultaneously hilarious) article in yesterday's NY Times about rampant fraud and plagiarism among China's academic ranks.

My favorite line, by far :

He cited the case of Chen Jin, a computer scientist who was once celebrated for having invented a sophisticated microprocessor but who, it turned out, had taken a chip made by Motorola, scratched out its name, and claimed it as his own.

I can just imagine the poor man patiently scratching out "Motorola" and writing "Chen Jin" over it in crayon.  Brilliant, really.  What's truly amazing, of course, is that anyone believed this -- as my friend Joe pointed out, taking credit for a chip is kind of like taking credit for a 767.  "Oh zees?  I built it in ze evenings, weeth some scrap metal an' a soldering iron."

More unnerving :

After Mr. Chen was showered with government largess and accolades, the exposure in 2006 was an embarrassment for the scientific establishment that backed him.  But even though Mr. Chen lost his university post, he was never prosecuted. “When people see the accused still driving their flashy cars, it sends the wrong message,” Mr. Zeng said.

The problems in China are more than a little blatant.  But what have the recent American scandals told us about US institutions?  --Are these anomalies, to be brushed under the table?  Or does the integrity of scientific research in the US deserve a closer look?  (Thanks to @CaldenWloka for the scoop)

One response so far

What is ADHD? Paradigm Shifts in Psychopathology

Wow. Lots of psycholinguists around lately, huh? How about a change of pace? Think you guys can handle something not about Lord Chomsky?

Over the last one hundred years, paradigm shifts in the study of psychopathology have altered our conceptualization of attention deficit/hyperactivity disorder (ADHD), as a construct and as a diagnostic category. With few exceptions, it has generally been accepted that there is a brain-based neurological cause for the set of behaviors associated with ADHD. However, as technology has progressed and our understanding of the brain and central nervous system has improved, the nature of the neurological etiology for ADHD has changed dramatically. The diagnostic category itself has also undergone many changes as the field of psychopathology has changed.

In the 1920s, a disorder referred to as minimal brain dysfunction described the symptoms now associated with ADHD. Researchers thought that encephalitis caused some subtle neurological deficit that could not be medically detected. Encephalitis is an acute inflammation of the brain that can be caused by a bacterial infection, or arise as a complication of another disease such as rabies, syphilis, or Lyme disease. Indeed, during an encephalitis outbreak in the United States in 1917-1918, children presented in hospitals with a set of symptoms that would now be described within the construct of ADHD.

In the 1950s and 1960s, new descriptions of ADHD emerged due to the split between the neo-Kraepelinian biological psychiatrists and the Freudian psychodynamic theorists. The term hyperkinetic impulse disorder, used in the medical literature, referred to the impulsive behaviors associated with ADHD. At the same time, the Freudian psychodynamic researchers (who seem to have won the battle in the DSM-II) described a hyperkinetic reaction of childhood, in which unresolved childhood conflicts manifested in disruptive behavior. The term "hyperkinetic," which appears in both diagnoses, describes the set of behaviors that would later be known as hyperactive – despite the fact that medical and psychological professionals were aware that there were many children who presented without hyperactivity. In either case, it was the presenting behavior that was the focus – an emphasis implicit in the behavioral paradigm that guided the field.

When the cognitive paradigm became dominant, inattention became the focus of ADHD, and the disorder was renamed attention deficit disorder (ADD). Two subtypes would later appear in the literature, corresponding to ADD with or without hyperactivity. The diagnostic nomenclature reflects the notion that the primary problem was an attentional (and thus, cognitive) one and not primarily behavioral. The attentional problems had to do with the ability to shift attention from one stimulus to another (something that Jonah Lehrer has called an attention-allocation disorder, since it isn't really a deficit of attention). The hyperactivity symptoms were also reformulated as cognitive: connected with an executive processing deficit termed "freedom from distractibility."

In DSM-IV, published in 1994, the subtypes were made standard. There wasn't much change in the diagnostic criteria per se, but the name of the disorder changed, reflecting shifts in the literature's understanding of its etiology. The term ADD did not hold up, and the disorder became known as ADHD, with three subtypes: ADHD with hyperactivity/impulsiveness, ADHD with inattention, and a combined subtype in which patients have both hyperactive and attention-related symptoms. Thanks to improved neuroimaging technology, these subtypes seem to reflect structural and functional abnormalities found in the frontal lobe, and in its connections with the basal ganglia and cerebellum.

The set of symptoms associated with ADHD seems not to have changed much in the last one hundred years. However, paradigm shifts within the field of psychopathology have changed the way in which researchers understand the underlying causal factors, as well as which of the symptoms are thought to be primary.

6 responses so far

The reality of a universal language faculty?

Oct 05 2010 Published by under Forget What You've Read!

An argument is often made that similarities between languages (so-called "linguistic universals") provide strong evidence for the existence of an innate, universal grammar (UG) that is shared by all humans, regardless of language spoken.  If language were not underpinned by such a grammar, it is argued, there would be endless (and extreme) variation, of the kind that has never been documented.  Therefore -- the reasoning goes -- there simply must be design biases that shape how children learn language from the input they receive.

There are several potentially convincing arguments made in favor of innateness in language, but this, I think, is not one of them.

Why?  Let me explain by way of evolutionary biology:

Both bats and pterodactyls have wings, and both humans and squid have eyes, but neither pair shares a common ancestor that had these traits.  This is because wings and eyes are classic examples of 'convergent' evolution -- traits that arose in separate species as 'optimal' (convergent) solutions to the surrounding environment.  Convergent evolution has always struck me as a subversive evolutionary trick, because it demonstrates how external constraints can produce markedly similar adaptations from utterly different genetic stock.  Not only that, but it upends our commonsense intuitions about how to classify the world around us, by revealing just how powerfully surface similarities can mask differences in origin.

When we get to language, then, it need not be surprising that many human languages have evolved similar means of efficiently communicating information. From an evolutionary perspective, this would simply suggest that various languages have, over time, 'converged' on many of the same solutions.  This is made even more plausible by the fact that every competent human speaker, regardless of language spoken, shares roughly the same physical and cognitive machinery, which dictates a shared set of drives, instincts, and sensory faculties, and a certain range of temperaments, response-patterns, learning faculties and so on.  In large part, we also share fairly similar environments -- indeed, the languages that linguists have found hardest to document are typically those of societies at the farthest remove from our own (take the Piraha as a case in point).

Continue Reading »

11 responses so far

The questions we should still be asking about gender

Oct 04 2010 Published by under From the Melodye Files

"...for the present enshrines the past – and in the past all history has been made by men." --Simone de Beauvoir, The Second Sex

Last week, I wrote a short piece on my (stunning) failure to be socialized according to our culture's gender norms. As I pointed out, I spent much of my adolescence wearing my father's hand-me-downs and drinking cheap whiskey with the loud boys (the kind, you know, who wore cordovan wingtips and eyeliner to first period). We were a delightful lot of misfits. Anyhow, an old friend, reading this, sent me a link to a new collection by Dries Van Noten, as photographed by The Sartorialist, with the note "ahead of the times, eh?" But of course.

The Sartorialist writes : "The take away from this show? Steal your Dad's clothes, all your dad's clothes. His shirts, his jeans, his sportcoats are all fair game now."

Why, I wonder, did I ever learn to wear lipstick? And like it?

To be fair, menswear has long been a 'classic look' on women of my build (broad shoulders, long neck) and height (as tall as a man). In search of my style predecessors, I searched the Internets for some of my women-loves. I've posted a small gallery at the end of the post.

But before getting to that, I thought I would raise a number of questions on gender that still demand discussion :

On language
How does language shape our thoughts on gender? How do we use language - as a form of behavior and as an expression and extension of culture - to implicitly enforce gender norms?  And then -- secondarily -- if we think that language is shaping (or implicitly constraining) our thoughts and beliefs about gender, is it worthwhile to assess and try to change those values? Should we try to self-consciously change the way we speak?

On biology
In society, what role does biology play in propelling men (and not women) to the top? Are the traits of highly successful men (e.g., hyperfocus, ambition, hypomania) truly absent in women? In what way is the expression of these traits mediated by cultural norms and practice?

On sex
How does female sexuality play into all of this? What part does modern culture (pornography, fashion, etc) play in shaping our expectations of women? Must powerful, iconic women necessarily be de-sexualized, gay, or explicitly counter-culture?  What happens when women turn the tables and objectify men?  [Links are to the Martin Amis classic on pornography "Pussies are bullshit," the Dove ads scandal, and the Karen Owen sex-thesis (er, f*ck list), respectively.]

And finally, a provocative question from a conversation I was having this evening about Hemingway (who was often accused of misogyny) :

What does it mean to be a misogynist in an age (or society) where women are socialized to be powerless, subservient and inferior?  What does it mean to be a feminist?

In the days to come, I'll write a little on the research I've been doing on the differences in how we use gendered words (like "he" and "she" and "man" and "woman").  The differences are striking, and sometimes more than a little startling.  Here's a simple one you might not expect : when it comes to labeling people by their sexual orientation, we're far more interested in a man's preference than a woman's.  In fact, we label men by their orientation (gay, straight, bi) about ten times more often than we do women.  But that ratio nearly reverses when it comes to marital status.  We talk incessantly about whether women are "married," "single," or "divorced," but when it comes to the guys, we couldn't care less.  What does it all mean?  --I'll get to that bit shortly.
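The kind of tally described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration only -- the label lists, the single-modifier pattern, and the three-sentence corpus are all invented for the example, and are not the actual method or data behind the ratios reported here:

```python
import re
from collections import Counter

# Hypothetical label sets, for illustration only.
ORIENTATION = {"gay", "straight", "bisexual"}
MARITAL = {"married", "single", "divorced"}

def label_counts(sentences):
    """Tally how often 'man'/'woman' is directly preceded by an
    orientation label vs. a marital-status label."""
    counts = Counter()
    # Match a single modifier word immediately before 'man' or 'woman'.
    pattern = re.compile(r"\b(\w+)\s+(man|woman)\b", re.IGNORECASE)
    for s in sentences:
        for modifier, noun in pattern.findall(s):
            modifier, noun = modifier.lower(), noun.lower()
            if modifier in ORIENTATION:
                counts[("orientation", noun)] += 1
            elif modifier in MARITAL:
                counts[("marital", noun)] += 1
    return counts

# A toy "corpus" -- a real study would use a large text collection.
corpus = [
    "A gay man and a married woman were interviewed.",
    "The divorced woman spoke before the straight man.",
    "A single woman met a gay man at the reading.",
]
print(label_counts(corpus))
```

Run over a real corpus with fuller label sets, counts like these are what the orientation and marital-status ratios would be computed from; the specific numbers mentioned above come from the author's own research, not from this sketch.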

11 responses so far

Older posts »