Politics, Technology, and Language

If thought corrupts language, language can also corrupt thought — George Orwell

Archive for February, 2008

William F. Buckley Jr. (1925-2008)

Posted by metaphorical on 28 February 2008

I suppose I’m not supposed to mourn the passing of William F. Buckley Jr. but I can’t help myself. Buckley was a fierce proponent of a sort of spare, consistent arch-conservatism that one almost longs for in these days of big-government, big-business Republicanism.

Ryan Lizza’s New Yorker article this week, “On the Bus: Can John McCain reinvent Republicanism?” reminds us, as if we need it, that there are many types of Republican – the radical religious right-wingers who flocked to Huckabee; the strangle-government types such as Grover Norquist; the small-government Goldwater/Reagan types; the oddly pragmatic sort that Gingrich has turned into; the moderates in the tradition of Eisenhower and Nelson Rockefeller (it’s Lizza’s contention that McCain falls into this category); and the libertarians, such as Ron Paul (except that Paul is also a bat-shit-crazy conspiracy theorist).

While I’ve never been a connoisseur of conservatism, and so I might get this wrong, Buckley struck me as one who straddled the Goldwater and libertarian camps, reminding them both to be at once pragmatic and pure. And while he sometimes wore the same ideological blinders as, say, Reagan, he was also committed to reason in a way that few hard-core conservatives are. And so it is that the moment I remember best about Buckley was also one of his most rational and therefore also, surely, one of his finest.

I couldn’t have been more than about 16. It was, therefore, about 1972, and my grandmother had that still-rare commodity, a color television. I’m not sure that’s why I was downstairs, in her part of the house, while she was out, or whether it was just to watch some TV without arguing with anyone about what to have on. I don’t know why I would have stopped turning the dial at Firing Line, except that in the days of 9 channels and no remotes it was sometimes even harder to find something worthwhile than it is with today’s 999.

And so I sat on her worn couch, watching her Sony television. Any memory of her place necessarily includes the lingering smells of olive oil, chicken livers, overripe bananas, and Chesterfield nonfiltered cigarettes. If it was winter, then her place was also much warmer than the basement couch I slept on.

Certainly the topic itself was interesting – decriminalizing drugs. I’d done my share already, but one knew what a Buckley would think about them, and who needed to hear that? I guess the thing is, I don’t channel surf as quickly as most people do.

I don’t know who Buckley’s guest was. All I know is that he advocated decriminalizing drugs, and he had plenty of good reasons, and he was kicking Buckley’s ass, because he had none. And there was this moment somewhere down near the bottom of the hour, maybe at the 28 minute mark, when you could see this look on Buckley’s face as if he was hearing the guy for the first time and you could see that 2+2 was starting to equal 4 for him.

Wait, you could hear him think. A small government doesn’t care what adults do in the privacy of their own home. Wait, these people are only hurting themselves, if anyone, and a small government is okay with that. Wait, why should a government care about whether people self-medicate with cocaine instead of caffeine? Actually, that’s more of a 1980s thought. But I do think I remember his guest asking Buckley where he would draw the line: what if the government decided to consider caffeine a narcotic?

Right then and there I saw that rare thing, someone listening to the voice of reason enough to switch sides. On television, with millions (okay, some significant fraction of a million) watching. And a hard-line conservative to boot.

But Buckley wasn’t just any hard-line conservative. He was a thoughtful guy. He could hear, and even heed, the voice of reason. And forever that made him and me more alike than different. Farewell, WFB, Jr. Godspeed.

Posted in language, philosophy, politics, pop culture | 8 Comments »

It’s time to kill the USDA, before it kills us

Posted by metaphorical on 25 February 2008


We haven’t talked yet about last week’s recall of 143 million pounds of beef, the largest meat recall in U.S. history.

As the BBC and others reported at the time, “It comes from a company in California, which officials said allowed meat from cattle unable to stand at the time of slaughter to enter the food chain.”

Today, the Wall Street Journal is reporting that the company, Hallmark/Westland Meat Packing Co., will almost certainly shut down for good.

The meatpacker voluntarily suspended operations in early February, after the U.S. Department of Agriculture began investigating how it treated animals.

What’s of particular interest, from a journalistic point of view, is the step before that – how the USDA came to investigate in the first place.

The USDA investigation began after the Humane Society of the United States released an undercover video showing workers at the Chino slaughterhouse trying to make sick or injured cows stand up with electrical-shock devices, fork lifts and high-pressure water hoses. State and federal animal-cruelty laws prohibit such activities.

Federal laws also prohibit the sale and distribution of so-called downer cattle because of the high risk of mad-cow disease. That risk isn’t taken seriously by consumers, in large part because they rely on the government to take it seriously. And the USDA doesn’t do the job its counterparts do in other countries, largely because it’s insufficiently independent of the industry it’s supposed to regulate.

In the days after Upton Sinclair’s The Jungle was published, the need for government oversight must have seemed obvious, for the safety of the meat supply and of the meatpackers, if not for the short-lived well-being of the animals themselves. Unfortunately, if you read any of that book’s latter-day counterparts, like Fast Food Nation or Diet For a New America, it’s clear that, 102 years later, little if any progress has been made.

According to KCBS in Los Angeles and others, one-third of the recalled beef went to schools. (KCBS is also the source for the nice graphic at the top of this post.)

Basically, this was a bottom-of-the-market meatpacker that was probably on shaky ground until it got the federal contract to supply schools. Going back to the Wall Street Journal article for a moment,

Until the plant suspended operations, it was earning a modest profit on annual sales of roughly $100 million, he said. “It’s a low profit-margin business,” he said.

In the last government fiscal year, the Agriculture Department paid Hallmark/Westland about $39 million for ground beef for food nutrition programs, including the school-lunch program. Hallmark/Westland was honored by the department as its Supplier of the Year for the 2004-05 school year. It began supplying meat to the program in 2003 after a rigorous application process with the Agriculture Department, which has authorized about 10 meatpackers nationwide to compete for contracts to supply beef to the program.

Quite the rigorous application process, if the Humane Society had to do the USDA’s job for it. It might be fair to paraphrase Groucho and say that we don’t want the schools to be supplied by any company that needs the work.

So maybe it’s time to think about the unintended consequences of having an agency like the USDA exist in the first place. Maybe it’s time to notice that inadequate oversight is in many ways worse than no oversight at all.

In a caveat-emptor world, consumers would be warier of what they let pass through their mouths. (The wording is deliberate there; even the most desperate hooker is more discriminating than the average hamburger consumer.) We would come to rely on brands, either of the distributor, or the restaurant or supermarket itself, and those brands would be on the line with every purchase. The A&Ps and Vons of the world would have to either police their food sources themselves or get out of the game. Perhaps third-party inspectors would emerge to do what the USDA can’t or won’t – rigorously examine the practices of factory farms and slaughterhouses.

For as long as we’ve known about mad-cow disease, the USDA has done a poor job of protecting consumers from it. Take this summary report from 2006, for example:

USDA slammed for letting high-risk downer cattle reach consumers

(Japan Economic Newswire Via Thomson Dialog NewsEdge) WASHINGTON, Feb. 8 (Kyodo) U.S. beef inspectors have failed to fully comply with rules banning cattle that are unable to walk to safeguard consumers from mad cow disease, leading at least 29 such animals, including 20 high-risk “downers,” to reach the food chain, according to a recent government audit report.

The failure angered some activist groups in the United States, blasting the U.S. Department of Agriculture for putting consumers at risk of the disease, formally known as bovine spongiform encephalopathy, despite a no-downer policy maintained for more than two years as a protective firewall against BSE.

The first benefit to ending the USDA’s miserable existence would probably be the inspection of each and every slaughtered animal for mad-cow disease.

Japan tests every animal, and in 2003 halted imports of U.S. beef over mad-cow concerns – $1.7 billion worth in 2003, according to an MSNBC editorial in 2006. That was the year that Japan lifted the ban, only to have to quickly reinstate it.

How pure is the U.S. beef supply, really?
By Phil Lempert
“Today” Food Editor
Tues., Jan. 24, 2006.

[….] Last week, just a month after the Japanese government decided to allow the import of U.S. beef into that country, it has once again halted shipments of American beef into Japan because animal spines were found in three boxes of frozen beef being brought into the country.

When the two-year-old ban was lifted late last year, it was with the expressed condition that imported U.S. beef come from cattle no older than 20 months and that spinal cords, brains and other parts blamed for spreading the human variant of mad-cow disease be removed.

There are those who argue that the risks just aren’t high enough for us to mimic the paranoid Japanese. Let’s leave aside a multi-billion-dollar export opportunity for American business, and focus on our own health and safety.

Back in 2005, a California State Senator, Jeff Denham, tried to make the case that universal testing was unnecessary.

Since the first cow tested positive in 2003, the United States Department of Agriculture (USDA) has tested over 400,000 cows and only one other tested positive. To put this in perspective, you have a better chance of being struck by lightning this year than a neighborhood cow testing positive for Mad Cow disease.

The USDA is testing those cattle with the highest likelihood of having Mad Cow disease – not just a random sampling. Cattle with the highest likelihood of contracting Mad Cow – “downer cows” that are unable to stand up, die unexpectedly, or have other signs of illness – are the ones that are tested. So those cattle that are healthy are even less likely to have Mad Cow disease.

Obviously, though, the USDA is not testing those cattle with the highest likelihood of having Mad Cow disease, even though they’re supposed to be.

The cattle industry, and guys like Denham, think it’s just too expensive to test every head as it comes to slaughter.

Some will still argue that those odds are not good enough and that every head of cattle should be tested. With more than 95.8 million cows nationwide, it simply is not feasible and not cost effective.

So how expensive would it be?

As it happens, that calculation has been done, for 10 million head per year, the same as Denham argued against. As it happens, that’s not for the Japanese standard of testing every slaughtered animal, but the European standard of testing those over 30 months. An article here quotes a Wall Street Journal article from 2004 in which the calculation is pretty straightforward.

Test kits cost about $10 a pop…. Add in salaries of lab technicians, the cost of grinding up and delivering cattle brain samples for testing, and the tab would be $30 to $50 per animal, industry experts say. The average U.S. cow slaughtered for food yields meat with a retail value of $1,636.

Each year in the U.S., about 35 million cattle are slaughtered. About 10 million of these animals — those over 30 months of age — would be tested for BSE if the U.S. were to adopt European standards, because age is associated with infection.

The grand total to test about 10 million cows in the U.S. would be $300 to $500 million a year. Considering that Americans spend more than $50 billion on beef annually, that would add between six cents and 10 cents per pound.

I’m not too crazy about the 6-10 cents/lb. calculation, since it’s hard to know what the $50 billion figure refers to. It might include $30 entrees at Morton’s Steakhouse. So let’s look instead at the per-head stats: $50 out of $1,636. If spread out per-dollar, instead of per-pound, in round numbers the testing adds 3%. If chopped meat is roughly $3.00/lb, we’re still in the same range, another 9 cents.
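As a quick sanity check, the per-head arithmetic can be spelled out. This is just a sketch of the figures quoted from the Journal; the only assumed input is the $3.00/lb chopped-meat price used for illustration above.

```python
# Inputs from the 2004 WSJ figures quoted above
test_cost_per_head = 50.0        # high-end estimate: kit + labor + logistics ($)
retail_value_per_head = 1636.0   # average retail value of meat from one cow ($)

# Testing cost as a share of the animal's retail value
surcharge = test_cost_per_head / retail_value_per_head
print(f"Surcharge per retail dollar: {surcharge:.1%}")   # about 3%

# Applied to a $3.00/lb price for chopped meat (assumed, for illustration)
price_per_lb = 3.00
print(f"Added cost per pound: {surcharge * price_per_lb * 100:.0f} cents")

# Aggregate check: 10 million head tested at $30-$50 each
for per_head in (30, 50):
    total = 10_000_000 * per_head
    print(f"${per_head}/head -> ${total / 1e6:.0f} million per year")
```

The aggregate line reproduces the article’s $300 to $500 million range, and the per-head line lands at the same 9-cents-a-pound figure, so the two ways of slicing the numbers agree.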

So there you have it. The cost to insure against mad-cow disease is 10 cents/lb. or less. That still doesn’t do anything about the harmful antibiotics in meat, the other chemicals, the hazardous working conditions in slaughterhouses, the inhumane ways that animals are reared and killed, the befouling of the nation’s drinking water, the erosion of its land, or any of the other problems of factory farming. But it would solve, or start to solve, the mad-cow problem.

Is the industry really afraid of adding 10 cents a pound to meat prices? Hardly. It’s afraid to find out the extent of the mad-cow problem. And it’s afraid of the costs that might be engendered by the changes needed in the way animals are reared and slaughtered, once the extent of the problem is known.

Basically, the meat industry doesn’t want to find out how many animals are infected. And it doesn’t want the extra work of keeping brains and spines out of the hamburgers we eat. And until we get rid of the USDA, or truly empower it with resources and independence, no one is going to make the industry do anything it doesn’t want.

Posted in animal-rights, food, Orwell, politics, pop culture | Tagged: , , , , | 3 Comments »

No award for old men?

Posted by metaphorical on 24 February 2008

In the run up to the Academy Awards, Knowledge News has a nice article, “Oscar’s Biggest Snubs” (thanks Claire, for the link), describing how some of Hollywood’s best films didn’t even win best-picture in the year they were released.

Citizen Kane, often cited as the greatest movie of all time, tops the list, and two of my favorite movies ever are there as well, Chinatown, and Double Indemnity. Singin’ in the Rain, not one of my favorite movies, but surely touched by greatness, and Some Like It Hot, round out the list. There’s also an homage to Alfred Hitchcock, surely the most underawarded director in Academy history.

Singin’ in the Rain apparently lost out to The Greatest Show on Earth. Now that’s a movie that I could watch over and over again, but it’s hard to see it as better than one of a few score movies that people will remember for the next fifty years.

Hollywood has always confused entertainment with greatness, and it’s always fun to see that tension play out as the Academy votes each year. Oddly, they struggled in reverse with Hitchcock—voters obviously thought of movies like Rear Window and Psycho as throw-away entertainment, when in fact we now see their lasting value and Hitchcock as one of the great auteurs of all time.

Which brings us to this year. Of the five nominees, there’s no obvious winner, though a couple will be memorable for a long time and none of them is really disposable entertainment. (The official list is here, but you have to like IMDb’s for its linkability.)

Atonement
Juno
Michael Clayton
No Country for Old Men
There Will Be Blood

We can cross There Will Be Blood off the list right away. It’s a mess of a movie, structurally unsound, poorly plotted, and with absolutely no likeable characters. It’s hard to see how it even got nominated, except for Daniel Day-Lewis’s performance.

Michael Clayton is a terrific movie, but not the kind that normally emerges as Best Picture. For one thing, it has no actual point, other than that revenge is sweet and that, at least in Hollywood, the smartest guy sometimes wins. It puts wit and charm in an action movie, but, frankly, that was true of Sneakers and The Italian Job, and no one ever nominated them for Oscars.

No Country for Old Men is a strong contender, because it captures a lot of mind-share as possibly the best-ever for its genre, which is that of Gruesome Thoughtful-Action Movie, a specialty of the Coen brothers. Unforgiven was in that genre, and did well in its year, as did Fargo. The comparisons are limited, in that each of those movies had characters more likeable than Tommy Lee Jones’s. On the other hand, there’s a growing recognition of the auteur quality to the Coen oeuvre.

Juno is the kind of small picture that can, in these post-Little-Miss-Sunshine days, easily get nominated, but perhaps never win. It does have the merits of actual themes, a plot, a point of view, and funky believable characters, the central one of which has just the sort of change that a leading lady, even one of 16, is supposed to undergo. In other words, it’s a classic movie, and those are in somewhat short supply this year.

Even more interestingly, the central character in Atonement is likewise transformed and then, as the characters who inspired it die off, reverts to her earlier self. That’s a remarkably difficult message for Hollywood to deliver, and Atonement succeeds against all odds. Combine that with the luminous development of two characters we give our hearts to in the first part of the movie, and the radically different cinematography in the front and back halves, either of which probably deserves an award in that category, and I would have to pick this as my favorite movie of the year, and the one I’d like to see win the Best Picture award.

Some other quick picks:

Best Actor – I only saw two of the nominated performances, so I don’t get a vote. If anyone beats Daniel Day-Lewis, though, I will have to run out and see that movie.

Best Actress – I only saw one performance here. Normally that wouldn’t matter, because it was Ellen Page’s, and you ask yourself, is anyone good enough to beat that? Unfortunately, when the category includes Cate Blanchett and Julie Christie, the answer is yes.

Best Supporting Actor – the three performances I saw, Javier Bardem, Philip Seymour Hoffman, and Tom Wilkinson, were pretty amazing. Even more astonishing, though, is that Casey Affleck is nominated for something that’s presumably even better than he was in Gone Baby Gone. Personally, I hope Javier Bardem wins, because we’ll see Philip Seymour Hoffman get nominated a bunch more times, while this was Bardem’s role of a lifetime.

Best Supporting Actress – I saw four of the performances. Ruby Dee might get it, for sentimental reasons. I hope not, because it just wasn’t that memorable a role, certainly not compared to Saoirse Ronan’s, or Amy Ryan’s. Again, the missing performance is Cate Blanchett, so anything could happen here. I’m rooting for the kid.

Adapted Screenplay – I missed two of these films, unfortunately. I just hope and trust that There Will Be Blood doesn’t win, because most of its problems as a movie, not the least of which is an ending that’s both totally inevitable and completely unsatisfying, could have been fixed at the screenplay level.

Original Screenplay – I only saw two nominees, but I hope Juno gets it. It is, truly, original, in its story and its characters, in all the best ways. As a budding screenwriter, I am in awe of the writing in movies like Sideways, Little Miss Sunshine, and Juno.


Well, most of the awards I cared most about fell where I wanted them to. In many cases, I didn’t see the winner’s work, so I can’t judge how smartly the Academy voted.

One exception to that was Tilda Swinton, who won best supporting actress; it was a great little part, played with greatness, sure, but it was a little part, and surely any number of actresses would have done just as well. I thought none of that was true of Saoirse Ronan’s performance.

We actually have the DVD of “La Vie en Rose” in the house, and I’m eager to see Marion Cotillard’s performance. She looked and sounded pretty damned good.

I’m disappointed that Atonement didn’t win Best Picture, but I’m happy that the Coen brothers won for directing. I’m similarly glad that There Will Be Blood won for cinematography; whatever that picture’s flaws were, there were none at the level of the images on the screen.

On the plus side, Javier Bardem won his gold, and gave a great speech.

Best of all, Juno won for original screenplay.

Posted in pop culture, screenwriting, the arts, writing | Tagged: , , | 30 Comments »

“Best Sidekicks Ever”? Sending Wired back to the silver screening room

Posted by metaphorical on 23 February 2008

No, I don’t exactly know why I still subscribe to Wired, except that it’s only $12, which, per-issue, meets my going price for never saying no to something ($1). And there’s always one article that’s worth breaking into the plastic bag it comes in. Wired is no longer a technology publication, if it ever was one; it’s a tech-oriented lifestyle magazine. Not surprising, since it’s published by Conde Nast—GQ and Self, for example, are lifestyle magazines as well, the one fashion-oriented, the other fitness-related.

So it’s useless to ask why Wired would run an article like “The 9 Best Sidekicks Ever.” This is just the kind of pop-culture pap that it now excels at. The March issue isn’t online yet, so I can’t point to it. I’ll just name them:

Sam from Lord of the Rings
Mr. Spock
Dana Scully

Some of these are hard to disagree with (Scully, Smithers, Spock); others are bizarre—I thought Michael Knight was the sidekick, for example, and if Willow was Buffy’s sidekick in the first season, she wasn’t one by the last one. And I don’t even know who Beakey is (the picture looks Sesame Street-related).

But, whatever. Rational people can disagree about these things. What I find objectionable is that none goes back to before 1966 and hardly any are pre-1980. (Admittedly, the Sam and Robin characters predate their television and film instantiations, but as there are no pure book or comic book entries in the Wired list, I’m supposing that some form of video life is a prerequisite.)

I’ve taken it upon myself, then, to come up with another list—not necessarily the “Best Ever,” just a list of great black-and-white sidekicks that, by being at least as good as Wired’s “best ever,” refute their list.

By the way, even sticking to the modern era, the list is not hard to refute: From TV, its two greatest buddy-roles: Bill Cosby (I’m not even going to mention the character’s name… okay, it turns out to be Alexander Scott) in I, Spy, and Illya Kuryakin (David McCallum) in The Man from U.N.C.L.E. From film, the two greatest buddies are surely Bob Hope’s sidekick roles to Bing Crosby in the Road movies, and Jerry Lewis to Dean Martin.

And I’ll just mention in passing two other glaring omissions: Cameron Frye (Alan Ruck) in Ferris Bueller’s Day Off and Hobson (John Gielgud) in Arthur. Oh, and let me just put in a good word for Bruno Kirby, an exceptional sidekick to Billy Crystal in When Harry Met Sally (and again in City Slickers) and to Matthew Broderick in the highly underrated The Freshman.

Without further ado, my black-and-white sidekick list.

Muggsy (William Demarest) in The Lady Eve

A movie so terrific, it has two sidekicks – Gerald (Melville Cooper) is Charles Coburn’s – and the dueling between them makes this the best sidekick movie ever.

Eddie (Walter Brennan), in To Have and Have Not

Was you ever stung by a dead bee?


Nora Charles (Myrna Loy) in The Thin Man

William Powell is a bit miscast as Nick, but Myrna Loy will forever be the perfect Nora.

Jeffrey Baird (Edward Everett Horton) in Shall We Dance

Horton was the perfect sidekick in dozens of films; this might be his biggest role, so it’s my choice to represent him.

Cosmo Brown (Donald O’Connor) in Singin’ In The Rain

Over in rec.climbing, someone once said, “Mount Meeker would be a really impressive mountain if God had not decided to place it right next to Long’s Peak.” Donald O’Connor might have been Hollywood’s greatest dancer if Fred Astaire had never lived. He was certainly a better dancer than Gene Kelly, and hiding that fact in Singin’ In The Rain makes him the best dance sidekick ever.

Alma (Thelma Ritter), Doris Day’s inebriated sidekick in Pillow Talk

Ritter’s most memorable role might be The Misfits, where she was Marilyn Monroe’s sidekick, but Pillow Talk was probably the most sidekick-y of her 6 Oscar nominations.

Top Sgt. Quincannon (Victor McLaglen) in Rio Grande and She Wore A Yellow Ribbon

John Wayne always brings out the best in supporting actors, and many of them are sidekicks. The Quincannon role is simply the best of the best, the classic one-dimensional heart-of-gold supporting character who will take a bullet for the star, has a not-very-well-hidden unrequited crush on the star’s wife, and loves their children more than life itself. McLaglen, like Brennan, Demarest, and Horton, did it his entire career.

Posted in pop culture, technology, the arts | 6 Comments »

After 500 years of progress, we’re still waterboarding people and stoning them to death for adultery

Posted by metaphorical on 19 February 2008

Q: Which is worse, adultery or witchcraft?

A: They are equally bad, and should both be punished by death.

That seems to be the news from the Arab world.

A couple of weeks ago, Amnesty International reported on two sisters who face execution by stoning in Iran. Does Iran really stone people to death? For adultery? The answer to both questions seems to be yes. A news report from last July, “Iran confirms man stoned to death,” describes just such a case.

In the case of the two women, Amnesty International seems at least as concerned about some of the legal niceties of the case. For example, there’s the double-jeopardy fact that the women were already convicted and sentenced to floggings and prison.

The five were tried in March 2007 and sentenced to flogging for “having illicit relations”; Zohreh also received five years’ imprisonment for forming ‘a centre of corruption’. But after the floggings were carried out, fresh charges of “committing adultery while being married” were brought against Zohreh and Azar Kabiri-niat on 6 August 2007. Both were found guilty and were sentenced to death by stoning.

The witchcraft case comes from Saudi Arabia.

Saudis to Execute a Woman for Witchcraft


BEIRUT, Lebanon (AP) — A leading human rights group appealed to Saudi Arabia’s King Abdullah on Thursday to stop the execution of a woman accused of witchcraft and performing supernatural acts.

Besides sounding more appropriate to the middle ages than the 21st century, what the cases have in common is an appalling lack of due process.

A new lawyer representing the women told journalist Marjan Lagha’i that, “the case has fundamental problems, since a person can not be tried twice for the same crime. Yet these two sisters have been tried twice in the same case, and two sentences have been issued for them… the circumstances that are required to prove adultery – confession by the accused on four different occasions that can be corroborated by the testimony of four eyewitnesses to the alleged crime – are entirely absent, and there is absolutely no legal document in this case that a judge can use to issue a stoning sentence… Given that I view this sentence to be against the principles of Sharia, as well as the criminal laws [of Iran], I have filed an official objection, and I have asked that the Head of Judiciary review the case once again.”

To be sure, in a case of witchcraft, it’s hard to imagine what would actually count as due process.

“The fact that Saudi judges still conduct trials for unprovable crimes like ‘witchcraft’ underscores their inability to carry out objective criminal investigations,” said Joe Stork, Middle East director at Human Rights Watch.

Here in the U.S., we don’t stone people to death or kill them in any other way for adultery, but due process is still a bit of an issue for the Bush administration. And it’s come up in the context of yet another gruesome medieval practice — waterboarding, a favorite form of torture going back to the Spanish Inquisition.

Last week, in testimony before a House Judiciary subcommittee, Steven Bradbury, the head of the Justice Department’s Office of Legal Counsel, made the case that waterboarding is not torture, at least, it’s not under the laws in effect when the CIA conducted waterboarding. As Muckraker reports:

The CIA’s use of waterboarding was legal and not torture, a Justice Department official argued this morning, because it was a “procedure subject to strict limitations and safeguards” that made it substantially different from historical uses of the technique by the Japanese and the Spanish Inquisition.

The question, in other words, is whether the “safeguards” and “strict limitations” make the American version of waterboarding something other than torture.

According to Malcolm Nance, a former instructor at the Navy’s training program, they do not. Another Muckraker link quotes Nance as saying:

Waterboarding is a controlled drowning that, in the American model, occurs under the watch of a doctor, a psychologist, an interrogator and a trained strap-in/strap-out team. It does not simulate drowning, as the lungs are actually filling with water. There is no way to simulate that. The victim is drowning. How much the victim is to drown depends on the desired result (in the form of answers to questions shouted into the victim’s face) and the obstinacy of the subject. A team doctor watches the quantity of water that is ingested and for the physiological signs which show when the drowning effect goes from painful psychological experience, to horrific suffocating punishment to the final death spiral.

What’s needed to settle this dispute is some form of due process. In the U.S., that comes in the form of legislative oversight of the executive’s possibly overzealous questioning of prisoners.

The problem is, Bradbury won’t tell the House Judiciary Committee what the differences are between American 21st-century waterboarding and the Spanish Inquisition’s. Bradbury’s reason is that the information is classified, even though the committee members have the highest possible security clearances, and even though, as Bradbury acknowledged, Congress has a Constitutional duty of oversight. The Muckraker story has a link to this YouTube video. The relevant exchange between Bradbury and NY representative Jerry Nadler comes a bit after the 5 minute mark.

Of course, there’s no comparison between actually stoning someone to death, and merely convincing them you’re drowning them. But drawing out subtle distinctions of “time limits” and “medical oversight” sounds a lot like Scholastic angel-pinhead-dancing more appropriate to 12th or 16th century Spain than modern-day America.

Of the witchcraft case, Human Rights Watch says it

underscores shortcomings in Saudi Arabia’s Islamic legal system in which rules of evidence are shaky, lawyers are not always present and sentences often depend on the whim of judges.

The video of Bradbury ducking Congressman Nadler’s questions equally looks to involve shaky rules of evidence and the whims of administration lawyers. There’s no way to call such a government “modern,” whether it’s in Iran, Saudi Arabia, or the U.S. And calling ours a “Justice” Department seems more than a little Orwellian.

Posted in Orwell, politics, religion | Tagged: , , | 1 Comment »

The race is not always to the swift

Posted by metaphorical on 14 February 2008

[Huckabee] also overwhelmingly won Virginians who identified themselves as conservatives, pointing to continued resistance toward McCain among many of the GOP’s base voters.

The LA Times says that as if it’s a bad thing. But if you’re the Republicans, that’s exactly what you want – a candidate whose appeal stretches to independents, Democrats, and liberals, even at the cost of some conservatives. After all, these are the primaries. Where are those conservatives going to go in the general election – into the Obama camp? Are any of the evangelical Christians who voted for Huckabee yesterday instead of McCain going to vote for Obama (or Clinton) come the general election?

This is similar to the question of delegate counts vs popular counts that this blog has already visited. And it’s a problem for the candidates. McCain is running a smart campaign (finally!) with one aim – to win the general election. Everything else is a subsidiary goal to that. If he has to lose some primary votes, and squeak instead of sail into the nomination, in order to retain the broadest possible appeal after the convention, so be it.

But the press doesn’t allow that. It covers each week as if it were an NCAA Sweet 16 knockout tournament, instead of treating the primaries like a long baseball season. The two demand very different strategies. You can’t rest your best players nearly as often in a knockout. You can’t say it’s okay to split here in Chicago because the Boston series next week is more important. So too, as the media defines the game, you can’t temporarily sacrifice any of your party’s base, expecting, rightly, to get them back.

And the media has power. If they say the race is close, or – heaven forbid – you’re losing, it becomes true. So the candidates are forced to consider adopting a less effective strategy, just to pass the media test.

It’s great, frankly, that the press wields such influence. Even if I weren’t a journalist myself, I wouldn’t have it any other way. But with great power comes great responsibility. In this case, that means smart analyses that take into account the way the candidates define the race they’re running.

Posted in journalism, language, Orwell, politics | 15 Comments »

Meat the enemy

Posted by metaphorical on 3 February 2008

The United States produces “nearly 10 billion farm animals a year, more than 15 percent of the world’s total.”

“An estimated 30 percent of the earth’s ice-free land is directly or indirectly involved in livestock production”

“livestock production generates nearly a fifth of the world’s greenhouse gases — more than transportation”

Now imagine if global meat production doubles between now and 2050. That’s the scary scenario posed by an article in last Sunday’s NY Times, “Rethinking the Meat-Guzzler,” by Mark Bittman.

The title comes from the idea that “Grain, meat and even energy are roped together in a way that could have dire results. More meat means a corresponding increase in demand for feed, especially corn and soy, which some experts say will contribute to higher prices.”

Gidon Eshel, a geophysicist at the Bard Center, and Pamela A. Martin, an assistant professor of geophysics at the University of Chicago, calculated that if Americans were to reduce meat consumption by just 20 percent it would be as if we all switched from a standard sedan — a Camry, say — to the ultra-efficient Prius. Similarly, a study last year by the National Institute of Livestock and Grassland Science in Japan estimated that 2.2 pounds of beef is responsible for the equivalent amount of carbon dioxide emitted by the average European car every 155 miles, and burns enough energy to light a 100-watt bulb for nearly 20 days.

Environmental, political, health, and moral concerns were all among my reasons for giving up meat 18 years ago, and Bittman hits on every one of them:

Though some 800 million people on the planet now suffer from hunger or malnutrition, the majority of corn and soy grown in the world feeds cattle, pigs and chickens.

About two to five times more grain is required to produce the same amount of calories through livestock as through direct grain consumption

Agriculture in the United States — much of which now serves the demand for meat — contributes to nearly three-quarters of all water-quality problems in the nation’s rivers and streams

Administration of antibiotics [in farm animals] is routine, so much so that it can result in antibiotic-resistant bacteria that threaten the usefulness of medicines that treat people.

Those grain-fed animals, in turn, are contributing to health problems among the world’s wealthier citizens — heart disease, some types of cancer, diabetes

There’s some faint — and in my opinion false — hope at the article’s end (One academic is quoted as saying, “The good of people’s bodies and the good of the planet are more or less perfectly aligned”). More convincing are these depressing comments. One expert is quoted,

“I just don’t think we can count on market prices to reduce our meat consumption. There may be a temporary spike in food prices, but it will almost certainly be reversed and then some.”

Bittman says, if price spikes don’t change eating habits, perhaps the combination of deforestation, pollution, climate change, starvation, heart disease and animal cruelty will gradually encourage the simple daily act of eating more plants and fewer animals.

Perhaps, but it’s been going on for a long time with few people caring at all. John Robbins’s Diet For A New America, the foundational book to which Fast Food Nation, Supersize Me, and The Omnivore’s Dilemma all owe an acknowledged debt, was published 21 years ago. In that same time, meat consumption in the developing world has doubled. What will the planet look like when it doubles again?

Posted in animal-rights, food, Orwell, politics, pop culture, Times-watch | Leave a Comment »

How low can high schools go?

Posted by metaphorical on 2 February 2008

Have you seen “Dumbing Us Down, The American Tragedy”? It’s a YouTube video that seems to be just going around now, even though it goes back to at least November 2006, when it was posted. It seems to have hit Digg just a couple of days ago.

There are a lot of links to it, but not much information. It was made by Brandon Telg, Jarred McKinney, and Austin Woodall, three Gainesville (Fla.) high school students, at least at the time.

Their film takes a quick look at the declining state of education in the U.S. They call it a documentary, but at 13 minutes it’s more of an outline of one. Still, they have a very nice mix of anecdote and statistic, and the video is pretty well made, a few easily excused typos and other glitches aside.

Their impetus seems to be a conversation with a history teacher who noticed, by accident at first, that almost none of his students — two of 32, in fact — knew who Gerald Ford was. The teacher later learned that only two of his students knew the name Mahatma Gandhi.

That led Telg, et al., to wonder how extensive the ignorance of their fellow students was. So they asked around to see what people knew of Gandhi. The depressing but predictable answer was, not much. Many didn’t know the name at all, while others misplaced it, such as the kid who thought Gandhi was a Mongol conqueror.

So the three videographers drew up two lists, one of famous names from history — Thomas Edison, Calvin Coolidge, Dick Cheney — the other pop culture stars — Eminem, Paris Hilton, Jack Black, and so on.

It’s hard to fathom the depths of student ignorance on display in the video. Edison was variously thought to be a former president and located in the 18th century by one student who mumbled, “kite, electricity, light bulb” — as if the light bulb were invented in the 18th century. Even Cheney was not universally known, though Eminem was.

They decry the standardized testing mandated by No Child Left Behind. They invoke John Dewey, Horace Mann, and Cotton Mather, and conclude, in their words, “This generation is witnessing first-hand the disintegration of the original intent of the American public education system.”

As I said, there’s not much information about the video and it’s more cited than discussed in the blogverse. But 24-year-old Daniel H. had some comments I found interesting.

For example, all I knew about Calvin Coolidge was that he was a president… that was pretty much it. I did know about Gandhi and Edison, but only a couple sentences’ worth. Now, I consider myself an intelligent person, but that doesn’t really help much for what would be considered “book learning”. You see, with the current state of our education system, students are learning less and less. I think it started around the time I was coming up through elementary school, and I’m only 24 years old.

In elementary school, I was given a calculator from day 1 and was told to use it when multiplying and dividing. Did we learn our “times tables” ? Yeah.. but we only went over it for a few days before we had a calculator stuffed in our hand. Talking to people even just a few years older than me makes me realize what all I never had in school.

Previous generations had to memory their multiplication table backwards and forwards- I never really did. They learned the presidents in order with facts about each- we barely went over the list once, and certainly didn’t have to memorize it. We never had to learn where all 50 states are in the US and the capital of each (I had friends in high school who thought Alaska was down by Mexico because of concatenated maps)… there are plenty more instances like that.

Daniel certainly overestimates the older generation. I went to some pretty good schools growing up, including the top high school in New York and maybe the country. While we were required to know our times tables up to 12, we never memorized state capitals or presidents. I have a self-selected group of very smart friends online, but of the people I know day-to-day in ordinary life, I might be the only one who can locate all 50 states geographically, another thing I wasn’t required to know growing up.

Anyway, it’s hard for me not to connect the video up with another youth-culture documentarian who this week floated through one of the mailing lists I’m on: Virgil, creator of Booksthatmakeyoudumb.

Basically, this guy looked at college students’ favorite-book lists on Facebook, then correlated them with the average SATs of the schools’ student bodies, to rank the most popular books in terms of the scores. Hence, Books That Make You Dumb. (“Yes, I’m aware correlation ≠ causation. The results are hilarity incarnate regardless of causality. You can stop sending me email about this distinction. Thanks.”)

Here’s the methodology in a little more detail.

Ever read a book (required or otherwise) and upon finishing it thought to yourself, “Wow. That was terrible. I totally feel dumber after reading that.”? I know I have. Well, like any good scientist, I decided to see how well my personal experience matches reality. How might one do this?

Well, here’s one idea.

1. Get a friend of yours to download, using Facebook, the ten most frequent “favorite books” at every college (manually — as not to violate Facebook’s ToS). These ten books are indicative of the overall intellectual milieu of that college.

2. Download the average SAT/ACT score for students attending every college.

3. Presto! We have a correlation between books and dumbitude (smartitude too)!

    Books ~ Colleges ~ Average SAT Scores

4. Plot the average SAT of each book, discarding books with too few samples to have a reliable average.

5. Post the results on your website, pondering what the Internet will think of it.
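The aggregation at the heart of those steps can be sketched in a few lines. This is a minimal illustration using made-up colleges, scores, and book lists (the real project scraped Facebook favorite-book lists and published SAT/ACT averages, neither of which is reproduced here):

```python
from collections import defaultdict

# Hypothetical inputs: each college's average SAT, and its most
# frequent "favorite books" as reported on Facebook.
college_sat = {"College A": 1400, "College B": 1100, "College C": 1250}
college_books = {
    "College A": ["Lolita", "1984"],
    "College B": ["1984", "Atlas Shrugged"],
    "College C": ["Lolita", "Atlas Shrugged", "1984"],
}

def book_scores(college_sat, college_books, min_samples=2):
    """For each book, average the SAT scores of every college that
    lists it, discarding books with too few samples to be reliable."""
    samples = defaultdict(list)
    for college, books in college_books.items():
        for book in books:
            samples[book].append(college_sat[college])
    return {
        book: sum(scores) / len(scores)
        for book, scores in samples.items()
        if len(scores) >= min_samples
    }

scores = book_scores(college_sat, college_books)
# Rank books from "smart" to "dumb" by associated average SAT.
for book, avg in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{book}: {avg:.0f}")
```

A book’s score here is nothing more than the mean SAT of the student bodies that favor it, which is exactly why the correlation-isn’t-causation caveat applies: the book inherits the score of its readers, not the other way around.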

He takes Facebook’s book names at face value (“The Bible” and “The Holy Bible” are counted separately, for example), and he categorizes the books simply by taking whatever label shows up most for them on LibraryThing. The methodology itself is probably specious, but the resulting list is fascinating. While it’s mostly in accord with expectations, it’s good to see in living color.

It’s pleasing to see how popular 1984, To Kill A Mockingbird, Pride and Prejudice, and The Great Gatsby are, and that East of Eden, Lolita, Running With Scissors, and 100 Years of Solitude show up at all — heck, I’m even glad that Atlas Shrugged and Anthem show up; as bad as Rand is, you can’t read them mindlessly. (It’s disappointing that The Republic isn’t at least as popular, though.)

As crazy as this project is, it’s worth a look. And given that the Facebook crowd is probably just a few years older than that of Gainesville High School, maybe there’s some cause for hope. At least a few college students have favorite books, and some pretty damned good ones at that.

Posted in education, language, pop culture, the arts, writing | 3 Comments »