Serving the will of Tommy Westphall since 2004.

escoles's blog

New Terror Threat: Unitarian Jihadis

Can you actually wage jihad for tolerance? Jon Carroll @ SFGate "reprints" the manifesto of the Unitarian Jihad [fwd courtesy Amy]:

We are Unitarian Jihad, and our motto is: "Sincerity is not enough." We have heard from enough sincere people to last a lifetime already. Just because you believe it's true doesn't make it true. Just because your motives are pure doesn't mean you are not doing harm. Get a dog, or comfort someone in a nursing home, or just feed the birds in the park. Play basketball. Lighten up. The world is not out to get you, except in the sense that the world is out to get everyone.

Brother Gatling Gun of Patience notes that he's pretty sure the world is out to get him because everyone laughs when he says he is a Unitarian. There were murmurs of assent around the room, and someone suggested that we buy some Congress members and really stick it to the Baptists. But this was deemed against Revolutionary Principles, and Brother Gatling Gun of Patience was remanded to the Sunday Flowers and Banners committee.

It would be funnier if powerful and highly educated men didn't believe that there's some kind of "anti-Christian conspiracy", or think that judges who have the integrity to make objective judgements are just asking to be shot. Those folks should try being a non-Christian for a while, and see what that feels like.

What Is Neo-Calvinism?

In a nutshell: Neo-Calvinism is Smith's and Weber's Iron Cage. Except that instead of holding the Neo-Calvinists, it restrains the lesser beings that would trouble them -- namely, the poor.

More prosaically, "Neo-Calvinism" is the idea that the rich are more morally worthy than the poor. Their wealth does not confer virtue -- rather, it signifies it. It is the most potent and dangerous of several modern Capitalist sects, because it unifies moral righteousness with an ideology of power.

Calvin, along with notable Reformation successors like Martin Luther, believed that a person's salvation was predestined: God (being infallible, omnipresent, omniscient, etc.) had determined in His own time that you or I should be saved or damned. Nothing that we do in our lives can affect that; the decision is already made. In pure Calvinism, this doesn't let you off the hook for moral behavior, because moral behavior is said to be an indicator of your fate.

So good people are good not because they do good things, but because God said they were. "Goodness" is merely an indicator.

As is prosperity. Wesley once famously lamented that as Methodists lived good lives (rising early, working hard, practicing thrift and sobriety), they tended to prosper -- which had the unfortunate side effect of causing them to focus on that worldly prosperity.

Neo-Calvinism essentially forgets about God, and makes commerce itself the religion. Prosperity is still a signifier of moral worth -- but instead of being a secondary signifier, it's primary. It indicates stronger character, superior "fitness." It's a close kin to Greedism, but it's more powerful because it marshals concepts like virtue and fairness to its service. It's related to Objectivism -- and I daresay most Objectivists are Neo-Calvinists -- but it permits a spiritual dimension that can be lacking amongst Randians.

Neo-Calvinists are everywhere, all around us. Wherever you find someone who cries "It's not fair!" when they notice that the wealthy pay proportionally greater taxes than the poor, you have found a Neo-Calvinist or one of his fellow-travellers.

Most of the people, most of the time is good enough

Or bad enough, depending on your point of view. And it's most fun if you can fool yourself while you're at it. The tutor points out this morning that most Americans are pretty profoundly confused about what's good for them:

We live in a democracy where most of those on the verge of bankruptcy are more concerned to repeal the Death Tax on estates above $2 mil, than they are with preserving their own home when their credit card debt catches up with them. This is a testimony to the relative power of marketing versus education. Who can blame Congress for making an honest buck off the passing of bills? Meanwhile, the media look more and more like the WB Studio Productions, what in the trade actors call "Industrials."

My point of view is from the bottom. Or down below, at least, if not on the rocks. I made a bunch of money last year; but I've made hardly any this year, and that's much more typical. I'll freely admit that if I got badly sick, I'd be pretty screwed.

The really fun and interesting thing about all of this opprobrium about deadbeat consumers who are ruining America is that it's the culture of over-consumption that these people exemplify that keeps America going. Responsible consumption would destroy the American way of life faster and more certainly than any market crash. So the forces of Right are really fooling themselves, too, if they actually think that this is at all about helping the economy. Personal bankruptcy is the expansion grid on the American economy.

So clearly, that's not what it's really about. It's about a long-range re-solidification of the American economic class structure. The class structure broke apart in the 20th century, and (excepting the 1920s) especially since 1950. It became possible for working class families to reliably place their children into the middle and upper-middle classes; now, those at the upper end of that spectrum would like to solidify their hold on the higher strata of neo-calvinist blessedness by setting skid-traps for the underclass: Below a certain threshold, any wrong step can take you all the way back down. And once you're down, those new bankruptcy laws will make damn sure you don't get out.

But this is America. And in America, anything is possible. The longer the odds, the bigger and sweeter seems the dream.

Table Talk

Diana Abu Jaber said something on ATC this evening [RealAudio] that struck a chord with me:

"I think that we always carry around these crystallized nuggets of memory, these moments in our lives, and for me the food memories were so imbued with meaning and emotion, that I was able to kind of move from one to the next, and as I wrote the story they would take on the details and the textures of my own memories, and weave themselves into stories."

Looking back, I can't say we had all that much good food when I was a kid, but I remember the food, nonetheless. And we did have food that I loved. Porcupine meatballs, Salisbury steak, stew with dumplings, Frito™ pie ... and stuff that I disliked, like the wax beans and green beans with new potatoes, or beet-tops, swimming in their own cooking water. All stuff that you might find in some white-bread cookbook. I'd never cook any of those things, now, but at the time, it was what I knew.

We always ate dinner together, even as the families of many of my friends seldom did. It required significant special dispensation to be separated from the family at dinner. Dad sat at one end, Mom at the other, my brothers on one side and my sister and I on the other. In the summer, there was produce from our gardens: those green and wax beans and beet tops, broccoli (nope), chard or beet greens (yup), and later in the summer, sliced tomatoes and cucumbers (yes!). Before my elder siblings started drifting off to college, the plate of tomatoes and cucumbers would always come to me last, to ensure that other people would get a chance to have some.

But it was Saturday breakfast that I remember fondly. Breakfast at home with the family gathered was a much less formal affair. Some Friday nights, especially when Glen or Steve were home to visit, my sister Cheri's family would sleep over. On the Saturday mornings after, Dad would usually make pancakes, cinnamon pinwheels, waffles, or a big skillet full of huevos rancheros -- that is, scrambled eggs with bell peppers and ham, with the dry pan scrapings set aside for Cheri. (This was the pre-salsa era, after all.)

Pancakes would come off the griddle in waves. There were never enough for everyone at once, so we ate in shifts, loitering around the room afterward to stay in the conversation. Dad's pancakes were seldom predictable, but (almost) always good: They might include anything he felt a need to use up (like over-ripe bananas or wilted apples), or anything that struck his imagination (pecans, frozen blueberries, All Bran™). Once he added a half cup of Masa Harina; the result was only slightly less dense than a tortilla. Another time, he pulled from the cabinet a near-empty bag of oat flour that had a mis-printed label. A spot of ink on the "O" in "oat" made it look like "Bat Flour"; that gag grew gray hair and used a cane before we let it rest.

It was over breakfast that we could discuss politics or religion, even going as far afield as homosexuality on a couple of occasions. The conversation was most free when my brother Glen was there, and when I was young enough to let the need to define myself trump avoiding conflict. Now, I'm intellectually alone when I visit my parents', or my sister's. Even when Glen's there, it's not the same; we both know that it's not worth the trouble to express ourselves too freely. We've settled in our ways, now; the old jousting isn't fun anymore, the stakes on the other side of the table seem higher. Or maybe it's just a measure of how far I've grown from them.

I remember Diana Abu Jaber, though I never knew her well; she taught a creative writing section I took at Binghamton in the spring of 1984, during my first higher-academic lifetime. I'd been acquainted with her younger sisters from my dorm: One smart and haunted, the other sweet and reserved. But what I knew of the three of them still doesn't align with the things I've heard or read Diana say about her childhood. There's just no connect. What I know now, on "background", is less than I knew then, with no background. Then, I knew people; now, I know stories.

And that's what I should expect. I've just told a story about my own youth; but it really conceals as much as it reveals, truth or not. Diana said in her interview, "I have a real novelist's perspective, I think, and that means that you never, never sacrifice a great story just to tell the truth. You have to let the richness and the beauty of the story manifest itself first." But being a novelist only teaches her how to do it well. We all edit our history, whether for our own consumption or that of others. Some of us know the difference. If it matters.


Delocating The Village Green

During one particular, unhappy period of my life, I used to cross the street from the Y to the Village Green after my morning workout, and get a large coffee (and some sesame noodles, if I was feeling flush), and sit at the counter while I scribbled in my notebook.

The first refill was free; some days I'd go through four large cups. I'd mostly just write, alternating with long stretches of staring out the window. Sometimes I'd take a break to make a to-do list (top item of which was usually something along the lines of "GET JOB"). "The Green" was one of those large-ish, eclectic bookstores that you often used to find in urban to marginally-urban settings, featuring a huge magazine rack, an unusual selection of books, and a section filled with interesting food and candy.

And coffee. They always sold coffee, and as early as when I started visiting Rochester in the winter of '90/'91, it was good coffee -- not that crap that chain coffee shops dark-roast or pump full of artificial flavor to conceal its poor grade. Later, as they expanded in an attempt to compete with the suburban mega-bookstores, they added tables and chairs to go along with a new selection of pastries, cake, and vegetarian deli goods. They expanded their big suburban store in Pittsford; they built out their "flagship" store (really the much smaller of the two) to add a new CD store, trying to target the custom order market.

They went out of business not long after that, like a player at Risk who gambles on too rapid an expansion. It was a slow death-spiral, first rumored around the neighborhood, then heralded by the closure of the Pittsford store. As I saw it at the time, it was purely a matter of bad cost-containment: The wastage in their coffee shop operation was terrific. I counted one time, while I sat there, and noted that on any given weekday, they'd keep a dozen or more cakes, pies, tortes and cheesecakes in the display case. At the end of the day, they might have completely consumed four or five of them. Still, they stubbornly insisted on keeping their food inventory until almost the end.

When the Green finally went under, they walked through and put price tags on everything: The books, the bookshelves, breadracks, refrigerated cases, anything that wasn't nailed down. Then one day, it closed, and was replaced a few weeks later by a remaindered-book wholesaler. He stayed for a month or two (probably sitting out the end of their lease), and then the space was closed. Half of the ground floor would be refurbished into a Pizza Hut; the old record area, upstairs, became a YMCA youth center; and the main building became a Hollywood Video.

It didn't take long for a succession of new coffee shops to open up, in a pair of buildings across the street and down a half-block. Neither lasted: The first was badly-managed and ahead of its time (an Internet cafe in 1997), and the second got knocked out cold when a Starbucks opened right across the street. Right between the sushi place and the trendy boutique, and across the alley from a cozy, carefully-hidden used bookstore called the Brown Bag, in a residential home that once housed a trendy wood-fired pizza place. (The Brown Bag used to be called the Oxcart, until its owners got out to write children's books full time. That was something more than 15 years ago. It changed so little that lots of folks still call it the Oxcart.)

Starbucks is much busier than the Green ever was. In my gut, I don't know why; the Green was cheaper, their coffee better, their desserts were from the best dessert bakers in town. (Cheesy Eddie's carrot cake is pretty hard to beat.) Intellectually, of course, I know that people don't go to Starbucks with any conventional notion of value in mind. They go for an upscale version of that same ritualized sameness that Ray Kroc grokked: The beverage names are an incantation, a call-and-response to the baristas; the packaged and routinized baked goods are offerings to some god of status-through-commerce. I feel unclean whenever I go into a Starbucks, because I know that I'm in the temple of a faith to which I am apostate.

I've been to lots of coffee shops since then, and even spent a fair amount of time in one or two or three. But it's not the same. They're more expensive, and that's a big part of it. It's not that I'm cheap; it's that the cost starts to feel like an offering to those same gods of style, of status-through-commerce. It's a different sect, but it still feels like the same creed. Still, the coffee is good, the food is good, and the old Hallman Chevy building is fairly charming.

All of this is brought to mind this morning by an entry on the Daypop top 40: Delocator can help you find an independently-owned coffee shop near any US zip code. I don't see any near my zip code that I didn't already know about; it would be nice if they could take proximity arguments, which would let me see several more. But this is a pretty unusual area; we had "indie" coffee shops here before they were cool, and some of the best of them weren't proper coffee shops at all. Like the Green.

Toothsome Ironism

Folks sure do some funny things to make other folks think they're hip.

"Toothing" seems to have been a hoax. At least, that's what everybody's stumbling over their shoelaces to declaim. ("Dogging", though -- which differs from toothing in kind only insofar as it doesn't have a "fake" name -- is apparently real. Unless Ste Curran and Simon Byron are going to claim credit for that, too. "Yes, you see, all hedonism is a great hoax. Nobody actually has anonymous or exhibitionist sex. We made that up and you're all rubbish for thinking otherwise.")

What interests me is not the feeding-frenzy around the original "hoax" so much as the feeding-frenzy around its exposure, as people and institutions race to minimize the damage to their egos.

I use scare-quotes on "hoax" because, while I don't doubt the story about how the term came to be, I also don't doubt that people do it. Because, of course, the fact that somebody made it up has more or less nothing to do with whether people actually did it -- after, or even before. But it's officially a hoax, now; ergo, anyone who "believes" it ever happened is a fool. (And, apparently, anyone who dares notice that the ironists behind it are prancing about naked is twice the fool. C'est la vie.)

There are some interesting things that often (if not usually) happen during the course of a big hoax:

  • The hoaxers have a clever idea that they think is sufficiently outrageous that people ought not believe it. Anyone who did, would be a fool, and therefore would merit ridicule.
  • In a successful hoax, the pranksters then expend no small effort actively duping at least a few people into going along. Documentation of this will later be used to beef up their credentials as Clever Blokes©.
  • People start actually doing the hoaxed thing -- or treat the hoax as sufficiently real that they start re-enacting it. (Making it, like, not fake, eh?)
  • The hoaxer claims credit, usually implying that all the buzz after the fact was entirely his doing, and therefore totally fake. (This is done, of course, to make the hoaxer feel important.)
  • The media outlets and members of the public who got taken in on the original hoax stumble all over themselves in the rush to discredit any reports of the hoax-activity. (This is done, of course, to alleviate the sense of foolishness that comes of being "taken in.")

I personally never "bit" on toothing in the sense of blogging about it or expressing moral outrage, etc. That's not because I immediately recognized how improbable it was -- quite the opposite. It was because I personally never found it that implausible. Aside from the fact that bluetooth messaging has had well-documented and unexpected use in ad hoc social networking [pdf], I've seen enough amateur hedonists casting about aimlessly in the "culture of death" that toothing doesn't seem that improbable to me -- certainly not in the realm of "throwing a brick at the dancefloor with a love letter attached, and hoping that the person it hits will agree to sleep with you."

I always reckoned the success rates for toothing to be in territory that a party-bar wingman (among the most troublesome of amateur hedonists) could wrap his sodden cognition around: "One in thirty, those are pretty good odds, bra!" Toothing would have poorer odds than one in thirty, to be sure; but the effort involved is less, too. Technology decreases the marginal cost. And as toothing became more "popular" (i.e., the "hoax" spread more widely), a greater proportion of amateur hedonists would leave their bluetooth open, and there'd be a substantial likelihood that toothing would actually work.

And in certain settings, it would most likely work really well. Think bathhouses....

So for me, what's really interesting is that in buying into the "toothing is a hoax" meme -- in accepting that the idea of anonymous sex mediated by text messaging was only ever always merely a hoax cooked up by a couple of bored wankers -- we miss the opportunity to learn whether the activities described as "toothing" ever actually happened. That would be kind of fascinating: Where? How frequently? What were its etiquettes? What did it do to the spread of STDs?

I expect that we would find it's done in dark places with loud music and lots of intoxicants, by people who don't then go home and blog about it. Making it part of that world outside that hip young blogospherians like Byron and Curran often seem ill-acquainted with.

So Much for Captcha; Anonymous Posting Disabled, Again

Captcha appears to be useless, at least for Drupal. The site has just been DOS'd for three hours (again, bringing down email as well as the website and site control panel). So apparently when you have 1300 (that's one thousand three hundred) "users" trying to post comments in a short timespan, it doesn't matter that they're not getting to actually submit the comments.

(I thought at first it was just another DNS failure -- I've had about six DNS failures since I moved to my current hosting provider. But then Peggy told me she'd seen 1300 "visitors" listed in the "Who's Online" block before the site went down.)

It also seems to be the case that captcha insertion doesn't prevent SQL writes in Drupal. That makes it pretty much useless for foiling DOS-attacks (in Drupal, at least).
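The complaint here is really about ordering: if the captcha is only checked after the submission has already cost a SQL write, every bot attempt still hits the database. The cheap check has to come first. Here's a minimal sketch of that idea in generic Python -- not actual Drupal code (Drupal is PHP), and the function and parameter names are illustrative, not any real API:

```python
def handle_comment_submission(form, db, captcha_store):
    """Reject a bad submission before it costs a database write.

    `form` holds the submitted fields, `db` is a list standing in for
    the comments table, and `captcha_store` maps challenge tokens to
    expected answers. All names are illustrative.
    """
    token = form.get("captcha_token")
    answer = form.get("captcha_answer")

    # Cheap check first: a failed captcha never reaches the database.
    # pop() also makes each challenge single-use.
    expected = captcha_store.pop(token, None)
    if expected is None or answer != expected:
        return "rejected"

    # Only a valid submission pays for a write.
    db.append({"name": form.get("name"), "body": form.get("body")})
    return "accepted"
```

Even with the check ordered this way, the web server still has to parse each request, which is why a flood of submissions can take a site down regardless.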

Finally, at least four anonymous comment spams got through last night. So either the spambots have captcha-defeating code (odd that they'd have built that into a Drupal-attacking spambot...), or Captcha breaks stuff, as Peggy and Lynne have been reporting to me.

I have a few other things I can try, but they require some development. I might be able to implement it as a module, but it would be easier as a hack. Simpler, too. More another time...

Addendum: The site has been down twice more today, both times preceded by thousands of attempts to comment-spam. It seems that just hitting and submitting the "reply" page a few thousand times in a short time frame is sufficient to crash my site. I've turned caching back on, and there's a remote chance that might help, but if it's the reply validation that's killing the server then it won't help much. I'll just have to wait for the MoFos to get tired of attacking me.
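Validation aside, the cheapest defense against this kind of flood is to throttle each client before doing any real work at all. A minimal sliding-window rate limiter, sketched in Python as an illustration of the idea (not a Drupal module; in practice this would live at the web-server or firewall layer):

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Per-client sliding-window throttle: allow at most `limit`
    requests from one client in any `window`-second span, and drop
    the rest before any validation or database work happens."""

    def __init__(self, limit=5, window=60.0):
        self.limit = limit
        self.window = window
        self.hits = defaultdict(deque)  # client -> timestamps of recent hits

    def allow(self, client, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[client]
        # Discard hits that have aged out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.limit:
            return False
        q.append(now)
        return True
```

A flood of "reply" submissions from one address would then cost the server one cheap bookkeeping check per request instead of a full page render and validation pass.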

Captcha That

I've finally gotten around to implementing the Drupal Captcha module. Since I've already patched my comments module to include spam filtering (which is still in place, BTW), it was a slight nuisance to integrate the required comment.module patch. But it was a very slight nuisance, and it's done, now.

Hopefully, putting a captcha between the "add comment" click and posting a live comment will stop spambots from being able to post anonymous comments in the first place.
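For illustration, here's roughly the shape of that challenge-and-verify cycle, sketched in Python rather than Drupal's PHP. The arithmetic challenge and the HMAC token are my own simplification for the sketch -- not how the Drupal Captcha module actually works:

```python
import hashlib
import hmac
import os
import random

# Per-deployment secret (illustrative); in real use this would persist.
SECRET = os.urandom(16)

def make_challenge():
    """Create a simple arithmetic challenge plus a token binding it.

    The token is an HMAC of the expected answer, so the server can
    verify a submission without storing per-challenge state."""
    a, b = random.randint(1, 9), random.randint(1, 9)
    question = "What is %d + %d?" % (a, b)
    answer = str(a + b)
    token = hmac.new(SECRET, answer.encode(), hashlib.sha256).hexdigest()
    return question, token

def verify(token, answer):
    """Check a submitted answer against the token issued with the form."""
    expected = hmac.new(SECRET, str(answer).encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(token, expected)
```

The question and token go out with the "add comment" form; the comment is only accepted if `verify()` passes on the way back in. (A stateless token like this is replayable, which is one reason real captcha implementations track issued challenges.)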

Karol Joseph Wojtyla

I believe the dead should be remembered as they were, not as we would have had them.

An atheist, raised Methodist, I've nevertheless always been somewhat in awe of the office of the Papacy. But I've often wondered whether that was due to some familial kinship -- some hearkening back to the origins of Christian churches -- or due to the man who has held the office for more than half of my lifetime.

The Papacy is one of the single most important political offices in the world. Make no mistake: It is a political office, and I would argue even more so than a religious office. So it's entirely appropriate, purely on those grounds, to attend to the death of Karol Joseph Wojtyla. And even if his office didn't bear the weight that it does, he might very well. His presence was forceful; one eulogist after another has spoken of his quiet but absolute confidence, and they seem again and again to be describing something like intellectual courage. Something like honor.

John Paul II has an enormously problematic legacy. His interpretation of dogma is arguably responsible for exacerbating the spread of AIDS, for the birth of tens of millions of unwanted children -- and, implicitly and indirectly, for tens of millions of abortions -- for tacitly endorsing sexual intolerance and squelching processes that could have led to greater and more rapid spread of freedom throughout Latin America. He appears to have been actively complicit in helping American Catholic dioceses to avoid the consequences of sexual abuse on the part of their priests.

And at the same time, he is arguably responsible for the collapse of the Polish communist state -- for toppling-back the first domino. His moral example in forgiving his own would-be assassin made a huge impact on me. And he found ways to reach across sectarian boundaries and to admit some of the failings of the church.

Nevertheless, his legacy is immensely ambivalent. Only time will tell whether the world can approach it objectively.

The Imaginary Terri Schiavo

Imaginary people make much better martyrs.

Case in point: Terri Schiavo. The appeals are finally exhausted; Terri Schiavo is dead, unequivocally, unappealably. And we've just begun to see the consequences. Quite aside from the impending wrongful-death suit (which will be brought regardless of the results from the forthcoming autopsy, to be performed by a Jeb Bush appointee), the fight has catalyzed a constituency. It's given bullshit artists like Tom Delay (that old exterminator) a soapbox to stand on. Note, as we go forward, the endless repetition of their Big Lies: That the "American People" are behind the reckless Conservative-Republican adventurism; that the case shows improper involvement by the courts, instead of the courts doing their jobs by (perish the thought!) making judgements.

What was this case about? It certainly wasn't about whether one person would have preferred to have her body die; it passed beyond that threshold years ago. It passed beyond that when Bob and Mary Schindler concocted a "person" they called "Terri Schiavo", and identified her with their daughter, and pasted her face over their daughter's face whenever they saw her limbic-brained body in that bed. The "Terri Schiavo" that Bob and Mary struggled so hard to defend was not their daughter, but their dream of their daughter, or at least the best dream they could muster under the circumstances.

And she was a perfect daughter, in many ways: She didn't talk back, never contradicted their version of her life's narrative, never corrected their inventions about what she might be thinking at that moment. Or have thought when she was eight, for that matter.

It certainly wasn't about what the real Terri Schiavo's wishes might have been. What they are, I can't know, and I daresay Michael Schiavo can't know for sure. But judges have been evaluating the matter for seven years and not found a reason to suspect that she wanted her body to remain alive long after she'd lost the capacity to engage in detectable interactions with other people.

True, Michael can't have known for sure; but her parents -- surely they must have known?

Why? Why would we suppose that? My own parents wouldn't have the faintest idea what I'd want in such a situation. For practical purposes, they know nothing of real substance about me that they didn't know before I was eight. I could name four or five close friends, a handful of ex-lovers and seven or eight not-so-close friends who'd have a better idea.

So, no, it's got nothing to do with Terri's wishes. But it's got a great deal to do with how her parents imagine her wishes -- with the wishes of their fictional Terri, as it were.

And Jeb and George Bush's and Randall Terry's and Tom Delay's fictional Terri. Which is the real obscenity, here, of course. If it were just Bob and Mary, it would be a tragedy. And anyway, their version of Terri is at least based on something real. But with Jeb & George & Randall & Tom in the game, any hope of the real Terri S. being remembered is completely gone. She's doomed to be immortalized as an abstracted martyr for the cause of eliminating secular justice.

GDMFSOBs (Or: Why Anonymous Comments Have Been Turned Off, For Now)

My friend Lynne pointed something out to me a few days ago: Looking through the "recent posts" lists, she was able to deduce that this site had been getting comment spammed at a rate of about once every five seconds for a period of around an hour. All were flagged as spam; none of them made it through to human eyes. I was mildly impressed, for two reasons: First, that my site had handled the load; second, that someone had written a bot to attack that aggressively.

The day before yesterday, I stopped being impressed, and became furious.

I had pointed out to Lynne that the only real restrictions on how fast a comment-spamming 'bot could attack were the capacity of the web application and the speed of HTTP. Since HTTP is stateless, there can be any number of concurrent attacks in play; ergo, the server will most likely collapse first. That's what happened the day before yesterday, as the rate of attack shot up to more than once per second. The site buckled in about 20 minutes; it failed just as I was looking for the setting to turn off anonymous comments.

That's the roundabout way of explaining why anonymous comments are now switched off. The attack bots simply have no way to see the "post comments" link, for the moment -- they can't log in, they can't comment.

I hate this solution. I'll be adding a captcha module and patching the comments module to use it, Real Soon Now ("in my copious spare time", as we used to say at Ziff-Davis Education). I have ideas about how to make the comment button harder for bots to see, but they require patching Drupal core, which I'm loath to do.
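One widely used trick in that spirit -- a guess at the kind of idea meant here, not necessarily the author's actual plan -- is a honeypot: a form field that humans never see (it's hidden with CSS) but that naive bots, which fill in every input, give themselves away on. Sketched in Python with illustrative names:

```python
def render_comment_form():
    """Return a comment form containing a decoy field that CSS hides
    from human visitors. Field names are illustrative."""
    return """
    <form method="post" action="/comment">
      <textarea name="body"></textarea>
      <!-- hidden from humans by the style; bots tend to fill it anyway -->
      <input name="website" style="display:none" tabindex="-1"
             autocomplete="off">
      <input type="submit" value="Post comment">
    </form>
    """

def is_probably_bot(form):
    """A human never sees the decoy field, so any value in it
    marks the submission as bot traffic."""
    return bool(form.get("website"))
```

The appeal over a captcha is that it costs legitimate commenters nothing; the drawback is that it only fools unsophisticated bots, and (like a captcha) it does nothing about the raw request flood.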

As Peggy points out, I guess I'll need to monitor enrollments more carefully. And as Lynne could tell you with regard to other sites, I'm not very good about that, anywhere....

Ruth and Ron Conklin

Many, many moons ago, I came out as an atheist, and my religious orientation hasn't truly wavered since. It would have hurt my parents less if I'd told them I was gay, I think. I thought about that this morning, as I read a message from my father. It seems that my childhood pastor and his wife are soon to celebrate their 60th anniversary. I'm thinking of sending them a note.

Ron Conklin played an important role in my coming out. As I think back on it now, the sequence of events is hazy in my mind. But this much I can definitely say: It came to a head over confirmation. In the United Methodist Church, children are confirmed at about the age of 13. It's their last year in "Sunday school"; after that, they're supposed to join the adults in the sanctuary. We had a deal in our family that we could make up our own minds about attending services once we'd been through confirmation class.

By that time, I'd been dubious about Christianity for at least three years, and had counted myself firmly as an atheist for at least two. I felt unclean every time I sat in the sanctuary and mouthed hymns or prayers, and had even begun to consider leaving the Boy Scouts because I felt dishonest reciting the Oath and Law.

So when June rolled around, and Confirmation time drew nigh, I elected not to be confirmed in the United Methodist Church. I told my parents why; they didn't like it. They implied I would be forced to confirm. I made it clear that I would not be.

As a compromise, I agreed to talk to Reverend Conklin. We met after school one day, at the church. As I recall, we didn't talk for long; I told him what I believed, told him that I felt it would serve no purpose to argue about it, and ultimately, though I'm sure it didn't make him happy to do so, he agreed that it would be dishonest for me to confirm if I did not believe in God.

I promised him in return that I'd come see him if I wanted to discuss my "doubts." He didn't demand that; he asked for it. I knew him well enough to know it wasn't a deal, but an offer. I've always been grateful to Ron for that, but then, that was the nature of his belief: Faith had to be freely chosen, or it was without meaning.

That wasn't the end of friction -- my parents tried to go back on our traditional deal regarding confirmation and church-attendance, and my mother still occasionally begs me to "reconsider" and tries to guilt me into Easter services whenever the logistics align -- but my meeting with Ron Conklin more or less forestalled a war. I've always been grateful to him for that, too.

The Conklins have been close to my family in the years since. Ron married all three of my siblings (even my agnostic brother Glen asked him to perform the service), twice travelling a day's drive to do so. And my parents have often stayed at their cabin in the Adirondacks.

There's another story that comes to my mind more often with regard to Ron Conklin, though. At the reception for my brother Glen's wedding, I stood to offer the traditional "best man's toast". I hadn't given it much thought, intentionally: I felt it would be more appropriate if more sincere, and more sincere if done off the cuff. So as I started to talk, I didn't know quite where I was going -- only that Glen and Sheila would want me to avoid cliches and embarrassing stories. I allowed myself to start out on the usual "I was there when..." journey, let myself wind into the story for thirty or forty seconds, and then stopped, suddenly, put on an urgent expression, and cried out: "Sheila, it's not too late! The window's open! He's a monster, run away now!"

Ron burst out laughing (along with everyone in the room except for the mothers of the couple), and called out, "It is too late, I've got your signature as a witness!" He later congratulated me: "You did the two most important things a best man is supposed to do, you made the couple feel good and you kept it short."

Dad tells me that the Conklins are living in the same area where I first knew them, where our church was -- his last church as a full-time pastor. I'm glad they're well. I'll never share their faith, but I'm glad that at least some people of faith have had people like Ron and Ruth to look out for them.

Santeria in Patriotic Drag

The Iraq war is like Santeria in patriotic drag. Soldiers are sacred. It's that blood thing, again, I think: Once blood has been spilled in a cause, the cause is somehow sanctified. The Straussian bullshit artists in the Vulcan cabal are happy every time they can say that the evil enemy has spilled blood, happier still if they can say it's American blood.

Supporting the war on the argument that we've "sacrificed" and it would be disrespectful not to honor that sacrifice isn't solidarity; it's bloodthirsty primitive ritualism. It's voodoo. It reminds me of the late-Aztec "hummingbird-god", Huitzilopochtli, who fed on the blood and hearts of brave warriors, in large quantities, in return for permitting the sun to keep on shining.

Similarly, the Bushite-Vulcan cabal and their fellow-travellers demand the blood of American youth. Modern historians tell us that the Aztec sacrifice-empire was on the verge of collapse, as evidenced by the fact that the outsider, Cortez, was able to quickly unite so many diverse factions around his tiny band of Spaniards. Ed Calnek, for one, has argued that the sacrifice regime served the purpose of propping up the Aztec state.

If I believed that history repeats, I'd be expecting another Cortez -- or wondering if his modern name isn't "Bin Laden."

And What Does "Blarney" Mean, Again?

Geoff Pullum has some advice:
Whenever you hear someone starting to say something that begins with "The X have no word for Y", or "The X have N different words for Y", never listen to them, and always check your wallet to make sure it's still there.

We're told often enough that the Eskimo have "x [where x varies from 30 to 1,000,000] words for snow". It's nonsense, of course -- they actually have rather fewer than we do. I can thank an old anthro prof for clueing me in on that myth, but it was Anthony C. Woodbury who, back in 1991, compiled a list of English and Yupik lexemes for "snow" for an old-school mailing list. But it just won't go away. It's another aspect of that idea that some peoples really do think differently.

Pullum was moved to comment by a blarney-rich (which is to say, charming and engaging) interview with Irish novelist Frank Delaney on WeSat. The Irish are "devious", Delaney says, because they've made do without words for ideas like "yes", "no", and "sex". After putting paid to the issue of words for sex, Pullum goes on to make this fascinating observation about how the Irish say "yes" and "no":

The story about Irish lacking particles meaning "yes" and "no" is true, by the way. But it has nothing to do with the Irish mind or spirit or way of looking at the world or the notion of neither agreeing nor disagreeing. In Irish you repeat the verb of someone's clause to agree with it (as if someone said "Got milk?" and the way you gave an affirmative response was to say "Got"), and you repeat their verb with the negation particle in front to deny it ("Not got"). But the same is true of Chinese.

Which is not to say that there aren't interesting cultural consequences related to that particular characteristic of the Irish language. Certainly they could have said 'yes' or 'no'; no language would last long without a simple and straightforward way of doing so, and in any case the Irish construction is somewhat analogous to the German constructs for negation. In German, the word for "not" (as in "not green"), nicht, is nearly identical to the word for "nothing", nichts. But I've yet to hear anyone suggest this colors the German way of thought. I submit that the reason is more or less frank ethnocentrism, again. (And in fact I have never heard a German-speaker make an error in speaking English that would suggest any confusion on the issue.)

I'm reliably informed, for example, that it's regarded as being in very bad form to "say no" (i.e., to refuse to do something) in Japanese. Certainly they have a way of doing it; but it's not something that polite people observing the pretense of social equality will easily do. Instead, you find indirect ways of expressing refusal. The subject came up with regard to Japanese marketing campaigns by a large, Rochester-based "document company" which used far too assertive language forms. But this is not so much a matter of language per se as of its usage in a cultural context. A language can be used, generally, to express any number of different ideas; in a cultural context, though, it may not be very good for expressing some of them.

Spam-Whack: What Happens When You Cut Humans Out Of The Loop

For about 12 hours now, I've been getting hit heavily by texas-holdem spam. It comes just two days after "" "spam-whacked" (to coin a phrase) its way to a high position in the Daypop Top 40, one of the key indicators of memetic activity in the blogosphere. It didn't stay there more than a day, but it was there long enough for my 12-hour aggregation cycle on Daypop Top 40 to pick it up.

This wave of comment spam here (all caught by my filter, after the initial training) is conventional comment spam. But my hunch is that the "" Daypop-whack was done with trackback. I just can't imagine it happening rapidly enough and in a widespread enough form to do so without the assistance of trackback auto-discovery.
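The comment filter doing the catching is a trained statistical filter. As a rough illustration of how that class of filter works, here's a toy naive-Bayes sketch in the spirit of Paul Graham's "A Plan for Spam" -- this is not the actual filter running on this site, and the training strings are invented:

```python
from collections import Counter
import math

def tokens(text):
    return text.lower().split()

class SpamFilter:
    """Toy naive-Bayes comment classifier (illustrative only)."""

    def __init__(self):
        self.spam = Counter()   # token counts seen in spam comments
        self.ham = Counter()    # token counts seen in legitimate comments
        self.nspam = self.nham = 0

    def train(self, text, is_spam):
        (self.spam if is_spam else self.ham).update(tokens(text))
        if is_spam:
            self.nspam += 1
        else:
            self.nham += 1

    def spam_probability(self, text):
        # Combine the class priors and per-token likelihoods in log
        # space, with add-one smoothing so unseen tokens don't zero out.
        log_spam = math.log(self.nspam / (self.nspam + self.nham))
        log_ham = math.log(self.nham / (self.nspam + self.nham))
        spam_total = sum(self.spam.values()) + 1
        ham_total = sum(self.ham.values()) + 1
        for t in tokens(text):
            log_spam += math.log((self.spam[t] + 1) / spam_total)
            log_ham += math.log((self.ham[t] + 1) / ham_total)
        return 1 / (1 + math.exp(log_ham - log_spam))

f = SpamFilter()
f.train("texas holdem poker online casino", True)
f.train("great post thanks for the insight", False)
print(f.spam_probability("texas holdem casino") > 0.5)  # True
```

This is why the "initial training" matters: until the filter has seen a few examples of each class, the token counts carry no signal and everything rides on the priors.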

BTW, I haven't found anybody actually mentioning this incident, which is very interesting to me. It means, I think, that they either didn't notice, didn't understand the importance, or didn't want to admit the importance. Which is huge, because this would demonstrate two things -- one very important, the other merely interesting:

  1. The effectiveness of trackback spam is more or less entirely due to auto-discovery, which effectively automates the distribution of trackback spam. (The blogorati will underestimate the importance of this by observing snarkily that this could have been avoided by using nofollow. They're probably right, but the observation misses the point in a big way.)
  2. The merely interesting thing is that this helps to clarify who's responsible for the attractiveness of Trackback spam: SixApart.

We can say safely that SixApart are responsible, by the way, because they initially invented trackback as a manual means of "giving juice" to someone else, and then failed to understand that it needed to stay manual. It was intended to be initiated by human action, not automated. But then they proceeded to automate it; that made trackback geometrically more attractive as a target for spam: It meant that spammers could potentially place themselves into the various automatically-compiled "top"-lists in a completely automated fashion (i.e., at cost to the spammer approaching zero). And with no legal consequences, to boot: They couldn't be prosecuted under email laws, because it wasn't email; they couldn't be charged with theft of service or hacking because -- and this is key -- the spamming was being carried out as a normal designed feature of the "exploited" systems, using their resources.
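For readers who haven't looked under the hood: auto-discovery works because MovableType-style pages embed a small block of RDF naming the URL that accepts trackback pings. Here's a sketch of the discovery step from the spammer's side -- the markup and URLs below are invented for illustration, and sending the actual ping is just one more HTTP POST of url/title/excerpt/blog_name form fields:

```python
import re

# Sample of the RDF block MovableType-style blogs embed in entry pages.
# The trackback:ping attribute names the URL that accepts pings.
# (Hypothetical example.org markup, for illustration only.)
RDF_SAMPLE = """
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:trackback="http://madskills.com/public/xml/rss/module/trackback/">
  <rdf:Description
    rdf:about="http://example.org/archives/000123.html"
    trackback:ping="http://example.org/cgi-bin/mt-tb.cgi/123" />
</rdf:RDF>
"""

def discover_ping_url(html):
    """Return the first trackback:ping URL embedded in a page, if any."""
    m = re.search(r'trackback:ping="([^"]+)"', html)
    return m.group(1) if m else None

ping_url = discover_ping_url(RDF_SAMPLE)
print(ping_url)  # http://example.org/cgi-bin/mt-tb.cgi/123
```

A crawler that does nothing but fetch pages, run this one regex, and fire off POSTs can place spam links completely unattended -- which is the sense in which the cost per ping approaches zero.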

The great mystery isn't that it happened, but that it took so long to happen.

Shelley et al.'s "tagback" concept might provide a "fix" for this, of a very limited sort, but it still leaves us without trackback. Trackback was a very useful concept; it allowed people to create implicit webs of interest, one connection at a time and without central planning, and -- and this is really important -- without the mediation of a third party like Google or Technorati. And we all know that spammers will find a way to parasitize themselves onto tagback, as well.

And anyway, reliance on third parties for integration gives them power they should not be allowed to have. It's a bad design principle. Trackback, pre-autodiscovery, was a good simple piece of enabling technology. But it was mis-applied, quickly, with the encouragement of people who should have known better. And now it will be forgotten. Which is really, deeply stupid, when instead it could simply be re-invented without auto-discovery.

Libertarianism as Inverse Marxism

A thought for the moment:

The most fundamental problem with libertarianism is very simple: freedom, though a good thing, is simply not the only good thing in life. Simple physical security, which even a prisoner can possess, is not freedom, but one cannot live without it. Prosperity is connected to freedom, in that it makes us free to consume, but it is not the same thing, in that one can be rich but as unfree as a Victorian tycoon's wife. A family is in fact one of the least free things imaginable, as the emotional satisfactions of it derive from relations that we are either born into without choice or, once they are chosen, entail obligations that we cannot walk away from with ease or justice. But security, prosperity, and family are in fact the bulk of happiness for most real people and the principal issues that concern governments.
[Robert Locke, "Marxism of the Right", in American Conservative]

It strikes me that many people will find American Conservative to be an unusual venue for this kind of analysis. But this is Pat Buchanan's rag, and it bears his stamp; this is Buchanan Conservatism, speaking loosely -- the "Buchananite" camp (for lack of a better term) has always been something of a herd of cats, by comparison with their more ends-means-challenged fellow-travellers on the Republican Right, like Rove and Norquist. They prize analytical thinking and intellectual independence and integrity, though they're not above swallowing a bit of that independence to take one for the team on occasion.

I've argued in the past that Pat Buchanan is more or less personally responsible for the debased state of popular political discourse in America. But I like to think he'd have exercised more restraint if he knew it was going there. And while my parents will gleefully describe me as a "flaming liberal", I still regard myself as, in many ways, conservative (that's with a small-c); and while I think this analysis is largely spot-on and that American Libertarianism (that's with a big-L) is a bunch of dangerous humbug, I'm still very sympathetic to the libertarian ideals (that's with a small-l) of free choice and freedom from external restraint; still, I cut my political teeth reading William F. Buckley and William Safire, and sitting in on meetings of the Executive Committee of the Saratoga County Conservative Party, and I did my time as an adolescent follower of the teachings of Ayn Rand. So at the least, I understand Conservatives better than most people I know. They fascinate me. But I digress, as usual.

It's an interesting analysis of Libertarianism, and overall I think it's correct. But Locke goes too far in two areas: He characterizes Libertarianism as having a "dogma that all free choices are equal", and of having "contempt for self-restraint." The first is his own straw-man version of Libertarianism, based on his assertion that when Libertarian views on free choice are taken to their "logical conclusion", they imply "... that a man who chose to spend his life playing tiddlywinks has lived as worthy a life as a Washington or a Churchill." While it might be true that Libertarians ought to hold that view, if they were being logically consistent, it's probably also true that the vast majority of people who call themselves Libertarians don't hold that view. So the correct criticism would be for inconsistency, not for equating all free choices.

On the second point he's simply wrong, as far as I can see, since it appears he's making a simple assertion and not even a conclusion about what Libertarians ought to think. Rather than having contempt for self-restraint, in my experience, most Libertarians assume it. Which, to be fair, can end up having the same effect: Children are not taught such techniques of self-restraint as delayed gratification or imagining the consequences of their actions. (There's that imagination thing, again...)

Perhaps by "contempt", he means "neglect." And it's certainly not a spirit that's restricted to Libertarians; American popular culture -- well, really, all modern consumer culture, as well as our entire world economy -- is really predicated on the statistically "wealthiest" populations exercising as little self-restraint as they can while still retaining their capacity to spend capital.

In closing, here's a thought for the Grover Norquists of the Conservative world:

[Libertarians] often confuse the absence of government impingement upon freedom with freedom as such. But without a sufficiently strong state, individual freedom falls prey to other more powerful individuals. A weak state and a freedom-respecting state are not the same thing, as shown by many a chaotic Third-World tyranny.

To be honest, though, I'm giving Norquist credit I don't believe he deserves. I don't sincerely think he's at all interested in idealized Libertarian freedom, but rather primarily in power for its own sake -- and only secondarily in power as a means to the end of his doctrinaire agenda.

Bloody Amarillo

It's as though only blood will satisfy Texans. They seem to be largely outraged at the Supreme Court's recent decision regarding the execution of minors. From Morning Edition this morning [RealAudio], I heard again and again that these convicted murderers are somehow not being punished because they won't be executed.

As though they're getting let out of jail, instead of having their sentences commuted to life in prison. Justin Wiley Dickens, convicted of murdering a man in an Amarillo pawn shop robbery when he was 17, is puzzled by the outrage at the ruling. "This is hell. It really is. I can't understand the outrage of them saying we don't be executed, we're just goin' to another life of hell. They ain't never gonna let any of us out. Life sentence means a life sentence. And I pray for Jim Jacobs and Francis Carter's families, I just live every day with regret, I really do. Just tell them I'm sorry. If you would."

Justin Wiley Dickens's case is an interesting illustration of this bloodthirstiness: The shooting happened during a struggle over the weapon, under circumstances where it's unclear that Dickens engaged in any meaningful premeditation. In other states, this might have been second-degree murder, or even manslaughter.

But not in Amarillo, because in Amarillo, the District Attorney knows what's in the criminal's heart: "I got to know Justin Wiley Dickens very well, in that trial", says Amarillo DA James Farron. "If you have something he wants, and he has to kill you to get it, he'll kill you in a heartbeat, I assure you. You, me, anybody else." That's not a particularly Texan attitude for a DA, of course, but it is a particularly DA attitude. Criminal DAs generally take the line (at least publicly) that everyone they've ever prosecuted was guilty, regardless of the verdict (or the evidence, for that matter).

(For what it's worth, Amarillo attorney and adult death penalty supporter Russ Bailey, who was assigned to defend Dickens, disagrees strongly with Farron's assessment: "Justin in my opinion did not have the requisite intent. He was not an adult for any purpose in my opinion at that time. He was a nice kid....Most of these kids don't have any control over their lives. Justin didn't have any. He never stood a chance. And to throw away a life before they've even tried to live their own is a real tough thing to accept. It was for me for Justin. ")

People can be great at missing the point, though, especially when it's in their interest to do so. Farron, for example, sees the Supreme Court ruling as a statement that "all 17 year olds" are decision-impaired: "It is simplistic and sophomoric to suggest that we can draw a line in the sand and announce that everybody younger than this many days is immature, unable to make decisions the same way that you and I do -- is that true of some 17 year olds? Absolutely. Is it true of most 17 year olds? Probably. Is it true of all 17 year olds? Absolutely not." Farron, for his part, seems to think that "it's true" of at least a third of 17 year olds: one third of Farron's own death row convictions are under 18.

Of course, it's simplistic and sophistic (and most likely Frankfurtian bullshit, to boot) for Farron to suggest that's what the Supreme Court ruling was meant to establish. As legal language goes, the decision is really quite plainly worded; if Farron really believes that's what they meant, he should be disbarred for incompetence. To quote Justice Kennedy's opinion [pdf]:

.... An unacceptable likelihood exists that the brutality or cold-blooded nature of any particular crime would overpower mitigating arguments based on youth as a matter of course, even where the juvenile offender's objective immaturity, vulnerability, and lack of true depravity should require a sentence less severe than death. When a juvenile commits a heinous crime, the State can exact forfeiture of some of the most basic liberties, but the State cannot extinguish his life and his potential to attain a mature understanding of his own humanity. While drawing the line at 18 is subject to the objections always raised against categorical rules, that is the point where society draws the line for many purposes between childhood and adulthood and the age at which the line for death eligibility ought to rest. ....

Which means that yes, they understand 18 years (or 6574 days, if DA Farron prefers) is an arbitrary cut-off date; but then, arbitration is their job. Part of that job means that they have to act, sometimes, when the demagoguery of some local politicians, or the particular popularity of some victim (as in the case of Justin Dickens) exacerbates local bloodlust.

Put another way, the point of the ruling was that elected or politically-appointed operatives like Farron ought not be trusted to turn American civil society into a cruel myth, ever-invoked but seldom adhered to. We've already let federal demagogues do that with anti-terror statutes that effectively permit the abrogation of basic constitutional rights to free speech, habeas corpus, and freedom of association.

Rebranding Dissent, 2005-03-05

After reading Daniel Schorr, writing at the CSMonitor, I'm left wondering whether he's getting sloppy. It's not that he thinks George W. Bush may have "gotten it right" when he said that "a liberated Iraq can show the power of freedom to transform that vital region." (Though Schorr doesn't address whether Bush ever really cared about whether he was right -- i.e., whether or not the President is a bullshit artist.) It's not even that he thinks the so-called "cedar revolution" underway in Lebanon has some causal connection with our disastrous liberation of Iraq. It's that I can't figure out why he'd think that.

It's just not a view that makes sense. Why should Lebanese (or Egyptians, for that matter) be positively inspired to seek political freedom by the images of American military dominance in Iraq? Fear, perhaps, that the US would make them the next example, even though the Egyptians have been our partners in crime for many years and Lebanon doesn't seem to capture the American political imagination anymore (if it ever did). Schorr nevertheless thinks that some mysterious "Iraq effect" is inspiring Lebanese to drive out the Syrian "security" force. He thinks that Iraqi "freedom" must be what's inspiring these people.

He thinks this, in spite of the fact that the "cedar revolution" bears a much closer resemblance to the Czechoslovakian "Velvet Revolution" of a generation earlier (not to mention contemporary actions in Poland, East Germany, Romania and Hungary). Or, for that matter, to any number of popular uprisings throughout the world, successful or not, in the decades since.

Or, more to the point, Schorr could consider the electric shock that seemed to go through the Arab world with the death of Yasser Arafat and the re-emergence of Abu Mazen as a popular political force, thanks in no small part to the weariness of ordinary Palestinians.

The only explanation I can think of for this failure to see other, far more likely causes, is the common and highly ethnocentric (if not frankly racist) view that "the Middle East" is somehow different, its peoples and nations somehow less cultured and civilized, and certainly less aware of world events. They don't know about popular protests in the Ukraine; they never learned anything about European or Asian or New World history, so they don't know about Tiananmen Square, the popularly-inspired renaissance of South Korean democracy in the late '80s and early '90s, or the popular groundswell that saved nascent Russian democracy (for a while) from "counter-revolutionary" forces. Those ignorant Arabs must not understand any of that stuff. This is the Middle East, after all. It's different there.

Perhaps it's just a generational thing; we're past the time, maybe, when we should expect any foundation in history from reporters and "news analysts". Of course, an "analyst" as seasoned as Schorr doesn't have that excuse: He's "analyzed" every significant popular uprising since the fall of the Iron Curtain, and was reporting the news way back when Lebanon was one of the most beautiful and culturally vital nations on the Mediterranean.

Bullshit Is A Process

"One who is concerned to report or to conceal the facts assumes that there are indeed facts that are in some way both determinate and knowable. His interest in telling the truth or in lying presupposes that there is a difference between getting things wrong and getting them right, and that it is at least occasionally possible to tell the difference. Someone who ceases to believe in the possibility of identifying certain statements as true and others as false can have only two alternatives. The first is to desist both from efforts to tell the truth and from efforts to deceive. This would mean refraining from making any assertion whatever about the facts. The second alternative is to continue making assertions that purport to describe the way things are but that cannot be anything except bullshit."
-- Harry G. Frankfurt, from "On Bullshit" (in The Importance of What We Care About, and as quoted on wrongheaded)
"I had a guaranteed military sale with ED209! Renovation program! Spare parts for 25 years! Who cares if it worked or not!"
-- 'Dick Jones', Robocop

I've been thinking a lot lately about a problem, a phenomenon, a type of behavior that I've described for myself as "po-mo ironism". It's a way of keeping a sense of ironic detachment that lets you criticize something as you valorize your decision to participate in it -- for example, carefully noting the defects and problems of SUVs while arguing that your decision to buy one is, nevertheless, virtuous.

It's seemed to me that there's something more than mere disingenuousness at work, here, and something more than simple ironic detachment, as well. Now I have a simple, concise word for it: Bullshit. And a framework to hang it on. Bullshit could be said to be the basic "truth-process" underlying lysenkoism, disingenuousness, ironic detachment, self-delusion, and a host of other discretely-named ills.

Bullshitting, according to Harry G. Frankfurt, is far more insidious and corrosive than lying, though they might superficially seem to be synonymous. Something can be true and still be bullshit, if the purveyor of said excrement never cared whether it was true. So, for example, even if we choose to believe that Saddam really did want to buy Yellowcake in Africa, the infamous "16 words" are still bullshit, because Rice, Bush & Co. never actually cared whether they were true.

What makes it bullshit, in other words, is not that they lied -- they might have even believed it was true -- but rather that, if they hadn't found evidence, they would have made it up. (Which, in fact, they did; that Saddam might have actually tried does not alter the fact that the Bushites ["Bull-Sites"?] fabricated their evidence.)

Because what mattered to them was not whether it was true, but whether they had the evidence to support doing what they wanted to do: Go beat up the guy who made George's daddy look silly.

Tim's Mammalian Brain

Heaven forbid we should make a rational choice. Because, of course, rational, counter-intuitive thinking has never gotten us anywhere. Not anywhere that we remember, at least, while our lizard-brains are in charge. It might be worthwhile, though, to remember that for the last few ten-million years, the mammals have been in charge.

Jennifer Loviglio wants an SUV. She wants it because she wants to feel safe:

.... I want an SUV. I want to be safe. Last month I totaled my old Volvo in a scary accident, and at that moment everything changed.

It was late afternoon and the weather was fine --- dry roads, good visibility. I was driving along East Avenue and without warning a young driver in a Honda made an illegal left across traffic. I hit the brakes but it was too late. The awful metal smash. The explosion of airbags with their acrid smoke and debris. My son screaming in the backseat.

The car lurched onto the sidewalk and we got out fast. No one was hurt. ....

... and yet, she still wants her SUV.

She's test driven them, and she felt that tendency to roll over; she felt it as even more pronounced in the full-sized SUV, but she still wants "8,600 pounds of metal between my boys and the other cars."

She wants those 8,600 pounds because she wants to feel safe, not because she wants to be safe. In fact, she knows she'll be less safe:

In larger SUVs, that top-heavy pull is even stronger. And yet, even though I know better, it does feel safe up there. A couple of years ago, in a New Yorker article about SUVs by Malcolm Gladwell, an industry expert pointed out that this paradox is common. On an intellectual level people know taller vehicles have a greater chance of a rollover, but on what he calls the "reptilian level," consumers think "if I am bigger and taller I'm safer."

The article also shows how SUVs take much longer to stop and are difficult to steer even at moderate speeds, whereas sporty little cars with their better handling can avoid potential collisions at speeds upwards of 50 miles an hour. It makes the case that a smaller car, which could be crushed by an SUV, might nonetheless be a safer vehicle because of its maneuverability. Still, though, if I'm going to hit something --- God forbid --- I'd rather be in a tank.

Of course, the rational thing to do would be to check the crash test ratings for various models, or even just buy another Volvo. The first one served her well: The much-lauded Volvo space frame did its job, the airbags worked, and no one was hurt. And in the unlikely event of a rollover, there are few cars in current manufacture that will keep her family safer than a Volvo.

But this isn't a rational issue, it's a "maternal gene" [sic] kicking in. And we all know, don't we, that it's "crucial" (by which she clearly means 'forgivable') to obey the yearnings that we think are wired into our genes.

Which is to say, to be good and conscientious parents -- well, mothers, really, since "paternal genes" aren't under discussion -- we must always obey our lizard brains. Heaven forbid we think for a moment with our mammalian brains.

My sister went off the road one time and rolled her car. She was on her way to church on Sunday evening, with her two-year-old son and a bunch of warm pumpkin pies in the back seat. When her '73 Saab 99 settled back onto its wheels, she felt something warm and sticky on her head; but it was only pumpkin pie, and Luke was screaming that frightened but unhurt scream from his fiberglass car seat.

Luke is now 25 and a father of two. The much-lauded Saab roll-cage had done its job. The next day, once he'd arranged to have the wreck (which still at least looked driveable) towed back to the house, Luke's father Tim was on the phone looking for a new-used Saab to replace it with.

That was Tim's mammalian brain -- his "paternal gene" [sic] -- working. He wanted his family to be safe.

And after all, the mammals are in charge, now.

Chris Rock, the Red-State Conservative

While I can't say it surprises me, it's amused me for a long time that people like Matt Drudge think Chris Rock is "dangerous." Dangerous to what, I wonder. Perhaps to their complacency. Certainly not to "family values" or "common mores" -- not if you're paying attention. I am a bit surprised, though, to hear him referred to as a 'William F***ing Buckley Conservative'.

Years ago, I saw Chris Rock on television -- probably HBO, probably "Bring the Pain", but I don't remember exactly. I do remember a bit he did about men cheating on their girlfriends. He started by signalling his intent -- showing the club, as it were: "Men are stupid. Because you know you're gonna get caught." He does it to be fair, maybe, or maybe to prove that even after he's signalled that he's going to drop a hammer on them, he'll still sucker the men in the audience in. Which he proceeded to do, describing how natural it was to want to cheat, how easy it should be to lie -- if, of course, men weren't stupid. And, more important, if they didn't know damn well they deserved to get caught.

Rock is a stealth moralist. He's a preacher to the pop-cultural -- a wandering rabbi or imam, ministering to the barflies by telling stories in terms they can understand and that will elicit enough of their empathy to make them stretch their minds and consider their world. It's an old and proud tradition (as old probably as storytelling), realized with a wide array of techniques. My favorite contemporary example is Matt Groening's marvellous creation, The Simpsons, which panders to our baser instincts and then springs the trap on us by making its characters renounce their ill-gotten gains in the service of What's Right. Rock's technique is similar: He lulls his audience into a false sense of security, and then explains in quick, brutal strokes that anyone he's suckered is a fool. And a morally depraved fool, at that.

What Rock's not is a 'William F***ing Buckley Conservative', as John Swansburg seems to think he is. Raising your daughters to not be strippers, or suggesting that single mothers ought to put their children before Girls Night at least most of the time, or suggesting that abortion might be a little too cavalierly chosen, are not "Red State Red" conservative positions: They're mainstream moderate American moral positions, shared by the vast majority of adult "Blue" and "Red" state residents, and anyone who suggests otherwise is buying into the Republican framing myth that holds that True American Values are right-wing religious conservative values. They're not actually religious values at all, and what's more, they're not communicated via religion -- at least, not in a healthy, functioning society they're not. In a healthy family, they're communicated by example. Children learn responsible and moral behavior by watching their parents, their extended family, their neighbors, and the people they meet in daily life.

What Rock is, is Lenny Bruce with tamed demons, or George Carlin with more integrity. Any good comic keeps a few demons in the closet to feed him material; but if they're smart (and probably a little lucky), then sometimes, just sometimes, they learn to tame them without losing the energy the demons feed them.

Chris Rock is also another important thing: He's a professional, just like Whoopi Goldberg or Billy Crystal. Personally, I expect his impact on the quality (moral or otherwise) of the show to be positive.

Love, Pain, and Story Logic

One of the most egregious failures of imagination that I see every day is what looks very much like an inability (or more likely an unwillingness) to stretch the mind to understand what a story is trying to tell you.

And what a story is trying to tell you isn't some single, specific thing. If it is, it's a bad story -- maybe even a false story. Good stories -- "true" stories -- are like thought-experiments: "What would happen if someone did this?" I am increasingly convinced that stories are how humans are wired to make sense of the world. Stories are why we have advanced language skills: Better language made for better stories, better stories made for a more survivable community, etc.

If a story has consistent, valid story logic and character logic -- if the characters behave in ways that make sense for those characters, in that circumstance -- then we can safely say that there's at least some truth in it. If the story is powerfully told, so much the better: Without good telling, we won't stretch ourselves to find the empathy we'll need to make the narrative talk to us. This is what "great" makers of narrative (those folks we call "writers", but also film-makers, poets, songwriters, painters...) have always done.

So Medved is not only being reductionist on this point, he's being a bad critic, because he's approaching criticism without imagination. He's looking at the film as though it's some kind of morality-machine, and any good film -- any good story -- is something more than that. It's a narrative, from which we can draw a deeper understanding (that is, if its story logic and character logic are true).

Anyway, I don't have high expectations for a review from anybody who's expecting to find a clear moral universe in an Eastwood film. Think of Unforgiven (endorses prostitution and lawless behavior), Midnight in the Garden of Good and Evil (romanticizes gay sex and murder and promotes an anti-christian agenda through endorsing voudoun), Bridges of Madison County (glorifies adultery), or probably any of his other films from the past 15 years. There is a theme there, though, I think, and it's that Clint Eastwood lives in an increasingly vague moral universe these days. The only things that seem to be certain in Eastwood's moral universe are pain and love. (And there are worse absolutes to fixate on. Power, per se, for example, has no real moral endorsement in Eastwood's vision -- it's a fact, to be sure, but it's always in service of love or pain. But I digress...)

What these films can help us to understand is that a vague moral universe is not an amoral one. Every Eastwood picture that I can recall (aside from his forgettable late Dirty Harry outings, done to win studio backing for future projects) has been driven by its moral choices. His characters do not serve as moral models; rather, they model moral behavior. There's a crucial difference: The former means that they are merely shadows on the cave wall, cast by the contorted hands of a finger-puppeteer; the latter allows us to imagine ourselves in that world, and consider the choices we would make.

Please Just At Least Try To Imagine. That's All I Ask.

But be honest about it. Don't just pretend. That's cheating, and lying, and it makes Baby Jeebus cry.

I'm increasingly convinced that the greatest roadblock to human progress is lack of imagination. More particularly, the inability -- or unwillingness -- to imagine oneself in the position of another.

The problem can look like other things: Like (selective) literalism, as when someone like Michael Medved or Ted Kavanau can see nothing in Million Dollar Baby but a "pro-euthanasia" or "anti-christian" tract. Or it can look like lack of empathy, as when wannabe uber-geeks dismiss the problems of "lUsErS" as being of their own making, or knee-jerk free-will zealots (willfully?) ignore the benefits they accrue from being members of civil society even as they rip out one of that society's underpinnings.

That empathy requires imagination I regard as self-evident; that people who lack empathy literally lack imagination, I regard as open to question. As a friend remarked to me recently, "it's all about what's at your front door."

Food Adventure for the day: Shrimp and Vegetarian Etouffée with Dirty Rice

I was at an 80th birthday party on Saturday night, for the mother of my good friend and XSO, Amy. Jane's friends were mostly out of town or simply gone, so we decided to invite our own. (Of course, we didn't tell her that in advance, so everyone besides myself and the family was "unexpected." There's precedent for that, from Jane's 70th birthday party, for which I made a large lasagna with tortillas and dark meat of turkey, and maybe I'll get to that, later. After I've actually gotten some work done.)

I made two etouffées (one vegetarian, one shrimp); Unexpected Guest Lynne made a lovely vegetable creole, with nicely crisp green beans and firm zucchini chunks; Unexpected Guest and Mother Of The Son In Law, Loretta, made a large mesclun salad with a light vinaigrette; Daughter #1, Ellen, made a very moist bundt cake with a nutty, spicy center; and Amy (Daughter #2) made Bellinis.

I fended off a fusillade of compliments on my dirty rice and my etouffées. So, here's how I did it all, more or less. I didn't actually have recipes; I took ideas instead from a bunch of different recipes, and combined them with my recollection from the last time I made shrimp etouffée, something like ten years ago. (There's a story there, too.) Of course, any decent recipe would start with a basic roux; my plan was to make a double batch of roux, and then split it to make one vegetarian and one shrimp etouffée. I used Paul Prudhomme's roux formula, which calls for celery, onion, and bell peppers and a balanced presentation of pepper (equal parts cayenne, black, and white); I amended it slightly to accommodate my circumstances, but the base is sound. The idea of balancing the peppers is based on the fact that it takes about twice as much black pepper as red pepper to have the same level of effect (which is in part because black and red pepper are sensed by different taste receptors). If you just used black pepper, it would have a negative impact on flavor, so at least half of the "black pepper" should be white pepper. (White pepper is just black pepper without the skin.) Note that if you follow Chef Paul's instructions on spicing hot dishes precisely, you'll often get dishes that are uncomfortably hot for most people; here, I used about two thirds as much pepper as would normally be used, and added some ingredients to improve the vegetarian presentation.

The effect was really quite good; I was worried about the vegetarian etouffée, but I corrected for what I thought of as a little blandness with some lemon juice, and the visual presentation improved a great deal after "Unexpected Guest" Michelle added tomatoes and parsley.

My dirty rice was an agglomeration of several recipes; I chose it because I needed to fire and forget: I put it in the oven for an hour and concentrated on the etouffées. Without that, we would have been screwed, frankly.

So, here are the recipes, as I remember them.

Roux: 2 cups flour; 2 cups olive oil; about 1.5 cups each of finely-chopped bell peppers, onion and celery; four or five cloves of chopped garlic; about a half cup of finely chopped 50:50 mix almonds and cashews; about 1 tsp cayenne; about 1/2 tsp black pepper, cracked; about 1 cup of finely chopped mushrooms; about 1 tsp white pepper; 2 tsp dried basil; 2 tsp dried thyme; two dried Ancho peppers, shredded and reconstituted in water, with juice. (Anchos will not add much heat to the palate, but will add lots of dried pepper flavor.) Heat the oil in a heavy high-sided iron skillet; add the flour slowly, stirring in with a fork, breaking any lumps immediately; work constantly over medium-high to high heat, continually scraping the bottom to keep the roux from burning. When the roux is cooked to desired darkness (for etouffée, should be about the color of a brown paper bag, or even a little darker), add the nuts and cook for at least a minute or two to bring out the flavor. Then add the chopped celery, garlic and peppers. Stir to mix well, then add pepper and spices. Finally add all remaining ingredients, stir until well mixed. Separate out half to use with the shrimp half of the batch.

A note of caution: When cooking a significant amount of roux, you should take basic safety precautions: Have a source of cold water handy to deal with burns, and have a fire extinguisher handy. Hot roux is literally like NAPALM: It will stick to your skin and burn you from the heat of its cooking, and if it catches fire, it will stick to your skin more or less just like NAPALM would. And I think it could probably catch fire pretty easily: The oil is heated well above its flash point, and the flour will burn, too. So just be careful, mkay?

If you taste the roux at this point, it may seem too hot. It should: You're going to still be adding a lot more stuff, which will dilute some of the impact. (Be advised, though, that because of the white pepper it will have a delayed hit; taste, then reserve judgement for about 10 seconds until it kicks in.)

For the vegetarian etouffée, add another cup of finely chopped mushroom and about two cups of sliced or chunked mushrooms, and enough vegetable broth to get to the desired consistency, and cook until the mushroom chunks are cooked to taste, adjusting spicing as you go. When mushrooms are done, stir in about three cups of sliced roma tomatoes, and set aside, covered, until ready to serve. Before serving, stir in about a half cup of coarsely chopped parsley.

For the shrimp etouffée:

Start by browning some sausages. I browned about a pound and a half of italian-style turkey sausages, then set them aside to cool; after they'd cooled, I sliced them and set it all aside. I heated a large paella pan (a very large skillet or a wok might work ok, too) until it was fairly hot, then added oil to cover the bottom. I fried about two pounds of de-veined hulled shrimp until they were firm; I had to work in stages to allow for enough cooking space. Throughout that time, a nice crust was developing under the oil. When I had all the shrimp cooked, I dumped it all back in the pan and deglazed with fish bouillon, added the sliced, browned sausages and their juices, then added the reserved roux from the vegetarian preparation. When I had it well mixed, I added two to three cups of sliced roma tomatoes, and removed it from the heat (didn't want the tomatoes to break down). Immediately before serving, Unexpected Guest Michelle stirred in a bunch of coarsely chopped parsley.

This made a lot of both kinds of etouffée, but then, we had a lot of people. And anyway, I think it should freeze well. The residual heat will blanch the tomatoes and stop them from getting mealy when frozen. Though the texture of the shrimp would probably suffer.

The dirty rice looked less than overwhelming when I first took it out of the oven -- it seemed too wet, but it firmed up a little after relaxing for a while. I started with a pound of Gimme Lean vegetarian sausage, browned in olive oil and broken up as much as I could. Then I added about 3/4 cup of finely-chopped onion, about three or four cloves of chopped garlic, a fistful of ground nuts (again, 50:50 almonds and cashews), and about a cup each of finely chopped celery, bell pepper, and mushroom. I added the pepper after adding about half the vegetables to cool the pan (to prevent the pepper oil from permeating the kitchen): As much as a tsp of black pepper, a bit less than a tsp each of white pepper and cayenne, and probably about one of the reconstituted ancho peppers. I stirred in about a cup and a half of brown rice and about three and a half cups of vegetable broth (that was too much -- it should really have been only about three cups, maximum). Then I put it all into an oiled casserole pan with a close-fitting cover, and put it in a 350 degree (Fahrenheit) oven for about an hour or so.

None of the recipes I built on to create these monstrosities included nuts or mushrooms. My theory on including them was that they would add more "meat like" flavors to enhance the vegetarian dishes. I can't swear that it did that, but I still believe in the idea. And anyway, both nuts and mushrooms are good for you, generally. As for the specific nuts, I think filberts (hazelnuts) would do as well as almonds; pine nuts would work well, too. Pecans would be interesting. But I wouldn't do this with peanuts.


Life Among The Oxy-Morons

In all the fuss over President Bush's "plan" for Social Security privatization, a simple fact keeps getting ignored: It can't possibly work, because it's based on a mistaken premise. And the error is so obvious that I find it hard to understand why people don't see that the clear goal is to eliminate Social Security altogether.

The error, to me, is the assumption that Social Security funds invested in the stock market would actually accrue enough interest to "save" the system. It's painfully clear that this is almost certainly wrong, if we consider just two simple data points:

  • The President keeps assuring us that the money will only be invested in "safe" "conservative" funds. But given the safety rules that will have to be erected around those funds to keep them from losing money, it's hard to imagine how they could make very much.
  • He also reminds us at every whistle-stop that participants will be permitted to invest only 2% of their Social Security deductions. For me, last year, that would have been about $80. And I made a lot of money last year.
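The arithmetic behind that second point is easy to reproduce. A minimal sketch -- the salary and the "2% of the employee-side deduction" reading are my assumptions for illustration; the post only gives the ~$80 result:

```python
# Back-of-envelope check of the "about $80" figure.
# Rates and salary below are illustrative assumptions, not from the post.

EMPLOYEE_SS_RATE = 0.062   # employee-side Social Security payroll tax
DIVERTIBLE_SHARE = 0.02    # the 2% of deductions eligible for private accounts

def divertible_dollars(salary):
    """Dollars per year that could go into a private account."""
    return salary * EMPLOYEE_SS_RATE * DIVERTIBLE_SHARE

# On a hypothetical $65,000 salary:
print(round(divertible_dollars(65_000), 2))  # roughly 80 dollars a year
```

Whatever salary you plug in, the point stands: a sliver of a sliver of payroll is not a sum that compounds its way to "saving" anything.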

Really, this should be obvious to everyone, and especially to "fiscally conservative" Republicans. Of course, "fiscally conservative Republican" is an oxymoron, so what should I expect? To the American Conservative mind, "social security" is a mistaken concept. There is no quicker way to marginalize yourself in American political discourse than to take seriously the concept of a commons (i.e., to treat community as an actual community). Ecological viewpoints -- the very idea of considering second-order effects in reckoning what Might Be -- are frowned upon. That requires subtle thinking. And that's not something we go for over here.

Shenzhen to Nashville, Non-Stop

We live in the Era of Air Freight.

My new Mac Mini shipped early this morning from Shenzhen, China, via FedEx. From there it will probably fly non-stop to Nashville on a FedEx 747-400, 777, or 767, and thence be routed here. I can track the movement of the package online, and see by implication how it's travelling: It hit FedEx at 8:51pm (local time) on 1/18/2005 ("Package received after FedEx cutoff time"). It left the FedEx ramp at 7:09pm. By my reckoning it will be in the air about 12-13 hours, based on the distance from Nashville to Shenzhen. So I should be able to browse to the FedEx site and see the Arrival Scan by about 9pm EST today. I'll be able to follow it hop by hop until it goes out on the delivery truck, which will be either Friday or Monday, depending on how seriously Apple takes their delivery-date promises.
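My "reckoning" above is nothing fancier than great-circle distance divided by cruise speed. A rough sketch -- the coordinates and the 900 km/h ground speed are my assumptions, not anything from FedEx:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on Earth, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

# Approximate coordinates (my assumption): Shenzhen, then Nashville.
dist = haversine_km(22.5, 114.1, 36.2, -86.8)
hours = dist / 900  # assumed average ground speed for a long-haul freighter
print(round(dist), "km,", round(hours, 1), "hours still air")
```

That comes out around 13,000 km, or a bit under 15 hours in still air; eastbound trans-Pacific tailwinds are what pull the real figure down toward 12-13.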

We live in the Era of Air Freight. This ecological fact is in many ways the most important practical implication of advancing technology: Computing and networking technology makes the coordination of global logistics possible, and efficient long-haul cargo aircraft from Boeing and Airbus make it cost-effective to distribute directly from a factory in China to a doorstep in western NY state. And all of this allows economies to pump capital more quickly -- allows the concrete manifestations of ideas and desire to move across the globe at 700 miles per hour. Thinking of it all in terms of goods and capital seems trivial, but this kind of point-to-point distribution is really the engine that drives the global marketplace, which in turn is what drives global society, for good or ill. We can blame the idea on Sears and Ward. The transit of goods in turn subsidized the rest of our long distance mass transportation network, as the big widebodies pack the extra space in their bellies with cargo, the complex spoke-end to spoke-end routing enabled by efficiently networked logistics systems.

And yet, all we see moving are the people. We are blind to the goods in the cargo hold on all the big planes; we taxi by the big, windowless cargo-haulers, logoed-up for DHL, UPS or FedEx, and most of us probably just have a quickly-forgotten moment of "Oh, so that's how they do it..." We only think about the people.

When the World Trade Center was attacked on 9/11, British Airways lost about 40 Concorde frequent-flyers. The impact went much deeper, though, than just the loss of 40 reliable fares. Many of those 40 were senior decision-makers at their companies. They were the people who could approve the expensive Concorde tickets, either formally or tacitly. The Concorde relied almost entirely on human passengers to pay its way, and so from "Golden Eagle", the Concorde returned to its early-'80s status as a money-burner. So we can add the Concorde to the list of things that can be said to have been killed by 9/11.

Amidst all the hoopla surrounding the formal unveiling of the Airbus A380 "super-jumbo", many asked where Boeing's competitive product was. Boeing's answer: The A380 is a "big plane for a small market." The same could have been said of the Concorde: Its market was so small that losing 40 passengers upset its fare-ecology sufficiently to make the plane commercially non-viable. But the Super Jumbo won't suffer the same fate. I heard it said more than once in the news coverage that half the orders were "from Asia" -- which means, they're for air freight.

Public reaction (and amongst the "public" I include most media business analysts) to the A380 under-reports a very important point: While hub-to-hub people-hauling is important, the 580-seat luxury model and even the as-yet unbuilt 800-seat steerage special versions of the A380 are really almost red herrings. The real target market for these aircraft is not passenger hauling, but air freight. There's big money to be saved by increasing the weight and range of the planes, even just between major hubs. This is a plane designed to fly non-stop from Yokohama to Louisville, Shenzhen to Nashville, Taipei to LA, São Paulo to London, with a really big payload of shoes and consumer electronics. Airbus's bread and butter customers for the A380 are outfits like FedEx, UPS, DHL, that won't stop using the hub-and-spoke model for the bulk of their traffic for a long, long time.

It's easy to see this as a triumph for economic models of understanding. But that would be a mistake. While all of this can be seen in economic terms, its effect is human, social, and the field on which the economic facts are cast is fundamentally ecological. And that's the reason that economists (Marx first and chief among them) fail to predict accurately: They fail to understand that economics is only ecology writ fine, and hence divorced from the larger picture. And from the fine-writ bits from other aspects of the big picture. Capital -- money -- is fuel in an ecology of commerce. But it is not, yet, the reason. For the reason, we can still, at least, look to such intangibles as desire.

From The "Planning Is Everything" Department

Thomas P. M. Barnett, of the Naval War College, on Getting What You Wish For: "... Rumsfeld's answer was that 'sometimes you go to the war with the army that you have, not the army that you want' -- not exactly. You go to the war with the army you've been wanting." Because that's the army you've got: "we've been wanting for the past 15 years an army that doesn't do peacekeeping, doesn't do nation building, doesn't do post-conflict stabilization." [Thomas P. M. Barnett, "The Pentagon's New Map", on ME 2005-01-18] [as RealAudio] So what we got was an army on the ground in Iraq that couldn't handle "winning the peace", just as General Eric Shinseki predicted.

Alas, we didn't wish for Iraq to become the world's new terrorist training ground. But we got it. What do these things have in common? That they were both predicted by anyone who bothered to think about it. That they were both obvious consequences of the way we chose to do things. That people responsible for seeing this stuff in advance, did see it, and did raise warnings. That people responsible for listening to those warnings did not heed them (and perhaps did not even listen).

But common sense has never traded well in American politics. If it did, we'd have taken the common sense approach that Barnett advocates and opened trade with Cuba. That would have brought Cuba into economic contact with the rest of the world. Barnett argues that when nations are brought into economic contact with the world -- when they're given a stake in the world community, as it were -- they don't cause problems. Ergo, if we wanted to make Cuba play well with the rest of the world, we should have practiced a little [ahem!] constructive engagement. (If it's good enough for China, why not for Cuba? Oh, wait, that's logic...)

Since this is basically consonant with orthodox market theory, with conservative theories of personal responsibility, and also plays well with the "liberal" comprehension of the fact that we've gotten ourselves far up the excrement race with no apparent means of propulsion, I'll be curious to see what kind of traction these ideas get.

Barnett's a military historian, so there's a military angle to all of this, too, of course: He seems to suggest basically that if we're to become colonial lords, we should make our army fit that bill. We should procure and plan and train for a force and force structure that is suited to peacekeeping, is suited to nation building, is suited to post-conflict stabilization. Common sense. Though I'm afraid having such a force might make us more likely to meddle in the business of other countries (I'd rather not see us as colonial lords), perhaps we'd get into less trouble if our army weren't well-suited for large-scale invasions. I'll be curious to see his reviews.

Podcasting Is Dead. Long Live Podcasting.

I'm mildly surprised that in the storm of mutual annoyance over podcasting, there hasn't been a clearer statement of where, how and why podcasting can succeed and fail. I suppose I shouldn't be, since clear-headed analysis doesn't generally sell trackbacks, but I think it's a really interesting phenomenon that will teach us a lot if we baseline and understand it correctly. And that can start with etymology.

As it is:

podcasting = [i]Pod + [broad/multi]casting = "multi-casting to people's iPods"

As it may be:

1: [private] podcasting = pod + [narrow/multi]casting = "narrowcasting to my pod"
2: [public] podcasting = [i]Pod + broadcasting = "broadcasting to people's media players"

Podcasting "as it is" currently understood is a short transitional phase. As a popular blogging modality, it won't last beyond 2005. Yet by 2006, something or things called "podcasting" will be extremely popular, and might even drive some interesting and powerful changes in the distribution of information.

Podcasting will very soon split into two distinct types of output: One that's highly personal, targeted at people you know and who hence know your voice (and hence don't require high production values), who are in tune with your interests (and hence don't require extensive meta-data to get your point). Personal podcasting will serve to cement bonds among groups of people who are not immediately and intimately connected. The second form, pro-podcasting, will be the kind of stuff David Berlind is talking about: Professionally or quasi-professionally produced output, primarily from media outlets but also from people for whom it's cost-effective to produce output that's essentially promotional.

The reasons are really simple and kind of rock solid, and they are simply that it's just not cost effective either for the 'caster to produce a high-quality podcast (unless you've got facilities, skill and time at your command to do so), or for the listener to spend a lot of time listening to something that s/he could apprehend a lot faster and with more flexibility by reading it. A podcast of sufficient quality that even interested strangers would want to listen to takes time to produce; furthermore, on-air reading is not something everyone can do well enough to make for a tolerable listening experience. Podcasts are also more or less invulnerable to full-text indexing (which makes it seem ironic to me that many of its proponents are also strong proponents of letting Google traffic arbitrate the importance of a resource). It's arguable that software solutions will be found for these problems, and I think there's merit to those arguments. But that's not to say that people will then actually use those solutions to blog as podcasts.

Typical "pro-podcasters" will range from Bill O'Reilly to Al Franken to Dave Barry. I wouldn't expect it to include people like Glenn Reynolds and Markos Moulitsas, because too much of their value comes from nimbleness and textual integration with the blogosphere. It may include people like Wonkette or Drudge, who could use their pro-podcasts to drive spiral traffic to their website, and vice-versa. Pro-podcasting will have a market-mover effect in terms of driving progress toward "radio TiVo" and pushing media players (and media of all kinds, since it will rapidly start to include offline video content).

But it's the personal-podcasters who will have the most interesting effects. The obvious market is distributed groups of friends and families -- people will be able to send narrowly-targeted multicasts to groups of people with whom they share an emotional connection. But there are also tremendous potential business applications for personal-podcasting. Personal podcasting could be used to facilitate workgroup solidarity, send what amount to persistent offline voicemails, even facilitate something like non-real-time audio chat.

And I find it interesting that I haven't heard about these uses, yet. Perhaps it's that for the first-movers and strong evangelists like Curry, Searls and Winer, there really isn't a separation between the business and personal applications. Which would also be interesting, if true. But more on that another time.

Podcasting By Any Other Name

People like to find arguments. It gives them a place to plant their intellectual flags and say "I was here first!" For example, there's apparently an argument over whether "podcasting" is "significant" from an investment perspective. David Berlind weighs in on his ZDNet blog. Berlind's answer is quite oblique, and while it makes some very important points implicitly, I think it will be accused by the podcasting faithful of 'not getting' podcasting; I'll accuse him of the same thing, for different reasons.

Basically, as far as I'm concerned, "podcasting" borders on being a hoax, of sorts: It's a name concocted more or less with the sole purpose of counting coup in the blogosphere, that's been blown up as something important and significant, and in blogospheric terms, it is both, but not on the scale that's presumed on its behalf. Podcasting as practiced in blogland will have very little impact on what the thing that will be called "podcasting" will look like in the future. It's one of those things that's important for the impact it's said to have, and not for the impact it actually has. It's important, in short, for the same reason that Jessica Simpson is [sic] important: Because people say so. It's got nothing to do with her singing.

The spur to Berlind's meditation was a question from a fellow reporter, working on a story (and hence, kept anonymous -- and no, I do not find anything sinister in that). "Old media" blokes, it seems, are still wondering whether blogs are "significant", and -- here's the curious part -- what that means for "podcasting". "His perception is that the blogger phenomenon is insignificant," Berlind's colleague supplies, "making podcasting negligible." From an investment perspective, of course.

Well, it's a terrible analysis, of course, as far as it goes: Major acquisitions and strategic investments are being made that are directly motivated by the idea of blogging, and so blogging is by definition "significant", and so we have to wonder what the heck this expert really means. Even if the raw numbers of new bloggers (tens of millions in the last year alone, similar to the boom-period growth figures for internet use) don't impress him, he's myopic if he doesn't understand that blogging per se isn't the issue; it's just the nascent stage of new modes of mass-personal communication. My own nutshell evaluation of this particular analyst is that I suspect he doesn't actually know what he's talking about.

Nevertheless, there is a grain of truth in the analysis. Personalistic "morning coffee notes", produced on an ad hoc basis by random bloggers, will never be significant in this "investment" sense. (Though I can see some interesting possibilities, there, for things that will be significant.) Why? Because the medium sucks; podcasting will never, ever become popular in the way that blogging is popular. On the other hand, as Berlind rightly points out, the rather old idea of media-shifting print content to voice (which used to go by the name "radio") and then mode-shifting that from a stream to an offline file, not only will be big, but has been going on for a while. In fact, it's older than the web, even on the Internet. The only things that're new about it are, first, doing the notification and distribution through RSS, and second, automating the media load onto portable devices.
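The RSS part of that is less exotic than it sounds: a podcast feed is just an ordinary RSS 2.0 feed whose items carry an enclosure pointing at an audio file, and a podcatcher polls the feed and fetches whatever the enclosures point at. A minimal sketch -- the feed contents and URL here are invented for illustration:

```python
import xml.etree.ElementTree as ET

# A minimal podcast feed: a plain RSS 2.0 item plus an <enclosure>
# element pointing at the audio file. (Contents are made up.)
FEED = """\
<rss version="2.0">
  <channel>
    <title>Morning Coffee Notes</title>
    <item>
      <title>Episode 1</title>
      <enclosure url="http://example.com/ep1.mp3"
                 length="12345678" type="audio/mpeg"/>
    </item>
  </channel>
</rss>"""

def enclosure_urls(feed_xml):
    """The heart of a podcatcher: collect every item's enclosure URL."""
    root = ET.fromstring(feed_xml)
    return [enc.get("url") for enc in root.iter("enclosure")]

print(enclosure_urls(FEED))  # ['http://example.com/ep1.mp3']
```

Everything else -- the polling schedule, the download, the sync onto the player -- is plumbing around that one lookup, which is why the "new" parts of podcasting are really the notification and the automatic device load.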

Those are important things, sure; but the podcasters didn't think of them. They just took their particular process public. And the particular "open" modality that they specified will be important during a transitional period -- but it's not where the money will be made or most of the traffic will happen. That will be on satellite. Podcasting in its current form is merely an interim step to the full realization of potential of satellite radio. "[U]sing the technology to audio-tivo satellite" would be just a start; wait until Apple or XM really get going on these ideas.

The Not-So-Hidden Truth About The iPod Shuffle

I'm not sure what's radical about the iPod Shuffle. OK, I'm lying, I know what's "radical" about it, and that's nothing: It has exactly two things that haven't appeared in previous flash-based players, and lacks a lot of things that have. Even in those two things, it breaks no new ground, since they're both attributes of the leading high-capacity product: It comes from Apple, and it integrates with iTunes. ("The Future Is Random"?!)

Those two little non-revolutionary things (Being Apple and Being [Of] iTunes) are pretty important. And the impact of the Shuffle doesn't lie within whether it's actually new or not, or even whether it's actually any good. The impact lies in how it serves to expand the iPod halo.

The random shuffle feature is nothing new -- I can do that on my iRiver. Neither is the integrated USB A-plug (I own a Virgin player, currently on permanent loan to a friend, that has a better-designed implementation of that). Recharging off the USB bus? It's been done. And though I don't troll the flash-player market, I'd be surprised to find it hadn't already all been done in the same player.

Even the "radical" step of "eliminat[ing] the user interface altogether" [sic] has been done before: There have been plenty of flash-based players that eschewed a song title display. Though usually, players that do that are actually cheaper than their competitors, instead of more expensive. But I digress.

As for what it lacks: An FM tuner, and a display. FM tuners have become big differentiators in the flash-player market in recent years; it happened because the circuitry to make them suddenly became really cheap, and not as such because of demand -- a matter of capacity converging with sub rosa desire, as it were. But I digress, again: Apple apparently doesn't think that matters, and I think I know why. They're planning to horn in on the ground floor with Satellite Radio integration into the Digital Media Center. (Mark this, that's their real next target. The micro-workstation market will expand under its own steam for a while; the next strategic play is getting XM Radio into the iPod Halo.) How they accomplish this is yet to be determined; as iTunes grows, they're increasingly integrated into the DRM fold, and it's a mistake to think that "Rip, Mix, Burn" was any more than a marketing strategy.

I can virtually guarantee that I will never own an iPod Shuffle. But it's important. And by all the accounts I've read so far, it was done contrary to Jobs's better judgement. But again, I digress....

[sic]: Memo to David Pogue at the NYT: Buttons are a user interface.
