"The fact that the winner gets to write history doesn't mean they made it."
Folks sure do some funny things to make other folks think they're hip.
"Toothing" seems to have been a hoax. At least, that's what everybody's stumbling over their shoelaces to declaim. ("Dogging", though -- which differs from toothing in kind only insofar as it doesn't have a "fake" name -- is apparently real. Unless Ste Curran and Simon Byron are going to claim credit for that, too. "Yes, you see, all hedonism is a great hoax. Nobody actually has anonymous or exhibitionist sex. We made that up and you're all rubbish for thinking otherwise.")
What interests me is not the feeding-frenzy around the original "hoax" so much as the feeding-frenzy around its exposure, as people and institutions race to minimize the damage to their egos.
I use scare-quotes on "hoax" because, while I don't doubt the story about how the term came to be, I also don't doubt that people do it. Because, of course, the fact that somebody made it up has more or less nothing to do with whether people actually did it -- after, or even before. But it's officially a hoax, now; ergo, anyone who "believes" it ever happened is a fool. (And, apparently, anyone who dares notice that the ironists behind it are prancing about naked is twice the fool. C'est la vie.)
There are some interesting things that often (if not usually) happen in the course of a big hoax.
I personally never "bit" on toothing in the sense of blogging about it or expressing moral outrage, etc. That's not because I immediately recognized how improbable it was -- quite the opposite. It was because I personally never found it that implausible. Aside from the fact that bluetooth messaging has had well-documented and unexpected use in ad hoc social networking [pdf], I've seen enough amateur hedonists casting about aimlessly in the "culture of death" that toothing doesn't seem that improbable to me -- certainly not in the realm of "throwing a brick at the dancefloor with a love letter attached, and hoping that the person it hits will agree to sleep with you."
I always reckoned the success rates for toothing to be in territory that a party-bar wingman (among the most troublesome of amateur hedonists) could wrap his sodden cognition around: "One in thirty, those are pretty good odds, bra!" Toothing would have poorer odds than one in thirty, to be sure; but the effort involved is less, too. Technology decreases the marginal cost. And as toothing became more "popular" (i.e., the "hoax" spread more widely), a greater proportion of amateur hedonists would leave their bluetooth open, and there'd be a substantial likelihood that toothing would actually work.
And in certain settings, it would most likely work really well. Think bathhouses....
So for me, what's really interesting is that in buying into the "toothing is a hoax" meme -- in accepting that the idea of anonymous sex mediated by text messaging was only ever a hoax cooked up by a couple of bored wankers -- we miss the opportunity to learn whether the activities described as "toothing" ever actually happened. That would be kind of fascinating: Where? How frequently? What were its etiquettes? What did it do to the spread of STDs?
I expect that we would find it's done in dark places with loud music and lots of intoxicants, by people who don't then go home and blog about it -- which would make it part of a world that hip young blogospherians like Byron and Curran often seem ill-acquainted with.
I believe the dead should be remembered as they were, not as we would have had them.
An atheist, raised Methodist, I've nevertheless always been somewhat in awe of the office of the Papacy. But I've often wondered whether that was due to some familial kinship -- some hearkening back to the origins of Christian churches -- or due to the man who has held the office for more than half of my lifetime.
The Papacy is one of the single most important political offices in the world. Make no mistake: It is a political office, and I would argue even more so than a religious office. So it's entirely appropriate, purely on those grounds, to attend to the death of Karol Joseph Wojtyla. And even if his office didn't bear the weight that it does, he might very well. His presence was forceful; one eulogist after another has spoken of his quiet but absolute confidence, and they seem again and again to be describing something like intellectual courage. Something like honor.
John Paul II has an enormously problematic legacy. His interpretation of dogma is arguably responsible for exacerbating the spread of AIDS, for the birth of tens of millions of unwanted children -- and, implicitly and indirectly, for tens of millions of abortions -- for tacitly endorsing sexual intolerance and squelching processes that could have led to greater and more rapid spread of freedom throughout Latin America. He appears to have been actively complicit in helping American Catholic dioceses to avoid the consequences of sexual abuse on the part of their priests.
And at the same time, he is arguably responsible for the collapse of the Polish communist state -- for toppling the first domino. His moral example in forgiving his own would-be assassin made a huge impact on me. And he found ways to reach across sectarian boundaries and to admit some of the failings of the church.
Nevertheless, his legacy is immensely ambivalent. Only time will tell whether the world can approach it objectively.
Imaginary people make much better martyrs.
Case in point: Terri Schiavo. The appeals are finally exhausted; Terri Schiavo is dead, unequivocally, unappealably. And we've just begun to see the consequences. Quite aside from the impending wrongful-death suit (which will be brought regardless of the results from the forthcoming autopsy, to be performed by a Jeb Bush appointee), the fight has catalyzed a constituency. It's given bullshit artists like Tom Delay (that old exterminator) a soapbox to stand on. Note, as we go forward, the endless repetition of their Big Lies: That the "American People" are behind the reckless Conservative-Republican adventurism; that the case shows improper involvement by the courts, instead of the courts doing their jobs by (perish the thought!) making judgements.
What was this case about? It certainly wasn't about whether one person would have preferred to have her body die; it passed beyond that threshold years ago. It passed beyond that when Bob and Mary Schindler concocted a "person" they called "Terri Schiavo", and identified her with their daughter, and pasted her face over their daughter's face whenever they saw her limbic-brained body in that bed. The "Terri Schiavo" that Bob and Mary struggled so hard to defend was not their daughter, but their dream of their daughter, or at least the best dream they could muster under the circumstances.
And she was a perfect daughter, in many ways: She didn't talk back, never contradicted their version of her life's narrative, never corrected their inventions about what she might be thinking at that moment. Or have thought when she was eight, for that matter.
It certainly wasn't about what the real Terri Schiavo's wishes might have been. What they are, I can't know, and I daresay Michael Schiavo can't know for sure. But judges have been evaluating the matter for seven years and not found a reason to suspect that she wanted her body to remain alive long after she'd lost the capacity to engage in detectable interactions with other people.
True, Michael can't have known for sure; but her parents -- surely they must have known?
Why? Why would we suppose that? My own parents wouldn't have the faintest idea what I'd want in such a situation. For practical purposes, they know nothing of real substance about me that they didn't know before I was eight. I could name four or five close friends, a handful of ex-lovers and seven or eight not-so-close friends who'd have a better idea.
So, no, it's got nothing to do with Terri's wishes. But it's got a great deal to do with how her parents imagine her wishes -- with the wishes of their fictional Terri, as it were.
And Jeb and George Bush's and Randall Terry's and Tom Delay's fictional Terri. Which is the real obscenity, here, of course. If it were just Bob and Mary, it would be a tragedy. And anyway, their version of Terri is at least based on something real. But with Jeb & George & Randall & Tom in the game, any hope of the real Terri Schiavo being remembered is completely gone. She's doomed to be immortalized as an abstracted martyr for the cause of eliminating secular justice.
My friend Lynne pointed something out to me a few days ago: Looking through the "recent posts" lists, she was able to deduce that this site had been getting comment spammed at a rate of about once every five seconds for a period of around an hour. All were flagged as spam; none of them made it through to human eyes. I was mildly impressed, for two reasons: First, that my site had handled the load; second, that someone had written a bot to attack that aggressively.
The day before yesterday, I stopped being impressed, and became furious.
I had pointed out to Lynne that the only real restrictions on how fast a comment-spamming 'bot could attack were the capacity of the web application and the speed of HTTP. Since HTTP is stateless, there can be any number of concurrent attacks in play; ergo, the server will most likely collapse first. That's what happened the day before yesterday, as the rate of attack shot up to more than once per second. The site buckled in about 20 minutes; it failed just as I was looking for the setting to turn off anonymous comments.
That's the roundabout way of explaining why anonymous comments are now switched off. The attack bots simply have no way to see the "post comments" link, for the moment -- they can't log in, so they can't comment.
I hate this solution. I'll be adding a captcha module and patching the comments module to use it, Real Soon Now ("in my copious spare time", as we used to say at Ziff-Davis Education). I have ideas about how to make the comment button harder for bots to see, but they require patching Drupal core, which I'm loath to do.
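For what it's worth, the simplest version of the "make the form harder for bots to see" idea is a honeypot field: render an extra input that CSS hides from human visitors, and reject any submission that fills it in. Here's a minimal sketch in Python rather than Drupal's PHP; the field name and function are purely illustrative, not from any real module:

```python
# Honeypot anti-spam check: the comment form includes a decoy field that is
# hidden from humans via CSS. Humans leave it empty; naive bots, which fill
# in every field they find, give themselves away.

def is_spam(form_data: dict) -> bool:
    """Return True if the hidden decoy field ("website_url") was filled in."""
    return bool(form_data.get("website_url", "").strip())

# A human submission leaves the decoy empty; a bot submission fills it.
human = {"name": "Lynne", "comment": "Nice post!", "website_url": ""}
bot = {"name": "x", "comment": "buy pills", "website_url": "http://spam.example"}
```

The Drupal-specific work would just be rendering the decoy field and wiring a check like this into the comment-validation path, which (unlike hiding the comment button itself) shouldn't require touching core.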
As Peggy points out, I guess I'll need to monitor enrollments more carefully. And as Lynne could tell you with regard to other sites, I'm not very good about that, anywhere....
Yesterday during a visit to Michigan State University, I had the opportunity to walk around the beautiful grounds. What a glorious spring day it was. As I neared Beaumont Tower, I noticed the large stump of an old oak tree that had been felled earlier this month for safety reasons. Acorns, now seedlings, had been collected from the 165-year-old tree and will be planted on campus.
Back in 1840, 165 years ago, as the black oak was beginning its earthbound evolution, Ralph Waldo Emerson and Transcendentalist friends gave birth to The Dial, a magazine for literature, philosophy, and religion.
Emerson wrote in a passage from the essay "New Poetry," in Volume I:
What better omen of true progress can we ask than an increasing intellectual and moral interest of men in each other? What can be better for the republic than that the Capitol, the White House, and the Court House are becoming of less importance than the farm-house and the book-closet? If we are losing our interest in public men, and finding that their spell lay in number and size only, and acquiring instead a taste for the depths of thought and emotion as they may be sounded in the soul of the citizen or the countryman, does it not replace man for the state, and character for official power?
Also, from Emerson's introduction to the new journal:
This spirit of the time is felt by every individual with some difference, -- to each one casting its light upon the objects nearest to his temper and habits of thought; -- to one, coming in the shape of special reforms in the state; to another, in modifications of the various callings of men, and the customs of business; to a third, opening a new scope for literature and art; to a fourth, in philosophical insight; to a fifth, in the vast solitudes of prayer. It is in every form a protest against usage, and a search for principles. In all its movements, it is peaceable, and in the very lowest marked with a triumphant success. Of course, it rouses the opposition of all which it judges and condemns, but it is too confident in its tone to comprehend an objection, and so builds no outworks for possible defence against contingent enemies. It has the step of Fate, and goes on existing like an oak or a river, because it must.
May the seedlings of the great oak prosper.
The Chicago Sun-Times' Jim Ritter reports:
A pharmacist's refusal to dispense morning-after emergency contraceptives sparked a Planned Parenthood protest outside a downtown Osco drugstore Tuesday.
The unidentified pharmacist apparently was exercising her right under the state's "conscience clause" law to not participate in a health care service that violates her beliefs.
What are "conscience" clauses?
Conscience clauses are provisions in legislation, regulations, by-laws and other governing instruments that permit individual medical providers, insurers, and facilities to refuse to provide services to which they are morally, ethically and/or religiously opposed. Conscience clauses may also allow the right to refuse to counsel or refer for services. ["Issues and Trends in Reproductive Health", Planned Parenthood of New York City]
Around thirty years ago, conscience clauses served primarily to protect individuals in private practice. With the proliferation of managed care and corporations, conscience laws are becoming broader in scope.
According to a recent article by Tresa Baldas in The National Law Journal,
many attorneys fear the government is on a slippery slope to allowing refusal for virtually any medical procedure, limiting access to practices and technology such as in-vitro fertilization, stem cell research and end-of-life practices.
I find this troubling, to say the least, especially in light of recent events in the Terri Schiavo case. It's disheartening to read about "providers denying pain medications in end-of-life situations" and "rape victims being denied information about emergency contraception pills."
Abandonment is becoming a litigious point. Some doctors and facilities not only won't treat patients on moral grounds, but they also won't refer them to someone who can help them.
It seems that not only do we need to protect ourselves with documents like living wills, but also it might behoove us to ask to see our medical professionals' moral beliefs in writing.
Many, many moons ago, I came out as an atheist, and my religious orientation hasn't truly wavered since. It would have hurt my parents less if I'd told them I was gay, I think. I thought about that this morning, as I read a message from my father. It seems that my childhood pastor and his wife are soon to celebrate their 60th anniversary. I'm thinking of sending them a note.
Ron Conklin played an important role in my coming out. As I think back on it now, the sequence of events is hazy in my mind. But this much I can definitely say: It came to a head over confirmation. In the United Methodist Church, children are confirmed at about the age of 13. It's their last year in "Sunday school"; after that, they're supposed to join the adults in the sanctuary. We had a deal in our family that we could make up our own minds about attending services once we'd been through confirmation class.
By that time, I'd been dubious about Christianity for at least three years, and had counted myself firmly as an atheist for at least two. I felt unclean every time I sat in the sanctuary and mouthed hymns or prayers, and had even begun to consider leaving the Boy Scouts because I felt dishonest reciting the Oath and Law.
So when June rolled around, and Confirmation time drew nigh, I elected not to be confirmed in the United Methodist Church. I told my parents why; they didn't like it. They implied I would be forced to confirm. I made it clear that I would not be.
As a compromise, I agreed to talk to Reverend Conklin. We met after school one day, at the church. As I recall, we didn't talk for long; I told him what I believed, told him that I felt it would serve no purpose to argue about it, and ultimately, though I'm sure it didn't make him happy to do so, he agreed that it would be dishonest for me to confirm if I did not believe in God.
I promised him in return that I'd come see him if I wanted to discuss my "doubts." He didn't demand that; he asked for it. I knew him well enough to know it wasn't a deal, but an offer. I've always been grateful to Ron for that, but then, that was the nature of his belief: Faith had to be freely chosen, or it was without meaning.
That wasn't the end of friction -- my parents tried to go back on our traditional deal regarding confirmation and church-attendance, and my mother still occasionally begs me to "reconsider" and tries to guilt me into Easter services whenever the logistics align -- but my meeting with Ron Conklin more or less forestalled a war. I've always been grateful to him for that, too.
The Conklins have been close to my family in the years since. Ron married all three of my siblings (even my agnostic brother Glen asked him to perform the service), twice travelling a day's drive to do so. And my parents have often stayed at their cabin in the Adirondacks.
There's another story that comes to my mind more often with regard to Ron Conklin, though. At the reception for my brother Glen's wedding, I stood to offer the traditional "best man's toast". I hadn't given it much thought, intentionally: I felt it would be more appropriate if more sincere, and more sincere if done off the cuff. So as I started to talk, I didn't know quite where I was going -- only that Glen and Sheila would want me to avoid cliches and embarrassing stories. I allowed myself to start out on the usual "I was there when..." journey, let myself wind into the story for thirty or forty seconds, and then stopped, suddenly, put on an urgent expression, and cried out: "Sheila, it's not too late! The window's open! He's a monster, run away now!"
Ron burst out laughing (along with everyone in the room except for the mothers of the couple), and called out, "It is too late, I've got your signature as a witness!" He later congratulated me: "You did the two most important things a best man is supposed to do, you made the couple feel good and you kept it short."
Dad tells me that the Conklins are living in the same area where I first knew them, where our church was -- his last church as a full-time pastor. I'm glad they're well. I'll never share their faith, but I'm glad that at least some people of faith have had people like Ron and Ruth to look out for them.
A thought for the moment:
The most fundamental problem with libertarianism is very simple: freedom, though a good thing, is simply not the only good thing in life. Simple physical security, which even a prisoner can possess, is not freedom, but one cannot live without it. Prosperity is connected to freedom, in that it makes us free to consume, but it is not the same thing, in that one can be rich but as unfree as a Victorian tycoon's wife. A family is in fact one of the least free things imaginable, as the emotional satisfactions of it derive from relations that we are either born into without choice or, once they are chosen, entail obligations that we cannot walk away from with ease or justice. But security, prosperity, and family are in fact the bulk of happiness for most real people and the principal issues that concern governments.
[Robert Locke, "Marxism of the Right", in American Conservative]
It strikes me that many people will find American Conservative to be an unusual venue for this kind of analysis. But this is Pat Buchanan's rag, and it bears his stamp; this is Buchanan Conservatism, speaking loosely -- the "Buchananite" camp (for lack of a better term) has always been something of a herd of cats, by comparison with their more ends-means-challenged fellow-travellers on the Republican Right, like Rove and Norquist. They prize analytical thinking and intellectual independence and integrity, though they're not above swallowing a bit of that independence to take one for the team on occasion.
I've argued in the past that Pat Buchanan is more or less personally responsible for the debased state of popular political discourse in America. But I like to think he'd have exercised more restraint if he knew it was going there. And while my parents will gleefully describe me as a "flaming liberal", I still regard myself as, in many ways, conservative (that's with a small-c); and while I think this analysis is largely spot-on and that American Libertarianism (that's with a big-L) is a bunch of dangerous humbug, I'm still very sympathetic to the libertarian ideals (that's with a small-l) of free choice and freedom from external restraint; still, I cut my political teeth reading William F. Buckley and Will Safire, and sitting in on meetings of the Executive Committee of the Saratoga County Conservative Party, and I did my time as an adolescent follower of the teachings of Ayn Rand. So at the least, I understand Conservatives better than most people I know. They fascinate me. But I digress, as usual.
It's an interesting analysis of Libertarianism, and overall I think it's correct. But Locke goes too far in two areas: He characterizes Libertarianism as having a "dogma that all free choices are equal", and of having "contempt for self-restraint." The first is his own straw-man version of Libertarianism, based on his assertion that when Libertarian views on free choice are taken to their "logical conclusion", they imply "... that a man who chose to spend his life playing tiddlywinks has lived as worthy a life as a Washington or a Churchill." While that might be true that Libertarians ought to hold that view, if they were being logically consistent, it's probably also true that the vast majority of people who call themselves Libertarians don't hold that view. So the correct criticism would be for inconsistency, not for equating all free choices.
On the second point he's simply wrong, as far as I can see, since it appears he's making a simple assertion and not even a conclusion about what Libertarians ought to think. Rather than having contempt for self-restraint, in my experience, most Libertarians assume it. Which, to be fair, can end up having the same effect: Children are not taught such techniques of self-restraint as delayed gratification or imagining the consequences of their actions. (There's that imagination thing, again...)
Perhaps by "contempt", he means "neglect." And it's certainly not a spirit that's restricted to Libertarians; American popular culture -- well, really, all modern consumer culture, as well as our entire world economy -- is really predicated on the statistically "wealthiest" populations exercising as little self-restraint as they can while still retaining their capacity to spend capital.
In closing, here's a thought for the Grover Norquists of the Conservative world:
[Libertarians] often confuse the absence of government impingement upon freedom with freedom as such. But without a sufficiently strong state, individual freedom falls prey to other more powerful individuals. A weak state and a freedom-respecting state are not the same thing, as shown by many a chaotic Third-World tyranny.
To be honest, though, I'm giving Norquist credit I don't believe he deserves. I don't sincerely think he's at all interested in idealized Libertarian freedom, but rather primarily in power for its own sake -- and only secondarily in power as a means to the end of his doctrinary agenda.
It's as though only blood will satisfy Texans. They seem to be largely outraged at the Supreme Court's recent decision regarding the execution of minors. From Morning Edition this morning [RealAudio], I heard again and again that these convicted murderers are somehow not being punished because they won't be executed.
As though they're getting let out of jail, instead of having their sentences commuted to life in prison. Justin Wiley Dickens, convicted of murdering a man in an Amarillo pawn shop robbery when he was 17, is puzzled by the outrage at the ruling. "This is hell. It really is. I can't understand the outrage of them saying we don't be executed, we're just goin' to another life of hell. They ain't never gonna let any of us out. Life sentence means a life sentence. And I pray for Jim Jacobs and Francis Carter's families, I just live every day with regret, I really do. Just tell them I'm sorry. If you would."
Justin Wiley Dickens's case is an interesting illustration of this bloodthirstiness: The shooting happened during a struggle over the weapon, under circumstances where it's unclear that Dickens engaged in any meaningful premeditation. In other states, this might have been second-degree murder, or even manslaughter.
But not in Amarillo, because in Amarillo, the District Attorney knows what's in the criminal's heart: "I got to know Justin Wiley Dickens very well, in that trial", says Amarillo DA James Farron. "If you have something he wants, and he has to kill you to get it, he'll kill you in a heartbeat, I assure you. You, me, anybody else." That's not a particularly Texan attitude for a DA, of course, but it is a particularly DA attitude. Criminal DAs generally take the line (at least publicly) that everyone they've ever prosecuted was guilty, regardless of the verdict (or the evidence, for that matter).
(For what it's worth, Amarillo attorney and adult death penalty supporter Russ Bailey, who was assigned to defend Dickens, disagrees strongly with Farron's assessment: "Justin in my opinion did not have the requisite intent. He was not an adult for any purpose in my opinion at that time. He was a nice kid....Most of these kids don't have any control over their lives. Justin didn't have any. He never stood a chance. And to throw away a life before they've even tried to live their own is a real tough thing to accept. It was for me for Justin. ")
People can be great at missing the point, though, especially when it's in their interest to do so. Farron, for example, sees the Supreme Court ruling as a statement that "all 17 year olds" are decision-impaired: "It is simplistic and sophomoric to suggest that we can draw a line in the sand and announce that everybody younger than this many days is immature, unable to make decisions the same way that you and I do -- is that true of some 17 year olds? Absolutely. Is it true of most 17 year olds? Probably. Is it true of all 17 year olds? Absolutely not." Farron, for his part, seems to think that "it's true" of at least a third of 17 year olds: one third of Farron's own death row convictions are under 18.
Of course, it's simplistic and sophistic (and most likely Frankfurtian bullshit, to boot) for Farron to suggest that's what the Supreme Court ruling was meant to establish. As legal language goes, the decision is really quite plainly worded; if Farron really believes that's what they meant, he should be disbarred for incompetence. To quote Justice Kennedy's opinion [pdf]:
.... An unacceptable likelihood exists that the brutality or cold-blooded nature of any particular crime would overpower mitigating arguments based on youth as a matter of course, even where the juvenile offender's objective immaturity, vulnerability, and lack of true depravity should require a sentence less severe than death. When a juvenile commits a heinous crime, the State can exact forfeiture of some of the most basic liberties, but the State cannot extinguish his life and his potential to attain a mature understanding of his own humanity. While drawing the line at 18 is subject to the objections always raised against categorical rules, that is the point where society draws the line for many purposes between childhood and adulthood and the age at which the line for death eligibility ought to rest. ....
Which means that yes, they understand 18 years (or 6574 days, if DA Farron prefers) is an arbitrary cut-off date; but then, arbitration is their job. Part of that job means that they have to act, sometimes, when the demagoguery of some local politicians, or the particular popularity of some victim (as in the case of Justin Dickens) exacerbates local bloodlust.
Put another way, the point of the ruling was that elected or politically-appointed operatives like Farron ought not be trusted to turn American civil society into a cruel myth, ever-invoked but seldom attained. We've already let federal demagogues do that with anti-terror statutes that effectively permit the abrogation of basic constitutional rights to free speech, habeas corpus, and freedom of association.
After reading Daniel Schorr, writing at the CSMonitor, I'm left wondering whether he's getting sloppy. It's not that he thinks George W. Bush may have "gotten it right" when he said that "a liberated Iraq can show the power of freedom to transform that vital region." (Though Schorr doesn't address whether Bush ever really cared about whether he was right -- i.e., whether or not the President is a bullshit artist.) It's not even that he thinks the so-called "cedar revolution" underway in Lebanon has some causal connection with our disastrous liberation of Iraq. It's that I can't figure out why he'd think that.
It's just not a view that makes sense. Why should Lebanese (or Egyptians, for that matter) be positively inspired to seek political freedom by the images of American military dominance in Iraq? Fear, perhaps, that the US would make them the next example, even though the Egyptians have been our partners in crime for many years and Lebanon doesn't seem to capture the American political imagination anymore (if it ever did). Schorr nevertheless thinks that some mysterious "Iraq effect" is inspiring Lebanese to drive out the Syrian "security" force. He thinks that Iraqi "freedom" must be what's inspiring these people.
He thinks this, in spite of the fact that the "cedar revolution" bears a much closer resemblance to the Czechoslovakian "Velvet Revolution" of a generation earlier (not to mention contemporary actions in Poland, East Germany, Romania and Hungary). Or, for that matter, to any number of popular uprisings throughout the world, successful or not, in the decades since.
Or, more to the point, Schorr could consider the electric shock that seemed to go through the Arab world with the death of Yasser Arafat and the re-emergence of Abu Mazen as a popular political force, thanks in no small part to the weariness of ordinary Palestinians.
The only explanation I can think of for this failure to see other, far more likely causes, is the common and highly ethnocentric (if not frankly racist) view that "the Middle East" is somehow different, its peoples and nations somehow less cultured and civilized, and certainly less aware of world events. They don't know about popular protests in the Ukraine; they never learned anything about European or Asian or New World history, so they don't know about Tiananmen Square, the popularly-inspired renaissance of South Korean democracy in the late '80s and early '90s, or the popular groundswell that saved nascent Russian democracy (for a while) from "counter-revolutionary" forces. Those ignorant Arabs must not understand any of that stuff. This is the Middle East, after all. It's different there.
Perhaps it's just a generational thing; we're past the time, maybe, when we should expect any foundation in history from reporters and "news analysts". Of course, an "analyst" as seasoned as Schorr doesn't have that excuse: He's "analyzed" every significant popular uprising since the fall of the Iron Curtain, and was reporting the news way back when Lebanon was one of the most beautiful and culturally vital nations on the Mediterranean.
"One who is concerned to report or to conceal the facts assumes that there are indeed facts that are in some way both determinate and knowable. His interest in telling the truth or in lying presupposes that there is a difference between getting things wrong and getting them right, and that it is at least occasionally possible to tell the difference. Someone who ceases to believe in the possibility of identifying certain statements as true and others as false can have only two alternatives. The first is to desist both from efforts to tell the truth and from efforts to deceive. This would mean refraining from making any assertion whatever about the facts. The second alternative is to continue making assertions that purport to describe the way things are but that cannot be anything except bullshit."
-- Harry G. Frankfurt, from "On Bullshit" (in The Importance of What We Care About, and as quoted on wrongheaded)
"I had a guaranteed military sale with ED209! Renovation program! Spare parts for 25 years! Who cares if it worked or not!"
-- 'Dick Jones', Robocop
I've been thinking a lot lately about a problem, a phenomenon, a type of behavior that I've described for myself as "po-mo ironism". It's a way of keeping a sense of ironic detachment that lets you criticize something as you valorize your decision to participate in it -- for example, carefully noting the defects and problems of SUVs while arguing that your decision to buy one is, nevertheless, virtuous.
It's seemed to me that there's something more than mere disingenuousness at work, here, and something more than simple ironic detachment, as well. Now I have a simple, concise word for it: Bullshit. And a framework to hang it on. Bullshit could be said to be the basic "truth-process" underlying lysenkoism, disingenuousness, ironic detachment, self-delusion, and a host of other discretely-named ills.
Bullshitting, according to Harry G. Frankfurt, is far more insidious and corrosive than lying, though they might superficially seem to be synonymous. Something can be true and still be bullshit, if the purveyor of said excrement never cared whether it was true. So, for example, even if we choose to believe that Saddam really did want to buy Yellowcake in Africa, the infamous "16 words" are still bullshit, because Rice, Bush & Co. never actually cared whether they were true.
What makes it bullshit, in other words, is not that they lied -- they might even have believed it was true -- but rather that if they hadn't found evidence, they would have made it up. (Which, in fact, they did; that Saddam might have actually tried does not alter the fact that the Bushites ["Bull-Sites"?] fabricated their evidence.)
Because what mattered to them was not whether it was true, but whether they had the evidence to support doing what they wanted to do: Go beat up the guy who made George's daddy look silly.
Frank Rich of The New York Times is speaking out about News in "Gonzo Gone, Rather Going, Watergate Still Here":
Two weeks ago Hunter S. Thompson committed suicide. Next week Dan Rather commits ritual suicide, leaving the anchor chair at CBS prematurely as penance for his toxic National Guard story. The two journalists shared little but an abiding distaste - make that hatred in Thompson's case - for the Great Satan of 20th-century American politics, Richard Nixon. The best work of both was long behind them. Yet memories of that best work - not to mention the coincidental timing of their departures - only accentuate the vacuum in that cultural category we stubbornly insist on calling News.
What's missing from News is the news.
I just grabbed my old copy of Dan Rather's book from 1977, The Camera Never Blinks, recounting his adventures as a TV journalist. Yep, there he is, with his photo on the jacket of the book, long sideburns and all. As I read the front inside flap, I come across these words:
... a professor had assigned Dan and his fellow students the works of a courageous broadcast journalist. "For the working reporter," the teacher demanded, "what is the most important thing Elmer Davis said?" There was silence in the classroom. Finally the teacher told them: "Don't let the bastards scare you."
Now I need to find another passage I remember so well after reading this book many years ago. Here it is at the end of Chapter 13:
It pleases me to imagine what Abraham Lincoln would have been on television. I suspect he would have been superb. He had the wit, the power of language, honesty and humanity, and all that would have shone. Lincoln held news conferences, although they were not given that name. He simply kept an open door.
You have to pinch yourself to remember what the presidency was like in Abe Lincoln's time. No fence around the White House. People coming off the street, lounging in the hallway, waiting to see him and frequently doing so. He walked the grounds freely, often accompanied only by his secretary.
It is not possible, nor desirable, to go back to 1861. But stripping away some of the royal trappings around the presidency would be helpful. Whatever its faults, and they exist, television is one of the tools.
There is no cause to get emotional, but I am an electronic journalist. I think in those terms, and I like what I do. Ed Murrow once said, "Unless you care about it, unless you really work at it, all you have is wires in a box."
Ironically, "royal trappings around the presidency" lately have included some fake newsmen.
Frank Rich concludes:
As for Mr. Rather, he gave a valedictory interview to Ken Auletta of The New Yorker in which he said, "The one thing I hope, and I believe, is that even my enemies think that I am authentic." The bar is so low these days that authenticity may well constitute a major journalistic accomplishment in itself.
While I can't say it surprises me, it's amused me for a long time that people like Matt Drudge think Chris Rock is "dangerous." Dangerous to what, I wonder. Perhaps to their complacency. Certainly not to "family values" or "common mores" -- not if you're paying attention. I am a bit surprised, though, to hear him referred to as a 'William F***ing Buckley Conservative'.
Years ago, I saw Chris Rock on television -- probably HBO, probably "Bring the Pain", but I don't remember exactly. I do remember a bit he did about men cheating on their girlfriends. He started by signalling his intent -- showing the club, as it were: "Men are stupid. Because you know you're gonna get caught." He does it to be fair, maybe, or maybe to prove that even after he's signalled that he's going to drop a hammer on them, he'll still sucker the men in the audience in. Which he proceeded to do, describing how natural it was to want to cheat, how easy it should be to lie -- if, of course, men weren't stupid. And, more important, if they didn't know damn well they deserved to get caught.
Rock is a stealth moralist. He's a preacher to the pop-cultural -- a wandering rabbi or imam, ministering to the barflies by telling stories in terms they can understand and that will elicit enough of their empathy to make them stretch their minds and consider their world. It's an old and proud tradition (as old probably as storytelling), realized with a wide array of techniques. My favorite contemporary example is Matt Groening's marvellous creation, The Simpsons, which panders to our baser instincts and then springs the trap on us by making its characters renounce their ill-gotten gains in the service of What's Right. Rock's technique is similar: He lulls his audience into a false sense of security, and then explains in quick, brutal strokes that anyone he's suckered is a fool. And a morally depraved fool, at that.
What Rock's not is a 'William F***ing Buckley Conservative', as John Swansburg seems to think he is. Raising your daughters to not be strippers, or suggesting that single mothers ought to put their children before Girls Night at least most of the time, or suggesting that abortion might be a little too cavalierly chosen, are not "Red State Red" conservative positions: They're mainstream moderate American moral positions, shared by the vast majority of adult "Blue" and "Red" state residents, and anyone who suggests otherwise is buying into the Republican framing myth that holds that True American Values are right-wing religious conservative values. They're not actually religious values at all, and what's more, they're not communicated via religion -- at least, not in a healthy, functioning society they're not. In a healthy family, they're communicated by example. Children learn responsible and moral behavior by watching their parents, their extended family, their neighbors, and the people they meet in daily life.
What Rock is, is Lenny Bruce with tamed demons, or George Carlin with more integrity. Any good comic keeps a few demons in the closet to feed him material; but if they're smart (and probably a little lucky), then sometimes, just sometimes, they learn to tame them without losing the energy the demons feed them.
Chris Rock is also another important thing: He's a professional, just like Whoopi Goldberg or Billy Crystal. Personally, I expect his impact on the quality (moral or otherwise) of the show to be positive.
One of the most egregious failures of imagination that I see every day is what looks very much like an inability (or more likely an unwillingness) to stretch the mind to understand what a story is trying to tell you.
And what a story is trying to tell you isn't some single, specific thing. If it is, it's a bad story -- maybe even a false story. Good stories -- "true" stories -- are like a thought-experiment: "What would happen if someone did this?" I am increasingly convinced that stories are how humans are wired to make sense of the world. Stories are why we have advanced language skills: Better language made for better stories, better stories made for a more survivable community, and so on.
If a story has consistent, valid story logic and character logic -- if the characters behave in ways that make sense for those characters, in that circumstance, to behave -- then we can safely say that there's at least some truth in it. If the story is powerfully told, so much the better: Without good telling, we won't stretch ourselves to find the empathy we'll need to make the narrative talk to us. This is what "great" makers of narrative (those folks we call "writers", but also film-makers, poets, songwriters, painters...) have always done.
So Medved is not only being reductionist on this point, he's being a bad critic, because he's approaching criticism without imagination. He's looking at the film as though it's some kind of a morality-machine, and any good film -- any good story -- is something more than that. It's a narrative, from which we can draw a deeper understanding (that is, if its story logic and character logic are true).
Anyway, I don't have high expectations for a review from anybody who's expecting to find a clear moral universe in an Eastwood film. Think of Unforgiven (endorses prostitution and lawless behavior), Midnight in the Garden of Good and Evil (romanticizes gay sex and murder and promotes an anti-christian agenda through endorsing voudoun), Bridges of Madison County (glorifies adultery), or probably any of his other films from the past 15 years. There is a theme there, though, I think, and it's that Clint Eastwood lives in an increasingly vague moral universe these days. The only things that seem to be certain in Eastwood's moral universe are pain and love. (And there are worse absolutes to fixate on. Power, per se, for example, has no real moral endorsement in Eastwood's vision -- it's a fact, to be sure, but it's always in service of love or pain. But I digress...)
What these films can help us to understand is that a vague moral universe is not an amoral one. Every Eastwood picture that I can recall (aside from his forgettable late Dirty Harry outings, done to win studio backing for future projects) has been driven by its moral choices. His characters do not serve as moral models; rather, they model moral behavior. There's a crucial difference: The first means that they are merely shadows on the cave wall, cast by the contorted hands of a finger-puppeteer; the latter allows us to imagine ourselves in that world, and consider the choices we would make.
But be honest about it. Don't just pretend. That's cheating, and lying, and it makes Baby Jeebus cry.
I'm increasingly convinced that the greatest roadblock to human progress is lack of imagination. More particularly, the inability -- or unwillingness -- to imagine oneself in the position of another.
The problem can look like other things: Like (selective) literalism, as when someone like Michael Medved or Ted Kavanau can see nothing in Million Dollar Baby but a "pro-euthanasia" or "anti-christian" tract. Or it can look like lack of empathy, as when wannabe uber-geeks dismiss the problems of "lUsErS" as being of their own making, or knee-jerk free-will zealots (willfully?) ignore the benefits they accrue from being members of civil society to rip out one of that society's underpinnings.
That empathy requires imagination I regard as self-evident; that people who lack empathy literally lack imagination, I regard as open to question. As a friend remarked to me recently, "it's all about what's at your front door."
I'm sitting here reading the morning news while sipping my second cup of hot, black coffee, which by the latest study seems to be a good thing to do. I have to admit though that while reading the news today, chances of esophageal reflux might outweigh any possible early onset of liver cancer.
The new CIA Director Porter Goss is feeling his oats with the help of Rumsfeld and friends in spreading fear, albeit "offering few specifics."
U.S. Secretary of State Condoleezza Rice has been looking quite grim as tension mounts between the U. S. and Syria.
Michael Jackson has the flu. (What. Ever.)
James Guckert, a.k.a. Jeff Gannon, former White House conservative correspondent, is "Bush's Barberini Faun."
NYT Maureen Dowd raises a good point:
I'm still mystified by this story. I was rejected for a White House press pass at the start of the Bush administration, but someone with an alias, a tax evasion problem and Internet pictures where he posed like the "Barberini Faun" is credentialed to cover a White House that won a second term by mining homophobia and preaching family values?
But, see, Ms. Dowd is submitting to critical thinking. Any attempts at understanding might be better achieved by first descending into the gross, cavernous mindset of ethical inconsistency, where convenient "reality" is created.
"Modern man just got older"? Now there's a story. I wonder if some spinmeisters will try to creatively cram 195,000 years into 6,000 years...
The study showed about 23 percent of ninth-grade girls, typically 13 to 14 years old, had sex before receiving abstinence education. After taking the course, 29 percent of the girls in the same group said they had had sex.
Boys in the tenth grade, about 14 to 15 years old, showed a more marked increase, from 24 percent to 39 percent, after receiving abstinence education.
As the Detroit Free Press's Susan Ager quipped,
A popular pundit, Williams can't be trusted anymore, but that's only because his behavior surprised us.
Mine won't surprise anyone because I'm telling the world: Anyone can now buy positive exposure in my column.
Take out the element of surprise; be upfront. At least we'd know what we're getting: bought and paid-for mouthpieces, supporting the (sometimes "moral") causes of others.
I am sick of anti-abortion proselytizers seeking government support. I'm so sick of them that I risk becoming a proselytizer myself in defense of a woman's right to make decisions about her own body.
On this front, Hillary Clinton is blasting the Bush administration while seeking common ground with anti-abortion groups. Others, throwing the "moral" ball back in Bush's court, are demanding moral clarity.
Common ground. Moral clarity.
The government as warden of a woman's womb is a downright messy business.
How can I remark on digital convergence without remarking on the forthcoming "headless iMac"?
More to the point, what the hell does a "headless Mac" have to do with digital convergence?
I'll explain. Gizmodo facilitated leaking a bunch of really convincing (to me) product unpacking shots of a device called "iHome", which has a buttload of ports on the back and a CD-ROM slot on the front. Alas, there's lots of smoke and steam on the Apple rumor forums to the effect that these must be fake, because the box is just so ugly. Apple's legendary industrial design staff surely couldn't have produced something so "fugly". (Um...right. Something about this presentation really offends Mac-heads, as is clear from the Engadget comments, but I'm not sure what.) But consider that any box unveiled now is most likely not a production version, and might well be camouflaged the way Detroit camouflages their long-range test models.
Be that as it may, and leaving aside the authenticity of the photos, the name would tell us volumes about how Apple sees the market-positioning of this device, and I believe they do not see it the way that 'Bob Cringely' sees it:
.... The price for that box is supposed to be $499, which would give customers a box with processor, disk, memory, and OS into which you plug your current display, keyboard, and mouse. Given that this sounds a lot like AMD's new Personal Internet Communicator, which will sell for $185, there is probably plenty of profit left for Apple in a $499 price. But what if they priced it at $399 or even $349? Now make it $249, where I calculate they'd be losing $100 per unit. At $100 per unit, how many little Macs could they sell if Jobs is willing to spend $1 billion? TEN MILLION and Apple suddenly becomes the world's number one PC company. Think of it as a non-mobile iPod with computing capability. Think of the music sales it could spawn. Think of the iPod sales it would hurt (zero, because of the lack of mobility). Think of the more expensive Mac sales it would hurt (zero, because a Mac loyalist would only be interested in using this box as an EXTRA computer they would otherwise not have bought). Think of the extra application sales it would generate and especially the OS upgrade sales, which alone could pay back that $100. Think of the impact it would have on Windows sales (minus 10 million units). And if it doesn't work, Steve will still have $5 billion in cash with no measurable negative impact on the company. I think he'll do it.
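Cringely's back-of-the-envelope can be checked in a couple of lines (the figures are his, from the quote above; the variable names are mine):

```python
# Cringely's quoted numbers: a box sold at a $100-per-unit loss,
# funded out of $1 billion of Apple's cash.
loss_per_unit = 100          # dollars lost on each unit at the $249 price
war_chest = 1_000_000_000    # the $1 billion Jobs is "willing to spend"

units_funded = war_chest // loss_per_unit
print(units_funded)  # 10000000 -- the "TEN MILLION" units in the quote
```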
I see it different[ly].
Nobody's talking yet about what the iHome actually does have. Rumors abound, and they mostly assume it's basically an iBook without a display. I don't buy it.
The very name of the device indicates to me that iHome is not intended to be used as a general purpose computer in any really sophisticated way. It's intended as a media hub, and any other functions it fulfills are incidental, and what's more, Apple won't be enthusiastic about helping it fulfill those other uses: It will most likely be a mediocre platform for applications work. It will be somewhat more than a set-top box, only because it would cost more to dumb it down. (If I'm proven wrong, I'll certainly be taking a look at iHomes for my own use, but I don't think I'm wrong here. We'll see in a few days.)
I think it will be somehow substantially crippled, and I think I know how: It will have limited display capability, outputting by S-Video and composite only (and the latter through an extra-cost converter from S-Video); and it will not have expandable RAM. Both decisions will be defended on the basis of price, but they'll really have been taken to prevent cannibalizing iBook, iMac and eMac sales. By the way, I essentially agree with Cringely's analysis of the market impact of a fully-capable and cheap iHome, but I think he's applying a much too rational (and charitable) thought process to Apple's senior management.
I think Jobs doesn't know what to do with iTunes. It's a juggernaut he doesn't know how to stop; it's prompting people at his company to actually think about ideas that could shake up the personal computing marketplace, like, say, a genuinely cheap computer with a powerful OS and operating environment. Baseline Macs are built with remarkably inexpensive electronic components: Many still use relatively slow and old versions of the PowerPC chip (the "G4" generation), which by virtue of their vintage are dirt cheap; the "G5" models mostly use relatively slow versions of that chip, and below the most expensive levels, they all use graphics subsystems that are last year's news on PCs. Macs are cheap, cheap, cheap to build. And yet, they're hideously expensive on a bang:buck basis.
If Jobs wanted to really go big, he could have done it years ago. Opportunities like the one that Cringely describes are always there for Apple, all the time. And they never take them. Why? The only answer that's compelling to me is that Steve Jobs does not want Apple to be successful, because that would mean that Apple was no longer about him. Sure, the cult of personality would flourish for a while, but I think he understands that part of his bizarre public lovability is the fact that his exposure is limited. He'll never be as much of a self-caricature as Steve Ballmer or Larry Ellison, but the tarnish would settle pretty quickly and Apple would quickly become beset by the woes of any company that moves beyond a customer base comprised primarily of true believers.
So Cringely's right, I think, about the opportunity, and he's right about what iHome is, but I think he's wrong about what Apple will do with it. And though I predict that Jobs will be accused of not taking these steps out of greed, I think his motivation will be darker: Ego. Though I suppose the Dark Steve's flavor of ego could be cast as a kind of greed....
Why does anyone still trust Apple? I suppose it could just be that they don't pay attention. Maybe it's that they love a bully, especially when the bully speaks and looks so fair. Apple is one of the great counter-arguments to the wisdom of the Cluetrain: They keep their customers in the dark and feed them nothing but cheap wine and communal wafers, and yet they're worshipped for it.
Last week, ThinkSecret fronted a rumor that Apple would be announcing a sub-$500 "headless" Macintosh at Macworld Expo on 1/11. They also slipped in a mention, which I somehow missed, that Apple was working on an office suite to compete with MS-Office for OS X.
So, naturally, Apple is suing them [via Gizmodo]. Said ThinkSecret was revealing "trade secrets". Seem to think that the stuff ThinkSecret is putting up on their website might somehow cause Apple harm. For example, maybe Microsoft doesn't already suspect that Apple is canoodling with KDE to produce an OS X-customized fork of KOffice. (KDE has already got the whole suite working natively under Aqua.) Maybe now that Microsoft knows, they'll conjure some nefarious plot to destroy Apple once and for all. Or not.
And as though suing ThinkSecret didn't just confirm at least one of the rumors.
Now, if the Cluetrain Manifesto told the whole story, Apple would be toast thanks to hijinks like these. Their hardware is expensive and slow, the software is more expensive and there's less of it. And on top of that, they treat their customers like marks to be manipulated and jerked around. On the other hand, Apple products come in whatever color you want. As long as it's white.
Friends and acquaintances know that I've considered buying a Mac for a while now, so I can move away from Windows while still having access to high-quality design and graphics tools like PageMaker and Flash. Much as I love the idea of scoring a slightly used PowerBook, a $599 desktop Mac would be a nearly ideal solution. But the Dark Steve just keeps making it hard for me to switch. At least Bill Gates and Steve Ballmer don't pass themselves off as nice guys.
Where this mania for secrecy stems from, I could only speculate. It's apparently new since Jobs rejoined in the late '90s, and since Apple more or less exists for the sole purpose of making Steve Jobs feel like a big man, my first guess would be that it sources back to him. In any case, it's a brilliant piece of crazy-making. They have to grok very deeply that their true believers will love them even more for this, and that once a convert has drunk their koolaid, going over to Windows is unthinkable. (Why, that would mean feeling uncool....)
The Business Blogging Boot Camp (@ Windsor Media) provides a more bottom-line perspective on the growth of blogging, driven by Fortune's 1/10/2005 feature story on technology trends; their observations came to me as part of an email thread related to the BBC story I mentioned last night. They stress the importance of blogging for business, and furthermore the importance of blogging earnestly. They cite the Kryptonite affair and moves toward blog-monitoring by Bacon's Information -- the latter characterized as tentative, "inane", 'Not Getting It.' (I'm usually leery when a huge quant-marketing shop is characterized as Not Getting It. Often it's true, yes; but as far as I can see they often understand a lot more than they bother to explain to us proles. But I digress.)
There are two things I feel compelled to point out before going further: First, blogs are qualitatively analogous to specialist newsletters, which are nothing new to savvy marketers. As with specialty newsletters, the influence of a blog hinges on a subtle balance between the publisher's access to information, their (perceived) personal integrity, and the volume (direct or indirect) of their readership. What's new is the speed of blogging. I'm leery of pointing out emergent qualities, but it's hard to argue that a ten-day cycle time doesn't indicate that (a lack of) quantity may indeed, in this case, have a quality all its own.
The second thing I feel compelled to point out -- and this is both much more and much less important than it sounds -- is that the Kryptonite business not only didn't start on blogs, but didn't get its first traction there. It started on the cycling boards, and that's where it was hashed out, refined, debugged, and researched, and where the first instructional videos were posted, before it was ever reported on a blogospherically-integrated weblog. Some of these bicycling boards are almost as old as the web, and most have many members who trace their net-cred back to Usenet days. My point being that anyone focusing only on blogs as such is setting themselves up for obsolescence. Blogs as they are, are almost certainly not blogs as they will be.
Anyway, Windsor Media's take is largely blogospheric orthodoxy. And in practical terms, it's probably right: The important thing for businesses to do right now is to make it part of some people's jobs to go out, and read and post like humans. But there's a second thing that not only needs to happen, but will happen, and what's more will be enabled by the first: Smart businesses will take steps to understand how the blogosphere works, and how it can be gamed, and then they will go forth and game it. And it will work. The knowledge required will come from a few main sources: From big outfits like Bacon and free-range old-school marketing pundits (who will keep it to themselves and share out bits of wisdom to key clients); and from less old-school marketing pundits like Darren Barefoot and BL Ochman, and from product evangelism folks at big companies (who as a group will tend to share it on their blogs, undercutting Bacon et al's old-school attempts to make money off consulting). And, perhaps most important of all, it will come from research in social network analysis. More on that another time.
Blogging will be gamed by corporate and business interests, make no mistake about that. Because it can be, and is being, gamed. It happens every day. And, contrary to the blogospheric orthodoxy, the broader the cross-section of people who get involved in blogging, the easier it gets to game the system without looking like a weasel. And if the golden rule of capitalist systems is that money wants to make more money (and I'm pretty sure it's something like that), and if blogging has an impact on the growth and flow of money, then money will drive blogging, and blogging will get gamed.
Now I'm getting into blogging heterodoxy. The conventional wisdom on the blogging ethos is very cluetrain, and in fact, the Kryptonite affair does indeed show a real "cluetrain" cause-effect loop. I missed it at the time because I just didn't tune in to the story, but the folks at Fortune and Windsor Media are right about that: The ten-day problem-to-product recall cycle at Kryptonite was characterized by all the corporate communications failures criticized in the Cluetrain Manifesto. It just took a lot longer for this first clear case to emerge than either they or, frankly, I thought it would.
The orthodox position is that the more people get involved in blogging, the harder it gets to game the system. It's a variation on the open-source golden rule of debugging ("Given enough eyeballs, all bugs are shallow"): "Given enough eyes, all misinformation will be found." But open-source debugging works (when it works, which it often doesn't, but that's another story) because the "eyes on the code" belong to people who know how to spot a problem, and have the capability to affect it more or less directly. In blogging, the "eyes on the information" are often people with little or no real expertise. Much of what they spout is nonsense.
And yet, it's effective.
The blogosphere shifts like a body of water: Fast, and irresistibly. Part of the reason that happens is that the blogging community is comprised largely of small communities with large enough membership to make an impact, and what's more, those communities overlap: PoliBloggers are tight with techbloggers who are tight with lifestyle bloggers who are tight with polibloggers.... So when the loop has looped a few times, we find that a relatively small group of people can pretty reliably and rapidly shift the character of the blogosphere. But as the blogosphere becomes larger, it grows more statistically homogeneous, and small communities of movers will not have the same kind of predictable results anymore. Then it will seem less like water, and more like mud.
But I digress, again. I started this to talk about gaming the blogosphere, and that this will happen, I do not doubt for an instant. There's a lot of money riding on this, after all. Some people will figure out how to game the blogosphere -- to game the cluetrain. It will be a painful process with lots of false starts, but we are well beyond the beginning of the process. It started long before the Kryptonite affair; if I had to pick a point in time, I'd pick the consolidation of successful blogs like Wonkette, Gizmodo and ... under the Gawker Media banner. Gawker sells lots of ads, gets lots and lots of daily eyeballs, and their more overtly commercial blogs (like Gizmodo and Jalopnik) have pull with the product managers by virtue of the fact that they can say things like:
What consumers want -- an out-of-box way to share and transmit files between different storage media and computers (and users) -- is exactly what manufacturers don't want to give them, but they'll tease us a little. So, if you're really rich, DigitalDeck Entertainment Network is busting out an in-home network PC-to-gear-to-DVD sharing system that costs $4000 - $5000. It probably consists of a bunch of cables and a universal remote that your geeked-out younger brother could hack together himself.
And so, we've come back around again to the specialist newsletter: I take Gizmodo seriously (and I confess, I do read it more or less every day) because I see things like this that indicate to me that they bother to think a bit about what they're reviewing. They have credibility for me because they speak not merely in a human voice, but in one that says credible things. And they have the benefit of comprehensiveness because somebody (namely, Gawker Media) is paying them to do nothing but that.
And by the way, at some point does it stop being "blogging" and start being journalism? Open question, IMO.
It's just that relatively few people have realized it, yet. As I so often say: When there's big money involved, the alternate modalities will be co-opted. (Or crushed.) Even more than information wants to be free, money wants to make more money. We're now sitting in that fragile cusp (oh, hell, we may be past it) where the "winners" of the next gold-rush will be decided. It's not a huge gold rush -- not yet -- but in its own way, it will be just as hokum-driven as the dotcom boom.
I know this because I bothered to do some simple math with numbers in a news story about American blogging habits. From Britain, of all places. A friend pointed me to the BBC's obligatory popular rundown on what a blog is and why their readers should care, combined with a little bit of exoticism regarding us cousins. The article relies heavily on a report from the Pew Internet and American Life Project; it's thin on details, but they do provide us with a helpful bullet list in their sidebar:
- Blog readership has shot up by 58% in 2004
- Eight million have created a blog
- 27% of online Americans have read a blog
- 5% use RSS aggregators to get news and other information
- 12% of online Americans have posted comments on blogs
- Only 38% of online Americans have heard about blogs
By implication (according to the sidebar), of Americans who've heard of blogs (38% of online Americans), 71% have read them (27% of online Americans -- 27/38 = .71); and a bit less than half of those have gone on to post comments (12% of online Americans -- 12/27 = .44). (Less interesting, but more impressive: about 30% of people who've heard of blogs have posted comments...) Interesting. If taken at face value (which wouldn't be a good idea), that means almost half of the people who've read blogs have posted comments to them. Before we even start to think about commercial applications, that may well represent a radical increase in the population of people participating in online forums.
But here's the real meat: When they saw those numbers in the sidebar, direct marketing people in the reading audience (who eat, sleep and breathe much more complex math than that) were drooling on their keyboards. Consider that a direct mail campaign is regarded as doing very, very well at 5% response. These are not numbers to swing elections as a constituency; but they are well into "thought-leader" territory. These blog readers are high-throughput nodes. They're the folks who spread Jib-Jab movies and forwarded the Kick Osama Butt song. At least, that's how the consultants will spin it.
Also quite interesting: Almost a fifth of people who've "read a blog" (5% of online Americans) use RSS readers to aggregate blog content. RSS readers by definition identify regular readers, so something approaching 20% of blog-readers are regular blog-readers. And the stream of drool intensifies.
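For the curious, the sidebar arithmetic above is easy to check. A minimal sketch (the figures are the BBC/Pew sidebar percentages quoted above; the variable names are mine):

```python
# Pew/BBC sidebar figures, each as a percentage of online Americans
heard_of_blogs = 38   # have heard about blogs
read_blogs = 27       # have read a blog
posted_comments = 12  # have posted comments on blogs
use_rss = 5           # use RSS aggregators

# Proportions implied by the sidebar
read_given_heard = read_blogs / heard_of_blogs          # ~0.71
comment_given_read = posted_comments / read_blogs       # ~0.44
comment_given_heard = posted_comments / heard_of_blogs  # ~0.32, "about 30%"
rss_given_read = use_rss / read_blogs                   # ~0.19, "almost a fifth"

print(round(read_given_heard, 2), round(comment_given_read, 2),
      round(comment_given_heard, 2), round(rss_given_read, 2))
```

Nothing fancier than division, which is rather the point: a reporter with a calculator could have drawn the same conclusions.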
You have to actually do some math to sort all that out, mind; I think they're probably better at it over there, but I wonder if they weren't actively hiding those figures by declining to crunch the numbers. (In America, I'd just chalk it up to ignorance -- I don't have much faith that our reporters have the math skills to calculate a proportion.)
I can honestly say that I never thought blogging was a fad. But I will go out on a limb (not that I have to go very far) and say that "podcasting" was dead before it started. Or, at least, the meaning of the term will change. "Podcasting" will come to be the audio equivalent of "TiVo", as we start to see those forthcoming gizmos that let folks TiVo-ize satellite-radio broadcasts. They'll start as special attachments for iPods. (Perhaps even as an iPod itself -- though I don't think Apple will go that far. It would hurt iTunes sales.) Then they'll spread to other music players ("there are music players besides the iPod?!"). Podcasting as we currently know it will die a quick and inglorious death, mourned only by the people who hoped to have their name forever attached to the term.
Blogging has never really been at an equivalent risk. The technical barriers to entry are low: a decent secondary education and enough disposable income to afford $10/mo or less in hosting fees. And bloggers face very little competition. (Well, except for newspaper columnists. What are those? Well, um, they're these folks who'd regularly get their "blogs" printed in newspapers. See, these newspapers, they're printed on really big paper, so everything is in columns, and a columnist would get one column out of six on the page... ... Newspapers. They print them, on paper, and sell them to people so they can carry them around and read them.... How do they know how many to print? They don't. A lot get wasted. Yes, I know that's a waste...)
Edge.org have posed an interesting question [courtesy MeFi] to a collection of "scientists and science-minded thinkers": "WHAT DO YOU BELIEVE IS TRUE EVEN THOUGH YOU CANNOT PROVE IT?" (It's just the latest in a series of annual questions.) Many of the answers are thought-provoking, or instructive (even though most are simply restatements of that thinker's area of interest in the form of an "unprovable" "assertion"). The zeitgeist implicit in their answers is interesting, too. John Brockman writes:
This year there's a focus on consciousness, on knowing, on ideas of truth and proof. If pushed to generalize, I would say it is a commentary on how we are dealing with the idea of certainty.
We are in the age of "searchculture", in which Google and other search engines are leading us into a future rich with an abundance of correct answers along with an accompanying naïve sense of certainty. In the future, we will be able to answer the question, but will we be bright enough to ask it?
This is an alternative path. It may be that it's okay not to be certain, but to have a hunch, and to perceive on that basis.
Maybe it says that. Maybe it says that this is how science actually works: Having hunches, then trying to prove them, which is really what most of the answers are about. Some of them get more fundamental, as when Richard Dawkins answers:
I believe, but I cannot prove, that all life, all intelligence, all creativity and all 'design' anywhere in the universe, is the direct or indirect product of Darwinian natural selection. It follows that design comes late in the universe, after a period of Darwinian evolution. Design cannot precede evolution and therefore cannot underlie the universe.
... which is a remarkably blunt and honest thing for him to say, since it faces head-on the core weakness of his anti-ID positions. I personally think ID is a load of horse-hockey, but I don't think it can be countered with "proof" that it can't work any more than we can solve the first-mover conundrum. I'm glad Dawkins doesn't shy away from that. I'm not always crazy about the way he formulates ideas ("selfish gene" theory still seems too simplistically reactionary to me, nearly 20 years after I first heard of it), but he is nevertheless one of the most able and vigorous opponents of ID, so it behooves me to pay attention to what he's saying out there.
In any case, while the Q&A is intriguing, in many cases (and as I've noted) it's largely a matter of researchers restating their research-focus as though it were a controversial idea. [bonehead @ MeFi observes, "... scratch post-docs or hungry assistant profs for real wild-eyed speculation. Of course, most of them will be wrong (entertainingly so), but that's where the future Nobels are too."] And I don't think Brockman is really giving credit to scientific process: believing something you can't prove is usually how anything valuable and previously unknown gets learned. Call it a hunch, call it belief; the process whereby that belief is substantiated (though hardly ever "proved" in a strict logical sense) is what we know as science. And I'm not altogether sure that Brockman groks that.
Brockman also seems to think there's a new way of being an intellectual:
... There is also evidence here that the[se] scientists are thinking beyond their individual fields. Yes, they are engaged in the science of their own areas of research, but more importantly they are also thinking deeply about creating new understandings about the limits of science, of seeing science not just as a question of knowing things, but as a means of tuning into the deeper questions of who we are and how we know.
It may sound as if I am referring to a group of intellectuals, and not scientists. In fact, I refer to both. In 1991, I suggested the idea of a third culture, which "consists of those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are. "
I believe that the scientists of the third culture are the pre-eminent intellectuals of our time. But I can't prove it.
This idea of "Third Culture" scientists is worth exploring, but it's a topic for another time. Suffice for now to say that I don't see anything sufficiently new that a new organizing principle is required; in fact, I think a concept like "third culture" has more potential to alienate thinkers from cross-pollination than it does to encourage them. A bit like "brights" in that regard.
But that's an issue I haven't got time to take on right now....
"And if I don't ever get married or have a baby, what -- I get bupkes? Think about it: If you are single, after graduation, there isn't one occasion where people celebrate you. I am talking about the single gal. I mean, Hallmark doesn't make a 'Congratulations, you didn't marry the wrong guy' card." -- 'Carrie Bradshaw', Sex and the City
"I started to get notes the next week that said that single women were starting to register, at stores, for their birthdays. And I thought, 'That's great, because we put something out there.'" -- Jenny Bicks on Morning Edition [listen]
Yeh. Right. You put something out there, alright: Another quasi-official reason to spend, and spend in a store-register-validated, label-appropriate manner. Sex and the City is really all about social activism and culture-jamming, after all.
As far as I'm concerned, the time is well-overdue to re-examine the idea that human existence is solely for procreation -- if there's one thing that Humans consistently do that other animals don't, it's make their own rules about what their existence is for -- but relating that to Carrie Bradshaw's sense of loss over her $400 shoes really, truly advances that particular cause not one whit, and it's insultingly disingenuous for the script's author to argue otherwise.
In 1993-94, Ron Avitzur and Greg Robbins exploited poor plant security and bureaucratic buck-passing to sneak into the Apple offices in Cupertino day after day, month after month, producing a product the company didn't want, for no pay. By the end, they were working 16-hour days, seven days a week, and had been assigned engineering, QA and Human Factors resources. The product they built, Graphing Calculator, was subsequently included as a standard applet through Mac OS 9.
Apple at that time had a strong tradition of skunkworks projects, in which engineers continued to work on canceled projects in hopes of producing demos that would inspire management to revive them. On occasion, they succeeded. One project, appropriately code-named Spectre, was canceled and restarted no fewer than five times. Engineers worked after hours on their skunkworks, in addition to working full time on their assigned projects. Greg and I, as nonemployees who had no daytime responsibilities, were merely extending this tradition to the next level.
Why did Greg and I do something so ludicrous as sneaking into an eight-billion-dollar corporation to do volunteer work? Apple was having financial troubles then, so we joked that we were volunteering for a nonprofit organization. In reality, our motivation was complex. Partly, the PowerPC was an awesome machine, and we wanted to show off what could be done with it; in the Spinal Tap idiom, we said, "OK, this one goes to eleven." Partly, we were thinking of the storytelling value. Partly, it was a macho computer guy thing - we had never shipped a million copies of software before. Mostly, Greg and I felt that creating quality educational software was a public service. We were doing it to help kids learn math. Public schools are too poor to buy software, so the most effective way to deliver it is to install it at the factory.
Beyond this lies another set of questions, both psychological and political. Was I doing this out of bitterness that my project had been canceled? Was I subversively coopting the resources of a multinational corporation for my own ends? Or was I naive, manipulated by the system into working incredibly hard for its benefit? Was I a loose cannon, driven by arrogance and ego, or was I just devoted to furthering the cause of education?
I view the events as an experiment in subverting power structures. I had none of the traditional power over others that is inherent to the structure of corporations and bureaucracies. I had neither budget nor headcount. I answered to no one, and no one had to do anything I asked. Dozens of people collaborated spontaneously, motivated by loyalty, friendship, or the love of craftsmanship. We were hackers, creating something for the sheer joy of making it work.
.... On March 11, 1994, the front page of the Times business section contained an article on the alliance among Apple, IBM, and Motorola, picturing Greg and me in my front yard with a view of the Santa Cruz Mountains. Someone I knew in Apple Public Relations was livid. I had asked if she wanted to send someone for the interview, but she had said that engineers are not allowed to talk with the press. It's hard to enforce that kind of thing with people who can't be fired. It was positive press for Apple, though, and our parents were pleased.
We wanted to release a Windows version as part of Windows 98, but sadly, Microsoft has effective building security.
Personally, I think it's a dessert topping and a floor wax: at no point were Ron and Greg doing anything out of fear for their livelihoods; yet they've surely done their small part to further Apple's "Think Different" mythology, so it can be argued that Apple got value from their efforts at no measurable cost. And the squad of supporters that accreted to them over the months of their effort were effectively stealing cycles from other Apple processes, to debatable end. See, I would tend to side with a business analyst if s/he decided that Graphing Calculator really didn't contribute much to the Apple bottom line.
But I don't work for Apple, and most likely never will. I can't stop thinking it's great. And it's damn near an American myth.
There's a darker side, of course: This isn't far different from the maverick spirit that fueled Ollie North's blood-money circle-jerk; the same initiative and gung-ho spirit can drive both dreams and nightmares.
As always, the only real resort is to calls it as we sees it....
.... One myth that I find interesting, but which has nothing to do with Linux or even the IT sector in particular, is the myth of how a single person or even a single company makes a huge difference in the market. It's the belief that things happen because somebody was visionary and "planned" it that way. Sometimes the people themselves seem to believe it, and then the myth becomes hubris.
I have to continually try to explain to people that no, I don't "control" what happens in Linux. It's about having an environment that is conducive to development, not so much about any particular leader. And I think that is true in most cases, be it the "great sport coach" or the "great spiritual leader."
"We should take care, in inculcating patriotism into our boys and girls, that it is a patriotism above the narrow sentiment which usually stops at one's country, and thus inspires jealousy and enmity in dealing with others... Our patriotism should be of the wider, nobler kind which recognizes justice and reasonableness in the claims of others and which lead our country into comradeship with...the other nations of the world. The first step to this end is to develop peace and goodwill within our borders, by training our youth of both sexes to its practice as their habit of life, so that the jealousies of town against town, class against class and sect against sect no longer exist; and then to extend this good feeling beyond our frontiers towards our neighbors." -- Robert Baden-Powell
[courtesy Rickie Lee Jones / FurnitureForThePeople]
Man, the psychic landscape in those distant Red States is looking a little weirder every day.
"Yeah folks, our simple, cornpone sincerity is so heartfelt and transparent that we have to arrange a glass-roots [sic]crypto-military PR campaign to get some market traction."
What you will hear in the song, The Bumper Of My SUV, is the absolute truth. No exaggerations, no poetic license, and truly how it made me feel. I had no intention of ever playing this song for anyone.
-- Chely Wright
It's old and big news by now, but Chely Wright wrote a song [clip] that's apparently a big indie hit, about the time a Woman In A Mini-Van gave her the finger because of the Marine Corps bumper sticker.
I'll be blunt: I have to wonder if it actually happened.
I'm willing to believe that Chely Wright believes it happened as she tells it, but I can't say I do. After all, her accounts of the incident have gotten more dramatic as she goes along, to the point where as of mid-December, many months after first telling a shorter version of the story as a stage-rap, the Woman In The Mini-Van is now honking and weaving and damn near foaming at the mouth. My Search-Fu hasn't uncovered an instance of this enhanced version of the story prior to 12/13; CNN, Reuters -- all the instances I've found so far source back to a Billboard profile that I can't track down directly, and the Tennessean, in its account of Ms. Wright's recent woes, goes with the earlier, less-rabid version of the Woman In The Mini-Van.
But let's say that every word of the later version of her story is true. This song is still the most insidious kind of jingoism, in that it more or less amounts to saying "once the shooting starts, you don't get to protest unless you're part of it." Or your family is part of it. Or you're willing to lie about your family being part of it. The song itself can be read as innocuous. Certainly there's lip service paid to balance ("I don't have all the answers I need", "But that doesn't mean that I want war / I'm not Republican or Democrat"). But it's interspersed with none-too-subtle coding of Ms. Minivan as a Liberal Elitist ("So I hope that lady in her mini-van / Turns on her radio and hears this from me / As she picks up her kids from their private school...").
"Does that lady know what I stand for / Or the things that I believe?" Well, if we believe this story: Yes. Yes she did. Perhaps by accident, but even if so, it sounds like she got Chely Wright, right.
Songwriters, poets, novelists, artists have always imputed technically unwarranted importance to small things. It's part of the troubadour's art to embellish and to generalize to a larger context. But the best of them admit it; or, better yet, admit that they're not sure it means what they think it does. And even when they let passion grip them, they don't insult us by pretending to be fair. Bruce Cockburn doesn't pretend to be balanced when he wishes for an RPG. Pete Seeger has painted "This weapon kills fascism" around the edge of his banjo-skins for decades. Ani DiFranco has never minced words. Chely Wright should follow those good examples and come out and say, or at least figure out, what the hell she means.
... and still, it managed to slip my mind what I started to write about in the first place.
When Drupal's spam module sends me notifications, it tells me what the text of the message was. Here are a few samples:
Injustice, poverty, slavery, ignorance - these may be cured by reform or revolution. But men do not live only by fighting evils. They live by positive goals, individual and collective, a vast variety of them, seldom predictable, at times incompatible.
Women are systematically degraded by receiving the trivial attentions which men think it manly to pay to the sex, when, in fact, men are insultingly supporting their own superiority.
The most radical revolutionary will become a conservative on the day after the revolution.
Time heals what reason cannot.
They're all in the same vein: garden-variety profundities harvested from some encyclopedia of received wisdom. If I believed in hell, I'd believe there was a special place there for people who shit on other people's front lawns while dispensing nauseating bromides. It smacks of the used car salesman who pushes the swanky beast through insincere appeals to your fears for the safety of your family....