"The revolution will not be televised."
Lesson for the day: "Free speech" is a governmental concept. It doesn't apply to private enterprise. So you can lawfully say anything [non-threatening] that you want about Dick Cheney or George Bush (well, in theory), but if you start talking about your employer -- hey, man, nice knowing you, we'll send your redacted belongings by UPS Ground. And just make sure you don't say anything that might get our lawyers excited, once you're out the door.
It's an axiom of American Libertarianism that we negotiate (explicitly or implicitly) with employers for the right to use our labor power. (Hell, it's an axiom of Marxism, for that matter.) But what terms have we negotiated? What have we given up?
Joyce Park (a.k.a. "Troutgirl") is a PHP expert, who until 3pm yesterday was a development manager at Friendster. Today, she's a fired dev mgr at Friendster. The reason given was that she had blogged about the company.
Let's be clear: She hadn't revealed any business confidential information, and she hadn't said anything really even very bad about them. The worst thing that I've found so far was a snarky, weary comment about app performance ("hopefully we can now stop being a byword for unacceptably poky site performance"). But she's out the door, with her blogging activity given as the stated reason.
Again, let's be clear: She was fired for things she did on her own time, with her own resources. She was fired for voicing an opinion about matters of public record. She was fired for putting into words thoughts that were in her head. And if I may judge, in a fairly judicious way.
What surprises me so far is not the groundswell of peer support (there's a movement among geeks to cancel Friendster accounts in sympathy); what surprises me is how many folks think this is a perfectly appropriate thing for Friendster to do. The prevailing wisdom in some quarters seems to be that this is basically just business ethics 101. Hell, they owned her labor power, right? That must mean they also own the right to dictate what she can and cannot say about them, right? (I'm thinking back to a similar case a few months ago. Typical comment: "It's all so simple, if your employer doesn't tell you to blog, don't. At least don't while mentioning your employer." Simple, indeed...)
But this just sounds less and less "right" and "appropriate" the more I think about it. After all, it's basically a logical progression from that to arguing that any corporate-specific information in her head should be restricted to use by Friendster, only. So that means that they can make a case for owning things she learned while working on that big JSP-to-PHP conversion that Friendster finished a couple of months ago.
In 2001, the Supremes handed down a ruling in Kyllo v. United States [250KB PDF] that found a scan of Mr. Kyllo's home using thermal imaging equipment to be an illegal search. It's the grounds that were interesting: It was found to be an illegal search because the thermal imaging device was not "in common use." If such devices had been "in common use", by the ruling, the "search" would not have been illegal.
In other words, once everyone knows that it's become feasible to search non-invasively, it'll be legal. Just like looking into open windows.
That's analogous to this situation in that, presumably, as soon as it becomes feasible to erase or otherwise control content in someone's brain, there will be corporate masters who regard it as their right to take that action when an employee is "discharged" -- or leaves voluntarily.
Jakob Nielsen has done Nietzsche one better: Instead of just two basic ideologies ("master" and "slave"), he's identified three: Mastery, Mystery, and Misery. These correspond roughly to empowerment, game-play, and control. In the "Mastery Ideology", "...the designer's job is to provide the features users need in a transparent interface that gets out of the way and lets users focus on the task at hand." Mystery 'Obfuscates Choices' by using novel interaction elements. And Misery is an ideology of "... oppression, as mainly espoused by certain analysts who wish the Web would turn into television and offer users no real choices at all. Splash pages, pop-ups, and breaking the Back button are typical examples of the misery ideology."
Nielsen's purpose is to drive the cause of design for usability. That's what NNG does for a living. So it's not surprising that he focuses on the negative aspects of "mystery" (obfuscation) and control ("misery"), and carefully (re)interprets empowerment to mean "usability". He's mapped out (as usual) one path that, if followed, will more or less lead to a better design. It's the most bottom-line path, the path most suited to NNG's target audience: The guys with the money (they're the ones who tell the designers what to do, after all). But it's not the only path, and his re-interpretations have some pitfalls.
To start with, empowerment isn't always all it's cracked up to be. Sometimes (as Nielsen implicitly points out elsewhere) it's necessary to constrain in order to empower -- or at least, to create the sense of empowerment. Search is a good example. The earliest search interfaces included Boolean parsing as an integral part of their user interaction design. Gradually, Boolean parsing slipped out of the user interfaces as designers became convinced that it was an impediment.
Boolean search would be empowering; but for most users, it would be less usable. Nielsen has accepted that conclusion for years, incidentally. It's experimentally verifiable. (And that seems to me to be Nielsen in a nutshell: 'Where are the numbers?', he'd ask. At a conference, I heard him tell a story of a site whose usability was improved by increasing the number of clicks to perform a purchase. That's what the numbers told them was the right thing to do. And sure enough, the client's revenue increased. Counterintuitive -- but true.)
Similarly, control isn't all bad. UIs can often be made cleaner and easier to use -- especially for novice users -- by limiting functionality. Again, this is nothing Nielsen himself hasn't accepted for years. This is not to say that constraint is freedom; but constraint can give you more free time, when it prevents you from wasting effort on things you don't need.
Aside: I'm always wary of Google as an example of any kind of "empowerment." Google right now controls a mind-boggling array of resources, and is in the process of leveraging them to exert an unprecedented level of control over the merchandisability of your browsing experience. That you will remain largely unaware of this process is a testament both to their technical aplomb and to their insipid arrogance.
Where this starts to get interesting is with mystery. I've conflated Nielsen's "Mystery" with "game-play" -- guilty of my own reinterpretation, to be sure, but I think it's valid, and I'm not really alone. Kim Krause has made a similar leap. "Conformity is Nielsen's mantra," she declares. But to proclaim that, she has to ignore Nielsen's praise for J. K. Rowling's "personal" site, which makes extensive use of playful, "mysterious" interface metaphors. "The site feels more like an adventure game," writes Nielsen, "but that's appropriate because its primary purpose is to feed fans rumors about Rowling's next book." He goes on:
User research with children shows that they often have problems using websites if links and buttons don't look clickable. At the same time, using a virtual environment as a main navigation interface does work well with kids, even though it's rarely appreciated by adults (outside of games). Also, children have more patience for hunting down links and rolling over interesting parts of a page to see what they do. On balance, the mystery approach to design succeeds for Rowling -- just don't try it for sites that are not about teenage wizards.
So, maybe these aren't hard and fast rules. Maybe there's a little wiggle-room in Nielsen's declarative statements, after all.
Cool. I like that, too. In fact, that's why I dislike Google, because their "I'm feeling lucky" search is nothing more than a glorified popularity meter. I don't want to know what the most popular return on my search term is; I want to see what my options are.
That's how I find things I don't expect to find: By being able to see the results that might not be "most popular." That's how I get serendipity.
Krause does have a point, though, when she notes that it's memorability that makes the site. Google was memorable, she said, because people learned new ways to use the tool: "They could look up people before that first date. They could type in search terms and hit 'I'm Feeling Lucky' to see what one web site Google would find for them out of all the pages in its index. Google was fun to use." (Actually, I always thought HotBot was terrific fun to use, because its Boolean search interface gave me a sense of power by letting me whittle down my results set to exactly what I wanted. But hey, that's just me, I guess...)
When she talks about sites being "memorable", what she's talking about sounds an awful lot like Don Norman's "emotional design"; and indeed, I think that's what you get when you unify good design for usability with strong content and a design that speaks to that content. One site that strikes me as very successful in this regard is Burningbird. Superficially, the site is constantly changing, seeming to show a new look with almost every viewing. But having used the site once or twice, you will always still know how to use it again. Nothing about the interaction design per se changes when the graphics and colors and typefaces change. The menus stay in the same place, the action-cues stay the same.
I disagree with people who say that this is inherently hard. It does require care, but it's as hard as you make it; if you lean toward control, then you will be frustrated in your attempt to force an experience of memorable mystery upon your users, and it will all be very, very hard. If you let the content and your purpose drive your design, you will, by definition, get what you went looking for. The problem, as always, is to pick the right goal.
According to John Carroll, alternative browsers will never catch on because they're being promoted by religious zealots who fail to understand that 100% compatibility with obscure IE extensions is critical to browsing success.
His main point (once you wade through an irrelevant and unnecessary retelling of the History of the Browser Wars, as told by the winning side) seems to be that because the Web Standards Project is behind BrowseHappy.com, the entire issue of browser-switching must be purely religious.
The reasoning resembles the common Conservative canard that if a "liberal" has an idea, it must by that token be a bad idea.
The truth of the matter is that there remain no important incompatibilities between IE and any of the four principal "alternative" browsers, except in areas that render Internet Explorer inherently less secure. Carroll should know this, but he apparently refuses to. Instead, he remains reflexively committed to supporting Microsoft. In this regard, he resembles Freepers: Aligned with the biggest kid on the block, for no other apparent reason than that he's the biggest.
I'll be curious to see what his reaction is to the fact that the Opera and Mozilla development teams will be incorporating Apple-driven extensions, intended to improve their ability to serve as application interfaces on a personal computer. These represent an attempt to reach a saner compromise between security and functionality than Microsoft's ActiveX. Macromedia and Adobe are on board (both heavy players in the Mac space); Mozilla (which is used most widely on PCs) will push the architecture to Windows, and KHTML will push it to Unix and Linux. Unless MS successfully embraces and extends (not likely, since they haven't got the plugin muscle to out-compete Macromedia), they'll be stuck playing on a level field. For the first time in a while.
It scares me a little when I'm on the same wavelength with Garrison Keillor:
.... Fifties Republicans were giants compared to today's. Richard Nixon was the last Republican leader to feel a Christian obligation toward the poor.
In the years between Nixon and Newt Gingrich, the party migrated southward down the Twisting Trail of Rhetoric and sneered at the idea of public service and became the Scourge of Liberalism, the Great Crusade Against the Sixties, the Death Star of Government, a gang of pirates that diverted and fascinated the media by their sheer chutzpah, such as the misty-eyed flag-waving of Ronald Reagan who, while George McGovern flew bombers in World War II, took a pass and made training films in Long Beach. The Nixon moderate vanished like the passenger pigeon, purged by a legion of angry white men who rose to power on pure punk politics. "Bipartisanship is another term of date rape," says Grover Norquist, the Sid Vicious of the GOP. "I don't want to abolish government. I simply want to reduce it to the size where I can drag it into the bathroom and drown it in the bathtub." The boy has Oedipal problems and government is his daddy.
The party of Lincoln and Liberty was transmogrified into the party of hairy-backed swamp developers and corporate shills, faith-based economists, fundamentalist bullies with Bibles, Christians of convenience, freelance racists, misanthropic frat boys, shrieking midgets of AM radio, tax cheats, nihilists in golf pants, brownshirts in pinstripes, sweatshop tycoons, hacks, fakirs, aggressive dorks, Lamborghini libertarians, people who believe Neil Armstrong's moonwalk was filmed in Roswell, New Mexico, little honkers out to diminish the rest of us, Newt's evil spawn and their Etch-A-Sketch president, a dull and rigid man suspicious of the free flow of information and of secular institutions, whose philosophy is a jumble of badly sutured body parts trying to walk. Republicans: The No.1 reason the rest of the world thinks we're deaf, dumb and dangerous.
... Hypocrisies shine like cat turds in the moonlight! O Mark Twain, where art thou at this hour? Arise and behold the Gilded Age reincarnated gaudier than ever, upholding great wealth as the sure sign of Divine Grace.
I'd been doing a lot of thinking, lately, not about Lincoln, but about Theodore Roosevelt and the Progressives. It dawned on me that T.R., with his unflinching commitment to both the physical and intellectual rigors of the "strenuous life", would be no more welcome in the Republican Party now than is John McCain -- and I suspect, considerably less so. T.R. would find George Bush repugnant, would burn with desire to see Dick Cheney in jail.
The last arguably moderate Republican left party leadership when Bob Dole retired. While a harbinger of Conservative encroachment, he was at least never a stooge, and at least made a good faith effort to fight with honor. The party drives its remaining powerful "moderates" -- e.g. Fred Thompson, Ben Nighthorse Campbell, John McCain -- to the marginal councils until they must act in frustration. Thompson and Campbell, both talented and capable politicians who could have served the Republicans well, had they been taken seriously, are gone (Thompson) or going (Campbell) from political life, while McCain might as well strike out and join the Reform Party. And that's not even to mention the queasy contempt that the party leadership feels for the Republican-dominated New England delegation. "RINOs", they're called: "Republicans In Name Only." And yet, along with McCain, they may be the only Republicans remaining in Congress who still keep the faith that the memory of Lincoln and T.R. evokes.
They are, in short, the true Conservatives: They are the only ones left who truly want to preserve what their party stood for. Or, at least, what they thought it stood for. My own opinion is that it never did, and that Keillor treats them too kindly.
I say this not as a Republican -- I have been registered Independent since the age of 18, despite my parents' frequent hand-wringing over the fact that Independent registration bars me from voting in primary elections, and I have leaned heavily Democratic since my early 20s. I say this, rather, as a student of history, and to illustrate how eerily I'm in synch with Keillor on this point.
Perhaps it's true after all that we're witnessing the self-marginalization of the Republican party.
"I would give these people involved in the debate the benefit of the doubt that it's not political lying," says psychologist Elizabeth Loftus, of the University of California, Irvine, an expert on the reliability of eyewitness testimony. "It's sort of wanting to remember things in a certain way. That's probably why all these people seem so sincere. They may actually believe what they're saying."
"Even if it was my own memory, I'd be skeptical about the details," says Christine Ruva, a psychologist at the University of South Florida. "Memories aren't stored in a data file of fact. Instead, we take all the information we know about the world, we know about ourselves, and we construct something."
Finally, someone other than me has pointed out that it's not such a strange thing for memories of combat to change over time. This slow Saturday night, via the AP Science news feed on Yahoo news, a summary piece on how stress relates to memory. Some highlights:
You'd think the details would be scorched into a veteran's memory like a cattle brand: ducking gunfire, seeing someone die in battle, bracing against a blast's concussion. Who could forget?
Yet such memories not only blurred over time in one classic psychological study of soldiers, but mutated too. Old recollections faded; new mental pictures took over. Whole new chunks of personal history materialized from the muck of memory.
"People went from, 'Yes, I saw one friend killed,' to 'I saw no friends killed,' to 'I saw two friends killed,' to 'I saw three friends killed,'" said Dr. Andy Morgan, a Yale University psychiatrist who helped run the six-year study.
Far from being an indelible recording, human memory is fragile, incomplete, malleable and highly subject to suggestion, researchers have shown in dozens of studies.
Time isn't the only factor that obscures memory. Great stress or danger during an event -- as in combat -- appears to gum up the mechanisms of remembrance, perhaps through a hormone rush that temporarily dulls memory-forming areas of the brain.
Later, our own, sometimes incorrect inferences about what happened gain equal footing with what we really saw or heard. The recollections of others, like old war buddies at a reunion, can overwrite our own. [emphasis added]
"Memory doesn't work like a videotape," says Dawn McQuiston-Surrett, a psychologist at Arizona State University West.
...Yale researchers interviewed about 150 [soldiers] at intervals over six years, starting soon after their return from the first war with Iraq in 1991.
They asked the soldiers questions about their experiences, including whether they took incoming gunfire, faced Scud missile attacks and witnessed a friend's death. About 15 percent changed their recall of something significant, like seeing a friend die, the researchers reported.
Some veterans were upset when their own discrepancies were pointed out. Some even asked for help. "They would say, 'Which one is it?' to me," Morgan said. "I'd say, 'I don't know. I wasn't there.'"
Veterans with psychological or emotional problems tended to change their memories more often, the researchers found. But nearly everyone changed recollections over the six years.
Memory experts say a mild state of vigilance during an event boosts its commitment to memory. But being scared for your life, as during a crime or combat, impedes memory.
Other researchers say memories are especially fickle when the events unfolded on a broad stage or in multiple parts. Such recollections are inevitably partial, and a soldier will tend to fill in blanks unconsciously with personal inferences and the memories of others.
In unconsciously remolding memories, people often substitute details that make more sense or enhance their personal self-image, like turning a routine act of soldiering into heroism. People reshape their memories under pressure or encouragement from others.
So here we're looking at empirical verification that people's memories of traumatic events are malleable, and even subject to fabrication. We're looking at verification that a group of people -- motivated, say, by hatred for a former comrade -- might convince one another that things happened in a certain way.
This angle of the story should have been discussed literally months ago, because it's obvious, it's non-controversial among people who actually know anything about the field, and it's spectacularly relevant -- and, oh, also because there's a whole lot of arrant nonsense being fronted around this whole issue of behavior under fire. But this touches on a very uncomfortable set of truths, all of which come back to the fact that we do not -- I say again, that we do not, not that we might not -- remember things as they really happened, and that such tools as recall through hypnosis may be worse than useless.
Left unsaid in articles like this: If memories formed during periods of high stress (when "...scared for your life, as during a crime or combat....") are less trustworthy, doesn't that call into question most convictions for violent crime? And is there anyone who dares to point this inconvenient fact out?
Also worth mentioning: One would think that memories formed of George Bush during his time serving in the Alabama Air National Guard would have been formed during "a mild state of vigilance" (e.g. getting prepared to fly training missions over Alabama), and hence be that much more reliable than the memories of those serving while 'scared for their lives.' And yet, still, nobody remembers serving in Alabama with the President. Curious indeed.
It's fashionable in many circles to trash on Internet information resources. And worst is any information resource that's driven by "community." Take the recent story from the Syracuse Post-Standard by would-be technopundit Al Fasoldt.
Wikipedia, [Liverpool High School Librarian Susan Stagnitta] explains, takes the idea of open source one step too far for most of us.
"Anyone can change the content of an article in the Wikipedia, and there is no editorial review of the content. I use this Web site as a learning experience for my students. Many of them have used it in the past for research and were very surprised when we investigated the authority of the site."
"I was amazed at how little I knew about Wikipedia," Fasoldt continues. I'm amazed at how little he still does. For example, he doesn't correct Ms. Stagnitta's fallacious assertion that there's "no editorial review". In fact, Wikipedia articles do, absolutely, receive editorial review. All the time. Twenty-four-by-seven.
The research required to correct this misapprehension wouldn't be difficult: Fasoldt (or Stagnitta) could start by scanning the Wikipedia Community Portal, look at the Wikipedia Village Pump for discussions of policy questions, or look at their Policies and guidelines entry. If he wanted to be really adventurous, and really interested in testing how reliable Wikipedia is, he could experiment by trying to hack the system and drive an inaccurate edit; if he did that, he'd discover that there is, in fact, editorial review -- it's just not performed by an anointed editor, but rather by people who might have some kind of actual knowledge on the subject. (Mike at Techdirt.com suggested such an experiment, and was rebuffed.)
But there's more at play here than sloppy research. In correspondence with Mike at Techdirt.com, Fasoldt used terms like "repugnant" and "outrageous" -- terms which are clearly driven by fear or anger (the latter in any case usually being driven by fear). So I have to sit here and ask myself: What is it about Wikipedia that inspires such fear and rage? And I think I know what it is. It's the very idea that information not sanctioned by some kind of official authority could be taken as reliable.
Because, after all, if information is "free", then information gate-keepers have empty rice-bowls.
Let's look for a moment at who's complaining: A high school librarian (well, we assume she's a librarian, Fasoldt's piece actually doesn't identify her as such), and a would-be pundit with a penchant for John Stossel-ish ranting. These are both people in eroding professions, most likely looking to avoid challenges from "authorities" who aren't designated as "authoritative" by membership in their guild. Heaven forbid that some student should rely on a Wikipedia article that's the collective work of three or four entomology graduate students in different universities, rather than one from Britannica that was written by one grad student and then signed by his advisor. Such things will certainly and truly cause the end of civilization as we know it.
This is another one of those false dichotomies that frightened practitioners of marginal professions use as leverage to get their heads screwed still deeper into the sand. Wikipedia is a good thing. It's not a good thing because community-driven content is an inherently good thing (though that last is almost true); it's a good thing because they do it well. That's partly a function of size and critical mass; but it's also partly a function of rigor in management. The rules get enforced, and editorial quality stays generally good, because like most successful "open source" projects, there's really a fairly high degree of central control in the areas that really matter.
It's easy to see why Wikipedia would be very, very threatening to a public school librarian; it's also easy to see why it could suddenly seem very threatening -- or, at least, like a blood-spotted chicken -- to someone who's set himself up to be a mediator for technical information. In the more "elite" echelons of librarianship and technical journalism (visit the reference desk at a good-sized college or public library for examples of the former, or read Dan Gillmor or ... for examples of the latter), the practitioners for the most part have a deep understanding that they are not gate-keepers, but guides. In the margins, that sense seems to get lost. Whether that's primarily due to the general noise of trying to make a living, or due to more petty fear of the future, is hard to tell -- and in any case, they're probably not so often mutually exclusive.
All that said, and as a final word, the free and open creation and maintenance of public information resources by the public that uses them is an inherently good thing, provided the quality of the information remains high. In that sense, Wikipedia could and probably should be a poster child for the proper and proportional application of [American] Libertarian and Anarchist ideas. It's an example of the "direct action" of many participants aggregating into an objectively good result.
One final point: Curiously enough, the quality of information never actually seems to be at issue for Stagnitta and Fasoldt. You'd think that if they're so concerned about reliability of the information, they'd want to actually test the information. But they seem more focused on explaining why it couldn't possibly be reliable, versus testing whether it actually is. Well, I guess I can't expect them to be scientists.
ADDENDUM: I got some of the links wrong, herein. The original story lead was via BoingBoing, and that's where the terms "repugnant", "dangerous", and "outrageous" appeared.
LaCie are specialists in external storage devices (though they also make excellent flat-panel monitors). They got their start building SCSI drives for Macs and other SCSI-equipped PCs, and then were heavy early adopters of IEEE 1394 (a.k.a. "FireWire" -- still superior to USB 2.0, as far as I'm concerned, but what's a guy gonna do...).
Now they've partnered with MandrakeSoft to package one of their pocket-sized 40GB "portable" drives with a bootable, autoconfiguring version of Mandrake Linux version 10. Called the "Globe Trotter" [ZD Net story / MandrakeSoft product page], this is essentially a lineal descendant of MandrakeMove, and directly analogous to interesting and generally excellent Linux distros like Knoppix. It's designed to be booted from a CD and then auto-configure to use the system's resources.
The advantage of a gizmo like this, or of these bootable CDs, is that they let you carry your own computing environment with you without carrying (or even owning) a computer. With MandrakeMove, you carry just the CD and a one-ounce USB thumb-drive; I've typically also carried around my 20GB Archos external drive, which gives me still more capacity. This gives you an even more complete computing environment with even more storage space, as well as the ability to easily install new Linux software. With the benign neglect of a helpful librarian, or by just rebooting your office PC for your lunch hour, you can escape the confines of "public" computing environments. This type of device can also be handy for students having to use PCs in computing labs.
(Aside: While you could probably figure out a way to do this with Mac OS X, it would be technically difficult and legally questionable to try it with Windows.)
I'm a huge fan of MandrakeMove, and have been planning to upgrade to their second generation version; I might just get this instead. True, $219 is a little high for a 40GB drive (even one that small and portable), but the markup from MSRP of the naked 40GB drive is only about $60. So what you have to ask yourself is whether it's worth $60 of your time and effort to buy naked and install on top. For many Linux geeks, the answer will be "yes"; more power to them.
Me, I'll think seriously about this, because I've already found so many uses for my MandrakeMove CD that I can't begin to tell you. For example, it's been hugely useful in filling in for the deficiencies of Windows NT 4, which I still use on one of my systems at the office. The only way I have of making backups is by copying from my old PC to a slightly less old network fileserver. But since this box has USB, I can reboot using my MandrakeMove disc, and then backup my files to my 20GB Archos disk or to my 1GB Lexar thumb-drive.
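The reboot-and-copy routine above can be sketched as a short shell session. This is a hedged illustration, not the author's actual commands: the device node (`/dev/sda1`) and mount point (`/mnt/removable`) are assumptions, and temp directories stand in for the NT4 files and the mounted USB drive so the sketch runs anywhere.

```shell
# After booting from the MandrakeMove CD, the external drive would be
# mounted something like this (device node is an assumption):
#   mount /dev/sda1 /mnt/removable

# Stand-ins so the sketch runs without a real USB drive attached:
SRC="${TMPDIR:-/tmp}/nt4-files"            # files to back up
DEST="${TMPDIR:-/tmp}/removable/backup"    # stand-in for /mnt/removable/backup

mkdir -p "$SRC" "$DEST"
printf 'quarterly report\n' > "$SRC/report.txt"

# -a preserves timestamps, permissions, and subdirectories, so the
# backup is a faithful copy of the source tree.
cp -a "$SRC/." "$DEST/"

ls "$DEST"
```

Where rsync is available on the live CD, `rsync -a --delete "$SRC/" "$DEST/"` would additionally mirror deletions; plain `cp` is used here because it's present on any live system.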
Of course, there are also less savory uses for this kind of thing, such as bypassing IT policies or serving as a hacker's toolbox. But then, just as you can use a car to transport stolen goods, you can use any of these things (and I, personally, do use them) for legitimate purposes, too.
"To announce that there must be no criticism of the president, or that we are to stand by the president, right or wrong, is not only unpatriotic and servile, but is morally treasonable to the American public. Nothing but the truth should be spoken about him or any one else. But it is even more important to tell the truth, pleasant or unpleasant, about him than about any one else."
-- Theodore Roosevelt, Kansas City Star (May 7, 1918)
"Patriotism means to stand by the country. It does not mean to stand by the President or any other public official save exactly to the degree in which he himself stands by the country. It is patriotic to support him insofar as he efficiently serves the country. It is unpatriotic not to oppose him to the exact extent that by inefficiency or otherwise he fails in his duty to stand by the country."
-- Theodore Roosevelt (attributed)
Despite wishful attack-blog assertions to the contrary, it seems that the Kerry campaign has not disavowed Kerry's three recorded references (once each in 1969, 1986 and 1992) to being under fire in Cambodia. And according to a fairly detailed deconstruction in Slate, there's even good reason to argue that he could have been there on Christmas Eve in '68.
Which won't impress an attack-blogger, of course; someone like Glenn Reynolds enters the fight with his mind made up, and it takes a lot more than facts or argument to sway him. What it would take is an interesting question for another time...
When you highlight a community website, it's a good idea to check and make sure that it does actually have a community involved with it.
Case in point: Dan Gillmor heaped praise upon GoSkokie.com [alternate link] as a great example of "hyperlocal online journalism". As The Register UK points out, as of Gillmor's blog entry there hadn't been a posting in three weeks.
He'd have been better served to note a site I mentioned a while back in a comment on "Open Source Journalism", iBrattleboro. They're active, and they use the site for real community news. Of course, we have no idea how much of iBrattleboro's news might be the product of one feverishly detail-oriented brain, but the point remains that it could actually be useful to someone, where GoSkokie won't be. It clearly has no critical mass. (That Gillmor could cite it as a 'done-right' example despite its lacking critical mass is pretty strange -- critical mass should be the most obvious requirement for a successful community site.)
To me, the key and obvious difference between these two efforts is that the one that has traffic and posting activity was actually created by real, bona fide members of the community -- not by students at a journalism school working from a grand plan [1.5MB pdf]. At risk of seeming anti-intellectual: If you're not from there, it's incumbent upon you to explain why the locals should give a damn what you think.
Addendum: Dan Gillmor pointed out that he's featured iBrattleboro before; now that I think about it, he may have been where I heard of it... I'd say my memory isn't what it used to be, but I fear it never was.
And just in case anyone had any doubt that the media really is owned by rich white Republican Lysenkoists, the RNC has apparently admitted that it will be actively seeking to pin blame [NYT, reg req'd] on the Democrats for anything that goes wrong next week at their convention, in NYC. Well, at least they're up front about being deceitful scumbags, unlike those impish Freepers.
Just in case there remained any doubt that the GOP was the party of choice for bullies, a cadre of helpful Freepers have been volunteering their homes as crash-pads for people coming to NYC to protest the Republican National Convention. Well, not their homes, exactly. Well, not anyone's homes, exactly. As a matter of fact, they're fictional homes:
.... The people were very friendly, giving me wonderful directions to their apartment and telling me that I could crash there until I leave on the 5th. Hmm. That's generous. Too generous, actually.
But then, I had that nagging feeling. Wouldn't it suck to fly all the way there and find out that we were being duped? I figured I could Google their email address and see what was up. Well, there was nothing except the aforementioned cc.org housing listing. So I removed the "@hotmail.com" from the email address and Googled the handle instead. Surely if they were not on the level they would be smart enough to not use a handle that would lead to incriminating evidence. Well, I forgot that Republicans in general and Freepers specifically are a bunch of heartless, brainless fucking assholes.
Oh, golly, that's hilarious. Gosh, those Freepers sure do have a great sense of humor -- making sure those "moonbats" end up sleeping on the street in New York City during the five days of the year that the NYPD is most likely to be hauling in every "vagrant" they find on the street; wouldn't that just serve those liberal pussies right? It's almost as funny as when they used to call Chelsea Clinton the "dog faced girl."
The "Christmas In Cambodia" meme is really picking up steam. It's a really problematic meme for the Kerry camp for two reasons:
Now, lots of people do really seem to think this is a big deal. These people need to look in the mirror. Or rather, look to their own past. And they need to do it honestly. Because I can just about guarantee you that they all have things they're sure about back there that just plain never happened.
I was born in Las Vegas, and left there at the age of 3. I have a very vivid memory of standing at the curb with my father, and watching the moving van -- a big, yellow and green Mayflower van -- come around the corner at the end of the block. My father was wearing shorts and a three-colored knit shirt that he was fond of in those days (very '60s-Vegas). I'd had no idea before then that we were moving.
The memory is especially vivid for me because I did this with my father, who I seldom saw. He worked at a nuclear research test site 150 miles out in the desert. He left the house well before I woke in the morning, and was back only long after I'd gone to bed. On weekends, he kept busy with the endless home improvement projects that young homeowners think they have to undertake to improve their resale value. (Though truth be told, he probably did it just to stay sane. He's always loved working with his hands.) This is not an indictment; he was just trying to take care of his family. And after all, he did quit that life after a few years for a saner one.
There's just one problem with this memory: It can't possibly be real.
See, I'm blind as a bat. I wear glasses with a very strong prescription. There's no way I could have seen that Mayflower van. Without my glasses, I could just about safely walk home through a neighborhood I knew well, but it would be a really good idea to get someone to tell me whether there were cars coming. At night, I'd be up the excrement race without a propulsion device. I've been this way all my life, but no one spotted it until I was 5, getting ready to go into kindergarten.
I have other memories from a similar age that I regard as more authentic, because they include my frustration at not being able to see what people were telling me to look at: Landscapes, fish, trees in a field, Yosemite falls. Where this particular one comes from is a complete mystery to me, because, as I've noted, before we left Las Vegas I barely knew my father. There are no pictures of moving day or the moving van in our otherwise extensive slide collection. No one else in the family remembers this happening. I remember having this memory as far back as I can think, but never mentioned it to anyone until I was in my teens, when I was informed in no uncertain terms that I must have made it up. And yet, though I know all these things as probable facts, that memory is still as vivid to me as it ever was.
The point of this story being that I was able to manufacture a vivid, powerful memory of something that most likely never occurred -- or, at least, never occurred the way I remember it. People do this all the time. It's perfectly normal. You can even make other people confess to things they haven't done. I have other vivid memories that I can carefully situate, that don't stand up to the evidence. And what's more, if we're honest, we all do.
If I could do it, so could John Kerry.
The United States is full of men in their 50s who have lots of vivid memories of a very nasty place and time that are probably not very accurate, but seem very, very real and present. John Kerry remembers 'searingly' being on a boat in Cambodia on Christmas; records indicate that he wasn't there on Christmas, but he was (probably) there a couple of weeks later. It's not at all implausible that he conflated the two nights in his mind. (Hell, Ronald Reagan "remembered" photographing the liberation of Auschwitz.)
I talked with my father about memory a couple of weeks ago. He showed me a list that he'd been making, to help him write a personal memoir. Nothing scandalous; just something to help him sort his life out, maybe provide some insight to grand- and great-grand-children. He told me that he'd found so many places where he realized he'd gotten the order of things wrong that he decided he needed to put them on a timeline, and in so doing found a lot of other things, too.
Well, why not? We have this illusion, this fantasy, that our minds are sure and true and that our vivid, present memories are always what really was. And sometimes they are. But what do we expect? Billions of neurons we may well have -- but we store many decades of sights and sounds and scents integrated into rich, cinematic wholes. Shouldn't we expect that we're filling in at least some of the details through interpolation?
Some folks at the Web Standards Project have put together a slick little website called BrowseHappy.com (very Carbon-ish design, if I may say) that highlights the four principal alternative browsers: Firefox, Mozilla, Apple Safari and Opera. I've used all of these but Safari, though I have used recent versions of its close cousin, Konqueror. I can say with confidence that all of them are superior to IE in almost every significant regard.
The one area where they're not superior is in compatibility with applications that rely on IE's idiosyncrasies and proprietary extensions, like Outlook Netmail. (It's a Microsoft application, what do you want?) But for many such applications, the "incompatibilities" can be resolved by simply setting the user agent string to let you masquerade as IE. (That's a standard feature in Opera and KHTML browsers like Safari and Konqueror, and an easy add-on for Mozilla and Firefox.)
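For what it's worth, in Mozilla and Firefox you can also do the masquerade by hand, without an add-on, via a single preference in your profile's user.js file. A minimal sketch, using the stock `general.useragent.override` preference and a typical IE 6-on-XP user agent string:

```js
// user.js -- make Mozilla/Firefox identify itself as IE 6 on Windows XP.
// Delete this line (and restart the browser) to go back to the real UA.
user_pref("general.useragent.override",
          "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)");
```

Note that this overrides the user agent for every site you visit, so it's cruder than the per-site switching the add-ons give you; drop the line once you're done with the offending application.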
I'm a Mozilla user. Since version 1.1 or so, I've used it for everything that didn't explicitly require Internet Explorer. In that time, it's gotten faster, smaller, more feature-packed, and developed better and better support for web standards. Intentionally or not, this site looks best in Mozilla, since that's what I use when I'm building pages. IE compatibility tweaks are the last step I take for my personal sites.
I'm also an enthusiastic booster of Mozilla Mail, which is baked into Mozilla. It's fast, standards-based, robust, secure, easy to learn, and provides some of the best support I've ever seen for multiple email identities. One big reason that I prefer Mozilla to Firefox is that Mozilla comes with the mail client built in; to switch to Firefox, I'd have to go and get Mozilla Thunderbird. I have some general gripes about this arrangement, but overall, either Mozilla or Firefox+Thunderbird is a good, safe, usable combination.
The message: Go for it, folks. Switch to a safe browser. And while you're at it, switch to a safe email client, too.
The Swift boats controversy [free reg req'd] seems to me to be a great illustration of the "Telephone Game" in action.
I remember one time in fifth grade, there was a rumor. What it was, or who it was about, aren't important; what matters is that it started small, and got embellished along the way, and in the course of it I got introduced to the game of "Telephone" and the rules of human communication that it illustrates. I've been fascinated by them ever since.
You know the game, even if you've never heard of it: You start at one end of the line with a whispered but inane truth ("Jeanne couldn't find her favorite socks this morning") and end up at the other with something dramatically different ("Jeanne's ex-boyfriend broke into her house last night while she was out with her new boyfriend, and cut up all her clothes with a 12-inch Bowie knife, then he stabbed the knife into her bed right where her heart would be").
In the parlor-game version, the rumor gets passed person to person, sometimes around a circle -- embellished a little at each retelling, to be sure, but each person knows that what they're saying is at some level fabricated. In a real world version, though, people don't know they're playing a game: They take what they hear as true. So even if tellers know that what they've said is false, they assume that what they hear is true.
The SwiftVets appear to me, basically, to be playing a sophisticated and un-acknowledged game of Telephone. Each of them knows (or at least, at one time, knew) that they're editing events to suit their anger. But at the same time, each is "honor-bound" to take each of their shipmates at his word.
(Well, not each of them. They discount the word of the traitor, John Kerry, on principle; and if he's lying, so must be anyone who agrees with him.)
In this manner, they're able to get a nice little round-robin going. As each sailor hears the next's embellishments -- which, remember, they are honor bound to take as true -- they're encouraged to add them to their own. Those in turn are taken as true by the next person in the vicious circle.
That we're looking at Telephone and not group discovery of truth is supported by the fact that when Swift boat vets who aren't part of the Swift Vets for Truth circle are polled, they don't support the SVFT gospel. They may not support the Kerry-boat story completely, either -- but they do tend to support the general terms of the Navy's version.
Sometimes you hear something unexpected in the music of language. It just takes a new context to bring out the nuances.
That's not a new idea, of course, but it suddenly makes sense to me in a way that it didn't, before.
The key to this understanding was the phrase "Follow the money." That's a Buchananesque phrase if ever I've heard one.
Think about it. Put it in his mouth: "Follow the money." Put it together with the working-class Socratic style that Buchanan has used in debates throughout his public career. He doesn't fight to win points; he fights to convince. And the best way to convince someone is to make them think through to your position on their own.
It's Socratic -- the kind of thing that a teacher (or journalist) might say. Follow my points. Ask the same questions I'm asking, and get my answers. Follow the money. You're going to have to earn this, because I had to earn it. You're going to have to feel my pain.
Pat Buchanan was Deep Throat.
There's a difference between a whistleblower and an informant. Deep Throat was the latter: He had some key information that pointed at something he didn't like, and he fed it to people he knew would make it blow up, and take care of his ethical problem for him without his having to expose himself. It's never been said that he was a part of the conspiracy, though everyone seems to assume it. To me, the fact that the assistance he provided was relatively small (confirmations, vague hints, corrective admonitions) indicates that he was not a central player.
I can make what I think is a strong case that DT was someone like Pat Buchanan -- i.e., a middle-tier player, not part of the actual machinations of the scandal, perhaps not even regarded as being in-the-know. A watchful person in a crooked organization can know an awful lot without having to be actively let in on the secrets. Buchanan was peripheral to the great councils, not a central player, but he would have been trusted. And as a bright young thing, his elders might have wanted to display their plumage in his company. There are few things that make powerful men and women feel more powerful than to show off in front of bright young things. This sounds sexual, and at a certain level I suppose it is, but not in the way that I think most people will take it. I (merely?) mean to say that it makes the elder seem more wise than powerful (or more powerful than wise, pick your poison) if s/he can be perceived as such by someone s/he sees as a future peer.
Buchanan was such a bright young thing at that time: Intelligent, strong-minded, principled. But also (as he's since often shown) a cunning scrapper -- he had the makings of a powerful political operative, a fact that wouldn't have been lost on Nixon and his crew of seasoned pols. He was just the kind of guy, in short, who would have been able to know enough about what was going on to know that he didn't like it.
And look at his career, since then. Not since the early Reagan years has he had real power among the senior ranks of the Republican Party. Why, it's almost as though he didn't trust them. Almost as though he didn't respect them anymore...
This may not be the truth. But it makes a good story. As some baseball guy is supposed to have said: "If it ain't the truth, it oughta be."
"I'm mad less about losing the job -- I'm more mad about the reasons," said Glen Hiller, 35, of Berkeley Springs. "All I did was show up and voice my opinion."
The father of two young girls had worked at the design firm for five months, doesn't plan to appeal the firing, and holds no grudge against his boss.
"To some degree I can see her point of view," Hiller said. "Advertising is all about having the perfect tan and driving a cool car. It's all about image."
Hiller said he now plans to pursue work as a registered nurse, a field in which he worked for 10 years before landing the design job.
There must be some strange psychic connection between nursing and advertising. Glen Hiller is at least the fourth person I've heard of who left nursing for advertising and then went back. (I wish him luck; I think he'll probably be happier as an RN. Less stressful -- at least, that's what my advertising friends tell me....)
"I go all over the country, and all I see are supportive crowds," I recently heard The President remark. Small wonder: His vocal critics are fenced off into "protest zones", and the stealth critics weeded out by being forced to sign affidavits that they support the president. Unrestricted events such as the one Hiller got into are relatively rare, and yet we keep hearing whispers of heckling. Even with their best efforts, critics seem to be getting through the cordon. Their voices are quickly drowned out in un-ironic chants of "four more years." (I chanted that at a Reagan rally in the summer of '84, and got dirty looks from the Broome County Republican Faithful. Irony, it seems, has lost its ability to be ironic.)
When Pat Buchanan is a voice of moderation, I fear for the country. I fear for this country in the next four years. This profound schism between "conservative" and "liberal" Americans is Ronald Reagan's unanticipated legacy, brought to fruition by Newt Gingrich. Whoever wins the race, as Shelley points out, will most likely do so in one of the narrowest races in our history. The "losers" in 2000 were patriotic enough to concede, and the "winners" ungracious -- frankly, unpatriotic -- enough to govern as though God were at their collective shoulder. And the consequence is that whoever wins in November will be fiercely, passionately loathed by about 30% of Americans. So fiercely and passionately loathed that the losers will do whatever they think they have to in order to smear the winner. Whoever it is.
Oddly enough, I think we have Pat Buchanan to blame for this, at least partially. He's a fierce competitor, and has said a lot of things in anger and taken a lot of absolutist positions, and by that example has made it easier for others who followed to get away with the same. His successors, alas, have not had his intellectual honesty (or, for that matter, his intellectual capacity).
At a certain level, campaigning pop/rock/punk/country musicians have an important role to play in defining the cultural territory -- and it's not really an ideological role, or even an economic one -- though it has economic consequences.
Look, I'd love to sell more records. But you can make an embarrassing amount of money -- for a borderline Marxist -- selling 100,000 records a year, if you're willing to go out and work. I make what I consider to be an obscene amount of money. I do have to work for it, but I'm totally okay with that. I think everyone's going to have to do it. I think the music business is changing. Artists that don't want to tour and just want to collect royalty checks and stay home are not going to be able to do that anymore. And the more I think about it, the more I think that's the way it should be. I feel like I owe my audience something. They feed my kids. And I really like my job, a lot. Thank God, because the reality of the business is that people have to tour now. I always have, so it's not like something I have to get comfortable with. But that's the way the business is going. There's no way that file-sharing and downloading aren't going to affect the bottom line. But I really believe that if I make records that are indispensable to my audience, they'll go out and spend money to buy them, even if they've already downloaded them. If they can afford it. If they can't, I'd rather they be able to download it than not get it at all.
[Steve Earle in The Onion AV Club] [Steve Earle on Django]
The "reality of the business" at Steve Earle's level was always that you had to tour; and an awful lot of people felt just like Earle, that that was just how it should be. Until quite recently, B.B. King played something like 300 dates a year, long after he needed the money. Duke Ellington quit only when his doctors told him he had to, and then promptly wasted away.
The world is full of musicians like Steve Earle. I went to see New Model Army a few months back, surprised at the time to see that they were still around. Not only still around, I learned, but going as strong as ever, without having really ever stopped. And while older and wiser, perhaps, still as outraged as ever on behalf of the downtrodden -- and perhaps more so, in the wake of our countries' joint invasion of Iraq.
If you go to the Bug Jar (where I saw NMA) any Tuesday, Thursday or Friday night, you'll see live music played by people who will never be famous. Sometimes they're doing it because they do expect to be famous; sometimes (and I suspect this is more often true than these folks like to admit to themselves), they're just doing it because, damnit, they like doing it. Expectations of success are more of a rationalization than a real dream; the real dream is to someday be remembered and found "left of the dial." There are hundreds, if not thousands, of venues like the Jar, all across North America; hell, all across the western world. And there's been something analogous to them as far back as the middle ages. There's something primal and human ("honest" is too hackneyed a way to put it) about folks who do this kind of thing because they love it -- whether they love the music or the life or can't decide which, probably really doesn't matter.
There's been a lot of maundering about how digital recording technology and the mass-personalization capabilities of the Internet age will make it feasible for anyone to 'become a star' -- well, at least, to make their own records. And that's more or less true; it's also true that if everyone is making music on a large scale, and enough regular folks want to start listening to that music instead of the processed mainstream music made by those folks who "just want to collect royalty checks and stay home", it will break the power of the big labels. Which is, of course, the real reason that they're scared of "file sharing" and fair-use.
And it's also true that the majority of record-buying Americans alive today prefer to wash down their Packaged Music Product with a stiff draught of Marketing. I won't say that's OK, but it's a fact of life. I don't regard it as a necessary truth, but I do believe it's pretty much inextricable from our Capitalist culture as we know it.
The likely truth of the matter is that things will continue more or less as they always have, or at least in an analogous way. Troubadours will continue to travel and play in tiny venues, some of them pretty nasty. [FWIW, the current proprietors of the Bug Jar have recently renovated -- though they've kept their famous ceiling...] Recent history (American Idol, Britney Spears, Jessica Simpson, etc.) might seem to imply a resurgence of programmed "corporate pop" (to paraphrase Dave Rawson); but "indy" labels have risen rapidly at the same time to fill a middle tier. What remains to be seen is whether the bottom-end renaissance will occur as predicted, or whether interesting projects like GarageBand.com will struggle in the same obscurity as the bulk of the bands they encourage, until they run out of money or get co-opted from above.
We shall see. The only thing I feel certain of is that wherever I go in North America, I'll probably be able to find a place to see someone I've never heard of playing music more or less just because they want to. That may be a small consolation, but it's an important one.
What happens to a whistleblower? Usually, nothing good. Case in point: Joe Darby has to be hidden by the Army to protect him from the people in his own community who regard him as a traitor. (Remember Joe Darby? He's the guy who was the first to blow the whistle on the big S&M party that was Abu Ghraib Prison.)
Of course, there are different species of "whistleblower." Matt Drudge and Ana Marie Cox get praised for blowing the whistle. People who get caught and then rat out their (former) compatriots usually end up getting cast as "shrewd" or "smart" (unless they end up getting cast as "dead"). What's the operative difference between these folks and Joe Darby?
There is a simple answer: Someone powerful is usually served, or someone's prurient interest piqued, by what Wonkette and Drudge "expose". And by contrast with a rube like Joe Darby, who stuck his neck out for a bunch of Iraqis, the rat is playing it smart. At least, until he gets caught by his compadres. Then, well, that's cool, 'cuz you gotta defend your honor, yeh?
We are developing interesting ideas about honor here in America. On the one hand, we have such high-minded expressions of honor's subtle difficulties as the denouement of Aaron Sorkin's and Rob Reiner's A Few Good Men. If you ask most people how that story ends, they'll say, "Tom Cruise wins the case." But what actually happens is that his two clients -- Gitmo Marines -- are found innocent of murder, but guilty of "Conduct Unbecoming", and discharged dishonorably. ("We were supposed to fight for the people who couldn't fight for themselves," Lance Corporal Dawson explains. "We were supposed to fight for Willie.") But Sorkin's nuanced take -- which would allow us to say that both the military hierarchy and the offending soldiers were wrong -- doesn't appeal to the American sensibility.
On the other hand, we have a kind of schizophrenic ideal of omertà, where loyalty to the group -- the band, the herd, the organization -- is a harshly judged criterion, and yet the freedom to act on one's own whims is taken as a God-given right. The crux of the dichotomy lies in our worship of strength, and our near-pathological obsession with not playing the patsy. (This, too, is tied to our "exceptionalist" ideals.) These lead us to both idealize and demonize police, to loathe and admire mobsters. I'm not talking about obvious class or race divisions, here; those certainly exist -- I'm talking rather about the admiration and fear we each find within ourselves if we look for it. Maybe this isn't a purely American thing, but we certainly do commodify it well. The identification is strongest when it's seen from its lowest-level vantage point, as with the street-level thugs in Scorsese's Goodfellas, or the in-the-shit grunts like "Hoot" Hooten in Black Hawk Down: "They won't understand why we do it. They won't understand that it's about the men next to you, and that's it. That's all it is."
And therein lies the problem: The whistleblower is seen as a traitor because s/he violates the sanctity of the team. The worse things are -- the more depraved the situation, the more under threat the "team" members feel -- the more the whistleblower will be seen as a squealer, not a hero.
And it's naive, of course, to ever expect that the actual organization -- the real, formal organizational entity, the corporation, the battalion command, the federal or state agency, the hospital management -- would treat a whistleblower with respect. This is a person who has pointed out the emperor's nakedness, and embarrassed the management. And that will never do.
And that, ultimately, is Joe Darby's great crime: He pulled the loose thread that unravelled the whole garment. And as much as the Pentagon has wanted this to be "... about getting up in the middle of the night and going somewhere you weren't supposed to go, then beating and raping people there," it was also about several people at fairly high level authorizing civilian contractors to go and recruit people to do just that. As has been said again and again and as has been lost again and again, that Charles Graner and his droogies got a little more excited and indiscreet than the civilian interrogation contractors planned does not absolve the Pentagon of guilt for a stupid, profoundly amoral plan.
For whatever small thing it's worth, the Army seems to be taking care of Joe Darby. For now. For now, he's useful to them, in their cat and mouse game with their amoral civilian overlords, so for now he gets their protection. But make no mistake, you will find few in the regular Army hierarchy who have any respect for him. After all, he did the unforgivable: He obeyed his own conscience over his orders. Not that he actually had any orders. Which is probably how the Army, at least, will come to understand this issue: What Joe Darby and Charles Graner shared was the experience of living in the absence of discipline and leadership. Graner found in that license for perversion; his droogies found in that license to sign over their conscience at his urging. Joe Darby, apparently, found his own conscience staring back at him. Does anyone really have any doubt which of the two they'd rather be stuck in the shit with?
And if the Army draws that as its conclusion, they'll be right, as far as they go. Real leadership, with an actual, accountable command -- instead of the fake, subverted command that General Karpinski apparently had -- might well have prevented the worst and most embarrassing of the excesses of Abu Ghraib (or at least limited them to the ranks of civilian contractors). But an actual moral compass, that pointed to actual humanity, was what was needed, and that was needed at levels much higher than the military command in the field.
One final thought: I find it interesting that the lead sadist in the little drama was, in civilian life, a corrections officer with a documented record of abusing his power. And that the man who took steps to stop it was a diesel mechanic, assigned to corrections work by the Army. Perhaps the moral there is "Never trust anyone who volunteers to be a prison guard."
.... You would find News Anchors interviewing a whore at 9:15 pm about oral sex and the President's Chief of Staff on Iraqui Freedom at 9:22 pm, with no change in tone, that of admiration. In such a society, CEOs are honored for "growth," without regard to negative social impact; attorneys are honored for winning cases for the guilty; PR firms boast to clients of successful hoaxes; Intellectuals are raised like little girls in a Brothel that they might work for Wealth in a Think Tank. We have praised what we should despise and dishonored all that we should hold holy. The result is what you see. Brands where God should be. Whores endorsing Whores for office, as the Pimp counts the money and calls the show a Success.
So who am I to object? .... All over American parents would spank sense into their kids. Citizens would spank their elected representatives. The CEOs would run from the lash. Rich and Famous Clients would come to me. ... A Resurrection in the Spirit; when it comes those who have sold piety for votes, or for celebrity, will not find the holy ghost to be as easily manipulated as they take the American people to be. Truth is a wind, not a shape. And it rising uncontrollably in defiance of all mortal shapes imposed upon it.
Alas, I fear the Tutor gives too much credit. He assumes a sense of guilt, where I suspect there is no sense of guilt at all. These are not Carnegie, Rockefeller or Theodore Roosevelt, possessed by the spectre of divine approval: These are Nietzschean Calvinists, convinced that their prosperity proves their purity, and willing to define new Master Moralities on the fly to keep the unwashed masses confused.
Liberals in the United States have been losing political debates to conservatives for a quarter century. In order to start winning again, liberals must answer two simple questions: what is conservatism, and what is wrong with it? As it happens, the answers to these questions are also simple:
Q: What is conservatism?
A: Conservatism is the domination of society by an aristocracy.
Q: What is wrong with conservatism?
A: Conservatism is incompatible with democracy, prosperity, and civilization in general. It is a destructive system of inequality and prejudice that is founded on deception and has no place in the modern world.
Talk about throwing down the gauntlet.
Funny thing is, once you think the proposition through, it's not so extreme: Conservatism, by definition, aims to preserve the status quo. The status quo by its own nature favors moneyed interests. And all else being equal, money will flow like blood through generations, conferring a powerful advantage on the descendants of the wealthy. It worked well for the Medici, or more recently for the Rockefellers and Kennedys.
Agre continues: "From the pharaohs of ancient Egypt to the self-regarding thugs of ancient Rome to the glorified warlords of medieval and absolutist Europe, in nearly every urbanized society throughout human history, there have been people who have tried to constitute themselves as an aristocracy. These people and their allies are the conservatives."
Of course, most conservatives don't have any such conscious agenda. They don't think this is how it works.
Well, most of them don't. That disingenuous carpetbagger Alan Keyes seems to be an exception. After all, what other real argument could there be for abolishing the direct election of US Senators? Well, according to Keyes:
"The balance is utterly destroyed when the senators are directly elected because the state government as such no longer plays any role in the deliberations at the federal level," Keyes said at a taping of WBBM Newsradio's "At Issue" program.
He said it was one of the reasons "there has been a steady deleterious erosion of the sovereign role of the states."
So it's a "states' rights" issue, I guess -- "Stop the electoral abuse of California citizens by Wyoming!!!" Or, for that matter, stop the annoying tendency of American voters to cast their local or federal votes outside of party lines.
Because, let's face it, that's what this is all about: Getting those disloyal voters back in line with their party, damnit. Keyes should just bite the bullet and advocate a shift to parliamentary government. That's what he's describing, after all: Solidification of the party system by letting the ruling parties send their Senators to Washington. Heaven forbid the Senator should be from a different party than the legislative majority leaders...
Put another way (though Keyes must at all costs avoid putting it this way): Individual voters are not qualified to make decisions such as senatorial representation. We're too stupid. Or something. Maybe we don't have enough money.
New ideas can only form where knowledge is incomplete. Ideas are a response to gaps or shortfalls in knowledge; they fill in the blanks of what is not yet understood. Where knowledge is (or appears to be) complete, there will be no new ideas.
My brother Glen pointed that out to me on Friday night, in the course of explaining to me that he had started writing his papers to push ideas in addition to data. It's true, and trivial, and yet it's non-obvious, because it requires that we invert our normal analytical behavior (finding solutions).
At times like this I wish I'd pursued math more diligently. I wonder if this is another way of stating Gödel's Incompleteness Theorem.
The Washington Post is adding another minute or two to "Washingtonienne" Jessica Cutler's Warhol Clock:
.... Jessica and her friend slid onto stools in the cool dimness of Bullfeathers, a popular Capitol Hill watering hole. Jessica ordered a Southern Comfort. It was the middle of the afternoon on May 18.
"What happened to you today?" the bartender asked.
"I got fired. I lost my boyfriend and my job, and it's my birthday," Jessica remembers telling him.
Of course, this is all so three-months-ago, now, but it's rearing its ugly head again, and I'm tempted to wonder why; but here it is, so I'll put into bits some of what I thought at the time, aged by a couple of months to bring a little nuance to the brew. Sit back and listen to a story, then, children, all about a girl who thought she was invisible. Or maybe she just didn't think at all. The details are sordid, and while some are a matter of dispute, the core sounds disturbingly familiar: A young woman takes a quasi-glamorous job in a Senate office that doesn't pay enough to support her in the style to which hip modern young middle-class people believe they should be accustomed. She discovers that she can use sex to improve her position, often through direct cash payments. One day, she begins to blog; her day-to-day life has the sordid cachet that seems to connote credibility among the denizens of the Fox Generation. And this time, unlike the Plain Layne soap opera, people can actually corroborate the story.
One day, again, she counts her lovers and they total six. "There are seven days in the week," she says in her own defense. The "boyfriend" she "loses" is only one of six men she is sleeping with on an ongoing basis. Two of the others pay her regularly. One of those is a married political appointee, who is only interested in anal sex.
I have some empathy for poor little Jessica, but little sympathy. She bragged about her job and her connections and talked up her exploits as though they were something she was proud of -- though reading between the lines, she seemed to me to be running on bravado as much as anything. After all, bravado -- the willingness to take that dare, even if it's from yourself -- leads an awful lot of people into trouble who are otherwise perfectly "nice". I see her as buying into a set of ideas about what she ought to be like, and how she ought to use her sexuality -- market-based ideas, really. She had an "asset" to sell, and she "sold" it, and she's selling it still, looking for ways to drag out her 15 minutes long enough to close the book deal. All stuff, by the way, that I'm quite sure all her friends are eager to tell her are "smart" things to do.
But even if it's all bravado, all behavior to impress her herd-mates, she still had to choose to live that way. It probably wasn't one big choice. It would have been the result of a bunch of little choices, like making up an imaginary boyfriend to impress the other girls in school. And it's interesting to note that the style of her presentation is much like the style men use to tell their own sexual lies. (And while the fake-boyfriend story isn't substantiated, a pattern of other lies-to-impress has been.) It's not that I regard it as scandalous when women adopt male sexual behaviors; in fact, I rather think it's kind of pathetic. After all, what does identifying with the oppressor ever really get you except self-repression?
I also lack sympathy for the Conservative backlash. It's unexamined; it lacks credibility. After all, this "girls gone wild" culture is pushed most assiduously in the broadcast realm by "Conservative" stalwarts like Fox Broadcasting and Murdoch's News Corp. That tells me that there's more than a simple "conservative" v. "liberal" opposition going on here. It seems clear that in a culture driven by consumption, business prospers as we become more neurotic, and the simplest way to drive that is to increase the degree of sexual stimulation at the same time that you increase the pressure to conform to conservative sexual moral strictures. This is nothing new, and it's not driven by any genuine philosophy of enlightenment: It's as old as advertising, and it's driven by the desire to make profit.
So both the Conservatives and the Libertines are fellow-travellers in this, as far as I'm concerned.
Well, within a few hours, I'm off to Interlochen for this year's version of last year's fun -- bigger and bolder this time, since the entire family will be involved. (Well, not the entire, ramified family -- 20 people -- just 15 of us.)
I'm a day behind schedule, just like last year. At least this time, my day behind schedule isn't Civic Holiday spent sans auto in a motor lodge in Woodstock, Ontario. A few things to button up before I can put myself to bed for the night, then looking to hit the road by 7 in the morning (to beat the Kodak traffic). Car is packed, oil changed, fluid levels checked, just a few last-minute to-do list items... Still, they'll keep me up for a few hours, yet.
This year, at least, I hope to be a bit more assiduous in completing my trip report; maybe I'll even put it online (though when real people are involved, ...).
I'd say, "wish me luck", but when push comes to shove, I have to admit that I think being wished luck is a tad...unlucky. So I'll see you all back here ... when I see you.
Sometimes I have to turn off my cynicism filter and take things at something like face value. The new experiment in "open source journalism" at Bakersfield's The Northwest Voice is a good example. Starting three months ago (May 2004), they began deriving their news content directly from community members, contributed via the web.
Northwest Voice describes itself as a "community newspaper", but since they're "carrier delivered" to 22,000 homes, they're clearly really a shopper. I.e., their emphasis is on the ads, with actual content only a sweetener to get people to actually leaf through. "Community newspaper" is more commonly applied to takeaway-distributed newspapers like our own City Newspaper, the Ithaca Times, or the venerable Boston Phoenix.
What a move like this does is allow them to easily and inexpensively move upscale from "shopper" territory to the realm of more sophisticated "community newspaper", without the cost of hiring reporters. Editors are more cost-effective, because they can handle many more stories in a day than could a reporter, even if they're doing some rudimentary fact-checking. They're not the only ones to have this idea -- look to Belfast, Camden & Rockland, Maine's Village Soup for a more traditional (i.e., harder to use) rendition of a similar idea.
That cynicism filter sees this as being all about money and business -- and for that matter, the publishers are quite willing to spell that out. They're clearly in this to improve their position and their financials. They've paid good money to buy an integrated content workflow management system (albeit something that appears to use appropriately simple technical solutions).
But at the end of the day, what really matters is that people are being brought back into the news process. This is a move that makes commercial sense for the Northwest Voice, but as they're successful, they can give implicit aid and comfort to non-commercial and less-commercial ventures like Brattleboro, Vermont's Geeklog-based iBrattleboro -- based on a sparsely-configured implementation of commodity, open-source content management software. Aesthetically, the Voice seems closer to iBrattleboro than to Village Soup, and that's a good thing. It will make them more interesting to their customers, for sure, and if they can find a shared win between community involvement and commercial success, so much the better.
Another thought: It's important to note that this is not blogging. This is edited news, that happens to be provided by the public. Clearly, it's inspired by blogging, but it illustrates something that many boosters of Bloggism have not been willing to accept: That it cannot at any point claim to be an end-form; that it cannot, in fact, ever certainly claim to be anything but a transitional, enabling form. Blogs will certainly exist in a year or two or three; but the things they spawn will not look like them, and will not care what the community standards of "Blogistan" are. Nor should they.
Here's how I want to work: I want to be able to just note stuff down, wherever I happen to be at that moment, and have it get stored and automatically categorized, and be available for publication wherever I want from wherever I am, whenever I want to. This has been an achievable dream for nearly ten years -- people are constantly hacking together systems to do just that. But we're stuck in a technologically-determined rut that keeps these solutions from being developed.
I've been thinking about these things a lot, and decided it was time that I wrote it all out, to organize my own ideas as much as anything else. So here's part one, where I try to unpack what it is that I'm really asking for, and start to get a sense for what's not working now, and why. So, as a separate story (because they're long, and would push everything down the page and out of sight), here's how I want to work...
... will be happening sometime this weekend. Site will be down briefly. Posting will be shut off. Not to panic.
Update: The 4.4.2 upgrade went through with no apparent hitches. Looking at a couple of promising modules for addition to the site; will probably upload a bunch of module code so I can experiment at leisure. Documentation is (as usual) sparse, so I'll have to install these to see what they actually do.
... and will be looking at a few HTML correction modules, menu modules, upload modules, but probably not today.
Martha Stewart got five months for lying to the SEC about her insider trading activities.
The judge said she chose the minimum sentence because Stewart had no criminal record, the public interest had been served and she believed the defendant had "suffered and will continue to suffer enough."
"The sentence I have just imposed is, in my opinion, the minimum permitted under current law," the judge said. "I have not lost sight of the seriousness of the offense of which you have been convicted. Lying to government agencies during the course of an investigation is a very serious matter."
It will be interesting to see how this plays. I'm with the crew that believes she was selected as a scapegoat -- an outsider, a buccaneer -- not really a member of the club, and so regarded as expendable. (Though the water-carriers for the club would naturally deny those reasons.) At the end of the day today, it remained true that she had been pursued zealously and at great expense while men who bilked to the tune of much greater sums remained unprosecuted.
For her part, Martha doesn't seem more than pro forma contrite: "I'm just very, very sorry that it's come to this, that a small personal matter has been able to be blown out of all proportion and with such venom... It's just terrible." [Reuters] (Of course, she was the one that made this more than a "personal matter" by lying to investigators....)
Every now and then, I get a referer [sic] hit on "Richard Pindell."
Why? Well, I have a couple of quotes from him on my random quotes rotation, and they show up on search engine crawls. When I took his classes 20 years ago (spring and fall of 1984), he was in the habit of writing a quote on the blackboard (yes, 20 years ago!) before each class. He'd pull out a little black notebook -- one of those small three-by-five, six-ring affairs, well-worn and sweat-damaged -- and leaf through it until he found what he wanted, and he'd write. Only very seldom would he speak before he was finished. Most often, he'd finish writing, then look at it for a moment, as though to make sure that's what he really wanted to say.
One time, he marched in, took up his chalk, and started writing right away:
"To get to eden, follow the snake."
... and left it unsigned for a moment while he looked at it. Then he glanced quickly, slyly, around the room, marched back up to the board, and signed it: "Richard Pindell."
He turned to us and remarked, with a mischievous smile: "I don't remember saying that. But one of my students many years ago insisted that I had, and it sounds like something I'd like to have said."
"Dick" Pindell flaunted his southern-ness; he was fascinated by Faulkner and Flannery O'Connor, and wallowed in masculine doubt before it was cool (except to impressionable undergraduate girls).
In those days, he was a bit of a cult character. His lectures gave the impression of being stream of consciousness, but I know from comparing notes with friends that the material covered was surprisingly consistent. Was he a good teacher? Who can say from the volunteered opinions of undergrads? I can only say that I was rather fond of him, myself; he helped me ask questions about what I was reading, which is really what his job was, after all.
He never struck me as a happy man, though. I hope that's changed, at least a little.