Serving the will of Tommy Westphall since 2004.

escoles's blog

Karma, language, and action

In following the Horgan debate on MeFi, I encountered this, in a response to a Horgan critique of Buddhism:

.... Karma literally means "action." Action always produces results and so the word karma is often misunderstood as referring only to the results of our actions, not the actions themselves. In fact, action and its results are one and the same. Time, the thing which makes us see them as separate matters, is the illusion. Time is no more than a clever fiction we humans have invented to help organize stuff in our brains. ....

... which is, in turn, no more than a clever fiction that Buddhists have invented to help organize stuff in their brains. Because, of course, if the world is an illusion, then we can't prove the world is an illusion.

Tricksy, these Buddhists, is....

But I digress, as usual. What really interests me is the simple assertion that actions and their results are "one and the same", without any attempt to explain what that means. If you parse the language, what doubtboy is really saying is that the Buddhist term karma isn't a synonym for "action", it's a cognate. "Karma", that is, doesn't "mean" "action" -- it "means" (in English) "action plus result."

The two different conceptualizations of "action" let you reason to different ends -- they give you different kinds of power. One gives you power to include, the other gives you power to divide. As Pirsig pointed out in Zen and the Art of Motorcycle Maintenance, the power to divide is powerful, too.

Where Buddhist practice starts to get really interesting (to me) is where it allows you to use both conceptualizations of "action" simultaneously. It's my experience that many dedicated students of Buddhism don't grok this possibility; I expect that doubtboy, like Pirsig, does.

The problem of trackback and comment spam in Drupal, and one way to address it

(I just posted a version of the following over at Drupal.org, in their "Drupal Core" forum. I doubt it will make much of an impact, but I had to try...)

I propose that there is a problem with the way program-function URLs are formed in Drupal that makes Drupal a disproportionate target for trackback and comment spammers.

The problem with comment and trackback spam in Drupal is this: It's too easy to guess the URL for comments and trackbacks.

In Drupal, the link for a node has the form "/node/x", where x is the node id. In fact, you can formulate a lot of Drupal URLs that way; for example, to track-back to x, the URI would be "/trackback/x"; to post a comment to x, it would be "/node/comment/reply/x". So you can see that it would be a trivially easy task to write a script that just walked the node table from top to bottom, trying to post comments.

Which is pretty much what spammers do to my site: They fire up a 'bot to walk my node tree, looking for nodes that are open to comment or accepting trackbacks. I have some evidence that it's different groups of spammers trying to do each thing -- one group seems to be switching IPs after a small number of attempts, and the other tends to use the same IP until I block it, and then takes a day or so to begin again -- but that hardly matters. What does matter is that computational horsepower and network bandwidth cost these guys so little that they don't even bother to stop trying after six or seven hundred failures -- they just keep on going, like the god damned energizer bunny. For the first sixteen days of August this year, I got well over 100,000 page views, of which over 95% were my 404 error page. The "not found" URL in over 90% of those cases was some variant on a standard Drupal trackback or comment-posting URL.

One way to address this would be to use something other than a sequential integer as the node ID. This is effectively what happens with tools like Movable Type and Wordform/WordPress because they use real words to form the path elements in their URIs -- for example, /archives/2005/07/05/wordform-metadata-for-wordpress/, which links to an article on Shelley Powers's site. Whether those real words correspond to real directories or not is kind of immaterial; the important point is that they're impractically difficult to crank out iteratively with a simple scripted 'bot. Having to discover the links would probably increase the 'bot's time per transaction by a factor of five or six. Better to focus on vulnerable tools, like Drupal.

But the solution doesn't need to be that literal. What if, instead of a sequential integer, Drupal assigned a Unix timestamp value as a node ID? That would introduce an order of complexity to the node naming scheme that isn't quite as dramatic as that found in MT or WordPress, but is still much, much greater than what we've got now. Unless you post at a ridiculous frequency, it would guarantee unique node IDs. And all at little cost in human readability (since I don't see any evidence that humans address articles or taxonomy terms by ID number, anyway).
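To put some rough numbers on that, here's a quick sketch in Python; the post counts are made-up assumptions, purely to show the scale of the difference:

    # Rough sketch: how much harder blind guessing gets with timestamp node IDs.
    # The post counts below are made-up assumptions, just for scale.
    posts_per_year = 500                 # a reasonably busy site
    years_running = 3

    # Sequential IDs: every integer from 1 to the node count is a valid target.
    valid_nodes = posts_per_year * years_running          # ~1,500 -- all guessable in order

    # Timestamp IDs: valid IDs are scattered over every second of the site's life.
    possible_ids = years_running * 365 * 24 * 60 * 60     # ~94.6 million candidate values

    print(f"sequential: {valid_nodes} guesses cover every node on the site")
    print(f"timestamp:  about 1 guess in {possible_ids // valid_nodes:,} hits a real node")

The 'bot could still find nodes by spidering the site's own links, of course -- but that's exactly the extra per-transaction cost that, as argued above, spammers don't want to pay.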

Some people will immediately respond that this is "security through obscurity", and that it's therefore bad. I'd argue that they're wrong on two counts: First, it's not security through obscurity so much as security through economic disincentive; second, it's not bad, because even knowing exactly how it works doesn't help you very much to game it. The problem with security through obscurity, see, is that it's gameable: Once you know that the path to the images directory is "/roger/jessica/rabbit/", you can get the images whenever you want. By contrast, even if you know that the URL to post a comment is "/node/timestamp/reply/comment/", you're not a heck of a lot closer to getting a valid trackback URL than you were before you knew that.

Because They Can. (Cringely on Google)

'Cringely' offers an anecdote to illustrate Google's power over the soul of the market: He goes into the bank to deposit a check, gets in line behind a kid who insists on keeping ten feet of empty space in front of him.

... The queue was perhaps 20 feet long and right in the middle was this 10-foot gap. I was in no hurry, I thought. That gap was not going to cause me to get to the teller more than a second or so later than I might if the gap was closed. No problem.

Only it WAS a problem. As the minutes passed that gap started to drive me insane. Finally I asked the kid to move forward.

"It was making you crazy, right?" he asked, clearly enjoying the moment.

(Ah, yes, the joys of being a self-important little putz... but I editorialize....)

Google has something over $6B -- that's six billion dollars -- in cash on hand right now. That's cash -- not credit, not valuation, but real money that people have paid them. And everyone wants to know what they'll do with it.

The day when six billion could buy three Eisenhower-class aircraft carriers has long passed, but you can still make a pretty good splash with that much cash. So, as Cringely points out, all the gorillas in technology are sitting around waiting to see exactly what it is that Google will do. Which gives Google an amazing amount of power -- as long as they don't actually do anything.

Putting things in perspective, Google has been really really super good at exactly one thing: Self-promotion. Sure, some of their technology is pretty good, but there's really no evidence that their algorithms are really any better than, say, Teoma's. What they do have is more power. There's a saying among marketing folks: "Go big or go home." Google went big, more or less from day ten or so. "Day ten" because they had to get the money to go big with, first. And that's where self-promotion came in.

I distinguish between "marketing" and "self-promotion" here because I think it's important. Google has always rooted its mystique in the cult of personality that's coalesced around these mythical beasts "Sergey" and "Larry". That's suffered a little, no doubt, as a result of Eric Schmidt's incredible childishness in response to CNet feeding him a half teaspoon of his own company's medicine. Nevertheless, Google still builds its reputation in large part out of the sheepskins of its PhD-filthy workforce.

As Cringely points out (and as I've pointed out for years, myself), though, Google's technical solutions, much like Microsoft's, are seldom really cutting edge; but because of their market dominance, people more or less have to use them. What Google has done well is mobilize the good will of geeks; which is to say, what they've done well is to work the cult of personality for all its guerilla marketing mojo.

And now, all they have to do is twitch -- or even hint at twitching -- to make gorillas jump. Rumors abound: Google is buying up dark fiber, so they can run their own internet; Google is building a vast new data center, so large that it will need a major hydroelectric plant to power it; Google is producing their own desktop OS. Sometimes they're even true: Google is in the process of rolling out its own "desktop", a search/chat/email client that will allow it to entrench even more deeply and to enrich still further its vast database of geographically-linked internet behaviors.

That database is the elephant in the room in any discussion about Google, though of course it's useless without the market-muscle to deploy (and grow) it. In military terms, Google's market mojo pairs up with its database like big satellite-guided bombs pair up with the geographical databases that tell you where the targets are. It's their market position that lets them get the database; the database is what's going to guarantee their market position for years to come.

Will the Real MSM Please Speak Up?

The neo-conservative cabal has gotten back into gear and mobilised its counter-attack, as evidenced by this report from ABC affiliate KGO:

This week Simi Valley California Gold Star wife Melanie House flew to Idaho for a protest and then flew to Crawford.

ABC7's Mark Matthews: "Can you tell us if you're getting help in airfare to come down here?"

Melanie House: "What difference does that make?"

There is real reluctance to talk about who's paying, and the P.R. machine that's promoting Cindy Sheehan, but not everyone here is completely comfortable with it.

Perhaps it's merely my own subjectivity talking, but it seems to me that there's a real reluctance to talk about who's paying for and calling the shots with the P.R. machine that's promoting the ends of the neo-conservative cabal. I can't recall many discussions in the MSM about who funds right-wing "popular efforts." In this piece, there's some token attention to balance, but only about four of the piece's twenty-nine paragraphs are devoted to considering who funds the Republican counter-efforts -- and those paragraphs are buried at the bottom of the pyramid.

Some outlets do try to play fair. NPR, for example, points out [listen] that the President's counter-campaign has been mobilised in states where he has a strong base of support, and in front of hand-picked ("reliably friendly, mostly-military") audiences. But then, much as the punditocracy would like us to believe otherwise, NPR is not part of any usefully-defined "main-stream". Unless, of course, someone wants to explain how they could be part of the mainstream and part of the fringe at the same time. (Hint: There's a way to do it. But it might require that you tread on some intellectually slippery ground.)

It's frightening enough that the Bush regime are willing to do this to shape public opinion; it's even scarier when you start to become convinced (as I did a long time ago) that our President believes that the hand-picked audiences represent a real cross-section of American views.

"I've met with a lot of families," the President has remarked. "She doesn't represent the views of a lot of families." (Which of course means that she still could represent the views of a lot of families, since "a lot" does not imply "a majority." But I digress.) Now, he could be speaking with painful literalness. Parse out the sentence: It literally states only that a lot of people don't agree with Cindy Sheehan. Big news. A lot of people don't agree with the manufacturer's contention that Marshmallow Fluff sandwiches are wholesome food and not candy. (They're called "good parents.")

I fear it's more sinister -- or at least, more frightening -- than that: George W. Bush does not expose himself to contrary opinions unless he's compelled to, and it's hard to compel the President of the United States to do anything.

But again, I digress. The point is this: People who believe in the existence of a leftist media bias really need to do two things: First, start listening, really listening, with an open mind, to the news; second, come to understand that "leftist" is not synonymous with "stuff I don't like."

Really, I suppose the second should come first. It's kind of a necessary step before you can understand that people on the left hate some of the same stuff that people on the right hate: Crime, lying politicians, moral turpitude, callous disregard for human life, just to be going on with. But since the rightist positions have been almost wholly co-opted by religionistic moralists (and this is nothing new, by the way), every message that people on the right get tends to come in black and white terms: You are with us or against us. You love Jesus, or you love child pornography. You back the President, or you hate America.

It's a simplistic portrayal, sure; but the people in charge of trying to drive that portrayal like it that way. It's easier to manipulate people if you have hot-button terms arranged into superficially simplistic (and thus, ambiguous) statements. That most people don't actually adhere to these positions once you start to look closely at their views testifies both to the effectiveness of the technique (after all, people act in great numbers to support these simplistic ideals) and to the stubbornness of human intellect (after all, people still don't believe it entirely, once you cut through the ambiguity to get to their actual opinions).

Imagination Failure of the Moment

Failure of imagination is often indistinguishable from arrogance.

Here's how The Blue Technologies Group conceptualizes the ideal "writers'" editing environment:

The concept of single documents in the classical sense is dismissed. Text elements take their part and are organised in a project, the container.
Every text element has two editing levels: the "standard" text and a "note pad".
The ability to format texts in an optical way (bold faced, italics, etc.) is omitted - you can divide paragraphs into levels and set markers instead.

It's passages like this that drive home to me how sorely and sadly in need most people are of a little applied personality theory. Because it's painfully clear to me just from the language that they use that their word processor, Ulysses, is going to be a painfully inappropriate tool for the vast majority of writers.

I know that because Ulysses has clearly been defined to suit the personality of a particular type of writer. The words and concepts its creators deploy tell me that. They talk about "projects", "markers", "levels" (of paragraphs?). These are organizational terms; they're conceptual terms. Using them to appeal to "writers" exposes the assumption that all writers think in similar ways. It implies that "writers" will want to restructure the way they think about producing texts such that they're vulnerable to being organized in "levels", and that they'll find it a benefit to replace italics and boldface with "markers".

My own experience working with writers who need to maintain HTML demonstrates to me abundantly that people aren't typically very interested in replacing italics with an "emphasis" tag. The idea that "italic" is visual and "emphasis" is conceptual (and hence, independent of presentation) is too abstracted from the reality of writing for them -- too high-concept. For them, the reality of writing is that emphasized passages are in italics, and strongly emphasized passages are in boldface.

And I also see that while they talk about eliminating distractions, they produce an application with a cluttered and confusing user interface that looks to me like nothing so much as the UI of an Integrated Development Environment (IDE). While I've grown accustomed to the metaphor, I can remember when I found it cluttered and confusing, and I know from long experience that most people find those UIs as confusing as hell.

Now, this may be a great environment for some creative people. But based on what I know about personality theory, that subset of people is going to be very small -- something less than 7% of the population, most likely, and then reduce that to the much smaller subset that are writers who work on substantial projects.

I might even try Ulysses myself, for whatever that's worth; but if it looks to me like it would be the slightest nuisance to produce reviewable copy (for example, if I have to spend ANY TIME AT ALL formatting for print when I send it to friends and colleagues for review) then it's more or less worse than useless to me: Any time I save by having my "projects" arranged together (and how many writers do I know who organize things into discrete projects like that?) would be wiped out and then some by time wasted formatting the document for peer-reviewers. And I haven't even started to talk about trying to work cooperatively with other people....

The (partly valid) response might be that if writers would only learn to use it correctly, and adopt it widely enough that you wouldn't need special formatting to send a manuscript out for review, then Ulysses would be a fine tool. Of course, that's the same kind of thing that Dean Kamen and his true believer followers said about the Segway: If we'd all just rearrange our cities to suit it, the Segway would be an ideal mode of transport....

It's not the marketing I object to -- that will either work or it won't -- it's the arrogance of presuming that they've found the True Way. Because the implicit lack of interoperability that goes along with defining a new file storage protocol (and I don't care how you dress them up, they're still files) basically inhibits Ulysses users from working with other writers, and therefore implies that it's a truly separate way, if not a purely better way. Ulysses looks to me like a tool that fosters separateness, not cooperation -- isolation, not interaction. It's farther than ever from the hypertext ideal.

But then, I suppose my irritation is indicative of my own personality type.

London is Burning

I'm having flashbacks. I managed to miss the Madrid attacks, or maybe there's just something in the circumstance of the moment that makes it hit me harder this morning. But I'm feeling a little numb right now. All I can think about is how convenient this will be for people who want to lock things down still more and more....

I'm not so worried about Britain; they've faced this kind of thing before, though it's a long time since they had to deal with it at this volume. And they're tougher (for lack of a better term) than we Americans, I think, about the preservation of their freedoms. They have a deeper understanding of the compromises involved and required to ensure "security". But that's a whole longish essay in its own right.

Co-ordinated attacks -- six bombs in five locations. [Correction: Four blasts, three of them deep in London's oldest tube tunnels.] It's not like an IRA hit, and my small, cynical voice says the IRA could get some good PR out of this by pointing out that they often gave a heads-up and didn't try to hit in more than one place. But my contingency-planner's brain thinks that those IRA hits will have caused their emergency response to be cleaner, more efficient.

And while I'm numbed, I'm thinking also: What will be made of this by a people who've been taking casualties from terrorists, off and on, for more than thirty years? (Or, if you stretch your imagination to the Colonies, longer than that.)

Will there be instructive contrasts with Madrid (where they'd also been taking terrorist assaults for decades), and with the United States (where we largely make do with crime)?

Anti-Fittism Of The Moment

Big targets mean big distractions.

I'm sitting here listening to Whadya Know on the radio while I write. While I do this, I've got a couple of applications and part of my desktop visible on screen, and a cluttery clumsy app launching device pinned to the left edge of my screen. (I move my Dashboard to the left edge because I value the vertical screen space too much. More vertical space means more text on screen, which means more chance at understanding what the hell I'm looking at. Which is to the point, believe it or not, but I'm not going to go there right now.)

And I'm getting distracted by it all. Part of it is that I'm over 40 and wasn't "raised to multi-task" (as so many people raised in the age of multi-tasking OSs and multiple-media-streams seem to think they have been). But part of the problem is all this extraneous visual noise -- stuff I don't need to see right now, like the "drawer" to the left of my application window that lets me see the subject lines of previous journal entries and, more to the point, blocks out a bunch of other distracting stuff in the windows behind this one. Obviously, I could close the "drawer" and widen my editing window to cover them, but then I'd have a line-length that would be difficult to eye-track.

Anyway, the point of this (man, I am getting distracted) is that having all this clutter on my screen distracts me. Presumably that's why MacJournal (like a lot of "journaling" applications) has a full-screen mode that lets me shut out everything else if I so choose.

Fitts's Law is increasingly invoked these days to justify a lot of design decisions, like pinning a menu bar to the top of the screen for all applications, or putting "hot zones" in the corners of the screen. It's invoked as a rationalization for putting web page navigation at the edge of the page (and hence, presumably, at the edge of a window).

Interestingly, it seldom gets used as a rationalization for making navigation large.

Fitts's Law reduces to a fairly simple principle: The time it takes to hit a target with a mouse pointer is a function of the distance to the target and the size of the target. That is, it's quicker and easier to hit a big target with a mouse pointer than it is to hit a small one.
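For reference, the usual (Shannon) formulation of the law looks like this -- a quick sketch in Python, where the constants a and b are illustrative guesses rather than measured values:

    # Fitts's Law, Shannon formulation: MT = a + b * log2(D/W + 1),
    # where D is the distance to the target and W is the target's width
    # along the axis of motion. The constants a and b depend on the device
    # and the user; the values below are purely illustrative.
    from math import log2

    def movement_time(distance, width, a=0.1, b=0.15):
        """Predicted time, in seconds, to acquire a target."""
        return a + b * log2(distance / width + 1)

    print(movement_time(distance=800, width=16))  # small, distant target
    print(movement_time(distance=800, width=64))  # same distance, 4x the width: faster

Same distance, four times the width, and the predicted acquisition time drops noticeably -- which is the whole argument for making navigation large.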

Fitts's Law is also often cited as demonstrating that it's easier to hit targets that are at a constrained edge or corner; that's as valid a principle as Fitts's Law, but it isn't really implied by it. So Fitts's Law gets cited to justify things like parking the active menu bar at a screen edge. It's easy to get to the edge of a constrained screen: Just bang your mouse or track-pad finger or pointing-stick finger over to the screen edge and it will stop -- won't go any farther. Bang in the general direction of the corner, and the cursor will behave like water and "flow" into the corner, so the corners become the easiest thing to hit on the screen. Tognazzini, among others, uncritically and inaccurately cites this as an example of Fitts's Law in action. I don't know who came up with this conflation first, but Tog is the most vocal exponent of it that I'm aware of, so I'll probably start referring to it as "Tognazzini's Corollary."

(Aside: Obviously this only holds for constrained edges, as on single-screen systems. On two-screen systems, or on systems with virtualized desktop scrolling, it's a little more complex. Less obviously, this principle is more or less meaningless on systems that are actuated with directly-positioned devices like touch-screens, and it requires that people engage in some selective modification of their spatial metaphors. But that's another topic for another time.)

It's interesting to me that Fitts's Law isn't applied to the size of buttons, because that's its most obvious implication: If you want to make an element easier to hit, the most obvious thing to do is make it bigger. Yet I don't recall ever seeing it invoked as an argument for making screen elements larger, or discussed when people call for making screen elements smaller. Which makes me suspect even more that Fitts's Law is often more a matter of Fittishization than Fittizing.

Because the reason people (read: designers) don't want to make things bigger is obvious: It doesn't look good. Things look better (or at least, cooler) when they're small. That's why designers who'll lecture you endlessly about the virtues of design usability have websites with tiny text that has poor intensity contrast with its background. So Fittism really tends to serve more as an all-purpose way of rationalizing design decisions than as a way of making pages or applications more usable.

In any case, Fitts's Law isn't really nearly as important as its advocates make it out to be. The current rage for Fittism ("Fittishism"?) over-focuses on the motor experience of navigation and de-emphasizes the cognitive aspects. The reason for this is that Fitts's Law can be very easily validated at the level of motor interaction, while the cognitive aspects of the problem are much harder to measure -- and so tend to get ignored.

And that's not even considering the effect of edge detection in the visual field. This is not strictly a motor issue, and it's not really a cognitive issue -- though it has cognitive aspects.

For example, if the menu bar is always parked at the edge of the screen -- let's say at the top edge -- then it becomes very important that users be able to have confidence that they're using the right menu. If menus are affixed to a window, then you know that the menu applies to the application to which that window belongs. If menus are affixed to the top of the screen, you are required to do cognitive processing to figure out which application you've got in the "foreground".

(Another aside: That's fairly difficult to do on a Macintosh, which, ever since the advent of OS X and Aqua, has had very poor visual cues to indicate which application window is active. Title bars change shade and texture a little; text-color in the title bar changes intensity a little; the name of the application is appended to the beginning of the menu bar, in the space that people visually edit out of their mental map of the page in order to limit distractions. In other windowing environments -- Windows and KDE spring to mind -- it's possible to configure some pretty dramatic visual cues as to which windows are in focus, even if you ignore the fact that the menu you need is normally pinned to the damn window frame. It's trivially easy on an OS X system to start using the menu without realizing that you're looking at the menu for the wrong application. I don't do it much myself anymore, but I see people do it all the time.)

But I'm getting off point, here: I started this to talk about distractions, and I keep getting distracted....

Why is there no decent Mac word processor?

The late Isaac Asimov famously resisted computers for many years. With good reason: Until relatively late in his life, they couldn't have kept up with him. His workspace was infamous. He kept several long tables in the attic of his town house, arranged in a big "U", with an IBM Selectric (the fastest typewriter available then or since) every few feet. Each smaller workspace was set up to work on a different project, or part of a project. When he got bored working on one thing, he'd simply roll over to another project.

I got into computers to use word processors. That's not true: I got into computers to manage prose. That was really my dream: To manage prose, which meant managing ideas, managing text, searching seamlessly through stuff that I'd written, changing on the fly, getting rid of hard copy, automating tedious tasks.... I imagined a day when I'd be able to store large amounts of text and search through them easily. I imagined a day when I'd be able to effortlessly switch back and forth between projects the way that Asimov would wheel from one Selectric to the next.

That was in the mid-80s; I'm part of the way there. I use (and have used for something around ten years) a multi-tasking computer that lets me keep multiple projects in progress (dare I say "a la Asimov"?); with wireless networking, I can get connected to the Internet in a surprising and growing number of places; I have a small, light, powerful laptop that lets me do real work when away from an "office."

But I still don't have the text tools that I really want. OS X 10.4 has nice meta-data-aware indexing, implemented in a fairly efficient way; it also has good solid multitasking and power management. But it's still lacking one thing:

It doesn't have a decent word processor.

What would a word processor need to have for me to regard it as "decent"? At a high level, it needs to fulfill three basic criteria:

  1. It has to have good usability characteristics.
  2. It has to support all of the basic, required business functionality that people nowadays expect from a word processor.
  3. It has to be able to interchange files with no meaningful loss of information or formatting with the people with whom I need to work.
Those are actually pretty loaded criteria. Let's break them down a little:
  1. Usability: By this I mean that it has to stay out of my way and let me work. It has to not require that I do a lot of things with the mouse. It has to not place unusual constraints on me, like saving everything into some proprietary "project" or "drawer".
    1. Good interaction performance: Screen writes need to be fast and free of artifacts, document navigation actions like page up and page down need to be quick.
    2. It must be easy to do basic, standard things like move to different points in a document. There are conventional ways of doing this that might be CUA, but are probably just convention: Ctrl-End to move to the end of the current document, Ctrl-Home to move to the beginning, Ctrl-Up-Arrow to go back one paragraph, etc. You will find these conventions honored on the majority of Windows (and *nix) editors and word processors, with spotty acceptance on the Mac.
    3. It must at least be possible to de-clutter the visual field -- to remove extraneous noise. As an example, many word processors have for many years offered a "full screen" mode that brings the page into focus and blocks out all other programs. That's an extreme example; Word and OpenOffice 2.0 have a "draft mode" that's pretty good in that regard.
  2. Features: Again, pretty loaded, but at a minimum I think a useful business word processor absolutely has to support the following -- these are things that I have found myself using again and again in preparing business documents, and they save incredible amounts of time:
    1. Automatically formatted (and numbered) lists and outlines. This might seem picky, but if you don't understand the need for it, you haven't really created many complex business documents. Consider a project plan document that has a numbered list of tasks. On review, the order changes. If your list has 50 items, you might need to change 50 ordinal numbers by hand. (This has been available in MS Word, WordPerfect, StarOffice/OpenOffice, and many others for many years.)
    2. Section-sensitive headers and footers. I.e., when you start a new section, you can change the presentation or content of the headers and footers.
    3. Automated tables of contents.
    4. A simple way to format the first page of a simple document differently than the subsequent pages. This has been possible for many years in Word and WordPerfect.
    5. It must implement style-based formatting at at least the character and paragraph levels; more than that (such as page styles) might be overkill, since my experience so far suggests that they don't interoperate well. Furthermore, though, it must be possible to import styles from other documents or from some kind of repository. The feature is dramatically less useful without that capability.
  3. Interoperability: The software must, must, must be able to both import and export files -- files, not text, but files (this is important, guys, please listen) -- in one or more widely used formats. For practical purposes right now, that means that it must be able to interchange files with Word 2000 and later versions on the Windows platform. OASIS OpenDocument format compatibility would be nice from a future-proofing standpoint, but I'm already seeing some indications that the OpenDocument format may go places where it's not very inter-operable with Word's native RTF. So interoperability with RTF, clumsy and locked-in as it is, is what's needful.
    1. No information should be lost in an import/export. E.g., you should never ever lose footnotes/endnotes; you should not lose change tracking; you should not lose bookmarks.
    2. No formatting should be altered in an import/export. Obviously that's easier said than done -- especially with a poorly-documented format like RTF -- but OpenOffice and Word have come surprisingly close.

It's a fact -- and this is not seriously disputable by any honest and experienced user of both platforms -- that the Windows word processors (and to a lesser extent the Linux ones) beat all but one (arguably two) of the available Mac word processors hands down on all these counts.

I leapt into using a Mac with the assumption I'd be able to find what I needed when I got here, and for the most part, that's been true. Some glaring exceptions: There really aren't any good music players (iTunes is a sad and cynical joke), and -- most glaringly -- there are no (repeat, no; repeat, no) capable, stable, usable, general-purpose word processors.

The field of modern word processors is pretty small to begin with. On Windows you've basically got Word, OpenOffice, and WordPerfect, with a few specialist players. Down the feature ladder a bit you've got AbiWord lurking in the shadows: It's pretty stable on Windows, and does most of what you'd need to do for basic office word processing, but it has some problems translating Word docs with unusual features like change tracking.

On *nix, you've always got OpenOffice and AbiWord. In addition, you've got kWrite, which is about on feature-par with AbiWord, but tends to remain more stable version to version.

To be fair, there are a lot of word processors available for the Mac. But few of them really fill the minimal requirements for a business word processor, and those few fail in critical to borderline critical extended requirements. And what's most frustrating for me is that it's been that way for years, and the situation shows no real signs of changing.

Here are the players on the Mac:

Word (Mac)

The Good: It supports all the basic, required business features.

The Ugly: Performance sucks, and so does price.

OpenOffice 1.1.2
The Good: Supports all the basic, required business features.
The Ugly: The two big problems are that it requires X11 and that it's not up to version with OO on the other platforms -- I don't think, anyway; truthfully, I haven't tried it yet, but my expectation is for poor performance. In any case, OpenOffice is in general clumsier than Word on a PC. That may not be true versus MacWord. Also, it does lack some Word features I've come to be very, very fond of: Chapter navigation in the sidebar, and (this is a real biggie) the Outline Mode document view.
NeoOffice/J 1.1.4

The Good: Price -- it's free. Features -- it's got all the basic features, just as OpenOffice 1.1.2 does. By all accounts, it's more stable and performs better than OOo 1.1.2 does on a Mac. This is what I use every day, for better or worse. It's very impressive for what it is; I'd just like it to be more.

The Ugly: Rendering performance is flaky. It's hard to de-clutter the visual field -- there's nothing analogous to Word or OOo 2.x's "draft mode". NO/J is somewhat unstable from build to build, though genuine stability issues seem to get fixed pretty quickly, and the software will (theoretically) prompt you when there's a new patch or version available. Unpredictable behavior with regard to application of styles -- e.g. I apply a style, and it often doesn't fully obtain. Some of these problems get addressed on a build by build basis, but it's hard to know which are bugs and which are core defects of OOo. This is OO 1.x, after all, which was kind of flaky in the best of times.

Nisus Writer Express

The Good: Small, fast, good-looking, and the drawer-palette is less obtrusive than Word 2002's right-sidebar. RTF is its native format, which gives the (false) hope that it will have a high degree of format compatibility with Word.

The Ugly: I had high hopes for this one, but it's been disappointing to learn that it fails in some really critical areas. Format compatibility with Word is hampered by the fact that it's missing some really important basic features, like automatic bullets and outlining. I use those all the time in business and technical writing -- hell, just in writing, period. I don't have time to screw around adding bullets or automating the feature with macros, and because the implementation for bulleted or numbered lists is via a hanging indent, the lists won't map to bullet lists or numbered lists in Word or OO. Ergo, NWE is useless for group work. This is intriguing to me, since they've clearly done some substantial work to make it good for handling long documents, and yet they've neglected a very basic formatting feature that's used in the most commonly created kind of long document, business and technical reports: Automatically numbered lists and outlines.

Interestingly, it also fails to import headers and footers. I would have expected those to be pretty basic. Basically, this isn't exactly a non-starter, but it's close.

AbiWord 2.x

The Good: Free.

The Ugly: Unstable and has poor import and rendering performance in the Mac version. I know the developers are working on it, but there's only one guy working on the OS X port right now so I don't have high hopes. Also, it's not as good for long technical documents as Word or OO would be.

Mellel

The Good: Don't know; haven't tried it. People swear by it for performance, but see below.

The Ugly: File compatibility. Doesn't read OpenOffice files or OpenDocument (OASIS-standard) files, and has a native format that isn't RTF. That makes me think it's a waste of time to even bother to evaluate it. I don't need to be screwing around with something new if I'm going to run up against the same file compatibility issues I have with Nisus.
MarinerWrite

The Good: Cheap. Light. Quick.

The Ugly: Features. As in, ain't got many.

Apple Pages

The Good: Inexpensive. Conforms to the Mac UI.

The Ugly: Conforms to the Mac UI -- which means that it requires finger-contorting key combinations to do basic things without using the mouse, and makes poor use of the screen. And it's severely lacking in features: Apparently it can't export very well to RTF, which is odd, considering how deeply Apple has ingrained RTF into their system.

Why am I mincing words, here? Pages, based on what I know about it, is the same kind of sad and cynical joke as iTunes. It's a piece of brainwashing; it's eye-candy; it's got nothing very useful to anyone who does anything serious with documents.

For the time being, it looks as though I'll be sticking with NeoOffice/J, and at some point installing the OO plus X11 package to see how ugly that is.

Youthful Indiscretion

So some former hostages seem to remember that Iran's new President-Elect was one of their captors during the time when they were held prisoner in the American Embassy in Tehran in 1979-1980.

Sorry if I seem like a bit of a hard-ass on this, but: So what? Even if they're right -- why should this have any bearing on how we deal with Iran? Vladimir Putin was a KGB man -- a member of the secret intelligence service of an enemy state. Abu Mazen was a terrorist, and we deal with him. Menachem Begin planned bombings for the Irgun during the Jewish insurgency in Palestine; he was responsible for the deaths of many non-combatants. But they were British, mostly, or Palestinian, so they don't matter to us.

Really, it seems as though we look for excuses to refuse to deal with other countries. And by "we", I mean the Vulcan Cabal, and by "other countries", I mean ones that might possibly oppose the hidden agenda of the Vulcan Cabal, which is American Hegemony, plainly and simply put. But I digress.

When former soldiers go to Vietnam as tourists, Americans expect them to be greeted with respect -- which, by and large, they are, at least as far as I've heard. And by and large, we treat old Viet Cong and NVA "terrorists" with respect when they come here. What's the difference?

I expect it has something to do with blood. People look at me like I'm a little off when I tell them this, but I really do think that bodies politic (the "American People", the "Iranian People") "think" (which is to say, "feel") in terms of blood sacrifice. This is all at a sub-rational level, of course; we find other rationalizations for our behavior, but in the end it's a ritual matter: Once blood is spilled, the nature of the discussion changes.

If you walk through the world for a few days looking at news reports, I submit that you'll start to see this view as making sense. We sacrificed blood in Vietnam. We have never sacrificed any blood in Iran. Not publicly, at least. Blood would have sanctified our humiliation -- it would have taken it to a new level, made it "serious".

So in Vietnam, we had a sense that we paid a price, in blood. Blood is real currency; humiliation is just getting taken. It's not real currency, not to most people. We're going to have the same sense of things with regard to Iraq, I predict. (Though I expect history to reliably fail to repeat: the dynamics will be very different in the long run.)

This isn't likely to happen with regard to Iran -- at least, not soon. We'd have to really go to war with them, and I like to hope that won't happen, because the price would be ... fantastic. It's not Iraq; it's a functioning state with a patriotic people, well-armed with real (as in non-imaginary) and extremely dangerous weapons.

I hear on NPR that Iranians don't think much about the hostage crisis. To them, it's part of the "American Satan" background noise. When people in the US do remember it (and I doubt that many do, at least accurately), they remember it as shaming, as humiliation: That those little pissants could thumb their noses at us in public and we could do nothing about it.... I was there -- that is, I was alive and politically conscious, 15 and 16 years old, at the height of my natural adolescent boy's obsession with respect and purity of purpose.

I remember it like a little scar. I remember how much it made me despise Jimmy Carter. He was responsible (in my mind, at that time) for making the US seem weak. I talked tough about it with my friends; I think that deep down, many Americans wanted to wake up one morning and find out that all those hostages had been killed. It would have made us victims, given us the "right" to start shooting. And I can tell you, we wanted to start shooting. We wanted that so badly.

Thought for the Moment: Eisenhower on Social Security

"Should any political party attempt to abolish social security, unemployment insurance, and eliminate labor laws and farm programs, you would not hear of that party again in our political history. There is a tiny splinter group, of course, that believes that you can do these things. Among them are a few Texas oil millionaires, and an occasional politician or businessman from other areas. Their number is negligible and they are stupid."
-- President Dwight D. Eisenhower, 1954 [courtesy Amy]

... which is why, of course, the Bushites will never admit that's what they want to do. They aim to set the systems up to fail without their having to take any action. Clever bastards. Stupid, perhaps; but clever.

A Random Walk At Travel Writing

I attended the first session of a class in travel writing last night. As I got home, I started poring over my shelves to find examples of the genre. Some leapt out at me. So here's a random-walk, dartboard-at-the-page first quick pass. I sort them here from the most unequivocal examples, to examples I have to justify.

Unequivocal examples:

Lost In The Arctic (Lawrence Millman) [buy @Powells.com]
This is what I think of when I think of "travel writing." The articles are mostly short -- this was great bathroom reading -- and they often fall somewhere between "To Build A Fire" and a less-pejorative version of "Shooting An Elephant" in their focus on the foolhardiness of a civilized western traveller in the wild. A lot of it is very funny; Millman is far more often a fool than the natives are, but he always strives to be gracious.
Waking Up In Iceland (Paul Sullivan) [buy @Powells.com]
All the travel literature I read before going to Iceland did less to prepare me for the place than this book did. I heard people speaking in their own voices; to be sure, it tells the story that Sullivan wants to tell, and I'm sure he idealizes the scene -- but from what I could see, not much.
Some Orwell selections, especially: "Shooting an Elephant", "Marrakech", and "Looking Back on the Spanish War" (all in A Collection of Essays by George Orwell [buy @Powells.com / Orwell @Powells.com])

I read all of these years ago; I remember that I picked up the book somewhere unexpected, like a Salvation Army or at some church book sale, and read nothing in it for a long time, carrying it from place to place. As I remember it, I then picked it up one day to read "Shooting an Elephant" on someone's recommendation, and then devoured the whole thing. He's an engaging essayist with a penetrating, if jaundiced, view. Viz. the opening of his essay on Kipling, where, in defending Kipling against the charge of being a fascist, he writes that:

... [T. S. Eliot] falls into the opposite error of defending that which is indefensible. .... Kipling is a jingo imperialist, he is morally insensitive and aesthetically disgusting. It is better to start by admitting that, and then to try to find out why it is that he survives while the refined people who have sniggered at him seem to wear so badly.

Which is to say, Eric Blair was a tough-minded "T", like me; an INTP, to be specific, or at least, that would be my guess. If he was going to understand the situation, it was important not to mystify it. This is perhaps why he got on so poorly with other socialists....

Arguable:

Democracy in America (Alexis de Tocqueville) [buy @Powells.com]
It's a traveller's tale, to be sure, but a highly conceptual one. Early ethnography, if you will. I've only ever skimmed it; I should really read it, someday, I suppose...

Conceptual:

The Assembly Line (Robert Linhart) [buy @Powells.com]
Linhart "travels" from comfortable bohemian Marxism to the world of the (primarily) Algerian workers in a circa-1967 Citroen plant. This is real-world existentialism; it's a reminder that you don't really understand a place until you think you can't get out of it.
The Man Who Mistook His Wife For A Hat (Oliver Sacks) [buy @Powells.com]
It struck me as I was shelf-reading that one of the main criteria that I use to identify something as "travel writing" is that it reads like a visit. And it struck me as my eye hit the spine of this book that Sacks writes like a visitor, to the worlds of his patients. It's like a travelogue into the world of neurological disorder.

That will do for now, I suppose.

"Voluntary Personal Savings Account"

How do conservatives manage to support the Bushite Nanny State and still sleep at night? They should at least be a little restive over the fact that their children have such weak and gullible parents.

I think that as we permanently solve the system, that we ought to make it a better deal for younger workers by allowing younger workers to take some of their own payroll taxes and set it aside in what is called a voluntary personal savings account.
[President G. W. Bush, speaking in Greece, NY, 2005-05-24; listen to mp3 stream / download mp3]

Huh. I thought they already could. Don't we have these things called "IRAs" and "401Ks"? Or, for that matter -- "savings accounts"? You'd think the President didn't know about those. Maybe he just forgets -- from moment to moment:

Don't you like the idea of -- I mean, some of you have got 401Ks, and you open up your statement every month. Don't you like the idea of a statement?
[President G. W. Bush, speaking in Greece, NY, 2005-05-24]

More to the current conservative mania: How does George Bush deal with the cognitive dissonance of his party stumping by proxy for the will of the majority while he so strenuously attempts to force his own will upon the people? After all, as David Greene at NPR reminds us [RealAudio], the President's "60 day campaign" just passed day 84 with yesterday's visit to Rochester....

Insidious Bot-ulism

As grim and depressing as I can find the automation of spam and the proliferation of bot networks, I like to think I have some perspective on the matter. For example, I recognize that there's a real danger of incredible, profound disruption from bot networks like the one that's driving the spread of the Sober.x worm[s].

But that disruption won't come from "hacking" -- most particularly, it won't come from using the bot networks to crack encryption. As usual, Bruce Schneier has cut through a lot of the nonsense that passes for wisdom on the subject.

The very idea that the main threat from bot networks is cracking is ridiculous -- it displays not only a basic misunderstanding of how end-to-end security systems are designed, but also some very peculiar and extremely fuzzy thinking about how to defeat those systems. You defeat the systems by gaming them, not by cracking encryption. Sure, you may want to crack encryption at some point to get through some particular locked door -- but the hard part is finding that door in the first place. And more often than not, if you're clever and you know how to game systems, you'll find that you don't need to crack encryption: You can get someone to just give you the key, or even (figuratively) open the door wide and usher you through.
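To put the brute-force fantasy in scale, here's a quick back-of-the-envelope sketch in Python; the botnet size and per-machine speed are deliberately generous assumptions, picked only for illustration:

    # Back-of-the-envelope: why a zombie grid is irrelevant to key cracking.
    # Both figures below are deliberately generous assumptions.
    zombies = 1_000_000            # a very large botnet
    keys_per_second = 1_000_000    # per zombie -- wildly optimistic for a desktop PC

    keyspace = 2 ** 128            # one modern 128-bit symmetric key

    seconds = keyspace / (zombies * keys_per_second)
    years = seconds / (60 * 60 * 24 * 365)
    print(f"{years:.2e} years to exhaust the keyspace")   # on the order of 10**19 years

Even if every one of those assumptions is off by a factor of a thousand, the answer is still "never" -- which is why the interesting attacks go after the people and processes around the keys, not the keys themselves.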

Of course, it is possible -- even likely -- that computers will be, or even are as I write this, being used to game security systems more effectively than humans can. Some clever bloke somewhere might even be writing bots that crack systems. But bot networks -- "herds" of dumb, zombified PCs, even if harnessed into a computational grid -- are more or less irrelevant to that.

Heuristics like that aren't helped by brute force. Anyone who calls himself a security expert ought to know that.

The greatest threat from bot-driven disruption is not hacking or cracking, but denial of service. The person or persons controlling the Sober zombie network alone could, should they so choose, have a significant impact on the operation of the open, civilian internet. It would be easy. It would be pointless, but it would be easy.

But again: it wouldn't be the end of civilization. We'd get by. That's what we do.

And that's the ultimate lesson of security: Unless the system is severely broken (as in Iraq after the fall of Saddam or in Rwanda in '96), people will generally act to preserve structures of civilization (as we see again and again after natural disasters throughout the world).

Mature Content Warning: This Program Might Make You Think

Before Nova this evening, there was a "mature content" warning.

Nova tonight discussed an ongoing controversy regarding the origin of the original human inhabitants of the Americas. In brief, it discussed the long-standard "Clovis-first" theory in the light of new archaeological finds, conjectures based on analysis of tools from ice age Europe, and evidence from analysis of mitochondrial DNA.

In short: About as hard-science as archaeology gets. Finding the bones, finding the tools, big-time.

During the entire program, there wasn't a single bare breast, not a single blue word or phrase, not one mention of gay marriage or even a hint of sexual liaison (aside from the implication that the ancestors of modern native americans might have, you know, reproduced).

So why the mature content? Is it, perhaps, that they're scared that some religious fanatic might point out that when you're talking about things that happened 15,000-20,000 years ago, you're pretty much assuming that the world is older than 4,000 years -- hence insulting all the KJV Baptists in the Nova audience?

By contrast, consider any random episode of Law & Order: Elevator Inspectors Unit, or CSI: Sheboygan, wherein you're likely to find references to "last meals" of semen or violent sexual deviance. I don't recall ever seeing a "mature content warning" before either of those shows. Ever. But then, they don't challenge the Dog-given age o' the universe....

A Sobering Milestone

I can foresee a day when we're nostalgic about commercially-motivated spammers and mass-mailing-worms.

I get jaded about virus and worm stories. Each day seems to bring a new watershed in rate of infection, purpose, or technique. Sober is the worm du jour: It appeared sometime during the week of May 2, spread widely and rapidly, and this week started to download updates to itself. The latest variant, Sober.Q, is being used to spread "hate speech."

So let's count the milestones: Rapid spread; remote control; use for propaganda. None all that impressive anymore, on their own. But put together, they're like seeing someone walk down the street wearing sandals with black socks: It's just another sign of the end times. It's depressing.

But Seriously, Folks: Using mass-mailing worms to spread propaganda really is something to take notice of. It's a truism that spam is just about too cheap to meter, as exemplified by the fact that it's not cost effective for a spammer to even care whether most of his messages get through, much less whether he's trying to sell Cialis to a woman; it was only a matter of time before the marketers of ideology grokked the significance of that fact and started using it to virtualize their lightpost handbills.

Self-updating zombie mass-mailing worms are the computing equivalent of a bio-weapon: (mind-bogglingly) cheap, effective, and devilishly hard to kill. Previously, they've been used for a rationally-accessible goal: Making money. Now, they're being used for goals that are comprehensible only in terms of the ideologies that drive their purveyors.

Still more proof, as though we needed it, that markets are dangerously deficient metaphors for understanding human social behavior.

Land of the Sterile Storm Troopers

When George Lucas first deigned to underwhelm us with his vision of the last days of the Galactic Republic in the summer of 1999, SF writer David Brin responded with a thoughtful essay on Salon.com describing in some detail why the idea of life in the Star Wars universe left him depressed, and the idea of life in Roddenberry's "Next Generation"-era Trek universe didn't.

The short version is that George Lucas is a closet fascist.

That's putting a few words into Brin's mouth, but not many. I found his arguments very appealing, and still do. So I'm titillated by Anthony Lane's review of Star Wars Episode III in The New Yorker:

... Mind you, how Padmé got pregnant is anybody's guess, although I'm prepared to wager that it involved Anakin nipping into a broom closet with a warm glass jar and a copy of Ewok Babes. After all, the Lucasian universe is drained of all reference to bodily functions. Nobody ingests or excretes. Language remains unblue. Smoking and cursing are out of bounds, as is drunkenness, although personally I wouldn't go near the place without a hip flask. Did Lucas learn nothing from "Alien" and "Blade Runner"—from the suggestion that other times and places might be no less rusted and septic than ours, and that the creation of a disinfected galaxy, where even the storm troopers wear bright-white outfits, looks not so much fantastical as dated? What Lucas has devised, over six movies, is a terrible puritan dream: a morality tale in which both sides are bent on moral cleansing, and where their differences can be assuaged only by a triumphant circus of violence. Judging from the whoops and crowings that greeted the opening credits, this is the only dream we are good for. We get the films we deserve.

Come to think of it, I don't recall seeing a toilet in any of those immaculate Death Star prison cells... Geez. Thanks a lot. Now (on the off chance I do go to see it in the theaters), I'll keep looking for the door to the bathroom the whole time.

Most Of The People, Some Of The Time, Redux, Etc.

"Judicial activism" is a funny term. It seems that now, when Judges behave conservatively (as in, conserving clearly delineated constitutional rights), that's "activism" -- especially if it requires that the judge point out the simple Lincolnesque truth, that it's possible to fool most of the people for long enough to get a really dangerously sweeping proposition passed into law.

Case in point: Nebraska's version of the boilerplate "Defense of Marriage" act has been struck down as federally unconstitutional in two distinct and sufficient ways: It "creates a significant barrier to the plaintiffs' right to petition or to participate in the political process" and "imposes significant burdens on both the expressive and intimate associational rights" of gays, lesbians, and potentially anyone who wants to form a legally binding association that's not a state-sanctioned "marriage" between a "man" and a "woman."

Like, say, shacking up. Or signing a palimony agreement. Between straights.

The Neo-Calvinists and their fellow-travellers keep talking about the fact that "over 70% of Nebraskans" decided to support the measure "defining marriage as between a man and a woman" after being barraged with highly charged advertisements and exhortations from the (real or virtual) pulpit for weeks to months. What really happened is that "over 70% of Nebraskans" decided to support a measure that they clearly did not understand. They didn't understand, for example, that it would radically restrict the rights of foster parents, unmarried opposite-sex domestic partners, persons in power-of-attorney relationships, non-custodial parents, and so on.

What really happened is that "over 70% of Nebraskans" got conned.

If it weren't for the fact that it would require abrogating the US Constitution, I'd be inclined to let Nebraska, Kansas and the rest of the virtual bible-belt just slide back into the dark ages. Politicized evangelism has far, far greater potential to destroy this nation than racial issues have had at any time in the last 40 years. Racial issues have at least been constrained: by notions of decency (no mainstream white could use the "N" word without censure), and by commonly-held economic desires (almost everybody wants the American Dream, and almost everybody is willing to grant that, even to classes of people they hold in contempt). Religious issues are not so constrained: When it's a religious issue, your opposition is evil, pure and simple -- believe that, or be damned. End of discussion. Please leave the church by the side door, so you don't soil the earth your neighbors have to walk on.

But [un?]fortunately, we do all have to live together in this country. We don't get to let them live in the mediaeval hell they seem determined to create. Not the least reason being that the virtual belt isn't limited to big square red states -- it harms people in places like Michigan and Connecticut who've never done any harm to anyone by being so immoral (or so unfortunate in their sexual orientation) as to dare to co-habitate without the benefit of state-sanctioned marriage.

So we don't get to let them sleep in the bed they've made. But we don't have to let them make us sleep in it, either.

The User Experience is the User Experience

Jakob Nielsen, among others, has remarked that "the network is the user experience." They're all wrong, and they're all right.

Browsing through UseIt.com yesterday left Nielsen's June 2000 predictions of sweeping change in the user experience loaded in my browser when I sat down at my desk this morning:

Since the late 1980s, hypertext theory has predicted the emergence of a navigation layer that would be the nexus of the user experience. Traditionally, we assumed that this would happen by integrating the browser with the operating system to create a unified interface for manipulating remote information and local files. It has always been silly to have some stuff treated specially because it happened to come in over a certain network. Browsers must die as independent applications.

It is counter-productive to have users suffer sub-standard user interfaces for applications that happen to run across the Internet as opposed to the local client-server environment. Application functionality requires more UI than document browsing: another reason browsers must die.

Silly, counter-productive: Sure. I've always thought so. But the tendency in the late 1990s was to assume that document browsing was exactly enough. And though the peculiar insanity of things like "Active Desktop" (which strove to make the Win95 desktop work just like the Web circa 1999) does seem to have passed, it remains true that the bias is toward the browser, not toward rich application-scope UIs.

Which is to say that Nielsen, in this old piece, is failing to heed his own advice. Users are inherently conservative: They continue to do what continues to work, which drives a feedback loop.

But more than that, he -- like almost everyone else I can think of, except myself -- is missing the single most important thing about modern computing life: People don't use the same computer all the time. Working from home, now, I frequently use two: My desktop, an OS X Mac, and my laptop, a Sony Picturebook running Windows 2000. In my most recent full-time job (where 12-hour days were routine), I used two more systems: A desktop running Windows NT and a laptop running Windows 2000. And that's not even counting the Windows 2000 desktop I still occasionally use at home. (And would use more if I had an easy way to synchronize it with my Mac and my Picturebook.)

And so it's interesting to look at each of Nielsen's predictions as of June 2000:

  1. Amazon is healthier than ever, in no small part because "zero click payments everywhere" are no closer now than they were in 2000. (See [3].)
  2. Yahoo's network of services is healthier than ever, in no small part because people are less and less tied to specific machines. (See [3].)
  3. Websites know your preferences only insofar as you invest those with a particular services vendor/provider, like Yahoo or Google. That's actually a reflection of increasing network-centricity: These services are finally recognizing that people have lives that cross many machines.
  4. AOL is failing rapidly, but its proprietary messaging system is still going strong -- as are the proprietary messaging systems of Yahoo and Microsoft. Messaging aggregators like Trillian are still bleeding edge.

None of this is to say that I don't think the network is the user experience. He's sort of right about that -- or at least, he's right that it sort of should be, that things would work better if we made apps more network-aware. After all, in the age of ubiquitous wireless, the network is spreading to places it's never been before. But what the 2005 situation reveals is that relatively low-impact solutions like using cell phone networks for instant messaging or logging in to websites have trumped high-impact solutions like re-architecting the user experience to eliminate the web. Instead of using the increasingly ubiquitous broadband services to synch all our stuff from a networked drive, we're carrying around USB keychain drives and using webmail. Instead of doing micropayments, we're still living in a world of aggregated vendors a la Amazon and charity (Wikipedia) or ad-/sales-supported services (IMDB, GraceNote).

At a more fundamental level, we have to be mindful that we don't define "the network" too narrowly. Consider the old-school term "sneakernet": Putting files on floppies to carry them from one person to another. The term was ironic -- sneaker-carried "networking" wasn't "networking", right? -- but it revealed a deeper truth: "Networking" describes more than just the stuff that travels across TCP/IP networks. At a trivial level, it also includes (mobile) phone networks and their SMS/IM/picture-sharing components. But at a deeper level, it covers the human connections as well. In fact, the network of people is really at least as important as the network of machines.

Understood that way, "the network is the user experience" takes on a whole new meaning.

Momentary Thoughts on Empiricism in Design

When I read reports from other people's research, I usually find that their qualitative study results are more credible and trustworthy than their quantitative results. It's a dangerous mistake to believe that statistical research is somehow more scientific or credible than insight-based observational research. In fact, most statistical research is less credible than qualitative studies. Design research is not like medical science: ethnography is its closest analogy in traditional fields of science.
[Jakob Nielsen, "Risks of Quantitative Studies"]

I've always found it more than a little ironic that many designers have such a strong, negative reaction to Jakob Nielsen, especially since most of them do so by banging the drum in protest of what could be termed "usability by axiom": The idea that following a set of magic rules will make a website (or any application) more usable. I find it ironic, because Nielsen has always seemed to me to be a fairly ruthless empiricist: His end position is almost invariably that if a design idea doesn't actually deliver the usability benefit you imagined it would, then you shouldn't be using it. This month's Alertbox is a case in point, but there are plenty of others I could cite.

And therein lies the problem. Designers, to paint with a broad brush, really do know more than the rest of us do about design, at least on average: They spend years in school, they produce designs according to aesthetic rules and basic facts about human interaction with machines, and those designs are critiqued by their teachers and colleagues. They've often even done their research quite meticulously. But they seldom bother to actually look at what real users do -- at least, in any way that might do something other than validate their preconceptions. And it is, after all, the real users doing real work who will get to decide whether a design is effective or not.

Take "Fitt's Law", for example: If you search for tests of Fitt's Law, you'll find plenty of tests, but the last time I looked, I could find none that tested Fitt's Law in a real working context. And there's a good reason for that: It would be really hard to do. To test effectively, you'd have to include such contextual features as menus, real tasks, application windows -- and then, change them. It's barely feasible, but do-able -- it would be a good project for someone's Master's thesis in interaction design, and it would be simplest to do with Linux and collection of different window managers. You'd have to cope with the problems of learning new applications, and sort out the effect of those differences on the test. It's a tough problem to even specify, so it's not surprising that people wouldn't choose to fully tackle it.

But I digress. My point is that it's relatively easy to validate the physical fact that it's easier to hit big targets than small ones and easier to hit targets at the edge or corner of the mouse area than out in the middle of the visual field. Unfortunately, that's not very interesting or useful, because we all know that by now. (Or should. Surprisingly few university-trained or industrially-experienced interaction designers take account of such factors.)
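
For reference, that "physical fact" has a well-known quantitative form. In the Shannon formulation, Fitts's Law predicts movement time as MT = a + b * log2(D/W + 1), where D is the distance to the target, W is the target's width along the axis of motion, and a and b are constants fitted from observed data. Here's a minimal sketch in Python; the constants are placeholders for illustration only, not measured values:

    from math import log2

    def fitts_movement_time(distance, width, a=0.1, b=0.15):
        """Predicted time (in seconds) to acquire a target.

        a and b are placeholder constants; real values have to be fitted
        from observations of a particular device and user population.
        """
        index_of_difficulty = log2(distance / width + 1)  # in bits
        return a + b * index_of_difficulty

    # A big, nearby target versus a small, distant one:
    print(fitts_movement_time(distance=100, width=50))   # low difficulty: fast
    print(fitts_movement_time(distance=800, width=10))   # high difficulty: slow

Which is exactly what a lab validation confirms -- and exactly why it isn't very interesting on its own: the equation says nothing about menus, window layout, or what the user is trying to think about at the time.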

One thing it would be interesting or useful to do, would be to figure out what the relationship is between "hard edges" (like the screen edge of a single-monitor system) and what we could call "field edges" (like the borders of a window).

What would be interesting would be to study the relationship of the physical truths exposed by Fitts's "Law" with the physical truths hinted at by things like eye-tracking studies.

What would be interesting would be to figure out what the relationship is between understanding and physical coordination. Quickly twitching your mouse to a colored target doesn't tell you much about that; but navigating a user interface to perform a word-processor formatting operation could. Banging the mouse pointer into the corner of the screen like a hockey stick tells you that mouse pointers stop when they hit screen edges; I already knew that from having used windowing user interfaces since 1987. What I don't know is whether cognitive factors trump physical factors, and simple validations of Fitts's Law do nothing to tell me about that.

What would be interesting, would be to design to please customers, instead of to please designers.

Keeping America Safe From British Novelists

There's a perfectly good explanation for why US Customs refused Ian McEwan entry to the country. It wasn't because he was dangerous; nor was it because he was deemed to pose some kind of terror threat. Nor was it because someone thought he might be a journalist instead of a tourist. It apparently wasn't even because he disagrees with US climate policy and doesn't mind saying so in public.

It was because he was going to make too much money. It seems the honoraria for his series of Seattle-area speaking engagements totaled a wee bit too much. So he needed to have a work visa, not a tourist visa.

So they stamped his passport "Refused Entry." "Once that stamp gets in a passport, it's difficult to get it out," said Britain's Consul General for Vancouver, James Rawlinson. "The process of reversing that is not merely a matter of crossing that out. Reversing that requires referrals to Washington, D.C., and the headquarters of the State Department and Homeland Security. It gets rather heavy."

Oh, well -- at least when you're a well-known novelist like Ian McEwan, people [who matter] might miss you.

[via Bruce Sterling's Viridian Note 00440]

No Papers, State To State

Capt. Vasili Borodin: I will live in Montana. And I will marry a round American woman and raise rabbits, and she will cook them for me. And I will have a pickup truck... maybe even a "recreational vehicle." And drive from state to state. Do they let you do that?
Captain Ramius: I suppose.
Capt. Vasili Borodin: No papers?
Captain Ramius: No papers, state to state.
[Hunt for Red October]

As a boy, during the Cold War (remember the Cold War?), one of the big filmic signifiers that you Weren't In America Anymore was an official looking character asking for your "papers": Those mysterious documents that people had to carry in those grim gray communiss countries behind the iron/bamboo curtain. They had papers; we had "freedom."

So, sometime soon, we'll all be carrying "real" IDs: No more slipping under the radar, no more living in the underworld. Unless you're "16 and SIN-Less", in which case you'll be invisible.

And so wouldn't be missed.

Small Choices Moving Fast

It's a truism: You can't use the system to really fight the system. If you use a record label to sell songs about smashing capitalism, you're not doing anything substantial to smash capitalism.

So what do you do? Opt out of everything? Or act in small ways? Small ways are unsatisfying; and in any case, how do you know that the soap or chips you buy are really doing anything like what your conscience would have you hope?

Seminal straight-edger Ian MacKaye noticed these contradictions years ago [RealAudio], and they played a role in his move to a lower-volume sound:

"Volume had relegated bands to playing largely commercial venues. Most of the places that had sound systems were commercial venues; their economy is based on their bar sales. It cements this really insidious link between rock and roll and the alcohol industry. The idea that the people who music epseaks to in some ways the most deeply -- and by that I'm talking about kids, teenagers -- are by and large not allowed to see bands play because they're not old enough to drink."

And in turn, it cements the role of rock and roll as a gateway to the bar life. Not the connection someone like Ian MacKaye would miss. I don't doubt that awareness of that contributed to his desire to play in "non-traditional" venues like family restaurants, public places, and repurposed rented spaces like boathouses.

Small choices can make a difference. They might not overthrow the order of things, but then, revolutions are messy things that often do more harm than good.

Remembrance As Modern Art Gone Bad

Speaking of Oklahoma City, my old Okie friend Kelley offered his thoughts on the memorial:

"I still say they should have planted 168 redbuds-a veritable forest that would be blooming now. What a sight that would be, an affirmation of life, a colorful display that cannot be equaled. Instead, they have those silly chairs. Stupid. Modern art gone bad. Yes, they were bureaucrats (mostly) but I think the chair is simplistic and mundane. After all, the noble, tough redbud is the state tree- they're hard to kill and they deal with adversity in a manner I think transcends their environs. Oh yeah, they're the state tree. Duh."

As I sit here, I have a vision of hundreds of ghosts sitting in those cold stone chairs for eternity.... Bureaucrat or no, I find it hard to imagine they wouldn't rather be sitting in a Redbud grove.

I responded that subtlety has become a lost art, accepted only from people (like, say, Roy Blount) who can pretend they're being obvious; and that real local character is passé, like the "southernness" of Atlanta or Houston.

But we've become a monumental culture. We might once have planted trees and let the glory be to God or Nature, and had faith that the tree would one day grow large. But that kind of sentiment died off in the Dutch Elm plague or was suffocated by Cold War techno-optimism. Now, it's no good if it's not human-made. (Ayn Rand smirks from her grave.)

Here in NY, I think the appropriate plant would be blackberry bushes. Let one survive, and you're buried in them forever. My friend Lynne planted blackberries around her fence for some reason a few years back, and now the whole area is a wasp-loud glade all summer long.

Up in Maine, it would be wild roses. Those things grow *as* *weeds* in the cracks between the big wave-smoothed boulders right at the ocean's edge. Even the salt grass has a hard time there.

CORRECTION: I'm chagrined to be reminded that Lynne's bushes are raspberries, not blackberries. But either will take over in the rockiest, most clay-bound soil, given half a chance. And I'll stand by my Yeats allusion, even if it doesn't represent a literal truth, because I like the way it sounds...

Remaining Marla

Over the past few days I have seen many descriptions of Marla, including those likening her to an angel or a saint. Neither of those words do her justice. She was driven by a passion I have never encountered before, and she had a boundless heart. But she was also consumed by extreme lows as well as highs, tears along with laughter. In discussing plans for a book, she wanted to be depicted as the rich and complex woman that she was. But she would quickly remind me that the families' stories were most important. So, she wasn't a saint, but she possessed saintlike qualities.

[Jennifer Abrahamson on Slate]

I'll bet you a magnet Support Our Troops sign that the Tillman story will continue to have legs far longer than Ruzicka's.

[Neologian on MeFi]

Somehow I doubt it.

I'm sure that Neologian hopes for better, of course, and he'd have good cause to. Marla's story is the kind of thing that deeply inspires people who are willing to commit their entire lives without the possibility of external reward. Pat Tillman arguably did the same thing, but there's a different quality to his commitment. Marla could have gotten out at any time -- she just had to go to the airport and go home. She never gave up, though. Her legacy (like Tillman's, for that matter) should be that effort and sacrifice are not pointless.

At the very least, Marla's memory has a better chance of remaining true to "Marla" than Pat Tillman's does of remaining true to "Pat". Both have been or will be remade into whatever their admirers want/need them to be. But where Tillman's personality was exposed to small groups of a fairly limited nature (his family, the men in his unit), Marla forcefully projected hers across strata of society, across cultural boundaries, across domains of experience -- and, not insignificantly, across airwaves. All without apparent loss of commitment.

Which makes the Coulteresque feeding frenzy at LGF and Teh Freep all that much more pornographic.

Footnote, for now: I woke up to Ivan Watson's story about Marla on NPR on Monday morning. I've been thinking about it off and on ever since.

From the "Holy Crap!" Department: Adobe Acquires Macromedia

At first, I thought it must have been some kind of a joke, but it seems to be true: Adobe and Macromedia have agreed to a friendly takeover, at a price of about $3.4B. So the question is, does this save Adobe or destroy Macromedia? And is there any conceivable way that merging two 800 pound gorillas could be good for web developers or end users?

Macromedia and Adobe have presented as competitors for years, but they actually compete head to head in very few areas. Even in places where they seem to butt up against one another, as in the case of ImageReady versus Fireworks, or FlashPaper versus PDF, the truth is more complex: In the first case, most design shops just buy their people one of each, and in the second, the formats, while presented as directly competitive, really aren't. PDF is almost zealously print-centric; FlashPaper is really an effort to make Flash more print-friendly, and in fact ends up incorporating PDF into its own standards stack. Both have more usability warts than most people on either side like to admit.

It's hard to see how this helps consumers. Adobe have become enormously complacent in recent years. They're effectively the only game in town for professional image editing, and they know it. In the graphics professions, the price of a copy of Creative Suite is simply part of the cost of setting up a new designer or graphic artist. Even heavily Macromedia-focused web shops use Adobe software at some stage in their workflow, thanks to Adobe's strong lock on certain color technologies. But they've never bothered to develop anything like Flash, and have never worked very hard to overcome the profound weaknesses of PDF as a format.

Macromedia are somewhat hungrier, somewhat more innovative -- but they, too, have a market lock. Professional web design shops either work with Macromedia StudioMX (or possibly just Dreamweaver), or they most likely do inferior work. I know of a few good web designers who stick with Creative Suite for everything, but they're old pros with lots of experience dealing with Adobe's deeply idiosyncratic conventions and techniques. Macromedia's workflow for web production is far, far superior to Adobe's in every regard except for color management and high-end image/graphic editing. Their "round-trip" code management is on an entirely different plane from Adobe's understanding of how to deal with HTML.

If I have to predict the shakeout, I'd predict that the final product lineup from the merged entity will include Dreamweaver and Flash from Macromedia, Acrobat, Photoshop, Illustrator, and InDesign from Adobe, and will probably include both ImageReady and Fireworks until they figure out which one is harder to de-integrate. My guess would be that the good bits of ImageReady would be incorporated into Fireworks, which has much, much stronger HTML generation capabilities. (That said, its file format may prove difficult to integrate with Photoshop and Illustrator.) Acrobat and Flash will have a relationship analogous to that between Flash and Director: Flash will be a mezzanine for rendering and delivering PDF, and Acrobat itself will continue as a separate product.

And, of course, Macromedia's server-side products will remain intact, because they're what Adobe really wants. Adobe is digital graphics, basically; but they aren't positioned to continue to grow in a post-Web world. Specifically, they are vulnerable to being obsolesced as technology moves beyond them. Macromedia, by contrast, has spent the last several years experimenting with web-focused (not merely web-based) workflows.

ADDENDUM: After reviewing the MeFi thread, I'm no longer so sure that Adobe will be humble enough to keep Macromedia's very empirically-grown software development stack. And I see that some of my assumptions regarding the smartest choice of components may be too optimistic. One thing's for sure: Our web dev apps are going to get a lot more expensive...

Who Needs A Majority When You've Got Righteousness?

Forget about resisting the tyranny of the majority. We're past that. Right-wing Republicans are looking to lock in the tyranny of a minority.

People in both parties, but most notably currently prominent Republicans, are saying there's still hay to be made on the Schiavo case. Democrats say that it can be used to galvanize opposition to the planned Republican takeover of the judiciary branch. Republicans say that it 'energizes the base' -- where the definition of "base" seems to be "hard-core right-to-life Evangelical Christian Republicans." The most wildly exaggerated numbers I know of put that at about 25% of the American population.

Republican pollster Tony Fabrizio, on Morning Edition this morning [RealAudio], says they're both wrong: That if there had really been an opportunity there, smart Democratic legislators would have jumped on it; and that, furthermore, Republicans had gone too far. "You know, there is a difference between energizing your base, and having your base push you off the edge.... Was it that we needed to prove to the middle, the middle of American politics, that we were willing to go someplace that they didn't want us to go? How many times can you do that and still be successful politically?"

Republican lion and Episcopalian minister John Danforth agrees. He notes that "traditional Republicans" have complained about the courts going too far. "Most Republicans would have said, 'We think that the courts go too far.' Now, it turns out that it's Republicans who are saying that we want the courts to go very far, but in our direction, and I just think that's wrong."

I'd like to think that both Danforth and Fabrizio are right, in their own ways. Danforth's view implies a basic belief that people ought to be morally consistent: That means ought to be consistent with the end, not merely contributory to it. Further, he's clearly a real believer in a pluralistic society. Fabrizio states his view in pragmatic terms, with the clear underlying assumption that it's normal for a nation to be comprised of people with differing views. (If you can dictate people's views, you don't need to worry about being "successful politically.")

But they're both wrong, at least in pragmatic terms. The Republican train is being driven by people who don't see a problem with means that are contrary to the end, or with the idea that the nation ought to bend to their will. Voices of moderation, even when conservative, are no longer welcome. They get in the way of the program, which is to let the Republican Party (by which they mean the intensely activist religious right component) install hegemonic control over American discourse.

Their will is a holy will, after all. Whether it's all the same religion is another question; all that really matters is that one (the religion of power and capital) can be translated into the ends of the other (semitic absolutism, as manifest in right-wing American Christianity).

Lost Mysteries

Sometimes I miss not knowing things.

I'm not talking about the big mysterious things. There are a few of those I'd rather not know, but that's a different issue. This is little, simple stuff, for which we can now easily find an authoritative (if not necessarily correct) reference on the web.

For example: When I was a kid, I was always seeing movies without knowing anything about them. I might recognize a face ("Hey, there's that guy from that thing!") or a voice or a walk, or even a style. But I saw a lot of cool movies as a kid that I couldn't have told you anything about aside from the plot or the setting.

I remember one time as a kid, coming home from school at noon (it was the last week of the school year); I think I must have been in junior high. The local PBS affiliate was showing afternoon movies that week, so I switched it on to see what was there. It was old -- in black and white, and in Japanese, and as I turned it on it was mostly a motley bunch of people having an oblique conversation while they waited out a rainstorm in a busted-up building. But I stuck with it, and soon it got more interesting: The conversation was about a murder case, and one by one they worked through five different versions of the event. In one, a noble-born husband dies by a fierce bandit's sword, while defending his wife's honor as she cowers in the shadow; in another, the wife tempts the bandit, and the husband must be goaded into fighting; and so on, with each version glorifying or justifying its teller. In a fifth and final version, from a surprising source, all parties come off petty, venal, and weak-willed.

I never knew the name of the picture or anything about it when I was watching. But it stuck with me for years. Probably a week didn't pass that I didn't think about that ugly fifth version, thick with fear and utterly lacking in grace for anyone. Until one night in college, I went to the regular screening session for my Japanese cinema class. That night I saw a film called Rashômon.

These days, there wouldn't be a mystery. I'd just look it up on IMDB or post a question to Ask Metafilter. It's all easy, now. We go, we get our answer -- we don't spend time chewing on the memory of some mysterious film or book or song, reworking it in our memories until we make it into something that speaks to us.

My brother Glen once told me about a film he'd seen as a kid. It was an old film -- black and white. About a rich old man who dies alone and friendless after uttering the mysterious phrase "Rosebud!" -- which turns out to refer to an old sled. He'd thought about that movie a lot, over the years, but had never been able to remember the name of it, or who starred, or who directed.

I thought about it for a moment, and took a guess: "Sounds like Citizen Kane." (I'd never actually seen Citizen Kane at that point, mind you.)

He shook his head resolutely, as I recall. "No, that's definitely not it."
