aesthetics

Why do you do what you do? Why do you decide what you decide? In an ecological context: What consequence does it have for the larger system, that many people decide or do as they do?

Poker Dreams. Or: Let's be clear how this is going to go.

I had a dream about poker last night. Barack Obama was in it, sort of, as a presence in the background, someone I knew was playing, somewhere. So was John McCain. I was getting together things for a rummage sale, and one of the things I was putting in was a poker set. Only, it was poker played with dice, and the game had been somehow "simplified" so that people wouldn't have to actually understand suits and hands and betting rules. The dice had Arabic numerals on them. (In my dream, the old-fashioned dot-patterns were deemed 'too complicated'.) And there were lots of dice -- hundreds, possibly.

I'm generally not big on the idea that dreams are metaphors for life, but this one seems so relevant, so poetic, that I can't ignore it. See, Obama is a poker player. Supposed to be quite good at it. The most important thing in poker is to make decisions about your course of action based on what you actually know (is he showing his tell? what cards are face up? what have I got? is my gut telling me anything?), and then to stick to them until you know something that warrants changing your plan. McCain is a craps player. He throws dice. The most important thing in craps is that you have a lot of money, so it doesn't hurt so much when you lose it. Snap decisions don't matter one way or another, so intermittent reinforcement will tend to make those decisions stick with you as valuable more often than as detrimental.

John McCain will not be debating tonight. Sarah Palin will not be debating next week. John McCain will be maneuvered into position to take credit for a solution to the financial crisis as a favor from the Republican leadership, even though he'll have nothing constructive to do with it; in so doing, it will be made clear to half of America that he muddied the waters by injecting himself into the mix, and to the other half that he Took Charge And Got It Done.

I actually think there's an excellent chance there won't be any debates at all. He seems desperate to avoid them. Obama's best bet is to let it be known that he'll be available whenever McCain wants to carry forward with the planned debates, and keep pointing out that the Senate Finance Committee (which McCain has no part in) has actually been making excellent progress without interference from the Presidential candidates.

At this point in time the campaign starts to look like a slow-motion train wreck: Palin is being shown for the lightweight she is, McCain is cracking under the pressure of trying to be something other than John McCain, and Obama is keeping his cool and sticking to his game. He seems to know what's in his hand, and to have known for months. Let the dice fall where they may: He's not playing that game. He's playing poker, not craps.

The Message Is The Message; Or, Enforcing Subjective Aesthetics Through Ridicule Since 2007

Design Police Are Operating In This Area -- And They've Got Crappy Design Sense, Too

Why does anybody still think that ridicule is a useful tool for achieving positive ends? And why is anyone still willing to accept the idea that people who claim to use ridicule for positive ends are doing anything other than bullying people to make themselves feel superior?

Design is a religion. Let's just be clear about that. It has so many of the salient characteristics of religion that I find it difficult to understand why people become so offended at the notion that what they're preaching is not objective truth, it's faith. After all, they've expended a great deal of effort, karma and usually money to get their design credentials, and then they have to live in a world that Doesn't Take Design Seriously. (Much like the world doesn't Love Poetry. But that's another subject for another time.)

I feel for them, but I can't quite reach them, as my dad used to say. Here's a hint: Preachers go to grad school, too. There's a difference between VoTech and science, and unless you're formulating and falsifying hypotheses, design students, you're basically in a jumped-up VoTech program. Just like preachers.

Design fascists like the Design Police start very quickly to sound like folks who see oppression of Christians in all aspects of American daily life. What they're really seeing is that their particular religious biases are not honored by everyone who doesn't share them. Designers see stuff they don't like and confuse that with "bad design" in much the same way that extreme religionists see attitudes and behaviors they don't like and confuse them with immorality.

Truly hard-core design fetishists have a wonderful and seemingly limitless capacity for arrogance. They can say stuff like "Comic Sans is Evil", can insist that proper kerning and ligature are crucial to truly understanding the meaning of a text, and basically imply that the rest of the world is populated with design-illiterate idiots who are destroying civilization through sloth and ignorance, all with a straight face and all without realizing that they're basically the design-equivalent of Ann Coulter: endlessly blathering that people who don't "get" them just have no sense of humor. (And bad taste, of course, to boot. Because Helvetica on pink bubblegum is the height of design, doncha know. Wait, I forgot: Intention is what matters; they meant it to be ugly, it's a statement....)

Take the Design Police ("Bring bad design to justice"). (Please take them.) They're a couple of design students (ah, they're still in DESIGN SCHOOL, which goes a long way toward explaining their sophomoric arrogance). I got a link to this lovely little bit of high-concept hideousness ("it's ugly on purpose! that makes it clever!") from a designer in my company. She's easily offended and basically a nice person, not given to deep thought about the fact that her attitude basically implies that everyone else is an idiot, so I refrained from pointing out to her that this is actually pretty fucking offensive elitist bullshit. She works in advertising. She doesn't realize or doesn't accept that design is not as important as designers like to think it is, and why should she? Why would she? It would have a negative impact on her ability to do her work. Heaven forbid someone should point out that the high-concept design choice may not communicate as effectively as a simpler, more message-oriented choice.

Many designers seem to have been drilled in the facile mantra that "medium is message", without any real analysis of what that means. So they take a basically insightful concept like Emotional Design and turn it into a justification for the simple subordination of understanding to gut feeling. Most designers are what the President would call "libruhls", but the attitude is the same as his: The gut is king, the emotions rule over all, what I feel is much more important than anything you or I might know, and that's as it should be. That's not, of course, what Don Norman was arguing when he wrote Turn Signals Are The Facial Expressions Of Automobiles, and it's not what guys like Tognazzini profess to mean when they use the term "emotional design." But I've worked and interacted with a lot of designers, and it seems pretty clear to me that in the current design zeitgeist -- at least on the web -- emotional design means "to look good is much more important than to be good." That appearance becomes its own reality. A very neo-conservative attitude.

I've got no illusions about changing the viewpoint of designers any more than I have about changing the viewpoints of militant religionists or militant atheists. They'll believe what they believe. I would really just prefer that they stop wasting my attention and lots of people's energy and money with their bullying (pomo) blather about the importance of clearly marginal crap like the "unimaginative" choice of Helvetica.


Andres Duany on New Urbanism

Courtesy of the Peoria Chronicle's blog, here are links to a lecture on "New Urbanism" given by Andres Duany in Houston. It's on YouTube in 9 parts of 10 minutes each, and the first several have been posted on the Peoria Chronicle's blog. I'll be working my way through them bite by bite, as I hardly have 90 minutes to spare for anything that requires both visual and auditory attention, simultaneously. I may yet find something objectionable in it, but the basic presentation is quicker than reading Death and Life of Great American Cities.

  1. Introduction; Background; Suburban sprawl patterns; the four major components; public realm/private realm | New Urbanism in 10 minutes a day, Pt. 1
  2. Part 2: Zoning/Codes; Single Use vs. Mixed Use Planning; Traffic and congestion issues; Quality of Life issues; Scale and relation to physical compatibility vs. functional compatibility | New Urbanism in 10 minutes a day, Pt. 2
  3. Part 3: The four major components of suburban sprawl cont'd; Business/retail component | New Urbanism in 10 minutes a day, Pt. 3
  4. Part 4: Residential component today, vs. the way we used to do it-(combining retail with residential); Importance of mixed use/range of income earners; Privacy and Community; "McMansions"; why people prefer to live in traditional towns vs. suburbs
  5. Part 5: Residential, continued; granny flats/garage apartments, addressing affordable housing; The discipline of front/back; Intro, "sense of place"
  6. Part 6: "Sense of Place", cont'd; What is it? How do you achieve it? What makes historical neighborhoods so desirable? The role of landscaping; Current residential development issues
  7. Part 7: Residential development issues, cont'd; Open Spaces; Roads: highways,avenues: It's all about the cars; Kevin Lynch; Landmarks; Terminating vistas, then and now
  8. Part 8: It's all about the cars, cont'd; Seniors & children suffer the most from today's sprawl, causing poor quality of life issues and reverse migrations
  9. Part 9: Back to the 11-hour workday: Spending our lives in our cars; Gold-plated highways at the expense of our civic and public buildings; Vertical vs. horizontal infrastructure; Affordable housing cont'd, by allowing families 'one car less' they can afford $50k more house! Conclusion; Year 2010 and 2015 projections

One comment from the Chronicle blog is interesting:

“New urbanism” is just a facade being used by developers to pack as many people into the smallest footprint as possible, to increase their profits.

In San Diego, older neighborhoods are being transformed into jam packed, noisy, traffic infested cesspools, by billionaires who live on 10 acre estates in Rancho Santa Fe (SD’s Bel Aire).

The 40 year old, 10 unit, low income apt building next to me was converted to $400k “condos” last year. It’s been pure hell, with 15 rude, loudmouthed, morons moving in, several of whom are already about to default on their loans. Several units are now being rented, at 3 times the monthly rent as before. Who wins? A handful of guys sitting around dreaming up their next scheme.

That he misses the point of New Urbanism completely isn't the interesting part -- it's that he's so willing to conflate New Urbanism with a newspeak co-optation of its ideals. He's not necessarily wrong to do so. Like many idealistic movements, it has some foolishness and romanticism baked into it and is vulnerable to abuse. There are plenty of people who jump into idealistic movements with a partial understanding of the situation and then end up taking them in whole new, highly rationalized directions.

That's one of my objections to "emotional design": When you choose, as Don Norman, Bruce Tognazzini et al seem to have chosen, to make your evaluation of a design's quality hinge upon its gut, emotional appeal, you're basically opening up the door to tossing out real design and replacing it with pandering. Machines become good if they look cool. By that metric, the AMC Javelin would be one of the coolest, hottest cars ever manufactured. The nigh-indisputable fact that it was a piece of crap would be irrelevant: It had great "emotional design."

Similarly, the fact that PowerBooks are screwed together using 36 (or more) tiny screws of five to six different sizes and head-types, but also force-fit using spring clips, becomes irrelevant: The design feels great, looks great. Never mind that it could cost less to manufacture, cost less to repair and upgrade, and be just as solid, just as sound, if it were designed better. It's still great "emotional design."




Myths are Metaphors

Pop quiz -- does this passage describe the present, or the future?

You sit immersed in a wireless cloud, navigating your way through the folders on your hard drive. It is a floating forest of branching tree directories anchored to a root folder buried somewhere deep inside the machine. You are listening to streaming audio whilst a torrent of more music flows into your MP3 player. While it downloads, your system is organising your music library into fields within a database and generating a feed direct to your homepage. Via your Flock browser you twitter to your friends about the latest item on the newsriver then post a few paragraphs to your blog, where they join the complex trail of links and paths going in and out of your site. While you surf, it's easy to forget that beneath you lies a creepy invisible underworld populated by spiders, bugs, crawlers, worms, and microscopic viruses, whilst above ground your transactions are hungrily devoured by sheep that shit grass before being aggregated into the Long Tail. That data trail you're leaving behind stimulates the synapses of the global brain, which is in turn pulled towards the gravitational core of the Web 2.0 solar system...

Windows Vista: dreaming nature in cyberspace (PART)

Answer: It's the present, of course.

The lesson: With the right language, you can make anything sound cool. Welcome to cyberspace. Let the meat rot where it lives.

(Via Sterling @ Wired)

Time is the new Bandwidth

I've been doing a lot of video blogging on BEYOND THE BEYOND lately, which must be annoying to readers who don't have broadband. But look: outside the crass duopoly of the USA's pitifully inadequate broadband, digital video is gushing right through the cracks. There's just no getting away from it. There is so much broadband, so cheap and so widespread, that the video pirates are going out of business. I used to walk around Belgrade and there wasn't a street-corner where some guy wasn't hawking pirated plastic disks. Those crooks and hucksters are going away, their customers are all on YouTube or LimeWire...

Bruce Sterling, WIRED Blogs: Beyond the Beyond

Broadband isn't the problem. Bruce makes his living being a visionary. I make my living doing work for other people. It's truly not the visionaries who actually change things -- it's the people who buy (into) their visions, and those people just don't have the time to look and listen at the same time to continuous "bites" of see-hear content.

Podcasts are bad enough -- I have to listen for as long as someone speaks in order to get their point, I can't really skim ahead or scan around with my eyes. I've got to buy into their narrative construction. And I'm paying for that purchase with my time and attention.

This also goes to Cory Doctorow's point about text bites. He's grazing around, taking in small chunks of text at a go, and the web is fine for that, that's his message. Great. Fine. But text can be time- and space-shifted far more effectively than audio, which in turn can be time-/space-shifted far more effectively than video.

What's really needful, as I've noted before, is a way to mode-shift text into audio without human intervention. Or video, for that matter, if you want to get visionary about it. But I'm not going to worry about video right now, because audio is something that some basement hacker could actually pull off with an evening's work, and refine with the labor of just a few weeks. Or so it seems to me. On my Mac, right now, I can select text and have the machine speak it to me, complete with sentence and paragraph pauses. The Speech service is AppleScript-able, so (if I actually knew AppleScript) I could script it to pick up blog posts and pump them into audio files that in turn could be pumped onto my audio player for listening in the gym or on the road. If I spent that much time in the gym or on the road. Which I don't.
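For what it's worth, here's a minimal sketch of that basement hack, in Python rather than AppleScript, leaning on the Mac's built-in "say" command; the file names are placeholders of my own, and fetching the blog posts in the first place is left as an exercise:

```python
# Minimal sketch: render a saved blog post (plain text) to an audio file on a Mac.
# Assumes macOS's built-in `say` command; "post.txt" and "post.aiff" are
# illustrative placeholders, not files referenced anywhere in this post.
import subprocess
from pathlib import Path

def post_to_audio(text_path: str, audio_path: str) -> None:
    """Speak the contents of a text file into an AIFF file using the system voice."""
    if not Path(text_path).exists():
        raise FileNotFoundError(text_path)
    # `say -f input -o output` writes the speech to a file instead of the speakers.
    subprocess.run(["say", "-f", text_path, "-o", audio_path], check=True)

if __name__ == "__main__":
    post_to_audio("post.txt", "post.aiff")  # then copy post.aiff onto the player
```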

Triumph of the Mundane

I have seen the infamous "Hillary 1984" video, and I am profoundly unimpressed. Presumably the creator thought he was doing something profound, or clever, or both, but he's not really saying anything to anybody who hasn't already swallowed the "Hillary is the Anti-Christ" koolaid. Are we supposed to see Hillary Clinton as "Big Sister"? Are we supposed to hear her words as Newspeak, just because we see them juxtaposed with elements from Ridley Scott's bombastic vision-for-hire?

To cut to the chase: Does something become profound as soon as you mash it up with sacred (or at least iconic) (commercial) content? Ridley Scott rubbing off on Phil De Vellis, just by virtue of De Vellis getting his grubby mitts on Scott's footage?

My first feeling on viewing the mashup was disgust. I'm not quite the farthest thing from a Hillary Clinton supporter, but I'm not far off from that. She's more or less unelectable, as far as I'm concerned, and I do strongly suspect that she's got some control issues, as the therapists like to put it.

But this is just sophomoric. If I were Barack Obama, I'd be embarrassed to have supporters like that. Good thing I'm not Barack Obama, of course, because to get elected he's going to need a lot of supporters like that, and he can't afford to let them know they embarrass him...

My second thought was that you could pretty effectively cut the legs out from under Phil De Vellis's juvenile pseudo-intellectualism by just taking the same bombastic content and splicing in somebody else. Like, oh, I don't know, maybe...Barack Obama?

And so now I see that I'm not the only person who finds the whole thing kind of silly and puerile. Though honestly, I had something more like Everybody Loves Raymond in mind. That might actually border on profound.

Male people are all, like, .... And female people are all, like, ....

Most comedy is bullshit, at some level. That is, it doesn't matter whether it's true, so long as it's funny. "If it ain't the truth, it oughta be." The point of comedy isn't to be honest -- the point of comedy is to make people laugh (at you).

Or, in the words of Peter "Wait, Wait, Don't Tell Me" Sagal, some stories are just "too good to check." Like the one about how women talk twice as fast and three times as much as men, and men think about sex 300 to 1000 times as often as women.* It's only funny if we don't point out that, as far as any actual evidence has ever shown, it's just not true. And it really stops being funny as soon as we point out that the main reason people think it's so funny is that it's a convenient reinforcement of existing stereotypes.

And it really, really stops being funny when you put those two facts together and come up with the realization that our stereotypes aren't actually based on evidence. How inconvenient. And we were having such fun with this new wave of reactionary "innate differences" nonsense.

I guess when all is said and done, it really is "just a story, dude." And it's not as though anybody ever did anything bad by making up stories. Right? Right? But still, I was fairly disappointed when I learned that Peter Sagal and his crew had swallowed the load of crap that Dr. Brizendine is dishing out in her new-ish book The Female Brain. I guess it was a bit much to expect, that they might, you know, have a view that wasn't a lockstep endorsement of the same old bullshit.

But then, that probably wouldn't be funny.


--
*According to Dr. Brizendine, men think about sex about every 52 seconds, while women think about sex one to three times per day; I extrapolated based on a 16 hour day, assuming that at least some men don't dream about sex every 52 seconds....

Great Big Voices

Breath control! Support your voice! Support, support!!

Those words ring in my memory as I recall instructions from past voice teachers of mine, like Avery Crew, whom I studied with in high school, and Rosemary Russell, my college instructor. Avery Crew responsibly would not take any students under the age of 16 because voices took time to mature. Ms. Russell was an organist and didn't even know she could sing until her beautiful, low mezzo-soprano voice gradually began to mature.

These teachers were not deterred by young, unwieldy, big voices. They were part of an era that understood the importance of technique and ability to project voices in a concert hall. They were not in a hurry to push young voices, especially ones such as mine that wouldn't blossom fully for another decade or so. I was a young contralto, a minority among other female singers. I used to envy the versatile sopranos as I lumbered along, attempting to harness my voice through proper technique and taking care of my health (big voices don't always come in big bodies). I learned to project my voice (especially important for low ranges) and sang without the Microphone God appendage. I didn't overuse my voice and never had problems with vocal cords, etc. Singing eventually wasn't my fulltime career, but I never stopped solo performing, part-time, professionally. And I'm still going strong decades later.

Today I read with interest Anne Midgette's New York Times article, "The End of the Great Big American Voice." Oh, to be endowed with not just a big voice, but that rare Great Big Voice, a voice that unreservedly could envelop a captivated audience with its spiritual radiance! Such were the voices of opera greats I remember and admired from afar. But surely, it's not the end?

Listening to Anne Midgette's audio presentation accompanying her article, I wonder if the Big Recording God would not be willing to share the throne with beautiful Great Big Voices that don't always record as well as small voices, yet display their magnificence more fully with good technique in an opera house. Nothing at all wrong with smaller voices. But not all music was written for small voices in a recording studio. Truth is, I'd rather listen to Pavarotti than Bocelli on a concert stage. Just my not-so-humble opinion.


White Bread

What would you do if presented with a mound of 3,960 slices of white bread, from 180 loaves minus the end pieces but with the crusts? Eat it? Give it away? Torch it? Squish the bread and roll pieces into gummy balls?

Artist Beili Liu dried the pieces and created "Breadth," a wall of white bread -- length about 20 feet and more than three feet high -- displayed at a University of Michigan dorm dining room.

"I want the work to encourage questions about what you think about food," Liu said. U-M nutritionist Ruth Blackburn responded by saying, "If you have to use white sandwich bread, it's better to use it for art than eat it."

I suppose that might depend on the kind of white bread, for me at least. I'd agree with Ruth Blackburn when it comes to the fluffy, tasteless, stick-to-the-roof-of-your-mouth variety. Blah. But I wouldn't turn down an occasional piece of homemade, crusty French bread.

There's been a lot of research recently regarding the perils of eating white bread made with refined flour. But the results often fall on the deaf ears of white bread aficionados. So, in response, the food industry is offering a whole grain white bread. Is everybody happy now?

What do I think about food in general? Well, is anyone really interested in a dissertation? I'll keep it short, for now. I like to eat food. It helps keep me alive. I like to cook food. I even find preparation and partaking of food to be a creative and sometimes sensuous experience.

If sticky, insipid processed white bread were the only food available to me, I'd hold my nose and eat it. Otherwise, please pass the whole grain bread. It tastes better to me.


The Megalomaniacal Mac

When you start certain Apple applications (such as iTunes and Safari), they check to see if they have a shortcut in the Dock. If they don't, they automatically make one. If Microsoft did that, it would be regarded as incredibly rude; if Apple does it, it's "friendly."

Similarly, the Finder comes configured by default to favor Apple applications, like iLife, iTunes, and Final Cut, by virtue of the fact that it defaults to creating "libraries" of media types that are tailored to those applications.

In case the rationale isn't clear: iTunes makes Apple money. Wherever there is a way to "monetize" the uses to which a personal computer is put, Apple will take every opportunity to put themselves at the front of the queue. iPhoto has hooks to pay services; Final Cut is an expensive piece of software that Apple hopes to sell as an upgrade to home videographers; and iTunes, of course, is making millions of dollars for Apple by linking Mac users directly to the Apple music store.

So why is it again that people see Microsoft as megalomaniacal, but don't see Apple that way?

The Masonic Mac

Some design-geek at Frog Design thinks that iPods are "universally" described as "clean" because the iPod "references bathroom materials." It's kind of a silly little think-piece, not least in that it makes a point and then throws out a lot of unrelated argument in an attempt to hide the fact that it doesn't really make much of a case for what might otherwise be an interesting assertion. But that's not what I'm writing about.

A comment in-thread led me to this insight: Being a "Mac Person" is a little like being a Mason.

Which is to say, to be a "Mac Person" is to feel that you belong to something, while at the same time feeling yourself to be different from other (lesser) people. If you belong to a secret society of some kind, you feel both privileged to belong, and empowered by your connection to that society.

Membership in the secret society comes with a cost: Dues, expenses for robes or other paraphernalia (as Stetson Kennedy wrote in his book about infiltrating the Klan), and any opportunity cost associated with providing expected assistance to other members. Any extra costs are obviously assumed to be at least offset by benefits, by "believers" in the secret society. Those costs are their "dues"; they're what they pay for the privilege of being made special by the organization.

Committing to the Apple Way has similar costs: Software is more expensive and less plentiful; hardware is often proprietary (as with iPod peripherals), or hardware options more limited (if you don't believe it, try to buy a webcam off the shelf at a mainstream store); software conventions are different, and require retraining. Apple users (rationally) presume there to be offsetting benefits, typically cast in terms of usability. My own experience using and supporting Macs tells me that those benefits are illusory, but that's beside the point: Mac users assume them to exist, and act on that assumption.

But they also gain a sense of superiority from it, and they get that reinforced every time they pay more for something, every time they have a document interchange problem with a Windows-using compatriot, every time they have a problem figuring out what to do when they sit down at a non-Mac microcomputer.

The extra cost is understood as an investment. They are paying dues. Being a Mac Person is, in that way, a little like being a Mason. Or at least, a little like what we might imagine it's like to be a Mason, since most of us have never actually met one.

Karma, language, and action

In following the Horgan debate on MeFi, I encountered this, in a response to a Horgan critique of Buddhism:

.... Karma literally means "action." Action always produces results and so the word karma is often misunderstood as referring only to the results of our actions, not the actions themselves. In fact, action and its results are one and the same. Time, the thing which makes us see them as separate matters, is the illusion. Time is no more than a clever fiction we humans have invented to help organize stuff in our brains. ....

... which is, in turn, no more than a clever fiction that Buddhists have invented to help organize stuff in their brains. Because, of course, if the world is an illusion, then we can't prove the world is an illusion.

Tricksy, these Buddhists, is....

But I digress, as usual. What really interests me is the simple assertion that actions and their results are "one and the same", without any attempt to explain what that means. If you parse the language, what doubtboy is really saying is that the Buddhist term karma isn't a synonym for "action", it's a cognate. "Karma", that is, doesn't "mean" "action" -- it "means" (in English) "action plus result."

The two different conceptualizations of "action" let you reason to different ends -- they give you different kinds of power. One gives you power to include, the other gives you power to divide. As Pirsig pointed out in Zen and the Art of Motorcycle Maintenance, the power to divide is powerful, too.

Where Buddhist practice starts to get really interesting (to me) is where it allows you to use both conceptualizations of "action" simultaneously. It's my experience that many dedicated students of Buddhism don't grok this possibility; I expect that doubtboy, like Pirsig, does.

Imagination Failure of the Moment

Failure of imagination is often indistinguishable from arrogance.

Here's how The Blue Technologies Group conceptualizes the ideal "writer's" editing environment:

The concept of single documents in the classical sense is dismissed. Text elements take their part and are organised in a project, the container.
Every text element has two editing levels: the "standard" text and a "note pad".
The ability to format texts in an optical way (bold faced, italics, etc.) is omitted - you can divide paragraphs into levels and set markers instead.

It's passages like this that drive home to me how sorely and sadly in need most people are of a little applied personality theory. Because it's painfully clear to me just from the language that they use that their word processor, Ulysses, is going to be a painfully inappropriate tool for the vast majority of writers.

I know that because Ulysses has clearly been defined to suit the personality of a particular type of writer. The words and concepts its creators deploy tell me that. They talk about "projects", "markers", "levels" (of paragraphs?). These are organizational terms; they're conceptual terms. Using them to appeal to "writers" exposes the assumption that all writers think in similar ways. It implies that "writers" will want to restructure the way they think about producing texts such that they're vulnerable to being organized in "levels", and that they'll find it a benefit to replace italics and boldface with "markers".

My own experience working with writers who need to maintain HTML demonstrates to me abundantly that people aren't typically very interested in replacing italics with an "emphasis" tag. The idea that "italic" is visual and "emphasis" is conceptual (and hence, independent of presentation) is too abstract from the reality of writing, for them -- it's too high-concept; for them, the reality of writing is that emphasized passages are in italics, and strongly emphasized passages are in boldface.

And I also see that while they talk about eliminating distractions, they produce an application with a cluttered and confusing user interface that looks to me like nothing so much as the UI of an Integrated Development Environment (IDE). While I've grown accustomed to the metaphor, I can remember when I found it cluttered and confusing, and I know from long experience that most people find those UIs as confusing as hell.

Now, this may be a great environment for some creative people. But based on what I know about personality theory, that subset of people is going to be very small -- something less than 7% of the population, most likely, and then reduce that to the much smaller subset that are writers who work on substantial projects.

I might even try Ulysses myself, for whatever that's worth; but if it looks to me like it would be the slightest nuisance to produce reviewable copy (for example, if I have to spend ANY TIME AT ALL formatting for print when I send it to friends and colleagues for review) then it's more or less worse than useless to me: Any time I save by having my "projects" arranged together (and how many writers do I know who organize things into discrete projects like that?), would be wiped out and then some by time wasted formatting the document for peer-reviewers. And I haven't even started to talk about trying to work cooperatively with other people....

The (partly valid) response might be that if writers would only learn to use it correctly, and adopt it widely enough that you wouldn't need special formatting to send a manuscript out for review, then Ulysses would be a fine tool. Of course, that's the same kind of thing that Dean Kamen and his true believer followers said about the Segway: If we'd all just rearrange our cities to suit it, the Segway would be an ideal mode of transport....

It's not the marketing I object to -- that will either work or it won't -- it's the arrogance of presuming that they've found the True Way. Because the implicit lack of interoperability that goes along with defining a new file storage protocol (and I don't care how you dress them up, they're still files) basically inhibits Ulysses users from working with other writers, and therefore implies that it's a truly separate way, if not a purely better way. Ulysses looks to me like a tool that fosters separateness, not cooperation -- isolation, not interaction. It's farther than ever from the hypertext ideal.

But then, I suppose my irritation is indicative of my own personality type.

Captivating Museums

I have a membership at The Henry Ford, which comprises various attractions: Henry Ford Museum, Greenfield Village, an IMAX theatre, Ford Rouge Factory Tour, and the Benson Ford Research Center.

Greenfield Village is my favorite of the offerings there. It's been a favorite since I was a kid. It's a step back in time. A village from an era past.

The Henry Ford Museum is cool, too. But it doesn't have the enveloping ambience for me that the village does. I'd rather walk through a village than through lines of automobiles, planes, and trains, I suppose. To be fair, the museum isn't at all boring. There are some wonderful interactive features.

At Greenfield Village I fully participate in the activities. I walk in real houses and laboratories and stores and ride on the train and the steamboat. This year I'm looking forward to checking out the "Summer Evenings":

Light up a Saturday night with inspiration for a new generation at Greenfield Village Summer Evenings. New this year, nationally renowned guest artists will make appearances at the Village selling their wares alongside our own award-winning artisans. Grab a cocktail, stroll through Liberty Craftworks, listen to the music and jump in a Model T. It's a different park, just before dark.

I remember several fine, candlelit, family style holiday dinners at the Eagle Tavern. One year each table was assigned a verse from "The Twelve Days of Christmas," and we had to sing and act out the verse. Much fun. Especially entertaining were those who had imbibed enough to perform in truly uninhibited fashion.

My mother tells a story of a wedding at Greenfield Village she attended years ago when the groom, a fair, freckled redhead, turned very pale and fainted during his vows.

All these memories and more come to mind as I read Sam Smith's "How to keep people going to museums" (The Progressive Review, June, 2005). I thought of other fine museums I'd visited and aspects I liked about them and others I didn't. Certainly not all can or should be outdoor museums, but Sam Smith has offered some invigorating ideas for museums. He figures that creating "museum advisory boards of 12 year olds - i.e. those most likely to enthuse about or get bored with exhibitions" might help to make museums creative as well as educational.

Couldn't hurt.


Live Theater for Mall Rats

Yesterday evening I traveled north of town to see a live performance of the musical, The Scarlet Pimpernel... in a shopping mall, in a cozy theater nested between two retail chain stores.

Apparently this concept of doing theater in shopping malls is not new, but I personally have never seen a live stage theater built inside a mall before. And I thought it was cool. Shop a little, stroll along, grab a snack, take in a movie. Scratch that. Take in a play. How charming. How non-elitist. How wonderfully culture-building, this collective enterprise.


Mature Content Warning: This Program Might Make You Think

Before Nova this evening, there was a "mature content" warning.

Nova tonight discussed an ongoing controversy regarding the origin of the original human inhabitants of the Americas. In brief, it discussed the long-standard "Clovis-first" theory in the light of new archaeological finds, conjectures based on analysis of tools from ice age Europe, and evidence from analysis of mitochondrial DNA.

In short: About as hard-science as archaeology gets. Finding the bones, finding the tools, big-time.

During the entire program, there wasn't a single bare breast, not a single blue word or phrase, not one mention of gay marriage or even a hint of sexual liaison (aside from the implication that the ancestors of modern Native Americans might have, you know, reproduced).

So why the mature content? Is it, perhaps, that they're scared that some religious fanatic might point out that when you're talking about things that happened 15,000-20,000 years ago, you're pretty much assuming that the world is older than 4,000 years -- hence insulting all the KJV Baptists in the Nova audience?

By contrast, consider any random episode of Law & Order: Elevator Inspectors Unit, or CSI: Sheboygan, wherein you're likely to find references to "last meals" of semen or violent sexual deviance. I don't recall ever seeing a "mature content warning" before either of those shows. Ever. But then, they don't challenge the Dog-given age o' the universe....

Momentary Thoughts on Empiricism in Design

When I read reports from other people's research, I usually find that their qualitative study results are more credible and trustworthy than their quantitative results. It's a dangerous mistake to believe that statistical research is somehow more scientific or credible than insight-based observational research. In fact, most statistical research is less credible than qualitative studies. Design research is not like medical science: ethnography is its closest analogy in traditional fields of science.
[Jakob Nielsen, "Risks of Quantitative Studies"]

I've always found it more than a little ironic that many designers have such a strong, negative reaction to Jakob Nielsen, especially since most of them do so by banging the drum in protest of what could be termed "usability by axiom": The idea that following a set of magic rules will make a website (or any application) more usable. I find it ironic, because Nielsen has always seemed to me to be a fairly ruthless empiricist: His end position is almost invariably that if a design idea doesn't actually provide the usability benefit you imagined it would, then you shouldn't be using it. This month's Alertbox is a case in point, but there are plenty of others I could cite.

And therein lies the problem. Designers, painting broadly, really do know more than the rest of us do about design, at least on the average: They spend years in school, they produce designs which are done according to the aesthetic rules and basic facts about human interaction with machines and which are then critiqued by their teachers and colleagues. They've often even done their research quite meticulously. But they seldom bother to actually look at what real users do -- at least, in any way that might do something other than validate their preconceptions. And it is, after all, the real users doing real work who will get to decide whether a design is effective or not.

Take "Fitt's Law", for example: If you search for tests of Fitt's Law, you'll find plenty of tests, but the last time I looked, I could find none that tested Fitt's Law in a real working context. And there's a good reason for that: It would be really hard to do. To test effectively, you'd have to include such contextual features as menus, real tasks, application windows -- and then, change them. It's barely feasible, but do-able -- it would be a good project for someone's Master's thesis in interaction design, and it would be simplest to do with Linux and collection of different window managers. You'd have to cope with the problems of learning new applications, and sort out the effect of those differences on the test. It's a tough problem to even specify, so it's not surprising that people wouldn't choose to fully tackle it.

But I digress. My point is that it's relatively easy to validate the physical fact that it's easier to hit big targets than small ones and easier to hit targets at the edge or corner of the mouse area than out in the middle of the visual field. Unfortunately, that's not very interesting or useful, because we all know that by now. (Or should. Surprisingly few university-trained or industrially-experienced interaction designers take account of such factors.)
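For concreteness, here's a small sketch of the standard Shannon formulation of Fitts's Law, which is all that underlies the "big, close targets are easier" fact; the a and b constants below are made-up placeholders, since in real studies they're fit empirically for a given device and population:

```python
# Sketch of Fitts's Law (Shannon formulation): predicted movement time grows with
# the log of the distance-to-width ratio. The constants a and b are illustrative
# placeholders; real values come from fitting measured pointing data.
import math

def movement_time(distance: float, width: float, a: float = 0.1, b: float = 0.15) -> float:
    """Predicted seconds to acquire a target of `width` at `distance` (same units)."""
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# A large, nearby target vs. a small, distant one:
print(movement_time(distance=200, width=100))  # low index of difficulty, quick
print(movement_time(distance=800, width=10))   # high index of difficulty, slower
```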

One thing it would be interesting or useful to do, would be to figure out what the relationship is between "hard edges" (like the screen edge of a single-monitor system) and what we could call "field edges" (like the borders of a window).

What would be interesting would be to study the relationship of the physical truths exposed by Fitts's "Law" with the physical truths hinted at by things like eye-tracking studies.

What would be interesting would be to figure out what the relationship is between understanding and physical coordination. Quickly twitching your mouse to a colored target doesn't tell you much about that; but navigating a user interface to perform a word processor formatting operation could. Banging the mouse pointer into the corner of the screen like a hockey-stick tells you that mouse pointers stop when they hit screen edges; I already knew that from having used windowing user interfaces since 1987. What I don't know is whether cognitive factors trump physical factors, and simple validations of Fitts's Law do nothing to tell me anything about that.

What would be interesting, would be to design to please customers, instead of to please designers.

A Graceful Interlude

Aglaia, Euphrosyne, and Thalia personified grace and charm in Greek mythology. The trio of goddesses, dancing and singing, would often accompany Aphrodite, goddess of love and beauty.

These goddesses of grace came to mind as I watched young performers in concert dancing lithely across a high school stage yesterday evening. The flamenco suite of alegrias, sevillanas, and tangos accompanied by a professional flamenco guitarist was especially wonderful.

I thank these young performers for a much needed, elegant respite from some otherwise graceless aspects of my day. And for the moment I imagined dancing, as Yeats wrote ("To a Child dancing in the Wind"), "there upon the shore" without a care "for wind or water's roar..."


Small Choices Moving Fast

It's a truism: You can't use the system to really fight the system. If you use a record label to sell songs about smashing capitalism, you're not doing anything substantial to smash capitalism.

So what do you do? Opt out of everything? Or act in small ways? Small ways are unsatisfying; and in any case, how do you know that the soap or chips you buy are really doing anything like what your conscience would have you hope?

Seminal straight-edger Ian MacKaye noticed these contradictions years ago [RealAudio], and they played a role in his move to a lower-volume sound:

"Volume had relegated bands to playing largely commercial venues. Most of the places that had sound systems were commercial venues; their economy is based on their bar sales. It cements this really insidious link between rock and roll and the alcohol industry. The idea that the people who music epseaks to in some ways the most deeply -- and by that I'm talking about kids, teenagers -- are by and large not allowed to see bands play because they're not old enough to drink."

And in turn, it cements the role of rock and roll as a gateway to the bar life. Not the connection someone like Ian MacKaye would miss. I don't doubt that awareness of that contributed to his desire to play in "non-traditional" venues like family restaurants, public places, and repurposed rented spaces like boathouses.

Small choices can make a difference. They might not overthrow the order of things, but then, revolutions are messy things that often do more harm than good.

Remembrance As Modern Art Gone Bad

Speaking of Oklahoma City, my old Okie friend Kelley offered his thoughts on the memorial:

"I still say they should have planted 168 redbuds-a veritable forest that would be blooming now. What a sight that would be, an affirmation of life, a colorful display that cannot be equaled. Instead, they have those silly chairs. Stupid. Modern art gone bad. Yes, they were bureaucrats (mostly) but I think the chair is simplistic and mundane. After all, the noble, tough redbud is the state tree- they're hard to kill and they deal with adversity in a manner I think transcends their environs. Oh yeah, they're the state tree. Duh."

As I sit here, I have a vision of hundreds of ghosts sitting in those cold stone chairs for eternity.... Bureaucrat or no, I find it hard to imagine they wouldn't rather be sitting in a Redbud grove.

I responded that subtlety has become a lost art, accepted only from people (like, say, Roy Blount) who can pretend they're being obvious; and that real local character is passé, like the "southernness" of Atlanta or Houston.

But we've become a monumental culture. We might once have planted trees and let the glory be to God or Nature, and had faith that the tree would one day grow large. But that kind of sentiment died off in the Dutch Elm plague or was suffocated by Cold War techno-optimism. Now, it's no good if it's not human-made. (Ayn Rand smirks from her grave.)

Here in NY, I think the appropriate plant would be blackberry bushes. Let one survive, and you're buried in them forever. My friend Lynne planted blackberries around her fence for some reason a few years back, and now the whole area is a wasp-loud glade all summer long.

Up in Maine, it would be wild roses. Those things grow *as* *weeds* in the cracks between the big wave-smoothed boulders right at the ocean's edge. Even the salt grass has a hard time there.

CORRECTION: I'm chagrined to be reminded that Lynne's bushes are raspberries, not blackberries. But either will take over in the rockiest, most clay-bound soil, given half a chance. And I'll stand by my Yeats allusion, even if it doesn't represent a literal truth, because I like the way it sounds...

Lost Mysteries

Sometimes I miss not knowing things.

I'm not talking about the big mysterious things. There are a few of those I'd rather not know, but that's a different issue. This is little, simple stuff, for which we can now easily find an authoritative (if not necessarily correct) reference on the web.

For example: When I was a kid, I was always seeing movies without knowing anything about them. I might recognize a face ("Hey, there's that guy from that thing!") or a voice or a walk, or even a style. But I saw a lot of cool movies as a kid that I couldn't have told you anything about aside from the plot or the setting.

I remember one time as a kid, coming home from school at noon (it was the last week of the school year); I think I must have been in junior high. The local PBS affiliate was showing afternoon movies that week, so I switched it on to see what was there. It was old -- in black and white, and in Japanese, and as I turned it on it was mostly a motley bunch of people having an oblique conversation while they waited out a rainstorm in a busted-up building. But I stuck with it, and soon it got more interesting: The conversation was about a murder case, and one by one they worked through five different versions of the event. In one, a noble-born husband dies by a fierce bandit's sword, while defending his wife's honor as she cowers in the shadow; in another, the wife tempts the bandit, and the husband must be goaded into fighting; and so on, with each version glorifying or justifying its teller. In a fifth and final version, from a surprising source, all parties come off petty, venal, and weak-willed.

I never knew the name of the picture or anything about it when I was watching. But it stuck with me for years. Probably a week didn't pass that I didn't think about that ugly fifth version, thick with fear and utterly lacking in grace for anyone. Until one night in college, I went to the regular screening session for my Japanese cinema class. That night I saw a film called Rashômon.

These days, there wouldn't be a mystery. I'd just look it up on IMDB or post a question to Ask Metafilter. It's all easy, now. We go, we get our answer -- we don't spend time chewing on the memory of some mysterious film or book or song, reworking it in our memories until we make it into something that speaks to us.

My brother Glen once told me about a film he'd seen as a kid. It was an old film -- black and white. About a rich old man who dies alone and friendless after uttering the mysterious phrase "Rosebud!" -- which turns out to refer to an old sled. He'd thought about that movie a lot, over the years, but had never been able to remember the name of it, or who starred, or who directed.

I thought about it for a moment, and took a guess: "Sounds like Citizen Kane." (I'd never actually seen Citizen Kane at that point, mind you.)

He shook his head resolutely, as I recall. "No, that's definitely not it."

New Terror Threat: Unitarian Jihadis

Can you actually wage jihad for tolerance? Jon Carroll @ SFGate "reprints" the manifesto of the Unitarian Jihad [fwd courtesy Amy]:

We are Unitarian Jihad, and our motto is: "Sincerity is not enough." We have heard from enough sincere people to last a lifetime already. Just because you believe it's true doesn't make it true. Just because your motives are pure doesn't mean you are not doing harm. Get a dog, or comfort someone in a nursing home, or just feed the birds in the park. Play basketball. Lighten up. The world is not out to get you, except in the sense that the world is out to get everyone.

Brother Gatling Gun of Patience notes that he's pretty sure the world is out to get him because everyone laughs when he says he is a Unitarian. There were murmurs of assent around the room, and someone suggested that we buy some Congress members and really stick it to the Baptists. But this was deemed against Revolutionary Principles, and Brother Gatling Gun of Patience was remanded to the Sunday Flowers and Banners committee.

It would be funnier if powerful and highly educated men didn't believe that there's some kind of "anti-Christian conspiracy", or think that judges who have the integrity to make objective judgements are just asking to be shot. Those folks should try being a non-Christian for a while, and see what that feels like.

Delocating The Village Green

During one particular, unhappy period of my life, I used to cross the street from the Y to the Village Green after my morning workout, and get a large coffee (and some sesame noodles, if I was feeling flush), and sit at the counter while I scribbled in my notebook.

The first refill was free; some days I'd go through four large cups. I'd mostly just write, alternating with long stretches of staring out the window. Sometimes I'd take a break to make a to-do list (top item of which was usually something on the lines of "GET JOB"). "The Green" was one of those large-ish, eclectic bookstores that you often used to find in urban to marginally-urban settings, featuring huge selections of magazines, an unusual selection of books, and a section filled with some interesting food and candy.

And coffee. They always sold coffee, and as early as when I started visiting Rochester in the winter of '90/'91, it was good coffee -- not that crap that chain coffee shops dark-roast or pump full of artificial flavor to conceal its poor grade. Later, as they expanded in an attempt to compete with the suburban mega-bookstores, they added tables and chairs to go along with a new selection of pastries, cake, and vegetarian deli goods. They expanded their big suburban store in Pittsford; they built out their "flagship" store (really the much smaller of the two) to add a new CD store, trying to target the custom order market.

They went out of business not long after that, like a player at Risk who gambles on too rapid an expansion. It was a slow death-spiral, first rumored around the neighborhood, then heralded by the closure of the Pittsford store. As I saw it at the time, it was purely a matter of bad cost-containment: The wastage in their coffee shop operation was terrific. I counted one time, while I sat there, and noted that on any given weekday, they'd keep a dozen or more cakes, pies, tortes and cheesecakes in the display case. At the end of the day, they might have completely consumed four or five of them. Still, they stubbornly insisted on keeping their food inventory until almost the end.

When the Green finally went under, they walked through and put price tags on everything: The books, the bookshelves, breadracks, refrigerated cases, anything that wasn't nailed down. Then one day, it closed, and was replaced a few weeks later by a remaindered-book wholesaler. He stayed for a month or two (probably sitting out the end of their lease), and then the space was closed. Half of the ground floor would be refurbished into a Pizza Hut; the old record area, upstairs, became a YMCA youth center; and the main building became a Hollywood Video.

It didn't take long for a succession of new coffee shops to open up, in a pair of buildings across the street and down a half-block. Neither lasted: The first was badly-managed and ahead of its time (an Internet cafe in 1997), and the second got knocked out cold when a Starbucks opened right across the street. Right between the sushi place and the trendy boutique, and across the alley from a cozy, carefully-hidden used bookstore called the Brown Bag, in a residential home that once housed a trendy wood-fired pizza place. (The Brown Bag used to be called the Oxcart, until its owners got out to write children's books full time. That was something more than 15 years ago. It changed so little that lots of folks still call it the Oxcart.)

Starbucks is much busier than the Green ever was. In my gut, I don't know why; the Green was cheaper, their coffee better, their desserts were from the best dessert bakers in town. (Cheesy Eddie's carrot cake is pretty hard to beat.) Intellectually, of course, I know that people don't go to Starbucks with any conventional notion of value in mind. They go for an upscale version of that same ritualized sameness that Ray Kroc grokked: The beverage names are an incantation, a call-and-response to the baristas; the packaged and routinized baked goods are offerings to some god of status-through-commerce. I feel unclean whenever I go into a Starbucks, because I know that I'm in the temple of a faith to which I am apostate.

I've been to lots of coffee shops since then, and even spent a fair amount of time in one or two or three. But it's not the same. They're more expensive, and that's a big part of it. It's not that I'm cheap; it's that the cost starts to feel like an offering to those same gods of style, of status-through-commerce. It's a different sect, but it still feels like the same creed. Still, the coffee is good, the food is good, and the old Hallman Chevy building is fairly charming.

All of this is brought to mind this morning by an entry on the Daypop top 40: Delocator can help you find an independently-owned coffee shop near any US zip code. I don't see any near my zip code that I didn't already know about; it would be nice if they could take proximity arguments, which would let me see several more. But this is a pretty unusual area; we had "indie" coffee shops here before they were cool, and some of the best of them weren't proper coffee shops at all. Like the Green.

Toothsome Ironism

Folks sure do some funny things to make other folks think they're hip.

"Toothing" seems to have been a hoax. At least, that's what everybody's stumbling over their shoelaces to declaim. ("Dogging", though -- which differs from toothing in kind only insofar as it doesn't have a "fake" name -- is apparently real. Unless Ste Curran and Simon Byron are going to claim credit for that, too. "Yes, you see, all hedonism is a great hoax. Nobody actually has anonymous or exhibitionist sex. We made that up and you're all rubbish for thinking otherwise.")

What interests me is not the feeding-frenzy around the original "hoax" so much as the feeding-frenzy around its exposure, as people and institutions race to minimize the damage to their egos.

I use scare-quotes on "hoax" because, while I don't doubt the story about how the term came to be, I also don't doubt that people do it. Because, of course, the fact that somebody made it up has more or less nothing to do with whether people actually did it -- after, or even before. But it's officially a hoax, now; ergo, anyone who "believes" it ever happened is a fool. (And, apparently, anyone who dares notice that the ironists behind it are prancing about naked is twice the fool. C'est la vie.)

There are some interesting things that often (if not usually) happen during the course of a big hoax:

  • The hoaxers have a clever idea that they think is sufficiently outrageous that people ought not believe it. Anyone who did would be a fool, and therefore would merit ridicule.
  • In a successful hoax, the pranksters then expend no small effort actively duping at least a few people into going along. Documentation of this will later be used to beef up their credentials as Clever Blokes©.
  • People start actually doing the hoaxed thing -- or treat the hoax as sufficiently real that they start re-enacting it. (Making it, like, not fake, eh?)
  • The hoaxer claims credit, usually implying that all the buzz after the fact was entirely his doing, and therefore totally fake. (This is done, of course, to make the hoaxer feel important.)
  • The media outlets and members of the public who got taken in on the original hoax stumble all over themselves in the rush to discredit any reports of the hoax-activity. (This is done, of course, to alleviate the sense of foolishness that comes of being "taken in.")

I personally never "bit" on toothing in the sense of blogging about it or expressing moral outrage, etc. That's not because I immediately recognized how improbable it was -- quite the opposite. It was because I never found it that implausible. Aside from the fact that Bluetooth messaging has had well-documented and unexpected use in ad hoc social networking [pdf], I've seen enough amateur hedonists casting about aimlessly in the "culture of death" that toothing doesn't seem that improbable to me -- certainly not in the realm of "throwing a brick at the dancefloor with a love letter attached, and hoping that the person it hits will agree to sleep with you."

I always reckoned the success rates for toothing to be in territory that a party-bar wingman (among the most troublesome of amateur hedonists) could wrap his sodden cognition around: "One in thirty, those are pretty good odds, bra!" Toothing would have poorer odds than one in thirty, to be sure; but the effort involved is less, too. Technology decreases the marginal cost. And as toothing became more "popular" (i.e., as the "hoax" spread more widely), a greater proportion of amateur hedonists would leave their Bluetooth open, and there'd be a substantial likelihood that toothing would actually work.

And in certain settings, it would most likely work really well. Think bathhouses....

So for me, what's really interesting is that in buying into the "toothing is a hoax" meme -- in accepting that the idea of anonymous sex mediated by text messaging was only ever always merely a hoax cooked up by a couple of bored wankers -- we miss the opportunity to learn whether the activities described as "toothing" ever actually happened. That would be kind of fascinating: Where? How frequently? What were its etiquettes? What did it do to the spread of STDs?

I expect that we would find it's done in dark places with loud music and lots of intoxicants, by people who don't then go home and blog about it. Making it part of that world outside, the one that hip young blogospherians like Byron and Curran often seem ill-acquainted with.

The Imaginary Terri Schiavo

Imaginary people make much better martyrs.

Case in point: Terri Schiavo. The appeals are finally exhausted; Terri Schiavo is dead, unequivocally, unappealably. And we've just begun to see the consequences. Quite aside from the impending wrongful-death suit (which will be brought regardless of the results from the forthcoming autopsy, to be performed by a Jeb Bush appointee), the fight has catalyzed a constituency. It's given bullshit artists like Tom Delay (that old exterminator) a soapbox to stand on. Note, as we go forward, the endless repetition of their Big Lies: That the "American People" are behind the reckless Conservative-Republican adventurism; that the case shows improper involvement by the courts, instead of the courts doing their jobs by (perish the thought!) making judgements.

What was this case about? It certainly wasn't about whether one person would have preferred to have her body die; it passed beyond that threshold years ago. It passed beyond that when Bob and Mary Schindler concocted a "person" they called "Terri Schiavo", and identified her with their daughter, and pasted her face over their daughter's face whenever they saw her limbic-brained body in that bed. The "Terri Schiavo" that Bob and Mary struggled so hard to defend was not their daughter, but their dream of their daughter, or at least the best dream they could muster under the circumstances.

And she was a perfect daughter, in many ways: She didn't talk back, never contradicted their version of her life's narrative, never corrected their inventions about what she might be thinking at that moment. Or have thought when she was eight, for that matter.

It certainly wasn't about what the real Terri Schiavo's wishes might have been. What they were, I can't know, and I daresay Michael Schiavo can't know for sure. But judges have been evaluating the matter for seven years and not found a reason to suspect that she wanted her body to remain alive long after she'd lost the capacity to engage in detectable interactions with other people.

True, Michael can't have known for sure; but her parents -- surely they must have known?

Why? Why would we suppose that? My own parents wouldn't have the faintest idea what I'd want in such a situation. For practical purposes, they know nothing of real substance about me that they didn't know before I was eight. I could name four or five close friends, a handful of ex-lovers and seven or eight not-so-close friends who'd have a better idea.

So, no, it's got nothing to do with Terri's wishes. But it's got a great deal to do with how her parents imagine her wishes -- with the wishes of their fictional Terri, as it were.

And Jeb and George Bush's and Randall Terry's and Tom Delay's fictional Terri. Which is the real obscenity, here, of course. If it were just Bob and Mary, it would be a tragedy. And anyway, their version of Terri is at least based on something real. But with Jeb & George & Randall & Tom in the game, any hope of the real Terri S. being remembered is completely gone. She's doomed to be immortalized as an abstracted martyr for the cause of eliminating secular justice.

Tim's Mammalian Brain

Heaven forbid we should make a rational choice. Because, of course, rational, counter-intuitive thinking has never gotten us anywhere. Not anywhere that we remember, at least, while our lizard brains are in charge. It might be worthwhile, though, to remember that for the last few tens of millions of years, the mammals have been in charge.

Jennifer Loviglio wants an SUV. She wants it because she wants to feel safe:

.... I want an SUV. I want to be safe. Last month I totaled my old Volvo in a scary accident, and at that moment everything changed.

It was late afternoon and the weather was fine --- dry roads, good visibility. I was driving along East Avenue and without warning a young driver in a Honda made an illegal left across traffic. I hit the brakes but it was too late. The awful metal smash. The explosion of airbags with their acrid smoke and debris. My son screaming in the backseat.

The car lurched onto the sidewalk and we got out fast. No one was hurt. ....

... and yet, she still wants her SUV.

She's test-driven them and felt that tendency to roll over; it was even more pronounced in the full-sized SUV, but she still wants "8,600 pounds of metal between my boys and the other cars."

She wants those 8,600 pounds because she wants to feel safe, not because she wants to be safe. In fact, she knows she'll be less safe:

In larger SUVs, that top-heavy pull is even stronger. And yet, even though I know better, it does feel safe up there. A couple of years ago, in a New Yorker article about SUVs by Malcolm Gladwell, an industry expert pointed out that this paradox is common. On an intellectual level people know taller vehicles have a greater chance of a rollover, but on what he calls the "reptilian level," consumers think "if I am bigger and taller I'm safer."

The article also shows how SUVs take much longer to stop and are difficult to steer even at moderate speeds, whereas sporty little cars with their better handling can avoid potential collisions at speeds upwards of 50 miles an hour. It makes the case that a smaller car, which could be crushed by an SUV, might nonetheless be a safer vehicle because of its maneuverability. Still, though, if I'm going to hit something --- God forbid --- I'd rather be in a tank.

Of course, the rational thing to do would be to check the crash test ratings for various models, or even just buy another Volvo. The first one served her well: The much-lauded Volvo space frame did its job, the airbags worked, and no one was hurt. And in the unlikely event of a rollover, there are few cars in current manufacture that will keep her family safer than a Volvo.

But this isn't a rational issue, it's a "maternal gene" [sic] kicking in. And we all know, don't we, that it's "crucial" (by which she clearly means 'forgivable') to obey the yearnings that we think are wired into our genes.

Which is to say, to be good and conscientious parents -- well, mothers, really, since "paternal genes" aren't under discussion -- we must always obey our lizard brains. Heaven forbid we think for a moment with our mammalian brains.

My sister went off the road one time and rolled her car. She was on her way to church on Sunday evening, with her two-year-old son and a bunch of warm pumpkin pies in the back seat. When her '73 Saab 99 settled back onto its wheels, she felt something warm and sticky on her head; but it was only pumpkin pie, and Luke was screaming that frightened but unhurt scream from his fiberglass car seat.

Luke is now 25 and a father of two. The much-lauded Saab roll-cage had done its job. The next day, once Luke's father Tim had arranged to have the wreck (which at least still looked drivable) towed back to the house, he was on the phone looking for a new-used Saab to replace it.

That was Tim's mammalian brain -- his "paternal gene" [sic] -- working. He wanted his family to be safe.

And after all, the mammals are in charge, now.

Chris Rock, the Red-State Conservative

While I can't say it surprises me, it's amused me for a long time that people like Matt Drudge think Chris Rock is "dangerous." Dangerous to what, I wonder. Perhaps to their complacency. Certainly not to "family values" or "common mores" -- not if you're paying attention. I am a bit surprised, though, to hear him referred to as a 'William F***ing Buckley Conservative'.

Years ago, I saw Chris Rock on television -- probably HBO, probably "Bring the Pain", but I don't remember exactly. I do remember a bit he did about men cheating on their girlfriends. He started by signalling his intent -- showing the club, as it were: "Men are stupid. Because you know you're gonna get caught." He does it to be fair, maybe, or maybe to prove that even after he's signalled that he's going to drop a hammer on them, he'll still sucker the men in the audience in. Which he proceeded to do, describing how natural it was to want to cheat, how easy it should be to lie -- if, of course, men weren't stupid. And, more important, if they didn't know damn well they deserved to get caught.

Rock is a stealth moralist. He's a preacher to the pop-cultural -- a wandering rabbi or imam, ministering to the barflies by telling stories in terms they can understand and that will elicit enough of their empathy to make them stretch their minds and consider their world. It's an old and proud tradition (as old probably as storytelling), realized with a wide array of techniques. My favorite contemporary example is Matt Groening's marvellous creation, The Simpsons, which panders to our baser instincts and then springs the trap on us by making its characters renounce their ill-gotten gains in the service of What's Right. Rock's technique is similar: He lulls his audience into a false sense of security, and then explains in quick, brutal strokes that anyone he's suckered is a fool. And a morally depraved fool, at that.

What Rock's not is a 'William F***ing Buckley Conservative', as John Swansburg seems to think he is. Raising your daughters to not be strippers, or suggesting that single mothers ought to put their children before Girls Night at least most of the time, or suggesting that abortion might be a little too cavalierly chosen, are not "Red State Red" conservative positions: They're mainstream moderate American moral positions, shared by the vast majority of adult "Blue" and "Red" state residents, and anyone who suggests otherwise is buying into the Republican framing myth that holds that True American Values are right-wing religious conservative values. They're not actually religious values at all, and what's more, they're not communicated via religion -- at least, not in a healthy, functioning society they're not. In a healthy family, they're communicated by example. Children learn responsible and moral behavior by watching their parents, their extended family, their neighbors, and the people they meet in daily life.

What Rock is, is Lenny Bruce with tamed demons, or George Carlin with more integrity. Any good comic keeps a few demons in the closet to feed him material; but if they're smart (and probably a little lucky), then sometimes, just sometimes, they learn to tame them without losing the energy the demons feed them.

Chris Rock is also another important thing: He's a professional, just like Whoopi Goldberg or Billy Crystal. Personally, I expect his impact on the quality (moral or otherwise) of the show to be positive.

Love, Pain, and Story Logic

One of the most egregious failures of imagination that I see every day is what looks very much like an inability (or more likely an unwillingness) to stretch the mind to understand what a story is trying to tell you.

And what stories are trying to tell you isn't some single, specific thing. If a story is, it's a bad story -- maybe even a false story. Good stories -- "true" stories -- are like thought-experiments: "What would happen if someone did this?" I am increasingly convinced that stories are how humans are wired to make sense of the world. Stories are why we have advanced language skills: Better language made for better stories, better stories made for a more survivable community, etc.

If a story has consistent, valid story logic and character logic -- if the characters behave in ways that make sense for those characters, in that circumstance -- then we can safely say that there's at least some truth in it. If the story is powerfully told, so much the better: Without good telling, we won't stretch ourselves to find the empathy we'll need to make the narrative talk to us. This is what "great" makers of narrative (those folks we call "writers", but also film-makers, poets, songwriters, painters...) have always done.

So Michael Medved is not only being reductionist on this point, he's being a bad critic, because he's approaching criticism without imagination. He's looking at the film as though it's some kind of morality-machine, and any good film -- any good story -- is something more than that. It's a narrative from which we can draw a deeper understanding (that is, if its story logic and character logic are true).

Anyway, I don't have high expectations for a review from anybody who's expecting to find a clear moral universe in an Eastwood film. Think of Unforgiven (endorses prostitution and lawless behavior), Midnight in the Garden of Good and Evil (romanticizes gay sex and murder and promotes an anti-christian agenda through endorsing voudoun), Bridges of Madison County (glorifies adultery), or probably any of his other films from the past 15 years. There is a theme there, though, I think, and it's that Clint Eastwood lives in an increasingly vague moral universe these days. The only things that seem to be certain in Eastwood's moral universe are pain and love. (And there are worse absolutes to fixate on. Power, per se, for example, has no real moral endorsement in Eastwood's vision -- it's a fact, to be sure, but it's always in service of love or pain. But I digress...)

What these films can help us to understand is that a vague moral universe is not an amoral one. Every Eastwood picture that I can recall (aside from his forgettable late Dirty Harry outings, done to win studio backing for future projects) has been driven by its moral choices. His characters do not serve as moral models; rather, they model moral behavior. There's a crucial difference: The former means that they are merely shadows on the cave wall, cast by the contorted hands of a finger-puppeteer; the latter allows us to imagine ourselves in that world, and consider the choices we would make.

Please Just At Least Try To Imagine. That's All I Ask.

But be honest about it. Don't just pretend. That's cheating, and lying, and it makes Baby Jeebus cry.

I'm increasingly convinced that the greatest roadblock to human progress is lack of imagination. More particularly, the inability -- or unwillingness -- to imagine oneself in the position of another.

The problem can look like other things: Like (selective) literalism, as when someone like Michael Medved or Ted Kavanau can see nothing in Million Dollar Baby but a "pro-euthanasia" or "anti-christian" tract. Or it can look like lack of empathy, as when wannabe uber-geeks dismiss the problems of "lUsErS" as being of their own making, or knee-jerk free-will zealots (willfully?) ignore the benefits they accrue from being members of civil society while working to rip out one of that society's underpinnings.

That empathy requires imagination I regard as self-evident; that people who lack empathy literally lack imagination, I regard as open to question. As a friend remarked to me recently, "it's all about what's at your front door."

Convergence through Desire, Redux

How can I remark on digital convergence without remarking on the forthcoming "headless iMac"?

More to the point, what the hell does a "headless Mac" have to do with digital convergence?

I'll explain. Gizmodo helped leak a bunch of really convincing (to me) product unpacking shots of a device called "iHome", which has a buttload of ports on the back and a CD-ROM slot on the front. Alas, there's lots of smoke and steam on the Apple rumor forums to the effect that these must be fake, because the box is just so ugly. Apple's legendary industrial design staff surely couldn't have produced something so "fugly". (Um...right. Something about this presentation really offends Mac-heads, as is clear from the Engadget comments, but I'm not sure what.) But consider that any box unveiled now is most likely not a production version, and might well be camouflaged the way Detroit camouflages its long-range test models.

Be that as it may, and leaving aside the authenticity of the photos, the name would tell us volumes about how Apple sees the market-positioning of this device, and I believe they do not see it the way that 'Bob Cringely' sees it:

.... The price for that box is supposed to be $499, which would give customers a box with processor, disk, memory, and OS into which you plug your current display, keyboard, and mouse. Given that this sounds a lot like AMD's new Personal Internet Communicator, which will sell for $185, there is probably plenty of profit left for Apple in a $499 price. But what if they priced it at $399 or even $349? Now make it $249, where I calculate they'd be losing $100 per unit. At $100 per unit, how many little Macs could they sell if Jobs is willing to spend $1 billion? TEN MILLION and Apple suddenly becomes the world's number one PC company. Think of it as a non-mobile iPod with computing capability. Think of the music sales it could spawn. Think of the iPod sales it would hurt (zero, because of the lack of mobility). Think of the more expensive Mac sales it would hurt (zero, because a Mac loyalist would only be interested in using this box as an EXTRA computer they would otherwise not have bought). Think of the extra application sales it would generate and especially the OS upgrade sales, which alone could pay back that $100. Think of the impact it would have on Windows sales (minus 10 million units). And if it doesn't work, Steve will still have $5 billion in cash with no measurable negative impact on the company. I think he'll do it.
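
For what it's worth, Cringely's back-of-envelope arithmetic checks out on its own terms. A minimal sketch of his numbers, in Python (the $249 price, the $100-per-unit loss, and the $1 billion budget are his hypotheticals, not anything Apple has announced):

    # Cringely's hypothetical subsidy math; none of these figures are Apple's.
    budget = 1_000_000_000    # the cash he supposes Jobs would be willing to burn
    loss_per_unit = 100       # the loss he imagines on each box sold at $249
    units = budget // loss_per_unit
    print(f"{units:,} subsidized boxes")   # 10,000,000 -- his "TEN MILLION"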

I see it different[ly].

Nobody's talking yet about what the iHome actually does have. Rumors abound, and they mostly assume it's basically an iBook without a display. I don't buy it.

The very name of the device indicates to me that iHome is not intended to be used as a general purpose computer in any really sophisticated way. It's intended as a media hub, and any other functions it fulfills are incidental, and what's more, Apple won't be enthusiastic about helping it fulfill those other uses: It will most likely be a mediocre platform for applications work. It will be somewhat more than a set-top box, only because it would cost more to dumb it down. (If I'm proven wrong, I'll certainly be taking a look at iHomes for my own use, but I don't think I'm wrong here. We'll see in a few days.)

I think it will be somehow substantially crippled, and I think I know how: It will have limited display capability, outputting via S-Video and composite only (and the latter through an extra-cost converter from S-Video); and it will not have expandable RAM. Both decisions will be defended on the basis of price, but they'll really have been taken to prevent cannibalizing iBook, iMac and eMac sales. By the way, I essentially agree with Cringely's analysis of the market impact of a fully-capable and cheap iHome, but I think he's applying a much too rational (and charitable) thought process to Apple's senior management.

I think Jobs doesn't know what to do with iTunes. It's a juggernaut he doesn't know how to stop; it's prompting people at his company to actually think about ideas that could shake up the personal computing marketplace, like, say, a genuinely cheap computer with a powerful OS and operating environment. Baseline Macs are built with remarkably inexpensive electronic components: Many still use relatively slow and old versions of the PowerPC chip (the "G4" generation), which by virtue of their vintage are dirt cheap; the "G5" models mostly use relatively slow versions of that chip, and below the most expensive levels, they all use graphics subsystems that are last year's news on PCs. Macs are cheap, cheap, cheap to build. And yet, they're hideously expensive on a bang:buck basis.

If Jobs wanted to really go big, he could have done it years ago. Opportunities like the one that Cringely describes are always there for Apple, all the time. And they never take them. Why? The only answer that's compelling to me is that Steve Jobs does not want Apple to be successful, because that would mean that Apple was no longer about him. Sure, the cult of personality would flourish for a while, but I think he understands that part of his bizarre public lovability is the fact that his exposure is limited. He'll never be as much of a self-caricature as Steve Ballmer or Larry Ellison, but the tarnish would settle pretty quickly, and Apple would quickly become beset by the woes of any company that moves beyond a customer base composed primarily of true believers.

So Cringely's right, I think, about the opportunity, and he's right about what iHome is, but I think he's wrong about what Apple will do with it. And though I predict that Jobs will be accused of not taking these steps out of greed, I think his motivation will be darker: Ego. Though I suppose the Dark Steve's flavor of ego could be cast as a kind of greed....
