
Technophilia

Techno-fetishism, e.g. à la Boing-Boing. See also techno-optimism, geekism, googlism.

Singularity as Death-Avoidance

IEEE Spectrum focuses on the technological singularity. Glenn Zorpette thinks it's largely driven by fear of death:
The singularity debate is too rarely a real argument. There's too much fixation on death avoidance. That's a shame, because in the coming years, as computers become stupendously powerful—really and truly ridiculously powerful—and as electronics and other technologies begin to enhance and fuse with biology, life really is going to get more interesting.


Freeman Dyson and the Pretense of Vision

Freeman Dyson is one of the more dangerous scientists alive right now.


.... The wiggles in the [Keeling] graph show us that every carbon dioxide molecule in the atmosphere is incorporated in a plant within a time of the order of twelve years. Therefore, if we can control what the plants do with the carbon, the fate of the carbon in the atmosphere is in our hands. That is what Nordhaus meant when he mentioned "genetically engineered carbon-eating trees" as a low-cost backstop to global warming. The science and technology of genetic engineering are not yet ripe for large-scale use. We do not understand the language of the genome well enough to read and write it fluently. But the science is advancing rapidly, and the technology of reading and writing genomes is advancing even more rapidly. I consider it likely that we shall have "genetically engineered carbon-eating trees" within twenty years, and almost certainly within fifty years.

Carbon-eating trees could convert most of the carbon that they absorb from the atmosphere into some chemically stable form and bury it underground. Or they could convert the carbon into liquid fuels and other useful chemicals. Biotechnology is enormously powerful, capable of burying or transforming any molecule of carbon dioxide that comes into its grasp. Keeling's wiggles prove that a big fraction of the carbon dioxide in the atmosphere comes within the grasp of biotechnology every decade. If one quarter of the world's forests were replanted with carbon-eating varieties of the same species, the forests would be preserved as ecological resources and as habitats for wildlife, and the carbon dioxide in the atmosphere would be reduced by half in about fifty years.

That's just science fiction, of course -- not the scary part at all. This is the scary part:

It is likely that biotechnology will dominate our lives and our economic activities during the second half of the twenty-first century, just as computer technology dominated our lives and our economy during the second half of the twentieth. Biotechnology could be a great equalizer, spreading wealth over the world wherever there is land and air and water and sunlight. This has nothing to do with the misguided efforts that are now being made to reduce carbon emissions by growing corn and converting it into ethanol fuel. The ethanol program fails to reduce emissions and incidentally hurts poor people all over the world by raising the price of food. After we have mastered biotechnology, the rules of the climate game will be radically changed. In a world economy based on biotechnology, some low-cost and environmentally benign backstop to carbon emissions is likely to become a reality.

Translation: "We don't need to do anything now, because we'll invent our way out of the problem when the time comes."

I suppose I should be grateful that he's no longer appointing himself global diagnostician. At least now he admits that there might be a problem.

I've been told by people I respect that Dyson is a very good physicist. But I'm hard put to recall anything outside of his domain that wasn't just plain stupid once you got past the "oh, neato" moment. I mean, Dyson Spheres are a cool idea, but also a really dumb one if you think about them just a tiny bit. They're a triumph of the broadly logically possible: We can imagine it, therefore it must be feasible. We can imagine going Niven & Pournelle one better and building a sphere around a small star (or arranging otherwise to intercept all of the star's energy). We can imagine nesting matrioshka layers one inside the other, to overlap and trap the inevitable leakage. All we have to do is solve this list of several thousand technical problems. We've solved every other technical problem we've ever been presented with; we'll clearly be able to solve these. What is conceivable, is feasible.

We can imagine magic carbon-sequestering trees, therefore they must be feasible. We can imagine a quarter of the world's trees being replaced by these magic inventions, therefore we should count on it happening (when the alternative is essentially the collapse of civilization).

All of these speculations commit an obvious and really, really troubling error: They assume that certain important things, like the rate of technological innovation, the rate of increase in energy use, etc., are essentially laws of nature: That not only won't they change, but that their not changing is a righteous thing. Moore's Law will go on forever; we'll keep increasing our need for energy at a predictable and increasing rate; we'll keep inventing new ways to solve all of our problems; better living through chemistry.

This kind of thinking is usually based on a detailed look at only a very short span of human history, and a very high-level gloss of anything beyond the past three or four hundred years.

It's disturbingly short-sighted, in other words, even as it pretends to vision.

This is why I don't respect Dyson: He pretends to vision, but is blind to his own short-sightedness.


Feral Residents of the Panopticon

What's it like to live inside the panopticon? Here's what I just wrote at the 'bot.

What's the effect of this kind of life? No doubt the people who brain-farted the idea for this kind of a system in the first place would respond at this point that they are putting eyes on the street, they're addressing "lifestyle crime" (littering, loitering, miscellaneous minor malfeasance), and that the net effect is to get, through technology, what Jacobs asked for in the 1960s. But an honest appraisal would have to recognize that response as disingenuous. The voice is detached, judgemental, and doesn't brook response -- doesn't even afford it, since there are no pickups (that the security company is admitting to) on the cameras. It can't possibly work to provide the kind of human-scale, person-to-person interaction that happens in the relatively messy but relatively safe neighborhoods of the real world.

Residents of the Panopticon | FeralRobots.net

Myths are Metaphors

Pop quiz -- does this passage describe the present, or the future?

You sit immersed in a wireless cloud, navigating your way through the folders on your hard drive. It is a floating forest of branching tree directories anchored to a root folder buried somewhere deep inside the machine. You are listening to streaming audio whilst a torrent of more music flows into your MP3 player. While it downloads, your system is organising your music library into fields within a database and generating a feed direct to your homepage. Via your Flock browser you twitter to your friends about the latest item on the newsriver then post a few paragraphs to your blog, where they join the complex trail of links and paths going in and out of your site. While you surf, it's easy to forget that beneath you lies a creepy invisible underworld populated by spiders, bugs, crawlers, worms, and microscopic viruses, whilst above ground your transactions are hungrily devoured by sheep that shit grass before being aggregated into the Long Tail. That data trail you're leaving behind stimulates the synapses of the global brain, which is in turn pulled towards the gravitational core of the Web 2.0 solar system...

Windows Vista: dreaming nature in cyberspace (PART)

Answer: It's the present, of course.

The lesson: With the right language, you can make anything sound cool. Welcome to cyberspace. Let the meat rot where it lives.

(Via Sterling @ Wired)

Time is the new Bandwidth

I've been doing a lot of video blogging on BEYOND THE BEYOND lately, which must be annoying to readers who don't have broadband. But look: outside the crass duopoly of the USA's pitifully inadequate broadband, digital video is gushing right through the cracks. There's just no getting away from it. There is so much broadband, so cheap and so widespread, that the video pirates are going out of business. I used to walk around Belgrade and there wasn't a street-corner where some guy wasn't hawking pirated plastic disks. Those crooks and hucksters are going away, their customers are all on YouTube or LimeWire...

Bruce Sterling, WIRED Blogs: Beyond the Beyond

Broadband isn't the problem. Bruce makes his living being a visionary. I make my living doing work for other people. It's truly not the visionaries who actually change things -- it's the people who buy (into) their visions, and those people just don't have the time to look and listen at the same time to continuous "bites" of see-hear content.

Podcasts are bad enough -- I have to listen for as long as someone speaks in order to get their point; I can't really skim ahead or scan around with my eyes. I've got to buy into their narrative construction. And I'm paying for that purchase with my time and attention.

This also goes to Cory Doctorow's point about text bites. He's grazing around, taking in small chunks of text at a go, and the web is fine for that, that's his message. Great. Fine. But text can be time- and space-shifted far more effectively than audio, which in turn can be time-/space-shifted far more effectively than video.

What's really needful, as I've noted before, is a way to mode-shift text into audio without human intervention. Or video, for that matter, if you want to get visionary about it. But I'm not going to worry about video right now, because audio is something that some basement hacker could actually pull off with an evening's work, and refine with the labor of just a few weeks. Or so it seems to me. On my Mac, right now, I can select text and have the machine speak it to me, complete with sentence and paragraph pauses. The Speech service is AppleScript-able, so (if I actually knew AppleScript) I could script it to pick up blog posts and pump them into audio files that in turn could be pumped onto my audio player for listening in the gym or on the road. If I spent that much time in the gym or on the road. Which I don't.
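Just to show how small a job it is, here's a minimal sketch in Python rather than AppleScript. It leans on the fact that the same speech engine is exposed at the command line as `say`; the file names are hypothetical, and a real version would fetch the posts from a feed first.

    # Minimal sketch: render a saved blog post (plain text) to an audio
    # file using macOS's built-in `say` command. File names hypothetical.
    import subprocess

    def post_to_audio(text_path, audio_path):
        # `say -f` reads text from a file; `-o` writes an audio file
        # (AIFF by default) instead of speaking aloud.
        subprocess.run(["say", "-f", text_path, "-o", audio_path], check=True)

    post_to_audio("todays_post.txt", "todays_post.aiff")

From there, getting the file onto the player is just a copy.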

Cyberspace as Woodstock Nation

Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.--John Perry Barlow, 1996

In-thread at Web 2.0 ... The Machine is Us/ing Us | MetaFilter, at 12:17 PM

Yet another example of the happy horseshit approach to social activism: Put an absurd stake in the ground and hope that it makes people come that much closer to what you want.

Of course, Barlow never got what he said he wanted, but there are enough new Web 2.0 toys floating around that let people do superficially cool collaborative things that Barlow's probably pretty well assuaged, most of the time. Meanwhile a new post-industrial market has co-opted Barlow's Cyberspace (though, since they've been paying for it, maybe just plain "bought" is a better word), and governments like that of China have been doing a good-enough job of exercising sovereignty where its cyber-citizens gather.

Behold the Agonizer

"You asked me once," said O'Brien, "what was in Room 101. I told you that you know the answer already. Everybody knows. The thing in Room 101 is the worst thing in the world." -- George Orwell, 1984

In the coverage at Wired of the Air Force's new Active Denial System for crowd control, I didn't see any mention of the agonizer. And yet, that's what it is, more or less: A device that induces searing, burning pain that's so intense, subjects cannot help but struggle to get away from it.

It works via millimeter-wave radiation. Wired (courtesy of the Sunshine Project) has thoughtfully provided a rundown of publicly available documentation. On a quick scan, it's hard to tell whether the pain is caused by heating in the skin or by some other interaction between pain-nerves and millimeter-wave radiation. But prolonged exposure to the beam can cause second-degree burns, so heating definitely does occur.

And there's also no mention in Wired's coverage of the applications for torture, which are painfully [sic] obvious to me. An uncreative sadist would leave a victim with second-degree burns after leaving the beam focused for too long in one spot. A creative sadist would hack together something like Gene Roddenberry's agony booth, to move the focus of radiation around to different sets of nerve endings, in order to reduce the effect of neurotransmitter depletion. After an hour or so, I'm quite sure just about anybody would be willing to tell us whatever we wanted to hear as long as it makes the pain stop. In room 101, the man who works the latch on the rat cage is your god.

A vehicle-mounted version is apparently being tested in Iraq right now. I'm very, very curious to know what Iraqis will make of it. I think they'll get the torture angle right away. And since the technology is pretty easy to replicate, I can envision disreputable dictatorships throughout the world deploying copycat devices in the near future.

WESun 0: Mainstreaming Singularitarianism

This morning on Weekend Edition, The Singularity rears its ugly head in the persons of Vernor Vinge (who coined the term) and Cory Doctorow. It's another manifestation of our increasing dread in the face of technological change, and the increasing degree to which we approach that change in irrational ways: in the Vingean scenario, as a rescuing parent; in the Doctorovian vision, as a superseding successor.

Doctorow posits the scenario of a modern human interacting with a pre-literate human: That they would be "in a sense, different species." That they and we would have "nothing to talk about." Maybe he was clearer in the un-aired portions about what's meant by "literate", but unless it means "without language" (and one would expect the word chosen for that to be "pre-linguistic"), he's clearly overstating his case. We can easily talk with "pre-literate" or even "illiterate" people, because there remain between us basic human consistencies that will not be removed by any extropian enhancements which we can plausibly predict.

It's a badly chosen analogy, to be sure, and surely one can be forgiven for choosing analogies badly, no? No. Because the craft of science fiction as gedankenexperiment is all about precision -- or at least, insight -- in your analogies. We need to remember that the beings making the singularity are humans. The aspects of the singularity that are truly, deeply a-human, are not likely to persist in that form. They're likely to get reshaped, recrafted, in some kind of human image.

I think Doctorow's analogy illustrates the most fundamental problem with Singularity Theory, in that it is often a failure of a certain kind of imagination: Empathy.

Vinge posits a more traditional scenario, in a way, as a revisitation of the Jack Williamson nightmare -- but with Williamson's logical problems fixed. Vinge's singularity-intelligence is more of a savior than a successor. A lost parent, restored, if you will. Clarke's technological god. Maybe it can save us from global warming.

Doctorow's singularity-beings are replacements, successors. They are what we are not -- they overcome our weaknesses, and supersede us. There's a sense of mingled dread and fascination in the prospect. I'm still trying to understand how to talk about the impulse. I feel it, myself, to be sure, but I don't have a pat name for it.

Sterling's critique still seems sound. (See his short essay in Wired; longer talk at the Long Now Foundation, as MP3 or OGG or as a summary.) He points out (among other things) that the singularity-being will not come about entirely by accident. It will come about through our choices, and some of those choices will tend to constrain the singularity-being.

Because They Can. (Cringely on Google)

'Cringely' offers an anecdote to illustrate Google's power over the soul of the market: He goes into the bank to deposit a check, gets in line behind a kid who insists on keeping ten feet of empty space in front of him.

... The queue was perhaps 20 feet long and right in the middle was this 10-foot gap. I was in no hurry, I thought. That gap was not going to cause me to get to the teller more than a second or so later than I might if the gap was closed. No problem.

Only it WAS a problem. As the minutes passed that gap started to drive me insane. Finally I asked the kid to move forward.

"It was making you crazy, right?" he asked, clearly enjoying the moment.

(Ah, yes, the joys of being a self-important little putz... but I editorialize....)

Google has something over $6B -- that's six billion dollars -- in cash on hand right now. That's cash -- not credit, not valuation, but real money that people have paid them. And everyone wants to know what they'll do with it.

The day when six billion could buy three Eisenhower-class aircraft carriers has long passed, but you can still make a pretty good splash with that much cash. So, as Cringely points out, all the gorillas in technology are sitting around waiting to see exactly what it is that they'll do. Which gives them an amazing amount of power -- as long as they don't actually do anything.

Putting things in perspective, Google has been really really super good at exactly one thing: Self-promotion. Sure, some of their technology is pretty good, but there's really no evidence that their algorithms are really any better than, say, Teoma's. What they do have is more power. There's a saying among marketing folks: "Go big or go home." Google went big, more or less from day ten or so. "Day ten" because they had to get the money to go big with, first. And that's where self-promotion came in.

I distinguish between "marketing" and "self-promotion" here because I think it's important. Google has always rooted its mystique in the cult of personality that's coalesced around these mythical beasts "Sergey" and "Larry". That's suffered a little, no doubt, as a result of Eric Schmidt's incredible childishness in response to CNet feeding him a half teaspoon of his own company's medicine. Nevertheless, Google still builds its reputation in large part out of the sheepskins of its PhD-filthy workforce.

As Cringely points out (and as I've pointed out for years, myself), though, and much like Microsoft, Google's technical solutions are seldom really cutting edge, but because of their market dominance people more or less have to use them. What Google have done well is mobilize the good will of geeks; which is to say, what they've done well is to work the cult of personality for all its guerilla marketing mojo.

And now, all they have to do is twitch -- or even hint at twitching -- to make gorillas jump. Rumors abound: Google is buying up dark fiber, so they can run their own internet; Google is building a vast new data center, so large that it will need a major hydroelectric plant to power it; Google is producing their own desktop OS. Sometimes they're even true: Google is in the process of rolling out its own "desktop", a search/chat/email client that will allow them to entrench even more deeply and even more richly enhance their vast database of geographically-linked internet behaviors.

That database is the elephant in the room in any discussion about Google, though of course it's useless without the market-muscle to deploy (and grow) it. In military terms, Google's market mojo pairs up with its database like big satellite-guided bombs pair up with the geographical databases that tell you where the targets are. It's their market position that lets them get the database; the database is what's going to guarantee their market position for years to come.

Podcasting By Any Other Name

People like to find arguments. It gives them a place to plant their intellectual flags and say "I was here first!" For example, there's apparently an argument over whether "podcasting" is "significant" from an investment perspective. David Berlind weighs in on his ZDNet blog. Berlind's answer is quite oblique, and while making some very important points implicitly, I think it will be accused by the podcasting faithful of 'not getting' podcasting; I'll accuse him of the same thing, for different reasons.

Basically, as far as I'm concerned, "podcasting" borders on being a hoax, of sorts: It's a name concocted more or less with the sole purpose of counting coup in the blogosphere, that's been blown up as something important and significant, and in blogospheric terms, it is both, but not on the scale that's presumed on its behalf. Podcasting as practiced in blogland will have very little impact on what the thing that will be called "podcasting" will look like in the future. It's one of those things that's important for the impact it's said to have, and not for the impact it actually has. It's important, in short, for the same reason that Jessica Simpson is [sic] important: Because people say so. It's got nothing to do with her singing.

The spur to Berlind's meditation was a question from a fellow reporter, working on a story (and hence, kept anonymous -- and no, I do not find anything sinister in that). "Old media" blokes, it seems, are still wondering whether blogs are "significant", and -- here's the curious part -- what that means for "podcasting". "His perception is that the blogger phenomenon is insignificant," Berlind's colleague supplies, "making podcasting negligible." From an investment perspective, of course.

Well, it's a terrible analysis, of course, as far as it goes: Major acquisitions and strategic investments are being made that are directly motivated by the idea of blogging, and so blogging is by definition "significant", and so we have to wonder what the heck this expert really means. Even if the raw numbers of new bloggers (tens of millions in the last year alone, similar to the boom-period growth figures for internet use) don't impress him, he's myopic if he doesn't understand that blogging per se isn't the issue; it's just the nascent stage of new modes of mass-personal communication. My own nutshell evaluation of this particular analyst is that I suspect he doesn't actually know what he's talking about.

Nevertheless, there is a grain of truth in the analysis. Personalistic "morning coffee notes", produced on an ad hoc basis by random bloggers, will never be significant in this "investment" sense. (Though I can see some interesting possibilities, there, for things that will be significant.) Why? Because the medium sucks; podcasting will never, ever become popular in the way that blogging is popular. On the other hand, as Berlind rightly points out, the rather old idea of media-shifting print content to voice (which used to go by the name "radio") and then mode-shifting that from a stream to an offline file, not only will be big, but has been going on for a while. In fact, it's older than the web, even on the Internet. The only things that're new about it are, first, doing the notification and distribution through RSS, and second, automating the media load onto portable devices.
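For what it's worth, the mechanics are simple enough to sketch in a few lines of Python -- the feed URL and player mount point below are hypothetical, and a real podcatcher would keep proper state rather than just checking the filesystem:

    # Sketch of the two "new" pieces: RSS as notification/distribution,
    # plus automatic loading of media onto a portable player.
    import os
    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "http://example.com/podcast.rss"   # hypothetical feed
    PLAYER_DIR = "/Volumes/MP3PLAYER"             # hypothetical mount point

    with urllib.request.urlopen(FEED_URL) as feed:
        tree = ET.parse(feed)

    # RSS 2.0 marks downloadable media with <enclosure url="..."> elements.
    for enclosure in tree.iter("enclosure"):
        url = enclosure.get("url")
        if url:
            dest = os.path.join(PLAYER_DIR, os.path.basename(url))
            if not os.path.exists(dest):  # crude "have I seen this?" check
                urllib.request.urlretrieve(url, dest)

That's the whole trick; everything else is the old media-shifting idea wearing a new name.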

Those are important things, sure; but the podcasters didn't think of them. They just took their particular process public. And the particular "open" modality that they specified will be important during a transitional period -- but it's not where the money will be made or most of the traffic will happen. That will be on satellite. Podcasting in its current form is merely an interim step to the full realization of potential of satellite radio. "[U]sing the technology to audio-tivo satellite" would be just a start; wait until Apple or XM really get going on these ideas.

Apple Proves Me Wrong (about a few things, at least)

The "headless iMac" is the "Mac Mini." (Close-follower branding from Apple? Or synergy from their cooperative projects with BMW -- er, I mean, Cooper? But I digress, as usual...) Of course, they'll sell millions of the buggers. That's what they do: Create cute things that people want to buy, regardless of what it is or really does. But I swear, I'm different: I swear, I actually care what it does.

But is it an earth-shattering device? Even without wireless, as it is, it could be, but in and of itself -- no. Everyone I know who's ever thought of getting a Mac wants one -- hell, I want one -- and yet, I don't think it will take over the low-end market the way it could if the price point were, say, $100 lower, or the base RAM were 256MB bigger.

But in another way, it will be revolutionary. Consider the size of the thing: It will now no longer be acceptable for PCs to be as big as they have traditionally been. Ultra-small variations on the ATX form factor, which are common now only among hobbyists and "gear fetishists", will become standard PC form factors, and will at the same time cease to command a premium price. They will drive devices the same size as (or smaller than) a Mac Mini, and aren't inherently much more expensive to manufacture than the larger boards; since Intel and AMD chips clock higher, they'll be faster; and they'll become radically cheaper as demand soars from people who've seen the Mac Mini, but still can't afford the extrapolated $800-$1000 price tag for a really capable, obsolescence-resistant Mac Mini.

It's interesting to see where the rumors went wrong. The "iHome" branding turned out to be a red herring; it would be interesting to find out where it came from, because it so effectively skewed the speculative field in the days just before the presentation that it seems as though no one even tried to get spy shots of a Mac Mini. It's a lot smaller than the hoaxed pictures. The hoaxer dubbed it 'iHome', and various rumour millers reported with confidence that it would be "branded" as an iMac; neither turned out to be true. It was said to include WiFi in its base configuration; WiFi ("Airport Extreme") is an add-on, as is Bluetooth. Performance numbers were more or less right, though the rumors missed the fact that there'd be two base processor speeds. And to illustrate just how far off the original rumor was, the "headless iMac" was said to share the 1.5" [1U, or "one rack unit"] height of the latest Apple G5 server; it's actually 2" tall. A picky detail, but it demonstrates how completely off-mark we all were.

It's tempting to speculate (as I'm sure someone has) that Apple planted rumors to throw people off the scent. But I don't think they need to. For what other PC brand would people bother to create physical hoax models? Whatever the explanation, the community of Mac users has a hardened core of Macintosh and Apple fetishists. In fact, I think they don't really try, for the most part, to get real rumors; they just make stuff up, because it's more satisfying than the truth. Anyway, true wisdom, to the Mac zealot, is received wisdom: It issues forth every January from the Dark Steve, from a well-lit stage at the MacWorld keynote address...

Apple's DNA Is Mystique

Why does anyone still trust Apple? I suppose it could just be that they don't pay attention. Maybe it's that they love a bully, especially when the bully speaks and looks so fair. Apple is one of the great counter-arguments to the wisdom of the Cluetrain: They keep their customers in the dark and feed them nothing but cheap wine and communal wafers, and yet they're worshipped for it.

Last week, ThinkSecret fronted a rumor that Apple would be announcing a sub-$500 "headless" Macintosh at Macworld Expo on 1/11. They also slipped in a mention, which I somehow missed, that Apple was working on an office suite to compete with MS-Office for OS X.

So, naturally, Apple is suing them [via Gizmodo]. Said ThinkSecret was revealing "trade secrets". Seem to think that the stuff ThinkSecret is putting up on their website might somehow cause Apple harm. For example, maybe Microsoft doesn't already suspect that Apple is canoodling with KDE to produce an OS X-customized fork of KOffice. (KDE have already got the whole suite working natively under Aqua.) Maybe now that Microsoft knows, they'll conjure some nefarious plot to destroy Apple once and for all. Or not.

And as though suing ThinkSecret didn't just confirm at least one of the rumors.

Now, if the Cluetrain Manifesto told the whole story, Apple would be toast thanks to hijinks like these. Their hardware is expensive and slow, the software is more expensive and there's less of it. And on top of that, they treat their customers like marks to be manipulated and jerked around. On the other hand, Apple products come in whatever color you want. As long as it's white.

Friends and acquaintances know that I've considered buying a Mac for a while now, so I can move away from Windows while still having access to high-quality design and graphics tools like PageMaker and Flash. Much as I love the idea of scoring a slightly used PowerBook, a $599 desktop Mac would be a nearly ideal solution. But the Dark Steve just keeps making it hard for me to switch. At least Bill Gates and Steve Ballmer don't pass themselves off as nice guys.

Whence this mania for secrecy, I could only speculate. It's apparently new since Jobs rejoined in the late 90s, and since Apple more or less exists for the sole purpose of making Steve Jobs feel like a big man, my first guess would be that it sources back to him. In any case, it's a brilliant piece of crazy-making. They have to grok very deeply that their true believers will love them even more for this, and that once a convert has drunk their koolaid, going over to Windows is unthinkable. (Why, that would mean feeling uncool....)

A Question of Belief

Edge.org have posed an interesting question [courtesy MeFi] to a collection of "scientists and science-minded thinkers": "WHAT DO YOU BELIEVE IS TRUE EVEN THOUGH YOU CANNOT PROVE IT?" (It's just the latest in a series of annual questions.) Many of the answers are thought-provoking, or instructive (even though most are simply restatements of that thinker's area of interest in the form of an "unprovable" "assertion"). The zeitgeist implicit in their answers is interesting, too. John Brockman writes:

This year there's a focus on consciousness, on knowing, on ideas of truth and proof. If pushed to generalize, I would say it is a commentary on how we are dealing with the idea of certainty.

We are in the age of "searchculture", in which Google and other search engines are leading us into a future rich with an abundance of correct answers along with an accompanying naïve sense of certainty. In the future, we will be able to answer the question, but will we be bright enough to ask it?

This is an alternative path. It may be that it's okay not to be certain, but to have a hunch, and to perceive on that basis.

Maybe it says that. Maybe it says that this is how science actually works: Having hunches, then trying to prove them, which is really what most of the answers are about. Some of them get more fundamental, as when Richard Dawkins answers:

I believe, but I cannot prove, that all life, all intelligence, all creativity and all 'design' anywhere in the universe, is the direct or indirect product of Darwinian natural selection. It follows that design comes late in the universe, after a period of Darwinian evolution. Design cannot precede evolution and therefore cannot underlie the universe.

... which is a remarkably blunt and honest thing for him to say, since it faces head-on the core weakness of his anti-ID positions. I personally think ID is a load of horse-hockey, but I don't think it can be countered with "proof" that it can't work any more than we can solve the first-mover conundrum. I'm glad Dawkins doesn't shy away from that. I'm not always crazy about the way he formulates ideas ("selfish gene" theory still seems too simplistically reactionary to me, nearly 20 years after I first heard of it), but he is nevertheless one of the most able and vigorous opponents of ID, so it behooves me to pay attention to what he's saying out there.

In any case, while the Q&A is intriguing, in many cases (and as I've noted) it's largely a matter of researchers restating their research-focus as though it were a controversial idea. [bonehead @ MeFi observes, "... scratch post-docs or hungry assistant profs for real wild-eyed speculation. Of course, most of them will be wrong (entertainingly so), but that's where the future Nobels are too."] And I don't think Brockman is really giving credit to scientific process: Believing something you can't prove is usually how anything valuable and previously unknown gets to be learned. Call it a hunch, call it belief; the process whereby that belief is substantiated (though hardly ever "proved" in a strict logicalist sense) is what we know as science. And I'm not altogether sure that Brockman groks that.

Brockman also seems to think there's a new way of being an intellectual:

... There is also evidence here that the[se] scientists are thinking beyond their individual fields. Yes, they are engaged in the science of their own areas of research, but more importantly they are also thinking deeply about creating new understandings about the limits of science, of seeing science not just as a question of knowing things, but as a means of tuning into the deeper questions of who we are and how we know.

It may sound as if I am referring to a group of intellectuals, and not scientists. In fact, I refer to both. In 1991, I suggested the idea of a third culture, which "consists of those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are."

I believe that the scientists of the third culture are the pre-eminent intellectuals of our time. But I can't prove it.

This idea of "Third Culture" scientists is worth exploring, but it's a topic for another time. Suffice for now to say that I don't see anything sufficiently new that a new organizing principle is required; in fact, I think a concept like "third culture" has more potential to alienate thinkers from cross-pollination than it does to encourage them. A bit like "brights" in that regard.

But that's an issue I haven't got time to take on right now....

Business Vision Envisioned, Darkly

What makes a corporate leader "visionary"? Spencer Reiss at Wired offers a murky window onto his view, in his profile of Richard Branson's new space tourism venture:

Despite such a dazzling career, the business world has always been ambivalent toward Britain's best-known entrepreneur. He launches trendy companies the way Trump builds casinos. But a farsighted innovator like Steve Jobs or Jeff Bezos or even Southwest Airlines' Herb Kelleher he is not. Branson traffics in opportunism. He spots a stodgy, old-line industry, rolls out the Virgin logo, sprinkles some camera-catching glitter, and poof - another moneymaker. While that formula has kept him in champagne and headlines, no Virgin business has ever changed the world.

It's a murky window, though: Leaving aside the fact that Trump is in debt to his eyeballs and Branson wallowing in lucre, I'm not getting the distinction between Branson's "opportunism" and the "innovation" of Reiss's exemplars. What's the qualitative difference between Herb Kelleher's epiphany that there was room at the bottom of the air travel market and Branson's epiphany that there was room at the top? Or between Jobs's gut instinct about what would make pseudo-counterculturals perk up and drool and Branson's feel for the bleeding edge of music? Or between the fortuitous combination of timing and investor-expectations-management that allowed Jeff Bezos to make a success of Amazon, and the fortuitous combination of timing and investor-expectations-management that allowed Richard Branson to start a full-service transatlantic airline from scratch at the height of the discount boom, with prior portfolio in nothing more mission-critical than music retail?

The thing is, Virgin businesses have changed important bits of the world. Virgin Records (later Virgin Entertainment), for example, started as an independent label before the first wave of Punk, and after early flirtation with the Sex Pistols hitched their cart to the artsier, less purely rebellious side of new music (think XTC, Human League, Simple Minds) with great commercial success. In so doing, they paved the way for countercultural co-optation efforts like Warner's "independent" IRS label. Virgin Air challenged the conventional analysis on how to make money in the airline business (cut operating cost to the bone, get more butts on the plane even if you have to make the seats smaller to do it, and so on). Virgin Mobile found a way to bring European pay-as-you-go models to the less technically advanced American market, and in so doing forced American mobile phone companies to rethink their contract-lockin approach to the business. It's arguable that Branson has had a bigger impact on the American mobile telephony market than even the recent, much-ballyhooed phone number portability legislation.

As an aside, I don't actually have anything in particular against Herb Kelleher or Jeff Bezos, but describing Jobs as a "far sighted innovator" really does gall. As someone who's actually familiar with the history of computing, I'm unclear on exactly how Apple (a company who barely ever cracked 2 percent of the personal computing market) "changed the world" prior to the launch of iTunes in 2003. After all, the truly synergic applications for windowed operating environments -- Excel and Word -- both came from Microsoft, and Apple was neither the first, last, nor (objectively judged) the best. What they were, was the most optimal, and that had little if anything to do with Jobs per se. I defy anyone to name an original idea from Steve Jobs that became successful; despite his reputation for micromanagement (he penciled in changes to comps of the original iMac's design and signed off on the colors in the original lineup), his real genius lies in understanding and manipulating human vanity. That the success of virtually every successful Apple product since the launch of the Lisa is due to appeal to vanity is something that shouldn't be seriously challenged; that Apple is and always has been a niche player can't be seriously debated; that the cultural impact of a niche player seems out of proportion with their market cap shouldn't be surprising to anyone who's ever considered wearing paisley (the common term for this phenomenon is "fashion"). Above the hard technical level (which was more the other Steve's purview, in any case) Steve Jobs and Apple's impact on the world resembles that of Ralph Lauren more than that of even, say, Richard Stallman and the FSF.

Part of the problem may be that, at the core, Branson's "vision" is simply not technophilic enough for a Wired feature writer. His admission to the "vision club", after all, comes via Virgin Galactic:

Until now. Mojave Airport isn't just where aging jets wait to die; it's where the dusty dream of commercial space travel is finally coming alive. Last summer, a tiny winged wonder called SpaceShipOne spiked 62 miles into the desert sky on its way to nailing the $10 million X Prize for the first sustainable civilian suborbital flight. The world's stuffed-shirt airline chiefs took one look and went back to worrying about fuel prices. Branson took one look at the gleaming white carbon-fiber spaceship and said, Beam me up.

The music business isn't about gadgets, after all, and Virgin Mobile's phones were always technically on the low-end. (How else could they afford to make them cheap without requiring a contract?) Too, Virgin Galactic just smells of "big vision", though the real scope is small: Like Apple, it will directly affect only a very small population of users. To be sure, Branson will almost certainly make money on it, but its primary impact will be more a matter of fashion -- and inspiration -- than of actual market effects. In business terms, its direct impact will be much smaller than that of Virgin Atlantic Airways or probably even of Branson's new V2 music label.

The Next Cluetrain Test

Microsoft Windows XP SP2 will be an important, but probably unnoticed, watershed in the progress of the "cluetrain".

I've yet to see a major case where "cluetrain" customer/user-empowerment juju actually had an impact on any company's actions. There are lots of cases on the books of products doing poorly, but the vast majority are the same traditional feedback mechanism: The product sucks, people don't use it, the product fails; or, the product is poorly marketed (Coke C2? New Coke?), people don't buy it, the product fails.

A cluetrain feedback loop would be different. It would mean that net-empowered buyers (which doesn't necessarily mean internet-empowered buyers) had acted consciously -- as opposed to passively, by simply not buying -- to make the product fail. That action could be in the form of spreading word of the product's suckfulness via some network; in the pure Cluetrain vision, that network would be a human network, enabled by technology. (Side snark: Which network will shortly be owned and controlled by Google...)

Cluetrain thinking is quite a bit like Marxism or "singularity" theory, in that it presumes the inevitability of something which a little basic observation and some applied knowledge of human nature would tell you is highly unlikely. "But it's emergent," is one common (if foggy) response. "You won't be able to predict the shape of the future from the present." "But from what will it emerge?" would be my response. I've yet to see an "emergent phenomenon" that couldn't be traced back to properties of its culture medium.

So, what the hell does all this have to do with Microsoft? Well, they've decided not to bother doing security patches on IE for anything but XP, once they release SP2. (At least, that's what I think they mean; they might mean they're stopping now.) Many see this as a calculated move to incentivize paid upgrades (XP SP2 won't be free -- it will cost $99 for most XP users). If so, it's a very calculated move, based on the idea that they don't need to care anymore how people feel about Microsoft. It's Rock v. Hard Place. It means they think they're winning the anti-Linux fight (which may well be true).

If the Cluetrain is what its boosters have always said it is, it will stop this, and what's more, it will stop this in a particular way: It will wound Windows XP via Market Forces. Microsoft's sales of SP2 will be poor, Linux and Mac adoption will rise sharply, and Market Forces will drive radical improvements in the usability of Linux desktops. Or MS will "get on the cluetrain" and cancel plans to charge for SP2, at least -- and ideally, continue to distribute updates for Win2K. (Which is a better OS, anyway -- though it doesn't have all those wonderful hooks for MS lock-in...)

Now, I actually think it's pretty likely that XP users will be getting SP2 for free. Whether MS continues to update Win2K is another matter. This will happen because their corporate customers will communicate their profound disappointment, and telegraph a willingness to migrate to Firefox or Googlezilla. Is this a manifestation of the Cluetrain in action? I don't know; it would tend to support the view that the "cluetrain" is nothing new or emergent, if it were, because changing plans based on Big Customer feedback is as old as the PC industry, and is mediated not by networking but by traditional sales channels. And the reaction would be just exactly as little as Microsoft has to give up to get what they want.

Now, all that snark having been levelled, I would love to see MS take a hard line on this. Because it would force the watershed, and make it that much more visible. Such a watershed would place more pressure on the open-source communities to come up with alternatives, whatever those might be. But whether those alternatives are really better and more empowering than Microsoft is another matter. Given the exclusive choice between a joyless overlord despised by most who still knows relatively little about me, and a beloved overlord who knows my every browsing habit, I'll pick the former -- Microsoft -- every time.

Dawn of the Google Era

The "Google OS" meme takes its next logical step: Signs indicate that Google is at work creating a Google-customized browser based on the Mozilla trunk. (My bet is that they would use Firefox, since the kewl kidz love Google so damn much.)

Last summer, Anil Dash suggested that it would be a good move for Google to develop a Google browser based on Mozilla. Give that kid a gold star because it looks more than plausible. Mozilla Developer Day 2004 was recently held at the Google Campus. Google is investing heavily in JavaScript-powered desktop-like web apps like Gmail and Blogger (the posting interface is now WYSIWYG). Google could use their JavaScript expertise (in the form of Gmail ubercoder Chris Wetherell) to build Mozilla applications. Built-in blogging tools. Built-in Gmail tools. Built-in search tools. A search pane that watches what you're browsing and suggests related pages and search queries or watches what you're blogging and suggests related pages, news items, or emails you've written. Google Toolbar++. You get the idea.

Mozilla is currently getting some good press due to Microsoft's continuing troubles with their browser and the uptick in usage compared to IE is encouraging. But it's nothing compared to what could happen if Google decides to release a Mozilla-based browser. A Google Browser would give the Mozilla platform instant credibility and would be a big hit. The peerless Google brand & reputation and their huge reach are the keys here. Mom and Dad know about Google....

[Jason Kottke, "More evidence of a Google browser"]

"It's been obvious for awhile now that Google isn't a search company," Kottke says, pointing to earlier ruminations:

With their acquisition of Pyra and new Content-Targeted Advertising offering, it should be apparent that Google is not a search company. What they are exactly is unclear, but their biggest asset is: a highly annotated map of the web.

Unclear, indeed. But whatever it is that they are or become, it will wield truly unprecedented power.

The relentless techno-optimism around Google is fascinating and frightening. That this "highly annotated map of the web" should reside in the hands of one closely-controlled company with strong profit motives and utterly unprecedented stores of information is, frankly, terrifying to me.

As a private entity, and as such not subject to public oversight (and no, stockholders don't count as "public oversight" -- and especially not at Google), Google is much more greatly to be feared than Government. There is effectively no control over what information they can collect and use internally, as long as they don't resell it. And if they are a one-stop-shop for all information usage, there ends up being effectively no limit to the uses they can put that information to.

In future, in fact, I can envision the Government outsourcing Total Information Awareness to Google. It would solve so many of their problems: No longer would the Government be hampered by silly "pre-9/11" rules that prohibit it from domestic spying; they'd effectively be able to get whatever they want, from Google. Sure, some kind of suitable Chinese wall would have to be erected, but that's a trivial matter considering the power at stake, here.

(Leads courtesy Boing-Boing and the Register.)

Like A Kryptonite To A Bike Thief: "Steal Me!"

More proof that nothing is perfectly secure: these Quicktime videos show how to open a Kryptonite EV 2000 (525KB) and a Kryptonite EV "Disc" lock (955KB) like the one I own, using the barrel of a Bic Round Stic. (Courtesy MeFi, bikeforums.net, and thirdrate.com.)

The Moral: Don't count on one strong door. Double up your cylinder-key lock with a flat-key lock: Bike thieves don't like carrying more than one tool.

And don't get too attached to the bike. It's not good to be that attached to material possessions, anyway....

Update: I haven't made this work, yet. I suppose it could all be a colossal hoax, but I think it's probably got more to do with the fact that I was always pretty lousy at things like lock-picking...


More Thoughts on Tech-Macho Bullsh*t

A few more thoughts on the Kalsey-Firefox Affair. It's another illustration of Tech Macho Bullshit in action: If you're not "clueful" enough to see how much better off you are with Firefox, then you "deserve" IE.

Personally, I think that a "cluefulness-test" is the moral equivalent of playground bullying. (Geeks getting their own back?) And I don't think anyone deserves IE, but that's just me.

As far as I can see, this resembles the dueling of "eXtreme Programming" versus "traditional" methodologies in that both are manifestations of the geek's adolescent obsession with control. They want to be able to make the decisions about what's "right" and "wrong" without subordinating themselves or their labor-power to inferior beings. The fact that those inferior beings do most of the working and paying and living and dying in this town escapes their minds; they only remember that they have to suffer the indignity of living with, working for, and being paid by them.

It's a real problem when backend geeks arrogate all app decisions unto themselves. Here's a real clue: If the app is hard for average users to use, it's a failure. Period. "Better" becomes irrelevant, because if you can still say it's "better" at that point, you're clearly using the wrong metrics.

Emergence as the Happy Horse-Hockey Du Jour

The Tutor is pessimistic about "emergence":

I think some folks on line are making an Hegelian case, without being Hegelians. They seem to think that the Zeitgeist is smarter than we are, freeing us from personal responsibility. We should all do what we please and emergence or the market, or God, Nature, or evolution will bring order out of chaos, netting our errors to a wonderful whole.... I don't believe in emergence as a metaphysical principle. In my 55 years what I have seen emerge is increasingly malignant. What emerges most often in evolution are dead ends, of individuals, communities, species, and soon our planet. If left to itself, the mob will not get any smarter. And what will emerge will be a black ball floating in space.

[Emergence is Volkish Nonsense]

The Tutor finds fuel for his skepticism in the demonstrably poor qualifications of American voters:

Of course, if Converse is correct, and most voters really don't have meaningful political beliefs, even ideological "closeness" is an artifact of survey anxiety, of people's felt need, when they are asked for an opinion, to have one. This absence of "real opinions" is not from lack of brains; it's from lack of interest. "The typical citizen drops down to a lower level of mental performance as soon as he enters the political field," the economic theorist Joseph Schumpeter wrote, in 1942. "He argues and analyzes in a way which he would readily recognize as infantile within the sphere of his real interests. He becomes a primitive again. His thinking is associative and affective." And [political scientist Morris] Fiorina quotes a passage from the political scientist Robert Putnam: "Most men are not political animals. The world of public affairs is not their world. It is alien to them -- possibly benevolent, more probably threatening, but nearly always alien. Most men are not interested in politics. Most do not participate in politics."
[LOUIS MENAND, New Yorker, "THE UNPOLITICAL ANIMAL: How political science understands voters"]

What do we do, then? Do we hand over governance to an elite? And who gets to choose who that is? If Menand is right (though he tries to end on an optimistic note), we already have. Or, at least, we do it all the time, by basing our "heuristics" on elite opinion, by letting party loyalty stand in for assessing policy.

My own 40 years of experience tell me that Menand's "third theory of democratic politics" is more or less correct: People tend to judge who the "right" candidate is to vote for based on heuristics such as whether he knows how to eat a tamale (Gerry Ford didn't, so he couldn't have understood the needs of Hispanics), or the kind of phrasing that he uses to make points (Bill Clinton and George W. Bush share the repetitive cadences, the repetitive stressing, the repetitive repetition, of the southern Protestant church).

So I'd like to be more optimistic than The Tutor, but I'm not. I still believe that The Cluetrain and the idea of emergent techno-democracy are dangerous libertarian techno-fetishist fantasies entertained by people who ought to know better and who should be making much, much more productive use of their time.

On the flipside, of course, the Cluetrain is very very useful to some other people who would like to keep smart folks asking the wrong questions -- or entertaining unachievable pipe dreams. In that sense, perhaps Cluetrain-think is the opiate of the elite.

The Next Logical Step in Bootable Media

LaCie are specialists in external storage devices (though they also make excellent flat-panel monitors). They got their start building SCSI drives for Macs and other SCSI-equipped PCs, and then were heavy early adopters of IEEE 1394 (a.k.a. "FireWire" -- still superior to USB 2.0, as far as I'm concerned, but what's a guy gonna do...).

Now they've partnered with MandrakeSoft to package one of their pocket-sized 40GB "portable" drives with a bootable, autoconfiguring version of Mandrake Linux version 10. Called the "Globe Trotter" [ZD Net story / MandrakeSoft product page], this is essentially a lineal descendant of MandrakeMove, and directly analogous to interesting and generally excellent Linux distros like Knoppix. It's designed to be booted from a CD and then auto-configure to use the system's resources.

The advantage of a gizmo like this, or of these bootable CDs, is that they let you carry your own computing environment with you without carrying (or even owning) a computer. With MandrakeMove, you carry just the CD and a one-ounce USB thumb-drive; I've typically also carried around my 20GB Archos external drive, which gives me still more capacity. This gives you an even more complete computing environment with even more storage space, as well as the ability to easily install new Linux software. With the benign neglect of a helpful librarian, or by just rebooting your office PC for your lunch hour, you can escape the confines of "public" computing environments. This type of device can also be handy for students having to use PCs in computing labs.

(Aside: While you could probably figure out a way to do this with Mac OS X, it would be technically difficult and legally questionable to try it with Windows.)

I'm a huge fan of MandrakeMove, and have been planning to upgrade to their second-generation version; I might just get this instead. True, $219 is a little high for a 40GB drive (even a pocket-sized one), but the markup from MSRP of the naked 40GB drive is only about $60. So what you have to ask yourself is whether it's worth $60 of your time and effort to buy naked and install on top. For many Linux geeks, the answer will be "yes"; more power to them.

Me, I'll think seriously about this, because I've already found so many uses for my MandrakeMove CD that I can't begin to tell you. For example, it's been hugely useful in filling in for the deficiencies of Windows NT 4, which I still use on one of my systems at the office. The only way I have of making backups is by copying from my old PC to a slightly less old network fileserver. But since this box has USB, I can reboot using my MandrakeMove disc, and then backup my files to my 20GB Archos disk or to my 1GB Lexar thumb-drive.

Of course, there are also less savory uses for this kind of thing, such as bypassing IT policies or serving as a hacker's toolbox. But then, just as you can use a car to transport stolen goods, you can use any of these things (and I, personally, do use them) for legitimate purposes, too.

That Pernicious "Search Is King" Meme

There's an ever-waxing meme out there which basically boils down to this: "Forget about organizing information by subject -- let a full-text search do everything for you." The chief rationale is that such searching will help increase serendipity by locating things across subject boundaries.

Here's the problem: It's a load of crap. It throws the baby out with the bathwater, by discarding one time-honored, effective way of organizing for serendipity in exchange for another, inferior (but sexier) one.

This morning, via Wired News:

"We all have a million file folders and you can't find anything," Jobs said during his keynote speech introducing Tiger, the next iteration of Mac OS X, due next year.

"It's easier to find something from among a billion Web pages with Google than it is to find something on your hard disk," he added.

... which is bullshit, incidentally. At least, it is on my hard drive...

The solution, Jobs said, is a system-wide search engine, Spotlight, which can find information across files and applications, whether it be an e-mail message or a copyright notice attached to a movie clip. "We think it's going to revolutionize the way you use your system," Jobs declared.

In Jobs' scheme, the hierarchy of files and folders is a dreary, outdated metaphor inspired by office filing. In today's communications era, characterized by the daily barrage of new e-mails, websites, pictures and movies, who wants to file when you can simply search? What does it matter where a file is stored, as long as you can find it?

Ah, I see -- the idea of hierarchically organizing data is bad because it's "dreary" and "outdated" -- that is, of course, so quintessentially Jobsian a dismissal that we can be pretty sure the reporter took his words from The Steve, Himself.

But this highlights something important: this is not a new issue for Jobs, or for a lot of people. Jobs was an early champion (though, let's be clear, not an "innovator") in the cause of shifting to a "document-centric paradigm". The idea was that one ought not have to think about the applications one uses to create documents -- one just ought to create documents, and then make them whatever kind of document one needs. Which, to me, seems a little like saying you ought not have to care what kind of vehicle you drive, whether you're headed to the night club or hauling manure.

But I digress. This is supposed to be how Macs work, but it's actually not: Macs are just exactly as application-centric as anything else, though it doesn't appear that way at first. The few attempts at removing the application from the paradigm, like ClarisWorks and the early versions of StarOffice (now downstream from OpenOffice), merely emphasized the application-centricity even more: While word processors and spreadsheet software could generally translate single-type documents without much data loss, there was no way that they were going to be able to translate a multi-mode (i.e. word processor plus presentation plus spreadsheet) document from one format to another without significant data loss or mangling.

Take for example, Rael Dornfest, who has stopped sorting his e-mail. Instead of cataloging e-mail messages into neat mailboxes, Dornfest allows his correspondence to accumulate into one giant, unsorted inbox. Whenever Dornfest, an editor at tech publisher O'Reilly and Associates, needs to find something, he simply searches for it.

Again, a problem: It doesn't work. I do the same thing (though I do actually organize into folders -- large single-file email repositories are a data meltdown just waiting to happen). This is a good paradigmatic case, so let's think it through: I want to find out about a business trip to Paris that was being considered a year and a half ago. I search for "trip" and "paris". Provided my spam folder is excluded from the search, and assuming we're still just talking about email, I'm probably not going to get a lot of hits on Simple Life 2 or the meta-tags for some other Paris Hilton <ahem!> documentary footage. In fact, unless the office was in Paris, and the emails explicitly used the term "trip" -- which they may well not have -- I probably won't find the right emails at all. Or I'll only find part of the thread; and since no email system currently in wide use threads messages, I won't have a good way of following on from there to ensure that I've checked every on-topic message. (And that could lead into another rant about interaction protocols in business email, but I'll stop for now.)

By contrast, if I've organized my email by project, and I remember when the trip was, I can go directly to the folder where I keep that information and scan messages for the date range in question.
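
To make the contrast concrete, here's a toy sketch; the messages, folders, and dates are all invented:

    from datetime import date

    # Invented mail corpus: (folder, date, subject)
    MAIL = [
        ("projects/expansion", date(2003, 1, 14), "Should we open an office there?"),
        ("projects/expansion", date(2003, 1, 20), "Re: flights and hotel for January"),
        ("spam",               date(2003, 1, 21), "Paris Hilton video!!!"),
    ]

    # Full-text search: neither relevant message contains both keywords.
    hits = [m for m in MAIL if "trip" in m[2].lower() and "paris" in m[2].lower()]
    print(hits)    # [] -- the search comes up empty

    # Organization: I know the project and roughly when, so I scan a date range.
    found = [m for m in MAIL
             if m[0] == "projects/expansion"
             and date(2003, 1, 1) <= m[1] <= date(2003, 2, 1)]
    print(found)   # both relevant messages turn up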

The key problem here is that search makes you work, whereas with organization, you just have to follow a path. I used to train students on internet searching. This was back in the days when search engines actually let you input Boolean searches (i.e., when you could actually get precise results that hadn't been Googlewhacked into irrelevance). Invariably, students could get useful results faster by using the Yahoo-style directory drill-down, or a combination of directory search and drill-down, than they could through search.

If they wanted unexpected results, they were better off searching (at least, with the directory systems we had then and have now -- these aren't library catalogs, after all). And real research is all about looking for unexpected results.

And that leads me to meta-data.

Library catalogs achieve serendipity through thesauri and cross-referencing. (Though in the 1980s, the Library of Congress apparently deprecated cross-referencing for reasons of administrative load.)

The only way a system like Spotlight works to achieve serendipitous searching -- and it does, by the accounts I've read -- is through cataloged meta-data. That is, when a file is created, there's a meta-information section of the file that contains things like subject, keywords, copyright statement, ownership, authorship, etc. Which almost nobody ever fills out. Trust me, I'm not making this up: from my own experience, and that of others, I know that people think meta-data is a nuisance. Some software is capable of generating its own meta-data from a document, but such schemes have two obvious problems:

  1. They only include the terms in the document -- no synonyms or antonyms or related subjects, and no obvious way of mapping ownership or institutional positioning -- so they're no real help to search. (The sketch after this list makes that concrete.)
  2. They only apply to that software, and then only going forward, and then only if people actually use them.
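
To put problem 1 concretely, here's a toy sketch -- mine, not anything Apple ships; the stoplist and thesaurus are invented -- of bare term-extraction versus thesaurus-backed cataloging:

    import collections, re

    DOC = "Minutes: we discussed the proposed trip to the Paris office in Q3."

    # Naive auto-metadata: the most frequent non-trivial terms in the document.
    STOP = {"the", "to", "we", "in", "a", "of"}
    terms = [w for w in re.findall(r"[a-z0-9]+", DOC.lower()) if w not in STOP]
    keywords = [w for w, _ in collections.Counter(terms).most_common(5)]
    print(keywords)    # only words the author literally typed

    # What a cataloger's thesaurus adds: terms the document never uses.
    THESAURUS = {"trip": ["travel", "journey", "visit"]}
    expanded = set(keywords)
    for w in keywords:
        expanded.update(THESAURUS.get(w, []))
    print(sorted(expanded))    # now a search for "travel" can find this document

The first list can never contain a word the author didn't type; the second can, and that's where serendipity comes from.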

Now, a lot of this is wasted cycles if I take the position that filesystems aren't going away and this really all amounts to marketer wanking. But it's not wasted cycles, if I consider that the words of The Steve, dropped from On High, tend to be taken as the words of God by a community of technorati/digerati who think he's actually an innovator instead of a slick-operating second-mover with a gift for self-promotion and good taste in clothes.

This kind of thinking, in other words, can cause damage. Because people will think it's true, and they'll design things based on the idea that it's true. And since "thought leaders" like Jobs say it's important, people will use these deficient new designs, and I'll be stuck with them.

But there's little that anyone can do about it, really, except stay the course. Keep organizing your files (because otherwise you're going to lose things -- trust me, I know a little about this). The "true way" to effective knowledge management (if there is one) will always involve a combination of effective search systems (from which I exclude systems like Google's that rely entirely on predictive weighting) with organization and meta-data (yes, I do believe in meta-data, for certain things like automated resource discovery).

Funny, who would have thunk it: The "true way" is balance, as things almost always seem to come out, anyway. You can achieve motion through imbalance, but you cannot achieve progress unless your motions are in harmony -- in dynamic balance, as it were. What a strange concept...

I Want My Faraday Cage

From ZDNet: BAE is developing smart wallpaper that will block some signals (e.g. WiFi) while allowing others to pass: "BAE says the material is cheap. The company will be developing it commercially through its corporate venture subsidiary."

In my rare fantasies of home-ownership (when I'm not too set in my renter's ways), my ideal cave -- er, I mean, home -- is usually encased in a Faraday Cage, to prevent the neighbors from listening in on my phone conversations and network chatter. (Of course, there is the small matter of being able to listen to the radio or talk on the cell phone...) OK, yes, I already know I'm strange, move along...

Ah, Humor -- What A Concept

Via BoingBoing, Dartmouth researchers have found that we "get" a joke and find it funny with different parts of our brains:

The investigators found that instances of humor detection lit up the left inferior frontal and posterior temporal cortices--the left side of the brain. Humor appreciation, in contrast, led to spikes in activity in the emotional areas deeper inside--specifically, in the bilateral regions of the insular cortex and the amygdala.

Kelley believes that these results make sense. Past research has shown the left inferior frontal cortex to be involved in reconciling ambiguous meanings with prior knowledge. And ambiguity, incongruity and surprise are key elements in many jokes.

This makes perfect sense to me. A story: Many, many moons ago, I experimented with megadosing amino acids to reduce my need for sleep: Choline, Inositol, and one other that escapes me now. After a couple of weeks, I found them effective: I needed much less sleep (four hours a night seemed adequate), and I had an enhanced ability to focus on tasks.

But one day, while listening to the guys in the control room telling jokes, I had an epiphany: nothing was funny anymore.

Oh, I got the jokes. I watched comedy on TV. I listened to jokes that people told in the office. But I didn't laugh. I mean, I knew it was funny -- and at some level, I felt a cool appreciation for the humor. But I felt no joy, no emotion, from the jokes.

So naturally, I stopped taking the aminos.

As a postscript, about a month or so after I decided to stop the amino acid megadoses, I started to experience severe anxiety and depression. At the time, I didn't make the connection; but over two years later, when I recounted my experiences on the old Factsheet Five bulletin board, someone (and I don't remember who) pointed out that one commonly reported side effect of taking megadoses of these amino acids was violent mood swings and anxiety attacks....

Stale Is Not Dead

Dave Winer finally speaks out on the Weblogs.com fiasco, and amongst all the usual stuff I find one thing that really leaps out at me:

One of the things I learned is that just because a site is dormant doesn't mean it's not getting hits. The referer spam problem on these sites was something to behold. Search engines still index their pages, and return hits. They were mostly dead, in the sense that most hadn't been updated for several years.

Something troubles me about this and the interminable HTTP code vs. XML redirect discussions, and that's this: If someone links to the content, it's live by definition.

I'll restate that, so it's clear: Content that is used should continue to be accessible. I don't actually know where Sam Ruby or Winer or Rogers Cadenhead or anybody but the writers stand on this, but it remains a non-negotiable best practice -- a first principle of web interaction design for usability -- that links should not go dead.

If that means you have to redirect the user to a new location when the content moves, so be it. If you have to do that with meta-refresh in HTML or XML, so be it. Sure, there are "purer" ways to handle it; but it's just stupid to let the perfect be the enemy of the good by saying that you can't redirect if you can't modify your .htaccess file. Even a lot of people with their own hosting environments aren't allowed to modify their .htaccess.
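
And the "impure" fallback really is trivial. A minimal sketch that writes a meta-refresh stub at the old location -- paths and URLs hypothetical, and no .htaccess in sight:

    import os

    # Hypothetical old location and new URL; the technique is plain HTML.
    OLD_PAGE = "old/essay.html"
    NEW_URL = "http://example.com/new/essay.html"

    STUB = """<!DOCTYPE html>
    <html><head>
    <meta http-equiv="refresh" content="0; url=%s">
    <title>Moved</title>
    </head><body>
    <p>This page has moved to <a href="%s">%s</a>.</p>
    </body></html>
    """ % (NEW_URL, NEW_URL, NEW_URL)

    os.makedirs(os.path.dirname(OLD_PAGE), exist_ok=True)
    with open(OLD_PAGE, "w", encoding="utf-8") as f:
        f.write(STUB)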

I'm getting the sense that a lot of the people involved in these debates are forgetting that the web was supposed to be about linking information to related information. Protocols and methods are mere instrumentalities to that end. It's the content that matters; there really, really isn't a web without content.

The New Playground

Geeks think they live in a meritocracy. But more often than not, it's just another kind of playground where a different kind of bully hacks out a different pecking order.

Exceptions are rare, but they're getting less so, gradually. As people try to organize themselves to actually get things done in groups, they realize that the bullies don't inspire anything but conformity. And frightened conformity, at that, by turns bitter, tense, vengeful, and ready to pounce on the blood-spotted chicken at the first sign of weakness.

You'll only very rarely find original thought in the swamps of Slashdot, MetaFilter or Plastic. But you can find reports of original thought, and thus, inspiration. And occasionally, you will even find original thinkers (though they can be hard to see in their camouflage). Occasionally, they even escape.

More Tech Macho Bullshit

There's a thread of thought on the weblogs.com matter that's best expressed by MeFi's own Quonsar, albeit off the farm:

oh boo hoo. if you had taken blogging seriously enough to learn how to make a few lousy html tags and operate an ftp client and put your site on some paid hosting like any real site owner would do, you wouldn't now be screaming about 'murder'. TANSTAAFL. it means there aint no such thing as a free lunch. CentralizedShinyWidgets{tm} like weblogs.com and blogger.com pander to the willfully ignorant. murder? jesus, grow up. learn something. pay for hosting. buy a domain. get a real website.

Since he took the trouble to express that view several places, I'll assume he felt strongly about it. And I can't begin to communicate the contempt and scorn that a passage like that inspires in me. (Of course, that's what it's meant to do, so I suppose he achieved his end. Congrats.)

Of course, it's pretty much the equivalent of setting a technical bar for 'net participation. Let's break it down, kids: There ain't no such thing as a free lunch, so you've got to earn and learn like we did. You've got to run the man pages, Google 'til your fingers bleed, and figure it out on your own. And leave the 'net to Quonsar, me, and such friends as we deign to train.

It is an ownership society, after all, and knowledge is (or at least, can be) wealth -- especially in Raymond's bazaar. We all have to pay to play, if not with cash, then with sweat equity.

Or maybe it's just a sandbox where only the strongest kids get the kewlest toys, or a dive-bar where only the in-crowd gets the best seats at the bar. Either way, it's a place I choose not to play. I prefer a net where I can find some adult conversation instead of having to listen to the puerile maunderings of yet another technofetishist.

Of course not every credentialed geek is a puerile jerk. Some are quite literate; some (like me) got into these ridiculous devices in the first place with the mistaken idea that they would somehow help us be creative. (Damn, Drupal is stripping my sarcasm tags again...) And there are more and more of us all the time, at varying levels of technical sophistication. If we want our friends to play with us on the net, it behooves us to protect them from neanderthalish attitudes about technical gung-fu.

Cult of Personality Flaws

Over the weekend, as many as 3,000 blogs disappeared from the net. They were disappeared by Dave Winer. He didn't bother to announce or explain for a few days; when he did, it was as a large MP3 audio-blog entry (a move that seemed calculated to limit the audience). (Jeneane Sessums has helpfully posted a more dialup- and syndicator-friendly transcript.)

He's getting royally roasted over this, which is appropriate: when you know something is going to harm people, and you have the option not to do it, and you do it anyway, you ought to expect some flames.

I feel curiously detached over this. Winer's reasons aren't sufficient, in my opinion, and it looks to me as though he's actively dodging responsibility for his actions. But I think I understand the place he was at when he made the ill-considered decision to dump the free weblogs. It's a place I've referred to as the "f*ck it moment": That place in an implementation-gone-bad where you just want to toss it all up and let the chips fall where they may. He got into a migration without having properly scoped it, and without a rollback plan. Thinking on his feet, and maybe late at night, he just said it: "F*ck it." And it was writ.

The worst blowback lands on Dave Winer himself. People who've read his blog or dealt with him long-distance for years weren't surprised; some said things like "Anyone relying on Dave Winer deserves what they get." But now many thousands more people think he's a jerk, and many potential employers or investors in his next venture will think he's unreliable, or that he won't have the people skills to pull it off. This matters, because he no longer works for Userland, and his fellowship at Harvard is over. I also think he's a jerk (based on the "rude to the waiter" metric); but he's a jerk who's going to need to find another job, and he's not doing himself any favors here.

The people whose sites have been zapped can be accommodated, if Winer or Userland choose to accommodate them. But Winer's reputation didn't need any more tarnishing.

Sometimes, Technology Is The Problem

Terrorists with leverage are scary, but I'm much more scared of nutty, cocksure attempts to build "technology" that supposedly keeps us safe. Terrorists get tired, give up, or shoot each other over the spoils, but once the hardware's installed, a lousy technology is harder to kill off than a cockroach.
[Bruce Sterling, speaking with Bruce Schneier]

Via Bruce Schneier's June 2004 Cryptogram, a "discussion" between the Bruces Schneier and Sterling that, though it consists mostly of one-paragraph positionings, does get in a few bon mots.

Cryptogram is worth looking at, too, if only for its revealing analysis of the effect of the superficially unspectacular Witty Worm. Witty was nearly unique in the degree of technical competence exhibited by its creators: If they'd chosen a different target, we could have lost the whole net in 45 minutes, instead of just 12,000 nodes.

Abstraction Layers

Jeff Veen talks about appropriate levels of detail orientation: "It's a balance between paying for a watchful eye, and maintaining some flexibility in the tools I use. I want to hack my templates, but I find it hard to care what modules are compiled into Apache":

So it was a relief to me that a couple more pieces of Web infrastructure moved into the "somebody else can worry" realm. The first is feeds. I spent a few years with the W3C working on HTML and CSS specifications, so I'll likely never bother to read another rant about which idea is more brilliant than the other when it comes to the minutia of standards making. RSS and Atom in particular fit squarely into that category these days. Goodbye to all of that. Rather than fret over the various feed templates on my site, I can now just point to Feedburner. They bravely content negotiate for all known aggregators and spit out the Right Thing. And lots of other stuff. Go look. They're cool.

Along the same lines, Ping-O-Matic will help promote your site for you. When you publish an entry on your blog, the software you use will go tell a couple of sites that you've updated. Typically, blo.gs or weblogs.com will get pinged, and they'll make a record of that. Then, search sites like Technorati and DayPop will come visit you and update their indices. But with the number of pingable sites constantly growing, how can a Web author keep up? Now, you can just enter Ping-O-Matic into your blog publishing software, and let them keep track of all the new ones....

Sage advice, if you can afford the fees (for now, I can) and site promotion matters to you (... eh... I suppose it should). Veen can, and it does, because he makes his living in part by virtue of being the geek-cred version of "famous."
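
For reference, the ping Veen describes is a single XML-RPC call. A minimal sketch, assuming Ping-O-Matic's standard weblogUpdates endpoint; the blog name and URL are placeholders:

    import xmlrpc.client

    # Ping-O-Matic speaks the standard weblogUpdates interface (endpoint assumed).
    server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
    response = server.weblogUpdates.ping("My Weblog", "http://example.com/blog/")
    print(response)    # typically something like {'flerror': False, 'message': '...'}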

But then Dave Winer has to go and spoil it, as he so often does:

"Leave the hard stuff to someone else," says Jeff. "It wasn't supposed to be hard stuff," say I. It was supposed to be transparently simple. We're in a bad place, because after the next level of hard stuff it won't be possible for an intermediary to sort it out. Then we'll bemoan the lack of support of "standards" but the problems won't get solved, and eventually we'll give up and move on. Why we can't learn from the mistakes of the past is the mystery of the human hive.

... which, of course, entirely and spectacularly misses Veen's point: It's still "hard stuff" whether it's RSS 2.0 or Atom. It's hard stuff because Winer focuses on the wrong users: Geeks.

Veen's point is that there are levels of detail that it makes sense to pay attention to. For the vast, vast majority of actual users (Grandma, Dad, Uncle Harry, your boyfriend/girlfriend, etc.), Atom vs. RSS is irrelevant at a technical level.

By the way: I find it quite implausible that it truly "won't be possible for an intermediary to sort it out." That's an absurd claim, especially from someone with a lifetime of experience in software development. If they're both XML, then unless one or the other standard is so wildly extensible that you can't discover on the fly what a document is supposed to mean -- which is to say, unless it utterly ignores the concept of the semantic web -- it will be possible to abstract between them.

And at the non-technical level that Winer tries to speak to with his talk of single platforms, it still doesn't matter, because it's a relatively simple matter to create abstraction layers. The existence of services like Feedburner proves that; the fact that it is difficult for individual hackers to reinvent the wheel in abstracting from some blogging software's data model to both Atom and RSS is not a technical argument on behalf of either platform.
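
Since I'm making a technical claim, here's a minimal sketch of what such an abstraction layer looks like -- standard library only, using the Atom 1.0 namespace for concreteness; real feeds have plenty of corner cases this ignores:

    import xml.etree.ElementTree as ET

    ATOM = "{http://www.w3.org/2005/Atom}"

    def entries(feed_xml):
        """Yield (title, link) pairs from either an RSS 2.0 or an Atom feed."""
        root = ET.fromstring(feed_xml)
        if root.tag == "rss":                    # RSS 2.0: <rss><channel><item>
            for item in root.iter("item"):
                yield item.findtext("title"), item.findtext("link")
        elif root.tag == ATOM + "feed":          # Atom: <feed><entry>
            for entry in root.iter(ATOM + "entry"):
                link = entry.find(ATOM + "link")
                yield (entry.findtext(ATOM + "title"),
                       link.get("href") if link is not None else None)
        else:
            raise ValueError("not a feed format I recognize")

    rss = ('<rss version="2.0"><channel><item><title>Hi</title>'
           '<link>http://example.com/1</link></item></channel></rss>')
    print(list(entries(rss)))    # [('Hi', 'http://example.com/1')]

The corner cases (content models, relative links, encodings) are exactly why a service like Feedburner earns its keep -- but "impossible" it isn't.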

Life Hacks: Tips for Getting Things Done

Cory Doctorow points to Danny O'Brien's notes for a talk on "Life Hacks" -- a survey of personal survival tactics used by some geeks of O'Brien's acquaintance. Handy stuff; simple stuff, like "most users use todo.txt" (like me, though I date my filename).

The general lesson: Simple is better. Complex solutions don't get used. "Some bits of life are too short to learn another app."

A corollary: Record your information; increasingly (because it's simple), this means recording it in blogs where it can be dumped as RSS. (Since these guys are geeks, they're rolling their own and "scraping" to actual RSS; if this were applied more generally to other forms of "text" as XML, then something with a richer capability set -- like Atom -- is clearly a better solution.)

(True to form, O'Brien's notes are a plain text file...)
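
The "scraping" in question can be as small as this -- a sketch, with the filename and link invented:

    import email.utils, time
    from xml.sax.saxutils import escape

    def todo_to_rss(path="todo.txt", link="http://example.com/todo"):
        """Scrape the lines of a plain todo.txt into a minimal RSS 2.0 feed."""
        now = email.utils.formatdate(time.time())
        items = "".join(
            "<item><title>%s</title><pubDate>%s</pubDate></item>"
            % (escape(line.strip()), now)
            for line in open(path, encoding="utf-8") if line.strip()
        )
        return ('<?xml version="1.0"?><rss version="2.0"><channel>'
                "<title>todo.txt</title><link>%s</link>"
                "<description>scraped todo list</description>%s"
                "</channel></rss>") % (escape(link), items)

    print(todo_to_rss())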
