
Techno-optimism

The assumption that technology is the answer to all things. Usually erroneously understood by its adherents as a belief that technology is neutral.

Singularity as Death-Avoidance

IEEE Spectrum focuses on the technological singularity. Glenn Zorpette thinks it's largely driven by fear of death:
The singularity debate is too rarely a real argument. There’s too much fixation on death avoidance. That’s a shame, because in the coming years, as computers become stupendously powerful—really and truly ridiculously powerful—and as electronics and other technologies begin to enhance and fuse with biology, life really is going to get more interesting.


Freeman Dyson and the Pretense of Vision

Freeman Dyson is one of the more dangerous scientists alive right now.


.... The wiggles in the [Keeling] graph show us that every carbon dioxide molecule in the atmosphere is incorporated in a plant within a time of the order of twelve years. Therefore, if we can control what the plants do with the carbon, the fate of the carbon in the atmosphere is in our hands. That is what Nordhaus meant when he mentioned "genetically engineered carbon-eating trees" as a low-cost backstop to global warming. The science and technology of genetic engineering are not yet ripe for large-scale use. We do not understand the language of the genome well enough to read and write it fluently. But the science is advancing rapidly, and the technology of reading and writing genomes is advancing even more rapidly. I consider it likely that we shall have "genetically engineered carbon-eating trees" within twenty years, and almost certainly within fifty years.

Carbon-eating trees could convert most of the carbon that they absorb from the atmosphere into some chemically stable form and bury it underground. Or they could convert the carbon into liquid fuels and other useful chemicals. Biotechnology is enormously powerful, capable of burying or transforming any molecule of carbon dioxide that comes into its grasp. Keeling's wiggles prove that a big fraction of the carbon dioxide in the atmosphere comes within the grasp of biotechnology every decade. If one quarter of the world's forests were replanted with carbon-eating varieties of the same species, the forests would be preserved as ecological resources and as habitats for wildlife, and the carbon dioxide in the atmosphere would be reduced by half in about fifty years.
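For what it's worth, the arithmetic behind the twelve-year figure is easy to reconstruct. Here's a back-of-envelope sketch; the round numbers are commonly cited estimates, my assumptions rather than anything from Dyson's article:

```python
# Rough check of the ~12-year turnover read off the Keeling wiggles.
# Both constants are commonly cited round estimates (assumptions for
# illustration, not figures from Dyson's article).

atmospheric_carbon_gtc = 750.0  # carbon in the atmosphere, in GtC
land_plant_uptake_gtc = 60.0    # net primary production on land, GtC per year

residence_years = atmospheric_carbon_gtc / land_plant_uptake_gtc
print(f"Mean time for atmospheric carbon to pass through plants: ~{residence_years:.0f} years")
# -> ~12-13 years, which is the turnover Dyson attributes to the wiggles.
```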

That's just science fiction, of course -- not the scary part at all. This is the scary part:

It is likely that biotechnology will dominate our lives and our economic activities during the second half of the twenty-first century, just as computer technology dominated our lives and our economy during the second half of the twentieth. Biotechnology could be a great equalizer, spreading wealth over the world wherever there is land and air and water and sunlight. This has nothing to do with the misguided efforts that are now being made to reduce carbon emissions by growing corn and converting it into ethanol fuel. The ethanol program fails to reduce emissions and incidentally hurts poor people all over the world by raising the price of food. After we have mastered biotechnology, the rules of the climate game will be radically changed. In a world economy based on biotechnology, some low-cost and environmentally benign backstop to carbon emissions is likely to become a reality.

Translation: "We don't need to do anything now, because we'll invent our way out of the problem when the time comes."

I suppose I should be grateful that he's no longer appointing himself global diagnostician. At least now he admits that there might be a problem.

I've been told by people I respect that Dyson is a very good physicist. But I'm hard put to recall anything outside of his domain that wasn't just plain stupid once you got past the "oh, neato" moment. I mean, Dyson Spheres are a cool idea, but also a really dumb one if you think about them just a tiny bit. They're a triumph of the broadly logically possible: We can imagine it, therefore it must be feasible. We can imagine going Niven & Pournelle one better and building a sphere around a small star (or arranging otherwise to intercept all of the star's energy). We can imagine nesting matrioshka layers one inside the other, to overlap and trap the inevitable leakage. All we have to do is solve this list of several thousand technical problems. We've solved every other technical problem we've ever been presented with; we'll clearly be able to solve these. What is conceivable, is feasible.

We can imagine magic carbon-sequestering trees, therefore they must be feasible. We can imagine a quarter of the world's trees being replaced by these magic inventions, therefore we should count on it happening (when the alternative is essentially the collapse of civilization).

All of these speculations commit an obvious and really, really troubling error: They assume that certain important things, like rate of technological innovation, rate of increase in energy use, etc., are essentially laws of nature: That not only won't they change, but that their not changing is a righteous thing. Moore's Law will go on forever; we'll keep increasing our need for energy at a predictable and increasing rate; we'll keep inventing new ways to solve all of our problems; better living through chemistry.
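The absurdity of treating those rates as constants is easy to demonstrate. A toy extrapolation, with round figures that are my assumptions:

```python
import math

# Treat recent world energy growth (~2.3%/yr, i.e. roughly 10x per century)
# as a "law of nature" and extrapolate. All figures are round assumptions.
world_power_tw = 18.0            # rough present world power consumption, TW
growth_rate = 0.023              # assumed perpetual annual growth
sunlight_on_earth_tw = 174_000.0 # total solar power intercepted by Earth, TW

years = math.log(sunlight_on_earth_tw / world_power_tw) / math.log(1.0 + growth_rate)
print(f"Consumption exceeds ALL sunlight reaching Earth in ~{years:.0f} years")
# -> ~400 years. The "law" extrapolates itself into physical absurdity
#    within the same few-hundred-year window these projections span.
```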

This kind of thinking is usually based on a detailed look at only a very short span of human history, and a very high-level gloss of anything beyond the past three or four hundred years.

It's disturbingly short-sighted, in other words, even as it pretends to vision.

This is why I don't respect Dyson: He pretends to vision, but is blind to his own short-sightedness.


Freeman Dyson Is Smarter Than The Climate Scientists

Or at least he thinks he is:

.... It is at least a possibility to be seriously considered, that China could become rich by burning coal, while the United States could become environmentally virtuous by accumulating topsoil, with transport of carbon from mine in China to soil in America provided free of charge by the atmosphere, and the inventory of carbon in the atmosphere remaining constant. We should take such possibilities into account when we listen to predictions about climate change and fossil fuels. If biotechnology takes over the planet in the next fifty years, as computer technology has taken it over in the last fifty years, the rules of the climate game will be radically changed.

When I listen to the public debates about climate change, I am impressed by the enormous gaps in our knowledge, the sparseness of our observations and the superficiality of our theories. Many of the basic processes of planetary ecology are poorly understood. They must be better understood before we can reach an accurate diagnosis of the present condition of our planet. When we are trying to take care of a planet, just as when we are taking care of a human patient, diseases must be diagnosed before they can be cured. We need to observe and measure what is going on in the biosphere, rather than relying on computer models.

Such vision! Who knew it was that simple: China burns the coal, we sequester their windblown carbon as topsoil. Mirabile dictu! Dyson ex machina.

And who knew that Dyson had such a complete grasp of the processes of planetary ecology? He must, since he feels so ready to propose that we replace all of the current thinking by climate scientists and ecologists with a suggestion by a physicist that we just give up on climate modeling in favor of a holistic, diagnostician's model.

It's convenient to be so brilliant that one doesn't feel the need to apply the same criteria to one's own theories as one does to others'.



Freeman Dyson Undercuts Himself

Freeman Dyson recently wrote:

In his "New Biology" article, [Carl Woese] is postulating a golden age of pre-Darwinian life, when horizontal gene transfer was universal and separate species did not yet exist. Life was then a community of cells of various kinds, sharing their genetic information so that clever chemical tricks and catalytic processes invented by one creature could be inherited by all of them. Evolution was a communal affair, the whole community advancing in metabolic and reproductive efficiency as the genes of the most efficient cells were shared. Evolution could be rapid, as new chemical devices could be evolved simultaneously by cells of different kinds working in parallel and then reassembled in a single cell by horizontal gene transfer.

But then, one evil day, a cell resembling a primitive bacterium happened to find itself one jump ahead of its neighbors in efficiency. That cell, anticipating Bill Gates by three billion years, separated itself from the community and refused to share. Its offspring became the first species of bacteria—and the first species of any kind—reserving their intellectual property for their own private use. With their superior efficiency, the bacteria continued to prosper and to evolve separately, while the rest of the community continued its communal life. Some millions of years later, another cell separated itself from the community and became the ancestor of the archea. Some time after that, a third cell separated itself and became the ancestor of the eukaryotes. And so it went on, until nothing was left of the community and all life was divided into species. The Darwinian interlude had begun.


The Darwinian interlude has lasted for two or three billion years. It probably slowed down the pace of evolution considerably. The basic biochemical machinery of life had evolved rapidly during the few hundreds of millions of years of the pre-Darwinian era, and changed very little in the next two billion years of microbial evolution. Darwinian evolution is slow because individual species, once established, evolve very little. With rare exceptions, Darwinian evolution requires established species to become extinct so that new species can replace them.

Now, after three billion years, the Darwinian interlude is over. It was an interlude between two periods of horizontal gene transfer. The epoch of Darwinian evolution based on competition between species ended about ten thousand years ago, when a single species, Homo sapiens, began to dominate and reorganize the biosphere. Since that time, cultural evolution has replaced biological evolution as the main driving force of change. Cultural evolution is not Darwinian. Cultures spread by horizontal transfer of ideas more than by genetic inheritance. Cultural evolution is running a thousand times faster than Darwinian evolution, taking us into a new era of cultural interdependence which we call globalization.

Freeman Dyson, "Our Biotech Future" (The New York Review of Books)

It's difficult to tell what Dyson wants to communicate. He argues against "reductionist biology" and floats a lot of pretty images of synergism and vaguely Taoist ideas about the resilience of life. But his own understanding of the complexity of life is clearly quite limited, or he wouldn't be so quick to idealize "non-Darwinian evolution" (a "golden age"?) and predict a rosy outcome from unrestricted biotech game-playing. History much more readily supports a skeptical view of the effects of biotech than it supports Dyson's positivist version. The reality will almost certainly be more of the same mixed bag we've got now: High-yield crops help feed more people and strain the land to a greater extent, which hurts crop yields, which demands still higher-tech farming technologies, and so on ad infinitum. It's not a sustainable cycle, and one would like to think someone with such a reputation for cleverness would get that. (The fact that he doesn't is, to me, another indication that he was overrated to begin with.)

Dyson's thought seems to me to be fundamentally adolescent, in the sense that he always wants more and always thinks that things are simpler than the experts do.

Darwinian evolution may indeed have slowed evolution down considerably; but it may also have stabilized it. I suspect it was Darwinian evolution that made multi-cellular life truly feasible by making it possible to rely on large support structures generation over generation. In a diverse non-Darwinian framework, that reliance just wouldn't be possible. "Designs" that are stable in one generation could change fundamentally in the next, or even before the generation propagated, leaving no basis for reproduction. What Dyson casts in clearly pejorative language ("one evil day", "refused to share", "anticipating Bill Gates") was most likely the very change that made it ultimately possible for him to make these observations.

The analogy to culture is clear: Cultural evolution is rapid and destructive. It wipes out what came before without regard, and it has no mechanism to prevent the willy-nilly propagation of cultural "genetic" material. What we end up with, then, is a bunch of unstable structures that collapse quickly and harm their constituent people in the process.

The common response is that evolutionary processes will yield stronger and more stable structures through natural selection. But what if that's not possible without some kind of constraint on what kind of "genetic material" gets incorporated?

There's also an analogy to be drawn to information theory. Dyson is a cross-pollinator. He believes that the only real change comes via cross-pollination of ideas. He doesn't want to believe that it's necessary or, I think, even very important to create systems of thought. He thinks every wild idea needs to be considered. (With special attention to his, of course.) (What Dyson's thought on the scientific establishment boils down to, when you analyze the language, is essentially that he's smarter than they are so they should listen to him more than they do. But I digress.)

But what if it turns out that it's necessary to constrain information in order to get use out of it? That much has seemed intuitively clear to me for many years. It's the lack of such constraints that characterizes many mental illnesses, such as schizophrenia and mania.

Of course, there are plenty of people -- Dyson might be among them -- who are more than willing to idealize mental illness in the same way. I'd like to say that those are people without the experience of talking with people suffering from such mental illnesses. I'd like to say that, but I've heard too many of them illustrate their cases with allusions to their interactions with the mentally ill. Rather, I suspect that they are people more in love with their theory than with the people they hope to explain by it.

Myths are Metaphors

Pop quiz -- does this passage describe the present, or the future?

You sit immersed in a wireless cloud, navigating your way through the folders on your hard drive. It is a floating forest of branching tree directories anchored to a root folder buried somewhere deep inside the machine. You are listening to streaming audio whilst a torrent of more music flows into your MP3 player. While it downloads, your system is organising your music library into fields within a database and generating a feed direct to your homepage. Via your Flock browser you twitter to your friends about the latest item on the newsriver then post a few paragraphs to your blog, where they join the complex trail of links and paths going in and out of your site. While you surf, it's easy to forget that beneath you lies a creepy invisible underworld populated by spiders, bugs, crawlers, worms, and microscopic viruses, whilst above ground your transactions are hungrily devoured by sheep that shit grass before being aggregated into the Long Tail. That data trail you're leaving behind stimulates the synapses of the global brain, which is in turn pulled towards the gravitational core of the Web 2.0 solar system...

Windows Vista: dreaming nature in cyberspace (PART)

Answer: It's the present, of course.

The lesson: With the right language, you can make anything sound cool. Welcome to cyberspace. Let the meat rot where it lives.

(Via Sterling @ Wired)

Cyberspace as Woodstock Nation

Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.--John Perry Barlow, 1996

In-thread at Web 2.0 ... The Machine is Us/ing Us | MetaFilter, at 12:17 PM

Yet another example of the happy horseshit approach to social activism: Put an absurd stake in the ground and hope that it makes people come that much closer to what you want.

Of course, Barlow never got what he said he wanted, but there are enough new Web 2.0 toys floating around that let people do superficially cool collaborative things that Barlow's probably pretty well mollified, most of the time. Meanwhile a new post-industrial market has co-opted Barlow's Cyberspace (though, since they've been paying for it, maybe just plain "bought" is a better word), and governments like that of China have been doing a good-enough job of exercising sovereignty where its cyber-citizens gather.

Behold the Agonizer

"You asked me once," said O'Brien, "what was in Room 101. I told you that you know the answer already. Everybody knows. The thing in Room 101 is the worst thing in the world." -- George Orwell, 1984

In the coverage at Wired of the Air Force's new Active Denial System for crowd control, I didn't see any mention of the agonizer. And yet, that's what it is, more or less: A device that induces searing, burning pain so intense that subjects cannot help but struggle to get away from it.

It works via millimeter-wave radiation. Wired (courtesy of the Sunshine Project) has thoughtfully provided a rundown of publicly available documentation. On a quick scan, it's hard to tell whether the pain is caused by heating in the skin or by some other interaction between pain-nerves and millimeter-wave radiation. But prolonged exposure to the beam can cause second-degree burns, so heating definitely does occur.
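The numbers, at least, are consistent with the heating explanation. A small sketch; the 95 GHz operating frequency and the roughly 0.4 mm absorption depth are the figures in public descriptions of the system, not anything derived here:

```python
# Why "millimeter-wave," and why the energy stays in the skin. The 95 GHz
# frequency and ~0.4 mm absorption depth are the publicly reported figures
# for the Active Denial System (assumptions here, not derivations).

SPEED_OF_LIGHT_M_S = 3.0e8
frequency_hz = 95e9

wavelength_mm = SPEED_OF_LIGHT_M_S / frequency_hz * 1000.0
print(f"Wavelength: ~{wavelength_mm:.1f} mm")  # ~3.2 mm: hence "millimeter-wave"

absorption_depth_mm = 0.4  # reported depth at which most energy is deposited
print(f"Most energy absorbed in the outer ~{absorption_depth_mm} mm of skin")
# That's right where the thermal pain receptors sit, consistent with
# heating in the skin as the mechanism.
```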

And there's also no mention in Wired's coverage of the applications for torture, which are painfully [sic] obvious to me. An uncreative sadist would leave a victim with second-degree burns after leaving the beam focused for too long in one spot. A creative sadist would hack together something like Gene Roddenberry's agony booth, to move the focus of radiation around to different sets of nerve endings, in order to reduce the effect of neurotransmitter depletion. After an hour or so, I'm quite sure just about anybody would be willing to tell us whatever we wanted to hear as long as it makes the pain stop. In Room 101, the man who works the latch on the rat cage is your god.

A vehicle-mounted version is apparently being tested in Iraq right now. I'm very, very curious to know what Iraqis will make of it. I think they'll get the torture angle right away. And since the technology is pretty easy to replicate, I can envision disreputable dictatorships throughout the world deploying copycat devices in the near future.

Robots Of The Oil Swamp

Sitting in the pondering place, I pondered this: Where does vegetable oil come from?

The answer, of course, is that plants make it.

We have an oil-based economy, and we're running out of oil. But that's just the "mineral" petroleum, the stuff that's prehistoric. What about the stuff that the plants make?

Sure, plants can't make enough. It would be just like some nay-sayer somewhere to point out the number of acres we'd have to plant in canola in order to make enough oil to fuel a single fleet of city buses. They'd probably say it's not cost effective, and they'd probably be right. But what about bio-engineering? How does the canola plant make it? Or the hemp plant, or the olive tree, or any other plant? And what's to stop us from bio-engineering an organism to do just that?

Plenty of things, I'm sure, but most of them are moral, or entail exercising foresight, and western capitalism doesn't have much history of respecting moral reasons. Or of thinking beyond the end of the depreciation cycle.

In any case, it's true that plants are very good at processing natural materials into more complex and very different natural materials. For example, they can make oil from organic waste. Or from cellulose. But plants are clearly not efficient enough. To even begin to feed the demand for fuel and synthetic plastics, we would need to operate at fairly high levels of efficiency. Fields of canola, regardless of how verdant, would not cut it.

But foetid swamps full of bacteria just might. To get the volumes we need, we would need to use open spaces, like swamps. We could digest whole forests, whole biomes, of cellulose, turn them into swamps, to get the hydrocarbons we want.

Gaseous hydrocarbons or light alcohols would probably be better for generation purposes, to drive our fuel cells, but we'd still need long-chain petrochemicals to make plastic. So I could envision different "crops," including even some semi-refined plastics.

Some of those crops would be quite hostile to life. The biological processes would most likely generate some rather toxic byproducts. And at the point where this type of production becomes necessary, I have to wonder whether the people who did it would care. These would, after all, be people arrogant enough to farm oil in an open swamp. If the global climate is sufficiently broken, all care might be thrown to the hot, dry winds. Or the fuming, damp winds, as the case may be, as we loose our hydrocarbon-synthesizing organisms onto the world and let them digest its organic waste matter into fuels.

I could envision great, sealed cities on the edge of seething hydrocarbon swamps habitable only by the most adaptable of organisms, and tended by fleets of fragmentarily sentient fuel-cell powered robots. Eventually, the robots might form their own cities (or be organized into them by a retreating humanity), existing only to tend (and perhaps contain) their swamps.

These robot cultures would evolve; they would not remain static. Evolution would apply to them as it does to us. This is where the admonitions of the Singularitarians would apply, because eventually our machines, once we are no longer an active influence upon them, will have to find their own reasons for living.

WESun 0: Mainstreaming Singularitarianism

This morning on Weekend Edition, The Singularity rears its ugly head in the persons of Vernor Vinge (who coined the concept) and Cory Doctorow. It's another manifestation of our increasing dread in the face of technological change, and the increasing degree to which we approach that change in irrational ways: in the Vingean scenario, as a rescuing parent; in the Doctorovian vision, as a superseding successor.

Doctorow posits the scenario of a modern human interacting with a pre-literate human: That they would be "in a sense, different species." That they and we would have "nothing to talk about." Maybe he was clearer in the un-aired portions about what's meant by "literate", but unless it means "without language" (and one would expect the word chosen for that to be "pre-linguistic"), he's clearly overstating his case. We can easily talk with "pre-literate" or even "illiterate" people, because there remain between us basic human consistencies that will not be removed by any extropian enhancements which we can plausibly predict.

It's a badly chosen analogy, to be sure, and surely one can be forgiven for choosing analogies badly, no? No. Because the craft of science fiction as gedankenexperiment is all about precision -- or at least, insight -- in your analogies. We need to remember that the beings making the singularity are humans. The aspects of the singularity that are truly, deeply a-human, are not likely to persist in that form. They're likely to get reshaped, recrafted, in some kind of human image.

I think Doctorow's analogy illustrates the most fundamental problem with Singularity Theory, in that it is often a failure of a certain kind of imagination: Empathy.

Vinge posits a more traditional scenario, in a way, as a revisitation of the Jack Williamson nightmare -- but with Williamson's logical problems fixed. Vinge's singularity-intelligence is more of a savior than a successor. A lost parent, restored, if you will. Clarke's technological god. Maybe it can save us from global warming.

Doctorow's singularity-beings are replacements, successors. They are what we are not -- they overcome our weaknesses, and supersede us. There's a sense of mingled dread and fascination in the prospect. I'm still trying to understand how to talk about the impulse. I feel it, myself, to be sure, but I don't have a pat name for it.

Sterling's critique still seems sound. (See his short essay in Wired; longer talk at the Long Now Foundation, as MP3 or OGG or as a summary.) He points out (among other things) that the singularity-being will not come about entirely by accident. It will come about through our choices, and some of those choices will tend to constrain the singularity-being.

Remembrance As Modern Art Gone Bad

Speaking of Oklahoma City, my old Okie friend Kelley offered his thoughts on the memorial:

"I still say they should have planted 168 redbuds-a veritable forest that would be blooming now. What a sight that would be, an affirmation of life, a colorful display that cannot be equaled. Instead, they have those silly chairs. Stupid. Modern art gone bad. Yes, they were bureaucrats (mostly) but I think the chair is simplistic and mundane. After all, the noble, tough redbud is the state tree- they're hard to kill and they deal with adversity in a manner I think transcends their environs. Oh yeah, they're the state tree. Duh."

As I sit here, I have a vision of hundreds of ghosts sitting in those cold stone chairs for eternity.... Bureaucrat or no, I find it hard to imagine they wouldn't rather be sitting in a Redbud grove.

I responded that subtlety has become a lost art, accepted only from people (like, say, Roy Blount) who can pretend they're being obvious; and that real local character is passé, like the "southernness" of Atlanta or Houston.

But we've become a monumental culture. We might once have planted trees and let the glory be to God or Nature, and had faith that the tree would one day grow large. But that kind of sentiment died off in the Dutch Elm plague or was suffocated by Cold War techno-optimism. Now, it's no good if it's not human-made. (Ayn Rand smirks from her grave.)

Here in NY, I think the appropriate plant would be blackberry bushes. Let one survive, and you're buried in them forever. My friend Lynne planted blackberries around her fence for some reason a few years back, and now the whole area is a wasp-loud glade all summer long.

Up in Maine, it would be wild roses. Those things grow *as* *weeds* in the cracks between the big wave-smoothed boulders right at the ocean's edge. Even the salt grass has a hard time there.

CORRECTION: I'm chagrined to be reminded that Lynne's bushes are raspberries, not blackberries. But either will take over in the rockiest, most clay-bound soil, given half a chance. And I'll stand by my Yeats allusion, even if it doesn't represent a literal truth, because I like the way it sounds...

Shenzhen to Nashville, Non-Stop

We live in the Era of Air Freight.

My new Mac Mini shipped early this morning from Shenzhen, China, via FedEx. From there it will probably fly non-stop to Nashville on a FedEx 747-400, 777, or 767, and thence be routed here. I can track the movement of the package online, and see by implication how it's travelling: It hit FedEx at 8:51pm (local time) on 1/18/2005 ("Package received after FedEx cutoff time"). It left the FedEx ramp at 7:09pm. By my reckoning it will be in the air about 12-13 hours, based on the distance from Nashville to Shenzhen. So I should be able to browse to the FedEx site and see the Arrival Scan by about 9pm EST today. I'll be able to follow it hop by hop until it goes out on the delivery truck, which will be either Friday or Monday, depending on how seriously Apple takes their delivery-date promises.
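The reckoning is just great-circle arithmetic. A quick sketch using the haversine formula; the coordinates and cruise speed are rough assumptions:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Shenzhen and Nashville, approximate coordinates (my assumptions).
dist_km = haversine_km(22.5, 114.1, 36.2, -86.8)
print(f"Great-circle distance: ~{dist_km:.0f} km")             # ~13,100 km
print(f"Still-air time at ~900 km/h: ~{dist_km / 900:.1f} h")  # ~14.6 h
# Eastbound trans-Pacific freighters pick up jet-stream tailwinds, so
# 12-13 hours of actual air time is a reasonable reckoning.
```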

We live in the Era of Air Freight. This ecological fact is in many ways the most important practical implication of advancing technology: Computing and networking technology makes the coordination of global logistics possible, and efficient long-haul cargo aircraft from Boeing and Airbus make it cost-effective to distribute directly from a factory in China to a doorstep in western NY state. And all of this allows economies to pump capital more quickly -- allows the concrete manifestations of ideas and desire to move across the globe at 700 miles per hour. Thinking of it all in terms of goods and capital seems trivial, but this kind of point-to-point distribution is really the engine that drives the global marketplace, which in turn is what drives global society, for good or ill. We can blame the idea on Sears and Ward. The transit of goods in turn subsidized the rest of our long-distance mass transportation network, as the big widebodies pack the extra space in their bellies with cargo, in a complex spoke-end-to-spoke-end routing enabled by efficiently networked logistics systems.

And yet, all we see moving are the people. We are blind to the goods in the cargo hold on all the big planes; we taxi by the big, windowless cargo-haulers, logoed-up for DHL, UPS or FedEx, and most of us probably just have a quickly-forgotten moment of "Oh, so that's how they do it..." We only think about the people.

When the World Trade Center was attacked on 9/11, British Airways lost about 40 Concorde frequent-flyers. The impact went much deeper, though, than just the loss of 40 reliable fares. Many of those 40 were senior decision-makers at their companies. They were the people who could approve the expensive Concorde tickets, either formally or tacitly. The Concorde relied almost entirely on human passengers to pay its way, and so from "Golden Eagle", the Concorde returned to its early-'80s status as a money-burner. So we can add the Concorde to the list of things that can be said to have been killed by 9/11.

Amidst all the hoopla surrounding the formal unveiling of the Airbus A380 "super-jumbo", many asked where Boeing's competitive product was. Boeing's answer: The A380 is a "big plane for a small market." The same could have been said of the Concorde: Its market was so small that losing 40 passengers upset its fare-ecology sufficiently to make the plane commercially non-viable. But the Super Jumbo won't suffer the same fate. I heard it said more than once in the news coverage that half the orders were "from Asia" -- which means, they're for air freight.

Public reaction (and amongst the "public" I include most media business analysts) to the A380 under-reports a very important point: While hub-to-hub people-hauling is important, the 580-seat luxury model and even the as-yet unbuilt 800-seat steerage-special versions of the A380 are really almost red herrings. The real target market for these aircraft is not passenger hauling, but air freight. There's big money to be saved by increasing the weight and range of the planes, even just between major hubs. This is a plane designed to fly non-stop from Yokohama to Louisville, Shenzhen to Nashville, Taipei to LA, São Paulo to London, with a really big payload of shoes and consumer electronics. Airbus's bread-and-butter customers for the A380 are outfits like FedEx, UPS, DHL, that won't stop using the hub-and-spoke model for the bulk of their traffic for a long, long time.

It's easy to see this as a triumph for economic models of understanding. But that would be a mistake. While all of this can be seen in economic terms, its effect is human and social, and the field on which the economic facts are cast is fundamentally ecological. And that's the reason that economists (Marx first and chief among them) fail to predict accurately: They fail to understand that economics is only ecology writ fine, and hence divorced from the larger picture -- and from the fine-writ bits of other aspects of the big picture. Capital -- money -- is fuel in an ecology of commerce. But it is not, yet, the reason. For the reason, we can still, at least, look to such intangibles as desire.

The Importance of Blogging Earnestly

The Business Blogging Boot Camp (@ Windsor Media) provides a more bottom-line perspective on the growth of blogging, driven by Fortune's 1/10/2005 feature story on technology trends; their observations came to me as part of an email thread related to the BBC story I mentioned last night. They stress the importance of blogging for business, and furthermore the importance of blogging earnestly. They cite the Kryptonite affair and moves toward blog-monitoring by Bacon's Information -- the latter characterized as tentative, "inane", 'Not Getting It.' (I'm usually leery when a huge quant-marketing shop is characterized as Not Getting It. Often it's true, yes; but as far as I can see they often understand a lot more than they bother to explain to us proles. But I digress.)

There are two things I feel compelled to point out before going further: First, blogs are qualitatively analogous to specialist newsletters, which are nothing new to savvy marketers. As with specialty newsletters, the influence of a blog hinges on a subtle balance between the publisher's access to information, their (perceived) personal integrity, and the volume (direct or indirect) of their readership. What's new is the speed of blogging. I'm leery of pointing out emergent qualities, but it's hard to argue that a ten-day cycle time doesn't indicate that (a lack of) quantity may indeed, in this case, have a quality all its own.

The second thing I feel compelled to point out -- and this is both much more and much less important than it sounds -- is that the Kryptonite business not only didn't start on blogs, but didn't get its first traction there. It started on the cycling boards, and that's where it was hashed out, refined, debugged, and researched, and where the first instructional videos were posted, before it was ever reported on a blogospherically-integrated weblog. Some of these bicycling boards are almost as old as the web, and most have many members who trace their net-cred back to Usenet days. My point being that anyone focusing only on blogs as such is setting themselves up for obsolescence. Blogs as they are, are almost certainly not blogs as they will be.

Anyway, Windsor Media's take is largely blogospheric orthodoxy. And in practical terms, it's probably right: The important thing for businesses to do right now is to make it part of some people's jobs to go out, and read and post like humans. But there's a second thing that not only needs to happen, but will happen, and what's more will be enabled by the first: Smart businesses will take steps to understand how the blogosphere works, and how it can be gamed, and then they will go forth and game it. And it will work. The knowledge required will come from a few main sources: From big outfits like Bacon and free-range old-school marketing pundits (who will keep it to themselves and share out bits of wisdom to key clients); and from less old-school marketing pundits like Darren Barefoot and BL Ochman, and from product evangelism folks at big companies (who as a group will tend to share it on their blogs, undercutting Bacon et al's old-school attempts to make money off consulting). And, perhaps most important of all, it will come from research in social network analysis. More on that another time.

Blogging will be gamed by corporate and business interests, make no mistake about that. Because it can be, and is being, gamed. It happens every day. And, contrary to the blogospheric orthodoxy, the broader the cross-section of people who get involved in blogging, the easier it gets to game the system without looking like a weasel. And if the golden rule of capitalist systems is that money wants to make more money (and I'm pretty sure it's something like that), and if blogging has an impact on the growth and flow of money, then money will drive blogging, and blogging will get gamed.

Now I'm getting into blogging heterodoxy. The conventional wisdom on the blogging ethos is very cluetrain, and in fact, the Kryptonite affair does indeed show a real "cluetrain" cause-effect loop. I missed it at the time because I just didn't tune in to the story, but the folks at Fortune and Windsor Media are right about that: The ten-day problem-to-product recall cycle at Kryptonite was characterized by all the corporate communications failures criticized in the Cluetrain Manifesto. It just took a lot longer for this first clear case to emerge than either they or, frankly, I thought it would.

The orthodox position is that the more people get involved in blogging, the harder it gets to game the system. It's a variation on the open-source golden rule of debugging ("Given enough eyeballs, all bugs are shallow"): "Given enough eyes, all misinformation will be found." But open-source debugging works (when it works, which it often doesn't, but that's another story) because the "eyes on the code" belong to people who know how to spot a problem, and have the capability to affect it more or less directly. In blogging, the "eyes on the information" are often people with little or no real expertise. Much of what they spout is nonsense.

And yet, it's effective.

The blogosphere shifts like a body of water: Fast, and irresistibly. Part of the reason that happens is that the blogging community is composed largely of small communities with large enough membership to make an impact, and what's more, those communities overlap: Polibloggers are tight with techbloggers who are tight with lifestyle bloggers who are tight with polibloggers.... So when the loop has looped a few times, we find that a relatively small group of people can pretty reliably and rapidly shift the character of the blogosphere. But as the blogosphere becomes larger, it grows more statistically homogeneous, and small communities of movers will not have the same kind of predictable results anymore. Then it will seem less like water, and more like mud.

But I digress, again. I started this to talk about gaming the blogosphere, and that this will happen, I do not doubt for an instant. There's a lot of money riding on this, after all. Some people will figure out how to game the blogosphere -- to game the cluetrain. It will be a painful process with lots of false starts, but we are well beyond the beginning of the process. It started long before the Kryptonite affair; if I had to pick a point in time, I'd pick the consolidation of successful blogs like Wonkette, Gizmodo and ... under the Gawker Media banner. Gawker sells lots of ads, gets lots and lots of daily eyeballs, and their more overtly commercial blogs (like Gizmodo and Jalopnik) have pull with the product managers by virtue of the fact that they can say things like:

What consumers want -- an out-of-box way to share and transmit files between different storage media and computers (and users) -- is exactly what manufacturers don't want to give them, but they'll tease us a little. So, if you're really rich, DigitalDeck Entertainment Network is busting out an in-home network PC to gear to DVD sharing system that costs $4000 - $5000. It probably consists of a bunch of cables and a universal remote that your geeked-out younger brother could hack together himself.

And so, we've come back around again to the specialist newsletter: I take Gizmodo seriously (and I confess, I do read it more or less every day) because I see things like this that indicate to me that they bother to think a bit about what they're reviewing. They have credibility for me because they speak not merely in a human voice, but in one that says credible things. And they have the benefit of comprehensiveness because somebody (namely, Gawker Media) is paying them to do nothing but that.

And by the way, at some point does it stop being "blogging" and start being journalism? Open question, IMO.

A Question of Belief

Edge.org have posed an interesting question [courtesy MeFi] to a collection of "scientists and science-minded thinkers": "WHAT DO YOU BELIEVE IS TRUE EVEN THOUGH YOU CANNOT PROVE IT?" (It's just the latest in a series of annual questions.) Many of the answers are thought-provoking, or instructive (even though most are simply restatements of that thinker's area of interest in the form of an "unprovable" "assertion"). The zeitgeist implicit in their answers is interesting, too. John Brockman writes:

This year there's a focus on consciousness, on knowing, on ideas of truth and proof. If pushed to generalize, I would say it is a commentary on how we are dealing with the idea of certainty.

We are in the age of "searchculture", in which Google and other search engines are leading us into a future rich with an abundance of correct answers along with an accompanying naïve sense of certainty. In the future, we will be able to answer the question, but will we be bright enough to ask it?

This is an alternative path. It may be that it's okay not to be certain, but to have a hunch, and to perceive on that basis.

Maybe it says that. Maybe it says that this is how science actually works: Having hunches, then trying to prove them, which is really what most of the answers are about. Some of them get more fundamental, as when Richard Dawkins answers:

I believe, but I cannot prove, that all life, all intelligence, all creativity and all 'design' anywhere in the universe, is the direct or indirect product of Darwinian natural selection. It follows that design comes late in the universe, after a period of Darwinian evolution. Design cannot precede evolution and therefore cannot underlie the universe.

... which is a remarkably blunt and honest thing for him to say, since it faces head-on the core weakness of his anti-ID positions. I personally think ID is a load of horse-hockey, but I don't think it can be countered with "proof" that it can't work any more than we can solve the first-mover conundrum. I'm glad Dawkins doesn't shy away from that. I'm not always crazy about the way he formulates ideas ("selfish gene" theory still seems too simplistically reactionary to me, nearly 20 years after I first heard of it), but he is nevertheless one of the most able and vigorous opponents of ID, so it behooves me to pay attention to what he's saying out there.

In any case, while the Q&A is intriguing, in many cases (and as I've noted) it's largely a matter of researchers restating their research-focus as though it were a controversial idea. [bonehead @ MeFi observes, "... scratch post-docs or hungry assistant profs for real wild-eyed speculation. Of course, most of them will be wrong (entertainingly so), but that's where the future Nobels are too."] And I don't think Brockman is really giving credit to scientific process: Believing something you can't prove is usually how anything valuable and previously unknown gets to be learned. Call it a hunch, call it belief; the process whereby that belief is substantiated (though hardly ever "proved" in a strict logicalist sense) is what we know as science. And I'm not altogether sure that Brockman groks that.

Brockman also seems to think there's a new way of being an intellectual:

... There is also evidence here that the[se] scientists are thinking beyond their individual fields. Yes, they are engaged in the science of their own areas of research, but more importantly they are also thinking deeply about creating new understandings about the limits of science, of seeing science not just as a question of knowing things, but as a means of tuning into the deeper questions of who we are and how we know.

It may sound as if I am referring to a group of intellectuals, and not scientists. In fact, I refer to both. In 1991, I suggested the idea of a third culture, which "consists of those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are."

I believe that the scientists of the third culture are the pre-eminent intellectuals of our time. But I can't prove it.

This idea of "Third Culture" scientists is worth exploring, but it's a topic for another time. Suffice for now to say that I don't see anything sufficiently new that a new organizing principle is required; in fact, I think a concept like "third culture" has more potential to alienate thinkers from cross-pollination than it does to encourage them. A bit like "brights" in that regard.

But that's an issue I haven't got time to take on right now....

In An "Ownership Society", Love Is A Liability

Text for the moment is from Jack London's under-appreciated dystopian political thriller, The Iron Heel -- his narrator and heroine, sheltered young intellectual Avis Everhard, is trying to get to the bottom of a worker's injury claim:

"Why did you not call attention to the fact that Jackson was trying to save the machinery from being injured?" I asked Peter Donnelly, one of the foremen who had testified at the trial.

He pondered a long time before replying. Then he cast an anxious look about him and said:

"Because I've a good wife an' three of the sweetest children ye ever laid eyes on, that's why."

"I do not understand," I said.

"In other words, because it wouldn't a-ben healthy," he answered.

"You mean--" I began.

But he interrupted passionately.

"I mean what I said. It's long years I've worked in the mills. I began as a little lad on the spindles. I worked up ever since. It's by hard work I got to my present exalted position. I'm a foreman, if you please. An' I doubt me if there's a man in the mills that'd put out a hand to drag me from drownin'. I used to belong to the union. But I've stayed by the company through two strikes. They called me 'scab.' There's not a man among 'em today to take a drink with me if I asked him. D'ye see the scars on me head where I was struck with flying bricks? There ain't a child at the spindles but what would curse me name. Me only friend is the company. It's not me duty, but me bread an' butter an' the life of me children to stand by the mills. That's why."

"Was Jackson to blame?" I asked.

"He should a-got the damages. He was a good worker an' never made trouble."

"Then you were not at liberty to tell the whole truth, as you had sworn to do?"

He shook his head.

"The truth, the whole truth, and nothing but the truth?" I said solemnly.

Again his face became impassioned, and he lifted it, not to me, but to heaven.

"I'd let me soul an' body burn in everlastin' hell for them children of mine," was his answer.

Henry Dallas, the superintendent, was a vulpine-faced creature who regarded me insolently and refused to talk. Not a word could I get from him concerning the trial and his testimony. But with the other foreman I had better luck. James Smith was a hard-faced man, and my heart sank as I encountered him. He, too, gave me the impression that he was not a free agent, and as we talked I began to see that he was mentally superior to the average of his kind. He agreed with Peter Donnelly that Jackson should have got damages, and he went farther and called the action heartless and cold-blooded that had turned the worker adrift after he had been made helpless by the accident. Also, he explained that there were many accidents in the mills, and that the company's policy was to fight to the bitter end all consequent damage suits.

"It means hundreds of thousands a year to the stockholders," he said; and as he spoke I remembered the last dividend that had been paid my father, and the pretty gown for me and the books for him that had been bought out of that dividend. I remembered Ernest's charge that my gown was stained with blood, and my flesh began to crawl underneath my garments.

"When you testified at the trial, you didn't point out that Jackson received his accident through trying to save the machinery from damage?" I said.

"No, I did not," was the answer, and his mouth set bitterly. "I testified to the effect that Jackson injured himself by neglect and carelessness, and that the company was not in any way to blame or liable."

"Was it carelessness?" I asked.

"Call it that, or anything you want to call it. The fact is, a man gets tired after he's been working for hours."

I was becoming interested in the man. He certainly was of a superior kind.

"You are better educated than most workingmen," I said.

"I went through high school," he replied. "I worked my way through doing janitor-work. I wanted to go through the university. But my father died, and I came to work in the mills.

"I wanted to become a naturalist," he explained shyly, as though confessing a weakness. "I love animals. But I came to work in the mills. When I was promoted to foreman I got married, then the family came, and . . . well, I wasn't my own boss any more."

"What do you mean by that?" I asked.

"I was explaining why I testified at the trial the way I did--why I followed instructions."

"Whose instructions?"

"Colonel Ingram. He outlined the evidence I was to give."

"And it lost Jackson's case for him."

He nodded, and the blood began to rise darkly in his face.

"And Jackson had a wife and two children dependent on him."

"I know," he said quietly, though his face was growing darker.

"Tell me," I went on, "was it easy to make yourself over from what you were, say in high school, to the man you must have become to do such a thing at the trial?"

The suddenness of his outburst startled and frightened me. He ripped out a savage oath, and clenched his fist as though about to strike me.

"I beg your pardon," he said the next moment. "No, it was not easy. And now I guess you can go away. You've got all you wanted out of me. But let me tell you this before you go. It won't do you any good to repeat anything I've said. I'll deny it, and there are no witnesses. I'll deny every word of it; and if I have to, I'll do it under oath on the witness stand."

[full text at Gutenberg]
