We're not going to change human nature any time soon. It isn't that we aren't rational -- we are rational -- but reason has limits.
Broadly, this is the idea that systems exist: the stuff in this category is the stuff that drives ecologies, like the ethos of a culture, or the ethos inherent in a view or an act, or the aesthetics that drive an ecological consequence.
The 'Net is quietly abuzz with chatter about Santy. It's a worm -- an old-school worm, that travels server-to-server, running a single exploit against one of the server's exposed services. But there's something "new" that scares people about Santy, Santy.B, and all its forthcoming incarnations: It can "discover" likely targets using internet search engines. Santy's success [re-]proves two points that have been made over and over again over the years: The more services you expose, the more vulnerable you are; and as you make it easier to code software, you also make it easier to code malware.
Security in computing, as in the non-virtual world, ends up being largely a matter of how many ways there are to get in and out: If you've got lots of doors and windows, you have poorer security. You can qualify the analogy somewhat, but that's pretty much how it works. Santy works because there are not only lots of doors and windows, but also because some of them aren't as well secured as they should be -- and because some of them advertise too much about themselves. (But that's another topic for another time. And none of this should be taken as excuses for following the "one strong door" model.)
Santy's first manifestation used Google to locate likely targets; now it also uses AOL and Yahoo search interfaces. It finds its way onto a system, patches itself into vulnerable code in certain versions of phpBB, and then proceeds to run Google searches for certain strings that would appear in those vulnerable versions of phpBB. This works because Google has a stable and easy-to-use API -- really, from the perspective of Santy's author, just a standard format for input and output -- and it doesn't make any distinction between clients that have a person looking at the results and clients that have a machine looking at the results. As much as I'm wary of Google in general, this is a perfectly right and proper way for their software interfaces to operate, and they're not any different in this regard from any other search engine. Or any open bookmark repository, for that matter.
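The point about machine clients is worth making concrete. A minimal sketch, assuming a hypothetical search endpoint (`search.example.com` is a placeholder, not a real Google API URL): the request a script builds is byte-for-byte the same request a browser would send, so the server has no principled way to tell who's reading the reply.

```python
from urllib.parse import urlencode

# Hypothetical search endpoint -- a placeholder, not any real search API.
SEARCH_ENDPOINT = "https://search.example.com/query"

def build_search_url(phrase: str, page: int = 0) -> str:
    """Build the same GET request a browser would send for this search.
    Nothing in the URL distinguishes a human reader from a script."""
    params = {"q": phrase, "start": page * 10}
    return f"{SEARCH_ENDPOINT}?{urlencode(params)}"

# A person and a robot searching for the same version-fingerprint string
# produce identical requests:
human_url = build_search_url('"Powered by phpBB"')
robot_url = build_search_url('"Powered by phpBB"')
print(human_url == robot_url)  # True
```

That symmetry is the whole trick: the "exploit" of the search engine is just ordinary, documented use of it.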
I repeat, this is old-school, albeit updated for new times: In 1994, someone might have written something analogous to this to use Gopher or WAIS or Archie. They'd have had to be smarter coders, and because some techniques hadn't been pioneered yet, the codebase would be larger and clumsier. But it could have been done, and probably was. I knew guys who thought that way, back in The Day. But the community of people who'd have been impressed was smaller, and frankly, servers were inherently more secure by virtue of simply having fewer exposed services.
The main thing that's actually changed is that it's now a lot easier to code this stuff. A mediocre scripter could hack together Santy in a week or less of spare time; a good coder, in just a few hours. Back in the day it would have taken much more skill and time: the target would have been more esoteric, and the task would have demanded much more technical expertise. There was no PHP or phpBB; there were no web bulletin-board systems to exploit. But it would have been possible to do something analogous to Santy in 1994. Not to slight the effort: what's needed in both cases is to have had the idea of how to design the thing, which is a non-trivial point.
There has been a call for Google to "shut down" this vulnerability -- for it to block Santy's searches. That call was, frankly, ignorant: Yes, it would be possible for Google to play a game of cat and mouse with Santy coders (and it appears as though they may have been bullied into doing just that), but that would be bad for two reasons: First, because it would create an "artificial selection" environment in which Santy-coders would be forced to evolve things like source IP masking, metamorphic user-agent strings, and Bayesian pattern-matching for target identification, thus indirectly causing the other side to get better; and second, because it's an unwinnable battle, and anyone smart enough to get hired at Google already knows that.
To be sure, poor security practices are also partly to blame. If I read the advisories correctly (and I may not), you have to configure phpBB in a fairly non-secure way to be vulnerable: Namely, you have to make your comment-posting forms world-accessible. That's common practice on boards that allow anonymous posting, and there are obvious downsides to disallowing it. It's a baby:bathwater conundrum, in that you throw away some of the liveliness and spontaneity of your board if you don't allow anonymous posting. (BTW, to forestall concern, Antikoan.net is not vulnerable to the exploit used by Santy, since our hosting provider has already installed the relevant PHP security patches.)
And the commodification of hosting plays a huge role in exposing vulnerabilities, though it can also help alleviate them. From the mid-'90s through the present, hardware and software commodification have conspired with consumer expectation and economies of scale to thin the margins on hosting to a hair's breadth; if a hosting provider spends two minutes a day of his own energy on a single customer, that's two minutes too long for him to be making a profit. So corners get cut, processes get automated using home-brewed scripts that aren't debugged. Best-practices for security get ignored because they make things harder to manage. The same commodification, though, has driven the development of highly automated hosting maintenance systems that bake-in things like security best-practices and software version control. There's distance to go, but it's getting better very rapidly.
Final aside: Santy is, in a way, the very worst kind of robotic exploit tool, because it's scripted, not compiled. Its instructions are available to any reasonably competent sysadmin who happens to get infected. And since it's Perl exploiting PHP, the development and execution environments are nearly ubiquitous, and the exploitable platforms still richly plentiful. phpBB may be the first exploit target, but it won't be the last; my quick research indicates that the most common family of OSS CMS systems may be vulnerable at its core, and almost certainly is in some of its more popular third-party modules. (The more frightening prospect is that one of those modules is now without a maintainer, and so, if exploited, would remain exploitable. I'll leave my conjectures vague for the time being because they're still just conjectures, and anyway I wouldn't want to give anyone ideas.)
"And if I don't ever get married or have a baby, what -- I get bupkes? Think about it: If you are single, after graduation, there isn't one occasion where people celebrate you. I am talking about the single gal. I mean, Hallmark doesn't make a 'Congratulations, you didn't marry the wrong guy' card." -- 'Carrie Bradshaw', Sex and the City
"I started to get notes the next week that said that single women were starting to register, at stores, for their birthdays. And I thought, 'That's great, because we put something out there.'" -- Jenny Bicks on Morning Edition [listen]
Yeh. Right. You put something out there, alright: Another quasi-official reason to spend, and spend in a store-register-validated, label-appropriate manner. Sex and the City is really all about social activism and culture-jamming, after all.
As far as I'm concerned, the time is well-overdue to re-examine the idea that human existence is solely for procreation -- if there's one thing that Humans consistently do that other animals don't, it's make their own rules about what their existence is for -- but relating that to Carrie Bradshaw's sense of loss over her $400 shoes really, truly advances that particular cause not one whit, and it's insultingly disingenuous for the script's author to argue otherwise.
Sometimes, the primary driver in social change is the spread of technology. And the primary driver in the spread of technology is usually falling prices (not falling costs, that's another matter altogether).
And the primary driver in falling prices is usually theft.
If you're a business user, US$50 might get you just what you need in a desktop computing environment: A strong productivity suite, with Outlook/Exchange-like email and scheduling and centralized administration (very important in controlling the IT costs). Sun and Novell currently have enterprise-level offerings in that price range, and that's generally being taken as a sign that Microsoft is in trouble. But Dave Berlind at ZDNet argues that at that level of commodification, the underlying infrastructure doesn't really matter:
When software delivers a specific utility, that utility or "layer of value" is often referred to as "the contract." Like a real contract, a software contract sets the expectations of the external entities that will interface with the software. Those entities can be other systems or software, or they can be humans. ... If software interacts with users, then the rubber meets the road at the user-interface level where users feed something in and get something out in whatever format they want it (think documents and communications like instant messaging).
In the case of desktop Linux, the contract is in the user interface (which includes the applications). After all, a lot of the attraction to desktop Linux is due to the fact that it does things out of the box that Windows does not. For example, there's no need to run out and buy a productivity suite or install an instant messaging client. Most distributions of desktop Linux include fairly robust software for each. This model is remarkably similar to that of PDAs. As with PocketPC or PalmOS-based devices, the targeted users of JDS, NLD, and whatever Red Hat comes up with next will mostly interact with the applications and not with the operating system, which in turn reduces the OS to a mostly embedded and, not coincidentally, rather trivial commodity status. [emph added]
A minor point that Berlind misses: Commodity productivity only works as long as interoperability is rock solid. Ten years of dominance by MS Office have gotten us hooked on being able to freely trade editable documents with anybody, anywhere, anytime, with no format translation necessary. Not that I think Berlind really misses this point; it would probably just have confused the issue here, but it ends up being important nonetheless.
But Berlind's main point is that this doesn't look as bad for Microsoft as we might think:
Anybody who thinks that Microsoft is just going to lie down and die as a result of this revolution in what $50 gets you is dreaming. If Novell, Sun, or any other company can turn a profit off of a $50 soup-to-nuts desktop offering, there's no reason Microsoft can't do it, too. It's just that the result may not be Windows and Office as we know them in their entirety. For example, Microsoft already has plans to offer a $36 Windows XP Starter Kit in India and will be offering copies of Office to certain schools at $2.50 per copy.
Berlind's right to say that MS wasn't driven to these tactics by Linux, per se. Linux has played a big role, particularly in the emerging nations, India among them, and to a lesser extent in the EU. Especially in South America, new offerings to governments often have to be Linux or nothing, more or less. And Berlind is right that hardware commodification and per-seat pricing pressures in the corporate IT realms are prime proximate drivers for this kind of offering.
The real key driver, though, as I see it, is piracy.
These cheap Windows packagings that Microsoft is hawking aren't really intended to compete with Linux distros. Linux really isn't competition for this market. Much as places like Russia, the Ukraine, India, China, and Brazil are hotbeds of software innovation (and they are), the bulk of users in those places are still "home" and "office" users: They're even more unsophisticated, in other words, than the home users in the US market. It pains me to say this, but Linux is simply not a viable option for them. (Through no fault of the OS, let's be clear. It's a packaging and UI issue. Period.)
The alternatives aren't "buy MS" vs. "install Linux for free"; they're "buy MS" or "steal MS". MS has understood this for years, and has taken localized stabs at it for a long time. What they seem to be realizing now is that their strategy needs to be global. After all, government purchasers in Brazil and Hyderabad can now easily communicate and compare notes on what they're hearing from their MS reps. Again, to be fair, this is probably not something that Berlind missed, so much as something that didn't fit in.
But in de-emphasizing the primary root cause (piracy) and over-focusing on the proximate cause (price wars in conjunction with hardware commodification), Berlind misses a very interesting point about information flow in the new digital world order.
It snowed here on Monday night and Tuesday morning. I saw it whirling in my headlights just as I drove into town, en route back from an extended weekend spent looping through central Pennsylvania. It was just a dusting, a half inch or less, but it's been cold enough since that it's still hanging on, mid-morning on Wednesday. Wednesday, November 11, 2004, in western NY state.
It's getting colder earlier here, it seems; two years ago we got substantial snow on Halloween that stuck on the ground for several days. I spent a week in Iceland two weeks ago, and it was barely colder there than here the whole time. The last three winters here in Rochester have been brutally, abnormally cold, with heavy snow. In the depth of last winter, I would drive to work early and alone and in the pitch dark, in near-zero-Fahrenheit cold, with clouds of fine, powdery snow snaking wildly in the cone of my headlights. I'd have to dig out my car to get it out of my parking space in the morning, and usually had to dig through to get it back in at night.
It's not that bad, yet, but I'm tempted to blame it on global warming. Which is on my mind often, but more so today thanks to a friend pointing me to the new, slightly overdue climate report [PDF] just released by the Arctic Council.
Priorities: How does anything as small as religion, terrorism, or political partisanship matter when you're in the process of changing the physical world in ways that our civilization won't be able to deal with? Answer: It matters because we allow political expediencies to define something that we describe as "reality", when what we really mean is "worldview".
Calling it "reality" makes a position seem so much more forceful, so much more resolute. But "Reality," to quote the late P. K. Dick, "is that which, when you stop believing in it, does not go away." We've learned in the modern West that we can use science to divine some knowledge of those things that don't go away. Yet faced with some of that knowledge, many forces in power reject it as a "reality" they don't choose to "privilege". How ironic -- how "neat" -- that critics of "political correctness" should use the tools that enabled it to enforce their own reality distortion field.
And how ironic that these privileged classes who are busily extracting wealth as the clock winds down are turning themselves into the class of humans most likely to survive. It's actually a great case-study for the non-deterministic nature of evolution: That a class of actors with an ethos that revolves around denying reality are the most likely to survive the real global consequences of their own errors.
Lessig also complained about the Copyright Term Extension Act, which adds several years to the terms of protected works. I countered: Farmers can leave their property to their children; why shouldn't songwriters be able to leave their songs to their children?
[Former RIAA CEO Hilary Rosen, "How I Learned to Love Larry"]
Metaphor is a powerful thing. And like most powerful things, it can be dangerous when misused. Rosen here compares land to songs -- dirt to notes -- physical property to intellectual property. "I made it, I should be able to pass it along." It's a profoundly American way of thinking about something like "intellectual property."
But there's a difference in kind, here. Americans like to make analogies for everything -- we like to cut through boundaries, break down barriers, draw comparisons. We like to cross pollinate. There's power in shuffling the pieces to get new arrangements.
But we tend to lose sight of the fact that fair appearance is not the same as truth. There is a difference in kind between plots of land and songs. If I may draw an analogy, consider the similarity between musical notes and dirt: Both are basic constituents of larger things. But a landholder can sell the dirt; a song "owner" can't sell the notes from the tune. "You can have the first bar; I'm keeping the rest." And henceforth, "Happy Birthday" is one bar shorter... Doesn't work that way: The analogy breaks down.
Similarly, the analogies between physical and intellectual property break down pretty quickly, too. One is composed of real matter, that can be moved in the world and if recreated, is clearly a different thing, not the same thing. A song (or a story or a poem), by contrast, is said, felt, and understood to be the same thing no matter what form you re-publish it in.
More significantly, intellectual property is composed of arranged ideas. Arranging ideas is the most human of the things we identify as human; we do it all the time. The very act of interacting with our culture is a process of arranging ideas. To say that one arrangement of ideas -- a song, a painting, a novel, or even the plan for a device -- can be created and then belong forevermore to only one rights-holder is a profoundly anti-human concept.
A few more thoughts on the Kalsey-Firefox Affair. It's another illustration of Tech Macho Bullshit in action: If you're not "clueful" enough to see how much better off you are with Firefox, then you "deserve" IE.
Personally, I think that a "cluefulness-test" is the moral equivalent of playground bullying. (Geeks getting their own back?) And I don't think anyone deserves IE, but that's just me.
As far as I can see, this resembles the dueling of "eXtreme Programming" versus "traditional" methodologies in that both are manifestations of the geek's adolescent obsession with control. They want to be able to make the decisions about what's "right" and "wrong" without subordinating themselves or their labor-power to inferior beings. The fact that those inferior beings do most of the working and paying and living and dying in this town escapes their minds; they only remember that they have to suffer the indignity of living with, working for, and being paid by them.
It's a real problem when backend geeks arrogate all app decisions unto themselves. Here's a real clue: If the app is hard for average users to use, it's a failure. Period. "Better" becomes irrelevant, because if you can still say it's "better" at that point, you're clearly using the wrong metrics.
John Perry Barlow noticed something interesting in NYC last week. It seems cocaine is making a comeback:
I'm talking about the interesting fact that, along with the Republican National Convention, New York is being hit with a cocaine epidemic that is even worse than the snowstorm that gripped this town during the mid to latter 80's. (During the last Bush administration, to put a finer point on it.) This time there won't be a crack problem to get all racist about, however. Cocaine in New York is now so cheap and plentiful that such economic measures as cooking it down to crack need not be taken by the poor.
People who learned better 20 years ago are suddenly snorting blow again. People you would never think would mess with this stuff are messing with it big time. Once again one commonly sees lines on the tabletops and the frantic eyes you can never make contact with. I was in a club the other night that was full as a tick with beautiful-looking people pharmaceutically disabled from beholding one another's beauty.
At the same time that the white death has made such a roaring comeback, the drugs that I think are relatively harmless, pot and the psychedelics, are in extremely short supply. Pot is selling at cocaine prices, a hundred dollars a gram sometimes. And coke is selling at pot prices. An ounce of coke would cost about the same as an ounce of decent sensimilla a few years ago. Mushrooms are scarce. LSD is functionally off the market.
What's going on?
But then, cocaine is a Republican drug. It makes its users self-obsessed, aggressive, and greedy. It plays hell with one's sense of consequence. It's generally preferred by people who have more money than humanity. And, best of all, the weirdos and peaceniks who like to waste their useless time stoned on marijuana or psychedelics, tend to hate it. ("All the more for us, eh, Buffy?")
Once again, one can see clearly what the War on Some Drugs is really about. It's the culture, stupid. It certainly isn't about public safety, since coke and booze are the perfect combination for social depravity of all sorts. Instead, it provides a beautiful opportunity to jail the blacks and hippies who prefer the non-Republican drugs. It makes huge bank for one's wing-tipped colleagues.
Can it really be that the Bush Administration has decided to turn a blind eye to blow? Or is it that they are simply too incompetent, despite turning Colombia into a war zone? Maybe this is just a local phenomenon, arising from the fact that approximately 10,000 New York police officers, who ordinarily focus on narco crime, have been diverted to convention patrol.
.... Just what we need, a whole arena packed with irritable, glaze-eyed folks who are even more certain of their superiority than usual....
But it may be simpler: It may just be economics. As we shift focus from the America-exacerbated problem of narco-terrorism to the America-exacerbated problem of global religious terrorism, the drug-kingpins can expand their production and trafficking operations. And since our "homeland security" measures have been such dismal failures, we still have wide-open borders.
And to top it all off, those goddamn drug producers are behaving like manufacturers, and working to improve production! That's right -- those bastards are getting scientific, and breeding super-high-production coca plants! Clever S.O.B.s....you'd think good capitalists like the Bushites would applaud that kind of initiative.
Maybe that's what's going on, after all...
New ideas can only form where knowledge is incomplete. Ideas are a response to gaps or shortfalls in knowledge; they fill in the blanks of what is not yet understood. Where knowledge is (or appears to be) complete, there will be no new ideas.
My brother Glen pointed that out to me on Friday night, in the course of explaining to me that he had started writing his papers to push ideas in addition to data. It's true, and trivial, and yet it's non-obvious, because it requires that we invert our normal analytical behavior (finding solutions).
At times like this I wish I'd pursued math more diligently. I wonder if this is another way of stating the Incompleteness Theorem.
One of the few lessons I've learned since I was a young boy is that the commercial marketplace is largely a meritocracy, but not a technical one. It's a marketing meritocracy....
Note to self to add this to my list of dangerous memes: "The Web as Meritocracy." Call it the "nigritude ultramarine" meme.
Furthermore, Dash maintained, his victory proves one thing: That the Web is a meritocracy.
"A page that's read by people instead of robots is going to do better," he said.
There are some really good, basic, honest techniques for getting placement, but they take work. What Anil Dash is talking about is one of those techniques, and in his narrow slice of the web it's the best single technique. It's not in the least surprising to me that this worked, especially given the "insanely generous" weighting that Google gives to blog pages; this is the tactic that I outline for people whenever they ask me about how to get Google placement.
And that this kind of technique works does tend to foster something that looks like a meritocracy. But it is not, in fact, a meritocracy at all: It's merely a measure of popularity. And that something is popular does not mean it's true.
I've found it's important to explain the distinction I'm drawing, because there seems to be a really quite strange tendency on the part of many technophiles to believe that appearance is essentially identical with reality. ("If it resembles a duck, it might as well be one.") I think one big reason for this is that in the limited frame of relevance comprised of what's relevant to a software or data interface, appearance is in fact reality. It's fair to say that a deep and conscious grokking of this fact is one of the most essential characteristics of a good net-hacker.
To be fair (and with due reference to the first quote), I'd be surprised if Anil Dash doesn't understand that. Or Doctorow, for that matter. Though sometimes I wonder if people lose appreciation for the finer distinctions after being beaten incessantly over the head with the "Virtual Is Real" squeak-hammer day after day after week after year...but I digress, as usual.
To Dash's point, you could construe the web as a "marketing meritocracy", but that's really just a way of exposing the ramifications of Metcalfe's Law. The "merit" at hand isn't Anil Dash's personal merit, nor even his technical merit: It's the weight of his reputation, which is a function of how the brand known as "Anil Dash" has been marketed.
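For reference, Metcalfe's Law holds that a network's value grows roughly with the square of its nodes -- in effect, with the number of possible pairwise connections. A toy sketch of why reputation-weight compounds the way it does:

```python
def pairwise_connections(n: int) -> int:
    """Number of possible pairwise connections among n nodes -- the
    usual proxy for Metcalfe's Law's n-squared growth in network value."""
    return n * (n - 1) // 2

# Doubling the audience roughly quadruples the connection count:
print(pairwise_connections(1000))  # 499500
print(pairwise_connections(2000))  # 1999000
```

A well-marketed name doesn't just add readers linearly; every new reader is a potential new linker, which is why ranking rewards reputation superlinearly.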
Anil Dash didn't receive his "winning" ranking by merit in a personal sense, or even in a real technical sense. Rather, he won it by gaming the system, so if the results of this competition demonstrate anything, in fact, it's that the web is not a meritocracy -- unless by "meritocracy", you are restricting the judgement of merit to social engineering skills.
One stock response to all of this would be: "So what? Systems get gamed. It's all subjective." Which brings us back around to Lysenkoism and intellectual relativism. It seems to me that to argue that reality is the result achieved by the best gamesman is to give up on the idea of knowledge, in a sense.
Dave Winer finally speaks out on the Weblogs.com fiasco, and amongst all the usual stuff I find one thing that really leaps out at me:
One of the things I learned is that just because a site is dormant doesn't mean it's not getting hits. The referer spam problem on these sites was something to behold. Search engines still index their pages, and return hits. They were mostly dead, in the sense that most hadn't been updated for several years.
Something troubles me about this and the interminable HTTP code vs. XML redirect discussions, and that's this: If someone links to the content, it's live by definition.
I'll restate that, so it's clear: Content that is used should continue to be accessible. I don't actually know where Ruby or Winer or Rogers Cadenhead or anybody but the writers stand on this, but it remains a non-negotiable best practice and first principle of web interaction design for usability that links should not go dead.
If that means you have to redirect the user to a new location when the content moves, so be it. If you have to do that with meta-refresh in HTML or XML, so be it. Sure, there are "purer" ways to handle it; but it's just stupid to let the perfect be the enemy of the good by saying that you can't redirect if you can't modify your .htaccess file. Even a lot of people with their own hosting environments aren't allowed to modify their .htaccess.
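For the record, a meta-refresh redirect requires nothing but edit access to the old page itself -- no .htaccess, no server configuration. A minimal sketch, with a placeholder destination URL:

```html
<!-- Redirect after 0 seconds; works with no server-side configuration. -->
<html>
  <head>
    <meta http-equiv="refresh" content="0; url=http://example.com/new-location/">
    <title>This page has moved</title>
  </head>
  <body>
    <p>This content has moved to
       <a href="http://example.com/new-location/">a new location</a>.</p>
  </body>
</html>
```

The plain anchor link in the body is the fallback for clients that ignore the refresh pragma. Inelegant compared to a proper HTTP 301? Sure. But the link stays alive, which is the point.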
I'm getting the sense that a lot of the people involved in these debates are forgetting that the web was supposed to be about linking information to related information. Protocols and methods are mere instrumentalities to that end. It's the content that matters; there really, really isn't a web without content.
Terrorists with leverage are scary, but I'm much more scared of nutty, cocksure attempts to build "technology" that supposedly keeps us safe. Terrorists get tired, give up, or shoot each other over the spoils, but once the hardware's installed, a lousy technology is harder to kill off than a cockroach.
[Bruce Sterling, speaking with Bruce Schneier]
Via Bruce Schneier's June 2004 Cryptogram, a "discussion" between the Bruces Schneier and Sterling that, though it consists mostly of one-paragraph positionings, does get in a few bon mots.
Cryptogram is worth looking at, too, if only for its revealing analysis of the effect of the superficially unspectacular Witty Worm. Witty was nearly unique in the degree of technical competence exhibited by its creators: If they'd chosen a different target, we could have lost the whole net in 45 minutes, instead of just 12,000 nodes.
You want to find the number 216 in the world, you will be able to find it everywhere. 216 steps from a mere street corner to your front door. 216 seconds you spend riding on the elevator. When your mind becomes obsessed with anything, you will filter everything else out and find that thing everywhere. -- "Sol Robeson", Pi
Via MeFi, this morning, I find some skeptical meditations on the Golden Mean. We're told it's everywhere; we're told it (or at least the Fibonacci Series) can be found throughout nature, and this is presented as evidence of intelligent design.
But the truth seems to be that it's as good a demonstration of natural selection as it is of design. For example: Leaves tend to rotate on a stem in a spiral that can be mapped to a ratio of Fibonacci numbers, known as the "Divergence". Wondrous, that; or perhaps not.
Although many of these observations were made a hundred years or more ago, it was only recently that mathematicians and scientists were finally able to figure out what is going on. It's a question of Nature being efficient.
For instance, in the case of leaves, each new leaf is added so that it least obscures the leaves already below and is least obscured by any future leaves above it. Hence the leaves spiral around the stem. For seeds in the seedhead of a flower, Nature wants to pack in as many seeds as possible, and the way to do this is to add new seeds in a spiral fashion.
As early as the 18th century, mathematicians suspected that a single angle of rotation can make all of this happen in the most efficient way: the Golden Ratio (measured in number of turns per leaf, etc.). However, it took a long time to put together all the pieces of the puzzle, with the final step coming in the early 1990s.
The worst kind of angle for efficient growth would be a rational number of turns, eg. 2 turns, or 1/2 a turn, or 3/8 of a turn, since they will soon lead to a complete cycle. Mathematically, a turn through an irrational part of a circle will never cycle, but in practical terms it could eventually come close. What angle will come least close to a cycle? Maximum efficiency will be achieved when the angle is "furthest away" from being a rational.
Or perhaps so, but not in the mystical way of a super-mathematician divining the world. Rather, wondrous, in that mathematics can help us to understand the real reason that such realizations evolve.
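The convergence the quoted piece leans on is easy to check for yourself -- a quick sketch showing consecutive Fibonacci ratios homing in on the golden ratio, and the "golden angle" of roughly 137.5 degrees between successive leaves that falls out of it:

```python
def fib_ratios(n: int):
    """Yield ratios of consecutive Fibonacci numbers, which converge
    to the golden ratio phi = (1 + sqrt(5)) / 2."""
    a, b = 1, 1
    for _ in range(n):
        a, b = b, a + b
        yield b / a

phi = (1 + 5 ** 0.5) / 2             # ~1.6180339887
golden_angle = 360 * (1 - 1 / phi)   # the "most irrational" turn per leaf

ratios = list(fib_ratios(20))
print(abs(ratios[-1] - phi) < 1e-6)  # True: the ratios close in on phi fast
print(round(golden_angle, 1))        # 137.5
```

No mysticism required: the ratio is simply the limit of a very ordinary recurrence, and the angle is simply the rotation hardest to approximate with a repeating cycle.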
Yet, of course, like most good stories, people still want to believe it. We are story-making animals, after all; it's how we understand the world. Powerful stuff, stories. So long as we remember now and then that we do live in a real world.
Wikipedia is probably the most significant, most important website on the net right now, in May/June 2004. It's the signal success we can point to for bazaar-style projects, and the great white hope for the persistence of free, non-corporate-sponsored information on the web. Not to disregard Wikipedia's smaller cousin, WikInfo; they're just not big enough to be a great white hope, yet.
So, now, Wikipedia has done something intriguing: You can now talk about any article, or view previous versions. These appear to be benefits of the upgrade to version 1.3 of MediaWiki, the hyper-extended Wiki implementation that Wikipedia develops and uses to drive the site.
Tired terms like "community portal" don't do this justice. I don't think the great mass of the digerati really have any clue how important Wikipedia (and WikInfo) are. This kind of move, once they notice it, could blow Wikipedia wide open.
My great fear is that it could literally blow it wide open: How will they be able to handle the loads? Will their community software be able to cope with input from every Tom, Dick and Harry with an opinion?
The upside, of course, is that with a project of this broad scope, we'll finally get that "online experiment" that other "communities" have been claiming to be for years.
Addendum: I've posted this on Mefi; let's see if anybody cares.
Second addendum: Mefites assure me that it's always had that functionality, though it wasn't as obvious as it is now. I wonder if they've made changes that will let them handle the greater load and have decided to front-and-center those features?
Disney is apparently angling for a market coup by forcing the FCC's broadcast decency rules to apply to cable [daypop cites]:
The Walt Disney Co. has quietly been lobbying Congress to apply broadcast indecency rules to cable programming, according to informed sources.
Were Congress to agree with Disney, basic and expanded-basic cable networks could be fined thousands of dollars by the Federal Communications Commission for airing four-letter words and steamy love scenes prior to 10 p.m.
Under a Senate bill pending floor action, cable networks, with certain exceptions for news, premium and pay-per-view fare, could face fines for airing violent movies and dramas before 10 p.m.
It's a brilliant ploy, of course: Once broadcast decency rules applied (so their reasoning goes), no other vendor of entertainment would be as well positioned to sell entertainment (in all its myriad forms) to the American public...
jonh gets part of the way -- the same part of the way that Jeffersonian-tinged net.libertarians usually get: The tech has the power; the tech will enable changes that in turn cause further changes.
I'll bet that in about five years ... by 2010 ... the use of blogs in the workplace will be widespread. This will require the continued spread of "transparency" in the dynamics of networked organizations, and so will continue to create pressure on core issues like leadership, structure and the processes by which people are managed in an interconnected information-based environment.
Just look at the pressures being faced by Donald Rumsfeld and you'll see an early signal - will leaders be able to lie their way through competitive challenges or major change in organizations?
Powerful ideas, to be sure.
But as usual for the more optimistic heirs of Thomas J, he doesn't close the loop. The Tutor points out an obvious response:
Well, just look at Karl Rove. Yes, they will lie bigger and lie better. And nets will be the Terrorist tool of choice, demonized. Will the guards at GITMO blog when they return home, traumatized? Or will they take Prozac and wave flags? Did the SS write memoires? The story strong enough to extinguish evidence, leaving only the snow, the trees, and one lonely owl against the night. When the truth is ugly, the mind small, bet on lies. Unless our poets get off their postmodern ass. Where is our Mandelstam or Brodsky?
One error here is mistaking transparency for a technology; transparency is merely enabled by the technology.
Transparency can be shut off -- or, more ominously, controlled. Transparency need not be total, or even nearly total, in order to reap its benefits. The real cluetrain will run on rails paid for by people with lots of money or government influence, and those people will be placing restrictions on the passenger manifest: No bolsheviks allowed.
Why all the fuss? After all, the bugs are more of a nuisance than an actual threat to humans and pets. ["Cicada hype 'bugs' people" by Stephenie Steitzer, Cincinnati Post staff reporter]
Nuisance, yes. But what a nuisance! Ruined weddings and other outdoor celebrations. "Who would have thought that entomologists could double as wedding consultants?"
Besides, while chomping on cicada snacks, one could pass the time coloring a cicada or folding an origami cicada.
In fact, why not fold today's or yesterday's or tomorrow's front page news into a cicada? It would provide an apt reminder of cyclical nature in more ways than one.
The modern concept of the corporation, as it is realized in America and Britain, is of an entity that is permitted to act as a "legal person". But just what kind of a person would a corporation be? The Economist, reviewing the 2003 documentary The Corporation, provides an answer: It would be a psychopath.
Like all psychopaths, the firm is singularly self-interested: its purpose is to create wealth for its shareholders. And, like all psychopaths, the firm is irresponsible, because it puts others at risk to satisfy its profit-maximising goal, harming employees and customers, and damaging the environment. The corporation manipulates everything. It is grandiose, always insisting that it is the best, or number one. It has no empathy, refuses to accept responsibility for its actions and feels no remorse. It relates to others only superficially, via make-believe versions of itself manufactured by public-relations consultants and marketing men. In short, if the metaphor of the firm as person is a valid one, then the corporation is clinically insane.
The main message of the film is that, through their psychopathic pursuit of profit, firms make good people do bad things. Lucy Hughes of Initiative Media, an advertising consultancy, is shown musing about the ethics of designing marketing strategies that exploit the tendency of children to nag parents to buy things, before comforting herself with the thought that she is merely performing her proper role in society. Mark Barry, a "competitive intelligence professional", disguises himself as a headhunter to extract information for his corporate clients from rivals, while telling the camera that he would never behave so deceitfully in his private life. Human values and morality survive the onslaught of corporate pathology only via a carefully cultivated schizophrenia: the tobacco boss goes home, hugs his kids and feels a little less bad about spreading cancer....
The film-makers have the Corporate Person as their subject. True to their milieu, The Economist finds that insufficient: Socialist dictatorships are also psychopathic, they point out. Bureaucracies are just as bad as corporations. Just ask the North Koreans.
But The Economist fails in its analysis inasmuch as it falsely concludes that states are somehow generically less accountable than corporations. In America -- at least, in theory, and assuming we have paid attention -- we do have some measure of control over the "Corporate Person" as embodied by the state. And as the Corporate Persons of the world proceed to internationalise themselves, they become less subject to the laws of individual countries. And as a state runs up against the petty restrictions forced upon it by its bleeding-heart populace, it turns to "contractors" in the pay of Corporate Persons who are not bound by state regulation.
"They got this guy, in Germany. Fritz Something-or-other. Or is it? Maybe it's Werner. Anyway, he's got this theory, you wanna test something, you know, scientifically - how the planets go round the sun, what sunspots are made of, why the water comes out of the tap - well, you gotta look at it. But sometimes you look at it, your looking changes it. Ya can't know the reality of what happened, or what would've happened if you hadn't-a stuck in your own goddamn schnozz. So there is no 'what happened'? Not in any sense that we can grasp, with our puny minds. Because our minds... our minds get in the way. Looking at something changes it. They call it the 'Uncertainty Principle'. Sure, it sounds screwy, but even Einstein says the guy's on to something." ['Freddy Riedenscheider', The Man Who Wasn't There]
Sam Arbesman's MemeSpread project aimed to chart the progress of a particular (albeit problematic) meme throughout the "blogosphere", given known sources. Initially seeded on kottke.org, BoingBoing and Slashdot, it was picked up only by Kottke; it apparently fared poorly until it hit MeFi, from whence it boomed across the web like one of those evanescent thunderclaps that wash across the blogosphere like a summer rain in the desert.
A Wired News article summarizes the story (though it fails to link to Arbesman's own writeup [pdf]). Aside from a passing reference to the "Hawthorne Effect", though, it doesn't really deal with the difficulty of studying phenomena such as this. It reminds me of a similar project I pitched to my undergrad advisor in 1992, with the idea of pushing out memes via Usenet. (He was uncomfortable with the human subjects concerns -- my experimental design was constructed to avoid observer effects.)
5. The next religion will value myth, and not confuse it with history.
And yet, a religion that presents its mythology as history will never make peace with science. Each new archeological discovery or new method of textual criticism will pose a challenge that will demand a new denial, rationalization, or reinterpretation. Eventually this baggage will be too heavy to carry, as it currently is for many Christians. But if a religion truly values its mythology, then there is no need to claim it as history. Fundamentalists cling so desperately to historical accuracy of the Genesis creation story because if it is "just a myth" it has no value for them. But this is an absurd point of view--would we really prefer an accurate history of Troy to The Iliad? Would we choose true Danish history over Hamlet? The next religion will recognize that myth is often more important than history, just as the exploits of King Arthur have as much significance to Western culture as the actions of any historical English king. Whether Arthur is historical or not does not matter, for he is mythic.
So many missed points, and all to the point.
Hamlet is not Danish history, and does not seriously purport to be. And if closely observing religion (as I dare say I've done since the tender age of about eight) has taught me anything, it's that people have surprisingly little problem with the 'heavy baggage' of mere logical inconsistency.
I'll cop to "point missed", of course; this passage is meant to describe how it oughta be, not how it is. But in hopefulness, it's treacherously naive: The idea of Arthur is much more dangerous (as a function of its greater worldly power) as a literal myth than a metaphorical one.