Rohan Jayasekera's thoughts on the evolving use of computers -- and the resulting effects

Occasional thoughts by Rohan Jayasekera of Toronto, Canada.


I've been online since 1971 and I like to smooth the way for everyone else. Among other things I co-founded Sympatico, the world's first easy-to-use Internet service (and Canada's largest).


Sunday, August 27, 2006

Tara Hunt and wrong questions

Tara Hunt is someone who really, really “gets it” and whom I admire a lot. And she has the gift of the gab. So I was taken aback when she posted something which wasn’t as cogent as usual: “But, you are asking the wrong question...”.

I can understand why Tara sounded a bit confused: she was post-morteming her answers to an on-air interview. I think I may be able to help with some perspective, given that I have the benefit of third-party distance. Of course only Tara can say whether this is “help” or “no, no, that’s not what I mean at all!”

(Small-world note: it turns out that Tara and my wife have worked on the same project, Cheapeats Toronto. I only became aware of the connection after my friend Lex, the dynamo behind Cheapeats, noticed that Tara had mentioned me in one of her posts.)

What is the Long Tail “really about”? Tara says she regrets having given Snakes on a Plane as an example. But, I think, she’s said so much about Snakes on a Plane that if it isn’t a good example of something, the problem isn’t the example; it’s that the “something” is wrong. (Which is another way of saying what Tara did: “you are asking the wrong question”.) This is what started me thinking, and caused me to write this post.

Tara says that the Long Tail is “about the celebration of getting small”. But small’s always been around (big has to start somewhere), and it’s been celebrated for a long time. So I don’t think that really nails it.

To me, the central thing here, and it may be different from the Long Tail (yes there is a Long Tail, but there always has been!), is that those who are not part of established interests can now compete with those established interests better than they could in the past. From a left-brain perspective I call this reduced barriers to entry; from a right-brain perspective I call it empowerment.

This has come about because technology has somewhat levelled the playing field. When this happened in the Wild West, someone said that “God made man and God made woman, but Samuel Colt made them equal.” The “established interests”, the physically strong people, had their advantage removed by technology.

There are tons of examples of this in the world at large. For instance, human rights defenders now use video cameras to “document abuse and create change”.

With the onset of Web 2.0, the participatory Web, the “small” can now do what only the “big” could do before. Bloggers can play in the same sandbox as established columnists; individual programmers can now bid for jobs from other countries; my wife now sells pretty mostly-vintage items on eBay.

Tara mentions Amazon, Netflix and iTunes, all of which get large revenues from their large collections of smaller-selling items. Less-popular books, movies etc. have always been around, but on the Web they’re easier to find, to recommend, etc. But I’m not sure that this force is the biggest one. Sure, it eases consumption of the small, but I think the bigger story here is on the production side: the small producers can now compete more effectively with the big ones. Canadian singer-songwriter Allison Crowe (never heard of her?—you have now) has her own website and records on her own independent record label, having turned down a major-label recording contract because she didn’t want to do things their way. She uses relatively cheap equipment to make good-quality CDs — and you can hear her rendition of Leonard Cohen’s Hallelujah on iTunes. The playing field is still far from level in the music business, but it is changing.

And Snakes on a Plane does count. As Tara says, the story is “how a small group of people could hijack old media”. That it was a big-budget Hollywood production, as remarked by her interviewer, doesn’t cancel this as an example; if anything it strengthens it. A group of bloggers used the Internet to collectively create a modified version of a movie that was already in production. They didn’t shoot any film, but they had discussions, and made a new poster, that repositioned the movie somewhat. The people behind the film then changed it to match the new angle, including taking the extremely unusual step of shooting new footage after they’d already wrapped filming. These bloggers weren’t part of the Hollywood establishment; they used their technology-provided powers. And that, I think, is the answer to whatever the “right question” is.

UPDATE: Hugh MacLeod says that “The blogosphere doesn’t get us sales, but it makes us much smarter salesmen.” That is, it helps with production, not consumption.

UPDATE #2: Many people have commented that Snakes on a Plane didn’t do all that well at the box office, and that the Web 2.0 forces therefore failed. Here too the commentators are looking solely at consumption, not noticing the impact on production.

Saturday, August 26, 2006

Bubble sickness

There’s been a lot of talk about how much YouTube is worth, or Facebook, or digg. Some of it’s by people who analyze such things because it’s their job to do so, but there’s been way more than that. When people start getting excited about all the money that other people could make, that’s when I suspect that not only is there a bubble but it’s going to burst.

I hate this. In late 2000 I lost my job because of the dot-crash. Even though, as a non-believer in the “New Economy”, I’d been careful to work for a company that was private, profitable, debt-free and that had a CEO who also didn’t believe in a New Economy, the company ended up growing to take in the copious amounts of $$$ that dot-coms were begging us to take off their hands. (We did have to work for it.) So when the dot-coms got into trouble, so did we. I got axed in a round of layoffs but soon the whole company went under.

When a bubble bursts, everyone gets spooked and becomes reluctant to engage in even “normal” hiring and investing. It’s horrible and I’m not looking forward to a repeat of 2000/2001. Sure, “it’s different this time”: there are indeed some differences. But once mania/bubble psychology has set in, for whatever reasons, the result is inevitable.

Monday, August 21, 2006

Eflactem's Law

My previous post was about another reason that networks gain value with more members. Now I’d like to talk about how certain networks lose value with more members.

This has been done before, e.g. McCandlish’s Foil to Reed & Metcalfe. The particular force I want to talk about here relates to Web 2.0-era companies, which tend to have remarkably few staff by traditional standards.

A riddle I heard in the 1980s:
Q: In a group of workers, how can you tell which ones are the knowledge workers?
A: They’re the ones who shower before they go to work.

The companies we’re talking about usually consist entirely of knowledge workers. Knowledge workers, um, work with knowledge, and they need lots of it. Communicating knowledge can often be difficult to do well, and in any case takes time for both the “transmitter” and the “receiver”. The best way to deal with this is to eliminate the need for any communication, by having one person do everything. Or, if that’s not practical, as few as possible.

One way to reduce communication is to eliminate middlemen. For instance, do your own email rather than have an assistant deal with it. A post by Rick Segal refers to CEOs who are constantly on top of their email. Not only that, but he finds that many “big shots” are quite accessible to “mere mortals”. This increases their already-heavy workload, but does ensure that they can be reached without interference, and without corruption of the information.

If you thought you were already overworked, well ...

Saturday, August 19, 2006

Metcalfe's Law and those left behind

Recently there’s been a flood of discussion about Metcalfe’s Law, both the law itself and its applicability to Web 2.0. On Friday Bob Metcalfe himself posted that (as summarized by Mike Hirshland in that post) “to understand the value of a social network we need to consider not just the number of users but also the affinity between the members of the network”.

Not that I’ve searched (feel free to leave comments about what I’ve missed), but I haven’t seen any discussion about the number of non-users of the network.

Back in the 1980s when people started to put answering machines on their home phone lines, I hated leaving messages on them. So I didn’t buy a machine myself — until some of my friends started complaining that if I wasn’t home when they called they’d have to keep calling back. I was now subjecting them to an annoyance they were no longer used to (it didn’t matter that, not long earlier, having to phone people repeatedly had been a normal part of life). I had moved from the norm to the exception, even though I had done nothing. So I bought an answering machine.

My father didn’t buy a computer until he was 70 or so. My impression was that what pushed him over the edge was the computer-savviness of a friend of his who was 86: computers now seemed de rigueur for everyone, not just the younger folks.

Do you see what I’m driving at? When “everybody’s doing it”, people feel pressure (external or internal) not to be left out, and as a result often “do it” too. The popular term for the external stuff is “peer pressure”.

For social networks, this force is very relevant. People join MySpace not just because of the “positive” network effects but also because of the “negative” effects of being left out of where their friends are. So while Metcalfe’s Law applies, and Reed’s Law, so does, hmm, dare I call it Jayasekera’s Law? :-)

Oh, I guess that would require some math! (I have a math degree, but I don’t make use of it very often. The last time was to make shoulders fit better in made-to-measure suits tailored by The Fitter system.) OK, so let’s say that the number of participants in whatever is N, out of a total possible number of participants U (for universe). If N is a small percentage of U, there is little pressure to join in. But if that percentage grows, and grows so much that N dwarfs U-N (which is the number of those left behind), the pressure can become great.

How great? The closer the percentage gets to 100%, the greater the pressure. So we’re interested in how close N/U is to 1 (or “unity” in fancy math terminology). As in Metcalfe’s Law, I would expect the relationship to be more than linear, e.g. square as in Bob Metcalfe’s own formulation. In which case pressure P ~ (N/U)^2. Or maybe it’s more aggressive, e.g. P ~ (N/U)^3. Or even, if the pressure is quite low at first but ramps up spectacularly when the percentage gets really close to 100%, an inverse relationship based on how close 1-N/U is to zero: P ~ 1/(1-N/U) . While I don’t know what an accurate formula would be, I suspect the curve would fall somewhere in the range from square to inverse. In the time-honoured phrase of mathematicians: the actual formula is left as an exercise for the reader.
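The three candidate curves can be sketched in a few lines of Python. (This is just an illustration of the shapes, not a derivation; the function names and the sample numbers are mine.)

```python
# Three candidate "pressure" curves as a function of adoption N/U:
# square (as in Bob Metcalfe's own formulation), cube, and inverse.

def pressure_square(n, u):
    return (n / u) ** 2

def pressure_cube(n, u):
    return (n / u) ** 3

def pressure_inverse(n, u):
    # Low at first, but ramps up spectacularly as N/U gets close to 1.
    return 1 / (1 - n / u)

U = 100  # total possible participants ("universe")
for n in (10, 50, 90, 99):
    print(f"N/U={n/U:.2f}  square={pressure_square(n, U):.4f}  "
          f"cube={pressure_cube(n, U):.4f}  inverse={pressure_inverse(n, U):.1f}")
```

The polynomial curves rise gently all the way up, while the inverse curve stays below 2 until more than half the universe has joined and then explodes, which is why I suspect the real curve falls somewhere in between.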

While applicable to similar situations, this is a somewhat different kettle of fish from Metcalfe’s Law and Reed’s Law, which have no limit on their independent variable (the size of the network): the dependent variable (the value of the network) can just grow forever. Jayasekera’s Law (hahaha — having your own blog means you can say whatever you want) “ends” when everyone’s a participant (which shouldn’t actually happen because there are always rebels). But if the potential membership is growing, as is MySpace’s for example as its appeal continues to broaden geographically, demographically and broadband-penetrationically, all these forces can apply simultaneously: the invisible hands of Metcalfe, Reed and Jayasekera (hahaha — Adam Smith may be long deceased but our hands are all quite visible) can all be active at once. (Rant: people are sometimes tempted to reduce a complex situation to a single force, e.g. Reaganomics was based on the idea that supply-side economics controlled the U.S. economy, and the Kyoto Protocol is based on the idea that the greenhouse gases generated by humans control global temperature. Such oversimplifications are appealing because they suggest that a difficult problem can be solved by a single course of action, but appealing doesn’t mean effective.)

I do believe that this effect has wider applicability than just to social situations. For example, in the original Metcalfe’s Law context of Ethernet, a device that is not connected loses perceived value relative to those that are, more so as it becomes more of an unconnected oddball.

Wednesday, August 16, 2006

Software hasn't become easier to develop

In an earlier post I referred to the common belief in the Web 2.0 world that software is now much easier to develop than it used to be, and my disagreement with that belief. In this post I look at why this belief might have come into existence.

1. A lot of the new applications are fine for demoing, but don’t meet the requirements of a commercial product (in the pre-2.0 world we called these “toy programs”), so they’re relatively easy to develop. (I think it’s wonderful that these things are being created; let’s just not kid ourselves about what they are and are not.)

2. The fact that large numbers of applications are now being created makes people assume that this is because software development has become easier. I would argue that while barriers to entry have indeed been lowered, the barriers in question are not in software development. Specifically:

2(a). In the past, a user needed to have a computer where s/he could install something, and then be willing and able to go through an (often arduous) installation process. Now, to use an application, just type in the application’s URL, or, even easier, just click on a link. So there is a lowered barrier to becoming a potential user for an application, resulting in a skyrocketing number of potential users, increasing the incentive to create the application.

2(b). In the past, an author had to create an installation package, and then distribute it. Now, to publish an application, just put it on your site (which is now dirt cheap to host). So there is a lowered barrier to becoming a potential author, resulting in a skyrocketing of their numbers as well. That is, publishing (not developing) software has become so much easier.

Those of you who are programmers, feel free to comment. Just don’t say anything like “Ruby on Rails is wonderful” unless you can argue why it’s so much better than what’s been around for a long time (please compare against something with high productivity, not something like JEE).

Saturday, August 12, 2006

Terrorists help Web 2.0

“Draconian restrictions on carry-on baggage may stay in place for months, even years.”

Business travellers may not be so keen on carrying laptops or PDAs in future.

So how’s a person to get any work done?

Answer: use Web-based email and word processors and spreadsheets and blogreaders and Web-based everything, on whatever computer is available wherever you happen to be. Hotels catering to business travellers will add computers to their rooms (desktop PCs are pretty cheap now). And airlines will have computers available on board (they may charge a rental fee) that are like laptops but are missing a screen because they plug into the entertainment screen that’s in the seatback in front of you. Of course they’ll need to offer in-flight Internet if they haven’t already. (UPDATE: Boeing announced on Aug. 17 that the service linked to in the preceding sentence, Connexion, is to be discontinued because of lack of interest to date. But there is at least one other service, OnAir, though the Internet access it provides is limited and will stay that way until sometime in 2007.)

ANOTHER UPDATE: For more, see the Techdirt post “Could Terrorism Be A Boon For Web 2.0?” (even though their post wasn’t till five days after mine :-) ).

Thursday, August 10, 2006

Your own music-and-news radio station

This is my first post of the “boy is this cool!” variety.

Monday, August 07, 2006

Pollution in Web 2.0

Pollution, pollution
You can use the latest toothpaste
And then rinse your mouth
With industrial waste.

-from Tom Lehrer's song “Pollution”

If Web 2.0 is the participatory web, that’s a great thing, right? Not when some participants spoil it for their own selfish benefit. Like people who send spam in email. Web 2.0 has comment spam, for instance. But overall I’m quite impressed with the lack of what I call pollution.

This is no accident. Those of you who’ve been using the Net for more than a few years will remember “newsgroups”. They’re pretty much gone now (my ISP recently stopped carrying them, making arrangements with a third party for customers who still cared). Newsgroups used to be very useful, until the masses got onto the Internet. (Lest anyone think that I’m being elitist, let me point out that I’ve done more than my share toward bringing the masses onto the Internet, something I’m very proud of.) Then the signal-to-noise ratio of newsgroups dropped to the point where the knowledgeable and helpful stuff got lost among all the stuff that wasn’t. Moderated newsgroups didn’t have this problem, but few newsgroups were moderated and the whole space became unpopular.

But the function of newsgroups was still valuable. Web discussion forums had already arisen as an alternative more suitable to unsophisticated users, and far better integrated with websites than newsgroups could ever be. But they tend to have the same problems: most forums are unmoderated, and contain a lot of posts that most readers don’t want to read.

Enter blogs. Blogs have built-in pollution control because only the author(s) can post. (Unmoderated comments may be allowed, but then most people don’t bother reading comments so this doesn’t matter.) No wonder they’re so popular.

But a blog is the voice of only one person (or occasionally a small group), not an open forum. Wikipedia is different because “anyone can edit”. Well, not all pages. Over time the Wikipedia management has found it necessary to tighten up various things. Besides, one must stick to the particular topic of the encyclopedia entry. Doc Searls has written that the Net may not have a real “commons”. And I know why it doesn’t: it’s called “the tragedy of the commons”.

The Web, however, does have ways to make pollution irrelevant: search, filtering, ranking, voting, moderation, reversion in Wikipedia, and other ways for the good stuff to be separated from the bad. Newsgroups did not really have any such mechanism, and went the way of the dodo. Fortunately for Web 2.0, the Web is a more flexible platform and anti-pollution devices can be added wherever needed.

Sunday, August 06, 2006

Are there lots of content producers or just a few?

In a post well worth reading, Jay Rosen recently wrote about “the people formerly known as the audience”.

Demir Barlas of Line56.com, among others, counters that most people are readers/viewers/consumers and few choose to become writers/uploaders/producers.

I have a third view. (UPDATE: Actually I don't. In his comment on this post, Jay Rosen points out that my view is no different from his.) I agree with Barlas that most consumers don’t become producers. But now they can if and when they want (which traditionally they couldn’t without expending a lot of time or money or both). Like when a company screws them around, or simply sells them an inadequate product, and they get upset enough to write something somewhere.

Most potential producers will rarely, if ever, make use of their new options — but there are a lot of these people. Now let’s do some math. The “expected value” of something is (well, I’m simplifying the definition) its value multiplied by its probability. For instance, the expected value of my winnings in a lottery that has a single prize of a million dollars is: $1 million times the (let’s assume) one-in-five-million chance of winning, times the number of tickets I have. Why don’t I have a massive number of tickets? Because they cost (say) $1 each. Here the expected value per ticket is 20 cents (not a good deal if I paid $1 for it). The probability that any particular ticket will win is quite low. But if I have multiple tickets, obviously my chances are better: their total expected value is 20 cents times the number of tickets I have. Of course my cost goes up too, to buy all those tickets.

Now let’s apply this to potential producers. As with lottery tickets, the probability that any particular person will produce anything is quite low. But the more of them there are, the greater the chance that someone will produce. And now that the cost of a “ticket” has dropped so much, there are lots of them. Suppose you run a company and there’s a 1 percent chance that an unhappy customer would publicly complain about you (even 1 percent would be a massive increase from how things used to be, given the past difficulty of airing a complaint in a way that the public would be exposed to it). If you have 1000 unhappy customers, you can “expect” 1000 times 1 percent = 10 public complaints. And Google will display all of them. Companies, beware the passive consumers who could become producers.
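The arithmetic in the last two paragraphs can be checked with a tiny Python sketch, using the same hypothetical numbers (a $1 million prize at one-in-five-million odds, and 1000 unhappy customers each with a 1% chance of complaining):

```python
# Expected value = value of an outcome, times its probability,
# times how many independent "tickets" you hold.

def expected_value(value, probability, count=1):
    return value * probability * count

# Lottery: $1,000,000 prize, one-in-five-million chance per ticket.
ev_per_ticket = expected_value(1_000_000, 1 / 5_000_000)
print(ev_per_ticket)  # about 0.2 dollars, i.e. 20 cents

# Complaints: 1000 unhappy customers, each 1% likely to complain publicly.
print(expected_value(1, 0.01, count=1000))  # about 10 public complaints
```

The point of the exercise: even when the per-person probability is tiny, multiplying it by a large enough population yields a number a company can't ignore.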

OK, that was only about complaints, which are a very small fraction of all the user-generated content. But the idea applies in general: even if only a small percentage of the potential producers actually produce anything, the more potential producers there are, the more actual results there will be.

I’ve been saying for a while (though apparently I haven’t blogged it before) that the key mechanism of Web 2.0 is reduced barriers to entry. Not eliminated, just reduced — which still has massive implications.

Saturday, August 05, 2006

One risk of participating in Web 2.0

Jon Newton (at right) performing at his fundraiser on August 5, 2006, at The Rivoli in Toronto
Tonight I went to a fundraiser for Jon Newton who’s being sued for libel because of a blog post on his site p2pnet.net. Despite his taking down the post, and offering to let the aggrieved party post a response without any editing, the lawsuit continues.

He performed a song he’d composed for the occasion, possibly entitled Freedom of Speech and containing the great line “you might as well not think”.

In a short speech about what the event was all about, Newton made the point that “it could happen to you”. Anyone who has a blog is at risk. Even if you’re ultra-careful about what you write, if you allow comments you are open. One of the things the lawsuit demands is the identity of someone who left a comment on Newton’s blog — even though the comment was submitted anonymously and Newton has no idea who it was (not that he’d reveal the identity if he did).

There was a petition to the Canadian government, but I didn’t sign it because I wasn’t sure about everything it said, especially the request that “defamatory libel” be removed as an offence from Canadian law. How should I know? I’m just a blogger. But that’s Newton’s point: we, the members of the participatory Web, Web 2.0, run risks we don’t even understand.

Tuesday, August 01, 2006

Why Wikipedia’s errors aren’t a problem

In a post on his twopointouch blog, Ian Delaney examines the Wikipedia phenomenon in depth. It’s a very comprehensive and well-written article (I suspect that it’s intended for the book he’s writing about Web 2.0) and I’d recommend it to anyone looking for a primer on Wikipedia — which you probably don’t need if you're reading this blog.

While reading the section of the post which covers the question of Wikipedia’s accuracy, something crystallized in my mind which had been lying dormant for a long time, and I wrote it as a comment on the post. You can read it at this link, but I thought I’d reproduce the relevant portion here:

I believe that the concerns about Wikipedia’s accuracy are excessive, for a reason I don’t think I’ve seen mentioned anywhere (not that I’ve searched). I trust well-written articles and distrust poorly written ones, and I believe this to be a very common habit. It’s a good strategy because writers who are careful about things like clear phrasing tend to be the same people who are careful about accuracy. And it’s a very easy strategy to use, because it’s obvious how well written a particular article is (or portion of an article): I don’t have to think. So as I read an entry I’m automatically rating it for accuracy — despite having no direct evidence. Consequently the many bad articles in Wikipedia don’t mislead me.

Note this implication: that many Wikipedia articles are badly written is a *good* thing. If the people who couldn’t get their facts correct were able to write well, I might end up believing a lot of nonsense. People *do* tend to believe what they read — but fortunately many or most of us have a built-in alarm bell for bad writing, which serves as an excellent indicator of poor accuracy.


Ian responded in agreement, and pointed out two additional things:

It also identifies articles that are contentious, since they will be the ones that have had hundreds of substantial edits, almost certainly leading to some stylistic inconsistency and ‘stammering’.

I guess the ‘difficulty’ would be people who can’t distinguish good writing from bad. Maybe, they’d be a lot safer paying the [Encyclopaedia Britannica] for a subscription.