January 7, 2005
Guest Writer Simon Waldman: The Importance of Being Permanent

The Guardian's Web guy: "Without permanence you slip off the search engines. Without permanence, bold ideas like 'news as conversation' fall away, because you're shutting down the conversation before it has barely started. Without permanence, you might be on the web, but you're certainly not part of it."

I met Simon Waldman (who blogs here) when he came through New York a few months ago on a kind of study tour of the States. We had a chance to talk journalism and its Web philosophy, and that talk led to the piece below. He’s management now, the Director of Digital Publishing for The Guardian Newspapers. I add my own commentary at the end.

The Importance of Being Permanent

By Simon Waldman

Ask anyone in a traditional news organization to name the most important characteristics of the Net, and you can pretty much guarantee that the answer will include the words “immediacy” or “interactivity.” They are the two big shiny baubles that grab everyone’s attention. Most of the ideas, debates and screaming arguments that happen within media organizations are about how to manage these two—immediacy, interactivity—alongside a traditional business.

But there is another feature of the Net that gets a fraction of the attention of these two. It is less glamorous, less obvious, but in the long term it is going to be at least as important to understand, if not more so. Permanence.

Permanence means understanding that when you put something on the Web it should be there for ever: ideally in the same place, in perpetuity. It means that if I link to it now, someone else can follow that link in two days, two weeks or two years’ time. (I’m not going to lay out the business models in this piece, but I’m not excluding the possibility of pay-to-view; it’s the position that counts, not the price.)
This is an alien concept to many people in the news industry, which creates work designed to appear in a particular place at a particular time. But permanence is critical to understanding the real challenges and potential of online publishing.

The Web is the first medium where the publisher can put something in a place where it can be found forever. Other media can provide immediacy and, to varying degrees, some level of interactivity. But no other medium can be permanent in the same way as the Web (other media depend entirely on the consumer to archive: something completely different).

The problem is that we are often so wrapped up in the immediacy of what we’re doing online that the issue of permanence is easily (and, in the heat of the moment, understandably) forgotten or ignored. Many news organizations, for example, are currently doing great journalism in covering the tsunami and its fallout. But who is creating work that in 12 months’ time will a.) be easily found and b.) make sense of the story as a whole?

Enter new competition from Wikipedia. Much of the debate about Wikipedia has been about the merits or de-merits of how it is created. And yes, there is plenty to talk about there. (See this and that.) But perhaps its most remarkable achievement has been to create a live—and permanent—record of major events that manages to keep up with the news without becoming a slave to it. Its entry on the 2004 Indian Ocean earthquake, for example, fits both of the above criteria in a way that few news organizations’ work ever will. It has better content—and, being permanent, it pops up near the top of a Google search for “tsunami.” (I’m under strict orders from the editor not to flagrantly plug the Guardian here, but I think we’ll do pretty well also.) See its entries on Hurricane Frances or the Ukrainian presidential election for similar treatments of recent major events. These have a longevity way beyond almost anything produced by traditional media.
Why does this matter? Why should you want today’s news to be read in 12 months’ time when everyone will be focused on the next disaster, explosion or election?

It’s important for a number of reasons, but they all move in the same direction: permanence is about ensuring you have a real presence on the Net. It is a critical part of having a distinctive identity in an increasingly homogeneous landscape. It is about becoming an authority and a point of reference for debate. It is about everything we want and need to be.

Without permanence you slip off the search engines. Without permanence, bold ideas like “news as conversation” fall away, because you’re shutting down the conversation before it has barely started. Without permanence, you might be on the web, but you’re certainly not part of it.

Here’s another example. Think of all the millions of words written by news organizations around the world about Abu Ghraib during 2004. Now go to Google and search (as suggested in the Wired article above) for Abu Ghraib, and you will find only a handful of traditional media outlets mentioned in the first few pages (fortunately, the Guardian is one). This isn’t just a quirk in Google’s search algorithm; this is about traditional media ceding responsibility for providing the definitive, permanent record of major events. All that reporting effort, all that insight and expertise, all those contacts: now completely invisible to the millions who use Google as their first and final research tool.

Chris Anderson’s influential essay, The Long Tail (now a blog and forthcoming book), explains how, in a world without the traditional physical restrictions of high-street shelf space, online retailers can offer vast and diverse repertoires, brought to life through recommendation systems and other links. They still need their best sellers, but the future success of their businesses really depends on the long tail. The same is true for news sites.
Freed from the physical restrictions of print, now is the time to see our Web operations not simply as the place for today’s news, but as the repository for everything we have ever done. Yes, we need big breaking news, but it’s the long tail of your content, accumulated over time, that makes you distinctive and lets you stand out. And while retailers have to build elaborate recommendation engines, the benefit of having newspaper content is that others will often do the linking and recommending for you. The easiest way for you to join the conversation is to allow yourself to become the focus of it.

Immediacy and breaking news are critical, but too often the results lead to homogeneity. Look at any story on Google News (something that, tragically, many of our readers do several times a day) and follow the links to news organizations around the world. The chances are you will see the same wire stories repeated hundreds of times over, with at best minor adjustments. Over time, however, it is our truly distinctive and original content that comes to the fore. It is this that is linked to and discussed and then appears on search engines. And it is this that gives us a real identity.

Anderson’s essay is, in itself, an example. It first appeared nearly four months ago, but it is still resonating around the web. As I write this, it has some 545 links according to Technorati, 15 in the last 24 hours alone. I used the epithet “influential” to describe it, and that influence is partly down to the quality of the idea and the place it originally appeared. But the permanence of the Web is also at work. Every day, more people are finding it. Every day, its value is increasing as a result.

Similarly, take that most ephemeral of events: a speech. In this case, Tom Curley of the AP speaking to the Online News Association in November. The speech was put on a site, given permanence, and is gaining intellectual capital by the day as more and more people link to it and debate it.
Web permanence is, I’d argue, one of the main things that journalists can learn from the more successful bloggers. The whole concept of the permalink allows blog posts to become part of the Web in a way that very few traditional media stories do. This is why they get linked to and why they often come to the top of search results. My personal experience: 18 months after I posted something on a (now deceased) personal blog about the decision between an iRiver and an iPod, comments are still being added, because the post has made its way near the top of a relevant Google search.

What makes great news organizations great is not simply the work they do on a given day, but the accumulated quality of work done over weeks, months and years. For the first time, it can be available in one place: permanently. To neglect this is to go into battle with one arm tied behind your back.

I’m not pretending it’s easy. I’m not pretending that we at the Guardian have all the answers (or anyone else, for that matter). The point is that making the “long tail” of our content accessible and available in the most engaging, useful and—let’s not forget—profitable way is a major challenge for the smartest editorial, technical and commercial minds. They ignore it at their peril.

In my opinion, Simon Waldman is one of the more important people in Web journalism today because The Guardian’s online edition is such a leading site. It not only looks better than just about any daily newspaper site; it is, among major newspapers in the Anglo-American field, the most Web-friendly. One could list reasons. But the biggest reason is also the simplest: The Guardian is striving not to be “on” the Web, but of it. (Yet it’s also a business.) That “of the Web” part has to come from Waldman, who’s just a guy—30s, reasonably hip—trying to feel his way to something journalistically sound but differently constructed on the new platform.
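Waldman's point about the permalink is, at bottom, a point about URL design: a post's address is minted once, from its title and date, and never changes afterward, no matter where the content later lives. Here is a minimal sketch of that convention in Python; the slug rule and the /year/month/day/slug pattern are illustrative assumptions, not the scheme of any particular blogging system.

```python
import re
from datetime import date

def slugify(title: str) -> str:
    """Lowercase the title, keep only letters and digits, join words with hyphens."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

def permalink(title: str, published: date) -> str:
    """Build a stable /YYYY/MM/DD/slug path from the title and publication date.

    Once minted, this path should never change -- even when the story moves
    from the front page into the archive.
    """
    return f"/{published:%Y/%m/%d}/{slugify(title)}"

print(permalink("The Importance of Being Permanent", date(2005, 1, 7)))
# -> /2005/01/07/the-importance-of-being-permanent
```

The design choice that matters is that nothing in the address depends on the site's current section layout, so a link made today still resolves after any redesign.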
The solution to how you do serious news and commentary on the Read/Write Web (a term I am starting to appreciate more and more) is not known to anyone. What we know is: some are actively trying to figure it out, and to meet high editorial standards. The Guardian is one. It has gotten into blogging in a careful but effective way. (I have been telling my readers about another one, a local newspaper in North Carolina that is going open source.)

From the moment I started my weblog I noticed The Guardian standing out from the pack of news peddlers online. It had better links. If you linked to the Los Angeles Times, your link would be dead in a week or two, as the content moved, in some grindingly mechanical fashion, off the “free” site and into a closed and gated archive, with tolls, thereby removing the journalism from circulation on the Read/Write Web. That means removing it from Google and from active cultural memory, and interrupting the very patterns by which value is added to a piece of journalism—post-publication, online, because of how the web works.

I once tried to capture it this way: In journalism the regular way, we imagine the public record accumulating with each day’s news—becoming longer. In journalism the weblog way, we imagine the public record “tightening,” its web becoming stronger, as links promote linking, which produces more links.

It is the current policy of most American newspapers to be anti-Web in the key matters of linking out and permitting deep-linked content through stable and reliable URLs. This policy is, in my view, wrong-headed. It was done to get revenues from the archive. There was a business reason. No one was trying to be anti-Web. They just ended up that way by trying to collect revenues from a “closed” archive. But being closed cannot be the way forward for journalists, and so they have to involve themselves in the business of linking.
I think it shameful, not to mention a crisis in authority, that the solid journalistic achievement of most people in daily newspapers across the land is being lost to Google, lost to bloggers, lost to online forums and conversation, lost to the long tail where value is built up. Does your average hard-working environment reporter even know that her comprehensive portrait of an ecological disaster in the making won’t ever make it into search engines, so that people can see where it all began, so that high school kids researching an assignment would find it and get the whole story? Do the newsroom troops understand this “lost to the future” quality about their best work? Who was supposed to tell them? (See this.)

Being “of” the Web, the way I mean it, has nothing whatsoever to do with how many blogs you start. It’s about more basic things: the nature of authority online, why search is important, what “openness” truly means, what creates new value online. Blogs are important because they teach you about Web fundamentals. If you do them you will find it easier to be “of” the Web.

After Matter: Notes, reactions and links…

On the parallel universe of the press: The Guardian asks the people who run Britain what they think of journalists. (Jan. 10)

Increasingly, as you talk to politicians and officials and executives about how their trades and the journalism about them meet each other, they talk of parallel universes. That is, there is what they do and how they do it: and then there is the journalism about what they do and how they do it: and they exist quite separately from each other. The journalism partakes of the field of action apparently described: and some of the characters exist and some of the events are recognisable.
But as a description of what really happens, it’s deeply inadequate - or so say the politicians and the corporate executives and the trade union leaders and the bishops and the heads of NGOs and the generals and the top civil servants and the council leaders and the ministers and even the academics.

The observations of “the people who run Britain” are non-hysterical, deeply critical and apt. See this part too.

In the comments thread, Jakob Nielsen writes: “I write an online column on Web usability (the Alertbox), and my traffic statistics show that the articles typically gather about 80% of their lifetime readership after they have passed into the archives, confirming the value of permanent content and of avoiding linkrot.”

Threadwatch, an Internet marketing blog, reads Waldman’s piece and says to its constituency (the people who claim to improve your Google rankings): Watch your backs - the Press are starting to Get It. “In so many areas of the web we as web devs have really had rather an easy ride with only other web devs to really compete with. Now, just because Simon is starting to get it doesn’t mean that our nice little cash cow is going to run dry tomorrow, but it sure will be tougher as time passes by.”

Robert Andrews on permanence and news: “Say I’m planning a trip this weekend and will be taking a route involving a road I’ve never travelled before. A newspaper in that area might have run a story every one of the last four years about how that road is prone to rockfalls, one of which killed a driver there last year. Each of these stories will exist in the searchable archive, that’s a given, but it’s only their linking together (and perhaps their linking together with similar stories elsewhere) that would provide value to me as a driver (i.e. not the odd isolated incident, but a trend).”

Dave Pell, who was on my Recommended list before this, had two weblogs and only wanted one.
So he found a subject midway between, merged the look of one, electablog, into the (already similar) look of another, davenetics, and came up with a third, the blog blog…. Also called: “The Beta Blog.” (I think he means meta.) “A blog about blogs, blogging and bloggers. Welcome to the front row of the personal publishing revolution.” I hope you type fast, Pell. There’s a lot going on these days.

According to the New York Review of Magazines: “The Guardian is Britain’s third-largest high-end broadsheet. It sets the standard for coverage not only of politics but also the arts, literature and history. It is also the quintessential independent paper. Operating under the philanthropic Scott Trust set up in 1936 by the Manchester family, the Guardian is known for its dedication to high journalistic standards. That hasn’t stopped the business from growing. The Guardian Media Group now owns a national radio station; daily, weekly and Sunday papers, including the prestigious weekly Observer; a Web site; and a group of classified magazines.”

Forty percent of The Guardian’s online readers are in the US (January 2004 study).

Online Journalism Review, Sep. 2003: The Guardian of the Web. A feature on the newspaper’s Web strategy.
My italics.

Simon Waldman at his weblog: “Gizmodo gets handed an interview with Bill Gates. Good for them, I say - and a smart move by Microsoft. Now here’s a big challenge to traditional media: yes, anyone can run a blog and call themselves a reporter, but ‘access’ is operated almost on a cartel basis. The ability to bag the big names is one of the things that keeps big media big. If that moves, everything moves.”

Evan Cornog, publisher of Columbia Journalism Review, asks: is it possible to do great journalism if the public does not care? “So much of the thinking about this in the world of journalism… is done from the perspective of the flaws of journalism as currently practiced. And so it should be, because such flaws abound, from the cutbacks in foreign bureaus to the commercialization of news to the high-profile crimes of a few journalistic fabricators. But perhaps the problem, and therefore the solution, has broader and deeper roots. Perhaps we should, to an extent, blame the readers. Perhaps the old notions of an engaged and virtuous citizenry, upon which the founding fathers’ hopes for the republic were based, are archaic concepts.”

Mitch Ratcliffe: “The point of innovation in media is to expand, not simply to displace, the voices that existed before (politics is what replaces voices). I’m feeling more Buddhist all the time about this whole journalism v. blogging debate. The middle way in the metalogue that is emerging—the miraculous opening up of ‘the media’ that’s going on—is plenty wide for all sorts of writing: the objective, the disclosed and the personal.”

This is a very thorough portrait of the New York Times Company and its prospects from BusinessWeek. It includes this quote: “The Roman Empire that was mass media is breaking up, and we are entering an almost-feudal period where there will be many more centers of power and influence,” says Orville Schell, dean of the University of California at Berkeley’s journalism school.
“It’s a kind of disaggregation of the molecular structure of the media.”

Posted by Jay Rosen at January 7, 2005 1:08 PM
Institutional memory! As an aside, the first editor at Slate's bulletin board, Moira Redmond, departed from that job to take on the wholly enviable task of exploring and mapping The Guardian's web site. I wonder how that's coming along. Outbound links and accessible archives have the potential, if used in conjunction with accessible and web-smart reporters, to make the news business just a whole lot better from the customer's perspective and more profitable from the purveyor's.

One of the more pathetic examples of a story fading into the mist is documented in the American Journalism Review by Sherry Ricchiardi's post-mortem of Abu Ghraib coverage. She cites an instance where the Washington bureau chief of the LA Times had no idea, even after the story blew up, that his paper was among the first to cover the Pentagon's announcement, months before CBS broke the story open with its photos, of an investigation into prisoner abuse.

The web is replete with examples of its potential to act as a collective memory. George Washington University's National Security Archive, the Memory Hole, the Smoking Gun and even, perhaps inadvertently, the US State Department serve as long-term memories. And the keys to it all are permanence and outbound links: memory and the ability to access it. Right on, Simon.

Posted by: weldon berger at January 7, 2005 3:02 PM | Permalink

Permanence is only good when that permanence is not mired in the Google Sandbox.

Posted by: Sage at January 7, 2005 10:19 PM | Permalink

The Google sandbox? I assume that means there are other measures of permanence than showing up in Google. Yes?

Posted by: Jay Rosen at January 8, 2005 10:21 AM | Permalink

Memory was actually the principle behind Tim Berners-Lee's development of the web.

Posted by: Matt Stoller at January 8, 2005 2:21 PM | Permalink

On Permanence

On this issue it is easy to totally agree with Simon Waldman. The contradictions and differences of opinion on these issues are fascinating.
For example, for the past few months, I've been receiving all kinds of advice for my Web site from an editor in New York who is so afraid of revealing his identity to me that he or she takes the extraordinary step of sending me messages from an encrypted, anonymous e-mail address. Some of the advice has been useful, so I've taken it into account. Other advice is harder to take. One example is a suggestion that I delete my archives older than three months. Another is that I use a short profile rather than a curriculum vitae-style resume so that I can be more of an "enigma."

Because there was no way to reply to this e-mail address, there was no way to explain to this well-meaning person why it is important to keep the archives up. One of the messages gave me a clear indication that this person reads PressThink. So maybe this will sink in.

The archives not only help position a site in the search engines and provide a research resource for students and interested others. They allow me to link back to previous discussions of repeating themes without having to spend a lot of space and time explaining things again and again. I write regularly on the importance of protecting the separation of church and state, for instance. I try to keep most of my blog posts under 1,000 words, the length of an editorial column in a daily newspaper. But back in the early days of this blog, in fact the second week, I put up a much longer post with a lot of the key quotes from history on this subject. That material is not only available for regular readers and researchers searching the Web. It is available to me, to refresh my memory and to link to in future posts.

What might be of concern here is a story making the rounds again about the New York Times mulling charging Web readers. For those of us fortunate enough to have access to a database of articles such as Lexis/Nexis, this is not as much of a concern as it is for the average online reader with a dialup connection.
Your points about dead links, also referred to as "link rot," are well taken concerning the major U.S. newspapers. The Times' Web folks got it early on about keeping the site free, although I'm not convinced the paid archives are such a great idea. I would be interested in knowing how much revenue that has actually generated. It seems to me the way to make money online is to generate as much traffic as possible by making the site as useful as possible for readers and researchers.

If the New York Times management wants to be the Web site of record for the next 100 years the way it has been the newspaper of record for the past 100 years, my advice echoes Waldman's. Keep the site free, make the archives free, make better use of links to past articles and outside research, and charge the heck out of advertisers who want to support the site and be seen on it. Otherwise, Orville Schell, dean of the University of California at Berkeley's journalism school, may be right in what he told BusinessWeek recently. "The Roman Empire that was mass media is breaking up, and we are entering an almost-feudal period where there will be many more centers of power and influence," he said. We will see.

Posted by: Glynn Wilson at January 8, 2005 2:35 PM | Permalink

Great subject. As a non-journalist but a consumer of content, I find the conversion of archives into pay content to be very annoying. I don't have any idea what the business models are, but unless charging for that archive is financially important, it shouldn't be done. And, of course, permalinks on news articles are a no-brainer. Google is not the last word in searching. The web will continue to evolve new ways to search, index and organize information, just as it evolves new concepts such as blogs.

Posted by: John Moore at January 9, 2005 11:35 PM | Permalink

Just a brief follow-up here - in advocating permanence, I'm not necessarily saying that everything has to be free as well.
Paid vs. free archives is a business debate that can play out differently at any company. But the principle of permanence should remain whether something is free or paid for. It is slightly galling to follow a link only to be told you have to pay a small sum to see the piece, but at least you can still find it quickly and easily. It is infuriating to find that the piece has moved and has effectively disappeared unless you now go through the archive search.

Posted by: Simon Waldman at January 10, 2005 3:09 AM | Permalink

Hi Jay and Simon,
There is simply too much news content online for news sites to shut off access without losing authority (whether defined as inbound links, or other expressions such as discussion by bloggers, keyword cites, subscriptions to RSS feeds or page views). Last spring, when I wrote that piece associating and comparing DRM and linking, the Wall Street Journal had 354 links in Technorati, compared with the NYTimes at 39,412 and the Washington Post at 21,319. So who has more authority online? The paper with premier content in its niche behind a firewall accessible to 600,000 online subscribers? Or the paper of record? Having a searchable, linkable place makes the NYTimes real for people online who have no idea the WSJ even exists as a viable source of information. At some point, legacy news sites will either get it or disappear.

Posted by: mary hodder at January 10, 2005 12:42 PM | Permalink

The university I attend is about to digitize its entire holdings. I mean, four buildings' worth of books, catalogues, pictures, you name it. What happens then is that every academic move I make can be quickly linked to history. I can build better arguments and frame better solutions because I can expand the intellectual base for everything I do. Right now, doing that from books is tedious and difficult. But it won't be. That is incredible. And what's even more amazing is that it will be easier to look at copies of century-old books than it will be to look at newspaper articles from a year ago. Because for some asinine reason, newspapers have decided the information they provide should not be accessible to scholars. Just this weekend I went looking for articles on Nexis. The Newark Star-Ledger wasn't there. Nor was the Nashville Tennessean. Or the Philadelphia Inquirer (which has some ridiculous registration thing for its site). Now, Lexis is not some strange provider; it's pretty mainstream. Why are newspapers so weird about their content?
Why do they make it impossible to search and use? If these organizations have, as they claim, some commitment to the public good, then let's get those archives out so that scholars and others can use this store of knowledge. I may have strayed from the topic, but it felt good to rant a bit.

I write an online column on Web usability (the Alertbox), and my traffic statistics show that the articles typically gather about 80% of their lifetime readership after they have passed into the archives, confirming the value of permanent content and of avoiding linkrot. In fact, this very link to the "linkrot" column is an example of the value of permanent content, since the article was written in 1998. Seven-year-old content? Still relevant. (And, by the way, the top hit on Google for the query term linkrot.)

Posted by: Jakob Nielsen at January 10, 2005 5:33 PM | Permalink

I've been talking about the importance of permanent links for many years. I hope more media sites will now listen. Also, another important thing is that the links be permanent immediately. The NY Observer has permanent URLs, but the articles at those URLs change each week. So this piece on CNN getting more ad revenue than Fox will soon be something different. They used to have permanent URLs in their archive, which were a pain, but now they've gone to a pay-per-article newsbank service. So now a bunch of blog posts on an article about Al Gore's cable channel link to a generic page (the article is available for free on LookSmart, which is another issue; the traffic goes to where the article is free). Also, now when you click on a NY Observer article you get a survey which is eager to learn your gender, age and income but provides no area to give general feedback.

Posted by: Steve Rhodes at January 12, 2005 7:42 AM | Permalink

Great topic. Is it possible that news organizations are making decent money off their archives now?
It can't possibly last, for all the reasons discussed above, but perhaps the thinking is that once the money dries up they can make the content free and permanent and restore their visibility. Or does failing to jump on the bandwagon now doom these papers to obscurity in the long run? Also, I think Google rankings may not be the be-all and end-all of web presence forever, but if it isn't Google, it'll be something similar. That is, people will always want to start their research in a single place, and linking will be important for determining relevance.

Posted by: Joe at January 12, 2005 2:40 PM | Permalink

Thanks for writing this. Years ago, I came across a two- or three-year-old bit of chatter I'd entered in a forum and forgotten about. I'd been searching a topic that interested me in Google. Since then, the permanence of the web has seemed one of its most significant characteristics. In addition to getting the old media to see what this means, we need to help writing teachers

Posted by: Michael L. Umphrey at January 13, 2005 2:12 AM | Permalink