End of Free at NYT
April 4, 2003
The New York Times site was perhaps my favorite site on the web. It is comprehensive, nicely designed and, until this week, free of linkrot. Linkrot is what happens when pages fail to remain at the same URL. I thought the NYT had a decent policy balancing stable linkage with the desire to profit from the archives. Although looking for an article more than a month old pointed towards the paid archive, existing links to NYT articles continued to work for years -- until this week. Now, all articles older than 30 days can only be retrieved through the pay archive. (Actually, articles prior to February 2001 are still available.) Blech.
Linkrot keeps the web from being as useful as it could be. I keep the sideblog (linky linky) to keep track of the most interesting things I've read. How useful will it be when half of the links are dead? This is why, at work, we had to print out everything we ever wanted to refer to again, wasting reams of paper.
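For the curious, the sort of rot described above can be measured. Here is a minimal sketch of a link checker that reports which saved links still resolve; the function name and the placeholder URL list are my own, not anything from the sideblog:

```python
# Hypothetical sketch: probe saved links with a HEAD request and report
# which ones have rotted. The URL list below is a placeholder.
import urllib.request
import urllib.error

def check_link(url, timeout=10):
    """Return (url, status): an HTTP status code, or an error string if the
    request could not complete (DNS failure, timeout, etc.)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return url, resp.status
    except urllib.error.HTTPError as e:
        return url, e.code          # e.g. 404 for a page that has moved
    except (urllib.error.URLError, OSError) as e:
        return url, str(e)          # e.g. host no longer exists

links = []  # fill with the links you want to keep track of
for url, status in map(check_link, links):
    marker = "OK " if status == 200 else "ROT"
    print(marker, status, url)
```

A 404 or a dead hostname both count as rot here; a paywalled archive page is trickier, since it often returns 200 while serving something other than the original article.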
More at Tech Law Advisor, FurdLog, DaveNet, Glenn Fleishman and bIPlog's Mary Hodder
Posted by Andrew Raff at April 4, 2003 01:20 PM
Trackback URL for this entry: http://www.andrewraff.com/mt/mt-tb.cgi/380
As a follow-up to The End of Free at NYT, some NYT links may still work.
Andrew Raff: Shameless Self Promotion
April 24, 2003 06:16 PM
Well, nothing lasts forever - but as I see it, the realization that links will self-destruct in thirty days is really a positive development, one which might lead bloggers to include a bit more information about the articles to which they link.
What's always annoyed me about traditional blogs or blawgs is the one or two cryptic sentences and a link to an article (without even giving any identifying information). I don't have the time to spend my day flipping all over the Internet. I make it a point at my site, http://www.myshingle.com, which is designed for solos and small law firm practitioners (who don't have the same amount of time on their hands as large firm lawyers who grind out memos and briefs), to always include a large enough chunk of text to give a pretty good idea of what the article says. People then have the option to follow the link for more information or not. Moreover, I always give the name of the article, the author and the date so that it can be tracked down later on.
In addition, frankly, I never expected all the URLs that I include to stay live forever, and I want to write my site so it stays good for posterity, not just the next thirty days. The potential for sites to disappear quickly is just like the year 2000 problem (where programmers used two digits for dates, either thinking that the millennium would toll the end of civilization or, more optimistically, that systems would be replaced sometime down the line) - people just don't think ahead.
I've found that the links to the NYT on my site over one week old, rather than one month old, now lead to the Times' $2.99 "pay per view" requirement. Giving a "nice chunk" of the story (with credits and links, of course) is helpful, but that raises the question of how big a chunk you can provide while still staying within the confines of the "fair use" doctrine.
Instead of printing reams of paper, get the full version of Adobe Acrobat and print to a PDF file. Hard disks are large and cheap, and files can be burned to a CD-ROM inexpensively. One CD-ROM will hold all the text you will ever read.
Since Martin mentioned printing to PDF, I'll put in a plug for one of my favorite little features in Mac OS X: "print to PDF" is built in and available from the print dialog of every program, without needing to buy extra software.
Using PDF to archive web reading is a good idea, but I could see filing and indexing articles that way becoming very time consuming. Another option would be creating a local, private blog with the full text of articles.
The Times policy shows that the WWW is not really an open digital library. (The NYT is still better than many other sites, those that fail to provide any sort of archives.)