Brad Fitzpatrick (bradfitz) wrote in lj_dev,

Robots and Netscape 4.?? sucks

A lot of users running Netscape 4.x prior to 4.7 have been reporting a "Communication Exception (-336)" when viewing any friends page.

This started happening after I began sending the "Robots: NOINDEX" header on those pages. I did that because friends pages are pointless to index: they scroll by so fast that any index of them is stale almost immediately, and a lot of people don't want to be indexed anyway.

I also send the Robots: NOINDEX header on journals whose owners have selected the "Don't let robots index my site" option in editinfo.bml.

I used to generate a robots.txt file, but it grew to over 200k, and people mailed me saying that search engines stop indexing a site once its robots.txt file gets over 100k ... I didn't realize so many paranoid people would be checking that option.
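To see why the file grew without bound: a generated robots.txt needs one Disallow line per opted-out journal, so it scales linearly with users. The paths below are illustrative, not necessarily LiveJournal's actual URL scheme:

```
User-agent: *
Disallow: /users/alice/
Disallow: /users/bob/
Disallow: /users/carol/
# ...one line for every user who checked the option;
# a few thousand opt-outs easily pushes the file past 100k
```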

In any case, robots.txt isn't scalable, so I send the Robots HTTP header now instead.
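The per-response decision Brad describes is simple. Here's a minimal sketch in Python (LiveJournal itself was Perl, and the function and parameter names here are made up for illustration):

```python
# Sketch of the header logic described above: send the
# non-standard "Robots: NOINDEX" header on friends pages and on
# journals whose owners opted out of indexing; send nothing extra
# otherwise. This scales with zero per-user state, unlike a
# generated robots.txt.

def robots_header(is_friends_page, owner_opted_out):
    """Return a (name, value) header pair to add, or None."""
    if is_friends_page or owner_opted_out:
        return ("Robots", "NOINDEX")
    return None
```

Today the same effect is usually achieved with the standard X-Robots-Tag header or a `<meta name="robots" content="noindex">` tag, but neither existed at the time.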

BUT --- it's breaking Netscape 4.something.

Your mission, should you choose to accept it: go fire up old versions of Netscape, get the error to recur, and mail me the user-agent string of the browser.

To get the user agent, go to:

Then, we'll write a regular expression to match buggy Netscape clients, and not send that header.
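A first guess at that regular expression might look like the following Python sketch. This is an assumption, not the expression actually deployed; the user-agent reports from readers would be needed to pin down exactly which versions are buggy:

```python
import re

# Classic Netscape user-agents look like:
#   Mozilla/4.61 [en] (Win98; I)
# Match a "4." minor version starting with 0-6 (i.e. 4.0x through
# 4.6x, everything before 4.7), but skip spoofed agents such as
# "Mozilla/4.0 (compatible; MSIE 5.0; ...)", which are not Netscape.
BUGGY_NETSCAPE = re.compile(r"^Mozilla/4\.[0-6](?!.*compatible)")

def skip_robots_header(user_agent):
    """True if this client is assumed to choke on the Robots header."""
    return bool(BUGGY_NETSCAPE.match(user_agent))
```

The server would then consult this check before adding the header, sending Robots: NOINDEX only to clients that can handle it.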

