Brad Fitzpatrick (bradfitz) wrote in lj_dev,

Robots and Netscape 4.?? sucks

A lot of users running Netscape 4.x prior to Netscape 4.7 have been reporting a "Communication Exception (-336)" when viewing any friends pages.

This started happening after I began sending the "Robots: NOINDEX" header on those pages. I did this because friends pages are pointless to index... they scroll by so fast, and a lot of people don't want to be indexed anyway.

I also send the Robots: NOINDEX header on people's journals that have selected the "Don't let robots index my site" option in editinfo.bml.
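The two cases above boil down to a simple conditional when building the response headers. Here's a minimal sketch (LiveJournal itself is Perl; this is illustrative Python, and the `opt_noindex` flag is a hypothetical stand-in for the editinfo.bml option):

```python
def page_headers(is_friends_page, opt_noindex):
    """Build HTTP response headers for a journal page.

    is_friends_page -- True when rendering a friends page
    opt_noindex     -- True when the user checked "Don't let
                       robots index my site" (hypothetical flag)
    """
    headers = [("Content-Type", "text/html")]
    # Friends pages scroll too fast to be worth indexing, and some
    # users opt out of indexing entirely, so tag both NOINDEX.
    if is_friends_page or opt_noindex:
        headers.append(("Robots", "NOINDEX"))
    return headers
```

Note that "Robots:" was never a standardized HTTP header, which is presumably why some clients react badly to it.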

I used to generate a robots.txt file, but it grew to over 200k, and the google.com people mailed me saying they stop indexing sites whose robots.txt file exceeds 100k ... I didn't realize so many paranoid people would be checking that option.

In any case, robots.txt isn't scalable, so I send the Robots HTTP header now instead.

BUT --- it's breaking Netscape 4.something.

Your mission, should you choose to accept it: fire up old versions of Netscape, reproduce the error, and mail me the user-agent string of the browser.

To get the user agent, go to:
http://www.livejournal.com/cgi-bin/dump.cgi

Then, we'll write a regular expression to match buggy Netscape clients, and not send that header.
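Once the reports are in, the check could look something like this. This is a sketch under two assumptions: that the buggy builds are exactly Netscape 4.0 through 4.6x (the "prior to 4.7" cutoff from the reports above), and that clients containing "compatible" in the user-agent (MSIE and friends, which also claim `Mozilla/4.0`) are not affected:

```python
import re

# Genuine Netscape 4.0-4.6x identifies itself as "Mozilla/4.x ...".
# The minor-version digit class [0-6] encodes the assumed 4.7 cutoff.
BUGGY_NETSCAPE = re.compile(r"^Mozilla/4\.[0-6]")

def suppress_robots_header(user_agent):
    """True if the client is a Netscape build we suspect chokes
    on the Robots header, so we should skip sending it."""
    if "compatible" in user_agent.lower():
        return False  # MSIE et al. masquerade as Mozilla/4.0
    return bool(BUGGY_NETSCAPE.match(user_agent))
```

For example, `Mozilla/4.61 [en] (Win98; I)` would match, while `Mozilla/4.7 [en] (X11; U; Linux 2.2.12)` and `Mozilla/4.0 (compatible; MSIE 5.0; Windows 98)` would not.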