This happened after I started sending the "Robots: NOINDEX" header on those pages. I did this because they're pointless to index... they scroll by so fast, and a lot of people don't want to be indexed anyway.
I also send the Robots: NOINDEX header on the journals of people who have selected the "Don't let robots index my site" option in editinfo.bml.
I used to generate a robots.txt file, but it grew to be over 200k, and the google.com people mailed me saying they stop indexing sites when the robots.txt file gets over 100k ... I didn't realize so many paranoid people would be checking that option.
In any case, robots.txt isn't scalable, so I send the Robots HTTP header now instead.
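The idea, roughly: instead of maintaining one giant robots.txt listing every opted-out journal, each response decides for itself whether to carry the noindex header. A minimal sketch in Python (not the actual LiveJournal/BML code; the `wants_noindex` check and page-type names are made up for illustration):

```python
# Sketch: per-response noindex decision instead of a giant robots.txt.
# "recent_posts" and the opt-out flag stand in for LiveJournal's real
# page types and the editinfo.bml "don't index me" option.

def wants_noindex(page_type, user_opted_out):
    """Fast-scrolling pages and opted-out journals get the header."""
    return page_type == "recent_posts" or user_opted_out

def response_headers(page_type, user_opted_out):
    headers = [("Content-Type", "text/html")]
    if wants_noindex(page_type, user_opted_out):
        # One extra header per response scales; a 200k robots.txt doesn't.
        headers.append(("Robots", "NOINDEX"))
    return headers
```

The win is that the cost is constant per request, no matter how many users opt out.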
BUT --- it's breaking Netscape 4.something.
Your mission, should you choose to accept it: Go fire up old versions of Netscape, get the error to recur, and mail me the user-agent string of the browser.
To get the user agent, go to:
Then, we'll write a regular expression to match buggy Netscape clients, and not send that header.
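Something like this, maybe. The pattern below is a guess until the user-agent strings come in: genuine Netscape 4.x identifies itself as "Mozilla/4.x", while IE (and most robots) spoof that prefix but add a "(compatible; ...)" token, so we can exclude those:

```python
import re

# Hypothetical match for buggy Netscape 4.x clients; the real pattern
# should be refined from the user-agent strings people mail in.
BUGGY_NETSCAPE = re.compile(r"^Mozilla/4\.\d")

def should_send_robots_header(user_agent):
    # Real Netscape 4.x: "Mozilla/4.08 [en] (Win98; I)".
    # IE spoofs the prefix: "Mozilla/4.0 (compatible; MSIE ...)" --
    # those clients are fine, so only skip the header for the former.
    return not (BUGGY_NETSCAPE.match(user_agent)
                and "compatible" not in user_agent)
```

Then the header-sending code just wraps itself in `if should_send_robots_header(ua):`.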