True - it's kinda difficult to keep the bots from indexing a page/URL when they flood your site with hundreds of bots at a time (we've had >100 Yahoo bots online at ubbdev in the last week) - it's more a Google/search-engine bot problem than a forum script problem. They should only index the same page once; I thought that was the whole idea. It has definitely never been our intent to spam the search engines - they, and only they, control what their bots index. We could jump through hundreds of hoops and tomorrow they could change their bots' behavior 180 degrees.
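For what it's worth, the only lever a site really has is robots.txt (plus noindex meta tags), and it's purely advisory - the bot decides whether to honor it. Here's a minimal Python sketch of how a well-behaved crawler is supposed to check those rules; the example.com URL and the UBBThreads-style paths/rules are just made-up placeholders:

```python
import urllib.robotparser

# Hypothetical robots.txt rules a forum might publish to keep
# crawlers out of duplicate views of the same thread.
rules = """
User-agent: *
Disallow: /ubbthreads.php?ubb=printthread
Disallow: /ubbthreads.php?ubb=sendtopic
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A compliant bot asks before fetching; nothing forces it to obey.
print(rp.can_fetch("*", "http://example.com/ubbthreads.php?ubb=printthread&t=1"))  # False
print(rp.can_fetch("*", "http://example.com/ubbthreads.php?ubb=showflat&t=1"))     # True
```

The point being: we can publish rules all day, but if the engine's bot ignores them (or they change how it behaves tomorrow), there's nothing on our end to enforce it.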

To wit, I'm not sure what the goal of search engine optimization is if they insist on listing the same page more than once - if *they* are listing the same page more than once, then what is the problem? I know they can penalize sites for spamming the indexes, but ubbdev has >250k pages indexed (a number that has only grown over the last several years), so I don't think they're penalizing us for their own bots repeatedly indexing our site.

I like the spider script (I was one of its first users/proselytizers), and I still have links to some of my old spider script pages out there - found one on a site dedicated to Marxism. Anyway, until Google works the bugs out of their bots, there's not a lot we can do about them treating our anchor tags as separate pages.
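Just to illustrate the anchor problem: URLs that differ only in the part after the # point at the same document, so the bot ought to normalize them before counting pages. A rough Python sketch (the example.com URLs are made up, UBBThreads-style for illustration):

```python
from urllib.parse import urldefrag

# All three point at the same physical page; only the #anchor differs.
urls = [
    "http://example.com/ubbthreads.php?ubb=showflat&Number=42",
    "http://example.com/ubbthreads.php?ubb=showflat&Number=42#Post42",
    "http://example.com/ubbthreads.php?ubb=showflat&Number=42#Post43",
]

# Stripping the fragment before indexing collapses them into one entry.
seen = {urldefrag(u).url for u in urls}
print(seen)  # {'http://example.com/ubbthreads.php?ubb=showflat&Number=42'}
```

One line of normalization on their end, and the "duplicate pages" complaint goes away - which is why I say it's their bug to fix, not ours.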

Yahoo is more reasonable; they have ~76,000 pages indexed.
search.live.com has ~1,400 pages.


- Allen
- ThreadsDev | PraiseCafe