G* and Y* etc. will follow all the links they encounter (that aren't excluded in robots.txt). So unless my Disallow: list contains thousands of lines of products/categories (not gonna happen), I don't see any way to limit the search engines' crawl to specific levels...
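(One partial workaround, for anyone reading along: the big engines honor * wildcards in Disallow rules, even though the original robots.txt spec doesn't, so a single pattern can cover a whole branch instead of listing every URL. The paths below are made up for illustration:)

    User-agent: *
    # keep crawlers out of everything below the top product level (hypothetical path)
    Disallow: /products/*/
    # block any URL with a query string
    Disallow: /*?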

BTW - I use this site to create my sitemaps. That version stops at 500 pages, but even when I edit my sitemaps down to maybe a dozen URLs, the search engines still find the entire catalog...
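(Which makes sense, I suppose: a sitemap only *suggests* URLs to the engines, it never restricts the crawl, so trimming it doesn't hide anything. For anyone hand-editing theirs, the sitemaps.org format just needs the standard wrapper around whatever URLs you pick; example.com here is a placeholder:)

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/catalog/featured.html</loc>
      </url>
      <url>
        <loc>http://www.example.com/about.html</loc>
      </url>
    </urlset>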

So of course, that's a "limitation" of SEs (that I'm not complaining about), but... it would be nice to be able to say "just list/monitor *these* URLs please" ;)



GangsterBB.NET (Ver. 7.6.1.1)
PHP Version 5.6.40 / MySQL 5.7.23-23 (was 5.6.41-84.1) / Apache 2.4.54
2007 Content Rulez Contest - Hon Mention
UBB.classic 6.7.2 - RIP