G* and Y* etc. will follow
all links they encounter (that aren't excluded in robots.txt). So unless my Disallow: list contains thousands of lines of products/categories (not gonna happen), I don't see any way to limit the search engines' crawl to specific levels...
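Just to show the scale of the problem, here's roughly what that unmanageable robots.txt would have to look like (the paths are hypothetical, purely for illustration):

```
User-agent: *
# One Disallow line per product/category page I *don't* want crawled --
# for a real catalog this runs to thousands of entries (paths made up):
Disallow: /products/widget-1001
Disallow: /products/widget-1002
Disallow: /category/gadgets/
# ...and so on for every URL outside the handful I actually want indexed
```

There's no wildcard-by-depth mechanism in standard robots.txt that would let you say "only crawl the top two levels," which is why the list has to be enumerated like this.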
BTW - I use
this site to create my sitemaps. That version stops at 500 pages, but even when I edit my sitemaps down to maybe a dozen URLs, the crawlers still find the entire catalog...
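For what it's worth, a trimmed-down sitemap would look like the snippet below (the example.com URLs are placeholders). But as I found, crawlers treat the sitemap as a list of *additional* URLs to consider, not a fence -- anything linked from those pages still gets crawled:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
  <url>
    <loc>https://example.com/category/featured/</loc>
  </url>
  <!-- ...a dozen hand-picked URLs, yet the whole catalog still gets indexed -->
</urlset>
```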
So of course, that's a "limitation" of SEs (that I'm not complaining about), but... it would be nice to be able to say "just list/monitor *these* URLs please".