Let me set up template groups based on regex filters on the URLs and the HTML content of the page (in case the URLs aren't descriptive), and then have the option to see reports with segment breakdowns and drill down from there.
By and large problems don't exist on pages, they exist on templates.
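The template-grouping idea above can be sketched roughly like this. This is a minimal illustration, not the tool's actual implementation: the segment names, patterns, and the `classify` helper are all assumptions, with an HTML regex as the fallback for pages whose URLs aren't descriptive.

```python
import re

# Illustrative segments: (name, URL pattern, optional HTML fallback pattern).
SEGMENTS = [
    ("blog", re.compile(r"^/blog/"), re.compile(r'class="post-single"')),
    ("product", re.compile(r"^/products/\d+"), None),
]

def classify(path, html=""):
    """Return the first segment whose URL regex matches the path,
    falling back to the HTML regex when the URL doesn't match."""
    for name, url_re, html_re in SEGMENTS:
        if url_re.search(path):
            return name
        if html_re and html_re.search(html):
            return name
    return "other"
```

Reports could then group issues by the returned segment label rather than by individual URL, which is the point of the suggestion: problems live on templates, not pages.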
Under consideration Suggested by: Dominic Woodman • Upvoted: 27 Jan • Comments: 7
Dominic Woodman Merged
This is an enhancement to a previous suggestion.
It's all well and good to try to crawl all of a giant site, but realistically, in many cases you just don't need to: you don't need to crawl every URL in a template to find the problems.
It would be useful to be able to define a hard limit or a percentage of the crawl (which would presumably have to be recalculated at intervals) for each segment of pages. This would let you crawl giant sites while still being safe in the knowledge that every segment is covered.
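A per-segment hard cap could work roughly as below. This is a hypothetical sketch: the `LIMITS` values and `should_crawl` helper are made up for illustration, and a percentage-based limit would instead recompute each cap at intervals as the total number of discovered URLs grows.

```python
from collections import Counter

# Illustrative hard caps per segment; segments without an entry are uncapped.
LIMITS = {"blog": 500, "product": 1000}

crawled = Counter()

def should_crawl(segment):
    """Enqueue a URL only while its segment is under its cap."""
    cap = LIMITS.get(segment)
    if cap is not None and crawled[segment] >= cap:
        return False
    crawled[segment] += 1
    return True
```

The crawler would check `should_crawl(classify(url))` before enqueuing each discovered URL, so a giant template stops consuming crawl budget once its sample is large enough.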
A comment on my own feature request: it would be great if there were a default segment option for folders, in a page-tree-esque view.
Same reasoning: it's one of the best features of Screaming Frog.
Anders Riise Koch
This would be amazing +10pts
Sitebulb has to have that feature, definitely!
"Crawl limits within page segments" (suggested by Dominic Woodman on 2017-11-16), including upvotes (1) and comments (0), was merged into this suggestion.
Ability to have auto segments set up by:
- body class (e.g. blog = .post-single)
- URL structure
Or have custom segments set up (e.g. regex, custom URLs, etc.)
Would need an initial settings section for choosing which type, then a section that allows you to set labels.
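The two auto-segment options above could look something like this sketch. The class-to-label mapping, the crude `<body>` regex, and both helper names are assumptions for illustration only; the URL variant takes the first path folder, matching the page-tree-style default mentioned earlier in the thread.

```python
import re
from urllib.parse import urlparse

# Hypothetical mapping from body class to segment label (e.g. blog = .post-single).
BODY_CLASS_LABELS = {"post-single": "blog", "product-page": "product"}

def segment_by_body_class(html):
    """Label a page from its <body> class attribute, or None if no class matches."""
    m = re.search(r'<body[^>]*class="([^"]*)"', html)
    if m:
        for cls in m.group(1).split():
            if cls in BODY_CLASS_LABELS:
                return BODY_CLASS_LABELS[cls]
    return None

def segment_by_url(url):
    """Use the first URL path folder as the default segment label."""
    parts = urlparse(url).path.strip("/").split("/")
    return parts[0] or "root"
```

A settings section would choose which strategy applies, then let you edit the resulting labels, as the suggestion describes.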
"Issues reported by site segmenr" (suggested by Kyle on 2019-07-20), including upvotes (1) and comments (0), was merged into this suggestion.