Before I start: I know you can manually create the sitemap, hence the request to improve automatic sitemaps :)
We have a lot of pages "in progress" and a lot of designers working on our project, so at any given time there are many pages not fit for public consumption.
There are also certain types of pages you don't want to be indexed, e.g. paid media pages and conversion pages, which need to be kept out of the public sitemap and out of search indexes.
One current workaround is to password protect the draft pages, but they are still included in our sitemap, which presents a couple of problems.
The other workaround is to manually create the sitemap, but who really wants to do that. Meh.
Dream solution - automagic sitemap:
The request here is to make the automatic sitemap smarter: automatically leave out password-protected/draft pages and any page marked as not for indexing.
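To make the idea concrete, here is a minimal sketch of the logic I mean, assuming each page carries `password_protected` and `noindex` flags (the page data and field names are invented for illustration, not your actual data model):

```python
# Sketch of a "smarter" automatic sitemap: pages flagged as
# password-protected or noindex are skipped. The page records below
# are hypothetical examples.
from xml.etree.ElementTree import Element, SubElement, tostring

pages = [
    {"url": "https://example.com/", "password_protected": False, "noindex": False},
    {"url": "https://example.com/draft-page", "password_protected": True, "noindex": False},
    {"url": "https://example.com/paid-landing", "password_protected": False, "noindex": True},
]

def build_sitemap(pages):
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in pages:
        if page["password_protected"] or page["noindex"]:
            continue  # keep drafts and excluded page types out of the sitemap
        SubElement(SubElement(urlset, "url"), "loc").text = page["url"]
    return tostring(urlset, encoding="unicode")

print(build_sitemap(pages))
```

With the sample data above, only `https://example.com/` ends up in the generated sitemap.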
There are possibly some further improvements; what comes to mind is auto-generating the robots.txt file from the resulting options, but I'll leave that to your genius team.
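For the robots.txt idea, something along these lines is what I imagine, using the same hypothetical page flags as before (note that `Disallow` only blocks crawling by well-behaved bots; a `noindex` meta tag on the page itself is still the more reliable way to keep a page out of search results):

```python
# Hypothetical sketch of auto-generating robots.txt from the same page
# flags; the page records and sitemap URL are invented for illustration.
from urllib.parse import urlparse

pages = [
    {"url": "https://example.com/", "password_protected": False, "noindex": False},
    {"url": "https://example.com/draft-page", "password_protected": True, "noindex": False},
    {"url": "https://example.com/paid-landing", "password_protected": False, "noindex": True},
]

def build_robots(pages, sitemap_url="https://example.com/sitemap.xml"):
    lines = ["User-agent: *"]
    for page in pages:
        if page["password_protected"] or page["noindex"]:
            # hide drafts and excluded page types from crawlers
            lines.append("Disallow: " + urlparse(page["url"]).path)
    lines.append("Sitemap: " + sitemap_url)
    return "\n".join(lines)

print(build_robots(pages))
```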