I only realised it might be a problem when I went to generate a sitemap online for Google: the generator ran on and on, and I realised it was looping through the navigation links over and over.
It would be great to see this as a feature, as for myself I can see it being a slight concern for search engines.
Also, I couldn't find much that can be done with the robots.txt file, other than a Google-specific directive that could work, but it's only going to work for Google and there are doubts as to whether it will work perfectly.
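For what it's worth, the standard (non-Google-specific) thing robots.txt can do is a `Disallow` rule, which tells compliant crawlers to skip a path entirely. A minimal sketch, assuming the navigation links all sit under a hypothetical `/nav/` path (that path is my own placeholder, not something from the script):

```
# Hypothetical example: block all compliant crawlers
# from a /nav/ path where the looping links might live.
User-agent: *
Disallow: /nav/
```

This only works if the problem links share a common path prefix, though, which may not be the case here.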
When do you hope to release the 6.3.0 version? By the way, I should also mention the script has been great to work with.