I'm in the process of replacing my old sitemaps with a new set that should better represent our site. However, since bots can take days or even weeks to scan all of our URLs after they are submitted, I do not want to adversely affect our site. If I submit the new sitemaps simultaneously, so that every URL appears in both sets, will this affect my site negatively? The old sitemaps would be removed after the new ones are scanned.
Is this the correct process for replacing sitemaps, or does anyone have a different recommendation?
Yeah, that is the wrong approach. Just replace your sitemap and keep life simple!
Honest and legitimate bots won't fully spider your site just because you updated or replaced your sitemap. They are smarter than that. A search engine's decision to visit a page is based on metrics in its own database, not on an updated sitemap. I update my sitemap often. Some sites update their sitemaps daily, and some every few minutes. It is not a problem. Go nuts and have fun!
Remove the old sitemap and submit the new one. For faster indexing, use bulk ping services for all the URLs.
If you want to remove duplicate URLs from your current sitemap, you can use Microsoft Excel to do this job.
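If you'd rather not round-trip through Excel, a short script can do the same deduplication. Here is a minimal sketch using only Python's standard library; the sitemap content below is a made-up example, and `dedupe_sitemap_urls` is a hypothetical helper name:

```python
# Sketch: deduplicate <loc> URLs in a sitemap file using Python's
# standard library. The sitemap content below is a made-up example.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

sitemap_xml = f"""<urlset xmlns="{NS}">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
  <url><loc>https://example.com/</loc></url>
</urlset>"""

def dedupe_sitemap_urls(xml_text: str) -> list[str]:
    """Return the <loc> URLs in document order with duplicates removed."""
    root = ET.fromstring(xml_text)
    seen = set()
    unique = []
    # <loc> elements live in the sitemap namespace, so the tag must be qualified.
    for loc in root.iter(f"{{{NS}}}loc"):
        url = (loc.text or "").strip()
        if url and url not in seen:
            seen.add(url)
            unique.append(url)
    return unique

print(dedupe_sitemap_urls(sitemap_xml))
# → ['https://example.com/', 'https://example.com/about']
```

From the deduplicated list you can regenerate the sitemap with whatever tool produced it in the first place.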