I have a serious problem on my site: http://www.coimbatorematrimony.com

I have created multiple sitemap.xml files following the guidelines of Google Webmaster Tools and some other search engines. I have 37 sitemaps with roughly 25,000 links each on average, but only about 15,000 links in total are indexed.
My sitemap link is http://www.coimbatorematrimony.com/sitemap_index.xml
You can declare your sitemaps in your robots.txt, one line per sitemap.xml file. Google will pick them up faster, but there is no guarantee it will process them fully; you have no control over this. More likely, Google will process them little by little. If your pages are too similar to one another, Google will not index them all.
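As a sketch, the declarations in robots.txt would look like this (the individual sitemap file names below are assumptions; since you already have a sitemap index, listing just the index file is also enough):

```text
# robots.txt at http://www.coimbatorematrimony.com/robots.txt
User-agent: *
Allow: /

# One Sitemap line per file, absolute URLs required
Sitemap: http://www.coimbatorematrimony.com/sitemap_index.xml
# Or list each sitemap file individually, e.g.:
# Sitemap: http://www.coimbatorematrimony.com/sitemap1.xml
# Sitemap: http://www.coimbatorematrimony.com/sitemap2.xml
```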
Google does not revisit the sitemaps you submit to Webmaster Tools, but it revisits robots.txt frequently, so it will also reprocess the sitemaps listed there more often. It is important to refresh your pages' lastmod tag whenever those pages are modified. If you don't cheat with these tags, Google responds to them pretty well.
You can also use the priority tag to tell Google which pages it should focus on first. If you have 25,000 pages, Google may not be willing to process them all; it will pick those with high priority first.
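A sitemap entry combining both tags might look like this (the profile URL below is a hypothetical example, not a real page on your site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Hypothetical URL; substitute your real page -->
    <loc>http://www.coimbatorematrimony.com/profiles/12345</loc>
    <!-- Update lastmod only when the page content actually changes -->
    <lastmod>2012-06-15</lastmod>
    <!-- priority is relative within your own site: 0.0 to 1.0, default 0.5 -->
    <priority>0.8</priority>
  </url>
</urlset>
```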
For more info, see my personal blog post on the subject.