http://www.example.com has a sub-domain
http://jobs.example.com that uses a tool which shares our job listings. We would like the job listings to be indexed in Google search results. The biggest hurdle is that we are unable to upload a sitemap to the sub-domain (long story), so I am trying some workarounds to get Google to notice these pages.
I have checked against all of the items in this post: "Why isn't my website in Google search results?". As I mentioned, submitting a sitemap isn't possible. We do have incoming links, and the fetch test for individual links presented no problems, though none were ultimately indexed.
Now, I'm trying to see if the robots.txt file is the problem. If I can rule that out, then I think I can say that there is something in the code that is causing an issue. The provider of the tool says that pages are generated dynamically, but I don't see why that would be a problem for indexation.
Here is the content of the robots.txt file:
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /feed/
Disallow: /trackback/
#Disallow: wp-login.php
Disallow: wp-signup.php
Disallow: /xmlrpc.php
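If you want to sanity-check these rules yourself, Python's standard urllib.robotparser can evaluate them locally. A quick sketch, where the job-listing path is a made-up placeholder:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rules quoted in the question.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /feed/
Disallow: /trackback/
#Disallow: wp-login.php
Disallow: wp-signup.php
Disallow: /xmlrpc.php
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A hypothetical job-listing URL (placeholder path): no Disallow rule
# matches it, so crawling is allowed for any user-agent, Googlebot included.
allowed = parser.can_fetch("Googlebot", "http://jobs.example.com/listings/12345")

# A path under /wp-admin/ *is* blocked by the rules above.
blocked = parser.can_fetch("Googlebot", "http://jobs.example.com/wp-admin/options.php")

print(allowed, blocked)
```

One thing worth remembering: robots.txt is per-host, so Googlebot reads http://jobs.example.com/robots.txt for the sub-domain, not the file on the main domain.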
My suspicion is that this is some issue with the tool on the sub-domain that I am not seeing, but I'd love to find out that there is something else going on that we can fix.
That looks like a fairly standard (WordPress) robots.txt file, so it shouldn't really be blocking anything of significance, regardless of where it is actually located (although that would be useful to know).
"the fetch test for individual links presented no problems"
Assuming this is the fetch tool within Google Search Console (formerly Google Webmaster Tools), then you've already answered your question. The tool would be unable to fetch your page if it (or a linked resource) was blocked by robots.txt.