Desktop & Mobile XML Sitemap Submitted But Only Desktop Sitemap Indexed On Google Search Console
-
Hi!
The Problem
We have submitted a sitemap index to GSC. Within that index there are four XML sitemaps, including one for the desktop site and one for the mobile site. The desktop sitemap has 3,300 URLs, of which Google has indexed approximately 3,000 (according to GSC). The mobile sitemap has 1,000 URLs, of which Google has indexed 74.
The pages are crawlable and the site structure is logical. A landing page URL search in Google Analytics (filtered to the google/organic source/medium) shows that hundreds of those mobile URLs are receiving organic landings. A mobile search for a long-tail keyword from a randomly selected page also returns the mobile page in the SERPs, even though, judging by GSC, that page has not been indexed.
Could this be because we have recently added rel=alternate tags to our desktop pages (and, of course, corresponding rel=canonical tags to the mobile pages)? Would Google then 'not index' the rel=alternate page versions?
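For reference, the standard switchboard annotation for a separate-URL mobile setup looks roughly like the following (the URLs here are placeholders rather than our actual pages; the media query value is the one from Google's documentation):

On a desktop page such as https://www.site.com/example-page:
<link rel="alternate" media="only screen and (max-width: 640px)" href="https://www.site.com/m/example-page"> <!-- placeholder URL -->

On the corresponding mobile page at https://www.site.com/m/example-page:
<link rel="canonical" href="https://www.site.com/example-page"> <!-- placeholder URL -->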
Thanks for any input on this one.
-
Hi Allison, any updates on this?
From my understanding, it is possible that Google is not indexing the mobile versions of pages if they simply correspond to the desktop pages (and are indicated as such with the rel=alternate mobile switchboard tags). If Google has that information, it may simply index the desktop pages and then display the mobile URL in search results.
It is also possible that the GSC data is not accurate. If you do a 'site:' search for your mobile pages (I would try something like 'site:domain/m/'), does it show a higher number of mobile pages than what you're seeing in GSC?
Can you check your mobile ranking data and see which URLs are being shown to mobile searchers? If your data shows that mobile users are landing on these pages from search, that indicates they are being shown in search results, even if they're not showing up as "indexed" in GSC.
-
Apologies for the delayed reply, and thank you for providing this information!
Has there been any change in this trend over the last week? I do know that subfolder mobile sites are generally not recommended by search engines. That being said, I do not feel the mobile best practices would change as a result. Does the site automatically redirect users based on their device? If so, be sure Googlebot is being redirected appropriately as well.
"When a website is configured to serve desktop and mobile browsers using different URLs, webmasters may want to automatically redirect users to the URL that best serves them. If your website uses automatic redirection, be sure to treat all Googlebots just like any other user-agent and redirect them appropriately."
Here is Google's documentation on best practices for mobile sites with separate URLs. I do believe the canonical and alternate tags should be left in place. It may be worth experimenting with removing the mobile URLs from the sitemap, though I feel this is more of a redundancy issue than anything.
I would also review Google's documentation on 'Common Mobile Mistakes'; perhaps there is an issue that is restricting search engines from crawling the mobile site efficiently.
Hope that helps!
-
Hi Paul and Joe
Thanks for the reply!
Responsive is definitely in the works...
In the meantime, to answer:
-
GSC is set up for the mobile site. However, it's not on a subdomain; it's a subdirectory mobile site. So rather than m.site.com we have www.site.com/m for the mobile pages. A sitemap has been submitted, and that's where I can see the data shown in the image.
-
Because the mobile site is a subdirectory site, the data becomes a little blended with the main domain's data in Google Search Console. If I want to see Crawl Stats, for example, Google advises: "To see stats and diagnostic information, view the data for (https://www.site.com/)."
-
re: "My recommendation is to remove the XML sitemap and rely on the rel=alternate/canonical tags to get the mobile pages indexed. Google's John Mueller has stated that you do not need a mobile XML sitemap file." I had read this previously, but due to the nature of the sub-directory setup of the site, the mobile sitemap became part of the sitemap index...rather than having just one large sitemap.
Thoughts?
-
As Joe says, set up a separate GSC profile for the m-dot subdomain, then use that to submit the m-dot sitemap directly if you wish. You'll get vastly better data about the performance of the m-dot site by having it split out, instead of mixed into and obfuscated by the desktop data.
Paul
-
Hi Alison,
While this is a bit late, I would recommend moving to a responsive site when/if possible. Much easier to manage, fewer issues with search engines.
My recommendation is to remove the XML sitemap and rely on the rel=alternate/canonical tags to get the mobile pages indexed. Google's John Mueller has stated that you do not need a mobile XML sitemap file.
Also, do you have Google Search Console set up for both the m. mobile site and the desktop version? It does not seem so, given that all the sitemaps are listed in one property in your screenshot. If not, I recommend setting this up, as you may gain some valuable insights into how Google is crawling the mobile site.
I'd also review Google's Common Mobile Mistakes guide to see if any of these issues could be impacting your situation. Hope this helps!