Google Webmaster Guideline Change: Human-Readable list of links
-
In the revised webmaster guidelines, Google says "[...] Provide a sitemap file with links that point to the important pages on your site. Also provide a page with a human-readable list of links to these pages (sometimes called a site index or site map page)." (Source: https://support.google.com/webmasters/answer/35769?hl=en)
I guess what they mean is something like this: http://www.ziolko.de/sitemap.html
Still, I wonder why they say that. Is it just to ensure that every page on a site is linked and therefore findable by humans (and crawlers - but isn't the XML sitemap meant for those, and doesn't it give them even better information)? Shouldn't a good navigation already lead to every page? What is the benefit of a link-list page, assuming you have an XML sitemap? For a big site, a link list is bound to look somewhat cluttered, and its usefulness is outclassed by a good navigation, which I assume is a given. Or isn't it?
TL;DR: Can anybody tell me what exactly is the benefit of a human-readable list of all links?
Regards,
Nico
-
Hi Netkernz_ag,
It is just good practice to have those types of pages available. While I wouldn't say it is an absolute requirement, it is something you should do for your users. The page you pointed to is a general checklist of things to do, and not to do, for your users. Creating a site index may be a bit dated, but I still tend to do them as they are fairly easy to create. (example).
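As an illustration of how little work it takes (this is only a minimal sketch, not from any particular tool - the example.com URLs and the site-index.html file name are placeholders you'd swap for your own pages, ideally pulled from your CMS or XML sitemap):

```python
# Minimal sketch: build a human-readable site index page from a list
# of important pages. URLs and the output file name are placeholders.
from html import escape

pages = [
    ("https://www.example.com/", "Home"),
    ("https://www.example.com/products/", "Products"),
    ("https://www.example.com/contact/", "Contact"),
]

# One <li> with a link per page, with titles and URLs HTML-escaped.
items = "\n".join(
    f'    <li><a href="{escape(url)}">{escape(title)}</a></li>'
    for url, title in pages
)

page_html = f"""<!DOCTYPE html>
<html>
<head><title>Site Index</title></head>
<body>
  <h1>Site Index</h1>
  <ul>
{items}
  </ul>
</body>
</html>
"""

with open("site-index.html", "w", encoding="utf-8") as f:
    f.write(page_html)
```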
Hope this helps,
Don -
Hi there,
Remember that Google always seeks to serve a better user experience.
Technically, the XML sitemap is the one needed for crawlers, while the "human-readable" sitemap is focused on users.
I might be saying something obvious, but that's the way I've understood it. The benefit of the "human-readable" sitemap should be on the user-experience side, and Google probably sees it that way.
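Just to make that split concrete, here is a rough sketch (the sitemap.xml and site-index.html file names are only assumptions) that checks whether the human-readable page links to the same URLs the XML sitemap lists, so both crawlers and users see the same set of pages:

```python
# Rough sketch: compare the URLs in the XML sitemap with the links
# on the human-readable site index page. File names are placeholders.
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(path):
    # Collect every <loc> entry from the XML sitemap.
    tree = ET.parse(path)
    return {loc.text.strip() for loc in tree.getroot().findall("sm:url/sm:loc", NS)}

class LinkCollector(HTMLParser):
    # Collect the href of every <a> tag on the human-readable page.
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.add(href)

def index_urls(path):
    collector = LinkCollector()
    with open(path, encoding="utf-8") as f:
        collector.feed(f.read())
    return collector.links

missing = sitemap_urls("sitemap.xml") - index_urls("site-index.html")
print("In sitemap.xml but not on the site index page:", missing or "none")
```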
As a visitor, I find that kind of sitemap useful: it gives you a quick overview of the site and gets you to the page you want faster. Hope it helps.
GR.