Best Practice for www and non-www
-
What is the best way to handle all the different variations of a website, in terms of www vs. non-www and http vs. https?
In Google Search Console, I have all 4 versions and I have selected a preference.
In Open Site Explorer I can see that the www and non-www versions are treated differently, with a separate group of links pointing to each version of the same page. This gives each a different PA score.
eg.
- http://mydomain.com DA 25 PA 35
- http://www.mydomain.com DA 19 PA 21
Each version of the home page has its own set of links and scores.
Should I try to "consolidate" all the scores into one page?
Should I set up redirects to my preferred version of the website?
Thanks in advance
-
Thanks for your answer, that was helpful.
-
Thanks for taking the time to put together such a wonderfully detailed answer.
-
Hi Samantha,
What you have are what are called "canonical issues." By leaving multiple versions of your domain open and crawlable to search engines, you "split" your ranking authority, which results in the issues you are seeing right now.
The best practice is to choose one version of your domain as the "true canonical" and then 301 redirect the others at the server level by means of mod_rewrite code. Doing so will consolidate your content, incoming links and PageRank and greatly increase the root domain authority of your site.
To search engines, if your site hasn't instituted 301 redirect commands at the server level, all of these versions of your site home page would be treated as "separate pages" and each would accumulate authority individually:
http://yoursite.com/
http://www.yoursite.com/
http://yoursite.com/index.php
http://www.yoursite.com/index.php
https://yoursite.com
https://www.yoursite.com
You get the idea.
Most websites are run on one of three different types of servers...
- Unix-based servers running Apache.
- Unix-based servers running Nginx.
- Microsoft Windows-based servers running IIS or similar.
If you're unsure of what kind of server runs your site, ask your hosting company. Most sites are run on Unix-based servers with Apache. In that case, the server's behavior is configured using something called the .htaccess file.
If your site's root domain already contains a .htaccess file, you can simply scroll to the end of whatever code is already there and append your 301 redirect code at the bottom of the file, starting on a new line. While this may sound complicated, it's actually very, very simple to do. If you can upload files to and from your Web server, then chances are you'll have no trouble managing (i.e. altering or creating and uploading) your .htaccess file(s).
But yes, bottom line, you ALWAYS want to consolidate URLs and present one uniform "preferred" URL format to search engines and users. In your case, that would appear to be the non-www domain, which has the higher Domain Authority.
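To illustrate, here is a minimal sketch of what that mod_rewrite code might look like, assuming you choose the non-www version as your true canonical and your site runs over plain http (the domain and preference are placeholders, so adapt them to your own setup; if you also serve https, you would add a matching rule for your preferred scheme):

# Hypothetical .htaccess snippet (requires Apache mod_rewrite)
RewriteEngine On

# Send any www request to the non-www equivalent, keeping the requested path
RewriteCond %{HTTP_HOST} ^www\.mydomain\.com$ [NC]
RewriteRule ^(.*)$ http://mydomain.com/$1 [R=301,L]

# Collapse the index.php duplicate onto the root URL
RewriteRule ^index\.php$ http://mydomain.com/ [R=301,L]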
You can learn all about redirection best practices at the Moz resource here: https://a-moz.groupbuyseo.org/learn/seo/redirection
Related Questions
-
Best redirect destination for 18k highly-linked pages
Technical SEO question regarding redirects; I appreciate any insights on the best way to handle it. Situation: we're decommissioning several major content sections on a website, comprising ~18k webpages. This is a well-established site (10+ years), and many of the pages within these sections have high-quality inbound links from .orgs and .edus. Challenge: we're trying to determine the best place to redirect these 18k pages. For user experience, we believe the best option is the homepage, which has a statement about the changes to the site and links to the most important remaining sections of the site. It's also the most important page on the site, so the boost from 301-redirected links doesn't seem bad. However, someone on our team is concerned that that many new redirected pages and links going to our homepage will trigger a negative SEO flag for the homepage, and recommends instead that they all go to our custom 404 page (which also includes links to important remaining sections). What's the right approach here to preserve the remaining SEO value of these soon-to-be-redirected pages without triggering Google penalties?
Technical SEO | davidvogel
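For what it's worth, whichever destination wins that debate, the mechanics of a bulk section redirect are simple. A hedged sketch for an Apache .htaccess, with purely hypothetical section paths:

# Hypothetical example: 301 whole decommissioned sections to the homepage
# (/old-section-a/ and /old-section-b/ are placeholder paths)
RedirectMatch 301 ^/old-section-a/ /
RedirectMatch 301 ^/old-section-b/ /
-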
[Solved] Should I consolidate my "www" and "non-www" pages?
My page rank for www and non-www is the same. In one keyword instance, my www version performs SO much better. I want to consolidate to one or the other. My question is whether all these issues would ultimately resolve to my chosen consolidated domain (i.e. www or non-www) regardless of which one I choose, OR whether it would be smart to choose the one where I am already ranking high for this significant keyword phrase. Thank you in advance for your help.
Technical SEO | meditationbunny
-
Google tries to index non-existing language URLs. Why?
Hi, I am working for a SaaS client. He uses two different language versions on two different subdomains: de.domain.com/company for German and en.domain.com for English. Many thousands of URLs have been indexed correctly, but Google Search Console keeps trying to index URLs that never existed and still don't exist: de.domain.com/en/company, en.domain.com/de/company, and a thousand more with /en/ or /de/ in between. We never use this variant, and calling these URLs correctly throws up a 404 page (but with the wrong response code - we're fixing that 😉). Still, Google tries to index these kinds of URLs again and again, and I couldn't find any source for them; no website is using them as outgoing links, etc. We do see in our logfiles that a Screaming Frog installation and Open Site Explorer (a-moz.groupbuyseo.org) were trying to access these earlier. My question: how does Google come up with these? Where did they get URLs that (to our knowledge) never existed? Any ideas? Thanks 🙂
Technical SEO | TheHecksler
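On the side issue of the wrong response code: one way to make those phantom paths return a hard error status on an Apache server is a rule like the sketch below, assuming neither subdomain legitimately serves anything under /en/ or /de/:

# Hypothetical .htaccess rule for each subdomain's document root:
# answer any /en/... or /de/... request with 410 Gone
RewriteEngine On
RewriteRule ^(en|de)/ - [G,L]
-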
What is SEO best practice to implement a site logo as an SVG?
Since it is possible to implement a description for SVGs, it seems that it would be possible to use that for the site name:
<desc>sitename</desc>
{{ STUFF }}
There is also a title tag for SVGs. I've read in a thread from 2015 that it sometimes gets confused with the title tag in the header (at least by the Moz crawler), which might cause trouble:
<title>sitename</title>
{{ STUFF }}
What is the state of the art here? Any experiences and/or case studies with using either method? However, to me it seems that either way, best practice in terms of search engines being able to crawl is to load the SVG and implement a proper alt tag. What is your opinion about this? Thanks in advance.
Technical SEO | twisme
-
Best practices for types of pages not to index
Trying to better understand best practices for when and when not to use content="noindex". Are there certain types of pages that we shouldn't want Google to index? Contact form pages, privacy policy pages, internal search pages, archive pages (using WordPress)? Any thoughts would be appreciated.
Technical SEO | RichHamilton_qcs
-
What is the best way to deal with an event calendar
I have an event calendar with multiple items repeating into the future. They are classes that typically all have the same titles but occasionally have different information. I don't know the best way to deal with them and am open to suggestions. Currently Moz Analytics is showing multiple errors (duplicate page titles, duplicate descriptions and overly dynamic URLs). I'm assuming that it's flagging duplicate elements way into the future. I thought of having the calendar nofollowed entirely, but the content for the classes seems valuable. Thanks,
Technical SEO | categorycode
-
Best Way To Clean Up Unruly SubDomain?
Hi, I have several subdomains that present no real SEO value but are being indexed. They don't earn any backlinks either. What's the best way of cleaning them up? I was thinking the following: 1. Verify them all in Webmaster Tools. 2. Remove all URLs from the index via the Removal Tool in WMT. 3. Add a site-wide "noindex, follow" directive. Also, to remove the URLs in WMT you usually have to block them via /robots.txt, but if I'd like to keep Google crawling through the subdomains while removing their URLs, is there a way to do so?
Technical SEO | RocketZando
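On that last point: robots.txt blocking and noindex work against each other, since a blocked URL is never crawled and the noindex is never seen. One way to keep the subdomains crawlable while asking for them to be dropped from the index is an HTTP header directive; a minimal sketch for Apache, assuming mod_headers is available:

# Hypothetical .htaccess in the subdomain's document root:
# let crawlers fetch every URL, but ask them not to index any of it
<IfModule mod_headers.c>
  Header set X-Robots-Tag "noindex, follow"
</IfModule>
-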
Best Practices for adding Dynamic URLs to an XML Sitemap
Hi Guys, I'm working on an ecommerce website where all the product pages use dynamic URLs (we also have a few static pages, but there is no issue with them). The products on the site are updated every couple of hours (because we sell out or the special offer expires), and as a result I keep seeing heaps of 404 errors in Google Webmaster Tools and am trying to avoid this (if possible). I have already created an XML sitemap for the static pages and am now looking at incorporating the dynamic product pages, but am not sure what the best approach is. The URL structure for the products is as follows:
http://www.xyz.com/products/product1-is-really-cool
http://www.xyz.com/products/product2-is-even-cooler
http://www.xyz.com/products/product3-is-the-coolest
Here are 2 approaches I was considering:
1. Just include the dynamic product URLs within the same sitemap as the static URLs, using just http://www.xyz.com/products/ - this is so spiders have access to the folder the products are in and I don't have to create an automated sitemap for every product. OR
2. Create a separate automated sitemap that updates whenever a product is updated, and set the change frequency to hourly - this is so spiders always have as close to an up-to-date sitemap as possible when they crawl.
I look forward to hearing your thoughts, opinions, suggestions and/or previous experiences with this. Thanks heaps, LW
Technical SEO | seekjobs