Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Checkout on different domain
-
Is it a bad SEO move to have your checkout process on a separate domain instead of the main domain for an ecommerce site? There is no real content on the checkout pages, and they are completely new pages that are not indexed in the search engines. Due to the backend architecture it is impossible for us to have them on the same domain.
An example is this page: http://www.printingforless.com/2/Brochure-Printing.html
One option we've discussed is to avoid passing PageRank on to the checkout domain by iframing all of the links to the checkout domain.
We could also move the checkout process to a subdomain instead of a new domain.
Please ignore the concerns with visitor security and conversion rate. Thanks!
-
In my opinion there isn't really any downside to this from a Google perspective; as you said, they shouldn't even be indexed anyway. Many, many vendors out there have their charge/fulfillment go straight to PayPal, for example, and don't host any checkout-specific code (other than cart-building, account creation, etc.) on their site at all.
There's also the case where multiple microsites all share the same checkout on another domain, to centralize checkouts. As far as I know these sites aren't penalized either, and it definitely saves money on secure certificates.
There is, however, another angle to consider, and that is the human angle. Some people (who aren't savvy about ecommerce) might be alarmed that their secure checkout is occurring on a different domain than the one they've been browsing on. This is a security/conversion-rate issue, though, so you may already know of it.
In my opinion I would leave it alone and not bother with the iframe tricks and so on. A subdomain might be more reassuring to the user (e.g. secure.printingforless.com instead of printingforless1.com), but I honestly can't see why the current setup would have Google implications, as long as your SSL/non-SSL pages are separate and canonicalized properly.
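Since the whole answer rests on the checkout pages really carrying a noindex signal, here is a minimal sketch of a check for a meta robots noindex tag. The class and function names are illustrative; this inspects HTML you already have and does not fetch anything itself.

```python
# Sketch: check a page's HTML for a meta robots noindex tag.
# Names are illustrative; the HTML must be fetched separately.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Sets .noindex when a <meta name="robots"> tag contains 'noindex'."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = {k.lower(): (v or "") for k, v in attrs}
            if a.get("name").lower() == "robots" and "noindex" in a.get("content", "").lower():
                self.noindex = True

def has_noindex(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex
```

Running this against each checkout URL's HTML would confirm whether the "not indexed" assumption actually holds page by page.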
Related Questions
-
Lost ranking after domain switch
I recently migrated from https://whitefusemedia.com to https://whitefuse.com. The website URL structure and content remained the same and I followed all the best practice guidance regarding checks on the new domain and appropriate 301 redirects. I have seen traffic drop by about 50% and the traffic that is still coming through is mainly coming through links still listed by Google under the old domain (https://whitefusemedia.com). Is this normal? Should I expect to see this bounce back, or is there anything I can do now to regain the rankings?
Technical SEO | wfm-uk
-
English and French under the same domain
A friend of mine runs a B&B and asked me to check his freshly built website to see if it was SEO compliant.
The B&B is based in France and he's targeting a UK and French audience. To do so, he built content in English and French under the same domain: https://www.la-besace.fr/. When I run a crawl through Screaming Frog, only the French-content URLs seem to come up, and I am not sure why. Can anyone enlighten me, please? To maximise his business's local visibility, my recommendation would be to build two different websites (1 FR and 1 .co.uk), build content in the respective language version sites, and do all the link-building work in the respective country sites. Do you think this is the best approach, or should he stick with his current solution? Many thanks
Technical SEO | coolhandluc
-
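For the record, the usual way to keep English and French content on one domain is hreflang alternate annotations. A minimal sketch of generating them follows; the /en/ and /fr/ paths are hypothetical, not taken from the actual site.

```python
# Sketch: emit hreflang alternate tags for language/URL pairs.
# The example paths (/en/, /fr/) are hypothetical placeholders.
def hreflang_tags(pairs):
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in pairs
    )
```

Each language version would carry the full set of tags in its head, pointing at itself and at every alternate.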
Tool to Generate All the URLs on a Domain
Hi all, I've been using xml-sitemaps.com for a while to generate a list of all the URLs that exist on a domain. However, this tool only works for websites with under 500 URLs on a domain. The paid tool doesn't offer what we are looking for either. I'm hoping someone can help with a recommendation. We're looking for a tool that can:
- Crawl, and list, all the indexed URLs on a domain, including .pdf and .doc files (ideally in a .xls or .txt file)
- Crawl multiple domains with unlimited URLs (we have 5 websites with 500+ URLs on them)
Seems pretty simple, but we haven't been able to find something that isn't tailored toward management of a single domain or that can crawl a huge volume of content.
Technical SEO | timfrick
-
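Short of a tool recommendation, the core of such a crawler fits in a few lines of stdlib Python. This sketch only extracts links from already-fetched HTML and filters them to the same domain (no JavaScript rendering); the names are illustrative, and a real crawl would loop this over a frontier of URLs.

```python
# Sketch: extract every link from a page and keep only same-domain
# ones, including .pdf/.doc links. Stdlib only; names illustrative.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkParser(HTMLParser):
    """Collects absolute URLs from <a href> tags."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.add(urljoin(self.base_url, href))

def extract_links(html, base_url, same_domain_only=True):
    parser = LinkParser(base_url)
    parser.feed(html)
    if not same_domain_only:
        return parser.links
    host = urlparse(base_url).netloc
    return {u for u in parser.links if urlparse(u).netloc == host}
```

Collecting the result sets across all fetched pages and writing them to a .txt file would cover the export requirement.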
Are .clinic domains effective?
We acquired a .clinic domain for a client; they are currently running under a .ca and I was just wondering if there were any cons to making the switch. On the flip side, are there any pros? I've tried to search for the answer but couldn't seem to come across anything. Thank you if you have any knowledge or could point me to a resource.
Technical SEO | webignite
-
Correct linking to the /index of a site and subfolders: what's the best practice? link to: domain.com/ or domain.com/index.html ?
Dear all, starting with my .htaccess file:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.inlinear\.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]

RewriteCond %{THE_REQUEST} ^.*/index\.html
RewriteRule ^(.*)index\.html$ http://inlinear.com/ [R=301,L]

1. I redirect all URL requests with www. to the non-www version...
2. All requests for "index.html" are redirected to "domain.com/".

My questions are:
A) When linking from a page to my front page (home), the best practice is "http://domain.com/" and NOT "http://domain.com/index.php", right?
B) When linking to the index of a subfolder ("http://domain.com/products/index.php"), I should also link to "http://domain.com/products/" and not include the index.php, right?
C) When I define the canonical URL, should I define it simply as "http://domain.com/products/", or in this case should I point to the actual file "http://domain.com/products/index.php"?
Are A) and B) best practice? And C)? Thanks for all replies! 🙂
Technical SEO | inlinear
Holger
-
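For what it's worth, questions A) and B) amount to normalizing links before they're emitted, so every internal link points at the bare directory URL. A minimal sketch (function name illustrative):

```python
# Sketch: collapse trailing "index.html"/"index.php" to the bare
# directory URL, matching the redirect behavior described above.
import re

def normalize_link(url):
    return re.sub(r"index\.(html?|php)$", "", url)
```

Linking consistently to the normalized form, and using it as the canonical URL, keeps the site from splitting signals between the two variants.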
How to increase your Domain Authority
Hi guys, can someone please provide some pointers on how best to increase your Domain Authority? Thanks, Gareth
Technical SEO | GAZ09
-
How much authority does a 301 pass to a different domain?
Hi, A client of mine is selling his business to a brand new company. The brand new company will be using a brand new domain (no way to avoid that, unfortunately) and the current domain (which has tons of authority, links, shares, tweets, etc.) will not be used. Added to that, the new company will be taking over all the current content with just a few minor changes. (I know, I wish we could use the old domain, but we can't.) Obviously, I am redirecting all pages on the current domain to the new domain via 301 redirects on a page-by-page basis. So, current.com/product-page-x.html redirects to new.com/product-page-x.html. My client and the new company are both asking me how much link juice (and other factors) is passed along to the new domain from the old domain. All I can find is "not the full value" or variants thereof. My experience with 301 redirects in the past has been within a single domain, and I've seen some of those pages gain decent authority and decent rankings as a result of the 301 (no other optimization work was done and no links were added). Are there any studies out there that I'm missing that show how much authority/juice gets passed and/or lost via a 301 redirect? Anybody with a similar issue see any trends in page/domain authority and/or rankings? Thanks for any insights and opinions you have.
Technical SEO | Matthew_Edgar
-
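As a side note, page-by-page 301s of the kind described can be generated mechanically from a list of paths. A minimal sketch that emits Apache `Redirect 301` lines; the domain names are placeholders, not the client's real sites.

```python
# Sketch: generate per-page 301 redirect rules for an Apache config
# from a list of paths shared by the old and new sites.
# "new.com" below is a placeholder domain.
def redirect_rules(paths, new_domain):
    return "\n".join(
        f"Redirect 301 {path} https://{new_domain}{path}"
        for path in paths
    )
```

The resulting block can be pasted into the old domain's .htaccess, so every old URL maps to its exact counterpart rather than the new homepage.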
Block a sub-domain from being indexed
This is a pretty quick and simple (I'm hoping) question. What is the best way to completely block a subdomain from being indexed by all search engines? One item I cannot use is the meta "nofollow" tag. Thanks! - Kyle
Technical SEO | kchandler
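Without a meta tag, the standard server-side signals are a robots.txt Disallow (which blocks crawling) and an X-Robots-Tag response header (which blocks indexing). A minimal sketch of both values; these are the standard directives, and nothing here is specific to any particular server.

```python
# Sketch: the two server-side signals for blocking a subdomain.
# robots.txt stops crawling; the X-Robots-Tag header stops indexing.
ROBOTS_TXT = "User-agent: *\nDisallow: /\n"

def blocking_headers():
    # Served on every response from the subdomain (e.g. via the
    # web server config) so engines drop the pages from the index.
    return {"X-Robots-Tag": "noindex, nofollow"}
```

Note that a robots.txt Disallow alone can leave already-known URLs in the index, so the header (served only on the subdomain) is the part that actually removes them.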