Reverse proxy a successful blog from subdomain to subfolder?
-
I have an ecommerce site that we'll call confusedseo.com. I created a WordPress blog and CNAME'd it to blog.confusedseo.com. Since then, the blog has earned a PageRank of 3 and a decent amount of organic traffic.
I am considering using a reverse proxy so that the blog is served at confusedseo.com/blog/ instead of blog.confusedseo.com. As I understand it, this should help consolidate "link juice" onto the root domain. However, I'm concerned about potential harm to the SEO value the blog has already earned. What, if anything, should I be doing to ensure that the reverse proxy helps my "juice" rather than hurts it?
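For anyone picturing the mechanics: the main domain fetches the blog's pages behind the scenes and serves them under /blog/, so visitors (and Googlebot) only ever see confusedseo.com/blog/. A minimal sketch of what that could look like, assuming an Nginx front end and the example hostnames from the question:

```nginx
# On the server handling www.confusedseo.com
location /blog/ {
    # The trailing slash strips the /blog/ prefix, so /blog/some-post/
    # is fetched from https://blog.confusedseo.com/some-post/
    proxy_pass https://blog.confusedseo.com/;

    # Tell the blog install which host was actually requested; adjust these
    # if you later change the WordPress site URL to the /blog/ address.
    proxy_set_header Host blog.confusedseo.com;
    proxy_set_header X-Forwarded-Host www.confusedseo.com;
    proxy_set_header X-Forwarded-Proto $scheme;
}
```

WordPress itself also needs to know about its new address (site URL and permalinks), otherwise the proxied pages will keep linking back to the sub-domain.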
-
Hey, I have a question about this:
We have set up a separate Google Analytics ID and Google Search Console property for the sub-domain, and we are now looking at using a reverse proxy to serve it under a sub-directory.
So what happens to the GA tracking and the Google Search Console property in this case?
You can read my full question here:
-
Hi there,
I'm investigating the same reverse proxy solution for my eCommerce blog. Was your implementation successful?
-
Canonical will pass link juice almost exactly like 301s will, so there's no harm in going that route. Matt Cutts explains that in this video: http://www.youtube.com/watch?v=zW5UL3lzBOA
You sound like you're good to go. You've got duplicate content worked out, and you've got a plan to retain link juice (canonical).
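For reference, the canonical tag on each sub-domain page would simply point at the sub-folder copy of the same page, along these lines (using the example domain from the question):

```html
<!-- In the <head> of https://blog.confusedseo.com/some-post/ -->
<link rel="canonical" href="https://www.confusedseo.com/blog/some-post/" />
```

Yoast's plugin can output canonical tags for you, though pointing them at a different host may need a per-post setting or a filter rather than the defaults.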
-
Since the subdomain does still exist live, someone doing a reverse proxy would need to take some steps to mitigate duplicate content issues. The first would be to set up the new permalinks and rel=canonical tags via WordPress and Yoast's SEO plugin (which rocks, btw). Then you would need to do the robots.txt/GWT steps that you quoted. If there's anything else that needs doing, I am definitely all ears before I attempt this.
-
Ah! I misunderstood the bit about reverse proxying. In that case... to be perfectly honest, I'm not sure.
When you set up a reverse proxy, what happens to the sub-domain? Does it go away or does it still exist live? If it remains live, you'd end up with a duplicate content issue.
EDIT >> I found this at the source you linked to (which answers my question) -->
"The next thing you can do is add a robots.txt file to the sub-domain that stops robots from indexing it. As Reverse Proxying keeps the requested URL the /blog/ URLs will use the robots.txt from the main domain rather than the sub-domain.
The final (and most extreme) thing you can do is to register Google Webmaster Tools for the sub-domain and remove it from the index. If you are doing this, you need to do it in conjunction with robots.txt."
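A sketch of what that sub-domain robots.txt could look like; because of the proxying it only affects requests made directly to blog.confusedseo.com, while the /blog/ URLs keep using the main domain's file, exactly as the quote says:

```
# https://blog.confusedseo.com/robots.txt
User-agent: *
Disallow: /
```

One caveat worth weighing: a blanket Disallow also stops crawlers from re-crawling the sub-domain URLs and seeing any canonical tags on them, so it pairs better with the Webmaster Tools removal step than with a canonical-only approach.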
-
Thanks for your response, Philip. My research indicates that a 301 redirect on a location that is being reverse proxied would result in an infinite loop. (source) I haven't tested it to confirm, though. Is that true?
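For what it's worth, the loop happens when the redirect also fires on the proxied requests: the main domain proxies to the sub-domain, the sub-domain bounces the request back to the main domain, and round it goes. A hypothetical workaround (sketched for Apache on the blog sub-domain, and assuming the proxy sets an X-Forwarded-Host header as in the Nginx sketch further up) is to redirect only direct visits:

```apache
# .htaccess on blog.confusedseo.com - hypothetical sketch
RewriteEngine On
# Redirect only direct hits on the sub-domain; requests arriving via the
# reverse proxy carry X-Forwarded-Host and are served normally, so no loop.
RewriteCond %{HTTP_HOST} ^blog\.confusedseo\.com$ [NC]
RewriteCond %{HTTP:X-Forwarded-Host} ^$
RewriteRule ^(.*)$ https://www.confusedseo.com/blog/$1 [R=301,L]
```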
-
You need to set up 301 redirects for ALL of the pages and posts on the blog sub-domain to their new locations in the sub-folder. This is very important. Without the proper redirects in place, you will lose all value from links pointing to the blog sub-domain, plus all the history, authority, and rankings that the pages have earned.
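If the sub-domain is being retired outright rather than kept behind a reverse proxy, a catch-all mapping along these lines would cover every post and page at once (an Apache sketch, using the example hostnames again); if the sub-domain is still being proxied, the conditional version sketched a little further up avoids the redirect loop mentioned earlier:

```apache
# .htaccess on blog.confusedseo.com - send every old URL to its /blog/ twin
RewriteEngine On
RewriteCond %{HTTP_HOST} ^blog\.confusedseo\.com$ [NC]
RewriteRule ^(.*)$ https://www.confusedseo.com/blog/$1 [R=301,L]
```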
As for your reasoning to move it from a sub-domain to a sub-folder, I'm not sure you'll receive any sort of link juice boost on your root domain from doing this. Maybe someone else can prove me wrong/correct me...
Related Questions
-
Delete old blog posts after 301 redirects to new pages?
Hi Moz Community, I've recently created several new pages on my site using much of the same copy from blog posts on the same topics (we did this for design flexibility and a few other reasons). The blogs and pages aren't exactly identical, as the new pages have much more content, but I don't think there's a point to having both and I don't want to have duplicate content, so we've used 301 redirects from the old blog posts to the new pages of the same topic. My question is: can I go ahead and delete the old blog posts? (Or would there be any reasons I shouldn't delete them?) I'm guessing with the 301 redirects, all will be well in the world and I can just delete the old posts, but I wanted to triple check to make sure. Thanks so much for your feedback, I really appreciate it!
Technical SEO | | TaraLP1 -
How do I complete a reverse DNS check when completing log file analysis?
I'm doing some log file analysis and need to run a reverse DNS check to ensure that I'm analysing logs from Google and not any imposters. Is there a command I can use in the terminal to do this? If not, what's the best way to verify Googlebot? Thanks
Technical SEO | | daniel-brooks0 -
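In case it helps anyone reading this later, the usual approach is a reverse DNS lookup on the IP from the log line followed by a forward lookup to confirm it resolves back to the same IP; a rough terminal sketch (the IP below is just an example, not taken from the poster's logs):

```bash
# 1. Reverse lookup: a genuine Googlebot IP should resolve to googlebot.com or google.com
host 66.249.66.1
# expect something like: ... domain name pointer crawl-66-249-66-1.googlebot.com.

# 2. Forward lookup: the returned hostname should point back to the same IP
host crawl-66-249-66-1.googlebot.com
# expect something like: crawl-66-249-66-1.googlebot.com has address 66.249.66.1
```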
Robots.txt in subfolders and hreflang issues
A client recently rolled out their UK business to the US. They decided to deploy with 2 WordPress installations:
UK site - https://www.clientname.com/uk/ - robots.txt location: https://www.clientname.com/uk/robots.txt
US site - https://www.clientname.com/us/ - robots.txt location: https://www.clientname.com/us/robots.txt
We've had various issues with /us/ pages being indexed in Google UK, and /uk/ pages being indexed in Google US. They have the following hreflang tags across all pages: We changed the x-default page to .com 2 weeks ago (we've tried both /uk/ and /us/ previously). Search Console says there are no hreflang tags at all. Additionally, we have a robots.txt file on each site which has a link to the corresponding sitemap files, but when viewing the robots.txt tester on Search Console, each property shows the robots.txt file for https://www.clientname.com only, even though when you actually navigate to this URL (https://www.clientname.com/robots.txt) you’ll get redirected to either https://www.clientname.com/uk/robots.txt or https://www.clientname.com/us/robots.txt depending on your location. Any suggestions how we can remove UK listings from Google US and vice versa?
Technical SEO | | lauralou82 0 -
Schema for blogs
When I run a WordPress blog through the structured data testing tool I see that there is @type hentry. Is this enough for blogs etc.? Is this a result of WordPress adding in this markup? Do you recommend adding the @BlogPosting type, and if so why? What is the benefit of adding a specific type of schema? How does it help in blogging? Thanks
Technical SEO | | AL123al4 -
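For context on the terms here: hentry comes from the hAtom microformat classes that many WordPress themes print by default, which is separate from schema.org markup. A rough sketch of what an explicit BlogPosting block could look like, with placeholder values rather than anything from the question:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "mainEntityOfPage": "https://www.example.com/blog/example-post/",
  "headline": "Example post title",
  "datePublished": "2016-01-01",
  "dateModified": "2016-01-02",
  "author": { "@type": "Person", "name": "Example Author" },
  "publisher": {
    "@type": "Organization",
    "name": "Example Blog",
    "logo": { "@type": "ImageObject", "url": "https://www.example.com/logo.png" }
  }
}
</script>
```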
Does a subdomain benefit from being on a high authority domain?
I think the title sums up the question, but does a new subdomain get any ranking benefit from being on a pre-existing high-authority domain? Or does the new subdomain have to fend for itself in the SERPs?
Technical SEO | | RG_SEO0 -
Unnecessary pages getting indexed in Google for my blog
I have a blog, dapazze.com, and I have been suffering from a problem for a long time. I found out that Google has indexed hundreds of replytocom links and image attachment pages for my blog. I had to remove these pages manually using the URL removal tool. I had used "Disallow: ?replytocom" in my robots.txt, but Google disobeyed it. After that, I removed the parameter from my blog completely using the SEO by Yoast plugin. But now I see that Google has again started indexing these links even though they are no longer present on my blog (I use #comment). Google has also indexed many of my admin and plugin pages, even though they are disallowed in my robots.txt file. Have a look at my robots.txt file here: http://dapazze.com/robots.txt Please help me solve this problem permanently.
Technical SEO | | rahulchowdhury0 -
Blogs are best when hosted on domain, subdomain, or...?
I’ve heard that it is a best practice to host your blog within your site. I’ve also heard it’s best to put it on a subdomain. What do you believe is the best home for your blog, and why?
Technical SEO | | vernonmack0 -
Exact match subdomains
Hi, I have seen significant SEO benefits from owning exact match domains and was wondering whether exact match subdomains offer the same (or some of the same) benefits? e.g. halloweencostumes.co.uk vs. halloween [dot] costumes.co.uk. Many thanks.
Technical SEO | | martyc0