Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
How to combine 2 pages (same domain) that rank for same keyword?
-
Hi Mozzers,
A quick question. In the last few months I have noticed that for a number of keywords I am having 2 different pages on my domain show up in the SERP. Always right next to each other (for example, position #7 and #8 or #3 and #4). So in the SERP it looks something like:
1) www.mycompetition1.com
2) www.mycompetition2.com
3) www.mywebsite.com/page1.html
4) www.mywebsite.com/page2.html
5) www.mycompetition3.com
Now, I actually need both pages since the content on both pages is different - but on the same topic. Both pages have links to them, but page1.html always tends to have more. So, what is the best practice to tell Google that I only want 1 page to rank? Of course, the idea is that by combining the SEO Juice of both pages, I can push my way up to position 2 or 1.
Does anybody have any experience in this? Any advice is much appreciated.
-
Hi there,
Realistically, the rel="canonical" tag should be used for duplicates, yes. How "duplicated" a page is, is subjective: a page with 50% of the same content as another page is probably going to count as duplicated as far as Google is concerned... exactly where Google draws the line on acceptable duplication isn't something any of us really knows.
For pages where the content is totally different besides the header and footer, you technically shouldn't use canonicalisation. However, experiments have shown that Google honours the tag, even if the pages aren't duplicates. Dr. Pete did an experiment when the tag came out (admittedly a few years ago) where he showed that you could radically reduce the number of pages Google had indexed for a site by canonicalising everything to the home page. I personally had a client do this by accident a couple of years ago, and sure enough, their number of indexed pages dropped very quickly, along with all the rankings those pages had. As an ecommerce site that was ranking for clothing terms, this was very very bad. It took about six weeks to get those rankings back again after we fixed the tags, and the tags were fixed within about five days (should have been quicker but our urgent request went into a dev queue).
So the answer would be that Google seems to honour the tag no matter the content of the pages, but I am pretty sure that if you asked a Googler, they'd tell you that it should only be used for dupes or near-dupes.
-
Hi Jane,
Thanks for the advice. One question: I was under the impression that the rel="canonical" tag was for two pages with the same content, to let Google know that the page it points to is the original and should be the one to rank. Do you have any experience using it between two pages that have totally different content (minus the header and footer)?
Thanks again.
-
If you are happy for the second page to still exist but not rank, you should use the canonical tag to point the second page to the first one. This will lend the first page the majority of the strength of the second page and perhaps improve its authority and ranking as a result. However, the second page will no longer be indexed because the canonical tag tells Google: "ignore this page over here; it should be considered the same as the canonical version, here."
Again, this can benefit the first page, but it does mean that the second page will no longer rank at all. Only do this if you are okay with that scenario.
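For anyone who wants to see what that looks like in markup, here is a minimal sketch using the placeholder paths from the question: the tag goes in the `<head>` of the page you do NOT want to rank, pointing at the page you do.

```html
<!-- In the <head> of www.mywebsite.com/page2.html (the secondary page) -->
<link rel="canonical" href="https://www.mywebsite.com/page1.html" />
```

Use an absolute URL and only one canonical tag per page; conflicting or duplicated canonicals tend to get ignored, in which case Google picks its own canonical.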
Cheers,
Jane
-
I'm afraid that there isn't a perfect solution, but there are various options to consider.
1) The only way to truly "combine the SEO juice of both pages" is to 301 redirect one of the pages to the other (and add the content from the old page to the remaining one). However, this means that the second page will no longer exist for your website visitors, whether they arrive from organic search or not.
2) You can use a rel="canonical" tag pointing from the secondary page to the preferred one to encourage Google to list only the preferred page in search results. In addition, you could use the robots.txt file or a noindex meta tag to keep the page out of search results (the meta tag is the preferred option, since robots.txt only blocks crawling, not indexing). However, this will not "combine the SEO juice."
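As a concrete sketch of option 1, assuming an Apache server where redirects live in an .htaccess file (other servers have their own redirect syntax), the redirect could look like this, again using the placeholder paths from the question:

```apache
# .htaccess: permanently redirect the secondary page to the preferred one.
# Visitors and link equity pointed at /page2.html are sent to /page1.html.
Redirect 301 /page2.html https://www.mywebsite.com/page1.html
```

For option 2, the equivalent meta tag would be `<meta name="robots" content="noindex">` placed in the `<head>` of the secondary page.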
Assuming that it is crucial that the second page still exists on your website, I would probably not do anything. You appear twice on the first page of results -- great! Why mess with that? I would just focus on following SEO best practices and earning more links to those two pages to push them both higher over time. (Of course, if I knew your exact situation, I might have additional suggestions.)