Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
How does having multiple pages on similar topics affect SEO?
-
Hey everyone,
On our site we have multiple pages with similar content. As an example, we have a section on Cars (in general) and then specific pages for Used Cars, European Cars, Remodeled Cars, etc. Much of the content is similar on these pages; the only real differences are some of the content and the additional term in the URL (for example, cars.com/remodeled-cars and /european-cars).
In the past few months, we've noticed a dip in our organic rankings and started doing research. We also noticed that in SERPs, Google shows the general page (cars.com/cars) and not the specific page (/european-cars), even when the specific page has more content. Can having multiple pages with similar content hurt SEO? If so, what is the best way to remedy this? We could consolidate some of the pages and make the differences between them a little clearer, but does that make much of a difference for rankings?
Thanks in advance!
-
Makes a lot of sense, thank you.
-
Some great points there, Devanur. There is also the option of a canonical tag; the trade-off is that you would have fewer pages indexed, but that one page (the original) would be stronger.
Duplicate content can hurt you, but on the other side of that, Matt Cutts has mentioned a few times that it won't hurt you unless it's spammy, and you can also normally get away with boilerplate terms. For a good night's sleep, though, it's easier just to fix it and know it's one less thing to worry about.
Good luck.
-
Hi,
Having multiple pages with similar or identical content confuses the search engines, and the outcomes in SERPs will be undesirable. Here is the deal: no two unique URLs should serve substantially similar or identical content. If that is the case, you should decide which URL you would like to rank in the search engines and make the others point to it via the rel=canonical link element. In general, the page that targets the most-searched keyword/phrase can be made the canonical, or preferred, URL.
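For example, if cars.com/cars were chosen as the preferred URL, each of the similar pages (such as /european-cars) would declare it in its head section. A minimal sketch, assuming https and the cars.com URLs from your question:

<head>
  <!-- Tells search engines that cars.com/cars is the preferred version of this content -->
  <link rel="canonical" href="https://cars.com/cars" />
</head>

Keep in mind that rel=canonical is a hint rather than a directive, and Google may ignore it if the pages differ too much.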
If I were you, though, I would add unique content to the existing pages, targeting the main keyword for each page. For example, if the page talks about 'used cars', that would become my target term for the page.
I would also go ahead with thorough keyword research and analysis to find which keywords/phrases are searched for most in your geo-location or target market, and add corresponding pages with highly targeted content for each of these keywords/phrases (if not added already).
The key here is content that is unique, up to date, highly relevant, and useful to visitors. Such content brings dramatic improvements to your overall SEO ROI; search engines like Google love it, and those pages will be awarded good positions in the SERPs going forward. As you know, high-quality content is a natural link magnet.
Here is the action plan I would follow if I were you:
1. Make these pages unique by adding unique content.
2. Do thorough keyword research and analysis to find new content opportunities in your niche.
3. Add new pages with unique content based on the outcome of step 2.
4. Update the sitemap.xml file and submit it to webmaster tools (see the sketch after this list).
5. Repeat steps 2 to 4 once every 6 months based on the results, or as and when required.
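For step 4, a minimal sitemap.xml sketch, reusing the hypothetical cars.com pages from the question; note that only the canonical version of each URL should be listed:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per canonical page -->
  <url>
    <loc>https://cars.com/cars</loc>
  </url>
  <url>
    <loc>https://cars.com/used-cars</loc>
  </url>
</urlset>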
Those were my two cents, my friend. Good luck.
Best regards,
Devanur Rafi
Related Questions
-
If a page ranks in the wrong country and is redirected, does that problem pass to the new page?
Hi guys, I'm having a weird problem. A new multilingual site was launched about 2 months ago. It has correct hreflang tags and geo-targeting in GSC for every language version. We redirected some relevant pages (with good PA) from another website of our client's, and it turned out that those pages were not ranking in the correct country markets (for example, the en-gb page ranking in the USA). The pages from our site seem to have the same problem. Do you think they inherited it due to the redirects? Is it possible that Google will sort things out over time, given that the new pages have correct hreflangs? Is there anything we can do to help them rank in the correct country markets?
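For reference, a reciprocal hreflang set for one page might look like the sketch below; example.com and the paths are placeholders, not the actual URLs. Every language version must carry the full set, including a reference to itself:

<!-- Placed in the <head> of every language version of this page -->
<link rel="alternate" hreflang="en-gb" href="https://example.com/en-gb/page" />
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/page" />
<link rel="alternate" hreflang="x-default" href="https://example.com/page" />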
Intermediate & Advanced SEO | ParisChildress1
-
SEO for multiple languages [Arabic]
Hello all, I am currently managing a marketplace that comes in two different languages: English and Arabic. The English website is, fortunately, doing quite well in terms of SEO performance, but the Arabic one is not. The website has two kinds of content. Static content, controlled by me, includes menu items, navigation, static pages, etc., and is properly translated between the two languages. User-uploaded content includes ads/news posted by users, which may not be translated to Arabic if they chose not to do it. Now if somebody goes to the Arabic website and checks a news item that doesn't have an Arabic translation, it will show the English title. I am assuming that serving content in a different language than the one specified in the hreflang is a straight no, right?
Intermediate & Advanced SEO | MozammilStorat0
-
Can noindexed pages accrue page authority?
My company's site has a large set of pages (tens of thousands) that have very thin or no content. They typically target a single low-competition keyword (and typically rank very well), but the pages have a very high bounce rate and are definitely hurting our domain's overall rankings via Panda (quality ranking). I'm planning on recommending we noindex these pages temporarily and reindex each page as resources allow us to fill in content. My question is whether an individual page can accrue any page authority for its target term while noindexed. We DO want to rank for all those terms, just not until we have the content to back it up. However, we're in a pretty competitive space, up against domains that have been around a lot longer and have higher domain authorities. Like I said, these pages rank well right now, even with thin content. The worry is: if we noindex them while we slowly build out content, will our competitors get the edge on those terms (with their subpar but continually available content)? Do you think Google will give us any credit for having had the page all along, just not always indexed?
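For context, the directive being proposed is a robots meta tag in the head of each thin page, removed once the content is filled in. A minimal sketch:

<!-- Ask search engines to drop this page from the index -->
<meta name="robots" content="noindex">

Note that crawlers have to be able to fetch the page to see this tag, so the pages should not also be blocked in robots.txt.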
Intermediate & Advanced SEO | THandorf0
-
Link Juice + multiple links pointing to the same page
Scenario: The website has a menu consisting of 4 links: Home | Shoes | About Us | Contact Us. Additionally, within the body content we write about various shoe types and create a link with the anchor text "Shoes" pointing to www.mydomain.co.uk/shoes. In this simple example, we have 2 instances of the same link pointing to the same URL location. We have 4 unique links. In total we have 5 on-page links.
Question: How many links would Google count as part of the link juice model? How would the link juice be weighted in terms of percentages? Would changing the anchor text in the body content to, say, "fashion shoes" have a different impact? Any other advice or best practice would be appreciated. Thanks, Mark
Intermediate & Advanced SEO | Mark_Ch
-
How does badly formatted HTML affect SEO?
Our website uses a custom-built CMS, but a fairly standard WYSIWYG text editor. I've looked at some of the code it produces, and it's not pretty. My gut feeling tells me that this extra bloat is bad for SEO. Am I right in thinking that Google doesn't look kindly upon badly formatted and bloated HTML? Thanks, James
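To illustrate the kind of bloat in question, a hypothetical before/after sketch; the first snippet is typical of WYSIWYG editor output, the second is the clean equivalent:

<!-- Typical WYSIWYG output: redundant nested spans and inline styles -->
<p><span style="font-weight: bold;"><span style="font-weight: bold;">Welcome</span></span><span>&nbsp;</span>to our shop</p>

<!-- Clean equivalent -->
<p><strong>Welcome</strong> to our shop</p>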
Intermediate & Advanced SEO | OptiBacUK
-
Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo but no real information about the image, definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like such: domain.com/community/photos/~username~/picture111111.aspx. I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this:
User-agent: googlebot
Disallow: /community/photos/
Can I disallow Googlebot specifically, rather than just using User-agent: *, which would then allow googlebot-image to pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages and getting the images picked up... Is this possible? Thanks! Leona
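For what it's worth, a minimal robots.txt sketch of one way this could look. As I understand Google's documented behavior, its crawlers follow the most specific matching user-agent group, and Googlebot-Image falls back to the Googlebot group when it has no group of its own, so an explicit group for it is needed here:

# Keep the thin photo pages out of the main web index
User-agent: Googlebot
Disallow: /community/photos/

# But let the image crawler fetch from that folder
User-agent: Googlebot-Image
Allow: /community/photos/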
Intermediate & Advanced SEO | HD_Leona0
-
Do widgets and gadgets affect SEO?
I have added a number of widgets and gadgets to my site that I suspect act like iframes. If true, do these widgets and gadgets, and the content they link to, help or hurt my site from an SEO perspective? Examples are a Facebook gadget, WordPress blidget, weather gadget, and Google Maps widget.
Intermediate & Advanced SEO | casper4340
-
Number of forum posts per topic page
When optimizing a forum topic page for SEO, would it be better to have a higher number of posts per page rather than separating the topic up into multiple pages? For example, out of the box a forum may display 15 posts per topic page; would there be any SEO benefit in changing that number to, say, 30 posts per page? I.e. more content per page and fewer unnecessary "page 2, page 3, page 4" pages, etc. Your thoughts and comments are most appreciated.
Intermediate & Advanced SEO | Peter2640