Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies.
Can too many pages hurt crawling and ranking?
-
Hi,
I work for a local yellow pages site in Belgium. Over the last few months we introduced a successful technique to boost SEO traffic: we created over 150k new pages, all targeting specific keywords and all containing unique content, with a site architecture that enables Google to find these pages through crawling, XML sitemaps, and so on. All signals (traffic, indexation of XML sitemaps, rankings, ...) are positive. So far so good.
We are able to quickly build more unique pages, and I wonder how Google will react to this kind of "large-scale operation": can it hurt crawling and ranking if Google notices big volumes of (unique) content?
Please advise.
-
Hi,
I don't believe having too many pages will hurt crawling and ranking. In fact, having a lot of pages gives crawl bots more pages to crawl, and when someone searches for keywords related to your pages, your pages might show up.
The only two problems I see from having a lot of pages are:
-
With all these pages, are they all unique? With that many pages, it will be hard to manage them and keep track of whether all of them are unique. If you don't have unique pages and have a lot of duplicates, that will hurt your rankings.
-
The second problem is interlinking: are you interlinking all your pages, and can the bot crawl all of them? You will need a good internal linking system that directs bots to the different pages so they can be crawled. As mentioned above, a lot of pages will be difficult to manage. One solution is submitting a sitemap, but I'm not sure Google will index everything: I had a problem with Google indexing only 4% of my sitemap and still can't find a solution.
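One concrete detail worth checking at your scale: the sitemaps.org protocol limits a single sitemap file to 50,000 URLs, so a 150k-page site needs multiple sitemap files tied together by a sitemap index. A minimal sketch of that split (the `example.com` URLs and file names are placeholders, not your actual site):

```python
# Sketch: split a large URL set into <=50,000-URL sitemap files plus a
# sitemap index, per the sitemaps.org protocol limit. All URLs below are
# hypothetical placeholders.
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS_PER_SITEMAP = 50_000  # protocol cap per sitemap file

def build_sitemap(urls):
    """Render one <urlset> sitemap for up to 50,000 URLs."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return (f'<?xml version="1.0" encoding="UTF-8"?>\n'
            f'<urlset xmlns="{SITEMAP_NS}">\n{entries}\n</urlset>')

def build_sitemap_index(sitemap_urls):
    """Render the <sitemapindex> that points at each sitemap file."""
    entries = "\n".join(f"  <sitemap><loc>{escape(u)}</loc></sitemap>"
                        for u in sitemap_urls)
    return (f'<?xml version="1.0" encoding="UTF-8"?>\n'
            f'<sitemapindex xmlns="{SITEMAP_NS}">\n{entries}\n</sitemapindex>')

def chunk(seq, size):
    """Yield consecutive slices of `seq` of at most `size` items."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

# 150k placeholder URLs -> three sitemap files plus one index.
all_urls = [f"https://www.example.com/page-{i}" for i in range(150_000)]
sitemap_files = [build_sitemap(part)
                 for part in chunk(all_urls, MAX_URLS_PER_SITEMAP)]
index = build_sitemap_index(
    [f"https://www.example.com/sitemap-{n}.xml"
     for n in range(len(sitemap_files))])
print(len(sitemap_files))  # 3
```

That at least rules out a hard protocol limit as the reason only part of a sitemap gets indexed; past that, low indexation is usually about page quality and internal linking rather than the sitemap itself.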
Hope this helps!
-
This is really just speculation...
It sounds like you're solid on the on-page, site-architecture side. I would assume, though, that crawling and indexation will slow down if your off-site signals don't keep up. By this I mean that Google might see that you're doing everything right on your end, but that over time you're not creating content that many people care to link to, share, etc., so they'll stop spending crawl resources on you.