Silo vs breadcrumbs in 2015
-
Hi, I've heard silos mentioned in the past as a way to help with rankings. Does this still apply?
And what about breadcrumbs? Do I use them with the silo technique, or instead of it? Which do you think is better, or should I stop using both given the recent Google updates?
-
Great, thanks - I'll give that a go.
-
It's been a while since I've used WordPress, but if you use posts (or posts and pages), you will have a major silo and duplicate-content problem with blog category pages.
The way to solve this is to go to the section where you set up your post categories and set the slug to be identical to your category page's slug. For example, if you have a category landing page with the slug "blue-widgets", set the post category slug to "blue-widgets" as well. This makes the category page the parent for posts in that category.
You will also need to make some adjustments to remove "/category/" from your URLs. I've done it, and it's pretty easy. Maybe another poster can give you the specifics.
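For reference, here is a minimal sketch of one common approach (hypothetical and untested here; it assumes a standard WordPress install, and it only rewrites the links WordPress generates - you would still need matching rewrite rules, or a category-base-removal plugin, so the shorter URLs actually resolve):

```php
<?php
// Sketch only (assumed approach): strip "/category/" from the category
// links WordPress generates, so a post category like "blue-widgets"
// shares its URL path with the matching landing page.
add_filter('term_link', function ($url, $term, $taxonomy) {
    if ('category' === $taxonomy) {
        $url = str_replace('/category/', '/', $url);
    }
    return $url;
}, 10, 3);
```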
-
Great, thanks - very informative reply. I've started using WordPress for most of my sites now. Is siloing easy enough to do in WordPress?
-
Silos will always work. It's not some trick - it's how Google works. Here's a very simplified explanation as to why...
Let's say that I have an eCommerce site and I sell lawnmowers and plywood. Let's also say that the Lawnmowers category page has a theoretical 100 points of link juice, and that the site sells two lawnmowers - the Fubar 2000 and the Toecutter 300. If the Lawnmowers category page only links to the Fubar 2000 and the Toecutter 300 pages, it will push 45 points of link juice to each page (pages can pass on roughly 90% of their link juice, and 90/2 = 45).
Both pages will receive almost the full 45-point benefit because they are relevant to the category page.
If the Lawnmowers category page instead had only one link, to the Plywood page, it would push all 90 points of link juice to the Plywood page. But the Plywood page would not receive the full benefit of those 90 points, because lawnmowers and plywood don't share much relevance. In this case, Google would heavily discount the 90 points, so the Plywood page might only get the benefit of 30. Think of it as a leaky hose.
What happens to the other 60 Points of Link Juice? It gets dumped on the floor, and the site loses the ranking power of those 60 points.
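Here is that arithmetic as a toy model (a sketch only - the 90% pass-through and the relevance discount are this answer's made-up numbers, not anything Google has published):

```php
<?php
// Toy link-juice model using the numbers above; every constant is assumed.
function juice_passed(float $points, int $outlinks, float $relevance): float
{
    $passThrough = 0.90; // pages pass on roughly 90% of their juice (assumed)
    return ($points * $passThrough / $outlinks) * $relevance;
}

echo juice_passed(100, 2, 1.0) . "\n";  // 45 - to each relevant mower page
echo juice_passed(100, 1, 0.33) . "\n"; // ~30 - Plywood keeps a third; ~60 lost
```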
Keep in mind that this is all theoretical, and that link juice comes in different flavors - apple, orange, and prune - representing the different ranking factors (Trust, Authority, Topical Authority, Social Signals, etc.). Orange might discount 90% while prune might only discount 10%. So is there really a 67% link juice hit? Damned if I know, but I had to pick a number. I do know that link juice loss between pages that aren't relevant is dramatic, and that it is very possible to determine how your internal pages rank based on your internal link structure and link placement on the page.
By siloing a website, I have seen rankings jump dramatically. Most websites hemorrhage link juice - think of siloing as link juice reclamation. The tighter you build your silos, the less link juice gets dumped on the floor. By reclaiming the spilled link juice and putting it in the right places, you can dramatically increase your rankings. BTW, inbound links work in a similar fashion: if the Lawnmower page were on an external site and linked to the Plywood page, the same discounts would apply. That's why it pays to get niche-relevant backlinks for maximum benefit.
None of this accounts for usability, and linking between silos can make sense to benefit end users. Again, this model is probably oversimplified and doesn't take Block Level Analysis into account, but the logic is sound. You can build spreadsheet models for link juice distribution factoring in block-level analysis, discounts, etc. They're by no means accurate, but they can give you a pretty good idea of where your link juice is going. You can model this on the old (and increasingly irrelevant) PageRank algorithm. PageRank is logarithmic: it takes 8-9x as much link juice to move up a PR level. If it takes 100 points of link juice to become a PR1, it takes 800-900 points to become a PR2. Generally speaking, a PR2 page, via links, can create roughly 7 to 75 PR1 pages, depending on how close the PR2 is to becoming a PR3.
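The logarithmic part, using the same back-of-the-envelope numbers (the 100-point base and the ~8.5x step per level are this answer's guesses, not published figures):

```php
<?php
// Rough PR thresholds, assuming PR1 = 100 points and ~8.5x per level (guessed).
function pr_threshold(int $pr): float
{
    return 100 * pow(8.5, $pr - 1);
}

for ($pr = 1; $pr <= 4; $pr++) {
    printf("PR%d needs roughly %.0f points of link juice\n", $pr, pr_threshold($pr));
}
// PR1 ~100, PR2 ~850, PR3 ~7,225, PR4 ~61,413
```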
-
Using both is the way to go. Silos essentially structure your pages so that, for each topic, there is one master article and multiple supporting articles that link back to it. Pages within a topic link only to pages relevant to that topic, not to other sections of the site.
You can use breadcrumbs in conjunction with a silo, as the structure is well suited to them.
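If you do use breadcrumbs, marking them up as structured data helps search engines read the silo path. A minimal WordPress sketch (hypothetical and untested; assumes each post sits in a single category) that prints a schema.org BreadcrumbList as JSON-LD on single posts:

```php
<?php
// Sketch: emit a two-level BreadcrumbList (category > post) on single posts.
add_action('wp_head', function () {
    if (!is_single()) {
        return;
    }
    $cats = get_the_category(); // first assigned category becomes the crumb
    if (empty($cats)) {
        return;
    }
    $breadcrumbs = array(
        '@context'        => 'https://schema.org',
        '@type'           => 'BreadcrumbList',
        'itemListElement' => array(
            array('@type' => 'ListItem', 'position' => 1,
                  'name' => $cats[0]->name, 'item' => get_category_link($cats[0])),
            array('@type' => 'ListItem', 'position' => 2,
                  'name' => get_the_title(), 'item' => get_permalink()),
        ),
    );
    echo '<script type="application/ld+json">'
        . wp_json_encode($breadcrumbs) . '</script>' . "\n";
});
```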
Related Questions
-
Click To Reveal vs Rollover Navigation Better For Organic?
Hi, any thoughts, data, or insights as to which is better in a top navigation: click-to-reveal nav links, or rollover-to-reveal? Regular content in an accordion (click to reveal) is evidently not best practice. Does that apply to navigation as well? Thanks! Best... Mike
Intermediate & Advanced SEO | 945010 -
Google Indexing Of Pages As HTTPS vs HTTP
We recently updated our site to be mobile optimized. As part of the update, we had also planned on adding SSL security to the site. However, on a lot of our site pages we use an iframe from a third-party vendor for real estate listings, and that iframe was not SSL-friendly (the vendor does not have a solution yet), so those iframes weren't displaying their content. As a result, we had to shift gears and go back to plain http instead of the https we were hoping for. However, Google seems to have indexed a lot of our pages as https, which gives a security error to any visitors. The new site was launched about a week ago, and there was code in the htaccess file that was pushing to www and https. I have fixed the htaccess file to no longer force https. My question is: will Google "reindex" the site once it recognizes the new htaccess commands in the next couple of weeks?
Intermediate & Advanced SEO | vikasnwu1 -
How To Implement Breadcrumbs
Hi, I'm looking to implement breadcrumbs for an e-commerce store so they will appear in the SERP results, like in the attached image. In terms of implementing them on a site, would you simply add HTML to each page like this Google example, which looks like this: Books › Science Fiction › Award Winners? Then is there anything you need to do to get this showing in the SERP results, e.g. something in Search Console? Or do you just wait until Google has crawled the pages and hopefully starts showing them in the SERPs? Cheers. [Image: SERP results with breadcrumbs]
Intermediate & Advanced SEO | jaynamarino0 -
M.ExampleSite vs mobile.ExampleSite vs ExampleSite.com
Hi, I have a call with a potential client tomorrow where all I know is that they are wigged out about canonicalization, indexing, and architecture for their three sites: m.ExampleSite.com, mobile.ExampleSite.com, and ExampleSite.com. The sites are pretty large... 350k for the mobiles and 5 million for the main site. They're a retailer with endless products. Their main site is not mobile-responsive, which is evidently why they have the m and mobile sites. Why two, I don't know. This is how they currently handle this: [image not preserved]. What would you suggest they do about this? The most comprehensive fix would be making the main site mobile-responsive and 301ing the old mobile subdomains to the main site. That's probably too much work for them. So, what more would you suggest and why? Your thoughts? Best... Mike P.S. Beneath my hand-drawn portrait avatar above it says "Staff" at this moment, which I am not. Some kind of bug, I guess.
Intermediate & Advanced SEO | 945010 -
The Great Subdomain vs. Subfolder Debate, what is the best answer?
Recently one of my clients was hesitant to move their new store locator pages to a subdomain. They have some SEO knowledge and cited the Whiteboard Friday article at https://a-moz.groupbuyseo.org/blog/subdomains-vs-subfolders-rel-canonical-vs-301-how-to-structure-links-optimally-for-seo-whiteboard-friday. While it is very possible that Rand Fishkin has a valid point, I felt hesitant to let this be the final verdict. John Mueller from Google Webmaster Central claims that Google is indifferent towards subdomains vs. subfolders: https://www.youtube.com/watch?v=9h1t5fs5VcI#t=50. Also, this SEO disagreed with Rand Fishkin's post about using subfolders instead of subdomains. He claims that Rand Fishkin ran only 3 experiments over 2 years, while he has tested multiple subdomain vs. subfolder experiments over 10 years and observed no difference: http://www.seo-theory.com/2015/02/06/subdomains-vs-subfolders-what-are-the-facts-on-rankings/. Here is another post, from Website Magazine. They too believe that there are no SEO benefits to a subdomain vs. subfolder infrastructure; proper SEO and infrastructure is what matters most: http://www.websitemagazine.com/content/blogs/posts/archive/2015/03/10/seo-inquiry-subdomains-subdirectories.aspx. Again, Rand might be right, but I'd rather provide a recommendation to my client based on an authoritative source such as a Google engineer like John Mueller. Does anybody else have any thoughts and/or insight about this?
Intermediate & Advanced SEO | RosemaryB3 -
When to Use Schema vs. Facebook Open Graph?
I have a client who for regulatory reasons cannot engage in any social media: no Twitter, Facebook, or Google+ accounts. No social sharing buttons allowed on the site. The industry is medical devices. We are in the process of redesigning their site, and would like to include structured markup wherever possible. For example, there are lots of schema types under MedicalEntity: http://schema.org/MedicalEntity Given their lack of social media (and no plans to ever use it), does it make sense to incorporate OG tags at all? Or should we stick exclusively to the schemas documented on schema.org?
Intermediate & Advanced SEO | Allie_Williams0 -
Removing Content 301 vs 410 question
Hello, I was hoping to get the SEOmoz community's advice on how to remove content most effectively from a large website. I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware). Kerry22 mentioned a process in which 410s would be totally visible to Googlebot so that it would easily recognize the removal of content. The conversation implied that it is not just important to remove the content, but also to give Google the ability to recrawl that content to confirm it was indeed removed (as opposed to just recrawling the site and not finding the content anywhere). This made a lot of sense to me and also struck a personal chord. Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience. When we cut pages, though, we used a different approach, doing all of the steps below:
1. We cut the pages.
2. We set up permanent 301 redirects for all of them immediately.
3. At the same time, we removed from our site all links pointing to these pages (to make sure users didn't stumble upon the removed pages).
When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 401, but this is probably a moot point since we gave them 301 redirects every time anyway. We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way. I see that this is basically the exact opposite of Dr. Pete's advice and the opposite of what Kerry22 used in order to get a recovery, and meanwhile here we are still trying to help our site recover. We feel that our site should no longer be under the shadow of Panda. So here is what I'm wondering, and I'd be very appreciative of advice or answers to the following questions:
1. Is it possible that Google still thinks we have this content on our site, and we continue to suffer from Panda because of this? Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point because Google would have figured out we removed it (albeit not in a preferred way)?
2. If there's a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage we did?
Thank you in advance for your help,
Eric
Intermediate & Advanced SEO | Eric_R -
Multiple stores & domains vs. One unified store (SEO pros / cons for E-Commerce)
Our company runs a number of individual online shops, specialised in particular products but all in the same genre of goods overall, with a specific and relevant domain name for each shop. At the moment the sites are separate and not interlinked, i.e. completely separate brands. An analogy could be something like clothing accessories (we are not in the clothing business): scarves.com and silkties.com (our field is more niche than this). We are about to launch a related site (e.g. handbags.com), in the same field again but without precisely overlapping products. We will produce this site on a newer, more flexible e-commerce platform, so now is a good time to consider whether we want to bring all our sites together on one e-commerce system on the back end. Essentially, we need to know the pros and cons of the various options facing us and how SEO rankings are affected by the three possibilities.
Option 1: continue with separate sites, each with its own domain.
Option 2: have multiple sites, each on its own domain, but on the same e-commerce system and visibly linked together for the customer (with unified checkout) - at the top of each site could be a menu bar linking to each site: [Scarves.com] - [SilkTies.com] - [Handbags.com]. The main question here is whether the multiple domains are mutually beneficial, particularly considering how close to target keywords the individual domains are. If mutually beneficial, how does it compare to option 3?
Option 3: having recently acquired a domain name (e.g. accessories.com) which would cover the whole category together, we are presented with a third option: making one site selling all of these products in different categories. Our main concern here would be losing the ability to target marketing specifically, and losing the benefit of having domains with the keywords people are more likely to search for (e.g. 'silk tie') rather than 'accessories'. Is it worth taking the hit of losing these specifically targeted domain names for the advantage of increased combined inbound links?
Intermediate & Advanced SEO | Colage0