Site structure: Any issues with 404'd parent folders?
-
Is there any issue with a 404'd parent folder in a URL? There are no links to the parent folder, and a parent folder page never existed. For example, say I have the following pages with content:
/famous-dogs/lassie/
/famous-dogs/snoopy/
/famous-dogs/scooby-doo/

But I never created (and may never plan to create) a general **/famous-dogs/** page. My sitemap.xml does not link to it, nor does any page on my site.
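For concreteness, the sitemap in this setup would list only the child URLs, with no entry for the parent folder - a minimal sketch using a placeholder domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only the child pages are listed; /famous-dogs/ itself has no entry. -->
  <url><loc>https://www.example.com/famous-dogs/lassie/</loc></url>
  <url><loc>https://www.example.com/famous-dogs/snoopy/</loc></url>
  <url><loc>https://www.example.com/famous-dogs/scooby-doo/</loc></url>
</urlset>
```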
Are there any concerns with doing this? Am I missing out on any value that might otherwise pass to the parent folder?
-
Yeah - there is plenty of speculation about how signals or authority traverse folder structures (see, for example, this Whiteboard Friday), but I haven't seen anything suggesting the effect is permanent. All of this may be an argument for adding /famous-dogs/ at some point, but I wouldn't personally stress about it not being there at launch.
-
Yeah, I'd just leave it as a 404 in that case.
-
In my scenario, considering I might add a parent "famous dogs" page at some point, it'd probably be best to leave robots.txt alone, right?
-
Thanks for the response. This is what I expected.
I swear I read somewhere that Google may pass some form of value from a child to a parent, i.e. "/famous-dogs/lassie/" could pass some value to "/famous-dogs/" even in the absence of any links. I can't find the source, but I suppose I'm a bit worried that I'd permanently lose out on some value if the parent doesn't exist initially, considering I may add a "famous dogs" parent page at some point.
-
PS - if you're worried about the crawling, you could always block it in robots.txt if you really wanted to (but unless it's a huge site, I wouldn't bother). Note - if you do go this route, do it carefully so as not to block all the contents of the folder at the same time!
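For illustration only, a sketch of what that careful rule could look like, using the example folder from the question - the $ end-of-URL anchor is supported by Google, though not necessarily by every crawler:

```
User-agent: *
# Blocks only the bare folder URL; the trailing $ anchors the match to the
# end of the URL, so /famous-dogs/lassie/ and the other child pages stay crawlable.
Disallow: /famous-dogs/$
```

Leaving out the $ would block everything under /famous-dogs/, which is exactly the mistake to avoid.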
-
The short answer is that there should be no harm in going with your proposed approach.
Longer version: I believe there are cases where Google has tried to crawl a directory like "/famous-dogs/" in your example purely because it appears as a sub-folder in the paths of other pages, even though there are no direct links to it. But even if it does crawl it, if you don't have (or intend to have) a page there, a 404 is a perfectly valid response.
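If you want to see whether that is happening on your own site, one option is to look through the server access logs for Googlebot requests to the bare folder path. A rough sketch in Python - the log location and combined log format are assumptions, and for certainty you would verify genuine Googlebot hits by reverse DNS:

```python
import re

# Hypothetical access-log location; adjust for your server setup.
LOG_PATH = "/var/log/nginx/access.log"

# Matches requests for exactly /famous-dogs/ where the user-agent string
# claims to be Googlebot (combined log format assumed).
pattern = re.compile(r'"(?:GET|HEAD) /famous-dogs/ HTTP/[^"]*".*Googlebot', re.IGNORECASE)

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    hits = [line.rstrip() for line in log if pattern.search(line)]

print(f"Googlebot requests to /famous-dogs/: {len(hits)}")
for line in hits[:5]:  # show a small sample
    print(line)
```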
In general, while there could be a case for creating a "/famous-dogs/" page if there is search demand you can fulfil, until or unless you do, there is no harm in it returning a 404 response.
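One thing worth confirming is that the server really does answer the missing parent URL with a 404 status rather than a "soft 404" (a 200 response carrying error-page content). A minimal check using Python's standard library, with a placeholder domain:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def status_of(url: str) -> int:
    """Return the HTTP status code for url."""
    try:
        return urlopen(Request(url, method="HEAD")).status
    except HTTPError as err:
        return err.code  # 4xx/5xx responses are raised as HTTPError

# Placeholder domain for illustration; swap in your own.
for path in ("/famous-dogs/", "/famous-dogs/lassie/"):
    print(path, status_of("https://www.example.com" + path))
```

A genuine 404 (or 410) is the outcome you want here; a 200 with error content would likely be reported as a soft 404.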
-
Seems odd that indexers would care whether a parent directory page exists or not. Is there any proof that Google will attempt to crawl parent folder pages that aren't in sitemap.xml and aren't linked to anywhere else?
Perhaps I'm slowly building out my site. Depending on the material/approach, it might make sense to release a page about a sub-category (lassie) before releasing content about a parent category (famous dogs). Or maybe "famous dogs" has such low search volume that it doesn't make sense to spend time creating a parent "famous dogs" page.
If I'm understanding correctly, with the above you're effectively telling me to:
1. Build a parent category page. If I don't plan on investing much time/effort into the parent page, noindex it.
2. Reorganize my site folder structure.
Neither seems like a great option.
Related Questions
-
Re-add/reindex a page that was 410'd
A script of ours had an error that caused some pages we didn't want 410'd to be 410'd. We caught it in about 12 hours, but for some pages it was too late. My question is: will those pages be reindexed, and how will that affect their rankings - will they eventually be back where they were? Would submitting a sitemap with them help, or what would be the best way to correct this error (submit the links to Google's indexer, maybe)?
Intermediate & Advanced SEO | Wana-Ryd
-
Breaking up a site into multiple sites
Hi, I am working on a plan to divide up a mid-level-DA website into multiple sites, so the current site's content will be divided up among these new sites. We can't share anything going forward because each site will be independent. The current homepage will change to just link out to the new sites and have minimal content. I am thinking the websites will take a hit in rankings, but I don't know how much or how long the drop will last. I know if you redirect an entire domain to a new domain the impact is negligible, but in this case I'm only redirecting parts of a site to a new domain. Say we rank #1 for "blue widget" on the current site, and that page is going to be redirected to a new site and new domain. How much of a drop can we expect? How hard will it be to rank for other new keywords, say "purple widget", that we don't have now? How much link juice can I expect to pass from the current website to the new websites? Thank you in advance.
Intermediate & Advanced SEO | timdavis
-
Changed all external links to 'NoFollow' to fix manual action penalty. How do we get back?
I have a blog that received a Webmaster Tools message about a guidelines violation because of "unnatural outbound links" back in August. We added a plugin to make all external links 'NoFollow' links, and Google removed the penalty fairly quickly. My question: how do we start changing links back to 'follow' again, or at least add 'follow' links in posts going forward? I'm confused by the penalty because the blog has literally never done anything SEO-related; they have done everything via social and email. I only started working with them recently to help with their organic presence. We don't want them to hurt themselves at all, but 'follow' links are more NATURAL than having everything as 'NoFollow' links, and it helps with their own SEO to have clean external 'follow' links. Not sure if there is a perfect answer to this question because it is Google we're dealing with here, but I'm hoping someone else has some tips that I may not have thought about. Thanks!
Intermediate & Advanced SEO | HashtagJeff
-
Ranking 1st for a keyword - but when 's' is added to the end we are ranking on the second page
Hi everyone - hope you are well. I can't get my head around why we are ranking 1st for a specific keyword, but when an 's' is added to the end of the keyword we are ranking on the second page. What could be the cause of this? I thought that Google would class both of the keywords the same. In this case, let's say the keyword was 'button': we are ranking 1st for 'button', but for 'buttons' we are ranking on the second page. Any ideas? I appreciate every comment.
Intermediate & Advanced SEO | Brett-S
-
My site shows a 503 error to Googlebot, but I can see the site fine. Not indexing in Google. Help
Hi, this site is not indexed on Google at all: http://www.thethreehorseshoespub.co.uk. Looking into it, it seems to be giving a 503 error to Googlebot. I can see the site; I have checked the source code and checked robots.txt; I did have a sitemap param but removed it for testing; GWMT is showing 'unreachable' if I submit a sitemap or fetch. Any ideas on how to remove this error? Many thanks in advance
Intermediate & Advanced SEO | SolveWebMedia
-
Duplicate Content through 'Gclid'
Hello, we've had the known problem of duplicate content through the gclid parameter caused by Google AdWords. As per Google's recommendation, we added the canonical tag to every page on our site, so when the bot came to each page it would go 'Ah-ha, this is the original page'. We also added the parameter to the URL parameters in Google Webmaster Tools. However, it now seems as though a canonical is automatically being given to these newly created gclid pages; see below: https://www.google.com.au/search?espv=2&q=site%3Awww.mypetwarehouse.com.au+inurl%3Agclid&oq=site%3A&gs_l=serp.3.0.35i39l2j0i67l4j0i10j0i67j0j0i131.58677.61871.0.63823.11.8.3.0.0.0.208.930.0j3j2.5.0....0...1c.1.64.serp..8.3.419.nUJod6dYZmI Therefore these new pages are now being indexed, causing duplicate content. Does anyone have any idea what to do in this situation? Thanks, Stephen.
Intermediate & Advanced SEO | MyPetWarehouse
-
What's the best possible URL structure for a local search engine?
Hi Mozzers, I'm working at AskMe.com, which is a local search engine in India, i.e. if you're standing somewhere and looking for the pizza joints nearby, we pick up your current location and share the list of pizza outlets nearby along with ratings, reviews, etc. about these outlets. Right now, our URL structure looks like www.askme.com/delhi/pizza-outlets for the city-specific category pages (here, "Delhi" is the city name and "Pizza Outlets" is the category) and www.askme.com/delhi/pizza-outlets/in/saket for a category page in a particular area (here "Saket") in a city. The URL looks a little different if you're searching for something which is not a category (or not mapped to a category, in which case we 301 redirect you to the category page); it looks like www.askme.com/delhi/search/pizza-huts/in/saket if you're searching for pizza huts in Saket, Delhi, as "pizza huts" is neither a category nor mapped to any category. We're also dealing in ads and deals, along with our very own e-commerce brand AskMeBazaar.com, to make for a better user experience and a one-stop shop for our customers. Now we're working on a URL restructure project, and my question to you all SEO rockstars is: what is the best possible URL structure we can have? Assume we have kick-ass developers who can manage any given URL structure at the backend.
Intermediate & Advanced SEO | _nitman
-
Using the WP All Import CSV import plugin for WordPress to update products daily on a large ecommerce site. Category naming and other issues.
We have just got an automated solution working to upload about 4000 products daily to our site. We get a CSV file from the wholesaler's server each day, and the way they have named products and categories is not ideal. Although most of the products remain the same (and don't need to be overwritten), some will go out of stock or prices may change, etc. The problem is we have no control over the CSV file, so we need to keep the categories they have given us. We might be able to create new categories and have products listed under multiple categories? If anyone has used WP All Import or has knowledge in this area, please let me know. I have plenty more questions, but this should start the ball rolling! Thanks in advance, mozzers
Intermediate & Advanced SEO | weebro