Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
All page files in root? Or use directories?
-
We have thousands of pages on our website: news articles, forum topics, download pages, etc. - and at present they all reside in the root of the domain, /.
For example:
/aosta-valley-i6816.html
/flight-sim-concorde-d1101.html
/what-is-best-addon-t3360.html
We are considering moving over to a new URL system where we use directories. For example, the above URLs would become:
/images/aosta-valley-i6816.html
/downloads/flight-sim-concorde-d1101.html
/forums/what-is-best-addon-t3360.html
Would we gain any benefit from using directories for SEO purposes? Could our current system, with so many files in the root, be flagged as spammy? Would it be even better to use the following system, which removes file endings completely and suggests each page is a directory:
/images/aosta-valley/6816/
/downloads/flight-sim-concorde/1101/
/forums/what-is-best-addon/3360/
If so, which would be better: /images/aosta-valley/6816/ or /images/6816/aosta-valley/?
Just looking for some clarity to our problem!
Thank you for your help guys!
-
To my knowledge there hasn't been a definitive conclusion on this one.
The general advice as I know it is: they are equally good, so pick one and make sure the other (with slash if you choose 'without slash', or vice versa) redirects to the chosen one, to avoid duplicate content.
-
I would personally place the keywords at the end for clarity. It does seem unnatural to have the ID as the final part of the URL. Even if that costs you a tiny bit of 'keyword power', I would gladly sacrifice it in exchange for a more user-friendly URL.
Limiting the number of words in the URL does make it look slightly less spammy, but slightly less user-friendly as well. I guess this is just one of those 'weigh the pros and cons and decide for yourself' situations. Just make sure the URLs don't get ridiculously long.
-
OK, so I have taken it upon myself to now have our URLs as follows:
/news/853/free-flight-simulator/
Anything else gets 301'd to the correct URL. /news/853/free-flight-simulator would be 301'd to /news/853/free-flight-simulator/ along with /news/853/free-flight-sifsfsdfdsfmulator/ ... etc.
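As a rough sketch of that redirect logic (the lookup table and function names here are invented for illustration, not anyone's actual implementation): look the article up by its numeric ID, treat the stored slug and trailing slash as canonical, and 301 any other variant.

```python
# Hypothetical example: the article ID is the authoritative key; the slug and
# trailing slash are corrected via 301 to a single canonical URL.
ARTICLES = {853: "free-flight-simulator"}  # id -> canonical slug

def canonical_redirect(path):
    """Return (redirect_target, None) if the path needs a 301,
    or (None, path) if it is already canonical."""
    parts = [p for p in path.split("/") if p]
    if len(parts) != 3 or parts[0] != "news" or not parts[1].isdigit():
        return None, None  # not a news URL; handle elsewhere (e.g. 404)
    article_id = int(parts[1])
    slug = ARTICLES.get(article_id)
    if slug is None:
        return None, None  # unknown ID
    canonical = f"/news/{article_id}/{slug}/"
    if path == canonical:
        return None, canonical  # already canonical: serve the page
    return canonical, None  # anything else: 301 to the canonical URL

print(canonical_redirect("/news/853/free-flight-sifsfsdfdsfmulator/"))  # ('/news/853/free-flight-simulator/', None)
print(canonical_redirect("/news/853/free-flight-simulator"))            # ('/news/853/free-flight-simulator/', None)
print(canonical_redirect("/news/853/free-flight-simulator/"))           # (None, '/news/853/free-flight-simulator/')
```

Because the ID alone identifies the page, even a mangled slug still resolves to the right article, which is exactly the behaviour described above.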
-
Also, trailing slash? Or no trailing slash?
Without
/downloads/878/fsx-concorde
With
/downloads/878/fsx-concorde/
-
Dear Theo,
Thank you for your response - I found your article very interesting.
So, just to clarify - in our case, the best URL method would be:
/images/aosta-valley/6816/
/downloads/flight-sim-concorde/1101/
/forums/what-is-best-addon/3360/
This would remove the suffixes and also put the ID numbers at the end, placing the target keywords closer to the root of the URL, which makes a very slight difference...
EDIT: Upon thinking about it, I feel that the final keyword-targeted page would be more natural if it appeared at the end of the URL. For example: /images/6816/aosta-valley/ (like you have done on your blog).
Also, should I limit the number of hyphenated words in the URL? For example, on your blog you have /does-adding-a-suffix-to-my-urls-affect-my-seo/ - perhaps it would be more concentrated and less spammy as /adding-suffix-urls-affect-seo/?
Let me know your thoughts.
Thank you for your help!
-
Matt Cutts states that the number of subfolders is 'not a major factor': http://www.youtube.com/watch?v=l_A1iRY6XTM
Furthermore, a blog post I wrote about removing suffixes: http://www.finishjoomla.com/blog/5/does-adding-a-suffix-to-my-urls-affect-my-seo/
Another Matt Cutts video, regarding your separate question about keyword order: http://www.youtube.com/watch?v=gRzMhlFZz9I
Having some structure (in the form of a single subfolder) would greatly add to the usability of your website, in my opinion. If you can manage to use the correct redirects (301) from your old pages to your new ones, I don't see a clear SEO-related reason not to switch.
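As a sketch of what those old-to-new 301s could look like in code: the letter-to-section mapping below (i for images, d for downloads, t for forums) is an assumption inferred from the example URLs in the question, and the /section/id/slug/ order follows the format the poster settled on.

```python
# Hypothetical example: derive 301 targets for the migration, mapping the old
# flat root URLs (e.g. /aosta-valley-i6816.html) to the new directory structure.
import re

SECTIONS = {"i": "images", "d": "downloads", "t": "forums"}  # assumed mapping

def old_to_new(old_path):
    """Map an old root-level URL to its new directory URL, or None if unrecognised."""
    m = re.fullmatch(r"/([a-z0-9-]+)-([idt])(\d+)\.html", old_path)
    if not m:
        return None
    slug, type_letter, page_id = m.groups()
    return f"/{SECTIONS[type_letter]}/{page_id}/{slug}/"

print(old_to_new("/aosta-valley-i6816.html"))         # /images/6816/aosta-valley/
print(old_to_new("/flight-sim-concorde-d1101.html"))  # /downloads/1101/flight-sim-concorde/
```

A rule like this (whether in application code or translated into rewrite rules) means every legacy URL gets exactly one 301 hop to its new home, which is what preserves the old pages' equity through the switch.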
Related Questions
-
Is it ok to repeat a (focus) keyword used on a previous page, on a new page?
I am cataloguing the pages on our website in terms of which focus keyword has been used on each page. I've noticed that some pages repeat the same keyword/term. I've heard that this isn't really good practice, as it's like telling Google conflicting information: pages with the same keywords will be competing against each other. Is this correct? If so, is the alternative to use various long-winded keywords instead? If not, meaning it's OK to repeat the keyword on different pages, is there a maximum recommended number of times to repeat it? Still new-ish to SEO, so any help is much appreciated! V.
Intermediate & Advanced SEO | Vitzz1
-
Page rank and menus
Hi, my client has a large website with a navigation of main categories. However, they also have a hamburger-type navigation in the top right. If you click it, it opens a massive menu with every category and page visible. Do you know if having a navigation like this bleeds PageRank? If all deep pages are visible from the hamburger navigation, then PageRank is not being conserved for the main categories. If you click a main category in the main navigation (not the hamburger) you can see the sub-pages. I think this is the right structure, but the client has installed this huge menu to make it easier for people to see what there is. From a technical SEO perspective, isn't this bad?
Intermediate & Advanced SEO | AL123al0
-
URL structure - Page Path vs No Page Path
We are currently rebuilding our URL structure for ecommerce websites. We have seen a lot of sites removing the page path on product pages, e.g. https://www.theiconic.co.nz/liberty-beach-blossom-shirt-680193.html versus what would normally be https://www.theiconic.co.nz/womens-clothing-tops/liberty-beach-blossom-shirt-680193.html. Should we remove the page path on product pages to keep the URL shorter, or should we keep it? I can see that we would lose the hierarchy juice to a product page, but I'm not sure what the right thing to do is.
Intermediate & Advanced SEO | Ashcastle0
-
If a page ranks in the wrong country and is redirected, does that problem pass to the new page?
Hi guys, I'm having a weird problem: a new multilingual site was launched about 2 months ago. It has correct hreflang tags and geo-targeting in GSC for every language version. We redirected some relevant pages (with good PA) from another website of our client's. It turned out that those pages were not ranking in the correct country markets (for example, the en-gb page ranking in the USA). The pages from our site seem to have the same problem. Do you think they inherited it through the redirects? Is it possible that Google will sort things out over time, given that the new pages have correct hreflang tags? Is there anything we could do to help them rank in the correct country markets?
Intermediate & Advanced SEO | ParisChildress1
-
Can noindexed pages accrue page authority?
My company's site has a large set of pages (tens of thousands) with very thin or no content. They typically target a single low-competition keyword (and typically rank very well), but the pages have a very high bounce rate and are definitely hurting our domain's overall rankings via Panda (quality ranking). I'm planning on recommending we noindex these pages temporarily, and reindex each page as resources allow us to fill in content. My question is whether an individual page can accrue any page authority for its target term while noindexed. We DO want to rank for all those terms, just not until we have the content to back it up. However, we're in a pretty competitive space, up against domains that have been around a lot longer and have higher domain authorities. Like I said, these pages rank well right now, even with thin content. The worry is: if we noindex them while we slowly build out content, will our competitors get the edge on those terms (with their subpar but continually available content)? Do you think Google will give us any credit for having had the pages all along, just not always indexed?
Intermediate & Advanced SEO | THandorf0
-
Using hreflang tags for multi-regional targeting on the same page
Hi, I have the site au.example.com and I rank on Google Australia. I would like to rank in Google New Zealand as well, for the same page (au.example.com), because the two markets are geographically and culturally close. Can I place hreflang tags for both countries and present the same page? The code should look like: OR should I create a different page for New Zealand (for example: http://au.example.com/EN-NZ)? And the code will look like: What will work better, or is there another solution? Hope I'm clear.. Thanks!
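As a rough illustration only (assuming the existing en-AU page at au.example.com should also target New Zealand; this is not the asker's original markup), hreflang tags pointing both countries at the same page might look like:

```html
<link rel="alternate" hreflang="en-au" href="http://au.example.com/" />
<link rel="alternate" hreflang="en-nz" href="http://au.example.com/" />
```

Listing the same URL for multiple hreflang values is valid; a separate /EN-NZ page would only make sense if the content itself should differ for New Zealand visitors.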
Intermediate & Advanced SEO | Kung_fu_Panda0
-
Do you add 404 page into robot file or just add no index tag?
Hi, I've got different opinions on this, so I wanted to double-check what your take is. We've got a /404.html page, and I was wondering: would you add this page to the robots.txt file so it wouldn't be indexed, or would you just add a noindex tag? What would be the best approach? Thanks!
Intermediate & Advanced SEO | Rubix0
-
Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo, but no real information about the image - definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this:
User-agent: googlebot
Disallow: /community/photos/
Can I disallow Googlebot specifically, rather than using User-agent: *, so that Googlebot-Image can still pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages while getting the images picked up... Is this possible? Thanks! Leona
Intermediate & Advanced SEO | HD_Leona0