Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies. More details here.
All page files in root? Or to use directories?
-
We have thousands of pages on our website: news articles, forum topics, download pages, etc. - and at present they all reside in the root of the domain /.
For example:
/aosta-valley-i6816.html
/flight-sim-concorde-d1101.html
/what-is-best-addon-t3360.html
We are considering moving over to a new URL system where we use directories. For example, the above URLs would be the following:
/images/aosta-valley-i6816.html
/downloads/flight-sim-concorde-d1101.html
/forums/what-is-best-addon-t3360.html
Would we have any benefit in using directories for SEO purposes? Would our current system perhaps mean too many files in the root (/), flagging as spammy? Would it be even better to use the following system, which removes file endings completely and suggests each page is a directory:
/images/aosta-valley/6816/
/downloads/flight-sim-concorde/1101/
/forums/what-is-best-addon/3360/
If so, what would be better: /images/aosta-valley/6816/ or /images/6816/aosta-valley/?
Just looking for some clarity to our problem!
Thank you for your help guys!
-
To my knowledge there hasn't been a definitive conclusion on this one.
The general advice as I know it is: they are equally good; pick one, and make sure the other (with slash, if you choose to go 'without slash', or vice versa) redirects to the chosen one, to avoid duplicate content.
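For example, on an Apache server that redirect can be enforced site-wide in .htaccess with mod_rewrite. A minimal sketch (assuming you choose the trailing-slash version, and that requests for real files such as images and stylesheets should be left alone):

```apache
# Sketch only: 301 any URL without a trailing slash to its
# trailing-slash twin, leaving requests for real files untouched.
RewriteEngine On

# Skip requests that map to an actual file on disk
RewriteCond %{REQUEST_FILENAME} !-f

# Anything not already ending in a slash gets a permanent redirect
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```

If you pick the no-slash version instead, the same idea applies with the rule inverted.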
-
I would personally place the keywords at the end for clarity. It indeed seems unnatural to have the ID as the final part of the URL. Even if that does cost you a tiny bit of 'keyword power', I would gladly sacrifice it in exchange for a more user-friendly URL.
Limiting the number of words in the URL does make it look slightly less spammy, but slightly less user-friendly as well. I guess this is one of those 'weigh the pros/cons and decide for yourself' situations. Just make sure the URLs don't get ridiculously long.
-
OK, so I have taken it upon myself to now have our URLs as follows:
/news/853/free-flight-simulator/
Anything else gets 301'd to the correct URL. /news/853/free-flight-simulator would be 301'd to /news/853/free-flight-simulator/ along with /news/853/free-flight-sifsfsdfdsfmulator/ ... etc.
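As a sketch of how that canonicalisation is often wired up on Apache (the script name news.php and the slug lookup are hypothetical here - the real routing depends on your platform):

```apache
# Sketch only: capture the numeric ID and hand the request to a
# front script. The script looks up the real slug for that ID and
# issues a 301 to /news/<id>/<real-slug>/ whenever the requested
# slug (or a missing trailing slash) doesn't match.
RewriteEngine On
RewriteRule ^news/(\d+)(/.*)?$ /news.php?id=$1 [L,QSA]
```

This way only the ID has to be correct; any misspelled or outdated slug resolves to exactly one canonical URL.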
-
Also, trailing slash? Or no trailing slash?
Without
/downloads/878/fsx-concorde
With
/downloads/878/fsx-concorde/
-
Dear Theo,
Thank you for your response - I found your article very interesting.
So, just to clarify - in our case, the best URL method would be:
/images/aosta-valley/6816/
/downloads/flight-sim-concorde/1101/
/forums/what-is-best-addon/3360/
This would remove the suffixes and also put the ID numbers at the end, placing the target keywords closer to the root of the URL - which makes a very slight difference...
EDIT: Upon thinking about it, I feel the URL would read more naturally with the keyword-targeted part at the end. For example: /images/6816/aosta-valley/ (like you have done on your blog).
Also, should I limit the number of hyphenated words in the URL? For example, on your blog you have /does-adding-a-suffix-to-my-urls-affect-my-seo/ - perhaps it would be more concentrated and less spammy as /adding-suffix-urls-affect-seo/ ?
Let me know your thoughts.
Thank you for your help!
-
Matt Cutts states that the number of subfolders 'is not a major factor': http://www.youtube.com/watch?v=l_A1iRY6XTM
Furthermore, a blog I wrote about removing suffixes: http://www.finishjoomla.com/blog/5/does-adding-a-suffix-to-my-urls-affect-my-seo/
Another Matt Cutts video, regarding your separate question about keyword order: http://www.youtube.com/watch?v=gRzMhlFZz9I
Having some structure (in the form of a single subfolder) would greatly add to the usability of your website in my opinion. If you can manage to use the correct redirects (301) from your old pages to your new ones, I wouldn't see a clear SEO related reason not to switch.
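For the URLs in the original question, those 301s could look something like this in .htaccess (a sketch only - it assumes the old numeric IDs map one-to-one onto the new directory URLs, using the ID-then-keywords order discussed above):

```apache
# Sketch only: 301 the old root-level URLs to the proposed
# directory-based ones, reusing the numeric ID from the old filename.
RewriteEngine On
RewriteRule ^aosta-valley-i(\d+)\.html$        /images/$1/aosta-valley/           [R=301,L]
RewriteRule ^flight-sim-concorde-d(\d+)\.html$ /downloads/$1/flight-sim-concorde/ [R=301,L]
RewriteRule ^what-is-best-addon-t(\d+)\.html$  /forums/$1/what-is-best-addon/     [R=301,L]
```

With thousands of pages you would generate rules like these (or a single lookup in your application) from the ID-to-slug mapping rather than hand-writing them.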