Blog.mysite.com or mysite.com/blog?
-
Hi, I'm just curious what the majority think is the best way to start a blog on your website for SEO benefits. Is it better to have it under a subdomain or in a directory? Or does it even matter?
-
From everything I've read, I agree that your safest bet is to go with the subfolder.
-
I agree with Tim and Adam; as they said, subfolders are the better choice as a general rule of thumb.
You might also want to refer to other similar questions here on SEOmoz:
-
http://www.seomoz.org/q/blogs-are-best-when-hosted-on-domain-subdomain-or
http://www.seomoz.org/q/setting-up-a-company-blog-subdomain-or-new-url
http://www.seomoz.org/q/blog-vs-blog
Also see the post from Matt Cutts and the article from Rand that Adam mentioned.
-
I think Adam has hit the nail on the head. We recently moved our blog from a subdomain to a subfolder and 301'd all the old URLs, with the intention that any entries users find genuinely useful or interesting may be linked to, thereby benefiting the root domain.
As long as your blog is tightly related to your core business activity, I would go down the subfolder route, although, in all honesty, I think subdomains potentially look a little more professional.
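For anyone making a similar move, the redirect side generally looks something like the sketch below. This is only a minimal example, assuming Apache with mod_rewrite and using mysite.com from the question as a placeholder domain; the exact rule depends on how the subdomain is hosted.

```apache
# Minimal sketch: permanently redirect every URL on the blog subdomain
# to the same path under /blog/ on the root domain.
# Assumes Apache with mod_rewrite; mysite.com is a placeholder.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^blog\.mysite\.com$ [NC]
RewriteRule ^(.*)$ https://www.mysite.com/blog/$1 [R=301,L]
```

The one-to-one mapping is the important part: redirecting each old post to its new URL, rather than sending everything to the blog homepage, is what lets the links those posts have already earned keep passing value.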
-
Hi Tim,
I generally prefer the subfolder option (mysite.com/blog) over the subdomain (blog.mysite.com). The reason is that having the blog in a subfolder means it benefits from the authority of the root domain: links earned by the root domain pass value to its subfolders. A subdomain, however, is treated as a separate site, so not much value is passed via the root.
Rand provides an excellent answer in a previous Q&A on a similar topic:
Hope that helps,
Adam.
Related Questions
-
Does DA/PA have any effect on rankings?
I have seen many people concerned with increasing the DA and PA of their websites, and I am curious why people focus on this. Do DA and PA affect a website's rankings? I recently launched a website about men's beard trimmers, and it is ranking on the 1st page but not in the number 1 position. Will increasing the site's DA/PA help it reach the number 1 position?
On-Page Optimization | RyanAmin0
Moz bar not working on https://www.fitness-china.com/gym-equipment-names-pictures-prices
The Moz bar is not working on our website about gym equipment names: https://www.fitness-china.com/gym-equipment-names-pictures-prices
How long will it take to fix?
On-Page Optimization | ahislop5740
Updating Old Content at Scale - Any Danger from a Google Penalty/Spam Perspective?
We've read a lot about the power of updating old content (making it more relevant for today, finding other ways to add value to it) and republishing (Here I mean changing the publish date from the original publish date to today's date - not publishing on other sites). I'm wondering if there is any danger of doing this at scale (designating a few months out of the year where we don't publish brand-new content but instead focus on taking our old blog posts, updating them, and changing the publish date - ~15 posts/month). We have a huge archive of old posts we believe we can add value to and publish anew to benefit our community/organic traffic visitors. It seems like we could add a lot of value to readers by doing this, but I'm a little worried this might somehow be seen by Google as manipulative/spammy/something that could otherwise get us in trouble. Does anyone have experience doing this or have thoughts on whether this might somehow be dangerous to do? Thanks Moz community!
On-Page Optimization | paulz9990
Best way to separate blogs, media coverage, and press releases on WordPress?
I'm curious what some of your thoughts are on the best way to handle the separation of blog posts from press release stories and media coverage. With one WordPress installation, we're obviously using Posts for these types of content. It seems obvious to put press releases into a "press release" category and media coverage into a "media coverage" category... but then what about blog posts? We could put blog posts into a "blog" category, but I hate that. And what about actual blog categories? I tried making sub-categories under the blog category, which seemed like it was going to work until the breadcrumbs looked all crazy. Example: Homepage > Blog > Blog > Sub-Category
Homepage = http://www.example.com
First 'Blog' = http://www.example.com/blog
Second 'Blog' = http://www.example.com/category/blog
Sub-Category = http://www.example.com/category/blog/sub-category
This just doesn't seem very clean, and I feel like there has to be a better solution. What about post types? I've never really worked with them. Are they the solution to my woes? All suggestions are welcome! EDIT: I should add that we would like the URL to contain /blog/ for blog posts, /media-coverage/ for media coverage, and /press-releases/ for press releases. For blog posts, we don't want the sub-category in the URL.
On-Page Optimization | Philip-DiPatrizio0
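Post types are worth looking at for exactly this. Below is a minimal sketch, not a drop-in solution, that registers a hypothetical press_release post type whose URLs live under /press-releases/; the function name, labels, and slugs are placeholders, and a second registration would do the same for /media-coverage/, leaving ordinary posts and categories free for /blog/.

```php
<?php
// Minimal sketch: register a custom post type so press releases live at
// /press-releases/ instead of sharing the blog's category URLs.
// Function name, labels, and slugs are placeholders.
function example_register_press_release_type() {
    register_post_type( 'press_release', array(
        'labels'      => array(
            'name'          => 'Press Releases',
            'singular_name' => 'Press Release',
        ),
        'public'      => true,
        'has_archive' => 'press-releases',                    // archive at /press-releases/
        'rewrite'     => array( 'slug' => 'press-releases' ), // singles at /press-releases/post-name/
        'supports'    => array( 'title', 'editor', 'thumbnail', 'excerpt' ),
    ) );
}
add_action( 'init', 'example_register_press_release_type' );
```

Permalinks usually need to be re-saved (Settings > Permalinks) after registering a new post type so the rewrite rules take effect, and with press releases and media coverage in their own post types, regular categories stay free for the blog, which avoids the doubled Blog > Blog breadcrumb.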
How do I block an entire category/directory with robots.txt?
Does anyone have any idea how to block an entire product category, including all the products in that category, using the robots.txt file? I'm using WooCommerce in WordPress and I'd like to prevent bots from crawling every single one of my product URLs for now. The confusing part is that I have several different URL structures linking to every one of my products, for example www.mystore.com/all-products, www.mystore.com/product-category, etc. I'm not really sure how I'd type it into the robots.txt file, or where to place the file. Any help would be appreciated, thanks.
On-Page Optimization | bricerhodes0
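For reference, robots.txt is a single file placed at the web root (e.g. www.mystore.com/robots.txt), and a category is blocked by disallowing its path. The lines below are only a sketch, assuming WooCommerce's default /product-category/ permalink base and a hypothetical category slug of example-category:

```
# Minimal robots.txt sketch, placed at the site root, e.g. www.mystore.com/robots.txt
User-agent: *
# Block the category archive and everything beneath it
Disallow: /product-category/example-category/
# Block the alternate product listing mentioned in the question
Disallow: /all-products/
```

Keep in mind that robots.txt only asks compliant crawlers not to fetch those URLs; pages that are already indexed or linked from elsewhere can still appear in results, so a noindex meta tag is the more reliable option if the goal is to keep them out of the index entirely.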
WordPress and category/subcategory landing pages
Hey, here's my situation. I'm building a WordPress blog for product reviews in a certain niche. The current category setup is 4 main categories with 4-8 subcategories each. Each subcategory has a unique description that will help it act as a landing page for certain keywords, after which it lists the posts from that subcategory. Posts will always be assigned to a subcategory, never to a main category. My issue is what to do with the main categories. They're fairly general, so they're not really targeting any keywords, and they don't have unique descriptions attached to them. I was thinking of choosing between three options for the main category pages:
1. List the subcategories plus a normal post loop that pulls the latest posts from the subcategories (this may create a lot of duplicate content, since the subcategory pages also list their posts)
2. List only the subcategories (and maybe just the latest post from each subcategory)
3. Don't link the main categories at all, and instead only use them to create dropdowns for the subcategories
So, which would you choose, and why?
On-Page Optimization | mihaiaperghis0
Does Google index dynamically generated content/headers, etc.?
To avoid dupe content, we are moving away from a model where we have 30,000 pages, each with a separate URL that looks like /prices/<product-name>/<city><state>, often with dupe content because the product overlaps from city to city, and it's hard to keep 30,000 pages unique when sometimes the only distinction is the price and the city/state. We are moving to a model with around 300 unique pages, where some of the info that used to be in the URL will move to the page itself (headers, etc.) to cut down on dupe content on those 300 unique pages. My question is this: if we have 300 unique-content pages with unique URLs, and we then put some dynamic info (year, city, state) into the page itself, will Google index this dynamic content? The question behind this one is, how do we continue to rank for searches for that product in the city-state being searched without having that info in the URL? Any best practices we should know about?
On-Page Optimization | editabletext0
.us VS .com
In general, from what I have experienced, a location-specific extension such as .co.uk, geo-targeted to the same location, gives the best ranking results. But when I look at results from the US, page after page shows .com results. Surely, if my statement above is true, a .us domain extension should rank better than a .com?
On-Page Optimization | activitysuper0