Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
How to Structure URLs for Multiple Locations
-
We are currently undergoing a site redesign and are trying to figure out the best way to structure the URLs and breadcrumbs for our many locations.
We currently have 60 locations nationwide and our URL structure is as follows:
www.mydomain.com/locations/{location}
Where {location} is the specific street the location is on or the neighborhood it is in (e.g., www.mydomain.com/locations/waterford-lakes).
The issue is that {location} is usually too specific and not a broad enough keyword. The "Waterford Lakes" location is in Orlando, and "Orlando" is the important keyword, not "Waterford Lakes".
To address this, we want to introduce state and city pages. Each state and city page would link to every location within that state or city (e.g., an Orlando page with links to "Waterford Lakes", "Lake Nona", "South Orlando", etc.). The question is how to structure this.
Option 1
Use our existing URL and breadcrumb structure (www.mydomain.com/locations/{location}) and add state and city pages outside the URL path.
Option 2
Build the city and state pages into the URL and breadcrumb path:
www.mydomain.com/locations/{state}/{area}/{location}
(e.g., www.mydomain.com/locations/fl/orlando/waterford-lakes)
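A hierarchical path like this maps naturally onto breadcrumb markup that search engines can read. As a sketch only (using the old data-vocabulary.org Breadcrumb vocabulary from Google's rich snippets documentation, with the hypothetical URLs from the example above), the breadcrumb trail for a location page might be tagged like this:

```html
<!-- Breadcrumb trail for /locations/fl/orlando/waterford-lakes,
     tagged with the data-vocabulary.org Breadcrumb vocabulary.
     URLs are the hypothetical ones from this thread, not real pages. -->
<ol xmlns:v="http://rdf.data-vocabulary.org/#">
  <li typeof="v:Breadcrumb">
    <a href="http://www.mydomain.com/locations/" rel="v:url" property="v:title">Locations</a> ›
  </li>
  <li typeof="v:Breadcrumb">
    <a href="http://www.mydomain.com/locations/fl/" rel="v:url" property="v:title">Florida</a> ›
  </li>
  <li typeof="v:Breadcrumb">
    <a href="http://www.mydomain.com/locations/fl/orlando/" rel="v:url" property="v:title">Orlando</a> ›
  </li>
  <li typeof="v:Breadcrumb">
    <span property="v:title">Waterford Lakes</span>
  </li>
</ol>
```

Each level of the breadcrumb doubles as a crawlable link to the corresponding state or city category page, which is part of what makes Option 2 attractive.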
Any insight is much appreciated. Thanks!
-
Hi David,
Typically, your main landing pages are going to be those that represent the city of the location, as in www.mydomain.com/locations/orlando, and so on.
What I'm trying to understand is if you are saying you have more than one office within a single city (as in orlando office A, orlando office B, orlando office C) and are trying to hash out how to distinguish these same-city offices from one another. Is this the scenario, or am I not getting it? Please feel free to provide further details.
-
David -
It looks like there are two main options for you:
Keep the same URL structure (Option 1) and create state- or area-based category pages, each with a short description of every location in that geographic area and a link to its location page.
This is typically how it might be done on an eCommerce site, where you'd have a parent category (e.g., shoes) and then a sub-category (e.g., running shoes).
The downside to this is that you risk having duplicate content on these category pages.
Option #2 would be my recommendation, because you are including the area and state information in the URL.
One company that does not do this well is Noodles & Company. Their location URL looks like this:
http://www.noodles.com/locations/150/
... where "150" is a store ID in a database. Easy to pull out of a database table. Less helpful to the end user who doesn't know that store ID 150 = the one closest to them.
It would be much better to have it listed like:
http://www.noodles.com/locations/Colorado/Boulder/2602-Baseline/
You don't want to go much beyond four layers, but it's a better way of indicating the location tree to Google and other search engines.
Also, I'd highly recommend using a structured-data format for displaying the location information.
For example, on the Customer Paradigm site, we use the RDFa system for tagging the location properly:
Customer Paradigm
5353 Manhattan Circle
Suite 103
Boulder, CO 80303
303.473.4400
... and then Google doesn't have to guess what the location's address and phone number actually are.
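The original RDFa attributes didn't survive in this archived copy, so here is a sketch of what markup like that might look like for the address above, again using the data-vocabulary.org vocabulary that Google's rich snippets documentation described at the time (property names are from that spec; treat this as illustrative, not as Customer Paradigm's actual source):

```html
<!-- Location block tagged with data-vocabulary.org Organization/Address RDFa.
     A sketch reconstructing the kind of markup the answer describes. -->
<div xmlns:v="http://rdf.data-vocabulary.org/#" typeof="v:Organization">
  <span property="v:name">Customer Paradigm</span>
  <div rel="v:address">
    <div typeof="v:Address">
      <span property="v:street-address">5353 Manhattan Circle, Suite 103</span>
      <span property="v:locality">Boulder</span>,
      <span property="v:region">CO</span>
      <span property="v:postal-code">80303</span>
    </div>
  </div>
  <span property="v:tel">303.473.4400</span>
</div>
```

Note that data-vocabulary.org has since been deprecated in favor of schema.org, so on a current site the same idea would be expressed with schema.org's LocalBusiness/PostalAddress types instead; the principle of explicitly labeling each address component is the same.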