Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Handling Multiple Restaurants Under One Domain
- We are working with a client that has two restaurants. One has been established since 1938; the other opened in late 2012. Currently, each restaurant has its own domain name. From a marketing/branding perspective, we would like to make customers (web visitors) of the established restaurant aware of the sister restaurant. One idea is to create a landing page that links to each restaurant: we would purchase a brand-new domain and place each restaurant in its own subfolder. A variation is to serve both sites from the new domain (in subfolders) and also point each existing domain to the appropriate subfolder. We know there are branding and marketing hurdles with this approach that we need to think through, but we are not sure how it would impact their SEO, and we assume it will not be good. Any thoughts on this topic would be greatly appreciated.
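If you did consolidate onto one domain, each existing domain would need to 301-redirect, path for path, into its subfolder so that existing links and rankings follow. A minimal Apache `.htaccess` sketch served from one of the old domains (all domain and folder names here are placeholders, not the client's actual names):

```apache
# On the old domain: permanently redirect every request
# to the matching path under its subfolder on the new domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?established-restaurant\.com$ [NC]
RewriteRule ^(.*)$ https://new-domain.com/established/$1 [R=301,L]
```

The `R=301` flag matters: a permanent redirect is what signals search engines to transfer the old domain's equity to the new location, whereas a 302 would not.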
- Glad to hear it!
- Helps a ton!
- Thanks for your input!
- Hi ThinkCreativeGroup,

  If the restaurants have different names (e.g. Bob's Pancake House vs. Bill's Burgers), then you should keep them on separate domains and do all marketing completely separately, without making any effort to link the two to one another. If there is some shared history, you could put it in a non-indexable infographic along the lines of "The Jones family opened Bob's Pancake House in 1938. In 1950, they invented their famous buckwheat cake stack. In 2012, son Bill opened Bill's Burgers across town." If, for reasons of pride, the owner wants to highlight something like this, that's fine, but I wouldn't do it in indexable text.

  If the restaurants are two branches of the same brand (i.e. both of them are Bob's Pancake House), then yes, you could develop a single website with a landing page for each physical location. You would then link to each of these landing pages from the citation-building campaign for that location.

  Hope this helps!
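In the single-brand case, each location's landing page can also carry its own Restaurant structured data so search engines treat the two locations as distinct entities. A minimal JSON-LD sketch, to be placed in a `<script type="application/ld+json">` tag on one location's page (the name, URL, and address are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Bob's Pancake House",
  "url": "https://example.com/downtown/",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "telephone": "+1-555-555-0100"
}
```

The second location's page would get its own copy with its own address and phone number, matching the details used in that location's citations.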
- Are the restaurant/brand names different? If that's the case, I'd suggest you stick with one website for each of them, and if you'd like, you can link them together with a nofollow link. From an SEO perspective you can build more authority by using a single domain for both, but from the user's point of view two sites seem much more appropriate, and Google's constant advice is to build for users. It is ultimately your choice; those are just my two cents.