What is the best way to deal with an event calendar?
- I have an event calendar that has multiple items repeating into the future. They are classes that typically all have the same titles but will occasionally have different information. I don't know the best way to deal with them and am open to suggestions. Currently Moz Analytics is showing multiple errors (duplicate page titles, duplicate descriptions, and overly dynamic URLs). I'm assuming that it's flagging duplicate elements far into the future. I thought of having the calendar nofollowed entirely, but the content for the classes seems valuable. Thanks,
- Sorry for all the posts, but maybe this will help you as well to get rid of the dynamic URLs: http://www.webconfs.com/url-rewriting-tool.php Thomas
- Completely agreed, and this is a good example of the type of difference changing the robots.txt file can make. I would read all the information you can on it, as the situation seems to be constantly updating. I used the report below as an example of a happy ending, but to see the problems people ran into, read the stories you will find if you check out this link: http://wordpress.org/support/topic/max-cpu-usage/page/2

CPU usage went from over 90% to less than 15%. Memory usage dropped by almost half, from 1.95 GB to 1.1 GB including cache/buffers. My setup is as follows:

Linode 2GB VPS
Nginx 1.4.1
Percona SQL Server using XtraDB
PHP-FPM 5.4 with APC caching db requests and opcode via W3 Total Cache
WordPress 3.5.2
All in One Event Calendar 1.11

All the Best, Thomas
- I got the robots.txt file; I hope this will help you. This is built into every GetFlywheel.com website (they are a managed WordPress-only hosting company), and the reason they did it is the same one Dan described above. I'm not saying this is a perfect fix, but after speaking with the founder of GetFlywheel I know they place this in the robots.txt file for every website they host in order to try to get rid of the crawling issue. This is an exact copy of the default robots.txt file from getflywheel.com:

Default Flywheel robots file

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /calendar/action:posterboard/
Disallow: /calendar/action:agenda/
Disallow: /calendar/action:oneday/
Disallow: /calendar/action:month/
Disallow: /calendar/action:week/
Disallow: /calendar/action:map/

As found on a brand-new website. If you Google "Max CPU All in one calendar" you will see more about this issue. I hope this is of help to you, Thomas

PS: here is what the maker of the All-in-One Event Calendar has listed on their site as a fix
- Hi Landon, I had a client with a similar situation. Here's what I feel is the best goal: calendar pages (weeks/months/days etc.) - don't crawl, don't index; specific event pages - crawl and index. Most likely the calendar URLs have not been indexed, but you can check with some site: searches. Assuming they have not been indexed, the best solution is to block crawling of those URLs with robots.txt - calendars can go off into infinity, and you don't want to send the crawlers off into a black hole, as it's not good for crawl budget or for directing them to your actual content.
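For illustration, a minimal robots.txt along the lines described above, assuming the infinite calendar views live under /calendar/ and individual event pages under /event/ (both paths are hypothetical and depend on your calendar plugin):

User-agent: *
# Block the infinite calendar views (hypothetical path)
Disallow: /calendar/
# Individual event pages, e.g. /event/beginner-pottery/, are not listed here,
# so they remain crawlable and indexable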
- Is this the All-in-One Event Calendar for WordPress? If so, I can give you the information, or you can just Google "CPU Max WordPress". Essentially you have to change the robots.txt file so the crawlers don't have the huge issues with it that they do now. GetFlywheel has that built into their robots.txt file; if that is your issue, I can go in and grab it for you. Sincerely, Thomas
- Besides this, take a look at the schema.org markup for events. It might help you mark up the page better so Google will understand what the page/event is about: http://schema.org/Event
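For instance, a minimal sketch of schema.org Event markup in JSON-LD; the class name, date, and location are hypothetical placeholders:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Event",
  "name": "Beginner Pottery Class",
  "startDate": "2014-06-05T18:00",
  "location": {
    "@type": "Place",
    "name": "Example Studio",
    "address": "123 Main St, Springfield"
  },
  "description": "A hands-on introduction to wheel throwing."
}
</script>

Each repeating occurrence of a class can carry its own startDate while sharing the same name, which helps search engines tell the instances apart even when the titles match.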
- Do the same classes in the future link to the same page? Are you using canonical tags correctly? Your URL would help us diagnose the problem and guide you better.
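For reference, a canonical tag is a link element in the head of each duplicate page pointing at the preferred URL (the URL here is a hypothetical example):

<link rel="canonical" href="http://www.example.com/classes/beginner-pottery/" />

If every future occurrence of a class canonicalizes to one page like this, the duplicate title and description warnings should consolidate to that single URL.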
Related Questions
- Best Practice for www and non www
What is the best way to handle all the different variations of a website in terms of www | non-www | http | https? In Google Search Console, I have all 4 versions and I have selected a preference. In Open Site Explorer I can see that the www and non-www versions are treated differently, with one group of links pointing to each version of the same page. This gives a different PA score, e.g. http://mydomain.com DA 25 PA 35 vs. http://www.mydomain.com DA 19 PA 21, each version of the home page having its own set of links and scores. Should I try to "consolidate" all the scores into one page? Should I set up redirects to my preferred version of the website? Thanks in advance
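One hedged sketch of the usual consolidation approach, as an Apache .htaccess 301 that funnels every variant to a single preferred host (the domain is the placeholder from the question, and https://www is only one possible choice of preferred version):

RewriteEngine On
# Send non-www and plain-http requests to the one preferred version
RewriteCond %{HTTP_HOST} !^www\.mydomain\.com$ [OR]
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.mydomain.com/$1 [R=301,L]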
- What is the best way to refresh a webpage of a news site, SEO-wise?
Hello all, we have a client which is a sports website. In fact, it is a very big website and publishes a huge number of news items per day. This is mostly the reason why it refreshes some of its pages with news lists every 420 seconds. We currently use meta refresh. I have read here and elsewhere that meta refreshes should be avoided. But we don't do it to send visitors to another page or to pass any kind of page authority/juice. Is a JavaScript refresh better in this case? Is there any other better way? What do you think and suggest? Thank you!
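For comparison, minimal sketches of the two techniques mentioned (the 420-second interval is taken from the question):

<!-- Meta refresh: reloads the current page every 420 seconds -->
<meta http-equiv="refresh" content="420">

<script>
  // JavaScript alternative: reload the page after 420,000 ms
  setTimeout(function () {
    window.location.reload();
  }, 420000);
</script>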
- Best Way To Clean Up Unruly SubDomain?
Hi, I have several subdomains that present no real SEO value but are being indexed. They don't earn any backlinks either. What's the best way of cleaning them up? I was thinking the following: 1. Verify them all in Webmaster Tools. 2. Remove all URLs from the index via the Removal Tool in WMT. 3. Add a site-wide noindex, follow directive. Also, to remove the URLs in WMT, you usually have to block the URLs via /robots.txt. If I'd like to keep Google crawling through the subdomains and remove their URLs, is there a way to do so?
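The site-wide directive in step 3 is a meta robots tag in the head of every page on the subdomain; it lets Google keep crawling (and following links) while dropping the pages from the index:

<meta name="robots" content="noindex, follow">

Note that this only works if the pages are not also blocked in robots.txt, since a crawler that is blocked never sees the tag.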
- Best Practices for adding Dynamic URLs to an XML Sitemap
Hi Guys, I'm working on an ecommerce website with all the product pages using dynamic URLs (we also have a few static pages, but there is no issue with them). The products are updated on the site every couple of hours (because we sell out or the special offer expires), and as a result I keep seeing heaps of 404 errors in Google Webmaster Tools and am trying to avoid this (if possible). I have already created an XML sitemap for the static pages and am now looking at incorporating the dynamic product pages, but am not sure what the best approach is. The URL structure for the products is as follows:
http://www.xyz.com/products/product1-is-really-cool
http://www.xyz.com/products/product2-is-even-cooler
http://www.xyz.com/products/product3-is-the-coolest
Here are 2 approaches I was considering: 1. Just include the folder the products sit in, http://www.xyz.com/products/, within the same sitemap as the static URLs, so spiders have access to the folder and I don't have to create an automated sitemap for every product, OR 2. Create a separate automated sitemap that updates whenever a product is updated, with the change frequency set to hourly, so spiders always have as close to an up-to-date sitemap as possible when they crawl. I look forward to hearing your thoughts, opinions, suggestions and/or previous experiences with this. Thanks heaps, LW
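A minimal sketch of what the automated sitemap in approach 2 might emit, using the product URLs from the question and an hourly change frequency:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.xyz.com/products/product1-is-really-cool</loc>
    <changefreq>hourly</changefreq>
  </url>
  <url>
    <loc>http://www.xyz.com/products/product2-is-even-cooler</loc>
    <changefreq>hourly</changefreq>
  </url>
</urlset>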
- Merging several sites into one - best practice
I had 2 sites on the web (www.physicseditor.de, www.texturepacker.com) and decided to move them all under one single domain (www.codeandweb.com). Both sites were ranking very well for several keywords. I have now redirected the most important pages from the old domains with a 301 redirect to the new subpages (www.texturepacker.com => www.codeandweb.com/texturepacker). Google still delivers the old domains, but the redirect takes people directly to the new content. I've already submitted the new sitemap to Google Webmaster Tools. Pages are already in the index but do not really show up in the search results. How long does it take until Google accepts the new domain and delivers the new content in the search results? Was what I did OK, or is there some room for improvement? SEOmoz will of course not find any information about the new page since it is not yet directly linked in Google. But I can't get ranking information for the "old" pages since SEOmoz tells me that it can't crawl the old domains...
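For reference, one hedged nginx sketch of the kind of per-domain 301 described (the server names and target path come from the question; mapping old paths onto the new subfolder via $request_uri is an assumption):

server {
    listen 80;
    server_name texturepacker.com www.texturepacker.com;
    # Permanently move the old domain into its new subfolder
    return 301 http://www.codeandweb.com/texturepacker$request_uri;
}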
- Best geotargeting strategy: subdomains or subfolders or country-specific domains
How have the relatively recent changes in how Google perceives subdomains changed the best route to onsite geotargeting, i.e. not building out new country-specific sites on country-specific, locally hosted domains, but instead developing subdomains or subfolders and geotargeting those via Webmaster Tools? In other words, given the recent change in Google's perception, are subdomains now a better option than subfolders, or is there not much in it? Also, if a client has a .co.uk and they want to geotarget, say, France, is the subdomain/subfolder route still an option, or is the .co.uk still too UK-specific, so these options would only work using a .com? In other words, can sites on country-specific domains (.co.uk, .fr, .de etc.) use subfolders or subdomains to geotarget other countries, or do they have no option other than to develop new country-specific (domain/hosting/language) websites? Any thoughts regarding current best practice in this regard much appreciated. I have seen last Feb's WBF, which covers geotargeting in depth, but the way Google perceives subdomains has changed since then. Many Thanks, Dan
- What is best practice for redirecting "secondary" domain names?
For sites with multiple top-level domains that have been secured for a business or organization, I'm curious as to what is considered best practice for setting up 301 redirects for secondary domains. Is it best to do the 301 redirects at the registrar level or the hosting level, so that .net, .biz, or other secondary domains funnel visitors to the correct primary/main domain name? I'm looking for the "best practice" answer and want to avoid duplicate content problems or penalties from the search engines. I'm not trying to game the system with dozens of domain names, simply the handful of domains that are important to the client. I've seen some registrars recommend hosting secondary domains and doing redirects from the hosting level (and they use meta refresh for "domain forwarding," which I want to avoid). It seems rather wasteful to set up hosting for a secondary domain and then 301 each URL.
- What is the best website structure for SEO?
I've been on SEOmoz for about 1 month now, and everyone says that, depending on the type of business, building up your website structure for SEO should be the 1st step. I have a new client, click here (www version doesn't work)... some bugs we are fixing now. We are almost finished with the design & layout. Two questions have been running through my head: 1. What would the best URL category for the shop be? /products/ is the current URL category, e.g. /products/door-handles.html. 2. What would you use for the main menu sections to get the most out of SEO? Personally, I am thinking of making 2-3 main categories on the left, each a section where I can add content (3-4 paragraphs, images, maybe a video), so the main page focuses on the domain name more and the rest of the sections focus on specific keywords; this is how I avoid cannibalization. The main keyword target is "door handles". Any suggestions would be appreciated.