What is the best way to deal with an event calendar?
- I have an event calendar that has multiple repeating items into the future. They are classes that typically all have the same titles but will occasionally have different information. I don't know the best way to deal with them and am open to suggestions. Currently Moz Analytics is showing multiple errors (duplicate page titles, duplicate descriptions, and overly dynamic URLs). I'm assuming it's flagging duplicate elements far into the future. I thought of having the calendar nofollowed entirely, but the content for the classes seems valuable. Thanks,
- Sorry for all the posts, but maybe this will also help you get rid of the dynamic URLs: http://www.webconfs.com/url-rewriting-tool.php Thomas
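For what it's worth, the rules that kind of tool generates look roughly like the sketch below. The calendar.php script name and the view/date parameters are invented for the example, so substitute whatever your calendar software actually uses.
RewriteEngine On
# 301 the old dynamic form to a clean path so only one version gets indexed
RewriteCond %{THE_REQUEST} \s/calendar\.php\?view=([^&\s]+)&date=([^&\s]+)
RewriteRule ^calendar\.php$ /calendar/%1/%2/? [R=301,L]
# Internally map the clean path back to the script that renders the page
RewriteRule ^calendar/([^/]+)/([^/]+)/?$ calendar.php?view=$1&date=$2 [L,QSA]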
- Agreed completely, and this is a good example of the type of difference changing the robots.txt file can make. I would read all the information you can on it, as the situation seems to be constantly updating. I've used the report below as an example of a happy ending, but to see the problems people ran into, read through the other stories in this thread: http://wordpress.org/support/topic/max-cpu-usage/page/2
"CPU usage went from over 90% to less than 15%. Memory usage dropped by almost half, from 1.95 GB to 1.1 GB including cache/buffers. My setup is as follows:
Linode 2GB VPS
Nginx 1.4.1
Percona SQL Server using XtraDB
PHP-FPM 5.4 with APC caching db requests and opcode via W3 Total Cache
WordPress 3.5.2
All in One Event Calendar 1.11"
All the best, Thomas
- I got the robots.txt file; I hope it helps you. This is built into every GetFlywheel.com website (they are a managed WordPress-only hosting company), and they did it for the same reason Dan describes in his answer. I'm not saying this is a perfect fix, but after speaking with the founder of GetFlywheel I know they place this in the robots.txt file of every website they host in order to try to get rid of the crawling issue. This is an exact copy of the default robots.txt file from getflywheel.com:
Default Flywheel robots file
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /calendar/action:posterboard/
Disallow: /calendar/action:agenda/
Disallow: /calendar/action:oneday/
Disallow: /calendar/action:month/
Disallow: /calendar/action:week/
Disallow: /calendar/action:map/
As found on a brand-new website. If you Google "Max CPU All in One Calendar" you will see more about this issue. I hope this is of help to you, Thomas
PS: here is what the maker of the All in One Event Calendar has listed on their site as a fix
- Hi Landon, I had a client with a similar situation. Here's what I feel is the best goal:
Calendar pages (weeks/months/days etc.) - don't crawl, don't index
Specific event pages - crawl and index
Most likely the calendar URLs have not been indexed, but you can check with some site: searches. Assuming they have not been indexed, the best solution is to block crawling of those URLs with robots.txt - calendars can go off into infinity, and you don't want to send crawlers off into a black hole, as that's not good for crawl budget or for directing them to your actual content.
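To make that split concrete, here is a minimal sketch. It assumes the /calendar/action: view URLs that the All in One Event Calendar generates (mentioned elsewhere in this thread); adjust the path to whatever your calendar actually produces.
# robots.txt - keep crawlers out of the endless calendar views
# (Disallow is a prefix match, so this one line covers action:month, action:week, etc.)
User-agent: *
Disallow: /calendar/action:
If the calendar views have already been indexed, note that a robots.txt block stops Google from seeing any noindex tag on them. In that case, leave them crawlable for a while and put a noindex meta tag on the calendar templates instead:
<meta name="robots" content="noindex, follow">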
- Is this the All in One Event Calendar for WordPress? If so, I can give you the information, or you can just Google "CPU Max WordPress" - essentially you have to change the robots.txt file so the crawlers don't have the huge issues with it that they do now. GetFlywheel has that built into their robots.txt file; if that is your issue, I can go in and grab it for you. Sincerely, Thomas
- Besides this, take a look at the schema markup for events; it might help you mark up the page better so Google will understand what the page/event is about: http://schema.org/Event
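As a rough illustration (the class name, dates, and address below are invented placeholders), a single class page could carry Event markup like this in JSON-LD:
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Event",
  "name": "Beginner Pottery Class",
  "startDate": "2014-06-03T18:00",
  "endDate": "2014-06-03T20:00",
  "location": {
    "@type": "Place",
    "name": "Example Studio",
    "address": "123 Example St, Springfield"
  },
  "description": "Weekly beginner pottery class covering wheel basics."
}
</script>
Marking up the individual class/event pages, rather than the calendar views, also fits with keeping the event pages as the content you want crawled and indexed.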
- Do the same classes in the future link to the same page? Are you using canonical tags correctly? Your URL would help diagnose the problem and guide you better.
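For example (the URLs here are made up), if every future occurrence of a class gets its own URL but shows essentially the same content, each occurrence page could point to a single canonical class page:
<link rel="canonical" href="http://www.example.com/classes/pottery-class/" />
Keep in mind that rel=canonical mainly addresses the duplicate title and description warnings; it does not stop crawlers from wandering through the calendar views, which is what the robots.txt suggestions above are for.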