Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will still be viewable), we have locked both new posts and new replies.
Is page speed important to improve SEO ranking?
- @jasparcj Some of the code on certain pages can be flagged as suspicious by Googlebot. Even when it causes no harm to users, Google may interpret it differently, so it's best to review and audit that code.
- @pau4ner Thanks for your comment. And you're 100% right.
- Nobody except Google can say for sure, but many SEOs, myself included (I work with several websites), haven't noticed any substantial change in rankings after improving site speed. In fact, you can find many examples of slower sites ranking above much faster ones. I do believe, though, that if your site is so slow that it impairs the user experience, you will lose rankings: not because of the speed itself, but because of the higher bounce rates it causes. In summary (and in my experience and that of many other SEO professionals), if your website already loads quite fast, improving its speed won't produce any ranking improvements. But if it is very slow, speeding it up may have a positive impact.
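If you want to put numbers behind that judgement, Google's PageSpeed Insights API exposes both the Lighthouse lab score and, where available, real-user field data. A minimal sketch, assuming Python with the `requests` library and a placeholder URL and API key:

```python
import requests

# Hypothetical target URL and API key -- replace with your own values.
URL_TO_TEST = "https://www.example.com/"
API_KEY = "YOUR_API_KEY"

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

resp = requests.get(
    PSI_ENDPOINT,
    params={"url": URL_TO_TEST, "strategy": "mobile", "key": API_KEY},
    timeout=60,
)
resp.raise_for_status()
data = resp.json()

# Lighthouse lab performance score is 0-1; multiply by 100 for the familiar scale.
perf = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Lab performance score: {perf * 100:.0f}/100")

# Field data (Chrome UX Report) is only present for sufficiently trafficked pages.
field = data.get("loadingExperience", {}).get("metrics", {})
lcp = field.get("LARGEST_CONTENTFUL_PAINT_MS", {}).get("percentile")
if lcp is not None:
    print(f"Real-user LCP (75th percentile): {lcp} ms")
else:
    print("No field data available for this URL.")
```

Tracking those numbers before and after a speed project makes it much easier to tell whether any ranking movement coincided with a real change in user experience.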
Related Questions
- GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
  The whole website moved to the https://www. HTTP/2 version 3 years ago. When we review the log files, it is clear that, for the home page, GoogleBot continues to access only via the HTTP/1.1 protocol.
  - The robots file is correct (simply allowing all and referring to the https://www. sitemap)
  - The sitemap references the https://www. pages, including the homepage
  - The hosting provider has confirmed the server is correctly configured to support HTTP/2 and has provided evidence of HTTP/2 access working
  - 301 redirects are set up from the non-secure and non-www versions of the website to the https://www. version
  - We are not using a CDN or proxy
  - GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but still shows the non-secure version of the website as the referring page in the Discovery section; GSC also reports the homepage as being crawled every day or so
  We totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to use only HTTP/1.1 and not HTTP/2. A possibly related issue, and of course what is causing concern, is that new pages of the site seem to index and perform well in the SERPs, except the home page. It never makes it to page 1 (other than for the brand name) despite rating multiples higher in terms of content, speed, etc. than other pages, which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated! Technical SEO | AKCAC1
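One quick sanity check, independent of the hosting provider's word, is to confirm which protocol the server actually negotiates. A minimal sketch, assuming Python with the `httpx` library (installed with its `http2` extra) and a placeholder homepage URL:

```python
import httpx

# Hypothetical homepage URL -- replace with the real one.
HOMEPAGE = "https://www.example.com/"

# http2=True lets the client negotiate HTTP/2 via ALPN if the server offers it.
with httpx.Client(http2=True) as client:
    response = client.get(HOMEPAGE)
    # response.http_version will be "HTTP/2" if negotiation succeeded,
    # otherwise "HTTP/1.1".
    print(f"Negotiated protocol: {response.http_version}")
    print(f"Status code: {response.status_code}")
```

Note that this only confirms the server's capability: Googlebot decides per host whether crawling over HTTP/2 is worthwhile, so a capable server does not guarantee HTTP/2 requests in the logs, and the protocol choice has no bearing on rankings.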
- Migrating Subfolder content to New domain Safely
  Hello everyone, I'm currently facing a challenging situation and would greatly appreciate your expertise and guidance. I own a website, maniflexa.com, primarily focused on the digital agency niche. About 3 months ago, I created a subfolder, maniflexa.com/emploi/, dedicated to job listings, which is a completely different niche. The subfolder has around 120 posts and pages. Unfortunately, since I created the subfolder, the rankings of my main site have been negatively impacted. I was previously ranking #1 for all local digital services keywords, but now only 2 out of 16 keywords have maintained their positions; other pages have dropped to positions 30 and beyond. I'm considering a solution and would like your advice: I'm planning to purchase a new domain and migrate the content from maniflexa.com/emploi/ to newdomain.com. However, I want to ensure a smooth migration without affecting the main domain's (maniflexa.com) rankings or losing the backlinks pointing to the maniflexa.com/emploi/ pages. Is moving the subfolder content to a new domain a viable solution? And how can I effectively redirect all pages from the subfolder to the new domain while preserving page ranks and backlinks? Intermediate & Advanced SEO | davidifaso
  I wish they did, but GSC doesn't offer a solution for migrating content from a subfolder to a new domain. 😢 Help a fellow Mozer. Thanks for giving a hand.
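Independent of GSC, the usual approach is a one-to-one 301 redirect from each old subfolder URL to its counterpart on the new domain (configured on the server or via a plugin), then verifying the mapping once it is live. A minimal verification sketch, assuming Python with `requests` and a hypothetical list of paths (in practice, export the list from your sitemap or CMS):

```python
import requests

OLD_BASE = "https://maniflexa.com/emploi"
NEW_BASE = "https://newdomain.com"

# Hypothetical sample paths -- replace with the full export of the subfolder.
PATHS = ["/", "/example-job-listing-1/", "/example-job-listing-2/"]

for path in PATHS:
    old_url = OLD_BASE + path
    expected = NEW_BASE + path
    # Don't follow redirects: we want to inspect the first hop ourselves.
    resp = requests.get(old_url, allow_redirects=False, timeout=30)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location.rstrip("/") == expected.rstrip("/")
    print(f"{'OK  ' if ok else 'FAIL'} {old_url} -> {resp.status_code} {location}")
```

Page-to-page 301s (rather than redirecting everything to the new homepage) are what preserve most of the link equity from the existing backlinks.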
- Do articles written about artificial intelligence rank on Google?
  This is my personal website. I wonder, will the articles written about artificial intelligence rank on Google, or will the site not rank? https://withpositivity.com/ Community | lowzy0
- Ranking going south
  Hi, I have a site, Simply Stairlifts, and I don't understand it: I've followed all the SEO processes of cleaning up the site and building links, but the ranking just keeps falling. Any advice would be very gratefully received 👍. SEO Tactics | Naju2310
- Is NitroPack plugin Black Hat SEO for speed optimization
  We are getting ready to launch our redesigned WP site and were considering using the NitroPack performance optimization plugin, until some of our developers started sounding the alarm. Here is what some in the SEO community are saying about the tool: the rendering of a website built with the NitroPack plugin in the page-metric test tools is based entirely on the inline CSS and JS in the HTML file, without taking into account the numerous additional CSS or JS files loaded on the page. As a result, the final metric score does not include the evaluation and parsing of those CSS and JavaScript files. So what they are saying is that a lot of websites using the NitroPack plugin never become interactive in the page-metric tools, because all interactivity is derived from JavaScript and CSS execution; their "Time to Interactive" and "Speed Index" should therefore be reported as equal to infinity. Would Google consider this Black Hat SEO and start serving manual actions to sites using NitroPack? We are not ready to lose our hard-earned Google ranking. Please let me know your thoughts on the plugin. Is it simply JS and CSS "lazy loading" that offers the first real-world implementation that yields fantastic results, or is it truly a Black Hat attempt at cheating the Google PageSpeed Insights numbers? Thank you! On-Page Optimization | opiates0
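Whatever the lab tools report, Google's field data comes from real Chrome users and can't be inflated by hiding assets from a test harness, so comparing the two is a useful smoke test for any optimization plugin. A minimal sketch, assuming Python with `requests`, a placeholder origin, and a Chrome UX Report API key:

```python
import requests

# Hypothetical origin and API key -- replace with your own values.
ORIGIN = "https://www.example.com"
API_KEY = "YOUR_API_KEY"

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

resp = requests.post(
    CRUX_ENDPOINT,
    params={"key": API_KEY},
    json={"origin": ORIGIN, "formFactor": "PHONE"},
    timeout=30,
)
resp.raise_for_status()
metrics = resp.json()["record"]["metrics"]

# 75th-percentile values as experienced by real Chrome users.
for name in ("largest_contentful_paint",
             "interaction_to_next_paint",
             "cumulative_layout_shift"):
    metric = metrics.get(name)
    if metric:
        print(f"{name}: p75 = {metric['percentiles']['p75']}")
    else:
        print(f"{name}: no field data for this origin")
```

If the field numbers are markedly worse than the plugin-boosted lab score, real visitors are not getting the speed the tools report, regardless of whether Google treats the technique as a policy issue.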
- Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
  We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
  - Course (starter, main, salad, etc.)
  - Cooking Method (fry, bake, boil, steam, etc.)
  - Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)
  Here are some examples of how URLs may look when searching for a recipe:
  - find-a-recipe.php?course=starter
  - find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
  - find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
  There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search-results URL variations. This all works well on the site, however it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as: setting a canonical tag; adding these URL variables to Google Webmasters to tell Google to ignore them; changing the title tag in the head dynamically based on which URL variables are present. However I am not sure which of these would be best. As far as I can tell the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here as the search results are always different. Adding these URL variables to Google Webmasters won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across it before, but I cannot find the ideal solution. Any help would be much appreciated. Kind Regards. On-Page Optimization | smaavie
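For faceted search pages like this, one common pattern is to point the canonical tag of every filtered or paginated variant at the base search page, so the crawler consolidates the hundreds of parameter combinations into one URL. A minimal sketch of the idea, assuming a Python/Flask stand-in for the PHP page (the route, template, and parameter handling are hypothetical):

```python
from flask import Flask, render_template_string, request, url_for

app = Flask(__name__)

# Stand-in template: the canonical tag always points at the clean base URL,
# so every filter/pagination combination consolidates to one canonical page.
TEMPLATE = """
<head>
  <title>Find a recipe{% if course %} - {{ course }}{% endif %}</title>
  <link rel="canonical" href="{{ canonical }}">
</head>
<body>... search results ...</body>
"""

@app.route("/find-a-recipe")
def find_a_recipe():
    course = request.args.get("course")
    # _external=True builds an absolute URL; no query string is included.
    canonical = url_for("find_a_recipe", _external=True)
    return render_template_string(TEMPLATE, course=course, canonical=canonical)
```

Whether you canonicalise to the bare search page or only strip the pagination parameter depends on whether you want the filtered variants to rank in their own right.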