Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
What's your best hidden SEO secret?
- Work hard, play hard, and stay away from grey or black areas.
- Hah hah, I agree with Richard: the Moz community. And I'm also simpatico with Gianluca. Long walks outside in the fresh air do wonders for my creativity; ideas come easier.
- Letting a project rest for a while. That pause benefits your creativity, because your brain keeps working on it in the background without stress. When you return to the project, it will seem new somehow, and the ideas your mind was breeding will come out with force.
- I have not been here long enough to have any 'hidden' secrets except SEOmoz. However, I do find that backlinks with great anchor text really, really work well.
- That black hat is a really poor long-term strategy.
- That single hour each day when I get in the "zone" and do a week's worth of work in a single shot. Still figuring out how to extend that to the full 8 hours.
Browse Questions
Explore more categories
- Moz Tools: Chat with the community about the Moz tools.
- SEO Tactics: Discuss the SEO process with fellow marketers.
- Community: Discuss industry events, jobs, and news!
- Digital Marketing: Chat about tactics outside of SEO.
- Research & Trends: Dive into research and trends in the search industry.
- Support: Connect on product support and feature requests.
Related Questions
- Forwarded vanity domains, suddenly resolving to 404 with appended URLs ending in random 5 characters
  We have several vanity domains that forward to various pages on our primary domain, e.g. www.vanity.com (301) --> www.mydomain.com/sub-page (200). These forwards have been in place for months or even years and have worked fine. As of yesterday, we have seen the following problem, despite having made no changes to the forwarding settings. Now, inconsistently, they sometimes resolve and sometimes they do not. When we load the vanity URL with Chrome Dev Tools (Network pane) open, it shows the following redirect chains, where xxxxx represents a random 5-character string of lower and upper case letters (e.g. VGuTD). EXAMPLE:
  www.vanity.com (302, Found) -->
  www.vanity.com/xxxxx (302, Found) -->
  www.vanity.com/xxxxx (302, Found) -->
  www.vanity.com/xxxxx/xxxxx (302, Found) -->
  www.mydomain.com/sub-page/xxxxx (404, Not Found)
  This is just one example; the number of redirects varies wildly. Sometimes there is only 1 redirect, sometimes there are as many as 5. Sometimes the request will ultimately resolve on the correct mydomain.com/sub-page, but usually it does not (as in the example above). We have cross-checked across every browser, device, private/non-private, cookies cleared, on and off of our network, etc. This leads us to believe that it is not at the device or host level. Our registrar is GoDaddy. They have not encountered this issue before and have no idea where this 5-character string comes from. I tend to believe them because, per our analytics, we have determined that this problem only started yesterday. Our primary question is: has anybody else encountered this problem, either in the last couple of days or at any time in the past? We have come up with a solution that works to alleviate the problem, but implementing it across hundreds of vanity domains will take us an inordinate amount of time. Really hoping to fix the cause of the problem instead of just treating the symptom.
  Intermediate & Advanced SEO | SS.Digital
- How will changing my website's page content affect SEO?
  Our company is looking to update the content on our existing web pages, and I am curious what the best way to roll out these changes is in order to maintain good SEO rankings for certain pages. The infrastructure of the site will not be modified, except for maybe adding a couple of new pages, and existing domains will stay the same. If the domains are staying the same, does it really matter if I just update one page every week or so, versus updating them all at once? Just looking for some insight into how freshening up the content on the back-end pages could potentially hurt SEO rankings initially. Thanks!
  Intermediate & Advanced SEO | Bankable
- One of my friend's websites is losing Domain Authority. What could be the reason?
  Hello guys, one of my friend's websites has seen its Domain Authority decrease since they moved the domain from HTTP to HTTPS. There is another problem: his blog is in a subfolder and is still on HTTP. So, can you guys please tell me how to fix this issue? The site is also losing some rankings, about 2-5 positions down.
  Here is the website URL: myfitfuel.in/
  Here is the blog URL: myfitfuel.in/mffblog/
  Intermediate & Advanced SEO | Max_
- Partial Match or RegEx in Search Console's URL Parameters Tool?
  So I currently have approximately 1000 of these URLs indexed, when I only want roughly 100 of them. Let's say the URL is www.example.com/page.php?par1=ABC123=&par2=DEF456=&par3=GHI789= All the indexed URLs follow that same kind of format, but I only want to index the URLs that have a par1 of ABC (which could be ABC123 or ABC456 or whatever). Using the URL Parameters tool in Search Console, I can ask Googlebot to only crawl URLs with a specific value. But is there any way to get a partial match, using regex maybe? Am I wasting my time with Search Console, and should I just disallow any page.php without par1=ABC in robots.txt?
  Intermediate & Advanced SEO | Ria_
- 404s - Do they impact search ranking, and how do we get rid of them?
  Hi, we recently ran the Moz website crawl report and saw a number of 404 pages from our site come back. These were returned as "high priority" issues to fix. My question is, how do 404s impact search ranking? From what Google support tells me, 404s are "normal" and not a big deal to fix, but if they are "high priority", shouldn't we be doing something to remove them? Also, if I do want to remove the pages, how would I go about doing so? Is it enough to go into Webmaster Tools and list them as links not to crawl anymore, or do we need to do work on the website development side as well? Here are a couple of examples that came back; these are articles that were previously posted but we decided to close out: http://loyalty360.org/loyalty-management/september-2011/let-me-guessyour-loyalty-program-isnt-working http://loyalty360.org/resources/article/mark-johnson-speaks-at-motivation-show Thanks!
  Intermediate & Advanced SEO | carlystemmer
- After reading of Google's so-called "over-optimization" penalty, is there a penalty for changing title tags too frequently?
  In other words, does title tag change frequency hurt SEO? After changing my title tags, I have noticed a steep decline in impressions, but an increase in CTR and rankings. I'd like to change the title tags once again to try to regain impressions. Is there any penalty for changing title tags too often? From SEO forums online, there seems to be a bit of confusion on this subject...
  Intermediate & Advanced SEO | Felix_LLC
- Culling 99% of a website's pages. Will this cause irreparable damage?
  I have a large travel site that has over 140,000 pages. The problem I have is that the majority of pages are filled with duplicate content. When Panda came in, our rankings were obliterated, so I am trying to isolate the unique content on the site and go forward with that. The problem is, the site has been going for over 10 years, with every man and his dog copying content from it. It seems that our travel guides have been largely left untouched and are the only unique content that I can find. We have 1,000 travel guides in total. My first question is, would reducing 140,000 pages to just 1,000 ruin the site's authority in any way? The site does use internal linking within these pages, so culling them will remove thousands of internal links throughout the site. Also, am I right in saying that the link juice should now move to the more important pages with unique content, if redirects are set up correctly? And finally, how would you go about redirecting all these pages? I will be culling a huge number of hotel pages; would you consider redirecting all of these to the generic hotels page of the site? Thanks for your time, I know this is quite a long one, Nick
  Intermediate & Advanced SEO | Townpages
- Best practice for redirects based on visitors' detected language
  One of our websites has two languages, English and Italian. The English pages are available at the root level:
  www.site.com/ (English homepage), www.site.com/page1, www.site.com/page2
  The Italian pages are available under the /it/ level:
  www.site.com/it (Italian homepage), www.site.com/it/pagina1, www.site.com/it/pagina2
  When an Italian visitor first visits www.site.com, we'd like to redirect them to www.site.com/it, but we don't know whether that would impact search engine spiders (e.g. Googlebot) in any way. Would it be better to do a JavaScript redirect or an HTTP 3xx redirect? If so, which of the 3xx redirects should we use? Thank you
  Intermediate & Advanced SEO | Damiano
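For the vanity-domain question above, a small script that logs every hop of the redirect chain can help pinpoint where the random 5-character segment is first appended. This is only a minimal sketch, assuming Python with the `requests` library; the URL is a placeholder for one of the affected vanity domains.

```python
# Minimal sketch: print each hop of a redirect chain (status, URL, Location).
# Assumes the `requests` library; the URL below is a placeholder.
import requests

def print_redirect_chain(url: str) -> None:
    response = requests.get(url, allow_redirects=True, timeout=10)
    for hop in response.history:
        print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
    print(response.status_code, response.url)

print_redirect_chain("http://www.vanity.com/")
```

Running this from a few different networks should show whether the extra hops come from the forwarding service itself or from something closer to the client.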
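For the Domain Authority question above, one routine check after an HTTP-to-HTTPS move is that every old HTTP URL (including the blog subfolder) answers with a single 301 to its HTTPS counterpart. A minimal sketch, again assuming Python with the `requests` library; the URL list is illustrative.

```python
# Minimal sketch: confirm HTTP URLs return a single 301 to an HTTPS location.
# Assumes the `requests` library; the URLs are illustrative.
import requests

urls = [
    "http://myfitfuel.in/",
    "http://myfitfuel.in/mffblog/",
]

for url in urls:
    response = requests.get(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    ok = response.status_code == 301 and location.startswith("https://")
    print(url, response.status_code, location or "(no Location)", "OK" if ok else "CHECK")
```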
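For the URL Parameters question above, the partial-match idea itself is easy to express as a pattern. This sketch only illustrates the matching logic with Python's `re` module; it is not Search Console or robots.txt syntax, and the URLs come from the question's own example.

```python
# Minimal sketch: keep URLs whose par1 value starts with "ABC",
# regardless of the trailing characters. Illustration only.
import re

pattern = re.compile(r"[?&]par1=ABC[^&]*")

urls = [
    "http://www.example.com/page.php?par1=ABC123=&par2=DEF456=&par3=GHI789=",
    "http://www.example.com/page.php?par1=XYZ456=&par2=DEF456=&par3=GHI789=",
]

for url in urls:
    print(url, "keep" if pattern.search(url) else "exclude")
```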
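For the culling question above, bulk redirects are usually easier to maintain if the rules are generated from a list of culled URLs rather than written by hand. A minimal sketch, assuming Python and Apache-style `Redirect 301` directives; the paths and the target are placeholders.

```python
# Minimal sketch: emit Apache-style 301 rules that send culled hotel pages
# to a generic hotels page. Paths and the target are placeholders.
culled_paths = [
    "/hotels/rome/hotel-one",
    "/hotels/rome/hotel-two",
    "/hotels/paris/hotel-three",
]
generic_target = "/hotels/"

with open("redirects.conf", "w") as conf:
    for path in culled_paths:
        conf.write(f"Redirect 301 {path} {generic_target}\n")
```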
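For the language-detection question above, the server-side option usually means reading the Accept-Language header and issuing a 3xx response. This is a minimal sketch, assuming Python with Flask; the paths mirror the question, and it deliberately says nothing about how crawlers or hreflang should be handled.

```python
# Minimal sketch: 302-redirect visitors whose Accept-Language prefers Italian
# from the root to /it/. Assumes Flask; paths mirror the question.
from flask import Flask, redirect, request

app = Flask(__name__)

@app.route("/")
def home():
    best = request.accept_languages.best_match(["en", "it"])
    if best == "it":
        # A temporary redirect, one of the 3xx options raised in the question.
        return redirect("/it/", code=302)
    return "English homepage"

if __name__ == "__main__":
    app.run()
```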