Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Switched from Wix to WordPress: the dreaded hashtag URL
I recently took over managing a site for a non-profit which was using the dreaded Wix. We switched over to WordPress, but Google still has the old URLs with the hashtag. I can't forward them in .htaccess, and I don't want to add JavaScript for fear of slowing down load time. I did find a solution at http://www.thedriversgarage.com/web-technology/redirecting-hashbang-urls-wix-urls/, but it looks like it would take hours and hours of work with all the URLs. I have submitted an XML sitemap in Google Webmaster Tools. My questions:

- How seriously could this affect SEO for my site? Google accepted the new sitemap but still shows the old URLs in the SERPs. How long does it generally take for them to be removed?
- Will the hashtag URLs penalize the site for duplicate content? If so, is there a way to tell Google that the homepage without hashtags is the page with the original content? Something like the rel=canonical tag, which I know won't work here, since the hashtag URLs all redirect to the homepage and so would all carry the tag.
- Does Google simply ignore the hashtag? Could there even be a benefit to this, with the homepage possibly gaining more page authority from the redirects?

How serious is this? Thanks in advance.
I'm in the same boat, and even tried the Drivers Garage solution (which is also posted on quite a few other blogs). Unfortunately, that did not work for me. Neither did the Redirection WP plugin, nor did editing my .htaccess a zillion different ways. Heck, I even tried creating directories and HTML files with embedded JavaScript. Here is the only redirection that DID work for me (as Peter indicated it would):

(1) Create a JavaScript file with this code:

// Map each old Wix hashbang fragment to its new WordPress path
var hashesarr = {
    "#!old-news/chi3": '/new-page/',
    "#!another-news/dkc8": '/another-new-page/',
    "#!something-old/eckje8": '/something-new/'
};

// If the current URL's fragment matches one of the old fragments,
// send the browser to the corresponding new page
for (var hash in hashesarr) {
    var patt = new RegExp(hash);
    if (window.location.hash.match(patt) !== null) {
        window.location.href = hashesarr[hash];
    }
}

(2) Save that file to your theme's child folder, so it doesn't get overwritten in the future by theme or WordPress updates. I saved my file here: \wp-content\themes\aweseometheme-child\

(3) In your SEO plugin, or wherever you can edit the home page's head, add a script tag that loads the file (a sketch follows below).

(4) Test, make changes, try again, and presto!

As a disclaimer, I have not yet tested how this will affect PageRank or Google redirects. I'm guessing I will still have to implement the sitemap with the ugly URLs per the Drivers Garage. But all my client really cared about was that clients who had bookmarked specific pages, or had links pointing to deep pages, would be redirected properly.

My aha answer was found here:
http://www.simosh.com/article/cbgaifec-301-redirect-from-wix-to-wordpress.html
(Alex Nikitenko is a genius!)

And the JavaScript instructions here:
https://codex.wordpress.org/Using_Javascript
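Step (3) boils down to a script tag pointing at the file from step (2). A minimal sketch, assuming the file was saved as hash-redirect.js (a hypothetical name) in the child theme folder mentioned above:

<!-- Load the hashbang redirector on the page the old URLs resolve to -->
<script src="/wp-content/themes/aweseometheme-child/hash-redirect.js"></script>

Since the redirect runs in the visitor's browser, the tag only needs to appear on the page the old hashbang URLs actually resolve to (here, the homepage).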
Tough situation. Why? The browser doesn't send the # and everything behind it to the server. So if you try to fetch a URL such as http://www.example.com/#!my-super-duper-url, the browser sends the server a request for http://www.example.com/ and the server processes that, even though the full URL the browser wants includes the #! fragment. This means you can't do a .htaccess redirect, nor any other server-side redirect, for the moment. The same hurts all bots and crawlers too (including Moz's Roger!). There was a solution:
https://developers.google.com/webmasters/ajax-crawling/docs/specification?hl=en
but that solution was later deprecated:
https://googlewebmastercentral.blogspot.bg/2015/10/deprecating-our-ajax-crawling-scheme.html
And this makes things complicated. For now Google still supports the old scheme, so bots will be OK (a sketch of how it works follows below). But some users coming from bookmarks, emails, and/or other traffic sources may have a hard time, because they will be redirected to the "homepage". So maybe a combination of both methods (the JS redirector plus your actual method) can save the day for both humans and bots.
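To make the bot side concrete: under the (now-deprecated) AJAX crawling scheme linked above, a crawler that sees a #! URL instead requests an "ugly" equivalent in which the fragment is moved into a query parameter the server can see. A minimal sketch of that mapping (the function name is mine, not part of the spec):

// Rewrite a hashbang URL into the "_escaped_fragment_" form that
// crawlers requested under Google's deprecated AJAX crawling scheme.
function toEscapedFragmentUrl(url) {
    var parts = url.split('#!');
    if (parts.length < 2) return url; // no hashbang, nothing to rewrite
    var base = parts[0];
    // The scheme requires the fragment value to be percent-encoded
    var fragment = encodeURIComponent(parts[1]);
    var sep = base.indexOf('?') === -1 ? '?' : '&';
    return base + sep + '_escaped_fragment_=' + fragment;
}

// toEscapedFragmentUrl('http://www.example.com/#!my-super-duper-url')
// -> 'http://www.example.com/?_escaped_fragment_=my-super-duper-url'

Because the server does receive the _escaped_fragment_ query string, those ugly URLs can be 301-redirected server-side for crawlers (presumably what per-URL solutions like the Drivers Garage one rely on), while human visitors never send it, which is why the JS redirector is still needed for them.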
Related Questions
- Missing trailing slash in URL on subpages resulting in Moz PA of 1
  Even here in the Moz community I am noticing it. Is it really a factor to have a trailing slash on the page? Does it make a difference? Our website has a homepage PA of 63 and a DA of 56, but all of our sub-pages are just 1, and they have been up for 4 months.
- Website Redesign - What to do with old 301 URLs?
  My current site is on WordPress. We are currently designing a new WordPress site with the same URLs. Our current approach is to go into the server, delete the current website files, and add the new website files. My current site has old URLs which are 301 redirected to the current URLs. Here is my question: in the redesign process, do I need to create pages for the old 301-redirected URLs so that we do not lose them in the launch of the new site? Or do the 301 rules exist outside of the website files, so this does not matter? Thank you in advance.
- Interlinking using Dynamic URLs Versus Static URLs
  Hi guys, could you kindly help us choose the best approach out of the 2 cases below?
  Case 1 (what we use now): we interlink our static pages (www.abc.com/jobs-in-chennai) through the footer, navigation, and related searches. Self-referential canonical tags have been implemented.
  Case 2 (what we plan to use): we interlink our dynamic pages (www.abc.com/jobs-in-chennai?source=footer) through the footer, navigation, and related searches. Canonical tags have been implemented on the dynamic URLs pointing to the corresponding static URLs (sketched just below).
  Query 1: which one is better and expected to improve rankings? Query 2: will shifting to Case 2 negatively affect our existing rankings or traffic? Regards
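For reference, the Case 2 setup described above amounts to a canonical tag in the head of each dynamic URL pointing at its static twin. A minimal sketch using the example URLs from the question (illustrative only):

<!-- Served on www.abc.com/jobs-in-chennai?source=footer -->
<link rel="canonical" href="http://www.abc.com/jobs-in-chennai" />

This asks search engines to consolidate ranking signals from the parameterized URL onto the static one; whether that extra hop beats simply linking to the static URLs directly (Case 1) is exactly what the question is weighing.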
- How to know if a WordPress theme is coded correctly for SEO
  Hi, I am curious whether there is a tool to see if a site is coded properly for Google. I am running Avada, a standalone theme, and I am also using a cache plugin. But when I view my source code, it's all on one huge line. So is there a way to verify or check whether a theme is coded correctly? Thank you
- Old site to new WordPress site - Client concerned about Yahoo ranking
  Hello. Back story: I have a client (a law firm) with a large .html website. He has been doing his own SEO for years, and it shows. I think the only reason he reached out to a professional is that he got a huge penalty from Google last fall and fell very far down in the rankings. He still retains the #1 spot in Yahoo for the keyword phrase he wants, though. I have been creating a new WordPress theme for the client, building all new pages, and updating the formatting/SEO. From the beginning I have told the client that when we delete the old site and install the new WordPress site (same domain name, but a different page hierarchy), he will take a bump in the search engines until all the 301 redirects get sorted out. I told him I can't guarantee any time frame for how long the dip in SEO will last; some sites bounce right back while others take longer. Last week, during a discussion, he told me that if he loses his #1 ranking on Yahoo for any length of time, he thinks he will go out of business. Needless to say I was a little taken aback. When it comes to SEO I use best-practice techniques, do my research, and stay on top of trends, but I never guarantee rankings when moving to a new site. I'm thinking of ways I can alleviate any huge SEO drop-off and help the client. Here is what I was thinking of suggesting, and I would love some feedback.
  Main question: he has another domain he isn't doing anything with; it's pretty much his domain name with "pc" added. I was thinking about using that domain to create a simple 1-2 page WordPress website with brand-new content (no duplicate content) aimed at attracting his keyword phrase. I would do as much SEO as I could with a 1-2 page site and give it a month or so to see if this smaller site can get into the top 10 in Yahoo, or higher. Then, when we move the main site, he will still have a website on the first page of Yahoo for his keyword phrase. I hope I explained it clearly 🙂 I would be open to any suggestions. Thanks
- How to bounce back after a new URL & new site design?
  About a month ago, my company changed domains (from the long-established www.imageworksstudio.com to the new www.imageworkscreative.com) and also did a complete overhaul of our site. We tried to do everything necessary to keep Google happy as we went through this change, but we've suffered a drastic loss of both rankings and traffic. I know that can happen as a result of a redesign AND as a result of a new domain, but I'm wondering how long you would expect it to take before we bounce back, and also what we can do in the meantime to improve.
- Best Practice issue: MODX vs WordPress
  Lately I've been working a lot with MODX, to create a new site for our own firm as well as for other projects. But so far I haven't seen the advantages for SEO purposes, other than the fact that with MODX you can manage almost everything yourself, including snippets etc., without too much effort. WordPress has been a known factor for blogging, and for the last 2 years or so for websites. My question is: which platform is better suited for SEO purposes? Which should I invest my time in, MODX or WordPress? Hope to hear your thoughts on the matter.
- The use of foreign characters and capital letters in URLs
  Hello all. We have 4 language domains for our website, and a number of our Spanish landing pages are written using Spanish characters, most notably ñ and ó. We have done our research around the web and realised that many of the top competitors for keywords such as Diseño Web (web design) and Aplicación iPhone (iPhone application) DO NOT use these special characters in their URL structure. Here is an example of our URLs: http://www.twago.es/expert/Diseño-Web/Diseño-Web. However, when I simply copy-paste a URL that contains a special character, it is automatically translated and encoded (see the sketch below): http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone.
  My first question: seeing how the overwhelming majority of website URLs DO NOT contain special characters (even for Spanish/German keywords, these are usually written using the standard Latin alphabet), is there a negative effect on our SEO rankings/efforts because we are using special characters? When we write anchor text for backlinks to these pages, we USE the special characters in the anchor text (as do most of our competitors). Does the anchor text have to match the URL exactly? I know most web browsers can understand the special characters, especially when returning search results to users who type the special characters within their search query (or not). But we keep thinking: if we were doing the right thing, why does everyone else do it differently?
  My second question is the same, but focused on the use of capital letters in our URL structure.
  NOTE: when we do a broken-link check with some link tools (such as Xenu), the URLs that contain the special Spanish characters are marked as "broken". Is this a related issue?
  Any help anyone could give us would be greatly appreciated! Thanks, David from twago
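For what it's worth, the automatic "translation" described above is standard percent-encoding: browsers encode non-ASCII characters in a URL as UTF-8 bytes. A quick sketch of the form the server (and link checkers) actually see, using JavaScript's built-in encodeURI:

// Browsers percent-encode non-ASCII URL characters as UTF-8 bytes;
// encodeURI reproduces the form that is actually sent over the wire.
var pretty = 'http://www.twago.es/expert/Diseño-Web/Diseño-Web';
console.log(encodeURI(pretty));
// -> http://www.twago.es/expert/Dise%C3%B1o-Web/Dise%C3%B1o-Web

Both forms identify the same page, but tools that compare URLs byte-for-byte (for example, some broken-link checkers, as with the Xenu issue mentioned above) may flag one of the two forms as broken.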