Switched from Wix to WordPress: the dreaded hashtag URLs
-
Recently took over managing a site for a non-profit that was using the dreaded Wix. I switched it over to WordPress, but Google still has the old URLs with the hashtag. I can't forward them in .htaccess, and I don't want to add JavaScript for fear of slowing down load time.
I found a solution at http://www.thedriversgarage.com/web-technology/redirecting-hashbang-urls-wix-urls/, but with all the URLs involved it looks like it would take hours and hours of work.
I submitted an XML sitemap in Google Webmaster Tools.
My question is: how seriously could this affect my site's SEO? Google accepted the new sitemap but still shows the old URLs in the SERPs. How long does it generally take for them to be removed? Will the hashtag URLs penalize the site for duplicate content? If so, is there a way to tell Google that the homepage without hashtags is the page with the original content? Something like the rel=canonical tag, which I know won't work here, since the hashtag URLs all redirect to the homepage and so every one of them would carry the same tag.
Does Google ignore the hashtag? Could there even be a benefit to this, such as the homepage gaining more page authority from the redirects? How serious is this? Thanks in advance.
-
I'm in the same boat, and even tried the DRIVERS GARAGE solution (which is also posted on quite a few other blogs). Unfortunately, that did not work for me. Neither did the REDIRECTION WP plugin, nor did editing my .htaccess a zillion different ways. Heck, I even tried creating directories and HTML files with embedded JavaScript.
Here is the only redirection that DID work for me (as Peter indicated it would):
JAVASCRIPT
(1) Create a JavaScript file with this code:

// Map each old Wix hashbang fragment to its new WordPress path.
var hashesarr = { "#!old-news/chi3": '/new-page/',
                  "#!another-news/dkc8": '/another-new-page/',
                  "#!something-old/eckje8": '/something-new/' };

// If the fragment of the current URL matches one of the old Wix URLs,
// send the browser to the corresponding new page.
for (var hash in hashesarr) {
    var patt = new RegExp(hash);
    if (window.location.hash.match(patt) !== null) {
        window.location.href = hashesarr[hash];
    }
}

(2) Save that file to your theme's child folder (so it doesn't get overwritten in the future by theme or WordPress updates).
I saved my file here: \wp-content\themes\aweseometheme-child\
(3) In your SEO plugin, or wherever you can edit the home page's HEAD, add this code:
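Something along these lines, assuming the file from step (2) was saved as wix-redirects.js (that filename is just a placeholder for whatever you named yours):

<script type="text/javascript" src="/wp-content/themes/aweseometheme-child/wix-redirects.js"></script>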
(4) Test, make changes, try again and PRESTO!
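One note on the code in step (1): since the old Wix hashes are fixed strings, the regular-expression matching isn't strictly necessary. A simpler exact-match variant, just a sketch reusing the same hashesarr, would be:

// Exact-match lookup: works when the fragment matches a key exactly.
var target = hashesarr[window.location.hash];
if (target) {
    // location.replace keeps the dead hashbang URL out of the browser history.
    window.location.replace(target);
}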
As a disclaimer, I have not yet tested how this affects PageRank or Google's handling of the redirects. I'm guessing I will still have to implement the sitemap with the UGLY URLs per the DRIVERS GARAGE. But all my client really cared about was that people who had bookmarked specific pages, or had links pointing to deep pages, would be redirected properly.
MY AHA ANSWER WAS FOUND HERE:
http://www.simosh.com/article/cbgaifec-301-redirect-from-wix-to-wordpress.html
(Alex Nikitenko is a genius!)
AND JAVASCRIPT INSTRUCTION HERE:
https://codex.wordpress.org/Using_Javascript
-
Tough situation. Why? The browser doesn't send the # and everything after it to the server.
So if you try to open a URL such as http://www.example.com/#!my-super-duper-url
the browser sends the server a request for http://www.example.com/ and the server processes that. The full URL the browser wants, #! fragment included, never reaches the server. This means you can't do an .htaccess redirect, or any other server-side redirect, for these URLs at the moment. The same limitation also hurts all bots and crawlers (including Moz's Roger!). There was a solution for this:
https://developers.google.com/webmasters/ajax-crawling/docs/specification?hl=en
but that solution was later deprecated:
https://googlewebmastercentral.blogspot.bg/2015/10/deprecating-our-ajax-crawling-scheme.html
And this makes things complicated. For now Google still supports the old scheme, so bots will be OK. But some users coming from bookmarks, emails, and/or other traffic sources may have a hard time, because they will be redirected to the "homepage". So maybe a combination of both methods (a JS redirector plus your current approach) can save the day for both humans and bots.
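To illustrate what that (now deprecated) scheme did: a crawler that saw a #! URL would instead fetch an "ugly" equivalent in which the fragment travels as a query parameter, and a query parameter, unlike a fragment, does reach the server. A rough sketch of the mapping in JavaScript (illustrative only; see the specification above for the exact escaping rules):

// Rough sketch of the deprecated AJAX crawling scheme's URL mapping:
// a "pretty" URL like http://www.example.com/#!old-news/chi3 maps to an
// "ugly" one like http://www.example.com/?_escaped_fragment_=old-news%2Fchi3
function toCrawlableUrl(prettyUrl) {
    var parts = prettyUrl.split('#!');
    if (parts.length < 2) {
        return prettyUrl; // no hashbang fragment, nothing to map
    }
    // encodeURIComponent approximates the spec's escaping rules here.
    return parts[0] + '?_escaped_fragment_=' + encodeURIComponent(parts[1]);
}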
Related Questions
-
Do WordPress sites outrank SquareSpace?
I was a big fan of WordPress. I used it for 10 years. However, because I run a very small business, the constant upkeep WP needs started to frustrate me in the end, so I moved to SquareSpace. However, I am beginning to question my decision, as one of my sites is struggling really badly, and I mean badly. The other sites are okay. So I started asking around, and most people are saying there shouldn't be a difference. A few people have said their WordPress sites always outrank their SquareSpace sites. Then I read what Rand Fishkin said in the Twitter thread below, and now I am even more confused. I am very reluctant to move back to WordPress, it's just so much hassle. But at the same time, if a site doesn't get much traffic then it's useless. https://twitter.com/drew_pickard/status/991659074134556673 https://twitter.com/randfish/status/991974456477278209 Please let me know your thoughts and experience.
Web Design | RyanUK -
WordPress redirects are taking too long to navigate: has anyone ever faced this?
Hi community, We are using a WordPress website. We have redirected hundreds of URLs with the WordPress redirect manager over the last 10 years or so. Suddenly, over the last week, the redirects are taking too long, around 1 minute, to reach their destination pages. Has anybody faced the same issue? Please help me with this. Thanks
Web Design | vtmoz -
Question Mark In URL??
So I am looking at a site for a client, and I think I already have my answer, but I wanted to check with you guys. First off, the site is in Flash and HTML. I told the client to dump the Flash site, but she isn't willing right now. So the URLs are generated like this. Flash: http://www.mysite.com/#/page/7ca2/wedding-pricing/ HTML: http://www.mysite.com/?/page/7ca2/wedding-pricing/ Checking the site in Google with a site:mysite search, none of the interior pages are indexed at all. That tells me that Google is pretty much ignoring everything past the # or ?. Is that correct? My recommendation is to dump the Flash site and redo the URLs in an SEO-friendly format.
Web Design | netviper -
Yes or No for Ampersand "&" in SEO URLs
Hi Mozzers, I would like to know how crawlers see the ampersand (& or &amp;) in URLs, and whether Google frowns upon this or not. As far as I know they simply recognise it as "and"; is this correct, and is there any best practice for implementing it? I know a lot of people have complained before about & in links, saying it is better to write it as &amp;, but this is not about links, this is about URLs. The reason for asking is that we are looking to move onto an ASP.NET MVC framework (any suggestions for a different framework are welcome; we are still just planning out future development), and in order to make use of the filter options we have on our site we need a parameter to indicate the difference at a routing level (routing sends to controller, controller sends to model, model sends to controller, and controller sends to view; this is the pattern of a request that comes in on the framework we will be using). I already have -'s and /'s in the URLs (which is for my SEO structuring), so those characters can't be used for identifying the filters a user clicks or uses to define their search, as it would create a complete mess in the system. Now we are looking at & to say: OK, when a user lands on /accommodation and selects De Kelders (which is a destination in our area), the page will be /accommodation/de-kelders. On this page they can refine their search further, to say they are looking for 5-star accommodation close to the beach. This is where the routing needs some guidance, and we are looking to have it as follows: /accommodation/de-kelders/5-star&close-to-the-beach. Now, does the "&" get identified by search engines at a URL level as "and", and does this cause any issues with crawling or indexation, or would it be best to look at another solution? Thanks, Chris Captivate
Web Design | DROIDSTERS -
URLs with Hashtags - Does Google Index Them?
Hi there, I have a potential issue with a site whereby all pages are dynamically populated using JavaScript. Thus, an example URL on their site would be www.example.com/#!/category/product. I have read lots of conflicting information on the web: some say Google will ignore everything after the hashtag; other people say that Google will now index everything after the hashtag. Does anybody have any conclusive information about this? Any links to Google or Matt Cutts as confirmation would be brilliant. P.S. I am aware of the potential issue of duplicate content, but I can assure you that has been dealt with. I am only concerned about whether Google will index full URLs that contain hashtags. Thanks all! Mark
Web Design | markadoi84 -
URL structure for multiple cities?
Hi, I am in the process of setting up a business directory site that will be used in a number of cities, though I am initially launching with only one city. My question is, what is the best URL structure to use for the site, and should I use this URL structure from day one? At the moment I am using www.mysite.com.au as my primary website, where it contains all listings for the one initial launch city. To plan for the future, though, I was considering this URL structure: www.mysite.com.au/cityname. So, for example, if I launch in the city of Sydney initially, then all website traffic that goes to www.mysite.com.au would simply be redirected (302 temp redirect?) to www.mysite.com.au/sydney. When I expand to other cities, www.mysite.com.au would simply be a "select your city" screen that then redirects to the city of choice (similar to the www.groupon.com page). How would doing a 302 redirect from www.mysite.com.au to www.mysite.com.au/city impact SEO for the initial launch? Or should I just place this on the root domain, since no other cities exist at the moment?
Web Design | adamkirk -
Missing Meta Description Tag - WordPress Tags
I am going through my crawl diagnostics issues and I have lots of "Missing Meta Description Tag" errors. However, when I look at the URLs, they are WordPress tag pages, which do not have a meta description. Shall I just ignore these errors, or should I find a way to add a meta description? Is it important?
Web Design | petewinter -
The use of foreign characters and capital letters in URLs?
Hello all, We have 4 language domains for our website, and a number of our Spanish landing pages are written using Spanish characters, most notably ñ and ó. We have done our research around the web and realised that many of the top competitors for keywords such as Diseño Web (web design) and Aplicación iPhone (iPhone application) DO NOT use these special characters in their URL structure. Here is an example of our URLs. EX: http://www.twago.es/expert/Diseño-Web/Diseño-Web However, when I simply copy-paste a URL that contains a special character, it is automatically translated and encoded. EX: http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone (when written out longhand it appears: http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone). My first question is: seeing how the overwhelming majority of website URLs DO NOT contain special characters (and even for Spanish/German keywords these are simply written using the standard Latin alphabet), is there a negative effect on our SEO rankings/efforts because we are using special characters? When we write anchor text for backlinks to these pages we USE the special characters in the anchor text (as do most of our competitors). Does the anchor text have to match exactly? I know most web browsers can understand the special characters, especially when returning search results to users who either type the special characters within their search query or not. But we seem to think that if we were doing the right thing, then why does everyone else do it differently? My second question is the same, but focusing on the use of capital letters in our URL structure. NOTE: When we do a broken link check with some link tools (such as Xenu), the URLs that contain the special Spanish characters are marked as "broken". Is this a related issue? Any help anyone could give us would be greatly appreciated! Thanks, David from twago
Web Design | wdziedzic