Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
4XX Errors - Adding %5c%5c to Links
-
Hi all
Hope someone can help me with this.
The internal links on my hubby's business site occasionally break and add %5c%5c%5c endlessly to the end of the URL - like this:
site.com/about/hours-of-operation/\\\\\\\\%
I cannot for the life of me figure out why it is doing this and while it has happened to me from time to time, I can't recreate it.
My crawl diagnostics here in my SEOmoz campaign show 19-20 URLs doing this - it's nuts.
Any insight?
Thank you!!
Jennifer
~PotPieGirl
-
Both Shane and I looked and neither of us saw any / vs \ issues.
Then, I just took a peek at my source code and look what I saw:
http://screencast.com/t/y02R4RS2L
Think that is it?
Thanks for replying!
Jennifer
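(For anyone who wants to run the same check without digging through view-source by hand, here is a minimal sketch using only Python's standard library - the URL is a placeholder, not the actual site:)

```python
# Fetch one page and flag any tag attribute whose value contains a
# literal backslash - the character that shows up as %5C once crawled.
from html.parser import HTMLParser
from urllib.request import urlopen


class BackslashFinder(HTMLParser):
    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if value and "\\" in value:
                print(f"<{tag}> has a backslash in {name}={value!r}")


page = urlopen("http://example.com/about/hours-of-operation/")
BackslashFinder().feed(page.read().decode("utf-8", "replace"))
```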
-
Sent you a PM, Shane - thanks so much for the offer!
By "occasionally break", I mean that every now and again, any link on the site will freak out and add the %5c gibberish to the end.
Jennifer
-
It would really have to be...
The only reason for a %5C is the use of a backslash - as that is actually what it means in code.
**"business site occasionally break"**
How do you mean "break"?
**"My crawl diagnostics here in my SEOmoz campaign show 19-20 URLs doing this - it's nuts."**
Is the only issue from this in the SEOmoz reports?
If you would like to PM me the site, I can attempt to profile it.
-
Thanks for replying so quickly, Shane!
I don't believe it's a / vs \ issue. It's a WordPress site. Key pages are in the top nav bar (all URLs correct) and all the sidebar links are 'widgets' created by WordPress.
For some reason I am suspecting a theme issue, but if I can't recreate the error, I'll have no way of knowing if changing the theme solves the problem.
The site has been online since 2010 with no issues... this is a new issue (past couple of months, according to crawl diagnostics).
Thanks!!!
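(Since the error is hard to trigger by hand, one option is to let a script do the clicking. A rough sketch - placeholder start URL, standard library only - that walks internal links and flags anything containing a backslash or %5C; if a theme swap fixes the markup, a re-run should come back clean:)

```python
# Breadth-first crawl of internal links, flagging URLs that contain a
# literal backslash or its encoded form %5C. The start URL is a placeholder.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

START = "http://example.com/"


class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)


def crawl(start, limit=200):
    seen, queue = set(), deque([start])
    while queue and len(seen) < limit:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        if "\\" in url or "%5c" in url.lower():
            print("suspect URL:", url)   # the pattern from the crawl report
            continue                     # no need to fetch obviously broken URLs
        try:
            html = urlopen(url).read().decode("utf-8", "replace")
        except Exception:
            continue
        collector = LinkCollector()
        collector.feed(html)
        for href in collector.hrefs:
            absolute = urljoin(url, href)
            if urlparse(absolute).netloc == urlparse(start).netloc:
                queue.append(absolute)


crawl(START)
```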
-
Somewhere in your URL coding you added a backslash (\) instead of a slash (/), which is not valid.
-
It appears from research that it is actually \ (backslash), not / (slash).
So possibly somewhere on your site you have used \ instead of / - but of course this is just a possibility. %5C is simply the URL-encoded (percent-encoded) form of the \ character.
Hope this helps
PS: a quick check at http://www.w3schools.com/tags/ref_urlencode.asp verified that %5C is the URL encoding for backslash - it is treated as a special character, so somewhere in your CMS or code a backslash has been used instead of a slash.
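(For reference, the encoding itself is easy to verify locally. A minimal sketch using Python's standard library - the path below is a made-up placeholder, not the actual site:)

```python
from urllib.parse import quote

# %5C is nothing more than the percent-encoded form of a backslash,
# the same way %20 is a space and %22 is a double quote.
print(quote("\\", safe=""))    # -> %5C

# So if a stray backslash leaks into an href, a crawler reports it
# percent-encoded in the URL it requested.
broken_href = "/about/hours-of-operation/\\\\"   # hypothetical broken link target
print(quote(broken_href, safe="/"))              # -> /about/hours-of-operation/%5C%5C
```

That output matches the %5C%5C pattern showing up in the crawl report.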
Related Questions
-
Sitemap error in Webmaster tools - 409 error (conflict)
Hey guys, I'm getting this weird error when I submit my sitemap to Google. It says I'm getting a 409 error in my post-sitemap.xml file (https://cleargear.com/post-sitemap.xml). But when I check it, it looks totally fine. I am using Yoast SEO to generate the sitemap.xml file. Has anyone else experienced this? Is this a big deal? If so, does anyone know how to fix it? Thanks
Technical SEO | | Extima-Christian0 -
Schema Markup Errors - Priority or Not?
Greetings All... I've been digging through the search console on a few of my sites and I've been noticing quite a few structured data errors. Most of the errors are related to: hcard, hentry and hatom. Most of them are missing author & entry-title, while the other one is missing: fn. I recently saw an article on SEL about Google's focus on spammy mark-up. The sites I use are built and managed by vendors, so I would have to impress upon them the impact of these errors and have them prioritize, then fix them. My question is whether or not this should be prioritized? Should I have them correct these errors sooner than later or can I take a phased approach? I haven't noticed any loss in traffic or anything like that, I'm more focused on what negative impact a "phased approach" could have. Any thoughts?
Technical SEO | | AfroSEO0 -
How to set up internal linking with subcategories?
I'm building a new website and am setting up the internal link structure with subcategories, hoping to do so with best SEO practices in mind. When linking to a subcategory's main page, would I make the internal link www.xxx.com/fishing/ or www.xxx.com/fishing/index.html, or does it matter? I'm just trying to avoid duplicate content, I guess, if Google saw each page as a separate page. Any other cautions when using subdirectories in my navigation?
Technical SEO | | wplodge0 -
Links from Instructables.com?
This is a silly newbie question, but will posting on www.instructables.com with some valuable content and a URL link back to my site help with "linking"? Or do they put a nofollow on all links on their site? Thanks for answering! Ron
Technical SEO | | yatesandcojewelers0 -
How to fix broken links?
Hi, I use WordPress CMS with the Yoast SEO plugin. I have just found out that my 403 errors increased dramatically. It seems that all my tags below each post are being broken for some reason. When I click on the tags I get the following message: **403 Forbidden Request forbidden by administrative rules.** I assume it has something to do with the configuration within the Yoast SEO plugin. Does anyone know how I should fix that? Thanks, Raviv
Technical SEO | | Indiatravelz0 -
Are 404 Errors a bad thing?
Good morning... I am trying to clean up my e-commerce site and I created a lot of new categories for my parts... I've made the old category pages (which have had their content removed) "hidden" to anyone who visits the site and starts browsing. The only way you could get to those "hidden" pages is either by knowing the URLs that I used to use or if for some reason one of them is spidered in Google. Since I'm trying to clean up the site and get rid of any duplicate content issues, would I be better served by adding those "hidden" pages that don't have much or any content to the robots.txt file, or should I just deactivate them so that even if you type the old URL you will get a 404 page... In this case, are 404 pages bad? You're typically not going to find those pages in the SERPs, so the only way you'd land on these 404 pages is to know the old URL I was using that has been disabled. Please let me know if you guys think I should be 404'ing them or adding them to robots.txt. Thanks
Technical SEO | | Prime850 -
Links from the same server has value or not
Hi guys, some time ago one of the SEO experts said to me that if I get links from the same IP address, Google doesn't count them as having much value. For example, I am a web developer and I host all my clients' websites on one server and link them back to me. I'm wondering whether those links have any value when it comes to SEO, or whether I should consider getting different hosting providers? Regards, Uds
Technical SEO | | Uds0 -
How to find links to 404 pages?
I know that I used to be able to do this, but I can't seem to remember. One of the sites I am working on has had a lot of pages moving around lately. I am sure some links got lost in the fray that I would like to recover, what is the easiest way to see links going to a domain that are pointing to 404 pages?
Technical SEO | | MarloSchneider0