Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Is link cloaking bad?
-
I have a couple of affiliate gaming sites and have been cloaking the links; the reason I do this is to avoid having so many external links on my sites.
In the robots.txt I tell the bots not to index my cloaked links.
Is this bad, or doesn't it really matter?
Thanks for your help.
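(To illustrate what's being described: a minimal robots.txt sketch, assuming the cloaked links are kept under a /bet/ folder, the folder mentioned later in this thread.)

    # robots.txt - minimal sketch; assumes the cloaked links live under /bet/
    User-agent: *
    Disallow: /bet/

Strictly speaking, robots.txt blocks crawling rather than indexing, but it does keep well-behaved bots away from the cloaked URLs.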
-
I can't make a judgement on it, but you might check out Graywolf's recent post on masking affiliate links: http://www.wolf-howl.com/affiliate-marketing/how-to-mask-affiliate-links/
-
Thanks
-
Thanks
-
Okay... when I think of a cloaked link, I think of a link that is hidden from the user and is there only for keyword or other SEO purposes. If your link has a function, I think you are okay, and the nofollow should do the trick.
-
So you use robots.txt to disallow indexing of anything under the /bet/ folder, you link to 'bet/XYZ' using nofollow and 'bet/XYZ' has a redirect on it?
I'm going to go with safe. It's a fairly common practice.
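(A sketch of the nofollow side of that setup; the URL is the one given elsewhere in this thread, and the 'Bet Now' anchor text is assumed from the buttons described.)

    <!-- internal cloaked link, marked nofollow so it passes no link equity -->
    <a href="http://www.comparebestodds.com/bet/betfair/" rel="nofollow">Bet Now</a>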
-
Not sure I explained myself properly, so I'll show an example that might help.
The links I am cloaking are behind buttons which say 'bet now'; the cloaked link is http://www.comparebestodds.com/bet/betfair/ and is set as nofollow.
The link it is cloaking (the actual destination) is http://www.betfair.com/?clkID=16251_67988CCB46EC4C389F77AD796257F6&rfr=16251. These links are important, as they are what will make money for my site, and they are important to the users, as this is what they need to click to be taken to a site to place a bet.
Hope that makes it a little clearer.
Thanks
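(For completeness, one way the redirect behind that cloaked URL might be implemented; the actual mechanism isn't stated in the thread, so this assumes an Apache host.)

    # .htaccess - hypothetical sketch; assumes Apache with mod_alias
    # send visitors who hit the cloaked path on to the affiliate destination
    Redirect 302 /bet/betfair/ http://www.betfair.com/?clkID=16251_67988CCB46EC4C389F77AD796257F6&rfr=16251

A 302 is often preferred over a 301 for affiliate redirects of this kind, since a temporary redirect is less likely to pass link equity to the destination.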
-
It sounds like you're already nofollowing the links. This will reduce the number of links on your page as the Search Engines see it, which looks to be your goal.
Assuming this is what you're aiming to do, there's no reason to hide your links. If you don't want search engines OR users to see them, just get rid of them altogether.
-
So are the links usable to visitors of the site?
If you think the links are valuable, you should add the "nofollow" attribute so you don't send over any link juice. Keeping them hidden from the visitor is a bad practice and I think it could potentially get you penalized. If you don't want them used, then don't have them on the page.
Related Questions
-
disavow link more than 100,000 lines
I received a huge amount of spammy links (most of them have a spam score of 100). Currently my disavow file is around 85,000 lines, but I have at least 100,000 more domains which I should add. All of them are domains; I don't have any individual backlinks in my file. My problem is that Google doesn't accept a disavow file of more than 2MB and shows this message: "File too big: Maximum file size is 100,000 lines and 2MB". What should I do now?
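(For context, a disavow file is plain text with one entry per line, either a full URL or a domain: directive; below is a hypothetical snippet with made-up domains.)

    # hypothetical disavow entries - one domain per line
    domain:spammy-example-1.com
    domain:spammy-example-2.net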
Technical SEO | sforoughi
-
Should we Nofollow Social Links?
I've been asked whether we should nofollow all of our social links; would this be a wise thing to do? I'm not exactly getting a clear answer from search results and thought you guys would be best to ask 🙂 Thanks in advance.
Technical SEO | JH_OffLimits
-
Personalized Content Vs. Cloaking
Hi Moz Community, I have a question about personalization of content: can we serve personalized content without being penalized for serving different content to robots vs. users? If content starts in the same initial state for all users, including crawlers, is it safe to assume there should be no impact on SEO, because personalization will not happen for anyone until there is some interaction? Thanks,
Technical SEO | znotes
-
Spammers created bad links to old hacked domain, now redirected to our new domain. Advice?
My client had an old site hacked (let's call it "myolddomain.com") and the hackers created many links in other hacked sites with links such as http://myolddomain.com/styless.asp?jordan-12-taxi-kids-cheap-T8927.html The old myolddomain.com site has since been redirected to a different new site, but we still see over a thousand spam links showing up in the new site's Search Console 404 crawl errors report. Also, using the links: operator in Google search, we see many results of spam links. Should we be worried about these bad links pointing to our old site and redirecting to 404s on the new site? What is the best recommendation for cleaning them up? Ignore? 410s? Other? I'm seeing conflicting advice out there. The old site is hosted by the client's previous web developer, who doesn't want to clean anything up on their end without an ongoing hosting contract. So beyond turning redirects on or off, the client doesn't want to pay for any additional hosting, and we don't have much control over anything related to "myolddomain.com". 😞 Thanks in advance for any assistance!
Technical SEO | usDragons
-
Referencing links in Articles and Blogs
Hi, I am wondering if the <sup> tag in HTML is picked up by Google as a reference point? I.e. when you put a superscript in Word, it puts a small number next to your sentence, and then you have a list of references at the end of the blog/article. Does Google recognise this?
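(For anyone unsure what is meant: a hypothetical snippet of the kind of markup being asked about, with a superscript footnote reference pointing to a reference list.)

    <!-- hypothetical markup for a superscript footnote reference -->
    <p>Link building still works.<sup><a href="#ref1">[1]</a></sup></p>
    <ol>
      <li id="ref1">Example Reference, 2012.</li>
    </ol>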
Technical SEO | Cocoonfxmedia
-
Are 404 Errors a bad thing?
Good morning... I am trying to clean up my e-commerce site and I created a lot of new categories for my parts. I've made the old category pages (which have had their content removed) "hidden" to anyone who visits the site and starts browsing. The only way you could get to those "hidden" pages is either by knowing the URLs that I used to use or if for some reason one of them has been spidered by Google. Since I'm trying to clean up the site and get rid of any duplicate content issues, would I be better served by adding those "hidden" pages that don't have much or any content to the robots.txt file, or should I just de-activate them so that even if you type the old URL you will get a 404 page? In this case, are 404 pages bad? You're typically not going to find those pages in the SERPs, so the only way you'd land on these 404 pages is to know the old URL I was using that has been disabled. Please let me know if you guys think I should be 404'ing them or adding them to robots.txt. Thanks
Technical SEO | Prime85
-
What is link Schemes?
Hello friends, today I was reading about link schemes at http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66356. It describes several ways to avoid Google penalties and also talks about low-quality links. But I can't understand the part about "Low-quality directory or bookmark site links". Is it talking about low PageRank, Alexa, or something else?
Technical SEO | KLLC
-
Having www. and non www. links indexed
Hey guys, As the title states, the two versions of the website are indexed in Google. How should I proceed? Please also note that the links on the website are without the www. How should I proceed, knowing that the client prefers to have the www. version indexed? Here are the steps that I have in mind right now: I set the preferred domain in GWMT as the one with www., and I 301 redirect any non-www URL to the www. version. What are your thoughts? Should I 301 redirect the URLs, or is setting the preference in GWMT enough? Thanks.
Technical SEO | BruLee
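(The non-www to www 301 redirect mentioned in that question is commonly handled with a rewrite rule along these lines; this is only a sketch, assuming an Apache host, with example.com standing in for the real domain.)

    # .htaccess - hypothetical sketch; example.com stands in for the client's domain
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]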