Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
One Way Links vs Two Way Links
-
Hi,
I was speaking to a client today and got asked how damaging two-way links are, i.e. domaina.com links to domainb.com and domainb.com links back to domaina.com. I need a nice, simple layman's explanation of whether (and how) they are damaging compared to one-way links. And please don't answer with "you lose link juice", as I have a hard enough job explaining link juice as it is... I am explaining things to a non-techie!
Thank you!!
-
Think about it: you pass PR to me and I'll pass PR to you, and at the same time we'll tell Google we are in cahoots together. :(
The easiest way to explain it is that linking out to other sites from your site might make Google think you are selling links, which could get you penalized. So as an SEO, the rule is to never link out to other sites unless they are trusted domains like Facebook, Twitter or G+ pages, and in those cases you are linking to your social media profile pages for the user, not to help other sites rank better, which is against Google's TOS.
There is information on this directly in Google's Webmaster Guidelines that you can provide to them. Sometimes people prefer to hear it straight from the horse's mouth.
-
Link exchanges used to do a lot; these days, not so much. You can still get a benefit from them, but you need to be careful: don't use the same anchor text, try not to get sitewide footer links, and only exchange with highly valuable partners.
If you can get a DA 90 site to link back to your DA 13 site, then yeah, a link exchange would probably benefit you a bit.
But overall, one-way links are the best way to go.
-
I guess the simplest way to explain it is that they cancel each other out, so they have no benefit.
Additionally, if used in excess with keyword anchor text they may have a negative impact (through skewing the anchor text ratios).
Related Questions
-
How Many Links to Disavow at Once When Link Profile is Very Spammy?
We are using Link Detox (Link Research Tools) to evaluate our domain for bad links. We ran a domain-wide Link Detox Risk report. The report showed a "High Domain DETOX RISK" with the following results:
- 42% (292) of backlinks with a high or above average detox risk
- 8% (52) of backlinks with an average or below average detox risk
- 12% (81) of backlinks with a low or very low detox risk
- 38% (264) of backlinks were reported as disavowed
This looks like a pretty bad link profile. Additionally, more than 500 of the 689 backlinks return "404 Not Found", "403 Forbidden", "410 Gone" or "503 Service Unavailable". Is it safe to disavow these? Could Google be penalizing us for them? I would like to disavow the bad links, but my concern is that there are so few good links that removing the bad ones will kill link juice and really damage our rankings and traffic. The site still ranks for terms that are not very competitive, and we receive about 230 organic visits a week. Assuming we need to disavow about 292 links, would it be safer to disavow 25 per month while we are building new links, so we do not radically shift the link profile all at once? Also, many of the bad links are 404 or page-not-found errors. Would it be OK to disavow all of these at once? Any risk to that? Would we be better off just building links and leaving the bad links up? Alternatively, would disavowing the bad links potentially help our traffic? It just seems risky because the overwhelming majority of links are bad.
Intermediate & Advanced SEO | Kingalan1
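For anyone weighing the mechanics rather than the strategy: the disavow file that Google's Disavow Links tool accepts is just a plain text file with one entry per line, either a full URL or a domain: rule, plus optional # comments. A minimal sketch with made-up domains (not this site's actual backlinks) might look like this:

    # individual spammy pages
    http://spam-directory.example.com/listing-123.html
    http://article-farm.example.net/keyword-anchor-page/
    # whole domains where most of the links are toxic
    domain:cheap-links.example.org
    domain:link-network.example.info

Disavowing in batches, as suggested above, just means re-uploading a longer version of this file each month; Google uses the most recently uploaded file for the domain.
-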
Does Google credit links from iframes or links created by JavaScript, and if so, is one more powerful than the other?
Consider this example, because I want to be clear about what I mean. You have two websites. Let's call them www.a.com and www.b.com. On www.a.com/some/page, there is an iframe something like this:
<iframe src="www.b.com/some/special/path"></iframe>
The content of this iframe is a bunch of pictures, text and numbers, as well as a group of links, linking each picture to www.b.com. For example, the links might be:
www.b.com/content/1
www.b.com/content/2
www.b.com/content/3
Questions:
1) When Google crawls www.a.com/some/page, does it pass link juice to www.b.com/content/*?
2) Does Google instead consider these to be internal links within b.com itself, because the links to www.b.com/content/* are actually from b.com itself, since the domain of the iframe is actually www.b.com/some/special/path?
3) Is there any amount of link juice passed from www.a.com/some/page to www.b.com/some/special/path, because this is the src= of an iframe that a.com is hosting?
Now consider an alternative setup, where instead of using an iframe, the contents of the iframe described above are added to the page dynamically using JavaScript and a call to an API endpoint at b.com, resulting in these links being added directly to the body of a.com without being wrapped in an iframe element.
Questions:
4) Do these links that were created after page load still get crawled and credited by Google? (I have heard in the past that Google was going to start crawling JavaScript; I just don't know if this is known for a fact yet.)
5) Do links created on the client side hold the same weight as a link that was served directly via the backend HTML generation?
If both the links within the iframe and the links within the JavaScript embed method pass link juice, is one preferred over the other? Is one known to be more effective than the other? Thanks!
Intermediate & Advanced SEO | A Former User
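To make the "alternative setup" concrete, here is a rough sketch of the JavaScript version; the endpoint URL, JSON shape and container id are placeholders rather than anything from the actual sites. The point is simply that the <a> elements end up directly in a.com's DOM with no iframe wrapper:

    // Runs on www.a.com/some/page instead of embedding the iframe.
    fetch('https://www.b.com/api/content')   // hypothetical API endpoint on b.com
      .then(function (response) { return response.json(); })
      .then(function (items) {
        var container = document.getElementById('partner-content');
        items.forEach(function (item) {
          var link = document.createElement('a');
          link.href = 'https://www.b.com/content/' + item.id;  // same targets as the iframe version
          link.textContent = item.title;
          container.appendChild(link);
        });
      });
-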
Link Brokers Yes or No?
We have a client who has asked us to talk to link brokers to speed up the backlinking process. Although I've been aware of them for ages, I have never openly discussed the possible use of 'buying' links or engaging in that part of the industry. Do they have a place in SEO, and if so, what are the Moz community's thoughts?
Intermediate & Advanced SEO | wearehappymedia0 -
SEO Considerations for merging two brand websites into one
Hello fellow Mozzers,
We have two websites for two similar brands at my place of employment. The two brands currently serve slightly different products, but could be held quite happily under one branded site. As part of a potential group merger into one sole brand, we will have to create one joined-up website which will then feature all our products. The newly merged site will also have more scope to allow us to expand our product range, whereas currently one brand is rather specific to a particular market due to its name. So as part of the merge, I have to consider the potential implications for our search traffic, as this is an integral part of our business.
Brand A - older, more authoritative, great content, good organic positions - top 10 for pretty much all terms we favour.
Brand B - younger, but has more marketing scope due to its name; still a good site with lots of content.
Unfortunately, Brand B has more potential lifespan but is currently the less authoritative of the two sites we run: it has lower DA and PR according to my Moz Analytics, fewer quality links and less content. In order to give the Brand B website the boost that is needed, and in effect replace Brand A (which has great organic positions) in the SERPs, I need to make sure all bases are ticked in an action plan. So far this is what I have:
- Transfer all existing Brand A web pages to the Brand B website.
- Rel canonical all Brand A pages to point to the Brand B website's new pages.
- 301 redirect all pages on Brand A to Brand B during the transfer.
- Once 301 redirects are in place, request that external sites repoint any links to the Brand B website.
- Update XML sitemaps.
- Update any content that mentions Brand A to now be Brand B.
- Resubmit sitemaps to Webmaster Tools.
- Update all social profiles.
- Update all local search profiles and listings.
- Update all review sites with the new brand name / merge any with both brands.
On a supplementary note, for customer information we are looking to keep the older Brand A home page up for a short time to help people understand the transition, rather than a complete redirect, which to our demographic could confuse and alienate people. We will also look to send a mass email to roughly 400K people informing them of the move and how it affects them. I have no doubt there will be some glaringly obvious additions; any further advice would be much appreciated. Hope you are all well. Tim
Intermediate & Advanced SEO | TimHolmes1
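On the rel canonical step in the plan above: the tag sits in the head of each Brand A page and points at the equivalent Brand B URL. A minimal sketch with placeholder domains:

    <!-- In the <head> of a Brand A page, pointing at its Brand B equivalent (placeholder URLs) -->
    <link rel="canonical" href="https://www.brand-b.example/equivalent-page/" />

Once the 301 redirects go live they take over in practice, since the old page's HTML (canonical tag included) is no longer served; the canonical mainly helps during any interim period when both versions are still reachable.
-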
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi guys,
We have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similarly to autotrader.com or cargurus.com, and there are two primary components:
1. Vehicle Listings pages: the page where the user can use various filters to narrow the vehicle listings and find the vehicle they want.
2. Vehicle Details pages: the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings pages. Example functionality: http://screencast.com/t/kArKm4tBo
The Vehicle Listings pages (#1) we do want indexed and to rank. These pages have additional content besides the vehicle listings themselves, and those results are randomized or sliced/diced in different and unique ways. They're also updated twice per day.
We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all of the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results (example Google query). We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right.
Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt advantages:
- Super easy to implement
- Conserves crawl budget for large sites
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and Vehicle Details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.
Robots.txt disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would lead to 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?).
Noindex advantages:
- Does prevent Vehicle Details pages from being indexed
- Allows ALL pages to be crawled (advantage?)
Noindex disadvantages:
- Difficult to implement (Vehicle Details pages are served using Ajax, so there is no <head> in which to place a noindex meta tag). The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex based on querystring variables, similar to this Stack Overflow solution. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages. I say "forces" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if it's blocked by robots.txt.
Hash (#) URL advantages:
- By using hash (#) URLs for links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with JavaScript, the crawler won't be able to follow/crawl these links.
- Best of both worlds: crawl budget isn't overtaxed by thousands of noindexed pages, and the internal links that were getting robots.txt-disallowed pages indexed are gone.
- Accomplishes the same thing as "nofollowing" these links, but without looking like PageRank sculpting (?)
- Does not require complex Apache stuff
Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?
Initially, we implemented robots.txt, the "sledgehammer solution." We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate Vehicle Details pages, and we wanted it to be like these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that.
If we implement noindex on these pages (and doing so is a difficult task in itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallow, in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of Vehicle Details pages, all of which are noindexed, and it could easily get stuck/lost/etc. It seems like a waste of resources, and in some shadowy way bad for SEO.
My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and it conserves crawl budget while keeping Vehicle Details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links like these.
Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
Intermediate & Advanced SEO | browndoginteractive
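For what it's worth, the hash-URL approach the developers are proposing usually looks something like the sketch below; the class name, data attribute and openVehicleDialog function are placeholders for whatever the plugin actually uses. The idea is that the href never exposes a crawlable Vehicle Details URL, and the dialog is opened entirely client-side:

    <!-- On the Vehicle Listings page: no crawlable details URL in the href -->
    <a href="#" class="vehicle-details" data-vehicle-id="12345">Contact Seller</a>

    <script>
      document.querySelectorAll('a.vehicle-details').forEach(function (link) {
        link.addEventListener('click', function (event) {
          event.preventDefault();                     // keep the browser from jumping to "#"
          openVehicleDialog(link.dataset.vehicleId);  // placeholder for the plugin's Ajax dialog loader
        });
      });
    </script>

Keep in mind that Googlebot does execute JavaScript these days, so this hides the details URLs from ordinary link discovery rather than guaranteeing they are never found.
-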
One company, two addresses. How do I handle footer NAP?
I have a client with two addresses that fall under the same brand: one address is in CA and the other is in NY. I have a single domain and will be creating separate landing pages for each location, but wanted to know how I should handle the NAP in the footer of the other pages. Should I list both NAPs, one NAP, or neither in the footer? Thanks in advance for your help.
Intermediate & Advanced SEO | DigitalWorkboots0 -
Do 404 Pages from Broken Links Still Pass Link Equity?
Hi everyone, I've searched the Q&A section, and also Google, for about the past hour and couldn't find a clear answer on this. When inbound links point to a page that no longer exists, thus producing a 404 error page, is link equity/domain authority lost? We are migrating a large eCommerce website and have hundreds of pages with little to no traffic that have legacy 301 redirects pointing to their URLs. I'm trying to decide how necessary it is to keep these redirects. I'm not concerned about the page authority of the pages with little traffic; I'm concerned about the overall domain authority of the site, since that certainly plays a role in how the site ranks overall in Google (especially for pages with no links pointing to them; a perfect example is Amazon, with thousands of pages with no external links that rank #1 in Google for their product name). Anyone have a clear answer? Thanks!
Intermediate & Advanced SEO | M_D_Golden_Peak0 -
10,000+ links from one site per URL--is this hurting us?
We manage content for a partner site, and since much of their content is similar to ours, we canonicalized their content to ours. As a result, some URLs have anything from 1,000,000 inbound links per URL to 10,000+ links per URL, all from the same domain. We've noticed a 10% decline in traffic since this showed up in our Webmasters account and were wondering if we should nofollow these links?
Intermediate & Advanced SEO | nicole.healthline0