Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Do Backlinks to a PDF help with overall authority/link juice for the rest of the domain?
We are working on a website that offers some high-quality industry articles. For each article, there is an abstract page with a link to the PDF, which is hosted on the same domain. We have found in Analytics that a lot of sites link directly to the PDF rather than to the page with the article's abstract. Can we get any benefit from a direct PDF link, or do we need to modify our strategy?
You can (and should) build links from within the domain to the PDF report you want to share. Links within the PDF file will also help when people share the file via email rather than linking to it on the site; those links bring people back to your domain.
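For example, the abstract page might link to the hosted PDF with a plain in-content link. A minimal sketch; the path and anchor text are placeholders:

  <!-- Hypothetical abstract page on the same domain, linking to the hosted PDF -->
  <a href="/articles/industry-report.pdf">Download the full article (PDF)</a>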
Also have a Google for optimising PDFs for search; I know there are a number of ways and options: http://www.seomoz.org/ugc/how-to-optimize-pdf-documents-for-search and http://searchengineland.com/eleven-tips-for-optimizing-pdfs-for-search-engines-12156
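One related option is to serve a rel="canonical" HTTP header on the PDF that points at the abstract page, so links earned directly by the PDF are consolidated onto the HTML page. A minimal sketch, assuming an Apache host with mod_headers enabled; the filename and URL are placeholders:

  <Files "industry-report.pdf">
      # Hypothetical: tell search engines the abstract page is the canonical version
      Header add Link "<https://www.example.com/articles/industry-report>; rel=\"canonical\""
  </Files>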
Add links to your site within the .pdf. You can also add your branding to the .pdf.
Related Questions
- How can I stop a tracking link from being indexed while still passing link equity?
  I have a marketing campaign landing page and it uses a tracking URL to track clicks. The tracking links look something like this: http://this-is-the-origin-url.com/clkn/http/destination-url.com/ The problem is that Google is indexing these links as pages in the SERPs. Of course, when they get indexed and then clicked, they show a 400 error because the /clkn/ link doesn't represent an actual page with content on it. The tracking link is set up to instantly 301 redirect to http://destination-url.com. Right now my dev team has blocked these links from crawlers by adding Disallow: /clkn/ in the robots.txt file; however, this blocks the flow of link equity to the destination page. How can I stop these links from being indexed without blocking the flow of link equity to the destination URL? (Technical SEO | UnbounceVan)
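  The robots.txt rule described in this question looks roughly like the snippet below; because the blocked URLs can't be crawled, Google never sees the 301 that would pass equity on:

    # Rule described in the question: keeps crawlers out of /clkn/,
    # but a blocked URL's 301 can't be seen or followed, so no equity flows through it
    User-agent: *
    Disallow: /clkn/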
- Best practices for controlling link juice with site structure
  I'm trying to do my best to control the link juice from my home page to the most important category landing pages on my client's e-commerce site. I have a couple of questions regarding how NOT to pass link juice to insignificant pages and how best to pass juice to my most important pages. INSIGNIFICANT PAGES: How do you tag links so they don't pass juice to unimportant pages? For example, my client has a "Contact" page off of their home page. We aren't trying to drive traffic to the contact page, so I'm worried about the link juice from the home page being passed to it. Would you tag the Contact link with a "nofollow" tag so it doesn't pass the juice, but then include it in a sitemap so it gets indexed? Are there best practices for this sort of stuff? (Technical SEO | Santaur)
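  For reference, the "nofollow" tag mentioned here is the rel attribute on the link itself, along these lines (a generic sketch; the path is a placeholder):

    <!-- The nofollow hint described in the question, applied to the Contact link -->
    <a href="/contact" rel="nofollow">Contact</a>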
- Transfer a Main Domain to a Sub-Domain
  My IT department tells me they want to transfer my main site domain, which has been in existence since 1999 as an e-commerce site (maindomain.com), to a sub-domain (www2.maindomain.com) or a completely new domain (newdomain.net). This is because we are launching a new website and B2C e-commerce engine, but we still have to maintain the legacy B2B e-commerce engine, which contains hard-coded URLs, and both systems can't use the same domain. I've been researching the issue across SEOmoz, but I haven't come across this exact type of scenario (mostly I've seen a sub-domain to a new domain). I see major problems with their proposal, including negative SEO impact, loss of domain authority/ranking, and issues with branding. Does anyone know the exact type of impact I can expect to see in this scenario and the specific steps I should take to minimize the impact? Btw, I will be using Danny Dover's guide on properly moving domains where appropriate. Thanks! (Technical SEO | AscendLearning)
- HELP: Wrong domain showing up in Google Search
  So I have this domain (1) devicelock.com, and I also had this other domain (2) ntutility.com; the second domain is old and is not in use anymore. But when I search for devicelock on Google, the homepage devicelock.com does not show up; only ntutility.com comes up. I asked one of the developers how the redirect is happening from the old domain to the new one, and he told me it's through a DNS forward, and there is no way to use an .htaccess file to set up a 301 instead. Please help! (Technical SEO | Devicelock)
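  For reference, the .htaccess 301 the question says isn't possible with a DNS forward would look roughly like this. A hypothetical sketch only, assuming Apache with mod_rewrite on the old ntutility.com host; the scheme and www host are assumptions:

    RewriteEngine On
    # Hypothetical: permanently redirect every ntutility.com URL to devicelock.com
    RewriteCond %{HTTP_HOST} ^(www\.)?ntutility\.com$ [NC]
    RewriteRule ^(.*)$ http://www.devicelock.com/$1 [R=301,L]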
- How to increase your Domain Authority
  Hi guys, can someone please provide some pointers on how to best increase your Domain Authority? Thanks, Gareth (Technical SEO | GAZ09)
- Domain authority and keyword difficulty
  I know there are too many variables for a certain answer; however, do people take their domain authority into account when using the keyword difficulty tool? I have a new domain which only has a score of seven at the moment. When using the keyword tool, what is the maximum difficulty level of keywords people would target initially? Obviously I would seek to increase the difficulty of the words over time, but to start off it's a hard choice between keywords which can be ranked for in a reasonable period of time and keywords which are getting enough traffic to make the effort worthwhile. (Technical SEO | Grumpy_Carl)
- Does Google pass link juice a page receives if the URL parameter specifies content and has the Crawl setting in Webmaster Tools set to NO?
  The page in question receives a lot of quality traffic but is only relevant to a small percentage of my users. I want to keep the link juice received from this page, but I do not want it to appear in the SERPs. (Technical SEO | surveygizmo)
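  One commonly cited way to keep a page out of the SERPs while still letting its links be followed, separate from the Webmaster Tools parameter setting, is a robots meta tag, roughly:

    <!-- Generic sketch: the page is not indexed, but its links can still be crawled -->
    <meta name="robots" content="noindex, follow">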
- What is the best method to block a sub-domain, e.g. staging.domain.com/, from getting indexed?
  Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:
    User-agent: *
    Disallow: /
  for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work. (Technical SEO | fthead9)
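  For context, robots.txt is fetched and applied per hostname, so a file served only at staging.domain.com along these lines would not affect www.domain.com (a sketch only):

    # staging.domain.com/robots.txt (served only from the staging host)
    # robots.txt applies per hostname, so this does not block www.domain.com
    User-agent: *
    Disallow: /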