Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
What is the best way to embed PDF documents for SEO?
-
I have been using SCRIBD to embed PDF documents on my site but until recently I did not include the link back to SCRIBD. Will my site get credit for this content or will it go to SCRIBD? Is there a better way to embed PDF documents for SEO?
-
They are not at the moment - the blog is hosted on WordPress, so the link goes back to that site. Should I add them to my site as HTML or as PDFs?
-
Aha, so the content was already posted on your site. Then I wouldn't worry too much about the SCRIBD embedding. Are the 60 pages of blog posts also available on one page on your site? If not, I'd make a page with all the content on it and link to the PDF file near the top of the page as a good alternative.
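Something along those lines might look like this in the page markup (a hypothetical sketch - the file name, path, and headings are made up, not taken from the poster's site):

    <h1>The Complete Widget Guide (all 60 posts)</h1>
    <!-- Link to the self-hosted PDF near the top, as an alternative to reading on the page -->
    <p><a href="/downloads/widget-guide.pdf">Download this guide as a single PDF</a></p>
    <article>
      <h2>Post 1: Getting started</h2>
      <p>Full post text here as normal, crawlable HTML...</p>
      <!-- ...remaining posts follow as regular HTML content... -->
    </article>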
-
I have embedded them for ease of use for visitors to my site. For example, I have 60 pages of blog posts in one embed that people can page through easily to read. Is there a better way to do this?
-
Is there a particular reason why you embed PDF documents?
To make sure your website gets credit for that content, either post it on your site as content or as a PDF document on your domain.
If you want to rank for a particular keyword, you're always better off having your web pages rank instead of the PDF file. It's a lot more user-friendly, and people can continue browsing your site if they land on your web page instead of on your PDF file.
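For illustration, the difference between the two setups might look roughly like this in the page source (a hypothetical sketch - the Scribd document ID and file paths are placeholders, and the exact embed code Scribd generates may differ):

    <!-- Third-party embed: the document itself lives on scribd.com, so that domain hosts the content -->
    <iframe src="https://www.scribd.com/embeds/XXXXXXX/content" width="100%" height="600" frameborder="0"></iframe>

    <!-- Self-hosted alternative: the PDF sits on your own domain, so links and rankings accrue to your site -->
    <object data="/docs/guide.pdf" type="application/pdf" width="100%" height="600">
      <p>Can't display the PDF inline? <a href="/docs/guide.pdf">Download the guide (PDF)</a>.</p>
    </object>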
Related Questions
-
SEO Best Practices regarding Robots.txt disallow
I cannot find hard and fast direction about the following issue: it looks like the robots.txt file on my server has been set up to disallow "account" and "search" pages within my site, so I am receiving warnings from Google Search Console that URLs are being blocked by robots.txt (Disallow: /Account/ and Disallow: /?search=). Do you recommend unblocking these URLs? I'm getting a warning that over 18,000 URLs are blocked by robots.txt ("Sitemap contains urls which are blocked by robots.txt"). It seems that I wouldn't want that many URLs blocked? Thank you!!
Intermediate & Advanced SEO | jamiegriz0
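For reference, the directives quoted in the question would sit in the site's robots.txt roughly like this (a sketch built only from the rules mentioned above):

    User-agent: *
    Disallow: /Account/
    Disallow: /?search=
-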
Best way to "Prune" bad content from large sites?
I am in the process of pruning my sites for low-quality/thin content. The issue is that I have multiple sites with 40k+ pages and need a more efficient way of finding the low-quality content than looking at each page individually. Is there an ideal way to find the pages that are worth noindexing that will speed up the process but not potentially harm any valuable pages? The current plan of action is to pull data from analytics, and if a URL hasn't brought any traffic in the last 12 months then it is safe to assume it is a page that is not beneficial to the site. My concern is that some of these pages might have links pointing to them, and I want to make sure we don't lose that link juice. But, assuming we just noindex the pages, we should still have the authority pass along... and in theory, the pages that haven't brought any traffic to the site in a year probably don't have much authority to begin with. Any recommendations on the best way to prune content efficiently on sites with hundreds of thousands of pages? Also, is there a benefit to noindexing the pages vs deleting them? What is the preferred method, and why?
Intermediate & Advanced SEO | atomiconline0
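For readers following along: "noindexing" a page as discussed above typically means serving a robots meta tag in the page's head (a generic sketch, not specific to the poster's sites). The page stays live for visitors and for any inbound links, but search engines are asked to drop it from the index:

    <meta name="robots" content="noindex, follow">
-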
SEO Advice for Angular JS
We are changing our homepage (and gradually the rest of the site) to Angular JS. In order not to lose anything in terms of SEO, we are implementing hashbangs + escaped-fragment snapshots. Are there any other SEO considerations you think we should have and/or additional elements that we could add to the page to improve it in terms of SEO?
Intermediate & Advanced SEO | theLotter0
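For context, the hashbang + escaped-fragment setup mentioned in the question refers to Google's old AJAX crawling scheme: #! URLs were requested by the crawler as ?_escaped_fragment_= URLs, and pages without a hashbang could opt in to having an HTML snapshot fetched by adding a meta tag like the following (a sketch of that convention, not of the poster's implementation):

    <meta name="fragment" content="!">
-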
Is a .ME domain effective for SEO?
I always hear about TLDs like .com, .org, and .net, but what about the .me domain? Can it be effective for SEO? Will I be able to beat my competitors if I choose .me? I also have the option of a .com or another TLD, but since I am building a site around my own name, .me suits me; I just need your suggestions from an SEO perspective. Does the domain extension really affect SEO?
Intermediate & Advanced SEO | pnb5670
-
Is CloudFlare bad for SEO?
I have been hit by DDoS attacks lately...not on a huge scale, but probably done by some "script kiddies" or competitors of mine. Still, I need to take some action in order to protect my server and my site against all of this spam traffic that is being sent to it. In the process of researching the tools available for defending a website from a DDoS attack, I came across the service offered by CloudFlare.com. According to the CloudFlare website, they protect your site against a DDoS attack by showing users/visitors they find suspicious an interstitial that asks them if they are a real user or a bot...this interstitial contains a Captcha that suspicious users are asked to enter in order to visit the site. I'm just wondering what kind of an effect such an interstitial could have on my Google rankings...I can imagine that such a thing could add to increased click-backs to the SERPs and, if Google detects this, to lower rankings. Has anyone had experience with the DDoS protection services offered by CloudFlare, who can say a word or two regarding any effects this may have on SEO? Thanks
Intermediate & Advanced SEO | masterfish1
-
Best way to permanently remove URLs from the Google index?
We have several subdomains we use for testing applications. Even if we block them with robots.txt, these subdomains still appear to get indexed (though they show as blocked by robots.txt). I've claimed these subdomains and requested permanent removal, but it appears that after a certain time period (6 months?) Google will re-index them (and mark them as blocked by robots.txt). What is the best way to permanently remove these from the index? We can't use a login to block access because our clients want to be able to view these applications without needing to log in. What is the next best solution?
Intermediate & Advanced SEO | nicole.healthline0
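One commonly cited pattern for cases like this (a sketch of the general technique, not advice from this thread): allow the test subdomains to be crawled again in robots.txt, but return a noindex signal on every response - for example via an X-Robots-Tag HTTP header - so the pages can be recrawled and then dropped from the index:

    HTTP/1.1 200 OK
    X-Robots-Tag: noindex
-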
Is DOCTYPE important for SEO?
Hello fellow Mozzers. I am just having a brief look at a potential client's website before speaking to them tomorrow, and whilst looking at the source I noticed that they don't appear to have a clear definition for their doctype. All they have at the top of each page is I have to admit that doctypes aren't my strong point, but I know that they are normally slightly more descriptive than this. Can this have any effect on rankings? Or is this just an issue for W3C validation? Thanks 🙂
Intermediate & Advanced SEO | AdeLewis
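For reference, a doctype is simply the first line of an HTML document; the HTML5 form is just:

    <!DOCTYPE html>

Its direct effect is on browser rendering mode (standards vs. quirks mode) and on W3C validation.
-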
Does font size affect SEO?
In the eyes of Google, would the font size of, say, a news article affect SEO? For example, a slightly larger font being easier to read for those with bad eyes? Accessibility? If so, what size would be ideal: 10, 12, 14? Your thoughts and suggestions are greatly appreciated.
Intermediate & Advanced SEO | Peter2640