Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Does posting an article on multiple sites hurt SEO?
-
A client of mine creates thought leadership articles and pitches them to multiple sites to reach different audiences.
The sites that pick them up are places such as AdAge and MarketingProfs, and we do get link juice from these sources most of the time.
Does having the same article on these sites as well as your own hurt your SEO efforts in any way? Could it be recognized as duplicate content?
I know the links are great; I'm just wondering if there are any other side effects, especially when no links are provided!
Thank you!
-
It depends. If the article goes on your site first, it gets indexed and receives all the credit. If someone takes it for their own use and does not link back to you, it can hurt them. If they syndicate your article and link back to the original - i.e. the first version indexed - they will not be penalized.
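One way syndication partners sometimes reinforce that signal, beyond a visible link back, is a cross-domain canonical tag pointing at the original. A minimal sketch, assuming the partner republishes the full text; the URL below is a made-up example, not the client's actual address:

<!-- In the <head> of the syndicating site's copy of the article
     (e.g. the AdAge or MarketingProfs page); the href should point at the original -->
<link rel="canonical" href="https://www.example-client-site.com/articles/original-thought-leadership-piece" />

Whether a partner is willing to add that tag is a separate negotiation; many will only agree to a visible attribution link.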
-
There is a larger issue at play here.
Submitting the same article to multiple outlets is a sure way of pissing off editors and destroying relationships. It could be seen as less than exemplary conduct. I speak as a former editor.
If your client is a thought leader, the best bet is to submit one article to one outlet. Which is not to say you can't write another article for another publication that is a variation on the theme.
I work with thought leaders in several fields. Guest blogging is a hugely effective technique. The outlets are thrilled to get a free article from a leading expert that is far more authoritative than what they usually publish.
But you must insist on a link back or there is no SEO benefit. (There may be a marketing and branding benefit.) Often the link back can be done in the author's note. Even better is getting it into the text in a natural way. And you have to be relentless in ensuring the links actually appear. Not infrequently, you have to follow up post-publication.
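To illustrate the two placements just mentioned, the link back in a guest post might look roughly like this (the names and URLs are hypothetical):

<!-- In-text link, worked naturally into the body copy -->
<p>Our <a href="https://www.example-client-site.com/research/benchmark-report">latest benchmark research</a> found that ...</p>

<!-- Link placed in the author's note at the end of the guest article -->
<p>Jane Doe is head of research at <a href="https://www.example-client-site.com/">Example Client Co.</a></p>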
My strategy is to time the guest blogging activity to coincide with the release of research or an e-book. We target 5-7 leading publications. Each gets an original and unique article that focusses on one aspect of the material. The articles on the third-party sites point back to the full version on our own site.
Just to be clear: we're not talking about cutting and pasting. We're talking about an original article, customized to the third-party site and its audience, that may go through several drafts.
It's quite a bit of work, but it pays off. Big time.
These days, I call myself a web strategist. But sometimes I also act as a content strategist. I really think this is the future of our industry, post-Panda and Penguin.
-
If the text is exactly the same in each article then yes: Google looks for large chunks of duplicate text. Usually the way around this is to rewrite the article for each site.
Related Questions
-
Our client's Magento 2 site has lots of obsolete categories. Any advice on SEO best practice for setting server-level redirects so I can delete them?
Our client's Magento website has been running for at least a decade, so it has a lot of old legacy categories for brands they no longer carry. We're looking to trim down the number of unnecessary URL redirects in Magento, so my question is: is there an SEO-efficient way to set up permanent redirects at the server level (nginx) that Google will crawl, allowing us at some point to delete the categories and the Magento URL redirects? If this is good practice, can you then at some point delete the server redirects once Google has marked them as permanent?
Technical SEO | Breemcc
-
What is SEO best practice to implement a site logo as an SVG?
Since it is possible to implement a description for SVGs, it seems that could be used for the site name: <desc>sitename</desc>
There is also a title tag for SVGs: <title>sitename</title>. I've read in a thread from 2015 that it sometimes gets confused with the title tag in the header (at least by the Moz crawler), which might cause trouble. What is the state of the art here? Any experiences and/or case studies with using either method?
However, to me it seems that either way, best practice in terms of search engines being able to crawl is to load the SVG as an image and implement a proper alt tag. What is your opinion about this? Thanks in advance.
Technical SEO | twisme
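For reference, the two approaches being weighed might look roughly like this; the site name, dimensions, and file path are placeholders:

<!-- Option 1: inline SVG carrying its own title and desc elements -->
<svg role="img" viewBox="0 0 200 60" xmlns="http://www.w3.org/2000/svg">
  <title>Sitename</title>
  <desc>Sitename logo</desc>
  <!-- ...logo paths... -->
</svg>

<!-- Option 2: reference the SVG file like any other image and rely on alt text -->
<img src="/assets/logo.svg" alt="Sitename" width="200" height="60" />
-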
Google displays multiple titles for the same article. What does this mean?
I've linked to some screenshots so that what I'm talking about makes more sense. Sometimes, when I perform a search, I see an article with the correct article title listed as the page title in the SERPs. Other times, I see the wrong page title - it's a generic somethin' or other done by my client's web design company with a bunch of keywords thrown in. The latter (not the correct article title) also appears at the top of the browser tab for every article on my client's site. I know this is bad, but what can be done about it? This would never happen if my client used WordPress or some easily modifiable CMS, but they're using a proprietary one maintained by the group that designed the website.
Technical SEO | Greenery
-
Best website structure / SEO strategy for an online travel agency?
Dear experts! I need your help pointing me in the right direction. So far I have found scattered tips around the internet, but it's hard to build a full picture from all these bits and pieces of information without professional advice. My primary goal is to understand how I should build my online travel agency website's (https://qualistay.com) structure, so that I target my keywords on the correct pages and do not create duplicate content.
In my particular case I have very similar properties in similar locations in Tenerife. Many of them are located in the same villa or apartment complex, so it is very hard to come up with a unique description for each of them, not to mention the amenities and pricing blocks, which are standard and almost identical (I don't know if Google sees this as duplicate content). From what I have read so far, it's better to target archive pages rather than every single property. At the moment my archive pages are: all properties (includes all property types and locations), and a page for each location (includes all property types).
Does it make sense to add archive pages by property type in addition to, or instead of, the location ones if I, for instance, target separate keywords like 'villas costa adeje' and 'apartments costa adeje'? At the moment, the title of the respective archive page, "Properties to rent in costa adeje: villas, apartments", in principle targets both keywords... Does using the same keyword in a single property listing cannibalize the ranking of the archive page it links back to? Or not, unless Google specifically identifies this as duplicate content (which one can see in Google Search Console under HTML Improvements) and/or the archive page has more incoming links than a single property?
If targeting only archive pages, how should I optimize them in such a way that they stay user-friendly? I have created (though not yet fully optimized) descriptions for each archive page just below the main header, but I have them partially hidden (collapsible) using JS in order to keep visitors' focus on the properties. I know that Google does not rank hidden content highly, at least at the moment, but since there is a new mobile-first algorithm coming up in the near future, they promise not to punish mobile sites for collapsible content and will use the mobile version to rate the desktop one. Does this mean I should not worry about hidden content anymore, or should I move the description to the bottom of the page and make it fully visible? Your feedback will be highly appreciated! Thank you! Dmitry
Technical SEO | qualistay
-
How should I structure a site with multiple addresses to optimize for local search?
Here's the setup: we have a website, www.laptopmd.com, and we're ranking quite well in our geographic target area. The site is chock-full of local keywords and has the address properly marked up (HTML5 and schema.org compliant, near the top of the page, etc.). It's all working quite well, but we're looking to expand to two more locations, and we're terrified that adding more addresses and playing with our current setup will wreak havoc with our local search results, which we quite frankly currently rock. My questions are: 1) When it comes time to do sub-pages for the new locations, should we strip the location information from the main site and put up local pages for each location in subfolders? 1a) Should we use subdomains instead of subfolders to keep Google from becoming confused? 2) Should we consider simply starting identically branded pages for the individual locations and hope that exact-match location-based URLs will make up for the duplicate-content hit and overcome the difficulty of building a brand from multiple pages? I've tried to look for examples of businesses that have tried to do what we're doing, but all the advice has been about organic search, which I already have the answer to. I haven't been able to find a good example of a small business with multiple locations AND good rankings for each location. Should this serve as a warning to me?
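As a rough illustration of the kind of address markup described above, a per-location page often carries schema.org LocalBusiness data along these lines; everything below is placeholder data, not LaptopMD's actual markup:

<!-- Hypothetical markup for one location page; each location would get its own block -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Repair Co - Midtown",
  "url": "https://www.example.com/locations/midtown",
  "telephone": "+1-212-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Ave",
    "addressLocality": "New York",
    "addressRegion": "NY",
    "postalCode": "10001",
    "addressCountry": "US"
  }
}
</script>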
Technical SEO | LMDNYC
-
Bing rank drop-off for multiple sites
Hi Mozzers, I'm seeing some wacky stuff going on on some sites I manage. On more than a few, the rankings on Bing have dropped basically overnight from page-one spots to not being found in the first 100 positions. Is anyone else seeing similar results? Some of the sites are fairly new, some have been around for ages; some are WordPress, some are not. I've been searching for news of a big change at Bing, but keep reading about Bing dropping thin sites during Black Friday. In one example, I had the site set up in Bing Webmaster Tools for a while, and had a look at the data. The reports show that the pages are crawled, the index summary shows pages indexed, and there seem to be no crawl errors, but the rankings are absolutely gone. Also, I can't see the sites in Bing if I search "site:example.com". Here are two examples. The first would make sense, since it's pretty thin as I haven't added much content yet: http://homewindowtint.org but this one doesn't make sense to me. Sure, there are a few errors, but to be dropped like a rock seems weird: http://www.ahmedandsukaram.com
Technical SEO | rosstaylor
-
Multiple domains, same IP address, redirecting to preferred domain (301) - site is still indexed under wrong domains
Due to acquisitions over time and the merging of many microsites into one major site, we currently have 20+ TLDs pointing to the same IP address as our preferred domain for our consolidated website http://goo.gl/gH33w. They are all set up as 301 redirects in Apache, including both the www and non-www versions.
When we launched this consolidated website (April 2010), we accidentally left the settings of our site open to accept any of our domains on the same IP. This was later fixed, but unfortunately Google indexed our site under several of these URLs (ignoring the redirects), using the same content from our main website but swapping out the domain. We added some additional redirects in Apache to send the individual pages indexed under the wrong domain to the same page under our main domain http://goo.gl/gH33w. This seemed to help resolve the issue and moved hundreds of pages off the index.
However, in December of 2010 we made significant changes in our external DNS for our IP addresses, and since then we have seen pages indexed under these redirecting domains on the rise again. If you do a search query of site:laboratoryid.com you will see a few hundred examples of pages indexed under the wrong domain. When you click on a link, it does redirect to the same page under the preferred domain. So the redirect is working and has been confirmed as a 301. But for some reason Google continues to crawl our site and index it under these incorrect domains. Why is this? Is there a setting we are missing? These domain-level and page-level redirects should be decreasing the number of pages indexed under the wrong domain, but it appears to be doing the reverse. All of these old domains currently point to our production IP address, where our preferred domain is also pointing. Could this be the issue? None of the pages indexed today are from the old versions of these sites; they only seem to be the new content from the new site, but not under the preferred domain. Any insight would be much appreciated because we have tried many things without success to get this resolved.
Technical SEO | sboelter
-
First click on an SEO listing redirecting to a competitor site?
I just experienced something VERY odd and wondered if any of you had an idea of what it might be. When I did a search on Google and clicked the top SEO listing, I was taken to a competitor of the number-one listed site, i.e. NOT the site I clicked on. When I clicked the back button and clicked it again, I was taken to the correct site. This happened with two different searches, and I was taken to two different sites. Could this be a clever/sinister cookie implemented by the competitor, a site I frequent regularly? Could this be malware implemented by an affiliate? Could this be a Google glitch?
Technical SEO | Red_Mud_Rookie