Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Duplicate Content - Bulk analysis tool?
-
Hi
I wondered if there's a tool to analyse duplicate content - within your own site or on external sites - where you can upload the URLs you want to check in bulk?
I used Copyscape a while ago, but I don't remember it having a bulk feature.
Thank you!
-
Great, thank you!
I'll give both a go!
-
Great, thanks!
Yes, I use Screaming Frog for this, but I wanted to look at the actual page content - partly to see if other sites copy our content, but also to see whether we need to update our own product content, as some products are very similar.
I'll check the batch process on Copyscape, thanks!
-
I have not used this tool in this way, but I have used it for other crawler projects related to content clean-up, and it is rock solid. They have been very responsive to my questions about using the software: http://urlprofiler.com/
Duplicate content search is the next project on my list; here is how they do it:
http://urlprofiler.com/blog/duplicate-content-checker/
You let URL Profiler crawl the section of your site that is most likely to be copied (say, your blog), and you tell URL Profiler which section of your HTML to compare (i.e. the content section rather than the header or footer). URL Profiler then uses proxies (which you have to buy) to run Google searches on sentences from your content. It crawls those results to see whether any site in the Google SERPs has sentences from your content word for word (or pretty close).
I have played with Copyscape, but my markets are too niche for it to work for me. The logic from URL Profiler is that you are searching the database that matters most: Google.
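To make the sentence-matching step concrete: this is not URL Profiler's actual code, just a minimal Python sketch of the comparison you would run once you have your own page text and a candidate page's text in hand. It skips the proxy/Google-search part entirely, and the sample strings are hypothetical stand-ins for whatever your crawl extracts.

```python
import re

def sentences(text):
    """Split text into rough sentences, lowercased and whitespace-normalised."""
    parts = re.split(r"[.!?]+", text)
    # Ignore very short fragments, which tend to match other sites by coincidence.
    return {" ".join(p.lower().split()) for p in parts if len(p.split()) >= 5}

def overlap_ratio(original, candidate):
    """Fraction of the original page's sentences that appear verbatim in the candidate."""
    ours, theirs = sentences(original), sentences(candidate)
    return len(ours & theirs) / len(ours) if ours else 0.0

# Hypothetical inputs: in practice these would be the extracted content
# sections of your own page and of a page found in the search results.
my_page_text = "Our widgets are finished by hand in small batches. Every order ships within two business days."
suspect_page_text = "Every order ships within two business days. We also offer free returns."
print(f"{overlap_ratio(my_page_text, suspect_page_text):.0%} of our sentences appear word for word")
```

Anything beyond a few per cent of exact sentence matches is usually worth a manual look.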
Good luck!
-
I believe you might be able to use List Mode in Screaming Frog to accomplish this; however, it ultimately depends on what your goal is in checking for duplicate content. Do you simply want to find duplicate titles or duplicate descriptions, or do you want to find pages with text similar enough to warrant concern?
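If it is just duplicate titles and descriptions across a known list of URLs, a short script can do the same job as List Mode. A rough Python sketch - the URLs are placeholders, and it assumes the requests and beautifulsoup4 packages are installed:

```python
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

# Placeholder list: paste in the same URLs you would feed Screaming Frog's List Mode.
urls = [
    "https://example.com/page-a",
    "https://example.com/page-b",
    "https://example.com/page-c",
]

titles = defaultdict(list)
descriptions = defaultdict(list)

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = (meta.get("content") or "").strip() if meta else ""
    titles[title].append(url)
    descriptions[description].append(url)

# Any non-empty title or description shared by more than one URL is a duplicate.
for label, groups in (("title", titles), ("description", descriptions)):
    for text, pages in groups.items():
        if text and len(pages) > 1:
            print(f"Duplicate {label}: {text!r} on {pages}")
```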
== Oops! ==
It didn't occur to me that you were more interested in duplicate content caused by other sites copying your content than in duplicate content among your own list of URLs.
Copyscape does have a "Batch Process" tool, but it is only available to paid subscribers. It does work quite nicely, though.
Related Questions
-
Canonical: Same content but different countries
I'm building a website that has content made for specific countries. The URL format is: MyWebsite.com/<country name>/<specific url>. Some of the pages for a given <specific url> are the same across different countries - the <specific url> is identical and the only difference is the <country name>. How do I deal with canonical issues to avoid Google thinking I'm presenting the same content?
On-Page Optimization | newbyguy0
-
How to fix duplicate content for homepage and index.html
Hello, I know this probably gets asked quite a lot, but I haven't found a recent post about it in 2018 on Moz Q&A, so I thought I would check in and see what the best route/solution for this issue might be. I'm always really worried about making any (potentially bad/wrong) changes to the site, as it's my livelihood, so I'm hoping someone can point me in the right direction.
Moz, SEMrush and several other SEO tools are all reporting that I have duplicate content for my homepage and index.html (the identical page). According to Moz, my homepage (without index.html) has PA 29 and index.html has PA 15. Both are showing status 200. I read that you can either do a 301 redirect or add rel=canonical. I currently have a 301 set up from http to https and don't have any rel=canonical added to the site/pages.
What is the best and safest way to get rid of the duplicate content and merge my non-index and index.html homepages these days? I read that both 301s and canonicals pass on link juice, but I don't know what the best route for me is given what I said above. Thank you for reading - any input is greatly appreciated!
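A quick way to see how the server currently treats the two URLs is to request both and look at the status codes and any canonical tag. A minimal Python sketch, assuming the requests package and with a placeholder domain:

```python
import requests

# Placeholder domain: substitute your own homepage.
root = "https://www.example.com/"

for url in (root, root + "index.html"):
    # Don't follow redirects, so we can see whether index.html already 301s to the root.
    response = requests.get(url, allow_redirects=False, timeout=10)
    print(url, response.status_code, response.headers.get("Location", "-"))
    # Very rough canonical check: just look for the attribute pair in the HTML.
    if response.status_code == 200 and 'rel="canonical"' in response.text:
        print("  declares a rel=canonical tag")
```

If index.html comes back 200 with no canonical pointing at the root, either a 301 from index.html to / or a canonical tag on it would consolidate the two versions.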
On-Page Optimization | dreservices0
-
How do I fix my portfolio causing duplicate content issues?
Hi, I'm new to this whole duplicate content issue. I have a website, fatcatpaperie.com, and I use the portfolio feature in WordPress as the gallery for all my wedding invitations. I have a ton of duplicate content issues from this and I don't understand at all how to fix it. I'd appreciate any help!
Below is an example of one duplicate content issue. The pages have slightly different names, different URLs, different images, and all have no text, but they are coming up as duplicates. Would it be as easy as putting a different meta description on each? Thanks for the help! Rena
"Treasure" by Designers Fine Press - Fat Cat Paperie | http://fatcatpaperie.com/portfolio-item/treasure-designers-fine-press | 1 0 0 0 200 | 3 duplicates
"Perennial" by Designers Fine Press - Fat Cat Paperie | http://fatcatpaperie.com/portfolio-item/perennial-by-designers-fine-press | 1 0 0 0 200 | 1 of 3 duplicates
"Primrose" by Designers Fine Press - Fat Cat Paperie | http://fatcatpaperie.com/portfolio-item/8675 | 1 0 0 0 200 | 2 of 3 duplicates
"Catalina" by Designers Fine Press - Fat Cat Paperie | http://fatcatpaperie.com/portfolio-item/catalina-designers-fine-press
On-Page Optimization | HonestSEOStudio0
-
Duplicate Content with ?Page IDs in WordPress
Hi there, I'm trying to figure out the best way to solve a duplicate content problem I have due to the Page IDs that WordPress automatically assigns to pages. I know that in order to resolve this I have to use canonical URLs, but the problem is that I can't figure out the URL structure. Moz is showing me thousands of duplicate content errors that are mostly related to Page IDs. For example, this is how a page's URL should look on my site. Moz is telling me there are 50 duplicate content errors for this page. The page ID for this page is 82, so the duplicate content errors appear as follows, and so on, for 47 more pages. The problem repeats itself with other pages as well. My permalinks are set to "Post Name", so I know that's not the issue. What can I do to resolve this? How can I use canonical URLs to solve this problem? Any help will be greatly appreciated.
On-Page Optimization | SpaMedica0
-
Multilingual site with untranslated content
We are developing a site that will have several languages. There will be several thousand pages, and the default language will be English. Several sections of the site will not be translated at first, so the main content will be in English but the navigation/boilerplate will be translated. We have hreflang alternate tags set up on each individual page pointing to each of the other languages, e.g. in the English version we have: etc. In the Spanish version, we would point to the French version and the English version, etc. My question is: is this sufficient to avoid a duplicate content penalty from Google for the untranslated pages? I am aware that from a user perspective having untranslated content is bad, but in this case it is unavoidable at first.
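For what it's worth, hreflang annotations are only honoured when the return tags are reciprocal, so it is worth spot-checking that each alternate also points back. A rough Python sketch of such a check - the URL is a placeholder, and it assumes the requests and beautifulsoup4 packages:

```python
import requests
from bs4 import BeautifulSoup

def hreflang_map(url):
    """Return {hreflang value: href} for the alternate <link> tags on a page."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    return {
        link.get("hreflang"): link.get("href")
        for link in soup.find_all("link", rel="alternate")
        if link.get("hreflang") and link.get("href")
    }

# Placeholder URL: substitute one of your own English pages.
english_url = "https://example.com/en/some-page"
alternates = hreflang_map(english_url)

# Check that every alternate page also lists the English page as an alternate.
for lang, href in alternates.items():
    if href == english_url:
        continue
    back = hreflang_map(href)
    print(lang, href, "ok" if english_url in back.values() else "missing return tag")
```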
On-Page Optimization | jorgeapartime0
-
Does schema.org assist with duplicate content concerns
The issue of duplicate content has been well documented, and there are lots of articles suggesting you noindex archive pages on WordPress-powered sites. Schema.org allows us to mark up our content, including marking a component's URL. So my question, simply, is: is noindexing archive (category/tag) pages still relevant when considering duplicate content? These pages are in essence a list of articles, which can be marked up as an article or blog posting, with the URL of the main article and all the other cool stuff the schema gives us. Surely Google et al. are smart enough to recognise these article listings as gateways to the main content, thereby removing duplicate content concerns. Of course, whether or not doing this is a good idea will be subjective and based on individual circumstances - I'm just interested in whether or not the search engines can handle this appropriately.
On-Page Optimization | MarkCA0
-
Website Speed Testing Tools
I have been experimenting with a number of plugins to speed up the loading of my WordPress site, and I find that website speed testing tools are all over the map when it comes to results. Even the same tool can produce vastly different results. Does anyone have experience with one that is both dependable and consistent?
On-Page Optimization | casper4340
-
Duplicate meta descriptions
Hi all, I'm using Yoast's SEO plugin, and when I run an On-Page Report Card here on SEOmoz it says there are 2 description tags. I've been trying to fix this but can't (I'm new!). Does anyone have any ideas? Thanks, Elaine
On-Page Optimization | elaineryan0