Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Should I disable the indexing of tags in WordPress?
-
Hi,
I have a client that is publishing 7 or 8 news articles and posts each month. I am optimising selected posts and I have found that they have been adding a lot of tags (almost like using hashtags).
There are currently 29 posts but already 55 tags, each of which has its own archive page, and all of which are added to the site map to be indexed (https://sykeshome.europe.sykes.com/sitemap_index.xml).
I came across an article (https://crunchify.com/better-dont-use-wordpress-tags/) that suggested that tags add no value to SEO ranking, and that as a consequence WordPress tags should not be indexed or included in the sitemap.
I haven't been able to find much reliable information on this topic, so my question is: should I remove the tags from this website and make the focus pages, posts, and categories (redirecting existing tag pages to the site home page)?
It is a relatively new website and I am conscious that category and tag archive pages already substantially outnumber actual content pages (posts and news). I guess this isn't optimal.
I'd appreciate any advice.
Thanks
-
Yes, it would be best to turn the tags option off; it worked well for performance, for example on Shillong Teer Club chart.
-
Disabling the indexing of tags in WordPress can be beneficial for SEO purposes, as it prevents search engines from indexing individual tag pages, which may otherwise lead to duplicate content issues. However, whether to disable tag indexing depends on your specific website goals and content structure. If you use tags sparingly and they add value to your site's organization, leaving them indexed may be beneficial. Evaluate your SEO strategy and content structure to determine the best approach for your WordPress site.
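To put numbers on the archive-vs-content imbalance the original question describes (29 posts but 55 tag archives), you can classify the URLs in the site's sitemap by path. A minimal sketch: the inline XML stands in for a real sitemap download, and the `/tag/` and `/category/` slugs assume WordPress's default permalink bases, so adjust both to the actual site.

```python
# Rough sketch: count content vs. tag vs. category URLs in a WordPress
# sitemap. SITEMAP_XML is an inline stand-in for a fetched sitemap file.
from urllib.parse import urlparse
import xml.etree.ElementTree as ET

SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/my-first-post/</loc></url>
  <url><loc>https://example.com/tag/news/</loc></url>
  <url><loc>https://example.com/tag/updates/</loc></url>
  <url><loc>https://example.com/category/press/</loc></url>
</urlset>"""

def classify(loc):
    # Assumes WordPress's default /tag/ and /category/ permalink bases.
    path = urlparse(loc).path
    if path.startswith("/tag/"):
        return "tag"
    if path.startswith("/category/"):
        return "category"
    return "content"

def count_url_types(xml_text):
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    counts = {"content": 0, "tag": 0, "category": 0}
    for loc in ET.fromstring(xml_text).findall(".//sm:loc", ns):
        counts[classify(loc.text)] += 1
    return counts

print(count_url_types(SITEMAP_XML))  # {'content': 1, 'tag': 2, 'category': 1}
```

If the tag count dwarfs the content count, as in the question above, that's a reasonable signal the tags are being used like hashtags rather than as navigation.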
-
I'm having the same problem right now. My site is fairly new, but it was already receiving some organic traffic. Then, all of a sudden, traffic sank like a brick and I can't find the answer why.
The only thing that changed was adding tags to blog posts, which I think might be creating duplicate content, so I'm proceeding to disable those. I will leave categories alive for the moment because they were bringing traffic, but if nothing changes after that I will de-index them as well. The site in question is sluthpass. I hope I can recover traffic after disabling those annoying tags.
-
Hello experts, I have disabled the tags, but they are still showing in Google. What should I do now? You can check redeemcodecenter.com. Thanks in advance.
-
If you have a large number of tags that don't add clear value to your site's content, disabling tag indexing in WordPress can be beneficial for search engine optimization. However, if your tags are well-curated and provide meaningful navigation for your users, enabling indexing can improve discoverability and site organization. Evaluate the relevance and usefulness of your tags, considering both SEO considerations and user experience before making a decision.
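Before concluding that Google is ignoring a noindex (as in the question above about tags still showing in Google), it is worth confirming the directive is actually present in the tag page's HTML. A minimal sketch, using an inline HTML string as a stand-in for a fetched tag-archive page; note this only checks the meta tag, not an `X-Robots-Tag` HTTP header, and even a correct noindex only takes effect after Google recrawls the page:

```python
# Rough sketch: check whether a page's HTML carries a robots noindex
# directive. SAMPLE_HTML is a stand-in for a fetched tag-archive page.
from html.parser import HTMLParser

SAMPLE_HTML = """<html><head>
<meta name="robots" content="noindex, follow">
<title>Tag: news</title>
</head><body></body></html>"""

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        # Look for <meta name="robots" content="...noindex...">
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

def has_noindex(html_text):
    parser = RobotsMetaParser()
    parser.feed(html_text)
    return parser.noindex

print(has_noindex(SAMPLE_HTML))  # True
```

If the directive is present but pages are still indexed, the usual culprit is a robots.txt block preventing Google from recrawling the page and seeing the noindex at all.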
-
I experienced the same problem. I once read an article saying that Google prefers websites that have a neat structure, and in my opinion it is difficult to make tags more structured on my website. My site is malasngoding.com. What do you think?
-
I was also looking for the answer to the same question for my website https://abcya.in/. I think tags should not be indexed.
-
I don't think it's a good idea. I'm testing tons of articles with and without tags for my website, Dizzibooster. It seems that adding tags provides an edge for indexing purposes. However, you can test these things yourself.
-
I am facing the same issue on my website, AmazingFactsHindi. After our expert discussion and after reading this forum, I decided to de-index all the tag and category pages that are creating duplicate content issues on our website.
-
We had similar questions on SEO. We experimented with disabling tags for the last 4 weeks. The only impact I have been able to find so far is that the [thecodebuzz](https://www.thecodebuzz.com/) website stopped getting hits for a few impressions that were based on tag keys. We are still evaluating the impact.
-
Heyo,
If your tags and categories are providing value to your users or helping with your site's SEO, you might not want to remove them from search engine indexes. I disabled it on my site, OceanXD, and it was a good decision for me.
-
I had the same question, but I have found that blocking category and tag pages is good for SEO.
I have a blog site, Tech News Blog, where I created around 400 tags, but I found this was creating duplicate content issues. In my opinion, de-indexing tags and categories is better for SEO.
-
@JCN-SBWD You can index your tags as long as it doesn't affect the indexing of your posts. Tags do get traffic as well. The only reason I stopped indexing my tags is that it affected the indexing of my posts: tags got indexed in a matter of minutes, while it took hours, sometimes days, before my posts got indexed.
-
I would recommend disabling tag indexing, as there are cases where you have multiple tags for the same topic. You can index categories; as mentioned above, they are more structured and define your website in some way. If you write a custom excerpt for each post, it helps category pages to have unique content for each post excerpt.
-
It's a good idea to block tags, since they are duplicate content and may dilute the performance of your real pages. But if you find that certain tag or author pages bring valid traffic, you can make an exception for them. It's up to you.
-
Can you please explain what exactly you did?
-
Many thanks for the prompt response and also for confirming my suspicions, it is much appreciated.
The robots suggestion is handy too.
-
Personally, I usually do this as well as blocking them in robots.txt to save on crawl budget, but you should noindex first: if Google is blocked from crawling via robots.txt, how will they find the noindex tags? So it needs to be staggered.
I find that the tag URLs result in quite messy SERPs, so I prefer to de-index those and then really focus on adding value to 'actual' category URLs. Because categories have a defined structure, they're better for SEO (IMO).
Categories are usually good for SEO if you tune and tweak them (and if their architecture is linear), but tags are very messy.
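To illustrate the staggered approach: first serve a noindex on the tag archives (most SEO plugins have a toggle for this), wait until the tag pages have dropped out of the index, and only then add the crawl block. A sketch of the eventual robots.txt rule, assuming WordPress's default /tag/ permalink base:

```
# Phase 2 only: add this AFTER the tag pages have already been
# noindexed and have dropped out of Google's index. Blocking crawling
# first would prevent Google from ever seeing the noindex.
User-agent: *
Disallow: /tag/
```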