Hide sitelinks from Google search results
-
Does anyone have any recommendations on how to tell Google (ideally via a URL) not to index a particular page of a website? I have tried hiding certain sitemaps through Yoast SEO (which has worked to a degree), but some WordPress features expose links without them actually being part of a "sitemap", so those links are harder to hide.
I'm having an issue with one of my websites: the sitelinks Google is suggesting are nowhere near the most popular pages, and I know you can no longer ask Google, through Search Console, not to show certain pages as sitelinks.
Any suggestions are greatly appreciated! Thanks!
-
Yes, I tried the old Search Console option before I posted here, but sadly it just redirects you back to the new version. However, I hadn't even thought about the redirect option, and since the website is built on WordPress, that should be easy enough to set up.
Thanks so much!
-
Ah. In that case, I might try one of the following:
- My preferred approach would be to set up a redirect from that URL to a valid new URL (see the sketch after this list). That way, you make the best use of the traffic coming from the sitelink for whatever time it remains there. After a while, I suspect Google will either update the sitelink title and description with those from the redirect's target page, or eventually drop that sitelink in favor of another page.
- If you can't do the above (perhaps you aren't able to set up redirects from the old URL), then I would go the route of using the old version of Search Console to request removal of the old URL (Google Index > Remove URLs). If the URL really does return a proper 404 response code, this should work; a removal request doesn't do the job on its own if the URL still returns a valid response code, but a 404 plus a removal should get rid of it. That said, you are then rolling the dice on whatever Google decides to promote as a replacement sitelink, so I would prefer the first approach if I thought I could make good use of the traffic coming from that link.
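To illustrate the first option: on an Apache-hosted WordPress site, the redirect could be as simple as a one-line rule in the site's .htaccess file (the paths below are placeholders - use the actual old .php path and whichever new page is the closest match). A WordPress redirect plugin such as Redirection can accomplish the same thing without editing files.

```apache
# Hypothetical paths: permanently redirect the leftover .php URL from the
# old static site to its closest equivalent on the new WordPress site.
Redirect 301 /old-services.php https://www.example.com/services/
```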
-
Hi There,
Thanks so much for your reply. The tricky part is that the page showing up as a sitelink isn't part of any of the website's sitemaps. We rebuilt the website for the client about three months ago - going from a static site to WordPress - and for some unknown reason Google is still remembering a .php link from the old website. It's nowhere in our FTP, so clicking on it returns a 404 error.
The other problem is that the old website was never SEO'ed and never had proper page titles, so users mistake that sitelink for the main website link; it leads to a 404 page and people think the website isn't working.
Have I explained it a bit better? Does that change your suggestion? Thanks!
-
For an HTML page, you would include a robots meta tag along the following lines in the HEAD section of the page:
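```html
<!-- Standard robots meta tag: tells search engines not to index this page -->
<meta name="robots" content="noindex">
```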
But from your question, I'm not sure whether you are actually trying to noindex the sitemap itself. If that's the case - if you want to tell Google not to index an XML file (rather than an HTML page) - then in theory you could inject an X-Robots-Tag: noindex header into the response for the sitemap file, though there's probably no need to do that.
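As a rough sketch of that approach, assuming an Apache server with mod_headers enabled and a sitemap file named sitemap.xml (adjust the filename to match yours), the header could be set in .htaccess like this:

```apache
# Assumes Apache with mod_headers enabled; the filename is an example only
<Files "sitemap.xml">
  Header set X-Robots-Tag "noindex"
</Files>
```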
Related Questions
-
Unsolved Google Search Console Still Reporting Errors After Fixes
Hello, I'm working on a website that was too bloated with content. We deleted many pages and set up redirects to newer pages. We also resolved an unreasonable number of 400 errors on the site, and I removed several ancient sitemaps that listed content deleted years ago that Google was still crawling. According to Moz and Screaming Frog, these errors have been resolved. We've submitted the fixes for validation in GSC, but the validation repeatedly fails. What could be going on here? How can we resolve these errors in GSC?
Technical SEO | tif-swedensky | 0
-
Desktop & Mobile XML Sitemap Submitted But Only Desktop Sitemap Indexed On Google Search Console
Hi! The problem: we have submitted a sitemap index to GSC. Within that index there are 4 XML sitemaps, including one for the desktop site and one for the mobile site. The desktop sitemap has 3,300 URLs, of which Google has indexed (according to GSC) approximately 3,000. The mobile sitemap has 1,000 URLs, of which Google has indexed 74. The pages are crawlable and the site structure is logical, and performing a landing page URL search (showing only google/organic source/medium) in Google Analytics, I can see that hundreds of those mobile URLs are being landed on. A search on mobile for a long-tail keyword from a (randomly selected) page shows a result in the SERPs for a mobile page that, judging by GSC, has not been indexed. Could this be because we have recently added rel=alternate tags on our desktop pages (and, of course, corresponding canonical ones on mobile)? Would Google then 'not index' rel=alternate page versions? Thanks for any input on this one. PmHmG
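For reference, the annotation pairing for separate desktop and mobile URLs that the question describes typically looks like this (the URLs are placeholders). With these annotations Google treats the two URLs as a pair and consolidates signals between them, which may be part of why mobile URLs can show a much lower indexed count in GSC.

```html
<!-- On the desktop page (placeholder URLs) -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page/">

<!-- On the corresponding mobile page -->
<link rel="canonical" href="https://www.example.com/page/">
```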
Technical SEO | AlisonMills | 0
-
How to avoid the 'Showing results for' suggestion in Google search results?
Hi, when I search for "Zotey" in Google, the following message is displayed: "Showing results for zotye. Search instead for zotey." Can anyone let me know how to get rid of this conflict ASAP? Regards, Sivakumar.
Technical SEO | segistics | 0
-
I added a WP Customer Reviews plugin but nothing seems to appear on Google search
Hi, I've added the WordPress WP Customer Reviews plugin to a client's website, and we had some past clients post reviews in order to strengthen the hReview factor. Google has crawled the website several times since, but we don't see any change in the organic SERPs. Can you please tell me if I've done something wrong or forgotten something? That's the website - Capital Garage Door. Thanks!
Technical SEO | captainjoe | 0
-
How do I remove my CDN subdomains from Google search results?
A few months ago I moved all my WordPress images onto a subdomain. After I purchased a CDN service, I moved those images back to my root domain. I added "User-agent: * Disallow: /" to my CDN domain's robots.txt. But now, when I perform a site: search on Google, I find that my CDN subdomains are indexed by Google. I think this will create a duplicate content issue, and I was already hit by the Penguin update. How do I remove these search results from Google? Should I add my CDN domain to Webmaster Tools to submit a URL removal request? The problem is, if I use cdn.mydomain.com it shows my www.mydomain.com. My blog: http://goo.gl/58Utt Site search result: http://goo.gl/ElNwc
Technical SEO | Godad | 1
-
How do I get out of a Google bomb?
Hi all, I have a website, bijouxroom.com. I was on the 7th page for the search term "takı" in Google, and on the 2nd page for "online takı". Now I see that in one day my results have dropped to the 13th and 10th pages in Google, respectively. I built too much anchor text for "takı" and "online takı". What should I do to regain my positions? Thanks in advance. Regards,
Technical SEO | ozererim | 0
-
Why are old versions of images still showing for my site in Google Image Search?
I have a number of images on my website with a watermark. We changed the watermark (on all of our images) in May, but when I search for my site, getmecooking, in Google Image Search, it still shows the old watermark (the old one is grey, the new one is orange). Is Google not updating the images in its search results because they are cached by Google? Or because it is ignoring my images, having downloaded them once? Should we be giving our images a version number (at the end of the file name)? Our website cache is set to 7 days, so that's not the issue. Thanks.
Technical SEO | Techboy | 0
-
Should I set up a disallow in the robots.txt for catalog search results?
When the crawl diagnostics came back for my site, they showed around 3,000 pages of duplicate content. Almost all of them are catalog search results pages. I also did a site: search on Google, and they have most of the results pages in their index too. I think I should just disallow the bots in the /catalogsearch/ subfolder, but I'm not sure if this will have any negative effect?
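A disallow rule for that subfolder, as described in the question, would look something like this in robots.txt (note that this blocks crawling of those search result pages but does not, by itself, remove URLs that are already indexed):

```
# Block all crawlers from the catalog search results directory
User-agent: *
Disallow: /catalogsearch/
```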
Technical SEO | JordanJudson | 0