Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies. More details here.
Is there a way to prevent Google Alerts from picking up old press releases?
- I have a client that wants a lot of old press releases (PDFs) added to their news page, but they don't want these to show up in Google Alerts. Is there a way for me to prevent this?
- Thanks for the post, Keri. Yep, the OCR capability would still make the image option for hiding them moot.
- Harder, but certainly not impossible. I had Google Alerts come up on scanned PDF copies of newsletters from the 1980s and 1990s that were images. The files recently moved and aren't showing up for the query, but I did see something else interesting. When I went to view one of the newsletters (https://docs.google.com/file/d/0B2S0WP3ixBdTVWg3RmFadF91ek0/edit?pli=1), it said "extracting text" for a few moments, then had a search box where I could search the document. Google was doing OCR work on the fly, and it seemed decently accurate in the couple of tests I did. There's a whole bunch of these newsletters at http://www.modelwarshipcombat.com/howto.shtml#hullbusters if you want to mess around with it at all.
- Well, that is how to exclude them from an alert that they set up themselves, but I think they are talking about anyone who might set up an alert that would find the PDFs. One other idea that may help: if you set up the PDFs as images rather than text, it would be harder for Google to "read" them and catalog them properly for an alert, but that would have much the same net effect as not having the PDFs in the index at all. Danielle, my other question would be: why do they give a crap about Google Alerts specifically? There have been all kinds of issues with the service, and if someone is really interested in finding out info on the company, there are other ways to monitor a website than Google Alerts. I used to use services that simply monitor a page (say, the news release page) and let me know when it is updated. That was often faster than Google Alerts, and I would find stuff on a page before others who only used Google Alerts. I think they are being kind of myopic about the whole approach, and blocking for Google Alerts may not help them as much as they think. Way more people simply search on Google than use Alerts.
- The easiest thing to do in this situation would be to add negative keywords or advanced operators to your Google Alert so that the new pages don't trigger it. You can do this by adding advanced operators that exclude an exact-match phrase, a file type, the client's domain, or just a specific directory. If all the new PDF files will be in the same directory or share a common URL structure, you can exclude them using the "-inurl:" operator.
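For illustration, a hypothetical alert query combining those exclusion operators might look like the following (the company name, domain, and directory are placeholders, not from the original thread):

```
"Acme Corp" -filetype:pdf -site:example.com
"Acme Corp" -inurl:press-releases
```

Note that this only filters alerts the client sets up themselves; it does nothing about alerts that third parties create.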
- That also presumes Google Alerts is anything near accurate. I've had it come up with things that have been on the web for years and, for whatever reason, Google thinks they are new.
- That was what I was thinking would have to be done... The reasons they don't want the releases showing up in Alerts are a little complicated. They do want them showing up on the web, just not as an Alert. I'll let them know they can't have it both ways!
- Use robots.txt to exclude those files. Note that this takes them out of the web index in general, so they will not show up in searches either. You need to ask your client why they are putting things on the web if they do not want them to be found. If they do not want them found, don't put them up on the web.
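A minimal robots.txt sketch along those lines, assuming the press-release PDFs live in a hypothetical /press-releases/ directory:

```
User-agent: *
Disallow: /press-releases/
```

Bear in mind that robots.txt blocks crawling rather than indexing: if the files are already indexed or are linked from elsewhere, an `X-Robots-Tag: noindex` response header on the PDFs is a more reliable way to keep them out of search results (and therefore out of Alerts).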