Pure spam Manual Action by Google
-
Hello Everyone,
We have a website, http://www.webstarttoday.com. Recently, we received a manual action from Google that says: "Pages on this site appear to use aggressive spam techniques such as automatically generated gibberish, cloaking, scraping content from other websites, and/or repeated or egregious violations of Google’s Webmaster Guidelines." Google has given http://smoothblog.webstarttoday.com/ as an example. The nature of the business of http://www.webstarttoday.com is creating sub-domains (it is a website builder): anyone can register and create a sub-domain.
My questions are:
- What are the best practices when someone creates a sub-domain on webstarttoday.com?
- How can I get this penalty on my website revoked?
- What should I do with the hundreds of other sub-domains that have already been created by third parties, like http://smoothblog.webstarttoday.com?
- Why don't these types of issues affect WordPress or Weebly?
Regards,
Ruchi
-
That's great news that you got the penalty revoked.
It can often take a few days for the manual spam actions viewer to show that there is no longer a penalty, so keep an eye on it. I've seen a number of sites lately that got a pure spam penalty revoked and then, a few days or weeks later, got either a thin content penalty or an unnatural links penalty. Hopefully that's not the case for you, though!
-
It could be that the message will only disappear tomorrow.
The message from Google doesn't say that the penalty is revoked, however, but that it has been revoked or adjusted. It's possible that the penalty is now only applied to the specific subdomain rather than to the site as a whole. Is it still the original message that is shown under Manual actions?
I would update the terms & conditions anyway, so that you can react quickly if you see other actions appearing. Try to scan the subdomains from time to time to make sure that they are not violating the Google guidelines.
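A minimal sketch of such a scan in Python, assuming you can export the list of subdomains from your own database; the word-count threshold and the example phrases are placeholders you would tune yourself:

```python
# Rough sketch only: scan each subdomain's homepage and flag anything that
# looks thin or spammy for manual review. The threshold and the example
# phrases are placeholders, not recommendations.
import requests
from bs4 import BeautifulSoup

MIN_WORDS = 150                       # pages below this word count get flagged
SPAM_MARKERS = ["payday loan", "casino bonus", "replica watches"]  # examples only

def check_subdomain(subdomain):
    """Fetch the subdomain's homepage and return a list of warnings."""
    url = f"http://{subdomain}.webstarttoday.com/"
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException as exc:
        return [f"unreachable: {exc}"]
    text = BeautifulSoup(resp.text, "html.parser").get_text(" ", strip=True).lower()
    warnings = []
    if len(text.split()) < MIN_WORDS:
        warnings.append("thin content")
    warnings.extend(f"suspicious phrase: {m}" for m in SPAM_MARKERS if m in text)
    return warnings

if __name__ == "__main__":
    # Replace this list with an export from your own database.
    for sub in ["smoothblog", "anotherblog"]:
        problems = check_subdomain(sub)
        if problems:
            print(f"{sub}: {', '.join(problems)}")
```

Run on a schedule, something like this at least surfaces the most obviously problematic subdomains for manual review.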
Regards,
Dirk
-
Thanks Dirk,
You have answered all of my questions nicely. I will keep your points in mind when creating sub-domains. Also, I received this message from Google after filing the reconsideration request:
Dear Webmaster of http://www.webstarttoday.com/
We have processed the reconsideration request from a site owner for http://www.webstarttoday.com/. The site has been reviewed for violations of our quality guidelines. Any manual spam actions applied to the site have been revoked or adjusted where appropriate.
As per the message, the penalty on my website should have been revoked, but it is still showing under "Manual actions".
Thanks,
Ruchi
-
Thanks for your quick response. Much appreciated.
-
^ VERY nice, Dirk!
-
Hi,
I'll try to answer your questions point by point:
1. You could add to your terms & conditions that sites created on your platform need to follow Google's Webmaster Guidelines, and that you can delete the subdomain if they are not followed.
2. Revoking the penalty is only possible by cleaning the site and removing the contested content. Whether you can force whoever manages that blog to clean up the site depends on your current terms & conditions.
3. Same as above - if your terms & conditions don't stipulate that violating Google's guidelines is forbidden, there is not much you can do at this point.
4. WordPress hosts the blogs on wordpress.com (the main software site is wordpress.org). Weebly has terms & conditions that forbid spam/SEO sites (wordpress.com probably has this as well, but it's stated very clearly on weebly.com).
Update your terms & conditions if necessary, send warnings to offending blog users, and delete their subdomains if necessary - a rough sketch of that workflow follows.
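This is only a sketch of a warn-then-suspend routine in Python; the mail relay, the addresses, and the suspend_subdomain() hook are assumptions about your own stack rather than anything specific to webstarttoday.com:

```python
# Rough sketch of a warn-then-suspend routine. The mail relay, the sender
# address, and suspend_subdomain() are assumptions about your own stack.
import smtplib
from datetime import datetime, timedelta
from email.message import EmailMessage

GRACE_PERIOD = timedelta(days=14)

def send_warning(owner_email, subdomain):
    """Email the subdomain owner about a terms & conditions violation."""
    msg = EmailMessage()
    msg["Subject"] = f"Policy warning for {subdomain}.webstarttoday.com"
    msg["From"] = "abuse@webstarttoday.com"      # placeholder address
    msg["To"] = owner_email
    msg.set_content(
        "Your site appears to violate our terms & conditions and Google's "
        "Webmaster Guidelines. Please clean it up within 14 days or it will "
        "be suspended."
    )
    with smtplib.SMTP("localhost") as smtp:      # assumed local mail relay
        smtp.send_message(msg)

def suspend_subdomain(subdomain):
    """Hypothetical hook into the website builder to take a subdomain offline."""
    print(f"suspending {subdomain}")

def enforce(flagged):
    """flagged: iterable of dicts with 'subdomain', 'owner_email', 'warned_at'."""
    now = datetime.utcnow()
    for entry in flagged:
        if entry["warned_at"] is None:
            send_warning(entry["owner_email"], entry["subdomain"])
            entry["warned_at"] = now
        elif now - entry["warned_at"] > GRACE_PERIOD:
            suspend_subdomain(entry["subdomain"])
```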
Hope this helps,
Dirk
-
Hi there
1. Here are a couple of resources: Moz and HotDesign.
2. Pure Spam: What Are Google Penalties & What to Do to Recover, from Search Engine Watch, and this Q&A thread from Moz.
3. I would go through your subdomains, find the ones that are blatant spam or thin on content, and remove them. I would then make sure that they are blocked from crawling via robots.txt (note that each subdomain reads its own robots.txt file - see the sketch after this list).
4. I would say because WordPress is the most widely used CMS in the world and a lot of reputable websites use it. I would really work on the spam-prevention features of your product: look for IPs that continually create websites, thin content, cloaking, off-topic websites, link farms, etc. It's your duty as a CMS to watch how your users use the product. Not only will it keep your product's reputation clean, it will also show that you are taking steps to run a product with integrity.
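On point 3, keep in mind that crawlers read robots.txt per host, so the main domain's file can't block the subdomains by itself. If all subdomains are served by one application, a handler along these lines (Flask here is purely illustrative) could return a blanket Disallow for flagged sites:

```python
# Minimal sketch, assuming every subdomain is served by the same application.
# Each subdomain reads its own /robots.txt, so this handler answers per host
# and returns a blanket Disallow for subdomains flagged as spam.
from flask import Flask, Response, request

app = Flask(__name__)

FLAGGED = {"smoothblog"}   # illustrative; load from your database in practice

@app.route("/robots.txt")
def robots():
    subdomain = request.host.split(".")[0]       # e.g. "smoothblog"
    if subdomain in FLAGGED:
        body = "User-agent: *\nDisallow: /\n"
    else:
        body = "User-agent: *\nAllow: /\n"
    return Response(body, mimetype="text/plain")
```

Bear in mind that robots.txt only stops crawling; for spam pages that are already indexed, removing the content entirely (or returning a 404/410) is the surer route.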
Hope this all helps - good luck!