Right way to block Google robots from PPC landing pages
-
What is the right way to completely block SEO robots from my AdWords landing pages? Robots.txt does not work very well for that, as far as I know.
Adding the noindex, nofollow meta tags, on the other hand, will block the AdWords robot as well, right?
Thank you very much,
Serge
-
Thank you very much
-
Gotcha. I did some searching around, and you will not block the AdWords bot unless you explicitly block AdsBot-Google. A wildcard user-agent disallow will not block AdsBot-Google. Hope that helps!
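For illustration, a hedged sketch of what that looks like in practice (the /ppc path is just a placeholder): a wildcard disallow keeps general search crawlers out of the landing pages, while AdsBot-Google, not being named, can still fetch them to check ad relevancy.
# Hypothetical robots.txt sketch - /ppc is a placeholder path
User-agent: *
Disallow: /ppc
# AdsBot-Google ignores the wildcard group above, so it can still crawl /ppc.
# To block it as well, it would have to be named explicitly:
# User-agent: AdsBot-Google
# Disallow: /ppc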
-
Thanks guys. The AdWords robot scans the page to determine its relevancy to your ad group.
When you block it in robots.txt ONLY, the link to the page can still be indexed and show up in SERP results for site:example.com.
I was wondering whether adding the noindex, nofollow meta tags also blocks all robots from scanning the page, AdWords included.
-
If you have a specific directory for PPC pages, then you can follow these steps, which have worked wonderfully for us:
Block the /ppc directory in the robots.txt file for all user agents. This can be live before the /ppc directory even exists.
User-agent: *
Disallow: /ppc
Add noindex, nofollow meta tags to all pages in /ppc.
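As a minimal sketch, the tag on each /ppc page would look something like this:
<!-- tells compliant crawlers not to index the page or follow its links -->
<meta name="robots" content="noindex, nofollow">
One caveat worth keeping in mind: if robots.txt already blocks /ppc, most crawlers will never fetch those pages and so will never see the meta tag; the SEOmoz post linked further down covers exactly that conflict.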
I'm not sure what you are referring to when you mention an AdWords robot?
-
Dear Serge,
Your question is difficult to answer, because you have several possibilities. If you use noindex, it will stop all the robots.
There is a post on the SEOmoz blog written by Lindsay that I think will answer your question. You will find it here: http://www.seomoz.org/blog/robot-access-indexation-restriction-techniques-avoiding-conflicts
Related Questions
-
A particular page cannot be indexed by Google
Hello, Smart People!
We need help solving a problem with Google indexing.
All other pages of our website are crawled and indexed. The page in question also meets Google's requirements and can be indexed. However, it is still not indexed.
Robots.txt is not blocking it.
We do not have a "nofollow" tag.
It is in the sitemap file.
We have internal links to this page from indexed pages.
We have requested indexing many times, and it is still grey.
The page was created one year ago.
We are open to any suggestions or guidance you may have. What else can we do to expedite the indexing process?
On-Page Optimization | | Viktoriia1805 -
Should we rename and update a page or create a new page entirely?
Hi Moz Peoples! We have a small site with simple site navigation, with only a few links in the nav bar. We have been doing some work to create a new page, which will eventually replace one of the links in the nav bar. The question we are having is: is it better to rename the existing page and replace its content, then wait for the great indexer to do its thing, or to permanently delete the page and replace it with the new page and content? Or is this a case where it really makes no difference as long as the redirects are set up correctly?
On-Page Optimization | | Parker8180 -
Listing all services on one page vs separate pages per service
My company offers several generalized categories with more specific services underneath each category. Currently the way it's structured is if you click "Voice" you get a full description of each voice service we offer. I have a feeling this is shooting us in the foot. Would it be better to have a general overview of the services we offer on the "Voice" page that then links to the specified service? The blurb about the service on the overview page would be unique, not taken from the actual specific service's page.
On-Page Optimization | | AMATechTel0 -
WordPress and category/subcategory landing pages
Hey, here's my situation. I'm building a WordPress blog for product reviews in a certain niche. The current category setup is 4 main categories with 4-8 subcategories each. Each subcategory has a unique description that will help it become a landing page for certain keywords, after which it lists the posts from that subcategory. The posts will always be assigned to a subcategory, never to a main category. My issue is what to do with the main categories. They're fairly general, so they're not really targeting any keywords, and they don't have any unique descriptions attached to them. I was thinking of choosing between three options for designing the main category pages:
List the subcategories + a normal posts loop that brings in the latest posts from the subcategories (may create a lot of duplicate content, since the subcategory pages also list their posts)
List only the subcategories (+ maybe just the latest post from each subcategory)
Don't link the main categories at all; instead, only use them to create dropdowns for the subcategories
So, what would you choose, and why?
On-Page Optimization | | mihaiaperghis0 -
Missing meta descriptions on indexed pages, portfolio, tags, author and archive pages. I am using All in One SEO, any advice?
I am having a few problems that I can't seem to work out... I am fairly new to this, and can't seem to work out the following. Any help would be greatly appreciated 🙂
1. I am missing a lot of meta description tags. I have installed "All in One SEO", but there seems to be no option to add meta descriptions to portfolio posts. I have also written meta descriptions for 'tags', and whilst I can see them in WP, they don't seem to be activated.
2. The blog has pages indexed by WP, called Part 2 (/page/2), Part 3 (/page/3), etc. How do I solve this issue of meta descriptions and indexed pages?
3. There is also a page for myself, the author, that has multiple indexes for all the blog posts I have written, and I can't edit these archives to add meta descriptions. This also applies to the month archives for the blog.
4. Also, SEOmoz tells me that I have too many links on my blog page (also indexed) and its consequent tags. This also applies to the author pages (myself). How do I fix this?
Thanks for your help 🙂 Regards Nadia
On-Page Optimization | | PHDAustralia680 -
301 redirects from several sub-pages to one sub-page
Hi! I have 14 sub-pages I deleted earlier today. But of course Google can still find them, and gives everyone who gives them a go a 404 error. I have come to the understanding that this will hurt the rest of my site, at least as long as Google has them indexed. These sub-pages lie in 3 different folders, and I want to redirect them to a sub-page in a fourth folder. I already have an htaccess file, but I simply can't get it to work! It is the same file I use for redirecting traffic from mydomain.no to www.mydomain.no, and I have tried every variation I can think of with the sub-pages. Has anyone perhaps had the same problem before, or for any other reason has the solution, and can help me with how to compose the htaccess file? 🙂 You have to excuse me if I'm using the wrong terms, missing something I should have seen underwater while wearing a blindfold, or misspelling anything. I am neither very experienced with anything surrounding SEO or anything else to do with the internet, nor am I from an English-speaking country. Hope someone here can light up my path 🙂 That's at least something you can say in Norwegian...
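A minimal sketch of what those .htaccess rules might look like, using mod_alias redirects; the folder and file names below are made up, and www.mydomain.no is simply the placeholder domain from the question:
# One line per deleted sub-page (14 in total), all pointing at the same target page
Redirect 301 /folder-one/old-page-a.html http://www.mydomain.no/folder-four/target-page.html
Redirect 301 /folder-two/old-page-b.html http://www.mydomain.no/folder-four/target-page.html
Redirect 301 /folder-three/old-page-c.html http://www.mydomain.no/folder-four/target-page.html
These lines can sit in the same .htaccess file as the existing www redirect.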
On-Page Optimization | | MarieA1 -
How do you block development servers with robots.txt?
When we create client websites, the URLs are client.oursite.com. Google is indexing these sites and attaching them to our domain. How can we stop it with robots.txt? I've heard you need to have the robots file on both the main site and the dev sites... A code sample would be groovy. Thanks, TR
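A hedged sketch of one common setup: robots.txt is read per hostname, so each dev subdomain serves its own file at its root that blocks everything, while the main site's robots.txt stays unchanged.
# robots.txt served only at client.oursite.com/robots.txt (the dev subdomain)
User-agent: *
Disallow: /
Note that a disallow only stops crawling; URLs that are already linked can still show up in the index, so a noindex (via meta tag or the X-Robots-Tag header) or password protection is the more reliable way to keep dev sites out entirely.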
On-Page Optimization | | DisMedia0 -
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
Course (starter, main, salad, etc)
Cooking Method (fry, bake, boil, steam, etc)
Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search results URL variations. This all works well on the site; however, it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as:
Setting a canonical tag (see the sketch after this question)
Adding these URL variables to Google Webmaster Tools to tell Google to ignore them
Changing the title tag in the head dynamically based on which URL variables are present
However, I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here, as the search results are always different. Adding these URL variables to Google Webmaster Tools won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution. Any help would be much appreciated. Kind Regards
On-Page Optimization | | smaavie
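For what it's worth, here is a hedged sketch of the canonical option mentioned above; find-a-recipe.php comes from the question, example.com is a placeholder, and whether consolidating every filtered variation onto the base search page is the right call depends on which filtered pages should rank in their own right:
<!-- placed in the <head> of find-a-recipe.php?course=salad&start=30 and similar variations -->
<link rel="canonical" href="http://www.example.com/find-a-recipe.php">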