
        The Moz Q&A Forum



        May know what's the meaning of these parameters in .htaccess?

        Intermediate & Advanced SEO
        • esiow2013
          esiow2013 last edited by

          # Begin HackRepair.com Blacklist

          RewriteEngine on

          # Abuse Agent Blocking

          RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Bolt\ 0 [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:[email protected] [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} CazoodleBot [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Custo [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Default\ Browser\ 0 [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^DIIbot [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^DISCo [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} discobot [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^eCatch [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ecxi [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^EmailCollector [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^FlashGet [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^GetRight [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^GrabNet [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Grafula [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} GT::WWW [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} heritrix [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^HMView [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} HTTP::Lite [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ia_archiver [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} IDBot [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} id-search [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} id-search.org [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^InterGET [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^InternetSeer.com [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} IRLbot [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ISC\ Systems\ iRc\ Search\ 2.1 [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Java [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^JetCar [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^larbin [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} libwww [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} libwww-perl [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Link [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} LinksManager.com_bot [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} linkwalker [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} lwp-trivial [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Maxthon$ [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} MFC_Tear_Sample [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^microsoft.url [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} Microsoft\ URL\ Control [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} Missigua\ Locator [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*Indy [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*NEWT [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^MSFrontPage [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Navroad [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^NearSite [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^NetAnts [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^NetSpider [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^NetZIP [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Nutch [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Octopus [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} panscient.com [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^pavuk [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} PECL::HTTP [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^PeoplePal [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} PHPCrawl [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} PleaseCrawl [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^psbot [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^RealDownload [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^ReGet [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Rippers\ 0 [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} SBIder [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^SeaMonkey$ [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^sitecheck.internetseer.com [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} Snoopy [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} Steeler [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^SuperBot [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Surfbot [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Toata\ dragostea\ mea\ pentru\ diavola [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} URI::Fetch [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} urllib [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} User-Agent [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} Web\ Sucker [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} webalta [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^WebAuto [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^[Ww]eb[Bb]andit [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} WebCollage [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^WebCopier [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^WebFetch [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^WebReaper [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^WebSauger [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^WebStripper [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^WebZIP [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} Wells\ Search\ II [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} WEP\ Search [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Wget [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Widow [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^WWW-Mechanize [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} zermelo [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^Zeus [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ^(.*)Zeus.*Webster [NC,OR]
          RewriteCond %{HTTP_USER_AGENT} ZyBorg [NC]
          RewriteRule ^.* - [F,L]

          # Abuse bot blocking rule end

          # End HackRepair.com Blacklist

          • esiow2013
            esiow2013 last edited by

            Now it's clear. Thanks a lot ThompsonPaul! 🙂

            • ThompsonPaul
              ThompsonPaul @esiow2013 last edited by

              Thanks! 🙂

              Typically these blacklists are created and maintained by security specialists who have done testing on the different bots to determine which are legit/beneficial and which are crapbots. They then provide these lists for others to use. Often the lists are amalgamations of bots detected and analysed on a number of different sites and by a number of different specialists to act as a double-check for each other.

              You do need to be careful that you are using a well-curated list, as carelessly blocking bots can cause problems for legitimate ones. You would check out the creator of such a list the same way you'd check out the creator of a plugin you're considering using: check reviews, and look at the comments and responses on the post that provides the blacklist.

              Does that answer your question?

              Paul

              • esiow2013
                esiow2013 last edited by

                Hi ThompsonPaul,

                Wow! Superb explanation. One thing I just want to clarify: how would I know if these bots are "bad bots"?

                Thanks a lot! 🙂

                • ThompsonPaul
                  ThompsonPaul last edited by

                  As Lynn mentions, these entries form a blacklist for "bad bots". These are bots that have been identified as harmful (or at least non-helpful) to the real use of a website. Bots are essentially spiders that crawl and record the pages of your site the same way Googlebot does. There are two main reasons for blocking them:

                  1. Too many unnecessary bots can put a real strain on server resources, causing the site to slow down for real users. This can be especially problematic with bad bots as they do not respect the entries in your robots.txt file and so will crawl even blocked pages. This can mean huge numbers of extra pages get crawled, leading to even more load.

                  2. Many (most?) of these bots are collecting data for nefarious purposes. Some are scrapers to collect your site content in order to re-use it illegally on another site, some are scanning for certain files/plugins on your site known to be insecure so they can target them for attack, etc.

                  Best case scenario, these bots waste your bandwidth and can cause site slowdowns on low-powered (e.g. shared) servers. Worst case, they can actually cause harm to your site.

                  There are literally many thousands of these types of bots out there, and their creators often change their identifying user agents just to get around these types of blacklists. But many have been around for some time and still use the same identifier. So having a blacklist to block the most common of them is actually very good security practice. To be totally proactive, however, you'd need to update the list every couple of months.

                  Bottom line - those entries are providing some security and overload protection for your site, and there's essentially no downside to having them in place even if they're not catching everything.
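
                  To make the robots.txt point concrete, here is a minimal sketch (the agent name ExampleBadBot is hypothetical, not one from the posted list): robots.txt can only ask a crawler to stay away, while the .htaccess rules refuse the request outright with a 403.

                  # robots.txt is advisory - well-behaved bots honour it, bad bots simply ignore it:
                  #   User-agent: ExampleBadBot
                  #   Disallow: /
                  #
                  # .htaccess is enforced by the server - the request is refused no matter how the bot behaves:
                  RewriteEngine on
                  RewriteCond %{HTTP_USER_AGENT} ExampleBadBot [NC]
                  RewriteRule ^.* - [F,L]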

                  Hope that helps - if any of my explanation isn't clear, just holler 🙂

                  Paul

                  • esiow2013
                    esiow2013 last edited by

                    Thanks Lynn! I'll just remove these parameters and leave this one:

                    # BEGIN WordPress

                    <IfModule mod_rewrite.c>
                    RewriteEngine On
                    RewriteBase /
                    RewriteCond %{REQUEST_FILENAME} !-f
                    RewriteCond %{REQUEST_FILENAME} !-d
                    RewriteRule . /index.php [L]
                    RewriteCond %{HTTP_HOST} ^domain.com [NC]
                    RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,NC]
                    </IfModule>

                    # END WordPress
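
                    One hedged note on ordering, not something the thread settles: because the WordPress front-controller rule ends with [L], it is often cleaner to place the www redirect before the WordPress rules so every request is canonicalised first. A sketch using the same placeholder domain.com:

                    # BEGIN WordPress
                    <IfModule mod_rewrite.c>
                    RewriteEngine On
                    RewriteBase /

                    # Canonicalise the host first, so the 301 fires before WordPress routes the request
                    RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
                    RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]

                    # Standard WordPress front controller
                    RewriteCond %{REQUEST_FILENAME} !-f
                    RewriteCond %{REQUEST_FILENAME} !-d
                    RewriteRule . /index.php [L]
                    </IfModule>
                    # END WordPress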

                    • LynnPatchett
                      LynnPatchett @esiow2013 last edited by

                      I don't use something like this myself. I suppose if you are having some problem with bots it might be useful; maybe someone else can chime in if they have experience with this kind of blocking.

                      • esiow2013
                        esiow2013 last edited by

                        Thanks Lynn! Is this really necessary?

                        • LynnPatchett
                          LynnPatchett last edited by

                          Hi,

                          It is checking whether the visiting user agent contains any of these strings (NC tells it the match is not case-sensitive) and, if it does, returning a 403 Forbidden response.
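
                          As an illustration of how one of those entries reads (a minimal sketch; BlackWidow is one of the agents in the posted list):

                          RewriteEngine on
                          # [NC] = match the user agent case-insensitively
                          RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [NC]
                          # [F] = respond 403 Forbidden, [L] = stop processing further rewrite rules
                          RewriteRule ^.* - [F,L]

                          In the posted list every condition carries [OR] except the last one, so matching any single agent is enough to trigger the final RewriteRule.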
