Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be available to view - we have locked both new posts and new replies.
Attack of the dummy URLs -- what to do?
-
It occurs to me that a malicious program could set up thousands of links to dummy pages on a website:
www.mysite.com/dynamicpage/dummy123
www.mysite.com/dynamicpage/dummy456
etc.
How is this normally handled? Does a developer have to check all the parameters to see whether they are valid and, if not, automatically return a 301 redirect or a 404 Not Found? That requires a table lookup of acceptable URL parameters for every new visitor.
I was thinking that bad URL names would be rare, so it would be OK to just stop the program with a message, until I realized someone could intentionally set up links to non-existent pages on a site.
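One common way to handle this is exactly the table lookup described above: the dynamic-page handler checks the requested identifier against the list of pages that actually exist and returns a 404 for everything else, so fabricated URLs never resolve to a real page. Here is a minimal sketch of that idea, assuming a Python/Flask front end; the route, the slug list, and the page names are made up for illustration and are not from the original site.

```python
# Minimal sketch: only serve /dynamicpage/<slug> when the slug is known;
# anything else gets a 404, so dummy URLs never render as real pages.
from flask import Flask, abort

app = Flask(__name__)

# In a real site this would be a database or CMS lookup; a set keeps the sketch small.
VALID_SLUGS = {"widgets", "gadgets", "about-us"}

@app.route("/dynamicpage/<slug>")
def dynamic_page(slug):
    if slug not in VALID_SLUGS:
        abort(404)  # unknown slug -> 404 Not Found, nothing for Google to index
    return f"Rendering page for {slug}"

if __name__ == "__main__":
    app.run()
```

Because the 404 is generated automatically for any slug that fails the lookup, there is no need to enumerate bad URLs in advance, no matter how many of them someone creates.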
-
Hello,
I am also having this issue with hundreds of dummy URLs that never existed as part of our website's blog. Do I go into URL parameters and specify each of the dummy URLs to avoid this?
Thanks in advance for any help! (And sorry to piggyback on this question, Theodore - hope you don't mind!)
-
Thanks Ray. Appreciate the advice!
-
It's great that you've identified issues like this. I also suggest that if you know certain parameters are generated often and don't need to be indexed, you go into your Google Webmaster Tools account > Crawl > URL Parameters and proactively set crawling to 'No URLs' where appropriate. I do this with certain custom parameters for sites that are prone to having these extra URLs indexed mistakenly.
-
Hi Ray-pp,
Thanks for your answer. I'm not seeing anything significant, but occasionally a bot will come along with extra stuff added to the parameter names, so it got me thinking that a malicious program or a nasty competitor might want to do that deliberately to cause havoc. My understanding is that 404s don't hurt SEO rankings in Google, but the way things are set up now, no one would ever get a 404; in fact, Google would index the 'bad' pages. So maybe I need to do something proactively to 404 or 301 such pages so they never get put into the index at all.
Since my site has lots of dynamically generated pages, I've had my share of surprises, and am just trying to avoid any new ones!
-
Hi Theodore - You pose an interesting problem. Are you currently experiencing this issue? I don't see why someone would create a bunch of random non-existent links to your site, but if they did (and the pages were receiving low-quality traffic), then I would proactively disavow the domains that created the links. That would be enough to prevent any penalties you're afraid of receiving.
If, however, you're noticing that specific 404 pages are receiving quality traffic (maybe an old page was removed but good traffic is still sent to the page) then you would want to 301 that page to its closest relative page that deserves the traffic and authority.
Does that help? Maybe a little more information about your specific problem would allow me to tailor the advice better.
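To illustrate the 301 part of the answer above: pages that have been removed but still receive worthwhile links or traffic can be permanently redirected to their closest surviving relative, while everything else keeps returning a plain 404. Here is a small sketch in the same Flask style as above; the old and new paths are hypothetical examples, not URLs from the question.

```python
# Sketch: 301 retired URLs that still earn traffic to their closest relative page,
# and let genuinely unknown URLs fall through to a normal 404.
from flask import Flask, redirect, request

app = Flask(__name__)

# Old URL -> closest surviving relative page (example entries only).
REDIRECT_MAP = {
    "/dynamicpage/old-widget": "/dynamicpage/new-widget",
    "/blog/retired-post": "/blog/",
}

@app.errorhandler(404)
def not_found(error):
    target = REDIRECT_MAP.get(request.path)
    if target:
        return redirect(target, code=301)  # permanent redirect passes along the link equity
    return "Page not found", 404           # everything else stays a 404 and drops out of the index
```

The map only needs entries for URLs that are actually worth preserving; random dummy URLs never appear in it and simply 404.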
Related Questions
-
410 or 301 after URL update?
Hi there, A site I'm working on at the moment has a thousand "not found" errors in Google Search Console (and of course, I'm sure there are thousands more it's not showing us!). The issue is a lot of them seem to come from a URL change. The damage has been done, the URLs have been changed and I can't stop that... but as you can imagine, I'm keen to fix as many as humanly possible. I don't want to go mad with 301s - but for external links in, this seems like the best solution? On the other hand, Google is reading internal links that simply aren't there anymore. Is it better to hunt down the new page and 301 it anyway? Or should I 410 and grit my teeth while Google crawls and recrawls it, warning me that this page really doesn't exist? Essentially I guess I'm asking: how many 301s are too many and will affect our DA? And what's the best solution for dealing with mass 404 errors - many of which aren't attached or linked to from any other pages anymore? Thanks for any insights 🙂
Intermediate & Advanced SEO | | Fubra0 -
If I block a URL via the robots.txt - how long will it take for Google to stop indexing that URL?
Intermediate & Advanced SEO | | Gabriele_Layoutweb0 -
Removing .html from URLs - impact on rankings?
Good evening Mozzers. Couple of questions which I hope you can help with. Here's the first. I am wondering, are we likely to see ranking changes if we remove the .html from the site's URLs? For example website.com/category/sub-category.html Change to: website.com/category/sub-category/ We will of course make sure we 301 redirect to the new, user-friendly URLs, but I am wondering if anyone has had previous experience of implementing this change and how it has affected rankings. By having the .html in the URLs, does this stop link juice flowing back to the root category? Second question: If one page can be loaded with and without a forward slash "/" at the end, is this a duplicate page, or would Google consider this the same page? Would like to eliminate duplicate content issues if this is the case. For example: website.com/category/ and website.com/category Duplicate content/pages?
Intermediate & Advanced SEO | | Jseddon920 -
URL Rewriting Best Practices
Hey Moz! I’m getting ready to implement URL rewrites on my website to improve site structure/URL readability. More specifically I want to: Improve our website structure by removing redundant directories. Replace underscores with dashes and remove file extensions for our URLs. Please see my example below: Old structure: http://www.widgets.com/widgets/commercial-widgets/small_blue_widget.htm New structure: https://www.widgets.com/commercial-widgets/small-blue-widget I've read several URL rewriting guides online, all of which seem to provide similar but overall different methods to do this. I'm looking for what's considered best practices to implement these rewrites. From what I understand, the most common method is to implement rewrites in our .htaccess file using mod_rewrite (which will find the old URLs and rewrite them according to the rewrites I implement). One question I can't seem to find a definitive answer to is when I implement the rewrite to remove file extensions/replace underscores with dashes in our URLs, do the webpage file names need to be edited to the new format? From what I understand the webpage file names must remain the same for the rewrites in the .htaccess to work. However, our internal links (including canonical links) must be changed to the new URL format. Can anyone shed light on this? Also, I'm aware that implementing URL rewriting improperly could negatively affect our SERP rankings. If I redirect our old website directory structure to our new structure using this rewrite, are my bases covered in regards to having the proper 301 redirects in place to not affect our rankings negatively? Please offer any advice/reliable guides to handle this properly. Thanks in advance!
Intermediate & Advanced SEO | | TheDude0 -
Code to change country in URL for locale results
How do I change the code in my URL to search in Google by specific location?
Intermediate & Advanced SEO | | theLotter0 -
Canonical URLs and Sitemaps
We are using canonical link tags for product pages in a scenario where the URLs on the site contain category names, and the canonical URL points to a URL which does not contain the category names. So, the product page on the site is like www.example.com/clothes/skirts/skater-skirt-12345, and also like www.example.com/sale/clearance/skater-skirt-12345 in another category. And on both of these pages, the canonical link tag references a 3rd URL like www.example.com/skater-skirt-12345. This 3rd URL, used in the canonical link tag is a valid page, and displays the same content as the other two versions, but there are no actual links to this generic version anywhere on the site (nor external). Questions: 1. Does the generic URL referenced in the canonical link also need to be included as on-page links somewhere in the crawled navigation of the site, or is it okay to be just a valid URL not linked anywhere except for the canonical tags? 2. In our sitemap, is it okay to reference the non-canonical URLs, or does the sitemap have to reference only the canonical URL? In our case, the sitemap points to yet a 3rd variation of the URL, like www.example.com/product.jsp?productID=12345. This page retrieves the same content as the others, and includes a canonical link tag back to www.example.com/skater-skirt-12345. Is this a valid approach, or should we revise the sitemap to point to either the category-specific links or the canonical links?
Intermediate & Advanced SEO | | 379seo0 -
Exact keyword URL or not?
Hi all, I have a quick question about the proper use of permalinks. Let's say that I have a website about sports and I want to create an internal page dedicated to shoes. I know that the keyword "shoe" has 15,000 monthly visits, while the keyword "shoes" has 1,000 monthly visits. How do I have to name the internal page? http://www.example.com/shoe or http://www.example.com/shoes (with a final 's')? I would think that by naming the URL http://www.example.com/shoes, the search engine would consider that page for the keywords "shoe" and "shoes", but I am not sure about it. Should I create a URL that only focuses on one specific keyword ("shoe", in this example) or a URL that may encompass more than one keyword ("shoe" and "shoes")? I hope this is clear. Thank you for your time and help. All best, Sal
Intermediate & Advanced SEO | | salvyy0 -
Htaccess Redirect with %C2%A0 in URL
Below is my setup for redirects in the .htaccess file in my root WordPress installation. The www to non-www redirect works well, so no problems there. Other page redirects work well, too (example: redirect 301 /some-page/ http://mysite.com/another-page/ - I didn't post those because I have a few too many :) So here it goes...

```
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.mysite.com$ [NC]
RewriteRule ^(.*)$ http://mysite.com/$1 [R=301,L]

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress

redirect 301 /archives/10-college- majors/ http://mysite.com/archives/10-college-majors/
redirect 301 /archives/10-college-%20majors/ http://mysite.com/archives/10-college-majors/
redirect 301 /archives/10-college-%C2%A0majors/ http://mysite.com/archives/10-college-majors/
```

I'm having a problem with the last 301 redirect (redirect 301 /archives/10-college-%C2%A0majors/ http://mysite.com/archives/10-college-majors/) not working... As you can see, I've tried using other variations of the "space" but no go. I also used a redirect in cPanel's Redirect screen, testing all the possible options + wildcard. I've also tried this: http://serverfault.com/questions/201829/using-special-characters-in-apache-mod-rewrite-rule (perhaps unsuccessfully, because it caused a 500 server error and it's a different situation in my case). I also saw something here: http://www.webmasterworld.com/apache/3908682.htm but I don't know if it works and how I would implement that + do so without compromising ALL other redirects. Note: the URL displays with a space in the address bar of all major web browsers: http://mysite.com/10-college- majors/ and goes to a 404 page. I have a gorgeous page / PR6 / high-authority site linking to the URL on my site, but they copied the URL with a space somehow. I contacted the person responsible for the website and he claims it works fine (aka he didn't check it). Is there a clean way to redirect ONLY this problematic URL without compromising other redirects, etc.? Any ideas would be great. I'll respond with progress. Thanks in advance.

UPDATE: the redirect works, and it did work. Even so, when looking at the source of the page linking to mine, the URL looks like this: http://mysite.com/archives/10-college- majors/ Clicking the URL in Source View in Firefox takes me to http://mysite.com/archives/10-college-%C2%A0majors/ - none of my 301 redirects should direct there. I don't have any redirect plugins either.

Intermediate & Advanced SEO | | pepsimoz0