Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
How to remove 404 pages in WordPress?
-
I used the crawl tool and it returned a 404 error for several pages that I no longer have published in WordPress. They must still be on the server somewhere?
Do you know how to remove them? I don't think they exist as files on the server (like an HTML file would), since WordPress stores content in a database.
I figure that getting rid of the 404 errors will improve SEO. Is this correct?
Thanks,
David
-
Yeah, as others have noted, there is often a live link somewhere else that points to a page that is now gone.
So a 404 really starts with the linking page: as long as that link is out there, it will keep pointing to the non-existent page. A 301 can help, or (this was fun) you can 301 the incoming 404 URL back to the linking page itself.
Tee-hee... yeah, not such a good idea in general, but a tactic we did have to use about four years ago to get a spam directory to "buzz off!"
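As a concrete sketch of the plain 301 option mentioned above: WordPress sites on Apache already use an .htaccess file for permalinks, and a one-line redirect can be added there. The paths below are hypothetical examples; substitute your own URLs.

```apache
# Send the dead URL to the most relevant live page (hypothetical paths)
Redirect 301 /old-deleted-post/ /closest-live-page/
```

A redirect plugin can manage the same thing from the WordPress admin if you'd rather not edit .htaccess by hand.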
-
Hey David
Once you publish a page or post in WordPress and submit a sitemap, you are stuck with those URLs. I've experienced this problem a lot, as I use WordPress often. Once you trash a page and delete it permanently, it isn't stored anywhere in the WordPress CMS; the URLs return 404s simply because the pages existed and now no longer do.
As stated above, just make sure you are not linking to your trashed page anywhere on your site.
I've done a couple of things with 404 pages on my WordPress sites:
1. Make an awesome 404 page so that visitors stay on the site if they reach it by accident. Google will eventually stop crawling 404s, so this is a good way to keep users engaged in the meantime.
2. 301 redirect the 404s to relevant pages. This helps you keep your link juice and also helps the user experience, since visitors land on a relevant page.
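As a rough sketch of what option 2 does under the hood: a redirect plugin (or custom code) essentially keeps a map of old URLs to new ones and answers with a 301 when a dead URL comes in. Everything below, including the paths and function name, is a hypothetical illustration, not any particular plugin's API.

```python
# Minimal sketch of a 301 redirect map, as a redirect plugin might keep it.
# All paths here are made-up examples.
REDIRECTS = {
    "/old-product-page/": "/products/",
    "/2010/old-blog-post/": "/blog/updated-post/",
}

def resolve_request(path):
    """Return (status, location) for an incoming request path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]   # permanent redirect preserves link equity
    return 200, path                  # no mapping: serve the page as normal

print(resolve_request("/old-product-page/"))
```

The key point is that the 301 is permanent, so search engines transfer the old URL's value to the target rather than treating it as gone.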
Hope that helps!
-
404s are a natural part of websites, and Google understands that. As long as you don't have links on your own site pointing to pages that are 404ing, you're fine. So basically, just make sure your website is not the source of your 404s.
-
Anything you type after your domain that isn't an actual page will return a not-found error; it doesn't mean the page exists somewhere. (Try entering yourdomain.com/anythingyouwant and you will get a 404.) Or am I misunderstanding the question? In any case, 404 errors are not necessarily bad for SEO, as long as they are not harming the user experience.
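To illustrate that point: a database-driven CMS only serves URLs it knows about, and every other path falls through to a 404 without any file existing or being deleted. A toy sketch, with made-up slugs:

```python
# Toy model of how a CMS decides between 200 and 404.
# The published slugs below are made-up examples.
PUBLISHED = {"about", "contact", "blog"}

def status_for(slug):
    """Any slug not in the published set gets a 404; no file needs to exist."""
    return 200 if slug in PUBLISHED else 404

print(status_for("about"))
print(status_for("anythingyouwant"))
```

So the 404s in the crawl report don't mean leftover files are sitting on the server; the pages simply aren't in the database anymore.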