Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
404 or 410 status code after deleting a real estate listing
-
Hi there,
We manage a website which generates an overview page and detail pages of listings for several real estate agents.
When these listings have been sold, they are removed from the overview and their detail pages are taken down. These pages then show up as not-found errors in the crawl error report in Google Search Console. At the moment they return a 404; would changing this to a 410 solve the problem? And if not, what fix could take care of it?
-
Good answer, Dirk.
I like your idea of adding valuable, relevant content to the pages - good thinking.
Personally, I'd rather let Google know these pages were removed intentionally and not due to errors, so a 410 rather than leaving them as 404s.
One thing to be mindful of, though, is how much crawl budget you're willing to give to these pages. If we're talking about a lot of pages in bulk, I'd be worried about how much crawl budget they'd eat up over time. As you point out, they'd likely drop in rank anyway due to the loss of internal links, so maybe the cost to crawl budget isn't worth it?
Another solution (building on your idea, Dirk) would be to automate the process: when a listing is marked as sold, the listing is removed from the overview, other properties in the same area are added to its page (as you suggest), and then some time later (a month or two?) a 410 header is set.
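To make that automation concrete, here's a minimal sketch of what the flow could look like, assuming a small Flask app with an in-memory stand-in for the listings database (the slugs, template, and 60-day grace period are purely illustrative - not the original poster's actual stack):

from datetime import datetime, timedelta
from flask import Flask, abort, render_template_string

app = Flask(__name__)
GRACE_PERIOD = timedelta(days=60)  # keep the "sold" page alive for roughly two months

# Illustrative in-memory data; a real site would query its listings database.
LISTINGS = {
    "oak-street-12": {"area": "springfield", "sold_at": datetime(2025, 1, 10)},
    "elm-avenue-3": {"area": "springfield", "sold_at": None},
}

SOLD_TEMPLATE = """
<h1>This property has been sold</h1>
<p>Similar properties in the same area:</p>
<ul>{% for s in similar %}<li><a href="/listings/{{ s }}">{{ s }}</a></li>{% endfor %}</ul>
"""

@app.route("/listings/<slug>")
def listing_detail(slug):
    listing = LISTINGS.get(slug)
    if listing is None:
        abort(404)  # the page never existed, so a plain 404 is fine

    if listing["sold_at"] is None:
        return f"<h1>{slug}</h1><p>This listing is for sale.</p>"

    if datetime.utcnow() - listing["sold_at"] < GRACE_PERIOD:
        # Recently sold: keep serving a 200 page that says "sold" and links
        # to other properties in the same area, so inbound links keep value.
        similar = [s for s, l in LISTINGS.items()
                   if l["area"] == listing["area"] and l["sold_at"] is None]
        return render_template_string(SOLD_TEMPLATE, similar=similar)

    # Grace period over: tell crawlers the page is intentionally gone.
    return "This listing has been removed.", 410

The grace period is just a knob: shorten it if crawl budget is a concern, lengthen it if the sold pages still attract useful traffic or links.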
The other option would be to 301 the old pages back to the area page for the properties (perhaps with something like a Bootstrap message saying the property is sold but others in the area are available). This would pass juice etc. back to that page, but, of course, you'd be telling Google that the page had permanently moved, which isn't quite the case.
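For comparison, the 301 option could replace the handler in the sketch above with something like this (same hypothetical app and data; the /area/<area> URL structure is an assumption):

from flask import abort, redirect

# Replaces listing_detail() in the sketch above: instead of 200-then-410,
# permanently redirect sold listings to their area overview page.
@app.route("/listings/<slug>")
def listing_detail(slug):
    listing = LISTINGS.get(slug)
    if listing is None:
        abort(404)
    if listing["sold_at"] is not None:
        # A 301 passes most of the link equity to the area page, but it also
        # tells Google the listing page has permanently moved, which (as noted
        # above) isn't quite true.
        return redirect(f"/area/{listing['area']}?sold={slug}", code=301)
    return f"<h1>{slug}</h1><p>This listing is for sale.</p>"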
-
The answer from Kristen is correct. However, changing the 404 to a 410 will just make these pages appear as 410s in Search Console instead. The fact that they appear there is not a problem - Google is simply notifying you that these pages return a 4xx status. If this is intended (as in your case) you can just ignore these messages and mark them as fixed.
In your case you could also consider another option: remove the pages from the listings overview but keep them published (with status 200). Update each page to indicate that the original property is sold, and list some other (similar) properties as an alternative. This way, if there are external pages linking to the property page the link value doesn't get lost, and if people accidentally land on the page they still find content that could be interesting to them. As you remove the internal navigation links to these pages they become orphans, so there is little chance that they will rank very high in Google.
Dirk
Related Questions
-
Over-optimizing Internal Linking: Is this real and, if so, what's the happy medium?
I have heard a lot about having a solid internal linking structure so that Google can easily discover pages and understand your page hierarchies and correlations and equity can be passed. Often, it's mentioned that it's good to have optimized anchor text, but not too optimized. You hear a lot of warnings about how over-optimization can be perceived as spammy: https://neilpatel.com/blog/avoid-over-optimizing/ But you also see posts and news like this saying that the internal link over-optimization warnings are unfounded or outdated: https://www.seroundtable.com/google-no-internal-linking-overoptimization-penalty-27092.html So what's the tea? Is internal linking overoptimization a myth? If it's true, what's the tipping point? Does it have to be super invasive and keyword stuffy to negatively impact rankings? Or does simple light optimization of internal links on every page trigger this?
Intermediate & Advanced SEO | | SearchStan1 -
Categories showing on SERP listings?
Hi, I was wondering if anyone knows what these are called? See attached screenshot. Basically, it looks like Google is pulling the primary category and then subcategories from the site and adding them to the SERP listing. Are there any benefits to this besides possibly higher CTR? Cheers.
Intermediate & Advanced SEO | | wozniak651 -
Where does Movie Theater schema markup code live?
What I am trying to accomplish: I want what AMC has. When searching Google for a movie at AMC near me, Google loads the movie times right at the top of the first page. When you click the movie time it links to a pop-up window that gives you the option to purchase from MovieTickets.com, Fandango or AMC.com. Info about my theater: My theater hosts theater info and movie time info on its website. Once you click the time you want, it takes you to a third-party ticket fulfillment site via a subdomain that I have little control over. Currently Fandango tickets show up in Google like AMC's, but the option to buy on my theater site does not. Questions: Generally, how do I accomplish this? Does the schema code get implemented on the third-party ticket purchasing site or on my site? How can I ensure that the Google pop-up occurs so that users have a choice to purchase via Fandango or on my theater's website?
Intermediate & Advanced SEO | | ColeBField2 -
Is there a way to get a list of Total Indexed pages from Google Webmaster Tools?
I'm doing a detailed analysis of how Google sees and indexes our website and we have found that there are 240,256 pages in the index, which is way too many. It's an e-commerce site that needs some tidying up. I'm working with an SEO specialist to set up URL parameters and put information into the robots.txt file so the excess pages aren't indexed (we shouldn't have any more than around 3,000 - 4,000 pages), but we're struggling to find a way to get a list of these 240,256 pages, as it would be helpful information in deciding what to put in the robots.txt file and which URLs we should ask Google to remove. Is there a way to get a list of the URLs indexed? We can't find it in the Google Webmaster Tools.
Intermediate & Advanced SEO | | sparrowdog0 -
Chinese Sites Linking With Bizarre Keywords Creating 404's
Just ran a link profile, and have noticed for the first time many spammy Chinese sites linking to my site with spammy keywords such as "Buy Nike" or "Get Viagra". Making matters worse, they're linking to pages that are creating 404's. Can anybody explain what's going on, and what I can do?
Intermediate & Advanced SEO | | alrockn0 -
How is Google crawling and indexing this directory listing?
We have three Directory Listing pages that are being indexed by Google: http://www.ccisolutions.com/StoreFront/jsp/ http://www.ccisolutions.com/StoreFront/jsp/html/ http://www.ccisolutions.com/StoreFront/jsp/pdf/ How and why is Googlebot crawling and indexing these pages? Nothing else links to them (although /jsp/html/ and /jsp/pdf/ both link back to /jsp/). They aren't disallowed in our robots.txt file and I understand that this could be why. If we add them to our robots.txt file and disallow, will this prevent Googlebot from crawling and indexing those Directory Listing pages without prohibiting them from crawling and indexing the content that resides there which is used to populate pages on our site? Having these pages indexed in Google is causing a myriad of issues, not the least of which is duplicate content. For example, this file CCI-SALES-STAFF.HTML (which appears on the Directory Listing referenced above - http://www.ccisolutions.com/StoreFront/jsp/html/) clicks through to this Web page: http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML This page is indexed in Google and we don't want it to be. But so is the actual page where we intended the content contained in that file to display: http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff As you can see, this results in duplicate content problems. Is there a way to disallow Googlebot from crawling that Directory Listing page, and, provided that we have this URL in our sitemap: http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff, solve the duplicate content issue as a result? For example: Disallow: /StoreFront/jsp/ Disallow: /StoreFront/jsp/html/ Disallow: /StoreFront/jsp/pdf/ Can we do this without risking blocking Googlebot from content we do want crawled and indexed? Many thanks in advance for any and all help on this one!
Intermediate & Advanced SEO | | danatanseo0 -
Can we retrieve all 404 pages of my site?
Hi, can we retrieve all 404 pages of my site? Is there any syntax I can use in Google search to list just the pages that return a 404? Is there a tool/site that can scan all pages in the Google index and give me this report? Thanks
Intermediate & Advanced SEO | | mtthompsons0 -
What should happen to expired real estate listings?
For a real estate website, when a house is sold or taken off the market, what should happen to the listing? 301 redirect it to the grouping (such as zip code or city) that the listing resides in? 404 it?
Intermediate & Advanced SEO | | wattssw0