Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Google Search Console Showing 404 errors for product pages not in sitemap?
-
We have some products whose URLs changed over the past several months. Google Search Console is reporting 404 errors for the old URLs even though they are not in the sitemap (the sitemap lists the correct new URLs).
Is this expected? Will these errors eventually go away/stop being monitored by Google?
-
@woshea Implement 301 redirects from the old URLs to the new ones. This tells search engines that the old page has permanently moved to a new location. It also ensures that visitors who click on old links are redirected to the correct content.
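On an Apache server, such redirects are often added to the site's .htaccess file. A minimal sketch (the paths and domain below are hypothetical placeholders, not the poster's actual URLs):

```apache
# Permanently (301) redirect an old product URL to its new location.
Redirect 301 /products/old-widget https://www.example.com/products/new-widget

# Or, if a whole directory of products was renamed, redirect it with a pattern:
RedirectMatch 301 ^/shop/(.*)$ https://www.example.com/products/$1
```

Nginx, or a WordPress redirect plugin, can achieve the same result; the key point is that the response status is 301 (permanent), not 302.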
-
Yes, it is not uncommon for Google to report 404 errors for products whose URLs have changed, even if the correct new URLs are listed in the sitemap. Google discovers URLs from many sources besides the sitemap (old links, its own index), so its crawlers may take some time to recrawl and drop the old URLs.
Typically, these 404 errors will eventually disappear from the report once Google has fully indexed and recognized the new URLs. How long that takes varies with how often Googlebot crawls your site and how large the site is. I ran into the same issue on one of my sites and resolved it with the techniques below.
-
Ensure that your sitemap is up-to-date and includes all the correct URLs for your products.
-
Check for any internal links on your website that may still be pointing to the old URL and update them to the new URL.
-
Use 301 redirects from each old URL to its new URL. This tells Google and other search engines that the content has permanently moved to a new location.
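The three steps above can be sketched as a quick audit script: given the old-to-new URL mapping that mirrors your 301 redirects, check that the sitemap no longer lists any old URL and does list every new one. This is a minimal sketch; the sitemap fragment and URLs below are made-up examples, and in practice you would fetch and parse the live sitemap.

```python
import xml.etree.ElementTree as ET

# Hypothetical mapping of old product URLs to their new locations,
# mirroring the 301 redirects configured on the server.
REDIRECTS = {
    "https://example.com/products/old-widget": "https://example.com/products/new-widget",
}

# A made-up sitemap fragment; in practice, fetch the live sitemap over HTTP.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/products/new-widget</loc></url>
</urlset>"""

def sitemap_urls(xml_text):
    """Extract all <loc> values from a sitemap document."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return {loc.text for loc in root.findall(".//sm:loc", ns)}

def audit(xml_text, redirects):
    """Return (stale old URLs still in the sitemap, new URLs missing from it)."""
    urls = sitemap_urls(xml_text)
    stale = urls & set(redirects)             # old URLs that should be gone
    missing = set(redirects.values()) - urls  # new URLs the sitemap should list
    return stale, missing

stale, missing = audit(SITEMAP_XML, REDIRECTS)
print("stale:", stale, "missing:", missing)
```

Both sets being empty means the sitemap is consistent with the redirect map; anything in `stale` should be removed from the sitemap, and anything in `missing` added.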
-
Related Questions
-
Best practices for retiring 100s of blog posts?
White Hat / Black Hat SEO | David_Fisher
Hi. I wanted to get best practices for retiring an enterprise blog with hundreds of old posts whose subject matter won't be repurposed. What would be the best course of action to retire those pages while maintaining the value of any SEO authority they carry? Is it enough to move the old posts into an archive subdirectory and let Google deprioritize them over time? Or would a mass redirect of the old posts to the new blog's home page be acceptable (even though the old content isn't being specifically replaced)? Or would Google say that, without 1:1 replacement URLs, the redirects would be seen as soft 404s and treated like 404s?
Search Console Missing field 'mainEntity'
SEO Tactics | spaininternship
Hello, I have a problem: I added an FAQ with schema markup to my site (https://internships-usa.eu/faq/), but the following error is appearing in Search Console:
Missing field 'mainEntity' ["WebPage","FAQPage"],"@id":"https://internships-usa.eu/faq/#webpage","url":"https://internships-usa.eu/faq/","name":"Help Center - Internships USA","isPartOf":{"@id":"https://internships-usa.eu/#website"},"datePublished":"2022-05-31T14:43:15+00:00","dateModified":"2022-06-01T08:07:13+00:00","breadcrumb":{"@id":"https://internships-usa.eu/faq/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https://internships-usa.eu/faq/"]}]}
What do I have to do to solve this?
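For reference, that warning generally means the page is typed as FAQPage but lacks the required `mainEntity` array of Question items. A minimal well-formed sketch (the question text below is a made-up example, not taken from the site in question):

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is an internship?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "A short-term work placement for students or recent graduates."
      }
    }
  ]
}
```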
Google Search Console says 'sitemap is blocked by robots.txt'?
Technical SEO | Extima-Christian
Google Search Console is telling me "Sitemap contains URLs which are blocked by robots.txt." I don't understand why my sitemap is being blocked. My robots.txt looks like this:
User-Agent: *
Disallow:
Sitemap: http://www.website.com/sitemap_index.xml
It's a WordPress site with Yoast SEO installed. Is anyone else having this issue with Google Search Console? Does anyone know how I can fix it?
How to remove Parameters from Google Search Console?
Technical SEO | adamjack
Hi all, the following is the parameter configuration in Search Console:
Parameter: fl
Does this parameter change page content seen by the user? Yes: changes, reorders, or narrows page content.
How does this parameter affect page content? Narrows.
Which URLs with this parameter should Googlebot crawl? Let Googlebot decide (default).
Query: it is actually a filter parameter. I have already set a canonical on the filter pages. Now I am tracking filter pages via the data layer and Tag Manager, so in Google Analytics I am not able to see the filter URLs because of this parameter. I therefore want to delete this parameter. Can anyone please help me? Thanks!
How to check if an individual page is indexed by Google?
Technical SEO | linklander
So my understanding is that you can use site:[page URL without http] to check whether a page is indexed by Google. Is this 100% reliable, though? Recently I've worked on a few pages that have not shown up when I've checked them using site:, but they do show up when using info: and also show their cached versions. The rest of the site, including pages above the one I was checking (the URL was quite deep), is indexed just fine. What does this mean? Thank you. P.S. I do not have WMT or GA access for these sites.
Missing meta description on 404 page
Technical SEO | JohnHuynh
Hi, my 404 page does not have a meta description. Is that an error? I ask because I ran a report and SEOmoz flagged it as a problem. Thanks!
Tags showing up in Google
Technical SEO | ttb
Yesterday a user pointed out to me that tag pages were being indexed in Google search results, and that this was not a good idea. I went into my Yoast settings and checked "nofollow, index" in my taxonomies, but when checking the source code for nofollow, I found nothing. So instead I went into robots.txt and disallowed /tag/. Is that OK, or is it a bad idea? The site is The Tech Block for anyone interested in looking.
Google's "cache:" operator is returning a 404 error.
I'm doing the "cache:" operator on one of my sites and Google is returning a 404 error. I've swapped out the domain with another and it works fine. Has anyone seen this before? I'm wondering if G is crawling the site now? Thx!
Technical SEO | | AZWebWorks0