No meta description pulling through in the SERP with a React website - requesting indexing and submitting to Google with no luck
-
Hi there,
A year ago I launched a website using React, which has caused Google to not read my meta descriptions. I've submitted the sitemap and there was no change in the SERP. Then I tried "Fetch and Render" and requested indexing for the homepage, which did work; however, I have over 300 pages and I can't do that for every one. I have requested a fetch, render, and index for "this URL and linked pages," and while Google's cache has updated, the SERP listing has not. I looked in the Index Coverage report in the new GSC and it says the URLs are valid and indexable, and yet there's still no meta description.
I realize that Google doesn't have to index all pages, and that Google may also choose not to use your meta description, but I want to make sure I do my due diligence in making the website crawlable. My main questions are:
-
If Google didn't reindex ANYTHING when I submitted the sitemap, what might be wrong with my sitemap?
-
Is submitting each URL manually bad, and if so, why?
-
Am I simply jumping the gun, since it's only been a week since I requested indexing for the main URL and all the linked URLs?
-
Any other suggestions?
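For reference on the first question, one common culprit on React sites is a hand-maintained sitemap that drifts out of sync with the live URLs (wrong protocol, trailing-slash mismatches, URLs that redirect). Below is a rough sketch of generating the file at build time from the route list instead; the domain and routes are placeholders, not the actual site:

```javascript
// sitemap-build.js - illustrative sketch; BASE and routes are placeholders.
// Generating sitemap.xml from the app's route list keeps it in sync with
// the pages that actually exist and with their canonical URLs.
const fs = require('fs');

const BASE = 'https://www.example.com';                 // placeholder domain
const routes = ['/', '/about', '/products/laptop-1'];   // placeholder routes

const urls = routes
  .map(route => `  <url>\n    <loc>${BASE}${route}</loc>\n  </url>`)
  .join('\n');

const xml = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${urls}
</urlset>`;

fs.writeFileSync('sitemap.xml', xml);
console.log(`Wrote ${routes.length} URLs to sitemap.xml`);
```

However the file is produced, every URL listed should exactly match the canonical version of the page, and a sitemap is only a hint: submitting it does not force Google to recrawl or re-snippet anything on a schedule.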
-
Hi David,
The Fetch and Render looked blank, but I know Google can still read the code, since it picked up on the schema we added less than a week after we added it. I sent the JavaScript guides over to our developers, but I would still really appreciate you looking at the URL if possible. I can't find a way to DM you on here, so I've sent you a LinkedIn request. Feel free to ignore it if there's a better way to communicate.
- JW
-
That is an interesting question.
-
Hi,
I would mostly look into the site itself; from what you've mentioned here, I don't think the problem is in your sitemap but rather on the React side. Are you using server-side or client-side rendering for the pages in React? That can have a big impact on how Google is able to see the different pages and pick up on content (including meta tags).
Martijn.
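To make that distinction concrete, here is a minimal sketch of the server-side approach, assuming a Node/Express server in front of the React app; the component, routes, and description text below are illustrative, not taken from the site in question. The point is that the meta description is already present in the HTML the server returns, so Googlebot does not need to execute any JavaScript to see it:

```javascript
// server.js - illustrative SSR sketch (assumes express, react and react-dom
// are installed; App and the descriptions map are placeholders).
const express = require('express');
const React = require('react');
const { renderToString } = require('react-dom/server');

// Placeholder component standing in for the real React app.
const App = ({ path }) => React.createElement('h1', null, `Page: ${path}`);

// Placeholder per-page meta descriptions, however the app actually stores them.
const descriptions = {
  '/': 'Home page description that should appear in the SERP snippet.',
  '/products': 'Product listing description.',
};

const app = express();

app.get('*', (req, res) => {
  const body = renderToString(React.createElement(App, { path: req.path }));
  const description = descriptions[req.path] || 'Default description.';

  // The description is baked into the initial HTML response, so crawlers
  // see it without running any client-side JavaScript.
  res.send(`<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>Example page</title>
    <meta name="description" content="${description}">
  </head>
  <body>
    <div id="root">${body}</div>
  </body>
</html>`);
});

app.listen(3000);
```

With purely client-side rendering the server sends an almost empty shell and the tags only exist after the bundle runs, which is exactly the situation where Google's rendering delay (or failure) leaves the snippet blank. Head-management libraries such as React Helmet tidy up the tag handling, but they still rely on server-side rendering or prerendering to get the tags into that initial response.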
-
Hi DigitalMarketingSEO,
This sounds like it's Google having some issues with your React website.
There are plenty of good SEO-for-JavaScript guides out there that I would recommend reading through:
https://www.elephate.com/blog/ultimate-guide-javascript-seo/
https://builtvisible.com/javascript-framework-seo/
https://www.briggsby.com/dealing-with-javascript-for-seo
How did the "Fetch and Render" look? Was Googlebot able to see your page exactly as a human user would?
Can you share the URL here (or PM me)? I've done a lot of work on JS sites and I'd be happy to take a quick look to see if I can give some more specific advice.
Cheers,
David
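One quick self-check along the same lines: fetch the raw HTML the server returns, with no JavaScript executed, and see whether the description tag is there at all. A small Node sketch; the URL is a placeholder:

```javascript
// check-meta.js - rough sketch: request the initial server response and report
// whether a meta description is present in it. The URL is a placeholder.
const https = require('https');

const url = 'https://www.example.com/';

https.get(url, (res) => {
  let html = '';
  res.on('data', (chunk) => { html += chunk; });
  res.on('end', () => {
    const match = html.match(/<meta[^>]+name=["']description["'][^>]*>/i);
    console.log(match
      ? `Found in initial HTML: ${match[0]}`
      : 'No meta description in the initial HTML - it is only added client-side.');
  });
});
```

If the tag only shows up in the browser's element inspector but not in "view source", it is being added client-side, which would explain why the snippet is missing from the SERP even though the pages are reported as indexed.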
Related Questions
-
Does changing the content and design of the website affect all the backlinks I have made till now?
I have been working on my link profile for a month now. After learning about the 5-step Moz methodology, I have decided that I would like to change all of the content of my site and tailor it to what my customers need. Am I going to lose all the domain authority if I make these changes? And if it does have an effect, how will that play out?
Web Design | calvinkj
-
Any risks involved in removing a sub-domain from the search index or taking it down completely? Ranking impact?
Hi all, One of our sub-domains has thousands of indexed pages, but the traffic is very low and irrelevant. There are links between this sub-domain and our other sub-domains. We are planning to take this sub-domain down completely. What happens if we do? Will Google respond with a ranking change? Thanks
Web Design | vtmoz
-
Website Redesign and Migration to Squarespace killed my Ranking
My old website was dated, ugly, impossible to update and a mess between hard-coded pages and WP, but we were ranking #1 in the organic searches for our key words. I just redesigned my website using Squarespace. I kept most of the same text on the pages (for key words) and kept the same Meta-Tags and Title Tags for each page as much as possible. Once I was satisfied that I had done as much on-page optimization as I could, I changed the IP in our Domain Name Registry so that it would point to our new website on the Squarespace host. And our new website was live! ...Then I watched in dismay as our ranking fell into oblivion. I think this might have something to do with not doing any 301 redirects from the old website and losing all of my link juice. Is this the case? And, if so, how do I fix it? Our website url is www.kanataskinclinic.ca Thanks
Web Design | StillLearning
-
E-Commerce Website Architecture - Cannibalization between Product Categories and Blog Categories?
Hi, I have an e-commerce site that sells laptops. My main landing pages and category pages are as follows: "Toshiba Laptops", "Samsung Laptops", etc. We also run a WP blog with industry news. The posts are divided into categories which are basically the same as our landing pages. The posts themselves usually link to the appropriate e-commerce landing page. For example: a post about a new Samsung laptop which is categorized in the blog under "Samsung Laptops" will naturally link somewhere inside to the "Samsung Laptops" e-commerce landing page. Is that good, or do the categories on the blog cannibalize my more important e-commerce section landing pages? Thanks
Web Design | BeytzNet
-
Yes or No for Ampersand "&" in SEO URLs
Hi Mozzers, I would like to know how crawlers see the ampersand (& or &amp;) in your URLs, and whether Google frowns upon this or not. As far as I know they simply recognise it as "and" - is this correct, and is there any best practice for implementing it? I know a lot of people have complained before about & in links and said that it is better to write it as &amp;, but this is not in links, this is in URLs.
The reason for this is that we are looking to move onto an ASP.NET MVC framework (any suggestions for a different framework are welcome, we are still just planning out future development), and in order to make use of the filter options we have on our site we need a parameter to indicate the difference at a routing level (routing sends to controller, controller sends to model, model sends to controller, and controller sends to view - this is the pattern of a request that comes in on the framework we will be using). I already have -'s and /'s in the URLs (which is for my SEO structuring), so these characters can't be used for identifying the filters the user clicks or uses to define their search, as that would create a complete mess in the system.
Now we are looking at & to say: OK, when a user lands on /accommodation and selects De Kelders (which is a destination in our area), the page will be /accommodation/de-kelders. On this page they can define their search further to say they are looking for 5-star accommodation close to the beach. This is where the routing needs some guidance, and we are looking to have it as follows: /accommodation/de-kelders/5-star&close-to-the-beach. So, does the "&" get identified by search engines at a URL level as "and", and does this cause any issues with crawling or indexation, or would it be best to look at another solution? Thanks, Chris Captivate
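As a quick illustration of the distinction this question is drawing (a JavaScript sketch; the paths are the placeholders from the question): in a URL a raw & normally separates query-string parameters, so if it is meant to be part of a value or of a path segment it should be percent-encoded as %26, while &amp; is only the HTML-escaped form used when a URL is written inside HTML source.

```javascript
// url-ampersand.js - sketch of how "&" behaves in URLs (all paths are placeholders).

// As a query-string separator, "&" splits parameters:
const withQuery = new URL('https://example.com/accommodation?stars=5&near=beach');
console.log([...withQuery.searchParams]); // [ [ 'stars', '5' ], [ 'near', 'beach' ] ]

// Inside a path segment, a literal "&" should be percent-encoded:
const segment = encodeURIComponent('5-star&close-to-the-beach');
console.log(segment); // "5-star%26close-to-the-beach"
console.log(`https://example.com/accommodation/de-kelders/${segment}`);

// "&amp;" belongs only in HTML source, e.g. when the URL sits in an href attribute:
// <a href="https://example.com/accommodation?stars=5&amp;near=beach">5-star near the beach</a>
```

Crawlers cope with & in query strings every day; the more common SEO concern is that parameter-style filter URLs multiply into many crawlable variations, which is why a single hyphenated path segment (with any literal & encoded or replaced) is usually the tidier choice.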
Web Design | DROIDSTERS
-
Google also indexed trailing slash version - PLEASE HELP
Hi Guys, We redesigned the website and somehow our canonical extension decided to add a trailing slash to all URLs. Previously our canonical URLs didn't have a trailing slash. We didn't change the URLs during the redesign; they remained the same, but we now have two versions indexed: one with a trailing slash and one without. I've now fixed the issue and removed the trailing slash from the canonical URLs. Is this the correct way of fixing it? Will our rankings be affected in a negative way? Is there anything else I need to do? The website went live last Tuesday. Thanks
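For completeness, a common belt-and-braces fix alongside corrected canonical tags is a 301 redirect so that only one version of each URL is reachable. Below is a rough sketch as Express-style middleware; many stacks would do this in the web-server or CMS configuration instead, and the routes here are illustrative:

```javascript
// redirect-trailing-slash.js - illustrative middleware: 301 any trailing-slash
// URL (except the root) to the non-slash version, matching the canonical tags.
const express = require('express');
const app = express();

app.use((req, res, next) => {
  if (req.path.length > 1 && req.path.endsWith('/')) {
    const query = req.url.slice(req.path.length);   // preserve any query string
    return res.redirect(301, req.path.slice(0, -1) + query);
  }
  next();
});

app.get('/example-page', (req, res) => res.send('Only this version stays indexable.'));

app.listen(3000);
```

With consistent canonicals plus the redirect, the duplicate trailing-slash versions normally drop out of the index on their own over the following weeks rather than causing a lasting ranking problem.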
Web Design | Jvalops
-
Custom 404 Page Indexing
Hi - We created a custom 404 page based on SEOmoz recommendations. But... the page seems to be receiving traffic via organic search. Does it make more sense to set this page as "noindex" via its meta tag?
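Worth noting that a page served with a genuine 404 status code normally stays out of the index by itself; custom 404 pages tend to get indexed when they are accidentally returned with a 200 (a "soft 404"). Below is a sketch of both safeguards on an Express-style server; the routes and markup are illustrative, and other stacks have equivalent settings:

```javascript
// not-found.js - sketch: serve the custom 404 page with a real 404 status and a
// noindex hint, so it cannot be indexed even if something else goes wrong.
const express = require('express');
const app = express();

app.get('/', (req, res) => res.send('Home'));

// Catch-all for unknown URLs - must be registered after the real routes.
app.use((req, res) => {
  res.status(404)                          // a real 404, not a "soft 404" 200
     .set('X-Robots-Tag', 'noindex')       // belt-and-braces noindex hint
     .send(`<!DOCTYPE html>
<html>
  <head>
    <meta name="robots" content="noindex">
    <title>Page not found</title>
  </head>
  <body><h1>Sorry, we could not find that page.</h1></body>
</html>`);
});

app.listen(3000);
```

Either signal on its own is usually enough; together they make it very unlikely the page keeps appearing in organic search.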
Web Design | sftravel
-
Indexing Dynamic Pages
Hi, I am having issues, among others, regarding indexing dynamic pages. Our website, www.me-by-melia.com, was just put live and I am concerned that the bottom navigation pages (http://www.me-by-melia.com/#store, http://www.me-by-melia.com/#facebook, etc.) will not be indexed and will create duplicate pages. Also, when you open these pages in a new tab, it takes you to the homepage. The website was created in HTML5. Please advise.
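On the technical point in that last question: everything after the # in a URL is a client-side fragment. It is never sent to the server and Google generally ignores it for indexing, which is why /#store and /#facebook behave like the homepage when opened directly. The usual fix is to give each section a real path using the History API (or a router built on it). A minimal browser-side sketch; the data-nav selector, the paths, and renderSection are placeholders:

```javascript
// navigation.js - sketch: use real paths via the History API instead of "#" fragments,
// so each section has a URL the server can answer and Google can index.
document.querySelectorAll('a[data-nav]').forEach((link) => {
  link.addEventListener('click', (event) => {
    event.preventDefault();
    const path = link.getAttribute('href');   // e.g. "/store" rather than "/#store"
    window.history.pushState({}, '', path);   // updates the address bar without a reload
    renderSection(path);                      // placeholder: swap in the right content
  });
});

// Handle the browser's back/forward buttons.
window.addEventListener('popstate', () => renderSection(window.location.pathname));

// Placeholder renderer - a real app would mount the appropriate component here.
function renderSection(path) {
  console.log(`Render content for ${path}`);
}
```

The server also has to return the right content (or at least the page shell with the correct title and meta tags) for those paths directly, otherwise opening them in a new tab will still fall back to the homepage.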
Web Design | Melia