Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), new posts and new replies have been locked.
Inconsistency between content and structured data markup
-
Hi everyone,
What does Google think about inconsistency between a page's visible content and its structured data markup?
Is this considered a form of cheating? Will it hurt my SEO?
-
This is what Google says about it, explicitly: https://developers.google.com/search/docs/guides/sd-policies. See the "Quality Guidelines > Content" section in particular.
In terms of actual penalties, ranking impact, or pages being flagged as spam, I can't speak from experience, as I've never knowingly used markup inconsistent with the information visible on the page.
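To make the guideline concrete, here is a minimal, hypothetical sketch (the product name and price are invented for illustration): the values in the JSON-LD should mirror what visitors actually see rendered on the page.

```html
<!-- Visible page content -->
<h1>Acme Trail Backpack</h1>
<p class="price">$79.99</p>

<!-- JSON-LD that stays consistent with the visible content above -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Acme Trail Backpack",
  "offers": {
    "@type": "Offer",
    "price": "79.99",
    "priceCurrency": "USD"
  }
}
</script>
```

Markup that claims a different name, price, rating, or availability than the visible content is exactly the kind of mismatch the policy treats as misleading.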
Related Questions
-
Product Schema Markup for All Products
Hi Team, Google Search Console used to allow you to use their structured data markup helper (https://www.google.com/webmasters/markup-helper/u/0/) to mark up multiple similar product pages at once. I do not see this feature anymore in the new Search Console. Does anyone have a recommendation for marking up multiple product pages without having to have schema markup firing in GTM for each product page?
Technical SEO | agrier0
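One way to avoid firing a separate GTM tag per product, sketched below with purely illustrative template placeholders (the `{{ ... }}` syntax and field names depend entirely on your platform), is to render the Product JSON-LD server-side from the same data that populates each product page:

```html
<!-- Rendered into each product page's template; placeholders are hypothetical -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "{{ product.name }}",
  "image": "{{ product.image_url }}",
  "offers": {
    "@type": "Offer",
    "price": "{{ product.price }}",
    "priceCurrency": "USD"
  }
}
</script>
```

Because the markup is generated from the product record itself, every page stays marked up and consistent with its visible content without per-page tags.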
SEO + Structured Data for Metered Paywall
I have a site that will have 90% of the content behind a metered paywall, so all content is accessible in a metered way. All users who aren't logged in will have access to 3 articles (of any kind) in a 30 day period. If they try to access more in a 30 day period, they will hit a paywall. I was reading this article on how to handle structured data with Google for content behind a paywall: https://www.searchenginejournal.com/paywalls-seo-strategy/311359/ However, the content is not ALWAYS behind a paywall, since it is metered. So if a new user comes to the site, they can see the article (regardless of what it is). Is there a different way to handle content that will SOMETIMES be behind a paywall because of a metered strategy? Theoretically I want 100% of the content indexed and accessible in SERPs; it will just be accessible depending on the user's history (cookies) with the site. I hope that makes sense.
Technical SEO | triveraseo0
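For reference, Google's documented approach to paywalled content uses `isAccessibleForFree` together with `hasPart` to identify the gated section; a minimal sketch is below (the `.paywalled-section` class name is hypothetical and would be whatever selector wraps your gated copy). The markup describes the paywall itself, so it can stay in place even for visitors who are still within their metered allowance.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article headline",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".paywalled-section"
  }
}
</script>
```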
Recurring events and duplicate content
Does anyone have tips on how to set up an event system so as to avoid duplicate content from recurring events? How do I best utilize on-page optimization?
Technical SEO | megan.helmer0
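As one hedged illustration (the names, dates, and venue below are invented), giving each occurrence its own Event markup with a distinct `startDate`, rather than repeating an identical description, helps differentiate otherwise similar occurrence pages:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Beginners Yoga - June 4 Session",
  "startDate": "2024-06-04T18:00",
  "endDate": "2024-06-04T19:00",
  "location": {
    "@type": "Place",
    "name": "Community Hall",
    "address": "123 Example Street, Springfield"
  }
}
</script>
```

Pairing that with a canonical link from thin occurrence pages back to the main series page is another common way to keep near-identical pages from competing with each other.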
Duplicate content and 404 errors
I apologize in advance, but I am an SEO novice and my understanding of code is very limited. Moz has issued a lot (several hundred) of duplicate content and 404 error flags on the ecommerce site my company takes care of. For the duplicate content, some of the pages it says are duplicates don't even seem similar to me. Additionally, a lot of them are static pages where we embed images of size charts that we use as popups on item pages. It says these issues are high priority, but how bad is this? Is this just an issue because, if pages have similar content, the search engine spider won't know which one to index? Also, what is the best way to handle these URLs returning 404 errors? I should probably have a developer look at these issues, but I wanted to ask the extremely knowledgeable Moz community before I do 🙂
Technical SEO | AliMac260
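If the size-chart popup pages genuinely don't need to rank on their own, one hedged option (assuming they are standalone URLs whose templates you can edit) is to keep them out of the index entirely, which also clears them from duplicate-content reports:

```html
<!-- In the <head> of each standalone size-chart page -->
<meta name="robots" content="noindex, follow">
```

For the 404s, the usual choice is a 301 redirect to the closest relevant live page when one exists, or leaving a genuine 404/410 response when it doesn't.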
Handling of Duplicate Content
I just recently signed up and joined the Moz system. During the initial report for our web site, it shows we have lots of duplicate content. The web site is real estate based and we are loading IDX listings from other brokerages into our site. Even though these listings look alike, they are not. Each has its own photos, description and address. So why do they appear as duplicates? I would assume they are all too closely related. It is primarily lots for sale, and it looks like lazy agents with 4 or 5 lots input the same description for each. Unfortunately for us, part of the IDX agreement is that you cannot pick and choose which listings to load and you cannot change the content. You are either all in or you cannot use the system. How should one manage duplicate content like this? Or should we ignore it? Out of 1500+ listings on our web site, it shows 40 of them as duplicates.
Technical SEO | TIM_DOTCOM0
URL folder structure
I work for a travel site. We have pages for properties in destinations, and I am trying to decide how best to organize the URLs. Basically we have our main domain and resort pages, and we'll also have articles about each resort, so the URL structure will actually get longer:
A. domain.com/main-keyword/state/city-region/resort-name
_domain.com/family-condo-for-rent/orlando-florida/liki-tiki-village_
_domain.com/main-keyword-in-state-city/resort-name-feature_
_domain.com/family-condo-for-rent/orlando-florida/liki-tiki-village/kid-friend-pool_
B. Another way to structure this would be to remove the location and keyword folders and combine them. Note that some of the resort names are long and spaces are being replaced dynamically with dashes.
ex. domain.com/main-keyword-in-state-city/resort-name
_domain.com/family-condo-for-rent-in-orlando-florida/liki-tiki-village_
_domain.com/main-keyword-in-state-city/resort-name-feature_
_domain.com/family-condo-for-rent-in-orlando-florida/liki-tiki-village-kid-friend-pool_
Question: is that too many folders, or should I combine or break them up? What would you do with this? Trying to avoid too many dashes.
Technical SEO | Vacatia_SEO0
Duplicate content and http and https
Within my Moz crawl report, I have a ton of duplicate content caused by identical pages existing at both http and https URLs. For example:
http://www.bigcompany.com/accomodations
https://www.bigcompany.com/accomodations
The strange thing is that 99% of these URLs are not sensitive in nature and do not require any security features: no credit card information, booking, or carts. The web developer cannot explain where these extra URLs came from or provide any further information. Advice or suggestions are welcome! How do I solve this issue? THANKS MOZZERS
Technical SEO | hawkvt10
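A common first step, shown as a hedged sketch below using one of the URLs from the question, is a canonical link on every page pointing at the https version; the developer would normally also add a site-wide server-level 301 redirect from http to https, which is outside the scope of this snippet:

```html
<!-- In the <head> of both the http and https versions of the page -->
<link rel="canonical" href="https://www.bigcompany.com/accomodations">
```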
Duplicate Content issue
I have been asked to review an old website to identify opportunities for increasing search engine traffic. Whilst reviewing the site I came across a strange loop. On each page there is a link to a printer friendly version: http://www.websitename.co.uk/index.php?pageid=7&printfriendly=yes That page also has a link to a printer friendly version: http://www.websitename.co.uk/index.php?pageid=7&printfriendly=yes&printfriendly=yes and so on and so on... Some of these pages are being included in Google's index. I appreciate that this can't be a good thing; however, I am not 100% sure about the extent to which it is a bad thing, or the priority that should be given to getting it sorted. Just wondering what views people have on the issues this may cause?
Technical SEO | CPLDistribution0
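One common way to keep these printer-friendly variants out of the index, sketched below on the assumption that the printer-friendly template can be edited, is to canonicalize them back to the main page (or, failing that, mark them noindex); fixing the link generation so it stops appending `&printfriendly=yes` repeatedly is still worth doing in its own right.

```html
<!-- In the <head> of the printer-friendly template -->
<link rel="canonical" href="http://www.websitename.co.uk/index.php?pageid=7">
<!-- or, if canonicalization isn't practical: -->
<meta name="robots" content="noindex, follow">
```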