Event Schema markup for multiple events (same location/address)?
-
I was wondering if it's possible to mark up multiple events on the same page for one location/address using the Event schema.org markup? I tried doing it on a sample page below:
http://www.rama.id.au/event-schema-test/
Google's schema testing tool shows that it's all good (except for a warning about offers). I just wanted to know if I am doing it correctly, or if there is a better solution. Any help would be much appreciated.
Thank you

-
Webmaster Tools / Search Console: https://www.google.com/webmasters/tools/home?hl=en
Structured Data Testing Tool: https://search.google.com/structured-data/testing-tool/u/0/
You can use this as a template
Or use this great tool https://jsonld.com/json-ld-generator/
(Remember that with JSON-LD, the content you mark up must also be present in the page's visible HTML.)
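As a rough illustration (not your actual markup - the event names, dates, and address below are made-up placeholders), two events at the same venue could be expressed in one JSON-LD block along these lines:

```html
<!-- placeholder event/venue details for illustration only -->
<script type="application/ld+json">
[
  {
    "@context": "http://schema.org",
    "@type": "Event",
    "name": "Beginners Yoga Class",
    "startDate": "2015-08-01T19:00",
    "url": "http://www.rama.id.au/event-schema-test/#beginners",
    "location": {
      "@type": "Place",
      "name": "Example Studio",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Canberra",
        "addressRegion": "ACT",
        "postalCode": "2600"
      }
    }
  },
  {
    "@context": "http://schema.org",
    "@type": "Event",
    "name": "Advanced Yoga Class",
    "startDate": "2015-08-02T19:00",
    "url": "http://www.rama.id.au/event-schema-test/#advanced",
    "location": {
      "@type": "Place",
      "name": "Example Studio",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Canberra",
        "addressRegion": "ACT",
        "postalCode": "2600"
      }
    }
  }
]
</script>
```

Repeating the location block for each event is harmless here, since the JSON-LD isn't visible content - just make sure the matching venue details also appear somewhere in the on-page HTML.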
-
Hi Vincent
Again, it may be a matter of creating individual pages for each event with proper Schema markup, or it is possible to use tags in Schema as well. In that case, each event would be wrapped in its own Event tag, with the address information included via meta tags (see the rough sketch below). Both approaches are rather tedious. You can read more here:
https://schema.org/docs/gs.html#schemaorg_testing
Ideally though, you'd have an individual page for each event.
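If you do keep them on one page, here's a rough microdata sketch of what I mean by wrapping each event and putting the address details in meta tags (all the event and venue values below are made-up placeholders, so swap in your own):

```html
<!-- illustrative placeholder values only -->
<div itemscope itemtype="http://schema.org/Event">
  <a itemprop="url" href="/event-schema-test/#beginners">
    <span itemprop="name">Beginners Yoga Class</span>
  </a>
  <meta itemprop="startDate" content="2015-08-01T19:00">
  <span itemprop="location" itemscope itemtype="http://schema.org/Place">
    <meta itemprop="name" content="Example Studio">
    <span itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
      <meta itemprop="streetAddress" content="1 Example Street">
      <meta itemprop="addressLocality" content="Canberra">
      <meta itemprop="addressRegion" content="ACT">
    </span>
  </span>
</div>
<!-- repeat a block like the above for each additional event -->
```

Each event carries its own copy of the location, which is what makes each Event object valid on its own.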
You could follow Ticketmaster's path and use data-vocabulary.org markup; however, Schema.org is the standard. If you're wondering what I mean, run the following URL through the Google Structured Data Testing Tool I linked to in my previous comment:
http://www.ticketmaster.com/Chicago-Bulls-tickets/artist/805914
Sorry for not posting links - I am on my phone and I cannot. Will update in the AM.
Hope these help! Good luck!
-
Hello Oleg and Patrick
Thank you so much, gentlemen, for helping me out. Unfortunately, I cannot wrap each event in its own itemscope itemtype="http://schema.org/Event", as each event would then require me to specify the same address multiple times, because the "location" attribute is a required field. Since the address occurs only once on the page, I am bound to use it only once by tying that single address to multiple events. On a side note, how come the Google schema testing tool passes my implementation on the sample URL?
Hope to hear from you soon.
Thanks once again.
-
I agree with Oleg here - each event should have its own page.
That being said, it is possible to mark up individual events on the same page, because each event has its own unique attributes. Each event will be wrapped in its own itemscope itemtype="http://schema.org/Event" - so be mindful of that.
You can read more here.
Keep in mind Google and Yandex have structured data markup testing tools.
Hope this helps! Good luck!
-
You should create separate Event objects for each event you have (even if the location is the same).
From https://schema.org/Event --> "Repeated events may be structured as separate Event objects."
http://www.discoverafrica.com/js/json2.js?v=1.1 Wouldn't it just be easier to prevent Googlebot from crawling the js folder altogether? Isn't that what robots.txt was made for? Just to be clear - we are NOT doing any sneaky redirects or other dodgy javascript hacks. We're just trying to power our content and UX elegantly with javascript. What do you guys say: Obey Matt? Or run the javascript gauntlet?0