Event Schema markup for multiple events (same location/address)?
-
I was wondering if it's possible to mark up multiple events on the same page for one location/address using the schema.org Event markup? I tried doing it on the sample page below:
http://www.rama.id.au/event-schema-test/
Google's schema testing tool shows that it's all good (except for a warning about offers). I just wanted to know if I am doing it correctly or whether there is a better solution. Any help would be much appreciated.
Thank you

-
Webmaster Tools / Search Console: https://www.google.com/webmasters/tools/home?hl=en
Structured Data Testing Tool: https://search.google.com/structured-data/testing-tool/u/0/
You can use this as a template
Or use this great tool https://jsonld.com/json-ld-generator/
(Remember that with JSON-LD, the content you mark up must also be present in the page's HTML.)
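As a rough, hypothetical starting point (placeholder name, date, address and offer; always validate with the testing tool above), a single event in JSON-LD looks something like this:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Sample Workshop",
  "startDate": "2015-08-01T18:00",
  "location": {
    "@type": "Place",
    "name": "Sample Venue",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "1 Example Street",
      "addressLocality": "Sydney",
      "addressRegion": "NSW",
      "postalCode": "2000",
      "addressCountry": "AU"
    }
  },
  "offers": {
    "@type": "Offer",
    "price": "0",
    "priceCurrency": "AUD",
    "url": "http://www.example.com/tickets"
  }
}
</script>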
-
Hi Vincent
Again, it may be a matter of creating individual pages for each event with proper Schema, or it is possible to use meta tags in Schema as well. So, each event would be wrapped in its own Event tag, with the address information included in a meta tag. Both are rather tedious. You can read more here:
https://schema.org/docs/gs.html#schemaorg_testing
Ideally though, you'd have an individual page for each event.
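If you do keep several events on one page, a rough sketch of the meta-tag approach described above could look like the following (placeholder values, untested):
<div itemscope itemtype="http://schema.org/Event">
  <span itemprop="name">Morning Yoga Class</span>
  <meta itemprop="startDate" content="2015-08-01T09:00">
  <span itemprop="location" itemscope itemtype="http://schema.org/Place">
    <meta itemprop="name" content="Sample Venue">
    <span itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
      <meta itemprop="streetAddress" content="1 Example Street">
      <meta itemprop="addressLocality" content="Sydney">
      <meta itemprop="addressRegion" content="NSW">
    </span>
  </span>
</div>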
You could follow Ticketmaster's path and use data-vocabulary.com markup; however, Schema is the standard. If you're wondering what I mean, run the following URL through the Google Structured Data Markup Tester I linked to in my previous comment:
http://www.ticketmaster.com/Chicago-Bulls-tickets/artist/805914
Sorry for not posting links - I am on my phone and I cannot. Will update in the AM.
Hope these help! Good luck!
-
Hello Oleg and Patrick
Thank you so much, gentlemen, for helping me out. Unfortunately, I cannot wrap each event in its own itemscope itemtype="http://schema.org/Event", as each event would then require me to specify the same address multiple times, the "location" attribute being a required field. Since the address occurs only once on the page, I am bound to use it only once, tying that single address to multiple events. On a side note, how come the Google schema testing tool is able to pass my implementation on the sample URL?
Hope to hear from you soon.
Thanks once again.
-
I agree with Oleg here - each event should have its own page.
That being said, it is possible to mark up individual events on the same page, because each event has its own unique attributes. Each event will be wrapped in its own itemscope itemtype="http://schema.org/Event" - so be mindful of that.
You can read more here.
Keep in mind Google and Yandex have structured data markup testing tools.
Hope this helps! Good luck!
-
You should create a separate Event for each event you have (even if the location is the same).
from https://schema.org/Event --> "Repeated events may be structured as separate Event objects."
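In JSON-LD form that could look roughly like this sketch (hypothetical names and dates) - two separate Event objects that simply repeat the same venue details in the markup:
<script type="application/ld+json">
[
  {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Beginners Class",
    "startDate": "2015-08-01T18:00",
    "location": {
      "@type": "Place",
      "name": "Sample Venue",
      "address": "1 Example Street, Sydney NSW 2000"
    }
  },
  {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Advanced Class",
    "startDate": "2015-08-08T18:00",
    "location": {
      "@type": "Place",
      "name": "Sample Venue",
      "address": "1 Example Street, Sydney NSW 2000"
    }
  }
]
</script>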
Related Questions
-
I am trying to generate GEO meta tags for my website, where one page has multiple locations. My question is: can I add GEO tagging for every address?
Am I restricted to one geo tag per page, or can I add multiple geo tags?
Technical SEO | lina_digital
-
Google displays multiple titles for same article. What does this mean?
I've linked to some screenshots so that what I'm talking about makes more sense. Sometimes, when I perform a search, I see an article with the correct article title listed as the page title in the SERPs. Other times, I see the wrong page title – it's a generic somethin' or other done by my client's web design company with a bunch of keywords thrown in. The latter (not the correct article title) also appears at the top of the browser tab for every article on my client's site. I know this is bad, but what can be done about it? This would never happen if my client used WordPress or some easily modifiable CMS, but they're using a proprietary one maintained by the group that designed the website.
Technical SEO | Greenery
-
301 Redirect for multiple links
I just relaunched my website and changed the permalink structure for several pages where only a subdirectory name changed. What 301 redirect code do I use to redirect the following? I have dozens of these where I need to change just the directory name from "urban-living" to "urban", and want it to catch the following all in one redirect command. Here is an example of the structure that needs to change.
Old:
domain.com/urban-living (single page w/ content)
domain.com/urban-living/tempe (single page w/ content)
domain.com/urban-living/tempe/the-vale (single page w/ content)
New:
domain.com/urban
domain.com/urban/tempe
domain.com/urban/tempe/the-vale
Technical SEO | shawnbeaird
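For a directory rename like the one described above, a single pattern rule usually suffices. A rough sketch, assuming Apache with mod_rewrite and an existing .htaccess file (untested against the real site):
RewriteEngine On
# Send /urban-living and anything beneath it to the same path under /urban
RewriteRule ^urban-living(/.*)?$ /urban$1 [R=301,L]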
Schema for Banks and SEO
I'm researching Schema opportunities for a bank, but besides the schema markup available today (like BankOrCreditUnion) and developments with FIBO, I can find no answer as to the effect of tagging interest rates and such in terms of SERP/CTR performance or visibility. Does anyone have a case study to share or some insight on the matter?
Technical SEO | Netsociety
-
<sub> & <sup> tags, any SEO issues?
Hi - the content on our corporate website is pretty technical, and we include chemical element codes in the text that users would search on (like SO2, CO2, etc.). A lot of times our engineers request that we list the codes correctly, with a <sub> on the last number. Question - does adding this code into the keyword affect SEO? The code would look like SO<sub>2</sub>. Thanks.
Technical SEO | Jenny10
-
Correct linking to the /index of a site and subfolders: what's the best practice? Link to domain.com/ or domain.com/index.html?
Dear all, starting with my .htaccess file:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.inlinear.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^.*/index.html
RewriteRule ^(.*)index.html$ http://inlinear.com/ [R=301,L]
1. I redirect all URL requests with www. to the non-www version...
2. All requests for "index.html" are redirected to "domain.com/".
My questions are:
A) When linking from a page to my frontpage (home), is the best practice to link to "http://domain.com/" and NOT to "http://domain.com/index.php"?
B) When linking to the index of a subfolder such as "http://domain.com/products/index.php", should I likewise link to "http://domain.com/products/" and not include the index.php?
C) When I define the canonical URL, should I also define it as just "http://domain.com/products/", or should I point to the actual file "http://domain.com/products/index.php"?
Are A) and B) the best practice? And C)?
Thanks for all replies! 🙂
Holger
Technical SEO | inlinear
-
Location Based Content / Googlebot
Our website has local content specialized to specific cities and states. The URL structure of this content is as follows:
www.root.com/seattle
www.root.com/washington
When a user comes to a page, we are auto-detecting their IP and sending them directly to the relevant location-based page - much the way that Yelp does. Unfortunately, what appears to be occurring is that Google comes in to our site from one of its data centers, such as San Jose, and is being routed to the San Jose page. When a user does a search for relevant keywords, in the SERPs they are being sent to the location pages that match where the bots are coming in from. If we turn off the auto geo, we think that Google might crawl our site better, but users would then be shown less relevant content on landing. What's the win/win situation here? Also - we appear to have some odd location/destination pages ranking high in the SERPs. In other words, locations that don't appear to correspond to one of Google's data centers. No idea why this might be happening. Suggestions?
Technical SEO | Allstar
-
OK to block /js/ folder using robots.txt?
I know Matt Cutts suggests we allow bots to crawl CSS and JavaScript folders (http://www.youtube.com/watch?v=PNEipHjsEPU), but what if you have lots and lots of JS and you don't want to waste precious crawl resources? Also, as we update and improve the JavaScript on our site, we iterate the version number ?v=1.1... 1.2... 1.3... etc., and the legacy versions show up in Google Webmaster Tools as 404s. For example:
http://www.discoverafrica.com/js/global_functions.js?v=1.1
http://www.discoverafrica.com/js/jquery.cookie.js?v=1.1
http://www.discoverafrica.com/js/global.js?v=1.2
http://www.discoverafrica.com/js/jquery.validate.min.js?v=1.1
http://www.discoverafrica.com/js/json2.js?v=1.1
Wouldn't it just be easier to prevent Googlebot from crawling the js folder altogether? Isn't that what robots.txt was made for? Just to be clear - we are NOT doing any sneaky redirects or other dodgy JavaScript hacks. We're just trying to power our content and UX elegantly with JavaScript. What do you guys say: obey Matt, or run the JavaScript gauntlet?
Technical SEO | AndreVanKets
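For reference, the rule being weighed in the question above would be only a couple of lines in robots.txt (shown purely as an illustration, not a recommendation - the Matt Cutts video linked above argues for leaving JS and CSS crawlable):
# Hypothetical rule under consideration: block crawling of the /js/ folder
User-agent: *
Disallow: /js/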