Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
How to fix site breadcrumbs in mobile Google search
-
For the past month, I have been researching how to fix this issue on my website, but none of my efforts have worked. I really need help because I'm worried about it. I was hoping Google would re-crawl my site, understand its structure, and correct the error. The breadcrumb works correctly on desktop but is not shown on mobile.
For example, take a look at: https://www.xclusivepop.com/omah-lay-bad-influence/
-
@ham35841 For Yoast SEO breadcrumb issues on APK mod game URLs like https://www.revdl.com/faceapp-apk-download.html/, check your Yoast settings, confirm the category structure, ensure theme compatibility, and update Yoast to the latest version. If the problem persists, consider seeking professional assistance. Explore solutions on https://titandevsquad.com/ for practical web development support.
-
To fix site breadcrumbs on mobile Google search programmatically, update structured data with JSON-LD or microdata, ensure responsiveness through CSS media queries, test using Google's Rich Results Test, and optimize CSS for mobile. Additionally, enhance functionality with JavaScript for touch-friendly navigation and monitor Google Search Console for debugging.
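To illustrate the structured-data step above, here is a minimal sketch of the schema.org BreadcrumbList markup that Google reads to display breadcrumbs, generated with Python for clarity. The `example.com` URLs and page names are placeholders, not taken from any site in this thread, and a plugin like Yoast normally emits this markup for you; this only shows the shape that Google's Rich Results Test expects.

```python
import json

def breadcrumb_jsonld(crumbs):
    """Build a schema.org BreadcrumbList as a JSON-LD dict.

    `crumbs` is the ordered trail from the homepage down to the
    current page, given as (name, url) pairs.
    """
    return {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            # Each ListItem needs a 1-based position, a name, and the page URL.
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(crumbs, start=1)
        ],
    }

# Hypothetical trail for a song post like the ones linked in this thread.
trail = [
    ("Home", "https://example.com/"),
    ("Songs", "https://example.com/songs/"),
    ("Bad Influence", "https://example.com/omah-lay-bad-influence/"),
]

# Embed this output in a <script type="application/ld+json"> tag in <head>,
# then verify it with the Rich Results Test.
print(json.dumps(breadcrumb_jsonld(trail), indent=2))
```

If the markup validates on desktop but breadcrumbs still don't show on mobile, the usual culprits are a theme that strips the JSON-LD script from the mobile template, or Google simply not having re-crawled the page yet.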
-
@ericrodrigo Hello, this issue persists on my newly launched blog. Can you still assist? I am still encountering the same error; the blog below is a sample of what I mean: https://glitzvibes.com/songs/asake-only-me/
Please @Ericrodrigo and @ham35841, kindly assist with any help you can, as this is obviously affecting my rankings.
-
@ericrodrigo thanks
-
I added the following CSS to my site header and it worked perfectly:
.Breadcrumbss {
  display: none;
}
You can see the demo here: https://9jababa.com/between-davido-and-international-show-promoter-who-claims-she-doesnt-know-who-he-is/
-
Simply add this to your CSS:
.Breadcrumbss {
  display: none;
}
I added it on my site Timesverse.in:
https://timesverse.in/angry-birds-2-v2-55-1-mod-diamonds-energy-download-apk-obb-for-android/
-
@ericrodrigo I don't understand this issue either, because I'm also encountering the same error on two of my websites. At first, I thought it was caused by the web theme, so I changed the theme, but the error keeps repeating itself.
Here are samples from the two URLs:
https://dripnaija.com/nasty-c-best-i-ever-had-mp3/
and
https://mzansinow.co.za/gigi-lamayne-feelin-u-ft-mi-casa-blxckie/
Please @Ericrodrigo and @ham35841, if you have any solution to this, don't hesitate to share. Thank you all!
-
I have a problem like this.
I enabled Yoast SEO breadcrumbs, but the breadcrumb doesn't display all subcategories.
You can see the problem at this URL: https://www.revdl.com/faceapp-apk-download.html/
I have many URLs like this; the problem mostly affects APK mod games.
Thank you
Related Questions
-
Why is this site ranked #1 in Google with such a low DA (is DA not important anymore?)
Hi Guys, Would you mind helping me with the below please? I would like to get your view on why Google ranks a really new domain name #1 with super-low Domain Authority. Or is Domain Authority useless in Google now? John Mueller said after the last update that they do not use Domain Authority, so should Moz's Domain Authority tool not be taken seriously, or am I missing something?

There is a new rehab in Thailand called https://thebeachrehab.com/ (Domain Authority 13). It's ranked #1 in Google.co.th for the phrases "drug rehab thailand" and also "addiction rehab thailand". Checking the backlink profile, it has merely 21 backlinks from really low-DA sites (and some of those are really spammy or unrelated). Now there are lots of sites in this industry here which have much higher Domain Authority and have been around for years; The Beach Rehab is maybe only 6 months old. Here are three domains which have been around for many years, have much higher DA, and have more relevant content. These are just 3 samples of many others:

https://www.thecabinchiangmai.com (Domain Authority 52)
https://www.hope-rehab-center-thailand.com/ (Domain Authority 40)
https://www.dararehab.com (Domain Authority 32)

These three sites have lots of high-DA backlinks (DA 90+) from strong media sites like time.com, theguardian.com, and telegraph.co.uk (especially thecabinchiangmai.com), and the other two have lots of solid backlinks from really high-DA sites. Looking at the content, thebeachrehab.com has less content as well. Can anyone have a look and let me know your thoughts on why Google puts a brand-new site with DA 13 and little content at the top compared to the competition? I do not see the logic in this. Cheers
White Hat / Black Hat SEO | | igniterman75
John0 -
Mobile SERP Thumbnail Image Control
Is there any way we can control the image that is selected next to the mobile SERPs? What Google selects for the mobile SERP thumbnail on a few of our results is not conducive to a high CTR.
White Hat / Black Hat SEO | | gray_jedi1 -
Should I delete older posts on my site that are lower quality?
Hey guys! Thanks in advance for thinking through this with me. You're appreciated! I have 350 pieces of Cornerstone Content that have been a large focus of mine over the last couple of years. They're incredibly important to my business. That said, less-experienced me did what I thought was best by hiring a freelance writer to create extra content to interlink them and add relevancy to the overall site. Looking back through everything, I am starting to realize that this extra content, which now makes up a third of my site, is at about 65%-70% quality AND only gets a total of about 250 visitors per month combined -- for all 384 articles. Rather than spending the next 9 months and investing in a higher-quality content creator to revamp them, I see the next best option as removing them. From a pro's perspective, do you guys think removing these 384 lower-quality articles is my best option, so I can focus my efforts on a better UX, a faster site, and continual upgrading of the 350 pieces of Cornerstone Content? I'm honestly at a point where I am ready to cut my losses, admit my mistakes, and swear to publish nothing but gold moving forward. I'd love to hear how you would approach this situation! Thanks 🙂
White Hat / Black Hat SEO | | ryj0 -
Forcing Google to Crawl a Backlink URL
I was surprised that I couldn't find much info on this topic, considering that Googlebot must crawl a backlink URL in order to process a disavow request (i.e. Penguin recovery and reconsideration requests). My trouble is that we recently received a great backlink from a buried page on a .gov domain, and the page has yet to be crawled after 4 months. What is the best way to nudge Googlebot into crawling the URL and discovering our link?
White Hat / Black Hat SEO | | Choice0 -
A Branded Local Search Strategy utilizing Microsites?
Howdy Moz, Over and over we hear of folks using microsites in addition to their main brand for targeting keyword-specific niches. The main point of concern most folks have is either duplicate content or being penalized by Google, which is also our concern. However, in one of our niches we notice a lot of competitors have set up secondary websites to rank in addition to the main website (basically taking up more room in the SERPs). They are currently using different domains, on different IPs, on different servers, etc. We verified this because we called them and they all rang through to the same competitors. So our thought was: why not take the fight to them (so to speak), but with a branding and content strategy? The company has many good content pieces that we can utilize, like company mottos, mission statements, special projects, and community outreach, that can be turned into microsites with unique content. Our strategy idea is to take a company called "ACME Plumbing" and brand for specific keywords with locations, like sacramentoplumberwarranty.com, where the site's content revolves around plumber warranty info, measures of a good warranty, plumbing warranty news (newsworthy issues), blogs, RCS - you get the idea... and send both referral traffic and links to the main site. The idea is to then repeat the process with another company aspect, like napaplumbingprojects.com, where the content of the site is focused on cool projects, images, RCS, etc. Again, referring traffic and link juice to the main site. We realize that this adds to the amount of RCS that needs to be done, but that's exactly why we're here. Also, any thoughts on intentionally tying the brand to the location so you get URLs like acmeplumbingsacramento.com?
White Hat / Black Hat SEO | | AaronHenry1 -
A site is using their competitors' names in their Meta Keywords and Descriptions
I can't imagine this is a White Hat SEO technique, but they don't seem to be punished for it by Google - yet. How does Google treat the use of your competitors' names in your meta keywords/descriptions? Is it a good idea?
White Hat / Black Hat SEO | | PeterConnor0 -
Why do websites use different URLs for mobile and desktop
Although Google and Bing have recommended that the same URL be used for serving desktop and mobile websites, portals like Airbnb use different URLs to serve mobile and web users. Does anyone know why this is done even though it is not good for SEO?
White Hat / Black Hat SEO | | razasaeed0 -
Interesting case of IP-wide Google Penalty, what is the most likely cause?
Dear SEOmoz Community, Our portfolio of around 15 internationalized web pages received a significant, seemingly IP-wide, Google penalty starting November 2010 and has yet to recover from it. We have undertaken many measures to lift the penalty, including reconsideration requests without luck, and are now hoping the SEOmoz community can give us some further tips. We are very interested in the community's help and judgement on what else we can try to lift the penalty. As quick background information:
- The sites in question offer sports results data and are translated into several languages. Each market (i.e. language) has its own TLD domain using the central keyword, e.g. <keyword_spanish>.es, <keyword_german>.de, <keyword_us>.com
- The content is highly targeted to each market, which means there are no duplicate content pages across the domains; all copy is translated, content is reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same
- An SEO agency of ours used semi-automated link-building tools in mid-2010 to acquire link partnerships
- There are some promotional one-way links to sports betting and casinos positioned on the pages
- The external linking structure of the pages is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page with one particular keyword
- All sites have strong domain authority and have been running under the same owner for over 5 years
As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The applied penalties are indisputable, given that rankings for the main keywords in local Google search engines dropped from position 3 to position 350 after the sites had been ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached. The same behavior can be observed across domains.
Our questions are:
- Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all pages registered in Google Webmaster Tools?
- What is the most likely cause of our penalty, given the background information? Given that the drops started already in November 2010, we doubt the Panda updates had any correlation to this issue.
- What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far were reducing external links, on-page links, and C-class internal links.
- Are there any other factors/metrics we should look at to help troubleshoot the penalties?
- After all this time without resolution, should we move to two new domains and forward all content via 301s to the new pages? Are there things we need to try first?
Any help is greatly appreciated. SEOMoz rocks. /T
cxK29.png
White Hat / Black Hat SEO | | tomypro0