Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Large competitor closed; how can we capitalize in search? Any ideas?
-
Hey Mozzers,
One of our biggest competitors closed down on January 1st, 2020, in several US cities (they did stay open in some areas, just FYI). The competitor's website is www.execucar.com. This is a very large company with a presence at almost all major US airports. It's a private car service, like Uber but for wealthy individuals.
For example, when you search "lax car service" they are #3 on Google, and for "car service to lax" they're still #2.
What can we do to get more of their traffic and actual business? Has anyone done something like this before, or does anyone know quick and easy tactics to win their clients? We have a local landing page, https://dcacar.com/lax-car-service, that ranks 9 through 11 for those same keywords.
Thanks for your thoughts and time.
Davit
-
Hi Miriam,
Hope you're doing well. Thanks for always answering my questions.
Yes, they closed at LAX (Los Angeles) and in many other US cities as well, but they still rank very well for that location. It has been almost two months since they stopped operating at LAX. And to answer your second question: yes, we do serve LAX. We're not ranking as well as we would like, but I was hoping to capitalize on this opportunity.
One important thing to mention: their website does not say that they no longer serve these locations; in fact, their city pages still list them as served. But if you try to get a quote online, or even call the phone number, you get a response that they no longer serve the location.
The company was sold to a venture capital firm, which decided to close some of the unprofitable cities. I reached out to the firm to see if it was possible to buy their local phone number, customer email list, etc., but they are not interested in selling. So I was wondering what else I can do to get their customers. They have been in business for almost 20 years and have a large customer list.
-
Hi Davit!
Some questions:
Are you saying that this business closed its location that services LAX, but they are still ranking for it?
Or, are you saying they still have a location open there?
Do you serve LAX?
Please, let me know. Thanks!
-
The same thing happened with https://www.devicesprice.com/. I have hired a worker who is helping me with this, but I am also looking for good ideas.
Related Questions
-
Search console validation taking a long time?
Hello! I did something dumb back at the beginning of September: I updated Yoast and somehow noindexed a whole set of custom taxonomies on my site. I fixed this and then asked Google to validate the fixes on September 20. Since then, they have gotten through only 5 of the 64 URLs... is this normal? Just want to make sure I'm not missing something that I should be doing. Thank you! ^_^
Intermediate & Advanced SEO | angelamaemae
-
How to get local search volumes?
Hi Guys, I want to get search volumes for "carpet cleaning" for certain areas in Sydney, Australia. I'm using this process:
1. Choose 'Search for new keyword and ad group ideas'.
2. Enter the main keywords for your product/service.
3. Remove any default country targeting.
4. Specify your chosen location(s) by targeting specific cities/regions.
5. Click 'Get ideas'.
The problem is that none of the areas, even popular ones (like North Sydney, Surry Hills, Newtown, Manly), appear in the Google keyword tool: no matches. Are there any other tools or sources of data I can use to get accurate search volumes for these areas? Any recommendations would be very much appreciated. Cheers
Intermediate & Advanced SEO | wozniak65
-
Hacked website - Dealing with 301 redirects and a large .htaccess file
One of my client's websites was recently hacked, and I've been dealing with the after-effects. The website is now clean of malware, and I have already appealed to Google about the malware issue. The current problem is dealing with the 20,000+ crawl errors, which are garbage links created by the hacking. How does one go about creating all the 301 redirects needed for the 404 crawl errors? I'm already noticing an increased load time on the website due to a rather large .htaccess file with a couple thousand 301 redirects done already, and I fear my client's site speed and SEO performance will take a hit as well.
Intermediate & Advanced SEO | FPK
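A note for anyone facing the same cleanup: URLs created by a hack usually carry no traffic or link equity worth preserving, so the usual advice is to let them return 404 or 410 rather than 301 them anywhere; per-URL redirects are only worth it for pages that had real value. Where the garbage URLs follow patterns, a handful of pattern-based rules can replace thousands of one-off entries. A minimal sketch, with hypothetical URL patterns that would need to be adjusted to whatever the hack actually generated:

```apache
# A sketch, not a drop-in fix: the patterns below are made up.
RewriteEngine On

# One rule per family of hacked URLs, answering "410 Gone" ([G])
# so no redirect target is needed at all.
RewriteRule ^cheap-meds- - [G,L]
RewriteRule ^replica-watch- - [G,L]

# If the spam pages are marked by a query-string parameter:
RewriteCond %{QUERY_STRING} (^|&)spamparam= [NC]
RewriteRule ^ - [G,L]
```

The load-time hit comes from the fact that .htaccess is read and evaluated on every request, so a couple thousand rules mean a couple thousand regex checks per hit; a few patterns, or moving the rules into the virtual host config, avoids that.
-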
Large robots.txt file
We're looking at potentially creating a robots.txt with 1,450 lines in it. This would remove 100k+ pages from the crawl, all of them old pages (I know the ideal would be to delete/noindex them, but that's not viable, unfortunately). The issue I'm weighing is whether a robots.txt that large will either stop being followed altogether or slow our crawl rate down. Does anybody have experience with a robots.txt of that size?
Intermediate & Advanced SEO | ThomasHarvey
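For what it's worth, Google documents a 500 KiB size limit for robots.txt, and 1,450 plain Disallow lines normally come in far below that, so the file being ignored outright is unlikely. The bigger win is usually consolidation: Google and Bing both honour the * and $ wildcards in robots.txt, so families of old URLs can often be collapsed into a few patterns. A sketch with hypothetical paths:

```
User-agent: *
# Instead of one Disallow per retired page:
#   Disallow: /old-page-0001.html
#   Disallow: /old-page-0002.html
# collapse each family into a pattern:
Disallow: /archive/
Disallow: /*-v1.html$
Disallow: /*?legacy=
```

Keep in mind that Disallow only stops crawling; it does not remove URLs that are already indexed, which is one reason a delete/noindex pass is usually preferred when it is viable.
-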
Crawled page count in Search console
Hi Guys, I'm working on a project (premium-hookahs.nl) where I've stumbled upon a situation I can't address. Attached is a screenshot of the crawled pages in Search Console. History: due to technical difficulties, this webshop didn't always noindex filter pages, resulting in thousands of duplicate pages. In reality, this webshop has fewer than 1,000 individual pages. We took the following steps to resolve this: 1) noindex the filter pages, 2) exclude those filter pages in Search Console and robots.txt, 3) canonical the filter pages to the relevant category pages. This, however, didn't result in Google crawling fewer pages. Although the implementation wasn't always sound (technical problems during updates), I'm sure this setup has been the same for the last two weeks. Personally, I expected a drop in crawled pages, but they are still sky high; I can't imagine Google visits this site 40 times a day. To complicate the situation, we're running an experiment to gain positions on around 250 long-tail searches. A few filters will be indexed (size, color, number of hoses, and flavors), and three of them can be combined, which results in around 250 extra pages. Meta titles, descriptions, H1s, and texts are unique as well. Questions: (1) Excluding pages in robots.txt should result in Google not crawling them, right? (2) Is this number of crawled pages normal for a website with around 1,000 unique pages? (3) What am I missing?
Intermediate & Advanced SEO | Bob_van_Biezen
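One subtlety in the setup described above: a robots.txt exclusion and an on-page noindex or canonical work against each other, because a page Google is not allowed to crawl is a page whose meta tags Google never sees. If the goal is to get the filter pages deindexed, they have to remain crawlable until Google has actually recrawled and dropped them, along these lines:

```html
<!-- Head of a filter page that should drop out of the index.
     The URL must NOT be disallowed in robots.txt, or Googlebot
     will never fetch the page and never see this tag. -->
<meta name="robots" content="noindex, follow">
```

(Combining noindex with a canonical pointing at the category page sends mixed signals, so it is usually one or the other.) That also bears on the first question in the post: yes, a Disallow stops Google from crawling matching URLs, but it freezes whatever is already indexed in place rather than shrinking it.
-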
Geo-Redirect: good idea or not?
Hi Mozzers, Some background: I have a very corporate .com domain which is used worldwide, and next to it another .com domain created specifically for US visitors. In the organic rankings we notice that the corporate domain ranks much better in the US, so many US visitors arrive on it, and because it is a worldwide corporate domain, they get lost. My questions: I know there are ways to redirect by location. Would it be smart to automatically redirect US visitors of the corporate domain to the commercial US-specific domain? Is it possible to redirect only US visitors and leave the website as it is for visitors from other countries? Won't this harm the corporate website organically worldwide? If this is a good idea, are there any recommended plugins or concrete procedures? Thank you so much for helping me out!
Sander
Intermediate & Advanced SEO | WeAreDigital_BE
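On the technical side, a server-level geo-redirect is straightforward. A minimal sketch, assuming Apache with the mod_geoip module installed (us.example.com is a placeholder for the US-specific domain):

```apache
# mod_geoip exposes the visitor's country as GEOIP_COUNTRY_CODE.
GeoIPEnable On
RewriteEngine On

# Send visitors whose IP geolocates to the US to the US-specific
# site, preserving the requested path; everyone else stays put.
RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^US$
RewriteRule ^(.*)$ https://us.example.com/$1 [R=302,L]
```

One caveat worth weighing before doing this: Googlebot crawls predominantly from US IP addresses, so a forced redirect means Google itself will mostly be shown the US site. Many international sites therefore show a "you seem to be in the US, visit our US site" banner instead of a hard redirect, and use hreflang annotations to tell Google which domain targets which market.
-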
Search Engine Pingler
Hello everyone, it's me again 😉 I've just got a Pro membership on SEOmoz and I am full of questions. A few days ago I found a very interesting tool called Search Engine Pingler. Its description goes like this: "Your website or your page was published a long time ago, but you cannot find it on Google, because Google has not indexed your site. The Search Engine Pingler tool will assist you: it will ping the URL of your page to more than 80 servers of Google and other search engines, informing the search engines to come and index your site." So my question is: does that tool really help to increase the indexation of a link by search engines like Google? If not, please explain what its real purpose is. Thank you to the future guru who can give the right answer 🙂
Intermediate & Advanced SEO | smokin_ace
-
Best way to block a search engine from crawling a link?
If we have one page on our site that is only linked to from one other page, what is the best way to block crawler access to that page? I know we could set the link to "nofollow", which would prevent the crawler from passing any authority, and we could set the page to "noindex" to prevent it from appearing in search results, but what is the best way to prevent the crawler from accessing the page via that one link?
Intermediate & Advanced SEO | nicole.healthline
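For anyone finding this thread later: of the three mechanisms mentioned, only robots.txt actually prevents a compliant crawler from fetching the page. rel="nofollow" only affects how authority flows through the link, and a noindex meta tag can only be seen if the page is crawled. A minimal robots.txt sketch, with a hypothetical path:

```
User-agent: *
# Blocks compliant crawlers from fetching this one URL. Note the
# page can still appear in results as a bare URL if enough other
# signals point at it, since Google never reads its content.
Disallow: /members-only-page.html
```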