Traffic exchange referral URLs
-
We have a client who, once per month, gets hit by easyihts4u.com, and it creates huge spikes in their referral traffic. All the hits go to one specific page. From the research we have done, this site and others like it are not spam bots. We cannot understand how they choose sites to target, or what good it does for them or for our client to have all the hits land on one page in a single day. We created a filter in Analytics to give what we think is a more accurate reflection of traffic. Should we block them at the server level as well?
-
Hi Teamzig! Did Chris's response help? We'd love an update.

-
I can't say I've come across this one before, but I've done some brief research and it appears to be a fake traffic website, as the name would suggest.
Sites like this work on a scheme where you visit sites in their list of "members" and this earns you credits. The more credits you earn this way, the more people can visit your site. Think of it like exchanging visits.
I've never used one of these myself but I'd imagine there will be certain criteria that must be met to earn the credits; maybe a certain time on page or performing an action so you're not generating a bounce.
A bit more info I found on a black hat forum:
"...easyhists4u is, alike others, website service, which will give you plenty of free and real traffic, however, to get the free traffic, if you don't wanna pay for it, you firstly need to earn credits and to earn credits, you need to browse other people's websites, then you earn credits and you can exchange those credits for a free traffic. It's simple and easy, but the catch is, you're trading your own free time for your traffic, which is kinda a deadend, because you need to spend like 1 hour to get enough credits for like 10 visitors to your page, which is kinda a joke, if you think about that."
As for how this benefits anyone: real, engaged traffic can help your rankings in the short term, so it stands to reason that continuing to buy traffic every day would keep delivering that benefit. I don't condone this sort of thing, since it misses the true point of SEO and just focuses on SERP positions.
Finally, how your client was selected is something I can only speculate on. Perhaps a previous SEO provider submitted them to this scheme so they could report "traffic growth" to the client. Alternatively, the owners of this fake traffic site could randomly select domains to drop into their members list so that people do exactly what you've done here: notice them in the referral list and take a closer look.
All in all, I'd suggest contacting them to ask for the site to be removed from their list, and adding them to your disavow file as well. It's unlikely you'll get a response, but it's worth the 60 seconds it takes to send an email.
Fake traffic like this is painful because it completely messes with your stats and forces you to mess around with filters to exclude them as a referral source.
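As for blocking at the server level, here's a minimal sketch of what that could look like in .htaccess, assuming the site runs on Apache with mod_rewrite available (check your referral reports for the exact domain spelling before using anything like this):

```apache
# Rough sketch: return 403 Forbidden to any request whose Referer header
# contains the fake-traffic domain. Assumes Apache with mod_rewrite enabled;
# verify the exact domain against your analytics referral reports first.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{HTTP_REFERER} easyihts4u\.com [NC]
  RewriteRule .* - [F,L]
</IfModule>
```

Bear in mind this only stops visits that actually arrive with that Referer header; it won't clean up anything already recorded in your analytics, so you'd still want the filter you've created.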
Related Questions
-
High Rankings and Traffic Despite Low DA and Few Backlinks
Hi guys, it's a pleasure to be part of this community and I hope to learn a lot from you. I started learning SEO a year ago and it has been quite a journey. I was looking at the competition for some websites I have been optimizing, and I found a website that caught my attention and I can't figure out what's going on: it gets huge traffic but is really weak in terms of technical SEO, and also in terms of DA and backlinks (20 backlinks, most of them spammy). The domain in question is bhnews.com.br. I noticed that it doesn't have any social media, no analytics, etc. The only thing I noticed is that there is a company called "BH news" (television), but it's not related, since the type of information that bhnews.com.br presents is lottery results. This kind of situation confuses me a lot, because it takes a lot of hard work to optimize a website to rank in Google, and then I come across this type of website with 20 backlinks (most with the domain name as anchor text) that gets around 2M visits per month and ranks for keywords related to these lottery sites. Can someone tell me if there is some kind of black hat SEO, or something else, making this rank so high? Regards
White Hat / Black Hat SEO | | jogobicho0 -
How to improve PA of Shortened URLs
Why do some shortened URLs like bitly/owly/googl have a PA above 40? I have tried everything to improve the PA of my shortened URLs, like Facebook shares, retweets, and backlinks to them, but I still have a PA of 1. Check out this URL: https://a-moz.groupbuyseo.org/blog/state-of-links in Moz OSE and you will see many 301 links from shorteners. I asked many SEO experts about this but no one answered the question, so today I subscribed to Moz Pro for the solution. Please give me the answer.
White Hat / Black Hat SEO | | igains0 -
How authentic is a dynamic footer from bots' perspective?
I have a very meta-level question. I was working on a dynamic footer for the website http://www.askme.com/; you can check it in the footer. If you refresh the page and check the content, you'll see a different combination of links in every section. I'm calling it a dynamic footer here, as the values are absolutely dynamic in this case. **Why are we doing this?** For every section in the footer we have X number of links, but we can show only 25 links in each section. The value of X can be greater than 25 (let's say X=50). So I'm randomizing the list of entries I have for a section and then picking 25 elements from it, i.e. a random 25 elements from the list every time you refresh the page. **Benefits from an SEO perspective?** This will help me expose all the URLs to bots (across multiple crawls) and will add a page-freshness element as well. **What's the problem, if there is one?** I'm wondering how bots will treat this, since at any given time a bot might see us showing one set of content to bots and something else to users. Will bots consider this cloaking (a black hat technique)? Or will they not treat it as black hat, since I'm refreshing the data every single time, even if it's a bot hitting me twice in a row to work out what I'm doing?
White Hat / Black Hat SEO | | _nitman0 -
Direct Traffic Has Dropped 48% Compared to Last Year
Since February of 2013, organic traffic at http://www.weddingshoppeinc.com had been declining. We were able to get traffic back up to par with the previous year's numbers by December of 2013. In March of 2014 our direct traffic took a major hit and hasn't improved. We know our mobile traffic is part of the problem, but the issue has affected traffic from both desktop and mobile devices. Is this an organic traffic problem, or is our decrease in direct traffic coming from somewhere else? Has anyone else seen this issue, or does anyone have advice? Here is what we've already looked into, and some updates to note: Before this issue, when we compared organic and direct traffic, direct was usually half of what organic was (i.e., if organic was at 10 visitors, direct was at 5); however, organic traffic has followed normal trends while direct has dropped. In August we updated our .NET code to MVC to drop our time to first byte from 1,700 to 300 milliseconds; however, if you look at our m. site, it's around 1,000 milliseconds. We changed our SEO strategy in May to follow best practices. We've been rewriting old content; we haven't ever done any black hat SEO, just have some old blogs from 2010-2012 with too many keywords, and these are getting edited. In March we moved our images to a CDN. We're currently working on server errors and broken links, but nothing significant changed around March that would affect our traffic. Very recently, our web developers said they believed our direct traffic had been getting tracked incorrectly in Google Analytics prior to March 2014; however, they think they fixed the issue in a March push. We've taken this theory into account, but we also see a drop in revenue at the time of their push that correlates with the drop in traffic, so we know there's a bigger issue. Any input you can provide would be greatly appreciated!
White Hat / Black Hat SEO | | JimmyFritz1 -
One page with multiple sections - unique URL for each section
Hi All, This is my first time posting to the Moz community, so forgive me if I make any silly mistakes. A little background: I run a website for a company that makes custom parts out of specialty materials. One of my strategies is to create high-quality content about all areas of these specialty materials to attract potential customers: pretty straightforward stuff. I have always struggled with how to structure my content; from a usability point of view, I like having just one page for each material, with different subsections covering different topical areas. Example: for a special metal material I would have one page with subsections about the mechanical properties, thermal properties, available types, common applications, etc. Basically how Wikipedia organizes its content. I do not have a large amount of content for each section, but as a whole it makes one nice cohesive page for each material. I do use H tags to mark the specific sections on the page, but I am wondering if it may be better to have one page dedicated to the specific material properties, one page dedicated to specific applications, and one page dedicated to available types. What are the community's thoughts on this? As a user of the website, I would rather have all of the information on a single, well-organized page for each material. But what do SEO best practices have to say about this? My last thought would be to create a hybrid website (I don't know the proper term). Have a look at these examples from Time and Quartz: when you are viewing an article, the URL is unique to that page. However, when you scroll to the bottom of the article, you can keep on scrolling into the next article, with a new unique URL, all without clicking through to another page. I could see this technique being ideal for a good web experience while still allowing me to optimize my content for more specific topics/keywords. If I used this technique with the canonical tag, would I then get the best of both worlds? Let me know your thoughts! Thank you for the help!
White Hat / Black Hat SEO | | jaspercurry0 -
Does a Trademark Symbol in the URL Matter to Google?
Hello community! We are planning to clean up the TM and R symbols in the URLs on the website. Google has indexed these pages, but for some TM pages the URL shown in the SERP displays other characters instead of the symbol. What are your thoughts on a "spring cleaning" effort to remove all TM and R and other unsafe characters from URLs? Will this impact indexed pages, rankings, etc.? Thank you! b.dig
White Hat / Black Hat SEO | | b.digi0 -
Closing down a site and redirecting its traffic to another
OK - so we currently own two websites that are in the same industry. Site A is our main site, which hosts real estate listings and rentals in Canada and the US. Site B hosts rentals in Canada only. We are shutting down Site B to concentrate solely on Site A, and will be looking to redirect all traffic from Site B to Site A; i.e., a user lands on the Toronto Rentals page on Site B and we forward them to the Toronto Rentals page on Site A, and so on. Site A has all the same locations and property types as Site B. On to the question: we are trying to figure out the best method of doing this that will appease both users and the Google machine. Here's what we've come up with (2 options). When a user hits Site B via Google/bookmark/whatever, do we: 1. Automatically/instantly (301) redirect them to the applicable page on Site A? 2. Present them with a splash page of sorts ("This page has been moved to Site A. Please click the following link [insert rich anchor text / URL here] to visit the new page.")? We're worried that option #1 might confuse some users, and we're not sure how crawlers might react to thousands of instant redirects like that. Option #2 would be most beneficial to the end user (we're thinking) as they're being notified, on the page, of what's going on. Crawlers would still be able to follow the URL that is presented within the splash write-up. Thoughts? We've never done this before. It's basically like one site acquiring another site; however, in this case, we already owned both sites. We just don't have time to take care of Site B any longer due to the massive growth of Site A. Thanks for any/all help. Marc
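For what it's worth, a rough sketch of what option #1 could look like at the server level, assuming Site B runs on Apache with mod_rewrite and its URL paths map one-to-one onto Site A (site-a.example below is just a placeholder for the real Site A domain):

```apache
# Sketch of option #1: blanket 301 redirects from Site B to the matching
# path on Site A. Assumes Apache with mod_rewrite and a one-to-one URL
# mapping; site-a.example is a placeholder, not the real domain.
<IfModule mod_rewrite.c>
  RewriteEngine On
  # Send every requested path to the same path on Site A as a permanent redirect
  RewriteRule ^(.*)$ https://site-a.example/$1 [R=301,L]
</IfModule>
```

Any pages that don't map one-to-one could be handled with more specific RewriteRule lines placed above the blanket rule.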
White Hat / Black Hat SEO | | THB0 -
Deny visitors by referrer in .htaccess to clean up spammy links?
I want to lead off by saying that I do not recommend trying this. My gut tells me that this is a bad idea, but I want to start a conversation about why. Since Penguin a few weeks ago, one of the most common topics of conversation in almost every SEO/webmaster forum is "how to remove spammy links". As Ryan Kent pointed out, it is almost impossible to remove all of these links, as these webmasters and previous link builders rarely respond. This is particularly concerning given that he also points out that Google is very adamant that ALL of these links are removed. After a handful of sleepless nights and some research, I found out that you can block traffic from specific referring sites using your .htaccess file. My thinking is that by blocking traffic from the domains with the spammy links, you could prevent Google from crawling from those sites to yours, thus indicating that you do not want to take credit for the link. I think there are two parts to the conversation: 1. Would this work? Google would still see the link on the offending domain, but by blocking that domain are you preventing any strength or penalty associated with that domain from impacting your site? 2. If for whatever reason this would not work, would a tweak in the algorithm by Google to allow this practice be beneficial to both Google and the SEO community? This would certainly save those of us tasked with cleaning up previous work by shoddy link builders a lot of time, and allow us to focus on what Google wants: creating high-quality sites. Thoughts?
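To make the mechanism concrete, the kind of .htaccess rule being described might look roughly like this (assuming Apache with mod_rewrite; spammy-domain.example is a placeholder, not a real referring site):

```apache
# Sketch of the technique described above: refuse any request that arrives
# with a Referer header from a known spammy linking domain. Assumes Apache
# with mod_rewrite enabled; spammy-domain.example is a placeholder.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{HTTP_REFERER} spammy-domain\.example [NC]
  RewriteRule .* - [F,L]
</IfModule>
```

Note that this only affects visitors (or bots) that actually send that Referer header when they click through; the link itself still exists on the offending domain, which is exactly the doubt raised in part 1 above.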
White Hat / Black Hat SEO | | highlyrelevant0