Can hotlinking images from multiple sites be bad for SEO?
-
Hi,
There's a very similar question already being discussed here, but it deals with hotlinking from a single site that is owned by the same person.
I'm interested in whether hotlinking images from multiple sites can be bad for SEO.
The issue is that one of our bloggers has been hotlinking all the images he uses; sometimes there are 3 or 4 images per blog post, each from a different domain.
We know that hotlinking is frowned upon, but can it affect us in the SERPs?
Thanks,
James
-
Sorry, hotlinking was the wrong word to use; we're actually just embedding the images.
Is it possible that Google recognises that spammy sites (as an example) tend to embed lots of images, and therefore uses it as an indicator of spam?
Also, is poor netiquette ever taken into account? Again, maybe because Google is trying to find spammy sites?
For the record, it is something we'll be fixing (especially from a copyright point of view), but we're trying to prioritise this. If there's a potential SEO impact, we'll sort it quickly; if not, we'll do more pressing things first.
-
Okay, so hotlinking is the wrong terminology to use. Do you think embedding images is taken into account by Google?
For example, would Google see spammy sites embedding lots of images, and therefore use it as an indicator of spam?
-
That's confused me too! Embedding an image from another site is hotlinking. An <a href> link doesn't have anything to do with it.
-
Excuse me, it's late in the day. Embedding is still referencing the site's image URL, right?
Also, what if the site changes its directory structure or something, and all the images on your site now 404?
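In other words, something like this - a minimal sketch with hypothetical domains - where the src still points at their server, whether or not there's a link around it:

```html
<!-- Hotlinked/embedded: the browser fetches from the other site's server.
     If they move or delete the file, this 404s. (Domains are hypothetical.) -->
<img src="https://other-blog.example.com/uploads/photo.jpg" alt="Example photo">

<!-- Wrapping it in a link changes nothing about where the image loads from: -->
<a href="https://other-blog.example.com/article">
  <img src="https://other-blog.example.com/uploads/photo.jpg" alt="Example photo">
</a>

<!-- Self-hosted: a copy served from your own domain. -->
<img src="/images/photo.jpg" alt="Example photo">
```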
-
Another thing to consider is load time. For each extra host the page pulls images from, the browser has to do a DNS lookup and set up a new connection before it can download anything from that host, and that overhead adds up compared with serving the images from your own domain.
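If the external images have to stay for now, one small mitigation is to hint the browser to resolve those hosts early in the <head> - a sketch with a hypothetical hostname:

```html
<!-- Resolve the external host's DNS (and optionally warm up the connection)
     before the images are requested. Hostname is hypothetical. -->
<link rel="dns-prefetch" href="//images.other-site.example.com">
<link rel="preconnect" href="https://images.other-site.example.com">
```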
Hope this helps!
Dan
-
Sorry, I assumed you meant you were hotlinking images rather than just embedding them. If you're just using <img> tags with no <a href> link defined (so just embedding, not hotlinking), then you're right - this won't cause a problem.
-
Create and host your own image, or use a royalty-free image, so you won't suffer from someone claiming copyright infringement; that should be your biggest concern here.
-
Takeshi is right. Bandwidth can cost money, so there's that as well as the copyright theft. You could also fall victim to a 'switcheroo': http://www.deuceofclubs.com/switcheroo/index.html - I've done this myself before, swapping a hotlinked image for a polite message asking the person not to hotlink.
Google doesn't include hotlinked images in Google News, so it's something they may take into account when ranking a page in their general search.
-
Surely that only works if it's an actual link, right? Simply using the <img> tag shouldn't be regarded as a link by Google?
-
You are definitely missing out on image traffic by not hosting your own images. Plus, hotlinking is poor netiquette since you are using someone else's bandwidth without their permission. If the images are copyrighted, then you could be hit by DMCA requests which can negatively impact your SEO.
-
Hi James
A lot of this will depend on the sites you're embedding images from.
It's long been part of the ranking algorithm that if you link out to sites Google views negatively, due to spam/malware/etc., then your own site may be viewed negatively too. Without knowing which sites your blogger has been pulling images from, it's hard to say - but it's worth running a check just in case.
Related Questions
-
Does anyone know whether the linking of hashtags on Wix sites impacts SEO negatively or positively? It is coming up as an error in site crawls: 'Pages with 404 errors'. Anyone got any experience, please? For example, at the bottom of this blog post https://www.poppyandperle.com/post/face-painting-a-global-language the hashtags are linked, but they don't go to a page; they go to search results of all other blogs using that hashtag. Seems a bit of a strange approach to me.
Technical SEO | Mediaholix
-
I am trying to generate GEO meta tags for my website, where one page lists multiple locations. My question is: can I add GEO tagging for every address?
Am I restricted to one geo tag per page, or can I add multiple geo tags?
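For reference, this is the markup I mean - a minimal sketch with hypothetical values, following the common geo.* / ICBM convention, which seems to anticipate a single position per page:

```html
<!-- Common geotagging meta convention; all values here are hypothetical. -->
<meta name="geo.position" content="51.5074;-0.1278">
<meta name="geo.region" content="GB-LND">
<meta name="geo.placename" content="London">
<meta name="ICBM" content="51.5074, -0.1278">
```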
Technical SEO | lina_digital
-
Why Can't Googlebot Fetch Its Own Map on Our Site?
I created a custom map using Google Maps creator and embedded it on our site. However, when I ran the fetch and render through Search Console, it said it was blocked by our robots.txt file. I read in the Search Console Help section that: 'For resources blocked by robots.txt files that you don't own, reach out to the resource site owners and ask them to unblock those resources to Googlebot.' I did not set up our robots.txt file. However, I can't imagine it would be set up to block Google from crawling a map. I will look into that, but before I go messing with it (since I'm not familiar with it), does Google automatically block their maps from their own Googlebot? Has anyone encountered this before? Here is what the robots.txt file says in Search Console:
User-agent: *
Allow: /maps/api/js?
Allow: /maps/api/js/DirectionsService.Route
Allow: /maps/api/js/DistanceMatrixService.GetDistanceMatrix
Allow: /maps/api/js/ElevationService.GetElevationForLine
Allow: /maps/api/js/GeocodeService.Search
Allow: /maps/api/js/KmlOverlayService.GetFeature
Allow: /maps/api/js/KmlOverlayService.GetOverlays
Allow: /maps/api/js/LayersService.GetFeature
Disallow: /
Any assistance would be greatly appreciated. Thanks, Ruben
Technical SEO | KempRugeLawGroup
-
Is pointing multiple domains to a single website beneficial for SEO or not?
A client has purchased many domains with keywords in each. They want to have us point each domain to their site for better SEO. Is this a good or bad thing to do?
Technical SEO | thinkcreativegroup
-
Can you have a /sitemap.xml and /sitemap.html on the same site?
Thanks in advance for any responses; we really appreciate the expertise of the SEOmoz community! My question: since the file extensions are different, can a site have both a /sitemap.xml and a /sitemap.html sitting at the root domain? For example, we've already put the HTML sitemap in place here: https://www.pioneermilitaryloans.com/sitemap Now we're considering adding an XML sitemap. I know standard practice is to load it at the root (www.example.com/sitemap.xml), but I'm wondering if this will cause conflicts. I've been unable to find this topic addressed anywhere, or any real-life examples of sites currently doing this. What do you think?
Technical SEO | PioneerServices
-
Can dynamically translated pages hurt a site?
Hi all, looking for some insight please. I have a site we have worked very hard on to get ranked well, and it is doing well in search. The site has about 1000 pages and climbing, and about 50 of those are translated pages - static pages with unique URLs. I have had no problems with duplicate content and that sort of thing, and all pages were manually translated, so no translation issues. We have been looking at software that can dynamically translate the complete site into a handful of languages - let's say about 5. My problem is that these pages get produced dynamically, and I have concerns that Google will take issue with this, as well as with the huge sudden influx of new URLs - we could be looking at an increase of 5000 new URLs (which usually triggers an alarm). My feeling is that it could risk the stability of the site we have worked so hard for, and maybe we should just stick with the already translated static pages. I am sure the process could be fine, but I fear a manual inspection and a slap on the wrist for having dynamically created content, and also a review trigger period. These days it is hard to know what could get you in 'trouble', and my gut says keep it simple, leave it as is, and don't shake it up. Am I being overly concerned? Would love to hear from others who have tried similar changes, and also those who have not due to similar 'fear'. Thanks
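For context, here's the standard way translated page variants with unique URLs get declared, whether the pages are static or dynamically generated - a sketch with hypothetical URLs and language codes:

```html
<!-- hreflang alternates in the <head> of each language variant;
     URLs and language codes are hypothetical. -->
<link rel="alternate" hreflang="en" href="https://www.example.com/widgets/">
<link rel="alternate" hreflang="de" href="https://www.example.com/de/widgets/">
<link rel="alternate" hreflang="es" href="https://www.example.com/es/widgets/">
```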
Technical SEO | nomad-202323
-
Same Video on Multiple Pages and Sites... Duplicate Issues?
We're rolling out quite a bit of pro video, hosted on a third-party platform/player (likely BrightCove) that also allows us to have the URL reside on our domain. Here is a scenario for a particular video asset:
A. It's on a product page that the video is relevant for.
B. We have an entry on our blog with the video.
C. We have a separate section of our site, "Video Library", that provides a centralized view of all videos. It's there too.
D. We eventually give the video to other sites (bloggers, industry educational sites, etc.) for outreach and link building.
A through C on our domain are all for user experience, as every page is very relevant, but are there any duplicate video issues here? We would likely only have the transcript on the product page (though we're open to suggestions). Any related feedback would be appreciated. We want to make this scalable and done properly from the beginning (we will be rolling out 1000+ videos in 2010).
Technical SEO | SEOPA
-
How to push down outdated images in Google image search
When you do a Google image search for one of my client's products, you see a lot of first-generation hardware (the product is now in its third generation). The client wants to know what they can do to push those images down so that current product images rise to the top. FYI: the client's own image files on their site aren't very well optimized with keywords. My thinking is to have the client optimize their own images, and the ones they give to the media, with relevant keywords in file names, alt text, etc. Eventually, this should help push down the outdated images. Any other suggestions? Thanks so much.
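To illustrate the kind of optimization I mean - a minimal sketch with a hypothetical product name and file paths:

```html
<!-- Before: generic filename, no alt text - gives image search nothing to go on. -->
<img src="/img/IMG_0042.jpg">

<!-- After: descriptive filename and alt text for the current generation.
     (Product name and paths are hypothetical.) -->
<img src="/images/acme-widget-gen3-front.jpg" alt="Acme Widget Gen 3 front view">
```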
Technical SEO | jimmartin_zoho.com