~Delete 26491: difference between revisions

From Wiki Mininuniver
(New: I have a couple of websites, and just not too long ago (this past month), I've noticed that one particular of them is using what I would contemplate excessive bandwidth. Digging a tiny d...)
 
m (Moderator renamed the page Saving bandwidth by stopping spider access to your wordpress weblog. to ~Delete 26491: Spam)
 
(1 intermediate revision by the same user not shown)
Line 1: Line 1:
I have a couple of websites, and just recently (this past month) I've noticed that one of them is using what I would consider excessive bandwidth. Digging a little deeper, it appears that bots are using most of it.<br><br>This is just my personal site – a travel blog on the main domain and a family history site on a subdomain. In the past I have used on average 400MB of bandwidth per month. This month I've had to increase the allowance to 1.5GB, and it will probably still go over (it's currently at 1.38GB). These two sites are not large and don't get many hits, mostly just friends and family. The family history site has a few large images, but nothing excessive.<br><br>Looking at AWStats this month, the blog has used 150MB of bandwidth plus 480MB of bot bandwidth (380MB of that is msnbot). The family history site has used 55MB of bandwidth plus 650MB of bot bandwidth (620MB of that is googlebot).<br><br>Most robots identify themselves with a custom user agent in the request headers, which can easily be blocked with .htaccess.<br><br>There are a number of good articles on this. Let me know if you have any issues; it is a matter of identifying the offending bots/crawlers and banning them as needed.<br><br>About a week ago I used Google's webmaster tools to tell it not to crawl the site as frequently, but it doesn't appear to have made a difference.
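Blocking by user agent, as suggested above, can be sketched in an Apache .htaccess file. This is a minimal example, assuming mod_rewrite is available; the bot names in the pattern are illustrative placeholders, not a vetted blocklist:

```apache
# Refuse requests whose User-Agent matches unwanted bots.
# The names below are examples only - substitute the bots
# identified in your own AWStats / access logs.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilScraper|SpamCrawler) [NC]
RewriteRule .* - [F,L]
```

The `[NC]` flag makes the match case-insensitive, and `[F]` returns 403 Forbidden so the bot gets a short response instead of the full page.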
Short of telling the robots to go away completely via the robots.txt file (it's just my personal website, but it's still nice to be listed in Google!), is there anything else I can do?<br><br>I currently use this on all my websites; basically it blocks bad user agents, bad bots and scrapers. Not only can it save your content from being mass-harvested, it will also save you some bandwidth because of fewer bots running around your [http://www.blogsense-wp.com/news/wordpress-prevent-spider-access/ wordpress spiderspanker] site. Hope it helps.<br><br>I can tell you that Google sends in a great many spiders because of advertising, especially if you are using AdSense on your site along with the various ads from Google's ad network partners – these partners also send their own bots to check your traffic sources and [http://www.blogsense-wp.com/news/wordpress-prevent-spider-access/ spynderspanker review] decide what adverts to place on your site. Google has been hitting hard lately because of the algorithm tweaks and AdSense having had a lucrative month in terms of the number of new advertisers on board.<br><br>Even if you slow down the crawl rate, you will still see a big chunk of bandwidth disappear. The bots are far too intermittent to make proper adjustments unless you want to block them outright.<br><br>There's a [http://www.blogsense-wp.com/news/wordpress-prevent-spider-access/ filter spiders wordpress] new WordPress plugin that can help with this! I've gotten a couple of emails about a product being sold to remove or "spank" the bad spiders that take up lots of bandwidth without adding value to your business, freeing up capacity for real visitors and avoiding problems with hosting limits.
It's called Spyder Spanker, at that name .com, if you want to see the sales page.<br><br>Anyway, I'm not sure whether this is something useful that I need or not. I do see a lot of spider activity in my stats, but I always thought that was sort of a good thing, because it means they are crawling my sites and hopefully indexing them.<br><br>The big danger is the theft of your bandwidth. Some of the spiders sent by spammers will hammer your website as fast as they can, slowing down responses for your human visitors.
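The robots.txt approach mentioned above can be sketched like this. Note that Crawl-delay is honored by msnbot/bingbot but ignored by Googlebot, whose crawl rate is set through Google's webmaster tools instead:

```
# robots.txt - ask well-behaved crawlers to slow down or stay out.
# Crawl-delay is respected by msnbot/bingbot; Googlebot ignores it
# (its rate is configured in Google's webmaster tools).
User-agent: msnbot
Crawl-delay: 10

# Keep all crawlers out of non-content areas of a WordPress install.
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```

Keep in mind robots.txt is advisory: the spammers' spiders described below typically ignore it, so it only reduces bandwidth from legitimate crawlers.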
+
Content removed

Current revision as of 09:38, 26 December 2025

Content removed