CranialBlaze
Expert Member
- Joined
- Jan 24, 2008
- Messages
- 4,025
One of my clients' sites has become quite popular, but unfortunately that is not always a good thing. The site in question is targeted only at South Africans, and I did a very good job with the SEO, so over 1,200 unique visitors come through daily. However, about 40% of them are international, and with the site being hosted locally that amounts to over 40GB of traffic per month, which works out to nearly R10 000 in bandwidth charges.
I went looking for a solution and found a way, using .htaccess, of blocking all IP addresses outside the ZA range. It's quite a long list you need to add to the .htaccess; the only problem is that search engine bots are now also getting the 403 Forbidden page. I tested this .htaccess on an old, unused but still indexed site.
<Limit GET HEAD POST>
order deny,allow
# Country: SOUTH AFRICA
# ISO Code: ZA
# Total Networks: 882
# Total Subnets: 15,870,464
allow from 41.0.0.0/11
allow from 41.48.0.0/13
#
deny from all
</Limit>
Above is an extract from the .htaccess just to give an idea. Looking through the logs and searching the net, I got the user-agent IDs for the most important search engines; the only problem is I don't know how to include them in the file in a way that will actually work.
The main ones seem to be:
User-agent: Teoma
User-agent: ia_archiver
User-agent: msnbot
User-agent: Slurp
User-agent: Googlebot
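One approach (a sketch I haven't tested on this exact setup) would be Apache's mod_setenvif: flag requests whose User-Agent matches one of the bots above, then add an `Allow from env=` line inside the same `<Limit>` block. With `Order deny,allow`, the later allow directives override `deny from all`, so flagged bots get through regardless of their IP. The bot names are taken from the list above; note that user-agent strings can be spoofed, so this is a convenience, not a security control.

```apache
# Flag known search-engine crawlers by their User-Agent string
SetEnvIfNoCase User-Agent "Googlebot"   search_bot
SetEnvIfNoCase User-Agent "msnbot"      search_bot
SetEnvIfNoCase User-Agent "Slurp"       search_bot
SetEnvIfNoCase User-Agent "Teoma"       search_bot
SetEnvIfNoCase User-Agent "ia_archiver" search_bot

<Limit GET HEAD POST>
order deny,allow
deny from all
# Country: SOUTH AFRICA (extract of the full 882-network list)
allow from 41.0.0.0/11
allow from 41.48.0.0/13
# Let any request flagged above through, whatever its IP
allow from env=search_bot
</Limit>
```

This keeps the existing ZA ranges unchanged; the only addition is the `SetEnvIfNoCase` lines and the single `allow from env=search_bot` directive.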
I would greatly appreciate some help fixing up this file, and if anyone wants a copy of it they can just PM me. It could be very useful if you do not need international traffic and are hosted locally: the less bandwidth you waste on visitors who are never going to earn you money, the better.
Thanks a lot for any assistance.