Wondering if you guys know of any local CDNs? All I find is a bunch of doo-hickey-mc-bob when searching for CDNs in SA on Google.
There are a number... but you won't find details about them on Google.
So tell us more if you know of them.
If I told you I'd have to kill you...
Sorry, why bother responding to this thread then if you have no useful info?
Cause his e-penis needed a boost.
I think www.akamai.com are the only ones with a presence in ZA.
What was the point of your initial question?
Did you want to know if more content is moving local or are you looking for a service provider to use?
If it's the former, then I can confirm: heavy media content is being hosted from local servers. If it is the latter... then phone Akamai!
AcidRaZor said: Wondering if you guys know of any local CDNs?
Looking to do a killer webapp, AcidRaZor? I don't think there is one available locally that can hold a candle to the ones overseas (that should be no surprise). Also quite surprised by some of the answers in this thread.
Yes, but I'm not necessarily ruling out international CDNs either. If I can find one that serves locally as well, it would be brilliant. I'm currently looking into Akamai, but I'm probably not going to generate as much traffic as Blizzard would, so I'm not sure what their pricing would look like for someone small.
Obviously you do not know of any CDNs that produce content locally, nor can you be of any help in this regard, as a simple question seems to confuse you too much to muster up any kind of decent response.
You are not going to see any significant benefit doing content distribution within our borders. The vast majority of Internet access backhauls into JHB, and most of the major peering happens in JHB (although CINX is growing quickly).
Placing servers throughout the country would mostly be a waste of money and would probably cost you more than buying a properly specced service on one server. If you really think you are going to overload a single server, then have a chat to Hetzner - they have a load-balancing solution, which this site runs on. Failing that, I can put you in touch with some clever network engineers who will gladly build you a solution - at a price.
The only CDN I know of that is distributed locally is mirror.ac.za and there was some serious cash put into that system.
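Roughly, the load-balancing solution mentioned above just spreads incoming requests across a pool of web servers and skips any server that stops responding. A minimal round-robin sketch of the idea, with made-up backend addresses (this is not Hetzner's actual setup):
Code:
from itertools import cycle

BACKENDS = ["10.0.0.11:80", "10.0.0.12:80", "10.0.0.13:80"]  # hypothetical web servers
healthy = set(BACKENDS)   # a real balancer maintains this via periodic health checks
rotation = cycle(BACKENDS)

def next_backend() -> str:
    """Return the next healthy backend in round-robin order."""
    for _ in range(len(BACKENDS)):
        candidate = next(rotation)
        if candidate in healthy:
            return candidate
    raise RuntimeError("no healthy backends available")

# Five incoming requests are spread across the three servers; if one is
# marked down it simply gets skipped on the next pass (failover).
for _ in range(5):
    print(next_backend())
healthy.discard("10.0.0.12:80")
print(next_backend())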
Even though that might be a form of CDN, it's mostly used for educational purposes (to distribute free open source programs) and does not really cater for commercial business...
As for your (incorrectly) assumed idea that I'd like several servers within the borders of South Africa to distribute the content, please refer to my earlier reply where I explained what a CDN is.
Essentially I'll only need one presence within South Africa, be it in JHB, CPT or Durban; it really does not matter where the traffic comes from, as long as it is served from a CDN locally (and is available internationally).
I almost started going into the more technical areas of why using a CDN gives you more bang for your buck, but you obviously do not (or will not) understand the benefits of using something like that and (I'm willing to bet cold hard cash you're a techie) would rather suggest a pricey, load-balancing, server-based solution.
Screw it... here goes (even if it falls on deaf ears). If you have 100 Mbps on one server serving a 1 MB file for download, your maximum throughput is only about 10 MB/s. Now assume the file is being downloaded over a 4 Mbps ADSL line at 90% of line speed: that keeps the server busy for at least 2.2 seconds per download. The server can then only cater for about 22 simultaneous downloads, which would have an impact on how quickly it can process and send out web requests for the website itself.
Having a load-balanced solution might be the answer, but how does having 5 different servers to upgrade the capacity at which I can serve a 1 MB file help me keep my costs down?
If I have a website on a decent racked server and use a CDN, which already has the infrastructure and invoices me for bandwidth used, the site can take many more hits than it would under the strain of serving the file downloads itself.
mirror.ac.za... are you a student? That will explain even more...
Hi. Good to meet you. Now that we've put the formalities aside, to business..
So, your first assumption, that open source software is only used for educational purposes... well, do you know the saying "assume makes an ass out of you and me"? There are plenty of companies that run their businesses entirely on Linux/open-source architectures. The cost benefit of having a local repository is absolutely phenomenal, and the good that entities like IS and TENET do for the people who *do* use these mirrors is very highly valued.
CDN: Content Delivery/Distribution Network. The Wikipedia definition is quoted below:
"A content delivery network or content distribution network (CDN) is a system of computers containing copies of data, placed at various points in a network so as to maximize bandwidth for access to the data from clients throughout the network. A client accesses a copy of the data near to the client, as opposed to all clients accessing the same central server, thereby causing a bottleneck near that server."
So, a bunch of "servers", "distributed" within the "network". A "local CDN" would then, logically, be a bunch of servers distributed within a local network. That local network would most probably be made up of locations such as SAIX, IS, GamCo, Posix, IMPOL or similar hosters: datacenters whose connectivity is reachable on local-only circuits. Why distributed? To alleviate the connection bottleneck that might arise from using only a single hoster.
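To make the "copy near to the client" part of that definition concrete, here is a toy sketch; the node names, regions and distances are invented, and a real CDN does this selection with DNS tricks, anycast routing or HTTP redirects rather than a lookup table:
Code:
EDGE_NODES = {"jhb-edge": "jhb", "cpt-edge": "cpt", "dbn-edge": "dbn"}  # hypothetical nodes

# Hypothetical "distance" between a client's region and a node's region
# (in practice this would be measured latency or routing topology).
DISTANCE = {
    ("jhb", "jhb"): 0, ("jhb", "cpt"): 2, ("jhb", "dbn"): 1,
    ("cpt", "jhb"): 2, ("cpt", "cpt"): 0, ("cpt", "dbn"): 2,
    ("dbn", "jhb"): 1, ("dbn", "cpt"): 2, ("dbn", "dbn"): 0,
}

def pick_edge(client_region: str) -> str:
    """Send the client to the nearest copy instead of one central server."""
    return min(EDGE_NODES, key=lambda node: DISTANCE[(client_region, EDGE_NODES[node])])

print(pick_edge("cpt"))  # cpt-edge: the Cape Town client never has to cross to JHB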
Well then you just buy international connectivity alongside the local connectivity for the servers on which you build your CDN.
I'm fairly certain ambo understands the advantages of both situations, as well as how a load-balancer solution could still be needed in a CDN scenario (a limited number of Internet-visible IPs, for instance, where you have multiple servers behind the balancer for load distribution as well as failover redundancy).
No, your maximum throughput would be as fast as your hardware could possibly serve it up. If the file is in memory, rather than on a heavily fragmented drive (in a case where the filesystem you use does actually fragment) or on a drive with really slow seek times, then it gets served directly from memory. A quick online search tells me that the bandwidth of DDR2-800 RAM (a likely choice of RAM, still fairly well-priced and easy to use en masse for building cheap, large servers) is 6400 MB/s. That means you would need 512 individual 100 Mbps connections to max out the bandwidth between your RAM controller and your network interface (100 Mbps = 12.5 MB/s, ref: http://www.wolframalpha.com/input/?i=100Mbps), and that is assuming your PCI bus can even handle that sort of throughput. Accessing the file from disk is a wholly different story, but as an aside, many server operating systems cache recently opened files, so a buffered read of them is much quicker, which translates to a much quicker re-serve time on the next request for that file. And if you're at the point where you're running a CDN, the chances of files being re-requested are relatively high.
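To put numbers on the throughput exchange above, here is a quick back-of-envelope calculation. The 10 MB/s port estimate, the 4 Mbps ADSL line and the DDR2-800 bandwidth figure come from the two posts; everything else is illustrative:
Code:
MB = 1.0                      # size of the file being served, in megabytes
link_throughput_MBps = 10.0   # the earlier post's rough estimate for a 100 Mbps port
adsl_MBps = 4 * 0.9 / 8       # a 4 Mbps ADSL line at 90% of sync speed = 0.45 MB/s

seconds_per_download = MB / adsl_MBps                     # ~2.2 s per 1 MB download
concurrent_downloads = link_throughput_MBps / adsl_MBps   # ~22 clients fill the port

ram_bandwidth_MBps = 6400.0   # DDR2-800, as cited in the reply
nic_MBps = 100 / 8            # one 100 Mbps NIC = 12.5 MB/s
nics_to_saturate_ram = ram_bandwidth_MBps / nic_MBps      # = 512 NICs

print(f"{seconds_per_download:.1f} s per download over the ADSL line")
print(f"~{concurrent_downloads:.0f} simultaneous downloads before the port is full")
print(f"{nics_to_saturate_ram:.0f} x 100 Mbps NICs needed to outrun the RAM bandwidth")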
As for how having 5 different servers to upgrade the capacity at which you can serve a 1 MB file helps keep your costs down: see the points above, which cover that as a consequence. The same goes for the decent racked server taking many more hits once the file downloads are handled by a CDN that already has the infrastructure and invoices you for bandwidth used: again, see the points above. And except for the derision you display, the "mirror.ac.za... are you a student?" remark is basically pointless.
Now, if you meant you were looking for space on a CDN that's already locally established, perhaps you should indicate that. Akamai have a local access node hosted at Internet Solutions as of some time ago, and you might find it a good idea to contact them and ask for pricing.
Akamai is present in South Africa; if you need contact details for them, let me know.
Cacheboy CDN is also present in South Africa, though it generally only carries open source content (VLC, Firefox, etc.). Again, if you need contact details, let me know.
I don't know of any others, unfortunately. I should also state explicitly that I am not directly involved with either of these.
Thanks
Andrew
Thank you, Andrew. If you don't mind posting the contact details for their South African sales team here (if one exists), that would be great. Otherwise, if it's just their normal contact details, you can either post them here or PM me, and I'll re-post them and take responsibility for any repercussions.