Bandwidth heavy site... use co-location?
I'm working on a web site that is likely to be very bandwidth-heavy. A major feature of the site when in active use can pull up to 1Mbps for a single session. Luckily, once users get over the new toy factor, the use of this feature will probably be 1-5% or less (probably much less) of session time.
However, new users are likely to play with this feature a good bit, especially at launch. I'm very concerned about bandwidth use.
This is more or less a niche market, so I won't ever be needing to scale to crazy levels like YouTube. However, it is entirely possible for it to be a couple terabytes/month.
Is co-location my best option? What cheap bandwidth services (colocation/hosted/cloud/whatever) are out there?
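To put the numbers from the question in perspective, here is a rough back-of-envelope sketch (assuming the figures above: roughly 2 TB transferred per month, and about 1 Mbps per active session):

```python
# Back-of-envelope bandwidth arithmetic. Figures are the assumptions from the
# question: ~2 TB/month total transfer, ~1 Mbps per active session.
TB = 10**12  # bytes

def avg_mbps(bytes_per_month, days=30):
    """Average sustained throughput (Mbps) implied by a monthly transfer total."""
    bits = bytes_per_month * 8
    seconds = days * 24 * 3600
    return bits / seconds / 1e6

print(round(avg_mbps(2 * TB), 1))  # ~6.2 Mbps averaged over the month
```

So 2 TB/month averages out to only ~6 Mbps sustained; the real sizing question is the peak number of concurrent sessions at ~1 Mbps each, which will be far burstier than the average suggests.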
We have a high-traffic site with a lot of pictures loaded on each page. We have dedicated servers but decided to put the pictures on Amazon S3. It sounds like you may be talking about video files or some other type of large file, which I think would still apply here. Here are some pros and cons (for us):
Pros
Less diskspace needed on our servers
Less bandwidth for our servers
Our log files are significantly smaller
We can easily integrate it with Amazon CloudFront to make the loading even faster for visitors
Cons
It costs a LITTLE more. We could save a little money by having it on our own servers
Less control when they (Amazon) go down... luckily for us, they don't really go down. :)
Other Thoughts
If it isn't media files or large file downloads that you're talking about, my answer and several of the others might not make sense. Give us some more details and we'll do our best to help.
Purely historically:
Back in the days before Facebook games, people were all into browser-based, text-format MMOs.
A relatively new one was Ogame. It was graphics-heavy, with a map system of 9 x 999 pages (9 universes with 999 sectors, with room for 15 planets each, and each planet could have a moon).
The amount of users joining was insane and the amount of traffic even more so.
So what did they do to solve it? They started using a PHP template system and allowed users to host the images and CSS files themselves. All you had to do was click a checkbox and enter the absolute path to the base folder. They would save this in their database, use the HTML <base> element, and the template system set the URI from path/to/image to file:///path/to/image.
After that, all the img links could stay the same. Nothing needed to be downloaded, because the users already had it. Meaning faster page loads for the user (meaning better product reviews) and also lower bandwidth usage for the company hosting the site.
And as an added bonus, they sold it as "make your own custom backgrounds and images, aren't we nice guys for letting you do this?"
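The trick described above can be sketched in a few lines. This is a hypothetical illustration, not Ogame's actual code; the function name and paths are made up for the example:

```python
# Illustrative sketch of the opt-in asset-rewriting technique described above:
# if the user has supplied a local base path, rewrite asset references so the
# browser loads them from disk instead of from the server.
def asset_url(path, user_base=None):
    if user_base:  # user hosts the images/CSS themselves
        return f"file:///{user_base.strip('/')}/{path}"
    return f"/{path}"  # default: asset served by the site

print(asset_url("img/planet.png"))                   # /img/planet.png
print(asset_url("img/planet.png", "home/me/ogame"))  # file:///home/me/ogame/img/planet.png
```

The img tags in the markup never change; only the resolved base differs per user, which is why the server's bandwidth bill drops.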
Depending on the target state/country (or the whole world), I would use several clustered ("cloud") solutions in different locations (the locations' networks should be peered ;-)). On one side you have full control over your CDN, but on the other side you have a lot to do (monitoring, taking care of software and hardware infrastructure, and much more).
So consider "managed" solutions like AWS or similar. There are a lot of CDN/cloud providers that offer a great range of features.
OFFTOPIC: Take a look at Puppet[1] :-)
[1] www.puppetlabs.com/
It sounds like what you need to look at is a CDN such as Amazon CloudFront.
Check out these questions for discussions about the usage of CDNs:
CDN - Content Delivery Networks. How do they work and why would I want to use one?
When did you decide to use a CDN? How did you measure the “success” of using a CDN?
A lot would depend on how many concurrent sessions you are expecting. If more than a few concurrent sessions are likely, then you are going to need something that grants you a 100Mbit connection, or 1Gbit if you expect more than 50.
It will also depend on what sort of resilience you require - if you must have uptime guarantees and other SLAs and/or fail-over systems to take over in case of a problem (because the project is important enough for a short period of down time to be embarrassing) then your options are more limited and your costs will be higher.
If you can separate the large data from the rest of the app, then you do not need to move everything to a new hosting solution. For instance, if the large-bandwidth items are video files, you could rent a dedicated server with good bandwidth somewhere and host them on that. You can get servers on good hosts with decent bandwidth and 100Mbit+ connections surprisingly cheaply these days (I pay /month for a small server with a 10Mbit link that I could saturate in both directions 24/7 if I needed to, so a 100Mbit link with a beefier server attached is not going to be expensive unless you need guaranteed uptime, other SLAs, and/or server management from the hosting provider). If the server is just serving static files (even large ones), you do not need much of a machine in terms of CPU and RAM, just fast drives and bandwidth.
It might also be worth looking at "cloud" hosted solutions or a content delivery network: they might be easier to scale should you under-guess how much bandwidth you need, and they are in theory much more resilient (so you might get a decent uptime guarantee with compensation if they fail to keep to that SLA).
Keeping the bandwidth-hogging action separate in these ways has the added advantage that if the high-bandwidth feature does attract enough attention to make it crawl, that won't block all your other features at the same time.
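When weighing a flat-rate dedicated link against per-GB cloud pricing, it helps to know how much a link can actually move in a month. A quick sketch (30-day month assumed; utilisation is a made-up knob for average load):

```python
# Monthly transfer capacity of a link at a given average utilisation --
# useful when comparing flat-rate links against per-GB cloud egress pricing.
def monthly_tb_at(link_mbps, days=30, utilisation=1.0):
    bits = link_mbps * 1e6 * utilisation * days * 24 * 3600
    return bits / 8 / 1e12  # terabytes

print(round(monthly_tb_at(100), 1))                   # 32.4 TB if saturated 24/7
print(round(monthly_tb_at(100, utilisation=0.1), 1))  # 3.2 TB at 10% average load
```

So even a 100Mbit link running at a modest 10% average utilisation covers the couple-of-terabytes-per-month figure from the question with room to spare.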