It should be noted that a Netflix OpenConnect Hardware Appliance is 100% free of charge for ISPs. The device itself is free; you could factor in the cost of powering the 4U server and the rack space it occupies, but those costs are negligible compared to the additional peering bandwidth consumed by not having one.
From the FAQ[0]
> What does the appliance cost my organization?
> The appliances (and any necessary replacements) are provided
> to participating ISPs free of charge when used within the
> terms of the license agreement.
Other CDNs (Akamai, Level3, etc.) pay Comcast/TimeWarner to host boxes in their datacenters, and to support interconnects to their private CDN networks.
If they start giving it to Netflix for free, how do they continue to charge other companies for the same service? The thing to note is that the established precedent is for companies to pay, because they see value in it. Netflix is trying to get something for free that other companies pay for. Oh, but they slapped the word "Open" on the front of it, so it's about free speech!
The ISPs don't give a shit about downstream bandwidth consumption because they get paid by the CDNs for that. What they do care about is last-mile bandwidth; i.e. the bandwidth between their datacenter and the customers' homes. It's not unlimited and there are constraints. That problem doesn't get solved (in fact, it gets worse) with OpenConnect.
It's also worth noting that a 10GbE flat-rate port to Cogent isn't going to cost them that much either -- and then they don't have to deal with any additional hardware, just a cross-connect. But for some reason, paying an additional ~$6K a month so that millions of your subscribers have better service is too high a price to pay.
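Rough sanity check on how far one flat-rate 10GbE port goes, using the ~1900Kbps average stream bitrate mentioned downthread; this is just arithmetic, not a claim about real concurrency:

    # How many average-bitrate streams fit on one 10Gbps port?
    # Uses the ~1900Kbps average Netflix bitrate cited downthread.
    port_kbps = 10 * 1_000_000        # 10Gbps expressed in Kbps
    avg_stream_kbps = 1900
    print(port_kbps // avg_stream_kbps)  # ~5263 simultaneous streams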
10 gigabit can't come close to covering the growing demand, no?
During primetime hours, Netflix represents some substantial fraction of total U.S. internet traffic. If you're Comcast, running 25% of the U.S. broadband market, you're seeing a big chunk of that traffic. What if Netflix decides to double their max streaming rate? Or quadruple it? We can argue again about what it means that Comcast sells you a 20Mbit-max connection, but as a practical matter, it will cost them a lot more than $6K/month to keep up with Netflix's growth in subscribers, average daily viewership and stream quality.
10Gbps isn't a lot in terms of total capacity or utilization, but even an extra 100Kbps of burstable bandwidth per user can greatly improve the experience. It's the difference between buffering every 30 seconds and watching with only minor stuttering. Also, per-Mbps bandwidth costs only go down as you get bigger. $6K/month is what you pay if you're a small business; Comcast or Verizon would get high-volume pricing (putting their cost somewhere around ~$3.5K/month). A 50Gbps commit with another 50Gbps at 95th-percentile pricing would cost them a fraction of a cent per customer but would make people much happier.
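Back-of-the-envelope on that "fraction of a cent" claim, using the ~$3.5K per 10Gbps figure above; the burst billing and subscriber numbers are assumptions for illustration:

    # Per-subscriber cost of a 50Gbps commit plus 50Gbps of burst.
    # Prices and the 20M subscriber count are assumed, not from the thread.
    commit_cost = 5 * 3500             # 50Gbps commit at ~$3.5K per 10Gbps
    burst_cost = 7500                  # assumed 95th-percentile burst billing
    subscribers = 20_000_000
    print((commit_cost + burst_cost) / subscribers)  # 0.00125 -> about 1/8 of a cent per month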
Now they'll never do it, because they'd rather keep that ~$25K.
100Kbps/subscriber is around 200Gbps for Comcast. And that's just 100Kbps! Netflix is going to keep telling their customers they're getting a suboptimal HD experience (and they'll be right!) until the average stream gets somewhere near Blu-ray's ~20Mbps. Maybe half that if you think codecs are going to get better, but by then 4K will be coming down the pike.
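For what it's worth, the 200Gbps figure works out if you assume roughly 2 million subscribers streaming at once; that concurrency number is my assumption, not something stated above:

    # 100Kbps of extra headroom per concurrent viewer, in aggregate.
    concurrent_viewers = 2_000_000     # assumed simultaneous streams
    extra_kbps_each = 100
    print(concurrent_viewers * extra_kbps_each / 1_000_000)  # 200.0 Gbps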
The point is that for Comcast and Verizon, there's no end in sight: A few more ports connected in a few more data centers isn't going to do more than make a few subscribers slightly happier for a few months.
It doesn't have to be dedicated, just burstable to a reasonably large number of subscribers at a time. Netflix will prefetch data, and a small amount of burstable bandwidth goes a long way in viewing quality. This isn't about letting everyone stream SuperHD content at the same time; this is about relieving some pressure around the average bitrate (somewhere around 1900Kbps).
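A small sketch of why a bit of burst headroom helps prefetching, using the ~1900Kbps average bitrate above; the buffer size and headroom values are illustrative:

    # Time to download a 30-second buffer at various amounts of headroom
    # above the ~1900Kbps average playback bitrate.
    bitrate_kbps = 1900
    buffer_kbits = bitrate_kbps * 30   # 30 seconds of video
    for headroom_kbps in (0, 100, 500):
        secs = buffer_kbits / (bitrate_kbps + headroom_kbps)
        print(headroom_kbps, round(secs, 1))
    # 0 -> 30.0s (you never get ahead of playback), 100 -> 28.5s, 500 -> 23.8s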
[0] (PDF) https://secure.netflix.com/us/layout/signup/deviceinfo/OpenC...