Will there ever be enough bandwidth?

‘If content is king, then distribution is God Almighty’, says the mouthpiece for the NBN.

A lot has changed since Marshall McLuhan coined the phrase ‘the medium is the message’, but perhaps the relationship between content and carrier is just the same as ever.

Landry Fevre is General Manager for Media, Commercial Management at NBN Co and, by his own reckoning, the fourth longest-serving employee (rising to third with the imminent departure of CEO Mike Quigley). A better ambassador for the virtues of the fastest, broadest, most ubiquitous broadband service would be hard to find, and it’s hard to believe that anyone listening to his presentation could fail to be impressed – though admittedly he made no mention of the cost of the rollout.

But Fevre’s keynote address to the SMPTE 2013 conference – entitled ‘The Future Landscape of Media Entertainment’ – went well beyond a technical discussion of the benefits of faster downloads. Instead he presented a staggering wealth of metrics, tracing the development of computing power, delivery methods, bandwidth and applications over the past 30-odd years. Extrapolating the trends forward just three or four years, he demonstrated the ongoing explosion of alternative distribution models for content of all forms, both driving and driven by increased bandwidth.

Some statistics: the NBN is a wholesale carriage network, so customers will be connected to an ISP as at present, and the ISP will buy bandwidth from NBN Co. Current plans will have about 93% of premises connected directly by fibre, with the remainder served by fixed wireless or satellite, and the entire territory of Australia will be covered, from the Cocos Islands to Lord Howe Island. Fevre estimated that about 80% of the network will run underground, the remainder on poles. The project uses some 160,000 km of fibre, and so far 200,000 premises have been passed, but the rate is accelerating and is planned to peak at 1.2 million premises per year. Take-up, too, was slow at first but is accelerating: about a third of users are on 100Mbps connections, the rest slower, and 1Gbps will become available at the end of this year.

Apart from the domestic broadband entertainment applications that are becoming familiar (such as internet, TV and voice over IP), connections will be used for public services such as government, health and education; for monitoring; and for business applications. Among the many examples he listed were health applications such as tele-medicine (driven by cost avoidance rather than by new medical techniques) and education by way of the Virtual Classroom.

He noted three drivers of technical innovation, and showed a graph (one of many) illustrating the sudden surge in computing power that started around 1998; the sudden surge in storage capacity that started around 2000; and internet speeds, which have taken a couple of steps up over the past decade but are now steady, poised, he suggested, to surge with the availability of fibre connectivity. The smaller bumps in the speed graph, he said, enabled the successful growth bursts of iTunes in 2003, Google Maps in 2005, Skype in 2009 and, he predicted, 4K TV in 2013 (in the US, perhaps: no Australian broadcaster has such plans). His conclusion is that innovation is currently constrained by stagnant internet speeds, but can take off once that brake is released.

One factor missing from simplistic estimates of bandwidth demand is the growth in the number of connected devices per household: in 2009 there were, on average, two – typically both PCs. Today there are five to six devices, and there could be seven or eight by 2016. Only a quarter of these would be PCs, with smartphones making up 33% and tablets 22%; internet-connected TVs and games devices make up the balance.

The battle for content providers’ revenue has ever more contenders: in addition to the traditional free-to-air and Pay-TV providers, screens (of all sorts) are now filled from Telstra’s T-Box, Fetch TV and Netflix, and other OTT services, including those from traditional broadcasters such as ABC iView. (‘Over-The-Top’ refers not to outrageous content but to TV content delivered by a content provider via a third-party service provider, to any of a wide range of devices, from internet-connected TVs to tablets and PlayStations.) Fevre noted that in the US in 2012, DVDs had fallen to just 21% of on-demand viewing, while DVRs provided 42% and streamed Video-on-Demand was up to 35%. However, he noted with some concern that just one company, Netflix, dominated with a 32% share of the total on-demand market.

Fevre noted that different video delivery models had different strengths: traditional broadcast, for example, was cheap and efficient to deliver and had ubiquitous reach, but was limited in viewer choice, while IPTV was relatively low-cost but provided much more viewer flexibility.

Some see opportunities in the new, flexible technologies to present extra-high-quality material to niche audiences. Fevre referred to players in the arts sector who fancied building their own channels, but he warned that the overall business of a ‘channel’ involved more than simply playing out content. Who would build an audience, he asked? Who would design a compelling app? Who would manage the marketing, and the delivery capability?

Nevertheless, he noted a number of opportunity drivers for OTT providers: very competitive costs for content; rapidly rising broadband speeds; and the gap between Pay-TV penetration and broadband penetration (broadband currently reaches twice as many households as Pay-TV).

Fevre provided many more observations, all supporting his premise that there was no such thing as too much broadband speed: every trend he could pick was pushing at the limits.

We asked if he could think of any earlier predictions of technological developments or applications that had not been fulfilled: all he could suggest was that the download limits in early broadband plans had not stuck, but had rapidly been exceeded. In summary, he suggested that the key themes to watch were: ‘direct to fan’ delivery of sports, reality TV and premium TV; the number and quality of screens, with multi-screen viewing and more devices in the home; and shifts in content creation styles (here Fevre threw in the term ‘hyper-serialisation’*) leading to changes in viewer behaviour.

In the Q&A session afterwards came the inevitable question about the future of the NBN if the government changed. ‘I’ve been briefed on this,’ Fevre confided, before providing his prepared answer: the government provides a ‘Statement of Expectations’, NBN Co works to that statement, and it would work to a new statement if a new government provided one.

He felt that while the 2010 election had been a make-or-break moment for the NBN, there was little doubt that it would survive this time around. However, if a new government altered the brief, all existing contracts (especially those with Telstra) would have to be renegotiated, which he said (with some lack of enthusiasm) could cause significant delays in the rollout.

*Your reporter confesses he had to look up Fevre’s throwaway term ‘hyper-serialisation’. Whereas series drama was, until recently, produced in self-contained episodes so that networks could re-run them in any order, now that complete seasons can be downloaded (or bought in box sets from the video store), episodes are written in a very deliberate sequence, each one with a compelling hook into the next. Forget about waiting a week to watch the next episode – you want to see it now – and you can.

That is, apparently, hyper-serialisation, and it’s an almost perfect example of the McLuhan principle: the medium has changed, and with it the message, the story, and how we watch it.


Dominic Case
About the Author
Dominic Case was the Technology Manager for the Atlab Group for many years, and on the Board of the AFC during the period that the NFSA was a part of that organisation. He also worked for the NFSA briefly as head of the Film Branch, and for a year as Development Manager. He gave a paper at the last SMPTE conference on the difficulties faced by film archives in the digital era.