Friday, March 18, 2016

For or against Adaptive Bit Rate? Part V: centralized control

I have seen much speculation and many claims over the last few weeks around T-Mobile's Binge On service launch, and these have accelerated with yesterday's announcement of Google Play and YouTube joining the service. As usual, many are getting on their net neutrality battle horse, using flawed assumptions and misconceptions to reject the initiative.

I have written at length about what ABR is and what its pros and cons are; you can find some extracts in the links at the end of this post. I'll try here to share my views and lay out some facts to enable a more pragmatic approach.

I think we can safely assume that every actor in the mobile video delivery chain wants to enable the best possible experience for users.
As I have written in the past, in the current state of affairs, adaptive bit rate is oftentimes subverted in order to seize as much network bandwidth as possible, which results in devices and service providers competing aggressively for bits and bytes.
Content providers assume that the highest content quality (1080p HD video, for instance) equals the maximum experience for the subscriber, and therefore try to capture as much network resource as possible to deliver it. Browser, app and phone manufacturers also assume that more speed equals better user experience, and therefore try to commandeer as much capacity as possible. The flaw here is the assumption that the optimum is the product of many maxima, self-regulated by an equal and fair apportioning of resources. This shows a complete ignorance of how networks are designed, how they operate and how traffic flows through them.

An OTT cannot know why a user's session downstream speed is degrading; it can only report it. Knowing why matters because it enables better decisions about the corrective actions needed to preserve the user's experience. A reduction of bandwidth for a particular user can be the result of a handover (4G to 3G, or between cells of different capacity), of congestion in a given cell, of the distance between the phone and the antenna, of the user entering a building or an elevator, or of the user reaching her data cap and being throttled, etc. The reasons can be multiple, and for each of them a given corrective action can have a positive or a negative effect on the user's experience.

For instance, in a video streaming scenario, you can have a group of people in a given cell streaming Netflix and others streaming YouTube. Naturally, the video is streamed in a progressive download, adaptive bit rate format, which means that each stream will try to climb to the highest available download bit rate to deliver the highest video definition possible. All sessions will theoretically increase the delivered definition up to the highest definition available or the highest delivery bit rate available, whichever comes first. In a network with ample capacity, everyone ramps up to 1080p and everyone has a great user experience.
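To make that ramp-up dynamic concrete, here is a minimal sketch of the throughput-based rate selection a typical ABR client runs at each segment boundary. The ladder values and the 0.8 safety margin are illustrative assumptions of mine, not any specific player's actual settings:

```python
# Minimal sketch of throughput-based ABR rate selection.
# Ladder values and safety margin are illustrative assumptions.

LADDER = [  # (definition, video bitrate in kbps)
    ("240p", 400),
    ("360p", 750),
    ("480p", 1500),
    ("720p", 3000),
    ("1080p", 6000),
]

def select_definition(measured_throughput_kbps, safety_margin=0.8):
    """Pick the highest rung whose bitrate fits under the measured
    download throughput, keeping a safety margin."""
    budget = measured_throughput_kbps * safety_margin
    chosen = LADDER[0]  # never go below the lowest rung
    for definition, bitrate in LADDER:
        if bitrate <= budget:
            chosen = (definition, bitrate)
    return chosen

# Each client re-runs this on every segment: with plenty of
# capacity, every session converges on 1080p.
print(select_definition(8000))   # ('1080p', 6000)
print(select_definition(1200))   # ('360p', 750)
```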

More often than not, though, a given cell cannot accommodate everyone's stream at the highest definition at the same time. Adaptive bit rate is supposed to help here again, by stepping the definition down until it fits within the available delivery bit rate. Unfortunately, it cannot work like that when multiple sessions from multiple OTTs share the cell. As soon as one player starts reducing its definition to fit a lower delivery bit rate, the freed-up bandwidth is grabbed by the other players, which can now increase their definition even further. There is no incentive for a content provider to reduce its bandwidth quickly to follow network conditions, because it can end up starved by its competition in the same cell.
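A toy model makes the dynamic visible. Assuming, purely for illustration, that the cell's capacity is shared in proportion to how aggressively each session pulls data (a crude stand-in for TCP contention, not a real radio scheduler), the first player to step down sees its own throughput collapse while the others absorb the freed capacity:

```python
# Toy illustration: proportional sharing punishes the first
# player to step down. The proportional-to-demand model is an
# assumption, not how any actual scheduler works.

CAPACITY = 10000  # kbps available in the cell

def shares(requested):
    """Split cell capacity in proportion to requested bitrates."""
    total = sum(requested)
    return [round(CAPACITY * r / total) for r in requested]

# Four players all pulling at 1080p rates (6000 kbps each):
print(shares([6000, 6000, 6000, 6000]))  # [2500, 2500, 2500, 2500]

# Player 0 politely steps down to 480p (1500 kbps)...
print(shares([1500, 6000, 6000, 6000]))  # [769, 3077, 3077, 3077]
# ...its measured throughput drops below even its reduced rate,
# while the greedy players absorb the freed capacity.
```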

The solution here is simple: the delivery of ABR video content has to be managed and coordinated across all providers. The only way and place to provide this coordination is in the mobile network, as close to the radio resource as possible. [...]
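As a sketch of what such coordination could look like (my illustration only, not a description of any shipping product): a controller sitting near the radio, knowing the cell's usable capacity and the sessions' bitrate ladders, could apportion rungs evenly instead of letting sessions fight:

```python
# Minimal sketch of a cell-level ABR coordinator: raise quality
# evenly across sessions while respecting cell capacity.
# Purely illustrative; real radio schedulers are far richer.

LADDER_KBPS = [400, 750, 1500, 3000, 6000]

def coordinate(n_sessions, cell_capacity_kbps):
    """Give every session the lowest rung, then raise sessions one
    rung at a time, round-robin, while capacity allows."""
    levels = [0] * n_sessions          # index into LADDER_KBPS
    used = n_sessions * LADDER_KBPS[0]
    upgraded = True
    while upgraded:
        upgraded = False
        for i in range(n_sessions):
            if levels[i] + 1 < len(LADDER_KBPS):
                cost = LADDER_KBPS[levels[i] + 1] - LADDER_KBPS[levels[i]]
                if used + cost <= cell_capacity_kbps:
                    levels[i] += 1
                    used += cost
                    upgraded = True
    return [LADDER_KBPS[l] for l in levels], used

caps, used = coordinate(4, 10000)
print(caps, used)  # [3000, 3000, 1500, 1500] using 9000 of 10000 kbps
```

Every session gets a sustainable rate, and no one is starved by a neighbor's greed; that is the incentive unilateral client-side adaptation cannot provide.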

This and more in my upcoming Mobile Edge Computing report.


Part I: What is ABR?
Part II: For ABR
Part III: Why isn't ABR more successful?
Part IV: Alternatives

Tuesday, March 15, 2016

Mobile QoE White Paper




Extracted from the white paper "Mobile Networks QoE" commissioned by Accedian Networks. 

2016 is an interesting year in mobile networks. Maybe for the first time, we are seeing tangible signs of evolution from digital services to mobile-first. As was the case for the transition from traditional services to digital, this evolution causes disruptions and new behavior patterns in the ecosystem, from users to networks to service providers.
Take for example social networks. 47% of Facebook users access the service exclusively through mobile and generate 78% of the company's ad revenue. In video streaming services, YouTube sees 50% of its views on mobile devices, and 49% of Netflix's 18-to-34-year-old demographic watches it on mobile.
This extraordinary change in behavior causes unabated traffic growth on mobile networks as well as changes in the traffic mix. Video is becoming the dominant use, one that pervades every other aspect of the network. Indeed, all involved in the mobile value chain have identified video services as the most promising revenue opportunity for next generation networks. Video services are rapidly becoming the new gold rush.


“Video services are the new gold rush”
Video is essentially a very different animal from voice or even other data services. While voice, messaging and data traffic can be predicted fairly accurately as a function of the number and density of subscribers, time of day and busy hour patterns, video follows a less predictable growth. There is a wide disparity in consumption from one user to another, and this is not only due to their viewing habits. It is also a function of their device's screen size and resolution, the network they are using and the video services they access. The same video, viewed on a social sharing site on a small screen, in full HD or in 4K on a large screen, can have a 10-20x impact on the network, for essentially the same service.
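Back-of-the-envelope numbers make the point. The bitrates below are illustrative round figures of my choosing, not any service's published specifications:

```python
# Rough data volume of one hour of video at typical streaming
# bitrates (illustrative round numbers only).

profiles_kbps = {
    "small screen / social (240p)": 400,
    "SD (480p)": 1500,
    "Full HD (1080p)": 6000,
    "4K / UHD": 16000,
}

for name, kbps in profiles_kbps.items():
    gb_per_hour = kbps * 3600 / 8 / 1e6  # kbps -> GB per hour
    print(f"{name:>30}: {gb_per_hour:5.2f} GB/hour")

# Full HD vs the small-screen profile is already a 15x difference
# for the same content; with 4K the spread grows even larger.
```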


Video requires specialized equipment to manage and guarantee its quality in the network; otherwise, when congestion occurs, there is a risk that it consumes resources, effectively denying voice, browsing, email and other services fair (and necessary) access to the network.
This unpredictable traffic growth results in exponentially growing costs for networks to serve the demand.
As mobile becomes the preferred medium to consume digital content and services, Mobile Network Operators (MNOs), whose revenue was traditionally derived from selling "transport," see their share squeezed as subscribers increasingly value content and have more and more options for accessing it. The double effect of the MNOs' decreasing margins and increasing costs forces them to rethink their network architecture.
New services on the horizon, such as Voice and Video over LTE (VoLTE & ViLTE), augmented and virtual reality, wearables and IoT, automotive and M2M, will not be achievable technologically or economically with current networks.

Any architecture shift must not simply increase capacity; it must also improve the user experience. It must give the MNO granular control over how services are created, delivered, monitored and optimized. It must make the best use of capacity in each situation, to put the network at the service of the subscriber. It must make QoE, the single biggest differentiator within their control, the foundation for network control, revenue growth and subscriber loyalty.
By offering an exceptional user experience, MNOs can become the access provider of choice, part of their users' continuously connected lives as their trusted curator of apps, real-time communications and video.


“How to build massively scalable networks while guaranteeing Quality of Experience?”

As a result, the mobile industry has embarked on a journey to design tomorrow's networks, borrowing heavily from the changes that have revolutionized enterprise IT departments with SDN (Software Defined Networking), and innovating with 5G and NFV (Network Functions Virtualization), for instance. The target is to emulate some of the essential attributes of innovative service providers such as Facebook, Google and Netflix, who have had to innovate and solve some of the very same problems.


QoE is rapidly becoming the major battlefield upon which network operators and content providers will differentiate and win consumers' trust. Quality of Experience requires a richly instrumented network, with feedback telemetry woven through its fabric to anticipate, detect and measure any potential failure.
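As a minimal sketch of what that telemetry could feed (the metrics, weights and thresholds below are my own illustrative choices, not a standard formula), a per-session video QoE score might combine startup delay, rebuffering and delivered bitrate:

```python
# Minimal sketch of a per-session video QoE score built from
# telemetry a network probe could report. Metric choice, weights
# and thresholds are illustrative assumptions only.

def qoe_score(startup_delay_s, rebuffer_ratio, avg_bitrate_kbps,
              target_bitrate_kbps=3000):
    """Return a 0-100 score penalizing slow starts and stalls."""
    score = 100.0
    score -= min(startup_delay_s * 5, 25)     # slow startup
    score -= min(rebuffer_ratio * 200, 50)    # stalls hurt the most
    quality = min(avg_bitrate_kbps / target_bitrate_kbps, 1.0)
    score -= (1.0 - quality) * 25              # low definition
    return max(score, 0.0)

print(qoe_score(1.0, 0.00, 3000))  # 95.0  healthy session
print(qoe_score(4.0, 0.05, 1200))  # 55.0  degraded session
```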

Tuesday, March 8, 2016

Standards approach or Open Source?


[...] Over the last few years, wireless networks have started to adopt enterprise technologies and trends. One of these trends is the open source collaborative model where, instead of creating a set of documents to standardize a technology and leaving vendors to implement their interpretation, a collective of vendors, operators and independent developers creates source code that can be augmented by all participants.

Originally started with the Linux operating system, the open source development model allows anyone to contribute, use, and modify source code that has been released by the community for free.

The idea is that a meritocratic model emerges, where feature development and overall technology direction are the result of the community's interest. Developers and companies gain influence by contributing, in the form of source code, blueprints, documentation, code reviews and bug fixes.

This model has proven beneficial in many cases for the creation of large software environments, ranging from operating systems (Linux) to HTTP servers (Apache) and big data (Hadoop), that have been adapted by many vendors and operators for their benefit.

The model provides the capacity for the creation and adoption of new technologies in a cost-effective manner, without necessarily requiring a large in-house developer group.
On the other hand, many companies find that the best-effort collaborative environment is not necessarily the most efficient model when the contributors come from very different backgrounds and business verticals.

While generic server operating systems, database technologies and HTTP servers have progressed rapidly and efficiently through the open source model, this is mostly because these are building-block elements designed to do only a fairly limited set of things.

SDN and NFV are fairly early in their development for mobile networks, but one can already see that the level of complexity and specificity of the mobile environment does not lend itself easily to the adoption of generic IT technology without heavy customization.

In 2016, open source has become a very trendy buzzword in wireless, but the reality is that the ecosystem is still trying to understand and harness the model for its purposes. Wireless network operators are used to collaborating in fairly rigid and orthodox environments such as ETSI and 3GPP. These standardization bodies have been derided lately as slow producers of ineffective documentation, but they have been responsible for the roll-out of four generations of wireless networks and for the interoperability of billions of devices, in hundreds of networks, with thousands of vendors.

Open source is seen by many as a means to accelerate technology invention, with its rapid iteration process and its low documentation footprint. Additionally, it produces actual code that is pre-tested and integrated, leaving little room for ambiguity as to its intent or performance. It creates a very handy level playing field from which to start building new products and services.

The problem, though, is that many operators and vendors still treat open source in wireless as they did the standards, expecting a handful of contributing companies to do the heavy lifting of strategy, design and coding, and placing change requests and reviews after the fact. This strategy is unlikely to succeed. The companies and developers involved in open source coding are in it for their own benefit. Of course they are glad to contribute to a greater ecosystem by creating a common-denominator layer of functional capabilities, but in parallel they are busy augmenting the mainline code with their own customizations and enhancements to market their products and services.


One of the additional issues with open source in wireless, for SDN and NFV, is that there is actually very little that is designed specifically for wireless. SDN, OpenStack, VMware, OpenFlow… are mostly defined for general IT, and you are more likely to find an insurance company, a bank or a media company at OpenStack forums than a wireless operator. The consequence is that while network operators can benefit from implementations of SDN or OpenStack in their wireless networks, the technology has not been designed for telco-grade applicability, and the chances of it evolving this way are slim without a critical mass of wireless-oriented contributors. Huawei, ALU and Ericsson are all very present in these forums and are indeed contributing greatly, but I would not rely on them too heavily to introduce the features necessary to ensure vendor agnosticism...

The point here is that merely being a customer of open source code is not going to create any added value without actual development. Mobile network operators and vendors that are on the fence regarding open source movements need to understand that this is not a spectator sport: active involvement is necessary if they want to derive differentiation over time.

Tuesday, March 1, 2016

Mobile World Congress 16 hype curve

Mobile World Congress 2016 was an interesting show in many respects. Here are some of my views on the most and least hyped subjects, including mobile video, NFV, SDN, IoT, M2M, augmented and virtual reality, TCP optimization, VoLTE and others.

First, let's start with mobile video, my pet subject, as some of you might know. In 2016, half of Facebook's users are exclusively mobile, generating over three quarters of the company's revenue, while half of YouTube's views are on mobile devices and nearly half of Netflix's under-34 members watch from a mobile device. There is mobile and there is mobile, though, and a good two thirds of these views occur on wifi. Still, internet video service providers see themselves becoming mobile companies faster than they thought. The result is increased pressure on mobile networks to provide fast, reliable video services, as 2K, 4K and 360-degree video, augmented reality and virtual reality are next on the list of services to appear. This continues to create distortions in the value chain, as encryption, ad blocking, privacy, security, net neutrality, traffic pacing and prioritization are being used as weapons of slow attrition by traditional and new content and service providers. On the network operators' side, many have deserted the video monetization battlefield. T-Mobile's Binge On seems to be giving MNOs pause for reflection on alternative models for video services cooperation. TCP optimization has been running hot as a technology for the last 18 months, and Teclo Networks was acquired by Sandvine on the heels of this year's congress.

Certainly, I have felt a change of pace and tone in many announcements, with hyperbolic NFV claims subsiding somewhat compared to last year. Specifically, we have seen several vendors' live deployments, but mostly revolving around launches of VoLTE, virtualized EPC for MVNOs, enterprises or verticals, and the ubiquitous virtualized CPE, with still little in terms of multi-vendor, generic-traffic NFV deployments at scale. Talking about VoLTE, I now have several pieces of anecdotal evidence from Europe, Asia and North America that the services commercially launched are well below expectations in terms of quality and performance against circuit-switched voice.
The lack of maturity of orchestration standards is certainly the chief culprit here, hindering progress towards open, multi-vendor service automation.
Proof can be found in the flurry of vendor "ecosystems": if everyone works so hard to be in one, and each vendor has its own, it underlines the market fragmentation rather than reducing it.
An interesting announcement showed Telefonica, BT, Korea Telecom, Telekom Austria, SK, Sprint and several vendors taking a sheet from OPNFV's playbook and creating probably one of the first open source projects within ETSI, aimed at delivering a collaborative MANO project.
I have been advocating for such a project for more than 18 months, so I certainly welcome the initiative, even if ETSI might not feel like the most natural place for an open source project.

Overall, NFV feels more mature, but still very much disconnected from reality: a solution looking for problems to solve, with little in terms of new service creation. If all the hoopla leads only to cloud-based VPNs, VoLTE and cheaper packet core infrastructure, the business case remains fragile.

The SDN announcements were somewhat muted, but showed good progress in SD-WAN and SD data center architecture, with the recognition, at last, that specialized switches will likely still be necessary in the short to medium term if we want a high-performance software-defined fabric, even if that impacts agility. The compromises are a sign of the market maturing, not of a failure to deliver on the vendors' part, in my opinion.

IoT and M2M were still ubiquitous and vague, depicted alternately as the next big thing or as already here. The market fragmentation in terms of standards, technology, use cases and understanding leads to baseless, fantasist claims from many vendors (and operators) on the future of wearables, autonomous transport and connected objects... with little in terms of evidence of a coherent ecosystem forming. It is likely that a dominant player will emerge and provide a top-down approach, but the business case seems to hinge on killer apps that will need next generation networks to be fulfilled.

5G was on many vendors' lips as well, even if it seems to mean different things to different people, including MIMO, beamforming, virtualized RAN... What was clear, from my perspective, was that operators are ready at last to address latency (as opposed to, or in complement of, bandwidth) as a key resource and attribute with which to differentiate services and their associated network slices.

Big Data slid right down the hype curve this year, with very little in terms of announcements or even references in vendors' product launches or deployments. It now seems a given that any piece of network equipment, physical or virtual, must generate rivulets of data that stream into rivers and data lakes, to be avidly aggregated and correlated by machine learning algorithms that provide actionable insights in the form of analytics and alerts. Vendors show progress in reporting, but true multi-vendor, holistic analytics remains extremely difficult, due to the fragmentation of vendors' data attributes and to the necessity of having data scientists and subject matter experts work together to separate actionable insights from false positives.

On the services side, augmented and virtual reality were revving up to the next hype phase, with a multitude of attendees walking blindly with goggles and smartphones stuck to their faces... not the smartest look, and unlikely to move past the novelty stage until integrated into less obtrusive displays. On the AR front, convincing use cases are starting to emerge, such as furniture shopping (where you can see and position furniture in your home by superimposing it from a catalogue app), that are pragmatic and useful without being too cumbersome. Anyone who has had to shop for furniture and send it back because it did not fit, or because the color wasn't really the same as the room's, will understand.
Ad blocking certainly became a subject of increased interest, as operators and service providers are still struggling for dominance. As encrypted data traffic increases, operators are starting to explore ways to provide services that users see as valuable, and if these hurt some of the OTTs' business models, that is certainly an additional bargaining chip. The melding and reforming of the mobile value chain continues and accelerates, with increased competition, collaboration and coopetition, as MNOs and OTTs look for a settling position. I have recently ranted about what's wrong with the mobile value chain, so I will spare you here.

Lastly, my personal interest project this year revolves around Mobile Edge Computing. I have started production on a report on the subject. I think the technology has the potential to unlock many new services in mobile networks, and I can't wait to tell you more about it. Stay tuned for more!