What NFV needs now: Dynamic composition and automated interoperability

by Dave Duggal, January 27, 2015


In my last post, Orchestration is not enough for NFV, I contrasted the requirements of NFV with the capabilities of conventional orchestration. The point was that old siloed, static and centralized methods simply don't support the demands of dynamic, diverse and distributed networking. It's time for something new!

There is a fairly broad understanding of the business drivers for network functions virtualization (NFV) and software-defined networking (SDN):

  • DevOps automation to lower costs and improve productivity;
  • service velocity to accelerate time-to-market with new value-added services; and
  • business agility to respond to competitive threats and seize opportunities.

Technically, both movements apply the same well-understood technique to enhance control of the network: separation of concerns. SDN de-couples the control plane from the data plane, while NFV de-couples network functions from network devices. Voila! The network is programmable from a higher level of abstraction.

Orchestration for a new virtual world 

This is great conceptually, but to cross the chasm to implementation we need to efficiently, safely and scalably recompose these newly de-coupled components to reliably deliver network services in a multi-vendor, multi-technology and multi-layer environment.

This is not simply a conventional orchestration question. If we take that view, we’ll simply re-integrate the network back into tightly-coupled services, replacing physical spaghetti (hardwired and custom silicon solutions) with virtual spaghetti (hardcoded and custom software solutions), at great cost and risk, and with little benefit.

The point of separating the concerns is that we can better optimize the use of the network with flexible and dynamic compositions. To achieve the business objectives of SDN and NFV we need to technically enable real-time, data-driven, policy-controlled integration of vendors’ virtual network functions (VNFs), industry protocols and device controllers.

However, this is not just a change in networking; it’s a phase change in software architecture. If you want to control what you connect and when, you first need to automate interoperability itself.

Automating interoperability

Instead of integrating services in advance at design-time, services should be systematically late-bound at run-time as directed by policies and interaction-specific metadata in order to contextualize service delivery. This represents a dramatic shift from the syntactic integration of conventional orchestration to semantic interoperability.

With semantic interoperability, elements are not integrated to each other to form siloed applications; rather, loosely coupled objects are connected by references to an information model. This approach enables applications to be assembled based on abstract contracts, which are logical models of a service with metadata references.

Note that objects are de-coupled from contracts just as SDN and NFV separate concerns, and for the same purpose – control from a higher level of abstraction. Now management intent can be separated from infrastructure complexity, so the business can declaratively describe the policies it wants to effect (what service, function, conditions and so on) without having to understand low-level implementation details. In effect, the business is just describing logic over endpoints.
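The contract-over-endpoints idea can be sketched in a few lines of Python. Everything below, the model entries, the contract fields and the resolve helper, is a hypothetical illustration of the pattern, not any vendor's API:

```python
# Sketch of an "abstract contract": a declarative statement of intent
# (what service, under which policy) that carries only *references* into
# an information model, with no endpoints or vendor bindings. All names
# here are illustrative, not a real NFV interface.

# The information model maps named concepts to currently available
# concrete implementations. Implementations can be swapped at any time
# without touching the contracts that reference them.
information_model = {
    "firewall": [
        {"vendor": "A", "endpoint": "10.0.0.5:8080", "throughput_gbps": 10},
        {"vendor": "B", "endpoint": "10.0.0.9:8080", "throughput_gbps": 40},
    ],
}

# The abstract contract: pure intent, expressed over model references.
vpn_contract = {
    "service": "managed-vpn",
    "functions": ["firewall"],              # references, not endpoints
    "policy": {"min_throughput_gbps": 20},
}

def resolve(contract, model):
    """Late-bind each referenced function to a concrete implementation
    that satisfies the contract's policy, at the moment of the request."""
    bindings = {}
    for fn in contract["functions"]:
        candidates = [
            impl for impl in model.get(fn, [])
            if impl["throughput_gbps"] >= contract["policy"]["min_throughput_gbps"]
        ]
        if not candidates:
            raise LookupError(f"no implementation satisfies policy for {fn!r}")
        bindings[fn] = candidates[0]
    return bindings

bindings = resolve(vpn_contract, information_model)
```

Here the 20 Gbps policy steers the binding to vendor B's implementation; if vendor B is later replaced in the model, the same unmodified contract simply resolves to whatever then satisfies the policy.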

Opportunities to add value

Since the abstract contracts don't explicitly define fixed implementations using specific app, data, process or network resources, they need to be interpreted at run-time by an agent. The agent acts as an intermediary, responding to events (for example, new order requests, or changes in network state or policy) and translating abstract contracts into concrete implementations. Every event is an opportunity for an agent to add value – not just to perform a static script, but to ‘reason’ (for example, correlate details, perform functions, run analytics and so on).

Agents resolve all references to find-and-bind the right resources just in time, performing all necessary connections and transformations to provide a smart, context-enhanced response. Late-binding the elements of an orchestration allows for feedback loops. Information about the network state can feed analytics, which drives policies and results in dynamically constructed flows. This is next-gen networking based on next-gen service orientation (see my related post Semantic SOA makes Sense!).
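The agent pattern above, find-and-bind at event time plus a feedback loop from observed state, can be sketched as follows. The Agent class, its registry shape and the telemetry fields are assumptions made for illustration only:

```python
# Illustrative sketch of the agent pattern: every event triggers a fresh
# interpretation of a contract, and observed network state feeds back
# into the policy that drives the next binding. All names are
# hypothetical, not a real product API.

class Agent:
    def __init__(self, registry):
        self.registry = registry    # function name -> list of implementations
        self.blocklist = set()      # feedback: endpoints telemetry flagged as failing

    def handle(self, event):
        """Respond to an event by resolving the contract's references
        just in time, skipping endpoints recent telemetry flagged."""
        contract = event["contract"]
        chosen = {}
        for fn in contract["functions"]:
            healthy = [impl for impl in self.registry[fn]
                       if impl["endpoint"] not in self.blocklist]
            if not healthy:
                raise LookupError(f"no healthy implementation for {fn!r}")
            chosen[fn] = healthy[0]  # bound at this moment, not at design time
        return chosen

    def observe(self, telemetry):
        """Feedback loop: analytics over network state adjust policy,
        which changes what the next handle() call binds."""
        if telemetry["error_rate"] > 0.1:
            self.blocklist.add(telemetry["endpoint"])

agent = Agent({"firewall": [{"endpoint": "fw-a"}, {"endpoint": "fw-b"}]})
order = {"contract": {"functions": ["firewall"]}}

first = agent.handle(order)                               # binds fw-a
agent.observe({"endpoint": "fw-a", "error_rate": 0.25})   # fw-a degrades
second = agent.handle(order)                              # same contract, now fw-b
```

The key point of the sketch is that nothing in the contract changed between the two requests: the binding moved because the feedback loop changed the agent's view of the network.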

This approach has other significant benefits:

  • Virtually centralized policy control – Security, business compliance and IT governance policies can be linked to abstract contracts, addressing the historic challenge of consistently enforcing system-wide policies across diverse and distributed components.
  • Service adaptability – Since abstract contracts are not tightly coupled to any resources in advance, they can automatically accommodate updates, upgrades and substitution of underlying components without breaking.
  • DevOps productivity – Automating interoperability eliminates repetitive, tedious and time-consuming integration work, accelerating service delivery and freeing skilled staff for value-added work.

Rethinking software architecture for NFV

In the final analysis, the demands of SDN and NFV can’t be met by throwing around buzzwords like ‘big data’ or ‘policy management’. The challenges are more fundamental and require a thoughtful reconsideration of software architecture.

Syntactic connections made sense in a vertically integrated world, when we wanted to standardize experiences over centrally controlled resources that seldom changed, but that time is behind us. Middleware stacks made sense when you had just one, but now we have mobile, social, big data and cloud stacks compounding integration headaches.

Old network abstractions ultimately collapse under the weight of kludged patches and extensions. The world is increasingly dynamic, distributed and diverse. Communications service providers need to re-organize for this software-defined environment so they can succeed as digital businesses.

Our CloudNFV Catalyst project has been addressing these concerns. It demonstrates how a dynamic platform based on an information fabric provides the high-level abstraction necessary for policy-controlled infrastructure and ‘network-aware’ services. The Catalyst features an open and diverse ecosystem of VNF providers, as well as a northbound interface to business support system partners for product catalog and ordering.

CloudNFV is a working model of ‘zero-touch orchestration, operations and management’ that is aligned with and informed by the activity of TM Forum’s ZOOM project. It provides a vehicle for dynamic implementations of Forum information models and best practices.