When the International Telecommunication Union (ITU) was founded in 1865, it aimed to achieve interoperability between the networks of different countries. The primary output of the ITU, and of most telecoms standards bodies since, is the technical specification of interfaces. While standards bodies often do more than write these technical specifications for interfaces, this is still typically their core focus.

What do standards bodies produce?

Interface specifications

Standards bodies write technical specifications for the interfaces between different entities or components. The typical approach is to specify only what is necessary for successful interconnection. For example, the ‘air interface’ between a smartphone and a mobile network specifies all aspects of the radio signal, how sessions/calls are established and released, how the smartphone can move between mobile base stations, and how data is passed between the smartphone and the network. However, it does not specify what is in that data. Functionality such as applications and application stores is not part of the ‘air interface’ specification.

Another example is the Internet Protocol (IP), the fundamental protocol of the Internet, which was developed with the maxim ‘everything over IP and IP over everything’. The IP specification allows any application to use it, and nothing in the specification refers to applications. Equally, IP is designed to work over any physical network, and the specification makes no reference to physical networks. This means that physical networks have come and gone as technology has developed, but the IP specification has remained unchanged.
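
A rough illustration of this layering, sketched below using Python’s standard sockets API: the application hands IP an opaque payload and a destination address, and nothing in the call identifies the physical network underneath; the operating system uses whichever link (Ethernet, WiFi, fibre) currently carries IP. The address and port are purely illustrative.

    # Minimal sketch: an application sending data over IP without ever
    # naming the physical network, and without IP inspecting the payload.
    import socket

    payload = b"any application data at all"   # opaque to IP
    dest = ("192.0.2.1", 9000)                 # 192.0.2.0/24 is reserved for documentation

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)  # UDP over IPv4
    sock.sendto(payload, dest)   # no reference to Ethernet, WiFi, or any other physical network
    sock.close()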

Standards bodies consider the balance between precision and innovation. A precise standard is more likely to guarantee successful interconnection, but it may implicitly specify old technology or disallow an attractive enhancement a supplier has devised. Different standards bodies will make different choices depending on the exact circumstances of the interface they are specifying.

A final consideration is the time it takes to write a specification compared to the time it takes to develop a solution which complies with the interface. Ideally, the rate of innovation should be set by the pace of development, not by the time it takes to agree and write specifications. When the development is purely software, and especially if it follows an ‘agile’ process, traditional writing of standards can appear slow, and other solutions, such as open-source reference implementations, can be more attractive than a traditional interface standard; the interface standard is then implicitly defined in the reference implementation.

Conformance testing, validation, and certification

When a standards body writes a standard, it raises the question: does a particular implementation conform to the specification, and can this be certified? Where this is important, some standards groups are structured to carry out testing, validation, and certification alongside the process of specification.

Some bodies do this to provide confidence in their technical specifications, but it is most commonly found for standards which are legal requirements, such as safety standards and some regulatory interfaces.

Best practice, operational guidelines, and industry education

Many standards bodies also produce less formal documents which give advice and guidance, ranging from improving the overall effectiveness of the industry to explaining the impact of technical developments and standardisation. These informal documents can give a much more accessible insight into the intent of a standard, what its impact should be, and how it may best be implemented and exploited.

Reference implementations

Where an implementation is purely software, and even more so if the software follows an ‘agile’ development process, the ‘standard’ for the industry is often developed as an open-source project which, in essence, creates a reference implementation. The emphasis tends to be on augmenting an existing system on a regular basis through an approach of periodic ‘releases’. This means that new features are regularly added, and the interface specifications are updated in lock step with the releases. For technology implemented purely in software, this has several advantages. First, the interface specification is already tested, including any adverse interactions between new and old features. Second, the dynamic behaviour of the interface, which is often very difficult to capture in a normal standards document, is implicitly defined by the reference implementation.

It should be noted that there is no fundamental requirement for an implementation to use the open-source code; in practice, however, many suppliers use it as the base for their implementations.

The idea of a reference implementation is older than the current system of open-source projects. In the very early days of the Internet, when most functionality was implemented in software, the IETF, the Internet standards body, worked to a maxim of ‘rough consensus and running code’. In other words, a working reference implementation was more important than complete agreement on the specification.

Levels of Interface Specification

Standards vary in their degree of specification, with different approaches lending themselves to different scenarios. If a standard will be implemented many times by many organisations, the high cost of developing it can be spread across all the implementations. Moreover, the very existence of the standard may encourage mass-scale implementation, with considerable benefit to all involved. In such instances, precise and detailed specifications are of most value. On the other hand, if a specification will only be implemented by a few organisations, and each organisation may only have one implementation, the cost of standardisation is less easily spread. In these circumstances, features unique to each implementation may also be important, suggesting a looser specification produced at lower cost is likely to be of greater value.

Different standards bodies make different choices as to the level of precision and detail in their specifications:

Precise technical specifications

Generally, any interface between customer equipment and a network operator is likely to benefit from precise specification which can be tested, validated, and certified. The specification should be sufficiently clear and complete that anyone implementing the specification can be confident their implementation will interconnect successfully with other implementations without further work.

Reasonably precise technical specifications

Interfaces which are within the network, including those between network operators, are implemented and operated by a relatively modest number of technically skilled suppliers and network operators respectively. In this circumstance, standards bodies often strike a different balance: standards may contain several options to suit the preferences of different suppliers and operators and may even leave some aspects unspecified.

The specifications can be regarded as ‘reasonably precise’ but some additional development, testing, and validation is almost certainly required before there is satisfactory interconnection. In addition, the interconnection may not support all possible features.

Indicative specifications

Certain implementations are sufficiently rare that large elements will be specific to that instance. This is often the case with network management interfaces, where integrating a specific vendor’s equipment into a network operator’s management environment is likely to require bespoke development. In this case, standards tend to be more ‘indicative’ than definitive, providing a starting template which can greatly reduce development costs but not avoid them completely. A good example is the standards of TM Forum.

Different forms of organisation which define standards

Legally binding SDOs

Some SDOs are cited by statutory legislation and their standards carry the force of law in a particular jurisdiction. In the UK, for example, electrical and optical safety standards are the responsibility of BSI, while regulated telecommunications interfaces are the responsibility of NICC. At the European level, the equivalents of BSI are CEN and CENELEC, and the equivalent of NICC is ETSI (though ETSI now has a much wider role than just legally binding interfaces).

Formal SDOs

While some SDOs are formally recognised in statutory requirements, there are many other organisations which are widely recognised within the industry as the primary standards body in a particular technical area. Given the global nature of telecoms, these tend to be global bodies, the oldest of which is the ITU. Founded in 1865, the ITU was reconstituted in 1947 under the UN and is now a collaboration of member states without any jurisdiction over sovereign states.

3GPP is another global body, responsible for all the relevant technical specifications for mobile networks. In fact, the 3GPP organisational partners are regional or national standards bodies, and the partners automatically adopt 3GPP specifications as their standards. In Europe this is ETSI, and so all 3GPP specifications are published as ETSI standards.

Other formal SDOs include the IETF - the Internet standards body - and the IEEE - responsible for all the formal standards for Ethernet and WiFi.

Industry associations

Alongside SDOs there are several industry associations which write specifications. These may lack formal SDO status but are largely treated as such through general acceptance by the industry. The O-RAN Alliance and ONF are examples of industry associations whose specifications carry considerable weight in the industry. In some cases, specifications generated by industry associations may be adopted, or used as a basis, by a regional SDO such as ETSI.

De-facto industry leader

In some cases, standards emerge organically when one player’s solution gains widespread support. Such standards are often called ‘de-facto’ standards; they exist irrespective of any prior intention to standardise. Examples include the Microsoft Windows operating system and the Microsoft Office document formats. The player responsible for the de-facto standard may subsequently offer their specification to a standards body, as occurred with the Microsoft Office document formats, which are now formally standardised through ISO/IEC.

Open-source projects

If an open-source software project becomes very popular, it can also have the effect of creating a de-facto standard for the interfaces it offers.

Backwards compatibility, innovation, and evolution in standards

The great thing about a standard is that everyone uses it; however, this pervasive use can hinder acceptance of new innovations. If a new idea emerges which is different to the current standard, its proponents face the daunting task of persuading everyone using the current standard to pivot to the new idea.

This dilemma is especially acute in telecoms where much of the value of a current standard arises from the fact everyone is doing the same thing. This is normally a very real challenge, even when the new idea is substantially better, and is often an early consideration in any standards activity:

The consequence of not having backwards compatibility

Without backwards compatibility, the value of a new idea can only be fully realised when everyone has adopted it. In reality this typically means everyone using both the existing and the new solution at the same time, normally for a considerable length of time. Take the fundamental protocol of the Internet, IP. IP version 4 (IPv4) was designed in the early 1980s, and by the early 1990s it was clear it could be improved. However, its proposed successor, IPv6, was not designed to be backwards compatible with IPv4, and now, 30 years later, IPv4 is still used across the Internet. Unfortunately for IPv6, by the time it was agreed, IPv4 was a victim of its own success, and the Internet was simply too big to change.
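
To illustrate the incompatibility in a loose, simplified way: the two protocols share little beyond the version field at the start of the header, so a node that only understands IPv4 can do nothing useful with an IPv6 packet except discard it, and both stacks have to run side by side (‘dual stack’). The sketch below is illustrative only, not a real packet parser.

    # Minimal sketch: distinguishing IPv4 from IPv6 by the version nibble.
    # An IPv4-only node has no way to interpret the rest of an IPv6 header.
    def handle_packet(raw):
        version = raw[0] >> 4                  # top four bits of the first header byte
        if version == 4:
            return "process with the IPv4 stack"
        if version == 6:
            return "process with the IPv6 stack, if one is present; otherwise discard"
        return "discard: unknown version"

    # Illustrative first bytes only (0x45 begins an IPv4 header, 0x60 an IPv6 header).
    print(handle_packet(bytes([0x45])))
    print(handle_packet(bytes([0x60])))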

Evolutionary strategies
  • An additional feature. If the new idea is incremental to the existing standard, it can be added to the existing standard and users take up the feature as they find the need. For example, 3GPP adds new features on a regular basis (bundled into a release, typically issued every 18 months to two years), which can be implemented within the same generation of mobiles or networks, in most cases without impacting existing features or capabilities of the same generation.
  • Service interconnection. For instance, mobile voice calling works very differently in 4G and 5G than in 2G, but the newer standards include a full interworking specification so that a 2G handset can call a 5G handset without the user realising anything has changed.
  • Overlay. The new technology uses the existing network as its means of transporting information between its nodes, a technique often called ‘tunnelling’ (a simplified sketch of this encapsulation follows this list). The rapid expansion of the Internet exploited this in the 1990s with ‘dial up’ Internet access.
  • Underlay. The technology only affects the transport between the nodes of the existing network and changing these out does not change the operation of the existing network. New optical technologies normally use this technique.
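
The encapsulation behind the overlay approach can be sketched very simply: the new technology’s entire packet travels as the opaque payload of a packet on the existing network, which therefore needs no changes. The header names below are invented purely for illustration and are not real protocol headers.

    # Minimal sketch of tunnelling: the existing network forwards on its own
    # header and never inspects the new technology's packet carried inside it.
    def encapsulate(outer_header, inner_packet):
        return outer_header + inner_packet

    new_tech_packet = b"NEWHDR" + b"application data"    # a packet of the new overlay technology
    tunnelled = encapsulate(b"OLDHDR", new_tech_packet)  # carried transparently by the existing network

    # At the far end of the tunnel the outer header is stripped and the inner
    # packet continues through the new technology's own network.
    assert tunnelled[len(b"OLDHDR"):] == new_tech_packet
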
Anticipating backwards compatibility

A good standard may well try to allow for future evolution and deliberately leave aspects of the specification open for future developments, anticipating some or all of the evolutionary strategies above. This includes the concept of ‘forward compatibility’, where hooks are provided in a release which allow a new feature to be defined later without directly impacting the earlier functionality.
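
One common form of such a hook, offered here purely as an illustration rather than as any particular body’s mechanism, is a type-length-value (TLV) field layout: an implementation simply skips field types it does not recognise, so later releases can define new fields without breaking earlier implementations. The field types in the sketch are invented for illustration.

    # Minimal sketch of forward compatibility via TLV encoding: fields of
    # unknown type are skipped rather than treated as errors.
    def parse_tlv(data, known_types):
        fields, i = {}, 0
        while i + 2 <= len(data):
            ftype, flen = data[i], data[i + 1]
            value = data[i + 2 : i + 2 + flen]
            if ftype in known_types:
                fields[ftype] = value        # understood: process the field
            # unknown field types are silently skipped
            i += 2 + flen
        return fields

    message = bytes([1, 3]) + b"abc" + bytes([9, 2]) + b"zz"   # type 9 defined in a later release
    print(parse_tlv(message, known_types={1}))                 # an older implementation sees {1: b'abc'}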