Thursday, 26 March 2026

3GPP Study on Modernization of Specification Format and Procedures for 6G (6GSM)

The development of each new mobile generation is not only about new technologies and capabilities. It also requires evolution in the way standards themselves are created, maintained and consumed. As work on 6G gradually begins to take shape, the 3rd Generation Partnership Project (3GPP) has started examining whether the tools and processes used to write its specifications are still fit for purpose.

One of the first steps in this direction is the study titled Study on Modernization of Specification Format and Procedures for 6G (6GSM), documented in TR 21.802. The study looks at how the current approach to specification development works, the limitations that are becoming more visible as specifications grow larger and more complex, and the possible directions for modernising the process as the industry prepares for the 6G era.

3GPP specifications form the backbone of the mobile industry. They define how networks, devices and services interoperate across the globe. However, the way these specifications are produced has largely remained unchanged for many years. Today, most specifications are created and maintained using document-based workflows centred around Microsoft Word and DOCX files. Delegates submit Change Requests that modify the text of these documents, and editors manually merge the approved changes into updated specification versions. This approach has served the industry well for decades because it is familiar, widely supported and easy for participants to understand.

The study recognises that the current workflow has several strengths. The document format provides a consistent structure across thousands of specifications. Contributors can edit content directly using familiar WYSIWYG tools, review tracked changes, include diagrams and tables, and collaborate during meetings by editing documents in real time on shared screens. These capabilities have helped large groups of experts work together efficiently during standardisation meetings.

At the same time, as specifications grow larger and more complex, the limitations of the current approach are becoming more visible. One of the most obvious challenges is the heavy reliance on manual processes. Change Requests must be merged into specifications by editors, which can introduce delays before updated versions are published. When multiple Change Requests modify the same sections of a document, identifying conflicts or inconsistencies can be difficult.
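As a rough illustration of why overlapping Change Requests are hard to spot by hand, the sketch below compares two hypothetical CRs against the same baseline clauses and flags the lines they both touch. The clause text, timer names and CR contents are invented for illustration; this is only a minimal line-based model of the problem, not how 3GPP tooling actually works.

```python
import difflib

# Hypothetical baseline clauses, plus two independently drafted Change
# Requests that each modify the same specification text.
baseline = [
    "5.1 The UE shall send a Registration Request.",
    "5.2 The AMF shall respond within T3510.",
    "5.3 On failure, the UE shall retry.",
]
cr_a = [
    "5.1 The UE shall send a Registration Request.",
    "5.2 The AMF shall respond within T3512.",   # CR A edits clause 5.2
    "5.3 On failure, the UE shall retry.",
]
cr_b = [
    "5.1 The UE shall send a Registration Request.",
    "5.2 The AMF should respond within T3510.",  # CR B also edits clause 5.2
    "5.3 On failure, the UE shall retry.",
]

def changed_lines(old, new):
    """Return the set of baseline line indices a revision touches."""
    ops = difflib.SequenceMatcher(a=old, b=new).get_opcodes()
    return {i for tag, i1, i2, _, _ in ops if tag != "equal"
            for i in range(i1, i2)}

# Two CRs conflict when they modify the same baseline lines.
overlap = changed_lines(baseline, cr_a) & changed_lines(baseline, cr_b)
print("Conflicting baseline lines:", sorted(overlap))  # → [1], i.e. clause 5.2
```

On plain text, this kind of overlap check is a few lines of standard-library code; inside a large DOCX file with tracked changes, the same question requires an editor to inspect the documents manually.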

Scale is another factor. Many technical specifications now run into hundreds or even thousands of pages. Opening, searching or editing such large DOCX files can become slow and occasionally unstable. Large tables, embedded diagrams and complex formatting further increase file sizes and processing overhead.

Understanding how a feature evolves across specification versions can also be difficult for readers and implementers. Engineers often need to trace how a particular capability has changed between releases, but linking the final specification text back to the relevant Change Requests or understanding the context behind changes is not always straightforward.

The document format itself also presents challenges for automated processing. Extracting structured information from DOCX files requires significant preprocessing because textual content is mixed with binary elements such as images and embedded objects. This makes it harder for tools to analyse specifications or automate parts of the development workflow.
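To make the preprocessing point concrete: a DOCX file is a ZIP archive whose main text lives in an XML part (word/document.xml), interleaved with formatting markup and references to binary parts. The sketch below builds a minimal, hypothetical document in memory and then extracts its paragraph text; real specifications add styles, tables, images and embedded objects on top of this, which is where the preprocessing effort goes.

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# WordprocessingML namespace used by DOCX files.
W = "http://schemas.openxmlformats.org/wordprocessingml/2006/main"

# Build a tiny in-memory DOCX-like archive (contents are invented).
document_xml = (
    f'<w:document xmlns:w="{W}"><w:body>'
    '<w:p><w:r><w:t>5.1 General</w:t></w:r></w:p>'
    '<w:p><w:r><w:t>The UE shall support NAS signalling.</w:t></w:r></w:p>'
    '</w:body></w:document>'
)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("word/document.xml", document_xml)

# Extraction: unzip, parse the XML, and join the text runs (<w:t>) per
# paragraph (<w:p>).
with zipfile.ZipFile(buf) as zf:
    root = ET.fromstring(zf.read("word/document.xml"))
paragraphs = ["".join(t.text or "" for t in p.iter(f"{{{W}}}t"))
              for p in root.iter(f"{{{W}}}p")]
print(paragraphs)  # → ['5.1 General', 'The UE shall support NAS signalling.']
```

By contrast, a text-based source format is directly readable by the same tools with no unpacking or XML traversal at all.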

Navigation across specifications is another area where improvements could help. Many features are defined across multiple technical specifications produced by different working groups. Following references between documents or understanding how procedures interact across specifications can take time and effort, especially for engineers who are new to the standards.

To address these challenges, the study explores a number of alternative specification formats that could be considered for future work. Options such as OpenDocument, AsciiDoc, Markdown and LaTeX are discussed, along with more structured or restricted DOCX-based approaches. Some proposals also consider hybrid models where different formats could coexist while maintaining a single authoritative source.

Text-based markup formats such as Markdown or AsciiDoc are particularly interesting because they separate content from presentation. This structure can make version control and automated processing easier. These formats are widely used in software development environments and integrate well with modern collaboration tools that track changes and manage contributions from multiple participants.

LaTeX is another potential option, particularly for documents that require complex technical formatting or mathematical expressions. Meanwhile, restricted DOCX approaches attempt to preserve compatibility with existing workflows while enforcing stricter formatting rules to reduce complexity and improve consistency.

Beyond the document format itself, the study also looks at broader improvements to the way specifications are developed and maintained. One important idea is the use of modern version control systems such as Git. These systems are widely used in software development and allow contributors to track changes in detail, manage parallel development branches and merge updates in a more controlled manner. Applying similar workflows to standards development could improve traceability and help identify conflicts earlier.

The study also highlights the potential for automated validation tools that could check Change Requests for formatting errors, missing references or structural inconsistencies before they are submitted. Such tools could reduce the editorial workload while improving the overall quality and consistency of specifications.
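A validation tool of the kind described could start very simply, for example by checking that every numbered reference cited in a draft actually appears in its reference list and that no editor's placeholders remain. The sketch below shows the idea on a hypothetical Change Request fragment; the draft text, reference entries and marker conventions are invented for illustration and do not reflect any actual 3GPP tooling.

```python
import re

# Hypothetical draft Change Request text (contents invented).
draft = """\
The procedure follows the principles of [2] and [5].
Editor's note: exact timer value is FFS.
References:
[1] TS 23.501
[2] TS 24.501
"""

# References defined in the list are lines that begin with "[n]".
defined = set(re.findall(r"^\[(\d+)\]", draft, flags=re.M))
# Every "[n]" occurring anywhere in the text counts as a citation.
cited = set(re.findall(r"\[(\d+)\]", draft))

issues = []
for ref in sorted(cited - defined):
    issues.append(f"reference [{ref}] cited but not listed")
for marker in ("TBD", "FFS"):
    if marker in draft:
        issues.append(f"unresolved '{marker}' marker present")

print(issues)
# → ["reference [5] cited but not listed", "unresolved 'FFS' marker present"]
```

Even checks this shallow, run automatically before submission, would catch a class of errors that currently consumes editors' time during merging.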

Another possible direction is the use of machine-readable formats for structured elements within specifications. Interfaces, protocol definitions or data models could be stored separately in structured files and then referenced or generated automatically within the main specification. This approach could reduce duplication and make it easier for implementers to reuse information directly in development environments.
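As a small sketch of that idea, a data model could live in a structured file (JSON here) and the corresponding specification table could be generated from it, so the prose and the machine-readable source never drift apart. The message name, field names and types below are entirely invented for illustration.

```python
import json

# Hypothetical machine-readable definition of an information element,
# stored separately from the specification prose (contents invented).
model_json = """
{
  "name": "RegistrationRequest",
  "fields": [
    {"name": "ue-id",    "type": "OCTET STRING", "presence": "mandatory"},
    {"name": "slice-id", "type": "INTEGER",      "presence": "optional"}
  ]
}
"""
model = json.loads(model_json)

def render_table(m):
    """Generate the specification table text from the structured source."""
    lines = [f"Information element: {m['name']}",
             f"{'Field':<10} {'Type':<14} Presence"]
    for f in m["fields"]:
        lines.append(f"{f['name']:<10} {f['type']:<14} {f['presence']}")
    return "\n".join(lines)

print(render_table(model))
```

An implementer could consume the same JSON file directly in a code generator, while the specification document always reflects the current state of the model.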

The modernisation study does not recommend a single solution at this stage. Instead, it provides a detailed analysis of the current situation and explores possible directions for future work. Any transition will need to balance the benefits of new tools and formats with the practical realities of the existing ecosystem. The 3GPP community relies on a large set of established workflows, tools and expertise, and maintaining accessibility for all participants will be important.

As the industry moves towards 6G, the scale and complexity of specifications will continue to grow. Ensuring that the processes used to create and manage these specifications evolve alongside the technologies themselves will be essential. In that sense, modernising specification formats and procedures may become an important step in preparing the standards ecosystem for the next generation of mobile innovation.

If you want to learn more about this, check out:

  • 6G Specification Modernization discussions from Nokia & Ericsson here.
  • Ongoing 6GSM Workshop discussions here.
  • 3GPP TR 21.802: Study on modernization of specification format and procedures for 6G here.

Related Posts

Tuesday, 3 March 2026

Strengthening Critical Infrastructure Security with OSINT

Cybersecurity conversations in telecoms often focus on IT systems, cloud platforms and enterprise networks. Yet beyond the data centres and mobile cores lies another domain that is arguably even more critical to society. Industrial Control Systems (ICS) and Operational Technology (OT) environments underpin the power plants, water treatment facilities, railways, petrochemical sites and manufacturing plants that keep daily life running. These environments are increasingly in the crosshairs of cyber attackers.

A comprehensive YouTube course titled OSINT for ICS and OT brings much-needed attention to this area. Created by Mike Holcomb, the 10-plus-hour course explores how Open Source Intelligence (OSINT) can be used to better understand, assess and protect ICS and OT environments. For anyone working in telecoms infrastructure, utilities, transport or industrial sectors, this is highly relevant material.

Mike focuses on the practical reality that there are still relatively few accessible and high quality resources dedicated to OT and ICS cybersecurity. While IT security has matured with abundant training paths, certifications and community support, the world of control systems security remains comparatively underserved. That gap is particularly concerning given the importance of critical infrastructure to national resilience and economic stability.

In his channel overview, Mike explains that his work is aimed at a broad audience. It includes IT cybersecurity professionals looking to pivot into OT security, engineers already working in industrial environments who want to strengthen their defensive posture, and owners or operators who are building or refining a cybersecurity programme for their facilities. This inclusive approach reflects the multidisciplinary nature of OT security, where engineering, networking and cybersecurity disciplines intersect.

The turning point for many in this field was the discovery of Stuxnet, the first widely known cyber weapon designed to disrupt industrial processes. The malware specifically targeted centrifuges in a uranium enrichment facility, manipulating physical processes while masking its actions from operators. For Mike, learning about Stuxnet sparked a deeper curiosity about how control systems function inside power plants and other facilities, and how they can be secured. That same question remains highly relevant today.

For readers of The 3G4G Blog, there is a natural connection. As telecom networks evolve towards 5G, private networks and future 6G systems, connectivity is extending deeper into industrial domains. Smart grids, connected factories and digitalised transport systems rely on robust communications as well as secure control environments. The boundary between IT and OT continues to blur. Understanding how adversaries might gather intelligence about exposed assets, misconfigurations or vulnerable systems using open sources is therefore a critical skill.

The OSINT for ICS and OT course aims to demystify that process. It looks at how publicly available information can reveal insights about industrial environments and how defenders can use the same techniques proactively. Rather than waiting for an incident, organisations can identify potential weaknesses and exposure before an attacker does. This proactive mindset aligns closely with modern security best practice across both telecom and industrial sectors.

Another important aspect is accessibility. The course is freely available on YouTube, lowering the barrier to entry for those who may be curious about OT security but unsure where to start. In a domain where specialist training can be expensive and difficult to find, open educational content plays a valuable role in building community knowledge and capability.

Critical infrastructure protection is not a niche concern. It affects the electricity that powers base stations, the water that cools data centres and the transport systems that support supply chains. As cyber threats continue to evolve, the need for professionals who understand both networking and industrial control environments will only grow.

For those interested in expanding their horizons beyond traditional telecom security and into the protection of the systems that underpin modern society, this course is well worth exploring. It is encouraging to see experienced practitioners sharing knowledge openly and helping to strengthen resilience across critical infrastructure sectors.
