Interoperability has graduated from slogan to stress test. Over the past five years defense communities worldwide have layered standards, exercises, and modular architecture mandates on top of legacy platforms. The result is measurable progress in pockets, with capabilities that now interoperate in controlled conditions, but not the universal, day-zero interoperability many architects promised. The practical truth is that interoperability is now a continuous program of work rather than a one-time engineering deliverable.

What counts as progress matters. NATO’s CWIX events in 2024 and 2025 moved from hundreds to more than 500 validated capabilities and from roughly 2,500 participants to over 3,000, producing tens of thousands of discrete tests across command, control, and information services. Those tests are not academic. They exercise Federated Mission Networking profiles, stand up exercise mission network topologies, and de-risk the “first day” connectivity problems that historically cripple coalition operations. They demonstrate that when national teams adopt common specifications and perform systematic verification, systems can be made to interoperate reliably in the short term. But CWIX also surfaces the seams: custom data encodings, authentication gaps, and fragile ad hoc integrations that need manual fixes during events.

Standards and open architectures have become the principal lever for long-term interoperability. MOSA-oriented standards families such as SOSA, FACE, CMOSS, and OMS are being codified into procurement and platform design. The U.S. services and industry have moved toward these building blocks, and the Defense Standardization Program published CMOSS interoperability requirements to ASSIST in 2025 to help make plug-and-play card-level integration more routine. Vendors now produce OMS- or SOSA-compliant modules, and avionics and sensor stacks are increasingly designed to support the FACE software environment. Those are important structural changes that reduce integration time and encourage competitive refresh cycles.
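To make “plug-and-play” concrete at the software level, the sketch below shows the pattern in miniature. It is an illustrative analogue, not an excerpt from any SOSA, FACE, CMOSS, or OMS artifact: the platform publishes a stable interface contract, vendor modules conform to it, and the integration code does not change when one module is swapped for another. The Detection record, RadarModule contract, and vendor classes are all invented for the example.

```python
# Illustrative only: a software analogue of MOSA-style plug-and-play,
# not drawn from any SOSA/FACE/CMOSS/OMS specification.
from dataclasses import dataclass
from typing import Protocol


@dataclass
class Detection:
    """Minimal detection record the platform agrees to exchange."""
    sensor_id: str
    bearing_deg: float
    range_m: float


class RadarModule(Protocol):
    """The interface contract the platform publishes; vendors implement it."""
    def start(self) -> None: ...
    def poll(self) -> list[Detection]: ...


class VendorARadar:
    """One vendor's conformant implementation (stubbed for illustration)."""
    def start(self) -> None:
        print("Vendor A radar powered up")

    def poll(self) -> list[Detection]:
        return [Detection("A-01", bearing_deg=42.0, range_m=1800.0)]


class VendorBRadar:
    """A competing module; same contract, different internals."""
    def start(self) -> None:
        print("Vendor B radar powered up")

    def poll(self) -> list[Detection]:
        return [Detection("B-07", bearing_deg=41.5, range_m=1795.0)]


def mission_loop(radar: RadarModule) -> None:
    """Integration code depends only on the contract, not on the vendor."""
    radar.start()
    for det in radar.poll():
        print(f"track from {det.sensor_id}: "
              f"{det.bearing_deg:.1f} deg at {det.range_m:.0f} m")


if __name__ == "__main__":
    mission_loop(VendorARadar())  # swap in VendorBRadar() with no other change
```

The value is in the boundary, not the stubbed internals; card-level hardware standards pursue the same property at the backplane and bus.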

Yet adoption is uneven and often incomplete. The Government Accountability Office emphasized that MOSA can deliver savings and agility only when acquisition programs plan for it early and coordinate across portfolios. A January 2025 GAO assessment found programs were inconsistent in applying MOSA principles and often avoided the short-term cost or governance changes required to realize long-term benefits. In short, standards without acquisition incentives and programmatic discipline risk producing paperwork rather than interoperable fielded capability.

The joint problem goes beyond hardware interfaces to the data layer. Modern middleware such as DDS, and frameworks built on it such as ROS 2, have narrowed messaging incompatibilities in robotics and unmanned systems by providing data-centric publish/subscribe semantics and QoS controls. Academic and industry reviews through 2024 and into 2025 show DDS becoming a mainstream choice for real-time, multi-vendor robotic stacks and for integrating perception and autonomy pipelines. That improves point-to-point interoperability for unmanned systems and edge compute, but it does not erase the need for agreed semantics, schemas, and governance for operational data shared among services or nations. Without common data models or translation governance, systems can connect but still misinterpret the meaning of exchanged fields.
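To show what data-centric publish/subscribe with explicit QoS looks like in practice, here is a minimal ROS 2 (rclpy) publisher sketch. The topic name and JSON payload are invented for illustration, and it assumes a working ROS 2 installation; it is not drawn from any fielded program.

```python
# Minimal ROS 2 (rclpy) publisher with explicit QoS settings.
# Topic name and payload are illustrative; requires a ROS 2 environment.
import rclpy
from rclpy.node import Node
from rclpy.qos import QoSProfile, ReliabilityPolicy, DurabilityPolicy
from std_msgs.msg import String


class TrackPublisher(Node):
    def __init__(self) -> None:
        super().__init__("track_publisher")
        # QoS is part of the interoperability contract: a subscriber whose
        # requested QoS is incompatible with this offer receives no data.
        qos = QoSProfile(
            depth=10,
            reliability=ReliabilityPolicy.RELIABLE,
            durability=DurabilityPolicy.TRANSIENT_LOCAL,
        )
        self._pub = self.create_publisher(String, "coalition/tracks", qos)
        self._timer = self.create_timer(1.0, self._tick)

    def _tick(self) -> None:
        msg = String()
        msg.data = '{"track_id": "T-100", "lat": 52.1, "lon": 4.3}'
        self._pub.publish(msg)


def main() -> None:
    rclpy.init()
    node = TrackPublisher()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == "__main__":
    main()
```

Note what the middleware layer does and does not solve: the QoS profile governs delivery of the bytes, but nothing in it tells a coalition partner whether the coordinates are WGS-84 or what the track identifier means. That semantic layer is exactly the data-model and governance gap described above.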

Governance remains the core bottleneck. The Joint All-Domain Command and Control initiative and its multinational extension, CJADC2, have unlocked significant experimentation budgets and prototyping activity, but independent GAO reviews conclude DoD still lacks a comprehensive investment and governance framework. Multiple service-level programs pursuing overlapping goals create duplication and leave senior leaders without a single set of metrics to judge whether interoperability improvements are cumulative or merely parallel. Until governance ties requirements, standards selection, verification, and program-level incentives together, fielded interoperability will remain fragile and project-dependent.

Security and classification constraints further complicate interoperability across coalitions. FMN-style federations and NATO mission networking approaches deliberately separate unclassified, classified, and mission-specific enclaves. That architecture solves some problems but forces more complex federation logic: identity and trust management across certificate authorities, controlled cross-domain data transfer, and operational rules for data sharing. Exercises show the mechanics work when well rehearsed. They also show that real-world missions add time pressure and adversary actions that expose previously unseen failure modes. Interoperability in garrison does not automatically translate to secure, resilient, distributed operations in contested environments.
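One small piece of that federation logic can be made concrete. The Python standard-library sketch below builds a mutual-TLS client context that trusts certificates from several coalition certificate authorities while presenting its own national credential. The file paths are placeholders, and real mission networks layer cross-domain guards and release policies on top of transport-level trust.

```python
# Illustrative mTLS setup for a federated service: trust several coalition
# CAs and present our own credential. Paths are placeholders, not real files.
import ssl

# Bundle of trusted coalition CA certificates (placeholder paths).
TRUSTED_CA_BUNDLES = [
    "/etc/coalition/ca/nation_a_root.pem",
    "/etc/coalition/ca/nation_b_root.pem",
    "/etc/coalition/ca/mission_network_root.pem",
]


def build_client_context(cert_file: str, key_file: str) -> ssl.SSLContext:
    """Build a TLS client context that requires and verifies peer certs."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    ctx.verify_mode = ssl.CERT_REQUIRED   # always verify the peer
    ctx.check_hostname = True             # bind identity to the peer's name

    # Federation step: accept certificates chaining to any partner CA.
    for ca_bundle in TRUSTED_CA_BUNDLES:
        ctx.load_verify_locations(cafile=ca_bundle)

    # Our own identity, issued by our national CA (mutual TLS).
    ctx.load_cert_chain(certfile=cert_file, keyfile=key_file)
    return ctx


# Usage (requires the placeholder files to exist):
#   ctx = build_client_context("/etc/coalition/id/our_service.pem",
#                              "/etc/coalition/id/our_service.key")
```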

The industrial landscape is both part of the solution and a source of friction. DoD and allied mandates that favor MOSA and open standards are nudging prime contractors and subsystem vendors toward common interfaces. Demonstrations such as OMS-compliant sensors and mission computers prove the technical feasibility of mixing vendors on a platform. But industrial business models, intellectual property, and data rights still create incentives for vertical stacks and subtle protocol variations. Without consistent contract language and shared conformance test suites that tie payments or follow-on work to verified interoperability, industry will rationally hedge by shipping minimal-conformance solutions.
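A conformance suite does not have to be elaborate to change incentives. The pytest-style sketch below gates a hypothetical track-report message against required fields and ranges; the field names and rules are invented, standing in for whatever data model and interface standard a contract actually cites.

```python
# Illustrative conformance checks for a hypothetical track-report message.
# Field names and rules are invented; a real suite would test the governed
# data model and interface standard the contract actually cites.
import json

REQUIRED_FIELDS = {"track_id": str, "lat": float, "lon": float, "time_utc": str}


def check_track_report(payload: str) -> list[str]:
    """Return a list of conformance violations (empty list means pass)."""
    violations = []
    try:
        msg = json.loads(payload)
    except json.JSONDecodeError:
        return ["payload is not valid JSON"]

    for name, expected_type in REQUIRED_FIELDS.items():
        if name not in msg:
            violations.append(f"missing required field: {name}")
        elif not isinstance(msg[name], expected_type):
            violations.append(f"field {name} must be {expected_type.__name__}")

    if isinstance(msg.get("lat"), float) and not -90.0 <= msg["lat"] <= 90.0:
        violations.append("lat out of range")
    return violations


def test_conformant_message_passes():
    good = ('{"track_id": "T-100", "lat": 52.1, "lon": 4.3, '
            '"time_utc": "2025-06-01T12:00:00Z"}')
    assert check_track_report(good) == []


def test_minimal_conformance_is_caught():
    # A "minimally conformant" message that drops the timestamp fails the gate.
    bad = '{"track_id": "T-100", "lat": 52.1, "lon": 4.3}'
    assert "missing required field: time_utc" in check_track_report(bad)
```

Tying a gate like this to acceptance or follow-on awards is what turns an open-standards claim into a verified property.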

So have interoperability challenges been solved? The honest answer is no. The problem is now tractable, measurable, and testable in ways it was not a decade ago. NATO CWIX campaigns, MOSA standards in procurement, and middleware like DDS reduce the accidental complexity that once made federation impractical. Yet conceptual clarity and engineering discipline have outpaced programmatic alignment and governance changes. Interoperability has moved from a binary goal to a systems-of-systems program: successful outcomes require harmonized acquisition, early standards adoption, data model governance, robust cross-domain security, and sustained exercise-driven verification.

Practical next steps are straightforward and unglamorous. First, require MOSA and data-model decisions at Milestone A so programs bake in interfaces and conformance testing. Second, fund and staff a joint interoperability authority with teeth: a cross-service organization that approves data models, oversees conformance test suites, and ties interoperability outcomes to milestone payments. Third, scale federation exercises and open testbeds that simulate contested degradation and adversary interference; exercises like Project Convergence and CWIX are models to expand, not replace. Fourth, normalize contractual language on IP and data rights so vendors compete on capability rather than lock-in. Finally, invest in tooling that automates schema evolution, runtime translation, and provenance so that coalition data can be trusted and used in real time. The technology exists for many of these items. The missing ingredient is program-level governance that treats interoperability as measurable capability rather than a checkbox.
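On that last item, the tooling in question is unglamorous too: versioned schemas, mechanical translation between versions, and a provenance trail attached to every transformation. The sketch below is a toy illustration of the pattern under invented schema versions, not a description of any fielded coalition system.

```python
# Toy illustration of versioned-schema translation with a provenance trail.
# Schema versions and field names are invented for the example.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable


@dataclass
class Record:
    schema_version: str
    data: dict
    provenance: list = field(default_factory=list)


def _v1_to_v2(rec: Record) -> Record:
    """v1 used a single 'position' string; v2 splits it into lat/lon floats."""
    lat_str, lon_str = rec.data["position"].split(",")
    new_data = {k: v for k, v in rec.data.items() if k != "position"}
    new_data["lat"] = float(lat_str)
    new_data["lon"] = float(lon_str)
    return Record("v2", new_data, rec.provenance)


# Registry of translations keyed by (from_version, to_version).
TRANSLATIONS: dict[tuple[str, str], Callable[[Record], Record]] = {
    ("v1", "v2"): _v1_to_v2,
}


def translate(rec: Record, target_version: str) -> Record:
    """Translate a record to the target schema (single hop, for brevity)."""
    if rec.schema_version == target_version:
        return rec
    step = TRANSLATIONS[(rec.schema_version, target_version)]
    before = rec.schema_version
    rec = step(rec)
    rec.provenance.append({
        "translated": f"{before}->{rec.schema_version}",
        "at_utc": datetime.now(timezone.utc).isoformat(),
    })
    return rec


if __name__ == "__main__":
    incoming = Record("v1", {"track_id": "T-100", "position": "52.1,4.3"})
    out = translate(incoming, "v2")
    print(out.data)        # {'track_id': 'T-100', 'lat': 52.1, 'lon': 4.3}
    print(out.provenance)  # what changed the record, and when
```

The useful property is that a consumer sees both the current shape of the data and the chain of translations it went through, which is the basis for trusting and using it in real time.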

Interoperability is not an engineering punch list you cross off once. It is an organizational habit layered on an engineering baseline. The landscape in 2025 shows clear advances: standards are maturing, test events routinely validate multi-vendor setups, and middleware choices reduce friction at the edge. If the defense enterprise can align acquisition incentives, governance, and verification, we will move from episodic interop victories to persistent interoperability. Until then, “solved” will remain a future-tense goal and a performance metric to be earned in exercises, procurements, and deployments.