Joint All‑Domain Command and Control is less a single system than an engineering and organizational challenge: connect sensors, shooters, and decision makers across air, land, sea, space and cyberspace so that machines and humans can sense, make sense and act faster than an adversary. The Department of Defense signed an implementation plan in 2022 that sets this broad objective and assigns responsibilities, but the promise of a seamless, department‑wide C2 fabric collides repeatedly with legacy architectures, stove‑piped acquisition programs and the messy realities of tactical communications.

Put bluntly, interoperability is the metric by which JADC2 will live or die. Multiple authoritative reviews have concluded the program is still in an early phase: the GAO found the department has articulated goals but has not yet defined which existing systems will contribute to JADC2 or how future capabilities will be sequenced and governed. That gap produces divergent service efforts that must be stitched together rather than designed from the outset to be composable.

What does that stitching look like on the technical plane? At least three realities matter.

1) Multiple waveforms, incompatible radio languages. Tactical datalinks are not a single uniform bus. F‑35s use MADL, F‑22s use IFDL, and a large portion of coalition and fourth‑generation platforms rely on Link 16. Each was designed for different tradeoffs in security, stealth and throughput. In practice, the department has relied on pragmatic gateway and translation solutions to move data between radios and into the broader networked environment during experiments, but gateways add latency, increase attack surface and require careful engineering and sustainment planning.
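
As a concrete sketch, a translation gateway reduces to mapping one message schema into a common model and back out. The field names and message shapes below are entirely invented (real MADL and Link 16 formats are not reproduced here); the point is that every gateway hop is code that must rename fields, convert units, and be sustained and secured:

```python
from dataclasses import dataclass

# Hypothetical common track model; not any real datalink format.
@dataclass
class Track:
    track_id: str
    lat_deg: float
    lon_deg: float
    alt_m: float

def link_a_to_common(msg: dict) -> Track:
    """Translate a notional 'Link A' message into the common model."""
    return Track(
        track_id=msg["id"],
        lat_deg=msg["lat"],
        lon_deg=msg["lon"],
        alt_m=msg["alt_ft"] * 0.3048,  # unit conversion is typical gateway work
    )

def common_to_link_b(track: Track) -> dict:
    """Emit the common model as a notional 'Link B' message."""
    return {
        "trk": track.track_id,
        "position": [track.lat_deg, track.lon_deg],
        "altitude_m": round(track.alt_m, 1),
    }

# A gateway is a composition of translations; every hop adds latency
# and another implementation that must be maintained.
def gateway(msg: dict) -> dict:
    return common_to_link_b(link_a_to_common(msg))
```

Even in this toy form, the sustainment burden is visible: two schemas means two translators to test; n schemas translated pairwise means O(n²), which is why gateways converge on a common intermediate model.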

2) Data formats and semantic interoperability. Beyond getting bits across physical waveforms, we must agree on what those bits mean. The military has long used standardized message sets such as the J‑series, but decades of incremental extensions, vendor implementations and limited enforcement have left a situation where compliance with a written standard does not guarantee plug‑and‑play behavior. Real interoperability demands machine‑readable contracts, discoverability, and an operationally governed registry of interfaces rather than a hope that everyone implements the same spec the same way.
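
A minimal illustration of what "machine‑readable contract" means in practice, assuming a simple field‑to‑type mapping (the field names are invented, not drawn from any real J‑series message): a consumer can validate an incoming message against the published contract instead of trusting that the producer read the spec the same way.

```python
# Hypothetical interface contract: required fields and expected types.
TRACK_CONTRACT = {
    "track_id": str,
    "lat_deg": float,
    "lon_deg": float,
    "source": str,
}

def validate(msg: dict, contract: dict) -> list[str]:
    """Return a list of contract violations; an empty list means compliant."""
    errors = []
    for field, expected_type in contract.items():
        if field not in msg:
            errors.append(f"missing field: {field}")
        elif not isinstance(msg[field], expected_type):
            errors.append(f"{field}: expected {expected_type.__name__}, "
                          f"got {type(msg[field]).__name__}")
    return errors
```

A registry, in this framing, is simply the governed place where contracts like `TRACK_CONTRACT` are published, versioned and discoverable, so validation happens against one authoritative artifact rather than many private interpretations.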

3) Infrastructure and edge constraints. The move to a joint data layer is premised on cloud and edge compute. The department’s Joint Warfighting Cloud Capability awarded multi‑vendor contracts to hyperscalers to bring cloud services to the enterprise and the tactical edge, which helps with data federation and model hosting. But cloud availability, classification boundaries, and tactical link capacity constrain what can be centralized versus what must run at the edge. The correct engineering tradeoffs are a mixture of local preprocessing, prioritized data flows and progressive synchronization strategies.
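
One way to sketch prioritized data flows over a capacity‑limited tactical link, with invented priorities and payload sizes: within each transmission window, send the most urgent items first and carry the rest forward to the next window.

```python
import heapq

# Sketch of prioritized, budget-aware sync for a constrained link.
# Priorities and byte sizes are illustrative, not from any real system.
def sync_window(queue: list[tuple[int, int, str]], budget_bytes: int):
    """queue holds (priority, size_bytes, payload_id); lower priority
    number means more urgent. Returns (sent_ids, remaining_items)."""
    heapq.heapify(queue)
    sent, remaining = [], []
    while queue:
        prio, size, pid = heapq.heappop(queue)
        if size <= budget_bytes:
            budget_bytes -= size
            sent.append(pid)
        else:
            remaining.append((prio, size, pid))
    return sent, remaining
```

Progressive synchronization falls out of running this repeatedly: low‑priority bulk data (imagery, logs) drains opportunistically as link capacity allows, while time‑critical tracks always go first.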

Those technology realities are compounded by organizational and programmatic friction. Each service has been pursuing its own path to satisfy near‑term missions: the Air Force’s Advanced Battle Management System, the Army’s Project Convergence, the Navy’s Project Overmatch and allied experimentation programs. The department has created cross‑cutting offices, including responsibilities assigned to the Chief Digital and Artificial Intelligence Office (CDAO), to improve integration, but independent acquisition timelines and differing standards requirements mean much of the jointness must be achieved by interfaces and governance, not by wholesale replacement of service capabilities.

Congress and the Joint Staff have recognized both urgency and complexity. Legislation and oversight have pressed the department for a deployable joint data integration layer and required reports on plans and timelines for prototypes in Indo‑Pacific theaters. Those statutory deadlines are a blunt instrument intended to focus attention, but they cannot substitute for rigorous engineering, persistent experimentation and funding that ties integrations to life‑cycle support.

What has worked so far and what should guide the next two years? Experience from experiments and operational demonstrations suggests a pragmatic, layered approach:

  • Prioritize effect chains, not mythical full‑stack interoperability. Field discrete sensor‑to‑shooter use cases end‑to‑end and harden them. These experiments provide measurable latency, throughput and reliability numbers that programs can budget against. The shift from trying to connect everything at once to focusing on urgent operational problems reduces scope and produces reusable integration patterns.
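
The measurable numbers above can be captured with something as simple as per‑hop timestamps on each message through the chain; the hop counts and values in this sketch are illustrative.

```python
import statistics

# Sketch: each message carries timestamps (ms) recorded at every hop of
# a sensor-to-shooter chain; summarize end-to-end latency so programs
# have a concrete number to budget and regress against.
def chain_latency_ms(hop_timestamps: list[list[float]]) -> dict:
    """hop_timestamps: one list of per-hop timestamps per message."""
    totals = [ts[-1] - ts[0] for ts in hop_timestamps]
    return {
        "p50_ms": statistics.median(totals),
        "max_ms": max(totals),
        "n": len(totals),
    }
```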

  • Adopt an API and data‑contract first posture. Rather than relying solely on prescriptive monolithic standards created by committees, prioritize working reference implementations and publish machine‑readable interface contracts and a central registry so producers and consumers can discover and validate interfaces. This is the difference between specifying the syntax in a 100‑page spec and shipping a test harness that proves two implementations interoperate in a contested environment.
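
A toy version of such a harness, with two stand‑in implementations (neither reflects any real program’s encoder or decoder): instead of arguing over a written spec, run producer A’s encoder against consumer B’s decoder over a shared corpus and fail loudly on any mismatch.

```python
import json

# Stand-in "implementation A" encoder and "implementation B" decoder.
# Real harnesses would exercise vendor code over real transports.
def impl_a_encode(track: dict) -> bytes:
    return json.dumps(track, sort_keys=True).encode()

def impl_b_decode(blob: bytes) -> dict:
    return json.loads(blob.decode())

def interop_suite(corpus: list[dict]) -> bool:
    """Pass only if every message round-trips intact across implementations."""
    return all(impl_b_decode(impl_a_encode(msg)) == msg for msg in corpus)
```

The corpus itself becomes a shared, versioned asset: when a new producer or consumer joins, conformance means passing the same suite, not re‑reading the same 100 pages.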

  • Embrace multi‑vendor cloud federation with tactical constraints. JWCC and similar cloud efforts are necessary enablers, but teams must design for intermittent connectivity, local compute and prioritized sync. Data governance must codify what is allowed to cross classification boundaries and how coalition data is tagged and filtered.
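
Cross‑boundary filtering can be sketched as per‑field releasability tags checked against the destination’s level; the levels and tags here are notional, and a real guard involves far more than field dropping.

```python
# Notional releasability levels, lowest to highest.
RELEASABILITY = {"UNCLASS": 0, "SECRET": 1, "TOP_SECRET": 2}

def filter_for_release(msg: dict, tags: dict, dest_level: str) -> dict:
    """Keep only fields tagged at or below the destination's level.
    Untagged fields default to the most restrictive level."""
    ceiling = RELEASABILITY[dest_level]
    return {k: v for k, v in msg.items()
            if RELEASABILITY[tags.get(k, "TOP_SECRET")] <= ceiling}
```

The fail‑closed default (untagged fields are never released) is the important design choice: governance then reduces to maintaining the tag dictionary, which is an auditable artifact.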

  • Continue and harden gateway patterns while reducing their lifetime. Gateways and translators are a practical necessity today, but each gateway is a technical debt item that requires secure maintenance. Use gateways to buy time for migration rather than as permanent crutches. Standardize gateway test suites and operationalize continuous integration of interface tests.
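
A standardized gateway test suite can be as simple as a corpus of golden input/output pairs run on every change, so a translation regression is caught in CI rather than in the field; the translator and messages here are placeholders.

```python
# Sketch of a golden-message regression suite for a gateway translator.
# 'translate' is whatever gateway function is under test; the golden
# corpus pairs known inputs with their expected translations.
def run_gateway_suite(translate, golden: list[tuple[dict, dict]]) -> list[str]:
    """Return descriptions of failing cases; an empty list means all pass."""
    failures = []
    for i, (msg_in, expected) in enumerate(golden):
        actual = translate(msg_in)
        if actual != expected:
            failures.append(f"case {i}: expected {expected}, got {actual}")
    return failures
```

Keeping the golden corpus under version control also documents each gateway’s actual behavior, which eases the eventual migration off it.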

  • Invest equally in people and engineering. Interoperability is socio‑technical. Combatant commands need embedded systems engineers and data engineers who understand tactical waveforms, crypto and observability. Congressional reporting requirements around personnel and demonstration timelines point to this necessity, and acquisition leaders should budget for the engineering teams that will own joint interfaces long after prototypes are retired.

In short, interoperability for JADC2 is not a single procurement or a single standard. It is an incremental program of experiment, measure, harden and scale. The department has the right high‑level blueprint, cloud partners and experimentation campaigns to make progress. The remaining work is engineering at scale: reduce heterogeneity where feasible, standardize interfaces where necessary, and make machine‑testable contracts the default. If those practices become routine, the promise of sensing and acting at the speed of relevance will be an engineering milestone rather than a strategic slogan.