
From Air-Gapped to Always-On: The Data Governance Nightmare of Factory Floor Convergence

After years of hype, Industry 4.0 is now a reality across modern manufacturing and production environments. Multiple evolutions are happening at once, from increased robotization to smarter automation systems to the embrace of big data. Leaders buoyed by the possibilities of predictive maintenance, cost reduction, and capacity planning are realizing that their teams must change with the technology.

As a result, another long-debated and long-awaited trend – the convergence of information technology (IT) and operational technology (OT) – is unfolding in tandem. Spurred on by new solutions, it, too, warrants attention as traditional network silos break down and data flows more freely.

Machinery that was once air-gapped now needs to be always-on, sending ever-greater volumes of data to enterprise repositories, data lakes, and cloud warehouses. In a far cry from years gone by, OT data now directly feeds machine learning (ML) and artificial intelligence (AI) engines for advanced analytics and operational intelligence.

This flips industrial governance structures on their head. Today, OT shares not only field device communications but also much larger data streams, which in turn warrant IT oversight to ensure reliability and security. This brings the two teams, which have been culturally and historically distinct, closer together, demanding a total rethink of data management and responsibility.

Air-Gapped Machinery and Siloed Systems

What happens in OT stays in OT – this was the simple principle governing decades of industrial data and network management. The attitude made sense: OT systems were designed for isolation, relied on security through obscurity, and operated apart from the wider network. Unlike software, which evolves through updates and patches, legacy machinery can last for years while driving production forward. The data stayed local, and the machinery communicated via programmable logic controllers (PLCs), SCADA systems, and control networks out of view of the rest of the enterprise. Uptime, safety, and deterministic performance were the aims.

This was a different world with different rules from IT. The other side of the network focused on things like data security, network performance, and enterprise applications. Naturally, this led to a cultural divide, with IT focused on data governance and OT on operations.

Governance was simple because it was separate. This delineation of duty worked because factory data wasn’t feeding business intelligence, real-time data-driven decision-making happened on the factory floor, and regulatory compliance focused on safety rather than data. But interconnected endpoints and expanding networks are drawing this era to a close – and organizations unprepared for the transition are at risk.

The Reality of Always-On Data Demands

Terabytes of operational information that were once locked in proprietary systems are now flowing through to the cloud. This creates several data governance challenges, including:

  • Security blind spots: Gaps inevitably appear when IT security protocols meet OT operational requirements
  • Fragmented monitoring: Multiple tools show different pieces of the puzzle but not a complete picture
  • Communication breakdowns: When an issue spans both domains, troubleshooting becomes a complex dance of interdepartmental coordination
  • Delayed response times: Without unified visibility, problems that cross IT/OT boundaries take longer to resolve

Traditional data governance frameworks simply weren’t designed for this hybrid world. Teams are asking themselves: How can you maintain data lineage when fragmented monitoring creates visibility gaps? How do you ensure data quality when communication breakdowns delay problem detection? And how do you prove compliance when security blind spots leave data flows undocumented? If these questions go unanswered, bad actors will exploit those very gaps.
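To make the lineage question concrete, here is a minimal sketch of what attaching lineage metadata to an OT record might look like before it leaves the plant. The field names, asset identifiers, and schema version are illustrative assumptions rather than a prescribed standard.

```python
# Minimal sketch: wrap an OT reading in a lineage "envelope" before it leaves the plant.
# Field names, asset IDs, and the schema version are illustrative assumptions.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json
import uuid


@dataclass
class LineageEnvelope:
    record_id: str        # unique ID so downstream systems can trace this exact record
    source_asset: str     # which machine produced the value
    source_system: str    # which gateway or control system forwarded it
    schema_version: str   # lets consumers detect format changes
    captured_at: str      # when the value was read on the floor (UTC, ISO 8601)
    payload: dict         # the actual measurement


def wrap_reading(asset: str, system: str, payload: dict) -> dict:
    """Attach lineage metadata so the record stays traceable end to end."""
    envelope = LineageEnvelope(
        record_id=str(uuid.uuid4()),
        source_asset=asset,
        source_system=system,
        schema_version="1.0",
        captured_at=datetime.now(timezone.utc).isoformat(),
        payload=payload,
    )
    return asdict(envelope)


# Hypothetical example: a dryer outlet temperature reading forwarded by a SCADA gateway.
record = wrap_reading("DRYER-07", "scada-gateway-01", {"outlet_temperature_c": 82.4})
print(json.dumps(record, indent=2))
```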

Applying IT security policies without considering OT requirements risks delaying or breaking real-time industrial processes – where latencies of more than 50 milliseconds can trigger safety locks and halt production. On the other hand, exposing OT networks to IT infrastructure without proper security controls can open the door to breaches that cost millions of dollars. Failure to bridge the IT/OT gap is a lose-lose that threatens downtime, a breach, or both.
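As a rough illustration of that latency budget, a simple check like the one below can flag control-network round trips that exceed 50 milliseconds. The PLC address, port, and the use of a TCP handshake as a latency proxy are assumptions made for the sketch, not a prescribed monitoring method.

```python
# Minimal sketch: flag control-network round trips that exceed a 50 ms budget.
# The PLC address, port, and TCP-handshake timing are illustrative assumptions.
import socket
import time

PLC_HOST = "10.10.1.25"    # hypothetical PLC address
PLC_PORT = 502             # Modbus/TCP port, used here only as a reachable endpoint
LATENCY_BUDGET_MS = 50.0


def connect_latency_ms(host: str, port: int, timeout: float = 1.0) -> float:
    """Time a TCP handshake as a rough proxy for control-network latency."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000.0


latency = connect_latency_ms(PLC_HOST, PLC_PORT)
if latency > LATENCY_BUDGET_MS:
    print(f"WARNING: {latency:.1f} ms exceeds the {LATENCY_BUDGET_MS:.0f} ms budget")
else:
    print(f"OK: {latency:.1f} ms is within budget")
```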

Future-Forward Collaboration on the Factory Floor

The good news is that forward-thinking companies can both bridge the IT/OT gap and safely unlock production data through enhanced collaboration. Organizations succeeding in this space share a common trait: they’re building cross-functional teams where IT professionals, production engineers, and cybersecurity specialists work together rather than in silos.

The most mature organizations establish dedicated OT network engineering teams that combine IT expertise with operational understanding – serving as translators between worlds that once rarely communicated. These teams leverage industrial communication protocols designed with security in mind, such as encrypted MQTT and OPC UA with security policies, while implementing industrial firewalls that enable unidirectional data flows with built-in validation and rate limiting. This way, network segmentation, asset inventories, and threat intelligence are considered right alongside data governance and compliance requirements. The resulting collaboration transforms what was a cultural divide into a strategic advantage.
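For a sense of what encrypted MQTT can look like in practice, here is a minimal publishing sketch using the paho-mqtt client with TLS. The broker hostname, topic, certificate paths, and payload fields are placeholders, and the constructor call assumes paho-mqtt 2.x (older 1.x releases omit the callback API version argument).

```python
# Minimal sketch: publish an OT sensor reading over TLS-encrypted MQTT.
# Broker hostname, topic, certificate paths, and payload fields are placeholders.
import json
import time

import paho.mqtt.client as mqtt

BROKER_HOST = "mqtt.example-plant.local"   # hypothetical internal broker
BROKER_PORT = 8883                         # standard MQTT-over-TLS port
TOPIC = "plant/line1/drying/temperature"   # illustrative topic hierarchy

# paho-mqtt 2.x requires an explicit callback API version; 1.x does not.
client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2, client_id="dryer-edge-gateway-01")

# Encrypt the session and authenticate both sides with certificates.
client.tls_set(
    ca_certs="/etc/certs/plant-ca.pem",    # CA that signed the broker certificate
    certfile="/etc/certs/gateway.crt",     # client certificate for mutual TLS
    keyfile="/etc/certs/gateway.key",
)

client.connect(BROKER_HOST, BROKER_PORT)
client.loop_start()

# Publish one reading with basic origin metadata so the IT side can trace it.
reading = {
    "asset_id": "DRYER-07",
    "signal": "outlet_temperature_c",
    "value": 82.4,
    "timestamp": time.time(),
}
info = client.publish(TOPIC, json.dumps(reading), qos=1)
info.wait_for_publish()

client.loop_stop()
client.disconnect()
```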

The effects of this transformation are already being felt. Centroflora Group, a Brazilian manufacturer of botanical extracts and pharmaceutical ingredients, was running three separate production segments (evaporation, drying, and supply extraction) on PLCs. Diagnosing issues required physical checks and could take upwards of seven hours. After bridging the IT/OT gap with a single pane of glass providing unified visibility across all three production segments, diagnostic time dropped to just a few minutes. The company even achieved a 3-4% reduction in power costs by identifying equipment that was running unnecessarily.

These are the kinds of proactive insights that only governed, unified data can deliver. More importantly, this unified approach answers the critical governance questions: Comprehensive monitoring enables end-to-end data lineage tracking, real-time visibility ensures data quality issues are caught immediately, and automated audit trails prove compliance every step of the way.

As my colleague Daniel Sukowski often says, security starts with transparency, and transparency starts with monitoring. The same principle applies to data governance: You can’t govern what you can’t see, and you can’t see what you don’t monitor. The convergence of IT and OT demands this transparency across the entire infrastructure.
