Month: May 2026
Key Differences Between Network Cybersecurity and Control System Cybersecurity
By Joe Weiss with Alan J Weissberger
Introduction:
The Operational Technology (OT) [1.] cybersecurity [2.] community continues to ignore control system cyber-incidents [3.] – a governance failure masquerading as a vocabulary issue.
IT and OT network data breaches are documented in multiple sources, such as the Verizon Data Breach Investigations Report and CISA advisories. Palo Alto Networks notes that nearly 70% of industrial firms experienced an OT cyber-attack last year. Those attacks were data breaches and did not always cause equipment damage.
Industrial organizations need an integrated and cyber resilient IT-OT framework to address this increasingly sophisticated threat landscape, but it appears they’re not well prepared to defend against network or control system cyberattacks.
Note 1. Operational Technology refers to the combination of hardware and software designed to directly monitor, control, and manage physical devices, industrial equipment, and critical processes.
Note 2. Cybersecurity can be defined as the practice of protecting people, systems and data from cyberattacks by using various technologies, processes and policies.
Note 3. Cyber-incidents are defined as electronic communications between systems that affect Confidentiality, Integrity, or Availability. This is an IT-centric definition because Safety is not addressed.
Image Credit: TXOne Networks
There are two communities addressing cybersecurity:
- The more prevalent community is the one involved in data security. This includes IT and OT network security and is focused on data breaches.
- The second community is focused on engineering security. It is less well-known, but very critical. This discipline is focused on safety, reliability, and productivity.
Professor Ross Anderson stated in his seminal book, “Security Engineering: A Guide to Building Dependable Distributed Systems,” that “security engineering is about building systems to remain dependable in the face of malice, error, or mischance.”
The culture gap between network security and engineering organizations will be addressed in the June 2026 issue of IEEE Computer magazine, “Packets and Process: What Network Security and Engineering Get Wrong About Each Other.”
Discussion:
The OT cybersecurity community’s mission is to focus on OT network cyber-attacks. However, its charter does not extend to malicious or unintentional control system cyber incidents involving process sensors, actuators, motors, turbines, transformers, etc.
Importantly, control system cyber incidents can be physics-related rather than network-related. The 2007 Aurora vulnerability test at the Idaho National Laboratory destroyed a 2 MW commercial diesel generator by remotely restarting the generator out-of-phase with the grid. This is a gap in protection of the electric grid and was addressed in the October 2025 IEEE Computer magazine article, “Physics-Based Cyberattacks Against Electric Power Grids and Alternating Current Equipment.”
Idaho National Laboratory ran the Aurora Generator Test in 2007 to demonstrate how a cyberattack could destroy physical components of the electric grid. The diesel generator used in the experiment began to smoke, as shown below:
Aurora Generator Test. Image Credit: Wikipedia
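The physics behind Aurora can be sketched with the classic power-angle relation: the synchronizing power transferred when a machine is reconnected to the grid at phase angle δ is approximately P = (E·V/X)·sin δ, so closing the breaker far out of phase drives a power and torque transient well beyond the machine's rating. The per-unit values below are illustrative assumptions, not parameters from the actual Aurora test:

```python
import math

def synchronizing_power_pu(e_pu, v_pu, x_pu, delta_deg):
    """Approximate per-unit power surge when a generator breaker
    closes at phase mismatch delta_deg (classic power-angle relation:
    P = E*V/X * sin(delta))."""
    return (e_pu * v_pu / x_pu) * math.sin(math.radians(delta_deg))

# Assumed illustrative values: 1.0 pu internal EMF and bus voltage,
# 0.25 pu total reactance between the machine and the grid.
for delta in (1, 30, 90, 120):
    surge = synchronizing_power_pu(1.0, 1.0, 0.25, delta)
    print(f"closing angle {delta:3d} deg -> ~{surge:.2f} pu power surge")
```

A properly synchronized closure (δ near zero) produces a negligible surge, while a 90-degree closure in this sketch produces roughly four times rated power, which is why repeated out-of-phase reclosing destroys shafts and couplings.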
As a result, industry and government OT cybersecurity experts continue to downplay the threat of control system cyberattacks and, by not calling them cyber-related, ignore actual control system incidents that do not originate from OT networks.
There have been more than 20 million control system cyber incidents that have killed more than 30,000 people. Most of these incidents occurred below the IP-Ethernet layers, where there is neither cyber forensics nor cybersecurity training. As a result, the majority of these incidents were not identified as being cyber-related.
This indicates that control system cyber incidents that are not classified as IP-Ethernet incidents need their own classification as issues to be addressed by cybersecurity policy, especially for critical infrastructure where accidental and/or malicious cyber failures could result in widespread death and destruction.
Given the current geopolitical environment, nation-states are actively reassessing their capabilities to disrupt adversary infrastructure at scale. In this context, dismissing control system cyber incidents solely because they do not originate from traditional IP-based vectors introduces significant risk. Threat actors are increasingly targeting critical infrastructure and associated control systems—spanning both IT and OT domains—leveraging diverse attack surfaces beyond conventional network entry points.
A parallel issue within both the IT and OT security communities is the tendency to classify incidents as “cyber” only when malicious intent is confirmed. This narrow definition is problematic.
For example, the July 2024 CrowdStrike-related outage, which caused global operational disruptions, clearly met the functional criteria of a cyber-incident due to its systemic impact on networked systems. However, its non-malicious origin led some security governance bodies to exclude it from cyber incident classification. Such distinctions can undermine resilience planning, as they fail to account for the full spectrum of cyber-induced operational risk, including software supply chain failures and systemic misconfigurations.
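The impact-based definition argued for here can be made concrete: classify an event as a cyber incident by what it affects (the CIA triad plus Safety, per Note 3), not by whether an attacker was involved. The sketch below is a hypothetical taxonomy, not any standard's schema; field and class names are illustrative:

```python
from dataclasses import dataclass

# Impact dimensions: the classic CIA triad plus Safety, which
# IT-centric incident definitions typically omit.
IMPACTS = ("confidentiality", "integrity", "availability", "safety")

@dataclass
class Incident:
    name: str
    malicious: bool   # attacker intent, if known
    impacts: set      # subset of IMPACTS

def is_cyber_incident(incident: Incident) -> bool:
    """Impact-based test: any CIA+Safety impact qualifies,
    regardless of whether the cause was malicious."""
    return bool(incident.impacts & set(IMPACTS))

# The 2024 CrowdStrike outage: non-malicious, yet a systemic
# availability impact -> still a cyber incident under this test.
crowdstrike = Incident("CrowdStrike update outage",
                       malicious=False, impacts={"availability"})
aurora = Incident("Aurora generator test",
                  malicious=True, impacts={"integrity", "safety"})

print(is_cyber_incident(crowdstrike), is_cyber_incident(aurora))
```

Under an intent-based definition the first case would be excluded; under the impact-based test both qualify, which is the point of the CrowdStrike example above.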
ERPI Focus:
The European Risk Policy Institute (ERPI) was founded by the Australian Risk Policy Institute as part of the Global Risk Policy Network. The ERPI Chairman wrote in a blog titled, “Control system cyber incidents and network breaches are apples and oranges”:
“From our ERPI / 3°C World SRP® perspective, Weiss is pointing at a governance failure masquerading as a vocabulary issue: if you define “cyber incident” through an IT breach lens, you will miss (or dismiss) the incidents that actually move risk—those that degrade continuity lifelines by disrupting physical processes. He makes the case that control-system cyber incidents include electronic/automation failures across sensor signals, control logic, firmware and field device communications, and that many are non-malicious yet still produce loss of view, loss of control, equipment damage, and safety/environmental consequences.
What matters strategically is the reporting and response architecture. Breach-centric metrics (and the cultural reflex that “no attack = no incident”) bias organizations toward under-detection, weak root-cause discipline, and false trend comparisons—exactly when coupled infrastructures are most fragile and repair cycles are tight. Weiss’s bridge condition is practical: align engineering and security on a shared incident definition, and train both communities in control-system incident reality so that operational anomalies are treated as cyber-relevant signals, not “maintenance noise.”
If you’re responsible for critical infrastructure, this is a reminder to recalibrate your incident taxonomy and your board narrative: the control-room outcome is the headline, and the network story is only one possible path to it.”
The Crucial Importance of Process Sensors:
Process sensors represent the biggest gap between data security and engineering security. Perplexity.ai explains this gap in detail (see below), but first we distinguish between data security and engineering security:
- Data security focuses on IP-native devices such as firewalls, routers, switches, etc.
- Engineering security should focus on the field devices and equipment that could cause physical damage and deaths but have no cybersecurity, authentication, or cyber forensics. This includes process sensors, actuators, motors, transformers, inverters, etc. However, that focus is often not achieved.
Perplexity.ai on the Data vs. Engineering Security Gap:
Process sensors sit at the junction of physical process integrity and digital telemetry, so they expose a gap that neither data security teams nor engineering security teams fully own today. In practice, security teams tend to focus on protecting data pipelines, identities, and networks, while engineering teams focus on measurement accuracy, calibration, availability, and safety; the sensor itself often falls between those disciplines.
Process sensors are not just data endpoints; they are safety- and control-relevant instruments whose outputs can drive operators, PLCs, DCS logic, and downstream automation. If a sensor is spoofed, degraded, miscalibrated, or manipulated, the result is not only bad data but potentially unsafe or physically damaging control actions.
Traditional data security assumes the main problem is confidentiality, integrity, and availability of information in transit or at rest. But process sensors often use legacy field protocols, serial links, gateways, or embedded devices that were engineered for function and reliability rather than cryptographic assurance, making them difficult to secure using conventional enterprise controls.
Engineering security is usually optimized for process correctness, alarms, redundancy, and fail-safe behavior, not adversarial manipulation of the measurement layer. That means the sensor may be treated as trustworthy instrumentation, even though compromised or false sensor data can undermine control logic, operator decisions, and safety systems.
The core mismatch: The real issue is that data security protects the pipeline, while engineering security protects the process, but process sensors belong to both domains. Because ownership is split, sensor trust, authentication, anomaly detection, and physical tamper resistance are often addressed inconsistently or not at all, creating a blind spot at the boundary between cyber and physical risk.
Highlights of Sensors Converge Conference Presentation:
To address these important issues and gaps, I will be presenting at the Sensors Converge conference in Santa Clara, CA on May 7, 2026. The title of my talk is, “Process Sensor Monitoring for Cybersecurity, Reliability, and Safety.” The presentation will include the following topics:
- Process sensors (Level 0 devices) are inherently cyber vulnerable yet remain largely unrecognized by cybersecurity organizations.
- Process sensor incidents, both malicious and unintentional, have caused catastrophic and fatal cyber/operational events across multiple sectors, but were not identified as being cyber-related.
- Fatalities have occurred in every decade since the 1980s, including this decade.
- Monitoring process sensors at the physics level can materially improve reliability, safety, and cybersecurity.
- A discussion of what a process sensor cybersecurity program should include and what organizations should be involved.
- The implications of process sensors that are not cyber-secure because they do not meet U.S. and/or EU cybersecurity requirements.
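Monitoring sensors "at the physics level" can be illustrated with a minimal sanity check: a reported value must stay within the physically possible range of the process, and it cannot change faster than the process physics (e.g., thermal mass) allows. This is a simplified sketch with illustrative thresholds, not a description of any actual monitoring product:

```python
def physics_check(readings, max_slew, lo, hi):
    """Flag samples that violate simple physics constraints:
    absolute range limits and a maximum credible rate of change
    per sample. All thresholds here are illustrative."""
    anomalies = []
    prev = None
    for i, r in enumerate(readings):
        if not (lo <= r <= hi):
            anomalies.append((i, "out of physical range"))
        elif prev is not None and abs(r - prev) > max_slew:
            anomalies.append((i, "impossible rate of change"))
        prev = r
    return anomalies

# A temperature trace (deg C) with one spoofed sample that jumps
# faster than the process thermal mass could ever allow.
trace = [80.1, 80.3, 80.2, 250.0, 80.4]
print(physics_check(trace, max_slew=5.0, lo=0.0, hi=400.0))
# flags samples 3 and 4: the jump up and the drop back both
# exceed the 5 deg/sample slew limit
```

The key point is that this kind of check needs no network telemetry at all: it validates the measurement against the process, which is exactly the layer where, as noted above, there is currently no authentication or forensics.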
Nation-state actors, including Russia, China, and Iran, understand Level 0 cyber deficiencies. In sharp contrast, most cyber defenders do not, and they will not identify process sensor incidents as being cyber-related. This gap helps explain why process sensor cybersecurity remains largely absent from OT security forums and RSA Conference discussions. It may also explain why government OT cybersecurity advisories don't include insecure Level 0 devices, even though process sensors provide the trusted input to controllers and SCADA/DCS systems.
Conclusions:
Network cybersecurity functions across IT and OT domains, and control system engineering organizations, operate with fundamentally different objectives, taxonomies, and thresholds for identifying and classifying cyber incidents. This divergence has led to a persistent disconnect in how incidents affecting control systems are recognized and addressed within broader network security governance frameworks. Dismissing control system cyber events because they fall outside narrow, IT-centric definitions is not merely a semantic issue—it reflects a structural governance gap with direct implications for critical infrastructure resilience.
To address this, industry and government stakeholders must converge on a harmonized definition of cyber incidents that encompasses both network-centric and control system–centric perspectives. This alignment should be supported by cross-domain training, ensuring that both network security practitioners and engineering teams possess sufficient understanding of control system architectures, threat models, and failure modes. Without such integration, efforts to compare incident frequency, severity, and systemic impact across IT networks and control systems will remain inconsistent and misleading. More critically, this fragmentation will continue to obscure systemic risk, leaving essential infrastructure sectors exposed to increasingly sophisticated and multi-domain cyber threats.
About Joe Weiss:

Joe Weiss is an expert on control system cyber security. He authored the 2010 book, “Protecting Industrial Control Systems from Electronic Threats.”
Joe is an ISA Fellow, Emeritus Managing Director of ISA99, and an IEEE Senior Member, and he holds patents on instrumentation, control systems, and OT networks. He is a professional engineer with CISM and CRISC certifications and a member of the Control Process Automation Hall of Fame.
References:
https://www.paloaltonetworks.com/resources/research/state-of-ot-security-report



