In complex operational environments, performance is not just about data—it is about how effectively people can interpret that data and act with confidence.
During my work on pipeline measurement systems at Flow-Cal, I focused on how users detect anomalies in high-volume, time-based datasets—identifying subtle shifts in pressure, flow rate, and gas composition that could signal operational issues or financial loss.
While the system was built to present data, the real challenge was supporting how users actually worked: recognizing patterns, validating signals across time, and separating meaningful anomalies from normal variation.
This approach closely aligns with Human and Organizational Performance—designing for real-world behavior, acknowledging that error is part of complex systems, and ensuring the system supports better decisions, not just better displays.
Context: A System Built for Data, Not for Decisions
In pipeline operations, measurement systems serve as the backbone for both operational monitoring and financial accountability. Flow-Cal was designed to aggregate high-frequency data—captured multiple times per second—and roll it into usable time intervals (1, 15, 30, and 60 minutes). This data informed critical decisions around flow rate, pressure stability, and gas composition.
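To make that rollup concrete, here is a minimal sketch of interval aggregation, assuming simple averaging over each window; the column names and the averaging rule are illustrative and not Flow-Cal's actual computation.

```python
# A minimal sketch of rolling high-frequency readings up into reporting intervals.
# Column names and the averaging rule are illustrative, not Flow-Cal's actual logic.
import pandas as pd

# High-frequency readings (several per second) indexed by timestamp.
raw = pd.DataFrame(
    {"flow_rate": [412.1, 412.3, 411.9, 405.0],
     "pressure": [801.2, 801.1, 801.4, 798.8]},
    index=pd.to_datetime([
        "2024-01-01 00:00:00.0", "2024-01-01 00:00:00.5",
        "2024-01-01 00:00:01.0", "2024-01-01 00:14:59.5",
    ]),
)

# Roll the raw stream up into the intervals the system reports on.
for minutes in (1, 15, 30, 60):
    rolled = raw.resample(f"{minutes}min").mean().dropna(how="all")
    print(f"--- {minutes}-minute rollup ---")
    print(rolled)
```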
At a surface level, the system worked. It collected, stored, and displayed enormous volumes of data with precision.
But the reality of work in this environment was far more complex.
Users were not simply reviewing data—they were searching for meaning across miles of pipeline, often after the fact, trying to determine whether a subtle deviation represented a real issue or simply noise in the system. The interface required them to scroll through dense tables of numbers, manually identifying anomalies across time.
The system supported data access.
It did not support decision-making.
The Problem: When “Work as Imagined” Breaks Down
The system had been designed around an implicit assumption: that users would analyze structured datasets in a linear, methodical way.
In practice, that assumption did not hold.
Operators and analysts were working after the fact, scanning long stretches of time-series data across miles of pipeline, cross-checking values between intervals, and judging whether a subtle deviation was a real issue or simply noise.
This created a fundamental gap between work as imagined and work as done.
That gap introduced cognitive strain, slowed decision-making, and increased the risk of both missed anomalies and false positives.
In this environment, “error” was not a user failure. It was an inevitable outcome of asking people to detect meaningful anomalies buried within normal system variability, using dense tables of raw numbers and manual, after-the-fact review.
Rather than asking users to “be more accurate,” the design approach shifted to supporting human pattern recognition: making anomalies visible rather than hidden within raw data.
This led to design directions that surfaced likely deviations for the user, rather than leaving them buried in raw tables.
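The sketch below illustrates the idea: it flags readings that deviate sharply from recent behavior using a rolling mean and standard deviation. The window size, threshold, and the flag_anomalies helper are illustrative assumptions, not the detection rules the production system used.

```python
# A minimal sketch of surfacing deviations instead of leaving them buried in raw tables.
# The window size and threshold are illustrative assumptions, not production rules.
import pandas as pd

def flag_anomalies(series: pd.Series, window: int = 12, threshold: float = 3.0) -> pd.DataFrame:
    """Mark readings more than `threshold` rolling standard deviations from the rolling mean."""
    rolling = series.rolling(window=window, min_periods=window)
    zscore = (series - rolling.mean()) / rolling.std()
    return pd.DataFrame({
        "value": series,
        "zscore": zscore,
        "flagged": zscore.abs() > threshold,
    })

# Example: a mostly steady flow rate with one abrupt drop.
values = [500.0] * 20 + [480.0] + [500.0] * 10
flow = pd.Series(values, index=pd.date_range("2024-01-01", periods=len(values), freq="15min"))
report = flag_anomalies(flow)
print(report[report["flagged"]])
```

The specific statistic matters less than the shift it represents: the system does the scanning, and the user's attention goes to validating candidate anomalies rather than hunting for them in dense tables.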
Errors were treated as symptoms of system design, not individual shortcomings.
Users had developed their own strategies to cope with the system.
From a traditional lens, these might be labeled as inefficiencies.
From a Human Factors and HOP perspective, they were adaptive behaviors—evidence of how the system actually functioned in practice.
The goal was not to eliminate these behaviors, but to understand and support them.
Design efforts focused on supporting those adaptive strategies rather than designing them out.
The core of this work was grounded in research into how users actually worked: how they recognized patterns, validated signals across time, and separated meaningful anomalies from normal variation.
What emerged was not a single workflow, but a set of situational strategies that shifted with the context of each investigation.
These insights reframed the problem:
Users were not analyzing data—they were conducting investigations.
This shift informed a move away from static data tables toward more exploratory, decision-support-oriented designs.
User behavior was shaped by real operational and financial stakes: deviations could signal operational issues or financial loss, and reviews often happened after the fact.
These constraints created a constant tension between missing real anomalies and chasing false positives.
Design solutions needed to operate within that tension.
This led to a focus on presenting data in terms of the decisions users needed to make. The system began to reflect the decision context, not just the data structure.
The outcome of this work was not just a redesigned interface, but a shift in how the system supported operational performance.
By aligning the system with how users actually worked, the design reduced cognitive strain, sped up decision-making, and lowered the risk of both missed anomalies and false positives.
Most importantly, the system moved from being a passive source of data to an active support for decision-making.
While this work was conducted under the banner of Human Factors and UX Research, it directly reflects the principles of Human and Organizational Performance.
At its core, the effort was not about improving an interface. It was about designing for how people actually work and ensuring the system supported better decisions, not just better displays.
The result was not just better usability, but improved system performance.