Data Collection Speed Determines Whether KPIs Can Accurately Assess Individual PID Control Loop Health and Plant-Wide Production Performance
If you’re old enough to remember silent movies, then you know what it’s like to see the characters seemingly jiggle around the screen. Black spots would randomly appear as the story was told with the help of subtitles. Spots and subtitles aside, these movies appeared jittery due to their slow speed. The slow frame rate – a precursor of data collection speed – resulted in visual gaps.
Before the first standards were set in the 1920s at 24 frames per second (fps), projection speeds could be as slow as 12 fps. Films shown at that speed tended to strain viewers’ eyes and left them feeling as though they had missed something. Putting speed into perspective, today’s virtual reality headsets display content at 120 fps. That’s five times faster than the original standard – and ten times faster than the slowest silent films – and, as a result, the visual experience is rich.
The collection speed of data matters to process manufacturers just as much as frames per second does to movie fanatics. Manufacturers are deploying ever-larger quantities of sensors with which to monitor and improve their facility’s production performance. The expectation is that they will gain added insight from the data that’s collected. It’s somewhat ironic, then, that some manufacturers balk at collecting their data at rates fast enough for effective analysis. At inadequate speeds, process characteristics can be distorted. Worse still, the information the data portrays can be misleading. This is, of course, the effect called aliasing.
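To see aliasing concretely, here is a minimal sketch in Python (using NumPy; the 0.9 Hz process oscillation and the sampling rates are hypothetical). Sampled fast enough, a spectral estimate recovers the true frequency; sampled once per second, the very same oscillation masquerades as a much slower cycle.

```python
import numpy as np

# A hypothetical process variable oscillating at 0.9 Hz (period ~1.1 s).
TRUE_FREQ_HZ = 0.9

def sample_pv(rate_hz: float, duration_s: float = 60.0) -> np.ndarray:
    """Sample the oscillating PV at a given collection rate."""
    t = np.arange(0.0, duration_s, 1.0 / rate_hz)
    return np.sin(2 * np.pi * TRUE_FREQ_HZ * t)

def apparent_freq(rate_hz: float) -> float:
    """Estimate the dominant frequency seen at this sampling rate."""
    pv = sample_pv(rate_hz)
    spectrum = np.abs(np.fft.rfft(pv))
    freqs = np.fft.rfftfreq(len(pv), d=1.0 / rate_hz)
    return float(freqs[np.argmax(spectrum[1:]) + 1])  # skip the DC bin

# Sampled at 10 Hz, the 0.9 Hz oscillation is recovered; sampled at
# 1 Hz, it aliases to a phantom cycle nearly ten times slower.
print(apparent_freq(10.0))  # ~0.9 Hz: the true oscillation
print(apparent_freq(1.0))   # ~0.1 Hz: an aliased, misleading frequency
```

Note that the slow trend doesn’t merely look coarse – it reports a frequency that simply isn’t there.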
Consider the following control loop performance diagnostics that rely on sufficient data speed:
- Stiction

It’s well known that the mechanical elements associated with moving a valve or damper can wear over time. As these linkages or other parts start to wear down, their ability to consistently move the valve stem or damper degrades. Stiction is a measure of ‘sticky friction’ within such a Final Control Element (FCE). It’s also the leading mechanical issue faced by engineering staff.
If the speed – or resolution – of data collection isn’t fast enough, then trended data may not reveal the sawtooth or square-wave patterns that are indicative of Stiction. The result is needless process variability.
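As an illustration of how a tooth pattern can be lost, the sketch below (Python with NumPy; the 20-second sawtooth PV and both sampling rates are hypothetical) samples a stiction-style sawtooth at two rates. Logged once per second, the trend rises between snap-backs, as it should; logged once every 15 seconds, most recorded moves point downward and the characteristic tooth disappears.

```python
import numpy as np

def sawtooth_pv(t: np.ndarray, period_s: float = 20.0) -> np.ndarray:
    """A stiction-style sawtooth: the PV creeps up, then snaps back."""
    return (t % period_s) / period_s

def fraction_rising(rate_hz: float, duration_s: float = 600.0) -> float:
    """Fraction of sample-to-sample moves that point upward."""
    t = np.arange(0.0, duration_s, 1.0 / rate_hz)
    d = np.diff(sawtooth_pv(t))
    return float(np.mean(d > 0))

print(fraction_rising(1.0))       # ~0.95: the rising ramp is captured
print(fraction_rising(1.0 / 15))  # ~0.26: the pattern appears to fall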
- Valve Travel
This is a measure of movement in a PID control loop’s output signal. It gives practitioners insight into how much effort is required by the FCE to maintain control. Increasing amounts of effort are indicative of a change in performance of the controller and/or the FCE.
Excessive Valve Travel may be obscured by slow data and the additional work performed by the FCE may simply be overlooked. An overworked FCE is one that’s on a path toward failure.
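One simple way to quantify Valve Travel is to total the absolute moves in the controller output over a window. The sketch below (Python with NumPy; the OP values are hypothetical, in percent of span) shows how an oscillating loop racks up far more travel than a steady one, even though both hover around the same operating point.

```python
import numpy as np

def valve_travel(op: np.ndarray) -> float:
    """Total valve travel: the sum of absolute sample-to-sample moves
    in the controller output (OP) signal, in percent of span."""
    return float(np.sum(np.abs(np.diff(op))))

# A well-behaved loop nudges the valve gently...
steady = np.array([50.0, 50.25, 50.0, 50.25, 50.0])
# ...while an oscillating loop drives it back and forth.
oscillating = np.array([50.0, 54.0, 46.0, 54.0, 46.0])

print(valve_travel(steady))       # 1.0
print(valve_travel(oscillating))  # 28.0
```

If these same loops were logged at a slower rate, intermediate reversals would be skipped and the computed travel would shrink accordingly – the overworked FCE would look healthy.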
- Oscillation

This metric calculates the likelihood that the Process Variable (PV) is oscillating. It’s a measure of the amount of power associated with the dominant frequency divided by the total power from the same loop’s spectral analysis. In essence, it indicates whether the PV has a single, dominant oscillation that is unnecessarily accelerating wear and tear.
Without sufficient resolution the analysis can be flawed and oscillatory behavior can go unnoticed. Since oscillations reverberate across downstream processes, the impact can be severe.
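A minimal sketch of such a metric (Python with NumPy; the signals, the 0.05 Hz cycle, and the 2 Hz collection rate are hypothetical) divides the power in the dominant spectral bin by the total non-DC power:

```python
import numpy as np

def oscillation_metric(pv: np.ndarray) -> float:
    """Power in the dominant (non-DC) frequency bin divided by total
    non-DC power. Near 1.0: one persistent oscillation dominates.
    Near 0.0: broadband noise with no single dominant cycle."""
    power = np.abs(np.fft.rfft(pv - pv.mean())) ** 2
    power = power[1:]               # drop the DC bin
    return float(power.max() / power.sum())

rng = np.random.default_rng(7)
t = np.arange(0.0, 120.0, 0.5)      # a 2 Hz collection rate

noisy = rng.normal(0.0, 1.0, t.size)                          # no cycle
cycling = np.sin(2 * np.pi * 0.05 * t) \
          + 0.1 * rng.normal(0.0, 1.0, t.size)                # 0.05 Hz cycle

print(oscillation_metric(noisy))    # low: no single frequency stands out
print(oscillation_metric(cycling))  # near 1.0: one oscillation dominates
```

The 0.05 Hz cycle is only resolvable here because the 2 Hz collection rate comfortably exceeds it; logged at, say, once per minute, the same series would alias and the score would be meaningless.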
While collecting and storing process data at high speed is essential to effective analysis, it’s important to note that speed requirements can vary from loop to loop. Indeed, the dynamics associated with a temperature loop are typically much different from those of a pressure loop. While the former responds to change over a span of minutes or even hours, the latter may respond in not just seconds but sub-seconds. More about that aspect of data speed in a future post!