For a constant flow and set of process operating conditions, is the observed total loop deadtime relatively constant? As we saw in last week's blog, the deadtime also depends upon the sensor time constant and hence its fouling. Less recognized is that it depends upon whether a step change is made in the controller output versus its setpoint.
The closed loop deadtime (i.e., the deadtime in automatic mode) is generally greater than the open loop deadtime (i.e., the deadtime in manual mode).
The deadtime from control valve stick-slip and backlash is the valve resolution and deadband, respectively, divided by the rate of change of the controller output. For small step changes (particularly for pneumatic positioners), the response time also gets incredibly slow. For a large step change in controller output, the deadtime from stick-slip and backlash is zero and the response time is minimal (except for large actuators). Next week, we will discuss some other ramifications of step size.
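The arithmetic here is simple enough to sketch. In this illustrative snippet (the function name and numbers are mine, not from any standard), the resolution or deadband limit is divided by the rate of change of the controller output to estimate the added deadtime:

```python
def valve_deadtime(limit_pct: float, output_rate_pct_per_s: float) -> float:
    """Added deadtime (s) = valve resolution or deadband (% of span)
    divided by the rate of change of the controller output (%/s)."""
    return limit_pct / output_rate_pct_per_s

# A 0.5% resolution limit with the output ramping at only 0.1%/s
# adds 5 seconds of deadtime before the valve moves.
print(valve_deadtime(0.5, 0.1))  # 5.0
```

Note how a faster-moving output (a large step) drives this added deadtime toward zero, consistent with the point above.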
For a step change in controller setpoint, there is a kick from proportional action (for a PID structure with proportional action on error) and a ramp from reset action. If the kick is not enough to get the valve to move, then the loop has to wait on reset action and the chosen closed loop time constant. Thus the deadtime identified for a setpoint change depends upon the controller tuning. Equations 2-47 through 2-50 in the book Advanced Control Unleashed show the development of an equation to estimate the increase in the deadtime from a control valve based on the open loop deadtime. While these equations are for deadband, they can be used for stick-slip if you consider that half of a deadband is roughly equal to a resolution limit, which is often the case for the best throttle valves (e.g. sliding stem valves with diaphragm actuators). Note the presence of a detuning factor Kx that is approximately the inverse of the Lambda factor (the ratio of closed loop to open loop time constant).
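To make the "wait on reset action" idea concrete, here is a rough sketch of that waiting time for a PI controller in the standard ISA form. This is my own back-of-the-envelope estimate, not Equations 2-47 through 2-50 from the book: the proportional kick covers part of the deadband immediately, and reset action must ramp the output through the rest.

```python
def reset_wait_time(deadband_pct: float, kc: float, ti_s: float,
                    sp_step_pct: float) -> float:
    """Rough extra deadtime (s) after a setpoint step: time for reset
    action to ramp the PI output through whatever portion of the valve
    deadband the proportional kick did not cover.  Assumes the error
    stays near the setpoint step size while the valve is stuck."""
    kick = kc * sp_step_pct                # immediate proportional kick (%)
    if kick >= deadband_pct:
        return 0.0                         # the kick alone moves the valve
    reset_ramp = kc * sp_step_pct / ti_s   # reset ramp rate (%/s)
    return (deadband_pct - kick) / reset_ramp

# Deadband 1%, gain 0.5, reset time 60 s, setpoint step 1%:
# the 0.5% kick falls short, and reset needs a full minute to cover the rest.
print(reset_wait_time(1.0, 0.5, 60.0, 1.0))  # 60.0
```

The dependence on Kc and Ti here is the point: a detuned (low gain, slow reset) controller sees a much larger identified deadtime for the same valve.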
For adaptive controllers or on-demand tuning software that rely upon setpoint changes, very sluggish initial tuning or an unnecessarily large specified closed loop time constant will lead to a larger identified deadtime and overly conservative settings. These settings tend to keep the loop deadtime large and hence the controller detuned.
Deadtime is bad news because the controller has no effect on the process during this time interval. The minimum peak error for a disturbance is basically how far the process is driven away from setpoint during the total loop deadtime by the process upset. The minimum peak error from a load upset can be estimated as the average rate of change of the process variable multiplied by the deadtime. The minimum integrated error is proportional to the deadtime squared. These relationships for peak and integrated error are developed in Equations 2-38 through 2-44 of Advanced Control Unleashed. If this is not enough to get you to rush out and buy a copy, I am offering for a limited time a $0.25 rebate (generous considering the royalties are donated to a university). Just send me your receipt in a self-addressed envelope with enough postage to get to my secret island hideaway.
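The scaling described above is easy to verify numerically. In this sketch (the 1/2 coefficient comes from treating the excursion as a simple triangular ramp, an assumption of mine rather than the book's exact development), doubling the deadtime doubles the peak error but quadruples the integrated error:

```python
def min_peak_error(pv_rate: float, deadtime: float) -> float:
    """Minimum peak error ~ average rate of change of the PV
    during the upset multiplied by the total loop deadtime."""
    return pv_rate * deadtime

def min_integrated_error(pv_rate: float, deadtime: float) -> float:
    """Minimum integrated error scales with deadtime squared
    (triangular-area sketch: half the peak times the deadtime)."""
    return 0.5 * pv_rate * deadtime**2

# PV ramping at 0.2 %/s: doubling deadtime from 10 s to 20 s
# doubles the peak error and quadruples the integrated error.
print(min_peak_error(0.2, 10), min_integrated_error(0.2, 10))  # 2.0 10.0
print(min_peak_error(0.2, 20), min_integrated_error(0.2, 20))  # 4.0 40.0
```

This quadratic penalty is why even a modest reduction in deadtime pays off disproportionately in integrated error.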