ERT dips below lowest RRT using UC_CLIENT_SETTINGS/Linear Regression



Article ID: 216249


Updated On:


CA Automic Operations Manager


A task that usually runs with a very consistent RRT (real runtime) can show a very different ERT (estimated runtime) when "Method as set in UC_CLIENT_SETTINGS" is chosen as the Calculation method on the Runtime page. For example, if the task usually runs for 5 minutes but a single run takes 25 minutes, the ERT may spike to something like 45 minutes and then eventually drop to under 5 minutes.

The peak and dip are very uncharacteristic compared to the RRT.


Release : 12.2



The spike and dip are a consequence of how linear regression works. It extrapolates the trend across recent runtimes, so it amplifies increases and decreases. This makes it the wrong ERT method for tasks whose runtime is normally static with only occasional outliers; it is intended for tasks whose runtime changes steadily over time. The Average method would likely be a better fit in this case. To apply it to all objects with an ERT, set the calculation method in UC_CLIENT_SETTINGS.

Additional Information

More info on the run time page can be found here:

More info on ERT calculations can be found here: