Confidence Interval and Uncertainty Display

The Users Options dialog provides a set of controls on the Analysis tab for both uncertainty computation and reporting.

The Confidence Interval directly determines the uncertainty magnitudes reported for a point. For example, if you use the default 1.0 sigma Confidence Interval, then you are stating in your report that you have 68.27% confidence that the true measurement lies within the reported uncertainty bounds. If you want higher confidence, you can set the interval to 3.0 sigma, for example, which will greatly increase the reported uncertainty magnitudes but allow you to state 99.73% confidence.
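The relationship between the sigma multiplier and the stated confidence level follows directly from the normal distribution. The sketch below is illustrative and not part of the software; the function name is hypothetical.

```python
import math

def confidence_for_sigma(sigma: float) -> float:
    """Two-sided confidence level for a +/- sigma interval
    of a normally distributed error (via the error function)."""
    return math.erf(sigma / math.sqrt(2.0))

# 1.0 sigma -> ~68.27% confidence; 3.0 sigma -> ~99.73% confidence.
for s in (1.0, 2.0, 3.0):
    print(f"{s:.1f} sigma -> {100.0 * confidence_for_sigma(s):.2f}% confidence")
```

This is why a 3.0 sigma setting inflates the reported magnitudes by a factor of three while raising the confidence you can claim.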

Uncertainty can be displayed either using a Monte Carlo simulation process, which generates an uncertainty cloud, or computed using a Covariance analysis.

For uncertainty cloud display:

Sensitivity clouds are handled differently than other point clouds. For this reason there is a separate Cloud Magnification value, which helps visualize these clouds graphically. Increasing the magnification value scales the cloud about the measurement, making it easier to see. The deviations can also be depicted as lines drawn from the measured point. This magnification control also sets the size of uncertainty ellipsoids when they are enabled.
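The Monte Carlo cloud and the magnification scaling can be sketched as follows. The measured point, per-axis sigmas, and sample count here are illustrative assumptions, not values from the software:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical measured point and per-axis 1-sigma uncertainties (assumed values).
measured = np.array([100.0, 50.0, 25.0])
sigma_xyz = np.array([0.05, 0.05, 0.10])

# Monte Carlo uncertainty cloud: random perturbations about the measured point.
cloud = measured + rng.normal(0.0, sigma_xyz, size=(500, 3))

# Cloud Magnification: scale each deviation about the measured point
# so the cloud becomes visible at typical viewing scales.
magnification = 10.0
display_cloud = measured + magnification * (cloud - measured)
```

Scaling the deviations rather than the absolute coordinates keeps the cloud centered on the measurement while exaggerating its spread for display.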

Uncertainty analysis can also be performed using a Covariance analysis. Because it does not use a simulation process for the computation, it is faster, but it does not produce uncertainty clouds. As an alternative, ellipsoids can be displayed to encompass the region of uncertainty.
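An uncertainty ellipsoid can be derived from a point's covariance matrix: the eigenvectors give the ellipsoid's orientation and the square roots of the eigenvalues give its axis half-lengths, scaled by the chosen sigma level. The covariance values below are assumed for illustration:

```python
import numpy as np

# Hypothetical 3x3 point covariance matrix (assumed values).
cov = np.array([[0.0025, 0.0010, 0.0],
                [0.0010, 0.0040, 0.0],
                [0.0,    0.0,    0.0100]])

# Eigendecomposition: eigenvectors orient the ellipsoid axes,
# sqrt(eigenvalues) are the 1-sigma axis half-lengths.
eigvals, eigvecs = np.linalg.eigh(cov)

# Scale by the Confidence Interval setting, e.g. 3.0 sigma.
sigma_level = 3.0
half_lengths = sigma_level * np.sqrt(eigvals)
```

This is why the covariance approach yields a closed-form ellipsoid directly, with no need to draw and render thousands of simulated points.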

For uncertainty ellipsoid display: