Error analysis (mathematics)

In mathematics, error analysis is the study of kind and quantity of error, or uncertainty, that may be present in the solution to a problem. This issue is particularly prominent in applied areas such as numerical analysis and statistics.


1. Error analysis in numerical modeling

In numerical simulation or modeling of real systems, error analysis is concerned with the changes in the output of the model as the parameters to the model vary about a mean.

For instance, consider a system modeled as a function of two variables, $z = f(x, y)$. Error analysis deals with the propagation of the numerical errors in $x$ and $y$ around mean values $\bar{x}$ and $\bar{y}$ to the error in $z$ around a mean $\bar{z}$.
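As an illustration, first-order error propagation through $z = f(x, y)$ can be sketched numerically. This is a minimal sketch assuming independent errors in $x$ and $y$; the function and uncertainty values are invented for the example:

```python
import math

def propagate_error(f, x_mean, y_mean, dx, dy, h=1e-6):
    """First-order (linear) propagation of uncertainties dx, dy through
    z = f(x, y), using central finite differences for the partials."""
    dfdx = (f(x_mean + h, y_mean) - f(x_mean - h, y_mean)) / (2 * h)
    dfdy = (f(x_mean, y_mean + h) - f(x_mean, y_mean - h)) / (2 * h)
    # Assuming independent errors: combine the contributions in quadrature.
    return math.sqrt((dfdx * dx) ** 2 + (dfdy * dy) ** 2)

# Hypothetical example: z = x * y with x = 2.0 +/- 0.1 and y = 3.0 +/- 0.2
dz = propagate_error(lambda x, y: x * y, 2.0, 3.0, 0.1, 0.2)
```

For this product, the quadrature formula gives $\sqrt{(y\,\delta x)^2 + (x\,\delta y)^2} = 0.5$, which the finite-difference sketch reproduces.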

In numerical analysis, error analysis comprises both forward error analysis and backward error analysis.

1.1. Forward error analysis

Forward error analysis involves the analysis of a function $z' = f'(a_0, a_1, \dots, a_n)$, which is an approximation (usually a finite polynomial) to a function $z = f(a_0, a_1, \dots, a_n)$, to determine the bounds on the error in the approximation; i.e., to find $\epsilon$ such that $0 \leq |z - z'| \leq \epsilon$. The evaluation of forward errors is desired in validated numerics.
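As a concrete instance, a truncated Taylor polynomial for $\exp$ on $[0, 1]$ can play the role of $f'$, with the Lagrange remainder supplying a forward error bound $\epsilon$. The degree and interval below are arbitrary choices for illustration:

```python
import math

def exp_taylor(x, n=6):
    """Degree-n Taylor polynomial of exp about 0 (the approximation f')."""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

n = 6
# Observed forward error |z - z'| over a grid of points in [0, 1].
worst = max(abs(math.exp(x / 100) - exp_taylor(x / 100, n)) for x in range(101))
# Lagrange remainder bound on [0, 1]: |exp(x) - P_n(x)| <= e * x^(n+1) / (n+1)!
bound = math.e / math.factorial(n + 1)
```

The measured worst-case error stays below the a priori bound $\epsilon = e/7! \approx 5.4 \times 10^{-4}$, as forward analysis guarantees.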

1.2. Backward error analysis

Backward error analysis involves the analysis of the approximation function $z' = f'(a_0, a_1, \dots, a_n)$ to determine the bounds on the parameters $a_i = \bar{a_i} \pm \epsilon_i$ such that the result $z' = z$.

Backward error analysis, the theory of which was developed and popularized by James H. Wilkinson, can be used to establish that an algorithm implementing a numerical function is numerically stable. The basic approach is to show that although the calculated result, due to roundoff errors, will not be exactly correct, it is the exact solution to a nearby problem with slightly perturbed input data. If the perturbation required is small, on the order of the uncertainty in the input data, then the results are in some sense as accurate as the data "deserves". The algorithm is then defined as backward stable. Stability is a measure of the sensitivity to rounding errors of a given numerical procedure; by contrast, the condition number of a function for a given problem indicates the inherent sensitivity of the function to small perturbations in its input and is independent of the implementation used to solve the problem.
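A tiny illustration of this idea (not Wilkinson's full analysis) is a single floating-point addition: under IEEE round-to-nearest, the computed sum $\mathrm{fl}(x + y)$ equals $(x + y)(1 + d)$ exactly, with $|d|$ no larger than the unit roundoff, i.e. it is the exact sum of slightly perturbed inputs. The sketch below recovers $d$ for one concrete pair:

```python
from fractions import Fraction
import sys

x, y = 0.1, 0.2
computed = x + y                       # fl(x + y): incurs one rounding error
true_sum = Fraction(x) + Fraction(y)   # exact sum of the stored doubles
# Backward error: the relative perturbation d with computed == (x + y)(1 + d).
d = (Fraction(computed) - true_sum) / true_sum
u = sys.float_info.epsilon / 2         # unit roundoff for double precision
```

Here $|d| \leq u \approx 1.1 \times 10^{-16}$: the result is the exact answer to a problem whose data were perturbed by less than one unit roundoff, so the addition is backward stable.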



2. Applications

2.1. Global positioning system

The analysis of errors computed using the Global Positioning System is important for understanding how GPS works and for knowing what magnitude of error to expect. GPS makes corrections for receiver clock errors and other effects, but residual errors remain that are not corrected. The Global Positioning System (GPS) was created by the United States Department of Defense (DOD) in the 1970s. It has come to be widely used for navigation, both by the U.S. military and the general public.

2.2. Molecular dynamics simulation

In molecular dynamics (MD) simulations, there are errors due to inadequate sampling of the phase space or infrequently occurring events; these lead to statistical error due to random fluctuations in the measurements.

For a series of M measurements of a fluctuating property A, the mean value is:

$\langle A\rangle = \frac{1}{M}\sum_{\mu=1}^{M} A_\mu.$

When these M measurements are independent, the variance of the mean $\langle A\rangle$ is:

$\sigma^2\langle A\rangle = \frac{1}{M}\sigma^2 A,$

but in most MD simulations there is correlation between the quantity A at different times, so the variance of the mean $\langle A\rangle$ will be underestimated, as the effective number of independent measurements is actually less than M. In such situations we rewrite the variance as:

$\sigma^2\langle A\rangle = \frac{1}{M}\sigma^2 A \left[1 + 2\sum_\mu \left(1 - \frac{\mu}{M}\right)\phi_\mu\right],$

where $\phi_\mu$ is the autocorrelation function, defined by

$\phi_\mu = \frac{\langle A_\mu A_0\rangle - \langle A\rangle^2}{\langle A^2\rangle - \langle A\rangle^2}.$

We can then use the autocorrelation function to estimate the error bar. A much simpler alternative is the method of block averaging.
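A minimal block-averaging sketch is shown below, using a synthetic AR(1) series as a stand-in for a correlated MD observable (the series length, block count, and correlation strength are arbitrary choices for the example):

```python
import random
import statistics

def block_average_error(data, n_blocks=10):
    """Estimate the standard error of the mean of correlated data by
    averaging within blocks and treating the block means as
    (approximately) independent samples."""
    size = len(data) // n_blocks
    means = [statistics.fmean(data[i * size:(i + 1) * size])
             for i in range(n_blocks)]
    return statistics.stdev(means) / n_blocks ** 0.5

# Correlated toy series: AR(1) process a_t = 0.9 * a_{t-1} + noise.
random.seed(0)
a, series = 0.0, []
for _ in range(10_000):
    a = 0.9 * a + random.gauss(0, 1)
    series.append(a)

naive = statistics.stdev(series) / len(series) ** 0.5  # ignores correlation
blocked = block_average_error(series)                   # accounts for it
```

Because successive samples are correlated, the naive $\sigma/\sqrt{M}$ estimate is too small; the block-averaged error bar is larger and closer to honest, provided each block is long compared with the correlation time.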

2.3. Scientific data verification

Measurements generally have a small amount of error, and repeated measurements of the same item will generally result in slight differences in readings. These differences can be analyzed, and follow certain known mathematical and statistical properties. Should a set of data appear to be too faithful to the hypothesis, i.e., the amount of error that would normally be in such measurements does not appear, a conclusion can be drawn that the data may have been forged. Error analysis alone is typically not sufficient to prove that data have been falsified or fabricated, but it may provide the supporting evidence necessary to confirm suspicions of misconduct.
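As an illustrative heuristic (not a rigorous test), one can compare the observed scatter of repeated readings against the claimed measurement uncertainty: readings that agree far more closely than the instrument's precision allows are suspect. The readings and uncertainty below are invented for the example; a real check would use a proper chi-square test:

```python
import statistics

def scatter_ratio(readings, claimed_sigma):
    """Ratio of observed scatter to the claimed measurement uncertainty.
    Values far below 1 mean the readings agree 'too well' with each
    other, which can be a red flag for fabricated data."""
    return statistics.stdev(readings) / claimed_sigma

# Honest readings with instrument noise of roughly 0.5 units:
honest = [10.2, 9.6, 10.4, 9.9, 10.7, 9.5, 10.1, 10.3]
# Suspiciously tight readings reported with the same claimed sigma:
too_good = [10.01, 10.00, 10.02, 9.99, 10.01, 10.00, 10.02, 10.01]

r_honest = scatter_ratio(honest, 0.5)
r_suspect = scatter_ratio(too_good, 0.5)
```

The honest series scatters at about the claimed uncertainty, while the fabricated-looking series shows far less error than such measurements would normally contain.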


