Statistical principles

Coherence (statistics)

In Bayesian decision theory, the coherence principle is the assumption that personal probabilities obey the ordinary rules of the probability calculus. The validity of these rules corresponds to the self-consistency of the agent's degrees of belief, so that consistent decisions can be obtained from these probabilities.

Conditionality principle

The conditionality principle is a Fisherian principle of statistical inference that Allan Birnbaum formally defined and studied in his 1962 JASA article. Informally, the conditionality principle can be taken as the claim that experiments which were not actually performed are statistically irrelevant. Together with the sufficiency principle, Birnbaum's version of the principle implies the famous likelihood principle. Although the relevance of the proof to data analysis remains controversial among statisticians, many Bayesians and likelihoodists consider the likelihood principle foundational ...

Craps principle

In probability theory, the craps principle is a theorem about event probabilities under repeated iid trials. Let $E_1$ and $E_2$ denote two mutually exclusive events which might occur on a given trial. Then the probability that $E_1$ occurs before $E_2$ equals the conditional probability that $E_1$ occurs given that $E_1$ or $E_2$ occurs on the next trial:

$$\Pr[E_1 \text{ before } E_2] = \Pr[E_1 \mid E_1 \cup E_2] = \frac{\Pr[E_1]}{\Pr[E_1] + \Pr[E_2]}.$$

The events $E_1$ and $E_2$ need not be co ...
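The identity can be checked by simulation. The sketch below (the function name `first_of` is illustrative, not from any source) estimates the probability that $E_1$ occurs before $E_2$ for a craps-style pair of events and compares it with the closed form $\Pr[E_1]/(\Pr[E_1]+\Pr[E_2])$:

```python
import random

random.seed(0)

def first_of(p1, p2, trials=100_000):
    """Estimate P(E1 occurs before E2) over repeated iid trials."""
    wins = 0
    for _ in range(trials):
        while True:
            u = random.random()
            if u < p1:            # E1 occurs on this trial
                wins += 1
                break
            elif u < p1 + p2:     # E2 occurs on this trial
                break
            # neither occurred: repeat the trial
    return wins / trials

# Craps-style events: E1 = roll a 4 (p = 3/36), E2 = roll a 7 (p = 6/36)
p1, p2 = 3 / 36, 6 / 36
est = first_of(p1, p2)
pred = p1 / (p1 + p2)   # conditional probability P(E1 | E1 or E2) = 1/3
print(round(est, 3), round(pred, 3))
```

The inner loop discards trials on which neither event occurs, which is exactly the conditioning the principle describes.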

Cromwell's rule

Cromwell's rule, named by statistician Dennis Lindley, states that the use of prior probabilities of 1 or 0 should be avoided, except when applied to statements that are logically true or false, such as 2 + 2 equaling 4 or 5. The reference is to Oliver Cromwell, who wrote to the General Assembly of the Church of Scotland on 5 August 1650, including a phrase that has become well known and frequently quoted: "I beseech you, in the bowels of Christ, think it possible that you may be mistaken." As Lindley puts it, assigning a probability should "leave a little probability for the moon being made of ...

Principle of indifference

The principle of indifference is a rule for assigning epistemic probabilities. The principle of indifference states that in the absence of any relevant evidence, agents should distribute their credence equally among all the possible outcomes under consideration. In Bayesian probability, this is the simplest non-informative prior. The principle of indifference is meaningless under the frequency interpretation of probability, in which probabilities are relative frequencies rather than degrees of belief in uncertain propositions, conditional upon state information.

Likelihood principle

In statistics, the likelihood principle is the proposition that, given a statistical model, all the evidence in a sample relevant to model parameters is contained in the likelihood function. A likelihood function arises from a probability density function considered as a function of its distributional parameterization argument. For example, consider a model which gives the probability density function f(x | θ) of an observable random variable X as a function of a parameter θ. Then for a specific value x of X, the function L(θ | x) = f(x | θ) is a likelihood functio ...
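A standard textbook illustration of the principle contrasts two sampling designs that yield the same data. The sketch below (numbers chosen for illustration) compares the binomial likelihood for 3 successes in 12 fixed trials with the negative-binomial likelihood for sampling until the 3rd success, which happened on trial 12; the two differ only by a constant factor, so under the likelihood principle they carry identical evidence about θ:

```python
from math import comb

# Binomial model (n = 12 fixed): L_b(θ) = C(12, 3) θ^3 (1-θ)^9
def L_binom(theta):
    return comb(12, 3) * theta**3 * (1 - theta)**9

# Negative-binomial model (stop at 3rd success): L_nb(θ) = C(11, 2) θ^3 (1-θ)^9
def L_negbin(theta):
    return comb(11, 2) * theta**3 * (1 - theta)**9

# The ratio is constant in θ: C(12,3) / C(11,2) = 220 / 55 = 4
ratios = [L_binom(t) / L_negbin(t) for t in (0.1, 0.3, 0.5, 0.7)]
print(ratios)
```

Because the ratio does not depend on θ, any inference method obeying the likelihood principle (e.g. a Bayesian posterior with a fixed prior) reaches the same conclusion under either design.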

Principle of marginality

In statistics, the principle of marginality is the fact that the average (main) effects of variables in an analysis are marginal to their interaction effect; that is, the main effect of one explanatory variable captures the effect of that variable averaged over all values of a second explanatory variable whose value influences the first variable's effect. The principle of marginality implies that, in general, it is wrong to test, estimate, or interpret main effects of explanatory variables where the variables interact or, similarly, to model interaction effects but delete main effects that are m ...
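A minimal numerical sketch (coefficients chosen arbitrarily for illustration) shows why a main effect cannot be read off in isolation when an interaction is present: with an interaction term, the slope in one variable depends on the value of the other.

```python
# Model with an interaction: y = 1 + 2*x1 + 3*x2 + 4*x1*x2
def y(x1, x2):
    return 1 + 2 * x1 + 3 * x2 + 4 * x1 * x2

# The "effect" of x1 (its slope) is 2 + 4*x2 — it changes with x2,
# so the coefficient 2 alone does not describe the effect of x1.
def slope_at(x2):
    return y(1, x2) - y(0, x2)

print(slope_at(0), slope_at(1), slope_at(2))  # 2 6 10
```

This is the situation the principle warns about: the main-effect coefficient of x1 is only meaningful relative to (marginal to) the interaction it participates in.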

Orthogonality principle

In statistics and signal processing, the orthogonality principle is a necessary and sufficient condition for the optimality of a Bayesian estimator. Loosely stated, the orthogonality principle says that the error vector of the optimal estimator is orthogonal to any possible estimator. The orthogonality principle is most commonly stated for linear estimators, but more general formulations are possible. Since the principle is a necessary and sufficient condition for optimality, it can be used to find the minimum mean square error estimator.
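A small Monte Carlo sketch of the scalar linear case (signal and noise variances chosen for illustration): to estimate X from Y = X + N with a linear estimator x̂ = a·Y, the orthogonality condition E[(X − aY)·Y] = 0 yields a = E[XY]/E[Y²], and the resulting estimation error is (empirically) orthogonal to the observation.

```python
import random

random.seed(1)

# X ~ N(0, 1) is the signal; Y = X + N with N ~ N(0, 0.25) the observation.
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x + random.gauss(0, 0.5) for x in xs]

# Sample moments needed by the orthogonality condition
Exy = sum(x * y for x, y in zip(xs, ys)) / n
Eyy = sum(y * y for y in ys) / n

# Optimal linear coefficient: theory gives 1 / (1 + 0.25) = 0.8
a = Exy / Eyy

# Check orthogonality: the error X - aY is uncorrelated with Y
err_dot_y = sum((x - a * y) * y for x, y in zip(xs, ys)) / n
print(round(a, 3), round(err_dot_y, 9))
```

Solving the orthogonality condition directly produces the minimum mean-square-error linear estimator, which is how the principle is typically used in practice.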

Sparsity-of-effects principle

In the statistical analysis of the results from factorial experiments, the sparsity-of-effects principle states that a system is usually dominated by main effects and low-order interactions. Thus it is most likely that main effects and two-factor interactions are the most significant responses in a factorial experiment. In other words, higher order interactions such as three-factor interactions are very rare. This is sometimes referred to as the hierarchical ordering principle. The sparsity-of-effects principle actually refers to the idea that only a few effects in a factorial experiment w ...

Sufficient statistic

In statistics, a statistic is sufficient with respect to a statistical model and its associated unknown parameter if "no other statistic that can be calculated from the same sample provides any additional information as to the value of the parameter". In particular, a statistic is sufficient for a family of probability distributions if the sample from which it is calculated gives no more information than the statistic does as to which of those probability distributions is that of the population from which the sample was taken. A related concept is that of linear adequacy, which is w ...
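The Fisher–Neyman factorization theorem makes this concrete for a Bernoulli(p) sample: the joint pmf factorizes as p^t (1−p)^(n−t) with t = Σxᵢ, so the sum is sufficient for p. The sketch below (helper names are illustrative) checks that the joint pmf depends on the data only through the sum:

```python
from math import prod

def joint(x, p):
    """Joint pmf of an iid Bernoulli(p) sample x."""
    return prod(p**xi * (1 - p)**(1 - xi) for xi in x)

def via_statistic(t, n, p):
    """Same pmf written as g(t, p) with h(x) = 1: p^t (1-p)^(n-t)."""
    return p**t * (1 - p)**(n - t)

x = [1, 0, 1, 1, 0]
for p in (0.2, 0.5, 0.9):
    assert abs(joint(x, p) - via_statistic(sum(x), len(x), p)) < 1e-15

# Two samples with the same sum get identical likelihoods for every p,
# so once the sum is known, the ordering of outcomes tells us nothing about p.
y = [0, 1, 1, 0, 1]
assert all(abs(joint(x, p) - joint(y, p)) < 1e-15 for p in (0.1, 0.4, 0.8))
print("sum is sufficient for Bernoulli p")
```

Equivalently, the conditional distribution of the sample given the sum is uniform over all arrangements and does not involve p, which is the defining property of sufficiency.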
