  • Publication
    Invariants in probabilistic reasoning
    (Elsevier, 2018-02)
    Recent research has identified three invariants or identities that appear to hold in people's probabilistic reasoning: the QQ identity, the addition law identity, and the Bayes rule identity (Costello and Watts, 2014, 2016a; Fisher and Wolfe, 2014; Wang and Busemeyer, 2013; Wang et al., 2014). Each of these identities represents specific agreement with the requirements of normative probability theory; strikingly, these identities seem to hold in people's judgements despite the presence of strong and systematic biases against the requirements of normative probability theory in those very same judgements. These results suggest that the systematic biases seen in people's probabilistic reasoning follow mathematical rules: for these particular identities, these rules cause an overall cancellation of biases and so produce agreement with normative requirements. We assess two competing mathematical models of probabilistic reasoning (the ‘probability theory plus noise’ model and the ‘quantum probability’ model) in terms of their ability to account for this pattern of systematic biases and invariant identities.
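    The cancellation described above can be illustrated with a minimal simulation of the probability-theory-plus-noise idea: each judgment is a relative frequency in which stored instances are misread with some probability d, so individual estimates are biased toward 0.5, yet the combination P(A) + P(B) − P(A∧B) − P(A∨B) stays near its normative value of 0. The parameters d and n and the joint probabilities below are illustrative, not taken from the paper.

    ```python
    import random

    def noisy_estimate(p_true, n=2000, d=0.1, rng=None):
        """Relative-frequency estimate of p_true in which each stored
        instance is misread (its truth value flipped) with probability d."""
        rng = rng or random.Random(0)
        hits = 0
        for _ in range(n):
            event = rng.random() < p_true      # draw a true instance
            if rng.random() < d:               # read error flips it
                event = not event
            hits += event
        return hits / n

    rng = random.Random(42)
    # An illustrative joint distribution: p(A)=0.6, p(B)=0.5,
    # p(A and B)=0.4, hence p(A or B) = 0.6 + 0.5 - 0.4 = 0.7.
    probs = {"A": 0.6, "B": 0.5, "AandB": 0.4, "AorB": 0.7}
    est = {k: sum(noisy_estimate(p, rng=rng) for _ in range(100)) / 100
           for k, p in probs.items()}

    bias = est["A"] - probs["A"]                            # pulled toward 0.5
    residual = est["A"] + est["B"] - est["AandB"] - est["AorB"]
    print(f"bias on P(A): {bias:+.3f}, identity residual: {residual:+.4f}")
    ```

    Each individual estimate is measurably biased, but the biases on the four terms cancel, so the identity's residual is close to zero.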
  • Publication
    The rationality of illusory correlation
    (American Psychological Association, 2019-04-01)
    When presented with 2 samples (a smaller sample from a Minority population and a larger sample from a Majority population), where some rare or frequent features occur at exactly the same rate in both samples, people reliably associate the rare feature with the Minority population and the frequent feature with the Majority population. This pattern is referred to as "illusory correlation," reflecting the standard assumption that such associations are fundamentally irrational. In this article we show that this assumption is incorrect, and demonstrate that this pattern of association linking rare features with the Minority and frequent features with the Majority (given a sample where those features occurred at the same proportion in both categories, and no further information) is in fact correct, following a result in epistemic probability theory known as the "Rule of Succession." Building on this result, we present a new computational model of frequency-based illusory correlation, based on the Rule of Succession. We also discuss the implications of the Rule of Succession for our understanding of various other cognitive biases.
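    The Rule of Succession argument can be checked with direct arithmetic. Laplace's rule says that after observing a feature k times in n observations, the probability that the next observation has the feature is (k+1)/(n+2); with the same observed rate in both samples, the smaller (Minority) sample is pulled more strongly toward 1/2. The sample sizes and rates below are illustrative, not taken from the paper.

    ```python
    from fractions import Fraction

    def rule_of_succession(k, n):
        """Laplace's Rule of Succession: after k occurrences of a feature
        in n observations, the probability that the next observation has
        the feature is (k + 1) / (n + 2)."""
        return Fraction(k + 1, n + 2)

    # Same rare-feature rate (1/3) in both samples, different sample sizes:
    minority_rare = rule_of_succession(3, 9)    # 3 of 9  -> 4/11, about 0.364
    majority_rare = rule_of_succession(6, 18)   # 6 of 18 -> 7/20, exactly 0.350
    print(minority_rare > majority_rare)        # rare feature -> Minority

    # Same frequent-feature rate (2/3) in both samples:
    minority_freq = rule_of_succession(6, 9)    # 7/11, about 0.636
    majority_freq = rule_of_succession(12, 18)  # 13/20, exactly 0.650
    print(minority_freq < majority_freq)        # frequent feature -> Majority
    ```

    Both prints produce True: the rational estimator itself links the rare feature to the smaller sample and the frequent feature to the larger one, which is exactly the "illusory" correlation pattern.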
  • Publication
    Surprisingly rational: Probability theory plus noise explains biases in judgment
    (American Psychological Association, 2014-07)
    The systematic biases seen in people’s probability judgments are typically taken as evidence that people do not use the rules of probability theory when reasoning about probability but instead use heuristics, which sometimes yield reasonable judgments and sometimes yield systematic biases. This view has had a major impact in economics, law, medicine, and other fields; indeed, the idea that people cannot reason with probabilities has become a truism. We present a simple alternative to this view, where people reason about probability according to probability theory but are subject to random variation or noise in the reasoning process. In this account the effect of noise is canceled for some probabilistic expressions. Analyzing data from 2 experiments, we find that, for these expressions, people’s probability judgments are strikingly close to those required by probability theory. For other expressions, this account produces systematic deviations in probability estimates. These deviations explain 4 reliable biases in human probabilistic reasoning (conservatism, subadditivity, conjunction, and disjunction fallacies). These results suggest that people’s probability judgments embody the rules of probability theory and that biases in those judgments are due to the effects of random noise.
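    The core of the noise account can be written in one line. If each counted instance is misread with probability d, a true positive survives with probability 1 − d and a true negative is flipped in with probability d, so the expected estimate is (1 − 2d)p + d: small probabilities are overestimated, large ones underestimated, and 0.5 is unchanged, which is the conservatism bias. The error rate d below is illustrative.

    ```python
    def expected_estimate(p, d):
        """Expected value of a noisy frequency estimate: a true positive
        is read correctly with probability 1-d, and a true negative is
        flipped into a positive with probability d, giving
        E[p_hat] = (1-d)*p + d*(1-p) = (1-2*d)*p + d."""
        return (1 - 2 * d) * p + d

    d = 0.15                               # illustrative read-error rate
    for p in (0.1, 0.3, 0.5, 0.7, 0.9):
        # small p overestimated, large p underestimated, 0.5 unchanged
        print(f"true {p:.1f} -> expected estimate {expected_estimate(p, d):.2f}")
    ```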
  • Publication
    People's conditional probability judgments follow probability theory (plus noise)
    (Elsevier, 2016-09)
    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
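    A minimal sketch of the kind of estimator described above: P(A|B) is taken to be the ratio of reads in which both flags come up true to reads in which the B flag comes up true, with each flag independently misread with probability d. The joint distribution, n, and d below are illustrative, not taken from the paper; the point is only that the ratio-of-noisy-counts estimator is itself biased, which is why agreement with probability theory appears only in the noise-cancelling identities.

    ```python
    import random

    def noisy_conditional(joint, n=200_000, d=0.1, seed=1):
        """Estimate P(A|B) as (reads of 'A and B') / (reads of 'B'),
        where each instance's A flag and B flag are independently
        misread with probability d."""
        rng = random.Random(seed)
        cells = list(joint.items())
        b_reads = ab_reads = 0
        for _ in range(n):
            r = rng.random()
            for (a, b), p in cells:          # draw a true (A, B) instance
                if r < p:
                    break
                r -= p
            if rng.random() < d:
                a = not a                    # read error on the A flag
            if rng.random() < d:
                b = not b                    # read error on the B flag
            if b:
                b_reads += 1
                ab_reads += a
        return ab_reads / b_reads

    # True joint: p(A,B)=0.3, p(A,not B)=0.2, p(not A,B)=0.1,
    # p(not A, not B)=0.4, so the true conditional is P(A|B) = 0.3/0.4 = 0.75.
    joint = {(True, True): 0.3, (True, False): 0.2,
             (False, True): 0.1, (False, False): 0.4}
    est = noisy_conditional(joint)
    print(f"true P(A|B) = 0.75, noisy estimate = {est:.3f}")
    ```

    The noisy estimate comes out well below the true 0.75, pulled toward 0.5 by the read errors in both numerator and denominator.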
  • Publication
    Surprising rationality in probability judgment: Assessing two competing models
    We describe 4 experiments testing contrasting predictions of two recent models of probability judgment: the quantum probability model (Busemeyer, Pothos, Franco, & Trueblood, 2011) and the probability theory plus noise model (Costello & Watts, 2014, 2016a). Both models assume that people estimate probability using formal processes that follow or subsume standard probability theory. One set of predictions concerned agreement between people's probability estimates and standard probability theory identities. The quantum probability model predicts people's estimates should agree with one set of identities, while the probability theory plus noise model predicts a specific pattern of violation of those identities. Experimental results show the specific pattern of violation predicted by the probability theory plus noise model. Another set of predictions concerned the conjunction fallacy, which occurs when people judge the probability of a conjunction P(A∧B) to be greater than one or the other of its constituent probabilities P(A) or P(B), contrary to the requirements of probability theory. In cases where A causes B, the quantum probability model predicts that the conjunction fallacy should only occur for constituent B and not for constituent A; the noise model predicts that the fallacy should occur for both A and B. Experimental results show that the fallacy occurs equally for both, contrary to the quantum probability prediction. These results suggest that people's probability estimates do not follow quantum probability theory. These results support the idea that people estimate probabilities using mechanisms that follow standard probability theory but are subject to random noise.
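    How random noise produces the conjunction fallacy at all can be shown with a small simulation: when each estimate is a noisy frequency built from a handful of (mis)read instances, the estimate of P(A∧B) will sometimes exceed the estimate of P(A) purely by chance, and the closer the true probabilities are, the more often this happens. This sketch treats the two estimates as independent reads, a simplification; n, d, and the probabilities are illustrative.

    ```python
    import random

    def fallacy_rate(p_a, p_ab, n=20, d=0.15, trials=20_000, seed=7):
        """Fraction of trials in which a noisy frequency estimate of
        P(A and B) exceeds the noisy estimate of P(A), even though
        p_ab <= p_a. Each estimate is built from n independently
        (mis)read instances, each flipped with probability d."""
        rng = random.Random(seed)

        def est(p):
            k = 0
            for _ in range(n):
                event = rng.random() < p
                if rng.random() < d:       # read error flips the instance
                    event = not event
                k += event
            return k / n

        hits = sum(est(p_ab) > est(p_a) for _ in range(trials))
        return hits / trials

    near = fallacy_rate(0.85, 0.80)   # conjunction close to constituent
    far = fallacy_rate(0.85, 0.30)    # conjunction far below constituent
    print(f"fallacy rate, near: {near:.2f}; far: {far:.3f}")
    ```

    The fallacy is frequent when the conjunction's true probability is close to the constituent's and nearly vanishes when it is far below, matching the noise model's qualitative prediction.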
  • Publication
    An upper bound on Jacobsthal's function
    (American Mathematical Society, 2014-11)
    The function h(k) represents the smallest number m such that every sequence of m consecutive integers contains an integer coprime to the first k primes. We give a new computational method for calculating strong upper bounds on h(k).
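    For small k, h(k) can be computed exactly by brute force: divisibility by the first k primes is periodic with period equal to their product, so it suffices to find the longest run of consecutive integers in one period that all share a factor with that product. The sketch below is a naive check, not the computational method of the paper.

    ```python
    def primes(k):
        """First k primes by trial division (fine for small k)."""
        out, n = [], 2
        while len(out) < k:
            if all(n % p for p in out):
                out.append(n)
            n += 1
        return out

    def h(k):
        """Smallest m such that every m consecutive integers contain one
        coprime to the first k primes: one more than the longest run of
        integers all divisible by some prime among the first k."""
        ps = primes(k)
        period = 1
        for p in ps:
            period *= p
        longest = run = 0
        # scan two periods so runs wrapping past a period boundary are caught
        for n in range(2 * period):
            if any(n % p == 0 for p in ps):
                run += 1
                longest = max(longest, run)
            else:
                run = 0
        return longest + 1

    print([h(k) for k in range(1, 6)])   # known values: [2, 4, 6, 10, 14]
    ```

    Beyond the first few primes the period grows as the primorial, which is why the paper's stronger computational method is needed for large k.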
  • Publication
    Analogical Retrieval
    (University College Dublin. School of Computer Science and Informatics, 2007-11-29)
    We observe that thus far all computational models of analogy have modelled memory as a set of disjoint, encapsulated domains. As there does not appear to be any psychological evidence for modelling memory in this way, we suggest that a more realistic model of analogy could be constructed if memory were modelled as one large data structure. We argue that the retrieval sub-process of analogy may not be independent of the mapping sub-process, and that both processes may well be governed by structural similarity. We describe a computational model of analogy which incorporates these three ideas; it models mapping and retrieval together, uses structural similarity to govern matching, and models memory as one large data structure. Retrieval in this system corresponds to the searching of the data structure for analogical matches to a supplied probe. We suggest a practical and efficient algorithm for such retrieval.