In support of this goal, the toolbox includes descriptive models, such as the StandardMixtureModel (that of Zhang & Luck, 2008) and the SwapModel of Bays et al. (2009), as well as several explanatory models, such as the VariablePrecisionModel (e.g., Fougnie et al., 2012; van den Berg et al., 2012). Thus, with little effort, the MemToolbox can be used to simplify (and speed up) existing workflows by allowing for straightforward fitting of nearly all the standard models used in the visual working-memory literature. For more information about a particular model m, type help m at the MATLAB prompt. For example, to access the help file for StandardMixtureModel, run help StandardMixtureModel. It is also possible to view the full code for a model by running edit m. For example, to view the code for the swap model, type edit SwapModel, which will show the model's probability distribution function, the parameter ranges, and the specification of priors for the model parameters. (In fact, this applies to any function in the toolbox.)

Bayesian inference provides a rational rule for updating prior beliefs ("the prior") based on experimental data. The prior, P(θ), conveys which parameter values are thought to be reasonable, and specifying it can be as straightforward as setting upper and lower bounds (for example, bounding the guess rate between 0 and 1). Because a prior can have an arbitrarily large impact on the resulting inference, it is important both to carefully consider which distribution is appropriate and, when communicating results that depend on those inferences, to report exactly the choice that was made. Analysts add value through judicious selection of priors that faithfully reflect their beliefs. For the purposes of exploratory data analysis, it is common to use a noninformative or weakly informative prior that spreads the probability thinly over a swath of plausible parameter values (e.g., the Jeffreys prior, a class of noninformative priors that are invariant under reparameterization of the model; Jeffreys, 1946; Jaynes, 1968) to avoid an inordinate influence of the prior on inferences.

Estimating the full posterior distribution is harder than finding the maximum likelihood estimate. For some models, it is possible to derive closed-form expressions for the posterior distribution, but for most models this is intractable, and so sampling-based algorithms are used to approximate it. One such algorithm, the Metropolis-Hastings variant of Markov Chain Monte Carlo (MCMC), is applicable to a wide range of models and is thus the one used by the toolbox (Metropolis, Rosenbluth, Rosenbluth, Teller, & Teller, 1953; Hastings, 1970).
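The core Metropolis-Hastings update is compact enough to sketch. Below is a minimal, illustrative MATLAB sketch for a single parameter with a symmetric Gaussian proposal; the toy logPosterior, the variable names, and the tuning constants are all assumptions for illustration, not the toolbox's actual sampler, which handles multiple parameters, proposal tuning, and convergence diagnostics.

```matlab
% Minimal Metropolis-Hastings sketch (illustrative; not the toolbox's code).
% Toy unnormalized log-posterior: a Gaussian centered at 0.3 (hypothetical).
logPosterior = @(theta) -0.5 * (theta - 0.3)^2 / 0.01;

nSamples   = 5000;
samples    = zeros(nSamples, 1);
theta      = 0.5;    % arbitrary starting value
proposalSD = 0.05;   % width of the symmetric Gaussian proposal

for i = 1:nSamples
    proposal = theta + proposalSD * randn();  % propose a nearby value
    % Accept with probability min(1, p(proposal)/p(theta)),
    % computed on the log scale for numerical stability.
    if log(rand()) < logPosterior(proposal) - logPosterior(theta)
        theta = proposal;   % accept: move to the proposed value
    end                     % otherwise reject: stay at the current value
    samples(i) = theta;
end

% After discarding burn-in, the samples approximate the posterior.
posteriorMean = mean(samples(1001:end));
```

Because the acceptance rule depends only on the ratio of posterior densities, the normalizing constant of the posterior never needs to be computed, which is what makes the method applicable to models without closed-form posteriors.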