CU-5t5y0p Regarding issue #222 of pymc-devs/pymc-examples #4986
@@ -17,4 +17,20 @@ Functional Programming
This contrasts with functions or methods that depend on variables that are not explicitly passed as an input (such as accessing `self.variable` inside a method) or that alter the inputs or other state variables in-place, instead of returning new distinct variables as outputs.
Dispatching
Choosing which function or method implementation to use based on the type of the input variables (usually just the first variable). For some examples, see Python's documentation for the [singledispatch](https://docs.python.org/3/library/functools.html#functools.singledispatch) decorator.
:::::
[Equidispersion](http://www.ce.memphis.edu/7012/L20_CountDataModels_v2.pdf)
When the variance of a Poisson distribution equals its mean, the distribution is referred to as equidispersed.
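A minimal numerical check of equidispersion, assuming NumPy is available: draws from a Poisson distribution should have a sample variance close to their sample mean (both estimate the rate parameter).

```python
# Check that Poisson draws are equidispersed (sample variance ≈ sample mean).
import numpy as np

rng = np.random.default_rng(0)
rate = 4.0                       # Poisson rate parameter, often written lambda
draws = rng.poisson(rate, size=100_000)

# Both moments of a Poisson distribution equal the rate, so the two
# sample statistics should nearly coincide.
print(draws.mean(), draws.var())
```

With 100,000 draws both statistics land very close to the rate of 4, illustrating the definition above.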
[Generalized Poisson PMF](https://www.sciencedirect.com/science/article/pii/S0047259X14000256)
A generalization of the {term}`Poisson distribution`, with two parameters X1 and X2, obtained as a limiting form of the {term}`generalized negative binomial distribution`. The variance of the distribution is greater than, equal to, or smaller than the mean according as X2 is positive, zero, or negative. For the formula and more detail, see the link.
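A sketch of this PMF, assuming the Consul–Jain parameterization P(x) = θ(θ + λx)^(x−1) e^(−θ−λx) / x!, with θ in the role of X1 and λ in the role of X2 (an assumption; the linked paper may use different symbols). Evaluating it in log space avoids overflow for large x, and with λ > 0 the computed variance exceeds the mean, matching the statement above.

```python
# Generalized Poisson PMF in the (assumed) Consul–Jain parameterization:
#   P(x) = theta * (theta + lam*x)**(x-1) * exp(-theta - lam*x) / x!
import math

def gen_poisson_logpmf(x, theta, lam):
    # log of the PMF; lgamma(x + 1) = log(x!)
    return (math.log(theta) + (x - 1) * math.log(theta + lam * x)
            - theta - lam * x - math.lgamma(x + 1))

theta, lam = 2.0, 0.3            # lam > 0, so we expect overdispersion
xs = range(200)                  # the tail beyond 200 is negligible here
probs = [math.exp(gen_poisson_logpmf(x, theta, lam)) for x in xs]

mean = sum(x * p for x, p in zip(xs, probs))
var = sum((x - mean) ** 2 * p for x, p in zip(xs, probs))
print(mean, var)                 # variance exceeds the mean because lam > 0
```

In this parameterization the mean is θ/(1−λ) and the variance θ/(1−λ)³, so λ = 0 recovers the equidispersed Poisson case.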
[Bayes' theorem](https://en.wikipedia.org/wiki/Bayes%27_theorem)
Describes the probability of an event, based on prior knowledge of conditions that might be related to the event. For example, if the risk of developing health problems is known to increase with age, Bayes' theorem allows the risk to an individual of a known age to be assessed more accurately (by conditioning it on their age) than simply assuming that the individual is typical of the population as a whole.
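The age/health example can be put into numbers. All probabilities below are made up for illustration; the calculation is just P(problem | age) = P(age | problem) · P(problem) / P(age).

```python
# Bayes' theorem with hypothetical numbers for the age/health example.
p_problem = 0.05            # hypothetical prior: 5% of the population is affected
p_age_given_problem = 0.60  # hypothetical: 60% of affected people are in this age group
p_age = 0.20                # hypothetical: 20% of the population is in this age group

p_problem_given_age = p_age_given_problem * p_problem / p_age
print(p_problem_given_age)  # 0.15
```

Conditioning on age raises the assessed risk from the unconditional 5% to 15%, three times higher, which is exactly the sharper assessment the definition describes.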
[Markov Chain](https://setosa.io/ev/markov-chains/) | ||
A Markov chain or Markov process is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. For a visual explanation, visit the link.
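A minimal two-state example, with made-up transition probabilities: each step looks only at the current state, and the long-run fraction of time in each state converges to the chain's stationary distribution.

```python
# A two-state Markov chain (hypothetical weather model).
import numpy as np

states = ["sunny", "rainy"]
# P[i, j] = probability of moving from state i to state j
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

rng = np.random.default_rng(0)
state = 0
counts = np.zeros(2)
for _ in range(100_000):
    # the next state depends only on the current state (Markov property)
    state = rng.choice(2, p=P[state])
    counts[state] += 1

fractions = counts / counts.sum()
print(fractions)  # long-run fractions approach the stationary distribution
```

Solving π·P = π for this matrix gives π = (5/6, 1/6), and the simulated fractions converge to those values.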
[Markov Chain Monte Carlo](https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo) | ||
[MCMC](https://machinelearningmastery.com/markov-chain-monte-carlo-for-probability/) | ||
Review comment: Let's leave only one title (and one title link) for Markov Chain Monte Carlo.
Reply (OriolAbril): I very much prefer to have the two titles, especially thinking about linking to that term, but I agree both should have the same link. If we use … to link to it. Otherwise we should be able to use both …, or maybe only one of them; not completely sure how it works (if both are usable, only the first, only the last...), but I do know the glossary allows multiple names per definition.
Markov chain Monte Carlo (MCMC) methods comprise a class of algorithms for sampling from a probability distribution. By constructing a {term}`Markov Chain` that has the desired distribution as its equilibrium distribution, one can obtain a sample of the desired distribution by recording states from the chain. Various algorithms exist for constructing chains, including the Metropolis–Hastings algorithm.
:::::
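The definition above can be sketched in a few lines. This is a toy random-walk Metropolis–Hastings sampler targeting a standard normal density (a stand-in for the "desired distribution"; the step size and iteration count are arbitrary choices, not recommendations).

```python
# Minimal random-walk Metropolis–Hastings sketch targeting N(0, 1).
import numpy as np

def log_target(x):
    return -0.5 * x * x          # log density of N(0, 1), up to a constant

rng = np.random.default_rng(0)
x = 0.0
samples = []
for _ in range(50_000):
    proposal = x + rng.normal(scale=1.0)  # symmetric random-walk proposal
    # accept with probability min(1, target(proposal) / target(x)),
    # compared in log space for numerical stability
    if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
        x = proposal
    samples.append(x)            # record states from the chain

samples = np.array(samples)
print(samples.mean(), samples.std())
```

Because the chain's equilibrium distribution is the target, the recorded states have mean near 0 and standard deviation near 1; real samplers such as those in PyMC use far more sophisticated proposals, but the accept/reject core is the same.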