
Commit 8753246

Changed websites to trustworthy ones; used dollar delimited math for equation in Bayes Theorem; Made 2 headings for MCMC
1 parent 29c4d6a commit 8753246

File tree

1 file changed (+10 −5 lines)


docs/source/glossary.md

Lines changed: 10 additions & 5 deletions
@@ -3,7 +3,7 @@
 A glossary of common terms used throughout the PyMC3 documentation and examples.
 
 :::::{glossary}
-[Equidispersion](http://www.ce.memphis.edu/7012/L20_CountDataModels_v2.pdf)
+[Equidispersion](https://www.researchgate.net/publication/321375217_Extended_Poisson_INAR1_processes_with_equidispersion_underdispersion_and_overdispersion)
 If in a Poisson distribution if the variance equals the mean of the distribution, it is reffered to as equidispersion.
 
 [Generalized Poisson PMF](https://www.jstor.org/stable/1267389)
@@ -12,13 +12,18 @@ A glossary of common terms used throughout the PyMC3 documentation and examples.
 [Bayes' theorem](https://en.wikipedia.org/wiki/Bayes%27_theorem)
 Describes the probability of an event, based on prior knowledge of conditions that might be related to the event. For example, if the risk of developing health problems is known to increase with age, Bayes' theorem allows the risk to an individual of a known age to be assessed more accurately (by conditioning it on their age) than simply assuming that the individual is typical of the population as a whole.
 Formula:
-{term}`P(A|B) = (P(B|A) P(A))/P(B)`
+$$
+\begin{eqnarray}
+P(A|B) = (P(B|A) P(A))/P(B)
+\end{eqnarray}
+$$
 Where A and B are events and P(B) != 0
 
 
-[Markov Chain](https://setosa.io/ev/markov-chains/)
-A Markov chain or Markov process is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. For a visual explantation, visit the link in the title.
+[Markov Chain](https://en.wikipedia.org/wiki/Markov_chain)
+A Markov chain or Markov process is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
 
-[Markov Chain Monte Carlo (MCMC)](https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo)
+[Markov Chain Monte Carlo](https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo)
+[MCMC]
 Markov chain Monte Carlo (MCMC) methods comprise a class of algorithms for sampling from a probability distribution. By constructing a {term}`Markov Chain` that has the desired distribution as its equilibrium distribution, one can obtain a sample of the desired distribution by recording states from the chain. Various algorithms exist for constructing chains, including the Metropolis–Hastings algorithm.
 :::::
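The Bayes' theorem entry in this diff can be made concrete with a short numeric sketch. The scenario and all numbers below are hypothetical illustrations (not taken from the glossary): A is "has a condition", B is "tests positive", and P(B) is expanded via the law of total probability before applying the formula P(A|B) = P(B|A) P(A) / P(B).

```python
# Hypothetical numbers, for illustration only.
p_a = 0.01              # prior P(A): base rate of the condition
p_b_given_a = 0.95      # likelihood P(B|A): test sensitivity
p_b_given_not_a = 0.05  # false-positive rate P(B|not A)

# Law of total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 3))  # -> 0.161
```

Note how conditioning on the positive test raises the probability from the 1% prior, but only to about 16%, because false positives dominate when the base rate is low.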

0 commit comments

Comments
 (0)
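The MCMC entry mentions the Metropolis–Hastings algorithm; a minimal sketch of the idea (the symmetric-proposal Metropolis special case, with a standard normal target chosen here purely for illustration) shows how a Markov chain's equilibrium distribution becomes the sampling distribution:

```python
import math
import random

def target(x):
    """Unnormalised density of the (illustrative) standard normal target."""
    return math.exp(-0.5 * x * x)

random.seed(0)
x = 0.0
samples = []
for _ in range(50_000):
    # Symmetric random-walk proposal: next state depends only on the
    # current state, so the sampler is a Markov chain.
    proposal = x + random.uniform(-1.0, 1.0)
    # Accept with probability min(1, target(proposal) / target(x)); this
    # acceptance rule makes the target the chain's equilibrium distribution.
    if random.random() < target(proposal) / target(x):
        x = proposal
    samples.append(x)

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

Recording every state (including repeats after rejections) is what "obtain a sample ... by recording states from the chain" refers to; here the sample mean and variance should approach the target's 0 and 1.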