
Commit d68a8a8

Updated writeup in the low-level interface notebook
1 parent 6bdce65 commit d68a8a8

File tree

1 file changed (+5, -29 lines)

demo/notebooks/prototype_interface.ipynb

Lines changed: 5 additions & 29 deletions
@@ -20,43 +20,19 @@
     "to the C++ code that doesn't require modifying any C++.\n",
     "\n",
     "To illustrate when such a prototype interface might be useful, consider\n",
-    "the classic BART algorithm:\n",
-    "\n",
-    "**INPUT**: $y$, $X$, $\\tau$, $\\nu$, $\\lambda$, $\\alpha$, $\\beta$\n",
-    "\n",
-    "**OUTPUT**: $m$ samples of a decision forest with $k$ trees and global variance parameter $\\sigma^2$\n",
-    "\n",
-    "Initialize $\\sigma^2$ via a default or a data-dependent calibration exercise\n",
-    "\n",
-    "Initialize \"forest 0\" with $k$ trees with a single root node, referring to tree $j$'s prediction vector as $f_{0,j}$\n",
-    "\n",
-    "Compute residual as $r = y - \\sum_{j=1}^k f_{0,j}$\n",
-    "\n",
-    "**FOR** $i$ **IN** $\\left\\{1,\\dots,m\\right\\}$:\n",
-    "\n",
-    "    Initialize forest $i$ from forest $i-1$\n",
-    "    \n",
-    "    **FOR** $j$ **IN** $\\left\\{1,\\dots,k\\right\\}$:\n",
-    "    \n",
-    "        Add predictions for tree $j$ to residual: $r = r + f_{i,j}$ \n",
-    "    \n",
-    "        Update tree $j$ via Metropolis-Hastings with $r$ and $X$ as data and tree priors depending on ($\\tau$, $\\sigma^2$, $\\alpha$, $\\beta$)\n",
-    "\n",
-    "        Sample leaf node parameters for tree $j$ via Gibbs (leaf node prior is $N\\left(0,\\tau\\right)$)\n",
-    "    \n",
-    "        Subtract (updated) predictions for tree $j$ from residual: $r = r - f_{i,j}$\n",
-    "\n",
-    "    Sample $\\sigma^2$ via Gibbs (prior is $IG(\\nu/2,\\nu\\lambda/2)$)\n",
+    "that the \"classic\" BART algorithm is essentially a Metropolis-within-Gibbs \n",
+    "sampler, in which the forest is sampled by MCMC, conditional on all of the \n",
+    "other model parameters, and then the model parameters are updated by Gibbs.\n",
     "\n",
     "While the algorithm itself is conceptually simple, much of the core \n",
     "computation is carried out in low-level languages such as C or C++ \n",
-    "because of the tree data structure. As a result, any changes to this \n",
+    "because of the tree data structures. As a result, any changes to this \n",
     "algorithm, such as supporting heteroskedasticity and categorical outcomes (Murray 2021) \n",
     "or causal effect estimation (Hahn et al 2020) require modifying low-level code. \n",
     "\n",
     "The prototype interface exposes the core components of the \n",
     "loop above at the R level, thus making it possible to interchange \n",
-    "C++ computation for steps like \"update tree $j$ via Metropolis-Hastings\" \n",
+    "C++ computation for steps like \"update forest via Metropolis-Hastings\" \n",
     "with R computation for a custom variance model, other user-specified additive \n",
     "mean model components, and so on."
 ]
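The Metropolis-within-Gibbs loop described in the diff above can be sketched in a few lines. This is a toy illustration, not the package's actual API: every tree is collapsed to a single root node (a constant prediction), so the Metropolis-Hastings tree-structure move becomes a no-op and only the two Gibbs steps are shown, leaf means with prior $N(0,\tau)$ and $\sigma^2$ with prior $IG(\nu/2,\nu\lambda/2)$. All function and variable names here are illustrative.

```python
import numpy as np

def toy_bart_gibbs(y, k=5, m=100, tau=0.5, nu=3.0, lamb=1.0, seed=0):
    """Toy sketch of the Metropolis-within-Gibbs loop from the notebook text.

    Simplification (not real BART): each tree is a single root node, so the
    Metropolis-Hastings structure move is omitted and only the Gibbs updates
    for the leaf means and the global variance sigma^2 are shown.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    f = np.zeros((k, n))           # tree j's prediction vector f_{i,j}
    sigma2 = np.var(y)             # data-dependent initialization
    r = y - f.sum(axis=0)          # residual r = y - sum_j f_{0,j}
    draws = []
    for i in range(m):
        for j in range(k):
            r = r + f[j]           # add tree j's predictions back to residual
            # (a real sampler would propose a tree-structure change here via
            #  Metropolis-Hastings, with r and X as data)
            # Gibbs draw for the single leaf mean, prior N(0, tau):
            prec = n / sigma2 + 1.0 / tau
            mu = rng.normal((r.sum() / sigma2) / prec, np.sqrt(1.0 / prec))
            f[j] = mu              # constant (root-only) prediction
            r = r - f[j]           # subtract updated predictions for tree j
        # Gibbs draw for sigma^2, prior IG(nu/2, nu*lambda/2),
        # sampled as scale / Gamma(shape) since 1/Gamma is inverse-gamma:
        shape = (nu + n) / 2.0
        scale = (nu * lamb + r @ r) / 2.0
        sigma2 = scale / rng.gamma(shape)
        draws.append(sigma2)
    return np.array(draws)

y = np.random.default_rng(1).normal(2.0, 1.0, size=200)
sigma2_draws = toy_bart_gibbs(y)
print(sigma2_draws.shape)  # -> (100,)
```

The residual bookkeeping (add tree $j$ back in, update it, subtract it out) is the part the prototype interface exposes, so that any single step of the loop can be swapped for user-level code.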

0 commit comments
