Engine docs for censored #753

Merged 5 commits on Jun 12, 2022

16 changes: 8 additions & 8 deletions man/details_linear_reg_h2o.Rd

26 changes: 13 additions & 13 deletions man/details_logistic_reg_h2o.Rd

16 changes: 8 additions & 8 deletions man/details_multinom_reg_h2o.Rd

16 changes: 8 additions & 8 deletions man/details_poisson_reg_h2o.Rd

4 changes: 3 additions & 1 deletion man/rmd/C5_rules_C5.0.md
@@ -7,10 +7,12 @@ For this engine, there is a single mode: classification



This model has 1 tuning parameters:
This model has 2 tuning parameters:

- `trees`: # Trees (type: integer, default: 1L)

- `min_n`: Minimal Node Size (type: integer, default: 2L)

Note that C5.0 has a tool for _early stopping_ during boosting, where fewer boosting iterations are performed than the number requested. `C5_rules()` turns this feature off (although it can be re-enabled using [C50::C5.0Control()]).
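
A rough sketch of re-enabling it (not part of this PR); whether `C5_rules()` forwards a `control` engine argument unchanged to `C50::C5.0()` is an assumption here:

```r
library(parsnip)
library(rules)  # assumed to register the "C5.0" engine for C5_rules()

# Hypothetical sketch: pass a C50::C5.0Control() object as an engine argument
# to turn early stopping back on (earlyStopping is a C5.0Control() option).
spec <- C5_rules(trees = 50) %>%
  set_engine("C5.0", control = C50::C5.0Control(earlyStopping = TRUE)) %>%
  set_mode("classification")
```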

## Translation from parsnip to the underlying model call (classification)
5 changes: 5 additions & 0 deletions man/rmd/bag_tree_rpart.Rmd
@@ -77,6 +77,11 @@ bag_tree(tree_depth = integer(1), min_n = integer(1), cost_complexity = double(1
```{r child = "template-uses-case-weights.Rmd"}
```

## Other details

```{r child = "template-survival-median.Rmd"}
```

## References

- Breiman L. 1996. "Bagging predictors". Machine Learning. 24 (2): 123-140
9 changes: 8 additions & 1 deletion man/rmd/bag_tree_rpart.md
@@ -107,7 +107,8 @@ bag_tree(tree_depth = integer(1), min_n = integer(1), cost_complexity = double(1
##
## Model fit template:
## ipred::bagging(formula = missing_arg(), data = missing_arg(),
## cp = double(1), maxdepth = integer(1), minsplit = integer(1))
## weights = missing_arg(), cp = double(1), maxdepth = integer(1),
## minsplit = integer(1))
```


@@ -123,6 +124,12 @@ This model can utilize case weights during model fitting. To use them, see the d

The `fit()` and `fit_xy()` functions have arguments called `case_weights` that expect vectors of case weights.
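
A minimal sketch of that interface with simulated data (not part of this PR); the data and weighting scheme are illustrative only:

```r
library(parsnip)
library(baguette)  # assumed to register the "rpart" engine for bag_tree()

set.seed(1)
dat <- data.frame(
  y  = factor(sample(c("a", "b"), 200, replace = TRUE)),
  x1 = rnorm(200),
  x2 = rnorm(200)
)
# frequency weights must be whole numbers
wts <- hardhat::frequency_weights(sample(1:3, 200, replace = TRUE))

fitted <- bag_tree() %>%
  set_engine("rpart") %>%
  set_mode("classification") %>%
  fit(y ~ x1 + x2, data = dat, case_weights = wts)
```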

## Other details



Predictions of type `"time"` are predictions of the median survival time.

## References

- Breiman L. 1996. "Bagging predictors". Machine Learning. 24 (2): 123-140
5 changes: 5 additions & 0 deletions man/rmd/boost_tree_mboost.Rmd
@@ -45,6 +45,11 @@ boost_tree() %>%
```{r child = "template-tree-split-factors.Rmd"}
```

## Other details

```{r child = "template-survival-mean.Rmd"}
```

## References

- Buehlmann P, Hothorn T. 2007. Boosting algorithms: regularization, prediction and model fitting. _Statistical Science_, 22(4), 477–505.
8 changes: 7 additions & 1 deletion man/rmd/boost_tree_mboost.md
@@ -42,7 +42,7 @@ boost_tree() %>%
##
## Model fit template:
## censored::blackboost_train(formula = missing_arg(), data = missing_arg(),
## family = mboost::CoxPH())
## weights = missing_arg(), family = mboost::CoxPH())
```

`censored::blackboost_train()` is a wrapper around [mboost::blackboost()] (and other functions) that makes it easier to run this model.
@@ -52,6 +52,12 @@

This engine does not require any special encoding of the predictors. Categorical predictors can be partitioned into groups of factor levels (e.g. `{a, c}` vs `{b, d}`) when splitting at a node. Dummy variables are not required for this model.

## Other details



Predictions of type `"time"` are predictions of the mean survival time.
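
For illustration only (not from this PR), requesting those predictions might look like the sketch below; the `survival::lung` data and the formula are assumptions:

```r
library(parsnip)
library(censored)  # assumed to register the "mboost" engine
library(survival)  # Surv() and the lung data

mod <- boost_tree() %>%
  set_engine("mboost") %>%
  set_mode("censored regression") %>%
  fit(Surv(time, status) ~ age + sex, data = lung)

# type = "time" returns the mean survival time for this engine
predict(mod, new_data = lung[1:5, ], type = "time")
```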

## References

- Buehlmann P, Hothorn T. 2007. Boosting algorithms: regularization, prediction and model fitting. _Statistical Science_, 22(4), 477–505.
5 changes: 5 additions & 0 deletions man/rmd/decision_tree_partykit.Rmd
@@ -78,6 +78,11 @@ decision_tree(tree_depth = integer(1), min_n = integer(1)) %>%
```{r child = "template-tree-split-factors.Rmd"}
```

## Other details

```{r child = "template-survival-median.Rmd"}
```

## References

- [partykit: A Modular Toolkit for Recursive Partytioning in R](https://jmlr.org/papers/v16/hothorn15a.html)
9 changes: 8 additions & 1 deletion man/rmd/decision_tree_partykit.md
@@ -104,7 +104,8 @@ decision_tree(tree_depth = integer(1), min_n = integer(1)) %>%
##
## Model fit template:
## parsnip::ctree_train(formula = missing_arg(), data = missing_arg(),
## maxdepth = integer(1), minsplit = min_rows(0L, data))
## weights = missing_arg(), maxdepth = integer(1), minsplit = min_rows(0L,
## data))
```

`censored::cond_inference_surv_ctree()` is a wrapper around [partykit::ctree()] (and other functions) that makes it easier to run this model.
@@ -114,6 +115,12 @@

This engine does not require any special encoding of the predictors. Categorical predictors can be partitioned into groups of factor levels (e.g. `{a, c}` vs `{b, d}`) when splitting at a node. Dummy variables are not required for this model.

## Other details



Predictions of type `"time"` are predictions of the median survival time.

## References

- [partykit: A Modular Toolkit for Recursive Partytioning in R](https://jmlr.org/papers/v16/hothorn15a.html)
5 changes: 5 additions & 0 deletions man/rmd/decision_tree_rpart.Rmd
@@ -70,6 +70,11 @@ decision_tree(
```{r child = "template-uses-case-weights.Rmd"}
```

## Other details

```{r child = "template-survival-mean.Rmd"}
```

## Examples

The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#decision-tree-rpart) for `decision_tree()` with the `"rpart"` engine.
10 changes: 8 additions & 2 deletions man/rmd/decision_tree_rpart.md
@@ -99,8 +99,8 @@ decision_tree(
##
## Model fit template:
## pec::pecRpart(formula = missing_arg(), data = missing_arg(),
## cp = double(1), maxdepth = integer(1), minsplit = min_rows(0L,
## data))
## weights = missing_arg(), cp = double(1), maxdepth = integer(1),
## minsplit = min_rows(0L, data))
```

## Preprocessing requirements
@@ -115,6 +115,12 @@ This model can utilize case weights during model fitting. To use them, see the d

The `fit()` and `fit_xy()` functions have arguments called `case_weights` that expect vectors of case weights.
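
A similar hedged sketch for this engine, again with simulated data (not part of this PR); whether importance weights, rather than frequency weights, are accepted here is an assumption:

```r
library(parsnip)

set.seed(2)
dat <- data.frame(y = rnorm(150), x1 = rnorm(150), x2 = rnorm(150))
# importance weights: assumed to be accepted by the rpart engine
wts <- hardhat::importance_weights(runif(150))

fitted <- decision_tree(cost_complexity = 0.01) %>%
  set_engine("rpart") %>%
  set_mode("regression") %>%
  fit(y ~ x1 + x2, data = dat, case_weights = wts)
```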

## Other details



Predictions of type `"time"` are predictions of the mean survival time.

## Examples

The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#decision-tree-rpart) for `decision_tree()` with the `"rpart"` engine.
5 changes: 3 additions & 2 deletions man/rmd/discrim_flexible_earth.md
@@ -44,8 +44,9 @@ discrim_flexible(
## Computational engine: earth
##
## Model fit template:
## mda::fda(formula = missing_arg(), data = missing_arg(), nprune = integer(0),
## degree = integer(0), pmethod = character(0), method = earth::earth)
## mda::fda(formula = missing_arg(), data = missing_arg(), weights = missing_arg(),
## nprune = integer(0), degree = integer(0), pmethod = character(0),
## method = earth::earth)
```

## Preprocessing requirements
4 changes: 2 additions & 2 deletions man/rmd/discrim_linear_mda.md
@@ -34,8 +34,8 @@ discrim_linear(penalty = numeric(0)) %>%
## Computational engine: mda
##
## Model fit template:
## mda::fda(formula = missing_arg(), data = missing_arg(), lambda = numeric(0),
## method = mda::gen.ridge, keep.fitted = FALSE)
## mda::fda(formula = missing_arg(), data = missing_arg(), weights = missing_arg(),
## lambda = numeric(0), method = mda::gen.ridge, keep.fitted = FALSE)
```

## Preprocessing requirements
4 changes: 2 additions & 2 deletions man/rmd/gen_additive_mod_mgcv.md
@@ -26,7 +26,7 @@ gen_additive_mod(adjust_deg_free = numeric(1), select_features = logical(1)) %>%
```

```
## GAM Specification (regression)
## GAM Model Specification (regression)
##
## Main Arguments:
## select_features = logical(1)
@@ -50,7 +50,7 @@ gen_additive_mod(adjust_deg_free = numeric(1), select_features = logical(1)) %>%
```

```
## GAM Specification (classification)
## GAM Model Specification (classification)
##
## Main Arguments:
## select_features = logical(1)
4 changes: 2 additions & 2 deletions man/rmd/mlp_brulee.md
@@ -53,7 +53,7 @@ mlp(
```

```
## Single Layer Neural Network Specification (regression)
## Single Layer Neural Network Model Specification (regression)
##
## Main Arguments:
## hidden_units = integer(1)
@@ -91,7 +91,7 @@ mlp(
```

```
## Single Layer Neural Network Specification (classification)
## Single Layer Neural Network Model Specification (classification)
##
## Main Arguments:
## hidden_units = integer(1)