Don't link to mixOmics #751

Merged 4 commits · Jun 12, 2022

2 changes: 1 addition & 1 deletion DESCRIPTION
@@ -25,7 +25,7 @@ Imports:
 ggplot2,
 globals,
 glue,
-hardhat (>= 1.0.0),
+hardhat (>= 1.1.0),
 lifecycle,
 magrittr,
 prettyunits,

4 changes: 3 additions & 1 deletion man/rmd/C5_rules_C5.0.md
@@ -7,10 +7,12 @@ For this engine, there is a single mode: classification



-This model has 1 tuning parameters:
+This model has 2 tuning parameters:

 - `trees`: # Trees (type: integer, default: 1L)

+- `min_n`: Minimal Node Size (type: integer, default: 2L)
+
 Note that C5.0 has a tool for _early stopping_ during boosting where fewer boosting iterations are performed than the number requested. `C5_rules()` turns this feature off (although it can be re-enabled using [C50::C5.0Control()]).

 ## Translation from parsnip to the underlying model call (classification)

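The early-stopping note above invites a quick illustration. A minimal sketch, assuming the `rules` package provides `C5_rules()` and that arguments given to `set_engine()` are forwarded to `C50::C5.0()`:

```r
library(rules)  # assumed: provides C5_rules() and its C5.0 engine

# Sketch: re-enable C5.0's early stopping by forwarding a control object;
# earlyStopping is an option of C50::C5.0Control().
spec <- C5_rules(trees = 50) %>%
  set_engine("C5.0", control = C50::C5.0Control(earlyStopping = TRUE))
```
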
3 changes: 2 additions & 1 deletion man/rmd/bag_tree_rpart.md
@@ -107,7 +107,8 @@ bag_tree(tree_depth = integer(1), min_n = integer(1), cost_complexity = double(1
 ##
 ## Model fit template:
 ## ipred::bagging(formula = missing_arg(), data = missing_arg(),
-## cp = double(1), maxdepth = integer(1), minsplit = integer(1))
+## weights = missing_arg(), cp = double(1), maxdepth = integer(1),
+## minsplit = integer(1))
 ```

2 changes: 1 addition & 1 deletion man/rmd/boost_tree_mboost.md
@@ -42,7 +42,7 @@ boost_tree() %>%
 ##
 ## Model fit template:
 ## censored::blackboost_train(formula = missing_arg(), data = missing_arg(),
-## family = mboost::CoxPH())
+## weights = missing_arg(), family = mboost::CoxPH())
 ```

 `censored::blackboost_train()` is a wrapper around [mboost::blackboost()] (and other functions) that makes it easier to run this model.

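For the wrapper mentioned above, a hedged usage sketch, assuming the `censored` extension package registers the `mboost` engine for the censored regression mode:

```r
library(censored)  # assumed: registers the mboost engine; attaches parsnip
library(survival)  # Surv() and the lung data

# Sketch: a boosted Cox model; censored::blackboost_train() is the fit
# function shown in the template above.
mod <- boost_tree() %>%
  set_engine("mboost") %>%
  set_mode("censored regression") %>%
  fit(Surv(time, status) ~ age + ph.ecog, data = lung)
```
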
3 changes: 2 additions & 1 deletion man/rmd/decision_tree_partykit.md
@@ -104,7 +104,8 @@ decision_tree(tree_depth = integer(1), min_n = integer(1)) %>%
 ##
 ## Model fit template:
 ## parsnip::ctree_train(formula = missing_arg(), data = missing_arg(),
-## maxdepth = integer(1), minsplit = min_rows(0L, data))
+## weights = missing_arg(), maxdepth = integer(1), minsplit = min_rows(0L,
+## data))
 ```

 `censored::cond_inference_surv_ctree()` is a wrapper around [partykit::ctree()] (and other functions) that makes it easier to run this model.

4 changes: 2 additions & 2 deletions man/rmd/decision_tree_rpart.md
@@ -99,8 +99,8 @@ decision_tree(
 ##
 ## Model fit template:
 ## pec::pecRpart(formula = missing_arg(), data = missing_arg(),
-## cp = double(1), maxdepth = integer(1), minsplit = min_rows(0L,
-## data))
+## weights = missing_arg(), cp = double(1), maxdepth = integer(1),
+## minsplit = min_rows(0L, data))
 ```

 ## Preprocessing requirements

5 changes: 3 additions & 2 deletions man/rmd/discrim_flexible_earth.md
@@ -44,8 +44,9 @@ discrim_flexible(
 ## Computational engine: earth
 ##
 ## Model fit template:
-## mda::fda(formula = missing_arg(), data = missing_arg(), nprune = integer(0),
-## degree = integer(0), pmethod = character(0), method = earth::earth)
+## mda::fda(formula = missing_arg(), data = missing_arg(), weights = missing_arg(),
+## nprune = integer(0), degree = integer(0), pmethod = character(0),
+## method = earth::earth)
 ```

 ## Preprocessing requirements

4 changes: 2 additions & 2 deletions man/rmd/discrim_linear_mda.md
@@ -34,8 +34,8 @@ discrim_linear(penalty = numeric(0)) %>%
 ## Computational engine: mda
 ##
 ## Model fit template:
-## mda::fda(formula = missing_arg(), data = missing_arg(), lambda = numeric(0),
-## method = mda::gen.ridge, keep.fitted = FALSE)
+## mda::fda(formula = missing_arg(), data = missing_arg(), weights = missing_arg(),
+## lambda = numeric(0), method = mda::gen.ridge, keep.fitted = FALSE)
 ```

 ## Preprocessing requirements

4 changes: 2 additions & 2 deletions man/rmd/gen_additive_mod_mgcv.md
@@ -26,7 +26,7 @@ gen_additive_mod(adjust_deg_free = numeric(1), select_features = logical(1)) %>%
 ```

 ```
-## GAM Specification (regression)
+## GAM Model Specification (regression)
 ##
 ## Main Arguments:
 ## select_features = logical(1)
@@ -50,7 +50,7 @@ gen_additive_mod(adjust_deg_free = numeric(1), select_features = logical(1)) %>%
 ```

 ```
-## GAM Specification (classification)
+## GAM Model Specification (classification)
 ##
 ## Main Arguments:
 ## select_features = logical(1)

4 changes: 2 additions & 2 deletions man/rmd/mlp_brulee.md
@@ -53,7 +53,7 @@ mlp(
 ```

 ```
-## Single Layer Neural Network Specification (regression)
+## Single Layer Neural Network Model Specification (regression)
 ##
 ## Main Arguments:
 ## hidden_units = integer(1)
@@ -91,7 +91,7 @@ mlp(
 ```

 ```
-## Single Layer Neural Network Specification (classification)
+## Single Layer Neural Network Model Specification (classification)
 ##
 ## Main Arguments:
 ## hidden_units = integer(1)

4 changes: 2 additions & 2 deletions man/rmd/mlp_keras.md
@@ -36,7 +36,7 @@ mlp(
 ```

 ```
-## Single Layer Neural Network Specification (regression)
+## Single Layer Neural Network Model Specification (regression)
 ##
 ## Main Arguments:
 ## hidden_units = integer(1)
@@ -70,7 +70,7 @@ mlp(
 ```

 ```
-## Single Layer Neural Network Specification (classification)
+## Single Layer Neural Network Model Specification (classification)
 ##
 ## Main Arguments:
 ## hidden_units = integer(1)

4 changes: 2 additions & 2 deletions man/rmd/mlp_nnet.md
@@ -33,7 +33,7 @@ mlp(
 ```

 ```
-## Single Layer Neural Network Specification (regression)
+## Single Layer Neural Network Model Specification (regression)
 ##
 ## Main Arguments:
 ## hidden_units = integer(1)
@@ -64,7 +64,7 @@ mlp(
 ```

 ```
-## Single Layer Neural Network Specification (classification)
+## Single Layer Neural Network Model Specification (classification)
 ##
 ## Main Arguments:
 ## hidden_units = integer(1)

2 changes: 1 addition & 1 deletion man/rmd/pls_mixOmics.Rmd
@@ -42,7 +42,7 @@ pls(num_comp = integer(1), predictor_prop = double(1)) %>%
 - Determines the number of predictors in the data.
 - Adjusts `num_comp` if the value is larger than the number of factors.
 - Determines whether sparsity is required based on the value of `predictor_prop`.
-- Sets the `keepX` argument of [mixOmics::spls()] for sparse models.
+- Sets the `keepX` argument of `mixOmics::spls()` for sparse models.

 ## Translation from parsnip to the underlying model call (classification)

2 changes: 1 addition & 1 deletion man/rmd/pls_mixOmics.md
@@ -47,7 +47,7 @@ pls(num_comp = integer(1), predictor_prop = double(1)) %>%
 - Determines the number of predictors in the data.
 - Adjusts `num_comp` if the value is larger than the number of factors.
 - Determines whether sparsity is required based on the value of `predictor_prop`.
-- Sets the `keepX` argument of [mixOmics::spls()] for sparse models.
+- Sets the `keepX` argument of `mixOmics::spls()` for sparse models.

 ## Translation from parsnip to the underlying model call (classification)

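To make the `keepX` behavior in the bullets above concrete, a minimal sketch, assuming `pls()` comes from the `plsmod` extension package and that mixOmics is installed from Bioconductor:

```r
library(plsmod)  # assumed: provides pls() and attaches parsnip

# Sketch: predictor_prop < 1 requests sparsity, so this spec should be
# routed to mixOmics::spls() with keepX derived from predictor_prop.
spec <- pls(num_comp = 3, predictor_prop = 0.5) %>%
  set_engine("mixOmics") %>%
  set_mode("regression")
```
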
2 changes: 1 addition & 1 deletion man/rmd/proportional_hazards_glmnet.md
@@ -43,7 +43,7 @@ proportional_hazards(penalty = double(1), mixture = double(1)) %>%
 ##
 ## Model fit template:
 ## censored::glmnet_fit_wrapper(formula = missing_arg(), data = missing_arg(),
-## alpha = double(1))
+## weights = missing_arg(), alpha = double(1))
 ```

 ## Preprocessing requirements

2 changes: 1 addition & 1 deletion man/rmd/proportional_hazards_survival.md
@@ -28,7 +28,7 @@ proportional_hazards() %>%
 ##
 ## Model fit template:
 ## survival::coxph(formula = missing_arg(), data = missing_arg(),
-## x = TRUE, model = TRUE)
+## weights = missing_arg(), x = TRUE, model = TRUE)
 ```

 ## Other details

3 changes: 2 additions & 1 deletion man/rmd/rand_forest_partykit.md
@@ -85,7 +85,8 @@ rand_forest() %>%
 ## Computational engine: partykit
 ##
 ## Model fit template:
-## parsnip::cforest_train(formula = missing_arg(), data = missing_arg())
+## parsnip::cforest_train(formula = missing_arg(), data = missing_arg(),
+## weights = missing_arg())
 ```

 `censored::cond_inference_surv_cforest()` is a wrapper around [partykit::cforest()] (and other functions) that makes it easier to run this model.

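A hedged sketch of this engine in use, assuming the `censored` package registers `partykit` for censored regression:

```r
library(censored)  # assumed: registers the partykit engine
library(survival)

# Sketch: a conditional inference survival forest; parsnip::cforest_train()
# wraps partykit::cforest() as noted above.
mod <- rand_forest(trees = 100) %>%
  set_engine("partykit") %>%
  set_mode("censored regression") %>%
  fit(Surv(time, status) ~ age + ph.ecog, data = lung)
```
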
15 changes: 9 additions & 6 deletions man/rmd/rule_fit_xrf.md
@@ -66,9 +66,10 @@ rule_fit(
 ##
 ## Model fit template:
 ## rules::xrf_fit(formula = missing_arg(), data = missing_arg(),
-## colsample_bytree = numeric(1), nrounds = integer(1), min_child_weight = integer(1),
-## max_depth = integer(1), eta = numeric(1), gamma = numeric(1),
-## subsample = numeric(1), lambda = numeric(1))
+## xgb_control = missing_arg(), colsample_bynode = numeric(1),
+## nrounds = integer(1), min_child_weight = integer(1), max_depth = integer(1),
+## eta = numeric(1), gamma = numeric(1), subsample = numeric(1),
+## lambda = numeric(1))
 ```

 ## Translation from parsnip to the underlying model call (classification)
@@ -112,9 +113,10 @@ rule_fit(
 ##
 ## Model fit template:
 ## rules::xrf_fit(formula = missing_arg(), data = missing_arg(),
-## colsample_bytree = numeric(1), nrounds = integer(1), min_child_weight = integer(1),
-## max_depth = integer(1), eta = numeric(1), gamma = numeric(1),
-## subsample = numeric(1), lambda = numeric(1))
+## xgb_control = missing_arg(), colsample_bynode = numeric(1),
+## nrounds = integer(1), min_child_weight = integer(1), max_depth = integer(1),
+## eta = numeric(1), gamma = numeric(1), subsample = numeric(1),
+## lambda = numeric(1))
 ```

 ## Differences from the xrf package
@@ -134,6 +136,7 @@ These differences will create a disparity in the values of the `penalty` argument

 ## Preprocessing requirements

+
 Factor/categorical predictors need to be converted to numeric values (e.g., dummy or indicator variables) for this engine. When using the formula method via \\code{\\link[=fit.model_spec]{fit()}}, parsnip will convert factor columns to indicators.

 ## Other details

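For the preprocessing requirement above, a hedged sketch; `car_data` and its factor column `fuel` are hypothetical, and the `rules` package is assumed to provide the `xrf` engine:

```r
library(rules)  # assumed: provides rule_fit()'s xrf engine

# Sketch: `fuel` is a hypothetical factor predictor; the formula method in
# fit() expands it to indicator columns before xrf/xgboost sees the data.
mod <- rule_fit(trees = 10, penalty = 0.01) %>%
  set_mode("regression") %>%
  set_engine("xrf") %>%
  fit(mpg ~ displacement + fuel, data = car_data)
```
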
4 changes: 2 additions & 2 deletions man/rmd/svm_linear_LiblineaR.md
@@ -29,7 +29,7 @@ svm_linear(
 ```

 ```
-## Linear Support Vector Machine Specification (regression)
+## Linear Support Vector Machine Model Specification (regression)
 ##
 ## Main Arguments:
 ## cost = double(1)
@@ -55,7 +55,7 @@ svm_linear(
 ```

 ```
-## Linear Support Vector Machine Specification (classification)
+## Linear Support Vector Machine Model Specification (classification)
 ##
 ## Main Arguments:
 ## cost = double(1)

4 changes: 2 additions & 2 deletions man/rmd/svm_linear_kernlab.md
@@ -27,7 +27,7 @@ svm_linear(
 ```

 ```
-## Linear Support Vector Machine Specification (regression)
+## Linear Support Vector Machine Model Specification (regression)
 ##
 ## Main Arguments:
 ## cost = double(1)
@@ -53,7 +53,7 @@ svm_linear(
 ```

 ```
-## Linear Support Vector Machine Specification (classification)
+## Linear Support Vector Machine Model Specification (classification)
 ##
 ## Main Arguments:
 ## cost = double(1)

4 changes: 2 additions & 2 deletions man/rmd/svm_poly_kernlab.md
@@ -33,7 +33,7 @@ svm_poly(
 ```

 ```
-## Polynomial Support Vector Machine Specification (regression)
+## Polynomial Support Vector Machine Model Specification (regression)
 ##
 ## Main Arguments:
 ## cost = double(1)
@@ -64,7 +64,7 @@ svm_poly(
 ```

 ```
-## Polynomial Support Vector Machine Specification (classification)
+## Polynomial Support Vector Machine Model Specification (classification)
 ##
 ## Main Arguments:
 ## cost = double(1)

4 changes: 2 additions & 2 deletions man/rmd/svm_rbf_kernlab.md
@@ -32,7 +32,7 @@ svm_rbf(
 ```

 ```
-## Radial Basis Function Support Vector Machine Specification (regression)
+## Radial Basis Function Support Vector Machine Model Specification (regression)
 ##
 ## Main Arguments:
 ## cost = double(1)
@@ -60,7 +60,7 @@ svm_rbf(
 ```

 ```
-## Radial Basis Function Support Vector Machine Specification (classification)
+## Radial Basis Function Support Vector Machine Model Specification (classification)
 ##
 ## Main Arguments:
 ## cost = double(1)