
Commit cf06205

DavisVaughan and topepo authored
Don't link to mixOmics (#751)
* Don't link to mixOmics

  This causes a Note about mixOmics not being available for cross referencing when parsnip is checked as a revdep of any package that it depends on, like hardhat. That forces a manual inspection by CRAN.

* hardhat update

* re-doc

Co-authored-by: Max Kuhn <[email protected]>
1 parent d879482 commit cf06205

22 files changed: +47 −38 lines

DESCRIPTION

Lines changed: 1 addition & 1 deletion

````diff
@@ -25,7 +25,7 @@ Imports:
     ggplot2,
     globals,
     glue,
-    hardhat (>= 1.0.0),
+    hardhat (>= 1.1.0),
     lifecycle,
     magrittr,
     prettyunits,
````

man/rmd/C5_rules_C5.0.md

Lines changed: 3 additions & 1 deletion

````diff
@@ -7,10 +7,12 @@ For this engine, there is a single mode: classification
 
 
 
-This model has 1 tuning parameters:
+This model has 2 tuning parameters:
 
 - `trees`: # Trees (type: integer, default: 1L)
 
+- `min_n`: Minimal Node Size (type: integer, default: 2L)
+
 Note that C5.0 has a tool for _early stopping_ during boosting where less iterations of boosting are performed than the number requested. `C5_rules()` turns this feature off (although it can be re-enabled using [C50::C5.0Control()]).
 
 ## Translation from parsnip to the underlying model call (classification)
````
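As context for the note above about early stopping: a hedged sketch (untested; it assumes the parsnip, rules, and C50 packages are installed and that `C5_rules()` forwards engine arguments to `C50::C5.0()`) of re-enabling the early stopping that `C5_rules()` turns off by default:

```r
library(parsnip)
library(rules)  # provides C5_rules()

# C5_rules() disables C5.0's early stopping by default; passing a
# C50::C5.0Control() object through set_engine() is one way to restore it.
spec <- C5_rules(trees = 50) %>%
  set_engine("C5.0", control = C50::C5.0Control(earlyStopping = TRUE))

# Fit as usual with a factor outcome (this model is classification-only):
fitted <- fit(spec, Species ~ ., data = iris)
```

The exact argument plumbing may differ between rules package versions; check `?rules::C5_rules` before relying on this.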

man/rmd/bag_tree_rpart.md

Lines changed: 2 additions & 1 deletion

````diff
@@ -107,7 +107,8 @@ bag_tree(tree_depth = integer(1), min_n = integer(1), cost_complexity = double(1
 ##
 ## Model fit template:
 ## ipred::bagging(formula = missing_arg(), data = missing_arg(),
-##     cp = double(1), maxdepth = integer(1), minsplit = integer(1))
+##     weights = missing_arg(), cp = double(1), maxdepth = integer(1),
+##     minsplit = integer(1))
 ```
 
 
````

man/rmd/boost_tree_mboost.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -42,7 +42,7 @@ boost_tree() %>%
 ##
 ## Model fit template:
 ## censored::blackboost_train(formula = missing_arg(), data = missing_arg(),
-##     family = mboost::CoxPH())
+##     weights = missing_arg(), family = mboost::CoxPH())
 ```
 
 `censored::blackboost_train()` is a wrapper around [mboost::blackboost()] (and other functions) that makes it easier to run this model.
````

man/rmd/decision_tree_partykit.md

Lines changed: 2 additions & 1 deletion

````diff
@@ -104,7 +104,8 @@ decision_tree(tree_depth = integer(1), min_n = integer(1)) %>%
 ##
 ## Model fit template:
 ## parsnip::ctree_train(formula = missing_arg(), data = missing_arg(),
-##     maxdepth = integer(1), minsplit = min_rows(0L, data))
+##     weights = missing_arg(), maxdepth = integer(1), minsplit = min_rows(0L,
+##     data))
 ```
 
 `censored::cond_inference_surv_ctree()` is a wrapper around [partykit::ctree()] (and other functions) that makes it easier to run this model.
````

man/rmd/decision_tree_rpart.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -99,8 +99,8 @@ decision_tree(
 ##
 ## Model fit template:
 ## pec::pecRpart(formula = missing_arg(), data = missing_arg(),
-##     cp = double(1), maxdepth = integer(1), minsplit = min_rows(0L,
-##     data))
+##     weights = missing_arg(), cp = double(1), maxdepth = integer(1),
+##     minsplit = min_rows(0L, data))
 ```
 
 ## Preprocessing requirements
````

man/rmd/discrim_flexible_earth.md

Lines changed: 3 additions & 2 deletions

````diff
@@ -44,8 +44,9 @@ discrim_flexible(
 ## Computational engine: earth
 ##
 ## Model fit template:
-## mda::fda(formula = missing_arg(), data = missing_arg(), nprune = integer(0),
-##     degree = integer(0), pmethod = character(0), method = earth::earth)
+## mda::fda(formula = missing_arg(), data = missing_arg(), weights = missing_arg(),
+##     nprune = integer(0), degree = integer(0), pmethod = character(0),
+##     method = earth::earth)
 ```
 
 ## Preprocessing requirements
````

man/rmd/discrim_linear_mda.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -34,8 +34,8 @@ discrim_linear(penalty = numeric(0)) %>%
 ## Computational engine: mda
 ##
 ## Model fit template:
-## mda::fda(formula = missing_arg(), data = missing_arg(), lambda = numeric(0),
-##     method = mda::gen.ridge, keep.fitted = FALSE)
+## mda::fda(formula = missing_arg(), data = missing_arg(), weights = missing_arg(),
+##     lambda = numeric(0), method = mda::gen.ridge, keep.fitted = FALSE)
 ```
 
 ## Preprocessing requirements
````

man/rmd/gen_additive_mod_mgcv.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -26,7 +26,7 @@ gen_additive_mod(adjust_deg_free = numeric(1), select_features = logical(1)) %>%
 ```
 
 ```
-## GAM Specification (regression)
+## GAM Model Specification (regression)
 ##
 ## Main Arguments:
 ##   select_features = logical(1)
@@ -50,7 +50,7 @@ gen_additive_mod(adjust_deg_free = numeric(1), select_features = logical(1)) %>%
 ```
 
 ```
-## GAM Specification (classification)
+## GAM Model Specification (classification)
 ##
 ## Main Arguments:
 ##   select_features = logical(1)
````

man/rmd/mlp_brulee.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -53,7 +53,7 @@ mlp(
 ```
 
 ```
-## Single Layer Neural Network Specification (regression)
+## Single Layer Neural Network Model Specification (regression)
 ##
 ## Main Arguments:
 ##   hidden_units = integer(1)
@@ -91,7 +91,7 @@ mlp(
 ```
 
 ```
-## Single Layer Neural Network Specification (classification)
+## Single Layer Neural Network Model Specification (classification)
 ##
 ## Main Arguments:
 ##   hidden_units = integer(1)
````

man/rmd/mlp_keras.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -36,7 +36,7 @@ mlp(
 ```
 
 ```
-## Single Layer Neural Network Specification (regression)
+## Single Layer Neural Network Model Specification (regression)
 ##
 ## Main Arguments:
 ##   hidden_units = integer(1)
@@ -70,7 +70,7 @@ mlp(
 ```
 
 ```
-## Single Layer Neural Network Specification (classification)
+## Single Layer Neural Network Model Specification (classification)
 ##
 ## Main Arguments:
 ##   hidden_units = integer(1)
````

man/rmd/mlp_nnet.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -33,7 +33,7 @@ mlp(
 ```
 
 ```
-## Single Layer Neural Network Specification (regression)
+## Single Layer Neural Network Model Specification (regression)
 ##
 ## Main Arguments:
 ##   hidden_units = integer(1)
@@ -64,7 +64,7 @@ mlp(
 ```
 
 ```
-## Single Layer Neural Network Specification (classification)
+## Single Layer Neural Network Model Specification (classification)
 ##
 ## Main Arguments:
 ##   hidden_units = integer(1)
````

man/rmd/pls_mixOmics.Rmd

Lines changed: 1 addition & 1 deletion

````diff
@@ -42,7 +42,7 @@ pls(num_comp = integer(1), predictor_prop = double(1)) %>%
 - Determines the number of predictors in the data.
 - Adjusts `num_comp` if the value is larger than the number of factors.
 - Determines whether sparsity is required based on the value of `predictor_prop`.
-- Sets the `keepX` argument of [mixOmics::spls()] for sparse models.
+- Sets the `keepX` argument of `mixOmics::spls()` for sparse models.
 
 ## Translation from parsnip to the underlying model call (classification)
 
````
man/rmd/pls_mixOmics.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -47,7 +47,7 @@ pls(num_comp = integer(1), predictor_prop = double(1)) %>%
 - Determines the number of predictors in the data.
 - Adjusts `num_comp` if the value is larger than the number of factors.
 - Determines whether sparsity is required based on the value of `predictor_prop`.
-- Sets the `keepX` argument of [mixOmics::spls()] for sparse models.
+- Sets the `keepX` argument of `mixOmics::spls()` for sparse models.
 
 ## Translation from parsnip to the underlying model call (classification)
 
````

man/rmd/proportional_hazards_glmnet.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -43,7 +43,7 @@ proportional_hazards(penalty = double(1), mixture = double(1)) %>%
 ##
 ## Model fit template:
 ## censored::glmnet_fit_wrapper(formula = missing_arg(), data = missing_arg(),
-##     alpha = double(1))
+##     weights = missing_arg(), alpha = double(1))
 ```
 
 ## Preprocessing requirements
````

man/rmd/proportional_hazards_survival.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -28,7 +28,7 @@ proportional_hazards() %>%
 ##
 ## Model fit template:
 ## survival::coxph(formula = missing_arg(), data = missing_arg(),
-##     x = TRUE, model = TRUE)
+##     weights = missing_arg(), x = TRUE, model = TRUE)
 ```
 
 ## Other details
````

man/rmd/rand_forest_partykit.md

Lines changed: 2 additions & 1 deletion

````diff
@@ -85,7 +85,8 @@ rand_forest() %>%
 ## Computational engine: partykit
 ##
 ## Model fit template:
-## parsnip::cforest_train(formula = missing_arg(), data = missing_arg())
+## parsnip::cforest_train(formula = missing_arg(), data = missing_arg(),
+##     weights = missing_arg())
 ```
 
 `censored::cond_inference_surv_cforest()` is a wrapper around [partykit::cforest()] (and other functions) that makes it easier to run this model.
````

man/rmd/rule_fit_xrf.md

Lines changed: 9 additions & 6 deletions

````diff
@@ -66,9 +66,10 @@ rule_fit(
 ##
 ## Model fit template:
 ## rules::xrf_fit(formula = missing_arg(), data = missing_arg(),
-##     colsample_bytree = numeric(1), nrounds = integer(1), min_child_weight = integer(1),
-##     max_depth = integer(1), eta = numeric(1), gamma = numeric(1),
-##     subsample = numeric(1), lambda = numeric(1))
+##     xgb_control = missing_arg(), colsample_bynode = numeric(1),
+##     nrounds = integer(1), min_child_weight = integer(1), max_depth = integer(1),
+##     eta = numeric(1), gamma = numeric(1), subsample = numeric(1),
+##     lambda = numeric(1))
 ```
 
 ## Translation from parsnip to the underlying model call (classification)
@@ -112,9 +113,10 @@ rule_fit(
 ##
 ## Model fit template:
 ## rules::xrf_fit(formula = missing_arg(), data = missing_arg(),
-##     colsample_bytree = numeric(1), nrounds = integer(1), min_child_weight = integer(1),
-##     max_depth = integer(1), eta = numeric(1), gamma = numeric(1),
-##     subsample = numeric(1), lambda = numeric(1))
+##     xgb_control = missing_arg(), colsample_bynode = numeric(1),
+##     nrounds = integer(1), min_child_weight = integer(1), max_depth = integer(1),
+##     eta = numeric(1), gamma = numeric(1), subsample = numeric(1),
+##     lambda = numeric(1))
 ```
 
 ## Differences from the xrf package
@@ -134,6 +136,7 @@ These differences will create a disparity in the values of the `penalty` argument
 
 ## Preprocessing requirements
 
+
 Factor/categorical predictors need to be converted to numeric values (e.g., dummy or indicator variables) for this engine. When using the formula method via \\code{\\link[=fit.model_spec]{fit()}}, parsnip will convert factor columns to indicators.
 
 ## Other details
````

man/rmd/svm_linear_LiblineaR.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -29,7 +29,7 @@ svm_linear(
 ```
 
 ```
-## Linear Support Vector Machine Specification (regression)
+## Linear Support Vector Machine Model Specification (regression)
 ##
 ## Main Arguments:
 ##   cost = double(1)
@@ -55,7 +55,7 @@ svm_linear(
 ```
 
 ```
-## Linear Support Vector Machine Specification (classification)
+## Linear Support Vector Machine Model Specification (classification)
 ##
 ## Main Arguments:
 ##   cost = double(1)
````

man/rmd/svm_linear_kernlab.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -27,7 +27,7 @@ svm_linear(
 ```
 
 ```
-## Linear Support Vector Machine Specification (regression)
+## Linear Support Vector Machine Model Specification (regression)
 ##
 ## Main Arguments:
 ##   cost = double(1)
@@ -53,7 +53,7 @@ svm_linear(
 ```
 
 ```
-## Linear Support Vector Machine Specification (classification)
+## Linear Support Vector Machine Model Specification (classification)
 ##
 ## Main Arguments:
 ##   cost = double(1)
````

man/rmd/svm_poly_kernlab.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -33,7 +33,7 @@ svm_poly(
 ```
 
 ```
-## Polynomial Support Vector Machine Specification (regression)
+## Polynomial Support Vector Machine Model Specification (regression)
 ##
 ## Main Arguments:
 ##   cost = double(1)
@@ -64,7 +64,7 @@ svm_poly(
 ```
 
 ```
-## Polynomial Support Vector Machine Specification (classification)
+## Polynomial Support Vector Machine Model Specification (classification)
 ##
 ## Main Arguments:
 ##   cost = double(1)
````

man/rmd/svm_rbf_kernlab.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -32,7 +32,7 @@ svm_rbf(
 ```
 
 ```
-## Radial Basis Function Support Vector Machine Specification (regression)
+## Radial Basis Function Support Vector Machine Model Specification (regression)
 ##
 ## Main Arguments:
 ##   cost = double(1)
@@ -60,7 +60,7 @@ svm_rbf(
 ```
 
 ```
-## Radial Basis Function Support Vector Machine Specification (classification)
+## Radial Basis Function Support Vector Machine Model Specification (classification)
 ##
 ## Main Arguments:
 ##   cost = double(1)
````
