Expression params #323
Draft: wants to merge 15 commits into main

Conversation

mb706 (Contributor) commented Nov 17, 2020

l = lrn("classif.ranger")

# the Learner class should do this:
l$param_set$context_available = "task" 

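# tune mtry on a [0, 1] scale; the trafo wraps the tuned value in a ContextPV
# that computes the integer mtry from the task's feature count at train time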
l$param_set$values$mtry = to_tune(p_dbl(0, 1,
    trafo = function(x) ContextPV(function(task)
        max(1, round(length(task$feature_names) * x)),
      x)
))

at = AutoTuner$new(l, rsmp("holdout"), msr("classif.acc"), trm("none"),
  tnr("grid_search", resolution = 10))
at$train(tsk("iris"))

at$tuning_result
#>        mtry learner_param_vals  x_domain classif.acc
#> 1: 0.7777778          <list[1]> <list[1]>        0.96

at$learner$param_set$values
#> $mtry
#> ContextPV function(task)
#>         max(1, round(length(task$feature_names) * x))
#> Using following environment:
#> $x
#> [1] 0.7777778

at$learner$model
#> Ranger result
#> 
#> Call:
#>  ranger::ranger(dependent.variable.name = task$target_names, data = task$data(),
#>       probability = self$predict_type == "prob", case.weights = task$weights$weight,
#>       mtry = 3) 
#> 
#> Type:                             Classification 
#> Number of trees:                  500 
#> Sample size:                      150 
#> Number of independent variables:  4 
#> Mtry:                             3 
#> Target node size:                 1 
#> Variable importance mode:         none 
#> Splitrule:                        gini 
#> OOB prediction error:             4.67 % 

Note how mtry, as the Learner sees it, is 3: it is derived from its $values$mtry, which is

function(task) max(1, round(length(task$feature_names) * x))

with x set to 0.78. task is, by default, taken from the environment in which the Learner calls self$param_set$get_values(tags = "train").
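For intuition, here is a minimal sketch of the idea. None of this is the PR's actual code: ContextPV below is a stand-in for the PR's class of the same name, and the sketch assumes the captured constants are passed by name (x = x) rather than positionally.

# keep the tuned constants (here x) in the function's enclosing environment,
# and tag the function so that get_values() can recognise it later
ContextPV = function(fn, ...) {
  environment(fn) = list2env(list(...), parent = environment(fn))
  class(fn) = c("ContextPV", "function")
  fn
}

# what get_values(tags = "train") conceptually does with such a value:
# look up the required context objects (e.g. task) and call the function
resolve_context_value = function(v, context) {
  if (!inherits(v, "ContextPV")) return(v)
  do.call(v, context[names(formals(v))])
}

pv = ContextPV(function(task) max(1, round(length(task$feature_names) * x)),
  x = 0.7777778)
fake_task = list(feature_names = c("Sepal.Length", "Sepal.Width", "Petal.Length", "Petal.Width"))
resolve_context_value(pv, list(task = fake_task))
#> [1] 3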


I am not yet completely happy with this solution:

  • I am not sure whose responsibility it should be to register available context
  • I am not sure whether this way of supplying context is a good one, or there should be something like a global registry, or dynGet(), or the like.

Some problems that arose: what should the behaviour of ParamSetCollection$get_values() be when the context is not fulfilled? (Both options below are sketched in code after this list.)

  • Should PSC$get_values() just not be called?
  • maybe just NULL / NA / some value indicating missingness?
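
Purely for illustration, a tiny sketch of these two behaviours (resolve_or_flag and the sentinel choice are made up for this sketch, not part of the PR):

# option (a): the caller checks the context first and simply does not call get_values()
# option (b): unresolvable ContextPV values come back as NULL (or NA, or a sentinel)
resolve_or_flag = function(v, context) {
  if (!inherits(v, "ContextPV")) return(v)
  needed = names(formals(v))
  if (!all(needed %in% names(context))) return(NULL)
  do.call(v, context[needed])
}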
mllg (Member) commented Nov 17, 2020

What bug is the workaround for?

mb706 (Author) commented Nov 17, 2020

This one: mlr-org/mlr3#581

The "workaround" is having the function object inherit the class "function", so it is arguable if that is maybe the way it is supposed to be.

@mlr-org mlr-org deleted a comment from lintr-bot Nov 18, 2020
berndbischl (Member) commented

Hi Martin, thanks a lot for taking the time to look at this. But... you are kind of making it harder than it should be to review this:

  1. FunctionParamValue is essential and not explained

  2. The code in ParamSet you added is complicated enough to warrant comments

  3. Where in the calling code is "env" passed? That's essential. I think you must have done this already in a branch / locally, because your example already works. This must be reviewed too.

  4. Please show an example with resampling that uses "n" somewhere, maybe with minsplit. Which n is taken? Exactly the size of the training set? (See the sketch after this list.)
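
For concreteness, a hypothetical sketch of what such an example could look like, by analogy to the mtry example above (not part of the PR; it assumes ContextPV behaves as above, and that task at train time is the training portion of the current resampling split, which is exactly the question being asked):

l = lrn("classif.rpart")
l$param_set$context_available = "task"

# tune minsplit as a fraction of the number of training observations
l$param_set$values$minsplit = to_tune(p_dbl(0, 0.2,
    trafo = function(x) ContextPV(function(task)
        max(2, round(task$nrow * x)),
      x)
))

at = AutoTuner$new(l, rsmp("holdout"), msr("classif.acc"), trm("none"),
  tnr("grid_search", resolution = 5))
rr = resample(tsk("sonar"), at, rsmp("cv", folds = 3))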

berndbischl (Member) commented

Furthermore, maybe not essential now (if this works we should REALLY try to use it), but:
I don't think we can make (even an experienced user) write this:
trafo = function(x) FunctionParamValue(function(env)

Probably some sugar / shortcut is needed?
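
One possible shape for such sugar, purely as a sketch (to_tune_ctx is made up here, not part of the PR; it relies on ContextPV binding x into the function's environment, as in the mtry example above):

# hypothetical helper that hides the ContextPV boilerplate
to_tune_ctx = function(lower, upper, ctx_fn) {
  to_tune(p_dbl(lower, upper,
    trafo = function(x) ContextPV(ctx_fn, x)))
}

# the mtry example would then read:
l$param_set$values$mtry = to_tune_ctx(0, 1,
  function(task) max(1, round(length(task$feature_names) * x)))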

Base automatically changed from master to main January 25, 2021 19:28
mb706 marked this pull request as draft May 14, 2021 20:47