@@ -9,18 +9,19 @@ returning an object, `model`, on which other methods, such as [`predict`](@ref)
[`transform`](@ref), can be dispatched. [`LearnAPI.functions(algorithm)`](@ref) returns a
list of methods that can be applied to either `algorithm` or `model`.

- The second signature is provided by algorithms that do not generalize to new observations
- (called *static algorithms*). In that case, `transform(model, data)` or `predict(model,
- ..., data)` carries out the actual algorithm execution, writing any byproducts of that
- operation to the mutable object `model` returned by `fit`.
-
For example, a supervised classifier might have a workflow like this:

```julia
model = fit(algorithm, (X, y))
ŷ = predict(model, Xnew)
```

+ The second signature, with `data` omitted, is provided by algorithms that do not
+ generalize to new observations (called *static algorithms*). In that case,
+ `transform(model, data)` or `predict(model, ..., data)` carries out the actual algorithm
+ execution, writing any byproducts of that operation to the mutable object `model` returned
+ by `fit`.
+
Use `verbosity=0` for warnings only, and `-1` for silent training.

See also [`predict`](@ref), [`transform`](@ref), [`inverse_transform`](@ref),
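The static-algorithm case described in the relocated paragraph could be illustrated with a sketch like the following. The `Subsampler` type and its `fraction` hyperparameter are invented for illustration and are not part of LearnAPI.jl:

```julia
# Hypothetical static transformer: `fit` receives no data, so it only
# constructs the (mutable) `model` object.
algorithm = Subsampler(fraction=0.5)   # invented example type
model = fit(algorithm)                 # second `fit` signature: `data` omitted
Xsmall = transform(model, X)           # actual execution happens here; any
                                       # byproducts are recorded in `model`
```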
@@ -41,7 +42,7 @@ If `data` encapsulates a *target* variable, as defined in LearnAPI.jl documentat
[`transform`](@ref) are implemented and consume data, then
[`LearnAPI.features(data)`](@ref) must return something that can be passed as data to
these methods. A fallback returns `first(data)` if `data` is a tuple, and `data`
- otherwise` .
+ otherwise.

The LearnAPI.jl specification has nothing to say regarding `fit` signatures with more than
two arguments. For convenience, for example, an algorithm is free to implement a slurping
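A slurping convenience signature of the kind mentioned above might look like this in an implementation (a sketch; `MyAlgorithm` is an assumed type, and the tuple packing mirrors the two-argument form shown earlier):

```julia
# Convenience slurping signature forwarding to the two-argument `fit`:
LearnAPI.fit(algorithm::MyAlgorithm, X, y, extras...; kwargs...) =
    fit(algorithm, (X, y, extras...); kwargs...)
```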
@@ -63,16 +64,6 @@ Return an updated version of the `model` object returned by a previous [`fit`](@
`update` call, but with the specified hyperparameter replacements, in the form `p1=value1,
p2=value2, ...`.

- Provided that `data` is identical with the data presented in a preceding `fit` call *and*
- there is at most one hyperparameter replacement, as in the example below, execution is
- semantically equivalent to the call `fit(algorithm, data)`, where `algorithm` is
- `LearnAPI.algorithm(model)` with the specified replacements. In some cases (typically,
- when changing an iteration parameter) there may be a performance benefit to using `update`
- instead of retraining ab initio.
-
- If `data` differs from that in the preceding `fit` or `update` call, or there is more than
- one hyperparameter replacement, then behaviour is algorithm-specific.
-
```julia
algorithm = MyForest(ntrees=100)
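For the `MyForest` example, the semantic equivalence stated in the docstring can be sketched as follows (assuming `data` is unchanged and `ntrees` is the only replacement):

```julia
# These two are semantically equivalent under the stated conditions:
model = update(model, data; ntrees=150)
# ...and retraining ab initio with the replacement applied:
model = fit(MyForest(ntrees=150), data)
# `update` may be cheaper, e.g. by growing only the 50 extra trees.
```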
@@ -83,6 +74,16 @@ model = fit(algorithm, data)
model = update(model, data; ntrees=150)
```

+ Provided that `data` is identical with the data presented in a preceding `fit` call *and*
+ there is at most one hyperparameter replacement, as in the above example, execution is
+ semantically equivalent to the call `fit(algorithm, data)`, where `algorithm` is
+ `LearnAPI.algorithm(model)` with the specified replacements. In some cases (typically,
+ when changing an iteration parameter) there may be a performance benefit to using `update`
+ instead of retraining ab initio.
+
+ If `data` differs from that in the preceding `fit` or `update` call, or there is more than
+ one hyperparameter replacement, then behaviour is algorithm-specific.
+
See also [`fit`](@ref), [`update_observations`](@ref), [`update_features`](@ref).

# New implementations
@@ -102,11 +103,6 @@ Return an updated version of the `model` object returned by a previous [`fit`](@
`update` call given the new observations present in `new_data`. One may additionally
specify hyperparameter replacements in the form `p1=value1, p2=value2, ...`.

- When following the call `fit(algorithm, data)`, the `update` call is semantically
- equivalent to retraining ab initio using a concatenation of `data` and `new_data`,
- *provided there are no hyperparameter replacements.* Behaviour is otherwise
- algorithm-specific.
-
```julia-repl
algorithm = MyNeuralNetwork(epochs=10, learning_rate=0.01)
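The equivalence stated for `update_observations` in the absence of hyperparameter replacements can be sketched as follows; treating `vcat` as the appropriate notion of observation concatenation is an assumption about the data type:

```julia
model = fit(algorithm, data)
model = update_observations(model, new_data)  # no hyperparameter replacements
# ...is semantically equivalent to retraining on the concatenated data:
model = fit(algorithm, vcat(data, new_data))
```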
@@ -117,6 +113,11 @@ model = fit(algorithm, data)
model = update_observations(model, new_data; epochs=2, learning_rate=0.1)
```

+ When following the call `fit(algorithm, data)`, the `update` call is semantically
+ equivalent to retraining ab initio using a concatenation of `data` and `new_data`,
+ *provided there are no hyperparameter replacements* (which rules out the example
+ above). Behaviour is otherwise algorithm-specific.
+
See also [`fit`](@ref), [`update`](@ref), [`update_features`](@ref).

# Extended help