let optimizer = SGD<Model, Float>(learningRate: 0.02)
var classifier = Model()
let x: Tensor<Float> = ...
let y: Tensor<Float> = ...
```
#### Run a training loop
One way to define a training epoch is to use the [`Differentiable.gradient(in:)`](https://github.com/apple/swift/blob/652523f49581a42986ef2b6b04a593ed47496122/stdlib/public/core/AutoDiff.swift#L214) method.
```swift
for _ in 0..<1000 {
    let 𝛁model = classifier.gradient { classifier -> Tensor<Float> in
        let ŷ = classifier.applied(to: x)
        let loss = softmaxCrossEntropy(logits: ŷ, labels: y)
        print("Loss: \(loss)")
        return loss
    }
    optimizer.update(&classifier.allDifferentiableVariables, along: 𝛁model)
}
```
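If you also need the loss value outside the closure, `Differentiable` likewise offers a `valueWithGradient(in:)` method (the second example below uses it on a tensor). A minimal sketch, assuming the same API vintage as the loop above:

```swift
// A sketch, not from the original README: compute the loss and the model
// gradient in one call, then apply the optimizer update as before.
let (loss, 𝛁model) = classifier.valueWithGradient { classifier -> Tensor<Float> in
    softmaxCrossEntropy(logits: classifier.applied(to: x), labels: y)
}
print("Loss: \(loss)")
optimizer.update(&classifier.allDifferentiableVariables, along: 𝛁model)
```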
Another way is to make use of methods on `Differentiable` or `Layer` that produce a pullback (i.e. a backpropagation function). Pullbacks allow you to compose your derivative computation with great flexibility.
```swift
for _ in 0..<1000 {
    let (ŷ, backprop) = classifier.valueWithPullback(at: x)
    let (loss, 𝛁ŷ) = ŷ.valueWithGradient { ŷ in softmaxCrossEntropy(logits: ŷ, labels: y) }
    print("Model output: \(ŷ), Loss: \(loss)")
    let (𝛁model, _) = backprop(𝛁ŷ)
    optimizer.update(&classifier.allDifferentiableVariables, along: 𝛁model)
}
```
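Because the pullback is an ordinary function from an output cotangent to parameter cotangents, you can feed it any upstream gradient, not just the one produced by the loss. A hypothetical sketch (the `0.5` scaling factor is illustrative, not part of the original example):

```swift
// Hypothetical sketch: reuse the pullback with a rescaled upstream gradient.
// The 0.5 factor is illustrative only; any cotangent of ŷ can be passed in.
let (ŷ, backprop) = classifier.valueWithPullback(at: x)
let (_, 𝛁ŷ) = ŷ.valueWithGradient { ŷ in softmaxCrossEntropy(logits: ŷ, labels: y) }
let (𝛁model, _) = backprop(0.5 * 𝛁ŷ)
optimizer.update(&classifier.allDifferentiableVariables, along: 𝛁model)
```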
For more tutorials and models, go to [**tensorflow/swift-tutorials**](https://github.com/tensorflow/swift-tutorials) and [**tensorflow/swift-models**](https://github.com/tensorflow/swift-models).