Commit e7e68a0
Update with a pullback example.
1 parent a6c4d15

README.md

Lines changed: 22 additions & 4 deletions

````diff
@@ -24,9 +24,9 @@ import TensorFlow
 let hiddenSize: Int = 10
 
 struct Model: Layer {
-    var layer1 = Dense(inputSize: 4, outputSize: hiddenSize, activation: relu)
-    var layer2 = Dense(inputSize: hiddenSize, outputSize: hiddenSize, activation: relu)
-    var layer3 = Dense(inputSize: hiddenSize, outputSize: 3, activation: {$0})
+    var layer1 = Dense<Float>(inputSize: 4, outputSize: hiddenSize, activation: relu)
+    var layer2 = Dense<Float>(inputSize: hiddenSize, outputSize: hiddenSize, activation: relu)
+    var layer3 = Dense<Float>(inputSize: hiddenSize, outputSize: 3, activation: {$0})
 
     @differentiable(wrt: (self, input))
     func applied(to input: Tensor<Float>) -> Tensor<Float> {
@@ -37,14 +37,20 @@ struct Model: Layer {
 }
 ```
 
-#### Run a training loop
+#### Initialize a model and an optimizer
 
 ```swift
 let optimizer = SGD<Model, Float>(learningRate: 0.02)
 var classifier = Model()
 let x: Tensor<Float> = ...
 let y: Tensor<Float> = ...
+```
+
+#### Run a training loop
 
+One way to define a training epoch is to use the [`Differentiable.gradient(in:)`](https://github.com/apple/swift/blob/652523f49581a42986ef2b6b04a593ed47496122/stdlib/public/core/AutoDiff.swift#L214) method.
+
+```swift
 for _ in 0..<1000 {
     let 𝛁model = classifier.gradient { classifier -> Tensor<Float> in
         let ŷ = classifier.applied(to: x)
@@ -56,6 +62,18 @@ for _ in 0..<1000 {
 }
 ```
 
+Another way is to make use of methods on `Differentiable` or `Layer` that produce a pullback (i.e. a backpropagation function). Pullbacks allow you to compose your derivative computation with great flexibility.
+
+```swift
+for _ in 0..<1000 {
+    let (ŷ, backprop) = classifier.valueWithPullback(at: x)
+    let (loss, 𝛁ŷ) = ŷ.valueWithGradient { ŷ in softmaxCrossEntropy(logits: ŷ, labels: y) }
+    print("Model output: \(ŷ), Loss: \(loss)")
+    let 𝛁model = backprop(𝛁ŷ)
+    optimizer.update(&classifier.allDifferentiableVariables, along: 𝛁model)
+}
+```
+
 For more tutorials and models, go to [**tensorflow/swift-tutorials**](https://github.com/tensorflow/swift-tutorials) and [**tensorflow/swift-models**](https://github.com/tensorflow/swift-models).
 
 ## Development
````
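The pullback example added by this commit notes that pullbacks compose flexibly. As a further illustration, here is a minimal sketch, not part of this commit, that reuses the `valueWithPullback(at:)` and `valueWithGradient` calls from the diff above to accumulate gradients from two minibatches before a single optimizer update. The second-minibatch tensors `x2` and `y2` are hypothetical, and the sketch assumes the model's gradient type supports `+` (the `Differentiable` protocol constrains it to be `AdditiveArithmetic`).

```swift
// Sketch only (not part of this commit): gradient accumulation via pullbacks.
// Assumes `classifier`, `optimizer`, `x`, and `y` as defined in the README above.
let x2: Tensor<Float> = ...  // hypothetical second-minibatch inputs
let y2: Tensor<Float> = ...  // hypothetical second-minibatch labels

// Run the model once per minibatch, keeping each backpropagation function.
let (ŷ1, backprop1) = classifier.valueWithPullback(at: x)
let (ŷ2, backprop2) = classifier.valueWithPullback(at: x2)

// Differentiate the loss with respect to each model output.
let (_, 𝛁ŷ1) = ŷ1.valueWithGradient { ŷ in softmaxCrossEntropy(logits: ŷ, labels: y) }
let (_, 𝛁ŷ2) = ŷ2.valueWithGradient { ŷ in softmaxCrossEntropy(logits: ŷ, labels: y2) }

// Each pullback maps an output gradient back to a parameter gradient;
// the accumulated sum drives a single optimizer step.
let 𝛁model = backprop1(𝛁ŷ1) + backprop2(𝛁ŷ2)
optimizer.update(&classifier.allDifferentiableVariables, along: 𝛁model)
```

Because each backpropagation function is an ordinary value, the same pattern extends to weighting or averaging the per-minibatch gradients before applying the update.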
