Conversation
Which toolchain are you using? It seems to compile fine on the June 17 nightly for Ubuntu, though I haven't tested it.
Please add a test to Tests/TensorFlowTests/LayerTests.swift.
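A rough sketch of what such a test might look like, assuming the `Function` wrapper layer under discussion (one that takes a `@differentiable` closure) and the conventions of `LayerTests.swift`; the expected values are just the standard `tanh` outputs and are illustrative:

```swift
func testFunction() {
    // Wrap the stock `tanh` function as a layer via the proposed wrapper.
    let tanhLayer = Function<Tensor<Float>, Tensor<Float>>(tanh)
    let input = Tensor(shape: [5, 1], scalars: (0..<5).map(Float.init))
    let output = tanhLayer.inferring(from: input)
    let expected = Tensor<Float>([[0.0], [0.7615942], [0.9640276],
                                  [0.9950547], [0.9993292]])
    XCTAssertEqual(output, expected)
}
```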
@dan-zheng Yeah, on it. Also, what kind of test would you want? Would any sample differentiable function be okay?
The code looks good. Could you please add some tests? I also have some suggestions on API documentation.
Alright @rxwei, @dan-zheng, I'd like to ask a few things:
Or I could just add some comments in the documentation as to how this layer is to be used. Also, the build and tests pass locally; I'm just getting the documentation right now.
We don't need to remove anything at the moment. But now that I think about it, I think that a protocol for function layers could be interesting. Any conforming type can get an automatically implemented `callAsFunction(_:)`:

```swift
public protocol FunctionLayer: Layer {
    @noDerivative
    var function: @differentiable (Input) -> Output { get }
}

public extension FunctionLayer {
    func callAsFunction(_ input: Input) -> Output {
        function(input)
    }
}
```

With this, you can redefine a `Tanh` layer:

```swift
struct Tanh<Scalar: TensorFlowFloatingPoint>: FunctionLayer {
    @noDerivative var function = tanh
}
```

But yeah, it is still a wrapper, not the best interface. It would be great if Swift supported functions conforming to protocols, in which case all differentiable functions could conform to `Layer` directly.
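For illustration, a minimal sketch of how another parameterless layer might be defined under this hypothetical `FunctionLayer` protocol; the `Square` name and its closure are invented for the example:

```swift
struct Square<Scalar: TensorFlowFloatingPoint>: FunctionLayer {
    // Wrap an inline differentiable closure; Input/Output are inferred
    // from the property's type.
    @noDerivative var function: @differentiable (Tensor<Scalar>) -> Tensor<Scalar> = { $0 * $0 }
}

// Used like any other layer:
// let layer = Square<Float>()
// let y = layer(Tensor<Float>([1, 2, 3]))  // [1, 4, 9]
```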
It is great that you are looking to write a tutorial about it! However, as you've seen in the test case, this functionality is a bit cumbersome in that it requires the user to spell out the full function type.
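As a sketch of the cumbersomeness being described, assuming the `Function` wrapper from this PR, the caller has to write out both generic arguments even for a trivially wrapped function:

```swift
// The full input/output types must be spelled out at the use site.
let reshapeLayer = Function<Tensor<Float>, Tensor<Float>> { input in
    input.reshaped(to: [2, 2])
}
```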
@rxwei So we discussed that layers like zero padding are just wrapper functions, so shall I mark them as done in the list once this functionality lands? Also, could a build be triggered on this?
I've put some more thought into this. I believe that it is still valuable to have layers
Working on this currently; IMO it's a major functionality addition to the Layer API we currently have. This would allow any differentiable function to be wrapped and added as a layer in a model.
It also avoids API redundancies for layers like reshape, padding, etc.
I'm currently getting an error that the layer doesn't conform to the Layer protocol and that a call function is needed. As far as I understand, for a structure to conform to a protocol, you need to define all of the protocol's requirements, something like abstract classes, theoretically. Any thoughts on where I might be going wrong?
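For reference, a minimal sketch of what satisfying the `Layer` protocol looks like, under the assumption that the missing requirement is `callAsFunction(_:)`; the `MyTanh` name is invented for illustration:

```swift
import TensorFlow

// Protocol requirements aren't inherited like abstract-class methods:
// every requirement must be implemented by the conforming type (or have
// a default in a protocol extension). For `Layer`, the key requirement
// is a @differentiable `callAsFunction(_:)`.
struct MyTanh<Scalar: TensorFlowFloatingPoint>: Layer {
    @differentiable
    func callAsFunction(_ input: Tensor<Scalar>) -> Tensor<Scalar> {
        tanh(input)
    }
}
```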
Refer to the discussion in #54.