This repository was archived by the owner on Jul 1, 2023. It is now read-only.

Add Function/Lambda Layer #298

Merged 7 commits into tensorflow:master from Shashi456:lambda on Jun 28, 2019

Conversation

@Shashi456 (Contributor) commented on Jun 26, 2019

I'm working on this currently; IMO it's a major functionality addition to the Layer API we currently have. It would allow any differentiable function to be wrapped and added as a layer in a model.

It also avoids API redundancy for layers like reshape, padding, etc.

I'm currently getting an error that the layer doesn't conform to the Layer protocol and that a call function is needed. As far as I understand, for a structure to conform to a protocol, you need to implement everything the protocol requires, somewhat like abstract classes in theory. Any thoughts on what I might be doing wrong?

Refer to the discussion in #54.
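
For context, a minimal sketch (assuming the layer is named Function, per the PR title; the actual implementation in this PR may differ) of a struct that wraps a differentiable closure and satisfies Layer's callAsFunction(_:) requirement:

// Sketch only: wraps an arbitrary differentiable closure as a layer.
// `Layer` requires a `@differentiable` `callAsFunction(_:)`, which is what
// the conformance error above is pointing at.
public struct Function<Input: Differentiable, Output: Differentiable>: Layer {
    public typealias Body = @differentiable (Input) -> Output

    // The wrapped closure; `@noDerivative` because it holds no trainable
    // parameters.
    @noDerivative public let body: Body

    public init(_ body: @escaping Body) {
        self.body = body
    }

    @differentiable
    public func callAsFunction(_ input: Input) -> Output {
        return body(input)
    }
}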

@jon-tow (Contributor) commented on Jun 26, 2019

Which toolchain are you using? It seems to compile fine on the June 17 nightly for Ubuntu, though I haven't tested it.

@Shashi456 marked this pull request as ready for review on June 28, 2019 at 07:22
@dan-zheng (Member) left a comment

Please add a test to Tests/TensorFlowTests/LayerTests.swift.

@Shashi456 (Contributor, Author) commented on Jun 28, 2019

@dan-zheng Yeah, on it. Also, what kind of test would you want? Would any sample differentiable function be okay?
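
For reference, a minimal sketch (assumed layer and test names; the actual test added in this PR may differ) of a test built around a sample differentiable function, relying on the existing imports and test class in Tests/TensorFlowTests/LayerTests.swift:

// Hypothetical test. `Function` is assumed to be the layer added by this PR;
// doubling is used so the expected output is exact.
func testFunction() {
    let double = Function<Tensor<Float>, Tensor<Float>> { $0 * 2 }
    let input = Tensor<Float>(ones: [2, 3])
    let output = double(input)
    XCTAssertEqual(output, Tensor<Float>(repeating: 2, shape: [2, 3]))
}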

@rxwei (Contributor) left a comment

The code looks good. Could you please add some tests? I also have some suggestions on API documentation.

@Shashi456 (Contributor, Author) commented on Jun 28, 2019

Alright @rxwei, @dan-zheng, I'd like to ask a few things:

  • Are there any layers that we need to remove because they are currently just wrapper functions?
  • Could I write a tutorial about this functionality, and if so, how would I go about doing that?

Or I could just add some comments in the documentation on how this layer is to be used.

Also, the build and tests pass locally; I'm just getting the documentation right now.

@rxwei (Contributor) commented on Jun 28, 2019

  • Are there any layers that we need to remove because they are currently just wrapper functions?

We don't need to remove anything at the moment. But now that I think about it, a protocol for function layers could be interesting: any conforming type would get an automatically implemented callAsFunction(_:) method.

public protocol FunctionLayer: Layer {
    // The wrapped differentiable function; not a trainable parameter.
    @noDerivative
    var function: @differentiable (Input) -> Output { get }
}

public extension FunctionLayer {
    // Default implementation that simply forwards to the wrapped function.
    func callAsFunction(_ input: Input) -> Output {
        function(input)
    }
}

With this, you can redefine a Tanh layer like this:

struct Tanh<Scalar: TensorFlowFloatingPoint>: FunctionLayer {
    // `tanh` here is the library's free differentiable tanh function.
    @noDerivative var function = tanh
}

But yeah, it is still a wrapper, not the best interface. It would be great if Swift supported functions conforming to protocols, in which case all differentiable functions could conform to Layer directly. This is one of the directions we can look into later with the Swift community.

  • Could I write a tutorial about this functionality, and if so, how would I go about doing that?

It is great that you are looking to write a tutorial about it! However, as you've seen in the test case, this functionality is a bit cumbersome in that it requires the user to spell out the Input and Output types. I'd suggest holding off on this until we figure out a better way to make this functionality available, without the cost of requiring explicit generic arguments.
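
To illustrate the cost mentioned above, a hypothetical usage (layer name and closure are illustrative, not taken from this PR) where the Input and Output types must be spelled out as generic arguments:

// Hypothetical usage: Input and Output have to be written out explicitly.
let reshape = Function<Tensor<Float>, Tensor<Float>> { $0.reshaped(to: [4, 4]) }
let output = reshape(Tensor<Float>(ones: [16]))  // output has shape [4, 4]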

@Shashi456 (Contributor, Author) commented on Jun 28, 2019

@rxwei So we discussed that layers like zero padding are just wrapper functions; should I mark them as done in the layer list now that we have this functionality?

Also, could a build be triggered on this?

@rxwei (Contributor) commented on Jun 28, 2019

I put some more thought into this. I believe it is still valuable to have ZeroPadding1D, ZeroPadding2D, and ZeroPadding3D defined as structs in the library for API completeness (completing the layer list you presented!). After all layers have been ported, we can think about what to do in the long term to reduce the boilerplate in these layer definitions.
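
As a rough illustration of the boilerplate in question (shape layout and details assumed, not taken from this PR), a ZeroPadding1D-style layer is essentially a thin wrapper over Tensor's padded(forSizes:) method:

// Sketch only: zero-pads the temporal (middle) axis of a
// [batch, time, features] tensor; a parameterless wrapper layer.
public struct ZeroPadding1D<Scalar: TensorFlowFloatingPoint>: Layer {
    @noDerivative public let padding: (Int, Int)

    public init(padding: (Int, Int)) {
        self.padding = padding
    }

    @differentiable
    public func callAsFunction(_ input: Tensor<Scalar>) -> Tensor<Scalar> {
        return input.padded(forSizes: [(0, 0), padding, (0, 0)])
    }
}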

@rxwei merged commit 472b29f into tensorflow:master on Jun 28, 2019
@Shashi456 deleted the lambda branch on August 29, 2019 at 16:03