This repository was archived by the owner on Jul 1, 2023. It is now read-only.

Add leaky ReLU activation function #260

Merged: 5 commits into tensorflow:master from the math-op branch on Jun 19, 2019

Conversation

@Shashi456 (Contributor) commented on Jun 19, 2019:

I was going to add SELU as well, but it requires variance-scaling initialization as a prerequisite, which needs flow control; I'll add that later. Tests and build pass locally.

Working on separable layers next, along with the layer abstraction we spoke about.
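
For context, leaky ReLU passes positive inputs through unchanged and scales negative inputs by a small slope alpha, i.e. f(x) = max(alpha * x, x) for 0 < alpha < 1. Below is a minimal sketch in plain Swift to illustrate the math; the `leakyReLU` helper over `[Float]` and the default `alpha` of 0.2 are illustrative assumptions, not the swift-apis implementation merged in this PR.

```swift
// Illustrative sketch only, not the swift-apis implementation.
// leakyReLU(x) = x          if x > 0
//              = alpha * x  otherwise
func leakyReLU(_ x: [Float], alpha: Float = 0.2) -> [Float] {
    // Elementwise; equivalent to max(x, alpha * x) when 0 < alpha < 1.
    return x.map { $0 > 0 ? $0 : alpha * $0 }
}

// Usage: negative inputs are scaled by alpha instead of clamped to zero.
let out = leakyReLU([-2.0, -0.5, 0.0, 1.5])
print(out)  // [-0.4, -0.1, 0.0, 1.5]
```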

@Shashi456 (Contributor, Author) commented:

Also, are there any other activations to be added?

@rxwei (Contributor) left a comment:

Thanks! I left some comments.

@rxwei merged commit 484148f into tensorflow:master on Jun 19, 2019.
@Shashi456 deleted the math-op branch on Jun 19, 2019 at 16:32.