Co-Authored-By: tanmayb123 <[email protected]>
So there's good news and bad news.

👍 The good news is that everyone that needs to sign a CLA (the pull request submitter and all commit authors) have done so. Everything is all good there.

😕 The bad news is that it appears that one or more commits were authored or co-authored by someone other than the pull request submitter. We need to confirm that all authors are ok with their commits being contributed to this project. Please have them confirm that here in the pull request.

Note to project maintainer: This is a terminal state, meaning the cla/google commit status will not change from this state.

ℹ️ Googlers: Go here for more info.
For the record, CLA passed in #100.
```swift
    𝛁state = 𝛁input.state
    reversed𝛁inputs.append(𝛁input.input)
}
return (.init(cell: 𝛁cell), .init(Array(reversed𝛁inputs.reversed())))
```
Note: Regarding this array reversal: while I could've zero-initialized an array of `timeStepCount` tensors and modified them in reverse order, it would be less efficient because of the cost of heap-allocating `timeStepCount` extra tensors.
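To make the trade-off concrete, here is a minimal Swift sketch of the two strategies, using stand-in values (the names, shapes, and the `stepGradients` array are hypothetical; this is not the actual pullback code):

```swift
import TensorFlow

// Stand-ins for the per-time-step gradients produced while walking the
// sequence backwards during backpropagation (hypothetical values/shapes).
let timeStepCount = 4
let stepGradients = (0..<timeStepCount).map { _ in
    Tensor<Float>(randomNormal: [2, 2])
}

// (a) The approach in this PR: append while iterating in reverse time
// order, then reverse once at the end. No placeholder tensors are created.
var reversedGrads: [Tensor<Float>] = []
for t in (0..<timeStepCount).reversed() {
    reversedGrads.append(stepGradients[t])
}
let gradsA = Array(reversedGrads.reversed())

// (b) The rejected alternative: zero-initialize `timeStepCount` tensors,
// then overwrite each slot in reverse order. Every zero tensor is a heap
// allocation that is immediately thrown away.
var gradsB = (0..<timeStepCount).map { _ in Tensor<Float>(zeros: [2, 2]) }
for t in (0..<timeStepCount).reversed() {
    gradsB[t] = stepGradients[t]
}
```

Both variants yield the gradients in forward time order; (a) trades a single reversal of an array of cheap tensor handles for `timeStepCount` avoided tensor allocations.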
I think CI isn't working.
Yup, still investigating. Locally, all tests are passing.
Based off of #100 by @tanmayb123. My push to that branch accidentally closed that PR, so I'm starting a new one.
- Define `RNN<Cell: RNNCell>`, which forms a recurrent layer from a cell. Thanks @tanmayb123 for starting this!
- Define differentiation for `RNN.call(_:)`.
- Make the recurrent types (`SimpleRNNCell`, `LSTMCell`, `RNN`) conditionally conform to `VectorNumeric` for easier integration with more efficient optimizers in the future.
- Add `Tensor.init(glorotUniform:seed:)`.
- Add a test for `RNN` using `SimpleRNNCell`.

Related to #52. Resolves #91.
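For readers without the diff in front of them, here is a rough usage sketch of the API described above. The initializer labels (`inputSize:`, `hiddenSize:`) and the call syntax are assumptions based only on the identifiers named in this PR, not the merged signatures:

```swift
import TensorFlow

// Hypothetical usage; initializer labels and call syntax are assumed.
let rnn = RNN(SimpleRNNCell<Float>(inputSize: 4, hiddenSize: 5))

// A 3-time-step input sequence; each step is a [batch = 1, features = 4]
// tensor.
let inputs = (0..<3).map { _ in Tensor<Float>(randomNormal: [1, 4]) }

// Applying the layer invokes `RNN.call(_:)`, producing one output per
// time step.
let outputs = rnn(inputs)

// The Glorot-uniform initializer added by this PR (seed omitted here,
// assuming a default).
let weight = Tensor<Float>(glorotUniform: [4, 5])
```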