Avoid deadlock between Put and Get #109


Merged · 5 commits into master from batcher_deadlock · Aug 5, 2015

Conversation

stevenosborne-wf
Contributor

There is a problem where the batch channel fills up and Puts hold the lock, preventing Gets from catching up.

@dustinhiatt-wf @brianshannan-wf @tylertreat @alexandercampbell-wf
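
For illustration, a minimal sketch of the deadlock being described, assuming Get takes the same lock as Put (everything here except Put, Get, and batchChan is an assumption, not the library's actual layout):

package batcher

import "sync"

// Illustrative shape only; not the library's actual implementation.
type basicBatcher struct {
	lock      sync.Mutex
	items     []interface{}
	batchChan chan []interface{}
}

func (b *basicBatcher) Put(item interface{}) {
	b.lock.Lock()
	defer b.lock.Unlock()
	b.items = append(b.items, item)
	// Once batchChan is full, this send blocks while Put still holds the lock.
	b.batchChan <- b.items
	b.items = nil
}

func (b *basicBatcher) Get() []interface{} {
	b.lock.Lock() // can never be acquired while a Put is blocked on its send
	defer b.lock.Unlock()
	return <-b.batchChan
}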

@alexandercampbell-wk
Contributor

+1

// Copy the current items under the lock, then hand the blocking channel
// send off to a goroutine so Put releases the lock instead of blocking
// Gets while the channel is full. (The allocation of cpItems is implied by
// the hunk; shown here for completeness.)
cpItems := make([]interface{}, len(b.items))
for i, val := range b.items {
	cpItems[i] = val
}
go func() { b.batchChan <- cpItems }()
Collaborator

This wouldn't necessarily preserve the order of items coming in versus going out, since goroutines could deliver the batches in any order once they get backed up, or really even before that.
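
A tiny standalone repro of the reordering concern (a hypothetical example, not code from this PR): once each send happens in its own goroutine, the runtime decides which blocked sender delivers first.

package main

import "fmt"

func main() {
	ch := make(chan int) // unbuffered: each send blocks until a receiver is ready
	for i := 0; i < 2; i++ {
		i := i // capture the loop variable for the goroutine
		go func() { ch <- i }()
	}
	// May print "0 1" or "1 0": the scheduler picks which blocked sender wins.
	fmt.Println(<-ch, <-ch)
}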

Contributor

I don't think we care about the ordering, but we might want to document it.

@brianshannan-wf
Collaborator

+1 if we don't care about ordering, per the above comment

// Drain any batches still in flight; assume the channel is empty once it
// has been quiet for 5ms.
for {
	select {
	case <-b.batchChan:
	case <-time.After(5 * time.Millisecond):
		return // assumed continuation; the quoted hunk ends at the timeout case
	}
}
Contributor

Seems like there could be a more intelligent/safer way to know if anyone is waiting to put stuff on the channel.
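
One such approach, and roughly what the later commits adopt (the b.waiting field appears in the Dispose diff below; the helper name here is hypothetical): count pending senders with an atomic, so Dispose knows exactly when the channel is quiet instead of guessing with a timeout.

package batcher

import "sync/atomic"

type basicBatcher struct {
	batchChan chan []interface{}
	waiting   int32 // senders blocked (or about to block) on batchChan
}

// queueBatch is a hypothetical helper: the counter is bumped before the
// goroutine is spawned, so a concurrent Dispose never observes waiting == 0
// while a send is still pending.
func (b *basicBatcher) queueBatch(items []interface{}) {
	atomic.AddInt32(&b.waiting, 1)
	go func() {
		b.batchChan <- items
		atomic.AddInt32(&b.waiting, -1)
	}()
}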

@stevenosborne-wf
Contributor Author

@brianshannan-wf @tylertreat-wf Comments addressed. @alexandercampbell-wf Can you review the most recent commits?

@alexandercampbell-wk
Contributor

LGTM but I don't know much about this code.

@@ -158,6 +165,11 @@ func (b *basicBatcher) Dispose() {
	b.flush()
	b.disposed = true
	b.items = nil

	// Drain the batch channel and all routines waiting to put on the channel
	for len(b.batchChan) > 0 || atomic.LoadInt32(&b.waiting) > 0 {
		<-b.batchChan
	}
Contributor

Not sure if you'll ever hit it, or if it matters, but two threads could both see len(b.batchChan) greater than 0 at the same time, and if there was only one batch in the channel, the other thread would block on the receive on the next line.

Contributor Author

We're safe in here because of the lock.
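
For context, a sketch of why the lock settles this, assuming Dispose acquires the batcher's mutex before reaching the lines in the hunk above (as the reply indicates): only one goroutine can run the drain loop at a time.

func (b *basicBatcher) Dispose() {
	b.lock.Lock() // serializes callers: a second concurrent Dispose waits here
	defer b.lock.Unlock()
	b.flush()
	b.disposed = true
	b.items = nil
	// Safe to both test and receive: no other goroutine is draining.
	for len(b.batchChan) > 0 || atomic.LoadInt32(&b.waiting) > 0 {
		<-b.batchChan
	}
}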

@dustinhiatt-wf
Contributor

+1

@brianshannan-wf
Collaborator

+1

@stevenosborne-wf
Contributor Author

Investigating regression.

@tylertreat-wf
Contributor

+1

@stevenosborne-wf
Contributor Author

@dustinhiatt-wf @brianshannan-wf @alexandercampbell-wf Fix deadlock in dispose. Ready for review.

@dustinhiatt-wf
Contributor

+1

@brianshannan-wf
Collaborator

+1

tylertreat-wf added a commit that referenced this pull request Aug 5, 2015
Avoid deadlock between Put and Get
tylertreat-wf merged commit c5e9b05 into master Aug 5, 2015
tylertreat-wf deleted the batcher_deadlock branch August 5, 2015 21:35